
View Full Version : ARMAR, humanoid robots, Humanoids and Intelligence Systems Lab (HIS), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany



Airicist
3rd April 2014, 09:20
Developer and manufacturer - Humanoids and Intelligence Systems Lab (https://pr.ai/showthread.php?3613)

Home Page - his.anthropomatik.kit.edu/english/241.php (http://his.anthropomatik.kit.edu/english/241.php)

Armar 3 (https://de.wikipedia.org/wiki/Armar_3) on Wikipedia

High Performance Humanoid Technologies Lab (H2T) - h2t.anthropomatik.kit.edu (https://h2t.anthropomatik.kit.edu)

Professor of Humanoid Robotics Systems - Tamim Asfour (https://www.linkedin.com/in/tamim-asfour-829979a)

Airicist
3rd April 2014, 09:24
https://youtu.be/N9C4NNRwOAQ

Humanoid Robot Armar

Uploaded on Jun 25, 2010

Airicist
3rd April 2014, 09:25
https://youtu.be/TkFdyJ5_PKo

Humanoid Robot Armar-II: Grasping and Placing

Uploaded on Jun 25, 2010

Airicist
3rd April 2014, 09:26
https://youtu.be/m3Y9iVQHyvg

Humanoid Robot Armar-II: Object Recognition

Uploaded on Jun 25, 2010

Airicist
3rd April 2014, 09:27
https://youtu.be/SHMSyYLRQPM

The humanoid robot ARMAR-III

Uploaded on Aug 25, 2010


This video presentation introduces a fully integrated and autonomous humanoid robot performing complex manipulation and grasping tasks in a kitchen environment.

Airicist
3rd April 2014, 09:29
https://youtu.be/jmB7zzGPZrE

ARMAR, the humanoid robot

Uploaded on Nov 29, 2011

Airicist
5th May 2014, 11:05
https://youtu.be/_itWYnfRnTQ

Controlling the ARMAR III with a tablet

Published on Apr 3, 2014


In this video, intuitive control of the humanoid robot ARMAR III with a tablet is shown.
The Android application to control the robot was developed by a group of students at KIT during their internship at our institute.

Airicist
5th May 2015, 20:19
https://youtu.be/HNispeu0WVU

Programming a humanoid kitchen assistant robot

Published on May 5, 2014


The students' task was to extend the capabilities of our kitchen assistant robot ARMAR-III by combining existing capabilities, such as grasping and placing, into complex tasks like bringing the ingredients of a recipe.

Airicist
6th November 2015, 22:27
https://youtu.be/VXI1pbp_10c

Validation of Whole-Body Loco-Manipulation Affordances for Pushability and Liftability

Published on Nov 6, 2015


This video shows the results presented in:
P. Kaiser, M. Grotz, E. E. Aksoy, M. Do, N. Vahrenkamp and T. Asfour, Validation of Whole-Body Loco-Manipulation Affordances for Pushability and Liftability, IEEE/RAS International Conference on Humanoid Robots (Humanoids), 2015

Abstract

Autonomous robots that are intended to work in disaster scenarios like collapsed or contaminated buildings need to be able to efficiently identify action possibilities in unknown environments. This includes the detection of environmental elements that allow interaction, such as doors or debris, as well as the utilization of fixed environmental structures for stable whole-body loco-manipulation. Affordances that refer to whole-body actions are especially valuable for humanoid robots as the necessity of stabilization is an integral part of their control strategies.

Based on our previous work we propose to apply the concept of affordances to actions of stable whole-body loco-manipulation, in particular to pushing and lifting of large objects. We extend our perceptual pipeline in order to build large-scale representations of the robot's environment in terms of environmental primitives like planes, cylinders and spheres. A rule-based system is employed to derive whole-body affordance hypotheses from these primitives, which are then subject to validation by the robot. An experimental evaluation demonstrates our progress in detection, validation and utilization of whole-body affordances.
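
A minimal sketch, in Python, of how such a rule-based derivation of whole-body affordance hypotheses from environmental primitives could look; the class fields, thresholds and rules below are illustrative assumptions, not the system described in the paper.

# Illustrative sketch: deriving pushability/liftability hypotheses from
# environmental primitives with hand-written rules. All thresholds and
# field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Primitive:
    kind: str        # "plane", "cylinder" or "sphere"
    area: float      # surface area in m^2
    height: float    # height above ground in m
    vertical: bool   # True if the surface normal is roughly horizontal

def affordance_hypotheses(p: Primitive) -> list:
    """Return candidate whole-body affordances for one primitive."""
    hypotheses = []
    # Large vertical planes at torso height may be pushable (e.g. doors, crates).
    if p.kind == "plane" and p.vertical and p.area > 0.3 and 0.5 < p.height < 1.5:
        hypotheses.append("pushable")
    # Small horizontal surfaces near the ground may be liftable (e.g. debris).
    if p.kind == "plane" and not p.vertical and p.area < 0.5 and p.height < 0.8:
        hypotheses.append("liftable")
    return hypotheses

# Each hypothesis would then be validated by the robot through physical interaction.
scene = [Primitive("plane", area=0.8, height=1.0, vertical=True)]
for prim in scene:
    print(prim.kind, "->", affordance_hypotheses(prim))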

Airicist
28th July 2016, 14:09
https://youtu.be/PyJ5hCW3zQM

Integration of natural language understanding, robot's memory and planning in a humanoid robot

Published on Jul 28, 2016


This video shows the results of the approach presented in the paper "Integration of Multi-Purpose Natural Language Understanding, Robot's Memory, and Planning in a Humanoid Robot Platform".

We introduce a framework that allows the robot to understand natural language, generate symbolic representations of its sensorimotor experience, generate complex plans according to the current world state, monitor plan execution, replace missing objects and suggest object locations. The framework is implemented within the robot development environment ArmarX and is based on the concept of structural bootstrapping developed in the context of the European project Xperience. We test the framework on the humanoid robot ARMAR-III in the kitchen environment with a complex scenario about setting a table and preparing a salad, which is shown in this video.
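
A minimal sketch of the plan-execute-monitor loop with object replacement described above; the planner, perception and execution stubs are assumptions for illustration and do not reflect the ArmarX API.

# Illustrative plan-execute-monitor loop with replacement of missing objects.
def plan(goal, world_state):
    """Symbolic planner stub: returns a list of parameterized actions."""
    return [{"name": "grasp", "object": "cucumber"},
            {"name": "place", "object": "cucumber", "target": "cutting_board"}]

def perform(action):
    """Execution stub; would dispatch the action to the robot's skill layer."""
    print("executing", action)
    return True

def observe():
    """Memory/perception stub; would return the current symbolic world state."""
    return {"visible_objects": {"cucumber", "cutting_board"}}

def run(goal, find_substitute=lambda obj, ws: None):
    world_state = observe()
    actions = plan(goal, world_state)
    while actions:
        action = actions.pop(0)
        obj = action.get("object")
        if obj and obj not in world_state["visible_objects"]:
            # Missing object: substitute from prior experience, or replan.
            substitute = find_substitute(obj, world_state)
            if substitute:
                action["object"] = substitute
            else:
                actions = plan(goal, world_state)
                continue
        if not perform(action):
            actions = plan(goal, world_state)   # execution monitoring: replan on failure
        world_state = observe()

run("prepare_salad")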

Airicist
28th July 2016, 14:10
https://youtu.be/-8oC-WW5P1I

Robots bootstrapped through learning from experience

Published on Jul 28, 2016


This video shows an integrated demonstration of central results of the Xperience project. In the example task of preparing a salad and setting a table together with a human, the robot ARMAR-III uses its knowledge gained from previous experience to plan and execute the necessary actions towards its goal. The demonstration highlights the aspects of the realization of integrated complete robot systems, and emphasizes the concept of structural bootstrapping on the levels of human-robot communication and physical interaction, sensorimotor learning, learning of object affordances, and planning in robotics.

The scenario integrates several scientific methods developed in the project:
- Execution of complex manipulation tasks and plans based on the developed architecture and its implementation
- Automatic generation of domain descriptions for planning based on the robot's experience
- Replanning on the fly in case of missing objects
- Replacing missing objects by employing different bootstrapping (replacement) strategies
- Replacing actions by adapting previously learnt actions to new context
- Human-robot communication in natural language including the robot's understanding of spoken commands, world descriptions, and feedback as well as the robot's ability to ask the human for help and information
- Handing over objects between the robot and the human

Airicist
19th January 2018, 10:14
https://youtu.be/b9ZI6NX7fCk

SecondHands project members present first robot prototype ARMAR-6

Published on Jan 11, 2018


SecondHands (https://pr.ai/showthread.php?t=16912) is an EU-funded Horizon 2020 project aiming to design a collaborative robot (cobot) that can proactively offer support to maintenance technicians working in Ocado's highly automated warehouses, also known as Customer Fulfilment Centers (CFCs). This robot will be a second pair of hands that will assist technicians when they are in need of help. The robot will learn through observation and will augment the humans' capabilities by completing tasks that require a level of precision or physical strength that is not available to human workers.

"KIT's ARMAR-6 Humanoid Will Help Humans Fix Other Robots (https://spectrum.ieee.org/automaton/robotics/industrial-robots/kit-armar6-humanoid)"

by Evan Ackerman
January 16, 2018

Airicist
20th July 2018, 09:22
https://youtu.be/6AFkrGkKI7g

Published on Jul 20, 2018


The humanoid robot ARMAR-6 collaborates with a human worker in a bimanual overhead task and performs force-based bimanual manipulation, vision-based grasping, fluent object handover, human activity recognition, natural language based human-robot dialog and interaction, and navigation, among many other capabilities, in a use case supplied by Ocado. In a complex maintenance task demonstration, the robot was able to recognize a technician's need for help based on speech, force and visual information.
The video shows a one-shot take of the demonstration, which was performed more than 50 times at CeBIT 2018 (June 11-18).

Airicist
8th November 2018, 21:41
https://youtu.be/5BEVluiEqcI

ARMAR-6: A collaborative humanoid robot for industrial environments

Published on Nov 8, 2018


Video attachment to our paper:

ARMAR-6: A Collaborative Humanoid Robot for Industrial Environments
Asfour, T., Kaul, L., Wächter, M., Ottenhaus, S., Weiner, P., Rader, S., Grimm, R., Zhou, Y., Grotz, M., Paus, F., Shingarey, D. and Haubert, H.

Airicist
15th November 2018, 15:47
https://youtu.be/_ddWTvKwrXA

ARMAR-6 grasping pipeline

Published on Nov 15, 2018


ARMAR-6 demonstrates grasping various tools in a warehouse environment utilizing the mobile platform and the 8-DoF arms. Based on environment models and depth cameras, collision-free trajectories for the mobile platform and the arms are calculated to reach an automatically selected grasping pose, employing a pipeline of algorithms consisting of inverse kinematics, grasp planning, collision-free motion planning, and robot placement calculation.
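
A rough sketch of such a pipeline, chaining grasp planning, robot placement, inverse kinematics and collision-free motion planning; all function names and bodies below are illustrative placeholders, not the actual ARMAR-6 software.

# Illustrative grasping pipeline: try grasp candidates until a reachable,
# collision-free one is found.
def plan_grasps(object_model):
    """Grasp planner stub: candidate end-effector poses on the object."""
    return [{"pose": "top_grasp"}, {"pose": "side_grasp"}]

def compute_robot_placement(grasp, environment):
    """Choose a platform pose from which the grasp is reachable (stub)."""
    return {"x": 1.0, "y": 0.5, "theta": 0.0}

def solve_ik(grasp, base_pose):
    """8-DoF arm inverse kinematics (stub); returns None if unreachable."""
    return [0.0] * 8

def plan_motion(goal, environment):
    """Collision-free motion planning against the environment model (stub)."""
    return ["waypoint_1", "waypoint_2", goal]

def grasp_object(object_model, environment):
    for grasp in plan_grasps(object_model):          # try grasp candidates in order
        base_pose = compute_robot_placement(grasp, environment)
        joints = solve_ik(grasp, base_pose)
        if joints is None:
            continue                                 # grasp unreachable from this placement
        base_traj = plan_motion(base_pose, environment)
        arm_traj = plan_motion(joints, environment)
        if base_traj and arm_traj:
            return base_traj, arm_traj               # drive the platform, then move the arm
    return None

print(grasp_object({"name": "spray_bottle"}, {"obstacles": []}))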

Airicist
29th January 2020, 19:38
https://youtu.be/trVhkEgd-Y0

Autonomous grasping of unknown objects in cluttered scenes with the humanoid robot ARMAR-6

Jan 29, 2020


The humanoid robot ARMAR-6 grasps unknown objects in a cluttered box autonomously.

Airicist
5th May 2020, 08:01
https://youtu.be/-KF5XSSTn_o

Horizon 2020 Secondhands Project - Pioneering collaborative robotics

May 5, 2020


The Horizon2020 SecondHands consortium has achieved major breakthroughs in its development of a collaborative robot which can proactively assist humans in maintenance tasks.

ARMAR-6 is a revolutionary robot platform which has been tested in real-world industrial settings for robustness and high performance. The breakthroughs made as part of this project in AI learning, natural language processing and robotic manipulation achieve collaborative robotic capabilities which exceed the current state of the art.

secondhands.eu (https://secondhands.eu)

Airicist
20th October 2020, 14:35
https://youtu.be/6cDgVrwchSg

ARMAR-6: collaborative humanoid robot that recognizes the need of help and provides assistance

Oct 20, 2020


ARMAR-6: Collaborative Humanoid Robot that Recognizes the Need of Help and Provides Assistance to Technicians in a Pro-Active Manner

This video shows a demonstration of central results of the SecondHands project. In the context of maintenance and repair tasks in warehouse environments, the collaborative humanoid robot ARMAR-6 demonstrates a number of cognitive and sensorimotor abilities, such as 1) recognition of the need of help based on speech, force, haptics and visual scene and action interpretation, 2) collaborative bimanual manipulation of large objects, 3) compliant mobile manipulation, 4) grasping known and unknown objects and tools, 5) human-robot interaction (object and tool handover), 6) natural dialog and 7) force predictive control.