
View Full Version : iCub, humanoid robot, Italian Institute of Technology (IIT), Genova, Italy, RobotCub Consortium, Europe



Airicist
9th May 2013, 11:55
Designer - Italian Institute of Technology (IIT), Genova, Italy (https://pr.ai/showthread.php?4655)

Website - icub.iit.it (https://icub.iit.it)
icub.org (http://www.icub.org)

youtube.com/robotcub (https://www.youtube.com/robotcub)

facebook.com/iCubHumanoid (https://www.facebook.com/iCubHumanoid)

twitter.com/iCub (https://twitter.com/iCub)

iCub (https://en.wikipedia.org/wiki/ICub) on Wikipedia

Project CoDyCo (https://pr.ai/showthread.php?8876), Whole-Body Compliant Dynamical Contacts in Cognitive Human-Robot Interaction, Cognitive Systems and Robotics

Giorgio Metta (https://pr.ai/showthread.php?15295) - Director of the iCub Facility at the Istituto Italiano di Tecnologia in Genoa, where he runs the iCub humanoid robot project

Airicist
9th May 2013, 11:56
https://youtu.be/JfzlAoSXpd0

iCub - the robot child

Uploaded on May 13, 2009


An interview with IIT by Euronews about the RobotCub project.


https://youtu.be/ZcTwO2dpX8A

iCub - Humanoid Platform

Uploaded on Dec 6, 2011


The iCub is a humanoid robot developed as part of the EU project RobotCub and subsequently adopted by more than 20 laboratories worldwide. It can see and hear, and it has a sense of proprioception and movement.

Credits: Laura Taverna, Matteo Tamboli, Vadim Tikhanoff, Carlo Ciliberto, Ugo Pattacini, Lorenzo Natale, Francesco Nori, Francesco Becchi, Giorgio Metta, Giulio Sandini. Robotics, Brain & Cognitive Sciences - Italian Institute of Technology


https://youtu.be/X4mbYz0JUxM

All gestures you can 2.0

Published on Nov 5, 2013


This video shows a new version of the memory game "All gestures you can (https://youtu.be/U_JLoe_fT3I?list=UUXBFWo4IQFkSJBfqdNrE1cA)",
where a human challenges the humanoid robot iCub. "All gestures you can" is a real-time memory game: the goal is to perform the longest sequence of hand gestures that you or your opponent can remember. The game is based on a gesture recognition system that exploits 3D features based on motion and appearance; these features are then enriched with a sparse coding stage and classified by linear SVMs. In the previous version we used a Kinect to obtain 3D information; here, instead, we rely directly on the stereo vision of the iCub. Furthermore, we are now able to train new gestures in real time, from a single demonstration.
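As a rough illustration of the pipeline described above (sparse coding followed by linear SVMs), here is a minimal Python sketch; the dictionary D, SVM weights W, biases b and the gesture labels are all stand-ins, not the authors' trained models:

```python
# Minimal sketch (not the authors' code) of the pipeline described above:
# 3D motion/appearance features -> sparse coding -> one-vs-all linear SVMs.
# Dictionary D, weights W and biases b are assumed to have been learned offline.
import numpy as np

def sparse_code(x, D, n_nonzero=5):
    """Greedy matching pursuit: approximate x with a few dictionary atoms."""
    residual = x.astype(float).copy()
    code = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        correlations = D.T @ residual
        k = np.argmax(np.abs(correlations))
        code[k] += correlations[k]
        residual -= correlations[k] * D[:, k]
    return code

def classify(code, W, b, labels):
    """One-vs-all linear SVM scoring: pick the class with the largest margin."""
    scores = W @ code + b
    return labels[int(np.argmax(scores))]

# Toy usage with random (stand-in) parameters.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)           # unit-norm atoms
W, b = rng.standard_normal((3, 128)), np.zeros(3)
feature = rng.standard_normal(64)        # one gesture descriptor
print(classify(sparse_code(feature, D), W, b, ["wave", "fist", "point"]))
```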

Airicist
27th November 2013, 06:24
https://vimeo.com/80345485

iCub
November 26, 2013

Airicist
17th April 2014, 10:02
https://youtu.be/rITQlGuXXOw

Enhancing Software module reusability using port plug-ins: an iCub Experiment

Published on Apr 17, 2014


Systematically developing high-quality reusable software components is a difficult task and requires careful design to find a proper balance between potential reuse, functionality and ease of implementation. Extensibility is an important property of software which helps to reduce the cost of development and significantly boosts reusability. This work introduces an approach to enhance component reusability by extending components' functionality using plug-ins at the level of the connection points (ports). Application-dependent functionality such as data monitoring and arbitration can be implemented using a conventional scripting language and plugged into the ports of components. The main advantage of our approach is that it avoids introducing application-dependent modifications to existing components, thus reducing development time and fostering the development of simpler and therefore more reusable components. Another advantage is that it reduces communication and deployment overheads, because extra functionality can be added without introducing additional modules. The video demonstrates the port plug-in in a clean-the-table scenario on the iCub humanoid robot.
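The paper's plug-ins are scripts attached to the robot's middleware ports; the Python sketch below only illustrates the general idea of attaching monitoring and arbitration logic at the connection point, and deliberately does not imitate YARP's actual API:

```python
# Illustrative sketch (not YARP's actual API) of the port plug-in idea:
# monitoring/arbitration logic attaches to the connection point, so the
# producing and consuming components stay unmodified.
from typing import Callable, Optional

class Port:
    def __init__(self):
        self._plugins: list[Callable[[object], Optional[object]]] = []
        self._subscribers: list[Callable[[object], None]] = []

    def attach_plugin(self, plugin):
        """Plug-ins may log, transform, or drop (return None) each datum."""
        self._plugins.append(plugin)

    def connect(self, callback):
        self._subscribers.append(callback)

    def write(self, data):
        for plugin in self._plugins:
            data = plugin(data)
            if data is None:        # plug-in arbitrated the datum away
                return
        for callback in self._subscribers:
            callback(data)

# A monitoring plug-in and an arbitration plug-in, attached without
# touching the components on either side of the connection.
port = Port()
port.attach_plugin(lambda d: (print("monitor:", d), d)[1])
port.attach_plugin(lambda d: d if d.get("confidence", 0) > 0.5 else None)
port.connect(lambda d: print("consumer received:", d))
port.write({"gesture": "wave", "confidence": 0.9})   # delivered
port.write({"gesture": "fist", "confidence": 0.2})   # dropped by arbitration
```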

Airicist
17th April 2014, 10:27
https://youtu.be/Dz8WM7FXKSQ

Execution of Pushing Action with Semantic Event Chains: iCub First Integration

Published on Apr 17, 2014


Here we present the first integration of the framework for manipulation execution based on the so-called "Semantic Event Chain" on the iCub robot. The Semantic Event Chain is an abstract description of the relations between the objects in the scene. It captures the change of those relations during a manipulation and thereby provides the decisive temporal anchor points by which a manipulation is critically defined.
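A minimal sketch of the idea (our paraphrase, not the authors' implementation): sample the object-object relations over time and keep only the instants where some relation changes, since those are the temporal anchor points of the chain:

```python
# Minimal sketch (ours, not the authors' implementation) of a Semantic
# Event Chain: relations between scene objects are sampled over time and
# only the instants where a relation changes are kept as anchor points.
N, T = "no-contact", "touching"

def event_chain(relation_frames):
    """Compress a sequence of relation dicts to its change points."""
    chain = []
    for t, relations in enumerate(relation_frames):
        if not chain or relations != chain[-1][1]:
            chain.append((t, relations))
    return chain

# A toy pushing action: the hand approaches the object and pushes it
# against a wall; frames as they might be sampled from perception.
frames = [
    {("hand", "obj"): N, ("obj", "wall"): N},
    {("hand", "obj"): N, ("obj", "wall"): N},
    {("hand", "obj"): T, ("obj", "wall"): N},  # contact: push begins
    {("hand", "obj"): T, ("obj", "wall"): T},  # object reaches the wall
    {("hand", "obj"): N, ("obj", "wall"): T},  # hand retracts
]
for t, rel in event_chain(frames):
    print(f"t={t}:", rel)
```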

Airicist
26th May 2014, 16:46
https://youtu.be/ErgfgF0uwUo

The 10th iCub birthday

Published on May 26, 2014


The iCub is a humanoid robot shaped like a four-year-old child. It is available as an open-system platform under the GPL license. The iCub was originally designed by a consortium of 11 partners, guided by the Italian Institute of Technology, with backgrounds ranging from engineering to neurophysiology and developmental psychology, within the RobotCub Integrated Project funded by the European Commission through its Cognitive Systems and Robotics Unit.

The iCub can crawl on all fours and sit up. Its hands allow dexterous manipulation, and its head and eyes are fully articulated. It has visual, vestibular, auditory, and tactile sensory capabilities.

In the past few years the community of researchers working on the iCub has grown at a constant pace. Today there are more than 25 iCub platforms available worldwide. Simultaneously, the platform has evolved significantly in terms of its sensors and actuators. Thanks to the recent improvements, the iCub is now equipped with: whole-body distributed tactile and force/torque sensing, series elastic actuators for compliant walking experiments (the COMAN actuators), and a movable head with microphones, speaker, actuated eyes, eyelids and lips for speech and human-robot interaction studies.

The key advantage of the iCub is that it is an integrated platform that allows the study of complex tasks requiring speech and auditory perception, vision, and the proper coordination and integration of sensory data and control. We believe that this, more than ever, requires that researchers with different expertise join forces and start working together. Having platforms with compatible software and hardware is clearly a unique opportunity for collaboration.

Airicist
29th May 2014, 14:30
https://youtu.be/jaTEbCsFp_M

iCub balancing by controlling external forces: CoDyCo project results

Published on May 29, 2014


iCub, the humanoid robot of the Italian Institute of Technology, can stand and balance even when interacting with people. Thanks to its artificial skin, which equips the robot with 4000 sensitive contact points, the iCub's control system measures the external forces and properly regulates these interactions in order to keep its balance.

These new capacities will be pivotal when iCub comes to cohabit with human beings in domestic environments. The results have been achieved by the researchers working at the Italian Institute of Technology and, in particular, by those funded by the European Project CoDyCo, coordinated by Dr. Francesco Nori.

Airicist
20th October 2014, 23:02
https://youtu.be/mQpVCSM8Vgc

3D Estimation and Fully Automated Learning of Eye-Hand Coordination in Humanoid Robots

Published on Oct 20, 2014


This work deals with the problem of 3D estimation and eye-hand calibration in humanoid robots. Using the iCub humanoid robot, we developed a fully automatic procedure based on optimization techniques that does not require any human supervision. The end-effector of the humanoid robot is automatically detected in the stereo images. We demonstrate the usefulness and the effectiveness of the proposed system in two typical robotic scenarios: (1) object grasping; (2) 3D scene reconstruction.

Fanello S.R., Pattacini U., Gori I., Tikhanoff V., Randazzo M., Roncone A., Odone F. & Metta G. 2014, ‘3D Stereo Estimation and Fully Automated Learning of Eye-Hand Coordination in Humanoid Robots’, IEEE-RAS International Conference on Humanoid Robots, Madrid, Spain, November 18-20, 2014.
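To make the calibration idea concrete, here is a hedged sketch: given end-effector positions predicted by the arm kinematics and the corresponding positions triangulated from the stereo images, a least-squares (Kabsch) fit recovers the rigid transform between the two. This is our illustration of the principle, not the paper's exact optimization:

```python
# Hedged sketch of the calibration idea: align 3D end-effector positions
# predicted by the kinematics with the same positions triangulated from
# stereo vision, via a least-squares rigid fit (Kabsch / Procrustes),
# with no human supervision. All data below are synthetic.
import numpy as np

def fit_rigid_transform(kin_pts, vis_pts):
    """Least-squares rotation R and translation t with vis ~ R @ kin + t."""
    kc, vc = kin_pts.mean(axis=0), vis_pts.mean(axis=0)
    H = (kin_pts - kc).T @ (vis_pts - vc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, vc - R @ kc

# Synthetic check: recover a known miscalibration from noisy samples.
rng = np.random.default_rng(1)
true_R, true_t = np.eye(3), np.array([0.02, -0.01, 0.03])   # 2-3 cm offsets
kin = rng.uniform(-0.3, 0.3, size=(50, 3))                  # reached poses
vis = kin @ true_R.T + true_t + rng.normal(0, 1e-3, kin.shape)
R, t = fit_rigid_transform(kin, vis)
print("estimated offset [m]:", np.round(t, 3))
```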

Airicist
22nd October 2014, 09:28
https://youtu.be/pfse424t5mQ

Automatic kinematic chain calibration using artificial skin

Published on Oct 22, 2014


This video deals with the problem of self- (or double-) touch: the robot touches itself on a specific region of the skin, which represents an unprecedented opportunity for a humanoid robot to achieve the simultaneous activation of multiple skin parts (in this video, patches belonging to the right hand and the left forearm). This wealth of information can then be used to perform a completely autonomous calibration of the body model (i.e., kinematic calibration or tactile calibration). In the reference paper cited below, this competence has been used to perform a kinematic calibration of both the right and the left arms.

Roncone A., Hoffmann M., Pattacini U. & Metta G. 2014, ‘Automatic kinematic chain calibration using artificial skin: self-touch in the iCub humanoid robot’, IEEE International Conference on Robotics and Automation (ICRA 2014), pp. 2305-2312, Hong Kong, China, May 31-June 7, 2014.
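The principle can be illustrated with a toy example (entirely ours, far simpler than the paper's method): when the fingertip of one chain touches a known taxel of another, the residual between the two predictions reveals the calibration error. Below, a single encoder offset of a planar two-link arm is recovered by minimizing that residual:

```python
# Our toy illustration of the self-touch calibration principle: when the
# fingertip touches a known taxel, any residual distance between the two
# kinematic predictions must come from calibration error. Here a single
# joint-encoder offset of a planar 2-link arm is recovered by brute-force
# minimization of that residual.
import numpy as np

L1, L2 = 0.25, 0.20                      # link lengths [m] (made up)

def fingertip(q, offset=0.0):
    """Forward kinematics of a planar 2-link arm with an encoder offset."""
    q1, q2 = q[0] + offset, q[1]
    return np.array([L1*np.cos(q1) + L2*np.cos(q1+q2),
                     L1*np.sin(q1) + L2*np.sin(q1+q2)])

true_offset = np.deg2rad(4.0)            # miscalibration to recover
rng = np.random.default_rng(2)
configs = rng.uniform(-1.0, 1.0, size=(30, 2))     # self-touch postures
# Taxel positions give the *true* fingertip location (skin as ground truth).
taxels = np.array([fingertip(q, true_offset) for q in configs])

candidates = np.deg2rad(np.linspace(-10, 10, 2001))
residual = [np.mean([np.linalg.norm(fingertip(q, c) - p)
                     for q, p in zip(configs, taxels)]) for c in candidates]
best = candidates[int(np.argmin(residual))]
print(f"recovered offset: {np.rad2deg(best):.2f} deg (true 4.00 deg)")
```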

Airicist
11th November 2014, 15:12
https://youtu.be/neiX_eP4qq4

Learning grasp dependent pull affordances of tools on the iCub humanoid robot

Published on Nov 11, 2014


This video shows a condensed version of the experiment carried out to study how tool affordances can be learned and predicted based on the tool’s functional/geometrical features. In a nutshell, on each trial the robot associates the geometrical features of the tool with the affordance of the action performed with that tool grasped in a particular orientation. Affordance here is defined as the vector representing the measure of the effect as a function of an action parameter. In this case the action parameter is the position of the tool with respect to the object, while the effect is the displacement of the object after the pull action.
In this experiment we use 3 grasp orientations and 4 tools. After enough trials have been recorded, a classifier is trained to predict the affordance of a tool based on its geometrical features. Based on this prediction, the iCub is able to determine how to position the tool so as to pull the object closer more reliably.
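A toy sketch of the learning step (ours; the paper's features and classifier differ): each recorded trial pairs tool features and grasp orientation with the measured effect vector, and a k-nearest-neighbour regressor predicts the affordance of an unseen tool:

```python
# Our toy sketch of the learning step described above: each trial pairs
# (tool geometrical features, grasp orientation) with the measured object
# displacement profile; a regressor then predicts the affordance of an
# unseen tool. A k-nearest-neighbour predictor keeps the sketch minimal.
import numpy as np

rng = np.random.default_rng(5)

def knn_predict(X, Y, x_query, k=3):
    """Average the affordance vectors of the k most similar past trials."""
    idx = np.argsort(np.linalg.norm(X - x_query, axis=1))[:k]
    return Y[idx].mean(axis=0)

# 40 recorded trials: 4 tool-geometry features + 1 grasp-orientation code,
# each paired with displacement measured at 5 tool-object positions.
X = rng.uniform(0, 1, size=(40, 5))
Y = rng.uniform(0, 0.15, size=(40, 5))       # effect vectors [m] (synthetic)
new_tool = rng.uniform(0, 1, size=5)
affordance = knn_predict(X, Y, new_tool)
best_position = int(np.argmax(affordance))   # where pulling works best
print("predicted displacements:", np.round(affordance, 3))
print("best action parameter (tool-object position index):", best_position)
```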

Airicist
15th November 2014, 13:01
https://youtu.be/MQLwzmi3noc

Improvements of the iCub balancing

Published on Nov 15, 2014


This video shows some of the latest results achieved in the whole-body control of iCub, the humanoid robot of the Italian Institute of Technology. In particular, it shows the improvements of the balancing controller, which now optimizes the internal torques according to bounds on the external wrenches (i.e., the forces and torques at the feet). These bounds ensure that, for instance, the robot's feet do not slip even when performing highly dynamic tasks.

The results have been achieved by the researchers working at the Italian Institute of Technology and, in particular, by those funded by the European Project CoDyCo (http://codyco.eu), coordinated by Dr. Francesco Nori.
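For a concrete feel of what such wrench bounds look like, here is a small hedged sketch (our illustration, with made-up numbers): a foot wrench is admissible when its tangential force stays inside the friction cone and the centre of pressure stays inside the sole:

```python
# Hedged sketch of the kind of wrench bound mentioned above: a contact
# wrench at a foot is admissible if its tangential force stays inside the
# friction cone and the centre of pressure stays inside the support
# polygon; controllers of this family pick torques whose resulting
# wrenches satisfy such bounds. All numbers are illustrative.
import numpy as np

MU = 0.5                              # friction coefficient (assumed)
FOOT_HALF = np.array([0.06, 0.035])   # half-length/width of the sole [m]

def wrench_admissible(f, tau):
    """f = [fx, fy, fz], tau = [taux, tauy] expressed at the sole centre."""
    fz = f[2]
    if fz <= 0:                               # contact must push, not pull
        return False
    if np.hypot(f[0], f[1]) > MU * fz:        # friction cone: no slipping
        return False
    cop = np.array([-tau[1], tau[0]]) / fz    # centre of pressure
    return bool(np.all(np.abs(cop) <= FOOT_HALF))  # CoP inside the sole

print(wrench_admissible(np.array([10.0, 0.0, 100.0]), np.array([1.0, -2.0])))
print(wrench_admissible(np.array([80.0, 0.0, 100.0]), np.array([0.0, 0.0])))
```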

Airicist
20th November 2014, 22:51
https://youtu.be/l8Qz6lFkuPc

iCub, humanoid robot, Italian Institute of Technology
ET2014
ST Microelectronics booth

Published on Nov 20, 2014

Airicist
19th December 2014, 16:10
https://youtu.be/6_jBmYrNRmw

iCub Philosophy, Some History & Recent Results - video lecture by Prof. Giorgio Metta

Published on Dec 19, 2014


The iCub project was started 10 years ago within the field of humanoid robotics, focused mostly on building models of cognitive behaviours. The goals of the project include understanding how the human brain solves certain problems, precisely so that this knowledge can be used within robotics to build models that allow robots to solve the same problems autonomously and hence become highly intelligent systems.

20 years ago neurophysiologists discovered special types of neurons called visuomotor neurons. This discovery triggered a shift in the way robots were programmed to learn and perceive: mirroring the function of these neurons, the goal was no longer only for the robot to perceive its surroundings, but also to act upon the newly acquired information, to carry out a task, solve a problem, and imitate what a person would do when he or she perceives and processes external information.

To translate these goals into practice, the first experiments focused on recording human grasping actions via data gloves and trackers; these recorded actions, translated into images, were then used to build a classifier to help the robot better predict which action to carry out when perceiving certain information. Furthermore, neuroscience studies showing that it is not so important to recognise the object to be grasped, but enough to recognise its shape, helped Prof. Metta and his team build models that prepared the robot to recognise the shape of an object just before executing an action such as grasping it. This made it possible for the robot to execute more complex behaviours: recognising not only the objects to be grasped or pushed, but also tools that may help the robot do something with the object in question. In one of the experiments led by Prof. Metta, this translated into an iCub grasping a tool to bring closer an object placed too far away for the robot to grasp directly.

Watch Prof. Metta’s presentation to find out more about these experiments using the iCub robot and what his latest research efforts are all about.

Airicist
24th February 2015, 12:03
https://youtu.be/VrPBSSQEr3A

iCub balancing on one foot while interacting with humans

Published on Feb 24, 2015


This video shows the latest results achieved in the whole-body control of the iCub, the humanoid robot developed by the Italian Institute of Technology. In particular, it shows the performance of the balancing controller when the robot stands on one foot. The knowledge of the robot dynamics and the measurement of external perturbations allow for safely interacting with humans as well as controlling highly dynamic motions.

The control of the robot is achieved by regulating the interaction forces between the robot and its surrounding environment. In particular, the force and torque exchanged between the robot's foot and the floor are regulated so that the robot keeps its balance even when strongly perturbed.

These new capacities will be pivotal when iCub comes to cohabit with human beings in domestic environments. The results have been achieved by the researchers working at the Italian Institute of Technology and, in particular, by those funded by the European Projects CoDyCo and Koroibot, with Dr. Francesco Nori as principal investigator.

Airicist
12th March 2015, 12:43
https://youtu.be/h1WnXQzBbSk

Codyco Review Music

Published on Mar 12, 2015


The iCubParis01 is getting in shape for the CoDyCo second year review meeting.

Airicist
18th March 2015, 19:06
https://youtu.be/RW425vW4BtQ

Humanoid robot has a sense of self

Published on Mar 18, 2015

Airicist
22nd April 2015, 14:10
https://youtu.be/EUqJb5zsaKg

iCub balancing while performing goal directed actions: CoDyCo 2nd year validation

Published on Apr 22, 2015


iCub, the humanoid robot of the Italian Institute of Technology, can stand, balance, and perform goal-directed actions while interacting with people. Thanks to its artificial skin, which equips the robot with 4000 sensitive contact points, the iCub's control system measures the external forces and properly regulates these interactions in order to keep its balance.

These new capacities will be pivotal when iCub comes to cohabit with human beings in domestic environments. The results have been achieved by the researchers working at the Italian Institute of Technology and, in particular, by those funded by the European Project CoDyCo, coordinated by Dr. Francesco Nori.

Airicist
5th May 2015, 20:15
https://youtu.be/8hl5AUN5LNo

Transferring grasping skills from Armar (https://pr.ai/showthread.php?5788) (KIT) to the iCub

Published on May 5, 2015


As part of the Xperience project, we used the Armar grasping planner to teach the iCub to grasp an object.

Airicist
5th May 2015, 20:22
https://youtu.be/ghUFweqm7W8

Interactive object learning on the iCub robot with Caffe libraries

Published on May 5, 2015


This video shows the new visual recognition capabilities of the iCub. Recognition is based on a classifier trained on top of the output of a deep convolutional neural network.

Airicist
11th May 2015, 21:49
https://youtu.be/S7Kk6KEw3C4

Zero force control

Published on May 11, 2015


Compliant control by integrating force and tactile feedback.

Airicist
29th May 2015, 15:19
https://youtu.be/jsBAiAoCH58

Torque Control balancing on iCub@Heidelberg

Published on May 29, 2015


This video shows the latest results achieved in the whole-body control of the iCub, the humanoid robot developed by the Italian Institute of Technology. In particular, it shows the performance of the balancing controller on the specific platform of iCub@Heidelberg, which has a different hardware configuration from the classical iCub.

The control of the robot is achieved by regulating the interaction forces between the robot and its surrounding environment. In particular, the force and torque exchanged between the robot's feet and the floor are regulated so that the robot keeps its balance even when strongly perturbed.

The results have been achieved by the researchers working at the Italian Institute of Technology and, in particular, by those funded by the European Projects CoDyCo and Koroibot, with Dr. Francesco Nori as principal investigator.

Airicist
11th June 2015, 17:35
https://youtu.be/fsgEUI1w4fI

iCub walking

Published on Jun 11, 2015

Airicist
24th June 2015, 13:05
https://youtu.be/DE3VynOr6HE

In Studio "iCub" puppy robot, and Roberto Cingolani

Published on Jun 24, 2015


iCub is like a five-year-old child, looking for solutions to problems we do not yet know. It is not remote controlled; it is a humanoid, and it does everything by itself. It aims to recreate the connection between body and mind found in humans. iCub is an Italian point of excellence, a global product of IIT, and was born in Genoa. Piero Angela: "The Italian Institute of Technology has 1,500 people from 56 countries."

Airicist
26th June 2015, 07:01
https://vimeo.com/51011081

Toward Intelligent Humanoids | iCub 2012
October 8, 2012


Director / Screenwriter / Voice Over: Mikhail Frank
Co-Director / Cinematographer / Editor: Tomas Donoso (tomasdonoso.com)

Airicist
4th July 2015, 14:02
https://youtu.be/76SqqHHBgng

The iCub audio and visual attention system

Published on Jul 4, 2015


Saliency-based sensor fusion of the audio and visual channels into a unique saliency map. An additional gaze strategy directs the iCub's head towards stimuli outside the field of view.

Airicist
8th July 2015, 21:45
https://youtu.be/5DsLpl-ohlQ

The iCub Project: a shared platform for research in artificial intelligence and robotics

Published on Jul 8, 2015

Airicist
21st September 2015, 08:05
https://youtu.be/3IaXxNwC_7E

Learning Peripersonal Space on the iCub

Published on Sep 21, 2015


In this video, the tactile system is used in order to build a representation of space immediately surrounding the body - peripersonal space. In particular, the iCub skin acts as a reinforcement for the visual system, with the goal of enhancing the perception of the surrounding world. By exploiting a temporal and spatial congruence between a purely visual event (e.g. an object approaching the robot’s body) and a purely tactile event (e.g. the same object eventually touching a skin part), a representation is learned that allows the robot to autonomously establish a margin of safety around its body through interaction with the environment - extending its cutaneous tactile space into the space surrounding it.
We considered a scenario where external objects approach individual skin parts. A volume was chosen to demarcate a theoretical visual “receptive field” around every taxel. Learning then proceeds in a distributed, event-driven manner: every taxel stores and continuously updates a count of the positive (resulting in contact) and negative examples it has encountered.
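A minimal Python sketch of this per-taxel, event-driven bookkeeping (our paraphrase of the rule, with synthetic data and made-up bin sizes):

```python
# Minimal sketch of the event-driven, per-taxel learning rule described
# above (our paraphrase of the idea, not the authors' code): each taxel
# keeps counts of approaching events that did / did not end in contact,
# binned by distance, and turns them into a contact-probability map.
import numpy as np

class Taxel:
    def __init__(self, n_bins=10, max_dist=0.20):
        self.edges = np.linspace(0.0, max_dist, n_bins + 1)
        self.pos = np.zeros(n_bins)    # events that ended in contact
        self.neg = np.zeros(n_bins)    # events that did not

    def update(self, distance, contact):
        """Record one observed approach event at the given distance [m]."""
        b = min(np.searchsorted(self.edges, distance) - 1, len(self.pos) - 1)
        b = max(b, 0)
        (self.pos if contact else self.neg)[b] += 1

    def p_contact(self):
        """Per-bin contact probability (Laplace-smoothed)."""
        return (self.pos + 1) / (self.pos + self.neg + 2)

taxel = Taxel()
rng = np.random.default_rng(3)
for _ in range(500):
    d = rng.uniform(0, 0.20)
    taxel.update(d, contact=rng.random() < np.exp(-d / 0.04))  # near => hit
print(np.round(taxel.p_contact(), 2))   # high near the skin, low far away
```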

Airicist
21st September 2015, 20:44
https://youtu.be/_2CYrSIwy2Q

Robot that can talk, see and hear engages with humans

Published on Sep 17, 2015


iCub, a talking humanoid robot head that is being taught how to talk and engage in natural interaction with humans, was unveiled at the ScotSoft Developers Conference in Edinburgh.

Airicist
20th October 2015, 00:01
https://youtu.be/LBmYkjDAxf8

iCub Eye Calibration

Published on Oct 19, 2015

Airicist
10th November 2015, 21:32
https://youtu.be/RBP4BvW4RBs

iCub getting better in balancing on one foot

Published on Nov 10, 2015


This video shows some of the work going on at the Italian Institute of Technology aimed at improving the capacities of iCub when balancing on one foot.

Airicist
19th February 2016, 11:12
https://youtu.be/ZeXuHur495c

Published on Jun 8, 2015



Airicist
28th April 2016, 16:33
https://youtu.be/UPOLcE1vwA0

iCub performing YOGA++ demo

Published on Apr 28, 2016


The video shows the results of the iCub YOGA++ demo in simulation (Gazebo). The simulation can be replicated on Linux and macOS.

Airicist
4th May 2016, 14:41
https://youtu.be/I4ZKfAvs1y0

A Cartesian 6-DoF gaze controller for humanoid robots

Published on May 4, 2016


This video shows how we address the problem of controlling the 3D fixation point of a binocular, 6 degrees-of-freedom (DoF), anthropomorphic head. It is possible to define the fixation point as the virtual end-effector of the kinematic chain composed of the neck and the eyes. Consequently, the control of the fixation point can be achieved using techniques for inverse kinematics and trajectory generation normally adopted for controlling robotic arms. Further, the redundancy of the task allows for the integration of different corollary behaviors in addition to the main control loop: the vestibulo-ocular reflex (VOR), saccadic behavior, and gaze stabilization (for a video on the gaze stabilization system, please refer to https://youtu.be/NSGea-tCLZM).

REFERENCE PAPER: Roncone A., Pattacini U., Metta G., Natale L. 2016, 'A Cartesian 6-DoF Gaze Controller for Humanoid Robots', Proceedings of Robotics: Science and Systems (RSS), Ann Arbor, MI, June 18-22, 2016.
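As a hedged geometric aside, the fixation point of the two gaze rays can be triangulated as the point closest to both lines; the controller in the paper goes further and treats that point as the virtual end-effector of the neck-eye chain, while the sketch below (with a made-up, iCub-like eye baseline) shows only the triangulation step:

```python
# Hedged sketch related to the idea above: the fixation point is the 3D
# point closest to both eyes' gaze rays (they rarely intersect exactly).
# Only the triangulation step is shown; the geometry is made up.
import numpy as np

def fixation_point(p_left, d_left, p_right, d_right):
    """Midpoint of the common perpendicular between two gaze rays."""
    d_left = d_left / np.linalg.norm(d_left)
    d_right = d_right / np.linalg.norm(d_right)
    w = p_left - p_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w, d_right @ w
    denom = a * c - b * b                 # ~0 when the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p_left + s * d_left) + (p_right + t * d_right))

eyes_apart = 0.068                         # iCub-like baseline [m], assumed
p_l, p_r = np.array([-eyes_apart/2, 0, 0]), np.array([eyes_apart/2, 0, 0])
target = np.array([0.05, 0.0, 0.5])        # a point half a metre ahead
print(fixation_point(p_l, target - p_l, p_r, target - p_r))  # ~ target
```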

Airicist
29th July 2016, 16:45
https://youtu.be/9XRI4BeXN78

iCub performing highly dynamic balancing via force control

Published on Jul 29, 2016


This video shows the latest results on the whole-body control of humanoid robots achieved by the Dynamic Interaction Control Lab at the Italian Institute of Technology.

The control of the robot is achieved by regulating the interaction forces between the robot and its surrounding environment. The force and torque exchanged between the robot's feet and the floor are regulated so that the robot keeps its balance even when strongly perturbed.

In particular, the control architecture is composed of two nested control loops. The internal loop, which runs at 1 kHz, is in charge of stabilizing any desired joint torque. This task is achieved thanks to an off-line identification procedure providing a reliable model of friction and motor constants. The outer loop, which generates desired joint torques at 100 Hz, is a momentum-based control algorithm expressed in the formalism of free-floating systems subject to constraints (i.e., Differential Algebraic Equation frameworks). More precisely, the control objective for the outer loop is the stabilization of the robot’s linear and angular momentum and of the associated zero dynamics; the latter objective can be used to stabilize a desired joint configuration. The stability of the control framework is shown in the sense of Lyapunov. The contact forces and torques are regulated so as to break contact only at desired configurations, and switching between several contacts is taken into account by a finite state machine that dictates the constraints acting on the system. The control framework is implemented on the iCub humanoid robot.
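As a hedged, one-joint caricature of this two-rate architecture (all constants below are made up, and the real controller is a multi-joint momentum optimization), the structure looks roughly like this:

```python
# Conceptual sketch (ours) of the nested-loop structure described above,
# reduced to one joint: an outer 100 Hz loop turns a momentum error into a
# desired torque, and an inner 1 kHz loop tracks that torque by choosing
# motor current using identified friction and motor-constant models.
K_MOTOR = 0.1        # torque per unit current (identified offline, assumed)
FRICTION = 0.05      # viscous friction coefficient (assumed)
INERTIA = 0.02
KP_MOMENTUM = 5.0

def outer_loop(momentum, momentum_des):
    """100 Hz: momentum-based control computes the desired joint torque."""
    return KP_MOMENTUM * (momentum_des - momentum)

def inner_loop(tau_des, velocity):
    """1 kHz: invert the motor model (with friction compensation)."""
    return (tau_des + FRICTION * velocity) / K_MOTOR

velocity, momentum_des = 0.0, 0.4
for step in range(2000):                      # 2 s of simulated time at 1 kHz
    if step % 10 == 0:                        # outer loop runs every 10 ticks
        tau_des = outer_loop(INERTIA * velocity, momentum_des)
    current = inner_loop(tau_des, velocity)
    tau = K_MOTOR * current - FRICTION * velocity   # actuated joint dynamics
    velocity += (tau / INERTIA) * 1e-3
print(f"momentum after 2 s: {INERTIA * velocity:.3f} (target {momentum_des})")
```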

Airicist
18th October 2016, 14:40
https://youtu.be/azLSd13PbDQ

Hierarchical grasp controller using tactile feedback

Published on Oct 18, 2016


iCub uses tactile feedback to control and improve object grip.

Details in: M. Regoli, U. Pattacini, G. Metta and L. Natale, "Hierarchical Grasp Controller using Tactile Feedback", Humanoids 2016.

Airicist
29th October 2016, 10:18
https://youtu.be/iQ75jQV1DGI

HeiCub walking motion

Published on Jan 5, 2016


https://youtu.be/I5TyLOfVa-M

HeiCub squat motions

Published on Jan 5, 2016


https://youtu.be/A1A5R5EXItE

HeiCub walking with NMPC based pattern generator

Published on Oct 28, 2016


https://youtu.be/zn6GmnCYg8Q

Implementation and performance analysis of walking on the humanoid robot iCub

Published on Oct 28, 2016


In this video the HeiCub robot performs walking for the first time: on level ground, up and down a slope, and on small stairs. The motion is generated by means of a ZMP-based pattern generator.

The HeiCub humanoid robot is a reduced version of the iCub humanoid robot located at Heidelberg University, Germany.

Airicist
30th October 2016, 00:34
https://youtu.be/eN9uoESpF-8

HeiCub performing torque balancing with contacts switching

Published on Oct 29, 2016


In this video the HeiCub robot performs the YOGA++ demo using the same control software, developed by IIT, as used on the full iCub robot in youtube.com/watch?v=UPOLcE1vwA0 (https://www.youtube.com/watch?v=UPOLcE1vwA0)

The HeiCub humanoid robot is a reduced version of the iCub humanoid robot located at Heidelberg University, Germany.

Airicist
2nd December 2016, 17:19
Article "Robot Toddlers Show Scientists a Child's Learning Process (https://www.natureworldnews.com/articles/33034/20161130/robot-toddlers-show-scientists-childs-learning-process.htm)"

by Jaimee Bruce
November 30, 2016

Airicist
30th March 2017, 23:22
https://youtu.be/yIrepoNJyGc

iCub - Torque-Control Stepping Strategy Push Recovery - IIT

Published on Feb 15, 2017


=== How can a robot avoid falling? ===

In this video we present the implementation on iCub of a so-called stepping strategy, whose aim is to recover the robot's balance by taking a step.

This approach mimics human reactions to pushes, but its development relies on simple models like the Linear Inverted Pendulum.

After the push, the robot detects that it is the right moment to start moving; it then places the swing foot in a position computed from the perturbation, ensuring the robot won't fall once in double support.

These two pieces of information (the trigger and the foot target position), together with the definition of a Center of Mass trajectory, constitute a planner able to provide references to the Momentum-Based Whole-Body Controller implemented on iCub. A peculiarity of this approach is its application to a torque-controlled robot, which provides additional robustness to, for example, impacts and foot-placement errors.
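For intuition, the Linear Inverted Pendulum mentioned above yields the textbook "capture point" rule for where to place the swing foot; the sketch below illustrates that rule only and is not the exact planner running on iCub:

```python
# Hedged sketch of the Linear Inverted Pendulum reasoning behind such a
# stepping strategy: after a push, the "capture point" tells the robot
# where to place its swing foot so that it can come to rest. This is the
# textbook LIP formula, not the exact planner used on iCub.
import math

G = 9.81

def capture_point(com_pos, com_vel, com_height):
    """Instantaneous capture point of the Linear Inverted Pendulum (1D)."""
    omega = math.sqrt(G / com_height)
    return com_pos + com_vel / omega

# A push gives the centre of mass 0.4 m/s; with the CoM at 0.5 m the
# robot should step about 0.09 m ahead of the CoM to absorb it.
step_target = capture_point(com_pos=0.0, com_vel=0.4, com_height=0.5)
print(f"place swing foot at x = {step_target:.3f} m")
# If the capture point leaves the current support polygon, balancing in
# place is no longer possible and the step is triggered.
```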

Airicist
5th April 2017, 14:00
https://youtu.be/xS-7xYRYSLc

Robust visual tracking with a freely moving event camera

Published on Mar 1, 2017


The iCub follows a moving target using event-based cameras. A novel event-based particle filter tracks the ball position at over 200 Hz.
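As a rough illustration (ours, not the paper's event-driven filter), a generic particle filter over the ball's image position looks like this; events near a particle raise its weight:

```python
# Generic particle-filter sketch (our illustration; the paper's filter is
# event-driven and more sophisticated): particles carry candidate ball
# positions, are diffused by a motion model, weighted by how well nearby
# events support them, and resampled.
import numpy as np

rng = np.random.default_rng(4)
N = 200
particles = rng.uniform(0, 128, size=(N, 2))     # (x, y) in pixels
weights = np.full(N, 1.0 / N)

def pf_update(events, motion_std=2.0, likelihood_std=4.0):
    """One filter step given a batch of (x, y) event coordinates."""
    global particles, weights
    particles += rng.normal(0, motion_std, particles.shape)   # predict
    # weight: events close to a particle make it more plausible
    d2 = ((particles[:, None, :] - events[None, :, :]) ** 2).sum(-1)
    weights *= np.exp(-d2 / (2 * likelihood_std ** 2)).sum(axis=1) + 1e-12
    weights /= weights.sum()
    idx = rng.choice(N, size=N, p=weights)                    # resample
    particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles.mean(axis=0)                             # estimate

# Simulated ball at (64, 40) generating a handful of noisy events:
events = np.array([64.0, 40.0]) + rng.normal(0, 1.5, size=(30, 2))
for _ in range(10):
    estimate = pf_update(events)
print("ball estimate:", np.round(estimate, 1))
```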

Airicist
23rd May 2017, 23:46
https://youtu.be/PmPUI-xX0CA

A grasping approach based on superquadric models

Published on May 22, 2017


This video demonstrates grasping of objects using superquadric models. The method is described in the following paper:

Vezzani, G., Pattacini, U., and Natale, L., "A Grasping Approach Based on Superquadric Models", in IEEE International Conference on Robotics and Automation, Singapore, 2017

Airicist
5th June 2017, 13:40
https://youtu.be/2Jmm4zel134

iCub - Behavior-based use of tool affordances for a table cleaning task

Published on Jun 5, 2017


Tanis Mar, Vadim Tikhanoff, Lorenzo Natale

Airicist
3rd August 2017, 10:26
https://youtu.be/vHlPjv8tFUE

Visual end-effector tracking using a 3D model-aided particle filter for humanoid robot platforms

Published on Aug 3, 2017


This video demonstrates recursive markerless estimation of a robot’s end-effector using visual observations from its cameras. The problem is formulated in the Bayesian framework and addressed using Sequential Monte Carlo (SMC) filtering. We demonstrate that the tracking is robust to clutter, allows compensating for errors in the robot kinematics, and enables servoing the arm in closed loop using vision.

The method is described in the following paper:

C. Fantacci, U. Pattacini, V. Tikhanoff and L. Natale, "Visual end-effector tracking using a 3D model-aided particle filter for humanoid robot platforms", IEEE/RSJ International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada, September 24-28, 2017.

Airicist
10th November 2017, 16:02
https://youtu.be/2qcaLLipqPA

Markerless visual servoing on unknown objects for humanoid robot platforms

Published on Nov 10, 2017


This video shows a new framework for markerless visual servoing on unknown objects in action. The pipeline consists of four main parts:
1) a least-squares minimization problem is formulated to find the volume of the object graspable by the robot’s hand using its stereo vision;
2) a recursive Bayesian filtering technique, based on Sequential Monte Carlo (SMC) filtering, estimates the 6D pose (position and orientation) of the robot’s end-effector without the use of markers;
3) a nonlinear constrained optimization problem is formulated to compute the desired graspable pose about the object;
4) an image-based visual servo control commands the robot’s end-effector toward the desired pose (a minimal sketch of this control law follows after the reference below).

The method is described in the following preprint arXiv paper:

C. Fantacci, G. Vezzani, U. Pattacini, V. Tikhanoff and L. Natale, "Markerless visual servoing on unknown objects for humanoid robot platforms", arXiv preprint arXiv:1710.04465, 2017.
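For step 4 of the pipeline, the classical image-based visual servoing law computes a camera twist from the feature error via the pseudo-inverse of the interaction matrix. The sketch below shows that textbook law for point features (our illustration, with made-up feature values), on which controllers of this kind are built:

```python
# Hedged sketch of classical image-based visual servoing for point
# features: commanded camera twist v = -lambda * pinv(L) @ error.
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of one normalized image point."""
    return np.array([
        [-1/Z,    0, x/Z, x*y,     -(1 + x*x),  y],
        [   0, -1/Z, y/Z, 1 + y*y, -x*y,       -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Camera twist [vx vy vz wx wy wz] driving features to their goals."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Four points currently seen slightly off from where we want them:
current = [(0.12, 0.11), (0.32, 0.11), (0.12, 0.31), (0.32, 0.31)]
goal    = [(0.10, 0.10), (0.30, 0.10), (0.10, 0.30), (0.30, 0.30)]
v = ibvs_velocity(current, goal, depths=[0.5] * 4)
print("commanded camera twist:", np.round(v, 3))
```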

Airicist
7th April 2018, 10:39
https://youtu.be/33F2G5mYiHY

iCub dynamic balancing and walking

Published on Mar 30, 2018


This video shows the latest results in iCub whole-body control achieved by the Dynamic Interaction Control lab at the Italian Institute of Technology. In particular, the iCub balancing capabilities have been improved considerably, and the reactive quadratic-programming based controller ensures balance and safe interaction. Also, iCub walking capabilities have been implemented by means of on-line reactive model-predictive-control algorithms.

Airicist
12th September 2018, 14:36
https://youtu.be/jemGKRxdAM8

iCub teleoperated walking and manipulation

Published on Sep 12, 2018


Cite this contribution
- Teleoperation:
"Telexistence and Teleoperation for Walking Humanoid Robots"
submitted to IEEE Humanoids 2018

- Walking:
"A Benchmarking of DCM Based Architectures for Position and Velocity Controlled Walking of Humanoid Robots"
submitted to IEEE Humanoids 2018

This video shows the latest results achieved by the Dynamic Interaction Control Lab at the Italian Institute of Technology on teleoperated walking and manipulation for humanoid robots.

We have integrated the iCub walking algorithms with a new teleoperation system, thus allowing a human being to teleoperate the robot during locomotion and manipulation tasks.

Airicist
3rd January 2019, 23:01
https://youtu.be/vP70QCZhi8w

Dynamic Interaction Control lab's 2018 year in review

Published on Jan 2, 2019


This video reviews the research results of the Dynamic Interaction Control lab obtained in 2018.

Airicist
3rd September 2019, 18:17
https://youtu.be/KTpRiPFbmuE

Will this ability put robots everywhere?

Published on Sep 3, 2019


Meet the remarkable iCub that learns like a child, and can share its expertise with others.

Revolutions: The Ideas that Changed the World is the extraordinary story of six remarkable inventions: The Aeroplane, The Car, The Rocket, The Smartphone, The Telescope, The Robot. They are familiar, yet hidden within them are thousands of years of thought, struggle, sacrifice, determination and insight.

Each episode explores little-known stories and is packed with incredible ideas. The result is a mind-blowing science-led journey through human history, full of unintended consequences and incredible connections. It reveals how science, invention and technology build on one another to change everything. Mostly, it celebrates the achievements of some of the greatest minds in human history.

Revolutions: The Ideas that Changed the World | Episode 6 Robots | BBC

Airicist
9th October 2020, 19:00
https://youtu.be/Fb_R6IDDU4A

iCub reactive walking

Oct 9, 2020


This video shows the latest results in the whole-body locomotion control of the humanoid robot iCub achieved by the Dynamic Interaction Control line (https://dic.iit.it/). In particular, the iCub now keeps its balance while walking and receiving pushes from an external user. The implemented control algorithms also ensure that the robot remains compliant during locomotion and human-robot interaction, a fundamental property for lowering the risk of harming humans who share the robot's surrounding environment. The algorithms have been published in the proceedings of the 2019 IEEE-RAS 19th International Conference on Humanoid Robots (Humanoids) ieeexplore.ieee.org/document/9034996 (https://ieeexplore.ieee.org/document/9034996), and the video shows their validation on the humanoid robot iCub.

Airicist
18th July 2021, 16:55
https://youtu.be/t_bVOQnorPs

IFRR Robotics Global Colloquium "10 years with iCub"

May 6, 2021


iCub is a humanoid robot designed to support research in embodied AI. At 104 cm tall, iCub has the size of a five-year-old child. It can crawl on all fours, walk, and sit up to manipulate objects. Its hands have been designed to support sophisticated manipulation skills. iCub is distributed as Open Source following the GPL licenses and can now count on a worldwide community of enthusiastic developers. The entire design is available for download from the project’s repositories (http://www.iCub.org). More than 40 robots have been built so far, available in laboratories across Europe, the US, Korea, Singapore, and Japan. It is one of the few platforms in the world with a sensitive full-body skin to deal with physical interaction with the environment, possibly including people. I will present the iCub project in its entirety, showing how it is evolving towards fulfilling the dream of a personal humanoid in every home.

The iCub stance on artificial intelligence postulates that manipulation plays a fundamental role in the development of cognitive capability. As many of these basic skills are not ready-made at birth, but develop during ontogenesis, we aimed at testing and developing this paradigm through the creation of a child-like humanoid robot: the iCub. This “baby” robot is meant to act in daily-life scenarios, performing tasks useful for learning while interacting with objects and people. The small (104 cm tall), compact size (approximately 29 kg, fitting within the volume of a child) and high number (53) of degrees of freedom, combined with the Open-Source approach, distinguish iCub from other humanoid robotics projects developed worldwide.

Airicist2
11th March 2022, 23:41
https://youtu.be/NMzhDqVgVvk

Redball++

Mar 7, 2022


This video shows the integration of the pose tracking and grasping with superquadric functions on the iCub humanoid robot.

Details of the components used in this work:

Piga, N., Onyshchuk, Y., Pasquale, G., Pattacini, U., and Natale, L., ROFT: Real-time Optical Flow-aided 6D Object Pose and Velocity Tracking, IEEE Robotics and Automation Letters, vol. 7, no. 1, pp. 159-166, 2022 ieeexplore.ieee.org/document/9568706 (https://ieeexplore.ieee.org/document/9568706)

Nguyen, P. D. H., Bottarel, F., Pattacini, U., Hoffmann, M., Natale, L., and Metta, G., Merging Physical and Social Interaction for Effective Human-Robot Collaboration, in Proc. IEEE-RAS International Conference on Humanoid Robots, Beijing, China, 2018, pp. 1-9.