
View Full Version : Heni Ben Amor



Airicist
18th April 2014, 08:52
Assistant Professor and Principal Investigator at Interactive Robotics Lab (https://pr.ai/showthread.php?16773)

ias.informatik.tu-darmstadt.de/Member/HeniBenAmor (https://www.ias.informatik.tu-darmstadt.de/Member/HeniBenAmor)

vimeo.com/user27009916 (https://vimeo.com/user27009916)

facebook.com/ben.a.heni (https://www.facebook.com/ben.a.heni)

linkedin.com/in/heni-ben-amor-31a51a2 (https://www.linkedin.com/in/heni-ben-amor-31a51a2)

Airicist
18th April 2014, 09:07
https://vimeo.com/92147065

Cooperative human-robot tasks
April 16, 2014


In many cooperative tasks between a human and a robotic assistant, the human guides the robot by exerting forces, either through direct physical interaction or indirectly via a jointly manipulated object. These physical forces perturb the robot's behavior execution and need to be compensated for in order to successfully complete such tasks. Typically, this problem is tackled by means of special purpose force sensors which are, however, not available on many robotic platforms. In contrast, we propose a machine learning approach based on sensor data, such as accelerometer and pressure sensor information. In the training phase, a statistical model of behavior execution is learned that combines Gaussian Process Regression with a novel periodic kernel. During behavior execution, predictions from the statistical model are continuously compared with stability parameters derived from current sensor readings. Differences between predicted and measured values exceeding the variance of the statistical model are interpreted as guidance information and used to adapt the robot's behavior. Several examples of cooperative tasks between a human and a humanoid NAO robot demonstrate the feasibility of our approach.
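The core idea in the abstract — compare a learned predictive distribution against live sensor readings and treat large deviations as human guidance — can be sketched in a few lines. This is only an illustrative stand-in: it uses a standard periodic (ExpSineSquared-style) kernel rather than the paper's novel kernel, and the function names and thresholds are my own choices, not the authors' implementation.

```python
import numpy as np

def periodic_kernel(a, b, period=1.0, length=0.5):
    # Standard periodic kernel exp(-2 sin^2(pi*d/p) / l^2);
    # a stand-in for the paper's novel periodic kernel.
    d = np.abs(a[:, None] - b[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / length ** 2)

def gp_fit_predict(x_train, y_train, x_test, noise=1e-2):
    # Plain Gaussian Process regression: predictive mean and variance.
    K = periodic_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = periodic_kernel(x_test, x_train)
    Kss = periodic_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = np.diag(Kss - Ks @ v) + noise
    return mean, var

def guidance_flags(measured, mean, var, n_sigma=2.0):
    # Readings outside the model's predictive spread are interpreted
    # as guidance by the human rather than sensor noise.
    return np.abs(measured - mean) > n_sigma * np.sqrt(var)
```

In use, the model would be trained on sensor traces of unperturbed behavior execution; at run time, a flagged reading would trigger a behavior adaptation instead of being compensated away.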

Airicist
18th April 2014, 09:08
https://vimeo.com/92149421

Imitation learning for robot grasping
April 16, 2014


Multi-fingered robot grasping is a challenging problem that is difficult to tackle using hand-coded programs. In this paper we present an imitation learning approach for learning and generalizing grasping skills based on human demonstrations. To this end, we split the task of synthesizing a grasping motion into three parts: (1) learning efficient grasp representations from human demonstrations, (2) warping contact points onto new objects, and (3) optimizing and executing the reach-and-grasp movements. We learn low-dimensional latent grasp spaces for different grasp types, which form the basis for a novel extension to dynamic motor primitives. These latent-space dynamic motor primitives are used to synthesize entire reach-and-grasp movements. We evaluated our method on a real humanoid robot. The results of the experiment demonstrate the robustness and versatility of our approach.
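The building block behind the abstract's "latent-space dynamic motor primitives" is the ordinary dynamic motor primitive (DMP): a spring-damper system driven toward a goal plus a learned forcing term that reshapes the trajectory. The sketch below is a minimal one-dimensional discrete DMP learned from a single demonstration, not the paper's latent-space extension; gains and function names are illustrative assumptions.

```python
import numpy as np

def learn_dmp_forcing(y_demo, dt, alpha=25.0, beta=6.25):
    # Recover the forcing term that makes the spring-damper system
    # reproduce the demonstrated trajectory. In a full DMP the forcing
    # term is a function of a canonical phase variable; here we keep
    # the raw per-step values for simplicity.
    yd = np.gradient(y_demo, dt)
    ydd = np.gradient(yd, dt)
    g, y0 = y_demo[-1], y_demo[0]
    f = ydd - alpha * (beta * (g - y_demo) - yd)
    return f, g, y0

def rollout_dmp(f, g, y0, dt, alpha=25.0, beta=6.25):
    # Integrate the DMP forward with Euler steps; the spring term
    # guarantees convergence to the goal g as the forcing decays.
    y, yd = y0, 0.0
    traj = []
    for fi in f:
        ydd = alpha * (beta * (g - y) - yd) + fi
        yd += ydd * dt
        y += yd * dt
        traj.append(y)
    return np.array(traj)
```

In the paper's setting, the demonstrations and rollouts would live in a low-dimensional latent grasp space (e.g. learned from demonstrated hand postures) rather than in raw joint space, with one primitive per grasp type.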