Autonomous Systems Lab (ASL), Institute of Robotics and Intelligent Systems (IRIS) at ETH Zurich, Zurich, Switzerland

Website - asl.ethz.ch

youtube.com/aslteam

twitter.com/ASL_ETHZ

instagram.com/asl_ethz

Director - Roland Siegwart

Projects:

Ascento, two-wheeled jumping robot

Project ARC, autonomous cars

Voliro, omnidirectional hexacopter

VertiGo, wall-climbing robot

AtlantikSolar, solar-powered fixed-wing UAV

Sepios, bio-inspired nautical robot

Scewo, stair-climbing electric wheelchair

ANYmal, quadrupedal robot

StarlETH, quadruped robot

Ballbot Rezero

Detection of Slippery Terrain with a Heterogeneous Team of Legged Robots

ICARUS FP7 Project - Unmanned Search and Rescuing

Naro (nautical robot), bio-inspired robotic fish

Autonomous Christmas Lab

V-Charge: Fully automated valet parking and charging using only low-cost sensors

ARGOS (Autonomous Robot for Gas & Oil Sites)
 

Multi-robot Control and Interaction with a Hand-held Tablet

Published on Oct 28, 2014

Collaborative project of Disney Research Zurich and the Autonomous Systems Lab, ETH Zurich.

A hand-held tablet tracks a group of robots and computes collision-free trajectories, enabling real-time interaction. Efficient algorithms are described, and experiments are performed in scenarios with changing illumination. Augmented reality and a multi-player setup are also described.

See 'Multi-robot Control and Interaction with a Hand-held Tablet' - Reto Grieder, Javier Alonso-Mora, Cyrill Bloechlinger, Roland Siegwart and Paul Beardsley, Workshop at IEEE Int. Conf. Robotics and Automation, 2014.
 

Image and Animation Display with Multiple Mobile Robots

Published on Oct 28, 2014

A new kind of display formed by a swarm of mobile robot pixels.

Collaborative project of Disney Research Zurich and the Autonomous Systems Lab, ETH Zurich.

See 'Image and Animation Display with Multiple Mobile Robots' - Javier Alonso-Mora, Andreas Breitenmoser, Martin Rufli, Roland Siegwart and Paul Beardsley, International Journal of Robotics Research, 31:753-773, 2012.
 

Human - Robot Swarm Interaction for Entertainment: From animation display to gesture based control

Published on Oct 28, 2014

Interaction with a swarm of mobile robot pixels through gesture-based control, real-time drawing, and a hand-held tablet.

Collaborative project of Disney Research Zurich and the Autonomous Systems Lab, ETH Zurich.

See "Human - Robot Swarm Interaction for Entertainment: From animation display to gesture based control" - Javier Alonso-Mora, Roland Siegwart and Paul Beardsley, Proc. of the 9th ACM / IEEE International Conference on Human-Robot Interaction (HRI), 2014.
 

Collision Avoidance for Aerial Vehicles in Multi-Agent Scenarios

Published on Mar 30, 2015

Up to four quadrotors avoid collisions in real time, both with each other and with a human.

"Collision Avoidance for Aerial Vehicles in Multi-Agent Scenarios", Javier Alonso-Mora, Tobias Naegeli, Roland Siegwart, Paul Beardsley, Autonomous Robots, January 2015

This work is in collaboration between ETH Zurich and Disney Research Zurich.
 

Collaborative Multi-Robot Manipulation of Deformable Objects

Published on Mar 30, 2015

This video shows experiments with up to three KUKA youBot mobile manipulators. They collaboratively carry deformable objects (a towel, a bed sheet, a rope, and a mat) while avoiding collisions with static and dynamic obstacles.

"Local Motion Planning for Collaborative Multi-Robot Manipulation of Deformable Objects", Javier Alonso-Mora, Ross Knepper, Roland Siegwart and Daniela Rus, IEEE Int. Conf. in Robotics and Automation, 2015
 

Collaborative navigation for flying and walking robots

Published on Oct 5, 2015

The results of this video have been published in:

P. Fankhauser, M. Bloesch, P. Krüsi, R. Diethelm, M. Wermelinger, T. Schneider, M. Dymczyk, M. Hutter, R. Siegwart, "Collaborative Navigation for Flying and Walking Robots," in IEEE/RSJ International Conference on Intelligent Robots and Systems, 2016.

Flying and walking robots can use their complementary features in terms of viewpoint and payload capability to best advantage in a heterogeneous team. To this end, we present our online collaborative navigation framework for unknown and challenging terrain. The method leverages the flying robot’s onboard monocular camera to create both a map of visual features for simultaneous localization and mapping and a dense representation of the environment as an elevation map. This prior knowledge from the initial exploration enables the walking robot to localize itself against the global map, and plan a global path to the goal by interpreting the elevation map in terms of traversability. While following the planned path, the absolute pose corrections are fused with the legged state estimation and the elevation map is continuously updated with distance measurements from an onboard laser range sensor. This allows the legged robot to safely navigate towards the goal while taking into account any changes in the environment.
Contact: Peter Fankhauser
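
The "interpreting the elevation map in terms of traversability" step above can be illustrated with a minimal sketch: slope and roughness are derived per grid cell and thresholded. The criteria, threshold values, and function names below are illustrative assumptions, not the implementation used in the paper:

```python
import numpy as np

def traversability(elevation, cell_size, max_slope_deg=25.0, max_roughness=0.05):
    """Score each cell of an elevation map as traversable (1.0) or not (0.0).

    `elevation` is a 2D grid of heights [m]; `cell_size` is the grid
    resolution [m]. Slope is estimated from finite differences and
    roughness from local height variation. Thresholds are illustrative.
    """
    # Slope from central differences, converted to degrees.
    gy, gx = np.gradient(elevation, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))

    # Roughness: absolute height deviation from the local 3x3 mean.
    padded = np.pad(elevation, 1, mode="edge")
    local_mean = sum(
        padded[i:i + elevation.shape[0], j:j + elevation.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    roughness = np.abs(elevation - local_mean)

    return ((slope_deg <= max_slope_deg) & (roughness <= max_roughness)).astype(float)

# Flat ground with one 0.5 m step: cells around the step become untraversable.
grid = np.zeros((6, 6))
grid[:, 3:] = 0.5
trav = traversability(grid, cell_size=0.1)
```

A global planner could then search for a path restricted to traversable cells; the real system additionally fuses laser range measurements to keep the map current.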
 

Autonomous Robotic Stone Stacking with Online Next Best Object Target Pose Planning

Published on Sep 23, 2016

We show how to form stable compositions from irregularly shaped stones. We present a next-best stacking pose search in a Stochastic Gradient Descent (SGD) manner, implemented using a physics engine. The approach is validated in an experimental setup with a robotic manipulator by constructing balancing vertical stacks without mortar or adhesives. We show the results of eleven consecutive trials in which such towers were formed autonomously from four arbitrarily placed rocks.
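
The flavor of such a stochastic pose search can be sketched as a simple perturb-and-keep descent. The cost function below is a hypothetical stand-in for the physics-engine rollout the actual system uses to evaluate stability; all names and parameters here are assumptions:

```python
import math
import random

def stability_cost(pose, support_center=(0.0, 0.0)):
    """Stand-in for a physics-engine stability evaluation: penalize poses
    whose (x, y) position drifts from the support pile's center and whose
    tilt angle is large. A real system would simulate placing the stone
    and measure how far it settles from the commanded pose."""
    x, y, tilt = pose
    dx = x - support_center[0]
    dy = y - support_center[1]
    return math.hypot(dx, dy) + abs(tilt)

def next_best_pose(initial, iters=200, step=0.05, seed=0):
    """Stochastic descent over candidate placement poses: perturb the
    current best pose and keep the perturbation if the cost improves."""
    rng = random.Random(seed)
    pose, cost = initial, stability_cost(initial)
    for _ in range(iters):
        candidate = tuple(p + rng.uniform(-step, step) for p in pose)
        c = stability_cost(candidate)
        if c < cost:
            pose, cost = candidate, c
    return pose, cost

best, cost = next_best_pose(initial=(0.3, -0.2, 0.4))
```

In the real pipeline each cost evaluation is a physics simulation of the candidate placement, so the search trades off number of iterations against simulation time.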
 