Miscellaneous

A robot teaches itself how to walk

Uploaded on Feb 15, 2012

Cornell University professor Hod Lipson demonstrates how a robot can teach itself to walk without any knowledge of its form and function. "Within a relatively small number of these babbling actions, it will figure out what it looks like," Lipson says. He adds that eventually "it can figure out how to move."
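The "babbling" Lipson describes can be pictured as random motor commands whose sensed outcomes are used to fit an approximate self-model, which the robot then queries to choose movements. The sketch below is only an illustration of that idea, not Lipson's actual algorithm; the robot interface (act/sense) and the choice of regressor are assumptions.

```python
# Illustrative sketch of self-modeling via motor babbling (not Lipson's actual code).
# Assumptions: a hypothetical `robot` object with `act()` and `sense()` methods,
# and a generic neural-network regressor standing in for the learned self-model.
import numpy as np
from sklearn.neural_network import MLPRegressor

def babble_and_model(robot, n_actions=100, action_dim=4):
    """Execute random 'babbling' actions and fit a forward model
    that predicts sensor readings from motor commands."""
    actions, observations = [], []
    for _ in range(n_actions):
        a = np.random.uniform(-1.0, 1.0, size=action_dim)  # random motor command
        robot.act(a)                                        # hypothetical API
        observations.append(robot.sense())                  # hypothetical API
        actions.append(a)
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
    model.fit(np.array(actions), np.array(observations))
    return model  # an approximate "self-image" mapping commands to outcomes

def plan_step(model, goal, n_candidates=500, action_dim=4):
    """Pick the candidate action whose predicted outcome is closest to the goal."""
    candidates = np.random.uniform(-1.0, 1.0, size=(n_candidates, action_dim))
    predictions = model.predict(candidates)
    best = np.argmin(np.linalg.norm(predictions - goal, axis=1))
    return candidates[best]
```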
 

Walking Follows Form
May 25, 2014

This short film shows several characters walking and illustrates how each character's manner of walking depends on how its body is structured.
Animation: Jun seo Hahm
Music: Sun min Hwang
 

RI Seminar: Jonathan Hurst: Designing robots to walk and run

Streamed live April 1, 2016

Jonathan Hurst
College of Engineering Dean's Professor, Oregon State University


Abstract
Legged locomotion is a challenging physical interaction task: underactuation, unexpected impacts, and large and rapidly changing forces and velocities are commonplace. Utilizing passive hardware dynamics in tight integration with the software control, with both aspects of “behavior design” considered together as part of the overall design process, can drastically improve the performance of a machine as measured by efficiency, agility, and robustness to disturbances.

This design philosophy was recently demonstrated on ATRIAS, a bipedal spring-mass robot. The passive dynamics of the hardware match a simple biomechanically-derived spring-mass model, while the software control relies on the passive dynamics as an integrated aspect of the system behavior. ATRIAS walks using approximately 400W of power, accelerates to a run, handles large unexpected obstacles with no prior knowledge of the terrain, and is the first machine to reproduce the dynamics of a human walking gait. In this presentation, we explain our design philosophy, results with ATRIAS, current work on a successor robot Cassie, and plans for commercialization of this technology by Agility Robotics.
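The "simple biomechanically-derived spring-mass model" referred to above is commonly formalized as the spring-loaded inverted pendulum (SLIP): a point mass riding on a massless leg spring. The sketch below integrates one stance phase of that template model; the mass, stiffness, and initial conditions are illustrative placeholders, not ATRIAS parameters.

```python
# Minimal sketch of the spring-mass (SLIP) template that spring-mass robots like
# ATRIAS are built to emulate. Parameter values are illustrative, not ATRIAS data.
import numpy as np
from scipy.integrate import solve_ivp

m, k, l0, g = 80.0, 20000.0, 1.0, 9.81  # mass [kg], leg stiffness [N/m], rest length [m]

def stance_dynamics(t, state):
    """Stance phase: point mass on a massless leg spring, toe pinned at the origin."""
    x, z, vx, vz = state
    l = np.hypot(x, z)                 # current leg length
    f = k * (l0 - l)                   # spring force along the leg
    ax = f * (x / l) / m
    az = f * (z / l) / m - g
    return [vx, vz, ax, az]

# Integrate from touchdown until the leg returns to its rest length (take-off).
takeoff = lambda t, s: np.hypot(s[0], s[1]) - l0
takeoff.terminal, takeoff.direction = True, 1
sol = solve_ivp(stance_dynamics, [0.0, 1.0], [-0.2, 0.97, 1.2, 0.0],
                events=takeoff, max_step=1e-3)
print("take-off state:", sol.y[:, -1])
```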


Speaker Biography
Jonathan W. Hurst is the College of Engineering Dean's Professor of Robotics in the School of Mechanical, Industrial, and Manufacturing Engineering at Oregon State University, and the co-founder and Chief Technology Officer of Agility Robotics. He holds a B.S. in Mechanical Engineering, and both an M.S. and Ph.D. in Robotics, all from Carnegie Mellon University. His university research focuses on understanding the fundamental science and engineering best practices for legged locomotion. Investigations range from numerical studies and analysis of animal data, to simulation studies of theoretical models, to designing, constructing, and experimenting with legged robots for walking and running.
 

RGB-D camera-based navigation system of a walking robot

Published on Apr 19, 2016

We present the application of an RGB-D sensor in the navigation system of a six-legged walking robot. The RGB-D sensor is used in the SLAM subsystem to estimate the pose of the robot and to build a dense environment model (an elevation map). The paper presents the robot's navigation system, which includes a SLAM subsystem, a mapping module, a motion planner, and the robot's controller. Results of experiments on the real robot are provided, and the influence of the localization system on the quality of the obtained elevation map is shown.
References:
Motion planning:
[1] D. Belter, P. Labecki, P. Skrzypczynski, Adaptive Motion Planning for Autonomous Rough Terrain Traversal with a Walking Robot, Journal of Field Robotics
[2] D. Belter, Perception-based motion planning for a walking robot in rugged terrain, In Lecture Notes in Control and Information Sciences: Robot Motion and Control (K. Kozlowski, Ed.), pp. 127-136, Springer, Berlin 2011
Mapping:
[3] D. Belter, P. Labecki, P. Fankhauser, R. Siegwart, RGB-D terrain perception and dense mapping for legged robots, International Journal of Applied Mathematics and Computer Science, vol. 26(1), pp. 81-97, 2016
SLAM:
[4] D. Belter, M. Nowicki, P. Skrzypczynski, Accurate Map-Based RGB-D SLAM for Mobile Robots, Robot 2015: Second Iberian Robotics Conference, Vol. 418 of the series Advances in Intelligent Systems and Computing, pp. 533-545, 2015
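As a rough illustration of the mapping module described above, the sketch below bins an RGB-D point cloud, already expressed in the world frame using the SLAM pose estimate, into a 2.5D elevation grid. It is a generic reconstruction of the idea, not code from the referenced system; the grid extent, resolution, and max-height update rule are assumptions.

```python
# Generic elevation-mapping sketch: keep the highest measured point per grid cell.
import numpy as np

def update_elevation_map(points_world, resolution=0.02,
                         x_range=(-2.0, 2.0), y_range=(-2.0, 2.0)):
    """points_world: (N, 3) array of 3D points already transformed into the world frame."""
    nx = int((x_range[1] - x_range[0]) / resolution)
    ny = int((y_range[1] - y_range[0]) / resolution)
    elevation = np.full((nx, ny), np.nan)   # NaN marks unobserved cells

    ix = ((points_world[:, 0] - x_range[0]) / resolution).astype(int)
    iy = ((points_world[:, 1] - y_range[0]) / resolution).astype(int)
    valid = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)

    for i, j, z in zip(ix[valid], iy[valid], points_world[valid, 2]):
        # simple, conservative update rule: remember the highest point per cell
        if np.isnan(elevation[i, j]) or z > elevation[i, j]:
            elevation[i, j] = z
    return elevation
```

Because every point is transformed with the estimated robot pose before binning, errors in the localization estimate translate directly into artifacts in the elevation map, which is exactly the influence the paper evaluates.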
 

Talk: Legged Robots: Stepping out of the continuous and differentiable zone by Dr. Diego Pardo

Published on May 10, 2017

Talk given on 13 February 2017. For more information, please read Dr. Pardo's most recent publication.
 

Free gait software overview

Published on Aug 11, 2017

An Architecture for the Versatile Control of Legged Robots.

github.com/leggedrobotics/free_gait

Free Gait is a software framework for the versatile, robust, and task-oriented control of legged robots. The Free Gait interface defines a whole-body abstraction layer to accommodate a variety of task-space control commands such as end effector, joint, and base motions. The defined motion tasks are tracked with a feedback whole-body controller to ensure accurate and robust motion execution even under slip and external disturbances. The application of this framework includes intuitive tele-operation of the robot, efficient scripting of behaviors, and fully autonomous operation with motion and footstep planners.

P. Fankhauser, D. Bellicoso, C. Gehring, R. Dubé, A. Gawel, M. Hutter, "Free Gait – An Architecture for the Versatile Control of Legged Robots", in IEEE-RAS International Conference on Humanoid Robots, 2016.
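To make the idea of a whole-body abstraction layer more concrete, here is a conceptual sketch of task-space motion commands (end-effector and base targets) handed to a feedback whole-body controller. The class and method names are invented for illustration and are not the actual Free Gait API; see the repository above for the real interface.

```python
# Conceptual sketch of a whole-body abstraction layer in the spirit described above.
# All names here are invented for illustration; this is NOT the Free Gait API.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EndEffectorTarget:
    leg: str                  # e.g. "LF" for left-front
    position: tuple           # desired foothold (x, y, z) in the given frame
    frame: str = "odom"

@dataclass
class BaseTarget:
    height: float             # desired base height above the support polygon
    frame: str = "footprint"

@dataclass
class Step:
    """One motion task: end-effector and base goals executed together."""
    end_effector_targets: List[EndEffectorTarget] = field(default_factory=list)
    base_target: Optional[BaseTarget] = None

def execute(steps: List[Step], whole_body_controller):
    """Hand each abstract step to a feedback whole-body controller for tracking."""
    for step in steps:
        whole_body_controller.track(step)   # hypothetical tracking call
```

The point of such an abstraction is that a tele-operation interface, a scripted behavior, and an autonomous footstep planner can all emit the same kind of step objects, while one feedback controller handles accurate execution under slip and disturbances.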
 

Some legged robot at World Robotics Conference Beijing 2018.8.19

Published on Aug 21, 2018

Some legged robots at the World Robotics Conference, Beijing, 2018.8.19.
The black one is Laikago, a four-legged robot from Unitree Robotics.
 

Learning legged locomotion: ML as one tool in an engineered system

Oct 23, 2020

ML/RL methods are often viewed as a magical black box. They are not, but learned policies are nonetheless a valuable tool that can work in conjunction with the underlying physics of the robot. In this video, Agility CTO Jonathan Hurst, wearing his professor hat at Oregon State University, presents some recent student work on using learned policies as a control method for highly dynamic legged robots.
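The combination described here, a learned policy working with the robot's physics rather than replacing it, is often realized by having the policy output joint position targets that a conventional PD loop then tracks. The sketch below shows that pattern only in outline; the network, gains, and robot interface are illustrative assumptions, not the controller discussed in the talk.

```python
# Outline of "policy on top, physics underneath": a small policy proposes joint
# targets, and a low-level PD loop turns them into torques.
import numpy as np

class TinyPolicy:
    """Stand-in for a trained policy: observation -> bounded joint position targets."""
    def __init__(self, obs_dim, act_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(obs_dim, 64))
        self.w2 = rng.normal(scale=0.1, size=(64, act_dim))

    def __call__(self, obs):
        return np.tanh(np.tanh(obs @ self.w1) @ self.w2)

def pd_torques(q_target, q, qd, kp=60.0, kd=2.0):
    """Low-level PD tracking of the policy's joint targets; the hardware dynamics do the rest."""
    return kp * (q_target - q) - kd * qd

# Control-loop skeleton (robot I/O functions are hypothetical):
# obs = read_observation(); q, qd = read_joint_state()
# tau = pd_torques(policy(obs), q, qd); apply_torques(tau)
```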
 

Omni-Roach: A legged robot capable of traversing multiple types of large obstacles and self-righting

Jan 11, 2022

Supplementary video of our paper submitted to ICRA 2022: Omni-Roach: A legged robot capable of traversing multiple types of large obstacles and self-righting.

Visit https://li.me.jhu.edu to learn more.
 