Smooth Real-Time Walking-Pattern Generation for Humanoid Robot LOLA
Published on Jul 3, 2019
This video demonstrates our new approach for planning smooth center-of-mass trajectories for biped walking robots. The method is based on quintic spline interpolation and collocation and generates dynamically and kinematically feasible motions in real time.
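The paper's actual spline formulation is not reproduced here; as a generic illustration of quintic interpolation, the sketch below solves for the six coefficients of a quintic polynomial segment that matches position, velocity, and acceleration at both endpoints (the boundary conditions that make the resulting spline C2-continuous). Function names and the example values are illustrative, not from the publication.

```python
import numpy as np

def quintic_coeffs(x0, v0, a0, x1, v1, a1, T):
    """Coefficients c[0..5] of p(t) = sum_k c[k] * t**k on [0, T]
    matching position, velocity, and acceleration at both ends."""
    A = np.array([
        [1, 0,    0,       0,        0,         0],         # p(0)
        [0, 1,    0,       0,        0,         0],         # p'(0)
        [0, 0,    2,       0,        0,         0],         # p''(0)
        [1, T,    T**2,    T**3,     T**4,      T**5],      # p(T)
        [0, 1,    2*T,     3*T**2,   4*T**3,    5*T**4],    # p'(T)
        [0, 0,    2,       6*T,      12*T**2,   20*T**3],   # p''(T)
    ], dtype=float)
    b = np.array([x0, v0, a0, x1, v1, a1], dtype=float)
    return np.linalg.solve(A, b)

def evaluate(c, t):
    """Evaluate the polynomial with coefficients c at time t."""
    return sum(ck * t**k for k, ck in enumerate(c))

# Illustrative rest-to-rest motion: 0 m -> 1 m in 1 s,
# zero velocity and acceleration at both ends.
c = quintic_coeffs(0, 0, 0, 1, 0, 0, 1.0)
```

Chaining such segments, with the interior boundary values left as unknowns, is what turns individual polynomials into a smooth spline.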
Method / Test Scenario:
Our humanoid robot Lola steps up and down a 12.5 cm high platform. The planned center-of-mass motion respects the dynamic and kinematic limits of the robot using simplified models. The complete motion lasts more than 17 seconds and is planned in less than 9 milliseconds (CPU only, single core).
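One common simplified model for checking dynamic feasibility of a center-of-mass trajectory is the linear inverted pendulum, whose zero-moment point (ZMP) is p = x - (z/g)·x''. The sketch below evaluates that constraint at a set of collocation points; it is a minimal, generic illustration, not the constraint set used on Lola, and the CoM height and support-polygon bounds are assumed values.

```python
import numpy as np

G = 9.81      # gravitational acceleration [m/s^2]
Z_COM = 0.8   # assumed constant CoM height [m] (illustrative value)

def zmp_from_com(x, xdd):
    """Linear-inverted-pendulum ZMP: p = x - (z/g) * x''."""
    return x - (Z_COM / G) * xdd

def zmp_feasible(x, xdd, p_min, p_max):
    """Check the ZMP constraint at each collocation point:
    x, xdd are arrays of CoM position / acceleration samples."""
    p = zmp_from_com(np.asarray(x), np.asarray(xdd))
    return bool(np.all((p >= p_min) & (p <= p_max)))
```

Enforcing such inequalities only at discrete collocation points is what keeps the planning problem small enough to solve in milliseconds.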
Simulation:
Custom multi-body simulation
Visualization with Blender (custom interface using Blender's Python API)
Experiments:
Planning and control of the robot run onboard in real time. External communication only sends a signal to start and stop walking. The vision system is not active, so the foothold sequence is predefined.