Messor, Messor II, hexapods, walking robots, Poznan, Poland


Experiment on the mockup: adaptive motion planning for a walking robot

Published on Feb 8, 2016

Achieving full autonomy in a mobile robot requires combining robust environment perception with on-board sensors, efficient environment mapping, and real-time motion planning. All these tasks become more challenging when we consider a natural, outdoor environment and a robot with many degrees of freedom (d.o.f.). In this paper we address the issues of motion planning for a legged robot walking over rough terrain, using only its on-board sensors to build the necessary environment model. The proposed solution takes the limited perceptual capabilities of the robot into account. A multi-sensor system is considered for environment perception. The key idea of the motion planner is a dual representation of the map: (i) a higher-level planner applies the A* algorithm for coarse path planning on a low-resolution elevation grid, and (ii) a lower-level planner applies the guided-RRT (Rapidly-exploring Random Tree) algorithm to find a sequence of feasible motions on a more precise but smaller map. This paper contributes a new method that learns the terrain traversability cost function used by the A* algorithm. A probabilistic regression technique is applied for the traversability assessment, with a typical RRT-based motion planner used to explore the space of traversability values. The efficiency of our motion planning approach is demonstrated in simulations that provide ground truth data unavailable in field tests. The simulation-verified approach is then thoroughly tested under real-world conditions in experiments with two six-legged walking robots having different perception systems.
[1] D. Belter, P. Łabęcki, P. Skrzypczyński, Adaptive Motion Planning for Autonomous Rough Terrain Traversal with a Walking Robot, Journal of Field Robotics, 2016
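
To illustrate the coarse planning layer described above, here is a minimal sketch (not the authors' implementation) of A* path planning on a low-resolution elevation grid. The traversability term is a simple slope penalty standing in for the learned, regression-based cost function from the paper; the grid layout, the gain of 5.0, and the traversability_cost helper are illustrative assumptions.

import heapq
import math

def traversability_cost(grid, a, b):
    # Stand-in cost: step length plus a penalty on the elevation change between cells.
    step = math.hypot(b[0] - a[0], b[1] - a[1])
    slope_penalty = abs(grid[b[1]][b[0]] - grid[a[1]][a[0]])
    return step + 5.0 * slope_penalty

def astar_on_elevation_grid(grid, start, goal):
    # A* over an 8-connected grid; grid[y][x] holds elevation in metres.
    h = lambda c: math.hypot(goal[0] - c[0], goal[1] - c[1])
    open_set = [(h(start), start)]
    came_from = {}
    g_score = {start: 0.0}
    closed = set()
    while open_set:
        _, current = heapq.heappop(open_set)
        if current in closed:
            continue
        closed.add(current)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        x, y = current
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = x + dx, y + dy
                if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]):
                    neigh = (nx, ny)
                    tentative = g_score[current] + traversability_cost(grid, current, neigh)
                    if tentative < g_score.get(neigh, float("inf")):
                        g_score[neigh] = tentative
                        came_from[neigh] = current
                        heapq.heappush(open_set, (tentative + h(neigh), neigh))
    return None  # goal not reachable on the coarse grid

# Example: a small synthetic elevation map with a ridge the planner can walk around.
if __name__ == "__main__":
    grid = [[0.0] * 10 for _ in range(10)]
    for yy in range(9):
        grid[yy][5] = 0.8
    print(astar_on_elevation_grid(grid, (0, 0), (9, 9)))

In the paper, the lower-level guided-RRT planner then refines such a coarse path into a sequence of feasible leg and body motions on a finer local map.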
 

Teleoperation of a six-legged walking robot using a hand tracking interface

Published on Sep 15, 2017

In this paper, we propose a teleoperation interface for a hexapod walking robot. The interface is based on the Kinect sensor and hand tracking libraries. A set of gestures is used to switch between the motion modes of the robot. The estimated positions of the operator's hands are used to define the reference motion of the robot. We show in experiments that the operator can control the robot to reach a goal position and manipulate objects using hand gestures only.
[1] W. Cieślak, S. Rodykow, D. Belter, Teleoperation of a Six-legged Walking Robot Using a Hand Tracking Interface, Human-Centric Robotics, World Scientific, M.E. Silva et al. (Eds.), Singapore, pp. 527-536, 2017
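
The sketch below shows, in simplified form, how tracked hand positions can be turned into motion commands and how a gesture can switch motion modes. The actual Kinect/hand-tracking pipeline from the paper is abstracted away; the gesture (both hands raised), the mode names, and the scaling factors are assumptions made only for illustration.

from dataclasses import dataclass

@dataclass
class HandState:
    x: float  # metres, to the right of the sensor
    y: float  # metres, above the sensor
    z: float  # metres, distance from the sensor

MODES = ("walk", "manipulate")

def detect_mode_switch(left: HandState, right: HandState, threshold=0.4):
    # Illustrative gesture: both hands raised above a threshold toggles the mode.
    return left.y > threshold and right.y > threshold

def hands_to_command(right: HandState, neutral_z=1.0, gain=0.5):
    # Map the right-hand offset from a neutral pose to body velocities.
    forward = gain * (neutral_z - right.z)   # push the hand forward -> walk forward
    lateral = gain * right.x                 # move the hand sideways -> strafe
    return {"vx": forward, "vy": lateral}

# One teleoperation step under the assumed interface.
mode = MODES[0]
left, right = HandState(-0.2, 0.5, 1.0), HandState(0.1, 0.6, 0.8)
if detect_mode_switch(left, right):
    mode = MODES[(MODES.index(mode) + 1) % len(MODES)]
print(mode, hands_to_command(right))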
 

Efficient reactive behavior for six-legged walking on rough terrain with proprioceptive sensing

Published on Sep 15, 2017

In this paper, we propose a gait control strategy for a six-legged robot walking on rough terrain. To walk efficiently on rough terrain, the robot uses proprioceptive sensors only. The robot detects contact with the ground and uses an Attitude and Heading Reference System (AHRS) unit to measure the inclination of the platform. We propose a single-step procedure to compute the inclination of the robot's platform, taking into account the terrain slope and the kinematic margin of each of the robot's legs. Additionally, we use a procedure that keeps the robot stable while walking on rough terrain. We show in experiments that the robot is capable of climbing slopes inclined by 25 deg and walking efficiently on rough terrain.
[1] D. Belter, Efficient Reactive Behavior for Six-legged Walking on Rough Terrain with Proprioceptive Sensing, Human-Centric Robotics, World Scientific, M.E. Silva et al. (Eds.), Singapore, pp. 357-366, 2017
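
The following is an assumption-based sketch, not the paper's algorithm: it estimates the terrain pitch from the current foothold heights, reads the body pitch from the AHRS, and reduces the slope-following demand when any leg's kinematic margin becomes small. The foot positions, margin values, and gains are hypothetical.

import math

def terrain_pitch_from_feet(foot_positions):
    # Fit the pitch (rotation about y) from foothold (x, z) pairs by least squares.
    xs = [p[0] for p in foot_positions]
    zs = [p[1] for p in foot_positions]
    mean_x, mean_z = sum(xs) / len(xs), sum(zs) / len(zs)
    num = sum((x - mean_x) * (z - mean_z) for x, z in zip(xs, zs))
    den = sum((x - mean_x) ** 2 for x in xs)
    return math.atan2(num, den)  # slope of the supporting plane along x

def commanded_pitch(ahrs_pitch, terrain_pitch, margins, min_margin=0.02, gain=0.5):
    # Follow the terrain slope, but back off when any leg's kinematic margin
    # (remaining vertical workspace, in metres) gets too small.
    target = terrain_pitch
    if min(margins) < min_margin:
        target *= 0.5  # relax the slope-following demand to recover leg workspace
    return ahrs_pitch + gain * (target - ahrs_pitch)

# Example: feet on a roughly 25 deg slope, one leg close to its workspace limit.
feet = [(0.2, 0.09), (0.0, 0.0), (-0.2, -0.09)]
print(math.degrees(commanded_pitch(ahrs_pitch=0.30,
                                   terrain_pitch=terrain_pitch_from_feet(feet),
                                   margins=[0.05, 0.01, 0.06])))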
 