Charlotte and Golem, Raspberry Pi robots, hexapods, Kevin Ochs, Pasadena, California, USA


Raspberry Pi Robot with Xtion Pro Live (Prime Sense) called Charlotte

Published on May 21, 2013

(Images captured through the PrimeSense sensor are linked below.) Here is a quick test of the collision detection using an Xtion Pro Live (PrimeSense) with a Raspberry Pi as the brain. Note that the RPi is overclocked to 1000 MHz. (You will notice a slight click in the middle of the video when I turn on the collision interrupt call. I forgot I had turned it off in the beginning since the cone was so close to the camera, and only realized it when the robot wasn't responding to the cone in front of it.)

Collision is only detected in the middle of its vision, because the legs sometimes cross into its field of view during full-body rotations. I have also integrated the three vision options into a "heads-up display" using OpenNI and OpenCV. (See links below.) Speech is handled via an espeak library my friend Kurt created. The gait algorithm is a complete rewrite in order to give as much stability as possible to the camera. (Plus I just wanted to see if I could do it...)

The USB hub is a de-cased powered 4-port hub that gets its power from a BEC, which takes the 3-cell LiPo (11 V) down to 5.1 V and in turn powers the Raspberry Pi, the Xtion, and the amplified speaker. All code is done in C++. Special thanks to Kurt Eckhardt for creating the XBee communication and espeak libraries. Communication with the servos is done through a USB2AX microcontroller created by Nicolas Saugnier. I've linked a couple of images of the HUD and the three viewing options.
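The center-only collision test boils down to scanning just the middle window of each depth frame so the legs at the edges can't trigger false alarms. This is a minimal sketch of that idea over a raw depth buffer, not the actual Charlotte code (the function name, window size, and threshold are my own placeholders; the real version reads frames through OpenNI):

```cpp
#include <cstdint>
#include <vector>

// Check only the central third of the depth image for obstacles,
// ignoring the edges where the legs can swing into view.
// depth: row-major 16-bit depth values in millimetres; 0 = no reading.
bool collisionInCenter(const std::vector<uint16_t>& depth,
                       int width, int height,
                       uint16_t thresholdMm)
{
    const int x0 = width / 3,  x1 = 2 * width / 3;   // central third, horizontally
    const int y0 = height / 3, y1 = 2 * height / 3;  // and vertically
    for (int y = y0; y < y1; ++y)
        for (int x = x0; x < x1; ++x) {
            uint16_t d = depth[y * width + x];
            if (d != 0 && d < thresholdMm)
                return true;  // a valid reading closer than the threshold
        }
    return false;
}
```

Skipping zero pixels matters with the PrimeSense sensors, since they report 0 for "no reading" rather than "very close".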
 

Raspberry Pi Robot called Charlotte

Published on Mar 27, 2013

Here is my second pass at running my Trossen PhantomX MKII with a Raspberry Pi. I've finally decided to give it a name... "Charlotte". It's running a custom C++ port of the Phoenix software developed by Jeroen Janssen, Kurt Eckhardt and Kare Halvorsen. This version is all floating point, using the standard math libraries. Communication with the servos is done through a USB2AX created by Nicolas Saugnier and a modified Robotis Dynamixel SDK. Power to the servos is provided by the Robotis SMPS2Dynamixel. Remote control is handled by a USB XBee module and an Arbotix Commander.
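The heart of Phoenix-style code is solving each leg's three joint angles (coxa, femur, tibia) from a desired foot position, in floating point with the standard math library. A generic sketch of that kind of 3-DOF leg inverse kinematics (the link lengths below are placeholders, not Charlotte's actual dimensions, and sign conventions vary between builds):

```cpp
#include <cmath>

// Joint angles for one hexapod leg, in radians.
struct LegAngles { double coxa, femur, tibia; };

// Solve coxa (yaw), femur and tibia angles for a foot target (x, y, z)
// in the leg's own frame. Lengths share the target's unit.
// Returns false if the target is out of reach.
bool legIK(double x, double y, double z,
           double coxaLen, double femurLen, double tibiaLen,
           LegAngles& out)
{
    out.coxa = std::atan2(y, x);                    // rotate leg toward target
    double horiz = std::hypot(x, y) - coxaLen;      // horizontal reach past coxa
    double reach = std::hypot(horiz, z);            // femur-joint-to-foot distance
    if (reach > femurLen + tibiaLen) return false;  // target out of reach

    // Law of cosines on the femur/tibia triangle.
    double a1 = std::atan2(-z, horiz);
    double a2 = std::acos((femurLen * femurLen + reach * reach -
                           tibiaLen * tibiaLen) / (2.0 * femurLen * reach));
    out.femur = a1 + a2;
    out.tibia = std::acos((femurLen * femurLen + tibiaLen * tibiaLen -
                           reach * reach) / (2.0 * femurLen * tibiaLen)) - M_PI;
    return true;
}
```

With this convention a fully extended, level leg solves to all-zero angles, which makes a handy sanity check.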
 

Hexapod Robot Prototype called Golem using an Intel NUC

Published on Feb 23, 2014

Gait test of my next project, named Golem. My primary goal for the project is to learn ROS (www.ros.org), so I designed this platform to keep it interesting. Golem is roughly 2.5 feet (76.2 cm) in diameter and weighs 16 pounds (7.3 kg). His body is T6 aluminum, with parts powder-coated gloss white for a bit of flair. As for actuators, the legs are Dynamixel MX-64s and the turret is an MX-28. The camera system isn't ready, so it isn't attached at this time. Onboard is an Intel NUC D54250 motherboard with an Intel 4th-gen i5 processor, 8 GB of RAM, and a 120 GB SSD, running Ubuntu 13.04. It also has a gyroscope, accelerometer, and compass to aid bearing and orientation. Manual control is done with a Sony PlayStation 3 controller paired to the onboard Bluetooth. Its power supply is a 6000 mAh military-spec LiPo battery. All code is written in C++. Much more to come, but I thought I would share my current progress. I'd like to note that my development of this project is in partnership with Interbotix Labs. Parts were machined by eMachineShop.com.
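Fusing the gyroscope and accelerometer for orientation is commonly done with a complementary filter: integrate the fast-but-drifting gyro and nudge the result toward the slow-but-absolute accelerometer angle. This is a textbook sketch of that technique, not Golem's actual filter (the class, blend factor, and axis choice are my assumptions):

```cpp
#include <cmath>

// Blend a fast-but-drifting gyro integral with a slow-but-absolute
// accelerometer angle. alpha close to 1 trusts the gyro short-term.
class ComplementaryFilter {
public:
    explicit ComplementaryFilter(double alpha) : alpha_(alpha), angle_(0.0) {}

    // gyroRate: rad/s about the pitch axis; ax, az: accelerometer axes;
    // dt: seconds since last update. Returns the fused pitch in radians.
    double update(double gyroRate, double ax, double az, double dt) {
        double accelAngle = std::atan2(ax, az);      // absolute, but noisy
        angle_ = alpha_ * (angle_ + gyroRate * dt)   // integrate gyro
               + (1.0 - alpha_) * accelAngle;        // bleed off the drift
        return angle_;
    }

private:
    double alpha_, angle_;
};
```

The compass can correct yaw drift the same way, since gravity gives no heading information.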
 

Hexapod robot called Golem using Google translate as text to speech. (Test video)

Published on Apr 22, 2014

Testing out Google Translate as a text-to-speech addition to my robot project. Of course it requires being connected to the internet, but it is very clear and super easy to set up. All debug messages are converted on the fly and played over the speaker.
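Converting a debug message on the fly amounts to percent-encoding the text and building a request URL for the Translate TTS endpoint, then fetching and playing the returned audio. A sketch of the string-handling side (note the `translate_tts` endpoint and its `tl`/`q` parameters are the commonly used unofficial form, not a documented Google API, so treat them as an assumption that may break):

```cpp
#include <cctype>
#include <string>

// Percent-encode a message for use in a URL query string.
std::string urlEncode(const std::string& s) {
    static const char* hex = "0123456789ABCDEF";
    std::string out;
    for (unsigned char c : s) {
        if (std::isalnum(c) || c == '-' || c == '_' || c == '.' || c == '~')
            out += static_cast<char>(c);
        else {
            out += '%';
            out += hex[c >> 4];
            out += hex[c & 0x0F];
        }
    }
    return out;
}

// Build the (unofficial) Google Translate TTS request for a debug message.
std::string ttsUrl(const std::string& text) {
    return "https://translate.google.com/translate_tts?tl=en&q=" + urlEncode(text);
}
```

The resulting MP3 could then be fetched and played by shelling out to a command-line downloader and player, which is one plausible way to wire it to the speaker.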

Basic info if you didn't see the first video:

My primary goal for the project is to learn ROS (www.ros.org), so I designed this platform to keep it interesting. Golem is roughly 2.5 feet (76.2 cm) in diameter and weighs 16 pounds (7.3 kg). His body is T6 aluminum, with parts powder-coated gloss white for a bit of flair. As for actuators, the legs are Dynamixel MX-64s and the turret is an MX-28. Onboard is an Intel NUC D54250 motherboard with an Intel 4th-gen i5 processor, 8 GB of RAM, and a 120 GB SSD, running Ubuntu 13.04. It also has a gyroscope, accelerometer, and compass to aid bearing and orientation. Manual control is done with a Sony PlayStation 3 controller paired to the onboard Bluetooth. Its power supply is a 6000 mAh military-spec LiPo battery. All code is written in C++.
 

ROS Hexapod called Golem slowstep and IMU readings test

Published on Jan 6, 2015

So I figured I'd give an update on the ROS progress. I currently have my ROS stack working: Xacro/URDF built, IMU active and publishing odom messages, the Point Grey camera streaming, and I just put on the PrimeSense depth sensor a few days ago (not in the video). I am using depthimage_to_laserscan so I can use gmapping.

In regards to odometry, I have had to slow the gait down a considerable amount. Above is a test of the "slow gait." It uses simple cosine and sine waves with a pretty high sample rate; so high, in fact, that I've lowered it a bit since this video, as you can see it added a bit of shaking in the coxa joints due to rounding. I currently have one package that handles the leg motion, but as far as ROS knows it is just the "base" of the robot; that simplifies the command structure to a simple Twist message for velocity. The max speed is rather slow at 8 cm per second; anything higher and the IMU picks up too much of the impact of the legs in its readings, even with a low-pass filter applied.
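The sine/cosine slow gait described above can be sketched as one foot's trajectory over a normalized step cycle: a linear ground stroke, then a cosine return with a sine lift. This is only an illustration of the technique, not Golem's actual code (the function name and the half-and-half stance/swing split are my assumptions):

```cpp
#include <cmath>

// One leg's foot offset for a simple sine/cosine slow gait.
struct FootOffset { double x, z; };

// phase in [0, 1): first half is the ground (stance) stroke,
// second half is the air (swing) stroke that lifts the foot.
FootOffset slowStepFoot(double phase, double strideLen, double liftHeight) {
    FootOffset f;
    if (phase < 0.5) {
        double t = phase / 0.5;                        // 0..1 through stance
        f.x = strideLen * (0.5 - t);                   // push the body forward
        f.z = 0.0;                                     // foot stays planted
    } else {
        double t = (phase - 0.5) / 0.5;                // 0..1 through swing
        f.x = -0.5 * strideLen * std::cos(M_PI * t);   // cosine return stroke
        f.z = liftHeight * std::sin(M_PI * t);         // sine lift, zero at ends
    }
    return f;
}
```

Sampling this at a high rate and rounding each sample to servo ticks is exactly where the small coxa shake mentioned above can creep in: adjacent samples can round to the same tick, then jump.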

Next is implementing the ROS navigation stack, since I have the base tf, the odometry, and the laser scan data ready. That will give me autonomous path navigation, which will probably be the end goal of this project's development.

The more I learn about ROS, the more I catch myself saying "Why aren't we funding this!" :P It really is a great system.
 