So I figured I'd give an update on the ROS progress. I currently have my ROS stack working: Xacro/URDF built, IMU active and publishing odometry messages, the Point Grey camera streaming, and the PrimeSense depth sensor added just a few days ago (not in the video). I am using depthimage_to_laserscan so I can run gmapping.
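For anyone wanting to try the same trick, depthimage_to_laserscan just needs the depth image topic remapped to wherever your sensor publishes. A minimal launch sketch, assuming the depth topic and frame names below (yours will likely differ depending on your camera driver):

```xml
<launch>
  <!-- Converts a depth image into a fake LaserScan for gmapping.
       /camera/depth/image_raw and camera_depth_frame are assumptions;
       check what your PrimeSense driver actually publishes. -->
  <node pkg="depthimage_to_laserscan" type="depthimage_to_laserscan"
        name="depthimage_to_laserscan">
    <remap from="image" to="/camera/depth/image_raw"/>
    <param name="output_frame_id" value="camera_depth_frame"/>
    <!-- number of pixel rows sampled to produce the scan -->
    <param name="scan_height" value="10"/>
  </node>
</launch>
```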
As for odometry, I've had to slow the gait down considerably. Above is a test of the "slow gait." It uses simple cosine and sine waves with a pretty high sample rate; so high, in fact, that I've lowered it a bit since this video, because (as you can see) rounding added a bit of shaking in the coxa joints. I currently have one package that handles the leg motion, but as far as ROS knows it is just the "base" of the robot, which simplifies the command structure down to a simple Twist message for velocity. The max speed is rather slow at 8 cm per second; anything higher and the IMU picks up too much of the impact from the legs in its readings, even with a low-pass filter applied.
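The core of that idea, stripped of the ROS plumbing, is small: clamp whatever forward velocity comes in on the Twist message, then drive each leg's joints with cosine/sine waves. A rough sketch, with the amplitude values made up for illustration (only the 8 cm/s cap comes from the post):

```python
import math

MAX_SPEED = 0.08  # m/s; above this the IMU picks up too much leg impact


def clamp_speed(vx):
    """Clamp the commanded forward velocity from a Twist message."""
    return max(-MAX_SPEED, min(MAX_SPEED, vx))


def leg_joint_angles(phase, stride=0.35, lift=0.25):
    """Simple cosine/sine gait for one leg.

    The coxa swings fore/aft with a cosine; the femur lifts with the
    positive half of a sine so the foot only rises during swing.
    Angles in radians; stride/lift amplitudes are illustrative, not
    the actual values used on the robot.
    """
    coxa = stride * math.cos(phase)
    femur = lift * max(0.0, math.sin(phase))
    return coxa, femur
```

In the real package a timer would advance `phase` at a rate proportional to the clamped speed and publish the resulting joint positions, while the rest of ROS only ever sees a base that accepts Twist commands.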
Next is implementing the ROS navigation stack, since I have the base tf, the odometry, and the laser scan data ready. That will give me autonomous path navigation, which will probably be the end goal of this project's development.
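With tf, odometry, and a scan topic in place, wiring in the navigation stack mostly means launching move_base with costmap and planner configs. A skeleton of what that launch file tends to look like (the `my_hexapod` package and YAML filenames are placeholders, not anything from this project):

```xml
<launch>
  <!-- move_base subscribes to /odom, tf, and the sensor topics named in
       the costmap YAMLs, and publishes cmd_vel Twist messages that the
       leg-motion package already understands. -->
  <node pkg="move_base" type="move_base" name="move_base" output="screen">
    <!-- hypothetical package/config names for illustration -->
    <rosparam file="$(find my_hexapod)/config/costmap_common.yaml"
              command="load" ns="global_costmap"/>
    <rosparam file="$(find my_hexapod)/config/costmap_common.yaml"
              command="load" ns="local_costmap"/>
    <rosparam file="$(find my_hexapod)/config/base_local_planner.yaml"
              command="load"/>
  </node>
</launch>
```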
The more I learn about ROS, the more I find myself saying, "Why aren't we funding this!"
It really is a great system.