Matt Fitzgerald of Rethink Robotics demonstrates Baxter, an adaptive manufacturing robot, to potential clients during the Association for Advancing Automation's convention at McCormick Place in Chicago on January 21, 2013. (Zbigniew Bzdak, Chicago Tribune)
Active Robots has developed a program in which the Baxter Research Robot from Rethink Robotics introduces itself: a text-to-speech engine is integrated with the Baxter Research Robot Software Development Kit so that the robot can communicate with us more effectively. Here the robot tells us a little about itself, including its unique compliant joint architecture and how it can be used in manufacturing, academic and corporate research & development.
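The Active Robots program itself has not been published, but the idea is straightforward to sketch: feed a scripted introduction to an off-the-shelf text-to-speech engine while driving the robot through the Baxter SDK. In the rough Python sketch below, pyttsx3 and the head nod are my own assumptions standing in for whatever engine and gestures Active Robots actually used.

```python
# Rough sketch: pair an off-the-shelf text-to-speech engine with the
# Baxter Research Robot SDK so the robot can "introduce itself".
# Assumptions: pyttsx3 stands in for whatever TTS engine Active Robots used,
# and the standard baxter_interface Python package is available on a
# configured ROS workstation connected to the robot.
import pyttsx3
import rospy
import baxter_interface

INTRO = ("Hello, I am Baxter. My arms use series elastic actuators, which give "
         "me a compliant joint architecture that is safe to work alongside people.")

def main():
    rospy.init_node("baxter_introduction")
    baxter_interface.RobotEnable().enable()   # the robot must be enabled before it moves
    head = baxter_interface.Head()

    head.command_nod()                        # a small nod to acknowledge the viewer
    engine = pyttsx3.init()                   # engine choice is an assumption
    engine.say(INTRO)
    engine.runAndWait()                       # blocks until the spoken introduction finishes

if __name__ == "__main__":
    main()
```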
Here is the Baxter Research Robot -- I am very impressed with the wide range of capabilities that Baxter offers at an affordable price.
I am glad to learn that several British universities already have this excellent research resource.
Full story: "Make robots useful by teaching them to talk like us"
by Aviva Rutkin
June 27, 2014
Teaching robots how to handle the complex ways that humans communicate will make them better at dealing with our requests -- or asking for help
Mod and Sim students teamed up to create this fun video of the Baxter robot by Rethink Robotics, working happily along to their favorite song.
Robots that collaborate with humans must be able to identify objects used for shared tasks, for example a tool such as a knife to assist with cooking, or a part such as a screw on a factory floor. Humans communicate about objects using language and gesture, fusing information from multiple modalities over time. Existing work has addressed this problem in single modalities, such as natural language or gesture alone, or has fused modalities in non-real-time systems, but a gap remains in creating systems that simultaneously fuse information from language and gesture over time. To address this problem, we define a multimodal Bayes' filter for interpreting referring expressions to objects. Our approach outputs a distribution over the referent object at 14 Hz, updating dynamically as it receives new observations of the person's spoken words and gestures. This real-time update enables a robot to respond with back-channel feedback while a person is still communicating, pointing toward a mathematical framework for human-robot communication as a joint activity [Clark, 1996]. Moreover, our approach takes into account rich timing information in the language as words are spoken, by processing incremental output from the speech recognition system that is traditionally ignored when a command is processed as an entire sentence. It quickly adapts when the person refers to a new object. We collected a new dataset of people referring to objects in a tabletop setting and demonstrate that our approach infers the correct object with 90% accuracy. Additionally, we demonstrate that our approach enables a Baxter robot to provide back-channel responses in real time.
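As a rough illustration of the kind of Bayes'-filter update the abstract describes, the Python sketch below keeps a belief over a few candidate objects and multiplies in a likelihood each time a new word or pointing observation arrives, renormalising after each step. The object list, positions, and toy likelihood models are invented for illustration and are not the paper's learned models.

```python
# Sketch of a multimodal Bayes-filter update: maintain a belief over candidate
# objects and fold in each new language or gesture observation as it arrives.
# The likelihood models below are deliberately simple placeholders.
import math

OBJECTS = ["knife", "bowl", "screwdriver"]          # hypothetical tabletop objects
OBJECT_POSITIONS = {"knife": (0.2, 0.0),
                    "bowl": (0.5, 0.3),
                    "screwdriver": (0.8, -0.1)}

def word_likelihood(word, obj):
    """Toy language model: a word supports objects whose name contains it."""
    return 0.9 if word in obj else 0.1

def gesture_likelihood(point_xy, obj, sigma=0.15):
    """Toy gesture model: Gaussian falloff with distance from the pointed-at spot."""
    ox, oy = OBJECT_POSITIONS[obj]
    d2 = (ox - point_xy[0]) ** 2 + (oy - point_xy[1]) ** 2
    return math.exp(-d2 / (2.0 * sigma ** 2))

def update(belief, likelihood_fn, observation):
    """One Bayes-filter step: multiply in the observation likelihood, renormalize."""
    posterior = {o: belief[o] * likelihood_fn(observation, o) for o in belief}
    total = sum(posterior.values())
    return {o: p / total for o, p in posterior.items()}

# Uniform prior, then incremental observations as the person speaks and points.
belief = {o: 1.0 / len(OBJECTS) for o in OBJECTS}
belief = update(belief, word_likelihood, "knife")          # partial speech result
belief = update(belief, gesture_likelihood, (0.22, 0.02))  # pointing observation
print(max(belief, key=belief.get), belief)
```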
Baxter is on the job. Check out this compilation of Rethink Robotics Baxter working in various applications from across the country!
3-DOF Visual Servoing, decapping a bottle, picking up golf balls, shaking hands, mimicking motions, and shooting targets using the Baxter robot.
A Baxter robot sorts objects by weight. The weight of each object is estimated from measured joint torques, and the objects are then ranked and numbered according to their relative weights.
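A rough sketch of the underlying idea: take a joint-effort reading with the arm holding nothing and another while holding each object, treat the difference as a proxy for the object's weight, and sort on that value. The joint name, stubbed readings, and helper function below are my own assumptions; on the real robot the snapshots would come from the Baxter SDK's joint effort interface (e.g. baxter_interface.Limb.joint_efforts()).

```python
# Sketch of sorting objects by weight estimated from joint effort readings.
# The numbers below are stubbed; on the robot the two effort snapshots would
# be taken while holding the arm still with and without the object grasped.
def estimate_relative_weight(efforts_empty, efforts_loaded, joint="left_s1"):
    """Use the change in shoulder-pitch effort as a proxy for the payload weight."""
    return abs(efforts_loaded[joint] - efforts_empty[joint])

# Hypothetical shoulder-pitch efforts (N*m) recorded for each object.
baseline = {"left_s1": -3.10}
readings = {
    "object_1": {"left_s1": -4.05},
    "object_2": {"left_s1": -3.40},
    "object_3": {"left_s1": -5.20},
}

weights = {name: estimate_relative_weight(baseline, r) for name, r in readings.items()}
for rank, (name, w) in enumerate(sorted(weights.items(), key=lambda kv: kv[1]), start=1):
    print(f"{rank}. {name}: relative weight {w:.2f} N*m")
```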
This is a video demo of our project "Kendo Robot", part of the course EE125/215 at the University of California, Berkeley (Fall semester 2014). Project team: Ingrid Kugelberg, James Lam, Jiewen Sun.
A demonstration of Baxter learning to pour correct amounts of liquid into a moving container. This shows how our learning model adapts very quickly to new task variations (specific volumes of liquid); a rough sketch of the local-model idea follows the citation below.
Full title:
Generation and Exploitation of Local Models for Rapid Learning of a Pouring Task on a Moving Platform
Authors:
Joshua D. Langsfeld and Krishnanand N. Kaipa and Satyandra K. Gupta
Maryland Robotics Center
University of Maryland, College Park, MD USA
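The paper's actual model is not reproduced here, but the general "local model" idea can be sketched as follows: fit a small model (here, an assumed linear map from target volume to pour duration) on only the most recent trials, so it re-adapts quickly when the task changes. The sample data, window size, and linear form are all illustrative assumptions, not the authors' method.

```python
# Generic illustration of a "local model" for a pouring task: fit a small
# linear map from target volume to pour duration using only the few most
# recent trials, so the model adapts quickly when the task variation changes.
import numpy as np

class LocalPourModel:
    def __init__(self, window=5):
        self.window = window
        self.samples = []            # list of (volume_ml, duration_s) trials

    def add_trial(self, volume_ml, duration_s):
        self.samples.append((volume_ml, duration_s))
        self.samples = self.samples[-self.window:]   # keep only recent, local data

    def predict_duration(self, volume_ml):
        v, t = zip(*self.samples)
        slope, intercept = np.polyfit(v, t, 1)       # least-squares line fit
        return slope * volume_ml + intercept

model = LocalPourModel()
for vol, dur in [(50, 1.1), (100, 2.0), (150, 3.2)]:   # made-up calibration trials
    model.add_trial(vol, dur)
print(round(model.predict_duration(120), 2))           # estimated pour time for 120 ml
```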
A ONExia software engineer put one of our Baxter robots to work this week to assist in a long-term software test for one of our custom machines. During the two-hour test, Baxter simulated the operator interaction, pushing trays for the duration of the test while our engineer worked on something else.