Baxter, industrial and research robot, Rethink Robotics GmbH, Bochum, Germany


Baxter the robot ready to go

Published on Mar 13, 2013

Matt Fitzgerald of Rethink Robotics demonstrates Baxter, an adaptive manufacturing robot, to potential clients during the Association for Advancing Automation's convention at McCormick Place in Chicago on January 21, 2013. (Zbigniew Bzdak, Chicago Tribune)
 

Baxter Research Robot Speaks Out

Published on Apr 22, 2014

Active Robots has developed a program in which the Baxter Research Robot from Rethink Robotics introduces itself: a text-to-speech engine is integrated with the Baxter Research Robot Software Development Kit so that the robot can communicate with us more effectively. Here the robot tells us a little about itself, including its unique compliant-joint architecture and how it can be used in manufacturing, academic, and corporate research and development.
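
A minimal sketch of how such an integration might look, pairing the pyttsx3 text-to-speech library with the Baxter SDK's Python interface on the workstation driving the robot. The choice of TTS engine, the spoken text, and the nod gesture are illustrative assumptions, not Active Robots' actual implementation.

    #!/usr/bin/env python
    # Hypothetical sketch: have Baxter "speak" a short self-introduction
    # while nodding its head. Assumes a ROS workspace with baxter_interface
    # installed and pyttsx3 available on the connected workstation.
    import rospy
    import pyttsx3
    import baxter_interface

    rospy.init_node('baxter_speaks')
    baxter_interface.RobotEnable().enable()    # enable the robot's motors
    head = baxter_interface.Head()

    engine = pyttsx3.init()                    # local text-to-speech engine
    intro = ("Hello, I am Baxter. My joints are compliant, "
             "so I am safe to work alongside people.")

    head.command_nod()                         # greet with a nod
    engine.say(intro)                          # queue the spoken introduction
    engine.runAndWait()                        # block until speech finishes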
 

Tom Strong meets the Baxter Research Robot by ReThink Robotics at Active Robots

Published on May 3, 2014

Here is the Baxter Research Robot -- I am very impressed with the wide range of capabilities that Baxter offers at an affordable price.
I am glad to learn that several British universities already have this excellent research resource.
 

Baxter Robots performing to "Happy" song by Pharrell Williams

Published on Jun 19, 2014

Mod and Sim students teamed up to create this fun video of the Baxter Robot by ReThink Robotics, working happily to their favorite song.
 

Interpreting Multimodal Referring Expressions in Real Time
from David Whitney
October 1, 2014

Robots that collaborate with humans must be able to identify objects used for shared tasks, for example tools such as a knife for assistance with cooking, or parts such as a screw on a factory floor. Humans communicate about objects using language and gesture, fusing information from multiple modalities over time. Existing work has addressed this problem in single modalities, such as natural language or gesture, or fused modalities in non-realtime systems, but a gap remains in creating systems that simultaneously fuse information from language and gesture over time. To address this problem, we define a multimodal Bayes' filter for interpreting referring expressions to objects. Our approach outputs a distribution over the referent object at 14 Hz, updating dynamically as it receives new observations of the person's spoken words and gestures. This real-time update enables a robot to dynamically respond with backchannel feedback while a person is still communicating, pointing toward a mathematical framework for human-robot communication as a joint activity [Clark, 1996]. Moreover, our approach takes into account rich timing information in the language as words are spoken by processing incremental output from the speech recognition system, traditionally ignored when processing a command as an entire sentence. It quickly adapts when the person refers to a new object. We collected a new dataset of people referring to objects in a tabletop setting and demonstrate that our approach is able to infer the correct object with 90% accuracy. Additionally, we demonstrate that our approach enables a Baxter robot to provide back-channel responses in real time.
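
A toy illustration of the kind of Bayes filter update the abstract describes: a discrete belief over candidate objects is reweighted each time a new word or pointing observation arrives. The object list and likelihood functions below are placeholders, not the authors' models.

    import numpy as np

    objects = ["knife", "spoon", "bowl"]           # candidate referents (illustrative)
    belief = np.ones(len(objects)) / len(objects)  # start from a uniform prior

    def word_likelihood(word):
        # Placeholder language model: boost objects whose name matches the word.
        return np.array([3.0 if word in name else 1.0 for name in objects])

    def gesture_likelihood(angles_deg):
        # Placeholder gesture model: a smaller angle between the pointing ray
        # and an object means a higher likelihood.
        a = np.radians(np.asarray(angles_deg, dtype=float))
        return np.exp(-a**2 / (2 * 0.3**2))

    def update(belief, likelihood):
        # Standard Bayes filter measurement update followed by normalization.
        posterior = belief * likelihood
        return posterior / posterior.sum()

    # Incremental observations arriving over time (a word, then a gesture).
    belief = update(belief, word_likelihood("knife"))
    belief = update(belief, gesture_likelihood([5.0, 40.0, 60.0]))
    print(dict(zip(objects, np.round(belief, 3))))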
 

Baxter at Work - Application Examples

Published on Nov 5, 2014

Baxter is on the job. Check out this compilation of Rethink Robotics Baxter working in various applications from across the country!
 

Final projects in "Introduction to Robotics" at CU Boulder, Fall 2014

Published on Dec 12, 2014

3-DOF Visual Servoing, decapping a bottle, picking up golf balls, shaking hands, mimicking motions, and shooting targets using the Baxter robot.
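
For the first of these projects, a classical image-based visual-servoing law computes a camera velocity from the image error of a tracked point feature. The sketch below is a generic textbook version with an assumed gain and depth, not the students' implementation.

    import numpy as np

    # Image-based visual servoing: drive the image error of a tracked point
    # feature to zero with the proportional law  v = -lambda * pinv(L) * e.
    lam = 0.5                                      # proportional gain (assumed)

    def interaction_matrix(x, y, Z):
        # Interaction matrix of a point feature, restricted to the three
        # translational camera velocities, at depth Z (metres).
        return np.array([[-1.0 / Z,  0.0,      x / Z],
                         [ 0.0,     -1.0 / Z,  y / Z]])

    def velocity_command(feature, target, Z=1.0):
        e = np.asarray(feature, float) - np.asarray(target, float)  # image error
        L = interaction_matrix(feature[0], feature[1], Z)
        return -lam * np.linalg.pinv(L).dot(e)     # 3-DOF translational velocity

    print(velocity_command(feature=(0.10, -0.05), target=(0.0, 0.0)))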
 

Part sorting by weight
December 13, 2014

A Baxter robot sorts objects by weight. Each object's weight is estimated from measured joint torques, and the objects are then sorted and numbered according to their relative weights.
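
A rough sketch of how such a torque-based estimate might be obtained with the Baxter SDK: compare a shoulder joint's measured effort before and after grasping, and use the difference as a relative weight. The joint choice, pacing, and interactive loop are assumptions for illustration, not the demo's actual code (Python 2, as used by the Baxter SDK).

    import rospy
    import baxter_interface

    rospy.init_node('sort_by_weight')
    limb = baxter_interface.Limb('left')
    gripper = baxter_interface.Gripper('left')

    def shoulder_effort():
        # Measured torque (N*m) at the left shoulder pitch joint.
        return limb.joint_efforts()['left_s1']

    def relative_weight():
        # The change in holding torque with and without the object serves as
        # a proxy for its weight (same grasp pose assumed for every part).
        baseline = shoulder_effort()
        gripper.close()
        rospy.sleep(1.0)                  # let the torque reading settle
        loaded = shoulder_effort()
        return abs(loaded - baseline)

    # Illustration: rank three parts by their estimated weight.
    weights = {}
    for part in ['part_1', 'part_2', 'part_3']:
        raw_input('Place %s in the gripper, then press Enter' % part)
        weights[part] = relative_weight()
        gripper.open()

    print(sorted(weights, key=weights.get))   # lightest to heaviest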
 

Kendo Robot - Fightback
December 8, 2014

This is a video demo of our project "Kendo Robot". The project is part of the course EE125/215 at the University of California, Berkeley, Fall semester 2014. Project team: Ingrid Kugelberg, James Lam, Jiewen Sun.
 

Baxter Robot Learning to Pour into a Moving Container

Published on May 22, 2015

A demonstration of Baxter learning to pour correct amounts of liquid into a moving container. This shows how our learning model can adjust very quickly to new task variations (specific volumes of liquid).

Full title:
Generation and Exploitation of Local Models for Rapid Learning of a Pouring Task on a Moving Platform

Authors:
Joshua D. Langsfeld and Krishnanand N. Kaipa and Satyandra K. Gupta

Maryland Robotics Center
University of Maryland, College Park, MD USA
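
The "local models" idea named in the title can be illustrated in a few lines: fit a small regression between a controllable pour parameter and the volume delivered using only recent trials, then invert it to pick the parameter for a new target volume. The linear form and the data below are stand-ins, not the Maryland group's actual model.

    import numpy as np

    # Recent pouring trials near the current operating point:
    # pour duration (s) vs. volume actually delivered (ml).  Illustrative data.
    durations = np.array([0.8, 1.0, 1.2, 1.4])
    volumes   = np.array([55., 72., 90., 108.])

    # Local linear model fitted only to these nearby trials.
    slope, intercept = np.polyfit(durations, volumes, deg=1)

    def duration_for(target_ml):
        # Invert the local model to choose a pour duration for a new target.
        return (target_ml - intercept) / slope

    print(round(duration_for(100.0), 2))   # suggested pour duration in seconds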
 

Baxter Robot Assisting in Software Testing

Published on May 27, 2015

A ONExia software engineer put one of our Baxter robots to work this week to assist in a long-term software test for one of our custom machines. During the two-hour test, Baxter was used to simulate the operator's interaction. This allowed our engineer to work on something else while Baxter pushed the trays for the duration of the test.
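
A hedged sketch of what such an operator-simulation script might look like with the Baxter SDK: cycle the arm between a "home" pose and a "push" pose for the length of the test. The joint angles, timing, and two-hour duration are placeholders, not ONExia's actual routine.

    import rospy
    import baxter_interface

    rospy.init_node('tray_push_test')
    baxter_interface.RobotEnable().enable()
    limb = baxter_interface.Limb('right')

    # Joint-angle dictionaries for the two poses (placeholder values; in
    # practice these would be recorded by hand-guiding the arm).
    home_pose = dict(zip(limb.joint_names(), [0.0, -0.55, 0.0, 0.75, 0.0, 1.26, 0.0]))
    push_pose = dict(zip(limb.joint_names(), [0.3, -0.35, 0.0, 0.90, 0.0, 1.10, 0.0]))

    end_time = rospy.Time.now() + rospy.Duration(2 * 3600)   # run for two hours
    while not rospy.is_shutdown() and rospy.Time.now() < end_time:
        limb.move_to_joint_positions(push_pose)   # push the tray in
        rospy.sleep(1.0)
        limb.move_to_joint_positions(home_pose)   # retract and wait
        rospy.sleep(5.0)                          # pace of the simulated operator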
 