Henny Admoni


RI Seminar: Henny Admoni: Toward Natural Interactions with Assistive Robots

Published on Sep 22, 2017

Henny Admoni
Assistant Professor, Robotics Institute at Carnegie Mellon University

Abstract
Robots can help people live better lives by assisting them with the complex tasks involved in everyday activities. This is especially impactful for people with disabilities, who can benefit from robotic assistance to increase their independence. For example, physically assistive robots can collaborate with people in preparing a meal, enabling people with motor impairments to be self sufficient in cooking and eating. Socially assistive robots can act as tutors, coaches, and partners, to help people with social or learning deficits practice the skills they have learned in a non-threatening environment. Developing effective human-robot interactions in these cases requires a multidisciplinary approach that involves fundamental robotics algorithms, insights from human psychology, and techniques from artificial intelligence and machine learning.

In this talk, I will describe my vision for robots that collaborate with and assist humans on complex tasks. I will explain how we can leverage our understanding of natural, intuitive human behaviors to detect when and how people need assistance, and then apply robotics algorithms to produce effective human-robot interactions. I will discuss how models of human attention, drawn from cognitive science, can help select robot behaviors that improve human performance on a collaborative task. I will detail my work on algorithms that predict people’s mental states based on their eye gaze and provide assistance in response to those predictions. And I will show how breaking the seamlessness of an interaction can make robots appear smarter. Throughout the talk, I will describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.

Bio
Henny Admoni is an Assistant Professor in the Robotics Institute at Carnegie Mellon University, where she works on assistive robotics and human-robot interaction. Henny develops and studies intelligent robots that improve people’s lives by providing assistance through social and physical interactions. She studies how nonverbal communication, such as eye gaze and pointing, can improve assistive interactions by revealing underlying human intentions and increasing human-robot communication. Previously, Henny was a postdoctoral fellow at CMU with Siddhartha Srinivasa in the Personal Robotics Lab. Henny completed her PhD in Computer Science at Yale University with Professor Brian Scassellati. Her PhD dissertation was about modeling the complex dynamics of nonverbal behavior for socially assistive human-robot interaction. Henny holds an MS in Computer Science from Yale University, and a BA/MA joint degree in Computer Science from Wesleyan University. Henny’s scholarship has been recognized with awards such as the NSF Graduate Research Fellowship, the Google Anita Borg Memorial Scholarship, and the Palantir Women in Technology Scholarship.
 

RI Seminar: Henny Admoni: Understanding Human Behavior for Robotic Assistance and Collaboration

Published on Apr 29, 2019

Henny Admoni
Assistant Professor
Robotics Institute, Carnegie Mellon University
April 11, 2019

Human-robot collaboration has the potential to transform the way people work and live. Researchers are currently developing robots that assist people in public spaces, on the job, and in their homes. To be effective assistants, these robots must be able to recognize aspects of their human partners such as what their goals are, what their next action will be, and when they need help—in short, their task-relevant mental states. A large part of communication about mental states occurs nonverbally, through eye gaze, gestures, and other behaviors that provide implicit information. Therefore, to be effective collaborators, robots must understand nonverbal human communication as well as generate sufficiently expressive nonverbal behaviors that are understandable by their human partners. Developing effective human-robot interactions requires a multidisciplinary approach that involves fundamental robotics algorithms, insights from human psychology, and techniques from artificial intelligence, machine learning, and computer vision. In this talk, I will describe my work on robots that collaborate with and assist humans on complex tasks, such as eating a meal. I will show how robots can guide human action using nonverbal behaviors, and how natural, intuitive human behaviors can reveal human mental states that robots must respond to. Throughout the talk, I will describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.

Bio:
Henny Admoni is an Assistant Professor in the Robotics Institute at Carnegie Mellon University, where she leads the Human And Robot Partners (HARP) Lab. Henny studies how to develop intelligent robots that can assist and collaborate with humans on complex tasks like preparing a meal. She is most interested in how natural human behavior, like where someone is looking, can reveal underlying human mental states and can be used to improve human-robot interactions. Henny’s research has been supported by the US National Science Foundation, the US Office of Naval Research, the Paralyzed Veterans of America Foundation, and Sony Corporation. Her work has been featured in media outlets such as NPR’s Science Friday, Voice of America News, and WESA radio.
 

Robotics Professor answers robot questions from Twitter

Nov 22, 2022

Robotics professor Henny Admoni answers the internet's burning questions about robots! How do you program a personality? Can robots pick up a single M&M? Why do we keep making humanoid robots? What is Elon Musk's goal for the Tesla Optimus robot? Will robots take over my job writing video descriptions...I mean, um, all our jobs? Henny answers all these questions and much more.

Director: Lisandro Perez-Rey
Director of Photography: Jeff Smee
Editor: Ron Douglas
Expert: Henny Admoni

Line Producer: Joseph Buscemi
Associate Producer: Brandon White
Production Manager: Eric Martinez
Production Coordinator: Fernando Davila

Camera Operator: Alex Grant
Audio: Robert Buncher
Production Assistant: Maria Bosetti

Post Production Supervisor: Alexa Deutsch
Post Production Coordinator: Ian Bryant
Supervising Editor: Doug Larsen
Assistant Editor: Andy Morell

Special Thanks: Human And Robot Partners Lab at Carnegie Mellon University
 