Miscellaneous

World's First Emotional Home Robot

Published on Jul 4, 2014

Artificial intelligence is still a work in progress, but if you're simply looking for robotic companionship, we may have a humanoid for you.

Japanese mobile operator SoftBank teamed up with Aldebaran Robotics to develop Pepper, the world's first personal robot that can read emotions.

Feeling blue? Pepper can detect sadness based on your expressions and voice tones. Using built-in sensors and pre-programmed algorithms, the robot will also react appropriately.

In the vein of its corporate philosophy of "happiness for everyone," SoftBank entered the robot business "with the aim of developing affectionate robots that make people smile," according to CEO Masayoshi Son.

But Pepper does more than tell a joke or two. The device comes equipped with a 10.1-inch touch display, as well as voice-recognition and emotion-recognition technology, to enable human-to-humanoid communication. It can also handle gestures, like a wave of the arm or a nod of the head.

These advanced technologies make it easy for users to interact with Pepper just as they would family and friends. And, really, who needs human contact when they've got a robot that makes jokes, dances, and provides other forms of entertainment?

"For the past nine years, I've believed that the most important role of robots will be as kind and emotional companions to enhance our daily lives, to bring happiness, constantly surprise us, and make people grow," Aldebaran CEO Bruno Maisonnier said in a statement. "The emotional robot will create a new dimension in our lives and new ways of interacting with technology. It's just the beginning, but already a promising reality."

The friendly-looking humanoid will begin rolling into Japanese homes in February, for a base price of 198,000 yen (about $1,931 / £1,152).
 

Poor thing! Would you feel sorry for a simulated robot?

Published on Mar 27, 2015

This is the accompanying video for our HRI 2015 paper.

Poor Thing! Would You Feel Sorry for a Simulated Robot?
A comparison of empathy toward a physical and a simulated robot
dl.acm.org/citation.cfm?doid=2696454.2696471
Article "Would you feel sorry for a simulated robot? Study shows people empathize more with the real thing"

by Stela Seo, Denise Geiskkovitch, Masayuki Nakane, Corey King, Jim Young
March 27, 2015
 

Human-Robot Emotional and Musical Interactions - video lecture by Prof. Massimiliano Zecca

Published on May 11, 2015

In his lecture titled "Human-Robot Emotional and Musical Interactions", part of the IJARS Video Series, Prof. Massimiliano Zecca touches upon three main topics: emotional robotics, musical robotics, and wearable bioinstrumentation. Prof. Zecca centers his research on robotic systems and technologies for assisting those in need due to advanced age or illness. In this view, he is mostly interested in emotional-level interactions between robotic systems and humans, and in designing a mental model for the humanoid robot. This led to the development of the WE-4RII robot. His second research focus lies in musical robotics, where the basic idea is to make the robot capable of interacting with other musicians on the same level, as if the robot were a band member, a player in the same musicians' group. This is how the Waseda flutist robot No. 4 Refined VI was born, followed by the Waseda saxophonist. Finally, the first two research focuses raised the question of how people perceive the interaction with these robots, rather than concentrating solely on advancing the technology. This brought his attention to wearable sensors for humans, whose collected data is fed directly to robots for interpretation.

Prof. Zecca explores issues such as how people feel about interacting with the robot on an emotional level, how the robots interact within groups, and how they can "sense" what humans feel during the interaction. We invite you to watch his lecture for a more in-depth overview of the research presented.

"Human-robot emotional and musical interactions: Lecture by Massimiliano Zecca"

by International Journal of Advanced Robotic Systems (IJARS)
May 15, 2015
 

Hacking Emotional Intelligence (EQ) - Joe Dunn

Published on Jun 19, 2015

From Ignite Velocity San Jose 2015, a series of 5-minute presentations. Emotional Intelligence is a set of techniques for knowing yourself and relating to other people. It’s not that hard, despite often being described as a mysterious “secret sauce” for professional success. In this Ignite talk, we’ll introduce “Hacking EQ” – how to increase your EQ with a set of quick, simple techniques you can use any time and in any professional situation.
 

A Simple Mind Trick Will Help You Think More Rationally

Published on Jul 22, 2015

Emotions can cloud our rational decision-making. By adopting the perspective of an outside advisor, psychologist Dan Ariely says we can inject some rationality into our cognitive processes. Ariely's new book is titled "Irrationally Yours: On Missing Socks, Pickup Lines, and Other Existential Puzzles".

Transcript - There’s one way to be rational, there are many ways to be irrational. We could be irrational by getting confused, not taking actions, being myopic, vindictive, emotional. You name it. There’s lots of ways to be wrong. And because of that there’s not one way to fix it.

But one interesting way to try and inject some rationality is to think from an outsider’s perspective. So here’s what happens. When you think about your own life you’re trapped within your own perspective. You’re trapped within your own emotions and feelings and so on. But if you give advice to somebody else all of a sudden you’re not trapped within that emotional combination mish-mash complexity and you can give advice that is more forward-looking and not so specific to the emotions.

So one idea is to basically ask people for advice. So if you're falling in love with some person, good advice is to go to your mother and say, “Mother, what do you think about the long term compatibility of that person?” You’re infatuated, right. When you’re infatuated you’re not able to see things three months down the road. You’re saying I’m infatuated. I’ll stay infatuated forever and this will never go away. Your mother being an outsider is not infatuated and she could probably look at things like long term compatibility and so on. But there’s other ways to do it which is not to be advisors to other people but to be advisors for ourselves. So for example in one experiment we asked people, we said look, you went to your doctor.

They gave you this diagnosis. You know that the thing that the doctor recommended is much more expensive and there are other things that would be much cheaper. Would you go for a second opinion? And people say no, my doctor recommended it. How could I not take their advice? How could I say can you please refer me for a second opinion? Then we asked another group. We said here is the situation. If this happened to your friend would you recommend that they go for a second opinion? People said absolutely. How could you not go for a second opinion? So one idea is to try and view ourselves from an outside perspective. You look at the situation and then you say to yourself, if this was about somebody else, somebody I love and care about, in this situation what would I advise them? And you would realize that often your advice will be different, and often a more rational, useful perspective.
 

Beyond Verbal talks at IFA'15 about emotional BIGDATA

Published on Oct 1, 2015

We are passionate creatures that beat to the rhythm of our sentiments, yet our machines are oblivious to the very emotions that power us humans, and big data practically ignores this critical piece of information. But this is about to change. Enabling machines to understand our emotions introduces a whole new dimension to big data. A big data of emotions changes the way we look at and analyze our world on a macro level, but at the same time it also changes the very way we understand our own micro-selves.
The talk shows how emotions can be extracted, analyzed, and quantified on a massive scale, and what insights this can offer into our surroundings as well as our own personal wellbeing.
 

Computing With Emotions | Peter Robinson

Published on Nov 19, 2015

The importance of emotional expression as part of human communication has been understood since the seventeenth century, and has been explored scientifically since Charles Darwin and others in the nineteenth century. Recent advances in Psychology have greatly improved our understanding of the role of affect in communication, perception, decision-making, attention and memory. At the same time, advances in technology mean that it is becoming possible for machines to sense, analyse and express emotions. We can now consider how these advances relate to each other and how they can be brought together to influence future research in perception, attention, learning, memory, communication, decision-making and other applications.

This talk will survey recent advances in theories of emotion and affect, their embodiment in computational systems, the implications for general communications, and broader applications. The combination of new results in psychology with new techniques of computation on new technologies will enable new applications in commerce, education, entertainment, security, therapy and everyday life. However, there are important issues of privacy and personal expression that must also be considered.

Prof Peter Robinson
Computer Laboratory, Rainbow Research Group, University of Cambridge

Peter Robinson is Professor of Computer Technology in the Computer Laboratory at the University of Cambridge, where he leads the Rainbow Research Group working on computer graphics and interaction. Professor Robinson's research concerns problems at the boundary between people and computers. This involves investigating new technologies to enhance communication between computers and their users, and new applications to exploit these technologies. He has been leading work for some years on augmented environments in which everyday objects acquire computational properties through user interfaces based on video projection and digital cameras. Recent work has investigated inference of people's mental states from facial expressions, vocal nuances, body posture and gesture, and other physiological signals, and also considered the expression of emotions by robots and cartoon avatars.
 

Rudolph Tanzi: How to Encourage The Evolution of Your Brain

Published on Nov 21, 2015

Read more at BigThink.com: "Your Emotions Determine Your Genetic Activity"

Transcript - Emotion is so interesting. If you think about the evolution of emotions, you know, first, 400 million years ago, there was the reptilian brain, as we call it. And these are memories that were instinctively programmed by genetics. You don't need to learn how to run away when you're attacked or how to fight – fight or flight. You don't need to learn how to find food or to go find sex to reproduce, right. It's instinctively programmed. And then it was only about 100 million years ago that – if I use what's called the handy brain that Dan Siegel uses: this is the brainstem down here; you tuck your thumb in, that's the midbrain, so that's the 100 million and the 400 million. And then that's the frontal cortex; that's only four million years old. That's meaning, creativity, purpose, self-awareness.

Well, tucked in here, that's where we live. That's short-term memory. And the first short-term memories we had were based on the roots of our emotions – fear and desire. And what was fear? The first memory of pain and the anticipation of pain in the future. Pain or punishment. And what is desire but the first memory of pleasure and reward, and then the desire, the anticipation of that in the future. So the first acquired memories that involved us living our lives were about remembering something was bad and fearing it in the future, having anxiety, or remembering something's good and saying I want it again. We still live in that part of the brain. Now our emotions become more complicated – jealousy, greed, resentment. But they're all based in basically reward and punishment. Remembering reward, seeking it again; remembering punishment, avoiding it again, right.

So the way to think about this is, when you live in that short-term memory – the reason why we live there is that sensations are coming in all the time. We're seeing; all your five senses are bringing information to you. It's all packaged in one big bundle called the perforant pathway, because it perforates the short-term memory area, which is called the hippocampus – Greek for seahorse, because it looks like one.
 

Meeting an emotional robot - Dara O'Briain's Science Club - Brit Lab

Published on Nov 21, 2015

The first step in human-computer interaction is to teach machines to decode the more subtle, sophisticated cues of our feelings, starting with the face... Taken from Dara O'Briain's Science Club.
 

Davos 2016 - Issue Briefing: Infusing Emotional Intelligence into AI

Published on Jan 22, 2016

Learn first-hand about how to endow artificial intelligence with emotional intelligence using social-interaction skills that are too often ignored in emerging technologies.

Justine Cassell, Associate Dean, Technology, Strategy and Impact, School of Computer Science, Carnegie Mellon University, USA
Vanessa Evers, Professor of Human Media Interaction, University of Twente, Netherlands
Maja Pantic, Professor of Affective and Behavioral Computing, Imperial College London, United Kingdom
Moderated by
Michael Hanley, Head of Digital Communications, Member of the Executive Committee, World Economic Forum
 

Sarah Palin's emotions whilst endorsing Donald Trump

Published on Jan 25, 2016

Beyond Verbal decodes human vocal intonations into their underlying emotions, in real time - enabling voice-enabled devices or apps to understand our emotions.
 

Machines that can read human emotions

Published on Feb 19, 2016

Machines are already very good at recognising human emotions when they have a static, frontal view of a person’s face. Maja Pantic, Professor of Affective and Behavioral Computing at Imperial College London, shares progress towards identifying people’s emotions “in the wild” and discusses possible applications, from marketing to medicine.
 

Artificial Intelligence

Published on Aug 17, 2015

Should we be scared of artificial intelligence and all it will bring us? Not as long as we make sure to build artificial emotional intelligence into the technology.
 

Emotional technology

Published on Mar 9, 2016

When we think of the future of technology, we often imagine gadgets that will make us go faster. But some of the truly exciting developments will be around gadgets that help us with the tricky aspects of our emotional lives.
 

Processing emotions

Published on Apr 1, 2016

A new study from MIT reveals how two populations of neurons in the brain contribute to assigning emotional associations to specific events. The researchers hope this could shed light on mental illnesses and how best to treat them.

"How the brain processes emotions"
Neuroscientists identify circuits that could play a role in mental illnesses, including depression.

by Anne Trafton
March 31, 2016

Tye Lab: https://tyelab.org
Picower Institute for Learning and Memory: https://picower.mit.edu
MIT Brain and Cognitive Sciences: https://bcs.mit.edu
 