
Thread: Miscellaneous

  1. #21


    EmoDetect

    Published on May 30, 2016

    Emotion recognition program
    Neurobotics

  2. #22


    Can a robot feel? | Susan Schneider | TEDxCambridge

    Published on Jun 22, 2016

    If and when you encounter an AI, it is best to look beyond superficialities, like a humanlike appearance. Perhaps only biological beings can have experience, or perhaps superintelligent AI doesn't need to be conscious. Susan Schneider proposes a test for determining whether AI can be conscious.

    Susan Schneider is an associate professor of philosophy and cognitive science at the University of Connecticut and a member of the Interdisciplinary Center for Bioethics at Yale University. Dr. Schneider writes about the nature of the self, which she examines from the vantage point of philosophy of mind and cognitive science. Her work wrestles with vexed questions such as the nature of the mind, whether AI can be conscious, and how to prepare for artificial general intelligence (AGI), superintelligent AI, and futuristic brain enhancements such as brain chips and mind uploading.

  3. #23


    ei: emotional intelligence
    July 31, 2016

    "The story of an AI unit who is anything but artificial."
    First-year film at the University of Pennsylvania; it took around ten months to complete. Thank you to everyone who has supported me along the way.
    A film by Dennis Sung Min Kim
    Narration by Adam Parham
    Music composed and arranged by Nicholas Escobar

  4. #24


    EQ-Radio: emotion recognition using wireless signals

    Published on Sep 20, 2016

    "Emotion Recognition using Wireless Signals"

    by Mingmin Zhao, Fadel Adib, Dina Katabi
    Massachusetts Institute of Technology

  5. #25


    Why AI needs emotion - Rana El Kaliouby (Affectiva)

    Published on Sep 29, 2016

    Highly connected, interactive artificial intelligence systems surround us daily, but as smart as these systems are, they lack the ability to truly empathize with us. Rana El Kaliouby explores why emotion AI is critical to accelerating the adoption of AI systems, how emotion AI is being used today, and what the future will look like.
    By O'Reilly Media, Inc.

  6. #26


    Can machines read your emotions? - Kostas Karpouzis

    Published on Nov 29, 2016

    Computers can beat us in board games, transcribe speech, and instantly identify almost any object. But will future robots go further by learning to figure out what we’re feeling? Kostas Karpouzis imagines a future where machines and the people who run them can accurately read our emotional states — and explains how that could allow them to assist us, or manipulate us, at unprecedented scales.

    Lesson by Kostas Karpouzis, animation by Lasse Rützou Bruntse.

  7. #27


    FutureRobot FURO emotional face behavior

    Published on Dec 23, 2016

    Future Robot Co., Ltd.

  8. #28


    Avengers Ultron Part 28, A REAL ROBOT - Tracking Emotions - XRobots

    Published on Dec 27, 2016

    Avengers Ultron Part 28, A REAL ROBOT: this time I'm adding emotion tracking to Ultron's AI and demonstrating how it could make him react. The robot was fabricated with 3D-printed mechanics and Arduino-based electronics.
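    The build's own code isn't posted in this thread, so the following is only a rough sketch of the general pattern such a project might use: an emotion classifier running on a PC forwards a one-byte reaction command to the Arduino over serial. The port name, baud rate, and command mapping below are all hypothetical.

        import serial  # pyserial

        # Hypothetical mapping from a detected emotion label to a one-byte
        # command the Arduino sketch could interpret as a head/eye reaction.
        REACTIONS = {"happiness": b"H", "sadness": b"S", "anger": b"A", "surprise": b"U"}

        def react(emotion, port="/dev/ttyACM0", baud=9600):
            """Send the reaction command for a detected emotion, if one is defined."""
            command = REACTIONS.get(emotion)
            if command is not None:
                with serial.Serial(port, baud, timeout=1) as ser:
                    ser.write(command)

        react("happiness")  # the Arduino side decides what "H" looks like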

  9. #29


    Sketching CuddleBits: Coupled Prototyping of Body and Behaviour for an Affective Robot Pet

    Published on May 2, 2017

    "Sketching CuddleBits: Coupled Prototyping of Body and Behaviour for an Affective Robot Pet"
    by Paul Bucci, Xi Laura Cang, Anasazi Valair, David Marino, Lucia Tseng, Merel Jung, Jussi Rantala, Oliver S Schneider, Karon E MacLean

    CHI'17: ACM CHI Conference on Human Factors in Computing Systems
    Session: Fabrication and DIY

    Abstract:
    Social robots that physically display emotion invite natural communication with their human interlocutors, enabling applications like robot-assisted therapy where a complex robot's breathing influences human emotional and physiological state. Using DIY fabrication and assembly, we explore how simple 1-DOF robots can express affect with economy and user customizability, leveraging open-source designs.

    We developed low-cost techniques for coupled iteration of a simple robot's body and behaviour, and evaluated its potential to display emotion. Through two user studies, we (1) validated these CuddleBits' ability to express emotions (N=20); (2) sourced a corpus of 72 robot emotion behaviours from participants (N=10); and (3) analyzed it to link underlying parameters to emotional perception (N=14).

    We found that CuddleBits can express arousal (activation), and to a lesser degree valence (pleasantness). We also show how a sketch-refine paradigm combined with DIY fabrication and novel input methods enables parametric design of physical emotion display, and discuss how mastering this parsimonious case can give insight into layering simple behaviours in more complex robots.
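    To make the abstract's "parametric design of physical emotion display" concrete, here is a minimal illustrative sketch, not the paper's fitted model: a 1-DOF breathing waveform whose frequency and amplitude grow with arousal and whose smoothness grows with valence, mirroring the finding that arousal is the easier dimension to express. All constants are assumptions chosen for illustration.

        import math

        def breath_position(t, arousal, valence):
            """Normalized 1-DOF actuator position (0..1) at time t seconds.

            arousal, valence in [-1, 1]. Higher arousal -> faster, larger
            "breaths"; lower valence -> more abrupt, square-ish motion.
            The mappings are illustrative, not CuddleBits' actual parameters.
            """
            freq = 0.25 + 0.75 * (arousal + 1) / 2    # 0.25 Hz (calm) .. 1.0 Hz (excited)
            amp = 0.3 + 0.5 * (arousal + 1) / 2       # fraction of full travel
            wave = math.sin(2 * math.pi * freq * t)
            # Pleasant states stay sinusoidal; tense states flatten the peaks.
            exponent = 0.4 + 0.6 * (valence + 1) / 2  # 0.4 (tense) .. 1.0 (smooth)
            shaped = math.copysign(abs(wave) ** exponent, wave)
            return 0.5 + 0.5 * amp * shaped

        # Sample two seconds of an "excited but unhappy" behaviour at 10 Hz.
        trajectory = [breath_position(t / 10, arousal=0.8, valence=-0.5) for t in range(20)]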

  10. #30


    The Robot Program 027 - Microsoft Cognitive Emotion

    Published on May 18, 2017

    Robot overlord DJ Sures and Professor E show you how to use Microsoft Cognitive Emotion. Your robot can tell if you're happy or sad!
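    For reference, the Emotion API shown in the episode was, at the time, a plain REST endpoint, and the EZ-Robot skill presumably wraps a call like the hedged Python sketch below. The region, key, and file name are placeholders, and Microsoft has since folded this v1.0 Emotion API into the Face API.

        import requests

        # v1.0 Emotion API endpoint as it existed around this video's release;
        # the region ("westus") and the subscription key are placeholders.
        ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
        KEY = "your-subscription-key"

        def dominant_emotion(image_path):
            """Return the top-scoring emotion for the first detected face."""
            with open(image_path, "rb") as f:
                body = f.read()
            r = requests.post(
                ENDPOINT,
                headers={"Ocp-Apim-Subscription-Key": KEY,
                         "Content-Type": "application/octet-stream"},
                data=body,
            )
            r.raise_for_status()
            faces = r.json()  # one entry per detected face
            if not faces:
                return None
            scores = faces[0]["scores"]  # e.g. {"happiness": 0.98, "sadness": 0.001, ...}
            return max(scores, key=scores.get)

        print(dominant_emotion("camera_frame.jpg"))  # e.g. "happiness"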
