
Painting a picture without lifting a finger, Brain & Behaviour Lab, London, United Kingdom



Airicist
1st October 2015, 12:47
Developer - Brain & Behaviour Lab (https://pr.ai/showthread.php?12593)

Team:

Aldo Faisal (https://pr.ai/showthread.php?12592)

William Abbot

Sabine Dziemian

Airicist
1st October 2015, 12:49
https://youtu.be/qiX8QQorPiY

Robot art

Published on Sep 25, 2015


Engineers from Imperial College London have developed computer software that enables the user to control a robotic arm with eye commands to paint a simple picture.

Article "Painting a picture without lifting a finger, thanks to robotic technology (http://www3.imperial.ac.uk/newsandeventspggrp/imperialcollege/newssummary/news_24-9-2015-12-43-1)"

by Martin Sayers, Colin Smith
September 25, 2015

Airicist
25th October 2015, 14:58
https://youtu.be/kbQzph3Q5Qg

Scientist eats, drinks and paints simultaneously

Published on Oct 22, 2015


A computer scientist uses an eye-tracking robotic arm to paint a picture while simultaneously eating and drinking, demonstrating how, in the future, the technology could help people multi-task by literally giving them an extra hand. Matthew Stock reports.

Transcript:


Sabine Dziemian (PRON. Sabeen-A Jem-Yian) may well be the first person in the world to eat a croissant and drink coffee while simultaneously painting a picture. Researchers from Imperial College London say their system demonstrates how eye-tracking technology could literally give people an extra pair of hands. They say their software accurately decodes a user's intended action from their eye movements.

(SOUNDBITE) (English) DR. ALDO FAISAL, ASSOCIATE PROFESSOR IN NEUROTECHNOLOGY AT IMPERIAL COLLEGE LONDON, SAYING: "So you can imagine, for example, when you want to grab a cup, you will look at that cup before you grab it. And you will look in a specific way so you can judge where it is and how wide you have to shape your grip. And so we're developing algorithms that decode this intention from eye movement and we're then translating them into action."

Using industrial robotics and simple eye-tracking hardware, the team's intuitive algorithm translates the path of the user's gaze and blinking into commands that control the robotic arm. The resulting painting is, admittedly, not quite a Picasso. But the system is very easy to use - even when your hands are otherwise occupied.

(SOUNDBITE) (English) SABINE DZIEMIAN (PRON. Sabeen-A Jem-Yian), POSTGRADUATE STUDENT IN THE DEPARTMENT OF COMPUTING AT IMPERIAL COLLEGE LONDON, SAYING: "It's very intuitive because I don't have to think about commands or something like this. I simply think about where I want to draw or which colour I want to take. And by thinking, a person usually looks at that colour... I didn't need a lot of time to learn how to use it. Actually, using it one time was enough to know how to control it completely."

It could have a huge impact on the lives of people with disabilities, and maybe one day replace risky brain implants.

(SOUNDBITE) (English) DR. ALDO FAISAL, ASSOCIATE PROFESSOR IN NEUROTECHNOLOGY AT IMPERIAL COLLEGE LONDON, SAYING: "We are following a non-invasive approach where you don't have to put technology into the head, but you can just, you know, you can just take it on and off like a pair of glasses. That's the level of technology that we want to offer to people."

The researchers are looking for partners to commercialise the technology. And they're working on making the software even more intuitive, so that it becomes a seamless interface between man and machine.
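The transcript describes gaze paths and blinks being translated into robot commands. A minimal sketch of that general idea (not the Imperial team's actual software - class names, thresholds, and the command format below are all assumptions): a sustained fixation (dwell) arms a target, and a blink confirms it.

```python
from dataclasses import dataclass
import math


@dataclass
class GazeSample:
    """One reading from a hypothetical eye tracker."""
    x: float
    y: float
    blink: bool  # True while the eyes are closed


class GazeDecoder:
    """Turns a stream of gaze samples into discrete robot-arm commands.

    A fixation is detected when the last `dwell_samples` readings all lie
    within `dwell_radius` of their centroid; the next blink then issues a
    command for that target.
    """

    def __init__(self, dwell_samples=10, dwell_radius=20.0):
        self.dwell_samples = dwell_samples  # samples needed for a fixation
        self.dwell_radius = dwell_radius    # max scatter (px) in a fixation
        self._window = []                   # recent gaze points
        self._selected = None               # armed target, if any

    def feed(self, sample):
        """Process one sample; return a command string or None."""
        if sample.blink:
            # Blink: fire the armed target (if any) and reset.
            self._window.clear()
            if self._selected is not None:
                target, self._selected = self._selected, None
                return f"MOVE_ARM_TO {target[0]:.0f},{target[1]:.0f}"
            return None

        self._window.append((sample.x, sample.y))
        if len(self._window) > self.dwell_samples:
            self._window.pop(0)

        if len(self._window) == self.dwell_samples:
            cx = sum(p[0] for p in self._window) / self.dwell_samples
            cy = sum(p[1] for p in self._window) / self.dwell_samples
            if all(math.hypot(x - cx, y - cy) <= self.dwell_radius
                   for x, y in self._window):
                self._selected = (cx, cy)  # fixation detected: arm the target
        return None
```

For example, feeding ten steady samples at one point and then a blink yields a single `MOVE_ARM_TO` command for that point; wandering gaze never arms a target, so stray blinks do nothing.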