This is a robot I developed for my final major project at the University of Huddersfield. The robot can track movement and respond to questions typed on a wireless keyboard. It uses an adapted version of the ELIZA framework to generate responses to participants' questions; the resulting 'script' is then passed to Apple's speech synthesiser via AppleScript, so the replies are heard through the robot's internal speaker system.
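As a rough illustration of this pipeline, the sketch below pairs a minimal ELIZA-style keyword lookup with macOS speech output. The rules shown are hypothetical stand-ins, not the project's actual Marvin script, and the `say` call assumes a macOS host.

```python
import subprocess

# Hypothetical ELIZA-style keyword rules; the real project uses a fuller script.
RULES = {
    "life": "Life? Don't talk to me about life.",
    "how are you": "I think you ought to know I'm feeling very depressed.",
}
DEFAULT = "I could calculate an answer, but you wouldn't like it."

def respond(question: str) -> str:
    """Return the first matching canned reply, else a default line."""
    q = question.lower()
    for keyword, reply in RULES.items():
        if keyword in q:
            return reply
    return DEFAULT

def speak(text: str) -> None:
    """Speak the reply aloud using the built-in macOS `say` command."""
    subprocess.run(["say", text], check=True)
```

In use, each typed question goes through `respond()` and the result is handed to `speak()`, so the keyboard input and the speaker output stay decoupled.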
The project was built as an interactive exhibit, but it is highly adaptable and would also suit museum displays. The ELIZA framework can be rescripted to mimic any individual (it is currently based on Marvin from The Hitchhiker's Guide to the Galaxy) and to answer complex questions, making this a highly interactive and knowledgeable system. The voice can likewise be modelled on specific individuals, and the mouth and lips react to the audio signal reaching the computer board, so the lip movement stays more or less in time with the speech.
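One simple way to drive that audio-reactive mouth movement, sketched below under the assumption that the board sees frames of normalised audio samples, is to map each frame's RMS loudness to a 0–1 mouth-opening value. The `floor` and `ceiling` thresholds are illustrative, not the project's actual calibration.

```python
import math

def mouth_openness(samples, floor=0.01, ceiling=0.3):
    """Map a frame of audio samples in [-1, 1] to a 0..1 mouth-opening value.

    RMS loudness below `floor` keeps the mouth closed; loudness at or above
    `ceiling` opens it fully. Values in between scale linearly.
    """
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Clamp the normalised value into the 0..1 range for the servo/animation.
    return max(0.0, min(1.0, (rms - floor) / (ceiling - floor)))
```

Because the value is derived from the audio actually being played, the mouth tracks any voice the system outputs without per-phrase animation work.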
The system tracks people's movement via a Kinect module. The current build uses an open-source software library that tracks the pixel nearest to the sensor, though skeleton tracking can also be used. Skeleton tracking would allow multiple individuals to be tracked and interacted with at once, allowing for larger audiences.
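The nearest-pixel approach can be sketched as a search over a Kinect-style depth frame, where each pixel holds a distance in millimetres and zero marks an invalid reading. This is an assumed minimal implementation using NumPy, not the specific library the project used.

```python
import numpy as np

def nearest_point(depth):
    """Return (row, col, depth_mm) of the closest valid pixel in a depth frame.

    `depth` is an integer array of distances in millimetres; zero values mean
    the sensor got no reading there and are ignored.
    """
    # Replace invalid (zero) pixels with the dtype's maximum so argmin skips them.
    valid = np.where(depth > 0, depth, np.iinfo(depth.dtype).max)
    row, col = np.unravel_index(np.argmin(valid), depth.shape)
    if depth[row, col] == 0:  # the whole frame was invalid
        return None
    return int(row), int(col), int(depth[row, col])
```

Running this per frame gives the robot a single point to orient toward; swapping in a skeleton-tracking library would instead yield one tracked body per visitor.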