
Aug 08, 2006

The Talented Mr. Ripley

Via Thinking Meat 

Ripley is a robot designed by Deb Roy of the Cognitive Machines Group at MIT's Media Lab.

The robot is designed to learn about its environment by moving through it and touching objects. The underlying theoretical framework is the "Grounded Situation Model". In this approach, developed by Deb Roy and his colleague Nikolaos Mavridis, "the robot updates beliefs about its physical environment and body, based on a mixture of linguistic, visual and proprioceptive evidence. It can answer basic questions about the present or past and also perform actions through verbal interaction".
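The quoted idea of fusing several streams of evidence into a single belief can be illustrated with a toy discrete Bayes update. Everything below is hypothetical (the locations, the likelihood numbers, the function names); the actual system works over a much richer continuous situation model.

```python
# Toy belief update over discrete tabletop locations, fusing two
# evidence sources (visual and linguistic) in the spirit of a grounded
# situation model. All values here are illustrative.

def normalize(belief):
    """Rescale so the probabilities sum to one."""
    total = sum(belief.values())
    return {loc: p / total for loc, p in belief.items()}

def update(belief, likelihood):
    """Bayes update: multiply the prior by the evidence likelihood."""
    return normalize({loc: belief[loc] * likelihood.get(loc, 0.0)
                      for loc in belief})

# Uniform prior over three regions of the table.
belief = normalize({"left": 1.0, "center": 1.0, "right": 1.0})

# Visual evidence: the camera weakly favors "center".
belief = update(belief, {"left": 0.2, "center": 0.6, "right": 0.2})

# Linguistic evidence: the speaker referred to "the object on the right".
belief = update(belief, {"left": 0.1, "center": 0.2, "right": 0.7})

# The robot's current best guess about where the object is.
best = max(belief, key=belief.get)
```

After both updates the linguistic cue outweighs the weak visual one, so the belief settles on "right"; answering a question about the present amounts to reading off this distribution.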

This NPR story reports on the project and includes an interview with Deb Roy.

From the MIT Media Lab website:

We have constructed a 7 degree-of-freedom robot, Ripley, to investigate connections between natural language semantics, perception, and action. Our goal is to enable Ripley to perform collaborative manipulation tasks mediated by natural spoken dialogue. Key issues include representation and learning of spatial language, object and temporal reference, and physical actions / verbs. Furthermore, a "Grounded Situation Model" representation has been designed for Ripley, as well as associated processes, and a cognitive architecture was implemented through numerous intercommunicating modules. 


Links to Ripley video clips (from the MIT Media Lab website):


Ripley imagines and remembers [high resolution (12M) | low resolution (440K)]

Ripley tracks faces [high resolution (6M) | low resolution (280K)]

Ripley imagines objects [high resolution (13.5M) | low resolution (826K)]

Ripley grasping objects [high resolution (201M) | low resolution (23M)]

Ripley changing perspectives [.mov, 17M]

Training HMM model to pick up [.mov, 844K]

HMM model generating pick up [.mov, 791K]
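The last two clips show hidden Markov models trained on pick-up motions and then run generatively. A minimal sketch of the generative side is to follow the transition structure over motion phases; the states and probabilities below are invented for illustration, and a real HMM would also carry emission distributions over joint angles.

```python
# Toy generation from a hand-specified transition matrix over the motion
# phases of a pick-up gesture. States and probabilities are illustrative,
# not taken from the trained models in the clips.

# transitions[s][t] = P(next phase is t | current phase is s)
transitions = {
    "reach":         {"reach": 0.3, "lower": 0.7},
    "lower":         {"lower": 0.4, "close_gripper": 0.6},
    "close_gripper": {"close_gripper": 0.2, "lift": 0.8},
    "lift":          {"lift": 1.0},
}

def most_likely_path(start, steps):
    """Greedily follow the highest-probability transition at each step."""
    path = [start]
    for _ in range(steps):
        current = transitions[path[-1]]
        path.append(max(current, key=current.get))
    return path

# Generate the phase sequence of a pick-up gesture.
phases = most_likely_path("reach", 3)
```

Following the strongest transition from each phase yields the sequence reach, lower, close_gripper, lift; training would instead estimate these probabilities from recorded trajectories rather than specifying them by hand.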

