Researchers at MIT have developed a new gesture-based system that combines a standard webcam, colored Lycra gloves, and software built around a dataset of hand images. This simple, inexpensive setup translates hand gestures into a computer-generated 3D model of the hand in real time. Once the webcam captures an image of the glove, the software matches it against the corresponding hand position stored in the visual dataset and retrieves the associated 3D pose. This approach reduces computation time because there is no need to calculate the relative positions of the fingers, palm, and back of the hand.
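The core idea is a lookup rather than a reconstruction: the glove image acts as a key into a database of known poses. The sketch below is a minimal illustration of that kind of nearest-neighbour lookup, with made-up names, sizes, and random placeholder data standing in for the real glove images and hand poses; it is not the MIT implementation.

```python
import numpy as np

# Hypothetical database: each entry pairs a small, normalized glove image
# (flattened colour patch) with the 3D joint positions of the hand pose
# that produced it. In the real system such a dataset is built offline.
NUM_ENTRIES = 2_000
PATCH_SIZE = 32 * 32 * 3          # tiny downsampled colour image
NUM_JOINTS = 21                   # joints in the 3D hand model

rng = np.random.default_rng(0)
database_images = rng.random((NUM_ENTRIES, PATCH_SIZE), dtype=np.float32)
database_poses = rng.random((NUM_ENTRIES, NUM_JOINTS, 3), dtype=np.float32)


def lookup_pose(query_patch: np.ndarray) -> np.ndarray:
    """Return the stored 3D hand pose whose glove image best matches the query.

    No finger or palm positions are computed from the image itself: we simply
    find the most similar stored picture and reuse the pose attached to it.
    """
    # Sum-of-squared-differences between the query and every stored image.
    distances = np.sum((database_images - query_patch) ** 2, axis=1)
    best = int(np.argmin(distances))
    return database_poses[best]


# Example: one webcam frame, cropped to the glove and downsampled.
frame_patch = rng.random(PATCH_SIZE, dtype=np.float32)
pose = lookup_pose(frame_patch)
print(pose.shape)  # (21, 3) -> one 3D position per joint of the hand model
```

Because the expensive work happens when the dataset is built, the per-frame cost at runtime is just this comparison against stored images, which is what keeps the system fast enough for real-time use.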
