Sep 26, 2010

Change in brain activity through virtual reality-based brain-machine communication in a chronic tetraplegic subject with muscular dystrophy

BMC Neurosci. 2010 Sep 16;11(1):117

Authors: Hashimoto Y, Ushiba J, Kimura A, Liu M, Tomita Y

ABSTRACT

Background: For severely paralyzed people, a brain-computer interface (BCI) provides a way of re-establishing communication. Although subjects with muscular dystrophy (MD) appear to be potential BCI users, the actual long-term effects of BCI use on brain activities in MD subjects have yet to be clarified. To investigate these effects, we followed BCI use by a chronic tetraplegic subject with MD over 5 months. The topographic changes in an electroencephalogram (EEG) after long-term use of the virtual reality (VR)-based BCI were also assessed. Our originally developed BCI system was used to classify an EEG recorded over the sensorimotor cortex in real time and estimate the user's motor intention (MI) in 3 different limb movements: feet, left hand, and right hand. An avatar in the internet-based VR was controlled in accordance with the results of the EEG classification by the BCI. The subject was trained to control his avatar via the BCI by strolling in the VR for 1 hour a day and then continued the same training twice a month at his home.

Results: After the training, the error rate of the EEG classification decreased from 40% to 28%. The subject successfully walked around in the VR using only his MI and chatted with other users through a voice-chat function embedded in the internet-based VR. With this improvement in BCI control, event-related desynchronization (ERD) following MI was significantly enhanced (p < 0.01) for feet MI (from -29% to -55%), left-hand MI (from -23% to -42%), and right-hand MI (from -22% to -51%).

Conclusions: These results show that our subject with severe MD was able to learn to control his EEG signal and communicate with other users through use of VR navigation and suggest that an internet-based VR has the potential to provide paralyzed people with the opportunity for easy communication.
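The ERD percentages quoted above follow the standard band-power definition: ERD% = (A - R) / R x 100, where R is the band power in a reference interval and A the band power during motor imagery, so more negative values mean stronger desynchronization. A minimal sketch of that computation, assuming mu-band (8-12 Hz) filtering and a 256 Hz sampling rate (the abstract does not give the paper's exact band or intervals):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(x, fs, lo=8.0, hi=12.0):
    """Mean power of signal x in the mu band (8-12 Hz assumed),
    estimated with a 4th-order Butterworth band-pass filter."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.mean(filtfilt(b, a, x) ** 2)

def erd_percent(baseline, imagery, fs=256):
    """ERD% = (A - R) / R * 100; negative values indicate
    desynchronization relative to the reference interval."""
    r = band_power(baseline, fs)
    a = band_power(imagery, fs)
    return (a - r) / r * 100.0
```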

Sep 20, 2010

XWave: Control your iPhone with your brain

The XWave is a new technology that uses a single electrode placed on the wearer's forehead to measure electroencephalography (EEG) data and converts these analog signals into digital form so they can be used to control an external device. The XWave comes bundled with software that includes a number of brain-training exercises, such as levitating a ball on the iDevice's screen, changing a color according to your brain's relaxation level, and training your brain to maximize its attention span.

In the company’s own words:

XWave, powered by NeuroSky eSense patented technologies, senses the faintest electrical impulses transmitted through your skull to the surface of your forehead and converts these analogue signals into digital. With XWave, you will be able to detect attention and meditation levels, as well as train your mind to control things. Objects in a game can be controlled, lights in your living room can change colour depending on your mood; the possibilities are limited to only the power of your imagination.

Notably, the company is also opening its APIs so that developers can design and build their own apps for the XWave. The company reports that some apps already in development include games in which objects are controlled by the wearer's mind, and another that lets the wearer control the lights in their home or select music to match their mood. You can order an XWave for US$100; it ships on November 1.
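The XWave API itself is not documented here, so the following is a purely hypothetical sketch of the kind of app loop a developer might write against it: read_attention() is an invented stand-in for whatever SDK call exposes the eSense-style attention level (0-100), and the ball-levitation exercise is reduced to a linear mapping from attention to on-screen height.

```python
import random
import time

def read_attention():
    """Hypothetical stand-in for an XWave SDK call; returns an
    eSense-style attention level on a 0-100 scale (simulated here)."""
    return random.randint(0, 100)

def ball_height(attention, max_height=300):
    """Map the attention level linearly to the ball's on-screen height."""
    return max_height * attention / 100

if __name__ == "__main__":
    for _ in range(5):
        level = read_attention()
        print(f"attention={level:3d} -> ball height {ball_height(level):.0f}px")
        time.sleep(1)
```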


Sep 19, 2010

Artificial skin projects could restore feeling to wearers of prosthetic limbs

Via Telemedicine and E-Health news

Research groups at Stanford University and the University of California at Berkeley are developing sensor-based artificial skin that could provide prosthetic and robotic limbs with a realistic sense of touch. Stanford's project is based on organic electronics and is capable of detecting the weight of a fly upon the artificial skin, according to Zhenan Bao, professor of chemical engineering at Stanford.

The highly sensitive surfaces could also help robots pick up delicate objects without breaking them, improve surgeons' control over tools used in minimally invasive surgery, and improve the efficiency of touch-screen devices, she noted. Meanwhile, UC Berkeley's "e-skin" uses low-power, integrated arrays of nanowire transistors, according to UC Berkeley Professor of Electrical Engineering and Computer Science Ali Javey.

Thus far the skin, the first ever made from inorganic single-crystalline semiconductors, can detect pressure equivalent to the touch of a key on a keyboard. "It's a technique that can be potentially scaled up," said study lead author Kuniharu Takei, a post-doctoral fellow in electrical engineering and computer sciences at UC Berkeley. "The limit now to the size of the e-skin we developed is the size of the processing tools we are using."
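Matrix-addressed sensor arrays like the e-skin are typically read out pixel by pixel and thresholded to localize a touch. Below is a rough sketch of that readout logic with an invented 16x16 array, kilopascal units, and detection threshold; none of these figures come from the article.

```python
import numpy as np

# Hypothetical 16x16 frame from a pressure-sensitive pixel array,
# in kilopascals; a keystroke is assumed to be on the order of a few kPa.
rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 0.5, size=(16, 16))  # ambient noise floor
frame[4, 7] = 3.2                             # one simulated touch

THRESHOLD_KPA = 1.0  # assumed detection threshold

# Scan the frame, as a matrix-addressed transistor array would be read out,
# and report every pixel whose pressure exceeds the threshold.
for row, col in zip(*np.nonzero(frame > THRESHOLD_KPA)):
    print(f"touch at pixel ({row}, {col}): {frame[row, col]:.1f} kPa")
```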

Sep 03, 2010

Samsung Galaxy Tab: iPad killer or flop?

Samsung has finally unveiled its new Galaxy Tab at the IFA conference in Berlin. The Galaxy Tab runs the Android 2.2 operating system, which supports HTML5 and Adobe's Flash Player, unlike the iPad. It comes with 16 or 32GB of storage, expandable by a further 32GB, weighs 380g (14oz), and has an 18cm (7in) screen, making it smaller and lighter than the iPad.

The Galaxy Tab supports Bluetooth, Wi-Fi and 3G cellular networks, and comes with two cameras: a 3-megapixel camera with flash on the back of the device, and a second camera on the front for video conferencing, a feature the iPad lacks.

The price is not very competitive, though: a number of European news sources are reporting that the Galaxy Tab will cost €699 and €799 for the 16GB and 32GB models, respectively.

Sep 02, 2010

The Blue Brain Project

As computing power continues to increase, it may ultimately become possible to simulate the functioning of the most complex system in the known universe: the brain. This is the ambitious goal of the Blue Brain Project, the first attempt to reverse-engineer the mammalian brain.

The project is expected to provide answers to a number of fundamental questions, ranging from the emergence of biological intelligence to the evolution of consciousness.

Led by neuroscientist Henry Markram, Blue Brain was launched in 2005 as a joint research initiative between the Brain Mind Institute at the École Polytechnique Fédérale de Lausanne (EPFL) and the information technology giant IBM. Using the impressive processing power of IBM's Blue Gene/L supercomputer, the project reached its first milestone in December 2006 with the completion of a model of a rat's neocortical column (NCC).

To perform the empirically based simulation of individual cells, the Blue Gene/L supercomputer runs the NEURON software developed by Michael Hines, John W. Moore, and Ted Carnevale at Yale and Duke. The software effectively turns each processor into a neuron simulator and the communication cables into axons interconnecting the neurons, transforming the entire Blue Gene into a cortical microcircuit.
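NEURON itself is openly available and has a Python interface, so a minimal single-compartment example gives a feel for the kind of simulation each processor runs. The parameters below are purely illustrative and have nothing to do with Blue Brain's detailed column models.

```python
from neuron import h
h.load_file("stdrun.hoc")  # load NEURON's standard run system

# Single-compartment cell with classic Hodgkin-Huxley channels
soma = h.Section(name="soma")
soma.L = soma.diam = 20    # geometry in microns (illustrative)
soma.insert("hh")          # built-in Hodgkin-Huxley mechanism

# Brief current injection to elicit a spike
stim = h.IClamp(soma(0.5))
stim.delay, stim.dur, stim.amp = 5, 1, 0.5  # ms, ms, nA

# Record membrane potential and time
v = h.Vector().record(soma(0.5)._ref_v)
t = h.Vector().record(h._ref_t)

h.finitialize(-65)   # initialize membrane potential (mV)
h.continuerun(40)    # simulate 40 ms

print(f"peak membrane potential: {v.max():.1f} mV")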

In November 2007, the project completed its first phase with the development of a new modelling framework for the construction and validation of neural circuits built from biological data. The project is now striving to simplify the simulation of the column so that multiple connected columns can be simulated in parallel. If this strategy succeeds, the final objective will be to simulate a whole human neocortex, which comprises about one million cortical columns.

More to explore:

H. Markram, The Blue Brain Project, Nature Reviews Neuroscience, 7:153-160, 2006