
Jul 20, 2006

Using thought power to control artificial limbs

Neuronal ensemble control of prosthetic devices by a human with tetraplegia

Nature 442, 164-171(13 July 2006)

Leigh R. Hochberg, Mijail D. Serruya, Gerhard M. Friehs, Jon A. Mukand, Maryam Saleh, Abraham H. Caplan, Almut Branner, David Chen, Richard D. Penn and John P. Donoghue

Neuromotor prostheses (NMPs) aim to replace or restore lost motor functions in paralysed humans by routeing movement-related signals from the brain, around damaged parts of the nervous system, to external effectors. To translate preclinical results from intact animals to a clinically useful NMP, movement signals must persist in cortex after spinal cord injury and be engaged by movement intent when sensory inputs and limb movement are long absent. Furthermore, NMPs would require that intention-driven neuronal activity be converted into a control signal that enables useful tasks. Here we show initial results for a tetraplegic human (MN) using a pilot NMP. Neuronal ensemble activity recorded through a 96-microelectrode array implanted in primary motor cortex demonstrated that intended hand motion modulates cortical spiking patterns three years after spinal cord injury. Decoders were created, providing a 'neural cursor' with which MN opened simulated e-mail and operated devices such as a television, even while conversing. Furthermore, MN used neural control to open and close a prosthetic hand, and perform rudimentary actions with a multi-jointed robotic arm. These early results suggest that NMPs based upon intracortical neuronal ensemble spiking activity could provide a valuable new neurotechnology to restore independence for humans with paralysis.
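The decoder details are specific to the BrainGate system, but the core idea of a "neural cursor", fitting a linear filter from binned spike counts to intended cursor velocity, can be sketched on synthetic data. Everything below (the 96-unit count aside) is illustrative: the linear-tuning model, noise level, and bin count are assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 96 electrodes, spike counts in short time bins.
# Assume (for illustration only) that each unit's firing rate varies
# linearly with intended 2-D cursor velocity.
n_units, n_bins = 96, 500
true_tuning = rng.normal(size=(n_units, 2))      # per-unit velocity tuning
velocity = rng.normal(size=(n_bins, 2))          # intended velocities
rates = velocity @ true_tuning.T + 0.1 * rng.normal(size=(n_bins, n_units))

# Fit a linear filter (ordinary least squares) from rates to velocity.
W_hat, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode: predicted velocity per bin, integrated into a cursor trajectory.
v_pred = rates @ W_hat
cursor = np.cumsum(v_pred, axis=0)

r = np.corrcoef(v_pred[:, 0], velocity[:, 0])[0, 1]
print(f"decoded-vs-intended velocity correlation: {r:.3f}")
```

The design choice worth noting is that the filter is refit from calibration data per session, which is how such systems cope with day-to-day changes in the recorded population.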


Jul 18, 2006

A high-performance brain-computer interface

A high-performance brain-computer interface.

Nature. 2006 Jul 13;442(7099):195-8

Authors: Santhanam G, Ryu SI, Yu BM, Afshar A, Shenoy KV

Recent studies have demonstrated that monkeys and humans can use signals from the brain to guide computer cursors. Brain-computer interfaces (BCIs) may one day assist patients suffering from neurological injury or disease, but relatively low system performance remains a major obstacle. In fact, the speed and accuracy with which keys can be selected using BCIs is still far lower than for systems relying on eye movements. This is true whether BCIs use recordings from populations of individual neurons using invasive electrode techniques or electroencephalogram recordings using less- or non-invasive techniques. Here we present the design and demonstration, using electrode arrays implanted in monkey dorsal premotor cortex, of a manyfold higher performance BCI than previously reported. These results indicate that a fast and accurate key selection system, capable of operating with a range of keyboard sizes, is possible (up to 6.5 bits per second, or approximately 15 words per minute, with 96 electrodes). The highest information throughput is achieved with unprecedentedly brief neural recordings, even as recording quality degrades over time. These performance results and their implications for system design should substantially increase the clinical viability of BCIs in humans.
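The abstract's throughput figures can be related through the Wolpaw information-transfer-rate formula commonly used in BCI benchmarking; the paper's own analysis may differ in detail. A minimal sketch, with a hypothetical keyboard size and accuracy:

```python
import math

def bits_per_selection(n_keys: int, accuracy: float) -> float:
    """Wolpaw ITR per selection (bits), a standard BCI benchmark:
    log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))."""
    if accuracy >= 1.0:
        return math.log2(n_keys)
    b = math.log2(n_keys)
    b += accuracy * math.log2(accuracy)
    b += (1 - accuracy) * math.log2((1 - accuracy) / (n_keys - 1))
    return b

# Hypothetical example: a 16-key grid at 90% accuracy, 2 selections/s.
b = bits_per_selection(16, 0.90)
print(f"{b:.2f} bits/selection, {2 * b:.2f} bits/s")
```

Multiplying bits per selection by selection rate gives bits per second, which is how figures like the reported 6.5 bits/s are expressed.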

Novel BCI device will allow people to search through images faster

Via KurzweilAI.net 

Researchers at Columbia University are combining the processing power of the human brain with computer vision to develop a novel device that will allow people to search through images ten times faster than they can on their own. 

The "cortically coupled computer vision system," known as C3 Vision, is the brainchild of professor Paul Sajda, director of the Laboratory for Intelligent Imaging and Neural Computing at Columbia University. He received a one-year, $758,000 grant from Darpa for the project in late 2005.

The brain emits a signal as soon as it sees something interesting, and that "aha" signal can be detected by an electroencephalogram, or EEG cap. While users sift through streaming images or video footage, the technology tags the images that elicit a signal, and ranks them in order of the strength of the neural signatures. Afterwards, the user can examine only the information that their brains identified as important, instead of wading through thousands of images.
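The triage loop described above (tag images whose EEG response crosses a threshold, then rank them by response strength) reduces to a small filter-and-sort step once single-trial scores exist. The sketch below uses stand-in scores; a real system would derive them from classified EEG epochs:

```python
from dataclasses import dataclass

@dataclass
class Tagged:
    image_id: str
    eeg_score: float  # strength of the evoked response (stand-in value)

def rank_by_neural_signature(tags, threshold=0.5):
    """Keep images whose response crossed the threshold, strongest first."""
    hits = [t for t in tags if t.eeg_score >= threshold]
    return sorted(hits, key=lambda t: t.eeg_score, reverse=True)

# Hypothetical scored stream: only two images elicited a strong response.
stream = [Tagged("img_001", 0.1), Tagged("img_002", 0.9),
          Tagged("img_003", 0.6), Tagged("img_004", 0.3)]
for t in rank_by_neural_signature(stream):
    print(t.image_id, t.eeg_score)
```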

Read the full story on Wired 

Jul 06, 2006

Third International Meeting on Brain-Computer Interface Technology

This special issue of the IEEE Transactions on Neural Systems and Rehabilitation Engineering provides a representative and comprehensive bird's-eye view of the most recent developments in brain–computer interface (BCI) technology from laboratories around the world. The 30 research communications and papers are the direct outcome of the Third International Meeting on Brain–Computer Interface Technology held at the Rensselaerville Institute, Rensselaerville, NY, in June 2005. Fifty-three research groups from North and South America, Europe, and Asia, representing the majority of all the existing BCI laboratories around the world, participated in this highly focused meeting sponsored by the National Institutes of Health and organized by the BCI Laboratory of the Wadsworth Center of the New York State Department of Health. As demonstrated by the papers in this special issue, the rapid advances in BCI research and development make this technology capable of providing communication and control to people severely disabled by amyotrophic lateral sclerosis (ALS), brainstem stroke, cerebral palsy, and other neuromuscular disorders. Future work is expected to improve the performance and utility of BCIs, and to focus increasingly on making them a viable, practical, and affordable communication alternative for many thousands of severely disabled people worldwide.

May 30, 2006

Decoding the visual and subjective contents of the human brain

Decoding the visual and subjective contents of the human brain 

Yukiyasu Kamitani & Frank Tong

Nature Neuroscience  8, 679 - 685 (2005) 

The potential for human neuroimaging to read out the detailed contents of a person's mental state has yet to be fully explored. We investigated whether the perception of edge orientation, a fundamental visual feature, can be decoded from human brain activity measured with functional magnetic resonance imaging (fMRI). Using statistical algorithms to classify brain states, we found that ensemble fMRI signals in early visual areas could reliably predict on individual trials which of eight stimulus orientations the subject was seeing. Moreover, when subjects had to attend to one of two overlapping orthogonal gratings, feature-based attention strongly biased ensemble activity toward the attended orientation. These results demonstrate that fMRI activity patterns in early visual areas, including primary visual cortex (V1), contain detailed orientation information that can reliably predict subjective perception. Our approach provides a framework for the readout of fine-tuned representations in the human brain and their subjective contents.
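A toy version of this kind of ensemble decoding can be sketched with a nearest-centroid rule on simulated voxel patterns; the authors used statistical classifiers on real fMRI signals, and the voxel counts, noise level, and trial numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# 8 stimulus orientations, each evoking a characteristic (noisy)
# activity pattern across 100 simulated voxels.
n_orient, n_vox, n_trials = 8, 100, 40
prototypes = rng.normal(size=(n_orient, n_vox))

def simulate_trials(label):
    return prototypes[label] + 0.8 * rng.normal(size=(n_trials, n_vox))

# "Train": estimate one centroid pattern per orientation.
train = {k: simulate_trials(k) for k in range(n_orient)}
centroids = np.stack([train[k].mean(axis=0) for k in range(n_orient)])

# "Test": assign held-out trials to the nearest centroid.
correct = 0
for k in range(n_orient):
    test = simulate_trials(k)
    pred = np.argmin(((test[:, None, :] - centroids) ** 2).sum(-1), axis=1)
    correct += (pred == k).sum()
acc = correct / (n_orient * n_trials)
print(f"decoding accuracy: {acc:.2f} (chance = {1 / n_orient:.3f})")
```

The point of the toy is the same as the paper's: information that is weak in any single voxel can still be read out reliably from the pattern across many voxels.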

Jan 16, 2006

Walking from thought

Walking from thought

Brain Res. 2006 Jan 5;

Authors: Pfurtscheller G, Leeb R, Keinrath C, Friedman D, Neuper C, Guger C, Slater M

Online analysis and classification of single electroencephalogram (EEG) trials during motor imagery were used for navigation in the virtual environment (VE). The EEG was recorded bipolarly with electrode placement over the hand and foot representation areas. The aim of the study was to demonstrate for the first time that it is possible to move through a virtual street without muscular activity when the participant only imagines foot movements. This is achieved by exploiting a brain-computer interface (BCI) which transforms thought-modulated EEG signals into an output signal that controls events within the VE. The experiments were carried out in an immersive projection environment, commonly referred to as a "Cave" (Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Surround-screen projection-based virtual reality: the design and implementation of the CAVE. Proceedings of the 20th annual conference on Computer graphics and interactive techniques, ACM Press, 1993, pp. 135-142), where participants were able to move through a virtual street by foot imagery only. Prior to the final experiments in the Cave, the participants underwent extensive BCI training.
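The control principle here, detecting event-related desynchronization (ERD) of the sensorimotor rhythm during foot imagery and mapping it to a walk/stop command, can be illustrated with synthetic signals. The sampling rate, frequency band, and threshold below are typical values, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250  # EEG sampling rate in Hz (a typical value, assumed here)

def bandpower(x, fs, lo, hi):
    """Power in the [lo, hi] Hz band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return psd[(freqs >= lo) & (freqs <= hi)].sum()

# Toy 1-second trials: "rest" carries a strong 10 Hz sensorimotor rhythm;
# "foot imagery" shows ERD, i.e. the rhythm is attenuated.
t = np.arange(fs) / fs
rest = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=fs)
imagery = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=fs)

baseline = bandpower(rest, fs, 8, 12)

def command(trial, baseline, erd_ratio=0.5):
    """Walk forward when band power drops well below the rest baseline."""
    return "walk" if bandpower(trial, fs, 8, 12) < erd_ratio * baseline else "stop"

rest2 = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=fs)
print(command(imagery, baseline), command(rest2, baseline))
```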

Jan 03, 2006

Brainport: an alternative input to the brain

J Integr Neurosci. 2005 Dec;4(4):537-50

Authors: Danilov Y, Tyler M

Brain Computer Interface (BCI) technology is one of the most rapidly developing areas of modern science; it has created numerous significant crossroads between Neuroscience and Computer Science. The goal of BCI technology is to provide a direct link between the human brain and a computerized environment. Recent BCI approaches and applications have been designed to provide information flow from the brain to the computerized periphery. The opposite or alternative direction of the flow of information (computer to brain interface, or CBI) remains almost undeveloped. The BrainPort is a CBI that offers a complementary technology designed to support a direct link from a computerized environment to the human brain - and to do so non-invasively. Currently, BrainPort research is pursuing two primary goals. One is the delivery of missing sensory information critical for normal human behavior through an additional artificial sensory channel around the damaged or malfunctioning natural sensory system. The other is to decrease the risk of sensory overload in human-machine interactions by providing a parallel and supplemental channel for information flow to the brain. In contrast, conventional CBI strategies (e.g., Virtual Reality) are usually designed to provide additional or substitute information through pre-existing sensory channels, and unintentionally aggravate the brain overload problem.

Dec 01, 2005

A wavelet-based time-frequency analysis approach for classification of motor imagery for brain-computer interface applications

J Neural Eng. 2005 Dec;2(4):65-72

Authors: Qin L, He B

Electroencephalogram (EEG) recordings during motor imagery tasks are often used as input signals for brain-computer interfaces (BCIs). The translation of these EEG signals to control signals of a device is based on a good classification of various kinds of imagination. We have developed a wavelet-based time-frequency analysis approach for classifying motor imagery tasks. Time-frequency distributions (TFDs) were constructed based on wavelet decomposition and event-related (de)synchronization patterns were extracted from symmetric electrode pairs. The weighted energy difference of the electrode pairs was then compared to classify the imaginary movement. The present method has been tested in nine human subjects and reached an averaged classification rate of 78%. The simplicity of the present technique suggests that it may provide an alternative method for EEG-based BCI applications.
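The weighted band-energy-difference idea can be sketched in a few lines. This is a simplified stand-in: a hand-rolled Haar decomposition replaces the paper's wavelet time-frequency distributions, the electrode pair and signals are simulated, and the weights are uniform.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform (even-length input)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def band_energies(x, levels=4):
    """Energy of each detail band from a multi-level Haar decomposition."""
    energies = []
    for _ in range(levels):
        x, detail = haar_dwt(x)
        energies.append(float((detail ** 2).sum()))
    return energies

def classify(c3, c4, weights=None):
    """Weighted energy difference between a symmetric electrode pair.

    Left-hand imagery attenuates the contralateral (C4) rhythm, so the
    C3-minus-C4 energy difference is positive for "left" trials.
    """
    e3, e4 = band_energies(c3), band_energies(c4)
    w = weights or [1.0] * len(e3)
    diff = sum(wi * (a - b) for wi, a, b in zip(w, e3, e4))
    return "left" if diff > 0 else "right"

# Simulated trial: a 10 Hz rhythm that is strong at C3, attenuated at C4.
rng = np.random.default_rng(3)
t = np.arange(256) / 128.0
mu = np.sin(2 * np.pi * 10 * t)
c3_left = 2.0 * mu + 0.2 * rng.normal(size=256)
c4_left = 0.5 * mu + 0.2 * rng.normal(size=256)
print(classify(c3_left, c4_left))
```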

Characterization of four-class motor imagery EEG data for the BCI-competition 2005

J Neural Eng. 2005 Dec;2(4):L14-22

Authors: Schlögl A, Lee F, Bischof H, Pfurtscheller G

To determine and compare the performance of different classifiers applied to four-class EEG data is the goal of this communication. The EEG data were recorded with 60 electrodes from five subjects performing four different motor-imagery tasks. The EEG signal was modeled by an adaptive autoregressive (AAR) process whose parameters were extracted by Kalman filtering. From these AAR parameters, four classifiers were obtained: minimum distance analysis (MDA) for single-channel analysis, and linear discriminant analysis (LDA), k-nearest-neighbor (kNN) and support vector machine (SVM) classifiers for multi-channel analysis. The performance of all four classifiers was quantified and evaluated by Cohen's kappa coefficient, an advantageous measure introduced here to BCI research for the first time. The single-channel results gave rise to topographic maps that revealed the channels with the highest level of separability between classes for each subject. Our results of the multi-channel analysis indicate SVM as the most successful classifier, whereas kNN performed worst.
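Cohen's kappa corrects raw accuracy for chance agreement, which matters for multi-class data with unbalanced confusions. A self-contained computation (not the authors' code):

```python
import numpy as np

def cohens_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    accuracy and p_e the accuracy expected from the marginal frequencies."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    n = cm.sum()
    p_o = np.trace(cm) / n
    p_e = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    return (p_o - p_e) / (1 - p_e)

# Four-class example: perfect prediction gives kappa = 1, while accuracy
# matching the chance level gives kappa near 0.
y = [0, 1, 2, 3] * 25
print(cohens_kappa(y, y, 4))  # 1.0
```

Unlike raw accuracy, a kappa of 0 means "no better than chance" regardless of how many classes there are, which is why it is a convenient common yardstick across BCI tasks.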

Nov 03, 2005

New asynchronous brain computer interface

Via Smart Mobs

An asynchronous brain computer interface is under development at Oxford University, with the collaboration of Southampton and Essex universities. The system should allow a more effective way of controlling robotic arms and wheelchairs, as opposed to the less natural on/off mode of existing synchronous BCI technology. But the real novelty of this BCI apparatus is that it will use only one electrode.

The two-year project has received £180,000 in funding from the EPSRC. According to the project's leader, Prof. Stephen Roberts (Oxford University), the new BCI system could improve the quality of life of the severely disabled, but potential applications of this technology range from the gaming and entertainment industries to the automotive sector.

Read full article on the Engineer Online

More to explore

This page offers some introductory links to sources of information on the Web about BCIs

A list of BCI research labs

Sep 16, 2005

Non-invasive neural interface technology

Via Engadget (thanks to Giuseppe Riva)

NeuroSky Inc. claims to have developed a non-invasive neural sensor and signal processing technology that converts brainwaves and eye movements into electronic signals to control a range of electronic devices.

According to Neurosky, neural interface technology promises to simplify cell phone-based applications that today require error-prone human input, as well as revolutionize applications from gaming to medical diagnostics and therapy.

EETimes reports that five companies, including a Bluetooth headset provider, game console maker and trucking company, have signed up to market end-user products containing NeuroSky's chips.


From the company website:

NeuroSky, a fabless semiconductor/module company, has developed a non-invasive neural sensor and signal processing technology that converts brainwaves and eye movements into useful electronic signals to communicate with a wide range of electronic devices, consoles, and computers. While brainwaves have been used as a form of diagnostics and therapy in neurosciences for years, the related technology has never reached a large audience due to price/size constraints, inconvenient physical limitations, and/or invasive surgical procedures. NeuroSky draws from this research and adapts it to commercialize neural interface technologies for various attractive global markets.

More to explore

Neural interface

Brain-computer interface

Aug 03, 2005

Downloading video from the brain

Via Pasta and Vinegar

Using cats selected for their sharp vision, in 1999 Garrett Stanley and his team recorded signals from a total of 177 cells in the lateral geniculate nucleus, a part of the brain's thalamus (the thalamus integrates all of the brain's sensory input and forms the base of the seven-layered thalamocortical loop with the six-layered neocortex), as they played 16-second digitized (64 by 64 pixel) movies of indoor and outdoor scenes.

Using simple mathematical filters, Dr. Stanley and his colleagues decoded the signals to generate movies of what the cats actually saw. This finding has enormous implications for the fields of neurorehabilitation and neural repair. For example, it could make it possible to wire artificial limbs directly into the brain, or to develop artificial brain extensions.

More to explore

Garrett B. Stanley, Fei F. Li, and Yang Dan, "Reconstruction of Natural Scenes from Ensemble Responses in the Lateral Geniculate Nucleus," Journal of Neuroscience, 1999; 19: 8036-8042.

Jul 19, 2005

Vagus Nerve Stimulation System for Severe Depression

Via Medgadget

The Food and Drug Administration has recently approved a new treatment for severe depression based on a nerve stimulation system that delivers tiny electric shocks through the vagus nerve and into a region of the brain thought to play a role in mood.

Cyberonics Inc.'s stimulation system is an adjunctive long-term treatment of chronic or recurrent depression for adults experiencing a major depressive episode that has not responded adequately to two or more antidepressant treatments.

Despite controversy over whether it's really been proven to work, the potential treatment targets an estimated 4 million Americans with hard-to-treat depression.

Jun 29, 2005

Bionic Arm Technology is advancing

From Medgadget

Mr. Jesse Sullivan is the world's first "Bionic Man", according to the Rehabilitation Institute of Chicago (RIC). In May 2001, Mr. Sullivan lost both of his upper extremities as a result of an accident. The technology that allows this patient to control his artificial arms has been developed by RIC's Neural Engineering Center for Artificial Limbs (NECAL).

Dr. Todd Kuiken, MD, PhD, of RIC's Neural Engineering Center for Artificial Limbs (NECAL) pioneered the muscle reinnervation procedure, which takes an amputee's own nerves and connects them to a healthy muscle. In this case, four of Mr. Sullivan's nerves were dissected from the shoulder and transferred to the muscles of his chest. This allows the user to move his or her prosthetic arm as if it were a real limb, simply by thinking about what they want the arm to do. The "Bionic Arm," or myoelectric arm, is driven by electrical signals from the muscles of the chest, now activated by the user's own thought-generated nerve impulses. These impulses are sensed, via surface electrodes, from the pectoral muscle and carried through to the mechanical arm, causing the arm to move.

NECAL uses nerve-muscle grafts in amputees to gain added control signals for an artificial arm. Doctors take nerves that used to go to the arm and move those nerves onto chest muscles. The nerves grow into the chest muscles, so when the patient thinks "close hand," a portion of his chest muscle contracts and electrodes that detect this muscle activity tell the computerized arm when to close the hand. Thus, the patient thinks "close hand" and his artificial hand closes...
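The last step of that chain, surface electrodes picking up reinnervated-muscle activity and mapping it to an open/close command, is essentially threshold detection on an EMG envelope. A simplified sketch with synthetic signals; the window length and threshold are illustrative, and real myoelectric controllers are considerably more sophisticated.

```python
import numpy as np

def emg_envelope(emg, window=50):
    """Rectify and smooth a raw EMG trace (moving-average envelope)."""
    rect = np.abs(np.asarray(emg, dtype=float))
    kernel = np.ones(window) / window
    return np.convolve(rect, kernel, mode="same")

def hand_command(envelope, threshold):
    """'close' while the reinnervated chest muscle is active, else 'open'."""
    return np.where(envelope > threshold, "close", "open")

# Synthetic recording: rest, then a "close hand" contraction, then rest.
rng = np.random.default_rng(4)
quiet = 0.05 * rng.normal(size=500)   # muscle at rest
burst = 1.0 * rng.normal(size=500)    # contraction ("close hand" thought)
signal = np.concatenate([quiet, burst, quiet])

cmd = hand_command(emg_envelope(signal), threshold=0.3)
print(cmd[250], cmd[750], cmd[1250])
```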

Researchers at RIC have learned that although the limb is lost with an amputation, the control signals to that limb remain accessible in the residual peripheral nerves. Grafting the residual nerves of an upper-limb amputee to spare muscles produces additional control signals, allowing for simultaneous operation of multiple functions in an externally powered prosthesis with a more natural feel than is possible with conventional prostheses.

The "Bionic Arm" technology has been very successful so far, both in significantly improving the function of artificial limbs and in allowing the skin to be reinnervated with nerves from the arm. The first patient to undergo the new procedure, Jesse Sullivan, has experienced significant improvements in the functioning of his prosthetic arms. While previously moving his artificial arms was slow and cumbersome, today he is able to do many of the routine tasks he took for granted before his accident, including putting on socks, shaving, eating dinner, taking out the garbage, carrying groceries and vacuuming.

Apr 28, 2005

Brain-machine Interface Test Promising

From Betterhumans

An experimental brain-machine interface has allowed a quadriplegic person to control a computer in what could be an early step to new assistive technologies for the disabled.

Cyberkinetics of Foxborough, Massachusetts has reported preliminary results for a pilot study of its BrainGate Neural Interface System, which the company aims to develop into a safe, effective and unobtrusive universal operating system for allowing disabled people to control devices using their thoughts.

"While these results are preliminary, I am extremely encouraged by what has been achieved to date," says study investigator Jon Mukand of Sargent Rehabilitation Center. "We now have early evidence that a person unable to move their arms, hands and legs can quickly gain control of a system which uses thoughts to control a computer and perform meaningful tasks. With additional development this may represent a significant breakthrough for people with severe disabilities."

Mental link

BrainGate, in a pilot study under a US Food and Drug Administration Investigational Device Exemption, uses an implanted neural signal sensor and external processors to allow users to control machinery.

The implanted sensor is about the size of a baby aspirin and contains 100 electrode probes thinner than a human hair. Implanted in part of the brain responsible for movement, the primary motor cortex, the probes detect the electrical activity of brain cells and relay this through a small wire exiting the scalp to a pedestal on the skull. A cable runs from the pedestal to a cart with computers, signal processors and monitors, allowing operators to study how well users can control their neural output.

For the reported study, an unidentified quadriplegic person with a three-year-old spinal cord injury had the sensor implanted this June in an approximately three-hour operation at Rhode Island Hospital in Providence. The procedure reportedly went as planned and the recipient has reportedly experienced no side-effects or problems healing.

Interactive mind

The study examined the recipient's use of BrainGate over two months and 20 study sessions. It found that the recipient could immediately adjust neural output in response to commands. It also found that a computer interface developed using the patient's thoughts allowed the subject to perform tasks and operate basic computer functions, including controlling a cursor, playing Pong with 70% accuracy and performing multiple tasks simultaneously, such as controlling a TV while talking.

While the findings are preliminary and based on a single patient, Cyberkinetics aims to enroll a total of five quadriplegic people between the ages of 18 and 60 who meet such criteria as being able to verbally communicate.

Each participant is expected to participate in a study for 13 months. Afterwards, participants can undergo surgery to have the device removed or choose to participate in future studies, which the first patient has chosen to do.

"Our ultimate goal is to develop the BrainGate System so that it can be linked to many useful devices, including for example, medical devices such as muscle stimulators, to give the physically disabled a significant improvement in their ability to interact with the world around them," says John Donoghue, chief scientific officer of Cyberkinetics.

The research was reported in Phoenix, Arizona at the 2004 annual meeting of the American Academy of Physical Medicine and Rehabilitation.

Cyberkinetics plans to announce more results and observations from the pilot study in San Diego, California at the 2004 annual meeting of the Society for Neuroscience.