
Mar 11, 2013

Is virtual reality always an effective stressor for exposure treatments? Some insights from a controlled trial

Is virtual reality always an effective stressor for exposure treatments? Some insights from a controlled trial.

BMC Psychiatry, 13(1), 52, 2013

Federica Pallavicini, Pietro Cipresso, Simona Raspelli, Alessandra Grassi, Silvia Serino, Cinzia Vigna, Stefano Triberti, Marco Villamira, Andrea Gaggioli, Giuseppe Riva

Abstract. Several research studies investigating the effectiveness of different treatments have demonstrated that exposure-based therapies are more suitable and effective than others for the treatment of anxiety disorders. Traditionally, exposure may be achieved in two ways: in vivo, with direct contact with the stimulus, or by imagery, in the person’s imagination. However, despite their effectiveness, both types of exposure present some limitations that have supported the use of Virtual Reality (VR). But is VR always an effective stressor? Are the technological breakdowns that may appear during such an experience a possible risk for its effectiveness? (...)

Full paper available here (open access)

Mar 03, 2013

Permanently implanted neuromuscular electrodes allow natural control of a robotic prosthesis

Source: Chalmers University of Technology

Dr Rickard Brånemark tests the functionality of the world's first muscle and nerve control...
A surgical team led by Dr Rickard Brånemark at Sahlgrenska University Hospital has carried out the first operation of its kind: neuromuscular electrodes that enable a prosthetic arm and hand to be controlled by thought have been permanently implanted into the nerves and muscles of an amputee.

“The new technology is a major breakthrough that has many advantages over current technology, which provides very limited functionality to patients with missing limbs,” Brånemark says.

Presently, robotic prostheses rely on electrodes placed over the skin to pick up the muscles' electrical activity and drive a few actions of the prosthesis. The problem with this approach is that normally only two functions are regained out of the tens of different movements an able-bodied person is capable of. By using implanted electrodes, more signals can be retrieved, and therefore control of more movements is possible. Furthermore, it is also possible to provide the patient with natural perception, or “feeling”, through neural stimulation.
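To give a rough sense of why more recording channels translate into more controllable movements, here is a minimal, purely illustrative sketch; it is not the control software used by Brånemark's team, and the channel count, feature choice and movement labels are all assumptions made for the example. It classifies windowed multi-channel EMG into movement classes with a simple nearest-centroid rule.

```python
import numpy as np

# Toy illustration (assumptions throughout): classify movement intent from
# windowed, multi-channel EMG. More electrodes -> more feature dimensions ->
# more movement classes can be separated. This is NOT the control software
# described in the article.

def emg_features(window):
    """Mean absolute value and waveform length for each channel."""
    mav = np.mean(np.abs(window), axis=0)                     # (n_channels,)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)      # (n_channels,)
    return np.concatenate([mav, wl])

class NearestCentroidDecoder:
    """Minimal movement classifier: one feature centroid per movement."""
    def fit(self, windows, labels):
        feats = np.array([emg_features(w) for w in windows])
        labels = np.asarray(labels)
        self.classes_ = np.unique(labels)
        self.centroids_ = np.array(
            [feats[labels == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, window):
        dists = np.linalg.norm(self.centroids_ - emg_features(window), axis=1)
        return self.classes_[np.argmin(dists)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_channels, n_samples = 8, 200     # hypothetical implanted 8-channel array
    movements = ["hand_open", "hand_close", "wrist_flex", "wrist_extend"]

    # Simulated training data: each movement activates a different channel pair.
    windows, labels = [], []
    for i, m in enumerate(movements):
        gain = np.ones(n_channels)
        gain[2 * i:2 * i + 2] = 4.0
        for _ in range(20):
            windows.append(rng.normal(0.0, 1.0, (n_samples, n_channels)) * gain)
            labels.append(m)

    decoder = NearestCentroidDecoder().fit(windows, labels)

    # A new window with channels 2-3 strongly active should decode as "hand_close".
    test = rng.normal(0.0, 1.0, (n_samples, n_channels)) * np.array(
        [1, 1, 4, 4, 1, 1, 1, 1])
    print(decoder.predict(test))
```

With only two surface channels the feature vector is short and only a couple of classes separate reliably; an implanted array adds channels, and hence feature dimensions, which is the intuition behind "more signals, more movements".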

“We believe that implanted electrodes, together with a long-term stable human-machine interface provided by the osseointegrated implant, is a breakthrough that will pave the way for a new era in limb replacement,” says Rickard Brånemark.

Read full story

Kibo space robot underwent zero gravity testing

Via Gizmag

The Japanese communication robot destined to join the crew aboard the International Space Station (ISS) this summer recently underwent some zero gravity testing. The Kibo Robot Project, organized by Dentsu Inc. in response to a proposal made by the Japan Aerospace Exploration Agency, unveiled the final design of its diminutive humanoid robot and its Earthbound counterpart.

Watch the video:


Call for papers - International Workshop on Intelligent Digital Games for Empowerment and Inclusion

1st International Workshop on Intelligent Digital Games for Empowerment and Inclusion

Website: http://idgei.fdg2013.org/
14 May 2013, Chania, Crete, Greece
chaired by Björn Schuller, Lucas Paletta, Nicolas Sabouret

Paper submission deadline: 11 March 2013

Digital Games for Empowerment and Inclusion have the potential to change our society in a very positive way by preparing selected groups, in a playful and fun manner, for the social and special situations of their everyday lives. Example domains range from children with Autism Spectrum Condition to young adults preparing for their first job interviews, or migrants familiarizing themselves with a new environment. The current generation of such games increasingly demands computational intelligence algorithms to help analyze players’ behavior and monitor their motivation and interest in order to adapt game progress. Developing such games therefore usually requires expertise from the general gaming domain and, in particular, from the game’s target domain, besides the technological savoir-faire to provide intelligent analysis and reaction solutions. IDGEI 2013 aims at bridging across these communities and disciplines by inviting researchers and experts to discuss their latest perspectives and findings in the field of Intelligent Digital Games for Empowerment and Inclusion.

Suggested workshop topics include, but are by no means limited to:

- Machine Intelligence in Serious Games
- Mobile and Real-World Serious Gaming
- Emotion & Affect in Serious Games
- Player Behavior and Attention Modeling
- Player-Adaptation and Motivation
- Security & Privacy Preservation
- Novel Serious Games
- User Studies & Tests of Serious Games

Brain-to-brain communication between rats achieved

From Duke Medicine News and Communications

Researchers at Duke University Medical Center in the US report in the February 28, 2013 issue of Scientific Reports the successful wiring together of sensory areas in the brains of two rats. The result of the experiment is that one rat will respond to the experiences to which the other is exposed.

Video: Brain-to-brain interface with rats (Duke)

The results of these projects suggest the future potential for linking multiple brains to form what the research team is calling an "organic computer," which could allow sharing of motor and sensory information among groups of animals.

"Our previous studies with brain-machine interfaces had convinced us that the rat brain was much more plastic than we had previously thought," said Miguel Nicolelis, M.D., PhD, lead author of the publication and professor of neurobiology at Duke University School of Medicine. "In those experiments, the rat brain was able to adapt easily to accept input from devices outside the body and even learn how to process invisible infrared light generated by an artificial sensor. So, the question we asked was, ‘if the brain could assimilate signals from artificial sensors, could it also assimilate information input from sensors from a different body?’"

To test this hypothesis, the researchers first trained pairs of rats to solve a simple problem: to press the correct lever when an indicator light above the lever switched on, which rewarded the rats with a sip of water. They next connected the two animals' brains via arrays of microelectrodes inserted into the area of the cortex that processes motor information.

One of the two rodents was designated as the "encoder" animal. This animal received a visual cue that showed it which lever to press in exchange for a water reward. Once this “encoder” rat pressed the right lever, a sample of its brain activity that coded its behavioral decision was translated into a pattern of electrical stimulation that was delivered directly into the brain of the second rat, known as the "decoder" animal.

The decoder rat had the same types of levers in its chamber, but it did not receive any visual cue indicating which lever it should press to obtain a reward. Therefore, to press the correct lever and receive the reward it craved, the decoder rat would have to rely on the cue transmitted from the encoder via the brain-to-brain interface.

The researchers then conducted trials to determine how well the decoder animal could decipher the brain input from the encoder rat to choose the correct lever. The decoder rat ultimately achieved a maximum success rate of about 70 percent, only slightly below the possible maximum success rate of 78 percent that the researchers had theorized was achievable based on success rates of sending signals directly to the decoder rat’s brain.
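As a back-of-the-envelope illustration of this protocol, the toy simulation below treats each trial as a one-bit channel: the encoder's choice is read out into a stimulation pattern with some error, and the decoder follows that pattern with its own error. Only the 78 percent readout ceiling comes from the article; the decoder reliability is an assumption, and this is not the Duke team's decoding pipeline.

```python
import random

# Purely illustrative simulation of the encoder/decoder lever task.
# Only the 78% readout ceiling comes from the article; the decoder
# reliability is an assumption, and this is not the Duke team's code.

READOUT_ACCURACY = 0.78   # chance that the encoder's cortical activity is
                          # translated into the correct stimulation pattern
DECODER_ACCURACY = 0.90   # assumed chance that the decoder rat follows the
                          # stimulation pattern it receives

LEVERS = ("left", "right")

def flip(lever):
    return "right" if lever == "left" else "left"

def run_trial(rng):
    correct = rng.choice(LEVERS)
    # Encoder sees the light cue and presses the correct lever; a sample of
    # its motor-cortex activity is read out as a 1-bit stimulation pattern,
    # with some readout error.
    encoded = correct if rng.random() < READOUT_ACCURACY else flip(correct)
    # Decoder has no visual cue and must rely on the stimulation it receives.
    decoded = encoded if rng.random() < DECODER_ACCURACY else flip(encoded)
    return decoded == correct   # full reward for both rats only on success

if __name__ == "__main__":
    rng = random.Random(42)
    trials = 10_000
    hits = sum(run_trial(rng) for _ in range(trials))
    print(f"simulated decoder success rate: {hits / trials:.1%}")
```

With the assumed decoder reliability, the simulated success rate comes out in the low seventies, in the same range as the roughly 70 percent reported for the real animals.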

Importantly, the communication provided by this brain-to-brain interface was two-way. For instance, the encoder rat did not receive a full reward if the decoder rat made a wrong choice. The result of this peculiar contingency, said Nicolelis, led to the establishment of a "behavioral collaboration" between the pair of rats.

"We saw that when the decoder rat committed an error, the encoder basically changed both its brain function and behavior to make it easier for its partner to get it right," Nicolelis said. "The encoder improved the signal-to-noise ratio of its brain activity that represented the decision, so the signal became cleaner and easier to detect. And it made a quicker, cleaner decision to choose the correct lever to press. Invariably, when the encoder made those adaptations, the decoder got the right decision more often, so they both got a better reward."

In a second set of experiments, the researchers trained pairs of rats to distinguish between a narrow or wide opening using their whiskers. If the opening was narrow, they were taught to nose-poke a water port on the left side of the chamber to receive a reward; for a wide opening, they had to poke a port on the right side.

The researchers then divided the rats into encoders and decoders. The decoders were trained to associate stimulation pulses with the left reward poke as the correct choice, and an absence of pulses with the right reward poke as correct. During trials in which the encoder detected the opening width and transmitted the choice to the decoder, the decoder had a success rate of about 65 percent, significantly above chance.

To test the transmission limits of the brain-to-brain communication, the researchers placed an encoder rat in Brazil, at the Edmond and Lily Safra International Institute of Neuroscience of Natal (ELS-IINN), and transmitted its brain signals over the Internet to a decoder rat in Durham, N.C. They found that the two rats could still work together on the tactile discrimination task.

"So, even though the animals were on different continents, with the resulting noisy transmission and signal delays, they could still communicate," said Miguel Pais-Vieira, PhD, a postdoctoral fellow and first author of the study. "This tells us that it could be possible to create a workable, network of animal brains distributed in many different locations."

Nicolelis added, "These experiments demonstrated the ability to establish a sophisticated, direct communication linkage between rat brains, and that the decoder brain is working as a pattern-recognition device. So basically, we are creating an organic computer that solves a puzzle."

"But in this case, we are not inputting instructions, but rather only a signal that represents a decision made by the encoder, which is transmitted to the decoder’s brain which has to figure out how to solve the puzzle. So, we are creating a single central nervous system made up of two rat brains,” said Nicolelis.  He pointed out that, in theory, such a system is not limited to a pair of brains, but instead could include a network of brains, or “brain-net.” Researchers at Duke and at the ELS-IINN are now working on experiments to link multiple animals cooperatively to solve more complex behavioral tasks.

"We cannot predict what kinds of emergent properties would appear when animals begin interacting as part of a brain-net. In theory, you could imagine that a combination of brains could provide solutions that individual brains cannot achieve by themselves," continued Nicolelis. Such a connection might even mean that one animal would incorporate another's sense of "self," he said.

"In fact, our studies of the sensory cortex of the decoder rats in these experiments showed that the decoder's brain began to represent in its tactile cortex not only its own whiskers, but the encoder rat's whiskers, too. We detected cortical neurons that responded to both sets of whiskers, which means that the rat created a second representation of a second body on top of its own." Basic studies of such adaptations could lead to a new field that Nicolelis calls the "neurophysiology of social interaction."

Such complex experiments will be enabled by the laboratory's ability to record signals from almost 2,000 brain cells at once. The researchers hope to record the electrical activity produced simultaneously by 10,000-30,000 cortical neurons in the next five years.

Such massive brain recordings will enable more precise control of motor neuroprostheses—such as those being developed by the Walk Again Project—to restore motor control to paralyzed people, Nicolelis said.

More to explore:

Pais-Vieira, M., Lebedev, M., Kunicki, C., Wang, J. & Nicolelis, M. A. L. Sci. Rep. 3, 1319 (2013). PubMed

2nd Summer School on Human Computer Confluence

Date: 17th, 18th and 19th July 2013

Venue: IRCAM
Location: Paris, France
Website: http://www.ircam.fr/

The 2nd HCC summer school aims to share scientific knowledge and experience among participants, enhance and stimulate interdisciplinary dialogue as well as provide further opportunities for co-operation within the study domains of Human Computer Confluence.

The topics of the summer school will be framed around the following issues:
• re-experience yourself,
• experience being others,
• experience being together in more powerful ways,
• experience other environments,
• experience new senses,
• experience abstract data spaces.

The 2nd HCC summer school will take full advantage of the research interests and special facilities of IRCAM, an institute dedicated to the coupling of art with the sciences of sound and media. Special attention will be given to the following thematic categories:
• Musical interfaces
• Interactive sound design
• Sensorimotor learning and gesture-sound interactive systems
• Crowdsourcing and human computation approaches in artistic applications

The three-day summer school will include invited lectures by experts in the field, a round-table and practical workshops. During the workshops, participants will engage in hands-on HCC group projects that they will present at the end of the summer school.

Program committee

• Isabelle Viaud-Delmon, Acoustic and cognitive spaces team, CNRS - IRCAM, France.
• Andrea Gaggioli, Department of Psychology, UCSC, Milan, Italy.
• Stephen Dunne, Neuroscience Department, STARLAB, Barcelona, Spain.
• Alois Ferscha, Pervasive Computing Lab, Johannes Kepler Universität Linz, Austria.
• Fivos Maniatakos, Acoustic and Cognitive Spaces Group, IRCAM, France.

Organisation committee

• Isabelle Viaud-Delmon, IRCAM
• Hugues Vinet, IRCAM
• Marine Taffou, IRCAM
• Sylvie Benoit, IRCAM
• Fivos Maniatakos, IRCAM