

Apr 28, 2005

Brain-machine Interface Test Promising

From Betterhumans

An experimental brain-machine interface has allowed a quadriplegic person to control a computer in what could be an early step to new assistive technologies for the disabled.

Cyberkinetics of Foxborough, Massachusetts has reported preliminary results for a pilot study of its BrainGate Neural Interface System, which the company aims to develop into a safe, effective and unobtrusive universal operating system for allowing disabled people to control devices using their thoughts.

"While these results are preliminary, I am extremely encouraged by what has been achieved to date," says study investigator Jon Mukand of Sargent Rehabilitation Center. "We now have early evidence that a person unable to move their arms, hands and legs can quickly gain control of a system which uses thoughts to control a computer and perform meaningful tasks. With additional development this may represent a significant breakthrough for people with severe disabilities."

Mental link

BrainGate, in a pilot study under a US Food and Drug Administration Investigational Device Exemption, uses an implanted neural signal sensor and external processors to allow users to control machinery.

The implanted sensor is about the size of a baby aspirin and contains 100 electrode probes thinner than a human hair. Implanted in part of the brain responsible for movement, the primary motor cortex, the probes detect the electrical activity of brain cells and relay this through a small wire exiting the scalp to a pedestal on the skull. A cable runs from the pedestal to a cart with computers, signal processors and monitors, allowing operators to study how well users can control their neural output.
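
The article does not describe how the recorded signals are translated into cursor movement. As a rough, purely illustrative sketch (not Cyberkinetics' actual algorithm), binned firing rates from the 100 electrodes might be mapped to a 2-D cursor velocity with a simple linear decoder; all weights, baselines and names below are hypothetical placeholders.

```python
import numpy as np

# Purely illustrative: map binned firing rates from a 100-electrode array
# to a 2-D cursor velocity with a pre-fitted linear decoder.
N_ELECTRODES = 100

rng = np.random.default_rng(0)
# Placeholder decoder weights and baseline rates; in practice these would be
# fitted during a calibration session with the user.
weights = rng.normal(scale=0.01, size=(N_ELECTRODES, 2))
baseline = np.full(N_ELECTRODES, 10.0)   # assumed resting rate (spikes/s)

def decode_cursor_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Return a (vx, vy) cursor velocity for one bin of firing rates."""
    modulation = firing_rates - baseline   # deviation from baseline activity
    return modulation @ weights            # linear read-out

# Example: one 100-ms bin of spike counts converted to rates.
spike_counts = rng.poisson(lam=1.2, size=N_ELECTRODES)
rates = spike_counts / 0.1                 # spikes per second
vx, vy = decode_cursor_velocity(rates)
print(f"cursor velocity: ({vx:.2f}, {vy:.2f})")
```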

For the reported study, an unidentified quadriplegic person with a three-year-old spinal cord injury had the sensor implanted in June 2004 in an approximately three-hour operation at Rhode Island Hospital in Providence. The procedure reportedly went as planned, and the recipient has experienced no side-effects or problems healing.

Interactive mind

The study examined the recipient's use of BrainGate over two months and 20 study sessions. It found that the recipient could immediately adjust neural output in response to commands. It also found that an interface driven by the patient's thoughts allowed the subject to perform tasks and operate basic computer functions, including controlling a cursor, playing Pong with 70% accuracy and performing multiple tasks simultaneously, such as controlling a TV while talking.

While the findings are preliminary and based on a single patient, Cyberkinetics aims to enroll a total of five quadriplegic people between the ages of 18 and 60 who meet such criteria as being able to verbally communicate.

Each participant is expected to take part in the study for 13 months. Afterwards, participants can undergo surgery to have the device removed or choose to participate in future studies, as the first patient has chosen to do.

"Our ultimate goal is to develop the BrainGate System so that it can be linked to many useful devices, including for example, medical devices such as muscle stimulators, to give the physically disabled a significant improvement in their ability to interact with the world around them," says John Donoghue, chief scientific officer of Cyberkinetics.

The research was reported in Phoenix, Arizona at the 2004 annual meeting of the American Academy of Physical Medicine and Rehabilitation.

Cyberkinetics plans to announce more results and observations from the pilot study in San Diego, California at the 2004 annual meeting of the Society for Neuroscience.

Apr 18, 2005

Samsung mobile phone uses alpha waves to improve brain memory

From I4U

Samsung has introduced the SCH-S350, a mobile phone that apparently uses alpha waves to improve memory and concentration. This could be straight out of a sci-fi novel, but it is not: it comes from a Samsung press release.

Samsung claims that alpha waves, said to improve memory and concentration, are generated when the phone plays functional music.

The Samsung SCH-S350 is a small slider phone featuring a 0.3MP digital camera, MP3 Player and GPS. The new Samsung mobile phone measures 85mm in length and weighs only 85g.


Apr 13, 2005

Rescue Robots

Disaster rescue is one of the most serious social issues, involving very large numbers of heterogeneous agents in a hostile environment.

Rescue robotics is a newly emerging field dealing with systems that support first-response units in disaster missions. Mobile robots in particular can be highly valuable tools in urban rescue missions after catastrophes such as earthquakes, bomb or gas explosions, or daily incidents like fires and road accidents involving hazardous materials. The robots can be used to inspect collapsed structures, assess the situation, and search for and locate victims.

There are many engineering and scientific challenges in this domain. Rescue robots not only have to be designed for the harsh environmental conditions of disasters, but they also need advanced capabilities like intelligent behaviors to free them from constant supervision by operators.

The main goal of the RoboCupRescue project is to promote research and development in this socially significant domain at various levels: multi-agent teamwork coordination, physical robotic agents for search and rescue, information infrastructures, personal digital assistants, a standard simulator, decision support systems, and evaluation benchmarks for rescue strategies and robotic systems, all to be integrated into a comprehensive system in the future.

More to explore

The web site of RescueRobots Freiburg, one of the leading labs in the rescue robotics field

Apr 12, 2005

8th Annual International Workshop on Presence

Call for Papers

PRESENCE 2005 8th Annual International Workshop on Presence

London, England September 21-23, 2005

Submission deadline: June 6, 2005

Full details on the web at http://ispr.info

OVERVIEW

Academics and practitioners with an interest in the concept of (tele)presence are invited to submit their work for presentation at PRESENCE 2005 at University College London in London, England, September 21-23, 2005.

The eighth in a series of highly successful international workshops, PRESENCE 2005 will provide an open discussion forum to share ideas regarding concepts and theories, measurement techniques, technology, and applications related to presence, the psychological state or subjective perception in which a person fails to accurately and completely acknowledge the role of technology in an experience, including the sense of 'being there' experienced by users of advanced media such as virtual reality.

The concept of presence in virtual environments has been around for at least 15 years, and the earlier idea of telepresence at least since Minsky's seminal paper in 1980. Recently there has been a burst of funded research activity in this area for the first time with the European FET Presence Research initiative. What do we really know about presence and its determinants? How can presence be successfully delivered with today's technology?
This conference invites papers that are based on empirical results from studies of presence and related issues and/or which contribute to the technology for the delivery of presence. Papers that make substantial advances in theoretical understanding of presence are also welcome. The interest is not solely in virtual environments but in mixed reality environments. Submissions will be reviewed more rigorously than in previous conferences. High quality papers are therefore sought which make substantial contributions to the field.

Approximately 20 papers will be selected for two successive special issues for the journal Presence: Teleoperators and Virtual Environments.

PRESENCE 2005 takes place in London and is hosted by University College London. The conference is organized by ISPR, the International Society for Presence Research and is supported by the European Commission's FET Presence Research Initiative through the Presencia and IST OMNIPRES projects and by University College London.

TOPICS

Submissions of papers, demonstrations, and panels that represent completed or ongoing work are encouraged in areas including but not limited to:

* Explications of the presence concept

* Presence evaluation/measurement methodologies

* Causes and consequences (effects) of presence

* Presence in shared virtual environments and online communities

* Social/affective interfaces, virtual agents, parasocial interactions

* Presence-associated technologies:
- Immersive, interactive, multimodal displays
- Advanced broadcast and cinematic displays (stereoscopic TV, HDTV, IMAX)
- Virtual environments/simulators
- 3-D sound
- Haptic/tactile displays

* Presence applications:
- Education and training
- Medicine and therapy
- Entertainment
- Communication and collaboration
- Teleoperation
- Presence and design
- Presence in art

* Presence and philosophical issues (e.g., the nature of 'reality')

* The ethics of presence

* Presence in the future: Media experiences in the 21st century and beyond

CONFERENCE FORMAT

Like the earlier workshops, PRESENCE 2005 will have an interactive format in which all participants (attendees, presenters, invited speakers) attend each of the sessions as well as several social events, allowing participants to exchange ideas and build knowledge together as the conference progresses.

The conference will feature keynote presentations by three prominent presence scholars:

Paul Verschure (Institute of Neuroinformatics, University of Zurich/ETH Zurich, Switzerland)

Woody Barfield (Human Interface Technology Lab, University of Washington, Seattle, Washington, USA)

Carolina Cruz-Neira (Virtual Reality Applications Center, Iowa State University, Ames, Iowa, USA)

VENUE

The Workshop will be hosted by University College London (UCL) in the Bloomsbury area in the heart of the great city of London.

For more information about London visit the official website for London (http://www.visitlondon.com/); for more information about UCL, visit the UCL web site (http://www.ucl.ac.uk).

SUBMISSIONS

We invite researchers and practitioners to submit work in the following categories:

Full papers: Comprehensive descriptions of original research or design work within the scope of the workshop. Full papers are limited to 12 pages in the PRESENCE 2005 template format (see submission page at http://ispr.info) and will be considered for oral presentation (unless the submitter requests consideration only for poster presentation).

Short papers: Brief presentation of tentative or preliminary results of research or design work within the scope of the workshop. Short papers are limited to 4 pages in the PRESENCE 2005 template format and will be considered for both oral presentation and poster presentation.

Posters: Visual display presentation. Submissions are limited to 4 A4 pages which contain miniature versions of the larger pages that would be displayed at the conference.

Demonstrations/exhibitions: Step-by-step audiovisual demonstrations and/or hands-on experiences of (commercial or academic) work within the scope of the workshop. Proposals for demonstrations/exhibitions are limited to 2 pages in the PRESENCE 2005 template format.

Panels: Sets of presentations on a single theme or topic within the scope of the workshop. Submitters are encouraged to be creative regarding both the topic and format of panel proposals, which are limited to 4 pages in the PRESENCE 2005 template format.

All submitted papers will be blind peer-reviewed by at least two selected reviewers. Work accepted for presentation will be included in the official conference proceedings and may be posted on the ISPR, presence-connect, and presence-research.org web sites prior to the conference. Authors of as many as 20 of the presented papers will be invited to revise their paper for publication in one of two special conference issues (August and October 2006) of the MIT Press journal Presence: Teleoperators and Virtual Environments.

Please submit your work online at the submission page of the conference web site at http://ispr.info by the conference deadline of June 6, 2005.

Apr 08, 2005

!!! Groundbreaking virtual reality SONY patent - A step closer to the Matrix !!!

07 April 2005
Exclusive from New Scientist Print Edition

Jenny Hogan and Barry Fox

Imagine movies and computer games in which you get to smell, taste and perhaps even feel things. That's the tantalising prospect raised by a patent on a device for transmitting sensory data directly into the human brain - granted to none other than the entertainment giant Sony.

The technique suggested in the patent is entirely non-invasive. It describes a device that fires pulses of ultrasound at the head to modify firing patterns in targeted parts of the brain, creating "sensory experiences" ranging from moving images to tastes and sounds. This could give blind or deaf people the chance to see or hear, the patent claims.

While brain implants are becoming increasingly sophisticated, the only non-invasive ways of manipulating the brain remain crude. A technique known as transcranial magnetic stimulation can activate nerves by using rapidly changing magnetic fields to induce currents in brain tissue. However, magnetic fields cannot be finely focused on small groups of brain cells, whereas ultrasound could be.

If the method described by Sony really does work, it could have all sorts of uses in research and medicine, even if it is not capable of evoking sensory experiences detailed enough for the entertainment purposes envisaged in the patent.

Details are sparse, and Sony declined New Scientist's request for an interview with the inventor, who is based in its offices in San Diego, California. However, independent experts are not dismissing the idea out of hand. "I looked at it and found it plausible," says Niels Birbaumer, a pioneering neuroscientist at the University of Tübingen in Germany who has created systems that let people control devices via brain waves.

The application contains references to two scientific papers presenting research that could underpin the device. One, in an echo of Galvani's classic 18th-century experiments on frogs' legs that proved electricity can trigger nerve impulses, showed that certain kinds of ultrasound pulses can affect the excitability of nerves from a frog's leg. The author, Richard Mihran of the University of Colorado, Boulder, had no knowledge of the patent until New Scientist contacted him, but says he would be concerned about the proposed method's long-term safety.

Sony first submitted a patent application for the ultrasound method in 2000, which was granted in March 2003. Since then Sony has filed a series of continuations, most recently in December 2004 (US 2004/267118).

Elizabeth Boukis, spokeswoman for Sony Electronics, says the work is speculative. "There were not any experiments done," she says. "This particular patent was a prophetic invention. It was based on an inspiration that this may someday be the direction that technology will take us."

More to explore

Virtual Reality
Transcranial Magnetic Stimulation
Matrix: the movie

Press releases

Australian IT - Sony patents 'real' Matrix
News24 - Sony eyes 'real-life Matrix'
Forbes.com - Sony takes first step to patent 'real-life Matrix'
Dawn International - Real life Matrix in the making
Timesonline - Sony takes 3-D cinema directly to the brain
Australian Financial Review - First step to patent 'real-life Matrix'

Apr 06, 2005

Remote healthcare monitoring not so distant

Efficient, effective and reliable remote healthcare monitoring is a holy grail in medicine, but solutions have so far proved elusive. It took a step closer to reality, however, with the recent completion of the E-Care project.

The IST programme-funded project developed a comprehensive monitoring system to capture, transmit and distribute vital health data to doctors, carers and family. Pilot tests of the E-Care system indicate that doctors, nurses, patients and their families found E-Care reliable, simple to use and an effective way to improve the quality of care while reducing costs.

E-Care’s system dynamically produces data depending on who accesses a patient's record. A doctor will see all the health information, a system or medical administrator will see the data relevant to them, while patients and their friends or family will see another set of data, all coming from the same file.
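
The article does not describe how these role-based views are produced; a minimal sketch of the idea, with hypothetical roles and field names, could look like this:

```python
# Hypothetical sketch: one patient record, different views per role.
PATIENT_RECORD = {
    "name": "A. Patient",
    "vitals": {"pulse": 72, "blood_pressure": "125/80", "glucose": 5.8},
    "doctor_notes": "Adjust insulin dose if fasting glucose exceeds 7.0 mmol/L.",
    "device_status": {"sensor_battery": "78%", "last_sync": "2005-04-06 09:12"},
}

# Which top-level fields each role is allowed to see (assumed roles).
ROLE_FIELDS = {
    "doctor": ["name", "vitals", "doctor_notes", "device_status"],
    "system_admin": ["device_status"],
    "patient_or_family": ["name", "vitals"],
}

def view_for(role: str) -> dict:
    """Return the subset of the single patient file visible to this role."""
    allowed = ROLE_FIELDS.get(role, [])
    return {field: PATIENT_RECORD[field] for field in allowed}

print(view_for("patient_or_family"))   # vitals only, no clinical notes
```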

The system will monitor patients with chronic, or long-term, illnesses such as diabetes or cardiovascular disease, and patients discharged after an operation or serious medical crises, such as stroke victims.

It can acquire vital information about a patient who lives far away from medical support, and it can alert medical staff if there is a dangerous change in a patient's status. With E-Care’s system, doctors spend less time travelling to see patients and more time treating them. It also means real-time monitoring without high staff or capital costs.

"Citizens with long-term illnesses as well as those who are in post-surgery state, or predisposed to illness, need monitoring of their health until their condition becomes stable," says E-Care project director Mariella Devoti.

"They, as well as their family and friends, also need an efficient way to collaborate with their doctor and get informed about their state. Until now, monitoring of the health condition of such people could only be accomplished by prearranged visits from a doctor, or by visits to the local hospital for a check-up. However, this is an inefficient solution, as well as costly, as these visits would scarcely be on a daily basis."

"[By] increasing the demand for long-term care, often at home, there are fewer facilities and medical staff available per patient," says Devoti. "This situation represents an important challenge for our society and it is urgent to provide consistent solutions to avoid a deep deterioration of the quality of life of millions of people."

Building a state-of-the-art remote monitoring system
Ten partners from Italy, Greece, the UK, Germany and Cyprus joined forces to develop a remote monitoring system that could take vital data from patients, automatically add the data to the patient's chart and render the information accessible via computer for analysis at a hospital or clinic.

"E-Care makes best use of state-of-the-art know-how from a wide spectrum of disciplines, ranging from medical devices and software to workflow management system, [which brings] experience from business modelling," Devoti says.

The system includes nine components deployed across two primary elements: patient monitoring and the central system.

On the patient side there is a wireless intelligent sensor network (WISE), bio-medical sensors and a radio terminal. WISE consists of a series of monitors that track signs such as activity, temperature, pulse, blood pressure and glucose, or other personal data such as weight, pain measurement and drug conformance. Data collected by the sensors are passed to the radio terminal, which transmits them to the central system.

The system also works in the other direction: SMS messages are sent to the drug conformance device to remind patients when they need to take medication, and patients send a confirmation once they have taken the drugs.
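
Neither the sensor message format nor the reminder protocol is specified in the article; a minimal sketch of the two-way flow, with hypothetical field names, might look like this:

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class SensorReading:
    """One measurement from a WISE sensor (field names are hypothetical)."""
    patient_id: str
    kind: str          # e.g. "pulse", "glucose", "blood_pressure"
    value: float
    unit: str
    timestamp: str

def to_wire(reading: SensorReading) -> str:
    """Serialise a reading for the radio terminal to forward to the central system."""
    return json.dumps(asdict(reading))

def medication_reminder(patient_id: str, drug: str) -> dict:
    """Outbound SMS-style reminder; the patient replies to confirm."""
    return {"to": patient_id, "text": f"Time to take {drug}. Reply OK to confirm."}

def handle_confirmation(reply: str) -> bool:
    """Record whether the patient confirmed taking the medication."""
    return reply.strip().upper() == "OK"

reading = SensorReading("pt-042", "glucose", 6.1, "mmol/L",
                        datetime(2005, 4, 6, 8, 30).isoformat())
print(to_wire(reading))
print(medication_reminder("pt-042", "metformin"))
print(handle_confirmation("ok"))
```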

The central system includes a medical data manager (MDM), E-Care repository, collaboration module, workflow system, security system and user Web applications.

The MDM automatically checks patient data against the patient's record and any doctor's notes. If there is a disturbing change in the patient's vital signs, for example high glucose levels in a diabetic, an alarm is sent directly to the patient's physician. This provides peace of mind for patient and family and ensures a doctor or medics can respond rapidly to any problems that arise. Similarly, the MDM can alert paramedical staff or a doctor if patient data fails to arrive when expected.
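
The article does not give the MDM's actual rules; a minimal sketch of this kind of check, with hypothetical thresholds and field names, could look like this:

```python
from datetime import datetime, timedelta

# Hypothetical per-patient thresholds a doctor might attach to the record.
THRESHOLDS = {"glucose": (4.0, 10.0), "pulse": (50, 110)}   # (low, high)
MAX_SILENCE = timedelta(hours=6)    # alert if no data arrives for this long

def check_vitals(vitals: dict) -> list:
    """Compare incoming vitals against the patient's thresholds."""
    alarms = []
    for name, value in vitals.items():
        low, high = THRESHOLDS.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alarms.append(f"ALERT: {name}={value} outside range {low}-{high}")
    return alarms

def check_transmission(last_seen: datetime, now: datetime) -> list:
    """Flag a patient whose data has not arrived when expected."""
    if now - last_seen > MAX_SILENCE:
        return [f"ALERT: no data received since {last_seen.isoformat()}"]
    return []

now = datetime(2005, 4, 6, 18, 0)
print(check_vitals({"glucose": 12.3, "pulse": 80}))
print(check_transmission(datetime(2005, 4, 6, 9, 0), now))
```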

The E-Care repository stores all patient data. The collaboration module allows users to communicate using real-time synchronous messaging, audio conferencing or videoconferencing. Patients and family can confer with their doctor, or GPs and nurses can confer with a specialist.

The workflow system controls overall system processes, while user Web applications dynamically format the patient data depending on who is accessing the information and what access rights they have. Layouts for doctors, patients, system administrators, medical administrators and patient relatives, each with different information, are possible.

Researchers sought to avoid reinventing the wheel by using standard and widely adopted technologies where possible. Transmission runs over standard telecoms protocols such as GSM, 3G, Bluetooth, radio and landline broadband, which means the system will work with any modern hospital or clinic.

Positive results from pilots
E-Care was deployed in three pilot programmes between February 2003 and February 2004, and the overall result was very positive.

"The final validation results for E-Care showed a general satisfaction among doctors and medical staff, system and medical administrators, patients and their family," says coordinator Devoti.

Users praised its practicability, reliability, effectiveness and patient acceptance. Medical staff at Aldia in Italy, one of the project partners, also suggested further possible uses of the system, for example monitoring patients with cardiovascular disease or chronic obliterating arteriopathy.

Diabetes specialists praised the glucose monitoring system, which allows the doctor to adjust a patient's treatment based on vital-sign analysis. Similarly, the blood pressure monitor was of particular interest for patients with stroke or cardiovascular disease.

With project validation complete, the partners will fine-tune the system and search for commercial opportunities. Remote monitoring is not so distant.

Contact:
Mariella Devoti
coordinator@e-care-project.org

Source: Based on information from E-Care

Exploring the frontier of telepresence

By Geoff McMaster, ExpressNews Staff - The closest many of us have come to imagining virtual reality is the holodeck, a fantasy playground featured on the television series Star Trek.

Such flights of fancy are no longer the stuff of science fiction, however. Computer scientists at the U of A have already created technology allowing people to sit across from three-dimensional recreations of each other, even though in reality they may be thousands of miles apart.

It's only the beginning of a revolution in virtual reality technology expected to take us by storm in the next decade or so, says Dr. Pierre Boulanger, a U of A computer scientist who just received an iCORE/TRLabs Industrial Research Chair worth a total of $1.7 million to develop his groundbreaking work in collaborative virtual environments.

Imagine a world, for example, where professors of surgery transmit hand and scalpel movements, as well as what they see while operating, thousands of miles across a computer network, where they are recreated in an operating room.

"The student will actually look at that and actually feel what the doctor is doing," said Boulanger. "On the other hand, the doctor can feel what the students are doing and give them a nudge in the right direction--It's like being in virtual residence with doctors."

Families separated by travel will spend meals together through what is called "telepresence," said Boulanger. "You would wear special goggles--and we're working on that--which would allow you to see your wife sitting in front of you, having a day-to-day conversation. In the future you will have virtual encounters like this, people you want to be part of a meeting sitting beside you virtually and having a conversation."

At a press conference on campus Tuesday to celebrate his chair, and that of Dr. Christoph Sensen at the University of Calgary, Boulanger explained how scientists are now able to create and manipulate a model of the earth's core by feeding computers highly sophisticated mathematical equations. Once such a model is recreated in 3D, the average person is fully capable of understanding such complex physical phenomena, he said. "People can actually interact with it, and say, 'What happens if we have that instead of this?'"

"A three-dimensional visual model will allow you to explain complex systems, and understand how the world runs. People understand complex systems because daily life is actually very complex--the new technology is truly human-centred. Computers are smart enough today to adapt to people, and that's really a recent shift in computing."

The two chairs held by Boulanger and Sensen will focus on a variety of pursuits that will benefit from virtual reality technology, including engineering prototypes, testing medical procedures and conducting scientific research. Sensen, who has received $1 million for his research, is developing new tools to virtually work in the human body.

Boulanger was recruited to the University of Alberta's computing science department in 2001 from the National Research Council of Canada, where he spent 18 years as a senior research officer. He is also an adjunct scientist and principal investigator for new media at TRLabs, Canada's largest not-for-profit information and communications technology research consortium, and at the Banff Centre.

His new chair includes an iCORE Industrial Chair Establishment grant of $50,000 per year for five years, in addition to further grants from TRLabs, the University of Alberta, the Canada Foundation for Innovation, the Natural Sciences and Engineering Research Council, the Canadian Network for the Advancement of Research in Industry and Education and other industry partners.

"From my perspective, an absolutely essential part of Pierre's work on the collaborative virtual environment is described by the word collaborative," said dean of science Gregory Taylor. "It's at the very forefront of what I like to call the new science, interdisciplinary science where the collaborative team becomes the vehicle for discovery."

iCORE was established six years ago by the provincial government to support university research in information and communications technology. There are now 20 chairs focused on emerging areas such as wireless communications, artificial intelligence and quantum nanocomputing.

New lightweight LCD screen

From Gizmodo

I4U has a few details on an impressively lightweight LCD screen from Scalar Corp. that can be mounted on ordinary glasses and weighs a scant seven grams. The screen offers a simulated 14-inch display with 180,000 pixels (roughly television resolution). It's not terribly expensive, either, at just $460 or so.





Apr 04, 2005

Bianco-Valente: Neurovisions of the post-human age

Contemporary art is exploring the link between mind, brain and technology. Artists are scientists who experiment with reality in a different way. They don't use the language of maths and physics, but the universal language of sensory experiences. There are concepts that cannot be expressed using numbers. For example, how would one represent the infinite complexity of the relationship between the mind and the brain? And how does one describe the invisible co-evolution between natural and artificial, between biological and technological networks?

Bianco and Valente, two artists working in Naples, Italy, propose an original answer to these questions. The focus of their work is "on perception phenomena and brain dynamism that enable us to retain the memories of our experiences and perceive mind images, through which we can conceive an evolving reproduction of the external reality. A very fascinating subject for us is the body-mind duality: a flesh organic structure, finite and dependent on space and time, that carries about the mind, a spontaneous phenomenon without visible boundaries, totally free and self-referential. We search for the boundaries of this immaterial space living in the cerebral convolutions, trying to understand if and where it is possible to locate a point of contact linking indivisibly the two domains, the material and the immaterial."

More to explore

Bianco-Valente web site (in Italian only, but with a few English articles about their work)


Virtual reality for disabled individuals

From the "Presence" listserv

By ERIN BELL
September 23, 2004
Special to The Globe and Mail

Anyone who's used Sony's EyeToy for the PlayStation 2 console knows how much fun it can be to put yourself into a video game with the help of a camera. More advanced technology developed in Canada is being used to help people with special needs learn skills and experience things that used to be beyond their reach.

The technology comes from a partnership between integrator Xperiential Learning Solutions Inc. and Toronto developer Jestertek Inc. (formerly the Vivid Group), whose virtual reality systems have been installed in places ranging from museums and science centres to the hockey and basketball halls of fame. The pair's Experiential Learning Product Suite is aimed at people with physical, mental or behavioural disabilities.

Xperiential's founder, Theo D'Hollander, has a son with autism and relatives who suffer from cerebral palsy and Down syndrome. "It made me realize that this whole area [of technology for people with disabilities] is almost like a generation behind, it's in the industrial age when the rest of the world is in the Internet age," he said. "Today's technology is great for things like information gathering, but it has actually created more distance between people with disabilities and the life or job opportunities they need. The answer is to use technology to help create new experiences for them."

Like Sony's EyeToy, the Experiential Learning Product Suite uses cameras to capture a person's image and project it onto a monitor or large screen, combining it in real-time with the computer-generated action. The player can participate in virtual reality scenarios such as snowboarding, soccer, boxing, racing, and even mountain climbing, controlling the action by moving parts of their body. It's there the similarities cease, however.

Using cameras that capture at least 30 frames a second and hardware much more powerful than a game console, the suite can adapt to a player's physical characteristics and abilities. Sensitivity, speed and range of motion are adjustable, allowing people to control programs with tiny gestures -- from a shrug to a toe-twitch -- letting a bedridden person see what it's like to ride a horse, or someone without the use of their hands play a virtual musical instrument.
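
The article does not say how the gesture tracking works internally; as a rough sketch of the general idea only (simple frame differencing with an adjustable sensitivity threshold and gain, not Jestertek's actual algorithm, and with all parameter names assumed):

```python
import numpy as np

def motion_amount(prev_frame: np.ndarray, frame: np.ndarray,
                  sensitivity: float = 10.0) -> float:
    """Fraction of pixels whose brightness changed by more than `sensitivity`."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return float((diff > sensitivity).mean())

def gesture_to_control(motion: float, gain: float = 5.0,
                       threshold: float = 0.002) -> float:
    """Scale a tiny movement into a usable control signal.

    A high gain and low threshold let a shrug or toe-twitch drive the game;
    both would be tuned per user in the player's profile (assumed design).
    """
    return gain * motion if motion >= threshold else 0.0

# Example with two synthetic 240x320 greyscale frames.
rng = np.random.default_rng(1)
prev = rng.integers(0, 200, size=(240, 320), dtype=np.uint8)
curr = prev.copy()
curr[100:130, 150:190] += 40    # a small, localised movement
print(gesture_to_control(motion_amount(prev, curr)))
```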

The profile for each user can be fine-tuned as their mobility or skills improve. The partners are also working on a way to allow people to compete or collaborate on-line.

"It's very much a motivational experience for the kids or adults with disabilities who use the system," Jestertek president Vincent John Vincent said.

"Children with cognitive disabilities have short attention spans,"Mr. D'Hollander added. "With these programs, they're engaged by the games and music. There's something enticing about seeing themselves on television, and the idea that they're inside a computer game."

A study by the University of Ottawa is looking at ways to use the suite to make children's home-exercise rehabilitation programs more engaging.

"This gives people, especially those with some cognitive impairments or disabilities, the opportunity to have an experience that could not be possible otherwise," explained Dr. Heidi Seistrup, associate professor in the University's School of Rehabilitation Sciences.

"For example, they could play volleyball even though they're in a wheelchair, just by moving their fingers. The environment can be tweaked to allow someone with a very limited range of motion to play against someone who has a full range of motion. You put them on a level playing field, which you can't do in real life very easily."

In other cases, the suite is used to teach life skills. There are modules that can train people to sort a load of laundry, teach basic traffic safety, or show how to serve customers in a doughnut shop.

The system can be a social behaviour coach, too, Mr. D'Hollander said. "Having a child with autism go to a family gathering at Christmas or go into a crowded mall is a big issue, because the initial encounter is so intense they can't handle it. We can create a tape of family settings or a mall, and allow children to get used to it by interacting with [the virtual crowd] before they encounter the real thing, helping them over that social hurdle."

A site licence for the Experiential Learning Product Suite is $5,600 (including hardware for a single user and several dozen applications), and systems for additional users can be purchased for around $750 each. There's also a basic unit that sells for around $400, including about a dozen games, to deliver extra entertainment and exercise-related programs in a home setting.

Over the past several months, Xperiential has sold about 30 site licences to community living homes, rehabilitation centres and school boards across Ontario, as well as to customers in the United States and Europe, Mr. D'Hollander said.

A unit was recently installed at Community Living Oakville in Oakville, Ont. "We use it three times a week, and it's awesome," day service worker Kelleigh Melito said. "They love the dancing, racing and snowboarding. Because of our location, they can't get out much to go for walks, because there are no sidewalks; this is how they get their exercise."

Don Seymour, executive director of developmental services for Lambton County, Ont., said he hopes to use the virtual reality units in all the special needs homes under his jurisdiction.

"I watched a fellow we support who is in a wheelchair get in front of the camera and look at the screen, and all of a sudden realize he was in a racecar," Mr. Seymour said. "All he had to do was move his shoulders to race this car around a track. For a person who has limited mobility with their arms or legs, to be able to steer a racecar on a big screen was incredible."

"You hear about this stuff a dozen times in a year, and then to actually find something that has immediate flexibility to our folks is quite something," he added.

New article about Presence published in Nature Reviews Neuroscience!

Maria V. Sanchez-Vives and Mel Slater have published a review article entitled "From Presence to Consciousness through Virtual Reality" in the prestigious journal Nature Reviews Neuroscience (ISI impact factor 2003: 27.007).

In this article, the authors argue that presence - the sense of "being there" perceived during the exploration of a virtual environment - is worthy of study by neuroscientists, and that it might aid the study of perception and consciousness.

Maria V. Sanchez-Vives, Mel Slater, From Presence to Consciousness through Virtual Reality, Nature Reviews Neuroscience 6, 332-339 (2005); doi:10.1038/nrn1651