
Mar 23, 2005

VR for PTSD rehabilitation in Israel

From Israel21c, "A focus beyond conflict"

Most people have never witnessed a terror attack. But the graphic depiction of a suicide bus bombing on the computer screen that Prof. Patrice (Tamar) Weiss is displaying seems vividly real.

Watching it - in three dimensions and full sound, while wearing a head-mounted display helmet - may help hundreds of Israelis who have witnessed real terror attacks overcome post-traumatic stress disorder (PTSD), and is the basis of a new therapy for treating particularly resistant cases of the disorder.

The treatment is just one of dozens of novel applications of virtual reality (VR) technology demonstrated recently at the University of Haifa during the Third VR Symposium. Weiss - the person who brought together many of the world's leading VR wizards, and who is herself involved in several cutting-edge VR applications - is a strictly observant Israeli who lives in the ultra-Orthodox neighborhood of B'nai Brak.

"It's not exactly normal," admits Weiss to ISRAEL21c, laughing at the contrast between her traditional way of life and the 'Brave New World' that characterizes her professional pursuits.

But Weiss sees no contradiction between the two. "I have always been interested in different technologies and my goal has always been to help people," says the researcher, whose library has volumes of Psalms and kinesiology textbooks side by side.

An occupational therapist by training, Weiss grew up in Canada and taught at McGill University in Montreal for many years, before immigrating to Israel in 1991 with her Israel-born husband. For the last four years she has been a researcher and lecturer at the University of Haifa, and a member of its newly-established Laboratory for Innovations in Rehabilitation Technology.

Weiss's interest in VR was piqued when she read an article by one of the pioneers in the field, Prof. Albert 'Skip' Rizzo of the University of Southern California, nearly a decade ago. That ultimately led to a close collaboration with Rizzo, who also attended this month's symposium.

What interests her about the field? "Look at this," says Weiss, showing a videotape of a woman with a spinal cord injury doing traditional physiotherapy. The therapist hands her a plastic ring which she must grasp without losing her balance - then another ring, and another, and another. "Let's face it. It's very static and very boring."

Now she shows a videotape of another patient who is also learning to balance himself - only he is watching himself on a giant screen, against a breath-taking mountain backdrop, swatting at balls in the sky. Every ball he hits turns into a colourful bird. The scene is virtual, but the man's movements - he is leaping and swatting with increasing determination - are very real.

"It's interesting and motivating," explains Weiss. "I have yet to meet a patient - of any age - who didn't like it. So it's very effective." (In a newer version, she notes excitedly, patients will wear a glove which vibrates whenever they make contact with a virtual ball - further increasing the sense of realness.)

The symposium Weiss organized, which brought leading VR experts from the US, Canada, Europe, Japan and Israel to Haifa, showed the dizzying range of new VR technologies dedicated to health and rehabilitation. These ranged from a robotic dog that can be a reliable companion for the elderly - "no need to feed him or take him for walks," noted the researcher who demonstrated the small, black, yelping Sony invention - to 3D interactive games that could some day be used for early diagnosis of Alzheimer's disease, treatment of attention deficit disorder, and rehabilitation of patients who have suffered central nervous system injuries.

"Virtual reality has completely revolutionized the field of occupational therapy," says Weiss, who is personally involved in several innovative VR projects, including the simulated bus bombing program designed to treat Israelis suffering from severe post-traumatic stress.

That program - developed together with Dr. Naomi Josman, Prof. Eli Somer and Ayelet Reisberg, all of the University of Haifa, as well as with American researchers - is designed to expose patients in a controlled manner to the traumatic incident, which they are often unable to remember but which has a powerful and debilitating effect on their lives.

The realistic rendering of the bus bombing triggers the patient's memories - the first vital step on the path to overcoming trauma. (The simulation does not include all the gruesome details of the attack, but rather just enough to help the patient recall what happened.)

It was Josman who first came up with the idea of using such a treatment in Israel. She was attending a conference in the United States when she saw how University of Washington Prof. Hunter Hoffman had applied VR to successfully treat Americans suffering from PTSD following the 9/11 attack on the Twin Towers.

Similar programs have also been used recently to help American veterans traumatized by their tour of duty in Iraq, and even Vietnam veterans for whom no other treatment has proven effective.

Through close collaboration with Hoffman, the U. of Haifa team developed an Israeli version of the program which is now being used to treat the first few patients.

"If our pilot study is effective, we will launch a full-scale clinical trial," says Weiss, "and hopefully we will be able to provide a solution for those PTSD patients who have been resistant to more traditional cognitive therapy."

In another application of VR technology, Weiss and U. of Haifa colleagues have developed a program to help stroke victims relearn the basic skills required to shop on their own. The patient composes a grocery list and makes his or her way through a 'virtual supermarket,' seeking the right products, pulling them off the shelves and into a shopping cart, while announcements of sales are broadcast on the loudspeaker system.

"It's the first such program designed to improve both cognitive and motor skills of stroke victims," she notes.

Last week, the American Occupational Therapy Foundation (AOTF) invited Weiss to join its Academy of Research, the highest scholarly honor that the AOTF confers.

"Your work clearly helps to move the profession ahead, and demonstrates powerful evidence of the importance of assistive technology in helping persons with disabilities participate in the occupations of their choice, while improving the quality of their lives," the AOTF wrote in its letter to Weiss.

For Weiss, virtual reality is not only the focus of research, but a way of life - at least in her work. She communicates with her colleagues around the world by teleconference and, of course, email - and notes that she has never even met her close collaborator Hoffman, even though they have been communicating several times a week for years. She also taught an entire university course last semester without ever entering a lecture hall. Instead, she sat in the comfort of her B'nai Brak home, wearing a headset and microphone to deliver a weekly videoconferenced lesson on assistive technology to students who sat in their own homes.

"They could see a video of me, and whenever a student wanted to speak I would see the icon of a hand being raised. We even had guest lecturers from abroad. The students really appreciated not having to come to the university late at night for the course," says Weiss, who was pleased to be able to - once again - harness technology to help make people's lives a little easier.

Mar 22, 2005

Virtual reality games alleviate pain in children

From BioMed Central

Virtual reality games can help alleviate pain in children being treated for severe injuries, according to research published in the open access, peer-reviewed journal BMC Pediatrics.

Immersion in a virtual world of monsters and aliens helps children feel less pain during the treatment of severe injuries such as burns, according to a preliminary study by Karen Grimmer and colleagues from the Women's and Children's Hospital in Adelaide, Australia.

A virtual reality game is a computer game especially designed to completely immerse the user in a simulated environment. Unlike other computer games, the game is played wearing a special headset with two small computer screens and a special sensor, which allows the player to interact with the game and feel a part of its almost dreamlike world. "Owing to its ability to allow the user to immerse and interact with the artificial environment that he/she can visualize, the game-playing experience is engrossing", explain the authors.

Children with severe burns suffer great pain and emotional trauma, especially during the cleaning and dressing of their wounds. They are usually given strong painkiller drugs, muscle relaxants or sedatives, but these are often not enough to completely alleviate pain and anxiety. These medications also have side effects such as drowsiness, nausea or lack of energy.

Grimmer and colleagues asked seven children, aged five to eighteen, to play a virtual reality game while their dressing was being changed. The children were also given the usual amount of painkillers. The researchers assessed the pain the children felt when they were playing and then compared it to the amount of pain felt when painkillers were used alone.

To measure the intensity of the pain, the team used the Faces Scale, which assigns a score from 0 to 10 - 10 representing maximum pain - to facial expressions of pain. For younger children they used five different faces, representing "no pain" to "very bad pain." The researchers also interviewed the nurses and parents present during the dressing change.

The average pain score when the children received painkillers alone was 4.1/10. It decreased to 1.3/10 when the children had played a game and been given painkillers. Because the sample size was so small the researchers analysed their results per child, and they found that all but one child lost at least 2 points on the scale when they were playing the game. The parents and nurses confirmed these results and said that the children clearly showed less signs of pain when they played the game.
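The per-child analysis the researchers describe is simple enough to sketch. The scores below are hypothetical - the article reports only the group means of 4.1 and 1.3 - and are chosen so the arithmetic reproduces those figures:

```python
# Hypothetical per-child Faces Scale scores (0-10). The article reports
# only the group means (4.1 vs 1.3), so these values are illustrative.
analgesia_only = [5, 4, 6, 3, 4, 5, 2]
vr_plus_analgesia = [2, 1, 3, 1, 1, 0, 1]

mean_control = sum(analgesia_only) / len(analgesia_only)
mean_vr = sum(vr_plus_analgesia) / len(vr_plus_analgesia)

# Per-child analysis: how many children dropped at least 2 points with VR?
drops = [a - v for a, v in zip(analgesia_only, vr_plus_analgesia)]
responders = sum(1 for d in drops if d >= 2)

print(f"mean without VR: {mean_control:.1f}, with VR: {mean_vr:.1f}")
print(f"{responders} of {len(drops)} children dropped >= 2 points")
```

With these made-up numbers, six of seven children drop at least 2 points - matching the "all but one child" pattern the study reports.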

"We found that virtual reality coupled with analgesics was significantly more effective in reducing pain responses in children than analgesic only" conclude the authors.

This is only a preliminary study, but the researchers are hopeful. They propose to test virtual reality on more subjects, possibly with games appropriate to each age group, in the hope that it could one day greatly reduce, if not completely replace, the use of painkillers.

This press release is based on the article:
The efficacy of playing a virtual reality game in modulating pain for children with acute burn injuries: a single blinded randomized controlled trial. Debashish Das, Karen Grimmer, Tony Sparnon, Sarah McRae, Bruce Thomas. BMC Pediatrics 2005, 4:27 (3 March 2005)

Feb 11, 2005

Cybex Trazer VR system for neurorehabilitation

Retrieved from the PRESENCE listserv

It hasn’t happened yet, but the underlying concept is intriguing -- the idea that you could take one of the causes of childhood obesity and re-engineer it as part of the solution.

It may sound like science fiction, but it’s exactly what Bob Verdun, the new director of the Pawtucket Family YMCA, wants to achieve with a cutting-edge piece of fitness equipment called the Trazer.

The Trazer is a serious machine, using optical sensors, an electronic "beacon," and interactive, virtual reality software to facilitate exercise training, sports testing and rehabilitation -- but Verdun is more interested in its kid-appeal.

That’s because the Trazer is the fitness equivalent of the nutritious Twinkie: a virtual reality video game designed to make exercise universally accessible and, believe it or not, fun.

"There is a resistance for kids who are not good at team sports to join a school team, and these are usually the kids who need exercise the most," said Ray Giannelli, the senior vice president of research and development for Cybex International, the Massachusetts-based company that will ship three of the first production Trazers to Pawtucket this spring.

The Trazer, said Giannelli, will give sports-averse kids a fighting chance to stay in shape by putting simple, effective exercises within the familiar context of a video game.

"A lot of kids today are on the computer all day, e-mailing or chatting or playing video games," said Verdun, "This is the perfect bridge to exercise, a viable way to win the war against childhood obesity."

Here’s how it works:

Optical sensors mounted beneath a sleek, high-definition video display are trained to register the movements of an electronic beacon in a 6-foot, three-dimensional space.

When the beacon is attached to a belt worn by the user, his or her movements are mimicked in real-time by a virtual character in a virtual world.
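The tracking loop described above - raw beacon coordinates mirrored by an on-screen character - can be sketched in a few lines. The 6-foot play space comes from the article; the normalization and units are assumptions:

```python
# Minimal sketch of the Trazer-style tracking loop: the optical sensors
# report the beacon's position in a 6-foot 3D volume, and the avatar
# simply mirrors it each frame. Units and names are assumptions.

PLAY_SPACE_FEET = 6.0  # tracked cube, per the article

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def beacon_to_avatar(beacon_xyz):
    """Map a raw beacon reading (feet) to normalized avatar coords in [0, 1]."""
    return tuple(clamp(c / PLAY_SPACE_FEET, 0.0, 1.0) for c in beacon_xyz)

# A user standing in the middle of the space, belt at 3 ft:
print(beacon_to_avatar((3.0, 3.0, 3.0)))  # (0.5, 0.5, 0.5)
```

Clamping keeps the avatar inside the virtual world even when the user steps to the edge of the sensed volume.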

An easy-to-navigate options menu features interactive programs for performance testing and sports training, as well as rehabilitative movement therapy, kinesthetic learning and fitness fun.

There are programs designed for the elderly to help them strengthen neglected muscles and maintain a full range of motion, and games designed to keep kids excited about physical activity.

In one game called "Trap Attack," players are required to move their virtual self across a three-dimensional chessboard, in sync with a roving red cursor.

Play the game on a basic difficulty setting, and the cursor moves relatively slowly, one square at a time. Crank up the difficulty, and trapdoors will appear, forcing you to jump over them.

Another game, "Goalie Wars," allows the player to intercept and "catch" soccer balls thrown by a virtual goalie. Once caught, a ball can be thrown back by lunging forward, and if you fake-out your polygonal opponent, you score a point.

Although optical sensors and virtual reality video games are impressive, they’re not exactly new.

The real meat and potatoes of Trazer is its ability to analyze and tabulate the data it collects. While you play "Goalie Wars" or "Trap Attack," the Trazer is measuring your reaction time, acceleration, speed, power, balance, agility, jumping height, endurance, heart rate and caloric output.
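How metrics like speed and jump height might fall out of the position samples the game already collects can be sketched as follows. The sample rate, units, and formulas here are assumptions, not Cybex's actual pipeline:

```python
# Illustrative sketch: deriving speed and jump height from a stream of
# tracked 3D positions (meters, fixed sample rate). The real Trazer
# pipeline is proprietary; this only shows the kind of arithmetic involved.

SAMPLE_HZ = 30  # assumed sensor rate

def speeds(positions):
    """Instantaneous speed between consecutive 3D samples, in m/s."""
    out = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        out.append(dist * SAMPLE_HZ)
    return out

def jump_height(positions):
    """Peak vertical rise above the starting height (y-axis assumed up)."""
    y0 = positions[0][1]
    return max(p[1] for p in positions) - y0

# Four samples of a small hop while stepping sideways:
track = [(0.0, 1.0, 0.0), (0.1, 1.0, 0.0), (0.2, 1.3, 0.0), (0.3, 1.1, 0.0)]
print(max(speeds(track)), jump_height(track))
```

Quantities like acceleration or caloric output would be further derivations from the same sample stream.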

These statistics are displayed and saved after the completion of each exercise, a feature that allows serious athletes, convalescents and kids to quantify their performance and track their progress.

As they’re fond of saying at Cybex, Trazer has more in common with a flight simulator than an exercise machine.

The Trazer is a versatile exercise machine, but Verdun and his colleagues are hoping it will be a crowd-pleaser as well.

Although the Trazer is still in development, a video demonstration of its capabilities and a hands-on demonstration of the more conventional Cybex Arc Trainer -- a high-tech step machine -- are expected to be the main attractions at the YMCA’s open house on Saturday.

Scheduled for Saturday, from 10 a.m. to 12 p.m., the open house is a sneak preview: a way to show current and prospective members what they can expect from the Y’s $8 million facelift -- a soup-to-nuts overhaul which will include renovated facilities, expanded program space and brand-new, state-of-the-art fitness equipment.

The ultimate goal is to make the YMCA more family-friendly, appealing to old people, young people and everyone in between. To Verdun’s thinking, what better way to bring the 115-year-old building into the realm of iPods and instant messages than with a video game?

"We want parents and children to know that there isn’t anything they can’t do at our facility," Verdun said in a recent interview.
"Families can come here, exercise together, and share an experience that is affordable, safe and fun.

"You don’t think about moving around when you’re on the Trazer, and that’s exactly what we want. The essence of the YMCA is having fun."

Jan 28, 2005

VR to treat PTSD

From Wired

SAN DIEGO -- While the real Iraq is more than enough for most people to handle, there's a virtual Iraq lurking on the laptop of psychologist Skip Rizzo, a research scientist at the University of Southern California.

With a push of a button, special effects will appear -- a mosque's call to prayer, a sandstorm, the sounds of bullets or bombs. "We can put a person in a VR headset and have them walk down the streets of Baghdad," Rizzo said. "They can ride in a Humvee, fly in a helicopter over a battle scene or drive on a desert road."

This is no video game, nor is it a training device. Rizzo and colleagues are developing a psychological tool to treat post-traumatic stress disorder, or PTSD, by bringing soldiers back to the scenes that still haunt them. A similar simulation is in the works for victims of the World Trade Center attacks.

PTSD treatment, the newest frontier in the intersection between virtual reality and mental health, is one of the hot topics this week at the 13th annual Medicine Meets Virtual Reality conference, which began Wednesday in Long Beach, California. Rizzo and others will explore plans to expand virtual reality's role in mental health by adding more elements like touch and the ability to interact with simulations. "The driving vision is a holodeck," Rizzo said. "If you look at the holodeck, and all the things people do in Star Trek, that's what we'd like to be able to do."

Powerful computers are cheaper -- the necessary machines used to cost as much as $175,000, but now the Virtual Reality Medical Center in San Diego, one of about 10 private VR mental-health clinics in the United States, picks up its hardware at Fry's Electronics. VR helmets -- which allow users to turn their heads and see things above, below and behind them in the 360-degree virtual world -- cost as little as a few thousand dollars. And perhaps most importantly, the graphics are more advanced, thanks to partnerships with video-game developers.

At the San Diego clinic, graphics designers are developing a remarkably realistic virtual world based on digital photos and audio from San Diego International Airport. Patients afraid of flying will be able to take a virtual tour of the airport, from the drop-off area through the ticket counter, metal detectors and waiting areas. The simulation is so precise that users can enter restrooms, peruse magazines at the newsstand or wander around the food court; recordings will allow the virtual PA system to offer the requisite incomprehensible announcements.

The clinic already offers a simulation of a flight. At $120 a session, patients sit in actual airplane seats and watch a simulation of a takeoff, accurate all the way down to announcements by flight attendants and pilots. At takeoff, actual airplane audio -- engines revving, landing gear retracting -- is channeled into subwoofers below the seat, providing a dead-on simulation of what a passenger feels. Even the view outside the window is based on actual digital video from a flight.

"Exposure therapy" has long been a common treatment for phobias. "It's a gradual reversal of avoidance," said psychologist Hunter Hoffman, a researcher who studies VR at the University of Washington. "You start by having them hold their ground. A lot of phobics have mental misunderstandings about what would happen if they face the thing they're afraid of. A spider phobic, they may think they're going to have a heart attack -- they think if they don't leave the room, they'll go insane. They have these unrealistic theories about what will happen."

Jan 20, 2005

Play scents like you play music!!

Though the Scentstories player doesn't actually play music, it teams with Scentstories disc themes to work much like a music CD player. Just insert one of the themed discs and push play. The player then rotates through five scents on each disc, one by one with a new scent every 30 minutes.

The player shuts off automatically after all five scents have been played. You can stop the player or skip through the scent tracks at any time. Together, the Scentstories player and disc create a new-to-the-world scent experience.


"As Scentstories suggests, smell is the next big accessory for the home ... pop in a scented disc and go barefoot on the shore" The New York Times, August 5, 2004

"And now the latest entrant ... Scentstories from Procter & Gamble, an electric machine that amounts to a mini-jukebox of nice smells" USA Today, August 3, 2004

"We then sneak-sniffed the "Celebrate the Holidays" disc (the sixth disc, not on the market until October). Track 2, "baking holiday pie" was this tester's fave overall." The Chicago Tribune, August 15, 2004

"Smoky candles aren't exactly office-friendly. So what's a girl to do when she craves a Zen like moment on the job? Enter Febreze Scentstories. Discs you slip into a boom box-like gadget for two-plus hours of varying scents like lavender or vanilla. "I love that you can adjust the intensity so the smell isn't overwhelming" one relaxed tester comments" Self magazine, September 2004

Jan 18, 2005

Simulating Human Touch


From InformIT.com

Haptics: The Technology of Simulating Human Touch

Date: Jan 14, 2005
By Laurie Rowell.

When haptics research — that is, the technology of touch — moves from theory into hardware and software, it concentrates on two areas: tactile human-computer interfaces and devices that can mimic human physical touch. In both cases, that means focusing on artificial hands. Here you can delve into futuristic projects on simulating touch.

At a lunch table some time back, I listened to several of my colleagues eagerly describing the robots that would make their lives easier. Typical was the servo arm mounted on a sliding rod in the laundry room. It plucked dirty clothes from the hamper one at a time. Using information from the bar code—which new laws would insist be sewn into every label—the waldo would sort these items into a top, middle, or lower nylon sack.

As soon as a sack was full of, say, permanent press or delicates, the hand would tip the contents into the washing machine. In this way, garments could be shepherded through the entire cycle until the shirts were hung on a nearby rack, socks were matched and pulled together, and pajamas were patted smooth and stacked on the counter.

Sounds like a great idea, right? I mean, how hard could it be for a robotic hand to feel its way around a collar until it connects with a label? As it turns out, pretty tricky. In fact, one of the things that keeps us from the robotic servants we feel sure are our due - and from virtual reality that lets us ski without risking a broken leg - is our limited knowledge of touch.

We understand quite a bit about how humans see and hear, and much of that information has been tested and refined by our interaction with computers over the past several years. But if we are going to get VR that really lets us practice our parasailing, the reality that we know has to be mapped and synthesized and presented to our touch so that it is effectively "fooled." And if we want androids that can sort the laundry, they have to be able to mimic the human tactile interface.

That leads us to the study of haptics, the technology of touch.

Research that explores the tactile intersection of humans and computers can be pretty theoretical, particularly when it veers into the realm of psychophysics - the branch of experimental psychology that deals with the physical environment and our perception of it. Researchers in the field use experiments to determine parameters such as the sensory thresholds at which signals are perceived.

But once haptics research moves from theory into hardware and software, it concentrates on two primary areas of endeavor: tactile human-computer interfaces and devices that can mimic human physical touch - most specifically and most commonly, artificial hands.

Substitute Hands

A lot of information can be conveyed by the human hand. Watching The Quiet Man the other night, I was struck by the scene in which the priest, played by Ward Bond, insists that Victor McLaglen shake hands with John Wayne. Angrily, McLaglen complies, but clearly the pressure he exerts far exceeds the requirements of the gesture. Both men are visibly "not wincing" as the Duke drawls, "I never could stand a flabby handshake myself."

When they release and back away from each other, the audience is left flexing its collective fingers in response.

In this particular exchange, complex social messages are presented to audience members, who recognize the indicators of pressure, position, and grip without being involved in the tactile cycle. Expecting mechanical hands to do all that ours can is a tall order, so researchers have been inching that way for a long time by making them do just some of the things ours can.

Teleoperators, for example, are distance-controlled robotic arms and hands that were first built to touch things too hot for humans to handle - specifically, radioactive substances in the early nuclear programs of the late 1940s and 1950s.

While operators had to be protected from radiation by a protective wall, the radioactive material itself had to be shaped with careful precision. A remote-controlled servo arm seemed like the perfect solution.

Accordingly, two identical mechanical arms were stationed on either side of a 1m-thick quartz window. The joints of one were connected to the joints of the other by means of pulleys and steel ribbons. In other words, whatever an operator made the arm do on one side of the barrier was echoed by the device on the other side.

These were effective and useful instruments, allowing the operator to move toxic substances from a remote location, but they were "dumb." They offered no electronic control and were not linked to a computer.

Modern researchers working on this problem concentrate on devices that can "feel" the density, shape, and character of materials that may be miles away, seen only on a computer screen. This kind of teleoperator depends on a haptic interface and requires some understanding of how touch works.

Worlds in Your Hand

To build a mechanical eye—say, a camera—you need to study optics. To build a receiver, you need to understand acoustics and how these work with the human ear. Similarly, if you expect to build an artificial hand—or even a finger that perceives tactile sensation—you need to understand skin biomechanics.

At the MIT Touch Lab, where numerous projects in the realm of haptics are running at any given time, one project seeks to mimic the skin sensitivity of the primate fingertip as closely as possible, concentrating on having it react to touch as the human finger would.

The research is painstaking and exacting, involving, for example, precise friction and compressibility measurements of the fingerpads of human subjects. Fingertip dents and bends in response to edges, corners, and surfaces have provided additional data. At the same time, magnetic resonance imaging (MRI) and high-frequency ultrasound show how skin behaves in response to these stimuli on the physical plane.

Not satisfied with the close-ups that they could get from available devices, the team developed a new tool, the Ultrasound Backscatter Microscope (UBM), which shows the papillary ridges of the fingertip and the layers of skin underneath in far greater detail than an MRI.

As researchers test reactions to surfaces from human and monkey participants, the data they gather is mapped and recorded to emerging 2D and 3D fingertip models. At this MIT project and elsewhere, human and robot tactile sensing is simulated by means of an array of mechanosensors presented in some medium that can be pushed, pressed, or bent.

In the Realm of Illusion

Touch might well be the most basic of human senses, its complex messages easily understood and analyzed even by the crib and pacifier set. But what sets it apart from other senses is its dual communication conduit, allowing us to send information by the same route through which we perceive it. In other words, those same fingers that acknowledge your receipt of a handshake send data on their own.

In one project a few years back, Peter J. Berkelman and Ralph L. Hollis began stretching reality in all sorts of bizarre ways. Not only could humans using their device touch things that weren't there, but they could reach into a three-dimensional landscape and, guided by the images appearing on a computer screen, move those objects around.

This was all done with a device built at the lab based on Lorentz force magnetic levitation (the Lorentz force is the force exerted on a charged particle in an electromagnetic field). The design depended upon a magnetic tool levitated, or suspended, over a surface by means of electromagnetic coils.
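The Lorentz force mentioned here is F = q(E + v × B), and a quick numeric sketch makes the geometry concrete. This is purely illustrative - the actual device servoes coil currents, which is not modeled:

```python
# Lorentz force on a charged particle: F = q * (E + v x B).
# In the maglev device, coil currents are servoed so forces of this kind
# balance gravity and track the user's hand; the numbers below are
# illustrative only.

def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def lorentz_force(q, E, v, B):
    """Force on charge q with velocity v in fields E and B."""
    vxB = cross(v, B)
    return tuple(q * (e + c) for e, c in zip(E, vxB))

# A unit charge moving along +x through a magnetic field along +z
# feels a force along -y:
print(lorentz_force(1.0, (0, 0, 0), (1, 0, 0), (0, 0, 1)))  # (0.0, -1.0, 0.0)
```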

To understand the design of this maglev device, imagine a mixing bowl with a joystick bar in the middle. Now imagine that the knob of the joystick floats barely above the stick, with six degrees of freedom. Coils, magnet assemblies, and sensor assemblies fill the basin, while a rubber ring makes the top comfortable for a human operator to rest a wrist. This whole business is set in the top of a desk-high metal box that holds the power supplies, amplifiers, and control processors.

Looking at objects on a computer screen, a human being could take hold of the levitated tool and try to manipulate the objects as they were displayed. Force-feedback data from the tool itself provided tactile information for holding, turning, and moving the virtual objects.

What might not be obvious from this description is that this model offered a marvel of economy, replacing the bulk of previous systems with an input device that had only one moving part. Holding the tool - or perhaps pushing at it with a finger - the operator could "feel" the cube seen on the computer screen: edges, corners, ridges, and flat surfaces. With practice, operators could use the feedback data to maneuver a virtual peg into a virtual hole with unnerving reliability.

Notice something here: An operator could receive tactile impressions of a virtual object projected on a screen. In other words, our perception of reality was starting to be seriously messed around with here.


Some of the most interesting work in understanding touch has been done to compensate for hearing, visual, or tactile impairments.

At Stanford, the TalkingGlove was designed to support individuals with hearing limitations. It recognized American Sign Language finger spelling to generate text on a screen or synthesize speech, applying a neural-net algorithm to map hand movements, captured by an instrumented glove, into digital output. It was so successful that it spawned a commercial application, the Virtex Cyberglove, which was later purchased by Immersion and became simply the Cyberglove. Current uses include virtual reality, biomechanics and animation.
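The glove's core mapping problem - turning a vector of joint-angle readings into a finger-spelled letter - can be illustrated in miniature. The TalkingGlove used a neural net; a nearest-neighbor lookup over made-up calibration poses shows the same idea:

```python
# Hedged sketch of glove-to-letter classification. The TalkingGlove used
# a neural-net algorithm; nearest-neighbor matching against calibration
# poses is a simpler stand-in. All poses here are invented for illustration.

def nearest_letter(reading, templates):
    """Return the letter whose template pose is closest to the reading."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda letter: dist(reading, templates[letter]))

# Fake calibration poses: one joint-angle vector per letter (degrees).
templates = {
    "A": (90, 90, 90, 90, 10),   # fist, thumb alongside
    "B": (0, 0, 0, 0, 80),       # flat hand, thumb folded
    "L": (0, 90, 90, 90, 0),     # index and thumb extended
}

print(nearest_letter((5, 85, 88, 92, 3), templates))  # L
```

A trained network generalizes better across users than fixed templates, which is presumably why the original project went that route.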

At Lund University in Sweden, work is being done in providing haptic interfaces for those with impaired vision. Visually impaired computer users have long had access to Braille displays or devices that provide synthesized speech, but these just give text, not graphics, something that can be pretty frustrating for those working in a visual medium like the Web. Haptic interfaces offer an alternative, allowing the user to feel shapes and textures that could approximate a graphical user interface.

At Stanford, this took shape in the 1990s as the "Moose," an experimental haptic mouse that gave new meaning to the terms drag and drop, allowing the user to feel a pull to suggest one and then feel the sudden loss of mass to signify the other. As users approached the edge of a window, they could feel the groove; a check box repelled or attracted, depending on whether it was checked. Some of the time, experimental speech synthesizers were used to "read" the text.
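The two effects described - a widget that attracts or repels, and a groove at a window edge - can be sketched as simple one-dimensional force functions. Constants and units are arbitrary; this is not the Moose's actual code:

```python
# Illustrative 1D force fields for a haptic mouse, in the spirit of the
# Moose's per-widget effects. Spring constants and units are arbitrary.

def checkbox_force(x, center, checked, k=2.0):
    """Attract the cursor toward a checked box, repel from an unchecked one."""
    pull = k * (center - x)          # spring toward the widget center
    return pull if checked else -pull

def window_edge_force(x, edge, groove_width=1.0, k=5.0):
    """A 'groove' at a window edge: push back toward it while inside."""
    if abs(x - edge) < groove_width:
        return k * (edge - x)        # detent holds the cursor in the groove
    return 0.0

print(checkbox_force(3.0, 5.0, checked=True))   # 4.0 (pulled toward the box)
print(window_edge_force(10.5, 10.0))            # -2.5 (nudged back to the edge)
```

Each frame, a driver would sum the active widgets' forces and send the result to the mouse's motors.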

Such research has led to the subsequent development of commercial haptic devices, such as the Logitech iFeel Mouse, offering the promise of new avenues into virtual worlds for the visually impaired.

Where Is This Taking Us?

How far has all this research brought us toward virtual realities and actual robot design? Immersion and other companies offer a variety of VR gadgets emerging from the study of haptics, but genuine simulated humans are still far out on the horizon.
What we do have is a number of researchers around the globe working on perfecting robotic hands, trying to make them not only hold things securely, but also send and receive messages as our own do. Here is a representative sampling:

The BarrettHand BH8-262
Originally developed by Barrett Technology for NASA but now available commercially, it offers a three-fingered grasper with four degrees of freedom, embedded intelligence, and the ability to hold on to any geometric shape from any angle.

The Anatomically Correct Testbed (ACT) Hand
A project at Carnegie Mellon's Robotics Institute, this is an ambitious effort to create a synthetic human hand for several purposes. These include having the hand function as a teleoperator or prosthetic, as an investigative tool for examining complex neural control of human hand movement, and as a model for surgeons working on damaged human hands. Still in its early stages, the project has created an actuated index finger that mimics human muscle behavior.

Cyberhand
This collaboration of researchers and developers from Italy, Spain, Germany, and Denmark proposes to create a prosthetic hand that connects to remaining nerve tissue. It will use one set of electrodes to record and translate motor signals from the brain, and a second to pick up and conduct sensory signals from the artificial hand to nerves of the arm for transport through regular channels to the brain.

Research does not produce the products we'll be seeing in common use during the next few years. It produces their predecessors. But many of the scientists in these labs later create marketable devices. Keep an eye on these guys; they are the ones responsible for the world we'll be living in, the one with bionic replacement parts, robotic housekeepers, and gym equipment that will let us fly through virtual skies.

Jan 10, 2005

EMagin Z800 3D Visor

The eMagin Z800 3D Visor is the first product to deliver on the promise of an immersive 3D computing experience. Anyone can now surround themselves with the visual data they need, without the limits of traditional displays and in complete privacy. Now, gamers can play "virtually inside" their games, personally immersed in the action. PC users can experience and work with their data in a borderless environment.

360 degree panoramic view
Two high-contrast eMagin SVGA 3D OLED Microdisplays deliver fluid full-motion video in more than 16.7 million colors. Driving the user’s experience is the highly responsive head-tracking system that provides a full 360-degree angle of view. eMagin’s specially developed optics deliver a bright, crisp image.

Weighing less than 8 oz, the eMagin Z800 3D Visor is compact and comfortable. While the eMagin OLED displays are only 0.59 inch diagonal, the picture is big – the equivalent of a 105-inch movie screen viewed at 12 feet.

Only eMagin OLED displays provide brilliant, rich colors in full 3D with no flicker and no screen smear. eMagin’s patented OLED-on-silicon technology enhances the inherently fast refresh rates of OLED materials with on-chip signal processing and data buffering at each pixel site. This enables each pixel to continuously emit only the colors it is programmed to show. Full-color data is buffered under every pixel built into each display, providing flicker-free stereovision capability.


The Z800’s head-tracking system enables users to “see” their data in full 3D surround viewing with just a turn of the head. Virtual multiple monitors can also be simulated. Designers, publishers and engineers can view multiple drawings and renderings as if they were each laid out on an artist’s table, even in 3D. The eMagin Z800 3D Visor integrates state-of-the-art audio with high-fidelity stereo sound and a built-in noise-canceling microphone system to complete the immersive experience.

* Brilliant 3D stereovision with hi-fi sound for an immersive experience
* Superb high-contrast OLED displays delivering more than 16.7 million colors
* Advanced 360 degree head-tracking that takes you “inside” the game
* Comfortable, lightweight, USB-powered visor; PC compatible
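The 360-degree head tracking described above boils down to rotating world geometry into the viewer's frame each time the head moves. A minimal Python sketch of the yaw case (the sign convention and sample point are illustrative assumptions, not eMagin's implementation):

```python
import math

def to_head_frame(point, yaw_deg):
    """Coordinates of a world (x, z) point in the head's frame, for a head
    yawed yaw_deg degrees to the left (x = viewer's right, z = straight ahead)."""
    a = math.radians(yaw_deg)
    x, z = point
    # Inverse of the head's yaw rotation, applied to the world point.
    return (x * math.cos(a) + z * math.sin(a),
            -x * math.sin(a) + z * math.cos(a))

# A point straight ahead ends up on the viewer's right after a
# 90-degree turn to the left.
x, z = to_head_frame((0.0, 1.0), 90)
assert abs(x - 1.0) < 1e-9 and abs(z) < 1e-9
```

A full tracker adds pitch and roll, but the principle is the same: the scene is re-rendered from whatever direction the sensors say the head is facing.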

Jan 05, 2005

Computer generated brain surgery to help trainees


From E-Health Insider (http://www.e-health-insider.com/news/item.cfm?ID=988)...

Researchers at the University of Nottingham have developed a
virtual reality brain surgery simulator for trainee surgeons that
combines haptics with three-dimensional graphics to give what
they claim is the most realistic model in the world.

A 'map' of the brain surface is produced by the software, which
also renders the tweezers or other surgical implement and shows
any incisions made into the virtual brain. The simulator is
controlled by a device held by the user, which uses a robotic
mechanism to give the same pressure and resistance as it would
if it were touching a real brain.

[Image: map of the brain on the virtual surgery simulator]

Dr Michael Vloeberghs, senior lecturer in paediatric neurosurgery
at the University's School of Human Development, who led the
development team, said that the new system would benefit
trainees: "Traditionally a large amount of the training that
surgeons get is by observing and performing operations under
supervision. However, pressures on resources, staff shortages
and new EU directives on working hours mean that this teaching
time is getting less and less.

"This simulator will allow surgeons to become familiar with
instruments and practice brain surgery techniques with
absolutely no risk to the patient whatsoever."

The pilot software was developed with the Queen's Medical
Centre, in Nottingham, which contains a Simulation Centre in
which dummies are often used for surgical training.

Dr Vloeberghs says that the haptic system is an improvement on
the existing system: "Dummies can only go so far – you're still
limited by the physical presence, and you can't do major surgery
on dummies... you can simulate electrically and phonetically what
is happening, but nothing more than that."
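The pressure and resistance the robotic mechanism supplies is commonly rendered with a penalty (spring) model, in which the resisting force grows with how far the tool has penetrated the virtual surface. This is a generic sketch with an assumed stiffness value, not the Nottingham team's actual tissue model:

```python
def contact_force(tool_depth, surface_depth, stiffness=300.0):
    """Resisting force (N) for a tool tip `tool_depth` metres along its axis
    when the virtual surface sits at `surface_depth` (assumed stiffness, N/m)."""
    penetration = tool_depth - surface_depth
    if penetration <= 0:
        return 0.0                    # not touching: no resistance
    return stiffness * penetration    # Hooke's law push-back

assert contact_force(0.010, 0.012) == 0.0             # tool above the surface
assert abs(contact_force(0.015, 0.012) - 0.9) < 1e-6  # ~0.9 N of resistance
```

Real surgical simulators layer damping and tissue deformation on top of this, but the spring term is what gives the user the basic sensation of pressing against something.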

Adib Becker, Professor of Mechanical Engineering at the
university, said that the technology could be developed for the
future, and that brain surgery online could even be possible: "If
you project maybe four or five years from now, it may be possible
for a surgeon to operate on a patient totally remotely.

"So the surgeons would be located somewhere else in the world
and can communicate through the internet, and can actually feel
the operation as they are seeing it on the screen."

The team hopes that the piloted software, which was funded by a
grant of £300,000 from the Engineering and Physical Sciences
Research Council (EPSRC), will help train surgeons to a higher
level before their first operation on live patients, thereby
increasing safety.

Jan 03, 2005

Free ISSUE of Journal of NeuroEngineering and Rehabilitation

Free Papers about VR available from the Journal of Neuroengineering and Rehabilitation

Simulator sickness when performing gaze shifts within a wide field of view optic flow environment: preliminary evidence for using virtual reality in vestibular rehabilitation
Patrick J. Sparto, Susan L. Whitney, Larry F. Hodges, Joseph M. Furman, Mark S. Redfern
Journal of NeuroEngineering and Rehabilitation 2004, 1:14 (23 December 2004)
[Abstract] [Provisional PDF]

Considerations for the future development of virtual technology as a rehabilitation tool
Robert V. Kenyon, Jason Leigh, Emily A. Keshner
Journal of NeuroEngineering and Rehabilitation 2004, 1:13 (23 December 2004)
[Abstract] [Provisional PDF]

Video capture virtual reality as a flexible and effective rehabilitation tool
Patrice L. Weiss, Debbie Rand, Noomi Katz, Rachel Kizony
Journal of NeuroEngineering and Rehabilitation 2004, 1:12 (20 December 2004)
[Abstract] [Provisional PDF]

Reaching in reality and virtual reality: a comparison of movement kinematics in healthy subjects and in adults with hemiparesis
Antonin Viau, Anatol G. Feldman, Bradford J. McFadyen, Mindy F. Levin
Journal of NeuroEngineering and Rehabilitation 2004, 1:11 (14 December 2004)
[Abstract] [Provisional PDF]

Motor rehabilitation using virtual reality
Heidi Sveistrup
Journal of NeuroEngineering and Rehabilitation 2004, 1:10 (10 December 2004)
[Abstract] [Provisional PDF]

Presence and rehabilitation: toward second-generation virtual reality applications in neuropsychology
Giuseppe Riva, Fabrizia Mantovani, Andrea Gaggioli
Journal of NeuroEngineering and Rehabilitation 2004, 1:9 (8 December 2004)
[Abstract] [Provisional PDF]

Virtual reality and physical rehabilitation: a new toy or a new research and rehabilitation tool?
Emily A Keshner
Journal of NeuroEngineering and Rehabilitation 2004, 1:8 (3 December 2004)
[Abstract] [Provisional PDF]

Dec 21, 2004

Inexpensive 3-D technology starting to look real

By ADAM FLEMING (from www.presence-research.org), December 08, 2004

Say goodbye to your red-and-blue glasses. The once-great gimmick turned movie-house nostalgia could be in the waning hours of its twilight years, as scientists at the Pittsburgh Supercomputing Center push forward with research in the blossoming field of 3-D technology, otherwise known as stereo visualization. Stuart Pomerantz and Joel Stiles hope to lower the cost and increase the convenience of displaying images and movies in 3-D for large groups of people. “We wanted to be able to show what we do in stereo, but do it, more or less, at the drop of a hat,” Stiles said, “or at very high quality, but very low cost compared to one of these gigantic, multi-projector, multi-screen systems.”

The stereo-visualization process adopted by Pomerantz and Stiles involves two separate projectors. Each projector has a linear filter in front of its lens that polarizes the image it projects. Both images are then shown on one screen that is specially designed not to depolarize the images. By wearing a pair of sunglasses, for which each lens is polarized differently, the viewer receives separate images for each eye. And that, in effect, is the essence of viewing in 3-D. “You’ve got to see different images in each eye, just as we always do naturally,” Stiles explained.

While reading this article, try covering your left eye. Now cover your right eye while uncovering your left, and you’ll notice that the paper appears to shift slightly. This is because humans see in stereo by forming a composite of two images. Stereo visualization, at its best, is an imitation of this natural process.

Attaching polarizing lenses to projectors is not a new development, but Pomerantz and Stiles have coupled the process with new content and playback software. “What we needed to do new was create a pipeline for creating content in the form of movie files,” Stiles said. “We wanted to use stereo as a routine thing, instead of a special case or a one-off demo.”

Professors at Pitt have already incorporated stereo visualization in the classroom. Kenneth Jordan and his colleagues in the chemistry department “designed and constructed a 3-D stereo-visualization system in one of the main lecture halls in the Chevron Science Center,” according to an October 2002 article in the University of Pittsburgh Teaching Times. The system in Chevron allows professors to display complex molecules and structures in 3-D, as opposed to the flat models found in textbooks and drawn on chalkboards.

With stereo visualization appearing in labs and classrooms, how long will it be until methods of 3-D are available in movie theaters, or even living rooms? For now, the technology is willing, but the space is weak. The projected file size of a feature-length film, packaged for stereo visualization, would be too big for any widely available equipment. But with constant improvements being made in the storage capacity of portable disks, there may one day be a triumphant return of 3-D movies, sans those old paper glasses.
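The depth cue the polarized setup delivers is binocular disparity: each eye's image of a point is shifted horizontally, and the shift shrinks with distance. A minimal pinhole-model sketch in Python (the eye-separation and focal-length values are illustrative assumptions):

```python
def screen_disparity(depth, eye_sep=0.065, focal=0.5):
    """Horizontal offset between the left- and right-eye projections of a
    point `depth` metres away (similar triangles: d = eye_sep * focal / depth)."""
    return eye_sep * focal / depth

# Nearer points separate more on screen; the polarizing filters route one
# image to each eye, and the brain reads the offset back as depth.
assert screen_disparity(1.0) > screen_disparity(4.0)
```

This is also why the cover-one-eye experiment in the article works: the "shift" you see between eyes is exactly the disparity a stereo renderer has to reproduce.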

Dec 10, 2004

Virtual Reality in rehabilitation: the IREX system

IREX Exercise Applications

Virtual reality rehabilitation therapy applications enhance a patient’s rehab experience by immersing the patient in a virtual world. While in the virtual environment, the patient is motivated by seeing himself or herself engaging in various sports and games. This dramatically improves the patient’s focus on, and compliance with, the activity in therapy.


Virtual reality sport and game environments aid clinicians in the development of exercise programs geared towards the creation of therapeutic treatment protocols.

Dec 09, 2004

Neurotiv project

According to the recent “ISTAG Scenarios for Ambient Intelligence 2010” report (European Commission, February 2001), the evolutionary technology scenarios supporting the Knowledge Society of the 2000s will be rooted in three dominant trends:

- Pervasive diffusion of intelligence in the space around us, through the development of network technologies towards the objective of so-called “Ambient Intelligence” (AmI);
- An increasingly relevant role for mobility, through the development of mobile communications, from UMTS towards the so-called “Fourth Generation” (4G);
- Increased richness and completeness of communications, through the development of multimedia technologies, towards “Immersive Virtual Telepresence” (TIV), including greater attention to human perception and person-machine interaction.

The TIV perspective is reached through the full development of multimedia technologies, generating a sense of presence at a distance through the integrated availability of sound, vision, smell, and touch-and-feel (haptic) signals. Typically, the sense of presence is accomplished through multisensory stimuli by means of which actual reality is either hidden and substituted with a virtual (i.e., fully synthetic) reality, or is virtualized, i.e., made virtual through audio and 3-D video analysis and modeling procedures.

The convergence of AmI, 4G and TIV technologies manifests itself as the next frontier of ICT. This convergence will determine the advent of ubiquitous 3-D telecommunications and the built-up of intelligent environments for health care in which complex multimedia contents integrate and enrich the real space. The most ambitious objective is to integrate the computer interfaces in the real environment (Mixed Reality) so that the user can take advantage of the clinical care in the most natural and intuitive way.

The Neurotiv Project aims at improving the know-how level, at generating new development and application opportunities, at organizing and finalizing multidisciplinary skills, and at developing system and component prototypes for the use of TIV technologies in a managed care system for neuropsychology and clinical psychology.

More in detail, the project aims at:

- proving the technical and clinical viability of integrating TIV systems in a managed care platform for neuropsychology assessment and rehabilitation;
- designing, tuning, and developing managed care tools to be used in the prevention, planning, provision, and follow-up of the required treatment (the developed modules will be optimised to meet the demands of the emerging 2.5G/GPRS and 3G/UMTS wireless networks and terminals);
- defining new treatment protocols for the use of the clinical tools in assessment, therapy, and follow-up;
- verifying the clinical, economic, and organizational efficacy of the managed care system through controlled clinical trials;
- disseminating the obtained results through scientific papers and conference presentations.

More to explore

The NEUROTIV project web site

EMMA project

The main goal of the EMMA project - a European Community funded research project (IST-2001-39192) - is to study the relationships between presence and emotions. In particular, after analyzing the possible emotional impact of highly compelling synthetic experiences characterized by a high level of presence, the EMMA project aims to develop "mood devices" able to induce different forms of mood enhancement in both clinical and non-clinical samples.
This research will help us better understand the development of some psychopathological phenomena and develop "new corrective experiences and learnings" to cope with those psychopathological experiences.
Furthermore, the EMMA project intends to develop innovative tools to be used with three different groups of real users:

* users of (real world) mental health services, such as treatment for anxiety disorders, depression, and so on
* users with acute restricted mobility (e.g. designed experiences for hospital inpatients), and
* the general population, for mood enhancement (relaxation environments through TV or VR).

The VEPSY project

VEPSY is a European-Union funded research project for Telemedicine and Portable Virtual Environments for Clinical Psychology.

The project ended in July 2003. VEPSY-updated has involved partners from an international network of academic institutions and industrial companies.

The main goal of the project was to prove the technical and clinical viability of using portable and shared Virtual Reality systems in clinical psychology.

The project has provided innovative VR-based tools for the treatment of patients, clinical trials to verify their viability, and dissemination of its results.

In 2004 the VEPSY project won the Honourable Mention Award at the European e-Health Award 2004.

New eDimensional VirtualFX Brings Mind-Blowing 3D to Your XBOX, PS2 and More


Games Press 02/12/2004

(West Palm Beach, FL – December 2, 2004) eDimensional, the
leading manufacturer and worldwide distributor of cutting
edge gaming and virtual reality accessories, announces the
release of the new VirtualFX 2D to 3D TV Converter – bringing
a true virtual reality experience to your standard home television.

At just $129.95 including 2 pairs of wireless 3D glasses,
eDimensional's new VirtualFX instantly converts any existing
video game into a mind-blowing 3D experience, giving players
the most lifelike gaming environment ever created – literally
putting them inside the game. Fighter planes seemingly buzz
by just inches away, racecars zoom at awesome velocity, and
First Person Shooters are suddenly a battlefield reality.

In addition to enhancing the gaming experience, the
proprietary E-D technology can also be used to watch DVDs and
even live TV in real 3D on a standard television (plasma, LCD
and projection screens are not supported).

"Our E-D 3D Glasses for the PC have been extremely popular
for years, but our recent breakthrough allows us to finally
bring that same amazing 3D effect eDimensional is known for
to the TV." explains Michael Epstein, president of
eDimensional. "With our new VirtualFX we are revolutionizing
the home entertainment experience – more interactivity, more
immersion, more realistic graphics and more exciting effects
– giving gamers and movie-watchers alike a truly mind-blowing
3D experience."

The VirtualFX package comes complete with two pair of
wireless 3D glasses, one converter box and a remote control.
Installation of the VirtualFX is a snap; it hooks up just
as easily as a regular DVD player. A dual-emitter transmitter
is utilized to give the widest viewing angle and range
possible and to beam a signal to perfectly synchronize the
refresh rate of the screen with the glasses. This
transmission also allows additional users to join in with their
own pairs of wireless glasses, which can be purchased
individually for just $25.
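The synchronization scheme described suggests field-sequential 3D: the television alternates left- and right-eye frames while the glasses open the matching shutter. A Python sketch of that timing logic (the frame-parity convention and refresh figures are illustrative, not eDimensional's published specification):

```python
def open_eye(frame_index):
    """Which shutter is open during a given video frame (even frames = left)."""
    return "left" if frame_index % 2 == 0 else "right"

def per_eye_rate(display_hz):
    """Each eye sees every other frame, so half the display's refresh rate."""
    return display_hz / 2

assert open_eye(0) == "left" and open_eye(1) == "right"
assert per_eye_rate(120) == 60.0
```

The halved per-eye rate is why such systems can flicker on slower displays, and why the transmitter's job of keeping the shutters locked to the screen's refresh matters so much.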

The VirtualFX is available now through www.eDimensional.com.
As a bonus offer for the holidays, eDimensional is including
three 3D IMAX DVDs (A $60 value), originally shown in IMAX
theaters and specially made for exceptional 3D effects.

About eDimensional

eDimensional was founded in 2000 by a group of gamers at
heart, dedicated to creating the most realistic gaming and
entertainment experience ever. Thanks to fantastic customer
feedback and swift success, eDimensional has grown rapidly
and emerged as the leading manufacturer and worldwide
distributor of cutting-edge gaming accessories.
eDimensional's flagship product, the E-D 3D Gaming System,
was released to critical acclaim, and has since received an
unprecedented number of awards and accolades for providing
the most realistic PC viewing experience. For more
information on eDimensional and its wide array of gaming
accessories, go to eDimensional.com.
