Oct 12, 2014

New Scientist on new virtual reality headset Oculus Rift

From New Scientist

The latest prototype of virtual reality headset Oculus Rift allows you to step right into the movies you watch and the games you play

AN OLD man sits across the fire from me, telling a story. An inky dome of star-flecked sky arcs overhead as his words mingle with the crackling of the flames. I am entranced.

This isn't really happening, but it feels as if it is. This is a program called Storyteller – Fireside Tales that runs on the latest version of the Oculus Rift headset, unveiled last month. The audiobook software harnesses the headset's virtual reality capabilities to deepen immersion in the story. (See also "Plot bots")

Fireside Tales is just one example of a new kind of entertainment that delivers convincing true-to-life experiences. Soon films will get a similar treatment.

Movie company 8i, based in Wellington, New Zealand, plans to make films specifically for Oculus Rift. These will be more immersive than just mimicking a real screen in virtual reality because viewers will be able to step inside and explore the movie while they are watching it.

"We are able to synthesise photorealistic views in real-time from positions and directions that were not directly captured," says Eugene d'Eon, chief scientist at 8i. "[Viewers] can not only look around a recorded experience, but also walk or fly. You can re-watch something you love from many different perspectives."

The latest generation of games for Oculus Rift are more innovative, too. Black Hat Oculus is a two-player, cooperative game designed by Mark Sullivan and Adalberto Garza, both graduates of MIT's Game Lab. One headset is for the spy, sneaking through guarded buildings on missions where detection means death. The other player is the overseer, with a God-like view of the world, warning the spy of hidden traps, guards and passageways.

Deep immersion is now possible because the latest Oculus Rift prototype – known as Crescent Bay – finally delivers full positional tracking. This means the images that you see in the headset move in sync with your own movements.

This is the key to unlocking the potential of virtual reality, says Hannes Kaufmann at the Technical University of Vienna in Austria. The headset's high-definition display and wraparound field of view are nice additions, he says, but they aren't essential.
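As a rough illustration of what positional tracking adds (a minimal sketch, not Oculus SDK code; every name in it is hypothetical), the renderer rebuilds the virtual camera from the full tracked head pose every frame, so translating your head translates the viewpoint rather than just rotating it:

```python
import numpy as np

def view_matrix(head_position, head_rotation):
    """Build a 4x4 view matrix from a tracked head pose.

    head_position: (3,) head translation in metres, reported by the tracker.
    head_rotation: (3, 3) head orientation as a rotation matrix.
    The view matrix is the inverse of the head's world transform,
    so the rendered scene moves opposite to the head.
    """
    view = np.eye(4)
    view[:3, :3] = head_rotation.T                  # inverse rotation
    view[:3, 3] = -head_rotation.T @ head_position  # inverse translation
    return view

# Each frame: poll the tracker, rebuild the camera, render both eye views.
# Orientation-only tracking keeps head_position fixed, so leaning or stepping
# sideways has no effect; full positional tracking updates it, which is what
# keeps the virtual world locked in place as you move.
```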

The next step, says Kaufmann, is to allow people to see their own virtual limbs, not just empty space, in the places where their brain expects them to be. That's why Beijing-based motion capture company Perception, which raised more than $500,000 on Kickstarter in September, is working on a full-body suit that captures the wearer's body pose and gives haptic feedback – a sense of touch – in return. Software like Fireside Tales will then be able to take your body position into account.

In the future, humans will be able to direct live virtual experiences themselves, says Kaufmann. "Imagine you're meeting an alien in the virtual reality, and you want to shake hands. You could have a real person go in there and shake hands with you, but for you only the alien is present."

Oculus, which was bought by Facebook in July for $2 billion, has not yet announced when the headset will be available to buy.

This article appeared in print under the headline "Deep and meaningful"

Oct 06, 2014

Is the metaverse still alive?

Over the last decade, online virtual worlds such as Second Life have become enormously popular. Since their appearance on the technology landscape, many analysts have regarded shared 3D virtual spaces as a disruptive innovation, one that would render the Web itself obsolete.

This high expectation attracted significant investments from large corporations such as IBM, which started building their own virtual spaces and offices in the metaverse. Then, when it became clear that these promises would not be kept, disillusionment set in and virtual worlds started losing their edge. This, however, is not a new phenomenon in high-tech; it happens over and over again.

The US consulting company Gartner has developed a very popular model to describe this effect, called the “Hype Cycle”. The Hype Cycle provides a graphic representation of the maturity and adoption of technologies and applications.

It consists of five phases, which show how emerging technologies will evolve.

In the first, “technology trigger” phase, a new technology is launched and attracts the interest of the media. This is followed by the “peak of inflated expectations”, characterized by a proliferation of positive articles and comments, which generate excessive expectations among users and stakeholders.

In the next, “trough of disillusionment” phase, these exaggerated expectations are not fulfilled, resulting in a growing number of negative comments, generally followed by progressive indifference.

In the “slope of enlightenment”, the technology's potential for further applications becomes more broadly understood and an increasing number of companies start using it.

In the final, “plateau of productivity” stage, the emerging technology establishes itself as an effective tool and mainstream adoption takes off.

So what stage in the hype cycle are virtual worlds now?

After the 2006-2007 peak, metaverses entered the downward phase of the hype cycle, progressively losing media interest, investments and users. Many high-tech analysts still consider this decline an irreversible process.

However, the negative outlook that drove shared virtual worlds into the trough of disillusionment may soon be reversed, thanks to the new interest in virtual reality raised by the Oculus Rift (recently acquired by Facebook for $2 billion), Sony's Project Morpheus and similar immersive displays, which are still at the take-off stage of the hype cycle.

Oculus chief scientist Michael Abrash makes no secret of the fact that his main ambition has always been to build a metaverse like the one described in Neal Stephenson's 1992 cyberpunk novel Snow Crash. As he writes on the Oculus blog:

"Sometime in 1993 or 1994, I read Snow Crash and for the first time thought something like the Metaverse might be possible in my lifetime."

Furthermore, despite the negative comments and disappointed expectations, the metaverse keeps attracting new users: an infographic published for its 10th anniversary, on June 23rd 2013, reported that Second Life still had over 1 million monthly users around the world, more than 400,000 new accounts per month, and 36 million registered users.

So will Michael Abrash’s metaverse dream come true? Even if one looks into the crystal ball of the hype cycle, the answer is not easily found.

Sep 25, 2014

With Cyberith's Virtualizer, you can run around wearing an Oculus Rift

Sep 21, 2014

First hands-on: Crescent Bay demo

I just tested the Oculus Crescent Bay prototype at the Oculus Connect event in LA.

I still can't close my mouth.

The demo lasted about 10 minutes, during which several scenes were presented. The resolution and frame rate are astounding, and you can turn completely around. For the first time in my life, I can really say I was there.

I believe this is really the beginning of a new era for VR, and I am sure I won't sleep tonight thinking about the infinite possibilities and applications of this technology. And I don't think I am exaggerating – if anything, I am underestimating.

Aug 05, 2014

Life's a beach at work for Japanese company

A Japanese company has recreated a tropical beach in its reception area, which also serves as the employee meeting space and staff lounge.

Aug 03, 2014

Modulation of functional network with real-time fMRI feedback training of right premotor cortex activity

Modulation of functional network with real-time fMRI feedback training of right premotor cortex activity.

Neuropsychologia. 2014 Jul 21;

Authors: Hui M, Zhang H, Ge R, Yao L, Long Z

Abstract. Although the neurofeedback of real-time fMRI can reportedly enable people to gain control of the activity in the premotor cortex (PMA) during motor imagery, it is unclear how the neurofeedback training of PMA affects the motor network engaged in motor execution (ME) and motor imagery (MI) tasks. In this study, we investigated the changes in the motor network engaged in both ME and MI tasks induced by real-time neurofeedback training of the right PMA. The neurofeedback training induced changes in activity of the ME-related motor network as well as alterations in the functional connectivity of both the ME-related and MI-related motor networks. In particular, the percent signal change of the right PMA in the last training run was found to be significantly correlated with the connectivity between the right PMA and the left posterior parietal lobe (PPL) during the pre-training MI run, post-training MI run and the last training run. Moreover, the increase in the tapping frequency was significantly correlated with the increase of connectivity between the right cerebellum and the primary motor area / primary sensory area (M1/S1) of the ME-related motor network after neurofeedback training. These findings show the importance of the connectivity between the right PMA and left PPL of the MI network for the up-regulation of the right PMA, as well as the critical role of connectivity between the right cerebellum and M1/S1 of the ME network in improving behavioral performance.

Fly like a Birdly

Birdly is a full body, fully immersive, Virtual Reality flight simulator developed at the Zurich University of the Arts (ZHdK). With Birdly, you can embody an avian creature, the Red Kite, visualized through Oculus Rift, as it soars over the 3D virtual city of San Francisco, heightened by sonic, olfactory, and wind feedback.

Jul 30, 2014

A virtual rehabilitation program after amputation: a phenomenological exploration

A virtual rehabilitation program after amputation: a phenomenological exploration.

Disabil Rehabil Assist Technol. 2013 Nov;8(6):511-5

Authors: Moraal M, Slatman J, Pieters T, Mert A, Widdershoven G

Abstract. PURPOSE: This study provides an analysis of bodily experiences of a man with a lower leg amputation who used a virtual rehabilitation program. METHOD: The study reports data from semi-structured interviews with a 32-year-old veteran who used a virtual environment during rehabilitation. The interviews were analyzed using interpretative phenomenological analysis (IPA). RESULTS: During this rehabilitation program, he initially experienced his body as an object, which he had to handle carefully. As he went along with the training sessions, however, he was more stimulated to react directly without being aware of the body's position. In order to allow himself to react spontaneously, he needed to gain trust in the device. This was fostered by his narrative, in which he stressed how the device mechanically interacts with his movements. CONCLUSION: The use of a virtual environment facilitated the process of re-inserting one's body into the flow of one's experience in two opposite, but complementary ways: (1) it invited this person to move automatically without taking into account his body; (2) it invited him to take an instrumental or rational view on his body. Both processes fostered his trust in the device, and ultimately in his body. IMPLICATIONS FOR REHABILITATION: Providing (more) technological explanation of the technological device (i.e. the virtual environment) may facilitate a rehabilitation process. Providing (more) explicit technological feedback during training sessions in a virtual environment may facilitate a rehabilitation process.

Jul 09, 2014

Experiential Virtual Scenarios With Real-Time Monitoring (Interreality) for the Management of Psychological Stress: A Block Randomized Controlled Trial

Gaggioli, A., Pallavicini, F., Morganti, L. et al. (2014) Journal of Medical Internet Research. 16(7):e167. DOI: 10.2196/jmir.3235

The recent convergence between technology and medicine is offering innovative methods and tools for behavioral health care. Among these, an emerging approach is the use of virtual reality (VR) within exposure-based protocols for anxiety disorders, and in particular posttraumatic stress disorder. However, no systematically tested VR protocols are available for the management of psychological stress. Objective: Our goal was to evaluate the efficacy of a new technological paradigm, Interreality, for the management and prevention of psychological stress. The main feature of Interreality is a twofold link between the virtual and the real world achieved through experiential virtual scenarios (fully controlled by the therapist, used to learn coping skills and improve self-efficacy) with real-time monitoring and support (identifying critical situations and assessing clinical change) using advanced technologies (virtual worlds, wearable biosensors, and smartphones).

Full text paper available at: http://www.jmir.org/2014/7/e167/

Jun 30, 2014

Never do a Tango with an Eskimo

Apr 15, 2014

A post-stroke rehabilitation system integrating robotics, VR and high-resolution EEG imaging

A post-stroke rehabilitation system integrating robotics, VR and high-resolution EEG imaging.

IEEE Trans Neural Syst Rehabil Eng. 2013 Sep;21(5):849-59

Authors: Steinisch M, Tana MG, Comani S

Abstract
We propose a system for the neuro-motor rehabilitation of upper limbs in stroke survivors. The system is composed of a passive robotic device (Trackhold) for kinematic tracking and gravity compensation, five dedicated virtual reality (VR) applications for training of distinct movement patterns, and high-resolution EEG for synchronous monitoring of cortical activity. In contrast to active devices, the Trackhold omits actuators for increased patient safety and acceptance levels, and for reduced complexity and costs. VR applications present all relevant information for task execution as easy-to-understand graphics that do not need any written or verbal instructions. High-resolution electroencephalography (HR-EEG) is synchronized with kinematic data acquisition, allowing for the epoching of EEG signals on the basis of movement-related temporal events. Two healthy volunteers participated in a feasibility study and performed a protocol suggested for the rehabilitation of post-stroke patients. Kinematic data were analyzed by means of in-house code. Open source packages (EEGLAB, SPM, and GMAC) and in-house code were used to process the neurological data. Results from kinematic and EEG data analysis are in line with knowledge from currently available literature and theoretical predictions, and demonstrate the feasibility and potential usefulness of the proposed rehabilitation system to monitor neuro-motor recovery.
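The "epoching" the authors describe is a standard operation: the continuous EEG is cut into fixed windows time-locked to events taken from the kinematic stream (for example, movement onsets). A generic sketch of that step, not the authors' EEGLAB/SPM pipeline, with all names and window lengths chosen purely for illustration:

```python
import numpy as np

def epoch_eeg(eeg, event_times, fs, t_min=-0.5, t_max=1.0):
    """Cut continuous EEG into epochs locked to movement-related events.

    eeg:         (n_channels, n_samples) continuous recording
    event_times: event times in seconds, e.g. movement onsets from tracking
    fs:          sampling rate in Hz
    Returns an (n_events, n_channels, n_window) array; events whose window
    falls outside the recording are dropped.
    """
    start_off, stop_off = int(t_min * fs), int(t_max * fs)
    epochs = []
    for t in event_times:
        onset = int(round(t * fs))
        start, stop = onset + start_off, onset + stop_off
        if start >= 0 and stop <= eeg.shape[1]:
            epochs.append(eeg[:, start:stop])
    if not epochs:
        return np.empty((0, eeg.shape[0], stop_off - start_off))
    return np.stack(epochs)
```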

Avegant - Glyph Kickstarter - Wearable Retinal Display

Via Mashable

Move over Google Glass and Oculus Rift, there's a new kid on the block: Glyph, a mobile, personal theater.

Glyph looks like a normal headset and operates like one, too. That is, until you move the headband down over your eyes and it becomes a fully-functional visual visor that displays movies, television shows, video games or any other media connected via the attached HDMI cable.

Using Virtual Retinal Display (VRD), a technology that mimics the way we see light, the Glyph projects images directly onto your retina via one million micromirrors in each eyepiece. These micromirrors reflect the images onto the retina, producing a reportedly crisp and vivid image.

Apr 06, 2014

Measuring the effects through time of the influence of visuomotor and visuotactile synchronous stimulation on a virtual body ownership illusion

Measuring the effects through time of the influence of visuomotor and visuotactile synchronous stimulation on a virtual body ownership illusion.

Perception. 2014;43(1):43-58

Authors: Kokkinara E, Slater M

Abstract. Previous studies have examined the experience of owning a virtual surrogate body or body part through specific combinations of cross-modal multisensory stimulation. Both visuomotor (VM) and visuotactile (VT) synchronous stimulation have been shown to be important for inducing a body ownership illusion, each tested separately or both in combination. In this study we compared the relative importance of these two cross-modal correlations, when both are provided in the same immersive virtual reality setup and the same experiment. We systematically manipulated VT and VM contingencies in order to assess their relative role and mutual interaction. Moreover, we present a new method for measuring the induced body ownership illusion through time, by recording reports of breaks in the illusion of ownership ('breaks') throughout the experimental phase. The balance of the evidence, from both questionnaires and analysis of the breaks, suggests that while VM synchronous stimulation contributes the greatest to the attainment of the illusion, a disruption of either (through asynchronous stimulation) contributes equally to the probability of a break in the illusion.

Glass brain flythrough: beyond neurofeedback

Via Neurogadget

Researchers have developed a new way to explore the human brain in virtual reality. The system, called Glass Brain, was developed by Philip Rosedale, creator of the famous virtual world Second Life, and Adam Gazzaley, a neuroscientist at the University of California, San Francisco. It combines brain scanning, brain recording and virtual reality to allow a user to journey through a person's brain in real time.

Read the full story on Neurogadget

Mar 03, 2014

Virtual reality for the assessment of frontotemporal dementia, a feasibility study

Virtual reality for the assessment of frontotemporal dementia, a feasibility study.

Disabil Rehabil Assist Technol. 2014 Feb 14;

Authors: Mendez MF, Joshi A, Jimenez E

Abstract
Purpose: Behavioral variant frontotemporal dementia (bvFTD) is a non-Alzheimer dementia characterized by difficulty in documenting social-emotional changes. Few investigations have used virtual reality (VR) for documentation and rehabilitation of non-Alzheimer dementias. Methods: Five bvFTD patients underwent insight interviews while immersed in a virtual environment. They were interviewed by avatars, their answers were recorded, and their heart rates were monitored. They were asked to give ratings of their stress immediately at the beginning and at the end of the session. Results: The patients tolerated the head-mounted display and VR without nausea or disorientation, heart rate changes, or worsening stress ratings. Their insight responses were comparable to real world interviews. All bvFTD patients showed their presence in the VR environment as they moved their heads to face and respond to each avatar's questions. The bvFTD patients tended to greater verbal elaboration of answers with larger mean length of utterances compared to their real world interviews. Conclusions: VR is feasible and well-tolerated in bvFTD. These patients may have VR responses comparable to real world performance and they may display a presence in the virtual environment which could even facilitate assessment. Further research can explore the promise of VR for the evaluation and rehabilitation of dementias beyond Alzheimer's disease. Implications for Rehabilitation: Clinicians need effective evaluation and rehabilitation strategies for dementia, a neurological syndrome of epidemic proportions and a leading cause of disability. Memory and cognitive deficits are the major disabilities and targets for rehabilitation in Alzheimer's disease, the most common dementia. In contrast, social and emotional disturbances are the major disabilities and targets for rehabilitation in behavioral variant frontotemporal dementia (bvFTD), an incompletely understood non-Alzheimer dementia. Virtual reality is a technology that holds great promise for the evaluation and rehabilitation of patients with bvFTD and other non-Alzheimer dementias, and preliminary evidence suggests that this technology is feasible in patients with bvFTD.

Evaluation of a virtual reality prospective memory task for use with individuals with severe traumatic brain injury

Evaluation of a virtual reality prospective memory task for use with individuals with severe traumatic brain injury.

Neuropsychol Rehabil. 2014 Feb 24;

Authors: Canty AL, Fleming J, Patterson F, Green HJ, Man D, Shum DH

Abstract
The current study aimed to evaluate the sensitivity, convergent validity and ecological validity of a newly developed virtual reality prospective memory (PM) task (i.e., the Virtual Reality Shopping Task; VRST) for use with individuals with traumatic brain injury (TBI). Thirty individuals with severe TBI and 24 uninjured adults matched on age, gender and education level were administered the VRST, a lexical decision PM task (LDPMT), an index of task-friendliness and a cognitive assessment battery. Significant others rated disruptions in the TBI participants' occupational activities, interpersonal relationships and independent living skills. The performance of the TBI group was significantly poorer than that of controls on event-based PM as measured by the LDPMT, and on time- and event-based PM as measured by the VRST. Performance on the VRST significantly predicted significant others' ratings of patients' occupational activities and independent living skills. The VRST was rated as significantly more reflective of an everyday activity and more interesting than the LDPMT, and was afforded a higher recommendation. For the TBI group, event and total PM performance on the VRST significantly correlated with performance on measures of mental flexibility and verbal fluency, and total PM performance correlated with verbal memory. These results provide preliminary but promising evidence of the sensitivity, as well as the convergent and ecological validity, of the VRST.

Virtual Reality for sensorimotor rehabilitation post-stroke

Virtual Reality for Sensorimotor Rehabilitation Post-Stroke: The Promise and Current State of the Field.

Curr Phys Med Rehabil Reports. 2013 Mar;1(1):9-20

Authors: Fluet GG, Deutsch JE

Abstract
Developments over the past 2 years in virtual reality (VR) augmented sensorimotor rehabilitation of upper limb use and gait post-stroke were reviewed. Studies were included if they evaluated comparative efficacy between VR and standard of care and/or differences in VR delivery methods, and were CEBM (Centre for Evidence-Based Medicine) level 2 or higher. Eight upper limb and two gait studies were included and described using the following categories: hardware (input and output), software (virtual task, feedback and presentation), intervention (progression and dose), and outcomes. Trends in the field were commented on, gaps in knowledge identified, and areas of future research and translation of VR to practice were suggested.

Mar 02, 2014

3D Thought controlled environment via Interaxon

In this demo video, artist Alex McLeod shows an environment he designed for Interaxon to use at CES in 2011 (interaxon.ca/CES#).

The glasses display the scene in 3D, and attached sensors read users' brain states, which control elements of the scene.
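In concept, this kind of brain-state control reduces to mapping a continuously updated score from the sensors onto scene parameters. The sketch below is purely illustrative: the score, parameter names and mapping are assumptions for the sake of example, not Interaxon's actual API.

```python
def update_scene(calm_score, scene):
    """Map a normalized brain-state estimate (0 = agitated, 1 = calm),
    e.g. derived from an EEG headband, onto scene parameters.
    All names and mappings here are hypothetical."""
    calm = max(0.0, min(1.0, calm_score))  # clamp to [0, 1]
    scene["cloud_cover"] = 1.0 - calm      # sky clears as the user relaxes
    scene["aurora_brightness"] = calm      # aurora brightens when calm
    return scene

# Example frame update with a made-up score from the headset:
print(update_scene(0.8, {"cloud_cover": 0.5, "aurora_brightness": 0.5}))
```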

3D Thought controlled environment via Interaxon from Alex McLeod on Vimeo.

Humanlike robot hands controlled by brain activity arouse illusion of ownership in operators.

Humanlike robot hands controlled by brain activity arouse illusion of ownership in operators.

Sci Rep. 2013;3:2396

Authors: Alimardani M, Nishio S, Ishiguro H

Abstract
Operators of a pair of robotic hands report ownership of those hands when they hold an image of a grasp motion in mind and watch the robot perform it. We present a novel body ownership illusion that is induced merely by watching and controlling a robot's motions through a brain-machine interface (BMI). In past studies, body ownership illusions were induced by the correlation of sensory inputs such as vision, touch and proprioception. In the illusion presented here, however, none of these sensations is integrated except vision. Our results show that during BMI operation of robotic hands, the interaction between motor commands and visual feedback of the intended motions is adequate to incorporate the non-body limbs into one's own body. Our discussion focuses on the role of proprioceptive information in the mechanism of agency-driven illusions. We believe that our findings will contribute to the improvement of telepresence systems in which operators incorporate BMI-operated robots into their body representations.

Feb 09, 2014

A high-fidelity virtual environment for the study of paranoia

A high-fidelity virtual environment for the study of paranoia.

Schizophr Res Treatment. 2013;2013:538185

Authors: Broome MR, Zányi E, Hamborg T, Selmanovic E, Czanner S, Birchwood M, Chalmers A, Singh SP

Abstract. Psychotic disorders carry social and economic costs for sufferers and society. Recent evidence highlights the risk posed by urban upbringing and social deprivation in the genesis of paranoia and psychosis. Evidence based psychological interventions are often not offered because of a lack of therapists. Virtual reality (VR) environments have been used to treat mental health problems. VR may be a way of understanding the aetiological processes in psychosis and increasing psychotherapeutic resources for its treatment. We developed a high-fidelity virtual reality scenario of an urban street scene to test the hypothesis that virtual urban exposure is able to generate paranoia to a comparable or greater extent than scenarios using indoor scenes. Participants (n = 32) entered the VR scenario for four minutes, after which time their degree of paranoid ideation was assessed. We demonstrated that the virtual reality scenario was able to elicit paranoia in a nonclinical, healthy group and that an urban scene was more likely to lead to higher levels of paranoia than a virtual indoor environment. We suggest that this study offers evidence to support the role of exposure to factors in the urban environment in the genesis and maintenance of psychotic experiences and symptoms. The realistic high-fidelity street scene scenario may offer a useful tool for therapists.
