
Jul 26, 2007

Video Depth Illusion - Neato!

Via Neatorama and the Presurfer

Check it out:


19:32 Posted in Research tools | Permalink | Comments (0)

Jul 23, 2007

Simulating hemispatial neglect with virtual reality

Simulating hemispatial neglect with virtual reality.

J Neuroengineering Rehabil. 2007 Jul 19;4(1):27

Authors: Baheux K, Yoshizawa M, Yoshida Y

ABSTRACT: BACKGROUND: Hemispatial neglect is a cognitive disorder defined as a lack of attention for stimuli contralateral to the brain lesion. The assessment is traditionally done with basic pencil-and-paper tests, and the rehabilitation programs are generally not well adapted. We propose a virtual reality system featuring an eye-tracking device for a better characterization of the neglect that will lead to new rehabilitation techniques. METHODS: This paper presents a comparison of eye-gaze patterns of healthy subjects, patients and healthy simulated patients on a virtual line bisection test. The task was also executed with a reduced visual field condition, in the hope that fewer stimuli would limit the neglect. RESULTS: We found that patients and healthy simulated patients had similar eye-gaze patterns. However, while the reduced visual field condition had no effect on the healthy simulated patients, it actually had a negative impact on the patients. We discuss the reasons for these differences and how they relate to the limitations of the neglect simulation. CONCLUSIONS: We argue that with some improvements the technique could be used to determine the potential of new rehabilitation techniques and also help the rehabilitation staff or the patient's relatives to better understand the neglect condition.

Novel brain-scanning technology invented

Researchers from Siemens have developed a prototype MRI scanner that uses a lattice of small coils positioned around the head rather than the large coils you lie inside. As noted in this Technology Review article, the device is likely to have important applications in functional magnetic resonance imaging (fMRI), a variation of standard MRI that tracks blood flow in the brain as an indirect measure of activity.

The technique is often used to locate the parts of the brain that control specific functions, such as speech and movement. The first clinical application for the device will likely be fMRI for neurosurgery planning, says [Siemens MR vice president] Bundy. "Surgeons want to know where speech and motor areas are when they take a tumor out; the more precise, the better."


18:50 Posted in Research tools | Permalink | Comments (0) | Tags: research tools

Jul 13, 2007

A VR extended neuropsychological assessment for topographical disorientation

A virtual reality extended neuropsychological assessment for topographical disorientation: a feasibility study.

J Neuroengineering Rehabil. 2007 Jul 11;4(1):26

Authors: Morganti F, Gaggioli A, Strambi L, Rusconi ML, Riva G

ABSTRACT: BACKGROUND: Topographical disorientation is one of the main consequences of brain injury. Up to now, several methodological approaches have been used to assess brain-injured patients' navigational abilities, showing only a moderate correlation with the impairments observed in everyday contexts. METHODS: We propose a combination of standardized neuropsychological tests and a more situated virtual reality-based assessment for the evaluation of spatial orientation in brain-injured patients. RESULTS: When tested with this virtual reality-integrated procedure, patients showed performance and execution times congruent with their neuropsychological evaluation. When compared to a control group, patients showed significantly slower times and greater errors in solving virtual reality-based spatial tasks. CONCLUSIONS: The use of virtual reality, when combined with classical neuropsychological tests, can provide an effective tool for the study of topographical disorientation.

Jul 06, 2007

EEG-based assessment of driver cognitive responses in a dynamic virtual-reality driving environment

EEG-based assessment of driver cognitive responses in a dynamic virtual-reality driving environment.

IEEE Trans Biomed Eng. 2007 Jul;54(7):1349-52

Authors: Lin CT, Chung IF, Ko LW, Chen YC, Liang SF, Duann JR

Human error is a major cause of traffic fatalities, making driver performance an important public-safety issue. Accidents are mainly caused by drivers failing to perceive changes in traffic lights or unexpected events occurring on the road. In this paper, we devised a quantitative analysis for assessing drivers' cognitive responses by investigating the neurobiological information underlying electroencephalographic (EEG) brain dynamics in traffic-light experiments in a virtual-reality (VR) dynamic driving environment. The VR technique allows subjects to interact directly with the moving virtual environment instead of with monotonic auditory and visual stimuli, thereby providing interactive and realistic tasks without the risk of operating an actual vehicle. Independent component analysis (ICA) is used to separate and extract noise-free ERP signals from the multi-channel EEG signals. A temporal filter is used to solve the time-alignment problem of ERP features, and principal component analysis (PCA) is used to reduce feature dimensions. The dimension-reduced features are then input to a self-constructing neural fuzzy inference network (SONFIN) to recognize the different brain potentials evoked by red/green/yellow traffic events, reaching an average accuracy of 87% across eight subjects in this visual-stimulus ERP experiment. This demonstrates the feasibility of detecting and analyzing multiple streams of ERP signals that represent operators' cognitive states and responses to task events.
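The processing chain the abstract describes (ICA for artifact separation, PCA for dimensionality reduction, then a classifier) can be sketched roughly as below. This is an illustrative sketch, not the authors' code: the data are synthetic, the shapes and parameters are assumptions, and an off-the-shelf logistic-regression classifier stands in for the SONFIN fuzzy network.

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_channels, n_times = 32, 5000

# Step 1: ICA unmixing of the continuous multi-channel recording
# (separating independent components so artifacts can be discarded)
raw = rng.normal(size=(n_channels, n_times))          # channels x time
ica = FastICA(n_components=n_channels, random_state=0, max_iter=500)
sources = ica.fit_transform(raw.T).T                  # components x time

# Steps 2-3: epoch features -> PCA -> classifier (stand-in for SONFIN)
n_trials = 240
X = rng.normal(size=(n_trials, n_channels * 128))     # synthetic ERP epochs
y = rng.integers(0, 3, size=n_trials)                 # red/green/yellow events

pipeline = make_pipeline(PCA(n_components=20),
                         LogisticRegression(max_iter=1000))
scores = cross_val_score(pipeline, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

On real epoched EEG, the artifact-free components would be back-projected and epoched before classification; on this random data the accuracy naturally hovers around the 33% chance level.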

Jun 30, 2007

Mediated: How the Media Shapes Your World and the Way You Live in It

Via Networked Performance



"(W)e are living a fusion of real and unreal time, an ongoing undulation of overlays and intersections...It's most like the way good old-fashioned thinking and imagining work in relation to sensing and perceiving ... It says that back before representational technologies developed, before literacy itself, people were also living in a fusion of real and unreal time because they were daydreaming while they were doing this or that. Just having a mind is to be in unreal time as well as in real time ... What that says is that representational technologies have colonized our minds ... To the extent that our thoughts no longer wander around on their own, stocked only with materials drawn from direct experience, to the extent that they follow flows of representations instead--to just that extent that we don't think our own thoughts. Literally...

When the term first arose, "real-time" implied speed, intensified velocity. The medium doing the representing was transforming reality into representation immediately. The expression was first used in connection with digital processing of information ... It was a term of praise that focused on how fast a computer could record the file transaction as compared with paper-shuffling clerks. It wasn't until the fact that computers could keep up with events was taken for granted that we noticed that security cameras in public places were real-time media too. And nothing seems slower than those! How strange. Why is that? No editing. No manipulation of what is presented.

In the same way, an innovation like videoconferencing could surprise us with a real-time capacity that the telephone had all along. But we only noticed that a lot of analog media were in real time after computers achieved sufficient processing speed to do it too. It was the malleability of digital transformations that made the difference. The fact that we could now manipulate what had once just been conveyed on a screen or over a wire, that's what got the juices going. That's why "interactive" became the mother of all buzz words. The idea is that real time emerged when we became players on screens we had once viewed passively. The fusional loop of subject-object that is a video game expresses most cogently the thrill of real-time existence in unreal realms. You tweak the joystick and press the buttons and virtual swords flash and machine guns blaze in some tunnel on an asteroid in a distant galaxy--not as a result of, but as a function of, at the same time as, your fingers on the console. You exist as agent and instrument simultaneously in two places, in the meat world of fingers and consoles and the virtual world of cyborg warriors. Representational being incarnate. The primordial aim of the human imagination realized--literally "made real."

So "real time" is a compliment we pay to representations that reflect our agency either directly or in the way they conform to our designs subsequently ... Incidentally, remember when people thought that the Web was going to build bridges between communities and inspire cross-cultural understanding, etc.? ... The multiplication of niches has been so intense that the word fragmentation doesn't begin to describe it. What with these search worms and filters and custom advertising hooking you up with stuff you're already interested in ... you can spend your whole life online and never leave your own head." From Mediated: How the Media Shapes Your World and the Way You Live in It by Thomas De Zengotita.

18:49 Posted in Research tools | Permalink | Comments (0)

Jun 19, 2007

NeuroVR presented to the US Congress

a little self-promotion... ;-)

last week we presented NeuroVR, an open-source virtual reality software platform for clinical and neuroscience applications, to the Congressional Modeling & Simulation Caucus, during the CyberTherapy Reception.

The Reception was held on Wednesday, June 13 from 5-7 pm in the foyer of the Rayburn House Office Building, Washington, DC, USA.

read the full news release



Jun 14, 2007

IQR simulator for large scale neural systems

Via Neurobot 

Ulysses Bernardet from the Institute of Neuroinformatics of the University of Zurich and ETH Zurich has developed IQR, an efficient graphical environment for designing large-scale multi-level neuronal systems that can control real-world devices - robots in the broader sense - in real time.

IQR has been released as open-source under the GNU General Public License (GPL).

IQR neuronal simulator


16:01 Posted in Research tools | Permalink | Comments (0) | Tags: research tools

May 24, 2007

Wearable brain scanner

Via Medgadget



Hitachi has successfully trial manufactured a lightweight, portable brain scanner that enables users to keep tabs on their mental activity during the course of their daily lives. The system, which consists of a 400 gram (14 oz) headset and a 630 gram (1 lb 6 oz) controller worn on the waist, is the result of Hitachi’s efforts to transform the brain scanner into a familiar everyday item that anyone can use.

The rechargeable battery-operated mind reader relies on Hitachi’s so-called “optical topography” technology, which interprets mental activity based on subtle changes in the brain’s blood flow. Because blood flow increases to areas of the brain where neurons are firing (to supply glucose and oxygen to the tissue), changes in hemoglobin concentrations are an important index by which to measure brain activity. To measure these hemoglobin concentrations in real time, eight small surface-emitting lasers embedded in the headset fire harmless near-infrared rays into the brain and the headset’s photodiode sensors convert the reflected light into electrical signals, which are relayed to the controller.
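The step from measured light to hemoglobin concentrations is conventionally done with the modified Beer-Lambert law used in near-infrared spectroscopy: changes in optical density at two wavelengths are solved for the changes in oxy- and deoxy-hemoglobin. The sketch below illustrates that calculation only; the extinction coefficients, path length, and pathlength factor are placeholder values for illustration, not Hitachi's calibrated parameters.

```python
import numpy as np

# Extinction coefficients [1/(mM*cm)] at two wavelengths
# (rows: lambda1, lambda2; columns: oxy-Hb, deoxy-Hb) -- illustrative numbers
E = np.array([[1.5, 3.8],    # e.g. ~760 nm: deoxy-Hb absorbs more strongly
              [2.5, 1.8]])   # e.g. ~850 nm: oxy-Hb absorbs more strongly

d = 3.0      # source-detector separation [cm] (assumed)
dpf = 6.0    # differential pathlength factor, dimensionless (assumed)

def hemoglobin_changes(delta_od):
    """Solve delta_OD = E @ [dHbO, dHbR] * d * dpf for concentration changes."""
    return np.linalg.solve(E * d * dpf, delta_od)

# A hypothetical measured change in optical density at the two wavelengths
delta_od = np.array([0.010, 0.020])
d_hbo, d_hbr = hemoglobin_changes(delta_od)
print(f"delta HbO: {d_hbo:.5f} mM, delta HbR: {d_hbr:.5f} mM")
```

With these placeholder numbers the solver returns an oxy-hemoglobin increase and a deoxy-hemoglobin decrease, the signature typically associated with local neural activation.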

The real-time brain data can either be stored in Flash memory or sent via wifi to a computer for instant analysis and display. A single computer can support up to 24 mind readers at a time, allowing multiple users to monitor brain activity while communicating or engaging in group activities.

In addition to health and medical applications, Hitachi foresees uses for the personal mind reader in fields such as psychology, education and marketing. Although it is unclear what neuromarketing applications the company has in mind, it is pretty clear that access to real-time customer brain data would provide marketers with a better understanding of how and why shoppers make their purchasing decisions. One can also imagine interactive campaigns that, for example, ask customers to think positive thoughts about a certain product in exchange for discount coupons or the chance to win a prize.

The technology could also be used in new forms of entertainment such as “mind gaming,” where the player’s physical brain activity becomes a part of game play. It is also feasible to integrate the brain scanner with a remote control brain-machine interface that would allow users to operate electronic devices with their minds.

Apr 27, 2007

Mind-altering media

Via New Scientist

The electronic age is changing our brains, but are we getting smarter, or dumb and dangerous?

New Scientist investigates...




Apr 01, 2007

Learning and memory in virtual reality

Is learning and memory different in a virtual environment?

Clin Neuropsychol. 2007 Jan;21(1):146-61

Authors: Matheis RJ, Schultheis MT, Tiersky LA, Deluca J, Millis SR, Rizzo A

It has been suggested that virtual reality may provide a medium for producing neuropsychological measures with greater ecological validity. The present study examined the usefulness of virtual reality (VR) to assess learning and memory in individuals with traumatic brain injury (TBI). A total of 20 TBI participants were compared with 20 healthy controls on their ability to learn and recall 16 target items presented within a VR-based generic office environment. The results indicated that VR memory testing accurately distinguished the TBI group from controls. Additionally, non-memory-impaired TBI participants acquired targets at the same rate as healthy control (HC) participants. Finally, there was a significant relationship between the VR Office and a standard neuropsychological measure of memory, suggesting the construct validity of the task. These findings suggest that the VR Office provides a viable medium for measuring learning and memory. The present results provide preliminary support for the ecological validity of the VR Office, which ultimately can improve assessment of real-world functioning following TBI.

19:58 Posted in Research tools | Permalink | Comments (0) | Tags: virtual worlds

Mar 16, 2007

New eye-tracking system analyzes the interest level of TV viewers

Via Pink Tentacle

Eye-tracking system recognizes viewer interest

NTT Learning Systems (NTTLS) and the Visual Interactive Sensitivity Research Institute (VIS) have developed an eye-tracking system that analyzes the interest level of TV viewers and web surfers by monitoring their eye movements, pupil size and blinking.

read more >> 

21:41 Posted in Research tools | Permalink | Comments (0) | Tags: eye-tracking

Mar 15, 2007


Via Networked Performance 





LX 2.0: Contemporary Online Experiments: NeuroZappingFolks is a non-linear zapping through the Internet, a path leading to the inside of a web of relations, a web that can be explored from one tag to a site, to another tag, to another site... from word to image to word to image. NeuroZappingFolks is thus the simulation of a brain lost in the web (lost between servers, but also lost in the Internet's double identity: word and image).

00:54 Posted in Cyberart, Research tools | Permalink | Comments (0) | Tags: cyberart

Mar 10, 2007

Scientists claim first in using brain scans to predict intentions

Via kurzweilAI.net

Researchers at Berlin's Bernstein Center for Computational Neuroscience claim they have identified people's decisions about which high-level mental task they would later perform - in this case, adding versus subtracting.

Read the full story 



20:13 Posted in Research tools | Permalink | Comments (0) | Tags: research tools

Feb 25, 2007

The size-weight illusion in natural and virtual reality

Seeing size and feeling weight: the size-weight illusion in natural and virtual reality.

Hum Factors. 2007 Feb;49(1):136-44

Authors: Heineken E, Schulte FP

OBJECTIVE: We experimentally tested the degree to which the size-weight illusion depends on perceptual conditions that allow the observer to assume that both the visual and the kinesthetic stimuli of a weight seen and lifted emanate from the same object. We expected that the degree of the illusion would depend on the "realism" provided by different kinds of virtual reality (VR) when the weights are seen in virtual reality and at the same time lifted in natural reality. BACKGROUND: Welch and Warren (1980) reported that an intermodal influence can be expected only if perceptual information from different modalities is compellingly related to a single object. METHOD: Objects of different sizes and weights were presented to 50 participants in natural reality or in four virtual realities: two immersive head-mounted display VRs (with or without head tracking) and two nonimmersive desktop VRs (with or without screening from input of the natural environment using a visor). The objects' heaviness was scaled using the magnitude estimation method. RESULTS: The data show that the degree of the illusion is largest in immersive and lowest in nonimmersive virtual realities. CONCLUSION: The higher the degree of the illusion, the more compelling the situation is perceived to be, and the more the observed data correspond with the data predicted for the illusion in natural reality. This shows that the kind of mediating technology used strongly influences the presence experienced. APPLICATION: The size-weight illusion's sensitivity to conditions that affect the sense of presence makes it a promising objective presence measure.

Feb 19, 2007

Some notable moments in recorded life

Recent progress in miniaturization and storage capacity has made it possible to record, access, retrieve, and potentially share all the information generated over a person's or object's life experience.

Two of the most important projects in this area are LifeLog (initially funded by DARPA, then killed by the Pentagon in 2004) and Microsoft's MyLifeBits.

I am fascinated by how these new technologies could radically change psychotherapy and, more generally, how they could fundamentally affect our life.

In this article entitled On the Record, All the Time, Scott Carlson traces the story and the implications of the introduction of LifeLogging. In the article I found a list of some notable moments in "recorded life":

1900s: The Brownie camera makes photography available to the masses.

1940: President Franklin D. Roosevelt begins recording press conferences and some meetings.

1945: Vannevar Bush, a prominent American scientist, predicts a time when scientists will be photographing their lab work and storing their correspondence in a machine called a "memex."

1960s: Presidents John F. Kennedy and Lyndon B. Johnson record meetings and phone conversations for posterity, which later provides hundreds of hours of programming for C-Span.

1969: The microcassette goes on the market and becomes the voice-recording medium of choice.

1973: An American Family, documenting the domestic drama of the Louds, is the first reality-TV show.

1973-74: President Richard Nixon releases the Watergate tapes (just some of more than 3,500 hours of conversations that he had recorded), which leads to his resignation.

Late 1970s: Steve Mann, a professor at the University of Toronto, begins dabbling in wearable computing.

Mid-1980s: Fitness nuts are wearing stretch pants and leggings, along with wristwatch-sized devices that measure heart rate and blood pressure. The heart monitors can cost $200 or more.

1991: The first Webcam goes online.

Mid-1990s: Cellphones, digital cameras, and the Internet become commonplace.

1995: Gordon Bell, a computer engineer and entrepreneur, gets involved with Microsoft Research and begins work that will lead him to record various aspects of his life for the MyLifeBits project.

1999: Microsoft Research invents prototype SenseCams, cameras that hang around the neck and continuously snap pictures.

2000: Scrapbooking has a renaissance, leading to new retail stores devoted to a hobby industry now worth $2-billion.

2003: MySpace debuts.

2004: The Final Cut, starring Robin Williams, describes a future where memories are recorded on implanted chips. The Defense Advanced Research Projects Agency drops a lifelogging project amid a furor over privacy. A workshop on the "Continuous Archival and Retrieval of Personal Experiences" convenes at Columbia University.

2005: YouTube appears.

2006: Nokia releases Lifeblog 2.0, which allows people to upload audio notes, photographs, location information, and other records of life events to a database.

Feb 17, 2007

Action Video Games Sharpen Vision 20 Percent

From Medgadget

According to researchers at the University of Rochester, video games that contain high levels of action, such as Unreal Tournament, can actually improve your vision:

Researchers at the University of Rochester have shown that people who played action video games for a few hours a day over the course of a month improved by about 20 percent in their ability to identify letters presented in clutter--a visual acuity test similar to ones used in regular ophthalmology clinics.

In essence, playing video games improves your bottom line on a standard eye chart...

Bavelier [Daphne Bavelier is Professor of brain and cognitive sciences at the University of Rochester --ed.] and graduate student Shawn Green tested college students who had played few, if any, video games in the last year. "That alone was pretty tough," says Green. "Nearly everybody on a campus plays video games."

At the outset, the students were given a crowding test, which measured how well they could discern the orientation of a "T" within a crowd of other distracting symbols--a sort of electronic eye chart. Students were then divided into two groups. The experimental group played Unreal Tournament, a first-person shoot-'em-up action game, for roughly an hour a day. The control group played Tetris, a game equally demanding in terms of motor control, but visually less complex.

After about a month of near-daily gaming, the Tetris players showed no improvement on the test, but the Unreal Tournament players could tell which way the "T" was pointing much more easily than they had just a month earlier.

"When people play action games, they're changing the brain's pathway responsible for visual processing," says Bavelier. "These games push the human visual system to the limits and the brain adapts to it. That learning carries over into other activities and possibly everyday life."

The improvement was seen not only in the part of the visual field where video game players typically play, but also beyond--the part of your vision beyond the monitor. The students' vision improved in the center and at the periphery where they had not been "trained." That suggests that people with visual deficits, such as amblyopic patients, may also be able to gain an increase in their visual acuity with special rehabilitation software that reproduces an action game's need to identify objects very quickly.

20:58 Posted in Research tools | Permalink | Comments (0) | Tags: research tools

Brain Scan Mind Reading 70% Accurate

Via NeuroGuy 



Researchers are making progress in predicting the intentions of subjects with reasonable accuracy:

Now researchers have been able to decode these secret intentions from patterns of brain activity. They let subjects freely and covertly choose between two possible tasks - to either add or subtract two numbers. They were then asked to hold their intention in mind for a while until the relevant numbers were presented on a screen. The researchers were able to recognize the subjects' intentions with 70% accuracy based on their brain activity alone - even before the participants had seen the numbers and started to perform the calculation.


That quote is from a press release from the Max Planck Institute, Revealing secret intentions in the brain
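The generic approach behind decoding results like this is multi-voxel pattern classification: train a classifier on activity patterns labeled by intention, then test it on held-out trials. The sketch below illustrates the idea only; the simulated data, signal strength, and choice of a linear SVM are assumptions, not details from the Max Planck study.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials, n_voxels = 80, 200
y = rng.integers(0, 2, size=n_trials)            # 0 = add, 1 = subtract

# Simulated voxel patterns: noise plus a weak class-dependent signal
signal = rng.normal(size=n_voxels)
X = rng.normal(size=(n_trials, n_voxels)) + 0.3 * np.outer(y, signal)

# Cross-validated decoding accuracy of a linear classifier
clf = LinearSVC(C=1.0, max_iter=10000)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.0%}")           # well above the 50% chance level
```

Whether such a classifier beats chance - and by how much - is what separates genuine "brain reading" from noise; the reported 70% is modest but reliably above the 50% baseline for a two-choice task.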

20:23 Posted in Research tools | Permalink | Comments (0) | Tags: research tools

Feb 06, 2007

Special issue on ToM

The new issue (number 3-4, September-December 2006) of Social Neuroscience is dedicated to the topic of Theory of Mind.

The special issue is edited by Rebecca Saxe and Simon Baron-Cohen


Jan 22, 2007

The mystery of consciousness

Steven Pinker, professor of psychology at Harvard University, has an article in Time magazine about the current state of understanding of consciousness.

From the article:

So neuroscientists are well on the way to identifying the neural correlates of consciousness, a part of the Easy Problem. But what about explaining how these events actually cause consciousness in the sense of inner experience--the Hard Problem?

To appreciate the hardness of the Hard Problem, consider how you could ever know whether you see colors the same way that I do. Sure, you and I both call grass green, but perhaps you see grass as having the color that I would describe, if I were in your shoes, as purple. Or ponder whether there could be a true zombie--a being who acts just like you or me but in whom there is no self actually feeling anything. This was the crux of a Star Trek plot in which officials wanted to reverse-engineer Lieut. Commander Data, and a furious debate erupted as to whether this was merely dismantling a machine or snuffing out a sentient life.

No one knows what to do with the Hard Problem. Some people may see it as an opening to sneak the soul back in, but this just relabels the mystery of "consciousness" as the mystery of "the soul"--a word game that provides no insight.

Many philosophers, like Daniel Dennett, deny that the Hard Problem exists at all. Speculating about zombies and inverted colors is a waste of time, they say, because nothing could ever settle the issue one way or another. Anything you could do to understand consciousness--like finding out what wavelengths make people see green or how similar they say it is to blue, or what emotions they associate with it--boils down to information processing in the brain and thus gets sucked back into the Easy Problem, leaving nothing else to explain. Most people react to this argument with incredulity because it seems to deny the ultimate undeniable fact: our own experience.

The most popular attitude to the Hard Problem among neuroscientists is that it remains unsolved for now but will eventually succumb to research that chips away at the Easy Problem. Others are skeptical about this cheery optimism because none of the inroads into the Easy Problem brings a solution to the Hard Problem even a bit closer. Identifying awareness with brain physiology, they say, is a kind of "meat chauvinism" that would dogmatically deny consciousness to Lieut. Commander Data just because he doesn't have the soft tissue of a human brain. Identifying it with information processing would go too far in the other direction and grant a simple consciousness to thermostats and calculators--a leap that most people find hard to stomach. Some mavericks, like the mathematician Roger Penrose, suggest the answer might someday be found in quantum mechanics. But to my ear, this amounts to the feeling that quantum mechanics sure is weird, and consciousness sure is weird, so maybe quantum mechanics can explain consciousness.

And then there is the theory put forward by philosopher Colin McGinn that our vertigo when pondering the Hard Problem is itself a quirk of our brains. The brain is a product of evolution, and just as animal brains have their limitations, we have ours. Our brains can't hold a hundred numbers in memory, can't visualize seven-dimensional space and perhaps can't intuitively grasp why neural information processing observed from the outside should give rise to subjective experience on the inside. This is where I place my bet, though I admit that the theory could be demolished when an unborn genius--a Darwin or Einstein of consciousness--comes up with a flabbergasting new idea that suddenly makes it all clear to us.

21:55 Posted in Research tools | Permalink | Comments (0) | Tags: research tools