
Dec 21, 2013

New Scientist: Mind-reading light helps you stay in the zone

Re-blogged from New Scientist

WITH a click of a mouse, I set a path through the mountains for drone #4. It's one of five fliers under my control, all now heading to different destinations. Routes set, their automation takes over and my mind eases, bringing a moment of calm. But the machine watching my brain notices the lull, decides I can handle more, and drops a new drone in the south-east corner of the map.

The software is keeping my brain in a state of full focus known as flow, or being "in the zone". Too little work, and the program notices my attention start to flag and gives me more drones to handle. If I start to become a frazzled air traffic controller, the computer takes one of the drones off my plate, usually without me even noticing.

The system monitors my workload by pulsing light into my prefrontal cortex 12 times a second. The amount of that light absorbed and reflected by the oxygenated and deoxygenated haemoglobin in the blood there indicates how mentally engaged I am: harder brain work calls for more oxygenated blood, which changes how the light is absorbed. Software interprets the signal from this functional near-infrared spectroscopy (fNIRS) and uses it to assign me the right level of work.
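
To make that feedback loop concrete, here is a minimal Python sketch of the kind of logic the article describes. Everything in it is an illustrative assumption rather than the Tufts team's actual code: the FakeFnirs stub, the estimate_workload method and the two thresholds all stand in for the real classifier and its tuning.

    import random
    import time

    SAMPLE_HZ = 12     # the rig pulses light 12 times a second
    HIGH_LOAD = 0.8    # assumed: confidently overloaded, take a drone away
    LOW_LOAD = 0.3     # assumed: attention flagging, hand over another drone

    class FakeFnirs:
        """Stand-in for the software that turns the haemoglobin signal
        into a single workload estimate between 0 and 1."""
        def estimate_workload(self):
            return random.random()

    def adaptive_loop(fnirs, drones, steps=120):
        """Nudge the number of drones to keep the operator in flow."""
        for _ in range(steps):
            workload = fnirs.estimate_workload()
            if workload > HIGH_LOAD and len(drones) > 1:
                drones.pop()              # frazzled: quietly drop a drone
            elif workload < LOW_LOAD:
                drones.append(object())   # under-loaded: add a new drone
            time.sleep(1 / SAMPLE_HZ)    # match the sampling rate
        return len(drones)

    print(adaptive_loop(FakeFnirs(), [object()] * 5))

The essential design choice is that the loop only ever changes the task, never demands anything of the operator, which is why a drone can vanish "without me even noticing".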

Dan Afergan, who is running the study at Tufts University in Medford, Massachusetts, points to an on-screen readout as I play. "It's predicting high workload with very high certainty, and, yup, number three just dropped off," he says over my shoulder. Sure enough, I'm now controlling just five drones again.

To achieve this mind-monitoring, I'm hooked up to a bulky rig of fibre-optic cables and have an array of LEDs stuck to my forehead. The cables stream off my head into a box that converts light signals to electrical ones. These fNIRS systems don't have to be this big, though. A team led by Sophie Piper at Charité University of Medicine in Berlin, Germany, tested a portable device on cyclists in Berlin earlier this year – the first time fNIRS has been done during an outdoor activity.

Afergan doesn't plan to be confined to the lab for long either. He's studying ways to integrate brain-activity measurement into the Google Glass wearable computer. A lab down the hall already has a prototype fNIRS system on a chip that could, with a few improvements, be built into a Glass headset. "Glass is already on your forehead. It's really not much of a stretch to imagine building fNIRS into the headband," he says.

Afergan is working on a Glass navigation system for use in cars that responds to a driver's level of focus. When they are concentrating hard, Glass will show only basic instructions, or perhaps just give audio directions. When the driver is focusing less, on a straight stretch of road perhaps, Glass will provide more details of the route. The team also plans to adapt Google Now – the company's digital assistant software – for Glass so that it only gives you notifications when your mind has room for them.
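
As a thought experiment, the same workload signal could gate how much route detail the driver sees. The sketch below is purely illustrative, with a made-up threshold, function name and step fields, not anything from Afergan's prototype.

    FOCUS_THRESHOLD = 0.7  # assumed workload treated as "concentrating hard"

    def render_step(workload, step):
        """Choose how much of a navigation step to show the driver."""
        if workload > FOCUS_THRESHOLD:
            # Driver is concentrating hard: bare instruction only
            # (or, in the car, audio directions alone).
            return step["basic"]
        # Spare capacity, e.g. a straight stretch: richer route detail.
        return "{} ({})".format(step["basic"], step["detail"])

    step = {"basic": "Turn left in 200 m",
            "detail": "onto the main road, then keep right at the fork"}
    print(render_step(0.9, step))   # hard focus: bare instruction
    print(render_step(0.4, step))   # relaxed: full detail

A notification filter for Google Now would follow the same pattern, holding messages back until the measured workload falls below the threshold.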

Peering into drivers' minds will become increasingly important, says Erin Solovey, a computer scientist at Drexel University in Philadelphia, Pennsylvania. Many cars already have automatic systems for adaptive cruise control, lane keeping and parking. These can help, but they also bring the risk that drivers, relying on the automation, stop paying full attention to the task at hand.

Systems using fNIRS could monitor a driver's focus and adjust the level of automation to keep drivers safely engaged with what the car is doing, she says.

This article appeared in print under the headline "Stay in the zone"
