Oct 31, 2013
Neuroscientists are starting to decipher what a person is seeing, remembering and even dreaming just by looking at their brain activity. They call it brain decoding.
In this Nature Video, we see three different uses of brain decoding, including a virtual reality experiment that could use brain activity to figure out whether someone has been to the scene of a crime.
Mobile EEG and its potential to promote the theory and application of imagery-based motor rehabilitation
Int J Psychophysiol. 2013 Oct 18;
Authors: Kranczioch C, Zich C, Schierholz I, Sterr A
Abstract. Studying the brain in its natural state remains a major challenge for neuroscience. Solving this challenge would not only enable the refinement of cognitive theory, but also provide a better understanding of cognitive function in the type of complex and unpredictable situations that constitute daily life, and which are often disturbed in clinical populations. With mobile EEG, researchers now have access to a tool that can help address these issues. In this paper we present an overview of technical advancements in mobile EEG systems and associated analysis tools, and explore the benefits of this new technology. Using the example of motor imagery (MI) we will examine the translational potential of MI-based neurofeedback training for neurological rehabilitation and applied research.
Via New Scientist
They look like snazzy sunglasses, but these computerised specs don't block the sun – they make the world a brighter place for people with partial vision.
These specs do more than bring blurry things into focus. This prototype pair of smart glasses translates visual information into images that blind people can see.
Many people who are registered as blind can perceive some light and motion. The glasses, developed by Stephen Hicks of the University of Oxford, are an attempt to make that residual vision as useful as possible.
They use two cameras, or a camera and an infrared projector, that can detect the distance to nearby objects. They also have a gyroscope, a compass and GPS to help orient the wearer.
The collected information can be translated into a variety of images on the transparent OLED displays, depending on what is most useful to the person sporting the shades. For example, objects can be made clearer against the background, or the distance to obstacles can be indicated by the varying brightness of an image.
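The distance-to-brightness idea above can be sketched in a few lines. This is a hypothetical illustration, not code from the Oxford prototype: the function name and the 3-metre useful range are assumptions, and a real system would map brightness to the OLED display's actual luminance levels.

```python
def distance_to_brightness(distance_m, max_range_m=3.0):
    """Map a measured obstacle distance to a display brightness in [0.0, 1.0].

    Nearer objects render brighter, so residual vision picks them out first.
    max_range_m is an assumed cut-off beyond which nothing is displayed.
    """
    if distance_m <= 0:
        return 1.0                      # object at the wearer: full brightness
    if distance_m >= max_range_m:
        return 0.0                      # beyond useful range: show nothing
    return 1.0 - distance_m / max_range_m

# An obstacle 1 m away is drawn much brighter than one at 2.5 m:
print(distance_to_brightness(1.0))   # roughly 0.67
print(distance_to_brightness(2.5))   # roughly 0.17
```

A linear ramp is the simplest choice; a real device might use a nonlinear curve tuned to each wearer's remaining light perception.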
Hicks has won the Royal Society's Brian Mercer Award for Innovation for his work on the smart glasses. He plans to use the £50,000 prize money to add object and text recognition to the glasses' abilities.
A new signal processing algorithm that enables any pair of earphones to detect your pulse was demonstrated recently at the Healthcare Device Exhibition 2013 in Yokohama, Japan. The technology comes from a joint effort of Bifrostec (Tokyo, Japan) and the Kaiteki Institute. It is built on the premise that the eardrum creates pressure waves with each heartbeat, which can be detected in a perfectly enclosed space. Typically, however, earphones do not create a perfect seal — which is what gives everyone in a packed elevator the privilege of listening to that guy's tunes. The new algorithm lets the software determine a user's pulse from the pressure signal despite the imperfect seal.
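Bifrostec has not published its algorithm, but the general idea — suppress broadband noise, then count periodic pressure peaks — can be sketched as follows. Everything here (the moving-average filter, the thresholds, the timing parameters) is an assumption chosen for illustration, not the companies' actual method.

```python
import math

def estimate_bpm(signal, fs, window_s=0.25, min_gap_s=0.4):
    """Estimate pulse rate (beats per minute) from a pressure signal.

    signal: list of pressure samples; fs: sampling rate in Hz.
    Smooths with a causal moving average to damp noise, then counts
    local maxima above the mean that are at least min_gap_s apart
    (heartbeats faster than 150 bpm are ignored by that gap).
    """
    w = max(1, int(window_s * fs))
    smoothed = [sum(signal[max(0, i - w):i + 1]) / (i + 1 - max(0, i - w))
                for i in range(len(signal))]
    threshold = sum(smoothed) / len(smoothed)
    min_gap = int(min_gap_s * fs)
    peaks, last = 0, -min_gap
    for i in range(1, len(smoothed) - 1):
        if (smoothed[i] > threshold
                and smoothed[i] >= smoothed[i - 1]
                and smoothed[i] > smoothed[i + 1]
                and i - last >= min_gap):
            peaks += 1
            last = i
    return 60.0 * peaks * fs / len(signal)

# Synthetic 10 s recording at 100 Hz with a 1.2 Hz (72 bpm) pressure wave:
sig = [math.sin(2 * math.pi * 1.2 * i / 100) for i in range(1000)]
print(estimate_bpm(sig, 100))   # close to 72 bpm
```

A production system would of course face a far messier signal — motion artefacts, music playing through the same earphones — which is presumably where the companies' proprietary processing does the real work.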
Oct 25, 2013
Our networking session proposal was accepted at the ICT 2013 Conference in Vilnius (6-8 November, 2013).
Title: Positive Technology: Steps Towards Ubiquitous Empowerment (07/11/2013, Booth 4, 18:00-19:30)
More than 5000 researchers, innovators, entrepreneurs, and industry representatives are expected to attend the conference — a great opportunity to explore the future development of Positive Technology within Horizon 2020.
If you are also planning to attend the conference and are interested in joining this special networking session, drop me a message here.