Nov 16, 2013
NeuroPace has received FDA pre-market approval for the NeuroPace RNS System, used to treat medically refractory partial epilepsy. The battery-powered device is implanted in the cranium and monitors electrical activity in the brain. If abnormal activity is detected, electrical impulses are sent via leads to the seizure focus in the brain, helping to prevent the onset of a seizure. The RNS System also comes with a programmer that allows physicians to non-invasively set the detection and stimulation parameters of the implanted device, view the patient's electrocorticogram (ECoG) in real time, and upload previously recorded ECoGs stored on the RNS implant.
Results from clinical studies show significant benefits for patients, with a 37.9% reduction in seizure frequency for subjects with active implants. Follow-up two years post-implant showed that over half of patients experienced a reduction in seizures of 50% or more.
Neurofeedback training aimed to improve focused attention and alertness in children with ADHD: a study of relative power of EEG rhythms using custom-made software application.
Clin EEG Neurosci. 2013 Jul;44(3):193-202
Authors: Hillard B, El-Baz AS, Sears L, Tasman A, Sokhadze EM
Abstract. Neurofeedback is a nonpharmacological treatment for attention-deficit hyperactivity disorder (ADHD). We propose that operant conditioning of the electroencephalogram (EEG) in neurofeedback training, aimed to mitigate inattention and low arousal in ADHD, will be accompanied by changes in the relative power of EEG bands. Patients were 18 children diagnosed with ADHD. The neurofeedback protocol ("Focus/Alertness" by Peak Achievement Trainer) has a focused attention and alertness training mode, but provides only a single summary measure for Focus and one for Alertness; it therefore does not allow collecting information about changes in the power of specific EEG bands (delta, theta, alpha, low and high beta, and gamma) within the 2 to 45 Hz range. Quantitative EEG analysis was completed on each of twelve 25-minute-long sessions using a custom-made MATLAB application to determine the relative power of each of the aforementioned EEG bands throughout each session, and from the first session to the last. Additional statistical analysis determined significant changes in relative power within sessions (from minute 1 to minute 25) and between sessions (from session 1 to session 12). The analysis covered the relative power of theta, alpha, and low and high beta, along with the theta/alpha, theta/beta, theta/low beta and theta/high beta ratios. Secondary measures of patients' post-neurofeedback outcomes were also assessed, using an audiovisual selective attention test (IVA+Plus) and behavioral evaluation scores from the Aberrant Behavior Checklist. Analysis of the data computed in the MATLAB application determined that theta/low beta and theta/alpha ratios decreased significantly from session 1 to session 12, and from minute 1 to minute 25 within sessions. The findings regarding EEG changes resulting from brain-wave self-regulation training, along with behavioral evaluations, will help elucidate the neural mechanisms of neurofeedback aimed at improving focused attention and alertness in ADHD.
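The study's core measure, relative band power, is straightforward to sketch outside MATLAB as well. Below is a minimal Python sketch (not the authors' actual code) that estimates each band's share of total 2-45 Hz power from a Welch power spectral density; the band edges are common conventions assumed here, not taken from the paper.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

# Assumed band edges in Hz (common conventions); together they tile
# the 2-45 Hz range analyzed in the study.
BANDS = {
    "delta": (2, 4), "theta": (4, 8), "alpha": (8, 12),
    "low_beta": (12, 20), "high_beta": (20, 30), "gamma": (30, 45),
}

def relative_band_power(signal, fs):
    """Return each band's power as a fraction of total 2-45 Hz power."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)  # 0.5 Hz resolution
    total_mask = (freqs >= 2) & (freqs <= 45)
    total = trapezoid(psd[total_mask], freqs[total_mask])
    return {
        name: trapezoid(psd[(freqs >= lo) & (freqs <= hi)],
                        freqs[(freqs >= lo) & (freqs <= hi)]) / total
        for name, (lo, hi) in BANDS.items()
    }

# Example: a synthetic 10 Hz (alpha) oscillation plus noise, sampled at 256 Hz
rng = np.random.default_rng(0)
fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
rel = relative_band_power(eeg, fs)  # alpha should dominate
```

Ratios such as theta/low beta then follow directly from the returned dictionary.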
Phonebloks is a modular smartphone concept created by Dutch designer Dave Hakkens to reduce electronic waste. By attaching individual third-party components (called "bloks") to a main board, a user would create a personalized smartphone. These bloks can be replaced at will if they break or the user wishes to upgrade.
Stanford Center on Longevity competition challenges students to design products to help older adults
The design contest solicits entries from student teams worldwide and is aimed at finding solutions that help keep people with cognitive impairments independent as long as possible.
The competition is currently accepting submissions in what is called Phase I of the challenge. Submitted concepts will be judged in January and finalists will be given financial help to flesh out their design and travel to Stanford to present it.
From January until April, a period called Phase II, finalists will also have access to mentors in different schools and centers at Stanford.
The final presentations, in April, will be before a panel of academics, industry professionals, nonprofit groups and investors.
The top prize is $10,000, while the second place team will take home $5,000 and third place will get $3,000.
Nov 03, 2013
Artist Javier Pérez turns everyday objects into whimsical illustrations. Here are some of my favourites. Discover more on his Instagram account.
The neurocam is the world’s first wearable camera system that automatically records what interests you, based on brainwaves, DigInfo TV reports.
It consists of a headset with a brain-wave sensor and uses the iPhone’s camera to record a 5-second GIF animation. It could also be useful for life-logging.
The algorithm for quantifying brain waves was co-developed by Associate Professor Mitsukura at Keio University.
The project team plans to create an emotional interface.
Oct 31, 2013
Neuroscientists are starting to decipher what a person is seeing, remembering and even dreaming just by looking at their brain activity. They call it brain decoding.
In this Nature Video, we see three different uses of brain decoding, including a virtual reality experiment that could use brain activity to figure out whether someone has been to the scene of a crime.
Mobile EEG and its potential to promote the theory and application of imagery-based motor rehabilitation
Int J Psychophysiol. 2013 Oct 18;
Authors: Kranczioch C, Zich C, Schierholz I, Sterr A
Abstract. Studying the brain in its natural state remains a major challenge for neuroscience. Solving this challenge would not only enable the refinement of cognitive theory, but also provide a better understanding of cognitive function in the type of complex and unpredictable situations that constitute daily life, and which are often disturbed in clinical populations. With mobile EEG, researchers now have access to a tool that can help address these issues. In this paper we present an overview of technical advancements in mobile EEG systems and associated analysis tools, and explore the benefits of this new technology. Using the example of motor imagery (MI) we will examine the translational potential of MI-based neurofeedback training for neurological rehabilitation and applied research.
Via New Scientist
They look like snazzy sunglasses, but these computerised specs don't block the sun – they make the world a brighter place for people with partial vision.
These specs do more than bring blurry things into focus. This prototype pair of smart glasses translates visual information into images that blind people can see.
Many people who are registered as blind can perceive some light and motion. The glasses, developed by Stephen Hicks of the University of Oxford, are an attempt to make that residual vision as useful as possible.
They use two cameras, or a camera and an infrared projector, that can detect the distance to nearby objects. They also have a gyroscope, a compass and GPS to help orient the wearer.
The collected information can be translated into a variety of images on the transparent OLED displays, depending on what is most useful to the person sporting the shades. For example, objects can be made clearer against the background, or the distance to obstacles can be indicated by the varying brightness of an image.
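The brightness-for-distance encoding can be illustrated with a toy mapping. This is a hypothetical sketch, not the glasses' actual firmware; the 3 m range cap and 8-bit brightness scale are assumptions for illustration.

```python
def distance_to_brightness(distance_m, max_range_m=3.0):
    """Map the distance to an obstacle onto a 0-255 display brightness:
    the nearer the object, the brighter it is drawn. The range cap and
    brightness scale are illustrative assumptions."""
    d = min(max(distance_m, 0.0), max_range_m)  # clamp to [0, max_range_m]
    return round(255 * (1 - d / max_range_m))
```

An obstacle at arm's length would render near full brightness, while anything beyond the range cap fades to black.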
Hicks has won the Royal Society's Brian Mercer Award for Innovation for his work on the smart glasses. He plans to use the £50,000 prize money to add object and text recognition to the glasses' abilities.
A new signal-processing algorithm that enables any pair of earphones to detect your pulse was demonstrated recently at the Healthcare Device Exhibition 2013 in Yokohama, Japan. The technology comes from a joint effort of Bifrostec (Tokyo, Japan) and the Kaiteki Institute. It is built on the premise that the eardrum creates pressure waves with each heartbeat, which can be detected in a perfectly enclosed space. Typically, however, earphones do not create a perfect seal, which is what gives everyone in a packed elevator the privilege of listening to that guy's tunes. The new algorithm lets the software process the pressure signal, despite the lack of a perfect seal, to determine a user's pulse.
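Bifrostec's algorithm itself has not been published, but the general idea of recovering a pulse from a noisy pressure signal can be sketched with standard signal processing: band-pass the signal around plausible heart-rate frequencies, then count peaks. The following is an illustrative reconstruction under those assumptions, not the demonstrated algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_pulse_bpm(pressure, fs):
    """Estimate heart rate (beats per minute) from a pressure recording:
    band-pass around plausible heart rates (0.7-3.5 Hz, i.e. 42-210 bpm),
    then count prominent peaks."""
    b, a = butter(2, [0.7, 3.5], btype="band", fs=fs)
    filtered = filtfilt(b, a, pressure)
    peaks, _ = find_peaks(filtered,
                          height=0.5 * filtered.max(),  # ignore small ripples
                          distance=int(0.3 * fs))       # max ~200 bpm
    return 60.0 * len(peaks) / (len(pressure) / fs)

# Example: a synthetic 72-bpm (1.2 Hz) pulse wave buried in noise, 100 Hz sampling
rng = np.random.default_rng(0)
fs = 100
t = np.arange(0, 30, 1 / fs)
pressure = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
bpm = estimate_pulse_bpm(pressure, fs)  # close to 72
```

A real imperfect-seal signal would of course be far messier than this synthetic example, which is presumably where the company's proprietary processing comes in.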
Oct 25, 2013
Our networking session proposal was accepted at the ICT 2013 Conference in Vilnius (6-8 November, 2013).
Title: Positive Technology: Steps Towards Ubiquitous Empowerment (07/11/2013, Booth 4, 18:00-19:30)
More than 5000 researchers, innovators, entrepreneurs and industry representatives are expected to attend the conference. It's a great opportunity to explore the future developments of Positive Technology within Horizon2020.
If you are also planning to attend the conference and are interested in participating in this special networking session, drop me a message here.
Sep 10, 2013
BITalino is a low-cost toolkit that allows anyone, from students to professional developers, to create projects and applications with physiological sensors. Out of the box, BITalino already integrates easy-to-use software and hardware blocks with sensors for electrocardiography (ECG), electromyography (EMG), electrodermal activity (EDA), an accelerometer, and ambient light. Imagination is the limit: each individual block can be snapped off and combined to prototype anything you want. You can also connect other sensors, including your own custom designs.
Sep 09, 2013
Effortless awareness: using real time neurofeedback to investigate correlates of posterior cingulate cortex activity in meditators' self-report
Front Hum Neurosci. 2013;7:440
Authors: Garrison KA, Santoyo JF, Davis JH, Thornhill TA, Kerr CE, Brewer JA
Neurophenomenological studies seek to utilize first-person self-report to elucidate cognitive processes related to physiological data. Grounded theory offers an approach to the qualitative analysis of self-report, whereby theoretical constructs are derived from empirical data. Here we used grounded theory methodology (GTM) to assess how the first-person experience of meditation relates to neural activity in a core region of the default mode network-the posterior cingulate cortex (PCC). We analyzed first-person data consisting of meditators' accounts of their subjective experience during runs of a real time fMRI neurofeedback study of meditation, and third-person data consisting of corresponding feedback graphs of PCC activity during the same runs. We found that for meditators, the subjective experiences of "undistracted awareness" such as "concentration" and "observing sensory experience," and "effortless doing" such as "observing sensory experience," "not efforting," and "contentment," correspond with PCC deactivation. Further, the subjective experiences of "distracted awareness" such as "distraction" and "interpreting," and "controlling" such as "efforting" and "discontentment," correspond with PCC activation. Moreover, we derived several novel hypotheses about how specific qualities of cognitive processes during meditation relate to PCC activity, such as the difference between meditation and "trying to meditate." These findings offer novel insights into the relationship between meditation and mind wandering or self-related thinking and neural activity in the default mode network, driven by first-person reports.
Aug 28, 2013
Sentiment in New York City: A High Resolution Spatial and Temporal View
Karla Z. Bertrand, Maya Bialik, Kawandeep Virdee, Andreas Gros, Yaneer Bar-Yam
http://arxiv.org/abs/1308.5010v1 (link to PDF full text)
Measuring public sentiment is a key task for researchers and policymakers alike. The explosion of available social media data allows for a more time-sensitive and geographically specific analysis than ever before. In this paper we analyze data from the micro-blogging site Twitter and generate a sentiment map of New York City. We develop a classifier specifically tuned for 140-character Twitter messages, or tweets, using key words, phrases and emoticons to determine the mood of each tweet. This method, combined with geotagging provided by users, enables us to gauge public sentiment on extremely fine-grained spatial and temporal scales. We find that public mood is generally highest in public parks and lowest at transportation hubs, and locate other areas of strong sentiment such as cemeteries, medical centers, a jail, and a sewage facility. Sentiment progressively improves with proximity to Times Square. Periodic patterns of sentiment fluctuate on both a daily and a weekly scale: more positive tweets are posted on weekends than on weekdays, with a daily peak in sentiment around midnight and a nadir between 9:00 a.m. and noon.
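A keyword-and-emoticon classifier of the kind the authors describe can be sketched in a few lines of Python. The word lists below are tiny, hypothetical stand-ins for the paper's much larger lexicon.

```python
# Tiny, hypothetical word lists; the paper's actual lexicon is far larger.
POSITIVE = {"happy", "great", "love", "awesome", "good", ":)", ":-)"}
NEGATIVE = {"sad", "hate", "awful", "delayed", "bad", ":(", ":-("}

def tweet_sentiment(text):
    """Score a tweet +1 (positive), -1 (negative) or 0 (neutral)
    by counting matched keywords and emoticons."""
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return (score > 0) - (score < 0)  # sign of the net score
```

Aggregating per-tweet scores like these over geotagged coordinates and time bins is what yields the fine-grained spatial and temporal sentiment maps the paper reports.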
Aug 07, 2013
A recent introductory talk on the problem that consciousness and qualia presents to physicalism by Frank C. Jackson.
Welcome to wonderland: the influence of the size and shape of a virtual hand on the perceived size and shape of virtual objects
PLoS One. 2013;8(7):e68594
Authors: Linkenauger SA, Leyrer M, Bülthoff HH, Mohler BJ
The notion of body-based scaling suggests that our body and its action capabilities are used to scale the spatial layout of the environment. Here we present four studies supporting this perspective by showing that the hand acts as a metric which individuals use to scale the apparent sizes of objects in the environment. However, to test this, one must be able to manipulate the size and/or dimensions of the perceiver's hand, which is difficult in the real world due to the impliability of hand dimensions. To overcome this limitation, we used virtual reality to manipulate the dimensions of participants' fully-tracked virtual hands and investigate their influence on the perceived size and shape of virtual objects. In a series of experiments, using several measures, we show that individuals' estimations of the sizes of virtual objects differ depending on the size of their virtual hand, in the direction consistent with the body-based scaling hypothesis. Additionally, we found that these effects were specific to participants' virtual hands rather than another avatar's hands or a salient familiar-sized object. While these studies provide support for a body-based approach to the scaling of the spatial layout, they also demonstrate the influence of virtual bodies on the perception of virtual environments.
Using avatars to model weight loss behaviors: participant attitudes and technology development.
J Diabetes Sci Technol. 2013;7(4):1057-65
Authors: Napolitano MA, Hayes S, Russo G, Muresu D, Giordano A, Foster GD
BACKGROUND: Virtual reality and other avatar-based technologies are potential methods for demonstrating and modeling weight loss behaviors. This study examined avatar-based technology as a tool for modeling weight loss behaviors. METHODS: This study consisted of two phases: (1) an online survey to obtain feedback about using avatars for modeling weight loss behaviors and (2) technology development and usability testing to create an avatar-based technology program for modeling weight loss behaviors. RESULTS: Results of phase 1 (n = 128) revealed that interest was high, with 88.3% stating that they would participate in a program that used an avatar to help practice weight loss skills in a virtual environment. In phase 2, avatars and modules to model weight loss skills were developed. Eight women were recruited to participate in a 4-week usability test, with 100% reporting they would recommend the program and that it influenced their diet/exercise behavior. Most women (87.5%) indicated that the virtual models were helpful. After 4 weeks, average weight loss was 1.6 kg (standard deviation = 1.7). CONCLUSIONS: This investigation revealed a high level of interest in an avatar-based program, with formative work indicating promise. Given the high costs associated with in vivo exposure and practice, this study demonstrates the potential use of avatar-based technology as a tool for modeling weight loss behaviors.
What Color is My Arm? Changes in Skin Color of an Embodied Virtual Arm Modulates Pain Threshold.
Front Hum Neurosci. 2013;7:438
Authors: Martini M, Perez-Marcos D, Sanchez-Vives MV
It has been demonstrated that visual inputs can modulate pain. However, the influence of skin color on pain perception is unknown. Red skin is associated with inflamed, hot and more sensitive skin, while blue is associated with cyanotic, cold skin. We aimed to test whether the color of the skin would alter the heat pain threshold. To this end, we used an immersive virtual environment in which we induced embodiment of a virtual arm that was co-located with the real one and seen from a first-person perspective. Virtual reality allowed us to dynamically modify the color of the skin of the virtual arm. To test the pain threshold, increasing ramps of heat stimulation applied to the participants' arm were delivered concomitantly with the gradual intensification of different colors on the embodied avatar's arm. We found that a reddened arm significantly decreased the pain threshold compared with normal and bluish skin. This effect was specific to red seen on the arm, while seeing red in a spot outside the arm did not decrease the pain threshold. These results demonstrate an influence of skin color on pain perception. This top-down modulation of pain through visual input suggests a potential use of embodied virtual bodies for pain therapy.
When it comes to some of the health hazards of light at night, a new study suggests that the color of the light can make a big difference.
Read full story on Science Daily