Apr 15, 2014
Brain-computer interfaces: a powerful tool for scientific inquiry
Brain-computer interfaces: a powerful tool for scientific inquiry.
Curr Opin Neurobiol. 2014 Apr;25C:70-75
Authors: Wander JD, Rao RP
Abstract. Brain-computer interfaces (BCIs) are devices that record from the nervous system, provide input directly to the nervous system, or do both. Sensory BCIs such as cochlear implants have already had notable clinical success and motor BCIs have shown great promise for helping patients with severe motor deficits. Clinical and engineering outcomes aside, BCIs can also be tremendously powerful tools for scientific inquiry into the workings of the nervous system. They allow researchers to inject and record information at various stages of the system, permitting investigation of the brain in vivo and facilitating the reverse engineering of brain function. Most notably, BCIs are emerging as a novel experimental tool for investigating the tremendous adaptive capacity of the nervous system.
23:14 Posted in Brain-computer interface | Permalink | Comments (0)
Android Wear
Via KurzweilAI.net
Google has announced Android Wear, a project that extends Android to wearables, starting with two watches, both due out this summer: Motorola’s Moto 360 and LG’s G Watch.
Android Wear will show you information from a wide variety of Android apps, such as messages, social apps, chats, notifications, health and fitness, music playlists, and videos.
It will also enable Google Now functions: say “OK, Google” to check flight times, send a text, get the weather, view email, get directions, estimate travel time, make a reservation, and more.
Google says it’s working with several other consumer-electronics manufacturers, including Asus, HTC, and Samsung; chip makers Broadcom, Imagination, Intel, Mediatek and Qualcomm; and fashion brands like the Fossil Group to offer watches powered by Android Wear later this year.
If you’re a developer, there’s a new section on developer.android.com/wear focused on wearables. Starting today, you can download a Developer Preview so you can tailor your existing app notifications for watches powered by Android Wear.
23:07 Posted in Wearable & mobile | Permalink | Comments (0)
A Hybrid Brain Computer Interface System Based on the Neurophysiological Protocol and Brain-actuated Switch for Wheelchair Control
A Hybrid Brain Computer Interface System Based on the Neurophysiological Protocol and Brain-actuated Switch for Wheelchair Control.
J Neurosci Methods. 2014 Apr 5;
Authors: Cao L, Li J, Ji H, Jiang C
BACKGROUND: Brain Computer Interfaces (BCIs) are developed to translate brain waves into machine instructions for controlling external devices. Recently, hybrid BCI systems have been proposed for multi-degree control of a real wheelchair, improving on the efficiency of traditional BCIs. However, it is difficult for existing hybrid BCIs to implement multi-dimensional control in one command cycle.
NEW METHOD: This paper proposes a novel hybrid BCI system that combines motor imagery (MI)-based bio-signals and steady-state visual evoked potentials (SSVEPs) to control the speed and direction of a real wheelchair synchronously. Furthermore, a switch based on the hybrid modalities is designed to turn the wheelchair's control system on and off.
RESULTS: Two experiments were performed to assess the proposed BCI system. One was used for training and the other involved a wheelchair control task in a real environment. All subjects completed these tasks successfully, and no collisions occurred in the real wheelchair control experiment.
COMPARISON WITH EXISTING METHOD(S): The protocol of our BCI provided far more control commands than previous MI- and SSVEP-based BCIs. Compared with other BCI wheelchair systems, the superior path length optimality ratio validated the high efficiency of our control strategy.
CONCLUSIONS: The results validated the efficiency of our hybrid BCI system in controlling the direction and speed of a real wheelchair, as well as the reliability of the hybrid-signal switch control.
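The abstract does not describe the decoding pipeline in detail, but as a rough illustration of how a hybrid MI/SSVEP scheme can produce one direction-plus-speed command per cycle, here is a minimal Python sketch. The sampling rate, stimulation frequencies, direction/speed mapping, and the MI probability input are assumptions made for illustration, and the SSVEP detector is a simple sine/cosine correlation rather than the classifier used in the paper.

```python
# Illustrative sketch only (not the authors' implementation): one way a hybrid
# MI/SSVEP BCI could produce a combined direction+speed command per cycle.
# Direction comes from a motor-imagery classifier, speed from SSVEP detection.
import numpy as np

FS = 250                     # EEG sampling rate in Hz (assumed)
SSVEP_FREQS = [10.0, 15.0]   # stimulation frequencies mapped to "slow"/"fast" (assumed)

def ssvep_score(eeg, freq, fs=FS, harmonics=2):
    """Score an occipital EEG window (n_channels x n_samples) against sine/cosine
    references at the stimulation frequency and its harmonics."""
    t = np.arange(eeg.shape[1]) / fs
    score = 0.0
    for h in range(1, harmonics + 1):
        ref_sin = np.sin(2 * np.pi * h * freq * t)
        ref_cos = np.cos(2 * np.pi * h * freq * t)
        for ch in eeg:                       # sum squared correlations over channels
            ch = ch - ch.mean()
            score += np.corrcoef(ch, ref_sin)[0, 1] ** 2
            score += np.corrcoef(ch, ref_cos)[0, 1] ** 2
    return score

def decode_command(mi_prob_left, occipital_eeg):
    """Fuse the two modalities into one wheelchair command for this cycle."""
    direction = "left" if mi_prob_left > 0.5 else "right"
    scores = [ssvep_score(occipital_eeg, f) for f in SSVEP_FREQS]
    speed = "slow" if int(np.argmax(scores)) == 0 else "fast"
    return direction, speed

# Demo with synthetic data: a 2-channel, 2-second window dominated by 15 Hz.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
window = np.vstack([np.sin(2 * np.pi * 15 * t) + 0.5 * rng.standard_normal(t.size)
                    for _ in range(2)])
print(decode_command(mi_prob_left=0.7, occipital_eeg=window))   # -> ('left', 'fast')
```

In an actual system the MI probability would come from a trained classifier (for example, CSP features fed to an LDA), and the hybrid switch described in the abstract would gate whether a command is issued at all.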
23:03 Posted in Brain-computer interface, Neurotechnology & neuroinformatics | Permalink | Comments (0)
Avegant - Glyph Kickstarter - Wearable Retinal Display
Via Mashable
Move over Google Glass and Oculus Rift, there's a new kid on the block: Glyph, a mobile, personal theater.
Glyph looks like a normal headset and operates like one, too. That is, until you move the headband down over your eyes and it becomes a fully-functional visual visor that displays movies, television shows, video games or any other media connected via the attached HDMI cable.
Using Virtual Retinal Display (VRD), a technology that mimics the way we see light, the Glyph projects images directly onto your retina using one million micromirrors in each eyepiece. These micromirrors reflect the images back to the retina, producing reportedly crisp and vivid image quality.
22:56 Posted in Future interfaces, Telepresence & virtual presence, Virtual worlds, Wearable & mobile | Permalink | Comments (0)
A sweet, sad stop-motion film made with 3-D printing
Via Wired
London-based creative agency DBLG shows the way with “Bears on Stairs,” a short clip that combines a 3-D printed hero with traditional stop-motion animation to charming effect. The ursine epic has a 2-second run time and took four weeks to complete, making it about as efficient as your average Michael Bay production, by my rough calculations. The lumbering action took 50 printed models in all.
BEARS ON STAIRS from DBLG on Vimeo.
22:32 Posted in Cyberart | Permalink | Comments (0)
First video game
More than fifty years ago, before either arcades or home video games existed, visitors waited in line at Brookhaven National Laboratory to play Tennis for Two, an electronic tennis game that is unquestionably a forerunner of the modern video game. Two players used separate controllers connected to an analog computer, with an oscilloscope for a screen. The game's creator, William Higinbotham, was a physicist who lobbied for nuclear nonproliferation as the first chair of the Federation of American Scientists.
22:18 Posted in Vintage computing | Permalink | Comments (0)
Apr 06, 2014
Measuring the effects through time of the influence of visuomotor and visuotactile synchronous stimulation on a virtual body ownership illusion
Measuring the effects through time of the influence of visuomotor and visuotactile synchronous stimulation on a virtual body ownership illusion.
Perception. 2014;43(1):43-58
Authors: Kokkinara E, Slater M
Abstract. Previous studies have examined the experience of owning a virtual surrogate body or body part through specific combinations of cross-modal multisensory stimulation. Both visuomotor (VM) and visuotactile (VT) synchronous stimulation have been shown to be important for inducing a body ownership illusion, whether tested separately or in combination. In this study we compared the relative importance of these two cross-modal correlations when both are provided in the same immersive virtual reality setup and the same experiment. We systematically manipulated VT and VM contingencies in order to assess their relative role and mutual interaction. Moreover, we present a new method for measuring the induced body ownership illusion through time, by recording reports of breaks in the illusion of ownership ('breaks') throughout the experimental phase. The balance of the evidence, from both questionnaires and analysis of the breaks, suggests that while VM synchronous stimulation contributes the most to the attainment of the illusion, a disruption of either (through asynchronous stimulation) contributes equally to the probability of a break in the illusion.
23:59 Posted in Research tools, Telepresence & virtual presence, Virtual worlds | Permalink | Comments (0)
Glass brain flythrough: beyond neurofeedback
Via Neurogadget
Researchers have developed a new way to explore the human brain in virtual reality. The system, called Glass Brain, is the work of Philip Rosedale, creator of the famous game Second Life, and Adam Gazzaley, a neuroscientist at the University of California, San Francisco; it combines brain scanning, brain recording, and virtual reality to allow a user to journey through a person’s brain in real time.
Read the full story on Neurogadget
23:52 Posted in Biofeedback & neurofeedback, Blue sky, Information visualization, Physiological Computing, Virtual worlds | Permalink | Comments (0)
Stick-on electronic patches for health monitoring
Researchers at John A. Rogers’ lab at the University of Illinois, Urbana-Champaign have incorporated off-the-shelf chips into flexible electronic patches to allow for high-quality ECG and EEG monitoring.
Here is the video:
23:45 Posted in Physiological Computing, Research tools, Self-Tracking, Wearable & mobile | Permalink | Comments (0)
The effects of augmented visual feedback during balance training in Parkinson's disease - trial protocol
The effects of augmented visual feedback during balance training in Parkinson's disease: study design of a randomized clinical trial.
BMC Neurol. 2013;13:137
Authors: van den Heuvel MR, van Wegen EE, de Goede CJ, Burgers-Bots IA, Beek PJ, Daffertshofer A, Kwakkel G
Abstract. BACKGROUND: Patients with Parkinson's disease often suffer from reduced mobility due to impaired postural control. Balance exercises form an integral part of rehabilitative therapy, but the effectiveness of existing interventions is limited. Recent technological advances make it possible to provide enhanced visual feedback in the context of computer games, which offer an attractive alternative to conventional therapy. The objective of this randomized clinical trial is to investigate whether a training program capitalizing on virtual-reality-based visual feedback is more effective than equally dosed conventional training in improving standing balance performance in patients with Parkinson's disease.
METHODS/DESIGN: Patients with idiopathic Parkinson's disease will participate in a five-week balance training program comprising ten treatment sessions of 60 minutes each. Participants will be randomly allocated to (1) an experimental group that will receive balance training using augmented visual feedback, or (2) a control group that will receive balance training in accordance with current physical therapy guidelines for Parkinson's disease patients. Training sessions consist of task-specific exercises organized as a series of workstations. Assessments will take place before training, at six weeks, and at twelve weeks of follow-up. The functional reach test will serve as the primary outcome measure, supplemented by comprehensive assessments of functional balance, posturography, and electroencephalography.
DISCUSSION: We hypothesize that balance training based on augmented visual feedback will yield greater improvements in standing balance performance than conventional balance training. In addition, we expect that learning new control strategies will be visible not only in the co-registered posturographic recordings but also through changes in functional connectivity.
23:35 Posted in Augmented/mixed reality, Cybertherapy | Permalink | Comments (0)
Mar 09, 2014
Shugo Tokumaru "Katachi"
14:29 Posted in Creativity and computers, Cyberart | Permalink | Comments (0)
Mar 03, 2014
Virtual reality for the assessment of frontotemporal dementia, a feasibility study
Virtual reality for the assessment of frontotemporal dementia, a feasibility study.
Disabil Rehabil Assist Technol. 2014 Feb 14;
Authors: Mendez MF, Joshi A, Jimenez E
Abstract
Purpose: Behavioral variant frontotemporal dementia (bvFTD) is a non-Alzheimer dementia whose characteristic social-emotional changes are difficult to document.
Methods: Five bvFTD patients underwent insight interviews while immersed in a virtual environment. They were interviewed by avatars, their answers were recorded, and their heart rates were monitored. They were asked to rate their stress at the beginning and at the end of the session.
Results: The patients tolerated the head-mounted display and VR without nausea or disorientation, heart rate changes, or worsening stress ratings. Their insight responses were comparable to real-world interviews. All bvFTD patients showed presence in the VR environment as they moved their heads to face and respond to each avatar's questions. The bvFTD patients tended toward greater verbal elaboration of answers, with a larger mean length of utterance than in their real-world interviews.
Conclusions: VR is feasible and well tolerated in bvFTD. These patients may give VR responses comparable to real-world performance, and they may display a presence in the virtual environment that could even facilitate assessment. Further research can explore the promise of VR for the evaluation and rehabilitation of dementias beyond Alzheimer's disease.
Implications for Rehabilitation: Clinicians need effective evaluation and rehabilitation strategies for dementia, a neurological syndrome of epidemic proportions and a leading cause of disability. Memory and cognitive deficits are the major disabilities and targets for rehabilitation in Alzheimer's disease, the most common dementia. In contrast, social and emotional disturbances are the major disabilities and targets for rehabilitation in behavioral variant frontotemporal dementia (bvFTD), an incompletely understood non-Alzheimer dementia. Virtual reality is a technology that holds great promise for the evaluation and rehabilitation of patients with bvFTD and other non-Alzheimer dementias, and preliminary evidence suggests that this technology is feasible in patients with bvFTD.
00:27 Posted in Cybertherapy, Virtual worlds | Permalink | Comments (0)
Evaluation of a virtual reality prospective memory task for use with individuals with severe traumatic brain injury
Evaluation of a virtual reality prospective memory task for use with individuals with severe traumatic brain injury.
Neuropsychol Rehabil. 2014 Feb 24;
Authors: Canty AL, Fleming J, Patterson F, Green HJ, Man D, Shum DH
Abstract
The current study aimed to evaluate the sensitivity, convergent validity, and ecological validity of a newly developed virtual reality prospective memory (PM) task (i.e., the Virtual Reality Shopping Task; VRST) for use with individuals with traumatic brain injury (TBI). Thirty individuals with severe TBI and 24 uninjured adults matched on age, gender, and education level were administered the VRST, a lexical decision PM task (LDPMT), an index of task-friendliness, and a cognitive assessment battery. Significant others rated disruptions in the TBI participants' occupational activities, interpersonal relationships, and independent living skills. The performance of the TBI group was significantly poorer than that of controls on event-based PM as measured by the LDPMT, and on time- and event-based PM as measured by the VRST. Performance on the VRST significantly predicted significant others' ratings of patients' occupational activities and independent living skills. The VRST was rated as significantly more reflective of an everyday activity and more interesting than the LDPMT, and received a stronger recommendation. For the TBI group, event-based and total PM performance on the VRST correlated significantly with measures of mental flexibility and verbal fluency, and total PM performance correlated with verbal memory. These results provide preliminary but promising evidence of the sensitivity, as well as the convergent and ecological validity, of the VRST.
00:23 Posted in Cybertherapy, Virtual worlds | Permalink | Comments (0)
Virtual Reality for sensorimotor rehabilitation post-stroke
Virtual Reality for Sensorimotor Rehabilitation Post-Stroke: The Promise and Current State of the Field.
Curr Phys Med Rehabil Reports. 2013 Mar;1(1):9-20
Authors: Fluet GG, Deutsch JE
Abstract
Developments over the past two years in virtual reality (VR)-augmented sensorimotor rehabilitation of upper limb use and gait post-stroke were reviewed. Studies were included if they evaluated the comparative efficacy of VR versus standard of care, and/or differences in VR delivery methods, and were CEBM (Centre for Evidence-Based Medicine) level 2 or higher. Eight upper limb and two gait studies were included and described using the following categories: hardware (input and output), software (virtual task, feedback, and presentation), intervention (progression and dose), and outcomes. Trends in the field were noted, gaps in knowledge identified, and areas of future research and of translation of VR to practice suggested.
00:20 Posted in Cybertherapy, Virtual worlds | Permalink | Comments (0)
By licking these electric ice cream cones, you can make music
From Wired
Ice cream can be the reward after a successful little league game, a consolation after a bad breakup, or, in the hands of gourmet geeks, a sweet musical instrument. Designers Carla Diana and Emilie Baltz recently whipped up a musical performance where a quartet of players jammed using just a quart of vanilla ice cream and some high-tech cones.
00:07 Posted in Creativity and computers, Cyberart, Future interfaces | Permalink | Comments (0)
Mar 02, 2014
3D Thought controlled environment via Interaxon
In this demo video, artist Alex McLeod shows an environment he designed for Interaxon to use at CES in 2011 (interaxon.ca/CES#).
The glasses display the scene in 3D, and attached sensors read the user’s brain states, which control elements of the scene.
3D Thought controlled environment via Interaxon from Alex McLeod on Vimeo.
Myoelectric controlled avatar helps stop phantom limb pain
Reblogged from Medgadget
People unfortunate enough to lose an arm or a leg often feel pain in their missing limb, an unexplained condition known as phantom limb pain. Researchers at Chalmers University of Technology in Sweden decided to test whether they could fool the brain into believing the limb is still there and perhaps stop the pain.
They attached electrodes to the skin of an amputee’s remaining arm to read the myoelectric signals from the muscles below. Additionally, the arm was tracked in 3D using a marker so that the data could drive a computer-generated avatar as well as computer games. The amputee moves the avatar’s arm as he would his own if it still existed, while the brain becomes reacquainted with its presence. After repeated use, and after playing video games controlled through the same myoelectric interface, the person in the study had significant pain reduction following decades of phantom limb pain.
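Medgadget does not go into the signal processing, but myoelectric pattern recognition of this kind is typically built from windowed EMG, simple time-domain features, and a linear classifier. The sketch below illustrates that generic pipeline; the sampling rate, window and step lengths, feature set, and classifier are common defaults chosen for illustration, not details taken from the Chalmers study.

```python
# Generic sketch of myoelectric pattern recognition (not the Chalmers pipeline):
# slide a window over multichannel EMG, extract classic time-domain features,
# and classify the intended movement with a linear discriminant.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000      # EMG sampling rate in Hz (assumed)
WINDOW = 200   # 200 ms analysis window at 1 kHz (assumed)
STEP = 100     # 100 ms window increment (assumed)

def td_features(window):
    """Hudgins-style time-domain features per channel: mean absolute value,
    waveform length, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=1)
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)
    zc = np.sum(np.diff(np.sign(window), axis=1) != 0, axis=1)
    return np.concatenate([mav, wl, zc])

def make_dataset(emg, labels):
    """Cut continuous EMG (n_channels x n_samples) into labeled feature vectors."""
    X, y = [], []
    for start in range(0, emg.shape[1] - WINDOW, STEP):
        X.append(td_features(emg[:, start:start + WINDOW]))
        y.append(labels[start + WINDOW - 1])     # label at the end of the window
    return np.array(X), np.array(y)

# Synthetic demo: 4 channels, two "movements" that differ only in signal amplitude.
rng = np.random.default_rng(0)
emg = np.hstack([0.2 * rng.standard_normal((4, 5000)),    # movement 0: low activity
                 1.0 * rng.standard_normal((4, 5000))])    # movement 1: high activity
labels = np.array([0] * 5000 + [1] * 5000)

X, y = make_dataset(emg, labels)
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))   # the decoded class would drive the avatar
```

In a setup like the one described above, the decoded movement class is what would drive the avatar’s arm and the game input in real time.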
Here’s a video showing off the experimental setup:
Study in Frontiers in Neuroscience: Treatment of phantom limb pain (PLP) based on augmented reality and gaming controlled by myoelectric pattern recognition: a case study of a chronic PLP patient…
22:35 Posted in Cybertherapy, Wearable & mobile | Permalink | Comments (0)
Voluntary Out-of-Body Experience: An fMRI Study
Voluntary Out-of-Body Experience: An fMRI Study.
Front Hum Neurosci. 2014;8:70
Authors: Smith AM, Messier C
Abstract
The present single-case study examined functional brain imaging patterns in a participant who reported being able, at will, to produce somatosensory sensations experienced as her body moving outside the boundaries of her physical body, all the while remaining aware of her unmoving physical body. We found that the functional brain changes associated with the reported extra-corporeal experience (ECE) were different from those observed in motor imagery. Activations were mainly left-sided and involved the left supplementary motor area and the supramarginal and posterior superior temporal gyri, the last two overlapping with the temporal parietal junction, which has been associated with out-of-body experiences. The cerebellum also showed activation consistent with the participant's report of the impression of movement during the ECE. There was also activity in the left middle and superior orbital frontal gyri, regions often associated with action monitoring. The results suggest that the ECE reported here represents an unusual type of kinesthetic imagery.
22:24 Posted in Telepresence & virtual presence | Permalink | Comments (0)
Humanlike robot hands controlled by brain activity arouse illusion of ownership in operators.
Humanlike robot hands controlled by brain activity arouse illusion of ownership in operators.
Sci Rep. 2013;3:2396
Authors: Alimardani M, Nishio S, Ishiguro H
Abstract
Operators of a pair of robotic hands report ownership of those hands when they hold an image of a grasp motion in mind and watch the robot perform it. We present a novel body ownership illusion that is induced merely by watching and controlling a robot's motions through a brain-machine interface (BMI). In past studies, body ownership illusions were induced by the correlation of sensory inputs such as vision, touch, and proprioception. However, in the presented illusion none of these sensations are integrated except vision. Our results show that during BMI operation of robotic hands, the interaction between motor commands and visual feedback of the intended motions is sufficient to incorporate the non-body limbs into one's own body. Our discussion focuses on the role of proprioceptive information in the mechanism of agency-driven illusions. We believe that our findings will contribute to the improvement of telepresence systems in which operators incorporate BMI-operated robots into their body representations.
22:17 Posted in Telepresence & virtual presence, Virtual worlds | Permalink | Comments (0)
Feb 16, 2014
How much science is there?
The accelerating pace of scientific publishing and the rise of open access, as depicted by xkcd.com cartoonist Randall Munroe.
14:31 Posted in Blue sky, Information visualization | Permalink | Comments (0)