Oct 04, 2010
Several initiatives are exploring the potential of crowdfunding for supporting scientific research. In this approach, which I described in a letter to Science, projects seeking funding are stored in an online repository, and donors (either individuals or funding agencies) can browse the list of public projects and decide which ones to fund.
The closest example of crowdfunding science is Cancer Research UK's MyProjects scheme (http://myprojects.cancerresearchuk.org/). Launched in October 2008, MyProjects allows Cancer Research UK donors to search projects by type of cancer and location to find a specific research project to donate money to.
I am also running a crowdfunding-science project in Italy, called Open Genius. The website is available only in Italian, but you can find the essential info about the project in this presentation.
I would love to hear your comments about this!
Wanna make money with your blog? Do you have quality content that you would like to monetize? Now you can, with a revolutionary micropayment system called Flattr.
The system was launched publicly in March 2010 on an invite-only basis, and then opened up to the public in August 2010.
How does it work?
You pay a small monthly amount (using either Moneybookers or PayPal) and then click buttons on sites to share out the money you paid among those sites, sort of like an Internet tip jar. The minimum is 2 euros per month. The money paid each month is spread evenly among the buttons you click that month. In this way, users share not only money but also content. For the service, Flattr keeps 10% of each user's monthly flat rate.
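The arithmetic of the monthly split can be sketched in a few lines of Python. This is a toy illustration with my own function names and figures, not Flattr's actual implementation:

```python
# Toy sketch of Flattr's monthly split (my own names and figures,
# not Flattr's actual implementation).
def flattr_payout(monthly_fee_eur, clicked_sites, fee_rate=0.10):
    """Split a user's monthly flat rate evenly among the sites whose
    buttons they clicked, after the 10% service fee."""
    if not clicked_sites:
        return {}
    pot = monthly_fee_eur * (1 - fee_rate)
    share = pot / len(clicked_sites)
    return {site: round(share, 2) for site in clicked_sites}
```

With the 2-euro minimum and three clicks in a month, each site would receive 0.60 euros.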
What is interesting about this service is that Flattr buttons can be added to any site, not only sites that officially support Flattr.
Good Flattr to everybody!
Sep 26, 2010
Change in brain activity through virtual reality-based brain-machine communication in a chronic tetraplegic subject with muscular dystrophy
BMC Neurosci. 2010 Sep 16;11(1):117
Authors: Hashimoto Y, Ushiba J, Kimura A, Liu M, Tomita Y
ABSTRACT: BACKGROUND: For severely paralyzed people, a brain-computer interface (BCI) provides a way of re-establishing communication. Although subjects with muscular dystrophy (MD) appear to be potential BCI users, the actual long-term effects of BCI use on brain activities in MD subjects have yet to be clarified. To investigate these effects, we followed BCI use by a chronic tetraplegic subject with MD over 5 months. The topographic changes in an electroencephalogram (EEG) after long-term use of the virtual reality (VR)-based BCI were also assessed. Our originally developed BCI system was used to classify an EEG recorded over the sensorimotor cortex in real time and estimate the user's motor intention (MI) in 3 different limb movements: feet, left hand, and right hand. An avatar in the internet-based VR was controlled in accordance with the results of the EEG classification by the BCI. The subject was trained to control his avatar via the BCI by strolling in the VR for 1 hour a day and then continued the same training twice a month at his home. RESULTS: After the training, the error rate of the EEG classification decreased from 40% to 28%. The subject successfully walked around in the VR using only his MI and chatted with other users through a voice-chat function embedded in the internet-based VR. With this improvement in BCI control, event-related desynchronization (ERD) following MI was significantly enhanced (p < 0.01) for feet MI (from -29% to -55%), left-hand MI (from -23% to -42%), and right-hand MI (from -22% to -51%). CONCLUSIONS: These results show that our subject with severe MD was able to learn to control his EEG signal and communicate with other users through use of VR navigation and suggest that an internet-based VR has the potential to provide paralyzed people with the opportunity for easy communication.
Sep 20, 2010
The XWave is a new technology that uses a single electrode placed on the wearer's forehead to measure electroencephalography (EEG) data and converts these analog signals into digital form so they can be used to control an external device. The XWave comes bundled with software that includes a number of brain-training exercises, such as levitating a ball on the iDevice's screen, changing a color based on the relaxation level of your brain, and training your brain to maximize its attention span.
In the company’s own words:
XWave, powered by NeuroSky eSense patented technologies, senses the faintest electrical impulses transmitted through your skull to the surface of your forehead and converts these analogue signals into digital. With XWave, you will be able to detect attention and meditation levels, as well as train your mind to control things. Objects in a game can be controlled, lights in your living room can change colour depending on your mood; the possibilities are limited to only the power of your imagination.
Sep 19, 2010
Research groups at Stanford University and the University of California at Berkeley are developing sensor-based artificial skin that could provide prosthetic and robotic limbs with a realistic sense of touch. Stanford's project is based on organic electronics and is capable of detecting the weight of a fly upon the artificial skin, according to Zhenan Bao, professor of chemical engineering at Stanford.
The highly sensitive surfaces could also help robots pick up delicate objects without breaking them, improve surgeons' control over tools used for minimally invasive surgery, and increase efficiency of touch screen devices, she noted. Meanwhile, UC Berkeley's "e-skin" uses low-power, integrated arrays of nanowire transistors, according to UC Berkeley Professor of Electrical Engineering and Computer Science Ali Javey.
Thus far, the skin, the first ever made out of inorganic single crystalline semiconductors, is able to detect pressure equivalent to the touch of a keyboard. "It's a technique that can be potentially scaled up," said study lead author Kuniharu Takei, post-doctoral fellow in electrical engineering and computer sciences at UC Berkeley. "The limit now to the size of the e-skin we developed is the size of the processing tools we are using."
Sep 03, 2010
Samsung has finally unveiled its new Galaxy Tab at the IFA conference in Berlin. The Galaxy runs on the Android 2.2 operating system, which can run HTML5 and Adobe's Flash Player — unlike the iPad. It comes with a capacity of 16 or 32GB, expandable by a further 32GB. The device weighs 380g (14oz) and has an 18cm (7in) screen, making it smaller and lighter than the iPad.
The Galaxy supports Bluetooth, Wi-Fi and 3G cell phone networks, and comes with two cameras: a 3-megapixel digital camera with a flash on the back of the device, and a second camera on the front for video conferencing — a feature the iPad lacks.
The price is not very competitive, though: a number of European news sources are reporting that the Galaxy Tab will cost €699 and €799 for the 16GB and 32GB models, respectively.
Sep 02, 2010
As computing power continues to increase, it may ultimately be possible to simulate the functioning of the most complex system of the known universe: the brain. This is the ambitious goal of the Blue Brain Project, the first attempt to reverse-engineer the mammalian brain.
The project is expected to provide answers to a number of fundamental questions, ranging from the emergence of biological intelligence to the evolution of consciousness.
Led by neuroscientist Henry Markram, Blue Brain was launched in 2005 as a joint research initiative between the Brain Mind Institute at the École Polytechnique Fédérale de Lausanne (EPFL) and the information technology giant IBM. Using the impressive processing power of IBM's Blue Gene/L supercomputer, the project reached its first milestone in December 2006, with the development of a model of a rat's neocortical column (NCC).
To perform the empirically based simulation of individual cells, the Blue Gene/L supercomputer uses the NEURON software developed by Michael Hines, John W. Moore, and Ted Carnevale at Yale and Duke. Thanks to this software, processors are converted into neuron simulators and communication cables into axons interconnecting the neurons, transforming the entire Blue Gene into a cortical microcircuit.
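To give a flavour of what "converting a processor into a neuron simulator" means, here is a toy leaky integrate-and-fire neuron in Python. This is a drastic simplification of the detailed compartmental models that NEURON actually solves for Blue Brain, and all parameters below are illustrative, not taken from the project:

```python
# Toy leaky integrate-and-fire neuron (illustrative parameters only);
# Blue Brain uses far more detailed compartmental models via NEURON.
def simulate_lif(i_input, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Return spike times (s) for a current trace i_input (A), one sample per dt."""
    v = v_rest
    spike_times = []
    for step, i in enumerate(i_input):
        # membrane voltage relaxes toward rest, driven by the input current
        v += (-(v - v_rest) + r_m * i) * dt / tau
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times
```

A constant suprathreshold input current makes the model fire regularly; zero input produces no spikes.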
In November 2007, the project completed its first phase, with the development of a new modelling framework for the construction and validation of neural circuits built from biological data. The project is now striving to simplify the simulation of the column, in order to allow the parallel simulation of multiple connected columns. If this strategy is successful, the final objective will be to simulate a whole human neocortex, which comprises about one million cortical columns.
More to explore:
H. Markram, The Blue Brain Project, Nature Reviews Neuroscience, 7:153-160, 2006
Aug 27, 2010
A great presentation by Matthew Cornell about the concept of self-tracking and the implications of this emerging research field for people's wellbeing.
Keiichi Matsuda did it again. After the success of Domestic Robocop, the architecture graduate and filmmaker was nominated for the Royal Institute of British Architects (RIBA) Silver Medal award for his new video "Augmented City". As in his previous work, in this new video Matsuda describes a future world overlaid with digital information, in which the built environment can be manipulated by the individual. In this way, the objective physical world is transformed into a subjective virtual space.
In Matsuda's own words:
Augmented City explores the social and spatial implications of an AR-supported future. 'Users' of the city can browse through channels of the augmented city, creating aggregated customised environments. Identity is constructed and broadcast, while local records and coupons litter the streets. The augmented city is an architectural construct modulated by the user, a liquid city of stratified layers that perceptually exists in the space between the self and the built environment. This subjective space allows us to re-evaluate current trends, and examine our future occupation of the augmented city.
Aug 26, 2010
The Heart Chamber Orchestra consists of classical musicians who use their heartbeats to control a computer composition and visualization environment. To the best of my knowledge, this is the first example of "group biofeedback".
The musicians are equipped with ECG (electrocardiogram) sensors. A computer monitors and analyzes the state of these 12 hearts in real time. The acquired information is used to compose a musical score with the aid of computer software. It is a living score dependent on the state of the hearts.
While the musicians are playing, their heartbeats influence and change the composition and vice versa. The musicians and the electronic composition are linked via the hearts in a circular motion, a feedback structure. The emerging music evolves entirely during the performance.
The resulting music is the expression of this process and of an organism forming itself from the circular interplay of the individual musicians and the machine.
The sensor network consists of 12 individual sensors, each fitted onto the body of a musician. A computer receives the heartbeat data; software analyzes it and generates, via different algorithms, the real-time musical score for the musicians, the electronic sounds, and the computer graphic visualization.
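A minimal sketch of one such heartbeat-to-score mapping might look like the following. This is my own toy mapping (intervals to tempo, averaged across players), not the orchestra's actual software:

```python
# Toy heartbeat-to-tempo mapping (not the orchestra's actual software).
def beats_to_tempo(beat_intervals_s):
    """Convert one musician's inter-beat intervals (seconds) into a
    tempo in BPM, clamped to a playable range."""
    avg = sum(beat_intervals_s) / len(beat_intervals_s)
    return max(40.0, min(200.0, 60.0 / avg))

def ensemble_tempo(all_intervals):
    """Fuse the musicians' individual heart rates into one global tempo
    for the live score."""
    tempi = [beats_to_tempo(iv) for iv in all_intervals]
    return sum(tempi) / len(tempi)
```

A real system would of course map the ECG data onto many more score parameters than tempo, closing the feedback loop described above.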
Below is a video documentation from the Heart Chamber Orchestra performance on the 28th of March 2010 at Kiasma Theatre at Pixelache Festival in Helsinki, Finland.
Aug 24, 2010
Vicon Revue is a wearable digital camera that takes photographs automatically while it is being worn, allowing the user to keep a photo log of his or her day-to-day life. The camera is the commercial evolution of SenseCam, a project developed by Microsoft Research in Cambridge, UK. Vicon Revue is sold to researchers for about $700, but the consumer version (expected next year) could be significantly cheaper.
The camera is equipped with a wide-angle lens that provides a fish-eye view and contains several sensors, such as a temperature sensor, a light color and intensity sensor, an infra-red motion detector, a multi-axis accelerometer, and a 3-axis magnetometer (compass). The camera is reasonably small (6.5cm w x 7cm h x 1.7cm d; 94g) and includes 2GB of memory.
To get an idea of the quality of the output you can take a look at this time lapse video showing images taken over the course of 24 hours with Vicon Revue.
Short-term meditation induces white matter changes in the anterior cingulate.
Proc Natl Acad Sci U S A. 2010 Aug 16;
Authors: Tang YY, Lu Q, Geng X, Stein EA, Yang Y, Posner MI
The anterior cingulate cortex (ACC) is part of a network implicated in the development of self-regulation and whose connectivity changes dramatically in development. In previous studies we showed that 3 h of mental training, based on traditional Chinese medicine (integrative body-mind training, IBMT), increases ACC activity and improves self-regulation. However, it is not known whether changes in white matter connectivity can result from small amounts of mental training. We here report that 11 h of IBMT increases fractional anisotropy (FA), an index indicating the integrity and efficiency of white matter in the corona radiata, an important white-matter tract connecting the ACC to other structures. Thus IBMT could provide a means for improving self-regulation and perhaps reducing or preventing various mental disorders.
The Large Hadron Collider, the world's largest and highest-energy particle accelerator, promises to revolutionize our knowledge of the universe and advance our understanding of the most fundamental laws of nature.
But if you are not very good at quantum physics, do not worry: you can still get an explanation, in rap form, of what the LHC is all about.
A 'sound' explanation, indeed!
Aug 17, 2010
Using mirror visual feedback and virtual reality to treat fibromyalgia.
Med Hypotheses. 2010 Aug 5;
Authors: Ramachandran VS, Seckel EL
Fibromyalgia is a condition characterized by long term body-wide pain and tender points in joints, muscles and soft tissues. Other symptoms include chronic fatigue, morning stiffness, and depression. It is well known that these symptoms are exacerbated under periods of high stress. When pain becomes severe enough, the mind can enter what is known as a dissociative state, characterized by depersonalization - the feeling of detachment from one's physical body and the illusion of watching one's physical body from outside. In evolutionary terms, dissociative states are thought to be an adaptive mechanism to mentally distance oneself from pain, often during trauma. Similar dissociative experiences are reported by subjects who have used psychoactive drugs such as ketamine. We have previously used non-invasive mirror visual feedback to treat subjects with chronic pain from phantom limbs and suggested its use for complex regional pain syndrome: once considered intractable pain. We wondered whether such methods would work to alleviate the chronic pain of fibromyalgia. We tested mirror visual feedback on one fibromyalgia patient. On 15 trials, the patient's lower limb pain rating (on a scale from 1 to 10) decreased significantly. These preliminary results suggest that non-invasive dissociative anesthetics such as VR goggles, ketamine, and mirror visual feedback could be used to alleviate chronic pain from fibromyalgia. This would furnish us with a better understanding of the mechanism by which external visual feedback interacts with the internal physical manifestation of pain.
A randomized, controlled trial of immersive virtual reality analgesia, during physical therapy for pediatric burns.
Burns. 2010 Aug 6;
Authors: Schmitt YS, Hoffman HG, Blough DK, Patterson DR, Jensen MP, Soltani M, Carrougher GJ, Nakamura D, Sharar SR
This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6-19 years old) performed range-of-motion exercises under a therapist's direction for 1-5 days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects' perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27-44%) in pain ratings during virtual reality. They also reported improved affect ("fun") during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use.
Aug 16, 2010
There is nothing more regenerating than a long sea vacation. But what do we do when we get back to the office and find an overwhelming pile of email? A good strategy for recovering from post-vacation stress is essential, and advanced technologies may help.
For example, My Relax 3D is a mobile application that helps you relax while you watch stunning 3D landscapes of an exotic island. When you enter the application, you can choose between highly realistic 3D environments depicting various island scenarios (e.g., a tropical forest, a sunset).
During the experience, a voiceover provides instructions to relieve stress and develop positive emotions.
The application is highly configurable: it can be experienced with or without 3D glasses (though I strongly recommend using the glasses to enhance your feeling of "presence"). It is also possible to choose among different pleasant music themes.
Of course, it's not like a first class holiday in a luxury resort... but it's definitely the best you can do with five bucks!
Jul 30, 2010
Disabled persons, quadriplegics and others suffering from paralysis may be able to regain mobility and communication with a sniff-activated sensor, according to a study by Israeli researchers.
The technology works by translating changes in nasal air pressure into electrical signals that are passed to a computer. Patients can sniff in certain patterns to select letters or numbers to compose text, or to control the mouse cursor on a computer. For getting around, sniffing controls the direction of the wheelchair, Bloomberg reports.
Quadriplegic patients were able to use the device to navigate wheelchairs as well as healthy people could. Two participants who were completely paralyzed but with intact mental function used the technology to communicate by choosing letters on a computer screen to write. The study appears in the Proceedings of the National Academy of Sciences.
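The pattern-based control can be illustrated with a short sketch. The encoding below (two successive sniffs mapped to one wheelchair command, with an arbitrary pressure threshold) is entirely hypothetical; the study's actual scheme may differ:

```python
# Hypothetical sniff-pattern encoding; the study's actual scheme may differ.
SNIFF_COMMANDS = {
    ("in", "in"): "forward",
    ("out", "out"): "backward",
    ("in", "out"): "turn left",
    ("out", "in"): "turn right",
}

def classify_sniff(pressure_delta, threshold=0.05):
    """Label a nasal pressure change as a sniff in, a sniff out, or noise."""
    if pressure_delta > threshold:
        return "in"
    if pressure_delta < -threshold:
        return "out"
    return None

def decode_command(pressure_deltas):
    """Map the first pair of detected sniffs to a wheelchair command."""
    sniffs = [s for s in (classify_sniff(p) for p in pressure_deltas) if s]
    return SNIFF_COMMANDS.get(tuple(sniffs[:2]))
```

The same classifier could just as easily drive letter selection on a screen instead of a wheelchair.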
Jun 23, 2010
First person experience of body transfer in virtual reality.
PLoS One. 2010;5(5):e10564
Authors: Slater M, Spanlang B, Sanchez-Vives MV, Blanke O
BACKGROUND: Altering the normal association between touch and its visual correlate can result in the illusory perception of a fake limb as part of our own body. Thus, when touch is seen to be applied to a rubber hand while felt synchronously on the corresponding hidden real hand, an illusion of ownership of the rubber hand usually occurs. The illusion has also been demonstrated using visuomotor correlation between the movements of the hidden real hand and the seen fake hand. This type of paradigm has been used with respect to the whole body generating out-of-the-body and body substitution illusions. However, such studies have only ever manipulated a single factor and although they used a form of virtual reality have not exploited the power of immersive virtual reality (IVR) to produce radical transformations in body ownership. PRINCIPAL FINDINGS: Here we show that a first person perspective of a life-sized virtual human female body that appears to substitute the male subjects' own bodies was sufficient to generate a body transfer illusion. This was demonstrated subjectively by questionnaire and physiologically through heart-rate deceleration in response to a threat to the virtual body. This finding is in contrast to earlier experimental studies that assume visuotactile synchrony to be the critical contributory factor in ownership illusions. Our finding was possible because IVR allowed us to use a novel experimental design for this type of problem with three independent binary factors: (i) perspective position (first or third), (ii) synchronous or asynchronous mirror reflections and (iii) synchrony or asynchrony between felt and seen touch. CONCLUSIONS: The results support the notion that bottom-up perceptual mechanisms can temporarily override top down knowledge resulting in a radical illusion of transfer of body ownership. 
The research also illustrates immersive virtual reality as a powerful tool in the study of body representation and experience, since it supports experimental manipulations that would otherwise be infeasible, with the technology being mature enough to represent human bodies and their motion.
Virtual reality hypnosis for pain associated with recovery from physical trauma.
Int J Clin Exp Hypn. 2010 Jul;58(3):288-300
Authors: Patterson DR, Jensen MP, Wiechman SA, Sharar SR
Pain following traumatic injuries is common, can impair injury recovery and is often inadequately treated. In particular, the role of adjunctive nonpharmacologic analgesic techniques is unclear. The authors report a randomized, controlled study of 21 hospitalized trauma patients to assess the analgesic efficacy of virtual reality hypnosis (VRH)-hypnotic induction and analgesic suggestion delivered by customized virtual reality (VR) hardware/software. Subjective pain ratings were obtained immediately and 8 hours after VRH (used as an adjunct to standard analgesic care) and compared to both adjunctive VR without hypnosis and standard care alone. VRH patients reported less pain intensity and less pain unpleasantness compared to control groups. These preliminary findings suggest that VRH analgesia is a novel technology worthy of further study, both to improve pain management and to increase availability of hypnotic analgesia to populations without access to therapist-provided hypnosis and suggestion.
Therapists' Perception of Benefits and Costs of Using Virtual Reality Treatments.
Cyberpsychol Behav Soc Netw. 2010 Jun 14;
Authors: Segal R, Bhatia M, Drapeau M
Abstract Research indicates that virtual reality is effective in the treatment of many psychological difficulties and is being used more frequently. However, little is known about therapists' perception of the benefits and costs related to the use of virtual therapy in treatment delivery. In the present study, 271 therapists completed an online questionnaire that assessed their perceptions about the potential benefits and costs of using virtual reality in psychotherapy. Results indicated that therapists perceived the potential benefits as outweighing the potential costs. Therapists' self-reported knowledge of virtual reality, theoretical orientation, and interest in using virtual reality were found to be associated with perceptual measures. These findings contribute to the current knowledge of the perception of virtual reality amongst psychotherapists.