
Oct 19, 2010

Neurocognitive systems related to real-world prospective memory

PLoS One. 2010;5(10):

Authors: Kalpouzos G, Eriksson J, Sjölie D, Molin J, Nyberg L

BACKGROUND: Prospective memory (PM) denotes the ability to remember to perform actions in the future. It has been argued that standard laboratory paradigms fail to capture core aspects of PM.

METHODOLOGY/PRINCIPAL FINDINGS: We combined functional MRI, virtual reality, eye-tracking and verbal reports to explore the dynamic allocation of neurocognitive processes during a naturalistic PM task in which individuals performed errands in a realistic model of their residential town. Based on eye movement data and verbal reports, we modeled PM as an iterative loop of five sustained and transient phases: intention maintenance before target detection (TD), TD, intention maintenance after TD, action, and switching, the latter representing the activation of a new intention in mind. The fMRI analyses revealed continuous engagement of a top-down fronto-parietal network throughout the entire task, likely subserving goal maintenance. In addition, a shift was observed from a perceptual (occipital) system while searching for places to go, to a mnemonic (temporo-parietal, fronto-hippocampal) system for remembering what actions to perform after TD. Updating of the top-down fronto-parietal network occurred at both TD and switching, the latter likely also being characterized by frontopolar activity.

CONCLUSION/SIGNIFICANCE: Taken together, these findings show how brain systems interact in a complementary fashion during real-world PM. They support a more complete model of PM, applicable to naturalistic tasks, that we named the PROspective MEmory DYnamic (PROMEDY) model because of its dynamics in both multi-phase iteration and the interactions of distinct neurocognitive networks.
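
For readers who think in code, the five-phase loop at the heart of the PROMEDY model can be pictured as a simple state machine. The sketch below is my own illustration, not the authors' implementation; the phase names follow the abstract, while the looping logic is an assumption.

```python
# Toy state machine for the PROMEDY five-phase loop (illustrative only).
from enum import Enum, auto

class Phase(Enum):
    MAINTENANCE_PRE_TD = auto()   # holding the intention before target detection
    TARGET_DETECTION = auto()     # TD: spotting the relevant place/cue
    MAINTENANCE_POST_TD = auto()  # recalling what action to perform there
    ACTION = auto()               # executing the errand
    SWITCHING = auto()            # activating the next intention in mind

# One iteration per errand; after SWITCHING the cycle restarts
# until no intentions remain.
def run_errands(intentions):
    for intention in intentions:
        for phase in Phase:  # members iterate in definition order
            print(f"{intention}: {phase.name}")

run_errands(["buy stamps", "return library book"])
```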

HRP-4C cybernetic human dance

Dance Robot LIVE! is a performance recently shown at the Digital Content Expo in Tokyo. The performance features AIST's feminine HRP-4C robot and four humans. The routine was produced by renowned dancer/choreographer SAM-san and the lip-synced song is a Vocaloid version of "Deatta Koro no Yō ni" by Kaori Mochida (Every Little Thing).


Oct 18, 2010

Nature Neuroscience features crowdfunding in science

The September issue of Nature Neuroscience has an editorial about the use of microfinance for scientific research.

The editorial is a sign of growing interest from the research community toward this strategy, which my colleague Giuseppe Riva and I described in a letter to Science [Gaggioli, A., & Riva, G. (2008). Working the Crowd. Science, 321(5895), 1443].

Recently, we have teamed up with the Institute of Physiology of the National Research Council and the Italian Federation of Rare Diseases to develop Open Genius, a crowdfunding platform for research in rare diseases.

We have also created a website (in Italian and English) where you can find up-to-date information about the project.

Open Genius is a not-for-profit initiative of the scientific community that partners with like-minded entities, including academic, philanthropic, and government funding agencies.

If you want to collaborate or propose a partnership, you can write to us at:

info(at)opengenius.org


Oct 17, 2010

Mapping virtual content onto 3D physical constructions

Via Augmented Times

This video shows the results achieved in the paper "Build Your World and Play In It: Interacting with Surface Particles on Complex Objects", presented at the ISMAR 2010 conference by Brett Jones and fellow researchers from the University of Illinois. The paper presents a way to map virtual content onto 3D physical constructions and "play" with them. Nice stuff.

When your liver blogs

Imec Netherlands has demonstrated a new type of wireless body area network (BAN). The Human++ BAN platform converts Imec's ultra-low-power electrocardiogram sensors into wireless nodes in a short-range network, transmitting physiological data to a hub (the patient's cellphone). From there, the readings can be forwarded to doctors via a Wi-Fi or 3G connection.
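
To make the architecture concrete, here is a minimal sketch of that sensor-to-hub-to-clinic pipeline. All class and method names are my own illustrative assumptions, not Imec's actual Human++ API.

```python
# Toy body-area-network data flow: ECG nodes push readings to a phone hub,
# which buffers them and forwards batches onward (illustrative only).
import time, random

class EcgNode:
    def __init__(self, node_id):
        self.node_id = node_id
    def sample(self):
        # stand-in for a real ultra-low-power ECG reading (millivolts)
        return {"node": self.node_id, "t": time.time(), "mv": random.gauss(0.0, 0.5)}

class PhoneHub:
    def __init__(self):
        self.outbox = []
    def receive(self, reading):
        self.outbox.append(reading)      # buffer locally on the phone
        if len(self.outbox) >= 10:
            self.forward_to_clinic()
    def forward_to_clinic(self):
        # in a real system this would be an HTTPS call over Wi-Fi or 3G
        print(f"forwarding {len(self.outbox)} readings to the clinic")
        self.outbox.clear()

hub = PhoneHub()
node = EcgNode("chest-1")
for _ in range(25):
    hub.receive(node.sample())
```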

Body sensing comes to smartphones

Via New York Times

BodyMedia FIT (BodyMedia)

BodyMedia has announced that its armband sensors will be able to communicate wirelessly with smartphones using Bluetooth. Its health sensors will be among the first devices, other than earbuds, to link to smartphones over Bluetooth short-range communications.

It opens the door to allowing a person to monitor a collection of the 9,000 variables — physical activity, calories burned, body heat, sleep efficiency and others — collected by the sensors in a BodyMedia armband in real time, as the day goes on.
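
As a rough illustration of what such real-time monitoring could look like on the phone side, here is a toy Python loop over a simulated armband stream. The field names and sampling scheme are my assumptions, not BodyMedia's actual Bluetooth protocol.

```python
# Simulated minute-by-minute armband stream with a running summary
# (illustrative only; not BodyMedia's data format).
import random

def armband_stream(n_samples):
    for _ in range(n_samples):
        yield {
            "steps": random.randint(0, 30),         # steps this minute
            "calories": random.uniform(0.5, 3.0),   # kcal burned this minute
            "skin_temp_c": random.uniform(32, 35),  # body-heat proxy
        }

totals = {"steps": 0, "calories": 0.0}
for sample in armband_stream(60):  # one simulated hour
    totals["steps"] += sample["steps"]
    totals["calories"] += sample["calories"]

print(f"last hour: {totals['steps']} steps, {totals['calories']:.0f} kcal")
```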

GoWear fit Armband (BodyMedia)

The Bluetooth-enabled armband costs $249, the BodyMedia data service costs $7 a month, and both go on sale next month. In the past, BodyMedia users had to consult personal data downloaded to a website or view a few measurements on a special watchband display, sold for $100.


Growing neurons on silicon chips

Via Robots.net

Researchers at the University of Calgary have developed neurochips capable of interfacing with and sensing the activity of biological neurons at very high resolution. The new chips are automated, so connecting multiple brain cells is now easy, eliminating the years of training it once required. While the researchers say this technology could be used for new diagnostic methods and treatments for a variety of neurodegenerative diseases, the advance could ultimately lead to the use of biological neurons in the central or sub-processing units of computers and automated machinery.


Oct 12, 2010

New issue of Cybertherapy and Rehabilitation now online in full text

The new issue of Cybertherapy and Rehabilitation magazine (3/2) is now online and available for full-text download. Topics covered by this issue include brain-computer interfaces, cognitive enhancement and training, and the use of massively multiplayer online games in rehabilitation and therapy.


Oct 07, 2010

Raytheon shows off the XOS2 Exoskeleton robotic suit


Oct 05, 2010

Patient self-monitoring technology could save needed funds in Britain

Via Telemedicine and E-Health Journal newsletter

Britain's National Health Service could meet a substantial part of its $31.6 billion [USD] cost-savings program simply by using technology that enables patients to monitor their own conditions, according to the health department's chief information officer. DOH CIO Christine Connelly said patients need 21st-century technology to help them make informed health choices and "take control of their health and experiences." Experts note that home-based technology could reduce hospital admissions, physician call-outs and repeat problems; however, current access to health information is too difficult and jargon-ridden, and fails to reach people in their homes. The challenge, they note, is for software developers to create applications that can overcome these issues.

Oct 04, 2010

Crowdfunding science: utopia or reality?

Several initiatives are exploring the potential of crowdfunding for supporting scientific research. In this approach, which I described in a letter to Science, projects seeking funding are stored in an online repository, and investors (either individuals or funding agencies) can browse the list of public projects and decide which ones to fund.
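
A minimal sketch of that repository-plus-investors workflow might look like the following. The data model is purely illustrative and is not Open Genius's actual design.

```python
# Toy crowdfunding repository: public projects plus a pledge operation
# (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Project:
    title: str
    goal_eur: float
    pledged_eur: float = 0.0
    backers: list = field(default_factory=list)

repository = [
    Project("Rare-disease biomarker screen", goal_eur=50_000),
    Project("Patient registry platform", goal_eur=20_000),
]

def pledge(project: Project, donor: str, amount_eur: float):
    project.pledged_eur += amount_eur
    project.backers.append(donor)

pledge(repository[0], "donor-42", 100.0)
for p in repository:
    print(f"{p.title}: {p.pledged_eur:.0f}/{p.goal_eur:.0f} EUR, "
          f"{len(p.backers)} backers")
```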

The closest example of crowdfunded science is Cancer Research UK's MyProjects scheme (http://myprojects.cancerresearchuk.org/). Launched in October 2008, MyProjects allows Cancer Research UK donors to search projects by type of cancer and location to find a specific research project to donate money to.

I am also running a crowdfunding-science project in Italy, called Open Genius. The website is available only in Italian, but you can find the essential info about the project in this presentation.


I'd love to hear your comments about this!


Do you want to make money with your blog? Try Flattr

Wanna make money with your blog? Do you have quality content that you would like to monetize? Now you can do it, with a revolutionary micropayment system called Flattr.

The system was launched publicly in March 2010 on an invite-only basis, and then opened up to the public in August 2010.

How does it work? 

You pay a small monthly amount (using either Moneybookers or PayPal) and then click Flattr buttons on sites to share out the money you paid among those sites, sort of like an Internet tip jar. The minimum is 2 euros per month. The money paid each month is spread evenly among the buttons you clicked that month. In this way, users share not only money but also content. For the service, Flattr keeps 10% of each user's monthly flat rate.
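
The month-end arithmetic is simple enough to show in a few lines of Python. The numbers follow the description above; the function itself is just my illustration.

```python
# Month-end split: the flat rate, minus Flattr's 10% cut, divided evenly
# across everything you clicked that month (illustrative only).
def month_end_split(flat_rate_eur: float, clicks: list[str]):
    pot = flat_rate_eur * 0.90            # Flattr keeps 10%
    share = pot / len(clicks) if clicks else 0.0
    return {site: round(share, 2) for site in clicks}

# e.g. the 2 EUR minimum spread across three flattered sites
print(month_end_split(2.0, ["blog-a", "blog-b", "photo-site"]))
# {'blog-a': 0.6, 'blog-b': 0.6, 'photo-site': 0.6}
```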

What is interesting about this service is that not only sites that support Flattr, but all sites, can have Flattr buttons.

Good Flattr to everybody!


Sep 26, 2010

Change in brain activity through virtual reality-based brain-machine communication in a chronic tetraplegic subject with muscular dystrophy

BMC Neurosci. 2010 Sep 16;11(1):117

Authors: Hashimoto Y, Ushiba J, Kimura A, Liu M, Tomita Y

BACKGROUND: For severely paralyzed people, a brain-computer interface (BCI) provides a way of re-establishing communication. Although subjects with muscular dystrophy (MD) appear to be potential BCI users, the actual long-term effects of BCI use on brain activities in MD subjects have yet to be clarified. To investigate these effects, we followed BCI use by a chronic tetraplegic subject with MD over 5 months. The topographic changes in an electroencephalogram (EEG) after long-term use of the virtual reality (VR)-based BCI were also assessed. Our originally developed BCI system was used to classify an EEG recorded over the sensorimotor cortex in real time and estimate the user's motor intention (MI) in 3 different limb movements: feet, left hand, and right hand. An avatar in the internet-based VR was controlled in accordance with the results of the EEG classification by the BCI. The subject was trained to control his avatar via the BCI by strolling in the VR for 1 hour a day and then continued the same training twice a month at his home.

RESULTS: After the training, the error rate of the EEG classification decreased from 40% to 28%. The subject successfully walked around in the VR using only his MI and chatted with other users through a voice-chat function embedded in the internet-based VR. With this improvement in BCI control, event-related desynchronization (ERD) following MI was significantly enhanced (p < 0.01) for feet MI (from -29% to -55%), left-hand MI (from -23% to -42%), and right-hand MI (from -22% to -51%).

CONCLUSIONS: These results show that our subject with severe MD was able to learn to control his EEG signal and communicate with other users through use of VR navigation, and suggest that an internet-based VR has the potential to provide paralyzed people with the opportunity for easy communication.
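
For context, ERD percentages like those reported above are conventionally computed as the change in EEG band power during motor imagery relative to a resting baseline, with negative values meaning desynchronization. A minimal sketch, with made-up sample numbers:

```python
# Conventional ERD computation: relative band-power change between
# a motor-imagery epoch and a resting baseline (sample values invented).
def erd_percent(power_event: float, power_baseline: float) -> float:
    return (power_event - power_baseline) / power_baseline * 100.0

baseline_mu_power = 10.0   # e.g. mu-band power (uV^2) at rest
imagery_mu_power = 4.5     # power during foot motor imagery
print(f"ERD = {erd_percent(imagery_mu_power, baseline_mu_power):.0f}%")
# ERD = -55%  (negative: desynchronization)
```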

Sep 20, 2010

XWave: Control your iPhone with your brain

The XWave is a new technology that uses a single electrode placed on the wearer's forehead to measure electroencephalography (EEG) data and converts these analog signals into digital form so they can be used to control an external device. The XWave comes bundled with software that includes a number of brain-training exercises. These include levitating a ball on the iDevice's screen, changing a color based on your brain's relaxation level, and training your brain to maximize its attention span.


In the company’s own words:

XWave, powered by NeuroSky eSense patented technologies, senses the faintest electrical impulses transmitted through your skull to the surface of your forehead and converts these analogue signals into digital. With XWave, you will be able to detect attention and meditation levels, as well as train your mind to control things. Objects in a game can be controlled, lights in your living room can change colour depending on your mood; the possibilities are limited to only the power of your imagination.

The interesting feature is that the company is also opening up its APIs so developers can design and develop apps for the XWave device. The company reports that some apps already in development include games in which objects are controlled by the wearer's mind, and another that allows the wearer to control the lights in their home or select music based on their mood. You can order an XWave for US$100; it ships on November 1.
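
To give a flavor of what apps built on those APIs might look like, here is a hypothetical Python sketch. The XWaveDevice class and its methods are invented for illustration; they are not the actual XWave SDK.

```python
# Hypothetical attention-driven app loop (invented API, illustrative only).
import random

class XWaveDevice:
    def read_attention(self) -> int:   # 0-100, higher = more focused
        return random.randint(0, 100)
    def read_meditation(self) -> int:  # 0-100, higher = more relaxed
        return random.randint(0, 100)

device = XWaveDevice()
for _ in range(5):
    attention = device.read_attention()
    if attention > 70:
        print(f"attention {attention}: levitate the ball")
    else:
        print(f"attention {attention}: ball stays put")
```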


Sep 19, 2010

Artificial skin projects could restore feeling to wearers of prosthetic limbs

Via Telemedicine and E-Health news

Research groups at Stanford University and the University of California at Berkeley are developing sensor-based artificial skin that could provide prosthetic and robotic limbs with a realistic sense of touch. Stanford's project is based on organic electronics and is capable of detecting the weight of a fly upon the artificial skin, according to Zhenan Bao, professor of chemical engineering at Stanford.

The highly sensitive surfaces could also help robots pick up delicate objects without breaking them, improve surgeons' control over tools used for minimally invasive surgery, and increase the efficiency of touchscreen devices, she noted. Meanwhile, UC Berkeley's "e-skin" uses low-power, integrated arrays of nanowire transistors, according to UC Berkeley Professor of Electrical Engineering and Computer Science Ali Javey.

Thus far, the skin, the first ever made out of inorganic single crystalline semiconductors, is able to detect pressure equivalent to the touch of a keyboard. "It's a technique that can be potentially scaled up," said study lead author Kuniharu Takei, post-doctoral fellow in electrical engineering and computer sciences at UC Berkeley. "The limit now to the size of the e-skin we developed is the size of the processing tools we are using."


Sep 03, 2010

Samsung Galaxy Tab: iPad killer or flop?

Samsung has finally unveiled its new Galaxy Tab at the IFA conference in Berlin. The Galaxy Tab runs the Android 2.2 operating system, which supports HTML5 and Adobe's Flash Player — unlike the iPad. It comes with 16 or 32GB of storage, expandable by a further 32GB. The device weighs 380g (14oz) and has an 18cm (7in) screen, making it smaller and lighter than the iPad.

The Galaxy supports Bluetooth, Wi-Fi and 3G cell phone networks, and comes with two cameras: a 3-megapixel camera with a flash on the back of the device, and a second camera on the front for video conferencing — a feature the iPad lacks.

The price is not very competitive, though: a number of European news sources are reporting that the Galaxy Tab will cost €699 and €799 for the 16GB and 32GB models, respectively.


Sep 02, 2010

The Blue Brain Project

As computing power continues to increase, it may ultimately become possible to simulate the functioning of the most complex system in the known universe: the brain. This is the ambitious goal of the Blue Brain Project, the first attempt to reverse-engineer the mammalian brain.

The project is expected to provide answers to a number of fundamental questions, ranging from the emergence of biological intelligence to the evolution of consciousness.

Led by neuroscientist Henry Markram, Blue Brain was launched in 2005 as a joint research initiative between the Brain Mind Institute at the École Polytechnique Fédérale de Lausanne (EPFL) and the information technology giant IBM. Using the impressive processing power of IBM's Blue Gene/L supercomputer, the project reached its first milestone in December 2006, with the development of a model of the rat neocortical column (NCC).


To perform empirically based simulations of individual cells, the Blue Gene/L supercomputer uses the NEURON software developed by Michael Hines, John W. Moore, and Ted Carnevale at Yale and Duke. With this software, processors are turned into neuron simulators and communication cables into axons interconnecting the neurons, transforming the entire Blue Gene into a cortical microcircuit.
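
As a taste of what NEURON does at the single-cell level, here is a minimal simulation of one Hodgkin-Huxley compartment using NEURON's Python interface. Blue Brain scales this kind of model up massively; this toy example is nowhere near a neocortical column.

```python
# Single-compartment Hodgkin-Huxley cell in NEURON (requires NEURON installed).
from neuron import h
h.load_file("stdrun.hoc")          # standard run library (for continuerun)

soma = h.Section(name="soma")
soma.L = soma.diam = 20            # 20 um compartment
soma.insert("hh")                  # Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))         # current injection at mid-section
stim.delay, stim.dur, stim.amp = 5, 20, 0.2   # ms, ms, nA

v = h.Vector().record(soma(0.5)._ref_v)       # membrane potential trace
t = h.Vector().record(h._ref_t)               # time trace

h.finitialize(-65)                 # start at resting potential (mV)
h.continuerun(40)                  # simulate 40 ms
print(f"peak Vm: {max(v):.1f} mV") # should spike well above 0 mV
```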

In November 2007, the project completed its first phase with the development of a new modelling framework for the construction and validation of neural circuits built from biological data. The project is now striving to simplify the simulation of the column, in order to allow the parallel simulation of multiple connected columns. If this strategy succeeds, the final objective will be to simulate a whole human neocortex, which comprises about one million cortical columns.

More to explore:

Markram, H. (2006). The Blue Brain Project. Nature Reviews Neuroscience, 7, 153-160.

Aug 27, 2010

The Experiment-Driven Life

A great presentation by Matthew Cornell about the concept of self-tracking and the implications of this emerging research field for people's wellbeing.

Augmented City

Keiichi Matsuda did it again. After the success of Domestic Robocop, the architecture graduate and filmmaker has been nominated for the Royal Institute of British Architects (RIBA) Silver Medal award for his new video, "Augmented City". As in his previous work, Matsuda depicts a future world overlaid with digital information, in which the built environment can be manipulated by the individual. In this way, the objective physical world is transformed into a subjective virtual space.

In Matsuda's own words:

Augmented City explores the social and spatial implications of an AR-supported future. 'Users' of the city can browse through channels of the augmented city, creating aggregated customised environments. Identity is constructed and broadcast, while local records and coupons litter the streets. The augmented city is an architectural construct modulated by the user, a liquid city of stratified layers that perceptually exists in the space between the self and the built environment. This subjective space allows us to re-evaluate current trends, and examine our future occupation of the augmented city.


Aug 26, 2010

Heart Chamber Orchestra

The Heart Chamber Orchestra consists of classical musicians who use their heartbeats to control a computer-based composition and visualization environment. To the best of my knowledge, this is the first example of "group biofeedback".

The musicians are equipped with ECG (electrocardiogram) sensors. A computer monitors and analyzes the state of these 12 hearts in real time. The acquired information is used to compose a musical score with the aid of computer software. It is a living score dependent on the state of the hearts.



While the musicians are playing, their heartbeats influence and change the composition and vice versa. The musicians and the electronic composition are linked via the hearts in a circular motion, a feedback structure. The emerging music evolves entirely during the performance.

The resulting music is the expression of this process and of an organism forming itself from the circular interplay of the individual musicians and the machine.

The sensor network consists of 12 individual sensors, each fitted to the body of a musician. A computer receives the heartbeat data; software analyzes it and, via different algorithms, generates in real time the musical score for the musicians, the electronic sounds, and the computer-graphic visualization.
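
A toy version of that heartbeat-to-score mapping might look like the sketch below. The algorithm is my own illustrative assumption, not the Orchestra's actual software.

```python
# Toy mapping from beat-to-beat intervals to per-musician score events
# (illustrative only).
def bpm_from_rr(rr_intervals_s):
    """Heart rate from recent R-R intervals (seconds between beats)."""
    mean_rr = sum(rr_intervals_s) / len(rr_intervals_s)
    return 60.0 / mean_rr

def score_event(musician: int, bpm: float):
    tempo = int(bpm)                  # play at the heart's own tempo
    pitch = 48 + int((bpm - 40) / 3)  # faster heart -> higher MIDI note
    return {"musician": musician, "tempo": tempo, "midi_pitch": pitch}

# two of the twelve musicians, each with their latest R-R intervals
for musician, rr in enumerate([[0.9, 0.88, 0.91], [0.7, 0.72, 0.69]], start=1):
    print(score_event(musician, bpm_from_rr(rr)))
```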

Below is video documentation of the Heart Chamber Orchestra performance on 28 March 2010 at the Kiasma Theatre, during the Pixelache Festival in Helsinki, Finland.