

Sep 25, 2009

Miruko Eyeball Robotic Eye

Via Pink Tentacle

Miruko is a camera robot in the shape of an eyeball capable of tracking objects and faces. Worn on the player’s sleeve, Miruko scans the surroundings with its roving eye in search of virtual monsters that are invisible to the naked human eye. When a virtual monster is spotted, the mechanical eyeball rolls around in its socket and fixes its gaze on the monster’s location. By following Miruko’s line of sight, the player is able to locate the virtual monster and “capture” it with his or her iPhone camera.
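The basic control loop behind a device like this is easy to picture: detect a target in the camera frame, then convert its offset from the image centre into pan/tilt angles for the eyeball. The creators have not published their code, so the following Python snippet is only a minimal sketch of that idea; the frame size and field-of-view values are assumptions, not Miruko's actual specifications.

```python
# Hypothetical camera parameters, not taken from the Miruko hardware.
FRAME_W, FRAME_H = 640, 480      # image size in pixels
FOV_H, FOV_V = 60.0, 45.0        # horizontal/vertical field of view (degrees)

def gaze_angles(target_x, target_y):
    """Convert a detected target's pixel position into pan/tilt angles
    for the eyeball, relative to the camera's optical axis."""
    dx = (target_x - FRAME_W / 2) / FRAME_W   # normalised offset, in [-0.5, 0.5]
    dy = (target_y - FRAME_H / 2) / FRAME_H
    pan = dx * FOV_H                          # positive = look right
    tilt = -dy * FOV_V                        # positive = look up
    return pan, tilt

# Example: a virtual monster detected in the upper-right of the frame.
print(gaze_angles(520, 100))                  # -> (18.75, 13.125) degrees
```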

In this video, Miruko’s creators demonstrate how the robotic eyeball can be used as an interface for a virtual monster-hunting game played in a real-world environment.

 

 

According to its creators, Miruko can be used for augmented reality games, security, and navigation.

Sep 21, 2009

Nokia Mixed Reality gadgets

Cool video by the Nokia Future Tech lab on the next generation of Mixed Reality gadgets: gaze-tracking eyewear that allows browsing and selecting with your eyes, 3-D audio to find and hear spatialized sounds, and more.

Check it out:

 

HAL: New assistive walking device

Japanese company Cyberdyne, with the scientific support of Professor Sankai of Tsukuba University, has developed the Hybrid Assistive Limb (HAL), a device designed to help people walk or carry heavy loads. The assistive walking system weighs 10 kilograms and carries a battery on the back. Embedded sensors pick up, through the skin surface, the faint bio-electric signals the brain sends to the muscles. Thanks to these sensors, the system can help users move in the direction they are thinking of. The walking speed is 1.8 km/h.
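Cyberdyne's control firmware is proprietary, but the general principle of myoelectric assistance can be sketched in a few lines: when the surface signal for a muscle crosses a threshold, the corresponding joint motor adds a proportional assist torque. The threshold and gain below are invented for illustration and are not HAL's actual parameters.

```python
# Hypothetical illustration of threshold-based myoelectric assistance,
# not Cyberdyne's actual control law.
ASSIST_GAIN = 0.8             # assist torque per unit of muscle signal (made up)
ACTIVATION_THRESHOLD = 0.15   # normalised EMG level treated as "intent" (made up)

def assist_torque(emg_level):
    """Return the assist torque for one joint, given a normalised
    surface EMG reading in [0, 1]."""
    if emg_level < ACTIVATION_THRESHOLD:
        return 0.0                               # no clear intent: do not assist
    return ASSIST_GAIN * (emg_level - ACTIVATION_THRESHOLD)

for reading in (0.05, 0.2, 0.6):
    print(reading, "->", round(assist_torque(reading), 3))
```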


Watch the HAL in action in this video:


Driving dreams: cortical activations during imagined passive and active whole body movement

Driving dreams: cortical activations during imagined passive and active whole body movement.

Ann N Y Acad Sci. 2009 May;1164:372-5

Authors: Flanagin VL, Wutte M, Glasauer S, Jahn K

It is unclear how subjects perceive and process self-motion cues in virtual reality environments. Movement could be perceived as passive, akin to riding in a car, or active, such as walking down the street. These two very different types of self-motion were studied here using motor imagery in fMRI. In addition, the relative importance of visual and proprioceptive training cues was examined. Stronger activations were found during proprioceptive motor imagery compared with visual motor imagery, suggesting that proprioceptive signals are important for successful imagined movement. No significant activations were found during active movement with proprioceptive training. Passive locomotion, however, was correlated with activity in an occipital-parietal and parahippocampal cortical network, which are the same regions found during navigation with virtual reality stimuli.

Reactivity to cannabis cues in virtual reality environments

Reactivity to cannabis cues in virtual reality environments.

J Psychoactive Drugs. 2009 Jun;41(2):105-12

Authors: Bordnick PS, Copp HL, Traylor A, Graap KM, Carter BL, Walton A, Ferrer M

Virtual reality (VR) cue environments have been developed and successfully tested in nicotine, cocaine, and alcohol abusers. Aims in the current article include the development and testing of a novel VR cannabis cue reactivity assessment system. It was hypothesized that subjective craving levels and attention to cannabis cues would be higher in VR environments with cannabis cues compared to VR neutral environments. Twenty nontreatment-seeking current cannabis smokers participated in the VR cue trial. During the VR cue trial, participants were exposed to four virtual environments that contained audio, visual, olfactory, and vibrotactile sensory stimuli. Two VR environments contained cannabis cues that consisted of a party room in which people were smoking cannabis and a room containing cannabis paraphernalia without people. Two VR neutral rooms without cannabis cues consisted of a digital art gallery with nature videos. Subjective craving and attention to cues were significantly higher in the VR cannabis environments compared to the VR neutral environments. These findings indicate that VR cannabis cue reactivity may offer a new technology-based method to advance addiction research and treatment.

The sensitivity of a virtual reality task to planning and prospective memory impairments

The sensitivity of a virtual reality task to planning and prospective memory impairments: Group differences and the efficacy of periodic alerts on performance.

Neuropsychol Rehabil. 2009 Aug 26;:1-25

Authors: Sweeney S, Kersel D, Morris RG, Manly T, Evans JJ

Executive functions have been argued to be the most vulnerable to brain injury. In providing an analogue of everyday situations amenable to control and management virtual reality (VR) may offer better insights into planning deficits consequent upon brain injury. Here 17 participants with a non-progressive brain injury and reported executive difficulties in everyday life were asked to perform a VR task (working in a furniture storage unit) that emphasised planning, rule following and prospective memory tasks. When compared with an age and IQ-matched control group, the patients were significantly poorer in terms of their strategy, their time-based prospective memory, the overall time required and their propensity to break rules. An examination of sensitivity and specificity of the VR task to group membership (brain-injured or control) showed that, with specificity set at maximum, sensitivity was only modest (at just over 50%). A second component to the study investigated whether the patients' performance could be improved by periodic auditory alerts. Previous studies have demonstrated that such cues can improve performance on laboratory tests, executive tests and everyday prospective memory tasks. Here, no significant changes in performance were detected. Potential reasons for this finding are discussed, including symptom severity and differences in the tasks employed in previous studies.
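To make the sensitivity/specificity trade-off mentioned above concrete: if the cut-off score is chosen so that no control is misclassified (specificity at maximum), only a fraction of the patients fall below it. A toy calculation with made-up scores, not the study's data:

```python
def sensitivity_specificity(patient_scores, control_scores, cutoff):
    """Classify anyone scoring below `cutoff` as impaired; return
    (sensitivity, specificity). Higher scores mean better performance."""
    sens = sum(s < cutoff for s in patient_scores) / len(patient_scores)
    spec = sum(s >= cutoff for s in control_scores) / len(control_scores)
    return sens, spec

# Invented example scores, for illustration only.
patients = [12, 15, 18, 22, 25, 28, 30, 31]
controls = [24, 27, 29, 33, 35, 36, 38, 40]

# A cut-off just below the worst control keeps specificity at 1.0,
# but then only 4 of the 8 patients are detected (sensitivity 0.5).
print(sensitivity_specificity(patients, controls, cutoff=24))   # (0.5, 1.0)
```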

Increased personal space of patients with schizophrenia in a virtual social environment

Increased personal space of patients with schizophrenia in a virtual social environment.

Psychiatry Res. 2009 Sep 15;

Authors: Park SH, Ku J, Kim JJ, Jang HJ, Kim SY, Kim SH, Kim CH, Lee H, Kim IY, Kim SI

Virtual reality may be a good alternative method for measuring personal space and overcoming some limitations in previous studies on the social aspects of schizophrenia. Using this technology, we aimed to investigate the characteristics of personal space in patients with schizophrenia and evaluate the relationship between their social behaviors and schizophrenic symptoms. The distance from a virtual person and the angle of head orientation while talking to a virtual person in a virtual environment were measured in 30 patients with schizophrenia and 30 normal controls. It was found that patients with schizophrenia had longer distances and larger angles than did normal controls. The severity of the negative syndrome had significant inverse correlations with the distance from the angry and neutral virtual persons and with the angle of head orientation toward the happy and angry virtual persons, suggesting that negative symptoms may have a close relationship with personal space, including distancing and eye gaze. The larger personal space of patients may reflect their discomfort in close situations or cognitive deficits. Showing these profiles to patients could help them realize the amount of personal space they need.
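The two behavioural measures used here, distance from the virtual person and the angle of head orientation, reduce to simple geometry on tracked positions. The sketch below is my own two-dimensional simplification with hypothetical variable names, not the authors' implementation:

```python
import math

def interpersonal_distance(subject_pos, avatar_pos):
    """Euclidean distance (metres) between subject and virtual person,
    measured on the floor plane."""
    return math.hypot(avatar_pos[0] - subject_pos[0],
                      avatar_pos[1] - subject_pos[1])

def head_orientation_angle(subject_pos, head_yaw_deg, avatar_pos):
    """Absolute angle (degrees) between the subject's head direction
    and the direction towards the virtual person."""
    bearing = math.degrees(math.atan2(avatar_pos[1] - subject_pos[1],
                                      avatar_pos[0] - subject_pos[0]))
    diff = (head_yaw_deg - bearing + 180.0) % 360.0 - 180.0
    return abs(diff)

subject, avatar = (0.0, 0.0), (1.2, 0.9)
print(interpersonal_distance(subject, avatar))          # 1.5 m
print(head_orientation_angle(subject, 10.0, avatar))    # ~26.9 degrees
```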

The use of biofeedback in clinical virtual reality: the intrepid project

The use of biofeedback in clinical virtual reality: the intrepid project.

Stud Health Technol Inform. 2009;144:128-32

Authors: Repetto C, Gorini A, Algeri D, Vigna C, Gaggioli A, Riva G

In our protocol for the treatment of Generalized Anxiety Disorders we use Virtual reality (VR) to facilitate emotional regulation and the relaxation process. Using a biofeedback biomonitoring system (GSR, HR, Thermal) the patient is made aware of his or her reactions through the modification of some features of the VR environment in real time. Using mental exercises the patient learns to control these physiological parameters and using the feedback provided by the virtual environment is able to gauge his or her success. To test this concept, we planned a randomized controlled trial (NCT00602212), including three groups of 15 patients each (for a total of 45 patients): (1) the VR group, (2) the non-VR group, and (3) the waiting list (WL) group.
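The core mechanism, mapping a physiological signal onto a visual feature of the virtual environment in real time, can be illustrated in a few lines. This is only a sketch of that kind of mapping under assumed signal ranges (and an invented "fire height" feature), not the INTREPID software:

```python
def arousal_to_fire_height(gsr_microsiemens, gsr_rest=2.0, gsr_max=10.0):
    """Map a skin-conductance (GSR) reading onto the height of a virtual
    flame: the calmer the patient, the smaller the flame.
    The resting/maximum values are assumptions, not project parameters."""
    level = (gsr_microsiemens - gsr_rest) / (gsr_max - gsr_rest)
    return max(0.0, min(1.0, level))     # clamp to [0, 1] for the renderer

# Simulated relaxation session: conductance drops as the patient relaxes.
for reading in (9.5, 7.0, 4.5, 2.5):
    print(reading, "->", round(arousal_to_fire_height(reading), 2))
```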

Neurofeedback-based motor imagery training for brain-computer interface

Neurofeedback-based motor imagery training for brain-computer interface (BCI).

J Neurosci Methods. 2009 Apr 30;179(1):150-6

Authors: Hwang HJ, Kwon K, Im CH

In the present study, we propose a neurofeedback-based motor imagery training system for EEG-based brain-computer interface (BCI). The proposed system can help individuals get the feel of motor imagery by presenting them with real-time brain activation maps on their cortex. Ten healthy participants took part in our experiment, half of whom were trained by the suggested training system and the others did not use any training. All participants in the trained group succeeded in performing motor imagery after a series of trials to activate their motor cortex without any physical movements of their limbs. To confirm the effect of the suggested system, we recorded EEG signals for the trained group around sensorimotor cortex while they were imagining either left or right hand movements according to our experimental design, before and after the motor imagery training. For the control group, we also recorded EEG signals twice without any training sessions. The participants' intentions were then classified using a time-frequency analysis technique, and the results of the trained group showed significant differences in the sensorimotor rhythms between the signals recorded before and after training. Classification accuracy was also enhanced considerably in all participants after motor imagery training, compared to the accuracy before training. On the other hand, the analysis results for the control EEG data set did not show consistent increment in both the number of meaningful time-frequency combinations and the classification accuracy, demonstrating that the suggested system can be used as a tool for training motor imagery tasks in BCI applications. Further, we expect that the motor imagery training system will be useful not only for BCI applications, but for functional brain mapping studies that utilize motor imagery tasks as well.
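The classification step described above, telling left- from right-hand imagery apart from sensorimotor rhythms, is commonly implemented by extracting band power over the motor cortex (e.g., C3/C4) and feeding it to a linear classifier. The sketch below is a generic version of that pipeline on synthetic data, assuming NumPy and scikit-learn; it is not the authors' time-frequency method.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def band_power(epoch):
    """Mean squared amplitude per channel of a (channels x samples) epoch,
    standing in here for mu/beta band power at C3 and C4."""
    return (epoch ** 2).mean(axis=1)

def make_epoch(label):
    """Synthetic epoch: imagining one hand attenuates (desynchronises)
    the rhythm over the contralateral electrode."""
    epoch = rng.normal(0.0, 1.0, size=(2, 250))   # rows: [C3, C4]
    epoch[1 - label] *= 0.6                       # 0 = left hand, 1 = right hand
    return epoch

labels = np.array([0, 1] * 40)
features = np.array([band_power(make_epoch(y)) for y in labels])

clf = LinearDiscriminantAnalysis()
print("cross-validated accuracy:",
      cross_val_score(clf, features, labels, cv=5).mean())
```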

Neurofeedback and brain-computer interface clinical applications

Neurofeedback and brain-computer interface clinical applications.

Int Rev Neurobiol. 2009;86:107-17

Authors: Birbaumer N, Ramos Murguialday A, Weber C, Montoya P

Most of the research devoted to BMI development consists of methodological studies comparing different online mathematical algorithms, ranging from simple linear discriminant analysis (LDA) (Dornhege et al., 2007) to nonlinear artificial neural networks (ANNs) or support vector machine (SVM) classification. Single cell spiking for the reconstruction of hand movements requires different statistical solutions than electroencephalography (EEG)-rhythm classification for communication. In general, the algorithm for BMI applications is computationally simple and differences in classification accuracy between algorithms used for a particular purpose are small. Only a very limited number of clinical studies with neurological patients are available, most of them single case studies. The clinical target populations for BMI-treatment consist primarily of patients with amyotrophic lateral sclerosis (ALS) and severe CNS damage including spinal cord injuries and stroke resulting in substantial deficits in communication and motor function. However, an extensive body of literature started in the 1970s using neurofeedback training. Such training implemented to control various EEG-measures provided solid evidence of positive effects in patients with otherwise pharmacologically intractable epilepsy, attention deficit disorder and hyperactivity (ADHD). More recently, the successful introduction and testing of real-time fMRI and a NIRS-BMI opened an exciting field of interest in patients with psychopathological conditions.
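The remark that differences in classification accuracy between algorithms are usually small is easy to check on one's own feature set. A generic comparison sketch on synthetic band-power features, assuming scikit-learn (this is not code from any of the studies reviewed):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic two-class "EEG feature" set: 100 trials x 8 band-power features.
X = np.vstack([rng.normal(0.0, 1.0, (50, 8)),
               rng.normal(0.7, 1.0, (50, 8))])
y = np.repeat([0, 1], 50)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("linear SVM", SVC(kernel="linear")),
                  ("RBF SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name}: {acc:.2f}")
```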

Is neurofeedback an efficacious treatment for ADHD? A randomised controlled clinical trial

Is neurofeedback an efficacious treatment for ADHD? A randomised controlled clinical trial.

J Child Psychol Psychiatry. 2009 Jul;50(7):780-9

Authors: Gevensleben H, Holl B, Albrecht B, Vogel C, Schlamp D, Kratz O, Studer P, Rothenberger A, Moll GH, Heinrich H

BACKGROUND: For children with attention deficit/hyperactivity disorder (ADHD), a reduction of inattention, impulsivity and hyperactivity by neurofeedback (NF) has been reported in several studies. But so far, unspecific training effects have not been adequately controlled for and/or studies do not provide sufficient statistical power. To overcome these methodological shortcomings we evaluated the clinical efficacy of neurofeedback in children with ADHD in a multisite randomised controlled study using a computerised attention skills training as a control condition. METHODS: 102 children with ADHD, aged 8 to 12 years, participated in the study. Children performed either 36 sessions of NF training or a computerised attention skills training within two blocks of about four weeks each (randomised group assignment). The combined NF treatment consisted of one block of theta/beta training and one block of slow cortical potential (SCP) training. Pre-training, intermediate and post-training assessment encompassed several behaviour rating scales (e.g., the German ADHD rating scale, FBB-HKS) completed by parents and teachers. Evaluation ('placebo') scales were applied to control for parental expectations and satisfaction with the treatment. RESULTS: For parent and teacher ratings, improvements in the NF group were superior to those of the control group. For the parent-rated FBB-HKS total score (primary outcome measure), the effect size was .60. Comparable effects were obtained for the two NF protocols (theta/beta training, SCP training). Parental attitude towards the treatment did not differ between NF and control group. CONCLUSIONS: Superiority of the combined NF training indicates clinical efficacy of NF in children with ADHD. Future studies should further address the specificity of effects and how to optimise the benefit of NF as treatment module for ADHD.
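For readers unfamiliar with the theta/beta protocol: the feedback value it rewards is the ratio of theta (roughly 4-8 Hz) to beta (roughly 13-20 Hz) power in the ongoing EEG, which the training aims to reduce. Below is a minimal sketch of how that value can be computed from a short signal window, assuming SciPy; the band edges, sampling rate and reward rule are illustrative, not the study's exact protocol.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)

def band_power(freqs, psd, lo, hi):
    """Summed power spectral density between lo and hi Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def theta_beta_ratio(eeg_window):
    """Theta/beta power ratio for a 1-D EEG window (e.g. 2 s at Cz)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    return band_power(freqs, psd, 4, 8) / band_power(freqs, psd, 13, 20)

# Illustrative feedback rule: reward the child whenever the ratio drops
# below a baseline estimated at the start of the session.
rng = np.random.default_rng(2)
window = rng.normal(size=2 * FS)          # 2 s of fake EEG
print(round(theta_beta_ratio(window), 2))
```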

Jul 15, 2009

Solar

Solar - by Rejane Cantoni and Leonardo Crescenti

(under development)

Solar is a robotic installation, immersive and interactive, designed to simulate qualities and measures of solar light in relation to man-space time.
The interactor can operate the machine in two ways: he can control his geographic position with his feet and/or he can speak to it.
Operating via positioning makes it possible for the interactor to feed his geographic position into a data bank. One possible example of this type of user-system interaction could be:
You enter the machine – a black rotunda 6.30 m in diameter and 3.50 m high. In the center there is a movable platform. Upon stepping on it, the gravitational force of the body is interpreted by the system, which, as a function of the relative latitude and longitude, alters the original setup.


For example, when you step onto the front of the platform, the system advances to the north, i.e., it produces, on the plasma wall, visual feedback that appears as modifications in the latitudes of the imaginary lines, which, in this case, advance from the Equator to the Arctic Circle (see in the video an example of navigating in the opposite direction, south).
Operating via voice command, on the other hand, makes it possible for the interactor to specify the date and moment of an event. For example: when the interactor says “August 03 at 3 p.m.”, the system associates this command with the interactor’s relative position, which makes it possible to simulate the solar light intensity for the requested point in space-time.
To the eyes of an outside observer, without movement or the interactor’s voice command, time, in this machine, stops.
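The quantity the installation ultimately simulates, how high (and therefore how intense) the sun is for a given latitude, date and time, can be approximated with the standard solar-elevation formula. The sketch below is my own simplification (declination from day of year, no longitude or equation-of-time correction), not the artists' code:

```python
import math

def solar_elevation(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle in degrees for a given latitude,
    day of the year and local solar time (simplified model)."""
    # Solar declination (Cooper's approximation).
    decl = 23.45 * math.sin(math.radians(360 * (284 + day_of_year) / 365))
    hour_angle = 15.0 * (solar_hour - 12.0)       # degrees from solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

# "August 03 at 3 p.m." (day 215) at the Equator vs. near the Arctic Circle.
print(round(solar_elevation(0.0, 215, 15.0), 1))
print(round(solar_elevation(66.5, 215, 15.0), 1))
```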


 

Rejane Cantoni and Leonardo Crescenti

Jul 10, 2009

Neuroscience and the military: ethical implications of war neurotechnologies

Super soldiers equipped with neural implants, suits that contain biosensors, and thought scans of detainees may become reality sooner than you think.

In this video taken from the show "Conversations from Penn State", Jonathan Moreno discusses the ethical implications of the applications of neuroscience in modern warfare.

Moreno is David and Lyn Silfen professor and professor of medical ethics and the history and sociology of science at the University of Pennsylvania and was formerly the director of the Center for Ethics at the University of Virginia. He has served as senior staff member for two presidential commissions and is an elected member of the Institute of Medicine of the National Academies.

Jul 06, 2009

Thought-controlled wheelchairs

Via Sentient Development

The BSI-Toyota Collaboration Center (BTCC) is developing a wheelchair that can be navigated in real-time with brain waves. The brain-controlled device can adjust itself to the characteristics of each individual user, thereby improving the efficiency with which it senses the driver's commands. That way, the driver is able to get the system to learn his/her commands (forward/right/left) quickly and efficiently; the system boasts an accuracy rate of 95%.

Jul 05, 2009

Magnetic Liquid

Nice video of magnetic liquid (ferrofluid) created by Sachiko Kodama and Minako Takeno that reminds me of some sort of artificial life.


Jul 01, 2009

Toward Participatory Sensing

In this interesting paper, Burke and colleagues describe how the massive proliferation of mobile devices and sensors may give rise to interactive, participatory sensor networks that enable users to gather, analyze and share local knowledge.

The authors also explain how the vision of Participatory Sensing can inspire new applications in different domains, such as healthcare or urban planning.
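At the data level, participatory sensing boils down to many phones periodically contributing tagged, geolocated readings that can later be aggregated. Here is a minimal sketch of such a record and one aggregation step; the schema is an assumption for illustration, not one proposed in the paper.

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    user_id: str
    lat: float
    lon: float
    timestamp: float   # seconds since the epoch
    sensor: str        # e.g. "noise_db", "pm2_5"
    value: float

def average_by_sensor(readings):
    """Aggregate crowd-sourced readings into one mean value per sensor type."""
    grouped = defaultdict(list)
    for r in readings:
        grouped[r.sensor].append(r.value)
    return {sensor: mean(values) for sensor, values in grouped.items()}

samples = [
    Reading("u1", 45.46, 9.19, 1246406400.0, "noise_db", 62.0),
    Reading("u2", 45.47, 9.18, 1246406500.0, "noise_db", 71.5),
    Reading("u1", 45.46, 9.19, 1246406600.0, "pm2_5", 18.0),
]
print(average_by_sensor(samples))   # {'noise_db': 66.75, 'pm2_5': 18.0}
```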

 

A virtual reality-based system integrated with fMRI to study neural mechanisms of action observation-execution

A virtual reality-based system integrated with fMRI to study neural mechanisms of action observation-execution: A proof of concept study.

Restor Neurol Neurosci. 2009;27(3):209-23

Authors: Adamovich SV, August K, Merians A, Tunik E

Purpose: Emerging evidence shows that interactive virtual environments (VEs) may be a promising tool for studying sensorimotor processes and for rehabilitation. However, the potential of VEs to recruit action observation-execution neural networks is largely unknown. For the first time, a functional MRI-compatible virtual reality system (VR) has been developed to provide a window into studying brain-behavior interactions. This system is capable of measuring the complex span of hand-finger movements and simultaneously streaming this kinematic data to control the motion of representations of human hands in virtual reality. Methods: In a blocked fMRI design, thirteen healthy subjects observed, with the intent to imitate (OTI), finger sequences performed by the virtual hand avatar seen in 1st person perspective and animated by pre-recorded kinematic data. Following this, subjects imitated the observed sequence while viewing the virtual hand avatar animated by their own movement in real-time. These blocks were interleaved with rest periods during which subjects viewed static virtual hand avatars and control trials in which the avatars were replaced with moving non-anthropomorphic objects. Results: We show three main findings. First, both observation with intent to imitate and imitation with real-time virtual avatar feedback, were associated with activation in a distributed frontoparietal network typically recruited for observation and execution of real-world actions. Second, we noted a time-variant increase in activation in the left insular cortex for observation with intent to imitate actions performed by the virtual avatar. Third, imitation with virtual avatar feedback (relative to the control condition) was associated with a localized recruitment of the angular gyrus, precuneus, and extrastriate body area, regions which are (along with insular cortex) associated with the sense of agency. Conclusions: Our data suggest that the virtual hand avatars may have served as disembodied training tools in the observation condition and as embodied "extensions" of the subject's own body (pseudo-tools) in the imitation. These data advance our understanding of the brain-behavior interactions when performing actions in VE and have implications in the development of observation- and imitation-based VR rehabilitation paradigms.
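The engineering core of such a system is a loop that reads finger-joint angles from an MR-compatible glove on every frame, applies them to the virtual hand avatar, and logs them for offline analysis against the fMRI time series. The sketch below is schematic: the glove-reading and rendering calls are placeholders, not the actual system's API.

```python
import time

def read_glove_angles():
    """Placeholder for reading 14 finger-joint angles (degrees) from an
    MR-compatible data glove; returns dummy values here."""
    return [10.0] * 14

def apply_to_avatar(joint_angles):
    """Placeholder for the renderer call that poses the virtual hand."""
    pass

def stream_kinematics(duration_s, rate_hz=60):
    """Stream glove kinematics to the hand avatar at a fixed frame rate,
    logging each frame so it can be aligned with the fMRI time series."""
    log = []
    for _ in range(int(duration_s * rate_hz)):
        angles = read_glove_angles()
        apply_to_avatar(angles)
        log.append((time.time(), angles))
        time.sleep(1.0 / rate_hz)
    return log

print(len(stream_kinematics(duration_s=0.1)), "frames captured")
```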

Positive impact of cyclic meditation on subsequent sleep

Positive impact of cyclic meditation on subsequent sleep.

Med Sci Monit. 2009 Jul;15(7):CR375-381

Authors: Patra S, Telles S

Background: Cyclic meditation (CM) is a technique that combines yoga postures interspersed with supine rest. This combination is based on ancient texts and is considered easier for beginners to practice. Material/Methods: Whole-night polysomnographic measures and the self-rating of sleep were studied on the night following a day in which 30 male participants practiced CM twice (ca. 23 minutes each time). This was compared with another night when they had had two sessions of supine rest (SR) of equal duration on the preceding day. The sessions were one day apart and the order of the sessions was randomized. Recordings were from the F4, C4, and O2 electrode sites referenced to linked earlobes and bipolar electroculography and electromyography sites. Results: In the night following CM, the percentage of slow-wave sleep (SWS) was significantly higher than in the night following SR, whereas the percentage of rapid-eye-movement (REM) sleep and the number of awakenings per hour were less. Following CM the self-rating of sleep based on visual analog scales showed an increase in the feeling that the sleep was refreshing, an increase in feeling "good" in the morning, an overall increase in sleep duration, and decreases in the degree to which sleep was influenced by being in a laboratory as well as any associated discomfort. Conclusions: Practicing cyclic meditation twice a day appeared to improve the objective and subjective quality of sleep on the following night.

Jun 29, 2009

Towards a Positive Technology of Gaming

In this very interesting keynote given at the recent Game Developers Conference, Jane McGonigal discusses the role of Positive Psychology in gaming. Another significant sign of how the world of ICT is embracing the perspective of Positive Technology...

 

Learning to Make Your Own Reality - IGDA Education Keynote 2009


Jun 26, 2009

Reactable

From the Reactable website:

The Reactable is a revolutionary new electronic musical instrument designed to create and perform the music of today and tomorrow. It combines state of the art technologies with a simple and intuitive design, which enables musicians to experiment with sound, change its structure, control its parameters and be creative in a direct and refreshing way, unlike anything you have ever known before.

The Reactable uses a so-called tangible interface, where the musician controls the system by manipulating tangible objects called pucks. The instrument is based on a translucent and luminous round table, and by putting these pucks on the Reactable surface, turning them and connecting them to each other, performers can combine different elements like synthesizers, effects, sample loops or control elements in order to create a unique and flexible composition.

As soon as any puck is placed on the surface, it is illuminated and starts to interact with the other neighboring pucks, according to their positions and proximity. These interactions are visible on the table surface, which acts as a screen, giving instant feedback about what is currently going on in the Reactable and turning music into something visible and tangible.

Additionally, performers can also change the behavior of the objects by touching and interacting with the table surface, and because the Reactable technology is “multi-touch”, there is no limit to the number of fingers that can be used simultaneously. As a matter of fact, the Reactable was specially designed so that it could also be used by several performers at the same time, thus opening up a whole new universe of pedagogical, entertaining and creative possibilities with its collaborative and multi-user capabilities.
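Under the hood, the behaviour described above, pucks linking to whichever neighbours are close enough, can be modelled as a simple proximity rule over tracked object positions. The following is only an illustrative sketch of that rule, not the Reactable's actual (reacTIVision-based) software:

```python
import math

# Tracked pucks: (name, kind, x, y) in table coordinates (metres).
pucks = [
    ("osc1", "generator", 0.10, 0.20),
    ("lowpass", "filter", 0.18, 0.24),
    ("delay", "effect", 0.45, 0.40),
]
MAX_LINK_DISTANCE = 0.15   # illustrative threshold for drawing a connection

def distance(a, b):
    return math.hypot(a[2] - b[2], a[3] - b[3])

def connections(pucks):
    """Link every puck to its nearest neighbour, if it is within range."""
    links = []
    for i, p in enumerate(pucks):
        others = [q for j, q in enumerate(pucks) if j != i]
        nearest = min(others, key=lambda q: distance(p, q))
        if distance(p, nearest) <= MAX_LINK_DISTANCE:
            links.append((p[0], nearest[0]))
    return links

print(connections(pucks))   # [('osc1', 'lowpass'), ('lowpass', 'osc1')]
```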