
Oct 27, 2007

LevelHead

Re-blogged from Networked Performance

 

levelhead.jpg

 

levelHead is an interactive game that uses a cube, a webcam, and pattern recognition. When the cube is rotated or tilted in front of the camera, the user can see ‘inside’ the cube and guide a small avatar through six different rooms.

Pattern recognition has already been used in several other projects, but this is a new way of using it, and a new way of thinking about the technology. The idea behind the game itself is rather simple: when the cube is tilted, the avatar moves in the corresponding direction. The goal of the game is to guide him through a maze of rooms connected by doors and lead him to the outside world.
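
For readers curious how such a cube-tracking setup can work in practice, here is a minimal sketch (not Julian Oliver's actual code) that uses OpenCV's ArUco fiducial markers to estimate the tilt of a marked cube from a webcam frame and turn it into an avatar step. The camera intrinsics, thresholds, and sign conventions are placeholder assumptions.

```python
# Sketch only: map the tilt of a fiducial-marked cube to an avatar step.
# Assumes opencv-contrib-python 4.x with the legacy cv2.aruco function API.
import cv2
import numpy as np

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
camera_matrix = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)

def tilt_to_step(frame, marker_size=0.05):
    """Return a (dx, dy) step for the avatar from the cube face visible in `frame`."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return (0, 0)
    rvecs, _, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_size, camera_matrix, dist_coeffs)
    rot, _ = cv2.Rodrigues(rvecs[0])            # rotation matrix of the detected face
    pitch = np.arcsin(-rot[2, 0])               # tilt toward/away from the camera
    roll = np.arctan2(rot[2, 1], rot[2, 2])     # tilt left/right
    dead_zone = 0.15                            # radians; ignore small wobbles
    dx = int(roll > dead_zone) - int(roll < -dead_zone)
    dy = int(pitch > dead_zone) - int(pitch < -dead_zone)
    return (dx, dy)                             # e.g. (1, 0) = nudge the avatar right
```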

According to the creator, Julian Oliver, the game is currently in development, but will soon be released as open source.

Check out the explanatory video.

 

Effects of natural stress relief meditation on trait anxiety

Effects of natural stress relief meditation on trait anxiety: a pilot study.

Psychol Rep. 2007 Aug;101(1):130-4

Authors: Coppola F

Natural Stress Relief meditation, a mental technique which is practiced for 15 minutes twice a day, aims to reduce stress and anxiety by eliciting a specific state of physiological rest along with mental alertness. The meditation is taught in a self-administered program, requiring one hour of training during the first three days, followed by the regular twice daily practice. Each 15-min. session consists in sitting quietly with closed eyes while applying a specific mental procedure. To test the effectiveness of meditation in reducing trait anxiety, Spielberger's State-Trait Anxiety Inventory was administered to 25 participants four times over a 3-wk. period: one week before starting to practice the meditation, a few hours before starting, 1 wk. after, and 2 wk. after. The difference in Trait Anxiety score between pretreatment and before starting the practice was not significant, while it was significant both after the first week of practice (Cohen d=.46) and after the first 2 wk. of practice (d=.67).
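
The abstract reports its results as Cohen's d values; for readers unfamiliar with the measure, here is a small illustration of how d can be computed for a pre/post comparison. The scores below are hypothetical and are not the study's data.

```python
# Illustrative only (the paper's raw data are not reproduced here): Cohen's d for
# a pre/post comparison, the effect-size measure quoted in the abstract.
import numpy as np

def cohens_d(pre, post):
    """Standardized mean difference between paired pre- and post-treatment scores."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    diff = pre.mean() - post.mean()
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2.0)
    return diff / pooled_sd

# Hypothetical STAI trait-anxiety scores, for illustration only:
pre_scores = [48, 52, 45, 50, 47]
post_scores = [42, 47, 41, 46, 44]
print(round(cohens_d(pre_scores, post_scores), 2))
```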

Pocket Supercomputer

 
Researchers at Accenture Technology Labs in France have developed the "Pocket Supercomputer," which automatically identifies objects in a video, using any ordinary 3G cellphone equipped with a video camera.

15:21 Posted in Research tools | Permalink | Comments (0) | Tags: data mining

Oct 25, 2007

Darkness-enhanced startle responses in ecologically valid environments

Darkness-enhanced startle responses in ecologically valid environments: A virtual tunnel driving experiment.

Biol Psychol. 2007 Sep 14;

Authors: Mühlberger A, Wieser MJ, Pauli P

Using the startle reflex methodology, researchers have shown that darkness, a phylogenetically relevant aversive context for humans, elicits fear responses. The present study replicated these findings in an ecologically valid situation, a virtual tunnel drive. Furthermore, the study focused on the question whether the darkness-enhanced startle response is modulated by an additional task involvement of the participants. Startle responses were assessed during virtual tunnel drives with darker and brighter sections. Participants once actively drove the virtual car and once passively sat in the car as a passenger. We found more negative feelings during darker parts of the virtual tunnel and during active driving. However, facilitated startle reactions in darkness were restricted to passive drives. Furthermore, correlation analyses revealed that darkness-enhanced startle modulation was more pronounced in participants with lower state anxiety. These results extend earlier findings in an experimental paradigm using ecologically valid virtual environments. Further research should use virtual reality paradigms to address context-dependent research questions.

Japanese android recognizes and uses body language

Via Pink Tentacle

NICT develops robot with nonverbal communication skills --

 

Japan’s National Institute of Information and Communications Technology (NICT) researchers have developed an autonomous humanoid robot that can recognize and use body language. According to the press release, the android can use nonverbal communication skills such as gestures and touch to facilitate natural interaction with humans. NICT researchers envision future applications of this technology in robots that can work in the home or assist with rescue operations when disaster strikes.

 

NICT press release (Japanese)

Oct 24, 2007

Movement imagery increases pain in people with neuropathic pain following complete thoracic spinal cord injury

Movement imagery increases pain in people with neuropathic pain following complete thoracic spinal cord injury.

Pain. 2007 Oct 15;

Authors: Gustin SM, Wrigley PJ, Gandevia SC, Middleton JW, Henderson LA, Siddall PJ

Spinal cord injury (SCI) results in deafferentation and the onset of neuropathic pain in a substantial proportion of people. Based on evidence suggesting motor cortex activation results in attenuation of neuropathic pain, we sought to determine whether neuropathic SCI pain could be modified by imagined movements of the foot. Fifteen subjects with a complete thoracic SCI (7 with below-level neuropathic pain and 8 without pain) were instructed in the use of movement imagery. Movement imagery was practiced three times daily for 7 days. On the eighth day, subjects performed the movement imagery in the laboratory and recorded pain ratings during the period of imagined movement. Six out of 7 subjects with neuropathic pain reported an increase in pain during imagined movements, from 2.9+/-0.7 during baseline to 5.0+/-1.0 during movement imagery (p<0.01). In SCI subjects without neuropathic pain, movement imagery evoked an increase in non-painful sensation intensity from a baseline of 1.9+/-0.7 to 4.8+/-1.3 during the movement imagery (p<0.01). Two subjects without a history of pain or non-painful phantom sensations had onset of dysesthesia while performing imagined movements. This study reports exacerbation of pain in response to imagined movements, in contrast with reports of reduced pain in people with peripheral neuropathic pain. The potential mechanisms underlying this sensory enhancement with movement imagery are discussed.

Oct 22, 2007

DARPA next generation prosthetic arm

The Boston Globe has an article by Scott Kirsner about the next generation of prosthetic limbs under development at DARPA.



From the article: 

Without any covering to emulate human skin - what those in the prosthetics field call a "cosmesis" - the arm is distinctly robotic, all metal cylinders and dark gray carbon fiber. I held out my index finger, and Van Der Merwe manipulated the arm so that the index finger and thumb grabbed my fingertip and squeezed lightly. Suddenly, there was a buzzing sound. "That's a sensor in the fingers letting me know how hard I'm squeezing," Van Der Merwe explained. A few minutes later, when I shook hands, the grip was firm (if not warm), and Van Der Merwe chided me for not shaking more vigorously. I didn't want to break the hand and get on the Pentagon's bad side

 

Check out the video.

Oct 21, 2007

AjaxLife

A new application called AjaxLife allows browser-based interaction with the online virtual world SecondLife.

Using AjaxLife, users do not need a powerful video card or the full graphical client to connect to the SecondLife world. However, the AjaxLife client supports only limited functions, such as in-game chatting, teleporting to various locations, and checking the status of the user's inventory, friends and Linden Dollars; there is no representation of in-game avatars.

The web application was developed by Katharine Barry, a fifteen-year-old (!) English schoolgirl.
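
AjaxLife's own gateway API is not documented here; the sketch below only illustrates the general polling pattern such a browser-based client depends on, with a hypothetical endpoint and JSON layout.

```python
# Minimal sketch of the polling pattern a browser-based client like AjaxLife relies on:
# instead of rendering the 3D world, the client repeatedly asks a server-side gateway
# for new events. The gateway URL, endpoint and JSON fields below are hypothetical.
import time
import requests

GATEWAY = "https://example.org/ajaxlife-gateway"   # hypothetical gateway URL

def poll_chat(session_id, interval=2.0):
    """Fetch and print new chat events until interrupted."""
    while True:
        resp = requests.get(f"{GATEWAY}/events", params={"session": session_id},
                            timeout=10)
        resp.raise_for_status()
        for event in resp.json().get("chat", []):
            print(f'{event["from"]}: {event["text"]}')
        time.sleep(interval)   # plain polling; a real client would likely use long-polling
```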

 

23:45 Posted in Virtual worlds | Permalink | Comments (0) | Tags: second life

Oct 20, 2007

fMRI Analysis of Neural Mechanisms Underlying Rehabilitation in VR

fMRI Analysis of Neural Mechanisms Underlying Rehabilitation in Virtual Reality: Activating Secondary Motor Areas.

Conf Proc IEEE Eng Med Biol Soc. 2006;1:3692-3695

Authors: August K, Lewis JA, Chandar G, Merians A, Biswal B, Adamovich S

A pilot functional MRI study on a control subject investigated the possibility of inducing increased neural activations in primary, as well as secondary motor areas through virtual reality-based exercises of the hand. These areas are known to be important in effective motor output in stroke patients with impaired corticospinal systems. We found increased activations in these brain areas during hand exercises in VR when compared to vision of non-anthropomorphic shapes. Further studies are needed to investigate the potential of virtual reality-based rehabilitation for tapping into the properties of the mirror neuron system to stimulate plasticity in sensorimotor areas.

A Low Cost Human Computer Interface based on Eye Tracking

A Low Cost Human Computer Interface based on Eye Tracking.

Conf Proc IEEE Eng Med Biol Soc. 2006;1:3226-3229

Authors: Hiley JB, Redekopp AH, Fazel-Rezai R

This paper describes the implementation of a human computer interface based on eye tracking. Commercially available systems exist, but their high cost limits their use. The system described in this paper was designed to be low-cost and unobtrusive. The technique was video-oculography assisted by corneal reflections. An off-the-shelf CCD webcam was used to capture images. The images were analyzed in software to extract key features of the eye. The user's gaze point was then calculated based on the relative position of these features. The system is capable of calculating eye-gaze in real time to provide a responsive interaction. A throughput of eight gaze points per second was achieved. The accuracy of the fixations based on the calculated eye-gazes was within 1 cm of the on-screen gaze location. By developing a low-cost system, this technology is made accessible to a wider range of applications.
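
The paper's code is not available here; as a rough sketch of the pupil-versus-corneal-reflection idea the abstract describes, the following snippet finds the two features by crude thresholding and maps their offset to screen coordinates with a least-squares calibration. The thresholds and the affine mapping are illustrative assumptions, not the authors' method.

```python
# Sketch of pupil-minus-glint gaze estimation from a grayscale eye image.
import cv2
import numpy as np

def pupil_glint_vector(eye_gray):
    """Return the (dx, dy) offset between pupil centre and corneal reflection."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    # Pupil: darkest region; glint: brightest region (crude threshold-based estimate).
    _, pupil_mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    _, glint_mask = cv2.threshold(blurred, 230, 255, cv2.THRESH_BINARY)
    pupil, glint = cv2.moments(pupil_mask), cv2.moments(glint_mask)
    if pupil["m00"] == 0 or glint["m00"] == 0:
        return None
    px, py = pupil["m10"] / pupil["m00"], pupil["m01"] / pupil["m00"]
    gx, gy = glint["m10"] / glint["m00"], glint["m01"] / glint["m00"]
    return np.array([px - gx, py - gy])

def fit_calibration(vectors, screen_points):
    """Least-squares affine map from pupil-glint vectors to screen coordinates."""
    V = np.asarray(vectors, dtype=float)                 # shape (n, 2)
    A = np.hstack([V, np.ones((len(V), 1))])             # rows are [dx, dy, 1]
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float), rcond=None)
    return coeffs                                        # shape (3, 2)

def gaze_point(vector, coeffs):
    """Map one pupil-glint vector to an (x, y) screen position."""
    return np.append(vector, 1.0) @ coeffs
```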

My Map: Email visualization

Re-blogged from Information Aesthetics

email_circle.jpg

My Map is a data visualization application that renders the relationships between the user and the individuals in the address book by examining the TO:, FROM:, and CC: fields of every email in an archive of roughly 60,000 messages.

The intensity of each relationship is reflected in the color intensity of the line. "My Map" allows users to explore different relational groupings & periods of time, revealing the temporal ebbs & flows of various relationships. My Map thus becomes a veritable self-portrait, a visual reflection of personal associations.
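
This is not Christopher Baker's code, but a small sketch of the underlying bookkeeping: walking a local mbox archive and counting how often each correspondent appears in the To:, From: and Cc: headers, which is the kind of raw material a map like this visualizes.

```python
# Count how often each email address appears in To:/From:/Cc: headers of an mbox archive.
import mailbox
from collections import Counter
from email.utils import getaddresses

def contact_frequencies(mbox_path):
    counts = Counter()
    for msg in mailbox.mbox(mbox_path):
        headers = msg.get_all("To", []) + msg.get_all("From", []) + msg.get_all("Cc", [])
        for _, address in getaddresses(headers):
            if address:
                counts[address.lower()] += 1
    return counts

# Example usage (path is hypothetical):
# freqs = contact_frequencies("archive.mbox")
# for address, n in freqs.most_common(10):
#     print(n, address)
```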

 

link: christopherbaker.net

Computational Arts and Creative Informatics

Via Networked Performance

oscillon4.jpg


 
 
CALL FOR CHAPTERS
 
Proposals Deadline: November 15, 2007
Full Articles Due: February 28, 2008.

At the core of the fundamental questions of “what is art” and “what is technology,” we focus The Handbook of Research on Computational Arts and Creative Informatics on the convergence of computer science and artistic creativity. We seek to discover new ways to conceptualize art and investigate new methods of self-expression through the use of technology. Here we are inviting experts in a wide range of disciplines to discuss the emergence of expression and art through science, information technology, computer science, artificial intelligence and many other related fields. We see this book as a comprehensive resource where artists and scientists can collaborate on ideas, practices and experimentation on these topics. As technology becomes meshed further into our culture and everyday lives, new mediums and outlets for creative expression and innovation abound. We are emphasizing the creative nature of technology and science itself. How does the human side of technological achievement influence our creative abilities, given that technology is a creation in itself? Has the ontology of the information age influenced society at the level of both the human and the non-human? Through this handbook we address novel concepts ranging from the creation, interaction, communication and interpretation of art to its emergence through various technological means and mediums.

The Handbook of Research on Computational Arts and Creative Informatics will cover a comprehensive range of topics regarding the interaction of the sciences and the arts. Key concepts, theories and innovations involving the computational side of the arts and other creative technologies will be discussed. This particular volume will feature chapters (8,000-10,000 words) written by experts and leading innovators in their respective fields.

Recommended topics include, but are not limited to, the following:

+Essays and discussion on art and technology
+Art and web design
+Fractals, tessellations and creativity through mathematical expression
+Interactive and computational sculptures and artworks
+Kinetic sculptures
+Creativity as an emergent property of art and science
+Digital art and creative expressions
+The creative process in IT education
+Art created by artificial life and intelligent agents
+Creativity in computer interface and web design
+Creativity from emergent properties
+Art expressed or created by multi-agent systems
+Virtual spaces and art of synthetic/virtual worlds
+Art and expression through information visualization
+Animation, simulation and modeling
+3-D artwork
+Art for the blind and visually impaired: universal creativity
+Human expression through cybernetics
+Robotics and art
+Future trends in art as influenced by emerging technologies

Submissions: Interested individuals are invited to submit a 1-2 page proposal for their chapter by November 15, 2007. We encourage the inclusion of topics not mentioned above that relate to both the theme of the handbook and your particular research area or expertise. Upon acceptance of your proposal, you will have until February 28, 2008 to submit a completed draft of your chapter.

A set of guidelines will also be sent to you upon acceptance. Each potential author will receive notification on their acceptance status by November 30, 2007.


A wrist-mounted instrument for measuring psychosocial stress

Via AATP Interactive 

ewatch.jpg

Researchers from Carnegie Mellon University and the University of Pittsburgh are investigating psychosocial stress exposure during the course of daily life using an instrument called eWatch, a multisensor package about the size of a large wristwatch. The eWatch can sense sound, motion, ambient light, skin temperature and other factors that provide clues about the wearer’s location, health status and current activity. 

From the news release:

Every 45 minutes over the course of five days, the eWatch will prompt wearers to take part in a 2-to-3-minute interview. The instrument will record their response to questions about their current activities, such as “Working hard?” and “Working fast?” By the end of the study, several hundred people will have tested the eWatch.

Previous research has shown that responses to such interviews help predict who will show higher rates of plaque development in the arteries, a risk factor for heart attack or stroke. Using interviews in real time allows researchers to quantify how stressors affect one’s daily life, as well as to pinpoint when these effects begin and when they end.

Use of the eWatch technology should assist researchers in finding the optimal method for responding to such interviews during daily activities, whether by pressing a button, moving the wrist or speaking into a wireless ear bug device. Environmental data collected by the eWatch also may assist the researchers in characterizing the types of environments people find most stressful, so that their location, such as home or work, may be recorded automatically.

“We want to capture a slice of life in people’s daily routine,” says Kamarck. “We hope that these new tools will allow us to do so while minimizing disruptions imposed by the act of measurement.”
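
As a purely illustrative sketch of the sampling schedule described in the news release (the eWatch firmware and question set are not public here), one could schedule prompts every 45 minutes over five days and log the short self-reports to a file:

```python
# Toy ecological-momentary-assessment scheduler: prompt every 45 minutes for 5 days
# and append each response to a CSV log. Purely illustrative; not the eWatch software.
import csv
import datetime as dt

PROMPT_INTERVAL = dt.timedelta(minutes=45)
STUDY_LENGTH = dt.timedelta(days=5)
QUESTIONS = ["Working hard?", "Working fast?"]   # items quoted in the release

def prompt_times(start):
    """Yield every scheduled interview time over the study period."""
    t = start
    while t < start + STUDY_LENGTH:
        yield t
        t += PROMPT_INTERVAL

def log_response(path, timestamp, answers):
    """Append one timestamped self-report (list of answers) to the CSV log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([timestamp.isoformat()] + list(answers))
```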

 

Short-term meditation training improves attention and self-regulation

Short-term meditation training improves attention and self-regulation.

Proc Natl Acad Sci U S A. 2007 Oct 11;

Authors: Tang YY, Ma Y, Wang J, Fan Y, Feng S, Lu Q, Yu Q, Sui D, Rothbart MK, Fan M, Posner MI

Recent studies suggest that months to years of intensive and systematic meditation training can improve attention. However, the lengthy training required has made it difficult to use random assignment of participants to conditions to confirm these findings. This article shows that a group randomly assigned to 5 days of meditation practice with the integrative body-mind training method shows significantly better attention and control of stress than a similarly chosen control group given relaxation training. The training method comes from traditional Chinese medicine and incorporates aspects of other meditation and mindfulness training. Compared with the control group, the experimental group of 40 undergraduate Chinese students given 5 days of 20-min integrative training showed greater improvement in conflict scores on the Attention Network Test, lower anxiety, depression, anger, and fatigue, and higher vigor on the Profile of Mood States scale, a significant decrease in stress-related cortisol, and an increase in immunoreactivity. These results provide a convenient method for studying the influence of meditation training by using experimental and control methods similar to those used to test drugs or other interventions.

A brain-computer interface with vibrotactile biofeedback for haptic information

A brain-computer interface with vibrotactile biofeedback for haptic information.

J Neuroengineering Rehabil. 2007 Oct 17;4(1):40

Authors: Chatterjee A, Aggarwal V, Ramos A, Acharya S, Thakor NV

ABSTRACT: BACKGROUND: It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. METHODS: A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. RESULTS AND CONCLUSIONS: Subjects are able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subject had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality to operate a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in the BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
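
As a hedged illustration of the control loop described in the abstract (not the authors' implementation), the snippet below estimates mu-band power from a short EEG window and maps the resulting desynchronization both to a cursor step and to a vibrotactile intensity. The sampling rate, gain and baseline handling are assumptions.

```python
# Sketch: mu-rhythm desynchronization drives a 1-D cursor, and the cursor position
# is re-encoded as vibrotactile intensity (the haptic feedback idea in the abstract).
import numpy as np
from scipy.signal import welch

FS = 256           # assumed EEG sampling rate (Hz)
MU_BAND = (8, 13)  # mu band in Hz

def mu_power(eeg_window):
    """Band power in the mu range for one channel of EEG samples."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=min(len(eeg_window), FS))
    band = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return np.trapz(psd[band], freqs[band])

def update_cursor_and_vibration(eeg_window, cursor, rest_power, gain=0.5):
    """Desynchronization below the resting baseline moves the cursor; the cursor
    position in [0, 1] is then presented as vibration amplitude."""
    erd = (rest_power - mu_power(eeg_window)) / rest_power   # event-related desync.
    cursor = float(np.clip(cursor + gain * erd, 0.0, 1.0))
    vibration_intensity = cursor                              # 0 = off, 1 = strongest
    return cursor, vibration_intensity
```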

Interactive Multimodal Biofeedback System for Neurorehabilitation

Novel Design of Interactive Multimodal Biofeedback System for Neurorehabilitation.

Conf Proc IEEE Eng Med Biol Soc. 2006;1:4925-4928

Authors: Huang H, Chen Y, Xu W, Sundaram H, Olson L, Ingalls T, Rikakis T, He J

A previous design of a biofeedback system for neurorehabilitation in an interactive multimodal environment has demonstrated the potential of engaging stroke patients in task-oriented neuromotor rehabilitation. This report explores a new concept and alternative designs of multimedia-based biofeedback systems. In this system, the new interactive multimodal environment was constructed with an abstract presentation of movement parameters. Scenery images or pictures, and their clarity and orientation, are used to reflect the arm movement and its position relative to the target, instead of an animated arm. The multiple biofeedback parameters were classified into different hierarchical levels with respect to the importance of each movement parameter to performance. A new quantified measurement for these parameters was developed to assess the patient's performance both in real time and offline. These parameters were represented by combined visual and auditory presentations using various distinct musical instruments. Overall, the objective of the newly designed system is to explore what information, and what ways of feeding it back in an interactive virtual environment, could enhance the sensorimotor integration that may facilitate the efficient design and application of virtual environment-based therapeutic intervention.
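
A toy version of the mapping the abstract describes, where the hand's distance to the target controls how blurred the scenery image appears, might look like the following. The blur range and normalization are illustrative assumptions, not the authors' parameters.

```python
# Toy abstract-feedback mapping: farther from the target -> blurrier scenery image.
import cv2
import numpy as np

def feedback_image(scene_bgr, distance_to_target, max_distance=0.6):
    """Blur the scenery image in proportion to the normalized distance to the target."""
    d = float(np.clip(distance_to_target / max_distance, 0.0, 1.0))
    kernel = 1 + 2 * int(d * 15)        # odd kernel size, 1 (sharp) .. 31 (very blurred)
    if kernel == 1:
        return scene_bgr                # at the target: show the image fully sharp
    return cv2.GaussianBlur(scene_bgr, (kernel, kernel), 0)
```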

Microsoft Mind Reader

Via NewScientist Tech

 

Microsoft plans to use EEG signals for task classification and activity recognition of users. The software giant has applied for a new patent on a method that would allow it to separate useful cognitive information from EEG artifacts and noise.
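
The patented method itself is not reproduced here; as a generic illustration of the preprocessing problem it addresses, the snippet below simply band-pass filters raw EEG to suppress drift and high-frequency noise before any task-classification step. The cutoffs and sampling rate are assumptions.

```python
# Generic EEG cleanup step (not Microsoft's patented method): zero-phase band-pass filter.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(raw, fs=256.0, low=1.0, high=40.0, order=4):
    """Zero-phase Butterworth band-pass filter for one EEG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, np.asarray(raw, dtype=float))
```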

 

Read the full Microsoft mind reading patent application

 

Oct 17, 2007

Intraspinal stimulation for bladder voiding in cats

Neural engineers at Huntington Medical Research Institutes were able to use a chronically implanted neuroprosthetic device inside the spinal cord. The device, consisting of an array of micrometer-sized spiny electrodes, was used to treat bladder paralysis in spinal cord-injured animals. The electrodes were located throughout the spinal cord tissue, and stimulation near the middle of the cord, in the area called the dorsal gray commissure, was most effective in inducing a bladder-voiding reflex. Unlike previously used approaches of stimulating spinal roots or nerves, which contain mixed fibers innervating multiple organs, the intraspinal stimulation was shown to be very specific and induced near-complete bladder emptying.

The article will appear in the December issue of the Journal of Neural Engineering and is available online at http://www.iop.org/EJ/abstract/1741-2552/4/4/002/

 

Oct 15, 2007

Smart Ambience Therapy with Body Brush

Via the Presence List-serv, by Matthew Lombard

An article that recently appeared in Time magazine reports on Smart Ambience Therapy (SAT), a series of interactive technology programs designed by Horace Ho-Shing Ip to help children overcome the effects of abuse.

 


From the article:

Abused children are often withdrawn children. Ip had never considered using virtual reality to help them, but in 2002, he held a public exhibition of a virtual-paintbrush program he was working on and was surprised to see that emotionally closed kids took to it, using their bodies to create ebullient paintings. The kids' parents were shocked, but perhaps they shouldn't have been.

Since the 1990s, virtual reality has aided medicine by allowing victims of phobia or post-traumatic stress disorder to confront simulations of their fears: an oversize tarantula, a balcony on the 77th floor. Paint Splash was a natural outgrowth of those therapies.

Another program in the SAT product line helps abused kids confront aggressors by letting them shove away approaching grizzlies. A third teaches aggressive kids to reach out and touch virtual ribbons that dart away from jerky movements but glide toward smoother ones.

SAT has won a gold medal at Geneva's Salon International des Inventions, but Ip's success is evidenced best when kids come to use the lab. Many of them begin by covering the screen in black. "By the end," Ip says, "they're throwing blue. Art therapists will tell you, 'That's calm.'"

 

This website provides more information about Smart Ambience Therapy, including a demonstration video:

 

The Smart Ambience Therapy project aims to develop a new form of therapy by integrating existing art and drama therapy with the virtual environment through an innovative exploitation of the Body Brush technology.

 

1. To use the body as a brush in virtual reality space. Body Brush will be used as a tool for communication and a creative emotional outlet for young clients recovering from physical or emotional abuse.

2. The body, acting as a paintbrush, can be seen as a unique process of accessing one's internal world, as it can help a client get in touch with emotional material through kinesthetic movements.

3. The body brush medium will be integrated into the art therapy process to help address feelings of grief and depression, distrust, fear, anger and low self-esteem resulting from abuse. What can emerge from this creative outlet are new feelings of self-worth, strength, hope and coping strategies to deal with change.

4. Through using this technique, a sense of mastery in using a computer can be achieved. The client can also 'get into' the image.

Oct 14, 2007

Neuropod

 
Nature, in partnership with The Dana Foundation, has launched Neuropod, a podcast on neuroscience research. This month's topics include the relationship between cognitive enhancement and warfare, how stress contributes to memory formation, learning from brain imaging, and why chili peppers might have a future in anesthesiology.
 
To have the podcast delivered to your desktop, paste this link into your media player.

17:55 Posted in Research tools | Permalink | Comments (0) | Tags: neuroscience
