Jul 23, 2007
Senior scientist position at Philips Research Eindhoven, The Netherlands
The candidate should hold an MSc in psychology and have an affinity for industrial research on User-System Interaction. Practical experience with methodologies and data analysis techniques for conducting empirical research, as well as strong experience in research into persuasive technologies, is required. The candidate should have strong conceptual and analytical skills and be able to form and communicate their own views. At the same time, the candidate will be part of a multi-disciplinary team working for customers, which requires a cooperative and customer-oriented attitude. The offered position includes tasks such as:
• deliver contributions to projects as required by the project leader and the customer;
• contribute to the development of transferable results within the defined projects;
• contribute to the patent portfolio;
• support related research projects and colleagues in setting up empirical research projects;
• deploy models, assessment methods, and tools in research projects into persuasive technologies;
• carry out free research: pre-screen their own creative ideas for usefulness to Philips, and find ways to propose new ideas and get them accepted.
The successful candidate will be offered a position as senior scientist in the Cognitive & Behavioral Insights capability cluster at Philips Research Eindhoven, The Netherlands. The Cognitive & Behavioral Insights capability cluster delivers user insights for the conceptualization of technological solutions that bring compelling user experiences. In the context of user-system interaction and system-mediated communication, the cluster deploys and extends knowledge on human cognition, emotion, and social interaction. With capabilities in the areas of human cognition, behavioral models, and empirical research methodologies, the cluster's research focuses on the themes of (i) persuasion and motivation, (ii) social presence and awareness, and (iii) emotion and experience.
18:59 Posted in Research institutions & funding opportunities | Permalink | Comments (0) | Tags: captology
Novel brain-scanning technology invented
Researchers from Siemens have developed a prototype MRI scanner that uses a lattice of small coils positioned around the head rather than the large coils a patient lies inside. As noted in this Technology Review article, the device is likely to have important applications in functional magnetic resonance imaging (fMRI), a variation of standard MRI that tracks blood flow in the brain as an indirect measure of neural activity.
The technique is often used to locate the parts of the brain that control specific functions, such as speech and movement. The first clinical application for the device will likely be fMRI for neurosurgery planning, says [Siemens MR vice president] Bundy. "Surgeons want to know where speech and motor areas are when they take a tumor out; the more precise, the better."
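The hardware aside, the data-analysis side of fMRI localization is conceptually simple: a record of when the task was performed is convolved with a canonical hemodynamic response and compared against each voxel's signal. The sketch below illustrates that idea; the repetition time, block timing, and synthetic voxel are assumptions for illustration and have nothing to do with the Siemens device itself.

```python
# Illustrative fMRI localization: convolve a task "boxcar" with a canonical
# hemodynamic response function (HRF) and correlate the predicted response
# with a voxel's time series. TR, block timing, and the synthetic voxel are
# assumptions; this is not any vendor's analysis pipeline.
import numpy as np
from scipy.stats import gamma

TR = 2.0                                   # assumed repetition time (s)
n_scans = 120
t = np.arange(n_scans) * TR

# Canonical double-gamma HRF sampled at the TR
hrf_t = np.arange(0, 32, TR)
hrf = gamma.pdf(hrf_t, 6) - 0.35 * gamma.pdf(hrf_t, 16)
hrf /= hrf.sum()

# Block design: 20 s task alternating with 20 s rest
boxcar = (np.floor(t / 20) % 2).astype(float)
predicted = np.convolve(boxcar, hrf)[:n_scans]

def activation_score(voxel_ts):
    """Correlation between a voxel's time series and the predicted response."""
    return float(np.corrcoef(voxel_ts, predicted)[0, 1])

# Synthetic "active" voxel: the predicted response plus noise
rng = np.random.default_rng(0)
voxel = predicted + 0.5 * rng.standard_normal(n_scans)
print(f"activation score: {activation_score(voxel):.2f}")
```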
18:50 Posted in Research tools | Permalink | Comments (0) | Tags: research tools
Neural correlates of dispositional mindfulness during affect labeling
Neural correlates of dispositional mindfulness during affect labeling.
Psychosom Med. 2007 Jul;69(6):560-5
Authors: Creswell JD, Way BM, Eisenberger NI, Lieberman MD
18:48 Posted in Meditation & brain | Permalink | Comments (0) | Tags: mindfulness
Jul 17, 2007
Global mobile phone use to hit record 3.25 billion
Global mobile phone use is set to hit a record 3.25 billion subscriptions (i.e., half the world's population).
00:27 Posted in Wearable & mobile | Permalink | Comments (0) | Tags: mobile phones
Plumi — free software video sharing platform
EngageMedia are very excited to announce the release of Plumi, a free software video sharing platform. Plumi enables you to create a sophisticated video sharing and community site out of the box. In a web landscape where almost all video sharing sites keep their distribution platform under lock and key, Plumi is one contribution to creating truly democratic media.
00:23 | Permalink | Comments (0) | Tags: future interfaces
Antinormalizer
Via Smart Mobs
Brett Stalbaum and Derek Lomas from UC San Diego have developed Antinormalizer, a locative media program that offers a creative take on the relationship between people and their cell phones.
From Smart Mobs:
"What's Antinormalizer?" you'll ask. It's basically a "hot spots," or locative media, program. A GPS system is aware of the mobile owner's location and triggers an audio file to be played at that particular location. Derek scripted a number of activities, delivered by these audio files, for people to perform with the goal of changing the "social lubrication" in a certain area, modifying the normal social script in order to initiate interesting behaviors.
With Antinormalizer people play a game: instead of doing the normal stuff (like picking up the phone and simply talking to the other person), people are told to do something "outrageous," something out of the ordinary, while their friends take pictures of that abnormal behavior. The best picture wins the game. The abnormal behaviors range from reading your book while lying down in a parking spot, to playing jungle rhythms as loud as you can on the bottoms of paint buckets. If you don't find that challenging enough, you can always climb a ten-foot statue on campus or talk to your friend across the street while you're both sitting in a trashcan.
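For readers curious about the mechanics, the trigger logic described above amounts to a geofence check: when a GPS fix falls inside a hot spot's radius, the associated audio prompt plays. The sketch below illustrates that idea; the coordinates, radii, and file names are invented, and this is not Antinormalizer's actual code.

```python
# Minimal geofence-trigger sketch: when a GPS fix falls within a "hot spot"
# radius, return the audio prompt attached to that spot. Coordinates, radii,
# and file names are made up for illustration.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

HOT_SPOTS = [
    # (latitude, longitude, trigger radius in metres, audio prompt)
    (32.8801, -117.2340, 30.0, "prompt_parking_lot.mp3"),
    (32.8794, -117.2359, 50.0, "prompt_statue.mp3"),
]

def check_hot_spots(lat, lon, already_played):
    """Return audio prompts to play for the current GPS fix (each spot fires once)."""
    to_play = []
    for spot_lat, spot_lon, radius, audio in HOT_SPOTS:
        if audio not in already_played and haversine_m(lat, lon, spot_lat, spot_lon) <= radius:
            to_play.append(audio)
            already_played.add(audio)
    return to_play

played = set()
print(check_hot_spots(32.8802, -117.2341, played))   # -> ['prompt_parking_lot.mp3']
```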
00:20 Posted in Locative media | Permalink | Comments (0) | Tags: locative media
Gesture-control for regular TV
Australian engineers Prashan Premaratne and Quang Nguyen have designed a novel gesture-control system for regular TVs.
The controller's built-in camera can recognise seven simple hand gestures and work with up to eight different gadgets around the home. According to the designers:
"Crucially for anyone with small children, pets or gesticulating family members, the software can distinguish between real commands and unintentional gestures."
Premaratne and Nguyen predict the system will be available on the market within three years.
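One simple way to reject unintentional gestures is to require the recogniser to report the same gesture over several consecutive frames before a command is issued, as in the sketch below. The gesture names and frame threshold are assumptions for illustration; the article does not describe the Wollongong system's actual filtering method.

```python
# Minimal "intentional gesture" filter: a command fires only after the same
# gesture persists for hold_frames consecutive frames. Gesture names and the
# threshold are assumptions, not the published system's parameters.
COMMANDS = {"volume_up", "volume_down", "channel_up", "channel_down",
            "mute", "power", "input_select"}           # seven assumed gestures

class GestureDebouncer:
    """Issue a command only after the same gesture persists for hold_frames frames."""

    def __init__(self, hold_frames=10):
        self.hold_frames = hold_frames
        self.current = None       # gesture currently being held
        self.count = 0            # consecutive frames it has been seen
        self.fired = False        # whether it has already triggered a command

    def update(self, recognised):
        """Feed one per-frame recognition result; return a confirmed command or None."""
        if recognised != self.current:
            self.current, self.count, self.fired = recognised, 0, False
        self.count += 1
        if recognised in COMMANDS and self.count >= self.hold_frames and not self.fired:
            self.fired = True
            return recognised
        return None

# Example: 'volume_up' is only issued once it has persisted for 10 frames
deb = GestureDebouncer(hold_frames=10)
frames = [None] * 3 + ["volume_up"] * 12
issued = [c for c in (deb.update(f) for f in frames) if c]
print(issued)   # -> ['volume_up']
```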
00:13 Posted in Future interfaces | Permalink | Comments (0) | Tags: future interfaces
Delicate Boundaries
Re-blogged from We Make Money Not Art
Delicate Boundaries, a work by Chris Sugrue, uses human touch to dissolve the barrier of the computer screen. Using the body as a means of exchange, the system explores the subtle boundaries that exist between foreign systems and what it might mean to cross them. Lifelike digital animations swarm out of their virtual confinement onto the skin of a hand or arm when it makes contact with a computer screen, creating an imaginative world where our bodies are a landscape for digital life to explore.
Video.
00:03 Posted in Cyberart, Future interfaces | Permalink | Comments (0) | Tags: future interfaces
Jul 14, 2007
RunBot
Researchers from Germany and the United Kingdom have developed a bipedal walking robot, RunBot, that stabilizes its own gait and adapts to new terrain through an online learning process.
From the study abstract:
In this study we present a planar biped robot, which uses the design principle of nested loops to combine the self-stabilizing properties of its biomechanical design with several levels of neuronal control. Specifically, we show how to adapt control by including online learning mechanisms based on simulated synaptic plasticity. This robot can walk with a high speed (>3.0 leg length/s), self-adapting to minor disturbances, and reacting in a robust way to abruptly induced gait changes. At the same time, it can learn walking on different terrains, requiring only few learning experiences. This study shows that the tight coupling of physical with neuronal control, guided by sensory feedback from the walking pattern itself, combined with synaptic learning may be a way forward to better understand and solve coordination problems in other complex motor tasks.
The paper: Manoonpong P, Geng T, Kulvicius T, Porr B, Worgotter F (2007). Adaptive, Fast Walking in a Biped Robot under Neuronal Control and Learning. PLoS Comput Biol 3(7): e134.
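To give a flavour of the correlation-based plasticity the abstract mentions, the toy sketch below strengthens the weight of an early predictive cue for as long as a later, slower reflex still has to fire, so the controller gradually learns to act before the disturbance arrives. The signals, timings, and learning rate are invented; this is a caricature of the idea, not the published RunBot controller.

```python
# Toy correlation-based synaptic learning: an early cue (e.g., a foot-contact
# signal) is strengthened while a late corrective reflex still fires, until the
# anticipatory response makes the reflex unnecessary. All numbers are assumed.
mu = 1.0            # learning rate (assumed, exaggerated for the toy)
tau = 5.0           # decay of the cue's eligibility trace, in control steps
w = 0.0             # synaptic weight of the predictive pathway
trace = 0.0         # low-pass filtered copy of the predictive cue
reflex_firings = 0

for t in range(2000):                                    # 100 gait cycles of 20 steps
    step = t % 20
    predictive = 1.0 if step == 5 else 0.0               # early cue
    trace += (predictive - trace) / tau                  # eligibility trace
    learned = w * trace                                  # anticipatory motor term
    reflex = 1.0 if (step == 8 and learned < 0.5) else 0.0   # late corrective reflex
    reflex_firings += int(reflex)
    w += mu * trace * reflex                             # strengthen while reflex is needed

print(f"final weight: {w:.2f}, reflex firings: {reflex_firings} of 100 cycles")
```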
16:18 Posted in AI & robotics | Permalink | Comments (0) | Tags: artificial intelligence, robotics
The EyesWeb Project
From InfoMus Lab (Laboratorio di Informatica Musicale), Genova, Italy
The EyesWeb research project aims at exploring and developing models of interaction by extending music language toward gesture and visual languages, with a particular focus on the understanding of affect and expressive content in gesture. For example, in EyesWeb we aim at developing methods able to distinguish the different expressive content of two instances of the same movement pattern, e.g., two performances of the same dance fragment. Our research addresses the fields of KANSEI Information Processing and of analysis and synthesis of expressiveness in movement. More.
The EyesWeb open platform (free download) was originally conceived to support research on multimodal expressive interfaces and multimedia interactive systems. EyesWeb has also been widely employed for designing and developing real-time dance, music, and multimedia applications. It supports the user in experimenting with computational models of non-verbal expressive communication and in mapping gestures from different modalities (e.g., human full-body movement, music) onto multimedia output (e.g., sound, music, visual media). It allows fast development and experimentation cycles for interactive performance set-ups by including a visual programming language that supports mapping, at different levels, of movement and audio into integrated music, visual, and mobile scenery.
EyesWeb has been designed with a special focus on the analysis and processing of expressive gesture in movement, MIDI, audio, and music signals. It was the basic platform of the EU-IST Project MEGA and has been employed in many artistic performances and interactive installations. More.
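EyesWeb itself is a visual programming environment, but the kind of gesture-to-sound mapping its patches implement can be illustrated in a few lines of Python: estimate a crude "quantity of motion" from successive video frames and map it onto a loudness value. The sketch below is only an analogy for the mapping idea and does not use the EyesWeb API.

```python
# Rough textual analogue of a gesture-to-sound mapping: the fraction of pixels
# that changed between two grayscale frames ("quantity of motion") is mapped
# onto a MIDI-style velocity. Illustration only; not EyesWeb's API.
import numpy as np

def quantity_of_motion(prev_frame, frame, thresh=10.0):
    """Fraction of pixels whose intensity changed by more than `thresh` (0-255 frames)."""
    moved = np.abs(frame.astype(float) - prev_frame.astype(float)) > thresh
    return float(moved.mean())

def motion_to_velocity(qom):
    """Map quantity of motion (0..1) onto a MIDI velocity (0..127)."""
    return int(round(min(max(qom, 0.0), 1.0) * 127))

# Example with two synthetic 120x160 grayscale frames
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
frame_b = frame_a.copy()
frame_b[:, :40] = rng.integers(0, 256, size=(120, 40), dtype=np.uint8)  # motion in one quarter
qom = quantity_of_motion(frame_a, frame_b)
print(qom, motion_to_velocity(qom))   # roughly a quarter of the pixels register motion
```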
16:15 Posted in Future interfaces | Permalink | Comments (0) | Tags: cybermusic
New NIH Neurotech Funding Opportunities
NIH has announced new Federal funding to advance understanding of the nervous system, behavior, or the diagnosis and treatment of nervous system diseases and disorders, through support for research, development, and enhancement of a wide range of neurotechnologies.
16:11 Posted in Neurotechnology & neuroinformatics | Permalink | Comments (0) | Tags: neurotechnology, neuroinformatics
Jul 13, 2007
A VR extended neuropsychological assessment for topographical disorientation
A virtual reality extended neuropsychological assessment for topographical disorientation: a feasibility study.
J Neuroengineering Rehabil. 2007 Jul 11;4(1):26
Authors: Morganti F, Gaggioli A, Strambi L, Rusconi ML, Riva G
ABSTRACT: BACKGROUND: Topographical disorientation represents one of the main consequences of brain injury. Up to now several methodological approaches have been used in the assessment of the brain injured patient's navigational abilities showing a moderate correlation with the impairments observed in everyday contexts. METHODS: We propose a combination of standardized neuropsychological tests and a more situated virtual reality-based assessment for the evaluation of spatial orientation in brain injured patients. RESULTS: When tested with this virtual reality integrated procedure patients showed performance and execution times congruent with their neuropsychological evaluation. When compared to a control group, patients revealed significantly slower times and greater errors in solving virtual reality based spatial tasks. CONCLUSIONS: The use of virtual reality, when combined with classical neuropsychological tests, can provide an effective tool for the study of topographical disorientation.
19:44 Posted in Research tools | Permalink | Comments (0) | Tags: virtual reality
Direct instrumental conditioning of neural activity using fMRI feedback
Direct instrumental conditioning of neural activity using functional magnetic resonance imaging-derived reward feedback.
J Neurosci. 2007 Jul 11;27(28):7498-507
Authors: Bray S, Shimojo S, O'Doherty JP
19:39 Posted in Biofeedback & neurofeedback | Permalink | Comments (0) | Tags: neurofeedback
Jul 11, 2007
Gadgets may help merge virtual reality with real life
reBlogged from networked performance
Linden Lab, the company behind Second Life, hopes to introduce hand-held and wearable systems that act as gateways between the real and virtual worlds. Linden Lab and other virtual-world developers are also working on versions that run on existing mobile phones.
Researchers at a recent virtual worlds conference at MIT said that special eyewear, display "badges," and speakers worn about the neck will allow us to live more fully through our avatars - those idealized versions of ourselves that typically boast better proportions than the saggy originals.
Read full article
22:00 Posted in Augmented/mixed reality | Permalink | Comments (0) | Tags: mixed reality
Randomized controlled trial of virtual reality simulator training: transfer to live patients
Randomized controlled trial of virtual reality simulator training: transfer to live patients.
Am J Surg. 2007 Aug;194(2):205-11
Authors: Park J, MacRae H, Musselman LJ, Rossos P, Hamstra SJ, Wolman S, Reznick RK
BACKGROUND: New Residency Review Committee requirements in general surgery require 50 colonoscopies. Simulators have been widely suggested to help prepare residents for live clinical experience. We assessed a computer-based colonoscopy simulator for effective transfer of skills to live patients. METHODS: A randomized controlled trial included general surgery and internal medicine residents with limited endoscopic experience. Following a pretest, the treatment group (n = 12) practiced on the simulator, while controls (n = 12) received no additional training. Both groups then performed a colonoscopy on a live patient. Technical ability was evaluated by expert endoscopists using previously validated assessment instruments. RESULTS: In the live patient setting, the treatment group scored significantly higher global ratings than controls (t(22) = 1.84, P = .04). Only 2 of the 8 computer-based performance metrics correlated significantly with previously validated global ratings of performance. CONCLUSIONS: Residents trained on a colonoscopy simulator prior to their first patient-based colonoscopy performed significantly better in the clinical setting than controls, demonstrating skill transfer to live patients. The simulator's performance metrics showed limited concurrent validity, suggesting the need for further refinement.
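As a quick sanity check on the reported effect, the snippet below recovers a p-value of roughly .04 from t(22) = 1.84, assuming a one-tailed test (the abstract does not state which tail was used, so that is an assumption here).

```python
# Recover the p-value from the reported statistic t(22) = 1.84.
# One-tailed is an assumption; the two-tailed value is shown for comparison.
from scipy import stats

t_value, df = 1.84, 22
p_one_tailed = stats.t.sf(t_value, df)
p_two_tailed = 2 * p_one_tailed
print(f"one-tailed p = {p_one_tailed:.3f}, two-tailed p = {p_two_tailed:.3f}")
```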
21:54 Posted in Virtual worlds | Permalink | Comments (0) | Tags: virtual reality
MIT Media Lab: Responsive Environment Group
"Dual reality" is the concept of maintaining two worlds, one virtual and one real, that reflect, influence, and merge into each other by means of deeply embedded sensor/actuator networks. Both the real and virtual components of a dual reality are complete unto themselves, but are enriched by their mutual interaction. The Dual Reality Media Lab is an example of such a dual reality, as enabled the Plug sensor / actuator network that links our actual lab space to a virtual lab space in the Second Life online virtual world. [MOV]
21:47 Posted in Pervasive computing | Permalink | Comments (0) | Tags: interreality
Myomo e100 NeuroRobotic System
Via Medgadget
US company Myomo has announced that the e100 NeuroRobotic System, a technology designed to help in the rehabilitation process of patients by "engaging and reinforcing both neurological and motor pathways," has received FDA clearance to market.
How it Works
- Patient's brain is the controller: When a patient attempts movement during therapy, their muscles contract and electrical muscle activity signals fire
- Non-invasive sensing: An EMG sensor sits on the skin's surface to detect and continuously monitor a person's residual electrical muscle activity
- Proprietary system software: Advanced signal processing software filters and processes the user's EMG signal, and then forwards the data to a robotic device
- Proportional assistance: Portable, wearable robotics use the person's EMG signal to assist with desired movement; power assistance is customized to patient ability with Myomo's real-time adjustable control unit (see the signal-chain sketch below).
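A minimal sketch of that signal chain, under assumed parameters (the sampling rate, activation threshold, and gain below are illustrative, not Myomo's), might look like this:

```python
# EMG-to-assistance sketch: rectify the raw EMG, low-pass filter it into an
# activation envelope, and scale the envelope above a small threshold into a
# proportional assistance command. All parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0            # assumed EMG sampling rate (Hz)
THRESHOLD = 0.05       # assumed activation threshold (normalized units)
MAX_ASSIST_NM = 5.0    # assumed maximum assistance torque (N*m)

def emg_envelope(raw_emg, cutoff_hz=4.0):
    """Rectify the EMG and low-pass filter it into a smooth activation envelope."""
    b, a = butter(2, cutoff_hz / (FS / 2))
    return filtfilt(b, a, np.abs(raw_emg))

def assistance_torque(envelope):
    """Scale the envelope above the threshold into a proportional assistance command."""
    activation = np.clip((envelope - THRESHOLD) / (1.0 - THRESHOLD), 0.0, 1.0)
    return MAX_ASSIST_NM * activation

# Example: 2 s of simulated EMG with a burst of muscle activity in the middle
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
burst = ((t > 0.8) & (t < 1.4)).astype(float)
raw = 0.02 * rng.standard_normal(t.size) + 0.6 * burst * rng.standard_normal(t.size)
torque = assistance_torque(emg_envelope(raw))
print(f"peak assistance: {torque.max():.2f} N*m")
```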
Product page: Myomo e100 NeuroRobotic System ...
21:36 Posted in AI & robotics, Cybertherapy | Permalink | Comments (0) | Tags: cybertherapy
Using a Robot to Teach Human Social Skills
Via KurzweilAI
A humanoid robot designed to teach autistic children social skills has begun testing in British schools. Known as KASPAR (Kinesics and Synchronisation in Personal Assistant Robotics), the $4.33 million bot smiles and simulates surprise and sadness.
Read full article
21:26 Posted in AI & robotics, Cybertherapy | Permalink | Comments (0) | Tags: artificial intelligence, robotics, cybertherapy
Jul 09, 2007
Postdoctoral Position in Theoretical Neuroscience at CNCR, Amsterdam
20:51 Posted in Research institutions & funding opportunities | Permalink | Comments (0) | Tags: neuroscience
Jul 08, 2007
Field dependency and the sense of object-presence in haptic virtual environments
Field dependency and the sense of object-presence in haptic virtual environments.
Cyberpsychol Behav. 2007 Apr;10(2):243-51
Authors: Hecht D, Reiner M
Virtual environment (VE) users often report having a sense of being present in the virtual place or a sense that the virtual object is present in their environment. This sense of presence depends on both the technological fidelity (e.g., in graphics, haptics) and the users' cognitive/personality characteristics. This study examined the correlation between users' cognitive style on the field-dependency dimension and the level of object-presence they reported in a haptic VE. Results indicated that field-independent individuals reported higher presence ratings compared to field-dependent participants. We hypothesize that field-independents' advantage in reorganizing the perceptual field and constructing it according to their previously acquired internal knowledge enables them to cognitively reconstruct the VE experience more efficiently by selectively attending only to the relevant cues and by filling in the gap of missing information with their previous knowledge and creative imagination. This active and creative cognitive process may be behind the enhanced sense of presence. In addition, we raise a possible linkage between field dependency, the sense of presence, and the simulator sickness phenomenon.
00:48 Posted in Telepresence & virtual presence | Permalink | Comments (0) | Tags: telepresence, virtual presence