

Dec 14, 2018

Transformative Experience Design

In the last couple of years, my team and I have been working intensively on a new research program in Positive Technology: Transformative Experience Design.

In short, the goal of this project is to understand how virtual reality, brain-based technologies and the language of the arts can support transformative experiences, that is, emotional experiences that promote deep personal change.

About Transformative Experience Design

As noted by Miller and C’de Baca, there are experiences in life that are able to generate profound and long-lasting shifts in core beliefs and attitudes, including subjective self-transformation. These experiences have the capacity to change not only what individuals know and value, but also how they see the world.

According to Mezirow’s Transformative Learning Theory, these experiences can be triggered by a “disorienting dilemma” usually related to a life crisis or major life transition (e.g., death, illness, separation, or divorce), which forces individuals to critically examine and eventually revise their core assumptions and beliefs. The outcome of a transformative experience is a significant and permanent change in the expectations – mindsets, perspectives and habits of mind – through which we filter and make sense of the world. Because of these characteristics, transformative experiences are gaining increasing attention not only in psychology and neuroscience, but also in philosophy.

From a psychological perspective, transformative change is often associated with specific experiential states, termed “self-transcendent experiences”. These are transient mental states that allow individuals to experience something greater than themselves, reflect on deeper dimensions of their existence and shape lasting spiritual beliefs. These experiences encompass several mental states, including flow, positive emotions such as awe and elevation, “peak” experiences, “mystical” experiences and mindfulness (for a review, see Yaden et al.). Although the phenomenological profile of these experiential states can vary significantly in terms of quality and intensity, they are characterized by a diminished sense of self and increased feelings of connectedness to other people and one’s surroundings. Previous research has shown that self-transcendent experiences are important sources of positive psychological outcomes, including increased meaning in life, positive mood and life satisfaction, positive behavior change, spiritual development and pro-social attitudes.

One potentially interesting question related to self-transcendent experiences concerns whether, and to what extent, these mental states can be invited or elicited by means of interactive technologies. This question lies at the center of a new research program – Transformative Experience Design (TED) – which has a twofold aim:

  • to systematically investigate the phenomenological and neuro-cognitive aspects of self-transcendent experiences, as well as their implications for individual growth and psychological wellbeing; and
  • to translate such knowledge into a tentative set of design principles for developing “e-experiences” that support meaning in life and personal growth. 

The three pillars of TED: virtual reality, arts and neurotechnologies

I have identified three possible assets that can be combined to achieve this goal:

  1. The first strategy concerns the use of advanced simulation technologies, such as virtual, augmented and mixed reality, as the elective medium to generate controlled alteration of perceptual, motor and cognitive processes. 
  2. The second asset regards the use of the language of arts to create emotionally-compelling storytelling scenarios.
  3. The third and final element of TED concerns the use of brain-based technologies, such as brain stimulation and bio/neurofeedback, to modulate neuro-physiological processes underlying self-transcendence mental states, using a closed-loop approach.

The central assumption of TED is that the combination of these means provides a broad spectrum of transformative possibilities, including, for example, experiencing “what it is like” to embody another self or another life form, simulating peculiar neurological phenomena such as synesthesia or out-of-body experiences, and altering time and space perception.

The safe and controlled use of these e-experiences holds the potential to facilitate self-knowledge and self-understanding, foster creative expression, develop new skills, and help users recognize and appreciate the value of others.

Examples of TED research projects

Although TED is a recent research program, we are building a fast-growing community of researchers, artists and developers to shape the next generation of transformative experiences. Here is a list of recent projects and publications related to TED in different application contexts.

The Emotional Labyrinth

In this project I teamed up with Sergi Bermudez i Badia and Mónica S. Cameirão from the Madeira Interactive Technologies Institute to realize the first example of an emotionally-adaptive virtual reality application for mental health. So far, virtual reality applications in wellbeing and therapy have typically been based on pre-designed objects and spaces. In this project, we suggest a different approach, in which the content of a virtual world is procedurally generated at runtime (that is, through algorithmic means) according to the user’s affective responses. To demonstrate the concept, we developed a first prototype using Unity: the “Emotional Labyrinth”. In this VR experience, the user walks through an endless maze, whose structure and contents are automatically generated according to four basic emotional states: joy, sadness, anger and fear.

During navigation, affective states are dynamically represented through pictures, music, and animated visual metaphors chosen to represent and induce emotional states.

The underlying hypothesis is that exposing users to multimodal representations of their affective states can create a feedback loop that supports emotional self-awareness and fosters more effective emotion regulation strategies. We carried out a first study to (i) assess the effectiveness of the selected metaphors in inducing target emotions, and (ii) identify relevant psycho-physiological markers of the emotional experience generated by the labyrinth. Results showed that the Emotional Labyrinth is overall a pleasant experience and that the proposed procedural content generation can induce distinctive psycho-physiological patterns, generally coherent with the meaning of the metaphors used in the labyrinth design. Further, the collected psycho-physiological responses, such as electrocardiography, respiration, electrodermal activity, and electromyography, were used to generate computational models of users’ reported experience. These models enable the future implementation of a closed-loop mechanism to procedurally adapt the Labyrinth to the user’s affective state.
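As a toy sketch of the runtime generation step (asset names and the emotion-to-content mapping below are invented for illustration; the actual Unity prototype uses its own content sets and classifiers), selecting the content of the next maze segment from a classified affective state might look like:

```python
import random

# Hypothetical emotion-to-asset mapping; every name here is illustrative,
# not taken from the actual Emotional Labyrinth prototype.
EMOTION_ASSETS = {
    "joy":     {"music": "major_key", "palette": "warm", "metaphors": ["sunrise", "blossom"]},
    "sadness": {"music": "minor_key", "palette": "cold", "metaphors": ["rain", "wilted_tree"]},
    "anger":   {"music": "dissonant", "palette": "red",  "metaphors": ["storm", "fire"]},
    "fear":    {"music": "sparse",    "palette": "dark", "metaphors": ["fog", "shadow"]},
}

def generate_segment(emotion: str, rng: random.Random) -> dict:
    """Return a content description for the next maze segment,
    conditioned on the currently classified emotional state."""
    assets = EMOTION_ASSETS[emotion]
    return {
        "music": assets["music"],
        "palette": assets["palette"],
        "metaphor": rng.choice(assets["metaphors"]),
    }

segment = generate_segment("joy", random.Random(0))
```

In a closed-loop version, the `emotion` argument would come from a classifier running on the physiological signals (ECG, respiration, EDA, EMG) mentioned above, closing the feedback loop between user state and world content.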

Awe in Virtual Reality

Awe is a compelling emotional experience with philosophical roots in the domain of aesthetics and religious or spiritual experiences. Both Edmund Burke’s (1759/1970) and Immanuel Kant’s (1764/2007) analyses of the sublime as a compelling experience that transcends one’s perception of beauty to something more profound are couched in terms that seem synonymous with the modern understanding of awe.

The contemporary psychological understanding of awe comes largely from a foundational article written by Keltner and Haidt (2003). According to their conceptualization, awe experiences encompass two key appraisals: the perception of vastness and the need to mentally attempt to accommodate this vastness into existing mental schemas.

Crucially, research has shown that experiencing awe is associated with positive transformative changes at both psychological and physical levels (e.g., Shiota et al., 2007; Schneider, 2009; Stellar et al., 2015). For example, awe can change our perspective toward even unknown others, thus increasing our generous attitude toward them (Piff et al., 2015; Prade and Saroglou, 2016) and reducing aggressive behaviors (Yang et al., 2016). Generally, awe broadens our attentional focus (Sung and Yih, 2015) and extends time perception (Rudd et al., 2012). Furthermore, this emotion protects our immune system against chronic and cardiovascular diseases and enhances our satisfaction with life (Krause and Hayward, 2015; Stellar et al., 2015).

Considering the transformative potential of awe, my doctoral student Alice Chirico and I focused on how to elicit intense feelings of this complex emotion using virtual reality. To this end, we modeled three immersive virtual environments (i.e., a forest including tall trees; a chain of mountains; and an earth view from deep space) designed to induce a feeling of perceptual vastness. As hypothesized, the three target environments induced significantly greater awe than a "neutral" virtual environment (a park consisting of a green clearing with very few trees and some flowers). Full details of this study are reported here.

In another study, we examined the potential of VR-induced awe to foster creativity. To this end, we exposed participants to both an awe-inducing 3D video and a neutral one in a within-subject design. After each stimulation condition, participants reported the intensity and type of perceived emotion and completed two verbal tasks of the Torrance Tests of Creative Thinking (TTCT; Torrance, 1974), a standardized test of creative performance. Results showed that awe positively affected key creativity components—fluency, flexibility, and elaboration, as measured by the TTCT subtests—compared to a neutral stimulus, suggesting that (i) awe has a potential for boosting creativity, and (ii) VR is a powerful awe-inducing medium that can be used in different application contexts (e.g., educational, clinical, etc.) where this emotion can make a difference.
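For readers curious about the statistics behind such a within-subject comparison, a minimal sketch of a paired t-statistic is shown below; the scores are invented for illustration and are not data from the study:

```python
import math

def paired_t(cond_a, cond_b):
    """Paired t-statistic for a within-subject design:
    mean of per-participant differences over its standard error."""
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    k = len(diffs)
    mean = sum(diffs) / k
    var = sum((d - mean) ** 2 for d in diffs) / (k - 1)  # sample variance
    return mean / math.sqrt(var / k)

# Hypothetical TTCT fluency scores for five participants:
# awe condition vs. neutral condition (made-up numbers).
t = paired_t([14, 12, 16, 15, 13], [11, 12, 13, 12, 11])
```

With a real dataset one would normally use a library routine (e.g., `scipy.stats.ttest_rel`) to obtain the p-value as well; the hand-rolled version above only makes the computation explicit.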

However, graphical 3D environments are not the only way to induce awe: in another study, we showed that 360° videos depicting vast natural scenarios are also powerful stimuli for inducing intense feelings of this complex emotion.

Immersive storytelling for psychotherapy and mental wellbeing

Growing research evidence indicates that VR can be effectively integrated in psychotherapy to treat a number of clinical conditions, including anxiety disorders, pain disorders and PTSD. In this context, VR is mostly used as a simulative tool for controlled exposure to critical/fearful situations. The possibility of presenting realistic controlled stimuli and, simultaneously, of monitoring the responses generated by the user offers a considerable advantage over real experiences.

However, the most interesting potential of VR resides in its capacity to create compelling immersive storytelling experiences. As recently noted by Brenda Wiederhold:

Virtual training simulations, documentaries, and experiences will, however, only be as effective as the emotions they spark in the viewer. To reach that point, the VR industry is faced with two obstacles: creating content that is enjoyable and engaging, and encouraging adoption of the medium among consumers. Perhaps the key to both problems is the recognition that users are not passive consumers of VR content. Rather, they bring their own thoughts, needs, and emotions to the worlds they inhabit. Successful stories challenge those conceptions, invite users to engage with the material, and recognize the power of untethering users from their physical world and throwing them into another. That isn’t just the power of VR—it’s the power of storytelling as a whole.

Thus, VR-based narratives can be used to generate an infinite number of “possible selves”, by providing a person with a “subjective window of presence” into unactualized, but possible, worlds.

The emergence of immersive storytelling introduces the possibility of using VR in mental health with a different rationale from virtual reality-based exposure therapy. In this novel rationale, immersive stories, lived from a first-person perspective, provide the patient with the opportunity to engage emotionally with metaphoric narratives, eliciting new insights and meaning-making related to their personal world views.

To explore this new perspective, I have been collaborating with the Italian startup Become to test the potential of transformative immersive storytelling in mental health and wellbeing. An intriguing aspect of this strategy is that, in contrast with conventional virtual-reality exposure therapy, which is mostly used in combination with Cognitive-Behavioral Therapy interventions, immersive storytelling scenarios can be integrated in any therapeutic model, since all kinds of psychotherapy involve some form of ‘storytelling’. 

In this project, we are interested in understanding, for example, whether the integration of immersive stories in the therapeutic setting can enhance the efficacy of the intervention and facilitate patients in expressing their inner thoughts, feelings, and life experiences. 

Collaborate!

Are you a researcher, a developer, or an artist interested in collaborating on TED projects? Here is how:

  1. Drop me an email at: andrea.gaggioli@unicatt.it
  2. Sign in to ResearchGate and visit the Transformative Experience Design project's page
  3. Have a look at the existing projects and publications to find out which TED research line is most interesting to you.

Key references

[1] Miller, W. R., & C'de Baca, J. (2001). Quantum change: When epiphanies and sudden insights transform ordinary lives. New York: Guilford Press.

[2] Yaden, D. B., Haidt, J., Hood, R. W., Jr., Vago, D. R., & Newberg, A. B. (2017). The varieties of self-transcendent experience. Review of General Psychology, 21(2), 143-160.

[3] Gaggioli, A. (2016). Transformative Experience Design. In Human Computer Confluence. Transforming Human Experience Through Symbiotic Technologies, eds A. Gaggioli, A. Ferscha, G. Riva, S. Dunne, and I. Viaud-Delmon (Berlin: De Gruyter Open), 96–121.

Jan 02, 2017

Mind-controlled toys: The next generation of Christmas presents?

Source: University of Warwick 

The next generation of toys could be controlled by the power of the mind, thanks to research by the University of Warwick.


Led by Professor Christopher James, Director of Warwick Engineering in Biomedicine at the School of Engineering, the team has developed technology that allows electronic devices to be activated using electrical impulses from brain waves, by connecting our thoughts to computerised systems. Some of the most popular toys on children's lists to Santa - such as remote-controlled cars and helicopters, toy robots and Scalextric racing sets - could all be controlled via a headset, using 'the power of thought'.

This could be based on levels of concentration - thinking of your favourite colour or stroking your dog, for example. Instead of a hand-held controller, a headset is used to create a brain-computer interface - a communication link between the human brain and the computerised device.

Sensors in the headset measure the electrical impulses from the brain at various frequencies - each frequency can be somewhat controlled, under special circumstances. This activity is then processed by a computer, amplified and fed into the electrical circuit of the electronic toy. Professor James comments on the future potential of this technology: "Whilst brain-computer interfaces already exist - there are already a few gaming headsets on the market - their functionality has been quite limited.

"New research means the headsets now read cleaner and stronger signals than ever before - this means stronger links to the toy, game or action, thus making it a very immersive experience. The exciting bit is what comes next - how long before we start unlocking the front door or answering the phone through brain-computer interfaces?"
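The control principle described above can be sketched in a few lines. The band names, the beta/alpha ratio heuristic and the threshold are all assumptions for illustration, not details of the Warwick system:

```python
# Illustrative sketch of a brain-computer interface control step:
# a headset reports per-band EEG power, and sustained power in a
# "concentration" band is mapped to a simple toy command.

def toy_command(band_power: dict, threshold: float = 1.5) -> str:
    """Map EEG band powers to a drive command.

    Uses the ratio of beta power (often linked to active concentration)
    to alpha power (often linked to relaxation) - a crude proxy chosen
    purely for illustration.
    """
    ratio = band_power["beta"] / band_power["alpha"]
    return "drive" if ratio >= threshold else "stop"

cmd = toy_command({"alpha": 2.0, "beta": 4.0})
```

A real system would compute these band powers from the raw headset signal (e.g., via a short-time Fourier transform) and smooth the command over time to avoid jitter.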

 

Effects of Smart-Tablet-Based Neurofeedback Training on Cognitive Function in Children with Attention Problems

J Child Neurol. 2016 May;31(6):750-60 Authors: Shin MS, Jeon H, Kim M, Hwang T, Oh SJ, Hwangbo M, Kim KJ

Abstract. We sought to determine whether smart-tablet-based neurofeedback could improve executive function - including attention, working memory, and self-regulation - in children with attention problems. Forty children (10-12 years old) with attention problems, as determined by ratings on the Conners Parent Rating Scale, were assigned to either a neurofeedback group that received 16 sessions or a control group. A comprehensive test battery assessing general intelligence, visual and auditory attention, attentional shifting and response inhibition, together with behavior rating scales, was administered to both groups before neurofeedback training. Several neuropsychological tests were conducted at posttraining and follow-up assessment. Scores on several neuropsychological tests and parent behavior rating scales showed significant improvement in the training group but not in the controls. The improvements remained through the follow-up assessment. This study suggests that the smart-tablet-based neurofeedback training program might improve cognitive function in children with attention problems.

Jun 21, 2016

New book on Human Computer Confluence - FREE PDF!

Two good news for Positive Technology followers.

1) Our new book on Human Computer Confluence is out!

2) It can be downloaded for free here


Human-computer confluence refers to an invisible, implicit, embodied or even implanted interaction between humans and system components. New classes of user interfaces are emerging that make use of several sensors and are able to adapt their physical properties to the current situational context of users.

A key aspect of human-computer confluence is its potential for transforming human experience in the sense of bending, breaking and blending the barriers between the real, the virtual and the augmented, to allow users to experience their body and their world in new ways. Research on Presence, Embodiment and Brain-Computer Interface is already exploring these boundaries and asking questions such as: Can we seamlessly move between the virtual and the real? Can we assimilate fundamentally new senses through confluence?

The aim of this book is to explore the boundaries and intersections of the multidisciplinary field of HCC and discuss its potential applications in different domains, including healthcare, education, training and even arts.

DOWNLOAD THE FULL BOOK HERE AS OPEN ACCESS

Please cite as follows:

Andrea Gaggioli, Alois Ferscha, Giuseppe Riva, Stephen Dunne, Isabell Viaud-Delmon (2016). Human computer confluence: transforming human experience through symbiotic technologies. Warsaw: De Gruyter. ISBN 9783110471120.

 

May 26, 2016

From User Experience (UX) to Transformative User Experience (T-UX)

In 1999, Joseph Pine and James Gilmore wrote a seminal book titled “The Experience Economy” (Harvard Business School Press, Boston, MA) that theorized the shift from a service-based economy to an experience-based economy.

According to these authors, in the new experience economy the goal of the purchase is no longer to own a product (be it a good or service), but to use it in order to enjoy a compelling experience. An experience, thus, is a whole new type of offering: in contrast to commodities, goods and services, it is designed to be as personal and memorable as possible. Just as in a theatrical performance, companies stage meaningful events to engage customers in a memorable and personal way, by offering activities that provide engaging and rewarding experiences.

Indeed, if one looks back at the past ten years, the concept of experience has become more central to several fields, including tourism, architecture, and – perhaps more relevant for this column – to human-computer interaction, with the rise of “User Experience” (UX).

The concept of UX was introduced by Donald Norman in a 1995 article published in the CHI proceedings (D. Norman, J. Miller, A. Henderson: What You See, Some of What's in the Future, And How We Go About Doing It: HI at Apple Computer. Proceedings of CHI 1995, Denver, Colorado, USA). Norman argued that focusing exclusively on usability attributes (i.e., ease of use, efficacy, effectiveness) when designing an interactive product is not enough; one should take into account the whole experience of the user with the system, including users’ emotional and contextual needs. Since then, the UX concept has assumed an increasing importance in HCI. As McCarthy and Wright emphasized in their book “Technology as Experience” (MIT Press, 2004):

“In order to do justice to the wide range of influences that technology has in our lives, we should try to interpret the relationship between people and technology in terms of the felt life and the felt or emotional quality of action and interaction.” (p. 12).

However, according to Pine and Gilmore, experience may not be the final step of what they call the “Progression of Economic Value”. They speculated further into the future, identifying the “Transformation Economy” as the likely next phase. In their view, while experiences are essentially memorable events that stimulate the sensorial and emotional levels, transformations go much further: they are the result of a series of experiences staged by companies to guide customers in learning, taking action and eventually achieving their aspirations and goals.

In Pine and Gilmore’s terms, an aspirant is the individual who seeks advice for personal change (e.g., a better figure, a new career, and so forth), while the provider of this change (a dietitian, a university) is an elicitor. The elicitor guides the aspirant through a series of experiences designed with certain purposes and goals. According to Pine and Gilmore, the main difference between an experience and a transformation is that the latter occurs when an experience is customized:

“When you customize an experience to make it just right for an individual - providing exactly what he needs right now - you cannot help changing that individual. When you customize an experience, you automatically turn it into a transformation, which companies create on top of experiences (recall that phrase: “a life-transforming experience”), just as they create experiences on top of services and so forth” (p. 244).

A further key difference between experiences and transformations concerns their effects: because an experience is inherently personal, no two people can have the same one. Likewise, no individual can undergo the same transformation twice: the second time it is attempted, the individual would no longer be the same person (pp. 254-255).

But what will be the impact of this upcoming, “transformation economy” on how people relate with technology? If in the experience economy the buzzword is “User Experience”, in the next stage the new buzzword might be “User Transformation”.

Indeed, we can see some initial signs of this shift. For example, FitBit and similar self-tracking gadgets are starting to offer personalized advice to foster enduring changes in users’ lifestyles; another example comes from the fields of ambient intelligence and domotics, where there is an increasing focus on designing systems that are able to learn from the user’s behaviour (e.g., by tracking the movements of an elderly person in their home) to provide context-aware adaptive services (e.g., sending an alert when the user is at risk of falling).

But likely, the most important ICT step towards the transformation economy could take place with the introduction of next-generation immersive virtual reality systems. Since these new systems are based on mobile devices (an example is the recent partnership between Oculus and Samsung), they are able to deliver VR experiences that incorporate information on the external/internal context of the user (e.g., time, location, temperature, mood, etc.) by using the sensors embedded in the mobile phone.

By personalizing the immersive experience with context-based information, it might be possible to induce higher levels of involvement and presence in the virtual environment. In the case of cyber-therapeutic applications, this could translate into the development of more effective, transformative virtual healing experiences.

Furthermore, the emergence of "symbiotic technologies", such as neuroprosthetic devices and neuro-biofeedback, is enabling a direct connection between the computer and the brain. Increasingly, these neural interfaces are moving from the biomedical domain to become consumer products. But unlike existing digital experiential products, symbiotic technologies have the potential to transform basic human experiences more radically.

Brain-computer interfaces, immersive virtual reality and augmented reality and their various combinations will allow users to create “personalized alterations” of experience. Just as nowadays we can download and install a number of “plug-ins”, i.e. apps to personalize our experience with hardware and software products, so very soon we may download and install new “extensions of the self”, or “experiential plug-ins” which will provide us with a number of options for altering/replacing/simulating our sensorial, emotional and cognitive processes.

Such mediated recombinations of human experience will result from the application of existing neuro-technologies in completely new domains. Although virtual reality and brain-computer interfaces were originally developed for applications in specific domains (e.g., military simulations, neurorehabilitation, etc.), today the use of these technologies has been extended to other fields of application, ranging from entertainment to education.

In the field of biology, Stephen Jay Gould and Elizabeth Vrba (Paleobiology, 8, 4-15, 1982) defined “exaptation” as the process in which a feature acquires a function other than the one for which it was originally selected. Likewise, the exaptation of neurotechnologies to the digital consumer market may lead to the rise of a novel “neuro-experience economy”, in which technology-mediated transformation of experience is the main product.

Just as a Genetically-Modified Organism (GMO) is an organism whose genetic material is altered using genetic-engineering techniques, so we could define a Technologically-Modified Experience (TME) as a re-engineered experience resulting from the artificial manipulation of the neurobiological bases of sensorial, affective, and cognitive processes.

Clearly, the emergence of the transformative neuro-experience economy will not happen in weeks or months, but rather in years. It will take some time before people find brain-computer devices on the shelves of electronics stores: most of these tools are still in the pre-commercial phase at best, and some are found only in laboratories.

Nevertheless, the mere possibility that such a scenario will sooner or later come to pass raises important questions that should be addressed before symbiotic technologies enter our lives: does technological alteration of human experience threaten the autonomy of individuals, or the authenticity of their lives? How can we help individuals decide which transformations are good or bad for them?

Answering these important issues will require the collaboration of many disciplines, including philosophy, computer ethics and, of course, cyberpsychology.

Aug 31, 2014

Self-regulation of human brain activity using simultaneous real-time fMRI and EEG neurofeedback

Zotev V1,Phillips R, Yuan H, Misaki M, Bodurka J. Neuroimage. 2014 Jan 15;85 Pt 3:985-95. doi: 10.1016/j.neuroimage.2013.04.126. Epub 2013 May 11.

Abstract. Neurofeedback is a promising approach for non-invasive modulation of human brain activity with applications for treatment of mental disorders and enhancement of brain performance. Neurofeedback techniques are commonly based on either electroencephalography (EEG) or real-time functional magnetic resonance imaging (rtfMRI). Advances in simultaneous EEG-fMRI have made it possible to combine the two approaches. Here we report the first implementation of simultaneous multimodal rtfMRI and EEG neurofeedback (rtfMRI-EEG-nf). It is based on a novel system for real-time integration of simultaneous rtfMRI and EEG data streams. We applied the rtfMRI-EEG-nf to training of emotional self-regulation in healthy subjects performing a positive emotion induction task based on retrieval of happy autobiographical memories. The participants were able to simultaneously regulate their BOLD fMRI activation in the left amygdala and frontal EEG power asymmetry in the high-beta band using the rtfMRI-EEG-nf. Our proof-of-concept results demonstrate the feasibility of simultaneous self-regulation of both hemodynamic (rtfMRI) and electrophysiological (EEG) activities of the human brain. They suggest potential applications of rtfMRI-EEG-nf in the development of novel cognitive neuroscience research paradigms and enhanced cognitive therapeutic approaches for major neuropsychiatric disorders, particularly depression.
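As a generic illustration of the kind of EEG feedback signal mentioned in the abstract, a frontal asymmetry index is often computed as the difference of log band powers between homologous right and left frontal electrodes; the exact formula used in this study may differ, so the sketch below is only a common convention:

```python
import math

def frontal_asymmetry(p_left: float, p_right: float) -> float:
    """Generic frontal asymmetry index: ln(right power) - ln(left power).

    Positive values indicate relatively greater right-hemisphere band
    power. This is one widely used convention, not necessarily the one
    implemented in the rtfMRI-EEG neurofeedback system described above.
    """
    return math.log(p_right) - math.log(p_left)

# Hypothetical high-beta band powers at homologous frontal sites
a = frontal_asymmetry(p_left=2.0, p_right=4.0)
```

In a neurofeedback loop, this index would be recomputed on each short EEG window and displayed to the participant alongside the fMRI-derived amygdala signal.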

Biofeedback-based training for stress management in daily hassles: an intervention study.

Brain Behav. 2014 Jul;4(4):566-579

Authors: Kotozaki Y, Takeuchi H, Sekiguchi A, Yamamoto Y, Shinada T, Araki T, Takahashi K, Taki Y, Ogino T, Kiguchi M, Kawashima R

Abstract. BACKGROUND: The day-to-day causes of stress are called daily hassles. Daily hassles are correlated with ill health. Biofeedback (BF) is one of the tools used for acquiring stress-coping skills. However, the anatomical correlates of the effects of BF with long training periods remain unclear. In this study, we aimed to investigate this. METHODS: Participants were assigned randomly to two groups: the intervention group and the control group. Participants in the intervention group performed a biofeedback training (BFT) task (a combination task for heart rate and cerebral blood flow control) every day, for about 5 min once a day. The study outcomes included MRI, psychological tests (e.g., Positive and Negative Affect Schedule, Center for Epidemiologic Studies Depression Scale, and Brief Job Stress Questionnaire), and a stress marker (salivary cortisol levels) before (day 0) and after (day 28) the intervention. RESULTS: We observed significant improvements in the psychological test scores and salivary cortisol levels in the intervention group compared to the control group. Furthermore, voxel-based morphometric analysis revealed that compared to the control group, the intervention group had significantly increased regional gray matter (GM) volume in the right lateral orbitofrontal cortex, an anatomical cluster that includes mainly the left hippocampus, and the left subgenual anterior cingulate cortex. The GM regions are associated with the stress response, and, in general, these regions seem to be the most sensitive to the detrimental effects of stress. CONCLUSIONS: Our findings suggest that our BFT is effective against the GM structures vulnerable to stress.

Aug 03, 2014

Modulation of functional network with real-time fMRI feedback training of right premotor cortex activity

Neuropsychologia. 2014 Jul 21;

Authors: Hui M, Zhang H, Ge R, Yao L, Long Z

Abstract. Although the neurofeedback of real-time fMRI can reportedly enable people to gain control of the activity in the premotor cortex (PMA) during motor imagery, it is unclear how the neurofeedback training of PMA affect the motor network engaged in the motor execution (ME) and imagery (MI) task. In this study, we investigated the changes in the motor network engaged in both ME and MI task induced by real-time neurofeedback training of the right PMA. The neurofeedback training induced changes in activity of the ME-related motor network as well as alterations in the functional connectivity of both the ME-related and MI-related motor networks. Especially, the percent signal change of the right PMA in the last training run was found to be significantly correlated with the connectivity between the right PMA and the left posterior parietal lobe (PPL) during the pre-training MI run, post-training MI run and the last training run. Moreover, the increase in the tapping frequency was significantly correlated with the increase of connectivity between the right cerebellum and the primary motor area / primary sensory area (M1/S1) of the ME-related motor network after neurofeedback training. These findings show the importance of the connectivity between the right PMA and left PPL of the MI network for the up-regulation of the right PMA as well as the critical role of connectivity between the right cerebellum and M1/S1 of the ME network in improving the behavioral performance.

Jul 29, 2014

Real-time functional MRI neurofeedback: a tool for psychiatry

Real-time functional MRI neurofeedback: a tool for psychiatry.

Curr Opin Psychiatry. 2014 Jul 14;

Authors: Kim S, Birbaumer N

Abstract. PURPOSE OF REVIEW: The aim of this review is to provide a critical overview of recent research in the field of neuroscientific and clinical application of real-time functional MRI neurofeedback (rtfMRI-nf).
RECENT FINDINGS: RtfMRI-nf allows self-regulation of activity in circumscribed brain areas and brain systems. Furthermore, the learned regulation of brain activity has an influence on specific behaviors organized by the regulated brain regions. Patients with mental disorders show abnormal activity in certain regions, and control of these regions using rtfMRI-nf may affect the symptoms of the related behavioral disorders. SUMMARY: The promising results in clinical application indicate that rtfMRI-nf and other metabolic neurofeedback methods, such as near-infrared spectroscopy, might become a potential therapeutic tool. Further research is still required to examine whether rtfMRI-nf is a useful tool for psychiatry, because there is still a lack of knowledge about the neural function of certain brain systems and about neuronal markers for specific mental illnesses.

Apr 06, 2014

Glass brain flythrough: beyond neurofeedback

Via Neurogadget

Researchers have developed a new way to explore the human brain in virtual reality. The system, called Glass Brain, developed by Philip Rosedale, creator of the famous game Second Life, and Adam Gazzaley, a neuroscientist at the University of California San Francisco, combines brain scanning, brain recording and virtual reality to let a user journey through a person’s brain in real time.

Read the full story on Neurogadget

Mar 02, 2014

3D Thought controlled environment via Interaxon

In this demo video, artist Alex McLeod shows an environment he designed for Interaxon to use at CES in 2011 (interaxon.ca/CES#).

The glasses display the scene in 3D, and attached sensors read users’ brain states, which control elements of the scene.

3D Thought controlled environment via Interaxon from Alex McLeod on Vimeo.

Jan 23, 2014

Mobile biofeedback of heart rate variability in patients with diabetic polyneuropathy: a preliminary study

Mobile biofeedback of heart rate variability in patients with diabetic polyneuropathy: a preliminary study.

Clin Physiol Funct Imaging. 2014 Jan 20;

Authors: Druschky K, Druschky A

Abstract. Biofeedback of heart rate variability (HRV) was applied to patients with diabetic polyneuropathy using a new mobile device allowing regularly scheduled self-measurements without the need for visits to a special autonomic laboratory. Prolonged generation of data over an eight-week period facilitated more precise investigation of cardiac autonomic function and assessment of positive and negative trends of HRV parameters over time. Statistical regression analyses revealed significant trends in 11 of 17 patients, while no significant differences were observed when comparing autonomic screening by short-term HRV and respiratory sinus arrhythmia at baseline and after the 8-week training period. Four patients showed positive trends of HRV parameters despite the expected progression of cardiac autonomic dysfunction over time. Patient compliance was above 50% in all but two patients. The results of this preliminary study indicate good practicality of the handheld device and suggest a potential positive effect on cardiac autonomic neuropathy in patients with type 2 diabetes.
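For readers unfamiliar with the quantities involved, the sketch below computes one common time-domain HRV parameter (RMSSD) and a least-squares trend across sessions, roughly the kind of regression the study describes. The function names and data are illustrative assumptions, not the authors' code:

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR-interval
    differences (in ms), a standard time-domain HRV parameter."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def hrv_trend(session_values):
    """Least-squares slope of an HRV parameter across sessions;
    a significant positive slope would count as a 'positive trend'."""
    y = np.asarray(session_values, dtype=float)
    x = np.arange(len(y))
    slope, _intercept = np.polyfit(x, y, 1)
    return float(slope)
```

With one RMSSD value per scheduled self-measurement, the slope summarizes whether a patient's HRV drifted up or down over the eight weeks.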

Dec 24, 2013

Evaluation of neurofeedback in ADHD: The long and winding road.

Evaluation of neurofeedback in ADHD: The long and winding road.

Biol Psychol. 2013 Dec 6;

Authors: Arns M, Heinrich H, Strehl U

Abstract. Among the clinical applications of neurofeedback, most research has been conducted in ADHD. As an introduction, a short overview of the general history of neurofeedback is given, while the main part of the paper reviews the current state of neurofeedback in ADHD. A meta-analysis on neurofeedback from 2009 found large effect sizes for inattention and impulsivity and medium effect sizes for hyperactivity. Since 2009 several new studies, including 4 placebo-controlled studies, have been published. These latest studies are reviewed and discussed in more detail. The review focuses on studies employing 1) semi-active, 2) active, and 3) placebo-control groups. The assessment of the specificity of neurofeedback treatment in ADHD is discussed, and it is concluded that standard protocols such as theta/beta, SMR and slow cortical potential neurofeedback are well investigated and have demonstrated specificity. The paper ends with an outlook on future questions and tasks. It is concluded that future controlled clinical trials should, as a next step, focus on such known protocols and be designed along the lines of learning theory.

Dec 08, 2013

Real-Time fMRI Pattern Decoding and Neurofeedback Using FRIEND: An FSL-Integrated BCI Toolbox

Real-Time fMRI Pattern Decoding and Neurofeedback Using FRIEND: An FSL-Integrated BCI Toolbox.

PLoS One. 2013;8(12):e81658

Authors: Sato JR, Basilio R, Paiva FF, Garrido GJ, Bramati IE, Bado P, Tovar-Moll F, Zahn R, Moll J

Abstract. The demonstration that humans can learn to modulate their own brain activity based on feedback of neurophysiological signals opened up exciting opportunities for fundamental and applied neuroscience. Although EEG-based neurofeedback has been long employed both in experimental and clinical investigation, functional MRI (fMRI)-based neurofeedback emerged as a promising method, given its superior spatial resolution and ability to gauge deep cortical and subcortical brain regions. In combination with improved computational approaches, such as pattern recognition analysis (e.g., Support Vector Machines, SVM), fMRI neurofeedback and brain decoding represent key innovations in the field of neuromodulation and functional plasticity. Expansion in this field and its applications critically depend on the existence of freely available, integrated and user-friendly tools for the neuroimaging research community. Here, we introduce FRIEND, a graphic-oriented user-friendly interface package for fMRI neurofeedback and real-time multivoxel pattern decoding. The package integrates routines for image preprocessing in real-time, ROI-based feedback (single-ROI BOLD level and functional connectivity) and brain decoding-based feedback using SVM. FRIEND delivers an intuitive graphic interface with flexible processing pipelines involving optimized procedures embedding widely validated packages, such as FSL and libSVM. In addition, a user-defined visual neurofeedback module allows users to easily design and run fMRI neurofeedback experiments using ROI-based or multivariate classification approaches. FRIEND is open-source and free for non-commercial use. Processing tutorials and extensive documentation are available.
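To illustrate the decoding-based feedback idea without FRIEND's actual pipeline (which embeds FSL and libSVM), here is a toy multivoxel pattern decoder. A nearest-centroid classifier stands in for the SVM purely for brevity, and all names and data are hypothetical:

```python
import numpy as np

class CentroidDecoder:
    """Simplified stand-in for an SVM pattern decoder: classify a
    multivoxel activity pattern by its nearest class centroid."""

    def fit(self, patterns, labels):
        X = np.asarray(patterns, dtype=float)
        y = np.asarray(labels)
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0)
                                    for c in self.classes_])
        return self

    def predict(self, patterns):
        X = np.asarray(patterns, dtype=float)
        # Euclidean distance from each pattern to each centroid.
        dists = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :],
                               axis=2)
        return self.classes_[np.argmin(dists, axis=1)]

# Toy training set: two "mental states" with distinct voxel patterns.
train_X = [[0.0, 0.1], [0.1, 0.0], [1.0, 1.1], [1.1, 1.0]]
train_y = [0, 0, 1, 1]
decoder = CentroidDecoder().fit(train_X, train_y)
# In real-time use, each new volume's pattern would be decoded and
# the predicted state presented back to the participant as feedback.
```

The train/decode loop is the same regardless of classifier; FRIEND swaps in an SVM and adds real-time preprocessing and ROI extraction in front of it.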

Nov 16, 2013

Neurofeedback training aimed to improve focused attention and alertness in children with ADHD

Neurofeedback training aimed to improve focused attention and alertness in children with ADHD: a study of relative power of EEG rhythms using custom-made software application.

Clin EEG Neurosci. 2013 Jul;44(3):193-202

Authors: Hillard B, El-Baz AS, Sears L, Tasman A, Sokhadze EM

Abstract. Neurofeedback is a nonpharmacological treatment for attention-deficit hyperactivity disorder (ADHD). We propose that operant conditioning of the electroencephalogram (EEG) in neurofeedback training aimed to mitigate inattention and low arousal in ADHD will be accompanied by changes in the EEG bands' relative power. Patients were 18 children diagnosed with ADHD. The neurofeedback protocol ("Focus/Alertness" by Peak Achievement Trainer) has a focused attention and alertness training mode, providing one summary signal for Focus and one for Alertness; it does not allow collecting information about changes in the power of specific EEG bands (delta, theta, alpha, low and high beta, and gamma) within the 2 to 45 Hz range. Quantitative EEG analysis was therefore completed on each of twelve 25-minute-long sessions using a custom-made MatLab application to determine the relative power of each of the aforementioned EEG bands throughout each session, and from the first session to the last. Additional statistical analysis determined significant changes in relative power within sessions (from minute 1 to minute 25) and between sessions (from session 1 to session 12). The analysis covered the relative power of theta, alpha, and low and high beta, along with theta/alpha, theta/beta, theta/low-beta and theta/high-beta ratios. Secondary measures of patients' post-neurofeedback outcomes were also assessed, using an audiovisual selective attention test (IVA + Plus) and behavioral evaluation scores from the Aberrant Behavior Checklist. Analysis of the data computed in the MatLab application determined that theta/low-beta and theta/alpha ratios decreased significantly from session 1 to session 12, and from minute 1 to minute 25 within sessions. These findings regarding EEG changes resulting from brain wave self-regulation training, along with the behavioral evaluations, will help elucidate the neural mechanisms of neurofeedback aimed at improving focused attention and alertness in ADHD.
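The relative-power computation behind the custom application can be sketched in a few lines of Python rather than MatLab. The band edges below are one common choice within the paper's 2 to 45 Hz range, and the code is an illustrative reimplementation, not the study's software:

```python
import numpy as np

# Assumed band edges (Hz) within the 2-45 Hz analysis range.
BANDS = {"delta": (2, 4), "theta": (4, 8), "alpha": (8, 13),
         "low_beta": (13, 21), "high_beta": (21, 30), "gamma": (30, 45)}

def relative_band_power(eeg, fs):
    """Relative power of each EEG band, as a fraction of total
    2-45 Hz power, from a single-channel periodogram."""
    sig = np.asarray(eeg, dtype=float)
    sig = sig - sig.mean()                       # remove DC offset
    psd = np.abs(np.fft.rfft(sig)) ** 2          # periodogram
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    total = psd[(freqs >= 2) & (freqs <= 45)].sum()
    return {band: float(psd[(freqs >= lo) & (freqs < hi)].sum() / total)
            for band, (lo, hi) in BANDS.items()}
```

A theta/low-beta ratio, one of the measures that decreased over training, is then just `rel["theta"] / rel["low_beta"]` on the returned dictionary.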

Aug 07, 2013

What Color is My Arm? Changes in Skin Color of an Embodied Virtual Arm Modulates Pain Threshold

What Color is My Arm? Changes in Skin Color of an Embodied Virtual Arm Modulates Pain Threshold.

Front Hum Neurosci. 2013;7:438

Authors: Martini M, Perez-Marcos D, Sanchez-Vives MV

Abstract. It has been demonstrated that visual inputs can modulate pain. However, the influence of skin color on pain perception is unknown. Red skin is associated with inflamed, hot and more sensitive skin, while blue is associated with cyanotic, cold skin. We aimed to test whether the color of the skin would alter the heat pain threshold. To this end, we used an immersive virtual environment in which we induced embodiment of a virtual arm that was co-located with the real one and seen from a first-person perspective. Virtual reality allowed us to dynamically modify the color of the skin of the virtual arm. To test the pain threshold, increasing ramps of heat stimulation applied to the participants' arm were delivered concomitantly with the gradual intensification of different colors on the embodied avatar's arm. We found that a reddened arm significantly decreased the pain threshold compared with normal and bluish skin. This effect was specific to seeing red on the arm: seeing red in a spot outside the arm did not decrease the pain threshold. These results demonstrate an influence of skin color on pain perception. This top-down modulation of pain through visual input suggests a potential use of embodied virtual bodies for pain therapy.

Full text open access

International Conference on Physiological Computing Systems

7-9 January 2014, Lisbon, Portugal

http://www.phycs.org/

Physiological data in its different dimensions, whether bioelectrical, biomechanical, biochemical or biophysical, and collected through specialized biomedical devices, video and image capture or other sources, is opening new frontiers in the field of human-computer interaction, into what can be defined as Physiological Computing. PhyCS is the annual meeting of the physiological interaction and computing community and serves as the main international forum for engineers, computer scientists and health professionals interested in outstanding research and development that bridges the gap between physiological data handling and human-computer interaction.


Regular Paper Submission Extension: September 15, 2013
Regular Paper Authors Notification: October 23, 2013
Regular Paper Camera Ready and Registration: November 5, 2013

Jul 23, 2013

SENSUS Transcutaneous Pain Management System Approved for Use During Sleep

Via Medgadget

NeuroMetrix, out of Waltham, MA, received FDA clearance for its SENSUS Pain Management System to be used by patients during sleep. This is the first transcutaneous electrical nerve stimulation system to receive a sleep indication from the FDA for pain control.

The device is designed for use by diabetics and others with chronic pain in the legs and feet. It’s worn around one or both legs and delivers an electrical current to disrupt pain signals being sent up to the brain.


May 26, 2013

Cross-Brain Neurofeedback: Scientific Concept and Experimental Platform

Cross-Brain Neurofeedback: Scientific Concept and Experimental Platform.

PLoS One. 2013;8(5):e64590

Authors: Duan L, Liu WJ, Dai RN, Li R, Lu CM, Huang YX, Zhu CZ

Abstract. The present study described a new type of multi-person neurofeedback with the neural synchronization between two participants as the direct regulating target, termed as "cross-brain neurofeedback." As a first step to implement this concept, an experimental platform was built on the basis of functional near-infrared spectroscopy, and was validated with a two-person neurofeedback experiment. This novel concept as well as the experimental platform established a framework for investigation of the relationship between multiple participants' cross-brain neural synchronization and their social behaviors, which could provide new insight into the neural substrate of human social interactions.
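One simple way to operationalize the regulated quantity, neural synchronization between two participants, is a sliding-window correlation between their signals. The sketch below illustrates this idea; the window length and data are assumptions for illustration (the actual platform is fNIRS-based and its algorithm is not specified here):

```python
import numpy as np

def interbrain_sync(sig_a, sig_b, win):
    """Sliding-window Pearson correlation between two participants'
    time series -- a simple cross-brain synchronization index that
    could serve as the feedback signal for both participants."""
    a = np.asarray(sig_a, dtype=float)
    b = np.asarray(sig_b, dtype=float)
    return np.array([np.corrcoef(a[i:i + win], b[i:i + win])[0, 1]
                     for i in range(len(a) - win + 1)])
```

Feeding back the most recent window's value in real time would reward the pair whenever their signals move together.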

Using Music as a Signal for Biofeedback

Using Music as a Signal for Biofeedback.

Int J Psychophysiol. 2013 Apr 23;

Authors: Bergstrom I, Seinfeld S, Arroyo-Palacios J, Slater M, Sanchez-Vives MV

Abstract. Studies on the potential benefits of conveying a biofeedback stimulus using a musical signal have appeared in recent years, with the intent of harnessing the strong effects that music listening can have on subjects. While results are encouraging, the fundamental question of how combined music and biofeedback compares to the already established use of either of these elements separately has yet to be addressed. This experiment, involving young adults (N=24), compared the effectiveness of each of the following conditions at modulating participants' states of physiological arousal: A) listening to pre-recorded music, B) sonification biofeedback of the heart rate, and C) an algorithmically modulated musical feedback signal conveying the subject's heart rate. Our hypothesis was that each of the conditions (A), (B) and (C) would differ from the other two in the extent to which it enables participants to increase and decrease their state of physiological arousal, with (C) being more effective than (B), and both more than (A). Several physiological measures and qualitative responses were recorded and analyzed. Results show that musical biofeedback allowed participants to modulate their state of physiological arousal at least as well as sonification biofeedback, and much better than just listening to music, as reflected in their heart rate measurements, controlling for respiration rate. Our findings indicate that the known effects of music in modulating arousal can therefore be beneficially harnessed when designing a biofeedback protocol.
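Condition (C), an algorithmically modulated musical signal conveying heart rate, implies some mapping from physiology to musical parameters. The sketch below is one hypothetical such mapping; the heart-rate range, the tempo rule and the pitch rule are assumptions for illustration, not the authors' design:

```python
def heart_rate_to_music(hr_bpm, hr_min=50.0, hr_max=120.0):
    """Map heart rate to musical parameters: tempo follows the heart
    rate directly, and pitch rises linearly over one octave (C4-C5)
    as heart rate moves across the assumed [hr_min, hr_max] range."""
    # Normalize heart rate to [0, 1], clamping out-of-range values.
    t = min(max((hr_bpm - hr_min) / (hr_max - hr_min), 0.0), 1.0)
    tempo_bpm = hr_bpm               # music tempo tracks the heart rate
    midi_note = 60 + round(t * 12)   # MIDI 60 = C4, 72 = C5
    return tempo_bpm, midi_note
```

Any mapping of this kind makes the feedback loop audible: as arousal (and heart rate) falls, the music slows and descends, cueing the listener to continue relaxing.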
