
Dec 14, 2018

Transformative Experience Design

Over the last couple of years, my team and I have been working intensively on a new research program in Positive Technology: Transformative Experience Design.

In short, the goal of this project is to understand how virtual reality, brain-based technologies and the language of the arts can support transformative experiences, that is, emotional experiences that promote deep personal change.

About Transformative Experience Design

As noted by Miller and C'de Baca, some experiences in life can generate profound and long-lasting shifts in core beliefs and attitudes, including subjective self-transformation. These experiences have the capacity to change not only what individuals know and value, but also how they see the world.

According to Mezirow's Transformative Learning Theory, these experiences can be triggered by a "disorienting dilemma", usually related to a life crisis or major life transition (e.g., death, illness, separation, or divorce), which forces individuals to critically examine and eventually revise their core assumptions and beliefs. The outcome of a transformative experience is a significant and permanent change in the expectations – mindsets, perspectives and habits of mind – through which we filter and make sense of the world. Because of these characteristics, transformative experiences are gaining increasing attention not only in psychology and neuroscience, but also in philosophy.

From a psychological perspective, transformative change is often associated with specific experiential states, termed "self-transcendent experiences". These are transient mental states that allow individuals to experience something greater than themselves, reflect on deeper dimensions of their existence and shape lasting spiritual beliefs. They encompass several mental states, including flow, positive emotions such as awe and elevation, "peak" experiences, "mystical" experiences and mindfulness (for a review, see Yaden et al.). Although the phenomenological profile of these states can vary significantly in quality and intensity, they are characterized by a diminished sense of self and increased feelings of connectedness to other people and one's surroundings. Previous research has shown that self-transcendent experiences are important sources of positive psychological outcomes, including increased meaning in life, positive mood and life satisfaction, positive behavior change, spiritual development and pro-social attitudes.

One potentially interesting question related to self-transcendent experiences is whether, and to what extent, these mental states can be invited or elicited by means of interactive technologies. This question lies at the center of a new research program – Transformative Experience Design (TED) – which has a two-fold aim:

  • to systematically investigate the phenomenological and neuro-cognitive aspects of self-transcendent experiences, as well as their implications for individual growth and psychological wellbeing; and
  • to translate such knowledge into a tentative set of design principles for developing “e-experiences” that support meaning in life and personal growth. 

The three pillars of TED: virtual reality, arts and neurotechnologies

I have identified three possible assets that can be combined to achieve this goal:

  1. The first strategy concerns the use of advanced simulation technologies, such as virtual, augmented and mixed reality, as the medium of choice for generating controlled alterations of perceptual, motor and cognitive processes.
  2. The second asset concerns the use of the language of the arts to create emotionally compelling storytelling scenarios.
  3. The third and final element of TED concerns the use of brain-based technologies, such as brain stimulation and bio/neurofeedback, to modulate the neuro-physiological processes underlying self-transcendent mental states, using a closed-loop approach (a minimal sketch of such a loop follows this list).
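
To make the closed-loop idea concrete, here is a minimal sketch (in Python, purely illustrative; the sensor read-out and parameter names are invented and not taken from any actual TED system) of how a physiological index could be used to continuously adjust a stimulation or environment parameter toward a target state:

```python
import random

def read_state_index() -> float:
    """Stand-in for a real bio/neurofeedback read-out (e.g., EEG alpha power
    or heart-rate variability), normalized to [0, 1]. Simulated here."""
    return random.random()

def closed_loop_step(intensity: float, target: float = 0.7, gain: float = 0.1) -> float:
    """One iteration of the loop: measure the user's current state, compare it
    with the target, and nudge the stimulation/environment parameter."""
    error = target - read_state_index()
    return min(1.0, max(0.0, intensity + gain * error))

intensity = 0.5
for _ in range(60):  # e.g., one update per second for one minute
    intensity = closed_loop_step(intensity)
```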

The central assumption of TED is that the combination of these means provides a broad spectrum of transformative possibilities, which include, for example, experiencing "what it is like" to embody another self or another life form, simulating unusual neurological phenomena such as synesthesia or out-of-body experiences, and altering the perception of time and space.

The safe and controlled use of these e-experiences holds the potential to facilitate self-knowledge and self-understanding, foster creative expression, develop new skills, and help people recognize and learn the value of others.

Examples of TED research projects

Although TED is a recent research program, we are building a fast-growing community of researchers, artists and developers to shape the next generation of transformative experiences. Here is a list of recent projects and publications related to TED in different application contexts.

The Emotional Labyrinth

In this project I teamed up with Sergi Bermudez i Badia and Mónica S. Cameirão from the Madeira Interactive Technologies Institute to develop the first example of an emotionally adaptive virtual reality application for mental health. So far, virtual reality applications for wellbeing and therapy have typically been based on pre-designed objects and spaces. In this project, we suggest a different approach, in which the content of a virtual world is procedurally generated at runtime (that is, through algorithmic means) according to the user's affective responses. To demonstrate the concept, we developed a first prototype using Unity: the "Emotional Labyrinth". In this VR experience, the user walks through an endless maze whose structure and contents are automatically generated according to four basic emotional states: joy, sadness, anger and fear.

During navigation, affective states are dynamically represented through pictures, music, and animated visual metaphors chosen to represent and induce emotional states.

The underlying hypothesis is that exposing users to multimodal representations of their affective states can create a feedback loop that supports emotional self-awareness and fosters more effective emotion regulation strategies. We carried out a first study to (i) assess the effectiveness of the selected metaphors in inducing the target emotions, and (ii) identify relevant psycho-physiological markers of the emotional experience generated by the labyrinth. Results showed that the Emotional Labyrinth is overall a pleasant experience and that the proposed procedural content generation can induce distinctive psycho-physiological patterns, generally coherent with the meaning of the metaphors used in the labyrinth design. Further, the collected psycho-physiological responses – electrocardiography, respiration, electrodermal activity, and electromyography – were used to generate computational models of users' reported experience. These models will enable the future implementation of a closed-loop mechanism that adapts the Labyrinth procedurally to the user's affective state.
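
As a rough illustration of what "procedural generation keyed to emotion" can mean in practice, the sketch below (Python, with invented parameter names and values; the actual prototype was built in Unity) maps each of the four emotional states onto generation parameters for the next stretch of the maze:

```python
import random

# Illustrative mapping from emotional state to generation parameters.
# Names and values are invented for this example.
EMOTION_PARAMS = {
    "joy":     {"lighting": 0.9, "music_tempo": 120, "metaphors": ["sunrise", "balloons"]},
    "sadness": {"lighting": 0.3, "music_tempo": 60,  "metaphors": ["rain", "wilting flowers"]},
    "anger":   {"lighting": 0.6, "music_tempo": 140, "metaphors": ["storm", "cracked walls"]},
    "fear":    {"lighting": 0.2, "music_tempo": 90,  "metaphors": ["fog", "narrow corridors"]},
}

def generate_next_segment(emotion: str, length: int = 5) -> dict:
    """Procedurally generate the next stretch of the maze for the classified emotion."""
    params = EMOTION_PARAMS[emotion]
    return {
        "turns": [random.choice(["left", "right", "straight"]) for _ in range(length)],
        "lighting": params["lighting"],
        "music_tempo": params["music_tempo"],
        "metaphor": random.choice(params["metaphors"]),
    }

segment = generate_next_segment("sadness")
```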

Awe in Virtual Reality

Awe is a compelling emotional experience with philosophical roots in the domain of aesthetics and religious or spiritual experiences. Both Edmund Burke's (1759/1970) and Immanuel Kant's (1764/2007) analyses of the sublime as a compelling experience that transcends one's perception of beauty to something more profound are couched in terms that seem synonymous with the modern understanding of awe.

The contemporary psychological understanding of awe comes largely from a foundational article written by Keltner and Haidt (2003). According to their conceptualization, awe experiences encompass two key appraisals: the perception of vastness and the need to mentally attempt to accommodate this vastness into existing mental schemas.

Crucially, research has shown that experiencing awe is associated with positive transformative changes at both the psychological and physical level (e.g., Shiota et al., 2007; Schneider, 2009; Stellar et al., 2015). For example, awe can change our perspective toward even unknown others, thus increasing our generous attitude toward them (Piff et al., 2015; Prade and Saroglou, 2016) and reducing aggressive behaviors (Yang et al., 2016). More generally, awe broadens our attentional focus (Sung and Yih, 2015) and extends time perception (Rudd et al., 2012). Furthermore, this emotion protects our immune system against chronic and cardiovascular diseases and enhances our satisfaction with life (Krause and Hayward, 2015; Stellar et al., 2015).

Considering the transformative potential of awe, my doctoral student Alice Chirico and I focused on how to elicit intense feelings of this complex emotion using virtual reality. To this end, we modeled three immersive virtual environments (i.e., a forest of tall trees; a chain of mountains; and a view of Earth from deep space) designed to induce a feeling of perceptual vastness. As hypothesized, the three target environments induced significantly greater awe than a "neutral" virtual environment (a park consisting of a green clearing with very few trees and some flowers). Full details of this study are reported here.

In another study, we examined the potential of VR-induced awe to foster creativity. To this end, we exposed participants to both an awe-inducing 3D video and a neutral one in a within-subject design. After each stimulation condition, participants reported the intensity and type of perceived emotion and completed two verbal tasks of the Torrance Tests of Creative Thinking (TTCT; Torrance, 1974), a standardized test of creative performance. Results showed that awe positively affected key creativity components – fluency, flexibility, and elaboration, as measured by the TTCT subtests – compared to the neutral stimulus, suggesting that (i) awe has the potential to boost creativity, and (ii) VR is a powerful awe-inducing medium that can be used in different application contexts (e.g., educational, clinical) where this emotion can make a difference.

However, graphical 3D environments are not the only way to induce awe: in another study, we showed that 360° videos depicting vast natural scenarios are also powerful stimuli for inducing intense feelings of this complex emotion.

Immersive storytelling for psychotherapy and mental wellbeing

Growing research evidence indicates that VR can be effectively integrated into psychotherapy to treat a number of clinical conditions, including anxiety disorders, pain disorders and PTSD. In this context, VR is mostly used as a simulative tool for controlled exposure to critical or feared situations. The possibility of presenting realistic, controlled stimuli and, simultaneously, of monitoring the responses generated by the user offers a considerable advantage over real experiences.

However, the most interesting potential of VR resides in its capacity to create compelling immersive storytelling experiences. As recently noted by Brenda Wiederhold:

Virtual training simulations, documentaries, and experiences will, however, only be as effective as the emotions they spark in the viewer. To reach that point, the VR industry is faced with two obstacles: creating content that is enjoyable and engaging, and encouraging adoption of the medium among consumers. Perhaps the key to both problems is the recognition that users are not passive consumers of VR content. Rather, they bring their own thoughts, needs, and emotions to the worlds they inhabit. Successful stories challenge those conceptions, invite users to engage with the material, and recognize the power of untethering users from their physical world and throwing them into another. That isn’t just the power of VR—it’s the power of storytelling as a whole.

Thus, VR-based narratives can be used to generate an infinite number of "possible selves", by providing a person with a "subjective window of presence" into unactualized, but possible, worlds.

The emergence of immersive storytelling introduces the possibility of using VR in mental health with a different rationale from virtual reality-based exposure therapy. In this novel rationale, immersive stories, lived from a first-person perspective, give the patient the opportunity to engage emotionally with metaphoric narratives, eliciting new insights and meaning-making related to the viewer's personal world view.

To explore this new perspective, I have been collaborating with the Italian startup Become to test the potential of transformative immersive storytelling for mental health and wellbeing. An intriguing aspect of this strategy is that, in contrast with conventional virtual reality exposure therapy, which is mostly used in combination with Cognitive-Behavioral Therapy interventions, immersive storytelling scenarios can be integrated into any therapeutic model, since all kinds of psychotherapy involve some form of 'storytelling'.

In this project, we are interested in understanding, for example, whether integrating immersive stories into the therapeutic setting can enhance the efficacy of the intervention and help patients express their inner thoughts, feelings, and life experiences.

Collaborate!

Are you a researcher, a developer, or an artist interested in collaborating on TED projects? Here is how:

  1. Drop me an email at: andrea.gaggioli@unicatt.it
  2. Sign in to ResearchGate and visit the Transformative Experience Design project page
  3. Have a look at the existing projects and publications to find out which TED research line is most interesting to you.

Key references

[1] Miller, W. R., & C'de Baca, J. (2001). Quantum change: When epiphanies and sudden insights transform ordinary lives. New York: Guilford Press.

[2] Yaden, D. B., Haidt, J., Hood, R. W., Jr., Vago, D. R., & Newberg, A. B. (2017). The varieties of self-transcendent experience. Review of General Psychology, 21(2), 143-160.

[3] Gaggioli, A. (2016). Transformative Experience Design. In Human Computer Confluence. Transforming Human Experience Through Symbiotic Technologies, eds A. Gaggioli, A. Ferscha, G. Riva, S. Dunne, and I. Viaud-Delmon (Berlin: De Gruyter Open), 96–121.

Apr 06, 2017

Crowdsourcing VR research

If 2016 was a golden year for virtual reality, there is reason to believe that the coming year may be even better. According to a recent market forecast by International Data Corporation (IDC), worldwide revenues for the augmented reality and virtual reality market are projected to grow from $5.2 billion in 2016 to more than $162 billion in 2020.


With virtual reality becoming a mass product, it is crucial to understand its psychological effects on users.

Over the last decade, a growing body of research has been addressing the positive and negative implications of virtual experience for the human mind. Yet many questions still remain unanswered.

Some of these issues are concerned with the defining features of virtual experience, i.e., what it means to be “present” in a computer-simulated reality. Other questions regard the drawbacks of virtual environments, such as cybersickness, addiction and other psychological disorders caused by prolonged exposure to immersive virtual worlds.

For example, in a recent article that appeared in The Atlantic, Rebecca Searles wrote that after exploring a virtual environment, some users have reported a feeling of detachment that can last days or even weeks. This effect had already been documented by Frederick Aardema and colleagues in the journal Cyberpsychology, Behavior, and Social Networking some years ago. The team administered questionnaires to a nonclinical sample to measure dissociation, sense of presence, and immersion before and after immersion in a virtual environment. Findings showed that after exposure to virtual reality, participants reported an increase in dissociative experiences (depersonalization and derealization), including a lessened sense of presence in objective reality.

However, more research is needed to understand this phenomenon, and other aspects of virtual experience that are still to be uncovered.

Until now, most studies on virtual reality have been conducted in scientific laboratories, because of the relatively high cost of virtual reality hardware and the need for specialist expertise for system setup and maintenance.

However, the increasing diffusion of commercial virtual reality headsets and software could make it possible to move research from the laboratory to private homes. For example, researchers could create online experiments and ask people to participate using their own virtual reality equipment, possibly providing some kind of reward for their involvement.

An online collaboration platform could be developed to plan studies, create research protocols, collect and share data from participants. This open research strategy may offer several advantages. For example, the platform would offer researchers the opportunity to rapidly get input from large numbers of virtual reality participants. Furthermore, the users themselves could be involved in formulating research questions and co-create experiments with researchers.
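
As a rough sketch of what such a shared protocol and data-collection format might look like (Python, with invented field names; no existing platform is implied), consider:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrialRecord:
    participant_id: str
    condition: str          # e.g., "immersive" vs. "desktop"
    presence_rating: int    # 1-7 self-report collected after the trial
    headset_model: str

@dataclass
class StudyProtocol:
    title: str
    conditions: List[str]
    measures: List[str]
    records: List[TrialRecord] = field(default_factory=list)

    def submit(self, record: TrialRecord) -> None:
        """Participants running the study at home would push their data here;
        a real platform would validate and store it server-side."""
        self.records.append(record)

study = StudyProtocol(
    title="Presence in home VR setups",
    conditions=["immersive", "desktop"],
    measures=["presence_rating"],
)
study.submit(TrialRecord("p001", "immersive", presence_rating=6, headset_model="consumer HMD"))
```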


In the medical field, this approach has been successfully pioneered by online patient communities such as PatientsLikeMe and CureTogether. These social health sites provide real-time research platforms that allow clinical researchers and patients to partner in improving health outcomes. Other examples of internet-based citizen science projects include applications in astronomy, environmental protection, and neuroscience, to name a few (more examples can be found on Zooniverse, the world's largest citizen science web portal).

But virtual reality could extend the potential of citizen science even further. For example, virtual reality applications could be developed specifically for research purposes, e.g., virtual reality games that "manipulate" some variables of interest for researchers, or virtual reality versions of classic experimental paradigms, such as the Stroop test. It could even be possible to create virtual reality simulations of whole research laboratories, allowing participants to take part in online experiments using their avatars.
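
For instance, the trial logic of a classic Stroop paradigm is simple enough to sketch in a few lines (Python, illustrative only; the presentation of stimuli and response logging in the VR scene are left out):

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_trial(congruent: bool) -> dict:
    """One Stroop trial: a color word rendered in an ink color that either
    matches (congruent) or mismatches (incongruent) the word's meaning."""
    word = random.choice(COLORS)
    ink = word if congruent else random.choice([c for c in COLORS if c != word])
    return {"word": word, "ink": ink, "congruent": congruent}

def build_block(n_trials: int = 20) -> list:
    """Half congruent, half incongruent, in random order. In a VR version,
    each trial would be rendered as an object in the simulated scene and the
    response and reaction time logged by the headset application."""
    trials = [make_trial(i % 2 == 0) for i in range(n_trials)]
    random.shuffle(trials)
    return trials

block = build_block()
```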

Feb 22, 2017

The Potential of Virtual Reality for the Investigation of Awe

Alice Chirico, David B. Yaden, Giuseppe Riva and Andrea Gaggioli

Front. Psychol., 09 November 2016 https://doi.org/10.3389/fpsyg.2016.01766

Dipartimento di Psicologia, Università Cattolica del Sacro Cuore, Milan, Italy

The emotion of awe is characterized by the perception of vastness and a need for accommodation, which can include a positive and/or negative valence. While a number of studies have successfully manipulated this emotion, the issue of how to elicit particularly intense awe experiences in laboratory settings remains. We suggest that virtual reality (VR) is a particularly effective mood induction tool for eliciting awe. VR provides three key assets for improving awe. First, VR provides users with immersive and ecological yet controlled environments that can elicit a sense of “presence,” the subjective experience of “being there” in a simulated reality. Further, VR can be used to generate complex, vast stimuli, which can target specific theoretical facets of awe. Finally, VR allows for convenient tracking of participants’ behavior and physiological responses, allowing for more integrated assessment of emotional experience. We discussed the potential and challenges of the proposed approach with an emphasis on VR’s capacity to raise the signal of reactions to emotions such as awe in laboratory settings.

Jan 19, 2017

Facebook Study Finds Introverts Feel More Comfortable with VR Social Interaction

Via RoadToVr

A recent study by Facebook IQ, in which people completed one-on-one conversations in VR, concluded that most people respond positively, and introverts in particular feel more comfortable. Facebook IQ is a team established to assist marketers in understanding the way people communicate online and offline.

Facebook has been exploring the potential of social VR since their famous acquisition of Oculus VR in 2014. More recently, they detailed the results of their social VR avatar experiments and are planning to launch a 'social VR app' very soon. A different social experiment was recently completed by Facebook IQ, an internal team who help businesses understand communication trends and advertising effectiveness – asking 60 people to have one-on-one conversations, with half of them meeting in person and half meeting in a VR environment wearing the Oculus Rift.

Interestingly, they didn't use the VR avatars seen in Facebook's own demonstrations, nor did they use the Oculus avatars found in the Rift's menus – instead they used vTime, a popular 'sociable network' app available for Rift, Gear VR, Cardboard and Daydream. vTime uses its own full-body avatar system, complete with automatically animating hands – which makes it somewhat surprising that these would be used in such an experiment. However, it seems the main reason for choosing the software was its comfortable 'train cabin' environment – a familiar and natural place to converse with a stranger – and the focus of the experiment was on vocal communication.


Applied neuroscience company Neurons Inc was commissioned to assist with the study of cognitive and emotional responses; all participants wore high resolution electroencephalography (EEG) scanners, used to record electrical activity in the brain, and eye trackers. With half the group conducting a normal one-to-one conversation in person, and the other half engaged in vTime, Neurons Inc was able to compare the level of comfort and engagement of a VR conversation compared to a conventional one. The eye trackers helped to determine the user’s level of attention, and the EEG scanners were used to assess motivation and cognitive load, based on the level of brain activity. If the load is too low, it means the person is bored; too high and they’re stressed.
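
The bored/optimal/stressed reading of cognitive load amounts to simple thresholding of a load index; a minimal sketch (Python, with made-up thresholds, not those used by Neurons Inc) would look like this:

```python
def classify_cognitive_load(load: float, low: float = 0.35, high: float = 0.75) -> str:
    """Map a normalized cognitive-load index (0 = none, 1 = maximal) onto the
    interpretation described above: too low = bored, too high = stressed,
    in between = the 'optimal range of cognitive effort'.
    Thresholds are invented for this example."""
    if load < low:
        return "bored"
    if load > high:
        return "stressed"
    return "optimal"

print(classify_cognitive_load(0.55))  # -> "optimal"
```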

According to the report published on Facebook Insights, the participants, who had mostly never tried VR before, were within the ‘optimal range of cognitive effort’, being neither bored nor overstimulated. The cognitive load decreased over time, meaning that people naturally became more comfortable as the conversation progressed. In the interviews that followed, 93% said that they liked their virtual conversation partner, and those who were identified as more introverted responded ‘particularly positively’, being more engaged by meeting in VR than by meeting in person.


Jan 02, 2017

The Potential of Virtual Reality for the Investigation of Awe


Alice Chirico, David B. Yaden, Giuseppe Riva and Andrea Gaggioli

Front. Psychol., 09 November 2016 | https://doi.org/10.3389/fpsyg.2016.01766

The emotion of awe is characterized by the perception of vastness and a need for accommodation, which can include a positive and/or negative valence. While a number of studies have successfully manipulated this emotion, the issue of how to elicit particularly intense awe experiences in laboratory settings remains. We suggest that virtual reality (VR) is a particularly effective mood induction tool for eliciting awe. VR provides three key assets for improving awe. First, VR provides users with immersive and ecological yet controlled environments that can elicit a sense of “presence,” the subjective experience of “being there” in a simulated reality. Further, VR can be used to generate complex, vast stimuli, which can target specific theoretical facets of awe. Finally, VR allows for convenient tracking of participants’ behavior and physiological responses, allowing for more integrated assessment of emotional experience. We discussed the potential and challenges of the proposed approach with an emphasis on VR’s capacity to raise the signal of reactions to emotions such as awe in laboratory settings.

The Impact of Virtual Reality on Chronic Pain


PLoS One. 2016;11(12):e0167523

Authors: Jones T, Moore T, Choo J

Abstract. The treatment of chronic pain could benefit from additional non-opioid interventions. Virtual reality (VR) has been shown to be effective in decreasing pain for procedural or acute pain but to date there have been few studies on its use in chronic pain. The present study was an investigation of the impact of a virtual reality application for chronic pain. Thirty (30) participants with various chronic pain conditions were offered a five-minute session using a virtual reality application called Cool! Participants were asked about their pain using a 0-10 visual analog scale rating before the VR session, during the session and immediately after the session. They were also asked about immersion into the VR world and about possible side effects. Pain was reduced from pre-session to post-session by 33%. Pain was reduced from pre-session during the VR session by 60%. These changes were both statistically significant at the p < .001 level. Three participants (10%) reported no change between pre and post pain ratings. Ten participants (33%) reported complete pain relief while doing the virtual reality session. All participants (100%) reported a decrease in pain to some degree between pre-session pain and during-session pain. The virtual reality experience was found here to provide a significant amount of pain relief. A head mounted display (HMD) was used with all subjects and no discomfort was experienced. Only one participant noted any side effects. VR seems to have promise as a non-opioid treatment for chronic pain and further investigation is warranted.


Oct 15, 2016

Transforming Experience: The Potential of Augmented Reality and Virtual Reality for Enhancing Personal and Clinical Change

Front. Psychiatry, 30 September 2016 http://dx.doi.org/10.3389/fpsyt.2016.00164

Giuseppe Riva, Rosa M. Baños, Cristina Botella, Fabrizia Mantovani and Andrea Gaggioli

During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual's worldview.

Jun 21, 2016

New book on Human Computer Confluence - FREE PDF!

Two pieces of good news for Positive Technology followers.

1) Our new book on Human Computer Confluence is out!

2) It can be downloaded for free here


Human-computer confluence refers to an invisible, implicit, embodied or even implanted interaction between humans and system components. New classes of user interfaces are emerging that make use of several sensors and are able to adapt their physical properties to the current situational context of users.

A key aspect of human-computer confluence is its potential for transforming human experience in the sense of bending, breaking and blending the barriers between the real, the virtual and the augmented, to allow users to experience their body and their world in new ways. Research on Presence, Embodiment and Brain-Computer Interface is already exploring these boundaries and asking questions such as: Can we seamlessly move between the virtual and the real? Can we assimilate fundamentally new senses through confluence?

The aim of this book is to explore the boundaries and intersections of the multidisciplinary field of HCC and discuss its potential applications in different domains, including healthcare, education, training and even arts.

DOWNLOAD THE FULL BOOK HERE AS OPEN ACCESS

Please cite as follows:

Andrea Gaggioli, Alois Ferscha, Giuseppe Riva, Stephen Dunne, Isabell Viaud-Delmon (2016). Human computer confluence: transforming human experience through symbiotic technologies. Warsaw: De Gruyter. ISBN 9783110471120.


May 26, 2016

From User Experience (UX) to Transformative User Experience (T-UX)

In 1999, Joseph Pine and James Gilmore wrote a seminal book titled “The Experience Economy” (Harvard Business School Press, Boston, MA) that theorized the shift from a service-based economy to an experience-based economy.

According to these authors, in the new experience economy the goal of a purchase is no longer to own a product (be it a good or a service), but to use it in order to enjoy a compelling experience. An experience, thus, is a whole new type of offering: in contrast to commodities, goods and services, it is designed to be as personal and memorable as possible. Just as in a theatrical performance, companies stage meaningful events to engage customers in a memorable and personal way, offering activities that provide engaging and rewarding experiences.

Indeed, looking back at the past ten years, one can see that the concept of experience has become more central to several fields, including tourism, architecture, and – perhaps most relevant for this column – human-computer interaction, with the rise of "User Experience" (UX).

The concept of UX was introduced by Donald Norman in a 1995 article published in the CHI proceedings (D. Norman, J. Miller, A. Henderson: What You See, Some of What's in the Future, And How We Go About Doing It: HI at Apple Computer. Proceedings of CHI 1995, Denver, Colorado, USA). Norman argued that focusing exclusively on usability attributes (i.e., ease of use, efficacy, effectiveness) when designing an interactive product is not enough; one should take into account the whole experience of the user with the system, including the user's emotional and contextual needs. Since then, the UX concept has assumed increasing importance in HCI. As McCarthy and Wright emphasized in their book "Technology as Experience" (MIT Press, 2004):

“In order to do justice to the wide range of influences that technology has in our lives, we should try to interpret the relationship between people and technology in terms of the felt life and the felt or emotional quality of action and interaction.” (p. 12).

However, according to Pine and Gilmore, experience may not be the last step of what they call the "Progression of Economic Value". They speculated further into the future, identifying the "Transformation Economy" as the likely next phase. In their view, while experiences are essentially memorable events that stimulate the sensorial and emotional levels, transformations go much further: they are the result of a series of experiences staged by companies to guide customers in learning, taking action and eventually achieving their aspirations and goals.

In Pine and Gilmore's terms, an aspirant is the individual who seeks guidance for personal change (e.g., a better figure, a new career, and so forth), while the provider of this change (a dietitian, a university) is an elicitor. The elicitor guides the aspirant through a series of experiences designed with specific purposes and goals. According to Pine and Gilmore, the main difference between an experience and a transformation is that the latter occurs when an experience is customized:

“When you customize an experience to make it just right for an individual - providing exactly what he needs right now - you cannot help changing that individual. When you customize an experience, you automatically turn it into a transformation, which companies create on top of experiences (recall that phrase: “a life-transforming experience”), just as they create experiences on top of services and so forth” (p. 244).

A further key difference between experiences and transformations concerns their effects: because an experience is inherently personal, no two people can have the same one. Likewise, no individual can undergo the same transformation twice: the second time it’s attempted, the individual would no longer be the same person (p. 254-255).

But what will be the impact of this upcoming, “transformation economy” on how people relate with technology? If in the experience economy the buzzword is “User Experience”, in the next stage the new buzzword might be “User Transformation”.

Indeed, we can see some initial signs of this shift. For example, Fitbit and similar self-tracking gadgets are starting to offer personalized advice to foster enduring changes in users' lifestyles; another example comes from the fields of ambient intelligence and domotics, where there is an increasing focus on designing systems that are able to learn from the user's behaviour (e.g., by tracking the movements of an elderly person in their home) to provide context-aware adaptive services (e.g., sending an alert when the user is at risk of falling).

But the most important ICT step towards the transformation economy will likely come with the introduction of next-generation immersive virtual reality systems. Since these new systems are based on mobile devices (an example is the recent partnership between Oculus and Samsung), they are able to deliver VR experiences that incorporate information on the external and internal context of the user (e.g., time, location, temperature, mood) by using the sensors built into the mobile phone.

By personalizing the immersive experience with context-based information, it might be possible to induce higher levels of involvement and presence in the virtual environment. In the case of cyber-therapeutic applications, this could translate into the development of more effective, transformative virtual healing experiences.
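
To illustrate the kind of context-based personalization meant here, a minimal sketch (Python; the context fields, rules and preset names are invented and not part of any existing product) might select a scene preset from the phone's sensed context:

```python
from datetime import datetime

def pick_environment(context: dict) -> str:
    """Choose a VR scene preset from phone-sensed context (time, location,
    self-reported mood). The rules and preset names are purely illustrative."""
    hour = context.get("hour", datetime.now().hour)
    if context.get("mood") == "stressed":
        return "calm_beach_at_sunset" if hour >= 18 else "quiet_forest"
    if context.get("location") == "outdoors":
        return "mountain_vista"
    return "default_studio"

scene = pick_environment({"mood": "stressed", "hour": 21, "location": "home"})
```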

Furthermore, the emergence of "symbiotic technologies", such as neuroprosthetic devices and neuro-biofeedback, is enabling a direct connection between the computer and the brain. Increasingly, these neural interfaces are moving out of the biomedical domain to become consumer products. But unlike existing digital experiential products, symbiotic technologies have the potential to transform basic human experiences far more radically.

Brain-computer interfaces, immersive virtual reality and augmented reality, and their various combinations will allow users to create "personalized alterations" of experience. Just as nowadays we can download and install a number of "plug-ins" – apps that personalize our experience with hardware and software products – so very soon we may download and install new "extensions of the self", or "experiential plug-ins", which will provide us with a number of options for altering, replacing or simulating our sensorial, emotional and cognitive processes.

Such mediated recombinations of human experience will result from the application of existing neuro-technologies in completely new domains. Although virtual reality and brain-computer interfaces were originally developed for applications in specific domains (e.g., military simulations, neurorehabilitation), today the use of these technologies has been extended to other fields of application, ranging from entertainment to education.

In the field of biology, Stephen Jay Gould and Elizabeth Vrba (Paleobiology, 8, 4-15, 1982) defined "exaptation" as the process by which a feature acquires a function for which it was not originally shaped by natural selection. Likewise, the exaptation of neurotechnologies to the digital consumer market may lead to the rise of a novel "neuro-experience economy", in which technology-mediated transformation of experience is the main product.

Just as a Genetically-Modified Organism (GMO) is an organism whose genetic material has been altered using genetic-engineering techniques, so we could define a Technologically-Modified Experience (TME) as a re-engineered experience resulting from the artificial manipulation of the neurobiological bases of sensorial, affective, and cognitive processes.

Clearly, the emergence of the transformative neuro-experience economy will not happen in weeks or months, but rather in years. It will take some time before people find brain-computer devices on the shelves of electronics stores: most of these tools are still in the pre-commercial phase at best, and some are found only in laboratories.

Nevertheless, the mere possibility that such a scenario will sooner or later come to pass raises important questions that should be addressed before symbiotic technologies enter our lives: does the technological alteration of human experience threaten the autonomy of individuals, or the authenticity of their lives? How can we help individuals decide which transformations are good or bad for them?

Addressing these important issues will require the collaboration of many disciplines, including philosophy, computer ethics and, of course, cyberpsychology.

May 24, 2016

Virtual reality painting tool

Feb 14, 2016

3D addiction

As many analysts predict, next-generation virtual reality technology promises to change our lives.

From manufacturing to medicine, from entertainment to learning, there is no economic or cultural sector that is immune from the VR revolution.

According to a recent report from Digi-Capital, the augmented/virtual reality market could hit $150B revenue by 2020, with augmented reality projected to reach $120B and virtual reality $30B.

Still, there are a lot of unanswered questions concerning the potential negative effects of virtual reality on the human brain. For example, we know very little about the consequences of prolonged immersion in a virtual world.


Most scientific virtual reality experiments carried out so far have lasted for short time intervals (typically, less than an hour). However, we don't know what the potential side effects of being "immersed" in a 12-hour virtual marathon might be. Looking at today's headsets, it might seem unlikely that people will spend so much time wearing them, because they are still ergonomically poor. Furthermore, most virtual reality content available on the market does not exploit the full narrative potential of the medium, which can go well beyond a "virtual Manhattan skyride".

But as soon as the usability problems are fixed and 3D content becomes compelling and engaging enough, the risk of "3D addiction" may be around the corner. Most importantly, the risks of virtual reality exposure are not limited to adults; they especially endanger adolescents' and children's health. Given the widespread use of smartphones among kids, it is likely that virtual reality games will become very popular within this segment.

Given that Zuckerberg regards virtual reality as the next big thing after video for Facebook (in March 2014 his corporation bought Oculus VR in a deal worth $2 billion), perhaps he might also consider investing some of these resources in supporting research on the health risks potentially associated with this amazing and life-changing technology.



Dec 26, 2015

Manus VR Experiment with Valve’s Lighthouse to Track VR Gloves

Via Road to VR

The Manus VR team demonstrate their latest experiment, utilising Valve’s laser-based Lighthouse system to track their in-development VR glove.

Manus VR (previously Manus Machina), the company from Eindhoven, Netherlands dedicated to building VR input devices, seems to have gained momentum in 2015. They secured their first round of seed funding and have shipped early units to developers, and now their R&D efforts have extended to Valve's laser-based tracking solution Lighthouse, as used in the forthcoming HTC Vive headset and SteamVR controllers.

The Manus VR team seem to have cannibalised a set of SteamVR controllers, leveraging the positional tracking of wrist-mounted units to augment Manus VR's existing glove-mounted IMUs. Last time I tried the system, the finger joint detection was pretty good, but the Samsung Gear VR camera-based positional tracking understandably struggled with latency and accuracy. The experience on show seems immeasurably better, perhaps unsurprisingly.


Oct 12, 2014

New Scientist on new virtual reality headset Oculus Rift

From New Scientist

The latest prototype of virtual reality headset Oculus Rift allows you to step right into the movies you watch and the games you play

An old man sits across the fire from me, telling a story. An inky dome of star-flecked sky arcs overhead as his words mingle with the crackling of the flames. I am entranced.

This isn't really happening, but it feels as if it is. This is a program called Storyteller – Fireside Tales that runs on the latest version of the Oculus Rift headset, unveiled last month. The audiobook software harnesses the headset's virtual reality capabilities to deepen immersion in the story. (See also "Plot bots")

Fireside Tales is just one example of a new kind of entertainment that delivers convincing true-to-life experiences. Soon films will get a similar treatment.

Movie company 8i, based in Wellington, New Zealand, plans to make films specifically for Oculus Rift. These will be more immersive than just mimicking a real screen in virtual reality because viewers will be able to step inside and explore the movie while they are watching it.

"We are able to synthesise photorealistic views in real-time from positions and directions that were not directly captured," says Eugene d'Eon, chief scientist at 8i. "[Viewers] can not only look around a recorded experience, but also walk or fly. You can re-watch something you love from many different perspectives."

The latest generation of games for Oculus Rift are more innovative, too. Black Hat Oculus is a two-player, cooperative game designed by Mark Sullivan and Adalberto Garza, both graduates of MIT's Game Lab. One headset is for the spy, sneaking through guarded buildings on missions where detection means death. The other player is the overseer, with a God-like view of the world, warning the spy of hidden traps, guards and passageways.

Deep immersion is now possible because the latest Oculus Rift prototype – known as Crescent Bay – finally delivers full positional tracking. This means the images that you see in the headset move in sync with your own movements.
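
In rendering terms, "moving in sync with your own movements" simply means feeding the tracked head pose into the view transform each frame. A minimal sketch of that step follows (Python/NumPy, illustrative only and unrelated to the actual Oculus SDK):

```python
import numpy as np

def view_matrix(head_position, head_rotation):
    """Build a 4x4 view matrix from a tracked head pose, so the rendered scene
    shifts in sync with the user's own movements (the essence of positional
    tracking). head_rotation is a 3x3 rotation matrix; position is in meters."""
    R = np.asarray(head_rotation, dtype=float)
    t = np.asarray(head_position, dtype=float).reshape(3)
    view = np.eye(4)
    view[:3, :3] = R.T          # inverse rotation: world -> head space
    view[:3, 3] = -R.T @ t      # inverse translation
    return view

# A 5 cm head movement to the right shifts the rendered world 5 cm to the left.
V = view_matrix([0.05, 0.0, 0.0], np.eye(3))
```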

This is the key to unlocking the potential of virtual reality, says Hannes Kaufmann at the Technical University of Vienna in Austria. The headset's high-definition display and wraparound field of view are nice additions, he says, but they aren't essential.

The next step, says Kaufmann, is to allow people to see their own virtual limbs, not just empty space, in the places where their brain expects them to be. That's why Beijing-based motion capture company Perception, which raised more than $500,000 on Kickstarter in September, is working on a full-body suit that provides body pose estimation and gives haptic feedback – a sense of touch – to the wearer. Software like Fireside Tales will then be able to take your body position into account.

In the future, humans will be able to direct live virtual experiences themselves, says Kaufmann. "Imagine you're meeting an alien in the virtual reality, and you want to shake hands. You could have a real person go in there and shake hands with you, but for you only the alien is present."

Oculus, which was bought by Facebook in July for $2 billion, has not yet announced when the headset will be available to buy.

This article appeared in print under the headline "Deep and meaningful"

Oct 06, 2014

Is the metaverse still alive?

In the last decade, online virtual worlds such as Second Life have become enormously popular. Since their appearance on the technology landscape, many analysts regarded shared 3D virtual spaces as a disruptive innovation that would render the Web itself obsolete.

These high expectations attracted significant investments from large corporations such as IBM, which started building their own virtual spaces and offices in the metaverse. Then, when it became clear that these promises would not be kept, disillusionment set in and virtual worlds started losing their edge. However, this is not a new phenomenon in high tech; it happens over and over again.

The US consulting company Gartner has developed a very popular model to describe this effect, called the “Hype Cycle”. The Hype Cycle provides a graphic representation of the maturity and adoption of technologies and applications.

It consists of five phases, which show how emerging technologies will evolve.

In the first, "technology trigger" phase, a new technology is launched and attracts the interest of the media. This is followed by the "peak of inflated expectations", characterized by a proliferation of positive articles and comments, which generate unrealistic expectations among users and stakeholders.

In the next, "trough of disillusionment" phase, these exaggerated expectations are not fulfilled, resulting in a growing number of negative comments, generally followed by progressive indifference.

In the "slope of enlightenment", the technology's potential for further applications becomes more broadly understood and an increasing number of companies start using it.

In the final, "plateau of productivity" stage, the emerging technology establishes itself as an effective tool and mainstream adoption takes off.

So what stage in the hype cycle are virtual worlds now?

After the 2006-2007 peak, metaverses entered the downward phase of the hype cycle, progressively losing media interest, investments and users. Many high-tech analysts still consider this decline an irreversible process.

However, the negative outlook that drove shared virtual worlds into the trough of disillusionment may soon be reversed, thanks to the new interest in virtual reality raised by the Oculus Rift (recently acquired by Facebook for $2 billion), Sony's Project Morpheus and similar immersive displays, which are still at the takeoff stage of the hype cycle.

Oculus Rift's chief scientist Michael Abrash makes no secret of the fact that his main ambition has always been to build a metaverse like the one described in Neal Stephenson's (1992) cyberpunk novel Snow Crash. As he writes on the Oculus blog:

"Sometime in 1993 or 1994, I read Snow Crash and for the first time thought something like the Metaverse might be possible in my lifetime."

Furthermore, despite the negative comments and disappointed expectations, the metaverse keeps attracting new users: for its 10th anniversary on June 23rd, 2013, an infographic reported that Second Life had over 1 million monthly visitors worldwide, more than 400,000 new accounts per month, and 36 million registered users.

So will Michael Abrash’s metaverse dream come true? Even if one looks into the crystal ball of the hype cycle, the answer is not easily found.

Sep 25, 2014

With Cyberith's Virtualizer, you can run around wearing an Oculus Rift

Sep 21, 2014

First hands-on: Crescent Bay demo

I just tested the Oculus Crescent Bay prototype at the Oculus Connect event in LA.

I still can't close my mouth.

The demo lasted about 10 minutes, during which several scenes were presented. The resolution and framerate are astounding, and you can turn completely around. This is the first time in my life I can really say I was there.

I believe this is really the beginning of a new era for VR, and I am sure I won't sleep tonight thinking about the infinite possibilities and applications of this technology. And I don't think I am exaggerating – if anything, I am underestimating.


Aug 05, 2014

Life's a beach at work for Japanese company

A Japanese company has recreated a tropical beach in the very reception area they also use as their employee meeting space and staff lounge.

Aug 03, 2014

Modulation of functional network with real-time fMRI feedback training of right premotor cortex activity

Modulation of functional network with real-time fMRI feedback training of right premotor cortex activity.

Neuropsychologia. 2014 Jul 21;

Authors: Hui M, Zhang H, Ge R, Yao L, Long Z

Abstract. Although the neurofeedback of real-time fMRI can reportedly enable people to gain control of the activity in the premotor cortex (PMA) during motor imagery, it is unclear how the neurofeedback training of PMA affect the motor network engaged in the motor execution (ME) and imagery (MI) task. In this study, we investigated the changes in the motor network engaged in both ME and MI task induced by real-time neurofeedback training of the right PMA. The neurofeedback training induced changes in activity of the ME-related motor network as well as alterations in the functional connectivity of both the ME-related and MI-related motor networks. Especially, the percent signal change of the right PMA in the last training run was found to be significantly correlated with the connectivity between the right PMA and the left posterior parietal lobe (PPL) during the pre-training MI run, post-training MI run and the last training run. Moreover, the increase in the tapping frequency was significantly correlated with the increase of connectivity between the right cerebellum and the primary motor area / primary sensory area (M1/S1) of the ME-related motor network after neurofeedback training. These findings show the importance of the connectivity between the right PMA and left PPL of the MI network for the up-regulation of the right PMA as well as the critical role of connectivity between the right cerebellum and M1/S1 of the ME network in improving the behavioral performance.

Fly like a Birdly

Birdly is a full-body, fully immersive virtual reality flight simulator developed at the Zurich University of the Arts (ZHdK). With Birdly, you can embody an avian creature, the Red Kite, visualized through the Oculus Rift, as it soars over a 3D virtual San Francisco, heightened by sonic, olfactory, and wind feedback.

Jul 30, 2014

A virtual rehabilitation program after amputation: a phenomenological exploration

A virtual rehabilitation program after amputation: a phenomenological exploration.

Disabil Rehabil Assist Technol. 2013 Nov;8(6):511-5

Authors: Moraal M, Slatman J, Pieters T, Mert A, Widdershoven G

Abstract. PURPOSE: This study provides an analysis of bodily experiences of a man with a lower leg amputation who used a virtual rehabilitation program. METHOD: The study reports data from semi-structured interviews with a 32-year veteran who used a virtual environment during rehabilitation. The interviews were analyzed using interpretative phenomenological analysis (IPA). RESULTS: During this rehabilitation program, he initially experienced his body as an object, which he had to handle carefully. As he went along with the training sessions, however, he was more stimulated to react directly without being aware of the body's position. In order to allow himself to react spontaneously, he needed to gain trust in the device. This was fostered by his narrative, in which he stressed how the device mechanically interacts with his movements. CONCLUSION: The use of a virtual environment facilitated the process of re-inserting one's body into the flow of one's experience in two opposite, but complementary ways: (1) it invited this person to move automatically without taking into account his body; (2) it invited him to take an instrumental or rational view on his body. Both processes fostered his trust in the device, and ultimately in his body. IMPLICATIONS FOR REHABILITATION: Providing (more) technological explanation of the technological device (i.e. the virtual environment), may facilitate a rehabilitation process. Providing (more) explicit technological feedback, during training sessions in a virtual environment, may facilitate a rehabilitation process.
