
Sep 21, 2009

Nokia Mixed Reality gadgets

Cool video by Nokia's Future Tech lab on the next generation of mixed reality gadgets: gaze-tracking eyewear that lets you browse and select with your eyes, 3-D audio to find and hear spatialized sounds, and more.

check it out:


Aug 07, 2007

Experimental evidence for mixed reality states

Via Science Daily

I was fascinated by this physics experiment, the first attempt to create a linked virtual/real system. Vadas Gintautas and Alfred Hübler of the Center for Complex Systems Research at the University of Illinois achieved this result by coupling a real-world pendulum with a virtual version that moved under time-tested equations of motion. In their "mixed reality" system, the two pendulums swing as one. To get the two pendulums to communicate, the physicists fed data about the real pendulum to the virtual one, and fed information from the virtual pendulum to a motor that drives the motion of the real pendulum.

cradle pendulums

Mixed reality can occur only when the two systems are sufficiently similar, but a system having unknown parameters could be coupled to a virtual system whose parameters are set by the experimenters. The unknown variables in the real system could then be determined by adjusting the virtual system until the two systems shift from dual reality to mixed reality, enabling good estimates for the values of the unknown parameters.
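This parameter-estimation idea can be illustrated with a toy simulation. The sketch below is an illustration only, not the authors' actual apparatus: it couples two small-angle pendulums through a dissipative velocity term and sweeps the virtual pendulum's length. The angle difference vanishes only when the virtual length matches the "real" one, so the location of the minimum recovers the unknown parameter. All names and parameter values here are invented for the example.

```python
import math

def rms_mismatch(virtual_length, coupling=1.0, steps=20000, dt=0.001):
    """RMS angle difference between a 'real' pendulum (length 1.0 m,
    treated as unknown to the experimenter) and a virtual twin whose
    length is a trial value set by the experimenter."""
    g, real_length, damping = 9.81, 1.0, 0.05
    # Both pendulums start in the same state, so any divergence is due
    # to parameter mismatch alone, not to initial conditions.
    th_r, om_r = 0.3, 0.0   # real pendulum: angle (rad), angular velocity
    th_v, om_v = 0.3, 0.0   # virtual pendulum
    total = 0.0
    for _ in range(steps):
        # Small-angle dynamics plus a dissipative coupling term that
        # nudges each pendulum's velocity toward the other's.
        a_r = -(g / real_length) * th_r - damping * om_r + coupling * (om_v - om_r)
        a_v = -(g / virtual_length) * th_v - damping * om_v + coupling * (om_r - om_v)
        om_r += a_r * dt; th_r += om_r * dt   # semi-implicit Euler step
        om_v += a_v * dt; th_v += om_v * dt
        total += (th_r - th_v) ** 2
    return math.sqrt(total / steps)

# Sweep the virtual length: the difference collapses (a mixed reality
# state) only near the matching length, so the minimum of the sweep
# estimates the unknown real length.
trial_lengths = [0.5 + 0.1 * i for i in range(11)]
estimate = min(trial_lengths, key=rms_mismatch)
```

With matched parameters the two trajectories stay locked together, while any length mismatch detunes the frequencies and keeps the pendulums in an uncorrelated dual reality state, which is what makes the sweep usable as a measurement.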

Here is the study abstract: 


Experimental evidence for mixed reality states in an interreality system.

Phys Rev E Stat Nonlin Soft Matter Phys. 2007 May;75(5-2):057201

Authors: Gintautas V, Hübler AW

We present experimental data on the limiting behavior of an interreality system comprising a virtual horizontally driven pendulum coupled to its real-world counterpart, where the interaction time scale is much shorter than the time scale of the dynamical system. We present experimental evidence that, if the physical parameters of the simplified virtual system match those of the real system within a certain tolerance, there is a transition from an uncorrelated dual reality state to a mixed reality state of the system in which the motion of the two pendula is highly correlated. The region in parameter space for stable solutions has an Arnold tongue structure for both the experimental data and a numerical simulation. As virtual systems better approximate real ones, even weak coupling in other interreality systems may produce sudden changes to mixed reality states.


Jul 11, 2007

Gadgets may help merge virtual reality with real life

reBlogged from networked performance

Linden Lab, the company behind Second Life, hopes to introduce hand-held and wearable systems that act as gateways between the real and virtual worlds. Linden Lab and other virtual-world companies are also developing versions that run on existing mobile phones.

Researchers at a recent virtual worlds conference at MIT said that special eyewear, display "badges," and speakers worn about the neck will allow us to live more fully through our avatars - those idealized versions of ourselves that typically boast better proportions than the saggy originals.


Read full article


Mar 16, 2007

Egocentric depth judgments in optical, see-through augmented reality

Egocentric depth judgments in optical, see-through augmented reality.

IEEE Trans Vis Comput Graph. 2007 May-Jun;13(3):429-42

Authors: Swan II JE, Jones A, Kolstad E, Livingston MA, Smallman HS

Abstract: A fundamental problem in optical, see-through augmented reality (AR) is characterizing how it affects the perception of spatial layout and depth. This problem is important because AR system developers need both to place graphics in arbitrary spatial relationships with real-world objects, and to know that users will perceive them in the same relationships. Furthermore, AR makes possible enhanced perceptual techniques that have no real-world equivalent, such as x-ray vision, where AR users are supposed to perceive graphics as being located behind opaque surfaces. This paper reviews and discusses protocols for measuring egocentric depth judgments in both virtual and augmented environments, and discusses the well-known problem of depth underestimation in virtual environments. It then describes two experiments that measured egocentric depth judgments in AR. Experiment I used a perceptual matching protocol to measure AR depth judgments at medium- and far-field distances of 5 to 45 meters. The experiment studied the effects of upper versus lower visual field location, the x-ray vision condition, and practice on the task. The experimental findings include evidence for a switch in bias, from underestimating to overestimating the distance of AR-presented graphics, at ~23 meters, as well as a quantification of how much more difficult the x-ray vision condition makes the task. Experiment II used blind walking and verbal report protocols to measure AR depth judgments at distances of 3 to 7 meters. The experiment examined real-world objects, real-world objects seen through the AR display, virtual objects, and combined real and virtual objects. The results give evidence that the egocentric depth of AR objects is underestimated at these distances, but to a lesser degree than has previously been found for most virtual reality environments.
The results are consistent with previous studies that have implicated a restricted field-of-view, combined with an inability for observers to scan the ground plane in a near-to-far direction, as explanations for the observed depth underestimation.

VR and museums: call for papers

Via Networked Performance




Deadline: Friday April 27, 2007 :: Contributions are welcomed for a new book addressing the construction and interpretation of virtual artefacts within virtual world museums and within physical museum spaces. Particular emphasis is placed on theories of spatiality and strategies of interpretation.

The editors seek papers that intervene in critical discourses surrounding virtual reality and virtual artefacts, to explore the rapidly changing temporal, spatial and theoretical boundaries of contemporary museum display practice. We are especially interested in spatiality as it is employed in the construction of virtual artefacts, as well as the roles these spaces enact as signifiers of historical narrative and sites of social interaction.

We are also interested in the relationship between real-world museums and virtual world museums, with a view to interrogating the construction of meaning within, across and between both. We welcome original scholarly contributions on the topic of new cultural practices and communities related to virtual reality in the context of museum display practice. Papers might address, but are in no way limited to, the following:

* Authenticity and artificiality
* Exploration and discovery
* Physical vs virtual
* Representation/interpretation of virtual reality artefacts - as 3D spaces on screen or in a physical gallery
* Museum visiting in virtual space
* Representation of physical museum spaces in virtual worlds and their relationship to cultural definitions of museum spaces.

Please send a proposal of 500-750 words and a contributor's bio by Friday April 27, 2007. Authors will be notified by Thursday May 31, 2007. Final drafts of papers are due by Monday October 1, 2007.

Please send your proposal to:

Tara Chittenden
Room 201
Strategic Research Unit
113 Chancery Lane
London WC2A 1PL

Or via email: tara.chittenden[at]lawsociety.org.uk

Mediamatic workshop

Via Networked Performance

Mediamatic is organizing a new workshop--Hybrid World Lab--in which participants develop prototypes for hybrid world media applications. Where the virtual world and the physical world used to be quite separate realms of reality, they are quickly becoming two faces of the same hybrid coin. This workshop investigates the increasingly intimate fusion of digital and physical space from the perspective of a media maker.

The workshop is an intensive process in which participants explore the possibilities of the physical world as an interface to online media: location-based media, everyday objects as media interfaces, urban screens, and cultural applications of RFID technology. Every morning, lectures bring in new perspectives, project presentations, and introductions to the hands-on workshop tools; every afternoon, participants work on their own workshop projects. Over five workshop days, each participant will develop a prototype of a hybrid world media project, assisted by outstanding international trainers, lecturers, and technical assistants. The workshop closes with a public presentation in which the issues are discussed and the results are shown.

Topics: Some of the topics that will be investigated in this workshop are:

* Cultural applications and impact of RFID technology; the internet of things
* Using RFID in combination with other kinds of sensors
* Ubiquitous computing (ubicomp) and ambient intelligence: services and applications that use chips embedded in household appliances and in public space
* Locative media tools: car navigation systems, GPS tools, location-sensitive mobile phones
* The web as an interface to the physical world: geotagging and mashups with Google Maps & Google Earth
* Games in hybrid space