
Feb 04, 2010

Yet another nice video about AR


Augmented (hyper)Reality

Via Leandeer

This great video by Keiichi Matsuda shows how augmented reality "may recontextualise the functions of consumerism and architecture, and change in the way in which we operate within it". The scenario is also interesting because it suggests how AR may be (ab)used by commercial companies. On the other hand, it is difficult to imagine how AR could go mainstream without them... of course any suggestion is welcome.

Augmented (hyper)Reality: Domestic Robocop from Keiichi Matsuda on Vimeo.

Sep 28, 2009

Bionic Eye - Augmented Reality on the iPhone

Bionic Eye is the first augmented reality application developed for the iPhone 3GS. The brainchild of Dutch start-up Layar, Bionic Eye enables you to visualize Points of Interest (POIs) located in your nearby environment in the US.

POI databases include restaurants, WiFi hotspots, subway stations (New York Subway, Washington Metro, Chicago 'L' Rapid Transit), etc. Over 100,000 POIs are already included in this application. Only elements located within 1 km (0.62 miles) are displayed on the screen.
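
Under the hood, this kind of range filtering is simple geometry: compute the great-circle distance from the phone's GPS fix to each POI and keep only those within 1 km. A minimal Python sketch of that idea, with made-up POI records (an illustration of the technique, not the app's actual code):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_pois(user_lat, user_lon, pois, max_km=1.0):
    """Keep only the POIs within max_km (1 km is roughly 0.62 miles)."""
    return [p for p in pois
            if haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= max_km]

# Hypothetical POI records, for illustration only
pois = [{"name": "Times Sq subway", "lat": 40.7553, "lon": -73.9870},
        {"name": "Statue of Liberty", "lat": 40.6892, "lon": -74.0445}]
print(nearby_pois(40.7580, -73.9855, pois))  # only the subway entry is within 1 km
```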

Download link

 

 

Sep 21, 2009

Nokia Mixed Reality gadgets

Cool video by Nokia's Future Tech lab on the next generation of Mixed Reality gadgets: gaze-tracking eyewear that lets you browse and select with your eyes, 3D audio to find and hear spatialized sounds, and more.

Check it out:

 

Jun 05, 2009

Digital hologram of smart grid technology

General Electric has a great mini-site up showcasing their newest energy services and smarter power management tools.

But the most intriguing part of the site is the augmented reality applications that you can play with using your computer’s webcam.

You print out a marker page that the webcam “sees”, and GE’s augmented reality program builds a virtual hologram on top of it.
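
For the curious, the underlying technique is classic marker tracking. Here is a minimal sketch of the same idea using OpenCV's ArUco module (requires opencv-contrib-python and follows the pre-4.7 API); it is not GE's code, it just shows how a printed pattern is found in the webcam video so an overlay can be drawn where a 3D model would go:

```python
import cv2

# Marker dictionary: the printed sheet would carry one of these patterns.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)  # pre-4.7 API
    if ids is not None:
        # Where a full AR app would render a 3D model, just outline the
        # detected marker so the tracking is visible.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("marker AR sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```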

Check out the demo video and then try the AR apps here

 

Apr 14, 2009

Movable screen

Cool video of an augmented reality display in use at Allard Pierson Museum in Amsterdam. Augmented reality has become reality...

 

Oct 27, 2007

LevelHead

Re-blogged from Networked Performance

 


 

levelHead is an interactive game that uses a cube, a webcam, and pattern recognition. When the cube is rotated or tilted in front of the camera the user will be able to see ‘inside’ the cube and guide a small avatar through six different rooms.

Pattern recognition has already been used in several other projects, but this is a new way of using it, and a new way of thinking of the technology. The idea behind the game itself is rather simple. When the cube is tilted the avatar moves in the corresponding direction. The goal of the game is to guide him through a maze of rooms connected by doors, and lead him to the outside world.
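
The tilt-to-movement mapping at the heart of that description can be sketched in a few lines of Python. The threshold and speed values are my own assumptions for illustration, not Julian Oliver's implementation:

```python
def avatar_step(roll_deg, pitch_deg, threshold=15.0, speed=0.05):
    """
    Map the tracked cube's tilt to a movement vector for the avatar.
    Tilting past the threshold moves the avatar along the floor plane;
    a level cube leaves it standing still.
    """
    dx = dy = 0.0
    if pitch_deg > threshold:      # tilt forward -> walk "into" the room
        dy += speed
    elif pitch_deg < -threshold:   # tilt backward
        dy -= speed
    if roll_deg > threshold:       # tilt right
        dx += speed
    elif roll_deg < -threshold:    # tilt left
        dx -= speed
    return dx, dy

# e.g. cube tilted 20 degrees forward: the avatar walks forward a little each frame
print(avatar_step(roll_deg=0.0, pitch_deg=20.0))  # (0.0, 0.05)
```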

According to the creator, Julian Oliver, the game is currently in development, but it will be released as open source soon.

Check out the explanatory video.

 

Jul 11, 2007

Gadgets may help merge virtual reality with real life

reBlogged from networked performance

Linden Lab, the company behind Second Life, hopes to introduce hand-held and wearable systems that act as gateways between the real and virtual worlds. Linden Lab and other virtual-world companies are also developing versions that run on existing mobile phones.

Researchers at a recent virtual worlds conference at MIT said that special eyewear, display "badges," and speakers worn about the neck will allow us to live more fully through our avatars - those idealized versions of ourselves that typically boast better proportions than the saggy originals.

 

Read full article

 

Mar 16, 2007

Mediamatic workshop

Via Networked Performance

 


Mediamatic is organizing a new workshop, Hybrid World Lab, in which participants develop prototypes for hybrid-world media applications. Where the virtual world and the physical world used to be quite separate realms of reality, they are quickly becoming two faces of the same hybrid coin. This workshop investigates the increasingly intimate fusion of digital and physical space from the perspective of a media maker.

The workshop is an intense process in which the participants explore the possibilities of the physical world as an interface to online media: location-based media, everyday objects as media interfaces, urban screens, and cultural applications of RFID technology. Every morning, lectures and lessons bring in new perspectives, project presentations, and introductions to the hands-on workshop tools. Every afternoon, the participants work on their own workshop projects. In five workshop days, every participant will develop a prototype of a hybrid-world media project, assisted by outstanding international trainers, lecturers, and technical assistants. The workshop closes with a public presentation in which the issues are discussed and the results are shown.

Topics: Some of the topics that will be investigated in this workshop are:

  • Cultural application and impact of RFID technology and the internet of things; using RFID in combination with other kinds of sensors.
  • Ubiquitous computing (ubicomp) and ambient intelligence: services and applications that use chips embedded in household appliances and in public space.
  • Locative media tools: car navigation systems, GPS tools, location-sensitive mobile phones.
  • The web as interface to the physical world: geotagging and mashups with Google Maps & Google Earth.
  • Games in hybrid space.

Mar 15, 2007

Augmented reality on cell phones

From Technology Review

Nokia wants to superimpose digital information on the real world using a smart cell phone.

A prototype uses a GPS sensor, a compass, and accelerometers. Using data from these sensors, the phone can calculate the location of just about any object its camera is aimed at:

 

Last October, a team led by Markus Kähäri unveiled a proto­type of the system at the International Symposium on Mixed and Augmented Reality. The team added a GPS sensor, a compass, and accelerometers to a Nokia smart phone. Using data from these sensors, the phone can calculate the location of just about any object its camera is aimed at. Each time the phone changes location, it retrieves the names and geographical coördinates of nearby landmarks from an external database. The user can then download additional information about a chosen location from the Web--say, the names of businesses in the Empire State Building, the cost of visiting the building's observatories, or hours and menus for its five eateries.
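
The geometry behind "what is the camera aimed at" is straightforward: compute the bearing from the phone's GPS fix to each landmark and compare it with the compass heading. A hedged Python sketch of that idea (the field of view and the example coordinates are illustrative assumptions, not Nokia's actual pipeline):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    y = math.sin(dlam) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlam)
    return math.degrees(math.atan2(y, x)) % 360.0

def in_view(phone, heading_deg, landmark, fov_deg=50.0):
    """True if the landmark falls inside the camera's horizontal field of view."""
    b = bearing_deg(phone[0], phone[1], landmark[0], landmark[1])
    diff = (b - heading_deg + 180.0) % 360.0 - 180.0   # signed angle to the landmark
    return abs(diff) <= fov_deg / 2.0

# Hypothetical example: phone in midtown Manhattan, compass pointing roughly south
phone = (40.7527, -73.9844)        # near Grand Central
empire_state = (40.7484, -73.9857)
print(in_view(phone, heading_deg=190.0, landmark=empire_state))  # True
```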



Read Original Article

Oct 13, 2006

Immersive Mixed Reality Environments

Via Networked Performance

Saturday 21 October 2006, Erasmus Medical Center, Rotterdam:

  • 11.00 - 12.30: Bioinformatics dept., Faculty building, 15th floor
  • 12.30 - 18.00: Sophia Children's Hospital, Cinema, 3rd floor

Test_Lab: Immersive Mixed Reality Environments is the product of a unique collaboration between the Erasmus Medical Centre and V2_, Institute for the Unstable Media, with the aim of opening a dialogue between scientists and artists who apply Virtual Reality in their research and art practice. The event consists of demonstrations by Virtual Reality artists and scientists providing hands-on experiences with Immersive Mixed Reality Environments, and presentations by renowned international speakers presenting the latest in Virtual Reality in science and art. See below for the program details, a description of the projects that will be demonstrated, and the invited speakers who will present their work in the seminar.

Test_Lab is a bi-monthly public event hosted by V2_ that provides an informal setting to demonstrate, test, present, and/or discuss artistic research and development (aRt&D).

The event is free of charge, but registration is required before the 19th of October. For further information and registration please contact Remco Beeskow at press@v2.nl (tel: +31 (0)10 206 72 72) or Fred Balvert at f.balvert[at]erasmusmc.nl (tel: +31(0)6 41431721). Also visit www.v2.nl and www.erasmusmc.nl

Oct 01, 2006

SCACS

Re-blogged from information aesthetics



SCACS is a "Social Context-Aware Communication System" that collects information on social networks (e.g. academic co-authorship networks) and visualizes it on wearable interfaces to facilitate face-to-face communication among people in physical environments. RFID sensors sense the identity of specific people (i.e. authors) nearby, and a wearable computer transforms the complex social network graphs into treemaps, which are then shown as augmented reality on a wearable interface (or head-mounted display).
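
The graph-to-treemap step can be illustrated with a toy slice-and-dice layout. SCACS's own layout algorithm is not described here, so the weights and the layout rule below are assumptions for illustration only:

```python
def slice_and_dice(weights, x, y, w, h, horizontal=True):
    """
    Lay out one level of a treemap: each item gets a rectangle whose area
    is proportional to its weight (e.g. number of co-authored papers).
    Returns {name: (x, y, width, height)}.
    """
    total = float(sum(weights.values()))
    rects, offset = {}, 0.0
    for name, wt in weights.items():
        frac = wt / total
        if horizontal:
            rects[name] = (x + offset, y, w * frac, h)
            offset += w * frac
        else:
            rects[name] = (x, y + offset, w, h * frac)
            offset += h * frac
    return rects

# Hypothetical co-author weights for the person standing in front of you
coauthors = {"Kato": 5, "Billinghurst": 3, "Tachibana": 2}
print(slice_and_dice(coauthors, 0, 0, 320, 240))
```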


link: aist-nara.ac.jp (pdf)

Sep 06, 2006

Augmented reality may help people with visual impairment

Via NewScientist.com 

Eli Peli, an ophthalmologist and bioengineer at Harvard Medical School in Boston, has designed an augmented reality device to help patients with tunnel vision, a condition which narrows a person’s field of view.

The system, consisting of glasses fitted with a small camera and a transparent display on one lens, works by superimposing computer-generated images over real scenes.
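
A hedged sketch of how this kind of overlay step is often built: shrink the wide-angle camera image, reduce it to edge contours, and centre it on the see-through display so the periphery also appears within the wearer's remaining field of view. The parameters and the use of OpenCV are assumptions for illustration, not Peli's implementation:

```python
import cv2
import numpy as np

def tunnel_vision_overlay(wide_frame, out_w=640, out_h=480, scale=0.25):
    """Minify the wide-field camera frame, keep only its edge contours, and
    centre them on a display buffer (black pixels stay transparent on the lens)."""
    gray = cv2.cvtColor(wide_frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)                    # contours of the wide scene
    small = cv2.resize(edges, None, fx=scale, fy=scale)  # shrink to fit central vision
    canvas = np.zeros((out_h, out_w), dtype=np.uint8)
    sh, sw = small.shape
    y0, x0 = (out_h - sh) // 2, (out_w - sw) // 2
    canvas[y0:y0 + sh, x0:x0 + sw] = small
    return canvas

frame = cv2.imread("street.jpg")  # placeholder input image
if frame is not None:
    cv2.imwrite("overlay.png", tunnel_vision_overlay(frame))
```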

According to preliminary test results, which will be reported in the September issue of Investigative Ophthalmology & Visual Science, patients who tried the system were able to search for objects far more quickly.

Read the original article  

Jul 20, 2006

CatchBob

From the project's website 

CatchBob is an experimental platform for eliciting the collaborative behavior of people working together on a mobile activity. Running on a mobile device (iPAQ, Tablet PC), it is a collaborative hunt in which groups of three people have to find and circle a virtual object on our campus.

Videos of CatchBob!:

  • The long video (3:30, .mov, 15.8Mb) can be downloaded here.
  • The short version (1:20, .mov, 8.3Mb) can be downloaded there.

 

 

Jul 05, 2006

Handheld Augmented Reality

 
Handheld Augmented Reality: A standard, off-the-shelf Personal Digital Assistant (PDA) constitutes a cost-effective and lightweight hardware platform for Augmented Reality (AR). A PDA provides a simple, well-known user interface, and is fully equipped with a touch-screen and camera for providing a video see-through Magic Lens metaphor of interaction. In our Handheld AR framework, all interactive processing is done exclusively on the PDA without relying on a server infrastructure, which makes this solution highly scalable. Because of the low cost and suitable ergonomic properties of the PDA platform, massive multi-user AR applications become possible for the first time.
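
The per-frame structure of such a video see-through magic lens is easy to sketch. The helper objects below are hypothetical placeholders standing in for the camera, tracker, and renderer, not the framework's actual API:

```python
def magic_lens_frame(camera, tracker, renderer, display):
    """One frame of a self-contained handheld AR loop: every step runs on
    the device itself, with no server round-trip."""
    frame = camera.capture()                        # grab the live video frame
    pose = tracker.detect_marker(frame)             # fiducial tracking on the device CPU
    if pose is not None:
        frame = renderer.draw_overlay(frame, pose)  # register 3D content to the marker
    display.show(frame)                             # the screen acts as a lens onto the scene
```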

The MagicBook

 
 
 
The MagicBook explores seamless transition between reality and virtual reality. When users look at the pages of a real book through a hand-held display, they see virtual content superimposed over the real pages: this is augmented reality. When they see an augmented reality scene they like, users can fly into the scene and experience it as an immersive virtual environment. Currently the user can transition smoothly between these two fixed viewing modes: the augmented-reality view and the virtual-reality view.
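
The "fly into the scene" transition can be pictured as interpolating the viewpoint between the two modes. A toy Python sketch with invented coordinates (the MagicBook's actual transition logic is not detailed here):

```python
def transition_camera(ar_cam, vr_cam, t):
    """
    Blend between the AR viewpoint (looking at the book page from outside)
    and the immersive VR viewpoint (inside the scene). t runs 0 -> 1
    over the fly-in animation.
    """
    return tuple(a + (b - a) * t for a, b in zip(ar_cam, vr_cam))

ar_cam = (0.0, 0.4, 0.6)   # hypothetical: above and in front of the page
vr_cam = (0.0, 1.7, 0.0)   # hypothetical: eye height inside the virtual scene
for step in range(5):
    print(transition_camera(ar_cam, vr_cam, step / 4))
```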

May 01, 2006

Free Network Visible Network

Via information aesthetics

Free Network Visible Network is a project that combines different tools and processes to visualize, floating in space, the information exchanged between users of a network. People can experience, in a new and exciting way, how colorful virtual objects representing the digital data fly around them. These virtual objects change their shape, size, and color according to the characteristics of the information circulating in the network.
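
A mapping from network traffic to the objects' visual properties might look like the toy rules below; the installation's actual rules are not described, so everything here is purely illustrative:

```python
def packet_to_glyph(packet):
    """
    Map properties of a captured packet to the shape, size, and colour of
    the floating object that represents it. The mapping is invented for
    illustration only.
    """
    colours = {"TCP": (255, 80, 0), "UDP": (0, 160, 255), "ICMP": (0, 255, 120)}
    shape = "cube" if packet["protocol"] == "TCP" else "sphere"
    size = min(2.0, 0.2 + packet["bytes"] / 1500.0)   # bigger payload -> bigger object
    colour = colours.get(packet["protocol"], (200, 200, 200))
    return {"shape": shape, "size": size, "colour": colour}

print(packet_to_glyph({"protocol": "UDP", "bytes": 512}))
```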

 

Mar 07, 2006

An augmented reality game on a cellphone

Via Gizmodo

"Viennese computer scientist Daniel Wagner has figured out a way to show a virtual character on an i-mate SP5 cellphone, and when you move around with the cellphone, it appears that you're floating around this virtual character in 3D. Other people with cellphones can also see this character from their points of view."

 

Feb 14, 2006

MIXER 2006

Thanks Regine 

MIXER 2006: International Workshop on the Design and Engineering of Mixed Reality Systems

5 June 2006, Bucharest, Romania

Deadline for submissions: February 27th, 2006 (new deadline)

MIXER is an international workshop dedicated to the design and engineering of Mixed Reality systems. Following on from the first workshop, MIXER'2004, jointly organized with ACM-IUI'2004 and CADUI'2004, the primary goal of MIXER'2006 is to bring together researchers in the area of Mixed Reality (MR) systems (including augmented reality, augmented virtuality, augmented video, and tangible systems) to identify and articulate key research challenges for the design and engineering of MR systems. Also, in order to develop reliable theoretical foundations for MR systems, there is a need to gather the results of MR research and development to create a body of reusable knowledge for the design and construction of future systems. This body of knowledge may include:

  • MR design and engineering issues;
  • Research challenges for design methods and tools;
  • A corpus of applicable usability knowledge (e.g., guidelines, design heuristics).

MIXER'2006 solicits contributions that discuss theoretical, methodological, technical, or application-oriented considerations relevant to MR system design, MR system engineering and implementation, MR system evaluation, and application-oriented issues, especially those arising from new devices, techniques, or environments.

Nov 09, 2005

Real-time 3D human capture system for mixed-reality art and entertainment

IEEE Trans Vis Comput Graph. 2005 Nov-Dec;11(6):706-21

Authors: Nguyen TH, Qui TC, Xu K, Cheok AD, Teo SL, Zhou Z, Mallawaarachchi A, Lee SP, Liu W, Teo HS, Thang le N, Li Y, Kato H

A real-time system for capturing humans in 3D and placing them into a mixed reality environment is presented in this paper. The subject is captured by nine cameras surrounding her. Looking through a head-mounted display with a camera in front pointing at a marker, the user can see the 3D image of this subject overlaid onto a mixed reality scene. The 3D images of the subject viewed from this viewpoint are constructed using a robust and fast shape-from-silhouette algorithm. The paper also presents several techniques to improve quality and speed up the whole system. The frame rate of our system is around 25 fps using only standard Intel processor-based personal computers. Besides a remote live 3D conferencing and collaboration system, we also describe an application of the system in art and entertainment, named Magic Land, a mixed reality environment in which captured avatars of humans and 3D computer-generated virtual animations can form an interactive story and play with each other. This system demonstrates many technologies in human-computer interaction: mixed reality, tangible interaction, and 3D communication. The result of the user study not only emphasizes the benefits, but also addresses some issues of these technologies.
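
For readers unfamiliar with shape-from-silhouette, a bare-bones voxel-carving sketch (not the authors' optimized algorithm) shows the core idea: a voxel survives only if its projection lands inside the subject's silhouette in every camera.

```python
import numpy as np

def carve_visual_hull(silhouettes, projections, grid):
    """
    Basic shape-from-silhouette (visual hull) carving. `silhouettes` are
    binary masks (one per camera), `projections` are the corresponding
    3x4 camera matrices, and `grid` is an (N, 3) array of voxel centres.
    A voxel is kept only if it projects onto the silhouette in every view.
    """
    keep = np.ones(len(grid), dtype=bool)
    homo = np.hstack([grid, np.ones((len(grid), 1))])   # homogeneous voxel coords
    for mask, P in zip(silhouettes, projections):
        uvw = homo @ P.T                                # project into this camera
        u = (uvw[:, 0] / uvw[:, 2]).round().astype(int)
        v = (uvw[:, 1] / uvw[:, 2]).round().astype(int)
        inside = (u >= 0) & (u < mask.shape[1]) & (v >= 0) & (v < mask.shape[0])
        keep &= inside                                  # out of frame -> carve away
        keep[inside] &= mask[v[inside], u[inside]] > 0  # must fall on the silhouette
    return grid[keep]
```

The surviving voxels approximate the subject's visual hull, which a system like the one described can then render from the viewer's pose in real time.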