

Oct 05, 2006

It is still a watch, however

Via Mobile Community Design

The MBW-100 watch connects to selected Sony Ericsson handsets (including the K790 and K610 models) via Bluetooth 2.0. The stainless-steel, 6.6-ounce watch combines analog dials with an OLED display that shows caller ID information, and it lets you control a range of phone functions: call handling (answer, reject, or mute calls) and music playback (play, pause, and skip tracks).

 

22:54 Posted in Wearable & mobile | Permalink | Comments (0) | Tags: wearable, mobile

Teliris Launches VirtuaLive with HSL's Thoughts and Analysis

Via Human Productivity Lab

GlobalTable VirtuaLive 360 Front.jpg

Teliris, a company that develops telepresence solutions, has announced its 4th generation offering. From the press release:

The VirtuaLive(TM) enhanced technology provides the most natural and intimate virtual meeting environment on the market, and is available in a broad set of room offerings designed to meet the specific needs of its customers.

Building on Teliris' third generation GlobalTable(TM) telepresence solutions, VirtuaLive(TM) provides enhanced quality video and broadband audio, realistically replicating an in-person meeting experience by capturing and transmitting the most subtle visual gestures and auditory cues.

"All future Teliris solutions will fall under the VirtuaLive(TM) umbrella of offerings," said Marc Trachtenberg, Teliris CEO. "With such an advanced technology platform and range of solutions, companies can select the immersive experience that best fits their business environment and goals."

VirtuaLive's(TM) next generation of Virtual Vectoring(TM) is at the center of the new offerings. It provides users with unparalleled eye-to-eye contact from site-to-site in multipoint meetings with various numbers of participants within each room. No other vendor offering can match the natural experience created by advanced technology in such diverse environments.

Social behavior and norms in virtual environments are comparable to those in the physical world

Nick Yee and colleagues at Stanford University have investigated whether social behavior and norms in virtual environments are comparable to those in the physical world. To this end, they collected data from avatars in Second Life, in order to explore whether social norms of gender, interpersonal distance (IPD), and eye gaze transfer into virtual environments even though the modality of movement is entirely different.
"Results showed that established findings of IPD and eye gaze transfer into virtual environments: 1) Malemale dyads have larger IPDs than female-female dyads, 2) male-male dyads maintain less eye contact than female-female dyads, and 3) decreases in IPD are compensated with gaze avoidance"
According to Yee and colleagues, these findings suggest that social interactions in online virtual environments are governed by the same social norms as social interactions in the physical world.

Yee, N., Bailenson, J.N. & Urbanek, M. (2006). The unbearable likeness of being digital: The persistence of nonverbal social norms in online virtual environments. CyberPsychology & Behavior, in press.


 

Oct 04, 2006

The neural basis of narrative imagery

The neural basis of narrative imagery: emotion and action.

Prog Brain Res. 2006;156:93-103

Authors: Sabatinelli D, Lang PJ, Bradley MM, Flaisch T

It has been proposed that narrative emotional imagery activates an associative network of stimulus, semantic, and response (procedural) information. In previous research, predicted response components have been demonstrated through psychophysiological methods in the peripheral nervous system. Here we investigate central nervous system concomitants of pleasant, neutral, and unpleasant narrative imagery with functional magnetic resonance imaging. Subjects were presented with brief narrative scripts over headphones, and then imagined themselves engaged in the described events. During script perception, auditory association cortex showed enhanced activation during affectively arousing (pleasant and unpleasant), relative to neutral imagery. Structures involved in language processing (left middle frontal gyrus) and spatial navigation (retrosplenium) were also active during script presentation. At the onset of narrative imagery, supplementary motor area, lateral cerebellum, and left inferior frontal gyrus were initiated, showing enhanced signal change during affectively arousing (pleasant and unpleasant), relative to neutral scripts. These data are consistent with a bioinformational model of emotion that considers response mobilization as the measurable output of narrative imagery.

Science & Consciousness Review back to life

Science & Consciousness Review, the online webzine/journal for the review of the scientific study of consciousness, is back online after a crash that occurred several months ago.

scr11_new.png

Affective communication in the metaverse

Via IEET 

Have a look at this thought-provoking article by Russell Blackford on affective communication in mediated environments:

One of the main conclusions I’ve been coming to in my research on the moral issues surrounding emerging technologies is the danger that they will be used in ways that undermine affective communication between human beings - something on which our ability to bond into societies and show moment-by-moment sympathy for each other depends. Anthropological and neurological studies have increasingly confirmed that human beings have a repertoire of communication by facial expression, voice tone, and body language that is largely cross-cultural, and which surely evolved as we evolved as social animals.

The importance of this affective repertoire can be seen in the frequent complaints in internet forums that, “I misunderstood because I couldn’t hear your tone of voice or see the expression on your face.” The internet has evolved emoticons as a partial solution to the problem, but flame wars still break out over observations that would lead to nothing like such violent verbal responses if those involved were discussing the same matters face to face, or even on the telephone. I almost never encounter truly angry exchanges in real life, though I may be a bit sheltered, of course, but I see them on the internet all the time. Partly, it seems to be that people genuinely misunderstand where others are coming from with the restricted affective cues available. Partly, however, it seems that people are more prepared to lash out hurtfully in circumstances where they are not held in check by the angry or shocked looks and the raised voices they would encounter if they acted in the same way in real life.

This is one reason to be slightly wary of the internet. It’s not a reason to ban the internet, which produces all sorts of extraordinary utilitarian benefits. Indeed, even the internet’s constraint on affective communication may have advantages - it may free up shy people to say things that they would be too afraid to say in real life.

 

Read the full article

Oct 03, 2006

Netflix Prize

 

Netflix, an online movie rental service, is offering $1 million to the first person who can improve the accuracy of movie recommendations based on personal preferences by 10%. From the website:

The Netflix Prize seeks to substantially improve the accuracy of predictions about how much someone is going to love a movie based on their movie preferences. Improve it enough and you win one (or more) Prizes. Winning the Netflix Prize improves our ability to connect people to the movies they love.
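The contest's yardstick is root mean squared error (RMSE) between predicted and actual ratings, and the 10% target is measured against Netflix's own Cinematch system. As a rough illustration of how such an improvement would be scored (the function, ratings, and baseline figure below are made up for this sketch and are not Netflix's code):

```python
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and true ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Illustrative numbers only: a baseline error and a few toy predictions.
baseline_rmse = 0.95                                  # accuracy of the reference recommender
candidate_rmse = rmse([3.8, 2.1, 4.6], [4, 2, 5])     # toy predictions vs. true ratings

improvement = (baseline_rmse - candidate_rmse) / baseline_rmse
print(f"Improvement over baseline: {improvement:.1%}")  # 10% or more would qualify
```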
 
Find more details here:

Oct 02, 2006

Web Journals Take On Peer Review

LOS ANGELES - Scientists frustrated by the iron grip that academic journals hold over their research can now pursue another path to fame by taking their research straight to the public online.

Instead of having a group of hand-picked scholars review research in secret before publication, a growing number of internet-based journals are publishing studies with little or no scrutiny by the authors' peers. It's then up to rank-and-file researchers to debate the value of the work in cyberspace.

The web journals are threatening to turn the traditional peer-review system on its head. Peer review for decades has been the established way to pick apart research before it's made public.

Next month, the San Francisco-based nonprofit Public Library of Science will launch its first open peer-reviewed journal called PLoS ONE, focusing on science and medicine. Like its sister publications, it will make research articles available for free online by charging authors to publish.

 

Read the full story  

17:03 Posted in Research tools | Permalink | Comments (0) | Tags: research tools

Japan to invest US$17.4 million in robotics research

Via Pink Tentacle

Asimo

Japan’s Ministry of Economy, Trade and Industry (METI) will invest over 2 billion yen (US$17.4 million) to support the development of intelligent robots that rely on their own decision-making skills in the workplace.

The objective of METI’s robot budget is to support the development of key artificial intelligence technology for robots over the next 5 years, with the goal of introducing intelligent robots to the market by 2015.


Oct 01, 2006

Color of My Sound

Via infosthetics

medium_colorofmysound.jpg

Color of My Sound is an Internet-based application that lets users assign colors to specific sounds. The project is inspired by the phenomenon of synesthesia, the mixing of the senses.

In CMS, users choose a sound category. Then, after listening, they can choose the color to which they are most strongly drawn. Finally, they can see how others voted for that particular sound.

The Color of My Sound's original prototype has recently won a Silver Summit Creative Award, and is up for a 2006 Webby, in the NetArt category.


See also Music Animation Machine & WolframTones.

SCACS

Re-blogged from information aesthetics


wearableviz.jpg

SCACS is a "Social Context-Aware Communication System" that collects information on social networks (e.g., academic co-authorship networks) & visualizes them on wearable interfaces to facilitate face-to-face communication among people in physical environments. RFID sensors sense the identity of specific people (e.g., authors) nearby, & a wearable computer transforms the complex social network graphs into treemaps, which are then shown as augmented reality on a wearable interface (or head-mounted display).
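To make the middle step of that pipeline more concrete, here is a minimal Python sketch of turning one detected author's co-authorship weights into a simple one-level treemap layout. The names and counts are invented, and this is not the SCACS implementation, which renders its treemaps on a wearable display.

```python
def slice_treemap(weights, x, y, w, h):
    """Lay out {label: weight} as side-by-side rectangles whose widths are
    proportional to the weights: a one-level 'slice' treemap."""
    total = float(sum(weights.values()))
    rects, offset = [], 0.0
    for label, weight in sorted(weights.items(), key=lambda kv: -kv[1]):
        share = weight / total
        rects.append((label, (x + offset, y, w * share, h)))
        offset += w * share
    return rects

# Hypothetical co-authorship counts for the author an RFID tag just identified.
coauthors = {"Suzuki": 12, "Tanaka": 7, "Sato": 4, "Kobayashi": 2}

for name, (rx, ry, rw, rh) in slice_treemap(coauthors, 0, 0, 320, 240):
    print(f"{name:10s} x={rx:6.1f} y={ry:6.1f} w={rw:6.1f} h={rh:6.1f}")
```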


link: aist-nara.ac.jp (pdf)

HiResolution Bionic Ear System

Via Medgadget

Medgadget reports that Boston Scientific has received FDA approval of its cochlear implant Harmony™ HiResolution® Bionic Ear System, a device designed for severely deaf patients.

From the press release:

Developed by the Company's Neuromodulation Group, the Harmony System delivers 120 spectral bands, 5 - 10 times more than competing systems, helping to significantly increase hearing potential and quality of life for the severe-to-profoundly deaf.

"The Harmony System represents the next generation of cochlear implant technology," said Jeff Greiner, President of Boston Scientific's Neuromodulation Group. "We have brought together unprecedented advancements in science, design and functionality for the user -- furthering our commitment to restoring hearing and improving quality of life for those living with hearing loss due to permanent inner ear or auditory nerve damage."

Designed to enhance music appreciation and improve hearing in a variety of difficult listening environments, the Harmony System couples revolutionary internal sound processing (with the optional HiRes Fidelity™ 120) with the new Harmony behind-the-ear (BTE) external sound processor. Together, the two key components of the Harmony System are designed to provide significantly enhanced spectral resolution compared to conventional systems for a more natural representation of sound to help improve patient performance...

Cochlear implant users can access soft whispers and loud sounds without adjusting dials or controls with Harmony's CD-quality processing and sophisticated dual-loop automatic gain control, helping users better appreciate music, hear in noisy environments, use the telephone, and hear sounds that are loud and soft.

In addition to the FDA approval, the Harmony HiResolution Bionic Ear System recently received approval from Health Canada and the CE mark in Europe.

According to clinical evaluation results, approximately 80 percent of the subjects reported a strong preference for the Harmony sound processor with HiRes Fidelity 120, most noting that they had improved clarity of speech and/or that environmental sounds were clearer and easier to distinguish.

The HiResolution Bionic Ear System with optional HiRes Fidelity 120 is approved in the U.S. for adults only at this time and for all patients in Canada and Europe. The product is expected to be available in early 2007.

 

Sep 30, 2006

Gaming Realities

Re-blogged from Networked Performance 


logo06.gif

What role do videogames play in our lives today? As the boundaries between the virtual and the real blur more and more in the new gaming worlds we have come to inhabit, new conditions arise.

With the theme Gaming Realities, medi@terra 06 aims to explore the different dimensions and developments in the gaming field and the impact they have on different areas of society today. This year's festival and international conference are set up to explore the diverse ideas, narratives, and ideologies involved in video games.

Videogames express and reflect today's world, its aesthetics and technologies, and give rise to new identities and new mentalities. The Medi@terra Festival has invited individuals who have recognised the importance and dimensions this field has acquired, asking them to share their viewpoints and experiences on the connections between games and society, the identity and psychology of the player, the space and narration of the game, and the new technologies, conceptions and possibilities that could make the computer game a key art form of the 21st century.

'Gaming Realities: the Challenge of Digital Culture' is a three-day international conference [6-8 October, Athens] organised by the Fournos Centre for the Digital Culture.







Perimeters, Boundaries, and Borders

Re-blogged from Networked Performance 

20060929.gif

 

Artists, architects, designers, and other practitioners are constantly fashioning new forms and challenging disciplinary boundaries as they employ techniques such as rapid prototyping and generative processes. In the exhibition Perimeters, Boundaries, and Borders, at Citylab in Lancaster, UK, organizers Fast-uk and Folly explore the range of objects, buildings, and products being conceptualized with the aid of digital technologies. Aoife Ludlow's 'Remember to Forget?' is a series of jewelry designs envisioning accessories that incorporate RFID tags, allowing the wearer to record information and emotions associated with the special items we put on daily. Tavs Jorgensen uses a data glove in his 'Motion in Form' project: after he gestures around an object, the data collected by the glove is given physical shape using CNC (Computer Numerical Control) milling, creating representations of the movements in materials such as glass or ceramics. Addressing traces of a different sort is Cyclone.soc, a data-mapping piece by Gavin Bailey and Tom Corby. These works and many more examples from the frontiers of art and design are on view until October 21st. (Rhizome News)




Increasing cortical activity in auditory areas through neurofeedback fMRI

Increasing cortical activity in auditory areas through neurofeedback functional magnetic resonance imaging.

Neuroreport. 2006 Aug 21;17(12):1273-8

Authors: Yoo SS, O'Leary HM, Fairneny T, Chen NK, Panych LP, Park H, Jolesz FA

We report a functional magnetic resonance imaging method to deliver task-specific brain activities as biofeedback signals to guide individuals to increase cortical activity in auditory areas during sound stimulation. A total of 11 study participants underwent multiple functional magnetic resonance imaging scan sessions, while the changes in the activated cortical volume within the primary and secondary auditory areas were fed back to them between scan sessions. On the basis of the feedback information, participants attempted to increase the number of significant voxels during the subsequent trial sessions by adjusting their level of attention to the auditory stimuli. Results showed that the group of individuals who received the feedback were able to increase the activation volume and blood oxygenation level-dependent signal to a greater degree than the control group.
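The feedback signal in this study is essentially the size of the activated region in the auditory areas between scan sessions. A toy sketch of that kind of measure, thresholding a statistical map inside a region-of-interest mask and counting suprathreshold voxels, might look like the following; the arrays and threshold are fabricated, and this is not the authors' analysis pipeline.

```python
import numpy as np

def activated_volume(stat_map, roi_mask, z_threshold=2.3):
    """Count voxels inside the ROI whose statistic exceeds the threshold."""
    suprathreshold = (stat_map > z_threshold) & roi_mask
    return int(suprathreshold.sum())

# Toy data: a small 3-D "statistical map" and an auditory-cortex mask.
rng = np.random.default_rng(0)
stat_map = rng.normal(size=(16, 16, 8))
roi_mask = np.zeros_like(stat_map, dtype=bool)
roi_mask[4:12, 4:12, 2:6] = True

voxels = activated_volume(stat_map, roi_mask)
print(f"Feedback to participant: {voxels} activated voxels in the auditory ROI")
```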

Endoscopic eye tracking system for fMRI

Endoscopic eye tracking system for fMRI.

J Neurosci Methods. 2006 Sep 13;

Authors: Kanowski M, Rieger JW, Noesselt T, Tempelmann C, Hinrichs H

Here we introduce a new video-based real-time eye tracking system suitable for functional magnetic resonance imaging (fMRI) application. The described system monitors the subject's eye, which is illuminated with infrared light, directly at the headcoil using an endoscopic fibre optical system. This endoscopic technique assures reliable, easy-to-use and fast adjustment. It requires only a minimal amount of equipment at the headcoil and inside the examination room. Moreover, the short distance between the image acquisition optics and the eye provides high spatial tracking resolution. Interference from physiological head movement is effectively reduced by simultaneous tracking of both eye and head movements.
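The head-movement compensation amounts to referencing the tracked pupil position against a simultaneously tracked head position. Below is a heavily simplified sketch of that idea; the coordinates are fabricated, and the real system works on video images rather than ready-made coordinate streams.

```python
import numpy as np

def compensate_head_motion(pupil_xy, head_xy):
    """Subtract tracked head displacement from raw pupil coordinates.

    Both inputs are (n_samples, 2) arrays in the same camera coordinate
    frame; the result approximates eye-in-head position."""
    return np.asarray(pupil_xy) - np.asarray(head_xy)

# Fabricated example: the head drifts slowly while the eye makes a saccade.
t = np.linspace(0, 1, 5)
head = np.column_stack([0.2 * t, 0.1 * t])                        # slow drift
eye = np.column_stack([5 * (t > 0.5), np.zeros_like(t)]) + head   # saccade plus drift

print(compensate_head_motion(eye, head))   # drift removed, saccade preserved
```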

Utilizing Gamma Band to Improve Mental Task Based Brain-Computer Interface Design

Utilizing Gamma Band to Improve Mental Task Based Brain-Computer Interface Design

IEEE Transactions on Neural Systems and Rehabilitation Engineering, Volume 14, Issue 3, Sept. 2006 Page(s): 299 - 303

Palaniappan, R. 

A common method for designing a brain–computer interface (BCI) is to use electroencephalogram (EEG) signals extracted during mental tasks. In these BCI designs, features from EEG such as power and asymmetry ratios from delta, theta, alpha, and beta bands have been used in classifying different mental tasks. In this paper, the performance of the mental task based BCI design is improved by using spectral power and asymmetry ratios from gamma (24–37 Hz) band in addition to the lower frequency bands. In the experimental study, EEG signals extracted during five mental tasks from four subjects were used. Elman neural network (ENN) trained by the resilient backpropagation algorithm was used to classify the power and asymmetry ratios from EEG into different combinations of two mental tasks. The results indicated that 1) the classification performance and training time of the BCI design were improved through the use of additional gamma band features; 2) classification performances were nearly invariant to the number of ENN hidden units or feature extraction method.
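For readers unfamiliar with this feature set, the sketch below computes band power and a left/right asymmetry ratio per frequency band. It follows a common convention, (P_left - P_right) / (P_left + P_right), which may differ in detail from the paper's exact definitions; the EEG data are random numbers used only to exercise the code.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 24), "gamma": (24, 37)}

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` between lo and hi Hz (Welch PSD)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def features(ch_left, ch_right, fs=256):
    """Per-band power sums and left/right asymmetry ratios for one EEG epoch."""
    feats = {}
    for name, (lo, hi) in BANDS.items():
        p_l = band_power(ch_left, fs, lo, hi)
        p_r = band_power(ch_right, fs, lo, hi)
        feats[f"{name}_power"] = p_l + p_r
        feats[f"{name}_asym"] = (p_l - p_r) / (p_l + p_r)
    return feats

# Fabricated two-channel EEG epoch (1 s of noise) just to exercise the code.
rng = np.random.default_rng(1)
print(features(rng.normal(size=256), rng.normal(size=256)))
```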

Brain-computer interfaces for control of neuroprostheses

Brain-computer interfaces for control of neuroprostheses: from synchronous to asynchronous mode of operation.

Biomed Tech (Berl). 2006;51(2):57-63

Authors: Müller-Putz GR, Scherer R, Pfurtscheller G, Rupp R

Transferring a brain-computer interface (BCI) from the laboratory environment into real world applications is directly related to the problem of identifying user intentions from brain signals without any additional information in real time. From the perspective of signal processing, the BCI has to have an uncued or asynchronous design. Based on the results of two clinical applications, where 'thought' control of neuroprostheses based on movement imagery in tetraplegic patients with a high spinal cord injury has been established, the general steps from a synchronous or cue-guided BCI to an internally driven asynchronous brain-switch are discussed. The future potential of BCI methods for various control purposes, especially for functional rehabilitation of tetraplegics using neuroprosthetics, is outlined.
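The practical difference is that an asynchronous "brain switch" must decide on its own when the user intends something, rather than waiting for a cue. A common and very simple scheme is a detection threshold combined with a dwell time and a refractory period; the toy sketch below illustrates only that idea and is not the authors' implementation.

```python
def brain_switch(detector_output, threshold=0.8, dwell=5, refractory=50):
    """Asynchronous 'brain switch': fire when the detector output stays above
    `threshold` for `dwell` consecutive samples, then ignore further samples
    for `refractory` samples. Returns the indices of switch events."""
    events, above, cooldown = [], 0, 0
    for i, value in enumerate(detector_output):
        if cooldown > 0:
            cooldown -= 1
            continue
        above = above + 1 if value > threshold else 0
        if above >= dwell:
            events.append(i)
            above, cooldown = 0, refractory
    return events

# Fabricated detector trace: mostly idle, one sustained imagery period.
trace = [0.1] * 40 + [0.9] * 10 + [0.1] * 40
print(brain_switch(trace))   # -> [44]: one switch event, no output while idle
```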

Beyond emoticons

Via New Scientist Tech

Anthony Boucouvalas and colleagues at Bournemouth University in the UK have created a system that contorts an image of a user's face to express different emotions, New Scientist reports. According to Boucouvalas and colleagues, the system might be used to enrich text-based internet chat.

From the article:

A user first uploads a picture of their face with a "neutral" expression. Then they use their mouse to mark the ends of their eyebrows, the corners of their mouth and the edges of their eyes and lips. The software uses these points to morph the face to express different emotions: happiness, sadness, fear, anger, surprise, and disgust. A user can select an emotion and one of three intensity levels when using the system.
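As a toy illustration of how marked landmarks can drive an expression change, the sketch below simply displaces a few landmark coordinates by emotion-specific offsets scaled by intensity. The real system warps the photograph itself; all point names and offsets here are invented.

```python
# Neutral landmark positions (pixels) a user might have clicked on their photo.
NEUTRAL = {
    "left_brow_outer": (80, 90), "right_brow_outer": (160, 90),
    "mouth_left": (100, 180), "mouth_right": (140, 180),
}

# Hypothetical per-emotion displacement of each landmark (in pixels).
OFFSETS = {
    "happiness": {"mouth_left": (-4, -6), "mouth_right": (4, -6)},
    "surprise": {"left_brow_outer": (0, -8), "right_brow_outer": (0, -8)},
}

def morph(landmarks, emotion, intensity=1.0):
    """Return landmark positions shifted toward an emotion at a given intensity."""
    moved = dict(landmarks)
    for name, (dx, dy) in OFFSETS.get(emotion, {}).items():
        x, y = moved[name]
        moved[name] = (x + dx * intensity, y + dy * intensity)
    return moved

print(morph(NEUTRAL, "happiness", intensity=2 / 3))   # medium-intensity smile
```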

 

Read the full story here

See also Illustrations from Smiley Arena

Sep 27, 2006

M300: wristwatch GSM phone with SMS

Via Textually.org

medium_9361_tm.jpg

The Australian company SMS Development has announced the world's first truly mobile GSM watch phone.

The watchphone has SMS capabilities and an internal antenna, is tri-band, and comes with Bluetooth functionality; talk time is 80 minutes.

20:15 Posted in Wearable & mobile | Permalink | Comments (0) | Tags: wearable, mobile