
Dec 20, 2009

Famous thoughts on manuscript reviews

From Edward Ross (Division of Nephrology, Hypertension & Transplantation, University of Florida), in The Lancet today:

To die while awaiting the review… alone.
Ernest Hemingway

It is the nature of reviews to be late.
Aristotle

I invented slow manuscript reviews.
Al Gore

There is more to life than simply increasing the speed of manuscript reviews.
Gandhi

I deny reviewing any author. What is your definition of reviewing?
Bill Clinton

Am I late, did I miss the date? It is so sad, this work is bad.
Dr Seuss

Imagine all the reviews in the world being returned, on time, in peace.
John Lennon

Hasten the review slowly.
Augustus Caesar

All things come round to the author who will but wait.
Henry Wadsworth Longfellow

The review isn’t over until it’s over.
Yogi Berra

Never in the field of manuscript conflicts was so much owed by so many authors to so few tardy reviewers.
Winston Churchill

Is the review really late, or is the rest of the world moving faster?
Albert Einstein

Dec 14, 2009

Get relief from stress

Stressed by technology? Let out your office anger and smash up your computer!



Dec 13, 2009

Augmented Cognition

This short movie, entitled The Future of Augmented Cognition, depicts DARPA’s vision of how augmented cognition will one day be used to integrate multiple sources of information. The film is set in the year 2030, in a command centre that monitors cyberspace activity for threats to the global economy. It was commissioned by DARPA and directed by Alexander Singer.

Mobile phones to record and map noise pollution

Via Mobile Active

From traffic to construction to everyday chatter, noise pollution is a part of city life. But with the ubiquity of mobiles, documenting noise pollution is getting a little bit easier. NoiseTube and LHR NoiseMap are two projects that use mobile phones to record and map instances of noise pollution.

NoiseTube uses crowd-sourcing to monitor noise pollution. Users with GPS-enabled phones can install a free application that measures the noise level wherever they are. Users tag the recordings with a description of the noise, its source, the time of day, and other criteria, and the data is then mapped onto Google Earth; in this way participants can use their phones as noise sensors to automatically share information about their city with other members of the community.
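To illustrate how such tagged readings could end up in Google Earth, here is a minimal, hypothetical Python sketch (not NoiseTube's actual code) that turns a list of geotagged noise measurements into a KML file; the field names and sample values are assumptions.

```python
# Hypothetical sketch: export geotagged noise readings to KML for Google Earth.
# The reading fields (lat, lon, db, tag, time) are assumptions, not NoiseTube's schema.

readings = [
    {"lat": 50.8467, "lon": 4.3525, "db": 72.5, "tag": "traffic",      "time": "2009-12-13T08:15"},
    {"lat": 50.8503, "lon": 4.3517, "db": 58.0, "tag": "street music", "time": "2009-12-13T08:40"},
]

def to_kml(readings):
    placemarks = []
    for r in readings:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{r['db']:.1f} dB ({r['tag']})</name>\n"
            f"    <description>Recorded at {r['time']}</description>\n"
            f"    <Point><coordinates>{r['lon']},{r['lat']},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        + "\n".join(placemarks)
        + "\n</Document>\n</kml>"
    )

# Write a file that Google Earth can open directly.
with open("noise_readings.kml", "w") as f:
    f.write(to_kml(readings))
```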

Be a Junior Jedi

USA Today reports on a new device that uses brain waves to let players manipulate a sphere inside a clear 10-inch-tall training tower, analogous to Yoda and Luke Skywalker's abilities in the Star Wars films. The Force Trainer is expected to be priced at $90 to $100.


Image from the USA Today article
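The toy reportedly works by reading a single "attention" value from an EEG headset and using it to drive a fan that levitates the sphere: the stronger the concentration, the faster the fan spins. As a hedged illustration of that control idea (not the toy's actual firmware), here is a minimal Python sketch; the 0-100 attention scale, the threshold, and the linear mapping are all assumptions.

```python
def fan_duty_cycle(attention, threshold=30, max_attention=100):
    """Map an EEG 'attention' score (assumed 0-100) to a fan PWM duty cycle (0.0-1.0).

    Below the threshold the fan stays off, so the ball only rises when the
    player concentrates; above it, the duty cycle grows linearly with attention.
    """
    if attention <= threshold:
        return 0.0
    return min(1.0, (attention - threshold) / (max_attention - threshold))

# Example: as concentration rises, the fan (and the ball) ramps up.
for score in (10, 30, 55, 80, 100):
    print(f"attention={score:3d}  fan duty cycle={fan_duty_cycle(score):.2f}")
```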

 

Happy Birthday Positive Technology Journal

Positive Technology is five years old.

In that time, it has published 1,650 entries and generated hundreds of comments.

A warm thank-you to all readers, commenters, and contributors of projects and news!


 

Top 10 Internet of Things Products of 2009

I was very proud to read that there are two Italian innovations among the Top 10 Internet of Things Products of 2009: WideNoise and Arduino.

WideNoise is an iPhone application that samples decibel noise levels and displays them on an interactive map. With the app you can take a sound reading and, if you wish, share it with the WideNoise community. You can check the average sound level of the area around you, which might be handy if you're house-hunting or simply looking for a quiet spot to relax in.
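For a rough sense of what "sampling decibel noise levels" involves, here is a minimal Python sketch (my own illustration, not WideNoise's code) that computes an RMS level in dBFS from raw audio samples and adds a hypothetical calibration offset to approximate dB SPL; the offset value is an assumption and would in practice have to be measured against a reference sound level meter.

```python
import math

def rms_level_db(samples, calibration_offset_db=90.0):
    """Return an approximate sound level in dB from raw samples in [-1.0, 1.0].

    The level is computed relative to full scale (dBFS); the calibration
    offset that maps dBFS to dB SPL is a made-up placeholder here.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    dbfs = 20 * math.log10(rms)          # decibels relative to full scale
    return dbfs + calibration_offset_db  # rough dB SPL estimate

# Example: a quiet-ish synthetic 440 Hz tone, 0.1 s at 44.1 kHz
quiet = [0.01 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(4410)]
print(f"Estimated level: {rms_level_db(quiet):.1f} dB")
```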

Arduino is an open-source electronics prototyping platform built on open hardware and software. It's intended for artists, designers, hobbyists, and anyone interested in creating interactive objects or environments. For an example of the kind of internet-connected object you can build with Arduino, check out this presentation, in which the author configured a child's toy ray gun to react whenever anyone posted the #barcampliverpool hashtag on Twitter.
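To give a flavour of that hashtag-triggered-toy pattern, here is a hedged Python sketch of the computer-side half: it polls a search feed for the hashtag and, when a new match appears, sends a trigger byte over the serial port to an Arduino that fires the toy. The feed URL, serial port, and one-byte protocol are all assumptions for illustration, not the presenter's actual setup.

```python
import time
import requests   # third-party HTTP library
import serial     # pyserial, for talking to the Arduino over USB

SEARCH_URL = "https://example.com/search.json"   # placeholder: any JSON feed of recent posts
HASHTAG = "#barcampliverpool"
PORT = "/dev/ttyUSB0"                            # assumed Arduino serial port

arduino = serial.Serial(PORT, 9600, timeout=1)
seen_ids = set()

while True:
    posts = requests.get(SEARCH_URL, params={"q": HASHTAG}, timeout=10).json().get("results", [])
    for post in posts:
        if post["id"] not in seen_ids:
            seen_ids.add(post["id"])
            arduino.write(b"!")   # the Arduino sketch fires the ray gun on receiving '!'
            print("Triggered by:", post.get("text", ""))
    time.sleep(30)                # poll every 30 seconds
```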


Dec 12, 2009

A History of Game Controllers Diagram

Image by Damien Lopez (via James Burke)

A History of Game Controllers

Microvision’s new PicoP 3D projected screen

Amazing new first-person shooter technology that lets you project the screen anywhere.

Especially When The October Wind

by Dylan Thomas

Especially when the October wind
With frosty fingers punishes my hair,
Caught by the crabbing sun I walk on fire
And cast a shadow crab upon the land,
By the sea's side, hearing the noise of birds,
Hearing the raven cough in winter sticks,
My busy heart who shudders as she talks
Sheds the syllabic blood and drains her words.

Shut, too, in a tower of words, I mark
On the horizon walking like the trees
The wordy shapes of women, and the rows
Of the star-gestured children in the park.
Some let me make you of the vowelled beeches,
Some of the oaken voices, from the roots
Of many a thorny shire tell you notes,
Some let me make you of the water's speeches.

Behind a post of ferns the wagging clock
Tells me the hour's word, the neural meaning
Flies on the shafted disk, declaims the morning
And tells the windy weather in the cock.
Some let me make you of the meadow's signs;
The signal grass that tells me all I know
Breaks with the wormy winter through the eye.
Some let me tell you of the raven's sins.

Especially when the October wind
(Some let me make you of autumnal spells,
The spider-tongued, and the loud hill of Wales)
With fists of turnips punishes the land,
Some let me make of you the heartless words.
The heart is drained that, spelling in the scurry
Of chemic blood, warned of the coming fury.
By the sea's side hear the dark-vowelled birds.


Dec 08, 2009

Allosphere: University of California, Santa Barbara

The AlloSphere is a spherical space in which immersive, virtual environments allow researchers to convert large data sets into experiences of sight and sound. For example, it allows researchers to “fly” through a hydrogen atom while hearing sonified features of the wavefunction of its single electron to help describe invisible processes of nature.

The facility consists of a 30-foot-diameter sphere built inside a three-story, nearly echo-free cube. Inside the chamber are two hemispheres constructed of perforated aluminum, designed to be optically opaque and acoustically transparent. A 7-foot-wide bridge runs across the center, supporting the users. High-resolution video projectors can project images across the entire inner surface, enabling seamless stereoscopic 3D projection.
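As a toy illustration of the kind of sonification described above, the Python sketch below maps the probability density of the hydrogen 1s orbital, sampled along a radial "fly-through", onto pitch and loudness and writes the result to a WAV file. This is my own simplification, not the AlloSphere's rendering pipeline, and the frequency and amplitude mappings are arbitrary.

```python
import math, struct, wave

SAMPLE_RATE = 44100
DURATION = 4.0        # seconds for the fly-through
A0 = 1.0              # Bohr radius in arbitrary units

def psi_sq_1s(r):
    """Probability density of the hydrogen 1s orbital (up to normalization)."""
    return math.exp(-2.0 * r / A0)

frames = bytearray()
n_samples = int(SAMPLE_RATE * DURATION)
phase = 0.0
for i in range(n_samples):
    r = 5.0 * A0 * i / n_samples              # fly outward from the nucleus to 5 Bohr radii
    density = psi_sq_1s(r)
    freq = 200.0 + 1000.0 * density           # higher density -> higher pitch (arbitrary mapping)
    phase += 2.0 * math.pi * freq / SAMPLE_RATE
    amplitude = 0.2 + 0.6 * density           # higher density -> louder
    sample = int(32767 * amplitude * math.sin(phase))
    frames += struct.pack("<h", sample)       # 16-bit signed PCM

with wave.open("hydrogen_1s.wav", "w") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(bytes(frames))
```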

The Application and Management of Personal Electronic Information

The First International Forum on the Application and Management of Personal Electronic Information, organized by the MIT SENSEable City Lab, gathered stakeholders from multiple disciplines to share perspectives on the issues surrounding the application and management of personal electronic information:

The goal of this forum is to explore the novel applications for electronic data and address the risks, concerns, and consumer opinions associated with the use of this data. In addition, it will include discussions on techniques and standards for both protecting and extracting value from this information from several points of view: what techniques and standards currently exist, and what are their strengths and limitations? What holistic approaches to protecting and extracting value from data would we take if we were given a blank slate?

Position papers and presentations are now online.

 

Laser-Enhanced Concentration

Now that neuroscientists have identified the brain's synchronizing mechanism, they've started work on therapies to strengthen attention. Ultimately, it may be possible to improve your attention by using pulses of light to directly synchronize your neurons, a form of direct therapy that could help people with schizophrenia and attention-deficit problems, said Dr. Desimone, the director of the McGovern Institute for Brain Research at MIT. In the nearer future, neuroscientists might also help you focus by observing your brain activity and providing biofeedback as you practice strengthening your concentration. Researchers have already observed higher levels of synchrony in the brains of people who regularly meditate.

(Source: http://www.nytimes.com/2009/05/05/science/05tier.html)

FaceBots

The world's first robot with its own Facebook page (and that can use its information in conversations with "friends") has been developed by Nikolaos Mavridis and collaborators from the Interactive Robots and Media Lab at the United Arab Emirates University.

The main hypothesis of the FaceBots project is that long-term human-robot interaction will benefit from references to "shared memories" and "events relevant to shared friends" in human-robot dialogues.

More to explore:

  • N. Mavridis, W. Kazmi and P. Toulis, "Friends with Faces: How Social Networks Can Enhance Face Recognition and Vice Versa", contributed book chapter to Computational Social Networks Analysis: Trends, Tools and Research Advances, Springer Verlag, 2009. pdf

  • N. Mavridis, W. Kazmi, P. Toulis and C. Ben-AbdelKader, "On the synergies between online social networking, Face Recognition, and Interactive Robotics", CaSoN 2009. pdf

  • N. Mavridis, C. Datta et al., "Facebots: Social robots utilizing and publishing social information in Facebook", IEEE HRI 2009. pdf

 

Smart Sensors Help Improve Prostheses Personalization

Press release: Sensitive fitting process for leg prostheses

When fitting a leg prosthesis to a patient, clinicians typically have to use a gait laboratory to analyze the patient's natural steps. The problem is that only one or two steps can be recorded in the lab, which provides too little information for a comprehensive fitting. Now researchers at the Fraunhofer Institute for Surface Engineering and Thin Films IST in Braunschweig, Germany, have developed a sensor system that fits into a prosthesis and allows a longer-term analysis.

The adapter measures 4 x 4 x 3 centimeters and sits at the ankle joint or above the knee. It measures the applied forces along three spatial axes and the torques about three axes. A miniature data logger near the sensor reads out the data and stores them. “This adapter makes it possible to continuously measure the load on a leg prosthesis during different routine activities throughout an entire day,” says IST team leader Dr. Ralf Bandorf. The adapter has eight measuring bridges, each with four strain gauges. These consist of a sputtered insulating layer covered with a metal film. When the patient walks, the layer stretches according to the type of movement performed, and this changes the electrical resistance of the metal film. The 32 strain gauges are placed at a number of different points and in different orientations, so the data provide a complete picture of the load acting on the prosthesis.

Strain gauges used in sensor systems normally consist of adhesive films, but in this case the layers are sputtered directly onto the surface. This means they can also be applied to the complex geometries of the adapter, for instance its edges, which would be difficult with adhesive films. Moreover, the film is insensitive to moisture and does not require the use of adhesives.

“The main challenge was to design a suitable geometry for the adapter,” says Dr. Ralf Bandorf. It mustn’t be too large, as there is only limited space available inside the prosthesis, but it has to be large enough to accommodate the strain gauges. The developers are already testing a prototype of the adapter on the first patients, and will present it at the Hannover Messe from April 20 to 24.
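To make the measurement chain concrete, here is a small, hypothetical Python sketch of how the eight bridge outputs described above could be combined into six load components (three forces, three torques) using a calibration matrix identified beforehand. The matrix values are invented placeholders; the adapter's real calibration is not given in the press release.

```python
import numpy as np

# Hypothetical 6x8 calibration matrix C: loads = C @ bridge_signals.
# In practice C is identified by applying known forces/torques and
# recording the eight bridge outputs; the values below are placeholders.
rng = np.random.default_rng(0)
C = rng.normal(size=(6, 8))

def bridge_to_loads(bridge_signals, calibration=C):
    """Convert eight Wheatstone-bridge output signals (e.g. in mV/V)
    into [Fx, Fy, Fz, Mx, My, Mz]."""
    bridge_signals = np.asarray(bridge_signals, dtype=float)
    return calibration @ bridge_signals

# Example reading from the data logger (eight bridge outputs)
reading = [0.12, -0.03, 0.48, 0.05, -0.21, 0.09, 0.33, -0.02]
Fx, Fy, Fz, Mx, My, Mz = bridge_to_loads(reading)
print(f"Forces:  {Fx:.2f}, {Fy:.2f}, {Fz:.2f}")
print(f"Torques: {Mx:.2f}, {My:.2f}, {Mz:.2f}")
```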

Avatar - A multi-sensory system for real time body position monitoring

Avatar - A multi-sensory system for real time body position monitoring.

Conf Proc IEEE Eng Med Biol Soc. 2009;1:2462-5

Authors: Jovanov E, Hanish N, Courson V, Stidham J, Stinson H, Webb C, Denny K

Virtual reality and computer-assisted physical rehabilitation applications require an unobtrusive and inexpensive real-time monitoring system. Existing systems are usually complex and expensive and based on infrared monitoring. In this paper we propose Avatar, a hybrid system consisting of off-the-shelf components and sensors. Absolute positioning of a few reference points is determined using infrared diodes on the subject's body and a set of Wii Remotes as optical sensors. Individual body segments are monitored by intelligent inertial sensor nodes (iSense). A network of inertial nodes is controlled by a master node that serves as a gateway for communication with a capture device. Each sensor features a 3D accelerometer and a 2-axis gyroscope. The Avatar system is used to control avatars in Virtual Reality applications, but could also be used in a variety of augmented reality, gaming, and computer-assisted physical rehabilitation applications.
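The inertial nodes described in the abstract pair a 3D accelerometer with a gyroscope; a common way to fuse the two into a stable tilt estimate is a complementary filter. The sketch below is a generic Python illustration of that idea, not the iSense nodes' actual firmware, and the sample data are synthetic.

```python
import math

def complementary_filter(samples, dt=0.01, alpha=0.98):
    """Estimate a tilt angle (degrees) from (gyro_rate_dps, accel_x, accel_z) samples.

    The gyroscope integrates smoothly but drifts; the accelerometer gives an
    absolute but noisy angle. Blending them with weight alpha keeps the best
    of both. dt is the sample period in seconds.
    """
    angle = 0.0
    estimates = []
    for gyro_rate, ax, az in samples:
        accel_angle = math.degrees(math.atan2(ax, az))      # gravity-based tilt
        angle = alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
        estimates.append(angle)
    return estimates

# Example: a body segment tilting slowly to ~10 degrees, sampled 100 times
fake = [(10.0, math.sin(math.radians(i * 0.1)), math.cos(math.radians(i * 0.1)))
        for i in range(100)]
print(f"Final tilt estimate: {complementary_filter(fake)[-1]:.1f} deg")
```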

Impact of the virtual reality on the neural representation of an environment

Impact of the virtual reality on the neural representation of an environment.

Hum Brain Mapp. 2009 Dec 4;

Authors: Mellet E, Laou L, Petit L, Zago L, Mazoyer B, Tzourio-Mazoyer N

Despite the increasing use of virtual reality, the impact on cerebral representation of topographical knowledge of learning by virtual reality rather than by actual locomotion has never been investigated. To tackle this challenging issue, we conducted an experiment wherein participants learned an immersive virtual environment using a joystick. The following day, participants' brain activity was monitored by functional magnetic resonance imaging while they mentally estimated distances in this environment. Results were compared with those of participants performing the same task but having learned the real version of the environment by actual walking. We detected a large set of areas shared by both groups including the parieto-frontal areas and the parahippocampal gyrus. More importantly, although participants of both groups performed the same mental task and exhibited similar behavioral performances, they differed at the brain activity level. Unlike real learners, virtual learners activated a left-lateralized network associated with tool manipulation and action semantics. This demonstrated that a neural fingerprint distinguishing virtual from real learning persists when subjects use a mental representation of the learnt environment with equivalent performances. Hum Brain Mapp, 2010. (c) 2009 Wiley-Liss, Inc.

Dec 06, 2009

Avatar: Can't wait any longer

I just cannot wait for James Cameron's new movie Avatar...

The iPhone Orchestra

The Stanford Mobile Phone Orchestra (MoPhO) is a new repertoire-based ensemble that uses mobile phones as musical instruments. MoPhO's interactive musical works take advantage of the unique technological capabilities of today's hardware and software, transforming multi-touch screens, built-in accelerometers, built-in microphones, GPS, data networks, and computation into powerful yet mobile chamber meta-instruments.

The researcher behind the idea, Ge Wang, believes cell phones are becoming so powerful that we “cannot ignore them anymore as platforms for creativity. . . . It levels the playing ground in some ways, because everyone has a cell phone.”

 



The Stanford Mobile Phone Orchestra’s performance on December 3 in Palo Alto, CA used Apple iPhones amplified by speakers attached to small fingerless gloves. Here is a video of the concert.


Dec 02, 2009

Ring°Wall: World's Largest Multi-Touch and Multi-User Wall

Via Infoaesthetic


The world's biggest/largest/longest multi-touch (and evidently multi-user) wall, installed at the Nürburgring in Germany, consists of a huge LED media facade at the top and a multitouch information wall at the bottom. It impresses by its sheer physical size: the surface totals about 425 square meters, equaling more than 6,000 computer displays.

The interactive interface emerges from 34 million pixels generated by 15 high-definition projectors, supported by sound from 30 directional speakers. The multitouch sensing itself is based on a laser technique called Laser Light Plane illumination (LLP).
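In an LLP setup, fingers touching the surface scatter light out of the laser plane and show up as bright blobs to cameras watching the screen; touch tracking then reduces to finding those blobs in each frame. The Python sketch below (my illustration, not the Ring°Wall's software) shows that step with NumPy and SciPy on a synthetic camera frame.

```python
import numpy as np
from scipy import ndimage

def detect_touches(frame, threshold=200, min_pixels=5):
    """Find touch points in a grayscale camera frame (2D uint8 array).

    Pixels brighter than `threshold` are where fingers scatter the laser
    light plane; connected bright regions larger than `min_pixels` are
    treated as touches and reduced to their centroids.
    """
    bright = frame > threshold
    labels, n = ndimage.label(bright)
    touches = []
    for region in range(1, n + 1):
        if np.count_nonzero(labels == region) >= min_pixels:
            cy, cx = ndimage.center_of_mass(bright, labels, region)
            touches.append((cx, cy))
    return touches

# Example: a synthetic frame with two bright "finger" blobs
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:110, 200:210] = 255
frame[300:312, 500:512] = 255
print(detect_touches(frame))   # -> roughly [(204.5, 104.5), (505.5, 305.5)]
```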

In practice, more than 80 users can simultaneously browse news and activities around the ringworld. Now imagine the sorts of sparklines this device could display...

You can watch a documentary movie below.