Aug 31, 2014

Information Entropy

Information Entropy by Oliver Reichenstein

Will information technology affect our minds the same way the environment was affected by our analogue technology? Designers hold a key position in dealing with ever-increasing data pollution. We are mostly focussed on speeding things up, on making sharing easier, faster, more accessible. But speed, usability and accessibility are not the main issues anymore. The main issues are not technological; they are structural, processual. What we lack is clarity, correctness, depth, time. Are there counter-techniques we can employ to turn data into information, information into knowledge, and knowledge into wisdom?

Oliver Reichenstein — Information Entropy (SmashingConf NYC 2014) from Smashing Magazine on Vimeo.

Jun 30, 2014

Never do a Tango with an Eskimo

Apr 15, 2014

Avegant - Glyph Kickstarter - Wearable Retinal Display

Via Mashable

Move over Google Glass and Oculus Rift, there's a new kid on the block: Glyph, a mobile, personal theater.

Glyph looks like a normal headset and operates like one, too. That is, until you move the headband down over your eyes and it becomes a fully functional visual visor that displays movies, television shows, video games or any other media connected via the attached HDMI cable.

Using Virtual Retinal Display (VRD), a technology that mimics the way we see light, the Glyph projects images directly onto your retina using one million micromirrors in each eyepiece. These micromirrors reflect the image onto the retina, producing a reportedly crisp and vivid picture.
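
If the million micromirrors per eyepiece work like the binary mirrors in DLP-style projectors, grayscale comes from time multiplexing: each mirror only switches fully on or off, and displaying the image's k-th bitplane for a time proportional to 2^k averages out to the intended 8-bit intensity. Here is a minimal sketch of that bitplane decomposition, offered as an assumption about the general technique rather than Avegant's actual drive scheme:

```python
import numpy as np

def frame_to_bitplanes(frame: np.ndarray) -> np.ndarray:
    """Decompose an 8-bit grayscale frame into 8 binary bitplanes.

    A binary micromirror renders grayscale by flipping rapidly:
    showing bitplane k for a dwell time proportional to 2**k
    time-averages back to the original 8-bit intensity.
    """
    return np.stack([(frame >> k) & 1 for k in range(8)], axis=0)

# Toy 4x4 frame: the dwell-time-weighted sum of bitplanes reconstructs it.
frame = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
planes = frame_to_bitplanes(frame)
weights = 2 ** np.arange(8)  # relative dwell time of each bitplane
reconstructed = np.tensordot(weights, planes, axes=1)
assert np.array_equal(reconstructed, frame)
```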

Mar 03, 2014

By licking these electric ice cream cones, you can make music

From Wired

Ice cream can be the reward after a successful little league game, a consolation after a bad breakup, or, in the hands of gourmet geeks, a sweet musical instrument. Designers Carla Diana and Emilie Baltz recently whipped up a musical performance where a quartet of players jammed using just a quart of vanilla ice cream and some high-tech cones.

Lickestra

Jan 23, 2014

Transparent display @MIT

The innovative system is described in a paper published in the journal Nature Communications, co-authored by MIT professors Marin Soljačić and John Joannopoulos, graduate student Chia Wei Hsu, and four others.

Abstract of Nature Communications paper:

The ability to display graphics and texts on a transparent screen can enable many useful applications. Here we create a transparent display by projecting monochromatic images onto a transparent medium embedded with nanoparticles that selectively scatter light at the projected wavelength. We describe the optimal design of such nanoparticles, and experimentally demonstrate this concept with a blue-color transparent display made of silver nanoparticles in a polymer matrix. This approach has attractive features including simplicity, wide viewing angle, scalability to large sizes and low cost.
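
The core idea of the abstract — nanoparticles that scatter strongly at the projected wavelength while transmitting the rest of the visible spectrum — can be illustrated with a toy model. The Lorentzian line shape, resonance wavelength and linewidth below are illustrative assumptions, not the paper's actual Mie-theory nanoparticle design:

```python
import numpy as np

def scattering_efficiency(wavelength_nm, resonance_nm=458.0, width_nm=20.0):
    """Idealized Lorentzian scattering resonance of the nanoparticles.

    resonance_nm is the projector wavelength the particles are tuned to
    (a blue line is assumed here); light far from it passes through.
    """
    detuning = 2.0 * (wavelength_nm - resonance_nm) / width_nm
    return 1.0 / (1.0 + detuning ** 2)

# Sweep the visible band: transmission stays high except near resonance,
# which is why the screen looks transparent under ambient light.
for wl in np.linspace(400, 700, 7):
    q = scattering_efficiency(wl)
    print(f"{wl:5.0f} nm  scattered {q:4.2f}  transmitted {1 - q:4.2f}")
```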

Jan 20, 2014

The Future of Gesture Control - Introducing Myo

Thalmic Labs at TEDxToronto

Jan 12, 2014

Wearable Pregnancy Ultrasound

Melody Shiue, an industrial design graduate from the University of New South Wales, is proposing a wearable fetal ultrasound system to enhance maternal-fetal bonding as a reassurance window. It is an e-textile-based apparatus that uses 4D ultrasound. The latest stretchable display technology is also employed on the abdominal region, allowing other members of the family, especially the father, to connect with the foetus. PreVue not only gives you the opportunity to observe and comprehend the physical growth of the baby, but also an early sense of its personality as you see it yawning, rolling or smiling, bringing you closer until the day it finally rests in your arms.

More information at Tuvie

Dec 24, 2013

NeuroOn mask improves sleep and helps manage jet lag

Via Medgadget

A group of Polish engineers is working on a smart sleeping mask that they hope will allow people to get more out of their resting time, as well as allow for unusual sleeping schedules that would particularly benefit those who are often on-call. The NeuroOn mask will have an embedded EEG for brain wave monitoring, EMG for detecting muscle motion on the face, and sensors that can track whether your eyes are moving and whether you are in REM sleep. The team is currently raising money on Kickstarter, where you can pre-order your own NeuroOn once it’s developed into a final product.
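
Sleep staging from EEG is commonly based on band power: deep sleep is dominated by slow delta waves, while REM shows faster, low-amplitude activity. The sketch below estimates band power with Welch's method; the 250 Hz sampling rate and the theta-versus-delta heuristic are illustrative assumptions, not NeuroOn's actual algorithm:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # Hz, an assumed EEG sampling rate
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_powers(eeg: np.ndarray, fs: int = FS) -> dict:
    """Mean spectral power per EEG band, via Welch's method."""
    freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def looks_like_rem(powers: dict) -> bool:
    """Crude heuristic: REM-like once delta no longer dominates theta."""
    return powers["theta"] > 0.5 * powers["delta"]

# Score one 30-second epoch of synthetic, theta-heavy "EEG".
t = np.arange(0, 30, 1 / FS)
epoch = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)
print(looks_like_rem(band_powers(epoch)))
```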

Dec 08, 2013

iMirror

Take back your mornings with the iMirror – the interactive mirror for your home. Watch the video for a live demo!

Nov 20, 2013

inFORM

inFORM is a Dynamic Shape Display developed by the MIT Tangible Media Group that can render 3D content physically, so users can interact with digital information in a tangible way.

inFORM can also interact with the physical world around it, for example moving objects on the table’s surface.

Remote participants in a video conference can be displayed physically, allowing for a strong sense of presence and the ability to interact physically at a distance.
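
Concretely, the display is a grid of motorized pins (the prototype used a 30×30 array), so rendering a remote participant physically amounts to downsampling a depth-camera frame to the pin grid and mapping depth to pin height. A minimal sketch, with the pin travel and depth range as hypothetical parameters:

```python
import numpy as np

PINS = 30            # inFORM's prototype used a 30x30 pin array
MAX_TRAVEL_MM = 100  # hypothetical pin travel range

def depth_to_pin_heights(depth_mm: np.ndarray, near=500.0, far=1500.0):
    """Downsample a depth frame to the pin grid; map depth to pin height.

    Closer surfaces (e.g., a remote participant's hands) raise pins
    higher; depths outside [near, far] clamp to the travel limits.
    """
    h, w = depth_mm.shape
    # Crop to a multiple of the grid, then block-average to PINS x PINS.
    grid = depth_mm[: h - h % PINS, : w - w % PINS]
    grid = grid.reshape(PINS, h // PINS, PINS, w // PINS).mean(axis=(1, 3))
    # Near depth -> fully raised, far depth -> fully lowered.
    norm = np.clip((far - grid) / (far - near), 0.0, 1.0)
    return norm * MAX_TRAVEL_MM

frame = np.random.uniform(400, 1600, (480, 640))  # fake depth-camera frame
print(depth_to_pin_heights(frame).shape)          # (30, 30)
```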

Nov 16, 2013

Phonebloks

Phonebloks is a modular smartphone concept created by Dutch designer Dave Hakkens to reduce electronic waste. By attaching individual third-party components (called "bloks") to a main board, a user would create a personalized smartphone. These bloks can be replaced at will if they break or the user wishes to upgrade.
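
In software terms, the concept is a plugin architecture in hardware: a main board exposes slots, and independent components can be attached, replaced or upgraded without discarding the rest. A toy model of that idea (the classes and slot names are illustrative, not part of any Phonebloks specification):

```python
from dataclasses import dataclass, field

@dataclass
class Blok:
    """One swappable third-party component ("blok")."""
    name: str
    vendor: str

@dataclass
class Phone:
    """The main board: a set of slots holding replaceable bloks."""
    slots: dict = field(default_factory=dict)

    def attach(self, slot: str, blok: Blok) -> None:
        self.slots[slot] = blok  # replaces whatever was there

    def upgrade(self, slot: str, blok: Blok) -> Blok:
        old, self.slots[slot] = self.slots.get(slot), blok
        return old  # the removed blok can be reused or recycled

phone = Phone()
phone.attach("camera", Blok("8MP camera", "VendorA"))
removed = phone.upgrade("camera", Blok("12MP camera", "VendorB"))
```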

Aug 07, 2013

What Color Is Your Night Light? It May Affect Your Mood

When it comes to some of the health hazards of light at night, a new study suggests that the color of the light can make a big difference.

Read full story on Science Daily

Mar 03, 2013

Brain-to-brain communication between rats achieved

From Duke Medicine News and Communications

Researchers at Duke University Medical Center in the US report in the February 28, 2013 issue of Scientific Reports the successful wiring together of sensory areas in the brains of two rats. The result of the experiment is that one rat will respond to the experiences to which the other is exposed.

Brain-to-brain interface with rats | Duke

The results of these projects suggest the future potential for linking multiple brains to form what the research team is calling an "organic computer," which could allow sharing of motor and sensory information among groups of animals.

"Our previous studies with brain-machine interfaces had convinced us that the rat brain was much more plastic than we had previously thought," said Miguel Nicolelis, M.D., PhD, lead author of the publication and professor of neurobiology at Duke University School of Medicine. "In those experiments, the rat brain was able to adapt easily to accept input from devices outside the body and even learn how to process invisible infrared light generated by an artificial sensor. So, the question we asked was, ‘if the brain could assimilate signals from artificial sensors, could it also assimilate information input from sensors from a different body?’"

To test this hypothesis, the researchers first trained pairs of rats to solve a simple problem: to press the correct lever when an indicator light above the lever switched on, which rewarded the rats with a sip of water. They next connected the two animals' brains via arrays of microelectrodes inserted into the area of the cortex that processes motor information.

One of the two rodents was designated as the "encoder" animal. This animal received a visual cue that showed it which lever to press in exchange for a water reward. Once this “encoder” rat pressed the right lever, a sample of its brain activity that coded its behavioral decision was translated into a pattern of electrical stimulation that was delivered directly into the brain of the second rat, known as the "decoder" animal.

The decoder rat had the same types of levers in its chamber, but it did not receive any visual cue indicating which lever it should press to obtain a reward. Therefore, to press the correct lever and receive the reward it craved, the decoder rat would have to rely on the cue transmitted from the encoder via the brain-to-brain interface.
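
The translation step described above — sampling the encoder's decision-related activity and turning it into a stimulation pattern — can be sketched roughly as follows. The template comparison and pulse counts are illustrative assumptions, not the algorithm published in the paper:

```python
import numpy as np

def encode_decision(spike_counts: np.ndarray, left_template: np.ndarray) -> int:
    """Translate the encoder rat's ensemble activity into a pulse count.

    spike_counts: per-neuron spike counts from the encoder's motor cortex
    during the decision window. The closer the pattern is to a stored
    "left lever" template, the more microstimulation pulses are delivered
    to the decoder's cortex (a sketch of the idea, not the published method).
    """
    similarity = np.corrcoef(spike_counts, left_template)[0, 1]
    return int(np.interp(similarity, [-1.0, 1.0], [1, 20]))  # 1..20 pulses

def decode_choice(n_pulses: int, threshold: int = 10) -> str:
    """The decoder rat learns: strong stimulation -> press left."""
    return "left" if n_pulses >= threshold else "right"

rng = np.random.default_rng(0)
template = rng.poisson(8.0, size=24)          # stored "left" pattern, 24 neurons
trial = template + rng.poisson(1.0, size=24)  # a left-decision trial
print(decode_choice(encode_decision(trial, template)))  # -> "left"
```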

The researchers then conducted trials to determine how well the decoder animal could decipher the brain input from the encoder rat to choose the correct lever. The decoder rat ultimately achieved a maximum success rate of about 70 percent, only slightly below the possible maximum success rate of 78 percent that the researchers had theorized was achievable based on success rates of sending signals directly to the decoder rat’s brain.

Importantly, the communication provided by this brain-to-brain interface was two-way. For instance, the encoder rat did not receive a full reward if the decoder rat made a wrong choice. The result of this peculiar contingency, said Nicolelis, led to the establishment of a "behavioral collaboration" between the pair of rats.

"We saw that when the decoder rat committed an error, the encoder basically changed both its brain function and behavior to make it easier for its partner to get it right," Nicolelis said. "The encoder improved the signal-to-noise ratio of its brain activity that represented the decision, so the signal became cleaner and easier to detect. And it made a quicker, cleaner decision to choose the correct lever to press. Invariably, when the encoder made those adaptations, the decoder got the right decision more often, so they both got a better reward."

In a second set of experiments, the researchers trained pairs of rats to distinguish between a narrow or wide opening using their whiskers. If the opening was narrow, they were taught to nose-poke a water port on the left side of the chamber to receive a reward; for a wide opening, they had to poke a port on the right side.

The researchers then divided the rats into encoders and decoders. The decoders were trained to associate stimulation pulses with the left reward poke as the correct choice, and an absence of pulses with the right reward poke as correct. During trials in which the encoder detected the opening width and transmitted the choice to the decoder, the decoder had a success rate of about 65 percent, significantly above chance.

To test the transmission limits of the brain-to-brain communication, the researchers placed an encoder rat in Brazil, at the Edmond and Lily Safra International Institute of Neuroscience of Natal (ELS-IINN), and transmitted its brain signals over the Internet to a decoder rat in Durham, N.C. They found that the two rats could still work together on the tactile discrimination task.

"So, even though the animals were on different continents, with the resulting noisy transmission and signal delays, they could still communicate," said Miguel Pais-Vieira, PhD, a postdoctoral fellow and first author of the study. "This tells us that it could be possible to create a workable, network of animal brains distributed in many different locations."

Nicolelis added, "These experiments demonstrated the ability to establish a sophisticated, direct communication linkage between rat brains, and that the decoder brain is working as a pattern-recognition device. So basically, we are creating an organic computer that solves a puzzle."

"But in this case, we are not inputting instructions, but rather only a signal that represents a decision made by the encoder, which is transmitted to the decoder’s brain which has to figure out how to solve the puzzle. So, we are creating a single central nervous system made up of two rat brains,” said Nicolelis.  He pointed out that, in theory, such a system is not limited to a pair of brains, but instead could include a network of brains, or “brain-net.” Researchers at Duke and at the ELS-IINN are now working on experiments to link multiple animals cooperatively to solve more complex behavioral tasks.

"We cannot predict what kinds of emergent properties would appear when animals begin interacting as part of a brain-net. In theory, you could imagine that a combination of brains could provide solutions that individual brains cannot achieve by themselves," continued Nicolelis. Such a connection might even mean that one animal would incorporate another's sense of "self," he said.

"In fact, our studies of the sensory cortex of the decoder rats in these experiments showed that the decoder's brain began to represent in its tactile cortex not only its own whiskers, but the encoder rat's whiskers, too. We detected cortical neurons that responded to both sets of whiskers, which means that the rat created a second representation of a second body on top of its own." Basic studies of such adaptations could lead to a new field that Nicolelis calls the "neurophysiology of social interaction."

Such complex experiments will be enabled by the laboratory's ability to record brain signals from almost 2,000 brain cells at once. The researchers hope to record the electrical activity produced simultaneously by 10,000-30,000 cortical neurons in the next five years.

Such massive brain recordings will enable more precise control of motor neuroprostheses—such as those being developed by the Walk Again Project—to restore motor control to paralyzed people, Nicolelis said.

More to explore:

Pais-Vieira, M., Lebedev, M., Kunicki, C., Wang, J. & Nicolelis, M. A. L. Sci. Rep. 3, 1319 (2013). PubMed

2nd Summer School on Human Computer Confluence

Date: 17th, 18th and 19th July 2013

Venue: IRCAM
Location: Paris, France
Website: http://www.ircam.fr/

The 2nd HCC summer school aims to share scientific knowledge and experience among participants, enhance and stimulate interdisciplinary dialogue as well as provide further opportunities for co-operation within the study domains of Human Computer Confluence.

The topics of the summer school will be framed around the following issues:
• re-experience yourself,
• experience being others,
• experience being together in more powerful ways,
• experience other environments,
• experience new senses,
• experience abstract data spaces.

The 2nd HCC summer school will draw on the research interests and the special facilities of the IRCAM institute, a place dedicated to the coupling of art with the sciences of sound and media. Special attention will be given to the following thematic categories:
• Musical interfaces
• Interactive sound design
• Sensorimotor learning and gesture-sound interactive systems
• Crowdsourcing and human computation approaches in artistic applications

The three-day summer school will include invited lectures by experts in the field, a round-table and practical workshops. During the workshops, participants will engage in hands-on HCC group projects that they will present at the end of the summer school.

Program committee

• Isabelle Viaud-Delmon, Acoustic and cognitive spaces team, CNRS - IRCAM, France.
• Andrea Gaggioli, Department of Psychology, UCSC, Milan, Italy.
• Stephen Dunne, Neuroscience Department, STARLAB, Barcelona, Spain.
• Alois Ferscha, Pervasive computing lab, Johannes Kepler Universität Linz, Austria.
• Fivos Maniatakos, Acoustic and Cognitive Spaces Group, IRCAM, France.

Organisation committee

• Isabelle Viaud-Delmon, IRCAM
• Hugues Vinet, IRCAM
• Marine Taffou, IRCAM
• Sylvie Benoit, IRCAM
• Fivos Maniatakos, IRCAM

Aug 04, 2012

The Age of ‘Wearatronics’

Medgadget has an interesting article on the rise of ‘Wearatronics’, a new trend in which new materials and interconnects have made circuit assemblies flexible and, as a result, embeddable. Such flexible electronic arrays may be embedded into textiles in order to, for example, measure the wearer’s vital signs or even generate and store power.

According to GigaOm's research report The wearable-computing market: a global analysis, the wearatronics market for health and fitness products alone is estimated to reach 170 million devices within the next five years.

In this video, Bloomberg's Sheila Dharmarajan reports on the outlook for wearable electronics on Bloomberg Television's "Bloomberg West." (Source: Bloomberg)

Also interesting is this TED talk by David Icke, who creates breathable, implantable microcomputers that conform to the human body and can be used for a variety of medical applications.

Aug 03, 2012

Around 40 researchers attended the 1st Summer School on Human Computer Confluence in Milan

I would like to thank everyone involved but particularly the students and speakers who made this a very successful and enjoyable event. See you all next year in Paris!

1st Summer School on Human Computer Confluence

Jul 14, 2012

Robot avatar body controlled by thought alone

Via: New Scientist

For the first time, a person lying in an fMRI machine has controlled a robot hundreds of kilometers away using thought alone.

"The ultimate goal is to create a surrogate, like in Avatar, although that’s a long way off yet,” says Abderrahmane Kheddar, director of the joint robotics laboratory at the National Institute of Advanced Industrial Science and Technology in Tsukuba, Japan.

Teleoperated robots, those that can be remotely controlled by a human, have been around for decades. Kheddar and his colleagues are going a step further. “True embodiment goes far beyond classical telepresence, by making you feel that the thing you are embodying is part of you,” says Kheddar. “This is the feeling we want to reach.”

To attempt this feat, researchers with the international Virtual Embodiment and Robotic Re-embodiment project used fMRI to scan the brain of university student Tirosh Shapira as he imagined moving different parts of his body. He attempted to direct a virtual avatar by thinking of moving his left or right hand or his legs.

The scanner works by measuring changes in blood flow to the brain’s primary motor cortex, and using this the team was able to create an algorithm that could distinguish between each thought of movement. The commands were then sent via an internet connection to a small robot at the Béziers Technology Institute in France.

The set-up allowed Shapira to control the robot in near real time with his thoughts, while a camera on the robot’s head allowed him to see from the robot’s perspective. When he thought of moving his left or right hand, the robot moved 30 degrees to the left or right. Imagining moving his legs made the robot walk forward.
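
The loop described here — classify each fMRI volume as one of three imagined movements, then transmit the matching motion command — can be sketched with an off-the-shelf classifier. The synthetic voxel data, logistic-regression model and command strings below are illustrative assumptions, not the project's actual pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

COMMANDS = {0: "TURN_LEFT_30", 1: "TURN_RIGHT_30", 2: "WALK_FORWARD"}  # hypothetical

# Training data: motor-cortex voxel activation patterns recorded while
# the subject imagines left-hand / right-hand / leg movement (synthetic here).
rng = np.random.default_rng(1)
n_voxels = 200
centers = rng.normal(size=(3, n_voxels))
X = np.vstack([c + 0.5 * rng.normal(size=(40, n_voxels)) for c in centers])
y = np.repeat([0, 1, 2], 40)

clf = LogisticRegression(max_iter=1000).fit(X, y)

def volume_to_command(activation: np.ndarray) -> str:
    """Classify one fMRI volume and pick the robot command to transmit."""
    label = clf.predict(activation.reshape(1, -1))[0]
    return COMMANDS[label]

new_volume = centers[2] + 0.5 * rng.normal(size=n_voxels)  # imagined leg movement
print(volume_to_command(new_volume))  # -> "WALK_FORWARD" (usually)
```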

Read further at: http://www.newscientist.com/article/mg21528725.900-robot-avatar-body-controlled-by-thought-alone.html

Mar 31, 2012

Microsoft patents projector eyewear for Xbox and beyond

Via KurzweilAI

Illustrations from Microsoft's patent show the rough schematics for both a helmet-based display and one embedded in a pair of glasses (credit: Microsoft)

According to Patent Bolt, Microsoft has been secretly working on a video headset since September 2010.

A new Microsoft patent reveals that they’ve been working on two styles of headset: an aviation-styled helmet aimed at Xbox gamers, and one that resembles a pair of sunglasses for use with smartphones, MP3 players and other future devices.

In the patent, Microsoft states that a compact display system may be coupled into goggles, a helmet, or other eyewear. These configurations enable the wearer to view images from a computer, media player, or other electronic device with privacy and mobility. When adapted to display two different images concurrently — one for each eye — the system may be used for stereoscopic display (e.g., virtual-reality) applications.

Mar 22, 2012

1st Summer School on Human Computer Confluence

The 1st Summer School on Human Computer Confluence will take place in Milan, Italy, on 18-20 July 2012.

The Summer School is hosted and organized by the Doctoral School in Psychology of the Faculty of Psychology at the Università Cattolica del Sacro Cuore di Milano.

enquiries: hcc.summerschool@unicatt.it

The specific objectives of the Summer School are to provide selected, highly motivated participants with hands-on experience of question-driven Human-Computer Confluence projects, applications and experimental paradigms, and to gather project leaders, researchers and students to work together on a list of interdisciplinary challenges in the field of HCC. Participants will be assigned to different teams, working creatively and collaboratively on specific topics of interest.

The 1st Summer School will admit up to 40 Ph.D. student attendees interested in the emerging symbiotic relationship between humans and computing devices. There is no registration fee for the Summer School, and financial aid towards travel and accommodation will be available for a significant number of students.

About Human Computer Confluence

HCC, Human-Computer Confluence, is an ambitious research program funded by the EU, studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding.

The initiative aims to investigate and demonstrate new possibilities emerging at the confluence between the human and technological realms. It will examine new modalities for individual and group perception, action and experience in augmented, virtual spaces. Such virtual spaces would span the virtual reality continuum, also extending to purely synthetic but believable representations of massive, complex and dynamic data. Human-Computer Confluence fosters interdisciplinary research (spanning presence research, neuroscience, machine learning and computer science) towards delivering unified experiences and inventing radically new forms of perception/action.

Dec 19, 2011

‘Brainlink’ lets you remotely control toy robots and other gadgets

BirdBrain Technologies, a spin-off of Carnegie Mellon University, has developed a device called Brainlink that allows users to remotely control robots and other gadgets (including TVs, cable boxes, and DVD players) with an Android-based smartphone. This is achieved through a small triangular controller that you attach to the gadget, with a Bluetooth range of 30 feet.
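
From the controlling device's side, driving such a dongle essentially means opening a Bluetooth serial (RFCOMM) link and writing command bytes that the dongle replays as infrared. A minimal desktop sketch using PyBluez; the MAC address and command bytes are hypothetical, not Brainlink's documented protocol:

```python
import bluetooth  # PyBluez

BRAINLINK_ADDR = "00:11:22:33:44:55"    # hypothetical device MAC address
IR_POWER_TOGGLE = bytes([0x49, 0x01])   # hypothetical "replay stored IR code #1"

def send_ir_command(addr: str, command: bytes) -> None:
    """Open an RFCOMM link to the dongle and send one command frame."""
    sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
    try:
        sock.connect((addr, 1))  # RFCOMM channel 1
        sock.send(command)
    finally:
        sock.close()

send_ir_command(BRAINLINK_ADDR, IR_POWER_TOGGLE)
```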
