
Nov 16, 2013

Phonebloks

Phonebloks is a modular smartphone concept created by Dutch designer Dave Hakkens to reduce electronic waste. By attaching individual third-party components (called "bloks") to a main board, a user would create a personalized smartphone. These bloks can be replaced at will if they break or the user wishes to upgrade.
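The modular idea is easy to picture in software terms: a base board holding named slots, where attaching a new blok replaces the old one. The sketch below is purely illustrative; the class and blok names are hypothetical and not from the Phonebloks project.

```python
# Hypothetical sketch: a phone modeled as a board with swappable "bloks".
# All names here are illustrative, not from the actual Phonebloks design.

class Blok:
    def __init__(self, name, version=1):
        self.name = name
        self.version = version

class Board:
    def __init__(self):
        self.slots = {}          # slot name -> attached Blok

    def attach(self, slot, blok):
        self.slots[slot] = blok  # replaces any blok already in that slot

    def detach(self, slot):
        return self.slots.pop(slot, None)

phone = Board()
phone.attach("camera", Blok("8MP camera"))
phone.attach("battery", Blok("standard battery"))
# Upgrading means swapping a single blok, not replacing the whole phone:
phone.attach("camera", Blok("13MP camera", version=2))
```

The point of the sketch is the replacement semantics: upgrading or repairing touches one slot while the rest of the device is untouched.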

Stanford Center on Longevity competition challenges students to design products to help older adults

The design contest solicits entries from student teams worldwide and is aimed at finding solutions that help keep people with cognitive impairments independent for as long as possible.

The competition is currently accepting submissions in what is called Phase I of the challenge. Submitted concepts will be judged in January and finalists will be given financial help to flesh out their design and travel to Stanford to present it.

During Phase II, from January until April, finalists will also have access to mentors in different schools and centers at Stanford.

The final presentations, in April, will be before a panel of academics, industry professionals, nonprofit groups and investors.

The top prize is $10,000, while the second place team will take home $5,000 and third place will get $3,000.

 

Nov 03, 2013

The poetry of Everyday Objects

Artist Javier Pérez turns everyday objects into whimsical illustrations. Here are some of my favourites. Discover more on his Instagram account.

 

23:56 Posted in Cyberart | Permalink | Comments (0)

Neurocam wearable camera reads your brainwaves and records what interests you

Via KurzweilAI.net

The neurocam is the world’s first wearable camera system that automatically records what interests you, based on brainwaves, DigInfo TV reports.

It consists of a headset with a brain-wave sensor and uses the iPhone’s camera to record a 5-second GIF animation. It could also be useful for life-logging.

The algorithm for quantifying brain waves was co-developed by Associate Professor Mitsukura at Keio University.

The project team plans to create an emotional interface.
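The trigger logic described above, recording a short clip whenever a quantified "interest" score crosses a threshold, can be sketched as follows. The scoring scale, threshold value, and function names are assumptions for illustration; only the 5-second clip length comes from the article.

```python
# Illustrative sketch of an interest-triggered recorder: when the quantified
# "interest" score crosses a threshold, record a short clip. The 0-100 scale
# and the threshold are assumptions, not from the neurocam itself.

INTEREST_THRESHOLD = 60      # hypothetical score on a 0-100 scale
CLIP_SECONDS = 5             # the neurocam records 5-second GIF animations

def should_record(interest_score):
    return interest_score >= INTEREST_THRESHOLD

def process_stream(scores):
    """Return the indices (one score per second) where recording would start."""
    triggers = []
    recording_until = -1
    for i, score in enumerate(scores):
        # Don't start a new clip while the previous one is still recording.
        if i > recording_until and should_record(score):
            triggers.append(i)
            recording_until = i + CLIP_SECONDS
    return triggers
```

Gating on `recording_until` keeps a sustained burst of interest from producing overlapping clips.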

Oct 31, 2013

Brain Decoding

Via IEET

Neuroscientists are starting to decipher what a person is seeing, remembering and even dreaming just by looking at their brain activity. They call it brain decoding.  

In this Nature Video, we see three different uses of brain decoding, including a virtual reality experiment that could use brain activity to figure out whether someone has been to the scene of a crime.

Mobile EEG and its potential to promote the theory and application of imagery-based motor rehabilitation

Mobile EEG and its potential to promote the theory and application of imagery-based motor rehabilitation.

Int J Psychophysiol. 2013 Oct 18;

Authors: Kranczioch C, Zich C, Schierholz I, Sterr A

Abstract. Studying the brain in its natural state remains a major challenge for neuroscience. Solving this challenge would not only enable the refinement of cognitive theory, but also provide a better understanding of cognitive function in the type of complex and unpredictable situations that constitute daily life, and which are often disturbed in clinical populations. With mobile EEG, researchers now have access to a tool that can help address these issues. In this paper we present an overview of technical advancements in mobile EEG systems and associated analysis tools, and explore the benefits of this new technology. Using the example of motor imagery (MI) we will examine the translational potential of MI-based neurofeedback training for neurological rehabilitation and applied research.

Smart glasses that help the blind see

Via New Scientist

They look like snazzy sunglasses, but these computerised specs don't block the sun – they make the world a brighter place for people with partial vision.

These specs do more than bring blurry things into focus. This prototype pair of smart glasses translates visual information into images that blind people can see.

Many people who are registered as blind can perceive some light and motion. The glasses, developed by Stephen Hicks of the University of Oxford, are an attempt to make that residual vision as useful as possible.

They use two cameras, or a camera and an infrared projector, that can detect the distance to nearby objects. They also have a gyroscope, a compass and GPS to help orient the wearer.

The collected information can be translated into a variety of images on the transparent OLED displays, depending on what is most useful to the person sporting the shades. For example, objects can be made clearer against the background, or the distance to obstacles can be indicated by the varying brightness of an image.
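The distance-to-brightness idea can be made concrete with a toy mapping: nearer obstacles render brighter on the display. The range limits and the linear falloff below are assumptions for illustration, not the Oxford prototype's actual mapping.

```python
# Toy version of the depth-to-brightness mapping described above: nearer
# obstacles appear brighter. The limits and linear falloff are assumptions.

MIN_DEPTH_M = 0.3   # closer than this -> full brightness
MAX_DEPTH_M = 4.0   # farther than this -> display stays dark

def brightness_for_depth(depth_m):
    """Map a distance in metres to a 0.0-1.0 display brightness."""
    if depth_m <= MIN_DEPTH_M:
        return 1.0
    if depth_m >= MAX_DEPTH_M:
        return 0.0
    # Linear falloff between the two limits.
    return (MAX_DEPTH_M - depth_m) / (MAX_DEPTH_M - MIN_DEPTH_M)
```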

Hicks has won the Royal Society's Brian Mercer Award for Innovation for his work on the smart glasses. He plans to use the £50,000 prize money to add object and text recognition to the glasses' abilities.

 

Daniel Dennett – If Brains Are Computers, What Kind Of Computers Are They?

Source: Future of Humanity Institute

23:33 Posted in Blue sky | Permalink | Comments (0)

Signal Processing Turns Regular Headphones Into Pulse Sensors

Via Medgadget

A new signal processing algorithm that enables any pair of earphones to detect your pulse was demonstrated recently at the Healthcare Device Exhibition 2013 in Yokohama, Japan. The technology comes from a joint effort of Bifrostec (Tokyo, Japan) and the Kaiteki Institute. It is built on the premise that the eardrum creates pressure waves with each heartbeat, which can be detected in a perfectly enclosed space. However, earphones typically do not create a perfect seal, which is what gives everyone in a packed elevator the privilege of listening to that guy's tunes. The new algorithm processes the pressure signal to determine a user's pulse despite the lack of a perfect seal.
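Bifrostec's actual algorithm is not public, but the basic idea of recovering a pulse rate from a noisy periodic pressure signal can be sketched with simple peak counting. Everything below is a simplified stand-in for illustration.

```python
import math

# Simplified stand-in for pulse extraction from an in-ear pressure signal:
# count local maxima above the signal mean and convert to beats per minute.

def estimate_bpm(samples, sample_rate_hz):
    """Count peaks above the mean and convert to beats per minute."""
    mean = sum(samples) / len(samples)
    peaks = 0
    for i in range(1, len(samples) - 1):
        if samples[i] > mean and samples[i] > samples[i - 1] and samples[i] >= samples[i + 1]:
            peaks += 1
    duration_min = len(samples) / sample_rate_hz / 60.0
    return peaks / duration_min

# Synthetic 1.2 Hz "heartbeat" pressure wave, sampled at 50 Hz for 10 s:
rate = 50
signal = [math.sin(2 * math.pi * 1.2 * t / rate) for t in range(rate * 10)]
print(round(estimate_bpm(signal, rate)))  # 1.2 Hz -> 72 bpm
```

A real implementation would band-pass filter the signal around plausible heart-rate frequencies before peak detection, which is presumably where handling the imperfect seal comes in.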


Oct 25, 2013

Positive Technology at ICT 2013

Good news!

Our networking session proposal was accepted at the ICT 2013 Conference in Vilnius (6-8 November, 2013).

Title: Positive Technology: Steps Towards Ubiquitous Empowerment (07/11/2013, Booth 4, 18:00-19:30)

More than 5000 researchers, innovators, entrepreneurs, and industry representatives are expected to attend the conference. It is a great opportunity to explore the future development of Positive Technology within Horizon 2020.

ICT 2013

If you are also planning to attend the conference and are interested in participating in this special networking session, drop me a message here.

Sep 10, 2013

BITalino: Do More!

BITalino is a low-cost toolkit that allows anyone, from students to professional developers, to create projects and applications with physiological sensors. Out of the box, BITalino integrates easy-to-use software and hardware blocks with sensors for electrocardiography (ECG), electromyography (EMG), electrodermal activity (EDA), an accelerometer, and ambient light. Imagination is the limit: each individual block can be snapped off and combined to prototype anything you want. You can also connect other sensors, including your own custom designs.

Sep 09, 2013

Effortless awareness: using real time neurofeedback to investigate correlates of posterior cingulate cortex activity in meditators' self-report

Effortless awareness: using real time neurofeedback to investigate correlates of posterior cingulate cortex activity in meditators' self-report.

Front Hum Neurosci. 2013;7:440

Authors: Garrison KA, Santoyo JF, Davis JH, Thornhill TA, Kerr CE, Brewer JA

Neurophenomenological studies seek to utilize first-person self-report to elucidate cognitive processes related to physiological data. Grounded theory offers an approach to the qualitative analysis of self-report, whereby theoretical constructs are derived from empirical data. Here we used grounded theory methodology (GTM) to assess how the first-person experience of meditation relates to neural activity in a core region of the default mode network-the posterior cingulate cortex (PCC). We analyzed first-person data consisting of meditators' accounts of their subjective experience during runs of a real time fMRI neurofeedback study of meditation, and third-person data consisting of corresponding feedback graphs of PCC activity during the same runs. We found that for meditators, the subjective experiences of "undistracted awareness" such as "concentration" and "observing sensory experience," and "effortless doing" such as "observing sensory experience," "not efforting," and "contentment," correspond with PCC deactivation. Further, the subjective experiences of "distracted awareness" such as "distraction" and "interpreting," and "controlling" such as "efforting" and "discontentment," correspond with PCC activation. Moreover, we derived several novel hypotheses about how specific qualities of cognitive processes during meditation relate to PCC activity, such as the difference between meditation and "trying to meditate." These findings offer novel insights into the relationship between meditation and mind wandering or self-related thinking and neural activity in the default mode network, driven by first-person reports.

Aug 28, 2013

Twitter reveals the happiest spots in New York

Sentiment in New York City: A High Resolution Spatial and Temporal View

Karla Z. Bertrand, Maya Bialik, Kawandeep Virdee, Andreas Gros, Yaneer Bar-Yam

http://arxiv.org/abs/1308.5010v1 (link to PDF full text)

Measuring public sentiment is a key task for researchers and policymakers alike. The explosion of available social media data allows for a more time-sensitive and geographically specific analysis than ever before. In this paper we analyze data from the micro-blogging site Twitter and generate a sentiment map of New York City. We develop a classifier specifically tuned for 140-character Twitter messages, or tweets, using key words, phrases and emoticons to determine the mood of each tweet. This method, combined with geotagging provided by users, enables us to gauge public sentiment on extremely fine-grained spatial and temporal scales. We find that public mood is generally highest in public parks and lowest at transportation hubs, and locate other areas of strong sentiment such as cemeteries, medical centers, a jail, and a sewage facility. Sentiment progressively improves with proximity to Times Square. Periodic patterns of sentiment fluctuate on both a daily and a weekly scale: more positive tweets are posted on weekends than on weekdays, with a daily peak in sentiment around midnight and a nadir between 9:00 a.m. and noon.
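The keyword-and-emoticon approach described in the abstract can be illustrated with a toy scorer: count positive and negative markers in each tweet. The word lists below are made up for the example; the paper's actual lexicon is far larger and its classifier more sophisticated.

```python
# Toy keyword/emoticon sentiment scorer for tweets. The marker lists are
# illustrative only, not the lexicon used in the paper.

POSITIVE = {"happy", "love", "great", "fun", ":)", ":-)", ":d"}
NEGATIVE = {"sad", "hate", "awful", "delayed", ":(", ":-("}

def tweet_sentiment(text):
    """Return a score: positive > 0, negative < 0, neutral == 0."""
    tokens = text.lower().split()
    return sum((t in POSITIVE) - (t in NEGATIVE) for t in tokens)

print(tweet_sentiment("Love this great day in the park :)"))  # 3
print(tweet_sentiment("train delayed again :("))              # -2
```

Combined with each tweet's geotag, per-tweet scores like these can be aggregated into the kind of spatial sentiment map the paper builds for New York City.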

Aug 07, 2013

On Phenomenal Consciousness

A recent introductory talk on the problem that consciousness and qualia present to physicalism, by Frank C. Jackson.

Welcome to wonderland: the influence of the size and shape of a virtual hand on the perceived size and shape of virtual objects

Welcome to wonderland: the influence of the size and shape of a virtual hand on the perceived size and shape of virtual objects.

PLoS One. 2013;8(7):e68594

Authors: Linkenauger SA, Leyrer M, Bülthoff HH, Mohler BJ

The notion of body-based scaling suggests that our body and its action capabilities are used to scale the spatial layout of the environment. Here we present four studies supporting this perspective by showing that the hand acts as a metric which individuals use to scale the apparent sizes of objects in the environment. However to test this, one must be able to manipulate the size and/or dimensions of the perceiver's hand which is difficult in the real world due to impliability of hand dimensions. To overcome this limitation, we used virtual reality to manipulate dimensions of participants' fully-tracked, virtual hands to investigate its influence on the perceived size and shape of virtual objects. In a series of experiments, using several measures, we show that individuals' estimations of the sizes of virtual objects differ depending on the size of their virtual hand in the direction consistent with the body-based scaling hypothesis. Additionally, we found that these effects were specific to participants' virtual hands rather than another avatar's hands or a salient familiar-sized object. While these studies provide support for a body-based approach to the scaling of the spatial layout, they also demonstrate the influence of virtual bodies on perception of virtual environments.

Using avatars to model weight loss behaviors: participant attitudes and technology development.

Using avatars to model weight loss behaviors: participant attitudes and technology development.

J Diabetes Sci Technol. 2013;7(4):1057-65

Authors: Napolitano MA, Hayes S, Russo G, Muresu D, Giordano A, Foster GD

BACKGROUND: Virtual reality and other avatar-based technologies are potential methods for demonstrating and modeling weight loss behaviors. This study examined avatar-based technology as a tool for modeling weight loss behaviors. METHODS: This study consisted of two phases: (1) an online survey to obtain feedback about using avatars for modeling weight loss behaviors and (2) technology development and usability testing to create an avatar-based technology program for modeling weight loss behaviors. RESULTS: Results of phase 1 (n = 128) revealed that interest was high, with 88.3% stating that they would participate in a program that used an avatar to help practice weight loss skills in a virtual environment. In phase 2, avatars and modules to model weight loss skills were developed. Eight women were recruited to participate in a 4-week usability test, with 100% reporting they would recommend the program and that it influenced their diet/exercise behavior. Most women (87.5%) indicated that the virtual models were helpful. After 4 weeks, average weight loss was 1.6 kg (standard deviation = 1.7). CONCLUSIONS: This investigation revealed a high level of interest in an avatar-based program, with formative work indicating promise. Given the high costs associated with in vivo exposure and practice, this study demonstrates the potential use of avatar-based technology as a tool for modeling weight loss behaviors.

What Color is My Arm? Changes in Skin Color of an Embodied Virtual Arm Modulates Pain Threshold

What Color is My Arm? Changes in Skin Color of an Embodied Virtual Arm Modulates Pain Threshold.

Front Hum Neurosci. 2013;7:438

Authors: Martini M, Perez-Marcos D, Sanchez-Vives MV

It has been demonstrated that visual inputs can modulate pain. However, the influence of skin color on pain perception is unknown. Red skin is associated with inflamed, hot and more sensitive skin, while blue is associated with cyanotic, cold skin. We aimed to test whether the color of the skin would alter the heat pain threshold. To this end, we used an immersive virtual environment where we induced embodiment of a virtual arm that was co-located with the real one and seen from a first-person perspective. Virtual reality allowed us to dynamically modify the color of the skin of the virtual arm. In order to test pain threshold, increasing ramps of heat stimulation applied on the participants' arm were delivered concomitantly with the gradual intensification of different colors on the embodied avatar's arm. We found that a reddened arm significantly decreased the pain threshold compared with normal and bluish skin. This effect was specific when red was seen on the arm, while seeing red in a spot outside the arm did not decrease pain threshold. These results demonstrate an influence of skin color on pain perception. This top-down modulation of pain through visual input suggests a potential use of embodied virtual bodies for pain therapy.

Full text open access

What Color Is Your Night Light? It May Affect Your Mood

When it comes to some of the health hazards of light at night, a new study suggests that the color of the light can make a big difference.

Read full story on Science Daily

Phubbing: the war against anti-social phone use

Via Textually.org

Don't you just hate it when someone snubs you by looking at their phone instead of paying attention? The Stop Phubbing campaign group certainly does. The Guardian reports.

In a list of "Disturbing Phubbing Stats" on their website, of note:

-- If phubbing were a plague it would decimate six Chinas

-- 97% of people claim their food tasted worse while being a victim of phubbing

-- 92% of repeat phubbers go on to become politicians

So it's really just a joke site? Well, a joke site with a serious message about our growing estrangement from our fellow human beings. But mostly a joke site, yes.

Read full article.

The Computer Game That Helps Therapists Chat to Adolescents With Mental Health Problems

Via MIT Technology Review

Adolescents with mental health problems are particularly hard for therapists to engage. But a new computer game is providing a healthy conduit for effective communication between them.

Read the full story on MIT Technology Review