Apr 27, 2016
Apr 05, 2015
Recently, a growing number of opinion leaders have started to point out the potential risks associated with the rapid advancement of Artificial Intelligence. This shared concern has led an interdisciplinary group of scientists, technologists and entrepreneurs to sign an open letter (http://futureoflife.org/misc/open_letter/), drafted by the Future of Life Institute, which focuses on research priorities to be considered as Artificial Intelligence develops, as well as on the potential dangers posed by this paradigm.
The concern that machines may soon dominate humans, however, is not new: over the last thirty years, this topic has been widely represented in movies (e.g. Terminator, The Matrix), novels and various interactive arts. For example, Australian performance artist Stelarc has incorporated themes of cyborgization and other human-machine interfaces into his work, creating a number of installations that confront us with the question of where the human ends and technology begins.
In his well-received 2005 book “The Singularity Is Near: When Humans Transcend Biology” (Viking Penguin: New York), inventor and futurist Ray Kurzweil argued that Artificial Intelligence is one of the interacting forces that, together with genetics, robotics and nanotechnology, may soon converge to overcome our biological limitations and usher in the Singularity, during which human life will be irreversibly transformed. According to Kurzweil, this event will take place around 2045 and will probably represent the most extraordinary moment in all of human history.
Ray Kurzweil’s vision of the future of intelligence is at the forefront of the transhumanist movement, which considers scientific and technological advances as a means to augment human physical and cognitive abilities, with the final aim of improving and even extending life. According to transhumanists, however, the choice of whether to benefit from such enhancement options should generally reside with the individual. The concept of transhumanism has been criticized, among others, by the influential American philosopher of technology Don Ihde, who pointed out that no technology will ever be completely internalized, since any technological enhancement implies a compromise. Ihde has distinguished four different relations that humans can have with technological artifacts. In particular, in the “embodiment relation” a technology becomes (quasi)transparent, allowing a partial symbiosis of ourselves and the technology. In wearing eyeglasses, as Ihde exemplifies, I do not look “at” them but “through” them at the world: they are already assimilated into my body schema, withdrawing from my perceiving.
According to Ihde, there is a doubled desire which arises from such embodiment relations: “It is the doubled desire that, on one side, is a wish for total transparency, total embodiment, for the technology to truly ‘become me.’ (...) But that is only one side of the desire. The other side is the desire to have the power, the transformation that the technology makes available. Only by using the technology is my bodily power enhanced and magnified by speed, through distance, or by any of the other ways in which technologies change my capacities. (…) The desire is, at best, contradictory. I want the transformation that the technology allows, but I want it in such a way that I am basically unaware of its presence. I want it in such a way that it becomes me. Such a desire both secretly rejects what technologies are and overlooks the transformational effects which are necessarily tied to human-technology relations. This illusory desire belongs equally to pro- and anti-technology interpretations of technology.” (Ihde, D. (1990). Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana University Press, p. 75).
Despite the different philosophical stances and assumptions on what our future relationship with technology will look like, there is little doubt that these questions will become more pressing and acute in the coming years. In my personal view, technology should not be viewed as a means to replace human life, but as an instrument for improving it. As William S. Haney II suggests in his book “Cyberculture, Cyborgs and Science Fiction: Consciousness and the Posthuman” (Rodopi: Amsterdam, 2006), “each person must choose for him or herself between the technological extension of physical experience through mind, body and world on the one hand, and the natural powers of human consciousness on the other as a means to realize their ultimate vision.” (ix, Preface).
Nov 01, 2014
Nestle SA will enlist a thousand humanoid robots to help sell its coffee makers at electronics stores across Japan, becoming the first corporate customer for the chatty, bug-eyed androids unveiled in June by tech conglomerate SoftBank Corp.
Nestle has maintained healthy growth in Japan while many of its big markets are slowing, crediting a tradition of trying out off-beat marketing tactics in what is a small but profitable territory for the world's biggest food group.
The waist-high robot, developed by a French company and manufactured in Taiwan, was touted by Japan's SoftBank as capable of learning and expressing human emotions, and of serving as a companion or guide in a country that faces chronic labor shortages.
Nestle said on Wednesday it would initially commission 20 of the robots, called Pepper, in December to interact with customers and promote its coffee machines. By the end of next year, the maker of Nescafe coffee and KitKat chocolate bars plans to have the robots working at 1,000 stores.
"We hope this new type of made-in-Japan customer service will take off around the world," Nestle Japan President Kohzoh Takaoka said in a statement.
Nestle did not say how much it was paying for Pepper, which SoftBank has said would retail for 198,000 yen ($1,830). The robot is already greeting customers at more than 70 SoftBank mobile phone stores in Japan.
Among Nestle's most successful Japan-only initiatives is the Nescafe Ambassador system, in which individuals stock coffee pods and collect money for them at their offices in exchange for free use of machines and other perks. Nestle wants half a million "ambassadors" by 2020 - nearly quadruple the number now - as it expands into museums, beauty salons and even temples.
The Japanese unit has also developed hundreds of KitKat flavors including wasabi and green tea, and this year rolled out a KitKat that can be baked into cookies.
The latest creation from Aldebaran, Pepper is the first robot designed to live with humans.
Oct 12, 2014
MIT researchers have developed an algorithm for bounding that they've successfully implemented in a robotic cheetah. (Learn more: http://mitsha.re/1uHoltW)
I am not that impressed by the result though.
Aug 03, 2014
Jibo is a new robot from MIT roboticist Cynthia Breazeal. It is designed to be a social robot that you interact with like it’s another person in your home. The 28-centimetre, 3-kilogram “sociable robot” snaps family photos, handles video calling and acts as a digital concierge. Connected wirelessly to the Internet, Jibo sifts through messages, organizes your itinerary and orders takeout.
What people say about Jibo:
"JIBO's potential extends far beyond engaging in casual conversation and completing daily tasks." - Katie Couric, Yahoo News
"A Robot with a Little Humanity" - John Markoff, New York Times
"JIBO isn't an appliance, it's a companion, one that can interact and react with its human owners in ways that delight instead of disturb." - Lance Ulanoff, Mashable
"Move over, Siri, the JIBO robot is coming" - Maggie Lake, CNN
"This Friendly Robot Could One Day Be Your Family's Personal Assistant" - Christina Bonnington, WIRED
Jul 29, 2014
Ekso is an exoskeleton bionic suit or a "wearable robot" designed to enable individuals with lower extremity paralysis to stand up and walk over ground with a weight bearing, four point reciprocal gait. Walking is achieved by the user’s forward lateral weight shift to initiate a step. Battery-powered motors drive the legs and replace neuromuscular function.
Ekso Bionics http://eksobionics.com/
Dec 08, 2013
University at Buffalo researchers are developing brain-computer interface (BCI) devices to mentally control robots.
“The technology has practical applications that we’re only beginning to explore,” said Thenkurussi “Kesh” Kesavadas, PhD, UB professor of mechanical and aerospace engineering and director of UB’s Virtual Reality Laboratory. “For example, it could help paraplegic patients to control assistive devices, or it could help factory workers perform advanced manufacturing tasks.”
Most BCI research has involved expensive, invasive devices implanted in the brain, used mostly to help disabled people.
UB research relies on a relatively inexpensive ($750), non-invasive external device (Emotiv EPOC). It reads EEG brain activity with 14 sensors and transmits the signal wirelessly to a computer, which then sends signals to the robot to control its movements.
Kesavadas recently demonstrated the technology with Pramod Chembrammel, a doctoral student in his lab. Chembrammel trained with the instrument for a few days, then used the device to control a robotic arm.
He used the arm to insert a wood peg into a hole and rotate the peg. “It was incredible to see the robot respond to my thoughts,” Chembrammel said. “It wasn’t even that difficult to learn how to use the device.”
The video (below) shows that a simple set of instructions can be combined to execute more complex robotic actions, Kesavadas said. Such robots could be used by factory workers to perform hands-free assembly of products, or carry out tasks like drilling or welding.
The potential advantage, Kesavadas said, is that BCI-controlled devices could reduce the tedium of performing repetitious tasks and improve worker safety and productivity. The devices can also leverage the worker’s decision-making skills, such as identifying a faulty part in an automated assembly line.
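The pipeline described above — EEG sensors, a classifier, and a command channel to the robot — can be sketched in a few lines. This is purely illustrative: only the 14-channel count comes from the article, while the feature extraction, the left/right rule and the command names are invented for the example (the real Emotiv SDK and UB software differ).

```python
import random

# Hypothetical sketch of the EEG-to-robot pipeline: a 14-channel EEG
# window is reduced to per-channel features, and a toy rule maps them
# to a discrete robot command. Channel count matches the Emotiv EPOC;
# everything else is illustrative.

N_CHANNELS = 14
WINDOW = 128  # samples per channel (~1 s at the EPOC's 128 Hz rate)

def mean_power(channel):
    """Average signal power of one channel over the window."""
    return sum(x * x for x in channel) / len(channel)

def classify(window, threshold=0.2):
    """Toy rule: compare the mean power of the two sensor halves."""
    powers = [mean_power(ch) for ch in window]
    left = sum(powers[:7]) / 7
    right = sum(powers[7:]) / 7
    if left - right > threshold:
        return "ROTATE_LEFT"
    if right - left > threshold:
        return "ROTATE_RIGHT"
    return "HOLD"

# Simulated window: Gaussian noise stands in for real EEG samples.
rng = random.Random(42)
window = [[rng.gauss(0, 1) for _ in range(WINDOW)] for _ in range(N_CHANNELS)]
print(classify(window))
```

In a real system the classifier would be trained on the operator's own recordings, which is why Chembrammel needed a few days of practice before controlling the arm.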
Nov 24, 2013
Great BBC documentary (40')
Apr 05, 2013
Researchers at Vanderbilt University are studying the potential benefits of using human-looking robots as tools to help kids with autism spectrum disorder (ASD) improve their communication skills. The programmable NAO robot used in the study was developed by Aldebaran Robotics out of Paris, France, and offers the ability to be part of a larger, smarter system.
Though a child might feel like the pink-eyed humanoid is an autonomous being, the NAO robot that the team is using is actually hooked up to computers and external cameras that track the child’s movements. Using the newly developed ARIA (Adaptive Robot-Mediated Intervention Architecture) protocol, they found that children paid more attention to NAO and followed along in exercises almost as well as with a human adult therapist.
Mar 03, 2013
“The new technology is a major breakthrough that has many advantages over current technology, which provides very limited functionality to patients with missing limbs,” Brånemark says.
Presently, robotic prostheses rely on electrodes placed over the skin to pick up the muscles’ electrical activity and drive a few actions by the prosthesis. The problem with this approach is that normally only two functions are regained, out of the tens of different movements an able-bodied person is capable of. By using implanted electrodes, more signals can be retrieved, and therefore control of more movements is possible. Furthermore, it is also possible to provide the patient with natural perception, or “feeling”, through neural stimulation.
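The key idea — more electrode channels make more movement classes separable — can be illustrated with a toy decoder. This is not the Brånemark team's method: the centroid values, movement names and four-channel feature vector are all invented for the example.

```python
import math

# Illustrative nearest-centroid decoder: a feature vector of
# per-channel muscle-activity levels is mapped to the movement class
# whose (hypothetical) training centroid lies closest to it.

CENTROIDS = {  # made-up per-movement averages, one value per channel
    "open_hand":    [0.9, 0.1, 0.2, 0.1],
    "close_hand":   [0.1, 0.9, 0.1, 0.2],
    "rotate_wrist": [0.2, 0.2, 0.8, 0.1],
    "pinch":        [0.1, 0.3, 0.1, 0.9],
}

def decode(features):
    """Return the movement whose centroid is nearest to the features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda m: dist(CENTROIDS[m], features))

print(decode([0.85, 0.15, 0.25, 0.1]))  # -> open_hand
```

With only two surface channels, most of these centroids would collapse onto each other, which is exactly the two-function limitation the article describes.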
“We believe that implanted electrodes, together with a long-term stable human-machine interface provided by the osseointegrated implant, is a breakthrough that will pave the way for a new era in limb replacement,” says Rickard Brånemark.
Read full story
The Japanese communication robot destined to join the crew aboard the International Space Station (ISS) this summer recently underwent some zero gravity testing. The Kibo Robot Project, organized by Dentsu Inc. in response to a proposal made by the Japan Aerospace Exploration Agency, unveiled the final design of its diminutive humanoid robot and its Earthbound counterpart.
Watch the video:
Oct 27, 2012
DARPA has announced the start of the next DARPA Robotics Challenge. This time, the goal is to develop ground robots that perform complex tasks in "dangerous, degraded human-engineered environments". That means robots that perform humanitarian, disaster relief operations. The robots must use standard human hand tools and vehicles to navigate a debris field, open doors, climb ladders, and break through a concrete wall. Most but not all of the robots will be humanoid in design.
The challenge is divided into two parts, with a Virtual Robotics Challenge scheduled for 10–24 June 2013 to test simulated robots and the actual DARPA Robotics Challenge scheduled for 21 December 2013. DARPA has adopted the free software Gazebo simulator, which supports ROS. There are two competition “tracks”: competitors in Track A will develop their own humanoid robot and control software, while competitors in Track B will develop control software that runs on a DARPA-supplied Atlas robot built by Boston Dynamics. University teams are already announcing their participation. Read on for more info about some of the teams, as well as some awesome photos and videos of the robots in action.
May 07, 2012
Researchers at the Federal Institute of Technology in Lausanne, Switzerland (EPFL), have successfully demonstrated a robot controlled by the mind of a partially quadriplegic patient in a hospital 62 miles away. The EPFL brain-computer interface system does not require invasive neural implants, since it is based on a special EEG cap fitted with electrodes that record the patient’s neural signals. The task of the patient is to imagine moving his paralyzed fingers; this input is then translated by the BCI system into commands for the robot.
Nov 26, 2011
Today at 10:02 am the latest Mars rover, Curiosity, was launched into deep space. The $2.5 billion exploratory system started its eight-month journey to Mars, where it will spend another two years researching the conditions for (past or future) life. The nuclear-powered Curiosity is much larger than any previous Mars rover and five times heavier. Its equipment includes a drill on a 2.1-meter arm and a laser to vaporize rocks for easier onboard analysis.
When I first watched this video this morning I was really amazed by the technology, the landing strategy and the terrific level of sophistication of the rover system. Then I thought to myself: if there is enough brainpower on Earth to make this vision a reality, then it must also be possible to work out a solution for the global economy!
Jul 27, 2011
The poetry of technology..
Oct 19, 2010
Dance Robot LIVE! is a performance recently shown at the Digital Content Expo in Tokyo. The performance features AIST's feminine HRP-4C robot and four humans. The routine was produced by renowned dancer/choreographer SAM-san and the lip-synced song is a Vocaloid version of "Deatta Koro no Yō ni" by Kaori Mochida (Every Little Thing).
Oct 07, 2010
Sep 19, 2010
Research groups at Stanford University and the University of California at Berkeley are developing sensor-based artificial skin that could provide prosthetic and robotic limbs with a realistic sense of touch. Stanford's project is based on organic electronics and is capable of detecting the weight of a fly upon the artificial skin, according to Zhenan Bao, professor of chemical engineering at Stanford.
The highly sensitive surfaces could also help robots pick up delicate objects without breaking them, improve surgeons' control over tools used for minimally invasive surgery, and increase efficiency of touch screen devices, she noted. Meanwhile, UC Berkeley's "e-skin" uses low-power, integrated arrays of nanowire transistors, according to UC Berkeley Professor of Electrical Engineering and Computer Science Ali Javey.
Thus far, the skin, the first ever made out of inorganic single crystalline semiconductors, is able to detect pressure equivalent to the touch of a keyboard. "It's a technique that can be potentially scaled up," said study lead author Kuniharu Takei, post-doctoral fellow in electrical engineering and computer sciences at UC Berkeley. "The limit now to the size of the e-skin we developed is the size of the processing tools we are using."
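At readout time, either skin resolves to the same abstraction: a grid of pressure values that is scanned for contacts. The following sketch is purely illustrative — the grid size, units and threshold are invented, not taken from the Stanford or Berkeley designs.

```python
# Illustrative readout of an e-skin sensor array: a grid of pressure
# readings (arbitrary units) is thresholded to find contact points.

def contact_points(grid, threshold=0.5):
    """Return (row, col) of every cell whose pressure exceeds threshold."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, value in enumerate(row)
            if value > threshold]

readings = [
    [0.0, 0.1, 0.0],
    [0.2, 0.9, 0.1],   # a light touch near the center
    [0.0, 0.1, 0.0],
]
print(contact_points(readings))  # -> [(1, 1)]
```

The engineering challenge the articles describe is not this readout logic but the sensor material itself: making each cell sensitive enough to register a fly's weight or a keystroke.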
Apr 07, 2010
Researchers from the Intelligent Robotics Laboratory at Osaka University have teamed up with robot maker Kokoro Co., Ltd. to create a realistic-looking remote-control female android that mimics the facial expressions and speech of a human operator.
Modeled after a woman in her twenties, the android has long black hair, soft silicone skin, and a set of lifelike teeth that allow her to produce a natural smile.
According to the developers, the robot’s friendly and approachable appearance makes her suitable for receptionist work at sites such as museums. The researchers also plan to test her ability to put hospital patients at ease.
The research is being led by Osaka University professor Hiroshi Ishiguro, who is known for creating teleoperated robot twins such as the celebrated Geminoid HI-1, which was modeled after himself.
The new Geminoid F can produce facial expressions more naturally than its predecessors and it does so with a much more efficient design. While the previous Geminoid HI-1 model was equipped with 46 pneumatic actuators, the Geminoid F uses only 12.
In addition, the entire air servo control system is housed within the robot body and is powered by a small external compressor that runs on standard household electricity.
Geminoid F’s easy-to-use teleoperation system, which was developed by ATR Intelligent Robotics and Communication Laboratories, consists of a smart camera that tracks the operator’s facial movements. The corresponding data is relayed to the robot control system, which coordinates the movement of the pneumatic actuators to reproduce the expressions on the android’s face.
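The mapping from tracked face to actuators can be sketched as follows. Only the count of 12 actuators comes from the article; the actuator names, the normalized 0–1 feature values and the neutral default are assumptions for the example.

```python
# Hypothetical teleoperation mapping: tracked facial-feature values
# from the operator (normalized 0..1) become setpoints for the
# android's 12 pneumatic actuators; untracked actuators hold neutral.

ACTUATORS = ["brow_l", "brow_r", "lid_l", "lid_r", "smile_l", "smile_r",
             "jaw", "lip_upper", "lip_lower", "neck_yaw", "neck_pitch",
             "neck_roll"]

def clamp(x, lo=0.0, hi=1.0):
    """Keep a setpoint within the actuator's safe range."""
    return max(lo, min(hi, x))

def to_setpoints(features):
    """Map tracked features to clamped setpoints, defaulting to 0.5."""
    return {a: clamp(features.get(a, 0.5)) for a in ACTUATORS}

tracked = {"smile_l": 0.8, "smile_r": 0.8, "jaw": 1.3}  # 1.3 is clamped
setpoints = to_setpoints(tracked)
print(setpoints["jaw"], setpoints["brow_l"])  # -> 1.0 0.5
```

Reducing 46 actuators to 12 means many such features drive a single actuator each, which is part of why the newer design is both simpler and cheaper.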
The efficient design makes the robot much cheaper to produce than previous models. Kokoro plans to begin selling copies of the Geminoid F next month for about 10 million yen ($110,000) each.
Dec 08, 2009
The world's first robot with its own Facebook page (and that can use its information in conversations with "friends") has been developed by Nikolaos Mavridis and collaborators from the Interactive Robots and Media Lab at the United Arab Emirates University.
The main hypothesis of the FaceBots project is that long-term human robot interaction will benefit by reference to "shared memories" and "events relevant to shared friends" in human-robot dialogues.
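A toy illustration of that hypothesis: the robot grounds its greeting in an event shared with the recognized friend. The data structure, names and wording are invented here, not taken from the FaceBots implementation.

```python
# Toy "shared memories" lookup: given a recognized friend, retrieve an
# event involving both that friend and the robot to anchor the dialogue.

MEMORIES = [
    {"friends": {"Alice", "robot"}, "event": "we demoed at the lab open day"},
    {"friends": {"Bob", "robot"}, "event": "we tested face recognition"},
]

def greeting(friend):
    """Greet the friend, referencing a shared memory if one exists."""
    for m in MEMORIES:
        if friend in m["friends"]:
            return f"Hi {friend}! Remember when {m['event']}?"
    return f"Hi {friend}! Nice to meet you."

print(greeting("Alice"))  # -> Hi Alice! Remember when we demoed at the lab open day?
```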
More to explore:
N. Mavridis, W. Kazmi and P. Toulis, "Friends with Faces: How Social Networks Can Enhance Face Recognition and Vice Versa", contributed book chapter to Computational Social Networks Analysis: Trends, Tools and Research Advances, Springer Verlag, 2009. pdf
N. Mavridis, W. Kazmi, P. Toulis, C. Ben-AbdelKader, "On the synergies between online social networking, Face Recognition, and Interactive Robotics", CaSoN 2009. pdf
N. Mavridis, C. Datta et al, "Facebots: Social robots utilizing and publishing social information in Facebook", IEEE HRI 2009. pdf