Jul 29, 2014
Ekso is an exoskeleton bionic suit or "wearable robot" designed to enable individuals with lower-extremity paralysis to stand up and walk over ground with a weight-bearing, four-point reciprocal gait. Walking is achieved by the user's forward-lateral weight shift, which initiates a step. Battery-powered motors drive the legs and replace lost neuromuscular function.
Ekso Bionics http://eksobionics.com/
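The step-initiation logic described above can be sketched as a simple threshold test on the user's centre of pressure. This is a purely hypothetical illustration of the idea, not Ekso's actual controller; the function name, sensor model, and threshold values are all assumptions:

```python
# Hypothetical sketch of weight-shift step triggering (not Ekso's algorithm):
# a step is initiated once the centre of pressure, measured by foot force
# sensors, has shifted far enough forward and toward the stance leg.

def step_triggered(cop_x, cop_y, fwd_thresh=0.04, lat_thresh=0.06):
    """cop_x: forward shift (m); cop_y: lateral shift toward the stance leg (m).

    Thresholds are illustrative values in metres.
    """
    return cop_x > fwd_thresh and cop_y > lat_thresh

print(step_triggered(0.05, 0.08))  # shift exceeds both thresholds: step begins
print(step_triggered(0.01, 0.08))  # not enough forward shift: no step
```

In a real device the thresholds would be tuned per user and combined with timing and stability checks, but the core trigger is this kind of forward-lateral condition.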
Dec 08, 2013
University at Buffalo researchers are developing brain-computer interface (BCI) devices to mentally control robots.
“The technology has practical applications that we’re only beginning to explore,” said Thenkurussi “Kesh” Kesavadas, PhD, UB professor of mechanical and aerospace engineering and director of UB’s Virtual Reality Laboratory. “For example, it could help paraplegic patients to control assistive devices, or it could help factory workers perform advanced manufacturing tasks.”
Most BCI research has involved expensive, invasive devices implanted in the brain, used mostly to help people with disabilities.
UB research relies on a relatively inexpensive ($750), non-invasive external device (Emotiv EPOC). It reads EEG brain activity with 14 sensors and transmits the signal wirelessly to a computer, which then sends signals to the robot to control its movements.
Kesavadas recently demonstrated the technology with Pramod Chembrammel, a doctoral student in his lab. Chembrammel trained with the instrument for a few days, then used the device to control a robotic arm.
He used the arm to insert a wood peg into a hole and rotate the peg. “It was incredible to see the robot respond to my thoughts,” Chembrammel said. “It wasn’t even that difficult to learn how to use the device.”
The video (below) shows that a simple set of instructions can be combined to execute more complex robotic actions, Kesavadas said. Such robots could be used by factory workers to perform hands-free assembly of products, or carry out tasks like drilling or welding.
The potential advantage, Kesavadas said, is that BCI-controlled devices could reduce the tedium of performing repetitious tasks and improve worker safety and productivity. The devices can also leverage the worker’s decision-making skills, such as identifying a faulty part in an automated assembly line.
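Kesavadas's point that a simple set of instructions can be combined into more complex robotic actions can be sketched in a few lines. The command names and motion primitives below are hypothetical, not the UB system's actual vocabulary; the idea is only that discrete classified mental commands compose into a motion plan:

```python
# Hypothetical sketch: composing simple EEG-classified commands into a more
# complex robot action, as in the peg-insertion demo. Command names and
# displacement values are illustrative, not the UB system's API.

PRIMITIVES = {
    "push":  (0.0, 0.0, -0.05),   # move gripper 5 cm down
    "pull":  (0.0, 0.0, +0.05),   # move gripper 5 cm up
    "left":  (-0.05, 0.0, 0.0),
    "right": (+0.05, 0.0, 0.0),
}

def plan(commands, start=(0.0, 0.0, 0.0)):
    """Translate a stream of classified mental commands into arm waypoints."""
    x, y, z = start
    waypoints = []
    for cmd in commands:
        dx, dy, dz = PRIMITIVES[cmd]
        x, y, z = x + dx, y + dy, z + dz
        waypoints.append((round(x, 2), round(y, 2), round(z, 2)))
    return waypoints

# e.g. lowering the arm toward the peg hole, then shifting sideways:
print(plan(["push", "push", "right"]))
```

Chaining a handful of such primitives is how a low-bandwidth, noisy command channel like consumer EEG can still drive a multi-step manipulation task.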
Nov 24, 2013
Great BBC documentary (40')
Apr 05, 2013
Researchers at Vanderbilt University are studying the potential benefits of using human-looking robots as tools to help kids with autism spectrum disorder (ASD) improve their communication skills. The programmable NAO robot used in the study was developed by Aldebaran Robotics out of Paris, France, and offers the ability to be part of a larger, smarter system.
Though a child might feel like the pink-eyed humanoid is an autonomous being, the NAO robot that the team is using is actually hooked up to computers and external cameras that track the child's movements. Using the newly developed ARIA (Adaptive Robot-Mediated Intervention Architecture) protocol, they found that children paid more attention to NAO and followed along in exercises almost as well as with a human adult therapist.
Mar 03, 2013
“The new technology is a major breakthrough that has many advantages over current technology, which provides very limited functionality to patients with missing limbs,” Brånemark says.
Presently, robotic prostheses rely on electrodes placed over the skin to pick up the muscles' electrical activity and drive a few actions of the prosthesis. The problem with this approach is that normally only two functions are regained out of the tens of different movements an able-bodied person is capable of. By using implanted electrodes, more signals can be retrieved, so control of more movements becomes possible. Furthermore, it is also possible to provide the patient with natural perception, or "feeling", through neural stimulation.
“We believe that implanted electrodes, together with a long-term stable human-machine interface provided by the osseointegrated implant, is a breakthrough that will pave the way for a new era in limb replacement,” says Rickard Brånemark.
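Why more electrodes translate into more recoverable movements can be illustrated with a toy pattern classifier: each extra channel adds a dimension to the feature vector, so more movement classes become separable. This is a deliberately simplified sketch, not the actual control system; the channel count, feature choice, and template values are all made up:

```python
# Illustrative sketch (not the actual prosthesis controller): with more EMG
# channels, even a simple nearest-centroid classifier can separate more
# movement classes. Features are mean absolute value (MAV) per channel.

import math

def mav(channel):
    """Mean absolute value of one channel's samples."""
    return sum(abs(s) for s in channel) / len(channel)

def classify(sample, templates):
    """Nearest-centroid classification of a multi-channel MAV feature vector."""
    feats = [mav(ch) for ch in sample]
    return min(templates, key=lambda name: math.dist(feats, templates[name]))

# Toy MAV templates for a hypothetical 4-channel implanted setup:
templates = {
    "hand_open":  [0.8, 0.1, 0.1, 0.1],
    "hand_close": [0.1, 0.8, 0.1, 0.1],
    "wrist_flex": [0.1, 0.1, 0.8, 0.1],
}

# A recording where channel 1 is strongly active:
sample = [[0.7, -0.9], [0.1, -0.1], [0.1, 0.1], [0.0, 0.2]]
print(classify(sample, templates))  # dominant channel 1 -> "hand_open"
```

With only two surface channels, these three templates would collapse onto each other and only two commands could be distinguished reliably; implanted electrodes make the extra dimensions available.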
Read full story
The Japanese communication robot destined to join the crew aboard the International Space Station (ISS) this summer recently underwent some zero gravity testing. The Kibo Robot Project, organized by Dentsu Inc. in response to a proposal made by the Japan Aerospace Exploration Agency, unveiled the final design of its diminutive humanoid robot and its Earthbound counterpart.
Watch the video:
Oct 27, 2012
DARPA has announced the start of the next DARPA Robotics Challenge. This time, the goal is to develop ground robots that perform complex tasks in "dangerous, degraded human-engineered environments". That means robots that perform humanitarian, disaster relief operations. The robots must use standard human hand tools and vehicles to navigate a debris field, open doors, climb ladders, and break through a concrete wall. Most but not all of the robots will be humanoid in design.
The challenge is divided into two parts, with a Virtual Robotics Challenge scheduled for 10 - 24 June, 2013 to test simulated robots, and the actual DARPA Robotics Challenge scheduled for 21 December, 2013. DARPA has adopted the free software Gazebo simulator, which supports ROS. There are two competition "tracks": competitors in Track A will develop their own humanoid robot and control software, while competitors in Track B will develop control software that runs on a DARPA-supplied Atlas robot built by Boston Dynamics. University teams are already announcing their participation. Read on for more info about some of the teams, as well as some awesome photos and videos of the robots in action.
May 07, 2012
Researchers at the Federal Institute of Technology in Lausanne, Switzerland (EPFL), have successfully demonstrated a robot controlled by the mind of a partially quadriplegic patient in a hospital 62 miles away. The EPFL brain-computer interface system does not require invasive neural implants in the brain, since it is based on a special EEG cap fitted with electrodes that record the patient's neural signals. The task of the patient is to imagine moving his paralyzed fingers, and this input is then translated by the BCI system into commands for the robot.
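The "imagine moving your fingers" step exploits a well-known EEG effect: power in the mu band (roughly 8-12 Hz) over motor cortex drops during motor imagery. The detector below is a toy version of that idea, using a naive DFT power estimate; the sampling rate, threshold, and signal model are illustrative assumptions, not EPFL's actual pipeline:

```python
# Hedged sketch: motor-imagery BCIs commonly detect a drop in mu-band
# (8-12 Hz) EEG power when movement is imagined. This toy detector uses a
# naive DFT power estimate; all parameters are illustrative.

import math

def band_power(signal, fs, lo, hi):
    """Sum of DFT power over bins whose frequency falls in [lo, hi] Hz."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def imagery_detected(signal, baseline_power, fs=128, drop=0.5):
    """Trigger a robot command when mu power falls below half the resting baseline."""
    return band_power(signal, fs, 8, 12) < drop * baseline_power

fs = 128
rest = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]  # strong 10 Hz rhythm at rest
imagery = [0.2 * s for s in rest]                                # attenuated rhythm during imagery
baseline = band_power(rest, fs, 8, 12)
print(imagery_detected(imagery, baseline))  # True: mu power dropped
print(imagery_detected(rest, baseline))     # False: still at baseline
```

A real system would use many channels, spatial filtering, and a trained classifier rather than a fixed threshold, but the underlying signal is this band-power drop.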
Nov 26, 2011
Today at 10:02 am, the latest Mars rover, Curiosity, was launched into deep space. The $2.5 billion exploratory system started its eight-month journey to Mars, where it will spend another two years researching the conditions for (past or future) life. The nuclear-powered Curiosity is much larger than any previous Mars rover and five times heavier. Its equipment includes a drill on a 2.1-meter arm and a laser to vaporize rocks for easier onboard analysis.
When I first watched this video this morning I was really amazed by the technology, the landing strategy, and the terrific level of sophistication of the rover system. Then I thought to myself: if there is enough brainpower on earth to make this vision a reality, then it must also be possible to work out a solution for the global economy!
Jul 27, 2011
The poetry of technology..
Oct 19, 2010
Dance Robot LIVE! is a performance recently shown at the Digital Content Expo in Tokyo. The performance features AIST's feminine HRP-4C robot and four humans. The routine was produced by renowned dancer/choreographer SAM-san and the lip-synced song is a Vocaloid version of "Deatta Koro no Yō ni" by Kaori Mochida (Every Little Thing).
Oct 07, 2010
Sep 19, 2010
Research groups at Stanford University and the University of California at Berkeley are developing sensor-based artificial skin that could provide prosthetic and robotic limbs with a realistic sense of touch. Stanford's project is based on organic electronics and is capable of detecting the weight of a fly upon the artificial skin, according to Zhenan Bao, professor of chemical engineering at Stanford.
The highly sensitive surfaces could also help robots pick up delicate objects without breaking them, improve surgeons' control over tools used for minimally invasive surgery, and increase efficiency of touch screen devices, she noted. Meanwhile, UC Berkeley's "e-skin" uses low-power, integrated arrays of nanowire transistors, according to UC Berkeley Professor of Electrical Engineering and Computer Science Ali Javey.
Thus far, the skin, the first ever made out of inorganic single crystalline semiconductors, is able to detect pressure equivalent to the touch of a keyboard. "It's a technique that can be potentially scaled up," said study lead author Kuniharu Takei, post-doctoral fellow in electrical engineering and computer sciences at UC Berkeley. "The limit now to the size of the e-skin we developed is the size of the processing tools we are using."
Apr 07, 2010
Researchers from the Intelligent Robotics Laboratory at Osaka University have teamed up with robot maker Kokoro Co., Ltd. to create a realistic-looking remote-control female android that mimics the facial expressions and speech of a human operator.
Modeled after a woman in her twenties, the android has long black hair, soft silicone skin, and a set of lifelike teeth that allow her to produce a natural smile.
According to the developers, the robot's friendly and approachable appearance makes her suitable for receptionist work at sites such as museums. The researchers also plan to test her ability to put hospital patients at ease.
The research is being led by Osaka University professor Hiroshi Ishiguro, who is known for creating teleoperated robot twins such as the celebrated Geminoid HI-1, which was modeled after himself.
The new Geminoid F can produce facial expressions more naturally than its predecessors and it does so with a much more efficient design. While the previous Geminoid HI-1 model was equipped with 46 pneumatic actuators, the Geminoid F uses only 12.
In addition, the entire air servo control system is housed within the robot body and is powered by a small external compressor that runs on standard household electricity.
The Geminoid F's easy-to-use teleoperation system, which was developed by ATR Intelligent Robotics and Communication Laboratories, consists of a smart camera that tracks the operator's facial movements. The corresponding data is relayed to the robot's control system, which coordinates the movement of the pneumatic actuators to reproduce the expressions on the android's face.
The efficient design makes the robot much cheaper to produce than previous models. Kokoro plans to begin selling copies of the Geminoid F next month for about 10 million yen ($110,000) each.
Dec 08, 2009
The world's first robot with its own Facebook page (and that can use its information in conversations with "friends") has been developed by Nikolaos Mavridis and collaborators from the Interactive Robots and Media Lab at the United Arab Emirates University.
The main hypothesis of the FaceBots project is that long-term human-robot interaction will benefit from references to "shared memories" and "events relevant to shared friends" in human-robot dialogues.
More to explore:
N. Mavridis, W. Kazmi and P. Toulis, "Friends with Faces: How Social Networks Can Enhance Face Recognition and Vice Versa", contributed book chapter to Computational Social Networks Analysis: Trends, Tools and Research Advances, Springer Verlag, 2009. pdf
N. Mavridis, W. Kazmi, P. Toulis, C. Ben-AbdelKader, "On the synergies between online social networking, Face Recognition, and Interactive Robotics", CaSoN 2009. pdf
N. Mavridis, C. Datta et al, "Facebots: Social robots utilizing and publishing social information in Facebook", IEEE HRI 2009. pdf
Sep 25, 2009
Via Pink Tentacle
Miruko is a camera robot in the shape of an eyeball capable of tracking objects and faces. Worn on the player’s sleeve, Miruko’s roving eye scans the surroundings in search of virtual monsters that are invisible to the naked human eye. When a virtual monster is spotted, the mechanical eyeball rolls around in its socket and fixes its gaze on the monster’s location. By following Miruko’s line of sight, the player is able to locate the virtual monster and “capture” it via his or her iPhone camera.
In this video, Miruko’s creators demonstrate how the robotic eyeball can be used as an interface for a virtual monster-hunting game played in a real-world environment.
According to its creators, Miruko can be used for augmented reality games, security, and navigation.
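The core of Miruko's behaviour, pointing the eyeball at a target location, is a small geometry problem: convert the target's position relative to the eye into pan and tilt angles. The sketch below assumes a simple two-axis gimbal model, which is my illustration rather than Miruko's firmware:

```python
# Assumed geometry sketch (not Miruko's firmware): fixing the gaze on a
# target means converting its position relative to the eye into pan/tilt
# angles for a two-axis eyeball mechanism.

import math

def gaze_angles(target, eye=(0.0, 0.0, 0.0)):
    """Return (pan, tilt) in degrees to fix the gaze on a 3-D target (metres)."""
    dx, dy, dz = (t - e for t, e in zip(target, eye))
    pan = math.degrees(math.atan2(dy, dx))                   # left/right rotation
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # up/down rotation
    return round(pan, 1), round(tilt, 1)

print(gaze_angles((1.0, 1.0, 0.0)))  # target 45 degrees to the side, level: (45.0, 0.0)
```

The player then only needs to follow the resulting line of sight; the phone camera does the rest of the "capture".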
Sep 21, 2009
Japanese company Cyberdyne, with scientific support from Professor Sankai of Tsukuba University, has developed the Hybrid Assistive Limb (HAL), a device designed to help people walk or carry heavy loads. The assistive walking system weighs 10 kilograms and carries a battery on the back. Embedded sensors pick up, at the skin surface, the bioelectric signals the brain sends to the muscles. Thanks to these sensors, the system can help users move in the direction they are thinking. The walking speed is 1.8 km/h.
Watch the HAL in action in this video:
Jul 22, 2008
(Credit: Fraunhofer IPA)
Fraunhofer IPA's Care-O-bot 3 is a mobile service robot designed to assist people in everyday environments. Here are its major functionalities:
- Omnidirectional Navigation: Care-O-bot 3 has an omnidirectional platform, with four steered and driven wheels. This kinematic system enables the robot to move in any desired direction and therefore also safely to negotiate narrow passages.
- Safe Manipulation: Care-O-bot 3 is equipped with a highly flexible, commercial arm with seven degrees of freedom as well as with a three-finger hand. This makes it capable of gripping and operating a large number of different everyday objects.
- 3D Environment Detection: A multiplicity of sensors enables Care-O-bot 3 to detect the environment in which it is operating. These range from stereo vision colour cameras and laser scanners to a 3D depth-image camera.
- Software Architecture/Middleware: Several interlinked computers are used to evaluate and control the sensors and actuators inside the robot. The system resources are coordinated and managed by a specially developed middleware which controls communications between the individual processes and which reacts appropriately in the event of a malfunction.
- Human-Machine Interaction: The primary interface between Care-O-bot 3 and the user consists of a tray attached to the front of the robot, which carries objects for exchange between the human and the robot. The tray includes a touch screen and retracts automatically when not in use. A laser projector on the gripper also enables the robot to project information onto objects.
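The omnidirectional navigation point above follows from the kinematics of four steered, driven wheels: for any desired chassis motion, each wheel's steering angle and speed can be computed from the velocity of its contact point. The sketch below shows that general computation; the wheel positions are assumed values, and this is not Fraunhofer's actual controller:

```python
# Hedged sketch of omnidirectional kinematics with four steered, driven
# wheels (the general idea, not Fraunhofer's controller): each wheel's
# steering angle and speed follow from the desired chassis twist.

import math

# Wheel mount positions relative to the chassis centre, in metres (assumed).
WHEELS = [(0.25, 0.2), (0.25, -0.2), (-0.25, 0.2), (-0.25, -0.2)]

def wheel_commands(vx, vy, omega):
    """Return (steer_angle_rad, speed_m_s) per wheel for a chassis twist
    (vx, vy in m/s; omega in rad/s)."""
    cmds = []
    for (x, y) in WHEELS:
        # Velocity of the wheel contact point = chassis velocity + omega x r
        wvx = vx - omega * y
        wvy = vy + omega * x
        cmds.append((math.atan2(wvy, wvx), math.hypot(wvx, wvy)))
    return cmds

# Pure sideways motion: all four wheels steer to 90 degrees, same speed.
for angle, speed in wheel_commands(0.0, 0.3, 0.0):
    print(round(math.degrees(angle), 1), round(speed, 2))
```

Because every wheel both steers and drives, any (vx, vy, omega) combination is reachable, which is what lets the robot slip through narrow passages sideways.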
Mar 16, 2008
Passing the Turing test - the holy grail of AI (a human conversing with a computer can't tell it's not human) - may now be possible in a limited way with the world's fastest supercomputer (IBM's Blue Gene) and mimicking the behavior of a human-controlled avatar in a virtual world, according to AI experts at Rensselaer Polytechnic Institute. "We are building a knowledge base that corresponds to all of the relevant background for our synthetic character--where he went to school, what his family is like, and so on," said Selmer Bringsjord, head of Rensselaer's Cognitive Science Department and leader of the research project. The researchers plan to engineer, from the start, a full-blown intelligent character and converse with him in an interactive virtual environment, like Second Life.
read full article here