
Robots ape human behaviour to gain acceptance


Robots are not only performing ‘humane’ tasks but also becoming increasingly social

Robots are getting their act together to search for and rescue survivors of the massive earthquake that rocked Japan. Universities and robotics companies from around the world are pitching in with their own snake bots, aerial vehicles and rovers to unearth survivors, especially from debris that humans can’t reach.

Each of these robots has a distinct identity. Quince, for instance, is being deployed to sense chemical, biological, radiological or nuclear dangers in areas that firefighters cannot reach. Developed by the Chiba Institute of Technology and Tohoku University, Quince has a camera, microphone, Position Sensitive Detector (PSD) sensor, laser range finder, Wi-Fi, an infrared thermography camera and carbon-dioxide sensors to locate survivors by detecting body heat and breathing. It can move on its wheels at about 5.2 feet per second.

India’s Defence Research and Development Organisation (DRDO) has its own device, ‘Sanjeevani’ (which means life giver), for similar disasters. It is available as a portable pack, and its probe head can be used in air, water or mud. The initial version of Sanjeevani was first used to detect survivors after the Gujarat earthquake in 2001.

While extremely skilful, helpful and ‘humane’, such robots are typically remote-controlled, look like machines and do not socialise with human beings. They are similar to ‘assistive’ robots, which help us clean rugs (like the Roomba), wash windows or minimise human accidents by taking over dangerous, repetitive tasks on industrial shop floors.

To be categorised as ‘social’, these robots need autonomy. Examples of such robots include Kismet, Leonardo, Maggie, Domo, Mertz, AIBOs, Furbies, Cog and Paros. Another example is that of Honda’s ASIMO, also referred to as a humanoid robot, which celebrated its 10th birthday on October 31, 2010.

Universities the world over are now engaging in further research to make robots more social. Carnegie Mellon University (CMU), for instance, worked on ‘Vikia’, which had a flat-screen monitor displaying an animated computer graphics model of a female face. This face lip-synced dialogue using text-to-speech software and could be programmed to exhibit a variety of facial expressions. While Vikia is no longer actively participating in experiments, her hardware and software design formed the basis of the GRACE and Valerie projects.

Valerie was touted as one of the world’s first storytelling roboceptionists who sat in a specially-designed reception booth at CMU. Her sensors would alert her to the presence of people to whom she would offer assistance and directions. Valerie also spent time on the telephone, imitating a human receptionist. Valerie was the product of a two-and-a-half-year collaboration between researchers and students in CMU’s Robotics Institute and the School of Drama in the College of Fine Arts. ‘Tank’ later succeeded Valerie as a roboceptionist.

The Personal Robots Group at the Massachusetts Institute of Technology (MIT) Media Lab, meanwhile, is focusing its energies on a team of four small mobile humanoid robots, referred to as “MDS” for Mobile/Dexterous/Social. In April 2008, the team introduced the Nexi MDS — a complete mobile manipulator robot augmented with expressions of anger, surprise, etc. It has hands to manipulate objects, eyes (video cameras), ears (an array of microphones), and a 3-D infrared camera and laser rangefinder to support real-time tracking of objects, people and voices as well as indoor navigation.

Meanwhile, androids — made popular by sci-fi films like Star Trek, Terminator, Surrogates and Bicentennial Man — have used biomimetic skins and human gestures to look and act in an almost human way. The Intelligent Robotics Lab at Osaka University and Kokoro demonstrated the Actroid at Expo 2005 in Aichi Prefecture, Japan. In 2006, Kokoro developed a new android, the DER 2, which stood 165 cm tall with 47 mobile points. The DER 2 can not only change its expression but also move its hands and feet and twist its body.

Waseda University (Japan) and NTT DoCoMo have succeeded in creating a shape-shifting, face-changing robot, WD-2. EveR-2, a Korean android developed by KITECH, can even sing; she is 160 cm tall and weighs 50 kg. Hanson Robotics of Texas and KAIST produced an android portrait of Albert Einstein, mounting Hanson’s facial android technology on KAIST’s life-size walking bipedal robot body, called ‘Albert Hubo’. Even Iran has its Surena 2, which can dance.

Do these qualities make a robot social? Last October, Andrew Meltzoff, co-director of the University of Washington’s Institute for Learning and Brain Sciences, and Rajesh Rao, associate professor of computer science and engineering, hypothesised that babies would be more likely to view a robot as a psychological being if they saw other friendly human beings socially interacting with it. In a study published in the October/November issue of Neural Networks, the duo provided a clue as to how babies decide whether a new object, such as a robot, is sentient (alive) or inanimate. Sixty-four babies participated in the study and were tested individually.

The study has implications for humanoid robots, said co-author Rao, whose team helped design the computer programs that made the robot Morphy appear social. “The study suggests that if you want to build a companion robot, it is not sufficient to make it look human,” said Rao. “The robot must also be able to interact socially with humans, an interesting challenge for robotics.” With robots such as eldercare bots and nanny bots already serving as human companions, the hurdle is a significant one.

However, as robots gain a more human-like appearance and become more social with advances in artificial intelligence (AI) over the past 50 years, new challenges emerge, such as the one Sherry Turkle, an MIT professor, points out in her book “Alone Together: Why We Expect More from Technology and Less from Each Other”: “…These days, insecure in our relationships and anxious about intimacy, we look to technology for ways to be in relationships and protect ourselves from them at the same time…it can happen when interacting with a robot.”

Link to the article in Business Standard