
Tuesday, March 24, 2009

It took just a few decades for computers to evolve from room-size vacuum tube–based machines that cost as much as a house to cheap chip-powered desktop models with vastly more processing power. Similarly, the days of "personal robots"—inexpensive machines that can help out at home or the office—may be closer than we think. But first, says Alexander Stoytchev, an assistant professor of electrical and computer engineering at Iowa State University in Ames, robots have to be taught to do something we know instinctively: how to learn.

"A truly useful personal robot [must have] the ability to learn on its own from interactions with the physical and social environment," says Stoytchev, whose field of developmental robotics combines developmental psychology and neuroscience with artificial intelligence and robotic engineering. "It should not rely on a human programmer once it is purchased. It must be trainable."

Stoytchev and a team of grad students are developing software to teach robots to learn about as well as a two-year-old child. Their platform is a humanoid robot that sprouts two 60-pound (27-kilogram) Whole Arm Manipulators (WAM) made by Cambridge, Mass.–based Barrett Technology, Inc., each tipped with a 2.6-pound (1.2-kilogram) three-fingered BarrettHand.

In one set of experiments, the robot was presented with 36 different objects, including hockey pucks and Tupperware. It could perform five different actions with each one—grasping, pushing, tapping, shaking and dropping—and had to identify and classify them based only on the sounds they made. After just one action the robot had a 72 percent success rate, but its accuracy soared with each successive action, reaching 99.2 percent after all five. The robot had learned to use a perceptual model to recognize and classify objects—and it could rely on this model to estimate how similar two objects were using only the sounds they made.
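The article does not describe the lab's actual algorithm, but the idea of accuracy improving with each successive action can be sketched as evidence accumulation: each action produces a sound, each sound gives a (noisy) vote for which object it came from, and the votes are pooled. The sketch below is a minimal illustration under that assumption, with made-up acoustic feature vectors and a simple distance-based score standing in for whatever model the robot really uses.

```python
# Illustrative sketch only: feature vectors, noise model, and scoring are
# hypothetical stand-ins for the robot's actual acoustic recognition model.
import numpy as np

rng = np.random.default_rng(0)
n_objects, n_actions, n_feat = 36, 5, 8

# Hypothetical training data: one mean acoustic feature vector per (object, action).
prototypes = rng.normal(size=(n_objects, n_actions, n_feat))

def classify(sounds, noise=1.0):
    """Accumulate evidence across successive actions.

    `sounds` is a list of (action_index, feature_vector) pairs observed for one
    unknown object; returns the predicted object id after each action.
    """
    log_score = np.zeros(n_objects)
    predictions = []
    for action, feats in sounds:
        # Distance to each object's prototype for this action serves as a
        # negative log-likelihood; closer-sounding objects score higher.
        dists = np.linalg.norm(prototypes[:, action, :] - feats, axis=1)
        log_score += -dists / noise
        predictions.append(int(np.argmax(log_score)))
    return predictions

# Simulate one test object (id 7): noisy sounds from all five actions.
true_id = 7
observations = [(a, prototypes[true_id, a] + rng.normal(scale=0.8, size=n_feat))
                for a in range(n_actions)]
print(classify(observations))  # guesses tend to settle on 7 as actions accumulate
```

Run on this toy data, the per-action guesses usually converge to the correct object as more actions are performed, mirroring the 72-to-99.2 percent trend the researchers report.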

Another set of experiments showed the robot could learn to tell whether something was a container. The team presented the machine, topped with a 3-D camera, with objects of different shapes. By dropping a small block on each one and then pushing it, the robot learned to classify objects either as containers—those that moved together with the block ("co-moved") more often when pushed—or as noncontainers. The robot could then use this knowledge to judge whether unfamiliar objects could hold things; in other words, it had learned, roughly, how to discern the defining characteristics of a container.
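Again, the article gives the idea rather than the implementation, but the co-movement cue reduces to a simple decision rule: push the object several times, count how often the dropped block moves with it, and call it a container if that rate is high enough. The sketch below assumes boolean co-movement observations and a hypothetical threshold; both are illustrative, not the lab's actual method.

```python
# Illustrative sketch only: trial data and the 0.5 threshold are hypothetical.
from statistics import mean

def comove_rate(trials):
    """Fraction of push trials in which the dropped block moved with the object."""
    return mean(1.0 if comoved else 0.0 for comoved in trials)

def classify_object(trials, threshold=0.5):
    """Label an object a container if the block co-moves often enough when pushed."""
    return "container" if comove_rate(trials) >= threshold else "noncontainer"

# Hypothetical observations: True means the block moved together with the object.
bowl_trials = [True, True, False, True, True]
flat_board_trials = [False, False, True, False, False]
print(classify_object(bowl_trials))        # container
print(classify_object(flat_board_trials))  # noncontainer
```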

When personal robots finally hit retail chains, they might look something like HERB, the "Home Exploring Robotic Butler" created at an Intel lab in Pittsburgh. It is part of the company's Personal Robotics Project, whose goal is to make a truly autonomous robotic assistant that can perform routine tasks at human speeds in cluttered environments like homes or offices.



By Scientific American

