Published: Nov. 14, 2018

Connor Brooks, a graduate student in computer science, demonstrates a robotic system that responds to spoken commands. (Credit: Glenn Asakawa/CU Boulder)

“Robot, point to the screwdriver next to the clamp.”

Daniel Pendergast, a graduate student in CU Boulder’s ATLAS Institute, issues the command, and a few feet away a four-foot-tall robot obeys. The machine whirs to life, bending and twisting its one arm to hover over a table crowded with assorted tools, where it points its claw at a screwdriver right next to a clamp.

Daniel Szafir

The action might seem simple, something people do every day, but in the field of robotics, Pendergast’s pointing system is a big step forward. That’s because it’s not easy for robots to understand the messy and often vague nature of human language, said Daniel Szafir, Pendergast’s advisor and an assistant professor at ATLAS.

What, for example, does a person mean when they say “next to”?

In trying to answer such questions, Szafir and his colleagues work in a rapidly growing area of study called human-robot interaction. The field addresses the huge gulf that seems to exist between people and their robot helpers: Robots don’t always understand people, and people often don’t want to be around moving, learning machines.

There’s a lot to be gained from helping the two get along, Szafir said. In the case of the screwdriver-locating robot, Szafir’s goal is to design automated machines that could help people take on a range of tasks, from caring for elderly relatives to assembling toy castles for their kids on Christmas morning.

“There was always something that fascinated me about this idea of automated assistants,” said Szafir, also in the Department of Computer Science. “It seems like such a powerful way to improve the quality of life for people at all stages. It can help out in healthcare and rehabilitation. It can help us around the house and free us up for pursuits that we’d really like to be doing.”

Flying eyes

If the idea of a world filled with robotic assistants wigs you out, Szafir acknowledged that you’re not alone. Many people feel uncomfortable around robots, in part because humans are used to working with beings with expressive eyes and complex body language.

“The robot in our lab only has one arm,” he said. “You can do certain kinds of gestures with that, but people have two arms.”

Szafir, who was named to the Forbes 30 Under 30 list in 2017, is trying to cross that valley. He has experimented, for example, with using augmented reality headsets to help people understand what robots are going to do next. In one case, he made it easier for humans to anticipate the movements of flying robots.

He imagines that similar technologies could help disaster responders fight wildfires, using augmented reality displays to track and manage fleets of drones flying around blazes. Szafir and his colleagues recently landed a $1.1 million grant from the U.S. National Science Foundation to experiment with how workers in dangerous fields could use those sorts of tools.

But he also focuses on designing robots that can better interpret human gestures and language. As Szafir put it, in the field of human-robot interaction, “the human is just as important as the robot.”

That’s not easy. Take the task of building a toy castle on Christmas morning. If you’re working with a human assistant, you can signal that you want a screwdriver in many different ways: you might say “hand me that,” grunt and point, or just direct your gaze.

“People are so good at interpreting highly ambiguous statements and gestures,” Szafir said. “So while I can tell a person, ‘Can you pass me that thing,’ for a robot, it would be really hard to know what that meant.”

Helping hands

To get to that point, Szafir and his colleagues took an unusual approach: they asked people to teach their robotic system for them.

They solicited human volunteers to describe the locations of objects in a series of illustrations of messy workbenches, similar to the one in Szafir’s lab. The team then fed those sentences into a computer algorithm that analyzed and learned the speech patterns that people use when they want something but can’t reach it.
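
The article doesn’t describe the algorithm’s internals, but a toy Python sketch can illustrate the inference side of the problem: picking the object on a cluttered table that best fits a spatial phrase like “next to the clamp.” The object names, positions and exponential scoring rule below are all invented assumptions, not the team’s method.

    import math

    # Hypothetical workbench layout: object name -> (x, y) position in meters.
    OBJECTS = {
        "screwdriver_a": (0.10, 0.20),
        "screwdriver_b": (0.80, 0.55),
        "clamp": (0.15, 0.25),
        "hammer": (0.60, 0.10),
    }

    def next_to_score(target, landmark, scale=0.15):
        """Score how well "next to" fits: closer to the landmark scores higher."""
        return math.exp(-math.dist(OBJECTS[target], OBJECTS[landmark]) / scale)

    def resolve(category, landmark):
        """Return the object of the given category closest to the landmark."""
        candidates = [name for name in OBJECTS if name.startswith(category)]
        return max(candidates, key=lambda name: next_to_score(name, landmark))

    # "Robot, point to the screwdriver next to the clamp."
    print(resolve("screwdriver", "clamp"))  # -> screwdriver_a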

The claw isn’t perfect. So far, it points to the right object about 70 percent of the time. And it can’t understand certain types of descriptions, such as those involving negatives: “Hand me the screwdriver that isn’t next to the clamp.” But, Szafir said, it’s a leap above existing systems of this kind.
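
Continuing the toy sketch above (and reusing its OBJECTS and next_to_score definitions), one naive way to picture negation handling is to invert the preference and pick the category member that least satisfies the relation. Again, this is purely an illustration, not how the team’s system works or why it fails:

    def resolve_negated(category, landmark):
        """Toy negation: pick the category member least "next to" the landmark."""
        candidates = [name for name in OBJECTS if name.startswith(category)]
        return min(candidates, key=lambda name: next_to_score(name, landmark))

    # "Hand me the screwdriver that isn't next to the clamp."
    print(resolve_negated("screwdriver", "clamp"))  # -> screwdriver_b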

And the team hasn’t stopped at spoken words. In related research, Szafir and his colleagues are working to develop robots that can understand the language of human shrugs, head scratching and pointing.

They have designed a system that scans people as they complete a basic assembly task, say, building a tower out of wooden blocks and screws. Based on how the builders move and where their eyes are pointing, the robot tries to guess which tools those people might need next.
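
The article doesn’t say how that guess is made under the hood. One simple way to picture it is as a classifier over gaze and motion features, as in this sketch; the feature set, training examples and nearest-neighbor rule are all invented assumptions:

    # Feature vector: (share of gaze time on screws, share on blocks, hand speed).
    TRAINING_DATA = [
        ((0.7, 0.1, 0.2), "screwdriver"),
        ((0.6, 0.2, 0.1), "screwdriver"),
        ((0.1, 0.8, 0.5), "wooden block"),
        ((0.2, 0.7, 0.6), "wooden block"),
    ]

    def predict_tool(features):
        """Guess the next tool with 1-nearest-neighbor over the toy data."""
        def sq_dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        _, label = min(TRAINING_DATA, key=lambda ex: sq_dist(ex[0], features))
        return label

    # A builder who keeps glancing at the screws probably needs the screwdriver.
    print(predict_tool((0.65, 0.15, 0.15)))  # -> screwdriver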

“It would recognize when they wanted to fasten things together and it would hand them a screwdriver,” Szafir said. He presented the results of that research recently at a robotics conference in Madrid.

There’s a lot of work to be done, but Szafir hopes that automated assistants will be coming to workplaces and homes near you in the decades ahead. Such feats of engineering may seem mundane in a world where drones can fly over the surface of Mars and robots can run on treadmills.

But, Szafir said, the pursuit of everyday robot coworkers is about conserving something that all humans cherish: “The one limited resource that we all have is our time.”