Researchers Study How to Make Robots More Like Your Co-Workers

Jan. 22, 2015
Three-Year Program Is Aimed at Developing “Cobots” That Can Work More Effectively with Their Human Partners

While industrial robots have been around for years now, we’re still a long way from having an artificial, humanoid creature like Commander Data working nearly seamlessly beside us. Most robots in operation today are hulking, highly specialized machines that are caged away from human factory workers in the interests of safety.  

But now researchers at the University of Wisconsin-Madison (UW) and the Massachusetts Institute of Technology (MIT) are working to determine best practices for effectively integrating human-robot teams within manufacturing environments. Their research is funded by a three-year grant from the National Science Foundation (NSF) as part of its National Robotics Initiative program.

The Cobots Are Coming

Their goal is to develop “cobots,” or robots that can work collaboratively and side by side with their human counterparts. “Our goal is to improve work processes and integrate the work of the robots with the work of people,” says Bilge Mutlu, assistant professor of computer sciences at UW and director of the university’s Human-Computer Interaction Laboratory. “This new family of robotic technology will change how manufacturing is done,” he says. “New research can ease the transition of these robots into manufacturing by making human-robot collaboration better and more natural.”

In recent years, the robotics industry has introduced new platforms that are less expensive and intended to be easier to reprogram and integrate into manufacturing, such as Baxter, made by Rethink Robotics. Each Baxter robot has two arms and a tablet-like panel for “eyes” that provides cues to help human workers anticipate what the robot will do next.

Baxter combines movement-based learning with machine arms that mimic the movement of human arms. This enables the robot to recognize its position and adjust its grip as necessary. If it encounters an obstacle—say, a human hand—it stops moving. Baxter’s cameras continuously check its position and that of nearby objects. This means it can handle variable tasks, not just repetitive ones.
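
The behavior described above amounts to a simple sense-and-act loop: keep moving toward the target, but stop the moment the sensors report unexpected contact or an object in the path. The Python sketch below illustrates the idea with an invented sensor type, a made-up force threshold, and simulated readings; it is not Rethink Robotics’ actual control code.

    from dataclasses import dataclass

    FORCE_LIMIT_N = 15.0  # treat resistance above this threshold as an unexpected obstacle

    @dataclass
    class SensorReading:
        contact_force: float    # measured resistance at the arm, in newtons
        obstacle_in_path: bool  # e.g., a hand seen by the cameras

    def step_until_done(readings):
        """Advance one motion step per reading; return True if the move completed."""
        for r in readings:
            if r.contact_force > FORCE_LIMIT_N or r.obstacle_in_path:
                print("halt: obstacle detected")  # the robot simply stops moving
                return False
            print("step toward target")
        return True

    # Simulated run: the third reading represents a human hand entering the workspace.
    step_until_done([
        SensorReading(2.0, False),
        SensorReading(3.1, False),
        SensorReading(22.5, True),
    ])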

Mutlu says these types of cooperative robots are safer to operate around humans. “Robots can recognize objects and human presence. This enables the possibility of collaborative work. There could be a project where humans and robots could work together. It makes bringing the robot system into the work cell a lot more feasible.”

Team Building

Much of the attention of the researchers is on figuring out the best ways to develop efficient robot-human teams. “When we think about integrating automation systems, we think about system capability and matching that to the required tasks,” says Mutlu. “What are the strengths of the robots and the humans? What are the requirements of the task? What can each bring to the job? If the robot does an okay job, and the human can do another job better, the robot can free up the human to do other things.”

Julie A. Shah, an assistant professor of aeronautics and astronautics at MIT and Mutlu’s counterpart on the project, breaks down the components of human-robot teamwork and tries to determine who should perform various tasks. Mutlu’s work complements Shah’s by focusing on how humans and robots actually interact.

“People can sometimes have difficulty figuring out how best to work with or use a robot, especially if its capabilities are very different from people’s,” says Shah. “Automated planning techniques can help bridge the gap in our capabilities and allow us to work more effectively as a team.”

Making Eye Contact

Some of the research on the project relates as much to how the humans interact with the robots as the other way around. Mutlu’s team is building on previous work on topics such as gaze aversion in humanoid robots.

Mutlu explains that gaze aversion has to do with how we interpret eye contact. “In conversation you build eye contact,” he says. “It’s also useful in performing tasks. If I’m looking at you, it might be that I need something from you. If a robot isn’t looking at you, but at something else, it might be telling you that it needs something.”

Those tablet “eyes” on the Baxter robot can provide cues to what the robot is doing, says Mutlu. “When the robot is going to pick up a piece, the tablet will turn to the piece it’s going to pick up. We look at the things we’re working on. So do these robots. There’s a certain amount of behavioral science in this.”
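
As a rough illustration of that cueing behavior, the Python sketch below uses a mock robot class (not Baxter’s real SDK, and with an invented piece name) to show the pattern: orient the “eyes” toward the target piece, pause briefly so the human can register the cue, then reach.

    import time

    class MockCobot:
        def look_at(self, target):
            print(f"display turns toward {target}")  # signal intent before moving
        def reach_and_grasp(self, target):
            print(f"arm reaches for {target}")

    def pick_with_gaze_cue(robot, piece, cue_delay_s=0.5):
        robot.look_at(piece)      # cue first, so the worker can anticipate the motion
        time.sleep(cue_delay_s)   # give the human a moment to notice
        robot.reach_and_grasp(piece)

    pick_with_gaze_cue(MockCobot(), "gear housing A3")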

Another issue the team is addressing is “speech and repair.” If a human misunderstands a robot’s instructions or carries them out incorrectly, how should the robot correct the human?

Over the summer, UW-Madison computer sciences graduate student Allison Sauppé traveled to the headquarters of office furniture maker Steelcase to learn more about its pilot project to incorporate Baxter into the production line. She found that perceptions of Baxter varied according to employees’ roles.

While managers tended to see Baxter as part of the overall system of automation, front-line workers had more complex feelings. “Some workers saw Baxter as a social being or almost a co-worker, and they talked about Baxter as if it were another person,” she says. “They unconsciously attributed human-like characteristics.”

Interestingly, says Mutlu, so far at least, humans are inclined to react positively to the robots, or at least cut them a little slack. Even when people have a negative experience with the robot, when it doesn’t perform as expected, for example, they’re inclined to say, “The robot is having a bad day,” he says. “Generally our intuition is that when you put the robot in an environment where he works with people, automatically people are interpreting what the robots want.”

Much work remains to be done on the project. “We have applications in isolated domains,” explains Mutlu. “We don’t know yet whether bringing this into the larger factory will work. Can we use an optimizer to figure out which tasks a robot can do? Can we enable an engineer at a facility to do this? This is where we’re at right now. We’re hoping to get traction pretty quickly. It’s a three-year project and we’ve only been at it for six months.”
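
To make the optimizer question concrete, the sketch below shows one common way such an allocation could be framed: estimate how long each person or robot would take on each task, then solve the resulting assignment problem with the Hungarian algorithm via SciPy. The agents, tasks, and timing numbers are invented for illustration; the project’s actual planning methods may differ.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    agents = ["human_1", "human_2", "baxter"]
    tasks = ["fetch parts", "align fixture", "inspect weld"]

    # cost[i][j] = estimated minutes for agent i to complete task j (illustrative numbers)
    cost = np.array([
        [2.0, 4.0, 1.5],
        [2.5, 3.5, 2.0],
        [1.0, 6.0, 8.0],  # the robot fetches quickly but inspects poorly
    ])

    rows, cols = linear_sum_assignment(cost)  # assignment minimizing total time
    for i, j in zip(rows, cols):
        print(f"{agents[i]} -> {tasks[j]} ({cost[i, j]} min)")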

For more about Professor Mutlu’s work, see the video presentation below:
