Engineers at the Massachusetts Institute of Technology (MIT) are studying how robots and humans can work more closely together on assembly lines. The Interactive Robotics Group in the Computer Science and Artificial Intelligence Laboratory (CSAIL) is focusing on dynamic sequencing, scheduling, spatial awareness and other areas.
With support from the National Science Foundation, the engineers are developing next-generation assembly line robots that are smarter and more adaptable than today’s machines.
“Our goal is to design software that allows robots to work more effectively with people in high-intensity, time-critical and safety-critical applications,” says Julie Shah, an assistant professor of aeronautics and astronautics at MIT who heads up the Interactive Robotics Group. “Half of the work that we do is focused on integrating robots into manual work environments. We are choreographing when and how people and robots move in and out of shared physical space.
“We’re focused on robot learning, planning and decision making,” adds Shah. “In addition, we’re looking to develop fast, smart tasking algorithms so robots can work interdependently with people. Our latest work is getting robots to automatically learn through observation of people to reduce the burden of programming, which traditionally has been a big entry barrier to small- and medium-sized manufacturers.”
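The general idea behind learning from observation can be illustrated with a toy example. The sketch below is only a minimal illustration of that idea, not the group's algorithm: it assumes each observed run of a job is logged as an ordered list of step names (the step names and demonstrations here are hypothetical) and keeps only the orderings that held in every demonstration as precedence constraints.

```python
# Toy sketch (not MIT's actual method): infer precedence constraints for an
# assembly job from repeated observations of a person performing it.
from itertools import combinations

def infer_precedence(demonstrations):
    """Return pairs (a, b) where step a preceded step b in every observed demo."""
    steps = sorted(demonstrations[0])
    # Start with both orderings of every pair as candidates...
    candidates = {(a, b) for a, b in combinations(steps, 2)}
    candidates |= {(b, a) for a, b in combinations(steps, 2)}
    # ...and keep only those that hold in every demonstration.
    for demo in demonstrations:
        index = {step: i for i, step in enumerate(demo)}
        candidates = {(a, b) for (a, b) in candidates if index[a] < index[b]}
    return candidates

# Hypothetical demonstrations: the wipe step floats, the bolt steps do not.
demos = [
    ["place_bracket", "insert_bolts", "torque_bolts", "wipe_surface", "inspect"],
    ["wipe_surface", "place_bracket", "insert_bolts", "torque_bolts", "inspect"],
]
print(sorted(infer_precedence(demos)))
```

A real system would of course have to segment the observed motion into steps and tolerate noisy or incomplete demonstrations; this sketch assumes that labeling has already been done.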
According to Shah, 70 percent of aerospace assembly and 50 percent of automotive assembly is still done manually. She says the availability of more intelligent, adaptable and inherently safe robots will open new automation opportunities to smaller manufacturers.
“In most factories, robots and people are kept very separate,” explains Shah. “But, factories of the near future are going to look very different. We’re beginning to see safety standards and technology that let us put some of these large, dangerous industrial robots onto mobile bases and rails so that they can safely work with people.”
Shah and her team are studying robot spatial awareness by using overhead cameras and a motion-capture system attached to robots. They have also harnessed artificial intelligence technology to enable the machines to learn from experience, so the robots will be more responsive to human behavior. The more robots can sense the humans around them and make adjustments, the safer and more effective they will be on the assembly line.
“We have a very accurate picture of where the person is in the space, and the robot can use that in its decision making to decide what task it does next and how it moves through the space,” says Shah. “We want anyone to be able to come in and teach a robot the way I would teach another person. We designed very fast algorithms that are able to take this real-time information and adapt the robot’s motion plan.”
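One simple way real-time tracking data can feed into the robot's motion is speed scaling based on separation distance. The sketch below is a minimal illustration of that pattern under assumed distance thresholds, not the group's actual planner: each control cycle, the latest tracked human position is used to scale the robot's nominal speed down as the person gets closer.

```python
# Minimal sketch, assuming a 0.5 m protective-stop distance and a 2.0 m
# full-speed distance (both numbers are illustrative, not from the MIT work).
import math

STOP_DIST = 0.5        # metres: pause motion inside this radius
FULL_SPEED_DIST = 2.0  # metres: no slowdown beyond this radius

def speed_scale(robot_xy, human_xy):
    """Return a factor in [0, 1] to multiply the robot's nominal speed by."""
    d = math.dist(robot_xy, human_xy)
    if d <= STOP_DIST:
        return 0.0                     # person too close: hold position
    if d >= FULL_SPEED_DIST:
        return 1.0                     # person far away: run at nominal speed
    return (d - STOP_DIST) / (FULL_SPEED_DIST - STOP_DIST)  # linear ramp

# Example: overhead tracking places the person 1.25 m from the robot.
print(round(speed_scale((0.0, 0.0), (1.0, 0.75)), 2))  # -> 0.5
```

Shah's "fast algorithms" go further than this, replanning the robot's path rather than merely slowing it, but the same tracked position is the input in both cases.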
The MIT engineers are also working with a major automaker to deploy mobile robotic assistants on its assembly lines. “People waste time walking back and forth to pick up the next piece to install,” notes Shah. “A mobile robotic assistant can fetch the right tools and parts at the right time.”
A big challenge is that humans and robots must work interactively in confined spaces. “The robot needs to maneuver around many people, and may need to straddle a moving conveyor belt,” Shah points out. “It has to move on and off the line seamlessly.”
To help robots negotiate in this dynamic environment, the MIT engineers are teaching them how to interpret anticipatory signals in human motion. “Biomedical studies show people can anticipate whether a person will turn left or right about a step or two before they do,” says Shah. “If we can teach the robot to anticipate which way the person will move, and modify its motion paths and speed accordingly, we could improve efficiency while maintaining safety.
"Our goal is to translate anticipatory signals with a few cameras rather than relying on body sensors," adds Shah. "There are researchers working on vision technology that can sense within a millimeter where a person is. Those advancements are coming along in parallel with our research. Sensing and computation are large enablers for us."
The Interactive Robotics Group has also spawned a start-up company, Tercio Solutions LLC, which has developed software that helps manufacturers schedule and optimize their operations.
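The kind of scheduling problem such software tackles can be illustrated with a simple list-scheduling heuristic: assign each ready task to whichever agent frees up first while respecting precedence between tasks. The sketch below is a generic heuristic for illustration only, not the Tercio algorithm, and the task names, durations and constraints are hypothetical.

```python
# Generic greedy list scheduler for mixed human-robot work (illustrative only).
def schedule(tasks, precedence, agents):
    """Assign each ready task to the earliest-available agent; return start/finish times."""
    start, finish = {}, {}
    free_at = {a: 0.0 for a in agents}
    remaining = dict(tasks)  # task name -> duration
    while remaining:
        # A task is ready once all of its predecessors have finished.
        ready = [t for t in remaining
                 if all(p in finish for p in precedence.get(t, []))]
        task = min(ready, key=remaining.get)        # shortest ready task first
        agent = min(free_at, key=free_at.get)       # earliest-available agent
        est = max([free_at[agent]] + [finish[p] for p in precedence.get(task, [])])
        start[task], finish[task] = est, est + remaining.pop(task)
        free_at[agent] = finish[task]
    return start, finish

tasks = {"fetch_part": 2, "place_bracket": 3, "torque_bolts": 4, "inspect": 2}
precedence = {"place_bracket": ["fetch_part"], "torque_bolts": ["place_bracket"],
              "inspect": ["torque_bolts"]}
print(schedule(tasks, precedence, agents=["worker", "robot"]))
```

Production schedulers add the pieces this sketch omits, such as deadlines, spatial constraints and fast rescheduling when the plan is disrupted.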