Teaching Robots to Handle Liquids

By Ruth Seeley

Industries as diverse as automotive manufacturing and retail have invested in robotic workforces to handle the heavy lifting and scanning tasks at which robots excel. But robots deal, for the most part, with rigid objects that need to be moved rather than materials that need to be shaped.

Now MIT researchers are developing new models that train robots to handle liquids and deformable materials. Traditional learning-based simulators focus mainly on rigid objects and cannot handle fluids or softer objects. Some physics-based simulators are more accurate and can handle diverse materials, but they tend to rely heavily on approximation techniques that introduce errors when robots interact with objects in the real world.

A new “particle simulator” developed by MIT researchers improves robots’ abilities to mold materials into simulated target shapes and interact with solid objects and liquids. (Image via MIT)

In a paper being presented at the International Conference on Learning Representations in May, the researchers describe a new model that learns to capture how small portions of different materials, or “particles,” interact when they are poked and prodded. The model learns directly from data in cases where the underlying physics of the movements is uncertain or unknown. Robots can then use the model as a guide to predict how liquids, as well as rigid and deformable materials, will react to the force of their touch. As the robot handles the objects, the model also helps to further refine the robot’s control.
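
The paper itself gives the full architecture; as a rough illustration only, the sketch below shows what one step of a learned particle simulator could look like: each particle exchanges “messages” with its neighbors, and a small network (here with random, untrained weights) maps those interactions to accelerations. All names, dimensions, and constants are assumptions for illustration, not the authors’ implementation.

```python
# Illustrative sketch only: a toy, NumPy-only stand-in for one step of a
# learned particle simulator. The network weights here are random; in a
# real system they would be trained on observed particle trajectories.
import numpy as np

rng = np.random.default_rng(0)

N, DIM, HIDDEN = 64, 3, 32      # particles, spatial dims, hidden width (assumed)
RADIUS, DT = 0.1, 1.0 / 60.0    # neighbor radius and timestep (assumed)

# Particle state: positions and velocities.
pos = rng.uniform(0.0, 0.5, size=(N, DIM))
vel = np.zeros((N, DIM))

# Stand-in "learned" weights for an edge network and a node network.
W_edge = rng.normal(0.0, 0.1, size=(2 * DIM, HIDDEN))
W_node = rng.normal(0.0, 0.1, size=(HIDDEN, DIM))

def step(pos, vel):
    """Predict per-particle acceleration from neighbor interactions,
    then integrate one timestep (semi-implicit Euler)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        # Find neighbors within the interaction radius.
        d = pos - pos[i]
        mask = np.linalg.norm(d, axis=1) < RADIUS
        mask[i] = False
        if not mask.any():
            continue
        # Edge features: relative position and relative velocity.
        feats = np.concatenate([d[mask], vel[mask] - vel[i]], axis=1)
        # "Message passing": encode each edge, aggregate, decode to acceleration.
        messages = np.tanh(feats @ W_edge)
        acc[i] = messages.sum(axis=0) @ W_node
    vel = vel + DT * acc
    pos = pos + DT * vel
    return pos, vel

pos, vel = step(pos, vel)
```

The design choice this mirrors is that the dynamics are expressed per particle and per neighbor pair, so the same learned rule can apply to rigid bodies, deformable objects, and fluids, which differ mainly in how strongly neighboring particles constrain one another.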

In experiments, a two-fingered robotic hand called “RiceGrip” accurately shaped deformable foam, used as a proxy for sushi rice, into a target configuration. In short, the researchers’ model serves as a kind of “intuitive physics” brain that robots can leverage to reconstruct three-dimensional objects, somewhat as humans do.
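
To shape material toward a target, a robot can use a learned simulator inside a simple planning loop: imagine several candidate motions, roll each forward through the model, and execute the one whose predicted outcome best matches the target shape. The sketch below illustrates that idea with random-shooting model-predictive control; `learned_model_step` and `apply_gripper` are hypothetical stand-ins, and the details differ from the actual RiceGrip setup.

```python
# Illustrative sketch only: using a learned particle model for control via
# random-shooting model-predictive control (MPC). `learned_model_step` and
# `apply_gripper` below are hypothetical stand-ins for the trained dynamics
# model and the gripper's effect on nearby particles.
import numpy as np

rng = np.random.default_rng(1)

def learned_model_step(pos, vel, dt=1.0 / 60.0):
    # Stand-in for the trained model: damped free motion.
    vel = 0.95 * vel
    return pos + dt * vel, vel

def apply_gripper(pos, vel, action, reach=0.1):
    # Stand-in gripper: nudge particles within `reach` of the origin along `action`.
    near = np.linalg.norm(pos, axis=1) < reach
    vel = vel.copy()
    vel[near] += action
    return pos, vel

def shape_loss(pos, target_pos):
    # One side of a Chamfer distance: for each target point, the distance
    # to the nearest predicted particle.
    d = np.linalg.norm(target_pos[:, None, :] - pos[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def plan_action(pos, vel, target_pos, n_candidates=32, horizon=5):
    """Sample candidate gripper motions, roll each out through the learned
    model, and return the first action of the best-scoring sequence."""
    best_action, best_loss = None, np.inf
    for _ in range(n_candidates):
        actions = rng.uniform(-1.0, 1.0, size=(horizon, 3))
        p, v = pos.copy(), vel.copy()
        for a in actions:
            p, v = apply_gripper(p, v, a)
            p, v = learned_model_step(p, v)
        loss = shape_loss(p, target_pos)
        if loss < best_loss:
            best_loss, best_action = loss, actions[0]
    return best_action

# Example: plan one gripper motion toward a random target shape.
pos = rng.uniform(0.0, 0.2, size=(64, 3))
vel = np.zeros_like(pos)
target = rng.uniform(0.0, 0.1, size=(64, 3))
print(plan_action(pos, vel, target))
```

Because only the first planned action is executed before re-planning from the newly observed state, prediction errors in the learned model are corrected continuously, which is the control-refinement behavior described above.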

“Humans have an intuitive physics model in our heads, where we can imagine how an object will behave if we push or squeeze it. Based on this intuitive model, humans can accomplish amazing manipulation tasks that are far beyond the reach of current robots,” says first author Yunzhu Li, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “We want to build this type of intuitive model for robots to enable them to do what humans can do.”


Source: Massachusetts Institute of Technology
