New research from Carnegie Mellon University's Robotics Institute (RI) could help robots feel layers of fabric rather than relying on computer vision to merely see them. The work could allow robots to assist people with household tasks such as folding laundry.
Humans use both sight and touch to grab a glass or pick up a piece of cloth. These tasks are so routine that few people think about them, yet they are very difficult for robots. The amount of information gathered through touch is hard to quantify, and the sense of touch has been difficult to simulate in robotics until recently.
“Humans look at something, we reach for it, and then we use touch to make sure that we're in the right position to grab it,” said David Held, an assistant professor in the School of Computer Science and head of the Robots Perceiving and Doing (R-PAD) Lab. “A lot of the tactile sensing that humans do is natural to us. We don't think that much about it, so we don't realize how valuable it is.”
For example, to fold laundry, a robot needs a sensor that mimics the way human fingers can feel the top layer of a towel or shirt and grasp the layers beneath it. Researchers could teach a robot to feel and grasp the top layer of cloth, but without sensing the other layers, the robot would only ever grab the top layer and would never succeed in folding the fabric.
“How do we fix this?” Held asked. “Well, maybe what we need is tactile sensing.”
ReSkin, developed by researchers at Carnegie Mellon and Meta AI, was the perfect solution. The open-source, touch-sensing “skin” is made of a thin, flexible polymer embedded with magnetic particles to measure three-axis tactile signals. In their latest paper, the researchers used ReSkin to help a robot feel layers of cloth rather than relying on its vision sensors to see them.
“By reading the changes in the magnetic fields from depressions or movement of the skin, we can achieve tactile sensing,” said Thomas Weng, a Ph.D. student in the R-PAD Lab, who worked on the project with RI postdoctoral fellow Daniel Seita and graduate student Sashank Tirumala. “We can use this tactile sensing to determine how many layers of cloth we've picked up by pinching with the sensor.”
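The idea can be sketched as a simple classifier that maps a tactile reading to a layer count. The following is a minimal, hypothetical illustration, not the team's actual model: the centroid values are made up, and a real system would learn this mapping from recorded ReSkin data.

```python
import math

# Hypothetical sketch: estimating how many cloth layers are pinched from a
# ReSkin-style three-axis magnetic reading. The centroid values below are
# invented for illustration; the real mapping is learned from sensor data.
CENTROIDS = {
    0: (0.0, 0.0, 0.0),   # baseline field, no cloth pinched
    1: (0.4, 0.5, 0.3),   # one layer deflects the field moderately
    2: (0.9, 1.0, 0.7),   # two layers deflect it further
}

def classify_layers(reading):
    """Return the layer count whose mean signal is nearest the reading."""
    return min(CENTROIDS, key=lambda k: math.dist(reading, CENTROIDS[k]))

# A reading near the one-layer centroid classifies as one layer.
print(classify_layers((0.38, 0.52, 0.29)))  # 1
```

A nearest-centroid rule is only a stand-in; any classifier trained on labeled pinches would fill the same role.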
Other research has used tactile sensing to grasp rigid objects, but cloth is deformable, meaning it shifts when touched, which makes the task even more difficult. Adjusting the robot's grasp on the cloth changes both the cloth's pose and the sensor readings.
The researchers didn't teach the robot how or where to grasp the cloth. Instead, they taught it how many layers of cloth it was grasping by first estimating how many layers it held using the sensors in ReSkin, then adjusting its grip and trying again. The team evaluated the robot on picking up one and two layers of cloth, and used fabrics of different textures and colors to demonstrate generalization beyond the training data.
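The estimate-then-adjust behavior described above amounts to a feedback loop. Here is a minimal sketch under stated assumptions: the `read_layers`, `insert_deeper`, and `retract` callbacks are hypothetical stand-ins for the real perception and control stack, not an API from the paper.

```python
# Hypothetical sketch of a grasp-and-retry loop: estimate the number of
# grasped layers from a tactile reading, then adjust the gripper until the
# target count is reached. All callbacks are illustrative stand-ins.

def adjust_grasp(target_layers, read_layers, insert_deeper, retract,
                 max_tries=10):
    """Iteratively adjust the grasp until read_layers() == target_layers."""
    for _ in range(max_tries):
        estimated = read_layers()
        if estimated == target_layers:
            return True            # grasping the desired number of layers
        if estimated < target_layers:
            insert_deeper()        # too few layers: slide the finger deeper
        else:
            retract()              # too many layers: back the finger out
    return False                   # give up after max_tries adjustments

# Toy demo: pretend gripper depth maps directly to layers grasped.
depth = [0]
ok = adjust_grasp(
    target_layers=2,
    read_layers=lambda: depth[0],
    insert_deeper=lambda: depth.__setitem__(0, depth[0] + 1),
    retract=lambda: depth.__setitem__(0, depth[0] - 1),
)
print(ok)  # True
```

The loop structure, rather than any particular grasp policy, is the point: tactile feedback turns a blind pick into a correctable action.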
The thinness and flexibility of the ReSkin sensor made it possible to teach robots how to handle something as delicate as layers of cloth.
“The profile of this sensor is so small, we were able to do this very fine task, inserting it between layers of cloth, which we can't do with other sensors, particularly optical sensors,” Weng said. “We were able to put it to use to do tasks that were not achievable before.”
There is plenty of research to be done before handing a robot the laundry basket. It all starts with steps like smoothing a crumpled piece of cloth, choosing the correct number of layers to fold, and then folding the cloth in the right direction.
“It really is an exploration of what we can do with this new sensor,” Weng said. “We're exploring how to get robots to feel with this magnetic skin for objects that are soft, and exploring simple strategies to manipulate cloth that we'll need for robots to eventually do our laundry.”
The team's paper, “Learning to Singulate Layers of Cloth Using Tactile Feedback,” will be presented at the 2022 International Conference on Intelligent Robots and Systems (IROS) in Kyoto, Japan. It also received the Best Paper Award at the IROS 2022 ROMADO-SI workshop.