Using tactile sensors and machine learning to improve how robots manipulate fabrics
In recent years, roboticists have been trying to improve how robots interact with the many different objects found in real-world settings. While some of these efforts have yielded promising results, the manipulation skills of most existing robotic systems still lag behind those of humans.
Fabrics are among the objects that have proved most challenging for robots to interact with. The main reason is that pieces of cloth and other fabrics can be stretched, moved and folded in many different ways, which can result in complex material dynamics and self-occlusions.
Researchers at Carnegie Mellon University’s Robotics Institute have recently proposed a new computational technique that could allow robots to better understand and handle fabrics. This technique, introduced in a paper set to be presented at the International Conference on Intelligent Robots and Systems (IROS) and pre-published on arXiv, is based on the use of a tactile sensor and a simple machine-learning model known as a classifier.
“We are interested in fabric manipulation because fabrics and deformable objects in general are challenging for robots to manipulate, as their deformability means that they can be configured in so many different ways,” Daniel Seita, one of the researchers who carried out the study, told TechXplore. “When we began this project, we knew that there had been a lot of recent work in robots manipulating fabric, but most of that work involves manipulating a single piece of fabric. Our paper addresses the relatively less-explored direction of learning to manipulate a pile of fabric using tactile sensing.”
Most existing approaches for enabling fabric manipulation in robots rely solely on vision sensors, such as cameras, which only collect visual data. While some of these methods have achieved good results, their reliance on visual sensing may limit their applicability to simple tasks that involve manipulating a single piece of cloth.
The new method devised by Seita and his colleagues Sashank Tirumala and Thomas Weng, on the other hand, uses data collected by a tactile sensor called ReSkin, which captures information about a material’s texture and its interactions with the environment. Using this tactile data, the team trained a classifier to determine the number of fabric layers grasped by a robot.
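For illustration, a layer classifier of this kind could be trained roughly as follows. This is a minimal sketch, not the authors’ implementation: the feature layout, network size and synthetic data are assumptions standing in for real ReSkin recordings.

```python
# Minimal sketch (not the authors' code): training a classifier to predict
# how many fabric layers are grasped from tactile readings. Real data would
# come from ReSkin; synthetic features stand in here so the snippet runs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in data: each row is a flattened tactile reading
# (e.g., several magnetometers x 3 axes); labels are layer counts 0-2.
n_grasps, n_features = 600, 15
y = rng.integers(0, 3, size=n_grasps)
# Make the (fake) signal depend on the layer count so training can succeed.
X = rng.normal(size=(n_grasps, n_features)) + y[:, None] * 0.5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

In practice, the features would be windows of sensor readings recorded while the gripper pinches the pile, each labeled with the number of layers actually grasped.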
“Our tactile data came from the ReSkin sensor, which was developed at CMU last year,” Weng explained. “We use this classifier to adjust the height of a gripper in order to grasp one or two top-most fabric layers from a pile of fabrics.”
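Conceptually, that adjustment forms a feedback loop, which might look like the sketch below. The gripper interface, tactile reader and classifier here are hypothetical stubs standing in for the real hardware and trained model, not the authors’ code.

```python
# Minimal sketch of the height-adjustment feedback loop described above.
# StubGripper, read_tactile() and predict_layers() are hypothetical
# placeholders for the real gripper interface, ReSkin reading and classifier.
import random

class StubGripper:
    """Placeholder for the real gripper interface."""
    def move_to(self, height_mm: float) -> None:
        print(f"gripper -> {height_mm:.1f} mm")
    def pinch(self) -> None:
        print("pinch")
    def release(self) -> None:
        print("release")

def read_tactile() -> list[float]:
    """Placeholder for a tactile reading captured during the pinch."""
    return [random.gauss(0.0, 1.0) for _ in range(15)]

def predict_layers(features: list[float]) -> int:
    """Placeholder for the trained layer classifier (returns 0, 1 or 2)."""
    return random.choice([0, 1, 2])

TARGET_LAYERS = 1   # goal: grasp a single top layer
STEP_MM = 1.0       # gripper-height correction per attempt

gripper = StubGripper()
height_mm = 30.0    # initial height above the fabric pile
for attempt in range(10):            # bounded number of regrasp attempts
    gripper.move_to(height_mm)
    gripper.pinch()                  # lightly pinch the pile
    n_layers = predict_layers(read_tactile())
    if n_layers == TARGET_LAYERS:
        print(f"grasped {n_layers} layer(s) after {attempt + 1} attempt(s)")
        break                        # correct grasp: lift and proceed
    gripper.release()
    # Grasped too many layers -> raise the gripper; too few -> lower it.
    height_mm += STEP_MM if n_layers > TARGET_LAYERS else -STEP_MM
```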
To evaluate their technique, the team carried out 180 experimental trials in a real-world setting, using a robotic system consisting of a Franka robotic arm, a mini-Delta gripper and a ReSkin sensor (integrated on the gripper’s “finger”) to grasp one or two pieces of cloth from a pile. Their approach achieved promising results, outperforming baseline methods that do not consider tactile feedback.
“Compared to prior approaches that only use cameras, our tactile-sensing-based approach is not affected by patterns on the fabric, changes in lighting, and other visual discrepancies,” Tirumala said. “We were excited to see that tactile sensing from electromagnetic devices like the ReSkin sensor can provide a sufficient signal for a fine-grained manipulation task, like grasping one or two fabric layers. We believe that this will motivate future research in tactile sensing for cloth manipulation by robots.”
In the future, Tirumala, Weng, Seita and their colleagues hope that this manipulation approach will help enhance the capabilities of robots deployed in fabric manufacturing facilities, laundry services or homes. Specifically, it could improve these robots’ ability to handle multiple layers of fabric and complex textiles, such as laundry, blankets, clothes and other fabric-based objects.
“Our plan is to continue to explore the use of tactile sensing to grasp an arbitrary number of fabric layers, instead of the one or two layers that we focused on in this work,” Weng added. “Furthermore, we are investigating multi-modal approaches that combine both vision and tactile sensing so we can leverage the advantages of both sensor modalities.”
Sashank Tirumala et al., Learning to singulate layers using tactile feedback. arXiv:2207.11196v1 [cs.RO]. arxiv.org/abs/2207.11196
© 2022 Science X Network
Citation: Using tactile sensors and machine learning to improve how robots manipulate fabrics (2022, August 16), retrieved 16 August 2022 from https://techxplore.com/news/2022-08-tactile-sensors-machine-robots-fabrics.html