Scientists present a "sweater" that gives robots a sense of touch (photo, video)

May 29, 2023 14:16

Carnegie Mellon University's Robotics Institute has unveiled a groundbreaking innovation that could revolutionize the way robots interact with humans. Dubbed RobotSweater, this machine-knitted textile "skin" has the potential to enhance the comfort and adaptability of robots during human-robot interactions.

Development of RobotSweater was led by Changliu Liu, an assistant professor of robotics in the School of Computer Science; James McCann, also an assistant professor in the school; and Wenzhen Yuan, director of the RoboTouch lab. Inspired by the versatility and comfort of knitted sweaters, the team set out to create a textile that could sense contact and pressure.

What sets RobotSweater apart is that it can be customized to fit uneven, three-dimensional surfaces, much as knitters shape yarn into pieces of any shape and size. McCann explained that knitting machines can produce patterns suited to non-flat, curved surfaces, which prompted the idea of building sensors that conform to irregular robot bodies.

Once knitted, RobotSweater functions as a sensory fabric, allowing robots to "feel" when a human makes contact. In industrial settings where safety is paramount, this innovation could prove invaluable. Unlike the conventional rigid shields used for human-robot interaction detection, RobotSweater covers the entire robot body, enabling it to detect potential collisions more effectively.

[Photo: RobotSweater]

The knitted fabric of RobotSweater comprises two layers of conductive yarn whose metallic fibers carry current. Between them lies a net-like, lace-patterned spacer that keeps the conductive layers apart at rest. When pressure is applied to the fabric, the layers press together through the spacer and the conductive yarn closes a circuit, which is read by the sensors. This mechanism lets the fabric sense the distribution, shape, and force of human contact.
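To make the row-and-column idea concrete, here is a minimal sketch, in Python, of how a controller might scan such a contact matrix and summarize a touch. The grid size, the simulated readings, and the `read_crossing` stand-in are assumptions for illustration; the article does not describe the team's actual readout electronics.

```python
# Hypothetical sketch of row/column readout for a knitted contact matrix.
# The two conductive-yarn layers act as rows and columns; pressing the
# fabric through the lace spacer closes the circuit at a (row, col)
# crossing, which the controller reads as a nonzero value.

ROWS, COLS = 8, 8  # illustrative grid size, not the team's actual layout

# Simulated hardware: pretend a fingertip is pressing two crossings.
_PRESSED = {(3, 4): 0.9, (3, 5): 0.6}

def read_crossing(row: int, col: int) -> float:
    """Stand-in for an ADC read at one yarn crossing (hardware-specific)."""
    return _PRESSED.get((row, col), 0.0)

def scan_matrix() -> list[list[float]]:
    """Drive each row in turn and sample every column for contact."""
    return [[read_crossing(r, c) for c in range(COLS)] for r in range(ROWS)]

def contact_summary(frame: list[list[float]], threshold: float = 0.2):
    """Reduce one frame to its touched cells plus a crude force estimate."""
    touched = [(r, c, v)
               for r, row in enumerate(frame)
               for c, v in enumerate(row)
               if v > threshold]
    return touched, sum(v for *_, v in touched)

touched, force = contact_summary(scan_matrix())
print(f"contact at {[(r, c) for r, c, _ in touched]}, total force ~ {force:.1f}")
```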

Designing the knitted layers and integrating the wiring and electronic components posed challenges for the research team. Extensive prototyping and adjustments were necessary to develop a functional solution. Ultimately, they found success in wrapping wires around snaps attached to the ends of each stripe in the knitted fabric. This cost-effective and efficient method ensures a secure connection while preserving the integrity of the yarn.

Once fitted onto a robot's body, RobotSweater provides more accurate and effective sensory feedback than the visual sensors robots commonly rely on today. Yuan explained that robots outfitted with the fabric can sense human gestures and movements, enabling them to mimic and react to human actions.

In their research, the team conducted several demonstrations to showcase RobotSweater's capabilities. They showed how a push on a companion robot outfitted with RobotSweater could dictate its movement or head orientation. Similarly, when applied to a robot arm, RobotSweater allowed a person's push to guide the arm's motion, while grabbing the arm triggered the opening or closing of its gripper.
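In effect, each demo maps where and how hard the fabric is touched to a robot command. The toy policy below sketches one such mapping under assumed contact zones ("left", "right", "around") and an assumed force threshold; it is an illustration of the idea, not the team's control code.

```python
# Hypothetical sketch: mapping fabric contact to robot commands, loosely
# mirroring the demos above. Zone names, thresholds, and the command set
# are invented for illustration; this is not the team's published code.

from dataclasses import dataclass

@dataclass
class Contact:
    side: str     # "left", "right", or "around" (illustrative contact zones)
    force: float  # crude per-contact force estimate from the fabric

def interpret(contacts: list[Contact], grab_force: float = 1.0) -> str:
    """Turn one frame's contacts into a high-level command (toy policy)."""
    sides = {c.side for c in contacts}
    total = sum(c.force for c in contacts)
    if "around" in sides and total > grab_force:
        return "toggle_gripper"  # grabbing the arm opens or closes the gripper
    if sides == {"left"}:
        return "move_right"      # a push guides the arm away from the contact
    if sides == {"right"}:
        return "move_left"
    return "hold"

print(interpret([Contact("left", 0.4)]))    # -> move_right
print(interpret([Contact("around", 1.5)]))  # -> toggle_gripper
```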

Looking ahead, the research team plans to explore programming reactions based on swipe and pinching motions, akin to those used on touchscreens. Their findings will be presented at the upcoming 2023 IEEE International Conference on Robotics and Automation (ICRA) by the team, which includes Ph.D. students Zilin Si and Tianhong Catherine Yu, as well as visiting undergraduate student Katrene Morozov from the University of California, Santa Barbara.
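One plausible way to detect such gestures from this kind of contact matrix is to track the pressure-weighted centroid of the touch across frames and read steady motion in one direction as a swipe. The sketch below assumes that approach, with an invented travel threshold; it is not drawn from the team's paper.

```python
# Hypothetical sketch: reading a swipe from successive contact centroids.
# Steady centroid motion in one direction across frames counts as a swipe;
# the travel threshold is illustrative, not taken from the paper.

def centroid(frame: list[list[float]]) -> tuple[float, float] | None:
    """Pressure-weighted centroid of one frame, or None if nothing is touched."""
    total = sum(sum(row) for row in frame)
    if total == 0:
        return None
    r = sum(i * sum(row) for i, row in enumerate(frame)) / total
    c = sum(j * v for row in frame for j, v in enumerate(row)) / total
    return r, c

def classify_swipe(path: list[tuple[float, float]],
                   min_travel: float = 2.0) -> str:
    """Label a centroid trajectory as a swipe along its dominant axis."""
    if len(path) < 2:
        return "none"
    dr = path[-1][0] - path[0][0]  # row (vertical) travel
    dc = path[-1][1] - path[0][1]  # column (horizontal) travel
    if abs(dc) >= max(min_travel, abs(dr)):
        return "swipe_right" if dc > 0 else "swipe_left"
    if abs(dr) >= min_travel:
        return "swipe_down" if dr > 0 else "swipe_up"
    return "none"

path = [(4.0, 1.0), (4.0, 2.5), (4.1, 4.2)]  # centroid drifting to the right
print(classify_swipe(path))  # -> swipe_right
```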

The collaboration among the team members played a pivotal role in bringing RobotSweater to fruition. McCann expressed his appreciation for the diverse expertise within the team, encompassing fabrication, robotics integration, sensing, and planning and control. This multidisciplinary approach ensured comprehensive coverage of all aspects of the project.

With RobotSweater's potential to enhance the interaction between humans and robots, the researchers have paved the way for more advanced and intuitive human-robot collaborations.
