Soft Robot with Humanlike Perception: Sensing Touch and Shape in Real Time

Photo by Andy Kelly. Two robots greeting.

Human beings use a complex system to perceive and interact with their surroundings. Our sensory system not only processes external stimuli but also anticipates the outcomes of our actions, which helps in guiding movements.

This concept of anticipation-based perception is well-established in neuroscience but poses challenges when applied to soft robots.

Unlike rigid robots, soft robots must account for the deformations of their flexible bodies, and they must distinguish changes caused by their own movement from those caused by external contact.

In soft robotics, the perception-action loop, that is, how the robot senses its environment and reacts to it, is still under development. Effective sensory methods such as tactile and force sensing have been developed, but they struggle with rapid, unexpected reactions and with complex 3D environments.

To address these challenges, multi-faceted perception systems are required, integrating sensors to detect and interpret a robot's movements, both internal and those arising from external interactions, much like perception and proprioception in the human nervous system.

The study: the robot
Technology inspired by the human perception-action loop has been proposed for soft robots. By integrating expected data with actual sensor data, such a system lets the robot quickly identify irregularities, such as unexpected contact.

Researchers have now developed a soft robot with such an advanced perception ability that it can detect and respond to external contact. The robot is designed with flexible rods and a sensing system integrated into the soft body, enabling it to compare expected and actual shapes during movement.

By measuring these deformations, the robot can distinguish between internal movements and external contacts, allowing it to detect the presence and direction of forces acting on it. The robot responds to changes in under a second, distinguishing between internal and external influences even without visual feedback.

Real-Time Sensing and Shape Estimation
The robot’s perception system works by calculating the expected shape based on movement and comparing it with the actual shape sensed by embedded sensors.

This process helps the robot determine whether an external force is acting on it and, if so, the direction and magnitude of that force. The system remains accurate even when external obstacles are encountered, maintaining high precision in estimating the robot's position regardless of the applied forces.
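The comparison described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the body shape is represented as an array of 3D points, and the function name, threshold, and single-point contact model are all assumptions for the sake of the example.

```python
import numpy as np

def detect_contact(expected_shape, sensed_shape, threshold=1e-3):
    """Compare the predicted body shape with the sensed one; a residual
    above the threshold is attributed to external contact rather than
    the robot's own motion. Shapes are (N, 3) arrays of points in metres."""
    residual = sensed_shape - expected_shape        # per-point deviation
    magnitudes = np.linalg.norm(residual, axis=1)   # deviation at each point
    idx = int(np.argmax(magnitudes))                # most-deformed point
    if magnitudes[idx] < threshold:
        return None                                 # explained by internal motion
    direction = residual[idx] / magnitudes[idx]     # unit vector of the push
    return idx, direction, magnitudes[idx]

# Toy example: a 5-point body where point 3 is pushed 5 mm along +x.
expected = np.zeros((5, 3))
sensed = expected.copy()
sensed[3, 0] = 0.005
contact = detect_contact(expected, sensed)   # (3, array([1., 0., 0.]), 0.005)
```

The key idea is that any residual the motion model cannot explain is treated as evidence of external contact, and its direction and size give the direction and magnitude of the force.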

Robot with real sensory nerves?
The robot can detect contact with obstacles of varying stiffness and can adapt its actions based on real-time feedback. For example, when the robot comes into contact with a rigid or soft object, it can accurately detect the force and direction of the contact.

This level of sensitivity enables the robot to make informed decisions in dynamic environments, even under conditions where visual feedback is unavailable, such as in dark or cluttered spaces.

Autonomous exploration and maze navigation
The enhanced sensory capabilities of this soft robot enable it to autonomously navigate environments, such as mazes, by detecting and responding to obstacles. The robot uses sensors embedded in its flexible body to detect interactions with walls and obstacles.

Through continuous shape and contact feedback, the robot adjusts its movements much as a human uses sensory nerves to explore, finding its way to the goal. This demonstrates a significant leap in autonomous soft robotics for tasks like exploration and manipulation.
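A contact-driven exploration step like the one described can be sketched as a simple reactive rule. This is an illustrative toy controller, not the controller from the paper: headings and contact directions are assumed to be 2D unit vectors, and the turn-away rule is a deliberately simple assumption.

```python
def reactive_step(heading, contact_dir):
    """One control step of a contact-driven explorer: keep moving along
    the current heading, and on contact rotate the heading 90 degrees
    away from the sensed push so the robot slides along the wall.
    Both vectors are 2D unit vectors (x, y)."""
    if contact_dir is None:
        return heading            # free space: keep going
    cx, cy = contact_dir
    return (cy, -cx)              # rotate the contact normal by -90 degrees

# The robot heads east, hits a wall that pushes it back west (-x),
# and turns to slide along the wall heading north.
h = reactive_step((1.0, 0.0), (-1.0, 0.0))   # h is (0.0, 1.0)
```

Chaining such steps with the contact feedback lets the robot feel its way along maze walls without any visual input, which is the behaviour the experiment demonstrates.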

This real-time interaction with the environment, reminiscent of the human nervous system, makes the robot well suited for tasks requiring delicate touch and intelligent decision-making.

Learning from human interaction
In another experiment, the robot was trained to perform specific tasks, like a massage, by a human operator. The operator manually guided the robot to desired positions and adjusted the force applied, which was measured by force sensors.

The robot learned the correct sequence of movements and force levels and then repeated them without further human input: the guided movements and forces were recorded, and those records were used to autonomously replicate the actions, with impressive precision in repeating the taught tasks.
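This record-and-replay scheme can be sketched as follows. It is a minimal illustration under simplifying assumptions: the class and method names are hypothetical, and a demonstration is reduced to a list of (position, force) waypoints rather than a continuous sensor stream.

```python
class DemoRecorder:
    """Record a human-guided demonstration as (position, force) pairs,
    then replay the same trajectory autonomously."""

    def __init__(self):
        self.trajectory = []

    def record(self, position, force):
        # Called while the operator guides the robot by hand.
        self.trajectory.append((position, force))

    def replay(self):
        # Yield the taught waypoints back to the robot's controller.
        for position, force in self.trajectory:
            yield position, force

rec = DemoRecorder()
rec.record((0.0, 0.0), 1.0)   # operator guides the arm, light press
rec.record((0.1, 0.0), 2.5)   # firmer press at the next waypoint
replayed = list(rec.replay())
```

Because the robot's own force sensing closes the loop during replay, it can reproduce not just the positions but also the contact forces of the taught massage-like motion.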

Enhanced perception for better performance
This sophisticated feedback system, which resembles the human proprioceptive system, enables the robot to perform complex tasks like navigation and manipulation with minimal errors, even in dynamic environments.

This approach holds promise for applications like autonomous navigation and human-robot collaboration.

About the scientific paper:

First author: Peiyi Wang, China and Singapore
Published: Nature Communications, November 2024
Link to paper: https://www.nature.com/articles/s41467-024-54327-6