Newswise — Scientists have developed a low-cost, energy-efficient robotic hand that can grip a range of objects without dropping them, using only the movement of its wrist and the feedback from its artificial skin.

Researchers at the University of Cambridge tackled the challenge of getting robots to grasp objects of varying sizes, shapes, and textures by building a soft, 3D-printed robotic hand. Although its fingers cannot move independently, the hand can still carry out a range of complex movements.

Using the sensors embedded in its artificial skin, the hand was trained to grip different objects and could predict whether it was about to drop them, adjusting its behaviour based on that sensory feedback.

Because the hand relies on passive movement, it is easier to control and more energy-efficient than robots with fully motorised fingers. The adaptable design could help pave the way for low-cost robots that move more naturally and can learn to grasp a wide range of objects. The results are reported in the journal Advanced Intelligent Systems.

Whereas traditional robots rely entirely on motors to produce movement, the natural world shows that complex, efficient movement emerges from the interplay between brain and body. Inspired by this, advances in 3D printing have made it possible to incorporate soft components into robot designs, and this emerging field of soft robotics promises machines that come closer to the efficiency and complexity of natural movement, opening up new applications in many fields.

Reproducing the dexterity and adaptability of the human hand in a robot remains a major research challenge: many of today's most advanced robots cannot manage manipulation tasks that small children perform with ease. Picking up an egg, for instance, requires just the right amount of force, something humans judge instinctively but robots find difficult; too much force breaks the egg, too little drops it. In addition, a fully motorised robot hand, with motors for each finger joint, consumes a large amount of energy.

In Professor Fumiya Iida's Bio-Inspired Robotics Laboratory in the Department of Engineering at the University of Cambridge, researchers have been working on solutions to both problems: a robot hand that can grasp a variety of objects with the right amount of pressure while using a minimal amount of energy.

According to co-author Dr Thomas George-Thuruthel, now based at University College London (UCL) East, earlier experiments in the laboratory had shown that wrist movement alone can produce a significant range of motion in a robot hand. Building on this, the team wanted to see whether a robot hand based on passive movement could not only grasp objects successfully, but also predict whether it was going to drop them and adapt its grip accordingly.

The researchers used a 3D-printed anthropomorphic hand implanted with tactile sensors, so that the hand could sense what it was touching. The hand was only capable of passive, wrist-based movement.

The team carried out more than 1,200 tests with the robot hand, observing its ability to grasp small objects without dropping them. The robot was initially trained using small 3D-printed plastic balls, and grasped them using a pre-defined action obtained through human demonstrations.
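The demonstration-based starting point can be pictured with a small sketch: a wrist trajectory is recorded once and then replayed for every grasp attempt. The poses, trajectory values, and replay interface below are purely illustrative assumptions, not the paper's actual setup.

```python
# Hypothetical illustration of "a pre-defined action obtained through human
# demonstrations": record a wrist trajectory once, then replay it for each grasp.
from dataclasses import dataclass

@dataclass
class WristPose:
    roll: float    # radians
    pitch: float   # radians
    height: float  # metres above the table

# A demonstrated grasp, stored as a short sequence of wrist poses (made-up values).
DEMONSTRATED_GRASP = [
    WristPose(0.00, 0.10, 0.20),  # approach above the object
    WristPose(0.05, 0.30, 0.08),  # lower and tilt so the fingers wrap passively
    WristPose(0.05, 0.30, 0.18),  # lift to test whether the grip holds
]

def replay(trajectory, send_pose=print):
    """Send each recorded pose to the wrist controller (here, just print it)."""
    for pose in trajectory:
        send_pose(pose)

if __name__ == "__main__":
    replay(DEMONSTRATED_GRASP)
```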

“This kind of hand has a bit of springiness to it: it can pick things up by itself without any actuation of the fingers,” said first author Dr Kieran Gilday, who is now based at EPFL in Lausanne, Switzerland. “The tactile sensors give the robot a sense of how well the grip is going, so it knows when it’s starting to slip. This helps it to predict when things will fail.”
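One way to picture that slip prediction is as a simple check on how the total pressure measured by the skin changes over time. The taxel count, window size, threshold, and the decision rule below are assumptions made for illustration, not the published model.

```python
# Minimal sketch: flag a likely drop when tactile pressure decays faster than expected.
import numpy as np

class SlipPredictor:
    """Predicts grasp failure from a short history of tactile pressure readings."""

    def __init__(self, n_taxels=16, window=10, drop_threshold=0.3):
        self.n_taxels = n_taxels              # pressure-sensing elements in the skin
        self.window = window                  # how many recent frames to keep
        self.drop_threshold = drop_threshold  # fractional pressure loss that signals slip
        self.history = []

    def update(self, pressures):
        """Add one frame of taxel pressures; return True if failure is predicted."""
        assert pressures.shape == (self.n_taxels,)
        self.history.append(pressures)
        self.history = self.history[-self.window:]
        if len(self.history) < self.window:
            return False                      # not enough data yet
        start = self.history[0].sum()
        end = self.history[-1].sum()
        if start <= 0:
            return False                      # nothing was being gripped
        # A rapid fall in total contact pressure suggests the object is slipping.
        return (start - end) / start > self.drop_threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    predictor = SlipPredictor()
    # Simulated trace: steady grip for 15 frames, then the object begins to slip.
    for t in range(30):
        base = 1.0 if t < 15 else max(0.0, 1.0 - 0.1 * (t - 15))
        frame = base * rng.uniform(0.8, 1.2, size=16)
        if predictor.update(frame):
            print(f"frame {t}: slip predicted")
            break
```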

After the initial training with the plastic balls, the hand attempted to grasp a variety of other objects through trial and error, learning which kind of grip would succeed for each one. The test objects included a peach, a computer mouse, and a roll of bubble wrap, among others, and the hand successfully grasped 11 of the 14 objects, showing how far passive, wrist-based movement combined with tactile sensing can go in handling objects of different shapes, sizes, and textures.
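A rough sketch of that trial-and-error process, assuming a simple random search over wrist-motion parameters and a toy stand-in for the real grasp outcome, might look like the following. The object list, the grasp() simulator, and the scoring rule are illustrative assumptions.

```python
# Hypothetical trial-and-error loop: try wrist-motion parameters on each object and
# keep whichever grasp scores as most stable.
import random

def grasp(obj, wrist_angle, approach_speed):
    """Stand-in for a real grasp attempt; returns a stability score in [0, 1]."""
    # Toy model: each object has a preferred wrist angle, and slower approaches help.
    preferred = {"peach": 0.6, "computer mouse": 0.3, "bubble wrap": 0.8}[obj]
    return max(0.0, 1.0 - abs(wrist_angle - preferred) - 0.2 * approach_speed)

def learn_grip(obj, trials=50, success_threshold=0.7):
    """Random-search trial and error: return the best parameters found, if any."""
    best = None
    for _ in range(trials):
        params = (random.uniform(0.0, 1.0), random.uniform(0.0, 1.0))
        score = grasp(obj, *params)
        if best is None or score > best[1]:
            best = (params, score)
    return best if best[1] >= success_threshold else None

if __name__ == "__main__":
    random.seed(1)
    for obj in ("peach", "computer mouse", "bubble wrap"):
        result = learn_grip(obj)
        print(obj, "->", "learned grip" if result else "no stable grip found", result)
```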

“The sensors, which are sort of like the robot’s skin, measure the pressure being applied to the object,” said George-Thuruthel. “We can’t say exactly what information the robot is getting, but it can theoretically estimate where the object has been grasped and with how much force.”
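As a minimal illustration of the kind of estimate mentioned above, a grid of taxel pressures can be reduced to a total grip force and a pressure-weighted contact location. The 4x4 grid layout and the uncalibrated units here are assumptions for the sketch.

```python
# Sketch: recover total force and a contact centroid from a 2-D array of taxel pressures.
import numpy as np

def contact_estimate(pressure_grid):
    """Return (total_force, (row, col) centroid) for a 2-D array of taxel pressures."""
    total = pressure_grid.sum()
    if total == 0:
        return 0.0, None                      # no contact detected
    rows, cols = np.indices(pressure_grid.shape)
    centroid = ((rows * pressure_grid).sum() / total,
                (cols * pressure_grid).sum() / total)
    return float(total), centroid

if __name__ == "__main__":
    # Pressure concentrated near the lower-right of a 4x4 patch of skin.
    grid = np.array([[0.0, 0.0, 0.0, 0.0],
                     [0.0, 0.1, 0.2, 0.1],
                     [0.0, 0.2, 0.6, 0.4],
                     [0.0, 0.1, 0.4, 0.3]])
    force, where = contact_estimate(grid)
    print(f"total force ~ {force:.2f}, contact centroid ~ {where}")
```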

“The robot learns that a combination of a particular motion and a particular set of sensor data will lead to failure, which makes it a customisable solution,” said Gilday. “The hand is very simple, but it can pick up a lot of objects with the same strategy.”

Professor Fumiya Iida, who led the research, said the main advantage of the design is the range of motion it achieves without using any actuators. By keeping the hand simple and relying on passive movement and tactile sensing, the robot gains a high degree of control and a lot of useful information while avoiding the complexity and energy cost of additional actuators. The researchers now aim to build on this design, adding actuators in a more efficient and targeted way so the hand can perform more complex behaviours.

A fully actuated robotic hand, besides the energy it requires, also poses a complex control problem. The Cambridge hand's passive design, which uses only a small number of sensors, is easier to control, provides a wide range of motion, and simplifies the learning process.

In future, the system could be expanded in several ways, such as by adding computer vision capabilities, or teaching the robot to exploit its environment, which would enable it to grasp a wider range of objects.

This work was funded by UK Research and Innovation (UKRI), and Arm Ltd. Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.

Journal Link: Advanced Intelligent Systems