
At first glance, this bionic hand, with its movable, slightly angled fingers and skin-colored tips, looks like a futuristic high-tech model from a science-fiction film. It could be Luke Skywalker's hand from Star Wars, but it isn't. What immediately catches the eye in the picture are the fingertips, which pack a surprising amount of technology.
In everyday life, we rarely think about how we move our fingers when we reach for a cup or a pen. With hand prostheses, things are different: even the most modern models demand extra mental effort from their users, because each individual finger has to be consciously opened and closed to achieve a secure grip.
A research team led by Marshall Trout from the University of Utah wants to tackle this problem using special sensors and artificial intelligence. For their neurotechnological experiments, they extended a commercially available prosthetic hand with sensors that enable more precise interaction with the environment. The fingertips house pressure sensors and optical proximity sensors that "see" objects before they are even touched. The AI uses these signals to automatically fine-tune movements and make gripping more intuitive.
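The control idea described above can be sketched in a few lines: proximity readings trigger pre-shaping of the fingers before contact, and closing stops once a light pressure threshold signals a secure hold. This is a minimal illustrative sketch; all names, thresholds, and the simple incremental controller are assumptions for illustration, not the team's actual system.

```python
# Hypothetical sketch of sensor-assisted grip control: proximity sensors
# "see" the object before contact, and the fingers close until a light
# pressure threshold indicates a secure hold. Thresholds are illustrative.

PROXIMITY_NEAR = 0.7   # normalized proximity reading that triggers closing
PRESSURE_GRIP = 0.05   # light contact force treated as a secure hold

def finger_command(proximity: float, pressure: float, aperture: float) -> float:
    """Return an updated finger aperture (1.0 = fully open, 0.0 = closed)."""
    if pressure >= PRESSURE_GRIP:
        return aperture                  # object held: stop closing
    if proximity >= PROXIMITY_NEAR:
        return max(0.0, aperture - 0.1)  # object near: close incrementally
    return min(1.0, aperture + 0.1)      # nothing near: relax open

def grasp(prox_readings, pressure_readings, aperture=1.0):
    """Run the control loop over a stream of paired sensor samples."""
    for prox, press in zip(prox_readings, pressure_readings):
        aperture = finger_command(prox, press, aperture)
    return aperture

# Example: an object approaches (proximity rises), then light contact is felt.
final = grasp(
    prox_readings=[0.1, 0.5, 0.8, 0.9, 0.9, 0.9],
    pressure_readings=[0.0, 0.0, 0.0, 0.0, 0.06, 0.06],
)
```

In this sketch the aperture closes in two steps once the object is near, then holds position as soon as contact pressure is registered, mirroring the "see before touch" behavior the article describes.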
The system was tested with four people whose arms had been amputated between the elbow and wrist. Simple everyday scenarios, such as picking up small objects, showed how sensitively the technology already works: the fingertips registered even an almost weightless wad of cotton. Together with the AI, the participants were able to perform a variety of grips safely and precisely, without needing much time to get used to the system.