Bionic hand given AI ‘mind of its own’ to improve life of amputees

By Stephen Beech

A bionic hand has been given an AI “mind of its own” to make life easier for amputees.

American scientists used state-of-the-art artificial intelligence to “fine-tune” the robotic prosthesis and improve manual dexterity.

They say the breakthrough will make everyday tasks – such as drinking from a plastic cup – more straightforward for people with artificial limbs.

The University of Utah team explained that whether reaching for a mug, a pencil or someone’s hand, you don’t need to consciously instruct each of your fingers on where they need to go to get a proper grip.

But the loss of that ability is one of the many challenges people with prosthetic arms and hands face.

Even with the most advanced robotic limbs, everyday activities come with an added burden as users purposefully open and close their fingers around a target.

The Utah team turned to AI to solve the problem.

By integrating proximity and pressure sensors into a commercial bionic hand, and then training an artificial neural network on grasping postures, the researchers developed an approach which they say is more like the natural, intuitive way people grip objects.

The study, published in the journal Nature Communications, showed that, when working in tandem with AI, participants achieved greater grip security, greater grip precision and less mental effort.

Most importantly, they were able to perform several everyday tasks – such as picking up small objects and raising a cup – using different styles of grip, all without extensive training or practice.

Study senior author Professor Jacob George said: “As lifelike as bionic arms are becoming, controlling them is still not easy or intuitive.”

Lead author Dr. Marshall Trout said: “Nearly half of all users will abandon their prosthesis, often citing their poor controls and cognitive burden.”

The team explained that one problem is that most commercial bionic arms and hands have no way of replicating the sense of touch that normally gives people intuitive, reflexive ways of grasping objects.

But dexterity is not simply a matter of sensory feedback.

The team say we also have subconscious models in our brains that simulate and anticipate hand-object interactions.

A “smart” hand would also need to learn those automatic responses over time.

The Utah team addressed the first problem by outfitting an artificial hand, manufactured by TASKA Prosthetics, with custom fingertips.

As well as detecting pressure, the fingertips were equipped with optical proximity sensors designed to replicate the finest sense of touch. For example, the fingers could detect an effectively weightless cotton ball being dropped on them.

For the second problem, they trained an artificial neural network model on the proximity data so that the fingers would naturally move to the exact distance necessary to form a perfect grasp of the object.

Because each finger has its own sensor and can “see” in front of it, each digit works in parallel to form a perfect, stable grasp across any object.
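As a rough illustration of that idea, here is a minimal sketch of per-finger, proximity-driven closure. The study's trained neural network is stood in for by a simple proportional rule, and all distances, gains and function names are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: each fingertip reports a distance-to-object reading,
# and each digit independently slows to a stop at a fixed grasp stand-off.
# The proportional rule below is a stand-in for the study's trained network.

GRASP_DISTANCE_MM = 2.0   # assumed stand-off at which a finger should stop
GAIN = 0.5                # assumed proportional gain

def closure_velocity(proximity_mm: float) -> float:
    """Return a closure speed that shrinks as the fingertip nears the object."""
    error = proximity_mm - GRASP_DISTANCE_MM
    return max(0.0, GAIN * error)

def step_fingers(proximities_mm: list) -> list:
    """Every finger runs the same rule in parallel: digits already near the
    object hold still while farther digits keep closing."""
    return [closure_velocity(p) for p in proximities_mm]

# Fingers at different distances converge independently on the object.
velocities = step_fingers([30.0, 12.0, 2.0])
```

Because each finger reacts only to its own sensor, an irregularly shaped object is gripped without any finger-by-finger instruction from the user.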

The team also created a bioinspired approach that involves sharing control between the user and the AI agent if, for example, the user wanted to open their hand to drop an object.

The success of the approach relied on finding the right balance between human and machine control.

Dr. Trout said: “What we don’t want is the user fighting the machine for control.

“In contrast, here the machine improved the precision of the user while also making the tasks easier.

“In essence, the machine augmented their natural control so that they could complete tasks without having to think about them.”
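One plausible way to realize that balance is a weighted blend of user and machine commands, with clear user intent overriding the machine. The weighting, threshold and sign convention below are assumptions for illustration only, not the study's actual control law.

```python
# Hypothetical sketch of shared control: the AI's grasp command is blended
# with the user's, but a deliberate "open" signal hands full authority back
# to the user so the two never fight for control. Numbers are assumptions.

OPEN_OVERRIDE = -0.5  # assumed user signal below which intent is "open hand"

def blend(user_cmd: float, ai_cmd: float, ai_weight: float = 0.7) -> float:
    """Commands lie in [-1, 1]: negative opens the hand, positive closes it."""
    if user_cmd <= OPEN_OVERRIDE:   # clear intent to release the object
        return user_cmd             # user overrides the machine entirely
    return (1 - ai_weight) * user_cmd + ai_weight * ai_cmd
```

Under this scheme the machine dominates fine grasp adjustments, while a firm open gesture always releases the object immediately.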

The team also conducted studies with four participants whose amputations fall between the elbow and wrist.

They attempted multiple everyday activities that required fine motor control.

Simple tasks, such as drinking from a plastic cup, can be extremely difficult for an amputee: squeezing too softly will see the cup dropped, while squeezing too hard will break it.

George said: “By adding some artificial intelligence, we were able to offload this aspect of grasping to the prosthesis itself.

“The end result is more intuitive and more dexterous control, which allows simple tasks to be simple again.”
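The offloaded grip-force problem can be pictured as keeping force between a slip threshold and a crush threshold using pressure feedback. This is only a sketch of the general idea; the force values, increment and function below are invented for illustration and do not come from the study.

```python
# Hypothetical sketch of force regulation for a fragile object such as a
# plastic cup: pressure feedback nudges grip force up when slip is detected,
# while the force is capped below an assumed crush limit.

SLIP_FORCE_N = 1.0    # assumed minimum force needed to hold the cup
CRUSH_FORCE_N = 4.0   # assumed force at which the cup buckles

def regulate(current_force_n: float, slipping: bool) -> float:
    """Increase force slightly on detected slip; clamp to the safe range."""
    if slipping:
        current_force_n += 0.2
    return min(max(current_force_n, SLIP_FORCE_N), CRUSH_FORCE_N)
```

The user simply intends "hold the cup"; the prosthesis keeps the force inside the safe band without conscious effort.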

The work is part of the Utah NeuroRobotics Lab’s larger vision to improve amputees’ quality of life.

George said: “The study team is also exploring implanted neural interfaces that allow individuals to control prostheses with their mind and even get a sense of touch coming back from this.”

He added: “The team plans to blend these technologies, so that their enhanced sensors can improve tactile function and the intelligent prosthesis can blend seamlessly with thought-based control.”

 
