Most Advanced Artificial Touch for Brain-Controlled Bionic Hand

For the first time, a complex sense of touch for individuals living with spinal cord injuries is a step closer to reality. A new study published in Science paves the way for complex touch sensation through brain stimulation while using an extracorporeal bionic limb that is attached to a chair or wheelchair.

The researchers, who are all part of the US-based Cortical Bionics Research Group, have discovered a unique method for encoding natural touch sensations of the hand via specific microstimulation patterns delivered through electrodes implanted in the brain. This allows individuals with spinal cord injuries not only to control a bionic arm with their brain, but also to feel tactile edges, shapes, curvatures and movements that until now have not been possible.

"In this work, the research went beyond anything that has been done before in the field of brain-computer interfaces (BCIs): we conveyed tactile sensations related to orientation, curvature, motion and 3D shapes for a participant using a brain-controlled bionic limb. We are at another level of artificial touch now. We think this richness is crucial for achieving the level of dexterity, manipulation, and the highly dimensional tactile experience typical of the human hand," says Giacomo Valle, lead author of the study and Assistant Professor at Chalmers University of Technology, Sweden.

A sense of touch builds richness and independence in our everyday lives. For individuals living with a spinal cord injury, the electrical signals travelling from the hand to the brain that should produce tactile sensations are blocked by the injury, and that sense of touch is lost. A bionic limb controlled by the user's brain signals can bring back some functionality and independence to someone with a paralysed hand, but without the sense of touch it is very difficult to lift, hold and manipulate objects. Previously, a bionic hand would not be perceived by the user as part of the body, since it would not provide any sensory feedback like a biological hand. This study aimed to improve the usability of an extracorporeal bionic limb, which would be mounted on a wheelchair or similar equipment close to the user.

For the study, two BCI participants were fitted with chronic brain implants in the sensory and motor regions of the brain that represent the arm and hand. Over several years, the researchers recorded and decoded the different patterns of electrical activity in the brain related to motor intention of the arm and hand. This was possible because the electrical activity was still present in the brain; the paralysis simply blocked it from reaching the hand. Decoding and deciphering brain signals with this technology is unique and allows the participants to directly control a bionic arm and hand with the brain for interacting with the environment.

The participants were able to accomplish a series of complex experiments that required rich tactile sensations. To do this, the researchers delivered specific stimulation patterns directly to the participants' brains via the implants.

"We found a way to type these 'tactile messages' via microstimulation using the tiny electrodes in the brain and we found a unique way to encode complex sensations. This allowed for more vivid sensory feedback and experience while using a bionic hand," says Valle.

The participants could feel the edge of an object, as well as the direction of motion along the fingertips.
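One way to picture how an "edge" could be written as a stimulation pattern is to stimulate the group of electrodes whose projected fields on the fingertip line up along the edge's orientation. The sketch below is purely illustrative: the electrode layout, field locations and selection rule are assumptions, not the encoding scheme used in the study.

```python
import numpy as np

# Hypothetical 2D projected-field locations (in mm) of 32 electrodes,
# each corresponding to a spot on the fingertip.
rng = np.random.default_rng(1)
fields = rng.uniform(0, 10, size=(32, 2))

def edge_pattern(angle_deg: float, offset: float, width: float = 1.0):
    """Select electrodes whose field centre lies within `width` mm of a
    line with the given orientation, located `offset` mm along the
    line's normal. Returns a boolean mask of electrodes to stimulate."""
    theta = np.deg2rad(angle_deg)
    normal = np.array([np.sin(theta), -np.cos(theta)])  # unit normal to the edge
    dist = np.abs(fields @ normal - offset)
    return dist < width

# A vertical edge (90 degrees) crossing the middle of the fingertip
mask = edge_pattern(90.0, offset=5.0)
print(f"{mask.sum()} of {len(fields)} electrodes stimulated")
```

Sweeping the `offset` over time would turn the same static pattern into a moving edge, analogous to the motion sensations the participants reported.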

By utilising the brain-computer interface, the researchers could decode the intention of motion from the participant's brain in order to control a bionic arm. Since the bionic arm has sensors on it, when an object comes into contact with these sensors, a corresponding stimulation is sent to the brain and the participant feels the sensation as if it were in their own hand. This means that the participants could potentially complete complex tasks with a bionic arm with more accuracy than was previously possible, such as picking up an object and moving it from one location to another.
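The sensory half of that loop, from sensor contact on the bionic hand to a stimulation command for the implant, can be sketched as a simple mapping. Every interface below is a hypothetical stand-in: finger-to-electrode-group assignments and amplitude ranges are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    finger: str      # which fingertip sensor fired
    pressure: float  # normalised contact pressure, 0..1

def to_stimulation(reading: SensorReading) -> dict:
    """Map a contact event to a hypothetical stimulation command: the
    electrode group stands in for the finger's cortical representation,
    and pulse amplitude scales with contact pressure."""
    electrode_group = {"thumb": 0, "index": 1, "middle": 2}[reading.finger]
    pressure = min(max(reading.pressure, 0.0), 1.0)
    amplitude_ua = 20 + 60 * pressure  # microamps, clamped range
    return {"group": electrode_group, "amplitude_ua": amplitude_ua}

# Example: the index fingertip touches an object with moderate pressure
cmd = to_stimulation(SensorReading(finger="index", pressure=0.5))
print(cmd)  # {'group': 1, 'amplitude_ua': 50.0}
```

In a working system this mapping would run continuously, closing the loop between the decoded motor commands and the evoked tactile feedback.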

This research is just the first step towards patients with spinal cord injuries being able to feel this level of complex touch. To capture all the features of complex touch that the researchers can now encode and convey to the user, more sophisticated sensors and robotic technology are needed (for example, prosthetic skin). The implantable stimulation technology would also require further development to increase the repertoire of sensations.

Valle G, Alamri AH, Downey JE, Lienkämper R, Jordan PM, Sobinov AR, Endsley LJ, Prasad D, Boninger ML, Collinger JL, Warnke PC, Hatsopoulos NG, Miller LE, Gaunt RA, Greenspon CM, Bensmaia SJ.
Tactile edges and motion via patterned microstimulation of the human somatosensory cortex.
Science. 2025 Jan 17;387(6731):315-322. doi: 10.1126/science.adq5978
