Most Advanced Artificial Touch for Brain-Controlled Bionic Hand

For the first time, a complex sense of touch for individuals living with spinal cord injuries is a step closer to reality. A new study published in Science paves the way for complex touch sensation through brain stimulation while using an extracorporeal bionic limb attached to a chair or wheelchair.

The researchers, all part of the US-based Cortical Bionics Research Group, have discovered a unique method for encoding natural touch sensations of the hand via specific microstimulation patterns delivered through electrodes implanted in the brain. This allows individuals with spinal cord injuries not only to control a bionic arm with their brain, but also to feel tactile edges, shapes, curvatures and movements that until now could not be conveyed.

"In this work, for the first time the research went beyond anything that has been done before in the field of brain-computer interfaces (BCI) - we conveyed tactile sensations related to orientation, curvature, motion and 3D shapes for a participant using a brain-controlled bionic limb. We are in another level of artificial touch now. We think this richness is crucial for achieving the level of dexterity, manipulation, and a highly dimensional tactile experience typical of the human hand,” says Giacomo Valle, lead author of the study and Assistant Professor at Chalmers University of Technology, in Sweden.

A sense of touch builds richness and independence in our everyday lives. For individuals living with a spinal cord injury, the electrical signals travelling from the hand to the brain that should allow them to feel tactile sensations are blocked by the injury, and that sense of touch is lost. A bionic limb controlled by the user’s brain signals can bring back some functionality and independence to someone with a paralysed hand, but without the sense of touch it is very difficult to lift, hold and manipulate objects. Previously, a bionic hand would not be perceived by the user as part of the body, since it would not provide any sensory feedback like a biological hand. This study aimed to improve the usability of an extracorporeal bionic limb, which would be mounted on a wheelchair or similar equipment close to the user.

For the study, two BCI participants were fitted with chronic brain implants in the sensory and motor regions of the brain that represent the arm and hand. Over the course of several years, the researchers were able to record and decode the different patterns of electrical activity that occurred in the brain relating to motor intention of the arm and hand. This was possible because the electrical activity was still present in the brain; the paralysis simply prevented it from reaching the hand. Decoding and deciphering brain signals with this technology is unique and allows the participants to directly control a bionic arm and hand with the brain to interact with the environment.
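The article does not spell out the decoding algorithm, so purely as an illustration of the general idea, the sketch below shows a toy linear decoder that maps binned firing rates from a motor-cortex implant to an intended hand velocity. The channel count, calibration routine and class name are all invented for the example and are not taken from the study.

```python
import numpy as np

N_CHANNELS = 96  # hypothetical number of recording channels

class LinearVelocityDecoder:
    """Toy decoder mapping binned firing rates to intended 2-D hand velocity.

    Real intracortical BCIs use carefully calibrated decoders; this sketch
    only illustrates the general recording-to-movement mapping.
    """

    def __init__(self, n_channels: int = N_CHANNELS):
        self.weights = np.zeros((2, n_channels))  # x/y velocity weights
        self.baseline = np.zeros(n_channels)      # mean firing rate per channel

    def calibrate(self, firing_rates: np.ndarray, intended_velocity: np.ndarray):
        """Least-squares fit from firing rates (T x channels), collected while
        the participant attempts instructed movements, to velocities (T x 2)."""
        self.baseline = firing_rates.mean(axis=0)
        centered = firing_rates - self.baseline
        solution, *_ = np.linalg.lstsq(centered, intended_velocity, rcond=None)
        self.weights = solution.T

    def decode(self, rate_bin: np.ndarray) -> np.ndarray:
        """Map one time bin of firing rates to a velocity command for the arm."""
        return self.weights @ (rate_bin - self.baseline)
```

In practice such a decoder would be recalibrated regularly and combined with smoothing, for example a Kalman filter, but the core idea of turning recorded motor-cortex activity into movement commands is the same.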

The participants were able to complete a series of complex experimental tasks that required rich tactile sensations. To do this, the researchers 'typed' specific stimulation patterns directly into the participants' brains via the implants.

"We found a way to type these 'tactile messages' via microstimulation using the tiny electrodes in the brain and we found a unique way to encode complex sensations. This allowed for more vivid sensory feedback and experience while using a bionic hand," says Valle.

The participants could feel the edge of an object, as well as the direction of motion along their fingertips.

By utilising the brain-computer interface, the researchers could decode the intention of motion from the participant’s brain in order to control a bionic arm. Since the bionic arm has sensors on it, when an object comes into contact with these sensors, the corresponding stimulation is sent to the brain and the participant feels the sensation as if it were in their own hand. This means that the participants could potentially complete complex tasks with a bionic arm with more accuracy than was previously possible, such as picking up an object and moving it from one location to another.
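Putting the pieces together, one cycle of a hypothetical closed loop might look like the sketch below, reusing the toy decoder and edge encoder from the earlier examples. The implant, bionic_arm and stimulator objects are placeholder interfaces invented for illustration, not APIs from the study.

```python
def closed_loop_step(implant, decoder, bionic_arm, stimulator):
    """One cycle of a hypothetical sensorimotor loop: decode intent, move the
    arm, and return touch events from the arm's sensors as brain stimulation.
    All four arguments are placeholder interfaces, not APIs from the study."""
    # 1. Read one time bin of neural activity from the motor-cortex implant.
    firing_rates = implant.read_firing_rates()

    # 2. Decode the intended movement and command the bionic arm.
    velocity = decoder.decode(firing_rates)
    bionic_arm.move(velocity)

    # 3. Convert any sensor contact into a stimulation pattern, so the
    #    participant feels the touch as if it were on their own hand.
    for contact in bionic_arm.read_contacts():
        pattern = edge_stimulation_sequence(contact.edge_orientation_deg)
        stimulator.deliver(pattern)

# Such a loop would run continuously at the system's update rate,
# for example around 50 Hz:
#
#   while True:
#       closed_loop_step(implant, decoder, bionic_arm, stimulator)
```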

This research is just the first step towards patients with spinal cord injuries being able to feel this level of complex touch. To capture all the features of complex touch that the researchers are now able to encode and convey to the user, more sophisticated sensors and robotic technology are needed, for example prosthetic skin. The implantable stimulation technology would also require further development to increase the repertoire of sensations.

Valle G, Alamri AH, Downey JE, Lienkämper R, Jordan PM, Sobinov AR, Endsley LJ, Prasad D, Boninger ML, Collinger JL, Warnke PC, Hatsopoulos NG, Miller LE, Gaunt RA, Greenspon CM, Bensmaia SJ.
Tactile edges and motion via patterned microstimulation of the human somatosensory cortex.
Science. 2025 Jan 17;387(6731):315-322. doi: 10.1126/science.adq5978
