Researchers Take Step Toward Next-Generation Brain-Computer Interface System

Brain-computer interfaces (BCIs) are emerging assistive devices that may one day help people with brain or spinal injuries to move or communicate. BCI systems depend on implantable sensors that record electrical signals in the brain and use those signals to drive external devices like computers or robotic prosthetics.

Most current BCI systems use one or two sensors to sample up to a few hundred neurons, but neuroscientists are interested in systems that are able to gather data from much larger groups of brain cells.

Now, a team of researchers has taken a key step toward a new concept for a future BCI system - one that employs a coordinated network of independent, wireless microscale neural sensors, each about the size of a grain of salt, to record and stimulate brain activity. The sensors, dubbed "neurograins," independently record the electrical pulses made by firing neurons and send the signals wirelessly to a central hub, which coordinates and processes the signals.

In a study published on August 12 in Nature Electronics, the research team demonstrated the use of nearly 50 such autonomous neurograins to record neural activity in a rodent.

The results, the researchers say, are a step toward a system that could one day enable the recording of brain signals in unprecedented detail, leading to new insights into how the brain works and new therapies for people with brain or spinal injuries.

"One of the big challenges in the field of brain-computer interfaces is engineering ways of probing as many points in the brain as possible," said Arto Nurmikko, a professor in Brown's School of Engineering and the study's senior author. "Up to now, most BCIs have been monolithic devices - a bit like little beds of needles. Our team's idea was to break up that monolith into tiny sensors that could be distributed across the cerebral cortex. That’s what we’ve been able to demonstrate here."

The team, which includes experts from Brown, Baylor University, the University of California San Diego and Qualcomm, began developing the system about four years ago. The challenge was twofold, said Nurmikko, who is affiliated with Brown's Carney Institute for Brain Science. The first part required shrinking the complex electronics involved in detecting, amplifying and transmitting neural signals into the tiny silicon neurograin chips. The team first designed and simulated the electronics on a computer, then went through several fabrication iterations to develop operational chips.

The second challenge was developing the body-external communications hub that receives signals from those tiny chips. The device is a thin patch, about the size of a thumbprint, that attaches to the scalp outside the skull. It works like a miniature cellular phone tower, employing a network protocol to coordinate the signals from the neurograins, each of which has its own network address. The patch also supplies power wirelessly to the neurograins, which are designed to operate on a minimal amount of electricity.
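To make the coordination idea in that paragraph concrete, the sketch below shows one simple way a hub could poll many individually addressed sensors in turn, loosely analogous to a cell tower scheduling handsets. It is a hypothetical illustration only, not the team's actual protocol or code; the Neurograin and Hub names and the random data are invented for this example.

```python
# Hypothetical sketch of a hub coordinating addressed sensors (illustration only).
import random
from dataclasses import dataclass, field


@dataclass
class Neurograin:
    """Stand-in for one implanted sensor with its own network address."""
    address: int

    def sample(self) -> list[float]:
        # Placeholder for a short burst of digitized neural data.
        return [random.gauss(0.0, 1.0) for _ in range(4)]


@dataclass
class Hub:
    """Stand-in for the external patch that polls each sensor in its own slot."""
    grains: list[Neurograin]
    log: dict[int, list[float]] = field(default_factory=dict)

    def poll_all(self) -> None:
        # Give each neurograin a time slot, keyed by its network address,
        # and collect whatever it recorded since the last poll.
        for grain in self.grains:
            self.log.setdefault(grain.address, []).extend(grain.sample())


if __name__ == "__main__":
    hub = Hub(grains=[Neurograin(address=a) for a in range(48)])
    hub.poll_all()
    print(f"Collected data from {len(hub.log)} addressed neurograins")
```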

"This work was a true multidisciplinary challenge," said Jihun Lee, a postdoctoral researcher at Brown and the study's lead author. "We had to bring together expertise in electromagnetics, radio frequency communication, circuit design, fabrication and neuroscience to design and operate the neurograin system."

The goal of this new study was to demonstrate that the system could record neural signals from a living brain - in this case, the brain of a rodent. The team placed 48 neurograins on the animal’s cerebral cortex, the outer layer of the brain, and successfully recorded characteristic neural signals associated with spontaneous brain activity.

The team also tested the devices’ ability to stimulate the brain as well as record from it. Stimulation is delivered as tiny electrical pulses that can evoke neural activity. The stimulation is driven by the same hub that coordinates neural recording and, the researchers hope, could one day restore brain function lost to illness or injury.

The size of the animal’s brain limited the team to 48 neurograins for this study, but the data suggest that the current configuration of the system could support up to 770. Ultimately, the team envisions scaling up to many thousands of neurograins, which would provide a currently unattainable picture of brain activity.

"It was a challenging endeavor, as the system demands simultaneous wireless power transfer and networking at the mega-bit-per-second rate, and this has to be accomplished under extremely tight silicon area and power constraints," said Vincent Leung, an associate professor in the Department of Electrical and Computer Engineering at Baylor. "Our team pushed the envelope for distributed neural implants."

There's much more work to be done to make that complete system a reality, but researchers said this study represents a key step in that direction.

"Our hope is that we can ultimately develop a system that provides new scientific insights into the brain and new therapies that can help people affected by devastating injuries," Nurmikko said.

Lee, J., Leung, V., Lee, A. H. et al.
Neural recording and stimulation using wireless networks of microimplants.
Nat Electron (2021). doi: 10.1038/s41928-021-00631-8
