In 10 Seconds, an AI Model Detects Cancerous Brain Tumor Often Missed During Surgery

Researchers have developed an AI-powered model that can determine during surgery, in 10 seconds, whether any removable portion of a cancerous brain tumor remains, according to a study published in Nature.

The technology, called FastGlioma, outperformed conventional methods for identifying what remains of a tumor by a wide margin, according to the research team led by the University of Michigan and the University of California San Francisco.

"FastGlioma is an artificial intelligence-based diagnostic system that has the potential to change the field of neurosurgery by immediately improving comprehensive management of patients with diffuse gliomas," said senior author  Todd Hollon, M.D., a neurosurgeon at University of Michigan Health and assistant professor of neurosurgery at U-M Medical School.

"The technology works faster and more accurately than current standard of care methods for tumor detection and could be generalized to other pediatric and adult brain tumor diagnoses. It could serve as a foundational model for guiding brain tumor surgery."

When a neurosurgeon removes a life-threatening tumor from a patient’s brain, they are rarely able to remove the entire mass.

What remains is known as residual tumor.

The tumor is commonly missed during the operation because surgeons cannot reliably distinguish residual tumor from healthy brain tissue in the cavity where the mass was removed; this resemblance remains a major challenge in surgery.

Neurosurgical teams employ different methods to locate that residual tumor during a procedure.

They may use intraoperative MRI, which requires machinery that is not available everywhere, or a fluorescent imaging agent to highlight tumor tissue, which does not work for all tumor types. These limitations prevent either approach from being used widely.

In this international study of the AI-driven technology, neurosurgical teams analyzed fresh, unprocessed specimens sampled from 220 patients who had operations for low- or high-grade diffuse glioma.

FastGlioma detected and calculated how much tumor remained with an average accuracy of approximately 92%.

In a comparison of surgeries guided by FastGlioma predictions or image- and fluorescent-guided methods, the AI technology missed high-risk, residual tumor just 3.8% of the time - compared to a nearly 25% miss rate for conventional methods.

"This model is an innovative departure from existing surgical techniques by rapidly identifying tumor infiltration at microscopic resolution using AI, greatly reducing the risk of missing residual tumor in the area where a glioma is resected," said co-senior author Shawn Hervey-Jumper, M.D., professor of neurosurgery at University of California San Francisco and a former neurosurgery resident at U-M Health.

"The development of FastGlioma can minimize the reliance on radiographic imaging, contrast enhancement or fluorescent labels to achieve maximal tumor removal."

How it Works

To assess what remains of a brain tumor, FastGlioma combines microscopic optical imaging with a type of artificial intelligence called foundation models. These are AI models, such as GPT-4 and DALL·E 3, that are trained on massive, diverse datasets and can be adapted to a wide range of tasks.

After large-scale training, foundation models can classify images, act as chatbots, reply to emails and generate images from text descriptions.

To build FastGlioma, investigators pre-trained the visual foundation model using over 11,000 surgical specimens and 4 million unique microscopic fields of view.

The tumor specimens are imaged through stimulated Raman histology (SRH), a method of rapid, high-resolution optical imaging developed at U-M. The same technology was used to train DeepGlioma, an AI-based diagnostic screening system that detects a brain tumor’s genetic mutations in under 90 seconds.
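As a rough illustration of the kind of pipeline described above, the minimal sketch below embeds microscopic fields of view from an SRH specimen with a pre-trained vision encoder and aggregates per-patch predictions into an infiltration score. The class names, encoder, embedding size, and threshold are assumptions for illustration, not the authors' published implementation.

```python
# Minimal sketch (not FastGlioma's actual code): score residual tumor from SRH fields of view
# using a frozen, pre-trained vision foundation model plus a small task-specific head.
import torch
import torch.nn as nn


class PatchScorer(nn.Module):
    """Embed SRH fields of view with a pre-trained encoder, then score each patch."""

    def __init__(self, encoder: nn.Module, embed_dim: int = 768):
        super().__init__()
        self.encoder = encoder                # vision foundation model, pre-trained on SRH patches
        self.head = nn.Linear(embed_dim, 1)   # lightweight tumor-infiltration head

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (N, C, H, W) fields of view cropped from one specimen image
        with torch.no_grad():                      # keep the foundation model frozen
            features = self.encoder(patches)       # (N, embed_dim) patch embeddings
        return torch.sigmoid(self.head(features))  # per-patch infiltration probability


def infiltration_score(model: PatchScorer, patches: torch.Tensor, threshold: float = 0.5) -> float:
    """Fraction of fields of view flagged as tumor-infiltrated (a proxy for residual tumor)."""
    probs = model(patches).squeeze(-1)
    return (probs > threshold).float().mean().item()
```

In a setup like this, the expensive step of pre-training the encoder on millions of unlabeled fields of view is done once; only the small head needs labeled examples, which is what makes the foundation-model approach attractive when labeled medical data are scarce.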

"FastGlioma can detect residual tumor tissue without relying on time-consuming histology procedures and large, labeled datasets in medical AI, which are scarce," said Honglak Lee, Ph.D., co-author and professor of computer science and engineering at U-M.

Full-resolution images take around 100 seconds to acquire using stimulated Raman histology; a "fast mode" lower-resolution image takes just 10 seconds.

Researchers found that the full-resolution model achieved accuracy up to 92%, with the fast mode slightly lower at approximately 90%.
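The trade-off reported above can be summarized in a small, purely illustrative helper; the function and mode names are hypothetical, and the timings and accuracies are simply the figures quoted in this article.

```python
# Illustrative only: pick an SRH acquisition mode under an intraoperative time budget,
# using the acquisition times and approximate accuracies reported in the article.
SRH_MODES = {
    "full_resolution": {"acquire_seconds": 100, "approx_accuracy": 0.92},
    "fast_mode": {"acquire_seconds": 10, "approx_accuracy": 0.90},
}


def choose_mode(time_budget_seconds: float) -> str:
    """Prefer full resolution when the time budget allows; otherwise fall back to fast mode."""
    if time_budget_seconds >= SRH_MODES["full_resolution"]["acquire_seconds"]:
        return "full_resolution"
    return "fast_mode"


print(choose_mode(30))   # -> fast_mode
print(choose_mode(120))  # -> full_resolution
```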

"This means that we can detect tumor infiltration in seconds with extremely high accuracy, which could inform surgeons if more resection is needed during an operation," Hollon said.

AI's Future in Cancer

Over the last 20 years, the rates of residual tumor after neurosurgery have not improved.

Not only does residual tumor result in worse quality of life and earlier death for patients, but it also increases the burden on health systems, which are projected to need 45 million surgical procedures annually worldwide by 2030.

Global cancer initiatives have recommended incorporating new technologies, including advanced methods of imaging and AI, into cancer surgery.

In 2015, The Lancet Oncology Commission on global cancer surgery noted that "the need for cost effective... approaches to address surgical margins in cancer surgery provides a potent drive for novel technologies."

Researchers say FastGlioma is not only an accessible and affordable tool for neurosurgical teams operating on gliomas; it can also accurately detect residual tumor for several non-glioma diagnoses, including pediatric brain tumors such as medulloblastoma and ependymoma, and meningiomas.

"These results demonstrate the advantage of visual foundation models such as FastGlioma for medical AI applications and the potential to generalize to other human cancers without requiring extensive model retraining or fine-tuning,” said co-author said Aditya S. Pandey, M.D., chair of the Department of Neurosurgery at U-M Health.

"In future studies, we will focus on applying the FastGlioma workflow to other cancers, including lung, prostate, breast, and head and neck cancers."

Kondepudi A, Pekmezci M, Hou X, Scotford K, Jiang C, Rao A, Harake ES, Chowdury A, Al-Holou W, Wang L, Pandey A, Lowenstein PR, Castro MG, Koerner LI, Roetzer-Pejrimovsky T, Widhalm G, Camelo-Piragua S, Movahed-Ezazi M, Orringer DA, Lee H, Freudiger C, Berger M, Hervey-Jumper S, Hollon T.
Foundation models for fast, label-free detection of glioma infiltration.
Nature. 2024 Nov 13. doi: 10.1038/s41586-024-08169-3
