In 10 Seconds, an AI Model Detects Cancerous Brain Tumor Often Missed During Surgery

Researchers have developed an AI-powered model that can determine, within 10 seconds during surgery, whether any removable part of a cancerous brain tumor remains, according to a study published in Nature.

The technology, called FastGlioma, outperformed conventional methods for identifying what remains of a tumor by a wide margin, according to the research team led by the University of Michigan and the University of California San Francisco.

"FastGlioma is an artificial intelligence-based diagnostic system that has the potential to change the field of neurosurgery by immediately improving comprehensive management of patients with diffuse gliomas," said senior author Todd Hollon, M.D., a neurosurgeon at University of Michigan Health and assistant professor of neurosurgery at U-M Medical School.

"The technology works faster and more accurately than current standard-of-care methods for tumor detection and could be generalized to other pediatric and adult brain tumor diagnoses. It could serve as a foundational model for guiding brain tumor surgery."

When a neurosurgeon removes a life-threatening tumor from a patient’s brain, they are rarely able to remove the entire mass.

What remains is known as residual tumor.

Commonly, tumor tissue is missed during the operation because surgeons are unable to differentiate healthy brain from residual tumor in the cavity where the mass was removed. The resemblance of residual tumor to healthy brain tissue remains a major challenge in surgery.

Neurosurgical teams employ different methods to locate that residual tumor during a procedure.

They may use intraoperative MRI, which requires machinery that is not available everywhere. The surgeon might also use a fluorescent imaging agent to identify tumor tissue, but such agents are not applicable to all tumor types. These limitations prevent the widespread use of either method.

In this international study of the AI-driven technology, neurosurgical teams analyzed fresh, unprocessed specimens sampled from 220 patients who had operations for low- or high-grade diffuse glioma.

FastGlioma detected and calculated how much tumor remained with an average accuracy of approximately 92%.

In a comparison of surgeries guided by FastGlioma predictions versus image- and fluorescence-guided methods, the AI technology missed high-risk residual tumor just 3.8% of the time, compared with a nearly 25% miss rate for conventional methods.

"This model is an innovative departure from existing surgical techniques by rapidly identifying tumor infiltration at microscopic resolution using AI, greatly reducing the risk of missing residual tumor in the area where a glioma is resected," said co-senior author Shawn Hervey-Jumper, M.D., professor of neurosurgery at University of California San Francisco and a former neurosurgery resident at U-M Health.

"The development of FastGlioma can minimize the reliance on radiographic imaging, contrast enhancement or fluorescent labels to achieve maximal tumor removal."

How it Works

To assess what remains of a brain tumor, FastGlioma combines microscopic optical imaging with a type of artificial intelligence called foundation models. Foundation models, such as GPT-4 and DALL·E 3, are trained on massive, diverse datasets and can be adapted to a wide range of tasks.

After large-scale training, foundation models can classify images, act as chatbots, reply to emails and generate images from text descriptions.

To build FastGlioma, investigators pre-trained the visual foundation model using over 11,000 surgical specimens and 4 million unique microscopic fields of view.
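The article does not detail the training recipe, but the general foundation-model pattern it describes, pre-training a visual encoder on many unlabeled fields of view and then adapting it to a new task with a small amount of labeled data, can be sketched as follows. Everything here is an illustrative assumption: the encoder is a toy random projection standing in for a pre-trained network, and the names (`encode_patch`, `score_infiltration`) and data are invented for the example, not taken from FastGlioma.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained visual encoder. In FastGlioma this role is
# played by a network pre-trained on ~4 million microscopic fields of view;
# here it is just a fixed random projection used to illustrate the workflow.
W = rng.normal(size=(16, 4))

def encode_patch(patch):
    """Map a flattened image patch to a low-dimensional embedding."""
    return patch @ W

# Lightweight task head: nearest-centroid scoring between "tumor-like" and
# "normal-like" embeddings, fit on a handful of labeled examples. This is
# the "adapt the frozen foundation model" step, done without retraining W.
tumor_examples = rng.normal(loc=3.0, size=(20, 16))
normal_examples = rng.normal(loc=-3.0, size=(20, 16))
tumor_centroid = encode_patch(tumor_examples).mean(axis=0)
normal_centroid = encode_patch(normal_examples).mean(axis=0)

def score_infiltration(patch):
    """Return 1.0 if the patch embedding is closer to the tumor centroid."""
    z = encode_patch(patch)
    d_tumor = np.linalg.norm(z - tumor_centroid)
    d_normal = np.linalg.norm(z - normal_centroid)
    return float(d_tumor < d_normal)

tumor_like = rng.normal(loc=3.0, size=16)
normal_like = rng.normal(loc=-3.0, size=16)
print(score_infiltration(tumor_like), score_infiltration(normal_like))
```

The point of the pattern is that the expensive pre-training happens once; adapting to a new classification task only requires fitting the small head on top of frozen embeddings.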

The tumor specimens were imaged through stimulated Raman histology, a method of rapid, high-resolution optical imaging developed at U-M. The same technology was used to train DeepGlioma, an AI-based diagnostic screening system that detects a brain tumor’s genetic mutations in under 90 seconds.

"FastGlioma can detect residual tumor tissue without relying on time-consuming histology procedures and large, labeled datasets in medical AI, which are scarce," said Honglak Lee, Ph.D., co-author and professor of computer science and engineering at U-M.

Full-resolution images take around 100 seconds to acquire using stimulated Raman histology; a lower-resolution "fast mode" image takes just 10 seconds.

Researchers found that the full-resolution model achieved accuracy of up to 92%, with the fast mode slightly lower at approximately 90%.
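The tradeoff above, roughly 10x faster acquisition for about two percentage points of accuracy, can be made concrete with a small sketch. The timing and accuracy figures come from the article; the selection function and its name are an illustrative assumption, not part of the FastGlioma software.

```python
# Acquisition time and reported accuracy for the two imaging modes
# (figures from the article).
MODES = {
    "full_resolution": {"seconds": 100, "accuracy": 0.92},
    "fast": {"seconds": 10, "accuracy": 0.90},
}

def choose_mode(time_budget_seconds):
    """Pick the most accurate mode that fits within the available time."""
    feasible = [m for m, p in MODES.items()
                if p["seconds"] <= time_budget_seconds]
    if not feasible:
        raise ValueError("no imaging mode fits the time budget")
    return max(feasible, key=lambda m: MODES[m]["accuracy"])

print(choose_mode(120))  # full_resolution
print(choose_mode(30))   # fast
```

During an operation the time budget is tight, which is why a 10-second mode that gives up little accuracy is the interesting operating point.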

"This means that we can detect tumor infiltration in seconds with extremely high accuracy, which could inform surgeons if more resection is needed during an operation," Hollon said.

AI's Future in Cancer

Over the last 20 years, the rates of residual tumor after neurosurgery have not improved.

Not only does residual tumor result in worse quality of life and earlier death for patients, it also increases the burden on health systems worldwide, which anticipate a need for 45 million surgical procedures annually by 2030.

Global cancer initiatives have recommended incorporating new technologies, including advanced methods of imaging and AI, into cancer surgery.

In 2015, The Lancet Oncology Commission on global cancer surgery noted that "the need for cost effective... approaches to address surgical margins in cancer surgery provides a potent drive for novel technologies."

Not only is FastGlioma an accessible and affordable tool for neurosurgical teams operating on gliomas, but, researchers say, it can also accurately detect residual tumor in several non-glioma diagnoses, including pediatric brain tumors such as medulloblastoma and ependymoma, and meningiomas.

"These results demonstrate the advantage of visual foundation models such as FastGlioma for medical AI applications and the potential to generalize to other human cancers without requiring extensive model retraining or fine-tuning," said co-author Aditya S. Pandey, M.D., chair of the Department of Neurosurgery at U-M Health.

"In future studies, we will focus on applying the FastGlioma workflow to other cancers, including lung, prostate, breast, and head and neck cancers."

Kondepudi A, Pekmezci M, Hou X, Scotford K, Jiang C, Rao A, Harake ES, Chowdury A, Al-Holou W, Wang L, Pandey A, Lowenstein PR, Castro MG, Koerner LI, Roetzer-Pejrimovsky T, Widhalm G, Camelo-Piragua S, Movahed-Ezazi M, Orringer DA, Lee H, Freudiger C, Berger M, Hervey-Jumper S, Hollon T.
Foundation models for fast, label-free detection of glioma infiltration.
Nature. 2024 Nov 13. doi: 10.1038/s41586-024-08169-3
