New App Uses Smartphone Selfies to Screen for Pancreatic Cancer

Pancreatic cancer has one of the worst prognoses - with a five-year survival rate of 9 percent - in part because there are no telltale symptoms or non-invasive screening tools to catch a tumor before it spreads. Now, University of Washington researchers have developed an app that could allow people to easily screen for pancreatic cancer and other diseases - by snapping a smartphone selfie.

BiliScreen uses a smartphone camera, computer vision algorithms and machine learning tools to detect increased bilirubin levels in a person's sclera, or the white part of the eye. The new app is described in a paper to be presented Sept. 13 at UbiComp 2017, the Association for Computing Machinery's International Joint Conference on Pervasive and Ubiquitous Computing.

One of the earliest symptoms of pancreatic cancer, as well as other diseases, is jaundice, a yellow discoloration of the skin and eyes caused by a buildup of bilirubin in the blood. The ability to detect signs of jaundice when bilirubin levels are minimally elevated - but before they're visible to the naked eye - could enable an entirely new screening program for at-risk individuals.

In an initial clinical study of 70 people, the BiliScreen app - used in conjunction with a 3-D printed box that controls the eye's exposure to light - correctly identified cases of concern 89.7 percent of the time, compared to the blood test currently used.

"The problem with pancreatic cancer is that by the time you're symptomatic, it's frequently too late," said lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering. "The hope is that if people can do this simple test once a month - in the privacy of their own homes - some might catch the disease early enough to undergo treatment that could save their lives."

BiliScreen builds on earlier work from the UW's Ubiquitous Computing Lab, which previously developed BiliCam, a smartphone app that screens for newborn jaundice by taking a picture of a baby's skin. A recent study in the journal Pediatrics showed BiliCam provided accurate estimates of bilirubin levels in 530 infants.

In collaboration with UW Medicine doctors, the UbiComp lab specializes in using cameras, microphones and other components of common consumer devices - such as smartphones and tablets - to screen for disease.

The blood test that doctors currently use to measure bilirubin levels - which is typically not administered to adults unless there is reason for concern - requires access to a health care professional and is inconvenient for frequent screening. BiliScreen is designed to be an easy-to-use, non-invasive tool that could help determine whether someone ought to consult a doctor for further testing. Beyond diagnosis, BiliScreen could also potentially ease the burden on patients with pancreatic cancer who require frequent bilirubin monitoring.

In adults, the whites of the eyes are more sensitive than skin to changes in bilirubin levels, which can be an early warning sign for pancreatic cancer, hepatitis or the generally harmless Gilbert's syndrome. Changes in the sclera are also more consistent across races and ethnicities than changes in skin color.

Yet by the time people notice the yellowish discoloration in the sclera, bilirubin levels are already well past cause for concern. The UW team wondered if computer vision and machine learning tools could detect those color changes in the eye before humans can see them.

"The eyes are a really interesting gateway into the body - tears can tell you how much glucose you have, sclera can tell you how much bilirubin is in your blood," said senior author Shwetak Patel, the Washington Research Foundation Entrepreneurship Endowed Professor in Computer Science & Engineering and Electrical Engineering. "Our question was: Could we capture some of these changes that might lead to earlier detection with a selfie?"

BiliScreen uses a smartphone's built-in camera and flash to collect pictures of a person's eye as they snap a selfie. The team developed a computer vision system to automatically and effectively isolate the white parts of the eye, a valuable capability for medical diagnostics. The app then calculates the color information from the sclera - based on the wavelengths of light that are being reflected and absorbed - and correlates it with bilirubin levels using machine learning algorithms.
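
To make the pipeline concrete, here is a minimal sketch of that kind of approach: summarize the color of the segmented sclera and feed it to a trained regressor that estimates bilirubin. The segmentation step, feature set, regressor choice and placeholder data are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical BiliScreen-style sketch: given a selfie and a sclera mask,
# summarize scleral color and map it to a bilirubin estimate.
import numpy as np
import cv2
from sklearn.ensemble import RandomForestRegressor

def scleral_color_features(image_bgr: np.ndarray, sclera_mask: np.ndarray) -> np.ndarray:
    """Mean color of the masked sclera in BGR, HSV and LAB color spaces.

    `sclera_mask` is a boolean array marking pixels that a separate
    computer-vision step identified as the white of the eye.
    """
    feats = []
    for code in (None, cv2.COLOR_BGR2HSV, cv2.COLOR_BGR2LAB):
        converted = image_bgr if code is None else cv2.cvtColor(image_bgr, code)
        feats.extend(converted[sclera_mask].mean(axis=0))  # per-channel mean
    return np.array(feats, dtype=np.float32)

# Training: per-image features paired with lab-measured bilirubin (mg/dL).
# In practice these would come from a clinical data set such as the 70-person study;
# random placeholders are used here so the sketch runs on its own.
rng = np.random.default_rng(0)
X_train = rng.random((70, 9)).astype(np.float32)   # placeholder feature vectors
y_train = rng.uniform(0.2, 15.0, 70)               # placeholder bilirubin values
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Inference on a new selfie (image and mask assumed to be produced upstream).
image = np.zeros((480, 640, 3), dtype=np.uint8)    # stand-in for a captured frame
mask = np.zeros((480, 640), dtype=bool)
mask[200:240, 300:360] = True                       # stand-in sclera region
estimate = model.predict([scleral_color_features(image, mask)])[0]
print(f"Estimated bilirubin: {estimate:.1f} mg/dL")
```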

To account for different lighting conditions, the team tested BiliScreen with two different accessories: paper glasses printed with colored squares to help calibrate color and a 3-D printed box that blocks out ambient lighting. Using the app with the box accessory - reminiscent of a Google Cardboard headset - led to slightly better results.
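
The paper-glasses idea can be illustrated with a small color-calibration sketch: squares of known color let the app estimate how the current lighting distorts colors, then undo that distortion before the sclera is analyzed. The patch values and the linear-correction model below are assumptions for illustration only.

```python
# Hypothetical color calibration from printed reference squares.
import numpy as np

# True RGB values of the printed squares, and the values actually measured
# from the photo under the unknown ambient lighting.
reference = np.array([[255, 255, 255], [255, 0, 0], [0, 255, 0], [0, 0, 255]], float)
measured  = np.array([[231, 240, 205], [238, 10, 20], [12, 221, 35], [8, 15, 229]], float)

# Fit a 3x3 matrix M so that measured @ M approximates reference (least squares).
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

def calibrate(pixels_rgb: np.ndarray) -> np.ndarray:
    """Apply the lighting correction to scleral pixels before feature extraction."""
    return np.clip(pixels_rgb @ M, 0, 255)

# Example: a scleral pixel measured under the same lighting is corrected accordingly.
print(calibrate(np.array([[225.0, 232.0, 198.0]])))
```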

Next steps for the research team include testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements - including removing the need for accessories like the box and glasses.

"This relatively small initial study shows the technology has promise," said co-author Dr. Jim Taylor, a professor in the UW Medicine Department of Pediatrics whose father died of pancreatic cancer at age 70.

"Pancreatic cancer is a terrible disease with no effective screening right now," Taylor said. "Our goal is to have more people who are unfortunate enough to get pancreatic cancer to be fortunate enough to catch it in time to have surgery that gives them a better chance of survival."

Alex Mariakakis, Megan A. Banks, Lauren Phillipi, Lei Yu, James Taylor, Shwetak N. Patel.
BiliScreen: Smartphone-Based Scleral Jaundice Monitoring for Liver and Pancreatic Disorders.
Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. doi: 10.1145/3090085.
