When Detecting Depression, the Eyes Have It

It has been estimated that nearly 300 million people, or about 4% of the global population, are afflicted by some form of depression. But detecting it can be difficult, particularly when those affected don’t (or won't) report negative feelings to friends, family or clinicians.

Now Stevens professor Sang Won Bae is working on several AI-powered smartphone applications and systems that could non-invasively warn us, and others, that we may be becoming depressed.

"Depression is a major challenge," says Bae. "We want to help."

"And since most people in the world today use smartphones daily, this could be a useful detection tool that’s already built and ready to be used."

One system Bae is developing with Stevens doctoral candidate Rahul Islam, called PupilSense, works by constantly taking snapshots and measurements of a smartphone user’s pupils.

"Previous research over the past three decades has repeatedly demonstrated how pupillary reflexes and responses can be correlated to depressive episodes," she explains.

The system accurately calculates pupils’ diameters, comparing them to the surrounding irises of the eyes, in 10-second “burst” photo streams captured while users open their phones or access certain social media and other apps.
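How such a measurement might be computed is easy to sketch. The snippet below is an illustrative Python sketch only, not PupilSense's actual code: it assumes a hypothetical upstream detector has already estimated pupil and iris radii (in pixels) for each frame of a burst, and it simply normalizes pupil size by iris size and takes the median across the burst to damp blinks and blurry frames.

```python
# Illustrative sketch only; PupilSense's real pipeline is not reproduced here.
# Assumption: an upstream detector has already produced per-frame estimates of
# pupil and iris radii (in pixels) for one 10-second burst.

from statistics import median

def pupil_iris_ratio(pupil_radius_px: float, iris_radius_px: float) -> float:
    """Normalize pupil size by iris size so the measure is robust to
    camera distance and image resolution."""
    if iris_radius_px <= 0:
        raise ValueError("iris radius must be positive")
    return pupil_radius_px / iris_radius_px

def summarize_burst(frames: list[tuple[float, float]]) -> float:
    """Reduce a burst of (pupil_radius, iris_radius) frames to a single
    median ratio, which damps blinks and poorly focused frames."""
    ratios = [pupil_iris_ratio(p, i) for p, i in frames if i > 0]
    if not ratios:
        raise ValueError("no usable frames in burst")
    return median(ratios)

# Example with hypothetical measurements from a short burst
burst = [(14.2, 55.0), (13.9, 54.6), (14.5, 55.3)]
print(f"median pupil/iris ratio: {summarize_burst(burst):.3f}")
```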

In one early test with 25 volunteers over a four-week period, the system - embedded on those volunteers' smartphones - analyzed approximately 16,000 phone interactions once pupil-image data had been collected. After teaching an AI to differentiate between "normal" responses and abnormal ones, Bae and Islam processed the photo data and compared it with the volunteers' self-reported moods.
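The modeling step can be pictured along these lines. This is a hypothetical scikit-learn sketch on stand-in data, not the authors' pipeline: per-interaction pupil features are paired with self-reported mood labels, and a simple classifier is fit and then evaluated on a held-out split.

```python
# Hypothetical sketch of the modeling step, not the study's actual code.
# Rows stand in for phone interactions; columns stand in for pupil-derived features.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

X = rng.normal(size=(1000, 4))            # stand-in pupil features per interaction
y = rng.integers(0, 2, size=1000)         # 1 = self-reported depressed mood, 0 = not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```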

The best iteration of PupilSense - one known as TSF, which uses only selected, high-quality data points - proved 76% accurate at flagging times when people did indeed feel depressed. That’s better than the best smartphone-based system currently being developed and tested for detecting depression, a platform known as AWARE.

"We will continue to develop this technology now that the concept has been proven," adds Bae, who previously developed smartphone-based systems to predict binge drinking and cannabis use.

PupilSense was first unveiled at the International Conference on Activity and Behavior Computing in Japan in late spring, and it is now available open-source on GitHub.

Bae and Islam are also developing a second system known as FacePsy that powerfully parses facial expressions for insight into our moods.

"A growing body of psychological studies suggest that depression is characterized by nonverbal signals such as facial muscle movements and head gestures," Bae points out.

FacePsy runs in the background of a phone, taking facial snapshots whenever the user unlocks the phone or launches commonly used applications. (Importantly, it deletes the facial images themselves almost immediately after analysis, protecting users' privacy.)
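That privacy pattern is straightforward to sketch. The Python snippet below is an assumed illustration, not FacePsy's code: the extract_features function is a hypothetical placeholder for on-device analysis, and the point is simply that the raw image is deleted as soon as numeric features have been extracted, so only de-identified descriptors persist.

```python
# Sketch of the "analyze, then delete" pattern described above (assumed, not FacePsy's code).

from pathlib import Path

def extract_features(image_path: Path) -> dict[str, float]:
    """Hypothetical placeholder for on-device analysis; a real system would run a
    face landmark / expression model here and return numeric descriptors only."""
    return {"smile_intensity": 0.0, "eye_openness": 0.0, "head_yaw_deg": 0.0}

def process_snapshot(image_path: Path) -> dict[str, float]:
    try:
        features = extract_features(image_path)
    finally:
        # Delete the raw facial image regardless of whether analysis succeeded.
        image_path.unlink(missing_ok=True)
    return features
```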

"We didn't know exactly which facial gestures or eye movements would correspond with self-reported depression when we started out," Bae explains. "Some of them were expected, and some of them were surprising."

Increased smiling, for instance, appeared in the pilot study to correlate not with happiness but with potential signs of a depressed mood and affect.

"This could be a coping mechanism, for instance people putting on a 'brave face' for themselves and for others when they are actually feeling down," says Bae. "Or it could be an artifact of the study. More research is needed."

Other apparent signals of depression revealed in the early data included fewer facial movements during the morning hours and certain very specific eye- and head-movement patterns. (Yawing, or side-to-side, movements of the head during the morning seemed to be strongly linked to increased depressive symptoms, for instance.)

Interestingly, eyes detected as being more open during the morning and evening were also associated with potential depression - suggesting outward expressions of alertness or happiness can sometimes mask depressive feelings beneath.

"Other systems using AI to detect depression require the wearing of a device, or even multiple devices," Bae concludes. "We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool."

Rahul Islam, Sang Won Bae.
FacePsy: An Open-Source Affective Mobile Sensing System - Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings.
Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 260 (September 2024). doi: 10.1145/3676505
