Computers Can Tell if You're Bored

Computers are able to read a person's body language to tell whether they are bored or interested in what they see on the screen, according to a new study led by body-language expert Dr Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School (BSMS).

The research shows that by measuring a person's movements as they use a computer, it is possible to judge their level of interest by monitoring whether they display the tiny movements that people constantly exhibit, known as non-instrumental movements.

If someone is absorbed in what they are watching or doing - what Dr Witchel calls 'rapt engagement' - there is a decrease in these involuntary movements.

Dr Witchel said: "Our study showed that when someone is really highly engaged in what they're doing, they suppress these tiny involuntary movements. It's the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle."

The discovery could have a significant impact on the development of artificial intelligence. Future applications could include the creation of online tutoring programmes that adapt to a person's level of interest, in order to re-engage them if they are showing signs of boredom. It could even help in the development of companion robots, which would be better able to estimate a person's state of mind.

Also, for experienced designers such as movie directors or game makers, this technology could provide a complementary, moment-by-moment reading of whether the events on the screen are holding viewers' interest. While viewers can be asked subjectively what they liked or disliked, a non-verbal technology would be able to detect emotions or mental states that people either forget or prefer not to mention.

"Being able to 'read' a person's interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process," Dr Witchel said. "Further ahead it could help us create more empathetic companion robots, which may sound very 'sci fi' but are becoming a realistic possibility within our lifetimes."

In the study, 27 participants were shown a range of three-minute stimuli on a computer, from fascinating games to tedious readings from EU banking regulation, while using a handheld trackball to minimise instrumental movements, such as moving a mouse. Their movements over the three minutes were quantified using video motion tracking. In two comparable reading tasks, the more engaging reading produced a significant (42%) reduction in non-instrumental movement.
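The release does not describe how the video motion tracking was implemented, so the following is only a minimal sketch of the general idea: scoring gross movement in a video clip by frame differencing with OpenCV. The function name, threshold, and file name are illustrative assumptions, not the study's published method.

```python
# Minimal sketch (assumption): quantify gross body movement in a video
# by frame differencing. This is NOT the study's published pipeline;
# the threshold and scoring here are illustrative choices.
import cv2
import numpy as np

def movement_score(video_path: str, diff_threshold: int = 25) -> float:
    """Mean fraction of pixels changing between consecutive frames."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError(f"Could not read video: {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Count pixels whose intensity changed by more than the threshold.
        diff = cv2.absdiff(gray, prev_gray)
        scores.append(np.count_nonzero(diff > diff_threshold) / diff.size)
        prev_gray = gray
    cap.release()
    return float(np.mean(scores)) if scores else 0.0

if __name__ == "__main__":
    # On the study's logic, a lower score over a three-minute clip would
    # suggest deeper engagement (fewer non-instrumental movements).
    print(f"movement score: {movement_score('participant01.mp4'):.4f}")
```

Frame differencing is only a crude proxy; a real pipeline would typically stabilise the camera view and distinguish instrumental from non-instrumental movement, as the study did by having participants use a trackball.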

The study team also included two members of Dr Witchel's group, Carlos Santos and Dr James Ackah, as well as media expert Carina Westling from the University of Sussex and the clinical biomechanics group at Staffordshire University, led by Professor Nachiappan Chockalingam.

BSMS is a partnership between the Universities of Sussex and Brighton together with NHS organisations throughout the south-east region.
