Computers Can Tell if You're Bored

Computers are able to read a person's body language to tell whether they are bored or interested in what they see on the screen, according to a new study led by body-language expert Dr Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School (BSMS).

The research shows that it is possible to judge a person's level of interest as they use a computer by monitoring whether they display the tiny movements people usually exhibit constantly, known as non-instrumental movements.

If someone is absorbed in what they are watching or doing - what Dr Witchel calls 'rapt engagement' - there is a decrease in these involuntary movements.

Dr Witchel said: "Our study showed that when someone is really highly engaged in what they're doing, they suppress these tiny involuntary movements. It's the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle."

The discovery could have a significant impact on the development of artificial intelligence. Future applications could include the creation of online tutoring programmes that adapt to a person's level of interest, in order to re-engage them if they are showing signs of boredom. It could even help in the development of companion robots, which would be better able to estimate a person's state of mind.

Also, for experienced designers such as movie directors or game makers, this technology could provide a complementary, moment-by-moment reading of whether the events on the screen are interesting. While viewers can be asked subjectively what they liked or disliked, a non-verbal technology would be able to detect emotions or mental states that people either forget or prefer not to mention.

"Being able to 'read' a person's interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process," Dr Witchel said. "Further ahead it could help us create more empathetic companion robots, which may sound very 'sci fi' but are becoming a realistic possibility within our lifetimes."

In the study, 27 participants faced a range of three-minute stimuli on a computer, from fascinating games to tedious readings from EU banking regulation, while using a handheld trackball to minimise instrumental movements, such as moving the mouse. Their movements were quantified over the three minutes using video motion tracking. In two comparable reading tasks, the more engaging reading resulted in a significant (42%) reduction in non-instrumental movement.
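The study's own analysis pipeline is not published here, but the underlying measurement idea can be sketched roughly as follows. This is a minimal illustration only, using simple frame-differencing as a crude stand-in for the video motion tracking used in the study; the function names, the grayscale frame-array input, and the synthetic data are all assumptions for the sake of the example.

```python
import numpy as np

def movement_score(frames: np.ndarray) -> float:
    """Crude proxy for non-instrumental movement: mean absolute pixel
    change between consecutive grayscale video frames.
    `frames` is assumed to have shape (n_frames, height, width)."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return float(diffs.mean())

def percent_reduction(engaging: np.ndarray, boring: np.ndarray) -> float:
    """Percentage drop in movement in the engaging condition relative
    to the boring one (positive = the viewer moved less)."""
    m_engaging = movement_score(engaging)
    m_boring = movement_score(boring)
    return 100.0 * (m_boring - m_engaging) / m_boring

# Synthetic example: a static scene plus per-frame noise standing in
# for fidgeting on camera; the "boring" clip has more of it.
rng = np.random.default_rng(0)
background = np.full((120, 160), 128.0)                              # static scene
frames_boring = background + rng.normal(0, 20, (180, 120, 160))      # ~3 min at 1 fps, more fidgeting
frames_engaging = background + rng.normal(0, 5, (180, 120, 160))     # stiller, more absorbed viewer
print(f"movement reduction: {percent_reduction(frames_engaging, frames_boring):.0f}%")
```

In this toy setup the comparison mirrors the study's design at a very high level: quantify overall movement per condition, then express engagement as the relative drop in movement for the more absorbing task.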

The study team also included two members of Dr Witchel's group, Carlos Santos and Dr James Ackah; media expert Carina Westling from the University of Sussex; and the clinical biomechanics group at Staffordshire University, led by Professor Nachiappan Chockalingam.

BSMS is a partnership between the Universities of Sussex and Brighton together with NHS organisations throughout the south-east region.
