We all like to think we’re intuitive and can tell what our peers are feeling without having to ask them. Body language, facial expression and tone of voice can give us clues, but in reality, measuring human emotion is much harder than it sounds.
Psychologists have traditionally relied on rudimentary questionnaires to gauge people’s reactions when they’re exposed to certain stimuli. But this isn’t the most accurate way to explore something as nuanced as human emotion – and you can’t always rely on people to be completely honest in a survey. What someone says and how they really feel do not always correlate.
In the 1970s, psychologist Paul Ekman pioneered the study of facial expressions in emotion. He developed the Facial Action Coding System, which breaks expressions down into combinations of individual muscle movements, and argued that expressions map onto discrete archetypal emotions such as happiness, sadness, fear, anger, surprise and disgust. But Ekman’s famous framework did not account for the fact that people experience and express emotions in vastly different ways. Just because someone isn’t smiling doesn’t necessarily mean they’re unhappy.
Experts are now realising that physiology could give a more objective reading of someone’s psychology. Our facial expressions are the result of many rapid muscle contractions. A technique called facial electromyography (EMG) uses electrodes to measure these contractions at 1,000 times per second. The technology could be a more accurate way of assessing an individual’s reaction than simply looking at their face.
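As a rough illustration of what those 1,000-samples-per-second measurements look like in practice (not Emteq’s actual processing pipeline), raw EMG is typically rectified and smoothed into an “activation envelope” that tracks how hard a muscle is contracting. A minimal Python sketch with synthetic signal values:

```python
# Illustrative sketch only: turning a raw facial-EMG trace into an
# activation envelope. The signal here is synthetic; real devices and
# filter choices will differ.
import math

SAMPLE_RATE_HZ = 1_000          # EMG sampled 1,000 times per second
WINDOW_MS = 100                 # smoothing window for the envelope

def emg_envelope(samples):
    """Rectify the signal, then smooth it with a moving average."""
    rectified = [abs(s) for s in samples]
    win = SAMPLE_RATE_HZ * WINDOW_MS // 1000
    envelope = []
    for i in range(len(rectified)):
        window = rectified[max(0, i - win + 1): i + 1]
        envelope.append(sum(window) / len(window))
    return envelope

# One second of fake EMG: a quiet muscle, then a burst of activity
# (e.g. the onset of a smile).
signal = [0.01 * math.sin(0.5 * i) for i in range(500)] + \
         [0.5 * math.sin(2.0 * i) for i in range(500)]
env = emg_envelope(signal)
print(f"resting level: {env[400]:.3f}, contraction level: {env[900]:.3f}")
```

The envelope stays near zero while the muscle is at rest and rises sharply during the contraction, which is the kind of signal change that can be read without ever looking at the face itself.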
Plastic surgeon Charles Nduka specialises in treating people with facial paralysis, such as Bell’s palsy – a condition that causes temporary weakness of the muscles on one side of the face. Each year, more than 50,000 people in the UK lose the ability to smile, speak, eat or close their eyes due to facial palsy.
“With Bell’s palsy, you wake up one day and your face doesn’t move on one side,” Nduka explains. About 70% of people recover spontaneously, but around a third can be left with a lifelong facial disability – and struggle with non-verbal forms of communication.
“When they’re trying to smile it becomes a grimace. It can cause tremendous psychological distress,” says Nduka. “They often withdraw from life, change jobs or stop working altogether.”
Avoiding the mirror
Mirror therapy helps some people with facial palsy regain their movement, but it’s a tough treatment. Patients must stare into a mirror several times a day and practise willing their face to adopt the expression they want. Nduka says this can be so psychologically painful for patients that they avoid the exercises altogether, decreasing their chances of recovery.
“If you have a facial paralysis, the last thing you want to be doing is spending lots of time looking in the mirror,” he reasons. Nduka realised facial EMG could help these people. “It can provide patients with real-time feedback when they do an expression without having to look in the mirror.”
But although EMG is useful, it’s a bulky technology – involving multiple electrodes and wires – and only really possible in the clinic under the supervision of trained healthcare professionals. Nduka wanted to miniaturise the technology so his patients could use it in their own home.
Joining forces with data security and AI expert Graham Cox, he launched Emteq Labs, a startup in Brighton. The company developed a more patient-friendly EMG device built into a virtual reality (VR) headset: instead of a mirror, the wearer sees an avatar of their face performing the exercises.
“But we recognised that it’s a very tiny market and quite hard for us to make something affordable if it’s only being made available to patients with facial paralysis,” Nduka reveals. The pair soon realised, though, that emotion-sensing technology could have many other applications – including commercial uses in media, marketing and gaming.
Emteq’s VR headset contains multiple sensors to measure the wearer’s facial movements, eye movements and pulse rate. These readings correspond to particular emotions such as joy, excitement or dissatisfaction, giving researchers a good idea of how someone is feeling. The device was validated by Professor Stephen Fairclough of Liverpool John Moores University, who tested the kit in a trial to confirm it was recording the correct emotions.
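To give a flavour of how multiple sensor readings might be combined into a coarse emotion label, here is a deliberately simplified, hypothetical sketch. The muscle names, thresholds and rules are invented for illustration – Emteq’s validated models are far more sophisticated than this:

```python
# Hypothetical illustration of mapping multi-sensor readings to a coarse
# emotion label. All thresholds and rules here are invented for the
# sketch, not taken from Emteq's system.
from dataclasses import dataclass

@dataclass
class Reading:
    zygomaticus: float   # "smile" muscle activation, 0..1
    corrugator: float    # "frown" muscle activation, 0..1
    pulse_bpm: float     # heart rate in beats per minute

def label_emotion(r: Reading) -> str:
    aroused = r.pulse_bpm > 90           # crude proxy for physiological arousal
    if r.zygomaticus > r.corrugator:
        return "excitement" if aroused else "joy"
    if r.corrugator > r.zygomaticus:
        return "dissatisfaction"
    return "neutral"

# A calm reading with strong "smile" muscle activity.
print(label_emotion(Reading(zygomaticus=0.7, corrugator=0.1, pulse_bpm=72)))
```

Even this toy version shows why combining channels matters: the same smile-muscle activity reads as joy or excitement depending on what the heart rate is doing.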
Researchers from Bournemouth University tested Emteq’s device at the Science Museum in 2019. They wanted to see whether it was possible to create VR experiences that reliably induce certain emotions – and whether the emotional responses to the immersive simulation could be accurately measured by the device’s built-in EMG technology. The team gathered data from over 780 people, across more than 3,000 recording sessions.
Face the fear
Nduka says the device could help patients with mental health conditions such as anxiety or phobias, working as a form of exposure therapy. When the patient wears the device, they see a personalised simulated scene – for instance, an aeroplane setting for someone who is afraid of flying. “If you’ve got a phobia, you can present something to the patient that is a low level of stimulus. And they’ll learn to calm themselves down by slowly increasing the level of exposure,” says Nduka.
At the moment, Emteq’s device is licensed only as a research tool for gaining more insight into measuring human emotion. But in the future, Nduka expects it to be available for patients to use in their own homes to tackle mental health difficulties, with the support of a trained psychotherapist.
As well as anxiety and phobias, Nduka believes VR technology could help people with severe mental health conditions such as post-traumatic stress disorder (PTSD). A two-year trial from Cardiff University found military veterans with PTSD saw an almost 40% improvement in their symptoms after virtual reality therapy.
It involved patients walking on a treadmill in front of a screen which projected traumatic images. Combining VR with emotion-sensing technology could allow this early research to go further, says Nduka. “The big problem with current VR technologies is that there’s no means of understanding how a person is responding. If you can’t measure emotion, it’s very hard to help someone improve,” he says. “But our system allows patients to get objective feedback on their illness at a very low level of intensity.”
However, Nduka cautions that the risk-benefit ratio of using VR to treat mental health problems must be carefully determined before it can become an accepted approach. “Lots of people are looking at VR because of the potential benefit, but there is a risk there as well.”
There’s a danger that VR simulations could cause distress and exacerbate someone’s psychological condition. However, Nduka believes the addition of emotion-sensing technology could make VR platforms safer and allow researchers to quickly see if something is likely to do more harm than good. “We want to be sure that our technology is like the seatbelt in a car – something to minimise the risk of someone coming to harm.”