Opinion

AI is rapidly supplanting us in identifying and dealing with feelings

23 May 2025
File photo dated 04/03/17 of a woman’s hand pressing a key of a laptop keyboard. Credit: Dominic Lipinski/PA Wire

Dr Keith Darlington

Introduction
As a teenager in the 1960s, I enjoyed watching a weekly television series called "Lost in Space". This science fiction series featured a robot with a male identity as one of its main characters. He spoke perfect English but sounded monotonous and dull, devoid of inflexions and variations in volume and tone. In short, his voice sounded unnatural. Robots were depicted as devoid of human emotions. This was not uncommon then – perhaps film producers needed to reinforce the differences between humans and AI-based machines.

However, in the film Her (released in 2013), the main character, Theodore, uses an AI operating system that speaks conversational language – and sounds human. During the installation of the operating system, Theodore assigns a female gender to it and names her Samantha. She talks with a very sensitive, warm voice that is indistinguishable from a human one.

The machine interacts with Theodore on a very personal level, and they develop an emotionally intimate bond. So much so that he eventually falls in love with Samantha, who is nothing more than a computerised voice. What was once science fiction now reflects reality: AI that uses humanlike emotions, called Emotional AI, is employed in many applications, because AI has become as good as, if not better than, us at detecting and expressing human emotions. I briefly describe some of these applications in this article to demonstrate the usage and effectiveness of Emotional AI.

Emulating Human Emotions
AI cannot feel emotions as humans do, but it can replicate them and respond in ways, such as expressions of empathy, that create the illusion of emotional intelligence. Many smartphone apps use this type of emotional AI, such as those built by Hume AI, a technology company focused on developing advanced emotional AI tools that can understand and respond to human emotions using voice, facial expressions, and other body language signals.

Its purpose is to align AI systems with human needs by allowing machines to interpret and express emotion in contextually appropriate ways. It could be used, for example, in voice assistant interactions by detecting customer satisfaction and adjusting responses in real time to improve support quality and outcomes.

Many emotional AI apps that support emotional companionship have burgeoned recently. These apps can offer comfort to those experiencing isolation and loneliness, engaging in emotionally responsive conversations to help users feel heard and connected. Research shows that users often report reduced feelings of loneliness when interacting with AI companion apps. Replika is an example of a companionship app enabling users to have conversations that mimic human empathy and engagement. There are also emotionally intelligent social robots in widespread use in hospitals and care homes; designed to support patients’ emotional wellbeing, they are increasingly used with the elderly and those in long-term care.

However, AI can do more than emulate human emotions; it can also interpret our feelings in ways that are sometimes better than ours, as I show in the following sections.

AI Systems that Detect Human Emotions
Emotional AI detection systems operate behind the scenes in many aspects of our lives and can often do so with better accuracy, speed, and consistency than humans. They can also interpret subtle cues that humans frequently fail to recognise.

For example, some AI systems can analyse voice, facial expressions and other aspects of body language. They can detect emotions in speech through tone of voice, pitch, and intonation. They are also very good at reading emotions from facial expressions, such as movements of the facial muscles and even the blink rate of the eyes, to assess states like anger, happiness, or nervousness.
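To give a flavour of the low-level signals involved, here is a toy sketch (not the method of any real product – the feature choices, thresholds, and labels are invented for illustration) showing how crude acoustic features such as pitch and loudness can be extracted from a voice signal before a trained classifier maps them to an emotional state:

```python
# Toy illustration of acoustic features used in emotion detection:
# pitch (via zero-crossing rate) and loudness (via RMS amplitude).
# Real systems use far richer features and learned models.
import math

def estimate_pitch(samples, sample_rate):
    """Very crude pitch estimate from the zero-crossing rate."""
    crossings = sum((a < 0) != (b < 0) for a, b in zip(samples, samples[1:]))
    return crossings * sample_rate / (2 * len(samples))

def rms_loudness(samples):
    """Root-mean-square amplitude, a simple proxy for loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def crude_arousal(pitch_hz, loudness):
    """Map pitch and loudness to a rough 'arousal' label -
    the kind of cue a real pipeline would feed into a classifier."""
    if pitch_hz > 250 and loudness > 0.5:
        return "agitated"
    if pitch_hz < 150 and loudness < 0.3:
        return "calm"
    return "neutral"

# Demo: a synthetic 200 Hz tone at moderate volume reads as "neutral".
rate = 8000
tone = [0.6 * math.sin(2 * math.pi * 200 * t / rate) for t in range(rate)]
print(crude_arousal(estimate_pitch(tone, rate), rms_loudness(tone)))
```

Production systems replace these hand-set thresholds with models trained on large labelled datasets, which is precisely where the cultural-context problems discussed later arise.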

This is used effectively in call centres, particularly by detecting customer frustration during conversations and prompting human agents to intervene. In education, emotional AI is embedded in online learning platforms to monitor student engagement and tailor instruction. In mental health, AI tools can analyse voice and facial data to detect early signs of anxiety. Businesses can use emotional AI for feedback on customer reactions to products, which can help guide product design and marketing strategies. They can also use AI to track user reactions to adverts or products through cameras and biometric sensors, or to identify shoplifters by their nervous behaviour in stores.

Security Surveillance
Emotional AI is particularly effective in large-scale settings such as airport security, sports grounds, and major entertainment events. These systems can analyse people's facial expressions and body language to detect stress, anxiety, or suspicious behaviour, helping security personnel identify individuals who may require further screening.

What sets emotional AI apart from human security staff is its ability to analyse vast amounts of emotional data in real time. Combined with video surveillance, these systems can scan copious quantities of video feeds to detect emotional cues, supporting efforts to monitor behaviour patterns and identify potential threats more efficiently. Unlike humans, they never get tired or bored or lose focus, and they work quickly and accurately.

The Disadvantages of Using Emotional AI
While emotional AI has many advantages, it still struggles with cultural nuance and can misinterpret emotions owing to cultural and religious differences. For example, some cultures celebrate funerals with joy and remembrance rather than focusing solely on grief, and emotional AI may not be aware of this context in its data. This should improve in time as more data is acquired. There are also ethical and privacy concerns, such as consent: emotional AI often relies on sensitive data, such as voice tone, making it prone to possible privacy violations.

Dr Keith Darlington is a retired AI university lecturer and author of five books on AI and computing topics, and many magazine articles about AI.


Support our Nation today

For the price of a cup of coffee a month you can help us create an independent, not-for-profit, national news service for the people of Wales, by the people of Wales.

