ECU researchers advance emotionally aware AI through facial expression recognition

Image credit: sdecoret/stock.adobe.com

Researchers at Edith Cowan University (ECU) are developing new methods to help machines better understand human emotions by improving how artificial intelligence interprets facial expressions.

The team, led by ECU senior lecturer and AI expert Dr Syed Afaq Shah, has introduced an approach that mimics how people assess emotions in real life – by considering multiple facial expressions rather than relying on a single image.

“As more digital systems, from virtual assistants to wellbeing apps, interact with people, it’s becoming increasingly important that they understand how we feel,” said ECU PhD student Sharjeel Tahir.

Rather than training AI models to analyse emotions based on isolated facial images, the ECU researchers grouped related expressions, giving the system a broader context.

Tahir said this method helps machines make more informed predictions about how someone is feeling, even when faces are viewed from different angles or under varying lighting conditions.

“Just like we don’t judge how someone feels from one glance, our method uses multiple expressions to make more informed predictions,” Tahir said. “It’s a more reliable way to help machines understand emotions.”
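The idea of pooling several expressions rather than trusting a single glance can be illustrated with a toy sketch. This is not the ECU team's actual model; it simply assumes some classifier has produced per-image emotion scores (logits), and shows how averaging probabilities across a group of related images (e.g. different angles or lighting) yields a steadier prediction than any one noisy view. The emotion labels and numbers are illustrative.

```python
import numpy as np

# Illustrative label set; the actual categories used in the research are not specified here.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]

def softmax(logits):
    """Convert raw scores to probabilities, numerically stably."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict_from_group(logits_per_image):
    """Aggregate classifier outputs over a group of related facial
    images by averaging their softmax probabilities, then pick the
    most likely emotion for the group as a whole."""
    probs = softmax(np.asarray(logits_per_image, dtype=float))
    mean_probs = probs.mean(axis=0)
    return EMOTIONS[int(mean_probs.argmax())], mean_probs

# Toy logits for three views of the same face: two views point
# clearly to "happy"; one noisy view leans slightly to "neutral".
group_logits = [
    [0.1, 2.0, 0.5, 0.0, 0.2],
    [0.0, 1.8, 0.6, 0.1, 0.3],
    [0.2, 0.9, 1.1, 0.0, 0.1],
]
label, probs = predict_from_group(group_logits)
print(label)  # the group consensus outvotes the single noisy view
```

Taken alone, the third view would be misread as "neutral"; averaged with the other two, the group-level prediction settles on "happy", which is the intuition behind judging from multiple expressions rather than one.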

Although the research does not involve physical robots, the findings have potential applications in mental health support, customer service, and interactive education – areas where emotionally responsive systems could enhance user experience.

“We’re laying the groundwork for machines that don’t just see faces, but understand them,” Tahir said.

PhD student and co-author Nima Mirnateghi said the technique introduces richer visual cues into the training process, allowing AI systems to maintain efficiency while improving emotional recognition accuracy.

“By exposing the model to diverse features within a structured set, we found that it learns existing patterns far more effectively, refining its emotional recognition capabilities,” he said.
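One plausible reading of "diverse features within a structured set" is that training examples are arranged into fixed-size groups of images from the same subject before being fed to the model. The helper below is a hypothetical sketch of that preprocessing step, not the authors' pipeline; the function name, group size, and sample format are all assumptions for illustration.

```python
from collections import defaultdict
import random

def group_by_subject(samples, group_size=4, seed=0):
    """Arrange (subject_id, image) samples into fixed-size groups,
    so each training example carries several related expressions
    of one subject rather than a single isolated face."""
    rng = random.Random(seed)
    by_subject = defaultdict(list)
    for subject_id, image in samples:
        by_subject[subject_id].append(image)
    groups = []
    for subject_id, images in by_subject.items():
        rng.shuffle(images)  # vary which expressions land together
        # Emit only full groups; leftover images are dropped here.
        for i in range(0, len(images) - group_size + 1, group_size):
            groups.append((subject_id, images[i:i + group_size]))
    return groups

# Hypothetical dataset: 8 images of subject s1, 5 of subject s2.
samples = ([("s1", f"s1_img{i}") for i in range(8)]
           + [("s2", f"s2_img{i}") for i in range(5)])
groups = group_by_subject(samples)
print(len(groups))  # s1 yields two groups of 4, s2 yields one
```

A model trained on such groups sees a structured spread of one person's expressions per example, which is one way the "richer visual cues" described above could enter the training process.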

Tahir is now focusing on generating artificial empathy in AI systems under the supervision of Dr Shah, aiming to create agents that can respond appropriately to human emotional states. He said such technology could help address growing demand for emotional support.

“There is a significant need for emotional support these days, and that gap could be filled by emotionally aware or emotionally intelligent machines or robots,” he said.

Mirnateghi added that the research has also prompted broader questions about how AI makes decisions.

The ECU team is now investigating explainable AI in language models, aiming to make emotional intelligence in machines more transparent and understandable.

“By making these processes more transparent, we aim to create AI systems that are inherently understandable—bridging the gap between advanced computation and human intuition,” Mirnateghi said. 

“For example, what makes a machine emotionally intelligent? That’s one of the questions that our current research aims to explore.”