IPMD webmaster

M vs Microsoft: Episode 23

Can AI detect remorseful emotions from facial expressions?


What is happening in this image?


Beverly, the woman in the image, is a mother who has been diagnosed with narcissistic personality disorder. As a result, she is extremely verbally abusive, manipulative, and controlling toward her daughter and her husband, and her language is catastrophizing and dramatic. Helpless in the face of Beverly’s behavior, her family goes to the Dr. Phil Show for advice. When Dr. Phil and her family confront her, Beverly denies her destructive behavior, becomes defensive, and refutes the allegations made against her.


After a long intervention, this picture captures the moment she finally comes to terms with how destructive, hurtful, and toxic her behavior has been toward her family and the people around her. She even finds it within herself to admit that she is wrong.


What could she be feeling right now?

One would imagine she feels regretful and sad about how she treated her family, having been harsh when she should have been kind and loving. She may also be afraid of losing her family as a result.



Now let’s see what M and Microsoft Azure identify:

As we would expect, M identifies sadness as the primary emotion (38.6%). It also identifies fear as her secondary emotion (16.2%), perhaps because this is a confrontation Beverly was not expecting and she is afraid of the consequences. Lastly, M identifies surprise as the tertiary emotion (9.07%), because the intervention itself was unexpected. Characteristically of narcissistic personality disorder, people are often unaware of their own manipulative behavior; being made aware of it may be why Beverly feels surprised.
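M’s exact output format is not public, so purely as an illustration, here is a minimal sketch of how a primary/secondary/tertiary ranking like the one above could be derived, assuming the platform returns a dictionary of per-emotion scores (the dictionary below is hypothetical, seeded with the values M reported for this image):

```python
# Hypothetical emotion scores, using the values M reported for this image.
# The remaining categories would fill out the rest of the distribution.
scores = {
    "sadness": 0.386,
    "fear": 0.162,
    "surprise": 0.0907,
}

# Rank emotions from strongest to weakest.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Print the top three: primary, secondary, and tertiary emotions.
for rank, (emotion, score) in enumerate(ranked[:3], start=1):
    print(f"{rank}. {emotion}: {score:.1%}")
```

Running this prints sadness, fear, and surprise in that order, matching the reading described above.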


Microsoft, on the other hand, identifies almost no emotion, missing what M flags as salient. Although it detects a very small level of sadness, that reading falls far short of the emotion the context of this image would actually suggest.
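For readers who want to reproduce the Microsoft side of the comparison themselves, here is a minimal sketch of a call to the Azure Face API’s detect endpoint with the emotion attribute, as it was publicly offered at the time of writing. The endpoint, subscription key, and image URL are placeholders you would supply yourself:

```python
import requests

# Placeholders: substitute your own Azure resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "<your-subscription-key>"
IMAGE_URL = "https://example.com/beverly.jpg"  # hypothetical image URL

# Ask the Face API to detect faces and return emotion scores.
response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "emotion"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Each detected face carries scores for eight emotion categories
# (anger, contempt, disgust, fear, happiness, neutral, sadness, surprise).
for face in response.json():
    emotions = face["faceAttributes"]["emotion"]
    top = max(emotions, key=emotions.get)
    print(f"strongest emotion: {top} ({emotions[top]:.1%})")
```

On an image like this one, a near-flat response with only a trace of sadness would correspond to the Microsoft reading described above.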


Now that you know the context behind the situation and the two different emotion readings, it is up to you to decide which interpretation you find more accurate.





 

We at Project M believe that the ability of emotional artificial intelligence platforms to identify and understand negative emotions has great potential to improve human emotional well-being.


Follow us on our social media and look out for our next episode!

Your LIKE means a lot to us! Follow Project M on:

Company website: http://www.ipmdinc.com

Medium: Project M

Facebook: Project M

Blogger: Project M

Instagram: mprojectai

Twitter: @mprojectai



*As of April 1, 2020, the Project M team has devoted 82,000 hours to the AI platform, M. The sample input we use is pure test data: it has never been used to train M and, we assume, is equally new to Microsoft, so this comparison is a fair trial between M and Microsoft. We thank Microsoft, the leader of the industry and of the emotional AI sector, for making its facial-emotion recognition test site available to the general public.


