Can Emotional AI detect a fake smile?
In Episode 25 of M vs. Microsoft, we test the image you see here, of a smiling young woman, with both M and Microsoft Azure and compare the results. Then we discuss why her smile is not genuine and why Microsoft fails to identify a fake smile.
There are many situations in which people fake a smile, especially in public.
For example, you have told an inappropriate joke at dinner with your parents-in-law or your boss, or a joke that no one laughed at or understood. Someone points out that you have food in your teeth, or you realize you have been talking to people with lipstick all over your front teeth.
Any of these uncomfortable, awkward public situations could apply to the woman with the fake smile in the picture above. Notice her eyes: they do not sparkle the way they would in a genuinely happy smile. Instead, they show the discomfort of an awkward situation.
Now let’s see what M and Microsoft Azure identify:
As you can see, M identifies the woman's primary emotion as Disgust at 87.8%. That is because when a person fakes a smile in an awkward, uncomfortable situation like those described above, they are trying to cover up their true feeling of discomfort by forcing themselves to smile in public. M also identifies her mood as Negative at 98.2%, meaning she feels uncomfortable and awkward.
Now, let’s test Microsoft Azure using the same image:
As you can see, Microsoft identifies her primary emotion as Happy with a score of 0.991 (on Azure's 0-to-1 scale). That reading is incorrect: discomfort, awkwardness, and embarrassment are not happiness.
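If you would like to reproduce the Azure side of this test yourself, here is a minimal sketch that calls the Face API detect endpoint with the emotion attribute enabled. The endpoint, subscription key, and image filename are placeholders, not our actual resource details.

```python
import requests

# Placeholders: substitute your own Azure Face resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-face-api-key>"

def detect_emotions(image_path):
    """Send a local image to the Face API and return per-face emotion scores."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # Each detected face carries eight emotion scores (anger, contempt,
    # disgust, fear, happiness, neutral, sadness, surprise) summing to ~1.0.
    return [face["faceAttributes"]["emotion"] for face in response.json()]

if __name__ == "__main__":
    for emotions in detect_emotions("fake_smile.jpg"):  # hypothetical filename
        top = max(emotions, key=emotions.get)
        print(f"Primary emotion: {top} ({emotions[top]:.3f})")
```

On the fake-smile image, a call like this is what produces the near-1.0 happiness score discussed above.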
So why does Microsoft fail to identify a fake smile?
The reason is that Microsoft Azure relies on computer vision that maps facial muscle movements to emotions. But people often do not reveal their true emotional state through their facial muscles, and they frequently fake expressions. If there is no facial muscle movement, or the movement is not genuine, an AI that reads only the face cannot identify the underlying emotion.
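You can see this dependence on muscle movement in practice: the same detect call can also return Azure's smile attribute, which measures the visible smile intensity on a 0-to-1 scale. Below is a hypothetical check, using the same placeholder endpoint and key as above. If the happiness score simply tracks the smile intensity, the model is reading the mouth muscles rather than the feeling behind them.

```python
import requests

# Same placeholder endpoint and key as the previous sketch.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-face-api-key>"

def smile_vs_happiness(image_path):
    """Print the visible smile intensity next to the reported happiness score."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "smile,emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    for face in response.json():
        attrs = face["faceAttributes"]
        print(f"smile intensity: {attrs['smile']:.3f}, "
              f"happiness score: {attrs['emotion']['happiness']:.3f}")

smile_vs_happiness("fake_smile.jpg")  # hypothetical filename
```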
A genuine smile
Now let's look at the second image, of a woman with a genuine smile. She is taking a selfie with her dear granddaughter. The woman feels happy, and her eyes reveal her true emotional state of happiness, along with comfort and kindness.
Now that you know the context behind each situation and the two different emotional readings, it is up to you to decide which interpretation you think is most accurate.
We at Project M believe that the ability of emotional artificial intelligence platforms to identify and understand negative emotions has great potential to improve human emotional well-being.
Follow us on our social media and look out for our next episode!
Your LIKE means a lot to us! Follow Project M on:
Project M: https://projectm.ipmdinc.com
IPMD website: http://www.ipmdinc.com
Medium: Project M
Facebook: Project M
Blogger: Project M
Instagram: mprojectai
Twitter: @mprojectai
LinkedIn: Project M
*As of January 1, 2021, the Project M team has devoted 100,000 hours to the AI platform M. The sample input is pure test data: it is completely new to M and, we assume, to Microsoft, and has never been used to train M, so this comparison is a fair trial between M and Microsoft. We appreciate Microsoft, the leader of the industry and the emotional AI sector, for allowing the general public to use its testing site for identifying human emotions from facial expressions.