Can AI detect human emotions when there is almost no facial muscle movement?
What happened in this picture?
After serving in the Marines during World War II, the person in the picture returns home and wants to attend university. At the university reception, the interviewer asks him whether he learned anything useful during his time of service, such as fixing cars or keeping journals. After answering many of these ridiculous questions, he finally becomes angry and annoyed. The interviewer does not know what he had to suffer during his service: long days in a wet, muddy, rain-soaked environment, and moments of fear as he survived bombs and bullets from the enemy lines. He tells the interviewer that all he learned was how to kill the enemy.
The feelings depicted in this image are multifaceted, with three identifiable emotions: anger as the primary emotion, disgust as the secondary, and contempt as the tertiary, with some sadness attached as well. He feels anger at being asked these questions despite his service to his country, disgust and contempt toward the interviewer for treating him without the respect he feels he has earned, and sadness as he remembers what he went through to survive the war.
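As a rough illustration of how a ranked, multi-emotion readout like this could be produced, here is a minimal Python sketch. The emotion labels and score values are hypothetical and are not taken from M's actual output.

```python
# Hypothetical emotion scores for the image (illustrative values only).
scores = {
    "anger": 0.46,
    "disgust": 0.21,
    "contempt": 0.18,
    "sadness": 0.09,
    "neutral": 0.06,
}

# Rank emotions by score and report primary, secondary, and tertiary.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
primary, secondary, tertiary = ranked[:3]
print(f"Primary: {primary[0]} ({primary[1]:.2f})")
print(f"Secondary: {secondary[0]} ({secondary[1]:.2f})")
print(f"Tertiary: {tertiary[0]} ({tertiary[1]:.2f})")
```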
M accurately identifies all of these key emotions because M is the only Artificial Intelligence platform available today that can read human emotion the way a person does:
However, other emotional AI platforms, including Microsoft’s Azure emotion recognition service, are built on facial coding systems that depend on facial muscle movement. In the absence of such movement (as in this image), these systems cannot read any emotion and classify the face as “Neutral.”
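For contrast, here is a minimal sketch of how a muscle-movement-based (action-unit) coding approach can end up at “Neutral” on a nearly motionless face. The action-unit names, intensity values, and threshold are illustrative assumptions, not Microsoft’s actual implementation.

```python
# Illustrative facial-coding classifier (assumed logic, not Microsoft's code).
# Action-unit (AU) intensities on a 0-1 scale for a nearly motionless face.
action_units = {
    "AU4_anger": 0.03,      # brow lowerer, associated with anger
    "AU9_disgust": 0.02,    # nose wrinkler, associated with disgust
    "AU12_happiness": 0.01, # lip corner puller, associated with happiness
}

# Assumed activation threshold below which movement is treated as noise.
THRESHOLD = 0.10

def classify(aus: dict, threshold: float = THRESHOLD) -> str:
    """Return 'Neutral' unless at least one action unit exceeds the threshold."""
    active = {name: value for name, value in aus.items() if value >= threshold}
    if not active:
        # With no measurable muscle movement, a coding-based system has
        # nothing to map onto an emotion and defaults to Neutral.
        return "Neutral"
    # Otherwise report the emotion tied to the strongest action unit (simplified).
    strongest = max(active, key=active.get)
    return strongest.split("_", 1)[1]

print(classify(action_units))  # -> "Neutral"
```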
The ability of emotional Artificial Intelligence to understand these hidden negative emotions is extremely important, as prolonged hidden negative emotions lead to emotional vulnerability that can adversely affect overall emotional well-being. M can recognize this state and identify it early, helping people before the consequences grow out of control.
We strongly believe that now is the time to welcome Project M, an Artificial Intelligence platform that can understand human emotions from facial appearance just as a human does, and to let it help us maximize our emotional well-being.
Your LIKE means a lot to us!
Follow Project M on:
Company website: http://www.ipmdinc.com
Medium: Project M
Facebook: Project M
Instagram: mprojectai
Twitter: @mprojectai
*As of July 2019, the Project M team has devoted 64,000 hours to the AI platform, M. The sample input shown here is pure test data: it has never been used to train M and is presumably new to Microsoft as well, so the comparison is a fair trial between M and Microsoft. We thank Microsoft, a leader in the industry and in the emotional AI sector, for allowing the general public to use its test site for identifying human emotions from facial expressions.