AI/Machine Learning and Facial Micro-Expression Detection
The use of AI/machine learning in affective computing (systems that can recognize, detect, and respond to human emotions) is a growing field. The state of the art has come a long way since early human activity recognition, expanding from posture and gait to changes in the human voice, to provide a better understanding of human bodies and emotions.
In the 1960s, Paul Ekman set out to determine whether he could identify deception in depressed patients in order to prevent suicide. He found that some patients suffering from depression would lie in interviews and say they were improving. His work led to the Facial Action Coding System, which formed the basis for training humans to detect micro-expressions. Today, Affectiva can successfully identify emotions based on macro-expressions such as broad smiles, exaggerated frowns, obviously narrowed eyes, and pursed lips. The next frontier will be based on micro-expressions.
Satya Venneti reports on a two-CNN approach* to generating machine-learned features that focus on micro-expressions. Pre-processed video frames are fed into the CNNs, which use both pixel data and optical flow data to capture spatial and temporal information. The CNNs generate machine-learned features, and both streams are integrated into a single classifier that predicts the emotion associated with the micro-expression; a rough sketch of this two-stream design follows the footnote below.
*a spatial CNN that was pre-trained on faces from ImageNet, and a temporal CNN for analyzing changes over time.
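To make the two-stream idea concrete, here is a minimal sketch in PyTorch. The layer sizes, input shapes, class name, and the late-fusion step are illustrative assumptions on my part, not the exact architecture Venneti describes; the sketch simply shows how RGB face frames and stacked optical-flow fields could feed two separate CNNs whose learned features are concatenated into a single emotion classifier.

```python
# Illustrative two-stream micro-expression classifier (assumed architecture, not
# the reported one). Spatial stream sees pixel data; temporal stream sees stacked
# optical-flow fields; their features are fused into one classifier.
import torch
import torch.nn as nn

class TwoStreamMicroExpressionNet(nn.Module):
    def __init__(self, num_emotions=7):
        super().__init__()
        # Spatial stream: raw RGB pixels from a single pre-processed face frame.
        self.spatial = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Temporal stream: stacked optical-flow fields (x/y displacement for
        # several consecutive frame pairs), capturing motion over time.
        self.temporal = nn.Sequential(
            nn.Conv2d(10, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Single classifier that fuses the machine-learned features of both streams.
        self.classifier = nn.Linear(64 + 64, num_emotions)

    def forward(self, frame, flow_stack):
        spatial_feat = self.spatial(frame)          # (batch, 64)
        temporal_feat = self.temporal(flow_stack)   # (batch, 64)
        fused = torch.cat([spatial_feat, temporal_feat], dim=1)
        return self.classifier(fused)               # emotion logits

# Example usage with dummy pre-processed inputs:
model = TwoStreamMicroExpressionNet()
frame = torch.randn(1, 3, 112, 112)        # one RGB face crop
flow_stack = torch.randn(1, 10, 112, 112)  # 5 frame pairs x 2 flow channels
logits = model(frame, flow_stack)
```

Concatenating the two feature vectors before a single linear layer is just one simple fusion choice; the key point is that spatial (appearance) and temporal (motion) cues are learned separately and then combined to predict the micro-expression's emotion.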
Do share your thoughts on how you see this evolving socially and technologically - drop me a note privately or via the comment section below.
About the Author:
Madhu cherishes the opportunity to learn and collaborate; he has three decades of experience nurturing beachhead market ideas worldwide. The views expressed here are Madhu's own and do not reflect those of his employer.