Last year, new research showed that the science of emotions isn’t keeping up with the power of emotional AI technology. Now, Nature magazine has released a handy update on the emerging science of emotions. It includes a challenge to the AI community: “Are AI companies going to continue to use flawed assumptions, or are they going to do what needs to be done?”
This is important for AI designers to understand. Failing to account for the full complexity, nuance, and variability of emotional expression risks both bias and evasion: people can easily outsmart an AI by faking expressions. Humans do this all the time; in customer service, people smile even when they don’t feel happy.
The modern approach to analyzing emotion from facial expressions accounts for the highly varied contexts in which humans express them. Our facial expressions are far more nuanced and variable than was previously assumed, and much of that nuance stems from the context in which we express emotion. There are three main contextual considerations:
- situation – what’s the setting? who is present? what kind of event is unfolding?
- personality traits – what is someone’s internal context?
- temporal – how does perception of emotion change as someone’s face moves? what about with acoustic changes in vocalization? how about with changes in body posture?
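The three considerations above can be sketched as a data structure attached to each observation. This is an illustrative sketch only; the class and field names are my own assumptions, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class EmotionContext:
    """Hypothetical context record for one observed facial expression."""
    # situational context: where, who, and what kind of event
    setting: str
    people_present: list[str] = field(default_factory=list)
    # dispositional context: the person's internal baseline
    personality_traits: dict[str, float] = field(default_factory=dict)
    # temporal context: timestamps of the frames being judged together
    frame_timestamps_ms: list[int] = field(default_factory=list)

# Example: a smile at a checkout counter, observed over three frames
ctx = EmotionContext(
    setting="checkout counter",
    people_present=["customer", "clerk"],
    personality_traits={"extraversion": 0.7},
    frame_timestamps_ms=[0, 40, 80],
)
```

The point of the sketch is simply that context travels with the expression: the same smile gets a different record, and potentially a different interpretation, at a checkout counter than at a funeral.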
Scientists studying emotion have shifted their focus to faces in context, not faces in isolation. AI companies working on emotion analysis should do the same.
In practice this means:
- analyze moving faces dynamically rather than static pictures, and vary the situation.
- enhance labeling of training video/images to include context.
- use technology to discover patterns between facial movements and perceived emotion, rather than relying on actors posing assumed expressions.
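The first practice, dynamic analysis, can be illustrated as judging a window of frames rather than a single still. A minimal sketch, assuming a hypothetical per-frame detector has already produced scalar “happiness” scores (the numbers and function name are invented for the example):

```python
def smooth_scores(frame_scores, window=3):
    """Average per-frame emotion scores over a trailing window,
    so the judgment reflects how the face moves over time,
    not a single snapshot."""
    smoothed = []
    for i in range(len(frame_scores)):
        lo = max(0, i - window + 1)
        chunk = frame_scores[lo : i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# A single spiking frame (a momentary, possibly faked smile)
# is damped once neighboring frames are taken into account.
raw = [0.1, 0.1, 0.9, 0.1, 0.1]
print(smooth_scores(raw))
```

A static classifier would read frame three as strongly happy; the windowed view keeps every smoothed score well below that, which is the kind of temporal signal the research points to.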
This video from Nature is worth a watch.