Emotion AI, or affective computing, refers to artificial intelligence that detects and interprets human emotional signals. This technology combines cameras and other sensors with AI programs to capture facial expressions, body language, vocal intonation, and other cues. Although Emotion AI is a newer technology, facial recognition tools are nothing new.
These AI systems draw on several types of data to generate insights into emotion and behavior, including vocal intonation, body language, and the content of spoken or written language, which is analyzed for affect and attitude.
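One of the channels mentioned above, analyzing text content for affect, can be sketched with a simple lexicon-based scorer. The word lists and the scoring rule here are illustrative assumptions for demonstration, not a description of any production Emotion AI system:

```python
# Minimal sketch of one Emotion AI input channel: scoring text for affect
# with a hand-made lexicon. The word lists are illustrative assumptions.

POSITIVE = {"happy", "great", "love", "calm", "excited"}
NEGATIVE = {"angry", "sad", "hate", "anxious", "frustrated"}

def affect_score(text: str) -> float:
    """Return a score in [-1, 1]; positive values indicate positive affect."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(affect_score("I love this and I am so happy"))  # 1.0
print(affect_score("I am angry and frustrated"))      # -1.0
```

Real systems replace the lexicon with trained language models, but the pipeline shape is the same: extract features from the signal, then map them to an affect estimate.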
However, this technology comes with major concerns and nuances. These systems are trained on data, and like any trained model, they can be wrong.
When AI algorithms are trained on data sets with embedded racial, ethnic, and gender biases, those biases carry over into their evaluations. Facial-recognition systems, most of them also based on deep learning, have been widely criticized on this front. Research at the MIT Media Lab found that these systems were less accurate when matching the identities of nonwhite, nonmale faces, typically because the training data sets skew white and male. Identifying emotional expressions adds further layers of complexity on top of that.
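The kind of disparity the MIT Media Lab research reported can be measured by breaking a classifier's accuracy out per demographic group. The sketch below uses synthetic records as an assumption; the group names and labels are placeholders, not real evaluation data:

```python
# Hedged sketch: auditing a classifier's accuracy per demographic group.
# The records below are synthetic placeholders, not real evaluation data.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_label, actual_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, pred, actual in records:
        total[group] += 1
        correct[group] += int(pred == actual)
    return {g: correct[g] / total[g] for g in total}

records = [
    ("group_a", "happy", "happy"),
    ("group_a", "sad", "sad"),
    ("group_a", "happy", "happy"),
    ("group_b", "happy", "sad"),
    ("group_b", "sad", "sad"),
]
print(accuracy_by_group(records))  # {'group_a': 1.0, 'group_b': 0.5}
```

A gap between the per-group numbers, like the one above, is exactly the signal that a training set skewed toward one group has biased the model's evaluations.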