Get ready for the rise of emotional surveillance
New AI tools claim to measure your personality, emotions, and mood. They can't, of course, because humans aren't reducible to simplistic criteria, and signals like your facial expression don't actually reveal much.
And yet:
"The bad news is that software now purports to glean insights into the depths and vagaries of human emotion using AI, and it is coming to watch you. If it isn't already: Morphcast, for example, has licensed its technology to a mental-health app, a program that monitors schoolchildren's attention, and McDonald's, which launched a promotional campaign in Portugal that scanned app users' faces and offered them personalized coupons based on their (supposed) mood. It is one of many, many such companies doing similar work—the industry term is emotion AI or sometimes affective computing.
Some products analyze video of meetings or job interviews or focus groups; others listen to audio for pitch, tone, and word choice; still others can scan chat transcripts or emails and spit out a report about worker sentiment."
So, one problem is the totally BS premise: that you can look at, say, body language and infer inner thoughts.
Another problem is that these tools will mainly be used for "enhancing worker productivity." The companies selling digital surveillance advertise all manner of use cases: worker safety, mental health, organizational efficiency, burnout reduction in high-stakes fields such as medicine and transportation.
Of course they do.
A final problem is that this is yet another attack on human dignity.
So, if your new job is managing 20 AI agents because all your colleagues got laid off, it's not enough to just get on with it. You also have to look convincingly happy about it. Dystopian stuff.