‘Emotion Recognition’ AI Judges If You’re Happy At Work, Knows You’re Faking It

Imagine this: you’re at work, bogged down by the countless items on your to-do list, and trying to muster up the motivation to make it to Friday. Worse still, you can’t furrow your brows, yawn, scowl at an annoying coworker, or betray your real emotions at all because something’s watching you.

While this may sound like the far-fetched plot of the latest dystopian sci-fi novel, it could, unfortunately, be the reality for some employees working at major organizations in China.

Chinese technology firm Taigusys has developed an AI emotion-recognition system that it claims can detect and analyze human facial expressions.

It doesn’t just give a minute-by-minute analysis of what someone might be feeling, but can even generate detailed reports to track how they present over a period of time. This system was first uncovered in an investigative report by The Guardian.

On Taigusys’ site, it lists several multinational corporations as part of its clientele, including big names such as Huawei, China Mobile, and PetroChina. It is important to note, however, that there is no proof these organizations are using Taigusys’ AI emotion-recognition product.

According to Insider, many researchers have come forward to question both the accuracy of such AI software and its ethical implications.

While Taigusys claims the software will help to “address new challenges” and “minimize the conflicts” in the workplace, being constantly tracked to see if you’re happy or bored could make work even more stressful than it already is.

The AI algorithm analyses an individual’s facial muscle movements and biometric signals, evaluating them on several scales to determine one’s emotional state.

“Good” emotions include happiness and surprise, while “negative” emotions are those such as sorrow, confusion, and anger. The software can also detect “neutral” emotions such as concentration and focus, reports Insider.

Adding to the software’s creepiness factor, Taigusys claims it can even detect when an individual is faking a smile. Yikes. If someone exceeds the recommended parameters for “negative emotions,” the software can generate a report recommending them for “emotional support.”

While this may sound rather helpful on paper, it’s unclear whether organizations would really offer employees help, or simply take them off the payroll altogether.

Ethics researchers, such as Daniel Leufer, a Europe policy analyst at the digital civil rights non-profit Access Now, told Insider that even if the software worked, it would constitute a “gross violation” of human rights, including an individual’s right to privacy and free expression.

“Worst of all, all of these violations potentially occur even if emotion recognition is not scientifically possible. The very fact that people believe it is, and create and deploy systems claiming to do it, has real effects on people,” he explained.

Though no one is sure how many organizations around the globe are employing such technology, it sure looks like it’s entering the territory of alarming Black Mirror episodes.


Image via Taigusys

