#emotionrecognition

"On December 17th, EPIC filed comments with the Dutch data protection authority, Autoriteit Persoonsgegevens, regarding use of and prohibitions on emotion recognition surveillance. The EU AI Act prohibits the development, deployment, and placement on the EU market of emotion recognition systems intended for use in the workplace and in educational institutions, with limited exceptions where the algorithm is intended for certain medical or safety reasons. Autoriteit Persoonsgegevens opened a consultation requesting feedback on the implementation of this prohibition.

EPIC’s comments discuss some of the common types of emotion recognition, the harms of emotion recognition systems and their inefficacy, common uses and risks in the education and workplace settings, and recommendations. EPIC urges Autoriteit Persoonsgegevens to define emotion recognition systems broadly and either allow for no exemptions or construe the medical and safety exemption narrowly. This recommendation is based on the complete lack of scientific evidence that these systems work and the many ways they violate the rights to privacy, data protection, freedom from discrimination, and various other rights enshrined in the EU Charter of Fundamental Rights and other EU regulations."

epic.org/epic-urges-dutch-data

#EmotionRecognition #AI #PseudoScience: "For his part, Keyes is convinced technologists will never get that far. Developing an AI capable of parsing all the many nuances of human emotion, they say, would effectively mean cracking the problem of general AI, probably just after humanity has developed faster-than-light travel and begun settling distant solar systems.

Instead, in Keyes’ view, we’ve been left with a middling technology: one that demonstrates enough capability in applications with low-enough stakes to convince the right people to invest in further development.

It is this misunderstanding that seems to lie at the root of our inflated expectations of emotion recognition. “It works just well enough to be plausible, just well enough to be given an extra length of rope,” says Keyes, “and just poorly enough that it will hang us with that length of rope.”"

techmonitor.ai/technology/emer

Tech Monitor · Emotion recognition is mostly ineffective. Why do companies still invest? Despite criticism of the AI technology, some companies are still moving ahead with plans for emotion recognition software.