razz
4-5-21, 3:27pm
I had fun trying to express my emotions to challenge the emotion-detection technology in this article: https://www.theguardian.com/technology/2021/apr/04/online-games-ai-emotion-recognition-emojify?utm_term=3d48c16641dcb959b168d9165fcd51dd&utm_campaign=GuardianTodayUK&utm_source=esp&utm_medium=Email&CMP=GTUK_email
Some quotes and a link to try the game:
"It is a technology that has been frowned upon by ethicists: now researchers are hoping to unmask the reality of emotion recognition systems in an effort to boost public debate.
Technology designed to identify human emotions using machine learning algorithms is a huge industry, with claims it could prove valuable in myriad situations, from road safety to market research. But critics say the technology not only raises privacy concerns, but is inaccurate and racially biased.
A team of researchers have created a website emojify.info where the public can try out emotion recognition systems through their own computer cameras. One game focuses on pulling faces to trick the technology, while another explores how such systems can struggle to read facial expressions in context.
Their hope, the researchers say, is to raise awareness of the technology and promote conversations about its use.
'It is a form of facial recognition, but it goes farther because rather than just identifying people, it claims to read our emotions, our inner feelings from our faces,' said Dr Alexa Hagerty, project lead and researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk.
Facial recognition technology, often used to identify people, has come under intense scrutiny in recent years. Last year the Equality and Human Rights Commission said its use for mass screening should be halted, saying it could increase police discrimination and harm freedom of expression.
But Hagerty said many people were not aware how common emotion recognition systems were, noting they were employed in situations ranging from job hiring, to customer insight work, airport security, and even education to see if students are engaged or doing their homework."