Scientists urge public to try new game to see the risks of “emotion recognition technology”

Scientists are asking members of the public to pull faces at their webcam or phone camera to learn more about a controversial technology.

The site, called Emojify, was created to draw attention to “emotion recognition” systems and the shortcomings of this powerful technology.

Powered by artificial intelligence, the technology is designed to detect a person’s feelings and underpins a billion-pound industry that turns facial expressions into data.

The problem with the technology is that many experts consider it inaccurate and simplistic, and it has repeatedly been found to suffer from built-in racial bias.

The technology works on the assumption that humans display six basic facial emotions, and that these can be classified by an algorithm in much the same way a person uses emojis.

Critics of this theory say that human facial expressions are far more complex and nuanced than this, and that the technology has no place in modern society.

The website delves into the ethics of AI-driven emotion recognition. Critics of the technology say it is inaccurate and should not be used. This movement is now called the “resistance to emojification”.

Cambridge and UCL researchers created Emojify to help people understand how computers can be used to scan faces to detect emotions.

The site includes a couple of experiments, framed as games, in which people can try to “beat” emotion recognition technology.

Once a person identifies the difference between a wink and a blink, for example, something the machine cannot do, they are given a congratulatory message.

It says: ‘You can read facial expressions in context, which emotion recognition systems can’t do. You are part of the resistance to emojification!’

Dr Alexa Hagerty, a researcher and project leader at the Leverhulme Centre for the Future of Intelligence at Cambridge, said the technology, which is already used in parts of the world, is “powerful” but “flawed”.

Website visitors can play another game that involves pulling faces at the device’s camera to try to get the emotion recognition system to register all six emotions: happiness, sadness, fear, surprise, disgust, and anger.

They can also answer a number of optional questions to aid research, including whether they have experienced the technology before and whether they find it useful.

AI emotion recognition technology is used in various sectors in China, including in police interrogations and to monitor behavior in schools.

Other potential uses include border control, evaluating candidates during job interviews, and allowing companies to gather information about customers.

The researchers say they hope to start conversations about the technology and its social impacts.

Dr. Hagerty said: “A lot of people are surprised to learn that emotion recognition technology exists and is already in use.

“Our project offers people the opportunity to experience these systems for themselves and get a better idea of their power, but also their shortcomings.”

Juweek Adolphe, chief website designer, said: “It’s meant to be fun, but also to make you think about the stakes of this technology.”

Dr. Hagerty says the science behind emotion recognition is shaky and makes too many assumptions about people.

“It assumes that our facial expressions perfectly reflect our inner feelings,” she says. “If you’ve ever faked a smile, you’ll know that’s not always the case.”

Dr. Alexandra Albert, of UCL’s Extreme Citizen Science (ExCiteS) research group, said a “more democratic approach” is needed to determine how the technology is used.

“There has been no real public input or deliberation on these technologies,” she said.

“These systems analyze your face, but it’s the technology companies that make the decisions about how they are used.”

The researchers said their website does not collect or store images or data from the emotion recognition system.

Answers to the optional questions will be used as part of an academic paper on citizen science approaches to better understanding the social implications of emotion recognition.
