
In recent years, machine learning has given technology the ability to identify human emotions from facial expressions. This has opened up a whole new range of possibilities for industry - from deploying the technology for road safety to employing it for market research. However, critics argue that the technology not only breaches privacy rights but is also inaccurate and racially biased.

A group of researchers from the University of Cambridge have created a gaming website to raise awareness of the technology and how it works, and advance conversations around its use and approach. Visitors can go to the website, emojify.info, and try out the emotion recognition systems with the help of their computer cameras. One game focuses on pulling faces to trick the technology, while another explores how such systems can struggle to read facial expressions in context.

“The claim of the people who are developing this technology is that it is reading emotion,” said Dr Alexa Hagerty, project lead and researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk. But, she explained, in reality the system is simply matching facial movements with assumptions about which emotions those movements are linked to - for example, that a smile means someone is happy.

“There is lots of really solid science that says that is too simple; it doesn’t work quite like that,” said Hagerty, adding that even just human experience showed it was possible to fake a smile. “That is what that game was: to show you didn’t change your inner state of feeling rapidly six times, you just changed the way you looked [on your] face,” she said.
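To make the criticism concrete, the sketch below is a purely illustrative toy rule (not the emojify.info system or any commercial product) showing the kind of simplistic expression-to-emotion mapping described above: it labels a face “happy” whenever the mouth corners are raised, regardless of what the person actually feels. The landmark format and threshold are hypothetical.

```python
# Illustrative sketch only: a naive rule that infers "happy" from raised
# mouth corners - the simplistic mapping critics say such systems rely on.
# Landmark format and threshold are hypothetical, not taken from any real system.

def classify_emotion(mouth_left_y: float, mouth_right_y: float,
                     mouth_centre_y: float, threshold: float = 0.02) -> str:
    """Return an emotion label from a handful of facial landmarks.

    Coordinates are normalised image y-positions (smaller = higher in the
    frame). If both mouth corners sit noticeably above the mouth centre,
    the rule assumes a smile and therefore 'happy' - regardless of the
    person's actual inner state, which is exactly the criticism above.
    """
    corners_raised = (mouth_centre_y - mouth_left_y > threshold and
                      mouth_centre_y - mouth_right_y > threshold)
    return "happy" if corners_raised else "neutral"


# A posed (fake) smile produces the same label as a genuine one.
print(classify_emotion(mouth_left_y=0.55, mouth_right_y=0.56, mouth_centre_y=0.60))  # happy
print(classify_emotion(mouth_left_y=0.60, mouth_right_y=0.60, mouth_centre_y=0.60))  # neutral
```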

Critics have raised concerns about the use of facial recognition technology, especially in the past few years. Just last year, the Equality and Human Rights Commission said its use for mass screening should be halted because, it argued, it could increase police discrimination and hamper freedom of expression. However, Dr Hagerty said many people are not aware of how widely emotion recognition systems are already employed, noting that they are used in situations ranging from job hiring to customer insight work, airport security, and even education to see whether students are engaged or doing their homework.

“It is a form of facial recognition, but it goes further because rather than just identifying people, it claims to read our emotions, our inner feelings from our faces,” said Hagerty.

The technology, in some form or another, is being deployed across the world - from Europe to China - in settings including homes and prisons. In Lucknow, for example, the Uttar Pradesh government has announced plans to use the technology to spot distress in women resulting from harassment. “We need to be having a much wider public conversation and deliberation about these technologies,” Hagerty said. While the technology may have some benefits, concerns about its accuracy, racial bias and effectiveness should be weighed before it is implemented.

“I think we are beginning to realise we are not really ‘users’ of technology, we are citizens in the world being deeply shaped by technology, so we need to have the same kind of democratic, citizen-based input on these technologies as we have on other important things in societies,” she said.
