UN human rights chief Michelle Bachelet is calling for a moratorium on the use of artificial intelligence (AI) technology that poses a serious risk to human rights, such as face-scanning systems that track people in public spaces.
Bachelet extended this plea to countries, saying they should expressly ban AI applications that do not comply with international human rights law.
In addition, the UN is asking for the prohibition of government 'social scoring' systems that score people based on their behaviour and certain AI-based tools that categorise people into groups based on their gender, ethnicity, etc.
AI-based technologies have the potential for unbounded good, but they can also “have negative, even catastrophic, effects if they are used without sufficient regard to how they affect people’s human rights,” Bachelet said in a statement.
Bachelet's comments coincide with a new UN report that examines how countries and businesses have rushed to introduce AI systems that affect people's lives and livelihoods without establishing adequate safeguards against discrimination and other biases.
“This is not about not having AI,” Peggy Hicks, the rights office's director of thematic engagement, told journalists as she presented the report in Geneva. “It’s about recognizing that if AI is going to be used in these human rights — very critical — function areas, that it’s got to be done the right way. And we simply haven’t yet put in place a framework that ensures that happens.”
Bachelet did not call for an outright ban on facial recognition technology, but said governments should halt the real-time scanning of people until the technology is proven to be unbiased and non-discriminatory, and until privacy and data protection standards are in place.
The report also voices wariness about tools that try to deduce people's emotional and mental states by analyzing their facial expressions or body movements, saying such technology is susceptible to bias and misinterpretation and lacks a scientific basis.
“The use of emotion recognition systems by public authorities, for instance for singling out individuals for police stops or arrests or to assess the veracity of statements during interrogations, risks undermining human rights, such as the rights to privacy, to liberty and to a fair trial,” the report says.