
The CDEI review, which draws on a detailed analysis of the use of algorithms in four sectors (financial services, local government, policing and recruitment), makes several key recommendations:

  • Government should place a mandatory transparency obligation on all public sector organisations using algorithms that have an impact on significant decisions affecting individuals.
  • Organisations should be actively using data to identify and mitigate bias. They should make sure that they understand the capabilities and limitations of algorithmic tools, and carefully consider how they will ensure fair treatment of individuals.
  • Government should issue guidance that clarifies the application of the Equality Act to algorithmic decision-making. This should include guidance on the collection of data to measure bias, as well as the lawfulness of bias mitigation techniques (some of which risk introducing positive discrimination, which is illegal under the Equality Act).


This roadmap is intended to help government, regulators and industry increase fairness and reduce bias, while also ensuring that the UK regulatory ecosystem is set up to support responsible innovation. According to a media release, the measures are designed to produce a step change in the behaviour of all organisations making life-changing decisions on the basis of data, with a focus on improving accountability and transparency. The transparency obligation the CDEI recommends for public sector organisations would include information about how algorithms are used in the overall decision-making process and the steps taken to ensure fair treatment of individuals.

Effective use of data can enable organisations to shine a light on practices that would otherwise go unseen and to identify the drivers of bias. A large-scale survey conducted with Deltapoll found that the majority of respondents (around 6 in 10) were aware of the use of algorithms to support decision-making. Among those who were aware, awareness was highest for financial services (more than 5 in 10), compared with around 3 in 10 for local government. The results suggest that the public are more concerned that the outcome of decision-making is fair than with whether algorithms are used to inform those judgements. There is also public support for demographic data, including age (net agreement of +59%), ethnicity (+59%) and sex (+39%), being used to tackle algorithmic bias in recruitment.
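
To make the idea of using demographic data to measure bias concrete, the sketch below is a minimal, hypothetical Python example. The toy dataset, the column names ("sex", "selected") and the demographic-parity gap it reports are assumptions for illustration only; they are not drawn from the review, and real audits would use richer metrics, larger samples and, as the review notes, legal guidance on the lawfulness of any mitigation applied.

    # Minimal sketch (assumption: a recruitment dataset with a 0/1 "selected"
    # outcome and a recorded demographic attribute such as "sex").
    # This is not the CDEI's methodology; it only illustrates how collecting
    # demographic data enables a basic disparity check.
    from collections import defaultdict

    def selection_rates(records, group_key, outcome_key="selected"):
        """Return the share of positive outcomes per demographic group."""
        totals = defaultdict(int)
        positives = defaultdict(int)
        for row in records:
            group = row[group_key]
            totals[group] += 1
            positives[group] += int(row[outcome_key])
        return {g: positives[g] / totals[g] for g in totals}

    def demographic_parity_gap(rates):
        """Difference between the highest and lowest group selection rates."""
        return max(rates.values()) - min(rates.values())

    if __name__ == "__main__":
        # Hypothetical toy data purely for illustration.
        applicants = [
            {"sex": "female", "selected": 1},
            {"sex": "female", "selected": 0},
            {"sex": "female", "selected": 0},
            {"sex": "male", "selected": 1},
            {"sex": "male", "selected": 1},
            {"sex": "male", "selected": 0},
        ]
        rates = selection_rates(applicants, group_key="sex")
        print(rates)                          # e.g. {'female': 0.33, 'male': 0.67}
        print(demographic_parity_gap(rates))  # e.g. 0.33

A gap near zero suggests similar selection rates across the recorded groups; a large gap is a prompt for further investigation, not proof of unlawful discrimination.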

The review points to the need for an ecosystem of industry standards and professional services to help organisations address algorithmic bias in the UK and beyond. This presents an opportunity for the UK: leadership in this area can not only ensure fairness for British citizens, but can also unlock growth by incubating new industries in responsible technology.

