Responsible AI is the buzzword these days. From organizations to governments, policy building and the formulation of AI frameworks revolve around the concepts of responsible AI, and world leaders and thinkers are attempting to inject the concept into the DNA of every organization.
The India Internet Governance Forum (IIGF) analyzed the concept of responsible AI from a novel perspective: feminism. The forum hosted a series of multistakeholder sessions to discuss public policy issues related to the Internet.
The Dialogue's Shruti Shreya organized and moderated a session on the design and deployment of AI systems in India with social disparities in mind. The session was titled 'Feminist Perspective to the Evolution of Responsible AI in India'. The panellists included Laura Galindo-Romero (Meta), Nidhi Singh (Stanford Law School), Asna Siddiqui (Head, INDIAai), and Aayushi Bhotica (Wadhwani AI).
Implementation of policies has long been a challenge in India, and gender bias is an already persistent issue in the country. AI is a technology that can either bridge or widen that gap. Which of the two happens will be determined primarily by the decisions of the humans behind it, and those decisions are subject to human prejudices. Recent machine-learning models offer clear examples of such prejudices.
For instance, security robots are generally given male personas and service robots female ones. In developing countries, such gendered design choices can increase a bot's perceived humanity and the acceptability of AI. Hence, it is vital that responsible AI bridges this gap rather than widens it.
If bias creeps in, it can cause harm to the community. Bias exists in datasets because their usage varies from community to community. Aayushi gave the example of an algorithm that converts images into avatars.
However, the avatars the algorithm generated reflected a man's perspective rather than a woman's: they portrayed what a beautiful woman looks like according to a man. One reason for this bias may be that the functional heads of such companies, and the developers behind the products, are more often men than women.
Three things matter for responsible AI: the dataset, how you define success, and how you deploy the system. Understanding the community is the thread that ties these three parameters together.
International organizations such as the OECD and UNESCO are working to build globally harmonized approaches. It is a ray of hope that women policymakers are behind the formulation of several of these strategies.
Companies, for their part, do not want to ship products that their end users cannot trust. Stakeholder engagement is necessary for understanding consumers, and it can help in assessing the direct and indirect impact of AI.
In one study with generative AI, the prompt 'nurse' produced images of women and the prompt 'CEO' produced images of men. It is important to accept that such prejudiced datasets will always exist. While training an AI algorithm, the developer has to ensure enough balanced data is fed into the system to reduce their impact.
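One way to surface this kind of skew before deployment is to audit how often each gender appears in outputs for a given prompt. A minimal sketch of such an audit, where the sample data and the `gender_balance` helper are purely illustrative (in practice the samples would come from labelling real model outputs):

```python
from collections import Counter

# Hypothetical labelled samples pairing a prompt with the gender
# depicted in the generated image.
samples = [
    ("nurse", "female"), ("nurse", "female"), ("nurse", "male"),
    ("ceo", "male"), ("ceo", "male"), ("ceo", "female"),
]

def gender_balance(samples, prompt):
    """Return the fraction of outputs for `prompt` depicting each gender."""
    counts = Counter(g for p, g in samples if p == prompt)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

print(gender_balance(samples, "nurse"))
```

A heavily skewed ratio for an occupation prompt would flag exactly the 'nurse'/'CEO' disparity described above, before the system reaches users.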
There are also several techniques to offset the bias, one being removing gender attributes from the dataset when they are not required for training the algorithm.
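The attribute-removal technique can be sketched in a few lines. This is an illustrative example, not a real pipeline: the records and the `SENSITIVE` set are invented for demonstration, and in practice the decision about which attributes to drop belongs to the developer:

```python
# Hypothetical training records; 'gender' is not needed for the task.
records = [
    {"experience": 5, "gender": "F", "hired": 1},
    {"experience": 3, "gender": "M", "hired": 0},
]

# Attributes the model should not be able to condition on.
SENSITIVE = {"gender"}

def strip_sensitive(row):
    # Drop sensitive attributes before the data reaches the model.
    return {k: v for k, v in row.items() if k not in SENSITIVE}

features = [strip_sensitive(r) for r in records]
print(features)
```

Note that dropping the column alone does not guarantee fairness, since other attributes can act as proxies for gender; it is one technique among the several the panel alluded to.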
It is important to increase the participation of women in the workforce: it will counter male dominance and open the door to an inclusive workplace. India is making appreciable progress as a nation, although more needs to be done in the design and deployment stages. Furthermore, developers must ensure that AI systems continue to learn after deployment.