GIGO, or Garbage In, Garbage Out, is common parlance in AI today. It simply means that flawed data input will produce flawed data output. "Flawed" can mean a number of things: incorrect data, or even an imbalanced representation of data. For instance, one of the most common gripes with facial recognition systems in Western countries today is their poor accuracy on anyone other than white males – the machines routinely misclassify women, trans people, mixed or biracial individuals, African-Americans, ethnic minorities… the list is endless. White Americans constitute roughly 72% of the US population – a significant number, no doubt. But the remaining 28% are Hispanic, African-American or members of other ethnic groups, and machines that disregard that 28% are not acceptable. And this is just the USA; similar discrepancies in facial recognition systems are being seen the world over. Researchers conclude that this is a result of GIGO: the machines are fed skewed, biased data, so naturally the algorithms trained on these datasets reinforce the same stereotypes.
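A minimal sketch (with made-up group labels) of why imbalanced training data hides bias: on a dataset skewed 90/10 towards one group, a naive model that always predicts the majority class still looks "accurate" overall, while failing the minority group completely.

```python
from collections import Counter

# Toy dataset skewed towards one demographic group, mirroring the
# imbalanced training sets described above. Labels are illustrative.
labels = ["group_a"] * 90 + ["group_b"] * 10

# A naive model that always predicts the majority class.
majority = Counter(labels).most_common(1)[0][0]
predictions = [majority] * len(labels)

# Overall accuracy looks fine...
accuracy = sum(p == t for p, t in zip(predictions, labels)) / len(labels)

# ...but the minority group is never recognized at all.
recall_b = sum(
    p == t for p, t in zip(predictions, labels) if t == "group_b"
) / labels.count("group_b")

print(f"accuracy: {accuracy:.0%}")            # 90%
print(f"recall for group_b: {recall_b:.0%}")  # 0%
```

Aggregate accuracy is exactly the trap researchers point to: a system can score well on a skewed dataset while systematically failing everyone outside the majority group.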

This is hugely damaging to the potential of AI, and this is exactly what today’s panel on Inclusive AI aimed to address.

With an all-woman panel of experts representing a varied set of interests, the session took on some weighty issues: the biases propagated by technology and their impact, as well as examples of how technology, when used to uplift the marginalized or differently abled, can have hugely transformative power.

  • Dr. Victoria Baines, Visiting Research Fellow, Oxford Department for International Development (ODID)
  • Ms. Anita Bhatia, Assistant Secretary General, Deputy Executive Director, UN Women
  • Jenny Lay-Flurrie, Chief Accessibility Officer, Microsoft
  • Susie Hargreaves OBE, CEO, Internet Watch Foundation 
  • Elena Sinel, Founder, Teens In AI and Acorn Aspirations
  • Shakuntala Doley Gamlin, Secretary, Department of Empowerment of PWD
  • Jacqueline Kernot, Partner – Financial Services Cyber Security, E&Y 
  • Nishtha Sathyam, Deputy Country Representative for UN Women in India
  • Simmi Choudhary, Economic Adviser and Group Coordinator, MeitY


With the United Nations listing technology and innovation among the Sustainable Development Goals (SDGs), the panel leaned naturally towards those who have spent years studying the intersection of technology and its application to uplifting people with limited access to it. Nishtha Sathyam, Deputy Country Representative for UN Women in India and the session moderator, noted that only 35% of total active users (TAUs) globally are women. “Lack of access to technology is the first factor that widens the gap between people, and then, within technology, the lack of inclusiveness further reinforces this gap,” she said, adding, “At United Nations Women, we don’t want to leave anyone behind. After the pandemic, we now have the twin responsibility of protecting the accomplishments to date and building a responsible future in technology.”

Susie Hargreaves OBE, CEO of the Internet Watch Foundation, is invested in minimizing sexual abuse content online, especially child sexual abuse content in the form of photos or videos. In 2019, IWF analysts assessed a webpage every two minutes; every four minutes, that webpage showed a child being sexually abused. They processed 260,426 reports – a 14% increase over 2018 – of which 132,730 contained images or videos of children being sexually abused, a 26% increase over 2018. IWF has been extensively using indigenous AI technologies to cast its net across a wider number of users consuming or uploading such content. The solutions include an Image Hash List (a list of digital fingerprints of known child sexual abuse imagery), an Intelligent Crawler that browses the Internet in a strategic manner, and machine learning classifiers that identify sexual abuse images. She says, “The challenge is too big for us to handle singlehandedly, and we need technology to help us in this endeavor. And tech has to go hand in hand with governance and legislation to truly effect change.”
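The Image Hash List idea can be sketched in a few lines: compute a fingerprint of each uploaded file and check it against a set of known fingerprints. This is a simplified illustration, not IWF's actual system – real deployments use perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, whereas the cryptographic hash below only catches byte-identical copies.

```python
import hashlib

# Hypothetical hash list: fingerprints of previously identified images.
# SHA-256 stands in for a perceptual hash for illustration only.
known_hashes = {
    hashlib.sha256(b"bytes-of-a-known-image").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Check an uploaded file's fingerprint against the hash list."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_known_image(b"bytes-of-a-known-image"))  # True
print(is_known_image(b"bytes-of-a-new-image"))    # False
```

The design's appeal is that platforms never need to store or share the images themselves – only the fingerprints circulate, which is why hash lists have become a standard mechanism for blocking known abusive content at upload time.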

Shakuntala Doley Gamlin, Secretary, Department of Empowerment of PWD, spoke extensively about the various ways in which differently abled individuals are being empowered by technology interventions like geotagging and DBTS. She also said her department is looking to integrate more closely with state and central government agencies to promote greater inclusion.

Jacqueline Kernot, Partner – Financial Services Cyber Security, E&Y, observed that the Internet’s genesis lay in connecting computers to provide safety. The reality, however, is that the Internet falls far short of protecting the interests of the disabled, the marginalised and the under-age. Technology is advancing at a rapid pace, faster than regulation can keep up. Moreover, governments lack adequate representation of digitally savvy lawmakers who understand, and can build the case for, the responsible and judicious use of technology – though this is slowly changing. Another brewing social problem, which could have a detrimental effect on the entire Internet landscape, is what Kernot calls a “white male dominated” approach to building tech ecosystems. This inadvertently excludes women, LGBTQI people and the differently abled, to name a few. As one moves across geographies, the differences become more widespread and entrenched. She says, “The very biases we strive to fight in society daily are what is being mirrored online.”

Dr. Victoria Baines, Visiting Research Fellow, Oxford Department for International Development, opened her address with a profound statement: AI doesn’t exist in a vacuum. With the advances in IoT and the anticipation of 5G, AI needs to be developed in tandem with these technologies for maximum and meaningful impact. The reality, however, is that the combination of AI and big data has brought about a series of unintended consequences – like the Gangs Matrix, a racially biased database that discriminates against black men. While it is intended to root out crime, it has been far more damaging in its outcome, discriminating against an ethnic group on the basis of colour and background. “What we need are technology-driven solutions that preserve our protectionist tendencies and don’t impinge on human rights,” she said. She also spoke of the blanket ban Instagram imposed in the UK on users sharing graphic pictures of self-harm – a move met with criticism for limiting freedom of expression. To avoid good intent being undone by blanket implementation, it is necessary to identify the right problem statements, and this can be achieved through diversity: “If you engage with varied, diverse people, you will have a better lens into their challenges, and then you can use technology to address those challenges.”

Anita Bhatia, Assistant Secretary General and Deputy Executive Director, UN Women, stressed that digital inclusion is critical today. Only 22% of AI professionals globally are women, which directly impacts the quality of AI and propagates further discrimination. Bhatia spoke of how the UN is actively harnessing technologies like AI and blockchain to address women’s safety and child trafficking, and to enable digital assistance.

Jenny Lay-Flurrie, Chief Accessibility Officer at Microsoft, has an incredible story of her own to tell. She is deaf and, more recently, almost succumbed to a life-threatening embolism that badly damaged her left leg. Yet Lay-Flurrie is all the more driven and passionate about being an agent of change in disability management and integration through technology. She spoke of the countless benefits of hiring people with disabilities: the relentless work ethic and fresh perspective they bring to work, in addition to the immense resilience that comes from managing the additional cognitive load of a disability. She believes it is now incumbent on the tech ecosystem to hire more differently abled individuals.

