We often read about AI being biased, and about how bias creeps into algorithms unknowingly or, at times, intentionally.

However, researchers are now determined to flip the scenario and use AI to investigate social, gender, or color bias in our movies, books, magazine articles, radio transcripts, and social media posts.

Movies, radio, dramas, and social media are not only mediums of entertainment but also of serious business. People express, learn, connect, and communicate through these mediums.

We may have noticed how our movies exhibited bias in terms of gender or color. We were used to seeing only male leads, actresses cast fair-skinned to look beautiful, and mighty men rescuing weak, fragile women. Dowry was treated as acceptable, the birth of a boy brought joy to the family, and the birth of a girl brought a burden.

With scenarios changing, it is crucial that we run a thorough check for such discrepancies in today's movies. We can also use these tools to analyze bias in older movies and take appropriate action.

In recent research at Carnegie Mellon University, a few computer scientists designed a method that puts numbers behind cultural studies of movies.

The automated analysis uses a statistical language model on movie subtitles or radio transcripts to narrow down to words that may signal gender and social biases.

The idea was conceptualized by Kunal Khadilkar and Ashique R. KhudaBukhsh of CMU's Language Technologies Institute (LTI). They gathered 100 Bollywood movies from past decades, along with a few hundred Hollywood movies from the same period. The comparative analysis is quick: according to Khadilkar, "Most cultural studies of movies might consider five or 10 movies while our method can look at 2,000 movies in a matter of days."

This method enables a more precise study of cultural issues. The NLP algorithms at the backend produce quantitative results that can help us understand conventions around cultural norms, beliefs, and ideologies over a given period, through books, magazines, and other media as well.

Beauty norms in the industry, especially in Bollywood, have long been a subject of debate. These tools can also assess beauty conventions by catching specific words such as "fair" rather than more generic terms, so that the results obtained from movie subtitles reflect the bias.
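As a rough illustration of this kind of keyword analysis, the sketch below counts beauty-related words in subtitle text per decade. The corpus, the word list, and the function name are all hypothetical here; a real study would run over subtitle files for hundreds of movies.

```python
import re
from collections import Counter

# Hypothetical data: (decade, subtitle text) pairs. A real corpus would
# hold full subtitle files for hundreds of movies per decade.
corpus = [
    (1980, "she is so fair and lovely, a fair bride"),
    (2010, "she is beautiful and confident"),
]

# Illustrative word list; an actual study would curate these terms.
BEAUTY_TERMS = {"fair", "beautiful", "lovely"}

def beauty_term_counts(corpus):
    """Count beauty-related words per decade in subtitle text."""
    per_decade = {}
    for decade, text in corpus:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts = per_decade.setdefault(decade, Counter())
        counts.update(t for t in tokens if t in BEAUTY_TERMS)
    return per_decade

print(beauty_term_counts(corpus))
```

A shift in the counts across decades (for example, "fair" giving way to more generic praise) is the kind of signal such a tool would surface.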

This technique can detect gender bias through a metric called the Male Pronoun Ratio (MPR), or track other keywords that represent a cultural convention or norm tied to a specific issue or bias during an era.
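A minimal sketch of how an MPR-style metric could be computed from subtitle text follows. The pronoun lists and the exact definition (male pronouns as a fraction of all gendered pronouns) are assumptions for illustration; the published method may differ.

```python
import re

# Illustrative pronoun lists; the actual study may use different sets.
MALE_PRONOUNS = {"he", "him", "his", "himself"}
FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}

def male_pronoun_ratio(subtitle_text: str) -> float:
    """Fraction of gendered pronouns that are male (an MPR-style metric)."""
    tokens = re.findall(r"[a-z']+", subtitle_text.lower())
    male = sum(t in MALE_PRONOUNS for t in tokens)
    female = sum(t in FEMALE_PRONOUNS for t in tokens)
    total = male + female
    return male / total if total else 0.0

# Example: a heavily male-skewed snippet
print(male_pronoun_ratio("He said he would save her. His plan worked."))  # 0.75
```

An MPR near 0.5 would suggest balanced pronoun use; values well above it indicate male-dominated dialogue.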

Disney, too, has been using AI along the same lines for a few years to detect gender bias in its movies. A tool created by Geena Davis, founder of the Davis Institute on Gender in Media, aims to educate both filmmakers and audiences about the need to eliminate unconscious bias in the entertainment industry. The tool analyses scripts to catch and evaluate gender bias by counting male and female characters and checking whether the numbers reflect the actual population.
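The character-counting idea can be sketched as follows, assuming a script breakdown already lists each character with a gender label. The cast data and function name here are hypothetical, not taken from any real tool.

```python
from collections import Counter

def female_character_share(characters):
    """characters: list of (name, gender) pairs from a script breakdown.
    Returns the fraction of gendered characters that are female."""
    counts = Counter(gender for _, gender in characters)
    total = counts["M"] + counts["F"]
    return counts["F"] / total if total else 0.0

# Hypothetical cast list for one script
cast = [("Maya", "F"), ("Raj", "M"), ("Arjun", "M"),
        ("Leela", "F"), ("Vikram", "M")]
print(female_character_share(cast))  # 0.4, versus roughly 0.5 in the population
```

Comparing this share against the roughly even gender split of the general population is how such a tool flags under-representation.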

These efforts are a step in the right direction; however, detection and analysis alone are not enough. Real change requires a shift in our efforts, behavior, and actions. Detecting bias is only the start; acting to make the industry equal and transparent for all is the need of the hour.


DISCLAIMER

The information provided on this page has been procured through secondary sources. In case you would like to suggest any update, please write to us at support.ai@mail.nasscom.in