Technology has crept deeply into our lives, and we are thankful to this technological wave for making our lives easier in many ways: every meeting, class, video call, or online gym or cookery session reminds us why we love this technology-led progress. However, we must keep in mind that behind these technological innovations are humans. This brings us to a concern that some experts predicted long ago: bias driven by human judgment. Today we can see AI-led systems changing the scenario for the better in almost every major industry in the world.

AI has undoubtedly improved productivity and eased operations in almost every domain it has touched. At the same time, we cannot ignore that AI is entirely a data- and human-led machine. It is only as good as the data we feed it and the algorithms we humans design to produce the desired results. In the process of conditioning these machines, we incorporate our own prejudices, leading to a big concern: biased AI systems. We have unknowingly trained these systems with our own ideological biases, which can be gender-based, racial, demographic, ethnicity-based, and more.
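The "as good as the data we feed it" point can be made concrete with a toy sketch. The data and model below are entirely hypothetical: a trivial "recruiter" that learns hire rates per group from historical human decisions. Because the historical labels were skewed, the learned model simply reproduces that skew.

```python
# Hypothetical historical hiring records: (gender, hired). The labels
# reflect past human decisions, which here are skewed against women.
history = [("M", 1), ("M", 1), ("M", 0), ("M", 1),
           ("F", 0), ("F", 0), ("F", 1), ("F", 0)]

def train(records):
    """Learn the hire rate per group -- a stand-in for any model that
    picks up group membership as a predictive feature."""
    rates = {}
    for group in {g for g, _ in records}:
        outcomes = [hired for g, hired in records if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

def predict(rates, group):
    # "Hire" whenever the learned rate for the group exceeds 50%.
    return rates[group] > 0.5

model = train(history)
print(predict(model, "M"))  # True  -- bias in, bias out
print(predict(model, "F"))  # False
```

No real model memorizes group rates this crudely, but the failure mode is the same: if the labels encode prejudice, optimizing against those labels encodes it too.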

Still don't see how machines can be biased? Let's look at some instances of discrimination that these algorithms have exhibited.

AI bias in Healthcare

Deep learning algorithms often pick up bias from the data they are fed. In the healthcare domain, one such case of racial discrimination was caught when an AI algorithm produced racially skewed results. The algorithm assigned white patients higher risk scores even though the data for patients of color also supported higher risks, leading to a very serious issue where Black patients were considered less likely to get proper care. This can be attributed to the system using the cost of healthcare as a proxy variable: because darker-skinned patients had historically spent less on high-standard healthcare, the algorithm treated them as lower-risk.
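The proxy-variable problem described above can be sketched in a few lines. The patient records here are invented for illustration: two groups have identical medical need, but group B has historically spent less on care, so ranking by cost (the proxy) pushes group B out of the high-risk tier that ranking by need would place them in.

```python
# Hypothetical patient records: "need" is the real target (how sick the
# patient is), "cost" is past healthcare spending, used as a proxy.
patients = [
    {"id": 1, "group": "A", "need": 8, "cost": 8000},
    {"id": 2, "group": "B", "need": 8, "cost": 5000},  # same need, less past spend
    {"id": 3, "group": "A", "need": 5, "cost": 6000},
    {"id": 4, "group": "B", "need": 5, "cost": 3000},
]

# Rank patients by the flawed proxy vs. by the real target.
by_cost = sorted(patients, key=lambda p: -p["cost"])
by_need = sorted(patients, key=lambda p: -p["need"])

top2_cost = {p["id"] for p in by_cost[:2]}
top2_need = {p["id"] for p in by_need[:2]}
print(top2_cost)  # {1, 3} -- the cost proxy drops patient 2 (group B)
print(top2_need)  # {1, 2} -- ranking by need keeps them in
```

The numbers are made up, but the structure mirrors the reported failure: the proxy correlates with group membership for historical reasons, so a model optimizing the proxy discriminates even though race is never an explicit input.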

AI bias in Hiring

In 2018, Amazon was reported to have a flawed hiring system, at the core of which was a biased AI recruitment algorithm. The AI-based hiring software was choosing men over women for various roles. It was later realized that the software had taught itself that women were fit for some roles but unfit for technical ones such as software development. The gender discrimination came from the data fed to it, which already carried traces of sexism. Gender bias is not new, but what is surprising is that it has crept into our machines and software too, originating from us humans.

AI bias in Finance

PwC offers a useful definition of bias: "Unconscious bias is the idea that you may treat someone differently, or make assumptions about them, based on a trait they have, or that you think they have, without even knowing that you've done it." Finance is a highly sensitive domain with room for many types of bias, including racial, demographic, ethnic, gender, and age-based bias. A very evident one is in credit rating systems, where ratings were decided based on years of banking history or family background, which acted as a hurdle for many budding entrepreneurs and idea bearers behind today's big startups. Thankfully, this has been noticed and is being worked on by various FinTech startups; left unchecked, it could cause large-scale economic and financial damage.

AI bias in Law

A significant amount of bias was noticed in COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), an algorithm used in US law and court systems. Its predictions were racially biased: it flagged offenders as likely recidivists partly based on skin color, labelling roughly twice as many dark-skinned offenders as future recidivists compared to light-skinned ones.

AI bias in Education 

AI-based solutions are now used extensively in learning platforms where performance and results are assessed by algorithms. Such systems have also shown bias based on many parameters, including past qualifications, gender, race, and ongoing learning. The algorithms have had problems assigning appropriate subjects to students or creating individualized study material when parameters such as income level, minority status, and previous educational institution were taken into account. Such results may hinder a student's future prospects, as they ignore a very basic point: learning is an ongoing process built on humans' ever-evolving skills and knowledge.

AI bias in Fashion

Racism, body types, and beauty prototypes have always been points of discussion in the fashion world, most recently because of the algorithms now used by magazines, photo editors, model selectors, and more. As fashion norms change, we need these AI-based algorithms to change too. Many AI-based picture-enhancement tools work mostly on color and have been found to lighten Asian skin tones towards white, and they assume a fixed parameter for body type, size, and body image. Much augmented-reality software makes old faces look new in the name of enhancement. Today fashion is transitioning from a set paradigm to a more inclusive approach: it is about inner beauty, not just color, size, and shape. These AI-based algorithms, however, are creating a contradictory wave and producing flawed results. Things are changing for the better, and we need to critically examine what we feed our software today. Fashion is for all, and the industry as a whole is trying to curb any sort of bias.

Fairness and non-discrimination are the utmost goals for any algorithm. We cannot afford losses or mistakes caused by a lack of fine-tuning of our algorithms. AI is here to help, and we must help it to be non-judgmental and bias-free.

Sources of Article

Image by Gerd Altmann from Pixabay 
