Bias in Predictive Analytics

Since my interview with sociobiologist Rebecca Costa, I’ve been thinking about predictive analytics and the concept of the “greater good”. I’ve been wondering: is the “greater good” a high enough standard by which to decide on a course of action, given the systemic biases that exist in our societies? The answer is no, not while systemic racism exists and bias continues to be baked into algorithms.

If an algorithm (the building block of artificial intelligence and predictive analytics) is trained on data collected under unjust practices, it further perpetuates those injustices. We saw this with Amazon’s AI recruiting tool, which taught itself to penalize applications from women, and with the biases found in MIT’s data sets. Rebecca and I also discussed the (lack of) representation of people of colour in Google Images. While I only present a few examples here, they are not exceptions but emblematic of a widespread problem. Ruha Benjamin’s Race After Technology contends that inequity is coded into many of the purported technological solutions to social problems. And don’t get me started on the problems with the predictive tools used in policing.
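To make the mechanism concrete, here is a minimal sketch, entirely my own and not from the book, of how a model trained on historically biased decisions reproduces them. The groups, numbers, and hiring scenario are all hypothetical.

```python
# A minimal sketch of biased labels propagating into a model.
# Everything here is hypothetical and for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two equally qualified groups: skill is drawn from the same distribution.
group = rng.integers(0, 2, n)          # 0 = group A, 1 = group B
skill = rng.normal(0.0, 1.0, n)

# Historical hiring decisions encode prejudice: group B needed a much
# higher skill level to be hired, plus some noise in the process.
threshold = np.where(group == 1, 1.0, 0.0)
hired = (skill + rng.normal(0.0, 0.3, n) > threshold).astype(int)

# Train on the biased labels, with group as a feature (in practice the
# same effect arises through proxies such as postcode or school name).
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# At an identical skill level, the model rates group B candidates far
# less likely to be "hired" -- the injustice is faithfully learned.
probe = np.array([[0.5, 0], [0.5, 1]])
print(model.predict_proba(probe)[:, 1])  # group A high, group B low
```

Nothing in the pipeline is malicious; the model simply learns the pattern it is shown, which is exactly why biased data yields biased predictions.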

Nature vs Nurture

When we think about the role of predictive analytics, there is a confounding factor as well: nurture. In Race After Technology, Benjamin notes that:

“98% of all variations in educational attainment is accounted for by factors other than a person’s simple genetic makeup.”

Rietveld et al., 2013, as quoted in Race After Technology, p. 117

What about variations in other areas of attainment? This statistic suggests that predictive analytics can provide a false sense of confidence. That is especially harmful when such tools sacrifice individuality by leaning on group characteristics such as ethnicity.
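To see what that figure implies in practice, here is a back-of-the-envelope sketch (my own, not from the book): a factor explaining only about 2% of variance corresponds to a correlation of roughly 0.14, and a prediction built on it leaves about 99% of the original uncertainty about any individual intact.

```python
# Back-of-the-envelope: what "explains 2% of the variance" means for
# predicting an individual outcome (my own sketch, not from the book).
import math

r_squared = 0.02                      # variance explained by the factor
r = math.sqrt(r_squared)              # implied correlation, about 0.14

# For a standardized outcome, the spread of individual outcomes around
# the prediction shrinks only to sqrt(1 - R^2) of the original spread.
residual_spread = math.sqrt(1 - r_squared)
print(f"correlation: {r:.2f}")                          # 0.14
print(f"remaining uncertainty: {residual_spread:.1%}")  # 99.0%
```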

Rebecca Costa urged us to be critical thinkers, stressing how important that is today. If you’ve never thought about how racism and other forms of prejudice can be coded into algorithms, read Race After Technology.

Learn more with Race After Technology

If you’re interested in issues of justice and equality, particularly antiracism, I recommend Race After Technology. It addresses the dangers of presenting algorithms and predictive analytics as colour-blind, gender-blind solutions, with examples showing how they perpetuate the discriminatory practices already inherent in our societies. I found it eye-opening.

Featured image by Lukas Blazek on Unsplash

