If you haven’t seen the documentary Coded Bias, I highly recommend it.
I enjoy messing around with AI and testing its cultural biases. I’ve been able to get this particular AI to see holes in some of its arguments, and I’m learning along the way. It’s interesting to see how AI can be biased and how it sometimes tells on itself, or on our commonly accepted narratives.
What are some of the things that AI is biased about?
- Race
- Sex
- Socioeconomic status
- Ability
- Age
- Geography
- Religion
How does AI root out cultural bias from its responses?
- By recognizing and understanding the different types of bias
- By identifying the patterns of bias in data
- By creating unbiased data sets
- By developing algorithms that are not biased
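One way to make the second step concrete is a statistical audit of a model’s outputs. The sketch below checks for demographic parity, a common fairness metric: it compares the rate of positive outcomes across groups. The data and group names are made up purely for illustration, and this is only one of many possible bias checks.

```python
# A minimal sketch of auditing model predictions for demographic parity.
# All data below is hypothetical, for illustration only.

def selection_rate(predictions):
    """Fraction of positive (1) outcomes in a list of 0/1 predictions."""
    return sum(predictions) / len(predictions)

def demographic_parity_gap(predictions_by_group):
    """Largest difference in selection rate between any two groups.

    A gap near 0 means the model selects all groups at similar rates;
    a large gap flags a potential bias worth investigating.
    """
    rates = [selection_rate(p) for p in predictions_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical model outputs (1 = approved, 0 = denied) for two groups.
outcomes = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # 6/8 = 75% approved
    "group_b": [1, 0, 0, 0, 1, 0, 0, 1],  # 3/8 = 37.5% approved
}

gap = demographic_parity_gap(outcomes)
print(f"Demographic parity gap: {gap:.3f}")  # 0.750 - 0.375 = 0.375
```

A gap this large wouldn’t prove the model is unfair on its own, but it’s exactly the kind of pattern in the data that an audit is meant to surface.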
The Algorithmic Justice League is a research and advocacy group that works to identify and eliminate bias in algorithms. The group was founded by MIT Media Lab researcher Joy Buolamwini, who has conducted extensive research on the ways that facial recognition algorithms can perpetuate racial and gender bias. The Algorithmic Justice League works to raise awareness of bias in algorithms and to advocate for reforms that would make algorithms more accountable and transparent. The group has also developed a toolkit that can be used to audit algorithms for bias. There is no easy fix for the problem of bias in algorithms, but the Algorithmic Justice League is working to ensure that the issue gets the attention it deserves. With continued research and advocacy, we can hope to see progress in making algorithms more fair and just. You can join at AJL.ORG.