Is Twitter's algorithm racist?
Social network's image preview function sparks experiment.
Twitter has launched an investigation into its neural network, which appears to be racially biased. When pulling up an image in its photo preview function, the algorithm consistently favours the faces of white people – almost always displaying a lighter-skinned face and cropping out the Black people also featured in the photo.
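Twitter hasn't detailed exactly how its preview model chooses a crop, but saliency-based cropping systems of this kind typically score every pixel for "interestingness" and then centre the crop window on the highest-scoring region. A minimal sketch of that cropping step, assuming a saliency map has already been produced by some model (the map, function name and toy values below are all illustrative, not Twitter's actual code):

```python
# Hypothetical sketch of saliency-based cropping. Twitter's real system uses a
# trained neural network to produce the saliency map; here we just assume one
# exists and illustrate the crop-selection step.

def crop_around_peak(saliency, crop_w, crop_h):
    """Return (left, top) of a crop window centred on the highest-saliency pixel."""
    h, w = len(saliency), len(saliency[0])
    # Find the most "interesting" pixel according to the saliency map.
    peak_y, peak_x = max(
        ((y, x) for y in range(h) for x in range(w)),
        key=lambda p: saliency[p[0]][p[1]],
    )
    # Centre the window on the peak, clamped so it stays inside the image.
    left = min(max(peak_x - crop_w // 2, 0), w - crop_w)
    top = min(max(peak_y - crop_h // 2, 0), h - crop_h)
    return left, top

# Toy 4x6 saliency map whose peak (0.9) sits near the top-right corner.
smap = [
    [0.1, 0.1, 0.2, 0.3, 0.9, 0.4],
    [0.1, 0.2, 0.2, 0.3, 0.5, 0.3],
    [0.1, 0.1, 0.1, 0.2, 0.2, 0.2],
    [0.0, 0.1, 0.1, 0.1, 0.1, 0.1],
]
print(crop_around_peak(smap, 2, 2))  # → (3, 0): the crop hugs the peak
```

The bias complaint is about the upstream scoring, not this geometry: if the model systematically assigns lower saliency to darker-skinned faces, a crop centred on the peak will systematically exclude them.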
College professor Colin Madland first raised the design issue on Twitter, though initially in relation to Zoom's AI software. You can see the tweet below, which drew attention to the fact that Zoom was erasing the face of his Black colleague whenever he used a virtual background while video chatting (for more successful design results, see our pick of web design tools).
any guesses? pic.twitter.com/9aIZY4rSCX – September 19, 2020
Over the course of the thread, Madland realised that Twitter was also defaulting to showing only his own face in photo previews and leaving out that of his Black colleague – suggesting a wider issue with how such AI models work.
A separate Twitter "experiment" began (see it below), in which cryptographic engineer Tony Arcieri compared the photo previews of Mitch McConnell and Barack Obama to see which the Twitter algorithm would feature.
Trying a horrible experiment... Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia – September 19, 2020
Arcieri and others edited the images in myriad ways to check the results – inverting the colours, brightening and darkening parts of the pictures, and changing the backgrounds. Every attempt seemed to point to a racial bias in the algorithm, with Obama appearing in the preview only when the colours were altered.
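The edits the testers applied are simple per-pixel operations. A rough sketch of two of them, operating on a plain list of (R, G, B) tuples for clarity – a real test would apply these to the actual photos with an imaging library such as Pillow before re-uploading:

```python
# Illustrative versions of two edits testers used: colour inversion and
# brightening. Pixels are modelled as (R, G, B) tuples with 0-255 channels.

def invert(pixels):
    """Invert colours: each channel value v becomes 255 - v."""
    return [(255 - r, 255 - g, 255 - b) for r, g, b in pixels]

def brighten(pixels, amount):
    """Brighten by a fixed amount, clamping each channel to the 0-255 range."""
    return [tuple(min(v + amount, 255) for v in px) for px in pixels]

sample = [(10, 20, 30), (200, 210, 220)]
print(invert(sample))        # → [(245, 235, 225), (55, 45, 35)]
print(brighten(sample, 50))  # → [(60, 70, 80), (250, 255, 255)]
```

The point of such edits is to isolate the variable: if the cropping choice flips when only skin tone or brightness changes, the model is reacting to colour rather than, say, image composition.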
Other Twitter users joined in with testing the algorithm, experimenting with Lenny and Carl from The Simpsons, and even dogs.
I tried it with dogs. Let's see. pic.twitter.com/xktmrNPtidSeptember 20, 2020
Twitter's Liz Kelley thanked everyone for bringing attention to the issue, stating the social network has "more analysis to do". "We tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it’s clear that we've got more analysis to do. We'll open source our work so others can review and replicate."
There's clearly more work to do on the algorithms that power AI, and we're glad Twitter is taking responsibility for the issue with its software. With AI increasingly shaping our day-to-day experiences, it's vital that it's built to reflect the complex nuances of society's needs. If you'd like to make sure your design works for everyone, see our guide to inclusive web design.
Georgia is lucky enough to be Creative Bloq's Editor. She has been working for Creative Bloq since 2018, starting out as a freelancer writing about all things branding, design, art, tech and creativity – as well as sniffing out genuinely good deals on creative technology. Since becoming Editor, she has been managing the site on a day-to-day basis, helping to shape the diverse content streams CB is known for and leading the team in their own creativity.