Twitter investigating cropping algorithm after users flag racial bias
Twitter is investigating its image autocropping algorithm after users flagged a pattern of it favouring images of white faces.
The issue was spotted over the weekend, The Verge notes, prompting users to experiment with images to see whether this was another instance of algorithmic bias.
Algorithmic bias is a pervasive issue in the tech industry, with AI often trained on image datasets that don’t reflect the diversity of the world’s population.
This has resulted in issues such as facial recognition technology not detecting black people’s faces — an issue which computer scientist Joy Buolamwini highlights in her advocacy for better AI training.
While companies have taken steps towards more inclusive datasets, multiple incidents continued to surface in 2020.
For example, in June Instagram’s algorithm was accused of prioritising partial nudity in posts.
Meanwhile, algorithms still regularly impact white users and people of colour differently across multiple sectors.
One of 2020’s most egregious examples resulted in the false arrest of Robert Julian-Borchak Williams, which occurred due to an incorrect AI facial recognition match.
Twitter users test cropping algorithm
Twitter users flagged potential algorithmic bias after noticing some patterns when sharing images showing the faces of black and white people.
A pattern emerged where Twitter’s autocropping feature would repeatedly select the white face, regardless of its placement.
Trying a horrible experiment…
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
For example, a viral thread shows how the algorithm repeatedly chooses Mitch McConnell’s face over former US President Barack Obama’s face.
However, when colours were inverted on the images (thus obscuring skin tone), the algorithm cropped differently.
Let's try inverting the colors… (h/t @KnabeWolf) pic.twitter.com/5hW4owmej2
— Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
Multiple users joined in on the experiments, trying everything from cartoon characters and Lego figurines to different facial expressions and contrast levels.
Twitter is abuzz with examples of its seemingly biased image cropping algorithm. Here's a thread of the most interesting examples and what they mean.
It started with this comparison where Twitter shows McConnell but not Obama (click the photos to see the images I uploaded). pic.twitter.com/PpY6pMmdvQ
— Jacy Reese Anthis (@jacyanthis) September 20, 2020
Hi Twitter algorithm 🤔 pic.twitter.com/2w69hBgWr4
— Ethics in Bricks (@EthicsInBricks) September 22, 2020
There you go pic.twitter.com/JgOGBAVxgz
— nota 🐙 (@NotAFile) September 19, 2020
Other users did tests to see whether the subject’s sex had an effect on the cropping.
let's see if the twitter image algorithm is sexist as well as racist pic.twitter.com/yBPLhxzcbE
— SҚЏLLԐҐФЙ (@NeilCastle) September 20, 2020
Users also tried different variations of images to track down the exact parameters causing the algorithm to choose certain faces.
However, as some people note, even if it is due to a feature such as contrast, it still ends up favouring one group of people over another.
Twitter’s response to autocropping issues
In response to one of the most viral tweets, Twitter’s official communications account said that the company tested the algorithm for racial and gender bias before shipping.
“But it’s clear that we’ve got more analysis to do,” Twitter said.
The company also plans to open-source the analysis and changes so others can learn from it.
We tested for bias before shipping the model & didn't find evidence of racial or gender bias in our testing. But it’s clear that we’ve got more analysis to do. We'll continue to share what we learn, what actions we take, & will open source it so others can review and replicate.
— Twitter Comms (@TwitterComms) September 20, 2020
Meanwhile, Twitter’s Dantley Davis said that contrast may be a contributing factor. However, he noted that the company still needs to fix the algorithm.
Davis continued to engage with users on the social media platform about the autocropping feature.
In addition to this, he shared information on how the cropping algorithm works, specifying that it does not use facial recognition.
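In broad strokes, a crop driven by image saliency rather than face detection scores regions of the image and keeps the highest-scoring window. The sketch below is purely illustrative: the contrast-based saliency measure and the `autocrop` helper are stand-in assumptions, not Twitter’s actual (learned, neural) model.

```python
def saliency_map(img):
    """Toy saliency: absolute deviation from the image's mean intensity.

    A crude stand-in for a learned saliency model -- brighter or
    higher-contrast regions score higher.
    """
    flat = [v for row in img for v in row]
    mean = sum(flat) / len(flat)
    return [[abs(v - mean) for v in row] for row in img]

def autocrop(img, crop_h, crop_w):
    """Slide a crop window over the image and return the window
    with the highest total saliency."""
    sal = saliency_map(img)
    h, w = len(img), len(img[0])
    best_score, best = -1.0, (0, 0)
    for top in range(h - crop_h + 1):
        for left in range(w - crop_w + 1):
            score = sum(sal[r][c]
                        for r in range(top, top + crop_h)
                        for c in range(left, left + crop_w))
            if score > best_score:
                best_score, best = score, (top, left)
    top, left = best
    return [row[left:left + crop_w] for row in img[top:top + crop_h]]

# An 8x8 greyscale "image" that is dark except for a bright 3x3 patch
# in the bottom-right corner; the crop gravitates to the bright patch.
img = [[0.0] * 8 for _ in range(5)] + [[0.0] * 5 + [1.0] * 3 for _ in range(3)]
print(autocrop(img, 3, 3))  # the bright patch wins the crop
```

Note how a simple measure like this would favour whichever region scores higher on contrast, consistent with users’ point that even a contrast-driven crop can systematically favour one group over another.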
He also echoed user sentiment that, regardless of the cause, the result is a problem.
I agree that this is a problem, regardless of what the cause is.
— Dantley 🔥✊🏾💙 (@dantley) September 19, 2020
He added that since he’s in a position to fix the problem, he will.
Feature image: Shereesa Moodley/Memeburn