Twitter investigating cropping algorithm after users flag racial bias


Twitter is investigating its image autocropping algorithm after users flagged a pattern of it favouring images of white faces.

The issue was spotted over the weekend, The Verge notes. Users then experimented with images to see whether this was another instance of algorithmic bias.

Algorithmic bias is a pervasive issue in the tech industry, with AI often trained on image datasets that don’t reflect the diversity of the world’s population.

This has resulted in issues such as facial recognition technology not detecting black people’s faces — an issue which computer scientist Joy Buolamwini highlights in her advocacy for better AI training.

While companies have taken steps towards more inclusive datasets, incidents have continued to surface in 2020.

For example, in June Instagram’s algorithm was accused of prioritising partial nudity in posts.

Meanwhile, algorithms still regularly impact white users and people of colour differently across multiple sectors.

One of 2020’s most egregious examples resulted in the false arrest of Robert Julian-Borchak Williams, whose arrest stemmed from an incorrect AI facial recognition match.

Twitter users test cropping algorithm

Twitter users flagged potential algorithmic bias after noticing some patterns when sharing images showing the faces of black and white people.

A pattern emerged where Twitter’s autocropping feature would repeatedly select the white face, regardless of its placement.

For example, a viral thread shows how the algorithm repeatedly chooses Mitch McConnell’s face over former US President Barack Obama’s face.

However, when colours were inverted on the images (thus obscuring skin tone), the algorithm cropped differently.

Multiple users joined in on the experiments. They tried out everything from cartoon characters and Lego figurines to different facial expressions and contrast levels.

Other users did tests to see whether the subject’s sex had an effect on the cropping.

Users also tried different variations of images to track down the exact parameters causing the algorithm to choose certain faces.

However, as some people note, even if it is due to a feature such as contrast, it still ends up favouring one group of people over another.

Twitter’s response to autocropping issues

In response to one of the most viral tweets, Twitter’s official communications account said that the company tested the algorithm for racial and gender bias before shipping.

“But it’s clear that we’ve got more analysis to do,” Twitter said.

The company also plans to open-source the analysis and changes so others can learn from it.

Meanwhile, Twitter’s Dantley Davis said that contrast may be a contributing factor. However, he noted that the company still needs to fix the algorithm.
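To see why contrast alone could skew cropping, here is a minimal illustrative sketch. This is not Twitter’s actual model (which is a trained saliency network); it is a crude heuristic, written for this article, that scores image windows by local contrast and picks the highest-scoring one as the crop anchor. A higher-contrast region will win regardless of where it sits in the frame.

```python
# Illustrative sketch only: Twitter's real cropping uses a trained saliency
# model, not this heuristic. This just shows how a contrast-driven score
# systematically favours the higher-contrast region of an image.
import numpy as np

def contrast_crop_anchor(gray: np.ndarray, window: int = 8) -> tuple:
    """Return the top-left (row, col) of the window with the highest local
    contrast (standard deviation of pixel values), a stand-in for saliency."""
    h, w = gray.shape
    best, best_pos = -1.0, (0, 0)
    for r in range(0, h - window + 1, window):
        for c in range(0, w - window + 1, window):
            score = gray[r:r + window, c:c + window].std()
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Two synthetic patches on a flat background: a high-contrast checkerboard
# on the left, a low-contrast one on the right. The heuristic anchors the
# crop on the high-contrast side.
img = np.full((32, 64), 128.0)
checker = np.indices((8, 8)).sum(0) % 2
img[8:16, 0:8] = np.where(checker, 30.0, 220.0)     # high contrast
img[8:16, 56:64] = np.where(checker, 120.0, 136.0)  # low contrast
row, col = contrast_crop_anchor(img)
```

In this toy setup the anchor lands on the left (high-contrast) patch, which echoes users’ point in the thread: even a “neutral” signal like contrast can end up consistently favouring one group of subjects over another.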

Davis continued to engage with users on the social media platform about the autocropping feature.

In addition to this, he shared information on how the cropping algorithm works. Davis specified that it does not use facial recognition.

But he also echoed users’ sentiment that, regardless of the cause, the result is a problem.

He added that since he’s in a position to fix the problem, he will.

Feature image: Shereesa Moodley/Memeburn

Megan Ellis

