Researchers from AlgorithmWatch and the European Data Journalism Network made the discovery by analyzing Instagram newsfeeds, talking to content creators, and studying patents.

The team asked 26 volunteers to install a browser add-on that automatically opens their Instagram homepage at regular intervals and records which posts appear at the top of their newsfeeds. The volunteers then followed a selection of professional content creators who use Instagram to advertise their brands or attract new clients.

Of the 1,737 posts that the content creators published, containing 2,400 photos, 362 (21%) showed bare-chested men or women in bikinis or underwear. The researchers expected that if Instagram’s algorithm wasn’t prioritizing these pictures, the volunteers would see a similar mix of posts. But that didn’t happen: in the volunteers’ newsfeeds, posts with semi-nude pictures made up 30% of the posts shown from those accounts.

Pictures of scantily clad women were 54% more likely to appear in the volunteers’ newsfeeds, while posts with bare-chested men were 28% more likely to be shown. In contrast, posts showing food or landscapes were 60% less likely to pop up in their feeds.

Nicolas Kayser-Bril, a reporter at AlgorithmWatch, said on Twitter that he believes the algorithms are perpetuating the biases of certain users.

This algorithmic bias could push content creators, particularly women, into posting revealing photos to attract more viewers. It could also help shape the worldview of Instagram’s 1 billion monthly users.
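As a rough illustration of how such "more likely" figures can be derived (a hypothetical reconstruction; AlgorithmWatch has not published its exact methodology, and the function name here is our own), one can compare a category's share of what creators posted with its share of what volunteers were actually shown:

```python
def over_representation(share_posted: float, share_shown: float) -> float:
    """Relative change in a category's share between what creators
    posted and what volunteers actually saw in their feeds.
    Positive values mean the category was over-represented."""
    return share_shown / share_posted - 1.0

# Overall, semi-nude posts were 21% of what was posted but 30% of
# what volunteers were shown, i.e. roughly 43% over-represented.
print(f"{over_representation(0.21, 0.30):+.0%}")  # prints "+43%"
```

The per-category figures in the study (54% for women, 28% for men) would follow from the same kind of comparison applied to each category separately.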
Further research required
The researchers admit that the bias towards nudity didn’t apply to all the volunteers. They suspect this is because Instagram’s algorithm promotes nudity in general, but that other factors, such as personalization, limit the effect for some users. They added that it’s impossible to draw concrete conclusions without access to the internal data and production servers held by Instagram’s owner, Facebook. Until that happens, the researchers plan to investigate further by recruiting more volunteers to install their monitoring add-on.

In a statement, Facebook disputed the findings. Nonetheless, the researchers believe their results reflect how Instagram’s algorithm works. They note that Facebook has published a patent describing how its software can automatically choose which pictures appear in a user’s newsfeed. Among the factors that could determine which images to prioritize, the patent specifically mentions “state of undress”.

That suggests Instagram might not only organize newsfeeds based on what a user wants, but also select pictures based on what the company thinks they want.