Shadowbanning: are social media platforms really suppressing some of our content?

Big Tech has often been accused of inconsistently applying its own rules on what it deems inappropriate content

Social media users have accused big tech companies of shadowbanning, or suppressing certain content online. Kym Ellis / Unsplash

Given that the internet has made everyone a potential entertainer or commentator, it's only natural that we might be interested in how many people are reading, watching or listening to us.

But that interest can quickly become an obsession – after all, getting an audience isn't easy, and sustaining it is even more difficult.

An entire industry has been built to advise people on maximising the reach of their posts on Facebook, Instagram, Twitter, TikTok and other social media platforms, but the factors that make one tweet, image or video more popular than another remain largely unknown.

It's hard to trust what you don't understand

Into this information vacuum rush annoyance, anger and accusation. The most common complaint is “shadowbanning”: the idea that platforms, intentionally or not, are quietly suppressing our content without our knowledge, denying it an equal chance to be seen and heard.

Only last month, shadowbanning was blamed for the disappearance of social media posts reporting on the impending eviction of Palestinian families from Sheikh Jarrah in East Jerusalem.

Instagram responded with a statement that the deletions were part of a global issue that happened to include those Palestinian posts, but the suspicion remains that they were the result of a widespread, secretive practice that penalises people unfairly.

Adam Mosseri, the head of Instagram, attempted to address some of the criticism levelled at his platform in a series of blog posts, the first of which begins: "It's hard to trust what you don't understand."

He lays out in detail how its automated systems rank people’s photos and videos, and the reasons why they might be demoted.

Regarding shadowbanning, Mosseri had previously stated, during a Q&A on Instagram, that it "is not a thing" – that it doesn't exist. But his comments are now more wary, possibly because several studies have shown that it does exist in some form. "We can't promise you that you'll consistently reach the same amount of people when you post," he says. "But we can be more transparent and work to make fewer mistakes."

Social platforms such as Instagram have said they cannot guarantee that users will reach the same number of people every time they post. Georgia de Lotz / Unsplash

Evidence of shadowbanning can often be found where platforms apply their own rules on inappropriate content inconsistently.

Double standards surrounding nudity and semi-nudity are said to affect content posted by athletes, educators and artists, and have been shown to have a disproportionate impact on women and people of colour.

Automated systems have been shown to struggle with languages such as Arabic, resulting in the overzealous removal of posts, while content that merely brushes up against the rules, even without breaking them, is frequently demoted.

The fact that the majority of these demotions are automated and performed by algorithms means there is little transparency, which in turn breeds paranoia, says Carolina Are, an online moderation researcher at City, University of London.

I think platforms are trying to do too much. It's not realistic for them to have a successful moderation system on that scale

"People tend to believe in conspiracy theories at times of uncertainty," she says. "So if people feel uncertain about what's happening to their content on the platform, they're going to come up with their own reasons for why it's not doing well. The fact that platforms like Instagram have had to apologise for censoring users means it's only natural that people are going to think platforms are trying to reduce the reach of their content."

Amid this uncertainty, accusations of shadowbanning have become a convenient and powerful weapon for the alt-right in the US. Former president Donald Trump tweeted about Republicans being subjected to this "discriminatory and illegal practice", helping to spread the notion that social media platforms have an inherent political bias.

This idea continues to swirl; this year, Hungary's Minister of Justice, Judit Varga, accused Facebook of suppressing "Christian conservative, right-wing opinions".

Platforms always firmly deny such accusations, and the real reasons a piece of content fails to find an audience are often more prosaic.

“For example, you might see that an Instagram user doesn’t use the ‘Reels’ feature, which Instagram is trying hard to push at the moment,” says Are. “So it could be that they’re not recommending that account because it’s not doing what they’re wishing it to do.”

This issue was highlighted in Instagram's belated response to the accusations of Palestinian censorship: the company said its policy of favouring original content over reposts of identical material was to blame, and promised to change that prioritisation.

As new users join these platforms and content proliferates, algorithms will have to work harder to sift through it all and recommend the things we are most likely to want to see. But Are believes platforms must be held more accountable for the changes they make, and for the knock-on effects that follow.

“They might say, oh, we have too much content, so some mistakes are going to be made,” she says. “They might say [in relation to censorship], oh, we have so many users and we need to please everyone. But do they really need to rule over that much content? I think platforms are trying to do too much. It’s not realistic for them to have a successful moderation system on that scale.”

But control over the promotion of online content, bound up as it is with financial, corporate and cultural considerations, is not something Big Tech will give up easily. Transparency is probably the best we can hope for.

“The lack of clarity with which platforms recommend, moderate and circulate content is really striking,” says Are. “As a user, you are left posting into a void and hoping for your content to do well, without the platform telling you what’s going on. It’s a service that we’re using, and we should be told how to use it the most to our advantage. I don’t think that is happening at the moment.”