Social media platforms should make algorithms public
Algorithms amplify and suppress content, creating harm, yet are shrouded in secrecy. The draft Online Safety Bill should force Big Tech to declare how algorithms work.
Although they are invisible to the public, algorithms decide what we see, hear and experience on social media. They are the secret editors the online world doesn’t want you to understand. They also influence how news companies choose to publish and prioritise stories.
"Before you decide whether to publish, you have to think whether it will please the algorithm,” says an anonymous social media editor who works at a 24 hour news channel. “If it doesn’t, it won’t perform well. Unfortunately, the algorithms can be at odds with the end user who feels like their concerns are not being met by the content publisher.”
Rather than serve content to the end user chronologically, social media companies deploy algorithms that prioritise or suppress content based on engagement signals and other undisclosed factors. How many comments, shares and 'likes' a post receives can determine how widely it is promoted on Twitter, Facebook or Instagram. But there are concerns that the political and ideological preferences of the platforms may also be shaping what we see online. While the platforms themselves insist this is not the case, there is no way of knowing for sure. Why? Because the algorithms remain a closely guarded secret.
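To make the distinction concrete, here is a deliberately simplified sketch of the two approaches. It is illustrative only: the `Post` fields and the weights are invented for this example, and no platform has published the signals or weights it actually uses.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    published: datetime
    likes: int = 0
    shares: int = 0
    comments: int = 0

def chronological_feed(posts: list[Post]) -> list[Post]:
    # The "neutral" ordering: newest first, with no weighting at all.
    return sorted(posts, key=lambda p: p.published, reverse=True)

def engagement_feed(posts: list[Post]) -> list[Post]:
    # Hypothetical weights, invented for illustration; the real values
    # used by Twitter, Facebook or Instagram are not public.
    def score(p: Post) -> float:
        return 1.0 * p.likes + 3.0 * p.shares + 2.0 * p.comments
    return sorted(posts, key=score, reverse=True)
```

Even in this toy version, the choice of weights is an editorial decision: valuing a share at three times a like changes which posts rise to the top, and the user is never shown the weights.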
The Joint Committee scrutinising the draft Online Safety Bill, and the evidence submitted to it, have detailed how algorithms can be harmful. Platform design is central to what people see and experience on social media. Platforms do not neutrally present content. For most user-to-user platforms, algorithms are used to curate a unique personalised environment for each user. This can lead to artificial amplification of content, “rabbit holes”, feeding people’s natural biases and obsessions, and also potentially suppression of content, which creates an opposite harm by omission.
Algorithms designed to maximise engagement can directly result in the amplification of content that creates a risk of harm. For example, the Centre for Countering Digital Hate (CCDH) found that 714 posts manually identified as antisemitic across five social media platforms reached 7.3 million impressions over a six-week period. By maximising engagement, algorithms can also hyper-expose individual people to content which exposes them to a high risk of harm. In showing people content that is engaging, algorithms can lead them down a “rabbit hole” whereby content that creates a risk of harm becomes normalised and they are exposed to progressively more extreme material. As Mr Ahmed told us, people are more likely to believe things they see more often, and news feeds and recommendation tools are a powerful way to influence a person’s worldview. ITV told us: “show an interest in a topic, even one that is potentially harmful, and their core business model and algorithms will find more of it for you.”
- One example of harm, from paragraph 71 of the Joint Committee’s report on the draft Online Safety Bill
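The “rabbit hole” the committee describes is, mechanically, a feedback loop: the system recommends whatever resembles past engagement, everything shown is counted as further engagement, and the pool narrows. Below is a minimal sketch of that loop, with an invented topic-matching rule standing in for the (secret) real one:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Post:
    text: str
    topics: frozenset

def feedback_loop(posts: list[Post], seed_topic: str, rounds: int = 5, k: int = 3):
    # Toy model: each round, recommend the k posts that best match the
    # user's inferred interests, then count everything shown as engagement.
    interests = Counter({seed_topic: 1})
    for _ in range(rounds):
        ranked = sorted(posts,
                        key=lambda p: sum(interests[t] for t in p.topics),
                        reverse=True)
        for post in ranked[:k]:
            for topic in post.topics:
                interests[topic] += 1  # engagement reinforces the interest
    return interests.most_common()
```

Because the loop scores posts by the very interests it has just reinforced, the same material keeps winning: show an interest once and the counter for that topic compounds every round. No intent is required anywhere in the code; the amplification falls out of the optimisation target.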
Although there are fears that the Online Safety Bill will stifle the freedom of the press online, news companies are already self-censoring, serving content that they know will perform well under the social media algorithms.
“It’s only when you publish content every day that you notice which content does well and which doesn’t,” the social media producer explains. “I pick it up almost subconsciously, because I’m not just on there 9 to 5, I’m always looking, in a constant feedback loop. The performance of posts is not completely organic; there is something else at play: algorithms. Content is sorted according to algorithms, and not every user knows how to turn this off.”
Environmental protesting is “pushed upwards” on Twitter - “XR content takes off like a rocket” - while coverage of immigration and race is pushed down. Across platforms, news reporting about child abuse is pushed down, apparently because the algorithms cannot distinguish journalism about the subject from the harmful content itself: a misguided attempt to protect users.
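That kind of suppression does not need a policy decision; a context-blind filter produces it automatically. The sketch below is hypothetical (the flagged terms and the penalty are invented), but it shows how a keyword-based safety score throttles a news investigation and an abusive post alike:

```python
# Invented term list and penalty, purely for illustration.
FLAGGED_TERMS = {"child abuse", "grooming"}

def reach_multiplier(text: str) -> float:
    # Context-blind: journalism about a subject and the harmful content
    # itself contain the same words, so both are penalised equally.
    hits = sum(term in text.lower() for term in FLAGGED_TERMS)
    return 0.5 ** hits  # each hit halves the post's distribution

print(reach_multiplier("Investigation: child abuse referrals rise 20%"))  # 0.5
print(reach_multiplier("Lovely weather on the coast today"))              # 1.0
```

The filter cannot tell reporting about a subject from the subject itself, which is exactly the “harm by omission” the committee warned about.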
Algorithms are Google’s “most closely-guarded secret”, according to DMG Media, which gave evidence to the Joint Committee on the draft Online Safety Bill. The social media producer says that, in his experience, Twitter is also very unwilling to discuss how its algorithms work. Facebook is more open: a representative told him that the algorithms have evolved and they didn’t know why, “a bit like the rise of the machines in Terminator”.