An algorithm change made by Facebook in 2018 to prioritize reshared material instead led to the spread of “misinformation, toxicity, and violent content,” leaked internal documents have revealed.
That year, Facebook’s chief executive officer, Mark Zuckerberg, said the alteration had been carried out in a bid to strengthen bonds between platform users, particularly family and friends, and to improve their wellbeing.
However, according to the leaked documents that were made public on Wednesday, the modification backfired, turning the social networking platform into an angrier place by rewarding outrage and sensationalism.
The new algorithm rewarded posts that drew high volumes of comments and reactions, metrics that counted as success on Facebook but that came with a strongly negative side effect on the quality of content.
Highlighting the issue, a team of data scientists said: “Our approach has had unhealthy side effects on important slices of public content, such as politics and news.”
They concluded that the new algorithm’s heavy weighting of reshared material in its news feed made the angry voices louder.
“Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” the researchers added in internal memos.
The alteration had been intended to encourage engagement and original posting: the algorithm rewarded posts that attracted more comments and emotion-emoji reactions, which were viewed as more meaningful than likes.
Zuckerberg was reportedly warned about the problem in April 2020 but kept the algorithm in place regardless.