Political parties have complained that Facebook’s algorithm promotes polarization


Facebook has reportedly fielded complaints from political parties claiming that a major News Feed change pushed them toward negative and polarizing posts. Today, The Wall Street Journal published leaked reports detailing the fallout of Facebook’s push for “meaningful social interactions” on the platform. While Facebook touted the move as helping friends connect, internal reports said it had “unhealthy side effects on large slices of public content, such as politics and news,” calling those effects a “growing liability”.

The news is part of a larger Wall Street Journal series based on internal Facebook research. Today’s report looks at the fallout from a 2018 decision to prioritize posts with plenty of comments and reactions. Facebook reportedly made the change after noticing that comments, likes, and shares declined throughout 2017 – which it attributed in part to people watching more professionally produced videos. Publicly, CEO Mark Zuckerberg described it as a way to increase “time well spent” with friends and family instead of passive video consumption.

After the change, internal research found mixed results. Daily active users increased and users found content shared by close connections more “meaningful”, but reshared content (which the change rewarded) contained “excessive” levels of “misinformation, toxicity and violent content”. People were more likely to comment on and share controversial content, and in the process, they apparently made Facebook angrier overall.

One report noted concerns from unnamed political parties in the European Union, including one in Poland. “Research conducted in the EU reveals that political parties are convinced that the change to the algorithm has forced them to skew negative in their communications on Facebook, with the downstream effect of leading them into more extreme political positions,” it said. Facebook apparently heard similar concerns from parties in Taiwan and India.

In Poland, “one party’s social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80 percent negative, explicitly as a function of the change to the algorithm.” And “many parties, including those that have gone strongly negative, are concerned about the long-term effects on democracy.”

News publishers – frequent victims of Facebook’s algorithm changes – were, unsurprisingly, also unhappy with the change. The Journal reports that BuzzFeed CEO Jonah Peretti complained that the change was promoting things like “junky science” and racially divisive content.

Facebook frequently tweaks the News Feed to promote different kinds of content, often explicitly addressing public concerns as well as financial considerations. (The “time well spent” movement, for instance, had harshly criticized “mindless scrolling” on social media.) Facebook vice president of engineering Lars Backstrom told the Journal that “like any optimization, there will be ways in which it will be exploited or leveraged”.

But the Journal writes that when Facebook researchers proposed fixes, Zuckerberg was reluctant to implement them if they threatened to reduce user engagement. Ultimately, however, Facebook reportedly did reduce the weight of comments and shares in the News Feed algorithm, giving more weight to what people said they actually wanted to see.