Opinion: Facebook’s alarming plan for news feeds


But you should also be concerned that Facebook has developed a way to reduce the chances that you will hear about such reviews of the platform in the future. The company now plans to show users more positive articles about itself in the news feed through an initiative known as Project Amplify, according to The New York Times, which cited three people with knowledge of the effort.

Prioritizing favorable stories by definition leaves less room for more objective reporting in news feeds, since users can only read a finite number of stories each time they log on.

The plan is downright alarming and underscores the importance of researching information outside of social platforms.

According to the Times, Facebook plans to show users more articles that make the company look good, including some written by its own employees. The Times reports that the initiative was personally approved in August by chief executive Mark Zuckerberg. The company told the Times it hasn’t changed its approach, but a spokesperson also appeared to suggest it was making a change, telling the Times, “People deserve to know the steps we are taking to address the various issues our business faces – and we will be sharing those steps widely.”

The reason this plan is so disturbing is that Facebook is one of the main places Americans get their news: 48% of adults sometimes or often get their news from social media, according to a recently released Pew Research Center poll.

If Facebook prioritizes posts that it perceives as favorable to its image, users may see fewer critical stories, because those stories are unlikely to be prioritized and amplified in the same way as the posts the company favors. From there, it would be only a small step to censoring articles critical of the company entirely. That could stifle an urgent national debate over what Facebook and other social networks are doing to the country.

It has never been more important to have an honest assessment of the problems that technology companies create and how to solve them.

For example, online disinformation poses a direct threat to the American democratic system. According to Facebook’s own figures, a Russian-led campaign to spread disinformation around the 2016 presidential election reached 126 million Facebook users and 20 million Instagram users.

Misinformation from the conspiracy group QAnon around the 2020 election helped spread the false perception that the election was stolen from former President Donald Trump and helped fuel participation in the deadly Jan. 6 attack on the US Capitol, according to Mia Bloom and Sophia Moskalenko’s recently published book, “Pastels and Pedophiles: Inside the Mind of QAnon.”

Bloom and Moskalenko write that “Facebook determined that QAnon was dangerous relatively late in the game” – two years after the group was banned by Reddit. And, of course, Facebook didn’t suspend Trump – who had repeatedly used belligerent language – from its platform until after the Jan. 6 attack. The company’s Oversight Board later found that Trump’s “words of support for those involved in the riots legitimized their violent actions.”

If Facebook users lack critical coverage of how social media affects elections and see only or primarily content the company perceives as favorable, they will be less likely to have the pressing national debate the nation needs on how to protect future elections from violence, foreign interference and disinformation.

Americans will also be less likely to fully combat what social media is doing to us as people.

While it was troubling to read a recent Wall Street Journal report that Facebook is well aware that Instagram makes teenage girls feel bad about their bodies, it would have been even more surprising if Facebook hadn’t been aware of these problems.

Research has long shown that users post self-promotional content on social media, which stirs envy among their friends. Over time, envy has been shown to damage a person’s mental health, decrease their sense of well-being and self-worth, make them dissatisfied, cause them to withdraw from groups and even lead to depression. (Instagram public policy manager Karina Newton wrote in a response to the Wall Street Journal article that the story “focuses on a limited set of outcomes,” that social media can make people feel both more connected and more alone, and that the company recognizes that its “job is to make sure people feel good about the experience they have on Instagram.”)

These aren’t even the only problems with social media. As I said before, Facebook also tracks users’ online activities and may use and share this information in a way that seriously compromises their privacy. But you get the idea.

The dangers posed by Facebook and other social media platforms are not abstract. They affect users’ ability to make informed decisions about whom to vote for, how to ensure the peaceful transfer of power between leaders, how to protect children’s mental health and more. We can’t afford to have tough reporting on these issues replaced with puff pieces that populate Americans’ news feeds and appeal to Facebook’s PR team.

Of course, even without this policy, relying on Facebook for news was never a good idea.

As Facebook investor Roger McNamee wrote in “Zucked: Waking Up to the Facebook Catastrophe,” the company’s platform is designed to show users extreme stories because those are the ones that keep them online longer (and therefore, of course, generate more advertising revenue for the company).

These kinds of stories are not necessarily the ones that keep us best informed about the issues facing our communities and our country. That is why it is so essential for Americans to seek out news beyond technology platforms, directly from reputable media outlets.

There is absolutely nothing to like about Facebook’s alarming new policy. It will almost certainly reduce users’ awareness of the dangers posed by social media. The only way around this filter is to seek information directly from legitimate sources that are not motivated by the desire to make Facebook look good.