Chronological feeds won’t fix platform polarization, new Meta-backed research suggests


Conservative users saw more false news stories than liberals throughout the last presidential election cycle, too.



Image: Nick Barclay / The Verge

Facebook and Instagram users see wildly different political news in their feeds depending on their political beliefs, but chronological feeds won’t fix the platforms’ polarization problem, new research published Thursday suggests.

The findings come from four papers produced through a partnership between Meta and more than a dozen outside academics to research the impact of Facebook and Instagram on user behavior during the 2020 election. The company supplied data from around 208 million US-based active users in aggregate for the study on ideological segregation, totaling nearly all of the 231 million Facebook and Instagram users nationwide at the time.

Turns out, users Meta previously classified as “conservative” or “liberal” consumed wildly different political news during the 2020 election. The vast majority, 97 percent, of political news rated as “false” by Meta’s third-party fact-checkers was seen by more conservative users than liberal ones. Of all the content viewed by US adults throughout the study period, only 3.9 percent was classified as political news.

For years, lawmakers have blamed algorithmically ranked news feeds for driving political division within the US. In order to study these claims, researchers replaced these feeds on Facebook and Instagram with chronological ones for some consenting participants for a three-month period between September and December 2020. A second group maintained algorithmically generated feeds.

The change drastically lowered the amount of time users spent on the platforms and decreased their rate of engagement with individual posts; users with algorithmic feeds spent significantly more time on the platforms than the chronological group. And while the chronological feed surfaced more “moderate” content on Facebook, researchers found it also surfaced more political content (up 15.2 percent) and more “untrustworthy” content (up 68.8 percent) than the algorithmic feed.

After the experiment was over, the researchers surveyed participants to see whether the change increased their political participation, whether that was signing online petitions, attending rallies, or voting in the 2020 election. The surveys showed no “statistically significant difference” between users with either feed on both Facebook and Instagram.

“The findings suggest that chronological feed is no silver bullet for issues such as polarization,” study author Jennifer Pan, a communications professor at Stanford University, said in a statement Thursday.

Another study from the partnership removed reshared content from Facebook, which significantly decreased the amount of political and untrustworthy news in user feeds. But while the removal did not affect polarization, it did decrease participants’ overall news knowledge, researchers said.

“When you take the reshared posts out of people’s feeds, that means they are seeing less virality prone and potentially misleading content. But that also means they are seeing less content from trustworthy sources, which is even more prevalent among reshares,” study author Andrew Guess, assistant professor of politics and public affairs at Princeton University, said of the research Thursday. 

“A lot has changed since 2020 in terms of how Facebook is building its algorithms. It has reduced political content even more,” Katie Harbath, fellow at the Bipartisan Policy Center and former Facebook public policy director, said in an interview with The Verge Wednesday. “Algorithms are living, breathing things and this further relays the need for more transparency, particularly like what we’re seeing in Europe, but also accountability here in the United States.”

As part of the partnership, Meta was restricted from censoring the researchers’ findings and did not pay any of them for their work on the project. Still, all of the Facebook and Instagram data used was provided by the company, and the researchers relied on its internal classification systems for identifying whether users were considered liberal or conservative. 

Facebook and parent company Meta have long disputed claims that their algorithms drive polarization. In March 2021, BuzzFeed News reported that the company went as far as creating a “playbook” (and webinar) for employees that instructed them on how to respond to accusations of division.

In a Thursday blog post, Nick Clegg, Meta’s president of global affairs, applauded the researchers’ findings, claiming they support the company’s position that social media plays a minor role in political divisiveness.

“These findings add to a growing body of research showing there is little evidence that social media causes harmful ‘affective’ polarization or has any meaningful impact on key political attitudes, beliefs or behaviors,” Clegg wrote. “They also challenge the now commonplace assertion that the ability to reshare content on social media drives polarization.”

While previous research has shown that polarization does not originate on social media, the platforms have been shown to sharpen it. As part of a 2020 study published in the American Economic Review, researchers paid US users to stop using Facebook for a month shortly after the 2018 midterm elections. That break dramatically lessened “polarization of views on policy issues” but, similar to the research published Thursday, did not reduce overall polarization “in a statistically significant way.”

These four papers are just the first in a series Meta expects to total 16 by the time they’re finished. 

The partnership’s lead academics, Talia Jomini Stroud of the University of Texas at Austin and Joshua Tucker of New York University, suggested that the length of some studies could have been too short to affect user behavior, or that other sources of information, like print and television, played a sizable role in influencing user beliefs.

“We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes,” Stroud and Tucker said in a joint statement Thursday. “What we don’t know is why.”
