Multiple studies aim to understand Facebook’s impact on elections
(Bloomberg) — Researchers observed conservatives engaging more with false news stories on Facebook than liberals did during the 2020 US presidential election, according to a new study published Thursday in the academic journal Science — one of the first findings from a massive research project to understand Facebook’s impact on democracy.
Meta Platforms Inc., Facebook’s parent company, provided internal data to 17 independent researchers from New York University, the University of Texas at Austin and ten other academic institutions, who aim to examine the role of social media in American democracy, particularly during the 2020 elections.
Previous academic work on Facebook has tended to rely on data that groups collected themselves using a variety of scraping methods on the platform. What makes this effort unique is the sheer scale of the project and the fact that the data has come from Meta itself, ensuring better quality, researchers said. “Social scientists have been limited in the study of social media’s impact on US democracy,” according to a joint statement by Talia Jomini Stroud of the Center for Media Engagement at UT Austin, and Joshua Tucker of NYU’s Center for Social Media and Politics, who led the academic groups.
The partnership builds on an initiative launched by Facebook in 2018, when the company committed to sharing huge amounts of posts, links and other data with external researchers so that they could study and flag disinformation on the site. The researchers involved in the project were not paid by Facebook, and the company promised to have no involvement in the questions researchers asked or the conclusions they drew, it said.
“The research published in these papers won’t settle every debate about social media and democracy, but we hope and expect it will advance society’s understanding of these issues,” Nick Clegg, Meta’s president of global affairs, said in a blog post.
On Thursday, an initial set of four peer-reviewed studies was published in Science and the academic journal Nature. Beyond the study examining political segregation using platform-wide data on US adults, researchers presented results from three experiments that changed consenting users’ Facebook and Instagram feeds: shifting the feed from algorithmic to chronological ordering, reducing political content from like-minded users and suppressing reshared content. More studies from the research project — 16 in total — are set to be published in the coming months, the research consortium said.
In the study looking at ideological segregation on Facebook, researchers analyzed aggregated and anonymized data from 208 million US Facebook users from September 2020 to February 2021. They collected both unique domains and unique URLs of political news stories shared on the platform, examining more than 90,000 links a day during peak news-sharing periods on Facebook — right before the Nov. 3 election, and right after the Jan. 6 insurrection, in which a mob of Trump supporters stormed the US Capitol.
The researchers found that 97% of political news URLs posted at least a hundred times on Facebook and rated as false by Meta’s third-party fact checkers were seen and engaged with by more conservatives than liberals, as measured by Facebook’s algorithmic classifier of users’ political leanings. Such links made up only a small fraction of all news links observed on the platform.
The study suggests that on Facebook, during the 2020 election, the audience for news stories labeled as misinformation by Meta’s third-party fact checkers leaned conservative, said Sandra González-Bailón, a professor at the University of Pennsylvania’s Annenberg School for Communication, and lead author of the research.
The researchers were careful not to draw causal conclusions from the behavior they observed. David Lazer, a professor of political science and computer sciences at Northeastern University and one of the co-authors of the study, said in an interview that other factors could explain the outcome. For example, it’s unclear what mechanisms Facebook uses to flag potential misinformation on the site, and it’s possible that those mechanisms systematically flag news that appeals to a conservative audience. Another study could take a closer look at Facebook’s fact-checking process to see why the third-party fact checkers rate more right-wing political news as false, Lazer said. “We haven’t rigorously evaluated that pipeline,” he said.
But the researchers did find that Facebook audiences, on the whole, are highly politically segregated. On a link-by-link basis, stories were seen and engaged with primarily by conservatives or by liberals — rarely both. Much of the previous academic literature “suggested that internet-based consumption was not so politically segregated,” Lazer said. This new research points to the opposite. The observation is “new and it isn’t, because in a way, people thought we knew that already. But we didn’t.”
The study also found that far more political news links were seen almost exclusively by conservatives than by liberals. And when links were posted in Facebook Pages and Groups, the political segregation of audiences was even more apparent than when links were posted by individual users. “There are features of Facebook — like Pages and Groups — that really allow people to channel homogeneous content in a way that was probably difficult to do so for free” in the past, Lazer said.
The other three studies published Thursday were based on changing the algorithms for a set of consenting Facebook and Instagram users over a three-month period between September and December 2020, at the height of the US election. In one, researchers changed people’s feeds from algorithmic to chronological ordering, so that posts appeared in time order rather than ranked by their preferences. Another study reduced political content from ideologically like-minded users. And a third removed reshared content from people’s Facebook feeds.
On the whole, changing the way the algorithm works isn’t likely to change political attitudes, according to the experimental studies. Researchers noted limitations that could explain the relatively small impact on users’ political attitudes, including the length of the studies and their timing near an election. The sample of users in these studies “came in even more politically knowledgeable and politically engaged than average,” said Andrew Guess, lead author on two of the studies and assistant professor at Princeton University’s School of Public and International Affairs. “And that would suggest potentially, that these are people who would have somewhat more crystallized attitudes going in.”
Still, across the board, the studies found that changes to algorithmic feeds and to the ability to reshare did not change users’ political attitudes or behavior.
“We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes,” said Stroud and Tucker, the leaders of the academic groups studying Facebook data. “What we don’t know is why.”
©2023 Bloomberg L.P.