Meta Ends Fact-Checking, Raising Risks of Disinformation to Democracy

February 4, 2025 | 8:00 am
Liza Gordon-Rogers
Research Associate

Mark Zuckerberg recently announced that Meta, the parent company of Facebook, Instagram, and other services, will no longer fact-check social media content on its platforms. Instead, Meta will use a crowd-sourced system like X's Community Notes, in which approved users can append notes to posts that add context or correct mis- and disinformation. At the time of the announcement, disinformation about wildfires in Los Angeles was running rampant on social media. It isn't just natural disasters and public health crises that are subject to online disinformation; elections are also targeted. What happens on widely used social media platforms matters, and it's worth asking what effect Meta's decision to cast aside its modest fact-checking efforts will have, both immediately and in the longer term.

What does science say about social media fact-checking and community notes?

According to Sander van der Linden, a social psychologist at the University of Cambridge who advised Facebook on its fact-checking program in 2022, "Studies provide very consistent evidence that fact-checking does at least partially reduce misperceptions about false claims." However, another expert on misinformation, Jay Van Bavel from New York University, says that fact-checking is less effective when an event or issue is extremely polarizing. A 2019 meta-analysis that compared results of fact-checking across 30 individual studies found that while it has a positive overall influence on political beliefs, fact-checking's effectiveness depends on a person's preexisting ideology, beliefs, and knowledge. Overall, the science of fact-checking is complicated, but it indicates that fact-checking can be effective in combating the spread of misinformation.

What about community notes? Are they effective? A 2024 study analyzed X’s Community Notes program to see whether it lowered engagement with misinformation on the social media platform. While researchers did observe an increase in the number of fact-checks through Community Notes, they didn’t find any evidence that Community Notes significantly reduced engagement with misleading posts. The study concluded that Community Notes “might be too slow to effectively reduce engagement with misinformation in the early (and most viral) stage of diffusion.” Another study of X’s Community Notes found that they were not effective in combating false narratives about the US election, and a “significant portion” of accurate notes dispelling misleading election claims were never shown to users.

While it is clear that mis- and disinformation interventions on social media platforms need improvement, abandoning fact-checking for Community Notes doesn’t seem to be the answer.

Why is Meta back-pedaling?

Political actors like President Trump and X owner Elon Musk have attacked social media fact-checking programs, claiming that these programs are biased and suppress free speech. Coming just ahead of President Trump's inauguration, Zuckerberg's announcement has been described as an obvious capitulation to long-standing political pressure from Trump and his supporters. After the announcement, then-President-elect Trump hailed the decision, saying, "Meta, Facebook, I think they've come a long way." For his part, Trump agreed that his criticism of social media platforms was "probably" the reason for Meta's changes.

Meta donated $1 million to Trump's inauguration fund, recently appointed Trump supporter Dana White to its board of directors, and selected Republican lobbyist and longtime Facebook executive Joel Kaplan as its chief global affairs officer. In November, Zuckerberg met with Trump at his Mar-a-Lago resort after the election. In January, Zuckerberg—alongside other tech CEOs and founders—sat podium-side at President Trump's inauguration. As a side note: Elon Musk, owner of X and a member of the Trump administration, also scaled back the platform's moderation policies, and the company's approach is the template Meta platforms will use in implementing Community Notes.

What does this mean for election mis- and disinformation?

Meta started its fact-checking program in the wake of broad criticism about the level of mis- and disinformation about the 2016 election on its platforms. Following the attempted insurrection in 2021, Facebook suspended thousands of accounts and removed posts that supported the attack on the Capitol. Yet, after the deluge of mis- and disinformation concerning the attempted assassination of Donald Trump, social media platforms were largely unresponsive. Alongside his announcement about fact-checking, Zuckerberg also said that Meta is going to push more political content on its platforms, a reversal of prior moves to disincentivize politics.

Tech watchdog groups warn that Meta's decision to end its fact-checking program may result in a "surge in disinformation." Valeria Wirtschafter of The Brookings Institution said the change was "likely to make the information environment worse." One of the areas where mis- and disinformation thrives most is electoral politics. Brookings argues that disinformation defined the 2024 election landscape, shaping how people perceived the candidates and informing their views on issues such as immigration, crime, and the economy. In addition to "traditional" mis- and disinformation content, new technological advances, such as AI, made creating and distributing this content easier than ever. The 2024 election's intentional disinformation campaigns were so successful partly because they were widely shared on social media sites.

Users are wary of social media—but millions rely on it for news

The public was not unaware of the role social media played in helping spread election disinformation. According to one poll, 65% of respondents said they thought the problem of election misinformation had gotten worse since 2020. And 71% said social media platforms should prioritize working to prevent false and misleading claims over unrestricted speech.

Recent research by the Pew Research Center found that a little over half of US adults at least sometimes got their news from social media platforms in 2024, a number that has risen in recent years. About one-third of US adults said they regularly got news from either YouTube or from Facebook, a Meta platform covered by Zuckerberg's announcement.

Disinformation also affects certain communities more than others. A 2024 Free Press poll found that Black and Latino respondents were more likely to access news on Facebook and YouTube. According to research by Onyx Impact, a nonprofit organization studying how disinformation targets Black Americans, at least 40 million Americans are targeted by disinformation in Black online spaces. Another study on disinformation online found that Black Americans are “disproportionately encountering misinformation.”

Disinformation is a danger to democracy

Disinformation is critical to the election denial movement and is also at the heart of threats against election officials and administrators and the high turnover in those positions. With Meta (and potentially other social media companies) back-tracking, we should expect more mis- and disinformation, including election-related content. Moreover, as I discussed in a previous blog, if President Trump follows the plans proposed by Project 2025, he could encourage Congress and the Federal Communications Commission to use anti-discrimination policies to penalize social media companies that do restrict or limit content related to "core political viewpoints." Project 2025 also supports civil suits against social media platforms that remove user-generated election disinformation.

What can you do? Be prepared to spot mis- and disinformation on your own. UCS has resources on the difference between mis- and disinformation, how disinformation works, and what you can do about it.