What’s the Difference between Disinformation and Misinformation?

March 15, 2023 | 12:18 pm
Photo: a young person holds a sign reading "Our future's on the line" at a protest. Josh Barwick/Unsplash
Sophia Marjanovic
Former Contributor

Information spreads around the world faster than ever before in human history because of innovations in technology. Ensuring that people have access to accurate information to make science-based, informed decisions is crucial for public health and safety. Those who spread disinformation try to create chaos through division, distraction, delay, and demoralization in order to disrupt the democratic processes that can produce science-based solutions. If we seek to uphold fair democratic processes, underestimating the intent behind false information works against us.

How is misinformation different from disinformation?

Information can be defined as “knowledge obtained from investigation, study or instruction.” Scholars who study the spread of false information define misinformation as false information shared without the intent to cause harm, and distinguish it from disinformation, which is false information shared with the intent to cause harm. The distinction to remember is intent: misinformation is unintentional, while disinformation is deliberate.

According to Google Trends, since at least 2020 Google searches for the term “misinformation” have been more frequent than searches for “disinformation.” Because the terms used to describe false information are often unclear, disinformation (which is intentional) is sometimes classified as misinformation (which is unintentional). I will show that underestimating the intent behind disinformation by labeling it as misinformation serves to propagate the harms of disinformation.

Misinformation and disinformation have been with us for a long time

Disinformation has been documented as early as 44 BC in ancient Rome. Powerful people seeking to gain from manipulating a community’s information often use strategies and tactics outlined in the UCS Disinformation Playbook. While technological developments have made information sharing faster than ever before in human history, these developments too often reinforce the oppressive practices embedded in a society’s dominant ideologies and structures of power and supremacy.

Because artificial intelligence can be used to spread disinformation, technology ethicists have been warning about the harms that technology reinforces on the most marginalized people. Dr. Timnit Gebru was fired from her job as an artificial intelligence (AI) ethicist at Google for warning about how natural language processing, the branch of AI concerned with giving computers the ability to understand text and spoken words much as human beings do, could replicate harms against people already subjected to oppression in a society. On March 17, 2023, the second anniversary of the release of her research, Dr. Gebru will host a virtual forum about how many of the harms she warned about have unfortunately come to pass. Dr. Gebru has recommended creating a regulatory agency for AI, similar to how the FDA regulates consumer products.

Recently, both misinformation and disinformation have contributed to the high human health toll of COVID-19, especially among Indigenous and other Communities of Color. The speed and effectiveness with which both misinformation and disinformation hindered the public health response to the pandemic show that underestimating the intent of those who spread false information can have long-lasting consequences. These affect not only human physical and mental health, but also reliable access to housing, food, clean water, healthcare, transportation, and public safety.

The harms caused by misinformation and disinformation about the coronavirus pandemic prompted the World Health Organization to launch an information system to counter false information about the virus that causes COVID-19.

Campaigns to deceive the public, from the tobacco industry’s falsehoods about cigarette smoke to ongoing efforts by climate change deniers to block climate action, have also disproportionately harmed marginalized communities, especially BIPOC communities.

Underestimating disinformation can compromise our health, safety, and well-being

An investigative report showed that “Team Jorge,” a hacking and disinformation firm, offers a range of services designed to disrupt democratic processes, including voter suppression and election-undermining operations. The firm has been involved in disrupting democratic processes around the world.

Disinformation spreaders often exploit the illusory truth effect (a phenomenon in which repeated information comes to be perceived as true even when people know it is false) along with intelligence about local community dynamics, such as rising social isolation (an epidemic in many countries around the world). A spreader of disinformation often uses software tools and services (such as those mentioned above) to repeat false information quickly and at scale, until people begin to accept it as truth. The Holocaust, during which over 17 million Jewish, LGBTQ+, Roma, and other people were systematically imprisoned in concentration camps, enslaved, experimented upon, and/or killed, resulted in part from the normalization of repeated false information.

According to the U.S. State Department, “disinformation is a quick and fairly cheap way to destabilize societies” and potentially set the stage for disruptive and antidemocratic action, such as paramilitary or military action. Underestimating the impact of disinformation can have long-lasting consequences for human health and wellbeing, as the January 6, 2021, insurrection at the U.S. Capitol Building, the COVID-19 infodemic, and other harms caused by the spread of disinformation have shown.

Therefore, mislabeling disinformation as misinformation is not to our advantage.

What can you do to neutralize the impact of disinformation?

The Union of Concerned Scientists has developed several tools to help counter disinformation.

Social isolation is detrimental to human beings. You can do your part to counter the spread of disinformation and overcome social isolation by reaching out to at least three people you know and sharing these resources on countering disinformation.

Relational organizing, which means talking to people you already know, has been shown to be the most effective strategy for mobilizing people to implement solutions. Solving the problem of rapidly spreading disinformation in our communities takes people like you and me working to keep the people we know safe.

Optimizing relational organizing skills involves structured organizing conversations. The steps of a structured organizing conversation to counter disinformation, adapted from Jane McAlevey’s No Shortcuts: Organizing for Power in the New Gilded Age, include:

1) initiate by stating why the relationship with the person influenced by disinformation is important to you;

2) relate by asking questions and listening to understand why the disinformation is concerning to the person influenced by it;

3) educate by giving factual information about the concerns raised by the person influenced by disinformation;

4) agitate by asking why some people seek to divide, distract, delay, and/or demoralize others to keep them from organizing for solutions, and how things will be better if people like the person influenced by disinformation organize with others to implement solutions;

5) inoculate by setting clear expectations about the division, distraction, delay, and demoralization strategies and tactics of disinformation spreaders, and by giving factual, evidence-based reasons for remaining united, focused, quick to sustain pressure on oppressors, and committed to building community and organized people power; and

6) activate by asking the person influenced by disinformation if they can have structured organizing conversations with other people influenced by disinformation.

If you need assistance with having structured organizing conversations to counter disinformation, you can reach out to me, Dr. Sophia Marjanovic, at [email protected].