The Ill-logic of Alternative Facts (sic)

Sandra D. Mitchell, UCS | June 7, 2017, 10:58 am EDT

Philosophers of science are always on the lookout for the logic underlying the successful practices of the scientific community. For us, that is a window into epistemology more generally: how humans manage to acquire knowledge of nature. The recent surge of “alternative facts,” “fake news,” and claims that accepted science is a “hoax” propagated by some conspiracy is not just disturbing; it threatens to undermine the hard-won authority of scientific facts. What’s going on, logically speaking, beneath the surface of these attacks?

Webinar: Scientific Facts vs. Alternative Facts

How can we understand and respond to “alternative facts” when they are presented as having equal value to scientific facts? The UCS Center for Science and Democracy joined with the Philosophy of Science Association on June 7, 2017 to present a webinar investigating the differences between scientific facts and so-called alternative facts.


The phrase “alternative facts” was introduced by Kellyanne Conway to describe false claims by Sean Spicer about the number of people who attended Trump’s inauguration. While we might agree with Chuck Todd that “alternative facts are lies,” for a philosopher it is important to understand how they work in order to know how to respond to the challenge they present to legitimate facts. Appeals to “alternative facts” reveal a pattern of reasoning that is in stark contrast to the ways in which scientific facts are supported. What’s the difference?

Science comprises a set of practices that generate our most accurate views of what nature is like. That is why we appeal to scientific results to guide our choices of what materials to use to build a bridge or what drugs to take to treat a disease. Humans have their limitations: our first impressions are often wrong, and our in-house perceptual and cognitive abilities are not as acute or unbiased as what we can get by outsourcing to computers, microscopes, telescopes, or spectroscopes. The natural conditions we initially confront may obscure causes and confounding influences, and so science crafts experiments that strip away the clutter to expose the main effects, the most relevant variables, the predictive features.

The justification of the results of science is a community affair, founded on critical examination by replication, peer review, and multiple forms of checking structured by the assumption that any fact, data, explanation, hypothesis or theory might well be false or only an approximation of the truth. Science works because it is rigorous in these ways, and that’s what warrants its authority to speak truth to power (or to wishful thinking, or to non-empirically supported beliefs).

The rigorous practices of the scientific community are founded on the most objective procedures humans can implement. The life history of a scientific fact might begin with a hypothesis, or a hunch, or a new application of a well-accepted theory, but to mature into a fact it must pass through the gauntlet of experiment, replication, critical challenge and scientific community skepticism. The logic of accepting a scientific fact goes as follows: If there is good, reliable evidence for it, then it will be accepted (as long as there is not better evidence for a different claim). Its persistence as an accepted fact is not guaranteed, however, as new challenges must be survived when new data, new ideas, or new technologies suggest refinements or adjustments.

Alternative facts follow a different course. They might also begin as a yet-unsupported hypothesis of how things are—how large a crowd might be, how humans might not be causing climate change. But then the life history looks very different. Rather than appealing to objective means of determining whether the world matches the hypothesis, purveyors of alternative facts instead consult their ideological, economic, or political interests. Non-objective procedures kick in to cherry-pick data, appealing only to what supports the hypothesis, ignoring or debunking data that contradicts it. The critical scrutiny of the scientific community is replaced by the sycophantic agreement of those who share ideological, economic, or political interests (e.g., “people are saying….”).

The ill-logic of accepting an alternative fact (sic) goes like this. If the hypothesis conforms to one’s interests, accept it as a fact and barricade it from any impugning evidence. If there is some isolated evidence that supports it, treat that evidence as definitively confirming. If there is evidence that contradicts it, ignore, debunk, or deny that evidence. If others who share the same interests voice support for the hypothesis, treat that community as a justifying consensus that the world is the way that group wants it to be.

In short, alternative-fact logic replaces evidence of how nature is with personal preferences for how I want the world to be. Data from experiment or observation, and survival of critical challenges by replication, meta-analysis and peer review, are replaced by what “fact” would be best to increase profits (smoking isn’t addictive), or reduce the need for regulation (CO2 is not a major cause of climate change), or bolster some ideology (most Syrian refugees are young men).

By misappropriating the language of “fact,” this practice undermines the authority of science to speak for nature. Policies that should answer to the facts are no longer constrained by the non-partisan procedures of testing and critical challenge. Instead they are guided purely by partisan interests.  The acceptance of scientific facts is not determined by how we want the world to be. The acceptance of alternative facts is determined exclusively by those preferences.

The consequences of treating “alternative facts” on a par with scientific facts can be dire. The claim that the measles, mumps, rubella (MMR) vaccine can cause autism was proposed in 1998 by Andrew Wakefield, a UK doctor, reportedly based on faulty analysis and a financial conflict of interest. Wakefield had developed his own measles vaccine and was funded by those suing the producers of MMR. His paper was later retracted and his medical license revoked, but his “alternative fact” continues to be promoted and believed.

Dozens of scientific studies have shown no relationship between MMR and autism, but do show that the vaccine is 93–97% effective at preventing measles. In the decade prior to the introduction of the vaccine in the US in 1963, millions contracted the disease, and an estimated 400 to 500 people died from measles each year. By 2000 measles was no longer endemic in the US. One study estimates that between 1994 and 2013, 70 million cases of measles and 57,000 deaths were prevented by the vaccine. In recent years there has been a rise in measles in the US, with the majority of cases occurring in unvaccinated individuals. In 2014, 85% of those who got measles had declined vaccination due to religious, philosophical, or personal objections.

People may choose what they want to believe, but they do not get to choose the consequences of those beliefs. Because measles is so highly contagious, 90–95% of a population must be immune to protect those who are vulnerable (too young or too medically compromised to be vaccinated). Relying on “alternative facts” about measles vaccines by even a small percentage of a community can have harmful effects on those who cannot choose.

By exposing the underlying logic of defenses of “alternative facts” we can move beyond the standoff (that’s your fact, this is my fact) to a conversation about what counts as evidence, and how it contributes to what we should believe about nature. Do you really want the pill you take for hypertension to be the one that most increases profits, rather than the one that is most effective and has the fewest side effects?


Sandra D. Mitchell is professor and chair of the Department of History and Philosophy of Science at the University of Pittsburgh and is president of the Philosophy of Science Association. She teaches courses on philosophy of biology, the epistemology of experimental practices, morality and medicine, and practices of modeling in science. Her research has focused on the implications of scientific explanations of complex systems for our assumptions about nature, knowledge, and the ways to use knowledge in policy. She is the author of Unsimple Truths: Science, Complexity and Policy (2009).

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Posted in: Science and Democracy, Science Communication

Support from UCS members makes work like this possible. Will you join us? Help UCS advance independent science for a healthy environment and a safer world.
