On Sunday, the New York Times published an op-ed from Paul Thacker, a former Senate staffer who is critical of UCS’s efforts to protect scientists from harassment. Unfortunately, he misrepresents our work, as he did previously in a PLOS Biology op-ed that was ultimately retracted (to our surprise, while we were corresponding with an editor about corrections to the piece).
Further, in The Times, he conflates different types of requests and demands for information. He offers no solution for balancing transparency and privacy, erroneously suggesting that the harassment of scientists is a necessary evil of public accountability in science. Needless to say, we disagree.
This post explains (again) our position on disclosure, details just a few examples of where the op-ed went wrong, and suggests a more constructive way to frame the discussion. You should also check out an excellent critique from my former UCS colleague Aaron Huertas and this piece from the Climate Science Legal Defense Fund.
The harassment of scientists
My own interest in the use of excessive scrutiny to harass scientists began when emails were stolen from the UK’s University of East Anglia and misrepresented by climate contrarians to cast doubt on established climate science. The manufactured controversy duped many journalists, policymakers, and even some scientists who were slow to defend their colleagues.
Then Virginia Attorney General Ken Cuccinelli misused his authority to subpoena the private correspondence of University of Virginia climate scientists. He lost in the state Supreme Court. A coal industry-funded group subsequently tried to access the same records through the state’s open records law. They also lost in the Virginia Supreme Court, and later lost a similar effort in an Arizona superior court. Relatedly, the West Virginia Supreme Court last year ruled in favor of West Virginia University’s attempt to protect the deliberations of one of its scientists who studied mountaintop removal mining.
Through all of this legal maneuvering it became crystal clear how important it is for scientists to be able to develop ideas without fear that a few words could be taken out of context to unfairly cast doubt on their integrity. And why have courts, universities, and government science agencies consistently pushed back against universal disclosure?
“Compelled disclosure of their unpublished thoughts, data, and personal scholarly communications would mean a fundamental disruption of the norms and expectations which have enabled research to flourish at the great public institutions for over a century,” explained University of Virginia Provost John Simon.
Most recently, House Science Committee Chairman Lamar Smith used a congressional subpoena to attempt to access the internal deliberations of scientists who published an important climate change paper he didn’t like, with an ever-shifting series of explanations for his actions. Note that a subpoena is not, as Thacker suggests, a mere request; it is a demand that can carry criminal consequences.
The chairman may have the authority to issue subpoenas. That’s a question for the lawyers. But if he is successful, the effect is that scientists—and not just in government—will fear future subpoenas should their results not suit the worldview of a powerful member of Congress. That chills scientific discourse.
Many hundreds of scientists urged NOAA to resist the subpoena. And contrary to Thacker’s claim, it wasn’t just “some scientists” who expressed concern about Chairman Smith’s actions—it was also AAAS and the major scientific societies that represent climate scientists. In response, the chairman temporarily pulled back on his demands, “prioritizing” communications among political appointees, disclosure we have never objected to.
What kind of disclosure are we talking about?
As the attacks on climate scientists unfolded, we started hearing about similar instances of harassment in diverse academic disciplines. I wrote a report called Freedom to Bully detailing numerous examples where corporations, front groups, and activists of all political stripes have misused open records laws and subpoenas to intimidate academics and government researchers. The breadth and depth were shocking.
An ecologist who spent months responding to an open records request for thousands of documents related to her research into pollution in the Great Lakes. An epidemiologist who studies the public health impacts of hog waste and was threatened with criminal charges for failing to disclose the identities and personal health histories of people he interviewed for a study. Tobacco researchers who were ordered to give up the names and phone numbers of six-year-old children to tobacco companies.
Would a reasonable person really argue that this is the best system we can come up with?
And here’s the critical distinction: in none of these cases—not one—were the requesters solely seeking scientific data or methodology. That’s because data and research methods are made public when papers are published. Other scientists use this information to determine whether a published study is solid or lacks rigor, and scientific papers that are found to be fraudulent can be withdrawn. That’s how science works.
Thus, Thacker’s argument is hollow when he says that “the harassment argument should not be used as an excuse to bar access to scientific research,” because access to the research is not precluded. To support his argument, he cites a number of examples of inappropriate political influence on science that have nothing to do with access to data and methodology: censorship of NOAA scientists during the Bush administration, Coke funding of a now-disbanded organization that downplayed the health impacts of sugar, and an agreement between the Harvard-Smithsonian Center for Astrophysics and a fossil fuel company that allowed the company to approve a researcher’s papers.
It’s a classic bait and switch. Nobody—not UCS, not any credible science advocate—argues that access to scientific data and methodology should be off limits (except in narrow circumstances such as patient privacy or national security), especially when it is publicly funded. And many of us argue that we should be able to see documents that show financial relationships and any strings attached to those relationships.
“If a university researcher accepts NSF, NIH or other government funds, they are obligated to openly publish their research,” wrote one NYT commenter. “This is the standard channel for such results, not e-mail. E-mail messages while research is underway are usually unreliable, informal, loaded with intermediate, often erroneous ideas, and so forth. Frankly, I wouldn’t give two cents or any credibility to such ‘results.’ Industrial researchers — even when funded by the government — are not expected to ever release e-mails except under lawful subpoena. Why should university researchers be treated differently?”
A sledgehammer or a scalpel?
Just days before we released Freedom to Bully in February 2015, a new fault line opened: an anti-GMO group called U.S. Right to Know requested wide swaths of correspondence among genetic engineering researchers. Again, our argument isn’t about whether open records requests are appropriate; it’s about their scope.
When the group asked us directly via email, we suggested that more narrowly tailored requests would be better. We said that the group’s requests constituted harassment because they were so broad that they amounted to fishing expeditions.
The group has since proved this point, publishing emails on its website framed deceptively to intimate that reporters were taking money to write favorable stories about genetically modified food (they were not), and shopping around other manufactured controversies (along with, to be fair, some material that should already have been disclosed).
In his op-ed, Thacker contends that without the broad U.S. Right to Know requests, there would have been no way to know about previously undisclosed relationships between researchers and industry. Really? Some journalists filed narrower requests and ended up with more than enough material.
For example, Eric Lipton, also writing for The New York Times, penned an article based on his own requests that examined the funding of academics’ research and public speaking by both GMO companies and the organic food industry. Important information that should be made public did surface through the anti-GMO group’s requests, but that doesn’t mean the requests weren’t too broad, or that the content of some of the emails hasn’t been misused.
Disclosure exemptions are important but should have limits
“Some of what we know about abusive practices in science,” writes Thacker, “has come from reading scientists’ emails.” Should universities or government institutions that employ scientists be exempt from open records laws? Certainly not. Should all scientists’ emails be protected from public view? No way. Should we ensure that disclosure standards lead to accountability? Absolutely.
Anyone who reads the Freedom to Bully report in full, along with our many subsequent articles, will find that we consistently argue against overly broad exemptions to open records laws—a fact Thacker refuses to acknowledge. While we supported the university in the Virginia case, the report prominently references an amicus brief filed against the university by media organizations making that very point. In fact, we invited the author of that brief to speak at a UCS-organized AAAS session on open records laws in 2015.
UCS regularly uses open records laws to learn more about inappropriate influence on and interference with government science and scientists. For example, a recent UCS FOIA request found that government scientists disagreed with the U.S. Fish and Wildlife Service’s decision not to list the wolverine under the Endangered Species Act. Numerous requests formed the basis of our multiple assessments of government media policies.
“Scientists who profess agreement with transparency only when it is on their terms are really not for transparency at all,” Thacker concludes. “The public should be alarmed.”
But the suggestion that scientists are hypocrites for supporting transparency while opposing absolute disclosure does not hold water. As Aaron Huertas writes (his emphasis):
Scientists would argue that the public should be alarmed when politicians and advocates attempt to stymie scientific research they don’t like. The argument scientists and scientific societies have made, repeatedly, is that there is a public interest in disclosure and a public interest in protecting scientists from political interference and harassment. Thacker only acknowledges the former point, arguing that harassment is the price worth paying for fuller transparency.
Do we know where the line is? Not yet. And that’s the challenge our society is grappling with. We have plenty of work to do to increase transparency in science and rid it of inappropriate influence. But that doesn’t mean we should scan every handwritten note, record every phone call, or publish every email on a website.
Where should we go from here?
For scientists, the best defense against attacks is proactive disclosure of anything that could create a real or perceived conflict of interest, especially for researchers who work on high-profile or contentious issues. But researchers receive woefully inadequate guidance on what constitutes responsible disclosure. Often, their mistakes stem from ignorance or carelessness rather than any attempt to hide the truth.
So I will repeat what I wrote in October:
Together, we need to develop common disclosure standards and incentives to adopt them. The best way to avoid these costly and distracting fights is to agree on what should be disclosed and what should be kept private and develop mechanisms to encourage these standards to be embraced. This would put all researchers—public and private—on more equal footing. I think that scientists, journalists, corporations, and universities could come up with a common framework. Then, all institutions that receive government grants (such as those that come from the National Science Foundation) could be compelled to comply with that framework as a condition of receiving those grants. There are probably other enforcement mechanisms worth considering, too.
A more thoughtful balance between academic freedom and accountability will lead to better public understanding of science and policy outcomes that are more in line with the public interest. In the meantime, scientists who work on contentious issues should be prepared for all kinds of scrutiny, both justified and unjustified. Here’s a guide that helps scientists think through these challenges.
I’m happy to provide more context for any journalist who wants to explore these matters in detail.