In a recent article published in Nature, William Sutherland and his colleagues give 20 tips for interpreting scientific claims and suggest educating policy makers on “the imperfect nature of science”. Their tips are useful, and perhaps should sit as cautionary reminders on the desk of every scientist before stepping forward to provide advice. But the tips can also mislead policy makers into failing to make a decision in the absence of unrealistic, and unattainable, absolute certainty.
As part of my research work in applied ecology I spent a fair amount of time working to better characterize uncertainty in modeling fish populations. I later became the Northeast Regional Administrator for Fisheries for the National Oceanic and Atmospheric Administration (NOAA), then Deputy Director of the National Marine Fisheries Service. I found that the decisions confronting me were much more about the consequences of action or, too often, of inaction. The scientific advice helped me focus on the actions that would have the greatest conservation impact.
Scientists need to step forward, use their training, and help make policy better than it would be without scientific input, which is a key goal of the Center for Science and Democracy at UCS. But frankly, I don’t think training policy makers in caveats that all scientists should already be aware of does much to strengthen the scientific grounding of public policy. Talking about uncertainty and “imperfections” in scientific advice, as the 20 tips do, is only part of the story.
Understanding Risks of Action – or Inaction
Uncertainty doesn’t inform policy options unless it is coupled with a greater understanding of what might be gained or lost if a particular result comes to pass. So, while the prediction of a future severe storm coming ashore in NJ may be highly uncertain, the potential losses are catastrophic. And from a policy perspective, as we discussed in a recent forum in NJ, decisions about how to rebuild affected communities and strengthen their ability to withstand future storms need to focus on the risk: the product of the probability of the event and the expected losses should it occur.
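To make that definition concrete, here is the simple arithmetic behind it, with purely hypothetical probabilities and losses chosen only for illustration:

```latex
% Risk as expected loss: the probability of an event multiplied by the loss if it occurs.
% All numbers below are hypothetical, used only to illustrate the comparison.
\[
\text{Risk} = P(\text{event}) \times \text{Loss if the event occurs}
\]
\[
\text{Rare severe storm:}\quad 0.05 \times \$50~\text{billion} = \$2.5~\text{billion in expected losses}
\]
\[
\text{Frequent minor flooding:}\quad 0.80 \times \$100~\text{million} = \$80~\text{million in expected losses}
\]
```

Even though the severe storm is far less likely in this illustration, its risk in the expected-loss sense is more than thirty times larger, which is why a highly uncertain forecast can still demand action.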
My colleague David Wright recently dubbed this explanation of risk “The Titanic Principle,” and applied it to nuclear waste.
In my own work in marine conservation, estimates of the current abundance of an endangered species are highly uncertain because data are often hard to come by. But failing to act, not putting protections in place, can result in extinction. So the advice most needed is on how to reduce that risk of extinction. Following the 20 tips may help a policy maker better understand how precisely we know the status of a species, but it will do little to inform decisions about mitigating the risks.
Uncertainty Doesn’t Help Make Decisions
And from my experience in the policy arena, spending a lot of time talking about uncertainty isn’t all that helpful. To be sure, most scientific advice contains important elements of uncertainty, and it is never appropriate to hide that uncertainty from policy makers or the public. But talking about uncertainty, sometimes with greater emphasis than the main points of the analysis, as I wrote about in a Nature essay some years ago, doesn’t actually give people any guidance about what to do.
If scientists emphasize the “imperfections” in advice, what is it that we are telling the public, or policy makers? Wait for more data? Ignore the science altogether? On most issues, there is no shortage of voices in the policy world proclaiming great certainty about their conclusions. How will science advice fare in this unequal environment?
Put another way, many people, including me, invest for retirement. I have neither the time nor inclination to become an expert in investment strategies and financial markets. Without a doubt, anyone who claims they know with certainty how investments will fare is blowing smoke, though many do. So I seek a credible financial advisor. I ask for a reasonable amount of information but don’t expect them to teach me their job. Then I make decisions that align with my own tolerance for risk.
As scientists and advisors, we need to provide guidance on scientific evidence and results, offered in light of our understanding of the imperfections of the analysis. Many factors come into play in societal decisions, so let’s not assume decisions are based solely on science. And then let’s help policy makers understand the consequences of action, and of inaction.