There’s been a lot in the news and a considerable amount of discussion since my post last week about the Italian scientists convicted of manslaughter and sentenced to jail in connection with the April 2009 earthquake that killed more than 300 people in the Italian city of L’Aquila. And like many situations with extraordinary complexity, some of the reporting has been incomplete.
I’d like to share what we’ve learned in the interim that casts additional light on the situation, and some lessons that we may take away from this tragedy when it comes to ensuring that scientists continue to share their expertise with the public and that governments are able to effectively communicate about and manage low-probability, high-risk situations.
The conditions before the big earthquake
In L’Aquila and surrounding areas, earthquakes are relatively common. The people who live there are used to them. But two factors were significantly elevating residents’ concerns.
The first was a marked increase in seismic activity in the region.
The second was a rogue local technician who was creating panic by making false predictions about exactly when and where an earthquake would occur and how severe it would be. (After the big earthquake, he portrayed himself as a victim, even though he had been wrong many times before, he was wrong again in the case of the L’Aquila earthquake, and it is widely recognized that his methods were not scientifically credible.)
In the wake of these two unsettling factors, people understandably wanted answers. The government called in a panel of scientists, ostensibly to share with the public current scientific knowledge about the risk of earthquakes. But subsequently, it has become fairly clear that the priority for some government officials was to take advantage of the panelists to reassure the public, regardless of the science.
A panel set up to fail
From the beginning, things got messy, and public officials, scientists, and the media made mistakes. There were both scientific and political goals for the panel, and these were in conflict. Some wanted to use the commission to share scientific knowledge. Others wanted to use it to reassure the public no matter what the science said. You can’t usually have it both ways.
The panel made at least four critical errors in how it presented science to the public:
First, there were scientific errors. The panel said publicly that the swarm of tremors had no effect on the likelihood of a larger quake. This left the public with a belief that the tremors they had felt did not suggest a greater risk. According to a commission convened in the wake of the disaster, a swarm can marginally increase the chance of a larger quake. Even if the difference was slight, it should have been recognized.
Second, there were problems with public communication. Many people took away the message from the panel that there was absolutely no chance of a larger earthquake, when in reality there was, of course, a very small chance. A representative of the commission—a political appointee—told a journalist that people should relax and go have a glass of wine. More broadly, it seems from published accounts that the scientists and government officials had no plan for when, how, and on what basis to communicate advice to the public. When the earthquake happened, people felt they had been lied to. That feeling lingers: people lost loved ones, and the physical and emotional effects of the L’Aquila tragedy will be felt for a long time.
Third, the panelists failed to take into account the cultural practices of the people they were supposed to help protect. In the past, when residents felt nervous about seismic activity, they would sleep in their cars or sleep outside. But the panelists assumed there were only two options: Evacuate, or stay put. They didn’t realize that the people of L’Aquila had been embracing an alternative for years. And in this case, there was no downside to being more cautious. If the quake had not occurred, there would have been no harm done; with an earthquake, some lives might have been spared.
Finally, the panel didn’t adequately think about risk to people. Even if the chances of a quake were low, the consequences were severe. Instead, the scientists seem to have reported on the probability that a quake would occur. That’s not the same as risk (which is probability multiplied by consequences).
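The distinction between probability and risk is simple arithmetic, and a minimal sketch makes it concrete. The numbers below are purely hypothetical, chosen for illustration; they are not actual L’Aquila figures.

```python
def risk(probability, consequence):
    """Risk as expected loss: probability multiplied by consequence."""
    return probability * consequence

# Hypothetical numbers: a low-probability, high-consequence event
# can carry far more risk than a near-certain but minor one.
major_quake = risk(0.02, 10_000)   # 2% chance of 10,000 units of loss
minor_tremor = risk(0.90, 100)     # 90% chance of 100 units of loss

print(major_quake)   # 200.0
print(minor_tremor)  # 90.0
```

By that measure, the rare quake outweighs the near-certain tremor, which is precisely the calculation a probability-only report leaves out.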
An appropriate response
Given these details, should the scientists be held criminally responsible? My personal opinion: absolutely not. The government put these scientists in an incredibly difficult position. It’s almost as if they were set up to fail. The question that they were asked—will an earthquake hit, and if so, when—cannot be answered with any credible degree of certainty.
University of Southern California earth scientist Tom Jordan spoke about this context, in which the commission members were more or less forced to rebut the irresponsible predictions of the local technician who was causing panic. They “got trapped into a conversation with a yes/no answer,” he said to the Economist. As a result, people were left with the impression that no earthquake was imminent.
The members of the panel were trying to balance two extremely complicated tasks—understanding how the swarm of tremors changed the probability of a larger quake, and communicating the details and uncertainties of that scientific knowledge to the public in a comprehensible way. It is not reasonable to hold them criminally liable for failing to strike that balance exactly right.
The commission and the government officials who supported them did not adequately prepare for this task, letting a difficult situation spiral out of control and allowing inaccurate messages to reach the public. The response in Italy should be a move toward reforming how the government and scientists communicate risk, not punishing these scientists for failing at the impossible job the government gave them.
The immediate problem is what this case does for Italy’s ability to prepare for and respond to future disasters. The sentencing is already having a chilling effect on the willingness of scientists to work with the government and offer their technical expertise in the future. Prominent scientists in Italy have withdrawn from government advisory panels, not feeling comfortable giving advice when the wrong advice could result in manslaughter convictions.
This raises questions for the United States, too. Should we put engineers in the Army Corps of Engineers in jail because of construction problems with the levees around New Orleans during Hurricane Katrina? Should we jail meteorologists who underestimate the possibility of a deadly tornado when people die as a result?
Trying not to cry wolf
I’m at my home today in Washington, D.C., due to Hurricane Sandy, checking my drains from time to time. The government shut down, as did our public transportation system, and the forecasters are telling me it will soon be dangerous to go outside. I respect their advice. But what if they were wrong, several times in a row, and it wasn’t a big deal? I might take their advice less seriously. I might ignore it. And I might be put in danger as a result.
Communicators of scientific risk on all kinds of issues—from weather events to toxic chemicals to drugs—walk a fine line. They can’t be too alarmist, or people won’t pay attention. They can’t be too cavalier, or people will take unnecessary chances and may get hurt.
And what if the risks don’t materialize? Consider if the Italian commission had warned citizens that there was a chance of an earthquake and urged them to move out immediately until all buildings could be adequately stabilized. There would be significant costs associated with such a move and with the infrastructure improvements. If no earthquake followed, should the scientists be subjected to a civil suit for the seemingly unnecessary cost and inconvenience?
Science drives innovation and helps us anticipate risk. Scientists should be eager to share their work with the public, especially when it comes to hazards like earthquakes, diseases, environmental damage, and other areas of public concern.
If we end up with a situation where scientists pull back from public commissions, we all lose. We are robbed of their expertise. And we are less able to address the challenges in the world around us.
What can we learn?
What we need to do now is prevent this sort of tragedy from happening in the future. The solution to what happened in Italy is reform, not punishment.
Public commissions and other official bodies at all levels should have clear guidelines for communicating risk in low-probability, high-risk situations. Scientists should have adequate training in communicating with the public and the press. The media should better evaluate which individuals and entities are credible and which are not. And the entire process should be transparent, so that people can have trust in the process and these sorts of panels are less vulnerable to misuse.
Scientists need to be clear about what they do and don’t know and politicians need to respect their statements and not twist them for political purposes. Scientists rarely claim absolute certainty in a topic, but politicians often do. This is what sometimes creates tension between science and politics, and gets both scientists and politicians in trouble.
The solutions to this challenge aren’t easy to come by. But what happened in L’Aquila reminds us that the stakes are high enough to deserve considered attention, not knee-jerk reactions.