In April 2009, a powerful earthquake devastated the city of L’Aquila, Italy, and surrounding villages. The quake had been preceded by a “swarm” of tremors, which many townspeople interpreted as a warning; following ancient folk tradition, they left their homes and slept in the fields when the tremors were bad. In an effort to calm local anxiety – “outrage,” in my lexicon – officials asked seven members of Italy’s National Commission for the Forecast and Prevention of Major Risks to come to L’Aquila, meet to assess the evidence, and tell the public what they’d concluded. Officials hoped the panel would conclude, and announce, that there was no reason to think a quake was imminent.
The panel members had their meeting, then most of them went home, leaving two panelists to join with local officials in a news conference. What was said at the news conference sounded far more reassuring than what the panelists had said to each other – seguing in effect (these are paraphrases, not quotes) from “swarms of tremors don’t necessarily signal an imminent major quake” (true) to “there’s virtually no risk and people should stop worrying and go about their business” (falsely over-reassuring). Thus reassured, many local residents remained in their homes during the tremors instead of taking refuge in the fields, and were injured or killed when the quake came.
As a result, six scientists and one government official were indicted and tried for manslaughter.
Jody and I were going to write about this fascinating case – a unique example of a government prosecuting bad risk communication as manslaughter! – when we read a superb September 2011 article by David Ropeik that said most of what we had been planning to say. So we went on to other topics.
Then on October 16, 2012, as the defendants awaited a decision, science journalist Anna Meldolesi of Corriere della Sera (an Italian daily newspaper) wrote and asked for our comments. Again on October 22, after they were sentenced to six years in prison, she asked if we had anything to add.
Some of our original comments made it into Anna’s October 22 story about the sentences (and the overall controversy); our follow-up comments arrived too late.
October 16 email from Peter M. Sandman and Jody Lanard to Anna Meldolesi
The following are some reactions to the L’Aquila trial from a risk communication perspective.
1. Communicating uncertainty
One core problem underlying this tragedy is that scientists (and experts generally) are very poor at communicating uncertainty to the public. When talking to each other, they tend to stress what they don’t know (“further research is needed”) – but when talking to the public they all too often sound like they know more than they do.
The essence of acknowledging uncertainty is to try to reproduce in the mind of your audience the level of confidence you actually feel – a range from near-certainty (“science is never absolutely certain, but experts agree the overwhelming probability is…”) to near-ignorance (“very little is known about this issue, but if we had to make a guess…”). The experts who gathered in L’Aquila to assess whether the swarm of tremors signaled that a major earthquake might be imminent ended up giving the misimpression that they had scientific grounds for a definitive-sounding “no.” They didn’t; they knew they didn’t; they shouldn’t have let it sound like they did.
All they had grounds for was a reasonable, probabilistic “probably not.” But that isn’t the impression the panel gave the public – neither the panelists who said overconfident things nor those who remained silent while others did so. Whether through their words or through their silence, the scientists and officials at the meeting all colluded in supporting the misleading and over-reassuring public statements of panelist Bernardo De Bernardinis and L’Aquila Mayor Massimo Cialente.
Now they all want to say that “of course we always said we can’t predict earthquakes,” and that is literally true. But a couple of them predicted that a specific earthquake (in L’Aquila, and soon) would not happen, and the rest let the prediction stand. If you can’t predict an earthquake, you can’t predict not-an-earthquake either.
They also made the terrible mistake of sounding like their scientific knowledge was so advanced, and so conclusive, that it should overrule the folk wisdom derived from hundreds of years of living along a fault line. That’s not just a failure to communicate scientific uncertainty; it’s also a demonstration of contempt for laypeople’s experience and local practices.
Even acknowledging uncertainty is often insufficient. Both journalists and their readers and viewers tend to disregard nuance. Everyone would rather the experts were sure, so everyone tends to overlook the parenthetical phrases in which the experts say they’re not sure, dismissing those phrases (if they notice them at all) as just CYA – self-protection in case the experts turn out wrong. To get the job done, experts have to proclaim their uncertainty, insisting aggressively that they’re not at all sure about the things they’re not sure about.
The ideal message, grounded not just in hindsight but in risk communication theory, would have been something like this:
There isn’t any scientific basis for concluding that a major earthquake is much likelier in the wake of all these tremors than at other times. But neither is there strong science proving that an earthquake won’t happen soon. Sooner or later there will probably be another major earthquake here, but we simply cannot predict when – or when not.
We’re sorry to offer people so little guidance, but the truth is we don’t know whether the swarm of tremors is grounds for concern or not. Usually, swarms are not followed by large quakes. But “usually” isn’t “always.” We certainly understand why many in this community feel safer leaving their homes when the tremors start, and we have no science that says they’re foolish to do so.
It’s worth noting that when experts sound more confident than they actually are, the error is far likelier to be in a reassuring direction than in an alarming direction. There are exceptions; experts working for activist groups, for example, tend to be overconfidently alarming. But overconfident over-reassurance is the norm, especially among officials. This tendency was exacerbated in L’Aquila. The panel was brought together in the first place largely as a public relations exercise, aimed at reassuring the populace. The panel was expected – in effect it was asked – to be overconfidently over-reassuring.
Given that goal, a politically canny expert might wisely have declined to come to L’Aquila at all. And a scientifically scrupulous expert would have insisted on proclaiming the group’s uncertainty, and therefore its inability to provide the reassurance local government leaders were seeking.
2. “Speak with one voice”
One of the worst pieces of advice frequently tendered in crisis situations is that experts and officials should “speak with one voice” – that is, keep the public from realizing that there is a range of expert opinion on the issue at hand.
The theory is that ordinary people under stress will fall apart unless the experts and officials stand together. It’s certainly true that publics are happier when the experts agree. But when the experts pretend to agree, at least three horrible results are likely:
- The experts who are “forced” to suppress their opinion and go along with the group may leak, journalistically or psychologically or both. Rumors of internal dissension and passive-aggressive behavior from those who lost the internal fight are far more detrimental to public confidence than candid acknowledgment of the range of expert opinion.
- Even if nobody leaks, a lot of people – including most journalists – are automatically suspicious of claims of unanimity, preferring to hear a range of views before making up their minds (so they can see where the real consensus is and what the points of disagreement are). Ironically, the experts’ effort to speak with one voice often gives credibility to non-experts with fringe opinions, since they become the only alternative voices to be heard.
- When experts speak with one voice, the odds of them turning out wrong are far higher than when they let the full range of opinions be expressed.
For all three reasons, a “chorus without soloists” isn’t just bad science; it’s also bad risk communication.
In the case of L’Aquila, a fringe opinion that a major earthquake was almost certainly imminent launched the process. The expert panel was brought to town to rebut this false near-certainty. We suspect the experts and officials who participated were, in part, experiencing their own outrage at the fringe technician with the big mouth. So they went too far in the other direction. They ended up rebutting the fringe guy with another false near-certainty – predicting that a major earthquake was almost certainly not imminent.
It doesn’t look like most of the panel members meant that to happen. But they let it happen. They colluded. Before the panel convened in L’Aquila, panelist De Bernardinis gave a television interview in which he claimed that a swarm of tremors discharges accumulated energy from the fault and thus actually decreases the chances of a major earthquake. Though this sounded sensible, the other panelists knew it was false – that is, they knew there was little if any science to support it. But they let it stand unrebutted.
After the meeting adjourned and most of its members had started their journeys back home, two panelists joined with local officials for a news conference. That news conference gave townspeople the same misimpression De Bernardinis had given them: that they had nothing to fear.
We have served on expert panels whose conclusions were misrepresented in post-meeting news conferences. It’s very difficult to get much media attention for a “that’s not what we decided” disclaimer – and if you succeed (if you even try), you look like a troublemaker. There are strong disincentives to belatedly breaking the speak-with-one-voice conspiracy.
But the L’Aquila panelists can’t credibly claim to have been surprised by the tenor of the news conference. They knew the meeting had been called in order to calm the populace, and at least some of them knew what De Bernardinis had said before the meeting even started.
Ideally, they would have all stayed for the news conference to make sure it didn’t end up sounding overconfident, over-reassuring, and falsely unanimous. Failing that, they should have collaborated on a news release and a set of talking points. If the talking points had been suitably uncertain, and had explicitly rebutted De Bernardinis’s insupportable claim, the news conference would have left a very different impression, the townspeople wouldn’t have felt relieved of their earthquake worries, and this trial would never have happened.
We don’t know why the panelists failed to produce such a document, but we can guess. A scientifically accurate summary of the meeting’s conclusions wouldn’t have met local officials’ desire that they say something calming. Since they couldn’t honestly say what local officials wanted them to say, most of the panelists – we’re guessing – washed their hands of the whole affair and headed home, leaving behind a couple of panelists who were willing to give local officials what they wanted. When the rest learned what had been said at the news conference, we doubt they were especially shocked; they were just glad not to have been the ones to say it … or to make trouble by publicly challenging it.
The earthquake shocked them, of course. It was, after all, a very low-probability event. And the indictments and trial shocked them too. But the news conference was a predictable outcome of their willingness to speak with one voice, even though the voice was predictably overconfident and over-reassuring.
These two scientific sins – the failure to communicate uncertainty and the willingness to hide disagreement – are both extremely common. They are culpable, though we very much doubt they’re manslaughter.
3. The role of outrage
We have already suggested that in L’Aquila the outrage of experts and officials at an unjustifiably alarming prediction propelled them in the direction of an unjustifiably reassuring one.
This too is extremely common. More often than not, the most extreme public statements of scientists are efforts to rebut even more extreme statements from non-scientists. (We see this often in fights over vaccine effectiveness and vaccine safety.) But science is unwise when it fights fire with fire; it should fight fire with science.
We also want to comment on another outrage-driven knee-jerk reaction on the part of scientists – one that shows scientists are just like everyone else when they are angry or indignant.
On June 29, 2010, Alan Leshner, Executive Publisher of the journal Science and CEO of the American Association for the Advancement of Science, wrote to the President of Italy, protesting that the “basis for those indictments appears to be that the scientists failed to alert the population of L’Aquila of an impending earthquake.” Perhaps Leshner and the 5,200 other scientists who signed a petition against the indictments had not bothered to read the actual charge by the prosecutor, which was not about the panel’s failure to predict an imminent earthquake but rather about its success in deploying bad science to over-reassure the people of L’Aquila that an earthquake was not imminent. But at least some of the 5,200, we suspect, had read the charge and were nonetheless comfortable misrepresenting its content in their zeal to defend their fellow scientists.
Among the risk communication lessons of this appalling case is the extent to which outrage can get in the way of dealing straight with the facts.
October 22 email from Peter M. Sandman and Jody Lanard to Anna Meldolesi
Judging from the media coverage we have seen in both the U.S. and Italy, reporters and the public are getting the misimpression that the defendants were sentenced to six years in prison for failing to predict an earthquake.
In fact, the defendants’ crime was providing false reassurance. They were found guilty of horribly bad risk communication, not of any technical mistake in earthquake prediction.
Horribly bad risk communication is all too common, and it isn’t usually prosecuted as manslaughter. But perhaps this successful prosecution will help alert scientists to their obligation to inform the public candidly about uncertain risks, instead of giving in to the temptation to over-reassure.
Copyright © 2012 by Peter M. Sandman and Jody Lanard