Experts possess information that non-experts need – and when the information concerns disaster the need is obviously urgent. This is equally true whether the disaster is ongoing (as at Chernobyl or Bhopal), or impending (as at Three Mile Island or Love Canal), or potential (as at hundreds of proposed and functioning nuclear power plants, chemical factories, hazardous waste facilities and so on). Perhaps even more than ongoing disasters, impending and potential disasters require sound decisions from non-technical people. Political leaders must take a stand; government agencies must inform people about safety regulations; citizens must decide to stay or to leave. At least in principle, these decisions should depend largely on technical information, especially technical information about risk.
The problem of explaining risk to non-experts is a difficult one, but there are techniques that can help the expert with the task. The following are some of the factors that contribute to successful communication.
Simplifying Risk
Even assuming a public that wants to understand and an expert who wants to be understood, risk information must still be simplified.
Deciding What To Leave Out
Insofar as possible, it is wise to simplify language rather than content – that is, take the extra words to make hard ideas clear. Unfortunately, neither the expert source nor the lay audience is usually willing to dedicate the time needed to convey complex information a step at a time. So inevitably simplification becomes a matter of deciding what information to leave out. Experts are famous for their conviction that no information may be left out; unable to tell all, they often wind up telling nothing.
In fact, there are three standard rules of thumb for popularizing technical content:
- Tell people what you have determined they ought to know – the answers to the questions they are asking, the instructions for coping with the crisis, whatever. This requires thinking through your information goals and your audience’s information needs, then resolutely keeping the stress where you have decided it should be.
- Add what people must know in order to understand and feel that they understand the information – whatever context or background is needed to prevent confusion or misunderstanding. The key here is to imagine where the audience is likely to go off-track, then provide the information that will prevent the error.
- Add enough qualifiers and structural guidelines to prepare people for what you are not telling them, so additional information later will not leave them feeling unprepared or misled. Partly this preparation is just a matter of sounding tentative; partly it is constructing a scaffolding of basic points on which people can hang the new details as they come in.
Applying these three rules isn’t easy, but it is a lot easier than trying to tell everything you know.
Explaining Risk Data
The hardest part of simplifying risk information is explaining the risk itself. This is hard not only because risk assessments are intrinsically complex and uncertain, but also because audiences cling tenaciously to their perception of the risk as “safe” or “dangerous.” One way to avoid such extremes is the trade-off: especially risk-benefit, but also risk-cost or risk-risk. But there is solid evidence that lay people resist this way of thinking; trading risks against benefits is offensive when the risk raises moral issues and the “victims” are not the ones making the choice. Another alternative to the safe/dangerous dichotomy is the risk comparison: X is more dangerous than Y and less dangerous than Z. It is important when using risk comparisons to ensure that the examples are appropriate to the issue at hand and are, in fact, “comparable”; for instance, risk means a lot more than mortality statistics, and comparing an involuntary risk like nuclear power to a voluntary one like smoking invariably irritates more than it enlightens.
Don’t expect too much. People can understand risk trade-offs, risk comparisons, and risk probabilities when they are carefully explained. But those who are frightened, angry, and powerless will resist the information that their risk is modest; those who are optimistic and overconfident will resist the information that their risk is substantial. Over the long haul, risk communication has more to do with fear, anger, powerlessness, optimism, and overconfidence than with finding ways to simplify complex information.
Simplifying for Journalists
Especially during a crisis, risk information reaches the public – and sometimes even the decision-maker – chiefly through the mass media. The problem of simplification is thus largely a matter of simplifying for journalists.
Technical sources often resent the journalist’s tendency to simplify – and resent even more the journalist’s effort to make them simplify, to push them toward unqualified safe-or-dangerous risk assessments. To be sure, oversimplification can mislead the audience and damage the reputation of the source. But journalists have no choice but to simplify what they cover.
If you refuse to simplify what you say, the reporter will try to do the job for you – at great risk to accuracy. Sources who insist on every detail provide little guidance to the non-technical journalist on which details are really essential; the result is typically an error-filled story, a grossly oversimplified story, or no story at all. Sources who insist on only a few technical points are far likelier to win their points. The most qualified person to simplify your views, in short, is you, the expert.
Experienced sources know that simplification is the key to getting their message across to the media, and through the media to the public. One of the hoariest of public relations mnemonics is the so-called KISS Method: “Keep it simple, stupid.” This is especially true for broadcasting. Even a newspaper reporter must simplify, but print permits the reporter space to explain and (perhaps even more important) time to figure out what needs explaining. By contrast, broadcast journalism relies on 15-second “sound bites,” self-contained statements of no more than a few sentences that capture the essence of each source’s contribution to the story. Seasoned broadcast news sources talk in sound bites, recognizing that there is little point telling the broadcast journalist anything that can’t go as is on the air. Print reporters, in other words, may have trouble understanding the story or figuring out how to simplify it – but at least they can try. With the print media your job is to help the reporter simplify the story. With the broadcast media you must give the reporter a simple story.
Avoiding Oversimplification
“Simplify” must not mean oversimplify. Your integrity and your credibility are at stake, and reporters will not forgive an exaggeration on the grounds that the truth was too complicated. Leave out the minor technical details by all means, but don’t let their absence change your assessment of the bottom line.
The crucial technique is to think through your information goals in advance, deciding precisely what you want the reporter and the reader or viewer to come away with. In a crisis you may have no more than a minute to decide, but take the minute. Experienced sources differ on what to do when the reporter’s questions deviate from the emphasis you have chosen. Some suggest sliding from a quick answer to the question that was asked to a longer answer to the question you think should have been asked. Others prefer a more direct approach, inviting the reporter to focus the interview on this topic instead of that one. Either way, a source who goes into the interview with a list of three or four points to stress is way ahead of a source who goes in cold. Though you may need to expand your list to cover the reporter’s questions, do not expand it to cover minor technical details.
One way to simplify content that is sure to please reporters is to leave out exceptions, qualifying phrases, expressions of tentativeness, and the like. Journalists like their sources to be straightforward representatives of clear-cut positions.
Written back-up documents can ease the problem of simplifying without oversimplifying. Offer the reporter a fact sheet or a reprint to clarify the technical points you make in the interview, and to provide more detail than you have time for in the interview. The problem of simplifying the risk information is much more difficult during an environmental crisis than when the story is about a chronic environmental risk: the reporter demands more information, the source has less information, and both are in a great hurry. This situation is when stockpiled handouts (the schematics of the plant, the toxicity of the chemical) can save the day, freeing the interview for the most important points.
A spokesperson who is attempting to simplify should always warn reporters that this is the case. One of the greatest costs of simplification is that a simplified account can easily sound ill-informed or dishonest when another source provides the information you left out. To prevent this, qualify your conclusions with the acknowledgment that you are simplifying, and sketch in an overview of the sorts of information you are omitting. To avoid misunderstanding, think about where reporters and audiences are most likely to go off-track, where they have gone off-track before, then provide the information that will prevent the error; better yet, discuss the error: “People sometimes mistakenly get the impression that what I mean here is....” Above all, don’t let your simplification merge into exaggeration. As people find out more about the issue – and some of them will – your simplified early statements should stand up as solid and accurate, not misleading or self-serving. If they pass that test, you needn’t worry that they are technically incomplete.
Personalizing Risk
Perhaps nothing contributes so much to the public’s understanding of risk as finding a way to make it personal. Yet technical sources greatly resent the pressure to personalize.
The Case for Personalizing
Start by understanding the reporter’s viewpoint. Questions that personalize do what editors are constantly asking reporters to do: bring dead issues to life, make the abstract concrete, focus on real people making real decisions.
Personalization isn’t just a matter of making the issue come alive; it’s a matter of defining what the issue is. For scientists and bureaucrats, the key is “macro-risk” and public policy. But for journalists and their audience, the key is “micro-risk” and the individual’s choice. A U.S. Environmental Protection Agency study of the ethylene dibromide (EDB) controversy, for example, found that agency sources wanted to talk about how many deaths could be expected from EDB contamination of cereal products, while reporters wanted to ask about whether it was safe to eat the cake mix. The connections between macro-risk and micro-risk are difficult to explain. But the individual citizen is faced with a cake mix, not a regulatory proposal, and the reporter serves that citizen’s interests well by asking questions that personalize.
Suppose an environmental emergency has led authorities to consider – and reject, at least for the moment – a possible evacuation. A good source will explain to reporters that evacuation has its own costs and risks, and that the situation does not yet justify ordering one. A good reporter may then ask: “But if your mother lived in the area, would you call her now and tell her to come on over to your house?” This is a difficult question to answer – an honest “yes” may lead to the evacuation you decided not to order. But it is not an unfair question, and it is certainly not an unlikely one.
The problem is that individual voluntary decisions are very different from social policy decisions. An individual can rationally decide to accept a voluntary risk (e.g., skiing) that no government or corporation could justifiably impose on residents, consumers, or workers. Conversely, a citizen can decide to avoid a risk (e.g., giving up high-fat foods) under circumstances where a policy decision to outlaw the foods would be unacceptable. In our hypothetical environmental emergency, the circumstances that would justify calling your mother are different from the circumstances that would justify ordering an evacuation.
Technical sources often worry that their personal views will be misunderstood by the public as policy positions. To prevent misunderstanding, simply make the distinction explicit. (One way to dramatize the point is to bring along a colleague whose personal decision differs from your own: “Based on our data so far, I have stopped eating the stuff but Charlie here hasn’t.”) The single best guide to this personal decision is the personal judgements of the experts – the scientists and bureaucrats who are charged with the policy assessment. You can keep reporters clear on the difference between the two, but you can’t make them care more about macro-risk than about micro-risk. Don’t try. Instead, be prepared to talk about both.
Avoiding Abstraction
For science, abstraction is the ultimate good; the scientist fashions principles out of raw data, struggling to exclude that which is unique and find that which is universal and can be generalized. If the science is good, the individual researcher, the individual laboratory animal, and the individual procedure should not matter. Journalism, on the other hand, is dedicated to the specific and the novel – just what happened yesterday that’s unusual and interesting. Thus, while technical training focuses people on what is abstract, universal, and impersonal, journalistic training focuses people on what is concrete, unique, and personal.
In short, it is not just the source that reporters want to personalize. They want to personalize the audience and the issue as well. They want examples, anecdotes, and images. And so does their audience.
Even technical sources who are unwilling to bring themselves into the story can find ways to “personalize the principle.” Vivid, personalizing imagery is irresistible to reporters, because it is memorable to readers and viewers. No matter how reluctant a technical source may be to interject his or her personal views, this other sort of personalizing is not objectionable. There is no loss of stature in comparing a leaking landfill to coffee grounds. Good science pursues abstractions, but good communication looks for examples and images.
Risk Comparisons
Risk comparisons are an essential tool for explaining risk to the media and the public. After all, what are the alternatives? Explaining risk quantification is virtually out of the question. Numbers numb the layperson’s mind. The units are unfamiliar; how do you clarify picocuries for an audience that may not be able to distinguish a kilogram from a millimetre? The concepts are unattractive; people do not want to focus on lost years of life expectancy or excess mortality per million persons exposed for 30 years. And the error terms and disclaimers about uncertainty that must accompany a fair risk assessment are enough to dissuade any reporter who made it that far.
How can a source say something about risk that is more sophisticated than the safe/dangerous dichotomy, yet easily understandable to the reporter and the public? Perhaps the best solution is to compare the event at hand to a pre-existing criterion, such as a government safety standard. Like safe-versus-dangerous, above or below the standard is also a dichotomy – but at least it’s a line drawn on a scale, and where the line is drawn bears some relationship (sometimes solid, sometimes shaky) to risk data.
Unfortunately, it is not always possible to offer a comparison to a government standard: there may not be a standard, or there may not be enough information about the situation at hand to know if it is below or above the line.
If comparisons to a government standard are effective, why not try comparisons to other known risks, such as smoking or driving without a seatbelt? Comparing the current crisis to a familiar risk provides an opportunity to say something far more detailed than a simple dichotomy, yet far more interesting and understandable than a quantification. So what’s the problem?
Problems with Comparisons
A risk that is voluntary, familiar, controllable, and fair is likely to be underestimated by the public. Even when people acknowledge the full extent of the statistical risk, they are likely to remain calm (frustrated health and safety educators might say “apathetic”). A risk that is coerced, unfamiliar, uncontrollable, and unfair, on the other hand, tends to be overestimated, and to provoke fear, anger, and suspicion (frustrated industry officials might say “hysteria”) even when the statistical data seem reassuring.
Risk comparisons that cut across these distinctions strike the audience as irrelevant at best, misleading and offensive at worst. Comparisons designed to emphasize a risk about which the public is insufficiently concerned do little harm; “far more dangerous than a nuclear power plant” and “nearly as dangerous as smoking” are acceptable if not especially potent comparisons. But efforts to downplay a risk by means of comparison nearly always backfire: “about as likely as getting hit by lightning on a three-mile drive” strikes most people as snide and unconvincing, even if it’s accurate. Worse yet is the common claim that people who accept certain voluntary risks somehow forfeit the right to complain about smaller involuntary risks: “If you smoke you can hardly be worried about dioxin in the landfill.”
Concentration Analogies
Efforts to explain chemical concentrations by means of analogies are somewhat less emotionally explosive. The following examples are offered with no guarantee of their technical accuracy or their communication effectiveness:
- Part per million: one drop of gasoline in a full-size car’s tankful of gas
- Part per billion: one four-inch hamburger in a chain of hamburgers circling the earth at the equator two and a half times
- Part per trillion: one drop of detergent in enough dishwater to fill a string of railroad tank cars 10 miles long
- Part per quadrillion: one human hair out of all the hair on all the heads of all the people in the world
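Such analogies can at least be sanity-checked with back-of-the-envelope arithmetic. The short Python sketch below works through the first three; every physical constant in it (the drop volume, the tank size, the tank-car dimensions) is an illustrative assumption, not a figure from this article, and the only claim is that each ratio lands within an order of magnitude of the named concentration.

```python
# Back-of-the-envelope check of the first three analogies.
# Every constant below is an illustrative assumption, not a figure
# from the article.

DROP_ML = 0.05               # one drop of liquid, roughly 0.05 mL
TANK_ML = 65_000             # full-size car's fuel tank, roughly 65 L
EQUATOR_M = 40_075_000       # earth's circumference at the equator, metres
HAMBURGER_M = 4 * 0.0254     # a four-inch hamburger, metres
CAR_VOLUME_ML = 113_000_000  # one rail tank car, roughly 30,000 US gallons
CAR_LENGTH_M = 18            # one rail tank car, roughly 60 feet
TEN_MILES_M = 16_093         # 10 miles, metres

# Part per million: one drop of gasoline in a tankful of gas.
print(f"drop in a tank:     {DROP_ML / TANK_ML:.1e}  (1 ppm = 1e-06)")

# Part per billion: one hamburger in a chain circling the equator 2.5 times.
print(f"hamburger in chain: {HAMBURGER_M / (2.5 * EQUATOR_M):.1e}  (1 ppb = 1e-09)")

# Part per trillion: one drop of detergent in 10 miles of rail tank cars.
dishwater_ml = (TEN_MILES_M / CAR_LENGTH_M) * CAR_VOLUME_ML
print(f"drop in dishwater:  {DROP_ML / dishwater_ml:.1e}  (1 ppt = 1e-12)")
```

Under these assumptions each ratio comes out between roughly 0.5 and 1 times the named concentration – close enough for an analogy whose job is rhetorical rather than metrological.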
Note that all of these analogies are designed to stress how small small really is; by minimizing magnitude, they minimize risk. In the hands of sources with an obvious interest in minimizing risk, they lack credibility, and tend to provoke a milder version of the backlash provoked by self-serving risk comparisons. And unlike the risk comparisons, they’re not even especially relevant.
Exaggeration from those wishing to emphasize an environmental risk is less likely to be resented by the public than exaggeration from those wishing to minimize the risk.
When to Use Risk Comparisons
Risk comparisons that cut across the various psychological dimensions of risk backfire more often than they help. But they can sometimes help clarify the issues for reporters if four conditions are satisfied:
- The source of the comparison must be high-credibility and more or less neutral. A university scientist can compare a chemical risk to the risk of smoking more acceptably than a chemical industry spokesperson can.
- The situation must not be heavily laden with emotion. A comparison that will work with a reporter on a background feature about chronic risk won’t necessarily work before a hostile audience at a public hearing.
- The comparison must include some acknowledgment that factors other than relative risk are relevant, that the comparison does not in itself dispose of the issue.
- The comparison must seem to be aimed at clarifying the issue, not at minimizing or dismissing it.
These guidelines do not guarantee a calm and understanding public – but simply describing a hotly controversial proposal or an industrial accident as “safer than driving here to this interview” or “less likely than getting hit by lightning” virtually guarantees an angry and suspicious one.
This article is adapted from a paper presented by Peter Sandman at the “Conference on Global Disasters and International Information Flow,” The Annenberg Schools of Communications, Washington, D.C., October 8–10, 1986.
Copyright © 1987 by Peter M. Sandman and Emergency Preparedness Digest