Education is something we want to do to people we think are ignorant. Children are the model. They don’t know their times table, so we’ll teach it to them and then they’ll know it.
But education is also something we want to do to people we disagree with. There is an important bit of sleight of hand here. What we really want, often, is to shut our opponents out of the issue altogether; if that’s not possible, then we want to persuade them that we’re right and they’re wrong. But if we acknowledge that what divides us is a disagreement – not even a disagreement predominantly over facts, but one over values – then shutting them out and even persuading them begin to feel like improper goals. In a disagreement, one ought to listen as well as speak. Disagreeing is a two-way process. Education, on the other hand, is comfortably one-way.
Hence the growing interest in environmental education among environmental regulators. Fifteen years ago, when regulators wore white hats, and “I’m from the government and I’m here to help you” wasn’t a joke, environmental education was widely seen as something of a frill. It is now accorded a somewhat higher priority.
People are getting in the way, demanding impossible levels of protection from essentially trivial risks, stonewalling on the lifestyle changes needed to get serious risks under control, questioning the wisdom and even the integrity of the regulators. In irritation and frustration, out of the corners of our mouths, we mutter, “Let’s educate ’em.” Regulated industries, of course, are right there with us: What better use for environmental protection dollars than to teach people they are afraid of the wrong risks?
Of course, people are afraid of the wrong risks. In most of the disagreements between the American public and the environmental professionals, I am on the professionals’ side. I accept most of the conclusions drawn in Reducing Risk, the 1990 report of EPA’s Science Advisory Board: that the public pays too much attention to the health effects of pollution and too little to its ecosystem effects; that the public worries too much about short-term local risks and too little about long-term global ones; that in responding to public priorities, EPA misses some huge risks while it throws money at some tiny ones. I even accept that the public’s technical ignorance is one of the factors contributing to these problems. But it isn’t the major factor. And an education program is doomed to failure if it is grounded in the false conviction that the way to get people to believe what we believe is simply to teach them what we know.
Any environmental controversy can be divided into a technical dimension and a moral-emotional dimension. The key technical issue is how much damage is being done (to health, environment, or both) and how much mitigation can be achieved at how much cost. The key moral-emotional issues are such matters as these: Who benefits? Who’s in control? Is it fair? Can I trust the people in charge? Did they give me a choice? Do they respond respectfully to my concerns? In an article in the November 1987 issue of EPA Journal, I labeled these two issue clusters “hazard” and “outrage,” respectively.
The public is preoccupied far more with outrage than with hazard. The engine that propels the fight over safe-versus-dangerous, in other words, is good-versus-evil. Environmental issues that generate very little outrage – radon, for example – rouse the public much less than high-outrage issues like incinerator siting or industrial effluent. Technical information, however well taught, is unlikely to change these priorities because they are not grounded in technical judgments in the first place.
What happens when you try to teach outraged people how low the hazard is? First, they don’t believe you. Outraged people naturally tend to resist learning that they are technically wrong. (You and I do the same thing when we are outraged.) And even when outraged people do somehow manage to absorb the new information, it is unlikely to change their values.
Try this simple thought experiment. Imagine a roomful of citizens listening to an expert on pesticide risks, perhaps someone like Bruce Ames of the University of California. Ames has conducted research suggesting that natural carcinogens in food are several orders of magnitude riskier than pesticide residues. To summarize Ames’s argument in a single oversimplified sentence: Broccoli is more carcinogenic than dioxin. As Ames tries to convince his audience of this, he faces an uphill battle. But let’s assume the best. The audience is calm, there is no cancer cluster in town, the food is good, there’s plenty of time, and Ames is a persuasive speaker with a lot of data to back him up. So over the course of an hour or two, he succeeds in convincing people that, in fact, broccoli is more carcinogenic than dioxin. This is something they didn’t know before, and now they know it. The education goal has been achieved.
Up comes another speaker. “Now that we know that broccoli is more carcinogenic than dioxin,” the second speaker inquires, “which one do we want the EPA to regulate, the broccoli or the dioxin?” How would the audience respond?
If you think the audience would still favor strong regulations controlling industry’s callous, unconscionable poisoning of the environment with dioxin and not worry too much about what God might have done to the broccoli, you understand the resistance of outrage to technical education. As long as dioxin generates a lot of outrage, and broccoli very little, explaining their relative hazards is unlikely to affect the public’s concerns, fears, or policy choices.
The solution, I think, is to make our educational programs two-way rather than one-way and to make them sensitive to values as well as to data. At its best, this is what environmental education has always meant. But it isn’t what technical professionals usually mean when they mutter darkly about the need to educate the public. Many professionals are themselves understandably outraged at the public’s mistrust; they are in no better mood to learn than the public is.
I propose a division of labor. Let’s agree that technical professionals are the experts on what’s hazardous and what isn’t. (They’re wrong sometimes and overconfident often, but they know more than the rest of us.) Let’s also agree that citizens are the experts on what’s outrageous and what isn’t. Finally, let’s agree that hazard and outrage are both legitimate aspects of risk, both deserving of regulatory attention.
People’s price for respecting the professionals’ domain of expertise, I think, is a sense that the professionals respect theirs. From Community Right-to-Know requirements to Superfund cleanups, the fast-accumulating experience of risk communicators tells us that people can learn what the professionals want them to learn about the hazard – if they are convinced that they will remain free to insist on the outrage, to insist that values as well as data must control the regulation of risk. An environmental education program that works, in short, will be freedom-enhancing rather than freedom-constraining. It will help people see the ways in which they are right as well as the ways in which they are wrong. And as it teaches the public about hazard, it will teach the professionals about outrage.
What does this mean in practice? Instead of ground rules, let me suggest a few questions to ask yourself about any environmental education program, but especially one aimed at reducing public concern about risks the professionals consider small:
- Is the purpose of your education program to help people decide which environmental risks they want to tolerate and which they want to oppose – or is it to corner them so they feel they must tolerate the risks you want them to tolerate?
- Does your education program deal with such “outrage factors” as trust, fairness, control, and dread? When you tell people about a risk that is high in outrage and low in hazard, are you discussing only the hazard?
- When you compare risks, are you “bracketing” a risk you consider low by identifying other risks that are higher and lower, or are you telling people only about the risks that are higher?
- How sure are you of your data? How sure do your materials sound?
- What are the strongest arguments to be made against your own position? Does your education program make them?
- How do you feel about the intended audience of your education program? Respectful? Or a little angry, perhaps even contemptuous? Does it show?
- Is your program one-way or two-way? Do you expect to learn anything? Are there ways for people to teach you why they see the issues differently than you do? Do you want to know?
- Think of an issue about which you feel passionately – abortion, gun control, pornography, whatever – and imagine an education program on that issue developed by an expert on the other side. What signals of understanding or insensitivity, open-mindedness or closed-mindedness, would you be looking for? What such signals are you sending?
Finally, for ground rules (and more questions), let me suggest either of two books by Billie Jo Hance, Caron Chess, and Peter M. Sandman: Improving Dialogue with Communities: A Risk Communication Manual for Government (Trenton, New Jersey: Division of Science and Research, New Jersey Department of Environmental Protection, 1988) and Industry Risk Communication Manual (Boca Raton, Florida: CRC Press/Lewis Publishers, 1990).