A few weeks ago Sheldon Rampton and I were guests together on “The Connection” on National Public Radio. Rampton is co-author (with John Stauber) of a recent book called Trust Us, We’re Experts, an attack on public relations for misusing science in defense of corporate misbehavior. My job was to defend PR – which I didn’t do very well, because I basically agree with Rampton that companies often pay scientists to present a carefully selected and sculpted subset of the truth that suits their purposes. This happens in court, of course; but it also happens in news stories and community meetings.
I have two problems with Rampton’s thesis, however. First, he seems to imagine that scientists who don’t work for corporations are entirely trustworthy. And second, he clearly thinks the sort of one-sided science he is complaining about works extraordinarily well to achieve evil corporate designs. Neither of these is so, I think. And understanding why they are not so leads to a very different vision of the role of experts in risk controversies.
There are three kinds of experts: the ones who are for sale, the ones who are deeply committed to a point of view, and the ones who don’t want to get enmeshed in controversy. (There are also the lucky ones who are deeply committed to a point of view that pays them exceedingly well. I guess I’m in that camp myself.) The third group is by far the largest – and arguably the wisest. Twenty-plus years of research, starting with Rae Goodell’s 1977 book The Visible Scientists, has shown that scientists lose stature with their peers when they let themselves get involved in public controversies. The closer the controversy is to the scientist’s area of expertise, the greater the penalty that scientist pays (in professional credibility, grant-getting potential, etc.) for becoming publicly embattled. Behind the scenes, of course, scientists tend to be at least as opinionated as anyone else. The search for an expert who is well-informed but remains neutral and open-minded, free of strong previously stated opinions, tends to be futile in the first place. The search for someone who meets these specs and is nonetheless willing to go public on one side of a controversy is doomed.
So what sort of expert can you get to speak for you in a controversy? Someone who believes deeply in your position, or someone who is willing to support your position for a price. The true believer or the mercenary. Rampton complains that corporations deploy scientific mercenaries to speak on their behalf. He is right, although a surprisingly high percentage of them believe in the cause (or perhaps come to believe in it to resolve their own cognitive dissonance). Activist groups usually can’t afford mercenaries. They look for true believers, for scientists who are willing to volunteer their services for the cause.
The important point here is that neither group is to be trusted. Experts vary in how one-sided they are willing to let themselves get, of course. Whether they are mercenaries or true believers, most insist on telling the truth as they see it; only a few insist on telling the whole truth as they see it. On the whole, I suspect that true believers distort more than mercenaries. It’s not that ideology is necessarily a stronger motive than profit; it’s more that people who are sure they’re on the side of right and justice feel better about distorting a few measly facts than people who are in it for the money. But distortion is common in both cases. Two examples from among hundreds:
Distortion from the true believers.
In the 1980s I was on the board of a group devoted to fighting cancer. Among its principal activities was the operation of smoking cessation clinics at places of business. To talk companies into sponsoring such clinics, we explained that the medical costs of employees who smoked were a financial drain on the employer. One year we decided to commission an economic analysis to bolster the argument. To our shock, the analyst reported that employees who smoked were actually a lot cheaper for employers. They cost more in medical care, but because they tend to die soon after retiring, they save the employer far more in pension costs. Over my objections, the group buried the report in its files and continued to tell employers that smoking cessation clinics would more than pay for themselves.
Distortion from the mercenaries.
After both toxicological and epidemiological research found that a particular pesticide caused bladder cancer, the manufacturer had little choice but to inform employees and retirees who had formulated that pesticide. The company asked me to look over its draft information packet. It included the statement that half of all bladder cancers are caused by smoking. This was an accurate figure for the general public – but because the pesticide was expected to add so many cases of its own, smoking’s unchanged share of this occupational cohort’s total was expected to be only around 13 percent. The packet also emphasized that no bladder cancers had yet been detected among the company’s formulators – again an accurate statement, but one that ignored two facts: Medical screening had not yet begun, and because of the latency of bladder cancer no cases were expected for about another decade. (Part of what makes these distortions so fascinating is that the purpose of the packet was to persuade readers to take the risk seriously and enroll in the company’s bladder cancer screening program. Even when its goal was to arouse people’s concern – thereby saving lives and reducing legal judgments – the company still had trouble abstaining from misleading reassurances.)
There is one important difference between distortion by true believers and distortion by mercenaries: True believers are a lot likelier to get away with it. The public looks more kindly on exaggerated warnings than on exaggerated reassurances. (Risk analysts can think of this as a sort of conservatism.) When Greenpeace tells half-truths to alert us to a problem, we are grateful; when the XYZ Corp. tells half-truths to lull us about the problem, we are outraged.
Which leads to my other disagreement with Sheldon Rampton. He is right that companies often hire mercenary experts to distort for them. (Expert opinion inside the company is likely to be much more divided, but the internal dissent gets squelched in the company’s public assertions.) But as his book demonstrates again and again, these expert distortions tend to get discovered. The result of this pattern of distortion is not a public that trusts corporate experts and corporate reassurances excessively. It is a public that distrusts them excessively. We are rightly convinced that companies are willing to try to mislead us; so we assume that they are trying to do so even when they tell us the truth. (Well, okay, pure “truth” is too much to expect. Make that even when they are closer to the truth than their activist opponents.)
Distortion by corporate experts still works on very low-profile matters, where the audience hardly cares and “learns” without paying attention: nine out of ten dentists prefer…. But when the audience cares – when we move from “public” relations to “stakeholder” relations – expert distortion backfires when deployed by corporate interests, though it still works for activists. Companies do it anyway. That doesn’t make them less ethical than the activists, who also do it. Only less successful.
This has very practical implications for corporate stakeholder relations. The more people care about the issue, and the less they trust you, the better the case for not asking experts to distort for you. An expert who acknowledges the valid arguments on the other side turns out to be a more effective advocate than an expert who doesn’t.
Similarly, an expert whose position is not predictable does you more good than an expert who is always on your side. A couple of months ago an Australian client discovered unexpected emissions from a pilot industrial facility, a finding that might well undermine the case for going into production. The client believed that the emissions posed no health threat to the community, and immediately went looking for an expert who could certify this conclusion. My advice: Find an expert who doesn’t think all industrial emissions are trivial, who sometimes concludes a problem is serious. Whatever the field, an expert who always comes down on the same side lacks credibility. When a research organization judges that a particular government agency should be privatized, ask whether it has ever decided that some other government agency should not. When a sexual harassment counselor finds that a particular complaint is justified, ask how often she has encountered a complaint she thought unjustified. And if you want meaningful support for your conclusion that these emissions are harmless, get it from an expert who has a record of finding some emissions harmful.
The best expert, in fact, is the reluctant expert. I tell my clients to think of experts as falling into four groups: (1) those willing to distort for you; (2) those leaning your way but unwilling to distort; (3) those leaning against you but unwilling to distort; and (4) those willing to distort against you. Find your expert in the third group – someone who will examine your evidence with a skeptical eye and, if it’s as good as you think it is, give you an unenthusiastic okay. “This is a wonderful company and everything it does is safe” is a worthless endorsement. “The s.o.b.’s have a terrible record, but what they are doing this time is safe” is worth its weight in gold.
A few years ago a company consulted me about a personal hygiene product that contained (and still contains) parts-per-trillion levels of a controversial toxin. The toxin in question wasn’t technically a contaminant; it was a close chemical cousin of the product’s active agent, and its presence in tiny quantities was thus inescapable. My client had already established to its satisfaction that this was a non-problem from a health perspective; and the active agent was a genuine health boon. Its concern was whether to reveal the toxin’s presence proactively or stay mum and hope that nobody else revealed it. In either case it needed experts to support its view. Eventually the company decided to interview activist scientists, whose views were clearly on the alarmist side of the spectrum. It ascertained that even they thought the health risk trivial and the health benefit substantial. Rather than go public, it stockpiled the interviews in case of future controversy. One of the reasons this client decided not to go proactive, by the way, is that some of the expert interviewees didn’t want it to. They were willing – just barely – to take the company’s side in a pinch, but they didn’t want to be associated with the company if they didn’t have to be.
Not just activist scientists, but even academic scientists avoid association with corporate reassurances. An electronics company client hired an academic team to do an epidemiological study, promising total independence. When the study results favored the company view, the company begged the researchers to submit the study for publication in a peer-reviewed journal. They resisted literally for years, at least in part because they didn’t want to look to their peers like corporate sell-outs.
It is awkward and sometimes painful to work with experts who would be happier on the other side. But if you are a company under attack, those are just about the only experts of any use to you.
Copyright © 2001 by Peter M. Sandman