Guidance, Risk, and the “Precautionary Principle”

Introduction

This essay was first written for a research report regarding the future of creosote-treated wood piles in Alaskan waters. The scope of work of that research included an assessment of “likely future laws and regulations” regarding creosote.  My analysis indicated that new laws or regulations are unlikely, but that agency “guidance” will be very important.  Here we treat the concept of “guidance” documents, the notion of “risk” in agency decisions, and “precaution.”  The latter became part of the report in a strange way.  An agency, the National Marine Fisheries Service (NMFS), hired a consultant to analyze the risk of creosote in northern waters.  I studied their report.  All in all, the report reviewed many scientific studies, most of which did not indicate any problem with creosote.  However, in the report’s conclusions, the consultant cited the “precautionary principle” to negate the bulk of their own findings and recommend against the use of creosote.  That led this author to investigate the “precautionary principle.”

 

“Guidance” versus Regulations
The NMFS wanted a policy or guidelines to aid their staff in consultations regarding creosote.  Let us consider the nature of such policies or guidelines in the legal context of agency decisions.  Agencies promulgate regulations (“rules”) under two laws: first, the enabling law that requires the agency to regulate the subject matter, and second, the Administrative Procedure Act (APA), which requires the agency to go through a definite process in promulgating the rule.  The rule-making process may be long and arduous, requiring public notice, publication of drafts, public hearings, and revision of proposals; if there are major revisions, the entire process is often repeated.  Once the process is complete, the regulation is a “law” as binding as the statute law that mandated it.

Contrasted with regulations, all agencies have myriad “procedures” that guide the work of the regulators.  These may be very definite articles, such as published laboratory procedures, or administrative matters like “all applications originating north of Anchorage are processed by our Fairbanks office.”  They may be published in “Standard Operating Procedures” (SOPs) or simply arise by habit and custom within the agency.  Using the term “SOPs” to cover all variations of procedures, we note several issues.  The largest is that once an SOP is established, it may have a profound effect on the interpretation of regulations and thus become a regulation itself, but one that was not vetted under the APA.  Moreover, in a contentious matter, if an agency “fails to follow its own procedures,” aggrieved parties will cite this as proof of unfairness and often prevail in the ensuing dispute.  This makes it difficult for agency staff to vary SOPs in contentious situations.  Thus, once an SOP is established, agency personnel feel bound by it.  On the other hand, without such SOPs the agency staff could not function efficiently, or perhaps at all.

 

Risk, Guidance, and the Precautionary Principle

Risk

Risk involves the probability and severity of some harm.  (Sometimes the word “opportunity” is used for the opposite of risk: the probability and severity of some benefit.)  In complex human transactions, the various risks are a “cost” to the party who bears the risk.  They might be insured against, in which case the risks have a definite monetary cost, or the risks are simply borne by one or the other party to the transaction, with the costs “real but uncertain.”  Of course the costs of these risks must be balanced by some benefit to the parties, or the transaction would not be completed.  Note that these benefits may be opportunities, which likewise have probabilities associated with them.  The balancing of risks and benefits is difficult for a government agency, where the risks include bad publicity, time lost responding to increased scrutiny from the public and the media, possible loss of budget, job insecurity for top administrators, and so on.  Benefits, on the other hand, are very nebulous, other than the satisfaction of doing one’s job well.
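The arithmetic behind this characterization of risk can be sketched in a few lines.  All probabilities and dollar figures below are invented for illustration, not drawn from the creosote report:

```python
# Risk characterized as expected cost: probability of a harm times its severity.
# All figures are hypothetical, for illustration only.

def expected_cost(probability: float, severity: float) -> float:
    """Expected cost of a risk with the given probability and severity (dollars)."""
    return probability * severity

# An insured risk has a definite monetary cost: the premium.
premium = 1_200.0

# An uninsured risk is "real but uncertain": here, a 2% chance of a $50,000 loss.
uninsured = expected_cost(0.02, 50_000.0)

print(f"insured cost:  ${premium:,.0f}")
print(f"expected cost: ${uninsured:,.0f}")
```

Note that the uninsured party bears an expected cost comparable to the premium, but experiences either no loss at all or the full severity, which is why the essay calls such costs “real but uncertain.”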

Caution about Precaution

In a later appendix we treat the Stratus [the NMFS consultant] document in some detail, but here we want to comment on a section of the report that cites the “precautionary principle” as dictating certain courses of action regarding creosote.  The irrelevance of that citation is discussed in the Stratus appendix; here we will discuss the underlying principles of precaution and risk.

We have learned that risks are characterized by stating the probability and severity of some harm.  Those characterized risks are then used in a risk management decision.  We also recognize that do-nothing is a management option, and one different from simply ignoring a risk.  Both the risks and the management decisions for a current hazard, say MTBE in a city’s water supply, are different from those regarding a possible future hazard, say building a landfill near the city’s water supply wells.  Contrasting a current hazard with a possible future hazard, the nature of the do-nothing alternative is quite different.  If the hazard existed prior to the management decision under consideration, the decision itself did not contribute to the hazard.  Of course prior decisions by that same manager may have contributed to the hazard, but those decisions are “sunk costs” and not relevant to the decision of the moment.  Conversely, if all the effects of the decision will occur in the future, the decision itself may contribute to the hazard.  Thus a present decision to change something in the future, perhaps to increase some risk in the future, is quite different from making a decision about a present hazard.  Of course there were other hazards in the likely future states of nature; that is why we are making the decision: for example, the city’s current landfill is at capacity and we have a court order to close it in two years.  But no hazard at all existed for the city’s wells prior to our decision; the do-nothing alternative eliminates the risk to the wells.

The decision not to build a project, not to market a new drug, or not to grow genetically modified corn completely eliminates the hazard due to the project, drug, or corn.  Of course the benefits would be eliminated as well.  Often, however, neither the future hazards nor the benefits are known with certainty.  Proponents of the new devices (project, drug, or genetic product) must believe there are benefits, or they would not promote them.  Opponents believe there are hazards great enough to warrant enjoining the device, but here, too, the hazards are not known with certainty.  To this point in our essay, we are dealing with a classical cost-benefit analysis superimposed with a probabilistic analysis of future events.  However, there may always be “unintended consequences,” which might be presumed hazardous or beneficial depending on the person’s relation to the device.  For those opposed to the new device, the next step is to invoke the “precautionary principle,” which implies that doubt about the future should be resolved against the new device.  There are two versions of “precaution”: a “hard” version and a “soft” version.
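The “classical cost-benefit analysis superimposed with a probabilistic analysis of future events” can be made concrete with a minimal sketch.  The outcomes and probabilities below are hypothetical, chosen only to show the shape of the calculation:

```python
# Deciding on a new "device" (project, drug, or crop) by comparing the
# expected value of proceeding against the do-nothing alternative.
# All outcomes and probabilities are hypothetical.

def expected_value(outcomes):
    """Expected value of a list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Proceeding: a likely benefit with a small chance of serious harm.
proceed = [(0.90, 10.0),   # benefit realized
           (0.10, -40.0)]  # harm occurs

# Do-nothing eliminates both the hazard and the benefit of the device.
do_nothing = [(1.0, 0.0)]

ev_proceed = expected_value(proceed)     # 0.9 * 10 - 0.1 * 40 = 5.0
ev_nothing = expected_value(do_nothing)  # 0.0

decision = "proceed" if ev_proceed > ev_nothing else "do nothing"
print(decision)  # proceed
```

Invoking the hard precautionary principle amounts to resolving the uncertain harm term against the device regardless of its probability, which short-circuits this comparison entirely.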

“Soft precautionary principle”


The precautionary principle was stated by the 1992 Rio Conference: "In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation." [Emphasis mine.]
So under Rio, serious or irreversible damage, or even only a threat of such damage, triggers measures to prevent it, but only if those measures are cost-effective.  Cost-effectiveness could fail either because the costs are overwhelming in themselves or because they are large in relation to the damage.  The damage itself may be hard to evaluate.  For example, the extermination of a rare species that is little known or useful might be regarded as an infinite loss, since the species will never appear again, or as no loss at all, since it would not be missed.  Of course real-world decisions are always complicated by economics and politics.

Hard precautionary principle


The more recent 1998 Wingspread Conference issued a document that states: “When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically.”
Note that a broad reading of this is simply that if there is a “threat,” some precaution is warranted, a notion that is hard to argue with.  However, it seems to demand that some measures be taken even when the science does not establish causation.

Reactions to the “precautionary principle”

While these general policy notions might be thought-provoking, their application is hardly scientific.  As van den Belt (2003) notes:
[Definitions such as Wingspread] beg many questions. Is there ever full scientific certainty? Do we need a minimal threshold of scientific certainty or plausibility before we may (or should) undertake preventative action? And do we really know how to prevent harm if we are so much ignorant about the underlying cause-effect relationships? The definitions that are currently on offer fail to spell out the precise conditions that have to be fulfilled before the PP may be invoked or the nature of the preventative action that has to be taken. The types of action suggested range from implementing a ban, imposing a moratorium while further research is conducted, allowing the potentially harmful activity to proceed while closely monitoring its effects, to just conducting more research. The PP does not have a very precise meaning as long as such crucial aspects are left largely unanswered.

In practice, however, the PP is often given a more definite meaning by reducing it to an absurdity. Normally, no minimal threshold of plausibility is specified as a “triggering” condition, so that even the slightest indication that a particular product or activity might possibly produce some harm to human health or the environment will suffice to invoke the principle. And just as often no other preventative action is contemplated than an outright ban on the incriminated product or activity. The intervention of Greenpeace in the monarch butterfly case seems to fit this pattern.

Closely linked to various versions of the PP is the idea of reversing the onus of proof. Thus, the adherents of the Wingspread Statement declare that “the applicant or proponent of an activity or process or chemical needs to demonstrate that the environment and public health will be safe. The proof must shift to the party or entity that will benefit from the activity and that is most likely to have the information” (Raffensberger and Tickner, 1999). Greenpeace also holds that effective implementation of the PP requires a shift in the burden of proof (Greenpeace, 2001). Shifting the burden of proof seems a fairly straightforward way to ensure, as Jonas demanded, that greater weight will be given to the “prognosis of doom” than to the “prognosis of bliss.”

Before looking into the proper assignment of the burden of proof, we must first examine more closely the underlying justification for the strong version of the PP. Why should the prospect of harmful effects of a new technology take precedence over the prospect of beneficial effects, quite apart from the inherent likelihood of each of these possibilities? The obvious answer seems to be that such a priority is defensible only when the harmful effects are of such magnitude that they carry catastrophic (or, as Jonas would say, “apocalyptic”) potential. The infinite costs of a possible catastrophic outcome necessarily outweigh even the slightest probability of its occurrence.

This type of reasoning exhibits a remarkable resemblance to a well-known example of a “zero-infinity dilemma,” namely Pascal's famous “wager.” When it comes to wagering on the existence of God, the 17th century French philosopher argued incisively in his Pensées that it is better to be safe than sorry (Haller, 2000; Graham, 2002; Manson, 2002). Given an unknown but nonzero probability of God's existence and the infinity of the reward of an eternal life, the rational option would be to conduct one's earthly life as if God exists.

Alas, Pascal's reasoning contains a fatal flaw. His argument is vulnerable to the “many gods” objection (Manson, 2002). Consider the possible existence of another deity than God, say Odin. If Odin is jealous, he will resent our worship of God, and we will have to pay an infinite price for our mistake. Never mind that Odin's existence may not seem likely or plausible to us. It is sufficient that we cannot exclude the possibility that he exists with absolute certainty. Therefore, the very same logic of Pascal's wager would lead us to adopt the opposite conclusion not to worship God. Pascal's argument, then, cannot be valid.
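Van den Belt’s “zero-infinity dilemma” can be made concrete in a few lines: once an outcome is assigned infinite cost or reward, expected value is dominated by any nonzero probability, and two rival infinite stakes (God versus Odin, or banning versus allowing a technology) both come out unbounded, so the calculation no longer discriminates between options.  The probabilities below are purely illustrative:

```python
import math

def expected_value(outcomes):
    """Expected value of a list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

INF = math.inf

# Pascal's wager: any nonzero probability of an infinite reward dominates.
worship_god = expected_value([(0.001, INF), (0.999, -1.0)])   # inf

# The "many gods" objection: the same logic applied to a rival deity (Odin)
# yields an equally infinite expected value for the opposite wager.
worship_odin = expected_value([(0.001, INF), (0.999, -1.0)])  # inf

print(worship_god, worship_odin)   # inf inf: both wagers look "rational"
print(worship_god - worship_odin)  # nan: infinite stakes cannot be compared
```

This is the same failure mode as the strong precautionary principle: assigning “apocalyptic” cost to a possible harm makes its tiny probability irrelevant, but the identical move is available to the other side of the dispute.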

If the reader will pardon another long quote, Chauncey Starr writes in Risk Analysis (2003):

This brings us back to the precautionary principle. Governments asked to regulate public exposure to risks from man-made sources (food, water, air, radiation, pollutants, electromagnetic fields, etc.) face a tortuous decision process because of the above uncertainties of risk analysis. The use of the precautionary principle as a politically defensible umbrella is a tempting escape from this difficulty. However, it is not cost-less, as protection from a risk that may be nonexistent or trivial may deprive the public of attractive and valuable lifetime choices. The only defensible approach is a comparative risk analysis of alternative pathways, taking into account our most credible projections of the lifetime economic, environmental, and health values of these alternatives.

The precautionary principle exists only as a rhetorical statement; it provides no useful input to decision making. Expert opinions should be sought, but be recognized as conservatively biased. The search for science-based guidance is commendable, but is rarely achievable. In areas of public health and safety, comparative benefit/cost/risk analysis of all options should provide the judgmental base for decision making. Between the horserace bet and a credible, scientifically established projection, the decision maker will always be faced with a choice and no guarantees. There will always be room for pragmatic judgments on the limitations of long-range management.

 

Conclusion

To summarize Starr: the agency must make a decision based on the best available science and other considerations, and then apply judgment.  Citing “precaution” to avoid a decision is not valid, since most decisions must be made amid many uncertainties.

 

References

1 The Rio Declaration on Environment and Development, often shortened to the Rio Declaration, was a short document produced at the 1992 United Nations Conference on Environment and Development (UNCED), informally known as the Earth Summit. The Rio Declaration consisted of 27 principles intended to guide future sustainable development around the world; the “precautionary principle” is one of the 27. Of course these principles do not have the effect of law in any country. [http://en.wikipedia.org/wiki/Rio_Declaration_on_Environment_and_Development]

2 The Wingspread Conference on the Precautionary Principle was a three day academic conference where the precautionary principle was defined. The January 1998 meeting took place at Wingspread, headquarters of the Johnson Foundation in Racine, Wisconsin, and involved 35 scientists, lawyers, policy makers and environmentalists from the United States, Canada and Europe. [http://en.wikipedia.org/wiki/Wingspread_Conference_on_the_Precautionary_Principle]

van den Belt, H. (2003). Debating the Precautionary Principle: “Guilty until Proven Innocent” or “Innocent until Proven Guilty”? Plant Physiology 132(3): 1122–1126. Downloaded 12/12/2016 from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC526264/?tool=pmcentrez#ref9

Starr, C. (2003), The Precautionary Principle Versus Risk Analysis. Risk Analysis, 23: 1–3. doi: 10.1111/1539-6924.00285