The battle over Endangered Species Act methodology.

Author: Ruhl, J.B.
I. INTRODUCTION
II. GENERAL FRAMEWORKS FOR DECIDING HOW TO DECIDE
    A. Level of Confidence Framework
    B. Hypothesis Statement Framework
    C. Error Aversion Framework
III. ESA DECISION NODES
IV. THREE COMPETING MODELS OF ESA METHODOLOGY
    A. The Professional Judgment Method--Working With the "Best Scientific Data Available"
    B. The Scientific Method--Emphasis on "Best Scientific Data ..."
    C. The Precautionary Principle Method--Emphasis on "Best ... Data Available"
V. HARMONIZING THE THREE METHODOLOGIES
VI. CONCLUSION

I. INTRODUCTION

    The substantive contours of the Endangered Species Act (ESA) (1) have been largely worked out for quite some time. Congress, now virtually inert when it comes to environmental law reform, has not touched the ESA in any meaningful way for over two decades. (2) Likewise, with the exception of the Habitat Conservation Plan permit program under section 10 of the statute, (3) neither of the agencies authorized to implement the ESA has engaged in substantive legislative rulemaking for many years. With so little movement on the statutory and regulatory fronts, by the early 1990s the courts had settled most of the substantive interpretation questions. The political reality is that the substantive law of the ESA is not likely to budge for the foreseeable future. Nevertheless, litigation under the ESA and calls for statutory reform abound, seeming to increase in volume and intensity with each passing year.

    One source of this burgeoning wave of litigation and lobbying is a battle over ESA methodology--the principles and rules of organization for the inquiry processes necessary for making decisions required under the statute. (4) The ESA requires the Secretary of the Interior, who acts through the Fish and Wildlife Service (FWS), and the Secretary of Commerce, who acts through the National Marine Fisheries Service (NMFS), to make various decisions about the status and protection of animal and plant species. (5) In suits brought by citizens alleging FWS or NMFS has failed to meet its statutory mandates, (6) courts have been willing, almost eager, to find FWS and NMFS at fault for procedural errors, such as missing decision deadlines, (7) but until recently it was unusual for courts to find that the agencies had made substantive errors in their decisions about the status and protection of species. The conventional rules of judicial review tilt the balance decidedly in favor of the agencies in litigation challenging the substantive merit of the agencies' decisions. (8) This judicial deference, plus the fact that the substantive statutory and regulatory context was locked in place, made it far more likely than not that FWS and NMFS decisions would be upheld on the merits. (9)

    This aura of immunity carried the agencies well into the 1990s before it began to crumble. Starting in the mid-1990s, opponents of FWS and NMFS decisions from both the industry and the environmental camps realized that the methodological contours of the ESA were not nearly as settled as their substantive kin. It was quite clear, for example, what substantive factors the agencies had to consider in order to designate the "critical habitat" of a species protected under the ESA. (10) However, it was far less clear at the time just exactly how the agency was supposed to consider those factors--how to collect the evidence, how to evaluate the evidence, how to weigh the evidence, and how to reach a conclusion. Quite obviously, methodological rules of this sort can have a profound impact on substantive decision outcomes. The fact that they were not clearly worked out under the ESA, therefore, presented stakeholders an opportunity to influence substantive decision-making outcomes without having to alter the substance of the law or challenge the substantive merits of discrete agency decisions. Thus, the debate began in the late 1990s and has been going strong since then, reflecting the realization industry and environmental interests must have made--that how these methodological rules develop could revolutionize the ESA for decades to come. (11)

    This Article explores the breadth and depth of the ensuing battle over ESA methodology. Section II lays out the framework for evaluating decision-making methodologies. One basis on which we might choose how to go about making decisions is the level of confidence we wish those decisions to enjoy. If we want to be as sure as possible that a decision is correct, we would subject it to rigorous empirical tests through a process I (being far from the first) will call the Scientific Method. Where the costs of the Scientific Method are not justified, or the data needed to complete it are unavailable, we might feel comfortable relying on experts in fields relevant to the subject matter of the decision, whose experience and expertise we believe will allow them confidently to fill in the gaps that prevent competent use of the Scientific Method. This methodology, which I will call the Professional Judgment Method, prevails in administrative law. There may be times, however, when the evidence points to a particular decision under either of the two previous methodologies, but the consequences of the decision turning out to be wrong are so severe that we want to exercise caution by resisting the weight of the evidence. This fear of mistakes and their consequences motivates what I will call the Precautionary Principle Method. The more confidence we wish to have in a decision being correct, the more we will eschew the Precautionary Principle Method and gravitate toward the other two methods.

    Of course, it is hard for anyone to decide which methodology to use until more is known about the nature of the decision that has to be made. For scientists testing cause-and-effect relationships between physical phenomena, the Scientific Method prescribes a manner for stating and testing relevant testable hypotheses. Agencies, by contrast, are handed their "hypotheses" through statutory directives. The way Congress expresses the findings an agency must make in order to take prescribed action will influence how different interested stakeholders form preferences for different methodologies of decision making. For example, one possible hypothesis to test under the ESA is "the species is endangered." In that case, using the Scientific Method would require more rigorous justification of the decision that the species is in fact endangered than would the other methodologies. An advocate for species protection thus might object to using the Scientific Method in that context. But if the hypothesis used for ESA purposes were "the species is not endangered," the Scientific Method would cut in favor of anyone interested in increasing species protection. The burden of proof the Scientific Method imposes has substantive effect depending on the hypothesis to be proved. So, how we frame hypotheses will influence who favors a particular methodology.

    This dichotomy between hypothesis statements reveals another framework for methodology selection--aversion to mistaken conclusions about whether the hypothesis is true. Say we are testing the hypothesis that two observed variables are related in some way. The Scientific Method is designed to weed out what is known as "alpha" or "Type I" error--the identification of a causal relationship between the two variables when one does not exist. (12) The other kind of error, "beta" or "Type II," involves the finding of no causal relationship when one in fact does exist, and is precisely what the Precautionary Principle Method is intended to guard against. The Professional Judgment Method imposes no particular risk aversion bias, asking that the experts decide how best to balance the risk portfolio. Hence, we may prefer one of the methodologies based on how we wish to manage risk.
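    To make the terminology concrete, the standard statistical framing can be restated compactly. The sketch below, rendered in LaTeX, is illustrative only; the hypothesis wording is supplied here for exposition and is not drawn from the ESA or its implementing regulations.

        % Illustrative sketch: H_0 (the null hypothesis) is presumed true unless the
        % evidence is strong enough to reject it. The hypothesis wording is hypothetical.
        \[
          H_0:\ \text{the two variables are not causally related}
          \qquad
          H_1:\ \text{the two variables are causally related}
        \]
        \[
          \text{Type I (alpha) error: rejecting } H_0 \text{ when it is true}
          \;\;(\text{a relationship is ``found'' that does not exist})
        \]
        \[
          \text{Type II (beta) error: failing to reject } H_0 \text{ when it is false}
          \;\;(\text{a real relationship is missed})
        \]
        % The Scientific Method is designed to keep the Type I rate low; the
        % Precautionary Principle Method instead guards against Type II error.

    Framed this way, the choice among the three methodologies is, in effect, a choice about which of these two error rates the decision maker is least willing to tolerate.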

    Because methodology selection depends so much on how hypotheses are stated and the risk aversion bias of different interest groups, Section III provides some background on the ESA and its numerous decision-making nodes--the points at which a choice among the three methodologies must be made using one or more of the frameworks discussed above. Three features of the ESA make its decision-making context particularly susceptible to fights over methodology. First, many decisions the agencies must make involve questions of biological science for which the available scientific database is either sparse or inconclusive. By demanding that the agencies nonetheless reach conclusions under strict deadlines, the ESA sets up a methodological quandary. Second, these biological evaluations often arise in legal contexts that present a poor fit between science and policy. The decisions the law demands the agencies make, in other words, often do not make sense to a scientist. Finally, ESA decisions are characterized by the intense involvement of viciously combative interest groups willing to sue each other and the agencies with what appears to be gleeful abandon. Where the opportunity presents itself to shape ESA methodology, the opposed interest groups seem happy to litigate to a pitched battle in short order.

    Section IV of the Article frames and assesses the battle positions, which fall into three competing methodological camps. None of the frameworks for choosing methodologies leads to a particularly compelling result under the ESA. By default, the ESA had, until the mid-1990s, followed a fairly conventional methodology for making decisions: the agencies implemented substantive duties based largely on the Professional Judgment Method, for the most part unshackled by significant method constraints, with their decisions subject to judicial review under the deferential Administrative Procedure Act (13) standards. (14) Not surprisingly, FWS and NMFS cling firmly to the Professional Judgment Method.

    Two alternative themes have emerged, however, to alter the settled practice. Not surprisingly, they would pull the statute in opposite...
