Author: Schwarcz, Steven L.

INTRODUCTION
I. TAXONOMY OF COMPLACENCY
   A. Herd Behavior
   B. Cognitive Biases
   C. Over-reliance on Heuristics
   D. Proclivity to Panic
II. COMPLACENCY AS A TRIGGER OF FINANCIAL MARKET FAILURES
   A. Herd Behavior and Market Failures
   B. Cognitive Biases and Market Failures
   C. Over-reliance on Heuristics and Market Failures
   D. Proclivity to Panic and Market Failures
III. REGULATING COMPLACENCY
   A. Regulating Herd Behavior
   B. Regulating Cognitive Biases
   C. Regulating Over-reliance on Heuristics
   D. Regulating the Proclivity to Panic
IV. ADDRESSING THE INEVITABLE FAILURES
CONCLUSION
APPENDIX: COMPENDIUM OF POTENTIAL REGULATORY IMPROVEMENTS
   A. Regulating Herd Behavior
   B. Regulating Cognitive Biases
   C. Regulating Over-reliance on Heuristics
   D. Regulating the Proclivity to Panic

INTRODUCTION

Our limitations as human beings impose critical constraints on the efficacy of law. In a law school seminar on legislation, my professor would frequently remind the class that laws will always be implemented imperfectly because we are human. (1)

Since the 1970s, the field of behavioral psychology has been exploring limitations on human rationality. (2) Herbert Simon first outlined the theory of "bounded rationality," (3) which posits that we cannot access and process all the information needed to maximize our benefit. The human mind therefore "necessarily restricts itself by relying on cognitive shortcuts." (4) Around that time, psychologists Daniel Kahneman and Amos Tversky began researching the sources of bounded rationality and resulting cognitive errors. (5) They used the term "prospect theory" to describe their "attempt to articulate some of the principles of perception and judgment that limit the rationality of choice." (6) Among other things, they found that people frequently make decisions based on intuition rather than reason, often reaching the wrong answer. (7) Others found that human rationality only weakly correlates with IQ level. (8)

Behavioral law and economics adopted these findings, rejecting the traditional assumption that economic actors are wholly rational. (9) Recent studies have shown, however, that rationality can be addressed and sometimes improved. (10) Legal scholars are beginning to explore how regulatory intervention can help to counteract irrationality and correct cognitive error. (11)

Little has been done, though, about using these insights to improve financial regulation. Even in financial markets, humans have bounded rationality. (12) The only scholar who, to date, has considered how these insights might improve financial regulation focused narrowly on consumer finance. (13) This Article, in contrast, focuses more broadly on how insights into limited human rationality can improve (and thus references herein to financial regulation include) both "microprudential" financial regulation, which protects the stability of individual financial institutions, (14) and "macroprudential" financial regulation, which is intended to protect the stability of the financial system itself (15) by reducing systemic risk. (16)

For ease of reference and also to situate human limitations within nomenclature used to describe the range of market-failure triggers that can impair financial regulation, this Article refers to those limitations collectively as "complacency" (17) in the expansive sense of that term. (18) Complacency can create market failure by undermining at least two perfect-market assumptions--that parties have full information, and that they will act in their rational self-interest. (19) These assumptions underlie financial regulation. (20)

The Article proceeds as follows. Part I provides a taxonomy of complacency, dividing it analytically into four categories: herd behavior, (21) cognitive biases, (22) over-reliance on heuristics, (23) and a proclivity to panic. (24) Part II explains how these categories of complacency can trigger financial market failures. Part III examines how insights into these categories of complacency can improve financial regulation (and the Appendix to the Article provides a compendium of potential regulatory improvements). And Part IV analyzes how law should address the inevitable failures that occur notwithstanding these regulatory improvements.


I. TAXONOMY OF COMPLACENCY

    There is not yet a generally accepted way to categorize the limitations on human rationality. In analyzing behavioral limitations and law, however, Professors Thaler and Sunstein discuss the limitations associated with herd behavior, (25) cognitive biases, (26) and reliance on heuristics, which they call "rules of thumb." (27) As shown below, these categories provide insights into improving financial regulation. This Article also proposes a fourth category: the human proclivity to panic, which is strongly connected to the stability of financial markets. (28)

    A. Herd Behavior

      Herd behavior refers to the tendency of people to follow what others are doing. That tendency is not necessarily irrational. Herd behavior can improve financial markets if a firm's managers follow the behavior of other firms whose managers have more or better information. (29) Some even argue that herd behavior may represent an evolutionary adaptation that allows individuals to take advantage of information gained by others. (30) Herd behavior becomes problematic, however, to the extent that followers are acting on incomplete or misleading information, or are acting in their own self-interest rather than the interest of the party for whom they are acting. The former tendency contradicts financial regulation's perfect-market assumption that parties have full information. (31)

      For example, a firm's managers might follow the behavior of other firms' managers, thinking the other managers have more or better information. (32) In reality, they may be following a misleading information cascade--a convergence of action based on a belief that the prior actors have better information, whereas the convergence reflects imitation more than good information. (33) An information cascade "has the potential to occur when people make decisions sequentially, with later people watching the actions of earlier people and from these actions inferring something about what the earlier people know." (34) For example, early diners who arbitrarily choose restaurant A over nearby restaurant B "convey [ ] information to later diners about what they knew. A cascade then develops when people abandon their own information in favor of inferences based on earlier people's actions"--i.e., that restaurant A is better than restaurant B. (35)

      The people who follow the actions of earlier people are not mindlessly imitating the earlier behavior; instead, they are "drawing rational inferences from limited information." (36) The frenzied worldwide demand to purchase certain highly leveraged mortgage-backed securities (MBS) in the years prior to the 2008-2009 financial crisis (the "financial crisis") almost certainly represented, in whole or in part, the herd behavior of investors following a misleading information cascade about the value of such MBS. (37)
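The sequential-inference logic behind an information cascade can be made concrete with a short simulation (a minimal sketch, not drawn from this Article; the function name, parameters, and two-choice "restaurant" setup are illustrative assumptions): each actor receives one noisy private signal about which option is better, observes all earlier choices, and abandons its own signal once the observed choices lean far enough one way.

```python
import random

def simulate_cascade(n_agents=30, p_signal=0.7, true_best="A", seed=None):
    """Sequential-choice sketch of an information cascade.

    Each actor in turn receives one private signal (correct with
    probability p_signal), observes every earlier choice, and follows
    the crowd once one option leads by two or more -- at that point a
    single contrary private signal can no longer outweigh the inference
    drawn from earlier actions, and a cascade has begun.
    """
    rng = random.Random(seed)
    other = "B" if true_best == "A" else "A"
    choices = []
    for _ in range(n_agents):
        # noisy private signal: points to the better option with prob. p_signal
        signal = true_best if rng.random() < p_signal else other
        lead = choices.count("A") - choices.count("B")
        if lead >= 2:
            choices.append("A")     # cascade: earlier actions swamp the signal
        elif lead <= -2:
            choices.append("B")
        else:
            choices.append(signal)  # no cascade yet: rely on own signal
    return choices

choices = simulate_cascade(seed=1)
```

Runs in which the first few private signals happen to point the wrong way can lock every later actor into the inferior choice, mirroring the misleading cascades described above: each follower is drawing a rational inference from limited information, yet the convergence reflects imitation more than good information.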

      A firm's managers might also follow the behavior of other firms' managers without recognizing that behavior benefits the other firms but not their firm. In this context, Professor Bainbridge observes that corporate managers have engaged to their detriment in "participatory management"--involving their employees in workplace decisionmaking--simply because they see other companies doing so successfully. (38)

      The latter problematic tendency--to follow the herd in order to protect self-interest but not necessarily the interest of the party for whom the follower is acting (hereinafter, "defensive" herd behavior)--again contradicts the perfect-market assumption that parties act in their rational self-interest. Resulting in part from risk aversion, (39) this tendency creates agency costs, which are themselves a type of market failure (40) that occurs when an agent acts against its principal's interest. For example, a financial analyst (the agent) may recommend a particular investment for his firm (the principal), even though he is skeptical of its value, because other firms are choosing that investment. If the investment ultimately fails, the firm will be harmed, but the analyst's job and reputation will be protected by the fact that others, too, chose that investment. (41)

    B. Cognitive Biases

      As a psychological coping mechanism, we often implicitly simplify our perception of reality. At least two such cognitive biases are common: availability bias (42) and optimism bias. (43) Both of these biases violate the perfect-market assumption that parties have full information (44) by distorting the internalization of information. (45)

      Availability bias is the tendency for a recent or especially vivid event to be the most readily accessible example in a person's mind. It leads people to overestimate the frequency or likelihood of an event when examples of, or associations with, similar events are easily brought to mind, and to discount the probability of an event's occurrence based on the length of time since it last occurred. (46) For example, people with recently divorced friends tend to overestimate the divorce rate. (47)

      Optimism bias is the tendency to be unrealistically optimistic when thinking about negative events with which one has no recent experience, and to devalue the likelihood and potential consequences of those events. (48) This bias helps to explain the reputed interpretation of the Delphic Oracle by King Croesus of Lydia, who wanted to make war on Cyrus. The Oracle advised that the war "would destroy a mighty kingdom." (49) Croesus heard what he wanted to hear (50)--that Cyrus would fall--but in fact, his empire was the one destroyed. (51)

    C. Over-reliance on Heuristics

      Over-reliance on heuristics refers to undue reliance on explicitly adopted simplifications of reality...
