Improving Your Forecasting Abilities

Author: Kavanagh, Shayne
Position: The Bookshelf - Book Review

Superforecasting: The Art and Science of Prediction

Philip E. Tetlock and Dan Gardner

Crown Publishing

2015, 352 pages, $28

Philip Tetlock, a political scientist, is best known for his landmark study on the accuracy of judgmental forecasting. Over the course of 15 years, he asked 284 experts to assign probabilities to three possible futures for scenarios germane to their fields (e.g., economics, domestic politics, international relations). The three possibilities for each scenario were: the status quo, a change in one direction (e.g., more economic growth in a given country), or a change in the opposite direction (e.g., less economic growth in that country).

The results did not reflect well on the ability of experts to make forecasts using their judgment. A New Yorker review of Tetlock's work put it memorably: "The experts performed worse than they would have if they had simply assigned an equal probability to all three outcomes--if they had given each possible future a 33 percent chance of occurring. Human beings who spend their lives studying the state of the world, in other words, are poorer forecasters than dart-throwing monkeys, who would have distributed their picks evenly over the three choices." (1) In all cases, even rudimentary statistical methods would have provided more predictive power. (2) Further, these disappointing results held across areas of expertise, years of experience, and degrees of specialization.
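For context, Tetlock graded forecasts with Brier scores, which measure the squared gap between the probabilities a forecaster stated and what actually happened (lower is better; zero is perfect). The short Python sketch below is illustrative code rather than anything from the book, but it shows how the score works and why the 33/33/33 "dart-throwing monkey" is the baseline the experts failed to beat:

    # A minimal sketch (not Tetlock's actual scoring code) of the
    # multi-category Brier score used to grade probabilistic forecasts.
    # Lower is better; 0.0 is a perfect forecast.

    def brier_score(forecast, outcome_index):
        """forecast: probabilities over the possible outcomes (sum to 1).
        outcome_index: which outcome actually happened."""
        return sum(
            (p - (1.0 if i == outcome_index else 0.0)) ** 2
            for i, p in enumerate(forecast)
        )

    # The "dart-throwing monkey" baseline: 1/3 on each of the three
    # futures (status quo, change up, change down). Suppose the status
    # quo (index 0) is what occurred.
    uniform = [1/3, 1/3, 1/3]
    print(brier_score(uniform, 0))          # ~0.667, whatever the outcome

    # An overconfident expert who put 90% on the wrong direction
    # scores far worse than the uniform baseline.
    confident_wrong = [0.05, 0.90, 0.05]
    print(brier_score(confident_wrong, 0))  # ~1.715

The uniform forecast always scores about 0.667 no matter which outcome occurs, which is why confidently wrong experts can land below it on average.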

Tetlock's latest book, written with Dan Gardner, continues this examination of judgmental forecasting, drawing on results from the "Good Judgment Project" (an initiative sponsored by the U.S. federal government to study how forecasts based only on human judgment can be improved). The project ran over multiple years and included thousands of participants who forecast the outcomes of world events such as elections and economic growth rates.

Superforecasting reveals how some of the participants managed to produce remarkably accurate forecasts--far better than the "dart-throwing monkeys" of Tetlock's earlier study. Tetlock and Gardner dubbed these highly accurate participants "superforecasters" and distilled a list of the specific practices they used to achieve their results. Their findings are encapsulated in the book's "Ten Commandments for Aspiring Superforecasters," summarized below.

Perform Triage. Not all forecasting questions are worth equal attention. Superforecasters differentiate between questions where...
