Governments use two methods of revenue forecasting--judgmental, which relies on the expert opinion of the forecaster, and quantitative, which uses historical data and statistical techniques. Most local government forecasts rely on forecaster judgment to some extent, in large part because this method is easy to use. Human judgment also has value under certain conditions, but it is inherently prone to bias. Research has generally found judgmental forecasting inferior to quantitative methods, but where no data exist for statistical techniques, judgmental forecasting can provide useful insights. To determine how judgmental forecasting can best be used, one must consider:
* Efficacy. What does forecasting science say about the predictive value of judgmental forecasting?
* Conditions for Use. When is it appropriate to use judgmental methods in forecasting?
* The human side of judgmental forecasting. How can the user best address the human element of judgmental forecasting?
* Judgmental forecasting methods. What techniques make judgmental forecasting most effective?
Forecasting science generally recommends against pure judgmental forecasting for two main reasons. First, the process used to construct the forecast exists primarily in the forecaster's head, which means the forecast is neither transparent nor easily replicated. Hence, the forecast is difficult to fully explain to others, and results will vary from one forecaster to another, leading to inconsistency.
Second, a wide body of literature suggests that judgmental forecasts are likely to be of dubious accuracy. (1) One particularly well-known example comes from a landmark study by the political scientist Philip Tetlock. (2) Over the course of 15 years, Tetlock asked 284 experts to assign probabilities to three possible future scenarios for forecast questions germane to their fields (e.g., economics, domestic politics, international relations). The three choices for each question covered persistence of the status quo, a change in one direction (e.g., more economic growth in a given country), or a change in the opposite direction (e.g., a downturn in economic growth).
Expert judgment did not perform well in the study. A New Yorker review of Tetlock's work put it memorably: (3) "The experts performed worse than they would have if they had simply assigned an equal probability to all three outcomes--if they had given each possible future a 33 percent chance of occurring. Human beings who spend their lives studying the state of the world, in other words, are poorer forecasters than dart-throwing monkeys, who would have distributed their picks evenly over the three choices." In all cases, even rudimentary statistical methods would have provided more predictive power. (4) Further, these disappointing results held regardless of the experts' area of expertise, years of experience, or degree of specialization. In other words, greater expertise does not lead to better judgmental forecasts.
Use Any Quantitative Technique
As a rule of thumb, a forecaster should always try to apply some quantitative technique--any quantitative technique--before relying solely on expert judgment.
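To illustrate the rule of thumb above, even the simplest quantitative technique gives a transparent, replicable baseline that judgment can then adjust. The sketch below (all figures hypothetical, chosen only for illustration) extrapolates next year's revenue from the average historical growth rate:

```python
def naive_growth_forecast(history):
    """Forecast the next value by applying the mean historical
    year-over-year growth rate to the most recent observation."""
    growth_rates = [later / earlier for earlier, later in zip(history, history[1:])]
    mean_growth = sum(growth_rates) / len(growth_rates)
    return history[-1] * mean_growth

# Hypothetical annual revenues, $ millions
revenues = [10.0, 10.4, 10.9, 11.2, 11.8]
print(round(naive_growth_forecast(revenues), 2))  # prints 12.3
```

This is not a recommendation of any particular model--the point is that a documented calculation like this one can be explained, replicated, and checked against actual results, which a forecast held entirely in the forecaster's head cannot.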
Tetlock's findings do not mean that expert judgment has no role in forecasting, however--just that it should be judiciously applied. The next section of this article addresses the conditions under which judgment should enter into the forecast.
CONDITIONS FOR USE OF EXPERT JUDGMENT
Judgmental forecasting is not universally panned by the research. Some research suggests that, under certain conditions, judgmental forecasting can be as good as the best statistical techniques, and may be more consistent. This can be the case when:
* The expert has access to information that would not be reflected in the results from a statistical model. (5) This can be a common occurrence in government revenue forecasting because of phenomena such as changes in the tax base or changes in governing law.
* The expert has a history of making and learning from similar forecasts, and the environment is relatively stable (i.e., the impact of seasonality or business cycles is low). (6)
* Good historical data are limited or unavailable.
Even when these conditions exist, a government should exercise judgment carefully. Techniques that can help improve judgment include the following:
* Start by documenting the organization's accumulated wisdom about the revenue source. A simple checklist of key factors documents what should be considered before the forecast is made. (7) This helps forecasters stay focused on the most relevant factors, not overlook key factors, and maintain some consistency in forecasting approach.
* Keep records of how forecasts are made and review past performance before making new forecasts. (8) Feedback allows the forecaster to learn, so it is advisable to keep records--not only of the forecast itself, but also of the judgments about key variables and assumptions behind it. A forecast may ultimately be the product of too many variables to provide useful feedback on its own, but reviewing how the expert's judgment affected specific key variables can be more instructive than reviewing the forecast itself.
* Create graphics of key trends rather than simply studying data in tabular form. (9) Graphical representations of data can help reveal trends that might otherwise be missed. Consider different graphical formats for visualizing data, such as drawing a line of best fit through the data points to better illuminate the trend.
* Obtain several independent judgments rather than relying on just one person. Research suggests that judgmental forecasts can also be improved by simply averaging the results of multiple independent forecasts. (10) This averages out unsystematic differences in the forecasters.
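The "line of best fit" suggestion above can be sketched with a few lines of code. This hypothetical example (figures invented for illustration) computes an ordinary least-squares trend line through annual revenue observations; a plotting library such as matplotlib could then draw the line over the data points:

```python
def best_fit_line(xs, ys):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

years = [2019, 2020, 2021, 2022, 2023]     # hypothetical
revenues = [10.0, 10.4, 10.9, 11.2, 11.8]  # hypothetical, $ millions
slope, intercept = best_fit_line(years, revenues)
print(f"trend: {slope:.3f} per year")      # prints trend: 0.440 per year
```

Seeing the fitted slope alongside the raw points makes it harder to miss a steady underlying trend that a table of numbers can obscure.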
Even in a less structured approach, getting more views for a judgmental forecast can be helpful--but beware of group forecasting. It may seem intuitive that a team would arrive at better judgments than an individual would, but this is not always the case. Group interactions can be dominated by a particularly confident (but not necessarily insightful) participant, or they can devolve into groupthink.
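Averaging independent judgments, as described above, requires no statistical machinery at all. The sketch below (analyst names and figures hypothetical) simply averages separately elicited estimates so that unsystematic individual errors tend to cancel out:

```python
# Independently elicited revenue forecasts, $ millions (hypothetical)
independent_forecasts = {
    "analyst_a": 12.1,
    "analyst_b": 12.6,
    "analyst_c": 11.9,
    "analyst_d": 12.4,
}

# Averaging dampens idiosyncratic optimism or pessimism in any one analyst
combined = sum(independent_forecasts.values()) / len(independent_forecasts)
print(round(combined, 2))  # prints 12.25
```

Note that the estimates must be gathered independently before combining; if the analysts confer first, the averaging no longer cancels out individual biases, which is precisely the group-forecasting hazard noted above.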
Decompose the Forecast Problem. Breaking down the forecast into component problems can help simplify the forecasting task. For example, forecast income from different components of the tax base, not the entire tax base. Only use this technique on problematic areas, however. Keep decomposition to a minimum because research shows that overuse can increase the number...