Starting in November 2001, the U.S. economy found itself in the weakest employment recovery since World War II. During the first four years of the business cycle, private-sector employment remained below its prior peak. To maintain consumption, households took on record amounts of debt, which, combined with federal budget deficits, contributed to the highest U.S. trade deficits on record. The slow recovery gave businesses little reason to invest, and when they finally did, investment was narrowly concentrated in a few industries.
This low investment was not due to scarce resources. Corporate profitability saw a strong recovery and reached historic highs by the sixth quarter after the business cycle trough. However, instead of investing in plant and equipment to produce output that might not find buyers, corporations spent their resources on speculative uses like share repurchases and dividend pay-outs. The reward was a strong stock market recovery amid a weak labor market recovery, another distinguishing feature of the early 2000s U.S. recovery.
Many observers have put forward explanations for the "job-loss" recovery. This, however, may not be what actually needs explaining. The recovery simply resumed a trend that has been underway for more than twenty-five years: a growing divergence between national productivity and labor compensation, to the gain of corporate profitability. The job-loss recovery stands out only in the stark relief provided by the tight labor market and rising labor share that characterized a brief period in the late 1990s. What needs to be explained, then, is not the job-loss recovery but, rather, the twenty-five-year trend of capital's gain and labor's loss that preceded the late 1990s boom.
Our explanation for this trend is a shift in the power balance of the U.S. economy that was only briefly offset by extraordinary demand growth in the late 1990s. Since World War II, financial capital has become increasingly concentrated in the hands of institutional investors. Dissatisfied with low rates of return in the 1970s, institutional investors pushed for corporate governance changes, especially in the realm of executive compensation. This push was aided by a concurrent sharp decline in private-sector unionization, weakening U.S. labor's ability to fight these changes. The confluence of these changes led to productivity gains being largely claimed by capital-owners and corporate managers. The job-loss recovery is just another manifestation of this longer trend, more extreme in degree but equivalent in kind to what happened in previous decades.
The "Job Loss" Recovery in Context
The central problem of the job-loss recovery is easy to describe: economic growth and employment diverged as demand growth lagged productivity growth. (1) In previous recoveries, firms had to hire new workers early on to meet rising demand. The demand/productivity mismatch of the current recovery, however, is due not to extraordinarily strong productivity growth but to anemic demand growth. Labor productivity rose by 10.8 percent in the first nine quarters of the recovery, similar to the early 1950s, early 1960s, and early 1970s (BLS 2004a). Unlike in these previous recoveries, compensation growth lagged far behind productivity. The result of this productivity/compensation mismatch is that profits soared, measured either as profit shares or profit rates (see figure 1 and Weller 2004a), and households borrowed record amounts to maintain their consumption (Weller 2004a). At the same time that personal savings fell, the federal government incurred large budget deficits and the nation's current account deficit reached all-time highs (Weller 2004b).
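The arithmetic behind this mismatch follows from the standard accounting identity output = productivity × employment: in growth terms, employment growth is approximately demand (output) growth minus productivity growth. The sketch below illustrates the mechanics with hypothetical demand figures; only the 10.8 percent productivity number comes from the text above.

```python
# Illustrative sketch of the growth-accounting identity behind the
# demand/productivity mismatch. Since output = productivity * employment,
# (1 + g_output) = (1 + g_productivity) * (1 + g_employment).

def implied_employment_growth(output_growth: float, productivity_growth: float) -> float:
    """Employment growth implied by given output and productivity growth."""
    return (1 + output_growth) / (1 + productivity_growth) - 1

# Hypothetical demand-growth figures for illustration only, paired with the
# 10.8% nine-quarter productivity growth cited in the text (BLS 2004a).
weak_demand = implied_employment_growth(0.06, 0.108)    # demand lags productivity
strong_demand = implied_employment_growth(0.14, 0.108)  # demand outpaces productivity

print(f"weak demand:   implied employment change {weak_demand:+.1%}")
print(f"strong demand: implied employment change {strong_demand:+.1%}")
```

With demand growth below productivity growth, the identity forces employment to contract even as output expands; only when demand outpaces productivity do firms need to hire.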
[FIGURE 1 OMITTED]
Even businesses doubted the sustainability of a recovery based on household debt, evidenced by an extremely slow recovery in investment (Weller et al. 2004). Investment declined for nine straight quarters, a record contraction that undid 80 percent of the gains in investment from the 1990s. Further, the initial investment rebound was narrowly concentrated in a few information technology sectors (Weller 2004c).
The combination of record profits and sagging investment raises the question: what were corporations using their resources for? Increasingly, firms used them for dividend pay-outs and share repurchases rather than for productive investment. From the second quarter of 2001 to the first quarter of 2004, corporations paid out a record-high 25.4 percent of corporate resources in dividends and used a comparatively high 5 percent of resources for share repurchases (Weller 2004c), leaving less for productive capital investments (table 1).
All of these trends appear new only from a narrow historical perspective that compares them to the late 1990s. They have, in fact, prevailed for the past twenty-five years, excepting the 1996-2000 period, albeit less starkly. Since the mid 1970s, productivity growth has outpaced wage and compensation growth (figure 2). From 1947 to 1975, productivity and real hourly compensation grew apart by a total of 6.0 percent. In comparison, from 1975 to early 2004 (an equally long period of time), productivity grew 25.7 percent faster than real hourly compensation. This divergence was reflected in higher profitability, as the profit share rose more rapidly in the latter period than in the former (Wolff 2003).
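The cumulative figures above reflect the compounding of small annual gaps between productivity and compensation growth. The calculation below illustrates this mechanic; the annual rates are assumptions chosen only so the results land in the same range as the cumulative figures cited above, not the underlying BLS series.

```python
# Illustrative sketch: how a small annual gap between productivity growth and
# compensation growth compounds into a large cumulative divergence.
# The annual rates below are hypothetical, not the authors' data.

def cumulative_gap(prod_rate: float, comp_rate: float, years: int) -> float:
    """Cumulative excess of productivity growth over compensation growth:
    the ratio of the two compounded series, minus 1."""
    return (1 + prod_rate) ** years / (1 + comp_rate) ** years - 1

# A 0.2-point annual gap vs. a 0.8-point annual gap, each over 28 years,
# yield cumulative divergences of the same order as the 6.0 and 25.7 percent
# figures cited in the text for 1947-1975 and 1975-2004.
print(f"0.2-point annual gap over 28 years: {cumulative_gap(0.020, 0.018, 28):+.1%}")
print(f"0.8-point annual gap over 28 years: {cumulative_gap(0.020, 0.012, 28):+.1%}")
```

The point of the sketch is that even sub-percentage-point annual gaps, sustained over decades, shift a large share of income from labor to capital.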
[FIGURE 2 OMITTED]
This divergence between productivity and compensation undermined the soundness of the U.S. economy. With income lagging, households increased their consumption by saving less and borrowing more. (2) Consumer debt, especially mortgages, rose rapidly from 63.7 percent of disposable income in the early 1970s to well over 100 percent in the most recent business cycle. The lack of personal savings was matched by a lack of government savings (table 2). To finance its increasingly debt-driven economy, the U.S. had to borrow funds from overseas, a borrowing that was in turn matched by rising trade deficits (table 2).
During this whole period, incentives to use the additional corporate resources for real investment were blunted by a slowdown in demand growth. Firms responded to this development by increasing share repurchases and dividend pay-outs. In the 1970s, corporations used 10.7 percent of their resources for these purposes; since the 1980s, this share has exceeded 30 percent (table 2). In comparison, capital expenditures have steadily declined in importance as a use of internal resources since the mid-1970s (table 2).
An important break in this trend occurred in the late 1990s due to strong employment growth. From 1996 to 2000, productivity growth accelerated to a pace not seen on a sustained basis since the late 1960s and, even more importantly, the unemployment rate was driven well below prevailing estimates of the NAIRU, resulting not in inflation but in solid wage growth, a point emphasized by James Galbraith (2000). During the five-year period ending in March 2001, total real employee compensation grew 0.7 percent more than productivity. This was the first five-year period since the late 1970s in which compensation (ever so slightly) outgrew productivity.
Explaining the Divergence between Capital and Labor
A number of arguments have been put forward to explain the job-loss recovery. All, however, take as given that it is the recent, short-run job-loss recovery that needs to be explained (Bivens and Weller 2004a). This ignores the fact that it is not the job-loss recovery that is anomalous, but the late 1990s full-employment boom that increased labor's share. What needs explanation, then, is the twenty-five-year period of diverging capital and labor incomes, not just the profit-friendly economic recovery after 2001.
The economics literature more broadly is not wholly silent on this point. Various authors have examined the rise of inequality since the mid 1970s and ascribed it either to the increasing openness of the U.S. economy to international trade or to technological change biased away from (mostly unskilled) labor. David Autor et al. (1999) provided a good review of the latter literature, while William Cline (1997) provided one of the former. Both theories are insufficient to explain diverging compensation and productivity. First, the arguments are cast in terms of explaining widening wage differentials rather than a divergence between capital and labor incomes. While, theoretically, both could also apply to inequality of incomes between factors, this is not how they have largely been interpreted. Second, neither theory explains more than a plurality of the increase in wage inequality during the period in question: the relative price changes and/or demand shifts that would need to have occurred for either to be the dominant explanation of rising inequality have not been empirically documented. Third, neither approach can explain the acceleration of the trend divergence after 2001.
Institutional authors have argued that a relatively deflationary macroeconomic environment was the dominant cause of inequality in wage and capital incomes. Galbraith (1998) is the clearest exemplar of this argument; it examines the rise in inter-industry wage differentials and their relationship with the unemployment rate, arguing that high unemployment is associated with rising inter-industry differentials. This still leaves ample room for competing or complementary explanations. First, Galbraith (1998) primarily addresses inequality within labor earnings. Second, one could argue (like John Smithin) that the deflationary macroeconomic environment was itself just another manifestation of the power of institutional investors, as they demanded that policy makers target the value of their assets over other economic objectives. Third, although the trend divergence that started in the 1970s continued in the most recent recession, it also accelerated. That is...