Universal health care and the economics of responsibility.

Dell P. Champlin

Nearly a century ago, the American Association for Labor Legislation (AALL) began a campaign for universal health insurance based on the notion that health care is the joint responsibility of employers, workers, and the state (Chasse 1994). At first glance, the current American health care system appears to exhibit the shared responsibility philosophy proposed by the AALL. The cost of health insurance is underwritten by all three sectors of the economy: 1) households; 2) employers; and 3) government. However, while costs are shared, responsibility is not. The steady retreat of private firms and government from assuming a substantial share of the burden of health care costs rests on an underlying presumption that health care is entirely an individual responsibility, while the contributions of government and the private sector are basically optional--a matter of benevolence rather than responsibility. The outcome of the current, complicated debates over health care reform will likely turn on this issue of responsibility. Who should pay for health care? Is it a collective responsibility or an individual one?

The presumption that health care costs are the responsibility of individuals is supported by orthodox economics, which treats health care as a consumer good. (1) In this framework, there is no shared responsibility for health care. There is only individual demand for health care, with employers and governments in a supporting and, ultimately, market-distorting role. It is difficult to see how universal health care can be built upon such a philosophy. Institutional economics, on the other hand, views health care very differently. As Dennis Chasse (1991) notes, John R. Commons, John Andrews and other early institutionalists understood that the social and economic structure of modern capitalism left workers with little bargaining power. As a result, workers "bore an unreasonable share of the costs of economic growth and financial speculation--instability, unemployment, hazardous working conditions, and low pay" (Chasse 1991, 805). J.M. Clark likewise recognized that problems like poverty, unemployment and industrial accidents are systemic in nature and beyond the reach of individual choice and personal responsibility (Clark 1936). Clark further stressed that the benefits of good health accrue not only to individuals but to employers and the community as well: "there is a minimum of maintenance of the laborer's health and working capacity which must be borne by someone, whether the laborer works or not," or else "the community suffers a loss through the deterioration of its working power" (Clark 1923, 16, quoted in Stabile 1993, 173). More recently, institutional economists and others have questioned the applicability of the choice-theoretic framework to health care, since the choice of health care services is, at best, a joint decision, and is often made by others (Bownds 2003; Keaney 1999; 2002). In short, in the institutionalist view, health care is a social good that is fundamentally a matter of collective responsibility.

We begin the paper with a brief look at the shared cost and responsibility aspects of the current American health care system. We then examine the economics of responsibility as it applies to health care. Finally, we close the paper with a brief assessment of the outlook for health care reform. All of the myriad health care proposals retain some aspect of shared costs; they differ in how costs are allocated among the three sectors of the economy. In the institutionalist framework, any reallocation of costs must ultimately be driven by an underlying philosophy of shared responsibility. It is our contention that universal health care will be achieved only through the recognition of collective responsibility, not through reliance on the spurious notion that health and health care are purely matters of individual consumer choice.

Shared Costs in American Health Care

The drive for universal health care in the United States began in the early 1900s with the recognition that illness, accidents and poor health pose significant financial risks (Starr 1982; Chasse 1991; 1994; Rosner and Markowitz 2003). Initially, the primary concern of reformers was the risk of lost wages to individuals, but by the 1930s, the more significant risk had become the actual cost of medical services. Financial risks associated with medical care affected not only wage workers but also the middle class, and ultimately health care providers, who could not rely on timely payment of fees. Employers also faced financial risks from reduced productivity and greater turnover of workers due to poor health as well as workplace hazards. In addition, the public faced risks from a financially unstable medical sector, social costs resulting from poor public health, and reduced potential output. It is thus not surprising that the American health care system evolved over the course of the twentieth century into an intricate system of public and private insurance designed to protect individuals, health care providers, employers, and the public from these and other risks. Since WWII, the question of universal health care has been, in practice, a question of universal insurance. Indeed, the focus of contemporary health care debates is the ability of individuals, employers and governments to afford the cost of adequate health insurance (Rosner and Markowitz 2003).

In this section, we provide an overview of the American system of health insurance. Costs are shared extensively, whether intentionally, through mechanisms such as shared premiums, deductibles, coinsurance and co-payments, or unintentionally, through the shifting of costs. Cost shifting, coupled with the inherent complexity of cost accounting in medical care, has produced a system in which the true cost of a service is often unknown or unknowable (Keaney 2002; Hildred and Watkins 1996). (2) More importantly, the entire financing system is widely viewed as inefficient, inequitable and wholly inadequate. Thus, while the ultimate problem is that total health care costs in the United States continue to increase at an alarming rate, an added concern is the relative burden of costs on households, businesses, and governments. (3) Put differently, while we are all paying more than we want to, many believe that some of us are paying more than our share.
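To make the intentional cost-sharing mechanisms concrete, the sketch below traces how an annual medical bill is divided between a household and a private insurer under a hypothetical plan. All dollar amounts and rates (premium, deductible, coinsurance, out-of-pocket maximum) are invented for illustration and describe no actual plan; per-visit co-payments are omitted for brevity.

```python
# Hypothetical split of annual medical bills between a household and a
# private insurer. All figures are illustrative, not from any real plan.

def split_annual_costs(billed, premium=5_500.0, deductible=1_500.0,
                       coinsurance=0.20, oop_max=6_000.0):
    """Return (household share, insurer share) of a year's medical bills."""
    # The household pays everything up to the deductible.
    below_deductible = min(billed, deductible)
    # Above the deductible, the household pays the coinsurance fraction.
    above_deductible = max(billed - deductible, 0.0)
    household = below_deductible + coinsurance * above_deductible
    # Cost sharing stops at the out-of-pocket maximum ...
    household = min(household, oop_max)
    insurer = billed - household
    # ... but the premium is owed whether or not any care is used.
    return household + premium, insurer

for bills in (0.0, 2_000.0, 50_000.0):
    h, i = split_annual_costs(bills)
    print(f"billed ${bills:>9,.0f}: household ${h:>9,.0f}, insurer ${i:>9,.0f}")
```

Even in this simple case, the household's total outlay cannot be read off any single plan parameter, which hints at why the true cost of coverage is so difficult for purchasers to assess.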

The American system of health insurance is composed of three categories: 1) private insurance; 2) government insurance; and 3) the uninsured. There are significant problems in all three categories, although opinions differ on which category is under the most duress. As we discuss later in the paper, alternative proposals for reform tend to differ according to which of these categories is the primary target of reform. It is safe to say that there are opportunities for improvement throughout the system.

The most prevalent type of private insurance is employment-based insurance. While coverage has declined in recent years, 59.7% of the population was covered by an employer group plan at some point during 2006 (U.S. Census Bureau 2007, 20). Individuals may also purchase private insurance directly, but the only group to do so in significant numbers is those over 65, who purchase insurance to supplement Medicare (U.S. Census Bureau 2007, 20). The government insurance category includes Medicare and Medicaid as well as programs for the military and veterans. The percentage of Americans covered by some form of government insurance varies widely, from 10.3% of the 35 to 44 age group to 94.1% of those over age 65 (see Table 1). Finally, 15.8% of the population was uninsured in 2006, with most of the burden falling on those between the ages of 18 and 34.

Private Insurance

Jacob Hacker (2006) traces the origins of employment-based health insurance to the defensive maneuvers of corporations designed to block the adoption of national health insurance during the New Deal. However, the widespread availability of employment-based insurance would not have been possible without the cooperation of government policy. During WWII, the National War Labor Board ruled that fringe benefits were not subject to the general wage freeze, allowing employers to attract workers with higher compensation (Quadagno 2005). Offering fringe benefits also provided corporations a way to avoid a tax on profits that exceeded pre-war levels. As Quadagno notes, "[e]mployer contributions to group pension and health insurance plans were excluded from the calculation of profits, because they were considered a tax-deductible business expense. This ruling gave corporations an incentive to reduce excess profits by depositing them in trust funds for fringe benefits" (2005, 50). Rosner and Markowitz also point out that, "... the government reimbursed war industries on a 'cost plus' basis: total costs plus a set percentage for profit. This meant that a company that paid for its employees' hospital and medical insurance could pass along the entire cost to the government and also could make a greater profit" (2003, 63). Thus, from the beginning, cost sharing by the public was an integral part of employment-based insurance.
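The arithmetic of cost-plus reimbursement makes the incentive described by Rosner and Markowitz easy to see. In the hypothetical sketch below, the profit percentage and cost figures are invented for illustration; the point is only that every dollar spent on employee insurance is passed through in full and also enlarges the absolute profit.

```python
# Hypothetical 'cost plus' reimbursement: the government pays total costs
# plus a fixed profit percentage, so spending on employee health insurance
# is fully passed through and raises absolute profit. Numbers are invented.

MARKUP = 0.07  # assumed profit percentage for the illustration

def reimbursement(costs, insurance_outlay=0.0):
    """Government payment to a war contractor under cost-plus terms."""
    total_costs = costs + insurance_outlay
    profit = MARKUP * total_costs
    return total_costs + profit, profit

for outlay in (0.0, 100_000.0):
    paid, profit = reimbursement(1_000_000.0, outlay)
    print(f"insurance ${outlay:>9,.0f}: reimbursed ${paid:,.0f}, "
          f"profit ${profit:,.0f}")
```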

Health coverage for workers continued to increase during the post-WWII period, reaching a peak in the early 1980s when almost 80% of all workers had health insurance through their employers (Morris 2006, 11). Favorable tax treatment of employment-based insurance also continued after WWII. The Revenue Act of 1954 made it clear that fringe benefits would not be considered income and thus not subject to individual income tax (Hacker 2002). At the same time, fringe benefits are treated as an expense for employers, reducing taxes on profits. Fringe benefits are also not subject to Social Security or Medicare taxes. Thus, the greater the percentage of employee compensation paid in the form of fringe benefits, the lower the payroll tax paid either by workers or by employers. (4) Employees also...
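The arithmetic behind this payroll-tax incentive is straightforward, as the hypothetical sketch below illustrates. The 7.65% rate approximates the combined employee-side Social Security and Medicare tax (matched by the employer); the total compensation figure and fringe shares are invented for the example.

```python
# Illustrative payroll-tax arithmetic: shifting compensation toward
# fringe benefits shrinks the taxable wage base. Figures are hypothetical.

PAYROLL_TAX_RATE = 0.0765  # approx. combined Social Security + Medicare,
                           # paid by the worker and matched by the employer

def payroll_tax_per_side(total_compensation, fringe_share):
    """Payroll tax owed by each side when part of pay is fringe benefits."""
    taxable_wages = total_compensation * (1.0 - fringe_share)
    return taxable_wages * PAYROLL_TAX_RATE

for share in (0.00, 0.10, 0.25):
    tax = payroll_tax_per_side(60_000.0, share)
    print(f"fringe share {share:>4.0%}: each side owes ${tax:,.2f}")
```

Under these assumed numbers, moving a quarter of compensation into fringe benefits cuts each side's annual payroll tax from $4,590.00 to $3,442.50, which makes the attraction to both employers and employees plain.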
