Provider performance reports and consumer welfare

Published date: 01 March 2017
DOI: http://doi.org/10.1111/1756-2171.12175
Author: Henry Y. Mak
RAND Journal of Economics
Vol. 48, No. 1, Spring 2017
pp. 250–280
A provider’s performance report consists of his service’s average outcome and volume. The two variables depend on the provider’s private quality type and current demand, but he can raise his average outcome by dumping vulnerable consumers. Prospective consumers infer providers’ qualities from their reports. Performance reporting drives some providers to dump consumers when competition is intense, but it may not reveal providers’ qualities when their average quality is high. Statistical adjustment aimed at making reports independent of consumer characteristics can lead to more dumping, less informative reports, or both. There is more dumping when volume information is withheld and less dumping when ratings information is coarse.
1. Introduction
“Success should be judged by results, and data is a powerful tool to determine results. We
can’t ignore facts. We can’t ignore data.”
President Barack Obama, July 24, 2009
Public reporting is instituted to collect and disseminate performance data so that success can
be judged. New York State started reporting cardiac surgery performance in 1989. Since then,
public performance reporting has penetrated the healthcare sector and other sectors such as social
services, crime and enforcement, and education, in the United States and elsewhere.
Unlike private agencies that can report opaque or subjective performance measures, public agencies are often obligated to base their reports on transparent and objective measures.1 Therefore, most public performance reports use end results as measures of underlying service qualities. Because results often depend on clientele, some public performance reports are adjusted to isolate service qualities from consumer characteristics. Table 1 gives some examples of public performance reports under four categories based on their contents.2

Indiana University-Purdue University Indianapolis; makh@iupui.edu.
This article is based on a chapter of my PhD dissertation at Boston University. I am indebted to Ching-to Albert Ma for his guidance and support. I also wish to thank Sambuddha Ghosh, Jacob Glazer, Michael Luca, and participants at various seminars and conferences for their comments. I am grateful to the Coeditor, Kathryn Spier, and two anonymous referees for their suggestions. All errors are my own.

1 For example, Google search uses secret algorithms to rank the relevance of web pages. Ederer, Holden, and Meyer (2014) show that opaque incentive schemes can outperform transparent ones when agents are strategic. A large literature studies subjective performance evaluation; see, for example, Levin (2003) and MacLeod (2003).

TABLE 1  Four Categories of Public Performance Reports

Contents of Reports                        Examples
1. Average outcome and volume              Child protection investigation completion rate and number of cases of a district
2. Adjusted average outcome and volume     Surgery mortality rate of a hospital adjusted for patient severities and number of cases
3. Average outcome                         Standardized test passing rate of a school
4. Adjusted average outcome                Job placement rate of a training center adjusted for enrollment demographics
Although these reports have different contents, they share a common purpose of using results
to direct consumer choice and resource allocation. For example, surgery outcome reports induce
patients to switch from high-mortality to low-mortality hospitals; job placement reports allow
funding authorities to determine training centers’ budgets according to their placement rates.
These changes benefit patients and trainees if the end results are perfect measures of service
qualities.
However, a provider is likely to have more information about his consumers than the public
reporting agency. This allows a provider to manipulate his end results strategically. For example,
a low-quality school can raise its standardized test passing rate by retaining weaker students and
increasing special education placements (Jacob, 2005). These strategic behaviors can harm the
affected students and defeat the purpose of performance reporting by making the passing rates
uninformative.
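To make the mechanism concrete, here is a minimal numerical sketch with invented pass probabilities and cohort sizes (they are assumptions for illustration, not figures from the article or from Jacob, 2005). It shows how keeping weaker students out of the tested pool mechanically raises a reported passing rate even though the school's underlying quality is unchanged.

```python
# Minimal sketch with invented numbers: how excluding weaker students
# raises a reported average outcome without any change in quality.

strong_pass_rate = 0.9   # hypothetical pass probability of stronger students
weak_pass_rate = 0.4     # hypothetical pass probability of weaker students
n_strong, n_weak = 60, 40

# Reported passing rate when every student is tested.
full_cohort = (n_strong * strong_pass_rate + n_weak * weak_pass_rate) / (n_strong + n_weak)

# Reported passing rate after "dumping" half of the weaker students
# (e.g., retention or special education placement keeps them out of the tested pool).
n_weak_kept = n_weak // 2
selected_cohort = (n_strong * strong_pass_rate + n_weak_kept * weak_pass_rate) / (n_strong + n_weak_kept)

print(f"pass rate, full cohort:   {full_cohort:.2f}")      # 0.70
print(f"pass rate, after dumping: {selected_cohort:.2f}")  # about 0.78
```

The reported rate rises from 0.70 to roughly 0.78 purely through selection, which is why such behavior can make the published passing rates uninformative about quality.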
In this article, I develop a model to study how public performance reporting affects providers’ service decisions and consumer welfare. Providers in the model have private information about their service qualities and demands. They can manipulate their reports by dumping consumers with certain characteristics.3,4 The model shows how the four categories of reports in Table 1 yield different dumping choices, report informativeness, and consumer welfare. In particular, I show that reporting adjusted outcomes (categories 2 and 4 in Table 1) may lead to more dumping,
less informative reports, and lower consumer welfare. These results can account for a number of
empirical findings. They are also useful for the design of public performance reporting strategy.
For concreteness, consider a market for surgery as an application. In each of two periods,
some new patients would like to receive treatments from one of two providers. Each patient’s
severity can be high or low and each provider’s quality type can be good or bad. A patient’s
treatment outcome depends on her severity and the quality type of her provider. The number of
patients who request a provider’s service is stochastic, but increasing in the provider’s perceived
quality. Each provider receives positive payoffs from providing services.
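The sketch below illustrates the reporting environment just described. It is a simplified simulation, not the article's formal model: the success probabilities, severity mix, and patient numbers are invented for illustration, and only one provider and one period are simulated. Its point is the basic trade-off the model formalizes: dumping high-severity patients raises a provider's reported average outcome but lowers his reported volume.

```python
import random

# Invented outcome probabilities by (quality type, patient severity).
SUCCESS_PROB = {
    ("good", "low"): 0.95, ("good", "high"): 0.80,
    ("bad", "low"): 0.85,  ("bad", "high"): 0.55,
}

def period_report(quality, n_patients, dump_high_severity, rng):
    """Return (average outcome, volume) for one period of service."""
    outcomes = []
    for _ in range(n_patients):
        severity = "high" if rng.random() < 0.3 else "low"   # assumed severity mix
        if dump_high_severity and severity == "high":
            continue                                          # patient is turned away, not treated
        success = rng.random() < SUCCESS_PROB[(quality, severity)]
        outcomes.append(1 if success else 0)
    volume = len(outcomes)
    avg = sum(outcomes) / volume if volume else float("nan")
    return avg, volume

rng = random.Random(0)
for dump in (False, True):
    avg, vol = period_report("bad", n_patients=200, dump_high_severity=dump, rng=rng)
    print(f"dumping={dump}: average outcome {avg:.2f}, volume {vol}")

# A bad provider who dumps reports a higher average outcome but a lower volume,
# which is why withholding volume information (categories 3 and 4 in Table 1)
# changes the incentive to dump.
```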
A provider’s quality type is his private information. If there is no performance reporting, the
demands in both periods are independent of the providers’ quality types. Now suppose a public
reporting agency observes and reports the quality type of each provider at the end of period
one. These reports induce some patients in period two to switch from a bad provider to a good provider.
2 These examples are from Florida Department of Children and Families (www.dcf.state.fl.us/performance/dashboard); New York Department of Health (health.ny.gov/statistics); Indiana Department of Education (compass.doe.in.gov); and the United States Job Training Partnership Act (JTPA) program (wdr.doleta.gov/opr/fulltext/99-performance.pdf). The JTPA program was replaced by the Workforce Investment Act program in 2000.
3 In this article, I follow Ellis (1998) and use the term “dumping” to refer to the avoidance of consumers with worse service outcomes. However, many authors use the terms “dumping” and “cream-skimming” interchangeably.
4 In addition to dumping weaker students, Dranove et al. (2003) find that cardiac surgery performance reports have led surgeons to dump sicker patients; Heckman and Smith (2004) find that applicants with less education and from poorer families are less likely to be enrolled in job training programs.