Gathering Data for Archival, Field, Survey, and Experimental Accounting Research

DOI: 10.1111/1475-679X.12104
Journal of Accounting Research
Vol. 54 No. 2 May 2016
Printed in U.S.A.
Gathering Data for Archival, Field,
Survey, and Experimental
Accounting Research
ROBERT BLOOMFIELD,
MARK W. NELSON,
AND EUGENE SOLTES
ABSTRACT

In the published proceedings of the first Journal of Accounting Research Conference, Vatter [1966] lamented that "Gathering direct and original facts is a tedious and difficult task, and it is not surprising that such work is avoided." For the fiftieth JAR Conference, we introduce a framework to help researchers understand the complementary value of seven empirical methods that gather data in different ways: prestructured archives, unstructured ("hand-collected") archives, field studies, field experiments, surveys, laboratory studies, and laboratory experiments. The framework spells out five goals of an empirical literature and defines the seven methods according to researchers' choices with respect to five data gathering tasks. We use the framework and examples of successful research studies in the financial reporting literature to clarify how data gathering choices affect a study's ability to achieve its goals, and conclude by showing how the complementary nature of different methods allows researchers to build a literature more effectively than they could with less diverse approaches to gathering data.

S.C. Johnson Graduate School of Management, Cornell University; Harvard Business School, Harvard University.

Accepted by Douglas Skinner. This paper has been prepared for the 2015 Journal of Accounting Research Conference. We thank Matthew Bloomfield, Scott Emett, Ryan Guggenmos, Anne Heinrichs, Scott Jackson, Bob Libby, Karthik Ramanna, Jennifer Tucker, Isabel Wang, Anastasia Zakolyukina, and participants at the 2015 JAR Conference for comments, and Rajesh Vijayaraghavan for his assistance gathering data on published accounting research.

Copyright © 2016, University of Chicago on behalf of the Accounting Research Center
JEL codes: M40; M41; B40; C81; C90; C91; C92; C93; C99
Keywords: archival; data; experiment; empirical methods; field study; survey; financial reporting
1. Introduction
To open the published proceedings of the first Journal of Accounting Research
Conference (May, 1966), Sidney Davidson wrote:
Accounting is singularly concerned with the quantitative expression of economic phenomena. Despite this preoccupation with numerical presentation, there has been little empirical analysis or testing of the concepts of accounting. Accounting thought will develop more effectively by increased reliance on the testing of meaningful hypotheses; there is a need to look to evidence as well as to authority for the substantiation of accounting ideas (Davidson [1966, p. iii]).
Fifty years after the first JAR conference, most accounting studies fulfill Davidson's exhortation to "look to evidence," and in so doing have allowed researchers to understand and predict the causes and consequences of many accounting phenomena. However, the literature still struggles to confront the challenge laid out in William Vatter's "Critical Synthesis of Conference Papers," which closed the published proceedings:

One of the real limitations of empirical research is that we tend to work on problems we are able to study because data are available; we thereby tend to overlook problems that we ought to study, if data for such problems are not easy to obtain. It is significant that the larger and more comprehensive efforts reported here have dealt with published or otherwise readily available data. Gathering direct and original facts is a tedious and difficult task, and it is not surprising that such work is avoided (Vatter [1966, p. 232]).
Our goal is to help accounting researchers realize Vatter's vision over the next 50 years, much as they have realized Davidson's in the last 50. We know that "gathering direct and original facts is a tedious and difficult task," and we cannot make it any easier. But we can help researchers gather data wisely and explain their choices and contributions to readers, reviewers, and editors. We do so by introducing a framework that spells out five goals of an empirical literature and five data gathering tasks that researchers can use to advance those goals. We use the framework to define seven methods that appear in empirical accounting research, to clarify how data gathering choices affect a study's contribution, to recommend practices that will enhance that contribution, and to show how a community of scholars builds a literature more effectively by using a wider range of methods.
The paper proceeds as follows. In section 2, we classify articles in four top accounting journals by their topic and their method. Our results are consistent with Vatter's conjecture that method choices are affected by the difficulty of accessing new data. For topics with a great deal of readily available data, like financial reporting, corporate governance and compensation, and taxation, a large majority of published articles rely on archival methods. For topics with less readily available data, like managerial accounting and auditing, a greater proportion of articles rely on methods that require researchers to gather new data, such as laboratory experiments and field studies. We also find substantial variation in method choice across journals, even after controlling for topic. This variation suggests that methodological choices are influenced by a journal's mission and traditions.
In section 3, we draw from a philosophy of science called constructive empiricism to identify five goals that an empirical literature seeks to accomplish: (1) specifying causal theories to test, (2) testing for predicted associations between variables, (3) attributing those associations to the causal theories, (4) verifying robustness and generality of results, and (5) placing results in context and offering additional opportunities for theory building. A successful literature, taken as a whole, strives to achieve all of these goals. Any particular study is likely to emphasize some goals more than others.
In section 4, we identify five data gathering tasks that researchers either can choose to undertake or delegate to others (or, in some cases, to nature). The first two tasks help researchers distill observations into variables that are well suited to testing their causal theory. Researchers can choose whether to (1) record observations specific to testing their theory or use records that were created by others for more generic purposes, and (2) hand-collect records to convert them into structured data sets amenable to statistical analysis or use data prestructured by others. The other three tasks involve researchers intervening in the data-generating process to record data to test their theory. Researchers can choose whether to (3) elicit dependent variables or observe those variables, (4) manipulate independent variables or allow variation to arise naturally, and (5) control other variation in the setting or allow that setting to vary naturally. In section 4.1, we discuss how choices with respect to each of these tasks can help a study achieve some goals at the expense of others. In section 4.2, we define seven distinct methods according to the bundle of data gathering tasks that the researcher chooses to undertake. We derive the seven methods by assuming that two studies use the same method if the researcher undertakes the same set of tasks; otherwise, the studies use different methods.
In section 5, we discuss a number of the framework's implications. The framework draws several useful distinctions between methods, indicating two distinct types of archival study (depending on whether the researcher hand-collects data or uses a prestructured archive), two forms of laboratory study (depending on whether the researcher manipulates an independent variable), and a narrow definition of field study (because it requires that the researcher record original data). The framework also indicates that many studies (or parts of studies) that are called surveys actually apply a
