Strategic Management Journal
Strat. Mgmt. J., 38: 471–482 (2017)
Published online EarlyView 17 March 2016 in Wiley Online Library (wileyonlinelibrary.com) DOI: 10.1002/smj.2497
Received 20 May 2014; final revision received 28 September 2015
RESPONSE PATTERN ANALYSIS: ASSURING DATA
INTEGRITY IN EXTREME RESEARCH SETTINGS
LISA JONES CHRISTENSEN,1* ENNO SIEMSEN,2 OANA BRANZEI,3 and
MADHU VISWANATHAN4
1Organizational Leadership and Strategy, Marriott School of Management, Brigham
Young University, Provo, Utah, U.S.A.
2Operations and Information Management, Wisconsin School of Business,
University of Wisconsin-Madison, Wisconsin, U.S.A.
3International Business and Strategy, Ivey Business School, London, Ontario,
Canada
4College of Business, University of Illinois at Urbana-Champaign, Champaign, Illinois,
U.S.A.
Research summary: Strategy scholars increasingly conduct research in nontraditional contexts.
Such efforts often require the assistance of third-party intermediaries who understand local
culture, norms, and language. This reliance on intermediation in primary or secondary data
collection can elicit agency breakdowns that call into question the reliability, analyzability, and
interpretability of responses. Herein, we investigate the causes and consequences of intermediary
bias in the form of faked data, and we offer Response Pattern Analysis as a statistical solution
for identifying and removing such problematic data. By explicating the effect, illustrating how
we detected it, and performing a controlled field experiment in a developing country to test the
effectiveness of our methodological solution, we encourage researchers to continue to seek data
and build theory from unique and understudied settings.
Managerial summary: Any form of survey research carries the risk of interviewers faking data.
This risk is particularly difficult to mitigate in base-of-the-pyramid or developing-country contexts,
where researchers have to rely on intermediaries and forms of control are limited. We provide
a statistical technique to identify faking interviewers after data collection is complete and to
remove the associated data prior to analysis. Using a field experiment in which we instruct
interviewers to fake the data, we demonstrate that the algorithm we employ achieves 90 percent
accuracy in differentiating faking from nonfaking interviewers. Copyright © 2016 John Wiley & Sons, Ltd.

Keywords: survey research; survey design; survey administration; research methods; nontraditional contexts

*Correspondence to: Lisa J. Christensen. E-mail: lisa_jc@unc.edu
INTRODUCTION
An increasing number of management scholars
collect data in nontraditional research settings
such as informal markets, slums, war zones,
low-literacy settings, base-of-the-pyramid (BoP),
and developing-country environments. These
settings are theoretically relevant (Kriauciunas,
Parmigiani, and Rivera-Santos, 2011; Suder and
Czinkota, 2013; Webb, Ireland, and Ketchen,
2014) but dynamic and possibly dangerous (Hiatt
and Sine, 2014). In order to obtain data from
these settings, researchers frequently rely on
intermediaries, local third parties who organize
and support the data collection effort. In this
paper, we address how working with data collected
through intermediaries can impact research, and
we identify a method scholars can use to screen for
and eliminate faked data.
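This excerpt ends before the method itself is presented, but the general idea of screening
interviewer-level response patterns can be illustrated. The following Python sketch is not the
authors' Response Pattern Analysis; it is a minimal illustration, under our own assumptions, of
one way to flag interviewers whose categorical answer distributions deviate from the pooled
sample. The function name, the chi-square goodness-of-fit test, and the alpha threshold are all
illustrative choices, not taken from the paper.

import numpy as np
from scipy import stats

def flag_suspect_interviewers(responses, interviewer_ids, alpha=0.01):
    """Flag interviewers whose distribution of categorical answers
    deviates from the pooled distribution across all interviewers.

    responses       : sequence of categorical answers (e.g., Likert codes)
    interviewer_ids : parallel sequence of interviewer identifiers
    alpha           : significance threshold for flagging (assumption)
    """
    responses = np.asarray(responses)
    interviewer_ids = np.asarray(interviewer_ids)
    categories = np.unique(responses)

    # Pooled proportion of each answer category across the full sample.
    pooled = np.array([(responses == c).mean() for c in categories])

    flagged = []
    for iid in np.unique(interviewer_ids):
        own = responses[interviewer_ids == iid]
        # Observed counts for this interviewer vs. counts expected
        # if the interviewer matched the pooled distribution.
        observed = np.array([(own == c).sum() for c in categories])
        expected = pooled * len(own)
        _, p = stats.chisquare(observed, f_exp=expected)
        if p < alpha:
            flagged.append(iid)
    return flagged

In use, one would flatten the survey answers into a long vector with a parallel vector of
interviewer IDs; the function returns the IDs whose answer patterns are statistically anomalous,
and those cases would then be inspected, and if warranted excluded, before analysis. Any
operational screen would of course need to be validated, as the authors do with their field
experiment, since an anomalous pattern can also reflect a genuinely unusual respondent pool
rather than faking.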
Intermediaries are usually people with stronger
cultural and developmental proximity to the
