Reconsidering Homicide Clearance Research: The Utility of Multifaceted Data Collection Approaches

Shila René Hawk (Applied Research Services, Inc., USA), Dean A. Dabney (Georgia State University, USA), and Brent Teasdale (Illinois State University, USA)

Homicide Studies, 2021, Vol. 25(3), pp. 195–219
Published August 1, 2021; © 2020 SAGE Publications
DOI: 10.1177/1088767920939617
Corresponding author: Shila René Hawk, Applied Research Services, Inc., 3235 Cains Hill PL NW, Atlanta, GA 30305, USA. Email: shawk@ars-corp.com
Abstract

This study explores issues associated with the data commonly used in homicide clearance research. Data collected from 2009 to 2011 case files (n = 252) were reviewed during interviews with investigators (n = 29). The multifaceted data collection approach produced a more comprehensive dataset than was available from case file reviews alone, with alterations to the data occurring in as many as 69% of the cases. The process advanced the precision of the data recorded, reduced missingness, and heightened detail on key variables. Significant differences were noted in multivariate analyses of the datasets when modeling clearances. Findings suggest that contextualizing case file data is valuable.
Keywords
homicide, clearance, policing, multimethod, missing data, investigation, methodology,
solvability
Widely recognized as the most thoroughly documented crime handled by law enforcement, homicide has long supplied the data criminologists use to inform the theoretical and policy landscape of the field. Simply stated, no other type of offense more centrally informs criminology than homicide. Consider the following observations in this regard: 1) nearly the entire subfield of macro criminology is built around studies using city, state,
or national homicide data; 2) rarely does a year pass in which top-tier journals do not publish multiple articles built around homicide data; 3) the journal Homicide Studies publishes four issues annually dedicated exclusively to the phenomenon; and 4) the National Institute of Justice maintains a robust portfolio of funding, research reports, and publicly available databases focused on homicide and its prevention. Given its prominence in our scholarly and public thinking about crime, it is critical that our understanding of the occurrence of and response to homicide be shaped by the most valid and reliable data possible.
This study employs a series of homicide data enhancement protocols and empirically examines the degree to which these efforts impact the prediction of homicide case closure. Considerable scholarship on the topic of homicide case closure spans many decades (see Riedel, 2008, and Hough & McCorkle, 2020, for general overviews). Still, the homicide clearance literature needs increased methodological conversation to help remediate the fact that findings are mixed, incongruent, and even contradictory (Lee, 2005; Riedel, 2008; Wellford & Cronin, 1999). Many research questions remain unanswered (Jarvis & Regoeczi, 2009; Jiao, 2007; Mancik et al., 2018; Rydberg & Pizarro, 2014). Arguably, the next step in advancing the homicide clearance literature is for research to move past the data limitations of previous studies.
Scholars have noted that clearance studies based on archival data suffer from quality and completeness issues that significantly compromise how they can be causally interpreted (Alderden & Lavery, 2007; Lundman & Myers, 2012; Puckett & Lundman, 2003; Riedel & Rinehart, 1996). We investigate data quality as a potential source of these issues. We observe that most clearance research relies exclusively on archival data, and those data sources may not accurately reflect the facts of the criminal event and subsequent investigation efforts. Research using these data is often described as having imprecise measures or omitting factors that likely shape case outcomes. These data quality issues may pose a threat to models seeking to account for the factors that predict case closure, and to the theoretical and policy-related insights that follow.
When relying on agency-released data, homicide researchers are limited to a predefined set of measures and face issues of missing data (Riedel & Regoeczi, 2004). This results in gaps in the inclusion of key measures, the use of proxies, and compromised reliability (Pizarro & Zeoli, 2013; Roberts, 2007). Similar problems are observed in comprehensive archival datasets assembled directly from official case files, as they are generally restricted by agency record-keeping practices and lack some key information for ongoing investigations (Rydberg & Pizarro, 2014; Schroeder & White, 2009). This means that data concerning relevant case characteristics and investigation activities are largely absent (Carter, 2013; Keel et al., 2009; Regoeczi & Jarvis, 2013). Furthermore, these key data points are often systematically missing, given that homicide investigation narratives generally constitute truncated and sterilized prosecution-oriented documents rather than a totalistic account of the investigation process (Hawk & Dabney, 2019; Innes, 2003).
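To make the missingness critique concrete, the minimal sketch below (in Python, using pandas) shows how the share of missing values on key solvability variables could be audited before and after interview-based augmentation. The file names and variable names are hypothetical illustrations, not the study's actual data or coding scheme.

```python
# Illustrative sketch only: file names and variable names are hypothetical,
# not the study's actual data or coding scheme.
import pandas as pd

# One row per homicide case; NaN marks a solvability factor that the
# records did not document.
file_only = pd.read_csv("case_file_data.csv")   # coded from case files alone
augmented = pd.read_csv("augmented_data.csv")   # after investigator interviews

key_vars = ["witness_present", "weapon_recovered", "victim_offender_tie"]

# Percent missing on each key variable, per dataset.
missing = pd.DataFrame({
    "file_only_pct_missing": file_only[key_vars].isna().mean() * 100,
    "augmented_pct_missing": augmented[key_vars].isna().mean() * 100,
})
print(missing.round(1))
```

A systematic gap, such as a variable that is missing far more often in unsolved cases, would show up here as a large difference between the two columns.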
The current study systematically compares contemporary homicide case file data with the same data augmented through interviews with the investigators who constructed the files. We examine whether the above-mentioned precision and robustness problems can be remedied through this multifaceted data collection approach.
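As a rough sketch of the kind of comparison this sets up, the snippet below fits one generic logistic regression of case clearance to both versions of a hypothetical dataset using statsmodels. The specification, predictors, and file names are placeholders and do not reproduce the authors' actual models.

```python
# Hedged illustration: the same generic clearance specification fit to
# both versions of the data. Predictors and file names are hypothetical
# placeholders, not the authors' actual model.
import pandas as pd
import statsmodels.formula.api as smf

formula = "cleared ~ witness_present + weapon_recovered + victim_offender_tie"

for label, path in [("case files only", "case_file_data.csv"),
                    ("augmented", "augmented_data.csv")]:
    df = pd.read_csv(path)
    # Rows with missing values are dropped by the formula interface, so
    # reduced missingness also changes the estimation sample itself.
    fit = smf.logit(formula, data=df).fit(disp=False)
    print(f"--- {label} ---")
    print(fit.params.round(3))  # compare coefficient estimates across datasets
```

Divergent coefficients across the two fits would illustrate the core claim: conclusions about what predicts clearance can hinge on how the underlying data were collected.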
