Variability of Crime Clearance Among Police Agencies

Thomas L. Scott1, Charles Wellford1, Cynthia Lum2, and Heather Vovak2

Police Quarterly, 2019, Vol. 22(1), 82–111
DOI: 10.1177/1098611118796597
Published: 01 March 2019
Abstract
Average crime clearance rates have remained remarkably stable in the United States since the 1980s, despite many advances in investigative technologies and fluctuations in crime. Taking these average trends at face value, some have suggested that this stability indicates that police departments can do little to alter their clearance rates. However, in this study, we find that the average trends mask substantial long-term variation in crime clearance among police agencies. Using group-based trajectory modeling, we test whether large U.S. police departments have reported uniquely different long-term clearance rate trends from 1981 to 2013 and what organizational factors might contribute to different trends. As we discuss, this method has attractive qualities that provide for a more rigorous analysis compared with past comparative work. Our results show diverse levels and patterns of clearance both within individual crime types and across multiple crime types that appear to covary with organizational factors. We explain how finite mixture modeling can advance both quantitative and qualitative research by identifying departmental differences in performance for further study.
1 Department of Criminology and Criminal Justice, University of Maryland, College Park, MD, USA
2 Department of Criminology, Law and Society, Center for Evidence-Based Crime Policy, George Mason University, Fairfax, VA, USA

Corresponding Author:
Thomas L. Scott, Department of Criminology and Criminal Justice, University of Maryland, College Park, MD, USA.
Email: tscott1@terpmail.umd.edu

Keywords
clearance rate, police performance, trajectory, group-based trajectory modeling, multitrajectory modeling
Introduction
One of the most important functions of law enforcement is the investigation and resolution of crimes. In the last half century, American police agencies have seen a great deal of advancement and innovation in criminal investigations, ranging from the standardization and computer automation of case documentation and processing to improvements in forensic and investigative technologies that identify suspects more accurately and quickly. Crime analysts have also become an important part of investigations, assisting with searching for individuals, gathering clues, and generating patterns of similarities between cases. Particularly for serious victimizations involving violence and theft, police agencies devote significant amounts of resources to investigations, often 10% to 20% of their annual budgets.
Despite these recent advances and the resources allocated to investigations, the resolution or clearance of crime in the United States is arguably low. In the latest year for which data are available in the United States (2016), there were approximately 1.25 million violent crimes reported to the police, of which 54%, including 7,000 homicides, were not cleared by an arrest or by exceptional means. In addition, of the nearly 7.92 million serious property crimes that occurred in 2016, 6.47 million (about 82%) remained unsolved. In total, approximately 78% of all serious crimes did not result in a successful resolution.1
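As a quick back-of-the-envelope check, the roughly 78% figure follows from combining the violent and property crime numbers cited above. The short sketch below reproduces that arithmetic using the rounded values from the text, not official UCR tabulations.

```python
# Back-of-the-envelope check of the combined non-clearance share cited above.
# Inputs are the rounded figures from the text, not an official UCR tabulation.
violent_reported = 1.25e6                      # violent crimes reported to police, 2016
violent_uncleared = 0.54 * violent_reported    # 54% not cleared
property_reported = 7.92e6                     # serious property crimes, 2016
property_uncleared = 6.47e6                    # ~82% unsolved

total_reported = violent_reported + property_reported
total_uncleared = violent_uncleared + property_uncleared
print(f"Share of serious crimes not cleared: {total_uncleared / total_reported:.0%}")  # ~78%
```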
Perhaps even more interesting is that average clearance rates have not changed much over the last 30 years for many crime types (Braga, Flynn, Kelling, & Cole, 2011).2 Figure 1 shows the average clearance rates for homicide, robbery, aggravated assault, and burglary from 1981 to 2013 for all agencies with 100 or more officers as of 1980.3 This figure shows that although clearance rates differ dramatically across crime types, being highest for homicide and lowest for burglary, average national clearance trends within crime types have remained remarkably stable for over three decades. In Figure 2, we replicate these average clearance rate trajectories for the 100 largest police agencies as of 1981. For these agencies, while there have been slight declines in clearance rates in three of these four crime types, trends continue to remain fairly consistent. Clearance rates for aggravated assaults have declined from around 60% in the 1980s to 50% in the 2000s, robbery rates have hovered around the 30% range, burglary gradually declined from 15% to 10%, and the homicide clearance rate dropped from 75% in 1981 to below 70% in recent years.
Figure 1. Yearly crime clearance rates for the United States from 1981 to 2013 (all agencies with 100 or more officers).

Figure 2. Yearly crime clearance rates for the United States from 1981 to 2013 (100 largest agencies subsample).
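For context, a clearance rate in UCR-style data is the number of offenses cleared (by arrest or exceptional means) divided by the number of offenses known to police. The sketch below, which uses simulated data and hypothetical column names rather than the authors' UCR extract, shows one common way a yearly cross-agency average like those plotted in Figures 1 and 2 could be assembled from agency-level counts.

```python
# Minimal sketch: building yearly average clearance rates from agency-level counts.
# Data and column names are hypothetical, not the authors' UCR extract.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
years = np.arange(1981, 2014)
agencies = [f"agency_{i}" for i in range(100)]

records = []
for a in agencies:
    base = rng.uniform(0.55, 0.85)            # agency-specific homicide clearance level
    for y in years:
        known = rng.integers(20, 200)         # homicides known to police that year
        cleared = rng.binomial(known, base)   # homicides cleared by arrest/exception
        records.append((a, y, known, cleared))

df = pd.DataFrame(records, columns=["agency", "year", "known", "cleared"])

# Per-agency clearance rate, then the unweighted cross-agency yearly average
# (one way to summarize the kind of series shown in Figures 1 and 2).
df["clearance_rate"] = df["cleared"] / df["known"]
yearly_avg = df.groupby("year")["clearance_rate"].mean()
print(yearly_avg.head())
```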

The long-term stability in clearance rates despite advances in policing and investigation strategies, or even major fluctuations in crime types, has led some to question whether the police can do anything to improve their ability to solve crimes (Braga et al., 2011, p. 8; Travis, Western, & Redburn, 2014, pp. 48–49). We are reluctant to take these average trends and their assessments at face value. Perhaps some agencies consistently have clearance rates that are much higher (or lower) than the averages shown in Figures 1 and 2. Average trends might mask unique variations in individual agency performance over time, which, given further study, could offer clues as to why some agencies perform better than others. What is lacking from much research and theory on the dynamic relationship between clearance rates and other important variables, like investigative advances or crime rates, is an explicit statement regarding how agency clearance rate trends are distributed around the national average. One exception is a recent article by Worrall (2016), who was one of the first to apply group-based trajectory modeling to the study of police clearances when he modeled variation in property and violent crime clearance trajectories from 2000 to 2012 for 570 law enforcement agencies.
Such inquiries into variations in clearance rate trends are needed in today's policing environment. Clearance rates are used as a common measure of police effectiveness in both research and practice, and the resolution of crimes is important for police legitimacy among individual victims as well as communities. Yet, we know little about how the efficacy and effectiveness of police investigative practices affect crime clearance rates. For example, the Lum, Koper, and Telep (2011) Matrix4 now houses 165 moderate to very strong evaluation studies of policing, yet only 11 seem connected to the work of investigative units (see, e.g., Bynum & Varano, 2003; Eck & Wartell, 1998; Fox & Farrington, 2015; Jolin, Feyerherm, Fountain, & Friedman, 1998; Koper, Taylor, & Woods, 2013; Martin & Sherman, 1986; Nunn, Quinet, Rowe, & Christ, 2006; Spergel, Wa, & Sosa, 2002).
Early research by the RAND Corporation and others raised questions about the utility of investigations by showing that the outcomes of criminal investigations typically depend on information obtained by patrol officers who first respond to the scene, and that follow-up activities by detectives appeared to add little to the apprehension of offenders (Greenwood & Petersilia, 1975). Concern over the ability of police investigations to impact crime rates and crime clearances led researchers to examine what might contribute to crime clearances and, in turn, how criminal investigations might be improved through better management, training, policies, and investigative techniques (e.g., Braga & Dusseault, 2018; Coupe, 2016; Cronin, Murphy, Spahr, Toliver, & Weger, 2007; Eck, 1983; Higginson, Eggins, & Mazerolle, 2017; Keel, Jarvis, & Muirhead, 2009; Ritter, 2008; Wellford & Cronin, 1999). These studies seem to indicate that, aside from specific characteristics of crimes themselves, the application of investigative resources may influence whether crimes are resolved.

Understanding variations in crime clearances is an important first step in determining what leads to improved clearance rates for agencies over time, and it is the starting point of a larger project in which we have been engaged with the Laura and John Arnold Foundation since 2015. Using techniques similar to those applied by Worrall (2016), namely group-based trajectory modeling (GBTM; Nagin, 2005), but on a more targeted group of agencies (very large agencies) and for a longer time series (33 years), we disaggregated violent and property crime types into specific offenses to analyze variations in clearance rate trajectories. We also add to previous work by applying a recent advancement in GBTM, multitrajectory modeling (Jones & Nagin, 2007; Nagin, Jones, Passos, & Tremblay, 2016), to test our hypothesis that there may be identifiable groups of law enforcement agencies that have similar long-term clearance rate trends across multiple crime types, and that those group trends are distinct from the national average. This is an innovative approach to understanding clearance rate trends, and we explain it in detail in the Methods section below.
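To make the modeling strategy concrete, the sketch below illustrates the general logic of grouping agencies by their long-term clearance trajectories. It is not the estimation procedure used in this article: canonical GBTM jointly estimates group-specific polynomial trajectories by maximum likelihood (Nagin, 2005), typically with specialized software such as the Stata traj plugin. As a simplified, hypothetical stand-in, the code summarizes each simulated agency's 33-year clearance series with polynomial trend coefficients and clusters those summaries with a finite (Gaussian) mixture model, choosing the number of groups by BIC, which mirrors the model-selection logic common in GBTM applications.

```python
# Illustrative stand-in for GBTM, not the authors' estimation procedure.
# All data are simulated; group structure and parameters are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_agencies, n_years = 100, 33
years = np.arange(n_years)

# Simulate three latent groups of homicide clearance trajectories (proportions).
true_group = rng.integers(0, 3, n_agencies)
level = np.array([0.80, 0.65, 0.50])[true_group]
slope = np.array([-0.002, 0.000, -0.005])[true_group]
traj = level[:, None] + slope[:, None] * years + rng.normal(0, 0.05, (n_agencies, n_years))
traj = np.clip(traj, 0.0, 1.0)

# Step 1: summarize each agency's series with quadratic trend coefficients.
coefs = np.polynomial.polynomial.polyfit(years, traj.T, deg=2).T  # (n_agencies, 3)

# Step 2: fit finite mixtures with 1 to 6 groups and keep the best BIC.
fits = [GaussianMixture(n_components=k, n_init=5, random_state=0).fit(coefs)
        for k in range(1, 7)]
best = min(fits, key=lambda m: m.bic(coefs))
labels = best.predict(coefs)

print("groups selected by BIC:", best.n_components)
for g in range(best.n_components):
    members = labels == g
    print(f"group {g}: n={members.sum()}, mean clearance={traj[members].mean():.2f}")
```

In a multitrajectory extension of this logic, the per-agency summaries for several offense types (e.g., homicide, robbery, aggravated assault, and burglary) would be combined before grouping, so that a single group assignment reflects an agency's joint clearance pattern across crimes.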
In addition to Worrall (2016), there is some evidence to support a group-based conception of long-term clearance rate trends. This is most evident in studies of homicide...
