Research-Based Guidelines for Juvenile Justice Programs

Authors: Mark W. Lipsey, James C. Howell
Published: 01 June 2012
DOI: 10.3818/JRP.14.1.2012.17
Subject Matter: Special Issue on Evidence-Based Policy and Practice


Research-BasedGuidelinesforJuvenileJustice 
 Programs
James C. Howell
The Comprehensive Strategy Group
Mark W. Lipsey
Peabody Research Institute
Vanderbilt University
Abstract
Juvenile justice systems make use of many programs intended to reduce the recidivism
of the juvenile offenders with whom they interact. Not all such programs are effective,
and one of the more progressive reforms of recent years has been the movement toward
programs validated by research evidence. Three ways to define evidence-based programs
are described, with a focus on a relatively unfamiliar approach: evidence from
meta-analysis of evaluation research that supports the effectiveness of many generic types
of programs. In contrast to the prevailing model program approach, this approach
makes use of evidence that supports the effectiveness of many of the homegrown and
local programs that juvenile justice systems use. The findings of a large meta-analysis
of hundreds of studies reveal that many of these more generic programs are as effective
as comparable model programs. These findings have been operationalized into a rating
scheme based on the characteristics of effective interventions that can be used by service
providers and juvenile justice systems to assess their programs. Two recidivism studies
provide promising indications of the validity of this scheme for identifying effective
programs and guiding improvement for ineffective ones. The results of this work show
that the large body of research on interventions with juvenile offenders can be used to
create guidelines that extend the concept of evidence-based programs to the kinds of
generic programs most commonly used in juvenile justice systems.
The meta-analysis on which this paper is based was supported in part by grants from
the National Institute of Mental Health, the Office of Juvenile Justice and Delinquency
Prevention, and the Russell Sage Foundation. Thanks to the Arizona Juvenile Justice Services
Division for their support of the recidivism analysis.
JUSTICE RESEARCH AND POLICY, Vol. 14, No. 1, 2012
© 2012 Justice Research and Statistics Association

Efforts to implement evidence-based programs as a way to obtain better outcomes
from juvenile justice interventions are arguably the most progressive policy reform of
recent years. Not all programs are effective for producing the intended outcomes,
but a program for which there is already evidence of effectiveness is generally a
better bet than an untested one. Evidence-based programs can be defined in different
ways, however, depending on what we mean by program and what we mean by
evidence. Recognizing different definitions that nonetheless involve programs that
are practical to implement and evidence that is scientifically credible can broaden
the options for a juvenile justice system in useful ways. For this purpose, we distinguish
three ways to view programs and their supporting evidence.
1. The specific operating procedure of a particular program.
A single, unique program is defined by its specific operating procedure: the
implicit or explicit procedures that constitute that specific program in its particular
site and context as practiced. Supporting evidence for any such specific program
can be obtained by conducting an impact evaluation, that is, an evaluation that
determines whether the program produces the intended outcomes. To provide the
most valid results, an impact evaluation must use a control group of comparable
juveniles who do not receive the program, preferably assigned randomly to
program and no-program conditions. An evaluation of this sort for a specific program
can provide credible evidence of effectiveness and, with positive results, that
program can rightly claim to be evidence-based. Such direct evaluation, in fact,
provides the most convincing evidence for a specific program. The disadvantage is the
difficulty and expense of conducting such an evaluation, especially for the many
programs used in a juvenile justice system.
2. Brand name protocol programs.
Another way of defining a program is by way of a manual or protocol that
specifies how the program is to be implemented. There are many familiar examples
of protocol programs in juvenile justice, e.g., Functional Family Therapy (FFT),
Multisystemic Therapy (MST), Multidimensional Treatment Foster Care (MTFC),
Aggression Replacement Training (ART), and the like. Using such a program
requires that it be implemented locally with fidelity to the program developer’s
specifications for how it is to be delivered. Such programs are properly viewed as
evidence-based if impact evaluations conducted on implementations elsewhere have
found positive effects. These “model” or “exemplary” programs are typically
identified through a review of research by some set of designated reviewers. Examples of
such efforts include the Blueprints for Violence Prevention, the National Registry of
Evidence-based Programs and Practices (NREPP), and the Office of Juvenile Justice
and Delinquency Prevention (OJJDP) Model Programs Guide. Though their criteria
vary and there is no consensus on the appropriate standards, programs appearing
on lists such as these have become the de facto definition of what constitutes an
evidence-based program. The advantages of this type of evidence-based program
are the assurances of effectiveness if implemented with fidelity and the availability
