Are social workers ignoring the 'cornerstone of science' by failing to replicate their research? A rejoinder.

Author: Howard, Matthew O.
Position: Rejoinder

Jonah Lehrer (2010) contended that replication studies often yield findings far weaker than those reported in the original investigations they attempted to replicate. If we grant that such a "decline effect" exists, what are its potential implications for social work, evidence-based practice, and science generally? Mark Ezell's editorial thoughtfully considers issues pertaining to the decline effect and the role of replication research generally in the advancement of social work knowledge and practice.

The decline effect may be a particular concern in social work, where comparatively few intervention trials of any sort are published. Rosen, Proctor, and Staudt (1999) reported that only 53 (3%) of 1,849 articles published in social work journals over a five-year period were research evaluations of potentially replicable interventions. Although the current situation may be marginally better, it remains true that relatively few randomized controlled trials of social interventions are published annually in the social work literature (Proctor & Rosen, 2008). Furthermore, most replication studies in social work are "partial" rather than exact, and many focus on replicating substantive associations or treatment outcomes in client populations, settings, or both that differ from those used in the pertinent initial evaluations. That is, many purported replication studies are essentially efforts to establish the external validity of initially promising interventions.

If replication trials truly are the "cornerstone of science," what are the consequences for our profession of their conspicuous absence from the social work research literature? Perhaps the most serious problem is that the self-correcting function played by replication goes largely unfulfilled in social work. In contradistinction, a recent review of major cardiovascular treatment outcome evaluations published between 2000 and 2005 included 324 randomized clinical trials (Ridker & Torres, 2006). Vigorous research activity in practice areas such as cardiovascular medicine (including partial, exact, methodological, and substantive replications) heightens the likelihood that fraudulent and other erroneous findings will be detected, and it greatly...