The ABA 'ten principles of a public defense delivery system': how close are we to being able to put them into practice?

Author: Cooper, Caroline S.
Position: How Do We 'Do Data' in Public Defense?
  1. OVERVIEW

    As part of the Gideon Initiative launched by the Bureau of Justice Assistance (BJA) in 2013 in commemoration of the fiftieth anniversary of the Gideon (1) decision, American University (AU), in partnership with the National Legal Aid and Defender Association (NLADA), launched the Right to Counsel/Ten Principles Technical Assistance Project, designed to help public defense providers enhance their ability to adhere to the "Ten Principles of a Public Defense Delivery System" (Ten Principles or Principles) articulated by the American Bar Association (ABA) over a decade ago. (2) The project began its services with a national survey of public defense providers to obtain their perspectives on the degree to which they were able to adhere to the Ten Principles, including selected operational "benchmarks" referenced in the commentary (see Appendix) to each of the Principles, as well as obstacles they were encountering, promising practices they had developed, and areas for which technical assistance would be useful. The survey was distributed to over 1150 public defense providers working in a wide range of contexts for indigent defense service delivery--governmental and nongovernmental defender offices, court-appointed counsel systems, contract defender systems, and others--in an effort to obtain as broad a perspective as possible on providers' ability to adhere to the Ten Principles, the challenges they were addressing, and the areas for which technical assistance might be useful, whether through information sharing, interactive web discussions, and/or focused assistance to address specific topic areas.

    The survey initiative represented the first national effort to obtain a self-assessment by public defense providers of both their ability to adhere to the Ten Principles and the day-to-day impediments they are dealing with, so that realistic priorities for follow-up action and appropriate technical assistance support could then be developed and carried out.

  2. METHODOLOGICAL ISSUES

    Methodologically, the survey effort presented a number of challenges, many of which resulted in some deviation from traditionally accepted social science research practice. What had seemed at the start to be a straightforward effort at information gathering to better understand the situational context in which public defense services were being provided became entangled with unanticipated problems arising from: the wide range of contexts in which public defense services were being provided; the lack of readily available information on overall caseloads in the jurisdictions and on those requiring public defense services (versus the specific caseload that the respondent handled); and the varying degrees of familiarity of respondents with the ABA Principles, their intent, and their application. The challenges the survey effort presented included: (a) designing the survey instrument to yield both perceptions and the operational information necessary to identify technical assistance needs, since many public defense providers were not aware of the Ten Principles or were unfamiliar with what achievement entailed; (b) identifying a cross section of public defense providers for survey distribution, since no comprehensive list of attorneys providing public defense services in the United States exists; and (c) subsequently analyzing the survey results to isolate the systemic issues (e.g., factors relating to "independence" and quality of defense services) relevant to implementing the Ten Principles from the numerous tangential problems being reported. These are discussed briefly below.

    1. Designing the Survey to Yield Operational Information

      Each of the Ten Principles provides one or more general statements regarding critical elements associated with effective public defense services, without any additional benchmarks to measure adherence. Simply measuring public defense providers' perceptions of their adherence to each of the Principles, however, would not provide information about the operational context in which survey respondents worked or a frame of reference for identifying technical assistance needs. Nor would it "educate" the respondent about the intent of each Principle, helping the respondent better understand what adherence entailed, determine the degree to which the respondent was complying, and identify potential technical assistance that might be useful. This "educational component" of the survey was also important to promote a consistent frame of reference for respondents' comments on survey questions. In order for the survey to yield more than "perceptions," the commentary and supporting documents for each of the Principles were therefore reviewed to develop a brief list of follow-up survey questions highlighting operational "benchmarks" that could further document the intended application of each Principle. (3) The addition of "operational benchmarks" for each of the Principles made it possible both to ascertain respondents' perceptions of their offices' ability to adhere to each Principle and to match those perceptions with the operational context in which the respondent worked.

    2. Limitations Regarding the Pool of Public Defense Providers Receiving the Survey

      With no central repository of "public defense providers" existing in the United States, recipients of the survey were limited to those who could be identified, primarily through public sources, including state-by-state web searches and inquiries to courts, county government agencies, state and local public defense offices, and word-of-mouth. Although the survey was sent to public defense providers in every state, the distribution did not reflect a statistically developed sampling pattern; rather, the survey was distributed to all public defense providers who could be located in a given jurisdiction. In some states the survey was therefore sent to only a few providers; in other states, it was sent to significantly more. Despite intensive effort to develop the survey distribution list, the survey reached only a small segment of public defense providers compared to the (yet to be identified) universe of all attorneys providing public defense services. The survey responses therefore reflect the comments of a small slice of those involved with the provision of public defense services who could be readily identified, though their perspectives are nevertheless extremely valuable. Hopefully, future research efforts of this type will be able to reach a broader segment of attorneys providing public defense services--in particular those working as assigned counsel, especially in jurisdictions without organized assigned counsel offices.

    3. Limitations in Achieving a Cross Section of Public Defense Provider Perspectives

      Given the limitations in survey distribution noted above, it is not surprising that a high percentage (75%) of survey respondents worked in governmental public defense offices, with only 10% working as contract or assigned counsel. As noted above, the profile of defense service delivery systems reflected among the survey respondents is at odds with the universe of public defense providers in the United States, well over half of whom are estimated to be working as assigned counsel in "unmanaged" assigned counsel systems. (4) And, because the survey recipients (and respondents) represented the public defense providers who were easiest to identify, their comments cannot be presumed to reflect the full range of operational issues relevant to the universe of all public defense service providers in the United States. While the 386 responses from public defense providers included responses from every state plus the District of Columbia, there was no pattern in the number of responses per state; some states yielded only two to three responses, while others yielded twenty or more.

    4. Interpreting Survey Results

      Despite the limitations in identifying the universe of attorneys providing public defense services, the survey results provide a valuable perspective on the operational contexts in which public defense providers are working and the first national self-assessment of the day-to-day challenges they are encountering. Not all survey respondents provided comments on the degree to which they felt their offices were able to adhere to all of the Ten Principles. In part, response rates may have reflected respondents' greater or lesser familiarity with individual Principles. In fact, almost half of the respondents indicated that they were not familiar with the Ten Principles prior to receiving the survey. A number of respondents who did not comment on the degree to which they felt their offices adhered to the Ten Principles nevertheless commented on the degree to which they felt their offices' operations achieved the operational benchmarks associated with those Principles.

      The audience originally intended for the survey results and the follow-up technical assistance was public defense service providers, who were assumed to be in the best position to promote adherence to the Ten Principles. However, the extensive commentary survey respondents provided regarding the operational issues they were addressing in adhering to each of the Principles (5) also highlighted broad, systemic problems affecting state and local...
