Seeking best practices among intermediate courts of appeal: a nascent journey.

Author: Binford, W. Warren H.

I. INTRODUCTION

In early 2006, a small group of appellate court judges, law school faculty, and law students in Oregon endeavored to research whether there were identifiable "best practices" (1) among intermediate courts of appeal. The Oregon Court of Appeals faces the same challenges as many of its sister courts across the country: trying to be as productive and efficient as possible with a limited budget (2) and an increasing caseload. (3) If best practices for courts similar to Oregon's could be identified, the court could adopt some of these practices to maximize its performance.

Thus, we decided to form the Willamette Court Study Committee. (4) The authors (and several of their academic colleagues) committed to conduct the necessary research on best practices and write this report while the appellate court judges agreed to consult with the researchers on survey design and encourage participation among similar courts. Unfortunately, as is true with so many questions in the field of law, we were frustrated to find that the answer to the question of whether identifiable best practices exist among intermediate appellate courts is an inherently ambivalent maybe.

The frustration starts with a review of the existing literature on intermediate appellate court performance. Although much of the relevant literature is highly impressive and serves useful purposes, (5) very little focuses expressly on best practices, or even on state intermediate courts of appeal. Moreover, we found no research that reports (1) the performance of courts during all phases of the appellate process, (2) across multiple courts, (3) according to the courts' own data, while (4) recognizing and integrating court innovations that might be influencing court performance. Rather, much of the existing literature is relatively discrete, focusing on, for example, time on appeal, (6) the role of court staff, (7) the influence of "court culture," (8) or the adoption of specific performance mechanisms in individual courts. (9) Thus, we hope that this research will serve as a small, first step in filling this gap in the literature.

Our study builds on a recent multi-state study of intermediate appellate courts (10) by using a larger sample size (thirteen courts rather than four). By examining a larger number of courts and gathering data from all phases of the appellate process, we hope to advance the identification of courts' best practices based upon comparative data. Although the sample size of our study is significantly larger than that of the Hoffman and Mahoney study, it is still small enough that one should be careful not to overstate the validity and reliability of the results. Caution is especially critical here because we are seeking normative systems and standards. After all, the concept of "best practices" is based on the belief that certain methods have proven to be more efficient and effective over time and across large numbers of people and organizations.

Our study also differs from previous studies in that we tried to minimize the filtering of data provided by the courts. We did not conduct site visits or ask court staff to choose among pre-set survey responses. (11) Instead, our survey relies heavily on numerical and narrative responses directly from the courts. Not surprisingly, this approach leads to definitional and interpretive issues.

For example, we compare court performance by measuring "efficiency" and "productivity" based primarily on self-reported data. We measure a court's "efficiency" by how quickly the court processes cases from the filing of the notice of appeal to final disposition, as well as during the individual stages in the appellate process (as defined in section IV.B below). We measure "productivity" by the number of "opinions" issued by a court. We recognize that not all opinions are created equal and that appellate courts dispose of cases by methods other than issuing opinions. However, opinions and pre-argument dismissals continue to be dispositive at intermediate appellate courts, and thus, we believe that the number of opinions issued by a court is a leading, but not sole, indicator of the court's "productivity."

Because our research largely leaves to the courts the discretion to define "opinion" from their perspective, we are compelled to remind the reader of the flaws inherent in any research that relies heavily on self-definition, self-assessment, and self-reporting. Participants may not have the same understanding of a term as the researchers or even one another, and even if they do, an unintentional oversight may convey inaccurate data that skews the reported results. Thus, one should be highly cognizant of the fact that much of the data presented here is self-reported by the courts and has not been filtered, verified, or manipulated by us, except where expressly stated. Although this approach may create a risk of inconsistencies and even errors in some instances, we believe that narrative responses from courts are more likely to yield descriptions of court innovations. Because we were more interested in discovering effective and efficient practices among courts than in ranking the courts in a competitive hierarchy, we sacrificed the authority of the rankings (which are interesting, but should not be overvalued since the comparisons are not always "apples to apples," as any court that participated in this survey will immediately recognize) for the opportunity to identify nascent best practices among the participating courts.

Despite all of the caution and limitations necessary with this approach, the results of this study demonstrate that it was well justified. The responses from the participating courts indicate that, across the nation, intermediate appellate courts are actively developing new systems for processing more appellate cases more efficiently. Courts are expanding their use of summary disposition methods, increasing the categories of cases being expedited, reformulating panels, adopting new technologies, reducing oral argument, issuing a majority of unpublished opinions, and, in some cases, questioning court culture. We believe that these innovations will give rise to a nascent set of best practices that, once identified, can be adopted by courts to improve their performance even in the face of limited resources.

Analysis of the more routine data such as budgets, staffing, and case processing also proved revealing. For example:

We were unable to identify any significant statistical relationship between a court's total budget and (1) number of filings, (2) court efficiency, (3) court productivity, (4) number of legal staff, or (5) number of judges. Thus, we are left wondering what court-related data, if any, legislatures (or executives) use to determine the size of budget appropriations for courts.

We also found no correlation between court efficiency and productivity.

As a group, the courts were able to meet the ABA Standards for case processing at only one stage in the appellate process: from oral argument to issuing a decision. (12) What is interesting about this phenomenon is that this is the stage that is most within the direct control of the judges.

Also, although we identified correlations between court productivity and both the number of staff and judicial salaries, we found few correlations with court efficiency, save for one significant exception: there was a positive correlation between the average time to process a case and the number of senior or retired judges used.

This kind of data begins to lay the foundation for a body of knowledge that can assist judges, court administrators, and others interested in optimal court performance. What should determine budget appropriations for intermediate appellate courts? Do budgets affect court performance? Would paying judges more increase productivity? Would using senior judges less increase efficiency or make matters worse? These are just a few of the questions inspired by the data summarized below. Finding answers to these questions as well as continued study of court systems and innovations may help us to identify best practices that promise to increase court efficiency and productivity in the years ahead.

II. METHODOLOGY

To conduct our research, we identified thirteen intermediate appellate courts with similar structure: Arkansas Court of Appeals, Colorado Court of Appeals, Connecticut Appellate Court, Court of Appeals of the State of Georgia, Kansas Court of Appeals, Kentucky Court of Appeals, Michigan Court of Appeals, Minnesota Court of Appeals, Nebraska Court of Appeals, the Appellate Division of New Jersey's Superior Court, New Mexico Court of Appeals, Court of Appeals of North Carolina, and Oregon Court of Appeals. Each court is in a state where there is one intermediate appellate court with predominantly mandatory jurisdiction and one court of last resort with discretionary jurisdiction. Two other courts were invited to participate, but did not.

The courts completed an on-line survey developed by the Willamette Court Study Committee. (13) The survey consisted of forty-two narrative questions covering subjects such as the number of appeals filed, number and kinds of opinions issued, court budget, court staffing, length of case processing at various stages, oral argument practices, disposition of cases, motion practice, panel structure, and statutory periods and internal training, among others. The courts were also provided the opportunity to identify factors they perceive contribute to delays in case processing, as well as innovations the courts have implemented to promote efficiency. The courts were instructed to use data from 2005 in completing the survey. (14) To the extent reliable data from 2006 was also available, the courts were invited to provide it with a clear indication that it was from 2006. We did not include the 2006 data in the analysis unless it was submitted as part of a 2005-2006 fiscal or court...
