Definite articles: using the law review article type indicator (R) to make law review publishing decisions.

Author: Chiappinelli, Eric A.

Every year close to two thousand law review articles (mostly written by law professors) are circulated among about two hundred student-edited law journals.(1) This number does not include roughly four thousand student works by journal members.(2) The semiannual process by which the two thousand articles by nonstudents are matched with journals has been described elsewhere so I will not elaborate here.(3)

What I will say here is that the process is exhausting for both authors and editors. Stephen R. Heifetz estimates that prestigious law reviews spend three thousand hours a year simply screening articles; much more time is then invested in making publication decisions about the articles deemed worthy of serious consideration.(4) Less prestigious law reviews receive fewer submissions each year but probably spend the same amount of time per article in making publication decisions. If the average law review receives even one hundred fifty articles each year and spends, say, ten hours reading and considering each article, then across the two hundred journals the total person-hours spent on selecting law review articles is a staggering three hundred thousand. Put another way, the aggregate time spent in making publication decisions is roughly equivalent to the time billed these days by five first-year associates at a large law firm. The time spent by the authors of those two thousand articles in submitting them to the law reviews, communicating with law review editors, and accepting (one hopes) an offer of publication adds another five hundred hours to the total.(5)
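The back-of-the-envelope arithmetic above can be checked directly. The figures are the estimates from the text: two hundred journals, one hundred fifty submissions each per year, ten hours per submission.

```python
# Estimates from the text (not independently verified figures):
reviews = 200              # student-edited law journals
articles_per_review = 150  # submissions each journal receives per year
hours_per_article = 10     # editor time spent reading and considering each

# Total person-hours spent on article selection across all journals.
total_hours = reviews * articles_per_review * hours_per_article
print(total_hours)  # 300000
```

At two hundred journals, even modest per-article effort compounds to the three hundred thousand person-hours the text calls staggering.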

A number of authors have suggested ways in which the law review article selection process might be made more rational.(6) Arthur Austin suggests that law professors use vetting to short-circuit close scrutiny by law reviews.(7) Vetting is the process of having famous professors review one's article in draft so that one can thank them in the first footnote. The theory is that good vetting signals law review editors that the article is of high quality, thus reducing the time the editors need to spend in screening and making publication decisions.

A number of law reviews, especially those just below the very top rank, use a technique called the "exploding offer."(8) It has been customary for a law review to agree that an offer it extends shall remain open for two weeks. Virtually the only purpose that this time period serves, however, is to allow the author to contact law reviews he or she considers to be superior and to try to weasel an offer out of one of them. Sometimes this works. An exploding offer is simply a refusal by the law review to keep its offer open for more than a day or two in hopes the author will not have enough time to parlay it into an offer from a more prestigious journal. Sometimes this works. The exploding offer plays heavily on the risk-aversion of almost all law professors. Rather than risk having no offer at all, and acutely aware of the flaws in his or her article, a law professor will generally accept an exploding offer.

Stephen R. Heifetz has suggested that a market model might efficiently match articles and law reviews.(9) Under this scheme, three times a year a central clearinghouse would receive articles from authors, along with a rank-ordered list of up to ten law reviews for each article. The clearinghouse would then send copies of the articles to the law reviews and ask them to rank the articles. Finally, the clearinghouse, using an algorithm, would pair up articles and law reviews so that the matings were Pareto optimal.
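Heifetz does not spell out the algorithm, but clearinghouses of this kind (the medical residency match, for example) typically use deferred acceptance (Gale-Shapley), which produces a stable match that is Pareto optimal for the proposing side. A minimal sketch, assuming article-proposing deferred acceptance and using hypothetical journal and article names, might look like this:

```python
def match_articles(article_prefs, review_prefs, capacity):
    """Article-proposing deferred acceptance (Gale-Shapley).

    article_prefs: {article: [reviews in preference order]}
    review_prefs:  {review: [articles in preference order]}
    capacity:      {review: number of publication slots}
    Returns {review: [accepted articles]}.
    """
    # Precompute each review's ranking of articles for O(1) comparisons.
    rank = {r: {a: i for i, a in enumerate(prefs)}
            for r, prefs in review_prefs.items()}
    held = {r: [] for r in review_prefs}        # tentative acceptances
    next_choice = {a: 0 for a in article_prefs} # next review each article tries
    free = list(article_prefs)                  # articles still unmatched

    while free:
        a = free.pop()
        if next_choice[a] >= len(article_prefs[a]):
            continue                            # article exhausted its list
        r = article_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[r]:
            free.append(a)                      # review did not rank this article
            continue
        held[r].append(a)
        held[r].sort(key=lambda x: rank[r][x])  # best-ranked first
        if len(held[r]) > capacity[r]:
            free.append(held[r].pop())          # bump the least-preferred holder
    return held

# Hypothetical example: two articles, two single-slot journals.
result = match_articles(
    {"A1": ["Harvard", "Yale"], "A2": ["Harvard", "Yale"]},
    {"Harvard": ["A2", "A1"], "Yale": ["A1", "A2"]},
    {"Harvard": 1, "Yale": 1},
)
print(result)  # {'Harvard': ['A2'], 'Yale': ['A1']}
```

Because an offer held by a review is only provisional until the market clears, no author would need to weigh an exploding offer against a hoped-for better one.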

Yet I believe that the core problem in the law review article selection process is the information asymmetry between authors and law reviews. If a way could be found to reduce this information disparity, the article selection process could proceed more swiftly and, presumably, more accurately. So let me begin by describing the kinds of information authors and law reviews want to know and then I will unveil my revolutionary approach to ameliorating the information disparity.

Law reviews want to publish the "best" articles they can get.(10) Writing twelve years ago, however, Leibman and White were dismayed to find that most law reviews had not articulated, even for internal use, the criteria by which they would judge an article's quality.(11) Further, little consensus seems to exist among law reviews as to the qualities that make an article "good."(12) Nonetheless, Leibman and White report that the more prestigious reviews value articles that are trendy, pretentious, and theoretical.(13) Most law reviews also want articles from well-regarded, or at least well-known, professors. This is so because many people in the legal community do not actually have time to read articles. They judge a review by its cover.(14) Just as the journal in which an article appears is an indirect signal of the article's quality,(15) so an author's fame, or more indirectly, an author's affiliation, signals a journal's quality.

It also seems clear that every law review wants articles that are well written and technically sound. That is, they want articles in which most sentences are already footnoted in conformance with The Bluebook.(16) The obvious motivation for this desire is so that the law review editors will have less work to do in editing the text and preparing the manuscript for publication. As a further refinement, law reviews prefer authors who are easy to work with. For law review purposes, this means authors who are polite, who do not quibble over proposed edits, and who return edited drafts to the reviews on time.

In the law review article selection process, a considerable information disparity exists between the student editors and the articles' authors. All authors identify their school affiliation, but students may not know the relative prestige of the author's school. If an author is truly well known, at least one of the early readers should recognize that fact.(17) However, mistakes can happen. Even the worthiest Homer(18) can nod. Moreover, many authors have similar names. Student editors cannot always be certain that the article they are reviewing was actually written by the big name.(19) What's more, even if the review editors could quickly ascertain these two facts, the other two criteria, which we may categorize as "quality" and "amount of work required," can only be evaluated by reading the article.

Until now. Now an amazingly powerful tool is available to law reviews that promises to reduce substantially the amount of time that they need to devote to selecting articles. Further, and more importantly, this tool permits law reviews to make a much more accurate assessment of an article's overall worth. If...
