
Tuesday, November 21, 2006

Law Reviews and Law School Rankings

Over at TaxProf, Paul Caron has posted a fascinating contrast between the USNews law school rankings and the Washington and Lee library's rankings of law reviews, which are based on the number of citations in law reviews, citations in cases, and impact factor (citations per article). I was surprised by a number of these findings, especially the gaps between some of the top schools and their law reviews. For example, I would have thought Chicago's and Michigan's law reviews would normally be viewed as top six, but based on the combined-factors data they are ranked 10th and 16th, respectively. Many faculty know about certain schools with strong law reviews (Chi-Kent, Cardozo); most would be surprised, however, to learn that schools like Houston and DePaul have law reviews that rank in the top 50.
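To make the impact-factor measure concrete, here is a minimal sketch of the arithmetic in Python. The figures are invented for a single hypothetical journal, and nothing below reproduces the W&L library's actual data or its method of combining the three measures into the combined ranking.

```python
# A minimal sketch of the three measures described above, using invented
# figures for one hypothetical journal; these are not the W&L data.

def impact_factor(journal_citations: int, articles_published: int) -> float:
    """Citations per article published -- the 'impact factor' measure."""
    return journal_citations / articles_published

citations_in_law_reviews = 1200   # measure 1 (assumed figure)
citations_in_cases = 45           # measure 2 (assumed figure)
articles_published = 150          # assumed figure

# measure 3: 1200 / 150 = 8.0 citations per article
print(round(impact_factor(citations_in_law_reviews, articles_published), 1))
```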

What Paul's excerpts don't show is also important. There's no real reason to exclude from consideration the specialty journals that are highly ranked. If you stick to just the student-edited law reviews, Harvard actually dominates the top 35 based on the combined rankings: six of the top 35 student-edited journals are edited by HLS students. Harvard CR-CL is ranked 18, the Harvard Int'l L.J. is ranked 22, Harvard's J. of Law and Technology is ranked 27, its Journal on Legislation is ranked 31, and its environmental law review is ranked 33. Even when you add the peer-reviewed and refereed journals, Harvard still dominates. The venerable Journal of Legal Studies is ranked 38 and the Supreme Court Review is ranked 12, but otherwise the top 35 still looks the same, with Harvard's specialty journals outpacing many of the "main" journals at other schools.

At various faculties, there's discussion about what kind of premium (if any) to put on high placements. The truth is, it's very hard to look at these stats and say who should be in the top 10 or the top 20, and especially hard for the top 25. That's because faculty are making journal selections based in part on these rankings from W&L, as well as on rankings from Leiter and/or USNews. Looking at just USNews, consider: how many people accept placements at, say, GW because of its USNews rank of 19 without knowing that its law review is ranked 70th? Wash U shows a similar gap, with a USNews rank of 19 and a law review ranked 53rd. Our friends at MoneyLaw may think this is worthy of Beane-ball exploitation, so I'm curious to see what distortions, if any, these gaps create. (Parenthetically, I'm feeling guilty for not having linked more often to MoneyLaw, which is on my required reading list every week.)

I also wonder whether Leiter's faculty-quality rankings track the law review rankings more closely than the USNews rankings do. At first blush, I would think it would be hard to draw much of a causal story between the quality of a faculty's reputation and the quality of the student-edited law review. But we live in a strange world. My quick analysis shows there are also some huge chasms. The most recent data from Leiter is here. Michigan and Texas are tied for 8th on Leiter, but have law reviews (surprisingly) ranked 16th and 15th, respectively. The University of Miami has a Leiter ranking of 40 but a law review ranked 107th among all student-edited journals (and 79th among general law reviews). William and Mary's law review is ranked 20th, but the school is 37th on the Leiter faculty rankings and 27th in USNews. All that said, if you exclude Harvard CR-CL and William and Mary, the rest of the top 20 journals and the top 20 schools based on USNews look pretty familiar. The same can be said for most of Leiter's top 20. So, if schools are looking to reward placements in the top 19, it shouldn't be hard to reach some consensus as to which schools are included there: Harvard, Yale, Chicago, Stanford, NYU, Columbia, Penn, Michigan, Virginia, Georgetown, Texas, Boalt, Duke, Northwestern, Cornell, Vanderbilt, USC, UCLA, and Minnesota. But presumably, placements in the Supreme Court Review and Harvard CR-CL should also count. I'd be curious to get other people's assessments, especially if you can report on what your schools give weight to.
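For anyone inclined to test that hunch rather than just eyeball it, a rank correlation is the natural tool. The sketch below is purely illustrative: the school labels and ranks are placeholders rather than the actual 2006 figures, and the Spearman coefficient simply asks whether Leiter's ranks sit closer to the W&L law review ranks or to the USNews ranks. Run on the full lists, the larger of the two coefficients would answer the question I'm posing.

```python
# A rough sketch of the comparison floated above: do Leiter's faculty
# rankings track the W&L law review rankings more closely than the
# USNews rankings do?  The ranks below are placeholders, not real data.
from scipy.stats import spearmanr

# school: (Leiter rank, W&L law review rank, USNews rank) -- hypothetical
ranks = {
    "School A": (8, 16, 8),
    "School B": (8, 15, 16),
    "School C": (37, 20, 27),
    "School D": (40, 79, 60),
}

leiter, wl, usnews = zip(*ranks.values())
print("Leiter vs. W&L law review ranks:", spearmanr(leiter, wl).correlation)
print("Leiter vs. USNews ranks:", spearmanr(leiter, usnews).correlation)
```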

Posted by Administrators on November 21, 2006 at 09:28 AM in Life of Law Schools | Permalink


Comments

The University of Colorado will publish Prof. Brophy's most recent analysis of this data in its next issue (78 U. Colo. L. Rev. ___), due out in January. The article is titled, "The Emerging Importance of Law Review Rankings for Law School Rankings, 2003-2007," and is part of a larger section on the current state of law schools.

The other articles in the section are: Remarks of the Honorable Stephen Breyer at the Dedication of the Wolf Law Building; Lorenzo A. Trujillo, "The Relationship between Law School and the Bar Exam: A Look at Assessment and Student Success"; Kenneth D. Chestek, "MacCrate (in) Action: The Case for Enhancing the Upper-Level Writing Requirement in Law Schools"; and Mark W. Pletcher & Ludovic C. Ghesquire, "In Restraint of Trade: The Judicial Law Clerk Hiring Plan."

The issue will be available on LexisNexis, Westlaw, and HeinOnline. Copies of the issue can be purchased from the Law Review, at http://www.colorado.edu/law/lawreview/subscriptions.htm.
Please let me know if you have further questions.

--Sarah

Sarah May Mercer
J.D. Candidate, Class of 2007
University of Colorado School of Law
Editor-in-Chief, Volume 78
University of Colorado Law Review
email: [email protected]

Posted by: Sarah May Mercer | Nov 28, 2006 1:23:14 PM

Out of institutional loyalty (plus, who doesn't love some hard data on whether law review pieces are actually getting read and used?), I'll note that the W&L rankings aren't necessarily meant to displace other rankings, but can help to test them. I also think it's interesting, and rather Paul Smithian, that the most impactful law review doesn't quite average 12 citations per piece published, and that to make the top 25, you only had to average just over six. And though I don't disagree entirely with Dan and Orin, I'd say citology has just as long a social science pedigree as reputational surveys.

Though in the end, I doubt that tenure or salary decisions are made on the basis of how often a piece is cited as opposed to where it is placed.

Posted by: David Zaring | Nov 21, 2006 5:42:26 PM

The W&L study is a measure of citation counts, not reputation. There have been numerous critiques of citation counts as a metric for reputation. The results could be skewed by just one or two widely cited articles; moreover, getting a lot of citations doesn't necessarily correlate very well with quality.

Another problem with citology is that the number of citations separating the journals isn't all that large. So if a journal publishes a few articles that get several hundred cites, the journal will skyrocket up the rankings.

The bottom line is that the reputation of law reviews will be based on whatever law professors want it to be based upon. My sense is that the US News ranking is still the most influential factor in determining a law review's reputation. Unless professors begin believing in citology, astrology, or something else, the default will be to rate primary journals roughly by US News ranking. And despite all the flaws in the US News methodology, I think it certainly beats citology.

Posted by: Daniel J. Solove | Nov 21, 2006 2:06:43 PM

I think some of the mismatch might be explained by understanding the ranking variables. I may be missing something, but the impact factor seems to be biased against journals that publish larger issues, or more of them. Georgetown and Michigan (especially if each book review is counted as a separate article) come to mind. In some respects, this may mean they are less selective, but generally it means they have supplemented a regular publication schedule with a special issue that is not open to normally submitted articles. Selectivity for normal articles is thus still high.

Furthermore, the citations-in-cases measure is a bit misleading because the schools that do well here frequently have symposia on topics that lend themselves to case citation. For instance, Georgetown benefits here from its annual criminal law issue. Fordham gets a similar bounce in case citations from certain symposia. The point isn't to downgrade the value of that, but to point out that publishing in those journals doesn't mean an article will be read more frequently by judges or lawyers. To the contrary, it may be that only the symposium issues are ever read by someone who could influence case citation, so it would be crazy to submit to those journals for that reason if your piece would appear in any other issue. Moreover, one article by a practitioner can frequently drive all of a journal's case citations, which might further cast doubt on whether the journal is a wise place to publish an article generally (I believe St. Mary's in Texas may be one of the most cited journals in cases, but those are all Texas cases citing a single article).

What you are left with, then, is the citations-in-law-reviews factor. Just eyeballing it, this would correct some of the anomalies in our estimates of school prestige. It doesn't correct all of them (partly because of the split in journals like Fordham between prestigious symposia and less prestigious general submissions), but it is closer to a general ranking that law professors would care about when deciding where to submit their papers through open submission processes.

Posted by: Anon | Nov 21, 2006 12:31:24 PM

I believe that the Michigan Law Review's ranking by W&L is dragged down by the relatively large book review issue it publishes each year. Book reviews tend to be cited less frequently than articles, even after accounting for their lower page count. Michigan, as a consequence, receives fewer citations per page.
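To put rough numbers on that dilution effect, here is a quick back-of-the-envelope sketch; the figures are invented for illustration and are not Michigan's actual publication or citation counts.

```python
# Invented figures illustrating how many lightly cited book reviews
# pull down a journal's citations-per-item average.
articles, cites_per_article = 40, 10   # assumed article output and citation rate
reviews, cites_per_review = 30, 2      # assumed book review output and citation rate

articles_only = (articles * cites_per_article) / articles
with_reviews = (articles * cites_per_article + reviews * cites_per_review) / (articles + reviews)

print(articles_only, round(with_reviews, 1))   # 10.0 vs. roughly 6.6
```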

Posted by: Kevin Smith | Nov 21, 2006 12:22:12 PM

Orin, you're right: I have no doubt that an ordering not based on prestige leads to the anomalies you suggest. Indeed, I point them out myself. But the question is whether those prestige sensibilities (JLS as a top 10 placement, GW as top 25) actually track anything worth tracking. It would seem to me that evidence is what informs prestige, and that otherwise reputation doesn't follow the evidence quickly enough because of the lag effect. It seems to me that choices about where to publish should track the recent citation record more than they should track these contestable notions of who's in the top 10 or 20 or 25, no? Besides, if people are making decisions on prestige, that raises the question of whether prestige is generated by faculty quality (a la Leiter), by USNews, or by something else. Everyone will justify their own decisions based on self-serving selections of rankings. Anyway, I'm curious whether GW or other schools give their faculty guidance on what counts as top 20 (or top 10, etc.), what the basis for that determination is, and whether the rankings are updated each year.

Posted by: Dan Markel | Nov 21, 2006 11:40:17 AM

If your goal is to order journals based on prestige, I think the W&L rankings are not very accurate. (JLS at 38? It's more prestigious within the academy than most top 10 placements. GW at 70? It's top 25.) US News isn't perfect, but at least for main journals it's considerably better than the W&L rankings. Further, the prestige of secondary journals is harder to pin down; in my experience, opinion differs widely on that one.

Posted by: Orin Kerr | Nov 21, 2006 11:08:30 AM
