Wednesday, January 30, 2013
Ranking the Rankings
There are lots of law-related rankings out there. And many of them are law school-related rankings. But, with all apologies to Juvenal, quis iudices ipsos iudicabit? Why not me?
So, here’s the first-ever ranking of law school rankings. The methodology is simple: it’s wholly idiosyncratic, based on what I value, which is, of course, what I expect others to value. I can’t include all rankings, but I try to include some of the most significant ones out there.
1. Intentionally left blank. That’s right. The top slot goes to no ranking. Because I don’t think any of them deserve the top slot. Edgy.
2. Sisk-Leiter Scholarly Impact Study: Formally the work of Sisk et al. at the University of St. Thomas, but operating under the Leiter methodology, the study tidily measures the scholarly impact of tenured faculty over the last five years. Because peer assessment is one of the most significant categories in the U.S. News & World Report rankings, it objectively quantifies much of the academy’s impressions. And absent Green Bag’s promised Deadwood Report, it’s the next best thing. As they say, “Do your job or get fired.” Drawbacks: narrow focus; rewards “old” scholarship that gets a number of recent hits; limited utility for prospective students (except that it provides a good indicator of the strength of the institution).
3. NLJ 250 Go-To Law Schools: It’s about as specific and clean a ranking as you can get: the percentage of each school’s graduates who landed a position at an NLJ 250 firm last year. It also tracks alumni promoted to partner. Drawbacks: biglaw-centric; a single associate placement can significantly change the percentages and the ranking; does not include judicial clerkships, which can skew placement.
4. Princeton Review Rankings: The strongest trait of PR is perhaps counterintuitive: it refuses to create a comprehensive ranking and instead provides 11 ranking lists. Because assessing overall quality is a difficult task, I, for one, admire the concession. Additionally, it provides student feedback from the relatively near past, a more immediate evaluation of the institution. Maybe you think it’s too quirky. I guess I like the fact that it’s trying to do something different from the rest of the field. Drawbacks: a black-box methodology that refuses to disclose response rates; some less-relevant categories; fairly subjective student surveys.
5. SSRN Top 350 U.S. Law Schools: One of the better ways to sort this data, I think, is by “total new downloads” in the last 12 months. That gives a sense of freshness, recency, and output. Drawbacks: narrow focus; driven heavily by a few heavy hitters.
6. The Black Student’s Guide to Law Schools: While this survey may not get much attention, and is admittedly narrow in focus, I appreciate a serious reflection on aspects of legal education that are of real concern to law students. Cost and cost of living are important. One additional thoughtful factor: “Distinguished Black Alumni,” a category that helps identify long-standing institutional quality in a unique way. Drawbacks: the “local legal job access” factor (perhaps unjustifiably) punishes schools in more rural communities; narrow audience.
7. Roger Williams Publication Study: With a more inclusive selection of journals [UPDATE: rather, of schools; a friend corrected that it includes only the top 50 journals, while Sisk-Leiter includes all journals] than the Sisk-Leiter studies, the study highlights some of the publications at “non-elite” law schools. For those who want to see school rankings all the way down, it fills a gap left by Sisk-Leiter. Drawbacks: narrow focus; relies on Washington & Lee Law Journal Combined Rankings scores from 2007, without updates; band-only rankings below the top 40; a nearly 20-year publication period that may not detect more recent movement.
8. Law School Transparency Score Reports: It’s not a formal ranking, but there are a number of categories by which one can rank schools from top to bottom. It nicely aggregates some of the data otherwise found in disparate places. For instance, here I sorted by the percentage of graduates in federal clerkships. You can poke around for admissions data, costs, or employment outcomes. The real problem is less the format than the data itself. And this isn’t LST’s fault; the schools simply have not been inclined to provide more detailed data. That leaves LST a nice place for sorting single characteristics of self-reported data, but not much else.
9. Wall Street Journal Law Blog’s Best Big Law Feeder Schools: The good folks at the WSJ took the ABA figures on graduates who landed full-time, long-term jobs at firms with more than 250 attorneys and made a chart. It is what it is: a much narrower, less useful version of the NLJ 250 list.
10. U.S. News & World Report: I don’t need to write anything about this, right? It’s far and away the most important ranking to most prospective law students. But, in case you haven’t heard, there are flaws with it. And I’ll just say one thing about the methodology: 9.75% of the ranking is based on how expensive you are and how much money you spend on things like electricity, plumbing, and chalkboards. Really. The more expensive you are, the better your ranking. If you’re a prospective law student, re-read that sentence a few times. Think it over. Read about it. And ask Robert Morse why it’s still in there. At a time when schools are looking for ways to cut costs, and when other rankings value low-cost options, USNWR still rewards high costs and high spending.
11. Business Insider 50 Best Law Schools in America: It’s driven entirely by a survey of 650 readers, only 60% of whom have JDs. The curve is harsh: most schools score under 3 on a scale of 1 to 5. It’s not a terribly scientific survey, but at least it measures perceptions and aggregates those perceptions into a score.
12. QS World Law School Rankings: I don’t know. Comparing Yale to Melbourne to Singapore to Monash to McGill is a little too broad a set of comparisons to have much value. Unless, I suppose, you care passionately that your decision to attend Victoria University of Wellington over Cornell was a wise investment.
13. Seto Rankings: Professor Theodore P. Seto’s rankings have been thoroughly debunked by my colleague Rob Anderson over at WITNESSETH. I certainly can’t top his perspective (DeLorean metaphors and all).
14. National Jurist Best Value Law School Rankings: An ostensibly noble project that tries to merge affordability, employment, and bar passage into a single ranking. Unfortunately, it’s basically just a list of flagship state schools, and one with a number of metrics flawed by data-reporting problems.
15. Top Law Schools Rankings: What a hot mess. It includes the Gourman Report, which hasn’t been updated since 1997. Then it lists Professor Brian Leiter’s “recently updated law school rankings,” which it doesn’t link. The first clue that it’s out of date is the identification of Leiter as “a professor at the University of Texas law school.” And it turns out “recent” means 2002. Otherwise, it just lists the last four years’ worth of USNWR rankings. For a rankings list that concludes, “Put time and thought in to what is one of the most important decisions of your life,” one would expect some thought put into the rankings. But if you’re interested in rankings possibly relevant to viewers of Seinfeld, Friends, and The X-Files, go for it.
16. Cooley Rankings: Res ipsa. And who can resist repeating this justification for “Library Seating Capacity” as a factor: “To study, a student needs a place to sit.” But, at least the school stopped publishing its self-promotional rankings in 2010.
So, how would you rank the rankings? (And, by the way, if anyone in the future wants to rank the rankings rankings, let it be known that I was a first mover in this space.)
Posted by Derek Muller on January 30, 2013 at 09:12 AM in Life of Law Schools | Permalink
Comments
David, I think that puts the results before the methodology. That is, it assumes that Yale and Harvard are the "best," and that the methodology must be faulty if they aren't at the top. But, there are plenty of areas in which these schools may not be the best. To cite a few, and just to pick on Yale: in the NLJ 250, around 7% of Yale alumni were promoted to partner, but much higher percentages were promoted to partner at other schools; in the Black Law Student rankings, Yale graduates were penalized because there was virtually no career diversity, as almost all of them were in academia; in the Princeton Review rankings, Yale law students do not highly regard their classroom experiences.
Now, maybe, you say, all of that is unimportant--that Yale alumni disproportionately enter academia is not a bug, it's a feature; that partnership should not be a serious concern, especially if many leave for academia; that classroom experience is unimportant, outcomes are; etc. And this, of course, ignores cost of attendance, indebtedness at graduation, etc.
Also, when you note, "If all you want is a big law job, you should still pick Yale over any other school," I think that might overstate it for some firms, too (and might be too easily written off as self-selection). There are, of course, myriad rumors (rising all the way to multiple members of the Supreme Court) that Yale is not ideal for learning certain things, and that some firms are not inclined to hire Yale graduates, and so on.
This isn't to pick on Yale. There are, of course, many, many areas in which Yale is, in my view, obviously ahead of all comers. But I don't think rankings that value factors you may not value are inherently flawed; they simply reflect a valuation of factors you may not value. And that's where, I think, a diversity of rankings is important, because we all value different things in legal education.
Posted by: Derek Muller | Feb 13, 2013 11:46:53 PM
Isn't any ranking that has Yale and Harvard as anything other than the top two schools, on its face, faulty? If I came up with a theory of global political power that ranked the United States 40th, that doesn't say anything about the United States--it says something about my ranking system. Anything based on law firm placement ignores the fact that Yale students don't seek law firm jobs to the same degree. If all you want is a big law job, you should still pick Yale over any other school. I don't know of a ranking system that can capture this point but, still, if Yale and Harvard are not 1/2 in some order in your ranking system, it's time to go back to the drawing board.
Posted by: David | Feb 13, 2013 11:20:48 PM
Cooley rankings are clearly #1, come on.
Posted by: Brian | Jan 30, 2013 7:43:14 PM
This is fantastic. I agree with Princeton Review being relatively high for actual snapshot of student satisfaction, but really unfortunate they don't disclose response rates. Feel like they could challenge USN if they felt like it.
Posted by: Jason Solomon | Jan 30, 2013 2:21:23 PM
Great post. This was the highlight of my morning review of the law blogs. Entertaining, smart, and even moderately useful. Thanks, Derek.
Posted by: Thom | Jan 30, 2013 10:44:30 AM
Nice work, Derek. FWIW, I think this is the #1 blog post ranking the rankings.
Posted by: Orin Kerr | Jan 30, 2013 10:11:54 AM