
Tuesday, July 17, 2012

The Sisk Study is Up -- and a call for inclusion

Over at Brian's blog, you can see some observations on the nature and genesis of the new Sisk et al. study on per capita scholarly impact, which I've appended here for your viewing pleasure. Feel free to go to SSRN and throw them a bone for their hard work. Brian has no discussion board for chatting about the Sisk study, so I thought we could have a fruitful discussion here. As with most rankings, I think they need to be kept in context: not overweighted, but also not underweighted simply because they don't measure what you think is most important. Sisk et al. are right to emphasize that reputation studies for USNews tend to be a bit of an echo chamber and that studies like this one, which, you know, actually measure something, are a useful supplement for folks trying to figure out the quality and impact of a faculty's scholarship. Again, it's not everything one should look at, but it's something.

My biggest gripe: while I understand the desire (particularly for Sisk and his institution) to limit the study to the top 70 or so, it seems a shame that there aren't resources available to gather and vet the info from *all* the law schools. I have the same frustration with that other wonderful (but admittedly limited) study, the Yelnosky productivity one. For reasons that are either self-serving or that escape me, the Yelnosky study excludes the top 50 schools, except for those that happen to be in the New England area. Hmm. I don't like to be snarky about this, but let's face it: inasmuch as the rankings are useful, they are sort of like a public good that is under-produced. (Yes, I'm getting ready for econ camp next week!) St. Thomas and Roger Williams invest in creating the rankings only to the point they find useful (the private good), even though more information about more schools would benefit a larger group of schools and individuals (whether faculty or students). I suppose -- given that St. Thomas did so well (coming in at #30) -- we should be grateful that they didn't limit the number of schools to the top 40, but in fact studied almost 100 schools. Good on them.

Anyway, share your thoughts or data in the comments. From what I can tell, the data and the methodology are transparent, so if there are associate deans or other interested faculty and law librarians out there reading this blog, feel free to do your own self-study and share the results in the comments to this thread. Perhaps in future years, we can persuade St. Thomas and Roger Williams to expand the number of schools under consideration.

Posted by Administrators on July 17, 2012 at 04:19 PM in Blogging, Dan Markel, Life of Law Schools | Permalink


Comments

Greg, thanks for your explanation.

Posted by: Dan Markel | Jul 19, 2012 11:52:19 AM

As a note on Dan Markel's suggestion to expand the number of schools under consideration, we at the University of St. Thomas have cast a wide net in our scholarly impact research and have not neglected any ABA-accredited law school.

In 2010, following on Brian Leiter's ranking of the top 25, we prepared scholarly impact scoring for each of the faculties at all 200 ABA-accredited law schools.

For this 2012 study, we examined nearly a hundred law schools, drawing from the 2010 results, while prominently inviting associate deans and others to suggest inclusion of other schools with strong scholarly cultures -- several law schools did just that and shared their internal results with us.

So in our scholarly impact study, we have not excluded any school or, for reasons of limited resources or otherwise, arbitrarily limited the scope of the study. Rather, we have not reported the ranking beyond the top 70 or one-third of ABA-accredited law schools.

As explained in our report on the 2012 scholarly impact ranking, we simply did not believe we could fairly and validly rank law schools below the top 70 (one third of law schools).

First, because scholarly impact scores bunch together ever more tightly as one moves down the ranking, fewer and fewer meaningful differences remain among law faculties beyond the top one-third of law schools. In an internal test, for example, we found that at one level not far below the top 70, about 25 law schools would tie at a single ordinal ranking level. That in turn would mean that the very next ranking level -- differing only by a tiny margin in scholarly impact scores -- would drop 25 ranking levels, essentially an ordinal ranking cliff.
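[Editor's note: Greg's "ordinal ranking cliff" is easy to see with a toy example. Here is a minimal Python sketch of standard competition ranking; the scores below are invented purely for illustration and are not drawn from the Sisk study's data.]

```python
# Toy illustration of the "ordinal ranking cliff" under standard
# competition ranking: tied faculties share one rank, and the next
# distinct score falls by the full size of the tie.
from itertools import groupby

# Hypothetical per capita impact scores just below the top 70, sorted
# highest to lowest: one school at 3.2, a 25-way tie at 3.1, one at 3.0.
scores = [3.2] + [3.1] * 25 + [3.0]

rank = 1
for score, group in groupby(scores):
    tied = list(group)
    print(f"rank {rank}: {len(tied)} school(s) at score {score}")
    rank += len(tied)  # competition ranking: next rank skips past the tie

# Output:
# rank 1: 1 school(s) at score 3.2
# rank 2: 25 school(s) at score 3.1
# rank 27: 1 school(s) at score 3.0
```

On these made-up numbers, the school scoring 3.0 lands 25 ordinal levels below the tied group even though its score trails by only a hair -- exactly the cliff described above.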

Moreover, to rank all the way down the line to #200, even if it could be done with any validity, would be unfair, in our estimation. We wanted to recognize those law schools that have achieved a significant level of scholarly prominence, at least by this particular measure. We had no wish to offer any kind of commentary on other law schools that may have chosen a different path or are at a different point in their history and progress.

Posted by: Greg Sisk | Jul 18, 2012 3:41:35 PM

Dan notes that a study like this is "not everything one should look at, but it's something."

True, that. But could someone point me to the studies of the impact of law faculty members' scholarship on people/entities other than their fellow law professors as reflected in their citation practices? Of the impact of law faculty members' teaching on their students' learning? Of the impact of law faculty members' service activities on their communities (however defined)?

It is often said (and probably will again be said here) that those are things that don't lend themselves to measurement and ranking, that they are too subjective to allow comparison, and so on. Maybe those things do explain why the only thing we ever focus on is a certain narrow way of thinking about how our articles and essays work their way into the citing consciousness and culture of our own cohort.

But even if that's so, then by god, the result is depressing. How achingly tired I have grown of all of this.

Posted by: Eric Muller | Jul 17, 2012 6:41:00 PM

One modest complaint I have about the study is that it serves two purposes, and therefore accomplishes at least one of them imperfectly. That is, it ranks both faculties and individuals. For faculties, I understand the rationale in omitting the untenured (although to me it would make a lot more sense to divide by tenure-track years in the academy rather than tenured/untenured). But that means they omit juniors who might otherwise be ranked individually. Including (cough) juniors who send them an e-mail to complain about that outcome.

Posted by: BDG | Jul 17, 2012 4:26:13 PM
