
Monday, September 10, 2007

Scholarship in Tiers 3 and 4

Last week, Brian Leiter's Law School Reports linked to a work in progress by Professor Michael Yelnosky, Associate Dean for Academic Affairs at Roger Williams, concerning the scholarly productivity of law schools in the third and fourth tiers of the U.S. News & World Report rankings. Professor Yelnosky also asks for corrections and explains his methodology. Because (a) I think such information is potentially useful for folks about to go through the AALS Meat Market and (b) I like to count stuff, I tried to come up with a list of the ten most productive T3 and T4 faculties based on the preliminary results from Yelnosky's study. Here's what I came up with:

(1) Hofstra

(2) Roger Williams

(3) Michigan St.

(4) Wayne St.

(5) Capital

(6) University of Mississippi

(7) New York Law School

(8) Widener

(9) Chapman

(10) Willamette

(T-11) University of Memphis

(T-11) University of Wyoming

Note: As was pointed out to me, I left off Capital (#5) and Chapman (#9) in my original post. I have changed the original post to reflect the actual top ten. My apologies to the good and prolific folks at Capital and Chapman.

Posted by Alex Long on September 10, 2007 at 01:26 PM | Permalink



I also object to the "only top 50" methodology. Somebody might do a symposium issue that's outside the top 50. Or they may write in an area that is not appreciated by law review editors (like administrative law or health regulation). One of the copyright articles that's influenced me most over the past few years was written by Tom Bell for a non-top-fifty journal, and I'd be willing to say it's more important than at least 10 works on similar subjects in top-50 journals.

Here's a link to an extraordinary critique of rankings:

Perhaps a proliferation of rankings is the only way to deal with flaws in dominant systems, but it has its costs as well:

Posted by: FrankP | Sep 10, 2007 9:33:29 PM

Why the obsession with ranking? Especially this RW one - which has a methodology that is based on rankings everyone already agrees are infirm, inaccurate and self-fulfilling. Why only count articles in "top 50" general interest journals (and a few specialty journals) as scholarship? There might be sound reasons that faculty publish in journals that aren't the ones you've counted - whether because of peer review, audience, specialty area... or, perhaps, because they were invited to do so. Also, what about books? If the book is over 200 pages, how many points does it get? Do the index pages count? This all seems very silly, and this RW "study" seems self-serving.

Posted by: anon | Sep 10, 2007 9:25:09 PM

Kudos and all that to Hofstra, MSU, and the others, which deserve them. I didn't understand why the study counted articles longer than fifty pages as 50% better than ones between twenty-five and fifty pages in length. It seems arbitrary, inconsistent with the current "shorter and more peer-reviewed" ideal, and filled with perverse incentives. This is not to downplay the general helpfulness of trying to obtain data for this sort of thing, though, which the RW study does do....

Posted by: David Zaring | Sep 10, 2007 4:22:18 PM

The comments to this entry are closed.