Wednesday, October 29, 2008
Law Schools Competing On Quality
Why should anyone care about the stupid U.S. News survey anyway? According to a commonly held view, the rankings are silly, and the thing to do is ignore them. But I think this view is quite misguided.
It turns out -- and this is the basic premise of the Race to the Top project I recently helped start -- that a major obstacle to the improvement of legal education generally is the lack of competition on quality among peer institutions. That lack of competition also leads to other bad consequences for law schools, like spending lots of money buying LSAT scores and shifting full-time students into "part-time" programs. And the easiest way to address both sets of problems is to take the U.S. News rankings more seriously, not less, and to focus on this survey.
What would such competition look like? In the Voter's Guide we sent out earlier this week to U.S. News voters, we said: "For example, take Penn and Northwestern, two national schools that compete for students and are close in the overall rankings. Both have very high student satisfaction and bar passage rates. But consider the curricular differences in areas particularly important in preparing students for practice: Northwestern has top-10 (or close) legal writing, clinical, dispute resolution and trial advocacy programs in last year's U.S. News surveys of faculty in these fields. Penn is not ranked in any of these areas, and is one of the few remaining law schools that uses third-year law students to teach 1Ls legal research and writing. Northwestern is also moving towards an increasingly innovative, practice-oriented curriculum, all of which suggests that Northwestern has a higher-quality J.D. program than Penn."
This kind of head-to-head comparison is completely lacking -- there's been no information out there on the relative quality of the education provided at different schools -- and as a result, U.S. News voters simply replicate the previous year's overall U.S. News rankings when filling out the surveys. Glossy brochures notwithstanding, these quality assessment ratings rarely change from year to year, and when they do change over time, it is in response to a shift in a school's overall ranking (driven by higher LSAT scores, for example), not any underlying shift -- of reality or perception -- in the quality of the JD program. By the way, if you don't like the criteria used above to compare schools, I'd love to hear what existing data you would look to instead in assessing the relative quality of a school's JD program.
To understand why the lack of competition on quality has other bad consequences, recall that the U.S. News formula has four basic components:
40%: Quality Assessment, from surveys of law professors (25%) and lawyers/judges (15%)
25%: Student Selectivity, from LSAT scores (12.5%), undergraduate GPAs (10%), and acceptance rate (2.5%)
20%: Placement Success, from employment rates at graduation (4%) and nine months out (14%), and bar passage (2%)
15%: Faculty Resources, from expenditures per student (11.25%), student-faculty ratio (3%), and volumes in library (0.75%)
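To make the arithmetic concrete, here is a minimal sketch in Python of how a composite score would be computed under these weights. It is illustrative only: the variable names are mine, the input numbers are made up, and U.S. News's actual standardization of raw data before weighting is omitted; the point is just the relative size of each factor.

# A minimal sketch of the weighting scheme described above.
# Assumption: each indicator has already been normalized to a 0-1 scale;
# U.S. News's actual standardization of raw data is omitted here.
WEIGHTS = {
    "peer_assessment": 0.25,           # surveys of law professors
    "lawyer_judge_assessment": 0.15,   # surveys of lawyers/judges
    "lsat": 0.125,
    "ugpa": 0.10,
    "acceptance_rate": 0.025,
    "employed_at_graduation": 0.04,
    "employed_9_months": 0.14,
    "bar_passage": 0.02,
    "expenditures_per_student": 0.1125,
    "student_faculty_ratio": 0.03,
    "library_volumes": 0.0075,
}

def composite_score(indicators):
    """Weighted sum of normalized indicators (each 0-1)."""
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

# Hypothetical school: middling everywhere (0.5) except the quality surveys.
school = {k: 0.5 for k in WEIGHTS}
school["peer_assessment"] = 0.9
school["lawyer_judge_assessment"] = 0.9
print(round(composite_score(school), 4))  # 0.66 -- the quality block moves the score most

Because the two survey components alone carry 40% of the weight, a school that moves only on quality assessment gains more than one that moves only on, say, LSAT scores (12.5%).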
So since schools can't move up on the quality factor (40%) in the rankings, what do they do? They start competing on the next biggest category in the U.S. News formula -- LSAT scores and undergraduate GPAs -- by emphasizing these things more in admissions, and throwing money at (buying) higher credentials. Bill Henderson provides evidence of this trend here. How much money is your school spending on "merit-based" financial aid, and how is merit determined? I'm guessing it's not based on valuable graduate training in another discipline, interesting work experience that indicates potential excellence as a lawyer, or being the first in the family to go to a professional school.
Are we really any better than Baylor, which literally paid people to retake the LSAT? I'm not so sure. Here's our deal: take that Kaplan course if you can afford it, work really hard studying for the LSAT, and if you're speedy enough, we'll give you a full ride. Sounds like paying for LSAT scores to me; we're only a tad more subtle.
The good news is we can fix this if we want to. It's actually not this pesky magazine controlling our priorities -- we (law professors and lawyers) control 40% of the U.S. News rankings, the largest category by far. If we have real competition on quality, there will be less need for schools to compete on other things. We just need to get enough information flowing to make competition on quality possible, and then start filling out the survey accordingly. I hope those voting this month and next will start now.
Posted by Jason Solomon on October 29, 2008 at 07:15 AM in Life of Law Schools | Permalink
Comments
Legal writing at Penn is taught by full-time faculty (Anne Kringle), and is augmented by research librarians (whose main job is assisting faculty research). The 3L instructors are for small-group discussions (about 10 students) that meet occasionally. You make it sound like it's only 3Ls teaching legal writing.
Posted by: anonymous | Nov 1, 2008 12:12:36 AM
Ed, thanks again -- and by the way, on possible bad consequences of competition, I'd be thrilled to see the next generation of glossy materials on curricular innovations. The other thing is that many of the schools we highlighted have actually been doing some of these things for years, so they're not so new -- they just haven't been reflected in the rankings. But we're certainly not tied to our current criteria; we're just trying to spur debate and development of better measures. Obviously, the best would be some kind of outcome measure.
On your approach to the survey, I think that's much like what most USN voters do, and of course, my whole point is that we all ought to have a lot of interest in that process.
Posted by: Jason Solomon | Oct 30, 2008 6:58:50 PM
Jason, thanks for replying and again for your concern about legal education.
We agree that curricular innovations may be very helpful. We probably also agree that they may be empty gestures. I am not convinced that your method permits us to tell the difference, or encourages a flight to competition in programs that really make a difference (as opposed to sounding good). We also disagree as to whether you've established a basis for equating a school that has some nice innovations to one that does a better job on all the rest of the traditional stuff. I think you're subtly converting the broad inquiry into "quality" into something else -- perhaps "doing the most with the least," or "taking steps that others should imitate irrespective of whether this vaults the school in question over them" -- without justifying that as a translation of the question or reconciling your own conflicting impulses.
You don't address the other questions I posed about your guide, the lack of true head to head competition, or the bad kinds of competition that might ensue, but that's fine.
Frankly, I think this is enough to undermine your partial reassessments without requiring me to offer an alternative, but you ask what criteria I would suggest. This will only annoy you. I'd probably start with the wisdom of the crowd, initially sorting everyone into the whole number groups based on their present quality averages. I'd eliminate/leave unscored those schools I knew little about, which could be 75% of the field or more. I'd push some of the remaining schools up and some down, mostly based on *changes* in the school of which I'd been made aware by you, Brian Leiter, or others (significant new programs with convincing accomplishments, shifts in faculty, etc.). I would not attempt to correct for what I guessed was excessive competition or injustices on other scores (e.g., LSATs), but I would deem relevant the end results. I would resort to Princeton Review data, or announcements of new initiatives without proven legs or results, with the greatest reluctance. I might very well toss out my survey before completing this process, I'd assume no one else had the foggiest interest in what I'd done, and I'd hope that they did it differently.
Posted by: Edward Swaine | Oct 30, 2008 5:41:16 PM
On anon's bar passage info: great, thanks -- I'll check it out. On Ed Swaine's thoughtful comments, first, thank you for them. Second, I think the basic question is: how would you fill out the survey rating the quality of JD programs 1-5? What criteria would you use? One of our basic premises is that there is no information out there on the relative quality of JD programs, so a bit of relevant information is good -- which you agree with.
But why this information? Good point that I didn't explain well here; I explained more in a prior post: http://prawfsblawg.blogs.com/prawfsblawg/2008/07/the-educational.html.
The short version is: lawyers and experts on professional education have identified, recently and for years, common weaknesses in law schools -- these include insufficient training in communication skills, especially writing; lack of the feedback that improves learning; and lack of experiential learning opportunities. The schools that have done well in addressing these weaknesses should be rated higher than those that have not, and that's an important way of judging the relative quality of JD programs. I would love to hear alternatives.
On the downsides of competing for the "best" students, defined as those with high LSAT scores, see today's post on distributive justice.
Posted by: Jason Solomon | Oct 30, 2008 12:29:52 PM
This is absolutely biased and ridiculous. What absurd measurements you decided to pick out. This is pathetic; hard to believe it's written by a professor.
Posted by: Mike | Oct 30, 2008 3:05:45 AM
I agree with you on the need for more information, and that USNWR is unsatisfactory, so I commend your efforts. Take these criticisms for what they're worth.
Basically, I think the pitch for a turn to "competing on quality" is a misdirection. The other, non-survey USNWR components bear some relation to quality, and they are *very* open to competition. The better your student peers are, the better the education you probably receive (due to their stimulation), and schools compete for the best students (very, very narrowly measured); placement is an indirect test of perceived quality, schools work very hard to up those numbers, and if they deliver poorly prepared students their numbers will ultimately suffer; some resources (e.g., as reflected in student-faculty ratios) affect educational quality, and schools fund-raise and hire constantly. I think the drift of what you are saying is that schools are competing *too much* on too-narrow measures of quality. What you then suggest is competition on *additional* aspects of quality, and a privileged place for these additional measures in the survey responses. So you need to make a case as to why these other inputs have such high relative significance as compared to rival visions of quality (e.g., peer quality, placement success, and expenditures per student), which might themselves be taken into account in the survey.
I'm not convinced you do that, based on a look at your guide. Those "schools with [unexpectedly] strong bar passage rates" have done something great, but you don't really explain how that translates for the different student bodies they might attract if they were properly rated, or for a voter who thinks it's hard to assess the quality of other schools relative to student bodies different from those that actually attend them. Likewise, with respect to the "best practices" law schools, you give us descriptions of nine or so schools that have done cool things. But as you stress elsewhere, we need "head to head comparisons," and there aren't any here. I didn't see anything on how many schools responded but didn't make your cut, how unusual the featured practices are, or which are the very best -- let alone whether any are highly associated with your other values, like student satisfaction or bar passage rates. Ditto with the USNWR specialty rankings, which have completely separate difficulties as inputs (e.g., whether there are marked quality differences within the schools ranked, or between them and schools not listed).
The reason I think this matters is that your objective isn't just more information, but aggressively nudging ratings so as to facilitate competition. (Your guide says that "a school's presence on one or more of these lists . . . warrants a 'bump up' from the ranking that one might otherwise give a school.") As applied, your payoff this time is less head-to-head competition than grade inflation; unless I misunderstand you, you'd encourage giving the hundred or more (?) schools on the various lists bumps up, and only one (not on the lists) a bump down. Explaining why *other* schools -- which I am sure could make good cases for themselves too, on various metrics -- deserve a bump down is where the rubber really meets the road. Or perhaps it's when, if your approach gains ground, we see the kinds of competition it engenders: turning Princeton Review surveying into a law school American Idol, except that everyone votes for themselves, or seeing the next generation of glossy materials on curricular innovations.
Shorter version: more information good; why *this* information should affect ratings *this* much quite unclear; probably the best value is in compiling and reporting. Thanks for doing that, by the way.
Posted by: Edward Swaine | Oct 30, 2008 12:26:08 AM
How about using that NLJ250 placement graph to rank schools? At least that way rank will be realistically tied to job prospects.
Posted by: rip 'n run | Oct 29, 2008 7:51:02 PM
This post comes across as a blatant smear on Penn and blatant trolling for NU.
Posted by: Anon | Oct 29, 2008 6:12:47 PM
"For example, take Penn and Northwestern, two national schools that compete for students and are close in the overall rankings....Both have very high student satisfaction and bar passage rates."
Actually, I very much doubt whether both DO have similar bar passage rates. Unlike some schools, which submit multiple bar passage rates from multiple jurisdictions, Northwestern submits only its rate from Illinois, a state with a very high overall pass rate. This is highly suspicious, considering a) Northwestern's own website touts the fact that less than half of its graduates practice in Illinois, and b) in the only state (to my knowledge) that releases passage rates by law school (California), Northwestern has VASTLY underperformed ALL of its peer schools IN EVERY YEAR in recent history. I suspect we'd find a similarly disappointing passage rate in New York if Northwestern would see fit to release those numbers.
Here are the July CA bar passage rates for NU and Penn since 2003:
2003 - NU: 71%, Penn: 93%
2004 - NU: 69%, Penn: 88%
2005 - NU: 65%, Penn: 83%
2006 - NU: 68%, Penn: 86%
2007 - NU: 72%, Penn: 75% (Penn's 75% was the second-lowest among the top 14 law schools as ranked by U.S. News)
Posted by: Anonymous | Oct 29, 2008 5:55:13 PM