
Wednesday, July 16, 2008

Paul Caron, Moneyball, and the Most Important Chart In the History of Legal Education

I’ve been talking about how to compare "value added" across law schools for the annual U.S. News survey of law professors, lawyers and judges, which amounts to 40% of the overall rankings. Yesterday, I laid out the kind of data we’d want to see in a Voters’ Guide to U.S. News. One reaction has been: sure, that would be nice, but it seems like a lot of work. And in the absence of the ABA, AALS, Carnegie Foundation, U.S. News, Princeton Review, or Vault (market opportunity here) funding such an effort, it will never happen. Which may or may not be right.

But we can create the race to the top right now, without any additional work, thanks to Blog Emperor and Moneyball guru Paul Caron. A basic Moneyball principle is the use of data-driven analysis to identify things that are systematically overvalued or undervalued. The chart below, created by Caron and an assistant, does that.

This fall, when the survey is due and you have just 10 minutes to fill it out, here’s what you ought to do (a rough code sketch of this rule follows the chart):

(1) Look at the chart below (click to enlarge).
(2) Give everyone in the top half a 4 or a 5, and the bottom half a 1 or a 2.
(3) Put “don’t know” for the rest or leave blank.

[Chart: Princeton Review composite rank vs. U.S. News rank, 2008]

Source: Paul Caron, TaxProf Blog, October 22, 2007
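
For the procedurally minded, here is a minimal sketch of that three-step voting rule. The dictionary of chart positions is hypothetical (1 = top of Caron's chart), and the sketch uses the extreme scores (5 and 1) for simplicity, though the rule allows 4/5 and 1/2:

    # A minimal sketch of the three-step voting rule above.
    # chart_positions is a hypothetical dict: school -> position on
    # Caron's chart, where 1 = top.
    def survey_scores(chart_positions):
        """Top half of the chart -> 5, bottom half -> 1, per steps (2)-(3)."""
        midpoint = len(chart_positions) / 2
        return {school: (5 if pos <= midpoint else 1)
                for school, pos in chart_positions.items()}

    # Schools not on the chart get "don't know" (i.e., are left out entirely).
    scores = survey_scores({"School A": 1, "School B": 2,
                            "School C": 3, "School D": 4})
    # {'School A': 5, 'School B': 5, 'School C': 1, 'School D': 1}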

This would be infinitely better than what we do now, because when the rankings come out, the “value-added” schools would gain, and the “value-not-add-so-much” schools would lose. The race to the top would be on. Why?

Here’s how the chart works. To create the Princeton Review "rankings" (The Princeton Review itself does not do rankings -- it does ratings), Caron added up the following ratings from The Princeton Review: Professors Accessible/Interesting, Admissions Selectivity, Academic Experience, and Career Preparation. So The Princeton Review data includes basically everything U.S. News does, except a few low-weight items (such as volumes in the library) and the high-weight items: the surveys of law professors (25%) and of lawyers and judges (15%). But the surveys amount to noise -- all they have done over the last few years is replicate the previous year's rankings! So they make no real difference.
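
As a rough illustration of that composite, here is a sketch that sums the four Princeton Review ratings for each school and ranks schools by the total. The field names and numbers are made up for illustration; they are not The Princeton Review's actual data format:

    # A sketch of Caron's composite: sum the four Princeton Review ratings,
    # then rank schools by the total (rank 1 = highest sum). The rating
    # names and values below are hypothetical.
    def princeton_review_rank(ratings):
        totals = {school: sum(r.values()) for school, r in ratings.items()}
        ordered = sorted(totals, key=totals.get, reverse=True)
        return {school: i + 1 for i, school in enumerate(ordered)}

    pr_rank = princeton_review_rank({
        "School A": {"professors": 95, "selectivity": 88,
                     "academic": 90, "career": 85},
        "School B": {"professors": 80, "selectivity": 97,
                     "academic": 82, "career": 78},
    })
    # {'School A': 1, 'School B': 2}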

So the difference between the Princeton Review ranking and the U.S. News ranking is attributable to what is in The Princeton Review -- and not U.S. News. And that is mostly the professors accessible/interesting ratings, plus students' responses to questions about the "academic experience" (range of available courses, the school's research resources, a good mix of theory and practice in the curriculum, openness to diverse opinions, how intellectually challenging the coursework is) and "career preparation" (how much the school encourages practical experience; opportunities for externships, internships, and clerkships; how prepared students feel to start practice). In addition, The Princeton Review has graduates' average starting salaries, which U.S. News does not.

Basically, the schools at the top of the chart are ones where the teaching is rated very highly and students feel well prepared for practice. The schools in the bottom half of the chart do particularly poorly on both of these metrics. This chart, then, tells you which schools are doing a particularly good or bad job of adding value for students, relative to their competitors.
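
Put in code, the "value added" the chart captures is just the gap between a school's two ranks. A sketch, again with made-up numbers:

    # A sketch of the gap the chart visualizes: U.S. News rank minus
    # Princeton Review composite rank. A positive gap means the school does
    # better on the student-facing PR measures than its U.S. News rank
    # suggests. All ranks here are hypothetical.
    def value_added_gap(us_news_rank, pr_rank):
        return {school: us_news_rank[school] - pr_rank[school]
                for school in us_news_rank if school in pr_rank}

    gaps = value_added_gap(us_news_rank={"School A": 80, "School B": 5},
                           pr_rank={"School A": 6, "School B": 40})
    # {'School A': 74, 'School B': -35}: School A looks undervalued by
    # U.S. News; School B looks overvalued on these measures.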

The key lesson here? Assessing “value added” on a relative basis by school is not only knowable; we actually know it for many schools. So we just need to present the data in a convenient and user-friendly way for survey respondents, and then the rankings will move for at least a handful of schools, beginning next spring – and then we get our race to the top, beginning next summer. Caron's chart tells us: it can be done. Which is why, in my humble opinion, this chart might well be the most important in the history of legal education.

Posted by Jason Solomon on July 16, 2008 at 12:45 AM in Life of Law Schools | Permalink


Comments


Using this chart seems silly to me. This is just a measure of what students think, but as we know from teaching evaluations, that metric doesn't always correlate with good pedagogy -- e.g., one can get good evaluations by spoon-feeding. This doesn't mean that what students think isn't valuable or that teaching evaluations don't matter. But to use them as the primary metric in evaluating the reputation of a school strikes me as silly. Harvard may not be the happiest place to go to law school, but if the goal of rankings is to measure *reputation*, then I would think Harvard needs to be somewhere near the top. Moreover, from the list I noticed a few very religious schools at the top (e.g., Regent). That's probably because there's a great fit between the students and the pedagogy -- but all that says is that these schools might be the right school for these particular students; it says little about a school's value for students in general.

Posted by: Daniel J. Solove | Jul 16, 2008 9:29:45 AM

Dan, the goal of the U.S. News survey of law professors, lawyers and judges is absolutely not to measure reputation, and that's how we've gotten into this mess. As far as I can tell, U.S. News used to refer to these as "reputation surveys," many people like Leiter still do, and I think your view is like that of most who fill out the surveys. No wonder that the surveys in the aggregate simply spit back the previous year's rankings. Because reputation these days simply is the rankings.

But the question they ask is to assess the "quality" of a school's "program." That's our job, and we ought to take it seriously because the results drive the incentives facing law schools. And yes, I agree with you that student surveys have limits, but I do think they have a fair bit of value. Students generally are fair in evaluating their institutions.

I don't think either of us has any idea why a few religious schools are at the top, but there are plenty of alternative theories. Reading The Princeton Review write-ups, it seems these schools take the mission of training ethical professionals quite seriously, emphasize teaching and preparation for practice, and invest in career services.

That's not to say these schools are for everyone, but according to Princeton Review, Regent (for example) won the 2006 ABA National Moot Court Competition, and the 2007 Negotiation Competition. Maybe they're actually doing a good job in training students to practice law.

Campbell, a small Baptist-affiliated school in North Carolina, consistently has the highest bar-passage rate in North Carolina, with entering students' median LSAT scores between 152 and 157. Not bad in terms of adding value.

How would you fill out the survey and why, if not with reference to a school's relative "value added" for students? Look, if you've got an LSAT score of 155, you ain't going to Harvard. The question that supposed "experts" like us are supposed to answer for U.S. News -- so they can use the info to help inform students and prospective employers in making decisions -- is what schools do the best job of preparing their students to practice law and get a job doing it, whatever their entering credentials.

If we started answering the question that way (more accurately and honestly than we do now), we'd create incentives for law schools to compete on quality, like any other service providers. What's wrong with that?

Posted by: Jason Solomon | Jul 16, 2008 10:14:33 AM

The chart isn't just silly; it's flawed, because it assumes that the Princeton Review's information is regularly updated, particularly the editorial content for each school. Pull all the editions from your library's collection and compare the entries for your school year by year since the start of this annual publication. You will find that entries change only every two or three years, usually every three. Note also that ranking changes across Princeton Review editions often reflect the addition of newly covered schools. The publisher's mission appears to be to sell the most copies with the least editorial revision.

Posted by: Joe Hodnicki | Jul 16, 2008 11:01:21 AM

I certainly didn't assume the PR survey data was from the same year -- Princeton Review readily acknowledges it surveys each school only every other year -- and I think this makes the data imperfect, but still very usable. If we could use LSSSE data, that would be ideal, but we can't right now.

I'm skeptical this stuff (student engagement, preparation for practice, etc.) changes much from year to year at a particular school. If it does due to concerted institutional efforts, great -- that just means a bit of a lag time until it shows up in rankings. That's OK.

Still need an alternative for filling out the U.S. News survey, given current information, unless you're going to defend status quo.

Posted by: Jason Solomon | Jul 16, 2008 11:17:48 AM

"[T]he goal of the U.S. News survey of law professors, lawyers and judges is absolutely not to measure reputation . . . the question they ask is to assess the 'quality' of a school's 'program.'"

I think you're missing the difference between a survey question and its results. If a survey asks people what they think of a school's program quality, the result is a measure of that school's quality as perceived by the survey respondents—in other words, its reputation.

Posted by: Jim G | Jul 16, 2008 11:25:05 AM

Jim G, good point -- in some cases, I think that's true, but here it is not. Here, respondents are literally answering as if it were a question about reputation, because they have no information about the program's quality. That's what I'm trying to change. I'll make this concrete tomorrow.

Posted by: Jason Solomon | Jul 16, 2008 12:10:31 PM

Student impressions locked in print for two or three years in Princeton Review can be fairly misleading, since they will not reflect faculty moves, administration changes, facilities improvements, etc. If student input is desirable, something like the online survey recommended by Leiter in his An Open Letter to Bob Morse of U.S. News at http://leiterlawschool.typepad.com/leiter/2008/03/an-open-lette-1.html would be more timely. Student leaders (e.g., officers of student organizations) could be presented with concrete information about each school and polled. See also Fiddling with the US News Law School Ranking Formula at http://lawprofessors.typepad.com/law_librarian_blog/2008/07/fiddling-with-t.html

Posted by: Joe Hodnicki | Jul 16, 2008 12:37:58 PM

Unfortunately, though, students frequently care a good deal more about reputation than "value added." If this were not the case, US News and Princeton Review would both be significantly less important in student decisions -- and as far as I can tell, that is not the case among my peers. I've come into contact with quite a few 0Ls who, for instance, might say something like: "I would never apply to NYU over Columbia." The two schools have almost identical ranks, similar career opportunities, and the same city, but these 0Ls weigh the name brand of an Ivy League school.

I know this isn't precisely the same argument that's being made here, but it is something to consider: how much do potential law students (the consumers) care about "quality" versus "reputation"?

Posted by: MSD | Jul 16, 2008 1:51:17 PM

I agree with Jason that this is an extremely useful chart. In one year I moved from teaching at a 4th-tier law school to one of the top five (in a fellowship program), and I have to say that the 4th-tier law school offers more value to its students. I also think the tendency of professors to discount student surveys is somewhat self-serving. It's true they aren't perfect -- no single source is -- but students are far and away the best situated to evaluate how well a law school does by its students.

hedgie

Posted by: hedgie | Jul 16, 2008 2:18:46 PM


Agree that the goal absolutely should **not** be to measure "reputation" -- because that only begs the most important question, which is what the heck does a particular law school do that earns it that reputation (assuming it is even deserved in the first place). Does it educate students well? Does it have prominent scholars? And are they prominent because they write insightful articles? Or because they wrote insightful articles 20 years ago that are now cited a whole lot?

My guess is that a reputational survey among knowledgeable baseball fans would find that Albert Pujols, Mark "The Bird" Fidrych, Bill Veeck, Harry Caray, and Ozzie Guillen all have pretty good reputations -- but that doesn't tell you anything about whether you'd want them on your team or whether they'd fill a particular team's needs. (It would help to know, for example, that Ozzie Guillen has been a pretty successful, if eccentric, manager, but was a mediocre hitter as a player.)

Of course, one might argue that in a field like legal education and scholarship, which doesn't have RBIs or other easily measurable objective indicators of quality, people will naturally look at reputation because -- although largely meaningless -- it's all that's immediately available.

Posted by: Grateful Gadfly | Jul 16, 2008 7:48:02 PM

It is somewhat old news to see Chapman University topping the list of "the most important chart in the history of legal education." As a member of the Chapman faculty (and associate dean), I can report that since Paul Caron compiled this chart, Chapman has jumped from 4th to 3rd Tier in the latest US News & World Report survey, and in just about every US News category we are now reporting 1st and 2nd Tier numbers, including in LSATs, GPAs, selectivity, bar pass rate, student-faculty ratio, spending per student, etc.

Does this make Princeton Review a "leading indicator"? The circumstances at each law school are so different. At Chapman, we have responded to competitive pressures and the prospect of UC Irvine opening a law school by simply becoming a better law school. We have greatly expanded our faculty and the range of our programs. In just two years, the university has pumped an additional $3.5 million into the School of Law. The demographics of Orange County are also quite helpful to sustaining such an expansion.

As a result, in the past year we have hired eight new permanent faculty members and ten new visitors, including four top laterals (including Ron Rotunda from George Mason) and several distinguished visitors (including Richard Falk). This expansion will reduce our student-faculty ratio from an already low 12.6 to probably less than 10, one of the very lowest ratios of any law school.

Although in the past Chapman has had a reputation as a politically conservative law school, this is no longer accurate. Chapman is now one of the most ideologically diverse, with faculty members who have clerked for six U.S. Supreme Court Justices, as well as the only Nobel Laureate in Economics on any law faculty in the country. Meanwhile, we have continued to diversify our faculty in terms of race, ethnicity, and gender.

Did the Princeton Review see something that was missed by US News? Perhaps. The US News rankings are quite backward looking, impressed with yesterday's reputation while many of the students who responded favorably in the Princeton Review survey are naturally excited about the high quality of life (Southern California is hard to beat), the great legal education at Chapman, and our expanding programs. For instance, we're building one of the premier Entertainment Law programs in the country. Of course, it helps to have a state-of-the-art $40+ million film school just down the block from the Law School and so close to Hollywood.

Since Paul Caron's thesis -- that Princeton Review is a leading indicator, or US News a lagging one -- is debatable, perhaps we should see a newer chart, one that tracks the gap between the two surveys over time and shows how quickly Chapman is moving up in the US News rankings.

Posted by: Tim Canova | Jul 17, 2008 10:48:56 PM

Am I missing something here? The chart doesn't seem to track "value added" -- it tracks the difference between Princeton Review and US News. For example, Yale is #35 in the Princeton Review rank and, as a result, seems to be doing a bang-up job on the things you think important. Yet it is in the "not so much" category only because it is so highly ranked in US News. So why punish Yale on the US News surveys -- because it does so well on both charts? I am all for punishing Yale, heck, it seems like fun, but it seems strange to do it for the "reasons" given...so...

Posted by: Doug Sylvester | Jul 18, 2008 8:09:19 PM

Reasonable question, Doug, and I could have done a much better job explaining. Thanks for asking me to clarify. So, the reason why the difference is "value added" is because Princeton Review includes everything U.S. News does, plus a bunch more stuff on student satisfaction with teaching, preparation for practice, etc.

So LSAT scores, undergraduate GPA, admissions selectivity, bar passage rates, employment rates after graduation are all included in both Princeton Review and U.S. News -- so for Yale to be only #35 in PR, even though they're so high in all these measures, means they must be doing really abysmally in all the things I think important that are in PR but not U.S. News -- teaching, preparation for practice, etc.

Posted by: Jason Solomon | Jul 18, 2008 8:43:40 PM

Do the really good law schools do this? My friend went to a top-ranked law school and argued that law schools should offer concise teaching of black-letter law, using materials like the BarBri Conviser Mini Review, so that students get a broad but focused view of how the law is currently implemented and actually works. That way, they would have more of a base to build on in analyzing the theoretical legal issues professors like to wax on about, and could move on to higher-level concepts and questions more quickly and confidently -- with a more rigorous intellectual background to theorize from, which professors often neglect to realize their students may not yet have, or may never acquire, over the course of their law school education.

http://trickledown.wordpress.com/2008/07/30/legal-education-reform-theory-versus-black-letter-law/

Posted by: trickledown | Aug 30, 2008 2:22:39 PM
