
Tuesday, July 15, 2008

The Educational Quality of Law Schools: What's the Relevant Data?

Yesterday, I laid out conceptually how we might think about comparing the "value added" for students across law schools in order to fill out the U.S. News survey. A critical component of this, of course, will be relative educational quality, and one obvious question is how exactly to get at that.

In talking about the rankings as part of a 2000 symposium on law schools and the legal profession, Stanford's Deborah Rhode, a leading ethics and gender-and-the-law scholar as well as a former president of the AALS, answered the question this way: "Prospective students need more comparative data, and schools need more incentives to compete, across a broader range of characteristics than current rating systems address. So, for example, applicants might benefit from approaches adapted from undergraduate education that evaluate schools by reference to 'best practices' in teaching. Such approaches can provide comparative data on students' experiences on matters such as faculty contact, effective feedback, skills instruction, and collaborative projects."

With that and other scholarship on teaching and learning in legal education in mind, below is a first stab at the kind of information that one might want in a "Voters' Guide" to the U.S. News survey of law professors, lawyers and judges.

And before you say, "Wait a second, how can this possibly be objective?" remember, U.S. News is not asking law professors, lawyers and judges for objective information -- they've got that already. They’re asking for our expert opinions (rate from 1 to 5) as to the quality of different law schools' programs – and they’re not asking about reputation. We make evaluative judgments about quality all the time in the absence of precise quantitative formulas -- we compare the quality of doctors, restaurants, books, and movies. Most law professors and many lawyers are fortunate enough to be in the income bracket where we can ask: "Is the local public school good enough for our kids? Or should we spend the money to send them to private school? If so, which private school is the best?" Are you really telling me that we can assess the relative educational quality of elementary schools, but not law schools?

Let me also be as transparent as I can about the assumptions relied upon here. First, the overall concept: to a certain extent, Korobkin is right. In today's legal education marketplace, it is difficult to distinguish the legal education one gets at one law school from what one gets at another. Most law schools have a similar curriculum, with new exceptions like Northwestern and Washington and Lee, and existing innovators like Georgetown's first year, the University of Dayton, and others. And every law school has some mix of teachers whose techniques engage students to a greater or lesser degree.

So the idea here is to focus on the areas of likely distinction. We have two recent, comprehensive reports on legal education – and in many ways, as these reports acknowledge, their assessments of the relative strengths and weaknesses of legal education echo those that have been made for years, going back to the Carnegie Foundation's 1921 study by Alfred Reed, Jerome Frank's call in the 1930s and 40s for a "lawyer school," etc.

We know what the weaknesses are. The question is: which schools are currently doing the most to address the long-identified weaknesses of legal education? Those schools are doing a better job of preparing their students for the practice of law than those who are not. That's the theory. The other meta-point reflected below is to listen to students and recent graduates -- yes, surveys of student satisfaction and student evaluations are imperfect. But students have a good sense of when they are more or less engaged, when they are getting excellent or subpar help from student and career services, etc. And in the aggregate, this can help us compare schools on these measures.

The specific assumptions below are drawn mostly from the recent landmark studies of legal education done by the Carnegie Foundation ("Educating Lawyers") and Roy Stuckey and others ("Best Practices for Legal Education"). Obviously, the assumptions are open to challenge; but I don't think you can challenge the people who spent years digging into learning theory, professional education, and the connection between the two without a real theory or research of your own. That is, "I've been a law professor for a while, I've been a law student, and I don't think the case method is overused" doesn't cut it.

So the conclusions from the Carnegie Report, Best Practices, and elsewhere are:
(1) Active learning leads to better outcomes than passive learning.
(2) The Socratic case method, focusing on the in-class dissection of appellate cases, is overused in the second and third years of law school, leading to student disengagement.
(3) Student-faculty contact helps increase analytic ability. (LSSSE 2006)
(4) Law students are generally undertrained in skills like interviewing, counseling, factual investigation, and negotiation. (this goes back to the MacCrate Report, at a minimum)
(5) Just as we would be shocked to allow doctors to practice medicine without first having seen a patient, we should expect law students to have experience dealing with clients before they do so "for real." Relatedly, law students are too infrequently put in the "role" of lawyers.
(6) Feedback is critical to learning. Law schools tend to be terrible at providing feedback.
(7) Coursework that integrates doctrine, skills, and issues of professionalism is better than coursework that deals with these on an isolated basis.
(8) The opportunity to specialize in a particular area of law is important, and leaves the student who specializes in a significantly better position than one who enters that area of law without having specialized.
(9) The work that most lawyers do can be broken down into three categories: litigation, transactional, and regulatory (Rubin 2007). Law schools tend to do a decent job at preparing students for litigation, but a bad job at preparing students to do transactional and regulatory work.

So here are some pieces of data we might use to evaluate relative educational quality -- where there is an existing data source, I indicate it. The ABA data is all on the Web here (can search by school); one question is how exactly one could use The Princeton Review’s Best 170 Law Schools data.

I. Education
A. Teaching Quality
-- Princeton Review ratings: Profs Interesting/Accessible
-- mean or median “contact hours” per professor
-- learned-ness -- Leiter rankings on scholarly impact

B. Classroom Experience
-- participation in LSSSE, and use of the data to increase student engagement (LSSSE)
-- average class size in 1L yr (ABA)
-- small section in 1L year (ABA)
-- number of upper-level courses with enrollment under 50, excluding seminars (ABA)
-- study hrs/day (Princeton Review)
-- # of feedback opportunities per course in the first year

C. Curriculum
Required:
-- what are courses required (school websites)
-- is there a statutory requirement? (school websites)
-- strength of 1L research and writing program (U.S. News, Princeton Review data), including # of credits and broader “lawyering” skills (school websites)

Upper-level Curriculum:
-- simulation-class slots per 100 students (ABA)
-- clinic slots per 100 students (ABA)
-- drafting or advanced-writing (but not seminar) course slots per 100 students (school websites)
-- transactional-course slots per 100 students (school websites)
-- course slots per 100 students in classes that use small-group work
-- opportunities to specialize -- is there a business law or criminal justice track, for example? (AALS Curriculum Committee Survey on Innovations)
-- skills curriculum – factual investigation, interviewing, counseling, negotiation (school websites)
-- indicators of chronic problems with insufficient course offerings (look to Princeton Review, law student blogs, etc.)

D. Bar Prep
-- Actual bar passage rate +/- that predicted based on mean LSAT score (either compare within jurisdiction, or use MBE score instead of passage as dependent variable)
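
For concreteness, here is a minimal sketch (in Python, with invented school names and numbers) of what "actual bar passage +/- predicted" could look like: fit a simple least-squares line of passage rate on mean LSAT within one jurisdiction, then score each school by its residual.

```python
# Sketch of the "actual minus predicted" bar-passage metric.
# All school names and figures below are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

def passage_residuals(schools):
    """schools: list of (name, mean_lsat, actual_pass_rate).
    Returns {name: actual - predicted}. Fitting within one group
    (e.g., one jurisdiction) washes out cut-score differences."""
    xs = [lsat for _, lsat, _ in schools]
    ys = [rate for _, _, rate in schools]
    a, b = fit_line(xs, ys)
    return {name: round(actual - (a + b * lsat), 3)
            for name, lsat, actual in schools}

schools = [
    ("School A", 165, 0.92),
    ("School B", 160, 0.88),
    ("School C", 155, 0.70),
    ("School D", 150, 0.68),
]
# A positive residual means the school beat what its entering
# credentials predicted; negative means it underperformed.
print(passage_residuals(schools))
```

The same residual logic would apply if one used MBE scores instead of passage rates as the dependent variable.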

II. Extra- and Co-curricular Activities
-- journal slots per 100 students (ABA)
-- moot court/mock trial opportunities per 100 students (ABA)
-- externships/field placements (ABA/school websites)
-- pro bono requirement (school websites)

III. Career Advising and Assistance
-- career services staff per 100 students (school websites)
-- student satisfaction with career services (Princeton Review)
-- loan repayment for public-interest (school websites)
-- starting salary +/- that predicted based on median LSAT score (I’m nervous about this one because it may discourage schools from encouraging students to go into public interest)
-- number of OCI interviewers

IV. Alumni network, present and future (associative good)
-- alumni giving participation rates
-- concentration in particular geographic region (assumption is that more concentration = stronger alumni network)
-- alumni career-advising network – yes/no (school websites)
-- Alumni Network National Reach (done by MoneyLaw guestblogger and entering Michigan 1L Michael Shaffer)
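
On geographic concentration, one hedged way to turn the assumption above into a number is a Herfindahl-style index over alumni shares by region. This is my choice of metric, not the post's, and the region data is invented:

```python
# Herfindahl-Hirschman-style concentration score for an alumni base.
# 1.0 = everyone in one region; values near 1/len(counts) = fully dispersed.

def hhi(counts):
    """counts: {region: number of alumni}. Returns the sum of
    squared regional shares, rounded to 3 decimals."""
    total = sum(counts.values())
    return round(sum((c / total) ** 2 for c in counts.values()), 3)

print(hhi({"NYC": 800, "DC": 100, "Boston": 100}))                  # 0.66 (concentrated)
print(hhi({"NYC": 250, "DC": 250, "Boston": 250, "Chicago": 250}))  # 0.25 (dispersed)
```

Under the post's assumption, the first school's alumni network would count as stronger in its home market than the second's.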

I would very much welcome thoughts on this, as it is really a first cut, but also keeping in mind that "it takes an alternative" to improve upon a suggestion. Thanks.

Posted by Jason Solomon on July 15, 2008 at 10:42 AM in Life of Law Schools | Permalink


Comments

This is excellent, I'll be borrowing part of it (with credit).

Thanks!

Steve
http://www.adrr.com/law0/reformb6.htm#Pragmatic

(Notes from a project I was involved with that finished in 1996).

Posted by: Stephen M (Ethesis) | Jul 15, 2008 1:53:07 PM

starting salary +/- that predicted based on median LSAT score (I’m nervous about this one because it may discourage schools from encouraging students to go into public interest)

If we really want to improve placement data for prospective students, short of providing complete itemized disclosure, then I think we just need to require a few simple checkboxes on employment surveys sent to graduates:


1) Does this job require:

[] A JD degree
[] Bar admission

2) Is this job:

[] Full-time or [] Part-time
[] Permanent or [] Temporary

3) If you are working as a judicial clerk, in what type of court:

[] Federal Appellate
[] Federal District
[] Federal Other
[] State Supreme Court
[] Other


If we could just see the answers to these questions, then I think you could throw out everything else that schools pretend to collect now and we would have all that we need. Just aggregate these responses and use them to calculate some rates of success. And base all metrics on the TOTAL number of graduates -- eliminating the incentive to "lose" students with poor outcomes. After all, at the very least we can say that if career services doesn't know what the outcome was for a student, then they obviously did nothing to help produce that outcome. So it makes sense to reward those schools with good outcomes and high response rates.
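
The aggregation described above is straightforward to sketch. Here is one hypothetical version in Python (the field names and figures are mine, not from any actual survey), using total graduates rather than respondents as the denominator so that non-responses count against the school:

```python
# Sketch: turn checkbox survey responses into success rates.
# Denominator is TOTAL graduates, so "lost" students hurt, not help.

def employment_rates(total_grads, responses):
    """responses: list of dicts of boolean checkbox answers,
    one per responding graduate."""
    def rate(pred):
        return round(sum(1 for r in responses if pred(r)) / total_grads, 3)
    return {
        "jd_required": rate(lambda r: r["jd_required"]),
        "bar_required": rate(lambda r: r["bar_admission_required"]),
        "ft_permanent": rate(lambda r: r["full_time"] and r["permanent"]),
        "response_rate": round(len(responses) / total_grads, 3),
    }

# Four graduates, three responses; the non-respondent drags every rate down.
responses = [
    {"jd_required": True, "bar_admission_required": True,
     "full_time": True, "permanent": True},
    {"jd_required": True, "bar_admission_required": True,
     "full_time": True, "permanent": False},
    {"jd_required": False, "bar_admission_required": False,
     "full_time": True, "permanent": True},
]
print(employment_rates(4, responses))
# {'jd_required': 0.5, 'bar_required': 0.5, 'ft_permanent': 0.5, 'response_rate': 0.75}
```

Publishing the response rate alongside the outcome rates is what creates the incentive to track every graduate.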

Question number 1) is really the elephant in the room when it comes to employment outcomes, and it's what almost all prospective students today really want to know -- will this school get me a job working as a lawyer? If we could make only one change to employment reporting, my vote would go to getting the answer to that question. Ideally we'd get answers to both 1) and 2), and in a perfect world all three.

Salary data is nice, but if we distinguish federal from state clerkships, that will work as well as anything as a proxy for placement in elite or high-paying jobs. Nobody really cares what the averages are -- 99 in 100 aspiring lawyers just want to know what their actual chances are of getting a job making $100,000 or more. With full disclosure, it's easy to aggregate the data and figure out NLJ 250 or Vault 100 placement success or whatever else you want. But those success rates all seem neatly proportional to federal clerkship placement, so that may do the job on its own.

Posted by: Michael Shaffer | Jul 15, 2008 1:56:47 PM

Thanks for the insightful post. To my mind, the area where law schools most dramatically need to improve is #6 from the Carnegie Report conclusions that you quote above: "Feedback is critical to learning. Law schools tend to be terrible at providing feedback."

Implicit in this (I hope) is the suggestion that law schools find some way to move sharply away from the current method of evaluating students with only one final exam at the end (prevalent even at the schools that get the highest reputational rankings in U.S. News & World Report). And that, to the extent they do rely heavily on a few exams, they provide and evaluate written exercises that simulate and hone the skills tested on those exams. (The idea that students are adequately prepared for analyzing exam fact patterns by use of the Socratic method in class discussion isn't the slightest bit persuasive.)

Hopefully, future U.S. News & World Report reputation rankings will reflect differences along these lines. The current ones obviously don't.

Posted by: Grateful Gadfly | Jul 15, 2008 2:34:58 PM

Grateful Gadfly, thanks for your comment -- that's absolutely implicit in this, and at some point, I hope to explicitly talk about it (others have, of course). Right now, there's just zero incentive for profs to do this, and not much for law schools to encourage faculty to do it.

Personally, I think the key is getting rid of the single big final exam so that faculty don't face an increase in grading responsibilities; just give a few smaller exams instead (e.g., instead of a two-question final exam, give two exams with one question each) and provide feedback along the way.

Posted by: Jason Solomon | Jul 15, 2008 3:15:40 PM

I've written a 0L's take on all this (in light of choosing a law school over this past year), available here.

Posted by: MSD | Jul 15, 2008 11:13:11 PM
