
Monday, December 29, 2014

Have Law Students Become Worse Students in Recent Years?

Over at his blog Excess of Democracy, Derek Muller (Pepperdine) has a provocative post titled "NCBE Has Data To Prove Class of 2014 Was Worst in a Decade, And It's Likely Going to Get Worse."  Derek recounts that the overall bar passage rate across the country for the July 2014 sitting was down as compared to previous years, and he posits that the lower results were caused by "student quality and law school decisionmaking."  He believes that the data suggest that lower-quality students, and the educational decisions of law schools, are producing graduating classes that are less qualified overall, in turn resulting in lower bar passage rates.

In essence, students come into law school having done worse on the LSAT, and they leave law school doing worse on the bar exam.  

Are they doing worse while in school as well?

Reflecting on the past few years, I wonder if Derek is on to something, particularly with respect to law student quality.  If he is correct, then we should expect to see lower student performance while students are in law school.  Is the day-to-day classroom discussion, or their final exam performance, worse now than it was a few years ago? 

My own experience suggests that the answer is...probably yes.  But unlike LSAT scores or bar passage rates, performance while in law school is much harder to measure.

My students continue to be bright, inquisitive, and engaging.  Further, Kentucky's bar passage rate, at over 90% for the past several years, remains high, even though, as at most other law schools, the LSAT scores of our incoming classes have dropped.  But there might be something intangible -- something that professors might notice in the classroom or on an exam -- that suggests that law student quality may be lower than even a few years ago.

Without suggesting that any particular student has been weak (I love you all!), upon reflection I have noticed that as a whole it takes longer now than even a few years ago to teach deep legal reasoning.  What I mean is that students from the past few semesters, as compared to several years ago, seem to have a harder time making connections between the various doctrines, engaging in deeper-level thinking, and applying the legal rules to new scenarios in creative ways.  I have spent more time recently going over material more than once or walking students through the basics of legal analysis.  Moreover, their writing, at least when they begin law school, seems less advanced than in previous years.  (I have an early-semester writing assignment in my Civ. Pro. class, so I have a sense of their writing toward the beginning of their 1L year.  Luckily, our excellent LRW professors can, and do, improve their writing dramatically while they are in school.)

Regarding their exam performance, I again find that as a whole the students have not been as strong in deep and complex analysis.  (And I can assure you that it had nothing to do with this year's Civ. Pro. fact pattern involving prisoners.  I'm talking about their analysis on the actual Civ. Pro. issues based on the call of the question, such as personal jurisdiction.)

Of course, the problem could lie with me as a teacher.  Maybe I am not connecting with this crop of students as well as I did previously.  Maybe my exam was harder this year than in previous years.  

I hope, however, that with each year I become a better teacher than I was the previous year.  And I don't think the exam was materially more difficult than previously.

Again, let me emphasize that many, many students performed quite strongly.  Yet I still have the sense that for many of them the analysis was not as deep or nuanced as it could have been, and their raw point scores on their exams showed it.  The students did well spotting issues and giving a surface-level interpretation, but for many, complexity was lacking.

Luckily for us at Kentucky, our excellent faculty can overcome (and has overcome) these kinds of challenges -- as our high bar passage rate reflects.  But I am still left with the question that forms the title of this post: are law students, who potentially have worse credentials coming in than in previous years, and who may be having a harder time with the bar exam, doing worse in the classroom?  And if so, what should we do about it?  Are there innovative teaching techniques we should employ to account for this trend?  Should our overall grades reflect poorer student performance by lowering the curve (if, in fact, that is warranted by lower-quality performance)?  Are there systematic changes we should make?

In the end, I am confident that my students, while perhaps coming in with lower numerical credentials, are still excellent students overall and will make fine lawyers, even if their classroom and exam performance has changed somewhat over the years.  But Derek's post makes me wonder whether there is something more we should do in the classroom to account for lower-credentialed incoming students.  If bar passage rate is the measure of success, then across the country our outputs have diminished.  Derek points to the inputs (incoming student credentials) as at least one source of the problem, and my anecdotal evidence backs this up, at least somewhat.

What can we do for students to improve the outputs given the (potential) new reality of lower inputs?  All law professors have the responsibility to spend significant time and energy contemplating this question.

Posted by Josh Douglas on December 29, 2014 at 09:17 AM in Teaching Law | Permalink


It's sad that schools' financial conflicts of interest prevent them from ensuring that they graduate only high-quality prospective attorneys, and that those who are less fit to practice law will be kept out only by their inability to pass the bar exam after spending over $100,000 on law school. Also, I pity the client who spends thousands of dollars on an attorney who passed the bar only after failing numerous times. Many consumers of attorney services don't know enough to evaluate attorney quality, and are harmed by incompetent lawyers who take their money and bungle their cases.

Posted by: joeyjoejoe | Jan 1, 2015 1:26:46 PM

Occam's razor: the answer is likely the simplest -- less intelligent students on average are entering law school. Law is not an incredibly complex field, but I think it does require slightly above-average intelligence to perform at the requisite level in law school. If law schools are admitting people who do not have the intellectual resources to do in-depth analysis -- and the plummeting LSAT scores suggest this -- then that would be the most logical explanation.

Posted by: twbb | Jan 1, 2015 1:22:34 PM

The legal education profession is clearly in crisis from every viewpoint except faculty and dean compensation and duties. Those, too, will be hit very hard over the course of the decade, and the first signs are present already.

The fault is 100% on the part of the law schools and their supporters. They built new law schools, expanded old ones, and increased tuition far beyond reason. They not only ignored outcomes, but in many cases committed fraud with salaries and jobs, and resorted to dishonest bullsh*t to make their customers think that a law degree was worth it.

I've seen far more criticism of the people speaking the truth from faculty and administrators than action.

Posted by: Barry | Jan 1, 2015 10:22:06 AM

Some thoughts, moving along the thread:

Ex-Student: I only give take-home exams, which students usually have more than a week to complete (1L exams, because of scheduling, usually run for 72 hours or so). The decline described here is consistent and pervasive. So, while I am absolutely a believer in more realistic ways of testing students (open book, time to organize and edit one's thoughts, etc.), I am regretfully coming to the conclusion that it almost never matters. I suppose I'll keep doing it on principle.

Andrew: My problem with the double-bind you suggest is that students actually should have a pretty good idea about what the prof considers frivolous - I mean, the student has literally sat in a room and listened to the professor opine about "plausibility," "frivolousness," "reasonableness," etc. all semester long. Students should not be expected to be experts by the end of the semester. But, if they've mindfully worked their way through 75-100 cases, with professorial editorializing and extension, they should have a sense for what the particular body of law is trying to accomplish. Repeatedly, I have read exams where students avoided issues we discussed in depth all semester long in order to hit some inapposite bullet points from an outline. They are practically allergic to analogical reasoning - which amazes me, since that's what all the cases we read tend to do - preferring cut-and-dried answers. There is a small place for cut-and-dried answers, but lawyers will rarely have the privilege of being asked to provide them.

1L Thinker: I'm sure your kitchen sink approach works sometimes, and you report great success with it. That depresses me, because I don't think professors who reward that have done their jobs (though I agree that the "prompting checklist" approach sketched above can be a useful way to begin thinking about an answer). However, I am very certain it doesn't work for my students, because students who write like this tend not to answer the question. That is, when I'm reading an exam answer where large dollops could've been inserted into virtually any exam on the subject - regardless of the hypothesized facts - it indicates a poor engagement with the actual problems I thought to put on the exam. Moreover, it can call into question whether the student really understands what he or she is saying. If I give a hypo involving an inattentive driver who wanders into the next lane, I'm going to be shocked if the first answer the student comes up with is the Learned Hand formula. Fascinating as it is, I am really clear that it has very limited practical application. I mean, one COULD analyze a simple car accident that way, but given the many superior (and clearly emphasized) methods of analysis at one's disposal, I'm going to wonder if this student has been paying attention.

Usually the "kitchen sink" method conceals from the student the right tool for the job, and prevents it from being fully utilized. Save it for your scratch paper.

Most students can summon (via notes or recall) most of the law that might abstractly bear on a question. It takes more effort - and strangely, a kind of "bravery" - to take a stand and make a prediction. To do so while acknowledging legal uncertainty (Is this exactly the right rule for this unusual case? What about cases that go the other way?) is often the hallmark of a successful answer.

I want to be clear about what we're not talking about here. Law professors certainly love theorizing, and that can be a valuable component of legal education. But the lapses my fellow professors see here don't stem from students' unwillingness to acknowledge the "deep structures" of torts, or sociological aspects of this or that body of law. I mean, these might be interesting, but aren't going to get very far before a judge. Instead, the problem lies in getting students to execute the same types of analysis that students see in case after case throughout the semester.

Posted by: Adam | Dec 30, 2014 2:10:29 PM


Ok, that makes sense, but then you're always going to get the shotgun approach, as students think through the problem sequentially. It's almost as fast to observe in one sentence that diversity isn't an issue as it is to debate internally whether to write it down at all, so when I took exams, I wrote it down.

I agree with you both that the best approach for students is to spend the most time on the best arguments and that leaving out frivolous arguments makes sense in practice, but the incentives of an exam all but guarantee that the same student who knows full well not to annoy a judge with frivolous points will still put them down on the exam.

Overall, it sounds like it's not going to make a difference in your grading, so perhaps it doesn't matter, but it's just a common lament I've heard from professors that I think places the blame inappropriately on students rather than the incentives students face that make the shotgun approach rational. I'm not sure how to avoid that, though, in an exam-based class except severe time-limiting (which has its own issues of rewarding faster typists at least).

Posted by: Andrew Selbst | Dec 30, 2014 1:33:02 PM


Thanks for your comment.

I don't take off points, explicitly or implicitly, for analyzing frivolous issues. I read it all and try to find points for the issues that *are* relevant. But spending time on the frivolous issues takes time away from the issues that matter, and too often students waste time on the frivolous stuff and don't put in as much time on the actual issues, resulting in surface-level analysis for all of it. (I did say that I often discount answers as being of poorer quality -- although I don't take off points -- when they throw out the name of a case we didn't read in class, but I think that's materially different, as the citation is usually accompanied by a lack of depth on that point.) So yes, if you want to go through subject-matter jurisdiction when it's not really an issue, go for it, so long as it doesn't take time away from a full analysis of the issues that really matter. If this is an implicit way of "taking off" points, so be it. As I advise my students, the best strategy is to go for the high-value issues first. This is also good lawyering -- go for your top issues first before mentioning any likely-losing arguments. Starting with losing arguments is a great way to annoy a busy judge.


Posted by: Josh Douglas | Dec 30, 2014 1:16:44 PM

If the concern is that students are not providing sufficient depth of analysis, perhaps it's worth considering that a 3-hour timed exam is not really an appropriate means of testing the material?

Posted by: Ex-Student | Dec 30, 2014 1:14:07 PM


When I was in law school, I had a professor who told us he would take points *off* for frivolous issues we listed. This seems to be in line with your approach. I don't think you said so explicitly, but you seem to downgrade the answer overall as a result. The problem with this is that where the line between frivolous and merely losing can be hazy, the student is stuck in a no-win situation: write the answer down and risk it being deemed frivolous, or leave it off and risk the professor thinking you missed an issue.

Putting law students in this knife-edge problem can be worse than simply ignoring the frivolous listed answers because it makes grading somewhat arbitrary. The student cannot know exactly what to expect from the professor, so there's risk either way that the result is due not to her understanding of the material, but to her guess as to the professor's strictness on frivolous issues. By simply ignoring the irrelevant stuff in a shotgun approach, you at least ensure that students do not face conflicting incentives about what to put down and that they are really trying to list everything relevant; if it's irrelevant, then they're just wasting their own time.

Posted by: Andrew Selbst | Dec 30, 2014 1:03:17 PM

As a practitioner at a large firm, I have noticed a recent increase in the number of young associates who are unable or unwilling to do the deeper analysis of issues that our job typically requires. I enjoy giving young associates access to some of the more challenging and open-ended questions that our clients face (not least because those are the kinds of questions that would have enthralled me when I was in their shoes). Over the past couple of years, however, I have encountered several associates who have expressed frustration that these questions do not have clear-cut, verifiable answers. They spend a disproportionate amount of time searching for some magical on-point case that doesn't exist, and then struggle to provide helpful legal analysis in the absence of controlling authority. Like other commenters, the best I can do is hazard a guess at the cause. But the phenomenon Professor Douglas describes appears real, and is not limited to the classroom.

Posted by: Vapplicant | Dec 30, 2014 12:47:24 PM

AnonProf13: Absolutely! I LOVE checklists and encourage my students to use them to issue spot. And I have a new draft article called "A Checklist Manifesto for Election Day" that specifically calls for the use of checklists for poll workers and voters, stemming from Gawande's book. I'll be posting about this article in the next day or two.

So in no way was my comment meant to criticize checklists themselves -- just the opposite: I advocate their use wholeheartedly. So yes, my criticism is focused on, as you say, the "shotgun checklist approach" that shows a lack of critical thinking about the application of the facts to the law. Students should use a checklist to ask, "Is subject matter jurisdiction an issue here? No. Is personal jurisdiction an issue here? Yes." And so on. They shouldn't use the checklist to conclude that they need to spend a lot of time on each of the issues even when an issue is irrelevant or non-existent given the facts. And perhaps I need to clarify that more in future years -- but it seems to be a newer problem.

Thanks for pushing me to clarify.

Posted by: Josh Douglas | Dec 30, 2014 10:51:52 AM

Josh, I'd like to push back a little on your criticism of checklists. What I think you are critiquing is what we might call the "shotgun checklist approach": writing an exam answer with a predetermined set of issues to raise regardless of what issues the facts implicate. I have no disagreement with your objection to this approach.

Your comments could be read, however, to criticize the use of checklists, per se. Based on recent research in educational psychology, I encourage students to think through the analysis of a problem using a checklist of legal issues covered in the course. This way, students have a systematic way of ensuring that they don't miss issues, but they also have a built-in way to prioritize bigger issues and rule out non-issues. This approach then actually helps in the lawyerly task of prioritizing, de-prioritizing, and eliminating issues. (I'm thinking specifically of Atul Gawande's "Checklist Manifesto" in this regard).

Is it fair to say that your criticism is limited to the former and not the latter?

Posted by: AnonProf13 | Dec 30, 2014 10:33:47 AM

This is an interesting discussion, and I think it reflects what many of us have observed, particularly at exam time. If, in fact, the quality of applicants has changed, then many students will likely find themselves placed at better schools than they would have been a few years ago, and some students in the lower tier will find places that may not have been open to them in the past.

Students might react in a number of ways, two of which seem to stand out. First, students might feel like they have won a golden ticket by being moved into a higher-ranked school, and they might respond by working less hard because they feel they are, in fact, golden. Others might decide that they need to work harder in order to succeed at a higher-ranked (or better) school. The latter seems to me the far more rational response, because there is no reason to think that law firms will react the same way to a lower-quality student body, one that is likely to be reflected on their resumes. (If students who five years ago would have been attending school #50 are now at school #30, law firms might see that, or they might just be happy to hire from school #30. It is hard to know, but time will likely tell.) Anyway, for whatever reason, my experience has been that some, though not all, students have opted for working less hard and rely excessively on commercial outlines and outdated student outlines. But faculty also seem to respond similarly -- many will teach down to lower-quality students, emphasizing doctrine more and analysis less, and that, too, seems problematic.

On a slightly unrelated subject, many faculty give exams that call for a shotgun or kitchen sink approach -- say as much as you can in the time allotted. For the life of me, I will never understand why anyone thinks this is a good approach, or one that might aid a lawyer, but it poses a problem for students who are trying to figure out how to take exams when they are told to make all arguments, good or bad, relevant or irrelevant. Professors who award points for things said without taking away points for miscues (all of these elaborate point scales that law professors, but seemingly only law professors, come up with) encourage the same phenomenon, and those professors have nothing to gripe about regarding the quality of the exams they grade.

Posted by: anon | Dec 30, 2014 10:09:13 AM

Think Like a 1L: Yes, students and lawyers should argue issues that they might not win so long as they are not frivolous -- and I give points for that. But no lawyer in their right mind would argue lack of subject matter jurisdiction in the example I gave when I made the citizenship of the parties and the amount in controversy abundantly clear. The only "viable" argument on this issue is possibly that the amount in controversy was not pleaded in good faith, and I also awarded points for that argument. But discussing the citizenship of the parties (in detail, mind you) when the facts were absolute on this point was a waste of time, and is an example of the "shotgun" approach I mentioned that demonstrates a lack of critical thinking.

Posted by: Josh Douglas | Dec 30, 2014 10:07:35 AM

If you say so.

Posted by: Howard Wasserman | Dec 30, 2014 10:00:21 AM

Jojo: "I love the law school as rorschach test revelation here.

If they're dumb, it's because of the scambloggers, but if they're smart it's because we taught them to think like lawyers."

Thanks for posting this.

Mr. Wasserman, anybody who blames 'scambloggers' for anything is clearly in the wrong, both factually and morally.

Posted by: Barry | Dec 30, 2014 9:55:57 AM

@Howard/7:49 a.m.:

I have a quibble with the idea that more students are working because of "the cost of life" rather than the cost of law school, because there are so many scholarships and discounts these days.

I don't have the Google-fu necessary to find what FIU Law's tuition was in 2008, but I do know what the University of Pittsburgh's was: around $19,000 for residents, and $31,000 for non-residents. Seven years later, the University of Pittsburgh is charging $30,816 for residents and $38,300 for non-residents, and I understand that this phenomenon is fairly widespread as law schools hope to attract applicants with the semblance of a deep discount. Some matriculants are nonetheless paying those sticker prices so that others might pay less or nothing for the same credential, and those matriculants are likely to end up below the median anyway. Why compound the error of attending law school at too high a cost for one's marketability by also paying for one's rent and groceries with the same non-dischargeable loans accruing 7% interest?

Posted by: Morse Code for J | Dec 30, 2014 9:42:23 AM

At least based on what I'm seeing at my school and hearing at others, it's going in the opposite direction. A greater percentage of students are receiving non-loan aid, including full-tuition rides (schools across the board are spending more on scholarships), at the same time that more are working part-time jobs.

There's also a nice question whether any of it helps. How likely is it that the part-time job I hang onto throughout law school (and possibly commit more time to than I do to classes and school) will lead to a job? Would my job prospects be better if I worked less and devoted more time to school (on the assumption that I'd do better)?

Actually, the issue pushing many to work is less the cost of law school than the cost of life. A lot of students don't have parents/spouse/partner who work or provide support, so they need the job for that extra living money that we all need. That draws us into a different debate over full-time v. part-time legal education.

Posted by: Howard Wasserman | Dec 30, 2014 7:49:16 AM

Prof. Douglas: I'm dubious about the "viability" delimiter. Depending on how thoroughly you explained this prior to the exam -- as opposed to in instructions that students read on the exam -- I don't think it's reasonable to expect students to know exactly what it means. Must a "viable" issue be dispositive in their favor? It's certainly not the case that lawyers won't argue things they don't expect to win.

It's not a huge issue, but this sort of thing sometimes jumps out at me as irritating exam craftsmanship. In my case it probably would have gotten an introductory "Subject matter jurisdiction isn't a 'viable' issue here because... so let's move on."

But I think that one of the most common problems in exam questions is failure to sufficiently define the scope of an answer. Obviously going too far the other direction defeats the purpose of an issue-spotter, but I don't think that means the scope should be open to reasonable dispute.

Posted by: Think Like a 1L | Dec 30, 2014 4:21:57 AM

Seems like many comments are missing one underlying reason why undergrads and law students spend less time studying: economics -- they must spend more energy paying for it all. Students ten years ago didn't need to raise as much money. And even if it's all on loans, that's a major stressor that makes getting a job much higher stakes. Give me free tuition and I'll spend a lot more time studying and less time at my part-time job.

Posted by: max | Dec 30, 2014 2:22:22 AM

Thanks for the many interesting comments on this initial post. A few follow-up points:

First, thanks for the citations to the scholarship on this issue. I was aware of some, but not all, of it, and I'm glad it's being examined in a deeper and more holistic way.

Second, I think Derek and Howard are absolutely correct: today's students, even as compared to students from a few years ago, are increasingly looking for the "answer" and the short way out. Previous students also used supplements to help them through difficult issues; but this crop of students seems even more reliant on outside materials to explain the doctrine, and then they stop there. (Several times this semester a student answered a question in class with the sound-bite version of a rule from a supplement. Of course, I always push for more.) Moreover, I find that my students are increasingly relying on outlines from previous years. I discourage this practice because I think that students need to do the work of making an outline for themselves, which helps them synthesize the material, make connections, etc. But on several of the exams this year I saw students actually citing a case that I had assigned in previous years, but not this year. (I usually award fewer points for those answers because the citation to the case is often accompanied by a lack of sophistication.)

Third, I want to give just one example of the lack of deep-level thinking that I am talking about, as it might help to illuminate at least one aspect of the problem. One of the questions on my exam said that the defendant raised "all viable defenses in a motion to dismiss," and I asked the student to act as the judge in resolving the motion. I also emphasized the word "viable." The facts, for this particular part of the problem, had said that the plaintiff was a citizen of state X, the defendant was a citizen of state Y, and the amount in controversy was $100,000. A motion to dismiss for lack of subject matter jurisdiction was a complete waste of time and earned no points, taking away valuable time from the defenses that *were* viable. But many students dutifully went through each of the defenses listed in Rule 12(b), even if there were no facts that would implicate some of those defenses. A good lawyer would never raise lack of subject matter jurisdiction to a judge in this instance, and I was clear in the instructions (and in pre-exam study sessions) that the students should go after the key issues actually raised by the facts. But instead, many students took a "checklist" type approach and went through every *potential* defense instead of every *viable* defense. This shows a lack of deep thinking and critical analysis.

To be sure, for students who typed fast enough to raise every 12(b) defense for every question (which rarely happened given the time constraints of the exam), the checklist approach probably avoided their exam scores being at the very bottom. But the shotgun approach without actually thinking critically about whether certain defenses would apply is, to me, a new development over the past few semesters, as compared to several years ago. (Keep in mind that I'm only in my 5th year of teaching, so my sample size is low--and yet I still see this change.) It was particularly frustrating this year given that I emphasized the point that students should think about whether a lawyer would actually raise a defense before a judge, given the facts at hand. Derek and Howard's explanation for this phenomenon -- over-reliance on supplements and old outlines -- makes a lot of sense. And this is just one example of where deeper analysis was lacking.

Posted by: Josh Douglas | Dec 29, 2014 11:39:49 PM

No data, just many, many anecdotes. I have a number of friends who went to elite law schools after doing undergrad at large state universities; they all describe never writing anything as undergrads and never having to do much critical thinking, then facing a steep transition in law school and it taking a while to figure it all out.

I'm not suggesting that people not go to large public schools for undergrad, or that large public schools are bad, or that students from public schools make worse law students than those from other schools. But if we are thinking about student readiness, the whole of undergrad matters--major, courses, and school.

Posted by: Howard Wasserman | Dec 29, 2014 10:09:45 PM

I think, somewhat in line with Asher's comment, that it's incorrect to suggest that the issue-spotting/checklist point-accumulation approach to exam-taking is well-tailored only for getting "the points necessary to avoiding the very bottom scores in the course."

It's my practice to apply this "kitchen-sink" method even in the face of professor claims that points are awarded for brevity, organization, or "deep"/"overall" analysis and reasoning. This has been consistently effective.

And all you really need to do is look at an exam a professor has actually graded. In many cases you will observe that they make check marks in the margins, the number of which equals the total score.

My school is in the USNews top 50.

I don't really see any better way to grade a law exam that wouldn't result in excessive subjectivity.

Posted by: Think Like a 1L | Dec 29, 2014 9:50:27 PM

Professor Wasserman, why do you single out "large public schools"? Do you have data suggesting that students from public schools perform worse in law school than students from private schools? I think that it is reasonable for prospective law students to attend public schools when facing the prospect of a very expensive professional education. I knew that I was going to try to attend law school before I entered undergrad, so I went to a public school to lessen the total cost. I do not believe that my decision resulted in poorer performance in law school.

Posted by: anononon | Dec 29, 2014 9:33:29 PM

I love the law school as rorschach test revelation here.

If they're dumb, it's because of the scambloggers, but if they're smart it's because we taught them to think like lawyers.

Posted by: Jojo | Dec 29, 2014 8:33:02 PM

@Howard/3:45 p.m.:

How is this state of affairs (study time at a premium, students attending law school with a job in mind rather than personal enrichment, use of supplements and outlines) any different from what it was seven years ago?

When I was a 1L, there was a robust (and unique) industry in supplements and outlines to explain what the assigned text and the sometimes-unilluminating hours in class were supposed to explain. Whether this reflects well or poorly on the quality of instruction is a separate debate, but the point is that I doubt there ever was some Golden Age of Student Engagement With The Text, that just recently passed for reasons no law professor can explain.

If the process of legal education is not different and rates of bar passage are worse in a statistically significant way, then it seems silly not to consider reduced selectivity among the inputs to law school as the primary cause.

Posted by: Morse Code for J | Dec 29, 2014 7:51:53 PM

In reply to the 3:07 comment, it's a fact that scads of students rely heavily on supplements and commercial or inherited outlines. But about this complete diversity example, what would be better? My problem with "Section 1332 states that there must be complete diversity" is that it states no such thing; Strawbridge is where complete diversity comes from. So if I taught civil procedure, I'd give an extra point for "Section 1332 requires diversity; in Strawbridge the Court interpreted a predecessor statute to require complete diversity."

It sounds, though, like your problem with your hypothetical answer is that the student isn't providing the complete language of 1332, or Strawbridge, and that even my answer would look to you like something out of a study guide. (Surely a good study guide would cite Strawbridge.) So would you prefer to see something like "Section 1332 requires that a civil action be between citizens of different states; Strawbridge interpreted a predecessor statute to require that each party be able, absent his co-parties, to sue, or be sued by, each of his adversaries"? That's a rough paraphrase of Strawbridge and the statute. But I wouldn't think you'd want to see all that, for a few reasons.

First, on a short exam, one would be spending a lot of time making the same point that "complete diversity" would suffice to connote, at the cost of spending time on other points that aren't nearly so simple.

Second, most professors will reduce Strawbridge to the slogan "complete diversity"; all an answer that engages with the language of Strawbridge would really show is that the student took some time to memorize a paraphrase of Strawbridge. He isn't really deriving the rule from Strawbridge; he's memorized the rule and copied down a line from Strawbridge.

Third, what's wrong with the distillation of Strawbridge and the statute to the slogan "complete diversity"? Distilling an awkwardly worded, 208-year-old case to a clear phrase is actually a skill, and one that we expect lawyers and judges to possess. I never see, in a brief or an opinion, anyone actually engaging with the language of Strawbridge; rather, people simply cite it for its rule, which they summarize as complete diversity.

I think that, as a general matter, the more complex or open-textured the rule of a case or statute, the wiser a student is to engage with the language of the case or statute instead of applying some vague or possibly contested summary of it. But if you have a very simple and mechanical rule, like the rule of Strawbridge, or 1332's amount-in-controversy rule, engaging with the language of those rules in a test just seems like a copying exercise. I wouldn't suspect students who don't engage with the language of that kind of rule of relying on supplements.

Posted by: Asher | Dec 29, 2014 5:11:36 PM

Many of us are advocating new approaches to teaching to help students with poor educational backgrounds. See Michael Hunter Schwartz et al., Teaching Law by Design (2009); Michael Hunter Schwartz et al., What the Best Law Teachers Do (2013); Scott Fruehwald, How to Help Students from Disadvantaged Backgrounds Succeed in Law School, 1 Texas A&M L. Rev. 301 (2013) (http://ssrn.com/abstract=2322486); E. Scott Fruehwald, Think Like A Lawyer: Legal Reasoning for Law Students and Business Professionals (2013).

Posted by: Scott Fruehwald | Dec 29, 2014 4:38:09 PM

I did not use and would not use the word "lazy." I don't know the answer (if I did, they'd make me a dean), but I'll venture a few guesses:

I second everything mentioned in Rebecca Flanagan's abstract--study time, consumerist orientation, and weaker study and thinking skills brought forward from the wrong undergrad majors (and, I would add, from attending large public universities). In particular, study time is decreasing as more students work while in law school (the ABA's rescission of the prohibition on 1Ls working is not going to help). And all of those things spur even greater reliance on study guides (see my comment and Derek's comment, above). Students are unaccustomed to engaging with primary sources--cases and statutes--and want to rely primarily (or even entirely) on the secondary sources that explain and summarize.

I think the "law-school-is-a-scam" narrative has an effect on this (this somewhat relates to consumerism). Those students who still decide to go to law school come with a set of demands and expectations--"stop playing games and just tell me the answer," "this isn't what law is about," "this isn't making me practice-ready," "just give me experience so I can get a job"--that are inconsistent with the deep-dive of a doctrinal course, especially a 1L course.

Posted by: Howard Wasserman | Dec 29, 2014 3:45:43 PM

Building on Derek's point: Study guides are both more pervasive and more easily accessible. I get the sense that there is a real (and problematic) over-reliance on these materials; some students are relying entirely on them, while others do a cursory once-over of the assigned materials, then spend more time on the supplements. And the answers Josh (and others) are seeing are reflective not of engagement with the assigned materials, but of the surface-level analysis in the study guide.

One indicator that I notice: Students are unable to provide the complete language of a rule or statute, but only provide the summary of it. For example: "Section 1332 states that there must be complete diversity." That suggests that their engagement is not with the text and the interpreting cases, but with the summary guide.

Posted by: Howard Wasserman | Dec 29, 2014 3:07:08 PM

Thanks, Josh, not only for the kind words but for this thoughtful post. Many of your points struck me as valuable, including this observation: "The students did well spotting issues and giving a surface-level interpretation, but for many, complexity was lacking." I wonder about this frequently.

I review many "study guides" for law students, and they often heavily rely on "point accumulation," or identifying issues, spitting out definitions of law, a "checklist" approach to exams, etc. This is, admittedly, an "easy" way to get the points necessary to avoiding the very bottom scores in the course. And, in part, I reward that, because, after all, identifying issues, etc., is an important skill, and it's, perhaps selfishly, a relatively easy, "objective" thing to calculate in an exam score. But there's very little, from what I can see, in these "study guides" about complexity, depth of treatment, nuanced understanding, marshaling complicated facts in a hypothetical--in short, the depth-of-treatment you identify.

That said, it seems whatever I say on this point is not valued nearly as much, for whatever reason--perhaps because it's harder for students to pick up, it's harder for me to grade, it's harder for a study guide to identify, it's not something 2Ls through oral tradition pass along to 1Ls, whatever the reason. And then, further, measuring that output is still harder, even assuming we want to try to slow the pace, to emphasize depth and complexity. I still have more questions than answers at this point, but thanks for this (important) contribution to the conversation.

Posted by: Derek Muller | Dec 29, 2014 2:57:20 PM

Some lawblogger pointed out that the class of 2010 was actually admitted under less selective standards than previous classes (IIRC, the acceptance rate started increasing sharply then). And since then it's clearly been downhill for a large number of schools. When I browse Law School Transparency, seeing 75th percentile LSATs at the 25th percentile of a few years ago is no longer shocking, and I don't recall seeing *any* school increase those percentiles. Given fewer students taking the LSAT, and *massive* increases in acceptance rates, it's undeniable that the quality has been going down.

One new piece of information for this debate: from my casual browsing, it's very common for the median GPA to have dropped from 2010 to 2011 and then flattened. Given dropping LSAT percentiles and likely a worse raw performance distribution, the flat GPAs make me think that the schools dropping LSAT standards are now recruiting quite a bit more from weaker undergrad programs.

This means that the raw student ability is lower, *and* that they are inadequately prepared.

Posted by: Barry | Dec 29, 2014 2:37:31 PM

This article explores this theme and analyzes the data. Here is the link and the abstract:

Link: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2478823

The Kids Aren't Alright: Rethinking the Law Student Skills Deficit

Rebecca C. Flanagan
UMass Law

June 1, 2014

Brigham Young University Education and Law Journal, Vol. 15, No. 1, 2015 Forthcoming


This article explores the decline of fundamental thinking skills in pre-law students and the challenges facing law schools admitting underprepared students during a time of constrained budgets and declining enrollment. A growing body of empirical research demonstrates a marked decline in the critical thinking and reasoning skills of college graduates. The causes of the decline are interconnected with other problematic changes on undergraduate campuses: 1) a dramatic decrease in student study time since 1960 (research suggests that undergraduate students spent one-third less time studying in 2003 than they did in 1961); 2) a consumerist orientation among college students, resulting in a diminished focus on learning; 3) grade inflation at undergraduate campuses, resulting in grade compression and an inability to distinguish between exceptional and ordinary students; and 4) a decline in undergraduate students choosing to major in the liberal arts that provide the foundation for early success in law school. Declines in study time, grade inflation, and changing patterns in student class choice have created an undergraduate learning environment that is less rigorous than undergraduate education fifty years ago.

This article challenges law schools to examine the adequacy of traditional support programs when incoming classes require systemic and sustained academic assistance. Law schools have traditionally helped academically underprepared students through academic support programs; however, traditional ASPs are not equipped to provide broad-based and comprehensive assistance to large numbers of law students. Law student underpreparedness is a “wicked problem,” so complex that singular solutions are impossible. Law schools admitting substantial numbers of students with lower levels of academic preparedness need to ask themselves questions to determine how best to address these challenges. The broader legal community should reflect on these questions because the answers will require all stakeholders to invest in changes to undergraduate education as well as legal training.

Posted by: Eugene Pekar | Dec 29, 2014 2:36:21 PM

@Howard/11:56 a.m.:

So why are law students so much lazier and less capable now than they apparently were only a few years ago?

Posted by: Morse Code for J | Dec 29, 2014 12:10:14 PM

I suspect that this decline reflects the decline in the quality of applicants rather than a general "what's the matter with kids today?" generational lament.

As applications have fallen off a cliff, and as LSAT retakes have been permitted, I have no doubt that the quality of a 1L in today's law school, other things held equal, is a standard deviation below the quality of a 1L from a decade ago in all but the top 10 law schools.

Sadly, this will be amplified as law schools are unwilling to self-police and the ABA is unwilling to perform a quality-assurance function.

Posted by: Jojo | Dec 29, 2014 11:58:16 AM

Could it be a matter of effort rather than ability? Are students less willing (because of unreasonable expectations of what law school should be) or able (because of study habits and outside distractions) to put in the time to do the "close reading" and preparation that allows them to get below the surface of this material?

Posted by: Howard Wasserman | Dec 29, 2014 11:56:23 AM

Hopefully most institutions and professors have already been teaching to the best of their ability and continuously seeking to improve their methods based on experience and feedback. I would not assume there is any pedagogical change that can be implemented to negate the impact that lower admissions criteria will have on the quality of a school's graduates. If a school is serious about maintaining its educational standards, then it should find ways to reduce expenses so that it can support a smaller, more select student body.

Posted by: Anon | Dec 29, 2014 11:48:18 AM
