
Monday, September 07, 2020

Data--Rough Data--on Bar Exams and Covid Cases Among Test-Takers

With due caution and various caveats but a serious underlying point, I commend to you this post by Derek Muller at the Excess of Democracy blog. Derek writes of his efforts to obtain information on "the spread of Covid-19 related to the administration of" the July 2020 bar exam in the jurisdictions that held in-person bar exams this summer.

Derek reports that he heard back from bar officials in seven jurisdictions and, "to their knowledge, no one contracted the coronavirus on account of the administration of the bar exam. . . . Some additionally confirmed that no proctors or staff contracted it, either." He adds that "some jurisdictions did emphasize they asked test-takers to disclose if they contracted the coronavirus within 14 days after the exam, and none did so."

Of course caveats apply. I can come up with many; you can come up with many; Derek did come up with many, and notes them near the top of his post. I am personally less sanguine than he is when he writes, "My instinct is that if someone did contract the coronavirus during the administration of the bar exam, we’d probably know by now." I'm not sure I have an instinct about this one way or the other. But my imagined scenarios for under-reporting include, at a minimum, secrecy, incompetence, caution about making disclosures, uncertain standards of causation, and a lack of organized data collection. (Asking test-takers to disclose is better than nothing but doesn't seem a terribly powerful effort.)

I do not, then, take the post as strong proof of anything. And its interest for me is quite disconnected from my policy views on the bar exam, now and generally. It is possible to think the in-person bar exam is dangerous for current public health reasons without opposing the bar exam generally, and equally possible to think that it's relatively safe, or can be made so, and that the bar exam should be replaced by something else for other reasons. Our normative and policy views and our sense of the evidence on a particular point needn't move in parallel, and there may be reasons to be suspicious when they do. 

I appreciate and commend Derek's post because it is an effort at collecting data to evaluate the many warnings and predictions that were made about the bar exam ex ante. Leaving aside the students and recent graduates, many academics made various predictions or voiced various concerns before the bar exam. These concerns included but weren't limited to the question whether in-person bar exams would spread the virus among test-takers. (Another ex ante argument was that there would be a shortage of lawyers and a surfeit of new clients with pandemic-related legal service needs, and that diploma privileges or other measures would help those new lawyers improve access to legal services for those clients.)

Voicing ex ante concerns about risks is perfectly understandable. One can hardly wait until after the event to express worries about future risks. And the outcome doesn't mean the concerns about risk were unwarranted. But it seems to me that, at least for those whose arguments were based on academic expertise, or invoked that expertise and appealed to past and ongoing empirical study of the issue, or otherwise invoked a kind of academic or data-driven or scientistic authority in making various arguments, there is an arguable duty to follow up and see what the data ultimately revealed about the accuracy of those warnings. In the long run that would include, I should think, studies of the discipline levels of this cohort of new lawyers depending on the approach taken in different states. And it would be useful in the shorter term to work to find out whether any state's approach actually resulted in any difference in the level of legal services provided to clients in need, and whether those services were provided by new lawyers or by already existing practitioners.

I'm not a particular fan of the bar exam, as I've written before, although I also think some claims for the value of the diploma privilege and some claims against the bar exam seem overstated, and that a period of mandatory supervision in lieu of examination ought to be of meaningful length and contain reasonably detailed requirements for both the supervised and the supervising lawyer. But none of these views has anything to do with whether it's a good idea for those who make predictions to follow up on those predictions with data after the fact. Of course it is. The data would be interesting in themselves, and a better measure of the authority of those making predictions than a general appeal to their credentials. (Even experts can fare poorly in making predictions.)

I am sure that many of the academics who offered warnings before the fact are working to collect such data, that doing so properly takes time, and that they may well end up gathering and reporting more thorough and careful results than these. I acknowledge the possibility that sometimes no data may be better than some data. Better, sometimes, to know you don't know than to be overconfident that you do know, based on anecdata or weak data. It depends, I think, on whether writers are careful, in the absence of any data, about emphasizing the lack of data and how it affects the strength of their arguments--and on whether a person with some data is equally clear in emphasizing those limits. But I'll take this as an interesting step forward, and one that required genuine time and effort on Derek's part.



Posted by Paul Horwitz on September 7, 2020 at 04:19 PM in Paul Horwitz | Permalink