
Thursday, April 16, 2015

Measuring the Impact of Faculty Scholarship

Given the intensity of the reactions folks had about how to measure productivity, I’ve been a little hesitant to post my thoughts on impact.

So, in addition to the qualifications I previously mentioned, let me add that I think it may be impossible to quantify the impact of legal scholarship.  Indeed, I am uncertain how one goes about quantifying the impact of most things.  We could, for example, obviously state that the Mona Lisa has exerted a greater influence on art than the shabby art projects that I completed and my mother hung on our refrigerator.  But can we assess the impact of the Mona Lisa as compared to the ceiling of the Sistine Chapel?

To put this in terms of legal scholarship, I can confidently say that Holmes’ The Path of the Law has exerted a greater impact than any article that I have ever published (or will ever publish).  But how can we compare The Path of the Law to, for example, Warren & Brandeis’ The Right to Privacy?  We can count how many citations each article receives in Westlaw’s JLR database, we could count the court citations each has received, and we could even ask a bunch of respected law professors to vote on which article they believe has had a greater impact.  But the fact that Holmes’ article has 3,322 cites in JLR, while Warren and Brandeis’ has only 2,451, doesn’t seem to settle the question---or at least it doesn’t settle the question for me.

In any event, assuming that we have to come up with some way to measure impact---and that is a major premise of academic analytics---I suggest that we quantify the following for each faculty member:

  • Citations in JLR
  • Citations in ALLCASES
  • Number of downloads from various electronic repositories (such as SSRN)

 (You’ll have to forgive me for using Westlaw databases---I wanted to make sure that we are all working with the same universe of documents.)
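To make the bookkeeping concrete, here is a minimal, purely hypothetical sketch (in Python) of how those three counts might be recorded once they have been collected by hand from Westlaw and SSRN. The names and numbers are mine, for illustration only, and no weighting or ranking scheme is implied by the proposal.

```python
from dataclasses import dataclass

@dataclass
class ImpactCounts:
    """Hypothetical per-faculty record for the three proposed measures."""
    name: str
    jlr_citations: int        # citations found in Westlaw's JLR database
    allcases_citations: int   # citations found in Westlaw's ALLCASES database
    ssrn_downloads: int       # downloads reported by SSRN (or another repository)

# Made-up example entries, just to show the shape of the data.
faculty = [
    ImpactCounts("Prof. A", jlr_citations=120, allcases_citations=3, ssrn_downloads=850),
    ImpactCounts("Prof. B", jlr_citations=45,  allcases_citations=0, ssrn_downloads=1400),
]

for f in faculty:
    print(f.name, f.jlr_citations, f.allcases_citations, f.ssrn_downloads)
```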

I toyed with a few other categories, such as citations in case briefs.  But I don’t think that we have access to an electronic resource that gives us complete coverage of briefs.  Is that correct?

Anyway, rather than attempting to justify these categories, I’d be interested to hear what others have to say.  I’ll either jump into the comments thread or write a follow-up post.

Posted by Carissa Byrne Hessick on April 16, 2015 at 06:06 PM in Life of Law Schools | Permalink

Comments

A few quick additional thoughts:

Regarding other factors, the various posters upthread make good points that, for a complete picture of a scholar's impact, things like inclusion in teaching materials, direct peer assessment, broad media coverage, etc. should be considered. The problem with quantifying these factors is that (1) they are not all that easy to collect and (2) they can very easily involve debatable qualitative assessments. In the context of a casebook, for example, we would need to answer the second-order question of "how influential is the casebook in which the excerpt is used?" before we decide whether to count an author's work being included in that casebook. Inclusion of part of an article in Hart & Wechsler (or something similarly ubiquitous) should clearly count, but what about inclusion in a casebook used by only a handful of schools, or only one school? As a result, it may be that these other areas are better left to the non-quantified part of hiring/promotion evaluations by individual schools rather than included in a broader quantitative model.

Regarding SSRN, which I'm skeptical about as a metric at all, I think it might be better to look at abstract views rather than download counts (although I'm not totally committed to that). This is because (1) downloads from those who do not sign in with an SSRN account are not kept as part of SSRN's tally (but the abstract views are), and (2) the abstract alone can generate impact even if the article itself isn't downloaded and read.

Regarding all citations not being equal, this is true, and Harrison's study makes that point well (whatever its other flaws). But I think that point is of limited value once we are trying to answer the question of how to quantify legal scholarship. As the paper acknowledges, the line between meaningful citation and padding is blurry. Even if clear lines can be drawn, the paper's methodology required an individual analysis of each citation to determine which category the citation fell into. It seems to me that if we are at the point of individually analyzing each citation, we have already lost the main value of doing any sort of quantification analysis. My main takeaway from the study, in terms of designing a quantification system, is that citations in law reviews should not be the dominant category in the system (as they are in most current systems) because of how much citation counts in that category may overstate actual influence.

Posted by: Patrick Woods | Apr 21, 2015 10:07:02 AM

Carissa, Google Scholar is more useful than Westlaw in several respects. It picks up all the interdisciplinary cites, divides cites into total and recent, takes account of instances where there are multiple citations to the same author in a single article (it awards credit there when Leiter's method, say, does not), and offers a couple of good if rough measures of impact (the h- and i10-indices).
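For readers who haven't run into those measures: the h-index is the largest number h such that h of an author's papers each have at least h citations, and the i10-index is simply the number of papers with at least 10 citations. A minimal sketch of both calculations, with made-up per-paper citation counts:

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citation_counts):
    """Number of papers with at least 10 citations."""
    return sum(1 for cites in citation_counts if cites >= 10)

papers = [312, 118, 45, 23, 11, 9, 4, 2, 0]  # hypothetical per-paper counts
print(h_index(papers))    # 6: six papers have at least 6 citations each
print(i10_index(papers))  # 5: five papers have at least 10 citations
```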

Having said that, it, like Westlaw, shares the downside that it omits Tax Notes, which is a major outlet for us tax types (they have an exclusive with Lexis). I've complained to Leiter several times that he doesn't give us credit for cites there (although, to be fair, he says he's willing to include them if I'm willing to count them).

Posted by: BDG | Apr 19, 2015 10:17:41 AM

When I think about impact, I also think in terms of whose work is being taught to future law teachers. Hart & Wechsler, e.g., has been remarkably influential in shaping the field it organizes. Generalizing, it seems that one's success in getting one's ideas into important teaching materials in one's field may deserve some consideration as we ponder influence.

Posted by: Jim Pfander | Apr 18, 2015 12:38:46 PM

what's TE()?

Posted by: Typical | Apr 17, 2015 5:54:11 PM

Not a lot of time to comment at the moment, and there is a lot upthread and in the post I'd like to eventually weigh in on, but in terms of categories I'll quickly say three things:

(1) There's no real need to limit academic citations to the JLR universe rather than the broader all-secondary-sources grouping in WestlawNext. JLR was initially selected in Leiter's methodology to avoid overcounting treatise authors whose works are broken up in Westlaw. Use of the TE() search function, which Scott Dodson rightly points out is necessary, should eliminate that problem without having to ignore all citations to work in treatises and other secondary materials outside of JLR.

(2) I think the Administrative Decisions and Guidance category within Westlaw should also be included. I could be wrong, but I believe that, at least as to recently published materials (i.e., at least the last 10 years), this database is pretty comprehensive. While this might make the category less useful for compiling some sort of "all time" list, it should be sufficient for the other purposes of quantification you identified.

(3) I would still include briefs, although the point that the database for that is not comprehensive is well taken. Still, it's a better proxy for practitioner influence/reliance than nothing and, as IL anon noted, all of the different databases miss things.

Posted by: Patrick Woods | Apr 17, 2015 3:56:38 PM

I recently completed my tenure application. As part of the application I looked at the question of what impact my work has had. I started with the sorts of things you suggest. I looked for cases and articles in Westlaw, Lexis and HeinOnline. Due to quirks in coverage, the result of each search was slightly different and articles appeared in one source that did not appear in others. Then I turned to Google Scholar. I found more citations this way that did not appear in Westlaw, Lexis or HeinOnline. Most of these were books or journal articles published in English in Europe. Given that I had a fairly significant incentive to find everything, I also used a variety of different general search engines (including Google) to see if there was other stuff that hadn’t been picked up in the previous searches. I did find a fair amount of stuff this way, including a number of citations to my work in articles in foreign languages (including French, German, Spanish, Hungarian and Turkish). My results are probably partly a function of being an international law scholar, but my main point is that I don’t think you can assume that “citations in JLR” is an accurate measure of actual citations. Moreover, the bigger problem is that I don’t think that the gaps in JLR coverage are randomly distributed. Certain disciplines (including my own) are more likely to be undercounted by using JLR citations than other disciplines. This suggests that you probably need to use the results of a basket of search engines (preferably a basket chosen so as to average out the coverage gaps in individual search engines) to assess impact.
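To illustrate the "basket" point with a toy example: the simplest way to combine sources is to take the union of the citing works each one finds, so that gaps in any one source are filled by the others and overlapping results are not double counted. (The hard part in practice, which this sketch ignores, is recognizing that two sources are reporting the same citing work despite different formatting.) The source names and citing works below are invented:

```python
# Hypothetical results: the set of citing works each source turned up.
sources = {
    "Westlaw":        {"Article A", "Article B", "Case X"},
    "Lexis":          {"Article A", "Case X", "Case Y"},
    "HeinOnline":     {"Article A", "Article B"},
    "Google Scholar": {"Article A", "Book C", "Foreign Article D"},
}

# Union across sources: each citing work is counted once, however many
# databases happened to pick it up.
all_citing_works = set().union(*sources.values())
print(len(all_citing_works), "distinct citing works")

# Works that only a single source found -- the coverage gaps in the others.
for work in sorted(all_citing_works):
    found_in = [name for name, works in sources.items() if work in works]
    if len(found_in) == 1:
        print(work, "found only in", found_in[0])
```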

Another thing that I found was that if you measure impact purely by citations in journal articles and legal decisions, you are probably missing several fairly important ways in which legal scholars can impact the world. For example, I found that my work is cited in NGO reports, blog posts, and (very occasionally) mainstream media. I also know (through discussions with policymakers at the institutions that I study) that my work is discussed by policymakers at the target institutions. I can’t prove that any of these things impact decisions in the real world, but these are at least avenues through which my work can reach outside the academy.

Finally, another way that our work can impact things is through its effect on future lawyers. I found, for example, that articles of mine are assigned reading in courses at law schools, international relations schools, and public policy schools, both in the US and abroad. (I figured this out because a number of course syllabi that I found through Google searches assign my work.) My work is also cited in a fair number of LLM theses, PhD theses, and international relations theses. (Again, I found these through general Google searches.) So it appears that my work may have some impact on the lawyers and policymakers of the future.

“Citations in JLR” has the benefit of being an easy number to quantify. But if you really want to measure “impact” in a broad sense, you have to look at other factors. Those factors are not going to be amenable to quantitative study, but that doesn’t mean that there is no impact from those factors.

Posted by: IL anon | Apr 17, 2015 11:25:24 AM

I'll weigh in.

Citations can be useful objective indices of influence and impact. We write, at least in part, to influence the law and conversations about it, right? So citations in court opinions and academic journals are colorable proxies for that.

But I agree with Mike that citations are not enough and that not all citations are equal. On the former point, direct qualitative assessment is important. And that assessment should include at least some whole-scholar evaluation rather than proceeding article by article. I would support periodic external peer assessments of scholars. We do it for promotion and tenure. We should also do it for periodic post-tenure review. Perhaps it doesn't have to be as rigorous or formal as tenure review. But it should be done. (Incidentally, such a system might also have the effect of educating US News peer-assessment voters in a way that would relax some of the stickiness of that criterion in the US News rankings.)

Not all citations are equal. A string cite for a relatively uncontroversial proposition is somewhat, but not particularly, probative. But a citation that mentions the author in the text of the article, above the footnote line, seems far more important. "As the brilliant Carissa Hessick has famously argued, ..." seems quite probative of impact. Even better would be a paper specifically responding to a paper you wrote. The same could be said for a Supreme Court opinion that says, "We adopt the theory of Professor Hessick for X principle of law." Other examples of concrete impact could be found in legislative history or rulemaking history.

I admit this is still imperfect. And it could be onerous. But if we are serious about what we do, and if we defend what we do because of its impact and influence, then we need also to be serious about evaluating it against the standards we ourselves have established. It is our duty to do the best we can to assess impact, influence, and quality of scholarship.

A small note about counting citations via Westlaw: counts should use the TE() search function to avoid catching papers that only thank the scholar in the "star" footnote or that the scholar has authored. I'll also note that Google Scholar might be a useful additional mechanism for citation counts, especially for interdisciplinary folks whose papers might generate citations in non-law academic journals.

Posted by: Scott Dodson | Apr 17, 2015 11:18:30 AM

Arguably citations and downloads are not enough, though. Jeff Harrison's recent draft on citation patterns tends to show that not all citations are equal. While I disagree with him about the normative conclusions we should draw, I'm persuaded that mere counts are not enough if you really want to measure "impact."

On a related note, I'm not sure downloads are even necessary - in many cases abstract views give articles great impact even though not many people actually read them. Not sure it SHOULD be that way, but I think it IS that way.

Posted by: Michael Risch | Apr 17, 2015 8:24:20 AM

Interestingly, that list has quite a number of judges or academics-turned-judges, or judges-turned-academics.

Posted by: JayA | Apr 17, 2015 8:22:36 AM

I don't have a view of what is a good or bad methodology, but it's worth noting that Professor Hessick's methodology is somewhat similar to the one HeinOnline used in its recent ranking of law professors based on impact:
http://home.heinonline.org/top_authors/

Posted by: Anon | Apr 16, 2015 11:43:13 PM
