
Sunday, August 06, 2006

The Use (and Misuse?) of SSRN

What role, if any, should SSRN download counts play in evaluating the quality of a legal scholar’s work product?  This is a question I’ve been pondering recently as a member of my law school’s Appointments Committee, which is engaged in a lateral hiring process.  In connection with the search, I’ve reviewed the scholarship of several potential candidates this summer, and I’ve been struck by the large disparities in download counts among scholars who otherwise seem quite similar based on other standard indicators of productivity, such as number of publications, prestige of placement, and Westlaw citation counts.  Indeed, there seems to me to be a surprising number of quite successful, prolific scholars who either lack an SSRN presence entirely, or whose download counts per article tend to be in the one or two dozen range.  Should this raise a red flag?  As someone who himself has only been on SSRN for a year and whose own download counts are also quite modest—tips, anyone?—I’m in no position to press the issue.  And, yes, I understand that download counts are manipulable, that different law schools provide different levels of support and encouragement for faculty members to develop a presence on SSRN, that some fields (e.g., intellectual property) have much larger audiences in the SSRN world than other fields, and that there may be generational issues at play here, too.  So, I don’t intend to give any particular weight to download counts in our search this year.  But, at some point in the future, I wonder if SSRN dissemination of scholarship will become so much the norm in the legal academy that it really will be fair to raise questions about a scholar who lacks any substantial presence on SSRN.  Is this a person who is out of touch with the world of contemporary legal scholarship in more substantive ways, too?  Is this a person who is unable to communicate his or her ideas effectively?
These questions may some day present themselves not only in lateral hiring, but also in promotion and tenure.  Or am I naive in supposing that this is a future reality?  Are download counts already being used in these sorts of evaluative ways at other institutions?

Posted by Michael O'Hear on August 6, 2006 at 09:39 PM in Life of Law Schools | Permalink



In the end, there really is no substitute for carefully reading someone's scholarship and making an intelligent assessment of its contribution to the dialogue. Numerical proxies (citation or download counts) may mask or distract from important insights that can be gained that way. Eric.

Posted by: Eric Goldman | Aug 12, 2006 12:21:36 AM

what is with the academy's obsession with download counts, citations in footnotes of *other* legal papers, and (perhaps most embarrassingly) the bizarre fixation on appeasing 2L editors who have little clue or special expertise regarding the substance of the articles published?

i understand the appeal of download counts. i have put one paper on SSRN, and was quite happy to see that some people (however few) were intrigued enough by my abstract to devote a few moments to downloading (and hopefully reading at least a portion of) my paper. SSRN (and its download counts) are a wonderful, amazing resource-- it's terrific to reach such a large audience so easily, and i think one would be encouraged to publish more and better scholarship knowing that s/he can place it on SSRN and "be heard." however, the suggestion that a download count is in any way indicative of *merit* is peculiarly naive. SSRN is a terrific way to share scholarship, but i cannot see how a download count can qualify as "merit."

i'd perhaps make an exception for those occasional economics articles which have been downloaded 30,000 times or so, in which case one cannot help but assume that those pieces are seminal works. however, a few dozen or even a few hundred downloads don't mean much. one piece of so-called "scholarship" that has 75 downloads (a perfectly respectable number, no doubt) examines whether Jews really would evade taxes, and ultimately concludes "the Jewish view is that tax evasion is almost always or perhaps always unethical. Presumably, this view would hold true even if Hitler were the tax collector." if there were 750 (or even 7500) downloads instead of 75, i doubt that that datum could transform such a piece into a meritorious paper.

perhaps the problem is not the academy's fixation with silly proxies for merit, but rather the fact that so much of the scholarship out there has so little to do with either law or reality that it's either impossible or painful to determine what the particular author is trying to say. given that most articles seem not at all concerned with (god forbid) discussing the law, but instead obsessed with achieving "unique" insights into the intersection of the law (loosely defined) and some obscure field in the humanities, i guess it's not surprising that one would eagerly search for a proxy for quality rather than bother reading the actual work. when given the choice between figuring out what the paper actually says, and relying on the selection ability of a 2L editor, i suppose i cannot blame an appointments committee for choosing the latter option.

Posted by: andy | Aug 10, 2006 2:45:31 AM

I am with Bernie Black on this - most of these measures have a lot of built-in noise. But taken together and used sensibly, they can be helpful in understanding whether someone is active and engaged in current legal academic debates. I worry about going too far, as Very Anon Prof does in dismissing even some regard for article placement. Even though we know all the problems of law review (and peer review, actually) selections, in the end, over time and taking a series of scholarly writings, we do get some good indication of quality when we compare someone who published ten articles in top general law reviews with someone who published, over the same period of time, five articles in fourth-tier specialized journals, no? Let's not exaggerate the effect of the noise.
And finally, please don't make "marketing" sound like such a bad word. Since when is it a bad thing to market your work by uploading on ssrn, in the hope that more people will read it, engage with it, react and comment on it? Isn't that part of what good academics should hope for?

Posted by: Orly Lobel | Aug 8, 2006 3:43:07 PM

Let me offer the modest suggestion that any law school that does hiring based on SSRN downloads should be closed. The single best predictor of SSRN downloads (other than working in corporate or L&E) is having a blog. Really.

Posted by: Brian Leiter | Aug 8, 2006 10:05:43 AM

From my little essay on rankings regulation (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=888327):

"[P]ervasive ranking . . . threatens to coarsen culture by encouraging a false sense of commensurability. Consider Richard Posner’s recent effort to “rank” public intellectuals, a laborious project made ever easier by the powerful computing systems driving search engines. There's no way to make a ranking process "fair" unless we subscribe to the bizarre idea that the quality of the work of economists, historians, philosophers, scientists, et al. can be ordinally ranked with some commensurating metric. The danger is particularly acute in social sciences and literature, where quality is hard to measure and notoriety often becomes a substitute for it. Only positivistic delusion could lead us to ignore the contestability of the basic assumptions and aspirations of authors in such fields--for example, how closely tied they are to visions of the good society and ideological commitments. Yet the temptation to assess the importance and even quality of a person’s intellectual work with reference to the number of “hits” they generate on a search engine database is sure to advance as these services play a greater role in our lives."

Posted by: Frank | Aug 7, 2006 6:01:01 PM

Bernie: [a href="URL"]text of link[/a] will do it. Replace the square brackets with angle brackets. (I can't use them here, or the tags wouldn't show up in this post.)

Posted by: Bruce Boyden | Aug 7, 2006 2:19:08 PM

Is it wrong to simply not give a damn about SSRN? Given the criticism (noted above) of the tool, is it unreasonable for a candidate to simply decide not to make use of it?

Posted by: Paul Gowder | Aug 7, 2006 12:44:06 PM

This was the subject of an entertainingly heated debate at Conglomerate not long ago:


Posted by: Scott Moss | Aug 7, 2006 12:25:39 PM

I think that Bernie states it well. It is likely that there is a high correlation between downloads and other measures, but don't use SSRN in isolation. Each measure has weaknesses and should be taken with a grain of salt.

Posted by: anony | Aug 7, 2006 11:40:16 AM

On the uses and limits of SSRN download counts, versus other measures such as citation counts and simple number of papers, see Black and Caron, Ranking Law Schools: Using SSRN to Measure Scholarly Performance, 81 Indiana Law Journal 83 (2006) (http://ssrn.com/abstract=784764) [if I were techier, I'd know how to make this a live link]. At the school level (which is where we look), the correlation between downloads and raw productivity (number of posted papers) is amazingly high. I also think the field bias in SSRN downloads may lessen over time; we're seeing way more submissions in law-and-society, legal history, etc. Within-field download comparisons are surely more informative than cross-field comparisons. Citation counts are valuable, I believe, but take years to come in. Overall, I think it fair to say that high-downloaded authors (within field) tend to be good people, but there are important scholars who don't post their work, and no one would think the correspondence between downloads and scholarly quality is other than rough.

In the end, SSRN downloads are a noisy measure, citations are a delayed measure, and journal placements are suspect. Maybe the sensible approach is to pay some attention to each, while also attending to the weaknesses of each. That's what Paul and I conclude in our article.

Conflict disclosure: I am managing director of SSRN and the Legal Scholarship Network.

Bernie Black

Posted by: Bernie Black | Aug 7, 2006 11:01:36 AM

I'm with Very Anon Lawprof.

Posted by: Joseph Slater | Aug 7, 2006 10:16:18 AM

SSRN should be considered a means to an end; not an end in itself. Let me say that again in another way so that no one ever seriously considers this as a point of reference in and of itself - it is only reasonably considered as a tool to help get your work known - it is not a consistently reliable indicator of scholarly impact. *In a given case, if it actually reflects their impact, then it should also show up in related, and more valid, statistics - like their citation count in actual publications, among others.* Sure, citation counts aren't perfect, but I think that most would agree that they're better than SSRN. There are a number of better ways to assess impact.

Clearly, SSRN can be played, sometimes not even by those who wish to deceive, but just by the way things work out. Just as a for instance, if you have a paper on SSRN that takes a while to get published (e.g. you keep polishing drafts and running it by people before submitting), then you're likely to get more downloads. If you have something on SSRN that's already published, then most will download it elsewhere (and still read it). No one was deceptive here, but the download "N" is not a reliable indicator of impact. I can provide more examples, but I hope that you get the picture.

It just really concerns me that someone would seriously consider such a thing in important hiring decisions when better indicators are available.

One more time - it's a tool and should match up with other, better, indicators.

Posted by: anony | Aug 7, 2006 8:15:03 AM

Michael, I really think you're barking up the wrong tree here. To put it simply, what do SSRN downloads have to do with good scholarship? Business law types get a lot of them. Those in other fields such as the humanities get fewer. Who cares? Someone without an SSRN page perhaps isn't into marketing. Do we only want to hire those who market themselves well? Is that really what makes an attractive scholar? (If it is, we all should have gone into business and made the big bucks!)

And while we're on the topic, I think you're also wrong to rely on citation counts. Great work in less trod-upon areas is often not cited that much, while work in trendy fields is cited a lot, even if it isn't very good. Moreover, citation counts don't distinguish between perfunctory cites in literature-regurgitating footnotes (who cares about those?) as opposed to sensitive discussions relying on a particular work.

As for journal placement, can we get real? As if 2L editors have a clue? Why not read the work and see if it's any good? And then maybe ask people in the field what they think? To be sure, you won't be able to cite statistics as to why the person you want to hire is the bestest in the business. But should we be so insecure as to require such seemingly but falsely objective confirmation?

Posted by: Very Anon LawProf | Aug 7, 2006 4:07:31 AM
