
Friday, August 31, 2012

Reality checks

To finish up my month of Submissions-Editor blogging, I want to highlight a piece of scholarship that exemplifies several of the themes I've touched on.  For obvious reasons, I haven't singled out an article before, but this time I have the authors' permission (and I'm very glad to say it will appear in the Cleveland State Law Review in just a couple of months).  The article is here, and of course I can't improve on the way it introduces and justifies itself.  Nonetheless, let me try to explain why I think it's dynamite.

I've been writing about my layman's interest in the way legal academia develops and overcomes certain blind spots -- in particular, blind spots about how things transpire in actual decision-making.  I tried to blog about this by proposing that (the Constitution foresees that) a jury is in actuality perfectly suited for the decision to order a drone strike (i.e., what is commonly called extrajudicial killing should be called 'killing without a jury').  In general, I suggested that actual jury decision-making (unlike judges' decision-making) gets far too little ink.  I also proposed that there is an enormous amount of work to do to understand the nature of Chief Justice Roberts's restraint in NFIB v. Sebelius:  I think it's a good example of how a real legal decision can expose the poverty, ill fit, or obsolescence of even the best academic theories about such decision-making.

And I highlighted the work of Lynn Stout, which incorporates into law-and-economics some of the advances economists have made in recent years in addressing their own blind spots about actual human behavior.  For me, one of the most impressive things about Prof. Stout's work is its empirical rigor:  it requires mastery of the statistical outputs of reams of scientific literature from a 'hard' science (specifically, psychology).

The authors of the linked article, The Ideological Divide: Conflict and the Supreme Court’s Certiorari Decision (Profs. Grant, Hendrickson, and Lynch), cut out the middleman:  they do the hard science themselves.  Rather than theory presenting itself against the blurry backdrop of real life, in this article reality speaks against a backdrop of (well-conceived and clearly explained) theory.

And in the end, reality improves the academic theory, rather than (as legal scholars often fantasize) the other way around.

Using data only recently made available from within the Supreme Court (cert. pool memoranda from the papers of Justice Blackmun), and using ideology scores from the JCS methodology, Profs. Grant, Hendrickson, and Lynch analyzed the effect of the justices' ideology on cert. decisions when there was a circuit split.  To me, a true scientific mindset is revealed by the size of the authors' data set and the refusal to simply identify cases 'by feel' as ideologically charged or not.  This is methodical, humble, and virtually airtight work.  It puts to shame all the general efforts to define the concept of 'judicial activism,' and in the end suggests that Supreme Court justices take their job seriously in a way that members of other professions don't necessarily know how to name.

As I say, the article speaks for itself.  It bridges ideology and fact.  I urge you to read it, since I can't do justice to it.  

As a final Cavellian note, let me expand on what I mean, or maybe on what I think law professors should always mean, by "doing justice."  My law school, Cleveland-Marshall, has the motto "Learn Law.  Live Justice."  It's like a little koan (learn in order to live? live in order to learn? justice flows from law? law flows from justice?).  It beautifully encapsulates the tensions and the energy in the relationship between legal theory and legal practice.  

(Those tensions are magnified recently by the important debate about whether law schools are serving their students well, professionally.  I'm very proud of how Cleveland-Marshall has handled itself over the last decade or so, and very impressed with what it is doing now along these lines.)  

To speak broadly:  the tensions between thinking and doing are never going to be relaxed.  (Ironically, I wrote an essay about that fact four years ago which I can't link to, because it is STILL not yet out in print; it's called "Perfect from the Pod" but has been germinating at Case Western Reserve and at Cambridge University Press.)  Theory and practice can never do perfect justice to each other.

But universities embody crucial Enlightenment values that have addressed those tensions better than anything else.  And American universities are particularly superb places to study law, because American philosophy, both speculative and applied, has deployed the values of the Enlightenment with unique flair for centuries.

The first such value is science -- knowing the truth.  It is both basic and incredibly complex.  It should be the role of law schools to do justice to the world of societal power -- and doing justice means, before anything else, the hard work of empirical analysis.  There are no shortcuts.  Profs. Grant, Hendrickson, and Lynch embody that principle for me this year.

I'm sure I haven't done them justice, but then I'm just a law student.

Posted by Jim von der Heydt on August 31, 2012 at 08:53 AM | Permalink




Erratum: Of course such funding does not rule out the possibility that legislators, administrators, or policy makers will cherry-pick from the fruits of social (or natural) scientists.

Posted by: Patrick S. O'Donnell | Sep 1, 2012 8:12:05 PM


It’s perhaps hard to tell, but that was Kevin Drum writing (commenting on the material he quoted from Nature), not me. I resume my remarks with “It’s time to forswear....” And Drum was responding there specifically to: “The ‘larger’ the social or political issue, the more difficult it is to illuminate definitively through the methods of ‘hard science.’”

In any case, I don’t think the argument is properly characterized as one “that the government should not devote tax payer dollars to specifically funding individuals who are poorly equipped to solve the problems they want to solve.” The social sciences don’t necessarily (or routinely) attempt to “solve” social problems, especially in the first instance; rather, they proffer different ways to think about, to understand, such problems (to ‘illuminate’ them, as in the quote above), indeed, even to identify precisely what may or may not count as a “problem” (e.g., some may still think famines are simply natural ‘disasters’ that we are fairly powerless to address or should simply let run their course, yet an economist, Amartya Sen, explained convincingly that famines in the modern world are essentially ‘man-made’ affairs, whatever the precipitating factors, in his 1982 book, Poverty and Famines; similarly, it was once a widely held belief that starvation and malnutrition around the world were simply a consequence of too many people—‘overpopulation’—and an insufficient supply of food, a proposition that has since been proven wrong).

To be sure, social scientists may, on occasion, even go so far as to suggest possible ways to address these problems by way of ameliorating them or, in the very rare case, even “solving” them, but the responsibility for addressing these problems lies with legislators and policymakers. It is therefore not about “funding individuals who are poorly equipped to solve the problems they want to solve,” as the responsibility for “solving” problems lies in the legal and political realm, not the academic, although it behooves the former to draw upon knowledge gleaned from the latter whenever possible. In other words, arguments similar if not identical to those that permit funding of the natural sciences can and should be made, there being no great divide in terms of what counts as “science.” Social scientists are often more adept at description, understanding, and evaluation than at unambiguous causal explanation as such, the latter being essential to any attempt to “solve” social problems, but that hardly means we need dismiss the efforts at explanation, and we should bear in mind that explanation alone is not equivalent to implementing regulations or policies that address social problems of one kind or another.

Of course such funding does not rule out the possibility that legislators, administrators, or policy makers will cherry-pick from the fruits of the latter (as neoliberals have done in the case of climate science, the theory of evolution, science generally, international law, economics, ‘knowledge policy’ and so on). That the level of analysis in the social sciences is at some remove from specific policy questions is confirmed with the rise and flourishing of “think tanks” and the like that attempt, as part of the overarching effort to concretely shape public policy, to bridge if not close that gap. Using tax-payer dollars to (partially) fund the work of social scientists does have the salutary effect of expressing a collective belief that we’re not content to leave such matters merely to the “marketplace of ideas,” which not only commodifies knowledge (thus favoring those with the economic power to ‘produce’ knowledge, something to think about as universities emulate corporations and thus are increasingly dependent on the vagaries of the marketplace), but often by default elevates the putative wisdom of crowds or the hoi polloi (or the Hayekian ‘rational judgment’ that can only be uttered, apart from Hayek himself, by a ‘Great Nobody’ in the words of Christian Arnsperger).

Posted by: Patrick S. O'Donnell | Sep 1, 2012 8:09:20 PM

Patrick writes: "In part, this just restates the fact that political science is difficult. To conclude that hard problems are better solved by not studying them is ludicrous."

As I understand it, the argument is not that hard problems should not be studied. Rather, the argument is that the government should not devote tax payer dollars to specifically funding individuals who are poorly equipped to solve the problems they want to solve.

Posted by: Orin Kerr | Sep 1, 2012 6:37:36 PM


Thanks for the clarification. It helps eliminate the wrong sort of inferences being made from the aforementioned expressions.

I confess to not having read all your posts during this stint of guest-blogging.

(For the record, as I often point out, I'm not a professor, merely a lowly adjunct instructor at a community college who returned to the academic world in his mid-forties...although no less happy for all that.)

Posted by: Patrick S. O'Donnell | Sep 1, 2012 8:11:03 AM

Prof. O'Donnell: Thanks very much for laying that out so generously. I agree with virtually all of it wholeheartedly, and hope there is evidence in what I've thrown up in the blog this month that I practice it as well in my own thinking. Indeed, I think your idea of 'empathetic rigor' expresses what I've been calling for (and am not generally getting) when thinking about Chief Justice Roberts's choice in June of this year. I was praising a tree, and you remind us about the forest.

However, let me say a tiny bit more about the tree. I wouldn't want us to miss the tree for its branch. Yes, one branch of "science" as I hyperlinked it (Wikipedia on the scientific method -- set a hypothesis then test it) is an invidious distinction between hard and soft knowing. But that is not the taproot, and it was not in the seed.

I insist that there IS, without scare quotes, *rigor* in the choice to take one's idea, set it as a hypothesis, and develop a way of testing it that deploys well-accepted (albeit culturally inflected) statistical methods. In the case of the article linked, that rigor is prominent: with one of their hypotheses, the authors reached a result about judicial ideology that was NOT statistically significant. They didn't do it over, or leave it out. They reported it alongside the other one (a correlation that was pleasingly established at the conventional 95% confidence level).
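(An editorial aside of the toy variety: the reporting norm praised here, namely setting a hypothesis, testing it, and publishing the result whether or not it clears the .05 threshold, can be sketched in a few lines of Python. The numbers below are invented purely for illustration; this is a generic permutation test, not the authors' actual data or methodology.)

```python
import random
import statistics

def permutation_p_value(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test for a difference in group means.

    Shuffles the group labels repeatedly and returns the fraction of
    shufflings whose mean difference is at least as large as the one
    actually observed.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        left, right = pooled[:len(a)], pooled[len(a):]
        if abs(statistics.mean(left) - statistics.mean(right)) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical cert-grant outcomes (1 = granted) for two made-up groups
# of cases; these numbers are invented, not drawn from the article.
split_cases    = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
no_split_cases = [0, 0, 1, 0, 0, 1, 0, 0, 0, 1]

p = permutation_p_value(split_cases, no_split_cases)
# The honest move is to report p either way, not only when p < .05.
print(f"p = {p:.3f}",
      "(significant at the .05 level)" if p < 0.05 else
      "(not significant; report it anyway)")
```

With these invented numbers the observed gap in grant rates (0.8 vs. 0.3) happens not to reach the .05 threshold, which is exactly the kind of result the paragraph above praises the authors for publishing rather than quietly dropping.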

There are good reasons that indicia of reliability like this are honored in our culture. We should have more of them -- but of course, you're right: not to the exclusion of other kinds of reliability.

So it's a lovely tree, is what I wanted to say. And of course we should not miss the forest for it. I gratefully accept how you prune it with that in mind:

"What is questionable, however, is the wholesale importation of this particular form of standardization as an ideal methodological desideratum for the social sciences. Moreover, one should not lose sight of the important fact that 'there are many scientifically interesting aspects of the natural world that are not amenable to this treatment' (Ziman)."

I agree entirely. I'd add only that when the methodology is successfully extended, we should be glad rather than defensive. Nothing should be imported wholesale; the engine of the academic economy is retail! (And in retail, as we all know, the key is location, location, location: i.e., universities.)

Posted by: Jim von der Heydt | Sep 1, 2012 7:56:31 AM

Re: “empirical rigor: it requires mastery of the statistical outputs of reams of scientific literature from a ‘hard’ science (specifically, psychology).” “They do the hard science themselves.” “To me, a true scientific mindset is revealed by the size of the authors' data set….” “…the hard work of empirical analysis.”

I suspect the presuppositions, assumptions, and claims that animate such statements are worthy of exploration (or ‘unpacking’ as one of my teachers used to say):

We need to forthrightly acknowledge the sundry implications that follow from the fact that

“The techniques used in scientific research are extraordinarily diverse, from counting sheep and watching birds to detecting quasars and creating quarks. The epistemic methodologies of research are equally varied, from mental introspection to electronic computation, from quantitative measurement to speculative inference.”

Empirical number-crunching is no more intrinsically “rigorous” or “hard” than these other methodologies, all with their respective appropriate domains and requisite purposes. In John Ziman’s words, “not all scientifically observable features of the world can be measured, and not all the results of scientific measurement properly be treated as variables in mathematical formulae.” In particular and in “practice, most of the entities that figure theoretically in the human sciences are not ‘arithmomorphic,’ and are not at all amenable to formal mathematical analysis.” The belief or wish that things are or will be otherwise, however, is an obdurate one. Whatever my distaste for her specific views on neo-classical economics, Deirdre McCloskey has courageously, cleverly, and persistently endeavored to persuade her colleagues in economics to rely far less on mathematical formalism and a “scientistic style,” and much more on the “whole rhetorical tetrad—the facts, logics, metaphors, and stories necessary” [….] that enable her discipline to become at once “more rational and more reasonable….” No doubt McCloskey, who says that “It would of course be idiotic to object to the mere existence of mathematics in economics,” shares Hilary Putnam’s motivation behind the mention of the Pragmatist “revolt against formalism:” “This revolt against formalism is not a denial of the utility of formal models in certain contexts; but it manifests itself in a sustained critique of the idea that formal models, in particular, systems of symbolic logic, rule books of inductive logic, formalizations of scientific theories, etc.—describe a condition to which rational thought can or should aspire.” In our case, the reference would be to the rationality peculiar to scientific methods and theorizing. 

To paraphrase and quote again from Putnam, our conceptions of even scientific rationality cast a net far wider than all that can be logicized or mathematized, in short, formalized: “The horror of what cannot be methodized is nothing but method fetishism.” Or, in the words of Ziman, “the domain of science extends far beyond the scope of formal reasoning,” and thus formal reasoning is not emblematic of the “highest” or “best” sort of scientific theorizing, as the types and standards of scientific reasoning vary from discipline to discipline.

First, by way of illustration, consider Kevin Drum’s comments on a snippet from a recent piece by editors of Nature:

“‘Because they deal with systems that are highly complex, adaptive and not rigorously rule-bound, the social sciences are among the most difficult of disciplines, both methodologically and intellectually....So, what has political science ever done for us? We don’t, after all, know why crime rates rise and fall. We cannot solve the financial crisis or stop civil wars, and we cannot agree on the state’s role in systems of justice or taxation. As Washington Post columnist Charles Lane wrote in a recent article that called for the NSF not to fund any social science: “The ‘larger’ the social or political issue, the more difficult it is to illuminate definitively through the methods of ‘hard science’.”’

In part, this just restates the fact that political science is difficult. To conclude that hard problems are better solved by not studying them is ludicrous. Should we slash the physics budget if the problems of dark-matter and dark-energy are not solved? Lane’s statement falls for the very myth it wants to attack: that political science is ruled, like physics, by precise, unique, universal rules.

The public commonly thinks of disciplines like physics and chemistry as hard because they rely so heavily on difficult mathematics. In fact, that’s exactly what makes them easy. It’s what Eugene Wigner famously called the ‘unreasonable effectiveness’ of math in the natural sciences: the fact that, for reasons we don’t understand, the natural world really does seem to operate according to strict mathematical laws. Those laws may be hard to figure out, but they aren’t impossible. And once you do figure them out, the rest is mere engineering.

Hari Seldon notwithstanding, the social sciences have no such luck. Human communities don’t obey simple mathematical laws, though they sometimes come tantalizingly close in certain narrow ways — close enough, anyway, to provide the intermittent reinforcement necessary to keep social scientists thinking that the real answer is just around the next corner. And once in a while it is. But most of the time it’s not. It’s decades of hard work away. Because, unlike physics, the social sciences are hard.”

It’s time to forswear forever the use of terms that posit a divide between the so-called “hard” and “soft” sciences, with its barely concealed and untenable judgment of the superior value and putative rigor of the former over the latter (incidentally, we might invoke a Daoist examination of these two concepts as yang and yin respectively, keeping in mind the Daodejing’s reasons for normatively elevating yin over yang throughout the text). This is simply nonsense and a curious relic from the history of positivism. The difference is based largely on a false understanding of both the nature and role of quantification and measurement in the natural sciences as exemplified by physics (and in fact, as Ziman notes, thereby lends a ‘paradoxical’ quality to much of twentieth-century physics, for instance, in the denial of the possibility of perfectly precise simultaneous measurements of position and momentum). Even the exalted objective status of measurement involves a basis or standard for comparison, something that involves communal or intersubjective agreement on what entities and procedures are to be regarded as “standard.” Quantitative measurements have led to impressive results in many of the natural sciences and no one questions the significance of empirical quantitative data. What is questionable, however, is the wholesale importation of this particular form of standardization as an ideal methodological desideratum for the social sciences. Moreover, one should not lose sight of the important fact that “there are many scientifically interesting aspects of the natural world that are not amenable to this treatment” (Ziman). There are good reasons that efforts to extend the methodology of numerical measurement into the human sciences have produced comparatively meager results. Ziman reminds us of the often arbitrary and subjective categories that lie at the basis of social statistics and even basic demographic data.

All scientific facts, be they from the natural or social sciences, are “contextual,” and thus entail the intertwining of both “theoretical” and “subjective” features. One way of putting this is to refer, with Hilary Putnam, to the “entanglement of fact and value,” in this case with an invocation of four principles from A.E. Singer, Jr.:

1. Knowledge of facts presupposes knowledge of theories.
2. Knowledge of theories presupposes knowledge of facts.
3. Knowledge of facts presupposes knowledge of values.
4. Knowledge of values presupposes knowledge of facts.

It goes for both the natural and social sciences that there “are no absolute rules on what constitutes a scientific fact” (Ziman). Philosophers, philosophers of science, and meta-scientists have come to recognize scientific theories in both the natural and social sciences as analogous to (and thus having all the ‘rigor’ of) maps:

“Almost every general statement one can make about scientific theories is equally applicable to maps. They are representations of a supposed ‘reality.’ They are social instruments. They abstract, classify and simplify numerous ‘facts.’ They are functional. They require skilled interpretation. And so on. The analogy is evidently much more than a vivid metaphor.” (Ziman)

In the human or social sciences we might call upon a notion of “empathetic rigor” rather than empirical rigor, given that empathy

“is an essential feature of observation in the human sciences. This is an old idea, going back to the eighteenth century. Ethnographers employ empathy constantly, as when they use interviews to enter people’s lives and elicit their personal value systems. In spite of insisting on their objectivity, historians have always had to infer the thoughts and motives of the human agents in their stories by re-enacting them in their own minds.”

And the fact that the human sciences often depend fundamentally on verstehen, that they are steeped in hermeneutics (which also relies on intersubjective norms), is not a fact to be bemoaned and overcome through displacement by empirical rigor. Indeed, I think Ziman (a former Professor of Theoretical Physics) is absolutely right to argue “the knowledge produced by the natural sciences is no more ‘objective,’ and no less ‘hermeneutic,’ than the knowledge produced by the social, behavioural and other human sciences. In the last analysis, they are all of equal epistemological weight.”

The “hard-soft” dichotomy is further unavailing when we consider the fact that “many important scientific disciplines, such as brain science and ecology, have emerged in the wide open spaces between hard-core physics and soft-centered sociology.” The so-called hard sciences hold no monopoly over, no patent on, no proprietary right to, what counts as “good science.”

Formalization, be it with statistics, logic, algorithmic compression, or Bayes’ theorem, for example, does not define the sine qua non of scientific theorizing. In any case, formalization is a matter of degree, and much of science is not now, nor will it ever be, amenable to formalization (for instance, we delude ourselves if we think the fullness of time will lead to a ‘Newton’ of sociology or anthropology, with the corresponding principles and laws for social interaction, institutions, and so forth). For perfectly sensible and sound reasons, the language of scientific discourse is not circumscribed by what falls within symbolic logic. A mathematical argument is no more important or rigorous, epistemically speaking, than the theory in which it occurs. In the end, scientific rationality is but a species of, or no more than, (transcultural modes of) practical reasoning. Communal standards of rationality are set in the various domains and disciplines of scientific research.

Much more could be said, but I won’t say it here.

Posted by: Patrick S. O'Donnell | Aug 31, 2012 12:36:05 PM
