Thursday, September 03, 2009
On Schauer on Brain-Based Lie Detection
Over the last several years, researchers have been trying to develop brain-based methods of detecting lies. The overwhelming consensus among neuroscientists is that such techniques are unreliable, particularly in real-world settings. That hasn't stopped at least a couple of companies from marketing functional magnetic resonance imaging ("fMRI") lie detection services. See here and here.
While the consensus view is probably right that the technology is not yet ready for primetime, there is a perhaps unwarranted pessimism that pervades many of these discussions. I've been to numerous conferences where academics love to identify problems with the reliability of fMRI-based lie detection and emphasize that jurors are likely to be unduly swayed by neuroscience evidence. They often seem to forget that our most well-known method of lie detection--the jury--has numerous reliability problems. Most importantly, despite hundreds of years of data, we can't say very well just how reliable juries are. Surely juror lie detection leads to false positives and false negatives, but this particular "technology" is widely used in the United States.
Of course, litigants have constitutional rights to jury trials that they don't have to modern lie detection techniques. But from a policy perspective, we want to know whether a new approach to lie detection, when combined with whatever other truth-generating techniques we already use, leads to cost-effective improvements in the judicial system. That policy question, at least on the surface, looks rather different from the legal issue of whether the technology satisfies the standards for admission of scientific evidence set out in the Daubert or Frye test.
With these considerations in mind, I was pleased to read Fred Schauer's (Law, UVA) new draft article, Can Bad Science Be Good Evidence?: Lie Detection, Neuroscience, and the Mistaken Conflation of Legal and Scientific Norms. Schauer points out, quite appropriately, that the standards for the admission of legal evidence are determined as a matter of policy, not scientific inquiry. Neuroscientists can tell us whether a particular technology satisfies an existing legal standard but have no special expertise, as a general matter, in selecting the standard itself.
Schauer states, "Some examples of good science may still not be good enough for some legal purposes, and, conversely, some examples of bad science m[a]y, in some contexts, still be good enough for some legal purposes." His draft, therefore, seems to cut to the heart of the Daubert and Frye standards. Perhaps, given the ways that scientific evidence may unduly sway juries, we need something like the Daubert and/or Frye standards, so that, at the end of the day, we reach the right policy outcomes. But Schauer is not so pessimistic about juror abilities. He suggests, at least implicitly, that in the lie detection context, the traditional tests for admitting scientific evidence may not set the bar at the right place.
Posted by Adam Kolber on September 3, 2009 at 07:34 AM | Permalink
Another sort of pessimism or skepticism is of a wholly different kind: the view that this whole enterprise is philosophically (and by implication scientifically and legally) wrong-headed in aim and ambition, owing to untenable conceptions of both the brain and the mind (in the first instance, that the mind is not reducible to the brain). Here, the relevant literature comes not from science or legal theory but rather from philosophy of science and philosophy of mind. As Michael Pardo and Dennis Patterson wrote in their recent paper, "Minds, Brains, and Norms" (available at SSRN; this paper should be read in conjunction with their earlier piece, "Philosophical Foundations of Law and Neuroscience"):
...the basic problem with the philosophical assumptions of much writing about neuroscience is that the picture of human action it produces is one in which the person is reduced to the brain. In short, the thesis is "you are your brain." [....]
For example, many scholars writing on neuro-ethical issues believe that when it comes to human decision-making, all the work is done in the brain. Tendentiously expressed, the view is that human decisions are literally "made" in a certain region of the brain. As such, the brain itself can be studied so that we might learn just how it is that humans decide on one course of action over another, weigh the consequences of a decision, and render judgments of appropriateness. [....]
Suppose we had an fMRI scan of a defendant’s brain while committing allegedly criminal acts. Now, suppose we need to determine whether he (1) knew the suitcase he walked off with was not his but belonged to another; or (2) knew about the illegal drugs in his suitcase; or (3) knows "that it is practically certain that his conduct will cause" a particular result; or (4) knows any other particular proposition. Exactly where in his brain would we find this information? Because knowledge is an ability and not a state of the brain, the answer is: nowhere. [....]
Proposed brain-based lie detection comes in two varieties: that attempting to discover if a defendant has knowledge of a crime "stored" or "housed" in his brain, and that attempting to discover whether a defendant’s answers trigger areas of the brain correlated with deceptive behavior. The first type attempts to discover the presence of knowledge directly; the second type attempts to infer knowledge on the basis of deception. [....]
To know something—knowledge that propositions about a crime are true, for example—is not located in the brain. As a conceptual matter, neural states of the brain do not fit the criteria for ascriptions of knowledge. Suppose, for example, a defendant has brain activity that is purported to be knowledge of a particular fact about a crime. But, suppose further, this defendant sincerely could not engage in any behavior that would count as a manifestation of knowledge. On what basis could one claim and prove that the defendant truly had knowledge of this fact? We suggest there is none; rather, evidence of the defendant’s lack of the criteria for knowledge would override claims that depend on the neuroscientific evidence. This confused conception of knowledge "stored" in the brain rests on two additional problematic assumptions. First, that memories (like knowledge) may be identified with particular neurological states of the brain. Second, an assumption that the retention of an ability implies the storage of that ability. [....]
As with knowledge, these types of deceptive lies involve a complex ability engaged in by persons, not their brains. [....]
Thus, neuroscientific evidence might reveal that certain brain activity is inductively well correlated with this behavior, or that damage to certain brain areas makes one incapable of engaging in this behavior, but it cannot establish conclusively that one’s brain is engaged in lies or deception or that an intent to deceive or a lie is located in the brain. Neurological states do not fit the criteria for ascriptions of lies or deception.
* * * *
I'll leave to the side the question of how reliable juries are and your larger points about the legal standards for admissibility of scientific evidence.
Posted by: Patrick S. O'Donnell | Sep 3, 2009 8:42:29 AM
The comments above are irrelevant to the issue of using lie detectors in court (which I think you may acknowledge). At most, they are relevant to the proper way of understanding and speaking about whatever it is that lie detectors are said to do.
Regarding the portion quoted above:
"Thus, neuroscientific evidence might reveal that certain brain activity is inductively well correlated with this behavior, or that damage to certain brain areas makes one incapable of engaging in this behavior, but it cannot establish conclusively that one’s brain is engaged in lies or deception or that an intent to deceive or a lie is located in the brain."
As I've commented to Pardo and Patterson in more detail, reliable correlations are all we need for courtroom use. We should be quite pleased if we are ever able to reliably correlate neuroscience evidence with lying behaviors.
Posted by: Adam Kolber | Sep 3, 2009 10:02:59 AM