Thursday, May 17, 2007
Incantations, Legal and Computational
On Monday, I blogged about the theory-practice tension in computer science and in law. Today, I'd like to talk about another link between the two fields: that both disciplines are attuned to the incantatory power of language. Programmers and lawyers work with systems in which speech acts play a central role.
Most of the work in programming consists in taking a vague idea of what you would like to have happen and expressing that idea in the tightly-constrained artificial grammar of a programming language. This process is a particularly rigorous form of translation that requires you to think through your idea in quite precise detail. As the computer scientist Donald Knuth says, "Science is what we understand well enough to explain to a computer. Art is everything else we do." But once you are done, the resulting program doesn't just express your intended meaning; it actually executes it. Every valid utterance in a programming language can be parsed into a specific sequence of instructions, which a computer can then carry out. You get a set of pretty pictures on the screen, or an electronic banking system, or an email program, or one of the other millions of things computers can do. The word is made flesh.
Lawyers are also familiar with the power of words to shape the world. The Constitution is somehow more binding than other scraps of paper with writing on them. A judge speaking from the bench makes things happen when she renders a judgment. Yes, we can analyze this power in Realist terms, identifying the human institutions and social expectations that carry out judgments, create respect for "the law," and ensure that a community treats these supposedly binding words as something more than flap-jawed jib-jab. But we could do the same for a computer program; without a computer and people willing to use that computer, it's just an abstract idea, a potentiality. The point of unity is that the right words, promulgated in the right way, make stuff happen. Lawyers attend to nuance and to precise expression because there are consequences for getting the details wrong. The wrong guy gets arrested; the cotton arrives on the wrong ship Peerless.
There are limits to the parallel, and one difference is particularly frustrating for techies encountering the law. They are used to a world in which the nuances of language may be complex and counterintuitive, but remain fundamentally predictable. "(+ 2 3)" in LISP always evaluates to 5. The kind of logic that works well for such environments can fail badly in a courtroom, where meanings are often negotiable and the most formally correct reasoning can be undone if the result would be unjust or inconsistent with a statutory scheme as a whole. It can drive a programmer absolutely batty to watch the actual legal system in action, with inconsistencies and ambiguities all over the place. It can be hard for them to recognize that these are features of the legal system, not bugs. The practice of computer programming would fall apart if you could always argue about what the result ought to be; the practice of law would fall apart if you couldn't.
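That predictability can be made concrete with a toy evaluator for LISP-style prefix arithmetic, sketched here in Python. The evaluator and its tuple representation of expressions are illustrative inventions, not any real LISP implementation; the point is only that every well-formed expression reduces to one and only one answer:

```python
# A toy evaluator for LISP-style prefix arithmetic, written as nested
# Python tuples: ('+', 2, 3) stands for "(+ 2 3)". Given the same
# expression, it returns the same value, every time.

def evaluate(expr):
    """Recursively evaluate a number or an (operator, args...) tuple."""
    if isinstance(expr, (int, float)):
        return expr                      # a bare number is its own value
    op, *args = expr
    values = [evaluate(a) for a in args]  # evaluate sub-expressions first
    if op == '+':
        return sum(values)
    if op == '*':
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError(f"unknown operator: {op}")

print(evaluate(('+', 2, 3)))             # 5, always
print(evaluate(('*', ('+', 2, 3), 4)))   # 20, always
```

There is no judge to persuade that "(+ 2 3)" really ought to come out to 6 under the circumstances; the grammar and the evaluation rules settle the question completely.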
Listed below are links to weblogs that reference Incantations, Legal and Computational:
» More on Similarities Between Programming Language and the Language of Law from Legal Profession Blog
Posted by Jeff Lipshaw: James Grimmelmann (NYLS) has posted a sequel over at PrawfsBlawg to his earlier post on the similarities and differences between programming language and the language of law. There is a lot to chew on, and I... [Read More]
Tracked on May 17, 2007 1:41:30 PM
On my long, long, long-term list of projects to undertake, I'm planning someday to look at the "magical incantations" judges use to plant a flag in the ground. For example, by using this particular phrase, this judge declares fidelity to a particular school of thought or adherence to a particular line of authority.
In particular, I expect that in courts with a lot of judges, we will find places where the magical incantations split into two or more separate paths. (Can you tell I clerked on the Ninth Circuit?) I'm guessing I'll be able to show (perhaps with machine learning tools) that a given incantation signals a given branch of the split.
But until this nice post, James, I had missed the obvious analogy to the code fork. It's a nice, concise way to describe my future project: code forks (see GNU emacs vs. xemacs), and so does law (see Judge Reinhardt's vs. Judge O'Scannlain's line of authority on X question of asylum law).
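The crudest version of the detector described above can be sketched as a simple phrase lookup, before any machine learning enters the picture. The signature phrases and camp labels below are invented placeholders, not real doctrinal language:

```python
# A toy "incantation detector": classify an opinion by which signature
# phrase it recites. Phrases and camp names are made-up placeholders.

SIGNATURES = {
    "well-founded fear, objectively reasonable": "camp A",
    "credible testimony alone may suffice": "camp B",
}

def classify(opinion_text):
    """Return the camp whose signature phrase appears, else None."""
    for phrase, camp in SIGNATURES.items():
        if phrase in opinion_text:
            return camp
    return None

print(classify("... the petitioner's credible testimony alone may suffice ..."))
# camp B
```

A real version would need statistical tools to discover the signature phrases in the first place, rather than being handed them; that is where the machine learning would come in.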
Posted by: Paul Ohm | May 17, 2007 1:25:10 PM
Hey, neat! I hadn't been thinking about forking in this way, but now that you point it out, it's a very interesting analogy. I look forward to seeing this paper of yours (even if it involves looking far forward).
Posted by: James Grimmelmann | May 17, 2007 2:13:40 PM
Your point is similar to an analogy that I have used in the past. I remember talking about it with a tax partner at a large commercial law firm. He replied that there is, indeed, a great deal in common between, say, planning and executing a complex deal and writing a computer program. The client brings you some basic objective, and it is up to you to understand his needs and translate them into a precise technical process. The key difference, he pointed out, is that you can only run it once. If, say, your tax-free merger fails, you cost your client a lot of money. If a characterization is disallowed by the IRS, your client faces interest, penalties--or worse.
By contrast, key to the process of computer programming is testing and debugging. Indeed, some philosophies of software development, like the Extreme Programming methodology that was trendy a few years back, emphasize daily or even more frequent tests, dictate writing unit tests before writing the actual code for those units, and define progress by the success rate of those unit tests. Every programmer who has done anything more complex than "Hello World" knows that debugging will take, on average, about as long as the initial planning and coding--if not longer. Obviously, software still fails, but you can still take measures to get the vast majority of bugs worked out.
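The test-first style mentioned above can be sketched in a few lines. The function and its little spec are invented for illustration; the point is that the test defines what "correct" means before the code exists, and can be re-run after every change:

```python
# A minimal test-first sketch: the unit tests below were (notionally)
# written before parse_dollars itself, and pin down its behavior.
import unittest

def parse_dollars(text):
    """Turn a string like '$1,200.50' into a float."""
    return float(text.lstrip('$').replace(',', ''))

class TestParseDollars(unittest.TestCase):
    def test_plain(self):
        self.assertEqual(parse_dollars('$5'), 5.0)

    def test_commas(self):
        self.assertEqual(parse_dollars('$1,200.50'), 1200.50)

# Run the suite, as an automated build would on every check-in.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestParseDollars)
unittest.TextTestRunner(verbosity=0).run(suite)
```

A contract drafter has no equivalent of that last step: there is no runner that exercises the document against every breach scenario before signing.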
I think this explains why the practice of law has to be less mechanistic than computer programming--why, as you say, it would fall apart if you couldn't argue about what the result ought to be. If, say, in drawing up a contract, one were able to run simulations showing exactly what would happen if different parties breached at each possible point, we could afford to be that much more formalistic about interpretation. Latent conflicts and errors in documents would be brought to light and expunged. As it is, lawyers and their clients can only guess, and then engage in ex post arguing. And software can be updated even after it is released--try issuing a "patch" to a contract halfway through performance! I think this accounts for the difference you highlight at the end.
At the same time, this uncertainty about the future leads to reusing things. Contracts, especially in ongoing relationships, will incorporate by reference larger framework documents that have, presumably, stood the test of time. Lawyers will put in vast tracts of boilerplate, and bizarre phraseology and verbal formulations (I want to see someone "cease," but draw the line at "desist[ing]"), of which even they have only a general idea of what it is supposed to mean or do. In the programming world, too, we reuse things. Even as of 1997, 80% of the world's business still ran on COBOL--when you log onto your bank's web site, the web-site code, written a few years ago, probably interfaces with core "business logic" code written in the 1970s. The object-oriented revolution was all about allowing the reuse of code. I think this is another analogy worth exploring.
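The reuse analogy can be made concrete with a small object-oriented sketch: like a contract incorporating a framework agreement by reference, a subclass pulls in tested terms wholesale and overrides only what the new deal changes. The class names and terms are invented for illustration:

```python
# Reuse by inheritance, as an analogy to incorporation by reference:
# the "framework agreement" supplies default terms; the specific deal
# inherits them all and overrides just one.

class FrameworkAgreement:
    governing_law = "New York"

    def notice_period_days(self):
        return 30

class SpotPurchase(FrameworkAgreement):
    # Inherits governing_law and everything else; overrides one term.
    def notice_period_days(self):
        return 10

deal = SpotPurchase()
print(deal.governing_law)         # inherited boilerplate: New York
print(deal.notice_period_days())  # overridden term: 10
```

The battle-tested defaults come along for free, and the drafting effort concentrates on the few terms that actually differ--much like boilerplate in an ongoing commercial relationship.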
Posted by: Andrew Carlon | May 17, 2007 3:51:07 PM