
Wednesday, March 08, 2017

Complexity Mitigation Strategies for Law-Law Land (and Beyond) and Some Other Thoughts on Hadfield / Susskind^2


Thanks to Dan Rodriguez and the members of this blog for organizing this conversation. I enjoyed both books (Rules for a Flat World and The Future of the Professions) and think each offers a significant contribution to the overall legal innovation agenda. In the coming years, I plan to assign portions of both books to students in my courses.

I am a bit late to the conversation so I will just add a few discontinuous thoughts. This post will be devoted mostly to the Hadfield book. I have already written quite a bit online about the ideas explored by Susskind^2. See the following: {a.i. + law: a primer}, {machine learning as a service #MLaaS}, {the three forms of legal prediction: experts, crowds + algorithms}. However, speaking both as a person who helps run a technology company in this space and as an academic who does technical work on these questions, I believe the idea that automation is only going to reach low-level work is nothing more than wishful thinking. While the industry will remain, the nature of the work and the skill sets required are likely to change (and let’s be clear, the change will be in a technical direction). The only real question, as I see it, is the time scale.

(1) Some General Comments on the Hadfield Book

There is both a scientific agenda and an implementation agenda here. From where I sit, law can learn quite a bit from other areas of human endeavor that have confronted complexity in one form or another and have responded with some form of mitigation effort. In fairness, it is very hard to check every box in one book. However, particularly on the implementation aspects of what might be called the ‘complexity problem’, I find the Hadfield book somewhat underdeveloped. So I thought I might sketch a few efforts being undertaken in furtherance of legal complexity mitigation.

(2) Three Complexity Mitigation Strategies

(a) Lean / Six Sigma For (Legal) Processes:  Law (and government, more generally) is in real need of a rigorous focus on implementation / service delivery. We need legal professionals who can deliver a higher quality, lower cost, and more consistent service offering to clients across the economic spectrum, from the Fortune 500 General Counsel all the way down to the low-income individual seeking access to justice. Particularly for the most complex of problems, this requires some level of professional skill in system redesign / reengineering.

The application of process improvement methods such as lean and six sigma has brought significant increases in both efficiency and quality in a wide variety of fields. Here in law-law-land, however, there has been very little in the way of serious work in this direction (aside from a few notable exceptions).

As noted in this recent report from the magic circle firm Clifford Chance, "almost any task that has a beginning, a middle and an end can be construed as a process, including the practice of law." Legal processes can be recursively decomposed into a series of sub-processes down to some base layer of primitives. After such processes are mapped, they can be streamlined by some combination of reengineering and waste removal (muda). While law is not automobile manufacturing, the application of these ideas has reached far beyond manufacturing to medicine, accounting, financial services, etc.
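To make the decomposition idea concrete, here is a minimal sketch in Python. The process names and structure are hypothetical, invented purely for illustration; they are not drawn from any actual process-mapping tool or firm workflow:

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    name: str
    subprocesses: list = field(default_factory=list)  # empty list => a base-layer primitive

    def primitives(self):
        """Recursively collect the base-layer steps of this process."""
        if not self.subprocesses:
            return [self.name]
        steps = []
        for sub in self.subprocesses:
            steps.extend(sub.primitives())
        return steps

# A (made-up) contract review mapped into sub-processes and primitives.
review = Process("contract review", [
    Process("intake", [Process("log matter"), Process("conflict check")]),
    Process("analysis", [Process("extract clauses"), Process("flag risks")]),
    Process("sign-off"),
])

print(review.primitives())
# ['log matter', 'conflict check', 'extract clauses', 'flag risks', 'sign-off']
```

Once a process is flattened to its primitives like this, each step can be timed, costed, and examined for waste, which is exactly where the lean toolkit gets its traction.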

When one encounters service delivery examples across a range of public and private contexts, the hallmarks of services that have undergone such process engineering are obvious to the end user / customer / client. So even though the market for legal services is sticky and at times even downright dysfunctional, there is good reason to believe those who embrace process improvement will ultimately win out.

At the Law Lab @ Illinois Tech - Chicago Kent College of Law, I am very excited to teach a combined course in legal project management and legal process improvement with the team from Seyfarth Lean Consulting (Kim Craig, Larisa Kruzel, Kyle Hoover and others from the team). For those who are not familiar, Seyfarth Shaw is one of the leading law firms applying lean principles to reengineer the delivery of legal services (for more see here, here, and here). Students who complete all of the requirements (including the certification test) receive a Lean Yellow Belt.

(b) Design Thinking for Lawyers:  There are important overlaps between process engineering and design thinking but design thinking is a separate discipline with its own useful lessons for law. Design thinking is quite the rage in the broader business world (e.g., as the WSJ says, d school is the new b school).

Law is still largely a service business, but we are beginning to see much more productization in law, both among legal tech startups and in ongoing innovation efforts within the traditional provider ecosystem. As applied to this productization, lessons from the discipline of user-centered design will prove particularly useful.

Notable examples within law include Margaret Hagan (bridging Stanford d School and Stanford Law School) and Josh Kubicki (Chief Strategy Officer at Seyfarth Shaw). Under the leadership of my colleague Ron Staudt, my institution - Illinois Institute of Technology - has also had a long tradition of integrating design thinking and law. A2J Author, a platform that has helped deliver Access to Justice (A2J) to more than 3 million users, grew out of a collaboration between CALI, Chicago Kent College of Law and the Institute of Design @ Illinois Tech. The goal is to make a range of legal processes less opaque for low-income users.

(c) User Interfaces for Law:  One final complexity mitigation idea that J.B. Ruhl and I have discussed in multiple outlets is the idea of developing a range of user interfaces for law. While some complexity in law is simply the byproduct of political economy or improper drafting, some of it serves other important goals, such as allowing for particularization of the law across a range of differentiated contexts. In other words, the challenge with the Simple Rules thesis is that complexity is a feature, not a bug.

So rather than engage in a frontal assault on legal complexity, an alternative approach to reduce complexity experienced by the end user is to build user interfaces (UI). As J.B. Ruhl and I discuss in a recent paper, “Complexity in the underlying object may or may not project into complexity as experienced by the relevant end user. TurboTax and other competing products offer a technology layer sandwiched between the Code and the experience of the end user. In a very serious sense, this software is a legal user interface. Much like internet browsers shield (many) users from the underlying coding language (e.g., HTML and Java) and processes, tax preparation software shields users from the underlying complexity in the Tax Code.”
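To illustrate the layering idea in miniature, here is a toy sketch in Python. The rule, the rates and thresholds, and the question wording are all hypothetical stand-ins, not the actual Tax Code and not any real product’s interface; the point is only the separation between a convoluted underlying rule and a simple surface the user actually touches:

```python
def _underlying_rule(income, dependents, blind):
    # The "Code" layer: branchy, cross-referencing, hard for a lay user to read.
    base = 12000 + 4000 * dependents
    if blind:
        base += 1600
    taxable = max(income - base, 0)
    if taxable <= 9525:
        return round(taxable * 0.10, 2)
    return round(952.50 + (taxable - 9525) * 0.12, 2)

def tax_owed(answers):
    """The interface layer: three plain-language questions in, one number out."""
    return _underlying_rule(
        income=answers["What did you earn this year?"],
        dependents=answers["How many dependents do you have?"],
        blind=answers["Are you legally blind?"],
    )

print(tax_owed({
    "What did you earn this year?": 30000,
    "How many dependents do you have?": 1,
    "Are you legally blind?": False,
}))
# 1489.5
```

The end user never sees `_underlying_rule`; the underlying rule can grow arbitrarily complex without the surface questions changing, which is the sense in which the interface absorbs complexity.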

The open and non-trivial question is how we might extend those ideas to other contexts.

(3) Some Broader Thoughts on The Scientific Study of Legal Systems as Complex Systems

Law is a complex adaptive system (a fairly obvious substantive proposition; for more on this proposition see here, here, here, here, etc.). This proposition has significant implications for how we understand efforts at policy making, how our students counsel their clients, and how one might develop technology to help mitigate the law’s complexity.

Mike Bommarito and I recently presented some of our work on this topic at the Conference on Law + Complexity at the University of Michigan Center for the Study of Complex Systems (see full deck here). The conference was focused on law and complexity across a range of sub-topics (see the full conference agenda here). As I was an IGERT Fellow at the University of Michigan CSCS during graduate school and wrote a dissertation on this topic, this was a particularly cool thing to see come to life.

Now, I know these ideas might not be super familiar to legal scholars, so let me give a wider introduction. The early foundations of the field of complex systems were developed at the Santa Fe Institute by folks such as Ken Arrow, Murray Gell-Mann, Brian Arthur, John Holland, et al. The general idea is that equilibrium analysis (a closed-form analytical representation via a differential equation) is at best a first-order representation of the system it hopes to characterize. (See this George Box quote). While such models are often a pretty good first-order representation, when they break down it can be quite consequential. Increasing returns, bubbles, cascades, positive feedback loops, out-of-equilibrium dynamics, etc. are typically difficult or impossible to characterize using closed-form analytical solutions.
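A quick numerical illustration of why positive feedback resists equilibrium treatment is the classic Polya urn, in which each draw reinforces itself. The long-run share is path dependent: different random histories lock in different limits, so no single closed-form equilibrium describes where the system ends up. The sketch below is a standard textbook construction, not anything specific to law:

```python
import random

def polya_share(seed, steps=10000):
    """Fraction of red balls after reinforcing draws from a 1-red / 1-blue urn."""
    rng = random.Random(seed)
    red, total = 1, 2
    for _ in range(steps):
        if rng.random() < red / total:
            red += 1          # drawing red adds another red: positive feedback
        total += 1
    return red / total

# Five histories, five different limiting shares.
shares = [polya_share(seed) for seed in range(5)]
print(shares)
```

Run it and the five shares scatter across the unit interval; rerun with other seeds and they scatter differently. That lock-in behavior is the flavor of increasing-returns dynamics that Arthur and others studied, and it is why simulation rather than a single equation is often the tool of choice.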

The science of complex systems is sometimes characterized as a form of post-modernism (but a rigorous version thereof). Anyway, this is a much larger topic, but throughout the Hadfield book there are a number of references to and broader descriptions of law and societal complexity and the future of law in a modern global world. Thus, it is worth noting that researchers are beginning to apply the theoretical and empirical tools of complexity science to better understand how to measure, monitor, and manage the legal system as a complex adaptive system. My hope is that the Hadfield book (among other works on this topic) will bring additional theoretical and empirical attention to complexity science and how it might be used to understand and engineer the legal system.

Posted by Daniel Katz on March 8, 2017 at 11:03 PM in Symposium | Permalink

