One of us Neota Logicians recently joined Georgetown Law Professor Tanina Rostain and Littler Chief Knowledge Officer Scott Rechtschaffen to talk about expert systems in the law. The forum was ILTA 2014, this year’s edition of the vast and collegial conclave of IT, KM, PM, ED, and other mavens from around the country and the world.
The organizer of our panel, Ginevra Saylor, Dentons National Director of Knowledge Management, set us the question: are expert systems a threat to or an opportunity for traditional legal services?
Unlike moot court or a debating society, where participants are assigned to advocate both sides of the proposition, ILTA panels generally line up with the speakers’ true views. So the three of us, all of whom are active practitioners of the expert systems science and art, debated against an empty chair. We had hoped to have noisy skeptics in the audience with whom to joust, but none stood up.
This post sets forth the case we made. Expert systems are:
- More powerful than the software tools commonly used in legal services.
- Ubiquitous in other industries.
- Practical and effective in legal services.
- Good for clients (better, faster, cheaper services), for law firms (revenue, reputation, and quality assurance), for law students (analytic rigor, client savvy), and for consumers (affordable guidance).
Because the title of the panel includes the words “expert systems,” and ILTA is a conference filled with both experts and systems, we thought we should start by explaining what we meant by those words. Expert systems are:
- A mechanism
- With which one can acquire the expertise of one or more people who know a lot about a subject, a/k/a experts,
- And structure that expertise so it can be delivered to other people (tens or thousands of them) who know less about the domain, but need a solution to a specific problem within the domain.
Oh, you mean an e-book?
No. Books don’t help much with acquiring or structuring.
Well, then, how about a search engine?
No again. Search engines are great at gathering, which is a form of acquiring, but mostly not good at structuring or specificity of answers, though they are getting better at both.
How about the Edwin Smith papyrus? That marvelous artifact is the oldest known medical diagnostic system, from 1700 BC and apparently recording knowledge acquired 1000 years earlier.
Now we’re getting closer. Because if you happen to read hieratic, the Egyptian cursive form of hieroglyphics, you will discover in this 13-foot-long scroll a checklist of differential diagnosis, treatments, and prognosis, probably written for military surgeons.
- IF the wound is in the head
- AND has penetrated the bone
- AND there is bleeding,
- THEN apply meat to stop the bleeding
- AND then suture the wound.
Let’s modernize this and write the diagnostic advice in a programming language. Your choice: Python, C#, Java, Swift, or any of a hundred others.
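Here's what that might look like in Python (a hypothetical sketch, not production medical software):

```python
# The papyrus rule, hard-coded the way a programmer would write it.
# Notice that the medical knowledge and the control flow are tangled
# together: only a programmer can read, change, or extend this.

def treat_head_wound(penetrated_bone: bool, bleeding: bool) -> list[str]:
    steps = []
    if penetrated_bone and bleeding:
        steps.append("apply meat to stop the bleeding")
        steps.append("suture the wound")
    return steps
```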
Nope, still not an expert system. Why? Because the expert’s knowledge and the tools for working with it are all mushed together in code. And only a programmer can read this stuff, write it, or, most important, update it.
Instead, expert systems separate the knowledge and the tools into two parts:
The Knowledge Base, which holds what the experts know, using a range of methods to represent that knowledge, including:
- If/Then rules
- Decision trees
- Decision tables
- Formulas and other mathematical expressions
- Multi-factor weightings
- And other reasoning methods
The Inference Engine, which links together the various bits that the experts know and links those bits to the problems that users present. It will:
- Automatically apply the relevant reasoning, pursuing goals via backward and forward chaining.
- Drive interactions with users and external systems.
- Explain itself.
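To make the separation concrete, here is a toy sketch in Python: the knowledge base is plain data, and the engine is generic code that forward-chains over it until no new facts can be derived. This illustrates the idea only; it is not how any particular commercial engine is implemented.

```python
# Knowledge base: plain data an expert (or knowledge engineer) can edit
# without touching the engine. Each rule pairs a set of required facts
# with the fact it concludes.
KNOWLEDGE_BASE = [
    ({"wound in head", "penetrated bone", "bleeding"}, "apply meat"),
    ({"apply meat"}, "suture wound"),
]

def forward_chain(facts: set[str]) -> set[str]:
    """Generic inference engine: fire any rule whose conditions hold,
    and repeat until no new facts emerge."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in KNOWLEDGE_BASE:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

result = forward_chain({"wound in head", "penetrated bone", "bleeding"})
```

Updating the system means editing the rule data, not rewriting the engine, which is precisely the separation described above.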
With the programming languages listed above, every step from beginning to end and every procedure for executing the steps must be specified by a programmer. That’s traditional “procedural” programming.
Here’s a procedural recipe for chocolate cake. Start at the beginning, follow the defined path to the end, and you’ll have a nice cake.
In contrast, expert systems software tools are “declarative.” Declare the system’s goal—a delicious chocolate cake. Or, to be more lawyerly, is a stock transaction a disposition under Internal Revenue Code Section 306?
Define the rules, often many of them with multiple inputs and relationships. Then the expert systems software can automagically organize the relevant rules, ask the necessary questions, determine the answer, and present it to the inquiring stock seller (more likely, her accountant), accompanied by explanations of the reasoning.
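Here is a toy sketch, in Python, of that goal-driven ("backward chaining") style. The rules and question wording are invented for illustration and are nothing like the actual Section 306 analysis:

```python
# Goal-driven reasoning: declare the goal, and the engine works backward,
# asking only the questions it actually needs answered.
RULES = {
    # goal: subgoals that must all be true (hypothetical, not real tax law)
    "section_306_disposition": ["stock_is_306_stock", "stock_was_sold"],
}

QUESTIONS = {
    "stock_is_306_stock": "Was the stock received in a tax-free distribution?",
    "stock_was_sold": "Was the stock sold or otherwise disposed of?",
}

def prove(goal: str, answers: dict[str, bool], asked: list[str]) -> bool:
    """Work backward from the goal; the engine, not the programmer,
    decides which questions to ask and in what order."""
    if goal in RULES:
        return all(prove(sub, answers, asked) for sub in RULES[goal])
    asked.append(QUESTIONS[goal])   # record that the engine asked this
    return answers[goal]            # simulated user answer

asked: list[str] = []
result = prove("section_306_disposition",
               {"stock_is_306_stock": False, "stock_was_sold": True}, asked)
```

Because the first subgoal fails, the engine never asks the second question, so the user answers only what is necessary.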
To round out the expert systems toolbox, we need to add:
- Editors, for creating and modifying rules and other constructs.
- Extractors, to pull rules from Word, Excel, and other sources.
- Verification & Validation tools, because experts, lawyers especially, want their systems to be truly expert, to be correct.
- And integrations so our expert systems can talk to other systems like document management and HR.
Expert systems—rule-based systems—are used in dozens of industries, from medicine to mortgages, including the government side of the legal industry. They don’t advertise the technology under the hood; they just deliver useful answers. One example runs on an iPhone pulled from the white-coat pocket of a busy emergency room physician.
And here are a few more among the many:
- WebMD symptom checkers
- Medical billing systems
- Ikea’s Ask Anna virtual assistant
- Consumer loan approvals
- Workers’ compensation claims review
- Insurance fraud detection
- Social services eligibility determination
- Internal Revenue Service Interactive Tax Assistant
In Law Firms
Scott Rechtschaffen outlined three contexts for expert systems in law firms.
The first is business development, exemplified by Littler’s Health Care Reform Advisor, which is available without charge on the firm’s web site:
“Employers can use Littler’s Healthcare Reform Advisor, an innovative online system, to determine whether they may be at risk of having to pay a penalty under the ACA’s “pay or play” mandate.”
The second context is quality assurance. Littler’s CaseSmart–Charges, “an integrated solution to managing administrative agency charges,” incorporates expert systems to guide lawyers through a comprehensive factual and legal analysis of each claim.
The third context is revenue generation. Clients welcome initiatives by law firms to productize expertise. Expert systems on a subscription model can deliver on-demand answers to frequently occurring regulatory questions far more cost-effectively than traditional legal services on the hourly billing model.
For example, whether a worker is best classified as an independent contractor or an employee is a question asked and answered millions of times each year across the U.S. workforce, with significant enforcement consequences if answered incorrectly. The analysis is intricate, multi-jurisdictional, and fact-specific, and cannot be done cost-effectively, rapidly, and accurately at high volume by traditional legal services methods. An expert system, however, can do the analysis easily all day long. And when the specific circumstances are unusual or sensitive, the system routes the question to lawyers for evaluation.
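As a toy illustration of the "multi-factor weighting" method mentioned earlier, here is how a worker-classification scorer might be sketched in Python. The factors, weights, and thresholds are invented; the real test is far richer and varies by jurisdiction:

```python
# Hypothetical factors weighted toward "employee" (positive) or
# "independent contractor" (negative). An expert, not a programmer,
# would maintain this table.
FACTORS = {
    "employer_controls_schedule": 3,
    "employer_provides_tools": 2,
    "worker_serves_multiple_clients": -2,
    "worker_can_subcontract": -2,
}

def classify(facts: dict[str, bool]) -> str:
    """Sum the weights of the factors present and apply thresholds."""
    score = sum(w for f, w in FACTORS.items() if facts.get(f))
    if score >= 3:
        return "employee"
    if score <= -2:
        return "independent contractor"
    return "refer to a lawyer"   # unusual cases route to human review
```

The "refer to a lawyer" branch mirrors the routing described above: the system handles the routine volume and escalates the edge cases.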
In Law Schools
In Tanina Rostain’s Technology, Innovation & Law Practice course at Georgetown Law, now in its sixth semester, student teams build legal expert systems in partnership with not-for-profit organizations and government agencies that serve as both subject matter expert and client.
As the Legal Services Corporation reports, “Demand for legal aid far outstrips the resources available. This is known as the ‘justice gap.’ Recent studies indicate that legal aid offices turn away 50 percent or more of those seeking help.”
Student applications have dealt with issues such as domestic violence, housing, health care, family law, and financial exploitation. They serve to narrow the justice gap, following the Legal Services Corporation’s call to apply technology to deliver legal services to “everyone, anytime, anywhere.”
At the conclusion of the course, students present each application to a panel of invited judges in the IronTech Lawyer competition. Examples of the applications:
- New York City Earned Sick Time Advisor – to provide self-help answers, with A Better Balance.
- Unemployment Benefits Hearing Advisor – to guide people representing themselves, with the DC Office of Administrative Hearings.
- Debt & Eviction Navigator – to deliver in-the-field support for non-lawyer professionals, with JASA, the Jewish Association Serving the Aging.
- Legal Aid Intake – to improve practice efficiency, with the Virginia Legal Aid Society.
- MIDAS – Military Impact of Discharge Assessment System, to strengthen adjudicative decisions. MIDAS illustrates the power of expert systems in the access to justice context.
Excruciatingly complex rules govern how the terms of a soldier’s discharge from service affect a veteran’s claim for health benefits.
That complexity can be captured efficiently in expert system decision trees, which then deliver a simple question-and-answer user experience to the person trying to apply those rules to specific facts for a specific person.
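Here is a minimal sketch of a decision tree as data, driving a question-and-answer flow. The questions and outcomes are hypothetical placeholders, not the actual discharge rules MIDAS encodes:

```python
# Each node holds a question and the next node for each answer.
# Leaf names map to plain-language outcomes. All content is invented
# for illustration.
TREE = {
    "start": ("Was the discharge characterized as honorable?",
              {"yes": "eligible", "no": "misconduct"}),
    "misconduct": ("Was the discharge due to minor misconduct only?",
                   {"yes": "review", "no": "ineligible"}),
}

OUTCOMES = {
    "eligible": "Likely eligible for benefits.",
    "review": "Eligibility depends on a character-of-service review.",
    "ineligible": "Likely ineligible; consider a discharge upgrade.",
}

def advise(answers: dict[str, str]) -> str:
    """Walk the tree one question at a time until reaching an outcome."""
    node = "start"
    while node in TREE:
        question, branches = TREE[node]
        node = branches[answers[question]]
    return OUTCOMES[node]
```

The user sees only one simple question at a time, while the tree quietly carries the full complexity of the rules.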
What do students learn from building applications? Many things, many of them not taught in other courses. They learn:
- Rigorous legal analysis. Sure, all courses teach that. But writing an exam answer or a paper doesn’t require the same rigor. If you don’t know the law from the trunk of the tree all the way to every leaf, you can’t write the rules, or apply the normalizations and sometimes simplifications necessary to deliver a useful answer.
- Clear writing. Maybe some courses teach that. But writing for real people who are seeking specific answers to questions of real consequence is a challenge for lawyers. Clarity, brevity, and simplicity are essential.
- And finally the students learn teamwork (with colleagues and clients), project management, a systems approach to problems and practice, user experience design, and even some software engineering. All skills essential for 21st century practice.
How are these things built?
Building expert systems requires people, in addition to software. An expert, of course. Usually multiple experts, because in any complex and interesting domain, several heads are likely better than one, even if it’s this one.
And a “knowledge engineer”: a technically savvy person, though not necessarily a programmer, who is facile with one of the expert systems tools (so the KE can build and test the system) and is adept at helping the domain experts to:
- Understand expert systems concepts, such as rules, and the software’s capabilities.
- Organize their knowledge in a systematic way so it can be encoded in the software.
- Anticipate what users will want to do, so user experiences are well-designed, clear, and coherent.
An anecdote from the history of expert systems.
Digital Equipment Corporation, a sadly departed computer company of an earlier generation (a case study, by the way, in Christensen death by disruption), built an expert system to configure its big, multi-part computer systems for customers. Like the expert systems you see on car company web sites – if you order option A, you can’t have option B, and so on. DEC’s system was big, useful, and used for a long time.
One of its original creators gave a talk about the system’s construction and was asked how it came to be named R1. He said, “When I started the project, I didn’t know what a knowledge engineer was, and now I are one.” Hence, R1.
Here At Neota Logic
We develop Neota Logic Server, a uniquely powerful software platform with which law students, lawyers, and other people who are not programmers can build expert systems.
We collaborate with pioneering firms like Littler to build expert systems for their clients.
And we collaborate with Georgetown Law and other schools to teach knowledge engineering and, by providing our software, enable students to build and organizations to deploy applications that help to close the justice gap.