I recently chatted with Ed Walters. Ed is the CEO and co-founder of Fastcase, a software company that has quickly become one of the world’s largest legal publishers. He is also an adjunct professor of law at the Georgetown University Law Center, where he teaches The Law of Robots, a course that examines the law governing drones, robots, and autonomous machines. Prior to this work, he was an attorney with a large DC law firm, where he focused his practice on corporate advisory work and intellectual property litigation.
What was the impetus behind Fastcase? Where do you see it going in the next five years?
The idea behind Fastcase was to democratize the law and to make legal research smarter. We started in 1999, and after almost 19 years, we feel like we’ve done some good work on both goals so far, but there is such a long way to go! Within five years, I think there’s a decent chance that all 50 state bar associations will partner with Fastcase to make legal research free for their members. (Twenty-nine state bars and dozens of metro, county, and voluntary bar associations currently do, giving access to more than 800,000 lawyers in the U.S.)
We’re making a big push now into secondary treatise publishing with our new imprint, Full Court Press – we’ve already started with our first publication, the Journal of Robotics, Artificial Intelligence, and Law (the “RAIL”). But our plans are ambitious here. I expect we’ll have dozens of new expert treatises, licensed books, bar publications, eBooks, and expert blogs indexed from LexBlog by the end of 2018.
Last, I think that democratizing the law shouldn’t stop in the U.S. – there’s so much work to do around the world. Within five years, I’d expect us to have a substantial presence in a few other countries where we can make a difference.
How did you first get interested in legal technology?
I’ve always been interested in law and technology – my father is a lawyer and a former Air Force computer programmer, and there were always computers in my house growing up. My first was an Atari 400 that I bought with lawn-mowing money when I was about 11, and I had jobs doing lightweight computer programming through the 1980s. I was always going to become a lawyer, but I had a lot of conversations in law school with friends about using a law degree to start a legal tech company.
How successful are current laws at governing artificial intelligence developments? What needs to change now, and what new laws, if any, will be needed in the future?
This is the central question of my Law of Robots class at Georgetown Law and Cornell Tech, and it would be hard to answer in short form. The history of common law is applying existing law to new facts, and AI is no different. In many places, we would regulate AI just like any software tool, with tort law or contract law or products liability law.
But our law generally assumes that a person (natural, like a human, or artificial, like a corporation owned and operated by people) makes all the decisions. The big change in robotics and AI is that, in new cases, machines, not people, will be the actors, and they will act in unpredictable ways. It won’t yet make sense to hold the machines responsible, nor to hold owners responsible for unforeseeable, emergent behavior; but neither can we leave innocent victims uncompensated or criminal acts undeterred. Some existing strict liability law or agency law could work, but it would likely discourage innovation, too. So, in many places, we’ll need new law.
Most of the interesting regulation is many years off, but watch for an interesting battle over Explainable AI, or XAI: regulations requiring that machines making important decisions explain their reasoning. Will we limit the usefulness of AI in order to be able to see inside the black box? Will we even understand the reasoning of the machines? That’s going to be an epic policy battle with far-reaching impact.
What would be the one piece of advice that you would have for a lawyer to be successful in today’s dynamic legal world?
I think empathy for clients is undervalued. Not concern for clients – lawyers are very caring people. But empathy – understanding what certain problems look like for clients, and how clients interact with legal services – isn’t taught in law school, and most lawyers don’t learn empathy. There’s a lot of empathy in design thinking, and you see empathy in leading practices, from family law to corporate legal services.
One reason clients avoid consulting a lawyer is the risk clients bear under hourly billing. Legal services are unpredictable and chaotic, and lawyers, by and large, don’t change that experience for clients. The good news is that the friction in legal services is also a huge opportunity. It’s not too late to start re-imagining client-driven legal services, and I think that firms with empathy for clients have a huge market opportunity, both in the existing market and in the unserved latent market for legal services.