Could Generative AI help solve the law's access dilemma?
Legal regulators have failed to strike the right balance between accessibility and quality in legal services. It's just possible that Gen AI could open the door to an entirely new solution.
The weird thing about regulation — well, one of the weird things, anyway — is that regulators are frequently saddled with two opposing mandates.
On the one hand, regulators are supposed to ensure that products and services meet consistent standards of quality and effectiveness. To enforce those standards, regulators place various demands on providers, such as health and hygiene requirements for food manufacturers and competence and conduct norms for service professionals.
Imposing these demands, however, raises the cost of market participation, because regulatory compliance is costly. That cost drives some would-be suppliers out of business and prompts those who remain to pass those costs along to consumers as higher prices. The more you regulate something, the scarcer and more expensive the product or service becomes. (That’s not an argument against regulation, of course, just a description of how it operates.)
On the other hand, however, regulators are also directed to ensure that the public can actually obtain the products or services they regulate. It does no good to apply such stringent regulatory standards that only a lucky or wealthy few can find or afford what they need. The finest microwaveable chicken parmesan at $400 a box doesn’t do much good. Nor does the finest accountant at $4,000 an hour.
These two opposing mandates — maximize quality while also maximizing access — invariably leave regulators in a near-impossible conflict, trying to yank open doors that they themselves are holding tightly shut. The fundamental dilemma for service regulators has always been balancing accessibility with dependability.
Accessibility is a combination of affordability (the price of the service is within the financial means of the average consumer) and convenience (the consumer can obtain the service with a reasonable degree of ease and simplicity).
Dependability is a combination of proficiency (the service and/or its provider meets industry standards for competence and accuracy) and trustworthiness (the service and/or its provider can be regarded as safe, reliable, and effective).
Every regulator has to grapple with these two sliding scales: Push one up and the other gets pulled down. You want 100% accessibility? Deregulate the entire industry and give the market free rein. You want 100% dependability? Allow only the absolute best products and services to come anywhere near a consumer.
In the real world, of course, you compromise. You sacrifice some accessibility to ensure products and services are baseline dependable, and you sacrifice some dependability to ensure those products and services are baseline accessible. Ideally, your compromises will result in a market that works well for both buyers and sellers while also serving the public interest.
Which brings us to the legal market.
I do have some sympathy for legal regulators. The prime directive in their enabling legislation or court order is to “protect the public,” a phrase dating back at least a century that’s hardly ever properly defined. And now they’re also being told to improve people’s “access to justice.” Well, which is it? Do you want dependability or accessibility? How do you protect people against both low service quality and low service availability?
My sympathy for (most) legal regulators ends here, however, because the choice they’ve made is not to do much balancing at all. They’ve gone all-in on dependability.
Regulators have restricted the supply of legal service providers to those whom they consider (wrongly, in my view) the only unimpeachably reliable option: members of the legal profession. Having thus gifted the market to a single class of providers, they’ve taken no steps to ensure those providers are widely or even moderately accessible or convenient. Your options for legal assistance, accordingly, are (a) a lawyer, if you can find one and hire one, or (b) nothing. Thanks for coming out.
Now, one easy way to strike a better balance would be to increase the supply of authorized providers. Why not just double or triple the number of lawyers, thereby reducing scarcity and price? Well, as you and I both know, it’s because regulators have been sufficiently captured by the legal profession that protecting lawyers’ interests is an unspoken regulatory priority, and lawyers’ interests aren’t served by competing against more providers and getting paid less for their work.
I’d love to see more lawyers in the legal market. But accomplishing that feat would require root-and-branch regulatory reform, making regulators comfortably independent of the lawyers they govern, and that would take years to attain. It’s a massive struggle today just to get regulatory approval of para-professionals who want to help the absolute neediest people and who pose absolutely no competitive threat to lawyers.
But even if we could, against the odds, manage to flood the market with affordable and dependable lawyers and para-professionals, I think we still wouldn’t solve the access problem.
Lawyers and para-professionals are one-to-one solution providers: They help one client with one issue at one time. The access crisis is not linear like that. Unmet and unrecognized legal needs outnumber the people who can meet those needs dozens or hundreds to one. We can’t license our way out of this. Millions of unaddressed and unresolved law-related life and business issues need a one-to-many solution — something that somehow manages to be both accessible and dependable.
Which brings us to Generative AI.
I believe Generative AI is the first legitimate candidate we’ve ever seen for potentially resolving the accessibility/reliability conundrum in legal services. Yes, there are myriad questions today surrounding its accuracy, attainability, and viability. Yes, we’re still quite some distance away from answering those questions. But the potential here is undeniable.
Turn back to the accessibility/dependability dilemma. Gen AI is already as or more accessible than any other legal resource out there.
Affordable: GPT-4, at the moment unsurpassed as the world’s best large language model, powers Microsoft Copilot, a free public search assistant. OpenAI’s ChatGPT Plus, built on GPT-4, is available for $20 a month — feel free to calculate how many hours of a lawyer’s time that will buy you. And a host of rival LLMs is racing to market, which should help keep cost barriers low.
Convenient: Beyond online search, Copilot is being integrated into Microsoft’s home, school, and office software products that millions of people use daily. Just this morning, Google offered me (a business Gmail user) discounted early access to its latest Gemini product. Gen AI ubiquity is coming.
Generative AI is not, of course, acceptably dependable as a legal resource yet. But we might not be very far from the day when it is.
Proficient: GPT-4 can pass the US bar exam and has been shown to accelerate the speed and improve the quality of law students’ work. Matching the competence of a Day One lawyer could well be attainable within the next few years. The real question then becomes how much farther Gen AI can go.
Trustworthy: High-quality law-specific LLMs are already pouring into the market (Thomson Reuters, LexisNexis, and vLex lead a growing pack), promising reliable results and safeguards against hallucination. But don’t discount the possibility that general-purpose LLMs like GPT-4 might someday equal domain-specific AI. We don’t yet know what the ceiling is here.
As I’ve said before and as I’ll certainly say again: These are very early days, and things are moving fast. I actually think Gen AI accessibility could decline this year, as giant corporations firewall their products to recoup their investments and start generating unicorn profits. But I also think dependability will increase, as new versions of OpenAI’s ChatGPT, Microsoft’s Copilot, Google’s Gemini, and many burgeoning law-specific AIs compete to outstrip each other in reliable performance.
Today, I think there’s a realistic pathway towards an outcome where Generative AI provides the world with a scalable, accessible, and dependable legal information and solutions option that’s never existed before. Obviously, I know that’s a long shot. Obviously, the odds seem stacked against it. But it could really happen. Doesn’t that possibility get you at least a little excited? Don’t you think that’s an outcome worth trying to achieve?
Market forces alone might possibly get us there. But I have no faith that market forces alone will advance the public interest. Market forces advance market interests — to be precise, as the last 30 years have demonstrated, the specific interests of a microscopically small number of people at the very summit of those markets.
So we need social forces to counteract the worst impulses of market forces and guide the better ones towards the best public outcomes. That includes government, both the legislature and judiciary, authorizing and funding the development of accessible, dependable, public-facing, AI-powered legal solution engines, working in tandem with non-profit foundations and community groups.
And it also includes the legal profession — even though, yes, a Generative AI product that successfully maximized both reliability and accessibility could conceivably wipe out much of lawyers’ billed inventory. Facing that real possibility, why would lawyers support this effort? For two reasons:
Because of the enormous financial and personal opportunities available to lawyers who transcend our traditional mundane tasks and engage with our higher calling to advocate, advise, and accompany people who need specialized, empathetic, personal assistance in their lives and businesses; and
Because I still believe that the law is fundamentally a helping profession, a group of people who are motivated by the desire to make things better for others, who are enthralled by the prospect of using their highest talents and skills to accomplish that, and who are ready to step forward, one by one, to join that cause.
Which brings us to you.
Quality has many dimensions. To your point (if I can extrapolate), “good enough” legal advice, delivered in time, at an affordable cost, and in a usable way, is good quality.
Waiting for a hand-crafted, expensive, hard-to-understand outcome from a more traditional legal service provider is not.
In Canada, between 50% and 80% of family law litigants are self-represented. So clearly there is a problem today.
Generative AI, in a way, excels at legal work: it learns from large sets of semi-structured content, refined by some human training. Isn’t that how we train lawyers in the first place?
Gen AI will clearly permit legal consumers to get the help they are currently denied. And they will avail themselves of it with or without lawyers’ help.