Level the playing field: Give consumers access to legally reliable Generative AI
If we agree people shouldn't use ChatGPT for legal issues, what should they use instead? The public deserves a better answer than "Go without." Here's how lawyers can provide one.
Should ordinary people use ChatGPT and other general-purpose Generative AI products to help them address their legal issues?
I don’t think I need to run an interactive poll here to find that the great majority of you would say, “Absolutely not.” And you would have excellent reason: When even lawyers can’t be counted on to use ChatGPT properly for legal work, how can everyday people be expected to catch errors, oversights, and hallucinations in their AI-generated research and documents? The risks are far too high; the downside is far too steep.
So the consensus response to this question from the legal establishment is: “Don’t use general-purpose Large Language Models for legal matters.” That would be great — if the rest of the world were paying attention to us on this point. I believe they are not.
In May 2023, LexisNexis did something unusual for a company in the legal industry: It posed a question to normal people. “Have you used Generative AI tools like ChatGPT to obtain legal advice or assistance with a legal question?” it asked 1,765 American consumers. In what Lexis called a “stunning” result, more than 27% of them (48% of the 57% who were aware of these tools) said yes.
That was 20 months ago. I can almost guarantee that number is substantially higher now. In fact, I expect most people today have used ChatGPT, Claude, Gemini, or another general-purpose LLM (maybe even DeepSeek) to get legal information or guidance. Why would we imagine otherwise? If a person is facing a legal issue (most people do) and can’t access a lawyer (most people can’t), and Microsoft Copilot will automatically answer the questions the person types into Google — do we really think they’ll ignore that information, even if it’s not fully accurate?
I don’t think it matters much if lawyers believe people shouldn’t use ChatGPT for their legal problems. It doesn’t even matter if we’re right. People are almost certainly using it anyway — quite possibly, in many cases, to their detriment. So the question for the legal profession is: What are we going to do about that?
I already know how some lawyers will respond: “Go right ahead! Easier for me to beat you in court, and you’ll probably end up hiring me later to clean up the mess you made.” (“Ka-ching!” was the actual term a lawyer once said to me when describing the latter scenario.)
Thankfully, most lawyers don’t condescend or exploit people’s desperation like that. But it’s not good enough to just look away, willfully blind to the reality of unaddressed legal needs. It’s not good enough to tell people, “Don’t use ChatGPT for legal matters” when we don’t offer them any viable alternatives — just like it’s never been good enough for us to say, “Don’t use a non-lawyer” or “Don’t represent yourself,” when we know that 80% of people can’t and won’t hire a lawyer. Frankly, it’s insulting.
I believe members of the legal profession, and I’m including judges and regulators here, have an obligation to do more — either because of our professional ethics, or out of common decency, or because leaving people to sink beneath the waves of the legal system is a dangerous violation of the Rule of Law. There’ve been welcome advances recently in authorizing paraprofessionals to provide legal assistance, but those efforts remain scattered and marginal, especially when compared to the incredible potential of legally reliable and effective Generative AI.
Almost a year ago here at Substack, I asked: “Could Generative AI help solve the law's access dilemma?” I’m more convinced now than I was then that the answer is yes. Gleaned from a variety of sources, most prominently Bob Ambrogi and his LawSites blog, here are several examples of legal aid agencies, pro bono service providers, and consumer assistance organizations currently using Generative AI to help low-income or vulnerable people access their legal rights and remedies.
DACA Copilot, a joint effort of Microsoft and the Northwest Immigrant Rights Project, reduces by 90% the time required for volunteers to prepare Deferred Action for Childhood Arrivals (DACA) renewal applications.
iMMPATH, courtesy of the Justicia AI Lab, provides immigrants and their advocates with actionable, multilingual, AI-powered legal guidance, helping them understand their options for attaining legal status in as little as 30 seconds.
JusticeBot, developed by the Cyberjustice Laboratory at the Université de Montréal, combines symbolic AI logic with Generative AI to help users understand their legal rights and options, particularly in landlord-tenant disputes.
Legal Aid of North Carolina’s Legal Information Assistant (LIA) is an AI-powered virtual assistant for under-served communities that provides reliable information, multilingual answers to general civil legal questions, and referrals.
The Eviction Defense Document Engine (EDDE) from Missouri Tenant Help and Lemma Legal helps renters to determine their eligibility for assistance before speaking with staff and to create their own court documents to fight eviction.
The Nevada Supreme Court and CiviLaw.Tech jointly developed a Gen-AI-powered chatbot that provides clear, concise, and personalized answers to legal questions posed by self-represented litigants.
Rentervention is an AI-powered virtual assistant from the Law Center for Better Housing that offers Illinois tenants facing housing issues self-help tools and guidance.
Roxanne is an AI assistant designed to help tenants in New York City get legal guidance for housing repairs, jointly created by NYU Law School, local nonprofit Housing Court Answers, and Australian legal tech platform Josef.
Generative AI can also help these organizations in ways that don’t directly involve consumers. The Legal Aid Society of San Bernardino, for example, uses Gen AI internally to enhance its operations, reduce administrative burdens, and create educational content. Both directly and indirectly, Generative AI is helping people get reliable legal assistance.
But we’re barely scratching the surface here — there are thousands of organizations like those listed above that lack any kind of AI capacity. And the access to justice environment in the United States has just gotten much worse. As Sean West notes in his latest GeoLegal Weekly, the Trump Administration in its first week eliminated legal aid programs for undocumented immigrants while launching attempts at mass deportations that have already had international consequences. Expect many more efforts, lawful and otherwise, to deprive people of their legal, constitutional, and human rights, and many consequent attempts by those people to find any kind of legal help anywhere they can.
And so I ask again: What is the legal profession prepared to do? Here are my recommendations for three relevant stakeholders in this area.
Law firms, especially those enjoying some of their most profitable years in history, can direct a fraction of their revenue to local legal aid and community groups, specifically earmarked for the development of Gen-AI programs and platforms like those noted above. And then they can send their lawyers pro bono to help train and develop these AI programs (and the organizations’ Legal Intermediaries) to provide better and more wide-ranging assistance to people who are never going to hire those law firms.
Legal regulators should adopt the position of Stanford’s Deborah L. Rhode Center on the Legal Profession, that every Unauthorized Practice of Law barrier currently blocking the development or deployment of reliable and effective consumer legal AI should be torn down — and that includes those that, incredibly, purport to tell courts that they can’t offer Generative AI assistance to self-represented litigants. AI has already made UPL a dead letter; regulators just haven’t realized it yet.
Legal technology companies should make their Generative-AI capacities and solutions available in some fashion to on-the-ground organizations providing front-line help to the legally needy. To their great credit, legal tech companies like Everlaw, Relativity, Upsolve, and Thomson Reuters are already putting real money and time into AI-for-access efforts, as are legal organizations like the Minnesota State Bar Association. Follow their lead. (Feel free to link other examples in the comments.)
If all these efforts were successfully undertaken, we could create what would essentially be an international network of AI-powered organizations helping people access reliable and effective legal assistance. But that isn’t, and shouldn’t be, the ultimate goal.
The point of every true A2J effort is to empower people in relation to their legal rights and remedies — to give them justified confidence in their ability to understand and access their protections and entitlements under the law. We should be aiming for legally reliable general-access LLMs with which people can forge their own legal solutions. “In years to come, the principal role of AI in law will not be to enhance today’s largely unaffordable legal and justice systems,” said Richard Susskind recently. “It will be to place the law in the hands of everyone.” He is exactly right.
If you think ordinary people shouldn’t be using ChatGPT to solve their legal problems, great. I agree with you. So, what should they be using instead? And what will you do to help them access something better? That’s the real question AI is posing to everyone in the legal profession, right now.