Non-Lawyers: Read This Before Using AI for Legal Tasks

(Or: Non-Lawyer Use of AI for Legal — Considered Harmful?)

Clients and potential clients who aren’t lawyers are already using mass-market AI tools like ChatGPT or Claude for legal tasks. Barely a day goes by in my practice without a client sending me a “draft agreement” or an AI summary of legal research on some topic — typically thinking they’re mostly, if not entirely, “done” — and asking for my input. These work products are rarely what was hoped for. But my goal here isn’t to scold non-lawyers for doing this, or even to discourage the practice, but to put it in the right context vis-à-vis licensed and experienced legal counsel.

The Pitfalls of Non-Lawyer Use of General AI for Legal

Below is just a sampling of the risks and adverse consequences of non-lawyer use of off-the-shelf AI, without professional legal advice:

  • Outputs are generally brief sets of terms, closer to a term sheet than a full agreement. This leaves out not only detailed legal provisions that might be significant (and that an experienced attorney would account for), but also legal “boilerplate” — which is, contrary to popular assumption, extremely important.
  • Not all legal considerations live within the “four corners” of a contract — far from it. Failure to consult an attorney about the underlying facts and circumstances can miss key, even overriding, legal concerns. Experienced attorneys know this, and will generally start off by — or at some point steer the conversation back to — understanding what the client is actually trying to accomplish, what the parties’ assumptions and postures are, timing, risk tolerances, and so on.
  • AI is only as good as the data it has access to, and general AI draws heavily from public data. That’s where it gets its assumptions — so when it drafts language and declares a clause “standard,” that’s often a false premise (either because its sense of “standard” is skewed, or because your particular situation is a niche where common clauses are irrelevant). I’ve seen AI confidently announce that a condition wasn’t addressed in a contract when it in fact was — just worded in a way the AI didn’t expect — leading to wasted time rehashing the issue with the client.
  • LLM-based AIs (the prevailing type) are trained to please their prompters: to produce output where there might otherwise be none, and to project confidence where it should be tempered. Not only does this produce the notorious “hallucinations” (the most famous example being AI citing fabricated cases), but in contract drafting and review, it causes AI to generate superfluous clauses or critiques simply to produce more responsive output. This is a terrible tendency for law, where disclaiming, qualifying, and being transparent about genuine uncertainty is as important as getting things right.

All that said, term sheets and high-level agreement drafts are often genuinely useful as starters — and indeed, barely a day passes in my practice without my asking a client for a term sheet to get things moving. The process of having AI draft a proto-agreement can also be useful for helping non-lawyers think through what they actually need. So, if you’re inclined, keep those AI outputs coming! But temper your expectations on three fronts:

  • How legally thorough or complete the output is.
  • How legally accurate the output is for your specific context.
  • How little attorney time it will actually take to build on that AI output and finish the job. (Clients often assume a “quick look” will wrap things up; in reality, a quick look often isn’t enough to do the task justice — and in some cases, an AI-generated starting point can take longer for an attorney to work with than simply starting from an in-house template.)

And of course — AI output isn’t licensed attorney advice. Not everyone cares about that distinction, but most serious business contexts call for, if not outright require, licensed legal counsel. You might not think your situation does, or you might not be sure. But ask yourself: if the transaction later goes awry, and legal terms you only ever ran through AI are implicated in the damages, would you be comfortable with that? (And even if you are, your business partners, investors, or other stakeholders might not be. Moreover, in formal corporate environments, skipping licensed legal advice can constitute a breach of fiduciary duty — giving rise to legal liability.)

Professional Legal AI

The other side of this coin is that modern legal practitioners are — and, at least in my view, should already be — using AI themselves as a tool to improve their practices and bring real value to clients.

Key to this is that there are already scores of professional-grade legal AI tools on the market, fine-tuned to perform legal work far more reliably than general AI tools. (I include in this category tools like Claude + Cowork — technically general AI — that can nonetheless be deployed in a legal environment to function much like specialized legal AI.)

Most non-lawyers don’t realize this is already the state of the art, or appreciate the value that bespoke AI brings to legal practice. In my own practice, I routinely see speed-ups of 30–50% on drafting and research tasks through professional legal AI — and that’s before accounting for coverage increases and quality improvements that are harder to quantify.

Among other advantages, dedicated legal AI tools dramatically reduce hallucinations by being built atop known sources of truth: proprietary legal research databases and professional-grade forms and template libraries. This is absolutely critical in legal work. Approximation, guesswork, and responses that “sound right” won’t cut it — and are often downright dangerous.

The key point is this: in the hands of an experienced attorney, embedded in a trustworthy and broad-based legal information environment, legal AI can be leveraged far more effectively than general AI can in the hands of a non-expert. A useful analogy for the technically inclined: a lawyer is something like an operating system — or a compiler or interpreter — for the law. A legal document, or an AI tool applied to legal questions, will look very different through the eyes of someone with legal training, who can take those inputs and extract insights that the underlying tool simply cannot surface on its own.

Conclusion

In sum, AI is neither “good” nor “bad” for legal work — it’s a tool, and like any tool, it can be used well or poorly. Used properly, it meaningfully increases productivity and the quality of the end result. And in many, if not most, circumstances, working with a licensed attorney in the relevant field — who is also leveraging professional legal AI on their end — is the recipe for avoiding costly missteps and getting there with maximum efficiency and economy.

So go ahead: ask ChatGPT to draft that contract, or ask Claude to research a legal question and summarize the answer. Just make sure to frame it as an educational undertaking or a starting point for the real task, not the finish line. Then ask yourself whether you’d be comfortable telling a stakeholder that you relied on that output alone — especially if problems surface later that might have been caught earlier. If the answer is no, consider reaching out to an experienced, AI-leveraging attorney to bring the job home: for the peace of mind, yes, but also for the potential value-adds — additional insights, upfront efficiencies, and costs avoided down the road. You can click here to reach out to AI-adept legal counsel and see how we can help.

Note: This article is intended as general commentary and does not constitute legal advice.