Counsel and Code

The 70-Point Document Your Client Thinks Is 90

A client recently sent me a contract they had drafted with the help of an AI tool. They asked me to “just review it”—the word “just” was doing a lot of work in that sentence. They proposed paying me twenty percent of what I would normally charge to draft something equivalent.

I read the contract. It was, in fairness, a competent draft. The structure was correct. The standard provisions were there. The language was professional. To an untrained eye, it looked like something a lawyer might produce.

But it was wrong in ways the client could not see.

The indemnification clause had three subtle gaps that would, in a dispute, transfer significant risk back to my client. The change-of-control provision contained a phrase that, in this client’s industry, would trigger a regulatory filing obligation they didn’t know about. The dispute resolution clause was reasonable on its face but would be a disaster if the counterparty turned out to be the kind of company that exploits jurisdictional ambiguity—which, given the counterparty, was a real risk.

Fixing these issues required, roughly, the same intellectual work as drafting the contract from scratch. Each fix required reading the surrounding clauses, understanding what the AI had been trying to accomplish, deciding whether the AI’s attempt was salvageable or needed to be replaced, and integrating the fix without breaking other provisions. The AI-generated draft did not save me time. It added time, because I had to reverse-engineer its decisions before I could improve on them.

I told the client this. They were confused. The contract looked fine to them.

This is the situation I want to write about today. It is the situation that is going to break a lot of lawyer-client relationships over the next several years.

The 70-90 problem

Here is the framework that I have started using internally to describe what’s happening.

For any given piece of legal work product, there is an objective quality scale that lawyers can roughly perceive. A document is, say, 70 out of 100—it has the right structure, covers the main issues, sounds professional, but contains a handful of substantive errors or omissions that an expert would catch. Another document is 90 out of 100—same structure, same surface professionalism, but the substantive judgment is reliable and the edge cases have been handled correctly.

The gap between 70 and 90 is, in legal terms, enormous. It is the difference between a contract that holds up under stress and one that explodes when something goes wrong. It is the gap that lawyers exist, professionally, to close.

The problem is that the gap between 70 and 90 is invisible to most clients. To a client who is not a lawyer, both documents look professional. Both have all the sections. Both use the right vocabulary. The 70-point document is not obviously deficient—it just won’t survive contact with an adversarial counterparty in three years.

AI has dramatically increased the supply of 70-point documents. It has not increased the supply of 90-point documents. The supply of 90-point documents is still bounded by the number of lawyers with serious expertise, and that supply is not growing.

What has changed is the visible quality floor. Before AI, a non-lawyer attempting to draft a contract produced something that was clearly amateur—40 points, maybe. The gap between that and a real lawyer’s draft was obvious. The lawyer’s value was visible in the contrast.

Now the same non-lawyer, using AI, produces a 70-point document. The gap between that and a real lawyer’s draft is much smaller in perception, even though it is just as large in consequence. The lawyer’s value has become invisible.

What the client is actually thinking

I want to be charitable about what the client is doing when they say “I’ve already drafted it, just review it for 20% of your fee.”

They are not, in most cases, trying to cheat me. They are doing what they think is a reasonable thing. They have produced what looks to them like a 90-point document. They are asking me to confirm it, and they are pricing the confirmation as if it were a quick sanity check.

The disconnect is that I am not being asked to confirm a 90-point document. I am being asked to find the 20 points of hidden problems in a 70-point document. This requires almost the same expertise as the original draft would have required. The “review” is not cheaper than the original work; it is the same work, performed by reverse-engineering instead of by composition.

The client doesn’t see it this way. From their perspective, the heavy lifting—the drafting—has been done. What’s left is verification. Verification, surely, costs less than creation.

This perception gap is the single most expensive thing happening to the legal profession right now. It is much more important than the conversation about AI hallucinations. It is much more important than the conversation about whether AI can replace lawyers. Because it changes the fee conversation in every single matter.

The pricing trap

Here is the pricing trap that this creates.

Option one: I accept the engagement at 20% of my normal fee. I spend nearly normal time on the work because the review-and-revise is nearly as expensive as drafting. I lose money on the matter. I do this once or twice and then stop accepting clients who bring me AI-generated drafts.

Option two: I decline the engagement. The client thinks I am being inflexible or rapacious. They go to another lawyer, who accepts at 20% and does a superficial review—genuinely a 20%-effort review—and sends the document back. The client is happy. The document, three years from now, explodes.

Option three: I take the engagement at 60% of my normal fee. I explain why “review” of an AI-generated draft is closer to drafting than the client realizes. The client agrees, reluctantly, sometimes. Sometimes they don’t, and they go to option two’s lawyer.

I have settled, mostly, on a version of option three. I tell clients explicitly: “Reviewing an AI-generated draft is more expensive than reviewing a draft you wrote yourself, because the AI’s mistakes are more subtle than a non-lawyer’s mistakes. I will charge approximately 60-70% of my normal drafting fee. Most of the work is the same; the document just happens to already exist.”

The clients who understand this become long-term clients. The clients who don't tend to leave for cheaper lawyers, learn an expensive lesson in two or three years, and sometimes come back. Sometimes they don't; the expensive lesson just teaches them to be more careful, with a different lawyer.

What this does to lawyers’ self-worth

I want to be honest about something that’s hard to write.

When a client tells me “your work isn’t worth what it used to be because AI can produce most of it”—even when they don’t use those words—something inside me reacts. The reaction is not professional. It is personal. I have spent fifteen years building a craft. Being told the craft is worth less than it was hurts in a specific way.

Most lawyers will not write about this, because it sounds like complaining and complaining is unattractive. But I think it’s worth saying out loud, because the emotional response shapes how lawyers handle these conversations, and how they handle these conversations shapes whether they keep the clients.

If you react with defensiveness—“my judgment is irreplaceable, you don’t understand”—you lose the client. If you react with capitulation—“sure, I’ll do it at 20%”—you go bankrupt. The right reaction is calm explanation: “Here’s why this work is more expensive than it looks, and here’s why you want to pay for it.” Most lawyers, when their professional pride has just been wounded, cannot deliver that explanation calmly. They deliver it as a sermon, or they don’t deliver it at all and just lower their rate.

The lawyers who get past this emotional moment, and learn to have the conversation calmly, are the ones who survive. The lawyers who can’t move past it—who feel insulted every time a client mentions AI—those lawyers are slowly leaking clients to anyone who will engage with the new reality without flinching.

The bigger pattern

Step back from individual cases. The 70-90 problem is not just a contract-drafting issue. It applies to everything in legal practice that AI now does competently.

AI produces 70-point legal research. Clients think it’s 90-point and don’t see why they need a lawyer to do the research.

AI produces 70-point initial advice. Clients think it’s 90-point and don’t see why they need an initial consultation.

AI produces 70-point drafts of demand letters, settlement proposals, even arguments. In each case, the visible surface is professional. The 20 points of hidden problems are invisible until they matter, which is usually months or years after the document was produced.

The legal profession’s challenge for the next decade is not “how do we use AI.” It is “how do we get clients to understand that the visible surface of legal work is the smallest part of what they’re buying.” This is harder than it sounds, because the visible surface is, by definition, the only part the client can see.

The lawyers who figure out how to explain this—patiently, repeatedly, without sounding defensive—are the lawyers who will have practices in 2035. The lawyers who can’t are the ones whose clients will gradually drift toward whoever charges less for the visible surface, and learn the hard way what the invisible 20 points were actually for.


Part of an ongoing series on the changing economics of legal work. Related: why clients pay for certainty, not answers.

If you’ve had the “just review this AI draft” conversation, email [email protected]. I’m collecting examples of how lawyers are handling it.

