Avoiding Redlines in the Age of AI Contract Review
At least some of your counterparties are already using AI contract review tools on your contracts, and that number is only going to increase.
Most AI contract review tools allow you to build a playbook and then have the AI scan a contract against that playbook.
If the contract meets a requirement in your playbook, the AI will mark it with something like a green flag. If the contract fails a requirement, the AI will mark it with a red flag. (Some tools also use a yellow "needs review" flag when the AI isn't sure.)
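If it helps to picture the mechanics, here is a rough conceptual sketch of that kind of playbook scan. The CONTRACT excerpt, the PLAYBOOK rules, and the naive keyword matching are made-up illustrations for this post, not how any real tool actually works under the hood (real tools use trained models, not string matching):

```python
# Rough conceptual sketch of a playbook scan, not any vendor's actual API.
# The contract text, rules, and keyword matching below are simplified stand-ins.

CONTRACT = """
7. Limitation of Liability. Vendor's aggregate liability shall not exceed
the fees paid in the twelve months preceding the claim.
12. Renewal. This Agreement renews automatically unless either party gives
notice at least 30 days before the end of the then-current term.
"""

# Each playbook rule: the issue being checked and the language the reviewer wants to see.
PLAYBOOK = [
    ("Limitation of liability is mutual", ["mutual", "each party"]),
    ("Autorenewal notice period is 30 days or less", ["30 days"]),
    ("No AI training on customer data", ["train", "machine learning"]),
]

def scan(contract: str, playbook):
    """Yield a (flag, issue) pair for each playbook rule."""
    text = contract.lower()
    for issue, required_terms in playbook:
        hits = [term for term in required_terms if term in text]
        if len(hits) == len(required_terms):
            flag = "GREEN"       # requirement met
        elif hits:
            flag = "YELLOW"      # partial match: needs human review
        else:
            flag = "RED"         # requirement failed
        yield flag, issue

for flag, issue in scan(CONTRACT, PLAYBOOK):
    print(f"[{flag}] {issue}")
```

The point of the sketch is the output format: before a human reads a word of your contract, they see a summary like `[RED] Limitation of liability is mutual`, one line per playbook issue.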
Why does this matter?
With AI contract tools, the first time a human may look at your contract is after AI has done its first-pass review. This means that the first thing a reviewer learns about your document isn’t from the document itself — it’s from the snapshot of all the red flags AI has found in your contract.
This should change the way we draft to avoid redlines.
Why?
- We all know that guy who hands you a contract to review and says, “I took a quick glance and it looks OK, so it shouldn’t take you long to review, right?” AI is going to replace that guy, but in a different way. First-pass AI review will scan your contract faster than that guy (if he even reads it), but instead of saying “it looks OK,” AI will give your reviewer a list of all the red flags in your contract. That isn’t great! It means that by the time your reviewer starts reviewing your contract, they already have a predetermined sense of how much redlining they’ll have to do. If your reviewer’s AI playbook scans for 20 issues and your contract fails 15 of them, your reviewer sees 15 red flags staring them in the face before they even start reading. They know they have a lot of redlining ahead of them. They’re in Redline Mode before they even open your document! They’re ready to redline things the AI didn’t even scan for. That’s why the more redlines you get, the more redlines you get.
- AI doesn’t care about your numbered paragraphs. The limitation of liability clause may be Section 7 in your contract, but it’s #1 in the minds of lawyers and contract professionals, according to World Commerce & Contracting’s research on the most negotiated terms. So it’s probably going to be the first issue in AI playbooks and the first result the AI shows. That means the first thing a reviewer may learn about your contract is that your limitation of liability is a red flag. That’s a tough start if your goal is to avoid redlines. Again, your reviewer is now in Redline Mode and they haven’t even touched your contract.
So how can we use this to avoid redlines?
- Your one-sided limitation of liability and indemnification clauses are going to lead to more redlines elsewhere. AI will quickly flag these clauses and send your reviewer into Redline Mode before they even touch your contract. It may be too much to ask to draft a limitation of liability clause that satisfies every counterparty’s playbook, but imagine how differently a reviewer might approach your contract if they know, before even opening it, that your limitation of liability clause is OK. Do you think they’ll be more or less likely to redline elsewhere? Less likely! That’s why the more redlines you avoid, the more redlines you avoid.
- Find ways to stack green flags. If you can take a counterparty-friendly position on a particular issue, make it obvious so the AI will green-flag it. The more green flags an AI finds, the more Drafting Capital you build with your reviewer. Think of issues like short notice periods for autorenewals and no AI training on your data, which are positions many vendors are taking anyway. If your reviewer sees a lot of green flags, they’ll see, before even starting to review your contract, that you’re taking a balanced approach, and they’ll be more inclined to reciprocate, either through minimal, targeted redlines or, hopefully, no redlines at all!
- Can you get your reviewer to reject their own redlines? Yes, you read that right. AI tools don’t just flag clauses; many can also automatically propose redlines based on a playbook. But your counterparty’s AI may propose redlines that are nonstarters for you. So if you have certain non-negotiable points, consider addressing them in an explainer page. That way, even if the AI red-flags a clause and redlines it, a human reviewer may read your explanation, understand why that particular AI redline isn’t going to be accepted, and reject their own AI tool’s redline.
As AI contract review becomes more popular, we have to assume that an AI will read our contracts before a human ever does. Drafting contracts that read well to AI is a skill we’ll have to develop.
How is AI changing the way you draft to avoid redlines?