AI Contract Review Tools: Are Law Firms Taking on Hidden Legal and Financial Risks?
Are You Really in Control of Your AI-Powered Legal Tech?
AI contract review platforms such as ThoughtRiver and Sirion Labs are becoming popular in legal practice, and firms such as KPMG now expect lawyers to have experience using them to cut costs and work more productively.
But are these tools truly protecting your firm’s interests, or are they shifting risks onto you? After reviewing the Terms & Conditions of these platforms, I found contractual loopholes that could expose law firms to liability, data privacy risks, and financial losses.
Hidden Risks in AI Contract Review Software
1. AI Shifts Legal Liability to the Customer
Most AI vendors limit their liability to the amount of fees paid. If the AI makes a critical mistake, your firm—not the provider—bears the legal and financial consequences.
Example from Sirion’s Terms & Conditions:
“The maximum amount that either we or you can be held liable for… will be limited to the Fees you paid us for the Services in the twelve (12) months before the occurrence of the first event giving rise to liability.”
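To see the practical effect, consider a hypothetical with assumed figures: if your firm paid £20,000 in subscription fees over the preceding twelve months and a contract error the AI failed to flag results in a £500,000 claim, your recovery from the provider would be capped at £20,000, leaving the firm and its insurers to absorb the remaining £480,000.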
Why This is a Risk
What You Should Do
Key Issues & Contradictions:
Conflicts with Purpose – AI contract review tools must, by their nature, process sensitive documents such as NDAs containing settlement amounts and merger agreements containing pricing, yet this clause prohibits uploading “Protected Information”, which contracts inherently contain.
No Liability for Data Misuse – If protected data is uploaded and misused, the provider accepts no responsibility.
Access Suspension Risk – A merely suspected breach could result in suspended access, disrupting operations.
Law firms must demand clarity on how this aligns with the platform’s intended use before relying on it.
2. Your Client Data (IP) May Be Used to Train AI Models
Uploading contracts for AI review may grant the provider a perpetual licence to use your data for benchmarking, analysis, and AI training, often without clear safeguards.
Example from ThoughtRiver’s Terms & Conditions (IP Ownership Clause):
“The Customer grants ThoughtRiver an irrevocable, perpetual, worldwide, royalty-free license to store, display, use, copy, maintain, customise and provide such data as part of the Platform…”
Why This is a Risk
What You Should Do
3. You Might Be Liable for Data Breaches—Even If It’s Not Your Fault
Some AI platforms require customers to share the costs of mitigating data breaches, even when the breach occurs on the provider’s own servers.
Example from ThoughtRiver’s Terms & Conditions:
“The Customer shall pay ThoughtRiver’s reasonable costs for investigation, mitigation and remediation of each such Personal Data Breach.”
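For illustration, with assumed figures: if a breach of the platform exposed client contracts and the provider incurred £50,000 in investigation and remediation costs, this wording could be read as obliging the customer to reimburse that sum, regardless of where the fault lay.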
Why This is a Risk
What You Should Do
Conclusion
Law firms and organisations must look beyond flashy interfaces and cost-saving promises and focus on what really matters—fair terms and accountability.
I fully support AI in the legal domain and recognise its evolving potential. However, providers cannot reasonably sell a digital product whose accuracy, reliability, and safety they refuse to stand behind while also disclaiming liability for negligence, economic losses, and infringement; clients must have fair protection for their legal and business interests.
Demand compliance with the EU AI Act to enforce responsibility.
Disclaimer
The information in this article is based on company website terms as of 18 February 2025. This content is for awareness and educational purposes only and does not constitute legal, financial, or professional advice.