Last reviewed on May 12, 2026.
Most legal AI procurement decisions are made with a product demo, a pricing sheet, and a handful of reference calls. That works when the product is mature and the vendor is large. It works less well when the vendor is a two-year-old company built around a single model wrapper, when the data being processed is client-confidential, or when the contract is a multi-year commitment.
This guide is the checklist version of what serious procurement teams ask. It is organised in five sections — security, data handling, contractual terms, references, and viability — because those are the areas where the answers most often diverge between a vendor's sales materials and the operational reality.
1. Security and infrastructure
Legal AI tools handle client data. Some of that data is highly regulated. Before the platform sees a real document, you should be able to answer the following.
- Where is data stored geographically? Can you pin storage to a region if the firm has clients with data-residency obligations?
- Is data encrypted in transit and at rest? What encryption standards are used and who holds the keys?
- What independent security audits has the vendor completed? SOC 2 Type II is the common floor for enterprise legal-tech procurement; ISO 27001 is common in international contexts. A report from an "internal security review" is not an audit.
- What is the vendor's penetration-testing cadence? Will they share the most recent summary under NDA?
- How are user accounts authenticated? Look for SSO with SAML or OIDC, role-based access controls, and the ability to enforce MFA for all users.
- Are administrative actions logged in a way that supports the firm's compliance obligations?
Watch for
"We use bank-grade security" is marketing language, not an answer. Ask for the specific controls.
2. Data handling and AI specifics
This section matters more for AI tools than for traditional SaaS. The questions concern what happens to the data after it enters the platform.
- Does the vendor train its models on customer data? If so, what is the opt-out mechanism and how does the firm verify it is honoured?
- If the product uses a third-party foundation model (for example, an OpenAI or Anthropic model), how is data routed and what contractual protections exist with that subprocessor?
- What is the data retention policy after a matter closes or a subscription terminates? How is deletion confirmed?
- Are prompts and outputs logged? Who has access to those logs inside the vendor?
- Does the platform support pseudonymisation or redaction of identifying information before processing, where required by client policy?
- If the firm asks for a list of subprocessors, can the vendor produce a current one with locations?
3. Contractual terms
The boilerplate matters. Read the MSA before the pilot starts, not at renewal.
- Liability caps. Many vendors cap liability at twelve months of fees, which for a seat-priced product can be a small fraction of the potential exposure from a single incident. Push for a higher, separate cap on data-breach claims specifically.
- Indemnification, particularly for IP claims arising from model output. As of 2026 the major model vendors offer indemnities; legal-AI vendors built on top should be passing through equivalent protections.
- Service-level agreements. What is the committed uptime, and what are the credits if it is not met?
- Breach notification timelines. Forty-eight hours is reasonable; "without undue delay" is not specific enough for regulated work.
- Termination for convenience. Can the firm exit if the product is not delivering, or only at the end of a multi-year term?
- Data return and deletion on exit. The firm should be able to extract its data in a portable format and confirm deletion within a defined window.
- Auto-renewal language. Many platforms quietly renew for multi-year periods. Build a calendar reminder against the notice deadline.
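Two of the points above come down to simple date and percentage arithmetic that is easy to get wrong in a negotiation. The sketch below is illustrative only; the function names, the 30-day month, and the example dates are assumptions, not terms from any real contract.

```python
from datetime import date, timedelta

def allowed_downtime_minutes(uptime_pct: float, days_in_month: int = 30) -> float:
    """Convert a committed uptime percentage into allowed downtime per month.

    Assumes a 30-day month for simplicity; a real SLA defines its own
    measurement window.
    """
    total_minutes = days_in_month * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

def notice_deadline(renewal_date: date, notice_days: int) -> date:
    """Last day the firm can serve non-renewal notice before auto-renewal."""
    return renewal_date - timedelta(days=notice_days)

# "99.9% uptime" sounds strict but still permits roughly 43 minutes of
# downtime in a 30-day month.
print(round(allowed_downtime_minutes(99.9), 1))   # 43.2

# A hypothetical contract renewing on 2027-01-01 with a 90-day notice
# period must be cancelled on or before 2026-10-03.
print(notice_deadline(date(2027, 1, 1), 90))      # 2026-10-03
```

This is the calculation to run before accepting an uptime number at face value, and the deadline to put in the calendar reminder.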
4. Reference checks done properly
Vendor-provided references are the warmest customers. They are still useful, but they are not the whole picture. A serious reference exercise covers three angles.
Vendor references
Ask whether the reference firm uses the product in production or only in a pilot. Ask what does not work as well as the demo suggested. Ask what they would change about the implementation in hindsight.
Independent references
Find one or two firms using the product who were not provided by the vendor. Industry forums, your own network, and the legal-tech press make this manageable. Independent references are where the most useful information sits.
Recently departed customers
If you can speak to a firm that chose this vendor and then left, that conversation is worth ten happy-customer calls. Ask politely; some firms will share their reasons.
5. Vendor viability
Legal AI is a fast-moving market and vendors are not all going to be here in three years. Before signing a multi-year contract, look at the basics.
- How is the vendor funded, and what is the public information on financial position? A pre-revenue startup with eighteen months of runway is a different procurement risk from an established platform.
- Who is on the engineering team, and how stable is the leadership? Repeated turnover at the top of a small vendor is a signal.
- What is the product roadmap, and how often does the vendor ship against it? Public release notes are a good proxy.
- If the vendor was acquired or is rumoured to be acquired, what would migration look like? Are there portability commitments in the contract?
Pre-contract checklist
- Security audit report received and read.
- Data-residency commitment confirmed in writing.
- Position on training models on customer data confirmed.
- Subprocessor list received.
- Breach notification window specified in the MSA.
- Termination for convenience and exit terms reviewed.
- Two independent references spoken to.
- Renewal/auto-renewal dates calendared.
- Implementation cost, training cost, and admin cost estimated.
- Internal owner assigned for the contract relationship.
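For firms running this checklist across several vendors at once, it can help to track it as data rather than prose. A minimal sketch, assuming nothing beyond the items listed above; the item names and statuses are illustrative, not a standard procurement schema.

```python
# Hypothetical tracker for the pre-contract checklist; one entry per item,
# True once the item is confirmed complete.
checklist = {
    "Security audit report received and read": True,
    "Data-residency commitment confirmed in writing": True,
    "Position on training models on customer data confirmed": False,
    "Subprocessor list received": True,
    "Breach notification window in the MSA": False,
    "Termination for convenience and exit terms reviewed": True,
    "Two independent references spoken to": False,
    "Renewal/auto-renewal dates calendared": True,
    "Implementation, training, and admin costs estimated": True,
    "Internal owner assigned for the contract relationship": True,
}

def outstanding(items: dict[str, bool]) -> list[str]:
    """Return the checklist items that still block signing."""
    return [name for name, done in items.items() if not done]

for item in outstanding(checklist):
    print("BLOCKING:", item)
```

The useful property is that the outstanding list, not a general impression of progress, decides whether the contract is ready to sign.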
What to do if a vendor cannot answer
An unwillingness to share a SOC 2 report under NDA, vague answers about subprocessors, or contract terms that cannot be negotiated at all are signals. A serious enterprise vendor is used to these questions and has prepared answers. A vendor that has not had to answer them yet is selling to firms who have not asked.
That is not always disqualifying — small vendors can grow into the requirements — but it changes the procurement risk. Either limit the use case (no client-confidential data, short contract term) or wait until the vendor matures.
Common mistakes
- Letting product enthusiasm in the legal team drive the contract terms. Procurement and IT should have veto on security questions.
- Running due diligence after signing the MSA. Once the contract is in place, leverage drops sharply.
- Accepting "we're working towards SOC 2" as equivalent to having SOC 2. The certification exists for a reason.
- Skipping the exit clauses because nobody is thinking about leaving yet. The cheapest time to negotiate exit is at signing.
Related reading
The AI Implementation Roadmap covers what to do after you have signed. The Legal AI Ethics Framework covers the professional-conduct dimension. To compare specific products before procurement, the comparison library and tools directory are the starting points.