AI Vendor Due Diligence Under EU Regulations – Compliance Verification Checklists and Contract Terms

Jan 6, 2026

AUTHOR

James A. Wondrasek

You’re procuring AI tools. SaaS HR platforms, cloud AI services, foundation model APIs. If your vendor hasn’t completed their EU AI Act compliance obligations, your organisation could be on the hook.

The confusion around who’s responsible for what creates contractual risk. Vendors claim compliance without proof. You need verification methods. This guide is part of our comprehensive EU AI Act compliance framework, where we explore practical strategies for navigating regulatory obligations.

This article gives you a systematic due diligence framework. Documentation requests, verification steps, contract negotiation tactics. It’s all here so you can protect your organisation from regulatory penalties, clarify liability allocation, and streamline procurement decisions.

We’re focusing on third-party AI vendor evaluation. Practical stuff for non-legal technical leaders. AI Act enforcement for high-risk systems begins in August 2026. Conformity assessments are mandatory for high-risk systems, and GPAI providers face transparency obligations.

What is the difference between a provider and deployer under the EU AI Act?

Your AI vendor is typically the provider. Your company is the deployer.

Providers develop or sell AI systems and bear primary compliance obligations – conformity assessment, technical documentation, risk management. Deployers use AI systems with lighter obligations – fundamental rights impact assessment, human oversight, monitoring.

The contractual implication is straightforward: obligations must be explicitly allocated based on role classification.

Fine-tuning foundation models, white-labelling vendor AI, or co-developing systems can shift you to “new provider” status—see the modifications section below.

So if you’re using a SaaS HR platform? The vendor is the provider, you’re the deployer. But integrate OpenAI’s API to create a high-risk system and you might become a “new provider” requiring conformity assessment.

Indemnification must align with roles. Vendor fails their obligations? They indemnify you. You breach deployer obligations? That’s on you.

What documentation should I request from AI vendors to verify compliance?

High-risk AI vendors must provide four things: EU Declaration of Conformity, CE marking proof, technical documentation summary, and conformity certificate if third-party assessed.

GPAI vendors like OpenAI, Anthropic, Google, or AWS Bedrock need different documentation—see the GPAI section below. Understanding foundation model service agreements is critical when these vendors also offer fine-tuning capabilities.

For quality assurance, ask for Quality Management System certification and audit reports.

If the vendor applied harmonised standards, get the list of EU-approved standards and implementation evidence. All vendors should commit to 10-year retention and market surveillance access.

The EU Declaration of Conformity is a formal legal statement. It needs to be signed by a senior vendor representative.

CE marking needs to be visibly or digitally affixed. If third-party assessed, the notified body ID number must be next to it.

Watch out for these red flags: vendor refusal to provide declarations, missing CE marking, vague “compliance in progress” claims.

Request this documentation before contract signing, not after. Once you’ve signed, your leverage disappears.
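The documentation requests above can be tracked as a simple checklist. Here’s a minimal sketch in Python — the item names are our own shorthand for the four required documents, not official AI Act terminology, and you’d mirror whatever labels your procurement process uses.

```python
# Illustrative due diligence checklist for a high-risk AI vendor.
# Item names are shorthand, not official AI Act terminology.
REQUIRED_DOCS = [
    "eu_declaration_of_conformity",
    "ce_marking_proof",
    "technical_documentation_summary",
    "conformity_certificate",  # only required if third-party assessed
]

def missing_documents(received: set[str], third_party_assessed: bool) -> list[str]:
    """Return required documents the vendor has not yet provided."""
    required = set(REQUIRED_DOCS)
    if not third_party_assessed:
        # Self-assessed vendors don't hold a notified body certificate.
        required.discard("conformity_certificate")
    return sorted(required - received)

# Example: a self-assessed vendor has sent two of the three required items.
gaps = missing_documents(
    {"eu_declaration_of_conformity", "ce_marking_proof"},
    third_party_assessed=False,
)
# gaps == ["technical_documentation_summary"]
```

Running this before contract signing gives you a concrete gap list to put in front of the vendor while you still have leverage.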

How do I verify a vendor has completed conformity assessment?

Start with CE marking. Check it’s visibly or digitally affixed with notified body ID if third-party assessed.

Request the EU Declaration of Conformity – the formal compliance statement.

For biometrics or law enforcement AI, verify the conformity certificate. Notified bodies are independent organisations designated by EU authorities to perform conformity assessments. Cross-check against the member state registry.

Confirm harmonised standards. Vendors using EU-approved standards can self-assess. Verify against the Official Journal.

Third-party assessment includes documentation review, system assessment, and testing. Upon completion, notified bodies issue CE marking authorisation. Internal assessment relies on harmonised standards creating presumption of conformity.

If the system processes personal data, check for GDPR compliance in the EU Declaration.

Third-party assessment takes 3 to 6 months. Factor this into your procurement schedules.

More red flags to watch out for: CE marking without declaration, self-assessment for biometrics when third-party is required, missing notified body ID.

What are harmonised standards and where can I find them?

Harmonised standards are EU-approved technical specifications. When vendors apply them, they create “presumption of conformity” – meaning they likely meet AI Act requirements.

The benefit is speed. Vendors can use faster internal conformity assessment instead of third-party notified body assessment.

Find them in the Official Journal of the European Union. The AI Act Service Desk maintains an updated list.

Here’s the catch. As of 2026, many standards are still being finalised. So you need to ask which harmonised standards they applied, full or partial, and cross-check against the Official Journal.

Partial application matters. Vendors applying only some standards still need third-party assessment for uncovered parts.

What if harmonised standards aren’t available? The Commission can adopt common specifications – EU-published technical benchmarks. Unlike harmonised standards, common specifications are mandatory, not optional.

Prefer vendors using harmonised standards. They enable faster, lower cost compliance. But verify implementation evidence, not just claims.

What contract clauses should I negotiate with AI vendors for compliance?

Five clauses matter.

First, provider warranties. Vendor confirms conformity assessment completed, documentation maintained, compliance monitoring active. Don’t accept generic “we comply” language. Require specific completion date and notified body name if applicable.

Second, indemnification. Vendor compensates you for penalties from their conformity failures. You indemnify them for deployer breaches. Standard clauses aren’t enough. Negotiate carve-outs for gross negligence. Consider super caps for high-risk areas.

Third, obligation allocation schedule. This is an explicit table listing provider versus deployer obligations. Provider: conformity assessment, risk management, technical documentation. Deployer: fundamental rights impact assessment, human oversight, monitoring.

Fourth, update notifications. Vendor informs you within 30 days of modifications affecting risk classification or obligations.

Fifth, documentation delivery. Vendor provides EU Declaration, CE marking proof, and technical documentation summary within 10 business days.

Include broad indemnities covering system use, IP infringement claims, and data protection breaches.

Add audit rights for SOC 2 reports, quality management documentation, and harmonised standards evidence.

Add termination rights for compliance failures and market surveillance non-compliance.

For GPAI vendors, add Model Documentation Form delivery schedules, training data transparency commitments, and copyright policy updates.
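The obligation allocation schedule from clause three is, in effect, a lookup table. A minimal sketch of that table as data — the obligation labels are illustrative shorthand, and a real contract schedule would use the exact statutory wording:

```python
# Illustrative obligation allocation schedule (clause three) as data.
# Labels are shorthand; a contract schedule would use exact AI Act wording.
ALLOCATION = {
    "conformity_assessment": "provider",
    "risk_management": "provider",
    "technical_documentation": "provider",
    "fundamental_rights_impact_assessment": "deployer",
    "human_oversight": "deployer",
    "post_deployment_monitoring": "deployer",
}

def obligations_for(role: str) -> list[str]:
    """List the obligations the schedule assigns to a given role."""
    return sorted(k for k, v in ALLOCATION.items() if v == role)

obligations_for("deployer")
# ["fundamental_rights_impact_assessment", "human_oversight",
#  "post_deployment_monitoring"]
```

Keeping the schedule in machine-readable form makes it easy to diff against vendor updates when a 30-day modification notice arrives.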

When does a deployer become a provider through modifications or integrations?

“New provider” status gets triggered by substantial modification changing the system’s purpose or capabilities, white-labelling vendor AI, fine-tuning foundation models substantially, or creating a high-risk system incorporating a GPAI model.

Consequences? Full provider obligations – conformity assessment, technical documentation, quality management system, CE marking.

You need to protect yourself contractually. Define “substantial modification.” Require vendor notification if changes trigger new provider status. Allocate conformity assessment costs.

Safe harbour: basic configuration, parameter adjustment within vendor documentation, integration without modification.

The definition of “substantial modification” remains vague. Cosmetic UI changes are safe; retraining triggers provider status.

For fine-tuning, an indicative criterion is whether training compute for modification exceeds one-third of the original model’s compute.
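That indicative criterion is easy to express as a quick arithmetic check. A sketch only — it assumes you can obtain compute figures from the vendor’s Model Documentation Form, and the one-third ratio is guidance, not a bright-line legal test:

```python
def may_trigger_new_provider(finetune_flops: float, original_flops: float) -> bool:
    """Indicative check: does fine-tuning compute exceed one-third of the
    original model's training compute? Guidance only, not legal advice."""
    if original_flops <= 0:
        raise ValueError("original training compute must be positive")
    return finetune_flops > original_flops / 3

# Example: fine-tuning at 4e24 FLOPs against a 1e25-FLOP base model.
# 4e24 exceeds one third of 1e25 (~3.3e24), so the criterion is met.
may_trigger_new_provider(4e24, 1e25)  # True
```

If the check comes back true, treat it as a prompt to run the AI Act Service Desk compliance checker and budget for conformity assessment, not as a definitive classification.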

White-labelling makes you a provider. Selling vendor AI under your brand means full provider obligations.

Integrating a foundation model API to build a high-risk commercial system makes you a provider.

Keep modification logs and configuration records. This creates a safe harbour.

Decide who pays for conformity assessment if modifications trigger provider obligations. Get this in the contract.

Here’s how it plays out: SaaS company embedding OpenAI for customer support – likely deployer. HR platform training custom hiring model – likely new provider.

For third-party HR tool high-risk assessment and vendor conformity documentation requirements, our guide on employment AI classification provides detailed edge case analysis.

Use the AI Act Service Desk compliance checker for edge cases.

What is the AI Act Service Desk and how can it help me?

The AI Act Service Desk is the official platform operated by the European AI Office providing compliance assistance.

Key tools: compliance checker, regulatory sandbox portal, direct contact to AI Office experts.

The compliance checker is an interactive questionnaire determining classification and obligations. You get instant results. Direct queries get responses in 5 to 10 business days.

Use it to clarify provider versus deployer classification, verify harmonised standards, confirm GPAI vendor registration, interpret conformity requirements.

Access: https://ai-act-service-desk.ec.europa.eu/en. Free, no registration required.

The Service Desk is the front-end. The AI Office is the regulatory authority.

The checker isn’t legally binding but it’s highly persuasive.

Consult it before finalising contracts with classification uncertainty or conflicting vendor claims.

Limitations: It’s not legal advice. Country-specific questions need member state authorities. Complex queries take time.

Workflow: Use checker first. Escalate unresolved issues. Document guidance for procurement justification.

How do I evaluate GPAI vendors (foundation model providers) differently?

GPAI vendors like OpenAI, Anthropic, Google, and AWS Bedrock face transparency obligations, not conformity assessment – unless the model is integrated into a high-risk system.

Request: Model Documentation Form covering technical specs, training process, energy consumption. Training Data Summary with dataset information. Copyright policy statement.

Verify AI Office registration. GPAI providers must register. Confirm their status.

Check Code of Practice signatory status. It’s a voluntary framework providing presumption of compliance.

Watch out for downstream provider risk. Use a GPAI API to build a high-risk system and you may become a “new provider” requiring conformity assessment.

GPAI has two tiers. “General-purpose” means broad capabilities. “Systemic risk” threshold is training exceeding 10^25 FLOPs.
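The tier boundary is a single compute threshold, so triage is trivial to automate. A sketch, assuming the vendor discloses training compute (the function name and labels are ours):

```python
# AI Act systemic-risk threshold: training compute exceeding 10^25 FLOPs.
SYSTEMIC_RISK_FLOPS = 1e25

def gpai_tier(training_flops: float) -> str:
    """Classify a general-purpose model into the two GPAI tiers."""
    return "systemic_risk" if training_flops > SYSTEMIC_RISK_FLOPS else "general_purpose"

gpai_tier(5e24)  # "general_purpose"
gpai_tier(2e25)  # "systemic_risk"
```

Systemic-risk models carry extra obligations (model evaluation, incident reporting), so the tier answer changes which documentation you should demand from the vendor.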

The Model Documentation Form has two parts – downstream provider section and authority-only section. You get the downstream section.

For Training Data Summary, check copyright transparency, synthetic data disclosure, curation documentation.

Code of Practice signatories get standardised templates, presumption of compliance, AI Office coordination.

Determine when API usage creates high-risk systems triggering conformity obligations.

Contract considerations: documentation delivery schedules, update notifications for model changes, liability allocation. Understanding vendor vs customer liability allocation and contractual indemnification for penalties is essential when negotiating GPAI vendor agreements.

Compare vendors on Code of Practice participation, documentation quality, transparency responsiveness.

FAQ Section

What happens if my vendor fails market surveillance inspection?

Market surveillance authorities can order product recalls, impose fines up to €35M or 7% of global revenue, and suspend CE marking. Your protection comes from indemnification clauses in vendor contracts allocating penalties for vendor conformity failures, termination rights for material non-compliance, and audit rights to proactively verify vendor quality management systems. Make sure you document your vendor compliance verification efforts to demonstrate good-faith deployer diligence if authorities investigate.

Can I rely on vendor self-certification or do I need third-party proof?

It depends on AI system classification and harmonised standards application. High-risk systems using full harmonised standards allow vendor internal conformity assessment with EU Declaration of Conformity. Biometric systems without standards, law enforcement AI, and Annex I safety components require third-party notified body assessment. Request a conformity certificate if third-party is required. For internal assessment, verify the harmonised standards list and EU Declaration completeness.

How do I verify a notified body is legitimate?

Cross-check the notified body name and identification number against member state notifying authority registries. Each EU country publishes designated notified bodies for AI Act conformity assessments. The AI Act Service Desk maintains a centralised list. Red flags? Body not on official registry, identification number format inconsistent, body located outside EU without member state designation.

What if harmonised standards aren’t published for my vendor’s AI system?

The vendor must comply with “common specifications” – mandatory technical benchmarks published by the European Commission – or participate in a regulatory sandbox. Common specifications compliance is mandatory, not optional like harmonised standards. Alternatively, the vendor can use draft harmonised standards but must undergo third-party notified body assessment. Verify the vendor’s compliance pathway and corresponding documentation.

Do I need separate contracts for AI Act compliance or can I amend existing agreements?

Amendment via addendum is recommended for existing vendor relationships. Include obligation allocation schedule, indemnification clauses, documentation delivery requirements, update notification triggers, audit rights, and termination provisions for material non-compliance. For new procurements, integrate AI Act clauses into master services agreement or SaaS terms. Consult legal counsel for jurisdiction-specific enforceability.

How often should I re-verify vendor compliance?

Annual verification is recommended. Request updated EU Declaration of Conformity, confirm ongoing quality management system certification, and review material system modifications. Trigger additional verification if vendor releases major updates, market surveillance actions are reported in media, vendor changes ownership, or vendor modifies risk classification claims. Contractual update notification requirements enable event-driven verification with 30-day notice.

What’s the difference between CE marking for AI Act vs other EU regulations?

AI Act CE marking indicates AI system conformity assessment completion. Products covered by existing EU regulations like machinery, medical devices, or toys may require separate CE marking for those frameworks AND AI Act if incorporating AI. Verify the vendor provides AI Act-specific EU Declaration and technical documentation, not just sectoral regulation compliance. Annex I high-risk AI systems follow sectoral regulation conformity procedures.

Can I use vendor SOC 2 reports as AI Act compliance evidence?

SOC 2 addresses security, availability, and confidentiality controls, not AI Act-specific requirements like risk management, bias mitigation, human oversight, and transparency. SOC 2 is useful for quality management system assessment but insufficient alone. Request AI Act-specific conformity documentation – EU Declaration, CE marking, technical documentation – even if the vendor provides SOC 2. They’re complementary, not substitutive.

What questions should I ask vendors claiming “AI Act compliance in progress”?

Request expected conformity assessment completion date, conformity pathway chosen, harmonised standards identified for application, notified body selected if third-party, current quality management system status, and interim risk management documentation. Negotiate contract contingencies including conformity completion deadline, penalty clauses for delays, termination rights if vendor fails assessment, and interim documentation delivery milestones. Consider delaying procurement until completion if it’s a high-risk system.

How do I handle vendors outside the EU claiming AI Act doesn’t apply?

The AI Act applies to systems placed on the EU market, used in the EU, or producing outputs used in the EU regardless of provider location. Non-EU vendors serving EU customers are subject to AI Act. Verify the vendor has an EU representative designated (required for non-EU providers), confirm conformity assessment pathway, and verify CE marking intent. Add a contract clause requiring EU representative designation and local market surveillance authority cooperation.

What’s the role of GDPR in AI vendor due diligence?

If the AI system processes personal data, the vendor must comply with both AI Act and GDPR. The EU Declaration of Conformity should explicitly reference GDPR compliance. Request Data Protection Impact Assessment separate from Fundamental Rights Impact Assessment, Data Processing Agreement, and GDPR-compliant technical documentation. Overlap areas include automated decision-making (GDPR Article 22 plus AI Act human oversight) and transparency (GDPR information rights plus AI Act use instructions).

Can regulatory sandbox participation replace conformity assessment?

Regulatory sandbox provides a controlled testing environment with supervisory authority oversight. Successful exit generates an exit report that notified bodies must consider favourably in subsequent conformity assessment, potentially streamlining the process. It doesn’t eliminate conformity assessment but creates presumption of conformity for tested aspects. Verify the vendor provides sandbox exit report, confirm testing scope covered your use case, and check supervisory authority endorsement.

For broader context on implementation context overview and navigating the full spectrum of AI Act obligations, refer to our comprehensive guide on EU AI Act compliance tensions.
