Two EU regulations now create binding supply chain transparency obligations for any company shipping AI-augmented products into the EU market. The EU AI Act (Regulation (EU) 2024/1689) covers AI systems and general-purpose AI models. The Cyber Resilience Act (Regulation (EU) 2024/2847) covers all products with digital elements — which in practice means virtually every commercial software product with a network connection.
Here’s the useful bit: these regulations give you concrete, externally validated grounds to demand specific documentation from your AI vendors. The training data summary, the copyright policy, the SBOM — these are legal requirements your vendor is either meeting or not. That’s a much stronger position than asking nicely.
And even if your company isn’t based in the EU, you’re probably still in scope. Both instruments have extraterritorial reach. If your product touches the EU market, this applies to you.
For the broader context on AI supply chain licensing risk, see the overview article. This one focuses on what both regulations actually require — and how to use those requirements in your next vendor conversation.
Why do the EU AI Act and Cyber Resilience Act matter even if your company is not based in the EU?
Both regulations apply based on where products reach the market, not where the company is headquartered. Non-EU companies placing GPAI models on the EU market must comply and must appoint an authorised representative in the EU. The CRA carries fines of up to 2.5% of worldwide annual turnover. The EU AI Act adds fines of up to 3% of global annual revenue for GPAI violations. These are not small numbers.
Even if you serve no EU customers today, your vendors may. Their compliance obligations create documentation requirements that flow into your supply chain regardless of where you are. You’ll feel it eventually.
The US is heading in the same direction anyway: Executive Order 14028 mandated SBOMs for federal software vendors, and CISA updated its minimum SBOM elements guidance in 2025. The destination is the same. The EU is just moving faster.
Enforcement timeline: GPAI obligations under the EU AI Act became effective 2 August 2025. CRA reporting obligations begin September 2026. Full CRA compliance is required by December 2027.
What is a GPAI model and why does it probably apply to the models you are already using?
GPAI stands for General-Purpose AI — the EU AI Act’s regulatory category for what most people call a foundation model or large language model.
In practice, models trained with more than 10²³ FLOPs that can generate language, images, or video are treated as GPAI-scoped. That covers GPT-4, Llama, Mistral, DeepSeek, Qwen, Claude, and effectively every commercially available foundation model. If you are using any off-the-shelf LLM in your product, you are working with GPAI-scoped technology.
There’s a higher tier — GPAI with Systemic Risk (GPAISR) — for frontier models above 10²⁵ FLOPs with enhanced obligations. For most downstream product teams, those obligations fall primarily on the model provider, not you.
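The two thresholds can be read as a simple decision rule. The sketch below encodes the indicative compute tiers described above; the function name and API are illustrative, not anything defined by the regulation.

```python
# Illustrative sketch: classify a model against the EU AI Act's indicative
# compute thresholds. The thresholds come from the text above; the function
# and tier labels are hypothetical conveniences.

GPAI_THRESHOLD_FLOP = 1e23    # indicative GPAI-scope threshold (10^23 FLOPs)
SYSTEMIC_RISK_FLOP = 1e25     # presumption of systemic risk (10^25 FLOPs)

def classify_gpai_tier(training_flop: float) -> str:
    """Return a rough regulatory tier for a model's training compute."""
    if training_flop >= SYSTEMIC_RISK_FLOP:
        return "GPAI with systemic risk (GPAISR)"
    if training_flop >= GPAI_THRESHOLD_FLOP:
        return "GPAI-scoped"
    return "below indicative GPAI threshold"

print(classify_gpai_tier(5e24))  # a large commercial foundation model
print(classify_gpai_tier(3e22))  # a small specialised model
```

Note that compute is only an indicative criterion; the model must also be capable of generating language, images, or video to fall within scope.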
The category that matters most for your business is “downstream provider” — companies integrating GPAI models into their own products. What your upstream GPAI providers owe you as a downstream operator is where the practical leverage lies.
The GPAI Code of Practice, published in July 2025, is the compliance reference. OpenAI, Anthropic, Google, and Mistral have all committed to it. That commitment creates the basis for requesting the specific documentation the Code requires.
For a breakdown of individual model licence terms and GPAI classification implications, see the model licence comparison.
What does the EU AI Act open-source exemption actually require and why do most commercial deployments not qualify?
The EU AI Act provides a partial exemption from Article 53 for open-source GPAI models. But it’s not automatic. The model must satisfy all three conditions simultaneously: public model weights, a recognised free and open-source licence, and no monetisation of the model.
That third condition is where most commercial deployments fall apart. Monetisation includes charging for access, bundling with paid services, and collecting user data as a condition of access. “Research-only” or “no-commercial-use” licences do not qualify as FOSS for EU AI Act purposes.
OLMo (AI2) is the canonical qualifying example — fully open weights, genuinely permissive licence, non-commercial entity. Llama (Meta) does not qualify for commercial deployers. Meta’s custom licence imposes commercial use restrictions that take it outside the open-source definition. If you are building a commercial product on Llama, you are a downstream provider with full GPAI obligations.
One more thing the exemption doesn’t give you: even models that do qualify remain subject to Article 53(1c) (copyright policy) and Article 53(1d) (training data summary). The exemption is partial, not total.
This is where permissive-washing becomes a regulatory risk. A model carrying an Apache 2.0 or MIT label in repository metadata may have entirely separate commercial use restrictions in the actual licence text. Research analysing 760,460 ML models on Hugging Face found only 37.8% declared licensing information in a standardised, machine-readable way. Repository labels are not a reliable basis for assessing open-source exemption eligibility.
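A first-pass screen for permissive-washing can be automated. The sketch below is a crude heuristic under stated assumptions: it flags repositories whose declared label claims a permissive licence while the full licence text contains common restriction phrases. The phrase list and function are illustrative only and no substitute for legal review of the actual licence.

```python
# Illustrative heuristic, not a legal test: a declared "permissive"
# repository label is checked against the full licence text for common
# restriction phrases. Both lists are hypothetical examples.

PERMISSIVE_LABELS = {"apache-2.0", "mit", "bsd-3-clause"}
RESTRICTION_PHRASES = (
    "non-commercial",
    "research purposes only",
    "additional commercial terms",
    "monthly active users",  # e.g. user-count gates in custom licences
)

def flag_permissive_washing(declared_label: str, licence_text: str) -> bool:
    """True if a permissive label coexists with restrictive licence language."""
    if declared_label.lower() not in PERMISSIVE_LABELS:
        return False  # label does not claim to be permissive
    text = licence_text.lower()
    return any(phrase in text for phrase in RESTRICTION_PHRASES)

print(flag_permissive_washing(
    "apache-2.0",
    "Licensed for research purposes only. No commercial use permitted.",
))
```

A True result means the repository label and the licence text disagree, which is exactly the mismatch that undermines open-source exemption claims.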
For per-model exemption eligibility, see the model licence comparison. For the broader AI supply-chain licensing picture this regulation sits within, see the open AI supply-chain licensing risk overview.
What are Article 53 copyright policy and training data summary requirements and what can you demand from vendors?
Article 53 applies regardless of whether a model qualifies for the open-source exemption. Full stop.
Article 53(1c) requires GPAI providers to implement and document a copyright policy, specifically one demonstrating compliance with the text and data mining (TDM) opt-out regime in Article 4 of the EU Copyright Directive, including respecting robots.txt-based opt-outs.
Article 53(1d) requires GPAI providers to publish a training data summary: a mandatory public document describing the data sources used at each training stage and what copyright compliance measures were taken. The European Commission published a template for this in July 2025.
So here’s what you ask your upstream GPAI vendor for: (1) the copyright policy referencing TDM Article 4 compliance, and (2) the training data summary following the Commission template. If a vendor cannot produce these, they are either non-compliant or operating outside GPAI scope. Ask them which one it is.
The AI Bill of Materials operationalises Article 53(1d) — extending the training data summary into machine-readable form with full data lineage and licence chain documentation. For a full explanation, see the AI BOM article.
When does fine-tuning an open model trigger EU AI Act provider obligations?
Fine-tuning is the activity most likely to convert your company from a downstream provider into a GPAI provider with full Article 53 obligations.
The trigger is the one-third compute rule. If the compute used for your modification exceeds one-third of the compute originally used to train the base model, you are presumed to have become a GPAI provider.
The good news: standard LoRA and PEFT fine-tuning typically does not approach this threshold. The AI Office “currently expects only few modifiers to become GPAI model providers.” The modifications more likely to trigger provider status are further pre-training on large datasets, large-scale continued training, and model distillation. RAG, custom system prompts, and hyperparameter adjustment do not trigger reclassification.
If fine-tuning does trigger provider status, your Article 53 obligations are limited to the modifications you made — not the entire upstream model’s compliance history.
Track your fine-tuning compute relative to base model training compute and treat this as a compliance input. For integrating this into your engineering workflow, see the engineering workflow article.
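The one-third rule reduces to a single comparison, which makes it easy to track as a compliance input. The sketch below uses hypothetical placeholder figures; the function name is illustrative.

```python
# Sketch of the one-third compute rule described above: compare the compute
# used for a modification against one third of the base model's original
# training compute. All figures are hypothetical placeholders.

ONE_THIRD = 1.0 / 3.0

def presumed_gpai_provider(modification_flop: float,
                           base_training_flop: float) -> bool:
    """True if modification compute exceeds one third of base training compute."""
    return modification_flop > base_training_flop * ONE_THIRD

base = 2e24          # hypothetical base model training compute (FLOPs)
lora_run = 5e20      # typical LoRA-scale fine-tune: far below the threshold
big_pretrain = 9e23  # large continued pre-training run

print(presumed_gpai_provider(lora_run, base))      # False
print(presumed_gpai_provider(big_pretrain, base))  # True
```

The catch in practice is the denominator: base model training compute is often not disclosed, which is itself a documentation item to request from your upstream provider.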
What does the Cyber Resilience Act SBOM mandate mean for AI components in your products?
The CRA requires all products with digital elements placed on the EU market to include an SBOM in SPDX or CycloneDX format, retained for at least ten years (Article 13). That obligation affects vendor contracts and internal archiving infrastructure.
In practice, virtually all commercial software with a network connection qualifies. An AI model embedded in a commercial product is a software component subject to CRA SBOM requirements — foundation models, fine-tuned variants, and their dependencies all belong in the product SBOM.
Beyond the SBOM, the CRA requires secure-by-design engineering standards, conformity assessments, CE marking, vulnerability management, and reporting of severe security incidents to ENISA within 24 hours.
Enforcement timeline: Chapter IV conformity notification obligations apply from June 2026. Vulnerability reporting begins 11 September 2026. Full compliance is required by 11 December 2027. Building SBOM generation and retention processes requires pipeline changes across your engineering organisation. Starting now means having working systems in place before the September 2026 reporting window.
On format: SPDX (ISO/IEC 5962) has broad tooling support and dedicated AI profiles in SPDX 3.0. CycloneDX (OWASP) has native ML-BOM support and stronger CI/CD integration. Both are accepted under CRA guidance.
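To make the format choice concrete, the sketch below assembles a minimal CycloneDX-style BOM entry for an AI model component. Field names follow the CycloneDX JSON schema as we understand it (the "machine-learning-model" component type was introduced in CycloneDX 1.5); the model name, version, and licence are hypothetical placeholders, and a production BOM would carry far more metadata.

```python
import json

# Minimal sketch of a CycloneDX-style BOM with one AI model component.
# Schema fields per CycloneDX 1.5 as we understand it; component values
# are hypothetical placeholders.

bom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "machine-learning-model",  # ML-BOM component type
            "name": "example-base-model",      # placeholder model name
            "version": "1.0",
            "licenses": [{"license": {"id": "Apache-2.0"}}],
        }
    ],
}

print(json.dumps(bom, indent=2))
```

Whichever format you choose, the CRA's ten-year retention clock starts when the product is placed on the market, so the generated documents need durable storage, not just CI artefacts.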
For what AI-specific component documentation looks like, see the AI BOM article.
What should you demand from AI vendors in your next procurement conversation?
These regulations convert vague best-practice expectations into documented, enforceable vendor requirements. Here’s your checklist.
1. Training data summary (Article 53(1d), EU AI Act)
Ask for the public training data summary covering datasets used at each training stage and copyright compliance measures. Reference the European Commission template. If the vendor cannot produce it, ask whether they are non-compliant or outside GPAI scope.
2. Copyright policy (Article 53(1c), EU AI Act)
Ask for the written copyright policy referencing TDM Article 4 compliance, including the approach to robots.txt. This applies to open-source providers too — the exemption does not remove this obligation.
3. SBOM for AI components (CRA Article 13)
Require an SBOM in SPDX or CycloneDX format with the CRA’s ten-year retention requirement in your contract. For AI components, the SBOM should extend to an AI BOM covering training data sources, fine-tuning history, and licence chains.
4. Full licence text (permissive-washing risk mitigation)
Request the complete licence text — not just the repository label. This applies to the model licence, any training data licences, and licences for any fine-tuned variants.
5. Open-source exemption status confirmation
Ask the vendor to confirm in writing whether their model qualifies for the open-source exemption — specifically the three-part test: public weights, FOSS licence, no monetisation.
6. Incident notification commitment (CRA Chapter IV)
Require a contractual commitment to notify downstream users of severe security incidents. The CRA requires initial notification to CSIRT and ENISA within 24 hours — your contract should flow that obligation into your vendor relationship.
7. Vulnerability management and patching cadence
Require a defined support period and patching commitment for AI model components. AI models are not always versioned or supported with defined lifecycles. A contractual commitment protects you from silent model deprecations.
The AI Bill of Materials consolidates items 1 through 4 into a single machine-readable document. Requesting an AI BOM is the most efficient way to ask for all four at once.
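For teams running many vendor assessments, the checklist above can be encoded as a machine-checkable record. The sketch below is a minimal illustration: the item identifiers and the example vendor data are hypothetical, chosen to mirror the seven items in the text.

```python
# Sketch: the seven-item procurement checklist as a machine-checkable
# vendor record. Item identifiers and example data are hypothetical.

CHECKLIST = [
    "training_data_summary",         # 1. Art. 53(1d), EU AI Act
    "copyright_policy",              # 2. Art. 53(1c), EU AI Act
    "sbom",                          # 3. CRA Art. 13
    "full_licence_text",             # 4. permissive-washing mitigation
    "exemption_status_confirmed",    # 5. three-part open-source test
    "incident_notification_clause",  # 6. CRA Chapter IV flow-down
    "patching_commitment",           # 7. support period and cadence
]

def missing_items(vendor_docs: set[str]) -> list[str]:
    """Return checklist items the vendor has not yet evidenced."""
    return [item for item in CHECKLIST if item not in vendor_docs]

vendor = {"training_data_summary", "sbom", "full_licence_text"}
print(missing_items(vendor))
```

Anything in the returned list is an open procurement question; an empty list means the vendor has evidenced all seven items.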
For non-EU companies: apply these requirements as your procurement baseline regardless. Your EU-based customers may require this documentation from you downstream.
Frequently asked questions
Does the EU AI Act apply to companies outside the European Union?
Yes. It applies to any company whose AI systems or GPAI models reach the EU market, regardless of headquarters. If your product uses AI and has EU customers, you are within scope. Non-EU GPAI model providers must also appoint an authorised representative in the EU, unless the open-source exemption applies.
What is the difference between the EU AI Act and the Cyber Resilience Act?
The EU AI Act regulates AI systems and general-purpose AI models specifically — transparency, documentation, and risk classification. The CRA regulates all products with digital elements — SBOMs, secure-by-design engineering, vulnerability management, and CE marking. Both apply simultaneously to AI-augmented commercial software.
What happens if a vendor cannot provide a training data summary?
They are either non-compliant with the EU AI Act or their model does not fall within GPAI scope. Treat the inability to produce this document as a red flag in your procurement assessment.
What is the GPAI Code of Practice and is it mandatory?
It is a voluntary compliance framework published by the European Commission in July 2025, translating Article 53 into actionable chapters covering transparency, copyright, and safety. Providers who do not sign it must still meet the underlying obligations through other means.
Which SBOM format should we use for CRA compliance — SPDX or CycloneDX?
Both are accepted under CRA guidance. SPDX has strong licensing and provenance metadata and dedicated AI profiles in SPDX 3.0. CycloneDX has native ML-BOM support and stronger CI/CD integration. For AI-heavy products, CycloneDX may offer more granular component representation.
Does using an open-source AI model exempt my company from EU AI Act obligations?
Not automatically. The open-source exemption requires public weights, a recognised FOSS licence, and no commercial monetisation. Most commercial deployments do not satisfy the monetisation condition. Even qualifying models remain subject to copyright policy and training data summary obligations.
What is the one-third compute rule for fine-tuning under the EU AI Act?
If your fine-tuning uses compute exceeding one-third of the base model’s original training compute, you are presumed reclassified as a GPAI provider with full Article 53 obligations. Standard LoRA and PEFT fine-tuning typically does not reach this threshold.
What is the CRA’s 10-year SBOM retention requirement?
The CRA requires SBOM documentation to be retained for at least 10 years after a product is placed on the market (Article 13). This affects vendor contracts, product delivery requirements, and internal archiving infrastructure.
When do the EU AI Act and CRA enforcement deadlines overlap?
GPAI provider obligations became effective 2 August 2025. CRA Chapter IV conformity notification obligations apply June 2026. CRA incident reporting begins 11 September 2026. Full CRA compliance is required by 11 December 2027. September 2026 is your first hard deadline.
What is an AI Bill of Materials and how does it relate to the CRA SBOM mandate?
An AI BOM extends traditional SBOM concepts to model provenance, training data lineage, fine-tuning history, and licence chains. SPDX 3.0’s AI profile and CycloneDX’s ML-BOM type both support AI BOM documentation. The AI BOM is the natural mechanism for meeting CRA SBOM requirements for AI components while operationalising Article 53(1d) obligations.
Can I use EU AI Act requirements as procurement leverage even if my company is not in the EU?
Yes. The regulations provide a concrete, externally validated framework for demanding vendor transparency regardless of your domicile. Your EU-based customers may require this documentation from you downstream — using these requirements as your procurement baseline means you are ready when they ask.
This article is part of our complete open AI supply-chain licensing risk series. The series covers the full landscape from foundational misconceptions through procurement tools and operational implementation — if you are still mapping your exposure, that is the place to start.