
AI Providers Required to Adopt Transparency Standards under New EU Regulations

AI training data disclosure now mandatory under new EU regulations. Failure to comply may incur financial penalties for AI providers.


The European Union has introduced new rules for providers of General-Purpose AI (GPAI) models, applicable from August 2, 2025. These rules, established by the EU AI Act, aim to ensure transparency, intellectual property protection, and accountability in the use of AI.

Transparency

Under the new regulations, providers of GPAI models must maintain technical documentation and disclose the use of any copyrighted material in the training data. This move promotes accountability by allowing users and regulators to verify the provenance and nature of training datasets.
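As an illustration of how such a disclosure might be kept in practice, the sketch below shows a hypothetical machine-readable record of training-data sources and their copyright status. The schema, field names, and example datasets are assumptions made for illustration; they are not the EU's official documentation template.

```python
# Hypothetical sketch of a training-data provenance record that could accompany
# a model's technical documentation. Field names and values are illustrative
# assumptions, not an official EU format.
import json
from dataclasses import dataclass, asdict

@dataclass
class TrainingDataSource:
    name: str                            # human-readable identifier of the dataset
    origin: str                          # e.g. "web crawl", "licensed corpus", "public domain"
    contains_copyrighted_material: bool  # whether copyrighted works are included
    licence_or_basis: str                # licence or legal basis relied on for use

sources = [
    TrainingDataSource("example-news-corpus", "licensed corpus", True, "publisher licence"),
    TrainingDataSource("example-public-domain-books", "public domain", False, "public domain"),
]

# Serialize to JSON so the record can be attached to the technical documentation.
print(json.dumps([asdict(s) for s in sources], indent=2))
```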

Intellectual Property Protection

The Act requires clear attribution of copyrighted training data, aiming to prevent unlawful exploitation of protected works in AI training processes. Developers must specify what measures they have taken to protect intellectual property.

Fines and Enforcement

Non-compliance can result in substantial fines. Infringements involving prohibited AI practices can attract fines of up to €35 million or 7% of global annual turnover, whichever is higher. For other AI Act provisions, including those applicable to GPAI models, fines may reach €15 million or 3% of global annual turnover, whichever is higher. The European Commission will impose fines on GPAI model providers directly, with these sanctions becoming applicable from August 2, 2026.
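Because the ceiling is the higher of a flat amount and a share of turnover, the applicable maximum depends on a provider's size. The short Python sketch below illustrates that arithmetic using the thresholds cited above; the function name and example turnover figure are illustrative, not an official calculation method.

```python
# Illustrative only: computes the upper bound of a fine under the AI Act's
# "whichever is higher" rule, using the thresholds cited in this article.
def fine_ceiling(global_turnover_eur: float, prohibited_practice: bool) -> float:
    """Return the maximum possible fine for a given worldwide annual turnover."""
    if prohibited_practice:
        flat_cap, turnover_share = 35_000_000, 0.07   # prohibited AI practices
    else:
        flat_cap, turnover_share = 15_000_000, 0.03   # other provisions, incl. GPAI models
    return max(flat_cap, turnover_share * global_turnover_eur)

if __name__ == "__main__":
    # Example: a GPAI provider with €2 billion global turnover faces a ceiling of
    # 3% of turnover (€60 million), since that exceeds the €15 million flat cap.
    print(f"{fine_ceiling(2_000_000_000, prohibited_practice=False):,.0f}")
```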

Organizations must ensure adequate AI literacy among personnel involved with AI systems to comply with Article 4 of the Act.

Enforcement and Oversight

The newly established European AI Office will be responsible for enforcing the new rules. Providers of these AI systems must disclose how their models work and what data was used for training, and providers of particularly powerful models that could pose risks to the public must document their safety measures.

Controversies

The new regulations have drawn criticism because providers are not obliged to name specific datasets, domains, or sources, a gap highlighted by the Initiative for Copyright. Google has also voiced concerns about the new rules.

Timeline

The EU AI Act has been applying in stages, with its first provisions in effect since February 2025. Obligations for GPAI models placed on the market on or after August 2, 2025 apply from that date and become enforceable by the Commission from August 2026, while models already on the market before August 2, 2025 must comply by August 2027. Private individuals can also bring claims against providers under the AI Act.

In summary, the EU AI Act enforces transparency through documentation and copyright disclosure for GPAI models, protects intellectual property by mandating clear attribution of copyrighted training data, and prescribes heavy fines (up to millions of euros or a percentage of global turnover) for non-compliance, reflecting the law’s goal to secure trust and accountability in AI deployment across the EU.

