IFPI and Others Criticize EU’s AI Act Implementation Measures

The European Commission headquarters in Brussels. Photo Credit: EmDee

The IFPI, CISAC, IMPALA, and others are taking aim at the EU’s AI Act implementation, which they’re criticizing as “a missed opportunity to provide meaningful protection of intellectual property rights.”

Those entities, alongside United Voice Artists, the European Federation of Journalists, and many more, voiced their qualms in a release emailed to DMN. We’ve been covering the voluminous AI Act for years now – including its early 2023 music-space support and its March 2024 passage in the European Parliament.

Far from sitting on the sidelines, various European organizations responded to that passage by calling for the AI Act’s “meaningful implementation.” To state the obvious, writing a huge law pertaining to an unprecedented, rapidly evolving technology is one thing; enforcing it throughout more than two dozen member states is another thing altogether.

(An implementation aside: certain countries’ slow-moving Copyright Directive rollout elicited plenty of pushback years ago.)

Enter the General-Purpose AI Code of Practice, which was published earlier in July and is described on the Commission’s website as “a voluntary tool, prepared by independent experts in a multi-stakeholder process, designed to help industry comply with the AI Act’s obligations.”

Consisting of three chapters, the Code, technically distinct from the Commission’s adjacent “guidelines on key concepts,” dedicates one section to copyright matters. Said section bills itself “as a guiding document for demonstrating compliance with the obligations provided for in Articles 53 and 55 [of the] AI Act.”

Article 53, “Obligations for providers of general-purpose AI models,” is particularly important for rightsholders. One especially significant clause therein: a requirement to “draw up and make publicly available a sufficiently detailed summary about the content used for training of the general-purpose AI model, according to a template provided by the AI Office.”

Returning to the organizations’ criticism, the brief Code of Practice doesn’t exactly have teeth. On top of the aforementioned voluntary opt-in, one section specifies that signatories shall “draw up, keep up-to-date and implement a policy to comply with Union law on copyright and related rights for all general-purpose AI models they place on the Union market.”

Of course, that shouldn’t be too much of a challenge. Nor should pledging “not to circumvent effective technological measures,” subscription models and paywalls among them, “designed to prevent” access to protected works.

Signatories should “only reproduce and extract lawfully accessible works and other protected subject matter if they use web-crawlers,” the text reads.

Additionally, signatories would agree “to exclude from their web-crawling websites that make available to the public content and which are, at the time of web-crawling, recognised as persistently and repeatedly infringing copyright.”

Other regulatory responsibilities – like establishing “technical safeguards to prevent” infringing outputs and adopting terms of service barring “copyright-infringing uses of a model” – are in many instances already the standard.

One final note on the Code’s particulars: These and other requirements “are complementary to the obligation of providers…to draw up and make publicly available sufficiently detailed summaries about the content used” for training, the EU text shows.

But the organizations underscored that their dissatisfaction also extends to the EU’s training-disclosure template. Overall, they maintain that “the final outcomes fail to address the core concerns which our sectors…have consistently raised.”

“We wish to make it clear that the outcome of these processes does not provide a meaningful implementation of the GPAI obligations under the AI Act,” the IFPI and others wrote. “We strongly reject any claim that the Code of Practice strikes a fair and workable balance or that the Template will deliver ‘sufficient’ transparency about the majority of copyright works or other subject matter used to train GenAI models. This is simply untrue and is a betrayal of the EU AI Act’s objectives.”

As for where things go from here, the organizations are calling on the Commission to “enforce Article 53 in a meaningful way” – and imploring member states as well as legislators “to challenge the unsatisfactory process of this exercise” for good measure.

