About two weeks after disclosing new details about its licensing framework for generative AI, GEMA has now unveiled an official “AI charter.”
The Berlin-based society shared the 10-principle charter today. Back in September, GEMA outlined an ambitious royalties framework for music created via generative AI, complete with derivative-track compensation for professionals whose works trained the underlying models.
That set the stage both for the late-October licensing details mentioned at the outset and for the just-published AI charter. Importantly, said charter isn't an in-depth collection of detail-oriented policy proposals. By GEMA's own description, the concise resource "shall serve as food for thought and provide guidelines for a responsible use of generative AI."
Running with the point, at least a few of the principles reiterate general ideas. “Generative AI is obligated to the well-being of people,” the first principle reads, with the second underscoring that “intellectual property rights are protected” notwithstanding the unprecedented technology at hand.
But the third principle explores in relative detail what's perhaps the most significant element of GEMA's generative AI compensation model. Not limiting its vision to one-off training payments, the entity believes creators and rightsholders should receive a piece of derivative tracks' royalties and long-term revenue to boot.
“On the contrary,” this section reads in part, “the economic advantages must be considered which arise through AI content being generated (e.g. income from subscriptions) and are achieved in the market through ensuing exploitation (e.g. as background music or AI generated music on music platforms on the internet).
“In addition, the competitive situation with the works created by people must be taken into consideration. After all, these very works made AI content possible in the first place. This also must apply in cases where synthetic data was used for training the AI. Synthetic data are, in turn, based on works created by people whose creative power continues in such content when generating AI music,” the relevant text proceeds.
The remaining principles touch on well-trodden (albeit meaningful) areas, including the need for generative AI training transparency and NIL protections against unauthorized soundalike works. (In the States, many in the industry are advocating for the related NO FAKES Act, which arrived in Congress earlier in 2024 and would establish a federal right of publicity.)
Plus, bearing in mind the ongoing implementation of the sweeping AI Act and the European Union’s unique regulatory environment, any company offering “AI systems that will be rolled out in the EU or that affect people in the EU must stick to the EU regulations,” another principle emphasizes.
Lastly, in terms of brass-tacks takeaways, sizable AI players must engage in “collective negotiations” with rightsholders, per GEMA. “The large digital corporations must find their way back to respecting copyright,” the “negotiations at eye level” principle states.
While it perhaps goes without saying, outlining plans to secure rightsholder compensation from generative AI is only the first of several involved steps. But especially because multiple artificial intelligence companies are adamant that training on protected materials constitutes fair use, it'll be worth closely monitoring the effort, alongside separate regulatory pushes and ongoing litigation.