Requiring Artists’ Consent Before Training AI Systems On Their Work Would “Basically Kill” The Industry In The UK – Nick Clegg

Nick Clegg, the former British Deputy Prime Minister and now a senior executive at Meta, has intensified the growing debate over intellectual property rights in artificial intelligence development by declaring that requiring artists’ consent before training AI systems on their work would “basically kill” the industry in the United Kingdom.

Clegg made the statement while promoting his new book at the Charleston Festival, where he acknowledged that the creative community should have the right to opt out of AI training data sets – but rejected calls for a consent-first approach.

“Quite a lot of voices say, ‘You can only train on my content [if you] ask first.’ And I have to say that strikes me as somewhat implausible, because these systems train on vast amounts of data,” Clegg said. “I just don’t know how you go around asking everyone. I don’t see how that would work. And by the way, if you did it in Britain and nobody else did, you would basically kill the AI industry in this country overnight.”

His comments come as British lawmakers debate legislation that would require technology companies to disclose the copyrighted works they have used to train their AI models – a push driven by concerns among artists, musicians, writers and other creators that their intellectual property is being exploited without permission or compensation.

Parliament returns to the consent amendment

The proposed amendment to the Data (Use and Access) Bill, introduced by Baroness Beeban Kidron, herself a filmmaker, would require companies to disclose the copyrighted works they use, allowing creators to protect their work under existing law. But despite vocal support from high-profile artists such as Paul McCartney, Elton John, Dua Lipa and Ian McKellen, the amendment was rejected last week by the House of Commons.

The British Secretary of State for Science, Innovation and Technology, Peter Kyle, defended the rejection, arguing that the country must avoid a regulatory environment that pits AI development against the creative industries.

“The British economy needs both sectors to succeed and thrive,” Kyle said, warning that overregulation could stifle AI growth in the United Kingdom.

However, leaders in the creative sector argue that the fundamental issue is fairness, not technological innovation. Without transparency and enforceable rights, they contend, AI developers are essentially stealing their work. Kidron wrote in an op-ed that the aim of the amendment was to give artists “visibility into what happens to their work” and to ensure that AI models “do not train in secret, behind closed doors, on content that has been created over years, even decades.”

“The fight is not over yet,” Kidron added, confirming that the bill will return to the House of Lords in early June, where efforts to revive the transparency clause are already underway.

Escalating global disputes over AI and intellectual property

Clegg’s remarks, delivered on stage at the Charleston Festival, have provoked not only lawmakers but also the wider creative community, which sees his dismissal of consent-based protections as reflecting the tech industry’s long-standing reluctance to explain how generative AI systems use copyrighted material.

This controversy is far from unique to the United Kingdom. Globally, AI developers face mounting lawsuits and public scrutiny over how their training data is sourced. Artists in the United States and Europe, and news organizations in Asia, have all raised concerns that AI models trained on copyrighted works are being commercialized without compensation or even acknowledgment.

Against this backdrop, Clegg’s assertion that the United Kingdom would be disadvantaged by adopting stronger protections is seen by many creatives as a defense of an exploitative system. For artists, the concern is not just about the principle of consent but about power: technology companies, they argue, profit from the creative work of others while shielding the inner workings of their AI models from scrutiny.

A fight far from over

As the AI industry continues to evolve at a blistering pace, the collision between innovation and creators’ rights is no longer hypothetical – it is already here. The United Kingdom, like many other countries, is being forced to choose between an approach that puts technology first and one that prioritizes transparency and fairness.

Clegg’s statement may have been intended as a pragmatic warning, but for many in the creative sector it is the latest reminder of what they see as a pattern of technology companies claiming they cannot survive if held accountable for the content they use.

The data bill’s return to the House of Lords in June will be a pivotal moment. The outcome could determine whether the United Kingdom positions itself as a leader in ethical AI development or becomes simply another jurisdiction where creators’ rights are eroded in the name of innovation.
