
It is impossible to "win" this legal battle.... And you should give up thinking you can.

I posted this to another thread, but I think it should be its own post.

TL;DR:

The faster people can get from denial, anger, and bargaining all the way to acceptance, the better off they'll be. There WILL be some art form that comes out of this, just as photography and CGI art came out of those technologies. There were similar concerns about synthesizers, and of course sample libraries. This is on a whole other level, but the point remains. The question is: will you have drowned in the flood as a laggard stuck in denial, or found a way to build a boat and ride the wave as long as possible? One thing you can count on: the flood is coming, and you can't stop it.



These two articles are a great example of why the courts will never rule that AI companies must only use copyright-cleared content in training, and why they will never stop the great replacement.

In these articles, their definition of "ethical" has nothing to do with asking creators for permission, or even paying them. Shutterstock and Getty, both mentioned here, have built their own AI models from their content, content made by the very people who think these companies might win the legal battle for them. But these are the companies you'll have to fight as well if you want to stop copyrighted material from being used for training.

Look at what Adobe are quoted as saying in the above article:


Hayward also noted that Adobe Stock contributors who submitted AI-generated imagery would qualify for Adobe's 'Firefly bonus', which it paid to contributors whose content was used to train the first public version of the AI model.

So when companies like Midjourney and OpenAI build massive datasets of imagery by scraping the internet for everything they can get their hands on, it's "theft" and "unethical". But when Shutterstock, Adobe, and Getty Images do it to their own content creators, that's fine?

But don't worry, if Adobe used your images to train their AI, they'll throw a few dollars at you and call it a "bonus". That will make up for losing your entire industry. We know it can't possibly be more than a few dollars because otherwise they'd have bankrupted themselves.

It's sad that artists (and in-denial composer-producers) think they'll be fought for in court, when really these companies have zero interest in fighting for them and are actively betraying them already. They care about THEMSELVES being replaced by AI, not you! They cared that someone ELSE "stole" your work to train their AI. Like Adobe, Universal Music will do the same thing. They'll talk about how much this harms artists on the one hand, then casually train their own AI on all your work, say they're the ethical ones, and expect you to be grateful. Here, have $20.

It should be even more insulting because, unlike the AI companies, these companies are the ones claiming it's theft and bad for artists, yet they'll go ahead and do the same thing to their own creators anyway and act as if it's somehow different.

Let's say you have music published with Universal Music. How much would they have to pay you to make it right if Universal Music made an AI like Udio (only better, because it's the future), trained it on every track in their collection, and you suddenly found you'd been paid a hundred bucks or so for all your work being used in the training?

The best outcome for these companies is to get the likes of OpenAI to license their content from them. That's why they're so casual about making their own models; they know the legal cases in court aren't actually going to make THEIR models unlawful.

The faster creators give up on the fantasy that they can stop this, the better off they'll be. If you worked really hard and were really successful, you might be able to marginally slow down a company or two for maybe six months. By the time these cases even reach a legal precedent, open source will have already made it impossible to go back, even if you wanted to.


---

There are plenty more reasons why the courts won't decide this the way you want them to, partly because artists and content creators are using arguments that ensure they lose. Focusing on the output itself, on how good or how close it sounds, guarantees you have to lose. This is because IF your premise is that the training data is unlawful, then the output must be entirely irrelevant. It can't make any difference how good it sounds, or how close it comes to any traditional sense of copyright infringement. The argument would have to be that no matter what it sounds like, it's ALL copyright infringement. But that's not what they're arguing!

Strike 1.

They only started caring about AI training data once it came for their industry and was good enough to be a threat. Where were they for the couple of years artists were losing their shit, saying the exact same things about image generators? That only further proves they evidently care about the output, i.e. how it actually sounds.

Strike 2.

The premise they keep setting up here, without realizing it, is one whose logical conclusion is that, legally, all these companies must do is take sufficient steps to make it impossible for their models to generate something that actually infringes copyright.

Strike 3.

I see none of them mention or account for user-uploaded image references, source images, or open source. All they do is act morally offended and talk in strong words about how it's definitely theft, while insisting that legally it's an "open question" and there's still a chance. Where are the lawsuits to ban open-source AI? Facebook is currently on a roll releasing open-source AI models, and I don't see anyone advocating lawsuits against any of it. All the LLMs are trained on uncleared copyrighted work as well.

And these corporations are the biggest, richest companies on earth, even excluding the likes of BlackRock and Vanguard (which own each other) that basically own all of them and essentially the entire stock market. The "powers" that be want AI to continue, and there's absolutely no way they'll ever restrict development or criminalize it to the degree they would have to in order to stop this from happening to artists. All the major AI companies would have to junk their models and disallow custom image inputs, open-source development would have to be outlawed, every AI generation already in use would have to count as copyright infringement by definition, and simply having an open-source model on your computer would have to be criminalized.
 
And how exactly will that make things better?
I think it's important to fight. We may not win the war, but we can at least win some battles that will limit the impact of generative AI.

I don't think you could have possibly read my post if you're asking this.
 
I don't think you could have possibly read my post if you're asking this.
Well, I stopped after this paragraph. It was quite a long post, sir. 😀

But I just skimmed it, and I fail to see where you explain why it would be good to stop fighting, other than the impossibility of winning a lawsuit. It may be impossible to ban all use of copyrighted material for AI training, but there are plenty of other battles to win, imo.

But I agree with you: corporations will never fight for the artists. The artists will have to unite and defend themselves.
 
I assume that the major record labels will lobby for laws that somehow allow them to make money off AI music indiscriminately while somehow "banning" small-time creators from doing so. I also assume the major labels will start incorporating AI clauses into their contracts, so that they own not only the masters but also the "musical likeness" of the artist.
 
IMHO, the most important legislation being proposed at present is the Generative AI Copyright Disclosure Act that requires AI companies to reveal the content they used to train their AI models.

This requirement would provide rights holders with the necessary foundational evidence to begin the process of collecting revenue from AI vendors who use their content.

It's by no means a comprehensive solution to the AI conundrum, but it's a step in the right direction. We all know it's too late to block AI companies from using copyrighted content - that horse has already left the barn. So the next best thing is for companies to reveal their sources, so appropriate royalties can be extracted.

But one problem with this bill is that it doesn't seem to have sufficient teeth. It's unclear how large the fines would be if an AI company refuses to comply. So ultimately, if the bill even passes, it might end up being the equivalent of a slap on the wrist, and most multi-billion dollar tech companies will probably be happy to accept the tradeoff to keep their content sources secret.


 
And how exactly will that make things better?
I think it's important to fight. We may not win the war, but we can at least win some battles that will limit the impact of generative AI.
It is important to fight a winnable battle. I am also of the mind that AI/robotics will take over most existing work in the end. Most work is already just a time-filling exercise and not at all useful. The countries that do well will be those that support their non-working population with something like a universal basic income. Countries that don't support their population will do very poorly indeed; they will collapse and become failed states.
Here is a modern car factory producing a car in less than two minutes. There is no stopping this, and one shouldn't; one should instead find ways to share the wealth.
 
And what will happen when A.I. composes completely novel music? And it becomes popular? Because let’s be honest, the technology is only going to get better, and advanced A.I. will decipher music in the same way we do, eventually, and it will stop sounding like a bad parrot.
 
Doomsaying is very easy and usually wrong. Human-ness is harder but will always win in the end. AI is an ogre that can ignite our worst fears, but the reality will be so far from, and in fact beyond, those fears.

We have entered the period of history where, just because we can build a technology, we have to ask "do we want to?" Maybe the birth of this era was the atomic bomb. Technology raises difficult ethical questions, but in the end human-ness decides. There was not a nuclear war.
 
It is sad to me to see how many people find it easier to see AI/robotics/job-replacement technologies as a disaster rather than as an opportunity for social transformation to an era of unparalleled wealth and social equity.
 
It is sad to me to see how many people find it easier to see AI/robotics/job-replacement technologies as a disaster rather than as an opportunity for social transformation to an era of unparalleled wealth and social equity.
It's both.

Clearly technological advances have resulted in better, faster, and more efficient ways of doing things throughout human history, and have made many human jobs and skills obsolete along the way.

This is the nature of progress, and generally speaking, it has benefitted humanity over the long run. When technology, including AI, is applied to humanity's problems, the results can be wondrous and provide significant improvements to the quality of our lives. Great.

But when did the creation of music become a problem that needed an AI solution? Were we running out of musical ideas? Were composers not writing advertising jingles fast enough? Are film directors and producers tired of having to deal with temperamental composers? Are fans and audiences bored with human performers?

Sadly, the true answer comes down to plain old GREED. If an AI company can provide unlimited "new" music to commercial clients and fans, then they are in a position to earn all the profits normally associated with that music.

Spotify loves their new AI-generated playlists, because when listeners stream them, Spotify doesn't have to pay royalties to anyone. Think about all the people and businesses that get shorted when a single AI song is generated: songwriters, musicians, recording, mixing, and mastering engineers, graphic artists and photographers, and even instrument makers and sample library and plugin developers.

It wouldn't be so horrible if the impacted human jobs were mindless, menial, and undesirable. But that's not the case. These were never jobs and tasks that were crying out for replacement or in need of improvement. They're being threatened simply because of blind greed.

Someone figured out that if they can develop an AI tech that can generate music on par with human creators, then they can keep all the money associated with the production and distribution of that music for themselves. Starve out the artists and talent, and all you're left with are monolithic organizations that own and control all creative product. That's not a world I want to live in.

AI is not the villain here. It's the greedy tech giants that want to leverage AI to corner creative markets for their own selfish monetary gain.
 
It's both.

AI is not the villain here. It's the greedy tech giants that want to leverage AI to corner creative markets for their own selfish monetary gain.
I agree to a point, but AI will not stop music-making; it will just take over a whole lot of paid production. It won't replace live performance, for example, and it won't really replace composition outside of certain types of music production as commodity.

My point is that, since AI in music is just one small component of a larger displacement of jobs, it behoves us, as people and not only as musicians, to address this oncoming displacement at a societal level. By doing so we will allow musicians and other artists to continue producing their art in forms that may be rewarded by public acclaim, if not by direct payment. Payment will have to come at the societal level; the example I gave is universal basic income, but there are other ways.
 
To those thinking that AI will be "an opportunity for social transformation to an era of unparalleled wealth and social equity":
- The productivity of work increased 50 times between 1870 and 1970, and first-world workers did benefit from that, but until the post-war Keynesian period workers were still violently exploited (the situation was even worse in the colonies at the time).
- Between 1970 and now, productivity increased 4 times. Some people in the '60s were so naive that they thought that by the 2000s we would have flying cars and nobody would work anymore.

Are we 4 times richer than in 1970? Are we working less?
No, because all this wealth went into the hands of a small group of people. Because it's a political and economic problem, not a technological one.

And this time, we have destroyed our ecosystem in a way that hasn't happened since the great extinction 65 million years ago.

So there is no way AI will bring anything good, because it will destroy the work and value of the many for the benefit of the few.
If you have any pride and self-esteem, you can't just let it happen without a fight.

Those calling for acceptance are just the useful idiots of capitalism...
 
I agree to a point, but AI will not stop music-making; it will just take over a whole lot of paid production. It won't replace live performance, for example, and it won't really replace composition outside of certain types of music production as commodity.

My point is that, since AI in music is just one small component of a larger displacement of jobs, it behoves us, as people and not only as musicians, to address this oncoming displacement at a societal level. By doing so we will allow musicians and other artists to continue producing their art in forms that may be rewarded by public acclaim, if not by direct payment. Payment will have to come at the societal level; the example I gave is universal basic income, but there are other ways.
Well, that certainly sounds like an idealized vision of a utopian society, and it may be a worthy long-range goal. But in the meantime, those of us who are trying to push back against, or at least regulate, generative AI are doing so because:

A) creative jobs are being obviated for no justifiable reason other than that greedy tech giants want in on the profits;

B) placing any art form under centralized monolithic control squashes opportunities for cultural diversity and expansion;

C) I don't want to live in a world where the music I stream, hear in movies and TV, and even in commercials, is generated by some non-musician pressing a button at a computer.

Generative AI has nothing to do with individual creative pursuits. I'll go on making music like I always have, regardless of AI. It does, of course, threaten my income as a professional composer, but beyond myself, the bigger issue is that once AI-generated music seeps into all of the normal outlets for music in our world, we'll all be forced to listen to it.

In terms of live performance... remember Gorillaz? Basically 2 musicians who worked behind the scenes while the band's persona was represented by a quartet of 2D animated characters. Their live shows were basically the cartoon band projected on a screen, while the musicians remained hidden. With today's improved 3D and holographic technology, it's not a stretch to imagine an AI "artist" playing live shows in a similar manner. I feel sorry for today's younger generation.
 
To those thinking that AI will be "an opportunity for social transformation to an era of unparalleled wealth and social equity":

Those calling for acceptance are just the useful idiots of capitalism...
As I point out, there is a mechanism (UBI), and there is a movement to implement it. Support it or not; accept that capitalism has won, or not. Strangely enough, the US (and the West) are not the only game in town.
 