That's a blanket prediction.
I'm going to assume you read the post on the court cases I directed you to.
Ed said:
Because even if they did, which they won't, they'd have to also ban people from using their own reference/input material.
It's unclear how you reached that conclusion, but: no.
If you don't understand how I reached this conclusion then you're not very experienced using AI.
Using a reference means I can "steal" any work I want to in a similar way. If that's what you're trying to stop, then you'll have failed.
In counterpoint to your arguments, courts have ruled that copyright infringement is illegal, and companies regularly force people who have made illegal copies to take them down, or get sued for infringement. Case in point: YouTube.
How is that a counterpoint?
When did I deny copyright infringement is illegal? The whole point about the legal issue is that this can't possibly be called copyright infringement in the traditional sense. Or do you actually think AI works by having a large database of material that it just edits together like a robot Frankenstein Photoshopper?
Like it or not, this isn't the same, and that's why they need to establish a whole new standard.
Ed said:
And even if you even managed to get that, which you won't, Open Source models mean that anyone can train their own models on absolutely anything they want, and easily share them.
People do illegal things all the time. But that doesn't mean the behavior is tolerated.
For example, plenty of people visit Warez sites to get cracked VSTis. Should we then not bother purchasing legal copies of things?
Why even use that example and not a musical one? Because you know it's different.
The equivalent to what you said is Napster, which is very clearly copyright infringement.
Copyright infringement doesn't give a single shit about someone trying to get close to someone's style and using them as a "reference".
You can't even copyright a rhythm, or chord sequence.
You can have everyone convinced that Hans Zimmer made some music when he didn't, and in theory even make them think it's an actual track from one of his film scores; so long as it doesn't use the same tune etc. etc., it's protected. You can't copyright a style, and if we could, do you have any idea how much art and music would be breaching copyright? Even if everyone agrees that something is certainly a rip-off, it doesn't matter. That's not the criteria. The only reason intent becomes relevant is after the work is shown to sound close enough in very particular ways.
It's been observed that limiting the training sets to copyright-free materials makes quite a bit of difference on the quality.
You really aren't paying attention. Companies you think are on your side are making their own AIs and have no fear in doing so, and are always described as the "ethical" ones. That is despite them training on their own content creators' work, apparently without asking, and if those creators are lucky they'll be thrown a few dollars as a "bonus", as Adobe put it. These are the people you think have a chance to stop this. They aren't trying to get the outcome you're hoping for.
Even if they succeeded in stopping the main companies, how does that help you? The logical outcome is that big companies like Getty, Shutterstock, Adobe, Universal Music etc. will just license their catalogues to OpenAI, at a fraction of what the artists would need to ever make up for it. Open Source is still there, and the very people who publish and distribute your work have made their own AI, trained on everything you DIDN'T want trained on.
Either way you end up in the same place you feared, just with slightly different wallpaper.
You wouldn't "ban AI", you'd ban works that are the result of illegal use of AI.
You actually think this is practical? Have you put any thought into what that would actually mean at this point?