AI - Next steps we can take

It is coming down to us (human musicians) vs. them (corporate technocrats).

Even leaving all the philosophical questions about quality and morality out the door for now, I think there should be a practical, unified push against this disaster.

If the legal copyright issue gets resolved and AI music can be copyrighted by the company owning the algo, we have no future as working professionals.

We should take some kind of initiative here on VI Control, a petition of some kind???


These sites need to be shut down, at least until they make public all their training sources, compensate the creators somehow, and create a way for us to consent (or not) to companies using our work for training.
 
Regarding what we should do, I think this is a follow-the-money situation. As I see it, AI-music companies have two categories of customers to target: One is regular people who want to make songs at home, the second is music-for-media companies who can save money by not hiring composers.

My guess is companies like Suno or Udio are thinking they’re going to make most of their money from hobbyists who want to create their own songs. Millions of people buying $10/month subscriptions. My thinking is that if we can keep them focused on that segment of the marketplace, they won’t be as incentivized to look at us. Don’t get me wrong, I’m sure they’d love to cash in on the music for media marketplace, too, but it’s small potatoes compared to the general market, so it’s not likely to be an early priority.

More to the point of this thread, I think we can make it even less attractive to them, namely by making it less profitable for them. How? Consider that there are two ways they (or we) get paid for music in media - upfront fees and backend royalties.

Upfront fees - Obviously the higher they are, the better (for us). If AI charges $1 to score a commercial, then a client is likely to give it a try. Whereas if they charge a grand or two, clients could still save money by going that route, but if the savings aren’t as huge, they’ll be thinking, “Mike is such a nice guy, even with his bad jokes, plus he always comes up with ideas and directions we wouldn’t have thought of.” Thinking-outside-the-box and all that. So I would at least have a chance at that gig.

Also, remember that we don’t just sell music, we sell peace of mind. Producers don’t want to spend hours going through library (or AI) tracks. You’d think they would always prioritize saving money, of course, but unless a project is ultra low budget, producers value their time. Music is a proportionately small part of the budget, so it’s not hard for them to justify a few grand to make their lives easier.

How can we incentivize AI companies to keep their music-for-media fees high? I think it can be done, although not by direct means. Rather, we can do it indirectly if we can take away their backend royalties. In other words, if they can’t collect backend from ASCAP/BMI, then the only money they make is on the front end. And since there aren’t that many films/TV-shows/games in production, they can’t make a profit if they’re only charging $10 per project. There’s not enough volume, so to make the market worthwhile, they need to charge a decent amount per show/film/spot.

My guess is ASCAP will be on our side with this, since ASCAP is … us! (Songwriters and composers.) Their leadership is on the older side, though, so they may be unaware of the AI threat. So informing them is probably the first thing to do. I imagine someone has already thought this same thing, of course, so hopefully that process has started. But if anyone has connections, it wouldn’t hurt to reach out. (I may reach out, but my ASCAP checks don’t carry the clout they used to.) From there, it won’t be hard to get them to agree that only humans should collect royalties.

BMI is more of a concern, now that it’s commercially owned. (I don’t trust those mo-fos. This is awfully coincidental timing for the venture guys to come in.) If BMI decided to accept AI “composers,” the membership might complain, but even if they leave and go to ASCAP, BMI could decide to go full AI-money-grab, and be the safe haven for production companies that use AI and want to collect PRO royalties from it.

That would be a huge problem, but it’s against the spirit of our U.S. PRO statutes, where there are government sweetheart deals carved into law for ASCAP and BMI, so government intervention may be able to prevent them from going rogue. Adam Schiff already presented an AI bill recently, although I don’t know what it says. (I’m guessing it’s focused on writers and maybe actors?) He represents part of L.A., so he’s obviously already on our side, which means we have friendly ears in Congress. Here’s where having an attorney on board could be useful, so we’d know what the government can actually do. Unlike actors and writers, there are already laws on the books for us, so we have an advantage there.

Assuming we can get full support of the PROs, then we can probably leverage music libraries as well, since backend royalties are a large part of how they get paid. It’s not like music libraries are desperate for more music anyway, so it wouldn’t take much to tip them into a “No AI” stance. In fact, I don’t think the current libraries (at least the big ones) want this business disrupted, anyway, since cheaper music is bad for everyone, including them. So I suspect they’re already friendly to us.

No doubt there would be upstart libraries that are largely or entirely AI generated. As long as we can keep the PROs from paying backend, though, these guys will only be appealing to small productions, which are kinda already lost. For instance, some home improvement show on HGTV may already be direct licensed, so a lot of these battles were lost even before AI entered the picture. (A disturbing number of shows, and even a few cable networks, pay no royalties at all.)

There might still be individual composers who cheat, and start cranking out tracks by the bucketload, but if an environment is created where that sort of thing is considered bad, then I imagine rules could be put in place to discourage that. Libraries could require DAW files as proof of composition, for instance.

The bottom line is that if we can keep ASCAP/BMI/SESAC on our side, I think we can make this an unattractive market for AI. Music for media is a tiny industry to begin with. (I doubt Suno has started putting together a team of sales people to market to Paramount.) Even at its most optimistic, we’re small potatoes compared to what I believe is their real goal of selling zillions of monthly subscriptions to wannabe songwriters around the world.
 
On Monday I'm scheduled to golf with two people at BMI... one is fairly high up and the other does a lot of lobbying for BMI in Washington. I'm not sure I know enough about all this to even know what to ask but if you or anyone had very specific questions that would be relatively easy (and short) to answer I could see what they had to say.
 
Udio's terms state:
Due to the nature of artificial intelligence and machine learning, your Output may not be unique and the Services may generate the same or similar output for a third party. Other users may provide similar input to the Services and receive the same output from the Services.

I don’t imagine you can copyright a track that another user will receive and also want to copyright. It’ll just be a mess because ultimately, who owns it?

Studios want to own the music created for their films. I guess it’s why they’re not into AI music.

Even with the AAA production music libraries: like the studios, a lot of the catalogues are owned by behemoth TV production companies and networks who maintain 100% of the publishing. I can’t see them ditching all that valuable IP for AI music they can’t own and repeatedly exploit.

Endemol, Banijay, Sky TV in Europe, ITV in the UK and even the BBC have catalogues of library music on Universal, Extreme Music etc. Collectively they have hundreds of thousands of hours of entertainment to fill with music, and they have top-shelf tracks written for them every day, ready to be used for their own content and worldwide for anyone else, with zero legal issues down the line. AI music doesn’t really make sense at that level either. Also, the majority of library clients don’t have time to engage in creating tracks. They want to scrub through, audition and use.

It can damage the royalty free market though, serve indie productions, and does seem geared to a hobbyist subscription model. I can see ad execs collectively making a track for their campaign and high fiving each other for their genius prompt “female 90s alternative pop artist singing about sanitary towels.”

Honestly it’d be way cooler if composers could achieve the kind of realism with their own music using that tech. But again, I don’t see it being that ethical and I’d prefer live musicians any day.
 
There is some interesting discussion about music, AI & copyrights in this week's episode of the All-In Podcast. Starting from 45:41

 
That would be really interesting to hear what they might have to say. Surely they've thought about this already, but you never know.

Personally, my goal is to get them (BMI/ASCAP, not just your friends) to take a proactive stance, and at least publicly say they're not going to allow AI to take composer credits. My fear, though, is that with BMI (and possibly ASCAP as well), I could see them putting their heads in the sand and taking a see-no-evil stance. "There's not much we can do!" is the lazy answer we so often hear.

In their defense, even if they did make a no-AI rule, it's not like it would be easy for them to filter out AI compositions, since nobody's actually going to list the composer as "AI." (They'll probably list the composer as an actual human - namely whoever owns the library company.)

It's not that hard to make the rule work, though. Or at least make it mostly work. (If a few AI tracks sneak through, that's not nearly as bad as a free for all.)

Funny, it's easy to lay out goals, but you obviously can't come in guns-a-blazin' with a bunch of demands at a friendly golf game. :grin:

So I'm thinking maybe you don't necessarily have a specific question for them, but rather you mention that you're feeling nervous about what AI could do to your income. Maybe even say you saw a Udio cue posted that you think was lifted ("learned") from one of your Survivor cues, so this has got you spooked that AI really can take your job.

That puts the ball in their court. (Or putter in their hand, or whatever the golf-equivalent metaphor would be.) You're a friend, just talking about what's happening in your life, so it's on them to have a response. They won't want to look like a jerk who has nothing for you.

If they have nothing, and if you don't want to push it, it might be nice if they'd at least be open to one of us getting in contact with them later about it.
 
This is an important topic which will likely be an evolving conversation for months or years to come, so it would be good to keep the thread on the shorter side, so if newcomers or people from SCL/BMI/ASCAP enter the conversation, it won't get lost in the mix.

So I've moved many posts to a spinoff thread (and copied Simon's opening post there), where the more tangential topics can continue. That way we can keep this thread more narrowly focused.

Thanks for your understanding on this, especially as this was a lot of posts to consider, so I undoubtedly made a few mistakes.
 
Hopefully @chillbot had some success with his BMI guys, but on the ASCAP front, I saw a couple videos that make me wonder if these guys understand what's going on. They have an AI page with one article which suggests they're at least thinking about the topic.

But then ... they released some videos in the last couple weeks that are from a full-on evangelist (as Simon put it) perspective. I think these people genuinely believe our future is rosy, thanks to these "wonderful tools."

Particularly scary are the YouTube comments, where people gush with excitement, but no one seems concerned that the apps that can "help us" also have the ability to eliminate us.



 
Most of the "positive" comments on the first video actually look AI generated 😅
 
In intellectual property law, particularly in copyright, the concept of transformation is a key factor in determining whether a use of a copyrighted work qualifies as fair use. The transformation principle was established in the landmark case Campbell v. Acuff-Rose Music, Inc. (1994), where the U.S. Supreme Court held that a parody of Roy Orbison's song "Oh, Pretty Woman" by 2 Live Crew was transformative and could be considered fair use.

Who must perform the transformation:
The transformation must be performed by the party using the copyrighted work, not the original creator. In most cases, this means that the person or entity accused of copyright infringement must demonstrate that their use of the copyrighted material is transformative.

How the transformation must be performed:
1. Purpose and character: The use of the copyrighted work should add new expression, meaning, or message to the original work. It should not merely supersede the original work but instead build upon it, creating something new and different.

2. Nature of the copyrighted work: Transformative use is more likely to be found when the original work is factual rather than creative. However, even highly creative works can be used in a transformative manner.

3. Amount and substantiality: The amount of the original work used should be appropriate for the transformative purpose. Using more of the original work than necessary may weigh against a finding of fair use.

4. Effect on the market: The transformative use should not substantially harm the market for the original work or its derivatives. If the use serves as a replacement for the original work, it is less likely to be considered transformative.

Examples of transformative use include parody, criticism, commentary, news reporting, teaching, scholarship, and research. However, courts assess transformative use on a case-by-case basis, considering the specific facts and circumstances of each situation.
 
Also, we need to stop calling this stuff “AI” and start calling it what it is: expertise capture (Excap) technology.

Use and reuse of captured expertise is more akin to using the image and likeness of a performer than it is transformation. “New” “content” may be generated. But how, and what enabling factors permitted that generation?

@SimonFranglen the next time you’re in a discussion with reps from this new industry, inhaling the by now familiar vapor of technical brilliance, aggressiveness, and heedless boyhood vandalism: Ask them to go make you something with their software without the benefit of a training set. Tap your watch and say you’ll wait.
 