Is an 80%-quality pop song good enough for you? What about 80%-quality therapy? Here we sidetrack on the growing role of artificial intelligence in songwriting and mental health services. Oh, and we hang our heads at what it’s doing to college classes… Be sure to join us next week for a full episode!
Send us your thoughts at NeverMusicPod@gmail.com
Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy
[00:00:00] Hey there! Guess what? The Opus is back! We're the podcast that explores iconic records to celebrate their ongoing legacy, their history, and how the music continues to evolve. I'm Adam Unz, the new host of The Opus from the Consequence Podcast Network and Sony Legacy Recordings. This season, we're celebrating Alice in Chains' second studio album, Dirt, a record whose primal perfection burst onto the music scene back then and still resonates powerfully now.
[00:00:26] We talk to people like Gavin Rossdale from Bush, Charlie Benante from Anthrax, and Dallas Green from City and Colour. Join us as we go back to examine the impact Dirt had when it was released back in 1992 and how it reverberates with rockers and fans today. You can find The Opus on the Consequence Podcast Network or wherever you get your podcasts.
[00:00:57] The platform Shopify has revolutionized millions of companies worldwide. With Shopify you can launch your own online shop without any programming or design skills.
[00:01:08] Thanks to the efficient setup and intuitive social media and online marketplace integrations, you can advertise and sell via Instagram, eBay and co.
[00:01:17] Reaching new audiences has never been so easy. Shopify offers all the tools to build your online business on a single, secure platform.
[00:01:26] Test it for free and present your business to the world. Visit Shopify.de-try. Just enter Shopify.de-try and get started.
[00:01:38] Made for Germany. Powered by Shopify.
[00:01:41] Hey everyone, Mark here again. Remember that sidetrack we did a little bit ago where I showed Nicole the podcast theme music and the ad music and all that stuff and got her reaction?
[00:01:50] Well, that sidetrack spawned its own sidetrack while we were having that conversation about artificial intelligence and its effect on the creation of music, its effect on therapy, and even a little bit about how it affects our teaching.
[00:02:05] In any case, thought you might like to listen in.
[00:02:19] Is there a way for the title music to just like press buttons on your fancy computer and just like change it?
[00:02:26] Can't you put it in AI and just change it?
[00:02:28] Like, can you tell AI to like take this music and make it bluesy?
[00:02:33] Can I?
[00:02:34] Can you like put in the sound file to a chat and say like, can you make this sound like TLC?
[00:02:41] So, no. There might be AI out there that can do things like that, but it's not going to create a product that is appropriate quality.
[00:02:50] That you can like put out.
[00:02:50] That I could use, right?
[00:02:51] It might be able to do things following an algorithm that's creating something that sort of vaguely resembles that.
[00:02:58] First of all, I don't know. I haven't really explored this kind of stuff.
[00:03:00] There is AI that will adjust the mix of it and do some of the production.
[00:03:04] But in terms of recomposing and then creating a final product of that recomposition, I don't think we've got the skills.
[00:03:11] I'll say it. I'm lucky it doesn't completely have those skills.
[00:03:15] Right. For sure.
[00:03:15] Because I cease to...
[00:03:16] Be valid.
[00:03:17] I cease to be valid as a person, as a profession.
[00:03:19] If songwriters and stuff, we're getting to the point where that kind of thing will be possible.
[00:03:23] Well, it's happening therapeutically. Like in my field, for sure, there's AI therapists that like will read your...
[00:03:29] They'll read your body language and make assessments and give you referrals.
[00:03:32] Like absolutely. And people are using them and they're very comfortable using them because you're not like disclosing personal stuff to another human.
[00:03:39] You're disclosing it to a chatbot.
[00:03:40] So I think that gives... What that gives the end user is an advantage. That's the difference.
[00:03:44] Right. Whereas the advantage, if AI suddenly does all the songwriting and music production,
[00:03:50] the advantage goes to the record companies and the business owners.
[00:03:54] Right.
[00:03:54] Because what they have to do is not pay a human.
[00:03:56] There isn't an end-user advantage.
[00:03:58] So let's say the 80-20 rule, right? Like 20% of the work will get you 80% of the way there.
[00:04:05] And then the last 20% of quality and excellence takes 80% of your time.
[00:04:10] Sure.
[00:04:10] Meaning like getting something across from a B to an A is just so much work.
[00:04:15] That sacrifice might be worth somebody not having to disclose their dark secrets to a therapist.
[00:04:21] 80% success rate therapy or like an 80% good therapist might be good enough if I'm having a rough week.
[00:04:28] But I don't feel like telling a human being exactly how rough.
[00:04:31] The question that remains to be seen is: are 80%-okay pop songs good enough for the consumer?
[00:04:38] Because it sure as heck is going to save the record companies a lot of money.
[00:04:40] But will there still be people that want, you know, the Taylor Swift or whatever that takes thousands of dollars to produce,
[00:04:48] as opposed to something that took an AI bot five minutes of computer crunch time?
[00:04:54] Like, and is that trade-off worth it?
[00:04:56] Because I don't know that there's an advantage for the end user,
[00:05:00] except for maybe more music could be created more quickly, which is not an advantage.
[00:05:04] I think we all feel like we can't listen to everything we want to anyways.
[00:05:06] Yeah, I think I know in therapy, like, it's pretty controversial.
[00:05:12] One, because you're taking the human experience out of it.
[00:05:15] And if you have someone that has like anxiety, it makes sense to like practice talking with the therapist, right?
[00:05:20] You're like, you're kind of just, in my opinion, and it's just my opinion.
[00:05:24] I think chat or AI therapy devalues the experience because you're not getting that like kind of practice arena.
[00:05:32] You know, you're not getting the chance to beta test how you act in the real world with other humans.
[00:05:38] And it is, I think, good in some cases for like access and opportunity for people like seeking therapy,
[00:05:44] that maybe you can't, your insurance won't cover you seeing someone in person.
[00:05:48] You don't have a vehicle to get there.
[00:05:50] You don't have resources to take time off of work to go to therapy.
[00:05:52] You can just dial into this chat and they can offer you assistance.
[00:05:58] What they do is they like put you in a, they videotape you and put monitors on you to like monitor your body language.
[00:06:05] And then the chat recognizes if you're, if you have a low affect or if like your body language is slumped or if you're avoiding eye contact,
[00:06:12] which is, it's a chat, it's a robot anyway.
[00:06:15] And then flagging you and saying, hey, your affect is, you're appearing depressed.
[00:06:19] You're appearing anxious.
[00:06:21] You're appearing aloof.
[00:06:22] Yeah.
[00:06:23] And then it'll like, the algorithm will send you to resources that will help with those things.
[00:06:28] And you can change.
[00:06:30] It's like an avatar that you're talking to.
[00:06:32] It's not just like a voice.
[00:06:35] And the user can change the look of the avatar.
[00:06:38] Like I'm a Caucasian woman, a cisgendered woman.
[00:06:42] Maybe I only want my therapist to be a cisgendered white woman.
[00:06:45] Right.
[00:06:46] And that's who I'm most comfortable talking to.
[00:06:48] Maybe I'm a person of color.
[00:06:50] I want my therapist to be a POC too.
[00:06:52] I want to talk to a black man as a black man in therapy, which would be awesome because it would take down some of those barriers.
[00:07:00] We see a lot of barriers therapeutically for people in our industry.
[00:07:04] There aren't, the numbers are changing, but there aren't a ton of like black male therapists.
[00:07:09] Right.
[00:07:10] And black males often avoid therapy for cultural reasons.
[00:07:13] So we want people that, you know, representation matters in our field.
[00:07:17] But sorry to be the devil's advocate here.
[00:07:20] And obviously I don't know about this in any great detail except for what you just shared.
[00:07:24] That's not a black therapist.
[00:07:25] It's a simulation of one that treats the race element to a therapist as just what they look like.
[00:07:33] Because I think looks maybe are important, but I would imagine that part of the benefit for a person of color seeing a person of color is that there could be a presumed shared cultural understanding, shared understanding of our racialized aspects of mental health, things like that.
[00:07:49] That wouldn't be true of an avatar who happens to be a black man who was coded by a white engineer in Silicon Valley.
[00:07:56] So that's it.
[00:07:57] And maybe, I mean, I'm presuming it could be coded by someone who wasn't a white male engineer, but it almost has nothing to do with actual black experience.
[00:08:05] Right.
[00:08:06] Except for visually.
[00:08:06] So now, while that might give someone the illusion of that safety, that is helpful.
[00:08:12] Yeah.
[00:08:12] I could see for some people what they're looking for in a specific identity in their therapist is not just what they look like.
[00:08:19] Right.
[00:08:19] Yeah.
[00:08:19] But our brains need them. Like, stereotypes exist.
[00:08:23] Right.
[00:08:23] So if you stereotype your black male therapist avatar as having a certain set of cultural beliefs, isn't that what we're doing with each other in real life every day?
[00:08:34] Yeah.
[00:08:34] Like I could, you know, maybe you think as a white woman, I have a specific set of lived experiences.
[00:08:40] But when you dig into me a little bit more, you might realize, hey, actually, this person doesn't have those experiences that I assume.
[00:08:47] Like she doesn't shop at Costco and drive a minivan.
[00:08:50] Right.
[00:08:51] When you start like scratching.
[00:08:52] She does go to Grateful Dead concerts and Phish concerts.
[00:08:54] I mean, I used to.
[00:08:56] Used to.
[00:08:56] Well, up until last summer.
[00:08:58] Okay.
[00:08:58] I think that doesn't count.
[00:09:00] Used to.
[00:09:01] Okay.
[00:09:01] So.
[00:09:01] All right.
[00:09:02] So.
[00:09:02] But like we are stereotyping anyway.
[00:09:04] So I think that like if it gets people that are hesitant to see a therapist, if it gets them in the door, I think it's a good thing.
[00:09:12] But I worry about exploitation of those stereotypes and exploitation of those biases and exploitation of people with mental illness to just kind of check off a box.
[00:09:22] Like, oh, you saw a therapist.
[00:09:23] I'm going to bill your insurance.
[00:09:24] But you didn't.
[00:09:25] Yeah.
[00:09:26] That's really interesting.
[00:09:27] So.
[00:09:59] ...marketplace integration, you can advertise and sell via Instagram, eBay and co. Reaching new
[00:10:06] audiences has never been so easy. Shopify offers all the tools to build your online business on a single, secure platform.
[00:10:11] Test it for free and present your business to the world.
[00:10:18] Visit shopify.de-try. Just enter shopify.de-try and get started. Made for Germany. Powered
[00:10:27] by Shopify. And I think bringing it back to music before we wrap, like in terms of the AI,
[00:10:41] which is wild that you brought up. This is like a second sidetrack. This is awesome. Perfect.
[00:10:44] I don't know if it is, if we should use it as one. Maybe. I feel like not being an AI expert,
[00:10:50] we're going to have two stages to the AI creating music, right? And that first stage is going to be
[00:10:57] a bummer in a lot of ways because it'll actually cause us to have possibly worse music and maybe
[00:11:03] stop that second stage from happening at all. So what I mean by that is just like when we're
[00:11:08] doing language models, right? If we're talking about ChatGPT, ChatGPT is not going to coin new slang,
[00:11:14] for example. ChatGPT is not doing original research. ChatGPT is just synthesizing what's
[00:11:20] already out there. It's going to say things that humans have already said. Let's say the AI gets good
[00:11:26] enough to do what you said, that if it could take our theme music and it could spin the theme music
[00:11:32] into something in the style of TLC, that probably will be possible at a certain point. I think it's
[00:11:38] technically possible now, but probably the result would be clunky enough that nobody would want to
[00:11:43] listen to it. Not necessarily saying anybody wants to listen to our theme music anyways,
[00:11:47] but you get the idea. It'll be crappier. And that AI will eventually get good enough to where it can
[00:11:51] make it not crappy. And you could just tell that AI, make a pop song, but it's going to need
[00:11:57] information to develop that. It's going to need to be trained to make that pop song.
[00:12:01] It's going to be trained by people like me in conjunction with computer engineers that maybe
[00:12:07] give it some information about how music works, but also, more importantly, feed hundreds of millions
[00:12:13] of songs into it, so it understands what songs are and it understands which songs are hits. It understands...
[00:12:19] well, that AI is just going to be spitting out a synthesis of what has already been done. So think
[00:12:24] about it. If the AI learns that in 2024, nine out of the 10 top 10 songs were in a minor key,
[00:12:32] guess what's going to happen in 2025? The pop songs will be created in a minor key. And it'll figure out
[00:12:39] that grunge is coming back in 2026. And so it's going to start looking to a lot of 90s alternative
[00:12:46] rock and creating songs like that. And then it'll find out that, oh wait, now garage punk from the early
[00:12:52] 2000s is coming back or whatever. And we'll just start like doing it all over again. Who's going to
[00:12:57] take the next leap? Who's going to create the new content? And this is one of the problems with
[00:13:02] ChatGPT, how it's, like, reading people's entire books and summarizing them. Like, if we destroy
[00:13:08] the publishing industry, if we destroy the newspapers, we destroy the online journals and blogs, the real
[00:13:14] human content, who's going to even create the content that will fuel ChatGPT? The music
[00:13:19] songwriting AI will just be fueled by old music or newly AI-generated music. So that's
[00:13:27] when we get to the second stage, which is... The second stage being cultural implosion.
[00:13:31] Well, well, that's what I'm worried we would end up. The ideal second stage is eventually the AI
[00:13:36] learns to actually create new ideas. Right. Terrifying.
[00:13:39] And that's terrifying too. But I feel like, economically, we may not even get there, because
[00:13:44] the impetus to make that AI able to do that... I feel like the impetus will just be to have it recycle Taylor
[00:13:50] Swift stuff or Beyoncé or Michael Jackson, whatever the top-selling artists of all
[00:13:56] time are, and just generate more of that. It's going to be really hard for it to get out of that
[00:14:01] doom cycle, because the actual work of creating an AI that can actually create a new, original style of
[00:14:08] music... it's going to need to be people still doing that, right? And so can the people in 2035,
[00:14:15] the teenagers inventing a new style of music in their basement or whatever, in an underground
[00:14:21] scene, are they going to be able to rise up and give new ideas or will they just never get noticed
[00:14:27] enough even for the AI to feed on that? But like, to come to your point, nothing's ever created in a vacuum.
[00:14:33] Like even us, when we create new things, we are just taking old ideas and synthesizing them.
[00:14:39] That's what we're doing. We're seeing repeating trends in current music versus historical music.
[00:14:44] And that's what we're noticing that in this adventure of this podcast, that we're seeing
[00:14:48] repetition of themes. So if we're just taking existing information and synthesizing it,
[00:14:54] and that's what AI is doing too, I feel like it's the same process, but there's no creativity involved.
[00:14:59] I also think what we're talking about is like something that could happen in the future already
[00:15:04] happens and we just don't know it. And I bet you there's people out there listening that are like,
[00:15:09] they're wrong. This already exists. This is what it's called.
[00:15:11] Oh, I'm sure there's already apps out there that could compose music. And that's cool to a certain
[00:15:17] extent. Like I brainstormed with a computer engineer friend of mine about how cool it would be to make
[00:15:22] like a songwriting app where I'm teaching it the main kind of ideas, and you just, like, have someone...
[00:15:28] it's almost choose-your-own-adventure. And by the end of it, you have a song. But like,
[00:15:32] I wasn't presuming that that app would generate a new style of music. It would be more like business.
[00:15:38] What's that?
[00:15:38] Would put you out of business.
[00:15:39] Put me out of... well, yeah, it would definitely put me as a songwriter out of business if it was
[00:15:45] too good. We'll see how long until me as a teacher gets put out of business by AI.
[00:15:50] Cause that's, Oh God. I mean, the other real sidetrack is how much AI has messed with our classes,
[00:15:56] especially the online classes I teach.
[00:15:58] The online classes are different. I've completely gone archaic in my teaching style because of AI.
[00:16:05] Like every assessment is done in person, pen and paper in front of me. Every assessment.
[00:16:09] Yeah. So that's hard on an online class, but I get it. Yeah. Because I can't imagine being a really
[00:16:14] stressed-out college student with a pretty darn good answer two clicks away with ChatGPT.
[00:16:21] I can't even imagine resisting. Look, if you're like a music major and it's your music history
[00:16:26] term paper, that's different. But if you're a music major and you're in like a psych class,
[00:16:31] a psych class, and they're asking you for a post about like Maslow, like ES.
[00:16:35] That's right. How could you, and I teach some online classes and I have discussion posts
[00:16:39] that are very much, I'm throwing one of my students under the bus, whoever you are,
[00:16:44] I'm not going to name you by name, but you know who you are. One of the discussion prompts is,
[00:16:48] you know, the whole idea, this is my rock and roll history class, right? The whole idea of Elvis being
[00:16:54] the king of rock as being kind of embraced by some, but also kind of ludicrous, right? He sort of stole
[00:17:00] it in some ways, but he's also this amazing artist. Maybe he should be. And so I pose the question,
[00:17:05] tell us one of the kings or queens or whatever sort of ruler of a style. And it doesn't have to be
[00:17:12] music. It could be, oh, this person is the prince of sculpture or whatever from the 19th, whatever.
[00:17:18] And people usually do do music. But I had, as close as I could get to proof, a student use
[00:17:25] ChatGPT to answer that question. And that is a question that I'm going out of my way to say,
[00:17:31] there's no wrong answer. The point of this is to just read other people's answers and kind of chuckle.
[00:17:36] But they're so busy. The idea that, for something where every answer is right, it's still easier for that
[00:17:43] student to not actually think and go, huh, I kind of like Kendrick Lamar. I'll say he's the king of
[00:17:49] 21st century hip hop. They instead asked ChatGPT. And I got this really silly equivocating answer
[00:17:56] where it was like, well, many people disagree. Here are some. And it was this bullet pointed list of
[00:18:00] like the queen of soul, the king of pop. It like listed all the ones. The godfather of soul. Like
[00:18:06] it listed all the kind of colloquial ones that people already say. And they copied and pasted it.
[00:18:10] This was very early-gen, right? That student is still doing that now, but not copy-pasting. They're
[00:18:15] paraphrasing. Nice. They've learned. They've learned, which then means they've wasted so much more time
[00:18:21] than just answering the question, honestly. A good workaround that I do for discussion boards is do
[00:18:25] video responses, have them videotape themselves and upload it. So that's the move.
[00:18:31] Video responses, presentations, things like that. That's the way to get around when you're asking
[00:18:36] for content. Yes. Look, I'm just asking them to chime in on their opinion. If they're really going
[00:18:40] to waste their time and cheat, like, yes, if I catch them, I try to stop that. But a mitigating factor
[00:18:46] is that it's sort of low stakes. Whereas when you're talking about, I need this student to convey to me that they
[00:18:51] understand the breakdown of tonality in the late 19th century, probably you would need a presentation or a
[00:18:58] video to do that, because the AI is going to write something, again, that's 80% okay. Maybe it's not
[00:19:05] going to be an A paper, and I don't even know that it would be, because, listeners,
[00:19:10] anybody in music classes: the AIs do not understand music, because there's a lot of bad stuff written
[00:19:16] about music. Like if you ask a music theory question to ChatGPT on two different days, you will
[00:19:21] get two different answers and they are not both correct. And it's because there's a lot of noise out there.
[00:19:27] It's not like chemistry where there's like, there's one right answer. There's one right answer. And it's been
[00:19:32] written about in journals for 150 years or whatever. This is, like, how I get people in their answers talking about
[00:19:38] how Bruce Springsteen and Mötley Crüe are, like, part of the same artistic movement, which they're not, right?
[00:19:45] Right. But because people say crazy stuff on blogs, on the internet, people are saying that kind of
[00:19:50] stuff in my online classes. Yeah. And it used to be, I'd roll my eyes and be like, that student just
[00:19:55] like, looked something up online. No, now it's like, oh, that's an AI answer. Yeah. So anyway,
[00:20:01] this is boring. Yeah. It's interesting to us, but yeah. Any, any last words on any of this?
[00:20:10] Yes. We're doomed. All right, cool. Nevermind the Music is hosted by Nicole Vatcher and me,
[00:20:22] Mark Pothany. I also produce. Please be sure to subscribe and leave us a rating and a review.
[00:20:27] And let us know what you think on social media. We're NeverMusicPod on all the major platforms.
[00:20:32] You can also send us an email at NeverMusicPod@gmail.com. And every so often we'll do a mailbag
[00:20:38] episode where we answer all your questions. So please send them in. Thanks for listening.