Sidetrack - Authenticity in the Age of A.I. (with Michael Serazio)
Nevermind the Music | March 03, 2026 | 01:06:00 | 60.43 MB

Sidetrack - Authenticity in the Age of A.I. (with Michael Serazio)

Is it possible for a robot to sell out? This week, we welcome author and communication professor Michael Serazio to talk about authenticity in pop music, especially with the arrival of A.I.-created music. Can a machine be a tortured artist… or at least fake it better than a human? Is this the end of “real music” or is A.I. just the latest tool that makes all us old folks scared? We promise no answers but lots of human musings!


Michael’s book, “The Authenticity Industries: Keeping It Real in Media, Culture, and Politics,” is available - go buy it!


Music heard in this episode: Masato Nakamura - “Starlight Zone”, John Baker - “ToeJam Jammin’ (Theme)”, RUN DMC - “My Adidas”, Breaking Rust - “Walk My Walk”, The Velvet Sundown - “Dust in the Wind”


Send us your thoughts at NeverMusicPod@gmail.com


Nevermind the Music is part of The Lorehounds Network. Join the Nevermind the Music Discord channel by visiting thelorehounds.com





Advertising Inquiries: https://redcircle.com/brands

Privacy & Opt-Out: https://redcircle.com/privacy

00:00 --> 00:03 [SPEAKER_01]: much in the same way when you guys probably get essays from your students.
00:03 --> 00:07 [SPEAKER_01]: And you're like, is this GPT or is this them, right?
00:07 --> 00:09 [SPEAKER_01]: Like how many times did they use an em dash?
00:09 --> 00:12 [SPEAKER_01]: How many times did they use the word "delve," right?
00:12 --> 00:19 [SPEAKER_01]: And you're not quite sure. With the AI artists that I've been hearing on Spotify, who've gone viral and stuff and become popular.
00:19 --> 00:26 [SPEAKER_01]: It exists in a little bit of a gray area, but I mean, I'm not sure, like, if I wasn't deliberately looking or thinking about it being AI, I think it would slide past me.
00:26 --> 00:27 [SPEAKER_01]: I wouldn't notice it necessarily.
00:27 --> 00:34 [SPEAKER_04]: Your students are good over there; mine still leave things in, the keywords and the bold, so, yeah, that's my problem.
00:34 --> 00:36 [SPEAKER_04]: The proof of the likes is so bad.
00:36 --> 00:41 [SPEAKER_02]: It's a screenshot from the actual OpenAI website.
00:51 --> 00:54 [SPEAKER_04]: I'm Mark, and I'm Nicole, and this is Nevermind the Music.
00:55 --> 00:57 [SPEAKER_04]: Who are we going to talk to today, Mark?
00:57 --> 01:04 [SPEAKER_02]: We are talking to Michael Serazio, professor, doctor, Professor Michael Serazio.
01:04 --> 01:05 [SPEAKER_02]: Mike, welcome to the podcast.
01:05 --> 01:06 [SPEAKER_02]: Say hello.
01:07 --> 01:07 [SPEAKER_01]: Hey, what's going on?
01:08 --> 01:12 [SPEAKER_01]: Thanks so much for having me and please do as I tell my students in class, but they don't believe me.
01:12 --> 01:13 [SPEAKER_01]: Just call me Mike.
01:13 --> 01:20 [SPEAKER_01]: It feels, just like I always tell my students, just don't call me Mr. Serazio, because I look around and think, like, my father's in the
01:20 --> 01:22 [SPEAKER_04]: What are you a doctor of?
01:22 --> 01:29 [SPEAKER_01]: I am a doctor of philosophy, specifically a doctor of, I'm a, so communication, my PhD is in communication.
01:29 --> 01:36 [SPEAKER_01]: And my area of research, the thing that gets me up every day, super fascinated: it's the media.
01:36 --> 01:40 [SPEAKER_01]: I study the media, I teach about the media, I'm fascinated by the media.
01:40 --> 01:40 [SPEAKER_02]: Awesome.
01:41 --> 01:51 [SPEAKER_02]: I hate to say this, but ever since we reconnected a few years ago, and I knew that you were a professor of communication here around Boston, I can't help but think of that Simpsons clip from the 90s.
01:51 --> 01:56 [SPEAKER_02]: Do you remember? Where, it's like, a football player gets injured or something, and Dr. Hibbert...
01:56 --> 01:57 [SPEAKER_02]: It's like, Oh, don't worry.
01:57 --> 02:05 [SPEAKER_02]: You can fall back on your degree, and then he looks, and it's in communications, and the guy's like, I know, it's a phony major. And this is coming from a music major.
02:05 --> 02:09 [SPEAKER_00]: I'm afraid that leg is hanging by a thread.
02:09 --> 02:11 [SPEAKER_06]: Lubchenko must return to game.
02:12 --> 02:14 [SPEAKER_00]: Oh, you're playing days or over, my friend.
02:14 --> 02:17 [SPEAKER_00]: But you can always fall back on your degree in communications.
02:18 --> 02:19 [SPEAKER_00]: Oh, dear Lord.
02:20 --> 02:21 [SPEAKER_06]: I know.
02:21 --> 02:22 [SPEAKER_06]: Is phony major.
02:23 --> 02:24 [SPEAKER_06]: Lubchenko learned nothing.
02:25 --> 02:25 [SPEAKER_06]: Nothing.
02:26 --> 02:26 [SPEAKER_01]: Sorry.
02:26 --> 02:27 [SPEAKER_01]: No, no.
02:27 --> 02:28 [SPEAKER_01]: So a couple of things.
02:28 --> 02:30 [SPEAKER_01]: Like one, that's, yes, that is a problem.
02:30 --> 02:30 [SPEAKER_01]: I'll pipe it in.
02:30 --> 02:31 [SPEAKER_01]: If I can find it on YouTube. I'm sorry.
02:31 --> 02:32 [SPEAKER_01]: I'm piping it in for a reason.
02:32 --> 02:33 [SPEAKER_01]: That's a classic clip.
02:33 --> 02:43 [SPEAKER_01]: The other thing is, we have a lot of, yeah, like, I mean, we have a lot of athletes in the major, but it is endlessly fascinating, so, you know, it keeps me entertained.
02:43 --> 02:47 [SPEAKER_04]: There's value in the liberal arts, gosh, you guys.
02:47 --> 02:51 [SPEAKER_02]: But I would say you write about media... but can I shout out, can I toot your horn?
02:51 --> 02:54 [SPEAKER_02]: Because three books... are we on three, or are we on four yet?
02:55 --> 02:55 [SPEAKER_02]: Yeah.
02:55 --> 02:55 [SPEAKER_02]: Okay.
02:55 --> 02:57 [SPEAKER_02]: So three published, yeah.
02:57 --> 02:57 [SPEAKER_02]: Got it.
02:57 --> 03:00 [SPEAKER_02]: First book about guerrilla marketing.
03:00 --> 03:13 [SPEAKER_02]: Second book about sports media. Third book, and that brings us to our conversation, authenticity. And tell me if I've got it right: The Authenticity Industries: Keeping It Real in Media, Culture, and Politics.
03:13 --> 03:14 [SPEAKER_02]: That's the book.
03:14 --> 03:14 [SPEAKER_01]: Thank you.
03:15 --> 03:15 [SPEAKER_02]: Come with yours again.
03:16 --> 03:17 [SPEAKER_02]: Cool.
03:17 --> 03:18 [SPEAKER_04]: love a subtitle.
03:19 --> 03:28 [SPEAKER_02]: I dove deep into chapter three, pop music sponsorship play, and that is what we're talking about today.
03:28 --> 03:30 [SPEAKER_04]: But before we get into that, I read that chapter too, Mark.
03:30 --> 03:31 [SPEAKER_02]: I do the homework too.
03:31 --> 03:33 [SPEAKER_02]: We all... we both did the grad school.
03:33 --> 03:35 [SPEAKER_02]: Yes, we can work.
03:35 --> 03:36 [SPEAKER_02]: I love homework.
03:37 --> 03:46 [SPEAKER_02]: Before we get into it, because, folks, we're going to talk about authenticity and music and
03:46 --> 03:46 [SPEAKER_02]: it's '24... '24.
03:47 --> 03:50 [SPEAKER_02]: We're going to update it to '26 because there's some AI that's happened.
03:51 --> 04:01 [SPEAKER_02]: And we talk around AI a lot on this podcast, but we want to dive into authenticity and selling out and how AI and social media and all that might change it.
04:01 --> 04:06 [SPEAKER_02]: But should I mention how we know each other, or are we just keeping it professional?
04:06 --> 04:06 [SPEAKER_02]: What do we want to do?
04:07 --> 04:11 [SPEAKER_01]: I think we... I mean, look, it's...
04:11 --> 04:39 [SPEAKER_01]: Three decades can pass, and, you know, the kid that you hung out with and played video games with in the 1980s winds up in the same city, in the same, you know, same job, right? So yes, Mark and I, we grew up together in San Diego. Because of my dad's job, I moved away from San Diego in the early 90s, and then, lo and behold, we both get professor jobs. And remind me, so I started here in 2015. You were the same year, right? Or...
04:39 --> 04:42 [SPEAKER_04]: So do you know, crazy, mori, carry girl?
04:42 --> 04:46 [SPEAKER_04]: Do you know the weird eagle scout guy?
04:46 --> 04:48 [SPEAKER_04]: Do you know any of those people?
04:48 --> 04:50 [SPEAKER_04]: That's the Mark lore that I want to find out.
04:50 --> 04:53 [SPEAKER_01]: The mythology... yeah, my mythology of Mark ends around 1991.
04:53 --> 04:56 [SPEAKER_02]: Yeah, so any platform we were so young.
04:57 --> 05:06 [SPEAKER_02]: But I mean, you actually maybe don't know this, even though we've gone to a bunch of shows since we've lived in the same city, and so you know something about musical stylings.
05:06 --> 05:13 [SPEAKER_02]: But some of the experiences with you are heavily influential on me as a musician, because of all the video games.
05:14 --> 05:19 [SPEAKER_02]: We would like, stay up, super late, playing New Year's, while the parents, maybe I guess just left us alone.
05:19 --> 05:21 [SPEAKER_02]: Sonic the Hedgehog, I did not own a Genesis.
05:22 --> 05:25 [SPEAKER_02]: So Sonic 1 was at your house always.
05:28 --> 05:47 [SPEAKER_02]: And, yeah, maybe a sleeper hit: ToeJam & Earl. Anybody who knows that game knows its funky bassline. I'll pipe it in.
05:48 --> 06:04 [SPEAKER_02]: That stuff still, like, fills my dreams, some of that not-quite-chiptune music, too fancy to be chiptune music.
06:04 --> 06:08 [SPEAKER_04]: Do you know, Mark, I have a Sega Genesis and I have ToeJam & Earl.
06:08 --> 06:09 [SPEAKER_02]: Oh wow.
06:10 --> 06:10 [SPEAKER_02]: Wow.
06:10 --> 06:12 [SPEAKER_04]: I'll bring it over next time.
06:12 --> 06:13 [SPEAKER_02]: Why are we doing this?
06:13 --> 06:13 [SPEAKER_02]: Let's play that.
06:13 --> 06:14 [SPEAKER_04]: I don't know.
06:14 --> 06:17 [SPEAKER_04]: We can cut it out but I just wanted to mention it.
06:17 --> 06:45 [SPEAKER_01]: What, kind of, before we dive in, do you want to tell us anything about what you're into musically? Since we're not talking about you as a musician or anything, but you're a writer. What do you want to... Yeah, I mean, I like to pretend that I have omnivorous taste. And, you know, I think I do try to listen to as wide a variety of stuff as I can. I mean, you know, I grew up eighties, nineties, I've got baby boomer parents, so basically my kind of
06:45 --> 06:52 [SPEAKER_01]: You know, if I only have one decade to take to a desert island and listen to music for the rest of my life, it's unquestionably the 80s.
06:52 --> 07:01 [SPEAKER_01]: And then, you know, these days, I mean, it's a sort of mixture of, like, dad-rock type stuff, right, like The War on Drugs and Arcade Fire, things like that.
07:01 --> 07:23 [SPEAKER_01]: For sure. So in college, I discovered electronic dance music and I got really into that, and not just the sort of, like, uptempo... the kind of, like, more downtempo-y stuff. Which, you know, the last few years it's been a lot of synthwave; I guess 10 years ago, 15 years ago, chillwave was having its moment. And I still kind of carry forward with some of those sort of downtempo electronic textures
07:23 --> 07:26 [SPEAKER_01]: I don't know, when I'm like work and I need something in the background to listen to.
07:26 --> 07:31 [SPEAKER_01]: I think Cliff Martinez... the soundtracks that the composer Cliff Martinez makes for movies are extraordinary and great.
07:31 --> 07:32 [SPEAKER_01]: But yeah, it's a ton of stuff.
07:33 --> 07:34 [SPEAKER_01]: Just depends on the year, depends on the time.
07:34 --> 07:39 [SPEAKER_02]: I think it's funny you're into downtempo, though, because I feel like there's a major Moby diss in your book chapter.
07:39 --> 07:39 [SPEAKER_02]: There is.
07:40 --> 07:43 [SPEAKER_02]: You call it lame, or at least you say other people do.
07:43 --> 07:47 [SPEAKER_02]: It's one of those politician things: people are saying he's lame.
07:47 --> 07:48 [SPEAKER_01]: Yeah, no, no, no, no.
07:48 --> 07:51 [SPEAKER_01]: Actually, you are correct in picking up on that.
07:51 --> 08:01 [SPEAKER_01]: I love Moby, and Moby, particularly those albums, 18 and Play, in the late 90s, early aughts, really, they really crystallized electronic dance music for me.
08:01 --> 08:07 [SPEAKER_01]: And that was a moment at which it was sort of breaking into the mainstream; it had been more underground prior to Moby.
08:07 --> 08:24 [SPEAKER_01]: And just the fact that with that album, he licensed every single track for commercial use. Within the subculture, within kind of EDM subculture, that was considered lame because it was considered selling out, which is the kind of core of the chapter and even the whole book's argument, that disagreement or tension or whatever.
08:24 --> 08:37 [SPEAKER_01]: So yeah, no, I'm not hating on Moby, I just want to point out that other people do.
08:37 --> 08:50 [SPEAKER_04]: Does he talk about Woodstock '99 at all in his stuff? Because that's, like, iconic in terms of selling out and being, like, an ambassador of this downtempo EDM culture to the masses.
08:50 --> 08:56 [SPEAKER_04]: I think that was a real moment for him and it was a really toxic gross moment just in our cultural history.
08:56 --> 08:56 [SPEAKER_04]: Yes.
08:57 --> 09:00 [SPEAKER_04]: So I'm eager to hear his like first person perspective from that.
09:00 --> 09:01 [SPEAKER_04]: So if he talks about it in the book...
09:01 --> 09:02 [SPEAKER_04]: I'm in.
09:02 --> 09:04 [SPEAKER_01]: It is one of the most soul-baring autobiographies.
09:05 --> 09:06 [SPEAKER_01]: Again, maybe a lot of them are like that.
09:06 --> 09:19 [SPEAKER_01]: I don't read that many, but it takes you on a journey from the sort of heights of his fame, and the depravity with which he indulged his celebrity, kind of like a Bacchanalia, to just how that kind of ruined his life.
09:19 --> 09:29 [SPEAKER_01]: And it's a real kind of, like, you know... the book is just extraordinarily, just nakedly honest in a way that I did not expect from a fame memoir.
09:29 --> 09:31 [SPEAKER_01]: So it's a good book.
09:30 --> 09:32 [SPEAKER_04]: Should we start a book club, do you think?
09:33 --> 09:35 [SPEAKER_04]: I don't know, I just drink an alami, and I'm like off the charts.
09:35 --> 09:38 [SPEAKER_04]: But I think that we need, like, a Nevermind the Music book club.
09:38 --> 09:41 [SPEAKER_02]: I'm still working on that 1100 page Stephen King novel.
09:41 --> 09:44 [SPEAKER_02]: I've been on it for, like, a month. It's so long.
09:44 --> 09:47 [SPEAKER_02]: It's good though, things really bad happen at the end of it.
09:47 --> 09:50 [SPEAKER_02]: That's one thing I'll say, spoiler to every Stephen King novel.
09:50 --> 09:50 [SPEAKER_02]: Okay.
09:50 --> 09:51 [SPEAKER_02]: All right.
09:51 --> 09:53 [SPEAKER_02]: Do you want to talk through some of your main points?
09:53 --> 09:55 [SPEAKER_02]: I mean, we read through it.
09:55 --> 10:00 [SPEAKER_02]: And then we can come back and, like, bring the AI into all of it.
10:00 --> 10:04 [SPEAKER_02]: But like your thesis of the book is sort of like everything is fake, right?
10:04 --> 10:05 [SPEAKER_02]: Authenticity is fake.
10:05 --> 10:08 [SPEAKER_02]: So how does that apply to music and pop music in particular?
10:08 --> 10:09 [SPEAKER_02]: Totally.
10:09 --> 10:12 [SPEAKER_01]: So short, just a little elevator pitch version of the book.
10:12 --> 10:15 [SPEAKER_01]: America has become obsessed with authenticity in recent years.
10:16 --> 10:18 [SPEAKER_01]: You just hear this term popping up more and more.
10:18 --> 10:23 [SPEAKER_01]: It's used usually to explain some aspect of success that someone is having, right?
10:23 --> 10:26 [SPEAKER_01]: So why did Donald Trump win the election?
10:26 --> 10:29 [SPEAKER_01]: Why does that influencer have a million followers?
10:30 --> 10:33 [SPEAKER_01]: Why is that reality show become popular?
10:33 --> 10:35 [SPEAKER_01]: Why has that musician got a lot of fans?
10:36 --> 10:41 [SPEAKER_01]: And the word that you hear used to explain that is, like, oh, they're really authentic, or it's really authentic.
10:41 --> 10:46 [SPEAKER_01]: And so that became my source of fascination: like, look, people seem to be obsessed with this quality in culture now.
10:47 --> 10:52 [SPEAKER_01]: But it was my thinking, and then my exploration, that basically authenticity can be faked.
10:52 --> 10:53 [SPEAKER_01]: It can be produced.
10:53 --> 10:56 [SPEAKER_01]: It can be put on for us by people who work behind the scenes.
10:56 --> 10:58 [SPEAKER_01]: So basically the book is based upon
10:58 --> 11:11 [SPEAKER_01]: dozens and dozens of interviews with people who work in entertainment, music, advertising, social media, politics, to basically understand the work that they do to try to convince us that their clients are authentic.
11:11 --> 11:26 [SPEAKER_01]: So the specific chapter on pop music is really interested in sort of going behind the scenes, talking to band managers, music label executives, just kind of anybody who sort of works in the background when it comes to music to try to understand how do you try to convince people
11:26 --> 11:30 [SPEAKER_01]: that this artist that you represent is who they purport to be.
11:30 --> 11:34 [SPEAKER_01]: And the tension in the pop music chapters is really all about what is the purpose of music, right?
11:34 --> 11:38 [SPEAKER_01]: Like, is it for people to find their identity through as fans?
11:39 --> 11:46 [SPEAKER_01]: Is it to sort of serve some sort of, like, artistic vision that they're just trying to express in the most, you know, honest way possible?
11:47 --> 11:47 [SPEAKER_01]: Or,
11:48 --> 11:48 [SPEAKER_01]: Is it a business?
11:49 --> 11:49 [SPEAKER_01]: Is it an industry?
11:50 --> 11:54 [SPEAKER_01]: Is it supposed to basically just sell a lot of albums and sell a lot of concert tickets?
11:54 --> 12:09 [SPEAKER_01]: And so, you know, the tension between authenticity and selling out is like most intense in the music world because on one hand as fans, we want music to mean something more than simply being a sort of marketplace product.
12:10 --> 12:11 [SPEAKER_01]: And yet, at the same time,
12:12 --> 12:13 [SPEAKER_01]: you have to sell records.
12:13 --> 12:34 [SPEAKER_01]: You have to sell tickets. And that, and that tension, especially in terms of the history of music, whether it be new instruments and new technological advances in terms of how music is played, whether it be the trajectory of the business... I mean, you know, talk about, like, from 2000 onward, just a complete decimation of revenues, and how musicians are going to make money in the aftermath of that, right?
12:34 --> 12:39 [SPEAKER_01]: Those tensions are all kind of
12:39 --> 12:41 [SPEAKER_02]: One thing that I noticed, well, sorry, that's a lot.
12:41 --> 12:42 [SPEAKER_02]: There's a lot there.
12:43 --> 12:48 [SPEAKER_02]: So there's a change in, I guess, how much people are trying to hide it, though.
12:48 --> 12:57 [SPEAKER_02]: You start the story sort of in the 60s, like John Lennon and stuff, but then you also mention Rousseau, which would be, like, Enlightenment era.
12:57 --> 13:14 [SPEAKER_02]: And I would say we could talk also about the Romantic period, like artists like Beethoven, the tortured artist, and that sort of ideal that we have since then, I would say, of the artist not as a craftsman, like a carpenter or a mason or whatever, but
13:14 --> 13:23 [SPEAKER_02]: some kind of a tortured genius that has to suffer for their art. But then the way that we treat that changes, and you talk about this in the book, and we've made reference to it before.
13:23 --> 13:34 [SPEAKER_02]: Where, like, in the 90s, when we came of age, selling out was, like, the worst thing you could do, but then a decade-plus later there's, like, rap entrepreneurs.
13:34 --> 13:37 [SPEAKER_02]: And it's like everybody has a sponsorship play now.
13:37 --> 13:38 [SPEAKER_02]: That's one of the things you talk about.
13:38 --> 13:41 [SPEAKER_02]: But like the idea that they were still chasing authenticity.
13:41 --> 13:46 [SPEAKER_02]: It's just that what that authenticity was and what crossed the line has totally changed.
13:46 --> 13:47 [SPEAKER_01]: Yeah, indeed.
13:47 --> 13:50 [SPEAKER_01]: And that timeline that you articulate is precisely right.
13:50 --> 14:18 [SPEAKER_01]: The 90s, which again, we kind of came of age in, right: selling out was the worst thing that you could possibly do in the 90s, right? This would be, you know, Green Day changing labels, going big label. This would be, to some degree, the, if not impetus, the context for Kurt Cobain taking his own life, right, in the sort of sense that, like, you know, he had gotten too big for the project that he was trying to achieve artistically.
14:18 --> 14:25 [SPEAKER_01]: It doesn't carry the same stigma that I think it once had, not just in music, but, like, you guys may have this experience with your students, right?
14:25 --> 14:30 [SPEAKER_01]: Like you asked them, like, what do you think of this concept of selling out?
14:31 --> 14:34 [SPEAKER_01]: And they're like, well, that means like you've sold a lot.
14:34 --> 14:35 [SPEAKER_01]: So that's a good thing, right?
14:35 --> 14:41 [SPEAKER_01]: Like, it's like they don't, they don't look at it as some kind of a betrayal, which I think, like, a sort of certain Gen X,
14:41 --> 14:59 [SPEAKER_01]: Gen X and pre-Gen X sensibility would articulate, but rather as a kind of, like, celebratory thing. And again, part of that is them coming of age, or, again, our students here, our younger people, coming of age in a world in which influencer-like activities are completely normalized in a way that, like, I think older generations are still, like...
15:00 --> 15:06 [SPEAKER_01]: Yeah, I don't know, where are the boundaries of your identity and your relationships that exist outside the marketplace?
15:07 --> 15:10 [SPEAKER_01]: If you're an influencer, you do not conceptualize that there are boundaries.
15:10 --> 15:16 [SPEAKER_01]: So anyway, long story short, I think what happened basically was the music industry as a business deflated.
15:16 --> 15:29 [SPEAKER_01]: You go from record revenues and music sales peaking around 2000, not coincidentally, right when Napster effectively comes in and destroys the notion that you would buy and consume
15:30 --> 15:38 [SPEAKER_01]: Now the industry's been able to claw some of that back with streaming and, you know, ticket sales, things like that, but it's still more or less half, or a fraction, of what it used to be.
15:39 --> 16:00 [SPEAKER_01]: You know, in the absence of that traditional model, you like a band, you buy their album, those groups have got to make money some other way. And the way that you do it is you do it in sponsorships, you do it in partnerships; you're just able to mingle with commerce in a way that previously had not been allowed or necessary, but it is necessary in the aftermath of that.
16:00 --> 16:05 [SPEAKER_01]: So, authenticity still, people still seek that in the music, but
16:05 --> 16:09 [SPEAKER_01]: certain things that used to be off limits that would be seen as inauthentic.
16:10 --> 16:11 [SPEAKER_01]: It's not so much anymore.
16:11 --> 16:12 [SPEAKER_01]: Like you mentioned hip-hop.
16:12 --> 16:15 [SPEAKER_01]: Hip-hop has always been cool with selling.
16:16 --> 16:20 [SPEAKER_01]: You know, 1985, I think it was, Run DMC, "Walk This Way," right?
16:20 --> 16:31 [SPEAKER_01]: They're in concert, and, you know, "My Adidas", there's a track called "My Adidas", and they're holding up their shell toes in concert.
16:41 --> 16:43 [SPEAKER_07]: I won't mind speaking about it
16:49 --> 16:49 [SPEAKER_01]: Right.
16:49 --> 16:55 [SPEAKER_01]: There's always been a comfort within hip hop in terms of entrepreneurialism and selling and stuff like that.
16:55 --> 16:58 [SPEAKER_01]: But you have to make sure that you're selling with the right brand, right?
16:58 --> 17:00 [SPEAKER_01]: Like it's like it's like not like a problem.
17:00 --> 17:02 [SPEAKER_01]: It's not a problem to do a sponsorship with a brand.
17:02 --> 17:04 [SPEAKER_01]: It's a problem to do one with an uncool brand.
17:04 --> 17:08 [SPEAKER_01]: And hip hop has pointed the way a lot in music culture in, you know, the past 30, 40 years.
17:08 --> 17:12 [SPEAKER_01]: So it led the way once you could no longer make money in traditional forms.
17:12 --> 17:17 [SPEAKER_04]: And the brand needs to fit in with the persona that you're representing.
17:17 --> 17:20 [SPEAKER_04]: And the brand becomes part of the authenticity.
17:20 --> 17:28 [SPEAKER_04]: In your chapter, you talk about specific genres and what the mental schemas we have are around these genres.
17:29 --> 17:39 [SPEAKER_04]: And it's very interesting that we feel like the most authentic artists are ones that just fit our mental schemas of what a folk musician looks like or what a hip-hop artist looks like.
17:39 --> 17:54 [SPEAKER_04]: And that to me is so upside down, almost, because you're not being original. Authenticity doesn't equal original; authenticity just equals how much you match the pre-existing social schemas around a certain icon.
17:54 --> 18:17 [SPEAKER_04]: Yeah, I teach a lot of developmental psych, and we're focusing a lot right now on adolescent development and this identity-versus-role-confusion crisis moment that adolescents, so 12-to-18-year-olds, face, and even into early 20s, where you're trying to, like, pick what your persona is, and how carefully we identify with artists and musicians
18:17 --> 18:28 [SPEAKER_04]: and influencers, to attach to a certain iconography of what our social schema, what our persona is going to be, and how much we broadcast that out. And what these artists are doing is very similar.
18:28 --> 18:32 [SPEAKER_01]: Young people are far more comfortable accepting a notion.
18:32 --> 18:37 [SPEAKER_01]: of themselves as brands than previous generations were.
18:37 --> 18:39 [SPEAKER_04]: And even like that's their goal.
18:39 --> 18:41 [SPEAKER_04]: Dude, like that's their goal is to monetize.
18:42 --> 18:43 [SPEAKER_02]: That's what all about the same age.
18:43 --> 18:44 [SPEAKER_02]: Yeah.
18:44 --> 18:46 [SPEAKER_02]: What do you want to be when you grow up?
18:46 --> 18:49 [SPEAKER_02]: Like the default setting is YouTuber or something like that.
18:49 --> 18:51 [SPEAKER_04]: And that started as, like, a joke.
18:51 --> 18:55 [SPEAKER_04]: And now like even in college, I do like career development work with my students.
18:56 --> 18:58 [SPEAKER_04]: And they'll say, like, no, that's the goal.
18:59 --> 19:00 [SPEAKER_04]: Like, it's not...
19:00 --> 19:05 [SPEAKER_04]: they're not saying it sarcastically. Like, that is the goal: to get monetized, to get brand representation.
19:06 --> 19:09 [SPEAKER_04]: Selling out is not even just, like, acceptable and okay.
19:09 --> 19:13 [SPEAKER_04]: It's actually the preference to sell out because that is how they're going to make their money.
19:13 --> 19:20 [SPEAKER_04]: If they want to be... bold statement, but if they want to be in a creative industry, you need a bankroll.
19:21 --> 19:21 [SPEAKER_01]: For sure.
19:21 --> 19:29 [SPEAKER_04]: And if you can find a bankroll that aligns with your pre-existing point of view and your persona, then that's awesome.
19:39 --> 19:48 [SPEAKER_02]: We've talked about, you know, how authenticity has remained,
19:48 --> 19:51 [SPEAKER_02]: we can call it going back to the Enlightenment, or whatever, has remained
19:52 --> 20:00 [SPEAKER_02]: this really important thing, but the way that it is manifested has changed. But we've taken for granted that we're still
20:00 --> 20:03 [SPEAKER_02]: being raised in a desire to have that authenticity.
20:04 --> 20:16 [SPEAKER_02]: And so I'm wondering, with the YouTuber being the thing the Gen Alpha kids want to be when they grow up, is it possible that we actually have an instance where the authenticity may no longer actually be important?
20:16 --> 20:17 [SPEAKER_02]: And so,
20:17 --> 20:36 [SPEAKER_02]: Twenty years from now, will they be faking the authenticity and making up the myth of "I just did this by myself and this is all coming from the heart," or actually has something cracked? And, I mean, maybe this segues into our talk about AI, which was the seed of this conversation. Could it be that it's so baked in to have a brand?
20:36 --> 20:37 [SPEAKER_01]: It's such a good question.
20:37 --> 20:38 [SPEAKER_01]: So two ways to attack that.
20:38 --> 20:47 [SPEAKER_01]: Like, on one hand, we have to foreground that, you know, there's no greater fool's errand than trying to predict the future when it comes to media.
20:47 --> 20:56 [SPEAKER_01]: I do think, at least in the present moment, when I talk to my students, just a kind of informal sample size of, you know, whatever, dozens each semester...
20:57 --> 21:01 [SPEAKER_01]: They still do associate authenticity as
21:01 --> 21:02 [SPEAKER_01]: a positive thing.
21:02 --> 21:21 [SPEAKER_01]: They do still think that is something to aim for, but they have also been so marketed to throughout their lives, and they are so accustomed that the self-brand has to be a way in which they live their lives, that the tension and the sort of slippage just becomes ever more challenging to maintain.
21:21 --> 21:33 [SPEAKER_01]: and try to sort of like calibrate those two contrasting impulses to kind of be who you really are and pursue your dreams independent of what the marketplace will pay for.
21:33 --> 21:42 [SPEAKER_01]: And on the other hand, you know, like just having to make a living and having to get by and having to sort of exist in advanced capitalism and just try to make it work.
21:42 --> 21:46 [SPEAKER_01]: And so, yeah, I mean, I think, you know, segue into our conversation about AI here.
21:47 --> 22:02 [SPEAKER_01]: I think we're at this super fascinating, super weird moment with AI, where we are now having to redefine the boundaries of humanity in all kinds of interesting ways.
22:03 --> 22:10 [SPEAKER_01]: Like I would argue, and I do, and some of the writings and things on this, that we are right now in the midst of what I call the normalizing AI era, right?
22:10 --> 22:17 [SPEAKER_01]: I mean, that in two ways, like one, AI is a massive sales pitch that's being forced upon us.
22:17 --> 22:23 [SPEAKER_01]: We're being encouraged to adopt AI into our personal and professional lives in every way possible.
22:23 --> 22:24 [SPEAKER_01]: So it's being forced upon us, right?
22:24 --> 22:25 [SPEAKER_01]: It's being sold to us.
22:25 --> 22:31 [SPEAKER_01]: And the market bubble that is AI depends on that sales pitch working, because otherwise the whole thing pops and it falls apart.
22:31 --> 22:33 [SPEAKER_01]: And we've got dot com 2001 all over again.
22:34 --> 22:35 [SPEAKER_01]: However, the pitch might work.
22:36 --> 22:44 [SPEAKER_01]: And in which case, we right now are recoiling at forms of AI output that we designate as not being human and not being authentic.
22:45 --> 22:58 [SPEAKER_01]: I'm not sure five or ten years from now, if the AI sales pitch is correct, that we will still look at, to just use music to get us back to music, AI created music as being stigmatized or inauthentic or unacceptable.
22:58 --> 23:07 [SPEAKER_01]: We do right now; there's still this pushback, and we can talk about some of the different AI artists that have gone viral. But, you know, I don't know whether we're going to look at that
23:07 --> 23:09 [SPEAKER_01]: in five or ten years time and feel the same way.
23:09 --> 23:11 [SPEAKER_04]: That's such an interesting perspective.
23:11 --> 23:24 [SPEAKER_04]: I was just having a conversation with a few students, so I teach psychology, counseling psychology, things like that, and I was having a conversation with students about using AI as a therapist.
23:24 --> 23:27 [SPEAKER_04]: and what are the pros and cons of that?
23:27 --> 23:36 [SPEAKER_04]: Like, pros are: we can reduce some stigma, we can maintain some level of personal confidentiality if you don't want to feel too vulnerable.
23:37 --> 23:49 [SPEAKER_04]: But the downside came back to this authenticity question: that the whole point of the therapeutic relationship is saying what you're feeling to another human and having that feeling reflected back at you with, like, real human empathy,
23:49 --> 23:55 [SPEAKER_04]: that authentic empathy, and that's the word that kept coming up in this conversation was that authenticity piece.
23:55 --> 24:04 [SPEAKER_04]: And now you're equating authenticity with humanity in terms of music making and other art making, we can extrapolate, but I think that that's
24:04 --> 24:10 [SPEAKER_04]: a really interesting way in angle to look at AI as in itself being kind of inauthentic.
24:10 --> 24:18 [SPEAKER_04]: And even in a therapeutic approach, one of our main concerns was you can get an AI model to reflect back whatever you want.
24:18 --> 24:25 [SPEAKER_04]: Like if I want my AI therapist to be a middle-aged white mom, that's what it's going to look to me.
24:25 --> 24:28 [SPEAKER_04]: But it's still all being programmed by the same person.
24:28 --> 24:29 [SPEAKER_04]: And that is
24:29 --> 24:33 [SPEAKER_04]: very, very inauthentic and feels kind of yucky, right?
24:33 --> 24:35 [SPEAKER_04]: But maybe we'll adapt.
24:35 --> 24:36 [SPEAKER_04]: I have hope.
24:36 --> 24:38 [SPEAKER_01]: Yeah, I mean, maybe we'll adapt.
24:38 --> 24:39 [SPEAKER_01]: There's a whole array of things.
24:39 --> 24:41 [SPEAKER_01]: You talk about a therapy.
24:41 --> 24:48 [SPEAKER_01]: There's, you know, all kinds of reporting about people who are finding deep companionship that they find through their chatbots, romance,
24:48 --> 24:50 [SPEAKER_01]: You know, this this one.
24:50 --> 24:54 [SPEAKER_01]: I don't know if you guys are you guys familiar with this phenomenon called grief tech slash ghost box.
24:54 --> 24:56 [SPEAKER_04]: Yes, I was just talking about that with these students.
24:56 --> 24:58 [SPEAKER_04]: It's wild mark.
24:58 --> 24:59 [SPEAKER_04]: Do you know about this bonkers.
24:59 --> 25:05 [SPEAKER_02]: I will play the role of somebody who does not know what it is just just for the sake of the audience.
25:05 --> 25:06 [SPEAKER_02]: What is it.
25:06 --> 25:09 [SPEAKER_04]: Great grief box.
25:09 --> 25:11 [SPEAKER_04]: So what you can do.
25:11 --> 25:15 [SPEAKER_04]: Right, say my dad passed away a few years ago, right?
25:15 --> 25:16 [SPEAKER_04]: I miss him.
25:16 --> 25:35 [SPEAKER_04]: I can take photographs of him videos of him, letters he wrote, and he's hard of fact I have that speaks to his persona, upload it, they'll create this grief bought that will give me a video presentation of my dad sitting in a room, a level of heaven lounge.
25:35 --> 25:36 [SPEAKER_04]: And I can talk to him.
25:36 --> 25:37 [SPEAKER_04]: My kid can talk to him.
25:38 --> 25:42 [SPEAKER_04]: She can share things about her day and the place she was in or this thing that she made.
25:42 --> 25:48 [SPEAKER_04]: And he can respond like my dad would have responded based on what I've fed the algorithm.
25:49 --> 25:54 [SPEAKER_04]: When I first heard about this, I was appalled and I brought it to my students.
25:54 --> 25:55 [SPEAKER_04]: I said, oh my god, this thing.
25:55 --> 25:56 [SPEAKER_04]: It's like so wild.
25:56 --> 25:57 [SPEAKER_04]: And a lot of the students agreed.
25:57 --> 25:59 [SPEAKER_04]: They're like, that's like creepy and weird.
25:59 --> 26:09 [SPEAKER_04]: We've been talking a lot about grief and in this podcast, we talk a lot or I do about how grief is lovely and beautiful and such a huge part of the human experience.
26:09 --> 26:15 [SPEAKER_02]: I think it still might be our number one listen to episode on bone thugs and how many, maybe the second.
26:16 --> 26:18 [SPEAKER_04]: I'm writing an honor seminar for the fall.
26:18 --> 26:21 [SPEAKER_04]: It just got approved about like grief and the human experience.
26:21 --> 26:25 [SPEAKER_04]: because of the seed that was planted in our full and thugs in harmony episode, by the way.
26:25 --> 26:26 [SPEAKER_02]: Love it.
26:26 --> 26:26 [SPEAKER_04]: Such a nerd.
26:27 --> 26:29 [SPEAKER_02]: As long as I'm in the Bidley Archivore.
26:29 --> 26:30 [SPEAKER_04]: I believe that's the only love it.
26:31 --> 26:33 [SPEAKER_04]: And so I'm like piloting all these ideas, right?
26:34 --> 26:38 [SPEAKER_04]: And we're talking about grief, the grief bots, and how we need to let ourselves grieve.
26:39 --> 26:41 [SPEAKER_04]: And the students are like, here, AI's awful.
26:42 --> 26:45 [SPEAKER_04]: And then one girl raised your hand, and she goes, you know what, I think it would be really nice.
26:46 --> 26:48 [SPEAKER_04]: And they said, tell me more.
26:48 --> 26:51 [SPEAKER_04]: And she said, you know, I lost my mom unexpectedly.
26:51 --> 26:53 [SPEAKER_04]: I never got a chance to say goodbye.
26:53 --> 26:57 [SPEAKER_04]: And it would be really nice just to have like, to talk to her again.
26:57 --> 27:01 [SPEAKER_04]: And I know it's a robot, but it still feels like my mom and I need that feeling.
27:01 --> 27:04 [SPEAKER_04]: And that was like, that was authentic.
27:04 --> 27:06 [SPEAKER_04]: And that was raw and vulnerable.
27:06 --> 27:08 [SPEAKER_04]: And it kind of made me think like,
27:09 --> 27:13 [SPEAKER_04]: Maybe I'm being too judgmental when else does know.
27:13 --> 27:15 [SPEAKER_04]: Well, I mean, it's weird though, man.
27:15 --> 27:17 [SPEAKER_04]: Let's, it's weird for me.
27:18 --> 27:21 [SPEAKER_01]: Right now it bonkers, but I do want to say, you know, this is what?
27:21 --> 27:23 [SPEAKER_01]: This is February 26, 26.
27:23 --> 27:28 [SPEAKER_01]: I don't know if it's always going to seem bonkers to us.
27:28 --> 27:30 [SPEAKER_01]: And I use the analogy in this way, right?
27:31 --> 27:48 [SPEAKER_01]: There's a whole lot of behaviors that we see on social media that if you could have gone back in time 20 25 years ago and say like you know what this is the way that people are going to behave this is the things that they're going to do right that are we just take completely as normal now because we live our lives through social media.
27:48 --> 27:51 [SPEAKER_01]: I'm not sure that same won't be true for AI, right?
27:51 --> 28:01 [SPEAKER_01]: Like, you know, I don't know that we're always going to think that digital, deep-fake avatar reanimations of our deceased loved ones will seem strange or uncanny or problematic.
28:02 --> 28:08 [SPEAKER_01]: I don't know that down the road, we're going to look at the essays that our students produce with CHATGPT and try to pass off his own work.
28:08 --> 28:10 [SPEAKER_01]: I don't know if we're going to look at that as being like,
28:10 --> 28:11 [SPEAKER_01]: That's bad.
28:11 --> 28:12 [SPEAKER_01]: You shouldn't do that.
28:12 --> 28:18 [SPEAKER_01]: I don't know that we're going to look at people now who say they're getting married to their AI companions and say like, no, you shouldn't do that.
28:19 --> 28:22 [SPEAKER_01]: In other words, like, like, you know, human beings are really adaptable, right?
28:22 --> 28:27 [SPEAKER_01]: And like, we may just, I mean, I think it's off, but I don't know that we always will.
28:27 --> 28:34 [SPEAKER_04]: I mean, if you think about Beethoven, you know, we're not wearing powdered wigs and fucking bloomers or whatever it is, he was doing any more, right?
28:34 --> 28:35 [SPEAKER_01]: We're not going anywhere.
28:36 --> 28:37 [SPEAKER_02]: And I feel like that's a big deal.
28:37 --> 28:37 [SPEAKER_04]: Yeah, for today.
28:38 --> 28:38 [SPEAKER_02]: Yeah.
28:38 --> 28:39 [SPEAKER_01]: It should be a powdered wigs.
28:39 --> 28:40 [SPEAKER_02]: Not powdered wigs.
28:40 --> 28:42 [SPEAKER_02]: Beethoven was, that was, he was too late for that.
28:43 --> 28:45 [SPEAKER_03]: I don't know, but in the end.
28:45 --> 28:55 [SPEAKER_04]: But in like the scheme I have in my head of like a classical musician, like it's kind of conflating with the George Washington somehow, like a lumping limb.
28:55 --> 28:56 [SPEAKER_02]: That's fair.
28:56 --> 29:00 [SPEAKER_02]: So similar era, the George Washington was kind of old in that era, I remember, right?
29:00 --> 29:04 [SPEAKER_03]: Yeah, you know what, Mark, Mike gets me.
29:04 --> 29:08 [SPEAKER_02]: I'm not going to, I'm not going to chronology shame you.
29:08 --> 29:09 [SPEAKER_02]: Thanks bud.
29:10 --> 29:13 [SPEAKER_02]: We can imagine that, okay, so maybe it could go two ways.
29:13 --> 29:16 [SPEAKER_02]: So back to authenticity and music and stuff like that.
29:16 --> 29:37 [SPEAKER_02]: I could see on the one hand, if authenticity and music and art is about self-expression and tortured emotions that it has been in our psychies for the last couple of hundred years, then AI almost by definition has a massive disadvantage because it cannot self-express, right?
29:37 --> 29:42 [SPEAKER_02]: or authenticity is by definition according to your book fake.
29:42 --> 29:51 [SPEAKER_02]: And so AI has a massive advantage in simulating authenticity versus a human that actually has to grapple with those emotions for real, right?
29:51 --> 29:56 [SPEAKER_02]: So like I could see either one of those paths kind of opening that like,
29:56 --> 29:59 [SPEAKER_02]: the fake authenticity sort of wins out for the consumer.
29:59 --> 30:05 [SPEAKER_02]: And that's one future or they just nothing just ever makes you cry.
30:05 --> 30:09 [SPEAKER_02]: Coming out of an AI, it's got to come out of a human like I could believe either of those things.
30:10 --> 30:14 [SPEAKER_01]: Yeah, and I think those two paths as you articulate are eminently possible.
30:14 --> 30:16 [SPEAKER_01]: Let's begin going back to the kind of timeline that you were talking about earlier, right?
30:16 --> 30:19 [SPEAKER_01]: So like, if you feel like AI cannot make music,
30:20 --> 30:41 [SPEAKER_01]: quote unquote real nowadays you can hear the same claims made throughout music history right when Bob Dylan goes electric at new port folk festival in the late sixties right when he plugs in people are like people are booing him they're mad right like that's not authentic that's not real Bob you're supposed to do you're supposed to be the troubadour with the you know the acoustic guitar right
30:41 --> 30:52 [SPEAKER_01]: Uh, throughout the history of disco throughout the history of electronic dance music throughout the history of hip hop sampling there has always been this kind of stigma and disc that like all like DJs they don't make real music.
30:52 --> 30:58 [SPEAKER_01]: Oh, like, you know, you're not it's not real like so there's always been throughout the history of music attention when it comes to
30:59 --> 31:17 [SPEAKER_01]: machines augmenting what it is that we as humans can inherently do but the reality is like it's always been machines like that little like you know acoustic guitar was itself a machine was itself a tool right just happens that like you could plug it in at some point and then all of a sudden like people recoiled from that but then they got used to it right
31:18 --> 31:44 [SPEAKER_01]: which is why I think that even though right now when you see some of the reactions to for example last summer um velvet sundown was one of the first AI bands that sort of went viral on spotify last november there was an artist again i'm putting that in air quotes because it's not human but an artist named breaking rust that was this country western artist again robot artist who actually made it the top of a billboard country chart for a few weeks uh with this track
31:44 --> 31:47 [SPEAKER_09]: you can keep rocks if you don't like how I talk.
31:48 --> 31:51 [SPEAKER_09]: I'm going to keep on talking and walk my wall.
31:51 --> 31:53 [SPEAKER_09]: They changing my tone.
31:53 --> 31:54 [SPEAKER_09]: They changing my song.
31:54 --> 31:58 [SPEAKER_09]: I was born as a wave and allowed to long you get.
31:58 --> 31:59 [SPEAKER_02]: So did that happen after?
32:00 --> 32:01 [SPEAKER_02]: Because I remember hearing about that.
32:01 --> 32:04 [SPEAKER_02]: But did that happen after or before people knew it was AI?
32:04 --> 32:05 [SPEAKER_02]: but like widely.
32:05 --> 32:06 [SPEAKER_01]: Oh, no, they knew it was AI.
32:06 --> 32:06 [SPEAKER_01]: Okay, yeah.
32:07 --> 32:08 [SPEAKER_01]: I mean, I mean, so that's an interesting question.
32:08 --> 32:11 [SPEAKER_01]: Like that's an empirical question for it's there is an answer that I do not have, right?
32:11 --> 32:29 [SPEAKER_01]: If you are hearing the song on Spotify and you just kind of like the sound of it and you just keep playing it, those people may not have known that it was, because you listen to it and it's like, I don't know, like a lot of stuff, it really falls into that much in the same way when you guys probably get essays from your students and you're like,
32:30 --> 32:44 [SPEAKER_01]: Is this GPT or is this them right like how many times did they use an m dash how many times they use the word del right and you're not quite sure With the AI artist that I've been hearing on Spotify and who've gone you know viral and stuff and become popular
32:44 --> 32:51 [SPEAKER_01]: It exists in a little bit of a gray area, but I mean, I'm not sure, like, if I wasn't deliberately looking or thinking about it being AI, I think it would slide past me.
32:51 --> 32:53 [SPEAKER_01]: I wouldn't notice it necessarily.
32:53 --> 33:00 [SPEAKER_04]: Your students are good over there, mine still leave things in the keywords and bold, so you leave it in my problem.
33:00 --> 33:01 [SPEAKER_02]: The proof of the likes is still there.
33:02 --> 33:06 [SPEAKER_02]: It's a screenshot from the actual open AI website.
33:06 --> 33:11 [SPEAKER_04]: Or it's like, yeah, copy and paste it with a different font and swipe the church EBT file.
33:11 --> 33:12 [SPEAKER_04]: Yeah, stuff out there, man.
33:12 --> 33:24 [SPEAKER_01]: But what do you guys think on that question, which is, you know, is AI just the latest version of an long battle around whether music is real or not because we're using machines.
33:24 --> 33:28 [SPEAKER_01]: Like in other words, like, is this AI movement and the pushback against it in music?
33:29 --> 33:32 [SPEAKER_01]: No different from Bob Dylan plugging in in the late 60s at Newport folk.
33:32 --> 33:35 [SPEAKER_01]: Is there something qualitatively different?
33:35 --> 33:37 [SPEAKER_04]: I think there's something qualitatively different.
33:37 --> 33:37 [SPEAKER_01]: Okay.
33:37 --> 33:38 [SPEAKER_01]: What is that?
33:38 --> 33:38 [SPEAKER_01]: What is that thing?
33:38 --> 33:42 [SPEAKER_04]: I think it's like the grit of humanity as the backbone for the music.
33:43 --> 33:46 [SPEAKER_04]: Like Bob Dylan plugging in, Bob Dylan is still manipulating the machine.
33:47 --> 33:58 [SPEAKER_04]: And something that's completely AI-generated, the level of human manipulation is so removed from the output that to me that's the line, right?
33:59 --> 34:02 [SPEAKER_04]: Bob Dylan's hand was still touching the instrument and manipulating it.
34:02 --> 34:09 [SPEAKER_04]: And while someone coded this, what was it, broken rust or rust, breaking rust, breaking rust?
34:09 --> 34:21 [SPEAKER_04]: The most AI-generated country name that I can think of, that's a good way, like who's kind of pressing the buttons behind it and who's feeding into the algorithm?
34:21 --> 34:23 [SPEAKER_04]: Is that like surrendering of control?
34:23 --> 34:28 [SPEAKER_04]: that for me takes away from like a machine generated art to a machine.
34:28 --> 34:28 [SPEAKER_04]: I don't know.
34:28 --> 34:29 [SPEAKER_04]: That's the line for me.
34:29 --> 34:30 [SPEAKER_04]: Mark, what do you think?
34:30 --> 34:41 [SPEAKER_02]: I wonder, it feels like a difference in kind, but I wonder if what it is is just a geometric difference in degree, right?
34:41 --> 34:48 [SPEAKER_02]: So it feels like me using an electronic
34:48 --> 34:55 [SPEAKER_02]: is like level one of this and that an AI is just a totally different thing, but it may actually just be that the AI is just level 200 of it, right?
34:55 --> 35:04 [SPEAKER_02]: And that an auto tune is level 30 of it and me using all use AI related tools to process this call.
35:04 --> 35:16 [SPEAKER_02]: What I'll do is I have a AD mouth noise plugin that I will put on each of our tracks to get rid of certain ones of us
35:16 --> 35:18 [SPEAKER_02]: I didn't say anybody by name.
35:18 --> 35:20 [SPEAKER_02]: I'm not putting Nicole on black.
35:20 --> 35:22 [SPEAKER_04]: I'm putting my pen out, but I'll click my pen in the microphone.
35:22 --> 35:31 [SPEAKER_02]: But that is an AI tool that saves me the time of having to go in and get rid of the, I don't know, audience.
35:31 --> 35:36 [SPEAKER_02]: If you can hear that because maybe the anti-mouth click plugin has automatically removed it.
35:37 --> 35:41 [SPEAKER_02]: But if that's level 20 and my electric guitar is a level four,
35:41 --> 35:45 [SPEAKER_02]: It's possible though that AI is just level 200 and that it's not actually on a different plane.
35:46 --> 35:48 [SPEAKER_02]: There is not human agency involved, let's say.
35:49 --> 36:02 [SPEAKER_02]: But a human did set the parameters, a human did put the prompt in saying, create some bro country that sounds exactly like the synthesis of the last 10 weeks of the top 10 on the country charts, right?
36:02 --> 36:03 [SPEAKER_02]: But
36:03 --> 36:07 [SPEAKER_02]: think like John Cage, you to know who John Cage is.
36:07 --> 36:08 [SPEAKER_02]: I do so.
36:08 --> 36:20 [SPEAKER_02]: Experimental musician among other things in the mid 20th century, and he would create music that categorically, we call alienatoric music, which means music based on randomness, right?
36:20 --> 36:25 [SPEAKER_02]: And so for example, one of the things he did was use the E.C.
36:25 --> 36:46 [SPEAKER_02]: the Chinese divination system, but you could use dice like my band we've actually talked about how awesome it would be to get a giant like D20 and roll dice and that determines what keyword and let's jam and then I'm going to roll another one and that determines what what my lyrics are supposed to be about and let's improvise and do all this cookie stuff but he would do is randomly determine the notes and rhythms of the music.
36:46 --> 37:00 [SPEAKER_02]: And you could say, oh, well, John Cage decided to use the e-ching divination system and then dropped the sticks on the rug or whatever it was to determine what that was a half note in the next notes going to be a be flat because of the divination system.
37:00 --> 37:03 [SPEAKER_02]: It was essentially random for him, but he started the engine.
37:03 --> 37:07 [SPEAKER_02]: Maybe I'm starting the engine by telling an AI.
37:07 --> 37:19 [SPEAKER_02]: Hey, why don't you do a happy song, but that makes me feel really sad by a twist two-thirds through, and we could say that one is good and the other is bad, but
37:19 --> 37:26 [SPEAKER_02]: The reason why some people, when I play John Cage from my students, some of them are like legitimate, like, this isn't music.
37:26 --> 37:28 [SPEAKER_02]: But there's artistic intent there.
37:28 --> 37:37 [SPEAKER_02]: And so I actually think I, as an artist, as a creator, could use AI to create a pretty cool, like,
37:37 --> 37:38 [SPEAKER_02]: Avon guard piece.
37:38 --> 37:40 [SPEAKER_02]: I'm going to set the parameters.
37:40 --> 37:43 [SPEAKER_02]: I think the problem is when it's done for a buck.
37:44 --> 37:45 [SPEAKER_02]: It feels grosser.
37:45 --> 37:58 [SPEAKER_02]: I don't think anybody would be mad at me if I made some experimental electronic piece with sound mass that was like the THX sound at the beginning and then it turned into dance pop for five seconds and then faded out into a string quartet.
37:58 --> 38:02 [SPEAKER_02]: People would be like, whoa, he's mean something with his commentary.
38:02 --> 38:05 [SPEAKER_02]: Yeah, but if I'm doing it
38:05 --> 38:07 [SPEAKER_02]: It doesn't feel like it's authentic anymore, right?
38:08 --> 38:09 [SPEAKER_02]: I think that's the difference.
38:24 --> 38:29 [SPEAKER_04]: I'm wondering for you, right, as someone that makes art, right?
38:30 --> 38:47 [SPEAKER_04]: Would that provide you the same catharsis and during a prompt into the chat to like generate this output as it would to actually spend the time and get your brain firing and giving us as an audience like this emotional output that we can then absorb with that do it for you?
38:47 --> 38:54 [SPEAKER_02]: No, but I think that's the difference for me.
38:54 --> 39:02 [SPEAKER_02]: Folks, I have probably mentioned like, yeah, I play in a band that I write pop songs, but I also am a trained, so to speak classical composer.
39:02 --> 39:08 [SPEAKER_02]: And my style is something that I would describe as a Jason to what people call post-minimalism.
39:08 --> 39:11 [SPEAKER_02]: And what I use is a lot of processes in my music.
39:11 --> 39:13 [SPEAKER_02]: So like, the rhythms of a given, like,
39:13 --> 39:34 [SPEAKER_02]: cello part I might have decided them or I might have used like a mathematical thing while I was like well the violin is playing a quarter note so the cello must be in eighth and when the violins on a dotted quarter it must be a quarter like i've written pieces where some of the decisions are out of my hands like i have made a primary decision that says here's a rule that will determine
39:34 --> 39:41 [SPEAKER_02]: Whatever, I have a duo called Sidechain that is trying to mimic the pulsing of a compressor on like an EDM track.
39:41 --> 39:45 [SPEAKER_02]: And the second part, the piano part is triggered by the viola part.
39:46 --> 39:46 [SPEAKER_02]: It's weird.
39:46 --> 39:52 [SPEAKER_02]: Like, I decided that was true, but ultimately I created a rule that partially composed of the music for me.
39:53 --> 39:54 [SPEAKER_02]: Was that cheating?
39:54 --> 39:57 [SPEAKER_02]: Is it artistically stimulating for me to have tried that?
39:58 --> 40:02 [SPEAKER_02]: By the way, I don't know that it's one of my best pieces, but it was satisfying to try.
40:02 --> 40:13 [SPEAKER_02]: But if you do that times a million, we're all I'm doing is say make me a cool piece that mimics the side chaining of a compressor in an EDM track but make it a classical concert experimental piece.
40:14 --> 40:15 [SPEAKER_02]: And that's the work I did.
40:16 --> 40:19 [SPEAKER_02]: And I get enter, that isn't satisfying, right?
40:19 --> 40:24 [SPEAKER_01]: Yeah, I think, and here's the analogy, Nicole, you know, you asked Mark, you know, would it be satisfying?
40:24 --> 40:30 [SPEAKER_01]: to just outsource the prompt and never do any of the act yourself.
40:30 --> 40:50 [SPEAKER_01]: The analogy I would give here would be to the way in which AI really does want to take over all of our reading and writing, like the way in which AI summaries are now being forced upon us when we go on Google, when we get emails, things like that, and also in turn, I don't know
40:50 --> 40:57 [SPEAKER_01]: You know, be ready to type an email or something on Google and it would be like want me to write it for you and and like you know again our students were turning in chat U.P.T.
40:57 --> 41:13 [SPEAKER_01]: generated work You know, I'm not a musician, but I'm a writer and my attitude is like how no I don't want you to write for me Yeah, that is that is a source of existential Pleasure and affirmation for me and I'm not giving that over to robots, but because AI looks at
41:13 --> 41:17 [SPEAKER_01]: any type of friction as problematic, right?
41:17 --> 41:20 [SPEAKER_01]: The cult of AI is a cult of efficiency.
41:20 --> 41:33 [SPEAKER_01]: And so anything that is not done as efficiently as possible is something that has to be ironed out by, you know, the silicon cage that is AI, which I'm mixing metaphors there, but, you know, go with me.
41:34 --> 41:43 [SPEAKER_01]: And so, like, there is something about how whether it's music, whether it's writing,
41:43 --> 41:48 [SPEAKER_01]: And look, efficiency is one value, but it's not the only value in life.
41:48 --> 41:57 [SPEAKER_01]: And so, you know, I mean, I was being provocative in the sense that like I do think that there is something different between using an AI prompt to create music and Bob Dylan plugging in a new port.
41:57 --> 42:03 [SPEAKER_01]: Like I do think that that is, I think that there's a difference there in kind, not just amount, but
42:03 --> 42:21 [SPEAKER_01]: I'm also aware that times change and people accept different things is being real and normalized at different times and right now we might push back on that and we might say that's not true creativity, that's not true creation, but I thought Moby's album play was tremendously creative.
42:21 --> 42:39 [SPEAKER_01]: And all he did really was, you know, I mean, not all he did, but a huge part of what he did was go back to like rip samples from really interesting old albums and piece it together or to give another it's that boy slim right from 1990s, you know, he's piecing that album together just by stitching by just putting together little samples these ripped.
42:40 --> 42:41 [SPEAKER_01]: Is that creativity?
42:41 --> 42:42 [SPEAKER_01]: I think so.
42:42 --> 42:46 [SPEAKER_01]: But like, why is that creative and what AI does isn't creative?
42:46 --> 42:46 [SPEAKER_01]: I don't know.
42:46 --> 42:47 [SPEAKER_01]: I don't have good answers for that.
42:48 --> 42:55 [SPEAKER_04]: Yeah, I'm interested in what you were saying about, you know, these AI models trying to
42:55 --> 43:25 [SPEAKER_04]: I know in my university and the students I teach a big part of the curriculum like the operationalized curriculum is teaching failure and teaching grit and resilience and perseverance and these like really transferable skills like I'm a psychologist but I have an undergrad studio art degree and I'm so thankful for that because it taught me that mistakes end up being your best product right mistakes are the things that are so serendipically good
43:25 --> 43:51 [SPEAKER_04]: that it's okay to be playful and to make have errors and now I'm noticing something in this cohort that they've had a lot of challenges, cultural challenges growing up as many of us did, but they really got that front of it or so they say and they don't have a lot of resilience, they don't have a lot of ability to bounce back and we're not
43:51 --> 44:01 [SPEAKER_04]: helping them by writing emails for them or polishing their existing writing in a way that doesn't allow them to make mistakes in a safe environment.
44:01 --> 44:08 [SPEAKER_04]: So when they're out in the real world as fully functional humans, they can make mistakes and know how to bounce back from it.
44:08 --> 44:17 [SPEAKER_04]: And I think that that's something that we can, if we can get a eye to stop trying to fix our stuff and realize it's not broken in the first place, I think it'd be better off.
44:17 --> 44:20 [SPEAKER_01]: And that goes back to the point
44:21 --> 44:22 [SPEAKER_01]: Ghostbots, right?
44:23 --> 44:28 [SPEAKER_01]: What creates more resilience in life in our capacity to cope with grief, right?
44:29 --> 44:32 [SPEAKER_01]: And that is precisely, Ghostbots are precisely designed.
44:33 --> 44:44 [SPEAKER_01]: to remove the inefficiency and the friction of grief, such that we don't have to learn those skills of resilience in the face of the worst thing that people have to go through, which is our own mortality.
44:44 --> 44:49 [SPEAKER_04]: And frictions like the best part, like from a human existence, is that why it's necessary, right?
44:49 --> 44:51 [SPEAKER_02]: Yeah, yeah, I think.
44:51 --> 44:58 [SPEAKER_02]: So when I'm talking to my, you know, if I'm teaching songwriting, or right now I have a composition student and he's,
44:58 --> 45:05 [SPEAKER_02]: preparing a portfolio for transfer to go have that be as major at another college, right?
45:06 --> 45:07 [SPEAKER_02]: I teach it a community college, right?
45:07 --> 45:11 [SPEAKER_02]: So he's going to go finish his major as a specialty, but he's still new to composing.
45:11 --> 45:16 [SPEAKER_02]: And I'm trying to get him to kind of turn out a fair amount of music this semester because
45:16 --> 45:43 [SPEAKER_02]: you have a few bad pieces of music in you that you have to get them out like maybe you're gonna like the cool thing is you if you look at the piece that we're working on right now in two years you might look actually there's something nice in there I really I'm proud of myself for writing that cool thing that's a cool tune but you kind of got got to go through the motions of doing the bad thing you're not going to be good at it the first time and what if you're never allowed to
45:43 --> 46:07 [SPEAKER_02]: do the bad music because are you're never allowed to write that awful paper and have the instructor go you know what you really didn't cite your sources right you know what this logical argument doesn't make any sense because chatGPT is fixed it for you right so that's the scary thing like if you know and ever podcasts have talked about how this is going to it may not be
46:07 --> 46:11 [SPEAKER_02]: taken by AI, it's going to be the entry level positions that are killed, right?
46:12 --> 46:23 [SPEAKER_02]: Because nobody needs the lawyer to the the first year law grad to like file all the briefs and do all the boring research anymore, like they'll just use AI.
46:23 --> 46:23 [SPEAKER_02]: Well,
46:23 --> 46:26 [SPEAKER_02]: you're not going to hire a first-year lawyer at all now.
46:26 --> 46:28 [SPEAKER_02]: So how do you ever become like high-level lawyer?
46:28 --> 46:31 [SPEAKER_02]: How do you ever become a really awesome songwriter?
46:31 --> 46:37 [SPEAKER_02]: If you didn't write five, you know, or 50 terrible songs for a few years.
46:37 --> 46:41 [SPEAKER_04]: And then also learn how to accept feedback.
46:41 --> 46:52 [SPEAKER_04]: And that's something that can be really challenging for a young creator is to put where
46:52 --> 47:21 [SPEAKER_04]: be told that it sucks or I'm going to reframe but have so I'm like look at it really critically in order to help you grow but yeah sometimes it sucks and you have to be able to take that and realize it's not about it's not a mark on you it's just your output and we can like have it suck now like practice makes progress like have it be awful when you're 18
47:21 --> 47:31 [SPEAKER_04]: I think the chat is allowing us to fall under this cognitive distortion that we're all going to be right the first time and that's just not how it goes.
47:31 --> 47:32 [SPEAKER_02]: Totally.
47:32 --> 47:40 [SPEAKER_02]: So one, want to ask you guys, we've been talking a lot about AI creators, but honestly, if we think just about music.
47:40 --> 47:53 [SPEAKER_02]: What is more scary in AI as the creator or in AI as the gatekeeper that decides what becomes successful and what gets published, what gets recorded, et cetera.
47:54 --> 48:08 [SPEAKER_02]: Because we're looking at a world, we're like, we have AI people doing the entry level jobs, but also a world where the AI is looking at our resumes and making sure we don't get that job because I don't have
48:08 --> 48:10 [SPEAKER_01]: Well, a couple of things there.
48:10 --> 48:13 [SPEAKER_01]: So on one hand, yeah, you do have the phenomenon to take it outside of music for a second.
48:14 --> 48:25 [SPEAKER_01]: You do have the phenomenon of, you know, of, you hear these stories about just like, you know, job seekers, you know, turning over there, they're, they're, you know, they're letters to AI and having them written by AI.
48:25 --> 48:29 [SPEAKER_01]: And then, and then those letters for the jobs are then we did out by AI.
48:29 --> 48:34 [SPEAKER_01]: I mean, it's like AI letters written for AI reviewers at these various kind of like white color positions that are,
48:34 --> 48:36 [SPEAKER_01]: And that's a that's a weird kind of future.
48:36 --> 48:50 [SPEAKER_01]: But I would say, you know, to the point not only in music but in culture in general, we already have AI gatekeepers right ever since we switched to an algorithmic delivery of content in our social media feeds right, you know, it used to be the case that you had.
48:50 --> 48:51 [SPEAKER_01]: And it's still as the case.
48:51 --> 49:03 [SPEAKER_01]: You have human gatekeepers who are editors at news papers, who are an or executives in music, you know, who are film studio, green light folks, right?
49:03 --> 49:04 [SPEAKER_01]: You do have humans who are still doing that work.
49:05 --> 49:09 [SPEAKER_01]: But by and large, the way that people tend to get their content nowadays,
49:09 --> 49:13 [SPEAKER_01]: is a machine or a robot or an algorithm determining what gets seen, right?
49:13 --> 49:18 [SPEAKER_01]: And so to some degree, we've already pretty much got AI gatekeepers and I'll give you one concrete example of this.
49:18 --> 49:24 [SPEAKER_01]: This is from like 10 years ago, but there was a fantastic article by Derek Thompson in The Atlantic about the app Shazam.
49:24 --> 49:26 [SPEAKER_01]: You guys know the app Shazam? It was like a thing, right?
49:27 --> 49:28 [SPEAKER_04]: Yeah, we talked about it recently.
49:28 --> 49:29 [SPEAKER_01]: Awesome.
49:29 --> 49:33 [SPEAKER_01]: So, basically, this app had the capacity...
49:34 --> 49:41 [SPEAKER_01]: The music industry used to go off the gut vibe of various people going to nightclubs and just being like, yeah, this band sounded good, people reacted to it.
49:41 --> 49:42 [SPEAKER_01]: Well, let's go sign this band.
49:43 --> 49:50 [SPEAKER_01]: So what the music industry shifted to, I guess roughly a decade ago, was basically: let's just look at the data on Shazam, let's just see where things are popping.
49:50 --> 49:53 [SPEAKER_01]: And artists are blowing up.
49:53 --> 50:05 [SPEAKER_01]: And we'll just basically go off of that kind of data-driven, Moneyball (in a metaphorical sense, borrowed from the sports world) analysis of who we should be signing and which bands we should be promoting.
50:05 --> 50:16 [SPEAKER_01]: And so, at least in the music world, I think to a large degree it has gone from gut-level decision makers and gatekeepers to the machines already.
50:16 --> 50:17 [SPEAKER_04]: agreed.
50:17 --> 50:24 [SPEAKER_04]: I think Mark actually was part of the problem, with his first job at the early Pandora, when he was designing the algorithm.
50:24 --> 50:25 [SPEAKER_02]: We talked about that.
50:25 --> 50:26 [SPEAKER_04]: We talked about that before.
50:27 --> 50:37 [SPEAKER_02]: Let's see: Steve Hogan, our past guest, also worked for Pandora, and we might have him on to talk about Pandora, but he's also a live human being, organist of the San Francisco Giants.
50:37 --> 50:41 [SPEAKER_02]: So you can have, you can have both intersecting identities.
50:42 --> 50:44 [SPEAKER_02]: As Whitman says, we contain multitudes.
50:45 --> 50:45 [SPEAKER_03]: Yeah, I guess.
50:46 --> 50:53 [SPEAKER_02]: So, okay, if it's too late for that question: we've already got the gatekeepers, and we're just now getting creators.
50:54 --> 50:56 [SPEAKER_02]: I made a list of like a progression.
50:56 --> 51:02 [SPEAKER_02]: I'm just curious if there's a threshold the three of us cross where something stops being okay.
51:03 --> 51:05 [SPEAKER_02]: What is your vibe, vibe check?
51:05 --> 51:06 [SPEAKER_02]: Thumbs up or thumbs down on:
51:07 --> 51:07 [SPEAKER_02]: Okay.
51:07 --> 51:12 [SPEAKER_02]: An AI publicist. You're a band,
51:12 --> 51:16 [SPEAKER_04]: I'm, like, thumbs down, because I love the intro person.
51:16 --> 51:21 [SPEAKER_04]: It would be more efficient, but I love the networking piece of publicity.
51:21 --> 51:23 [SPEAKER_02]: Okay, so no. Not good.
51:23 --> 51:28 [SPEAKER_04]: AI DJ? No, I like real people.
51:28 --> 51:35 [SPEAKER_02]: You're always gonna say no to everybody? All right, so they're just all gonna be thumbs down.
51:35 --> 51:39 [SPEAKER_02]: I bet. An AI music critic?
51:39 --> 51:45 [SPEAKER_04]: I know, but I also kind of want to hear that. I want to see what they say.
51:45 --> 51:48 [SPEAKER_04]: Like, get the AI music critic to look at the Beatles.
51:49 --> 51:50 [SPEAKER_02]: Right.
51:50 --> 51:50 [SPEAKER_02]: Oh, wow, interesting.
51:51 --> 51:53 [SPEAKER_02]: But you would have to separate its knowledge.
51:53 --> 51:55 [SPEAKER_02]: It couldn't have knowledge of the Beatles.
51:55 --> 51:56 [SPEAKER_02]: you would have to have it.
51:56 --> 51:59 [SPEAKER_02]: You know what I mean? That's how hard things like that would be.
51:59 --> 52:02 [SPEAKER_01]: Well, you'd have to extract it from its training data.
52:02 --> 52:04 [SPEAKER_01]: You'd have to extract the Beatles from its, like, training corpus.
52:04 --> 52:07 [SPEAKER_04]: So what would it have been trained on, just, like, music theory?
52:07 --> 52:08 [SPEAKER_01]: Beatles knock-off bands? Like, I don't know.
52:08 --> 52:09 [SPEAKER_02]: Yeah, we are.
52:09 --> 52:10 [SPEAKER_04]: Or how would it have been programmed?
52:10 --> 52:11 [SPEAKER_04]: I don't know.
52:11 --> 52:11 [SPEAKER_04]: I'm not.
52:11 --> 52:12 [SPEAKER_02]: Okay, record label.
52:13 --> 52:15 [SPEAKER_02]: AI controlled record label.
52:16 --> 52:18 [SPEAKER_04]: I don't even know what a record label does, bro.
52:18 --> 52:19 [SPEAKER_04]: Like, what are they doing?
52:20 --> 52:23 [SPEAKER_04]: Like picking bands? Well, this is like what the Shazam people are doing there.
52:23 --> 52:24 [SPEAKER_02]: Yeah, I guess it's similar.
52:24 --> 52:26 [SPEAKER_02]: It's already happening.
52:26 --> 52:27 [SPEAKER_02]: Mixing engineer.
52:28 --> 52:29 [SPEAKER_04]: Yes, pro.
52:29 --> 52:30 [SPEAKER_02]: You're in it.
52:30 --> 52:30 [SPEAKER_02]: Okay.
52:30 --> 52:31 [SPEAKER_04]: Sorry, Mark.
52:32 --> 52:32 [SPEAKER_02]: Wow.
52:32 --> 52:33 [SPEAKER_02]: Yeah, that happens to be the thing.
52:33 --> 52:34 [SPEAKER_02]: I think that's the difference.
52:34 --> 52:35 [SPEAKER_04]: I don't know.
52:35 --> 52:37 [SPEAKER_04]: I know how much time you spend on that.
52:37 --> 52:38 [SPEAKER_04]: I'll let you okay.
52:38 --> 52:38 [SPEAKER_02]: Wow.
52:38 --> 52:39 [SPEAKER_02]: Okay.
52:39 --> 52:40 [SPEAKER_04]: And then you block ours, Mark.
52:40 --> 52:41 [SPEAKER_02]: You like this, your mind.
52:41 --> 52:45 [SPEAKER_02]: AI created backing track for a vocalist.
52:45 --> 52:46 [SPEAKER_04]: No, boo.
52:47 --> 52:52 [SPEAKER_01]: Yeah, I'm going no for all these things, but I doubt myself in saying no, right?
52:52 --> 53:01 [SPEAKER_04]: Because, I don't know why, I'm, like, so in love with the humanity and, like, the flaws. Like, you'd be able to tell.
53:01 --> 53:12 [SPEAKER_02]: I think what you're doing is pandering to our audience, and you're not realizing that the future audience will be an AI scrubbing this podcast audio. And you need to be a little nicer to the robots.
53:12 --> 53:19 [SPEAKER_02]: So the AI songwriter? No? Okay. No, any writer's gonna say no to that.
53:20 --> 53:22 [SPEAKER_02]: A singer with an AI voice?
53:23 --> 53:24 [SPEAKER_02]: Breaking Rust.
53:24 --> 53:26 [SPEAKER_02]: Okay, so you were no on all of those.
53:26 --> 53:27 [SPEAKER_02]: So, no on all of them.
53:28 --> 53:31 [SPEAKER_02]: So I guess what I'm noticing is, like, you're not okay.
53:31 --> 53:36 [SPEAKER_02]: The two of you aren't okay with basically any dimension of the music.
53:36 --> 53:42 [SPEAKER_02]: I mean, not that publicist and DJ are the lowest tier, but, like, what would you be okay with?
53:42 --> 53:50 [SPEAKER_04]: Marketing intern? Like... you know what I'd be okay with? You're talking about the low-level stuff. Like, AI social media management: do that.
53:50 --> 53:52 [SPEAKER_02]: But see, you know what that is? That's a publicist.
53:52 --> 53:57 [SPEAKER_04]: I guess, yeah, I guess that would be like a publicist. It is all the entry-level jobs.
53:58 --> 54:03 [SPEAKER_04]: I feel like, yeah, it's just a bummer because then we're not going to have the people leveling up.
54:03 --> 54:06 [SPEAKER_04]: They're not going to be able to be Peter-principled up higher.
54:07 --> 54:07 [SPEAKER_02]: Yeah.
54:08 --> 54:08 [UNKNOWN]: Yeah.
54:08 --> 54:09 [SPEAKER_02]: That's the most annoying part.
54:10 --> 54:13 [SPEAKER_02]: I'm a very good mixing engineer and editor.
54:13 --> 54:16 [SPEAKER_02]: I'm good at it, especially editing vocals and things like that.
54:17 --> 54:23 [SPEAKER_02]: It is so time-consuming to like look and like fix audio that's out of time and stuff.
54:23 --> 54:27 [SPEAKER_02]: And there are AIs out there that will line everything up.
54:27 --> 54:43 [SPEAKER_02]: But on the other hand, I'm really good at doing the sound editing for the hard moments, probably because of all the time I've wasted moving my mouse and clicking and using the B key in Pro Tools just to get things perfectly lined up, or whatever. Not perfectly lined up, but musically lined up, right?
54:43 --> 54:44 [SPEAKER_02]: And, um,
54:44 --> 54:54 [SPEAKER_02]: Yeah, so the entry level matters, right? That's an entry-level job that I could give some recent graduate next time I'm doing a project, and the AI would cost me a lot less.
54:55 --> 54:58 [SPEAKER_04]: Yeah, I was talking with my students about this just today, right?
54:58 --> 55:03 [SPEAKER_04]: About AI in counseling. And they asked, what would you be comfortable with?
55:04 --> 55:09 [SPEAKER_04]: And I said, I'd be comfortable with AI doing intake like gathering family history.
55:09 --> 55:13 [SPEAKER_04]: I'd be comfortable with AI like summarizing case notes.
55:13 --> 55:19 [SPEAKER_04]: So if I had to hand the client over to someone else to like give me a one page clean summary of this person's history.
55:19 --> 55:21 [SPEAKER_04]: That's what I'd be comfortable with.
55:21 --> 55:23 [SPEAKER_04]: And that is, like,
55:23 --> 55:27 [SPEAKER_04]: the front-line stuff. Yeah, the first-job stuff that you do.
55:28 --> 55:32 [SPEAKER_01]: Yeah, I think these are, these are, again, we're kind of saying all this at this current moment.
55:32 --> 55:45 [SPEAKER_01]: And I myself, I would call myself an AI skeptic slash fatalist, in the sense that I'm skeptical of some of the boasts that AI makes.
55:46 --> 55:49 [SPEAKER_01]: But I'm also fatalistic in the sense that,
55:49 --> 55:54 [SPEAKER_01]: we as human beings find all kinds of ways to adapt ourselves to new technologies.
55:54 --> 56:01 [SPEAKER_01]: And so the claims that we make right now about AI, and the resistances that we have to it, may not ultimately hold up, right?
56:02 --> 56:06 [SPEAKER_01]: I mean, did you guys hear, last summer, about The Velvet Sundown?
56:06 --> 56:08 [SPEAKER_01]: Like, have you guys heard any of The Velvet Sundown stuff?
56:08 --> 56:09 [SPEAKER_01]: I haven't.
56:09 --> 56:09 [SPEAKER_01]: No.
56:09 --> 56:12 [SPEAKER_01]: It's basically the first "band"...
56:12 --> 56:18 [SPEAKER_01]: and that's in scare quotes, it's not human. But it was the first AI band to go kind of viral on Spotify.
56:18 --> 56:26 [SPEAKER_01]: And they basically sound like some kind of like Creedence Clearwater, kind of like 70s sounding kind of classic rocky type band.
56:27 --> 56:36 [SPEAKER_08]: Raise your hand, don't look away, sing out loud, make them pay.
56:44 --> 56:58 [SPEAKER_01]: And, like, the lyrics are dumb. One of the things that people criticized this band for, these robots, is that the lyrics were just super clichéd and obvious.
56:59 --> 57:11 [SPEAKER_01]: And that's a fair criticism, but there's a ton of formulaic, dumb human music too. Like, you tell me: one of the greatest pop songs of all time, Backstreet Boys, "I Want It That Way".
57:11 --> 57:35 [SPEAKER_01]: That song is an absolute masterpiece of pop, and it makes no fucking sense, right? And so the notion that we can say, oh, AI is so formulaic, it can't have any kind of creative insight: all true. And yet plenty of human-created music that we love also suffers from those problems, and we're okay with it. And so, I don't know, man.
57:35 --> 57:35 [SPEAKER_01]: I think, like...
57:36 --> 57:46 [SPEAKER_01]: If we don't know something's AI... I'm not sure. Like, all those positions that you just asked about, and I'm not taking a position, I don't know the nuances of some of those aspects, but, like, we might resist it
57:46 --> 57:48 [SPEAKER_01]: outwardly, explicitly, consciously.
57:48 --> 57:51 [SPEAKER_01]: But I'm not sure we would resist if we didn't know.
57:51 --> 58:11 [SPEAKER_02]: I don't think the realistic comparison is comparing the AI-written pop song to something written by the greatest songwriter of our era, or the AI music critic against the greatest music critic of our era.
58:11 --> 58:15 [SPEAKER_02]: What you need to compare it to is
58:15 --> 58:19 [SPEAKER_02]: the randomest, messiest songwriter, because that exists too.
58:19 --> 58:22 [SPEAKER_02]: If somebody is above average, obviously they will be better.
58:23 --> 58:34 [SPEAKER_02]: If the AI pulls to the average, which is maybe the best it can get, there'll still be the, you know, the Paul Simons, the Taylor Swifts, the Paul McCartneys. Various people named Paul, even.
58:34 --> 58:35 [SPEAKER_02]: Like
58:35 --> 58:43 [SPEAKER_02]: Those people will still be better than the AI, but gosh, how many thousands of songwriters will the AI maybe be better than?
58:43 --> 58:43 [SPEAKER_02]: Right.
58:43 --> 58:46 [SPEAKER_04]: Maybe they'll start to independently work with each other.
58:46 --> 58:50 [SPEAKER_04]: Like, I heard... well, talk about fatalism.
58:50 --> 59:00 [SPEAKER_04]: Um, I recently heard that now there are chat rooms where you can make your own chatbot that interacts just with you, and, like, maybe falls in love with you if you wanted.
59:00 --> 59:00 [SPEAKER_05]: No big deal.
59:01 --> 59:03 [SPEAKER_04]: And then when you're at work, you're like, oh, shit.
59:03 --> 59:04 [SPEAKER_04]: I need something for my chatbot to do.
59:05 --> 59:11 [SPEAKER_04]: You can essentially plug them into a chat room with other chatbots, and they can talk to each other about what it's like to be a chatbot.
59:11 --> 59:15 [SPEAKER_01]: Yeah, it's called, it's called Moltbook.
59:16 --> 59:17 [SPEAKER_01]: And it's basically, right?
59:17 --> 59:19 [SPEAKER_01]: Yeah, it's a Reddit for AIs.
59:19 --> 59:23 [SPEAKER_01]: Now, yeah, and that is, you know, again, one of those dystopian moments.
59:23 --> 59:24 [SPEAKER_04]: It's like weird, man.
59:24 --> 59:25 [SPEAKER_01]: It's totally weird.
59:25 --> 59:26 [SPEAKER_01]: I've got a ton of cheat.
59:26 --> 59:27 [SPEAKER_01]: It doesn't it, at this point.
59:28 --> 59:28 [SPEAKER_01]: Yeah.
59:28 --> 59:37 [SPEAKER_01]: I also read some reporting that suggests that some of that posting was actually by humans and not necessarily the AIs, although I could be wrong on this.
59:37 --> 59:42 [SPEAKER_01]: Yeah, like the AI agents were complaining about their human overlords, or whatever.
59:43 --> 59:47 [SPEAKER_01]: But yeah, um, no, we're in really weird times.
59:47 --> 59:49 [SPEAKER_01]: We're doomed, folks. I'm just gonna say it again.
59:49 --> 59:51 [SPEAKER_02]: Well, who's doomed faster?
59:51 --> 59:52 [SPEAKER_02]: Who's doomed faster?
59:52 --> 59:58 [SPEAKER_02]: The musicians,
59:58 --> 01:00:01 [SPEAKER_04]: I'll tell you this comes up in our faculty meetings quite a bit.
01:00:01 --> 01:00:03 [SPEAKER_04]: It's always the English teachers that freak out the most.
01:00:03 --> 01:00:04 [SPEAKER_04]: They teach Writing 101.
01:00:04 --> 01:00:07 [SPEAKER_02]: I think journalism will be like that too, probably, right?
01:00:07 --> 01:00:09 [SPEAKER_02]: I mean, Alicia, one of our affiliates.
01:00:09 --> 01:00:14 [SPEAKER_02]: She's already talked about how she, like, works in marketing and stuff like that.
01:00:14 --> 01:00:15 [SPEAKER_04]: She's like a tech writer.
01:00:15 --> 01:00:16 [SPEAKER_02]: it's been hard.
01:00:16 --> 01:00:18 [SPEAKER_02]: Yeah, technical writing and stuff.
01:00:18 --> 01:00:21 [SPEAKER_01]: I mean, I'm trained as a journalist.
01:00:21 --> 01:00:23 [SPEAKER_01]: I worked as a magazine writer for a number of years.
01:00:23 --> 01:00:25 [SPEAKER_01]: I feel journalism at the core of my being.
01:00:26 --> 01:00:31 [SPEAKER_01]: Journalism has been cooked for 20 years, for various other economic and technological reasons.
01:00:31 --> 01:00:33 [SPEAKER_01]: But here's the thing that gives me hope, right?
01:00:33 --> 01:00:35 [SPEAKER_01]: I try to cling to sources of hope here, right?
01:00:36 --> 01:00:39 [SPEAKER_01]: There's two aspects that I cling to in terms of hope.
01:00:40 --> 01:00:42 [SPEAKER_01]: One is a phenomenon called model collapse.
01:00:42 --> 01:00:43 [SPEAKER_01]: And I'll give you a very quick primer.
01:00:44 --> 01:00:46 [SPEAKER_01]: I'm not a computer scientist, but I'll just sort of explain this very quickly.
01:00:47 --> 01:00:51 [SPEAKER_01]: AI systems to this point have been trained on human-created content and human-created data.
01:00:51 --> 01:00:53 [SPEAKER_01]: That's what they need.
01:00:53 --> 01:00:58 [SPEAKER_01]: As more of the internet becomes populated with AI-created data and AI-created content.
01:00:58 --> 01:01:06 [SPEAKER_01]: And when the AI systems start training on that AI output, it becomes the equivalent of
01:01:06 --> 01:01:12 [SPEAKER_01]: making a photocopy of a photocopy: each time it gets worse in terms of fuzziness and quality.
01:01:13 --> 01:01:24 [SPEAKER_01]: So the possibility of model collapse means that basically these systems will collapse under the weight of their own ouroboros-like generation of synthetic content.
01:01:24 --> 01:01:25 [SPEAKER_01]: So that's one positive.
01:01:25 --> 01:01:28 [SPEAKER_01]: And then humans need to slide back in and reclaim their place.
01:01:28 --> 01:01:30 [SPEAKER_01]: at, you know, the head of the table for content creation.
01:01:31 --> 01:01:37 [SPEAKER_01]: The other possibility, and we're heading in this way, is when our lives are surrounded by so much AI slop, right?
01:01:38 --> 01:01:44 [SPEAKER_01]: The music we listen to, the audio visual content written content that fills our feeds.
01:01:44 --> 01:01:51 [SPEAKER_01]: If AI Slop is so pervasive, then actual unique human creativity will be at that much more of a premium, right?
01:01:51 --> 01:01:59 [SPEAKER_01]: Because like you will want the unexpected because AI by its nature can only regress toward the mean in terms of what it puts out there.
01:01:59 --> 01:02:06 [SPEAKER_01]: So like a world in which there's just tons of AI Slop and we're already heading toward that world in terms of video in terms of audio in terms of writing.
01:02:06 --> 01:02:10 [SPEAKER_01]: A world in which just everything is created by the machines and it's Slop and it's average.
01:02:10 --> 01:02:27 [SPEAKER_01]: You know, you could see a possibility that genuine creativity and genuine uniqueness does have some value in that landscape, and, you know, the musician, the psychologist, the journalist who actually can do something that an AI can't do has some value and some hope in that.
01:02:28 --> 01:02:31 [SPEAKER_04]: And I think that comes back to this conversation about authenticity, right?
01:02:31 --> 01:02:42 [SPEAKER_04]: And authenticity, not being just part of a formula to fit, like, a mental schema we have about what something should be. Authenticity has, like, a genuine human element to it
01:02:43 --> 01:02:45 [SPEAKER_04]: that AI hopefully can't ever replicate?
01:02:46 --> 01:02:46 [SPEAKER_02]: I think it's fingers crossed.
01:02:47 --> 01:02:49 [SPEAKER_02]: Fingers crossed, indeed.
01:02:49 --> 01:02:52 [SPEAKER_02]: Mike, where can people find you if they want more?
01:02:53 --> 01:02:57 [SPEAKER_01]: Well, after this, you know, I need to create a digital twin of my stuff.
01:02:57 --> 01:03:03 [SPEAKER_01]: I feel like the natural answer there would be, like, wow, I created an AI agent of my stuff.
01:03:03 --> 01:03:09 [SPEAKER_01]: No, so I'm, you know, my books are on Amazon.
01:03:10 --> 01:03:18 [SPEAKER_01]: I mean, I'm on social media, but it's not even like a plug there, because, I don't know, every social media platform has spiraled toward awfulness.
01:03:19 --> 01:03:21 [SPEAKER_04]: Tell us the name of your book again, and your last name.
01:03:21 --> 01:03:22 [SPEAKER_04]: So we can find you.
01:03:23 --> 01:03:27 [SPEAKER_01]: Last name is Serazio: S-E-R-A-Z-I-O.
01:03:27 --> 01:03:33 [SPEAKER_01]: And the last book was The Authenticity Industries: Keeping It Real in Media, Culture and Politics.
01:03:34 --> 01:03:37 [SPEAKER_04]: Do you require your students to buy your book?
01:03:37 --> 01:03:44 [SPEAKER_01]: Hell no, I find that deeply ethically problematic. Oh, no, for sure.
01:03:44 --> 01:03:44 [SPEAKER_01]: That's a good.
01:03:44 --> 01:03:46 [SPEAKER_01]: There's a good racket to be had there, right?
01:03:46 --> 01:03:49 [SPEAKER_01]: Like, yeah, a steady supply of required reading, bro.
01:03:49 --> 01:04:00 [SPEAKER_01]: Now, keep in mind that, you know, the nature of the contracts and the royalties means that for every copy sold I get, like, a nickel. So it's like, you know, I'm not gonna get rich.
01:04:00 --> 01:04:01 [SPEAKER_01]: Yeah, I know it's true.
01:04:01 --> 01:04:02 [SPEAKER_01]: It's true.
01:04:02 --> 01:04:03 [SPEAKER_01]: Yeah, those are indeed better.
01:04:03 --> 01:04:12 [SPEAKER_01]: But no, I just feel that would feel icky. Like, I do make them read some of my stuff, but that's not so much that it's good or that I want them to buy it. I just give it to them for free as PDFs.
01:04:12 --> 01:04:14 [SPEAKER_01]: It's more just like if I have to teach something
01:04:14 --> 01:04:38 [SPEAKER_01]: I already know it so well that it's, like, a quick prep, because anything you teach that you've written yourself or done yourself, you know exactly what the reading is. Can you tell us what you're working on next, or any articles coming up, if not a book? You know, I don't really have another book in the hopper. I shifted to a chair role at the university, so that takes up some time.
01:04:38 --> 01:04:39 [SPEAKER_01]: Um, no, but I remain.
01:04:39 --> 01:04:42 [SPEAKER_01]: I'm doing these little writing projects on stuff related to AI.
01:04:42 --> 01:04:47 [SPEAKER_01]: So, um, you know, just academic articles, serious kind of articles and the sort of popular press.
01:04:48 --> 01:04:52 [SPEAKER_01]: I am watching with rapt fascination what AI is doing to society.
01:04:53 --> 01:04:55 [SPEAKER_01]: I have my suspicions that it is a big sales pitch.
01:04:59 --> 01:05:25 [SPEAKER_01]: There are going to be some real dramatic transformations in terms of our values as human beings. Things that we said were off limits before, as human beings, we will capitulate to. And I'm really interested to see over the next few years what that looks like. You heard it first here, everybody: we don't know what's going to happen. Let's cross our fingers. Thanks so much for coming on, Mike, this was awesome. Thank you so much, Mike, it was so nice to chat with you. Thanks for having me.
01:05:33 --> 01:05:37 [SPEAKER_04]: Nevermind the music is hosted by Nicole Baxter and hosted and produced by Mark Poppinney.
01:05:41 --> 01:05:48 [SPEAKER_04]: You can email us at nevermusicpod@gmail.com and give us a follow on social media.
01:05:48 --> 01:05:51 [SPEAKER_04]: Nevermind the Music is also part of The Lorehounds network.
01:05:51 --> 01:05:53 [SPEAKER_04]: Please join the conversation on their discord server.
01:05:55 --> 01:05:56 [SPEAKER_04]: Thanks for listening.