Justin: [0:27] I'm Justin. I quit my job today, and my pronouns are he and they. Sadie: [0:33] I'm Sadie. I work IT in a public library, and my pronouns are they, them. Jay: [0:37] I'm Jay. I'm a cataloging librarian. My pronouns are he, him. Justin: [0:40] And we have a guest. Would you like to introduce yourself? Hagen: [0:43] Hi, I'm Hagen Blix. My pronouns are he or they. I'm a linguist and cognitive scientist, and I just co-wrote a book about the political economy of AI and AI fears together with my friend, Ingeborg Glimmer. Justin: [0:58] Welcome. It's a very quiet cheer, but I can't get it any louder. Welcome. Yeah, we had to reschedule you twice. I feel really bad, but I got caught in a blizzard and then I got sick. So we're finally talking about this book, which you were very generous to send to us. And I enjoyed it. I thought it was very easy to understand, and good for general audiences. I thought it was a really sad thing, too. I like when people talk about theory clearly, because I love the guys at Acid Horizon, but they talk in, not even paragraphs, but subchapters, and use so much philosophical jargon that I have no idea what's going on half the time. So I enjoy it when someone can actually talk about philosophy very plainly, because that's what we try and do here, right? Hagen: [1:43] Yeah, that's it. Yeah. I mean, you don't just theorize for the hell of it. That can be fun, but the stuff is there to help people make sense out of their own lives and figure out how we can all liberate ourselves, right? It's a collective enterprise. Justin: [1:56] Mm-hmm. So I'm going to warm us up a little bit, because something was posted on Reddit today about our libraries. So this guy's not even really an enemy of the pod. And I can't use the corn thing, because the automatic voice detector will screw up the transcript if I use corn scatting. So I've got to use something that's not...
Jay: [2:22] Or you don't have to use anything. Justin: [2:27] There's this guy called Juan David, and I ended up looking into him because I just got curious about him. But he made this thing called the Naperville Library Spy, which is a great non-menacing title, but it's like an unfiltered look at what books are being checked out across the Naperville public library, Justin: [2:46] And as far as I can tell, it's just a vibe-coded copy of the BiblioCommons discovery layer. And as far as I can tell, it looks like it is now broken, because I know someone in one of the library Discords called the library and was like, hey, this guy's scraping the shit out of your website. So it looks broken, because you can see there's this pulse image, and it's now flatlined. So I think either he shut it down because people yelled at him, or someone else blocked his access. Jay: [3:26] It's interesting in theory. Justin: [3:28] It's a strange idea. I don't think it's an actual privacy problem. Jay: [3:35] I don't either. Justin: [3:35] But, I mean, there definitely are concerns I would have, in terms of, like, if someone did try and scrape a bunch of public library stuff to see, like, books for challenges, I can understand how this would be a pain in the ass. But this guy just makes slop. And you can go to his website, because obviously it's like Juan David Campo Largo, and you can just go slash projects. And so I just went to his projects, and he went to UIUC. Jay: [4:04] Oh, no.
We have an alma mater in common. Justin: [4:07] He made the talk show there, and the thing is, it's a good little student talk show. They interview the professors and talk about stuff and make YouTube videos. It's not bad. But everything else he seems to make is just... he made this thing called Pico Masala: How I Built a Restaurant Empire and Then Gave It Away for Free, and it's all clearly written by AI, all the art's done by AI. It's for this restaurant someone approached him to make branding for, and then they just didn't want to work with him anymore, probably because, you know. He says, it's detailed, honestly, maybe too detailed, but I wanted to write down everything needed to make this a standout business that people actually care about, remember and talk about. And then at the bottom, it says, two friends of mine made a podcast episode about this master plan. You can listen to it here. And it's so clearly the NotebookLM podcast voices that he just uploaded all this shit that he made into NotebookLM and hit the make-a-podcast button. Hagen: [5:08] I've been joking about how we had this moment where people were like, oh, the LLMs have finally passed the Turing test. And now we're all so deeply familiar with the voice of slop that it has just unpassed the Turing test again. Justin: [5:23] I really wish the text-to-speech that they're using for NotebookLM was available on its own, because it's great. It's really good. I would love it to read websites to me in a natural voice. Instead, the only thing you can do with it is upload books to it and ask it to give you a dipshit podcast about them. And the insane thing about NotebookLM, the amount of compute that goes into this stuff just boggles my mind. Because I threw an audiobook in there, and it transcribed it very quickly and made a full podcast summary of it. And I'm like, that's too much compute. I know how compute-intense this stuff is.
I've tried to run stuff on my computer using Llama, and the stuff that this is doing is burning electricity and GPUs. It's too much, man. It's too much. To be making Pico Masala, One Bite, Two Hearts. And a machine that spies on the local public library for no reason. Hagen: [6:34] Because libraries are famously very secretive about information. Hagen: [6:38] You need to put spies in there to figure out what kind of information is hidden in the library. Jay: [6:43] Like, I am so shocked that Project Hail Mary is one of the top titles, considering the movie's coming out. And that dipshit book The Let Them Theory. It's just all a bunch of mass-market fucking Patterson, whatever the hell his name is, and then self-help books. That's all it's gonna be. James Patterson, that's who it is. It's James Patterson, yeah, the bane of library Sadie: [7:10] workers everywhere. Yeah. Jay: [7:12] It's, like, nothing good. Justin: [7:16] He also, I think, made a version of UIUC's catalog as well. So he just scraped their catalog and made his own catalog. Jay: [7:24] Oh, no. Oh, it's so... That's such a huge... That's such a big library. They also... Justin: [7:30] No, their course catalog. Jay: [7:32] Oh, I was like, what? Justin: [7:35] Yeah, no. It looks okay, I guess. I mean, it's an interesting idea. You could just take your library collection, dump this data, and let people make little websites out of it. But why is he scraping his own university's catalog to make his own university catalog? Is it that bad? Is it that hard to use? Jay: [8:02] This library spy thing really shows how useless the Library of Congress genre/form terms are, though. The top genre: novels. Wow. So useful. I'm glad we assigned that to stuff. Graphic novels is a little more useful. The top subject is Friendship, juvenile fiction.
And then Humorous stories, and then Schools, juvenile fiction; Picture books, juvenile fiction; Dogs, juvenile fiction. It's all juvenile fiction, except for Man-woman relationships, fiction, where I'm guessing that's the romance books, like the Colleen Hoover and stuff. Justin: [8:39] Wait, were you seeing this on the library spy page? Jay: [8:41] On the radar tab. It tells you the top titles, as well as the top formats, the top subjects, and the top genres. Justin: [8:52] Oh yeah, a board book. Wow. Jay: [8:53] Wow. Sorry. Sadie: [8:54] I just looked at this dude's project page for the library spy, and it's... I couldn't believe libraries were a thing. Which, fair, he apparently is not American, and American libraries are kind of unique. But also: I couldn't believe books could be interesting. Yeah. Hagen: [9:11] Well, and then the thing that is interesting about books is clearly the metadata, not the books. Sadie: [9:16] Not the books. But, just, okay. Justin: [9:21] He's such a hustle-core, grind-mindset, bone-maxing kind of dude. Jay: [9:27] This is like every dude I've ever encountered in the PKM space. Uh-huh. Justin: [9:34] Yeah, see, could PKM have saved him? Could personal knowledge management have given him something else to do with his Jay: [9:40] time? It would have made him worse. Justin: [9:43] Well, anyway, I think he just needs a hobby that's not vibe coding all day. Sadie: [9:50] I mean, reading is right there. He could just do some of that without vibe coding about it. Hagen: [9:56] Now that he knows that libraries are a thing, it should be pretty accessible as a hobby. Justin: [10:03] Profitable, affordable housing. Yeah, anyway, this guy's... he's just a type of guy, and it's very funny to me, but I've thought about him too much now. Okay, new.
Sadie: [10:14] Type of guy just dropped. Justin: [10:15] Not a new type. Public library... it sounds like dudes you would tell me about when you were in college, Sadie, and you're like, this guy fucking sat next to me again and kept telling me about Bitcoin for an hour. Sadie: [10:27] Oh God, that guy. I forgot about that. Justin: [10:35] Anyway, so, Hagen, you wrote a book called Why We Fear AI, and let me get the subtitle, On the Interpretation of Nightmares: Fears about AI tell us more about capitalism today than the technology of the future. And I like how you talk about our imaginations quite a bit. But to make this easy on me and to keep us on theme for the podcast, why should library workers read this book? What do you think they'll come away with from it? Hagen: [11:08] I think a significant part of the book is about the politics of knowledge: what do we want from an emancipatory, liberatory politics of knowledge? To me, libraries are a very central theme there, a very utopian idea of knowledge that's accessible and useful to all, made in a way that people can do their own thing with. And I think what we see in AI is, in some ways, an alternative project for how to structure knowledge, how to make knowledge accessible or inaccessible. In my mind, one of the things that we go into a lot in the book, especially in the second half, is AI as a special form of privatizing knowledge, right? They've scraped everything on the internet, whether it's our message boards or all the books that they downloaded from Library Genesis or wherever, right? But they're trying to make this thing into a private kind of thing that you have to subscribe to, or get served ads to use, which... that's a lovely fact about the library: they don't put a bunch of ads in your face. Hagen: [12:15] Right.
And there's a sense in which that is a use of knowledge, when it's in Hagen: [12:20] the machine, that is, in my mind, used to devalue people's skills. I think that's ultimately the economic purpose of most AI things: not to increase productivity or make knowledge available or whatever. It is to replace people who are coders with people who have a six-week training course in vibe coding. It is to transform people who are logo designers into people who have to fix AI slop. So the point is to make people into appendages of these kind of weird new knowledge machines, where they don't have to be paid well, where they can be transformed into gig workers, etc. And I think that's a very radically different vision of how knowledge should work than the one that something like a public library system presents. So I think that contrast is very useful for making clear what the role of knowledge, of books, of all these kinds of things is in a society, and what kind of society corresponds to one vision of knowledge versus the other. Justin: [13:21] I like the comparison you started to make later in the.
Justin: [13:30] Structure, in the pyramid of power, right? So knowledge for the professional worker both justifies their position in the hierarchy, even though they understand that intelligence is not a real thing. They know their boss isn't necessarily more intelligent than they are, but they are supposed to think that they are more intelligent than other people, and that's why they're allowed to be in that position. It's like a double mind. But knowledge, you talk about it as this thing that is essentially proprietary: you are allowed to know this much about your job. And the dream is, the more knowledge we can keep away, we can then sort of lower the power hierarchy, flatten the pyramid of power, so that there's really only the people who own all of the useful knowledge, and everyone else who has to access it without really understanding it, without having that mastery themselves. So you use knowledge as sort of a way of measuring power. I think this would be a useful way to talk to students about it, because I try to explain to them: most of the stuff in our databases is not openly available. You will not have access to this much information for the rest of your life, right? This is very expensive information that we have in our databases. And then you could also connect that to the kind of knowledge that it takes to be a professional in a job and be valued for that. Justin: [14:55] Like knowledge of the whole process that you work on. And if there was a way to take that away from people, so that there was no professional managerial class, there was only the lower working class and the owning class... I guess, when you talk about flattening the pyramid, that's what it sounds like the ultimate dream is. The CEO who controls all the knowledge, who has the machines that control the knowledge, and no one else gets to have it. Hagen: [15:18] Right.
Yeah, I think that really is the dream of capital, you know: can we centralize all this knowledge? One of the metaphors we play around with in the book is to say the intelligence of artificial intelligence should be the intelligence that we gather, you know, the intelligence of intelligence agencies, which is exactly about what kind of knowledge we can put on a need-to-know basis, Justin: [15:42] Right? Hagen: [15:42] And yeah, we've seen that if you look at the history of scientific management and Taylorism, on to all kinds of managerial schools today, there is a sense in which there's always been an attempt to say, can we centralize certain kinds of knowledge among management, so that management can tease out the steps in a labor process and say, oh, here's a part that can actually be done by someone who only needs a two-week training course. And we're not going to have an engineer do that. We're going to carve out the niche for the people who do need knowledge ever smaller and more specific, so that whenever you give special knowledge to a manager, you do it so that you can deprive other people of the bargaining power that Hagen: [16:24] they can derive from their knowledge. That's, I think, the core thing there, where we're talking about power. Power derived from knowledge can be the power of bosses and managers, or, if workers together have knowledge of the labor processes that can't be easily replaced, then they have the ability to go on strike, do a work stoppage, and challenge management, challenge the owners of companies, and that way derive power. Or if you're a very well-trained professional in a niche where labor is in short supply, where you have specific skills that are Hagen: [16:59] in high demand in the market, then you can maybe derive individual bargaining power, right?
So there's these different things in what we call the pyramid, the hierarchy of power, where knowledge functions differently in the economy, and then in the political power that people derive from their economic position, too, right? It's not an accident that if you're a billionaire, you can call whoever you want in the Senate or Congress, and if you're you and me, you cannot. Jay: [17:25] I think that's a really good thing to highlight: what is the problem that we have with AI, right? Because there's a lot of existential problems, of course, but even material problems, like the environmental issues, right? Theoretically they could come up with a way for it to not be as bad. It's going to be difficult, but there's probably, theoretically, technically a way to do that. So that can't be the only bargaining chip that we have to fight against this. And even the whole, oh, it steals things, it's a plagiarism machine. I'm like, yeah, but do you want more robust intellectual property law? Because that's how you get bad law. The problem with that even isn't plagiarism or, you know, I almost said free use, fair use, or anything. Fair use is good. We want a digital commons, right? The problem, again, is how this exists under capitalism and the way that it affects labor. And with plagiarism and fair use and all that, it's because that's a livelihood that's being hurt, because that is how someone is existing within capitalism. So I think this way you're framing it, of power relations and who's controlling power and everything, is really insightful. Justin: [18:54] And I guess I didn't structure my questions very well on how to make fun of myself.
But, like, why... I guess I wanted people to get an idea of the thrust of the book before going back and asking: why write this book, as someone who's trained as a linguist and a cognitive scientist? What got you personally interested in writing a book that, you told me, had a lot of the footnotes and stuff cut, right? It was very much made for a more general audience. So what was the path there? Hagen: [19:29] Yeah, I kind of got interested in language models a couple of years Hagen: [19:34] before all the big ChatGPT splash, etc. That was around the time of BERT. Linguists were occasionally taking note of this, and I got interested as a linguist. I was like, oh, here's suddenly a machine that can, back in those days, kind of produce something that sounds like English. You read it and you're like, this didn't make any sense, I have no idea what I read, but it did sound like English. And as a linguist, I was like, oh, I'm studying what kind of knowledge is involved in knowing a language. So I was like, oh, what's in this machine? And so I joined some projects. We looked at what kind of grammatical properties these machines actually represent well or not well. So initially, for me, it was a purely technical kind of curiosity. Hagen: [20:15] But I've also been politically active on the left for many, many years. And so a little while into that curiosity, my friend Ingeborg, who is a machine learning researcher who also got interested in these kinds of objects, and I were talking about how this looks like the industrialization of language production. What does that mean from the perspective of the political economy of knowledge production, language production, these kinds of things? And so we got to thinking about those political questions.
And that was kind of just us talking, trying to make sense out of the situation that we were finding ourselves in. But then ChatGPT happened. And together with ChatGPT, there was this explosion of stories that were always in this duality, you know, the promises and perils, the whatever. It was always the same kind of binary. And there was this, oh my God, maybe the machines are going to destroy us. Maybe the Terminator is going to be real. Maybe the Matrix is about to happen, right? So there was this very bizarre sense that the stories that used to belong to the realm of science fiction are now making it into very serious, normal, liberal newspapers like the New York Times or whatever. Hagen: [21:29] Time magazine had published a piece where Eliezer Yudkowsky said we should nuke data centers if the machines get too smart, and risk World War III over the risk of having a language model break out of containment or whatever. So there was a lot of strange things. And I think what we saw from the left was either no particular interest in that, or just a desire to ridicule these stories. And I understand that, but we had the feeling that there's something more interesting happening, that there's something to be said about why these stories resonate. What is it about these stories that makes people interested? Hagen: [22:08] Other than, well, stories about big explosions make people click on links. So we went in there, and we figured, oh, there's something to, let's say, a story about the Matrix coming true. There are things in the real world that get reflected in these stories that make them resonate, right? I think there's a very real sense in which even the billionaire class has a sense that they can't stop capitalism from causing climate change, right? They're just riding it out. They're hoping that their bunkers will hold when the time comes, right?
Even for them, they're already in a situation where they experience this as an unstoppable technological thing that's just happening, right? The oil wells aren't conscious. The oil wells haven't decided to come for us and make the planet uninhabitable for humanity. But there's a real sense in which there is... Hagen: [23:00] an out-of-control-technology sense, right? Even for the ruling class. But then there's also, you know, workers that get more and more surveillance, Hagen: [23:09] more and more automated control put on top of them. One of the stories that we talk about in the book is how Amazon is using video classification software in warehouses, right? So there's a sense in which personal relations of domination, like a manager telling you what to do, get supplanted by a machine-mediated power relation, right? The manager doesn't even tell you what to do anymore. At Amazon, sometimes the algorithm just fires you. The algorithm surveils you, checks whether you're fast enough, whether you make too many mistakes when stowing in the warehouses. And then the algorithm, at the end of the day, decides to fire 5% or whatever of the workforce. Hagen: [23:47] So there are people whose real lives are already under the control of machines. There's all these ways in which there are real things in the world that can, I think, be accurately characterized as people experiencing their lives as dominated by machines, in a way that is real, but where the domination has to be understood. If we want to change something about that situation, if we're like, this is a bad thing, the world shouldn't be like that, then we have to understand how the structure of that domination is related to capitalism.
And so we're like, we have to give an interpretation of these stories, or we have to give people tools to make an interpretation of their own situation, and their own feeling of why these stories maybe resonate or not, that help them make sense of the larger political structure, help them make sense of what it is that we need to change if we want that shit to go away. So we decided to, yeah, look into it, and try to write up both how we think one can find something useful for interpreting the actual world in these stories, and also just how we think the industrialized production of language will actually change how the world works. Justin: [24:54] Yeah. Why focus on fear as a framework? At the beginning of the book, I wasn't quite clear on it, and then towards the end, it made sense to me: okay, fear is this framework for understanding why these stories hit. But why fear, as opposed to, I don't know, some other emotion that the Matrix or Terminator brings out in us? Hagen: [25:19] Yeah, I mean, I think that was primarily driven by the sense that that's the... I mean, there's the weird booster euphoria of people who are just like, oh, it's just going to bring about the best possible world. You know, in the words of Sam Altman, it's so unimaginably good that you can't even talk about it. Hagen: [25:37] So I think there's that feeling, but it's just harder to say something interesting about that. I think a lot of that is just kind of boringly delusional. But there's many ways in which people's worries, people's fears, people's anxieties are reflective of a kind of tension that we're experiencing in this moment, where we're seeing this thing starting to unfold, but we also feel like it's certainly not come to its conclusion, right? It's a moment that we're living through that produces a lot of anxiety.
And I think when there's anxiety, there's always people who want to profit from it. There's always very deeply reactionary or fascist movements who are like, we can solve your anxiety, we'll tell you who's at fault, and, you know, they run with that. So fear can be a thing that motivates people to get together, to produce solidarity, or it can be a thing that produces fascist and reactionary impulses. And so giving people tools to make sense out of anxieties seemed very important to me. And then there was this thing that Ingeborg and I just experienced because of our own class position. I mean, I have a PhD in linguistics. I feel like in the last five years, literally everyone who hasn't gotten an academic job has started working for an AI company. Hagen: [26:59] So we wanted to also write something that addresses the anxieties that we have experienced in our own circle of friends, and that is very often people from the professional class. And we wanted to get them to look at some of these things that we talk about, especially in the later part of the book, where people want to cling to their sense that their position in the social hierarchy is justified. But sometimes, to properly cling to that leads you to certain kinds of necessary delusions. And precisely because these are machines that are about flattening the social hierarchy, they are certainly attacks on the privilege of many white-collar workers. And again, there's that sense where a middle class that's under attack tends to radicalize, and it tends to radicalize either to the left or to the right. And I'm like, I hope they radicalize towards the left. So the book was Hagen: [27:51] an attempt to engage with that, too. Justin: [27:55] The question of the book's title, Why We Fear AI: do you have a one-sentence answer yet when people ask you why we fear it?
Hagen: [28:04] I think we fear it because it's a weapon of class war from above. I think that's my one-sentence answer. Justin: [28:13] I like how a lot of the book was focused on explaining to professionals, because this is an automation that's coming at them, speaking directly to them and saying: why are you buying into this fear? And it's because, and you situate this in the labor hierarchy, right, with various theoretical ways of trying to talk about it: labor aristocracy, professional managerial class. But at least it explains to people: you're here, and you're justifying your position here, and that's why you're feeling anxiety. It's sort of like all the radicalizing things that people say happen when you get old. There are certain life milestones that are designed by society to make you reactionary. I feel like buying a house is designed to drive you insane, because all the money you will ever have in your life is in this one thing, and then suddenly your neighbor stops cutting their grass and you want to kill them, because you're like, I can feel myself losing money because my neighborhood isn't pristine the way I want it to be. Hagen: [29:17] Yeah. Or just: oh, the rent is going down in my neighborhood, but the value of a house or an apartment is proportional to the average rent in the neighborhood if you want to sell it again. So, same thing. Suddenly you're like, no, I want the rents to go up, because I have a 30-year mortgage. Right. Yeah, absolutely agree.
I think that was actually very planned. After World War II, that was a very active thing with the GI Bill, etc.: to turn more people than before into small property owners, so that they could be like, oh, the damn socialists, they're against private property, and I own an apartment or a house or whatever. Justin: [29:56] Yeah, I mean, that's why pensions went away. Literally, it's just to make you invested in the stock market. When I worked for the Texas state government as a public employee, my retirement, if you made a voluntary contribution, like put a hundred dollars into your retirement account, I forget what type of 4-something account it was, that wasn't an actual savings account with money. What it was was shares of a retirement fund, which was tied to oil production. So all of my retirement money was in oil, and if the price of oil went down, my retirement tanked. So I lost money in my retirement based on how much I put in. I put in like a hundred dollars on every paycheck, and what came out was less than I put in, because the oil prices were dropping. And that was how I found out that all of that money wasn't a savings account, it was oil shares. Wow. So then, who in Texas, who works for the state government, is going to say, yeah, let's move to solar power? Right? No one. Yeah. Justin: [31:06] Like you said, it's out of their control. Even the capitalists are like, well, if I don't do it, someone else will, which is a very reactionary mindset, I think, as well. I mean, when I was reading that, of the person who said, you know, if I didn't make this AI, someone else would, it just reminded me of that settler in Palestine who was like, if I don't steal your house, someone else will.
Hagen: [31:30] Oh yeah, I remember that video. Justin: [31:30] Extremely, you know... the guy from New York, and he's like, hey, if I don't steal your house, someone else will. I'm indigenous to this land. Hey, over here. It's just some New Yorker going and stealing someone's house, their family's for like 500 years. Sadie: [31:47] Hold on a second. I actually need to find the exact line in the book that I highlighted. It was the first thing I highlighted in it. Oh, actually, the very first thing that I highlighted is the picture of Darth Vader in dishabille, the master of the invisible hand, as someone who's into autoerotic asphyxiation, in a footnote, which was just excellent imagery there. Thank you for that. The second thing that I highlighted was the radically unthinkable centerlessness, the will without a willer, and how that was... I had never put that anxiety into those exact words, but realized that that's the anxiety of capitalism, right? You can't stop it. And when that becomes a technology, it becomes even more opaque as a force. So, like Justin was saying, nobody's going to be like, oh yeah, let's switch to solar power, because everybody's retirement shares are in oil. It just made me think of the part where you're like, yeah, who actually can stop this? CEOs can't. They have their own reasons for it. So it's a very interesting angle, and I thank you for that. I'm going to be churning that over for a while. Justin: [33:04] Yeah, I think you write in a way that is useful for me, because I'm not naturally a very good speaker, so I tend to memorize short phrases and pithy statements. A lot of my politics just comes from folk punk lyrics, because I can memorize them, and I'm like, okay, I can throw that out in a conversation. So I tend to do that a lot. As a person who had an undiagnosed anxiety disorder, it's like, oh, I learned how to speak by just mimicking people.
So yeah, I actually am a philosophical zombie, but no one can tell. Can you explain to my husband what a Chinese room is? Because we had a really long argument.

Jay: [33:46] I had never heard of it before. And he starts spouting about some, like, okay, imagine this Chinese room. And I was like, but why is it a room? Why is it Chinese? I was so confused. Yeah.

Hagen: [34:04] I don't know why it's Chinese. I suspect the answer has something to do with racism, but.

Jay: [34:10] See, I'm not crazy, Justin. I'd just never heard of this computer philosophy quandary before.

Justin: [34:19] Yeah, well, it's common. Everyone knows it. Everyone knows about the Chinese room. Everyone thinks it's a very useful thing to talk about. I was reading Blindsight, and the aliens in it are highly intelligent, but they don't have consciousness. So when they're trying to communicate with them, the linguist on board is like, oh, they don't understand language, they can just speak it to us perfectly, but they don't actually think. And so she starts, like, insulting them and stuff, and then they insult her back. Yeah.

Hagen: [34:47] That sounds like a fun book to have written. How old is the book?

Justin: [34:51] It's from the early 2000s, I think. That is fun, yeah. Yeah, 2006. It's about humanity's first contact with aliens. Humanity gets, like, scanned by these satellites, and also there's vampires. I don't know why he felt the need to throw vampires in there, but humans, like, reintroduce vampires, which are more intelligent than humans, and the AIs are incomprehensible.

Hagen: [35:13] Are they philosophy vampires or normal vampires?

Justin: [35:15] They're, like, prehistoric humans that fed on other humans, so they're more intelligent than us. So humans can't understand the vampires, because they can always outsmart you, but humans also can't understand their own AIs.
And then these humans, with a vampire captain and an AI on board, go to look into this issue with these aliens. And the aliens themselves are a completely different type of intelligence. So the whole book is about intelligence. It's really interesting, but I really am like, why did you throw vampires in there? Yeah. And they're not, like, made-a-deal-with-the-devil vampires. They're like, no, back in ancient times these were an offshoot of humanity that preyed on humans. And then for some reason humans remade them, but gave them, like, a crucifix glitch so that they could control them, so that they have basically grand mal seizures if they see certain geometry.

Hagen: [36:11] That sounds like a wild story.

Sadie: [36:14] I love that.

Hagen: [36:15] Yeah, I've been joking that, you know, there's all these people who are like, well, aren't humans just next-word predictors? Isn't that just what I do? And I've been joking about how the one thing that we've built with these language models seems to be a P-zombie, a philosophy zombie, detector, right? The idea is that maybe there are some people who don't have a rich internal life. Maybe it's the people who are like, well, I'm just a next-word predictor. I also just predict the next word and then say it.

Justin: [36:42] Well, it's an interesting point, though, because people are pushed to act like computers. People are pushed and coerced to act like machines. And so some of the concern that I think people have is that this is moving into the personal realm.

Justin: [37:00] And so people's personal lives will start to be dominated by machines. You already see this with, like, grindset kind of guys who are like, my whole life is just about optimization, right?
Jay: [37:10] I'm gonna 10x this. And I'm like.

Justin: [37:13] Which can be fun. I hate the phrase 10x. I took a human sexuality class in the 2000s, and it showed us, like, a 24/7 total power exchange, and she would talk about how she would map out her grocery run at the store to be perfectly efficient, so as not to waste time. It's like, yeah, cool, I can understand how you want to live your life around efficiency for, like, a sex reason. But to just do it because that's rational? It's like, why would someone go to school? Why would someone do art when you could have something else do it for you? Well, then, yeah, what are you? What's left of you if this logic of capital goes into your leisure time?

Hagen: [37:52] Yeah, yeah, yeah, it's very true. Because, you know, being rational can be useful for achieving a goal. I mean, being rational is about how do I achieve a goal that I have set myself, but being rational doesn't produce goals in itself. What you think is valuable is not in itself a question that grounds out in rationality. But once you're in capitalism, you can be like, well, I know what's rational. Rational is when the numbers go up; the goals are whatever the steps on the way to making the numbers go up are. And then you're like, well, and when I'm dead, I hope the numbers have been really high, or something, right? It's a thing that, once you think about it in the, you know, memento mori or whatever way, everyone should obviously realize how absurd it is. But precisely because money is also power, there's this weird way in which it kind of short-circuits our ability to think about what it actually means, for me personally or for you or for anyone, to live a good life, right? It kind of produces a shortcut there, which is this bizarre thing. And then we come back to anxiety, too.
I think it's this kind of thing, it's a shortcut that you seek because, you know, whether you're an existentialist or whatever,

Hagen: [39:05] the fact that we have to figure out what the meaning of our own life is, and that it's not just there to discover, and that it's not neat, and the fact that we die, is all pretty anxiety-inducing, right? So if you don't have to think about it, if you can find a shortcut for not thinking about it, that alleviates the anxiety. But then you're actually running in this hamster wheel of capitalism, which continually produces anxiety as well. So, yeah.

Justin: [39:32] There's no easy solution. Like, you try and solve it with, like, Calvinism, but then people are really worried about, well, what if I still end up going to hell? You've solved the problem of the fear of death, but then you spend your whole life anxious: was I the right kind of person to not have the bad afterlife? So you never really solve the anxieties. It's strange seeing people kind of make their own spiritualist beliefs out of the machine, like the guy who's trying to force himself into living forever and getting his blood boys and stuff like that, as a way of just not thinking about the fact that he's going to die.

Hagen: [40:12] Yeah.

Justin: [40:12] And also, I think it's interesting, too, you know, the fear of AI overthrowing us is also one that the capitalist has: the fear of all these machines you've created out of people rising up against you as well. I think that's at the core of that anxiety.

Hagen: [40:28] Yeah, right. If you read, I mean, in so many science fiction stories, it's so clear that the anxiety is about class, and what if the people who are below me in the social hierarchy rise up, whether that's from the professional experience or from the experience of the global north or from the experience of the capitalist class, right?
There's many intersecting ways in which there's all these sub-hierarchies of power, but it's very clear that so many of these sci-fi stories are about the uprising of those who are at the bottom of the hierarchy. Even in The Matrix, if you look at the backstory from The Animatrix or whatever, it's the cleaner and construction robots that start the uprising. It is exactly the kinds of jobs that are first imagined as fully automated, which is exactly what you said.

Hagen: [41:16] We're treating people as if they were already machines. As if they were just, you know, things that I give a couple dollars and then my apartment gets cleaned or whatever. And I don't have to have a human relation to them. I don't have to treat them as a person. I can treat them as some kind of consumption good. But it's exactly those people that in these science fiction stories are first imagined as robots and then imagined as leading an uprising. And that's, yeah, sure, that's definitely also one of those fears. And I think that's maybe the most obvious one that tells you why we need to interpret these AI nightmare stories ourselves, right? Because if there's politics that follow from the interpretation of the AI anxiety, I don't want it to be the politics that follow from the anxiety that's actually about: what if poor people come together and form a political movement and overthrow the capitalists and institute a different system, right? I want the interpretation of the anxiety to be one about: what is capital doing to us with these machines? How are we getting dehumanized and controlled by the system?

Justin: [42:26] Because I have to deal with, you know, working in academia and hearing academics try and deal with the reality of AI. And of course, none of these people have any sort of Marxist grounding, even though they should.
You know, like, our university president is a history PhD, and it's like, I know you know this stuff. I know you had to read Marx once in your life. Hopefully. I mean, who knows, getting a history degree in the U.S., maybe not. But I know you have to know some historical materialism, right? I know you have to know a little bit of this. When people talk about human in the loop and ethical AI use, how does that hit you after writing a book like this?

Hagen: [43:06] Well, I know a lot of people who work in those spaces. And I don't know, I mean, usually it hits me like: they're very bad fixes on a very deeply broken thing.

Hagen: [43:17] I mean, they're identifying a very concrete problem, and then asking, can we do something about this problem without changing anything about the structure? So I think, you know, it often comes from a place of good intentions, but it comes from a place where people see exactly the structure in which they already have agency and are like, oh, what can I do in this small world? I think we've seen over the last year especially how fragile that is to begin with, because, you know, the leadership of big tech companies has turned much more fascist in many realms. And a lot of these spaces are getting much more curtailed, even the smallest reformist kind of stuff. Like, maybe we can make the image models not be useful for just undressing people in images, right? Which are not radical demands. Those are demands like, hey, if we want to have a functioning society of whatever kind, even if you love capitalism, maybe we shouldn't have that, right? Even those spaces are getting curtailed. And so I think, yeah, I think people should be thinking about how they can use their individual power to increase the collective power of people who want more democratic control, who want more egalitarian outcomes, right?
I think there's a tendency, once you're in a certain position in the hierarchy. And academics, and the professional-managerial strata more broadly, are people who have individual agency.

Hagen: [44:43] So that's how they see the world. I think that's a quite natural thing. You don't really have to be much of a historical materialist to think that way: depending on your position in the workplace, either you get to have certain abilities to make decisions, or... You get to have a certain freedom. There are jobs where you design the structure of your day yourself. You make yourself a to-do list every morning, and you decide how you go about accomplishing your projects. You structure them. And there are jobs where you don't. There are jobs, like working in an Amazon warehouse, where every single step that you take is measured and surveilled and determined by a rationality that isn't one that you chose, right? And these will produce different ways of thinking about, well, where's the problem? Clearly the Amazon worker will not solve the problem of "this is a shitty situation and we should have a say in how this place is run" by just being like, well, I'll just decide to do it differently.

Hagen: [45:39] Right. So I think there's almost a sense in which that kind of class position misleads people from the professional strata into very small, very reformist projects. And to me, there's a hope of being like, look, if you can see this thing as an attack on the stratum of the working class that you're in, then you should be able to connect your own personal struggles, and the larger political struggles around, like, you know, a more just world, more clearly to the sense that we need to produce a collective agency of some kind, right? We need to be able to organize ourselves into collective units that can challenge capital, whether that's political formations of some kind, whether that is unions.
Hagen: [46:21] I'm not going to make, you know, I have my own ideas. I don't have perfect solutions. I certainly don't feel like I'm in a position where I can tell people what the recipe is for that. But I think we can think about this from that theoretical perspective and say: there are boundary conditions. There are certain things that your political activity will have to fulfill. And one of them is certainly that you have structures that can grow, where people can get involved, where people can experience their own collective agency, in democratic decision-making and in challenging the people who have power simply by virtue of being rich.

Justin: [46:57] Mm-hmm. Yeah, I feel like, when people talk about having AI-ready workers, it's like having an assembly-line-ready worker.

Justin: [47:08] Like, taking a class on the assembly line is an absurd prospect, because the whole point of the assembly line is to de-skill you, right? So why would you need to learn about AI? I also think it's interesting, when you talk about different levels of class domination by machines, like telling an Amazon warehouse worker that they need to have AI literacy because their job is dictated by AI.

Justin: [47:33] It's like, yeah, why would you say, just have some AI literacy about this so that you aren't misusing the AI? Misusing the AI is not the problem, in the same way that me misusing the AI is not the problem. It's whoever decides to hook it up to our power grid's problem, or whoever decides to hook it up to our nuclear capabilities' problem, or whoever's hooking them up to drones.
It's like, you know, telling me to be AI-ready for the future. That's not... I always found it strange. In the same way, when those scare stories came out, I didn't really see them the way this book approaches them, which is understanding where people's fear is coming from. I just saw it as all marketing. I just thought, they're just saying AI is scary and it's Skynet because they want people to believe it's more powerful than it really is. But now I think, no, they genuinely do have this fear.

Justin: [48:28] Like, this real existential kind of fear that this thing is potentially going to... You can watch them kind of freak themselves out, because, one, they're very isolated and insular billionaires, but also you can see that it's fears about a lot of things.

Hagen: [48:45] Yeah, I think, you know, in a general sense, when one encounters propaganda of some kind, it's a mistake to think that the propaganda works because of its content only. You have to ask not just why people want to send this message. And I think you're right. I mean, for whatever we're saying about the anxieties of the ruling classes and the professionals, I do think there's also an aspect of this just being propaganda and advertisement. But the question is, why do these things resonate with people, right? It's not just why do people want to send this message. It's, why do people not just immediately tune out? I mean, there's lots of messages that the rich would love people to hear, but that nobody wants to read about, right? So what is it about the tension between the sender and the receiver of the message? And is there a potential for politics in the tension, and maybe even contradictions, between the two, right?
Is there a moment where we can... I think, you know, as a Marxist, I think the only way to deal with history is by figuring out how the existing contradictions work, and figuring out, is there something about the way that things are fucked up right now that produces a potential for making something better? The only way forward is through, in a sense, right?

Hagen: [50:07] So, yeah, that's what we kind of try to do with that: to ask, what about these anxieties can maybe be made into a rational source for solidarity, essentially. And I do think the stuff about the centerlessness, I'm so glad that you liked that. That made me very happy. I mean, it's from Mark Fisher originally, but we're connecting it to some language model stuff. Yeah.

Hagen: [50:32] But I think that's just so crucial for understanding both something about the actual anxiety of the ruling class and the professionals, and also why I would say I'm an anti-capitalist and the problem is capitalism, right? Because the problem is not specific capitalists. And I think that's important to make clear. The problem is not that the 15 guys in Silicon Valley who are at the top happen to all be assholes.

Hagen: [50:59] It's that we're in a system where, if you are a certain kind of asshole with a certain amount of money at your disposal, you can become more rich. And that's precisely the wealth that gives you the power to structure decisions. Capitalism is not just a system for distributing resources and dividing labor. It is a system for producing decision power, and a system for producing decision power based on people who have in the past been guided by the decision to make more profit and have been reasonably good at it, whatever the costs, whatever the externalities, whatever the harms to other people, whether that meant crushing your workers.
If it's been good for profit, that means you now have more decision power in your hands than you did before, right? That's why it's a system, not just an economic system, but a system of political economy, right? Who gets to make these kinds of decisions? How did they get to be there? And replacing individuals doesn't help, because it's a system that selects for particular kinds of individuals that make particular kinds of decisions. And if somebody decides, I don't want to be like that anymore, they just get kicked out of the system and somebody else steps into that role. That's why, on the one hand, "if I don't do it, someone else will" is a really pathetic way of thinking about the world.

Hagen: [52:16] And at the same time, there is a certain truth to it under capitalism, right? That's why the political target has to be to figure out how we change the system so that we can make decisions to fulfill human needs,

Hagen: [52:31] and not run a system that puts people in power who put profit above all human needs, whatever.

Justin: [52:35] Yeah, to bring it back to Mark Fisher, there's this "there is no alternative" kind of thought that's dominating the AI discourse. It's like: one, this is going to happen; you are going to be automated; capitalism is inescapable. And the exact way out of that, I think, is people imagining a future that is an alternative. I think in some ways, maybe the acceleration of it all in the last decade has made it easier for people to actually imagine things.
A post-capitalist future that isn't just nuclear annihilation. I think finally people are starting to say, this is all fake, isn't it? Like, COVID really helped break some people's brains, in bad ways and in good ways. But also, I think the second Trump administration, and all this AI slop being forced into people's brains all day, is sort of making people go, maybe we should do something different, because this is just not working. And I feel like that sense of the ability to imagine is coming back to some people. I can't tell if that's my own little bubble, because it's the kind of people I like to surround myself with. But I'm starting to see people talk about other ways of the world. Like, you know, this AI is like this story we tell ourselves, and people are also telling counter-stories of, what if we didn't have to deal with this crap?

Hagen: [54:01] Yeah.

Justin: [54:01] So the refusal of it is starting to generate some thinking that is actually useful, in terms of like, maybe we should have some rules about this. Maybe we should send some billionaires to jail. Maybe we should actually do something differently. And that also activates people on the right and towards fascism. But once you start breaking down that consensus view, I think things start getting interesting.

Hagen: [54:28] Yeah, I agree. There's almost a sense to me in which there's a lot of critique out there that does want to break with the sense that there's no alternative to AI, but doesn't really want to go to the "there's no alternative to capitalism" thing. They want to go in between those. Yes, there's an alternative to AI, but the alternative is more regulation, and this or that particular AI use should maybe be ruled out, or whatever. But, you know, I'm almost sometimes like, we should say: yes, you're right. Under capitalism, there's no alternative to this particular way of making life shittier.
So fuck capitalism, right? Let's have a different economic system. Let's have a more democratic world where we figure out how we can make decisions differently, not based on profit but on other human values. How do we bring those in? You know, I often feel like people imagine that we can build a political movement that can stop AI, which I think we should, and we can. But I think that political movement itself will be so big that we can make much larger demands than, like, "don't put a language model into my education software" or whatever, right? We can be more imaginative. Why not take this moment and, you know, up the game a bit? We want to talk about the rules of the whole fucking game.

Justin: [55:53] Yeah, I think there's this parallel between the resistance you see in rural communities, because they're the ones targeted, resisting the building of these data centers, and the resistance to the building of ICE concentration camps. That is the same resistance: we don't have to deal with this. And these are, like, conservative areas, but they're like, we don't want this pumping diesel into our atmosphere all the time, and we don't want to have our economy based around big encampments that haul in our neighbors. And even if there's not a left-wing element to it, there is a popular resistance that is amorphous and moldable towards an anti-capitalist project, I think.

Hagen: [56:38] Yeah, I think so too. You know, you don't build a leftist movement by just being like, well, I find the three most radical guys that I could find on campus, and then we sit at home and imagine how the world could be better. When people are engaged in concrete struggles, you have to build solidarity, support them in their struggle, and then, through that, bring them into the larger project, right? So I agree.
I think there's lots of people who are, in this way or that way, conservative, who can be brought into the movement. And that will be difficult, and it will require certain boundaries. You know, there are certain things on which you cannot compromise, but it

Hagen: [57:23] will also require, like, patience and solidarity. Yeah.

Justin: [57:26] If you've never seen the movie Pride, about Lesbians and Gays Support the Miners, watch that.

Hagen: [57:31] Yes, it's a great movie.

Jay: [57:34] It is great. I cry every time, every time when the fucking union buses show up to Pride right at the end.

Sadie: [57:48] I haven't seen this yet. I just haven't gotten around to it.

Hagen: [57:53] Make that a better priority, Sadie.

Justin: [57:54] I know, I know.

Jay: [57:55] You got it.

Hagen: [57:57] No, no, I used to be part of a political organization that, like, every couple of years, would show that film as an organizing event, basically, you know. Or just have a showing of the movie and be like, stay after if you want to get politically involved in something kind of like this.
Jay: [58:13] Weirdly, that film is one of the better depictions of what organizing work looks like day-to-day, and solidarity work, and all that. Like, little practical things: no one goes out there by themselves, right? Just little shit like that. Or teaching people what solidarity looks like with people who you might not usually consider yourself an ally with. It's just so good. Anyway, yeah.

Justin: [58:47] Lesbians and gays against data centers. Lesbians and gays against concentration camps. You know, there's no reason it can't work. And I mean, it is funny, when I'm walking around Boston, I just see, like, the communists trying, like, "communists against the things you hate." And it's like, I don't know if that's going to work. I don't know if Americans are ready for that. But one day they will be, because they won't have a choice.

Jay: [59:17] Boston is a city that's actually a college town. And so a lot of the explicitly socialist, of any flavor, stuff that happens is a bunch of college students. Not to shit on college students, but that's a lot of what it is here.

Justin: [59:35] That's not true. You hate college students. You tell me every day.

Jay: [59:38] This is also true. The students on the B Line who don't take off their fucking backpacks: if you are listening and this is you, I hate you personally.

Hagen: [59:50] It's not solidarity to have your backpack on in the public transit system. This is true.

Jay: [59:56] This is true.

Hagen: [59:57] I mean, you know, I've been living in New York for almost a decade now, and I know New Yorkers are thought to be very rude. But I think a lot of New Yorker rudeness is actually about: you are not following the rules that enable everyone to get along, and that's why we're shouting at you. We're not shouting at you because we're mean. We're shouting at you because you are behaving in ways that maybe you don't realize.
But they are very asocial. You cannot just stop on the sidewalk in New York because you thought something was interesting. No, you go into the little space behind the lantern, where you will be in nobody's way. And somebody will shout at you if you don't. But it's a public service that they're shouting at you.

Jay: [1:00:38] It's the whole thing about people in the Northeast. We're kind but not nice, or the other way around. Where we're kind of dicks, but it's for good, you know? Like shouting directions at someone. Yeah.

Sadie: [1:00:54] Yeah, they'll dig you out. They'll dig your car out of a snowbank while calling you a dumbass.

Jay: [1:00:59] Yeah, yeah, yeah.

Justin: [1:01:01] Yeah, there was, it was like one of those Boston roundup things: a guy was mad about the parking situation, so he shoveled all the spots on his block.

Jay: [1:01:12] Yeah. Two rules.

Justin: [1:01:13] He's so angry.

Sadie: [1:01:15] This guy helped me parallel park while calling me a fucking idiot the whole time, but, like, explained it to me in a way I will never forget.

Jay: [1:01:23] Yeah.

Justin: [1:01:24] Yeah, it's great.

Hagen: [1:01:25] Right. Well, that's, you know, we're just talking about how solidarity is not a frictionless kind of thing.

Sadie: [1:01:32] No.

Hagen: [1:01:32] Right. But you have to, I mean, you know, I think the problem with people in college who want to organize is often that they don't figure out what they can materially bring to support people. And often that is because college students don't have much access to resources, right? Like, often what's needed is resources. But often what is needed is just fucking time, right?
Can you figure out some way to make your time useful? Right? Maybe the thing is, yeah, shoveling someone's car out.

Jay: [1:01:57] So this is literally, okay, I organize with some Catalonian people, and they told me about this. I forget what they call it, but if you're doing organizing work, say someone is the person responsible for the planning of something, the logistics of something. They've got to do all that work, but because they've got kids, or they have a job with irregular hours or something, it makes it hard for them to stay on top of the organizing work they need to do. So people who organize with them, or in their community, will either do mutual aid and, like, pay their wages for a chunk of time, so that they can take that time off work, paid, in order to do the organizing work they need to do. Or it's like, I'm going to cook dinner for you and your family tonight, so instead of you having to spend the time and energy to do that, you can work on organizing. And that's a lot of what they do. And I was like, oh, that's something that seems so obvious. Even if you don't have resources, can you, like, walk someone's dog for them, right? Can you go over and do their dishes, or help them fold laundry? Can you do that?

Sadie: [1:03:17] Watch their kids for a little bit, like put on a movie with the kids.

Jay: [1:03:22] Or, is your organizing space friendly for people who have children, right? What time are you meeting? Where are you meeting? Like.

Hagen: [1:03:31] Right. Yeah, a lot of that stuff is not glamorous, but it is actually the stuff of, you know, people's daily lives. Yeah. And, you know, I also think, when you're in college, it's kind of normal that the stuff of daily life is of less interest to you than to most people. Because you're the furthest removed from what it means to have a normal daily life.
We've built a very abstract and strange kind of situation where it's like, this is the time where you're supposed to, A, form social career networks and, B, acquire all these specific weird skills that may or may not be relevant to your life at all, but that somehow prove that you can be accredited with this paper that says you have a degree from this more or less prestigious institution. But that's a very strange kind of thing. And we have also structured it so that most people live a materially relatively deprived life at that point. So we're like... But it's true that these things of daily life, I think, not only are they actually what allows these things to work, and they are perceived as solidarity. They're also the difference, the kind of organizing that also creates communities, which I think are ultimately the really resilient political forces, right? And that's another thing that's beautifully depicted in that movie, I think: that people care about each other.

Hagen: [1:05:00] It doesn't start from the abstract political level. It starts from the interpersonal level, which then gets a political interpretation attached to it. And I think that's the only way: we won't be politically successful if we can't do that kind of organizing.
Jay: [1:05:15] Yes. And I also think you bring up a really good point about the way that college is treated, at least in the United States. I'm not sure about elsewhere, but there's the sort of scourge of college students using AI, right? And even before this, you saw, in the sort of entrepreneur-grifter-mindset internet space: oh, your college degree, you know, don't waste money on college, read these books instead, or take this course, or whatever. The sort of anti-intellectualism that is still showing up within, quote-unquote, the academy. And so often I see people almost defending the students, because, you know, the types of assignments being asked are not well designed, or the workload is unreasonable. The purpose of college has changed so much that you can't blame them for using this tool to do assignments for them when that work is not benefiting them in any way. And I'm like, I understand what you're saying, but then you're pointing to a problem that we can fix.
Jay: [1:06:41] It feels like it's coming out of the same argument of, well, in high school, why don't they teach people how to do their taxes? Why do they teach them algebra? Why do they teach them a bunch of useless shit? As if the entire purpose of education is vocational. I mean, in Germany, I think they have that sort of split education system, where at a certain point you get split off into either eventually going to college or going into a trade, and you go to completely different high schools or something. I don't know how easy it is to swap between those two. It's very difficult, isn't it? I remember I took German in college and we learned about that, and I'm like, that doesn't seem good. Hagen: [1:07:32] I always tell Americans that's probably why college is free in Germany. It's because, for us, the teachers just select, when you're like 10, whether you go to college or not. So the class reproduction is actually quite similar between Germany and the US. If you check how likely somebody with zero, one, or two college degrees between their parents is to themselves get a college degree, it turns out the heritability of a college degree is pretty similar in Germany and the US, despite college being free. I mean, I paid, I believe, 16 euros and 50 cents per semester for my education. Jay: [1:08:10] I can't even get a meal for that. That's crazy. Yeah, I didn't know. Hagen: [1:08:16] I think the 16 euros actually went to the student union. Jay: [1:08:20] Yeah. With your name, I wasn't sure if it was German or not. But yeah, like people in the United States, the purpose of education is not, it shouldn't, I don't think it should be to prepare you for a job. This is actually a huge problem I have with library school. A lot of people are like, oh, library school doesn't teach you enough practical skills.
And I'm like, no, what you're wanting is a vocational track for librarianship. But if there's going to be a master's degree for this, it should maybe be more academic. I don't like the conflation of education with the job market, because then it's just a barrier to getting employment, with a huge price tag on it. But because we treat it as this, oh, you have to have a bachelor's degree to do anything now, all of the assignments are meaningless. The way that we do college is just not good. So I almost can't blame students. It's bad. Hagen: [1:09:21] I think they're using it. But yeah. I mean, I agree with this. I certainly also think that education should be about more than just vocational training. But there is a sense in which, in our society, we may talk a nice game about what a liberal education or whatever is for, but ultimately, you know... Again, capitalism is the system of evaluating things, of putting a price tag onto everything. And it structures all the social relations that we find ourselves in. We try to make community and meaning outside of that, but it's difficult. It keeps intruding into everything. And I think students reacting to that can be depressing sometimes. But again, I'm like, well, the only way is through. So if the system produces that, then hoping that the individual students will just behave differently is certainly not a way of changing the system, right? So in a sense, I understand being angry about that, or disappointed, or depressed, or whatever. Lots of friends of mine are teaching, and I've heard horrible things from many of them about how the last couple of years have gone.
I taught during COVID, over Zoom, and I found that very depressing sometimes, because a lot of the community-making that happens in a classroom was just so much more difficult to do. Hagen: [1:10:46] But yeah, again, I think we see there with AI that AI is a tool for trying to devalue certain skills, to make them cheaper for capitalists to buy on the market. And the students, in a sense, are reacting to that cheapening, right? In the pure neoclassical economic sense, they are behaving as rational actors. They're supposed to produce a skill with a certain value, the value has been lowered, so: I'm going to invest less time in the production of that skill set, right? But yeah, to me, one of the things that is really interesting, in having thought about this and having tried to write about it, is this Hagen: [1:11:28] question of why capitalism simultaneously has this way of valuing education, of constantly professing it as a value, and this deep anti-intellectualism that it simultaneously produces. And for me, when I read Braverman, and when Ingeborg and I were talking this through a lot, something really clicked about the fact that very, very often the point of technological and intellectual advancement is precisely to cheapen some other kind of knowledge, right? Once you realize that there's a dialectic, or whatever you want to call it: the fact that the devaluing of some knowledge and the upvaluing of other knowledge are constantly two sides of the same coin, right? Hagen: [1:12:18] Braverman was a metal worker before he became a sociologist, so he knew a lot about how metalworking had changed between 1940 and 1970 in the U.S. And he's talking about the shift from manual lathe operators to numerically controlled machines to computer numerically controlled machines.
And he doesn't talk that much about it, but there is that sense of, yeah, of course, the person who knows how to make a CNC machine, how to program a CNC machine, who knows how G-code works or whatever. Hagen: [1:12:50] That person who has that kind of technical knowledge gets paid better now precisely because the machinists who were the highly skilled labor force of the 1940s have now been de-skilled. So the upskilling of some and the de-skilling of others are not two totally independent movements; they're very strongly conjoined. That really gave me a new grasp of why there's this constant conjunction of valuing education and anti-intellectualism. And I think something that didn't really make it into the book, but that I think is really crucial to think about, is that we see this in a very radicalized form historically in fascism, right? Fascism has this absolute fetishization of technology, especially, of course, military technology, but also heavy industry, while also being an extremely anti-intellectualist force. So in a sense, before we started working on this book, I hadn't really thought about how much that's a radicalization of the very normal use of knowledge for de-skilling workers, of the use of knowledge as a weapon in class war, and how that kind of thing reappears once you have a fascist war between states and ideologies. We see this at different levels. Hagen: [1:14:06] So once I think about the individual student who's like, I don't want to do my homework, I'm going to throw it into the AI thing, in that whole context, all the way up to fascism, I at least feel kind of humbled about wanting to tell the student that it's their personal fault. And whatever, I also think, you know, yes, the point of education, the point of any training, is struggle. Hagen: [1:14:30] One can only make meaning in life out of friction.
I think this is another way in which capitalism is just very deeply anti-meaning. Capitalism hates friction and capitalism hates particularities, right? The point of money is that everything is abstracted into equality with everything else. If this thing costs $50 and that thing costs $50, then whatever the two things are, they are in some ways equivalent, right? That ability of capitalism to, above all else, abstract away from particularities, and that desire of capitalism to remove friction, because friction is cost, I think, is why capitalism is such a nihilistic system. Because everything that we as humans make meaning of is particularities and friction, right? That's how interpersonal relations grow: because you've made meaning out of a conflict. Your best friends are not people that you never had an argument or disagreement with. They're people where, when you had a disagreement, you found really meaningful ways of engaging with it, maybe resolving it, or making it into an ongoing source of meaning, or whatever. But friction is really important for meaning-making. Sadie: [1:15:43] Yeah, I think that's even part of the fear that we see with AI in particularly creative endeavors, because artistic endeavors are often so very personal and particular and not very well valued in a capitalist sense. So there's that fear, at least from my interpretation, following a lot of artists and writers on social media, not necessarily that it's going to take away our livelihood, even though there is that, but that it's going to take away the meaning of our effort. And effort is friction. You can't grow at something unless you work at it, right? So not an entirely cohesive thought here, but yeah, just like you said, it is all about friction. Students not wanting to do their homework is trying to avoid the friction of doing it.
People not wanting to do organizing is avoiding the friction of interpersonal relationships and having to figure out how to grow them. So embracing the friction, I think, would go a really long way towards soothing the fear enough that we can get a clear view on it and figure out what to do about it. Does that make sense? That's why I really appreciate the fear framework, I guess, is what I'm trying to say. Hagen: [1:17:04] Yeah, that makes total sense. I absolutely agree. And I think that was kind of the thing I had in my mind too, from the friction of doing homework: that's a struggle, it's difficult, but the difficulty is how you make a thing your own thing, right? That's why an idea from a book can turn into something that you do something with afterwards. Before that, it's just someone else's thing; it's the struggle that makes it your own. And in art, so much of the struggle is part of the production of whatever the hell meaning is, right? I don't have a theory to advance, but I know that friction and sublimation and all these kinds of things are essential to that. So yes, absolutely agree. Justin: [1:17:51] All right, well, we've gone an hour and a half, so I think we are good to wrap up. Are there any final questions before we go? Okay, just making sure I'm not cutting us too short. Hagen, thanks for coming on. I really appreciate it. I really liked reading your book. Hagen: [1:18:09] I'm so happy to hear that. Thanks for having me. This was a very fun conversation. Justin: [1:18:13] Is there anything you want to plug? Places people can go, things I can add into the notes? Hagen: [1:18:18] No, I don't think I have anything right now other than the book, which I assume is going to be in there. Justin: [1:18:24] The link will be in there.
And I will also put it on social media so that people will go buy it. Hagen: [1:18:32] Perfect. All right. Thank you so much. Or get it from a library. Justin: [1:18:36] Request it at your library. All right. And good night.