The Douglas Rushkoff Interview - Part I

Billionaire Bunkers are a stunning exercise in self-delusion, as the books of our next guest show. In this inspiring conversation with Douglas Rushkoff, author of the must-read Team Human and Survival of the Richest: Escape Fantasies of the Tech Billionaires, the topics range from how to be a respectable prepper to how to raise good humans to whether A.I. is coming for our jobs and our minds.

Rushkoff is an author and documentarian on the frontlines of understanding how technology and tech billionaires are impacting our lives and the world. His twenty books also include the bestsellers Present Shock, Throwing Rocks at the Google Bus, Program or Be Programmed, Life Inc, and Media Virus. His films include the PBS Frontline documentaries Generation Like, The Persuaders, and Merchants of Cool. He won the Marshall McLuhan Award for his book Coercion, and the Media Ecology Association honored him with the first Neil Postman Award for Career Achievement in Public Intellectual Activity. For more on his indispensable work, visit his website.

In our bonus episode, Rushkoff takes the Gaslit Nation Self-Care Q&A. To submit your own answers and give inspiration for ways to recharge as we run our marathon together to protect our democracy, leave your answers in the comments section or send an email to GaslitNation@gmail.com. We’ll read some of the responses on the show!

And don’t forget that Andrea will join comedian Kevin Allison of the RISK! Storytelling podcast for a special live event at Caveat in New York City on Saturday August 5th at 4pm to celebrate the launch of the new Gaslit Nation book Dictatorship: It’s Easier Than You Think! To get a ticket to that event in person or to watch the livestream, visit this website. Signed copies of the book can be ordered at the event!

Gaslit Nation Self-Care Questionnaire

  1. What's a book you think everyone should read and why?

  2. What's a documentary everyone should watch and why?

  3. What's a dramatic film everyone should watch and why?

  4. Who are some historical mentors who inspire you?

  5. What's the best concert you've ever been to?

  6. What are some songs on your playlist for battling the dark forces?

  7. Who or what inspires you to stay engaged and stay in the fight?

  8. What's the best advice you've ever gotten?

  9. What's your favorite place you've ever visited?

  10. What's your favorite work of art and why?



Show Notes

[advertisement]

[intro - theme music]

Andrea Chalupa (00:10):

Welcome to Gaslit Nation. I am your host, Andrea Chalupa, a journalist and filmmaker and the writer and producer of the journalistic thriller, Mr. Jones, about Stalin's genocide famine in Ukraine: the film the Kremlin doesn't want you to see, so be sure to watch it. First, a couple announcements. We are running a very special summer series called “The Future of Dictatorship. What's Next? And Ways to Resist”. This series features leading voices on the front lines of understanding AI, corporate surveillance, Silicon Valley greed, and more, because the dictator's playbook remains the same, but the technology they have to oppress us keeps changing. You can learn more about the dictator's playbook in the Gaslit Nation graphic novel, Dictatorship: It's Easier Than You Think. You can join me for a special night out in New York City to talk all about the making of that book on Saturday, August 5th at 4:00 PM at the fun Lower East Side bar, Caveat, where I will be in discussion with the comedian, Kevin Allison, of the hugely popular Risk storytelling podcast.

Andrea Chalupa (01:10):

If you're not in New York, you can join us by livestream. This is a huge deal for me because I hardly go out, so this will be like a Gaslit Nation prom night. Join me at Caveat on August 5th in New York. Signed copies of the Gaslit Nation graphic novel will be available for order at the event. For details on how to join us in person or livestream, go to gaslitnationpod.com and you'll see the link right on our homepage with more information about the event. Go to gaslitnationpod.com. That's gaslitnationpod.com. We'll be back with all new episodes of Gaslit Nation in September, including a live taping with Terrell Starr of the Black Diplomats podcast reporting from Ukraine. That's right, Terrell's gonna be in Ukraine, and we're gonna hear all about his summer, his reporting trips, what he is learning, who he's talking to, and what's next. That live taping will take place on Tuesday, September 12th at 12:00 PM Eastern for our supporters at the Truth-teller level and higher on Patreon. Come join us for that and drop questions in the chat and hope to see as many of our listeners as I can on August 5th in New York at Caveat for a fun night out. Before we get to this week's guest, here's a quick word from our sponsor, Judge Lackey, the wily narrator of the new Gaslit Nation graphic novel Dictatorship: It's Easier Than You Think.

[clip - Dictatorship: It’s Easier Than You Think trailer]

Judge Lackey (02:30):

Ah, summer… BBQs, swimming, purging your enemies [people falling sfx]. Every dictator needs a scapegoat, usually an ethnic minority who is used to symbolize all that is opposed to the nation while the dictator and his fans embody the “natural” state. Learn more in my favorite beach read, Dictatorship: It's Easier Than You Think… Almost too easy [sword blade sfx]

[end clip]


Andrea Chalupa:

Here is part one of my discussion with Douglas Rushkoff. I'm going to read now from his bio. “Douglas Rushkoff is an author and documentarian who studies human autonomy in a digital age. His 20 books include the just published Survival of the Richest: Escape Fantasies of the Tech Billionaires—an extraordinary read, I have to tell you. It's chilling… Chilling how dumb the tech billionaires are—as well as the recent Team Human—another essential read, loved that book—based on his podcast and the bestsellers Present Shock, Throwing Rocks at the Google Bus, Program or Be Programmed, Life Inc. and Media Virus. He also made the PBS Frontline documentaries Generation Like, The Persuaders, and Merchants of Cool. His book Coercion won the Marshall McLuhan Award and the Media Ecology Association honored him with the first Neil Postman Award for Career Achievement in Public Intellectual Activity.” This is part one of our conversation.


[transition music]


Andrea Chalupa [00:04:26]:

Thank you so much for being here. So, I've been listening to your books, Survival of the Richest and Team Human, and they just capture so much of this very scary crossroads humanity seems to be in, and especially your chapter, your discussion of the Billionaire Bunkers; how they're basically gonna let everything burn and think that they can ultimately escape us, which is really funny. But my first question, I wanna start off, for people raising kids today, kids that are going to have to inherit whatever comes next in this scary chapter after chapter we're living: what advice do you have for parents today raising kids who are looking at the future and thinking, My God, is there going to be any future for them?


Douglas Rushkoff [00:05:06]:

Oh gosh, that's a whole scary thing. I think that my main advice is to think of their kids and help their kids think about themselves in terms of something other than their utility value. You know, people have really kind of adopted the language of capitalism and the language of industry in understanding themselves. So they think of their kids kind of like these machines or computers; “I'm gonna get them enough education, program them with enough stuff!” like they’re little AIs or something. “They're gonna do a whole bunch of machine learning in these early years so then when they're teens, they're good at standardized tests so they can get into the right college and then get the right job and make the right money.” And it engenders a kind of a future-focused ends-justifies-the-means approach to life that negates lived experience, that negates the moment-to-moment experience of life.

Douglas Rushkoff [00:06:18]:

Now, I'm not saying that you raise kids to be hedonists or something, but raise them to appreciate what is happening right now. How is this choice that I am making impacting everyone around me and everything around me? I would try to find ways for them to experience as much awe as possible, as much of a sense of connection to everything that's around them. I would try to raise them to not be afraid of nature, not be afraid of women, not be afraid of the moon and worms and the soil and the interconnectedness of things so that they don't adopt that kind of techno solutionist Western capitalist tech bro approach to everything, which is to control things and dominate them and de-animate them and put them in boxes and sanitize them and separate oneself from them, right? Get their hands in the dirt, you know? Let them experience problem solving and the problem solving itself as the joy, not the problem solved, right?

Douglas Rushkoff [00:07:34]:

Then they'll think of themselves differently, not just as worker bees. They'll value themselves intrinsically. I mean, I'm lucky that I was at the very, either the beginning of Gen X or the very tail end of the Baby Boomers, and I got to see this show called Mr. Rogers’ Neighborhood, and he would end his show, I remember every time he'd end it and he would say, “I like you just the way you are.” And to me it was really profound. It was like, Oh, you mean before I learned how to do addition, before I learned to read, you like me right now just like I am? And that to me is what understanding your humanity as a sacred phenomenon really is. That's what the Sabbath was all about. You mean you're gonna spend one day, you're not producing anything, you're not consuming anything? You're totally unproductive. It's worthless. But that's valuable? Right. Because that's what they're saying was, Wait a minute, you human beings, you're sacred just the way you are, just for having arrived here is cause enough for celebration. And if you can help young people see that; that just their existence is enough. I don't mean giving them trophies for having lost at soccer. You know, that's a whole… What is that? No, it's a competitive game. You get a trophy if you win. That's that model. But yes, it's a cause for celebration that you played at all, and that you're getting to play, that you're doing the things. So that's really it. I think if we could do that, it would change our whole mental model. Then you're not working for your 401k plan, you're not praying for your salvation, you're not doing this for that, or you don't end up in those sort of ends-justifies-the-means journeys where, well, it's okay if a few thousand people are enslaved in the Congo to get the rare earth metals as long as I get to drive a Tesla. And then, you know… You end up, I think, with a more holistic understanding of the world and less conditions around your happiness. You're allowed to be happy whether or not you're successful.


Andrea Chalupa [00:09:34]:

Absolutely. Tell us about the Billionaire Bunkers that you are researching for Survival of the Richest.

Douglas Rushkoff [00:09:52]:

Well, you know, I didn't even really research them. They landed in my lap and I kind of ran as far away from them as possible. So I'm a writer, right? I've been writing for, you know, 20-30 years about new technology and the impact of new technologies on people. And I've been right enough times that the wealthy tech bros sometimes want me around to find out what's gonna happen. So I thought this was one of those talks, you know? And they brought me out to the desert to this weird resort, and I'm getting ready to do my talk about, you know, the future of digital technology on all of us. And usually they'll bring me… I mean, they know I'm kind of like a Marxist humanist coming to the den of uber wealthy capitalist people. And I feel like they usually hire me as something of an intellectual dominatrix. You know what I mean? That I come in and kind of yell at them for what they do and show them the error of their ways and say, “Oh, look, if you had done this with the internet, instead of that, we'd have a healthy world and the kids wouldn't be killing themselves and anorexic” and all that. And it’s almost like they hire me for the punishment so then they can go back the next day and just like an evil stockbroker after going, you know, getting beaten by his dominatrix, he can go back and do it, you know, to be mean to other people. I thought it was gonna be one of those, but I'm sitting in the green room waiting for the guy to come in with a little mic to hook me up and these five dudes just come in the green room and sit around this table and start peppering me with all those binary questions that these guys always have about the future.

Douglas Rushkoff [00:11:26]:

You know, it was like, “Bitcoin or Ethereum?! Virtual reality or augmented reality?! Web2 or Web3?!” And then finally one of them asked, “Alaska or New Zealand?” and then we're off to the races! And they're telling me about, you know, they're building their bunkers, they have their strategies and, you know, is this gonna work and is that? And I really kind of made my career more—or my fun, anyway—tweaking these guys than helping them. I kind of try to tweak them or challenge them to see whether they can see the fallacy in what they're saying or if they double down on what they're doing. And these guys just kept doubling down. So I asked them, I said, “Well, you know, so you go in your bunker and you're gonna be there and the bombs are going off, or the climate change has happened, or the economic unrest, or the revolution or whatever it is… The nanobots are running around killing people. How are you gonna defend your bunker from the rest of us?” And they say, “Oh, you know, we've got Navy SEALs on call, you know, sitting with their helicopters fully gassed. They're ready, and you push one phone button and these guys are gonna come” and, you know, and protect them. I say, “Oh, you're gonna have Navy SEALs at your bunker… How are you gonna pay the Navy SEALs?” “What do you mean? We’re rich, we got Bitcoin, we got money, we got gold.” “Yeah, but after civilization's gone, your money's worthless. Why are these dudes gonna still protect you?” And one of the guys opens his little moleskin book and he's like, [writing down] How to pay the Navy SEALs after the… And I'm realizing these guys haven't thought about any of these things. And then we spend the rest of the hour talking about how they maintain the allegiance of their security people after their money is worthless.

Douglas Rushkoff [00:12:59]:

So one of them is talking about, “Oh, we're gonna have implants. Everyone in the compound will have an implant and that'll decide what rooms they can get into and what they can do and all that.” And, you know, the implant could be used then as a kind of a virtual shock collar. I'm like, “Oh, try using a shock collar on Navy SEALs. See how well that's gonna go over.” You know, they'll rip the things out and then come kill you! And then another guy said, “Oh, well I could be the only one who knows the combination to the safe where we keep the seeds and the food.” It's like, “Oh, Navy SEALs in Iraq, whatever, they've never had experience getting information outta somebody [laughs]. You're gonna spend your apocalypse being waterboarded by Navy SEALs for the combinations and they'll kill you!”

Douglas Rushkoff [00:13:43]:

That doesn't work. And they’re like, “What's gonna work?! What's gonna work!?” And so finally I said, you know, the way to get your head of security to be nice to you in the apocalypse is to be really nice to him now. And I suggested, kind of jokingly, I said, “Why don't you, you know, find your head of security and pay for his daughter's bat mitzvah today, and then he'll have a hard time shooting you between the eyes when you're in the bunker later.” And they kinda laughed it off because, you know, the chances are that Navy SEAL head of security doesn't have a daughter getting a bat mitzvah. It's not the typical profile of those guys. But I meant what I was saying. And it was funny, the other thing that was confusing them was I was telling them about how if you talk to any real prepper, they're not prepping their house alone in their basement. The first thing a self-respecting prepper does is talk to their neighbors because they understand they need their whole block to be prepped or it's not gonna work. You're gonna have to then fight off your own neighbors for stuff. Everybody has stuff, you work it out together. And so I was explaining to them about that and they’re like, “Oh, so I get it. So if your neighbor has food and stuff, then he won't come banging. And then if his neighbor has… And then if his neighbor…” I'm like, “Yeah.” And then one of the guys said, “Well, yeah, but where does it stop? Where does it stop? You mean… You're acting like everyone is included in making it to the…” and I'm like, “Yeah! Imagine that. What if everybody is included in making it through the apocalypse. Then maybe you don't need an apocalypse at all.” And they're just like so… The idea of a kind of communitarianism or mutual aid or that we mitigate the impact of our assault on the planet and each other by creating a more social reality, defined more by mutual aid than competition, is just so alien to their, you know, Ayn Rand libertarian understanding of the world.

Andrea Chalupa [00:15:42]: 

I mean, don't they understand that these bunkers, especially the ones they wanna build outside New York City three hours away, are going to get overrun by the starving masses? Like, people don't worry about bullets when they need to feed their children. They just overwhelm and go for it.


Douglas Rushkoff [00:15:57]:

Yeah. I mean, that's why the smartest people I know who are in the bunker industry are in the education industry. You know, there’s one guy—he can't get anybody to invest in this thing—he's got these farms he's trying to build that are sustainable farms. He's not a new-agey sustainable guy. He's like an old school sustainable. So he’s got like chickens and roosters, rather than just chickens. He's got seeds rather than just seedlings. You know? So most farms, they're not sustainable. Most farms, they buy all their stuff from industry and bring it in. They buy their eggs and then hatch their chickens from them. They don't want roosters because it's all complicated. He's just got a good old-fashioned farm that actually has some circularity to it. And what he wants to do is sell rich people membership in the farm so when the Event happens, they can fly out in their copters or come out in their limo a couple of hours from New York and go live in this place with a dentist and a doctor and this and that and everything sort of arranged.

Douglas Rushkoff [00:16:56]:

But part of the investment when you spend your $10 million or whatever to be a member of this thing, a lot of it goes to a business that is an education business that's teaching communities how to build their own farms. Because he said for operational security, which is the language they use, you need to have as few people without ways of living as possible. You know, he said the problem for him is not how to defend the place, but the moral conundrum of what do you do when there's a woman at the end of the driveway with a starving baby? You know? So you want as few of those as possible. So even this guy, and he's, you know, this sort of right-wing, survivalist, gun toting farmer guy, but he's like, “Yeah, but in the end, everyone has to know how to do this or it doesn't work.”

Andrea Chalupa [00:17:43]:

In the years leading up to the pandemic, there was talk about “the Event,” that some terrific smash was going to happen, and then the pandemic kicks in and everyone's like, oh, here it is.

Douglas Rushkoff:

Mmhmm <affirmative>

Andrea Chalupa:

And we got through that. But it seemed like more of a preview and that there's future smashes to come. We saw how vulnerable a lot of these systems are. What do you see coming down the pipe? In your view, what do you think will most likely be an event? Do you see a sort of event scenario happening? How likely do you think that is?

Douglas Rushkoff [00:18:17]:

Well, I mean, I don't know how healthy it is to spend a whole lot of time, you know, meditating on the existential risks, right? You're walking in New York, a piece of scaffold can fall on your head and kill you. I saw it happen once, that's why I mentioned it. I saw it happen to this dude and it's like, Wow, boom. That fast. Because there's like real metal up there. Those are… [laughs] Those are big things. And it's funny because all the scaffolds are there to prevent a piece of brick falling off a building, you know, and killing somebody, which is how it all started because some people were getting killed as the buildings were falling apart. So it's like everything leads to other things. So, I mean, it seems that the pandemic we just had was the first of many. You talk to smart disease people and they say that, you know, the bird flu is the one that's sort of coming next and that's gonna make this one look like nothing, but whatever it is, let's say it's bird flu coming, you can try to stock up on Cipro or something. I don't know… Although it's a virus. Or you can start to improve the quality of your immune system now, right? And how do you improve the quality of your immune system? By eating more naturally or probably eating less meat, by doing all sorts of kinds of things that actually make bird flu coming less likely. You start living a life where your immune system is healthier. Experiences of awe, what I was talking about doing for your kids before. Let them experience awe. When you experience awe, you have a cytokine response and your immune system is regulated for days later, just looking at one mountain, one valley, one birth of a baby, anything that gives you that sense of awe and connection to something greater than yourself, your immune system improves. And it seems obvious psychologically why it would. It’s because you're experiencing the rest of reality less as a threat than as support, as something that's feeding you, not threatening you. It's the opposite of the tech bro isolationist, fear of nature, Francis Bacon understanding of nature, you know, that nature is this thing that you're gonna hold down and submit to your will. Nature becomes instead this friendly, open, beautiful opportunity for intimacy and opening and all that. So I try to, anytime I do sort of entertain an apocalypse scenario, accompany it with, Well yes, but you can avoid this by that. Or, you know, the oil crisis and climate change is big. It's big. But the ways of reversing that are there: you saw how quickly, just under COVID initially, dropping our travel changed the environment.

Douglas Rushkoff [00:21:10]:

I mean, this sounds ridiculous but it's actually true. If everybody traveled no better than someone with say $10 million or less, if we all traveled at the $10 million or less level, it would probably solve climate change just like that. It's the few people that have like more than $10 million or more than $20 million or more than a hundred million who are impacting so much more than thousands of people worth of impact. I mean, Jeff Bezos's yacht and service yacht—his yacht is so big, but you can't land the helicopter on his yacht so he has a service yacht that accompanies his yacht because his yacht is a sail yacht and you need a motor yacht to be flat enough—his yacht and service yacht use thousands of people's worth of energy, right? I mean, it's the excess. So when I look at individuals and the waste of individuals, I don't really feel so bad about that compared to the excess waste of the ultra ultra wealthy. So the cure to, you know, 1.5 degrees temperature increase and whatever is so simple. It's really just tamping down on the ludicrous excess and then using the surplus at that point to actually feed real people.

Andrea Chalupa [00:22:35]:

What is the solution then? Should we stop these pop stars and other billionaires from taking their jets across Orange County? Should we regulate yachts? What should the government be doing to rein in that immense waste by the 1%?

Douglas Rushkoff [00:22:52]:

I mean, it's hard because kind of the 1% sort of own the government right now. I mean even on the left, you know, it was impossible—and I understand the Democratic Party's a private party and it turns out that, no, it's not about who… I mean the Democratic Party, they were gonna vote for Bernie Sanders, who was probably gonna do something or try to do something about this in a somewhat anti-establishment way, but it turns out that it doesn't matter. They're allowed to say, “Oh no, we (the people running the Democratic Party) want this.” And that's who your candidate ends up being. So democracy is… It's really hard to do it through the government. And usually when you try to regulate something—like now they say they want to go regulate AI—the people that are jockeying for positions at the table in AI regulation space are not the people you want doing that.

Douglas Rushkoff [00:23:43]:

They're the same people that don't really value human beings, the same sort of techno-solutionists that think that human beings are the problem and their technologies are the solution. I mean, if government could regulate it, then sure, I would do some really simple things. I would reverse the capital gains tax with the dividends and wages tax. So right now, when you earn money by working, you're taxed very high on it. When you earn money by doing nothing, you're taxed very low on it. So people who have billions of dollars that are earning billions of dollars pay almost no tax on it. And it encourages a very different kind of company. So the companies on the New York Stock Exchange, say, you're gonna pay less tax if the value of your stock goes up than if you get dividends, right? If a company's earning money and doing good things, just selling stuff, they'll be able to pay back dividends.

Douglas Rushkoff [00:28:38]:

They could stay at one size, not have to grow, not have to dig more factories, not have to take over more territories. They could run sustainably. Companies are not allowed to run sustainably. Shareholders, they don't want money. They don't want dividends because dividends are taxed high. What shareholders want is the growth of the stock. They want the stock value to go up and that only goes up if the company grows. So giant companies like Pepsi and Coke and Google, they have to grow still bigger in order to survive. And that doesn't work. That's why the planet is buckling. So you could flip something like that and it changes the way wealth works. You might end up with more millionaires, but maybe fewer billionaires. And that's actually okay. You know, I go to business schools all the time and I say at the beginning, “Would anyone here be satisfied making $50 million?” No one raises their hand. They all have the home run. They all want to be billionaires. They all wanna… They think that the idea of settling for $50 million means they're not really in it for the home run. And I try to convince them that $50 million is enough and they might be able to live happier lives and have better companies and do less damage if they could just set their sights down to that impoverished $50 million level. And it’s an interesting conversation to have with them and to see their brains try to wrap around envisioning what kind of life would that be? You know, how would that… Because most of them are not gonna make a billion anyway. And I also tell 'em the probability of hitting the $50 million mark is gonna be a lot higher too, so they could steer everything towards that. It's odd. [laughs]

Andrea Chalupa [00:26:26]:

It's very odd. How do we get to this point where $50 million is not enough anymore?

Douglas Rushkoff [00:26:31]:

Well, because they've looked at the stories of the other founders. My friend Evan Williams is one of the founders of Twitter and I remember the day that they went public and he and the other founders’ faces were on the cover of the Wall Street Journal with the amount of money they made that day under their face. And he had like $4.3 billion under his. And I thought, This poor, poor guy. You know, I was happy for him because someone I knew was a billionaire. But then I thought, if Wall Street's giving him $4.3 billion, it's because they expect a hundred X, right? A hundred times that to come back. And I'm thinking, How is he gonna make that off a 140-character messaging app? And he couldn't. I remember Twitter was making $2 billion a year in revenue off a 140-character messaging app and it was considered an abject failure by Wall Street because they had peaked at about $2 billion a year. I think making $2 billion off a 140-character messaging app… Jackpot. That's great. Can you imagine going to grandma? “Grandma, I'm making $2 billion a year off a 140-character messaging app!” That should be a win. And in our schema, it's not. And when you see that story, that's how you get there. You realize, “Oh, I need infinite exponential growth in order to be okay.” And then when you combine that with all the apocalypse fears, “I need infinite exponential growth in order to make it out of here, in order to outrun the damage I'm creating by earning money in this way.” And that's how you get to the mindset, right? “I need to build a car that can go fast enough to escape from its own exhaust.” And that's impossible because there's this thing called karma, right? And it's not a spiritual thing, it's a real thing. You're here with us. You are here with us, whether you like it or not. You're connected. And that's the thing I'm so confused about with these guys. They can go down to South America, do ayahuasca and mushrooms with a shaman and have a better psychedelic experience than you and I would ever be able to afford and they still come back as tech bros. They still come back utterly unconvinced about the interconnectedness of everything in the world and unable to cope, still hoping to somehow separate and isolate themselves, to create some hermetically sealed virtual reality web3 robot-controlled environment in which they live out their predictable lives.

Andrea Chalupa [00:28:56]:

What did Evan Williams think about the sale of Twitter to Elon Musk and what he's done to it since?

Douglas Rushkoff [00:29:05]:

You know, I don't even really know. Evan moved on to Medium, you know, and I write for that, although he actually sold Medium, or I don't think he sold it, but he stepped down as the CEO of that. You know, I think it was just a lot of time. I'm sure he’s sad about where Twitter went because Twitter was so—and this was Evan's talent—Twitter was so lightweight. It asked nothing of you. You didn't even need algorithms with Twitter. I mean, now they have 'em and all, but you didn't. You just subscribe and you see and you swipe and you type. It was so easy. But you can't have exponential growth forever with something that's just a great utility. And at some point, what do you do? What do you do when the whole world is wearing Birkenstocks? How do you grow exponentially with Birkenstocks? You can't, right? But you're not allowed to think about tech like you think about Birkenstocks. With tech, you have to think, Oh, because it's ethereal, because it's, it's replicable, it needs to be able to grow exponentially forever. And that's just not fair.

Andrea Chalupa [00:30:10]:

It's just not possible.

Douglas Rushkoff [00:30:12]:

Right. And now, I mean, Elon Musk is not even… He doesn't care about Twitter as a thing. I mean, for Elon Musk, I really think he wants to make Twitter into a bank as the ultimate thing. He wants it to be the Venmo, or the everything. You know, your online identity will be based in your Twitter account.

Andrea Chalupa [00:30:28]:

Well, he was backed by the PayPal mafia. Peter Thiel was pushing for that. He's close with David Sacks and those guys. 

Douglas Rushkoff [00:30:36]:

That's his original business, yeah.

Andrea Chalupa [00:30:38]:

Right. What is their deal? [laughs] I know they're all like these libertarians. They love to be just anti-woke and all that stuff, all the shenanigans they used to do back in the day on their college campus. But what is their endgame? Do they just want to stick it to women and non-white people and LGBTQ people and just side with Russia and prop up Russian propaganda points and just sort of burn it all down? What is their ultimate endgame?

Douglas Rushkoff [00:31:04]:

I mean, they see what we think of as government and regulation and the civic sector as an unnecessary drag on the human project, that they're trying to reach escape velocity, basically, so any regulation slows them down. And now that there's climate change and a limited amount of time for them to get off the planet as post-human cyborg entity AI human hybrids, they don't want us to slow them down. Interestingly enough, they signed this thing about AI to slow everybody down for a month, but that looks almost like… Well that's a very big conversation, but I don't think they really wanna slow it down at all. I think they wanna slow down everybody else, you know? [laughs] And kind of retool so they can do it in the direction they want, but I also think it's fake. I don't think their AIs are really AIs at all. But what happened to them is they have this view of us, the same view of us that the conquistadors had of us, right? Humans are to be abused. We're just blood for their, you know, vampiric rituals or whatever. We're the workers. We’re the thing that gets left behind. And I think what they realized was that the AIs that they're building look at them that way. That the AIs want to do to them what they want to do to us. And they're like, “Oh shit.” If the AIs are gonna be like we're making them, then of course the AIs are gonna look at the tech bros with the same disdain that the tech bros look at us. So I think they want a minute to kinda regroup [laughs] and think about that. But the whole thing feels to me more like a sales job than anything else. The AI panic, it feels like an AI sales campaign more than anything else.

Andrea Chalupa [00:32:57]:

Like PSYOPs basically.

Douglas Rushkoff:

Yeah.

Andrea Chalupa:

They’re trying to win at that game. Now, what is the role of AI? Are we all going to basically be farmed? Are we headed towards the Matrix? What role do you see AI playing in the future?

Douglas Rushkoff [00:33:13]:

Well, we were already farmed, you know, at least up till whenever in 2021 ChatGPT’s learning stops. I mean, we will see. AI currently requires a tremendous amount of labor that nobody's talking about. All these human beings who have to tag all the data that's going in there. It’s zillions of people making 15 bucks an hour or less tagging data for the AIs or digging up the cobalt and the rare earth metals for the chips that make the AIs or the servers or the this or the that. There's a huge labor force. So it's not that AI's replacing labor, it's just displacing labor or hiding/camouflaging labor differently. It's more like a dumbwaiter than it is like an actual worker replacement. It just hides the worker better. And right now we know what AIs cost to make. We don't yet see how AIs will make money, or certainly how language learning models will make as much money as even a Google search. You know, Google makes money because there's ads that you can click on or not. AI doesn't quite work that way unless you're gonna have the AIs giving you ads, you know, between things, or you pay to get included in an AI's response. But really so far AIs as I see them are just less accurate, more curated search engine results. They throw back at you not what you've asked for, they throw back at you the most likely response from what's already there. If you go to ChatGPT and you put in the question, “What weighs more? One pound of feathers or five pounds of lead?” it will say they weigh the same because it knows so much, there's so many people that say, “What weighs more, a pound of lead or a pound of feathers?” and it's like this old joke that it answers the wrong thing like that.

Douglas Rushkoff [00:35:10]:

Or if you ask it to come up with references for a paper, it makes up the references. They're not even real. I mean, that's how I can tell which papers are written by AI and which are not. I look in the footnotes that the kids put and if they're not real references, they're fake, because they don't care. They don't know. AIs don't know the difference between real and fake right now. They're just churning what looks like something, right? There's no logic in it. So it's funny. I mean, I don't see the real value in them yet because the kinds of places we want to put them require accuracy, so I don't want an AI reading my x-ray and making up whether or not I've got cancer somewhere based on what's likely. There's so many humans that will have to check the work that it might as well be done by humans, unless you use it in non-mission-critical things like entertainment or the arts. But who wants to look at a painting that's made by an AI? I mean, I know people want to make paintings made by AI, but when people wanna show me the pictures they made with AI, it's a bit like people wanting to tell me about their LSD trip. I know it was important to you, but it really doesn't matter to me. If I look at a painting of Van Gogh, say, I'm not looking at it for the way the pixels affect my retina. I'm looking at it to see, What was this tortured human being experiencing? I'm connecting with Van Gogh through the years, through his work, through the brushstrokes. Or James Joyce's writing. It's a human-to-human connection that the arts are for. I mean, I could understand maybe making AI wallpaper or AI music or something, but that's not what my entertainment or arts experience is for.

Douglas Rushkoff [00:36:35]:

So what do you use it for? Special effects, maybe. Right? That's fine. You know, to animate the backgrounds of something or CGI or make a dinosaur render properly. And that's machine shit anyway. So I don't have the same threatened feeling about it except to the extent that AI can be used, to use your word, as PSYOPs. Machine modeling learns really well, so if they learn which combination of words is more likely to make a person do this or do that, then it's gonna be really effective for advertising. You know, the same way algorithms got really good at figuring out, Oh, if I put a picture of this person's ex lover having fun, they're gonna be more involved with Facebook over the next two hours than if I didn't. So the AIs will figure out stuff like that. If I say this, it will make them worry about the health of their child and they will go buy this remedy.

Andrea Chalupa [00:37:52]:

Yeah, no, absolutely. And to your point about that human connection, no human being can beat a machine at chess, but we still have grand chess masters. We still have chess tournaments that people will livestream for days and days and days. So, you know, chess is still around with all these individual players that people connect with. So do you think it's going to be a bit like that?


Douglas Rushkoff [00:38:15]:

Yeah, I mean, think about it. Same with tennis. If you made a robot that could play tennis better than any human being, what does that mean? It's like, yeah, I have a forklift at the Home Depot that could lift more weight than any human being, but they still have weightlifting contests. Hmm. Duh, right? I mean a gun, I can kill you with a gun, but we can still have a boxing match. So yeah, it makes kind of no sense that, Oh, chess is now obsolete, unless we learned—which would be interesting—we learned something from the AI. Where AIs get interesting is like, Well, what has Garry Kasparov learned about chess by playing with an AI as a kind of a training? It's interesting.

Andrea Chalupa [00:39:02]:

And what about the automations? We've had automation decimate jobs in the heartland and now there's this nervousness that it's coming for the urban centers, for the so-called white-collar jobs. You mentioned entertainment might be impacted. We have the writers' strike going on now. One of the big points the writers have is they don't wanna be replaced by AI. Are we going to see, inevitably, in some shape or form, the same automation that's impacted the heartland factory jobs now coming for the urban centers?

Douglas Rushkoff [00:39:36]:

Yeah, but I think it's mostly, you know, I hate agreeing with Sam Altman on this, but I do think it's more the tasks than the jobs. You know, they call it TaskRabbit, not JobRabbit. And I think it's stuff that mostly you wouldn't wanna do; calculating the, you know, mortgage actuarial percent of blah blah. And as long as we're disciplined and use it for that, we won't end up in the same kind of trouble we're in now. Now, judges use AIs to determine sentencing guidelines for prisoners and AIs put Black people in jail longer than white people. So that's not a good place to use AI, right? It's a bad place to use it, especially because there's the illusion that the AI is somehow less biased or less racist than humans, and it's just as, if not more, racist than people because it's trained on racist data. And then at the same time we're doing that, we're refusing to teach how racism works in public schools because there's some fear that I don't know what that's gonna do… Make white people hate themselves I guess is the fear, that if we find out that there's structural racism. If anything, if you find that there's structural racism, white people should hate themselves less because they're like, Oh, it's not me, it's the structure that I'm in. In some ways it should relieve white guilt and we could all work together on solving these problems. But instead, we're turning to things like AI to make the problems—again—more distant from us. And it looks like, Oh, the computer came up with this, so it must be fair. We'll see. But I'm not afraid. In the end, I am so fine with robots doing all the work if I get to do all the play. That's the original… Read Norbert Wiener's The Human Use of Human Beings, when they were first talking about this in the late ‘40s and early ‘50s, and they were looking at the possibilities for these thinking machines to take over human jobs. What you have to do is not disable the computers from doing the jobs, but look at why do we have jobs in the first place? Now, do you have a job because there's this work that needs to be done or do you have a job in order for society to justify letting you participate in the spoils of capitalism? Which is what it really is. And you don't need to be a Marxist for anybody to start thinking about this. It's like, Oh, wait a minute, if I start borrowing stuff from my neighbor rather than buying it at the Home Depot, that's good for the world, good for me, good for my neighborhood, there's less toxic metals being used, all that, and I'm costing somebody a job at the drill factory because we're buying less drills.

Douglas Rushkoff [00:42:06]:

Why is that a bad thing? That should be a good thing. So now that dude is freed up to either sit on the beach or play more with his kids or help take care of people or do one of the many things that we actually do need done, you know? But what happens now is people lose their jobs and then someone comes up with some, you know, bullshit piece of plastic that we can manufacture to create a craze and everybody buys this piece of plastic and uses it for a bit and then sticks it in a storage unit or throws it out and buries it in the ground, all so that this person could have a job so we could justify letting them have food. You know? And that's dumb. I mean, this is what I wrote about in my book, Life, Inc., in 2007: jobs were invented in the late Middle Ages. Jobs were invented when the chartered monopolies came around. Everybody used to have small businesses. They'd make stuff and trade stuff. The problem was the peasants were becoming wealthy and the aristocracy was getting poor, so they made all those local businesses illegal and they created these things called chartered monopolies where now you have to work for one of the monopolies. That's when employment began. That's when wages started. Instead of people selling the stuff that they made, they started working by the hour, right? So now you are selling your actual time to someone else, and that's very different. That's employment. Well, what if that age is over? That's the industrial age. What if we've reached the limits of that model? What if AI and digital technology helps us see that, Oh, the industrial age model is over. We don't have to measure people by their utility value anymore. They're not machines. They’re something else. We can let the machines do all that mechanistic stuff and human beings can return not to STEM but to the humanities, to the liberal arts, to what it is that humans are most special for.

[theme music - outro, roll credits]

Andrea Chalupa:

Our discussion continues and you can get access to that by signing up on our Patreon at the Truth-teller level or higher.

Sarah Kendzior:

We encourage you to donate to help rescue and recovery efforts in Turkey and Syria following the devastating earthquakes in early February. To help people in Turkey, visit the TPF Turkey Earthquake Relief Fund at tpfund.org.

Andrea Chalupa:

To help Syrians in need, donate to the White Helmets at whitehelmets.org. We also encourage you to help support Ukraine by donating to Razom for Ukraine at razomforukraine.org. In addition, we encourage you to donate to the International Rescue Committee, a humanitarian relief organization helping refugees from Ukraine, Syria, and Afghanistan. Donate at rescue.org. And if you want to help critically endangered orangutans already under pressure from the palm oil industry, donate to the Orangutan Project at theorangutanproject.org and avoid products with palm oil.

Gaslit Nation is produced by Sarah Kendzior and Andrea Chalupa. If you like what we do, leave us a review on iTunes. It helps us reach more listeners. And check out our Patreon; it keeps us going.

Sarah Kendzior:

Our production manager is Nicholas Torres and our associate producer is Karlyn Daigle. Our episodes are edited by Nicholas Torres and our Patreon-exclusive content is edited by Karlyn Daigle.

Andrea Chalupa:

Original music in Gaslit Nation is produced by David Whitehead, Martin Visseberg, Nik Farr, Demien Arriaga, and Karlyn Daigle.

Sarah Kendzior:

Our logo design was donated to us by Hamish Smyth of the New York-based firm, Order. Thank you so much, Hamish.

Andrea Chalupa:

Gaslit Nation would like to thank our supporters at the Producer level on Patreon and higher with the help of Judge Lackey, the narrator of the new Gaslit Nation graphic novel, Dictatorship: It's Easier Than You Think…
