Podcasts

Crazy Town: Episode 118. Choose your AI Adventure: Immiseration or Extinction

January 28, 2026

Show Notes

Jason and Asher replace Rob with a much more humane and humble co-host, Elon Musk, to explore the feasibility of harnessing the entire sun to power AI superintelligence. We come away perplexed that not much of the excellent reporting on the environmental, energy, and financial risks of the AI boom addresses the googolplex-sized elephant in the room – that both AI success and failure lead to immiseration. Originally recorded on 12/3/25.

Sources/Links/Notes:

Related episode(s) of Crazy Town:

Episode 101, “Even AI Chatbots Hate Us: The Rise of the New Luddites, with Brian Merchant”

Episode 77, “The Elon Musk Episode about Elon Musk Brought to You by Elon Musk”

Episode 84, “Escaping Technologyism: Dreams of AI Sheep and the Deadliest Word in Film History”

Transcript

Asher Miller (00:01):
I am Asher Miller.

Jason Bradford (00:02):
And I'm Jason Bradford. Welcome to Crazy Town, where AI data center security guard is the most coveted and only job left. Hey, it's nice to see you. It is great. I mean, it's just the two of us.

Asher Miller (00:19):
I'm so excited.

Jason Bradford (00:20):
I know

Asher Miller (00:21):
Rob is not here.

Jason Bradford (00:22):
We get to do what we want.

Asher Miller (00:23):
We get to do what we want and we get to shit on him. I know. And we don't even have to tell him. This will be a test to see if he ever actually listens to episodes that we record.

Jason Bradford (00:32):
We'll find out. He doesn't have to know

Asher Miller (00:33):
But in the absence of Rob, I thought we'd invite another person to join us today.

Jason Bradford (00:37):
Oh yeah. I think this is great. People are gonna be-

Asher Miller (00:39):
Why don't you introduce yourself?

Elon Musk (00:41):
Elon Musk, E-L-O-N-M-U-S-K, Chief Executive Officer of Space Exploration Technologies, or SpaceX.

Asher Miller (00:48):
So yeah, Elon!

Jason Bradford (00:50):
I mean, great having you buddy.

Asher Miller (00:51):
Yeah, it's so wonderful

Jason Bradford (00:52):
I did a big profile of you in our season on False Profits. And you were actually the ultimate false prophet.

Asher Miller (00:55):
Hopefully Elon didn't listen to that. Obviously Elon's not here today, but I did want to play a clip of Elon talking that I was made aware of through this great podcast called Search Engine. They did a two-part series on data centers and AI, and lemme play this clip, okay?

Elon Musk (01:14):
Say for argument's sake, 10x more compute will double the intelligence, and I think we'll see intelligence continue to scale all the way up to where most of the power of the sun is harnessed for compute, and then ultimately most of the power of the galaxy.

Jason Bradford (01:29):
So I don't know why he stopped at the galaxy.

Asher Miller (01:32):
Well, I mean, if he was Ray Kurzweil, he would have gone much further than the galaxy.

Jason Bradford (01:36):
I think he's trying not to scare people with the grandeur of his vision. He's toning it down.

Asher Miller (01:41):
Okay. So this is the reason why, and I think it's probably a good thing. Rob is not here. He probably doesn't want to listen to us whine to each other about how crazy the world is. But here's the thing. That was a talk that Elon Musk was giving, right?

Jason Bradford (01:57):
It was recently too.

Asher Miller (01:58):
He's being interviewed by somebody about this, and the thing that amazes me, right - there's been all this coverage that we've been seeing about the rush for AI, questions about whether there's an AI bubble happening right now, people doing great reporting. Like Search Engine did this two-part series on data centers and the impact of them on communities and the rush for them. All this great reporting. Why is no one just stopping with their jaw hanging open when people like Elon Musk, the richest man in the world, say something as fucking stupid as that?

Jason Bradford (02:35):
The Daily - again, we're going to harp on the New York Times again - but that Daily episode was really egregious about this, I felt. Definitely not connecting dots. I did think the Atlantic podcast that came out recently about the AI bubble did a little better job of taking the next step and sort of saying, wait a second here: if these people are wrong that the AI investment is worth it, that it'll pay off, then there's a financial calamity. If these people are right that they're going to create artificial superintelligence in the next few years, which they say they will, well then, how do you make a return on investment? They don't know how. We're going to let this artificial superintelligence figure it out, which then basically removes the need for humans to exist.

Asher Miller (03:23):
Right so -

Jason Bradford (03:23):
Then we're also screwed.

Asher Miller (03:24):
We're also going to be out of jobs or whatever.

Jason Bradford (03:27):
So it's actually very hard to find any good reason to think this goes well. You struggle to figure out why we're doing it unless you get into pretty dark places.

Asher Miller (03:43):
Let's just back up for one second. So if you take it at face value, what all of these tech companies are espousing and what they're doing, we're talking about hundreds of billions of dollars of investment. Trillions.

Jason Bradford (03:58):
They expect $3 trillion of investment, with about a trillion of it being debt, which involves some really weird financial stuff that's like -

Asher Miller (04:06):
Sure. Reminds you of certain times in our not so distant past.

Jason Bradford (04:12):
Packaging things and reselling them and leveraging them. Exactly.

Asher Miller (04:15):
Credit default swaps or whatever.

Jason Bradford (04:18):
Yes. For data centers.

Asher Miller (04:19):
So let's just take it at face value. Either they are successful or they're not successful. Let's just say there's a binary choice here. And in both of those scenarios, the outcome is bad. There's no conversation. It feels like that's happening about why this is moving forward. Where is the outcry? Where are people asking questions about why are we doing this?

Jason Bradford (04:46):
Sometimes you get the people who are being affected immediately, where the data center is kind of ruining their local area. So it's like NIMBYism, and they then dig deeper and say, you're screwing up our water, or all this pollution is happening because of unregulated power plants. And a few of those then try to make these arguments.

Asher Miller (05:05):
Or it's driving up electricity prices.

Jason Bradford (05:07):
Yeah, exactly. The bigger fish, like the supposed intelligentsia of the world, they seem to just go along for this ride, looking at these AI developers as the new oligarchy or the gods. Although I do think the real thing that's going on, maybe behind the scenes and deeper, is this national security side of things, right? It's about war. It's about military. It's these superintelligent entities that will paradigm-shift things our way for geopolitical reasons, and everything else is just a smokescreen for that, at least at these deep-state levels.

Asher Miller (05:46):
So we're talking about what are some of the drivers that have basically people going along with this, right?

Jason Bradford (05:52):
Right.

Asher Miller (05:52):
And you're right. I mean, Search Engine went into Memphis, which is where Elon Musk's AI company is rapidly trying to build out these data centers. And there's the pushback that's happening, which wasn't successful, on the part of people in this local community - a very disadvantaged, historically disadvantaged, under-resourced community - against them coming in there and basically taking over this old derelict factory. So there's that kind of pushback. But nobody in power at a larger scale, whether they're pundits or newspapers like the New York Times, or elected officials or whatever - nobody's sort of being like, wait a second, why is this even happening? Can we just hit the pause button? So if we're looking at the reasons why people aren't doing that, one, I think, might be the story of progress. We're sort of locked into this idea that's kind of unquestioned. There might be debates about whether this is a good thing or a bad thing. But the general idea is that we should always be progressing on technology. We should never just say, no, we shouldn't do this. It's just questions of, is this happening too fast? What is it doing to electricity prices? What are the impacts on people? How do we make it less bad for people? But never that we should just say no to -

Jason Bradford (07:12):
A luddite thing.

Asher Miller (07:13):
Yeah. Technological advancement, right?

Jason Bradford (07:16):
Resistant to saying that for sure. That's one. Yeah.

Asher Miller (07:18):
Right. So that's the myth of progress, basically - we're all locked into it. The other one that you just brought up, I think, is almost like a FOMO thing. This fear of missing out.

Jason Bradford (07:28):
This is maybe the investor class kind of thing, business people.

Asher Miller (07:31):
And security. So you were just talking about the security issue, which is like governments saying, if we don't jump on this super fast and get ahead, the U.S. is going to be behind China and China's going to eat our lunch and this is a security risk for us. So we have to double, triple down, quadruple down on this thing because -

Jason Bradford (07:47):
I mean, the Chinese cat videos I'm seeing right now are so good and realistic.

Asher Miller (07:50):
They're much better than ours. The cat video I saw had 16 likes. It was really weird.

Jason Bradford (07:56):
And what? They're doing it with a 10th of the energy of our cat video productions. So I mean, they're just beating us in that for sure.

Asher Miller (08:02):
Yeah, that's a major security risk for us. So there's fear of missing out from the investor class, as you pointed out. This is what always happens, I think, with people who understand that we're in a stock market bubble of any kind: they're trying to stay in it for as long as possible and time their exit. It's like musical chairs. We've got to time when the music stops. I've got to get out right before. There's that thing, because they fear missing out on that extra cool million dollars when they're already a fucking billionaire or whatever. You know what I mean? It's insane. But you're right, national security. And, they talked about this in the Search Engine thing, there's also fear of missing out on the opportunity of economic development that might come from the rush of building out these data centers. And if you actually care about people from a government side and you want to lift people up out of poverty, you're like, we have to embrace this.

Jason Bradford (08:58):
The only thing right now that's leading the sort of quote-unquote "growth" within the U.S. economy is contracts for data centers. And so you don't want to say no to that.

Asher Miller (09:11):
But can we point out again the simple premise: if they're successful, we lose all our jobs. If they're not successful, we lose all our jobs. It's not all our jobs, but is -

Jason Bradford (09:23):
Then also, if you get to the Elon level - so most people are not talking at the scale of Elon Musk, of taking over the galaxy.

Asher Miller (09:29):
Well that's a whole other thing, which is the impossibility of -

Jason Bradford (09:32):
Yes. The other thing is the misalignment, or non-alignment, problem with an artificial superintelligence that will then figure out a way to kill us all, because it's never going to be aligned with us, and we can't actually program these things with Asimov's Laws of Robotics. When you create a general intelligence, you create, I guess, a free-will environment where it's an agentic thing that has to evolve and adapt. And so we actually have no way of knowing how to control any of this stuff. It's a major problem, and people are freaking out and writing books about it.

Asher Miller (10:04):
That is a subset of what I was just saying, which is this binary, if we're successful, we're fucked. If we're not successful, we're fucked.

Jason Bradford (10:11):
Either you lose your jobs or you all die.

Asher Miller (10:13):
Right. But if you look at this rush for AGI - artificial general intelligence, basically - this idea that these machines will become intelligent on their own.

Jason Bradford (10:24):
Yes, and program themselves.

Asher Miller (10:25):
Think for themselves in the way that we can imagine intelligence happening. So we've talked about this before. I think we have some skepticism about whether AGI can actually be achieved, right?

Jason Bradford (10:35):
Right.

Asher Miller (10:36):
But they're banking on AGI being achieved.

Jason Bradford (10:38):
It must be achieved.

Asher Miller (10:39):
Because it has to be achieved in order to figure out, like you said -

Jason Bradford (10:41):
How to make money.

Asher Miller (10:46):
How to make money. But then, if they're not successful achieving AGI, we're kind of fucked. This is all basically cat videos, like you're talking about. Just AI slop bullshit or call centers.

Jason Bradford (10:57):
Just ruining our brains.

Asher Miller (10:58):
Exactly.

Jason Bradford (10:59):
Teaching us how not to write, how not to think, how not to draw, how not to make music.

Asher Miller (11:02):
And all the promise of it is gone. It won't materialize. Or if it actually is successful, it'll kill us. What are we doing?

Jason Bradford (11:11):
I don't know. Nothing makes any sense.

Asher Miller (11:13):
Nothing makes sense. But that's the thing that it's like - This is the crazy town thing of it all for me, listening to a guy like Elon Musk spouting the bullshit he just spouted, that makes perfect sense. "Blah, blah, blah, blah, blah. We're just going to harness the power of the sun and then eventually the galaxy," and everyone's nodding along.

Jason Bradford (11:34):
Who are these people?

Asher Miller (11:34):
This fucking guy should be just laughed off the internet.

Jason Bradford (11:37):
I know. Right.

Asher Miller (11:38):
Know what I mean?

Jason Bradford (11:39):
So I watched this video, it was terrible, of Elon Musk sitting with the head of Nvidia, the CEO - I can't remember his name, but everyone's seen him now forever.

Asher Miller (11:48):
With this leather jacket.

Jason Bradford (11:49):
He's got a leather jacket, he's got a mic in front of his mouth all the time, talking on stage, walking around with supreme confidence. Anyhow, they're talking about how the next big thing is we've got to move the data centers to space. And he's talking about the next five years. Because if you look at how much -

Asher Miller (12:05):
Because we're so successful getting things into space right now. Data centers, ha.

Jason Bradford (12:09):
We're going to need like 300 or 400 gigawatts of power running these data centers pretty soon. And the U.S. uses 500 or 600 gigawatts. So we're talking two thirds of the power of the U.S. is going to be needed to run data centers in a few years. And the only way you can get that power is in space. And what's great about space is these chips, they cool themselves, right? And then the Nvidia guy goes into it and he's like, "Exactly. A lot of the mass and energy that goes into our data centers is actually cooling these chips. And so our payloads are going to be light compared to what they -" These guys are just like, "Yeah, yeah, we're going to send Nvidia chips into space very soon." And this is where he gets to start with the sun, and then the galaxy is . . .

Asher Miller (12:52):
Yeah. Well, I wonder if he's talking, if Elon is talking to Jeff Bezos, or if they're still in this space war or whatever? Because Bezos has been all about harvesting the moon. So if we can match these things up, we can actually harvest all the raw materials, I guess, that we need to go into the chips that Nvidia makes. So instead of harvesting Adrenochrome on the moon, which is what we're currently doing, according to Alex Jones, we could actually have operations to harvest materials off of the moon. And then I guess we'll have all the data centers just orbiting around Earth, no problem. And, I dunno, somehow beaming stuff down to Earth so we could watch our cat videos that much more quickly.

Jason Bradford (13:31):
Then this warning, this flashing -

Asher Miller (13:32):
By the way we say cat videos, we're really talking porn. Let's be honest.

Jason Bradford (13:36):
Cat porn.

Asher Miller (13:38):
We can do anything we want, anything. Furry porn.

Jason Bradford (13:41):
Anything we want.

Asher Miller (13:42):
I mean, it's amazing.

Jason Bradford (13:43):
It's incredible. Now what gets me is, what's it called? There's this point where you have so much stuff up in space that one thing breaks -

Asher Miller (13:52):
Oh. The chain reaction.

Jason Bradford (13:53):
And the chain reaction. And then you can't stop it. And then space is useless. And we're super close to this happening.

Asher Miller (14:00):
Right? Well, no, no, no, no, we're not. Because by the way, I don't know if that was Search Engine or if there was a different podcast. I think it was Radiolab.

Jason Bradford (14:08):
Radiolab did one years ago.

Asher Miller (14:09):
Radiolab did an amazing episode about the people working in these agencies to study the orbits of all of these satellites and other things we're putting into space, trying to work out, with this very complex math, what might happen if this one thing gets hit and then it triggers all these other things, right?

Jason Bradford (14:27):
Yeah.

Asher Miller (14:27):
But here's the good news. We've gutted all these government programs. I'm sure that Elon, with DOGE, has canceled it. And if you don't see it, it's not happening - this is what's happening with climate right now. We're not monitoring it anymore, and so climate change is no longer happening. So if we're not monitoring the risks of this chain-reaction thing in space, it won't happen. We'll be fine. We'll be totally fine.

Jason Bradford (14:51):
I mean, this is what I would love. The kind of reporting I would love to do, and would love to see people do because we don't have the budget to do it and I'm old and tired, is stuff like all the things that could go wrong. Can someone start asking, or make a list, of how many ways we're either fucked if it works or fucked if it doesn't, and then how many ways it can fail simply because of biophysical limits? Oh, we have another drought, so the Arizona data centers can't get enough water, so they're shut down, or whatever.

Asher Miller (15:19):
Or we take the water from farms, you know?

Jason Bradford (15:22):
Yeah. And then there's a food problem and there's riots. So there's just one after another. You can imagine them not actually reaching these goals for this build-out, which of course means that the bubble is much more apparent. And they're planning on spending just an obscene amount of money in the next few years to achieve something like this. And of course, there's so many ways that can fail. Can reporters just look into: now, wait a second, how could this fail logistically? The number of contractors, natural disasters, conflicts over any kind of resource. There's limits. There's limits. There's limits. These guys don't understand.

Asher Miller (16:01):
Nobody seems to understand the limits part of it. But first of all, a little side note. I was just thinking, as you were talking, about our friend Nate Hagens, who has talked a lot about the superorganism. Here's a hypothesis for you: maybe we're already living in some kind of collective AGI, in the sense that the superorganism is making these decisions. All of these collective decisions that we're all making - some of them are conscious decisions, but they are micro-decisions in this larger picture. And nobody feels like they're in control. Everyone feels like they have to do this thing, right?

Jason Bradford (16:34):
Yes.

Asher Miller (16:36):
Where is there this collective sense of, no, we don't have to do this thing.

Jason Bradford (16:41):
It's, what's it called? The multipolar trap in game theory, or whatever. And no, you're right. It's almost like an individual like you and I, who are very limited. We have one little meat brain running at, I don't know, 25 watts or whatever.

Asher Miller (16:55):
Mine's 26 watts.

Jason Bradford (16:56):
Okay. Way less than these data centers. But we can see this. We can see through this and have the intelligence and wisdom to say, at least, time out. Maybe frickin' stop. But collectively, with all the power of all these people and their mass intelligence, and the millions and maybe billions of human consciousnesses, and this machine intelligence, none of them seem to be able to stop. Just bonkers.

Asher Miller (17:22):
Crazy.

Jason Bradford (17:23):
That's just bonkers.

Asher Miller (17:24):
When we pool our collective idiocy or whatever - or call it our collective intelligence - we can get more clever. But it seems like our wisdom goes down.

Jason Bradford (17:33):
Just plummets.

Asher Miller (17:34):
Our collective wisdom just goes to shit.

Jason Bradford (17:36):
And then part of it is a cultural problem. I mean, this gets back into the problem of modernity and the progress narrative. Because what you used to have was cultures - and these would be indigenous cultures, I would say even peasant cultures, okay? They're actually the same thing in many ways - that had taboos. They had taboos. They had ways of -

Asher Miller (18:01):
That's the first thing we got rid of when we got the internet.

Jason Bradford (18:03):
This is the irony: these taboos, we look at 'em and we go, how irrational, these people with these ridiculous beliefs. And I look at those and I go, those were guardrails.

Asher Miller (18:16):
Self-limiting factors.

Jason Bradford (18:19):
Those were very wise guardrails. We told the story of the tribe in Africa where the guy hunts porcupine and he's successful and everyone shits on him.

Asher Miller (18:30):
He was like, this is kind of gamey.

Jason Bradford (18:31):
Yeah, exactly. But these are the kinds of cultural things that keep people in check, that keep madness from happening. And what we decided was that we don't want any of these checks. We are just going to believe in progress continuing. And so, essentially, our culture has gotten hijacked and we no longer have these taboos. The idea, through the Enlightenment and liberalism, was that instead of having these taboos that were kind of weird and mystical, we would have an administrative state. We would have this bureaucracy, and we would have ways of creating laws. And the laws are in place of these cultural norms. Now, of course, you get co-option by power systems, and the laws start getting overridden or ignored. And then, ironically, you have to ask: okay, who's more mystically crazy?

Asher Miller (19:28):
Yeah, right now.

Jason Bradford (19:29):
Yeah. Okay. An Andean peasant who still believes in their ancestor worship and their animist kind of gods and talks to the trees, or fucking Elon Musk?

Asher Miller (19:42):
I know. I know.

Jason Bradford (19:43):
Okay? So did we actually get rid of our mysticism and our kind of illogic? No, it's still there. It just took the form of grandiose narcissism and techno-fetishism.

Asher Miller (19:55):
Yeah.

Jason Bradford (19:55):
The direction it went.

Asher Miller (19:57):
And we tied it into an economic system, and I would say a legal system, that puts the individual and property at the center. And then you have this situation where we're basically telling people to pursue your own happiness, your own wealth, and we allow the religious beliefs of these tech kooks, I would say, to plug directly into the capitalist system and supercharge what they're doing. Do you know what I mean? The Andean peasant - what's a downstream effect of whatever their belief system is? Let's say it was a toxic belief system. The scale at which that person can impact the world is nothing like an Elon Musk. And still - we were talking about the collective stupidity that happens, but there are still these individual reporters. Let's pick on that Daily episode, which we'll link to in the show notes, right? Great reporting. People are doing good work, highly educated and skilled at their jobs, and they're having a sincere conversation.

Jason Bradford (21:07):
Great pacing.

Asher Miller (21:08):
Talking about what is actually happening with AI and the build out. And they're talking about some of the financial risks around taking on debt and all of that.

Jason Bradford (21:16):
Yes, I appreciate it. I appreciate all of that.

Asher Miller (21:18):
And there's a very slight, almost peripheral, out-of-the-corner-of-their-eye glance at, oh, shit, maybe there's some jobs that will get destroyed out of this thing. But it should be one of those record-scratch moments, like, wait a second. The proposition here - again, I'm going to bring it back to this, because I would like someone to tell me how it's not one of these two options that's presented in front of us. Either they're successful, which leads directly to the loss of millions and millions of jobs - tens of millions, maybe more - and the risk to people's livelihoods. Or they're not successful, and we've just inflated this incredible bubble. Half of our GDP growth is going into this build-out right now. There's all this leverage happening within the stock market, and all these things and debts. And so if they're not successful, we're going to have a massive economic downturn, potentially on the order of what we saw with the Great Recession and the housing crisis.

Jason Bradford (22:31):
And maybe way more actually. The bubble looks bigger.

Asher Miller (22:32):
It could be more. So those are the only two options I'm seeing right now.

Jason Bradford (22:36):
And that's only in the next two to five years - one to five years. It could happen at any time.

Asher Miller (22:42):
How is that not the only conversation that's happening when it comes to AI right now? How is it not that every one of these people is being brought forward?

Jason Bradford (22:51):
Yes, a congressional sort of panel.

Asher Miller (22:53):
Do it in Congress. Do it on the street. Everyone should be like, wait a second.

Jason Bradford (22:56):
What are you doing? Why?

Asher Miller (22:58):
Why is this happening? You know what I mean? Explain to me why we're doing this.

Jason Bradford (23:02):
Because I think it's more like we're going to get a bubble than they're going to get AGI. That's my bias.

Asher Miller (23:07):
You think they will get AGI?

Jason Bradford (23:08):
No, no. It's more likely it'll be a bubble.

Asher Miller (23:10):
Oh, then we'll get AGI?

Jason Bradford (23:11):
Yes.

Asher Miller (23:12):
Okay.

Jason Bradford (23:12):
So because I believe that, I think they're setting themselves up to be in a too-big-to-fail situation.

Asher Miller (23:18):
Oh, right. Yeah.

Jason Bradford (23:19):
And of course, who did Trump have sitting next to him?

Asher Miller (23:23):
Right. Yeah.

Jason Bradford (23:23):
Yeah. His buddies are now in the tech world. Okay. So they're just waiting: okay, if this is a bubble, I've got this crazy guy in the Oval Office.

Asher Miller (23:34):
Yeah, who will just -

Jason Bradford (23:35):
Find a way to just keep it reloaded or just pay off, or - I don't know.

Asher Miller (23:39):
I think you're totally right. And I would even venture to say - if I'm going to characterize these guys as somewhere on the spectrum between evil and religious nutjobs, they're both of those things - that they maybe are looking at a playbook more like the dot-com bubble. So what happened in the dot-com bubble was an asset class being totally inflated, all these tech stocks, and then the bubble burst. And what ended up happening was a lot of destruction of companies, and consolidation. So there were a few winners that came out of it.

Jason Bradford (24:16):
But how can you consolidate any more than you are?

Asher Miller (24:20):
No, they can, because right now, what, there's five AI companies? Let's get it down to one or two. Honestly, we were talking about FOMO. I think for them, they might know that there's a risk here that this isn't going to work, or that they don't even have a long enough runway to be able to get there. Let's say they actually do believe that they can get to AGI or whatever.

Jason Bradford (24:42):
Don't worry. I'm a year behind so I'm just riding this wave with infrastructure, talent, and I'll get bought out and I'll be fine.

Asher Miller (24:50):
I'll get bought out, or I'll be able to buy out others.

Jason Bradford (24:53):
Yeah, depending on which side you're on.

Asher Miller (24:54):
And we'll win that way, and it's okay. This is creative destruction. This is the Silicon Valley ethos. This is just all part of the creative process. And we will win out at the end, either because we'll get bailed out because we're too big to fail, or this is just a natural process of these companies succeeding and losing, and we'll have a few winners. Meanwhile -

Jason Bradford (25:17):
The government seems to be okay owning parts of companies now too. So that's the other thing. It's not just a bailout. It's quasi-state-owned entities now. And then of course, you've got the national security side of that, right?

Asher Miller (25:28):
Right.

Jason Bradford (25:29):
Essentially, the NSA, the Department of Defense, are all now like, oh, I own incredible AI infrastructure.

Asher Miller (25:36):
Incredible AI infrastructure, ha. Meanwhile, there was all this conversation about how these chips become obsolete -

Jason Bradford (25:44):
Every couple of years.

Asher Miller (25:45):
In a couple years. We did an episode about the Museum of the Future and what would be in there. And you look at all these post-apocalyptic films, where there are people walking through the remains of New York City or something else. And you see all these skyscrapers or cars that are left behind. And you try to imagine, for those people, if it's many, many generations in the future, what were these things? Like we would imagine coming across the remains of a Roman city, or a Persian or Assyrian one, whatever it is. We're just going to have fucking data centers everywhere. These big ugly blocks. You know what I mean? It's so weird. It's also so ugly. Jesus, if we're going to collapse and leave behind weird artifacts, why do they have to be this ugly, like strip malls and data centers?

Jason Bradford (26:36):
Well, I think maybe we could at some point have the AI start designing the data centers.

Asher Miller (26:39):
Oh sure.

Jason Bradford (26:42):
Now you made a point of are they evil or are they just sort of -

Asher Miller (26:46):
Religious nut jobs.

Jason Bradford (26:47):
Religious nut jobs.

Asher Miller (26:48):
Yeah.

Jason Bradford (26:49):
I think there's a possibility that they're both.

Asher Miller (26:51):
Sure.

Jason Bradford (26:52):
So here's the thing. Okay, the Joel Osteen or the Jim and Tammy Faye Bakker people, right? The people that make these megachurches that end up looking like grift mills. Do they actually believe in Christ our Lord, who died for our sins and rose again, and that the way to get into heaven is to be born again? I mean, do they believe that, and just like grifting along the way?

Asher Miller (27:17):
I think that most people in those situations convince themselves that they're right, and they justify it somehow. And we've talked about this before with Elon and a lot of these other tech guys. The fate of humans today, or the non-human world, is inconsequential when you take a longtermist view. And so, yeah, you break a few eggs. Maybe you'll lose a few hundred million jobs. People are starving. I mean, that kind of sucks. But this will allow us to get off this rock and conquer the stars, and in the long run, it'll be better. And I think they actually kind of believe that shit. For the rest of us, can we just normalize saying "what the fuck" and laughing out loud? Can we just normalize, when you're at some fucking TED Talk or whatever, or wherever you are -

Jason Bradford (28:13):
Just start cackling.

Asher Miller (28:15):
Somebody says something ridiculous like Elon Musk.

Jason Bradford (28:18):
Practice your cackle.

Asher Miller (28:19):
Yeah.

Jason Bradford (28:20):
Practice your cackle. They're going to have an evil cackle. You practice the absurd cackle.

Asher Miller (28:25):
Right. It's the only thing left to do.

Jason Bradford (28:28):
Now, I wonder if you're a young person, let's say you're in college -

Asher Miller (28:32):
It's kind of hard to laugh.

Jason Bradford (28:33):
Well, this thing is, how do you take the future seriously? This is the question.

Asher Miller (28:38):
It's that same FOMO thing. It's like, this is happening to me, I don't have agency, and I guess I got to get on the train. I don't know what they think. Or they're led by people like us to realize that this is the greatest Ponzi scheme of all time, or the game of musical chairs, or whatever metaphor you want to use. And that the music's going to stop really soon, and the best thing to do is get on land somewhere or whatever.

Jason Bradford (29:03):
Or live in the moment.

Asher Miller (29:04):
It's a small subset of people, but it's kind of hard to look at all this stuff. I mean, I've had conversations with Avi. I was actually sharing with him and my wife Kirsten at dinner recently the reaction I was having to listening to that episode, that Search Engine episode. Because I was very much in a crazy town moment. I was like, this is insane. Am I the only person that thinks this is insane? So I was talking to them about that. And it's interesting. I think there are people who do see this as insane, and their way of coping and dealing, like Kirsten, is they put their head down, they don't listen to this shit, they just try to do good things in the world and care for people. But then Avi, he's 19 years old, he's a pre-med in college right now. He wonders sometimes if what he's doing makes sense.

Jason Bradford (29:58):
Well, we'll have robo-doctors in five years.

Asher Miller (30:00):
And I said to him, people will always need healthcare. And I think people are actually going to turn towards human care more.

Jason Bradford (30:10):
Yep.

Asher Miller (30:11):
So either this shit fails and we don't get the application of AI in tech doctors, or we don't get those med beds that Donald Trump promised us. Or we have all that, and it just becomes even more impersonal than the healthcare system already is. And people will want human connection with a doctor who actually thinks holistically about their body. So I was like, I actually think that what you're doing kind of makes sense, if you can afford to do it. He's fortunate enough that he has a great deal to go to college right now. But for a lot of other people, it's a damn good question. I mean, obviously we have our bias about what we would say for people to do. Become a peasant, right?

Jason Bradford (30:49):
Yes. Yeah. Make some new taboos.

Asher Miller (30:51):
Yeah, make some new taboos. But for a lot of people, especially young people, yeah. And they should be mad.

Jason Bradford (30:58):
Yeah, they should be mad.

Asher Miller (30:59):
They should be really fucking mad right now. Okay. Well, thanks for -

Jason Bradford (31:04):
Good thing Rob wasn't here.

Asher Miller (31:05):
Yeah, I know.

Jason Bradford (31:06):
He would've hated this. Because he just wants to put his head down and do good things, and he doesn't want to listen to our ranting about this stuff.

Asher Miller (31:13):
Well, listener, please join us in normalizing laughing at or ridiculing the absurdity of what we're hearing from everyone. Or just ask questions. I only see two possibilities here with this AI rush. Is there a different one that you see? Ask people that and see what they say. Maybe that makes 'em think. Because I can't see one. Or if you are aware of one, let us know, and don't fucking send me an email about thorium reactors. Okay?

Jason Bradford (31:43):
This is a scary time to be an investor. The intersection of international political instability, environmental disasters, white-collar job loss from AI adoption, and the impending pop of the AI bubble makes decision making nearly impossible for regular people just wanting to grow and protect their wealth. Introducing Agentic AI Assets Management, where top hedge fund talent is brought to the consumer level. We are Wall Street quants who developed an AI quant running on a quantum computer. Our proprietary Agentic AI will be given control of all your portfolio allocations and, by sensing the global financial system in real time, will be responsive at quantum light speed to adjust your financial assets faster than any human agent possibly could. To give us room to maneuver your assets and avoid negative outcomes for you, Agentic AI Assets Management will limit its portfolio to $100 billion, which compares favorably with the $3 trillion being invested in AI infrastructure using $1 trillion in debt. Agentic AI Assets Management: using Agentic AI to protect you from the AI debacle.

Melody Travers (32:59):
That's our show. Thanks for listening. If you like what you heard and you want others to consider these issues, then please share Crazy Town with your friends. Hit that share button in your podcast app. Or just tell them face to face. Maybe you can start some much needed conversations and do some things together to get us out of Crazy Town. Thanks again for listening and sharing.

Asher Miller

Asher became the Executive Director of Post Carbon Institute in October 2008, after having served as the manager of our former Relocalization Network program. He’s worked in the nonprofit sector since 1996 in various capacities. Prior to joining Post Carbon Institute, Asher founded Climate Changers, an organization that inspires people to reduce their impact on the climate by focusing on simple and achievable actions anyone can take.