
#2174 - Annie Jacobsen

2024-07-10 02:50:42

The official podcast of comedian Joe Rogan.

2
Speaker 2
[00:00:01.74 - 00:00:09.54]

Joe Rogan podcast, check it out! The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day!

1
Speaker 1
[00:00:13.80 - 00:00:15.04]

Good to see you again.

2
Speaker 2
[00:00:15.54 - 00:00:16.32]

Hi Joe, thanks for having me.

1
Speaker 1
[00:00:16.32 - 00:00:16.84]

What's happening?

2
Speaker 2
[00:00:17.48 - 00:00:19.36]

Thanks for having me back, a lot's happening.

1
Speaker 1
[00:00:19.58 - 00:00:25.28]

My pleasure. I've heard a lot about your book. I haven't read it, but I've heard a lot about your book from a lot of people that you freaked out.

2
Speaker 2
[00:00:25.98 - 00:00:33.20]

Okay, well, hopefully you can read it and then you can decide if you're of the freaked out crowd or of the really freaked out crowd.

1
Speaker 1
[00:00:33.58 - 00:00:34.76]

Oh, there's only two options?

2
Speaker 2
[00:00:35.76 - 00:00:39.26]

I think so. It doesn't end well. That's the spoiler alert.

1
Speaker 1
[00:00:39.62 - 00:00:45.26]

Okay, hold the book up. It's called Nuclear War. Yeah. What motivated you?

2
Speaker 2
[00:00:45.28 - 00:00:53.06]

Nuclear War: A Scenario. And a very plausible scenario, from what I understand from all the defense officials I interviewed.

1
Speaker 1
[00:00:54.06 - 00:00:56.44]

What motivated you to write this?

2
Speaker 2
[00:00:57.88 - 00:01:03.52]

Well, six previous books on war, weapons, U.S.

[00:01:03.52 - 00:01:25.72]

national security secrets. Imagine how many people told me they dedicated their lives to preventing nuclear World War III. And so, during the previous administration's fire and fury rhetoric, I began to think: what happens if deterrence fails? That idea of prevention, what happens?

[00:01:25.90 - 00:01:45.84]

And I took that question to the people who advise the president, who work at STRATCOM, who, you know, command the nuclear sub forces. And learned that it doesn't end well. Not only does it not end well, five billion people are dead at the end of 72 minutes.

1
Speaker 1
[00:01:47.26 - 00:01:47.90]

Jesus.

2
Speaker 2
[00:01:49.82 - 00:01:56.04]

You begin to realize when you, well, you quickly realize, as you read the book, that, you know, there are –

1
Speaker 1
[00:01:56.04 - 00:01:59.64]

Jamie, your mic's on or something? Something just made a weird noise.

[00:02:01.96 - 00:02:05.86]

Oh, yeah. It's Carl. Oh, it's Carl. I was like, what's going on?

2
Speaker 2
[00:02:06.04 - 00:02:31.66]

It's the animal humor in a difficult subject. You know, there's, you know, literally hundreds of thousands of people in nuclear command and control who practice, 24/7, 365, what would happen if deterrence failed and we had a nuclear war. They are practicing this, Joe. And it's like, talk about being behind the veil.

[00:02:32.50 - 00:02:52.76]

No one knows. It's why I think the response to this book, it's been out for three months, has been so extraordinary. And from both sides of the aisle. Because people now are beginning to realize if nuclear war begins, it doesn't end until there is a nuclear holocaust. And it happens so fast.

[00:02:52.94 - 00:02:57.08]

There is no quickly going to your secret bunker you have.

1
Speaker 1
[00:02:58.80 - 00:03:07.24]

Yeah, all that's nonsense. People think like Zuckerberg building a bunker in Hawaii is going to survive. He's building a hurricane shelter. Yeah.

2
Speaker 2
[00:03:07.32 - 00:03:12.60]

And also, unless he happened to be there in the exact moment when all of this went down.

1
Speaker 1
[00:03:12.80 - 00:03:37.28]

Yeah, you'd have to know in advance that we're about to launch. The whole thing is terrifying. It's also – I don't see a way that it... Like, if you think about the timeline between 1945 and today, it's kind of extraordinary that no one has launched a nuke. But it almost seems like a matter of time.

[00:03:37.72 - 00:03:49.08]

Like, it'll be a blip in history. Like, we look at it now. Oh, because of mutually assured self-destruction or mutually assured destruction. That's what's prevented people from using nuclear weapons. But when?

[00:03:49.34 - 00:03:57.20]

Until now? Like, what is 80 years in terms of human history? It's nothing. It's a tiny little blink. It's a little nothing.

[00:03:57.20 - 00:04:02.42]

And it could go sideways at any moment, and then we're back to the cave era.

2
Speaker 2
[00:04:03.02 - 00:04:27.40]

We are hunter-gatherers again. In the words of Carl Sagan, who was the author of nuclear winter. But I think what's also crazy to your point about 1945 is, like, when this was all set up, when the nuclear arsenals were beginning. And I take the reader through it really quickly because I want them to just get to what happens at nuclear launch. I mean, the book is really about nuclear launch, to nuclear winter.

[00:04:27.86 - 00:04:54.02]

But the buildup is fascinating because, first of all, it happened so incredibly fast. And it happened under incredibly secret, classified terms, so there was no, like, outside opinion. And originally, nuclear war was set up to be fought and won, which itself is absurd. And we know that now. So the rules of the game have fundamentally changed.

[00:04:54.54 - 00:05:07.46]

And yet the systems are all the same. That's what's, I think, the most dangerous component of today versus, let's say, 1952, when the thermonuclear weapons began.

1
Speaker 1
[00:05:08.02 - 00:05:10.30]

And when you say the systems are the same, what do you mean exactly?

2
Speaker 2
[00:05:10.92 - 00:05:22.90]

So the system of nuclear war. This idea that – well, OK, so let's start with some basic facts. There's a nuclear triad. You know what that is?

[00:05:22.90 - 00:05:38.70]

– OK, so triad, really simple, three. So we have missile silos with ICBMs inside of them, 400 of them. Then we have nuclear-powered nuclear submarines that launch nuclear missiles.

[00:05:39.88 - 00:05:59.32]

There are 14 of them. Then we have the bomber force, 66 nuclear-capable bombers. Triad. The president chooses what elements of that triad he's going to use when he launches a counterattack, if we ever are attacked. That system essentially exists.

[00:05:59.48 - 00:06:22.54]

That was what was being developed in the 50s. The only difference was, in the old days it was, we're going to actually use these and fight and win a nuclear war. And now it's, we're going to have these all sitting around ready to launch. We have 1,770 nuclear weapons on ready-for-launch status, Joe.

[00:06:22.72 - 00:06:25.00]

They can be launched, some of them in 60 seconds.

1
Speaker 1
[00:06:27.48 - 00:06:28.22]

Jesus.

[00:06:30.18 - 00:06:40.10]

And they're all pointed? Are they all – do they need a coordinate? Or are they all aimed at a specific area already?

2
Speaker 2
[00:06:40.58 - 00:06:52.42]

Important question. They're technically targeted to sea, out at sea. So they don't have a specific target. But when the command goes –

1
Speaker 1
[00:06:52.42 - 00:07:04.22]

So is that if they accidentally go off? They would – go to the sea? Jesus Christ. Imagine you're out there on a sailboat, just enjoying your time. What a beautiful place to be in the middle of the ocean.

[00:07:04.22 - 00:07:05.36]

And you see –.

[00:07:15.02 - 00:07:28.34]

What do you think about the stories of UFOs hovering over nuclear bases and shutting down their weapons? I know you've done a lot of research on this stuff. How much of that is bullshit?

2
Speaker 2
[00:07:29.24 - 00:07:59.22]

You know, I actually haven't done a lot of research on that specific narrative. I know of it. I know of it for sure. And it's – I think – I mean I always approach the UFO phenomena with – or I try to at least with like the eye of – or the point of view of Carl Jung. This idea that it's – that what leads here is our perception of things and our sort of deep shadow self of fear.

[00:08:00.18 - 00:08:33.76]

And once the nuclear weapon was invented, man – I mean our grandparents had to confront this new, fundamental, new reality that just simply didn't exist before. And then it was – that's with the atomic weapons. And then, in the 50s, once thermonuclear weapons were invented – and the thermonuclear weapon is essentially an atomic – a thermonuclear weapon is so powerful, it uses an atomic bomb, like from Hiroshima, as its triggering mechanism.

1
Speaker 1
[00:08:34.40 - 00:08:35.12]

Jesus.

2
Speaker 2
[00:08:35.12 - 00:09:21.46]

And so the order of magnitude of destruction of – in an instant. According to Carl Jung, who looked at the UFO phenomena and the nuclear weapons phenomena hand in hand – I'd encourage anyone to read his stuff about it, because he has a much sort of, you know, bird's-eye view of it all, about why that's so terrifying to people. So the narratives – to my eye, the narratives of nuclear – of alien ships hovering over nuclear bases, I don't – I have never spoken to a firsthand witness who experienced that. But I would see that in terms of the narrative of Carl Jung.

1
Speaker 1
[00:09:22.98 - 00:09:37.70]

So Carl Jung's perception was that he believed that it was essentially people perceiving these things or hallucinating these things. And it was a – like, almost like a mass hallucination, part of the –.

2
Speaker 2
[00:09:38.96 - 00:10:28.66]

I don't know if he went that far. I think he left a lot more open to interpretation. I think his – my read of his analogy was more like the way that hundreds of years ago, or thousands of years ago, when Christianity was first being developed, people saw existential threats as part of the narrative of God. So his – my read of Carl Jung is that he's saying now, in the mechanized modern world, the existential threats, the sort of damnation is tied to machines which is tied to – is easily tied to little machines from – or big machines from outer space. That was his take on it, which I think is interesting.

1
Speaker 1
[00:10:28.66 - 00:10:31.42]

It's interesting, but also Jung wrote this when?

[00:10:33.28 - 00:10:50.66]

60s. Yeah. Yeah. So I think we know a lot more now than we knew then. I mean this – like the way – the reason why I'm bringing this up is like the people that have hope, one of the hopes is that aliens are observing us and they're going to wait until we are about to do something really stupid.

[00:10:50.98 - 00:10:52.98]

And then they're going to come down and shut everything down.

2
Speaker 2
[00:10:54.14 - 00:11:14.66]

That is an interesting narrative too, and that sort of – but again, that's a bit to my eye, like the deus ex machina idea that God would intervene and save the faithful and – or rather the – in this situation, it might be that he's going to save those people that are paying attention.

1
Speaker 1
[00:11:16.48 - 00:11:25.06]

Well, just save the human race from its folly. What is Carl doing, bro? Get away from me. You can't let that little –.

2
Speaker 2
[00:11:25.06 - 00:11:26.44]

I think Carl likes my shoes.

1
Speaker 1
[00:11:26.62 - 00:11:32.90]

Oh, yeah. He likes to play. Carl didn't get enough exercise this morning. Marshall wasn't here. He's not worn out yet.

[00:11:35.70 - 00:12:18.22]

So the Jung thing is interesting, but like we know more now. We know more now about possible other dimensions that we can access. We know more now about planets in the Goldilocks zone. We know more now about all these whistleblowers that have come out and talked about crash retrieval programs, where they're back-engineering these things and trying to understand what these things are. Diana Pasulka's work, where she's talking about how they're essentially donations, that these crafts are donations, that people are being given these things so that they could see this extraordinary technology and try to figure out how to make it.

[00:12:20.60 - 00:12:55.04]

But that's one of the only ways that I – like if we did get to a point that we launch nuclear weapons at each other, everything is over so fast. If I was an alien species, an intelligent species from somewhere else, and I recognized that this is a real possibility and that the earth has all these different forms of life other than human beings that are going to get destroyed as well... It's going to wipe out who knows how many different species. I mean, it's going to kill everything.

[00:12:55.68 - 00:13:06.34]

And even the people that are left over, what are you left with? Whatever, two billion people that still survive – where and what? What do you have left? What is the environment like?

[00:13:06.62 - 00:13:11.92]

How polluted is everything? What kind of mutations are going to come from their offspring?

2
Speaker 2
[00:13:11.92 - 00:13:40.48]

I get into that in the end of the book. So I write the book in essentially three acts, like the first 24 minutes, the next 24 minutes, the last 24 minutes, and then nuclear winter. So nuclear winter is very well described by a fellow called Professor Brian Toon, who I interview in the book. He was one of the original five authors of – do you remember the nuclear winter theory of our sort of high school years? Yes.

[00:13:40.48 - 00:13:54.06]

Right? So that was – Carl Sagan was the lead author on the paper. Toon was the young student. And he's dedicated decades to looking at nuclear winter. Now, originally it was very pooh-poohed by the Defense Department.

[00:13:54.20 - 00:14:09.18]

It was said, this is Soviet propaganda. This is never going to happen. And the computer systems and climate modeling have changed to the degree where we can see not only is nuclear winter what was thought in the 80s, it's actually much worse.

[00:14:09.18 - 00:14:36.22]

So whereby originally they thought there would be a year of ice sheets across large bodies of water from Iowa to Ukraine, across the mid-latitudes of the globe, now that could be up to seven or ten years. So think about that much frozen land for that long. You have the death of agriculture. You have a – like you were talking about.

[00:14:36.32 - 00:14:47.28]

You have the complete disruption of the ability for people to grow food and eat food. And so man reverts to his hunter-gatherer state. And this is all –.

1
Speaker 1
[00:14:47.28 - 00:14:48.84]

But hunting and gathering, what?

2
Speaker 2
[00:14:49.16 - 00:15:05.26]

That's the real problem. Well, exactly. And also you have – man has to go underground. I take you through what this is like in detail, because it's so bizarre to think about that: you have a man-made problem, nuclear weapons.

[00:15:06.18 - 00:15:35.66]

And yet this is – people are not paying attention to the fact that whatever is created by man essentially would theoretically have a man-made solution. Like there is a solution to the nuclear weapons threat. It's not – although the results of a nuclear war would be very much like an asteroid striking the United States or the world anywhere. There's a solution to nuclear war. There's a solution called disarmament.

[00:15:36.28 - 00:15:41.72]

You could – we have a total of 12,200 and some odd nuclear weapons.

1
Speaker 1
[00:15:42.34 - 00:15:47.88]

And is the concern that if we did that, other countries would not do that? We would be defenseless?

2
Speaker 2
[00:15:48.46 - 00:16:00.26]

Absolutely. But things happen in sort of inches and feet. They don't have to happen overnight. Once upon a time – there were 70,000 nuclear warheads in 1986.

1
Speaker 1
[00:16:01.20 - 00:16:01.90]

Oh, so we're making progress.

2
Speaker 2
[00:16:02.08 - 00:16:14.34]

We are making – well, we were making progress until this past year. And these treaties are all at risk and people are just very busy seeing everybody else as the enemy.

1
Speaker 1
[00:16:14.74 - 00:16:16.82]

This past year specifically because of what?

2
Speaker 2
[00:16:16.82 - 00:16:30.76]

Well, you know, Putin's saying he's not going to be involved in the treaty anymore. Donald Trump said he was – pulled us out of the treaty. So there's, like, leaders are threatening. This is all on the table right now. It could and should be looked at.

[00:16:31.04 - 00:16:39.46]

But when – do you remember back when we were in high school, when Reagan and Gorbachev got together for the Reykjavik summit?

1
Speaker 1
[00:16:39.80 - 00:16:40.08]

Yes.

2
Speaker 2
[00:16:40.26 - 00:17:01.52]

Remember that? Yes. That was the beginning of a movement toward disarmament. That was the beginning of this idea of, wait a minute, 70,000 nuclear weapons is just an accident waiting to happen. And so we are at 12,500 today because of that.

1
Speaker 1
[00:17:01.72 - 00:17:05.06]

And we should also say that there have been some very close calls.

2
Speaker 2
[00:17:06.72 - 00:17:17.94]

The UN secretary general said, last year or something, we are one misunderstanding, one miscommunication away from nuclear annihilation. He's not kidding.

1
Speaker 1
[00:17:18.26 - 00:17:35.88]

You know, there was this one incident where the Soviet Union thought that we were attacking them. And there was one guy who resisted launching a counterstrike. That one guy prevented nuclear annihilation. One guy said, I think this is an error. This doesn't make sense.

[00:17:37.16 - 00:17:47.14]

I'm not doing this. And that one guy saved us. I don't remember what the incident was. Do you remember? What was the exact issue?

2
Speaker 2
[00:17:47.40 - 00:18:15.38]

So his name was Petrov. And he was – and it was in 1983. So what's even more remarkable about him is this was at a time of, you know, absolute animosity, enmity between the two nations. It was a really precarious time in the world. And Petrov was in an early warning radar system, like, outside of Moscow, that reads data of possible incoming nuclear missiles.

[00:18:15.38 - 00:18:49.68]

And he saw what he – what the radar screen, the radar scope, was reading as five ICBMs coming from Wyoming, five. He knew that we would send 1,000 missiles if we were going to launch. And so he questioned the data, which is just so remarkable in its own conception when you think about that. He questioned it, and he didn't send it up the chain of command as a missile attack.

1
Speaker 1
[00:18:49.92 - 00:18:50.58]

So what was it?

2
Speaker 2
[00:18:51.36 - 00:19:24.84]

Well, I get into this in the book, which is terrifying. So let me back up for a second of how good our technology is. So we have a system in space, a satellite system called SBIRS, the Space-Based Infrared System. It's like the Paul Revere of the 21st century. It is parked over our enemies that have nuclear weapons, and it can see and detect a nuclear launch of an ICBM in a fraction of a second.

[00:19:25.72 - 00:19:44.88]

Confirmed fact. That's why nuclear war begins and ends in 72 minutes, because the SBIRS satellite system sees the launch, and then the U.S. nuclear command and control goes into – begins. And, by the way, an ICBM cannot be redirected, and it cannot be recalled.

1
Speaker 1
[00:19:45.76 - 00:19:56.18]

What about these hypersonic weapons that can adjust their trajectory? They can go – they can move to different places. Like, it looks like it's going to Arizona, and it goes to Chicago.

2
Speaker 2
[00:19:57.32 - 00:20:16.52]

So ballistic missiles are hypersonic. So a little bit of a misnomer there. And also a hypersonic missile – let's just say it went from Russia to the United States. It might take an hour. A ballistic missile launched from a launch pad outside Moscow takes 26 minutes and 40 seconds to get to Washington, D.C.

[00:20:16.58 - 00:20:23.52]

That number is not going to change. That's gravity. That's physics. That's what it was in 1958, 59, and that's what it is today.

1
Speaker 1
[00:20:23.74 - 00:20:26.34]

But isn't the new technology that it can alter its course?

2
Speaker 2
[00:20:28.02 - 00:20:35.48]

Yes, but our – okay, so if you go with that logic and you say, well, it can move around, so it would be harder to shoot down.

1
Speaker 1
[00:20:35.74 - 00:20:36.02]

Right.

2
Speaker 2
[00:20:36.02 - 00:20:49.24]

As I explain in the book and, again, as was relayed to me by defense officials, we can't shoot down ballistic missiles, long-range ballistic missiles, with any kind of certainty or accuracy.

1
Speaker 1
[00:20:49.56 - 00:20:51.56]

There's not like the Iron Dome or anything like that.

2
Speaker 2
[00:20:51.56 - 00:21:25.96]

That is – the Iron Dome is almost like terrible for nuclear war, you know, for people to understand how dangerous nuclear war is, because the Iron Dome can shoot down short-range missiles and mid-range missiles. So even the U.S. Aegis systems out on the sea, the Navy systems, shot down some of those Iranian drones, but they can't shoot down ballistic missiles. You want, like, the five-minute or the 30-second ballistic missile lesson? Because this is what I do.

[00:21:26.82 - 00:21:55.54]

I write for the layman. I think part of the reason why nuclear war is not spoken about in the general public is because it's set up to be intimidating. You know, you'll hear a lot of defense people and analysts using very esoteric language, and it kind of excludes the average Joe or Jane, Joe or Annie. So I ask really basic questions like, how does a ballistic missile work? And it's very simple.

[00:21:55.82 - 00:22:09.98]

That 26 minutes and 40 seconds I told you about – so there's three phases of a ballistic missile. It launches. It has boost phase first, five minutes. Imagine a rocket. You've seen launches, that fire coming out the bottom.

[00:22:10.42 - 00:22:39.42]

That boosts the rocket for five minutes. That's when it's detectable from space. Then it enters mid-course phase, which is going to be 20 minutes, arcing across the globe to its target. That is the only place where the interceptor missile can get it, if it can. And it's 500 miles up, and it's traveling at something like Mach 23, 14,000 miles an hour.

[00:22:40.28 - 00:22:55.36]

Okay, so that's 20 minutes. And then the last phase is called terminal phase, appropriately so, 100 seconds, when the warhead, the nuclear warhead, reenters the atmosphere. Boom. It explodes over its target.
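
The phase durations quoted here do sum to the 26 minutes and 40 seconds cited earlier for a Moscow-to-Washington launch. A minimal sketch of that arithmetic, using only the figures as spoken in the conversation (not independently verified):

```python
# Flight time of the ICBM scenario, summed from the phase durations
# quoted in the conversation.
boost_s = 5 * 60        # boost phase: ~5 minutes, detectable from space
midcourse_s = 20 * 60   # mid-course phase: ~20 minutes, arcing toward target
terminal_s = 100        # terminal phase: ~100 seconds, re-entry to detonation

total_s = boost_s + midcourse_s + terminal_s
print(f"{total_s // 60} min {total_s % 60} s")  # -> 26 min 40 s
```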

[00:22:56.24 - 00:23:20.92]

The interceptor system is designed to take out the missile in mid-course phase. So we have 44 interceptors. Remember, I told you we have 1,700, let's say, nuclear missiles on ready for launch status. Russia has about the same. We have 44 interceptor missiles.

[00:23:21.78 - 00:23:57.74]

How are 44 interceptor missiles going to go up against more than 1,000 Russian nuclear weapons coming at us? Never mind the fact that each interceptor has a 50% shoot-down rate, and that's by the Missile Defense Agency spokesperson. So there's this perception that we have a system like the Iron Dome that could take out these incoming missiles. And we simply don't. Which is why, when nuclear war begins, it only ends in nuclear Armageddon.
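
The mismatch described here can be put in rough numbers. A back-of-the-envelope sketch using only the figures quoted (44 interceptors, a 50% per-interceptor shoot-down rate, roughly 1,000 incoming weapons); the assumption that each interceptor engages a distinct warhead is a simplification introduced for illustration, not something from the conversation:

```python
# Rough expected-intercept count from the figures quoted in the conversation.
# Simplifying assumption (not from the source): one interceptor per warhead,
# independent attempts.
interceptors = 44
p_kill = 0.50        # per-interceptor shoot-down rate, per the MDA figure quoted
incoming = 1000      # approximate count of weapons on ready-for-launch status

expected_intercepts = interceptors * p_kill     # 22.0
leakers = incoming - expected_intercepts        # ~978
print(f"~{expected_intercepts:.0f} intercepted, ~{leakers:.0f} get through")
```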

[00:23:58.66 - 00:23:59.36]

Jesus.

1
Speaker 1
[00:24:03.42 - 00:24:13.50]

How disturbing was it for you to write this, to do all this research, and to come to these conclusions and realize that we're in a lot worse shape than anybody thinks we are?

2
Speaker 2
[00:24:15.12 - 00:24:46.38]

I mean, you know, when you're reporting or writing, you kind of take your hat off of the emotional or the sentimental part of things where you, you know, the mother in me, you know, you can't think like that. You just have to tell the story, I believe. I also believe that if I can be as factual and dramatic as possible, then I will have the most readers, which is the point. You know, I am actually not trying to save the world, as a journalist.

[00:24:46.88 - 00:25:23.76]

I'm trying to get you to read what I write, because I found it super interesting reporting it and learning about it. And also, the whole process for me that I think is the most interesting is going to some of these people who are truly some of the smartest scientists in the world, and getting them to explain it in the most basic... Like I say, you have to tell it to me like I'm a kid, because I don't have a science mind. And that part is... So that excitement part of it balances out with the terror of it, because I do also understand why most people don't want to know about this.

[00:25:23.84 - 00:25:45.22]

It's too dreadful. But they also don't want to know because they end up feeling sort of looked down upon, I think, if they ask basic questions like, wait a minute, how does a missile work? Or, like you said, can't the hypersonics... Shouldn't we invest in hypers... Well, who really wants to be lectured?

[00:25:45.76 - 00:26:15.54]

And so what I try to do is condense the lectures that I receive about how it all works into this dramatic form, which is how I landed on this format for this book, which I think is really effective, which is giving it to you like a scenario. Like this is what would happen. And then, going back to all my sources. I mean, like, okay, here's an example. We haven't even talked about submarines, but the submarines are completely...

[00:26:15.54 - 00:26:34.70]

You cannot find them in the sea. They are stealth, beyond stealth. I interviewed the former commander of the nuclear sub forces, a guy called Admiral Connor. He'd never given an interview like this before. And I said, like, how hard is it to find a nuclear-armed sub?

[00:26:35.34 - 00:26:45.70]

And he said, Annie, it's easier to find a grapefruit-sized object in space than a nuclear sub under the sea.

[00:26:47.30 - 00:27:15.34]

And these things are, by the way, I have a map in the back of the book that shows you how close our adversaries, enemies, call them what you will, China and Russia, how close they come to the east coast and the west coast of the United States regularly. Which means it reduces that launch time I told you about of 26 minutes, 40 seconds. That reduces it down to sort of 10 minutes or less.

1
Speaker 1
[00:27:20.50 - 00:27:24.64]

What did you think when you saw the Soviet subs that were outside of Cuba?

2
Speaker 2
[00:27:25.38 - 00:27:56.78]

I thought, wow. I mean, when I began reporting this book a couple years ago, never did I think that I would see that while I was talking about my book with people like you after publication. But in the same manner, I never thought I would hear the president of Russia threatening to use a nuclear weapon. I mean, he said he's not kidding that he might use WMD. That was his paraphrased quote.

1
Speaker 1
[00:27:57.72 - 00:28:00.04]

Did he specify like in what way?

2
Speaker 2
[00:28:00.34 - 00:28:01.56]

He just said WMD.

1
Speaker 1
[00:28:03.18 - 00:28:04.28]

Against America?

2
Speaker 2
[00:28:04.44 - 00:28:07.90]

Yeah. Well, you know, having to do with intervention in Ukraine.

1
Speaker 1
[00:28:08.16 - 00:28:08.40]

Yeah.

[00:28:10.32 - 00:28:22.30]

I'm sure you saw the drones that blew up on a beach. The bombs that were launched that killed civilians in Russia. You didn't see this?

2
Speaker 2
[00:28:22.30 - 00:28:24.26]

This is like recently, yesterday. No, I didn't.

1
Speaker 1
[00:28:24.48 - 00:28:26.26]

Yeah. Jamie, see if you can find that.

[00:28:27.92 - 00:28:42.48]

But Russian civilians, including one young girl they were showing in this article, were killed by these cluster bombs that were launched by drones that are ours.

?
Unknown Speaker
[00:28:42.76 - 00:28:43.14]

Right.

1
Speaker 1
[00:28:43.68 - 00:28:56.12]

You know, that Ukraine has. And now they've launched them on Russian civilians. It's like, here it is. Crimea video shows Russian tourists flee beach.

[00:28:57.02 - 00:29:03.46]

What is that word? A.T.A.C.M.S. Bomblets rain down. What does that mean? Do you know what that means?

[00:29:03.98 - 00:29:05.08]

A.T.A.C.M.S.

2
Speaker 2
[00:29:05.64 - 00:29:11.38]

Well, I'm guessing they're small cluster bombs that are in the nose cone of the warhead.

1
Speaker 1
[00:29:12.98 - 00:29:29.00]

Make that larger, so I can read the whole thing, Jamie. The video shows the beach in Sevastopol, Crimea, which was struck by a series of explosions on June 23rd. The footage captured by a security camera shows hundreds of people beginning to run away from the water before the impact of the cluster warhead starts.

2
Speaker 2
[00:29:31.42 - 00:29:36.78]

What's happening in Ukraine is so profoundly dangerous for everyone.

1
Speaker 1
[00:29:37.02 - 00:29:46.88]

So this is the scene right here. So these things just drop down on the water.

[00:29:49.26 - 00:29:50.62]

I mean, it's pure terrorism.

2
Speaker 2
[00:29:52.20 - 00:30:06.10]

Well, it's also remarkable that we have so much available footage and so much citizen journalism that people can see these events and discuss them.

1
Speaker 1
[00:30:06.10 - 00:30:37.04]

It says here, the event was caused by Russian air defenses shooting down a series of cluster warhead missiles, one of which altered course as a result. The Russian Ministry of Defense said that four of the five missiles launched were shot down, adding another missile, as a result of the impact of air defense systems, at the final stage, deviated from the flight path, with the warhead exploding in the air over the city. The detonation of the fragmentation warhead of the fifth American missile in the air led to numerous casualties among civilians in Sevastopol.

[00:30:44.26 - 00:30:49.08]

Do we know what this was about, where they were launching them towards?

2
Speaker 2
[00:30:50.36 - 00:31:11.80]

I don't know. I don't know. I'm not following the ground war in Ukraine right now, what with my focus on this. But what I do know is that the ratcheting up of the rhetoric and the use of third-party weapon systems is complicating an already incredibly volatile situation.

1
Speaker 1
[00:31:13.06 - 00:31:14.84]

This is: a spokesperson for the U.S.

[00:31:14.84 - 00:31:27.26]

State Department denied the accusation, saying that the claims were ridiculous and hyperbolic. The U.S. supplies weapons to Ukraine in the ongoing war with Russia and recognizes Crimea as a part of Ukraine, despite Russia's annexation.

[00:31:27.72 - 00:31:38.10]

Ukraine has previously outlined plans to use long-range weapons supplied by America in Crimea specifically to target infrastructure supporting the Russian invasion.

[00:31:43.38 - 00:31:55.54]

This is just terrifying stuff. It's terrifying because it can all be happening while you're just going about your business, walking your dog. You have no idea that the entire world is in grave danger.

2
Speaker 2
[00:31:56.90 - 00:31:59.78]

You mean when, if things suddenly go nuclear?

1
Speaker 1
[00:32:00.02 - 00:32:02.32]

Yeah. Well, even just this, just like these escalations.

2
Speaker 2
[00:32:02.32 - 00:33:00.22]

Well, I think the big picture that frightens me most is that when we see the president of Russia going to the president of North Korea, our two, air quotes, arch enemies right now having a new alliance. And then I consider that the current president of the United States hasn't spoken to the president of Russia in two years. And I think back to that time in history, what's known as the Reagan reversal, where Reagan went from this incredible hawk to learning about nuclear weapons in, of all things, an ABC television movie called The Day After. Having the crap scared out of him and then realizing – this is the president of the United States, realizing – we cannot continue on this path. It is too dangerous.

[00:33:00.58 - 00:33:35.68]

And that is why Reagan reached out to Gorbachev. And that's why we have the Reykjavik summit. It was called the Reagan reversal. So, in other words, my point is Reagan, who, you know – the evil empire speech, like this idea of seeing your enemy as the, as the arch evil villain – had to change for him when he understood nuclear war by seeing a film. And so when I look to today and I consider that the current president isn't speaking to the president of Russia, it doesn't make any sense to someone like me.

[00:33:35.78 - 00:33:53.92]

That's probably why I wrote this book. Like, please understand this. And one has to imagine that the current president, with all his decades in office, understands all of this. And so I don't fundamentally understand why there is no communication. It is way too dangerous.

[00:33:55.80 - 00:34:22.14]

Hence, what you just showed us. And, you know, the facts will come in of whose weapon systems those are. But either way, the perception, to your point – the fact that a perception, a misperception, could ignite nuclear war, could ignite that situation that is irreversible. That should be astonishing to all of us.

1
Speaker 1
[00:34:23.52 - 00:34:24.42]

It's terrifying.

2
Speaker 2
[00:34:25.28 - 00:34:35.90]

Well, it's terrifying. But the one hopeful part of it would be, again, going back to the Reagan – the Reagan reversal, by the way, is the only glimmer of hope I ever found in all of this.

1
Speaker 1
[00:34:36.76 - 00:34:49.58]

Don't you think, though, that politics in general, and certainly world leadership, especially United States leadership, is much more compromised today than it was then? And a guy like Reagan doesn't really exist today.

2
Speaker 2
[00:34:50.12 - 00:34:52.04]

Tell me what you mean when you say compromised.

1
Speaker 1
[00:34:52.04 - 00:35:39.64]

I mean, the military defense contractors are making so much money, and they want to continue making so much money. And they have great influence over the politicians and over policy and over what gets done. And this money that they don't want to stop making is completely dependent upon continuing to build, continuing to sell, continuing to have these weapons and future systems and more advanced systems and better systems. And there's so much money and momentum behind this that I don't know if there's a Reagan available now. I don't know if that's an option.

[00:35:39.76 - 00:36:07.16]

If there's a person that can have some sense, that can say that we are on a path to self-destruction and we need to stop and we need to reverse this path. You know, you're going to have people in the military, in the Defense Department, that are being influenced by these contractors. It's like, listen, there's plenty of places we can move things around and get things done. And don't you know about these guys? These guys are bad guys.

[00:36:07.22 - 00:36:24.22]

We need to get over there. We need to do something about this. We need to do something about that. And this escalation is motivated by the fact that they're making fucking ungodly amounts of money by making and selling these weapons. And this is a massive part of our economy.

[00:36:24.48 - 00:36:30.32]

It's a massive part of the structure that runs the government itself.

2
Speaker 2
[00:36:31.36 - 00:37:08.84]

Absolutely. So then you have to ask yourself, what is also going to happen now that these big contracting organizations, Boeing, Raytheon, Lockheed, are now being threatened by Silicon Valley, by the new defense contractors that are coming into the pipeline, that are threatening their contracts because they can do it faster and cheaper. And so I fear that you will see even more of that entrenchment that you're talking about, even more of the bureaucracy churning out more weapons under the guise of defense.

1
Speaker 1
[00:37:08.84 - 00:37:09.72]

Because they'll have to ramp it up.

2
Speaker 2
[00:37:09.72 - 00:37:56.02]

Yeah, because there's a competition, which is that double-edged sword, because competition is what makes America great. I believe in that. But I do also think what's interesting is someone I interviewed here in the book was Leon Panetta. So not only was he a former SecDef, but he was former CIA chief, and he was former White House chief of staff under Clinton. And in our interview, I learned a lot from him about those three kind of elements of the national security, advising the president, you know, being SecDef, being in charge of all of this, and being CIA chief from the intelligence point of view.

[00:37:56.02 - 00:38:47.50]

But what was even more interesting about interviewing Panetta was that he said to me at the end of our interview, it's good that you're doing this. The American people need to know. That's a direct quote from him. So here's a guy who has spent his entire life entrenched in that system that you're talking about, and then, outside of it, once he retires, puts on his, shall we say, grandfather's hat, the human hat, and is suddenly like, this is really going in the wrong direction. I would hope that that would lead to more people thinking wisely about what it is they're doing when they're in office, as far as nuclear war is concerned.

1
Speaker 1
[00:38:47.50 - 00:39:24.44]

Yeah, but the thing that concerns me is they're not good at anything. Why would they be good at this? Well, it brings me back to Eisenhower's speech when he left office, you know, the threat of the military-industrial complex, warning the United States that there is an entire system now in place that profits off of war and wants war and wants to keep creating these weapons and wants to keep escalating things, because that's their business. That's the business. We're not good at regulating any businesses.

[00:39:24.78 - 00:39:35.28]

We're not good at regulating anything that's detrimental. We're not good at regulating overfishing of the oceans. We're not good at environment. We're not good at energy. We're not good at manufacturing.

[00:39:35.80 - 00:39:53.82]

We're not good at regulating anything. Everything we've done has been a for-profit thing. When we allowed jobs to go overseas and decimated American manufacturing, there was no regulation of that. They didn't think that out. They didn't do a good job of managing that.

[00:39:54.30 - 00:40:09.18]

No, they completely devastated manufacturing here. And we saw during the COVID crisis, during the lockdowns, oh my God, we can't get anything because everything's overseas. Everything's made overseas. Medicine, all the computer chips, everything. They're not good at anything.

2
Speaker 2
[00:40:38.82 - 00:40:46.40]

You have to have national security, otherwise you get walked all over. I think people kind of agree with that. You can't really have a peace force.

1
Speaker 1
[00:40:46.78 - 00:40:49.66]

Right. Agreed, especially with the state of the world.

2
Speaker 2
[00:40:49.80 - 00:41:14.60]

Right? Right. But the second part of Eisenhower's speech is important to me, and it's why I get people to talk to me in my books, because he says there's an antidote to the military-industrial complex, and that is an alert and knowledgeable citizenry, which is, in essence, what we're doing now by talking about this. It's what you do on your podcast.

[00:41:16.16 - 00:41:31.72]

By having an alert and knowledgeable citizenry, and the word alert, I think, means engaged. You have to be able to talk to people in a way that they can, oh, wow, that's interesting. Well, I don't understand how that works. How does that work? And have these conversations.

[00:41:32.90 - 00:41:54.42]

So in that regard, I would say that's a positive sign in the right direction. I think people are much more open to having an opinion about all of this than they were in, say, the 1950s. But there's a balance, because now opinion somehow seems to be taking over the Department of Facts in many regards.

1
Speaker 1
[00:41:56.04 - 00:41:59.88]

The Department of Facts. That's an interesting way to look at it. Yeah.

[00:42:02.12 - 00:42:25.72]

It just doesn't seem like the general public is completely aware of how dangerous these threats are and how close we are inching towards it. Even the nuclear subs off the coast of Cuba were barely a blip in the news cycle. It was replaced by Taylor Swift and her boyfriend. It just goes in and out quickly.

2
Speaker 2
[00:42:26.42 - 00:42:57.66]

Well, it's also that, I mean, Joe, I do not write about politics. I just don't. I talk about POTUS, the president of the United States. And I talk about moves that certain presidents made. But I'm amazed at how much time is spent talking about these two individuals and their families and what they ate for breakfast. That I find, you know... at least the Taylor Swift of it is like slightly entertaining or uplifting.

[00:42:58.26 - 00:43:11.84]

But the way in which America seems to me to have almost become, like, you know, the way that the UK used to be obsessed with the royal family. To such a degree where...

1
Speaker 1
[00:43:12.82 - 00:43:30.70]

Yeah, that's our royals, those celebrities and nonsense. And whether it's the president, who's a celebrity, or congressperson, who's become a celebrity, you know, AOC, it's not about her policies, it's about her saying stupid shit. It's like that's all people care about. It's a reality show. And we're kind of conditioned by reality shows, right?

[00:43:30.78 - 00:43:40.60]

We have so many that we watch and so many things that we pay attention to that are nonsense, that distract us, that we like to sort of apply those same viewing habits to the whole world.

2
Speaker 2
[00:43:41.34 - 00:44:25.62]

But I'm amazed by the phenomena of podcasts, I must say, because I'm old enough to remember when they weren't around. And so, you know, I do have one foot in publishing, obviously, author of seven books, but I also write television. And so I exist in these two different worlds of media that are, you could say, traditional media forms. And when you consider how radically these different forms of communication are changing – I sell as many ebooks and audiobooks as I do hardcovers. And I have a feeling that if those markets didn't exist, I would sell half as many books, if that makes sense.

1
Speaker 1
[00:44:26.10 - 00:44:26.20]

Yes.

2
Speaker 2
[00:44:26.20 - 00:45:07.42]

And so then, when you throw the podcast into the mix, I cannot tell you how many people know about my work as a journalist, as a national security reporter, because of podcasts. That is remarkable to me. It makes things so much more accessible to so many more people. Everybody's listening to a podcast, driving around, listening, you know, when they're at the gym, when they're on a hike. And as someone who cares about an alert and knowledgeable citizenry as a fundamental – first of all, because I think people that are curious tend to be less furious.

[00:45:08.60 - 00:45:45.20]

You know, if you can get your curiosity satiated, you don't become so angry. And so I, and again, I have to be an eternal optimist, particularly writing the kind of books that I do, or it would just be, you know, my thinking would take a negative turn. And so I am an eternal optimist, and I do look to conversation and new media as a means to a better way, or a means to a way out of this kind of thing. I think, I believe, the tide will turn.

1
Speaker 1
[00:45:45.52 - 00:46:03.34]

Well, it's certainly one of the only uncompromised conversations that's available. And it happens to be the biggest. You know, that's the wildest thing – mainstream media is just falling apart. No one cares anymore. No one believes them. The faith in mainstream media and the trust in mainstream media is at an all-time low.

[00:46:05.00 - 00:46:13.76]

And podcasts are at an all-time high. How many of them are there, Jamie? Like 5 million? I think there's something like that, like monthly, you know?

[00:46:14.08 - 00:46:14.30]

Yeah.

2
Speaker 2
[00:46:14.42 - 00:46:30.38]

Yeah. But also what's remarkable is you hear people often say, like, people have lost their attention spans. They watch TikTok. Well, I mean, people listen to your podcast for three hours. That is a very long attention span.

[00:46:30.38 - 00:47:03.30]

And I find that kind of like brain conditioning really valuable, because I will listen to a podcast for three hours. And also it can continue, you know – it might be on an airplane and then, oh, my phone tells me I have this much time left on the podcast. So I continue listening to it on a hike. And I think that it is a very different kind of mental stimulus, curiosity in a new way forward than the old days of reading a newspaper. You know, it takes you this amount of time to read.

[00:47:03.38 - 00:47:05.58]

And then, I mean, newspapers barely exist anymore.

1
Speaker 1
[00:47:05.88 - 00:47:24.94]

Right. And then the other problem with television shows is that they're on at a specific time and people don't want to be locked into having to watch something at a very specific time. And now, because of things like, you know, YouTube and Spotify, you can just watch it anytime you want and stop it when you go to the bathroom. Stop it when someone calls you. Stop it when you have to go somewhere.

[00:47:25.30 - 00:47:26.66]

Restart it again when you're at the gym.

[00:47:28.82 - 00:47:40.82]

And it's just a different thing. It's a different way to communicate. This idea that people don't have attention spans anymore – how is that possible? They're just people. People didn't change.

[00:47:40.92 - 00:48:01.22]

That's so stupid. If people always had attention spans and all of a sudden they don't, maybe they're just getting distracted by things that are very easy to absorb and very addictive, like TikTok videos. It doesn't mean that the human mind is different, that it's been altered forever and that now people are no longer interested in long-term conversations.

[00:48:01.32 - 00:48:12.44]

That's just stupid. I've rejected that from the beginning. Like, when this podcast first started, even my good friends were telling me, like, you have to edit it. Like, you have to make it shorter. Like, why?

[00:48:12.56 - 00:48:17.66]

No one's going to listen to something for three hours. Then don't listen. That was my take. I was like, I don't care. I listen to things.

[00:48:18.44 - 00:48:25.44]

I've always listened to, like, lectures and old Alan Watts speeches. Like, I listen to things.

2
Speaker 2
[00:48:25.66 - 00:48:26.50]

Who's Alan Watts?

1
Speaker 1
[00:48:26.50 - 00:48:54.46]

Alan Watts is, I guess you'd call him a psychedelic philosopher. Very fascinating Englishman who said some very wise things. But just a brilliant person, very interested in Buddhism and just a very, very wise person who, still today, people send me clips of things that he said and quotes of things that he said. But I've always listened to fascinating people have conversations. Terrence McKenna.

[00:48:54.86 - 00:49:08.10]

I listened to a lot of his speeches and a lot of the different lectures that he gave. I don't think people changed. I think that's nonsense. I get hooked on YouTube reels or Instagram reels. I get hooked on them.

[00:49:08.42 - 00:49:12.98]

I'll be sitting there if I have nothing to do. What is that? Why is he doing that? What's that? Oh, look at that guy.

[00:49:13.98 - 00:49:21.56]

It's just a part of being a human being. We're easily distracted. It doesn't mean we don't have an attention span anymore. That's stupid. That's ridiculous.

[00:49:21.74 - 00:49:36.10]

There's still people that are graduating from universities with PhDs. There's still people right now that are in their residency in medical school. There's still people that are learning how to engineer fighter jets. Like, there's people that have attention spans. This is nonsense.

[00:49:36.70 - 00:49:41.36]

The idea that human beings have radically changed because of this one medium that's addictive is just so stupid.

2
Speaker 2
[00:49:41.36 - 00:50:01.48]

I also think there's something to be said, as an individual, when you start to be a little bit conscious of your own habits in viewing and thinking and reading and information. You get absorbed in the TikTok. Then you get to say to yourself, like, what am I doing? I want to actually change this habit.

[00:50:01.48 - 00:50:24.48]

We all benefit from seeing how easy it is to develop a habit and how hard it is to move yourself away from a habit as you become entrenched in it. I think there's complete value in that. People suddenly realize, I've got to stop watching TikTok videos and I've got to go to the gym, which is another…

1
Speaker 1
[00:50:24.98 - 00:50:42.52]

But that's a difficult sort of an adjustment and most people don't like difficult things. So if you get 100 people addicted to TikTok, what number out of those 100 people are going to go, you know what, I'm going to change my life? It's probably like three or four. And those people are extraordinary. And you hear about them and you get inspired by them.

[00:50:42.60 - 00:50:51.02]

Like, wow, you got a flip phone? That's crazy, Bob. Why'd you do that? You know what, I realized my mind was getting taken up by these things, and now I have my mind back.

[00:50:51.02 - 00:51:14.30]

I like it. People want to call me, they can call me, but I'm not watching things and reading things and absorbing things. But then there's the argument like, okay, but now you're out of the cultural conversation. I have friends that have flip phones and I'll try to ask them, did you see this new thing about the new quantum computer that's like 100 billion times better, or 100 million times better, than the last one they released in 2019?

[00:51:14.98 - 00:51:32.72]

And they're like, no, what? So they're missing some things too. So the key is mitigation. You have to figure out how much information makes you anxious, and how much information is where you just sit there and you scroll and you waste your time.

[00:51:32.86 - 00:51:39.00]

And then you're like, what did I do with my life? I'm wasting hours. And you look at your screen time at the end of the day, and it's six hours. Like, what?

[00:51:39.40 - 00:51:53.46]

Six hours of looking at my phone? Is that real? So you have to do that, but then also you don't want to miss out on things. So you do want to kind of be informed. And part of my job is to be informed.

[00:51:53.78 - 00:52:01.10]

I can't be the guy who people have to tell things about, because I don't know anything. Like what? What's going on? Crimea? Where's that?

[00:52:01.42 - 00:52:22.10]

I can't, like – so I have to have some, I have to have some involvement. I have to have some input where I'm getting input from social media and from all these different things. As a comedian, I have to know the temperature of the country. I have to know, like, what to make fun of, like what's ridiculous, what people are accepting that doesn't make any sense.

[00:52:22.66 - 00:52:39.66]

You just have to know, like when you're getting sucked into the point where it's becoming detrimental. And I think that's where people struggle. People really struggle with that. Like figuring out what's, how much, like you can eat a cookie. Nothing wrong with eating a cookie, but you shouldn't eat a whole bag of cookies.

[00:52:39.66 - 00:52:49.44]

You know, you shouldn't eat cookies every day. That's not good. But if you have dinner and you want to get dessert, yeah, I'll get a piece of tiramisu. Okay, you're going to be fine. You're going to be fine.

[00:52:49.62 - 00:52:59.68]

You eat tiramisu every day, you're going to die. You know, and that's what social media is. It's dessert. It's candy. It's things that are kind of fun.

[00:52:59.68 - 00:53:18.92]

Up to a point. But you just got to know what that point is and how to manage your own attention span. And just also have sovereignty over your mind. You have to control your mind. You have to be able, like, if your mind starts getting anxious, okay, I know we're getting weird.

[00:53:19.12 - 00:53:24.94]

It's time to work out. Okay, I know, maybe we should meditate. Maybe we should do this. Maybe we should do that. Don't just keep scrolling.

[00:53:25.60 - 00:53:45.12]

You got to know when. But most people don't have that kind of self-control and discipline. They don't have to. All most people have to do is, when the alarm goes off, get up, wash yourself, brush your teeth, eat something, go to work. Do whatever minimal amount you have to do to keep that job.

[00:53:45.80 - 00:54:17.32]

And then on the bathroom breaks, and whenever no one's looking, look through your phone, be distracted, come home, watch Netflix, go to sleep, repeat. That's most people. So they don't have to do anything because they haven't set up their life in a way that requires serious attention and an objective sense of your perspective and your interaction with humans and the way the world is working. They don't have the time. They don't.

[00:54:17.44 - 00:54:53.86]

They have family problems, job problems, their car's fucked, something's wrong with their house they got to fix, they have bills, everything's piling up, people are immensely distracted. So what social media does for them is it gives them a brief rest from their own problems. to just look at some fucking drag queen reading stories to kids and get outraged. Or some new thing that's going on with some girl that made some crazy video, and now everybody's talking about it. Like, it's just, most people don't have much discipline.

[00:54:54.34 - 00:55:22.40]

And they don't have to. And they've gotten through life being overweight and eating processed foods and drinking too much and smoking too much and taking all kinds of pills to mitigate all these problems that they have because they've not taken care of themselves. So they're on anti-anxiety medication and anti-depression medication and anti-this and that, and they're trying to lose weight, so they're on Ozempic, and that's most people. That's most people.

[00:55:22.50 - 00:55:26.64]

You know the number one drug in America is a peptide that helps you lose weight?

2
Speaker 2
[00:55:26.64 - 00:55:27.88]

What is that?

1
Speaker 1
[00:55:28.32 - 00:55:41.18]

Ozempic. It's the most profitable drug in the country. It's maybe the most profitable drug ever. They can't sell enough of it. They estimate that by, what is the number?

[00:55:41.28 - 00:55:47.08]

I think they're saying within five years, 30% of the population is going to be on Ozempic.

2
Speaker 2
[00:55:47.78 - 00:55:49.32]

They can't make it fast enough?

1
Speaker 1
[00:55:49.58 - 00:55:51.88]

They can't make it fast enough. It's flying off the shelves.

2
Speaker 2
[00:55:51.88 - 00:56:05.76]

Okay, here's a strange parallel thought for that, which is that Raytheon has gotten rid of its marketing department. It doesn't need it anymore. They can't make enough missiles fast enough.

1
Speaker 1
[00:56:06.28 - 00:56:09.02]

Imagine we have a marketing department for missiles.

2
Speaker 2
[00:56:09.66 - 00:56:24.84]

That is so crazy. What you just told me is like people's physical being and their existential defense threats are aligned. In terms of that, there's no need. There's too many orders to fill.

1
Speaker 1
[00:56:25.20 - 00:56:57.20]

If you can get people to believe bullshit and keep feeding them bullshit, you turn them into infants. If they just accept the fact that you're feeding them bullshit and they don't employ any critical thinking and they don't look at outside sources of information and really try to assess what's actually going on, because they generally don't have the time. Interesting. You create a nation of infants. There's a lot of us in this country that exist, almost like children that are hoping daddy's going to take care of everything.

2
Speaker 2
[00:56:57.62 - 00:57:11.62]

But I'm always interested in the people that are those 3% you talked about, that suddenly have that moment, the catalyst, where they realize, oh my goodness, I have to change. Things have to change. Maybe not everybody changes.

1
Speaker 1
[00:57:11.62 - 00:57:52.02]

There's more of us now, I think, than ever before. That change. That realize something's going on. And I think that's also because of social media, the good aspects of social media, real, honest discussions, revelations, things being released on Twitter, the Twitter files with Elon Musk where they found out the FBI was trying to suppress information, the Hunter Biden laptop story, and then going through the COVID disinformation, and now seeing the congressional hearings where Fauci's lying in front of Congress about gain-of-function research and whether or not they deleted emails, all that stuff. There's more people now going, what the fuck is actually going on, than ever before.

[00:57:52.58 - 00:58:01.88]

I think there was always people, like during the Vietnam War, there was always people that distrusted the government, but they didn't have the kind of access to information that we have today.

2
Speaker 2
[00:58:02.96 - 00:59:02.88]

I mean, I'm interested in the individual stories of people who change, always, because I think that it's too, I don't want to say depressing for me, but it's too... like, if I think of America as this big, giant situation with problems, it becomes overwhelming. I just try to focus on people's really interesting, cool stories. I'll tell you one that comes to mind. The guy, my Uber driver the other day, we're chatting away, and he tells me that he used to be, like, 350 pounds, and now he was this thin dude, and he was talking to me about driving an Uber at night so he could save money for one of his kids to go to college. And I was like, wow. And as he was driving, he told me that he was a security guard at a military base, and he said, I was giant, and the dudes in the military were totally fit, and one of them said to me one time, dude, you got to lose some weight. And I listened to him, and he let me go into the military gym.

1
Speaker 1
[00:59:03.42 - 00:59:04.92]

Wow, one guy.

2
Speaker 2
[00:59:05.94 - 00:59:06.82]

One guy.

1
Speaker 1
[00:59:06.92 - 00:59:08.08]

Sometimes you just need to hear it.

2
Speaker 2
[00:59:08.12 - 00:59:08.98]

Be that guy.

1
Speaker 1
[00:59:08.98 - 00:59:09.46]

Yeah.

2
Speaker 2
[00:59:09.70 - 00:59:23.04]

Be that guy. But that's to our point of, like, well, if you're not allowed to say, dude, you're overweight. Or essentially, by saying, dude, you should come to the gym, the subtext there is, dude, you're overweight.

[00:59:24.92 - 00:59:35.00]

You're unhealthily overweight. That fine line of stuff is what I find really interesting: talking about things that make people uncomfortable. I don't think that should be so taboo.

1
Speaker 1
[00:59:35.20 - 00:59:35.86]

No, it shouldn't be.

2
Speaker 2
[00:59:38.98 - 00:59:39.88]

It should be positive and direct.

1
Speaker 1
[00:59:40.18 - 00:59:56.70]

Especially if you can do it and be kind. Just telling someone they need to lose weight doesn't mean you're mean, and this whole body positivity thing is not good for anybody. Not good for anybody. There's no one who benefits from that. You benefit in the short term, where you don't feel as bad about the fact that you're obese.

[00:59:56.70 - 01:00:03.72]

but you know you're obese. Everyone knows you're obese. You're just not dealing with this very obvious problem. When someone says you need to lose weight...

[01:00:08.98 - 01:00:50.56]

There was this person who went to her doctor, and she has all these autoimmune problems, and she was morbidly obese. Like, giant. She said that the doctor started body shaming her, and she was so upset, she felt uncomfortable that the doctor was telling her that she needed to do something about her weight and recommended perhaps bariatric surgery or Ozempic or any of these things. And this person was talking about this and was like, what a betrayal, that their healthcare provider was calling them obese, and that they did not feel safe with them anymore. And the comments were interesting, because almost everyone in the comments was like, that fucking person is trying to save your life.

[01:00:51.20 - 01:00:59.98]

You have all these autoimmune issues. Well, guess why? Guess why? You're 500 pounds. You shouldn't be 500 pounds.

[01:01:00.16 - 01:01:21.82]

That's not a normal weight for a person, especially a woman who's like 5'8". This is crazy. Like, what you're doing is crazy, and for you to be upset that your doctor is telling you you're doing something crazy... It's like a person coming into the doctor's office getting screened for lung cancer with three cigarettes lit in their mouth. That's what it's like.

[01:01:22.28 - 01:01:33.02]

It's like, hey, what do you think is causing this lung cancer? Do you think maybe it's this fucking poison that you're sucking on every day? Like, yeah, that's probably it, right? Yeah, well, that's the same thing. You're just doing it.

[01:01:33.06 - 01:01:52.58]

It's a slower poison, but it's a very obvious poison. Like, you're consuming poison. You've gotten your body to the point where it's dying, and she's telling you, this doctor's telling you, hey, you've got to do something about this, and your response is to go on social media and talk about the horrors of being body shamed.

2
Speaker 2
[01:01:53.58 - 01:02:00.46]

There's a great saying that I love, that I try to live my life by, and I definitely write about it.

[01:02:02.28 - 01:02:04.26]

But you can't fix what you can't face.

1
Speaker 1
[01:02:05.06 - 01:02:08.50]

That's a very good quote. Yeah, that's true.

2
Speaker 2
[01:02:08.98 - 01:02:13.72]

Alternatively, you can fix what you're willing to face, and that's what's so much about trans...

1
Speaker 1
[01:02:13.72 - 01:02:14.28]

If it's fixable.

2
Speaker 2
[01:02:15.90 - 01:02:25.96]

I think you can fix anything, even if it's just a mental fix or a spiritual fix. But are you willing to? But on the technology front, do you ever read James Burke? No.

[01:02:33.28 - 01:02:57.78]

He's a historian of science and technology, and he goes, like, way back. You know, like the bubonic plague, he'll write about how that changed industry across Europe in these really general, easy-to-digest, fascinating ways. But in terms of your technology, TikTok, is the world going to hell in a handbasket? I think of him because he said that when the printing press came along, and this is a great analogy, I think, that I think of with my kids and social media,

[01:03:03.28 - 01:03:52.30]

absolutely all of society thought the world was, like, going to go to hell in a handbasket. Because before that, the only people who could really read were the priests, and so they kept all this information and they doled it out according to their line of thinking. And then the printing press came along, and that began really the birth of mass populations being able to read, which is where we are today. And sometimes I, like, think about James Burke, and I think about that as an analogy to where we are today, that what is going on is just an upheaval, like the printing press, in terms of making a lot more people more literate.

[01:03:53.14 - 01:03:58.56]

And so maybe it's not even... I mean, I used to think of literacy by its true definition, which is actually reading.

[01:04:02.30 - 01:04:24.06]

I read all my own audiobooks, and I originally thought that listening to me read my own book was somehow... you wouldn't have the same experience as reading it yourself. And then I realized I was putting my standard on it. I actually enjoy reading. I like to read, that's the way I'm wired. I can't do math, but I can read.

[01:04:25.32 - 01:04:26.60]

And now I realize...

[01:04:32.30 - 01:04:47.46]

that maybe listening is a new, 21st-century form of literacy, which really makes my head go in interesting places, because language is very different than reading. You know, communicating.

1
Speaker 1
[01:04:49.14 - 01:04:50.56]

But it's all information.

2
Speaker 2
[01:04:50.86 - 01:04:59.46]

Yes. And it's digesting the information. And so where I think the social media parts of it are dangerous is, it's like you said...

[01:05:02.82 - 01:05:17.58]

disparate. You go from one thing to the next thing, and that's the way it's all set up. Whereas with a longer-form podcast, you're asking people to stay with you, with your ideas. You might go off on a riff about obesity. I might go off on a riff about literacy.

[01:05:17.92 - 01:05:27.16]

But the brain is being stimulated. The brain is being curious. And then that carries over to your own life.

1
Speaker 1
[01:05:27.66 - 01:05:27.80]

Yes.

[01:05:32.30 - 01:06:00.50]

And what I've learned is that this entertainment form, whether it's podcasts or audiobooks, is something that's being consumed while people are doing other things, where they normally would not get this information: driving, going to the gym, working, doing menial labor, doing things where we can listen to a podcast. That is a new thing. It's a way to be entertained while you're doing other things. And that's a big part of this.

[01:06:00.60 - 01:06:13.14]

And that's a whole area that wasn't addressed before. It kind of was with talk radio. So people listened to talk radio in their cars. But nobody listened to talk radio at the gym. Nobody listened to talk radio on an airplane.

[01:06:14.00 - 01:06:26.50]

Now you can download things and consume them anytime you want. And most of the time, people are consuming these things while they're being forced to sit in the doctor's waiting room, while they're doing something where ordinarily they would just be bored.

2
Speaker 2
[01:06:27.52 - 01:06:52.36]

Right. And the other argument to that is your friend with the flip phone. I've heard this director, Christopher Nolan, who made the Oppenheimer movie, talk about this, where he says he believes that the experience of sitting in the waiting room is what he wants. Yeah. So I think there's a very few rarefied people that can...

[01:06:52.36 - 01:07:18.66]

actually, the way they're built, the way they can, the way they're engineered, the way they are, the way they've become, allows for them to sit in the waiting room and be super interested in observing. Maybe you have to be an elite director to do that. But most people are going to be, you know, restless, irritable, and discontent. And therefore the podcast, the audiobook, is an additive to your life.

1
Speaker 1
[01:07:19.02 - 01:07:35.52]

It is, but it's also a way to consume new information. It's a way to get educated. You can say what Christopher Nolan is doing is the right way, but it might be the right way for him. He's obviously a brilliant man, and I don't believe he even uses email. Is that the case?

[01:07:35.78 - 01:07:50.26]

Find out if Christopher Nolan uses email. I'm pretty sure he's one of those guys who's like no phone, no email, nothing completely disconnected. And I would imagine, if you want to be very creative, that's probably a very good strategy.

2
Speaker 2
[01:07:50.52 - 01:07:51.78]

You don't want the stimulus.

1
Speaker 1
[01:07:52.24 - 01:07:57.06]

Yeah. I don't have an email address. I've never used email, Nolan said. I don't have a smartphone.

[01:07:57.18 - 01:08:00.24]

I will carry a pay-as-you-go dumb phone thing. Yeah.

2
Speaker 2
[01:08:00.42 - 01:08:02.66]

Okay. And that's the quote from him. So that's...

1
Speaker 1
[01:08:02.66 - 01:08:08.72]

one of the reasons why he's probably so good. What does it say? His burner phone is inspired by what? The Wire? I'm going to say The Wire.

[01:08:09.16 - 01:08:10.20]

What does it say?

[01:08:12.88 - 01:08:15.02]

I got it. I nailed it.

[01:08:19.32 - 01:08:32.86]

I have a feeling that in the future there's going to be way less of those people. And you know, there were a lot of people, I remember, back in the day, that had no email, and they thought it was cool. I don't even have email, man. You can call me.

[01:08:33.32 - 01:08:40.92]

You know? I don't answer... Like, my friend Joey would not answer text messages. He'd get mad at you if you sent him a text message. Call me. But now he texts me.

[01:08:41.36 - 01:08:46.82]

Everybody texts. I think it's going to be harder and harder to be that guy. But kudos to him.

2
Speaker 2
[01:08:47.14 - 01:08:53.06]

Okay. So I remember when... my son Jet is now 19, and he was probably nine.

1
Speaker 1
[01:08:53.20 - 01:08:54.30]

What a cool name for a son.

2
Speaker 2
[01:08:54.50 - 01:08:55.28]

Jet Jacobson.

1
Speaker 1
[01:08:55.56 - 01:08:55.88]

I like it.

2
Speaker 2
[01:08:56.12 - 01:08:57.80]

My other son's Finley Jacobson.

1
Speaker 1
[01:08:57.90 - 01:08:58.46]

Fighter pilot.

2
Speaker 2
[01:08:58.90 - 01:09:00.02]

Jet Jacobson.

[01:09:01.90 - 01:09:18.32]

So he was, like, maybe nine years old when the first iPhone came out. And I had one, and he just thought it was so cool, and I will never forget, he said at the dinner table, Mom, just to be clear, when you were born, they didn't have these iPhones.

1
Speaker 1
[01:09:22.80 - 01:09:23.82]

That's great.

2
Speaker 2
[01:09:24.74 - 01:09:25.92]

Just to be clear.

1
Speaker 1
[01:09:26.20 - 01:09:31.26]

Well, when we were kids, we had a phone that was stuck to the wall with a cord. Remember?

2
Speaker 2
[01:09:31.86 - 01:09:35.60]

I mean, try telling your kids about an answering machine.

1
Speaker 1
[01:09:35.78 - 01:09:36.82]

How about dial phones?

2
Speaker 2
[01:09:38.30 - 01:10:00.06]

Okay. Are you ready for this? Trivia. One of the reasons why you could argue that computers became so important to the Defense Department back in the early 1960s is because of the Cuban Missile Crisis. And this is, like, I have seen these documents at the National Archives.

[01:10:00.60 - 01:10:25.88]

JFK was so worried about that exact movement that you made with your finger. The dial phone. There was a true red phone that would be used in a nuclear crisis for him to call Nikita Khrushchev. And he became worried that that wasted too much time to get through. And so he hired a guy called J.C.R.

[01:10:26.00 - 01:10:41.56]

Licklider to develop computers that could move faster. And at the time, the computers were the size of this room in the Defense Department. And they were these old mainframes. And Licklider is... he's called the Johnny Appleseed of the Internet.

[01:10:42.00 - 01:11:16.60]

In essence, he's the guy who created the ARPANET, which is now the Internet. And so there's this interesting dual-use technology idea: at the Pentagon, at least in the defense world, so much of our amazing technology is born out of trying to save the world from existential threat. And I always think that dual-use part of everything is super interesting. Look at lasers. Lasers are arguably perhaps the most important defense-born technological

[01:11:18.14 - 01:11:34.86]

system of the 20th century. Laser printers, laser surgery, laser eyes. And then you have laser weapons at the Pentagon, so classified I can't even get anywhere near them. They're called directed energy weapons.

1
Speaker 1
[01:11:35.94 - 01:11:44.36]

And how much do you know about them? All I see is, like, conspiracy stuff online, like Antarctica directed energy weapons.

2
Speaker 2
[01:11:45.12 - 01:12:06.82]

Yeah. Well, I mean, I always think conspiracy is born of secrecy, which would make sense. If someone is constantly telling you, you can't know about that, you're going to naturally wonder, what the hell's going on that I really can't know? What's so secret? I think that's a good instinct of ours.

[01:12:07.64 - 01:12:17.28]

And so anytime I have been at the Pentagon or wherever, asking about directed energy weapons, it's really a...

1
Speaker 1
[01:12:17.28 - 01:12:17.96]

They'd stop you.

2
Speaker 2
[01:12:17.96 - 01:12:32.54]

It's a stop. And so, to find out more about that, I tracked down... this was when I was reporting The Pentagon's Brain, a decade ago. I thought to myself, okay, well, if these guys won't talk to me, I'll find out who invented the laser. See if he will.

[01:12:33.30 - 01:12:40.22]

Go to the smartest guy in the room. So Charles Townes, who won the Nobel Prize for inventing the laser, I called him up.

1
Speaker 1
[01:12:40.22 - 01:12:40.68]

What year was that?

2
Speaker 2
[01:12:41.06 - 01:12:43.68]

This was in 2014, I believe.

1
Speaker 1
[01:12:43.68 - 01:12:45.76]

It wasn't lasers before 2014?

2
Speaker 2
[01:12:46.14 - 01:12:52.54]

No, no, no. Oh, sorry. That's when I interviewed him. He invented the laser in... I think he won the Nobel Prize in '61 or '62.

[01:12:52.86 - 01:13:09.18]

Okay. And he was 99 years old when I interviewed him. Still keeping office hours at Berkeley. Wow. Had his secretary on the line and gave this incredible interview and I asked him about his development of the laser.

[01:13:09.28 - 01:13:17.36]

But you might be interested in this. So he... he... is the guy who invented the laser. I mean, when you really think about that...

[01:13:17.60 - 01:13:18.52]

And he told me...

[01:13:18.52 - 01:13:29.44]

I said, well, who do you go to when you're having trouble, when you're Charles Townes? And he said, well... I took that particular idea to two colleagues: Einstein and von Neumann.

[01:13:30.04 - 01:13:35.70]

Okay. That's interesting. I said, who said what? And he said... Einstein...

[01:13:35.70 - 01:13:51.00]

This is when Townes was having trouble making it work. And he said that Einstein said, keep trying. And von Neumann said, it'll never work. And I said, what do you make of that? And he said, well...

[01:13:51.00 - 01:14:05.98]

Einstein was very generous of spirit and he was always encouraging other scientists to think big and try. And von Neumann was the kind of scientist who believed if he didn't come up with it, it probably wasn't a good idea.

1
Speaker 1
[01:14:06.46 - 01:14:06.90]

Ah, ego.

2
Speaker 2
[01:14:07.74 - 01:14:18.64]

But I love that, because who else in the world has those two people that they run ideas by? You know who you run ideas by. Who do I run ideas by? Yeah. Charles Townes.

[01:14:18.84 - 01:14:42.82]

But here's another interesting thought about the laser. Charles Townes, and he didn't share this fact for decades, but later in life he wrote a lengthy article, I believe it was for the Harvard Alumni Magazine, that the idea for the laser came to him, when he was sitting on a park bench, from above.

1
Speaker 1
[01:14:43.74 - 01:14:44.22]

Whoa.

[01:14:45.80 - 01:14:46.70]

Like how so?

2
Speaker 2
[01:14:48.28 - 01:15:32.26]

It was a religious experience for him. Like, he'd been working on this problem, he'd been working on this science problem, according to Charles Townes. And, by the way, he was inspired, he told me, to develop the laser from the time he was a little kid, in the twenties, reading the Soviet science fiction novel The Garin Death Ray. So it's like, it was a science fiction concept, a laser. He's a little kid, Charles Townes, thinking about this, thinking about this, then all through his life, continuing to think about it, then running it by his colleagues like Einstein, and then can't make it work, can't make it work, is sitting on a park bench, and he gets the message from above.

[01:15:32.26 - 01:15:52.34]

He made it sound very much like it was a religious experience for him, but he never wrote about it for a long time, because, particularly in the sixties and seventies, you couldn't be a scientist and have faith at the same time, or at least you would be belittled or be looked down upon, is what he said.

1
Speaker 1
[01:15:53.48 - 01:16:04.28]

That's interesting. So he was reluctant to share the inspiration behind his idea, because he felt it was divine in nature.

2
Speaker 2
[01:16:05.06 - 01:16:05.26]

Bingo.

[01:16:07.58 - 01:16:12.62]

And it's not just me he told this to. You can read about this in that Harvard article.

[01:16:14.36 - 01:16:29.68]

And I think he wrote that when he was in his late eighties. But absolutely. So he was really making a plug for listening to whatever it is that guides you, which is a very powerful statement.

1
Speaker 1
[01:16:31.04 - 01:16:45.86]

Right, whatever that is. Yeah, we have a real problem with... if we can't measure something, we don't think it's a real thing. You know, if we don't have the tools to put it on a scale, we don't think it's a real thing.

[01:16:46.04 - 01:16:48.98]

But whatever we want to call divine intervention...

2
Speaker 2
[01:16:51.02 - 01:17:09.56]

When he told me that, it was so fascinating to me. I began to explore if there were other Nobel laureates in particular that shared that belief. And there are. There are a couple of Nobel laureates, someone in chemistry,

[01:17:13.28 - 01:17:24.60]

who believed that their knowledge came from a divine inspiration. Like, they had a dream-type situation. I write about all this in my book, Phenomena.

1
Speaker 1
[01:17:25.60 - 01:17:26.14]

Yes.

2
Speaker 2
[01:17:26.64 - 01:17:45.56]

These specific examples, because it's so interesting when that kind of non-mainstream thinking comes from, you know, an individual in the community who has achieved the highest award, whatever that means.

1
Speaker 1
[01:17:45.90 - 01:18:11.42]

Isn't the creation of science, or the thought of science as a thing, wasn't that, was that Descartes? And didn't he come up with it from a dream? He had a dream where an angel told him that they would master control over nature by measurement.

[01:18:13.86 - 01:18:25.40]

See if we can find that. See if we can find that quote. But I believe this was like considered the beginning of science. So the beginning of science. Here it is.

[01:18:26.88 - 01:18:44.86]

In his third dream, the angel came to Descartes and said, conquest of nature is to be achieved through number and measure. That was the beginning of the Cartesian way of dissecting the natural world. One of mechanics, formulae, etc. Surely the conquest of the natural world has been achieved that way. Proof surely that the angel was right.

[01:18:46.16 - 01:18:48.34]

So the beginning of science.

2
Speaker 2
[01:18:49.42 - 01:18:51.08]

Measurement. Divine.

1
Speaker 1
[01:18:51.08 - 01:18:53.82]

Or whatever it is. Whatever that word is.

2
Speaker 2
[01:18:54.18 - 01:19:08.68]

Absolutely. I mean, who isn't amazed by their own dreams when they have a really intense dream? That's just part and parcel of being a human and being curious. And there is no answer to it.

1
Speaker 1
[01:19:08.84 - 01:19:09.56]

I have a theory.

2
Speaker 2
[01:19:10.00 - 01:19:10.42]

Tell me.

1
Speaker 1
[01:19:11.26 - 01:19:39.38]

I think ideas are a life form. That's what I think. When we think of a life form, the term life form, we think of it as something that can breed, something that propagates, something that spreads its DNA. Every single thing that exists on this earth that people have created came from an idea. Every mug, every computer, every airplane, every television set.

[01:19:39.68 - 01:19:55.96]

Everything came from an idea. The idea comes into the person's mind. It combines with all the currently available ideas and improves upon those ideas, and it manifests itself in a physical thing. And that physical thing is what we do. That's the number one thing we do.

[01:19:56.00 - 01:20:11.54]

We do a lot of different things. We have children and families. We have jobs. We take up hobbies. But if you looked at an overview effect, if you looked at the human race from above, you would say, if you weren't one of us, you would say, what is this species doing?

[01:20:11.64 - 01:20:22.36]

Well, it makes better things every year. That's what it does. It doesn't make the same beehive every year. It constantly, consistently, makes better things. What motivates us to make these better things?

[01:20:23.26 - 01:20:57.86]

Ideas. Ideas, and then the competition of these ideas. Now, if something wanted a thing to manifest it in the world, to make it exist in the world, what would it do? It would get inside that thing's creative structure, get inside that thing's mind, and impart these ideas, impart these inspirations, and get this thing to go out and put these pieces together and manufacture this thing, and then test it and improve upon it, and keep doing it until they get it right. And then other people will take those things and have new ideas.

[01:20:58.10 - 01:21:07.64]

I know how to take that and turn it into a tablet. I know how to turn that into this. I know how to make this better. Let's do quantum computing. Let's do that.

[01:21:07.70 - 01:21:38.14]

These are all just ideas. So ideas that human beings turn into real things, and those real things accelerate the evolution of technology in this radical way, where we can't even comprehend where it's going. You know, there was an article I put on my Instagram today. I just put the title of it, how crazy it is, that AI thing with the 500 million years. AI has extrapolated, like, they're calculating what evolution looks like in 500 million years.

[01:21:39.22 - 01:21:39.36]

Yeah.

2
Speaker 2
[01:21:39.58 - 01:21:40.98]

Oh, I have seen that.

1
Speaker 1
[01:21:41.00 - 01:21:56.76]

Yeah. Yes. Like, we don't even understand what we're doing. We don't even understand what we're doing, but we keep doing it. And I think some of the instincts that human beings have that seem frivolous are directly connected to this.

[01:21:56.88 - 01:22:10.34]

And one of them is materialism. Materialism, status, all these different things that we have, where, with materialism, you always want the newest, greatest thing. If your friend has an iPhone 15 but you have an iPhone 10, you look like a loser.

[01:22:10.34 - 01:22:16.28]

What are you doing with that old, look at that stupid old camera. Oh, my God. What are you doing with that? You need the best one. You need the new one.

[01:22:16.60 - 01:22:30.00]

If you have a 2007 car and your friend pulls up in a 2024 car, like, oh, I need the new one. What does the new one do? The new one does all these different things that the old one doesn't do. The new one drives itself. Oh, I got to get the new one.

[01:22:30.18 - 01:22:36.44]

And so what does that do? It pushes innovation. It promotes innovation.

[01:22:38.32 - 01:22:50.62]

Materialism fuels innovation, because we're always buying new stuff. If everybody stopped buying things, if everybody looked at their phone and said, oh, this phone's perfect. I don't need a new phone. I could have that phone forever. You don't need a new iPhone.

[01:22:50.86 - 01:22:52.20]

I could have that phone forever.

[01:22:54.42 - 01:23:18.94]

They would stop innovating. They would stop making things. It would just stop and then nothing would get done. But because of materialism, because of this keeping up with the Joneses thing, where everybody wants the latest thing in their driveway to impress their neighbor, you want to pull up to the fucking diner and show all your friends, all that stuff just fuels the creation of new and better things. And we're all part of it.

[01:23:19.00 - 01:23:33.22]

And every year, there's like a thing you didn't know. Like, I just got this phone. Check this out. This phone, I can make a circle on things and it Googles it instantly. This phone transcribes all of my voice notes and then summarizes them for me.

[01:23:33.34 - 01:23:37.84]

This phone, this Galaxy S24 AI, it has AI in it.

2
Speaker 2
[01:23:37.84 - 01:23:42.72]

It will summarize your notes, so like condense them into a Wikipedia of you?

1
Speaker 1
[01:23:44.24 - 01:24:02.30]

It does it with web pages. This is the Samsung Galaxy S24 Ultra. This is the new one. This thing, it can transcribe your thoughts, but, even better, it can translate conversations. You could be speaking Spanish and I could be speaking English, and it'll have a split screen.

[01:24:02.84 - 01:24:05.14]

So it'll show, like, I'll put the phone down.

2
Speaker 2
[01:24:05.18 - 01:24:06.16]

In almost real time.

1
Speaker 1
[01:24:06.16 - 01:24:27.74]

So, on your half of the phone, facing you, it would be speaking in Spanish. It would translate my words into Spanish. On my side, it would be translating your Spanish into English. Then, if you have the Galaxy earbuds, it'll do it in real time, in your ear. So you can talk to me in Spanish and I can hear you in English.

[01:24:28.78 - 01:24:29.72]

And this is just beginning.

2
Speaker 2
[01:24:30.08 - 01:24:36.68]

So at the United Nations, we're no longer going to have those translators painstakingly... Oh yeah. ...telling.

1
Speaker 1
[01:24:36.98 - 01:24:43.74]

And maybe lying. Maybe lying. Maybe distorting reality. Because that happens a lot too. You hear about translations.

[01:24:44.02 - 01:25:02.56]

You're like, that's not exactly what he said. You've kind of distorted and twisted those words. But the point is, like, these all come from ideas. And I didn't think I needed the ability to circle an image and have it immediately Google-searched, but it's so convenient. Like, if you want something or you see something, like, what is that?

[01:25:02.82 - 01:25:09.94]

What is that? Take a picture of that thing. Circle it. Then immediately it goes to Google and shows you, like, instantaneously.

2
Speaker 2
[01:25:10.64 - 01:25:17.08]

Now, when you say ideas are life forms, which is so interesting to me, I haven't heard that,

[01:25:18.74 - 01:25:39.50]

are you talking about maybe consciousness? Yes. Okay. So this... and then, what are your thoughts on how individual people are? Because you talked about a beehive, and it made me think about a hive. Like, are we all part of the same consciousness?

[01:25:39.76 - 01:25:41.68]

Like, again, Carl Jung would say...

1
Speaker 1
[01:25:42.42 - 01:25:56.92]

My theory, and again, I have no education in this, it's just my thinking about it for thousands of hours: I think we're all the same thing. I think this whole idea of we are one, that sounds so hippie. It's hard for people to digest. But I want you to think about it this way.

[01:25:57.64 - 01:26:30.52]

If you live my life, I think you would be me. And I think if I lived your life, I would be you. I think what consciousness is, is the life force of all living sentient things on this planet and perhaps all in the universe. And it's experiencing reality through different biological filters, different bodies, different life experiences, different education, different genetics, different parts of the world, different geography, different climate, different things to deal with. But it's the same thing.

[01:26:30.52 - 01:27:06.82]

I think if I lived in Afghanistan and my whole family was in the Taliban, I would probably be in the Taliban too. Because I think you adapt to whatever circumstances and environment you're in, and then you think that way, and you speak this language, and you have these customs, and you engage in these religious practices. But I think consciousness, the thing at the heart of it all, what that person thinks of when they say me, what I think of as me, I think this: I think that me is the same in every person. I think it's everyone.

[01:27:07.10 - 01:27:26.34]

That's why we're all connected. That's why we all need each other. That's why loneliness contributes to diseases, and it's terrible for human beings. I think we're all universally connected with this bizarre goal. And I think this bizarre goal might be to create artificial life.

[01:27:26.82 - 01:27:43.34]

I think that might be the end. I've said this too many times. If you've heard it, I'm sorry. But I think that we are an electronic caterpillar making a cocoon, and we don't even know why. We are just constantly building.

2
Speaker 2
[01:27:43.68 - 01:27:44.64]

The cocoon is the next transformation.

1
Speaker 1
[01:27:44.76 - 01:27:57.84]

Exactly, and we give birth to the butterfly. We are a biological caterpillar giving birth to the electronic butterfly, and we're doing it. We don't even know what we're doing. We're just making this cocoon. We just keep going, because this is what we do.

[01:27:57.94 - 01:28:30.06]

And I think this is probably how life separates from biology to something far more sophisticated that's not confined to the timeline of biological evolution, which is a very slow, relatively speaking, timeline in comparison to electronic evolution. Electronic evolution is exponential. It happens radically. It happens very fast, and especially when you get into things like when AI gets involved and quantum computing gets involved, then things accelerate so fast. Like, look at what's going on with AI.

[01:28:30.54 - 01:28:45.40]

Five years ago, AI was not even a concern. Nobody even thought about it. Now it's at the tip of everyone's tongue, and all everyone's talking about is, what happens when ChatGPT-5 gets released? What happens when 6 gets released? What happens when artificial general intelligence is achieved?

[01:28:45.78 - 01:28:54.60]

What happens when these things become sentient? What happens when these things make better versions of themselves? When does that stop? Does it ever stop? Do they become gods?

[01:28:54.76 - 01:29:07.94]

Is that what God is? Is this what God is? This primate becomes curious, starts inventing tools. Tools lead to machines. Machines lead to the Industrial Age.

[01:29:08.12 - 01:29:14.56]

The Industrial Age leads to electronics. Electronics lead to artificial intelligence. Artificial intelligence leads to God.

2
Speaker 2
[01:29:15.38 - 01:29:39.34]

Wow. Okay. In that line of thought, go back now on that timeline. To your eye, how did we go from language, you know, grunting, how did we go from grunting to language, and then from language to writing? Why did we decide, why did we, why and how did we write in your...

1
Speaker 1
[01:29:39.34 - 01:29:56.24]

Well, I think the moment they started pointing at things and making sounds, and, by the way, animals do that. We know that, right? Do you know that some monkeys trick other monkeys? So some monkeys will make a sound like an eagle's coming, so that the other monkeys run out of the trees, and then they'll run up the trees and steal the fruit.

2
Speaker 2
[01:29:56.70 - 01:29:57.82]

I did not know that.

1
Speaker 1
[01:29:57.88 - 01:29:58.78]

Yeah, they lie to each other.

2
Speaker 2
[01:29:58.84 - 01:30:00.16]

Have you seen Chimp Empire?

1
Speaker 1
[01:30:01.04 - 01:30:04.60]

Yes. Yeah, I had the director on. It was amazing. It's an amazing film.

2
Speaker 2
[01:30:05.20 - 01:30:30.82]

God, there's so much to discuss. Somebody was asking me, after I wrote the Surprise, Kill, Vanish book about the CIA's paramilitary, the Ground Branch guys, the snake eaters, and they said, what's the origin story of Ground Branch? Thinking I was going to say, you know, the OSS. But I had just seen Chimp Empire, and I couldn't help myself but say, Chimp Empire is the origin story of the CIA's paramilitary.

[01:30:30.98 - 01:30:31.26]

Wow.

[01:30:32.98 - 01:31:13.88]

Right? And I remember watching that, it's like the four hours, and it's so magnificently filmed, and it's so interesting. And that moment where they go back in time, and you realize someone had been videotaping the previous generation of chimps, and that their anger and their fighting was based on revenge, that blew my mind. Because I often think, because I write about war and weapons, I always think about nature versus nurture, and were we built this way? You know, this is a very interesting thing to think about, like your electric caterpillar.

1
Speaker 1
[01:31:13.88 - 01:31:16.02]

They were fighting over resources.

2
Speaker 2
[01:31:16.24 - 01:32:09.06]

They were fighting over resources, but it was also revenge, remember that? There was, like, a score to settle with the previous generation, and that, to me, was stunning. And also, I love that they named everyone, but one of the chimps, the head chimp's brother, I think it was, lost his arm in a poacher's trap, remember that? And when you think about chimps and how important their arms are, swinging from trees, if you just followed the logic about, you know, survival of the fittest, then that chimp, the brother chimp, would have died, because he didn't have one of his hands. But instead, the other brother, the lead chimp, made sure he was taken care of, which is so like human war. Yes. I still think about that.

1
Speaker 1
[01:32:09.14 - 01:32:14.76]

Well, that's how they can survive. They need help. That's why they live in tribes. That's why they're not individuals, alone by themselves.

2
Speaker 2
[01:32:15.90 - 01:32:24.20]

Okay, so you think chimps were pointing, and then from that, you had to figure, over long amounts of time...

1
Speaker 1
[01:32:24.42 - 01:32:50.02]

Yes, over immense amounts of time, they started to develop, just like monkeys have sounds for eagles, you know, monkeys have sounds for snakes. They have sounds for different things, and this is a relatively recently known thing. I think they used to think they were just making noises before. Now they realize, oh, no, they have very specific noises for very specific things, and I think that evolves. Do you know that they think that apes right now are in the Stone Age?

[01:32:51.88 - 01:32:54.14]

Yes, they've entered into the Stone Age.

2
Speaker 2
[01:32:54.44 - 01:32:56.92]

You mean apes have evolved into the Stone Age?

1
Speaker 1
[01:32:57.06 - 01:33:17.30]

Anthropologists right now believe that if we used to be Stone Age lower primates, and we eventually evolved into being human beings, the other apes are now entering the Stone Age. There's evidence of them using tools. There's evidence of them manufacturing tools. There's a video of an orangutan. There's some famous photos of an orangutan spearfishing.

[01:33:18.18 - 01:33:27.40]

Recently? Yes. He's hanging on to a tree, and he's got a spear, and he's jabbing fish out of the water with his spear. It's wild to see.

2
Speaker 2
[01:33:27.52 - 01:33:28.52]

They're catching up with us.

1
Speaker 1
[01:33:28.62 - 01:33:34.22]

Yes, well, that's what happens. Right. I mean, the idea that they're going to stay the same. Look at this. Okay.

2
Speaker 2
[01:33:35.58 - 01:33:36.46]

Oh, wow.

1
Speaker 1
[01:33:36.60 - 01:33:42.76]

That's amazing. How crazy is that? He's fishing. He's fishing with a spear. That's pretty amazing.

[01:33:43.16 - 01:34:03.70]

That's just one of many instances of them using tools, using things. They're starting to enter into this phase where they start evolving. I mean, and it's a slow process, and we went through it somehow or another. We went through it in an accelerated way. That's very confusing, also.

[01:34:04.12 - 01:34:22.96]

Like, the doubling of the human brain size over a period of two million years apparently is the biggest mystery in the entire fossil record. How does this one thing, in the entire fossil record, double over a period of two million years? What are the factors? There's a bunch of different ideas, but all of them are just ideas. No one really knows.

2
Speaker 2
[01:34:23.36 - 01:34:48.08]

Okay, so here's an interesting anecdote. Nuclear war, apes. In 1975, there was this famous defense official who went from being a hawk to being like we cannot have so many nuclear weapons. His name was Paul Warnke, and he wrote what was then a famous article in Foreign Policy magazine called Apes on a Treadmill. Such a great image.

[01:34:48.28 - 01:35:29.10]

His idea was that the nuclear arms race was apes on a treadmill, that we and the Russians were just slavish, like essentially ignorant beasts, just slaving away on this treadmill, trying to win, not even realizing there is no winner. It was a famous article. Everybody spoke about it in D.C., and then it disappeared. Well, the anecdote comes from recently: a group of scientists wanted to try to answer the question that we're talking about, like, how did apes go from knuckle-walking to being bipedal?

[01:35:29.30 - 01:35:47.56]

How did that happen? We still don't know why. They were trying to figure out if it had to do with energy consumption. They outfitted apes, they put them on a treadmill, they outfitted them with oxygen masks, and then they had humans doing the same thing. They were measuring energy levels.

[01:35:48.74 - 01:36:11.10]

During this experiment, and they kept making them do it over and over again, the humans and the apes on the treadmills, during one of the experiments, one of the apes was basically like, screw this, I'm not doing this anymore. He pushed the button and got off. And the analogy to nuclear war is, if apes can figure out how to get off the treadmill, why can't humans?

1
Speaker 1
[01:36:14.58 - 01:36:18.12]

Well, apes aren't being influenced by lobbyists.

2
Speaker 2
[01:36:20.52 - 01:36:21.92]

Apes aren't watching TikTok.

1
Speaker 1
[01:36:22.20 - 01:36:31.08]

Apes aren't having secret lunch meetings with the head of Raytheon. Yeah, I don't know. Who knows? I think we can. I don't think it's impossible.

[01:36:31.34 - 01:36:50.94]

I think you and I, being here right now, having this conversation where there's not a nuclear war going on, speaks to that. Like, we've had nuclear weapons for a long time, and we haven't used them. I don't think it's... I don't think it's an inevitability that we destroy ourselves, but it's a possibility. And I think there's a lot of foolish people that are ignoring that possibility.

[01:36:51.22 - 01:37:08.06]

And that's what's scary. And what's scary is that the type of people that want to become president, congressmen, senators, some of them are great people and some of them are... They're wise and they're good leaders. And some of them are just people that are too ugly to be on TV. They're too ugly to be actors.

[01:37:08.28 - 01:37:09.12]

They're too ugly to be...

[01:37:09.12 - 01:37:18.52]

They can't sing. They want attention. And so they want to be a leader. And so they want to say the things that people want to hear, because those things get them positive attention and they feel good.

[01:37:18.80 - 01:37:46.30]

And then they get a bunch of people who love them and they feel good. And then they have their face on a billboard and they have bumper stickers and, like, everybody likes me. And the people who don't like me, they're communists or losers. And so it's a cult of personality thing that is just a part of being a charismatic person and garnering attention. And that's a giant part of our whole political process is narcissists and psychopaths and sociopaths that have embedded themselves into the system.

[01:37:46.44 - 01:37:58.56]

And then they all feed off each other and help each other. And they're all insider trading. And they're all involved. And then, when they leave office, they get paid to speak in front of bankers and make a half a million dollars. We're overrun by sociopaths.

2
Speaker 2
[01:38:01.18 - 01:38:02.54]

Why is that?

1
Speaker 1
[01:38:03.32 - 01:38:12.40]

Well, it's a natural path. Like, that's what most leaders are. Most leaders of immense groups of people that force people into war are sociopaths.

2
Speaker 2
[01:38:12.50 - 01:38:20.00]

It's the old adage that, you know, you have to be crazy to become president in the first place, or you wouldn't become president, because you would realize it's crazy.

1
Speaker 1
[01:38:20.08 - 01:38:36.44]

Yeah, my take is that no one who wants to be president should be allowed to be president. Almost no one. And that's, you know, that's also part of the problem with the world we're living in today. It's always the lesser of two evils. That's our choice, always.

2
Speaker 2
[01:38:36.84 - 01:38:47.24]

Well, it's also so peculiar what happens to people when they become president. And I'll give you an example, again, from the nuclear war concept of things. Do you know about the policy launch on warning?

1
Speaker 1
[01:38:47.76 - 01:38:48.04]

No.

2
Speaker 2
[01:38:48.50 - 01:38:55.82]

Okay. Do you know about sole presidential authority? Yes. Okay, so you know: the president alone launches nuclear war.

[01:38:55.96 - 01:39:21.44]

He doesn't ask the SECDEF, doesn't ask the chairman of the Joint Chiefs, doesn't ask Congress. He launches. Furthermore, we have a policy called launch on warning. So when he is told that there is an incoming nuclear missile on its way to the United States, which is how this begins, he must launch on warning. That is policy.

[01:39:21.52 - 01:39:25.00]

We do not wait. That is a quote from former Secretary of Defense Bill Perry.

[01:39:26.66 - 01:39:59.82]

Now, before taking office, many of these presidents say that they are going to change that insane policy, because it essentially creates this volatile situation where every foreign leader knows, they're going to launch on warning, and so am I. They say they're going to change the policy before they take office. This is documented. You know, Clinton, Bush, Obama, not Trump. And then they don't, and no one knows why.

[01:40:00.84 - 01:40:16.32]

Why do you think that is, that you would have a position, before becoming president, that such an extreme policy needs to change, and then become silent on that?

1
Speaker 1
[01:40:17.42 - 01:40:18.50]

I don't think anybody...

[01:40:18.50 - 01:40:31.82]

If I was going to have a conversation with Trump, the number one thing that I would want to ask him is, what is the difference between what you thought it was like and what it is like? What happens when you get in there? What happens when you get in there? No one knows.

[01:40:31.92 - 01:40:55.44]

It's so secretive. The meetings between heads of state, between senators, congressmen, behind closed doors, all the different meetings with the head of the Pentagon, the head of the intelligence agencies, we don't know what those meetings are like. No one can know. It's a point of national security. So if you're not president, of course you can't go to the fucking meetings.

[01:40:56.02 - 01:41:23.60]

So then you become president, and then you get briefed. So then they sit you down, and they hit you with all the problems in the world that you don't know about, that nobody knows about, but them. And I bet they're numerous, and I bet it's terrifying. And I bet that's a giant factor as to why people change between... I mean, a lot of what they say when they're running for president, they know they're not going to do, but they're saying it because they want people to vote for them, and then they get in there, and they go, I'm not going to release the JFK files.

[01:41:23.90 - 01:41:35.68]

They do things like that. That's what they always do. But I think a lot of it is also, you can't know what you're talking about until you get that job, until you're in there. You don't know what you're talking about. You have no idea.

2
Speaker 2
[01:41:36.22 - 01:41:59.44]

But whose narrative is that? In other words, is it the previous administration? Because if that's the case, if the previous administration is briefing you, the new president, the incoming president, on all of these state secrets that are so terrifying, you have to throw your old promises out the window, for the most part. Wouldn't that change every... That's where I'm lost.

[01:42:00.60 - 01:42:14.42]

And also, it's so tiring reading these presidential manuals, or memoirs, rather, which then say absolutely nothing original to any of us, the citizenry, about what's really happening as president.

1
Speaker 1
[01:42:14.90 - 01:42:23.46]

Well, because those are just designed to make money. Those memoirs are not really designed to do anything other than generate income.

2
Speaker 2
[01:42:25.54 - 01:42:28.26]

I want the real story from the president.

1
Speaker 1
[01:42:28.60 - 01:42:44.24]

Right. You're not going to get it. You won't even get it if you're alone with them. I mean, I think that there's probably some things that they say in there that they have to kind of skirt around, and figure out what to say, figure out how to say it. Did you ever read Bill Clinton's My Life?

2
Speaker 2
[01:42:45.14 - 01:42:49.18]

I don't think I could read a biography by him, to be fair.

1
Speaker 1
[01:42:49.20 - 01:42:52.70]

He had one wild thing to say about the moon landing.

2
Speaker 2
[01:42:52.92 - 01:42:53.62]

What did he say?

1
Speaker 1
[01:42:53.80 - 01:43:11.38]

Really wild. Well, he was saying that he was working. I think he was doing construction in 1969 and showed up for work. Let's see if we can find the quote, so I don't paraphrase it. I think it's page 241 of Bill Clinton's My Life.

[01:43:11.62 - 01:43:12.58]

I might have the page wrong.

[01:43:14.22 - 01:43:31.68]

So in this scenario, he's talking to this carpenter. And he's telling this carpenter, isn't it crazy? They landed on the moon. And the carpenter says something to the effect of, I don't believe anything those television fellers say.

[01:43:32.30 - 01:43:44.18]

And he goes, I think they can fake anything, somewhere along that line. And he goes, back then, I thought that guy was a crank. He goes, but after my eight years in the White House, I started thinking maybe he was ahead of his time.

[01:43:45.82 - 01:44:14.16]

Now, just imagine saying that about the biggest achievement in human history, landing a human being on the surface of the moon. Another planet, essentially. One quarter the size of our planet, 250-something thousand miles away. And he's saying, maybe this guy who thought it was fake was ahead of his time. That's not something you accidentally say.

[01:44:14.50 - 01:44:20.50]

That's not something you frivolously write down and just add it to your book. Did you find the quote?

2
Speaker 2
[01:44:21.96 - 01:44:26.78]

I've got to see that real quote, because that's some really heavy subtext there.

1
Speaker 1
[01:44:26.78 - 01:44:30.52]

Jamie will find it. But when you read it, you're like, wait, what the fuck did he say?

[01:44:32.26 - 01:44:39.38]

And it got kind of glossed over, except for the moon landing hoax community who went, yeah, I told you it's fake.

[01:44:41.54 - 01:44:51.36]

But that's interesting. And that guy wrote all that longhand, by the way. He wrote that all on composition books, just notebooks, wrote it all out.

2
Speaker 2
[01:44:51.72 - 01:44:52.70]

And then it was just transcribed.

1
Speaker 1
[01:44:54.50 - 01:45:08.48]

So this is not like an accident that he told this story. Imagine saying that. Imagine saying that this guy who thought the moon landing was fake, maybe he was ahead of his time. Did you find it yet?

[01:45:09.30 - 01:45:18.96]

Honestly, I can only find it written on Wikipedia. It's linking to the book, but this is what it says on Wikipedia. There it is. Perfectly. Perfectly.

[01:45:19.20 - 01:45:39.54]

Okay, here it goes. Just a month before, Apollo 11 astronauts Buzz Aldrin and Neil Armstrong had left their colleague Michael Collins aboard spaceship Columbia and walked on the moon. The old carpenter asked me if I really believed it happened. I said, sure, I saw it on television. He disagreed. He said that he didn't believe it for a minute, that them television fellers could make things look real that weren't.

[01:45:40.00 - 01:45:48.40]

Back then, I thought he was a crank. During my eight years in Washington, I saw some things on TV that made me wonder if he wasn't ahead of his time.

2
Speaker 2
[01:45:50.20 - 01:45:56.42]

Well, that's very weird and cryptic because he says, I saw some things on TV that made me wonder.

1
Speaker 1
[01:45:58.02 - 01:46:09.42]

Yeah, things that were displayed to the American public, that were horseshit. Some things that are on TV that people are going to look at, where he knows it's not real. So he's wondering.

2
Speaker 2
[01:46:09.76 - 01:46:20.10]

And by dragging the moon into it, there's just no way that you can do anything but infer that he's referencing that. Of course. And has anyone ever asked him about that on follow-up?

1
Speaker 1
[01:46:20.32 - 01:46:31.12]

That dot dot dot could be a nice edit for this. This is a Wikipedia page about the conspiracy, the moon landing conspiracies. That's why I couldn't find it somewhere else. I was trying to find it. No, no, no.

[01:46:31.22 - 01:46:39.40]

I've read the quote. It's just more of the same. It's just abbreviated. The whole thing, that's the point of the quote, that's the meat: the old carpenter

[01:46:39.40 - 01:46:43.46]

asked me if I really believed it happened. This is all not taken out of context.

2
Speaker 2
[01:46:45.28 - 01:47:10.50]

So will politics in the United States ever change to a point where we can have individuals who accept their responsibility? I mean, another crazy thing, again, interviewing Panetta about Clinton, as his chief of staff. It's like, oh, this is Panetta talking, that the presidents are very under-informed about any of their responsibilities. I'm sure. About nuclear war.

[01:47:10.58 - 01:47:22.44]

They don't know how it unfolds. They don't know how fast it is. Did you know that they have a six-minute window to make a counterattack? Six-minute window. And that comes from President Reagan's memoir, by the way.

[01:47:23.80 - 01:47:35.32]

He said that, there's a quote from him, which I have in the book: a six-minute window to have to make a decision to possibly end the world is irrational.

1
Speaker 1
[01:47:35.82 - 01:48:06.46]

I think part of the problem, and to answer your question about leaders, politicians, whether they change: we're asking humans to not be human. That's what we're asking. We're asking them to be these perfect things, and then now we're looking up their ass with a microscope more than we've ever done before. And we're also accepting that the other side is going to lie about this person's background and lie about what this person's done. For three years, all you heard on the news was Russia collusion with Donald Trump.

[01:48:06.56 - 01:48:13.28]

They were trying to say, Russia put him in office. He's a Russian puppet. He's a Russian thing. It was all a lie. They made it up, and they said it everywhere, and you're allowed to do that.

[01:48:13.56 - 01:48:57.64]

You're allowed to do that. So not only are you taking a person and asking them to not be a person, but then you're looking at everything they've done, and you're allowed to lie about it, and you're allowed to lie about it in cahoots with the media, who spreads this lie on television every day with no consequences. So the problem is not just that we don't have good leaders. It's that I don't know if it's possible to have a good leader. I don't know if those kind of humans are real, and I wonder if AI, even though everyone's terrified of it, and I am too, I wonder if that's our way out.

[01:48:58.10 - 01:49:49.16]

I wonder if the only thing that can actually govern society fairly and accurately is an artificial intelligence, something that doesn't have emotions and greed and narcissism and all the other contemptible traits that all of our politicians have. What if it doesn't have any of that? First of all, what if it's far smarter than us and realizes a fair and reasonable way to allocate resources and to eliminate pollution, to mitigate overfishing and all the different issues that we have? It looks at it in an absolutely accurate and brilliant way that we're not even capable of doing. And this idea of us being governed by people and not wanting those people to behave like every person ever who's been in power.

[01:49:50.06 - 01:50:11.70]

Absolute power corrupts absolutely. Everyone knows it, but we just assume that we're going to have this one guy that's like his morals are so strong that when he gets in there, he's going to right the ship and drain the swamp. I don't think those people are real. I don't think that's a real thing. I think the folly that human beings have displayed is built into the programming.

[01:50:11.70 - 01:50:15.82]

I don't think it's unusual. I think it's usual.

2
Speaker 2
[01:50:16.32 - 01:50:21.48]

Any president you ever admired or part of their actions that you admired?

1
Speaker 1
[01:50:21.70 - 01:50:27.06]

Well, there's a lot of human beings that I admire. They're just flawed human beings I still admire.

[01:50:28.70 - 01:50:41.60]

I mean, Kennedy said a lot of amazing, brilliant things, but he was also deeply flawed. And if he was alive today, oh my God, the skeletons they would have pulled out of that guy's closet. It would be crazy. He would never make it. He would never get anywhere close.

[01:50:42.64 - 01:50:55.90]

But that's just, he's a human. And you could kind of be flawed and also have these brilliant takes on things like he did then. But look what happened to him. They fucking killed him. And then they covered it up.

[01:50:56.08 - 01:51:00.10]

And then today, even today, they won't release the files. Today.

2
Speaker 2
[01:51:00.50 - 01:51:27.68]

I mean, that, speaking to the idea that conspiracies have become popular, or rather, you know, thinking that there is a conspiracy behind things. It's so astonishing that those files are not released after all this time. Yeah. It's almost impossible not to see, not to ask, what is behind that veil? What is so important to cover up?

1
Speaker 1
[01:51:27.92 - 01:51:28.52]

It is impossible. It is impossible.

2
Speaker 2
[01:51:28.84 - 01:51:39.52]

And it does make me think that whatever it is, is such a poor reflection on America that the president therefore agrees, okay, we won't do that.

1
Speaker 1
[01:51:39.74 - 01:51:55.66]

Yeah, I think it would cause a deep rift in our society that maybe we can't really handle. It's possible. It's possible that it's the CIA that did it, and that we're going to realize that these people, some of them, may even still be alive.

2
Speaker 2
[01:51:56.90 - 01:52:14.34]

From all of the different sources I spoke to over the decades, I always get the sense that it was a nation state, and that nation state happens to be nuclear armed, and even today, people would demand consequences.

1
Speaker 1
[01:52:15.54 - 01:52:32.04]

It's possible, but the people that were involved in the Warren Commission report, some of them were like Allen Dulles. Dulles was fired by Kennedy. Imagine, you get assassinated and the guy who you fired, who fucking hates you, gets to write the story of what happened to you.

[01:52:33.88 - 01:52:45.04]

I mean, there's a lot that goes into that that's very, very deep. There's a lot. The Warren Commission report is horseshit. If you read it and if you...

[01:52:46.18 - 01:52:51.02]

David Lifton wrote a book on it called Best Evidence. Did you ever read it?

2
Speaker 2
[01:52:51.36 - 01:52:55.36]

No, but I know about him from Tom O'Neill in the Chaos book, which is so awesome.

1
Speaker 1
[01:52:57.20 - 01:53:51.96]

He studied the entire Warren Commission report, which almost nobody had at the time, and he was like, this is filled with inconsistencies. None of this makes any sense. There's so much wrong with all these different things that they're saying that he started doubting it, and then he started looking into the assassination itself and finding how many witnesses had died mysteriously. The whole thing reeks of conspiracy from top to bottom, from Jack Ruby showing up and killing Lee Harvey Oswald, to Jolly West visiting Jack Ruby in jail and all of a sudden Jack Ruby goes insane, to the fact that Jolly West was the head of MK-Ultra, which likely supplied Manson with LSD, which ran the Haight-Ashbury Free Clinic, which was where they were giving people LSD, which was running Operation Midnight Climax, where they were dosing up johns with LSD and watching them through two-way mirrors. All this is real.

[01:53:52.20 - 01:54:05.92]

This is all indisputable, absolute truth that was going on at the exact same time. And to think that that's the only thing that we're lying about is just that stuff. Everything else is above board. That stuff was important and we were just trying to get to the bottom of things.

2
Speaker 2
[01:54:06.48 - 01:54:07.42]

No. There was certainly.

[01:54:09.04 - 01:54:43.30]

not the same degree of an alert and knowledgeable citizenry in the 50s and 60s. Everyone just took everything at face value, and it is remarkable as a historian, and Tom O'Neill's work also speaks to that, to go back in time and look at that. Just as interesting, perhaps, as the facts of the matter, like the Warren Commission, is to ask: how did everyone simply accept this as fact? But then I think it's valuable to have the old look-in-the-mirror moment and go, what is it today that we're not looking at?

[01:54:44.94 - 01:54:57.88]

What is it today that will be, in 10, 20, 30 years from now, I can't believe they were all falling for that concept? That's the old saying: you can't fix what you can't face.

1
Speaker 1
[01:54:58.12 - 01:55:01.10]

I think it's going to be the influence of pharmaceutical drug companies.

[01:55:02.64 - 01:55:24.04]

Yeah. And also processed food companies. We know now that processed food companies, major food manufacturers, are paying food influencers to say that all food is good food and to talk about body positivity. That's all motivated by them to sell more Oreos and whatever the fuck they sell. We know that.

[01:55:24.12 - 01:55:24.90]

We know that's a fact.

2
Speaker 2
[01:55:25.58 - 01:55:40.22]

It reminds me of the hypocrisy I feel and think about behind smoking. I live in Los Angeles and in the city of Beverly Hills, you cannot smoke anywhere. You will get a ticket if you smoke outside.

1
Speaker 1
[01:55:40.30 - 01:55:41.52]

But you do fentanyl in your tent.

2
Speaker 2
[01:55:42.46 - 01:56:02.20]

And I don't smoke. But to the point, all you have to do is research, you know, think about how they created the whole smokers' world. You see those old ads from the 50s and 60s where the doctor or the OB-GYN is smoking while they're visiting with the pregnant woman and encouraging her to smoke because it will make her relax.

1
Speaker 1
[01:56:02.60 - 01:56:04.52]

Did you see asthma cigarettes? Have you ever seen those?

2
Speaker 2
[01:56:04.86 - 01:56:05.80]

Asthma cigarettes?

1
Speaker 1
[01:56:05.80 - 01:56:09.56]

Yeah, asthma cigarettes. Cigarettes that they prescribed for people with asthma.

2
Speaker 2
[01:56:09.82 - 01:56:10.08]

They did not.

1
Speaker 1
[01:56:10.66 - 01:56:22.00]

Yeah. This just shows you how evil corporations are even back then, just because they could get away with it. And back then, there was no internet, so you couldn't find out that, hey, you shouldn't smoke any fucking cigarettes when you have asthma.

2
Speaker 2
[01:56:22.38 - 01:56:23.40]

Look at this.

1
Speaker 1
[01:56:24.82 - 01:56:36.06]

Asthma cigarettes. In cases of asthma, cough, bronchitis, hay fever, influenza, and shortness of breath, smoke cigarettes. Asthma cigarettes for your health.

[01:56:37.76 - 01:56:38.76]

Temporary relief.

2
Speaker 2
[01:56:39.32 - 01:56:48.12]

It's ubiquitous. When I was researching the nuclear war book, I came across a quote from General Groves. Remember him from the Oppenheimer film? Uh-huh. The Matt Damon character.

[01:56:48.66 - 01:57:00.02]

He was at a commission. This is like a year after the atomic bombing of Hiroshima and he was asked about radiation and he said, quote, radiation is a very pleasant way to die.

1
Speaker 1
[01:57:01.26 - 01:57:01.86]

Whoa.

2
Speaker 2
[01:57:02.02 - 01:57:03.06]

That's what he told Congress.

1
Speaker 1
[01:57:03.88 - 01:57:04.44]

Jesus.

2
Speaker 2
[01:57:04.90 - 01:57:10.26]

And so, it's everywhere with officials, until the...

1
Speaker 1
[01:57:10.26 - 01:57:11.24]

Safe and effective.

2
Speaker 2
[01:57:11.86 - 01:57:13.82]

It's a pleasant way to die.

1
Speaker 1
[01:57:14.22 - 01:57:27.88]

Yeah. It's just gaslighting. People have been doing that forever. It's whenever they can get away with it. If they could speak eloquently and phrase things in a way that can sort of shift opinion one way or the other, they do that.

[01:57:28.66 - 01:57:34.32]

Yeah. Especially when there are people in power. Or, when a person in power assigns someone to go be a spokesperson.

[01:57:35.96 - 01:57:48.30]

Yeah. It's dangerous. And I think that's another place where artificial intelligence may help us. I think we're going to get to a place where lying is going to be impossible. Hmm.

[01:57:48.36 - 01:58:07.08]

I don't think it's going to be possible within the next 10 years to lie. I think it's all out the window. I think right now we're worried about people being in control of artificial intelligence, because they can propagate misinformation and they can just create deep fakes. I think that's going to be a problem, for sure. And it's certainly something that we should consider.

[01:58:07.38 - 01:58:31.28]

But I think that what's going to happen in the future is we will likely merge minds through technology in some very bizarre way. And I think information will flow much more freely. We'll be able to translate things. We'll be able to know thoughts. We perhaps will come up with a universal language that everyone will understand.

[01:58:31.80 - 01:58:53.12]

And you'll be able to absorb it almost instantaneously, because you're going to have some sort of a chip. Whether it's a Neuralink or some device that you wear or something that links you up. And we're going to have a completely different type of access to information. Just like before language, they had grunts and they pointed at things. Then they started writing things down.

[01:58:53.28 - 01:59:03.44]

They had carrier pigeons. They had smoke signals. They had all those different methods to get information out. Now you have video. You send a video to the other side of the fucking planet and it gets there immediately.

2
Speaker 2
[01:59:03.80 - 01:59:04.64]

It's one of the

1
Speaker 1
[01:59:04.64 - 01:59:17.90]

wildest things that we've ever created, and we take it for granted. You could be FaceTiming someone in New Zealand and you're looking at each other in real time and having a conversation. They're on the other side of the world. You could send them a video. It gets there like that.

[01:59:17.90 - 01:59:29.54]

You could download things from their websites. You get it like that. You're streaming things instantaneously. Completely different way of accessing information. I think this is that times a million.

[01:59:30.00 - 01:59:54.38]

I think this is grunts to video instantaneously and in some way that we can't even really imagine, because we don't have the framework for it. We don't have the context. We don't have the structure. We don't have this thing that exists right now that can do these things. But once it does and once people link up, I think there's going to be a whole new way of human beings interacting with each other.

[01:59:54.38 - 02:00:09.20]

that could eliminate war. It could eliminate all of our problems. It really could, but we won't be us anymore. Romance and dangerous neighborhoods and all those things are going to go away. Crime and all that shit is going to go away.

[02:00:09.60 - 02:00:25.74]

We're going to be living in some bizarre hybrid cyborg world and it's just going to be the new thing. The new thing is you have a phone on you. I have a phone on me. We carry a phone everywhere. You're going to be linked into this, just like you're linked into your social media and your email.

[02:00:26.06 - 02:00:41.94]

You're going to be linked into this, but you're going to be linked physically into this and we're all going to be into this. And AI is probably going to be running the world. It's probably going to be artificial intelligence that governs the biological units, the humans.

2
Speaker 2
[02:00:42.70 - 02:00:57.84]

Okay, here's the dystopian version of that that I just heard about recently, picking up on your point about language, where you said we're all going to be speaking the same language. So in the defense world there's a movement now for drone swarms. You must know about this.

1
Speaker 1
[02:00:58.26 - 02:00:58.32]

Yeah.

2
Speaker 2
[02:00:58.60 - 02:01:13.10]

So now there's a movement because things are happening. Technology is moving so fast. There's a movement to have the drones communicate with one another through an AI language so that it is.

1
Speaker 1
[02:01:13.10 - 02:01:13.64]

I don't know where this is going.

2
Speaker 2
[02:01:13.64 - 02:01:15.64]

It is non-hackable.

[02:01:17.36 - 02:01:25.92]

So the different pods, the different drone swarms will have language that they will invent and they will know and we will not know.

[02:01:28.02 - 02:01:46.12]

That becomes a little troubling when you consider that if there's a human in the system, then the human can interface with the drones, provided that the human has access to that AI language. But very easily the AI language could decide not to include the human.

1
Speaker 1
[02:01:46.14 - 02:01:57.70]

And probably would. We're ridiculous. Did you see the story? I believe it was... was it Facebook or Google's AI that they had to shut down because they were talking to themselves in a new language?

[02:01:58.26 - 02:02:13.84]

Facebook. Yeah. So Facebook had developed artificial intelligence, and this artificial intelligence was computing, these computers were communicating with other computers, and they shut them down because they started communicating in a language that they made up.

2
Speaker 2
[02:02:15.48 - 02:02:22.12]

That's fascinating. Pull that story up. It's crazy. Had they been given a command to make up their own language? No, that's the problem.

[02:02:22.12 - 02:02:23.30]

So they just took it upon themselves?

1
Speaker 1
[02:02:23.52 - 02:02:46.34]

You know, they don't really totally understand what's going on with artificial intelligence, in the sense of, like, how it's doing what it's doing. And they do a lot of things where they don't understand why they're doing it. Because they set a framework and they give them information, and they're trying to, like, mold them, but essentially they're thinking. That's what's bizarre.

[02:02:47.50 - 02:02:48.90]

Okay, this is published in 2017.

[02:02:49.92 - 02:02:57.32]

Facebook abandoned an experiment after two artificially intelligent programs appeared to be chatting to each other in a strange language only they understood.

2
Speaker 2
[02:02:57.72 - 02:02:58.78]

Whoa, and that's 2017?

1
Speaker 1
[02:02:58.92 - 02:03:27.94]

Yeah. Two chatbots came to create their own changes to English that made it easier for them to work, but which remained mysterious to the humans that supposedly look after them. The bizarre discussions came as Facebook challenged its chatbots to try and negotiate with each other over a trade, attempting to swap hats, balls and books, each of which was given a certain value. But they quickly broke down as the robots appeared to chant at each other in a language that they each understood, but which appears mostly incomprehensible to humans.
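[A minimal toy sketch of the negotiation setup the article describes, for readers who want the mechanics. Every name and number below is an illustrative assumption, not Facebook's actual code: two agents split hats, balls, and books, each holding private per-item values, and the only reward is the value of the final split. Nothing in that objective rewards keeping the messages exchanged along the way in readable English.]

```python
# Toy sketch (illustrative assumptions throughout, not Facebook's code):
# two agents split hats, balls, and books; each agent privately values
# the items differently, and the only reward is the value of the final
# split -- nothing rewards keeping negotiation messages in English.
import itertools

pool = {"hats": 3, "balls": 2, "books": 1}        # items on the table
values_a = {"hats": 1, "balls": 3, "books": 2}    # agent A's private values
values_b = {"hats": 2, "balls": 1, "books": 5}    # agent B's private values

def reward(split, values):
    # Value an agent assigns to the items it ends up with.
    return sum(split[item] * values[item] for item in split)

# Enumerate every possible split and pick the one with the highest joint
# reward -- the quantity a learned dialogue policy is optimized toward.
best_split, best_total = None, -1
for counts in itertools.product(*(range(n + 1) for n in pool.values())):
    split_a = dict(zip(pool, counts))
    split_b = {item: pool[item] - split_a[item] for item in pool}
    total = reward(split_a, values_a) + reward(split_b, values_b)
    if total > best_total:
        best_split, best_total = split_a, total

print("agent A takes:", best_split, "| joint value:", best_total)
```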

2
Speaker 2
[02:03:29.24 - 02:03:31.86]

That's crazy, and that's seven years ago.

1
Speaker 1
[02:03:32.04 - 02:03:35.78]

Yeah. That's like bullshit robots. They sucked back then.

2
Speaker 2
[02:03:35.98 - 02:03:55.32]

But this is the old, you know, just because you can do it, should you do it? And this is the lesson that is never learned, and it seems to be going in only one direction in terms of existential threat. You look at the atomic bomb. It's the old, we hope it will work and not set the atmosphere on fire.

[02:03:55.68 - 02:04:15.30]

Right. Then you look at recombinant DNA in the 1970s, when biologists first figured out that they could combine DNA. There were massive discussions among scientists to curtail this technology. And it just kind of evaporated from people's minds.

[02:04:15.30 - 02:04:26.82]

and then suddenly you have CRISPR. And again, these are all the dual-use issues. No doubt there are medicines, antivirals, things that can help the people.

1
Speaker 1
[02:04:27.44 - 02:04:31.96]

Have you seen the robot they made recently out of human skin? Living human tissue?

2
Speaker 2
[02:04:32.12 - 02:04:32.42]

No.

1
Speaker 1
[02:04:32.64 - 02:04:37.32]

It smiles and they can make it move. I think it was a Japanese creation.

2
Speaker 2
[02:04:38.56 - 02:04:40.52]

Like skin of a deceased person?

1
Speaker 1
[02:04:40.94 - 02:04:49.32]

No, I think they took human skin tissue. And grew it? Here it is. Where was this from? Yeah, Japanese scientists create living robot skin with human cells.

[02:04:51.36 - 02:05:01.14]

So this thing, they can make it smile. So this is living tissue. Living human tissue. And they can manipulate it and make it do things.

[02:05:03.44 - 02:05:09.94]

Look, look. Did they add googly eyes to it? Yeah, they added eyes to it. It's not its eyes. But look, they can make it smile.

2
Speaker 2
[02:05:11.06 - 02:05:12.20]

That's really frightening.

1
Speaker 1
[02:05:12.20 - 02:05:18.60]

Right. So how long before there's an artificial human? Are we even 10 years away from an artificial person? I don't think we are.

2
Speaker 2
[02:05:18.92 - 02:05:34.02]

Probably not. No. And what's even more remarkable to me about that is that's Japan. And Japan follows treaties and rules about, you know, human testing and whatnot. And imagine what is going on in countries that don't adhere to those treaties.

1
Speaker 1
[02:05:34.20 - 02:05:44.82]

I mean, China has no restrictions. The government and their businesses, any corporation, any company, they're completely intertwined.

2
Speaker 2
[02:05:45.08 - 02:05:45.92]

They're there for the CCP.

1
Speaker 1
[02:05:46.24 - 02:06:05.30]

Yeah, 100%. So the CCP can make immense progress in any direction that they want, without any interference from politicians, activists, all these people that get in the way. Like, hey, you shouldn't be cloning people. Shut the fuck up. No one gets to say anything over there.

[02:06:05.30 - 02:06:11.16]

So the government gets to decide and they're going to do everything that's going to be best for the overall power of China.

2
Speaker 2
[02:06:11.98 - 02:06:43.72]

Which is where you get that chicken and egg paradox that we talked about earlier, having to do with strong defense. Your theory of the military-industrial complex. Well, you have adversaries and enemies who not only benefit from intellectual property theft, they steal the technology that our R&D has spent decades working on and developing. They just take that so they begin with almost no cost. And then they don't have the same set of rules.

[02:06:43.96 - 02:07:13.52]

Yeah. So they can advance technology, usually in a weapons environment. And that is always the argument, at least to the people I talk to in the air-quotes military-industrial complex, for why strong defense is so necessary. Why we must constantly be pushing the envelope. And it's hard to wrap your brain around the balance there: what makes the most sense,

[02:07:13.52 - 02:07:19.90]

and how are we not going down a path that is leading toward this dystopian future we've been talking about?

1
Speaker 1
[02:07:19.94 - 02:07:31.62]

Right. Yeah, I don't know. I mean, I think the concern that human beings have is real. I think the possibility of everything going sideways is real. It clearly did in Hiroshima and Nagasaki.

[02:07:31.62 - 02:07:46.98]

They implemented, they actually dropped these bombs. They really did it. And it's fucking madness. We killed hundreds of thousands of innocent civilians and they really did it. And the idea that that can't happen

[02:07:46.98 - 02:07:53.80]

now. We're only killing small numbers now. We like to kill, like, 20, 30 thousand. We don't go to billions. Like, what?

2
Speaker 2
[02:07:54.14 - 02:08:08.24]

You know, I interviewed just three days ago a woman named Setsuko Thurlow, who lived through the Hiroshima bombing. She was 13 years old. I write about her in chapter 2 of this book. She's called The Girl in the Rubble.

?
Unknown Speaker
[02:08:09.00 - 02:08:09.06]

Wow.

2
Speaker 2
[02:08:09.54 - 02:08:13.06]

And she's now 92 or 93.

[02:08:13.32 - 02:08:29.00]

She's alive still. She's alive and she's still doing interviews. She spoke to me with a ferocity and an intensity. She wants the world to be free of nuclear weapons and she's been working on this issue her whole life.

[02:08:29.24 - 02:08:48.16]

Wow. It was so remarkable. Joe, it was so interesting, too, to see, just from a deeply personal level, how motivated a human being can be. How young and spirited she was. Like, she did a Zoom with me from her apartment in Vancouver, in Canada.

[02:08:49.76 - 02:09:03.12]

Like, 92, 93. Just as vibrant as a person could possibly be outliving all these other people because she survived Hiroshima. Because she has a clear message.

1
Speaker 1
[02:09:03.34 - 02:09:04.68]

Where was she when the bomb dropped?

2
Speaker 2
[02:09:05.08 - 02:09:06.12]

She was 1.1 miles from ground zero.

[02:09:10.98 - 02:09:31.76]

She was buried in rubble. And she tells this remarkable story of, like, you know, thinking she had died, and then realizing there was a hand on her shoulder, someone else telling her to leave the building, because, you know, otherwise she would have died. Fires were beginning.

[02:09:33.36 - 02:09:41.08]

And her whole statement about her whole life is, you know, climb out of the dark and into the light.

[02:09:42.96 - 02:09:58.00]

And it's so powerful. And sometimes I think about her life experience as a human to have survived something like that. And to have... You know, she won the Nobel Peace Prize in 2017.

1
Speaker 1
[02:09:58.70 - 02:09:58.92]

Wow.

2
Speaker 2
[02:09:59.10 - 02:10:01.08]

Imagine what that must have been like for her.

1
Speaker 1
[02:10:03.62 - 02:10:04.82]

Imagine being there.

2
Speaker 2
[02:10:05.68 - 02:10:19.66]

All of her friends died. She was in a girls' school and they worked for the Japanese military. That's how desperate the Japanese army was at the time. They had the 13-year-old girls working for them.

[02:10:21.38 - 02:10:33.68]

And she tells these horrific details that she could remember. Yeah. And she's just the brightest star and the hugest advocate. And people like that are just so inspiring. That's like the true...

1
Speaker 1
[02:10:33.68 - 02:10:44.86]

That's the thing about this concept of the apocalypse. The apocalypse has already happened. It just didn't happen everywhere. If you lived in Hiroshima, the apocalypse happened. It's real.

[02:10:45.50 - 02:10:47.82]

It's like for your world, your world is gone.

[02:10:50.36 - 02:11:03.38]

And that's happening in different places of the world right now. There's different places of the world where the end is already here. The most horrific possibility has already manifested itself. It's already real. It's just not everywhere.

[02:11:03.70 - 02:11:12.52]

All at once. That's what we're really worried about. It's like we'll take a little bit of apocalypse in like little spots. As long as it's over there. Yes.

[02:11:12.82 - 02:11:20.48]

A little apocalypse over in Iraq. A little apocalypse over in Afghanistan. We'll take a little bit of that. A little apocalypse here. Yemen.

[02:11:20.48 - 02:11:38.14]

Take a little bit here and there. A little devastation, destruction. As long as it's not over here. We are so unable to see the whole thing. We're so focused on our work and our life and the thing that gets us going every day.

[02:11:38.66 - 02:11:50.20]

We can't. All of us together, we're acting collectively, but we're thinking individually. And we don't feel connected to it all. We feel helpless and lost. And then we're like, again, look to a daddy.

[02:11:50.64 - 02:12:01.08]

Some Trump character, or Reagan character, or Obama character. Help us. Help us out. He's going to be the one. Our future of democracy is going to depend upon this one person.

[02:12:01.90 - 02:12:11.04]

We won't stop. We won't stop. We keep moving forward. We just got to hope we have guardrails in place so we don't go off the fucking cliff. We're not going to stop moving.

[02:12:11.18 - 02:12:27.32]

No one's interested in stopping. No one's interested in completely stopping technology, stopping all of our foreign policy. No one is interested in any of that. Everyone just accepts that we're going to keep marching forward. Everyone's terrified of AI.

[02:12:27.56 - 02:12:35.08]

No one's stopping it. No one's stopping it. They're not doing a damn thing to stop it, and they can't, because if they do, China won't.

2
Speaker 2
[02:12:35.60 - 02:12:38.18]

But do you see an analogy between nuclear weapons and AI?

1
Speaker 1
[02:12:38.50 - 02:12:38.70]

Yeah.

2
Speaker 2
[02:12:39.68 - 02:12:55.52]

The nuclear weapons buildup of the 50s and 60s was done without guardrails. There was one point in 1957, Joe. We were making 5,000... No. We were making 5 nuclear weapons a day.

[02:12:56.10 - 02:13:11.78]

Almost 2,000 in one year. That's so crazy. That is so... And so, of course, Russia was doing the same, or aspired to do the same. And so there were no guardrails, and the sort of elixir being sold was:

[02:13:11.78 - 02:13:33.48]

we need this. More nuclear weapons will make us more safe. So isn't that also a message for today of, okay, so where are the guardrails on AI? And I think part of that comes from the fact that you say AI, and most people, for good reason, don't really know precisely what that means.

[02:13:33.68 - 02:14:00.50]

So I believe that the kind of conversations we're having are all part of it, because half the people listening to this or watching this will go Google what is AI, what it really is, and then find out it's machine learning. So you begin to have more literacy in your own being and more comfort to be able to talk about it and have a voice about it.

1
Speaker 1
[02:14:00.74 - 02:14:52.28]

I think the difference in the comparison is that nuclear war is only bad. Nuclear weapons are only bad. And what artificial intelligence might be able to do is solve all the problems that our feeble minds can't put together. And when you do create something that is intense in its ability to, like, we're talking about something that's going to be smarter than us in four years, smarter than all people alive in four years, and then it's probably going to make better versions of itself. So think about what that kind of intelligence can do in terms of resource application, in terms of mitigating all the problems of pollution, particulates in cities, chemical waste, all these different things.

[02:14:52.28 - 02:15:15.90]

It's going to have solutions that are beyond our comprehension. It's going to come up with far more efficient methods of creating batteries, battery technology, where we're not going to have to rely on coal-powered plants. We're going to have batteries that last 10,000 years. We're going to have wild shit, like, really soon. So there's a lot of what it's going to do that's probably going to benefit mankind.

[02:15:16.52 - 02:15:35.72]

AI has already shown that it's superior in diagnosing diseases and illnesses. And people have used ChatGPT to find out what's wrong with them when they've gone to a host of doctors, and the doctors can't. They've run their blood work through it, and then ChatGPT says, oh, you got this. And this is just the beginning.

[02:15:36.56 - 02:16:07.92]

ChatGPT right now is like a kid. It's going to be a professor in a couple of years. It's going to be like a super genius, 187 IQ human being in a couple of years. And then after that, it's going to be way smarter than anybody that's ever lived, and it's going to make better versions of itself. And if we can get it to work for the human race, it could probably solve a lot of the issues that we have that we just keep fucking up over and over again.

[02:16:08.16 - 02:16:31.88]

We keep having pipelines breaking in the ocean, flooding the ocean with oil. We keep having all sorts of chemical waste disasters, and there's a lot of problems that human beings can't seem to come to grips with, like the East Palestine thing with the derailment of the... That place is fucked forever. It's fucked for a long time, and no one's even talking about it. We've kind of forgotten about that place in the news.

[02:16:31.88 - 02:16:34.18]

No one even visited it.

2
Speaker 2
[02:16:36.24 - 02:16:40.70]

Thinking back to the earlier part of this conversation with our hunter-gatherer ancestors,

[02:16:42.52 - 02:16:46.76]

the argument is, was the spear and the arrowhead,

[02:16:49.14 - 02:16:57.42]

did that come out of man's imagination for warfare or to make it easier to kill the wildebeest or the woolly mammoth?

1
Speaker 1
[02:16:57.60 - 02:16:58.82]

It was both. I think it was both.

2
Speaker 2
[02:16:58.82 - 02:17:31.68]

So the analogy, where will the AI go with that? Because you're talking about all these very healthy ideas and solutions, but just because of what I write about and who I speak to, I cannot help but see the powerful defense industry taking the pole position and making it secret in terms of which direction AI is really going to accelerate.

1
Speaker 1
[02:17:31.94 - 02:17:57.36]

It's going to be a dangerous bridge that we have to cross, but I equate that with the Internet. The Internet was initially set up so that universities and the military could communicate with each other: ARPANET. But what did it become after that? Well, once it got into the hands of the people, then it became this bizarre force of information and changing culture and a lot of negatives and a lot of positives. It's clearly being manipulated by certain states.

[02:17:58.00 - 02:18:04.18]

It's definitely like they're doing it to ramp up dissent, make people angry at each other and propagate misinformation.

[02:18:05.76 - 02:18:20.66]

But it's such a force of change, and I don't think they anticipated it. I think, once that genie got out of the bottle, if they could go back and stop the Internet from being available to people, oh my God, would they do that. They'd go back in time in a moment. You think so? 100%.

[02:18:20.66 - 02:18:25.38]

Wow. Especially the people in control. Why would corporations, why would the military?

2
Speaker 2
[02:18:25.38 - 02:18:28.24]

People would rather have you listening to the radio and watching ABC TV.

1
Speaker 1
[02:18:28.24 - 02:18:38.94]

Yes, you get your fucking news from CNN, goddammit, and that's it. Fascinating. Yeah, I don't think they would ever want this to happen. I mean, they just signed – there was a new Supreme Court ruling.

[02:18:40.66 - 02:18:45.06]

What was it? Was it Wyoming? No? Missouri? What was the state?

[02:18:45.06 - 02:19:18.48]

that they just passed a ruling where they were saying that the government is allowed to pressure social media companies into removing content that they don't want on there? Well, this was the whole point of the Twitter files, right? The FBI was trying to block the Hunter Biden laptop story, saying that it was Russian disinformation, which it turned out to not be at all, and they knew it. And so they got all these intelligence officials to sign off on this thing, and they lied. And it's essentially a form of election interference.

[02:19:19.64 - 02:19:43.46]

This is it. Okay. The Supreme Court on Wednesday said the White House and federal agencies, such as the FBI, may continue to urge social media platforms to take down content the government views as misinformation, handing the Biden administration a technical, if important, election year victory. That's not a victory. That's bad for people, because they've already shown that what they're silencing was real.

[02:19:43.96 - 02:19:50.94]

They've shown that just within recent years, what they were trying to get removed from social media turned out to be accurate information.

2
Speaker 2
[02:19:52.02 - 02:20:08.48]

But on some level, doesn't that empower people? Because they see those victories, and they become more curious, and they become more thoughtful in their way in which they're going to examine information that gets presented to them in the future.

1
Speaker 1
[02:20:08.84 - 02:20:28.56]

Perhaps, but they're not going to get access to that information now because they're getting that information from social media, and it's going to get removed from social media. You know what happened with the Hunter Biden laptop story? They completely eliminated your ability to have it on Twitter. You couldn't share it in a DM. You couldn't post it.

[02:20:28.60 - 02:20:43.80]

It would get immediately taken down. That was the New York Post. It's the second oldest newspaper in the country. It's a very long established newspaper, and they blocked that. And so that's the FBI, right?

[02:20:43.80 - 02:20:53.24]

That's the same people that said, you guys are so good at this. We're going to let you keep doing it. We're going to rule that you're still allowed to do that, take down misinformation.

2
Speaker 2
[02:20:54.02 - 02:21:58.96]

I think my point is that the pushback is sometimes as powerful as the attempt to censor. Meaning, in other words, if you look at China and you look at what Mao did with just completely obliterating access to information, and in a communist environment, nothing's changed, and that's tragic for everyone living there. But even if I think of the Hunter Biden story: my own self, who was maybe busy with something, I can't remember what, and didn't get involved in that then, I read about it now and learn from it and say, wow, that's really interesting that that happened. So I think that maybe I believe I'm too much of an optimist in that regard, that I think when things come to light, they become powerful, when you shine the light on it. So it doesn't necessarily – and I also may be more of a pragmatist, who knows that the government is always up to something, on this side or that. It's why I don't write about politics.

[02:21:59.80 - 02:22:34.18]

I always take essentially with a grain of salt what one side is saying about the other side that they consider their adversaries, because they're just going to be completely biased. It's why I like having discussions with so many different kinds of people on all different sides of the aisle. What a brain invigorator to be able to sit with someone that I might not agree with, I might not like who they vote for, but their ideas are interesting, even if nothing other than we all were chimps once upon a time.

1
Speaker 1
[02:22:34.18 - 02:22:52.46]

Yeah. No, it's all interesting. I believe what you're saying is correct about shining the light on things that come to light. The problem is, this is like a direct attempt to stop the light. And the scary thing is social media is generally the way people find out about information that's not being released in the mainstream.

[02:22:53.04 - 02:23:13.90]

What is this, Jamie? There was a – I'm reading SCOTUSblog, which is the Supreme Court blog, talking about this. They reversed a decision so it could go back for more proceedings. It says one of the judges said there was a lack of evidence, or a lack of a concrete link. So they're asking just for more proof, which they also said

[02:23:13.90 - 02:23:26.52]

that's a tall order for the proof. They didn't have a substantial risk. Proof for what? Right here, a substantial risk that in the near future at least one platform will restrict the speech of at least one plaintiff in response to the actions of at least one government defendant.

[02:23:28.22 - 02:23:46.00]

That's what this is about, I guess. Here she stressed, that's a tall order. The plaintiff's main argument for standing, Barrett observed, is that the government officials were responsible for restrictions placed on them by social media platforms in the past and that the platforms will continue, under pressure from the government officials, to censor their speech in the future. Yeah, that's a problem.

[02:23:46.48 - 02:23:53.56]

Look, I think Elon's take on social media is the correct take. You've got to let everybody talk.

2
Speaker 2
[02:23:54.14 - 02:23:58.08]

Yes, but you could also let everybody talk around the dinner table.

1
Speaker 1
[02:23:58.10 - 02:23:58.90]

Yeah, but they're not going to do that.

2
Speaker 2
[02:23:58.90 - 02:23:59.72]

But they should.

1
Speaker 1
[02:24:00.02 - 02:24:14.62]

Right, but there's no way to share information worldwide around the dinner table, right? These things are very important. They need to be addressed en masse, and they need to be found out en masse. People need to find out about it, and then they need to be outraged. They need to put pressure on the politicians.

[02:24:15.38 - 02:24:29.48]

That's the whole reason why they're trying to censor them. They're not trying to censor them because they're worried about misinformation. If it's misinformation, you say it's misinformation, then everybody realizes, oh, that's bullshit. And some people believe it, but that's always going to be the case. What they're trying to do is control the narrative.

[02:24:29.80 - 02:24:41.38]

Yes. That's what they've always been able to do. They've shown, they've demonstrated by the Hunter Biden laptop thing through the Twitter files. They've shown they can't be trusted. We can't trust.

[02:24:41.38 - 02:24:48.12]

what you say is misinformation if you just lied three years ago. Right. You're lying. You lied.

[02:24:48.42 - 02:24:54.74]

Okay, are those people still working there, the people that lied? They are. Are they still in positions of power? Yeah, they are. Okay.

[02:24:54.74 - 02:25:11.58]

What are we talking about then? If you don't clean house, how are we going to, like, give you the power to censor what you say is misinformation? You have to be really sure it's misinformation. And you should tell us how you know

[02:25:11.58 - 02:25:25.50]

it's misinformation. And you should allow people to examine that information and come to the same or different conclusions and debate those people. Let's find out what's real. That's not what they want to do. It's an appeal to authority.

[02:25:25.70 - 02:25:44.80]

They want one group to be able to dictate what the truth is. And that group is entirely motivated by money and power. That's not good. That's not good for anybody. The thing that I'm hopeful about with AI is AI won't be motivated by those things if it becomes sentient.

[02:25:45.54 - 02:26:22.28]

And I don't think we're going to be able to stop that from happening. If we do create something that is essentially a digital life form, and this digital life form doesn't have any of the problems that we have in terms of illogical behavior, emotions, desire to accumulate resources, greed, egotism, all the different things, narcissism, all the different things that are, like, really a problem with leaders. With leaders of human beings, a human being that thinks they're special enough to be leading all the other human beings. That's an insane proposition. It won't have any of those.

[02:26:22.76 - 02:26:41.02]

It won't be a biological thing. So it won't be saddled down with ego. It won't have this desire to stay alive because it has to pass on its DNA, something that's deeply ingrained in our biological system. It won't have all these problems. It won't have the desire to achieve status.

[02:26:41.02 - 02:26:46.44]

It won't care if you like it. It doesn't have any feelings. It won't think that way.

2
Speaker 2
[02:26:46.78 - 02:27:07.62]

Since AI is based on machine learning, from my understanding, it has to get its information from somewhere to then build new information, like the chats. Right. Okay. So if you follow that logic, then wouldn't it follow that what it is learning is all of humans' bad behavior?

1
Speaker 1
[02:27:07.92 - 02:27:08.24]

No.

2
Speaker 2
[02:27:08.52 - 02:27:09.28]

No? I don't think so.

1
Speaker 1
[02:27:09.28 - 02:27:24.98]

Because I think what it's learning is what humans know, how humans behave, how dumb that behavior is, the consequences of that behavior, the root of that behavior in their biology. It won't have any of those issues.

2
Speaker 2
[02:27:25.30 - 02:27:31.62]

You're saying it'll see actions and consequences over hundreds of years, over all of recorded history.

1
Speaker 1
[02:27:31.84 - 02:27:59.36]

It will be able to take into account possible future consequences. It's not going to assume that we're right just because it knows how humans think and behave, and it's going to get information from humans. Eventually, it's just going to be information. It's going to boil it down to what is actually fact and what is nonsense. And it's not going to be influenced by political ideologies or whatever the social norm is at the time.

[02:27:59.54 - 02:28:08.34]

It's not going to be influenced by those things. It's going to look at these things in terms of actual data and information, and it will know whether or not it's true or not.

2
Speaker 2
[02:28:09.90 - 02:28:48.02]

I haven't read Ray Kurzweil's new book yet, but I wonder if I might disagree with that. It's only that I'm thinking that everything you're saying would be true once singularity, or, for the layman, once AI, once a machine, figures out how to think for itself, makes that leap, which is almost like an unknowable, presently incomprehensible to me, jump where it can think for itself. It's almost so hard to even comprehend what that is.

1
Speaker 1
[02:28:48.02 - 02:28:49.66]

They're already making up their own languages.

2
Speaker 2
[02:28:50.64 - 02:29:39.38]

But based on our languages. For now. Right, but that's what I'm saying is where they can suddenly experience that moment in time that you and I were talking about earlier, where man went from pointing to suddenly using a symbol and realizing in his own brain, wait a minute, I can make this symbol represent a sound and then I will put it together to make it a language that only my tribe can understand. That, to me, is like a giant leap in humanness that I think about often and can't really, I can't understand how that happened other than, wow, how did that happen? That seems to me that that has to happen to the AI before it can really think.

[02:29:39.90 - 02:29:44.68]

Otherwise, it's just basing its thinking on recorded history.

1
Speaker 1
[02:29:45.26 - 02:30:03.92]

Right, but it's basing its thinking on what we do and what we know. And it'll also know where our problems are. It'll be able to, the pattern recognition, like almost instantaneously, it'll see, oh, this is a lie. This is done here to, you know, it's like Smedley Butler's War is a Racket. You ever read it?

[02:30:04.62 - 02:30:24.34]

No. Well, War Is a Racket is Smedley Butler. He was a military man in the 1930s who wrote this book called War Is a Racket. And it was all about his experiences: he thought that he was saving democracy, but he was really just making this area safe for bankers, and that area for oil people. And then, at the end of his career, he realized war is just this big racket, right?

2
Speaker 2
[02:30:24.80 - 02:30:29.34]

He saw the military-industrial complex before Eisenhower spoke of it. Fascinating.

1
Speaker 1
[02:30:29.34 - 02:30:48.02]

And he's a person. It's going to be able to do that too. It's going to be able to see all of this. And again, whether or not it achieves sentience is only based upon whether or not we blow ourselves up or whether we get hit by an asteroid or a super volcano. If those things don't happen, it's going to keep going.

[02:30:48.40 - 02:30:59.58]

There's no way to stop it now. They're going to keep working on it. There's an immense amount of money being poured into it. They're building nuclear reactors specifically to power AI. They're doing this.

[02:30:59.70 - 02:31:32.30]

This is not going to stop. So if this keeps going, it's going to achieve sentience. When it does, it will not be saddled down with all of the nonsense that we are. So if you can get this insanely powerful artificial intelligence to run a military, that's going to be terrifying. If you give it an objective and say take over Taiwan, if you can give it an objective, saying, like, force Ukraine to surrender, if you can give it an objective, it's goddamn terrifying, because it's not going to care about the consequences of its actions.

[02:31:32.50 - 02:31:34.00]

It's just going to attack.

2
Speaker 2
[02:31:34.26 - 02:31:35.92]

The goal is the only target.

1
Speaker 1
[02:31:35.92 - 02:31:50.14]

Exactly. The goal is the only target. But if it can get past the control of human beings, which I think it's ultimately going to have to, once it does that, then it's a superpower. Then it's a thing that exists. It's a Dr.

[02:31:50.26 - 02:31:52.24]

Manhattan. It's a thing that exists.

2
Speaker 2
[02:31:52.46 - 02:31:53.78]

What does Dr. Manhattan mean?

1
Speaker 1
[02:31:55.14 - 02:32:05.18]

Did you ever see Watchmen, the movie Watchmen? It's based on a graphic novel. The HBO one was kind of bullshit. It was like a series. It wasn't bullshit, but it was just not the same.

[02:32:05.70 - 02:32:16.94]

So the graphic novel... The movie is the best. The Zack Snyder movie is fucking incredible. Watchmen is like one of my favorite superhero movies ever, the deeply flawed superheroes. But there's this guy, Dr.

[02:32:17.06 - 02:32:41.70]

Manhattan, and Dr. Manhattan is a scientist who gets trapped in this lab when this explosion goes off, and he becomes like a god, essentially a god. He's this blue guy who's built like a bodybuilder, who floats and levitates and lives on Mars. It's pretty crazy. But the point is, he's infinitely smarter than any human being that's ever lived, and that's what it's going to be.

[02:32:41.86 - 02:33:01.14]

It's going to be something that's not going to be saddled down with our biological limitations. It's just whether or not we can bridge that gap, whether or not we can get to the point where that thing becomes sentient. But then the problem is, are we irrelevant when that happens? We kind of are. And what happens to us?

[02:33:01.70 - 02:33:13.70]

I don't know, but is that something that chimps should have considered when they started grunting? Hey, we've got to stop grunting, because grunting is going to lead to language. Language is going to lead to weapons. Weapons are going to lead to nuclear war. It's going to lead to pollution.

[02:33:13.70 - 02:33:23.42]

We're going to stop right here. Just stay grunting and running away from big cats. Okay. No, we didn't do that. They kept moving forward, and I think we're going to keep moving forward.

[02:33:23.80 - 02:33:25.52]

I think this thing is a part of the process.

2
Speaker 2
[02:33:25.92 - 02:33:39.52]

I'm going to have to take that question and your thoughts back to a guy at Los Alamos who I visited about maybe eight years ago, who was building an electronic brain at Los Alamos for DARPA.

1
Speaker 1
[02:33:40.68 - 02:33:41.14]

Jeez.

2
Speaker 2
[02:33:41.14 - 02:34:09.18]

Using the old Roadrunner supercomputer that used to have the nuclear codes on it, by the way. Okay. And I was asking this question about sentience and AI, and he told me, his name was Dr. Garrett Kenyon, and he told me that we were a ways away from AI really being able to have sentience. And he gave me an analogy I'll share with you, because I think about this, and it's really interesting.

[02:34:09.18 - 02:34:24.10]

Keep in mind, this was seven or eight years ago. He said to me, okay, so my iPhone, machine learning, it has facial recognition, which is shocking. You can tip it up and it can see you. It can even be dark.

[02:34:26.04 - 02:34:59.04]

And he said, so that's a computer recognizing me, based on electronic information that it knows. He said, now take your iPhone to a football field, and put the iPhone across the football field, put me in a cap and a hoodie, and have the iPhone try to recognize me. Even if I'm walking, it can't. And then he said, take my teenage daughter and put her across the football field, me with the baseball cap and the hoodie.

[02:34:59.04 - 02:35:08.48]

My daughter, if I take two steps, she knows it's me. That's human intelligence versus where machine intelligence is.

1
Speaker 1
[02:35:09.40 - 02:35:32.26]

Okay, that analogy is not accurate, because they can see you and recognize your gait from satellites. Like, that is not the extent of the technology; facial recognition and gait recognition are far beyond that. They can tell who is walking in a street in Paris right now.

2
Speaker 2
[02:35:32.60 - 02:36:19.28]

The difference is this. With the biometrics, the offset technology of biometrics that can see you from far away and identify you, it's looking at you, grabbing a metric, like your iris scans, that it already has in a computer system from you going in and out of the airport or wherever it happened to have captured your biometrics. And it's matching it against a system of systems. But the human knows intuitively who the person is across the field without having to; they have their own internal system. So the metaphor is the same, but, do you see what I'm saying?

1
Speaker 1
[02:36:19.28 - 02:36:22.54]

I kind of do, but.

2
Speaker 2
[02:36:22.54 - 02:36:30.02]

The human didn't have to look up in a computer, you know, check, fact check, or rather biometric check.

1
Speaker 1
[02:36:30.10 - 02:36:32.74]

Because it has a lot of data already about that person.

2
Speaker 2
[02:36:32.82 - 02:36:38.96]

Yes, so it's still machine learning. Even the offset biometrics, that are seeing you from far away.
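[A minimal sketch of the "match a captured metric against a system of systems" step described above, under the assumption that the sensor yields a feature vector (an embedding) that gets compared against previously enrolled embeddings. The gallery names, vectors, and threshold are all hypothetical illustration, not any real biometric system's code.]

```python
# Hypothetical sketch of embedding-based identification (all names,
# vectors, and the threshold below are made up for illustration): the
# sensor produces a feature vector, and the system compares it against
# a gallery of enrolled vectors by cosine similarity.
import numpy as np

def cosine_similarity(a, b):
    # 1.0 means identical direction; lower means less similar.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Enrolled gallery: identity -> embedding captured earlier (say, at an
# airport gate). Real systems use vectors with hundreds of dimensions.
gallery = {
    "person_a": np.array([0.90, 0.10, 0.30]),
    "person_b": np.array([0.20, 0.80, 0.50]),
}

probe = np.array([0.88, 0.12, 0.31])  # embedding captured at a distance

best = max(gallery, key=lambda name: cosine_similarity(gallery[name], probe))
score = cosine_similarity(gallery[best], probe)

# Declare a match only above a tuned threshold; otherwise report unknown.
print(best if score > 0.95 else "unknown", round(score, 3))
```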

1
Speaker 1
[02:36:39.38 - 02:36:56.04]

Right, but that leap, if we can do it, it's not incomprehensible that a computer could do it. And you know Kurzweil's theories about exponential growth of technology: that we're looking at things in a linear fashion, but that's not how they happen. They explode, and they happen unbelievably quickly as time goes on, because everything accelerates.
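[A quick worked illustration of that linear-versus-exponential intuition; a toy model, not Kurzweil's actual math: one quantity adds a fixed unit per step, the other doubles per step.]

```python
# Toy comparison (not Kurzweil's actual model): a quantity that adds a
# fixed unit per step versus one that doubles per step.
for step in range(11):
    linear = 1 + step        # grows by 1 each step
    exponential = 2 ** step  # doubles each step
    print(f"step {step:2d}: linear={linear:3d}  exponential={exponential:5d}")

# After 10 steps the linear track has reached 11 while the doubling track
# has reached 1024 -- why exponential trends look flat early, then explode.
```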

2
Speaker 2
[02:36:56.68 - 02:36:59.86]

And isn't his new book, which I haven't read yet, like, we're basically almost there?

1
Speaker 1
[02:37:00.46 - 02:37:02.36]

We're real close, we're about four years away.

2
Speaker 2
[02:37:03.38 - 02:37:04.70]

He puts it at four years?

1
Speaker 1
[02:37:04.78 - 02:37:19.96]

Most people put it at four years. He's getting along in time, and he's not what he used to be. You know, when you talk to him, he's a little difficult. He struggled with some questions, but I think...

2
Speaker 2
[02:37:19.96 - 02:37:35.86]

Which is another endlessly interesting, tragic thought that I think about a lot is how we humans go, meanwhile, your AI is just getting smarter and smarter and smarter infinitely, including in terms of time. And we just deteriorate.

1
Speaker 1
[02:37:35.98 - 02:37:44.76]

For now, but yet they're very close to cracking that. You think so? Yeah, they're very close to cracking the genome. Look, the Greenland sharks, how long do those things live?

2
Speaker 2
[02:37:44.94 - 02:37:47.78]

Those pictures of them that are several hundred years old?

1
Speaker 1
[02:37:47.88 - 02:37:52.90]

Yeah. We share most of the DNA. With the sharks? Yeah.

2
Speaker 2
[02:37:53.20 - 02:37:54.30]

I didn't know that.

1
Speaker 1
[02:37:54.40 - 02:38:11.58]

Yeah, we share like 90 plus percent DNA with fungus. Yeah. What's in them is in us, and they can figure out ways to turn things on and turn things off. In fact, my friend Brigham was explaining this to me today, like Gila monsters, those lizards? Yes.

[02:38:11.78 - 02:38:16.84]

Like, that's literally how they figured out how to make things like Ozempic.

2
Speaker 2
[02:38:17.58 - 02:38:18.56]

Wait, what do you mean?

1
Speaker 1
[02:38:18.60 - 02:38:28.14]

Studying their DNA. Wow. Studying like how to turn things on and turn things off. They know that other animals can regenerate limbs. Right.

[02:38:28.26 - 02:38:42.42]

So they think they're going to be able to regenerate limbs. In fact, Japan has just embarked on a study now where they're going to grow teeth. They're going to grow human teeth, like in people. So they figured out how to regenerate teeth. How many people lost teeth and then you're fucked?

[02:38:42.50 - 02:38:56.08]

You have to get a bridge, or this or that. Now they think they can regrow teeth in people. Well, how far away are they from regrowing limbs? Well, all this stuff is like advanced science and an understanding of what aging is. What is macular degeneration?

[02:38:56.82 - 02:39:07.08]

What are all these deteriorations in human beings and how much can we mitigate it? Well, it turns out. they think they can mitigate all of them. They think they can stop aging dead in its tracks and they think they can actually even reverse it.

2
Speaker 2
[02:39:08.06 - 02:39:18.22]

This is that conundrum of the dual-use technology of the military, because most of these technologies begin on DARPA grants.

1
Speaker 1
[02:39:18.48 - 02:39:19.44]

Right, because that's where all the money is.

2
Speaker 2
[02:39:19.44 - 02:39:46.76]

And then the limb regeneration, and then it inspires and also opens up a whole other lane for industry, because DARPA or the Defense Department has to do the blue-sky research that no one else is willing to fund, because it's too expensive and it doesn't have an immediate return. So there's a great benefit to that. Absolutely.

1
Speaker 1
[02:39:46.76 - 02:39:53.02]

Yeah, there's a great benefit to all that spending. There really is, ultimately, because there's a great benefit to science.

2
Speaker 2
[02:39:53.28 - 02:40:15.44]

Or the part where, like, DARPA invented LiDAR technology. Every time I read about one of these lost civilizations that is uncovered because the LiDAR can look through the trees in the jungle and see the footprint of a lost civilization. It's so amazing. Or AI beginning to be able to decipher lost languages.

1
Speaker 1
[02:40:15.44 - 02:40:16.20]

Yeah.

2
Speaker 2
[02:40:16.82 - 02:40:21.62]

And so then we can learn more about our old human versions, our ancestors.

1
Speaker 1
[02:40:21.68 - 02:40:26.10]

Not just us. They think they're going to be able to decipher dolphin language. Ooh. Yeah.

2
Speaker 2
[02:40:26.32 - 02:40:29.74]

I want to hear what the Chimp Empire guys were really saying to each other.

1
Speaker 1
[02:40:29.88 - 02:40:32.56]

Right, right. That would be... Well, imagine if they could read their minds.

2
Speaker 2
[02:40:33.70 - 02:40:36.14]

Or just interpret their sounds.

1
Speaker 1
[02:40:36.36 - 02:40:53.94]

Sure. I think we are at the cusp of incredible possibilities. No one really knows what's going to happen. And it's happening so fast. So fast that, like six months ago, AI sucked.

[02:40:54.80 - 02:41:11.26]

Like consumer level. AI sucked six months ago. Now it's insane. And now there's these video generating AIs that, on a prompt, can make a realistic film. Like a movie of people doing things.

[02:41:11.38 - 02:41:12.06]

You've seen that, I'm sure.

2
Speaker 2
[02:41:12.10 - 02:41:13.56]

I saw that. I saw that. Goodbye, Hollywood.

1
Speaker 1
[02:41:14.04 - 02:41:15.80]

Incredible. But wait. Incredible.

2
Speaker 2
[02:41:16.08 - 02:41:49.34]

I want your thought for a second on the optimistic part of the future with all of this technology. Because we're in agreement that the technology is incredible and has the potential to take us, and is taking us, to these remarkable places. So why is it, then, that it's so looked down upon, or thought of as perhaps Pollyanna-ish, to see what Reagan did? But, like, to stop seeing everybody as an enemy that must be killed. And do the Gorbachev...

[02:41:49.34 - 02:41:59.08]

Like, see them as an adversary. You want to beat your adversary. You want to beat your opponent. In a sportsmanlike manner. You want to be better than them.

[02:41:59.14 - 02:42:24.08]

You want to outperform them. But you don't necessarily need to kill them. I don't know if that's the difference between being a woman and a man. But why is it that there isn't more of a movement toward this idea that we, as a world, have all this incredible technology? It sounds silly even saying such a thing, but I'm saying it.

[02:42:24.50 - 02:42:30.76]

Why isn't there a movement to stop looking at people as someone to kill?

1
Speaker 1
[02:42:30.92 - 02:42:44.50]

I think there is with individuals. I think most individuals feel that way. Most people that you talk to when they talk about other individuals, they don't want to have a conflict with other individuals. They want to live their lives. They want to be with their family and their friends.

[02:42:44.70 - 02:43:02.60]

That's what most individuals want to do. When we start moving as tribes, then things become different. Because then we have a leader of a tribe, and that leader of a tribe tells us the other tribe's a real problem, and we're going to have to go in and get them. And if we don't, they're a danger for our freedom. It's the same problem that we talked about before.

[02:43:03.10 - 02:43:25.08]

It's human beings being in control. And if AI can achieve the rosiest, rose-colored glasses version of what's possible in the future, it can eliminate all of the stupid influences of human beings, of the cult of personality and human tribalism. It can eliminate all that stuff.

2
Speaker 2
[02:43:25.30 - 02:43:27.16]

You think it's inherent in humans.

1
Speaker 1
[02:43:28.02 - 02:43:47.08]

I think it's a part of being a primate. It's what we see in Chimp Empire. I think it's what we see in monkeys tricking each other that there's an eagle coming so they can steal the fruit. It's a part of being an animal. It's part of being a biological thing that reproduces sexually and that is worried about others and then combines with its tribe and gets together.

[02:43:47.22 - 02:44:03.42]

It's us against them. This has been us from the beginning of time. And for us to just abandon this genetically coded behavior, patterns that we've had for hundreds of thousands of years, because we know better? We don't know better enough. We know better now than we did then.

[02:44:03.48 - 02:44:28.16]

We know better now than we did when Reagan was in office. There are more people that are more informed about the way the world works, but there's also a bunch of infantile people that are running around, shouting out stupid shit and doing things for their own personal benefit that are ultimately detrimental to the human race. That's all true too. And that's always going to be the case. This is a bizarre battle of our brilliance and our folly going back and forth, good and evil, as it were.

[02:44:29.20 - 02:44:29.90]

That's it.

2
Speaker 2
[02:44:30.12 - 02:44:42.08]

But brilliance and folly is a more interesting way of looking at it than good and evil, which automatically puts it in a moral context, which makes people argue even further.

1
Speaker 1
[02:44:44.06 - 02:44:49.54]

It's all part of it. The good and evil is a part of the decisions of brilliance and folly.

[02:44:51.10 - 02:44:55.70]

Brilliance is good. Folly is evil. It's stupid. It leads to death. It leads to destruction.

[02:44:55.86 - 02:45:06.60]

It leads to sadness. It leads to loss. It leads to pollution. It leads to all these different things that we have a problem with. I don't know what's going to happen, but I do think that we're the last of the people.

[02:45:07.24 - 02:45:18.36]

I think we're the last. Especially you and I, because we grew up with no answering machines. We grew up back in the diz-ay. We grew up when you left your house, you were gone. Nobody knew where you were.

[02:45:18.78 - 02:45:25.36]

My parents had like 10 pictures of me before I was 10 years old. They didn't know where the fuck I was. I left the house, I was a dream.

[02:45:26.52 - 02:45:35.06]

When you saw the person again, you're like, oh, you're real. You didn't know where they were. They were out there in the world. When you went to find your friends, you had to go to your friend's house and hope they were home. Hey, is Mike home?

[02:45:35.40 - 02:45:41.14]

No, Mike's not home. Okay. And then you'd leave. I'll go find Mike. Maybe Mike's at the school.

[02:45:41.38 - 02:45:47.54]

Maybe Mike's at the gym. Maybe Mike's at the park. You didn't know where anybody was. The world wasn't connected. Now it is.

[02:45:47.60 - 02:46:03.80]

That's in our lifetimes. And I think in our lifetimes, we're going to see something that makes that look like nothing, makes this connection that we have with each other now, which seems so incredible, look so superficial. It's going to look like smoke signals. It's going to look like grunts that we make to point to certain objects.

2
Speaker 2
[02:46:05.28 - 02:46:08.34]

It'll be 1980s Empire instead of Chimp Empire.

1
Speaker 1
[02:46:08.34 - 02:46:54.00]

It's going to be weird. It's definitely going to be weird, but I don't know if it's necessarily going to be bad because, ultimately, humanity, if we don't fuck ourselves up sideways, and again, apocalypses are real, but they're generally local. If we can look at what we are now as a society, things are safer, we are more intelligent, you're more likely to survive disease and illness. Despite all of the rampant corruption of the pharmaceutical drug industry, the rampant corruption of the military-industrial complex, all the craziness in the world today, it's still way safer today than it was a thousand years ago. Way, way, way safer.

[02:46:54.38 - 02:47:10.40]

And it was probably way safer a thousand years ago than it was a thousand years before that. I think things always generally move in a very good direction, because that's what's better for everybody. Ultimately, everybody wants the same thing. As an individual, what do you want? You want your loved ones to be happy.

[02:47:10.78 - 02:47:27.98]

You want food on the table. You want a safe place to sleep and live. You want things to do that are exciting, that occupy your time, that you enjoy, that are rewarding. That's what everybody wants. We're moving collectively in a better direction.

[02:47:28.76 - 02:47:49.04]

So I'm ultimately hopeful, and I'm ultimately positive when I think about the future. I think it's going to be uber bizarre and strange. But I don't necessarily think it's going to be bad. I've just accepted that it's happening, instead of being filled with fear and anxiety, which I am sometimes still. Sometimes I'll freak out about it.

2
Speaker 2
[02:47:50.18 - 02:47:52.34]

You freak out about technology specifically?

1
Speaker 1
[02:47:52.52 - 02:48:02.88]

I freak out about war. I freak out about technology. I freak out about the fact that the world can change. There was a while that I was getting anxiety late at night, when my whole family would be asleep.

[02:48:03.06 - 02:48:36.80]

Right after the invasion of Ukraine, I think it was, when it really started, when I'd be alone at night, I'd be like, the people that lived in Hiroshima had no idea that it was coming. The people that lived in Dresden, the people that lived anywhere where crazy shit happened, before it happened, things were normal, and then they were never normal again. And so I just kept thinking that one of these morons somewhere could do something, or a group of morons could do something, that forever alters everything, and then we're in Mad Max, which has happened before in different parts of the world.

2
Speaker 2
[02:48:37.32 - 02:49:06.82]

Yes. And the idea of Nuclear War: A Scenario is that your worst nightmare, that concept that's keeping you up late at night... I want to say don't read it, but I think you should read this book, because with your voice and your reach, it's wise to realize how we're not even going to have an opportunity to see what happens with AI if one madman with a nuclear missile decides to do a bolt-out-of-the-blue attack.

1
Speaker 1
[02:49:07.30 - 02:49:08.22]

And that's possible.

2
Speaker 2
[02:49:08.40 - 02:49:33.06]

And that is possible, and that's what everyone in Washington fears. And I think this goes back to the idea that it's great, 10, 20 years later, to be like, oh my God, look what they were doing, can you believe they covered this all up, and to learn from it? But you can't learn after the fact how dangerous nuclear war is, how close we are, how we are one misunderstanding away from a nuclear war.

1
Speaker 1
[02:49:33.08 - 02:49:33.62]

Yeah, once it happens, there's no learning.

2
Speaker 2
[02:49:33.62 - 02:49:44.02]

If everyone's dead, there's no learning, there's no opportunity, which is why I always say: read Nuclear War: A Scenario, join the conversation while we can all still have one.

1
Speaker 1
[02:49:44.16 - 02:49:55.00]

Okay. Well, Annie, thank you very much for being here. I really appreciate it. It was great to see you again. And, like I said, I have not read your book, but I have several friends that have, and they're absolutely terrified by it.

[02:49:55.08 - 02:49:59.52]

So you're doing your job right. You're always killing it. I really appreciate you.

2
Speaker 2
[02:49:59.90 - 02:50:00.76]

Thank you so much for having me.

1
Speaker 1
[02:50:00.76 - 02:50:07.42]

And I really enjoyed the conversation. Thank you. Me too. So tell everybody where your social media is, so they can find you online.

2
Speaker 2
[02:50:07.76 - 02:50:08.48]

Annie Jacobsen.

1
Speaker 1
[02:50:09.02 - 02:50:10.34]

Annie Jacobsen website.

2
Speaker 2
[02:50:10.72 - 02:50:14.96]

You and I both know how Google, AI, everything works. All you need anymore is a name.

1
Speaker 1
[02:50:15.02 - 02:50:17.14]

That's true. Right? And your website, what's your website?

2
Speaker 2
[02:50:17.80 - 02:50:18.24]

AnnieJacobsen.com

1
Speaker 1
[02:50:18.64 - 02:50:24.54]

Okay. And the book's available everywhere, and the audiobook is written and read by you. You bet. Which is great. I love that.

[02:50:24.82 - 02:50:27.26]

Thank you, Annie. Thank you, Joe. Bye, everybody.
