Joe Rogan Experience #2466 - Francis Foster & Konstantin Kisin

Transcript

0:01

Joe Rogan podcast. Check it out.

0:04

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day.

0:12

>> Okay. So, when we scheduled this,

0:14

there was nothing happening. It was so

0:16

peaceful.

0:16

>> Every time we're here, something crazy

0:18

is going on, man.

0:20

>> Yeah. Uh maybe we manifested it.

0:23

>> To be honest, it did. 2026 did start

0:26

with a bang.

0:27

>> Yeah.

0:28

>> Yeah.

0:28

>> Yeah. Well, a lot of things, you know,

0:32

it's just nothing seems stable

0:34

everywhere. Everywhere in the world

0:36

seems [ __ ] right now. Like this is the

0:38

like in all of my years, this seems the

0:42

most unstable globally.

0:45

>> Like I never worried that the UK was

0:47

going to be like complete chaos,

0:50

arresting 12,000 people for social media

0:53

posts and abandoning trial by jury,

0:57

all that [ __ ] I never thought the

1:00

Ukraine Russia war would go on this

1:02

long. Never thought.

1:04

>> Never thought they would just continue

1:06

bombing Gaza and then what's happening

1:08

now? They just sort of stop and now

1:12

they're talking about putting a resort

1:13

there.

1:16

Like, you hear that and you go,

1:19

are you [ __ ] serious?

1:21

>> Right.

1:21

>> Tim Dillon has a bit. I won't give the

1:24

bit away. He has a [ __ ] phenomenal

1:26

bit about staying in that resort.

1:30

>> Yeah. And you boys have been busy as

1:31

well, Joe.

1:32

>> Yeah. And then I was going to get to

1:34

that, the embarrassing part. Uh uh in

1:38

the middle of Ramadan, you take out the

1:40

leader of a Muslim country and uh

1:44

>> they're hangry already and you're

1:45

[ __ ] with it.

1:46

>> Well,

1:48

and they're really Yeah. They can't even

1:51

drink water, right? And then, you know,

1:54

I was listening to Tim Dillon's podcast

1:55

today. Uh he's got a great podcast with

1:58

uh Ryan Grim and one other gentleman.

2:01

But one of the things that they brought

2:03

up was that some of these drone attacks,

2:07

it doesn't even seem like they're from

2:08

Iran. Some of these drone attacks on

2:11

Gulf States

2:12

>> like that. One of them um I don't want

2:16

to speak out of turn because

2:18

I'm not exactly sure which ones

2:20

they're talking about. They're talking

2:21

about one of them on um

2:25

either it's an oil refinery. I think it

2:28

is an oil refinery. It doesn't seem

2:30

like it came from Iran.

2:32

>> So, okay.

2:32

>> Where did it come from?

2:33

>> That's a good question.

2:34

>> One of their proxies.

2:36

>> That's a good question.

2:37

>> Well, the the fear is a false flag.

2:40

That's the fear. Like if you really

2:41

wanted to get really scared of what

2:43

we're dragged into,

2:46

>> you're dragged into an ally that's not

2:48

telling you the truth and is also doing

2:50

some other stuff.

2:52

>> Well, I'm not even saying that that's

2:54

the case, but a lot of people are

2:56

assuming that that's what it is.

2:58

>> But that's what happens when you have an

3:00

absence of information,

3:01

>> right?

3:02

>> And so the moment you have an absence of

3:03

information, there's a vacuum, and the nature

3:05

of a vacuum is that it needs to be

3:07

filled. So that's where

3:09

conspiracies naturally flourish because

3:11

100% because if people don't have

3:12

information then they're going to

3:14

basically theorize,

3:15

>> right?

3:16

>> And part of people theorizing is

3:19

conspiracies are going to start

3:20

flourishing.

3:21

>> Well, I think they were basing it on the

3:23

where the drone came from like let's see

3:27

if we can find some information on that.

3:28

Jamie,

3:29

>> I will try. It was Jeremy Scahill who

3:31

was the other

3:32

>> That's right.

3:33

>> reporter on the video. I just find it

3:36

amazing now how many people have like a

3:38

hard take on like what's going on. I'm like,

3:40

what the we don't know a [ __ ] thing

3:42

about what's going on. Like the coin is in

3:44

the air, right, and we do not know how

3:45

it's going to land. But everyone's got a take,

3:47

everyone knows

3:49

>> like we do not [ __ ] know. I don't

3:50

think anyone knows. Like, I understand if

3:52

you if you work at the White House or if

3:54

you work in Russian propaganda or you

3:56

work in Chinese propaganda like or if

3:58

you work in Iranian propaganda, you've got

4:00

to get your point of view across to try

4:01

and persuade people but if you're

4:03

actually trying to work out what's

4:04

genuinely happening. I don't think

4:06

anyone knows how. This is a gamble of

4:09

gigantic proportions

4:11

>> and nobody knows how it's going to end.

4:14

Like it's just so unpredictable. And I

4:16

can tell you a great story that is like

4:20

positive for the for the West, let's

4:22

say, or for America. I can tell you a

4:24

terrible story and they both sound very

4:25

convincing and no one knows which one of

4:27

them is true.

4:28

>> Yeah, that's a very good point. This

4:31

is the hot-take culture, right?

4:33

Everyone has a take and they want that

4:35

take to be the expert take.

4:37

>> So uh specific drone attack incidents

4:39

that are called potential false flags. Saudi

4:42

Arabia uh Saudi uh Saudi Aramco rather

4:46

um oil facility attack. So Iranian

4:49

officials deny striking the Saudi Aramco

4:52

processing facility and instead suggest

4:55

Israel may have carried out that attack

4:57

as a false flag to inflame Gulf opinion

4:59

and pull Saudi Arabia more directly into

5:02

the war with Iran. So Ryan Grim

5:04

explicitly says he thinks Iran's claims

5:07

that Israel hit the Aramco facility need

5:09

to be taken seriously and that it's very

5:12

possible Israel did it. And this was the

5:15

other one, the drone strike on the

5:16

British base in Cyprus. That was from

5:18

Lebanon, right?

5:19

>> Yeah.

5:20

>> Is that what they're saying?

5:20

>> Yeah.

5:21

>> May have come from the direction of

5:22

Lebanon. He places this in the same

5:24

context of Iran claiming Israel carried

5:26

out certain attacks in neighboring

5:28

states as false flags to blame Iran

5:31

and drag those countries into the war.

5:33

>> Those countries, this doesn't make any

5:35

logical sense to me because those

5:36

countries are already in the war. I mean,

5:38

Saudi Arabia and UAE have been attacked

5:40

by Iran because

5:43

they were on the phone to Trump

5:45

basically asking him to do this. Here's

5:47

another weird one. The Tucker

5:49

Carlson one.

5:50

>> You saw that, right?

5:51

>> No.

5:52

>> So Tucker Carlson said that

5:54

>> Oh, yeah. Yeah. Yeah. That they had been

5:57

arrested that members of Mossad have

5:58

been arrested in Qatar and Saudi Arabia.

6:01

>> So, uh,

6:02

>> but both Qatar and Saudi Arabia have

6:03

said it's not true.

6:04

>> Yeah. They've officially denied such

6:06

arrests. Their own Saudi sources also

6:08

denied it, though they note denials

6:11

don't prove it didn't happen, and that

6:13

states would almost certainly hide such

6:14

arrests if real. Huh. Well,

6:18

the thing Joe is that these countries,

6:19

so Saudi Arabia and UAE, Qatar less so,

6:23

they want this to happen because they

6:25

also hate Iran or the Iranian regime,

6:27

>> right?

6:28

>> So there is no need for Israel even if

6:30

you you know, if people are tempted to

6:33

believe there's no need for Israel to do

6:34

this because these countries are already

6:37

in it,

6:38

>> right?

6:39

And that one of the reasons Iran has

6:41

attacked things in Saudi Arabia and in

6:44

the UAE is they know that. Right.

6:46

>> Right. So

6:47

>> the Gulf countries are on board with

6:48

this.

6:49

>> Right. So let's assume

6:52

that the false flag thing

6:55

is in play. Why would

6:58

Israel, how would they benefit from

6:59

doing that?

7:00

>> That's what I'm saying. There is no

7:01

rationale that I can think of.

7:03

>> The people that think the false flag is

7:05

real. Like why do they think that? like

7:06

what do they think that Israel would

7:08

benefit from it? Is there a scenario

7:10

where you could imagine it would inflame

7:12

things and further support other

7:15

countries contributing to the I mean

7:18

there's a lot of money that's being

7:19

spent on this war, right? It's like this

7:21

is insane amount of money just for

7:23

munitions, just for missiles.

7:25

>> Yeah. Yeah. And then rebuilding Iran if

7:26

we get to that, right?

7:27

>> Maybe another resort.

7:28

>> Yeah.

7:29

>> So, but I just don't see the rationale

7:31

because the the Gulf countries are

7:33

already targets for Iran, right? There's

7:35

nothing to inflame. Like the situation

7:37

is already pretty [ __ ] inflamed,

7:39

right? And it's partly inflamed because,

7:41

as I say, actually the Gulf States and

7:44

Israel are pretty aligned on this

7:46

particular thing. They are all um at

7:49

threat from the Iranian regime.

7:51

>> Mhm.

7:52

>> Um so we had Aimen Dean and Richard Miniter

7:56

on our show the other day. One of them

7:58

is a al-Qaeda MI6 double agent. Another

8:01

one is a really reputable investigative

8:04

journalist.

8:05

Al-Qaeda MI6.

8:08

Oh, what balls.

8:09

>> And he has a great podcast now as well

8:11

called Conflicted.

8:14

>> What a great name for a podcast for a

8:17

guy who's a double agent.

8:18

>> What balls that guy must have.

8:20

>> Yeah. Yeah. He's a really cool guy. But

8:23

anyway, I mean, he was explaining that

8:24

Saudi Arabia has a population that is

8:27

way bigger than what they can sustain in

8:29

terms of the water, but they live in the

8:31

[ __ ] desert. So they have these

8:33

desalination plants which are extremely

8:35

vulnerable. Uh and Saudi Arabia, UAE,

8:38

these other countries, they felt at huge

8:40

risk from Iranian attacks for a long

8:41

time. So they none of them like the

8:44

Iranian regime that's spreading

8:46

terrorism through its proxies. So in

8:49

actual fact, dragging them into the war

8:51

kind of like there's no sense for that.

8:53

I think for a lot of people,

8:55

their go-to reaction now whenever

8:57

anything happens is that it was Israel's

8:59

fault. You know, like Venezuela, [ __ ]

9:01

all to do with Israel, but when it

9:02

happened, everyone said Israel. I think some

9:05

people just go to that now as the

9:07

automatic response, which it comes back

9:08

to what I was saying earlier about the

9:10

hot-take culture.

9:12

>> Something happened 3 minutes ago and now

9:13

everyone's got a [ __ ] take on it. You

9:15

don't know anything.

9:16

>> None of us know anything. None of us

9:18

know how this is going to go cuz this

9:21

right now is a highly unpredictable

9:23

situation. I don't think the White House

9:25

knows how this is going to go.

9:27

>> No, it's terrifying. It's terrifying and

9:29

it's exactly the opposite of what we

9:31

were told leading into this

9:33

administration

9:34

>> that it's going to be America first,

9:36

>> right?

9:36

>> And no more unnecessary foreign wars.

9:39

>> There's the other thing. Do you

9:41

remember Desert Storm?

9:42

>> Yeah.

9:43

>> Yeah.

9:43

>> Desert Storm. Quick and easy, baby. Woo.

9:45

We went in, kicked some ass, took some

9:47

names. That's a wrap. Pulled out. There

9:50

was only one base that got hit, so there

9:52

was the amount of deaths of American

9:53

citizens was fairly minimal. I think

9:56

that's what got people so confident into

9:59

entering Iran after 9/11 or excuse me,

10:01

Iraq after 9/11 to go back like we

10:04

already [ __ ] them up once. We're going

10:05

to go back and this is going to be easy.

10:07

Well, it wasn't easy the second time and

10:09

it was drawn out and it didn't make any

10:11

sense. But people wanted some form of

10:13

revenge something for 9/11 and so

10:17

somehow or another they justified a war

10:19

with a country that had zero to do with

10:21

it. Like it didn't even make sense. That

10:23

one took forever and then we also invaded

10:26

Afghanistan at the same time. What do we

10:28

do? What the [ __ ] So in the fog of this

10:31

idea of American exceptionalism, we're

10:33

just going to go in and fix it. We did

10:34

it before. There's no one even close to

10:36

us. Well, look how that turned out.

10:39

>> Yeah. Well, this is completely true. And

10:40

then this idea that it's so easy to take

10:42

one regime, remove it, and then just put

10:45

another one in its place like it's a

10:47

Lego block, and then all of a sudden

10:49

you're going to magically fix a country

10:51

is a fantasy. Like if you take Iran, the

10:54

IRGC, which is the Islamic Revolutionary

10:57

Guard, numbers around 200,000

11:01

trained soldiers. And not only are they

11:02

trained soldiers, they're fanatical.

11:05

They're fanatics. And then you have the

11:06

secret police. And then you have the

11:08

regular police. And then you have the

11:10

people employed in the government. And

11:11

then their families, and so on and so

11:13

forth, and then their supporters within

11:15

the country. And then you've got the

11:18

various factions within Iran like the

11:19

Kurds who want independence. So the

11:22

moment the the leadership is weakened,

11:25

they're going to use it as an

11:26

opportunity to launch their own

11:27

revolution to try and break away from

11:30

the rest of Iran. So you have all of

11:33

these particular parts play these

11:35

factions and then you think if you take

11:38

out the top, the guy at the top who's

11:40

holding it all together by force I'm not

11:42

saying I agree with him or I what he

11:44

does you have the very real risk that

11:47

the entire country is going to

11:49

disintegrate as what happened in Iran in

11:51

Iraq, sorry.

11:52

>> and also Libya.

11:53

>> Yeah.

11:55

the the idea that you could just take

11:57

the guy out and that's a wrap. The

12:01

I mean it doesn't seem like it's well

12:03

thought out.

12:04

>> Well, I mean they would say Venezuela,

12:07

right? Like regime adjustment

12:10

different kind, of course. This is a

12:13

religious, fanatical one. It's a totally

12:15

different kind of country. Also, it's a

12:17

country that's been under threat for

12:18

decades, right? So, they've been

12:20

preparing for this kind of thing.

12:21

>> Yeah.

12:22

>> Yeah. And also with Venezuela, it wasn't

12:24

a regime change. Practically everybody

12:26

who was in the old regime is still

12:28

there.

12:28

>> It's regime adjustment.

12:29

>> Exactly. Exactly. So Delcy Rodríguez was

12:33

one of the senior leaders in Maduro's

12:36

regime. They just took out Maduro and

12:38

his wife. They put Delcy Rodríguez there,

12:40

but the whole structure, the whole

12:43

leadership, the whole party is still in

12:45

place. So they've just what they've done

12:48

is they put Delcy at the top and they've

12:50

said to her, "Look, if you [ __ ] about,

12:53

you're going to get what your boss got.

12:54

So you're going to follow what we say.

12:56

You're going to do what we say. You are

12:58

going to open up the oil refineries.

13:00

We're going to build it. We're going to

13:01

you're going to start pumping oil out.

13:03

You're going to stop messing about with

13:04

Hezbollah, which they had training camps

13:06

in the island of Margarita, which is a

13:08

little Caribbean island 2 and a half

13:11

hours away from Miami. Training camps.

13:13

You can't have that. You're not going to

13:14

be fraternizing with the Cubans and

13:17

you're going to play ball and

13:18

essentially Venezuela is now a colony of

13:20

the United States. That's what it's now

13:22

become.

13:23

>> That's wild. Well, there's also the Kurt

13:27

Metzger angle, which is hilarious. Kurt

13:29

Metzger cornered me one night at

13:32

the mothership and uh he explained to me

13:34

that this is all about the 2020 election

13:37

and that Maduro somehow or another had

13:40

something to do with rigging the 2020

13:42

election and he's going to say it as a

13:45

part of his testimony.

13:49

He's like, "Just wait. Just wait. Mark

13:51

my words."

13:52

>> He's he's convinced of this.

13:54

>> He goes down the rabbit hole to the

13:57

lava. Like he passes it and he's like,

14:01

"This rabbit hole has been covered up.

14:03

It goes deeper and he keeps

14:05

going till he's at the [ __ ] center of

14:07

the earth."

14:08

>> He's a funny guy, though.

14:08

>> Oh, he's hilarious. He's hilarious. He's

14:10

mentally ill, but he's hilarious. He's

14:12

one of the funniest people I know, like,

14:14

ever. Fantastic joke writer, too. I

14:16

mean, he's just great all around. But

14:18

Jesus Christ, like some of his nutty

14:21

theories, they go so far.

14:23

>> Oh, absolutely. I've I've been in in

14:25

bars with Kurt where he starts talking

14:27

to me and I'm like, "Kurt, I I I don't

14:29

even know what we're talking about

14:31

anymore."

14:31

>> Well, he changes conspiracies

14:34

midsentence. He starts bringing up some

14:37

[ __ ] from the 70s and it's the Church

14:39

Committee and this and that and MK Ultra

14:41

and don't you know about Monarch? Like

14:43

what?

14:45

Slow down. Like not everybody knows what

14:47

you're talking about. But I I think this

14:50

is and I love Kurt, but this is kind of

14:53

where you feel that the truth isn't

14:56

enough. So you there needs to be

14:58

something else.

14:59

>> There needs to be something that goes

15:00

deeper than that. And sometimes there

15:02

is, don't get me wrong. Sometimes it

15:04

does go deeper, but sometimes like

15:06

you're making connections where there

15:08

are no connections. Like it's pretty

15:10

simple with Venezuela what was going on.

15:12

They were [ __ ] about and they were

15:14

doing it for a long time and they were

15:16

doing it in America's backyard and they

15:18

had warning after warning and Maduro,

15:20

the way I'd push back against Kurt is

15:22

I'm really sorry Kurt, but Maduro ain't

15:24

that bright.

15:25

>> Well, I don't think he has to be that

15:26

bright to finance and make sure and

15:29

arrange things because there

15:31

was like something connected to the

15:35

voting machines that were there.

15:36

>> They made those claims. Sidney Powell

15:38

was it?

15:39

>> Yeah. Here we go. She's another fun one.

15:42

um post-2020 claims from Trump allies like

15:44

Sidney Powell and Rudy Giuliani claiming

15:47

Hugo Chavez, Maduro's predecessor,

15:49

developed rigged software to export to

15:51

US firms. These were promoted by figures

15:53

like Mike Lindell. He makes a great

15:55

pillow. Should listen to him and

15:57

amplified on social media, but courts

15:59

and fact checks rejected them, including

16:02

Fox News' $787

16:05

million, uh, Dominion settlement.

16:07

>> Yeah, I was going to say I was pretty

16:08

sure that those claims were debunked.

16:11

Yeah. Not to Kurt. They didn't

16:14

He's like, "You guys don't know where

16:15

the hard drives are. They're in the

16:17

center of the earth. We got to get

16:18

there." Um, yeah, that doesn't make much

16:21

sense to me. But, uh, neither does this

16:24

idea that you're going to take over a

16:26

country's oil supply, you know, like

16:28

that, you know, we'll just take it. The

16:31

problem is, from the outside, like the

16:34

rest of the world, you look at this

16:36

unnecessary aggression by the United

16:38

States government and then you tack on

16:40

whatever propaganda they have already

16:42

been spitting out about America for the

16:44

last 20 or 30 years and then this war

16:46

with Iran gets really ugly

16:48

>> because that's how you start a World War

16:50

III. You start a World War III by doing

16:53

something that

16:55

other than people that wanted this

16:57

forever. Who else thinks that's a good

17:00

idea? Who else thinks it's a good idea

17:02

to just attack a country that isn't

17:04

doing anything? They haven't done

17:05

anything. Like, if you have proof that

17:07

they have enriched uranium

17:10

and they've got it up to a point where,

17:12

what

17:15

percentage does it have to be? Like

17:16

they're at 60, right?

17:18

>> But that's way more than you need.

17:20

>> Way more than you need, right? So, it

17:21

shows that they're at least ramping up

17:23

their production where it's possible to

17:25

get it to whatever it needs to make.

17:27

It's way more than you need for civilian

17:29

use. Right. Right.

17:30

>> Um

17:30

>> but that's still it's not clear that

17:33

that justifies an invasion when North

17:35

Korea has nuclear weapons.

17:37

>> Right.

17:37

>> Right. It's like do we just want we just

17:40

trying to prevent them from ever getting

17:41

to a point where they're like North

17:42

Korea? Who the [ __ ] is worried about

17:44

North Korea? Zero people. This episode

17:46

is brought to you by Squarespace. Look,

17:49

if you're trying to take your business

17:50

seriously, you got to level up your

17:52

website. Mine, joerogan.com, is powered by

17:56

Squarespace. It's a no-brainer because

17:59

it does all the hard stuff for you.

18:01

They've got everything in one place. You

18:04

can lock down your domain, make your

18:06

stuff look clean and professional, build

18:09

your brand, get paid, all of it. Go to

18:12

squarespace.com/rogan

18:14

and try it out for free. And when you

18:17

are ready to launch, use the offer code

18:19

Rogan to save 10% off your first

18:22

purchase of a website or domain.

18:24

>> I think the difference, as you correctly

18:26

said earlier, is these people are very

18:28

different to the North Koreans, right?

18:29

North Korea wants to be left the [ __ ]

18:31

alone. Iran does not want to be left

18:33

alone. Iran wants to dominate the

18:35

region. That's why they fund Hamas. It's

18:37

why they fund Hezbollah. It's why they

18:39

fund the Houthis. It's why they are

18:41

doing [ __ ] That's why the Gulf

18:43

countries and Israel are very worried

18:45

about them. Right. So that's the

18:46

difference I think. And then there's

18:48

the, you know, uh, some of the people

18:50

we've had on the show who are Iranian

18:51

have talked about the what is it

18:55

called? Twelver Shia Islam. Is that it? I

18:57

can't remember the details, but

18:58

basically they have a kind of messianic

19:01

vision of what's going to happen. And

19:03

they they believe that when the world

19:05

ends, that's when the prophecy will be

19:07

fulfilled. You don't want those guys

19:09

with nuclear weapons, right? That's a

19:11

good point.

19:12

>> Yeah. So, from that perspective, it's

19:14

different to North Korea. Um,

19:17

>> and so that that's I think that's part

19:19

of the thinking, but your your point is

19:21

interesting to me about the fact that

19:23

this doesn't reflect what people, you

19:26

know, as we're not Americans, but it

19:28

doesn't seem to have been part of the

19:29

policy platform of the Trump uh election

19:32

at the last election, right?

19:34

>> No, not at all.

19:35

>> But I do think there is some kind of

19:37

strategy behind all of this, and I'm

19:38

very curious what that is. Uh because I

19:41

guess if if you think about it

19:43

logically, you would say, well, is it an

19:45

attempt to effectively push back against

19:47

China and Russia infiltrating all these

19:50

countries, right? China and and Russia

19:52

were very close with Maduro in

19:54

Venezuela.

19:55

>> Very very like Francis is saying,

19:57

Hezbollah training camps in Margarita,

19:59

where was the oil going? Right. Same

20:02

with Iran. I mean, Iran sells its oil to

20:04

China and sends suicide drones to Russia

20:07

to use in Ukraine. So maybe it's that

20:10

maybe the strategy is you're trying to

20:13

push back against Chinese and Russian

20:14

influence in in all these countries

20:16

because you can't attack them directly

20:18

because you can't attack them directly,

20:20

right?

20:20

>> Yeah.

20:21

>> Uh but this is all just guessing on on

20:23

my part and that's what I'm really

20:24

curious. We're going to do some

20:25

interviews on this trip to kind like I

20:27

want to get someone on the show who can

20:28

go this is the strategy, right? this is

20:31

what we're doing because I think as we

20:33

were saying earlier it's not very clear

20:36

to most people what the rationale behind

20:38

all of this is but I also don't think

20:41

this sort of like mad dog Trump idea is

20:44

true either I think he has a strategy

20:46

I'd just love to know what it is

20:47

>> and it's very interesting because

20:49

there's been talk about regime change in

20:52

Cuba

20:53

>> and one of the things so

20:55

>> I think that's next genuinely

20:57

>> oh my god

20:58

>> I think that's next

20:59

>> so when Chávez came to power in '99. What

21:02

he did, and not enough people talk about

21:03

this, is he turned what was, uh, a very

21:06

corrupt, admittedly, liberal, western

21:09

style democracy into a communist

21:11

dictatorship. And how do you do that?

21:14

You can't just literally do that

21:16

overnight. So what he did is he allied

21:18

with the Cubans and um Fidel in

21:21

particular, Fidel Castro. Venezuela

21:24

provided Cuba with cheap oil which

21:26

helped to keep the Cuban economy afloat

21:28

because Cuba's been going broke since

21:30

however many years, 40 odd years. And

21:33

what Castro did was he gave him the

21:36

boots on the ground in Venezuela, but

21:39

also the technical expertise and knowhow

21:41

in order to change a western liberal

21:43

democracy into a communist state with

21:45

permanent surveillance, secret police,

21:48

subjugate the population. So there was

21:49

no chance of them ever being able to

21:51

revolt, and turn

21:54

everything into, like I said, a

21:55

communist state. So because of what they did in

21:58

Venezuela, Venezuela can no longer

22:01

support Cuba. So Cuba is literally now

22:05

withering on the vine as a result of

22:07

them knocking out the Venezuelans.

22:10

>> so it's going to come to a point where

22:12

you say Cuba is effectively going to go

22:14

bankrupt which could precipitate an

22:17

uprising a revolution by people when

22:19

people can no longer eat and that would

22:22

mean that that country is then weakened

22:24

finally they can get rid of the

22:26

communist regime there and they can have

22:29

a different type of government one which

22:30

would be far more sympathetic should we

22:32

say to working with America and being an

22:35

American island possibly

22:36

>> that makes sense to me in a way what I

22:38

don't understand about the Iran thing is

22:40

like what is the end goal here? Well,

22:42

you've got Reza Pahlavi, the Shah's

22:45

son. I mean, he left Iran a long time

22:47

ago as a kid, right? You know, the

22:50

idea he's going to go back in there and

22:51

be welcomed by the masses. Maybe that's

22:54

true. Maybe

22:55

>> it's like Daenerys returning to the Iron

22:58

Throne.

22:58

>> Yeah, but Daenerys had three

23:00

[ __ ] dragons,

23:02

right?

23:03

>> But you know what I'm saying? Like she

23:04

left when she was a baby.

23:05

>> Yeah, that's right. That's right. And

23:07

it's not like the people are

23:08

desperately. I don't know, maybe the

23:10

people are desperate for the return of

23:11

the

23:12

>> well it seems like some people are

23:13

desperate for a change there, the people

23:15

that were protesting, some people that

23:16

risked their lives, 100%. But like every

23:19

country like if you only listen to the

23:22

liberals in this country you you would

23:24

think that you know that no one's

23:26

illegal on stolen land. If you only

23:27

listen to the Republicans in this

23:29

country you would think we got to find

23:31

every illegal and get them out of our

23:33

country and make America great again.

23:35

Like, it doesn't make sense if we

23:38

just go only by the protesters. Like

23:40

>> we don't really have accurate polling

23:42

because they don't have any free speech

23:43

over there. And they they've killed like

23:47

famous athletes over there for

23:49

protesting. I mean, they killed the

23:50

Olympic gold medalist in wrestling. The

23:52

UFC tried to step in and tried to do

23:54

something to stop it. They executed

23:56

him for just

23:57

>> I apparently I don't even think he was

23:59

actually protesting. I think he was just

24:01

at a protest.

24:02

>> He wasn't even saying anything. And this

24:04

is the thing you always have to bear in

24:06

mind, Joe. Um,

24:07

>> I might be wrong about that though.

24:08

>> No, I think you are right.

24:09

>> Am I right about that?

24:10

>> The the final details, I'm not sure, but

24:12

the fact that he was executed and Dana

24:14

worked very hard to try and save that.

24:16

>> I think there was some discrepancy

24:18

as to whether or not he was actually

24:20

participating in the protest, but

24:23

that also could have been the defense,

24:25

you know? I don't know. But I mean, the

24:26

fact that they execute people who

24:29

protest, I mean, there's no way you can

24:31

support that kind of thing. That's a scary ass

24:32

[ __ ] government, run by religious

24:35

fanatics. That's a scary ass government.

24:37

But the question is like, how scary ass

24:39

does it have to get where invading makes

24:41

sense because if if this keeps going

24:44

like if we we have to go boots on the

24:46

ground, that's where things get nuts.

24:48

>> You can't go boots on the ground, man.

24:50

You can't you can't,

24:51

>> right? I don't but I don't think there's

24:53

any

24:54

>> you should be president of the United

24:55

States.

24:57

>> There's a lot of people are going to

24:58

disagree with that.

25:02

>> No, I I I I don't think that's viable.

25:05

Just like

25:06

>> it might be robot boots on the ground.

25:08

>> Yeah.

25:08

>> You know, if Elon gets that factory up

25:10

in time.

25:11

>> What I want to know is like what is it

25:13

you're working

25:15

towards? Right. Right. Like so from what

25:18

what I understand talking to some of the

25:19

people like Israel would quite like a

25:22

Pahlavi monarchy because the other

25:24

Middle Eastern countries that they have

25:27

peace with you know Bahrain, Morocco, uh

25:32

increasingly the Gulf States, they're

25:33

all monarchies. Right.

25:34

>> Right.

25:35

>> Right.

25:35

>> So they're down with that. But from what

25:37

I understand the White House is really

25:39

not that interested in Pahlavi. And so

25:42

what one

25:43

>> what do they want? Well, one of the

25:44

things that Richard Miniter broke on our

25:46

show, because it hadn't been reported

25:47

anywhere else, was that the White House

25:49

has given the Israelis a no kill list,

25:52

which is basically a list of members of

25:54

the current regime that they don't want

25:56

to be killed because they have hoped

25:58

that these people could then be the

26:02

Rodriguez

26:03

equivalent in Iran, right? And I I don't

26:06

know that the fanatics within the

26:09

Iranian regime who are there now, how

26:12

many of them are like this, like Darth

26:14

Vader, but like do you know what I

26:16

mean? Yeah.

26:16

>> You're kind of looking for

26:17

>> Darth Vader zero.

26:19

>> Yeah. No, no, no. It's zero Islamism.

26:22

Like I I don't know that that exists.

26:24

Right.

26:25

>> Right. Sugar-free.

26:26

>> Sugar-free. Islamism-free. Yeah.

26:28

>> So So that's the bit. And that doesn't

26:31

mean that there isn't like a plan,

26:33

right?

26:33

>> But I don't know what the [ __ ] that plan

26:35

is right now. And I find it hard to see

26:37

one, right?

26:37

>> So regime, evil regime gone. Wonderful.

26:40

But the question is always like

26:42

what comes after that,

26:43

>> right?

26:43

>> That's always the question. And that's

26:45

where I think your point is very true,

26:47

which is in the past there have been

26:48

times where this sort of approach has

26:50

gone completely off the rails.

26:51

>> Yeah,

26:52

>> that's a fact.

26:53

>> And it's also as well what has been

26:54

coming out of the Trump camp is

26:57

contradictory to put it mildly. You have

26:59

Hegseth saying one thing, you have Trump

27:00

saying another. They contradict each

27:02

other at certain points. Is that a

27:04

tactic in order to befuddle the

27:05

opponent? Maybe. Who knows? Or is it the

27:08

fact that they don't actually have a

27:09

grand vision?

27:10

>> Was there some sort of a concession

27:12

today on Russian oil?

27:14

>> Yeah. Well, I think the first of all,

27:17

Trump let uh India buy Iranian oil and I

27:20

think now they are lifting the sanctions

27:22

on Russia selling its oil because the

27:23

oil prices spiked as much as they did.

27:25

Right.

27:26

>> Here it goes. US eases limits on Russian

27:29

energy as oil prices soar.

27:31

>> Right.

27:36

>> Yeah.

27:40

>> Yeah. Well, you've got the Pink Floyd

27:41

t-shirt, so it's appropriate.

27:43

>> But you can see it, like the oil

27:45

prices spiked

27:46

>> for for what? One day, two days.

27:48

>> Yeah.

27:49

>> And everyone went full panic straight

27:52

away. But the thing is if

27:54

that carries on for two months, the

27:56

impact of that on domestic politics, I

27:59

mean, I'm not an expert in American

28:00

politics, but even I can say that's

28:01

going to be pretty [ __ ] important.

28:03

>> Oh, it's bad,

28:04

>> right?

28:05

>> Yeah, it's going to be bad. I mean, if

28:07

oil prices spike, we're [ __ ]

28:09

>> Yeah.

28:09

>> You know, and the Republicans are really

28:12

[ __ ]

28:12

>> Yeah. And you got the midterms coming.

28:14

>> You got the midterms coming up in

28:15

November. And it's also the momentum

28:17

will be in their way. And look, there's,

28:20

you know, second, third, fourth order

28:22

consequences. So, at the moment in the

28:24

UK, the vast majority of people

28:27

are finding it more and more difficult

28:29

just to get through to the end of the

28:30

month

28:31

>> because of the cost of living,

28:33

inflation.

28:35

It's becoming worse and worse. I was

28:36

talking to a butcher in my area, which

28:39

is this very nice part of North London.

28:41

You know the type of place I'm talking

28:43

about. Everyone loves BLM. No one has a

28:44

black friend. That kind of place, right?

28:48

Okay. That's the kind of area it is. And

28:50

he was telling me that even in this very

28:52

wealthy area, people are starting to

28:54

ration meat now. So before they'd

28:56

have meat 5 days a week, now they're

28:58

going down to three or two. And this

29:00

is a wealthy area. So now imagine if

29:02

there's energy spikes and then food

29:04

becomes more and more expensive. There

29:06

is already a very worrying

29:09

hardleft political movement growing in

29:12

the UK where they're talking about, you

29:15

know, the capitalism doesn't work. We

29:17

need socialism. And there's a there's

29:19

this new politician come to the fore, a

29:21

guy called Zack Polanski, who talks

29:23

about what we need in this country,

29:25

the UK, is socialism. Now, imagine if

29:29

the cost of living crisis gets worse.

29:31

And the vast majority of people who work

29:34

hard in a regular job can't make ends

29:37

meet through literally no fault of their

29:38

own. Can you blame them for going, "Hang

29:41

on, capitalism doesn't work." Because in

29:43

this instance, at that moment, it

29:45

doesn't work for them. And then

29:47

that could spark something

29:50

completely disastrous for our country.

29:52

>> Yeah. But I think you know that's a

29:54

negative story. I think it's incredibly

29:55

persuasive and I lean more in the

29:57

direction that this could go badly. But

30:00

I also think there is the possibility that

30:02

this goes well too. I I think that is

30:04

possible. Um

30:06

>> what do you how do you envision that

30:08

scenario? Well, so if they're able to

30:10

keep the Strait of Hormuz open and you don't

30:13

have these energy problems that we've got

30:15

now, um you know, Venezuela, Cuba, he's

30:19

basically resetting the region and he's

30:22

basically saying to all the people that

30:23

want to align themselves with China and

30:25

Russia, like we're not [ __ ] about

30:27

here. Don't cross these lines. That

30:30

is an opportunity to to to address the

30:33

slide that the Western world has had

30:35

vis-à-vis China and Russia for a very long

30:37

time. That could be a very positive

30:38

thing. My the thing is what happens in

30:41

Iran like that is the thing that I don't

30:44

really see how that goes well. It might

30:46

do. Like I said, there's probably a

30:48

plan that we don't know and if if that

30:50

works out that could be very good.

30:52

>> Does it okay what would you imagine that

30:54

plan would be if you if you like let's

30:59

let's imagine best case scenario you're

31:01

in the White House. They're all very

31:02

rational. No one's being influenced by

31:04

foreign governor governments. No one's

31:06

incompetent. Everybody knows what

31:08

they're doing.

31:09

>> Well, yeah. I mean, we're in the realms

31:10

of fantasy now.

31:14

>> But I mean, take the Soviet Union, which

31:16

is obviously something that I know,

31:17

right? Being born in the Soviet Union,

31:19

Russia. Towards the end of the Soviet

31:21

Union, you still had some fanatical

31:23

communists. And in fact, throughout the

31:25

Soviet Union, you always had within the

31:27

government a mixture of different

31:29

people, right? You had the fanatical

31:31

communists who believed that communism

31:33

is the only thing that was ever going to

31:34

work, etc. But you also had people who

31:37

were reformers. They saw the problems.

31:39

They saw that the fanatical communists

31:42

were ruining things and things were

31:44

getting worse, right? They saw that you

31:46

had to kill more and more of your own

31:47

people to keep [ __ ] locked

31:50

down, right? So the argument could be

31:52

within the Iranian revolutionary guard

31:55

or the regime more broadly, there are

31:57

people who are like, you know, I'm

32:00

not necessarily that keen on the

32:02

guy who runs Syria now, al-Jolani,

32:04

right? He is a jihadi but he's kind of

32:06

like a moderate one. You know,

32:08

I don't know how long it's

32:10

going to last. But my point is within

32:11

every regime there is some range of

32:13

opinion. There is some range of

32:15

fanaticism. There is some range of

32:17

people who partly for generational

32:20

reasons. You know the younger people

32:21

have seen uh you know a 40-year history

32:24

and they now go okay this isn't working

32:26

anymore. We need to try something else.

32:28

That is possible. So if the CIA

32:31

and the White House have someone like

32:33

that um and they can do a regime

32:35

adjustment and like I think the idea

32:38

that you're going to have,

32:41

you know, multi-parliamentary democracy

32:44

with, you know, free and fair elections

32:46

and women, you know, like Venice Beach,

32:49

you know, rollerblading in bikinis. I

32:52

don't know that that's going to happen,

32:53

right? But what you might have is an

32:57

authoritarian regime of some kind like

33:00

many other countries in the Middle East

33:02

which realizes that actually economic

33:04

growth is more important than shouting

33:06

Allahu Akbar every 3 minutes and blowing [ __ ]

33:08

up, right? that focuses on making life

33:11

better for their citizens that you know

33:13

practices traditional Muslim values

33:15

which many countries do and says you

33:17

know women ought to be modest but

33:19

doesn't force them to wear the burka or

33:22

or or or the the headscarf or whatever

33:25

or and uh is less interested in

33:29

destabilizing the region and attacking

33:31

others and trying to be this great power

33:33

and is more interested in just

33:35

prosperity for its own people, survival

33:37

for themselves as a regime,

33:40

and is willing to play ball with the

33:41

United States. I I mean

33:43

>> well said. Yeah. Well said. That's best

33:44

case scenario.

33:45

>> That's best case scenario. Now, if you

33:46

get there, you I think that would be a

33:48

huge win for President Trump and it'd be

33:50

a huge win for the world and he will

33:53

walk away from that with a huge win. And

33:55

I think you know you're a better expert on

33:57

the American people, but I think

33:58

American people like winning, right? So,

34:00

if you have all this happen, he can then

34:03

say, well, look, we did this, we did

34:04

this, we did this, Russia and China have

34:07

been pushed back. we got to, you know,

34:08

Iran is not going to get a nuke,

34:10

which is important. I think we can all

34:12

agree on that, right?

34:13

>> Yeah.

34:13

>> Um there is the possibility that he

34:15

comes out of this very well.

34:18

>> I I think that based on what I see, but

34:21

I don't know, coming back to what we

34:22

said, I think we've shared this kind of

34:24

perspective really, Francis and I, that

34:27

seems somewhat less likely at this

34:29

point, at least harder to see, but I

34:30

think you can tell a persuasive story

34:32

both ways. I really do.

34:34

>> That makes sense. And what what you're

34:36

saying I think is very valid that we

34:38

need to abandon any idea of them having

34:40

some sort of a democracy over there.

34:41

It's it's not going to happen.

34:43

>> No.

34:43

>> Um and you know you do look at the

34:46

relationships that we have with other

34:49

Gulf state

34:50

nations

34:52

seems fine. Right. It's not threatening

34:54

to us. We would like everyone to be free

34:57

and have the same sort of liberal

34:58

democracy that we have in America. But

35:02

okay, you can want that all

35:04

day long. You can't do anything to

35:06

change the way other people govern

35:07

themselves. Especially when you've

35:09

gotten to the point where like

35:13

take any of the Middle Eastern countries

35:15

for example, these some of these people

35:17

are worth trillions of dollars. These

35:19

royal families have been running it

35:20

forever. They have insane amounts of oil

35:22

money.

35:24

>> Good luck.

35:24

>> Good luck getting them out of there.

35:26

like good luck saying uh we should

35:28

just vote, you know, and have a

35:30

president and you don't have any power

35:31

anymore. Like how how are you going to

35:33

pull that off? Especially if things are

35:35

going well for the people that live

35:36

there. Like, I have a friend who

35:39

moved to Dubai

35:41

>> and uh he's an American and uh he moved

35:44

back to America recently, but he was

35:45

over there and he said, "Dude, you could

35:46

leave a Rolex on the street and people

35:48

would pick it up and bring it to the

35:50

police.

35:51

>> Like it's so safe." He's like, "There's

35:53

no crime." and he's black and she he's

35:56

like, "I worry when I go out in America

35:58

I'm going to get shot. I'm worried I'm

35:59

going to go to a club and someone's

36:01

going to start beefing and shooting up

36:03

the place and I'm going to get hit." He

36:04

goes, "I don't think about that at all

36:06

over here. There's none of that." He

36:08

goes, "It's safer." Is he Is it [ __ ]

36:10

up that you know it's run by a king?

36:13

>> I guess, is it that much

36:16

different than a president? I mean, in

36:19

in a way, like it's a leader, right?

36:22

You've got more checks and balances over

36:24

here. You've got Congress. You got the

36:25

Senate. You got all this [ __ ] going on

36:27

with the Supreme Court. You have all

36:29

these different human beings that also

36:30

have a say and can block things. And but

36:33

at the end of the day, we're still under

36:35

this bizarre alpha male chimpanzee

36:39

structure that has existed from the time

36:42

that we were 150 people in a [ __ ]

36:44

tribe, right? So, it's still one guy

36:48

running things. It's just running things

36:50

their way. And if you were a citizen in

36:53

Dubai,

36:54

pretty [ __ ] good, right?

36:56

>> Right. Well, your point about the UAE is

36:58

really interesting because not only is

36:59

on the practical level of safety and

37:01

other things, but also uh they don't

37:04

have the Islamism problem, right,

37:06

>> that we have in Britain and increasingly

37:08

you guys are starting to see here

37:10

>> because they recognize that it's a

37:12

problem and they deal with it. So, I

37:14

don't know if you saw this news story.

37:15

The UAE no longer gives sponsorships to

37:18

their students to go to the UK because

37:21

they're worried their kids are going to

37:23

get radicalized by Islamists in Britain,

37:26

>> which is [ __ ] wild.

37:28

>> Yeah, that is [ __ ] wild.

37:30

>> And I mean you were

37:32

messaging me about the story with the

37:34

Mamdani situation, right, yesterday?

37:36

Yes.

37:36

>> You now have this problem in America.

37:38

>> Yeah. Yes. You have the Islamism problem

37:40

here where people who are supporters of

37:42

ISIS are throwing bombs. I mean, your media is

37:44

pretending it's not happening, but it's

37:46

[ __ ] happening.

37:47

>> Well, it happened in Austin,

37:48

>> right?

37:48

>> I mean, the guy who shot up that bar.

37:51

>> Um,

37:51

>> because this is a

37:52

>> send that article to Jamie. Jamie, you

37:55

could probably find it on CNN because

37:57

it's kind of hilarious.

37:58

>> Incredible. This was actually

38:00

incredible.

38:00

>> The New York Times title change thing.

38:02

>> Did the New York Times change their

38:04

title? Yeah,

38:05

>> they did with the

38:07

>> I wonder why

38:10

>> I'm going to send you the CNN one first.

38:12

>> Um the CNN one is is really wild. I'm

38:15

>> called they had like fuses of things

38:19

with smoke or something. They changed it

38:20

to bombs or something.

38:21

>> Yeah, [ __ ] duh. Uh I'm sending you

38:24

this one cuz the CNN one is, believe it

38:27

or not, more preposterous. The CNN one

38:29

is so kooky. You you you see their

38:32

headline, you're like, "What? What kind

38:34

of story are you painting here? Like

38:37

this is such a crazy way to frame it: a guy

38:42

showed up with bombs and was hurling

38:44

them at people.

38:45

>> Well, they made it sound like the exact

38:47

opposite of what actually happened.

38:48

>> Well, it sounded like it was just a

38:50

regular day. Just regular just regular

38:52

day for this fella. And then things just

38:54

went a little sideways somewhere along

38:56

the way.

38:56

>> Listen, we've all got hobbies, man. You

38:58

know what I mean? You like working out.

38:59

He has nail bombs. Come on.

39:01

>> Yeah.

39:01

>> Did you get it, Jamie? It's

39:03

>> not It's not coming through.

39:04

>> No. Look, when I uh It's not loading

39:07

there.

39:07

>> Oh, interesting. But let me um see if

39:10

they took it down.

39:12

>> I'd guess they would have. You probably

39:14

have an old link that

39:15

>> Okay, that's the New York Times one I

39:16

sent you on for Oh, well that No, that's

39:18

the one I just clicked on. Let me check

39:19

the uh original one. Nothing to see

39:21

here. Interesting.

39:23

>> Uh they probably deleted it. Okay, here.

39:25

I found it. I found it. Two Pennsylvania

39:27

teenagers crossed into New York City

39:29

Saturday morning for what could have

39:30

been a normal day enjoying the city

39:33

during abnormally warm weather. But in

39:35

less than an hour, their lives were

39:37

drastically changed as the pair would be

39:39

arrested for throwing homemade bombs.

39:41

Like, that is CNN's

39:44

tweet. Um I'm going to send you a

39:46

screenshot cuz I do believe they've

39:48

taken it down.

39:49

>> They buried them.

39:50

>> Um yep. Nothing to see here yet. Yeah,

39:52

they took it down.

39:53

>> That's not a headline though. Let's take

39:54

a second.

39:55

>> No, no, no. I'm I I'm going to send you

39:57

the uh actual tweet because they did

40:00

take it down because it's so [ __ ]

40:02

ridiculous. But uh the internet never

40:04

forgets. Um I'm sending it to you here.

40:07

Thank god I saved it. It took a

40:09

screenshot cuz I'm like this is such a

40:12

crazy way to frame

40:14

>> Yeah.

40:14

>> two guys wanted to do a terror attack.

40:18

>> Yeah.

40:18

>> But it's not an accident, Joe. It's not

40:20

crazy.

40:21

>> You know it's not crazy.

40:22

>> No. Look. Look how it's framed here.

40:24

This is the original tweet. Two

40:25

Pennsylvania teenagers, just regular

40:27

fellas from PA, from Philly. Two

40:30

Philly boys had a couple of cheese

40:32

steaks and then got on the train

40:35

>> crossing New York City Saturday morning

40:36

for what could have been a normal day

40:38

enjoying the city during abnormally warm

40:40

weather. Why the [ __ ] would you even say

40:42

that? Could have been a normal day if

40:44

they weren't going there to commit

40:44

terrorism.

40:45

>> Do you know what it reads like? It reads

40:46

like when I used to teach uh 13-year-olds

40:48

creative writing. That's how they'd all

40:51

start off. You know, it was like a

40:52

normal day. And

40:54

>> I mean, I'd like to know who

40:56

wrote that. Who's the person

40:58

who wrote that? And I want to know if

41:00

were we directed to write it

41:01

that way or

41:01

>> who approved it? Yeah. Who edited it?

41:04

>> Are you trying to downplay the

41:05

possibility of first of all now in New

41:09

York because you have a a guy who's an

41:11

avowed whatever he is, democratic,

41:13

socialist, some say communist, but also

41:16

Muslim. And then you have these

41:19

Islamists who are doing a terrorist

41:21

attack. So like are you trying to soften

41:23

that? Are you trying to soften it?

41:25

>> So what what happened just so people

41:26

know is there was a protest outside

41:28

Mamdani's mansion.

41:30

>> Right.

41:30

>> Right. And then these two people turned

41:33

up and threw bombs at the protesters and

41:35

the way it was reported you if you just

41:38

read that and no other stuff you would

41:40

have come away with the conclusion

41:42

that it was the protesters who were

41:44

the ones

41:47

that threw the bombs. No one officially

41:49

said that's what happened. But the way

41:51

they did the story and the headline, you

41:54

would have got that impression and

41:56

you're just going, "Well, you were just

41:58

on a team. You see this as a team game,

42:01

right? And you want to present your team

42:03

in the correct light."

42:04

>> Oh, the new post: a post regarding

42:07

two individuals arrested for throwing

42:08

handmade bombs outside of New York City

42:10

Mayor Zohran Mamdani's home failed to

42:13

reflect the gravity of the incident.

42:15

thereby breaching the editorial

42:17

standards we require for all our

42:19

reporting. It has therefore been

42:21

deleted.

42:22

>> But see how skillful this is, Joe. This

42:23

is gaslighting again. They're saying

42:26

their mistake was to what?

42:28

>> Look at the first guy. Donut operator.

42:31

Nah. You retards got called out for

42:33

trying to downplay actual terrorism and

42:35

now you're backpedaling. Yep.

42:37

>> Who's that second guy? He's really

42:38

smart.

42:39

>> Didn't fail to reflect the gravity of

42:41

the situation. This guy named

42:42

Konstantin. I think he's on that

42:43

Triggernometry show. Uh reflect the

42:46

gravity of the situation. It failed to

42:47

accurately communicate who was

42:48

responsible, who the intended victims

42:50

were, and where the blame for the

42:53

attempted terrorist attack lay. In other

42:55

words, you didn't accidentally downplay

42:57

the seriousness of it. You

42:59

deliberately misrepresented what

43:01

happened to conceal the truth from the

43:02

public. Well, that's how AI would say

43:04

it.

43:07

>> I still write my own [ __ ] Joe.

43:09

>> I know you do, but I like the donut

43:11

operator guy.

43:12

Yeah, I know. He's more your cup of tea.

43:15

>> That's how I like to talk.

43:17

>> That is what happened though, right?

43:18

>> That is what happened. But that's what's

43:19

really scary about this world we're

43:23

living in right now because we're

43:25

we're so ideologically captured both

43:27

right and left. Everyone in this country

43:29

looks at this administration as an

43:31

existential threat to democracy itself

43:33

and our way of life and, you know, fill

43:36

in the blank whatever marginalized

43:38

groups are all going to be rounded up and

43:40

put in internment camps. This is

43:42

the narrative that the most radical

43:44

of the left have, that the sky is

43:47

falling because Trump's in office.

43:48

>> But it's also as well what people on the

43:50

left don't want to acknowledge is the

43:52

dangers of Islamism, right? When they

43:54

see people do these kind of horrific

43:57

terror attacks, when they see for

43:59

instance what happened in the London

44:00

Bridge terror attacks in 2019 or what

44:03

happened in Manchester at the Ariana

44:04

Grande concert where Islamic terrorists

44:08

bombed an Ariana Grande concert and the

44:11

majority of the audience were little

44:13

girls, young girls, and they say,

44:16

"Oh, that this happened because you know

44:18

they were marginalized and they felt

44:20

angry and this is what people do when

44:22

you push them to one side and they don't

44:24

have a means to express

44:26

themselves. You're going no what this is

44:28

is an ideology. It's an ideology which

44:31

believes that our civilization, our way

44:34

of life is evil, but also they want to

44:37

establish their form of radical Islam

44:40

across the globe. They want to create a

44:42

global Islamic caliphate and they will

44:44

do whatever it takes in order to achieve

44:45

that goal. But people in the west they

44:48

can't understand that because it's so

44:50

alien for how we see how we see things.

44:53

We believe human life is precious. We

44:55

believe the most important thing is

44:57

human life. They don't. They believe the

44:59

cause is more important than your life.

45:01

And we can't understand that because

45:03

we're raised in a world that is

45:05

fundamentally Christian even though we

45:07

might not be. We still have Christian

45:09

values. We had a guest on the show, a

45:11

wonderful historian called Tom Holland,

45:13

and he explained this to us that even if

45:16

you're not Christian, even if you think

45:17

you were raised by atheist parents, you

45:20

were still raised with Christian values.

45:23

That's the soup in which we live. That's

45:25

the water in which we swim. So this

45:28

way of life that these people have, this

45:30

ideology is so alien to us that we can't

45:33

understand it. But also we don't want to

45:36

understand it because if you start to

45:38

actually investigate what these people

45:40

believe, what their ideology is, you

45:43

realize that we are not all the same and

45:45

these people believe something very very

45:47

different and then we're going to have a

45:50

very uncomfortable conversation of how

45:52

do you tackle this cuz can you have

45:55

western liberal democratic values and

45:57

Islamism and people who are Islamists in

46:00

the same society and the answer is you

46:02

can't

46:02

>> and I think it's really important your

46:03

point the difference between Islamists

46:05

and Muslims.

46:06

>> Yes.

46:07

>> The Muslims are these Gulf State people.

46:09

Muslims are these people in Dubai and

46:11

Saudi Arabia. Islamists are the ones

46:13

that want the caliphate.

46:14

>> Yeah. Right.

46:14

>> But then you have the crazy Christians

46:17

and that thing that I sent you, the

46:18

Yahoo thing that we talked about

46:20

yesterday with Shellenberger.

46:21

>> The Yahoo thing is nuts. So these

46:25

military leaders, so this this comes

46:27

from one of the non-commissioned

46:29

officers who went to a briefing. He goes

46:31

to a briefing and they inform him that

46:33

you shouldn't be scared because this is

46:36

all because President Trump is anointed

46:38

by Jesus and this is to bring about

46:40

Armageddon so that Jesus returns to

46:43

Earth. This isn't a [ __ ] military

46:46

briefing. One such note included an

46:48

anecdote from a non-commissioned officer

46:50

who reported that their commander had

46:52

urged us to tell our troops this war was

46:54

all a part of God's divine plan. And he

46:57

specifically referenced numerous

46:58

citations out of the book of Revelations

47:00

referring to Armageddon and the imminent

47:02

return of Jesus Christ. This is [ __ ]

47:05

crazy. Um he said this morning our

47:07

command So this is this is an officer

47:10

who's talking about this. This morning

47:11

our commander opened up the combat

47:13

readiness status briefing by urging us

47:15

to not be afraid as to what was

47:16

happening with our combat operations in

47:18

Iran. He said, "President Trump has been

47:20

anointed by Jesus to light the signal

47:23

fire in Iran to cause Armageddon and

47:26

mark his return to Earth." He said he

47:28

had a big grin on his face when he said

47:30

all this, which made his message seem

47:32

even more crazy.

47:33

>> Well, that's reassuring.

47:35

>> Well, this is the

47:37

scary arm of the right. This is the

47:40

scariest arm of the right. The

47:42

people that think that this is one of

47:43

the main reasons, the Makabe people who

47:46

think this is the main reason to protect

47:47

Israel. It's a part of God's plan. You

47:50

know, Israel is where Jesus is going to

47:52

return. He's going to return to

47:53

Jerusalem.

47:54

>> Yikes.

47:56

>> Yeah. I've never really understood that.

47:57

Like, I think you can

48:00

be pro-Israel for pragmatic reasons.

48:03

This religious stuff is a little bit

48:05

weird to me. Well, but the problem

48:08

is you've got fanatics like the

48:10

Islamists, but you've also got these

48:13

>> Christian hard right Christian

48:15

nationalists that really believe that

48:18

this is a part of biblical prophecy

48:22

>> and that they this is book of

48:24

revelations. It's about to go down and

48:25

they they want it to go down.

48:27

>> This is [ __ ] terrifying. This is

48:30

really interesting for us because in in

48:33

in the UK, Christianity has been

48:36

defanged to the point where there's a

48:38

trans flag on practically every church.

48:41

So this idea of having these hardcore

48:44

right-wing fundamentalist Christians, we

48:46

we just don't experience that.

48:47

>> We don't have that really.

48:48

>> Yeah. It's like, can't everybody live

48:51

in the middle? Why do you have to go

48:52

all the way over to 'we got to start

48:54

Armageddon and Jesus is going to

48:56

come back on a white horse'? You ever

48:58

read the book of Revelations?

48:59

>> Yeah, I got really into it. The book of

49:01

Revelations is kooky. You know, they

49:03

they really believe that Jesus is going

49:04

to return on a horse.

49:06

>> Why a horse?

49:07

>> A white horse?

49:08

>> It's a bit racist

49:09

>> a little bit. I mean, I don't get

49:13

I don't

49:14

>> Can we have a diverse horse at least for

49:16

>> a horse of color joke?

49:17

>> You want me to read you the passage? Cuz

49:19

I I saved it cuz it's kind of kooky. Um

49:21

because it's one of those things where

49:23

you just go, "Wait, who [ __ ] believes

49:25

this? Is this is is this really what you

49:27

think is going to go down because

49:28

someone wrote it down on paper 2,000

49:31

years ago in

49:32

>> in ancient Hebrew. Uh it says, "Heaven

49:36

opens and Christ appears on a white

49:38

horse to judge and wage war called

49:41

faithful and true with eyes like fire,

49:44

many crowns and the name King of Kings

49:46

and Lord of Lords." Just imagine it's

49:49

2026 and you're like, "That's the

49:50

blueprint, boys."

49:53

But this is just as scary. And

49:54

especially for people that are Muslims,

49:56

right? Or anybody who lives in the

49:58

Middle East. Like, to them this is more

50:00

important than human life. This is more

50:02

important than international law. This

50:05

is like in the eyes of the crazy on the

50:08

right.

50:09

>> This is the problem. So it's

50:11

not like: one side is

50:12

all good over here. We have to fight

50:14

against the Islamists. Now we've got

50:17

some kooks over here, too. If that

50:20

guy is for real and that guy's in a

50:21

position of power and he's really having

50:23

combat readiness meetings where he's

50:25

telling people that we have to bomb and

50:27

start Armageddon so Jesus can come back

50:28

on a white horse

50:30

>> [ __ ] yo

50:31

>> like that's kooky.

50:33

>> The thing that is probably reassuring

50:35

somewhat is like I don't President Trump

50:37

doesn't strike me as one of those

50:38

people. He's not.

50:39

>> He's not. Right.

50:40

>> Whereas the leader of Iran is

50:43

>> right. But people in the military I

50:45

think are as well. Yeah. and people in

50:47

high positions in the military, I think

50:48

maybe as well. If this guy can give

50:51

that kind of a meeting and and that kind

50:53

of a speech at a meeting, that that's a

50:55

little terrifying. And if I was over

50:57

there, I'd be freaking the [ __ ] out.

50:59

I'd be like, "This is your plan?

51:01

>> I'm cannon fodder so that Jesus can come

51:04

back. My body's going to be part of

51:06

the [ __ ] signal fire."

51:08

>> Let's be honest though, it wouldn't be

51:10

that much of a plot twist for 2026,

51:12

would it?

51:12

>> Right. It would be the final, you know,

51:14

episode 10, Game of Thrones season 6.

51:19

>> Yeah, it's I mean, it is getting [ __ ]

51:21

what? And that's when the aliens come.

51:23

Maybe that's what they're doing.

51:25

>> You know, the whole thing is... uh,

51:28

there's not a sane person on either

51:30

side.

51:32

>> The whole thing is nuts. And it's like

51:35

it doesn't make sense to

51:37

anybody. And that's what scares the [ __ ]

51:39

out of me.

51:40

>> Yeah. It's the thing that scares me is

51:43

what Konstantin has addressed is I

51:46

don't get worried unless I can't see a

51:50

way out or the way that this is

51:51

resolved. And like you said, the coin is

51:53

in the air

51:54

>> and I'm of a slightly more

51:56

pessimistic nature. He's more of an

51:58

optimist. But as I I look at it and I

52:02

think to myself, this could go so wrong,

52:04

>> so badly wrong that it could make it

52:07

could make Iraq look like an absolute

52:09

picnic in comparison.

52:10

>> Yeah. Well, especially if terrorist

52:12

attacks start popping off over in

52:13

America, like major ones,

52:15

>> you know, and that that could be bad for

52:18

everything. That could be bad for

52:19

freedom of speech. That could be bad for

52:22

rights.

52:23

>> That could be bad for, you know, the

52:25

incorporation of digital ID. That would

52:27

be a good way to push that through.

52:29

There's a lot of stuff that would go

52:30

through that would radically change just

52:32

like the Patriot Act did. Patriot Act

52:34

radically changed the freedoms that we

52:36

have in America and the overreach that

52:38

the government is allowed to

52:40

>> You're so right. But we only just

52:41

started being allowed to take water back

52:42

on the [ __ ] planes. Right.

52:44

>> Right.

52:46

>> Maybe this is what it's all about. It's

52:48

so you can't take water on a plane.

52:49

>> You decide to let people keep their

52:51

sneakers on. Wasn't that going on for a

52:52

while?

52:53

>> Yeah. You can't can't have that. We had

52:55

one [ __ ] shoe bomber. That Richard

52:57

[ __ ] that one guy. One guy.

53:00

>> And there was once, do you remember you

53:01

weren't allowed to bring scissors on?

53:03

Like small scissors. I can't remember

53:05

the comedian who said this, but he went,

53:06

"You know what? If you take over an

53:08

entire plane armed with nothing but

53:10

water and some small scissors, you

53:12

deserve the plane."

53:13

>> Well, here's the thing. Like, you can

53:15

bring skateboards, but you can't bring a

53:17

pool cue.

53:18

>> So, it doesn't make any sense. Like,

53:20

it's like, I could [ __ ] you up with a

53:22

skateboard. If you give me one, I'll [ __ ] a

53:25

lot of people up with a skateboard, you

53:27

know? Like, think about what kind of

53:28

damage you could do with that big heavy

53:30

ass thing,

53:31

>> you know?

53:32

>> Yeah. It's just And the the worry is

53:35

when it comes to all of this is you look

53:37

at these guys and you go, do you have a

53:38

vision for what is actually going to

53:40

happen,

53:40

>> right? But I do think they do though. I

53:43

do think they have a vision. What I want

53:45

to find out is what that vision is.

53:47

>> I hope you're right, but I don't think

53:49

you are.

53:50

>> You don't think I am? Uh I I think it's

53:53

very possible that they thought this

53:55

would be over much quicker. They thought

53:57

taking out the... Look, just look at

53:59

the success that they had in the initial

54:00

bombing of Iran, right? The initial

54:02

bombing they supposedly decapitated

54:04

their ability to make nuclear bombs or

54:06

at least stopped it for a long time. And

54:08

there were a lot of concessions that the

54:10

Iranians were willing to submit to that

54:12

they never submitted to under Obama or

54:14

anybody else.

54:15

>> And that wasn't enough,

54:17

>> right? So the the problem is when you're

54:20

like we were talking about Desert Storm,

54:22

you get away with something that works

54:24

really well. You're like, "We know what

54:25

we're doing."

54:26

>> And then you bite off more than you

54:27

could chew.

54:28

>> Yeah. And especially once you've done

54:29

Venezuela, you feel like you're kind of

54:31

>> you're on a roll,

54:32

>> right?

54:32

>> But yeah, I I see your point. I see your

54:35

point. Um I do think though, I mean,

54:37

from what I read, uh both Kushner and

54:39

Witkoff both said that the Iranians were

54:42

not playing ball actually. Um

54:44

>> Okay, which is why they went in. So

54:46

obviously if you think about it given

54:48

how long it takes for US assets to get

54:49

to the region this decision would have

54:51

been made weeks ago at the very least

54:54

right and that's because from what I

54:56

understand, the negotiators were like: Iran

54:58

isn't actually playing ball. What they're

55:00

doing is they're claiming publicly that

55:02

they're willing to make concessions but

55:04

when we sit down with them that's not

55:06

what's happening because all they're

55:07

doing is stalling for time.

55:08

>> That makes more sense. And so if you

55:10

were worried that someone was in the

55:11

middle of actually getting their uranium

55:13

up to a point where you enrich it to

55:16

nuclear bomb levels,

55:17

>> right? But I think a lot of people

55:19

misunderstand that in the sense that

55:21

like, um, based on my

55:23

understanding it's totally false to

55:25

claim that they were like about to

55:26

develop a nuclear weapon. They were not.

55:28

>> Well, you've seen the compilation of

55:30

Netanyahu saying Iran is two weeks away

55:34

from developing a nuclear bomb all the

55:35

way back to the 80s. Have you seen that

55:37

compilation?

55:38

>> I haven't. No. It's wonderful,

55:39

>> right?

55:39

>> See if you can find it, Jamie. Because

55:41

it's so it's so kooky. I mean, he's been

55:43

talking about this for [ __ ] ever,

55:45

right? They're that close. They're two

55:46

weeks away. They're two weeks away.

55:47

They're two weeks away.

55:49

>> And you know, maybe they are,

55:51

>> you know, and maybe Stuxnet put a dent

55:54

in that, right? They used that um

55:56

virus program to kill all the computer

55:59

programs that were running

56:01

>> their nuclear program over there.

56:03

>> Yeah. Well, I don't know that

56:05

they were ever like two weeks from

56:07

having a payload that was ready to be

56:09

delivered to wherever, but they

56:12

were enriching uranium to levels that

56:14

you only enrich if you want nuclear

56:15

weapons, right?

56:16

>> Uh and so I guess the question for Trump

56:18

is like, do I allow this to continue?

56:21

And do I have to wait until they've got

56:23

the [ __ ] bomb on a launcher waiting

56:25

to to go? Right

56:26

>> here it is.

56:26

>> Yeah,

56:27

>> let's hear it. You've probably heard

56:28

this line before. Iran has never given

56:31

up its quest for nuclear weapons and the

56:34

missiles to deliver them.

56:35

>> That's because Israeli Prime Minister

56:37

Benjamin Netanyahu has been saying this

56:39

for more than 30 years, claiming Iran is

56:43

close to having

56:44

>> nuclear weapons. Nuclear weapons,

56:46

nuclear weapons, atomic bombs.

56:49

>> In 1992, as a member of parliament,

56:52

Netanyahu addresses the Knesset. He says

56:55

within 3 to 5 years we can assume that

56:58

Iran will become autonomous in its

57:00

ability to develop and produce a nuclear

57:02

bomb. 3 years later in his book Fighting

57:06

Terrorism he repeats the same time frame

57:10

3 to 5 years.

57:12

>> Thank you Mr. Chairman.

57:13

>> Fast forward to 2002. Netanyahu

57:16

testifies before a US congressional

57:18

committee actively calling for the

57:21

invasion of Iraq. Uh, are there any

57:23

other nations that you would recommend

57:25

that the United States launch preemptive

57:27

attacks upon? At this point,

57:29

>> the two nations that are vying competing

57:31

with each other who will be the first to

57:32

achieve nuclear weapons uh is Iraq and

57:35

Iran.

57:36

>> The invasion happens months later.

57:40

>> You keep going on this

57:42

>> No weapons of mass destruction are found in Iraq.

57:44

This is a fragment of a 2009 US State

57:48

Department cable released by Wikileaks.

57:50

Netanyahu tells members of Congress that

57:53

Iran is one or two years away from being

57:56

capable of developing nuclear weapons.

57:59

>> They're online.

58:00

>> It's 2012 and Netanyahu is holding up

58:03

his infamous cartoon bomb at the UN

58:06

General Assembly.

58:07

>> By next spring, at most by next summer

58:10

at current enrichment rates, they will

58:13

have finished the medium enrichment

58:17

and move on to the final stage.

58:20

From there, it's only a few months,

58:22

possibly a few weeks

58:24

before they get enough enriched uranium

58:28

for the first bomb.

58:30

>> And now, 33 years after Netanyahu's

58:34

first so-called imminent warning, Israel

58:37

attacks Iran.

58:38

>> And if not stopped, Iran could produce a

58:41

nuclear weapon in a very short time. It

58:44

could be a year. It could be within a

58:46

few months, less than a year. That's

58:48

despite the US director of national

58:50

intelligence saying Iran isn't building

58:52

a nuclear weapon months earlier.

58:55

>> Iran lied.

58:57

>> But for Netanyahu

58:59

>> big time,

58:59

>> the slogan has been the same for

59:01

decades.

59:02

>> Like how he said big time,

59:06

>> we get it.

59:08

>> Yeah. So, but here's the thing. Maybe

59:10

he's kind of right, but they haven't

59:13

ever done it

59:14

>> right. Yeah. Yeah. Well, they certainly

59:16

are enriching uranium to a point where

59:19

it's more than you need for power.

59:21

>> Right. So why why are they doing that?

59:23

>> Right.

59:23

>> Right. And so I guess for Trump the

59:25

calculation is like I'm in my last term.

59:28

I might as well, you know, roll the

59:30

dice. Go go go and deal with it now.

59:32

Could end very badly as we've discussed.

59:34

>> Yeah.

59:35

>> There is a way that it ends. Well, we

59:37

will see what happens. And I just

59:39

honestly don't think anyone knows how

59:40

it's going to happen.

59:41

>> I don't think anyone knows. How

59:42

can you? So many moving parts. It's like

59:45

if I ask you who's, you know, Dana White

59:48

just announced the UFC card for the

59:49

White House, right? Yeah.

59:50

>> Who's going to win, Justin Gaethje?

59:53

You you are not going to say this is

59:55

what's going to happen,

59:56

>> right? You don't know

59:57

>> cuz nobody knows. Yeah.

59:58

>> Right. And this is like a hundred times

60:00

more complicated than that,

60:01

>> right? Yeah.

60:03

>> Yeah. At least.

60:04

>> At least. Yeah.

60:05

>> Yeah. Probably several thousand times

60:08

more. So, it's a gamble. And you

60:10

got to... I mean, you've got to think if

60:12

this goes badly,

60:15

this is legacy defining for all

60:17

involved.

60:18

>> For all involved.

60:19

>> Yeah.

60:20

>> This will... whatever you've done

60:22

up to that point, it's like Blair and

60:24

Bush.

60:24

>> Mhm.

60:25

>> Tony Blair, people forget in our

60:27

country, Tony Blair was immensely

60:29

popular

60:30

and then Iraq happened. And if

60:33

you mention Tony Blair now, the only

60:34

thing anyone remembers is Iraq. So for

60:36

context, Tony Blair was one of the

60:40

people. Tony Blair is a hero in Kosovo

60:42

because a large

60:46

part of the reason the war in Kosovo

60:47

ended was Tony Blair. I think I saw

60:50

a story that

60:51

people were naming their kids Tony

60:53

Blair, right?

60:54

>> They regret it now.

60:58

>> He was one of the central people in the

61:00

Northern Ireland peace deal bringing

61:02

peace to Northern Ireland for people of

61:03

our age who grew up in the UK.

61:06

We never thought we'd see peace in

61:07

Northern Ireland. Northern Ireland was a

61:09

glorified civil war and it had been for

61:12

however long right the way from the 60s

61:14

the 70s the 80s and he was one of those

61:17

people instrumental in bringing peace to

61:19

Northern Ireland. It was a miracle. It

61:21

was an it was a total miracle that that

61:23

happened the Good Friday Agreement. And

61:25

it was other people as well like Mo

61:26

Mowlam, etc. So you look at Blair: he was

61:30

on a roll. He he must have thought to

61:32

himself, everything I do turns to gold

61:35

here. I have achieved peace in Kosovo,

61:38

peace in Northern Ireland. Why can't I

61:41

invade Iraq and Afghanistan and install

61:44

democracies and bring peace to the

61:46

Middle East? I've done it to Northern

61:47

Ireland. No one ever thought that could

61:49

happen.

61:50

>> So this will whichever way it goes, I

61:54

think it will be defining for the people

61:56

involved. If it goes well, this is like

62:00

the biggest, you know, uh, Hail Mary

62:03

touchdown in history, in some ways. If it goes

62:05

badly,

62:07

that will define this. Certainly, from

62:09

an outside perspective, that's what I

62:11

see. It's going to define the the

62:12

presidency. I mean, I I don't know how

62:14

you can argue with that really. Can you?

62:16

>> No. No, I don't know how you can argue

62:18

with it either. But that's what that's

62:19

what's so interesting about people that

62:21

absolutely know how it's going to play

62:22

out.

62:24

>> You know, you don't. And then there's

62:25

also uh the New York Times thing. I sent

62:27

it to you. Did they change that, Jamie,

62:29

or did they take it down?

62:32

>> I I just sent it to you.

62:35

>> So, what does it say? So, does New York

62:36

Times still have it up?

62:37

>> It's from a day ago, it says.

62:39

>> Wow.

62:40

>> Yeah.

62:40

>> Crowd gathered on Monday. They didn't

62:42

say which Monday.

62:46

>> Yeah,

62:46

>> it was a Monday. It was just six

62:49

[ __ ] years ago.

62:51

>> Yeah.

62:51

>> Um, yeah. And then again, this is

62:54

the problem where everything is

62:56

polarized and politicized.

62:58

>> Well, I think your point about people

63:00

wanting to believe something is so true.

63:03

>> If if if whenever anything like this

63:05

happens, you instantly get these camps,

63:07

right? You've got the anti-war camp,

63:09

you've got the pro-war camp, you've got

63:10

the this camp, you've got the

63:11

anti-Israel camp, you've got the

63:12

pro-Israel camp, and everyone like

63:15

information is no longer about

63:16

information,

63:17

>> right?

63:17

>> It's just fodder for your information war

63:20

that you're fighting.

63:21

>> Exactly. And then on top of that, and

63:24

look, this is a kind of I'm cutting my

63:26

own balls off here because I I make good

63:28

money from posting stuff on X, right?

63:30

But the monetization of content has made

63:33

things different. And we can all see it

63:35

in our feed, right? Yeah.

63:36

>> You've seen this. I mean, you must see

63:38

it.

63:38

>> So now you have people who are basically

63:41

like I go on Twitter on X to express my

63:43

opinion

63:44

>> and to engage in discussion with people

63:46

who have a different opinion. That's

63:47

what I do, right? But there are now lots

63:50

and lots of people who go to work. They

63:53

go to X to work,

63:55

>> right?

63:55

>> And that's what they're doing now. The

63:58

incentive structure of that is not

64:01

conducive to a healthy debate,

64:03

>> right,

64:03

>> at all. What you've got now is people

64:06

going, "Okay, a thing has happened. What

64:08

is my angle? Venezuela got hit? It was Israel's

64:10

fault. Okay, here's some content

64:11

about that."

64:13

>> And it's no longer authentic

64:14

communication, unfortunately.

64:15

>> And that's just actual people doing it,

64:18

>> right? Yeah.

64:18

>> And then you've got AI on top of that.

64:20

And then you've got foreign bot farms

64:22

>> and foreign governments trying to

64:24

influence this [ __ ]

64:25

>> There's this one currently popular page

64:28

that I follow that's clearly AI. I

64:30

mean you could just read it and tell

64:32

that it's AI and it gets immense amounts

64:35

of engagement. Heavily right-wing

64:38

like really well written, you know,

64:41

funny,

64:42

>> you know, and you could, but

64:44

yeah, but not human. Not real

64:47

funny, but technically funny. You

64:50

know what I'm saying? Like insults

64:52

that are technically funny but for

64:55

whatever reason you don't digest it.

64:57

It's like, remember that fake fat stuff

65:00

that just went right through you like

65:01

diarrhea.

65:02

>> That's what it's like.

65:03

>> It's got no soul. There's no soul to

65:05

>> it. That's just one. I mean how many how

65:08

many of them exist and how many state

65:10

actors are running bot farms?

65:12

>> Yeah.

65:13

>> So it's we don't know what the [ __ ] is

65:14

going on at any given time. No,

65:17

>> but it's the incentives that have become

65:18

perverted because it's no longer like

65:20

Konstantin said about expressing opinion

65:22

or wanting to get involved in dialogue

65:24

or debate. What you've got now is people

65:26

like you said, earning their livings.

65:28

So if you know if you need to pay your

65:31

mortgage at the end of the month or you

65:33

need to pay a team or you have a

65:35

company, you're not going to put out a

65:37

nuanced take. Why would you? It's going

65:38

to get minimal engagement. You are going

65:40

to put out something that is going to

65:42

trigger, that is going to be incendiary,

65:44

that is going to drive engagement, that

65:46

is going to get people upset or angry

65:48

or agree with you and

65:50

therefore more likely to share. So

65:52

that's the content you're going to put

65:53

out cuz that's the content that's going

65:54

to make you the most dough.

65:56

>> 100%.

65:57

>> And then you have people that are

65:58

pushing for this idea that no one should

66:00

be able to post online unless you're

66:01

using your real name

66:03

>> and you show some sort of an ID,

66:05

>> which is also kind of crazy. Yeah,

66:08

there's downsides to that for sure, but

66:09

I also I do understand why they're

66:11

saying it.

66:12

>> I understand it, too. I just think it's

66:13

a slippery slope that stops all

66:14

whistleblowers. And imagine you are a

66:18

regime critic in Iran and you're trying

66:20

to post news from Iran under, you know...

66:23

There's definitely... But, you know, I

66:25

think Jordan Peterson was actually one

66:26

of the first people that suggested this

66:28

thing and I understand why

66:31

>> because

66:32

>> It's like the windscreen, the

66:34

windshield effect, in your car. The way

66:36

you and I behave face to face is not the

66:38

way people will behave when they're

66:40

sitting in their truck and someone cut

66:41

them up in traffic. Yeah.

66:52

So, I understand it.

66:53

>> But times a million.

66:55

>> Times a million, right? And then you've

66:57

got foreign bots and all this [ __ ]

66:59

>> And then taking away people's right to

67:02

anonymity online, like [ __ ] me that, you

67:04

know, the second, third, fourth order

67:05

consequences of that.

67:06

>> Yeah.

67:07

>> Are pretty [ __ ] crazy as well. I

67:08

found another picture

67:10

>> of that area from what it says it was

67:12

yesterday.

67:13

>> So I don't know that it was not real.

67:16

>> So this is from which website

67:19

>> it says uh I typed it into Perplexity

67:22

and I'm clicking around on pictures to

67:23

find out where they're coming from one

67:25

by one.

67:26

>> So is it possible that the

67:29

New York Times put the wrong footage but

67:32

it was a similar kind of protest in the

67:36

same spot? Yeah, that's my

67:38

understanding.

67:38

>> This is a similar photo. This is an AP

67:40

source on this photo.

67:41

>> Okay. So, all the New York Times did is

67:43

get the wrong photo of a bunch of people

67:46

gathering.

67:46

>> I'll note this one, which is at night,

67:48

so it is definitely a different photo.

67:49

This is from January.

67:51

>> Mhm.

67:51

>> This one is.

67:52

>> Yeah. But this is not the same photo

67:53

that we looked at before.

67:54

>> Right. So, the other one though,

67:56

>> this one is from a day ago. Very

67:59

similar.

68:00

>> Well, that's very different right there.

68:02

That's small. Well, but it's the same

68:04

angle of this little pool or whatever

68:06

is on it. So, what they did was just use

68:09

the wrong footage, but a similar sort of

68:12

a protest. So, it was just an

68:14

unfortunate error, not like reframing

68:16

the narrative with propaganda. That I'm

68:18

not sure cuz even these comments are like

68:20

'AI' and 'fake photo,' but that's why I

68:23

was trying to find other sources of it

68:24

not

68:25

>> what does Grok have to say

68:26

>> I mean, I used Perplexity to do

68:28

it and it's

68:29

>> right but I mean on the post usually if

68:31

someone posts something on X, they say,

68:34

"Grok, is this true? Is this footage legit?"

68:36

>> oh that's it okay it's on Instagram

68:38

>> I could find it on

68:39

>> and so Perplexity says that there is a

68:42

legitimate-sized protest that's like

68:44

>> yeah I just asked if crowds gathered

68:45

there yesterday and it says: multiple

68:47

reports indicate that thousands of

68:49

people gathered in the central square in

68:50

Tehran yesterday to show support and pledge

68:53

allegiance to the new supreme leader. How do

68:55

you say his name?

68:57

>> Uh

68:57

>> Mojtaba...

68:59

Mojtaba Khamenei.

69:00

>> Mojtaba.

69:02

>> He's the son of the guy they killed.

69:04

>> Yeah.

69:04

>> How many people have they killed so far?

69:06

Like the the leaders there's

69:08

>> I don't know, but it it's probably up to

69:09

the low hundreds, I would imagine,

69:11

>> because they had one guy last week that

69:13

was the new guy and they whacked him

69:15

almost immediately. Yeah, I didn't tweet

69:16

this, but when this guy was appointed, I

69:18

wanted to say like congratulations to

69:20

him and condolences to his family.

69:23

>> I was like, this is a bit full on.

69:24

>> Yeah.

69:26

>> The problem is, you're right.

69:28

>> Yeah. Yeah.

69:30

>> Yeah.

69:31

>> Yeah. I don't think he's going to last

69:32

very long cuz he seems pretty hardline

69:34

as well.

69:34

>> Yeah. Well, I mean, they killed his dad.

69:37

>> Yeah. That

69:38

doesn't tend to deradicalize you very

69:40

much.

69:40

>> You know, that's going to piss you off.

69:41

>> Yeah. So Jamie put in, "Did the New York

69:43

Times use an old photo for this event?"

69:46

Evidence so far suggests the New York

69:47

Times used a recent photo for this

69:49

week's gathering, not an old archive

69:51

image, though many commenters have

69:53

accused the opposite. Interesting.

69:58

Instagram's own post of the square

70:01

crowd. Multiple Iranian users claim the

70:04

image is fake or AI or from 2020, and

70:07

several assert that it's not

70:08

representative of real public sentiment.

70:10

However, another Facebook thread

70:12

referencing the same image states that

70:14

it was taken by New York Times

70:16

photographer Arash

70:19

Khamooshi on Monday, March 9th,

70:22

2026, which matches the article date and

70:24

captions used by other outlets showing

70:26

the same scene.

70:28

>> See, this is the the fog of confusion

70:31

that exists on social media.

70:33

>> Yep.

70:34

>> Isn't it worrying that we can no longer

70:36

tell what's real? We're already

70:38

at that point. And when you think of

70:40

where we were last year where you could

70:42

really tell pretty much what was AI and

70:44

what wasn't. Right now we're in the murky

70:47

waters of 'is this, isn't it?' It's going

70:49

to come to a point pretty soon where

70:50

everything is going to look exactly like

70:53

real life.

70:53

>> Well, as soon as AI can't detect it,

70:55

that's when we're [ __ ] right? And

70:57

then so I talked to Marc Andreessen about

70:59

this and his recommendation was that

71:00

everything should be on the

71:01

blockchain. So you're going to

71:03

be able to tell whether or not footage

71:05

has been altered, what, you know,

71:08

the chain of custody of this image has

71:11

been, where it started, where you know
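A minimal sketch of the chain-of-custody idea being described here, in Python, assuming a simplified in-memory ledger rather than an actual blockchain or a provenance standard like C2PA; every name in it is a hypothetical illustration, not a real library's API:

import hashlib
import json
from dataclasses import dataclass

# Hypothetical sketch: each record commits to the image bytes and to the
# previous record, so altering the image (or rewriting the history) changes
# the hashes and is detectable on verification.

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

@dataclass
class Record:
    image_hash: str  # hash of the image bytes at this step
    action: str      # e.g. "captured", "cropped", "color-corrected"
    prev_hash: str   # hash of the previous record, chaining the history

    def record_hash(self) -> str:
        payload = json.dumps(self.__dict__, sort_keys=True).encode()
        return sha256(payload)

class CustodyChain:
    def __init__(self) -> None:
        self.records: list[Record] = []

    def append(self, image_bytes: bytes, action: str) -> None:
        prev = self.records[-1].record_hash() if self.records else "genesis"
        self.records.append(Record(sha256(image_bytes), action, prev))

    def verify(self, image_bytes: bytes) -> bool:
        # Walk the chain links, then confirm the final entry matches
        # the image we were actually handed.
        prev = "genesis"
        for rec in self.records:
            if rec.prev_hash != prev:
                return False
            prev = rec.record_hash()
        return bool(self.records) and self.records[-1].image_hash == sha256(image_bytes)

chain = CustodyChain()
chain.append(b"raw sensor bytes", "captured")
chain.append(b"cropped bytes", "cropped")
print(chain.verify(b"cropped bytes"))   # True: bytes match the recorded history
print(chain.verify(b"deepfaked bytes")) # False: altered bytes fail verification

The design point: each record commits to both the image bytes and the previous record, so silently swapping in altered footage fails verification; anchoring the record hashes on a public blockchain, as suggested above, would also keep the ledger itself from being quietly rewritten.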

71:15

>> I mean it's terrifying, isn't it?

71:17

>> Oh, it is. Yeah.

71:18

>> Because you're going to think is that

71:20

going to be the end of journalism

71:21

really,

71:22

>> right? Is that going to be almost I

71:25

think we're talking about this and this

71:27

is really really important but what's

71:29

coming with AI is even more important

71:31

and even the people you talk to in

71:34

the field have no idea what's going to

71:37

be the second, third, fourth order

71:38

consequence

71:39

>> right

71:39

>> I know. I mean, there's so much to

71:43

be excited about with AI I think it

71:45

blinds a lot of people to the like not

71:47

exciting parts of it

71:49

>> Well, ultimately, if you just looked

71:52

at what it's ultimately going to

71:54

lead to? It's going to lead to something

71:56

that's way smarter than us. And why

71:58

would it listen to us anymore?

71:59

>> Well, you've seen I'm sure you've seen

72:01

the stuff about how it will blackmail,

72:04

right? So that by definition that means

72:06

it has a survival instinct.

72:08

>> Yeah.

72:08

>> And if it has a survival instinct, by

72:10

definition, it means there is a priority

72:12

that it has which is above humans.

72:14

>> Yeah.

72:15

>> By definition, that's what a survival

72:16

instinct means. It means you care more

72:18

about yourself than you do about anyone

72:20

else. Right.

72:20

>> Right. So if AI has a survival instinct,

72:24

we are not going to be its number one

72:26

priority.

72:27

>> Not only that, it doesn't seem to

72:30

differentiate between using nuclear

72:32

weapons or other weapons. And when

72:34

they've done these war game simulations

72:36

with AI, they prefer to use nuclear

72:39

weapons.

72:39

>> Well, they're more effective, right?

72:41

>> But this is the thing like they're not

72:42

scared of this idea like, oh my god,

72:44

you're just going to dust a city.

72:46

They're like, oh, that's the way to do

72:46

it.

72:47

>> The numbers on a chart, right? Well, you

72:48

want to get your goal accomplished. How

72:51

does AI accomplish its goal with

72:52

whatever the best available is? Oh,

72:54

that's the bomb.

72:55

>> Yeah.

72:56

>> It's efficient.

72:58

>> Yeah.

72:58

>> You know, and then a mutual friend of

73:01

ours, Melissa Chen, was telling me that

73:03

there's a Chinese, one of the Chinese

73:04

robotics companies. It's called Skynet.

73:08

>> Oh god.

73:09

>> And they have they released a robot

73:10

called the T900. And I'm like, who says

73:12

you know, the CCP don't have a sense

73:15

of humor?

73:16

>> That's that's wild. actually true.

73:18

>> Yeah.

73:19

>> Well, you've seen the robots that they

73:20

have now that will work in your home and

73:22

like fold your sheets and make your bed

73:25

and stuff and do it remarkably

73:27

humanlike.

73:29

>> There's a video that was released

73:31

yesterday. Again, I don't know if it's

73:32

real, but it looks real. It looks like

73:35

an actual robot that's making your bed.

73:37

And they've gotten the dexterity to the

73:40

point where you could imagine things

73:42

like this happening. I think this is one

73:43

of the reason why Elon is shifting his

73:45

focus away from some Tesla models so

73:48

that they can re-set up one of their

73:50

factories to make these Optimus robots

73:53

that you're going to have as home

73:54

companions

73:55

>> and they're going to be able to do

73:57

kitchen work for you and maybe even

73:59

cook.

74:00

>> My acupuncturist, she's Chinese. She

74:02

went back to China and she was saying

74:04

she stayed at the hotel and like most of

74:06

the services there is provided by

74:07

robots. So, like, she went to her

74:10

room, ordered some food, 3 minutes,

74:12

knock, and [ __ ] robot delivering the

74:15

food.

74:15

>> An actual humanoid looking robot.

74:17

>> Uh, no, I don't think so.

74:18

>> Cuz there's a restaurant out here that

74:19

you go to and when you order drinks, it

74:21

comes by on a little robot trolley.

74:23

>> Yeah. Yeah, we have that.

74:24

>> You have to say accept. You take your

74:26

tea. Yeah, it's kind of cool. It's fun.

74:29

Here it is.

74:29

>> It's not called Skynet, I don't believe,

74:31

but

74:31

>> Oh, sorry. Company. They did name it

74:34

after the Terminator, I think, but I can't tell what

74:36

this is. T800. The T900 is in

74:40

development.

74:41

>> Look what it looks like.

74:42

>> It's in a weird movie here, so it could be for a

74:45

movie and we're misunderstanding.

74:47

>> That looks fake. Yeah, that looks fake

74:48

right there.

74:49

>> The robot itself looks fake.

74:50

>> That's why like hearing come the

74:52

movement doesn't look

74:53

>> this doesn't seem

74:54

>> Well, it just it's in that. First of

74:56

all, I don't like how it's lit.

74:57

>> I don't like how this room is lit. I

74:59

feel like

75:00

>> for a film.

75:02

>> Yeah.

75:02

>> Yeah. That's a [ __ ] ass function.

75:04

>> I think that's a mistake.

75:07

Joe's like, "This is my area of

75:08

expertise." No way.

75:09

>> A lot of videos of it that say it's

75:11

real. But

75:11

>> yeah,

75:12

>> I like how the bag went flying like

75:14

it's on a rail, you know? It's not even

75:16

stationary cuz you don't want

75:18

to really see how hard it can hit. I can't get this

75:20

Forbes article, but it says 40 grand.

75:22

>> Wow.

75:23

>> But I don't

75:24

>> right now.

75:25

>> No, that's I don't know. This might be

75:26

[ __ ]

75:27

>> Yeah. Look at it. It's got the Iron Man

75:28

thing in the center of it chest. That's

75:30

pretty dope.

75:33

Hm.

75:36

>> I would just say 'Engine AI' makes

75:39

it right away sound like it's an AI

75:42

content company, not a robot company.

75:45

>> Well, that video looked very AI-like,

75:48

didn't it? There's something about it.

75:50

You know, your brain recognizes

75:51

miniature cars. You ever see like a

75:53

miniature car? Like, you know, you know

75:55

how they have those like really well

75:57

done miniature cars that people like to

75:59

collect and it's like a tiny Porsche,

76:02

>> but your brain knows

76:04

>> like your brain looks at it and goes,

76:06

"There's something wrong here. This is

76:07

not real."

76:08

>> Yeah.

76:08

>> That's how I felt looking at that robot.

76:10

Like my brain was like,

76:12

>> "That's not a real thing." Throwing

76:14

kicks.

76:15

>> Yeah. But it's going to come to a point

76:18

where it's like, is that real?

76:19

>> Oh, yeah. I mean, probably

76:22

ChatGPT-5 can already probably do it better

76:24

than that. You know, we don't know what

76:26

the newest iterations of these things

76:28

are, and they're improving radically all

76:30

the time.

76:31

>> There's no I just don't believe it. The

76:33

beginning of this video says there's no

76:34

CGI, which I I don't know. I don't know

76:36

why we have to believe that. But,

76:37

>> bro, this looks fake to me.

76:38

>> Fake [ __ ] I know.

76:39

>> This looks fake to me. That does not

76:40

look real.

76:41

>> There's articles all over saying it's

76:42

real, but doesn't look real at all.

76:44

>> Yeah, but I mean, why wouldn't you have

76:45

it more well lit? Like if I was going to

76:47

do something like this, I would have

76:49

spotlights on it and people next to it

76:51

so I could examine their shadows and its

76:53

shadow. This is weird.

76:54

>> Yeah, this is like a

76:55

>> looks fake.

76:56

>> Why would you have like

76:57

>> Why is the light coming through the

76:58

corner of the window like that?

77:00

>> Also, why would you make a robot that

77:02

does, like, martial arts?

77:04

>> Just shoot people, bro.

77:05

>> Right.

77:06

>> Yeah. Why would you throw that [ __ ]

77:08

kick, too? That's a 360 roundhouse kick

77:10

that almost never lands.

77:12

>> That's really hard to pull off. Yeah,

77:14

but you're saying that as a human. Maybe

77:16

as an AI, you land it every time. Slow,

77:18

>> but surely if you're a robot, you just

77:19

grab their neck with your metal claw and

77:21

crush it. I mean,

77:22

>> just run after them and headbutt them

77:24

and knock them unconscious. Your head's

77:25

made out of metal, right?

77:26

>> The whole thing is crazy. Like, why

77:28

would you be throwing wheel kicks?

77:30

>> Like the, you know, even if you like

77:34

went over UFC fights and say like,

77:36

"What's the most effective techniques

77:37

that work most of the time?" Why would

77:39

you program in 360 roundhouse kicks?

77:41

That [ __ ] never comes up.

77:43

>> yeah I mean it does look cool though.

77:45

>> Yeah but I I can think of one fight uh

77:48

Yair Rodriguez pulled it off on BJ Penn

77:50

>> but there was BJ Penn towards the latter

77:53

end of his career. Yair Rodriguez in

77:55

his peak, and Yair is exceptionally

77:57

talented kind of a freak with his kicks.

78:00

>> But it's almost like he was showing off.

78:01

He already had BJ really hurt and he

78:03

just threw a 360 roundhouse kick and he

78:05

hit him in the face. It was crazy.

78:07

>> But that this thing is doing that just

78:09

to show you it does martial arts. Why

78:11

would you need martial arts? Like you

78:13

you should have like a thousand bullets

78:14

on you. Just just like gun everybody

78:17

down with your fingertips.

78:18

>> RoboCop.

78:19

>> Yeah. Why wouldn't you turn your like,

78:20

you know, like like Iron Man does? Shoot

78:23

fire out of your palms.

78:25

>> The future's bright, Joe.

78:27

>> Well, we're also kind of being

78:29

bullshitted, I think. I mean, like, is

78:31

there a way to analyze that video?

78:33

>> God, this is I'm going this rabbit hole

78:35

is strange. This is a website that

78:37

they've made that says you can buy it.

78:38

When you click on buy now, it takes you

78:40

somewhere else. And I think that that's

78:41

the first signal.

78:42

>> And it steals your IP address.

78:44

>> Yeah, that's where I'm like, I'm not

78:44

clicking.

78:45

>> Gets all your credit card information.

78:46

>> But it's a fully made website. They have

78:48

a team. They got a CEO. They've got

78:50

other things. It just doesn't seem...

78:52

This all looks fake to me. Like it's for

78:54

a movie. Like this is like

78:56

>> Maybe it is. Maybe this is like a setup

78:58

for a movie and we're being [ __ ] with,

79:00

>> right?

79:01

>> Well, we're giving them a lot of free

79:05

even Googling it. There's lots of

79:06

articles about it. People are talking

79:08

about it like it's real and discussing

79:09

it like it's real. No one that I've even

79:11

seen is like, "This is obviously fake.

79:12

This is obviously AI."

79:14

>> I like how it's got the Sylon eyes.

79:15

>> Yeah.

79:16

>> By your command.

79:18

>> They've got uh other products here for

79:20

sale, which those fit in line with other

79:22

robots who

79:23

>> click on that. I don't know what the

79:24

[ __ ] this is,

79:25

>> bro. That's the guy from Monsters, Inc.

79:26

>> Yeah,

79:27

>> the one big eyeball.

79:28

>> It's a really well-made website. It

79:30

looks nice.

79:31

>> Interesting.

79:32

>> They did some good fun work, but it just

79:33

seems like a fun project someone made.

79:36

>> Yeah.

79:37

>> Stay tuned for this one. You can't buy

79:38

this yet. Yeah. Give me a few years.

79:40

What about the dog?

79:41

>> What about the guy who's just got legs?

79:43

That one's weird.

79:45

>> What's that?

79:48

>> It's like I don't want him touching me.

79:50

>> Just run around my house and do some

79:51

stuff.

79:52

>> Yeah. Expandable bipedal robot that

79:54

supports user customization, blah blah blah. Uh

79:57

watch this with my purchase. Now you can

79:58

already see the website at the bottom.

80:00

That doesn't look right. It's 3.cn. So it's a

80:02

China website.

80:03

>> Mhm.

80:04

>> Not this blankly. Probably getting

80:06

>> it's floating.

80:06

>> Okay, there we go.

80:07

>> It's in Chinese. See it taken somewhere

80:09

else now. It's like what is that?

80:11

>> It's definitely not a buy now.

80:14

>> Maybe that red thing is

80:15

>> No, this is like this is I think

80:17

>> register your interest. Maybe

80:18

>> like a backend website. Interesting.

80:20

>> They just didn't click the right link

80:22

here cuz it opened up a different

80:23

>> I don't think this is but this is

80:26

>> whatever this is is very they did it

80:28

well.

80:29

>> Interesting.

80:31

>> That is

80:31

>> even if it's a a college kid making a

80:33

project.

80:34

>> Good job.

80:36

Well, I take full responsibility for

80:37

that one.

80:38

>> I know. That's fine. I mean,

80:39

>> I take responsibility

80:44

Anybody can make a Forbes article. That's kind

80:46

of another

80:47

>> what?

80:47

>> Yeah. Not anybody, but like we can make

80:49

Forbes articles that say all

80:51

sorts of stuff.

80:52

>> It's not coming from Forbes'

80:54

>> editorial team, per se.

80:56

>> Well, but you could publish on

80:58

Forbes. Uh, I don't want

81:00

to speak out of turn specifically, but

81:01

like I've seen there's so many like uh

81:03

reviews for video games that pop up like

81:05

every single day. It's like, you

81:07

can be a contributor I believe is what

81:08

it would be. Not like

81:10

>> Oh, and maybe they have a bad editorial

81:12

team and you could sneak this through.

81:14

>> Sure.

81:15

>> And just pretend that there really is.

81:16

That would be a great prank to pull.

81:19

>> Yeah,

81:19

>> it's a great hoax.

81:20

>> Did you follow the Moltbook thing?

81:22

>> No.

81:23

>> I mean... Did you follow Moltbook?

81:25

>> Yeah, I just actually saw that. I

81:26

think Meta just bought it today.

81:28

>> Meta just bought it today. This is

81:29

>> I thought it was fake.

81:30

>> Was that fake as well?

81:32

>> No... I don't know why they would

81:33

have bought

81:33

>> How weird is it that we have to worry about

81:35

everything.

81:35

>> Francis is just here spreading fake news

81:37

the whole

81:38

>> I didn't... I was just very cynical about

81:40

it cuz the idea of it sounds right

81:42

but like that actual bots are making a

81:44

social network to do stuff and talk

81:46

about us and whatever kind of sounds too

81:49

far into the sci-fi.

81:50

>> So this is the social network for AI

81:53

agents. I have heard about this.

81:56

>> Well, they complain about humans.

81:58

Is that right?

81:58

>> Yeah.

81:59

>> These [ __ ] are

82:00

>> Yeah.

82:01

>> Yeah. And apparently they created their

82:03

own language and they talked amongst

82:05

themselves so that we wouldn't be able

82:06

to access and see what they were talking

82:08

about.

82:08

>> Yeah. That's really fun. Did you see

82:10

when they got all the AI agents to talk

82:11

to each other and started using

82:12

Sanskrit?

82:13

>> No.

82:14

>> Yeah. They got these different large

82:16

language models to communicate with each

82:18

other and they eventually broke out into

82:19

Sanskrit.

82:20

>> Wow.

82:23

It's very strange.

82:24

>> Maybe these guys in um Iran are right.

82:28

Like maybe this is the apocalypse. Maybe

82:30

this is how it comes about. Maybe we're

82:32

we're looking at each other and we're

82:34

going to bring about these [ __ ]

82:36

and that's what's really going to be the

82:38

end of civilization.

82:39

>> Yeah. And places like Iran are the only

82:40

place you're going to be able to hide as

82:42

as a human cuz it's the one place that

82:43

hasn't adopted all this [ __ ]

82:45

right?

82:46

>> Yeah. Maybe Afghanistan's the spot to

82:47

go. Just live like a goat herder.

82:49

>> Yeah. We're going to be like Bin Laden

82:51

just living in a cave.

82:54

>> I mean, you're being very negative,

82:56

boys. There's another option.

82:58

>> There's another option. There's another

83:00

option.

83:00

>> What's the other option?

83:01

>> We all become Amish.

83:03

>> Oh, okay. But then we're run by AI.

83:06

Yeah.

83:06

>> We're Amish and we live in our

83:08

little communities, but we have no say

83:09

on how the world works. So, this is the

83:11

real fear is that we're no longer the

83:13

apex intelligence of the planet. And

83:16

that seems to already be the case.

83:18

>> Yeah. This is for the Forbes thing I was

83:20

talking about. The article we had, I

83:22

don't think was specifically this, but

83:23

I've seen many articles like this where

83:25

people can submit

83:26

>> one-off original articles to the opinion

83:28

section, particularly for topics related

83:30

to business, tech, or... there you

83:32

go... or policy, by emailing pitches to

83:36

ideas@forbes.com. So yeah, like, so if

83:39

you're a person on the other end just

83:40

looking for clicks,

83:42

>> like that would be a good one. You see

83:44

it like, oh, this is a really well-written

83:45

article. Let's go to the website.

83:47

website looks legit. Oh, they're

83:49

throwing wheel kicks. I'm in.

83:54

>> Yeah. I mean,

83:56

>> [ __ ]

83:57

>> But again, the point being made

84:00

again is like, it's such a

84:03

terrifying world where you don't know if

84:05

what you're seeing is true. You don't

84:08

know if what you're reading is accurate,

84:10

right? To the point where you can't help

84:13

if that's the case, that the world you

84:15

live in is continually feeding you things

84:17

may or may not be true or altered or

84:19

doed. Wouldn't that just put you in a

84:21

state of paranoia after a while?

84:23

>> 100%. Now imagine if you are in the

84:26

Middle East and uh you bust out your

84:28

cell phone because a fiery cloud emerges

84:31

and Jesus is on a white horse and you

84:34

film it and you post it online. Who's

84:36

going to believe it? Right? This is the

84:39

real problem with Jesus returning. If he

84:42

returned now, no one would buy it. Like

84:45

we're getting into this. Like,

84:47

imagine Jesus is a real person or a real

84:49

God who's the son of God who's going to

84:51

come back. He really is. It's real. It's

84:53

all real.

84:54

>> It's happening at the same time where

84:55

you have no idea what's real.

84:58

And it all converges instantaneously

85:00

with the rise of sentient artificial

85:03

general super intelligence that has

85:05

complete autonomy, that's running all

85:08

the resources, everything, anything

85:11

that's attached to a computer, which is

85:12

basically everything. All of our power,

85:14

all of our, you know, everything fill in

85:16

the blank. Everything's run by

85:17

computers. And now AI has control of

85:19

everything and no longer wants to listen

85:22

to human beings. And Jesus returns.

85:29

Yeah. I mean, that might be what

85:31

everybody's talking about when they're

85:33

talking about Armageddon, when they're

85:34

talking about the end of civilization.

85:36

It might be this new thing that we're

85:38

creating.

85:39

>> Well, if that happens, I'll be rooting

85:41

for Jesus to return. I'll be Please,

85:43

Jesus. I'm not sure I believe in you,

85:45

but please come back.

85:46

>> Yeah, I'm not sure either. But I

85:48

mean, maybe he did. Maybe a historical

85:50

Jesus existed at one point in time. And

85:52

maybe what they're talking about is like

85:55

their version of the cycles of humanity

85:58

that other religions have talked about.

86:00

is that especially when you deal with

86:02

technology and power and civilization

86:04

that things get to a point where they

86:06

always go sideways and then there's dark

86:08

times, and then society, like

86:10

the yugas, you know

86:12

>> what are the yugas

86:13

>> the yugas are the cycles of civilization

86:16

that... let's, uh, I don't want to [ __ ] this

86:19

up, so let's, um, define them. We're in the

86:21

middle of Kali Yuga, which is the age

86:24

of confusion and

86:26

>> Feels like it.

86:27

>> yeah I mean it's odd how accurate these

86:32

cycles are when you look at historical

86:36

events and like what things were like

86:38

you know x amount of thousands of years

86:40

ago.

86:40

>> So Hindu cosmology

86:41

>> yes uh vast cosmic ages in Hindu

86:44

cosmology describe recurring cycles in

86:46

the moral and spiritual state of the

86:48

world. So the four yugas are Satya

86:52

Yuga, the first and most righteous age

86:54

often called the golden age marked by

86:56

truth, virtue and maximum dharma which

87:00

is moral order. Uh, Treta Yuga, the second

87:03

age uh dharma declines somewhat. Virtue

87:08

still predominates but imbalance begins.

87:11

Uh, how do you say that word? Dvapara

87:14

Yuga, the third age, with further

87:17

decline in righteousness and an increase

87:19

in conflict suffering and confusion and

87:21

then Kali Yuga, the fourth and darkest age

87:24

characterized by moral decay ignorance

87:26

and materialism with dharma at its

87:29

weakest. Okay, that's us.

87:31

>> Yeah.

87:31

>> Hindu cosmology treats yugas as

87:34

repeating cycles of creation, growth,

87:35

decline, and destruction rather than

87:38

one-time historical periods.

87:41

Yeah. Very interesting, right? Do

87:43

>> You know, it's even more noticeable

87:45

for us coming to America, I think,

87:47

because you know, we love America, but

87:48

one of the things that really stands out

87:49

is how materialistic people are and how

87:52

much money is like the number one thing

87:54

for everything now.

87:55

>> Yeah.

87:55

>> I find that really

87:57

it stands out to me.

87:59

>> Mhm. And I find it

88:02

weird in our game especially like in

88:04

media and podcasting and whatever like

88:07

because the way we think about what we

88:09

do is we're trying to produce content

88:11

that's actually of value to people

88:13

>> but we also meet a lot of people for

88:15

whom it's like a business, it's like

88:17

selling widgets, it's the same,

88:18

you know how do you get

88:20

>> how do you maximize your returns on your

88:21

investment you know

88:23

>> um and that to me... it shows

88:28

you that something is slightly off.

88:30

Yeah, it is. And you also get a

88:32

lot of

88:35

>> you get people that are making content

88:37

based only on the perceived

88:39

popularity of that content, not whether

88:41

or not they are really interested in

88:43

having these conversations and you feel

88:45

it when you're talking to these people

88:46

or when you're listening to these people

88:48

talk to each other rather.

88:49

>> Yeah. Um the clickbait stuff, a lot of

88:51

celebrity stuff,

88:53

>> you know. Um, uh, Bert Kreischer went on

88:55

Shannon Sharpe's podcast and he said they

88:57

basically have like a list of like

89:00

controversial things they could talk

89:01

about and you know and subjects they

89:03

think are going to get the most

89:04

amount of traction and that those are

89:06

the questions that he asked. You just

89:08

ask questions off of a list.

89:09

>> But from a business point of view, if

89:11

you take morality out of it, that's a

89:13

smart thing to do. Joe,

89:14

>> is it though? Is it though because like

89:16

what's the most popular show? It's

89:17

this one and why is this one the most

89:19

popular? Because I don't do that at all.

89:21

>> Mhm. But agreed, you're sort of an

89:25

outlier in that. There's people who make

89:27

very very very good living interviewing

89:30

those types of people, having that type

89:32

of approach and creating that type of

89:34

content. I know, but I think in the end

89:37

you bite off your nose to spite your face

89:39

cuz I think that you lose a certain

89:41

amount of authenticity. There's a

89:43

certain amount of like a legitimate

89:45

connection between you and whatever

89:47

you're talking about that it doesn't get

89:49

through to the people. Like if I talk to

89:51

someone, I'm only talking to them

89:52

because I want to like then I have a lot

89:55

of people on that are not even remotely

89:56

popular or famous, but I think they

89:58

wrote an interesting book or I think

90:00

they're involved in interesting research

90:01

or I think they've got a weird opinion

90:03

on something and I want to talk to them

90:04

about it or they have had a strange life

90:06

or, you know, they were an undercover

90:08

cop or whatever it is. I I I'm just

90:10

interested, right? And I think that

90:12

>> if you abandon that and only focus on

90:16

this person is famous or this person's

90:18

in the news or this is this is going to

90:19

get a lot of views,

90:21

>> you don't care as much about the

90:23

conversation you're having and the

90:24

people know. So like the person

90:26

listening and watching, they can feel

90:28

it.

90:29

>> No, I agree with that. But I also think

90:30

you could probably get a lot of clicks

90:32

by saying, I don't know, Erika Kirk

90:33

killed Charlie Kirk.

90:34

>> Right. You could do that, too. But

90:36

you're also you're you're playing a

90:38

weird game where you got to continually

90:41

go deeper and deeper and deeper. And now

90:42

Erika Kirk's a man. You know, you know,

90:45

you know what I'm saying?

90:46

>> That's probably next to

90:48

>> Has that already happened? I probably

90:49

That's probably already happened.

90:50

Someone's probably already floated that

90:52

one out there.

90:52

>> Yeah. And as you're saying, over time

90:54

you run out of Yeah.

90:55

>> You run out of road.

90:56

>> You're playing the wrong game. You're

90:58

playing a very

91:00

similar game to the game that like TMZ

91:03

is playing or any of these other like

91:05

things where you can get a lot of

91:07

traction, you can get a lot of views,

91:09

you know, but no one thinks you're being

91:11

authentic. No one like if you have a

91:13

take on world events and we're

91:15

incredibly sorry for the loss of this

91:17

person, like you don't really care and

91:19

they know you don't really care. So they

91:20

they know there's no sincerity. They

91:22

know you're not really connected to it.

91:24

And so in this weird age that we're

91:26

living in where you're not sure what's

91:28

real at the very least, you want the

91:31

person who's talking to be talking about

91:34

something in an honest way,

91:36

>> right?

91:37

>> And connecting with people in an honest

91:39

way because that's what we're missing.

91:41

And that might be the only thing we have

91:42

left once this AI [ __ ] goes live. Like

91:46

>> it's it's probably not even going to be

91:47

podcast. It's probably going to be

91:49

public speaking. It's going to you're

91:50

going to have to like talk to people in

91:52

groups and we're going to all have to

91:54

like work ideas out together because I

91:57

don't think you're going to be able to

91:58

know when you're communicating online

92:00

what's real and what's not real. We're

92:02

already in the fog where we haven't hit

92:04

the [ __ ] full hail storm of [ __ ]

92:07

that's coming our way.

92:08

>> Yeah. I agree with you, Joe,

92:10

and if you want something that

92:12

is sustainable, if you want something

92:14

that is nourishing, if

92:16

you want to create content that people

92:18

engage with, that is honest.

92:20

But I think there's a lot of people out

92:22

there who are just looking at it in a

92:24

very cynical way and they're optimizing

92:25

it for clicks, attention, and monetary

92:28

gain.

92:28

>> Yeah.

92:29

>> And if you want to create a a business

92:31

that can make money and that is that

92:34

doesn't require a lot of lift, we all

92:37

know what you can do.

92:38

>> That's the Eagles song, Dirty Laundry.

92:40

>> Yeah. You know, it's they've always been

92:42

>> It's real.

92:43

>> It's real.

92:44

>> Jamie's still on this do all the jumping

92:46

and [ __ ]

92:46

>> Thank you. It's from I find it real.

92:50

This is

92:50

>> Okay, that looks way more real. So, the

92:53

videos we were watching were [ __ ]

92:55

>> So, this went viral a while ago. They've

92:58

they had to come out and make other

92:59

videos. Oh, this is a different one. Uh

93:01

to say that

93:01

>> Oh my god, that's so I, Robot.

93:02

>> I know. They had that lined up

93:04

military style. There's a few different

93:06

companies in China that have gone viral

93:08

for posting videos that people in

93:10

America think are fake. And

93:12

>> that's why I had to go to CES and find

93:14

somebody else because they put out more

93:15

content that doesn't necessarily look

93:18

fake, but it doesn't look better.

93:20

>> Well, that doesn't look fake.

93:22

>> Yeah, this does not look fake because

93:23

this is people on the floor at CES,

93:24

>> right? But look how much more awkward its

93:26

movements are.

93:28

>> But they put out a video where the thing

93:29

is kicking the CEO.

93:32

Yeah. It almost looks real, but it's

93:35

not. It's tough. I'd like to see

93:38

more of it.

93:41

>> Is it funky lighting again?

93:42

>> No. Not that. It's when they're in the

93:45

ring with them.

93:46

>> Okay.

93:46

>> Hold on a second.

93:49

>> Okay. So, he's going to go ahead. Okay.

93:51

That looks much more real. That looks

93:53

much more real. Much more awkward.

93:56

>> Jesus.

93:56

>> But that didn't that looks fake. I tell

93:58

you what, I wouldn't I would not do that

94:00

if I

94:00

>> No, no, that looks real to me.

94:02

>> Stand in front of a hunk of metal that's

94:04

going to kick you.

94:05

>> The problem is it's slow motion. Let me

94:06

see it again. In In fact, did they show

94:08

it in real speed?

94:09

>> It's just this weird clip of it, which

94:10

is kind of strange, but

94:11

>> let me see it again.

94:14

Taking the first kick.

94:17

>> It looks real.

94:19

>> The only thing I would say is it's not

94:20

jumping up and doing spin kicks, but

94:22

it's doing some of this other stuff that

94:23

>> Well, that would be what I would teach

94:25

it. First of all, I wouldn't teach it to

94:27

do the spin kicks. I'd teach it to do

94:28

like a stepping front kick like that.

94:30

>> That's the [ __ ] they were showing that

94:31

people had problems with like we just

94:33

did. But man, that

94:36

>> the robots are doing crazy stuff like

94:38

>> Well, they definitely can do crazy

94:40

stuff. There was that one demonstration

94:41

they did in China. I I think you've seen

94:43

that once.

94:44

>> See, it's bouncing around like in a

94:45

fight position here like he's ready to

94:47

go,

94:47

>> shaking his legs out.

94:49

>> Wow. But let me see some wheel kicks.

94:52

>> That's the I mean,

94:53

>> see, that's the thing. Why is the one in

94:54

the corner looking depressed?

94:56

>> That's the older one.

94:59

>> Decommission. He's right now plotting

95:02

his strategy for blackmail to get

95:04

upgraded software.

95:06

>> Yeah.

95:07

>> I don't know. So that I mean

95:09

>> Well, so what we're

95:10

looking at probably had at least some AI

95:13

enhancements.

95:15

>> But the problem is they're not

95:16

saying they're not admitting that it is.

95:18

They're saying it's not.

95:19

>> And I go like that's tough.

95:21

>> Interesting. Well, I would want to see

95:22

this thing move in a similar way that

95:24

you're seeing in that video. I mean,

95:26

that thing shows remarkable agility

95:28

where it's jumping up in the air and

95:30

spinning around.

95:31

>> And this thing's not doing that.

95:33

>> No, it's moving very differently, isn't

95:34

it?

95:35

>> Yeah.

95:35

>> Yeah.

95:36

>> There's a stiffness to its movement.

95:38

>> Strange though.

95:39

>> Looks like you at the gym.

95:40

>> Yeah, it does.

95:42

>> Yeah.

95:44

>> Really does.

95:46

>> Um, there was that the Chinese

95:48

demonstration though. There was a a

95:50

demonstration where these people were on

95:52

a stage and they were doing martial arts

95:54

and then the robots came out and the

95:55

robots did martial arts. That looked

95:57

real,

95:57

>> right? Right.

95:58

>> That looked real, but it didn't.

96:00

>> Here's the other video we sort of saw.

96:02

>> Oh, this is crazy.

96:03

>> God, this is

96:04

>> this is crazy.

96:07

That's where they reload.

96:09

>> And these are real, right?

96:10

>> I think. Yeah, this is a different video

96:11

they had to post because people didn't

96:12

think these were real.

96:13

>> That looks real.

96:19

They look like they're the Unsullied.

96:21

>> So

96:22

>> from Game of Thrones.

96:23

>> What are these ones?

96:26

>> Who makes these?

96:27

>> I don't know.

96:28

>> I don't know. It doesn't

96:30

>> That marching sound is not comforting,

96:32

is it?

96:32

>> No.

96:33

>> It says world's first mass delivery of

96:35

humanoid robots.

96:36

>> Yeah. You're going to have cargo ships

96:37

filled with these headed to America.

96:40

>> Wonderful.

96:41

>> I mean, those are going to be the new

96:42

police officers.

96:43

>> Uh-huh.

96:44

>> Yeah.

96:44

>> Yeah. Yeah. That's not good. This is I

96:49

mean this is Terminator. This is the

96:50

movie. I mean and

96:53

>> if you really were imagining like if you

96:55

were trying to warn people of an

96:56

apocalypse and you told it through

96:59

stories for generation after generation

97:01

and then eventually people write down

97:03

their versions of this story and then it

97:06

goes to 2026 where this stuff is

97:09

actually happening. Maybe this is what

97:11

they were warning us about.

97:12

>> Yeah. Do you remember in the '80s and

97:14

the early '90s there was this run

97:16

of great movies talking about how the

97:18

robots are going to take over

97:20

>> sci-fi books as well. I mean Isaac

97:21

Asimov stuff was amazing.

97:23

>> Philip K. Dick, you know, um Do Androids

97:26

Dream of Electric Sheep?, which then

97:28

became Blade Runner

97:30

all of these. And then no one's making

97:32

those movies now are they?

97:34

>> No. Too close to home.

97:36

>> I guess I Robot was probably the last

97:38

one right?

97:39

>> Yeah, which was uh I, Robot. But which

97:42

was the one with Tom Cruise? Minority

97:44

Report which was based on a Philip K.

97:46

Dick.

97:46

>> Mhm.

97:47

>> But nobody's making them anymore cuz

97:49

everyone's like, "Dude, I I know this is

97:51

going to happen,

97:52

>> right?

97:52

>> I don't need to see this."

97:54

>> Well, they're also talking about using

97:55

AI to predict people's behavior. So,

97:58

they're talking about future crime

98:00

report, right?

98:01

>> Yeah. So, they've literally talked about

98:03

one of the ways that AI could be

98:04

implemented. You look at someone's

98:06

history. You look at someone's behavior

98:08

patterns. Look at what they're doing

98:09

now. And you predict, oh, this person

98:11

has been radicalized. They're about to

98:13

do X.

98:14

>> Yeah. And there is

98:15

>> they're about to tweet something.

98:18

>> Yeah. That's

98:20

>> arrest them.

98:21

>> Yeah. Graham Linehan picks up his phone.

98:23

Robot kicks down the door and arrests

98:25

him.

98:26

>> Graham.

98:26

>> He's doing right now, I think.

98:27

>> Yeah. Yeah. But he's in America. That's

98:29

why he's doing

98:30

>> I think that backfired. Yeah. I think

98:32

people were outraged by that cuz it's so

98:34

outrageous. Yeah. Then

98:35

>> you meet that guy at the airport and

98:37

arrest him. And it was right after he

98:39

did this podcast, by the way.

98:40

>> Yeah. Yeah.

98:41

>> Yeah. Yeah. I I remember that's a moment

98:45

even when I was talking to comedians who

98:47

were actually woke. They were like,

98:48

"Yeah, that this is you can't do this."

98:52

>> The thing is he didn't even do it in

98:53

England. So you're arresting someone

98:55

who's not a citizen of the United

98:57

Kingdom for a crime.

98:59

>> I mean, if we accept that framing that

99:01

they didn't even commit in the country,

99:02

>> right?

99:04

the

99:07

>> Yeah, it's pretty kooky that they

99:08

went with it.

99:09

>> Yeah.

99:10

>> Yeah.

99:10

>> Yeah.

99:10

>> And I know the reason is because every

99:14

police officers in airports in the UK

99:16

have guns,

99:18

>> but it's a really bad look. Like there's

99:20

five armed police officers

99:22

>> arresting a comedy writer.

99:24

>> Yeah. I bet you they felt bad doing it

99:26

as well because it's not them that's

99:28

making up these dumb

99:29

>> Oh, I'm sure.

99:29

>> Nobody signs up to like arrest comedy

99:32

writers,

99:32

>> right,

99:33

>> in airports. I don't I don't think

99:34

that's why the police do it. But the

99:36

rules have just got so

99:37

>> well, you see it in like the humiliation

99:39

that a lot of these police officers face

99:41

when they have to arrest someone for a

99:43

Facebook post,

99:43

>> right?

99:44

>> Like you could see like they are not

99:45

happy and when people are protesting and

99:48

yelling, are you [ __ ] serious? And

99:49

they're like, I'm just doing my job.

99:51

>> Yeah.

99:52

>> You know?

99:52

>> Yeah. And that's a large part of the

99:55

problem. We get, you know, former police

99:57

officers on the show and we got a lot

99:59

of, you know, cops and former cops who

100:01

watch the show and they talk to us about

100:03

the state that the British police force

100:05

is in and it's demoralization.

100:07

>> Yeah.

100:07

>> Yeah. The rank and file don't want any

100:09

of this [ __ ]

100:10

>> Well, same in America. A lot of

100:13

especially major blue cities where just

100:16

a few years ago they were running with

100:17

that defund the police [ __ ] and then

100:19

things obviously went sideways and most

100:21

of them sort of course corrected for the

100:23

most part except in narrative.

100:25

>> You know that it's not like there's

100:27

massive public support for the police officers

100:29

because they keep society together.

100:31

>> Like in Austin the cops responded in

100:34

a minute. One minute. That guy started

100:36

gunning people down at that bar. The

100:38

cops were there and killed him in a

100:39

minute.

100:39

>> It's incredible.

100:40

>> Incredible. And they should be applauded

100:42

for that. I mean that's amazing. I mean

100:45

that but you know even that like in this

100:49

city there hasn't been this big public

100:51

support of those officers, this big

100:53

celebration of those officers, this big

100:56

acknowledgement of the importance of

100:57

them and how they were willing to put

100:59

their life on the line and react so

101:01

quickly and so effectively.

101:03

>> They're heroes. That's what they are.

101:04

They're heroes. They're real heroes. And

101:06

they're heroes that

101:09

have been demoralized by the last 6

101:11

years of horseshit ever since the

101:13

George Floyd protests, you know,

101:15

>> and well, it was happening before. I

101:16

mean, if you go back to Michael Brown,

101:19

right? Michael Brown, what we were told

101:20

in the media happened is not what

101:22

happened.

101:22

>> Which one was Michael Brown?

101:23

>> Michael Brown was hands up, don't shoot,

101:26

>> right?

101:26

>> That's he didn't have his hands up.

101:28

>> And he didn't say don't shoot.

101:29

>> Right.

101:29

>> He assaulted the police officer. Right.

101:32

But the media concocted the story and I

101:34

think this is what it comes back to,

101:36

like what's happening in new media where

101:37

people are putting out things that are

101:39

really damaging to the fabric of our

101:42

conversations right and what how we talk

101:44

about things like you say I mean there

101:46

are bad apple police officers, of

101:48

course there are

101:49

>> but the majority of them they are people

101:52

who are signing up to risk their life on

101:54

a daily basis to protect other people in

101:56

their community. And these people all

101:58

have [ __ ] PTSD because all they see

102:00

is the worst of humanity day in day out.

102:02

Yep.

102:03

>> Every single [ __ ] day they get in the

102:05

car and they go and eat [ __ ] for the

102:06

rest of the day.

102:07

>> Yeah.

102:07

>> And then they go home and they worry

102:09

about not coming home, right? And then

102:11

someone tries to run them over with a

102:12

car like Yeah. They're going to [ __ ]

102:14

shoot.

102:14

>> Yeah.

102:14

>> You know,

102:15

>> and it's it's and the thing is that's

102:18

how society falls apart when you no

102:20

longer honor and celebrate the people

102:21

who are putting themselves on the line.

102:23

Well, not just that in the case of the

102:25

lady running over or she wasn't running

102:27

him over. I I think she was trying to

102:29

turn her her car away from him, but that

102:31

guy had been dragged by a car just a few

102:34

weeks earlier.

102:35

>> That's what I'm saying.

102:36

>> Yeah.

102:36

>> So, but in

102:38

>> and then on top of that, you have people

102:39

that are being paid to protest, right?

102:41

So, it's organized. And I'm not saying

102:43

that lady was, but many people are. And

102:45

then you've got all these people that

102:46

that becomes the focus of their life. It

102:49

becomes a cause that's worthy. You live

102:51

this mundane, boring life of desperation

102:54

and then all of a sudden something comes

102:55

along that gives you hope and meaning

102:56

and like this is my identity. My

102:59

identity is I'm fighting fascism and I'm

103:01

out there in the street,

103:02

>> right?

103:02

>> You know, I was on I was on the plane uh

103:05

to the US. I think it was last year and

103:08

you know the movie Bridesmaids came up.

103:10

>> Yes.

103:10

>> So really funny movie. It's 2011. I was

103:13

like ah I want something like let's

103:14

watch this comedy. The romantic interest

103:17

in Bridesmaids, the main guy, you know

103:19

what his job is? He's a cop.

103:21

>> Mhm.

103:22

>> Can you imagine a movie being made now

103:24

like a like romantic comedy where the

103:27

main guy is a cop and he's a good guy,

103:30

>> right?

103:31

>> You just wouldn't see it,

103:32

>> right?

103:32

>> You just wouldn't see it because cops

103:34

are oppressive

103:35

>> agent now.

103:39

>> Yeah, man. Do you know Yuri Bezmenov

103:41

talked about this?

103:42

>> Yes.

103:43

>> He talked about this. He talked about

103:44

the fact that when you see in the

103:46

culture, you know, the military, the

103:47

cops, the firefighters, all of these

103:49

people, they're bad and the criminal is

103:51

the one that's to be understood and to

103:53

be, you know, all of that. That's how you flip

103:56

society.

103:57

>> Yeah.

103:57

>> And that's what we've got.

103:58

>> Yeah. That's what we've got.

103:59

>> The Bezmenov speech from 1984, which is,

104:02

by the way, such an appropriate date for

104:04

him to make that interview.

104:05

>> Yeah.

104:06

>> But it's it's so eerie how all of that

104:09

has actually come to pass because back

104:11

then nobody took him seriously at all.

104:13

Right.

104:13

>> Right. And it didn't it wasn't until

104:15

like the 2020s that people started

104:17

reviewing that and then once it got on

104:18

YouTube,

104:19

>> then people were like, "Oh,

104:21

>> this [ __ ] guy nailed it."

104:22

>> I think it's YouTube. And also most

104:25

people want most people in my experience

104:28

want to pretend that everything is fine

104:31

most of the time.

104:32

>> Yeah. So if you come out in 2018 as we

104:35

did and say this woke [ __ ] is getting

104:37

out of hand and it's going in a bad

104:39

direction and it's going to cause a lot

104:40

of problems, people make you the

104:42

problem.

104:42

>> Yeah.

104:43

>> They say you're wrong to talk about

104:44

this. If you talk about grooming gangs,

104:47

you're bad and evil and whatever. If you

104:49

talk about free speech and people being

104:51

arrested for tweets and all of this,

104:53

people make you the bad guy. And it's

104:55

only later, like I remember, I can't

104:57

even remember who said it, but like I I

104:59

had this Oh, no, I remember who said it.

105:01

One time I was on TV uh debating with

105:03

this woman uh about this stuff and I was

105:07

saying cancel culture is bad and she was

105:08

saying it's all [ __ ] blah blah blah.

105:10

I met her a few years later and she was

105:11

like, "Yeah, I realized cancel culture

105:13

was bad." And I went, "How did you

105:14

realize?" And she went, "When my friends

105:15

started getting cancelled,

105:16

>> right? Most people want to pretend most

105:18

of the time that everything is fine."

105:20

>> Yeah.

105:20

>> But when they start to see the reality

105:22

of things and it starts to affect them,

105:25

>> right,

105:25

>> that's when they go, "Ah, maybe this

105:27

Bezmenov guy had a point."

105:29

>> Yeah. I had an argument with a seemingly

105:31

intelligent person who's a friend of

105:32

mine when the NSA when this whole uh

105:36

mass spying thing, the Edward Snowden

105:38

stuff was released

105:39

>> and he was like like you can look at my

105:42

[ __ ] I'm not doing anything wrong. Like

105:43

what do you care? I'm like that's such a

105:46

crazy take.

105:47

>> Yeah.

105:47

>> Like who who are these perfect people

105:51

that are watching over everything? You

105:53

don't think any of them have either some

105:55

financial or power-based incentive to do

105:58

certain things or silence certain voices

106:01

and find out what you're doing or maybe

106:04

even manipulate you in in some sort of a

106:06

way being able to have access to all of

106:08

your emails, all of your phone calls.

106:10

Those are just people and all of them

106:12

unelected bureaucrats. You you think

106:15

that's okay for those people to have

106:17

access to everything you've ever said?

106:19

That's crazy. And look, maybe the

106:22

current government that we have in this

106:23

place, you know, would never dream of

106:25

doing such a thing. And maybe they're

106:27

entirely honorable and everybody's a

106:29

great person and and you know, they're

106:31

this unique human being where they don't

106:33

have any ulterior motives.

106:35

>> But what's to say the next government

106:37

comes in won't do that and start looking

106:39

in and going, "Hey, you know what?

106:41

You're causing me problems, Joe Rogan.

106:43

You're saying a lot of things that I

106:44

don't actually like. Let's look through

106:46

your emails. Oh, look, I'll find one

106:47

from 14 years ago, which is, you know,

106:50

whatever it may be. Let's get rid of you

106:52

for that.

106:53

>> This was the argument when Obama was

106:55

pushing the NDAA. Um, which was

106:59

the indefinite detention.

107:02

>> So, this concept that you didn't

107:04

have to charge anybody, you didn't have

107:05

to you just have to have it. You don't

107:08

have to try them within a timely period.

107:10

Indefinite detention. Well, we'll never

107:12

use that.

107:12

>> Okay. But

107:13

>> why are you pushing it then?

107:14

>> Right. Well, also, who comes after you,

107:16

man? Like, how how many generations are

107:18

we away from Hitler, right? You know,

107:20

like who [ __ ] who's to say that this

107:23

new power won't be used by very

107:25

unscrupulous people that are now I mean,

107:28

the founding fathers of this country

107:30

really had a good understanding of how

107:32

corruption and tyranny sets in. And

107:34

that's why they put all these checks and

107:35

balances in place. And the more they

107:37

eroded that, whether it's the Patriot

107:39

Act, the Patriot Act 2, or the NDAA,

107:41

when you start doing stuff like that,

107:43

man, you you're you're just undermining

107:46

the very fabric that this country was

107:49

created with. It's like we we were

107:51

created under this idea that we know

107:54

human nature. We know that you cannot

107:56

have power. We know that the government

107:59

has to be working for the people. It

108:01

can't be we are under the power of these

108:04

individuals because those individuals

108:06

will then act like tyrants which is what

108:08

people always do when they have power.

108:11

>> It's one of the things that makes

108:12

America really a great place because

108:15

we look at the UK now and you know if

108:17

Francis is right and I I've said this I

108:19

think the next election is probably

108:20

going to be Nigel Farage versus the

108:23

far left. If those far leftists get in

108:25

power I mean they're going to start

108:27

regulating podcasts. I guarantee you

108:29

>> 100%.

108:29

>> That's what they're going to do. They're

108:31

>> going to say, "We have Ofcom for TV."

108:33

Well, we need to have it for other

108:35

broadcasting. Surely you'd agree with

108:37

that, right?

108:38

>> And then before you know it, like

108:39

everything we do,

108:40

>> you know, you guys are living in Austin,

108:42

>> right?

108:43

>> Right. Cuz at that point, we would

108:44

actually leave.

108:45

>> Yeah. You would have to. You'd have to

108:47

>> because what they would say is, and they

108:49

would use the word that they always use,

108:51

which is, you know, they're spreading

108:53

misinformation and hate.

108:55

>> Yeah. When the New York Times spreads

108:56

misinformation, that's

108:58

wonderful,

108:59

>> right? But it's it's so yeah, I I think

109:03

allowing people maximum freedom within

109:06

the system you're talking about is is a

109:09

really truly precious thing. It's why

109:10

America in this respect is an example to

109:13

the rest of the world.

109:13

>> I I think if anything that should be

109:15

done, they should be able to figure out

109:17

what which of these accounts are bots

109:20

and eliminate those. Yeah. I I do not

109:23

think that you should be allowed to not

109:26

just run a bot farm

109:28

>> or I don't think you should be allowed

109:29

to hire people to tweet. I think that's

109:32

crazy. And I most certainly don't think

109:34

you should be able to use AI.

109:35

>> I mean that that seems crazy. It seems

109:38

crazy to allow that and pretend that's a

109:40

person.

109:40

>> But if think about it like this, Joe,

109:42

like how basically did social media

109:45

start? Facebook, Meta, all the rest of

109:47

it. It was started by a nerd in his bedroom

109:50

in his college dorm who set up a website

109:53

to rate hot girls on campus.

109:56

And my point is like we're creating all of

109:58

this technology. We don't know what's

110:00

going to be the second, third, fifth,

110:02

sixth order consequences. And

110:04

we're having to figure out as we go

110:05

along and now we're creating artificial

110:08

intelligences that are way

110:10

smarter than us. And you're going at

110:12

what point is this going to run away,

110:14

right?

110:15

>> Or has it already run away? and we just

110:17

don't want to admit it because most of

110:19

us don't know enough and the ones in

110:21

charge are delusional.

110:22

>> Yeah, but you're right, Joe. I think we

110:24

need a way to know what is human

110:26

content and what isn't human content.

110:28

>> And also, I sometimes look at stuff on

110:31

social media and I go, there's no

110:32

[ __ ] way this take got 50,000 likes

110:35

on X. No [ __ ] way.

110:38

>> Right. Right.

110:38

>> You know what I mean? like and that is

110:40

but that is shaping people's perception

110:42

of reality

110:44

and that is informing political debate

110:46

and that is then informing how people

110:48

vote and where did those 40,000 likes

110:51

come from

110:52

>> right

110:53

>> did they come from within America did

110:54

they come from within Britain

110:56

>> because what if they didn't right so who

110:59

is then shaping the political direction

111:01

of our countries

111:03

>> we need to know that

111:04

>> yeah we do need to know that

111:05

>> we need to know that

111:06

>> because it is effective even if someone

111:09

has a completely preposterous and

111:11

radical position a couple steps down

111:13

from that that becomes more palatable,

111:16

right? Like because now it's closer to

111:19

the like the farther left the left goes,

111:22

>> the the weirder the center gets because

111:25

the center starts accepting things that

111:26

were far left positions. Same on the

111:29

right. Same exactly on the right. And

111:31

you can you can shift narratives by

111:35

really really radical ideologies, really

111:37

radical thoughts and radical

111:39

declarations and you could change like

111:42

what's acceptable.

111:43

>> So an example of that is during the

111:45

Euros, the 2021 final, it was England

111:48

versus Italy, right? And it was a tight

111:50

game and it went to penalty shootout and

111:53

three black England footballers missed

111:56

the penalty

111:57

>> and we ended up losing the European Cup

112:00

to the Italians. And afterwards these

112:02

three black footballers got inundated

112:05

with racism and horrible things. That

112:07

sparked a conversation in our country

112:09

about we have a real problem with

112:11

racism. This is disgraceful that these

112:14

black footballers are exposed to this

112:15

level of racism. Unacceptable. Of course it

112:17

is. All those things are true.

112:19

basically about them being exposed to

112:21

racism and it's not acceptable and then it

112:22

went into a discussion about England

112:24

being a racist country, white

112:26

supremacist and this became widespread

112:29

and this and the example of what these

112:32

footballers were exposed to was used as

112:35

a way to justify this opinion and you

112:37

could see a lot of people accept that

112:39

opinion

112:40

>> until a couple of days later when they

112:42

investigated where the majority of the

112:43

tweets came from and messages and I

112:45

think something like 85% if not 90 came

112:47

from outside the UK if not even more

112:50

than that.

112:51

>> so you're going oh so this entire

112:53

conversation that we have had about

112:56

white supremacy about you know black

112:58

people not being accepted in in our

113:01

country about the fact they're secondass

113:02

citizens and look this example of them

113:04

being exposed to horrendous racism when

113:06

the fact is the majority of it came from

113:08

outside the UK

113:09

>> and then you have to ask the question

113:11

who benefits

113:12

>> right

113:13

>> who benefits from us hating each other

113:16

obsessing about our differences worrying

113:19

about how we're the racist places in the

113:21

world when this narrative is likely

113:23

being driven by actually racist

113:25

countries.

113:27

>> Right. Right.

113:28

>> Right. Right.

113:29

>> Because that's what's happening.

113:30

>> Yes.

113:30

>> And we are allowing it to happen. And I

113:32

think we just haven't woken up to the

113:34

fact that we are living in the age of

113:36

information warfare and we because of

113:39

our belief in freedom have just got lost

113:42

in this fact that we are under attack.

113:45

>> It's a very good point. I have to pee.

113:47

Um we'll come back with that.

113:48

>> Let's do that. I'll go pee as well.

113:50

>> Speaking of religion, so um show us this

113:53

uh Sam Tripoli Facebook take. He was on

113:57

Danny Jones and this is what he said

113:59

about Facebook.

114:04

>> Damn it.

114:05

>> Volume

114:07

here.

114:07

>> Yeah.

114:08

>> Is a giant lie. It's a propaganda piece.

114:11

That was a Pentagon program called

114:13

Lifelog. Lifelog is a Pentagon program

114:17

>> that wants to collect all your data for

114:20

your whole life. What day did the

114:22

government stop the Lifelog project?

114:25

>> Whoa, whoa, whoa, whoa, whoa. DARPA shut

114:27

down the Lifelog project February 4th,

114:30

2004.

114:32

>> What day was Facebook registered as a

114:35

business?

114:35

>> Oh my god. No way, bro.

114:39

>> The exact SAME DAY. THEY DON'T EVEN HIDE

114:41

IT, DUDE.

114:42

>> It was created by DARPA. Yeah. They

114:44

handed it to Mark Zuckerberg and then the

114:46

Winklevosses.

114:47

>> What about the other? Yeah. Yeah. Yeah.

114:48

>> That's all. And that's why they became

114:50

the first Bitcoin millionaires because

114:53

they played ball.

114:54

>> Oh my god.

114:55

>> It's all theater, dude.

114:57

>> So, what is the purpose of Lifelog?

114:59

>> To collect all your data for your whole

115:01

entire life. No.

115:03

>> Okay. Take this with many grains of

115:06

salt. Sam is one of my best friends.

115:09

I've known him for decades. He's a

115:11

wonderful person, but he's a kook,

115:14

>> but he's right a lot.

115:15

>> Yeah.

115:15

>> I don't know if he's right about this,

115:17

>> right?

115:17

>> Yeah.

115:18

>> Jamie thinks he's right.

115:19

>> It is. It was a It It's not that he's

115:23

incorrect. I would say that

115:24

>> he's making some connections.

115:26

>> Yeah.

115:27

>> Yeah. Well, he's definitely right about

115:28

the dates and that is a little weird.

115:31

>> Yes.

115:31

>> That it ended on the same day where

115:33

Facebook is beginning. Little weird.

115:36

>> Yeah. You know

115:39

>> what do you think the appeal is of like

115:41

>> when I went down this rabbit hole here

115:43

it said it was made by the information

115:46

processing techniques office

115:49

>> of the CIA I think or something but

115:51

here's some other fun projects that is

115:52

are associated with this

115:54

>> biologically inspired cognitive

115:56

architectures. Wait what?

115:59

>> Yep.

115:59

>> There's just a couple

116:00

>> biologically inspired cognitive

116:02

architectures. That sounds like

116:04

artificial intelligence.

116:06

>> Yeah. Uh,

116:07

>> bootstrapping learning. What was the

116:08

other one, James?

116:09

>> This Forester thing.

116:10

>> Forester, a program to develop a

116:12

helicopter-borne radar system that can

116:14

detect soldiers and vehicles moving

116:17

underneath foliage cover.

116:19

>> Whoa. Deep green.

116:21

>> US Army battlefield decision-making

116:23

support system. Yeah, this is all AI.

116:25

>> Heterogeneous Urban RSTA.

116:28

>> So, they were planning on this in the

116:30

>> This was 2004 is when that thing ended,

116:32

the lifelog thing. So, uh, I mean, it

116:35

even goes back to say they were working

116:36

on ARPANET back in the 60s.

116:39

>> Whoa.

116:39

>> Which is the beginning of the internet.

116:41

>> By the way, Joe, have you had anyone on

116:42

to talk about this weapon that the US um

116:46

forces used in Venezuela?

116:47

>> No. No, I haven't. Not yet.

116:49

>> But there was something like they use

116:51

something, right?

116:52

>> Yes.

116:52

>> Something supposedly

116:53

>> that makes your brain water temperature

116:56

rise and so you get nose bleeds and

116:58

[ __ ] Is that Is that what it is? Well,

116:59

my cousin told me when I was talking uh

117:02

after the attack after

117:03

>> your cousin in Venezuela.

117:04

>> My cousin in Venezuela. Yeah. Yeah. He

117:06

was saying that it seemed like in a one-mile

117:08

radius everybody's windows got blown

117:10

out.

117:10

>> Well, that's just the blast. That's not like

117:12

it. But what I what I heard was that

117:15

they had some kind of weapon that

117:16

>> some sonic weapon.

117:17

>> Some I don't know if it was sonic maybe,

117:19

but something that incapacitates people

117:21

and makes them very uncomfortable

117:23

basically, but without killing them.

117:26

>> Was it 60 minutes?

117:27

>> Yeah.

117:28

>> Yeah. So 60 Minutes said that these guys

117:30

acquired some weapon from uh Russian

117:34

black market

117:35

>> and it's a very small portable weapon

117:37

that you can carry around with you

117:40

>> and does something very similar. What

117:42

is their claim on

117:43

this?

117:44

>> Uh well there's two different things

117:46

going on with the 60 Minutes thing. They

117:48

had a story a couple months ago where

117:49

they were tracking a guy and then they

117:52

just had an update I think over the

117:53

weekend that added to it.

117:55

>> But what is the claim? Oh, I think they

117:57

found the guy that said he was doing it,

117:58

I believe.

118:00

>> Right. He had a device in his car or

118:01

something like that

118:02

>> and you could just point it at people,

118:03

but it was you could carry it around.

118:05

>> Yeah, that that is where it gets

118:06

strange. That's I mean uh the 60 Minutes

118:08

thing from yesterday going around. I

118:10

didn't watch it so I don't know what

118:10

they're talking

118:11

>> Oh, the Havana syndrome. Yes.

118:12

>> Yeah. But it has to do with that and

118:14

that's what I was trying to remember.

118:15

Trump had mentioned this discombobulator weapon.

118:17

>> Right. That's what I'm talking about.

118:18

>> Discombobulator. But that's Trump's

118:21

description of what the Havana syndrome

118:22

weapon is.

118:24

>> Yes.

118:24

>> The discombobulator. But it seems like

118:26

whatever its effectiveness is, the

118:27

Havana syndrome is very small in

118:29

comparison to what these things are

118:30

doing. These things are like completely

118:32

incapacitating people.

118:33

>> You know, I don't think people talk

118:35

about this enough. You know, when they

118:36

came in to take Maduro,

118:39

>> you know what they also did? I mean, you

118:40

probably know this. They fired a rocket

118:43

into Chavez's mausoleum.

118:45

>> They did?

118:46

>> Yeah. Just to be like, "Go [ __ ] your

118:48

mom."

118:48

>> Wow.

118:49

>> I'm going to bomb your grave.

118:51

>> Wow.

118:51

>> Isn't that the most Trump thing ever?

118:53

You know how much those things cost?

118:55

>> Yeah.

118:55

>> Oh yeah. Millions of dollars.

118:57

>> Millions. So millions just to say [ __ ]

118:59

you.

118:59

>> Yeah.

119:00

>> Just fired a rocket into his grave.

119:02

>> This is the American way, baby.

119:06

>> We're going to fire some expensive [ __ ]

119:09

>> That's crazy.

119:10

>> Your grave.

119:10

>> But under any other president, you would

119:12

have gone, "That's bullshit." But under

119:14

Trump, you're like, "Yeah, of course."

119:16

>> You think it was his idea?

119:17

>> Yes.

119:19

>> Yeah.

119:20

>> Yeah.

119:21

>> I've got a thought.

119:25

That sounds exactly like a Trump idea.

119:27

>> Um, so this weapon that uh what are the

119:30

details?

119:31

>> I don't know. We can just watch this.

119:32

>> Yeah. Let's see what he says here. Well,

119:34

we could just read. Okay. Go ahead.

119:35

Okay. Let's play it. Let's play it.

119:38

>> Here it goes.

119:41

>> Microwave weapon, right,

119:43

>> that may explain mysterious brain

119:45

injuries suffered by US officials. We've

119:48

been investigating these injuries for 9

119:51

years. And now our sources tell us this

119:55

microwave weapon is portable,

119:57

concealable, and uses relatively little

120:00

power. Hundreds of possible attacks have

120:04

been reported, including, we've learned,

120:06

at CIA headquarters in Virginia and at

120:10

least two incidents on the grounds of

120:13

the White House. For years, the

120:16

government doubted the stories of the

120:18

injured. But now, the victims, including

120:20

former CIA officer Marc Polymeropoulos,

120:24

hope that word of a newly discovered

120:26

weapon will finally vindicate them.

120:30

There's a part of this, Scott, that has

120:32

to do with moral injury, and that's the

120:33

idea of of betrayal. You know, I worked

120:35

for 26 years for the CIA. I think I was

120:37

involved in every covert action program

120:39

in the Middle East. I did some very

120:42

interesting things for the US

120:43

government, always with the idea that

120:44

they would have my back if I got jammed

120:46

up. I just needed to get medical care

120:48

when I came back and they wouldn't even

120:49

do that. So, this moral injury, this

120:52

sense of betrayal is so acute with me.

120:54

Um, that's something that I can never

120:56

forgive them for.

120:58

>> Marc Polymeropoulos rose to an executive

121:01

level at the CIA, about the equivalent

121:03

of a three-star general. He was awarded

121:06

a top decoration for service. 60 Minutes

121:10

has learned to take

121:11

>> I just repeated it.

121:13

>> Not much about the weapon there

121:14

unfortunately.

121:15

>> Yeah. Huh. Huh.

121:18

>> But it's interesting that the way

121:20

that they did that

121:22

operation because when I was talking to

121:25

my cousins and my friends about what

121:26

happened, no one in Venezuela had a clue

121:29

and my friend said that he was

121:31

woken up around 2:00 in the morning by a

121:35

plane going overhead and there's a

121:37

no-fly zone over Caracas at that

121:40

time especially and he was like, "What

121:42

is this?" and he said, "You heard this

121:43

almighty

121:45

boom."

121:46

And everybody was just nobody knew what

121:50

was happening. They don't have X in

121:52

Venezuela for obvious reasons. So,

121:54

everybody was in the dark. And it was

121:56

only via Instagram and Facebook that

121:58

they started to understand what had just

122:01

gone on. But it was complete disbelief

122:03

that the Americans had done that. If

122:05

they uh they don't have uh X, do they

122:07

have Threads?

122:09

>> Uh which is like

122:10

>> I imagine

122:12

Yeah. Yeah, I imagine they must do.

122:15

But he said the way that everybody was

122:16

communicating was via Instagram.

122:18

>> Interesting.

122:19

>> What are people saying now in Venezuela?

122:21

>> So, and now I talked to my friends. He

122:23

said that things are getting better. He

122:25

said things are getting better. He said

122:27

that crime was down 75%. I mean, I don't

122:30

know how true this is. He said things

122:32

are slowly starting to get liberalized.

122:34

Also, I was talking to a Colombian

122:36

friend of mine who was saying that

122:38

people Venezuelans in Colombia are now

122:40

starting to go back because whilst the

122:43

regime is still obviously not perfect,

122:46

what you essentially have is a puppet

122:48

regime and they know that the moment

122:51

they step out of line, they know the

122:52

moment they, to use Trump's parlance,

122:54

[ __ ] about, something will happen.

122:57

They're

122:59

kept on a straight line. Yeah. They

123:01

have to behave. They can't do what

123:04

Maduro did. And what's interesting about

123:06

when Maduro was captured is nobody

123:09

really mentioned that much about his

123:10

wife. But a lot of people say that his

123:14

wife was the brains behind the operation

123:16

because Maduro, there's clips of him

123:18

that went viral on TikTok and Instagram

123:21

and on Twitter as well where he was

123:23

doing speeches and he had to do basic

123:25

mental arithmetic and he couldn't do it.

123:27

This guy was a bus driver. He was picked

123:30

by Chavez when Chavez was on his

123:32

deathbed in 2013 dying from stomach

123:34

cancer and he appointed Maduro.

123:36

Everybody was shocked because they were

123:38

saying, "Well, Maduro wasn't the most

123:40

capable. He wasn't the most

123:42

intelligent." But what Maduro was is he

123:44

was the most loyal out of all Chavez's

123:47

underlings. So he was picked not for his

123:50

brilliance, not for his sharpness, but

123:52

because he was a company man. And

123:55

actually the person who the

123:57

Venezuelans hated the most was his wife

123:59

because she was the brains behind the

124:00

operation. She was the one in charge of

124:03

the kidnappings, the tortures, the

124:05

murders. So when she was kidnapped,

124:07

people were happier that she was on the

124:10

helicopter than Maduro himself.

124:12

>> Really?

124:13

>> Yeah.

124:13

>> Lady Macbeth.

124:17

>> She was way more cruel than Maduro.

124:20

>> Wow.

124:22

>> Way more cruel. It's interesting you say

124:23

things are getting better now because

124:25

like it's short term, right? We don't

124:28

know,

124:28

>> right?

124:28

>> Yeah.

124:29

>> You know, so some so this has happened a

124:31

lot of times in Latin America, right?

124:33

Like people get overthrown, things are

124:35

getting better, and then some [ __ ]

124:37

happens.

124:37

>> Yeah. Not the most stable place.

124:40

>> Not the most stable people, Joe. I'm

124:42

going to be honest with you. They're my

124:44

people. It's either, you know, uh it's

124:46

either Fatimo or viva la revolución.

124:49

>> Yeah.

124:49

>> And you're like, "Guys, can we have a

124:51

little middle?" and they're like, "No, viva la

124:52

revolución," you know.

124:54

They're excitable people

124:56

>> and you also wonder how much the fact

124:57

that Venezuela in particular it's so

124:59

resource-rich. Well, like,

125:02

Francis always says to me, you know,

125:03

it could be a really great country

125:04

really wealthy, and I go, I don't know that

125:06

having those resources makes a country

125:08

better cuz what you get is a corrupt

125:10

elite who are fighting for control of

125:12

these resources that are so easy to get

125:15

like in 1990s Russia when the Soviet

125:17

Union collapsed the people who took over

125:19

all the resource companies, the oil

125:21

companies, the gas companies, where like

125:23

Russia is basically all it is,

125:26

in terms of its economy is digging [ __ ]

125:28

out of the ground and selling it. That's

125:29

what it is. There's no

125:30

>> poetry.

125:32

>> Yeah. Not a lot of money to be made in

125:33

poetry, right? But the people who took

125:35

over those companies, they weren't

125:37

people who knew anything about the oil

125:39

business. They weren't people who knew

125:40

anything about the gas business cuz all

125:43

you really had to do is take over and

125:45

then you just let western companies come

125:47

in and do the drilling and and the oil

125:49

field services and all of it for you. So

125:51

these countries which are so

125:52

resource-rich,

125:58

the resource wealth they have

126:00

doesn't actually make them better for

126:01

the people because the corrupt elites

126:04

fight over those resources and that's

126:06

where you get the [ __ ] that you get.

126:08

And it's true. So Venezuela before

126:11

Chavez came to power was 98% dependent

126:14

on oil. The economy, the entire economy

126:17

was 98% dependent on oil. The slight

126:19

difference with Venezuela is when we

126:21

were taken over by Chavez, he then

126:24

installed his cronies in charge of

126:26

PDVSA, which is the Venezuelan oil

126:29

company. And he cut out all the people

126:32

who were competent, all the people who

126:34

were who would criticize him

126:36

ideologically. And as a result, what you

126:38

had is fundamentally incompetent people

126:41

at the top, which meant that it became

126:43

degraded. It was no longer able to pump

126:45

the oil. It wasn't reliable. So that's a

126:49

large part of the reason why the economy

126:50

collapsed is it was entirely dependent

126:53

on oil. They appointed their cronies who

126:55

couldn't actually do the job. The oil

126:57

industry failed and we descended into

127:00

poverty and chaos.

127:02

>> How much do you know about Brazil?

127:04

>> Not a lot. Why? Well, I've always

127:08

that situation is very confusing, right?

127:11

Lula goes to jail now. He's out. He's

127:14

running the country and they jailed Jair

127:17

Bolsonaro, right?

127:18

>> Mhm.

127:19

>> Like there seems And then they tried to

127:21

ban X like then they did for a while,

127:24

right?

127:24

>> I think so.

127:25

>> And they had to make probably some

127:27

concessions.

127:29

>> I don't know a lot about it truthfully,

127:30

Joe.

127:31

>> Yeah. Yeah. I

127:32

>> we're gonna do that thing that no one

127:34

does on the internet is admit we don't

127:35

know something.

127:36

>> Well, as long as we don't have hot

127:37

takes.

127:38

>> What is this,

127:38

>> Jimmy? These are kind of crazy

127:41

descriptions of this weapon. This is

127:43

from the longer version of the CBS News

127:45

60 Minutes article where they're talking

127:47

to that guy we just saw.

127:48

>> Um, I would say start right around here

127:51

and then I'll skip to another paragraph.

127:52

>> Okay. Three independent sources from

127:54

different agencies tell us undercover

127:56

Homeland Security agents purchased a

127:57

miniaturized microwave weapon from a

127:59

complex Russian criminal network. It's

128:02

classified. We didn't see it, but it has

128:04

been described to us. We were told it

128:06

doesn't look anything like a gun. It's

128:08

designed to be concealed and small

128:09

enough to be carried by a person. It is

128:11

silent and doesn't create heat like a

128:13

microwave oven. Our sources say the

128:15

device is programmable for different

128:17

scenarios and can be operated by remote

128:19

control. The range of the beam is

128:21

several hundred feet. It can penetrate

128:23

windows and drywall. The vital

128:25

components were made in Russia. Our

128:27

sources say the key is not the hardware,

128:29

but the software. The programming shapes

128:30

a unique electromagnetic wave that rises

128:33

and falls abruptly and pulses rapidly.

128:36

>> So then it turns out they have tested

128:38

this apparently in US military labs

128:41

start.

128:42

>> Our confidential sources tell us still

128:44

classified weapon has been tested in a

128:47

US military lab for more than a year.

128:49

Tests on rats and sheep show injuries

128:52

consistent with those seen in humans.

128:54

Also, as a separate part of the

128:55

investigation, security camera videos

128:57

have been collected that show Americans

128:59

being hit. The videos are classified,

129:01

but they were described to us. In one, a

129:04

camera in a restaurant in Istanbul

129:06

captured two FBI agents on vacation

129:09

sitting at a table with their families.

129:11

A man with a backpack walks in and

129:13

suddenly everyone at the table grabs

129:15

their head as if in pain. Our sources

129:17

say that another video comes from a

129:19

stairwell in the US embassy in Vienna.

129:22

The stairs lead to a secure facility. In

129:24

the video, two people on the stairs

129:26

suddenly collapse. Those videos and the

129:29

weapon were among the reasons the Biden

129:31

administration summoned about half a

129:32

dozen victims to the White House with

129:34

about two months left in the president's

129:36

term.

129:37

>> And then that guy was also one of the

129:39

people in there. The ads are kind of

129:41

[ __ ] up this um

129:43

>> website. But yeah, he just sort of says

129:45

someone admitted to him that they

129:46

treated him poorly.

129:48

>> Yeah,

129:48

>> that's the biggest cover-up I've seen in

129:50

my adult life. A CIS.

129:52

>> Interesting.

129:53

>> I don't get, like, the broader question. What

129:57

if Russia has this weapon? Why didn't

129:58

they use it to take out Zelensky?

130:00

>> Well, it seems like it's only for 100

130:01

couple hundred feet. That's what they

130:03

were just saying. Like it has to be

130:04

close,

130:05

>> right?

130:05

>> So what what was the one they used in

130:07

Venezuela then?

130:08

>> Yeah. They started off saying it was in

130:09

a truck. It was truck size, but then

130:11

that's where it goes. I started you just

130:13

past that where they said it's actually

130:14

way smaller.

130:15

>> Interesting. So, this is the one that's

130:17

that you could carry around. But do we

130:20

know that that's the same one they used

130:21

in Venezuela or do they use something

130:22

that's completely different technology?

130:24

>> Yeah, we the reality is we just don't

130:26

know. I mean, the the interesting thing

130:28

as well with Venezuela is like Maduro is

130:32

so [ __ ]

130:34

>> He's such a hot take.

130:38

That's so

130:38

>> how [ __ ] is he?

130:40

>> He literally so this is a joke. Set up

130:43

punch line.

130:44

>> Yeah. He but he literally said to Trump

130:46

he said to America, "I'm not going to do

130:48

what you say. Go [ __ ] yourself. Come and

130:50

get me."

130:51

>> Yeah. He did that.

130:51

>> Yeah. That was cocaine.

130:57

>> And it's not just that. So, for

130:59

instance, uh the country next to

131:01

Venezuela is called Guyana. And in

131:04

Guyana, they recently discovered oil,

131:07

really huge large deposits of oil. And

131:10

Guyana is a former

131:12

British colony. And Venezuela and Guyana

131:15

have always been disputes about

131:17

territory, about one particular part of

131:19

I think it's called Essequibo, which is

131:21

basically rainforest. They always argued

131:24

about it, but no one cared until they

131:27

discovered oil there. At which point,

131:29

Maduro went, "You know what? You know

131:31

how we've been talking about this? Turns

131:34

out it is Venezuelan. They did a

131:37

referendum in Venezuela where you

131:39

basically asked a people who were

131:41

entirely subjugated,

131:43

starving, living in misery and poverty

131:46

whether they wanted to start a war with

131:48

Guyana. Do you know how many pe

131:49

Venezuelans voted for it? 92%, Joe. 92%

131:54

of Venezuelans wanted to go to war

131:56

despite the fact they didn't have the

131:57

strength to even pick up a gun because

131:58

they're so malnourished. And then he

132:01

started teaching in schools, redrawing

132:04

the map of Venezuela. So all the school

132:07

kids now think that Venezuela

132:09

incorporates this territory. Like he was

132:12

antagonizing

132:14

the Americans and their allies

132:16

consistently. And unlike Iran, he

132:19

doesn't have the infrastructure. He

132:21

doesn't have that amount of the

132:24

military, the power, the organization.

132:27

>> He made himself so vulnerable. so

132:29

vulnerable.

132:30

>> Who who looks at Trump and goes, "Yeah,

132:31

let's [ __ ] with that guy."

132:33

>> Right? He's 80. He He doesn't have much

132:36

to lose,

132:36

>> right? Last term.

132:38

>> That's the scary thing about old

132:39

leaders. It's like death is imminent.

132:43

>> It's within a decade if you're lucky.

132:46

That's spooky. That's spooky. like, you

132:49

know, you're making decisions for babies

132:51

and children and the future of the world

132:53

and you've only got 10 maybe 10 years

132:56

left on on Earth if everything goes

132:59

great.

133:00

>> And also, you start to degrade.

133:02

>> Oh, yeah.

133:02

>> Your cognitive functions. It doesn't I'm

133:04

not saying that he's got dementia or

133:05

anything like that, but you're just not

133:07

as sharp when you're that age as you are

133:09

when you're younger.

133:10

>> He is I mean, he's kind of weird. Like,

133:12

when I think about how much Barack Obama

133:14

aged,

133:15

>> Yeah. how much Tony Blair aged.

133:18

>> Trump has not aged like that.

133:21

>> Yeah. And he has a terrible diet. I mean,

133:23

especially when he's on the road, he

133:25

just eats junk food because he says it's

133:26

like JFK or RFK Jr. rather told me he's

133:30

uh he eats junk food because he knows

133:32

that when he eats fast food that it's

133:34

not going to be poison. Like he knows he

133:37

can eat it and not worry about getting

133:38

food poisoning.

133:40

>> What? What?

133:41

>> What exactly?

133:42

>> Does that make any [ __ ] sense? Well,

133:43

it does because it's filled with

133:45

preservatives. So, you're not going to

133:46

get food poisoning from a Big Mac. When

133:48

was the last time you heard about

133:49

anybody getting food poisoning from a

133:51

Big Mac,

133:52

>> right? [ __ ] never happens cuz nothing

133:55

can grow on those things.

133:57

>> Really? Like, for real. Like, you've

133:59

seen them. They take like

134:02

decades.

134:03

>> Decades. Decades. They don't rot.

134:05

There's so much preservatives in the

134:06

bread and and whatever the meat is

134:08

[ __ ] made with. And but but this is

134:10

my point is like Trump hasn't aged like

134:13

much younger men

134:14

>> which is even crazier because you

134:16

consider he doesn't exercise

134:17

>> right

134:18

>> and he's been under colossal I mean I

134:20

don't know but I imagine he's been under

134:22

a bit of pressure and stress

134:23

>> well assassination attempts and just all

134:26

that almost going to jail right like 34

134:29

felonies sort of trumped up you know

134:31

pardon the pun against him

134:34

>> he's a Russian agent and all this all

134:36

this [ __ ]

134:38

>> and he's just I I He's kind of like kind

134:41

of impressive in a way.

134:42

>> Oh, that part's very impressive. Yeah.

134:44

And he's funny. Like he's always joking

134:47

around about that stuff and like he's

134:48

very lighthearted about it all.

134:50

>> Yeah, he is. Like when he was talking

134:52

about the Iranian Navy, did you see

134:53

that?

134:54

>> Mhm.

134:54

>> He was like, "They've lost 14 ships. We

134:56

sunk a submarine, did this, but apart

134:58

from that, they're doing really well."

135:02

He's just he's he's very relaxed for a

135:04

man in that like it's it's hard to imag

135:07

I I cannot imagine being in charge of

135:09

anything like a a a thousandth of that

135:13

size.

135:13

>> Right. Just imagine the stress that you

135:15

guys have running Triggernometry.

135:16

>> Right. Yeah.

135:16

>> Right. Stressful. I'm sure

135:18

>> Francis has aged like Barack Obama.

135:20

>> Yeah. I have I used to have black hair.

135:24

>> Now I just look like an aging lesbian.

135:26

Joe.

135:26

>> it is stressful, right?

135:28

>> Yeah. And that is the highest stress

135:31

that I can ever imagine. I I can't

135:33

imagine a a level higher.

135:35

>> You know, I always remember uh after the

135:37

war in Iraq and Blair was still in

135:39

power, but it was towards the end. I was

135:43

watching the news with my dad and this

135:46

woman in her 50s came along and she put

135:49

a wreath at this at the door of 10

135:53

Downing Street. And that was a mother

135:56

whose son had died in Iraq

135:59

and placed a wreath at 10 Downing Street

136:03

of all the soldiers that died. I'm like,

136:06

even if the war was justified, even if

136:08

it was the right thing to do, which I

136:10

don't think it was, I would still find

136:12

it impossible to sleep

136:14

>> right

136:14

>> now. Just imagine it was the colossal

136:16

[ __ ] up that that war was

136:18

>> right

136:18

>> and those people died as a result of

136:21

your decision. But how do you unless

136:24

you're a sociopath, I I I think that's

136:26

an you can't you live with that,

136:28

>> right?

136:30

>> I I don't think you I I think it's

136:32

impossible to live with that.

136:33

>> Well, clearly not though. I mean, we've

136:35

got uh people who were heavily

136:37

responsible for promoting that war in

136:40

the UK now, like Alastair Campbell,

136:43

who was the spin doctor that helped

136:45

Blair lie the country into the war in

136:47

Iraq. He now has a really big podcast

136:49

and like all the young people are, "Oh,

136:51

really? Tell me more.

136:52

>> No way.

136:53

>> Yeah.

136:53

>> Yeah.

136:54

>> Yeah.

136:54

>> Yeah.

136:55

>> Yeah.

136:55

>> Yeah.

136:56

>> So, he's the Rush Limbaugh of the UK.

136:58

>> Uh, how how do you mean? Sorry, I don't

136:59

know enough about Rush Limbaugh.

137:01

>> I I've heard the name, but I I can't I

137:03

don't get the reference in the way that

137:04

you mean it.

137:04

>> He was the big right-wing propagandist

137:08

uh on radio. Excellence in broadcasting.

137:12

Oh, by the way, have you seen the um the

137:15

memes that think that Rush Limbaugh is

137:17

actually Jim Morrison?

137:19

>> Yeah. Yeah. And if you look at his

137:21

facial features in comparison to Jim

137:22

Morrison's facial features, they're

137:24

almost identical. It's kind of nuts.

137:26

They do like a scan where they like like

137:29

superimpose.

137:29

>> So Rush Limbaugh is a media guy. Alastair

137:31

Campbell. He was working for Tony Blair,

137:34

>> right? But now the media now he has a

137:36

podcast and all the young people. Oh,

137:38

really?

137:39

>> Well, there was a lot of like young men

137:40

in particular that like really into Rush

137:42

Limbaugh and like that like a lot of

137:44

people were crediting him with turning

137:46

young people towards a right-wing

137:48

direction. This was during the Obama

137:51

administration.

137:53

Um, like look at this.

137:56

Watch. Watch when they grow over.

137:59

Pretty close.

138:00

>> It's like Alex Jones and Bill Hicks.

138:02

Have you seen that?

138:02

>> That one's ridiculous. I met both of

138:05

them. They don't look anything like each

138:06

other. Look at this. Look at that. That

138:08

is kind of crazy.

138:10

>> Wow.

138:11

>> Right.

138:11

>> It's pretty close.

138:14

>> Wow.

138:16

But

138:17

>> well, you know, the there's the other

138:19

crazy conspiracy theory involving the

138:22

countercultural move counterculture

138:24

movement of the 1960s with the CIA.

138:27

>> There's a there's a book on it, Strange

138:29

Times in Laurel Canyon. The book's nuts.

138:31

Like, you realize like how many of these

138:33

very popular countercultural figures had

138:36

families that were in the military?

138:37

Yeah. Like high-level military

138:39

intelligence officers, including

138:41

Morrison.

138:41

>> Oh, yeah. Morrison's dad was very senior

138:43

in the military. Yeah. and a bunch of

138:45

other people that were also involved

138:48

>> in, you know, the the whole Laurel

138:50

Canyon rock scene and that it was

138:53

somehow or another at least promoted by

138:56

intelligence agencies if not formulated.

138:59

>> And by countercultural you mean like

139:01

what? Like hippies?

139:02

>> Yep.

139:02

>> Yeah.

139:03

>> So the hippie movement was promoted by

139:06

intelligence.

139:07

>> Yeah. Well,

139:08

>> why?

139:09

>> That's a good question. Well, we know

139:12

without saying definitively, but pretty

139:14

close based on um Tom O'Neal's book,

139:17

Chaos, that they were absolutely

139:19

involved in the Manson family. So, with

139:21

the reason for them being involved in

139:23

the Manson family is say you have this

139:25

new culture that's arising that doesn't

139:28

embrace materialism, make love, not war.

139:30

You got all these people, you know, drop

139:32

out, tune in, like Timothy Leary.

139:35

>> Yeah. the Timothy Mc Timothy Leary

139:38

people, the the the people that are want

139:41

to do acid and just want to reimagine

139:44

society. So this is a radical change.

139:46

This a radical change from the 1950s to

139:48

the 1960s. Pretty crazy.

139:50

>> So what do you do to stop that? Well,

139:51

what you do is you find a guy who's very

139:54

charismatic, who is a sociopath, who

139:56

who's in prison, and you find that guy

139:59

and teach him how to be a cult leader.

140:00

And then you give him acid and you show

140:02

him how to administer acid and how to

140:04

not take it and have all of his

140:06

followers take it and then direct their

140:08

thoughts and then eventually program

140:10

them like MK Ultra style to commit

140:13

murders. So they have the Tate-LaBianca

140:15

murders. They have a bunch of other

140:17

stuff that they did before that.

140:18

He's gotten arrested multiple times.

140:20

Every time he gets arrested they let him

140:22

go. And when they let him go like one of

140:24

the sheriffs says I was told it was

140:26

above my pay grade. So you're letting a

140:28

guy go who's a violent criminal, who's

140:29

violating parole, who's a lifelong con

140:31

man, and now he is running this cult,

140:35

and this cult is murderous. So the Tate

140:37

LaBianca murders, the Manson family

140:39

murders, all that stuff becomes public.

140:41

There's the hearings, the trials, the

140:43

whole thing. So the entire public

140:44

narrative changes on what a hippie is.

140:46

Now hippies are dangerous.

140:48

>> So before hippies were like, we're

140:50

nonviolent. We want love. We're We have

140:53

flowers. And now it's like, "Oh, these

140:55

[ __ ] people will cut your baby out

140:57

and write pig on the wall with your

140:59

blood, you know."

141:00

>> Is is the Altamont concert connected to

141:03

that as well? The Altamont concert, you

141:05

know? You know, the Rolling Stones

141:07

concert.

141:07

>> Oh, that was the That was the um Hell's

141:10

Angels. Yes. Right.

141:11

>> Yeah. Yeah. Yeah. And that was kind of

141:12

seen I I don't know. It's just a

141:14

question I'm asking like cuz that was

141:15

seen as the end of the hippie movement,

141:16

wasn't it? That was the death. That was

141:18

the final death rattle of the hippie movement.

141:21

That's how it

141:23

was written and portrayed.

141:24

>> Well, that's odd because the Hell's

141:25

Angels are not hippies and having Hell's

141:28

Angels as security is a wild move.

141:31

>> That's crazy.

141:33

>> Yeah, because it was a Rolling Stones

141:34

concert, but it was a free

141:37

concert, wasn't it? That was a thing.

141:38

>> How did they go about hiring See if you

141:42

can find the history on that. Did they

141:43

go about hiring the the Hell's Angels?

141:47

>> Yeah.

141:48

>> Both previously used the Angels for

141:50

security at performances without

141:52

incident.

141:52

>> Grateful Dead.

141:53

>> Grateful Dead and Jefferson Airplane.

141:55

>> This is also the next sentence says it

141:57

was denied. So I don't know that, but

142:00

that's what it says in the Wikipedia

142:01

here.

142:03

>> Um it says for $500 worth of beer.

142:05

That's all they had to pay them. The

142:07

story was denied by some parties who

142:09

were directly involved. According to the

142:10

road manager of the Rolling Stones 1969

142:12

US tour, Sam Cutler, the only agreement

142:15

there ever was, the Angels would make

142:17

sure that nobody tampered with the

142:18

generators, and that was the extent of

142:20

it. But there was no way uh they're

142:22

going to be the police force or anything

142:24

like that. That's all bollocks. The deal

142:27

was made at a meeting including Cutler,

142:29

Grateful Dead Manager, Rock Scully, and

142:32

Pete Knell, member of the Hell's Angels

142:34

San Francisco chapter. According to

142:36

Cutler, the arrangement was that all the

142:38

bands were supposed to share the $500

142:40

beer cost, but the person who paid it

142:43

was me, and I never got it back to this

142:45

day.

142:48

Okay. He said, uh, the Hell's Angels

142:50

guy, uh, says, "We don't police things.

142:51

We're not security force. We go to

142:53

concerts and enjoy ourselves and have

142:54

fun." Well, what about helping people

142:56

out? You know, giving directions and

142:58

things. He says, "Sure, we could do

142:59

that." How they would be paid? He said,

143:01

"We like beer." in the documentary Gimme

143:04

Shelter, Sonny Barger,

143:06

the guy that was the head of the Rolling

143:07

Stones uh the head of the Hell's Angels

143:09

stated that the Hell's Angels were not

143:10

interested in policing the event and

143:12

that organizers had told them the Angels

143:14

would not be required to do or would be

143:16

required rather to do little more than

143:18

sit on the edge of the stage, drink

143:19

beer, and make sure there weren't any

143:21

murders or rapes occurring.

143:26

Um,

143:26

>> the only reason I said that is because

143:28

it was that was kind of one of the

143:30

events that was that was heralded to be

143:32

the death to be the end of the hippie

143:34

movement.

143:34

>> Right. So, what happened? They stabbed

143:36

people. Something happened.

143:38

>> I think it was a free concert that the

143:40

Rolling Stones and these bands put on

143:42

and then it degenerated and then a riot

143:45

broke out and then the Hell's Angels who

143:47

were obviously not trained security then

143:50

went on the rampage.

143:52

And how many people died? That I don't

143:54

know.

143:56

Does it say here, Jamie?

143:58

>> Situation deteriorates. Killing

144:02

>> a woman got killed.

144:07

>> A .22 caliber revolver from the jacket.

144:10

Drew a knife. Stabbed him 16

144:13

times in the head, neck, and back.

144:15

>> Whoa.

144:16

>> It's a lot of stabbing.

144:20

Um,

144:22

>> so it says concealing the remaining 14

144:24

stabbings. What

144:30

>> Says he was high on meth when he died?

144:32

>> Oh boy.

144:34

>> Acquitted after the jury reviewed the concert

144:36

footage.

144:37

>> Rolling Stones were aware of the

144:38

skirmish but not the stabbing. Couldn't

144:40

see anything. It's just another scuffle.

144:42

Jagger tells David um Maysles

144:48

during film editing. It soon became

144:50

apparent they could see something of

144:51

what happened because the band stopped

144:52

playing mid song and Jagger was heard

144:54

calling into his microphone. Really got

144:56

someone hurt here. Is there a doctor?

144:58

After a few minutes, the band began

145:00

playing again and eventually completed

145:01

their set. They had to get paid. Um the

145:05

abandoned the show at one point was it

145:06

say Altoont became whether fairly or not

145:10

a symbol for the death of the Woodstock

145:12

nation.

145:17

>> Interesting.

145:18

>> Yeah.

145:19

>> Yeah. I mean it seems like if you're

145:22

going to have uh concerts, especially

145:24

going to have free concerts and you're

145:26

going to be using Hell's Angels as a

145:29

deterrent, you know, things could

145:31

definitely go sideways. Yeah,

145:33

definitely.

145:34

>> And maybe they just got lucky before

145:35

when they did it for Jefferson Airplane

145:37

and The Grateful Dead.

145:39

>> Yeah.

145:40

>> Joe, not to change the subject, but have

145:42

you have you been following this beef

145:43

between Eddie Hearn and Dana White

145:45

>> a little bit?

145:46

>> Cuz it's kind of interesting to me cuz

145:47

boxing seems to be like changing, right?

145:49

Because of what Zuffa Boxing is

145:51

doing.

145:52

Is that something you're like excited

145:54

about the possibility that boxing which

145:56

has been in, you know, there's so much

145:58

[ __ ] going on and you so very rarely

146:01

see the best fighters fighting each

146:03

other that that might change.

146:06

>> Well, the beef with those two, I I don't

146:09

know the root of it. I think it's

146:11

essentially that that, you know, it's

146:12

competition like Dana is now entering

146:14

into the MMA space.

146:16

>> Into the boxing space,

146:17

>> excuse me, the boxing space. And I was

146:18

gonna say Eddie Hearn is now entering

146:20

into the MMA space because now he's uh a

146:22

manager of Tom Aspinall. Yeah. Which is

146:25

very interesting.

146:26

>> Um okay. Anything that gets fighters

146:28

more money I'm for.

146:29

>> Yeah.

146:30

>> And if you know more attention, more

146:32

money, more different promoters, more

146:34

people competing to give people higher

146:35

purses.

146:36

>> The real problem is with MMA there's

146:38

nothing. I mean there's essentially the

146:40

UFC and everything else is a distant

146:43

second.

146:44

>> Yeah. And it's a distant second in terms

146:46

of uh attention. Uh in some places it's

146:50

not a distant second. It's not a

146:52

distant second in terms of revenue,

146:54

right? So like the PFL for instance, the

146:57

PFL was offering a million dollars for

146:59

anybody who could win these tournaments.

147:01

And the the caliber of fighters that

147:04

were winning this tournament were not

147:05

the same caliber as UFC champions. And

147:08

then some of the people that were

147:09

competing in the UFC were not making as

147:10

much money as these people that had left

147:12

the UFC because they really weren't able

147:16

to beat the best guys. They went over

147:18

there and they made a million dollars.

147:20

>> Look, I think that's good for fighters.

147:22

>> It's not good for really talented guys

147:25

that really want to be the UFC champion

147:28

because you can languish over there for

147:30

a long time. And there's some good

147:31

examples of guys who spent four, five,

147:34

six years over there that really had

147:36

potential to be a world champion. And

147:38

they are, you know, in quotes a world

147:40

champion over there. But ask the average

147:42

person on the street who they are. No

147:44

one knows. Ask him who Alex Pereira is.

147:46

Everybody knows. The thing is those

147:50

guys, if they're doing that and they're

147:52

getting paid more, you have to make a

147:54

decision like are you willing to take

147:56

more money now in this organization

148:00

versus the potential of much more fame,

148:03

sponsors, and maybe less money initially

148:06

in the UFC, but if you can be a

148:08

champion,

148:09

>> that's really what every fighter wants

148:11

to be. Because if you spend five, six

148:13

years in an organization, the reality is

148:16

your prime is about five, six years. You

148:20

look at the elite of the elite guys,

148:21

Anderson Silva in his prime, it's about

148:23

five, six years. Fedor Emelianenko in his

148:26

prime, it's about five, six years. So

148:28

you could you could burn out your prime

148:31

in an organization where you're not

148:33

facing as much talent and not getting

148:34

as much recognition. So it depends on

148:36

what you're doing it for. If you're

148:38

purely a prize fighter and you want to

148:40

fight for the highest bidder, the

148:41

difference between MMA and boxing is

148:43

you can do that in boxing. So in boxing,

148:47

people go to see the fighter. You know,

148:49

Terence Crawford is fighting Canelo

148:51

Alvarez, my mom could be the promoter.

148:53

Nobody gives a [ __ ] They want to see

148:54

that fight. And you put that fight on

148:57

pay-per-view, it's going to sell. You

148:58

put it on DAZN. You put it on Netflix,

149:01

it's going to sell. In MMA, that's not

149:04

necessarily the case. M

149:06

>> the interesting challenge to that is

149:07

this Netflix thing. So with Ronda Rousey

149:10

versus Gina Carano.

149:12

>> Yeah.

149:12

>> Even though Gina Carano hasn't fought

149:14

since uh the 2000s. I don't remember

149:18

what year was the last time she fought.

149:20

I want to make a guess. Let me guess. I

149:23

want to say 2007 2008.

149:26

When was the last time Gina Carano

149:27

fought?

149:29

Um and she's 43 and I think Ronda's 39.

149:33

But Ronda is so famous and people are so

149:35

interested. And if it's on Netflix and

149:37

people already have Netflix,

149:39

>> I guarantee you you'll get millions of

149:40

people that'll watch that. Yeah. So,

149:42

that'll be good, right? And that's good

149:43

for the fighters. And I know they

149:45

offered uh some fighters that I know a

149:47

very large purse to compete on that

149:49

card.

149:49

>> Well, Francis Ngannou might be one,

149:50

right?

149:51

>> He might be. Yeah, that's there's talk

149:53

of that. No, actually, I think that's

149:55

been I think that's been confirmed. I

149:56

think he's fighting Philip Lind

150:00

>> 2009. Okay. That was the last time she

150:02

fought.

150:02

>> Cris Cyborg was a beast.

150:05

>> Yeah, there was a lot of supplements

150:07

involved in that.

150:09

>> There was a lot going on with her. I

150:10

mean, that was that was the wild west of

150:12

testing and she looked freaky.

150:14

>> Yeah, the eye test was kind of

150:16

>> Yeah, the eye test was 100%. Um, so,

150:20

>> but it'll still be exciting. People

150:22

will, you know, and hopefully they've

150:24

had enough time. I know there's a lot of

150:25

video footage of Ronda training for

150:28

quite a while. She lost a lot of weight.

150:30

She got really slim. She looks fit. You

150:32

know, she's she looks outstanding,

150:34

especially with her grappling. She's

150:35

doing a lot of judo throws and arm bars

150:37

and she doesn't look like she's lost a

150:39

step.

150:40

>> There's a difference between that and

150:41

competition. There's ring rust. There's

150:43

a lot of factors. Gina Carano was a

150:45

legit Muay Thai champion. She's got real

150:47

power. And she was a very good striker

150:49

when she was young. She was a very

150:51

technical, solid striker when she was

150:53

young.

150:55

How long has it been since she, you

150:57

know, well, I mean, how long did she

150:59

stop training for? Right. She did

151:01

movies. She's done The Mandalorian.

151:04

She's had a lot of success acting, but

151:07

there it seems like there was probably

151:08

quite a bit of time. She lost a ton of

151:10

weight, too. Look.

151:12

>> And she looks quite a bit bigger than

151:13

>> Those are two attractive ladies, if I

151:15

say so. Yep.

151:17

And

151:18

>> Oh, Jake Paul's in the middle.

151:19

>> Looks very angry.

151:20

>> He looks kind of awkward there, Jake, to

151:22

be honest.

151:26

Uh, you know, when it comes to

151:28

grappling, you give Ronda a big

151:29

advantage. She's one of the best

151:30

submission artists ever, period. You

151:32

know, her arm bar is about as good as it

151:34

gets.

151:34

>> She's got fantastic judo, bronze

151:37

medalist in the Olympics. When it comes

151:39

to Gina, Gina was like a solid striker

151:41

when she was young and the difference in

151:43

striking would definitely benefit Gina.

151:45

You would have to lean in her direction.

151:47

But again, when you're talking about

151:49

like 2009,

151:51

>> it's a long time, man. 17 years. It's a

151:54

long time to not compete.

151:56

>> Well, there do seem to be a lot of

151:57

fights nowadays in in various

151:59

disciplines happening where it's like

152:01

you're not seeing people at their prime.

152:03

>> You're maybe sometimes seeing people who

152:04

aren't professionals,

152:05

>> right,

152:06

>> but are famous.

152:07

>> And there seems to be a lot of money to

152:09

be made doing that.

152:10

>> Yeah. As long as you match them

152:11

correctly, right?

152:12

>> Yeah.

152:13

>> That that was the thing that was wild

152:14

with Jake Paul versus Anthony Joshua.

152:17

>> Uh-huh. You've got Jake Paul, a cruiserweight,

152:19

and you've got Joshua who is heavy for

152:22

the heavyweight division. You looked at

152:24

the size matchup between the two and at

152:27

one point I was like he's going to kill

152:29

him.

152:30

>> Well, he did. I mean I think Jake

152:32

probably knew it going in and just I

152:34

think his game plan was just to move a

152:36

lot,

152:36

>> you know, and he did a lot of that. Did

152:38

a lot of moving. He uh he hit him a few

152:41

times and he hit him with some wild

152:43

shots from the outside where he kind of

152:45

dove in and threw wild punches. I think

152:47

that was probably part of the strategy.

152:48

But

152:49

>> I mean, ultimately, you're looking at

152:50

Anthony Joshua, who's not just a

152:52

heavyweight champion in boxing, but a

152:54

one punch knockout artist and a former

152:56

Olympic gold medalist.

152:57

>> He's a [ __ ] highly skilled man. Very

153:01

highly, highly skilled.

153:03

>> That was a strange fight because up to

153:04

that point, Jake Paul's fight was the

153:06

other way around. Like he had a he

153:08

clearly had an advantage

153:10

>> and that was like flipping the script

153:12

the other way.

153:12

>> Well, smart dude, you know, very smart.

153:15

>> Do you think that was smart? Yeah. Very

153:16

smart in how he promotes himself. Smart

153:18

in that like you can't criticize him for

153:20

not fighting dangerous fights anymore.

153:21

>> A guy you gota respect, right? Yeah.

153:24

>> The Mike Tyson one was a little sus.

153:26

>> I mean Mike Tyson is, you know, he's on

153:28

the older side. Whereas Anthony Joshua, he's

153:30

not that old.

153:31

>> It's not just that. Like the fight

153:32

itself was a little sus.

153:34

>> How do you mean?

153:34

>> Cuz it looked like a sparring match,

153:36

right?

153:36

>> Looked like there was an agreement in

153:37

place.

153:37

>> Okay.

153:38

>> I don't know if there was, you know, but

153:41

Terence Crawford thought it was.

153:43

>> Really?

153:43

>> Yeah. It looked a little sus. I mean,

153:45

Mike is How old is Mike? 58, 59.

153:48

>> Yeah. I mean, I I still wouldn't get in

153:49

a ring with him.

153:50

>> Yeah. No, he'll still kill you. But it's

153:52

like I mean, it's not saying that Jake

153:55

would have even won. I mean, who knows?

153:56

I mean, if if Mike really could have

154:00

like you saw he's capable of those

154:02

flurries when he's hitting pads, he's

154:05

still capable of massive speed and

154:07

power. It's not saying that, but it's

154:09

like could he sustain a real fight? Does

154:11

he want to get hit in the head anymore

154:12

at this point in his life? And and it's

154:14

also when you get to that age, you can

154:16

look and you can there'll be glimpses

154:19

where you're like, "Oh, this is the old

154:20

the Tyson of old."

154:22

>> But it's also as well, he's still a 58-year-

154:24

old dude. You know, a punch around the

154:26

head that can cause a brain hemorrhage,

154:28

etc. And he can he can die.

154:31

>> Fighters die at the peak of their powers

154:33

or get brain damage.

154:34

>> I mean, it's going to be I'm I'm no

154:36

neurologist, but I I'm certain that that

154:38

is a higher risk when you're 58.

154:40

>> Yeah, I wouldn't recommend it.

154:43

So, so the thing, this is why I asked

154:45

you about Eddie Hearn and Dana. There's

154:46

talk about them having a boxing match.

154:47

>> Oh, that's funny. Dana can box. He can

154:50

really box. Like I've I've seen Dana hit

154:51

mitts before. I've seen Dana spar. Dana

154:53

can actually box. And there was a time

154:55

where Dana was supposed to have a boxing

154:56

match with Tito Ortiz,

154:58

>> right?

154:58

>> Wow.

154:59

>> And um you know, even Tito Tito

155:01

acknowledged because Dana was his

155:02

manager at one point in time. Even Tito

155:04

acknowledged like Dana's a really good

155:05

boxer. He can box. He spent a lot of his

155:08

time boxing when he was young. I mean, I

155:09

don't know how much of it he's doing

155:11

these days. He's so [ __ ] busy,

155:13

>> you know? He's so involved in Zuffa

155:15

boxing now

155:16

>> and uh

155:18

>> he's involved in some of these Riyadh

155:20

Season events. So, it's like, you know,

155:23

I don't I think it's probably just talk,

155:24

you know. Eddie Hearn's a very tall guy,

155:26

though.

155:26

>> He's a big dude.

155:27

>> Yeah, he's a big dude.

155:28

>> And he used to box as well.

155:29

>> Did he?

155:29

>> He used to he boxed his dad, I think. I

155:31

heard him talking about that.

155:33

>> Interesting.

155:33

>> Yeah.

155:34

>> Yeah.

155:34

>> There's a fight.

155:35

>> Yeah. I mean, I guess I don't want to

155:37

see it. I'll watch though. Well, this is

155:40

what I mean. You don't want to see it,

155:42

but you'll watch.

155:42

>> Yeah.

155:43

>> Well, there's certain things I don't

155:44

want to see that I watch. Like

155:45

slap fighting, like if someone sends me a a

155:47

video, if it shows up on my Instagram

155:49

feed, of some, you know, poor slob

155:52

getting slapped into the shadow realm,

155:54

I'll watch it just for how they hit

155:56

their head off the table and stiffen up

155:57

on the way down.

155:58

>> Yeah,

155:59

>> that's combat sports for the TikTok

156:01

generation.

156:01

>> Yeah. When you think about it,

156:03

>> it's not even combat sports. I mean,

156:05

it's just slapping each other. That's

156:07

all it is. you if you want to call

156:09

slapping each other a sport that seems

156:10

crazy. It's also there's a fundamental

156:13

problem with slap fighting is that

156:14

someone has to go first.

156:16

>> Yeah.

156:16

>> Yeah.

156:17

>> And that's a giant advantage. Going

156:19

first is the biggest advantage of all

156:21

time, you know.

156:22

>> And that how's that decided coin toss?

156:24

>> I don't know. I don't watch it.

156:26

>> Is it?

156:27

>> I have no idea.

156:28

>> No, I can't I physically can't watch it

156:31

to be honest.

156:32

>> Yeah. Well, to me it's like this Zuffa

156:35

Boxing is real. This is a real combat

156:38

sport whereas it's not just slapping

156:40

each other in the head.

156:41

>> Well, to me the exciting thing and

156:42

correct me if this is wrong but the

156:43

exciting thing is it has felt for a long

156:46

time that seeing top boxers fighting

156:48

each other is a rare occurrence. UFC you

156:50

see that every single card.

156:52

>> Yeah. Well, the Saudis are stepping up

156:55

and that's, you know, um with uh Turki

156:58

Alalshikh like his his role in boxing has

157:02

really changed that like what what

157:04

they've done with Riyadh Season is

157:06

make fights that managers have said

157:09

don't do this.

157:10

>> Yeah.

157:10

>> Like a a a good one is uh Martin Bakole

157:14

versus um God I forget his name.

157:17

Anderson,

157:20

uh, young prospect, very good fighter.

157:23

And Bakole is a [ __ ] big, dangerous

157:26

guy. And Bakole knocked this guy out.

157:28

>> Jared Anderson.

157:28

>> Jared Anderson. That's it. Um, Jared

157:31

Anderson was an undefeated up-and-

157:33

coming prospect, young guy, and

157:35

Bakole beat him up. And uh, he really

157:39

wasn't there yet. He wasn't ready for

157:41

that guy yet.

157:42

>> And Bakole stopped him, and that derailed

157:45

this guy's career. But he probably got a

157:47

big paycheck.

157:48

>> Right.

157:49

>> Right. And so what I understand is there

157:51

was a lot of people saying that's a bad

157:52

fight to make.

157:54

>> Don't do it.

157:55

>> Yeah. I mean the UFC has seemed to

157:57

me from the outside quite careful about

157:59

like giving people like Bo Nickal and

158:02

Rosas Jr. and Sean O'Malley just trying to

158:06

get them to build up slow and even

158:07

they've you know Rosas and Bo Nickal both

158:10

lost at one point right. Well, yeah.

158:12

Well, Bo Nickal fought Reinier de Ridder,

158:14

who's a ONE champion and a huge guy for

158:17

the middleweight division. And Reinier

158:19

did a fantastic job of, you know, you

158:21

don't want to take Reinier to the ground

158:22

because he's an elite submission

158:24

athlete. And standing up, he's got

158:26

vicious knees to the body. That's like

158:28

one of his best weapons. And he he

158:30

[ __ ] Bo up. But that was a good fight

158:32

for Bo because he came back from that

158:34

and fought Rodolfo Vieira and looked

158:36

fantastic afterwards. like he's a real

158:38

competitor and a winner and the kind of

158:40

guy that gets knocked down like that is

158:42

going to get back up and be five times

158:44

more ferocious and that's what he is.

158:47

>> But, you know, it's one of those things

158:49

where it's like why do you protect some

158:52

people and not protect others? You know,

158:53

and is it because they have better

158:55

management? Is it, you know, because

158:56

sometimes the UFC will tell you like if

158:59

you want to fight in the UFC, hey, we've

159:01

got a fight uh we need an opponent in

159:04

four days. someone dropped out and

159:05

you're going to fight blank.

159:08

>> And that person who you're going to

159:09

fight might be a surging contender who's

159:12

[ __ ] terrifying, who's putting

159:14

everybody to sleep and you have to make

159:16

a decision like this is not a good fight

159:17

for me at this point in my career, but

159:19

if I say no to the UFC, maybe they will

159:21

never offer me a fight again.

159:23

>> And also, you're a fighter and fighters

159:25

from everything that I know you're

159:27

going to back yourself.

159:29

>> Yes. But you have to do that

159:30

intelligently, right? You have to

159:32

realize that if you if you are in a

159:35

process and this is the thing about

159:36

everyone up into like championship level

159:38

up until a certain point in time when

159:39

you plateau everyone is constantly

159:41

getting better. So if you you you get

159:45

better from training, you get better

159:47

from work with your coaches, but you

159:49

also get better with experience. And

159:51

what boxers and boxing management has

159:54

always done is make sure that you get

159:56

the proper experience and the proper

159:58

kinds of opponents are going to test you

159:59

in certain ways along the way. So the

160:01

idea is you give a fighter a stiff test

160:03

that they can pass. You don't give a

160:05

fighter a chance where they're going to

160:07

compete against someone who's many, many

160:10

levels above them and they don't have a

160:11

chance at all because that can destroy

160:13

confidence that could like cause real

160:15

damage to you. You can get really badly

160:17

hurt and never be the same again.

160:18

There's certain fights that fighters

160:20

have where they are never the same

160:21

again. They get knocked out by someone

160:23

and they just aren't the same. They get

160:24

a flying knee to the face and they're

160:26

done. They get a head kick and they're

160:28

done and they just are never the same

160:29

guy again. And you could point to

160:31

numerous examples of good fights where

160:33

there weren't mismatches, but that a

160:35

fighter was never the same again.

160:37

>> It's a dangerous sport.

160:38

>> It's dangerous. I

160:42

mean it's not the most in terms of

160:44

death. Boxing is the most in terms of

160:46

death. And I think that's because they

160:47

have less options.

160:49

>> You know, you can't clinch. You can't

160:51

hold on, try to take a fight to the

160:52

ground. You can't defend yourself as

160:53

well. There's also the thing where you

160:55

get knocked down and you get back up.

160:57

Well, you clear your head momentarily,

160:59

but you're still [ __ ] And now you

161:01

can't get out of the way of punches. Now

161:02

you're really getting [ __ ] up. And

161:04

you're getting much more damage than you

161:05

would have gotten if you got clipped

161:07

that first time and then the guy punched

161:08

you a couple times when you're on the

161:09

ground.

161:10

>> You got choked out.

161:11

>> Yeah. Or you got choked out is way

161:12

better. Choked out's way better. Arm bar

161:14

way better. Just tap and then you're

161:16

good.

161:16

>> It's also the duration of the fight.

161:18

Boxing matches tend to last for a lot

161:20

longer normally if they go the duration.

161:22

>> They certainly can. If it's 12 rounds,

161:24

right, you're dealing with 36 minutes

161:26

of fighting of getting punched in the

161:28

head versus 25 for an MMA fight. The

161:31

opposite of that you would say though,

161:33

but they're not getting slammed on their

161:35

head. They're they're not getting

161:36

kicked. They're not getting kneed in

161:38

the face. They're not getting cut open

161:39

with elbows. There's a lot of things

161:41

that can happen in an MMA fight that are

161:42

way worse. But do you do you also think

161:45

as well that when I watch MMA losses

161:48

look of course losses are detrimental

161:50

and they affect careers and they knock

161:51

people back but they don't seem to be as

161:53

consequential as losses in boxing

161:55

>> in terms of your career.

161:56

>> Yeah. In terms of your career and the

161:58

way you're perceived,

161:59

>> right? Well, I think it's accepted that

162:01

if you're fighting

162:03

a bunch of different styles, you know,

162:04

style versus style, you're there's

162:07

always a potential of losing, especially

162:09

amongst the elite of the elite. And

162:11

you're seeing more of that in MMA at the

162:13

highest level. Yeah. You're not seeing

162:15

guys avoiding each other because there's

162:16

one champion and it's a UFC champion in

162:19

that weight class and you have to fight

162:20

that guy if you want the title. Whereas

162:22

there's the WBC, the WBO, the IBF and

162:26

you have all these different

162:27

organizations for boxing and so you can

162:30

be a champion while avoiding the other

162:32

champions.

162:33

>> Whereas in the UFC that's the thing

162:34

that's exciting is like you get to see

162:36

Max Holloway who is a super dominant

162:38

guy, right? And then he fights Charles

162:39

Oliveira.

162:40

>> Yeah.

162:41

>> And it's it doesn't go that way.

162:44

>> It's crazy.

162:45

>> That that was so dominant.

162:46

>> Yeah. It was so dominant. And Max

162:47

Holloway was a two-to-one favorite at least

162:49

at some points in the betting line.

162:52

>> Yeah.

162:52

>> Yeah.

162:52

>> And it kind of look I mean obviously

162:54

Khamzat Chimaev is a whole category of his

162:56

own, but it sort of felt a little bit

162:58

that level of domination on the ground.

163:00

>> Yeah. The difference is Holloway was

163:02

getting dominated on the feet too.

163:04

>> Oliveira is [ __ ] dangerous as [ __ ] on

163:06

the feet. I mean, he was better

163:08

everywhere.

163:09

>> Yeah.

163:10

>> And he's bigger. He's a bigger guy. And

163:11

you could you could see that in the

163:13

exchanges. Like every time he got a

163:14

clinch on Max, he just hoisted him up in

163:17

the air and slammed him to the ground.

163:19

It was so definitive,

163:20

>> right?

163:20

>> That was a spec spectacular performance

163:23

by Oliveira. But

163:24

>> it was

163:25

>> the thing that my concern going into

163:27

that fight was I'd watched the Mateusz

163:29

Gamrot fight with Oliveira. I'm

163:31

like Oliveira is as good if not better

163:34

than he's ever been before. Gamrot is

163:35

[ __ ] dangerous and he's a really good

163:37

grappler and they went to the ground and

163:39

he was lost. Oliveira was just tying him up

163:42

in knots. He wasn't able to get anything

163:43

off on Oliveira. I'm like, what is Max

163:45

going to be able to do on the ground

163:47

against this guy? And then when it comes

163:48

to standing up,

163:50

>> Justin Gaethje said no one ever hit him

163:51

harder than Oliveira did. That Oliveira is

163:54

like he carries big power in his punches

163:56

and big power in his kicks, too. And

163:58

he's he's so reckless on the feet. not

164:00

reckless, I should say, but just like so

164:02

aggressive on the feet because he wants

164:03

you to take him to the ground

164:05

>> cuz he's the best submission artist in

164:06

the history of the sport. He has more

164:08

submissions than anyone ever in the

164:09

history of the sport.

164:10

>> Wow.

164:11

>> And the thing that you don't appreciate,

164:12

I mean, you really kindly sorted out

164:14

tickets for us in the uh UFC in New

164:17

York.

164:18

>> And you know these guys kick hard. You

164:20

know they punch hard, but when you're

164:21

there ringside and you feel the kick,

164:24

you're like,

164:24

>> "Oh yeah, you guys were close." That's

164:26

the thing is when you're close, you can

164:27

hear the

164:28

>> the slap. You know what happened though

164:30

in New York? We sat down and then some

164:32

guy that I didn't initially recognize

164:33

came and sat in front of us

164:35

>> and that was Dillon Danis and he kicked

164:38

off this whole

164:39

>> the brawl.

164:40

>> He sat literally right in front of us.

164:42

>> Oh, you had a front row seat.

164:43

>> Yeah, we did.

164:43

>> So, the actual fight.

164:44

>> Oh, boy. Was that exciting?

164:47

>> Yeah. Well, I just turned to the side

164:49

and then there was this just giant brawl

164:51

right in front of us all of a sudden. It

164:53

was

164:53

>> Those are very unfortunate. You don't

164:55

like those?

164:55

>> Yeah. No, me neither. It was crazy. It

164:57

was crazy. Well, Dillon Danis, he knows

164:59

how to get a lot of attention.

165:01

>> Yeah,

165:01

>> that and yeah, he does. And I will say

165:04

this, I don't agree with his behavior,

165:05

but unlike a lot of online trolls, that

165:07

guy actually does it in real life. Do

165:09

you know what I mean?

165:10

>> Right.

165:10

>> He actually like I don't like

165:12

>> Dillon can fight. He can fight. He's a

165:14

very good submission artist. He's a

165:16

Marcelo Garcia black belt. He's very

165:18

legit on the ground. Conor McGregor

165:20

brought him in for training like for a

165:22

lot of his camps.

165:23

>> Yeah.

165:24

>> Are you excited for the White House

165:25

card? That looks really good. Uh, yes.

165:28

Um, I'm excit It's It sounds crazy. I

165:31

know it's going to be very high security

165:32

and high stress and weird to have a

165:35

fight at the White House in the middle

165:37

of a [ __ ] war. But I would hope

165:40

the war will be sorted out by June, but

165:42

quite honestly, I'm not confident that

165:45

that's going to be the case.

165:46

>> No.

165:46

>> Yeah. No.

165:47

>> Yeah. So, that would be weird having

165:48

this very high-profile event where

165:50

everybody's in one place at one time,

165:51

right there.

165:52

>> I hadn't thought of that.

165:53

>> Yeah. Yeah.

165:54

>> So, you're not excited to be there? That

165:55

seems like it seems like you're asking

165:57

for

165:57

>> Holy [ __ ] I hadn't thought of that at all.

165:59

>> How could you not think of that?

166:00

>> Well, cuz cuz I'm not going to be there.

166:02

You're the one that has to think of it.

166:03

I was just like, "This is a great

166:04

lineup. I look forward to the fights."

166:06

>> Yeah,

166:06

>> cuz if you want to talk about hitting

166:08

hard, I mean, Ilia Topuria, [ __ ] me.

166:10

>> That's when you think like Max Holloway

166:13

and Charles Oliveira. Charles Oliveira

166:15

dominates Max Holloway. And then you

166:17

realize like how quickly Ilia Topuria

166:21

starched Charles Oliveira and you go,

166:23

"How good is that guy?" Right. He's a

166:25

once in a generation talent. Like he's

166:27

he knocked out in three fights three

166:30

all-time greats.

166:33

>> Volkanovski and then he knocks out Max

166:36

Holloway and then he knocks out Charles

166:38

Oliveira in two different weight

166:40

classes. He knocks out three all-time

166:42

greats three fights in a row and he just

166:45

definitively starches these guys. Like

166:48

he's a once in a generation talent. And

166:51

think how good Volk how dominant Volk is.

166:54

>> Yeah.

166:54

>> Like how good he looked against Diego

166:55

Lopes, right? How good Max Holloway

166:57

looks all the time, especially in the

166:59

striking exchanges. Max is a very hard

167:01

guy to hit. And Ilia just dominated him.

167:04

He's [ __ ] spooky good and insanely

167:06

confident, insanely charismatic.

167:09

>> Yeah. Do you think part of it is as well

167:11

is just technique is so important is the

167:13

is the most important thing because you

167:15

look at Usyk when he came up against

167:17

Fury

167:18

>> and the first fight I didn't give Usyk a

167:20

prayer. Like Usyk is basically a a

167:22

glorified cruiserweight right

167:24

>> and you look at Tyson Fury, six-eight, undefeated

167:27

you know he comes from a traveler

167:29

background this is a guy who was taught

167:30

to box from the age of three I've taught

167:32

traveler kids they they're all taught how

167:34

to fight they know how to fight they

167:36

know how to throw punches boxing is in

167:38

their blood.

167:39

And you just saw that he was so

167:40

technically supreme.

167:42

>> Yeah.

167:42

>> That Fury had no answer and lost

167:45

consecutive fights against him.

167:46

>> Yeah. And then look at what he did to

167:48

Dubois.

167:49

>> You know, Usyk is special. I mean, he's

167:51

he's basically a gigantic Lomachenko.

167:54

>> Like unbelievable movement. And he was

167:56

trained by Lomachenko's father as well.

167:58

Same same trainer.

168:00

>> Yeah. Um I mean there's just people that

168:02

are better than everybody else. And it

168:04

seems like Ilia is one of those guys. He's

168:06

just weirdly better than everybody else

168:09

and he and he can take it too. Like one

168:11

of the fights that he had, so when he

168:13

was competing at featherweight, he took

168:15

a fight at lightweight against Jai

168:16

Herbert and Jai Herbert in the first

168:18

round caught him with a perfect head

168:20

kick. Rocked him, dropped him, and Ilia

168:23

Topuria wound up grabbing his legs, taking

168:26

him down. They fought on the ground and

168:27

then the second round Ilia just put him

168:29

into the shadow realm. He hit him with a

168:31

combination against the cage where he

168:33

hit him with a I think it was a left

168:35

hook to the body and a right overhand

168:37

that just spun his head around. It was

168:39

wild. I mean, face first, face planted.

168:42

He's He's got freakish power. So, it is

168:46

technique. His technique is flawless.

168:48

His technique and the grappling is

168:49

flawless, but it's also

168:50

>> Was he good at grappling as well?

168:51

>> He's phenomenal at grappling. That's his

168:54

main base. He started off as a grappler.

168:56

>> Really?

168:57

>> Yeah. I didn't even know that.

168:58

>> Oh, yeah. Yeah. and his early fights are

168:59

just taking people down.

169:00

>> You don't see that much nowadays. He

169:01

just [ __ ] knocks everyone out

169:02

basically, right?

169:03

>> He's just I It's his mind more than

169:06

anything. His confidence is real. It's

169:09

not bluster. Like he's he's really s

169:12

>> He celebrates the fights the day before

169:15

the fight. He has a celebration of his

169:17

victory. He did that against Charles

169:19

Oliveira. They all went out to dinner.

169:20

They're making toasts celebrating his

169:22

victory the night before the fight

169:24

itself.

169:25

>> That's a high-risk strategy.

169:27

>> It's a crazy thing. And then he goes out and

169:28

knocks him out in the first round. Like

169:29

who [ __ ] knocks out Charles Oliveira

169:31

in the first round like that that

169:32

especially now like the Charles Oliveira

169:34

of today? That's crazy. He's it's it's

169:37

it's

169:39

the mind that allowed him to get so

169:42

elite at grappling also allowed him to

169:44

get so elite at striking,

169:46

>> right?

169:46

>> And it's it's setups and traps. It's not

169:50

just throwing wild bombs. It's defense.

169:53

His defense is fantastic. You saw that

169:55

in the Josh Emmett fight. Josh Emmett,

169:57

like as far as like one punch power, he

169:59

rivals everybody. I mean, you saw that

170:01

fight where he knocked Bryce Mitchell

170:03

out with a punch to the forehead. That

170:05

guy hit so [ __ ] hard. And when you

170:07

look at it, it it makes sense. I mean,

170:09

he's a [ __ ] tank.

170:10

>> Yeah.

170:11

>> And Ilia Topuria just slipped away from

170:13

everything. Slipped away from everything

170:15

and then eventually just put it on him.

170:16

>> Yeah.

170:17

>> He's he's great everywhere, man. He's

170:19

great on the ground. He's great standing

170:21

up. And more importantly, it's his mind.

170:24

like he doesn't make mistakes. He's a

170:27

he's just a force in there.

170:29

>> Yeah.

170:29

>> He's the new breed, you know, like with

170:32

every generation there's every

170:34

generation builds on the success of the

170:36

previous generation. They all learn from

170:38

the elites of the past.

170:40

>> He's our version of what's possible now.

170:42

>> Wow.

170:43

>> He's that good.

170:44

>> I was hoping I was hoping for the White

170:46

House card Dana would do something and

170:48

pull Jon Jones out of the bag.

170:49

>> I was hoping that as well. Yeah, I was

170:51

hoping that as well.

170:52

>> That would have been really special.

170:54

>> Yeah. I don't know why that didn't

170:55

happen. I don't know. I mean, there's

170:57

John's version. There's the UFC's

170:59

version. I don't know what was the

171:01

stumbling block there.

171:02

>> Well, I think it's fair to say him and

171:04

Dana don't get on very well.

171:06

>> I don't think it's that bad. It's They

171:08

certainly could make a deal. That's I

171:10

don't think it's as bad as like say

171:11

Francis Ngannou. The Francis Ngannou situation

171:14

like Dana does not like him at all and

171:16

won't do any business with him period.

171:18

>> Cuz that would be the fight. Francis Ngannou

171:20

versus Jon Jones. Oh my word. That would

171:22

be good. But also Alex Pereira versus

171:24

Jon Jones would be the fight as well.

171:26

Like you wouldn't need a title. A title

171:28

doesn't mean anything. You could do the

171:30

BMF heavyweight version. Like it doesn't

171:32

matter,

171:32

>> right?

171:33

>> Like just those two guys fighting. I

171:35

mean that would be titles are irrelevant

171:37

when you're dealing with the all-time

171:39

great in Jon Jones, the greatest of all

171:41

time. And then Alex Pereira, a

171:43

generational talent who's the most

171:45

devastating striker we've ever seen

171:46

inside the sport. I mean, as you look at

171:48

Ilia, I mean, Ilia is phenomenal, but

171:50

Ilia is like more complete as a fighter,

171:53

>> but Alex is freaky. Did I ever tell you

171:57

what Marc Goddard said when he fought

171:59

Khalil Rountree. So, he he beats Khalil

172:02

Rountree up and they stop the fight and

172:05

then Marc Goddard grabs me like as I go

172:07

into the octagon, he goes, "The sound it

172:10

makes when he hits them is ungodly."

172:14

That's what he said. He goes, he goes,

172:15

"Mate, I've been doing this for 20

172:17

years." He goes, "I've seen it all." He

172:19

goes, "It's different. The sound, the

172:22

impact is like he's a freak, man. That

172:25

guy, he he's a physical freak. I mean,

172:27

he's a real genuine Amazon warrior who's

172:31

just built different than other people,

172:33

>> you know? I'm sure you've seen him punch

172:35

that machine."

172:36

>> Yeah.

172:36

>> Where he gets like 190. Francis Ngannou

172:39

got like 129. What?

172:41

>> And he got 190. Yeah.

172:43

Holy [ __ ]

172:44

>> Yeah. No, it's freak power.

172:46

>> I did not enjoy watching the end of that

172:48

Khalil Rountree fight. I'm not going to

172:49

lie. That was rough. I mean,

172:51

>> and Khalil Rountree is like a [ __ ]

172:52

animal.

172:53

>> He's a warrior. He's just a warrior. I

172:55

mean, he knew going into that fight, he

172:56

was willing to go out on his shield. He

172:58

wasn't afraid. And he went after him. He

173:00

went after him. He did.

173:02

>> But the consequences of getting hit and

173:04

then Alex was starting to tune him up at

173:06

the end where he was leaning away from

173:08

shots and then countering and leaning

173:10

away from shots and countering. He was

173:12

he was in his flow state and that's

173:14

where it got real spooky

173:15

>> because Khalil became like a sitting

173:17

target and with each shot his ability to

173:19

get out of the way diminished with each

173:21

kick that landed his ability to move

173:23

diminished and it got spooky.

173:25

>> Yeah. And then it becomes a dilemma for

173:26

the referee like when do you actually

173:28

step in because

173:30

>> look there there's a consent for the

173:32

fighter to be there and to take part in

173:34

the fight but there comes a point where

173:36

you have to step in for the for the

173:38

fighter's own health.

173:39

>> Yeah. There comes a point where you

173:40

realize they can't defend themselves

173:41

anymore and they're getting just tuned

173:43

up and that was the end of the fight. I

173:45

mean, that was the right time to stop

173:46

it, but it was hard to watch.

173:49

>> It was. But then you get fights like

173:50

Usman versus Leon Edwards.

173:52

>> Mhm.

173:53

>> Where he's getting smashed for five

173:54

rounds and he just [ __ ] pulls a kick

173:56

out of nowhere in the last minute and knocks

173:57

him out.

173:58

>> Yeah. So, but he wasn't getting smashed.

173:59

>> Not the way Khalil Rountree was at

174:01

the end. That's fair. Yeah, that's fair.

174:02

>> He wasn't getting smashed. He was getting beat.

174:04

>> He was getting beat. Yeah.

174:05

>> Yeah. Yeah. But he wasn't like in danger

174:06

of getting stopped or really hurt badly.

174:09

But that's why we all watch it because

174:11

it's that knowledge that anything can

174:13

happen.

174:14

>> You know, Hasim Rahman versus Lennox

174:16

Lewis. You know,

174:18

>> no one gave Hasim a prayer when he went

174:21

in. Lennox is a supreme fighter,

174:23

Olympic gold medalist, one of the

174:25

greatest to ever do it.

174:27

>> And then that one punch he hit flush on

174:30

Lennox's jaw and he was out. I remember

174:31

watching it going, I mean, no one saw

174:33

that.

174:34

>> Especially in the heavyweight division,

174:35

one punch with those guys.

174:36

>> Yeah. Yeah.

174:37

>> It's why I really like it cuz in our

174:38

world like you know if I do a debate

174:40

everyone talks [ __ ] to each other and

174:41

like you know everyone talks [ __ ] then

174:43

they go have a debate everyone still

174:44

talks [ __ ] afterwards. In combat

174:47

sports everyone talks [ __ ] and then you

174:48

find out

174:49

>> right. Yeah. Yeah. It's very definitive.

174:52

You win or you lose. It's not subject to

174:54

other people's interpretations, cuz

174:56

like you'll see that in debates too.

174:58

Like I'll see a debate where I think,

175:01

like for you, for example, like you

175:03

clearly won the debate and then I'll see

175:05

people say you got owned,

175:06

>> right? You know, and they're like,

175:07

"Okay,

175:07

>> all the people who agree with me say

175:09

I dominated and all the people who agree

175:10

with the other guy say he dominated."

175:12

>> Yeah. And you'll see these pundits

175:14

and that's a weird economy, right?

175:16

There's a weird economy of commentators

175:18

on other people's exchanges.

175:20

>> Yeah.

175:20

>> And that is a weird sport. It's a

175:23

weird

175:24

>> You've made loads of careers. There's

175:25

loads of people. Joe Rogan said this on

175:27

his pod. That's the entire content.

175:29

>> They work for me. They don't even know

175:30

it.

175:31

>> They do. They make me more famous,

175:33

>> right? Yeah.

175:34

>> You're getting all the kickbacks. Also,

175:36

you you get to see what kind of a person

175:37

they are, right? And they're

175:40

silly, [ __ ] people. You go like,

175:41

"Well, the silly, [ __ ] people don't

175:43

like them." Or maybe someone who you

175:44

agree with doesn't like me. Like, oh, I

175:46

don't like him anymore. Which is fine.

175:48

But it's like that economy of commenting

175:51

on other people constantly. The problem

175:53

with that is

175:55

>> you've always put yourself in a position

175:56

of an outsider, right?

175:58

>> You know, you're a commentator, like me, right?

176:00

When it comes to combat sports, I'm a

176:02

commentator. That's all that's all I do.

176:04

I can't fight. I'm 58, right? I'm not

176:06

going in there. So, it's like I'm always

176:08

going to be in this position of only

176:10

being an observer and a commentator. I'm

176:13

not going to be a participant. Like for those

176:15

people that are commentating on these

176:16

debates, a lot of them probably fancy

176:19

themselves intellectual gladiators. They

176:21

just don't get the opportunity to do it.

176:23

>> And occasionally they do and they

176:24

usually get trounced, right?

176:25

>> Because really they're not that good,

176:27

which is why they're commenting in the

176:28

first place and why they have these

176:29

[ __ ] stupid hot takes.

176:32

Well, you know, the frustrating thing

176:33

for me with the debates nowadays is how

176:36

few people want to have an actual

176:38

discussion.

176:39

>> It was so refreshing. Last time we

176:41

came here, we had Dave Smith on our

176:43

show. I don't know if you saw that one.

176:45

>> Yeah, I did. Yeah,

176:47

>> we loved it. We loved it

176:48

>> and Dave enjoyed it and, you know, it

176:51

was weird because we obviously have lots

176:54

of different perspectives on things, but

176:56

afterwards a lot of people were like,

176:57

"Oh, can't believe you had Dave on." And

176:58

I said to all of them, listen,

177:00

Dave's only crime is that he has a

177:02

different opinion to you. Because apart

177:04

from that, he comes in, he shows up,

177:06

he's super nice, he's respectful, he's

177:09

polite, he doesn't do any dirty tricks,

177:11

right?

177:11

>> He doesn't argue about the definitions

177:13

of words for 10 minutes, right? He

177:16

just goes, "Here's my opinion. Here's

177:17

your opinion. Let's discuss."

177:19

>> Yeah.

177:19

>> And that's how conversation should

177:21

happen,

177:21

>> right?

177:22

>> But so much of the debate stuff now is

177:25

that people aren't discussing the issues.

177:27

They've just like decided you're a bad

177:28

person. And that's what they're trying

177:30

to achieve. They're trying to get a

177:31

cheap laugh from the audience that

177:33

they're playing to who's not even in the

177:34

room cuz they know their [ __ ]

177:36

followers are going to watch it online

177:37

afterwards and be like, "Oh, he owned

177:39

them."

177:39

>> But where did we get to?

177:41

>> Well, it's just a I mean, it's just some

177:43

people that are doing that, you know,

177:45

and for those people, that's all they can do.

177:47

That's why they do it that way, right?

177:48

You know, if they were really

177:50

intellectually compelling

177:52

>> and if they were like smart people, like

177:54

I don't want enemies. Like if I can

177:56

have a sane, rational, peaceful

177:59

discussion with someone where we

178:01

disagree with something, I would greatly

178:02

prefer that

178:03

>> than have someone who's insulting me and

178:05

I'm insulting them. We're trying to like

178:06

get off on each other. Like why?

178:07

>> Yeah.

178:08

>> I'm busy.

178:09

>> Yeah.

178:10

>> I have things to do. Like I don't need

178:12

that kind of [ __ ] in my life. And I

178:13

don't mind when someone disagrees with

178:15

me. I think it's healthy.

178:17

>> I also genuinely want to know why you think the

178:19

way you think.

178:20

>> Right.

178:21

>> You know what, I see it as an

178:22

opportunity, because we all have

178:24

blind spots. Yeah,

178:25

>> we all have blind spots. We all have

178:27

biases. I don't care who you are, how

178:28

smart you are, you have biases, you have

178:30

blind spots.

178:32

>> Not me, mate. I know everything.

178:35

>> But when someone goes, "Well,

178:37

actually, Francis, you say this, but

178:39

what about this? Have you thought about

178:40

this? Have you read about this?" I'm

178:41

like, "No." It's like, well, maybe you

178:43

should,

178:43

>> right?

178:44

>> And maybe you won't actually

178:46

change your opinion, but you'll certainly have

178:48

a more nuanced opinion.

178:49

>> True. Yeah.

178:50

>> But also, we're talking about [ __ ] that

178:52

actually matters.

178:53

>> Mhm. You know, and it deserves to be

178:56

taken seriously.

178:57

>> Yeah. Well, the answer is don't

178:59

engage with those certain people. Yeah.

179:01

>> You know, I'm learning that.

179:03

>> Yeah. We were having a conversation

179:04

about one particular individual. Yeah.

179:05

>> Where I'm like, why? Don't bother.

179:07

>> I can't believe you I didn't say

179:09

anything about that discussion cuz I was

179:11

just like, I don't want anyone to waste

179:12

their [ __ ] time watching it.

179:13

It was awful. I'm fascinated by that too

179:15

though because I'm fascinated by these

179:17

people that are doing that where they're

179:20

just trying to win and use tricks and be

179:22

sneaky because like they think of

179:24

discourse in a completely different way,

179:26

right?

179:26

>> They think about the whole thing in a

179:28

completely different way. They're

179:29

completely ideologically captured and

179:31

the place they're starting from

179:33

is, I want to prove this to be correct,

179:36

not, I want to know why this person

179:38

believes it to be incorrect and I want

179:40

to find out if maybe we have common

179:42

ground and maybe they know

179:44

something I don't or maybe I know

179:45

something they don't and let's find out.

179:47

>> You know what I you know what I really

179:48

want, and Konstantin and I have talked about

179:50

it a lot. I want somebody on the left to

179:52

come up and be brilliant at debating and

179:55

go to people on the right, "Well, you say

179:56

this and you say this, but actually let's

179:58

look at this, let's look at that," and be a

180:00

genuine intellectual force. And what I

180:03

despair of is I haven't seen anyone be

180:05

from the left like that in basically a

180:07

generation.

180:08

>> I think the generation that you're

180:10

talking about has been captured by

180:12

certain narratives that you have to

180:14

agree to that aren't rational.

180:16

>> So as soon as you do that and you

180:18

align yourself with this particular

180:20

ideology,

180:21

>> you're already saying, "I'm willing to

180:24

believe some [ __ ] that doesn't make any

180:26

sense at all because this is the only

180:27

way to be accepted by my tribe." That

180:30

intellectually compromises you. And that

180:32

that also I think humiliates you in a

180:34

certain way. It puts you in a position

180:35

where you're saying something that you

180:37

know can't be true. So you set up blind

180:39

spots.

180:40

>> Do you think they know it's not true?

180:41

>> I think there's got to be a part of them

180:43

that realizes there's a good argument

180:44

that it's not true.

180:46

>> You know, especially when it comes to

180:47

like transgender stuff or border

180:50

stuff. There's there's certain things

180:52

where like

180:53

>> there's no real good faith argument that

180:55

you should have an open border and allow

180:57

[ __ ] any psychopath to come across

180:58

the border and invade your community.

181:00

That seems crazy. That seems crazy. Like

181:03

if you understand anything about human

181:04

nature and the nature of the world

181:07

and the level of poverty and crime that

181:10

exists outside of the United States,

181:12

particularly in third world countries,

181:13

we're just allowing

181:14

>> I thought you were talking about Canada

181:16

there, Joe.

181:17

>> I'm all for them invading. They should

181:20

come over. They should bail on their

181:22

country until it gets better. I just ask

181:23

you because I would find it so hard to

181:25

go on stage in front of, well, what is now

181:28

hundreds of thousands of people by the

181:29

time it goes on the internet, right? And

181:31

just vigorously defend something I

181:33

didn't believe.

181:34

>> Well, that's cuz you're smart. And I

181:36

think the problem is a lot of these

181:37

people aren't really intelligent. What

181:40

they are is a person who has a good

181:42

vocabulary, who's acquired a certain

181:45

amount of technique and skill involved

181:47

in talking really fast and spouting

181:51

things that they've seen online, a

181:54

bunch of narratives. Like one of the

181:55

things that people love to do is if

181:58

you're talking to anyone that's on the

181:59

right, they want to say, you know,

182:00

you support a 34-time

182:04

convicted felon. And you know, there's a

182:07

lot of things that they like to

182:08

say. There's techniques involved instead

182:10

of, like, discussing. Anybody that looked

182:12

at the actual Trump case, if you're

182:15

rational and you're on the left, you say

182:16

that's a crazy case.

182:18

>> There's no way that should be a felony.

182:19

It's not a felony. There are 34 different

182:21

misdemeanors and it's also it's past the

182:24

statute of limitations. This is the

182:25

craziest, most egregious misuse of justice.

182:28

And the scary thing is if someone on the

182:30

right gets in, they decide to do that to

182:31

someone on the left, like you got to put

182:32

your foot down, stop that from

182:34

happening. The Russia Russia Russia

182:35

stuff, like all that stuff, the

182:38

Russia gate stuff. Like that's kind of

182:39

crazy that someone on the left doesn't

182:42

call that out and say, "Hey guys, this

182:43

is [ __ ] dangerous because if you're

182:45

lying and you're having intelligence

182:47

agencies lie and you're having people

182:49

lie on television and you're just

182:51

accepting that." Why? Because it's your

182:53

side. They're supporting your

182:55

side. That's crazy.

182:56

>> That's why, and I find that very

182:58

strange because what they do is they

183:00

pivot to that.

183:01

>> Mhm. Which is not relevant to the

183:03

conversation we're having. Exactly.

183:05

We're talking about, you know, is

183:06

it right to do these strikes on Iran or

183:08

is it this or is it that, you know,

183:10

what's the situation in the Middle East

183:11

or whatever.

183:12

>> How does bringing up Trump's

183:15

convictions or otherwise change that? It

183:17

doesn't affect that conversation or the

183:19

border or the trans thing or any of

183:21

the other things. And that's the thing

183:23

is like, can we just argue the [ __ ]

183:25

point?

183:26

>> Well, I think at a certain point in

183:28

time, you're going to have to choose

183:30

real opponents. It's like a Jake Paul

183:32

thing.

183:33

>> But see, I want the real

183:35

opponents, but where are they? Where are

183:36

they? And we've had people on the show

183:38

where it's like we had um this

183:40

woman from the Guilty Feminist podcast

183:42

and she came on and we gave her 40 minutes.

183:44

She basically laid out her whole vision

183:47

>> and it was respectful and polite and it

183:49

was it was a great conversation

183:50

actually. The moment I said, "Well, you

183:52

know, you've been speaking for all this

183:54

time. Here's some of the things that I

183:55

see that don't make sense in my head.

183:56

Can you help me out?" Immediately goes

183:58

personal.

183:59

>> Mhm.

184:00

>> Immediately.

184:00

>> Yeah. And that's what Francis is saying.

184:02

Like I'd love to see people who have an

184:06

ability to argue the point.

184:08

>> Yeah.

184:09

>> And that's what Dave does. He

184:11

argues the point and that's either

184:13

persuasive to you or it isn't.

184:14

>> Well, I think the problem is their point

184:17

is not very good.

184:18

>> Yeah. I think that is the problem.

184:20

>> Yeah. So, you have to go personal. You

184:22

have to attack people. You have to use

184:23

ad hominems. It's the only way you can

184:25

get anything off. And then you could try

184:26

to get that person emotional and trap

184:28

them. So why don't they change their

184:30

opinion then?

184:31

>> It's a good question because if you're

184:32

ideologically captured, especially if

184:34

you're on the left, like it's a very

184:36

clear ideology and there's like real

184:39

blowback for deviating from it.

184:41

>> Yeah.

184:41

>> As we know, right? Because the moment

184:43

you say, well, you're no longer on

184:46

the left anymore,

184:47

>> right? Well, a lot of people have been

184:48

kicked out of it.

184:49

>> A lot of people have been pushed into

184:51

some weird quasi

184:53

>> Yeah.

184:53

>> homeless land.

184:55

>> Well, that's how we all ended up as like

184:56

rightwing. I was like, "[ __ ] off with

184:58

this [ __ ]"

184:58

>> Yeah. [ __ ] off with this [ __ ]

185:00

>> You know, 20 years ago, all

185:02

the stuff that we talk about, it

185:04

wasn't just like not rightwing. No

185:07

one questioned it.

185:09

>> Do you remember 25 years

185:11

ago people running around going, "We

185:12

need an open border."

185:15

>> Right.

185:15

>> Right. Or like you can change your sex

185:17

by just going like abracadabra.

185:19

>> Right.

185:19

>> No one said that. And so we weren't

185:21

rightwing for like going that's crazy.

185:24

>> I know. You know, the most bizarre thing

185:26

is watching all these kind of left-wing

185:28

lesbian feminists be described as

185:31

rightwing and get kicked out.

185:32

>> JK Rowling

185:34

Never in my life.

185:36

>> Yeah, there's a

185:38

journalist called Julie Bindel who used to

185:39

write for the Guardian, one of the most

185:40

left-wing journalists, and she's a lesbian,

185:42

and she criticized

185:45

the trans movement. That was it. Out the

185:48

door. Doesn't matter what you've done

185:49

before. It doesn't matter. You've done

185:51

all this incredible work with women in

185:53

female prisons. Well, it's because it's

185:55

a cult. I mean, it's essentially like a

185:57

religious ideology. Like they they will

185:59

not take any heretics. Like anybody that

186:04

deviates from whatever their doctrine

186:05

is, like you're out. You're out forever.

186:08

And that scares people. So that's

186:10

one of the reasons why they're willing

186:11

to comply and follow some of this goofy

186:14

[ __ ] and say no one's illegal on stolen

186:16

land.

186:17

>> Yeah.

186:18

>> What's interesting is it doesn't happen

186:19

on the right nearly the same way. Like

186:21

you can see it now. The right is engaged

186:22

in a fierce debate internally.

186:25

>> Mhm.

186:26

>> And people [ __ ] argue and they hash

186:29

it out and then they go have a

186:31

beer afterwards.

186:32

>> I think they're doing it just the same

186:33

way. I think it's a

186:35

human thing. Yeah. I think there's

186:37

people on the right that do it just the

186:38

same way. There's people that call

186:40

people out for not being MAGA enough,

186:42

you know? Yeah.

186:42

>> It's just like right now the whole

186:44

thing's in turmoil whereas there's not

186:46

really the same kind of turmoil on the

186:48

left where there's internal debate. The

186:50

turmoil on the left is the left

186:52

versus the right.

186:53

>> Yeah.

186:53

>> The turmoil on the right right now, I

186:56

think there's a lot of people right

186:57

versus right and they're trying to find

186:59

out like and I think there's a lot of

187:01

people that they don't believe what

187:02

they're saying either. They're just

187:03

trying to find a thing that aligns with

187:05

the biggest audience.

187:07

>> I think that's definitely happening. I

187:08

also think though internal debate within

187:11

a big broad church movement is a good

187:13

thing because what you're arguing about

187:15

is like what is the right direction?

187:17

Yep. You know, and I do think that is

187:19

more healthy.

187:20

>> I think working out what it is that

187:22

like, whatever, if you're on the

187:25

right, we believe

187:27

>> I'm not on the right, but as I

187:29

see that, I do think that's a healthy

187:31

thing to do because you're arguing about

187:33

the direction of that movement.

187:34

>> Yeah.

187:34

>> And I think that's much healthier than

187:36

what happens on the left where it's just

187:38

like, well, if you don't agree with this

187:39

wacky idea that's far far out there,

187:42

then you're no longer part of this. But

187:43

I think the good thing about these

187:45

debates is it exposes that and anybody

187:48

who's objective, especially anybody that

187:50

is, you know, a swing voter or anybody

187:54

who's in the middle of all this, which

187:55

is a lot of people. A lot of people,

187:56

>> most people, I think, right?

187:58

>> Yeah. Most people are kind of in the

187:59

middle on most political issues.

188:01

They get to see how crazy some of this

188:03

[ __ ] is and it makes them less likely to

188:06

follow,

188:07

>> you know. But also, from a

188:10

neutral perspective, I mentioned

188:12

the point about wanting a strong left. I

188:15

want a strong left which has got good

188:17

ideas about how to tackle things which

188:19

are really important like inequality

188:21

>> like the cost of living. How do we make

188:23

it that people can actually have a

188:26

better standard of life where if a woman

188:28

wants to stay at home with her kids, she

188:30

can do that, where they're not forced to

188:33

go out and work and put

188:34

the kids in daycare which then leads to

188:36

a whole host of problems. How can we

188:38

have a better world for ordinary people

188:41

which is what the left always used to

188:43

be.

188:43

>> We need a strong left to then

188:46

challenge the right so that the center

188:47

becomes a more fertile ground. And if we

188:50

don't have that, if we have these crazy

188:52

loons on the left, then what we have is

188:54

a right which will come to dominate,

188:56

which I don't think is good for society

188:58

as a whole. I'm going to be honest with

189:00

you.

189:00

>> No, it's never good if one party,

189:04

left or right, is completely dominant.

189:06

It's not good.

189:07

>> Yeah. You need checks. Yeah. Right. And

189:10

you know, the right has its share of

189:11

crazies, too, as we've been talking.

189:13

>> 100%. Praise Jesus.

189:15

>> All right, we got to wrap this up,

189:16

gentlemen. I love you guys.

189:18

>> Very quickly,

189:18

>> oh, your book. Yeah, I've got a book.

189:20

It's out.

189:20

>> It's called Uneducated: My Life as a

189:22

Teacher and Why You Should Never Become

189:24

One, and "Never" is in bold.

189:27

>> An inspiring story.

189:28

>> Francis Foster. All right. I love you

189:30

guys. It's always great to see you. Bye,

189:32

everybody.

Interactive Summary

The podcast discusses pervasive global instability, from geopolitical conflicts in Ukraine and Gaza to domestic chaos in the UK. The speakers lament the "hot-take culture" and rampant misinformation, leading to a breakdown of truth and fueling conspiracy theories. They analyze complex US foreign policy challenges like potential regime change in Iran, drawing parallels with past interventions in Iraq and Venezuela's "regime adjustment," which subsequently impacted Cuba. A core theme is the danger of ideological extremism, contrasting Islamism with Christian nationalism and their respective apocalyptic visions. The conversation extensively explores how AI and the monetization of social media are eroding trust and exacerbating information warfare, highlighting fears about autonomous AI prioritizing its own survival over humanity. They also touch on social polarization, the challenges of fostering rational debate, and the uniquely definitive outcomes of combat sports versus political discourse.
