Financial Crash Expert: In 3 months We’ll Enter A Famine! If Iran Doesn’t Surrender It's The End!

Transcript

2874 segments

0:00

So there are five scenarios in which the

0:02

war could end because Trump is stupid

0:03

enough to take on what Israel wanted to

0:06

do, which was destroy Iran, but they've

0:08

bitten off far more than they can chew.

0:10

So scenario one is Iran destroys the

0:12

Gulf power infrastructure. I think

0:13

that's highly likely. And if that

0:15

happens, then Saudi Arabia, Qatar,

0:18

Dubai, they'll become uninhabitable. And

0:20

then scenario two, Iran disables

0:22

Israel's nukes. I hope that happens, but

0:24

there's this one. And it scares the

0:27

out of me. Professor Steve, I have so

0:29

many questions. What is going on?

0:30

>> So, this war is threatening everybody on

0:32

the planet. And what Trump is doing at

0:33

the moment is a pump and dump scheme.

0:35

He's trying to drive up the oil price

0:36

and exploiting it for his friends and

0:38

for his own wealth in the process. So,

0:40

people are focusing upon the price of

0:42

this. But the really important point is

0:44

this: the Strait of Hormuz. So, oil,

0:46

fertilizer, helium all have to pass

0:48

through the Strait of Hormuz

0:49

>> and Iran have blocked that gap.

0:51

>> So, they can say you do or do not pass

0:53

depending on your country's attitude

0:55

towards our country. And that's quite

0:57

terrifying because 20 to 30% of our

0:59

fertilizer comes through this point.

1:00

But if this is not available, the globe

1:02

has a famine.

1:03

>> Do you think he will send ground troops

1:04

in?

1:04

>> Yes, I do. But I'd hate to be one of

1:06

those troops because it's a suicide

1:08

mission. They've got underground

1:09

military units of weapons and troops,

1:11

but we have no idea of the scale.

1:13

>> Trump keeps saying that the war has been

1:14

won.

1:15

>> Yeah.

1:15

>> What's going on there in your view?

1:16

>> I think he's been fed propaganda to tell

1:19

him that he's winning the war by his

1:21

immediate advisers because you cannot

1:23

tell a person like that that they've

1:24

made a mistake. We'll talk about that as

1:26

well. But you developed a bit of a

1:27

reputation because you're very good at

1:28

predicting things. So, which of these

1:30

five outcomes do you think is most

1:32

probable to happen?

1:34

>> Oh god,

1:37

>> this is super interesting to me. My team

1:39

has given me this report to show how many

1:40

of you that watch this show subscribe

1:41

and some of you have told us according

1:43

to this that you've been unsubscribed from

1:45

the channel randomly. So, favor to ask

1:47

all of you, please could you check right

1:49

now if you've hit the subscribe button

1:50

if you are a regular viewer of the show

1:51

and you like what we do here. We're

1:53

approaching quite a significant landmark

1:54

on this show in terms of a subscriber

1:56

number. So, if there was one simple free

1:59

thing that you could do to help us, my

2:00

team, everyone here to keep this show

2:02

free, to keep it improving year over

2:04

year and week over week, it is just to

2:06

hit that subscribe button and to double

2:08

check if you've hit it. Only thing I'll

2:09

ever ask of you, do we have a deal? If

2:11

you do it, I'll tell you what I'll do.

2:13

I'll make sure every single week, every

2:15

single month, we fight harder and harder

2:16

and harder and harder to bring you the

2:18

guests and conversations that you want

2:19

to hear. I've stayed true to that

2:20

promise since the very beginning of the

2:21

Diary of a CEO, and I will not let you down.

2:25

Please help us. Really appreciate it.

2:26

Let's get on with the show.

2:35

>> Professor Steve, who are you? If

2:38

you had to sort of distill it down to

2:39

three areas of specialism, what would

2:41

those be?
>> History of economic thought,

2:44

financial instability, so what

2:46

causes volatility in the economy

2:49

and the dynamics of money. And ironically

2:53

it makes me a minority in economics

2:56

because most economists ignore money

2:57

completely.

2:58

>> It's a strange thing to you.

3:00

>> It's ridiculous but it's true.

3:01

>> We'll talk about that as well today. I

3:02

really want to focus on what's going on

3:04

in the world right now because there's

3:05

so many questions. It's all quite

3:08

confusing

3:08

>> extremely

3:09

>> and understanding the layers of

3:10

motivation that you know Trump has, Iran

3:13

has, Israel has, is a difficult

3:16

jigsaw puzzle to put together.

3:19

>> I guess the question that I keep

3:21

asking myself is like what is going on?

3:24

>> You can't get away from the fact that

3:25

we've basically elected a mafia don as

3:28

president of the United States. You've

3:30

got a guy who

3:32

admires the mafia

3:34

who's running the country. So what we're

3:37

getting in some ways is a shakedown

3:38

rather than anything driven by any sense

3:41

of political necessity. Okay. So that's

3:44

that's a crazy element to begin with.

3:47

And the American deep state as it's

3:50

called has been anti-Iran for 40 or 50

3:53

years. Israel has wanted to defeat Iran

3:56

for that length of time. Trump is stupid

3:58

enough but also cunning enough. It's a

4:00

combination of the two to take on what

4:04

Israel wanted to do, which was destroy

4:06

Iran. They're now trying to do it and

4:08

they're finding that they've

4:10

bitten off far more than they can chew.

4:12

>> Trump is someone who cares a lot about

4:13

people's opinions of him and he must

4:17

have known that this would be

4:18

politically unpopular to target Iran

4:20

at this moment in time.

4:22

>> I don't think so. I had a relationship

4:24

with somebody with narcissistic

4:25

personality disorder. So that's

4:27

something over and above what I learned

4:29

academically. When I think about

4:31

his behavior: somebody like that,

4:33

they want to be the center of attention

4:35

at all times. They can't stand it when

4:38

somebody else is being spoken about.

4:39

It's ridiculous, but it's it's a

4:41

pathology. So he's interested in

4:43

people's opinions so long as they're

4:45

positive and they're about him.

4:47

>> So are you saying that he attacked

4:49

Iran and started this war in part

4:50

because he wanted attention?

4:52

>> That's always something with somebody

4:54

who's got that disorder. Yeah.

4:56

>> I mean, what do you think about his

4:58

rationale? He's saying that he attacked

4:59

Iran because they had nuclear weapons

5:01

and there was an imminent

5:02

threat.

5:03

>> We still don't know whether Iran has

5:04

nuclear weapons. Okay. We know that

5:07

Israel has. If you're going to attack a

5:09

country with nukes, you should attack

5:10

Israel, not Iran.

5:12

>> But you can't attack Israel, can

5:13

you? Cuz

5:14

>> I cannot make sense of what politicians

5:16

all over the planet are doing these

5:17

days. There's a huge gap between what

5:20

politicians are saying about global

5:22

politics and what people in the street

5:24

are saying about it. So people on the

5:26

street have seen the Gaza genocide.

5:28

They've seen all the conflicts Israel

5:30

has started there. And I think the

5:32

general sentiment in most countries in

5:34

the world today is anti-Israel because

5:37

of the way it's treating the

5:38

Palestinians. And that's what people are

5:40

thinking about. The top echelons, like in

5:42

this country, as you know, if I say

5:44

free Palestine I say that outside on the

5:47

street, I can be arrested. It's crazy.

5:52

There's a huge divorce between what

5:54

people are thinking and what the

5:56

politicians are saying and I can't give

5:59

any explanation for that divorce apart

6:02

from believing that Israel has something

6:05

over our political leaders

6:06

>> what do you mean you think they have

6:08

something over our political leaders

6:09

>> I think there

6:11

We know about the whole Epstein affair. The

6:12

way that the Iranians refer to what's

6:14

happening is they say they're fighting

6:15

the Epstein class, and there's a belief

6:17

that Epstein

6:20

was working with the Israeli

6:22

intelligence service and holds

6:25

blackmail-worthy material on a huge range of

6:27

politicians. And that's the only way

6:30

that I can explain the the sort of

6:31

things that politicians are supporting

6:33

when their populace is angry about those

6:36

same policies. So you get demonstrations

6:38

here, you know, free Palestine

6:40

demonstrations, 80-year-old female

6:42

vicars being arrested for saying this

6:45

sort of stuff. You go back 40 years ago,

6:47

there was a belief in the public and

6:50

a belief amongst politicians that Israel

6:53

had a right to exist and it was all

6:54

pro-Israel. And now after 40 years, the

6:58

type of abuses that have happened in

7:00

Palestine have hit individual ordinary

7:04

people's attitudes to Israel. So

7:06

ordinary people are saying Israel's the

7:08

aggressor. Israel's making the mistakes.

7:10

But the politicians are all saying it's

7:12

antisemitic to

7:15

criticize Israel.

7:17

>> If you had to give a one-sentence

7:19

answer as to why this war started

7:23

because we sort of hypothesized a few

7:25

things there. What would that one

7:26

sentence answer be?

7:31

>> Again, this is trying to make sense of

7:32

the senseless. I just think Israel

7:35

wanted to destroy Iran. They thought

7:37

they could do it and they thought they

7:38

had an American president who would help

7:40

them do it, and they drastically

7:42

underestimated how prepared Iran was for

7:44

that conflict.

7:46

>> Why would Israel want to destroy Iran?

7:48

What's the context there?

7:50

>> This goes back to religious elements.

7:52

The Zionist state had the right to that

7:54

whole region and there's an expansionist

7:57

element to Israel's behavior for the

7:59

last 40 years. And the major rival they

8:02

saw themselves as having in that sense

8:04

was Iran. They could invade Jordan. They

8:06

could attack Lebanon. Uh they could do

8:10

all these things. Of course, the '67

8:13

war. Uh they wiped out the

8:16

invading Arab armies in six days.

8:18

They have this past history of being

8:21

militarily dominant in the area and they

8:24

they knew that Iran was too big for them

8:26

to take on on their own. They thought

8:28

they could get the Americans in there

8:29

and I think they drastically

8:30

underestimated how prepared Iran was for

8:33

this situation.

8:34

>> When you say Iran were prepared for this

8:36

situation and it somewhat surprised

8:38

Israel and the US. What is that

8:40

preparedness you're speaking about?

8:41

>> Well, for a start, it's the fact that

8:43

Iran witnessed that there were um

8:46

decapitation attacks on other countries

8:49

in the region going way way back not

8:51

just the last 10 years but the last 40

8:54

or 50 years. Decapitation?

8:56

>> You take off the leader, you kill the

8:57

leaders and then with the leaders killed

8:59

the army's in disarray and you can come

9:01

in and invade and take over. So getting

9:03

rid of Saddam Hussein that sort of thing

9:05

you know wipe out Saddam Hussein's power

9:07

base and then the whole system

9:09

collapses. That was the Iraq story. But

9:11

the Iranians observed that and they have

9:15

broken their military into 31 divisions.

9:17

There are 31 provinces like 31 states in

9:20

that sense inside Iran. Their military

9:23

has broken into those 31 units. They've

9:25

got their own fail safe system running

9:27

in the background. They've got their own

9:29

resources, their own missiles,

9:31

production systems, all that sort of

9:33

stuff. So you've got to take out the

9:35

whole 31, and even then they'd have that

9:38

sub-area. So the only way you can beat the

9:39

country is by literally bombing it to

9:42

back to the stone age

9:43

>> which appears to be what they've been

9:45

trying to do

9:46

>> trying to do. But the thing is it's a

9:47

huge country. I mean look at you know

9:48

the scale of Iran. Maps always

9:50

distort how large it is. So that is larger.

9:52

That's more than half the size of

9:53

Western Europe. It's got a population of

9:56

90 million, about 1/3 or 1/4 the

9:58

population of Europe far more than Iraq.

10:01

>> I mean it looks like it's double the

10:02

size of the UK or more

10:04

>> or more than double. I mean you know one

10:06

thing about the MA projection.

10:07

>> No. What's that?

10:08

>> Okay. It's that it makes the

10:10

northern hemisphere twice as large as

10:12

the southern, and Iran's in the northern

10:14

hemisphere but not as far north as

10:16

England. So the distortion gets

10:17

amplified the further north you go. So

10:19

it's bigger than France and Germany and

10:21

Italy and Spain

10:23

>> and possibly Poland in terms of area.
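The Mercator distortion being discussed here can be made concrete. On a Mercator map, both the east-west and north-south scale are stretched by a factor of sec(latitude), so apparent area grows as sec²(latitude). A minimal sketch; the central latitudes below are rough assumptions for illustration, not figures from the conversation:

```python
import math

def mercator_area_inflation(lat_deg):
    """Apparent-area inflation on a Mercator map.

    East-west and north-south scale are each stretched by
    sec(latitude), so area is inflated by sec(latitude) squared.
    """
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Rough central latitudes (assumed): Iran ~32 N, England ~52 N.
print(round(mercator_area_inflation(32), 2))  # 1.39: Iran drawn ~39% too large
print(round(mercator_area_inflation(52), 2))  # 2.64: England inflated far more
```

This matches the point being made: the distortion grows the further north you go, so Iran looks smaller relative to Europe on the map than it really is.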

10:26

And then if you even see just looking on

10:27

the map itself you can see the

10:28

corrugations there versus what you can

10:31

see corrugation

10:33

>> the what representing mountains. Okay.

10:35

>> Okay. There's more mountains inside

10:36

there. It's a horrendous

10:38

terrain to fight a war on. I think what

10:40

Trump is doing at the moment is a pump

10:42

and dump scheme. He's trying to drive up

10:44

the oil price, tell friends beforehand

10:47

that he's about to make his announcement

10:49

which will cause the price to fall and

10:51

he's just oscillating this way up and

10:53

down and exploiting it for his friends

10:56

and for his own wealth in the process.

10:57

>> Do you actually think that's the case?

10:59

Because, trying to make sense of this

11:02

stuff, this must be hurting his friends

11:03

economically because, you know,

11:05

the stock market's going to take a dip

11:06

if he's not careful and his friends are

11:08

all shareholders in different big

11:09

companies. So, you know, also if you

11:12

know, one of Keynes's great lines

11:14

was that there's no point in buying a

11:17

stock which you think is going to

11:18

increase in value over time if you think

11:20

it's going to slump in the immediate

11:22

future. So, he's making an announcement

11:24

which causes oil markets to panic. So,

11:26

the price goes up. We've given him

11:28

control of the most powerful country on

11:30

the planet. He knows if you make an

11:32

announcement, it moves markets. He has

11:35

no compunction whatsoever in exploiting

11:37

that to cause rises and falls in prices

11:40

and try to exploit them himself and with

11:42

his friends.

11:43

>> I did. I mean, I did see that. I've got

11:44

the data here on the floor showing

11:46

those graphs. I generally looked at

11:47

that and thought, yeah, you know, maybe,

11:50

but it's also conceivable that Trump is

11:52

quite a predictable character and he

11:54

tweets at the same time every day. And

11:55

it's also I think me and you both know

11:57

that before the markets open on a Monday

11:59

morning, he's going to want to say

12:01

something really positive.

12:02

>> He has a track record of doing that. So

12:04

is it conceivable that they knew he was

12:06

flying because it was tracked that he

12:08

was going to be on this plane journey.

12:09

There's going to be a press gaggle. We

12:11

know he's going to give an interview. I

12:12

actually think that was quite

12:13

predictable. If I was a betting man, I

12:14

would have gone Sunday night or Monday

12:16

morning. I would have put a bet on oil

12:18

prices coming down, the stock market

12:20

going up.

12:20

>> Yeah. And like for example that one of

12:22

the things he said most recently he

12:24

talked about getting a present from Iran

12:26

>> and then he finally let slip what the

12:27

present was and was letting eight ships

12:29

through the Strait of Hormuz.

12:31

>> Oh

12:32

those eight ships were not American.

12:35

They were other allies. I think what

12:37

he's thinking is, that's going

12:39

to mean the oil market gets calmed down.

12:42

That means the price is going to fall.

12:44

Uh so I can then do another pump and

12:46

dump.

12:46

>> Let's explain the Strait of Hormuz.

12:49

>> Oh god. Yeah. If we had to explain it

12:51

for 16-year-olds because there's been

12:53

lots of coverage on it and I think some

12:54

people have kind of skipped past the

12:56

importance of the region. What is the

12:58

Strait of Hormuz and why does it

12:59

matter?

13:00

>> Well, it's the choke point in the

13:02

Persian Gulf to get through. You've got

13:04

like 21 km. Okay, that's an incredibly

13:06

narrow gap for ships to pass through and

13:09

that means that all the countries that

13:11

pump not just oil but fertilizer, uh,

13:14

helium, all these critical elements for

13:16

the production system all have to pass

13:18

through this point. And obviously that's

13:20

well within reach of any weapons from

13:22

Iran. So they can say you do or do not

13:24

pass depending on whether we approve or

13:26

don't approve of your

13:28

country's attitude towards our country.

13:30

>> You said fertilizer.

13:32

>> Yeah.

13:32

>> Oil and helium.

13:34

>> Yeah. Helium.

13:35

>> Where are they coming from?

13:36

>> They're coming, I think,

13:38

mainly from the Saudi Arabian side.

13:40

Saudi Arabia and like Iran will have the

13:42

same things, but Iran would keep

13:44

those for themselves but Saudi Arabia is

13:47

the main source of gases and oils which

13:50

are refined and as byproducts we get

13:52

sulfur dioxide and we get helium. This

13:55

is the helium.

13:56

>> Yeah.

13:56

>> Okay. That's you know a couple of kilos

13:59

of helium. But helium is an element

14:02

for which there's no substitute.

14:04

>> So helium is inert.

14:07

>> What does that mean? It means it doesn't

14:08

interact with other chemicals. You want

14:10

to give it a try?

14:11

>> I've never done it.

14:12

>> I don't think there's any in here.

14:13

>> Oh, what a pity. Okay. What I would have

14:14

done to give it a try. You got any real

14:16

helium?

14:18

>> Oh, bloody hell. Helium balloon.

14:20

>> Okay.

14:24

>> Does it change your voice?

14:26

>> I don't know. Has my voice changed?

14:27

>> It did. I'll give my voice a try. Okay.

14:31

>> Um Okay. I've never done this before,

14:33

but I've heard it at parties.

14:37

>> Okay. And now I think my voice has

14:39

changed somewhat from

14:42

the fact you can do it. Oh my god.

14:48

That is a riot. Okay.

14:49

>> Where is helium coming from?

14:51

>> It's coming from a gas field. So about

14:53

30% of the world's helium comes from a

14:55

gas field which spans both Saudi Arabia

14:58

and Iran. If you don't trap helium

15:01

physically somehow it goes to outer

15:03

space. That's the ultimate destination

15:05

of the stuff. So it's trapped in the

15:07

same things that trap oil. And then when

15:09

you drill for oil, you also get helium

15:11

coming out. And then helium is

15:13

absolutely critical for the

15:15

semiconductor industry. It didn't matter

15:17

100 years ago.

15:18

>> And semiconductors are important for

15:19

what?

15:20

>> Everything. I mean you you know your you

15:22

take the semiconductors out of that,

15:23

you've got a brick. Okay. The

15:26

processors, the CPUs, the memory chips,

15:29

they're all made. Helium is an essential

15:31

element to make them

15:32

>> for our iPhones, our tablets,

15:34

>> everything. Everything electronic. If

15:35

you need semiconductors, you need

15:37

helium. So if you cut off 30% of the

15:39

world's helium supply, you cut off the

15:41

capacity to produce 30% of the world's

15:44

semiconductors.

15:45

>> And Iran have blocked that gap.

15:47

>> And that means that we've suddenly lost

15:49

30% of the world's helium.

15:50

>> I've got a quote from March 2026 from

15:52

leading helium expert Phil Kornbluth.

15:55

>> He said, "We're looking at a minimum 2

15:57

to 3 months shutdown of helium

15:59

production with up to 6 months before

16:01

supply gets back to normal." and he

16:04

explained you can't stockpile helium

16:06

because it leaks through containers.

16:07

>> Yeah.

16:08

>> so once supply is cut off semiconductor

16:10

production will stop entirely. South

16:13

Korea gets 65% of its helium from Qatar

16:16

in that region and makes 2/3 of the

16:19

world's memory chips. Their government

16:20

has launched an emergency investigation

16:22

into the shortage. Nobody's talking

16:24

about this.

16:24

>> I know. And this this is one reason it's

16:26

quite terrifying about the scale of what

16:28

we're going through cuz people just

16:30

thinking it's going to be oil's going to

16:31

be more expensive. That's the sort of

16:33

mindset we have. But in fact, critical

16:36

elements of the production system are

16:38

being terminated by this conflict. You

16:40

can't produce chips anymore. And you

16:42

can't. Well, hang on. You can't

16:44

produce these chips either because the

16:46

fertilizer is disappearing.

16:47

>> So, you're holding a potato in your

16:49

hand.

16:49

>> Yeah.

16:50

>> How are potato chips going to be

16:52

impacted by the war?

16:53

>> Because the fertilizer. If we don't have

16:55

the fertilizer, we can't grow the

16:56

potatoes. And it's not just potatoes.

16:58

It's a whole range of crops. We eat

17:00

food, okay? We eat this green stuff. It

17:02

actually starts as brown stuff because

17:05

the fertilizer is an essential part of

17:07

growing all the food we eat. And the

17:09

fertilizer is produced by a process

17:11

called the Haber-Bosch process, which

17:13

takes petroleum and nitrogen and fixes

17:17

them in such a way that you can put this

17:18

on the on the field and your plants will

17:20

grow courtesy of the fertilizer. If we

17:23

didn't have fertilizer at all, guess how

17:25

many billion people the planet could

17:27

actually support?

17:28

>> I don't know.

17:29

>> Between one and two. And fertilizer

17:31

comes from this region.

17:32

>> Again, 20 to 30% of our fertilizer comes

17:34

from that region.

17:35

>> Through the Strait of

17:36

>> through the Strait of Hormuz.

17:37

>> Where is it coming from?

17:38

>> It's coming again from the same gas

17:41

field that's producing the helium

17:43

produces fertilizer as a side effect.

17:46

And you need you need I'm not a chemist,

17:48

okay? So I can get these things wrong,

17:50

but you need sulfur. You need sulfuric

17:52

acid as well as part of these production

17:53

processes. 20% of the world's

17:55

fertilizer, helium, sulfuric acid, all

17:59

pass through that strait. And if you

18:01

take them away, then you can't make

18:03

microchips, which is what Korea is

18:05

suffering from. You can't make

18:07

fertilizer, which which everybody will

18:10

suffer from. If we lost 20% of the

18:12

world's fertilizer, we'd lose roughly

18:14

20% of the world's food. And it would cause a

18:16

global famine. We've never had this

18:17

experience before. We've had localized

18:20

famines. You know, countries like India

18:21

have had famines, as have parts of Africa and

18:23

so on. But if this is not available, the

18:25

globe has a famine.
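The arithmetic behind that famine claim can be sketched in a few lines, using only the rough figures quoted in this conversation (1-2 billion people supportable without synthetic fertilizer, ~20% of fertilizer transiting the strait). These are illustrative round numbers, not precise data:

```python
# Illustrative arithmetic only, using figures quoted in the conversation.
WORLD_POPULATION_BN = 8.1            # approximate world population, billions
SUPPORTABLE_WITHOUT_FERTILIZER = 2.0 # upper bound quoted: 1-2 billion

fertilizer_share_lost = 0.20       # share transiting the Strait of Hormuz
food_loss = fertilizer_share_lost  # assumed roughly 1:1, as claimed above

# People fed only because synthetic fertilizer exists, and the share
# of them whose food supply a 20% fertilizer loss would put at risk.
fed_by_fertilizer_bn = WORLD_POPULATION_BN - SUPPORTABLE_WITHOUT_FERTILIZER
at_risk_bn = fed_by_fertilizer_bn * food_loss
print(round(at_risk_bn, 2))  # 1.22 (billions of people, on these rough numbers)
```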

18:27

>> And what's the last uh tanker you've got

18:28

down there? There's one more on the

18:29

floor.

18:30

>> Oh my god. Okay.

18:32

Hey, that's pretty good. It was

18:33

accidental, but that's petroleum.

18:35

Okay. A petroleum tank. Obviously empty.

18:37

20 L.

18:38

>> So that's oil.

18:39

>> That's oil.

18:40

>> Oil. Okay. And so if that that's what

18:42

we're losing right now and people are

18:44

focusing upon the price of this but the

18:46

really important point and I can bring

18:47

up one of my own charts here is the role

18:51

of energy in production because if we

18:54

don't have energy we can't produce goods

18:56

and services and the link is incredibly

18:58

tight. This is looking at change in

19:01

energy and change in gross world product

19:04

over the last 40 years. I'll throw this

19:06

graph on the screen for people that are

19:07

watching.

19:07

>> Okay. So, what what you've got here is

19:09

the annual percentage change in gross

19:11

world product and the annual percentage

19:13

change in gross energy consumption. And

19:15

they're virtually in lockstep and they're

19:17

the same magnitude.

19:18

>> So, when energy goes up, GDP goes up.

19:20

>> And when energy goes down, GDP goes

19:22

down. Now, we're losing 20% of the

19:25

world's liquefied natural gas, a

19:28

substantial proportion of its oil as

19:29

well. We could see a 5 or 10% fall in

19:32

energy. We will certainly see a 5 or 10%

19:35

fall in gross world

19:37

product.

19:37

>> So explain that to me. So where is the

19:39

oil in this region?

19:40

>> It's everywhere.

19:41

>> Okay.

19:42

>> I mean this this is one of the accidents

19:43

of history that the oil is a large part

19:46

is concentrated here and a large part

19:48

over here and a bit in Russia.

19:49

>> So over here for people that can't see

19:51

you're pointing at Iran, Saudi Arabia,

19:54

>> Saudi Arabia,

19:55

>> Iraq and then there's a lot in the

19:57

United States and there's a lot

19:58

>> you've got some in Russia as well. There

20:00

was a small amount like the North Sea

20:02

had a substantial amount of oil as well

20:04

at one stage.

20:05

>> And the type of oil in this region I

20:07

hear is quite important.

20:08

>> It's very I mean oil there's no such

20:11

thing as a homogeneous product.

20:12

>> What does homogeneous mean?

20:14

>> It means everything is the same everywhere.

20:15

You can if you don't get it here you can

20:17

substitute for something over here.

20:19

That's a myth that economists actually

20:21

unfortunately believe. They basically

20:23

persuaded people to think that

20:24

everything is homogeneous. In fact oil

20:26

from Venezuela is almost like tar. Oil

20:29

from here flows like water

20:31

comparatively. You need different

20:33

processing systems to to extract that

20:35

oil than you need over here. Uh if we

20:38

lose this, we can't replace it with

20:39

something from over here. So once that

20:42

goes then the production system of the

20:44

planet is damaged. Uh this has been the

20:47

shocking thing for me as a citizen has

20:51

been the fact that a war with one

20:53

country could decapitate what 20 to 30%

20:57

>> of global production

20:58

>> global production of oil.

21:00

>> Yeah. And food.

21:01

>> That's a vulnerability if I've ever

21:02

heard one.

21:03

>> I know. And this is like one reason I'm

21:05

a critic of mainstream economics is they

21:07

trivialize all this stuff. They don't

21:09

teach their students how critical this

21:11

is. So most people are like you, even

21:13

people who've done a PhD in economics,

21:15

even worse in that sense than other

21:17

people, they don't understand how

21:19

critical and how fragile our production

21:21

systems are. So people can talk about a

21:23

war in Iran and think, "Oh, that's a war

21:24

in Iran and that's going to cut off our

21:26

oil supply." No, it's going to cut off

21:28

your food supply.

21:29

>> And for the average person listening

21:30

now, what will they start to experience

21:32

if this war doesn't immediately end?

21:36

>> In 2 or 3 months India is going to run out of

21:38

fertilizer and so there'll be a famine

21:41

in India. Food production on the planet

21:43

could fall 10 to 25%. And therefore

21:47

there simply won't be enough food for

21:49

everyone on the planet and then it's a

21:51

question of who's going to starve. Now

21:53

you'd think the wealthy countries are

21:55

going to be safe there. But look at

21:56

Australia: my old home country has about

21:59

30 days oil supply. When it runs out it

22:02

can't get food from the farm to the city

22:04

anymore. So Australia is incredibly

22:07

vulnerable. We're all far more

22:09

vulnerable than we realize and this war

22:11

is threatening everybody on the planet.

22:13

>> I got in an Uber yesterday and I was

22:14

with a a wonderful guy who was actually

22:17

weirdly, I got in the Uber at 2 a.m. and

22:19

I looked up on the screen and he was

22:20

listening to The Diary of a CEO and then he

22:22

clocked me in the back of the

22:23

car. We had a great chat and he was

22:24

saying to me, listen this isn't actually

22:26

my main job. It's my third job. I do

22:28

this because of the cost of living and

22:30

it really stayed with me.

22:32

>> He's doing three jobs which I love.

22:33

Yeah. three jobs and he picked me up at

22:35

2:00 a.m. He's got a family

22:37

>> and he's working his butt off to keep

22:39

the family alive.

22:40

>> Yes. And you know, I'm going to say

22:42

something which I probably um I don't

22:44

say a lot, which came to mind, which

22:47

is um in the position I'm in now. I

22:50

think it was a real reminder of my

22:52

own personal privilege that I think is

22:54

really important for someone like me

22:55

that has an interview show

22:57

because you've got to be like

22:59

intellectually honest with yourself or

23:00

just like honest with yourself generally

23:01

that like as a as someone in my position

23:04

who has been fortunate enough to be able

23:05

to make significant money. I can

23:07

understand from having that conversation

23:10

how

23:13

detached

23:14

>> you are

23:15

>> I am

23:15

>> from the world around you.

23:16

>> Yes.

23:18

You're a very unique soul because

23:21

I've read a bit of your

23:22

history, of course, and you've had that

23:24

terrible period where you were you know

23:25

unemployed and what the hell do I do

23:27

>> shoplifting food and stuff and

23:28

>> you were ambitious but you okay if you

23:31

don't experience poverty you don't know

23:32

what it's like

23:34

>> yeah but even if you have

23:35

>> you can forget it

23:36

>> you can forget it

23:37

>> but you haven't yet

23:38

>> well this is why it's so important for

23:39

me to have those conversations because

23:41

him saying I'm working three jobs and

23:43

this is and picking me up at 2 a.m. in

23:44

his Uber and him telling me he's doing

23:47

that because of cost of living because

23:48

he needs to pay the bills immediately

23:50

made me think of ahead of this

23:51

conversation today like oh my god if the

23:54

prices go up 20% for people

23:56

>> he's out. He can't work 24 hours a

23:58

day

23:58

>> can't work another he can't work more

23:59

hours in the day and it was just one of

24:00

those moments where you go hell

24:01

Steve like man you need to stay close to

24:04

the plight of people that are

24:07

>> on the bread line and so many people are

24:09

these days in advanced countries not

24:11

just third world countries but certainly

24:14

like in America and the UK there are

24:16

huge numbers of people who are basically

24:18

living from hand to mouth under the current

24:21

system. So if we have a breakdown they

24:23

can't afford it and in that situation

24:26

you can no longer use money as your way

24:29

of deciding whether you can eat food or

24:31

not.

24:31

>> I wonder if politicians know this cuz

24:33

part of the reason I say this is because

24:34

you know Trump is a very wealthy man

24:36

multi-billionaire reportedly

24:38

>> and if the prices go up 20% at the pump

24:41

>> he makes a profit. I mean, when he

24:43

spoke in favor of the

24:44

rising oil price, he actually said we'll make a lot of

24:46

money out of it. His immediate

24:48

association: a rising price of something

24:50

that I'm indirectly selling, that's good.

24:52

He doesn't say what about people buying

24:54

it. The people who buy it can no longer

24:56

afford it.

24:57

>> You've got some food there on the

24:59

table which shows how these conflicts

25:04

and the pressure they put on some of

25:06

these scarce resources can impact our

25:08

ability to go and buy food. I think

25:09

you've got two bowls of potatoes.

25:12

>> Well, let's actually make it fairer

25:14

right now. Let's get the actual

25:15

distribution correct initially. So, you

25:18

are talking about someone who, in your

25:21

situation, you've got this much, the

25:24

local Uber driver's got this, and now the

25:26

oil price is taking more away. That's going

25:28

to make Trump better off, but he's down

25:30

to the stage where, you know, he's not

25:32

too far from that happening. And that's

25:35

what we've pushed ourselves into with

25:36

this war.

25:38

Do do wars typically make inequality

25:42

worse?

25:44

>> Very good question. I think wars are

25:47

created when inequality is bad. If you

25:50

go back to the Great Depression and see

25:51

what caused World War II, it was largely

25:54

the collapse of the German economy,

25:57

when they had to repay their debt, both

25:59

private and government

26:01

debt to America, and that

26:03

led to the rise of Hitler. Everybody

26:05

thinks Hitler rose because of the Weimar

26:06

inflation. That's what people normally

26:08

think. In fact, the when when Hitler

26:11

came to power in Germany, the rate of

26:13

inflation was minus 10%. It was

26:16

deflation. Prices were falling.

26:18

Unemployment rose from very low to 25%

26:21

of the population. In that situation,

26:23

people supported Hitler. He revived the

26:26

economy. I'm happy to talk about how he

26:28

did that later. But inequality leads to

26:31

people being willing to elect demagogues

26:34

to say we can save you. And then you get

26:36

war coming out of it. What happened

26:38

after the World War II is

26:41

politicians realized that people had

26:43

been through the Great Depression, which

26:44

was horrific, and they'd been through

26:46

World War II, which was horrific. And in

26:48

that period, people in America were

26:51

talking about either a fascist world or

26:54

a communist world. So the Americans

26:56

realized they had to improve the living

26:58

standards of the average American

27:00

substantially to get away from that. And

27:02

if you look at, you know, the 1950s

27:04

and '60s, that's what is called the golden

27:07

age of capitalism because at that stage,

27:09

you could be a single male supporting a

27:12

wife and four kids and have a

27:15

comfortable lifestyle at the time.

27:18

That was where we started from. So the

27:19

the war itself led to a focus upon

27:22

equality, a focus upon fairness and

27:24

getting as much as you can to the

27:26

poorest in society. And then we've

27:29

forgotten that over the last 80 years.

27:31

And we've now got back to massive

27:33

inequality once more. So I think

27:35

inequality causes wars. Wars in the

27:37

aftermath make people focus on equality

27:41

not to allow that horror to happen once

27:42

more. And then we forget and do the

27:44

whole damn thing again.

27:45

>> One of the um surprising things I

27:48

learned the other day was that the

27:50

country that is estimated to have the

27:52

biggest reserve of oil

27:55

is Venezuela.

27:56

>> Yep. The third country on this list that

27:59

is estimated to have the biggest reserve

28:00

of oil is Iran.

28:03

>> Yeah. Yeah.

28:05

>> Now, it doesn't take a genius. Funnily

28:06

enough,

28:07

>> of two countries have added. Yeah. The

28:09

second country being America.

28:11

>> Well, it says Saudi Arabia.

28:12

>> Saudi Arabia. Well, that's already an

28:14

American vessel. Yeah.

28:15

>> Yeah. They're

28:16

basically partners with America already.

28:18

>> And funnily enough, the fourth one is

28:19

Canada. And if you're listening a

28:22

lot to Trump's rhetoric, he said he was

28:23

going to take Canada and make it the

28:25

51st state or something.

28:27

it doesn't feel like a coincidence

28:29

that the countries whose leaders the US is

28:31

removing are the countries

28:33

that have the biggest supplies. And

28:35

Trump has already said, you know,

28:36

immediately he said after taking out

28:39

Maduro in Venezuela,

28:41

>> pulling him out of his bed with his wife

28:42

and flying him back to the US,

28:44

>> he already said that the oil's on the

28:45

way back to America.

28:47

>> Yeah. One might assume that much of the

28:49

motivation here with Iran is when they

28:52

were in negotiations with them, maybe

28:55

they weren't playing ball with the

28:56

oil. Maybe they were threatening

28:58

something with the oil and maybe it's

29:00

such an economic waste.

29:02

>> Well, maybe most of Trump's friends, if

29:04

he has them, are oil executives and they

29:07

can see the benefit for them in

29:08

controlling global oil and the one part

29:10

they can't control is Iran.

29:11

>> But I mean, it's backfired

29:13

pretty horrifically. I think one of the

29:14

great sayings in humanity is: it looked

29:16

like a good idea at the time, then you

29:19

do it and you realize you underestimated

29:21

your opponent. You don't

29:23

realize how difficult it is. Like you

29:25

mentioned, you know, talking about how

29:27

being wealthy can make you dissociate

29:29

from the problems that ordinary people

29:31

have. It can also make you dissociate

29:33

from reality in general. You don't

29:35

realize how difficult it is to

29:37

do something you want to have done. So

29:39

all these oil executives and people who

29:41

Trump socializes with could have

29:43

thought, take out Iran, America

29:45

dominates the global oil thing. We're

29:47

all going to be rich. Okay? But they

29:49

don't realize that Iran's been aware of

29:51

this possibility for 40 years. And

29:53

they're prepared. They're far better

29:55

prepared than the Americans and the

29:56

Israelis thought.

29:58

>> So you've got five scenarios laid out on

30:00

these cards in front of you here that

30:02

you think could happen next. I'm going

30:04

to ask you to explain to me what the

30:05

five scenarios are and then tell me

30:07

which one you think is most likely to

30:09

occur.

30:10

>> So, scenario one, which is the one that

30:12

I think Israel wants:

30:15

Iran is destroyed. Okay, if

30:18

that happens, we're all gone because to

30:20

destroy Iran, you're going to have to

30:21

use nuclear weapons. Okay, you can't

30:24

destroy it without obliterating it as

30:27

nuclear weapons do. And that's the

30:29

scariest. I don't think it's going to

30:30

happen. My main hope here is that Iran

30:32

realizes that possibility and they've

30:36

got a way to neutralize

30:39

not America's nuclear weapons, but

30:40

Israel's.

30:41

>> You think it's a possibility?

30:42

>> It's a possibility and it's what scares

30:43

the out of me because if this

30:45

happens, then we're all dead. Obviously,

30:48

a nuclear bomb doesn't just blow up an

30:50

individual target. Everything within

30:52

reach gets exploded into the atmosphere.

30:54

That's what led people to realize that

30:56

you couldn't have a nuclear war back in

30:58

the days when we had mutually assured

31:00

destruction as the as the policy. If you

31:03

attack a country, then you will also

31:05

die.

31:05

>> But can't they use narrow nuclear

31:07

weapons? Is that not a thing?

31:08

>> Well, um

31:10

>> smaller nuclear weapons.

31:12

>> Well, again, even if the weapons are smaller,

31:13

think about

31:16

the weapons you'd need to make sure you

31:18

got every last potential element of Iran

31:21

neutralized. You're talking about

31:22

bombing something which is, you know,

31:25

virtually the size of Western Europe.

31:27

The amount of weapons you'd have to drop to

31:28

do that

31:30

and if you don't

31:32

get it right,

31:34

then they're going to come at you with

31:35

what they've got left.

31:36

>> The world has dropped nuclear weapons

31:37

before and people survive. Other

31:39

neighboring countries survived

31:41

>> only twice and only small weapons. The

31:43

weapons we're talking about in Hiroshima

31:46

and Nagasaki, they're about equivalent

31:48

to 20,000 tons of TNT. We're now talking

31:52

weapons equivalent to 20 million tons of TNT, the

31:55

biggest nuclear weapons. And if you

31:57

wanted to hit a country the size of Iran

32:00

and know you've neutralized it, so you

32:01

destroy the whole thing, you're talking

32:04

hundreds of those weapons.

32:05

>> If you had to give a percentage

32:07

probability of that outcome occurring,

32:10

would it be less than 1%?

32:13

If we didn't have a madman in

32:15

Washington, yes, it'd be less than 1%. If

32:19

we didn't have a madman in Israel, less

32:21

than 1%. As it is, I think probably 5%.

32:23

>> 5% probability that

32:25

>> that's a possibility. I mean, again, you

32:27

know, this is trying to make sense of

32:28

the senseless.

32:29

>> Okay.

32:30

>> But I'd put it about less than 10% but

32:32

still scary as a possibility.

32:35

>> If we end up there, we're all gone.

32:38

I mean, you know, I know very little

32:40

about all these things, so that's the

32:41

disclaimer. Um, I'd say that I don't

32:46

think Israel would intentionally wipe

32:49

out the rest of the world or cause a

32:51

nuclear winter because that would

32:52

obviously impact them as well. But I

32:55

am quite scared of precedent-setting.

32:57

And what I mean by that is if we

32:58

establish it being okay to drop nuclear

33:01

weapons on people you don't like, the

33:03

sort of domino effect of that for people

33:05

in Ukraine and other parts of the world

33:07

where there's conflict might then lead

33:09

to,

33:10

>> you know, mutually assured destruction.

33:12

>> Yeah. It's the it's the last possibility

33:14

you want to have happen. The fact that

33:15

it's even possible to contemplate it is

33:18

a terrifying prospect.

33:20

>> Let us hope.

33:21

>> Yeah. So scenario two

33:24

is Iran destroys the Gulf power

33:25

infrastructure. I think that's highly

33:27

likely.

33:28

>> Iran destroys Gulf power infrastructure.

33:30

What does that mean?

33:31

>> What it means is that all the Gulf

33:34

states have got their own power systems

33:36

mainly based on burning oil for obvious

33:38

reasons. If you take out their power

33:40

systems, then those countries

33:43

become uninhabitable.

33:44

>> Is that what's happening already? Cuz I

33:46

know Iran have attacked a few sort of

33:47

power facilities in the region. There

33:48

have been a couple of, well, there was one

33:50

attack on Saudi Arabian power systems

33:53

and that took out two of the 14 units

33:55

that are critical for creating liquefied

33:57

natural gas and apparently it'll take 5

34:00

years to rebuild them and there are only

34:02

five companies on the planet that can

34:04

actually do that rebuilding. One quarter

34:06

of the world's liquefied natural gas comes

34:08

through the Strait of Hormuz. One tenth

34:10

of that has been destroyed. It's like 2.5%

34:13

of the world's energy supply

34:15

is gone for the next 5 years until those

34:18

are rebuilt. If Iran destroys the Gulf

34:20

power infrastructure, then Saudi Arabia,

34:22

Qatar, uh Dubai, they all become

34:25

uninhabitable.

34:26

>> And we're seeing that happen in parts. I

34:28

mean, it sounds like this these attacks

34:30

have slowed down a little bit, but it

34:32

was interesting that Iran's strategy was

34:33

to attack their neighboring sort of

34:35

partners and specifically targeting a

34:37

lot of their energy infrastructure. Is

34:38

that in part to apply pressure?

34:41

>> Yeah. If I attack Dubai, the leaders of

34:43

Dubai are going to call Trump and say,

34:44

"Listen, cut this out."

34:45

>> Oh, yeah. I mean, the pressure coming

34:46

back from the Arabian states on America,

34:49

I imagine, is quite immense right now,

34:51

saying, "Don't do it." It's quite

34:52

possible Israel could do it, like attack

34:55

Iran and then Iran does a retribution

34:57

attack. Trump, if you would have seen

35:00

his tweet this morning, I think he's put

35:01

it off to April the 6th before he says

35:03

he starts attacking power

35:04

infrastructure. If he attacks power

35:06

infrastructure in Iran. Iran has said we

35:08

will attack power infrastructure in the

35:10

Gulf States. So we've got till you know

35:13

what 8 days. I think he's bluffing. I

35:16

hope he's bluffing. But if he does do

35:18

the attack then Iran will respond by

35:20

destroying either an equivalent

35:22

component of the Gulf States or the

35:25

whole infrastructure.

35:27

>> I don't think people quite realize how

35:29

costly it is for regions like Dubai when

35:33

Iran attack them. I was looking at some

35:34

of the data.

35:35

>> Yeah.

35:36

>> And according to current estimates and

35:38

historical risk assessments by Dubai

35:40

officials,

35:41

>> they lose a million per minute, which is

35:45

60 million per hour or 1.4 billion a day

35:50

when there's an unplanned emergency

35:52

shutdown just of their airport.

35:54

>> Their airports, let alone their power

35:56

systems. Yeah.

35:57

>> As we probably saw on the news, Iran had

35:59

flown what seemed like a couple of drones

36:01

into Dubai's airport, which meant that

36:02

it had to shut down. Yeah,

36:03

>> they're losing a billion a day because

36:06

that airport is closed. I think it's the

36:07

biggest airport in the world.

36:09

>> It is economic pressure

36:13

which will then trickle down to Trump

36:14

and sort of force his hand. So, they've

36:16

got a clear incentive to cause chaos.

36:18

>> Yeah.

36:19

>> And that's partly what Iran is saying.

36:22

It's like a game of bluff. You

36:24

don't want this bluff called. If it

36:25

is, then the Saudi Arabian

36:28

peninsula becomes uninhabitable

36:30

and therefore, I mean, if

36:33

people are forced out of there, and

36:34

most of the residents in those countries

36:36

are not Saudis. They're third world

36:39

workers from India and Pakistan and the

36:41

Philippines and so on. They're being

36:43

paid lousy wages to work on all these

36:45

systems. If they leave because the power

36:48

is not there to support them anymore,

36:49

then we lose the

36:52

entire energy contribution that

36:54

region makes to the global economy

36:57

>> and the figure

36:58

>> and that figure I cited includes not

37:00

just lost airport revenue but then the

37:01

immediate impact on airlines cargo

37:03

logistics and the missed opportunity

37:04

cost of thousands of high-value

37:06

business travelers attending the region

37:07

that Dubai's GDP is roughly 30%

37:10

dependent on the aviation and tourism

37:12

sectors so when the airport closes it

37:14

impacts tourism hospitality real estate

37:16

investing global supply chains and

37:17

everything. So it's quite

37:20

remarkable specifically with Dubai

37:22

>> because Dubai I think Dubai is a lovely

37:24

place. I've been multiple times. I

37:26

love going there.

37:28

>> But it felt really safe and so a lot of

37:30

people

37:30

>> It's not safe. Yeah.

37:31

>> It's not safe.

37:32

>> Yeah.

37:33

>> Yeah. A lot of people had chosen to

37:34

uproot their lives and move there and

37:36

you'd almost kind of forgotten you were

37:38

in the Middle East to some degree.

37:39

>> Yeah. Yeah.

37:40

>> But I think this is going to be a pretty

37:42

traumatic reminder for a lot of people

37:43

there

37:44

>> how fragile

37:45

>> how fragile

37:45

>> this area is and like that's the lesson

37:47

we're learning. It's the fragility of

37:49

the society we take for granted.

37:51

>> So that was scenario number two.

37:53

>> Okay. Scenario three. That's the one

37:55

that really scares me because that is

37:58

the Samson doctrine. You know the story

38:00

of Samson. Yeah. Okay. Samson is an

38:03

enormously strong individual who's

38:05

strong because of his hair. And then he

38:08

gets conned. This is an ancient story

38:10

from the Bible. And the woman who

38:12

conned him shaves his hair. So he's

38:14

weakened. And then they put him in a

38:17

temple where he's standing between two

38:19

pillars and his hair is gone. He's bald.

38:21

He can't do a thing. They forget the

38:23

fact that his hair is starting to grow.

38:25

His hair gets to the stage where he's

38:27

now got his strength back. He pushes

38:29

those pillars and the whole thing

38:30

collapses and everybody dies. That's the

38:33

Samson doctrine. And that involves

38:35

Israel's nuclear weapons. If they

38:37

realize that they are going to lose this

38:39

war and it becomes existential for them

38:42

then one of the things they have claimed

38:44

they would do is unleash destruction on

38:47

the rest of the world like Samson

38:49

pushing the towers and the whole thing

38:51

comes collapsing down.

38:52

>> This is I mean going back to the

38:53

situation with Iran and Israel. One of

38:55

the things I was thinking a lot about

38:56

from some commentary that I'd seen is

38:59

Israel really have a motive to get rid

39:03

of Iran because Iran have repeatedly

39:05

threatened Israel. It's also because I

39:08

mean Israel is trying to get rid of the

39:09

Palestinians and in that sense Iran has

39:12

been probably the major bulwark

39:15

supporting the Palestinians, saying let the

39:17

Palestinians survive. Let the

39:19

Palestinian people continue existing.

39:22

And the Israelis have been pushing and

39:25

pushing and pushing the Palestinians

39:27

out. You know, it's a hornet's nest.

39:28

We've provoked a hornet's nest. Iran is

39:31

responding right now, I think, in a very

39:32

judicious way. But if the Israelis

39:35

realize they're facing an existential

39:37

defeat, that scenario, it would again

39:40

mean uh civilization potentially gets

39:43

destroyed. And just looking at some of

39:44

the things that Iran have said about

39:46

Israel, historically, the Supreme Leader

39:48

of Iran stated in 2015 that Israel would

39:51

not see the next 25 years. Other

39:54

officials said things like, "The end is

39:56

near." Um,

39:58

>> and in March 2026, Iran's tone shifted

40:02

from ideological to purely retaliatory

40:06

with the speaker of the Iranian

40:07

Parliament, Muhammad, stating that Iran

40:11

has officially declared that it

40:12

considers all Israel energy, water, and

40:14

IT infrastructure legitimate targets for

40:17

irreversible destruction with zero

40:20

restraint.

40:21

If we think about this from a psychology

40:23

perspective, you've got two neighbors.

40:24

They're both implying, implicitly

40:26

or explicitly that they want to wipe the

40:28

other one out.

40:29

>> Yeah.

40:29

>> Trump is sort of this third party in in

40:31

the arrangement who's not in the region,

40:33

so he might be a little bit safer.

40:35

>> Those two parties that are against each

40:37

other, one of them has nuclear weapons

40:39

and the other appears to be trying to

40:42

make one. the neighbor that is Israel

40:45

presumably cannot let that happen

40:48

because if it gets to a point where they

40:49

both have nuclear weapons and they both

40:51

want to wipe each other out.

40:54

>> No, I I actually think that the old days

40:56

of mutually assured destruction were a

40:59

more stable time than what we're in now

41:01

because if you realize that if you

41:02

attack you also die, you don't attack.

41:05

>> But what if you think of death as being

41:07

a better thing than life?

41:10

You you have to have a society

41:12

continuing after you die. If you're

41:14

going to be a martyr, there has to be

41:15

people who are going to mourn your

41:17

death. If you believe being a martyr

41:18

means everybody else dies too, and you

41:20

don't do it.

41:21

>> So, do you think if Iran had nuclear

41:22

weapons, it would be a safer world?

41:24

>> I think it'd be safer because it would

41:26

tell the Israelis, stop attacking your

41:27

neighbors.

41:28

>> I sat with um a few nuclear experts and

41:32

one of the things that was shocking that

41:33

I learned is

41:35

>> if the United States wanted to launch a

41:37

nuclear weapon today,

41:38

>> Yeah. It is one person's decision.

41:40

>> I heard that. And that means Trump can

41:42

actually just make that decision.

41:43

>> He can make the decision on his own. He

41:45

doesn't need to consult Congress or

41:46

anybody else. He has someone who walks

41:48

around with a briefcase that has the

41:49

nuclear codes in at any moment. They

41:52

call it the nuclear football. Yeah. And when I think about the

41:53

same in this region, actually, you don't

41:55

need a whole state to decide that they

41:58

don't like their neighbor. All you need

41:59

is one supreme leader

42:01

>> or Netanyahu to say, "Do you know what?

42:04

I'm near the end of my life and these

42:06

people have really pissed me off." Yeah,

42:08

that's right. And that's I mean I

42:10

thought there was at least some control.

42:11

I saw that segment with Annie

42:13

Jacobsen. Yeah.

42:14

>> I thought there was at least some

42:15

control. He had to consult someone.

42:18

>> Or the

42:19

circumstances had to justify not

42:21

consulting someone.

42:22

>> If he's got that right, then we it comes

42:24

down to what's the behavior of the

42:26

person who carries the nuclear football.

42:28

Does he let Trump get hold of it? And

42:31

like there was another incident

42:33

way back, I think in the '70s or '80s,

42:36

that the Russian early warning system

42:39

reported that there was a nuclear attack

42:41

on the way to Russia and there was one

42:43

submarine commander or one element of a

42:46

submarine command system and they had to

42:48

have three people in the submarine who

42:49

agreed to launch an attack, and this

42:52

particular person refused.

42:55

If he'd agreed with the other

42:58

two, we'd have had a nuclear war.

43:04

even the Russian submarine had three

43:06

people who had to make that decision.

43:08

So, we didn't have a nuclear war. Now,

43:10

we've got one maniac in this White House

43:12

who could do it. I'll play Annie

43:14

Jacobsen's clip now where she talks

43:16

about the idea of sole authority which I

43:19

think is an important thing for people

43:20

to understand because when we think

43:22

about who we're electing to lead our

43:24

nuclear capable countries

43:27

>> you have to think about who you want to

43:29

give sole authority to

43:31

>> the United States president has sole

43:34

presidential authority to launch a

43:35

nuclear war

43:36

>> what does that mean

43:38

>> it's exactly like it sounds what's so

43:40

interesting is a lot of this stuff this

43:41

nomenclature that gets thrown at us.

43:43

If you just break it down: sole,

43:46

solo; presidential, he's the POTUS;

43:50

authority. He doesn't have to ask anyone

43:52

for permission. Not the Secretary of Defense, not

43:55

the chairman of the joint chiefs of

43:57

staff, not the Congress. I love the

43:59

worried look on your face in this moment

44:01

because it is once you know that

44:06

you say well first you might Google is

44:08

it really true and you will get for

44:11

example on Reddit like that's not really

44:13

true you'll get like hundreds of

44:15

thousands of people you know coming in

44:18

with their opinions about how that's not

44:20

really true well it is really true it's

44:23

absolutely true and in fact during the

44:25

former President Trump administration

44:29

Congress became so sort of I want to say

44:32

motivated or alarmed by this issue

44:34

meaning they were being asked questions

44:36

by the powers that be: is this actually

44:40

true? They released a report stating

44:43

specifically, and I quote in the book: yes,

44:45

it is true; as commander-in-chief,

44:48

the president has this sole authority.

44:50

He doesn't need to ask anyone.

44:53

>> So what is scenario four in your

44:55

envelopes there? Iran disables Israel's

44:58

nukes. Nobody can know. But I do believe

45:00

that Iran has not developed nuclear

45:02

weapons.

45:03

>> So you're hoping Iran disables Israel's

45:05

nuclear weapons?

45:06

>> I am. I hope that happens because that

45:08

takes out the nuclear option. Okay. We

45:10

won't see nuclear war as a result of

45:12

this. If the only nuclear weapons that

45:13

we know exist in the Middle East are

45:15

destroyed.

45:16

>> But if Iran starts disabling Israel's

45:18

nukes and attacking Israel,

45:21

effectively, there's going to be an even

45:23

bigger problem. Well, not if we're

45:25

talking conventional weapons. If it's

45:27

conventional weapons and ground troops,

45:29

then you don't end up with nuclear

45:31

winter and the death of everybody on the

45:33

planet.

45:33

>> Wait, so you're saying you hope Iran

45:35

invades Israel and takes out their

45:36

nuclear weapons?

45:37

>> No, that's not necessarily invasion. It

45:38

could be the missiles they've got left.

45:40

Again, we don't know how capable their

45:42

missiles are. The level of planning that

45:44

Iran has done in this war, I had no

45:46

idea of the fact they had those 31

45:49

regions, for example, until the war

45:50

began. My specialty is economics, not

45:52

global military politics. But once I

45:56

learned that, I thought they have really

45:57

thought this through. They have wargamed

46:00

what happens if they get attacked by

46:02

America. And they've wargamed it

46:04

comprehensively. Now, I hope they've

46:07

also wargamed if we start defeating

46:10

Israel and Israel realizes they're going

46:13

to be wiped out, then the possibility

46:15

for the Samson doctrine comes in. We

46:17

have to disable that before it happens.

46:19

How could they possibly... They don't

46:20

have a functioning military left in any

46:23

sort of typical sense. They don't have

46:26

ships left. They don't have planes left.

46:27

>> They don't have ships. They don't have

46:28

planes. But they have got missiles. And

46:30

we don't know how many missiles they've

46:32

got. We don't know where the missiles

46:33

are. Certainly the Americans would have

46:35

some intelligence. I think the word's

46:36

got to be used with inverted commas

46:38

these days, but some intelligence over

46:40

where they are located in Iran. But if

46:42

you listen to the Iranians talking about

46:44

it, they say they've got hundreds of

46:46

these facilities buried hundreds of

46:49

meters below the ground. With the

46:52

weapons they've developed, the

46:54

advanced rocketry,

46:56

they can evade Israel's Iron Dome; maybe

47:00

they can also get into and destroy

47:02

Israel's launch capabilities. And if

47:05

that happens, I think that's that would

47:06

be the best possible outcome because we

47:09

have a a rogue state in the Middle East

47:12

which has nuclear weapons which will

47:14

neither admit that it has nor sign

47:16

it. They're not part of the nuclear

47:18

non-proliferation treaty. They won't

47:20

sign that treaty. We should never have

47:22

allowed that to happen. And if Iran gets

47:25

rid of them, I think it's the world's a

47:26

safer place.

47:26

>> Israel are just going to make more

47:28

nuclear weapons.

47:29

>> They have the resources. Uh, you need a

47:32

hell of a lot of technology and a hell

47:33

of a lot of intelligent people to do

47:35

that. By then you've already lost the war.

47:36

>> How could Israel lose the war?

47:39

>> You've got a population of 90 million

47:41

in Iran and a population of less than 10

47:44

million in Israel.

47:45

>> But they've got a sort of

47:46

technological

47:48

gulf.

47:49

>> It's not as big as we thought it was.

47:51

We're only realizing now the level of

47:52

technology that Iran has. I mean the

47:55

things which Iran are doing in this war

47:56

so far have surprised everybody who

47:59

hasn't got the background of

48:00

intelligence to tell them what's going

48:03

on uh it's an educated sophisticated

48:06

culture far more so than the caricature

48:08

we've got from the west has been about

48:10

it in the past so they

48:12

>> they don't have nearly the same

48:13

level of resources and technology,

48:19

and I would say

48:20

sophisticated, advanced systems from

48:23

a war perspective, that Israel does.

48:27

>> We think we don't know. We're assuming

48:30

>> even the intelligence services, even

48:31

their like their planes and their

48:33

missiles and their defense systems are

48:35

like profoundly more advanced than

48:36

Iran's.

48:38

>> If that was the case, we wouldn't be

48:39

having this conversation. It's 3 weeks

48:41

after the war began or 4 weeks. You

48:44

know, Trump's original belief that it had

48:46

to be over in one day. That's proved

48:48

false.

48:49

>> I think that's in part because of what

48:50

you said because they've prepared for

48:52

decapitation. If I was the supreme

48:54

leader of Iran, yeah, that's the sort of

48:56

approach I would have taken, which is

48:58

you take me out and actually you've got

49:00

a bigger problem because now you've got

49:01

to negotiate with 31 different

49:04

sort of submillitaries and that's an

49:06

impossible task.

49:07

>> Yeah. And the Iranians were

49:09

aware of that and they've got a, you

49:11

know, a huge army. They can

49:13

conscript far more people than Israel

49:15

has. Um, to me, if it gets down to a

49:18

conventional military conflict, then it's possible

49:21

that you know Israel could lose that as

49:23

well.

49:23

>> On March 21 Trump threatened to

49:26

obliterate Iran's power plants if they

49:28

did not fully reopen the Strait of

49:29

Hormuz within 48 hours.

49:32

>> He then came out and said that he was

49:34

pausing that because Iran were

49:36

negotiating

49:37

>> Um, and he says he thinks he's

49:39

negotiating with the right person. As of

49:42

yesterday, Trump has announced a 10-day

49:44

pause until April 6th on destroying

49:48

energy plants, claiming that indirect

49:50

talks are going very well and that Iran

49:52

is begging to make a deal according to

49:54

the Guardian. So, what's going on there

49:57

in your view?

49:58

>> I think he's gaming the markets. I

50:01

really think he's using it to cause the

50:03

oil price to go up and down, gaming

50:05

it either side. And somebody in his

50:08

circle, or his people, are making a fortune

50:10

playing it.
>> You think that's the case?

50:12

>> Yeah, I do. I mean

50:13

>> because there's lots of ways to make

50:15

money that don't involve crashing the

50:18

global economy, losing the midterms.

50:20

>> Yeah, you'd think that. But you've got

50:21

ethics, you've got empathy,

50:23

you've got morals. Trump has none of

50:25

those things.

50:26

>> Do you not think it's just it's just

50:27

again if we look at Trump's pattern of

50:29

behavior over time, even with the

50:30

tariffs?

50:31

>> Yeah. The same pattern of behavior

50:32

occurred there where he would come out

50:34

and say, "Every leader is calling me.

50:36

They can't stop calling me. They all

50:38

want to make a deal. I'm going to do a

50:39

tariff on you 10%. Wait, no, I'm not.

50:41

Pause. Call me."

50:42

>> Yeah.

50:43

>> It's the same pattern of behavior.

50:46

You make a threat. Yeah.

50:46

>> You then

50:48

>> blackmail the person to try and

50:49

negotiate with you. When they don't, you

50:51

hit them with the thing hard and

50:53

eventually, at the end of the day, you

50:55

don't really do any of the stuff you

50:56

threaten to do

50:57

>> because you've sort of

50:59

>> manipulated a person into getting your

51:01

way. It's the same pattern of behavior.

51:03

We're going to smash you if you don't

51:04

call me.

51:05

>> Yeah,

51:06

>> they do or don't call. He announces to

51:07

the world that they called. They're

51:09

begging. Look, it says here, "They're

51:10

begging me for a deal. I'm going to give

51:12

them 10 more days."

51:14

>> To me, it sounds like he's trying to

51:15

build his golden bridge to get the

51:17

out of there. What he's imagining is

51:19

he's dealing with somebody like himself

51:20

in Iran. Okay? He's projecting

51:24

how he would react to these things. He's

51:26

obviously projecting his own behavior

51:28

onto the system. And it's projection

51:30

rather than understanding. So if you

51:32

decapitate, you know, if you took out

51:34

Trump, the fear of being, you know,

51:36

assassinated: "Yes, well, let's bargain. What do

51:38

you want me to do?" He thinks that works

51:40

in Iran. It doesn't.

51:41

>> You can look at his behavior and sort of

51:43

understand what he wants. He wants to

51:44

win this war, and, you know, he

51:47

wants to win the war and get

51:48

out of there because that's what he's

51:49

been saying. We've won. We won. We've

51:50

won every day. We've won. More missiles

51:52

go in. We've won. So that's clearly what

51:54

he wants to happen. The problem is

51:55

winning here doesn't seem like a

51:57

straightforward thing.

51:57

>> No, it's not going to happen.

51:58

>> No pun intended with the strait. But

52:00

it really doesn't seem like a

52:01

straightforward thing.

52:02

>> So it's my opinion now that they are a

52:05

little bit stuck because if you leave

52:07

now you lose.

52:08

>> Yeah.

52:09

>> Iran start firing at Israel. Israel

52:11

don't stop even though you tell them to.

52:13

Yeah,

52:13

>> they start firing at each other. The

52:15

whole thing blows up. They keep the

52:17

Strait of Hormuz closed. Oil prices go

52:20

up. It looks terrible, terrible,

52:21

terrible for Trump. He

52:23

might find himself in a Bush situation

52:25

where his legacy, and I think that's

52:26

such an important word, a man that can't

52:29

be elected for a third term. His legacy

52:32

is tarnished in the same way that Bush's

52:34

legacy was tarnished by going to war in

52:36

the Middle East. M

52:37

>> I think his greatest fear, Trump's

52:39

greatest fear, you think back through

52:41

all of these moments over the last

52:42

couple years where he talked about the

52:43

Nobel Prize,

52:44

>> I think he's trying to put himself on

52:45

the Mount Rushmore of presidents.

52:47

>> Yeah.

52:47

>> In history's mind.

52:49

>> And I think how this situation plays out

52:52

now, the sole thing he's thinking about

52:53

is his legacy. And right now, being

52:56

stuck in a war and contemplating putting

52:59

ground troops in is arguably the worst

53:01

thing for one's legacy. Americans dead.

53:03

>> Yeah. And lots of Americans dead. These

53:05

wars are like you think about Vietnam.

53:07

These wars are never really won.

53:08

>> No. Well, they did. America hasn't won a

53:10

war since World War II and even World

53:12

War II was won by the Russians more so

53:14

than the Americans. So, we have this

53:16

picture of America as being this, you

53:18

know, invincible military power. But it

53:20

lost in Vietnam. It lost in Iraq. It

53:23

lost in Afghanistan. America's failed in

53:25

all of these. This is another American

53:27

failure, but on a scale far beyond what

53:30

happened in Afghanistan and Vietnam.

53:32

>> Do you think he will send ground troops

53:33

in?
>> Yes, I do. Uh, and I've seen

53:36

people talking about where the troops

53:37

might land. And the only part where they

53:39

can land is right towards this edge

53:41

here with Pakistan, where they might

53:43

land between 2,000 and 10,000 troops. I'd

53:46

hate to be one of those troops because

53:47

it's a suicide mission. Again, with those

53:49

31 provinces, the separate military

53:52

commands they've got, the weapons

53:54

they've got hidden underground, the

53:56

troops themselves. If you know

53:58

that there are Americans landing and

54:00

you're an Iranian soldier, you are

54:03

going to attack them like nobody's

54:05

business and not be afraid of your own

54:07

death because you do think if you get

54:09

martyred, it's the remaining people that

54:11

you're defending. There will be people

54:12

who recognize you as a martyr. It's

54:15

horrific. If you had to give a sort of

54:17

percentage probability of them putting

54:18

ground troops in,

54:19

>> I would say more than 50%. We're going

54:22

to find out in the next couple of weeks.

54:24

>> Much of the reason most people haven't

54:26

posted content or built their personal

54:27

brand is that it's hard and it's

54:30

time-consuming and we're all very, very

54:31

busy and if you've never posted

54:33

something before, there's so many

54:36

factors in your psychology that stop you

54:38

wanting to post. What people will think

54:40

of you, am I doing this right? Is the

54:41

thing I'm saying absolutely stupid? All

54:44

of these result in paralysis, which

54:46

means you don't post and your feed goes

54:48

bare. I'm an investor in a company

54:50

called Stanto, which you've probably

54:52

heard me talk about. And what they've

54:53

been building is this new tool called

54:55

Stanley that uses AI, looks at your

54:57

feed, looks at your tone of voice, looks

54:59

at your history, looks at your best

55:00

performing posts, and tells you what you

55:02

should post, makes those posts for you.

55:04

You can also just use it for

55:06

inspiration. And sometimes what we need

55:08

when we're thinking about doing a post

55:09

for our social media channels is

55:11

inspiration. Building an audience has

55:13

fundamentally changed my life and I

55:14

think it could change yours, too. So,

55:16

I'm inviting you to give this new tool a

55:18

shot and let me know what you think. All

55:20

you have to do is search

55:21

coach.stand.store

55:23

now to get started. This company that

55:25

I've just invested in has grown like

55:27

crazy. I want to be the one to tell you

55:28

about it because I think it's going to

55:29

create such a huge productivity

55:30

advantage for you. Whisperflow is an app

55:32

that you can get on your computer and on

55:34

your phone on all your devices and it

55:36

allows you to speak to your technology.

55:37

So, instead of me writing out an email,

55:38

I click one button on my phone and I can

55:41

just speak the email into existence and

55:43

it uses AI to clean up what I was

55:45

saying. And then when I'm done, I just

55:47

hit this one button here and the whole

55:48

email is written for me. And it's saving

55:50

me so much time in a day because Whisper

55:54

learns how I write. So on WhatsApp, it

55:55

knows how I am a little bit more casual.

55:57

On email, a little bit more

55:58

professional. And also, there's this

55:59

really interesting thing they've just

56:00

done. I can create little phrases to

56:02

automatically do the work for me. I can

56:04

just say Jack's LinkedIn and it copies

56:06

Jack's LinkedIn profile for me because

56:08

it knows who Jack is in my life. This is

56:10

saving me a huge amount of time. This

56:11

company is growing like absolute crazy.

56:13

And this is why I invested in the

56:14

business and why they're now a sponsor

56:16

of this show. And Whisper Flow is frankly

56:17

becoming the worst-kept secret in

56:20

business, productivity, and

56:21

entrepreneurship. Check it out now at

56:22

Whisper Flow, spelled

56:26

wisprflow.ai/steven.

56:29

It will be a game changer for you.

56:31

>> What is the best case scenario? The

56:34

Americans have to realize they've lost.

56:36

They've now got to negotiate the terms

56:37

of reparation. And what Iran has

56:39

proposed, when you look at Iran's terms,

56:42

they're extremely reasonable. They're

56:43

saying America

56:45

leaves the whole region. America no

56:47

longer comes back in this region. No

56:49

military bases, no agreements. This

56:52

becomes Iranian protectorate. That

56:54

becomes an Arabian Empire, or rather an

56:56

Iranian Empire, because they're not

56:58

Arabs. They're Persians. Uh

57:00

so this becomes like a Muslim part of

57:03

the world. You can actually take

57:06

the whole region out to here; it's all

57:08

Muslim. And what's been happening, and

57:10

this is part of the weird religious

57:11

element here, is you've got the Sunni sect

57:13

and the Shiite sect which is a bit like

57:15

the Protestants versus the Catholics go

57:18

back 500 years and what we're seeing

57:20

here is like the Hundred Years' War that

57:22

occurred in Europe back in the days when

57:25

Protestant versus Catholic was

57:27

a serious thing. Um, so we're seeing a

57:29

religious war being fought here, and the

57:32

Sunni majority, about 90% of Muslims

57:34

are Sunni, have focused on their

57:36

rivalry with the Shiites. And so what

57:39

they've done is they've sided with this

57:40

mob, the United States.

57:44

So they've sided with the Christians

57:46

to strengthen their own Muslim sect

57:49

which is the Sunni sect against the

57:51

Shiite sect which is Iran is

57:52

predominantly Shiite. Now what's

57:54

happening here is as soon as the war

57:56

started, America...

57:59

The reason the Arabs agreed to bases

58:02

here, military bases, is they thought it

58:04

would protect them from Iran. As soon as

58:06

the war starts, those bases are

58:08

obliterated, the Americans leave, and they

58:10

realize that hasn't worked at all. So

58:12

the deal the Sunnis made to side with

58:14

the Christians has proved to be an

58:16

extremely bad deal. You're going to have

58:19

to have change in who rules these

58:21

countries to enable it to happen. But I

58:23

think the persuasive case coming out of

58:25

this within the Muslim areas is Muslims

58:28

stick together. Don't cooperate with the

58:30

Christians.

58:31

>> Don't cooperate with the United States.

58:32

>> I think that's what's going to happen.

58:34

>> You think that's going to happen?

58:35

>> I hope so because that at least gives us

58:37

something which is relatively stable.

58:38

This becomes a region that is Muslim.

58:41

>> When you say this, you mean the Middle

58:42

East?

58:42

>> I mean the whole Middle East, Saudi

58:43

Arabia, Iran, Iraq, Pakistan as well

58:46

because it's a Muslim country.

58:47

Afghanistan. This region becomes Muslim

58:51

dominated. Shiites and Sunnis start to... I

58:56

mean, the whole idea of Catholics

58:56

fighting Protestants that's completely

58:58

dissipated. Um, there's no level in

59:01

the west anymore of large scale military

59:04

type animosity between Catholics and

59:07

Protestants. That's what's happening

59:09

over here. The conflicts, those

59:13

religious conflicts

59:15

within the Christians, disappeared,

59:15

largely speaking. They're still

59:17

happening within the Muslim religion.

59:20

This could persuade them that that's got

59:22

to end.

59:23

>> So, we've got one more scenario.

59:24

Scenario five.

59:25

>> Iran develops nuclear weapons. I'd

59:28

rather four happen than five.

59:29

>> Which of these five outcomes do you

59:31

think is most probable to happen?

59:35

>> I think the most likely outcome is Iran

59:38

disables Israel's nuclear weapons.

59:40

Because Iran has been so prepared for

59:42

this conflict in a way that America has

59:44

not, in a way that Israel was not. I

59:46

hope they're also prepared for the

59:48

eventuality of having to neutralize

59:50

Israel's nuclear weapons.

59:52

>> You think the highest probability is

59:53

Iran disabling Israel's nukes?

59:56

>> Yeah, I hope I'm right. I mean, if Iran

59:58

gets destroyed, then this leads not to

60:01

Iran developing nuclear weapons, but

60:03

every potential rival for America on the

60:06

planet developing nuclear weapons. We go

60:09

to a nuclear war dominated world. Do you

60:11

not think it's more likely that Trump is

60:14

going to find himself a golden bridge to

60:17

get out of this situation? He's going to

60:18

call Netanyahu in Israel and say, "Stand

60:21

down, please. I'm going to announce that

60:23

we've won this war. I'm going to

60:24

announce that we've done a deal. It's

60:26

all over."

60:28

>> Well, without doubt, whatever happens,

60:29

Trump is going to say he won. Okay,

60:32

that's again the narcissistic

60:33

personality disorder thing. He simply

60:36

couldn't bring himself to stand on a

60:38

stage and say, "I lost." I mean, think

60:40

about the biggest insult that Trump ever

60:42

made on his Apprentice show. You're

60:44

a loser. Okay? Being a loser is the

60:46

absolute worst possible thing that

60:48

anybody can be in his mind. If he has to

60:50

say, "I'm a loser," then his life is

60:52

over in that sense. His

60:55

self-image is over. So, whatever deal

60:58

comes out, he's going to say he won.

61:00

For the average person that's

61:01

listening now when they hear all this

61:03

conflict going on in the world, from

61:05

an economic perspective, is there

61:07

anything they can be doing to protect

61:09

themselves against some of these

61:11

downstream consequences?

61:12

>> Well, I think one thing is

61:14

we've now got to the stage where you can

61:16

buy your own solar system for your

61:19

house. You need something which means

61:21

you are not dependent upon oil anymore.

61:23

I I think we've trivialized the dangers

61:25

of climate change for the last half

61:27

century. We've done very little about it

61:29

to reverse it. This is telling people

61:31

that if you rely upon oil, you've got

61:34

a fragile existence. Even if it cost you

61:36

more to build solar, you've got to build

61:38

solar as your own alternative energy

61:40

system. Cuz without energy, there's no

61:42

civilization.

61:43

And that's what we're learning the hard

61:45

way from this conflict. So I think

61:47

individual response is going to be: get

61:50

some way to have your own power source

61:52

and for most people that means having

61:54

solar. One man that has done a lot for

61:56

both solar and sustainable energy is

61:59

Elon Musk.

62:00

>> He has. He's also helped get bloody

62:02

Trump elected. So I think you've got to

62:04

score that against him as well. But

62:06

yeah, his work on solar and

62:08

power and rocketry. I've absolutely

62:10

admired that and I see that as a

62:12

critical positive contribution. But

62:14

getting Trump elected, he played a major

62:16

role in that. He should learn from that

62:18

mistake and get the out of

62:19

politics. He has backed off politics

62:22

now, which is

62:22

>> I think he's realized how poisonous it

62:24

is. Yeah.

62:25

>> Yeah. It sounds like he's realized you

62:26

can't really change the beast. No,

62:28

>> he tried.

62:29

>> Yeah. He should stick with the area where

62:30

he's mature, which is what he does with

62:32

energy systems and what he does with

62:34

rocketry. He's really I mean, in terms

62:36

of legacy, uh, he's tainted his legacy

62:38

by getting involved in politics. Go back

62:40

to engineering. So you say that you

62:43

think people should invest in solar for

62:45

their homes to get their own energy

62:47

sources so they're a little bit

62:48

insulated from these sort of

62:49

macroeconomics. Is there anything else

62:51

they should be thinking about? You know,

62:52

the average person the cost of living

62:54

crisis. What happens next? What

62:56

should they be doing now?

62:57

>> The thing that I'm most worried about

62:58

this is the impact upon food. I'm the

63:00

last person to talk about growing your

63:03

own food. I've never done it. I've

63:06

got brown thumbs, not green ones. But I

63:09

think if you can have any way to produce

63:10

your own food, you've got a bit of

63:13

insulation against what's happening at

63:14

the global level. The lesson that comes

63:16

out of this is self-sufficiency.

63:19

If we don't have self-sufficiency, then

63:21

these sorts of global chaotic things can

63:23

destroy you completely with you having

63:25

no recompense. If you have some degree

63:27

of self-sufficiency, you can survive.

63:29

>> And how does one create

63:30

self-sufficiency? Growing your own food

63:32

is quite expensive and slow, isn't it?

63:33

>> Yeah, extremely.

63:34

>> So, how does one develop

63:36

self-sufficiency in this these sort of

63:37

economic climates? Is it saving money or

63:40

is it um

63:41

>> I think it's having your own physical

63:42

resources close to you that enable you.

63:45

Money doesn't matter if you can't buy

63:47

the product in the first instance. The

63:49

product doesn't exist anymore. So, one

63:52

thing that happened during World War II

63:53

is a large amount of food was grown in

63:55

the UK by people turning their gardens

63:57

into market gardens.

63:59

>> I've heard you make a few predictions

64:01

about the future of the economic

64:03

markets. You know, you're famous for

64:04

predicting 2008 and the financial crash

64:07

that occurred then. I've heard you

64:08

saying that you think because of AI

64:10

there's going to be another financial

64:11

crash around the corner within one or

64:13

two years.

64:14

>> Yeah. What's happening with AI is a

64:16

classic economic boom and bust cycle

64:19

overlaid on the fact that AI can also

64:21

eliminate a huge amount of employment

64:24

which we've never seen that possibility

64:27

in the past on that scale. But a common

64:30

pattern in capitalism is that some new

64:33

technology will be developed like

64:34

railways for example. Some people see

64:37

the potential profitability of railways.

64:39

Everybody pours in creating railways.

64:42

You get too many railways built. Then 90%

64:45

of the companies that create the

64:47

railways go bust. But then we all have

64:49

these rail systems that we benefit from

64:51

afterwards. So that's the classic

64:54

pattern. Joseph Schumpeter was the

64:56

person who best described that, the

64:58

Austrian economist from the

65:00

early 20th century. So he said you'll

65:03

get this: the banks will finance a new

65:04

investment area that investment produces

65:07

a new technology which causes a boom

65:09

while you're building the technology but

65:11

when the technology comes online it

65:13

undercuts existing businesses and causes

65:16

a slump. So there's a boom and slump cycle,

65:18

and AI is a natural example of that and

65:21

what you get is massive overinvestment

65:23

in the first instance because everybody

65:25

who invests in AI has the ambition of

65:28

being the only AI provider on the

65:29

planet. Therefore you get too many

65:31

companies investing; there's too much

65:32

money going into it. That's what causes

65:34

a boom. But then when the technology

65:36

comes online because it undercuts

65:38

existing technologies you have a slump.

65:41

And when you look at the investment

65:42

taking place at the moment, the big tech

65:44

companies, Meta, Amazon, Microsoft,

65:46

Alphabet, which owns Google, and Oracle are on track

65:49

to spend 720 billion on AI

65:54

infrastructure in 2026 alone, which is

65:56

less than 20% of the revenue that

65:59

they're making. We are seeing a 5:1

66:02

ratio of money being spent versus money

66:04

coming in. Yeah.

66:05

>> Which is historically unsustainable.

66:07

>> Yeah. And I think that's true. There

66:08

there has to be a slump coming out of

66:10

this. And in in a sense, that's part of

66:12

the natural cyclical behavior of

66:15

capitalism because if you want to make a

66:16

profit, you've got to bring in

66:17

technology that undercuts everybody

66:19

you're currently rivals with. So that's

66:21

the railways are a classic example

66:23

there. You know, you used to get around

66:24

by carriage instead. You undermine the

66:27

carriage companies by bringing in the

66:28

railways. But the ultimate benefit:

66:30

society benefits because now you've got the

66:32

railways for transportation. So that's

66:34

the same sort of thing that AI is doing

66:36

this time around. But companies 90% of

66:38

those companies are going to fail.

66:40

>> I mean this is kind of what we're seeing

66:41

already. So the failure rate of

66:42

AI-specific startups has hit 90% in 2026.

66:45

>> Wow. That's luck.

66:46

>> Yeah, you predicted that one

66:48

correctly. Significantly higher than the

66:49

70% average for general technology.

66:52

Roughly 95% of enterprise AI pilots fail

66:54

to move into production while they

66:56

incur massive costs. The other thing I

66:58

think a lot about is um

67:00

>> a lot of startups now are raising a lot

67:02

of money at crazy crazy valuations. I

67:05

can think of one particular startup I

67:06

know they're making like a couple of

67:07

million dollars a year. They've raised

67:09

at a billion dollar valuation

67:11

>> and because they've got the word AI on

67:13

them. And the thing is

67:15

>> because everyone's in such a frenzy

67:16

at the moment about AI, they're probably

67:18

going to raise at a 2 billion valuation

67:19

6 months from now.

67:20

>> When you think about what's going on

67:21

there, someone somewhere is putting

67:23

their money in

67:24

>> and they're going to lose it all.

67:25

>> And they're going to lose it all. And

67:26

when when everybody starts losing all

67:28

their money very very quickly, you see

67:29

this contraction.

67:30

>> Yeah. Where everybody realizes that

67:31

their paper gains, the gains they

67:33

thought they had on paper because

67:34

valuations went up, have just evaporated.

67:37

And when you see that, you have to

67:38

quickly count your pennies.

67:40

>> Yeah.

67:40

>> And get frugal

67:41

>> and pull back in again.

67:42

>> Pull back in again. Lay people off and

67:44

so on and so forth. So I I actually do

67:47

personally believe that we're probably

67:48

within 24 months of a pretty severe

67:51

contraction. And that won't just impact

67:53

these tech oligarchs, it'll impact all

67:55

of us in different ways.

67:57

>> Yeah, it's a boom and bust cycle. I mean

67:59

the only thing which we've experienced

68:00

in our own lives which is similar would

68:02

be the telecommunications bubble and

68:04

then the internet bubble between 1990

68:06

and 2001-2002.

68:09

um we don't get bubbles in the internet

68:11

anymore because that's now a stable

68:13

technology in that sense. Uh, but it

68:17

wasn't as big. This is much bigger.

68:17

>> What do we do as entrepreneurs, as team

68:21

members, and as companies? What do we do at

68:22

this moment if what you're saying is

68:24

correct that there will be a

68:26

>> a boom and bust

68:27

>> a boom and bust, which I think

68:29

every smart person that I've spoken to

68:30

agrees that there will be a bust soon.

68:32

>> Yeah.

68:33

>> Their timelines vary.

68:35

>> Yeah.

68:35

>> But what does one do right now in March,

68:39

April 2026 to prepare for this?

68:42

>> Well, you put money aside if you can.

68:43

You buy other assets you think are going

68:45

to survive the boom and bust cycle.

68:47

>> Like what?

68:47

>> That's the trouble. I mean, gold's been

68:49

driven up. Gold's now been driven down.

68:51

Uh people are buying Bitcoin, but

68:52

Bitcoin is collapsing as well. Uh, in

68:55

some ways you really can't. It's

68:58

like saying what do I do during an

68:59

earthquake to not fall over in terms of

69:01

insulating yourself I really can't see a

69:03

way of insulating yourself from the

69:05

downturn, but I'm not as

69:07

worried about that as the long-term

69:08

consequences of AI because this is the

69:12

first technology which implies you can

69:14

actually virtually eliminate labor as

69:16

necessary for producing output, because

69:18

you can use AI rather than clerks. And,

69:21

I know this is a long way from

69:24

being feasible, but robots could replace

69:26

process workers and then suddenly

69:29

something which employs 70% of the

69:31

global population is no longer

69:33

necessary. And then what do you do in

69:36

that situation? What I've seen which I

69:38

respect coming out of the tech bros in

69:41

America is they're talking in terms of

69:43

universal basic income.

69:45

>> You think a universal basic income is a

69:46

good idea? I we should probably explain

69:48

what that is.

69:49

>> Yeah. Well, it's that the state provides

69:52

everybody with enough money to stay

69:54

alive. That's the basic idea. Rather

69:56

than having to work for a living at the

69:58

minimum, you get paid an amount of money

70:01

that means you can buy the goods and

70:02

services that are necessary to stay

70:04

alive. You don't necessarily prosper,

70:07

but you get enough to survive. And so

70:09

that's the idea of UBI. Now, at the

70:11

moment, to survive, you've got to have a

70:12

job. And like that guy you mentioned is

70:14

working at three jobs right now.

70:16

if he got a UBI, he wouldn't have to

70:18

work at those three jobs. He might work

70:20

at one or he might actually consider his

70:22

own business possibilities in that

70:25

situation. So, I think universal basic

70:27

income is a necessity given what

70:30

robotics and AI can do to employment.

70:34

Every time I've tried to improve

70:35

something in my life, like my

70:36

businesses, my health, my relationships,

70:38

I've noticed that the biggest shifts

70:40

have come from being better informed.

70:42

And when it comes to our health, most of

70:43

us know very, very little. So when our

70:45

team was approached about partnering

70:47

with function health, it felt very much

70:49

aligned. Their team has developed a way

70:50

of giving you a full 360-degree view of

70:53

your health, many of the things that are

70:54

going on in your body in the form of

70:56

different tests. You do one blood draw

70:58

and it gives you access to over 160 lab

71:01

results, hormones, heart health,

71:03

inflammation, stress, toxins, the whole

71:06

picture. I use it and so have many of my

71:08

team members.

71:08

>> You sign up and you schedule your test

71:10

and once you're done, you get a little

71:11

report like the one I have here. I can

71:13

see my in-range results, my out of range

71:15

results, and there's a little AI

71:17

function, too. So, if I have any

71:18

questions about my out of range results,

71:20

I can just go in there and ask it any

71:22

question I want. And these tests are

71:24

backed by doctors and thousands of hours

71:25

of research.

71:26

>> It's $365 for a yearly membership. Go to

71:29

functionhealth.com/doac

71:32

and use the code DOAC25

71:34

for $25 off your membership. This is

71:37

something that I've made for you. I

71:39

realized that the Diary Of A CEO audience are

71:41

strivers. Whether it's in business or

71:43

health, we all have big goals that we

71:45

want to accomplish. And one of the

71:46

things I've learned is that when you aim

71:48

at the big big big goal, it can feel

71:51

incredibly psychologically uncomfortable

71:54

because it's kind of like being stood at

71:55

the foot of Mount Everest and looking

71:57

upwards. The way to accomplish your

71:59

goals is by breaking them down into tiny

72:02

small steps. And we call this in our

72:04

team the 1%. And actually this

72:05

philosophy is highly responsible for

72:08

much of our success here. So what we've

72:10

done so that you at home can accomplish

72:12

any big goal that you have is we've made

72:14

these 1% diaries and we released these

72:17

last year and they all sold out. So I

72:20

asked my team over and over again to

72:21

bring the diaries back but also to

72:22

introduce some new colors and to make

72:24

some minor tweaks to the diary. So now

72:26

we have a better range for you. So, if

72:30

you have a big goal in mind and you need

72:32

a framework and a process and some

72:34

motivation, then I highly recommend you

72:36

get one of these diaries before they all

72:38

sell out once again. And you can get

72:40

yours at the diary.com.

72:42

And if you want the link, the link is in

72:44

the description below.

72:45

>> And you think up to 50% of working-class

72:48

jobs could be wiped out because of AI

72:50

and robotics.

72:51

>> Yeah.

72:51

>> I mean, that's um that's been a

72:53

prediction from the leaders of some of

72:54

the biggest companies in AI. I heard the

72:56

the leader of Anthropic uh recently say

72:58

the same thing. He thinks 50% of jobs could

73:00

be wiped out. I think the shocking thing

73:02

that we've talked a lot about in the

73:03

show is just,

73:04

>> you know, there's been other sort of

73:05

economic or industrial revolutions in

73:07

the past that have caused job

73:09

displacement.

73:10

>> Yeah.

73:10

>> But none, I would argue at this speed.

73:13

>> No. And none that can replace virtually

73:14

everything. There's a

73:16

classic story I read back when

73:18

I was uh talking about the global

73:20

financial crisis. It came from a New

73:22

York Times article where they went to

73:25

interview workers in an air conditioning

73:27

factory. And there was one woman they

73:29

found there whose job it was to place a

73:31

thermocouple inside the air conditioning

73:34

units as they went past. So there's

73:36

3,000 of these going past her a day.

73:38

She's just placing one of these

73:40

thermocouples where it needs to go inside

73:41

the circuitry of the air conditioning

73:44

unit. And she said, "You don't have to

73:46

love your job as long as it pays you

73:48

money." It was totally boring job.

73:50

That's all she's doing. The thing is the

73:52

reason she got that job was that they couldn't

73:53

make a machine to replace her because

73:55

the air conditioning units don't

73:57

necessarily end up precisely at the same

73:59

point. To make a machine that would do

74:01

that is really difficult. Now if you

74:03

train a robot on it, the robot

74:05

perception can ultimately get to the

74:07

point where the robot can place that

74:09

piece inside there. That particular

74:10

unskilled job disappears. So people who

74:13

work in jobs like that no longer have a

74:15

possibility of getting a job. I think

74:17

even, you know, Anthropic released a

74:18

report. Anthropic are the makers of

74:20

Claude. They released a report saying

74:22

that entry-level positions, they're

74:24

seeing a 13% decline already in people

74:27

getting those entry-level jobs. And

74:28

actually, as an employer, someone that

74:30

spent literally all of last night, I was

74:31

looking through our inbox, our

74:32

recruitment inboxes at candidates and

74:34

talent.

74:35

>> I have noticed myself changing. I've

74:37

noticed that um

74:40

people that I would have given roles to

74:43

maybe six months ago,

74:44

>> yeah,

74:45

>> I now have to think long and hard about

74:47

whether there's going to be technology

74:49

that can do those exact roles instead.

74:52

And it was a really shocking thing. I

74:53

was saying to the team last night at

74:54

like 1:00 a.m. in the office, I was

74:55

like, this is a prime example of a

74:57

candidate. I was looking at this

74:58

particular candidate that 6 months ago I

75:01

would have bitten their hand off but now

75:04

>> I have to pause because my innovation

75:06

team in the corner of the office they're

75:08

they're able to do that now with these

75:10

AI agents instead and so I am you know

75:13

you hear a lot about the theoretical

75:15

impact of AI

75:16

>> but you're actually making the decision

75:18

yourself

75:19

>> and then it's theory it's theory it's

75:21

this thing on my Twitter feed like blah

75:22

blah blah whatever you hear on a podcast

75:24

you go blah blah blah whatever and then

75:26

you find yourself actually behaving that

75:28

way.

75:29

>> Your behavior is changing and you're

75:30

going, "Oh, it's very hard to know the

75:33

types of people to hire into our

75:35

company." And I've kind of almost

75:37

segmented them into these two groups

75:38

where you've got people that have very

75:40

deep expertise. Yeah,

75:41

>> I'd say it's three groups. People that

75:42

have very, very deep expertise on a

75:44

particular thing, you know, like my CFO.

75:46

Group number two, I'd say, are people

75:48

that are AI proficient,

75:50

>> who can actually handle this stuff and

75:52

be the people who manage the agents.

75:53

>> Yes. and they can redesign our workflows

75:56

across every department in the company

75:57

to be agentic

75:59

>> um the word about AI agents that's kind

76:01

of like the word you use so agentic

76:03

workflows and then the third group of

76:05

people are people who have skills that

76:07

are highly beneficial human to human and

76:09

in real life, so like human-to-human

76:11

sales people that deal with

76:12

relationships

76:13

>> and are very good at it

76:14

>> and are very good at it because there

76:16

are still a certain type of sale where

76:19

people want to meet the person shake

76:20

their hand and say okay you're

76:22

responsible for this deal

76:23

>> we're still in a situation where people

76:24

don't want agents to do that. Those are

76:26

like the three groups. What I didn't say

76:28

is young people who have

76:31

>> just come out of university, maybe don't

76:32

know anything about agents. They don't

76:34

have the deep expertise yet.

76:35

>> Yeah.

76:36

>> And when you look at the data, we'll

76:37

throw some of the data up on the screen.

76:39

It appears that these sorts of entry-level

76:42

white collar jobs are the ones that are

76:45

right now suffering. Yeah. Some of these

76:47

investment companies would hire like

76:48

300 or 400 analysts to look at um

76:51

companies and make decisions. That is

76:53

one example of a of a role that's very

76:55

at risk now. We've got an investment

76:56

fund. We need one analyst, Molly. 6

76:59

months ago, we were interviewing more

77:00

analysts. We now realize that we just

77:02

need Molly, and we need to give Molly

77:04

AI.

77:04

>> Yeah.

77:04

>> And she can set up I think Molly said to

77:06

me yesterday when I left the office at

77:07

when she left the office at midnight,

77:09

she's now set up three agents, these

77:11

AI agents

77:12

>> as her team. Those would have been three

77:14

people.

77:15

>> Well, I I saw a demo of that. Like I've

77:16

developed a software package called

77:17

Ravel uh which I've got one programmer

77:20

for and I I teach an online course as

77:22

well and I give Ravel as part of that

77:24

online course and one of the members of

77:26

the course said he's using an AI to

77:29

build Ravel models and he's also using an

77:31

AI to write code behind Ravel and he

77:34

gave a demo this a couple of days ago

77:36

and you know I watched it happen on

77:39

screen as he built a model a simulation

77:41

system and it was messy on one stage but

77:45

it produced the correct mathematics.

77:47

So he's showing you can actually he he's

77:48

trying to tell me that we should get my

77:51

main programmer to learn to drive agents

77:53

to do the whole thing. Now my main

77:55

programmer has said look there's things

77:57

that I can do that an AI cannot do. He's

77:59

one of your highly gifted people. And he

78:02

said it wouldn't be worth my while to

78:03

have me telling an AI what to do because

78:06

what I lose in terms of my own

78:08

initiative I just can't quite

78:10

balance out. Okay. But if he

78:12

hires a junior programmer, then the

78:13

junior programmer would be one who's

78:15

trained to drive the AI.

78:16

>> I I do think programmers are fine.

78:18

Actually, there was some stats that I

78:19

saw the other day that showed there's

78:20

been this huge demand and people trying

78:22

to hire programmers. It's interesting

78:24

because you hear stats from Spotify.

78:26

Spotify saying, "We haven't written a

78:27

human line of code since December."

78:29

>> And I'm very good friends with the guys

78:30

at Spotify. I was actually with the CEO

78:31

the other day, a couple of weeks ago in

78:32

in Austin. And you you hear that and I

78:34

did check that with them. That's true.

78:37

>> So, you assume that that means we don't

78:38

need programmers anymore. But if you

78:40

think about like Jevons paradox, when

78:42

something becomes, you know, Jevons

78:43

paradox is the old analogy,

78:45

>> cheaper, you use more of it. Yeah.

78:46

>> Yeah. So like when coal became cheaper,

78:48

people were worried that maybe the coal

78:50

industry was out of business or trains,

78:52

whatever. But actually what ended up

78:53

happening is people just drove more

78:55

trains and they used them for other

78:56

things like transport. And the same

78:58

applies, I think, for AI. When

79:00

>> creating technology becomes easier,

79:02

every company starts using more

79:04

technology. So media companies, lawyers,

79:07

you name the company, executive

79:09

assistants, they all become coders. And

79:11

actually the demand for

79:13

really anyone who knows how to code or

79:16

program it,

79:17

>> we're seeing it. It's exploding.
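The Jevons paradox argument made here can be sketched numerically. The toy model below is an illustration only (a constant-elasticity demand curve with hypothetical numbers, not anything from the conversation): when demand for a service is elastic enough, doubling efficiency raises total resource use instead of lowering it.

```python
def total_energy(efficiency: float, elasticity: float,
                 k: float = 100.0, energy_price: float = 1.0) -> float:
    # Cost of one unit of service falls as efficiency rises.
    service_price = energy_price / efficiency
    # Constant-elasticity demand: cheaper service -> more service consumed.
    demand = k * service_price ** (-elasticity)
    # Each unit of service still needs 1/efficiency units of energy.
    return demand / efficiency

before = total_energy(efficiency=1.0, elasticity=1.5)
after = total_energy(efficiency=2.0, elasticity=1.5)
# With elasticity > 1, doubling efficiency INCREASES total energy use:
# after / before = 2 ** (elasticity - 1), about 1.41 here.
```

The same logic is the claim being made about code: as programming gets cheaper, demand for programming can grow faster than the cost falls.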

79:18

>> Yeah.

79:20

>> But I just think the job disruption in

79:22

the near term for most people is going to be

79:23

pretty

79:24

>> Yeah. I mean there's ways in which AI

79:27

and robotics should be welcomed

79:29

>> because it means the possibility exists

79:33

and it's only a possibility that we can

79:35

no longer have to be exploited to get an

79:37

income because if you look at the

79:39

Marxist attitude towards capitalism, it's

73:42

that capitalists exploit the

79:44

workers. Okay. Um the real world is

79:46

we've been exploiting energy

79:48

>> mutually. Both labor and capital exploit

79:50

energy. We could have a future where we

79:52

don't have to work for a living and

79:54

therefore you could do what you want to

79:56

do for a living. That it's a Star Trek

79:58

future. That's that's the possibility

80:00

that it promises. But at the same time,

80:03

uh it could actually eliminate the jobs

80:05

that people currently rely upon. And

80:08

what I fear is we have two

80:10

possibilities. We have a Star Trek

80:12

future where you have

80:14

replicators that make the goods and we

80:16

consume, and we all live an energy-

80:18

abundant life, or the Hunger Games

80:22

where there's one little elite that

80:24

has all the robots and lives extremely

80:26

well and we tolerate and oppress the

80:29

vast majority and they end up you know

80:31

Hunger Games entertainment. Those are the

80:34

two possibilities we face

80:36

>> I do think the cost of goods and

80:37

services will come down which is great

80:39

>> I think robotics you know if Elon is

80:41

right and I often say with Elon like his

80:43

timelines are not always accurate but he

80:45

does tend to deliver magic

80:47

>> he ultimately delivers but it you know

80:49

he always overpromises and delivers

80:51

later than he plans.

80:53

>> And if he's right about when he says

80:55

there's going to be more humanoid

80:56

robots, his Optimus robots, than humans,

80:59

and he says also in his predictions that

81:02

there's going to be no need to study to

81:03

be a surgeon because the robots are

81:04

going to be so much more uh advanced and

81:07

better than any living surgeon, that

81:09

would imply that surgery and other sort

81:11

of medical diagnoses and procedures are

81:14

going to be incredibly cheap, incredibly

81:15

quick. Great.

81:16

>> How do you pay for them? That's the next

81:18

question. Yeah. So, yeah. How do you pay

81:21

for them? And do people want to, you

81:22

know,

81:23

>> it's also it's also the physical

81:24

requirements. I mean, the amount of

81:26

copper inside a robot, you're talking,

81:29

you know, several kilos per robot. Um,

81:32

do we have enough to produce 8 billion

81:34

of them?

81:35

>> And maybe, you know, surgeons do much

81:36

more than just operate.

81:38

>> Yeah.

81:38

>> There's a human element to the medical

81:40

profession, which I think is sometimes

81:40

unappreciated. Like, I would I don't

81:42

know if I'm quite ready to go talk to a

81:44

robot about my health yet. Maybe I'll

81:45

adjust. I wanted to come back to

81:46

something you actually said earlier. You

81:48

talked about Bitcoin briefly. I've heard

81:49

you say that you think Bitcoin is going

81:51

to zero.

81:52

>> Yeah.

81:52

>> Okay. This is worrying. I think I have

81:54

some Bitcoin.

81:56

>> You're an economist. You're saying that

81:57

Bitcoin is going to zero. Why?

82:00

>> Because ultimately because of its

82:01

reliance upon energy. I mean I you know

82:04

you Max you Max Kaiser and Stacy

82:06

Herbert. Have you met them at all? No.

82:07

They were sort of the original

82:08

proselytizers for Bitcoin and they're now

82:11

living in I think El Salvador um which

82:15

has adopted Bitcoin as a form of

82:16

currency. When they told me about

82:19

Bitcoin, I could have bought it for a

82:20

pound a bitcoin, which would have meant,

82:22

bloody hell, I would be

82:23

wealthier than you if I'd done that. The

82:25

reason I didn't was they explained that

82:27

the way that the public ledger is kept

82:29

safe is that it takes too much energy to

82:32

break it. So each transaction requires

82:35

10 minutes of computer processing time

82:38

globally by the looks of it to actually

82:40

create an extra bitcoin and that means

82:43

it's too expensive for somebody to try

82:45

to break the ledger. That means it's got

82:47

a huge requirement for energy use and I

82:52

believe knowing what I know from climate

82:54

scientists that at some point we're

82:56

going to realize we're using far too

82:58

much energy on the planet. We've got to

83:00

cut the energy consumption and the two

83:02

easiest things to cut out to reduce

83:04

energy consumption are cryptocurrencies

83:06

and international travel.
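The mechanism being described is proof-of-work, and it can be illustrated with a toy miner. This is a sketch, not Bitcoin's actual implementation (real mining double-hashes an 80-byte block header against an adjusting difficulty target, and the roughly ten-minute figure applies per block, not per transaction). The point it shows is the energy argument: each extra bit of difficulty doubles the expected number of hashes, which is exactly the cost that makes rewriting the ledger uneconomic.

```python
import hashlib

def mine(data: str, difficulty_bits: int) -> tuple[int, str]:
    """Find a nonce so that SHA-256(data + nonce) falls below a target.
    Expected attempts grow as 2 ** difficulty_bits, so each added bit
    doubles the work (and energy) needed to forge the ledger."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

# Cheap to verify (one hash), expensive to produce (~65,000 hashes here).
nonce, digest = mine("block payload", difficulty_bits=16)
```

The asymmetry is the design choice: honest verification is one hash, but an attacker who wants to rewrite history must redo the accumulated work of every block, which is where the enormous energy footprint comes from.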

83:08

>> But aren't you saying that, you know,

83:09

nuclear energy is becoming vogue again?

83:12

And they're talking a lot, you know,

83:13

about

83:14

>> it's the amount of time it takes to

83:15

build that stuff. I mean, China is

83:17

building nuclear power stations at a

83:19

hell of a rate and much, much

83:22

more cheaply than America is doing.

83:23

>> Solar has become a big topic of

83:25

conversation.

83:26

>> Yeah. Again, there's a guy called Simon

83:28

Michaux, whom I recommend you get in

83:30

touch with as well. And Simon is an

83:32

engineer who claims that we simply don't

83:34

have the physical minerals necessary to

83:37

support a completely solar and

83:39

wind-based

83:41

energy system. He's got people who

83:43

criticize his analysis definitely, but

83:46

we still are using far more physical

83:49

resources than we're aware of at the

83:51

moment on the planet. And the

83:53

availability of various critical

83:55

elements that we need for the system we

83:58

have right now, it's much less abundant

84:01

than we would like it to be. So, a lot

84:03

of these things about, you know,

84:04

robotics taken over, do we have the

84:06

minerals for it? Solar power, do we have

84:09

the minerals? The answer is not

84:11

always yes. Okay. Sometimes the answer is

84:14

no. Other times it's dubious. But I

84:17

think that energy requirement alone is a

84:19

problem.

84:19

>> You're saying that we're going to have

84:20

we're going to have to cut back on our

84:22

energy consumption.

84:23

>> But I mean the direction of travel has

84:24

been we've been able to produce more and

84:26

more and more and more energy

84:27

>> and we're dumping it into the

84:28

environment. The planet the problem

84:30

about the use of energy is it's

84:32

happening on a planet. Okay. Can the

84:35

biosphere cope with the waste that we

84:37

dump into it as a result of using that

84:40

energy? And that is something which

84:41

economists are completely stupid on

84:44

beyond stupid. They've trivialized the

84:46

dangers of the amount of resources we

84:49

use and the amount of energy we use. So

84:51

I don't think that energy future is

84:53

possible on this biosphere at the

84:54

moment. It's possible in the future if

84:57

we get off the biosphere. So in that

84:59

sense I'm even more of a space cadet

85:00

than Elon Musk is. I think we have to

85:03

plan to take production off planet, but

85:05

while we're constrained on the

85:06

biosphere, the biosphere's constraints

85:08

will stop us using as much energy as we

85:11

would wish to use.

85:12

>> What are you what are your closing

85:13

statements on this whole situation with

85:15

the war and Iran and everything that's

85:16

going on from a geopolitical

85:18

perspective?

85:19

>> Basic thing is our system is far more

85:20

fragile than we've convinced ourselves that

85:23

it is. And we can make observations

85:26

about potential futures which presume a

85:28

robustness we don't have. And if that

85:31

robustness is destroyed either by

85:35

military conflict or by overextending

85:37

what we put into the biosphere, then we

85:40

can fall off what's called the Seneca

85:42

cliff. We can go from an abundant future

85:44

to a collapse.

85:45

>> And what would you say the people at

85:46

home should be doing to course correct

85:49

the path that you think we're on?

85:52

>> Stop electing fools.

85:54

Um, electing Trump was an enormous

85:56

mistake. We've got politicians who

85:59

follow what's called neoliberal

86:02

political philosophies that have put

86:04

us in this problem. It hasn't worked. We

86:06

need to reverse back to having a

86:08

human-oriented and physically realistic

86:11

view of how the economy should

86:14

be managed and how the biosphere should

86:16

be managed. We have to take care of our

86:18

home and in an essential sense we're

86:20

destroying our home and thinking we can

86:22

keep on doing that indefinitely. We

86:24

can't. Our home is planet Earth. Planet

86:27

Earth has got physical constraints. We

86:30

haven't respected them. Planet earth

86:32

will tell us what it thinks of that this

86:34

century.

86:34

>> And which leaders do you think we should

86:36

be electing? Do you think we should be

86:37

electing?

86:38

>> I don't think I I think even electing

86:39

leaders itself is a mistake because what

86:42

we then do is end up getting we we

86:46

pander to narcissists. We pander to

86:48

people who believe they can solve all

86:50

our problems. We end up with

86:52

megalomaniacs making decisions. If you

86:55

look back at where Athenian democracy

86:58

came from, Athenian democracy didn't

87:00

use elections. It used a process of like

87:03

random number generators to select

87:05

intelligent people to fulfill essential

87:08

roles in those societies. And they they

87:12

weren't even people you got to know by

87:13

name in that sense. We know Trump here,

87:15

we know Starmer here, we have Albanese

87:18

over here. We end up getting narcissists

87:20

and megalomaniacs

87:22

directing us and they're the last people

87:24

you need to make decisions.

87:26

>> When you're thinking about your own

87:27

money as an economist, what are you

87:29

doing to protect your

87:30

>> I'm not doing much. I mean, I I've been

87:33

I've been a a crusader for reforming

87:35

economic theory. For my whole life, I've

87:38

sort of neglected this side of things to

87:41

my detriment, I've got to say. Um, but I

87:46

really am focused on what's sustainable

87:47

for everybody rather than what I can

87:49

make as my own cut. And I don't think

87:51

we've got a sustainable economy at the

87:53

moment. We have a philosophy of

87:55

economics which leads to breakdowns.

87:57

>> I'm asking that cuz I've got so many

87:58

friends and listeners that ask me often

88:00

like, should I be buying a house right

88:01

now? Do you think I should be investing

88:02

in gold? Do you think I should be saving

88:04

my money? Should I be, I don't know,

88:06

investing in technology companies?

88:07

>> Yeah.

88:08

>> And I'm wondering if you had a

88:09

perspective for them.

88:10

>> Not on that. No. like I've I've really

88:12

left that area alone. I'm I'm actually

88:14

looking at the overall system and saying

88:16

how do we make the system sustainable so

88:18

that people can live within it and what

88:20

we've got is an unsustainable system and

88:22

you're asking me how do people survive

88:24

within an unsustainable system? Answer

88:26

is they don't. We always think we can do

88:28

something at the individual level to

88:29

cope with what's happening in the system

88:31

around us that only works if the system

88:33

around us is stable.

88:34

>> What is a better system then? Uh I think

88:37

I think what China has done is a better

88:39

in a better direction. They have a they

88:42

have a collective focus as well as an

88:44

individual focus.

88:45

>> What's their system called?

88:46

>> It's called communist.

88:47

>> So you think communism is better than

88:49

capitalism?

88:50

>> I think a system which reflects the need

88:53

for a cohesive society as well as

88:56

individual gain is needed and the system

88:59

in China is closer to that than the

89:01

system in America. in in China. Listen,

89:03

I don't know a ton about this, but they

89:05

have a leader who stays in power and

89:07

>> that's one that's the potential weakness

89:09

>> and suppresses the people's decision-making and

89:12

entrepreneurialism.

89:13

>> Equally, you've got a system to get into

89:15

the Communist Party. You've got to have

89:18

uh, you've got to be highly educated to

89:20

get in and you have to perform to some

89:23

extent in the region in which you begin

89:25

your role.

89:26

>> But you're not saying you think the West

89:27

should adopt communism, are you? No, I'm

89:29

saying that the West should adopt a

89:30

system which reflects the need for a

89:33

cohesive society.

89:34

>> Is that socialism?

89:35

>> Socialism is closer to it. I mean I the

89:37

words are all tainted. Okay. If you go

89:40

back, you know, do you eat Cadbury's

89:42

chocolate?

89:43

>> I try not to.

89:44

>> You have, haven't you? Okay. Cadbury's

89:46

was a socialist enterprise. Okay. It was

89:48

formed on the belief that its workers

89:51

should work in the best possible

89:52

conditions while also selling a

89:54

profitable product. Mondragon in Spain is

89:57

another cooperative started by a

89:58

Catholic priest. Uh of all things we

90:01

tend to be very binary in the west. We

90:03

say you either have competition or you

90:05

have cooperation. Okay. Well, you need

90:07

to be more like the east in the sense of

90:09

the idea of yin and yang. You have to

90:11

have both. Okay, cooperation and

90:13

competition.

90:14

>> And so that view is the closest thing is

90:16

socialism.

90:16

>> The closest to socialism. And what China

90:19

has done it better than Russia. You go

90:21

back to the USSR. Uh, they

90:24

were disastrous in terms of product

90:25

development. China's been extremely

90:27

successful on that front. They've

90:29

learned from the mistakes of being too

90:31

centralized and too top down in Russia

90:33

to have both the top down and the bottom

90:35

up dynamic going on.

90:37

>> What's wrong with capitalism? And

90:38

capitalism is what the UK and the US

90:40

have adopted as their sort of economic

90:42

model.

90:42

>> It's seeing competition absolutely

90:45

ruling and ignoring cooperation. Now the

90:47

real the successful society combines

90:50

both. You have cooperation, you also

90:52

have competition. And we've pushed it

90:54

far too far in the competitive end and

90:56

not enough in the cooperative. And what

90:58

comes out of that as well is that the

91:01

competitive tends to be short-term

91:02

focused. What can I make a profit out of

91:04

in time, such that with the money I've

91:06

borrowed I'm going to be able to make

91:09

more of a profit than the interest I'm

91:11

paying on the money I've created.

91:13

And the longer it

91:14

takes to get the repayment, the less

91:17

likely you are to make the investment.

91:18

So what you get is a focus upon

91:20

short-term with just a market system

91:23

whereas with the long term you say

91:24

what's going to last for 100 years and

91:27

and what that means is you build

91:28

the infrastructure for the long term

91:30

while you allow competition to occur in

91:32

the short term. It's getting the balance

91:34

right. We've got the balance extremely

91:36

wrong.

91:37

>> Professor Steve, thank you. I highly

91:40

recommend people go check out your

91:42

YouTube channel where you make videos

91:43

all the time about what's going on in

91:44

the world, to give your opinion on

91:46

economic issues, political issues, the

91:47

Iran war. So, if people are listening

91:49

and they want to learn more from

91:50

Professor Steve, then look down below

91:52

and you should see his YouTube channel

91:54

linked um next to our name because we're

91:56

going to try and collaborate on this

91:58

post and I'll put the link to

91:59

your channel in the description below

92:01

for anyone that wants to check you out

92:02

and subscribe. It's so fascinating,

92:04

especially the stuff around the

92:05

raw materials coming out of the

92:06

Strait of Hormuz because I really had no

92:08

idea. I it's just it's quite staggering

92:10

to me that we're so dependent on one

92:12

region of the world and I think from

92:13

watching your videos over the last

92:14

couple of weeks,

92:15

>> it's really made me understand the

92:17

unintended consequences of war

92:19

generally, but specifically this war in

92:21

Iran.

92:22

>> Um, so thank you for turning the lights

92:23

on for me. I really, really appreciate

92:25

this and I hope we can meet again soon

92:26

and have a conversation and hopefully,

92:29

you know, this all resolves itself in a

92:31

way that's good for everybody.

92:32

>> I hope so. I I'm having my 73rd birthday

92:34

tomorrow. Oh, I might have 74th as well,

92:36

but I think there's a question mark over

92:38

that now.

92:39

>> Well, I did hear it was your birthday

92:42

tomorrow.

92:43

>> I think the team have gotten you a

92:45

little something.

92:46

>> Okay.

92:47

>> Happy birthday

92:48

to you.

92:50

>> I'm embarrassed.

92:51

>> Happy birthday.

92:53

>> Oh my god.

92:56

>> Happy birthday.

92:59

>> My god. Thank you.

93:01

Happy birthday to you.

93:05

>> Holy hell. I'm missing.

93:07

>> Thank you. Should I blow the candles

93:08

out?

93:08

>> Yes, you should.

93:10

>> Okay, you get a wish.

93:11

>> You blew them all out, so you get a

93:12

wish.

93:14

>> Well, I wish for peace in the Middle

93:16

East.

93:16

>> Okay,

93:17

>> that's probably the main thing to say

93:18

about right now.

93:18

>> That is a gorgeous cake. I have to say

93:20

>> it's a marvelous cake. Yeah,

93:21

>> that's marking our own homework, but

93:23

>> this better be eaten by the crew cuz I'm

93:25

not going to eat all this myself. Okay,

93:27

you want to get out a knife and start

93:28

slicing up? My god,

93:30

>> thank you. Thank you so much. We're

93:31

done.

93:31

>> Thank you.

93:31

>> YouTube have this new crazy algorithm

93:33

where they know exactly what video you

93:35

would like to watch next based on AI and

93:37

all of your viewing behavior. And the

93:39

algorithm says that this video is the

93:42

perfect video for you. It's different

93:44

for everybody looking right now. Check

93:46

this video out and I bet you you might

93:48

love

Interactive Summary

Professor Steve outlines the critical global implications of the ongoing conflict involving the US, Israel, and Iran. He presents five scenarios for the war's conclusion, highlighting Iran's significant preparedness. A major concern is the Strait of Hormuz, a choke point vital for 20-30% of global oil, fertilizer, and helium supplies, whose disruption could lead to worldwide famine and a collapse in semiconductor production. Professor Steve characterizes Trump's actions as a "pump and dump" scheme to manipulate oil prices for personal gain, and notes the US President's sole authority to launch nuclear weapons as a terrifying prospect. He predicts an imminent AI-driven economic bust, causing widespread job displacement, and advocates for Universal Basic Income and a shift from competitive capitalism towards a more cooperative, long-term focused economic system.
