Joe Rogan Experience #2418 - Chris Williamson

Transcript

0:01

Joe Rogan podcast. Check it out.

0:03

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day.

0:14

[laughter]

0:14

>> Just feel a bit less [ __ ] about myself

0:16

to stave off death.

0:18

>> Well, doesn't it do something for your

0:19

mind? Doesn't it help you?

0:21

>> Yeah, of course it Of course it does.

0:23

But when you compare it with life and

0:25

death, there's a little bit of a

0:27

difference.

0:27

>> Oh, yeah. Yeah, for sure. Yeah, there's

0:30

a def definitely a difference, but uh

0:32

just for mental health, that's the main

0:35

reason to do it for me. It's mental

0:37

health. It's it's such a difference

0:39

between not doing it and doing it.

0:40

>> Mhm.

0:41

>> Like two different totally different

0:43

people. You got notes on that thing or

0:44

something.

0:44

>> Always.

0:45

>> You got to get one of these babies.

0:46

Little kickstand jammies. Those are

0:49

[ __ ]

0:49

>> Oh, sexy. Look at that.

0:51

>> Sexy. Sexy.

0:53

>> Yeah.

0:53

>> All right. All right.

0:54

>> Encourages you to waste your time

0:56

watching YouTube videos.

0:57

>> Yeah. Without having to hold it

0:58

>> because it props up. Yeah. Beautiful.

1:00

[snorts] You feel like a fool sitting

1:01

there staring at your camera, holding it

1:03

in your hand. I always said like if

1:04

there was a drug that made people stare

1:05

at their hand for six hours a day,

1:07

everybody would be like, "Oh my god,

1:08

where was this really a problem in this

1:10

country?" People were just staring at

1:11

their hands.

1:12

>> Well, we looked at that last time that

1:14

we were on. We had uh the photo of that

1:16

that guy that artist that had taken

1:18

images of people looking at their

1:19

phones.

1:20

>> Yes. With no with no phone

1:21

>> and then removed the phones. [laughter]

1:24

>> It's such a crazy thing we're doing. And

1:26

now, of course, there's AR glasses that

1:29

are eventually going to put whatever

1:31

TikTok feed in like one eye where you're

1:33

watching someone in the other eye.

1:34

[laughter]

1:35

>> Have you ever tried those?

1:36

>> I've messed around with them a little

1:38

bit. Uh Zuck was here and uh he let me

1:40

try the new ones that haven't been

1:42

released yet. They were really

1:43

interesting. And you're you move a

1:46

cursor around with your eyeballs and you

1:48

can do things with your fingers. You can

1:50

pinch and and spread things and stuff

1:52

with your fingers and and play games

1:54

with your fingers. You can like It's not

1:57

quite as responsive as you'd like it to

1:59

be, but it's very beta,

2:01

>> you know.

2:01

>> [ __ ] It's

2:02

>> pretty cool.

2:03

>> It is pretty cool.

2:04

>> But also, we're losing humanity. We're

2:06

going to [laughter] we're going to be

2:08

taken in. We're going to incorporate

2:10

with the machine. [snorts]

2:12

>> Yeah. Well, I don't know. I think a lot

2:14

of people feel like that would be a

2:15

better version of the life that they

2:16

have. And that's the saddest thing that

2:19

um people people of older generations

2:21

look at young guys and girls and how

2:24

much time they spend online and they

2:25

think this is ridiculous. Why are they

2:27

sp why are they caring so much about

2:30

what is occurring on the internet? But

2:31

they don't realize people spend more

2:33

time on screens than they do asleep. So

2:36

the digital world is the real world for

2:39

these people. Like the digital world is

2:40

more real than the real world is.

2:42

>> Ooh, I didn't think of it that way.

2:45

There are a lot of people that do spend

2:47

more time on screens than they do

2:49

asleep. That's really common. Yeah. I

2:51

like to balance that out. I'd like to

2:54

spend half as much time on my phone as I

2:56

do asleep. [laughter]

2:59

>> Well, that would be a good way to

3:00

enforce it, right? You have to you log

3:02

how much sleep time you've had and then

3:03

>> So, I'm going to start sleeping 12 hours

3:04

a day. So, I [laughter]

3:07

>> six six hours wasting. It's quite a

3:10

resource if you think about it. like a

3:12

like an a lack of an appreciation of

3:15

your resource because the resource of

3:17

your time and your attention. It's very

3:19

valuable and you can convert it into all

3:21

sorts of amazing skills and information

3:25

and you you know knowledge and change

3:27

your whole life or you can just stare at

3:31

stupid [ __ ] all day long.

3:32

>> It's so compelling though, dude. It's

3:34

been designed by the

3:36

>> most profitable companies on the planet

3:38

with the smartest behavioral scientists

3:40

in history. Like it's an unfair fight.

3:42

It really is an unfair fight and that's

3:43

why

3:45

>> sort of you could not do it though.

3:48

>> Oh, you need to lean in. But it's like

3:50

there is an there is way more willpower

3:52

you need to use in order to be able to

3:54

not than like just whatever the course

3:57

of natural human history is or natural

3:59

human behavior. It's so easy to or

4:02

alternatively

4:03

you could dye the Venice River green.

4:06

[laughter]

4:08

That's what happens when you don't have

4:09

enough phone battery.

4:11

>> I said that to Chris today. Greta

4:13

Thunberg, she dyed the Venice canals

4:17

green to protest what a lack of action

4:21

and and climate change.

4:22

>> Yeah. Pull back a a a call to pull back

4:26

carbon

4:28

fuel in Europe. And they didn't just do

4:30

it in Venice, they did it at 10 cities

4:32

around Italy. But Venice has obviously

4:34

got this gorgeous waterway. It's entire

4:37

city built on water,

4:40

>> bro. Yeah, that's hard to see how ugly

4:43

it is. Jamie, I can send you a video of

4:45

it. She cuz I sent Chris a video. It's

4:48

so, you know, it's just like, how much

4:50

attention do you need, lady? Okay, stop.

4:55

>> Sky News Australia refers to her as a

4:56

Swedish doom goblin. [laughter]

5:01

Sky News is the one that's weirdly pro-

5:04

Republican American politics.

5:06

>> Super rightwing.

5:07

>> It's like, who's funding that? There's

5:09

no way that there's that much of an

5:11

appetite in Australia for American

5:13

politics. So that's what it looks like.

5:15

That's disgusting. I was there this

5:17

summer. It's [ __ ] beautiful. It's so

5:21

in Venice is so gorgeous and so ancient

5:24

and so interesting. And to have this

5:26

self-important [ __ ] pour a bunch of

5:29

green dye into that water, you should go

5:32

to jail for that. Like, you're you're

5:34

ruining this experience for thousands

5:37

and thousands of people who don't not

5:40

just the ones who live in that amazing

5:42

place, but the ones who get to visit. I

5:44

mean, someone figured out a way to make

5:47

a whole city by shoving pylons into the

5:51

ground. And they did it a long time ago.

5:54

It's all wood. The whole city is stacked

5:56

up on wood. They take these wood poles,

5:59

they shove them into the ground. It's a

6:01

specific type of wood that doesn't rot

6:04

when it gets wet and water logged that

6:06

actually hardens. I forget what kind of

6:07

wood it is. They I watched this whole

6:09

thing on it, but I mean it's very

6:12

stable. I mean sometimes they get some

6:14

flooding. Like one time we were there

6:16

and like the the lobby of this place was

6:18

flooded. M it does flood, but it's also

6:21

it's so [ __ ] beautiful and the

6:23

architecture is so amazing. It's such a

6:25

gorgeous place and it just relaxes you

6:27

like instantly when you're there. You're

6:29

like, "Wow, I just want to have a

6:30

espresso and eat some pasta and just

6:33

chill out

6:33

>> last summer. It's one of the most

6:34

beautiful places I've ever been

6:36

>> and this [ __ ] dummy decides to just

6:38

pour green dye." And how much green dye

6:41

did you put in there? And what kind of

6:42

an effect is that going to have on life?

6:44

>> So, they claimed that it was

6:46

environmentally safe. Rah. I don't know

6:48

how environmentally safe anything of

6:49

that green color can be. Uh but yeah,

6:51

what was it? 48 hour ban and a $170

6:55

fine.

6:56

>> That's it.

6:56

>> Yeah.

6:57

>> Yeah.

6:58

>> Wow.

6:59

>> I should you should go to jail for a

7:00

night.

7:01

>> I think about this a lot, man. The um

7:04

in some ways I understand why the

7:07

rhetoric gets more and more

7:09

inflammatory. So if you care about an

7:13

issue, if you really really think that

7:14

this issue is important

7:16

>> and people don't listen, you start to

7:19

shout a bit louder and then you shout a

7:21

bit louder and then you shout a bit

7:22

louder.

7:22

>> The British are coming.

7:23

>> The British are coming. You know who

7:24

first said that?

7:25

>> Wasn't Paul Revere?

7:26

>> Bonnie Blue.

7:27

>> Who's that? [laughter]

7:30

>> She is the lady that slept with 1,057 men

7:33

in a day.

7:33

>> Oh, that poor lady.

7:35

>> Yeah. Um, so people don't listen. Do you

7:38

ever see uh Don't Look Up, that movie on

7:39

Netflix.

7:40

>> Funny. By the way, I missed that joke

7:41

because I didn't know Jamie got it who

7:43

that was. [laughter]

7:45

>> Jamie got it from over there even with the

7:46

>> kind of proud that I can't recognize her

7:48

name though, honestly. I'll take that.

7:49

>> Yeah, it's probably a good sign. Um

7:52

people don't So, don't look up that film

7:54

with Leonardo DiCaprio a couple of years

7:56

ago. You remember it was like an

7:57

asteroid coming in.

7:58

>> Oh, yeah. Yeah. Yeah. Yeah.

8:00

>> And um

8:00

>> it's a funny movie, right?

8:01

>> Kind like half funny, but kind of. It's

8:03

supposed to be a comment on the

8:05

impending doom of climate change and

8:07

nobody's listening, right?

8:08

>> Yeah. They're not correct. That's the

8:11

problem. You know, who are those two

8:12

gentlemen that we had in recently,

8:13

Jamie? The guy from MIT and the other

8:16

guy from was he from Yale or Stanford?

8:19

Where was he from? Anyway, these two

8:22

brilliant scientists who have analyzed

8:24

the data and one of them was going over

8:27

the actual

8:30

understanding the equations that you

8:32

would need to understand in order to

8:34

really be able to calculate what is

8:36

having an effect on the climate and what

8:38

is how many different factors there are

8:40

and all of them working synergistically

8:44

in some weird unexplainable way. And

8:46

then the cold hard reality of climate

8:48

data over the past x amount of millions

8:51

of years where it's always done this

8:53

glaciation and then the gl the glaciers

8:57

they recede and then you get higher

8:59

ocean levels. It's like constant. Every

9:01

12,500 plus years it goes up and down

9:04

and up and down and it never stays

9:06

static ever. It's never static. And the

9:08

real fear is not global warming. The

9:12

real fear is global cooling.

9:14

>> Why? Global cooling kills everything.

9:17

And we got that close at one point in

9:19

history to having such a low oxygen

9:21

level at this pl on this planet and such

9:24

a low carbon dioxide level because there

9:27

was no plant food, right, that these

9:30

[ __ ] plants almost died. We almost

9:32

lost all life on this planet. We've gone

9:35

like a few degrees from that happening.

9:38

This is a Glaciers are [ __ ] scary.

9:42

Ice ages are scary. When it gets warm,

9:45

you just move. And I know that sucks if

9:49

you're living in a city of 20 million

9:51

people, but it hasn't happened yet. And

9:53

they've been talking about it forever.

9:55

That [ __ ] stupid movie, An

9:57

Inconvenient Truth, was wrong about

10:00

everything. He should have to give back

10:02

every [ __ ] penny he made from that

10:04

movie. You were wrong about everything.

10:08

You scared the [ __ ] out of everybody.

10:10

And you were 100% wrong. One of the

10:12

problems I think people have is if you

10:14

really care about something and you're

10:16

convinced whether your conviction is

10:17

incorrect or not, you're convinced by

10:19

it. So what you do, you say a thing,

10:21

people don't listen,

10:22

>> right?

10:23

>> Say it a bit louder, people still don't

10:24

listen. Say it a bit louder again,

10:25

people still aren't listening. And the

10:27

problem is it's a misunderstanding about

10:29

what

10:31

compels and convinces other humans. Uh

10:34

what we think is if people aren't

10:35

listening, if I shout louder, they're

10:36

going to pay attention. What we don't

10:38

realize is that actually turns everybody

10:41

off because if you just see someone

10:43

throwing soup over a Van Gogh painting,

10:46

uh, turning the canals of Venice green,

10:49

gluing themselves to the M25 in London

10:52

and stopping people from being able to

10:53

get to work. Like, it gets attention,

10:56

but you're not looking for attention.

10:58

You're looking for conviction. You're

10:59

trying to compel people to believe the

11:01

thing that you believe. And I think that

11:03

it does the opposite. And I understand

11:05

why it's so seductive because you think

11:08

making it's cool to your own side to do

11:12

something uh flaming sword wielding

11:14

truthteller. I'm going to charge through

11:16

and look at how cool it is. But making

11:19

somebody feel stupid or embarrassed or

11:21

inconvenienced or upset is a really bad

11:23

way to change minds. So I think if

11:27

people really care about changing minds,

11:30

they need to realize and assuming that

11:32

they think that they're correct, they

11:34

need to realize that like intellectual

11:35

chasm from where they are and where

11:37

other people are and you go, "Okay, I'm

11:39

going to take you one step at a time."

11:40

So even if you were to accept that the

11:43

science and all of the stuff that the

11:44

climate change people believe in is

11:46

accurate, I still think that the

11:48

strategies that they're using aren't

11:50

going to be effective because I think it

11:52

turns more people off,

11:53

>> right? And

11:54

>> they're scolding.

11:55

>> They're shrieking scolding. And they're

11:57

not the type of people that you want to

11:59

talk to, so you avoid them. Ho ho.

12:02

Looking down from looking down from

12:04

Yeah, it's it's my British heritage.

12:05

[laughter]

12:06

>> Um,

12:08

it's It doesn't cause you to feel

12:11

inclined to support them.

12:13

>> The opposite. It causes you to want to

12:15

burn tires.

12:15

>> Yeah.

12:16

>> I want to buy spray paint and [ __ ]

12:18

hairspray and just blow it by my car.

12:20

>> Have you heard of the Cassandra Complex?

12:22

Do you know what this is? No.

12:23

>> [ __ ] brilliant, dude. So, uh, in

12:25

ancient Greek mythology, Cassandra is,

12:28

uh, given the gift of being able to see

12:30

the future by Apollo, and then she

12:32

rejects his advances. So, he curses her,

12:35

and he says that for the rest of time,

12:36

you're still going to be able to see the

12:37

future, but people aren't going to

12:39

believe you.

12:40

>> So, she foresees the downfall of Troy.

12:44

She warns everybody, people don't

12:46

listen. Troy burns anyway. And it's

12:49

basically being right, but early. So,

12:52

uh, Rachel Carson, she wrote that book,

12:54

Silent Spring, 1962. It's about, um, uh,

12:57

DDT, environmental epidemics.

13:00

>> She gets mocked by scientists,

13:02

castigated by everybody, but her work

13:05

led to the banning of DDT.

13:07

>> What year was this?

13:07

>> 1962.

13:08

>> Interesting.

13:09

>> Uh, Ignaz Semmelweis, like 1840s, he

13:13

realizes that doctors are transmitting

13:15

childbed fever from corpses to mothers

13:18

because they're not washing their hands.

13:20

M

13:20

>> so he begs his colleagues to start

13:23

adopting handwashing and he gets mocked

13:25

by academia. He dies in an asylum.

13:28

>> He dies in an asylum. That's how badly

13:30

he's treated. Germ theory of disease

13:32

a couple of decades later gets

13:34

proven. Edward Snowden who you've spoken

13:36

to like some people saw him as a

13:38

traitor,

13:39

>> some people saw him as a truth teller,

13:40

but I think everybody had a bit of

13:42

really is that what's going on? Few

13:44

years later it turns out yeah the

13:46

government is spying on you.

13:47

>> Yeah, 100%. and this Cassandra complex.

13:50

So if somebody ever says, "I'm a

13:52

Cassandra. I'm feeling like Cassandra

13:53

today.

13:54

>> I foresee this thing. You don't. You're

13:57

not listening to me. It's a big deal."

14:00

>> And the problem is the difference

14:02

between somebody being a a righteous

14:04

Cassandra with the ability to see the

14:05

future and just being a crazy person

14:08

who's being convinced by bad data or uh

14:13

like perverse incentives.

14:17

It's very hard to work out which one you

14:18

are.

14:19

>> Perverse incentives is the real word

14:21

because here's the thing folks, we do

14:22

have a horrible impact on the

14:24

environment. It's factual. It's

14:26

measurable. You can go see it. Um

14:29

there's many third world countries that

14:31

have rivers that are completely clogged

14:33

with garbage and plastic. That's real.

14:36

If you're not trying to stop that, but

14:39

you're railing about carbon, well,

14:41

carbon is a weird thing because carbon

14:44

is essential to plant life. It's the the

14:47

there's more green on Earth today than

14:50

there was a hundred years ago. And

14:52

that's because of our carbon emissions.

14:54

That is an inconvenient truth. All

14:56

right. [ __ ] Al Gore. That's an

14:58

inconvenient truth. So carbon is a part

15:00

of the Is it good that we're burning

15:01

stuff and putting it in the atmosphere?

15:03

No, I do not think it is. No, I'm not

15:05

arguing that. I'm saying that our impact

15:08

on the environment that is tangible and

15:11

disgusting is pollution. That's the

15:14

impact on the environment. And if you're

15:16

really thinking about our carbon

15:17

footprint and carbon taxes and carbon

15:20

incentives and you got to follow the

15:22

money like what what is happening here?

15:25

Well, there's a bunch of green

15:26

initiatives and those green initiatives

15:28

get funding and they get funding to the

15:30

tune of billions and billions of

15:32

dollars. And if you know anything about

15:34

any sort of nonprofit, like someone just

15:37

pulled up some there's a a nonprofit

15:39

about animals and they just released

15:41

what a what a [ __ ] scam it is.

15:43

There's so many of these nonprofits

15:45

where the vast majority of the money is

15:47

going to salaries. Like the most of the

15:50

money is going to salaries and there's a

15:52

tiny fraction of that money that gets

15:55

allocated to whatever that cause is.

15:57

>> Which is why it justifies people who

16:00

work for the organization to sustain the

16:03

organization's existence because that's

16:05

their

16:05

>> 100%. But there's no data. Here's the

16:08

thing. All of their predictions, all of

16:11

the climate change predictions are

16:13

totally inaccurate. Every single one by

16:16

all the doomsayers. So, you'd think they

16:18

would course correct. You would think

16:20

they would say, "Okay, no one's arguing

16:22

that the particulates that get emitted

16:25

into the atmosphere by coal plants are

16:28

not terrible for everyone." No one's

16:30

arguing that [sighs]

16:33

glyphosate is good for you. No one's

16:36

arguing that the poisons we're putting

16:38

in rivers and streams, no one's arguing

16:40

that's good for you. The stuff that gets

16:42

into groundwater, no one says that's

16:43

good. That's our real problem. Our real

16:46

problem is pollution. It's [ __ ]

16:47

terrible. There's a real problem with

16:50

waste. There's a real problem with

16:52

landfills. All that's real.

16:54

>> This carbon thing is a weird one. It's

16:58

it's a weird one to concentrate on

17:00

solely because it seems to have an

17:02

effect on the atmosphere. It has an

17:04

effect on the temperature of Earth, but

17:07

not what they're saying.

17:09

>> Can you think of a perverse incentive

17:10

other than people just want to keep

17:11

their jobs? Is there something else?

17:13

>> It's people keeping their jobs. It's

17:15

righteousness. It's virtue signaling.

17:17

And and it's also the extraordinary

17:20

amount of money that gets put into green

17:23

initiatives. It also helps people

17:24

campaign. When you're campaigning, if

17:27

you say climate change is real, we will

17:29

follow the science. Oh, thank God. You

17:32

get my vote.

17:35

[clears throat]

17:36

That's what happens. And these [ __ ]

17:38

dumb asses just fall for it every time.

17:41

It's It's not that it's a real impending

17:44

doom scenario. That's not real. It's not

17:47

real. It's not real. But what is real is

17:51

humans impact on Earth. So you got to

17:53

figure out why is this one thing Why are

17:55

they concentrating so much on carbon

17:58

>> when it's not a measurable thing? It's

18:01

not a thing where that you're you're

18:02

seeing this hugely detrimental effect by

18:05

this one action that we have. Well,

18:07

because someone's trying to make money.

18:08

It's it no one's doing it for your own

18:10

good. There's not a [ __ ] single

18:12

person on earth that's involved in any

18:14

of these big causes that's really

18:16

concerned about us. No, they're all

18:20

making money and they're all ma even if

18:22

they're not making money other than

18:25

their salary. If your salary is a

18:27

million dollars a year to run a charity,

18:31

maybe that charity is [ __ ] horseshit,

18:34

you know,

18:36

[laughter] because if you make a million

18:38

dollars a year, you're rich as [ __ ]

18:40

This episode is brought to you by Happy

18:42

Dad Hard Seltzer. A nice cold Happy Dad

18:45

is low carbonation, gluten-free, and is

18:48

easy to drink. No bloating, no nonsense.

18:51

When you're watching a football game or

18:53

you're golfing, watching a fight with

18:55

your boys or out on the lake, these

18:57

moments call for a cold happy dad.

18:59

People are drinking all these seltzers

19:01

and skinny cans that are loaded with

19:03

sugar. But happy dad only has one gram

19:05

of sugar in a normal-sized can. You can buy

19:08

Happy Dad on the GoPuff app and your

19:10

local liquor and grocery store,

19:12

including Walmart, Kroger, Total Wine,

19:15

and Circle K. And you can't decide on a

19:17

flavor? Grab a variety pack. lemon lime,

19:20

watermelon, pineapple, and wild cherry.

19:22

They also have a grape flavor in

19:24

collaboration with Death Row Records and

19:26

Snoop Dogg. They have their new lemonade

19:29

coming out as well. Visit happydad.com

19:31

for a limited time offer and use code

19:34

Rogan to buy one Happy Dad Trucker hat

19:36

and get one free. Enjoy a cold Happy

19:39

Dad. Must be of legal drinking age.

19:41

Please drink responsibly. Happy Dad Hard

19:44

Seltzer tea and lemonade is a malt

19:47

alcohol located in Orange County,

19:50

California.

19:50

>> Well, the argument would be uh in order

19:52

to get somebody of the standard that you

19:55

need to run this charity at the level

19:57

that it needs to be run at, you need to

19:59

give a competitive salary.

20:00

>> What an amazing job they're doing where

20:02

95% of the money goes to overhead.

20:04

[laughter]

20:05

What an amazing job you've done in

20:07

having zero. Please show me your

20:09

efficiency plans, the blueprint.

20:11

>> Zero progress in any of your air quotes

20:15

science that you're you're pointing to

20:18

that's showing these prediction models.

20:20

All of their prediction models are wrong

20:22

>> and they always quote things that are

20:24

wrong like storms are stronger. There's

20:25

more they're more common. No, you're

20:27

just looking at a strong storm. If you

20:29

look overall, there's always been strong

20:31

storms. They're totally unpredictable.

20:33

>> Have you had Alex Epstein on? Do you

20:35

know him? uh the the case moral case for

20:38

fossil fuels.

20:39

>> Oh, okay.

20:39

>> Interesting dude. Um he has like one of

20:42

the most interesting stats that I

20:43

learned from him was climate related

20:45

deaths have decreased by 98%.

20:49

Over the last century.

20:51

>> Yeah.

20:51

>> So, one of the things that people don't

20:53

consider when they look at the cost of

20:56

um energy and energy production is that

21:00

you need to be able to protect. More

21:02

people are killed from heat than are

21:03

killed from cold. And you need to

21:04

protect from heat by using energy. And

21:06

if you're going to produce cheap energy,

21:08

some uh byproducts are going to be spat

21:10

out into the atmosphere. But the impact

21:12

of the creation of the energy is way uh

21:15

more effective at increasing human

21:17

longevity than the side effect of the

21:19

energy being made. Does that make sense?

21:21

>> Totally rational.

21:21

>> Yeah, it seems like that would make

21:23

sense.

21:23

>> Dude, I've had I've had um Richard

21:25

Betts, director of the IPCC,

21:27

Intergovernmental Panel on Climate

21:28

Change on the show. Uh Hannah Ritchie

21:30

from Our World in Data. Like I've really

21:32

tried to get a good balance on all of

21:34

this stuff, but Alex's position in that

21:38

area, which is it's a very luxury belief

21:41

to hold to talk about how green we must

21:43

be in the West when you have access to

21:48

unlimited energy. I think a billion

21:50

people worldwide don't have access to

21:52

reliable electricity. Like half a

21:54

billion people are still using wood and

21:56

dung in order to be able to produce

21:58

their electricity. That was the data

21:59

that he showed me the last time we

22:01

spoke. That means that if you've got a

22:02

baby that's on a a a ventilator in a

22:05

newborn baby that needs to be put on

22:07

like that baby dies. That baby dies

22:10

because that particular country does not

22:12

have access to clean to cheap and

22:15

reliable energy. Cleanness does not

22:18

matter for these people. Yeah, I've

22:19

heard that argument that the best result

22:21

worldwide would be to increase the power

22:24

supply to all these third world

22:26

countries and then you would have this

22:28

ability to start manufacturing doing a

22:31

bunch of different things that we

22:32

associate with the negative aspects of

22:34

the west.

22:35

>> You know, the negative aspects of the

22:36

west that cause pollution that cause all

22:38

these different things.

22:39

>> The problem is electricity is a real

22:41

bastard to try and move. I think the the

22:43

entire grid has got eight minutes of

22:46

battery backup. [laughter]

22:48

10 minutes of battery backup. It's it's

22:50

so little and it's so cumbersome and you

22:52

lose it as you transport it further. And

22:54

uh dude, I I I get it. Like I I really

22:56

believe that existential risks, climate

22:58

change included, are things that humans

23:00

should pay attention to. But if you were

23:01

to rank Toby Ord wrote this great book

23:04

called The Precipice and he is from the

23:06

Future of Humanity Institute at Oxford.

23:08

He wrote the best researchers in the

23:10

world. He got them to rank what are the

23:13

uh most dangerous existential risks to

23:15

humans. And it's a one in 10,000 chance

23:18

over the next century coming from

23:20

climate change. It's one in six from AI

23:24

or one in 10 from AI, one in 10 from

23:27

engineered pandemics, like one in 30

23:28

from natural pandemics. Uh there's so

23:31

many other huge issues that are really

23:34

pressing. I'm not saying that climate

23:35

change isn't a priority. I'm saying that

23:38

if you were to rank the priorities, it

23:40

actually starts to move pretty far down.

23:41

And when you think if people are worried

23:44

about the future of the world, they have

23:46

a worried about the future of the world

23:48

budget to spend, almost all of that is

23:51

going on climate change. Jamie, can you

23:53

try and get up? It's it's a a chart by

23:55

Toby Ord. It's just called if you search

23:57

like uh uh The Precipice chart Toby Ord

24:00

you can bring it up and you just think h

24:03

how much attention is being paid to all

24:05

of these other things like how much

24:07

attention is being p nuclear war I guess

24:09

gets a a little bit of attention but

24:11

slightly less so now but natural

24:13

pandemics engineered pandemics AGI uh

24:16

these are big deals and I I worry that a

24:20

lot of attention has been focused on to

24:22

one actually relatively inconsequential

24:25

at least in the immediate time.

24:27

>> No, go back. Do a a Google search for

24:30

me.

24:32

Uh, top left. Yep, that's it.

24:36

So, uh, nuclear war, one in 1,000.

24:38

Climate change 1 in 1,000. Other

24:40

environmental damage, one in 1,000.

24:42

Engineered pandemics 1 in 30. Unaligned

24:45

artificial intelligence 1 in 10.

24:48

Total the total risk is 1 in six. But

24:50

climate change is one in a thousand over

24:52

the next hundred years. A stellar

24:54

explosion. There you go. One in what's

24:56

that? A billion.

24:57

>> That's what we need.

24:58

>> I don't like that one. That one scares

25:00

the [ __ ] out of me. I I remember a

25:02

documentary I watched back in the day

25:04

that was about hypernovas. And when they

25:06

first started measuring these gamma

25:08

bursts in space, they thought that maybe

25:10

alien races were at war with each other

25:12

because there's this enormous burst of

25:14

energy. And they realize it's stars

25:17

going hypernova. And how many of them do

25:20

it all over the universe? Because the

25:22

universe is so big

25:23

>> and there's just a single beam of

25:25

signals

25:26

>> like a death ray that gets sent out

25:27

across the universe.

25:29

>> Just unimaginable power and it happens

25:32

all the time. It's happening all the

25:33

time in the sky.

25:35

[laughter]

25:36

>> [ __ ] bing bing bing.

25:37

>> And if it happens anywhere near you, it

25:39

just takes out the whole solar system.

25:41

Takes out everything. If it happens in

25:44

neighboring solar systems, it takes us

25:46

out. Takes out everything. Yeah. You're

25:49

[ __ ]

25:49

>> Wow. If that's not a justification for

25:51

just living your life and getting the

25:52

[ __ ] on with it and not coloring the

25:54

Venice Canal green, [laughter]

25:57

>> I don't know what

25:57

>> Well, it's the, you know, it's the thing

25:59

that gets you attention, unfortunately.

26:01

That's really what all this is about.

26:03

You know, send her back to Israel.

26:05

They'll give her attention. They gave

26:07

her some great attention.

26:09

>> Uh I mean, I I'm kind of obsessed with

26:11

this idea of toxic compassion, which I

26:13

think is what you're talking about.

26:14

Yeah. So, uh, the prioritization of like

26:19

short-term emotional comfort over

26:22

everything else.

26:23

>> Mhm.

26:23

>> And, uh, I remember Elon was talking,

26:26

uh, a couple years ago, someone had

26:28

accused him of contributing to climate

26:30

change, so on and so forth. And he says,

26:32

I think I've done more to reduce climate

26:33

change than any other human on the

26:34

planet. That if you look at the EV

26:38

revolution being started by Tesla plus

26:39

everything else from a technology

26:41

perspective that we're doing, I think

26:42

that there's an argument to be made that

26:43

I've uh had a more positive impact on

26:47

the future of the climate than any other

26:49

human. He said, "What I'm interested in

26:51

is the reality of doing good, not

26:54

appearing good, and not appearing

26:55

[clears throat] to do good while doing

26:57

bad."

26:58

>> And this, the opportunity people have to

27:02

be able to look like they're doing good

27:05

while not doing it is exactly where this

27:07

toxic compassion thing leaks in. So for

27:09

instance um people will proclaim that

27:12

body weight has no impact on health over

27:16

a long duration even if this causes

27:18

overweight individuals to not take their

27:20

health as seriously and literally die

27:22

sooner. But we're here Joe you don't

27:24

understand. We're trying to be inclusive

27:26

here. We're trying to be understanding

27:28

of what's going on with these people. Uh

27:30

if someone was to say that a uh male

27:33

athlete has no advantage in a sporting

27:37

competition, uh because Joe, we're

27:39

trying to be inclusive. We're trying to

27:40

be empathetic. We care about these

27:41

people. Well, even if that's done at the

27:44

exclusion of female athletes, right?

27:46

People are prepared to show

27:51

they're prepared to do whatever is

27:53

needed to appear good.

27:54

>> Yes. And the alternative which is it

27:57

makes complete sense. Who wants to do

27:59

good while looking bad,

28:00

>> right? [snorts] That's the thing you're

28:03

saying is so important. They they will

28:05

sacrifice everything to appear that

28:07

they're doing good. That's because

28:09

that's really what they're worried

28:10

about. And that that is all stemming at

28:12

least in part I I should say not

28:15

stemming but certainly accentuated by

28:18

the social media world that we're living

28:19

in now because everyone has this

28:20

opportunity to appear like they're

28:23

something other than they are. They're

28:25

using filters. They're standing in front

28:27

of a leased car. There's all of the

28:29

above. They're doing things. They're

28:31

wearing cheap fake jewelry.

28:33

They're trying to look like something

28:35

they're not. And there's a culture of

28:36

that. And there's also a culture that

28:38

that gets, well, I'm not one of those

28:40

cuz I don't care about material goods,

28:42

but I'm really interested in climate

28:44

change. And so then, you know, you join

28:46

up with whatever [ __ ] climate change

28:48

group that's yelling and shouting and

28:49

you carry a sign and you do all these

28:51

things that you're supposed to do and

28:52

you get free water. The whole thing

28:54

is just it's it's a psychological game

28:57

that people are playing with themselves.

28:59

to try to appear that they're special

29:02

and to be in competition or in battle

29:05

with the other side, you know, but if

29:07

you're if you're in battle with people

29:09

that are saying um, hey, none of these

29:12

models are correct, hey, none of these

29:15

predictions have come to bear, zero, not

29:18

a single one, where they say the sea

29:20

level's going to rise, there's going to

29:22

be no more Miami, nothing, not a [ __ ]

29:24

thing has happened.

29:25

>> Like, you're wrong. Okay? So, we need to

29:28

figure out what's right. If we can all

29:29

agree that if we're doing something bad

29:31

to the planet and it's somehow or

29:33

another avoidable, let's work towards

29:35

that. But if you're telling me we're

29:37

doing something bad to the planet and

29:39

then when I say, "Well, show me." And

29:40

you can't. Well, what about all these

29:42

predictions? Well, they're wrong. Well,

29:44

what about that movie that got

29:46

everybody? Well, it was totally

29:47

inaccurate. Okay. Well, you can't use

29:50

that on your side anymore.

29:52

>> I never saw that movie. What was so bad?

29:55

What were the claims?

29:55

>> An Inconvenient Truth. Oh, let's find

29:57

out. Put into Perplexity what the uh

30:00

incorrect

30:01

>> I just, I was already asking what

30:03

did they get right and what did they get

30:04

wrong?

30:04

>> Yeah. What did it say?

30:06

>> That's typing it up right now.

30:07

>> I would get [ __ ] I guarantee you they

30:09

didn't get nothing wrong or they didn't

30:11

get nothing right.

30:12

>> You want to know which one you want to

30:13

start with? Right or wrong?

30:14

Um, what about the predictions for

30:17

catastrophic events. What did he get

30:19

wrong about the predictions for

30:20

catastrophic events?

30:21

>> It's just uh

30:22

>> predictions that were incorrect. Rapid

30:24

sea level rise 20 feet. The film

30:26

depicted a potential sea level rise of

30:29

up to 20 feet (6 m) in the near future

30:32

from the collapse of Greenland or West

30:34

Antarctic ice sheets. While this extreme

30:36

scenario is considered possible over

30:38

centuries or millennia, scientific

30:40

consensus does not support this

30:41

happening imminently, current rates are

30:43

much slower, even with acceleration

30:46

uh reaching 20 ft would take many

30:48

centuries. Uh another one, Mount

30:50

Kilimanjaro, glacier melt caused by

30:52

global warming. Gore attributed the

30:54

shrinking of Kilimanjaro's glaciers

30:56

mainly to global warming, but later

30:57

research points to other major causes

31:00

like sublimation and reduced

31:03

snowfall unrelated primarily to

31:05

temperature. Uh impression of imminent

31:08

chaos. The film often implies that

31:10

catastrophic outcomes like rapid ice

31:12

sheet collapse and dramatic sea level

31:14

rise might occur within decades when in

31:17

reality such processes are expected to

31:20

take much longer often centuries or

31:22

more. And then legal findings. A UK

31:26

court found nine errors or exaggerations

31:28

in the film mostly involving a lack of

31:31

clarity on time scales or oversimplified

31:34

attributions like Kilimanjaro.

31:37

Overall, climate scientists judged An

31:39

Inconvenient Truth as mostly accurate

31:40

with its projections, particularly in

31:42

broad trends, but criticized its

31:43

presentation for occasionally

31:44

exaggerating the speed and certainty of

31:47

some changes. Well, I think this is

31:48

Yeah, this is the thing. It's "most

31:51

climate scientists judged it." I'd like

31:53

to keep this climate hustle going on.

31:55

So, well, they were mostly accurate. We

31:58

do have a sincere problem. Stop putting

31:59

a British accent on when you do that.

32:00

Stop putting a British accent.

32:01

>> That's not even British. That's like a

32:03

fake British guy. That's like a posh

32:04

[ __ ] from Connecticut.

32:06

>> Okay. Okay. Okay. Uh but no, you're

32:09

you're right. The the lack of scrutiny

32:10

that people have of their behavior, the

32:12

distance between our opinions and our

32:14

deeds,

32:14

>> yeah,

32:14

>> never been greater. That's the internet.

32:16

And what it means is you're allowed to

32:18

do good

32:19

>> while appearing bad and do bad while

32:21

appearing good. And it's way easier to

32:24

do bad or to just not research or and

32:26

it's significantly harder if you're

32:29

like, I'm going to go out try and invent

32:31

something, try and push against an idea

32:33

or an ideology or a campaign for a a

32:37

movement that I think is really really

32:39

important and people are going to say

32:40

that I'm doing something mean or people

32:42

are going to call me names for doing it.

32:44

There's no incentive to do it. Why would

32:45

someone go why would somebody do that?

32:46

And I think that's what Elon's point is,

32:48

right? What I'm interested in is uh

32:50

doing good, not the appearance of it.

32:52

And I see a lot of people who are doing

32:53

bad while appearing good.

32:54

>> Well, you know, I think it's, through

32:57

no fault of their own. Young people are

33:00

indoctrinated into this world when they

33:02

start going to college that you have to

33:03

be active and to be an activist is to be

33:05

a good person and to be involved in

33:08

these campus activities is a good thing.

33:10

And there's also there's a tribal aspect

33:12

to it. You know, you're on a tribe of

33:14

people that are the people that are on

33:15

the right side of history. These are the

33:17

people that are kind and compassionate

33:19

unless you disagree with them. And these

33:21

these are the people that are they trust

33:23

the science unless it's inconvenient.

33:26

And these are the people that you know

33:28

you want to be in the educated minority.

33:31

You want to be [clears throat]

33:32

the people that get it and you want to

33:34

you it's very important that you use

33:36

your voice,

33:38

>> you know, and so they think they're

33:39

being good people. And I I get that and

33:41

I understand that. But it's being

33:43

weaponized against you and it's probably

33:45

not even funded by legitimate people.

33:49

It's most likely there's at least some

33:51

funding by some foreign entities that

33:54

are just trying to sow discord and make

33:56

sure that everybody hates everybody.

33:58

[laughter]

33:59

>> That'd be a wonderful way to take down

34:00

any country, right? To make it feel as

34:02

if it was coming from inside.

34:03

>> Yeah, sure. There's a lot of that going

34:05

on. That's been absolutely proven. Uh

34:08

there was a thing recently with chat GPT

34:10

where they found out that these um

34:12

entities in China were using chat GPT to

34:16

argue about a US shutdown. Like, they

34:19

were just, they ran all these social

34:21

media accounts.

34:22

>> The Twitter account thing where you can

34:24

see where the accounts are based.

34:25

>> Yes, I know the

34:28

one of the ones that is like a fan

34:31

account of uh the JRE. People thought it

34:34

was me forever and I was like I didn't

34:36

correct it. It made it say

34:38

parody account, or it says either

34:40

commentary account or parody account or

34:42

whatever fan run account just so you

34:45

don't think it's me because people do

34:46

things to me. It's in Asia. So someone

34:48

in Asia is doing that allegedly unless

34:51

he's got a VPN. I mean you could

34:53

>> you hardworking Asian supporting the Joe

34:54

Rogan podcast.

34:55

>> But you could, right? That's the

34:57

question. Like how do they know where

34:58

you're from if you sign up with a VPN

35:01

and you say I'm in the South Pacific?

35:03

Like how do they know?

35:05

>> I don't know. I don't know. I I

35:06

certainly know that um assuming that

35:10

you're on the right side of history uh

35:12

especially if you're in a big group is

35:15

often a a bit a dangerous position to be

35:17

in. So that Cassandra complex thing that

35:19

I was talking about before, um,

35:21

sometimes people might say it's your

35:24

duty if you believe in a thing to stand

35:27

firm,

35:28

>> right? You should you should make your

35:29

case known. You know, you're Ignaz

35:31

Semmelweis, you know about the germ theory

35:32

of disease. You're Rachel Carson. You

35:34

know about the impacts of DDT. You're

35:36

Edward Snowden. You know about the the

35:37

surveillance that's going on. There's a

35:39

really wonderful example, the comparison

35:41

between Copernicus and Galileo. So,

35:44

Copernicus in the 1500s, he uh begins to

35:48

realize that the Earth might not be the

35:50

center of the solar system, let alone

35:52

the universe. And he has enough evidence

35:56

to justify it, but he waits until his

35:58

deathbed to actually sort of whisper out

36:01

his great work, which is the Revolutions,

36:04

this work that he made. And he does

36:06

it on his deathbed presumably to avoid

36:07

the wrath of the church. Now, some sort

36:10

of hardline freedom fighter would say you should

36:12

do it. Don't listen to the man. Don't

36:14

back down. Like just stand on your

36:16

principles. People would say, "Well,

36:17

that's a cowardly thing to do. You knew

36:18

what the truth was and you didn't stand

36:20

by it." A hundred years later, Galileo

36:23

comes along. He sees the moons of

36:25

Jupiter, sees the phases of Venus, sees

36:27

the pock marks on the uh surface of the

36:30

moon, and he realizes that the

36:31

heliocentric model, this like Copernican

36:33

revolution is true. Proclaims it from

36:36

the rooftops. What happens to him?

36:38

>> House arrest.

36:39

>> He gets put under house arrest. He gets

36:41

forced to recant under the threat of

36:44

torture and spends the rest of his life

36:45

under house arrest. So what you have

36:47

here, and I [ __ ] love this example so

36:49

much. I think it's so cool. It's two

36:51

guys 100 years apart with the same

36:54

realization and the justification for

36:56

the first one not saying loudly what he

36:59

knew is the treatment of the

37:01

second.

37:02

>> I think it's like just this perfect

37:03

explanation of irony. You know what I

37:05

mean? Like it's so perfect. Yeah, you

37:07

go. Well, the main issue that I have

37:10

with like basically being right and

37:13

early often feels a lot like being

37:15

wrong.

37:16

>> Mhm.

37:16

>> And if you make a an example of somebody

37:20

in that way, it is basically you saying

37:24

if you step out of line too far, this is

37:26

what's going to happen to you. And it

37:28

causes people who are trying to move

37:30

conceptual inertia forward. We're trying

37:32

to do research. I'm trying to assess

37:34

whether or not this is actually the way

37:35

that the world should be.

37:37

>> It causes them to be more Copernicus,

37:40

not more Galileo.

37:41

>> And uh I think that's

37:45

that is not what you would want in a

37:47

civilization that's trying to continue

37:48

to make progress. You would want to be

37:50

accepting of new ideas and you would

37:52

want to encourage them as opposed to

37:53

castigating people. Do you think that

37:55

social media and the influence of other

37:58

people's opinions, it makes someone more

38:02

likely to

38:05

be able to think for themselves or less

38:07

likely? like more likely to be able to

38:09

examine preconceived notions, recognize

38:12

like, oh my god, maybe I'm biased or

38:14

maybe it's just like a group bias that

38:16

that I've accepted because of all the

38:18

people around me and I'm I'm I think I'm

38:22

I think this is wrong and I think this

38:24

is what I think is really going on or do

38:26

you think it encourages that kind of

38:29

thinking or discourages it?

38:31

>> I think it certainly encourages

38:33

groupthink, very much so

38:34

>> but both right? Uh, it would open up the

38:38

opportunity for some people with a very

38:40

unique psychological profile. Yeah.

38:42

>> To be able to step back against

38:43

>> black helicopters. [laughter]

38:45

>> Yeah, there's a few guys out there I can

38:47

think of.

38:48

>> Um, but I think on average, what you're

38:51

seeing is basically this huge big swath

38:53

of people. For the first time ever,

38:55

you're able to aggregate um just how

38:58

much support or criticism something has.

39:00

>> Yeah.

39:01

>> You know, this is what like to dislike

39:03

ratios are. This is what upvote to down

39:04

votes are on Reddit. And um I I think

39:07

that that

39:08

>> that causes people most people don't

39:10

want to have to do the thinking of

39:11

coming up with an original opinion. I'm

39:13

sure that most of mine aren't original,

39:14

but given the fact that doing the

39:16

original thinking is hard, most of the

39:19

culture war is actually two armies of

39:21

puppets being ventriloquized by a

39:23

handful of actual thinkers. Most people

39:25

are just being brought along and pushed

39:27

along by people who came up with an

39:29

idea. And they're assuming, well, we've

39:31

done we know we we know this for a fact.

39:32

Well, it's interesting because both

39:34

sides know for a fact the thing that the

39:36

other side says is a lie. So, that can't

39:38

be true. Um, see, I I get the sense that

39:41

it causes people to uh adhere to the

39:44

crowd uh more than they would have

39:47

done previously. And you also have to

39:48

think that if you're spending that much

39:50

time on it, like six hours a day, it's

39:52

one of the primary influences of your

39:53

life, probably more so than any other

39:57

media in the past, because it was very

40:00

rare as a child that you would listen to

40:02

six hours of the news. You wouldn't

40:04

really be indoctrinated into six hours

40:06

of whatever the latest cultural dilemma

40:08

was or the latest social issue was. You

40:11

wouldn't get that much of it. You get

40:13

people talking about it like normal

40:15

people do during the day or maybe you'd

40:17

be talking about a newspaper article you

40:19

read but you're not getting six hours of

40:20

it all day long. But now we are at least

40:23

six hours. I mean what is the let's find

40:25

that out. What's the average number of

40:28

hours an 18-year-old kid is on social

40:31

media?

40:31

>> I would guess it's at least I would

40:33

guess social media maybe four or

40:35

>> let's just say their phone screen time

40:37

at least six probably more.

40:39

>> Yeah,

40:39

>> at least six probably more. And the mad

40:41

thing to consider here is your

40:43

parasocial relationships. People, think

40:46

about this. People will listen to your

40:47

show and listen to my show more than

40:50

they see their parents

40:52

>> by by a huge margin.

40:53

>> A huge margin. If you saw your parents

40:55

that much, it'd be kind of creepy.

40:57

[laughter]

40:58

>> The average screen time for 18-year-olds,

41:01

7 to eight hours.

41:02

>> There you go.

41:03

>> Of total screen time per day is common,

41:05

though it varies a lot by person and

41:07

country. Okay. What country has the least

41:10

amount of screen usage.

41:11

>> Dude, would you want to discount school

41:13

time, too? Cuz aren't they on screens

41:15

technically in school?

41:16

>> Um,

41:17

>> I mean, it's like you're asking phone

41:18

time, I guess, right? Not

41:20

>> Yeah, I think it's personal phones

41:21

they're talking about.

41:22

>> Are they on screens? Sometimes it's

41:25

counting my laptop open in my screen

41:27

time because I'm connected to the same

41:29

like iOS system. So, I'm getting like 18

41:31

hours a day, but I'm like I'm not on my

41:32

phone 18 hours a day.

41:35

>> Interesting.

41:36

Um,

41:38

so let's guess like what countries.

41:40

Well, you'd have to have first world

41:42

countries for it to count,

41:45

>> you know? Like if you're in the Congo,

41:46

you probably don't get as much screen

41:48

time.

41:48

>> No, you're busy mining the [ __ ] Yeah.

41:51

>> raw materials.

41:52

>> Exactly.

41:52

>> Yeah. You're making the phones, not

41:54

using them,

41:54

>> which is the craziest thing of all that

41:56

the thing that people virtue signal

41:59

on the most at the end of the line is

42:02

someone pulling it out.

42:03

>> Lowest global average.

42:04

>> Interesting. 3 hours and 56 minutes is

42:06

still a lot of time. That's That's kind

42:08

of crazy, but they're probably a little

42:10

healthier with it.

42:11

>> How is 9 hours and 24 minutes less than

42:14

10 hours and 56 minutes?

42:16

Like that how's that the highest global

42:18

average if 10 hours is the

42:20

>> That's weird.

42:21

>> I don't understand.

42:22

>> Yeah, close contender is more than the

42:25

highest global average.

42:27

>> I don't know.

42:27

>> Oh, I get it. But either way, the

42:29

Philippines, they're killing the game.

42:30

10 hours and 56 minutes. Dude, there was

42:33

a a 2023 mental health report. Uh the UK

42:38

uh came in second most depressed country

42:41

in the world.

42:42

>> Second

42:43

>> second most depressed country in the

42:44

world.

42:44

>> UK.

42:45

>> UK. Yeah.

42:46

>> What's number one?

42:47

>> Uzbekistan.

42:48

>> So it's just above

42:51

and just below South Africa. Uh

42:53

>> did the UK used to rank higher?

42:55

>> Uh yes. It's tracked down over time, but

42:58

it's never been superb. I mean, we're

42:59

We're, misery, like melancholy, is

43:03

sort of our personality trait. It's our

43:04

national sport, right? Being a bit more

43:06

melancholic. Um, but yeah, the Ukraine

43:10

who are just about to go into their

43:11

fourth year of war came in higher and

43:14

Yemen, who apparently are going through

43:16

like one of the worst humanitarian

43:17

crises in human history, also ranked

43:20

higher than the UK. So, yeah, second

43:22

most depressed country in the world.

43:23

>> That's crazy. That's a wild number, man.

43:26

Um, that it can't just be the weather.

43:29

that it has to be like a

43:30

>> weather might contribute a little bit.

43:32

>> A little bit like Seattle does. Like

43:33

people in Seattle are depressed as [ __ ]

43:35

>> Maybe it's the Online Safety Bill.

43:37

>> Could be. That would get me depressed.

43:40

[laughter]

43:40

>> I'd be so depressed if I lived in

43:42

England right now. I'd be like, I'm

43:43

[ __ ] Like legitimately [ __ ]

43:45

>> Like imagine if I was running this

43:47

podcast the exact same way out of

43:49

England.

43:51

>> Yeah. I'd get arrested.

43:53

>> I'd get arrested. I saw them. They

43:54

arrested a teacher because he refused to

43:57

um refer to one of his students as a

44:00

they and this was like his second

44:02

infraction. And so they they arrested

44:05

him for failure to recognize a singular

44:09

plural.

44:11

[laughter]

44:14

>> I look I really don't like I I don't

44:17

like [ __ ] on the UK because it feels

44:18

like I'm pulling the ladder up after

44:20

I've just got out of it. But it's just I

44:22

I don't know how many more ways you can

44:24

face plant over and over again. And

44:26

there's this bit there's a strange kind

44:28

of romanticization of the past of the UK

44:31

where we are English common law and we

44:33

stopped the transatlantic slave trade

44:35

and we used the navy and so on and so

44:36

forth. But like we're we're really

44:38

living on borrowed time now as the UK.

44:41

It's been a good while since the UK sort

44:43

of contributed in that sort of a way.

44:45

There was, you know, Alan Turing, of

44:48

Turing test fame.

44:48

>> Yeah. Yeah. Yeah. Uh so he was the guy

44:51

that decoded the Enigma machine. Yeah.

44:53

Yeah. Yeah. Yeah. So uh he was gay and

44:56

he was chemically castrated by the

44:58

British despite the fact that he was

45:00

literally our equivalent of the atomic

45:01

bomb. Right. He was like a very British

45:03

version as well. It wasn't kinetic, it

45:05

was cognitive. So he decodes the machine

45:08

that the Germans are using to send their

45:10

secret messages. This means that we're

45:12

able to detect exactly where the U-boats

45:14

are going to be. And it results in some

45:16

really awkward situations like if we uh

45:20

before we're going to use all of our

45:22

force to try and take Germany down. If

45:25

we avoid all of their planned bombings,

45:28

they're going to guess that we might

45:29

have the keys to some of their

45:30

communication. So, they had to start

45:32

making decisions about which boats

45:34

needed to be left to be attacked and which

45:37

boats needed to be saved.

45:38

>> Oh my god. They knew all of the

45:40

different attacks that were coming, but

45:42

if they got rid of all of them, if they

45:44

were safe from all of them, the Germans

45:45

would start to catch on.

45:46

>> So, they had this really

45:47

>> Oh god.

45:48

>> So, this guy, this guy is is our

45:51

equivalent of the atomic bomb, right?

45:54

He's our Oppenheimer.

45:56

He gets chemically castrated just after

45:58

World War II.

45:59

>> Was in the 50s, right?

46:00

>> Yeah. Yeah. Yeah. Yeah. And he kills

46:02

himself. He takes his own life. He puts

46:03

cyanide in an apple. Oscar Wilde

46:06

in the 1800s,

46:07

one of the greatest writers of all time,

46:10

he's jailed and then dies in exile as a

46:12

peasant in France because he was gay.

46:15

And then 70 years after uh Turing,

46:19

Gordon Brown, it's like 2008, 2009,

46:21

publicly apologizes. They bring out this

46:23

thing called the Turing law, which uh

46:27

gets rid of the criminal records of all

46:30

of these people from history like

46:31

posthumously, and some of them are probably

46:33

still alive actually like some of these

46:34

people that had been uh whatever it was

46:37

convicted of indecent behavior improper

46:39

behavior at the time uh and then they

46:42

put Turing on the £50 note. So Britain

46:44

has for all that it's fantastic and I

46:46

love it and it's the country that I came

46:47

from like it does have a history of

46:50

[ __ ] persecuting people for what's

46:52

deemed improper behavior at the time and

46:54

then apologizing for it a couple of

46:56

decades later. And I think with the

46:58

Online Safety Bill thing, I think

47:01

it's going to be the sort of thing that

47:02

you look back on and go, that was

47:04

not in no one's world was that a smart

47:06

move. I don't think that it's a I don't

47:08

think that it's it's helping anybody at

47:10

all. Well, it just appears that they

47:14

want total complete control over what

47:16

people say over there and that they

47:19

don't want criticism of the government

47:20

and criticism about immigration and

47:22

criticism about, you know, fill in the

47:25

blank. They don't want it. And the best

47:27

way to stop that is to keep everybody

47:29

scared. Make everybody self-censor.

47:30

What's the best way to make everybody

47:31

self-censor? Put a bunch of [ __ ]

47:33

people in jail. So last year, what was

47:36

it? 12,000 people got arrested

47:38

for social media post

47:39

>> supposedly more than Russia. Although

47:40

the the Russian the Russian stats might

47:42

not be uh

47:43

>> Yeah. [laughter] Well, they didn't

47:44

arrest him. They just shot him in the

47:46

face.

47:47

>> They don't need a Gulag for you.

47:49

>> Yeah. They just kill folks over there.

47:52

But yeah, it's really bad. It's really

47:55

bad. And it just doesn't seem very

47:57

progressive. It doesn't seem like you're

48:00

moving towards the future. It's not

48:02

progress like this. We've figured out a

48:04

long time ago that free speech is very

48:07

important to figure out what's right and

48:08

what's wrong. And when you suppress

48:10

people's speech, you can get away with a

48:12

lot of [ __ ] horrible things because

48:14

you stop people from being able to

48:16

protest it.

48:17

>> You know, in a small part, we saw a lot

48:18

of that during the pandemic. And you

48:20

know, and you you you see what what the

48:22

consequences of that are. You you can't

48:25

trust people that want power. You just

48:28

can't.

48:28

>> What you mean? Well, anybody that wants

48:30

any kind of control over a group of

48:33

people, if you want to control what they

48:35

say, if you want to control where they

48:36

go, you want to put them in 15-minute

48:38

cities, like you can't trust that

48:40

because the natural inclination when

48:42

someone has power is to never let it go

48:44

and to ramp it up. They're in the power

48:47

business. If you're in the power

48:48

business, you don't want to keep making

48:50

the same amount of money every year. You

48:51

want you don't want to have the same

48:52

power every year. That's boring, right?

48:55

Like, if you're an insurance salesman,

48:56

you want to be the [ __ ] employee of

48:58

the month. you want to make more money

48:59

next year, you got your eyes on a new

49:00

Lexus. You're trying to make more.

49:02

You're not trying to stay maintained.

49:05

That's not the game you're in. And if

49:06

you're in the power game and if you're

49:08

in the game of enacting new laws in

49:10

order to have, we need safety. Safety

49:14

under the guise of safety, you can get

49:15

so much evil [ __ ] done. And if you start

49:18

doing that, you're not going to say, you

49:19

know what, guys, we were that safety

49:20

bill. We were really wrong. And what's

49:22

really important is discourse. What's

49:24

really important is that maybe I wonder

49:26

why you think the way you think. And you

49:28

know, maybe part of this polarization

49:31

process is not enabling us to see valid

49:34

points the other side has. Let's all

49:36

come together and talk about this as

49:38

reasonable human beings. It's no, that's

49:41

not what they're going to do. They're

49:42

going to just come up with more [ __ ]

49:44

reasons to put you in a cage. [laughter]

49:47

They want you to shut the [ __ ] up

49:49

because they want to make more. They

49:50

want to have more. They want to get more

49:51

power. They want to be the best leader.

49:54

They want to be the most powerful

49:55

leader. Isn't that a ruthless part of

49:57

human nature that trajectory is more

49:59

important than position? Jimmy Carr

50:01

taught me this. Um, so your industry,

50:03

imagine that you're the uh 250th best

50:07

comedian in the world. Let's imagine

50:08

there's a ranking. Uh, and last year you

50:10

were the uh 300th.

50:12

>> You were in a more psychologically

50:15

preferable position than somebody who's

50:18

number two in the world, but last year I

50:20

was number one. this sense that humans

50:22

have of where am I now compared to where

50:24

I was previously.

50:26

>> Uh I spoke to Dan Bilzerian about this

50:28

forever ago and I was like dude you've

50:29

kind of climbed the the peak of the

50:31

mountain of hedonism. Uh did you ever

50:33

think that you kind of frontloaded it

50:35

too much and that it's going to be

50:37

really really difficult for you to ever

50:39

um like reset, like do a hedonic reset?

50:42

How do you go from the most amount of

50:44

girls and the cars and the all the

50:45

dopamine that the world has to offer

50:47

like where do you go from there? And uh

50:49

he he basically said, yeah, he was like

50:51

uh I'm going to try I would consider

50:52

shaving my head and my beard and going

50:53

and working in an Amazon warehouse for 6

50:55

months to see if I can do like a hard

50:57

reset, but you always know that you've

50:59

got the get out of jail free card, so

51:00

it's not going to be the same. And uh

51:02

just that idea, as you're saying,

51:05

somebody has power,

51:07

>> they want more power,

51:08

>> right?

51:09

>> They want more power. They want more

51:10

control. That sense

51:12

>> that's the sport they're playing.

51:15

>> Bingo.

51:16

>> Scoring. They're scoring. You have to

51:17

keep score. Greta Thunberg, the same

51:20

thing. We need more eyeballs. We need a

51:21

bigger bigger because where do you go

51:23

after you've made the Rivers of Venice

51:25

green? Yeah.

51:26

>> Well, you need to do something bigger.

51:27

Something more.

51:28

>> I need more likes. That's That didn't

51:29

get me enough likes. I need more likes.

51:32

I need to go viral.

51:33

>> It's a ruthless

51:34

>> I'm being shadowbanned. [laughter]

51:36

>> No, you're not. Your content just sucks.

51:38

No one But some people get shadowbanned,

51:41

but most people that are shadowbanned,

51:42

they just suck.

51:43

>> Yeah. Most people just don't understand

51:44

that they're not interesting. But

51:46

there's definitely real shadowbanning

51:47

going on. One of the things that was

51:48

interesting is that once Elon purchased

51:50

Twitter, I gained like five million

51:54

followers over the course of like a

51:56

couple of months. I was like, "What's

51:57

going on?" It's cuz I was I was somehow

52:01

or another they had locked my followers

52:03

down. This I'm not complaining about

52:05

this. I'm just observing.

52:06

>> Uh I know I have a lot of followers.

52:08

It's ridiculous. But I I started I think

52:11

I had 7 million and it I I used to go up

52:13

pretty steady and then somewhere during

52:15

the woke days during the dark days of

52:17

woke when it all started happening which

52:19

is around I think 2014 15 16 it started

52:23

really ramping up and then it seems like

52:26

from 16 on real censorship started

52:29

really kicking into high gear because

52:31

then they had a reason for it. Donald

52:33

Trump is our president. We have to make

52:36

sure this never happens again. In fact,

52:38

there was a meeting I believe

52:40

I don't want to say the the tech company

52:42

because I might be incorrect, but one of

52:45

the people one of the main people at

52:48

this tech company specifically said at

52:51

the meeting, we have to make sure this

52:54

doesn't happen again.

52:56

>> As in

52:58

did a [ __ ] horrendous job there.

53:00

>> Well, they [ __ ] up. But the point

53:02

being, imagine you are in control of an

53:05

enormous platform, an enormous media

53:09

platform that controls the discourse of

53:13

untold billions of people in the world.

53:16

And you have a very specific mandate

53:18

that you've given to the people that

53:20

work for you. We have to make sure that

53:22

we control who the king is.

53:24

>> Cuz that's what you're saying. You're

53:26

saying we we got to make sure this

53:27

doesn't happen again. Well, how do you

53:29

do that? How do you do that if 50% of

53:31

the people don't agree?

53:32

>> By force.

53:34

>> There's only one way. You have to do it

53:35

by force. Or if you control the

53:38

narrative, then you just hide

53:41

information,

53:42

accelerate information that's incorrect,

53:45

you just ban people from communicating,

53:48

you kick people out.

53:50

>> Well, I mean, some people would say that

53:52

getting to choose who's king is what you

53:53

do if you then buy that social media

53:55

platform. Sure, there's a there's an

53:57

argument for that, like what Elon did.

53:59

There's a real argument for that. But

54:01

there's also an argument for don't you

54:03

think it's a good idea if we have at

54:05

least one of these [ __ ] that's

54:07

huge um that you can go wild wild west

54:10

on and say whatever you want. I I think

54:12

that's very important. You don't have to

54:15

agree with them. There's all these tools

54:17

you can use. One of them is the mute

54:18

button. You can mute people. Bye-bye. I

54:21

don't want to hear you anymore. You're

54:22

annoying. Or you can ban them. I don't

54:24

even want you looking at my page. get

54:25

out of here. There's those things exist.

54:27

Like you can curate who you're

54:29

communicating and interacting with, but

54:32

if you don't have one of these groups

54:36

that's resistant to intelligence

54:39

agencies shutting down legitimate

54:42

voices, including during the COVID

54:44

times, it was guys like Jay Bhattacharya from

54:47

Stanford, a guy from MIT, because

54:50

they were saying something that didn't

54:52

jibe with the agenda that Fauci

54:54

was pushing through. Where do you think

54:55

we're at now if you were to sort of

54:57

predict what the trajectory of the the

54:59

speech stuff is online? Talk about

55:01

America. UK I think is just a lost

55:03

cause.

55:05

Do you think that we're going to

55:06

continue on this general path which

55:08

seems to be a little bit more sanity

55:09

than the peak?

55:11

>> This episode is brought to you by Zip

55:12

Recruiter. Have you ever lost your keys

55:14

and ended up tearing your house apart

55:16

trying to find them? What makes it even

55:18

worse is when it's the most conspicuous,

55:21

obvious place you could have sworn you

55:24

checked already. It'd be nice if we had

55:26

the ability to find whatever we're

55:28

looking for right away. And at least for

55:30

hiring managers on the hunt for talented

55:32

people, that's possible thanks to Zip

55:35

Recruiter. Try it for free at

55:38

ziprecruiter.com/rogan.

55:41

Using Zip Recruiter almost feels like

55:42

having superpowers. It works quickly and

55:44

efficiently at finding qualified

55:46

candidates for your role. Largely

55:49

because they have incredible matching

55:51

technology and an advanced resume

55:53

database that can help you connect with

55:56

people instantly. No more wasting time

55:58

and energy. Zip Recruiter can help you

56:00

find exactly what you want. Want to know

56:03

right away how many qualified candidates

56:06

are in your area? Look no further than

56:08

Zip Recruiter. Four out of five

56:10

employers who post on ZipRecruiter get a

56:12

quality candidate within the first day.

56:15

And right now you can try it for free at

56:17

ziprecruiter.com/rogan.

56:20

Again, that's ziprecruiter.com/rogan.

56:24

ZipRecruiter, the smartest way to hire.

56:26

I think people realize from the peak and

56:29

most importantly realize from Elon's

56:31

purchase of Twitter. When Elon purchased

56:34

Twitter, and I don't say this lightly, I

56:36

think he changed the course of

56:37

civilization. I really do. I think we

56:39

were on our way to this weird dystopian

56:42

censorship complex that was

56:44

already moving. We had already had

56:47

intelligence agencies that were

56:49

contacting Twitter. We know this through

56:51

the Twitter files. And they were banning

56:54

certain people that weren't saying

56:56

incorrect things, but they were saying

56:58

things that were inconvenient.

57:00

>> And they they turned out to all be

57:02

accurate. All the things that they were

57:03

warning about, all the things that they

57:05

were saying, all turned out to be

57:06

accurate. They stopped the the

57:08

distribution of the Hunter Biden laptop

57:10

story by the New York Post. The New York

57:14

Post, the second oldest newspaper in

57:16

America. It's a [ __ ] huge newspaper.

57:19

To stop that from being able to be

57:21

distributed on Twitter, which would

57:24

turn out to be a totally accurate

57:26

story. And to stop that accurate story

57:28

is wild. That is scary stuff. If

57:33

Elon hadn't purchased Twitter, we would

57:36

have just had to deal with that kind of

57:37

stuff. That would be and it would

57:39

accelerate. It wouldn't stay where it

57:40

is. It would ramp up. It would get more

57:43

They started using the

57:45

term malinformation. So there's

57:47

misinformation,

57:49

disinformation, and then malinformation.

57:51

Malinformation is factual information

57:54

that might cause harm.

57:56

>> Can you give me an example of

57:57

malinformation?

57:58

>> Children don't need a COVID vaccine.

58:01

That's malinformation because it is

58:03

true. Statistically speaking, like

58:06

especially healthy kids, they they kick

58:07

it off like it's nothing. They don't

58:09

need a vaccine for that. But that might

58:11

cause people to not get vaccinated and

58:13

that might kill your grandmother. So

58:14

that's malinformation.

58:15

>> Can you think of an example of

58:17

malinformation where it's justified

58:20

in doing that?

58:22

>> Yes. I would say like

58:25

if you had some information and you uh

58:29

were go you were releasing it online

58:33

that was uh an accurate depiction

58:38

of some things that the federal

58:41

government is involved with that would

58:44

compromise national security, where people

58:46

overseas, uh, yeah, would get people

58:49

killed, start a conflict

58:51

>> here's another one that I've just

58:52

thought of uh um how to you know those

58:55

desktop um DNA printers uh this is how

58:58

to put smallpox together

59:00

>> right

59:00

>> right something like that

59:01

>> something which is true but would be

59:03

would be dangerous and this is the

59:05

devil's in the [ __ ] details 100%

59:07

stuff like this

59:08

it's never binary, it's never

59:10

incorrect

59:11

>> sometimes it's binary sometimes I

59:12

shouldn't say never

59:13

>> yeah some things are binary

59:14

>> sure

59:15

>> like whether or not you should win a

59:16

[ __ ] world's woman stronglifting

59:18

strongman power woman competition

59:21

>> [laughter]

59:22

>> That just happened. I thought we were

59:24

done with that. It just happened.

59:26

>> Well, do you know why it was able to

59:28

happen? It's because that person lied.

59:31

That person lied about their sex.

59:33

>> Oh, interesting.

59:35

>> Jamie, can you try and uh pull up an

59:37

image of the current 2025

59:41

World's Strongest Woman winner, please?

59:45

Um, just for clarity, Mitchell Hooper,

59:48

that is the world's strongest man,

59:50

Canadian dude, he's 6'3, 330.

59:54

The person who won World's Strongest Woman

59:59

is 6'4 and 400 lb. She makes the current

60:02

World's Strongest Man look like an

60:04

infant.

60:05

>> Oh, World's Strongest Woman, women

60:07

stripped of title after organizers

60:08

discovered she was born a man.

60:10

>> That was an hour ago, dude.

60:11

>> Okay, so an hour ago they stripped her.

60:13

Is that the person? Yep. Jammie Booker,

60:16

disqualified.

60:17

>> That's a man.

60:18

>> Uh,

60:19

>> are you sure?

60:20

>> It appears the athlete who is

60:22

biologically male and now identifies as

60:24

female competed in the women's open open

60:26

category. Uh, they were unaware of this

60:27

fact ahead of the competition and have

60:29

been urgently investigating. I want to

60:31

know what urgent investigation is.

60:32

>> They went on Twitter. [laughter]

60:35

>> They did it. So, that's that's a

60:37

biological male. That's interesting.

60:38

Correct.

60:39

>> It looks like just a big lady. Had we

60:40

been aware or had this been declared at

60:42

any point before or during the

60:43

competition, this athlete would not have

60:44

been permitted to compete in the women's

60:45

open category. The move comes after

60:47

runnerup uh Andrea Thompson, British,

60:49

hey uh was filmed storming off the

60:51

podium as she raged about the [ __ ]

60:53

decision to award the title. So the other

60:55

competitors evidently knew.

60:57

>> Uh okay. So Thompson is now the winner.

60:59

So the UK gets the gold. But I think I

61:02

think about this so much when it comes

61:04

to sporting competitions and it's not

61:06

just with the the the trans thing

61:08

although this is a huge deal and I did

61:10

think that we kind of got past it.

61:13

>> How horrible is it to be the person who

61:16

won but had that moment the podium

61:19

moment stolen from you by somebody? I

61:22

think there's a a weightlifting Olympics

61:25

weightlifting championship final where

61:28

currently like the 11th place finisher

61:30

is now first because each person has

61:33

progressively got popped for PEDs.

61:35

Number one did number two did the number

61:36

three did it's like 11 people have been

61:38

popped for PEDs now.

61:39

Well that's the Tour de France. You

61:41

know, when they took away Lance

61:42

Armstrong's title, the Tour de France,

61:44

what they didn't tell you that if you

61:46

want to go and remove all of the people

61:50

that have tested positive for something,

61:52

you got to go down to like 18th place.

61:55

[laughter]

61:57

For real. For real. Like all those guys

62:01

were doing something. They were all

62:02

blood doping. They were all taking EPO.

62:05

They were all They were putting motors

62:07

in their [ __ ] bikes.

62:08

>> I've seen a video of that.

62:09

>> Yeah. They were

62:11

these guys try for [ __ ] every edge

62:14

humanly possible. So, you know, he was

62:17

just a scapegoat. But what what he was

62:18

doing was he was suing people that were

62:20

saying that he did PEDs.

62:22

>> It's a smart way to silence them. But

62:24

yeah, I I [laughter]

62:26

>> I mean, sort of

62:27

>> thinking about I would be really

62:28

interested to see what the reaction is.

62:30

That's hot [ __ ] wet clay stuff,

62:32

right? Hot off the press a couple of

62:33

hours ago that it's been

62:34

>> rescinded. I think we would be more

62:36

outraged if they accepted this

62:38

transgender person as a female and then

62:42

say, "Oh, a trans woman's a woman. Let

62:44

her compete." It seems like this person

62:46

lied. And so that's different,

62:49

>> but still identify. So I agree that it's

62:53

uh reassuring to see what the world's

62:55

strongest person organization uh decided

62:58

that they were going to do in in a sort

63:01

of repercussion to it. But you you can

63:04

already predict both of us can already

63:05

predict what's going to happen online

63:07

that this person shouldn't have been

63:09

stripped of their title. Maybe they

63:10

lied, but they should be competing

63:12

inside of this. The side of the aisle

63:14

that always agrees with this. Do you not

63:16

think that they're going to be pro?

63:18

>> I think that is slowly but surely losing

63:23

traction and support. I really believe

63:26

that. I believe that's where the rubber

63:28

meets the road [sighs]

63:30

because you're going to lose most women

63:33

that have ever done a sport. You know,

63:35

if you are a sedentary woman that has no

63:38

interest whatsoever in athletic

63:40

competition and you think it's more than

63:43

a good price to pay to let biological

63:45

males who identify as women because we

63:47

want them to be exclusive, it's more

63:48

important to to recognize and affirm

63:51

their identity than it is to be fair.

63:54

You haven't done any sports. So you're

63:56

going to lose not just most of the men,

63:59

you're going to lose a lot of you're

64:01

going to lose anyone right of center

64:05

like libertarian anyone anyone you're

64:08

going to not just lose all of the right

64:10

you're going to lose a giant chunk of

64:12

the center because I think the center in

64:14

this country is probably the most

64:16

rational of all groups those are the

64:18

those are the people that recognize go

64:20

kind of a little bit of everything here

64:22

you know and right of center or left of

64:24

center you're going to lose all those

64:26

people and you're going to lose most

64:27

women. You're going to lose most women

64:29

that have gone to most women that have

64:31

daughters. You're going to lose them.

64:32

The only ones you're not going to lose

64:34

are the [ __ ] kooks. The SSRI

64:38

filled up anti-anxiety medication

64:42

transitioning happy kooks. Those [ __ ]

64:46

kooks that you know think that you you

64:48

have a hierarchy of who's oppressed the

64:51

most and trans people are people. trans

64:53

women are women and they want to scream

64:55

it out and yell. They're just crazy.

64:58

You're you're going to have those people

64:59

that aren't going to be with it no

65:01

matter what.

65:02

>> Get into the boxing ring with that trans

65:03

woman who is a woman.

65:05

>> The one that was a man that lied that

65:08

the the Olympic champion that they just

65:09

took away his gold medal.

65:12

>> Flip-flop. That story flip-flopped back

65:13

and forth like 10 time. It was like a

65:15

Christopher Nolan movie to me that

65:17

>> because that person was threatening to

65:18

sue a bunch of people, right? They were

65:20

threatening to sue a bunch of people

65:21

that called them a male, but then

65:24

>> rescinded it, made a statement.

65:25

>> Let's put this through perplexity or

65:27

something

65:29

>> where we could figure out a patent.

65:31

>> I want to know what the number is. I

65:33

want to know what what what's what's

65:35

true because what I think is there was

65:37

another organization that did a

65:39

chromosome analysis and found out this

65:42

person had an XY chromosome. So this is

65:44

specific type of disease or um genetic

65:48

abnormality where your testicles don't

65:51

descend. These cases again. So strange.

65:54

>> um ask it

65:57

did that person get their gold medal

65:59

taken away and why

66:01

>> but they well yes they did

66:03

>> right but like and why just see what I

66:05

know but let's see what it says as far

66:06

in terms of why why because they're a

66:08

man and how did they find out

66:11

>> find out how they found out because I

66:13

think the narrative is that there was

66:15

another boxing organization that had

66:17

already suspected something was up

66:19

>> did some testing

66:19

>> did some testing found out this person

66:21

has an XY chromosome use that,

66:23

>> you know.

66:23

>> So,

66:25

won the gold in the women's 66-kilogram

66:28

boxing event, uh, stripped. The now-derecognized

66:32

International Boxing Association

66:34

previously disqualified Khelif from the

66:36

2023 Women's World Championships after

66:38

she failed eligibility tests under its

66:40

own rules. Later claimed those tests

66:42

showed she was ineligible for women's

66:44

competition. Because of these tests, IBA

66:46

officials, some media, and advocacy

66:48

groups have publicly demanded the IOC

66:51

strip or reclaim her gold medal, arguing

66:54

that she could not have been allowed in

66:55

the women's should not have been allowed

66:57

in the women's category. Like, they're

66:59

still saying she. Despite those demands,

67:01

IOC has defended allowing Khelif to

67:04

compete in Paris, describing the IBA's

67:06

disqualification decision as arbitrary

67:09

and saying she met the IOC's eligibility

67:11

criteria at the time. Um, so what is the

67:14

IOC's

67:16

cred? What is their eligibility

67:19

criteria?

67:20

>> Boxing must be a [ __ ] nightmare for

67:22

this because of all of the different

67:23

organizations that exist and each one is

67:25

going to have its own different set.

67:27

They have a coordination problem here.

67:29

>> Here's an even more here's a bigger

67:30

nightmare. Prisons [laughter]

67:33

prisons have a self-identity thing. In

67:36

order to be eligible for female prisons,

67:38

there's a lot of prisons including I

67:40

believe New Jersey, California.

67:42

California has 47 biological males that

67:44

are housed in women's prisons.

67:45

>> Okay.

67:46

>> At least.

67:47

>> Are they So, who runs a prison? Is it

67:49

the state? Is it an independent

67:51

organization?

67:52

>> Some of them are independent. Some of

67:54

them are privately owned.

67:55

>> Uh chromosome test results were kept

67:58

confidential by the IBA, but were leaked

68:00

after and widely reported. The IOC

68:03

nonetheless rejected IBA's findings as

68:06

arbitrary even with the chromosome test.

68:09

[sighs]

68:09

>> That's really standing your ground. Boy,

68:11

you silly geese.

68:13

>> Well, see, this was only this was only

68:15

2024, so to say that maybe this is the

68:18

the landmark case. Maybe it wasn't Lia

68:21

Thomas as a swimmer, maybe it was

68:22

somebody in a physical sport, but I

68:23

mean, when we're talking about the

68:25

strong woman competition, dude, if

68:28

you're 6'4, I think the next tallest

68:30

woman was 5'8 or 5'7. Think about what

68:33

you're doing. You're like wrapping your

68:34

arms around. You can It always gets

68:36

slippery, right? because it's like,

68:37

well, there's not very many of them, so

68:39

why are we making such a big deal out of

68:40

it? And it's like, hey, if there's one

68:43

rapist in the local community, you don't

68:46

go, well, there's only one of them.

68:47

Like, what's the chances that you run

68:48

into? It's like, no, no, no, we go we we

68:50

try and, you know, treat this problem.

68:52

So, first off, there's not many of them.

68:54

Then, well, you know, look at what

68:56

happens when you take these estrogen,

68:59

you downregulate your testosterone. It's

69:00

below this particular level,

69:01

therapeutic, da da da da. and you go,

69:03

"Well, yeah, but it's like being on a

69:05

heavy course of steroids up until you

69:07

stopped doing that." And then how much

69:09

of that does carry over? That gets a bit

69:10

slippery. But just the size, the size of

69:14

the hands of a person who's 6'4 and 400

69:16

lb compared with a woman who's probably

69:18

like 220 and 5'8, like grip strength,

69:22

being able to do like that's pretty

69:23

important in the sport of strong women.

69:25

>> All of it is ridiculous.

69:26

>> Wrapping your arms around an atlas

69:28

stone.

69:28

>> Yeah, you could do this forever. It's

69:30

It's all ridiculous. It's ridiculous.

69:32

It's not the same. You know, it's it

69:35

doesn't mean that someone shouldn't be

69:36

able to change their name and identify

69:38

as a woman. It's just like you you can't

69:40

dominate women's sports,

69:41

>> can't dominate women's spaces. You can't

69:44

you you can't you're not a woman. You

69:46

you know, we'll call you one if we want

69:48

to be nice.

69:49

>> But the reality is there's a there's

69:51

biological sex is a real thing. And when

69:53

it comes to competition, physical

69:55

competition, there's a reason we have

69:57

title nine in America. There's a reason

69:59

why we recognize women's sports. There's

70:00

a reason why you have it set up that

70:02

women will compete against each other

70:04

because it's fair. It's not fair to make

70:06

women

70:06

>> or else you just have a unisex category.

70:09

>> Yes.

70:09

>> And it would be dominated by men.

70:11

>> Dominated by men. And then girls

70:12

wouldn't have this amazing opportunity

70:14

to get scholarships which are they're

70:16

being denied because biological males

70:19

are winning in their category because

70:21

they allow them to compete. And there's

70:23

a thing called this is people what

70:24

people don't want to believe but it's

70:26

true. It always has existed. People say,

70:29

no, they're doing this because they

70:31

really are a woman. There's a thing

70:32

called sandbagging. Okay? And

70:34

sandbagging has always existed.

70:36

Sandbagging is let's let's say that

70:38

you're going to enter into a jiu-jitsu

70:40

tournament and you're going into the

70:42

purple belt division, but you've been a

70:44

purple belt for eight years and you're

70:46

supposed to be a brown belt and they,

70:48

you know, for whatever reason you or you

70:51

could even here's a worse one. Maybe

70:53

you're a black belt in judo, like an

70:55

elite black belt, and you enter into a

70:58

jiu-jitsu tournament in the white belt

70:59

division, and you're in there with some

71:02

[ __ ] dork who's a plumber who's just

71:03

started taking classes. I think it'll be

71:05

fun to compete. And you [ __ ] flip him

71:07

on his head and break his arm and an arm

71:09

bar in like 15 seconds. Like, that's

71:11

sandbagging cuz you're an elite athlete.

71:14

you're you're like a world-class judo guy

71:17

that's just thought it would be fun to

71:20

be put a white belt on and enter into a

71:22

jiu-jitsu tournament. There's people

71:23

that do that because they just want to

71:25

win. That's why people cheat at video

71:27

games.

71:28

>> That's why people cheat at golf, right?

71:30

People cheat because they want to win.

71:31

They just want to get that W. And there

71:34

will there's people that will pretend

71:36

they're a woman to beat up women. And if

71:38

you don't think that's the case, you

71:40

haven't met enough psychos because are

71:42

there people that are in the wrong body?

71:44

I don't know. I I'll give them that

71:46

respect. I'll give them that dignity.

71:48

Are there also people that are out of

71:50

their [ __ ] mind and want an excuse to

71:52

beat up women and pretend they're a

71:54

woman? If you tell them they could wear

71:55

a dress and they could just run past all

71:58

the ladies and dominate them on the

72:00

field, yeah, they're going to do that,

72:01

too. That's a that's a real type of

72:03

human being. And if you don't have an

72:05

accurate test for that, if you don't

72:06

have a thing you make them lick, oh,

72:08

you're a [ __ ] psycho. If you don't

72:10

have that, then you have to judge each

72:13

individual situation based entirely on

72:16

why would someone do this?

72:17

>> How much crossover would there be if if

72:19

somebody was a a a black belt in judo?

72:22

How much crossover is there to

72:25

>> An immense amount. Yeah,

72:27

>> an immense an immense immense immense

72:29

immense immense amount. Especially if

72:32

it's a gi tournament. Oh my god.

72:34

You're virtually helpless. Helpless.

72:37

>> Even though judo is primarily done on

72:39

the feet.

72:40

>> It is. But they do arm bars. They do.

72:43

Look at Ronda Rousey. She's one of the

72:45

best arm bars in the history of the

72:47

sport. Look at Kayla Harrison. Look at

72:49

all these. Look at Karo Parisyan.

72:51

There's elite judo people that were

72:53

wizards at arm bars. Wizards at chokes

72:56

and leg locks. And of course, they're

72:59

they're submitting each other as well.

73:01

It's not exactly the same. And if they

73:04

went like gi to gi with, you know,

73:07

some prime Leo Vieira black belt, you

73:11

know, a gi master, you know, one of

73:15

those guys, you'd likely give the jiu-jitsu

73:17

person a giant advantage because they'd

73:20

spend way more time submitting people.

73:22

They spend way more time working on

73:24

submissions. So judo to jiu-jitsu in a

73:27

tournament I would say black belt to

73:28

black belt they probably have a

73:30

disadvantage in judo but a huge

73:33

advantage over a white belt.

73:35

>> What do you think about Jake Paul

73:36

Anthony Joshua?

73:38

>> Boy

73:41

um

73:43

well realistically

73:47

it's one of the craziest propositions of

73:49

all time. You take a guy who just had a

73:53

boxing match that looks like a sparring

73:54

match with a 58-year-old Mike Tyson and

73:58

then you're gonna fight one of the

74:01

absolute scariest knockout artists in

74:03

[laughter] the heavyweight division.

74:06

Maybe we should watch the Francis Ngannou

74:08

fight so you could see. Let's watch that

74:09

real quick just so you can see what

74:11

Anthony Joshua is capable if he's

74:13

fighting someone that's not in his

74:15

league. Okay, look. Usyk beat him and he

74:20

beat him twice and Andy Ruiz caught him

74:23

in the first fight and and dropped him

74:25

and stopped him. It was spectacular.

74:27

Andy Ruiz is super [ __ ] talented.

74:29

Usyk is perhaps the greatest heavyweight

74:32

boxer of all time. Maybe one of the

74:34

maybe maybe one of the greatest of all

74:36

time in any weight class. Usyk, you

74:38

know, and Usyk beat him and he beat him

74:40

twice. But Francis Ngannou is coming off

74:45

of this fight with like go a little bit

74:47

before that so we can see this happen.

74:49

>> Watch this

74:50

>> highlights. We can't watch the whole

74:51

fight.

74:51

>> So he he drops him with a right hand

74:53

early and uh this is like uh two minutes

74:56

into the first round and Francis gets

74:59

up. He survives

75:02

and then Joshua

75:04

check out this

75:06

this combination he hits him with.

75:10

I mean, dude, the speed that he hits him

75:13

with this,

75:16

he's so dangerous, man. It's like you're

75:19

dealing with a guy who's an Olympic gold

75:21

medalist and he's enormous and he's got

75:25

vicious knockout power and he's got

75:27

immense amount of experience at world

75:29

class levels. Just think about what we

75:31

said earlier. Fought Usyk twice, fought

75:34

Andy Ruiz twice.

75:39

Oh man. Bro, the timing in that right

75:42

hand just spectacular.

75:45

>> Spectacular over the top. I mean, that

75:48

was a full force shot to the temple. I

75:51

mean, he's he's fucksville right now.

75:53

>> So, they wipe off his gloves, but you

75:56

look at him like he's he's really

75:58

feeling it right now. I mean, he

76:00

probably has no idea where he is. And

76:01

Anthony Joshua,

76:03

>> Oh my god, absolutely folded in half.

76:05

>> Watch the back that up again. Watch

76:06

this. I mean, just steps into it with

76:10

every ounce of his body.

76:13

Perfect right hand. So, the fact that

76:16

Jake Paul wants to fight that guy. Hey,

76:19

I'll watch. [laughter] I'm going to

76:22

watch. I'm definitely going to watch.

76:23

So, you got me there. And if you want to

76:26

show you're legit uh by taking on one of

76:29

the scariest [ __ ] heavyweights alive,

76:31

>> can you get the tail of the tape of the

76:32

of uh Paul and Joshua? I was going to

76:35

say they got him to agree to weigh in at

76:36

245. That's only like seven

76:38

pounds less than

76:39

>> Yeah, that's nothing. But and there's

76:41

some sort of a rehydration clause.

76:42

Listen kids, it ain't going to matter.

76:44

You know, there's not a chance that

76:45

Anthony Joshua is not going to just lose

76:48

the weight beforehand. He's not going to

76:50

come in drained. What he's going to do

76:51

is just do extra cardio and that's just

76:54

going to make him more dangerous. He's

76:56

going to be terrifying and he's going to

76:58

have a lot to prove. He's going to be

76:59

very angry that Jake Paul wants to fight

77:01

him. very upset that this YouTuber who's

77:05

fought Tommy Fury, who's a legit boxer,

77:08

and you know, a couple other guys that

77:09

were legit boxers. That's it. Like

77:11

everyone else he's fought, he's fought

77:13

Ben Askren, who was really a wrestler.

77:15

You know, he fought Tyron Woodley, who

77:17

was an elite MMA fighter, but you know,

77:19

not an elite boxer. He fought Nate

77:21

Robinson, who was a basketball player.

77:23

He's fought these guys. He fought

77:24

Anderson Silva, and he dropped Anderson

77:25

Silva, and Anderson Silva is a really

77:27

good striker, but also in his 40s, you

77:30

know, different time. It's, you know,

77:32

not the same guy he used to be. This is

77:34

this is a 34 year old Anthony Joshua.

77:37

This is a terrifying human being.

77:39

Terrifying.

77:40

Again, a guy who survived Usyk twice.

77:43

You know, you saw what Usyk did to

77:44

Dubois. You see Usyk take out Dubois.

77:47

Did you see that? I That's the Usyk

77:49

you're talking about. There's a Usyk

77:50

that rocked um Tyson Fury who's [ __ ]

77:53

6'9".

77:54

>> So Jake Paul's 6'1 versus 6'6. Anthony

77:58

Joshua. Jake weighed in for the Tyson

78:00

fight at 199. Joshua [laughter] against

78:04

>> 252. 6'6". Not just 6'6", but 6'6" and knows

78:10

how to use every [ __ ] inch of it.

78:12

Knows how to keep that stick in your

78:14

face. He'll keep that jab in your face

78:16

and that right hand if it hits you,

78:19

you're [ __ ] And he's not worried

78:22

about you the way he's worried about

78:24

Usyk. You can't move like Usyk, can't

78:26

constantly be frustrating and

78:28

overloading his nervous system. Usyk is

78:31

overloading every aspect of your senses

78:34

at every moment. He's constantly moving

78:36

and then punches are coming and he loops

78:38

punches around your guard and he's

78:40

constantly shifting his feet and you

78:42

think he's going to be there and he's

78:43

over here and it's like this overload of

78:47

thinking. It's not a casual relaxed

78:50

fight where you can kind of move around

78:52

and get your groove and he's going to

78:53

stay on the outside and you're going to

78:55

No, it's just constant. He survived that

78:57

guy twice. He survived, in my opinion,

79:00

the most skillful heavyweight of all

79:01

time twice.

79:03

And you're going to go boxing. And the

79:06

toughest guy you fought before was 40

79:08

years old, Anderson Silva. That was the

79:09

toughest guy so far you fought. You've

79:11

lost to Tommy Fury, who's a very good

79:13

boxer, but this is a giant Olympic gold

79:17

medalist heavyweight. I mean, Anthony

79:20

Joshua's the [ __ ] thing that nightmares

79:22

are made of

79:22

>> and he's got that one punch nuclear

79:25

power. One punch and he's fast. It's an

79:29

explo like there's certain guys that

79:31

like in kickboxing couldn't translate

79:34

over to MMA because they didn't have the

79:36

kind of speed. Like Peter Arts is a good

79:39

example. He was a world-class kickboxer,

79:43

one of the best of all time, but didn't

79:45

have the style that would allow him to

79:47

translate. But then there was Mirko Cro Cop.

79:49

Mirko Cro Cop, who was violently

79:52

explosive, perfectly transitioned to

79:54

MMA because you got to be able to hit

79:56

people quick. It was like a big part of

79:58

it is speed. Anthony Joshua has that

80:01

kind of speed that it's kind of

80:04

>> 252 pounds. You don't have the skill to

80:06

get away from that kind of power. That's what

80:09

happened with Francis Ngannou and Anthony

80:11

Joshua. You have to be very skilled. You

80:14

can't judge that guy based on Dubois

80:16

who's a [ __ ] murderer. Daniel Dubois

80:18

is a tank and he took out Joshua. But

80:20

that guy's [ __ ] terrifying. You're

80:22

standing in front of that guy, but Usyk

80:24

didn't stand in front of him. Usyk moved

80:26

all over the place.

80:28

>> Joshua's going to have a lot to

80:30

prove. He's going to be very angry. Do

80:32

you think they'll let everybody take the

80:33

brakes off? Because there's all rumors

80:35

about Tyson versus Jake that both of

80:38

them were sort of pulling punches and

80:40

not fully letting it go.

80:41

>> I think that's a different deal. You

80:43

know,

80:44

>> do you think there was something

80:45

probably just below the table?

80:46

>> I do not know. I do not know if it was

80:49

said. I do not know if it was

80:51

understood. I do not know.

80:52

>> In your professional opinion, based on

80:54

what you saw, do you think that the

80:55

people were holding back? It definitely

80:57

looked like sparring,

80:59

but it could be that he didn't want to

81:01

hurt Mike Tyson because Mike Tyson's 58

81:03

years old. Or it could be that Mike

81:05

Tyson didn't want to hurt him cuz he

81:06

likes him. I don't [ __ ] know

81:09

>> you.

81:09

>> But it wasn't what I was tuning in for.

81:12

>> It was not for me. I was there. I went I

81:14

went to it live.

81:14

>> I was tuning in for Mike Tyson coming

81:16

full 1988 Mike Tyson full chaos. That's

81:20

what I was hoping for.

81:21

>> He walked out like that.

81:22

>> It looked like it. Yeah. But that's what

81:24

everybody signed up for. So, they got us

81:26

whatever.

81:26

>> And do you think that

81:28

>> this is different? I don't think this is

81:29

that. I don't think this is that at all.

81:31

First of all, it can't be that because

81:33

Joshua is still competitive in the

81:34

heavyweight division and he's only doing

81:36

this for money. Like, he's still set up

81:38

for world title fights. After he knocked

81:41

out Ngannou, you could still set him up

81:42

like Joseph Parker just lost. You could

81:44

set him up with Joseph Parker. You could

81:46

have until a year ago, he could fight

81:48

Deontay Wilder. You're saying that the

81:49

lineage and the trajectory that Anthony

81:52

Joshua is on, if he happens to go a

81:54

little bit too gentle and lose by

81:55

decision to Jake Paul, it doesn't

81:57

exactly look great for his future as a

81:58

heavyweight champion.

81:59

>> It [ __ ] up all of his marketing

82:00

opportunity.

82:01

>> Wow. So that's a really So what we said

82:03

before,

82:03

>> the Ngannou fight is a godsend to him,

82:05

right? The Ngannou fight is like, hey,

82:06

boxing's back. This guy knocked down

82:08

Tyson Fury. This is how it was supposed

82:10

to go. Anthony Joshua, you carried the

82:12

torch for the boxing community. Because

82:14

I know a lot of like straightup boxers

82:15

and they absolutely felt that way.

82:18

Yeah. Like this is what needed to

82:19

happen. These guys can't come over from

82:21

MMA and think they can box the best.

82:23

>> Yeah. You need to put them in their

82:24

place. It's uh what's great there and it

82:26

loops back to what we were talking about

82:28

before is incentives. Incentives. Align

82:30

the incentives. Yeah.

82:31

>> Like if you've got Joshua's I mean this

82:33

is

82:34

>> however I should I should

82:36

>> caveat.

82:37

>> Yeah. Here's the caveat. This might earn

82:40

him $200 million.

82:42

So if it earns him so much money,

82:44

>> Joshua or Jake Paul

82:46

>> Joshua like either one which How much

82:48

money is $200 million?

82:50

>> Oh dude, this is a Saudi organization,

82:53

right? This is Riyadh Season, isn't

82:55

it? That was putting this on

82:57

>> probably. They seem to own everything. I

82:58

think they own me now. And you and Jamie

83:00

and K.

83:01

>> It's Netflix, right?

83:02

>> Right. Yeah,

83:03

>> this is going to be on Netflix. Okay. So

83:05

I don't know. Maybe it's not, maybe Riyadh

83:07

Season's not involved, but the money

83:09

they threw Canelo Alvarez to get him to

83:11

fight Terrence Crawford. This is like

83:13

they're throwing insane money. They're

83:15

throwing nutty sums of cash at people to

83:18

make amazing fights happen. Like this is

83:20

this has always been the hiccup in

83:22

boxing is that people don't want to

83:23

fight certain people because they want

83:24

to protect their record. The Saudis are

83:26

like, "How much?"

83:28

>> Everybody's got a price.

83:28

>> Yeah. Everyone's got a price

83:29

>> and we've got the bank account to pay

83:30

it.

83:31

>> So here it is. The reported total prize

83:33

purse for Jake Paul versus

83:35

Anthony Joshua is 184 million with an

83:39

even split expected meaning each fighter

83:41

will earn approximately 92

83:43

million. Some reports initially

83:45

suggested a different figure; 184 is the

83:48

most frequently cited total from sources

83:50

like Daily Mail and Wikipedia. Okay,

83:53

that doesn't mean anything. Uh some have

83:55

also mentioned Jake Paul's cryptic $267

83:57

million tweet which may have fueled

83:59

rumors. I'd listen, it really depends on

84:02

who's setting it up. Netflix doesn't

84:05

have to tell you how much they're

84:06

they're paying, but the thing about

84:09

Anthony Joshua, if he loses this, if he

84:11

So, let's say he's only getting the 92

84:13

million, which I bet he's getting more.

84:14

Let's say he's getting 92 million. If he

84:17

loses this fight, he misses out on that

84:19

Saudi money because they could set up a

84:22

Tyson Fury Anthony Joshua fight and each

84:25

one of them gets $200 million. You can

84:28

you could do a fight like that. The

84:29

Saudis can do a fight like that. They

84:31

can do a fight. They have enough

84:33

resources to throw at boxing or they

84:36

could change the entire landscape of

84:37

boxing.

84:39

>> If you were the guy that stands in

84:40

between 6'6", 250 lb Anthony Joshua and

84:44

$200 million. [laughter]

84:46

>> Yeah.

84:47

>> I'm sorry. I'm sorry.

84:48

>> He's not going to I don't think he's

84:49

going to lose on purpose.

84:51

>> I don't want to be that guy.

84:53

>> But I'm not saying that anybody lost to

84:54

anybody on purpose. I don't think that's

84:56

happened. But what I do think is that

85:00

people take it easier on people if they

85:02

like them. And it looked like they were

85:04

taking it easier on each other than you

85:05

would expect. I'll just say that. That's

85:07

just my [clears throat] personal

85:08

opinion.

85:09

>> I don't think that's going to happen

85:10

with this fight. I don't think there's

85:11

any chance in the world knowing what

85:13

Anthony Joshua is a specialist at. He's

85:16

a specialist at putting knuckles through

85:18

your [ __ ] brain, you know, and that's

85:20

what he's going to try to do to Jake

85:22

Paul. Anything other than that from a

85:25

34-year-old Anthony Joshua will make us all

85:27

think it's a fixed fight. Whether or not

85:29

Josh can do it, whether or not I mean

85:33

Jake Paul shocks the world and shows us

85:35

that he really does know how to box

85:36

really well and moves really good and

85:37

and uses his jab and blows us all away

85:40

with a strategy and a lot of footwork

85:42

and movement and brings Usyk into his

85:44

camp and

85:45

>> or Lomachenko's dad even better who's

85:47

the guy who trained Usyk. He trained

85:50

Usyk as well. Lomachenko's father. That's

85:53

why they both are the best moving

85:54

fighters in in this generation by far.

85:57

By far. They're they're in a group of

86:00

the greatest of all time like Willie Pep

86:01

and Pernell Whitaker. There's like a

86:03

group of like defensive wizards that

86:05

exist today that are they're in that

86:07

group. And two of them that exist in

86:09

that group are trained by the same guy.

86:12

Lomachenko and Usyk.

86:15

>> I don't want to be Jake Paul. That's

86:16

what I know that I do not want to be.

86:19

>> But what better way to show the world

86:20

you're legit? Go get knocked out by

86:22

Olympic gold medalist, former world

86:24

heavyweight champion, 6'6, 250

86:27

[laughter] lbs.

86:27

>> Yeah. To win.

86:28

>> Show the world you're you're in it to

86:30

win it.

86:30

>> You're definitely not [ __ ] about. I

86:32

had this guy on my podcast, Bugzy

86:33

Malone. So, he's a British uh grime

86:35

artist. And uh he had this home.

86:37

>> What's a grime artist?

86:38

>> Like a like drill rap. Like British rap.

86:40

>> Oh, okay. Did you know what that means?

86:42

>> You did.

86:43

>> Damn.

86:44

>> Keep up with the times.

86:45

>> I can't. It's too late. I missed it. I

86:47

missed everything. He grows up in the

86:49

north of the UK in gangs Manchester and

86:52

uh he's in juvenile detention as a

86:56

teenager. He gets stabbed with a

86:57

screwdriver. Like rough stuff, rough

86:59

rough northern stuff. Uh but some part

87:01

of his upbringing just sort of really

87:03

compels him to try and bring himself out

87:06

of this situation. Starts making music,

87:08

gets super successful, does this fire in

87:10

the booth with Charlie Sloth that gets

87:11

like 35 million plays. And he starts

87:13

boxing. Boxing is like one of his um

87:17

salvations. It's one of his safe havens

87:18

and it's the thing, one of the things

87:20

that's kept him very disciplined

87:21

throughout his whole life. So, he starts

87:22

accumulating some money. He buys a nice

87:25

house in Manchester. Very, very nice

87:27

house. And the local kids nearby sort of

87:30

starting to take a little bit of notice.

87:31

Maybe they know who he is as an artist

87:33

and word starts to get around that he's

87:35

living there. There'd been some

87:37

concerns, some security concerns for a

87:38

little while. And uh he gets a phone

87:41

call from his girlfriend at the time.

87:43

She says, "There's some men here.

87:45

They're trying to break in and they're

87:46

in a van." And he, as she's on the

87:48

phone, he hears the glass shatter of

87:50

this house. His mom's in the house and

87:52

his girlfriend at the time is in the

87:54

house. He's driving around. He's got his

87:55

sister in the car. So, he drives back in

87:57

the car. This is a guy who's like world

88:00

famous as a rapper, right? This would be

88:01

like it happening to like the British 50

88:03

Cent or the British Jay-Z or P Diddy or

88:05

something like that. Drives back getting

88:07

down the driveway toward this house.

88:09

There's a blockade. There's boulders

88:10

that have been laid out in front. So, he

88:12

knows that there's going to be an ambush

88:13

of some kind and he sees this guy in the

88:15

bushes on the right with a brick. This

88:18

guy's hiding in the bushes waiting and

88:19

he thinks he's going to throw it through

88:20

the window, but he doesn't. He wants to

88:21

hit him with the brick. So, Bugzy stops

88:23

the car, opens the door, and immediately

88:26

he's he's massively into Jordan

88:27

Peterson, personal development,

88:29

self-growth. It's like a odd blend of

88:31

rough upbringing, self-discipline, and

88:33

sort of transcendent personal growth.

88:35

And he gets out of the car and points at

88:36

the guy and he goes, "No way. Is that

88:38

you? Is that a blue t-shirt?" And the

88:40

guy's like, and as he's doing it,

88:42

because he's been training so much, he's

88:43

coming toward him, distracting him the

88:45

same way as I go, "What's on that

88:46

t-shirt there?" Immediately you go, and

88:49

before he knew it, Bugzy's hit him, spun

88:51

him around, bricks fallen out of his

88:52

hand because this guy hasn't set his

88:54

feet in time. It's a problem of having a

88:55

big weapon. Bugzy said like, "You need

88:57

to set yourself and you need to be able

88:58

to throw it." Like, it's good because it

89:00

can hurt someone, but it's slow and it's

89:02

cumbersome and you can't move as fast.

89:03

And he's training every day. every

89:06

single day, no matter whether he's

89:07

rapping, he's on tour, he's training,

89:08

and he's boxing and he's fighting, and

89:09

he's sharp. He knows his distance, hits

89:11

this guy, they have a scrap, Bugzy wins,

89:14

moves the stuff out of the way, gets

89:15

back in the car, drives in. Jamie, can

89:17

you just CCTV search uh search Bugzy

89:21

Malone CCTV? So, there's footage from

89:23

his house of when he pulls up in the

89:25

Mercedes

89:27

and um so

89:30

go back back a little bit. Yeah, just to

89:33

the start. So, this is him pulling in in

89:36

his car, having just beaten someone up.

89:39

This is a van filled with guys, gets out

89:41

of the car, pulls his top off,

89:44

and then sprints

89:47

to go and get the rest of the guys that

89:49

that are waiting outside. That is not

89:51

the behavior of a dude who gives a

89:53

single [ __ ] This is the British Jay-Z

89:56

ripping his top off and then sprinting

89:58

out to try and chase people away. The

90:00

real kicker of this, there was like tons

90:01

of guys, not in that van, but in some

90:03

other van behind. The real kicker was

90:05

the dudes that he fought, they pressed

90:07

charges.

90:09

They press charges against him

90:11

>> because he's rich.

90:13

>> They press charges because he [ __ ] him

90:14

up. [laughter]

90:17

>> And then at [sighs and gasps] at the

90:19

>> Oh, they pressed charges. They pressed

90:21

charges.

90:22

>> Did they actually wind up going to court

90:24

over this?

90:25

>> Yeah. Yeah. Yeah. Yeah.

90:26

>> No.

90:26

>> Went to court. And this is it was so

90:28

brilliant. and he told this story to me

90:30

and he said uh it was the middle of

90:33

COVID and people weren't sure whether

90:34

the venues were going to be open and he

90:36

had this tour this tour was going on but

90:37

it wasn't selling as well no tours were

90:40

selling as well as he would have liked

90:41

so he spoke to his uh lawyer before his

90:44

lawyer went to go and do the not guilty

90:45

verdict and they had two statements that

90:47

were ready he came out he said very

90:49

pleased to say that Aaron Davis has

90:51

been uh acquitted today he's not being

90:54

found guilty uh he is now getting back

90:56

to preparing for his upcoming tour and

90:58

tickets are available now at and

91:00

[laughter]

91:02

he had his lawyer do a

91:06

midroll ad read

91:07

>> it's a

91:08

>> for his tour as part of his not guilty

91:10

verdict having just beaten up like a

91:12

van filled with blokes, one of whom looked like

91:14

a plumber. It was your plumber comment

91:16

that got me thinking about it like just

91:17

some white belt that decided, you know,

91:19

some guy that thinks he's a bit hard

91:20

>> like he's had a little bit of a thrower

91:22

and this guy's training every single day

91:24

sharpening his skills and he's been

91:25

doing it since he was a kid

91:26

>> that's hilarious

91:27

>> and he's dangerous and he's nasty It's

91:28

wonderful when a story like that works

91:30

out. In America, people have guns. It's

91:33

a different different sport.

91:35

>> Have you looked at uh appropriate force

91:37

in the UK? Do you know what that is? The

91:39

use of appropriate force.

91:41

>> There's a lot of that in America as well

91:43

depending state by state. They have

91:45

different there's different standards

91:48

that different states impose. Like

91:49

Florida, stand your ground. Florida you

91:51

just get away with killing people.

91:53

>> Um California, it's very different. They

91:56

were actually trying to pass a thing in

91:57

California saying it's your obligation

91:59

to leave your house if someone breaks

92:00

into it.

92:01

>> Uh I don't know if that got through.

92:03

>> You It's your obligation to not shoot

92:05

them. Um that you can't you can't harm

92:08

them because they're just trying to

92:09

steal something. They're not trying to

92:10

harm you.

92:12

>> Like the assumption that

92:13

>> they're not trying to harm you.

92:14

>> Exactly. It's I I've had this

92:16

conversation with people on the podcast

92:18

with actually with Tommy Chong. It was a

92:20

mind-numbing conversation that you know

92:23

you should not think of this person as

92:26

trying to attack you that their life is

92:29

not less valuable than yours. It's just

92:31

as valuable as your life. You shouldn't

92:33

take their life

92:34

>> despite the fact that they're on your

92:35

property.

92:35

>> Despite Despite the fight that back the

92:38

fact I can't talk despite the fact

92:40

rather that historically a lot of people

92:42

have broken into people's houses and

92:43

killed them. It's happened over and over

92:44

and over again. you're just assuming

92:46

that this time is going to be different

92:47

cuz they just want your watch or

92:49

whatever. Like [ __ ] off. Like this is

92:51

that's a dumb way to live. Like you you

92:53

have to be able to protect yourself.

92:54

Like there's crazy people. That's a real

92:55

thing.

92:56

>> Yeah. I think the the appropriate force

92:59

thing becomes interesting in the UK

93:00

where

93:01

>> you don't have as many guns

93:03

>> because there's more levels of weapon in

93:06

between

93:07

>> nothing, just hands. This guy's got a

93:09

brick. Yeah. This guy's got a brick. So

93:11

you're allowed a brick. But if you bring

93:13

a gun to a knife fight, that's not

93:16

appropriate force.

93:17

>> Oh, you know what I mean? Had a knife.

93:19

>> Yes.

93:20

It's [laughter] like I don't know. It's

93:22

very gentlemanly.

93:23

>> Oh god.

93:25

So stupid.

93:26

>> Well, the UK's got like some odd uh um

93:30

archaic laws. Like the distance between

93:32

the front benches in the House of

93:34

Commons is the same as two broadswords

93:36

held out at arms length, [laughter]

93:39

which is just so funny. Well, that's

93:42

also why you guys drive on the other

93:43

side of the road, right?

93:44

>> Why?

93:45

>> I think you drive on the left side of

93:46

the road so you could use your right arm

93:47

to slash each other.

93:49

>> No way.

93:49

>> Sword. Yeah, I believe that's what it

93:50

is.

93:51

>> What? In case you were jousting in a

93:53

vehicle.

93:54

>> Someone If you're on a horse or if

93:56

you're in a car someone's You want to be

93:58

able to get them on that side. That's a

93:59

strong side.

94:01

[laughter]

94:02

>> Someone told me that when I was over

94:03

there. I hope I'm not I'm not incorrect.

94:05

>> I like it as a story. Whether it's right

94:06

or wrong, I don't care. Uh there's a

94:08

reason that women's shirts button from

94:10

the left and not the right. Have you

94:12

ever accidentally put your wife's hoodie

94:13

on instead? It goes in the

94:16

>> Middle Ages. You knew you were going to

94:18

meet when traveling on horseback. Most

94:19

people are right-handed. So if a

94:21

stranger passed the right of you, you're

94:22

right-handed, be free to use your sword

94:24

if required. Yeah. That's why you guys

94:26

do it with your cars.

94:26

>> Well, this is the problem. If you don't

94:28

have a medieval country like ours, you

94:30

end up driving on the other side of the

94:32

road. But uh yeah, so um women's shirts,

94:34

if you've ever accidentally put your

94:36

wife's hoodie on or something, zipped it

94:37

up, women's shirts button from the other

94:40

side.

94:40

>> They button from the left, not the

94:42

right.

94:43

>> The reason for that is that when buttons

94:44

were first introduced in the 1700s,

94:47

>> they were mostly for the aristocracy.

94:49

And the aristocratic women were

94:51

dressed by mostly right-handed servants.

94:54

>> Whoa.

94:55

>> So they dressed them this way. So the

94:57

women's shirts button, if you put a

94:59

>> still to this day,

95:00

>> same thing, dude. I promise you now,

95:02

anybody that's watching, any guy that's

95:04

watching, go and put your wife's shirt on.

95:05

This is how it begins. Go and put your

95:06

wife's shirt on and see. It doesn't fold

95:08

that way.

95:09

>> It folds the other way.

95:11

>> And you have to push the buttons through

95:12

with your left hand.

95:14

>> How [ __ ] cool is that?

95:15

>> That's crazy.

95:16

>> And it's the same with hoodies. You

95:18

know, we zip our hoodies with our right

95:19

hand. Girls zip their hoodies with their

95:20

left hand.

95:21

>> Oh, wow.

95:24

>> So [ __ ] cool. weird.

95:26

>> One [laughter]

95:27

one other element is um the gentlemen of

95:30

the day uh they would have a sword on

95:32

the left hip drawn by the right hand,

95:35

the way that our shirts are put together

95:36

at the moment. It can't get caught in

95:38

the folds because the left fold is over

95:41

the top of the right. So as you draw it,

95:43

there's no chance that the hilt

95:44

>> would get caught.

95:45

>> So if you're a left-handed person, you

95:46

have to wear women's clothes.

95:48

>> That might actually explain more than

95:50

you think.

95:51

>> Probably.

95:52

Uh so this is an example of uh path

95:55

dependency. So what you're talking about

95:56

like some [ __ ] from the past

95:58

>> that influences the future. Uh QWERTY

96:00

keyboards, right?

96:02

>> Same thing.

96:02

>> Yeah, I know that one.

96:03

>> Typewriters. Yeah. So it was made to be

96:06

inefficient to slow people down. And if

96:08

you take a normal typist from a QWERTY

96:10

keyboard and put them on some other

96:12

layout that's available, they're like

96:14

50 to 70% faster. So, we're still using a

96:18

designed-to-be-inefficient keyboard

96:21

because if you type too quickly on a

96:23

typewriter and you use letters that are

96:25

close together, the typewriter jams. So,

96:27

the letters that were used most

96:28

frequently were put out onto the edges

96:30

and it wasn't it was less often that you

96:31

were going to put two next to each other

96:33

so they wouldn't jam.

96:34

>> I don't know a single person who

96:35

switched to a different type of

96:36

keyboard. Do you?

96:38

>> No. No one.

96:40

>> Lex Fridman's got some like weird super

96:42

nerd.

96:42

>> Oh, but his is just separated. He's just

96:44

got it separated. still

96:46

>> like this.

96:47

>> Yeah,

96:48

>> it's almost like a

96:50

>> Yeah, that's kind of interesting. But

96:51

that's not the point. The point is the

96:52

layout of the keys in a regular

96:54

keyboard. There's other layouts. So

96:56

it's not just

96:57

QWERTY that's available. You could

96:59

actually buy keyboards that have the

97:00

most efficient layout. I forget what the

97:02

name of it is.

97:03

>> I think it might be the dactyl thing.

97:04

Hot swap dactyl.

97:05

>> I think that's the very top.

97:06

>> I think that's it.

97:08

>> Seeing I think

97:09

>> up and right.

97:10

>> Yeah.

97:11

>> Yeah. Hot swap dactyl. Maybe it's just

97:14

for sale though.

97:16

>> What is

97:16

>> It's still a QWERTY. I can't get away

97:18

from it.

97:19

>> So there's

97:20

>> other layouts.

97:21

>> If you could search what styles of key,

97:24

what is the most efficient layout of

97:26

keys for typing speed?

97:28

>> That's what I did.

97:29

>> Yeah.

97:29

>> Yeah.

97:30

>> That's what this this what's coming up.

97:32

This [ __ ] is way faster than typing.

97:34

>> Okay. Yeah. Right. Right. Right. But

97:36

that's a different um that's a different

97:38

thing.

97:39

>> I typed fastest keyboard for

97:41

typing. It comes up.

97:43

>> You can type Hold on. Let's look at that

97:45

for a second.

97:46

>> You haven't seen this before?

97:46

>> No.

97:47

>> Yeah. It's like the directions they they

97:49

I'll show you on a demo.

97:52

>> Wow.

97:53

>> Car.

97:54

>> This is crazy.

97:54

>> And each one of those is a letter and

97:56

some of them you

97:58

>> can like make words real fast.

98:05

>> This So, this is a what we're talking

98:06

about right here is a totally different

98:08

device than a keyboard. But uh what I

98:10

mean is like there's another keyboard

98:12

layout that super nerds use

98:15

>> like a tiny amount like the kind of

98:16

people that have like they they have uh

98:19

those Google phones that don't connect

98:21

to the servers, you know.

98:23

>> Erik Prince makes those.

98:25

>> No, that's a different one.

98:26

>> Okay.

98:26

>> He's got his own.

98:28

>> Yeah. I uh just that path dependency

98:30

thing like [ __ ] from the past that's

98:31

still influencing us now. Yeah.

98:32

>> Why your shirt is going in the other

98:34

direction.

98:35

>> That's pretty crazy. Oh, so here it is.

98:37

is a new class of peripheral device that

98:40

allows ordinary people to type at the

98:42

speed of thought.

98:43

>> Whoa.

98:46

>> Everything can

98:50

coding, gaming, designing, or just

98:53

typing, whatever you do, do it at the

98:55

speed of thought.

98:57

>> H

98:58

I wonder how much of a learning curve

99:01

there is to figuring out how to type

99:04

with that thing, cuz it looks pretty

99:05

dope. Oh, they have different ones.

99:07

Scroll up to that image at the top.

99:09

That's a different one.

99:11

>> I think it's the same. It's just uh

99:13

>> but shaped different.

99:14

>> Yeah, just like made out of metal,

99:15

>> right? But it's a different shape.

99:17

>> You probably put your hand Well, maybe

99:19

put your hands on it the same way.

99:20

>> It's very different. The other one was

99:22

curved.

99:22

>> The problem that you have is like,

99:23

>> is this the new one? The forge? The

99:26

master forge? Let me see what you got

99:27

here.

99:28

>> It's not showing anybody use it. That's

99:29

why I was trying to find a good video of somebody using it

99:31

to show you how they type like words

99:32

really fast. I think it's a matter of

99:34

time before you're typing with your

99:35

brain anyway.

99:37

>> I think this is like learning to code.

99:39

>> Yeah. Well, I think about this with

99:40

prompt engineering. Like if AI gets

99:43

progressively better and better, the

99:46

idea of being a prompt engineer, I

99:48

understand how to get the AI to do what

99:49

I want is a job that only shortly after

99:52

it becomes a job

99:54

>> might be made completely obsolete.

99:56

>> 100%. Yeah. Yeah. That that's not going

99:58

to work.

100:00

That's like opening up a Blockbuster

100:02

video in 1999.

100:04

>> It's like it's too late. You have so

100:06

little time.

100:07

>> Well, the problem that you have with the

100:09

QWERTY keyboard thing is it's a

100:10

coordination problem. Like if you want

100:11

to borrow your friend's laptop, you're

100:13

back unless everybody decides we're

100:15

going to switch to the better type of

100:16

keyboard and we're going to do it now.

100:18

>> There you go.

100:18

>> Oh, here he goes. He's moving.

100:19

>> He's typing it right there. That's he's

100:21

typing these words as he's

100:22

>> looking at the screen.

100:23

>> Holy [ __ ]

100:23

>> How's he doing that? That guy's a

100:25

[ __ ]

100:25

>> up to like 300 words a minute. I think

100:27

people can get to.

100:29

>> And but here's the question. Like how do

100:32

you learn? Do you have to play a game?

100:34

You ever do that? Like Mavis Beacon's

100:36

type typing. You ever do that?

100:37

>> No. What was that?

100:38

>> It's fun. It's a game you play. It

100:39

teaches you how to type.

100:41

>> Teaches you type.

100:42

>> Yeah. Yeah. Yeah. You like type things

100:44

that they tell you to type. They time

100:45

you like a race. It's like fun.

100:47

>> See, he's like hitting all these letters

100:49

at once, I think, with his fingers. You

100:52

can see them popping up and then it

100:53

creates the word. I think it's a little

100:54

bit of mixture of like remember the T9

100:56

typing you could do on your phone.

100:58

>> Yeah.

100:59

>> And you could like you could hit four

101:00

four numbers and you you know what word

101:02

it would be and if it wasn't that word

101:03

you'd hit next like three times.

101:05

>> You could get really good at that. I

101:06

think it's a little I think it's

101:07

predictive.
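
[The T9 scheme being described here, where one digit press per letter plus a "next" key cycles through matching dictionary words, can be sketched in a few lines of Python. This is a minimal illustration with a made-up word list, not the real T9 lexicon:]

```python
# Minimal sketch of T9-style predictive text entry: each digit key
# covers several letters, and a digit sequence is matched against a
# dictionary of candidate words. The "next" key in real T9 cycles
# through the candidates list below. Word list is a toy example.

T9_KEYS = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

# Invert the key map: each letter -> the digit you press for it.
LETTER_TO_DIGIT = {c: d for d, cs in T9_KEYS.items() for c in cs}

def word_to_digits(word: str) -> str:
    """Digit sequence you'd press to enter this word."""
    return "".join(LETTER_TO_DIGIT[c] for c in word.lower())

def candidates(digits: str, dictionary: list[str]) -> list[str]:
    """All dictionary words matching the digit sequence."""
    return [w for w in dictionary if word_to_digits(w) == digits]

words = ["home", "gone", "good", "hood", "hoof"]
print(word_to_digits("home"))      # -> 4663
print(candidates("4663", words))   # all five toy words share 4663
```

[This ambiguity, several words per digit sequence, is exactly why the "hit next three times" step existed.]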

101:08

>> Could you rewind that again so I could

101:09

see him doing that? Is that Can you give

101:11

me some volume so I could hear what he's

101:12

saying?

101:14

>> There's no question that typing

101:15

sentences at over 200 words per minute

101:17

is extremely satisfying. But does typing

101:21

fast actually transfer to productivity

101:23

in the real world? That's the question

101:26

we'll be answering together in today's

101:28

video. Does typing speed really matter?

101:32

>> That's nuts.

101:33

>> Wow.

101:34

>> He just did that. Wow. And he made butt

101:38

large.

101:39

>> Yeah.

101:40

>> Like he he made it all caps. Like how

101:43

>> I suppose this is kind of

101:44

>> I need to know if that dude's Rainman.

101:45

You know what I'm saying? I need I need

101:47

to ask him some questions about math.

101:49

Yeah, I uh it's mad to think how quickly

101:53

we can think and how slowly we can

101:55

communicate that to other people even

101:56

with speech, right?

101:57

>> Can you just please uh search is there a

102:00

more efficient key layout than QWERTY

102:03

because that's what I'm looking for

102:05

because I know there is because I I

102:07

remember I I went down a rabbit hole

102:09

with this and I was really thinking

102:10

about trying it and then I was like,

102:11

"God, what are you doing?"

102:12

>> You'd have to change your phone.

102:14

>> Yeah. And it wasn't phone days. This was

102:16

way before phone days. This was the days

102:18

of just typing

102:20

more efficient keyboard layout than

102:22

QWERTY. That's it. Dvorak, that's it.

102:24

Puts about 60 to 65 to 70% of keystrokes

102:27

in the home row versus uh roughly 30 on

102:30

QWERTY. So fingers move much less. So uh

102:34

now that we know that, can you uh search

102:37

for images of Dvorak keyboard? So

102:40

that's the what the keys look like right

102:41

there. Oh, that's it right there.

102:44

>> See how different that is?

102:46

Wow.

102:47

>> Yeah. Very different.
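
[The home-row figures quoted above, roughly 60-70% of keystrokes on Dvorak's home row versus about 30% on QWERTY's, can be eyeballed with a quick sketch. The sample sentence here is arbitrary, so the exact percentages are illustrative only:]

```python
# Rough check of the home-row claim: what fraction of letter
# keystrokes in a sample text land on each layout's home row?
# Home-row letter sets below are the standard US layouts.

QWERTY_HOME = set("asdfghjkl")
DVORAK_HOME = set("aoeuidhtns")

def home_row_fraction(text: str, home_row: set[str]) -> float:
    """Fraction of alphabetic keystrokes that stay on the home row."""
    letters = [c for c in text.lower() if c.isalpha()]
    return sum(c in home_row for c in letters) / len(letters)

sample = "the quick brown fox jumps over the lazy dog"
print(f"QWERTY: {home_row_fraction(sample, QWERTY_HOME):.0%}")
print(f"Dvorak: {home_row_fraction(sample, DVORAK_HOME):.0%}")
```

[On almost any English text the Dvorak fraction comes out well above QWERTY's, which is the "fingers move much less" point.]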

102:48

>> How long do you reckon it would take you

102:49

to write out a short email?

102:51

>> It would take forever. My stupid

102:53

[laughter] fingers would go right back

102:54

to where they always go. You know, that

102:56

was one of the things that I learned

102:57

really early on um from teaching martial

103:00

arts. I was I way would rather I would

103:03

way rather teach someone who didn't know

103:05

anything than teach someone who learned

103:08

things wrong. Because someone who

103:10

learned things wrong, it's very

103:11

difficult to correct their technique.

103:13

They they have a a mode in their mind

103:16

that they shift to when they're panicky

103:17

or when they're being pressured. They

103:19

always go back to the bad technique

103:21

always. It's very hard to get someone to

103:24

learn technique correctly when they know

103:26

it incorrectly. You got to reteach them

103:29

everything. You see it with every you

103:30

see it with pool. There's certain

103:32

tendencies that people have with their

103:33

arm being out that a lot of people just

103:35

accept the bad relationship between your

103:37

elbow and your as long as it's

103:39

consistent. Even though it's more

103:41

inefficient, it's gonna add extra

103:43

English to the ball and spin and all

103:44

these different and probably make you

103:46

less accurate. Maybe better that than

103:48

try to like make your arm drop down and

103:50

hang 90 degrees because it'll feel so alien.

103:53

>> But that's way less than in martial

103:56

arts. In martial arts, like god,

103:58

>> if you learn how to throw a sidekick

104:00

with your knee down versus your knee up,

104:03

it's so hard to to do it the other way.

104:06

when you're being pressured, you're

104:07

always going to do it the wrong way and

104:09

you're not going to have the the the

104:11

correct amount of power. And those

104:13

tendencies that are burned into you,

104:16

I've been typing for 30 [ __ ] years.

104:19

Like, they are I don't have to look at a

104:21

keyboard. I can just talk to you and I

104:23

can type and I'm not really good, but

104:25

I'm good enough, you know? I'm I don't

104:27

look at the keys. Like, I don't have to

104:28

peck. Like, I used to go it used to

104:30

drive me crazy watching videos of Hunter

104:32

Thompson who never learned how to type.

104:33

>> He would type like this. He would type

104:36

with like one finger at a time, poke and

104:37

peck. I'm like, "Dude, it would take so

104:39

little time for you to just put your

104:40

[ __ ] fingers there and learn how to

104:42

do that, right?" He never did. He poked

104:44

and pecked his way to some of the

104:46

greatest [ __ ] books ever.

104:47

>> Maybe that was a performance enhancer.

104:49

But yeah, I uh

104:50

>> Well, he was poking and pecking while he

104:52

was on Coke.

104:53

>> That's true. Yeah. It's probably for

104:54

[laughter] the best that he didn't type

104:55

more quickly. Imagine the crazy [ __ ]

104:56

that would have come out of him then.

104:58

>> Right. Right. Yeah. You ever seen him

105:02

type? It's so frustrating.

105:03

>> No. Can we see? Is it videos?

105:05

>> Yeah. Um.

105:06

>> Oh, wow.

105:07

>> Find Hunter Thompson typing.

105:09

>> Yeah, you'll you'll see it. It's Pokey

105:11

Pecky. And Johnny Depp actually mimicked

105:14

it really accurately in Fear and

105:16

Loathing in Las Vegas when he's uh

105:18

sitting in front of the thing like

105:19

pecking. Yeah. Like

105:22

doing his Hunter Thompson impression and

105:25

poking. Yeah.

105:26

>> Well, your brain can think at about

105:28

4,000 words a minute. And that's the

105:31

same rate of fire as an M134 machine

105:34

gun.

105:34

>> Wow.

105:35

>> Uh so anything even and it's your point

105:38

of very soon I think that keyboards are

105:40

going to be obsolete when you think

105:41

about how much [ __ ] fidelity and

105:44

speed is lost with you going from brain

105:47

to thumb. Like I wonder what another

105:50

type of keyboard is. And you got to

105:52

think okay what how do I

105:54

>> convert this into words? Where am I

105:55

going to go? Open the app. type that. Oh

105:57

crap. [ __ ] keyboard. Keyboard D. Yes.

106:01

It is so slow.

106:03

>> Yes.

106:03

>> Compared with when we just get neural

106:05

linked up to each other.

106:06

>> Yeah. Yeah. And I'm sure you've seen

106:07

that demonstration [snorts] where the

106:09

two guys are sitting across from each

106:10

other and they have the headsets on.

106:11

They're asking each other question and

106:13

answering the questions without using

106:14

words. No.

106:15

>> You haven't seen that? All right. We'll

106:16

show you that next.

106:17

>> Show that. What were we just looking up

106:19

now?

106:19

>> I can't I I find pictures of him typing

106:21

but not video of him typing.

106:23

>> Oh god. Thinking of it from the movie.

106:24

>> Let me get the bathroom. Let me get the

106:26

bathroom. We'll be right back, ladies

106:27

and gentlemen. It's time to pee. Uh,

106:29

where were we?

106:30

>> Somebody typing like a grandma.

106:32

>> Yeah, the the Hunter Thompson thing. He

106:34

couldn't really find it.

106:35

>> Johnny Depp here.

106:36

>> You got Johnny Depp doing it. Okay. This

106:37

is how he typed. This is a completely

106:39

accurate. This is a great video, by the

106:41

way. Should listen to this. It's really

106:43

>> It's an amazing piece. You got to cut it

106:45

out.

106:46

>> It's not a good YouTube.

106:48

>> Okay, we'll cut it out. But that's it.

106:50

That's how he That's how he poked. He

106:52

poked and pecked like that. So that's

106:54

how Hunter Thompson used to type

106:58

>> out of his [ __ ] bird just poking

107:00

>> telepathic video too.

107:02

>> Oh, you found that? Okay,

107:03

>> I got to look it up.

107:04

>> Okay, this is the crazy one. The

107:06

telepathic thing is nuts because they

107:07

have these headsets on. These guys are

107:09

laughing because they're asking each

107:10

other questions and they're answering

107:12

the questions and they hear the answer

107:14

in their heads. They they hear the the

107:16

other person hears the question and then

107:18

they hear the answer. So, it's a new I

107:21

think it's I don't know if it's a

107:22

product or what, but it's called

107:23

AlterEgo. This is the same guy who developed

107:24

that device where he could uh look

107:27

things up without opening his mouth or

107:29

talking and just sort of like mimicking

107:31

the words in his

107:34

>> We all have moments when in doing the

107:36

same thing. I'll sort of skip past it.

107:38

So, he's talking he's showing it on his

107:40

own here.

107:41

>> The cool part is when he brings in

107:43

someone else to talk to and this guy

107:46

also has it.

107:48

So they're communicating. [music]

107:50

>> Where do you want to get lunch after

107:51

this? He's saying

107:52

>> get lunch after this.

107:54

>> Yeah, for the demo they hooked it up to

107:56

audio so that the video could hear it.

108:00

>> Typhoon could be good.

108:02

>> So they're laughing because they're

108:04

>> doesn't matter where Arnov and I are. It

108:06

could be a noisy environment.

108:08

>> So would they hear a direct

108:10

conversation? It could be a noisy

108:12

environment or a quiet office. Having a

108:15

direct conversation is possible without

108:17

saying a word. The signals AlterEgo

108:20

detects aren't affected by environmental

108:21

noise. So even if you're walking past a

108:23

wind tunnel or a construction zone, what

108:26

you want to say will always get across.

108:28

>> It's like having infinite noise

108:30

cancellation.

108:30

>> Exactly what people say happens when

108:32

they encounter aliens.

108:35

>> It's exactly exactly someone's talking

108:38

in your head and you hear it.

108:42

So imagine this technology scaled out a

108:45

thousand years and they probably don't

108:47

need the other person to have a headset

108:48

anymore

108:50

>> and they just

108:51

>> would make for an interesting podcast.

108:53

>> Yeah, I guess.

108:54

>> Then you could just tune in and nobody

108:55

needs to actually listen to anything. So

108:57

where's the sound? [laughter] Are they

108:58

hearing the sound in a set of

109:00

headphones?

109:01

>> Hard to say. Not headphones.

109:03

>> They're it's they know it right. They're

109:07

not hearing it. It's not like

109:09

>> I understand how I don't understand

109:11

>> cuz if it was really loud then you

109:12

wouldn't be able to hear it.

109:13

>> So yeah, for the for the demo we just

109:15

watched they have hooked up to a speaker

109:16

so so like we can hear what they're

109:18

hearing. But I think if anything it's

109:20

got to be some sort of jaw induction,

109:22

but I don't know that for sure.

109:24

>> Well, there's weird earphones that you

109:26

could put on that don't go in your ear.

109:27

They go behind your ear and they send

109:29

the sound into your dome.

109:31

>> People use that for running, right? So

109:32

they can still hear the sound that's

109:34

going

109:34

>> [ __ ] creeps hiding in the bushes.

109:37

>> So they can hear the creeps so they can

109:38

get ready for them. You know Cam Hanes,

109:40

right? My buddy Cam,

109:42

>> his [clears throat] brother almost got

109:43

killed by a mountain lion.

109:45

>> Crazy story. He put it on his Instagram.

109:49

Um the day the next day like he talked

109:52

about the story what happened. He was

109:54

running and there was a mountain lion in

109:55

the bushes and at first he thought it

109:57

was a coyote. He just saw the eyes. He

109:59

yelled and then stood up and he realized

110:00

it was a cat and it started running

110:02

after him and he's running at night is

110:04

in California and he kicked rocks at it.

110:07

He screamed at it.

110:09

And uh ultimately there's some dogs

110:11

barking and he thinks maybe the dogs

110:13

barking scared the mountain lion off

110:14

him. But he said it was like I couldn't

110:16

have used his quote. He said I couldn't

110:18

have used bear spray even if I had it

110:20

because it would have got on me. That's

110:21

how close it was. Said it was right

110:24

there like right on him. Said it's the

110:26

most scared he's ever been in his life.

110:27

>> I've seen that video of the guy tracking

110:29

backward.

110:30

>> Oh yeah.

110:30

>> As it's coming toward him

110:33

swinging. Yeah.

110:34

>> Hey. Hey. The only thing that gives me

110:36

comfort about that video, if I was

110:37

there, is like that thing just wants to

110:39

scare me. It's not trying to kill me. It

110:41

wants to scare me. That's a mother. It's

110:42

trying to get you away from the cubs

110:44

because the way it's doing it, it's

110:46

throwing its arms in the air in a very

110:47

intimidating way. If an animal was

110:49

trying to kill you, it wouldn't do that.

110:51

>> It'd be running full clip at you and

110:53

just dive on your neck. That's the

110:55

difference between a cat that wants to

110:56

kill you and something that's trying to

110:58

scare you off.

110:59

>> So, he was The problem is you're backing

111:00

up, right? and the the the instincts of

111:04

these predators. Like if you throw a

111:06

ball of yarn by a kitten, they dive on

111:08

that ball of yarn. They can't help

111:09

themselves. And that's the thing about

111:10

you backing up or you even you running,

111:13

it's like you're exciting their prey

111:15

drive.

111:15

>> Yeah. They're going to keep tracking

111:16

you,

111:16

>> right? So they tell you to stand tall

111:18

and be loud and make a lot of noise, but

111:21

there's a fine line between you being a

111:23

threat and then them being scared off.

111:25

like you you being something they have

111:27

to deal with depending upon the the

111:29

distance

111:30

>> between each other.

111:32

>> That's the thing.

111:32

>> Oh, that makes so much sense.

111:34

>> Yeah.

111:34

>> Have you ever had any run-ins with

111:36

>> I have never had like an encounter like

111:38

that, but I I did in the wild see an

111:42

enormous mountain lion once, but

111:43

fortunately it was from inside of a

111:45

truck. Yeah. Me and my friend Colton, we

111:48

were in Utah. We were taking this turn

111:51

and it was at dusk so the sun was

111:53

setting and he stops the truck and he

111:55

goes look at that cat and we I go where

111:58

and we look over and I see the glowing

112:00

eyes from the setting sun the glowing

112:02

eyes reflecting underneath this tree and

112:05

it's got this pumpkin head this big

112:08

[ __ ] these mandible muscles that just

112:12

crush things and these massive forearms

112:14

and it's just sitting it's a big cat man

112:17

like I've seen two other mountain lions

112:19

before, but they were small. They were

112:21

like dog-sized. This thing was

112:23

[ __ ] big. Like,

112:24

>> you reckon you'd be able to take a

112:26

dog-sized mountain lion, or are you still

112:27

dead?

112:28

>> You're dead. Yeah. I mean, a cat-sized

112:30

cat might [ __ ] you up. A house cat might

112:33

[ __ ] you up. A bobcat might [ __ ] you up.

112:35

A mountain lion will kill you. You know,

112:37

you have you you'd have to be an

112:39

extraordinary person with weapons to

112:41

survive a mountain lion hand-to-hand

112:44

fight.

112:44

>> You'd have to be an extraordinary person

112:46

who's really fighting to survive. And

112:47

you won't you don't like you don't panic

112:49

at all. You have to be willing to stay

112:51

calm. This thing's going to tear your

112:53

arms apart. It might tear your face

112:55

apart.

112:55

>> What are the basic I mean you must you

112:57

hunt all the time and you do was it like

112:59

end of September you went and did

113:00

another big one last year.

113:02

>> Elk hunting. Yeah.

113:03

>> You must have been given whatever the

113:06

safety briefing that you have at the

113:07

start of an air like aircraft taking off

113:10

is of hey man if you see a this if you

113:12

see a this or if you see a this these

113:13

are the ways that you're supposed to

113:14

behave. No, we don't get any safety

113:16

briefings.

113:17

>> But you must have learned it in the

113:18

past, right? As a part of

113:20

>> safety briefing, carry a gun. Bring a

113:22

gun with you.

113:22

>> Point at big scary things.

113:24

>> Even if you're bow hunting, carry a

113:26

pistol. Especially if you're in bear

113:27

country.

113:28

>> If you're in bear country, you can't you

113:30

can't depend on this this mist making

113:33

their eyes hurt, keeping them off you

113:35

because it might not work.

113:36

>> We'll just run through it.

113:37

>> Yeah. There was a a a recent case in BC

113:41

uh where a bear mauled 11 people. Um,

113:45

and they used bear spray on it. It

113:46

didn't work. It's real. I think it was a

113:49

teacher protecting his students. Uh, so

113:52

shout out to that teacher. He got [ __ ]

113:54

up. Um, but they tried bear spray. Bear

113:56

spray is not effective. My friend John

113:58

who lives up in Alberta, he used bear

114:00

spray on a grizzly once. He said it

114:02

walked right through it like it was

114:02

nothing.

114:03

>> Is bear spray basically like hardcore

114:05

pepper spray?

114:05

>> Yeah, it's like vicious pepper spray.

114:07

But you're just going to get a mad bear,

114:10

you know?

114:10

>> Why don't they make more hardcore bear

114:12

spray then? It's as hardcore as it gets

114:14

without killing you, right? You know, if

114:16

it gets on you

114:16

>> if it's that noxious

114:18

Yeah.

114:19

>> It's just supposed to be a deterrent.

114:20

And sometimes it can work. Like

114:22

sometimes maybe they're just curious and

114:23

you spray him and they're like, "Fuck

114:24

this guy." And they get out of there.

114:26

But maybe sometimes, no, you know, cuz

114:28

it's it's like it's like tasing a guy.

114:31

You ever see a guy get tased and they

114:32

JUST [ __ ]

114:35

>> There's guys that get tased and they

114:36

just go stiff and they fall down. And

114:39

I've seen other guys get tased where

114:40

they rip it right out of their arm. Four

114:42

people, including children, were

114:44

hospitalized. A teacher on crutches,

114:46

a second adult with

114:48

bear spray, and a third person who

114:49

punched and kicked a grizzly despite

114:51

serious injuries are being praised for

114:53

their actions that saved a school

114:55

group attacked by a bear near Bella, uh,

114:59

British Columbia.

115:01

Four people, including the children,

115:02

were hospitalized Thursday after a bear

115:05

attack on students and teachers in the

115:07

Nuxalk First Nation while out on a school

115:11

trip near the Boy, I'm going to [ __ ]

115:12

this up. A Aqual [laughter]

115:15

Awalta

115:18

school east of the remote community. Oh,

115:20

so it was a very remote place. Yeah.

115:23

Bear spray didn't do anything, man. He

115:25

said, uh, look, uh, nothing fazed it.

115:27

Didn't do anything to the bear. two cans

115:30

of spray in the eyes of the animal. Look

115:32

at that. This said, the teacher

115:34

unloaded two cans of bear spray into the

115:36

eyes of the animal and it didn't do

115:37

anything.

115:38

>> It blows my mind that people who have

115:39

been through something that scary.

115:41

>> When the kids were getting attacked, one

115:42

of my cousins who had his skull ripped

115:44

ran towards the bear and jumped on it

115:47

with his bare hands. Holy [ __ ]

115:50

>> It's pretty hardcore.

115:50

>> That's hard. Well, that's that's primal

115:53

life. You know, that's survival in like

115:55

a real situation where you're like your

115:58

your language goes away. Like you're

116:01

down to

116:01

>> a teacher with crutches was whacking it,

116:03

hitting it in the eyes, the face, the

116:05

head for minutes and then the bear

116:07

finally Imagine being on crutches.

116:09

>> Oh my god.

116:11

>> You have Well, you just It's just

116:13

survival.

116:14

>> It's so reptilian. It's like It's like a

116:17

savage savage moment. That's what blows

116:20

my mind about these situations where

116:23

emotions are running so high. How people

116:26

are able to come back with any kind of

116:28

memory at all.

116:29

>> Right.

116:29

>> Cuz that true

116:31

>> amount of adrenaline

116:33

>> just completely warps people's memories.

116:35

I was learning about this uh this case

116:38

from Australia in the '70s. This uh

116:41

lady gets attacked inside of a home. So,

116:44

uh, guy breaks into the house and

116:46

assaults her inside of her home. And she

116:48

identifies this TV psychologist. This

116:50

guy called Donald Thompson says it was

116:52

this was the person who assaulted me.

116:54

>> The TV psychologist.

116:55

>> TV psychologist. Yeah. Yeah. Yeah. So,

116:56

she knew this guy from the TV. Uh, uh,

116:59

he was the guy that assaulted me. That

117:01

night, the police go and they arrest

117:04

Donald Thompson, take him in. Uh, the

117:06

next day there's a lineup and the woman

117:08

positively identifies him and Donald

117:11

Thompson's like, "That couldn't have

117:12

been me because I was actually on

117:15

television in front of a live audience

117:17

at the time." And the arresting officer

117:20

like scoffs at him and basically says,

117:22

"You might as well have Jesus and the

117:24

Queen of England as your alibis as

117:25

well." Like, this is ridiculous. We know

117:27

that she's been assaulted. We've got

117:29

photographic evidence of the marks on

117:31

her. We've done a DNA test, which is

117:34

going to come back soon. She's

117:35

positively identified you from the

117:37

lineup and she called you out before you

117:39

were in the lineup as well. Like you're

117:40

bang to rights. But there was a wrinkle

117:43

that when they actually looked at the

117:44

timing, he was on TV at the time that

117:48

this was happening. And what had

117:50

occurred was the woman had had that

117:53

television program on while the attacker

117:56

broke in and sexually assaulted her.

117:59

[sighs and gasps]

117:59

>> Whoa. And it imprinted that guy's face

118:02

in her memory.

118:03

>> Bingo. Wow.

118:05

blended the attacker's identity with

118:09

what she was seeing on TV while it

118:12

happened.

118:13

>> Wow.

118:14

>> And the kicker, Donald Thompson was on

118:17

TV to discuss an area of psychological

118:20

specialty that he had, which was the

118:22

unreliability of eyewitness testimony.

118:25

>> Whoa, [groaning]

118:27

dude. Whoa.

118:33

Did you see that? Um, there's someone

118:37

sent me this video. Pause this

118:38

for a second. Derren Brown, the

118:40

sidekick. Have you seen the one dude?

118:42

>> Yes.

118:44

>> Have you seen the one where he

118:47

he got a guy to uh assassinate Stephen

118:50

Fry?

118:52

>> Oh, yes. Yes. Yes. Yes. Yes.

118:54

>> Yeah.

118:55

>> Yeah. [ __ ]

118:56

>> Yeah. Like got like MK Ultra the guy to

118:59

take out

119:00

>> like the jump or something or the push.

119:02

push. Was it the push?

119:03

>> I don't remember, but I watched a clip

119:05

of it the other day. I'm like, this is

119:07

this is so crazy that you can actually

119:09

do this to someone and it just and the

119:12

point of the article that I was reading

119:14

on or the the post on X was

119:16

>> you're you're telling me that MK Ultra

119:18

has not figured out a way to do this.

119:22

>> You you can get a guy to do it with

119:23

cameras to do it on Stephen Fry, the

119:26

comedian.

119:27

And it obviously did he didn't really

119:29

kill him, but

119:31

>> I had a Oh,

119:33

>> yeah. Here it is.

119:34

>> Yeah.

119:35

>> So, the assassin with Stephen Fry.

119:38

>> So, he somehow or another gets this guy

119:41

to do it.

119:43

>> Um I guess we can't play it, but

119:47

>> gun,

119:49

>> fake gun.

119:50

>> Um I don't remember what he did.

119:52

>> Yep.

119:52

>> Oh [ __ ] They acted it all out, too.

119:56

>> That is so crazy. That's crazy.

119:57

>> That's so crazy. So that guy really

119:59

thought he killed Stephen Fry.

120:00

>> Imagine being in the crowd.

120:02

>> How about those people next to him and

120:03

didn't even flinch?

120:05

>> I'd be like, "What kind of psychos am

120:07

I?"

120:07

>> She's the one whispering to watch. She

120:08

whispers. She's like, "Good job." I

120:09

think she's the one who set him off.

120:11

>> Oh,

120:13

>> she's she's in on it.

120:15

>> Now, here's the question. Is this Can

120:16

anybody

120:17

>> the gun?

120:18

>> Yeah. Can anybody fall into that kind of

120:21

a hypnosis? Can is that

120:23

>> so only certain people that are

120:25

suggestible?

120:26

>> There's uh high, medium, and low

120:28

suggestibility people. And there's a

120:30

couple of uh tests. Dr. David Spiegel um

120:33

from at Stanford. He's like one of the

120:36

world leaders in hypnosis. And um he

120:39

explained some people are more

120:41

susceptible to hypnosis than others. I

120:43

have to assume that Derren will have

120:45

done a profile and this guy is like

120:46

really really susceptible.

120:48

>> Okay. What's that about? Why is that

120:50

around?

120:51

>> Susceptibility to hypnosis.

120:52

>> Yeah. Yeah.

120:53

>> So, I think that um dopamine plays a big

120:56

part of it. And if you process dopamine

120:58

more quickly, uh you are more

121:01

susceptible. I process dopamine really

121:03

slowly. I know that from doing some

121:04

genetic tests. So, I know that my

121:06

susceptibility to hypnosis would be

121:08

lower. There's some personality traits

121:09

that make you more or less likely as

121:11

well. I think agreeableness versus

121:12

disagreeableness is one of them. Um I

121:15

think there'll be a sex difference, too.

121:17

I don't know why it's there. It's kind

121:18

of the same as saying like why are some

121:20

people taller than others? Like they

121:21

just are and there's like a byproduct

121:23

that comes along for the ride.

121:24

>> But it's a weird thing to to to to be

121:28

able to manipulate a person's mind and

121:30

to have it so clearly.

121:33

I mean, this is the the the clearest

121:35

example of it you're ever going to see.

121:36

He he just shot a famous person in a

121:39

room full of people.

121:40

>> It does feel like a weird back door.

121:42

>> Yeah, that's what I'm getting at. Like

121:45

it's like the those voting systems that

121:47

can be hacked.

121:48

>> Well, does it speak about that?

121:49

[laughter]

121:50

>> Does it or like those those cell phone

121:53

uh towers they buy from China that turn

121:55

out to be transmitters

121:56

>> sending everything back to China? I

121:57

think uh

121:58

>> like why does that exist?

121:59

>> That David Spiegel guy taught me that

122:03

25% of people that do a single session

122:06

intervention for smoking cessation quit

122:08

for life

122:09

>> from hypnosis.

122:10

>> One session, 25%. Get this. And uh I

122:13

think if you do a couple of sessions

122:15

that number starts to go up and go up.

122:16

So hypnosis is this really weird

122:18

>> back door

122:20

>> into the human psyche. But yeah, the the

122:22

memory thing is [ __ ] crazy when you

122:24

think about what do I actually know?

122:26

Like how do I know that this thing

122:28

happened in the past,

122:29

>> right?

122:30

>> So um most people understand there's

122:33

like two types of memory failure. Uh one

122:35

is I can't remember that thing and the

122:37

other is I remember it but I remember it

122:38

incorrectly. That's broadly two

122:40

categories and I think people are really

122:42

happy with the first one

122:43

>> because there's tons of [ __ ] that has

122:45

happened to you and you go yeah I forget

122:46

my memory whatever whatever but your

122:49

experience of your own memory is your

122:51

only experience of your own memory so

122:52

for you to be able to say my

122:54

recollection is wrong what does that

122:56

mean that's like saying this dimension

122:58

that I'm in is wrong so a lot of the

123:00

time I think people struggle to

123:01

understand how often their memory of a

123:04

thing is present but inaccurate

123:07

>> so for instance uh there's only 17

123:10

colors that we remember on average. We

123:12

don't remember like if I ask you what

123:13

color is a tomato.

123:16

>> Well, I would say red, but really it's

123:18

not if it's heirloom

123:21

typically red, but it's like a reddish

123:23

orange sort of color.

123:24

>> Sure, but those are the [ __ ]

123:25

tomatoes. Like a real heirloom tomato

123:27

all kinds of differentish sort of

123:30

>> tomato.

123:31

>> That's a real tomato though. That's what

123:32

a tomato really looks like. Sorry.

123:34

>> I know.

123:34

>> Supermarket tomato. Kind of reddish

123:36

orange, reddish, orange-ish. Most people

123:38

would default to the red thing,

123:40

>> right? But not really.

123:41

>> Yeah. But it's not. And we sort of we

123:43

adjust.

123:44

>> So if you're the

123:45

>> like we're white people.

123:47

>> Uhhuh.

123:47

>> But we're not really white.

123:48

>> Well, I mean, you're a bit flush.

123:50

>> Yeah. We're not white. You know what I'm

123:53

saying? Like

123:54

>> comparatively white.

123:55

>> My friend Jamie, uh, not this one, but

123:57

another one. He's from England, and he's

123:59

white like paper. And, uh, when my

124:00

daughter first met him, that's what she

124:02

said. She goes, "Mommy, he's so white."

124:04

And she goes, "Yeah, he's white." And

124:05

she goes, "No, no, he's white like

124:07

paper." [laughter]

124:10

>> Well, if you live in England, you will

124:12

be referred to as white like paper.

124:13

>> Yeah.

124:14

>> But if you've got I mean, this must be

124:16

the same with fighters. Even if you

124:17

forget the

124:19

>> TBI head trauma stuff,

124:21

>> just the dump of adrenaline

124:23

>> from going through. I mean, you must

124:24

have done this when you've done your

124:25

biggest shows

124:26

>> and you go out.

124:28

>> You come back and you're like,

124:29

>> I've worked my whole life to get to the

124:32

stage where I can achieve this thing.

124:35

And in the achievement of this thing, I

124:37

kind of wasn't really there. Well, I was

124:39

there for it, but in retrospect, I can't

124:42

really recall where I was. And it's this

124:44

odd duality that you want to be in a

124:46

flow state.

124:47

>> Yeah.

124:47

>> Because it's very fulfilling. It's where

124:49

you're at your best. The words just

124:52

coming out of you perfectly. And when

124:53

you look back, you're like,

124:56

I I don't know if I was there fully. I

124:59

feel like I was kind of absent. Well,

125:01

it's not that you're absent, but that

125:04

you're empty. You empty out all your

125:06

expectations, and you're on it for the

125:08

ride. You're not really piloting it as

125:11

much as you're just like making sure it

125:12

doesn't hit the rocks.

125:14

>> You're on you're there for the ride. The

125:16

thing takes over. And I think that's the

125:19

case with everything. That's the case

125:21

when you're in the flow state of

125:22

anything you're doing. when you really

125:24

like you're the more you think about you

125:26

being there, which is what you have to

125:28

do if you're there, you're thinking

125:30

about you. So, it's like wasted

125:32

resources. You're better off being empty

125:35

and just like being a vessel and just

125:37

like taking this thing like you've done

125:39

the work already. Like take it along for

125:41

the ride. Just go go for the ride.

125:44

That's what it is. And so the problem

125:46

with that is if you don't record your

125:47

set, sometimes you'll say things that

125:50

you don't remember, like that were

125:51

really funny and you're like, there's I

125:53

had a totally different point that I

125:56

went off and it really worked, but I

125:57

don't remember what it was. If you don't

125:59

record it, you're [ __ ]

126:00

>> The only way you can get it back is you

126:02

have to get back to that exact spot and

126:04

hope it's still there for the next show.

126:06

[laughter] Sometimes it will be.

126:07

Sometimes it will be. Sometimes it's

126:08

waiting for you with a little gift.

126:10

>> Sometimes that angle pops up again.

126:11

You're like, "Oh, yeah, but but why are

126:13

we doing this?"

126:14

>> Yeah. Yeah. Yeah. I almost forgot it.

126:16

>> That must be a nightmare or must have

126:17

been a nightmare before you could record

126:18

sets.

126:19

>> Yeah. But you've always been able to

126:20

record sets. One that's one of the

126:22

things I learned like really early on

126:23

from this guy Mike Donovan who was one

126:25

of like the big comics in Boston. He

126:27

goes, "Always record your sets because

126:28

you never know when you're going to say

126:29

something and you'll it'll be lost

126:31

forever if you don't have a recording."

126:34

>> There was a Scottie Scheffler, a golfer.

126:38

He won Jamie. You'll have seen this

126:40

video.

126:40

>> Golfer.

126:41

>> Yeah. Can we get the that it's a there's

126:43

a New York uh sports video like cut. He

126:47

does this. It's such a [ __ ] cool

126:50

explanation of what somebody who's got

126:52

to the peak of their sport, the absolute

126:54

pinnacle like in the moments of winning

126:56

>> and he just breaks the fourth wall open

126:59

about kind of the hollowness of what this

127:02

is

127:02

>> really.

127:03

>> Yeah. It's really fascinating.

127:04

>> Like what's the point that thing?

127:06

>> Yes. Yeah. Yeah. Yeah. Yeah. Yeah. Yeah.

127:07

And um it's just it's just such a

127:09

[ __ ] great explainer

127:11

>> because we always assume here. Here we

127:13

go.

127:16

>> You might have just won the US Open here

127:18

too, by the way. Like the biggest event

127:20

of the year.
>> This is not a fulfilling

127:22

life. It's it's fulfilling from the

127:24

sense of accomplishment, but it's not

127:25

fulfilling from a sense of like the

127:26

deepest,

127:27

>> you know, places of your heart. You

127:29

know, I think it's kind of funny. I

127:30

think,

127:32

you know, I I think I said something

127:34

after the Byron this year about like

127:37

It feels like you work your whole life

127:39

to celebrate

127:41

winning a tournament for like a a few

127:43

minutes. It only lasts a few minutes.

127:45

That kind of euphoric feeling. And I

127:47

like to win the Byron Nelson

127:48

Championship at home. I literally worked

127:50

my entire life to become good at golf to

127:52

have an opportunity to win that

127:53

tournament. And you win it, you

127:56

celebrate, get to hug hug my family, my

127:58

sisters there. It's such an amazing

127:59

moment. And then it's like, okay, now

128:02

now what are we gonna eat for dinner?

128:04

You know, life goes on. This

128:08

is it great to be able to win

128:10

tournaments and to accomplish the things

128:11

I have in the game of golf yet. I mean,

128:12

it it brings tears to my eyes just to

128:14

think about because it's

128:16

literally worked my entire life to

128:18

become good at this sport and to have

128:20

that kind of sense of accomplishment, I

128:21

think, is is a pretty cool feeling. You

128:24

know, to get to live out your dreams is

128:25

very special. But at the end of the day,

128:27

it's like I'm not out here to inspire

128:29

the next generation of golfers. I don't

128:31

I'm not here to inspire somebody else to

128:33

be the best player in the world because

128:34

what's the point, you know? This is not

128:36

a fulfilling

128:38

life. It's it's fulfilling from the

128:40

sense of accomplishment, but it's not

128:41

fulfilling from a sense of like the

128:42

deepest, you know, places of your heart.

128:45

You know, there's a lot of people that

128:46

make it to what they thought was going

128:49

to fulfill them in life, and then you

128:50

get there and all of a sudden you get to

128:52

number one in the world, and then

128:53

they're like, "What's the point?" And,

128:55

you know, I I really do believe that

128:57

because, you know, what is the point?

128:58

You're like, "Why? Why do I want to win

129:00

this tournament so bad? That's something

129:01

that I wrestle with on a daily basis.

129:02

It's like showing up at the Masters

129:04

every year. It's like, why do I want to

129:06

win this golf tournament so badly? Why

129:08

do I want to win the open championship

129:09

so badly? I don't know. Because if I

129:13

win, it's going to be awesome for about

129:15

two minutes and then we're going to get

129:17

to the next week and it's going to be

129:18

like, "Hey, you won two majors this

129:19

year. How important is it for you to win

129:20

the FedEx Cup playoffs?" And it's just

129:21

like, we're back here again, you know?

129:24

Um, so we really do we work so hard for

129:26

such little moments and um, you know,

129:29

I'm kind of a sicko. I I love putting in

129:31

the work. I love being able to practice.

129:32

I love getting out to live out my

129:34

dreams. But at the end of the day,

129:35

sometimes I just don't understand the

129:36

point. This is

129:38

>> That's honest. That's what that is.

129:40

>> I love that video so much.

129:41

>> That's why he's so good.

129:42

>> I love that video.

129:43

>> I guarantee you that's why he's so good

129:44

because I guarantee you that guy has to

129:45

be that honest with himself about

129:47

everything. Otherwise, you'd never fix

129:49

the hitch in your swing. You know,

129:53

>> you you have to be honest about every

129:55

single thing.

129:56

>> You have to be you have to be aware of

129:58

all of it. Every little weird [ __ ]

130:00

thing you do. Why am I doing this? Like,

130:02

what is the point of this? And then when

130:03

you're done, like, yeah, I did it. And

130:05

then it's going to creep right back in.

130:08

Creep right back in. Nike did a

130:10

>> a commercial after that and uh it's him

130:13

with his son

130:15

>> sort of kneeling down on the uh the

130:17

green and it says uh you've already won

130:21

>> and then I think the next slide is but

130:23

let's get another one and it's

130:25

[laughter] so [ __ ] cool, dude. There

130:27

it is. You've already won

130:29

[clears throat]

130:29

>> but another another major never hurt.

130:31

That was a bro.

130:33

>> Yeah,

130:33

>> [ __ ] unbelievable. So, I think I kind

130:36

of become obsessed with um people

130:39

sacrificing what they want, which is

130:41

happiness, for the thing that's supposed

130:43

to get it, which is success.

130:45

>> So, they sacrifice the thing that they

130:47

want,

130:48

>> being happy in the moment. They make

130:51

themselves miserable in order to be able

130:52

to achieve a thing so that when they

130:54

finally have sufficient success, they

130:56

will allow themselves to be happy. It's

130:58

like a very strange trade. Imagine if

130:59

you had some simultaneous equation and

131:01

you just crossed off success from both

131:03

sides. You would sort of be left with

131:04

happiness. I think that's unrealistic,

131:06

right? Because we need social validation

131:08

from people and we want to be

131:09

recognized. We want to do stuff and we

131:11

got to put food on the table and social

131:13

creatures and all the rest of it. But I

131:15

think videos like that are really

131:19

important for people to see when they

131:20

look up to someone about how much there

131:24

is there at the end of the rainbow. Like

131:28

>> uh Elon was on Lex's show a couple of

131:30

years ago and I think Lex asked him some

131:32

question like how are you doing? He

131:33

replied and he said people think they

131:35

want to be me. They do not want to be

131:37

me. They don't know. They don't

131:39

understand. My mind is a storm. I'm like

131:41

that's the price you need to pay to be

131:44

Elon Musk.

131:44

>> I think that was on this podcast.

131:46

>> Was it this one?

131:47

>> Yeah. Because I asked him like what is

131:49

it like to be you? Like he's like you

131:51

wouldn't want to do it. You wouldn't

131:53

want to be me. And you could tell like

131:55

when you're in his eyes like there's

131:58

it's not a normal thought process. It's

132:01

like this chaotic tornado of ideas

132:04

that's running around in his head,

132:06

>> you know? And that sometimes he spits

132:07

them out on Twitter and they're not

132:09

good.

132:10

>> Well, it's a problem when you own the

132:12

platform, right? It's kind of like I can

132:13

say what I want. It's my own house.

132:14

>> Well, he can though. Like, but he's like

132:16

that all the time. He's fun. He's like

132:19

what I would want to see from a guy

132:22

who's a super genius. like a playful guy

132:24

who wants to go to Mars, who's making we

132:27

like Jamie and I went on a tour of uh

132:29

Starship, Starbase. What is it?

132:32

>> SpaceX. Starbase, whatever the [ __ ] it

132:34

is. Uh we saw the launch. We we went to

132:36

the SpaceX launch and so we got a tour

132:38

of the rocket factory which is [ __ ]

132:41

insane.

132:42

>> It's so much more insane than I thought

132:44

it was going to be. It's I mean I can't

132:45

really I don't know how much we could

132:47

even say, but it is nuts. It's nuts. And

132:51

the sheer quantity of rockets that

132:53

they're making is mindblowing. Like

132:56

you're like, I had no I thought they had

132:57

a couple rockets, you know, just a

132:59

couple rockets laying around. They're

133:00

just making rockets.

133:02

>> I'm pretty sure they've put more stuff

133:04

into space, just that one company, than

133:07

like the entirety of the load that's

133:09

been transported into space globally up

133:12

until now.

133:12

>> They put stuff in space for their

133:14

competitors.

133:16

>> Yeah. They they use their space rockets

133:18

to put stuff in space for people that

133:20

they're in competition with. They Yeah,

133:22

they take the money.

133:23

>> Show me the color.

133:24

>> Yeah, we know how to do it. We We're

133:25

better at it than you, so we'll do it.

133:26

>> [ __ ] unbelievable, dude.

133:28

>> It's kind of nuts.

133:29

>> I think about um that like the sort of

133:32

person you need to be to drive that

133:33

though.

133:33

>> It's a different kind of person, right?

133:35

Like that's what he wants to do and

133:36

that's what he desires to do. And you

133:39

know this gentleman talking about golf

133:42

like this is a different this is that's

133:45

a totally different thing cuz he's in a

133:47

competition all the time you know and

133:50

it's it's really hard to just enjoy the

133:52

process when you're in this competition

133:54

where especially if your livelihood

133:56

depends upon a very specific result like

133:59

you have to be better at this thing than

134:01

everybody else. Not just do the best

134:03

yourself,

134:04

>> but better than the other people that

134:06

are also doing their best. So, you're in

134:08

this constant just never escaping this

134:13

pressure.

134:14

>> Fighters feel that I think more than

134:16

anybody because it's like a actual

134:17

physical person coming to harm you all

134:19

the time.

134:20

>> And you're very outcome focused.

134:22

>> Yeah.

134:22

>> And so, it's all well and good him

134:23

saying I I love the process. I'm a bit

134:26

of a sicko. I like my training. So on

134:27

and so forth. But it's very different

134:30

saying, "I enjoy the process of training

134:33

when you've just won than I enjoy the

134:35

process of training when you've just

134:36

come second or fifth or 20th."

134:38

>> Right.

134:38

>> And especially if you're laid out flat

134:40

on the canvas.

134:41

>> Yeah. Oh, especially that

134:42

>> the humiliation.

134:44

>> There's also the damage that was just

134:45

done to you where you might not really

134:47

might not be the same again. And there's

134:49

certain fighters that that you could

134:50

point to one fight and they were they

134:53

never recovered from it. M

134:55

>> Meldrick Taylor versus Julio Cesar Chavez

134:57

is my personal one that I always point

134:59

to

135:00

>> because Julio Cesar Chavez knocked him

135:02

out with like I think it was like a

135:04

couple seconds left in the last round.

135:07

Stopped him and it was a fight that

135:08

Meldrick Taylor was winning a decision

135:09

but Julio Cesar Chavez was wearing him

135:11

down. He was one of the greatest of all

135:12

time. Just ripping the body, constantly

135:14

attacking him and eventually broke him

135:16

down, had him in a corner, boom, dropped

135:18

him with a right hand. And he got up and

135:20

the referee stop called the fight with

135:22

like a couple of seconds to go. And it

135:25

was a hugely controversial call. But

135:27

then when Meldrick Taylor returned, he

135:28

was never the same again. He started

135:30

slurring his words really badly.

135:32

>> So it's physical issues.

135:33

>> He's getting knocked out easily. He's

135:35

getting dropped easily. He was just, it

135:36

was gone. It was all gone. That fight

135:38

just took it all out of him. You see

135:40

that? So there there's that too. It's

135:43

not just you're going to lose a golf

135:44

tournament like you might get your

135:46

brains punched in.

135:47

>> Physical repercussions.

135:48

>> Huge physical repercussions for a

135:50

vicious knockout. Huge. Some guys are

135:52

never the same again and much more

135:55

likely to get knocked out again once

135:56

they get knocked out really badly.

135:59

>> Who do you think of all of the people

136:00

that you know has got the right balance

136:03

of is successful and is also having fun

136:06

at the same time? because it seems like

136:08

that's a trade that a lot of people can

136:10

make

136:11

>> where they are successful but they

136:14

sacrifice their happiness or they're

136:16

kind of happy but they're not pursuing

136:17

external successes in the same way.

136:19

>> I would say comedians I would say

136:20

Chappelle. Chappelle is probably the

136:22

most successful guy that's genuinely

136:23

happy. I mean he certainly has a lot of

136:26

moments and deep thought but when you're

136:28

hanging around with him he's a lovely

136:30

person. He's a happy lovely guy. He's so

136:35

sweet and so smart and so so like

136:38

self-deprecating and interesting and so

136:40

great at what he does, but when you're

136:44

hanging out with him, it's just it's

136:45

just a hang. It's just he's just having

136:47

fun, laughing a lot, got a great crew.

136:50

He always, you know, stays keeps his

136:52

circle tight, cool people, and just has

136:55

a great time, you know.

136:56

>> Have you deconstructed what that is?

136:57

Like what the contributing elements are?

137:00

[sighs]

137:00

>> I think he just he's doing it well. He's

137:03

a very unusual person, right? So, you're

137:06

talking about Dave Chappelle when

137:08

Chappelle's show was the number one

137:10

comedy in the country. It was the

137:13

greatest sketch show. I think it was the

137:14

greatest sketch show of all time. And it

137:16

was only two seasons, right? And then

137:19

they offered him an enormous amount of

137:20

money. I think it was $50 million. And

137:23

they wanted to change a bunch of stuff.

137:24

They wanted him to stop saying certain

137:26

words. They wanted him to stop doing

137:27

this, stop doing that. And he didn't

137:30

like it. And he said, "I quit." And he

137:33

went to Africa and just [ __ ] hung out

137:35

in Africa and then came back. When he

137:37

came back, he stopped doing standup. He

137:39

would do standup. I remember one time uh

137:41

he did standup in a park in Seattle. So,

137:45

uh he showed up, he uh had little

137:48

speakers with him and a microphone and

137:50

just did stand up for free to these

137:52

people. Just hung out in Seattle. Just

137:54

did standup. And he would do stuff like

137:56

that. Show up places and just do standup

137:59

occasionally. I mean, for 10 [ __ ]

138:01

years, he was like a a monk on a

138:04

walkabout. And

138:05

>> how did he stay sharp?

138:08

>> Well, I don't think he ever stopped

138:10

thinking about things the same way. And

138:12

he wasn't as sharp when he came back.

138:14

There's one like famous video from him

138:16

in Hartford, Connecticut, where he

138:18

bombed, but I always tell people stay

138:19

out of Connecticut. [laughter] But just

138:22

that's not the point. It's like England,

138:24

you know, I think England's depressed.

138:26

Um, but the point was then eventually he

138:29

started touring regularly, got it all

138:32

back plus then some, and then is now

138:35

widely regarded as if not the greatest

138:37

of all time. He's in the consideration.

138:38

There's like Pryor, him, Murphy,

138:41

Kinison, Lenny Bruce, Carlin for

138:44

some. There's like a bunch of different

138:46

people that you put into like the

138:47

greatest of all time. And Dave is

138:48

certainly in that group, but he's very

138:50

happy. He's a happy guy. I mean,

138:51

certainly there's cultural issues that

138:54

play that trouble him and life issues

138:56

that everybody goes through that trouble

138:58

him, but uh genuinely a pretty balanced

139:01

guy for someone who's ultra successful.

139:03

But he's not stepping outside of his

139:05

lane either. What he's really

139:07

concentrating on and almost exclusively

139:09

concentrating on is doing standup

139:11

comedy.

139:11

>> And he will travel, he would get in a

139:13

jet and fly to New York unannounced and

139:16

just show up at clubs and start doing

139:17

standup. And um he's done this forever.

139:21

One time I was in Colorado and I've

139:24

known Dave forever. I met Dave when he

139:26

was like 19 and I was like I guess I was

139:29

like 23 or 24. We were both very young

139:32

and even back then I was like this kid

139:34

is so talented. It was it was like

139:36

remarkable how poised he was on stage

139:38

like as a 19-year-old kid. Um he he will

139:43

just show up places. I was in Colorado

139:45

doing standup. I was at the comedy

139:46

works. I get off stage. It was on a

139:49

Friday night. I go into the green room

139:50

and Dave's there. He doesn't live in

139:52

Colorado. He just flew to Colorado

139:54

because he knew I was gonna be there and

139:55

he wanted to do comedy. And so, uh, I

139:59

go, "Do you want to do a set?" He goes,

140:00

"Should I?" I go, "Yes." I go, "Hold

140:02

on." So, I went back on stage. The show

140:04

is over. I go, "Everybody [laughter]

140:06

yell at the people that are on the

140:08

stairs to come back. Dave Chappelle is

140:10

here." And the half the crowd had

140:12

already like got up and left. They all

140:14

come back. Everyone, everyone tells

140:16

everyone they're yelling it up the

140:17

stairs. Dave Chappelle's here, come back.

140:19

I bring him on stage. Everybody goes

140:20

crazy. And he does like 45 minutes just

140:22

[ __ ] around. It was back in the grab

140:24

him by the [ __ ] days. So he had this

140:26

whole like he said grab him by the

140:28

[ __ ] This whole bit like it just

140:30

happened that week and he had this like

140:32

giant and he just wanted to just go

140:34

places and do comedy. So he's not doing

140:35

it for money, right? He's not getting

140:37

paid to do this show. He would show up

140:39

in New York. He's not getting paid to do

140:40

the stand or wherever these clubs that

140:42

he just shows up in. He's just working.

140:44

He's just working on the craft of

140:45

comedy. So, his mindset is not try to

140:49

make the most amount of money with

140:50

standup because if he was doing that, he

140:51

would do an arena every night, right?

140:53

But he could he could do an arena every

140:54

night of the week all over the world and

140:56

make way more money. But that's not what

140:59

he's doing. What he's doing is working

141:00

on the craft of comedy. He has plenty of

141:03

money, right? He has all this money from

141:04

all these Netflix specials. They pay him

141:06

an exorbitant amount of money and he

141:07

makes all this money when he does do the

141:08

big show. So, he's got plenty of money.

141:10

So, it's not money. It's just the craft.

141:13

It's just the art, the new set, the new

141:15

bits, the new thing. He has a guy who

141:17

films all of his sets. So, he's got like

141:20

a guy there filming every one of his

141:22

sets and then they break them down like

141:24

this rant, that rant, because he'll like

141:26

ask questions to people in the audience.

141:28

He'll do like an hour and a half on

141:29

stage just [ __ ] around with a small

141:31

crowd somewhere, but there's a gem in

141:33

there somewhere and then they take that

141:35

gem and then they expand upon it.

141:38

They'll go over it and break it down. So

141:40

his process is all just about the art.

141:43

And I think because of that, the love of

141:45

the art is what keeps him happy.

141:48

>> I think if it's just the love of the

141:50

money and you're constantly keeping

141:51

score, who's the number one touring act?

141:53

And you're looking at the [ __ ] ticket

141:54

master. Oh, Jesus Christ. Kevin Hart's

141:56

got me beat. Son of a [ __ ] I got to do

141:59

two shows a night now. And that's mine.

142:02

>> Yeah. People get nutty. They get nutty

142:03

and they really do get themselves You

142:05

see it in the podcast world as well.

142:07

people really get obsessed with the the

142:10

number of the rankings and like who's

142:12

making more and who's doing this and

142:16

[sighs] just do what you do.

142:17

>> Well, you're the the problem that you're

142:18

going to come up against there is you

142:20

are going to try and trade

142:24

the outcome that you're looking for for

142:26

the fuel that gets you there. The fuel

142:27

that gets you there is how much you love

142:29

what you're doing.

142:29

>> Yes. Yes.

142:30

>> So, I've been thinking

142:31

>> that's what gets you to the dance.

142:32

>> Correct. I've been thinking so much

142:34

about um the shame of simple pleasures.

142:37

So um there's this quote from a guy

142:39

called Visakan Veerasamy that says, "I

142:41

have not yet grown wise enough to

142:44

deeply enjoy simple things."

142:46

>> And uh I just love the idea of it that

142:48

most of us are kind of terrible

142:50

accountants of our own joy. Yes.

142:52

>> That we only uh accept deposits when the

142:55

transactions large enough, right? The

142:56

day that we get married or the night

142:58

that we play the main stage at

142:59

Glastonbury or sell out the arena. Mhm.

143:01

>> Anything less than that.

143:03

>> Yeah.

143:03

>> And it doesn't even make the ledger. So

143:05

we treat small pleasures like

143:08

counterfeit currency.

143:10

>> And we think like

143:14

we have a kind of uh not disgust but

143:16

rejection of oh that small thing made

143:21

your week. That tiny incident made your

143:25

day. You must not have a lot going on.

143:27

Like how weak and how small must your

143:30

life be? that seeing a cute golden

143:32

retriever this afternoon was like a

143:34

[ __ ] sick part of your day.

143:36

>> And I think about Scottie Scheffler as a

143:40

good example. Him making it all the way

143:43

to the top. And if all that you were

143:44

doing was waiting for that final moment

143:46

for this mainstage at Glastonbury, Day

143:48

that I get married, sell the business

143:49

for $500 million, whatever, you are

143:54

forgetting almost all of the journey and

143:57

then just cashing in at the destination.

143:58

And as the guy that's just won

144:00

everything in all of [ __ ] golf, like

144:02

the the the goat of right now is saying

144:04

>> it's fleeting.

144:06

>> Yeah.

144:06

>> It's really really short. It's not going

144:08

to last for very long. And uh that shame

144:11

that uh people have, I certainly know

144:14

that I do as well that it almost feels

144:16

like a reflection on the smallness of my

144:17

life if I take pleasure in little

144:19

things. But when you take pleasure in

144:21

little things, you don't just get more

144:23

of them. You get them right now. You

144:24

don't need to wait. You don't need to

144:25

like be a [ __ ] world champion at the

144:27

winning the marshmallow test just

144:29

delaying gratification so long that you

144:31

never actually end up getting any

144:32

gratification.

144:33

>> Yeah. The problem with that thought

144:36

process is to achieve true greatness.

144:40

You must be mad. Madness and greatness

144:43

are inextricably connected. You can't

144:47

separate them. To treat to to get true

144:49

greatness, you have to there has to be

144:52

some demons. There has to be a mad

144:55

struggle in your mind and you have to

144:58

want it so badly. You want to have to

145:00

want that result so badly that you are

145:03

willing to put in more time, more

145:05

effort, more focus, more hours and just

145:09

you don't get to smell the roses, man.

145:12

You don't you don't get to pet the

145:14

puppies. You do, but you don't. You're

145:16

petting the puppy thinking about the

145:17

thing that you do, thinking about

145:19

getting better because you need those

145:20

resources. It's like a demon that sort

145:22

of climbs inside of you and wears you.

145:24

>> Yeah. You know, you know who Ronnie O'

145:25

Sullivan is?

145:26

>> Yeah. The the snooker player.

145:27

>> Yeah. The greatest of all time. Like

145:29

what? Like there's certain people in

145:30

certain sports. I'm going to send you

145:32

something, Jamie. So, you see what a

145:34

wizard this guy is. I'm actually in the

145:35

middle of his book. Um my friend Billy

145:38

Thorp, who's a top flight pool player,

145:40

recommended this book. Oh, no. I'm

145:42

sorry. Tyler Styler, who's another top

145:45

flight pool player, like world-class

145:47

pool player, recommended this book. And

145:49

uh um I started the book and I I can't

145:53

stop it. It's so good. It's It's about

145:56

um I think it's fairly recent because

145:58

it's post-COVID um

146:01

>> uh I thought it was going he he

146:02

recommended it because of the way Ronnie

146:05

describes picking the perfect cue like the

146:08

relationship that he has with a cue but it

146:10

is so eloquent and so but the story the

146:13

whole story the whole book rather the

146:16

the story of his life is really more of

146:20

it's an exercise in him trying to

146:25

explain

146:28

like what it's like to be this good and

146:30

this mad. Like he's a mad man. Like

146:33

watch this. Watch this. Watch what he

146:34

does here. This is

146:35

>> performance here from [music] Sullivan.

146:39

>> Now, if you don't know how difficult it

146:41

is to make these balls, he doesn't give

146:42

a [ __ ] that that guy's in front of him,

146:44

that the referee's [laughter] in front

146:45

of him. Watch how quickly he does this.

146:49

>> I mean, he's making the audience laugh.

146:51

He's moving around that guy. He can't

146:53

miss. This is This is the zone

146:56

personified. He gets to a point in this

146:58

where he's feeling so good, he decides

147:00

to start shooting things one-handed.

147:02

>> Watch this. Watch this. One-handed.

147:04

>> Now he's doing it onehanded.

147:06

>> One-handed. These are tiny little

147:08

pockets. He's shooting one-handed with

147:11

English and getting position. Everyone's

147:12

going crazy.

147:14

>> I mean, that's how [ __ ] good Ronnie

147:17

Sullivan was.

147:19

>> Like,

147:20

>> oh my god. This the book is really about

147:23

managing madness. It's about him being

147:26

sober and now he's kind of

147:28

taken a lot of that insane competitive

147:30

drive. Now he runs like he's a runner

147:32

like he runs long distances

147:34

>> and he talks about that meets up with

147:36

his running club and they all get

147:37

together and go on runs together. But

147:39

it's like it's just managing whatever

147:42

the [ __ ] that and in in in also

147:44

describing even in his prime he was

147:46

saying he was thinking he's worthless.

147:48

He's thinking he's not good enough. He's

147:49

going to fall apart. He's going to

147:51

choke. He's going to this. He's like all

147:52

these demons are popping up and

147:54

meanwhile he's just everybody's like

147:56

terrified of him. He shows up. It's

147:58

like, "Oh, the genius is here." Cuz he's

148:00

a genius. Like he's a snooker playing

148:02

genius. There's there's something about

148:03

what he does is just different than

148:05

everybody else. But the book is like

148:07

it's not just about like picking the

148:09

perfect queue. It's really about

148:11

managing madness. And everyone who's

148:14

great is [ __ ] crazy. There's But you

148:17

can, I think, like Chappelle does, you

148:21

can take that greatness and just throw

148:23

it into the thing you do and love it

148:25

while you're doing it. You can't, it

148:27

doesn't have to be a demon. Doesn't have

148:30

to be an adversary. It could be like

148:34

just this romantic affair of you being

148:37

so fortunate to be able to pursue this

148:38

thing, but maintaining that same level

148:41

of enthusiasm. I don't know if the same

148:44

level of enthusiasm though can be

148:46

maintained in something that has like a

148:48

a winner and a loser like a game where

148:51

there's so much riding on each shot.

148:53

Yes. Versus art which is like Dave goes

148:57

to he's already won. They're going to

148:59

they the show's sold out. He knows how

149:01

to do comedy. He gets out there. They

149:02

all cheer. He's got great material. He

149:05

can't wait to make them laugh. He

149:06

already won.

149:07

>> Well, that's the problem with turning

149:09

the art into the competition. she said

149:12

there, right? The rankings. Well, that

149:14

means that even if I did it and enjoyed

149:16

it, but I'm number three or whatever,

149:18

>> that's horrendous. That's not good.

149:20

Yeah. I uh

149:20

>> there's podcasts that game the system

149:23

there. So, there's podcasts that release

149:24

multiple episodes a day and they're

149:26

short podcasts, so they have more

149:28

downloads than everybody else. And so,

149:30

>> because download

149:32

speaking, you know, this so it's a it's

149:34

like a scam. And so, like they'll be

149:36

very highly ranked, but no one's ever

149:38

watched it or heard of it,

149:39

>> I think. But they get quoted in

149:41

magazines as being the number two

149:42

podcast in the world.

149:43

>> Oh yeah.

149:44

>> But that's really what it is. It's like

149:45

you've you figured out a way which

149:46

there's nothing wrong with that. If you

149:47

want to do that, you you can game the

149:50

system,

149:51

>> but it doesn't matter. Like what what is

149:53

what are you doing? Like are you doing

149:54

something that you're putting out like

149:57

like I don't talk to anybody that I'm

149:59

not interested in talking to. That's it.

150:00

It's the only reason why I do this. I

150:02

talk to people that I think will be fun

150:03

and I look forward to it and I still do.

150:06

That's why I do it. That's why, you

150:08

know, it continues to work because I do

150:10

it the same way I've always done it. I

150:12

just talk to people that I like to talk

150:14

to. No. No. Like, oh, if I got that guy

150:16

on, he's super famous. Like, that'll

150:18

that'll get a big rating system.

150:20

>> Yeah. There's a lot of famous people

150:21

that I've said no to because I'm just

150:23

not interested in them. I'm like, yeah,

150:24

maybe that'll get a lot of people, but I

150:26

don't I don't want to do that. What I

150:28

found the single best determinant for

150:32

when I know that Modern Wisdom is going

150:33

well is if I wake up on the morning of

150:36

the episode and I can't wait for it to

150:38

be 2 p.m. I'm like [ __ ] yes, I get to

150:40

speak to such and such today. And then I

150:42

finish up and I go,

150:44

>> I learned something. That was [ __ ]

150:45

cool. Like that was a good 1 2 3 4

150:48

hours. That was a good day.

150:50

>> And then there's other days when I've

150:52

like I don't know. I wake up and I just

150:54

think h I should have I should have

150:56

thought a little bit more about I'm like

150:57

I'm looking forward to this but I'm not

150:58

super fired up and the more that you

151:00

push away from that instinct with

151:02

whatever you're doing because your

151:04

instinct is ultimately your only

151:05

competitive advantage that you have

151:07

because it's the most non-fungible thing

151:09

that you've got. So Doug Douglas Murray

151:11

told me this story. It's really

151:12

fascinating one about this guy when

151:14

Douglas was first on the scene. This guy

151:17

that was uh the head of the paper that

151:19

he was at accumulated all of the the

151:21

fans and all of the foes that you would

151:23

in an industry like that over the space

151:25

of a couple of decades. And he decides

151:27

that he's going to release a uh West End

151:31

show about the life of Prince Charles in

151:33

rhyming couplets. [laughter]

151:37

This is like

151:38

>> what

151:40

Okay. Well, you know, do you trust him?

151:41

He's this guy, this illustrious history.

151:44

So, and so he must know what he's doing.

151:46

>> And uh by the opening night interval,

151:49

there is nobody left in the entire

151:51

auditorium, including the cast.

151:53

Everybody's left. And this guy is

151:55

dejected. And all of the people, all of

151:57

the enemies that he's accumulated

151:58

throughout his career, they start

152:00

sharpening the knives and they come out

152:01

and he's just despond so sad. [laughter]

152:05

Douglas sees him a couple of weeks later

152:06

and he goes, "What were you [ __ ] West

152:10

End show about the life of Prince

152:12

Charles in rhyming couplets. What were

152:14

you thinking?" He said, "Douglas, I

152:17

followed my instincts. The thing is

152:19

instincts, they may sometimes lead you

152:21

wrong, but they're the only thing that's

152:22

ever led you right." And I thought

152:26

that's such a cool insight about

152:30

yes, you're going to make some errors if

152:32

you follow that and maybe you need a

152:34

team around you or a friend to go,

152:36

[laughter]

152:37

not with that one. But

152:39

>> right,

152:40

>> you just going, I think this guy's

152:42

interesting. I think this girl is

152:43

interesting. I think this topic is

152:44

important and I'm going to talk about

152:46

it.

152:46

>> Maybe he just did a bad job. Like look

152:47

at Hamilton. They did a rap about

152:49

Alexander Hamilton. It's [ __ ] huge.

152:52

>> Okay. Yes.

152:54

You might have a great idea, but the

152:55

delivery is wrong.

152:56

>> Yeah.

152:56

>> Yeah. That's an interesting one.

152:58

>> Totally. If you think about Hamilton,

153:00

like Hamilton is a great example. That's

153:02

that play is gigantic.

153:04

>> Mhm.

153:04

>> It's on Netflix now

153:05

>> and it keeps on crushing.

153:06

>> Yeah. It's killing it.

153:07

>> Yeah.

153:07

>> And it's so preposterous if you think

153:09

about it in the like they're they're

153:11

talking in modern language about a guy

153:15

who lived hundreds of years ago. Like

153:16

that doesn't even make any sense. They

153:18

have black people playing white people.

153:19

Like this is going to be weird. It's

153:20

great. Where do you [ __ ] Where do you

153:23

think that drive comes from in people?

153:25

You know that that that demon thing? Is

153:27

there a common thread that you've seen

153:28

with the people that have got it?

153:30

>> Yeah. Most of them had unhappy

153:32

childhoods.

153:34

Yeah. It's very rare that someone has

153:36

like the the best in the world demon and

153:40

their childhood was awesome.

153:42

It's very rare. Generally speaking,

153:45

there's something there. Something, some

153:48

loss, some trauma, something not good.

153:52

>> Some lack of

153:55

what you needed when you were young. You

153:57

didn't get it.

153:58

>> And you know, and [clears throat] then

154:00

you're like, I'm going to [ __ ] show

154:01

every like Mike Tyson maybe the best

154:03

example of that ever. Like for a period

154:05

of time, the scariest heavyweight that

154:07

ever walked the face of the planet and

154:09

redefined the heavyweight division in

154:11

modern boxing. And you know, he was 13

154:14

years old when Cus D'Amato had adopted

154:16

him. And his his life was hell before

154:18

that. It was hell. It was no love. It

154:19

was crime and being around the worst

154:22

people. And then all of a sudden, he's

154:24

in the Catskills with this guy who's uh

154:27

a psychologist and one of the greatest

154:29

boxing coaches of all time and also a

154:32

hypnotist and is hypnotizing him on a

154:34

regular basis when he's 13 years old and

154:36

teaches him to be the best. And so then

154:39

he's got this, I will show you that I'm

154:41

worth something. I will show you that

154:43

I'm special. This one thing that I'm

154:45

good at, and that is separating men from

154:47

their consciousness. Finding a way to

154:49

get in touch with them. Finding get

154:51

close enough and launching launching

154:53

bombs and watching them drop. And he was

154:56

the best at it. And it was, I think, the

154:59

drive to be the best. It has to come

155:01

from some there's got to be something

155:04

wrong where you you have that fire

155:07

inside of you.

155:08

>> I love thinking about this. I think it's

155:10

been the the question that I've probably

155:11

been the most obsessed by since doing

155:13

the show. The the price that people pay

155:15

to be somebody that you admire

155:18

>> and I think it's just it's just

155:19

endlessly interesting. So, um one thing

155:22

that that comes to mind there is do you

155:25

know what the fundamental attribution

155:26

error is? It's like we we attribute to

155:28

other people um motive for their action.

155:32

Uh it's like their character. Uh but for

155:34

us it's situation. So for instance, I

155:35

cut you off in traffic because I'm late

155:37

for work. You cut me off in traffic

155:39

because you're a dick.

155:40

>> Right?

155:41

>> So we have this asymmetry and how we

155:42

judge other people's uh behaviors as

155:45

opposed to our own.

155:46

>> I think that there's an equivalent here

155:48

when we think about our parents. So you

155:50

could call it the fundamental um like

155:53

parental attribution error maybe which

155:55

would be we attribute to our parents our

155:59

shortcomings but not necessarily our

156:01

strengths.

156:02

>> Right?

156:03

>> So we're very happy like modern pop

156:04

psychology it's like a rite of passage

156:06

to lay at the feet of our parents. Um

156:09

I've got anxious attachment because

156:11

nobody ever came to look after me. You

156:13

go yeah maybe. But also, isn't this the

156:17

reason that your hypervigilance means

156:18

that no one ever gets to take advantage

156:19

of you? It's like, um, I am unable to

156:23

relax and chill out because love was

156:25

always predicated on me performing. It's

156:28

like, yes, but also it's driven you to

156:31

be an incredibly successful person. And

156:33

I think we should just be a little bit

156:35

cautious when laying at the feet of our

156:37

parents only our shortcomings. They they

156:39

can either have both. You can either say

156:41

that my strengths and my shortcomings

156:43

come from my parents or my strengths and

156:45

my shortcomings come from my own agency.

156:47

But you can't say I authored the things

156:49

that I like about myself but the things

156:51

that I don't like about myself came from

156:53

some like past situation.

156:55

>> Yes. Victim mentality. Yeah.

156:57

>> Yes.

156:58

>> Yeah.

156:59

>> Yeah. And also

157:01

bad things that happened to you when

157:03

your kid being bullied. Being bullied is

157:05

terrible at the time, but it leads many

157:08

a person to say, "I'll [ __ ] show

157:10

you." Yeah,

157:11

>> you know, and then you get this

157:13

incredible result. But then the thing is

157:14

like are you happy? That's that's the

157:18

that's the real dance. The dance is

157:20

between success and happiness. And a lot

157:22

of people have achieved success but have

157:24

not achieved happiness and they'll die a

157:26

loser.

157:26

>> Well, that's you sacrificing the thing

157:28

you want for the thing that's supposed

157:29

to get it. And that's why like, okay,

157:31

what's your definition of success?

157:32

Interesting question. Would you just

157:34

want to be the best in the world?

157:35

>> Mhm.

157:36

>> Like that's not bad. That's not a bad

157:39

but

157:40

It's this thing we talked about before

157:42

too um that just because something's

157:45

difficult doesn't mean it's good. And

157:47

there's a lot of things that you do that

157:49

are very difficult to do and then you

157:51

see other people have achieved them. You

157:53

say that must be really worthwhile. And

157:54

then you do it and you realize like oh

157:56

this isn't worth anything. This is just

157:58

hard to do. This sucks. That's often the

158:01

case with success because if you become

158:04

incredibly successful and then you have

158:06

all these haters and you know like the

158:08

guy who wrote the shitty play,

158:10

>> you know, like they they come for you

158:13

and they they want to chop you down and

158:16

that's part of the game that you're

158:18

playing.

158:18

>> And if you don't like that, if you don't

158:20

like that, but then you've gotten

158:22

trapped in it and you're constantly

158:24

being attacked and you listen to it and

158:25

you pay attention to it. So you're in

158:27

you see with successful people you see

158:28

it really with famous people especially

158:30

young people they have no history with

158:33

this and then all a sudden it's just

158:34

thrown at them and then they are both

158:37

the thing they wanted and something they

158:39

would never want which is to be like

158:41

constantly under attack.

158:43

>> I've thought about uh how brutal it must

158:46

be to have the talent but not the

158:49

constitution to be able to handle

158:51

success and fame. So I don't know

158:53

whether you've been tracking uh Lewis

158:55

Capaldi, the Scottish singer. So there's

158:57

a great documentary on Netflix. You got

158:59

to watch it. Uh, How I'm Feeling Now.

159:01

It's a bit old now. It's like maybe four

159:02

or five years old.

159:04

>> Lewis Capaldi breaks onto the scene.

159:06

Unbelievable voice. He's been playing

159:07

working men's pubs around Scotland and

159:10

is just a [ __ ] phenom, right? Um

159:13

billions of streams, billions and

159:15

billions of plays, arena tour, global

159:16

tour, all the rest of it. COVID happens.

159:20

He's back in his mom and dad's house

159:22

near Glasgow in Scotland and he's in the

159:25

hut out the back trying to do the

159:27

difficult second album and there's the

159:29

pressure of the world on him. Now he's

159:31

got the talent but the pressure from

159:35

agencies, from record label, from fans,

159:38

from himself, from his parents, from his

159:41

peers, from everybody starts to get on

159:43

him. It weighs on him so heavily that he

159:44

develops a tic.

159:46

>> Oh Jesus.

159:47

>> Like Tourette's. It turns out he's

159:48

always had Tourette's. But the pressure

159:50

has caused him to like he can't he can't

159:54

perform. And toward the end of the

159:57

documentary, he goes back out on stage

159:59

at the O2 in London,

160:01

does the thing, walks out on stage, and

160:03

he's still doing this. And you've

160:04

tracked this whole journey. This is

160:05

toward the end, and he can't get his

160:07

words out. This is his calling

160:11

in life. This is what he was built to

160:14

do. This is what he was made for. and

160:16

his talent has been taken away from him

160:19

by the pressure of trying to do the

160:21

thing, not by his inability to do the

160:23

thing. And this is such a [ __ ] unique

160:25

kind of hell. Like, think about that. I

160:27

think about um uh fighters that have

160:29

performance anxiety that just can't get

160:31

themselves into the octagon with the

160:34

lights on them, put them in the training

160:37

camp that they're sparring, there's not

160:39

that same amount of pressure, not yet.

160:41

And they've they're unbelievable. And uh

160:44

Lewis Capaldi did Glastonbury I think two

160:46

years ago and um the same thing

160:48

happened. Comes out on stage and

160:50

basically like can't sing. He can't

160:52

you're hearing these little croaks and

160:54

squeaks come out of him. And then this

160:55

year he comes back out. He's done a ton

160:58

of mindfulness, got his health in order,

161:01

mental health work, therapy, comes out

161:04

and [ __ ] destroys it.

161:06

>> Oh wow,

161:07

>> dude. It's like it makes the hairs on my

161:09

arm stand up. It's so [ __ ] cool.

161:11

>> Wow. That's awesome. That's a great

161:13

story. That's what I like to see. I like

161:15

to see someone who [ __ ] their whole

161:17

life up and gets it back together again.

161:18

I love that. I really do because I think

161:22

that's what people really root for. They

161:24

really root for you to get it back

161:26

together again. What they don't root for

161:28

is once you're on top, like staying on

161:30

top. They like you to fall.

161:31

>> Yeah.

161:31

>> Yeah. That's a little too much. Well, I

161:33

mean, I I uh I think especially with

161:37

with what most people feel, they want to

161:39

see a little bit of themselves in that

161:40

story and they want to see a little bit

161:41

of struggle,

161:42

>> right? And they also know that they've

161:44

[ __ ] up their life cuz everybody's

161:45

[ __ ] up their life at some point in

161:46

time.

161:46

>> Redemption.

161:47

>> Yes.

161:48

>> If this person can be there and lose it

161:50

and then come back,

161:52

>> maybe I can get my [ __ ] together.

161:53

>> That's the problem.

161:54

>> As a 42-year-old alcoholic. [laughter]

161:56

>> Yeah. Maybe you're not going to be Lewis

161:57

Capaldi, but

161:59

>> maybe you are. Maybe you're uh Oliver

162:00

Anthony. You know, Churchill didn't get

162:03

into power until he was 65.

162:05

>> Wow.

162:06

>> So, all of my life up until now would be

162:09

less than two-thirds of the warm-up set

162:13

for Churchill starting his thing,

162:15

>> right?

162:16

>> So, I I just you never know sort of when

162:18

this stuff's going to come along. I I do

162:20

love though the uh the idea of watching

162:22

somebody climb to the top, lose it, and

162:25

then turn it back around again. I think

162:26

it's just such a [ __ ] wonderful idea.

162:28

We all love that. But I think it's

162:30

because we try to see some some of our

162:33

self in someone, which is why we don't

162:35

like things that are created by a

162:37

corporation where they put together a

162:39

band like the Monkees or something like

162:40

that and fake it.

162:41

>> Nepotism Silver Spoon Baby.

162:43

>> We hate all of that. We hate all of

162:44

that. We hate all the people handed

162:46

their life silver platter.

162:48

>> If it feels like somebody didn't earn

162:50

it.

162:50

>> Yeah.

162:51

>> Yeah. I I worry

162:54

I worry about where um motivation comes

162:57

from for people in a way. If you are

163:00

able to game the system, which people

163:02

are now, they can like speedrun

163:04

relatability and authenticity,

163:06

>> but you don't know if this is some K-pop

163:08

thing that's some industry plant style

163:11

scenario that's just been placed

163:12

together to try and get this

163:15

give you a sense of resonance with this

163:18

person that doesn't deserve it. they

163:20

didn't actually struggle in that sort of

163:21

a way, but they can construct the

163:22

narrative that they did. And uh I think

163:26

in a world that's become increasingly

163:28

prefabricated, like people are looking,

163:30

they're scrutinizing very aggressively.

163:32

Is this person who they say they are?

163:34

This is the hypocrisy that points out

163:35

that they're not.

163:36

>> Right. Right.

163:37

>> And that's where you get performative

163:38

vulnerability. Oh, woe is me. They

163:41

pretend to not pretend to have

163:42

Tourette's, although I'm sure some

163:44

people do.

163:45

[laughter]

163:46

>> Um

163:46

>> they pretend they're struggling.

163:48

>> Correct.

163:48

>> Yeah. because I need the sympathy vote.

163:50

>> Yeah.

163:54

>> Yeah. Isn't that an interesting?

163:55

>> Well, the real problem was when someone

163:56

pretends and you catch them pretending

163:57

like that, then you're never going to

163:59

trust them again. You could fail. You

164:01

can fail and [ __ ] up. You could think

164:02

you got it right and you got it wrong

164:04

and you just, oh, [ __ ]

164:06

>> But if you pretend,

164:09

if you if you lie, if you show

164:11

deception, if you pretend you're

164:13

something that you're not, and they find

164:15

out like like Ellen, you know, she's a

164:17

nice lady. She's all dancing. Meanwhile,

164:19

she's [ __ ] screaming at people and

164:20

mean. Yep.

164:21

>> You know, that's like, "Oh, you were

164:23

lying."

164:25

>> That is [ __ ] catnip.

164:26

>> Yeah.

164:27

>> To people.

164:27

>> Oo, they love it.

164:28

>> Yeah. Well, there's nothing that the

164:30

internet wants more than to to find

164:31

somebody that's a hypocrite.

164:32

>> Sure.

164:33

>> Right. Because the internet is basically

164:34

one big spot the difference competition.

164:36

You said this thing here, you behaved

164:39

this way here.

164:40

>> I can compare the two. You have fallen

164:42

short. Like, and the [ __ ] jury comes

164:44

down and smashes you in the head. It's

164:45

also because we crave authenticity. We

164:48

wish we had it. We crave it in other

164:50

people. We want like we're all trying to

164:55

we're watching all these different

164:57

people like this guy play golf and that

164:58

guy play music and we're watching all

165:00

these people do all these different

165:01

things and we're we're getting something

165:03

out of it all. There's a reason why you

165:05

like that thing on Netflix. It's like

165:07

the there's it fuels the human

165:09

condition. It gives you happiness. like

165:11

it's some there's some in a genuine

165:14

moment like that. It's like a very

165:16

special element that it adds to your

165:19

life and we crave that and it's hard to

165:22

know what's real and what's not real.

165:24

>> That's why people get mad at me when I

165:26

say I like AI music. [laughter]

165:28

>> Like I know I know it's not real. I

165:30

still like it

165:31

>> but I don't like it the same way I like

165:35

listening to Johnny Cash sing Hurt.

165:37

>> You know what I mean? It's like that

165:38

there's a there's an authenticity to

165:40

that. There's a real thing to that

165:42

that's like it's very tangible. It's

165:44

different.

165:44

>> There's an upper bound on it. I

165:46

certainly think

165:47

>> I I'm friends with a lot of musicians

165:49

and one of the issues I think that they

165:51

have with the AI revolution, apart from

165:52

the fact that like they're coming for

165:54

our jobs, which is like obvious,

165:56

>> is that learning a musical instrument is

165:58

really [ __ ] hard and it takes a very

166:00

long time. Mhm.

166:02

>> Uh I think that the revolution for

166:05

podcasting has made it [ __ ] fantastic

166:08

for people to feel less lonely and have

166:10

uh exposure to conversations and

166:12

information they never would have done.

166:14

But anybody that sticks a microphone in

166:17

front of them can record a podcast. It

166:19

may be a totally [ __ ] podcast, but if

166:21

you give me a guitar, I can't make notes

166:24

come out of it. So the bar that you need

166:27

to get over to just be acceptably

166:29

proficient enough to be able to do to

166:31

have the conversation, right? Everybody

166:33

does what is equivalent of a everybody

166:35

that has never recorded a podcast has

166:36

had a great conversation over dinner and

166:38

gone, dude, if we recorded that that

166:39

would have got millions of plays on

166:40

YouTube, right? Like um so everyone is a

166:43

little bit closer to this. And I think

166:45

that one of the issues that the music

166:47

industry or musicians within the

166:48

industry have is that AI feels like it's

166:51

allowing people to leapfrog the first

166:54

very long, very boring, very grindy

166:56

stage of, well, this is where your

166:58

[ __ ] fingers need to go on the

167:00

saxophone

167:00

>> or this is how you need to pick the

167:03

strings in order to make the sound come

167:04

out of the guitar.

167:05

>> Yeah.

167:06

>> And if you leap frog it, that feels like

167:08

a little bit like a technology enabled

167:10

nepotism in a way. You've got yourself

167:12

toward the end. You shouldn't be able to

167:13

make this. This is like a guarded and

167:16

highly invested. Is this I mean you guys

167:17

see this in comedy. In comedy you're

167:19

like dude until you're eight like the

167:21

first seven years like they're just you

167:23

earning your keep and then you're eight

167:24

whatever it is like it's a thousand

167:25

shows and once you've done a thousand

167:27

spots then you can say that you've

167:28

started doing comedy or whatever it is.

167:30

For podcasting I think it's like 150

167:32

episodes before anyone that asks me like

167:34

I'm beginning my podcast and what's your

167:36

advice and I'm like once episode 150

167:38

starts.

167:38

>> Yeah.

167:39

>> You have begun doing a podcast. Up until

167:41

then it's basically a warm-up set. And

167:43

uh I think with music because it's such

167:45

a high investment that people need to

167:49

have at the very very beginning this

167:52

sense that there is a shortcut that

167:53

allows people who haven't earned their

167:55

way to get there. It would be like if

167:56

you were using AI to write comedy sets.

167:58

>> Yeah. And I think you're correct. But I

168:00

also think that's probably what lions

168:03

felt when people invented guns.

168:06

Like this is [ __ ] I've been chasing

168:08

you [ __ ] down and eating you

168:10

for thousands of years. And now all of a

168:11

sudden you just squeeze your little

168:13

finger and I die instantaneously. That's

168:16

[ __ ]

168:18

>> It's coming.

168:18

>> Mhm.

168:19

>> It's coming. It's coming in all forms of

168:21

entertainment. It's going to They've

168:23

figured out what you like. They've got a

168:25

giant catalog of billions of hours of

168:27

human beings paying attention to things.

168:29

And it's coming. It's coming. It's going

168:31

to overwhelm you. And it's going to be

168:32

indiscernible from reality eventually.

168:34

It's going to be something that you've

168:36

physically experienced as well as visual

168:40

and audio. You're you're going to have

168:42

the whole experience.

168:43

>> We'd better enjoy ourselves while we

168:44

can.

168:44

>> Yeah. Have fun while you can. [laughter]

168:47

Chris, I appreciate you very much. It's

168:48

always awesome talking to you. Your

168:50

podcast is excellent. Um, tell everybody

168:52

where they can get it, where they can

168:54

find you.

168:55

>> Modern Wisdom on Apple Podcasts and

168:57

Spotify, Chris Williamson on YouTube,

168:59

etc., etc. Uh, I appreciate the [ __ ] out

169:01

of you, man.

169:01

>> I appreciate the [ __ ] out of you, too,

169:02

brother. It's always good talking to

169:03

you. It's always fun. Goodbye everybody.

169:06

Peace. [music]
