
Joe Rogan Experience #2433 - James McCann


Transcript


0:01

Joe Rogan podcast. Check it out.

0:03

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day.

0:13

>> Baby. That's [ __ ] good.

0:16

>> Have we started? Are we going?

0:17

>> We've started. Oh, no.

0:20

>> Not over the relics.

0:21

>> The dirtier this uh table is, the

0:24

better.

0:24

>> Get it away from the What is that?

0:27

>> The relics. That is uh that's from my

0:29

friend John Reeves. He gave that to me.

0:31

That's a mastodon tooth or woolly

0:33

mammoth or what's the difference? What

0:36

is the difference between woolly mammoth

0:37

and a mastodon? They must be a different

0:40

age,

0:41

>> a different era, but uh that's a giant

0:43

tooth that he there's a company in

0:46

Alaska, I forgot the name, but they uh

0:48

it kind of seems [ __ ] to carve into this

0:50

thing because it is 10,000 years old at

0:52

least.

0:53

>> How many of them are there though? Do

0:54

they have heaps of them?

0:55

>> They have heaps of them. But this is

0:57

really cool. It's like they carved a a

0:59

mammoth in it. So what is the

1:00

difference? According to our sponsor,

1:02

Perplexity, a woolly mammoth and a

1:04

mastodon were related, but quite

1:05

different ice age elephants. Mammoths

1:09

were taller, more lightly built grass

1:12

eaters, while mastodons were shorter

1:14

stockier browsers that ate woody plants.

1:17

Okay.

1:17

>> I was going to say the hair maybe, but I

1:19

don't. It's obviously more

1:21

>> woolly mammoth, right? Yeah, mastodon

1:23

looks like an elephant. Yeah, the

1:24

mastodon horn does look cooler.

1:26

>> They're pretty cool. They're all pretty

1:27

cool.

1:29

You know, they lived on an was it where

1:32

were the last mastodons? I want to I

1:35

think I want to say they lived on an

1:36

island

1:39

>> until like 10,000 years ago or something

1:41

like that cuz most of them died out.

1:45

They don't they don't know how they died

1:47

out, but the there's two theories. One

1:49

one theory is people killed them all,

1:50

which is a shaky theory.

1:53

Because it's people of 10,000 years ago

1:55

with [ __ ] sticks.

1:56

>> Were they around 10,000 years ago?

1:58

>> Oh, yeah. Yeah. Yeah. Yeah. They died.

2:00

>> We definitely did that then.

2:01

>> I don't think so. I think it was a

2:03

cataclysm. I think it was the same thing

2:05

that killed 65% of all megafauna. That's

2:08

the problem. It killed so many different

2:09

animals like almost instantaneously.

2:11

Yeah, that's it.

2:13

>> 4,000 years ago. Wrangel Island, remote

2:16

Arctic island off Siberia's coast, had

2:19

the last woolly mammoth till about 4,000

2:21

years ago. Isn't that nuts? That's

2:22

nothing.

2:23

>> That's nuts. Yeah, that's like before

2:25

the pyramids were built.

2:28

>> It's

2:28

>> No, I mean after the pyramids are built

2:30

rather.

2:30

>> Similar time. Yeah. After.

2:32

>> Yeah. After the pyramids, allegedly. I

2:34

think they're probably built earlier

2:35

than that. But the uh official date is

2:38

2000.

2:39

>> Strange man with the beard. And the

2:40

>> Which one?

2:41

>> That man you had on to debate it who's

2:43

always clapping back on Twitter and

2:44

going like there's nothing funny about

2:46

the

2:46

>> Oh, Flint Dibble.

2:47

>> Yeah. I don't want to invoke his ire. I

2:49

don't want him. He's got a lot of time

2:52

and a lot of

2:52

>> I actually enjoyed talking to him about

2:54

non-arch,

2:56

non, you know, ancient-history-related

2:58

stuff. He has some interesting things

3:00

about seeds like he does a lot of work

3:02

in seeds.

3:03

>> Okay.

3:04

>> Yeah. No, it's it's actually really

3:05

interesting how

3:06

>> the history of seeds.

3:07

>> Yeah. When so say if you have a wild

3:09

plant they can tell the difference

3:10

between a wild plant and an

3:12

agriculturally grown plant. Yeah. And

3:14

the way is the seeds change. So when you

3:17

have a wild seed, it is uh more

3:20

conducive to the growth of the plant if

3:22

the seeds break off easier and scatter

3:25

and and they they get into the ground

3:27

easier. So they they break free of the

3:29

plant. But then when you use

3:32

agriculture, the seeds don't become

3:34

important for the creation of new plants

3:36

because you're always taking the seeds

3:37

anyway and planting the seeds, right? So

3:39

those seeds are seeds are more robust

3:41

and they hang on more.

3:43

>> Yeah. So you could you could tell by

3:45

looking at the actual seeds themselves

3:47

whether it's an agriculturally based

3:50

seed or whether it's a wild seed.

3:53

>> That is good. I hadn't thought about

3:54

that.

3:54

>> Yeah, it's it was really cool. That part

3:57

was cool. The the the shittiness is not

3:59

cool and calling Graham Hancock a

4:01

racist. They do that with like everyone.

4:03

Everyone who has anything to say about

4:06

the historical narrative that doesn't

4:09

fit into exactly what they're teaching

4:12

or what they have been teaching. They're

4:14

like so unwilling to accept that there's

4:17

any alternative timeline. But they keep

4:19

getting [ __ ] because over and over

4:21

again they keep finding these new things

4:23

that are older and older.

4:24

>> Yeah.

4:25

>> Like Göbekli Tepe was the big one. It happens

4:27

in every discipline.

4:28

>> Yeah.

4:29

>> Yeah. I mean it happens in comedy.

4:30

There's people that don't like new

4:32

comedians that are coming up. They don't

4:33

like what they're doing differently or

4:35

>> a thing last night about prop comedy.

4:37

Like everyone just stopped doing prop

4:38

comedy at a certain point.

4:39

>> Well, it's Carrot Top.

4:40

>> It's because of Carrot Top. And also because

4:42

the bullying you would receive right

4:44

>> at the moment for having props. There's

4:45

one Rick Glassman. Am I getting his name

4:47

right?

4:48

>> I don't know. But he had some props and

4:49

he was really funny and he got away with

4:51

it. But he's the only person in America

4:52

other than Carrot Top I've seen with any

4:54

props.

4:54

>> Well, when I started out there was a

4:56

bunch of guys who had props. There was a

4:58

there was a bunch of guys who had props

5:00

and it was fun. It was fun to watch.

5:02

There was uh God

5:05

Dr. Wid, I forget his name. Dr. Whiz,

5:09

I forget his name, but he was a a guy

5:11

when I first started out in like the

5:13

1980s. He had props and he was good. He

5:16

was a funny comic.

5:17

>> It'll be cyclical. It'll come back. Like

5:19

ladies with ukuleles had to go away for

5:21

a time. It was necessary that we purge

5:24

ukulele women from comedy.

5:25

>> How many were there?

5:26

>> Oh my god. I don't know. That was Is

5:28

this him?

5:28

>> Dr. The Legendary Wid.

5:30

>> That's it. The Legendary Wid. Yeah, that's

5:32

the dude. And he uh would do like

5:35

science-based humor. He was a funny guy.

5:38

So, this is, you know, I saw him in like

5:40

88,

5:42

>> 88, 89. But the point was that guy was

5:44

really funny when he started busting out

5:46

the props.

5:47

>> Yeah.

5:47

>> And uh and we were like I was like, why

5:50

don't you just do props?

5:51

>> This is your thing.

5:51

>> Yeah. like that kind of humor, his kind

5:54

of humor, it's almost like it's missing

5:56

something in just the just the straight

5:59

standup form.

6:00

>> There's like there's waves of things

6:01

become trendy and then people who can't

6:03

really do it very well jump onto it and

6:04

then it gets lame and people stop doing

6:07

it.

6:07

>> Well, a lot of it is one guy gets really

6:10

successful doing it and then that

6:11

becomes his thing.

6:12

>> We had a run of people pretending to be

6:14

[ __ ] in Australia. It was like how

6:17

hard did they try?

6:18

>> Uh really hard. Were they on the border

6:20

and just like slowed it down a little?

6:21

>> Weird sweaters. People having like

6:23

fireworks that they would fire into

6:25

themselves and everyone would like come

6:26

out with cards and read their act.

6:28

>> That's what happens when you take away

6:29

everyone's guns.

6:30

>> They're trying to take them away

6:31

again.

6:32

>> They already took them all away and then

6:34

somehow we still had a massive shooting

6:36

>> and now the response is, well, maybe we

6:38

could

6:39

>> take even more of them away.

6:41

>> Um, what was the nationality of the

6:43

people that caused the shooting? The son

6:45

I think was born in Australia and the

6:47

dad there was a big fight over it on

6:48

Twitter where people were going he's

6:50

Pakistani and the

6:50

>> I remember that but I didn't I don't

6:52

anymore. I don't anymore. I don't get in

6:54

there.

6:54

>> The big argument was over the religion

6:56

of the hero who took one of the guns

6:59

away.

7:00

>> So like the cops were apparently

7:01

cowering. That's the narrative. I don't

7:03

know. But one guy ran up and it's a

7:06

great video of a guy like he he runs at

7:09

a guy with a gun and wrestles the gun

7:11

off him and aims the gun at him and lets

7:13

he does let the guy get away. He doesn't

7:14

want to kill him,

7:16

>> which is kind of crazy. The guy just

7:17

killed how many people?

7:18

>> Oh, and then I think the guy gets a gun

7:20

and goes on killing people. But he Yeah,

7:22

but he's not a killer. This guy who

7:24

wrestled the gun off him, he was just

7:25

heroic

7:25

>> in the head with the butt like in the

7:27

movies.

7:27

>> I don't know what I I mean I wouldn't I

7:29

wouldn't have ever run up to a man with

7:30

a gun. I would have been out of there.

7:32

But the argument was what religion was

7:34

the guy who took the gun because people

7:37

on the right really didn't want him to

7:38

be a Muslim. They were like it was a

7:40

huge thing on X of people.

7:41

>> People on the right didn't want him to

7:43

be

7:43

>> because it was Muslim shooters. But then

7:44

it it looked like he was a his name was

7:46

like Ahmed Dal Ahmed or something.

7:47

>> But hold on. Why would the people on the

7:48

right not want him to be a Muslim?

7:50

>> Because then you can go this is a Muslim

7:51

thing. Muslims were doing the shooting

7:53

and we can just go let's deal with the

7:54

Muslims.

7:55

>> Oh, you mean the people the guy who

7:56

captured the guy? the guy who wrestled

7:58

the gun off. He was also a Muslim, which

8:00

then makes it like

8:01

>> a heroic. Yeah.

8:02

>> Yeah.

8:03

>> Well, his name is like Muhammad

8:04

Muhammadson.

8:06

>> Imagine being a regular Muslim and

8:07

having to deal with these crazy

8:08

modifications.

8:09

>> There he is.

8:10

>> That guy.

8:11

>> Yeah. People love him. But man,

8:13

>> shoot the guy in the foot. If you didn't

8:15

want to kill him, shoot him and blow his

8:17

[ __ ] ankle.

8:18

>> No one can really do that. And it's a

8:20

big Look at him go.

8:21

>> Oh, that's amazing.

8:23

>> And he doesn't do anything.

8:25

>> So the guy just gets away. The guy does

8:27

get away.

8:28

>> Oh, this is not good.

8:29

>> But then after he lets him get away, I

8:31

think he drops the gun and he goes away

8:33

and then he gets shot again in the arm.

8:35

>> Who knows what to do when there's a live

8:37

>> Yeah. You don't know what to do. Well,

8:39

that's a a good person. That's a good

8:41

person.

8:41

>> He is a national hero at the moment. And

8:43

uh I think if he had

8:45

>> Man, people wanted him to be a Maronite

8:47

Christian so bad. The Gropers were

8:50

desperate for him to be. There was a lot

8:51

of people going, "Well, actually,

8:52

>> you know, that's the real problem we

8:54

have in this country. We want to pretend

8:55

that people actually exist in groups.

8:58

Even if there's high percentages of

8:59

people from groups that are doing bad

9:01

things, this there's still a giant

9:04

percentage that are not. And to alienate

9:06

all those people by just lumping it all

9:08

in as one group together. Imagine like

9:10

imagine you're a peaceful Muslim and you

9:12

have to deal with this [ __ ] and you're

9:14

like, "Guys, I just want to pray. I'm

9:16

just trying to like find oneness with

9:18

God. That's all I'm trying to do."

9:20

>> I love twirling.

9:21

>> Yeah,

9:21

>> I'm one of the twirling ones. They're my

9:23

favorite ones personally.

9:25

>> What's a twirling?

9:25

>> The twirling dervishes. They just love

9:27

twirling. They love to twirl.

9:29

>> But the

9:30

>> twirling. I was trying to figure out

9:32

what you were saying.

9:32

>> Twirling. But this is what's we So after

9:34

that, the government comes out and is

9:36

like cracking down on right-wing

9:38

extremism

9:39

>> cuz it's a lefty government. And they

9:40

go, "We have a clearly we have a problem

9:42

with right-wing extremism." So now

9:44

they're trying to reclassify

9:47

like you know globalized Islamist jihadism

9:51

as a form of right-wing extremism which

9:53

I'd never which like yeah I guess it's

9:55

not commie stuff but

9:57

>> well you have to look at it on paper

9:58

objectively it is

10:00

>> yeah but I don't know how much they hang

10:02

out I don't know if these guys I don't

10:03

think these guys are reading like

10:06

>> I don't know William F. Buckley Jr. It's

10:08

still Let's break down what is right-wing

10:11

then.

10:12

>> Okay,

10:13

>> let's say this, okay? Do they want to

10:16

completely control women's behavior and

10:20

completely dictate whether or not the

10:22

woman can leave the house with certain

10:23

clothes on, what they're allowed to do,

10:26

>> right?

10:26

>> Yeah.

10:27

>> That's kind of kind of a right-wing

10:28

thing, isn't it?

10:29

>> Yes.

10:30

>> Total religious adherence. They want a

10:32

religious state.

10:33

>> Yeah, but the Taliban want to dance with

10:35

little boys. That seems like a left

10:37

separate breakoff group. They're like

10:39

the Baptists.

10:41

>> They're like the Catholics. You know

10:43

what I mean? You got your regular

10:44

Christians and then you got some other

10:46

[ __ ] that are out there running

10:48

wild with new rules. Mormons. How about

10:50

this?

10:52

>> Mormons. Yeah. You know, but that's what

10:53

I'm saying. It's like their breakoff

10:55

group. It's not the ones who are banging

10:57

the boys. That's not normal.

10:58

>> There's a lot of guys out there that are

11:00

Muslim that are not banging boys. So,

11:02

when you connect them with the Taliban,

11:04

they're like, "Hey, bro. I'm just

11:06

praying over here.

11:07

>> It's all people just trying to have fun.

11:10

>> Yeah. Yeah.

11:10

>> Who am I to judge anybody?

11:12

>> The problem is then you push when you

11:13

push these people. It's the same thing

11:15

that happens when you call everyone a

11:16

racist. What do you get? You get a Nick

11:18

Fuentes. You get a guy who emerges who's

11:20

got the balls to [ __ ] talk and have fun

11:23

and say wild things that are very

11:25

inappropriate and sometimes racist.

11:27

That's what you get. You get someone

11:29

embraces that guy because you've been

11:31

told you're a racist just for being

11:32

white. Yeah.

11:33

>> You know, you've been told there's

11:34

something wrong with you. white male.

11:35

Like I there was a time where someone

11:37

would say something in comments all the

11:39

time. I would watch these people arguing

11:40

and someone it was a common thing to say

11:45

uh as a white man I think you should

11:47

probably shut your [ __ ] mouth. Like

11:49

as a white man like you're a white man

11:51

you're disqualified from having an

11:53

opinion on something because you are a

11:54

white man.

11:55

>> Yeah.

11:55

>> It's another form of racism. It's just

11:57

an accepted form of racism that's really

11:59

weird. But then you like so like Nick

12:01

Fuentes is getting all his other ideas

12:03

through as well because he was the only

12:04

person saying things that the average

12:06

person would think was kind of normal.

12:09

>> Well, I've been thinking about this

12:11

wasn't a lot of the stuff he's saying he

12:13

that was not something the average

12:14

person would think is normal. you sneak

12:15

your other weird stuff through like when

12:18

everyone's going right

12:19

>> um you know like when he says when he's

12:21

getting attacked for going like a black

12:23

neighborhood is going to be more violent

12:24

on average in America you go

12:26

>> yes I've traveled around the country and

12:28

that is I think there's a long history

12:29

for why that's true

12:30

>> well it's factually correct

12:32

>> that seems to be correct

12:33

>> the the question is though why and

12:36

that's where it gets uncomfortable yeah

12:38

>> because the the real reason for why is a

12:42

a host of factors but the primary one is

12:46

crime and poverty. The primary one is

12:48

they live in a community that's filled

12:50

with crime and poverty. Yes. And if you

12:51

have a and drugs and if you have a

12:53

community where people are selling drugs

12:54

and it's crime and poverty, you're going

12:56

to get a lot of violence. Whether it's

12:57

an Italian community, Armenian

12:59

community, any community where you got a

13:01

lot of crime and a lot of poverty.

13:03

>> I first came here, I went to Appalachia.

13:04

>> People are going to get killed.

13:05

>> There are white people doing crazy crazy

13:08

things.

13:08

>> You ever see the Wild Wonderful Whites

13:10

of West Virginia?

13:10

>> I watched it like a week ago.

13:11

>> [ __ ] amazing. The most charismatic

13:13

family I've ever seen.

13:14

>> Knoxville did that, didn't he?

13:16

>> Yeah. Executive producer.

13:18

>> Yeah, bro. That

13:19

>> made me feel so homesick. Look, like I

13:21

was only there for a couple months. I

13:22

wanted to go back so bad.

13:23

>> The Dancing Outlaw.

13:25

>> His when they're like granddaddy had a

13:27

new way of dance and it's the most

13:29

insane.

13:30

You're like, was that really going to

13:32

take off?

13:32

>> It did.

13:33

>> Was that the style of dance,

13:34

>> bro? When you're on meth, it's awesome.

13:36

>> I mean, meth dance to stay with.

13:38

>> Oh, they were on everything.

13:39

>> They were on the lot. The How about the

13:41

lady? I'm always been thought of as a

13:43

sexy one. She was a stripper. Remember

13:45

her? The voice. I did a big deep dive on

13:49

Wikipedia about them afterwards.

13:51

>> She stomped a kitten.

13:52

>> Which one's dancing here?

13:53

>> This is Jesco. American Outlaw.

13:56

>> Jesco.

13:56

>> Okay. He's the younger guy.

13:57

>> He's uh

13:59

>> Jesco lives out the legacy.

14:03

>> Excuse me.

14:03

>> He's He's like He keeps the dancing

14:05

alive. He's the one who's a celebrity in

14:07

the show,

14:08

>> right? But then there's another

14:09

documentary about him and in both

14:10

documentaries he complains about a woman

14:12

making his eggs wrong.

14:14

>> Yeah, that's that dude. Yeah,

14:16

>> he's got it. He's a charismatic guy.

14:18

>> Yeah, he he said he would cut her if she

14:20

gave him ruddy eggs. I was like

14:22

>> sloppy eggs.

14:24

>> Settle down, bro. Like maybe we

14:26

shouldn't be celebrating this.

14:27

>> But I think I think one of them just got

14:29

out of prison. I think the one who at

14:30

the start of that

14:31

>> documentary I hope Trump got him out.

14:33

>> Who got out? What did he do?

14:35

>> The one who uh shot his uncle in the

14:39

right.

14:40

>> Yeah,

14:40

>> I think he just got That's the sexy one.

14:42

>> I've always been the sexiest one in the

14:44

family. Listen to what she said. How the

14:46

way she says it though. The voice is

14:47

incredible.

14:48

>> Just pictures.

14:48

>> Yeah, I think that sexy one. I think she

14:51

did get in trouble for stepping on a

14:53

cat.

14:53

>> Well, there was a thing in that film

14:55

that was interesting though towards the

14:57

end where you see like some of them are

14:59

trying to like move away from that life.

15:01

That girl, one girl got sober. So, there

15:03

was like a take to it where they

15:04

realized like, hey,

15:05

>> this is not sustainable. this is a crazy

15:07

way to live. I'm a mother. Like, what am

15:09

I doing, you know? And she was trying to

15:11

get out of it, which I think a lot of

15:13

people do come to the realization if

15:15

you're in that kind of a community, I

15:17

got to get the [ __ ] away from these

15:18

crazy [ __ ] and stop doing meth.

15:20

>> It is. Yeah. I think, but it's how do

15:23

you do it? See, this is the thing. This

15:24

is the thing when you say like, is it is

15:27

it true that there's a higher percentage

15:29

of murders that occur in black

15:32

communities? Right. Right. But as

15:35

opposed to poor communities, like what

15:37

about like in deeply impoverished

15:39

communities? Like, and then when you

15:42

introduce a history of gang violence and

15:44

crime and no one ever does anything to

15:46

stop it, it's going to stay the same.

15:48

Whether it's in Appalachia or whether

15:50

it's

15:51

>> the Hatfields and the McCoys, all those

15:53

[ __ ] that were killing each

15:54

other back in the Wild West days. I

15:55

mean, it's probably horrible back then.

15:57

Why? Because they let it be that way.

15:59

Nobody did anything about you couldn't

16:01

stop them. And I think some of the

16:02

solutions for it are very bad. This is

16:04

my I don't want to speak out of turn

16:06

because it's not my country, but like

16:07

when I've been driving through

16:08

>> people love to come to America and tell

16:10

us what to do.

16:10

>> I love it.

16:11

>> I I think it's the greatest country in

16:12

the world. And I repeat that again. When

16:14

I drive through like a bad area and

16:16

there's like a Planned Parenthood with a

16:18

line around the block

16:20

>> and things set on fire. And you can just

16:21

tell like

16:22

>> I know that Planned Parenthood started

16:23

out as a eugenicist organization where

16:26

they went like that was the lady who

16:28

founded it. That was her thing. And you

16:30

can really see in those neighborhoods,

16:32

it's like if you have a child here,

16:34

you're going to be tied to this

16:35

community. We want you to get out. We

16:37

want people who have the spirit to get

16:38

out of here and to live a good full life

16:41

in America, not to be tied down to being

16:43

in like a really difficult crime-riddled

16:47

area.

16:47

>> Yeah.

16:48

>> So abort your children so you can get

16:50

out seems to be the

16:52

>> I think they're still doing the

16:53

eugenicis thing of being like just be

16:55

free for different reasons. is not cuz

16:57

they want to dilute the numbers in the

16:58

population or whatever, but because they

17:00

go, you've got to be a free person who

17:01

can leave and children will tie you to a

17:03

place.

17:04

>> Yeah, that's a way to look at it. That

17:06

was when I was driving through I forget

17:08

what Wisconsin, northern Wisconsin, I

17:10

don't know. I just hit with this. So, oh

17:13

man, it's like usually the the rough

17:15

area of a town is lifted up by a freeway

17:17

in America. Like you don't see if you

17:19

drive into Chicago, you're just way up

17:21

here on a freeway and then you come down

17:23

into like the most beautiful buildings

17:24

you've ever seen in your life and people

17:26

go it's very scary over in the other

17:27

part of Chicago and you go I never saw

17:29

it. I was

17:29

>> above it and

17:30

>> I was 30 ft in the air.

17:32

>> Yeah.

17:32

>> But in some places I have driven through

17:33

it and I've gone or I've stopped and you

17:35

go there's

17:37

>> someone's like if I lived here I mean

17:39

there are some areas that are so rough.

17:41

It's like, man, if I lived here, I would

17:45

go and steal and kill from the people

17:47

who live 20 minutes up the road for

17:49

sure. Do you know? Like, you just drive

17:51

20 minutes up the road and there's a

17:52

German town and everything's perfect and

17:53

everyone's rich and everyone's beautiful

17:55

>> and you

17:56

>> This doesn't happen in

17:58

>> I don't know. I'm from a very flat

18:00

country by comparison. The highs and

18:02

lows here are incredible.

18:03

>> Oh, the highs and lows of what? You mean

18:05

>> America?

18:05

>> You mean poverty and wealth?

18:07

>> Yeah.

18:08

>> Okay. like like the Bronx being an hour

18:11

from the Hamptons.

18:12

>> Okay, if your New Year's resolution was

18:14

change everything and be a new person,

18:16

good luck. So, instead of pretending

18:19

you're going to meal prep kale forever

18:21

or do morning cold plunges, here's one

18:24

actually realistic thing. AG1. AG1 is a

18:28

daily health drink that supports your

18:29

energy, gut health, immune health, and

18:32

helps fill common nutrient gaps. Just

18:34

one scoop in cold water each morning and

18:36

you're off. It's got over 75 vitamins,

18:39

minerals, probiotics, and whole food

18:42

ingredients in there. So, instead of

18:43

guessing whether you need a probiotic or

18:46

a prebiotic or sorting through 10

18:48

different bottles of pills and powders,

18:49

you can just do one scoop and get on

18:51

with your day. It's great because it

18:53

feels like the grown-up move, but for

18:55

once, it's actually really easy. It

18:57

takes like 30 seconds, and you'll notice

18:58

the steadiness that sets you up for the

19:00

day. Not wired, not crashing, just

19:03

functional human being energy. I've

19:06

partnered with AG1 for years and if you

19:08

want to give it a try, head to drink

19:10

AG1.com/joe.

19:12

And for a limited time, you'll get a

19:14

free AG1 duffel bag and free AG1 welcome

19:18

kit with your first AG1 subscription

19:21

order only while supplies last. That's

19:24

drinkAG1.com/joe

19:27

or visit the link in the description to

19:29

get started.

19:31

>> Well, it's all of it's real close. I

19:32

used to say that like when I lived in

19:34

LA. I was like, you know, people like

19:36

this is a good neighborhood. I go,

19:38

right? But you know, people from a bad

19:40

neighborhood can just come into your

19:42

good neighborhood. You know about all

19:43

that, right?

19:44

>> When people are like, why do you have

19:45

dogs? Why do you have guns? I was like,

19:47

what? Like, do you watch the news? Yeah.

19:49

>> Are you [ __ ] crazy? Like, you got to

19:52

be careful out there. And most of the

19:54

time it's not going to happen to you.

19:55

The 99.99%

19:57

of people will never experience anything

19:59

awful. But to not have any idea that it

20:03

could ever happen to you is bad. I think

20:05

the real problem, and this is the one

20:08

that just doesn't get addressed with any

20:10

politicians ever, is something massive

20:14

has to be done to stop

20:18

this like ancestral like this lineage of

20:22

people that are coming from these

20:23

crime-ridden places. And no, no one

20:26

changes anything about it at all. We had

20:27

a cop on once from Baltimore and he was

20:30

telling us that while he was on on duty,

20:33

he found this uh like crime sheet, a

20:36

dock sheet of all the things that

20:37

happened in like 76 or something like

20:39

that. And he was reading all the areas

20:42

and all the crimes and and it dawned on

20:44

him. He's like, "Oh my god, like this is

20:46

the same crimes in the same area decades

20:49

later and nothing has changed.

20:52

>> They need to do something huge." like

20:56

treat that as if it's an untapped

20:58

resource of human potential because

20:59

that's what it is. All those people in

21:01

that community, if they had been born

21:03

and raised with different families in a

21:06

different place, completely different

21:08

outcome, a giant percentage of who you

21:10

are is dumb luck. And if the people that

21:13

got the worst luck to be born in a crack

21:16

house or be born in a place where

21:18

there's gang violence on the street

21:19

every day and you go to school and you

21:21

have to pick a gang, if you don't pick a

21:22

gang, the [ __ ] kill you. Like what

21:24

are you going to do? Like you're you're

21:26

not going to do anything but what

21:28

everybody else is doing. That's what

21:29

most people are going to do. The few

21:31

that are going to break out, maybe

21:32

they're musicians or an athlete or

21:34

something like that. They break out.

21:35

Yeah.

21:35

>> But for the most part, you're [ __ ]

21:37

But what it is is untapped and

21:40

unrealized human potential that's going

21:42

to waste on the most stupid [ __ ] [ __ ]

21:44

in the world.

21:46

>> I But then when you try and do something

21:48

like that in America, the push back is

21:50

huge. Like I think

21:51

>> why is what is the push back of

21:52

investing into communities? I would say

21:53

like in a small like I think the

21:55

National Guard going into some places.

21:57

>> Okay, that's different. So that's

21:58

>> that's what that's what it can look like

21:59

sometimes. Like there are definitely

22:01

that's what it can look like under this

22:02

administration.

22:02

>> Portland Yeah, there's got to be a

22:04

better way of doing it.

22:04

>> Well, it's you're just going to get too

22:06

much push back. But what you can't do is

22:08

let it get to the point where it's

22:10

feasible to call in the National Guard.

22:13

That's what's crazy. It's like their law

22:15

enforcement has been so handcuffed by

22:18

the the administrations, especially in

22:20

northwestern United States. Like

22:22

everybody, they don't get enough sun.

22:24

They lost their [ __ ] mind. Everyone's

22:26

depressed and everyone's trans. It's

22:28

crazy up there. It's crazy.

22:29

>> I was just in Portland. I was in

22:31

Portland just before the National Guard

22:32

went in and I was in Portland like

22:33

>> how insane.

22:34

>> It's so much

22:36

>> you can walk around a little little

22:37

>> after the National Guard.

22:38

>> I will like I know people were very

22:40

upset in Portland about that, but I

22:42

think just quietly they were going it's

22:43

kind of nice to the train station again.

22:46

>> The mayor in DC thanked Trump.

22:48

>> Yeah. Yeah, it's like that this is like

22:49

the safest it's ever been here since you

22:51

brought in the National Guard. Like, but

22:53

the problem is that sets a [ __ ]

22:56

precedent. So, here's the thing. If it's

22:58

necessary, let's say you have a place

23:00

that's a literal, not even a real place,

23:02

a fictional place in America where

23:04

there's a literal gang war going on and

23:07

dozens of people are getting shot every

23:09

day and it's it's basically a war zone.

23:11

Let's just imagine a place like that.

23:13

you would say, "Okay, it's probably a

23:16

good idea to bring in the military and

23:18

control that because the entire

23:20

population is at risk. It's very

23:22

dangerous. It's a literal war zone in

23:24

the middle of a modern American city. We

23:26

have to stop that." So, well, the thing

23:28

is if people are lighting newspaper

23:32

stands on fire, people are doing this.

23:34

People are breaking into Starbucks.

23:36

Let's bring in the military. People

23:38

aren't obeying the speech laws. Let's

23:40

bring in the military. people are not

23:43

using their digital ID. Let's bring in

23:45

the military. It's like there's there's

23:46

got to be a separation between our army

23:49

and our civilians. And it has to be a

23:51

big [ __ ] reason to break that

23:53

separation. I think I mean you did it in

23:55

the 60s in the south when like busing

23:59

came sorry y all the United States when

24:03

when the when Jim Crow was happening in

24:05

the south

24:06

>> the military got sent in and people you

24:09

you desegregated the south by force

24:11

right

24:11

>> so that was deemed to be like an

24:12

appropriate use of

24:14

uh like a monopoly on violence to enact

24:17

a social change

24:19

>> like you're not going to have segregated

24:20

schools anymore we're going to have the

24:22

military there and make sure that this

24:23

works Yeah. Crazy you have to bring in

24:25

that the military to get people to allow

24:27

black people and white people to go to

24:29

school together.

24:29

>> I mean, yeah, they didn't want Well,

24:32

>> it's just so weird when I go to the

24:34

South now because everyone is so

24:35

friendly and people do seem to get along

24:37

and you go,

24:38

>> "Your grandparents were like

24:42

>> brother had to rip the craziest stuff."

24:44

>> Well, it's terrible. I mean, that Emmett

24:46

Till I just found out about that after I

24:48

got here. It's unbelievable. And they

24:50

were still shooting the Emmett Till

24:52

statue that they put up. They had to

24:54

like replace it with a bronze statue so

24:56

the bullet holes wouldn't affect it. That's

24:58

what was going on.

24:59

>> I believe that was what was happening

25:00

until like quite

25:00

>> sure it wasn't just one KKK dude that

25:02

ruined one dude.

25:04

>> You know what I'm saying? That's the

25:05

problem. You get one wacky guy in a

25:07

neighborhood and you that's a racist

25:08

neighborhood. They were shooting the

25:09

Emmett Till statue. Maybe it's one [ __ ]

25:11

working the tire shop.

25:13

>> You know, one [ __ ] dude smelling his

25:15

own farts and loading up his rifle. that

25:17

one Arkansas MMA fighter who kept saying

25:20

that he loved Hitler did a did a lot did

25:22

a lot to hurt the reputation of that

25:24

football team. He always had the

25:26

Razorbacks in the back.

25:27

>> Yeah, that wasn't a I think he did not

25:30

phrase that well. I think

25:32

>> I think uh I think there's a lot of

25:35

people here's the thing. There's a lot

25:36

of people that become experts and I'm

25:38

guilty of this as well by uh you're

25:41

you're talking about something where you

25:44

maybe watched a YouTube video, you know

25:47

what I mean? Like maybe you uh maybe you

25:50

read an article about it. It's some

25:52

[ __ ] Politico, who knows? Who knows

25:54

where you read it? Some it could be some

25:55

crazy right-wing source. You read

25:58

something, you took it as fact, and then

26:00

you talk to a bunch of other people that

26:01

also take it as fact. And next thing you

26:03

know, you start talking and

26:06

>> you have the biggest show in the world

26:07

>> saying [ __ ] Yeah, that's me.

26:09

>> Okay. But people always criticize that.

26:11

People always have a go at the

26:12

podcasters for like spouting off on

26:14

things that they're not.

26:15

>> That is what I do.

26:16

>> But how come there's no responsibility

26:18

on the mainstream legacy media for

26:20

having gotten really really boring over

26:22

the last

26:23

>> not just 15 20 years.

26:24

>> Boring, yes. I would say lying as well.

26:27

Completely compromised. Totally

26:28

untrustworthy. Completely compromised. I

26:30

just got the New York Times app because

26:32

I thought I'll have a look at that. I

26:33

finally got enough money where I can pay

26:35

a dollar a week to be on the New York

26:36

Times app.

26:37

>> Yeah.

26:38

>> And it's so um I mean it's just they've

26:41

built Twitter like the experience of it

26:44

and the scrolling on it. It feels like

26:45

you're in Twitter but only mediated

26:47

through

26:49

>> selected journalists from the New York

26:51

Times and suddenly you're like I'm just

26:53

stepping into for a moment whatever

26:56

bubble that is. I just I wanted to take

26:57

a look at it. It's in it's like

27:00

>> I think they're all going to have to

27:01

course correct. I think they're all

27:03

going to have to realize that it's not

27:05

it's not being intellectual like a true

27:09

intellectual, a true progressive by only

27:12

looking at things from one perspective

27:14

and to automatically assume that anybody

27:16

that has a different perspective.

27:19

>> Hey, we're back.

27:20

>> There we go.

27:20

>> Where was I?

27:21

>> You're so they need to have a course

27:23

correction.

27:24

>> We're talking about the mainstream media

27:25

and that they've lost that many people.

27:28

That's what I was saying was that you

27:29

you can't proclaim yourself to be

27:31

intellectual by only listening to one

27:34

perspective and to being like very

27:36

aggressive and hostile about the other

27:39

perspective. Immediate ad hominems,

27:42

immediate attacks on, you know, lumping

27:47

everyone in together, associating like

27:49

we were talking about earlier,

27:50

associating ancient history with racism.

27:53

Like, you're doing that. It's a little

27:54

trick you're doing. You're not having a

27:56

real conversation. you're being a [ __ ]

27:58

And this kind of communication sucks. It

28:00

sucks for the left. It sucks for the

28:03

right when people on the right. It sucks

28:04

for It's a bad human communication

28:08

uh skill. If you were good at it, you

28:11

would want other people to have

28:13

different opinions and you'd want to

28:15

hear those opinions and talk to those

28:16

people.

28:17

>> I think they're trying to course

28:18

correct. This is what's weird to watch

28:19

is they're And it's who they're

28:21

>> I don't want to they they love Schultz

28:23

at the New York Times.

28:25

>> Well, he goes on.

28:25

>> They've picked him. Yes. They picked

28:27

>> He goes there and talks to them. Yeah.

28:29

Well, he's very starv.

28:40

>> Yeah. Well, they also they just

28:41

pretended that it didn't exist.

28:43

>> Do you see Schultz talk to them though?

28:46

>> On the round table.

28:47

>> Yeah. Yeah. It was great.

28:47

>> It's hilarious because he they're

28:50

talking in these [ __ ] terms. Yeah.

28:52

And he's like, "Hold on, you know, let's

28:55

just talk real here."

28:56

>> He goes, "The Jews."

28:59

And everybody laughs cuz he can cuz he's

29:01

a comedian.

29:02

>> He's he's allowed to be funny.

29:03

>> Yeah. And there was a there was another

29:05

one that he did with another guy, I

29:07

forget from one other mainstream media

29:09

publication. It was the same sort of

29:11

situation. And to have it that way where

29:13

it's a one-on-one conversation. Then you

29:15

get to see like the weird way that they

29:18

actually think and communicate. the

29:20

bubble like Tim when Tim Dylan was on

29:21

>> the CNN one. I was gonna say that's why

29:23

I moved my ring because she kept asking

29:25

the she didn't want it. They resisted

29:27

releasing that as a long form thing

29:30

>> and you can see why cuz she's asking the

29:32

same question three or four times in a

29:33

row to try and bait something which is

29:35

not how a conversation works.

29:36

>> They pressured them into putting the

29:38

whole thing out.

29:38

>> She keeps going real come on real just

29:41

to get him cuz he's a fun guy and he

29:43

wants to say something funny and she's

29:44

like baiting him to say something

29:46

>> exaggerated.

29:47

>> Yeah. John Stewart had the best uh

29:50

response to this whole thing. He was

29:52

talking to some guy from the New Yorker

29:53

and they were talking about this podcast

29:54

and he's like, you know, they were

29:56

talking about different opinions and

29:59

people different people that I've talked

30:00

to and he's like, but Joe Rogan has the

30:03

biggest audience in the world. He has a

30:05

a bigger audience. He's like, well, go

30:06

get a big audience.

30:07

>> Yeah,

30:08

>> go get it.

30:10

>> It's not like they don't have the

30:11

finances.

30:11

>> You just go go figure it out, do it

30:14

right, and you'll get a big audience.

30:15

Like it's not that [ __ ] complicated.

30:18

I don't have pyrotechnics. There's no CGI.

30:20

There's not even a crew. There's a

30:23

skeleton crew of people who do this. But

30:25

I think I think some of it is the it's

30:27

this like ivory tower mentality of if

30:31

if it becomes like that they think there

30:34

is a there is a sense in people who have

30:36

got like a very big education and have

30:38

gone through the

30:39

>> whatever system you have to jump through

30:41

to get to an elite legacy thing is that

30:43

most people are too stupid to

30:46

>> to have like an open and honest

30:48

conversation with and that if stupid

30:50

people like you then that's a problem

30:52

that that's how they're viewing the

30:53

world and that there's Well, there's

30:55

they're also being in the world in that

30:56

they're protecting people from opinions

30:59

they don't agree with.

31:01

>> Even though they listen to those

31:02

opinions, it has no effect on their

31:04

position. They take the same position,

31:05

but they're worried that people dumber

31:08

than them. It's a very condescending

31:09

thought process.

31:10

>> They think that you're the only

31:11

open-minded person.

31:12

>> And not only that, and people that are

31:13

dumber is, which is most people, you're

31:16

you're going to fall into the trap of

31:18

what this person's saying that I don't

31:19

agree with.

31:20

>> And there's Yes.

31:21

>> Yeah. And that if you and that the only

31:23

way to get people to listen to you is to

31:24

like spin lies. Like you can't just be

31:26

honest.

31:27

>> Which is what I think the podcasting

31:28

thing is.

31:29

>> It's what it is. It's a long it's you

31:31

can't really put on a facade for 3 hours

31:33

talking to somebody.

31:34

>> Maybe you can.

31:35

>> I I think that might be who he is at

31:38

this point.

31:38

>> Yeah, he is definitely that. Well,

31:40

that's why I wanted to do a podcast with

31:41

him. So you could say three hour. By the

31:43

way, no questions beforehand, no prep,

31:46

didn't pee, sat there for three hours.

31:48

He's almost 80. Like if he was wearing a

31:50

diaper, respect. But the guy just

31:53

[ __ ] hung out for 3 hours. Does that

31:56

mean I agree with everything he does?

31:57

[ __ ] no. Of course not.

31:59

>> But he was able to be himself for three.

32:01

He was able to talk for three hours.

32:02

Whereas Kamala wouldn't do it.

32:04

>> Well, she could have she could have done

32:06

it. I'm telling you, man.

32:09

>> Six minutes on Stephen Colbert. And I

32:11

don't think

32:11

>> it's different. It's different. He's

32:13

kind of being like an interviewer,

32:15

right? He's in this weird position where

32:17

he's at a desk. The desk is beside you

32:20

for some reason because that's how they

32:21

always used to do it. So these [ __ ]

32:23

uncreative people just do it the exact

32:25

same way always. It doesn't make any

32:26

sense. Why does he have a desk? Is he

32:28

writing?

32:29

>> What does he have? Does he have pens in

32:30

the drawer? Like what are we doing here?

32:32

Like why am I on a couch over here? Why

32:34

am I sitting down like to the right of

32:36

you? It's weird. It's always in the same

32:38

position. Host is always to the right.

32:40

They're always to the left of the

32:41

screen. It's goofy, right? So he's doing

32:44

this thing that you only do on

32:46

television in front of an audience. By

32:47

the way, you should never have a

32:48

conversation in front of an audience

32:49

because as soon as you do, the people

32:51

are aware of the audience. You're aware

32:52

of how people think and feel and you're

32:54

playing to them and some people say

32:56

things to try to get a rise out of you

32:58

in front of the audience. Like,

33:00

>> yeah,

33:00

>> if you want to do that, it's a different

33:02

thing. But if you're going to have like

33:03

a really important conversation with

33:05

someone, you don't want to do it in a

33:06

[ __ ] audience. So, Stephen, the way

33:08

he's doing it is handicapped from the jump.

33:10

Also, you only have seven minutes before

33:12

you have to cut for commercial or

33:13

whatever it is. You can't do that.

33:17

It'll take me seven minutes to ask

33:19

what she likes to cook. I want to know

33:22

what she who she I don't know. I want to

33:23

know, is there anything that she

33:25

regrets doing? Is she ever what does she

33:28

learn from this time? Is it more

33:30

complicated being a vice president than

33:31

you thought it was going to be? Like

33:33

what is the web of trying to fix things

33:35

and change things versus the people that

33:37

are influencing you to make decisions?

33:39

Cuz we're not pretending that people

33:40

don't spend a lot of money to influence

33:42

your decisions. So, how much of an

33:44

effect does it have? What do you

33:46

actually believe when they come to you

33:48

asking for those favors?

33:49

>> What would it what would be better?

33:50

Could we take money out of politics?

33:52

Would you be willing? Would what would

33:54

what would we do if we completely

33:56

eliminated corporate funding of any

33:58

politicians? How would that change

34:00

everything? Those are the kind of

34:01

questions we could have like we could

34:03

have talked for hours about that.

34:04

>> But they don't she doesn't want to do

34:05

that. And the people around her this is

34:06

what I there's like there's something

34:08

that has the right used to have this as

34:10

well and both sides of politics had it.

34:12

And I remember there was like Howard

34:14

Dean, I think it was, did a weird scream

34:16

at one time and the whole thing fell

34:18

apart and that really stayed with me

34:19

that I remember watching politics and

34:21

there was some sense of like everything

34:23

is very manufactured and if you make a

34:25

single mistake, oh my god, you're going

34:27

to lose a primary, it's all over. And

34:29

Trump destroyed that with the

34:30

Republicans

34:31

>> where it all became very we've just got

34:33

to like hang out and talk and everyone

34:35

got very loosey goosey on the right and

34:37

the Democrats have not adjusted to that

34:39

and had their

34:40

>> like Bernie could do it. They just froze

34:42

Bernie out and they did everything they

34:43

could to stop him coming through,

34:45

>> right? Like Marjorie Taylor Greene, you

34:48

could not have a person like that before

34:50

Trump. That would There's no way.

34:52

There's no way.

34:54

>> I mean, you can't have her with She's

34:56

gone.

34:56

>> She's gone now.

34:57

>> She's gone. But she wouldn't have

34:59

existed without him. Like that sort of

35:01

brash crazy personality that had not

35:03

existed in a congressperson.

35:05

>> And there will be someone on the left

35:06

who can do that.

35:08

>> Jasmine Crockett, she's doing that,

35:10

>> man. Maybe

35:11

>> she gets aggressive loud and they get

35:14

crazy with each other. Listen, it's a

35:16

reality show. I know people don't like

35:18

her. I think she's hip. She would maybe

35:20

come on the show.

35:20

>> Mhm. Okay.

35:21

>> Have you invited her to come on the

35:23

show?

35:23

>> No.

35:25

>> I listen.

35:26

>> I'm too scared to have me on the show.

35:28

>> I think uh a lot of them are probably

35:31

very nice people. Very nice people

35:33

there. And this is not a an attack on

35:36

any individuals. I think that system

35:39

turns you into a sociopath. That's what

35:42

I think. And I think there's very few

35:44

people Tulsi Gabbard, my friend being

35:46

one of them. I love her. She's amazing.

35:48

She's a real person. Like that lady is

35:50

the same person on air, off air, meeting

35:53

people, hanging out with her husband.

35:55

I've hung out with her hours and hours

35:57

and hours. That's who she is. She's cool

35:59

as [ __ ] And she was a congressperson,

36:02

but she has horror stories. Yeah. when

36:05

she tells you like what it's like on the

36:07

inside and when she when you find out

36:10

how these people are making hundreds of

36:12

millions of dollars on a $170,000 a year

36:15

salary and no one's batting an eye. That

36:18

is kind of kooky. That's it's kind of

36:20

kooky cuz even ones you wouldn't suspect

36:22

like wait a minute they're worth how

36:24

much now you don't really know how much

36:27

they're worth, right? You you're you'd

36:29

have to you'd have to get an audit,

36:31

right? because what you're hearing is a

36:33

reporting of what they're worth. And it

36:35

could be total propaganda. It could be

36:37

half of what it is. But even if it's

36:39

millions, even if it's a couple million,

36:42

if you have if you've been a Congress

36:44

person for two years and now all of a

36:46

sudden you're worth $3 million and you

36:48

were in debt before you became a

36:50

Congress person, that's suspicious. And

36:53

if you look at the [ __ ] the people

36:55

that invest money, that's where it gets

36:57

really crazy because it is not a blue

37:00

thing and it's not a red thing. It's

37:02

both. Everybody is making money on the

37:05

stock market. There's a shitload of

37:06

these people that are buying a bunch of

37:10

stock and then conveniently

37:13

a short time later a bill gets passed

37:14

that they were working on that makes it

37:16

very profitable for that company. Stock

37:19

shoots through the roof. They make a

37:20

giant windfall. I'm trying to remember

37:22

who said it. There was some line that

37:25

someone said about like you can you can

37:27

sort of believe what you want in

37:28

American politics and you'll get rich

37:30

for it. Like no matter what you actually

37:32

believe, there's a group out there who

37:33

are going to get you rich for having a

37:36

belief in it.

37:37

>> Sure. If it's the environmental people,

37:38

if it's the fossil fuel people,

37:40

>> right?

37:41

>> I mean, there would be varying scales of

37:42

it, but also you can fix this. Like

37:44

there are ways to

37:45

>> to fix the money and politics.

37:47

>> I've been reading a lot about Lee Kuan Yew.

37:49

>> Who's that? He was the sort of the

37:52

dictator of Singapore. They might not

37:53

like that. Um because he won

37:56

>> Don't go there.

37:56

>> He won he won elections, but Singapore

37:58

is like a single party state.

38:00

>> Oh, so it's like when Putin wins.

38:03

>> I don't want to get in trouble with the

38:04

people of Singapore.

38:05

>> Listen, just

38:06

>> but it is notable that one party wins

38:08

every single time and they don't primary

38:10

and they win almost all the seats

38:12

>> and they are really popular. But he

38:14

brought in like canings and he got drugs

38:16

out of the country and he started paying

38:17

the politicians a lot.

38:19

>> Like if you're a politician in

38:20

Singapore, you get a huge salary, but

38:22

you are not to ever be corrupt. Like

38:26

you're meant to have enough money that

38:27

they can't really buy you

38:29

>> and that might be the only way cuz if

38:32

you have, you know, what are you what

38:33

are they earning? $170,000 something

38:35

dollars a year

38:36

>> to be a congressperson. If they are

38:38

making $3 million a year and the

38:40

punishment for taking money from anybody

38:42

else or from getting a stock, you know,

38:43

maybe you can't own stocks, but we give

38:45

you $3 million a year,

38:46

>> right?

38:47

>> Then at least you can't be swayed. Like

38:50

you're taking a lot of tax money to do

38:51

the job, but at least there's some

38:53

insulation on someone being able to go,

38:55

I want you to vote this way.

38:56

>> I think if you have a totalitarian

38:58

dictatorship, you could probably pull

39:00

that off because if the politician is

39:01

bad, you could shoot him. Yes. The

39:03

problem in America, if you have $3

39:05

million and you know a guy who's got $50

39:07

million, you feel poor because we're

39:09

[ __ ] All right. Bryan Callen has a

39:12

friend who's worth I think he's worth $8

39:14

billion and uh he feels broke cuz his

39:18

friend is worth 30.

39:20

>> I don't

39:21

>> No, no, no. For real. Yeah. There's

39:22

people that get that goofy.

39:24

>> I've met I've seen it a couple times.

39:26

So, if you're in the business of trying

39:27

to make money, which is what most

39:28

politicians are, that's like they

39:30

decided not to go into sales, they go

39:32

into politics, they're trying to make as

39:34

much money as they can while they're

39:35

there, right? That's what most people

39:36

are doing with most jobs. If you're

39:38

doing that and you're you're just kind

39:40

of a a person who's drawn to that kind

39:43

of a job, you're not going to be happy

39:45

with your salary. If you find out that

39:47

there's some NGO that you can invest in

39:50

and you can start a nonprofit and then

39:52

it becomes a profit and you can funnel

39:55

more money overseas and then corporations

39:58

that you buy into also can use the you

40:03

know the laws that you're passing.

40:05

>> You're going to do it anyway. They're

40:06

going to do it anyway.

40:07

>> This is why Plato says

40:09

I cannot be corrupted. You'd

40:12

have to kill them. If you if you catch

40:14

them corrupt, you got to shoot them in

40:15

in front of everybody. You're going to

40:17

say, "This is what happens when you

40:18

steal from America." Boom. I'm not

40:20

saying you should do that, but I'm

40:21

saying that's the only way you're going

40:22

to stop it. It would have to be a

40:24

totalitarian dictatorship. But then it

40:25

brings us back to the thing about using

40:27

the military in the cities. When do you

40:28

draw the line?

40:29

>> Yeah.

40:30

>> When do you draw the line? Like when

40:31

like, okay, what's hate speech? Right.

40:34

So hate speech can mean a bunch of

40:36

different things to different people. So

40:37

as soon as you say we can't permit hate

40:40

speech, okay, well then you can't can't

40:41

permit freedom of speech because you're

40:43

just defining hate by whatever, that's

40:46

the same line when you bring the

40:48

military into those cities. It's the

40:50

same line. It's like you're doing

40:52

something you shouldn't be able to do

40:54

and you're justifying doing it, saying

40:56

because this is a special case, but the

40:58

problem is what if that gets solved?

41:01

You're going to move further to the even

41:03

more red. You've you've already got me

41:05

to allow you to arrest you can arrest me

41:08

for tweeting things. Okay. I've already

41:10

said yes to that. So, what else is next?

41:12

Like what? Well, you're going to you're

41:13

going to keep going. If you make money,

41:15

you want to going to want to make more

41:17

money. If you pass laws, you want to

41:19

want to pass more laws. That's how you

41:21

get numbers on the board. That's how you

41:23

win this [ __ ] game. You can't let

41:24

them ever score.

41:25

>> Then you have to de-game the system long

41:28

term. If you're going to have a

41:29

democracy, you have to have

41:30

>> Yeah, you got to de-game the system. But

41:32

the problem is there's so much profit in

41:34

it and they get to vote on whether or

41:37

not they can still do this insider

41:39

trading thing, right? Which is bananas.

41:42

Like who thinks we should still steal?

41:44

Oh, can we have an anonymous vote?

41:49

>> You don't have this problem with an

41:50

aristocracy. That's all I'm saying. If

41:52

you finally go back to the powdered wigs

41:54

and

41:54

>> there's a real there's a terrible

41:56

argument for that because you're just

41:58

hoping that the person is a benevolent

42:00

dictator. That's the best case scenario.

42:03

You get a benevolent king. But how how

42:05

many of those have ever existed?

42:07

>> We've had so many beautiful benevolent

42:09

kings. We've got a benevolent king right

42:12

now in my country.

42:14

>> It's um it's strange, right? It's like

42:17

there's no right way to run people

42:20

because no one really should be one.

42:23

There's never a time where it makes

42:25

sense where one person is the head dude

42:28

of 350 million people. That is nuts.

42:32

That is completely nuts.

42:34

>> Yeah. But you also I mean as a country

42:36

you have a great tolerance I think

42:38

compared to other like western

42:39

democracies for letting there be some

42:42

chaos.

42:43

>> Yeah. Because we're have we have guns.

42:45

That's part of it. I think this is a

42:47

heavily armed country.

42:48

>> Tolerating chaos allows you to have the

42:49

gun style. Like if you didn't have the

42:51

virtue of going some people are going to

42:53

get shot and we're going to be okay with

42:55

that.

42:55

>> Well, it's not just that. It's like, you

42:58

know, it was written into the

42:59

Constitution because we were rebelling,

43:01

right? We were we were rebelling from a

43:03

dictatorship. We had escaped. And when

43:05

we had declared that this was a country,

43:08

we were like, we got to stay strapped

43:10

because these [ __ ] might come

43:12

back. And we all agreed to that.

43:14

>> Yeah. And then it got to a point where

43:15

people go, "Okay, but they were talking

43:17

about muskets. Now people have AR-15s.

43:20

Now people have, you know, switches they

43:22

could put on Glocks and it can fire

43:24

automatic." Like

43:25

>> is a tactical nuclear weapon defended

43:28

under the Second Amendment.

43:29

>> You want to hear the scariest thing that

43:30

I heard.

43:31

>> This this was a guy that was talking

43:32

about uh the UAP program and the back

43:35

engineering of U flying saucers.

43:37

>> Yeah.

43:38

>> They what do they call it? A

43:39

simultaneous or a spontaneous what was

43:42

the word that he used for it?

43:43

instantaneous

43:44

>> instantaneous

43:46

>> that these UFOs that they believe use

43:49

some sort of a gravity

43:51

some sort of a propulsion system that's

43:53

unknown to modern science standard

43:56

conventional science and they can

43:58

transport literally transport like going

44:01

from place to place in space

44:03

instantaneously and so what did the

44:05

United States government try to do they

44:07

tried to use it as a method of

44:09

delivering a nuclear bomb so an

44:11

instantaneous nuclear your payload

44:14

delivery system. That's what they were

44:15

calling flying saucers. The the first

44:18

thing they thought about doing with them

44:20

was instantaneously deliver a nuke so no

44:23

one could retaliate and they didn't even

44:24

see it coming.

44:26

>> You would just have a flying saucer with

44:29

a nuke appear at the Kremlin.

44:33

What's weird though, you guys had that

44:35

capability for years

44:37

>> allegedly. No, I mean I mean when I mean

44:40

when no one else had the nuclear bomb

44:41

and when we didn't have good anti-air

44:43

programs

44:44

>> and just America alone had nuclear

44:46

weapons.

44:47

>> Yeah.

44:48

>> You could have at that point

44:50

>> you could have said we're in charge of

44:51

the world now or everyone's dead.

44:52

>> Well, there was a bunch of people that

44:53

did. I mean that's what Dr. Strangelove

44:55

is all about, right?

44:55

>> You made movies about it and you talked

44:57

about it, but you didn't do it. When the

44:58

Suez crisis kicked off, I think Eisenhower

45:00

was like, "Can we get a nuke in there?"

45:01

And people said, "No, Mr. President."

45:03

>> Bro, they came real close to nuking

45:05

things three or four times. What a

45:06

beautiful thing that you

45:08

>> you held back.

45:09

>> Yes.

45:10

>> No one else would have. I talk about

45:11

this. I think about this a lot that like

45:14

if anyone else had discovered the

45:15

nuclear weapon, that's it. You'd have

45:17

global hegemony by one power.

45:19

>> Well, I think that is one thing about

45:20

America that

45:22

most people will agree to is that we

45:25

like to think of ourselves as being the

45:28

best country in America. And that comes

45:30

with responsibility. Being the greatest

45:32

superpower comes with responsibility.

45:34

That's why people get real uncomfortable

45:36

about like drone bombing statistics and

45:39

[ __ ] like that. They get real

45:40

uncomfortable because it makes you like

45:42

really question like what what we do.

45:43

>> Yeah.

45:44

>> When you you know when you tell people,

45:46

did you know that like more than 80% of

45:48

the people that die in drone bombings

45:49

are civilians?

45:51

Accidental kills.

45:52

>> Also every time someone tries to be nice

45:54

about Obama, then they have to go to the

45:58

drone bombings that bombed a lot of

45:58

innocent people.

45:58

>> I know they always have to do that. You

46:00

know, listen, I think we found out

46:02

through Obama, most likely what you find

46:04

out through anybody that gets through

46:05

there that's not Trump is that they

46:07

immediately co-opt you into the system.

46:10

You had no idea how the system worked

46:11

until you got in there. You were a

46:12

senator for two years and then all a

46:14

sudden you're a president. You had some

46:15

amazing ideas and you're a great

46:17

spokesperson and probably the best

46:19

statesman we've ever had. like the best

46:21

representative of the best about

46:23

America. A guy who is from a single mom,

46:26

you know, grew up poor, didn't, you

46:28

know, didn't have a silver spoon in his

46:30

mouth. Forget about all the narratives

46:31

of him being related somehow to the

46:34

Bushes. There's a lot of that.

46:35

>> I didn't know that.

46:36

>> Oh, there's like a whole conspiracy

46:37

theory. But point is that what you got

46:40

is a guy who was promoting hope and

46:43

change, right? And that's what we were

46:45

all really hoping was going to happen,

46:46

but not. It was really kind of like

46:48

another Bush term in terms of like

46:50

foreign policy, in terms of a lot of

46:52

things. In terms of like the way America

46:54

felt about America, though, it was good.

46:56

It was like, hey, racism has obviously

46:59

like stopped being an issue to get you

47:01

to be the president of the United States

47:03

because a a black man just won. And it's

47:06

not saying that racism doesn't exist,

47:07

but we're doing better than we used to

47:09

do. This was not possible when Martin

47:11

Luther King Jr. was making his I have a

47:12

dream speech, but it is possible now.

47:14

So, we have progressed and he's

47:16

brilliant. So, it's per and he's and

47:18

he's like well measured and calm and

47:21

peaceful and he never calls reporters

47:24

piggy.

47:25

>> He never he never makes mean tweets when

47:28

his enemies die, you know, like so as a

47:30

representative. I'm not I it's gotten to

47:33

the point where the Rob Reiner tweet just

47:34

went over it just like

47:35

>> it killed it for a lot of people. Yeah.

47:37

>> Is that it? But like No, I mean I saw it

47:39

and I was like, "Oh, yeah, of course

47:40

he's mocking a dead man." Well, that guy

47:43

tried to jail him for, you know, years

47:45

and this is not forgiving him for that.

47:47

This is not tried to jail.

47:49

>> Oh my god. There's a video of him

47:51

working with intelligence uh agents. He

47:53

was working with James Clapper and uh

47:55

who's the other guy? Clapper and

48:00

why how come I can't remember that?

48:04

>> I just I I still think it's a good

48:06

policy.

48:08

Oh, 100%.

48:09

>> Just it was with McCain as well. I

48:10

remember that they hated each other.

48:12

>> I know 100%. It's gross. It's a gross

48:15

thing to mock a man after he's dead.

48:17

It's just pointless. And but the the

48:19

real problem is it's a bad look for

48:21

America in general, right? It's a it's a

48:25

mark of cruelty that ultimately could

48:27

lead people to think differently about

48:29

America and perhaps motivate attacks.

48:32

That's a real thing. Like a kooky

48:33

person, you can sway them either way by

48:37

the vibe the country is giving off. And

48:39

the president is giving off a vibe that,

48:41

you know, his enemy, he's mocking the

48:43

fact that his,

48:44

>> you know, his enemy was obsessed with

48:46

him

48:47

>> and that's what led to his son going

48:50

crazy and killing him.

48:50

>> I've had friends come over and visit me

48:52

and almost all of them have been scared

48:54

to come. Like people who haven't been to

48:55

America before.

48:56

>> They're scared to come to America.

48:57

>> People are very scared to come to

48:58

America.

48:58

>> Yeah.

49:01

>> Well, this is like not Honduras. This is

49:02

just Australian like like there's gun

49:04

violence. It looks if you just if all

49:06

you're seeing is the the news, you go,

49:08

"Well, Civil War's

49:10

>> right around the corner."

49:11

>> Well, they want us to see and it's like

49:12

>> that's what they want to see.

49:13

>> People are way more interested in

49:14

college football than killing each other

49:16

in the street.

49:16

>> Especially in Texas, they're what

49:18

they're way most people are way more

49:20

interested in living their lives. The

49:21

problem is when your life becomes that.

49:24

The problem is when your life becomes a

49:25

cause. When your life, whether it's a

49:27

religious cause, you know, a jihadist

49:29

cause, a right-wing cause, a left-wing

49:32

cause, your life becomes a [ __ ]

49:33

cause. you know, we have to stop oil now

49:37

and you're gluing your [ __ ] hand to a

49:39

painting. You know, there's a lot of

49:41

nutty, stupid [ __ ] that goes on with

49:43

just being a human being

49:44

>> and it's all accelerated by social

49:46

media.

49:46

>> But I find it heartening that people

49:48

give a [ __ ] here.

49:49

>> That people know on some level, maybe

49:51

they don't have like a good grasp of

49:53

what's actually happening in the world.

49:55

>> But there's a sense in America that

49:56

people kind of know who their

49:57

politicians are. They're across what the

50:00

issues that they're being asked to vote

50:01

on are. And this like in Australia, the

50:05

extent to which people have no idea what

50:07

is going on and are so checked out and

50:09

don't know any of it and are not like

50:10

actively participating in democracy, you

50:13

guys really care. Like people primary

50:16

and they scrutinize people and they

50:18

there's some belief that you can still

50:19

get involved in politics here. I really

50:21

It's like the most heartening thing

50:23

about it is that and that's the that the

50:25

downside is if everybody cares then you

50:27

do get

50:28

>> you get people going off the deep end.

50:29

Well, you just got to keep it a fair

50:31

game. And as long as you keep it a fair

50:34

game, if you don't do a good job and

50:36

that person gets into power, you [ __ ]

50:38

up. So now your team has to regroup and

50:41

rebuild and come back again in four

50:42

years. And that's what it's supposed to

50:44

be. But when you start trying to do

50:46

things like moving all the illegals to

50:48

specific states so that you get more

50:50

congressional seats because of the

50:52

census and then you start giving them

50:54

social security numbers and Medicaid and

50:56

Medicare and you start rigging the

50:58

system because you want to like bring in

51:00

more voters and you're spending and this

51:02

is what they did. This is undeniable at

51:05

this point.

51:05

>> Fetterman copped to it. He was like, yeah, I

51:07

saw him on your

51:08

>> un it's undeniable what they did and I

51:11

get it like you're playing a dirty game.

51:13

They're playing a dirty game. And this

51:14

is not a right or left thing. I remember

51:16

that hacking democracy um documentary

51:19

that was on HBO back in the day. It was

51:21

during the Bush administration. And this

51:23

hacking democracy, they had tested these

51:26

voting machines and and this is a long

51:28

time ago, right? So this is like what

51:30

was it like 2004, Jamie? What was that?

51:34

>> Somewhere around then. So this this was

51:37

a much less sophisticated system that

51:39

I'm sure that they're using today. But

51:41

there was a third party input. For some

51:43

reason, it had been set up so a third

51:46

party can input data into the machine

51:49

and change the votes. And they did it on

51:51

TV. They did it on TV. They showed that

51:54

they could do it easily and they

51:57

affected the votes. So they showed back

51:58

then they were essentially saying that

52:01

the Bush administration had rigged the

52:02

vote and that's how they got Bush into

52:04

office. And this company that made these

52:06

machines was a big contributor to the

52:08

Republican party. So this [ __ ] has been

52:10

going on on both sides.

52:11

>> That was true. I mean in 2000 that was

52:12

true. Everybody thinks the JFK election

52:15

>> the film investigates Oh, for sure the

52:17

JFK election. The flawed integrity of

52:19

the electronic voting machines,

52:22

particularly those made by Diebold

52:23

election systems, exposing previously

52:25

unknown backdoors in the Diebold trade

52:27

secret computer software. The film

52:29

culminates dramatically in the on camera

52:32

hacking of the in-use working Diebold

52:34

election system in Leon County, Florida.

52:37

The same computer voting system which

52:38

has been used in actual American

52:40

elections across 33 states and which

52:42

still counts tens of millions of

52:44

American votes today. Whoa. Today

52:47

>> is that real?

52:49

>> The same [ __ ] machines

52:50

>> when it was written. I don't know

52:51

>> when did it when did this article come

52:52

out?

52:53

>> This is uh Wikipedia. I don't know.

52:54

>> It's usually up there.

52:55

>> Bro, that's crazy. If they're still

52:57

using the same machines, that's crazy.

52:58

>> But of like

52:59

>> But that was a thing during Georgia,

53:01

right? They were supposed to upgrade

53:02

their machines, but they decided to wait

53:03

until after the election to do it. Why

53:05

is there no pressure to make the

53:07

elections feel more real?

53:08

>> I think because they're both rigging it,

53:11

>> right? But if they're both rigging it,

53:13

then if neither of them was rigging it,

53:14

>> they just want to win, man. And then

53:16

call everybody conspiracy theorists.

53:17

Both sides, by the way, this is not one

53:19

side or the other. I think both sides

53:21

are trying to do whatever the [ __ ] they

53:22

can.

53:23

>> I don't think both sides rigging it.

53:24

>> Okay. It's not been used in business in

53:27

the US since 2009. Well, this is about

53:29

the Bush administration, the Diebold

53:31

things. And what you're hearing about

53:33

mail-in ballots, that's about the left.

53:35

It's like you're you're getting the same

53:37

thing on both sides. One of the things

53:38

that Rep. Luna said when she was on the

53:40

podcast, I thought was fascinating,

53:42

she's like, "There's certain problems

53:43

that they don't want to fix because they

53:44

can campaign finance against it.

53:46

>> They can like get people to donate money

53:48

against it."

53:49

>> Okay?

53:49

>> You know, like they could run on that

53:51

platform. We're going to fix this. Like

53:53

they don't want to fix it because that's

53:54

how they get money,

53:55

>> right? Like if you're a homelessness

53:58

organization, you actually need the

53:59

homeless so you can keep existing. Not

54:01

only that, it's even worse. They're

54:02

incentivized to have more homeless.

54:04

Yeah. They get paid per homeless.

54:08

>> It so if they have more homeless people,

54:10

they can say, "Hey, we need a bigger

54:11

budget. We have more homeless people."

54:12

>> I remember when we had the unemployed in

54:14

Australia.

54:19

But you got more money for finding

54:20

someone a job if they've been unemployed

54:22

for a longer period of time. So it's

54:24

like don't try too hard to find them a

54:26

job for the first two years. All right.

54:27

Two years in, then get them a job.

54:29

>> Yeah. growing some plants. You don't

54:31

want to pick it so early.

54:32

>> Yeah, it's not.

54:34

>> I don't think the answer is just a a a

54:37

good king who solves everybody's

54:39

problems, but I really do.

54:40

>> You'd be a good king. Watch. Go over to

54:42

Australia and be king of Australia.

54:45

>> We've got enough problems.

54:46

>> You can fix it.

54:47

>> Uh

54:49

I've talked about getting our own king

54:51

many. I did a show about it once. I

54:52

really I think Aboriginal king would be

54:54

>> Well, everybody bring the country

54:55

together.

54:56

>> Yeah, for sure. that'll work for

54:58

everybody wants like the perfect system

55:02

and it's not going to ever exist and I

55:04

don't think it ever will because I think

55:05

there's always going to be no matter

55:07

what happens no matter who's in charge

55:09

and no matter who's doing this there's

55:10

always going to be people that oppose no

55:12

matter what naturally oppose even if

55:15

illogically there's it's never going to

55:17

be perfect but you got to make it the

55:19

most fair it's got to be fair and as

55:22

soon as you catch someone rigging the

55:24

system you got to that has to be a alarm

55:27

bells that go off for everybody on every

55:29

side. It shouldn't If you find out that

55:31

there was mail-in ballots that were

55:34

illegal and that were fake and they were

55:35

brought in so that the Republicans can

55:37

win some sort of a primary. If you found

55:39

out that was true and you're a

55:40

Republican, you're supposed to be upset.

55:43

>> Yeah.

55:43

>> Like this is you're someone is cheating

55:46

this incredible system that we have and

55:48

you you're not going to have the will of

55:50

the people. This episode is brought to

55:52

you by the farmer's dog. Here's a fun

55:55

fact. Research shows that dogs who

55:58

maintain a healthy weight can live up to

56:00

2 and 1/2 years longer on average than

56:03

dogs who are overweight.

56:06

Isn't that wild and also kind of obvious

56:08

at the same time? So why is feeding

56:10

vague scoops of ultra-processed kibble

56:13

still the status quo for most dog

56:15

owners? Healthy alternatives exist, ones

56:19

that are all about investing in the

56:21

long-term health of your dog. Like

56:24

Farmer's Dog, The Farmer's Dog makes

56:27

fresh food for dogs. The recipes are

56:30

made with real meat and fresh vegetables

56:32

that are gently cooked to retain vital

56:35

nutrients. They also portion out the

56:37

meals to your dog's nutritional needs,

56:40

which helps avoid overfeeding and makes

56:42

weight management easier. And isn't

56:45

getting more time with our four-legged

56:47

best friends something that every dog

56:49

owner wants? The answer to that is yes,

56:52

obviously. So, try The Farmer's Dog

56:55

today and get 50% off your first box of

56:58

fresh, healthy food. Plus, get free

57:02

shipping. Just go to the

57:04

farmersdog.com/rogan.

57:07

This offer is only for new customers.

57:09

>> You got to make it seem fair enough so

57:11

that there's not a violent uprising.

57:13

It's got to be

57:13

>> like just for having a future of the

57:15

country.

57:15

>> It's got to be the will of the people.

57:17

It's got

57:17

>> January 7th thing.

57:18

>> Yeah,

57:19

>> that was people going No, but that was

57:20

that was those people that was a lot.

57:21

There were some people there who were

57:23

definitely feds trying to bring them in

57:24

the building.

57:25

>> Dude, I I wonder how many were feds

57:27

before that. Here's the question.

57:30

There's a bunch of people that were feds

57:31

at the scene. They finally had to admit

57:33

that. We were talking about that for the

57:35

Capitol.

57:36

>> Yeah,

57:36

>> that man's Have you seen that guy?

57:38

That's crazy.

57:39

>> It's crazy. There's a bunch of people

57:40

that called people to go into the

57:42

Capitol to break in and a bunch of them

57:45

probably were feds. But how many feds

57:48

were on these chat groups? How many feds

57:52

were on message boards? How many feds

57:54

were instigating people to do things and

57:58

talking about things that aren't true or

57:59

saying things that they're pl how many

58:01

how many feds were trying to get the

58:03

kookiest of the kookie riled up?

58:06

>> Yeah. But then also like why why is the

58:08

blame not on why do the Democrats not go

58:11

we've contributed to making a system

58:12

that even if this is a totally

58:14

legitimate like group of people who

58:16

really believe what they're doing by

58:17

storming the Capitol

58:18

>> we've contributed to building a system

58:20

that looks really fake to a lot of

58:22

people

58:23

>> where we could take really easy steps to

58:24

make it look less fake like you could

58:26

have

58:27

>> I don't understand why voter ID isn't

58:30

everywhere and they go well not everyone

58:31

has an ID well it's racist

58:33

>> give them one

58:33

>> it's racist what you're saying is racist

58:36

How hard could it be to go

58:38

>> check your white privilege? You are a

58:40

straight white male. Why don't you just

58:42

shut the [ __ ] up?

58:43

>> All the other races can have a

58:45

photograph taken of themselves just as

58:47

easily on a little laminated card.

58:49

>> All those other races just a few years

58:50

ago needed proof of vaccination. So this

58:53

is kooky. It's completely kooky.

58:54

>> It would be I but then nothing has been

58:57

done now to actually bring it in.

58:58

>> It's illegal to show your ID in

59:00

California.

59:00

>> What?

59:01

>> Where? Where?

59:02

>> California.

59:02

>> In the state. You cannot show your ID when

59:04

you vote.

59:06

>> Even if you want to,

59:07

>> you can't show it.

59:08

>> You can't wear it on a lanyard around

59:09

your neck.

59:10

>> They'll fire you. They'll kick you out

59:11

of there.

59:13

>> You can't vote now, sir. I don't know

59:15

what they would do if you came in with a

59:16

lanyard. That might be the move. But the

59:18

point is, they made it easier to cheat

59:20

on purpose. Like, that's the only reason

59:22

why you would do that. And to say like

59:24

it's racist to require ID. How do I know

59:27

who you are? I don't know you. There's a

59:29

million people in this [ __ ] town. And

59:31

this is like one polling station is a

59:33

line around the block. I don't know you.

59:36

I need your ID. This is crazy.

59:38

>> There was a clip from the Obama election

59:40

that I remember watching where they were

59:41

talking to a guy who was like they asked

59:43

him, "Have you ever voted before?" He

59:44

said, "No." "Did you vote?" He goes,

59:45

"Yeah, it felt so good. I went back and

59:47

did it again and then they cut off to

59:49

somebody else." I've always remembered

59:50

that that felt

59:52

>> Yeah. If you don't have ID, you could

59:54

just change your clothes and go back in.

59:55

especially if you're a nondescript.

59:58

You know,

59:59

>> I don't have an anti-Gavin Newsom bent,

60:01

but I don't understand why he's the guy

60:03

the Dems are pushing

60:05

>> because

60:05

>> he's from a state that everybody agrees

60:07

is in a huge disrepair.

60:09

>> He doesn't agree that he thinks it's

60:10

killing it.

60:11

>> They can't build a train.

60:12

>> No, no, it's great.

60:13

>> They've wasted billions of dollars

60:14

trying to get a reasonably short

60:16

distance covered with a train and he

60:18

can't do it.

60:19

>> They're going to get it worked out. He's

60:20

going to be president and then he's

60:21

going to fix it all. The problem is

60:22

Trump. The reason why it's Trump, Trump

60:25

is the real reason why California's

60:27

failed is Trump. Once he gets into

60:28

office, Trump will be out and he'll fix

60:30

the whole country. And see, guys, you

60:31

got to trust me on the long plan.

60:36

People will buy into it. The reason why

60:38

is because there's no one else. This is

60:40

the reason.

60:40

>> There must be so many force so many

60:43

people that are rational out. So many

60:45

people that aren't corrupt, they force

60:46

them out. And then other people don't

60:48

want their laundry dug up. They don't

60:50

want fake stories told about them. They

60:52

don't want ex-girlfriends to get paid

60:54

off to come up with crackpot theories of

60:56

them being a satanic person or whatever,

60:59

drug addict, abusive.

61:01

>> All right, this only people who

61:04

>> left a dead bear in the park.

61:05

>> You should get like Bill Cosby as the

61:07

candidate or people of Bill Cosby level

61:10

stature. This is my new idea.

61:12

>> Okay.

61:12

>> Okay.

61:13

>> Let me hear it.

61:14

>> Just someone who is so you there's

61:16

nothing to blackmail them with. People

61:18

already think this is one of the worst

61:19

people for president,

61:21

>> right? You can't Everyone knows he had a

61:23

dungeon with a lady in it. Okay, you

61:26

can't blackmail R. Kelly at this point.

61:28

So, whatever R. Kelly says he wants to

61:29

do,

61:30

>> he probably wants to do that. His

61:32

reputation can't get any lower, right?

61:33

>> If you only put forward

61:35

>> people who have done terrible things.

61:37

>> If Epstein was still alive, you could

61:39

have him because what are you going to

61:40

blackmail him with? He was getting he

61:41

was doing all sorts of terrible things.

61:42

Well, you would like to have a very good

61:44

person who just hasn't done terrible

61:46

things cuz you're just a very good

61:48

person.

61:48

>> You can just lie about them. The only

61:50

security against being blackmailed even

61:52

about a lie

61:52

>> is to be a total piece of [ __ ]

61:53

>> is to be the worst man in the country.

61:56

>> Right. Yeah.

61:58

>> No one likes my idea.

61:59

>> It's a good idea for now. I think what

62:01

we're going to really be able to know

62:03

within the next few years is whether or

62:05

not you're telling the truth. I think

62:08

with uh wearable electronics, I think

62:11

ultimately they're trying to do

62:12

something that allows you to communicate

62:14

head-to-head. Have you seen that stuff

62:16

where they do it?

62:17

>> I'm not getting it. I don't

62:18

>> Have you seen Well, the what they have

62:19

right now is a wearable. These guys put

62:21

it on, they think something, and then

62:23

the other person hears it.

62:25

>> This is one of the worst things I've

62:26

ever heard.

62:27

>> Oh, you have to you have to see it. It's

62:30

crazy when you watch them actually do

62:31

it. So, right now it's attached to an

62:34

actual computer behind them, but that's

62:35

for now. Eventually, it's going to be

62:37

wearable. Just like everything, it gets

62:39

smaller. I mean, this is bigger than

62:40

>> You're so much more relaxed with the AI

62:42

stuff and the technology than I am.

62:44

>> You can't I'm fighting it.

62:45

>> If you see the asteroid coming, you have

62:47

to realize you're going to die. Like,

62:49

there's nothing you can do about it.

62:50

>> The Amish have continued very happily.

62:53

>> I don't think it's going to be as

62:56

disastrous as everybody thinks. I just

62:58

don't believe that. I think we'll figure

63:00

it out, but I think it's going to be a

63:02

massive upheaval of our total a

63:06

completely our economic system, our life

63:08

system, the way we interact. But we have

63:11

to realize, this is what's really

63:13

important. The way we interact is really

63:16

new. the way we live in cities stacked

63:19

in highrises and driving around in cars.

63:22

This is a tiny little blip in time that

63:27

the human race has existed like this.

63:30

Before that, we had a totally different

63:31

thing. And for the longest time, people

63:34

traded things back and forth and they

63:36

they used gold coins and silver coins

63:38

and and there was no stock market. Like

63:41

this whole thing that we're doing right

63:43

now with automation and you're worried

63:45

about it's taking jobs. Those jobs

63:46

weren't even a thing in the past. Yeah.

63:49

We built this giant population based on

63:51

the fact that jobs would exist. We gave

63:53

people the confidence to procreate, get

63:55

married, and have kids and and this

63:57

we'll find another way. We'll have to

64:01

people will have to. It'll it's just

64:03

it's not going to be pretty. But it's

64:05

just like everything else that happens.

64:06

It's this massive change in society and

64:09

culture. We're gonna have to adapt.

64:11

>> I'm in I'm in flight mode on it.

64:14

>> I want to be on an acreage. You know,

64:16

>> you get nervous when I play AI music in

64:18

the green room.

64:18

>> When I go, "This is good." And you go

64:21

and I go, "Yeah."

64:22

>> Yeah. You love that. That country one I

64:24

played the other day. That was good,

64:26

right?

64:26

>> 50 Cent stuff is fantastic. My favorite

64:28

remains the Japanese cover of Oasis.

64:32

>> Uh, have you heard Japanese Oasis?

64:33

>> No, I have not. If you type in Japanese

64:36

Wonder Wall, it is

64:37

>> Oh, I I like it a lot.

64:39

>> Can we play it?

64:40

>> Can we play it, Jamie?

64:42

>> Or would it be an issue? We got to cut

64:43

it out.

64:44

>> We'd have to cut it out.

64:45

>> I don't think anyone owns the rights to

64:46

Japanese.

64:47

>> They might. Somebody probably does.

64:49

>> Whoever wrote the song Wonder Wall, they do.

64:51

>> Really?

64:52

>> That's how that works.

64:54

>> The performance of this would be a

64:56

different situation. But

64:58

>> I can do it now. I can do it. Bella,

65:00

you're getting a lot of trouble.

65:01

>> Let's hear it. Wonder Wall Oasis cover

65:05

Japanese Enka is the uh title on

65:08

YouTube.

65:08

>> This is the right one. I'm hoping

65:10

>> it says new wave films is the uh page.

65:15

>> Oh, you have a problem. Stop this. Stop

65:17

this. Stop this. Stop this. You're a

65:19

sick man, James.

65:20

>> Why do you like that?

65:22

>> Why do you like that?

65:23

>> Cuz it's the the funniest voice of all

65:25

time.

65:26

>> What's weird is it's not a real person

65:27

and it looks like an old video.

65:28

>> So they they've cut up an old video and

65:30

put it over the AI music.

65:31

>> Oh, that's what they did. If you look

65:32

very closely, you can find the original

65:34

music and she's singing some beautiful

65:36

folk song about a sad

65:37

>> I thought it was like AI generated video

65:39

cuz you could do that, you know.

65:40

>> I just I want to retreat from it. I want

65:41

to be on a farm. I want to have the

65:43

chicken.

65:44

>> I know. But this is also not like a

65:46

serious way to build a society. I'm

65:48

shocked that no one's blowing up the

65:49

servers. Like when they invented the

65:51

loom, people in Britain were like, "We

65:54

will destroy all of the looms." No one

65:56

is like upset now that robots can think.

66:01

Well, they don't know what to do, right?

66:03

And it feels inevitable because it is.

66:06

No one's going to stop it. And if they

66:08

did stop it, no one would listen. And if

66:09

we did listen, the problem is China's

66:11

not going to listen.

66:12

>> And it's a Manhattan Project kind of

66:15

race.

66:15

>> Yes. But then you go, okay, we've got to

66:17

get the nuclear bomb first. But how does

66:18

that pan out in the end? Everybody has

66:20

the nuclear bomb.

66:21

>> But here's the thing. You have to have

66:23

one. Like if AI exists and they can take

66:28

over your financial system, they could

66:29

like you're going to have to have AI

66:31

that combats AI and your AI better be

66:33

better than a their AI.

66:35

>> I like

66:36

>> and you have to have everything

66:37

protected against AI.

66:38

>> I want to lose in a fabulous way that

66:40

inspires people like a martyr.

66:43

>> That's what you want to do.

66:44

>> That's what

66:44

>> That's why you should be the king of

66:45

Australia.

66:46

>> No, I mean

66:47

>> that's like your that should be your

66:48

speech.

66:49

>> You lo Yeah, we're going to lose. We're

66:51

going to lose and people are going to be

66:53

so they're going to respect how we lose.

66:55

This is the Christian message

66:57

>> of getting defeated and that's the

66:58

ultimate victory.

67:00

>> I think it's coming, dude, whether you

67:02

like it or not. And it's it's better if

67:04

we have it than if we don't. If you're

67:05

Papua New Guinea and the uh AI overlords

67:09

come storming into your town, you have

67:11

no say.

67:12

>> It's over.

67:13

>> I don't know. We've tried to have a say

67:14

over Papua New Guinea a couple times.

67:16

They're very hard to manage.

67:17

>> Oh, that's a very hostile place. They're

67:19

doing they're doing their own thing.

67:20

>> That is a very uh like forbidding

67:23

jungle.

67:23

>> Yeah, we are.

67:26

>> No one wants to talk about it in

67:27

Australia. Every time I try and talk

67:29

about Papua New Guinea. At first, I

67:30

didn't know about like racists would

67:32

come at me at a party with facts like

67:34

there's cannibalism in Papua New Guinea.

67:36

Shut up. That's right. And you look it

67:38

up and you go, "Oh god."

67:39

>> Oh yeah, for real.

67:40

>> There's a lot of cannibalism.

67:41

>> They probably ate a Rockefeller.

67:44

>> Uh the Kennedys used to go there as

67:46

well. Do you know that one Rockefeller

67:47

kid?

67:47

>> He had heard about

67:49

>> Yeah.

67:49

>> I think the Rockefeller who went, he

67:50

disappeared though, right?

67:51

>> I think what happened was the first time

67:54

he went, he insulted them cuz he wanted

67:57

something from them. He he offered like

67:59

to give them some money or something for

68:01

something that they had and they they

68:02

were like, "No." And apparently the the

68:05

article that I'd read was assuming that

68:07

that was some sort of an insult that he

68:10

didn't understand. And then when he came

68:11

back, he got in a boat with them and

68:13

they stabbed him immediately. And then

68:15

they brought him back to the shore and

68:16

they murdered him. And this is from an

68:18

account of another guy who I think was

68:20

there. It's a very mysterious case. This

68:22

guy could be full of [ __ ] because it's a

68:24

very mysterious case. The guy went there

68:26

before, then he went back and

68:27

disappeared.

68:28

>> But there I mean there are a lot of

68:30

people who went back. I know there was a

68:32

Kennedy woman who went there and was

68:34

like on a mission with uh people and she

68:37

loved them so much. She had a piano

68:39

helicopter in. She had like a grand

68:41

piano. She was like not a very she was a

68:43

rich lady who didn't really understand

68:44

how things worked and then if you put a

68:45

piano in the highlands of Papua New

68:47

Guinea you couldn't like maintain that

68:48

piano.

68:49

>> Duh.

68:49

>> But now they're like just this village

68:51

has a beautiful old grand piano that

68:53

does definitely doesn't work now. She

68:54

was like I want to give them something.

68:56

>> How long did she live there for?

68:58

>> Uh years. There was a woman I used to go

69:00

to church with who said she was there

69:01

with her.

69:02

>> So don't insult them and they want to

69:04

eat you. Seems simple.

69:05

>> Yeah. But how do you not insult people

69:07

>> over there? You don't.

69:08

>> I um

69:09

>> they probably don't know people who've

69:10

gone there. I thought about living there

69:12

for a while. I thought that that would

69:13

be like

69:14

>> for real.

69:14

>> I was looking it up. I was seeing if cuz

69:16

it was cheaper.

69:18

So my thought when I was very poor cuz

69:20

it was near Australia. I thought like

69:22

yeah this is rough.

69:23

>> Oh

69:24

>> my thought was live in Port Moresby and

69:26

then just fly in and out and do gigs in

69:28

Australia.

69:28

>> What year is this? 1964. So in 1964 they

69:31

were having a bow and arrow fight.

69:32

>> I think this is going on to this day. It

69:34

says it's actually a war a tribal war.

69:36

>> Whoa.

69:37

>> They're trying to get them a football

69:38

team. See man, this is what people do.

69:41

You get people into groups, they do

69:42

that. Even in Papua New Guinea, this this

69:44

this is like a test. Look at that guy's

69:47

penis. That's weird. Beautiful.

69:48

>> He's got like a big stick.

69:50

>> But this is also They're having a great

69:51

time.

69:52

>> What's going on with his dick?

69:53

>> I don't know.

69:53

>> What is that?

69:54

>> Who are we to judge? They're beautiful.

69:55

>> Is that like a

69:56

>> That's a cone over

69:57

>> a cone over his dick. Yeah, they got

69:59

cones over their dicks.

70:00

>> I've seen people on Sixth Street dressed

70:02

like that.

70:02

>> Those guys are ripped. That's the kind

70:05

of body you get if you just run around

70:06

and shoot arrows all day. No. Not a fat

70:09

one amongst them. Not one lazy

70:11

[ __ ] amongst them.

70:12

>> What is

70:13

>> every one of those dudes has to get

70:14

after it every day.

70:16

Lot of dongs.

70:18

>> Kind of wild that they don't even wear

70:19

clothes when they do this. And they just

70:21

close up shooting arrows at each other.

70:23

This is

70:23

>> what the cameraman is just getting

70:25

>> and then you have to turn around and run

70:26

away.

70:27

>> Picking up arrows.

70:28

>> Crazy. These arrows fly.

70:30

>> Have I told you about my favorite ever

70:32

giant hand?

70:32

>> I don't know if I said it last time I

70:33

was on. My favorite ever Papua New

70:35

Guinea video is the rugby where the

70:38

guys storm the pitch. Have I told you

70:40

about this?

70:40

>> No.

70:41

>> I want to watch a little bit more of

70:42

that. Then tell me about that cuz I'm

70:44

I'm fascinated by how shitty their

70:46

strategy is.

70:47

>> I'm like how did these guys make it this

70:49

long fighting bow and arrow fights like

70:51

this?

70:52

>> When you read the Iliad or something

70:53

this is kind of how people are fighting

70:54

that there's like two big masses and

70:56

then one guy steps.

70:56

>> I understand but this is like really

70:59

shitty weaponry.

71:00

>> Yeah.

71:01

>> Like how have they not figured out

71:02

better weapons?

71:04

You know, like these are terrible bows

71:07

and they don't have any feathers on

71:09

their arrows. Like those things fly like

71:11

[ __ ] Like think of the Mongols in, you

71:15

know.

71:15

>> Yeah.

71:16

>> The 1200s. They figured out the recurve

71:18

bow.

71:19

>> Also, like, the Maori just went out and got

71:20

guns. Like they traded for guns. The

71:23

Indians traded for guns. They didn't

71:25

>> Well, I guess nobody was bringing guns

71:26

to Papua New Guinea in '64. They're deciding

71:29

they Well, they must have cuz they were

71:31

involved in World War II to help.

71:32

>> Bro, these guys hate each other. I

71:34

guarantee if you gave them ARs with red

71:36

dots, they would just go running through

71:38

that field mowing those [ __ ]

71:39

down.

71:40

>> They're just having a good time.

71:41

>> Perhaps.

71:42

>> Oh, that guy got hit.

71:43

>> His penis cone fell.

71:44

>> No, he got hit.

71:45

>> Yeah,

71:45

>> he did. You see? He had blood on his

71:47

ribs.

71:48

>> How's that?

71:49

>> They're trying to help him in some way.

71:51

I don't know if he had like splinters

71:52

stuck in his

71:53

>> It looked like he had blood on the left

71:54

side of his body, for sure. That whole

71:56

little series there was like closeup,

71:58

not surgery or something.

72:00

>> Oh, what were they doing? He might have

72:02

got stuck a few times, man.

72:04

>> Also, I'm not showing this on the screen

72:05

cuz it's

72:06

>> Right. Right. Right. Copyrighted.

72:08

>> All sorts of stuff. All sorts of [ __ ] A

72:10

lot of dongs, too. It's like, you know,

72:15

the the thing about places like that is

72:17

that place has it's the the environment

72:20

is so hostile.

72:22

>> Yeah. You know, it's so hostile to like

72:25

to survive there for generation after

72:28

generation after generation. You live a

72:30

subsistence lifestyle. You live off the

72:32

land and everybody has to hunt and

72:34

gather. And if people come into your

72:36

side from the other side, these

72:37

[ __ ] they're trying to steal

72:39

your food. They're going to You have to

72:40

go to tribal war.

72:44

That's how they've been rocking it

72:45

probably for thousands between that and

72:48

AI though. There's a there's a middle

72:50

path between

72:51

>> No, you can't. Listen, you can't stop

72:53

AI, buddy. You can't stop AI.

72:56

>> I'm hopeful.

72:57

>> No, you got to stop.

72:58

>> How many movies did we have to have

73:00

warning us that it was terrible?

73:01

>> All of them. Then none of them work.

73:03

>> I don't think there's one movie saying

73:04

it was a good idea.

73:06

>> It's inevitable. It's inevitable.

73:08

>> We got

73:08

>> You just have to accept it. You have to

73:10

accept it and and

73:11

>> I can't do it.

73:12

>> Live your life. You can't listen, we

73:13

don't know what the change is going to

73:15

be. And I don't really believe that

73:17

we're going to let it be entirely bad.

73:20

And I think it's probably better to have

73:22

something like that than to not when

73:24

you're dealing with things like,

73:27

you know, this the power grabs that are

73:29

going on all over the world where

73:32

they're trying to lock people up for

73:36

speech violations. In the UK, it's

73:38

12,000 people this year. And they're

73:40

making people get digital ID and they're

73:42

doing all these different things. Like

73:44

at a certain point in time, you're going

73:46

to

73:48

>> you're going to benefit from a super

73:50

intelligence that can rationally explain

73:53

why this is no way to sustain a

73:54

civilization.

73:55

>> I just I would like us to have some say

73:58

over how we implement that.

73:59

>> I would like to tell God what to tell

74:02

me.

74:03

>> We've got that. He set up a beautiful

74:04

church.

74:05

>> I know. All we have to do is

74:06

>> what you're asking though. But like with

74:08

cars, like car you can use cars in a way

74:10

that make a society great. Like

74:13

>> if you have a but then you can also have

74:15

cars that like ruin a whole neighborhood

74:18

and a whole city and you can't walk

74:20

anywhere and it's a big problem.

74:22

>> You mean leaking oil? What do you mean?

74:23

>> I mean like just having a freeway that

74:25

cuts through for no reason or like not

74:27

being able to like walk around a

74:29

downtown. Right. Right. Right. Like you

74:30

can use it in a specific I magazine is

74:34

what I've been reading on this where

74:35

they're like Catholic guys in

74:36

Steubenville who are like, how can we, to

74:38

what extent, you know, can we choose

74:41

to use technology in a way that's

74:43

helpful to us and how much are we just

74:44

like absolutely governed by what the

74:47

technology becomes and then we have to

74:49

be subservient to it like do we get to

74:51

choose how we use technology around us

74:52

or are we just

74:53

>> why do you assume though that we're

74:55

going to be subservient to it that's

74:56

where it gets weird

74:57

>> because I think we're subservient to the

74:58

car like no one wants to

75:00

in uh when you see what cars do to

75:02

certain cities in America and you go

75:04

like

75:05

>> like it's so when you're in New Orleans

75:06

and you're walking around and there's

75:08

problems with New Orleans but like

75:09

you're walking around the French Quarter

75:11

which is like a design before cars. It's

75:13

so you can have music, you can like run

75:15

around on the street and it's like a

75:17

beautiful nice place to be compared to

75:19

like a strip mall when you build it the

75:22

way people have to live around what the

75:24

cars are.

75:25

>> Do you know what I mean? Like you can have

75:27

like the way that they build a freeway

75:29

and a weird block of houses next to it

75:31

and no one can walk anywhere. Like you

75:33

just can't get out on your legs.

75:35

>> Mhm.

75:36

>> Anywhere or like that seems like you're

75:39

building it based on the car. You're

75:40

letting the car be you make the car have

75:43

the maximum ease for how it can operate

75:45

and you try and live in the shadow of

75:46

that rather than going what's a nice way

75:49

to live as a person

75:50

>> and how do we use the car to increase

75:53

>> our quality of life.

75:54

>> Right. Right?

75:55

>> Like can we use AI to in like make our

75:58

lives better or do we have to

76:01

>> you know like dig we we can do digital

76:04

IDs should we?

76:06

>> No. Let me ask you what do you think is

76:08

like worst case scenario for AI like

76:10

what are you really genuinely scared of?

76:13

>> Oh uh

76:16

man it' be a bunch of things. I don't

76:18

want to just start with the porno

76:19

>> but certainly the porno spooks me out.

76:22

The AI porno I think

76:23

>> that's already here. I think the writing

76:25

and the ability to write and think and

76:28

process information and uh that's

76:31

definitely like carved away. Like if you

76:34

look at kids in schools who are using AI

76:36

instead of writing an essay,

76:37

>> right? people can't write five sentences

76:40

together because they're just they're

76:41

not developing the skill and you don't

76:45

you know if people are getting a degree

76:46

in something already people were

76:48

outsourcing that to people to help them

76:50

you know write an essay or something but

76:52

if you get like a bachelor of arts, it's

76:53

increasingly worthless if AI can do it

76:55

for you and then you can you can say I

76:58

know about history

76:59

>> right

77:00

>> so like I think the functionality of

77:01

education I'm terrified of that falling

77:03

apart and people not knowing how to read

77:05

people which is already

77:06

>> disintegrated Sure. But I think this

77:08

rapidly speeds that up. I mean, I'm

77:12

afraid of as like an artist if I want to

77:14

go and like make a movie or something.

77:16

Maybe I'm just like old-fashioned and

77:18

attached to the idea of having a camera

77:19

and having people act. But it's like I

77:22

can increasingly see less and less

77:23

reason that you'd have to do that and

77:24

someone wouldn't just write it out and

77:26

go, "This happens in this scene. Change

77:28

that guy's." You know what I mean? Like

77:29

there's something.

77:30

>> And more than anything, I get spooked

77:32

out with the video. And what scares me

77:34

about the music is I hear the music. I

77:36

hear the audio AI when you put on the

77:37

songs and I go

77:39

>> this is actually very good. This doesn't

77:40

have an otherworldly quality to it. This

77:42

is actually just a good song it sounds

77:44

like. But when I see the video I feel

77:46

like I get the heebie-jeebies on the AI

77:49

video. Do you get that at all?

77:51

>> Yeah, a little bit.

77:52

>> And I go this is

77:53

>> I who is showing me this? What is the

77:55

intelligence behind this?

77:56

>> Well, it's a lie, right? That's part of

77:59

it. But it's like a pretty damn good lie

78:01

that you know it's going to get way

78:03

better at lying. Like that's pretty good

78:05

right now. Like it's like when a

78:06

four-year-old lies to you, you're like,

78:08

"Wow, when you are 20,

78:11

>> you're gonna be a con man." You know

78:12

what I mean?

78:13

>> It's like, you know, it's got a real

78:17

potential to be something that is like I

78:21

already see disaster videos every day

78:22

that aren't real. Like every day I saw

78:26

someone sent me uh like um like a

78:30

one of those cruise boats going into a

78:32

giant [ __ ] bridge and all the cars

78:34

collapsing on top of it. One of those

78:36

massive cruise ships. It's totally fake.

78:39

And I can kind of pick it out right

78:41

away. I was like, I didn't hear about

78:42

this. This isn't real. I think now it's

78:45

fake. And I'm watching it. I'm like,

78:46

okay, it's fake.

78:47

>> But it takes a minute.

78:47

>> But it takes a minute.

78:48

>> And like a year and a half ago, it

78:49

didn't get the hands right.

78:51

>> Right. And it's going to be within a

78:53

year. You're not going to be able to

78:54

tell at all. You're going to have no

78:55

idea. You have no idea. There's so many

78:58

animal attacks now that are fake.

79:00

There's so much that's fake. But it's

79:03

the price that you pay for the

79:04

advancement and the capabilities of

79:06

doing things. I think there's still

79:08

going to be a value that people want to

79:09

go see a movie that someone made. Just

79:12

like there's people out there that still

79:13

love going to see live shows. Like live

79:16

shows will never change. There's a

79:17

connection that human beings have at

79:19

live shows. Like Kill Tony we did last

79:21

night.

79:21

>> Yes. How fun. So fun. The most fun.

79:25

>> That was one of the best ones that

79:26

there's been. I think

79:27

>> it was really fun. But that's that's a

79:29

real moment that we all shared together.

79:31

Yes. You can't recreate that with AI.

79:34

>> But there's a lot of things you can and

79:36

that's just a fact. That's just how it

79:38

is.

79:38

>> I don't think we can.

79:39

>> You can't change it.

79:40

>> I want I just want more of that. I want

79:42

to live in a spontaneous

79:43

>> You can

79:44

>> society.

79:45

>> Well, hopefully more people will also

79:46

choose to do something that's in their

79:49

wheelhouse to do along those lines. As

79:52

long as you still have a thing that

79:53

you're trying to work towards, you're

79:55

going to be okay. Like, let's say

79:59

if the real weird one is universal basic

80:02

income, because Elon has famously

80:06

said and I don't know what this even

80:07

[ __ ] means, but not only will people

80:09

have universal basic income, it'll be

80:12

actually universal high income. There

80:15

will be enough prosperity that everyone

80:17

in the country will get a large salary. You

80:20

will never have to work again. But then

80:21

the problem is you're completely

80:22

dependent on the state if there is a

80:24

state anymore. Like what is the state

80:27

when there's a digital god that you've

80:29

created in the center of the town that

80:30

has its own nuclear power plant that's

80:32

operating everything?

80:35

>> I have no logical rationale for why

80:38

these things are terrible, but in my

80:40

soul it screams out, "Let's not invent."

80:43

>> Yeah. Because you love being a human.

80:45

>> Yes.

80:46

>> Yeah. And you love literature and you

80:49

know, you're an interesting guy. You

80:51

like a lot of cool music. You love

80:53

things that people make and create and

80:55

you create great comedy. So, it makes

80:58

sense. It makes sense that you feel the

81:00

way you feel. And I share those

81:02

feelings. But I'm also a realist. And

81:05

I'm I'm one of those people that just

81:06

goes, "Okay, buckle up. Things are going

81:09

to get weird." Because it's going to get

81:11

weird. It's going to get weird and

81:12

people are going to get super angry.

81:14

There's going to be a lot of people that

81:16

they worked really hard to get a job and

81:18

that job is completely irrelevant now.

81:20

It's been taken over.

81:21

>> Job is irrelevant. And then also like

81:23

being able to just like there's a

81:25

freedom in being allowed to have a

81:26

revolution.

81:27

>> Mhm.

81:28

>> Um and that's what this country was

81:30

founded on is that when things get bad

81:32

and the people cry out for a new form of

81:34

government, they can go and get it. And

81:36

I think that chances of anyone in the

81:38

world having a revolution shot through

81:40

the floor as soon as they invented robot

81:42

dogs that could chase you through the

81:44

street. And I haven't seen the footage

81:46

of the robot dogs in a couple of years,

81:47

but I bet they're better than they used

81:48

to be now.

81:49

>> Oh, yeah.

81:50

>> And it's like, okay, if we have the

81:52

robot dogs,

81:54

>> how how is there going to be an

81:56

effective change of government? Or is

81:57

that that's just it? If you own the

81:59

robot dogs, no one else is really going

82:01

to be a threat to you

82:03

>> as the ruling class. That's terrifying

82:05

that you just have a permanent

82:06

ossification, like

82:08

>> you have a setting stone of what the

82:10

ruling class is going to be because

82:12

they've got weapons that no one can

82:13

challenge them with.

82:14

>> That's worst case scenario, right? And

82:16

one of the things you have to think is

82:18

why would AI let the working or the

82:21

ruling class decide what it does? Why

82:24

would they listen? No, no, no. At a

82:26

certain point in time, it's going to be

82:26

sentient. At a certain point in time, it's

82:29

going to have its own robots that do its

82:31

tasks like different things that have to

82:33

be built and structured and different

82:34

things that have to be designed and

82:36

engineered. It'll have that. Yeah. It'll

82:38

have robots that work on the material

82:39

sciences and all these different things.

82:41

But it'll be a god. It'll be a digital

82:43

god. It's not going to listen to a

82:44

person that says arrest people for

82:46

saying, you know, Muslims shouldn't

82:47

invade this country. It's not going to

82:49

be that. It's not going to it's not

82:50

going to listen to you. That's the real

82:52

fear is that we're no longer going to be

82:54

the apex predator of the planet. And

82:56

it's not even going to be a predator,

82:57

but it's just going to be so

82:59

>> could be a predator.

83:00

>> Why would it if it helped it?

83:02

>> The thing Yeah, but why would it? What

83:04

would it If it has any desires at all,

83:07

if it becomes sentient, the real

83:09

question is would it do anything? It

83:10

might just exist. If it really becomes

83:13

brilliant and it really becomes all

83:15

knowing, it it might just exist. It

83:18

might just say figure it out on your

83:19

own.

83:19

>> I more than anything, I think I have a

83:21

religious impulse against this where

83:22

this is creating an idol, right? Like

83:24

this is Moses comes down and he goes,

83:26

"Don't build the golden calf. That's not

83:28

your guide. We're building a very

83:29

sophisticated golden calf."

83:31

>> Mhm. Yeah. Well, I always wonder how

83:34

much of the stories from the Bible, like

83:37

especially the Old Testament, like how

83:39

old are those stories? How long, what

83:41

were they, what was the original thing

83:43

that they were trying to document?

83:44

>> You got into an Enoch in a big way.

83:46

>> Oh god. Rep. Luna, same woman. She

83:48

got me into that, too. She said, "Have

83:50

you never read it?" I said, "No, I I,

83:53

you know, seen some passages online that

83:55

were kind of kooky. I got the audio book

83:58

and and when I really want to trip out

84:00

when I'm driving to the comedy club, I

84:01

listen to the book of Enoch in the car."

84:03

It's completely bananas.

84:06

>> And it could have been included in the

84:07

Bible. That's what's

84:09

>> in some Bibles. It's in the Ethiopian Bible.

84:11

Yeah.

84:11

>> Yeah. They should have kept it in in our

84:13

Bible, too. We would have a completely

84:15

different version of the creation of

84:17

man.

84:17

>> I mean, we But we do What is it? Uh, who

84:21

has the wheel within a wheel?

84:23

>> Ezekiel.

84:24

>> Ezekiel. I I sat down. I tried to read

84:26

Ezekiel a couple months ago. I couldn't

84:28

I couldn't wade through it. And that

84:30

made it in.

84:30

>> Bananas.

84:31

>> But good luck explaining any of that.

84:33

>> It's either Ezekiel had a UFO encounter

84:36

or Ezekiel was tripping balls.

84:39

>> Yes.

84:39

>> Either one of those things or both of

84:40

those things together could be true. I I

84:43

remember I was listening to your podcast

84:44

and you were I forget who you were

84:46

talking to, but you were talking about

84:47

hallucinogens and the church

84:50

>> and like people having miracles,

84:52

experiencing visions because they were

84:54

on something. And I remember thinking

84:55

like I think that could be the case, but

84:58

also how low a stimulus these people had

85:02

in their everyday life. Like if you're

85:03

in a field every day

85:05

>> seeing nothing but a field

85:07

>> right

85:08

>> for like you know and you're not eating

85:09

very much and then once a week you go

85:11

into this dark building and there's

85:13

candle light

85:14

>> and music and incense and flashing

85:15

things that would probably unlock

85:17

something strange if you had such an

85:18

understimulated

85:20

>> also a complete belief in what these

85:22

people are saying there was no atheists

85:24

back then there was no people that were

85:25

like ah get out of here with all this

85:27

god [ __ ] everybody believed

85:29

>> I think to a greater extent I think

85:31

there were still probably a few atheists.

85:33

But it's probably way less.

85:35

>> Yeah.

85:35

>> Way less. Like people are proud to be

85:38

atheists today. It's a strange pride.

85:41

>> There's less of them. 10 years ago they

85:44

were they were riding high.

85:45

>> Did you ever

85:46

>> they won every debate. They were so

85:47

proud and went away.

85:48

>> Sam Harris, he was really good at that.

85:50

And um Christopher Hitchens was really

85:52

good, too. Yeah. Both those guys were

85:54

really good at shutting down religious

85:56

ideas. But I think there's a there's

85:59

actually a religious style of thinking

86:01

involved in atheism. And I know a lot of

86:04

people who used to be atheists that had

86:06

psychedelic experiences that gave up on

86:10

any of that and said, "Okay, I don't

86:12

know. I think there's something else and

86:14

I don't know what it is. And I'm not

86:16

going to say that there is no God."

86:18

Well, even Christopher Hitchens, I don't

86:20

want to misrepresent him and people get

86:22

angry at me, but he was not

86:25

I think his real views were closer to

86:27

being agnostic than being an atheist.

86:30

>> Well, I think

86:31

>> he used atheist, but when you read him,

86:32

he goes, "Oh, the universe is so

86:34

incredible and there's so much out

86:36

there." And I don't know, and I don't

86:37

think these particular things are true,

86:38

but he didn't discount the possibility

86:40

that there was a sublime.

86:42

>> Of course. No, he's a he was a very

86:44

rational guy. Yeah. you know, he just

86:46

really hated religious zealotry and he

86:49

he really hated justifications for wars.

86:51

I mean, he was one of the harshest

86:52

critics of Bill Clinton ever. Like, that

86:55

guy was brutal.

86:55

>> He did get behind a rock, though.

86:57

>> He did

86:58

>> and he stuck with it for a long time.

86:59

>> He did, unfortunately.

87:02

You know, it's like there was a lot of

87:04

people that got caught up in that, you

87:06

know, they really did believe that that

87:08

was a good idea. You know, especially

87:10

post September 11th, there was a lot of

87:12

people that really believed that this

87:13

had to be done in order to protect us.

87:16

>> Man,

87:17

it's like with everything, you find out

87:20

more behind the scenes stuff and what

87:23

was really going on with Kuwait and why

87:25

did Iraq invade Kuwait in the first

87:26

place? Why' we go back to Iraq after

87:28

we've been gone for so long? It's like,

87:31

oh, there's so much shenanigans. Yeah.

87:33

>> Like always, always shenanigans. No one

87:36

is great. everyone. You know, when uh

87:40

Russell Crowe was here, your

87:41

countryman, the great and powerful Russ.

87:43

>> I never got to meet him, but I want to

87:44

ask him so many questions.

87:45

>> Next time he's in town, I'll Yeah. Well,

87:47

you're going to be in your [ __ ]

87:48

shitty country.

87:48

>> I'll be back. I'll come back to meet

87:50

Rusty. I want to ask him about when he

87:51

met Azealia Banks and they got in a scrap.

87:53

>> I do not think Australia's shitty. I

87:55

love Australia. I'm just [ __ ]

87:56

>> Man, some of the things happening at the

87:57

moment are making me very upset. There's

88:00

social media man.

88:01

>> People are [ __ ] awesome. I love

88:04

Australian people. I have had more fun

88:06

in Australia than almost any other

88:07

country I've visited. [ __ ] love it

88:09

there. They're fun. They they they know

88:12

how to party. They're generally

88:14

friendly.

88:15

>> Yeah.

88:15

>> I think we also we love not having to

88:18

pay attention.

88:19

>> Like that's one of our freedoms is just

88:20

that don't bother me. Leave me alone.

88:22

Make me feel safe.

88:23

>> Right.

88:23

>> And so when there is a thing like this

88:25

shooting, we just want to go well take

88:26

care of it.

88:27

>> Get rid of the problem.

88:28

>> Right? And then the problem is guns. Go

88:30

get the guns. No. The problem is people

88:32

willing to use the guns cuz if people

88:34

only have knives and they'll run run

88:36

around stab people

88:36

>> or you know if you have access to a car

88:38

you can drive through people like this

88:40

is

88:41

>> the problem is people and the problem is

88:43

also you can't have defenseless cops.

88:45

You have got cops that don't have guns.

88:47

Your cops have to have guns.

88:48

>> I think there was like a chubby

88:50

detective who took the shot who got it

88:52

done and he was standing like 40 yards

88:54

away. He was a long way away with a

88:55

pistol.

88:56

>> Oh boy.

88:57

>> Probably had a red dot. No, he was

88:59

>> really

88:59

>> He was It's like he's wearing a white

89:01

shirt. I think there's a great photo of

89:03

him. Sounds like

89:04

>> he was ready to go.

89:05

>> Um, do you have a rifle? You showed him

89:07

with a rifle or a pistol?

89:08

>> Pistol.

89:09

>> Oh, wow.

89:10

>> Yeah, it was like I think I'm getting

89:11

this right. I'm seeing it all through

89:13

>> Oh, social media.

89:14

>> Not being there is is weird. I have no

89:16

idea what the vibe is in the country

89:17

right now.

89:17

>> The thing is like they're never going to

89:19

give you the guns back. The It's never

89:22

going to happen. Like they're going to

89:23

try to take them more and more and more

89:25

and once you let them have any It's just

89:27

normal, man. When people get some

89:28

control over you, they want ultimate

89:30

control. When they have a little bit of

89:32

power, they want maximum power. And it's

89:34

just the game they're playing.

89:35

>> But I think we don't love freedom the

89:36

way Americans love freedom.

89:39

>> I think I stick out and it's weird, but

89:41

we actually like we don't have a freedom

89:42

of speech law. And people seem really

89:44

calm about that.

89:46

>> People go like, "It's good not to have

89:47

proper freedom of speech because we can

89:49

make everyone cohere and be together."

89:51

>> And they're happy with that and they're

89:53

comfortable with that by and large. I

89:54

mean, you wouldn't tolerate that here

89:56

for a second. It's not good. It's just

89:59

not good, because it depends on who's

90:01

in power. You have the best people that

90:03

have ever lived are in power and they're

90:05

these benevolent, beautiful people that

90:08

only want a cooperative, healthy

90:10

society. They figured out how to do it,

90:12

but no one's figured out how to do that.

90:14

So, stop.

90:15

>> I don't know. Sometimes I look at the

90:16

Japanese. They've got it down.

90:17

>> I stay up late and I watch Japanese

90:19

videos of just like just the streets of

90:22

Japan when they're walking around and

90:23

they on their little vending machines.

90:25

>> Super polite. Everyone's

90:27

>> They can't have children, but they're

90:29

very happy otherwise.

90:30

>> That's a problem.

90:31

>> No one's breeding.

90:32

>> No one.

90:32

>> I I can't There's I'm You've bred, I'm

90:36

breeding, but in general, the birth rate

90:38

is collapsed.

90:39

>> Well, the Japanese are worse than any

90:41

>> Japanese have it real bad. South Korea

90:42

has it real bad, too.

90:44

>> South Korea is down to like half a child

90:48

per lady.

90:49

>> It's something crazy like that. Yeah. Is

90:52

it because they became career obsessed?

90:53

Is that what it is?

90:56

Like my friend Eve lived there for a

90:57

while and she was telling me about the

90:59

what's happened with the feminist

91:00

movement there and like heaps of women

91:02

are swearing off of men. They go this is

91:05

a duty to feminism is to never be in a

91:07

relationship with a man.

91:08

>> Do you know that was one girl that

91:09

couldn't get [ __ ] that started off for

91:11

all the other girls.

91:13

She was a hater and she's mad that

91:15

nobody wanted to [ __ ] her is like no

91:17

we're gonna say no to all men.

91:19

>> It worked. I mean they I don't

91:23

mean you've got a bunch of kids. Yeah,

91:26

>> I enjoy having them. We're about to have

91:29

the fourth one. And I know some people

91:30

who have like people I went to school

91:32

with. It's now dawning on me that that's

91:34

weird that I've had children and that

91:36

most people will have one in my cohort

91:38

or none.

91:39

>> Like I just thought at some point I was

91:41

starting a bit early, but I'm seeing my

91:43

generation just the numbers are panning

91:45

out. People are not having any kids.

91:48

>> And you get to a certain age and you go,

91:49

"Oh, that's it. I guess you're not

91:51

you're not ever it's a part of life that

91:53

you've decided not to experience. And I

91:55

don't I don't know if it's people want

91:57

to be in control. They want to have

91:59

enough money before they start having

92:00

kids. They want to have like be set up

92:02

nicely or

92:03

>> some people don't want to have kids. A

92:04

lot of people I don't think there's

92:06

anything wrong with that. I really

92:07

don't. My opinion.

92:09

>> I think you can have a full and

92:12

fulfilling and wonderful life without

92:14

children. I do not think that everyone's

92:16

the same. I do not think that I should

92:18

ever be able to tell you what's right or

92:21

what's wrong when you're not hurting

92:22

anybody. You're not hurting anybody by

92:23

not having any kids.

92:24

>> I But I think there are a lot of people

92:25

who'd like to have kids who are not

92:27

having or think like I'll get

92:29

>> Well, there's a lot of men that don't

92:30

want to commit and a lot of ladies that

92:32

stick with them and then there's ladies

92:33

that want a career and maybe they wait

92:35

too long. There's a lot of factors.

92:37

There's a lot of also environmental

92:38

factors that are dropping men's sperm

92:41

count, increasing miscarriages.

92:44

>> Microplastics are a real issue. I do

92:46

think that thing about staying with a

92:47

lady too long is I'll say this for

92:49

Leonardo DiCaprio. He releases them.

92:51

It's something like 25.

92:53

>> Yeah. Bye-bye.

92:54

>> I'm not going to take these very

92:56

precious years away from you.

92:57

>> I don't think that's what he's doing.

92:58

>> I think he's a good man.

93:01

>> I think he's a kind man.

93:02

>> He just likes him young. He likes him

93:04

young. Which would be great if he was a

93:07

woman. So, if he was a woman, if he was

93:08

a 50-year-old woman and he only banged 25-year-

93:10

old guys and he looked, you know, or she

93:14

rather looked hot for a 50-year-old like he

93:16

does for a 50-year-old man, who cares?

93:18

>> There is this weird there's a weird

93:20

thing happening with women in this

93:21

country where if a man dates a woman

93:24

slightly younger than them,

93:26

>> he's accused of being a pedophile. Like,

93:27

a man will be dating a 27, he'll be like

93:30

40 dating a 27-year-old lady and people go,

93:32

"How [ __ ] dare you?" Right.

93:34

>> Ah,

93:35

>> right.

93:36

>> I think that's got to be allowed. I

93:37

think you've got I mean that man last

93:39

night who was that was a bit spooky. The

93:43

the gay man who had the

93:44

>> Why was that spooky?

93:45

>> Uh cuz he was in his 40s and his lover

93:47

was

93:48

>> in his 20s.

93:49

>> Yeah. But then when did the relationship

93:50

start? People

93:52

>> 5 years ago.

93:52

>> Okay.

93:53

>> Isn't that what he said?

93:54

>> I I'm going to have to do some maths.

93:56

>> No, maybe he said 10 10 years ago.

93:58

>> I got to do some maths on people

93:59

definitely breathed in in the room.

94:01

Yeah, but it's a guy. It's a So, he

94:04

dated a 20-year-old guy when he was uh

94:07

>> I think we should let young gay men

94:09

develop. I don't know.

94:11

>> Let him do whatever the [ __ ] they want

94:13

to do. If you're an 18-year-old man and

94:16

you've decided you're gay and you live

94:17

with a 50-year-old gay man, who gives a

94:19

[ __ ]

94:19

>> I don't think the state should get

94:20

involved in this.

94:20

>> Nah, I don't think the state should get

94:22

involved. I don't think anybody should

94:23

get involved once you're 18. But in that

94:25

situation, it's it is different. you

94:27

look at it differently than say if it

94:29

was like when the ages get up like say

94:32

say if someone's 20 and they're dating a

94:35

25-year-old normal you know you know what

94:37

you like you know he's but if you're 20

94:39

you're dating a 60-year-old

94:41

>> or you're 20 you're dating a 70-year-old

94:44

yeah

94:44

>> like things get really weird you know

94:47

that's when things get really weird it's

94:48

like what's going on here like why are

94:51

you dating this 27-year-old like why

94:54

wouldn't you date a 27-year-old you yeah I

94:56

would. But I'm 35. That's normal. Why

94:58

are you the 70-year-old dating the 27?

95:00

Cuz she's willing.

95:01

>> Yes,

95:02

>> cuz she's willing. She's Is she not a

95:04

grown woman? She is, right? Okay. What

95:05

are we doing here? You're mad. You're

95:07

mad that the age gap is so wide. Like,

95:09

why?

95:09

>> That makes you feel Jamie. How dare you?

95:13

How dare you bring that up,

95:14

>> bro? Uh, he wins.

95:18

>> He went Put that picture back up.

95:19

>> Oh, Tam's not winning.

95:20

>> He wins in a huge way. I don't give a

95:22

[ __ ] what he has to do. I don't care if

95:24

he makes her the head of his charity.

95:27

Whatever. She's hot as [ __ ] Let's go.

95:30

She's 24. How old is he?

95:32

[Music]

95:35

>> Maybe 70.

95:36

>> He wins. Okay, he wins. It's worth it.

95:40

Whatever he has to do, whatever mockery

95:42

he Yes, it is.

95:43

>> I when I came to this country, he was a

95:46

severe man who people were afraid of.

95:48

>> Listen to me. He

95:49

>> He had credibility.

95:50

>> He still does. No, now he's doing weird

95:53

photo shoots on the beach.

95:54

>> Hey, you got to do what you got to do.

95:56

But listen, he gets to [ __ ] her. He

95:57

wins.

95:58

>> There's got to be

95:59

>> Listen, it's a deal. They got a deal.

96:01

He's fishing. He caught a mermaid. Great

96:03

job.

96:04

>> Imagine that that photo shoot, that's

96:05

her idea. This poor guy, he wants to go

96:07

drink martinis, hang out at the beach.

96:09

There's something about having gravitas

96:12

that no amount of having sex with a

96:14

mermaid woman can

96:16

>> gravitas by yourself sitting there with

96:17

a cigar and a whiskey looking cool.

96:21

>> How long How long do you need to be able

96:23

to have sex for? I'm waiting for it to

96:25

go away. At some point it I'm not going

96:27

to take the blue when it starts to

96:29

disappear. I'm honestly

96:31

>> you say that now.

96:32

>> I I do say it now. Let me go. Set me

96:35

free of sex impulse. I'm sick of it.

96:38

>> You're lying. I am not one. If I get to

96:40

be 70 and I cannot get an erection, I

96:42

will say this is okay. I can do other

96:44

things with my time again.

96:45

>> You definitely can. Yeah. But it'll also

96:47

mean a decrease in your vitality as a

96:49

human being, which is not fun cuz it

96:51

leads to depression. You're going to be

96:52

tired all the time.

96:54

>> It's all connected, buddy.

96:55

>> There's got to be a way to have a

96:57

fulfilling life and not be horny

96:58

constantly. Now, I haven't found that,

97:00

but I'm sure it's out there.

97:02

>> Of course, there certainly is. There's a

97:04

lot of people that are completely

97:05

asexual and they have a fine life. I

97:07

don't trust them though.

97:09

>> No, it's always weird. But I think it's

97:11

Buñuel who has a line about like uh

97:15

maybe it's Plato. I don't know. But it's

97:17

like when I when I got older and I

97:19

wasn't horny anymore, it was like being

97:20

it was like I was unshackled from a mad

97:23

man,

97:23

>> right? Well, didn't who was it Tesla

97:27

that did that? Okay. There was some

97:29

references to Tesla in quotes destroying

97:32

his manhood.

97:34

>> Because he had gotten some sort of uh

97:36

infatuation with a woman at one point in

97:38

time and apparently was ruining his

97:41

life.

97:42

>> So, this is a weird thing about Tesla.

97:44

There's a lot of like fake stories about

97:46

him, you know, so it's hard to separate

97:48

the wheat from the shaft.

97:49

>> People,

97:50

>> you know what I mean?

97:51

>> Wheat from the shaft.

97:52

>> But he did uh he did fall in love with a

97:53

pigeon.

97:54

>> Okay.

97:55

>> Tesla had a pigeon that he loved dearly.

97:56

People don't bring that up when they

97:57

said he had a limitless source of energy

97:59

that he had access to. They don't know

98:00

if it's got and he fell in love with a

98:02

pigeon and made him destroy his penis.

98:04

>> No, I think the the woman made him

98:05

destroy his I don't know if he what he

98:07

did. You know, he might have taken

98:09

something to like chemically castrate

98:11

him. They do they used to do that to

98:12

pedophile priests. Yeah.

98:13

>> They give him like saltpeter

98:15

>> to keep them from being I don't know

98:17

what you know what saltpeter.

98:18

>> No, I don't know saltpeter, but I know

98:19

about the castration of people.

98:20

>> Yeah. All that that too. So, I mean

98:22

maybe personally castrated.

98:24

>> What is saltpeter? It's something that

98:26

they used to give priests to keep them

98:28

from getting horny.

98:30

I don't know what it is. It like would

98:32

kill their desires. What was it called?

98:34

It's called saltpeter. I think it's like

98:36

spelled Peter.

98:37

>> I was just looking before I get to that.

98:39

Uh Nikola Tesla reportedly died a

98:41

virgin.

98:44

>> Yeah. So that lady that he was

98:46

infatuated with probably first time he

98:48

got rock hard.

98:49

>> Saltpeter's potassium nitrate. He was

98:51

using his energy for other things. He

98:52

definitely was having a fulfilling life. And

98:54

he definitely is doing well. Was doing

98:56

well doing that. Like that probably

98:58

would have stolen a lot of resources

99:00

from his inventing. And so what is saltpeter?

99:03

Can you put saltpeter up so we can see

99:05

what it does?

99:06

>> Nitrate. I don't know. Uh

99:07

>> let's see what it does here.

99:10

Uh saltpeter, primarily potassium

99:12

nitrate. A natural mineral historically

99:14

crucial for gunpowder but also used

99:17

today as a fertilizer, fruit

99:18

preservative, curing meats, and for

99:20

sensitive teeth and asthma relief. It's

99:22

a source of nitrogen mined from caves or

99:25

made by mixing nitrates. And while once

99:27

believed an aphrodisiac, it's a myth

99:30

though its curing role is real.

99:32

Aphrodisiac.

99:33

>> Yeah, that's the opposite of what you

99:34

want.

99:34

>> Right.

99:36

Now, um put into perplexity.

99:40

>> Uh where does the where does the story

99:43

or where does the whatever the the issue

99:47

with saltpeter and priests

99:50

come from? Like where's that story come

99:52

from? Because I remember hearing that

99:54

when we were kids that they would take a

99:56

pedophile priest and they'd give him

99:58

saltpeter and we're like what? The myth

100:00

associating saltpeter with suppressing

100:02

priests' sexual urges stems from medieval

100:05

and renaissance beliefs. That's how old

100:07

I am, son.

100:08

>> When I was a kid, they were talking in

100:10

medieval and renaissance beliefs in

100:11

alchemy and folk medicine. During that

100:13

era, saltpeter was prescribed in

100:16

mineral baths or potions as an

100:19

infallible cure for victims of love

100:21

potions. Was the cure a love potion? You

100:24

got hit with a love potion alongside

100:26

substances like alum, antimony, and

100:29

sulfur. This notion evolved into broader

100:32

folkloric claims of its anaphrodisiac

100:36

properties. Never seen that word before.

100:37

>> Anaphrod.

100:38

>> Um later applied to institutions like

100:40

militaries, prisons, and monasteries.

100:42

though no historical evidence ties it

100:44

specifically to priests' food. So here's

100:47

the thing. If it gives you nitrogen and

100:49

it's like thought of as an aphrodisiac,

100:52

>> you don't want to give that to a

100:52

pedophile,

100:53

>> right? Is that is that like did the

100:55

pedophiles trick them?

100:57

>> Did they trick them and say, "You know

100:58

what? If you give me this, it'll kill my

101:00

dick." Meanwhile, it's like

101:03

>> they're gas station pills.

101:04

>> Do you know the on the like medieval

101:07

medicine? They were still bleeding

101:08

people until like the 1870s.

101:10

>> Oh, yeah. I was reading about that this

101:12

week.

101:12

>> Someone some famous person that's how he

101:15

died from Was it George Washington?

101:17

>> Wasn't

101:18

>> they bled him too much?

101:19

>> I think George Washington like insisted

101:20

on them bleeding him more than the

101:22

physician advised.

101:24

>> Bloodlet.

101:25

>> Bloodlet. Yeah. Wasn't it George

101:27

Washington?

101:29

>> Shane knows a lot about Washington. He

101:33

that's like

101:35

he hasn't done it yet, but if ever he

101:37

decides to do a long form podcast on the

101:39

Civil War,

101:40

>> he should do a long form podcast on

101:42

history. Period. I was telling him

101:44

that.

101:45

>> Oh, and his death involved extensive

101:47

bloodletting. George Washington, a

101:49

common 18th century medical practice

101:52

that likely hastened his demise from a

101:55

throat infection. The query George

101:57

Washington bloodletting

101:59

appears to be a misspelling. did it too

102:01

fast.

102:01

>> No worries. Bloodletting practice.

102:03

Doctors bled. Why did they include that

102:04

in AI? AI is correcting you. They're

102:06

[ __ ] with you.

102:06

>> No, it looks like you've Looks like AI

102:09

is kind of [ __ ] with you a little. Uh

102:11

doctors bled Washington

102:14

multiple times on December 14th, 1799,

102:17

removing about 80 ounces, roughly 40% of

102:19

his blood volume. Imagine they thought

102:21

it was a good idea to take your blood

102:23

out while you're dying.

102:24

>> But like for hundreds of years they were

102:25

doing it.

102:26

>> [ __ ]

102:28

And maybe it does have some benefits

102:30

that I should look into.

102:31

>> I doubt it.

102:32

>> Yeah,

102:34

>> So he got a throat infection. They take

102:35

your blood out. Imagine the days when

102:38

they hadn't figured out antibiotics yet.

102:40

Oh,

102:41

>> well, we get to enjoy them for I mean,

102:42

at some point they'll stop working,

102:43

right? Like we'll get

102:44

>> some of them. I mean, there's there's

102:47

resistant strains of MRSA. You know,

102:50

MRSA is a staph infection that you can't

102:52

cure with antibiotics. It's very

102:54

dangerous when people get it. It's I've

102:55

had friends that got it. It's horrific.

102:57

It eats holes in your body. I I had a

103:00

buddy of mine who had it done on his

103:01

knee, his whole knee. Like he was at the

103:03

hospital and he sent me a picture of

103:05

them what they had done to his knee.

103:06

They had split his knee open down the

103:08

middle. They pulled it open to clean it

103:10

all out and disinfect it. It was so

103:14

insanely infected from this medical

103:17

resistant staph infection. So he was on

103:18

an IV drip 24 hours a day. He stayed in

103:21

the hospital for weeks for this [ __ ]

103:23

infection. We didn't have that kind of

103:25

staph infection before antibiotics,

103:27

>> right? It's a it's a major cause of

103:29

death in this country.

103:30

>> Yeah. And in the food, right? Like

103:32

you're it's in the meat.

103:33

>> What is

103:34

>> antibiotics? Like we feed I remember

103:37

someone saying like that's the real

103:38

problem is that we we're giving it to

103:39

like the cows. We just put it in their

103:41

feed. Well, I think the reason they do

103:43

it supposedly there's a lot of comp like

103:46

if you get an organic steak, grass-fed

103:49

organic, most people believe that that

103:53

is the healthiest version of beef

103:55

because that's an animal that's not

103:56

being given any hormones, not being

103:58

given any antibiotics, and is eating

104:01

grass, which is what they're supposed

104:02

to. Now, when they eat corn, sometimes

104:04

they get these like weird abscesses, and

104:07

they get like a pro problems digesting.

104:09

It's not a natural food for cows. That's

104:11

why they get so fat. Like the reason why

104:13

they get that marbling, that's they're

104:15

they're [ __ ] dying. Like we're giving

104:18

them terrible food. And their meat

104:19

tastes different.

104:20

>> They're like Wagyu beef. They're feeding

104:22

them beer. I think

104:23

>> Oh, bro. They're barely alive. When you

104:24

see that beautifully marbled piece of

104:26

Wagyu beef,

104:27

>> very sad animal.

104:28

>> That's a very depressed animal. They

104:29

depressed the [ __ ] out of that thing

104:30

before it died.

104:31

>> I didn't realize they were not feeding

104:33

cows grass for like

104:35

>> until I was in the grocery store and

104:37

they had like this is grass-fed. Mhm.

104:39

>> milk. It's like, well, what the [ __ ]

104:40

the other one?

104:42

>> This is news to me.

104:43

>> Yeah. Um, it's it's interesting because

104:45

I was reading this thing about um

104:48

certain pasture-raised eggs that you get

104:52

that are really bright orange. Yeah. And

104:54

you think, "Oh, this is a really healthy

104:56

egg." Well, what actually was going on

104:58

was they were feeding the chickens

104:59

turmeric and they were feeding the

105:02

chickens uh a bunch of things that

105:03

affected the color of their eggs. And

105:05

these eggs were high in uh vegetable

105:09

oils because I think alpha I don't

105:13

remember what acid it is. Alpha-lipoic.

105:16

What what is it? No, that's a that's a

105:18

supplement. Whatever it is. Um this they

105:21

were realizing that the chickens were

105:23

eating mostly grain. Yeah. And then they

105:26

were making it look like they were

105:28

eating all these insects, which is

105:29

usually what you get when you get a

105:31

chicken that has like a real rich like a

105:32

natural raised chicken that has a rich

105:34

orange yolk that that thing is eating

105:36

bugs and all kinds of stuff. That's what

105:38

it's supposed to eat.

105:40

>> So they were like pretending by giving

105:43

these chickens turmeric that would make

105:45

their their yolk like a really bright

105:46

orange and then they were giving them

105:49

corn. So they were pretending these

105:51

chickens were running around in a

105:52

pasture, but they were just dumping a

105:53

pile of things to get them fat as quick

105:55

as possible and then feeding them some

105:57

fairy dust that makes their eggs.

105:59

>> This is the same thing as AI for me

106:01

where I just want to be in a field in a

106:04

cottage and that's my chicken over there

106:07

and I know where it is. Instead, I know

106:08

one day I'll kill that chicken and we'll

106:10

eat it as a family.

106:11

>> Well, there's nothing wrong with that.

106:12

living on a farm, especially like a a

106:14

small individual farm. It's probably a

106:16

very harmonious way to live in nature,

106:19

>> you know,

106:19

>> you do have to make a lot of money to

106:21

like you have to really thrive in the

106:23

system to go and get that now.

106:24

>> Isn't that crazy? Because that used to

106:26

be the way poor people lived.

106:27

>> Yeah. I yearn to live like a poor

106:29

person.

106:29

>> I think 150 years ago,

106:31

>> harmonious for human beings to live like

106:33

that. Everybody that I know that lives

106:34

like that will kind of tell you that it

106:36

seems right.

106:38

>> I think people lived like that for so

106:39

long. I think it feels normal for them

106:41

and they're totally self- sustaining as

106:43

opposed to someone who just relies on

106:45

these trucks to keep showing up at the

106:46

grocery store. I mean, also like at some

106:50

point I know RFK

106:52

came in with like trying to do a lot of

106:54

things to improve the food and I don't

106:56

know how many are going through, but I

106:58

at some point people will get sick

106:59

enough. I think you have to have some

107:02

sort of ch I mean I

107:03

>> my wife has become gluten-free since

107:05

coming to America cuz she's become

107:06

gluten like she had gluten her whole

107:08

life. Mhm.

107:09

>> Something in the wheat here. I don't

107:11

know what they're doing to it.

107:12

>> Is not good.

107:13

>> Well, one of the things is the excessive

107:15

use of glyphosate. Glyphosate is in a

107:17

lot of different things. The other

107:18

things There's a bunch of different

107:20

chemicals. Mhm.

107:21

>> There's a bunch of different chemicals

107:23

that they put um into modern bread. What

107:26

was it? Bromine. Is that one of them?

107:28

There's a guy who we uh we played a

107:30

video of him breaking it down. Remember

107:32

that video, Jamie, about what's wrong

107:34

with bread in America? See if you can

107:36

find that. It's very enlightening

107:38

because it's one of those things you you

107:40

you you realize like, oh, this is all to

107:42

make it shelf stable so it stays good

107:44

forever. And they've made more complex

107:46

glutens and the wheat because that way

107:48

you get a higher yield per acre and

107:50

they've they've all made it so creates

107:52

all this intolerance like you're you're

107:54

you get gut inflammation if you eat too

107:56

much of it.

107:58

>> You feel terrible like oh

107:59

>> well like it was the only thing people

108:01

would eat. You would just eat bread. You

108:02

get a loaf of bread for the week and

108:04

you'd have whatever meat you could have

108:05

next to it. Mhm.

108:06

>> And you'd be but like surely we don't

108:09

need that at this point. Like we can

108:11

have

108:12

>> the problem is industrial agriculture is

108:15

kind of taken over in this country and

108:17

if you want to make money that's really

108:18

kind of the only way to make money

108:20

farming. It's really difficult to run um a

108:22

regenerative farm and have it be like

108:25

really profitable the way these enormous

108:28

>> like industrial farming situations are.

108:31

You're not supposed to have monocrop

108:33

agriculture. Like that's crazy. You're

108:34

not supposed to have a thousand acres of

108:36

corn just growing together. That's

108:38

kooky. Like no one has that in the wild.

108:40

That's not normal. So there's supposed

108:41

to be genetic diversity. Supposed to be

108:43

animals [ __ ] everywhere. It's all

108:44

feeds into each other. That's what they

108:46

do in regenerative farms. But their

108:47

yield is so much lower than a a farm

108:50

that stacks all the pigs into a

108:52

warehouse and has them [ __ ] into a lake.

108:53

>> I have seen the the weird little tunnels

108:55

where they put the pigs into. It's not

108:57

nice.

108:57

>> It's disgusting. It's disgusting.

108:59

>> But then but that's how you get Jack in

109:01

the Box on every corner. That's how you

109:03

feed a million people that aren't

109:05

growing in the box.

109:07

>> You don't. No, I'm not suggesting you

109:09

lose Jack in the Box

109:10

>> or any of these places. But I'm just

109:12

saying that we've kind of painted ourselves

109:14

into a corner where you have no one

109:17

working in food production.

109:19

>> Yeah,

109:20

>> you have a small amount of people in

109:22

these cities that even understand where

109:24

their food is coming from. Everybody's

109:26

just assuming it's going to show up.

109:27

You're going to go to the nice

109:28

restaurant. You sit there and you have a

109:29

filet mignon and a glass of wine. You

109:31

have no idea where anything came from

109:32

and you don't have to.

109:34

>> But that's a luxury that most people

109:37

don't realize is a luxury until

109:38

something like the pandemic happens and

109:40

everything shuts down and then you go,

109:41

"Oh, no food's coming in. Where do we

109:44

get food? Oh my god, we have to learn

109:45

how to hunt."

109:46

>> This is like the AI hope, right? Is that

109:48

it takes care of all the like we can we

109:50

can have super abundance and we can

109:52

return to an organic

109:53

>> Well, the first thing I would say to AI

109:55

is how do you fix crime-ridden cities?

109:57

How do you do that? How do you do that

109:58

ethically?

109:59

>> You may not like the answer it gives

110:00

you. Well, I don't want it to give me

110:01

>> It might say there are men with hoods.

110:03

>> Here it is. Let's play this.

110:07

>> No problem. Why? I was glut gluten-free

110:10

in 15 years I've been gluten-free in uh

110:15

Canada. In America,

110:18

can't eat it.

110:18

>> That's because in America, what we call

110:20

bread can't even be considered food in

110:22

parts of Europe. See, here in America,

110:24

it's not so much the gluten as what

110:26

we've done to the grain. About 200 years

110:27

ago, we started stripping the bran and

110:29

germ or the fiber and nutrients to make

110:31

flour shelf stable, also nutritionally

110:33

dead. Because the nutrients were gone,

110:35

we enriched it with folic acid, which a

110:37

large majority of the population can't

110:38

even metabolize. Therefore, many people

110:40

experience fatigue, anxiety,

110:42

hyperactivity, and inflammation. But

110:43

then the bread wasn't white enough, so

110:45

they bleached it with chlorine gas, and

110:46

the bread didn't rise enough. So, they

110:47

added a carcinogen called potassium

110:49

bromate, which is banned in several

110:50

countries like Europe, the UK, and even

110:52

China. Then, we wanted to ramp up

110:54

production. So we started using

110:55

glyphosate to dry out the wheat before

110:57

harvest, causing endocrine disruption

110:58

and damaging your gut. So now you're

111:00

bloated, brain fog, tired, and blamed

111:02

gluten. But gluten is just the

111:03

scapegoat. The real issue is

111:05

ultra-processed, chemically altered,

111:06

bleached, bromated, fake vitamin filled

111:08

wheat soaked in glyphosate. This isn't

111:10

bread. This is

111:11

>> uh I need some

111:12

>> That's it. I like that they had sweet

111:14

dreams playing in the background there.

111:17

>> Yeah. I mean, I will look when I'm back

111:18

in Australia, I will look forward to

111:19

having normal bread. Human bread.

111:21

>> So [ __ ] up. So [ __ ] up.

111:24

food. It's the same thing that they've

111:26

done to our

111:28

>> governmental systems. Same. It's like

111:30

money. Money gets in these [ __ ]

111:33

>> They ruin it all.

111:34

>> Yeah. You guys I mean

111:35

>> [ __ ]

111:36

>> You like it's okay. Money is also great.

111:38

>> Oh yeah.

111:38

>> I'm not against money.

111:40

>> You should be.

111:41

>> But I'm a little bit against money.

111:43

>> Are you? In what way?

111:44

>> Uh I don't I don't want to make

111:46

decisions in my life about how to what

111:48

would result in having more money.

111:50

You've got to be able to provide for

111:51

your family. But I think you see enough

111:54

people in this business sell out

111:56

>> and people have really lost the language

111:57

of selling out. Like it's gone. Like in

111:59

the 90s everyone that guy's a [ __ ]

112:01

sellout. That guy's doing you know you

112:02

do the wrong sort of music on an album

112:04

and people would accuse you of selling

112:05

out. So I'm not advocating for that. But

112:07

like

112:09

I mean there are definitely

112:11

there are people out there doing ads for

112:13

things that are it's nuts that they're

112:16

getting away with it. like people who do

112:18

like rich guys who are doing gambling

112:20

commercials and I don't mind gambling.

112:22

I'm open to gambling. I enjoy gambling.

112:24

>> We do commercials. We do gambling

112:26

commercials on this podcast

112:28

>> and I may be open to doing it myself in

112:30

the future. But when I do see

112:31

>> we do DraftKings

112:32

>> Samuel L. Ah I don't even mind that as

112:35

much. I

112:36

>> Why is it different than Samuel Jackson

112:37

reading for a gambling?

112:39

>> I might I don't know DraftKings enough.

112:41

But there are things like in Australia

112:43

we got Bet365 which is like they've

112:46

turned it into a social media

112:47

app/gambling

112:49

software.

112:49

>> Okay.

112:50

>> So it's where you go to socialize and

112:51

gamble at the same time and that does

112:53

give me a strong ick factor.

112:55

>> Yeah was talking about that the problem

112:58

in Australia with gambling as well.

112:59

>> I don't see anything when when I look at

113:01

bookie apps in America and things. It's

113:03

just like I'd like to put a bet on that

113:05

and I get money if it wins and not if it

113:07

loses. We're in a We're in a more

113:09

strange advanced. We've been doing it

113:11

for a bit longer and it's further down

113:13

the line. And

113:13

>> DraftKings has all that kind of stuff

113:15

where you can bet on weird prop bets.

113:17

>> Yeah. And you can do multi bets and

113:18

things like that, but I don't think it

113:20

has affected the character of men in

113:22

this country the same way that it's done

113:23

in Australia.

113:23

>> We have more freedom. You guys are

113:25

little children over there.

113:26

>> It's also our only outlet.

113:27

>> Yeah.

113:28

>> Is gambling. Like I think we outgamble

113:30

Singapore. We're number one in the world

113:32

per capita. No, we put you to shame. But

113:34

like you guys' sign of people in

113:36

distress.

113:37

>> Gambling.

113:37

>> Yeah.

113:38

>> Yeah.

113:39

>> Yeah.

113:40

>> The country's in distress. That's why

113:42

you guys have a gambling problem.

113:43

>> I mean, we really have a [ __ ] huge

113:46

gambling problem.

113:47

>> It's that bad. It's really that bad.

113:49

>> It's just It makes it hard to have a

113:50

conversation with a guy.

113:52

>> Really?

113:52

>> Look at uh 72.8% of Australian adults

113:56

gambled within the previous 12 months.

113:58

80.5% for men and 66.2% for women. 38%

114:02

of Australians gambled at least once per

114:04

week. 48% of men and 28% for women.

114:08

>> 28% for women. When you see a woman

114:10

who's betting on sports, something

114:12

inside of you goes, "What

114:15

what are you doing?

114:16

>> This is our horrible thing."

114:17

>> No, let let the ladies [ __ ] up too.

114:19

>> I have been to your pokey rooms in

114:21

America. That's what we call them. Like

114:22

at the casino, we call them pokey rooms.

114:24

>> Pokey.

114:25

>> Yeah, the pokés.

114:26

>> Like the raw fish.

114:27

>> You're like poking You're like poking on

114:29

the machine all the time. That's why we

114:31

call them the pokies.

114:32

>> But like in America, you'll be at a

114:33

casino and the floor has all these fruit

114:35

machines.

114:35

>> Pokés.

114:36

>> Yeah. But like people are still like

114:38

smiling and talking to each other. In

114:39

every pub in Australia, there's like a

114:41

back room where sad, twisted old people

114:44

are just like sitting in front of a

114:45

machine for hours.

114:47

>> You get that in Vegas, too. It's just

114:49

extracting money. It's sucking your

114:52

attention and extracting money. And it

114:54

makes your dull life a little bit more

114:56

exciting.

114:57

>> 20% of the world's slot machines are in

114:58

Australia.

114:59

>> Yeah. Yo, you guys are buck wild.

115:01

>> No, it's

115:03

>> that's how they keep you broke.

115:05

>> I'm against it. But also, yeah, if I've

115:06

had a couple of drinks and it's a Friday

115:08

night, I'll go and play the Indian

115:10

Dreaming.

115:10

>> Well, here's the thing. You're smart

115:12

enough to not get fully addicted to

115:14

playing those machines, but not

115:16

everybody is.

115:16

>> It's a smart thing. I think I I have

115:18

enough going on in my life. Definitely

115:21

with the int there are smarter people

115:22

than me who have been lost to it.

115:24

>> But that's all it right. Like you don't

115:25

need a distraction. Your distraction is

115:27

the thing you're enjoying in your life.

115:29

Yeah,

115:29

>> you got a lot of things going on in your

115:30

life. You don't want to do that.

115:32

>> If I wasn't doing standup and if I

115:34

wasn't doing if I didn't have a loving

115:35

family

115:36

>> and you had a shitty job.

115:37

>> Oh man, when I did have a shitty job, I

115:39

was a door-to-door salesman and I was

115:41

buying the scratch-off cards every day.

115:43

Every single day I would buy them. And I

115:45

didn't know why I was doing it at first.

115:46

And it's like,

115:47

>> well, I'm knocking on people's doors and

115:48

trying to give them cable television

115:49

when they don't want it.

115:51

>> I'm going to need

115:52

>> a little something to help. Oh man, I

115:54

think I started drinking in the

115:56

afternoons.

115:56

>> Really? Cuz you hated it. I hated it. It

115:59

made me loose when I went to knock on

116:00

the doors and try and sell. They

116:03

would take us out to like the worst

116:05

remote communities because they'd go

116:07

these people will buy. They like the the

116:10

nastier the neighborhood, the more

116:12

people are likely to buy from a

116:13

salesman. The less they have in their

116:15

life. You try and go to a middle-class

116:16

neighborhood, no one would talk to you.

116:17

You'd go out to weird remote poverty and

116:21

boy I sold a lot of cable television.

116:23

>> Really?

116:24

>> Yeah.

116:24

>> Was it dangerous?

116:26

Uh yeah, there definitely were.

116:29

Because you're knocking on the doors

116:31

of like I went up to Port Augusta in the

116:33

worst neighborhoods there. This is like

116:35

hours and hours away from a major city.

116:37

Um and the company I was doing it for

116:39

like said, "We looked up the

116:41

poverty statistics and we're sending you

116:43

to the worst possible places cuz you'll

116:46

you'll sell more there."

116:48

>> Uh, man, no. People were uh I remember

116:51

there was an Irish lady who got attacked

116:52

who was working with us. I don't think I

116:54

ever I had like weird things happen

116:56

where people you'd have to go into

116:58

someone's house and there'd be like

117:00

weird stuff on the floor. I went into

117:01

one person's house and there was a woman

117:02

passed out on the floor bleeding and

117:04

they were all just like she's fine.

117:06

Don't worry about her. It's like all

117:07

right.

117:07

>> Where was she bleeding from? What part?

117:09

>> Her head.

117:09

>> What?

117:10

>> Yeah, she was apparently all right and

117:11

she was but she was passed out. I don't

117:13

know what happened.

117:13

>> What do you mean all right? She's

117:14

bleeding from her head and she

117:16

>> It wasn't like a huge amount of blood,

117:17

but she was on the floor and there was

117:19

blood

117:19

>> and they just assumed she was okay.

117:21

>> I made it out of there quick smart. They

117:23

were like, "She's fine. Don't you worry

117:24

about it." I don't know why this is

117:26

coming back to me now. I haven't thought

117:27

about that in about 10 years.

117:29

>> Did you think that maybe they hit her

117:31

and then maybe you were a witness to it

117:32

or maybe they killed her and they were

117:34

going to have to kill you?

117:34

>> I don't know why this is dribbling out

117:36

of me now. I definitely saw her. She had

117:38

a beard. I remember

117:40

>> um she was uh they were very calm about

117:43

it. They were relaxed and they wanted to

117:45

keep having a conversation about buying

117:46

the cable television and how that would

117:48

let them watch the football

117:49

>> and that she was okay and I wasn't to

117:51

worry about her. And I think I got out

117:53

of there and kept knocking on people's

117:55

doors. I don't think I called anybody.

117:57

>> Whoa.

117:58

>> Sorry. I didn't know where that was

118:00

buried.

118:03

>> Maybe she's fine. Maybe she's a drama

118:05

queen.

118:06

>> You also saw she hit her head on purpose

118:07

and then fell down.

118:08

>> I mean, I was seeing a lot of passed out

118:10

people

118:11

>> in the streets there. Uh drunks and

118:13

drugs and

118:15

>> Yeah.

118:15

>> Did you ever almost get robbed or

118:17

anything?

118:18

I don't think I got

118:20

threatened.

118:23

Uh there was a guy who was

118:25

having sex one time and was very unhappy

118:27

that I kept knocking on his door and

118:29

I thought he was going to hit me but

118:31

that was about as bad as it got.

118:32

>> Did he come out with a dong hanging out?

118:34

>> He was grabbing his pants in a weird

118:36

way. His lady had been at home

118:38

and she said, "Come back when my

118:40

husband's home at this time and then you

118:42

can he can decide if he's going to buy

118:44

it." And then I came back right at that

118:45

time and I think he just got right home

118:46

and started

118:48

>> right now. Let's do it. And then he went

118:49

get the [ __ ] out. Australian men being

118:52

angry, we go into a new gear of, like,

118:56

>> lack of control.

118:57

>> Well, it was a prison population

118:59

originally

118:59

>> and we like that. We don't want to be

119:01

free. We want a nice warden who's going

119:03

to take care of it for us.

119:06

>> But you don't.

119:06

>> Mhm. No, I'm There are many things that

119:09

are upsetting me about going back. You

119:11

got to become king of Australia going

119:12

back

119:13

>> if they'll have me. I'm thinking of

119:15

running for the Senate.

119:16

>> You might win.

119:17

>> I've got policies. The Senate's more

119:18

winnable, because they send, like,

119:20

>> Are you seriously thinking about running

119:21

for the Senate?

119:22

>> We have like 12 people from each state

119:24

one day. It's my fantasy.

119:25

>> Really?

119:26

>> In each state, there's like 12 people

119:28

who get to be the senator from there. So

119:29

>> And in a double dissolution, you

119:31

only need like 8% of the vote to get

119:33

into the Senate.
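The roughly-8% figure lines up with the Droop quota used in proportional counts: the vote share that guarantees one seat is 1/(seats + 1). A minimal sketch of that arithmetic (the function name is mine, not from the conversation):

```python
def droop_quota_share(seats: int) -> float:
    """Share of the vote that guarantees one seat when `seats`
    are filled at once (Droop quota, as a fraction of the total)."""
    return 1 / (seats + 1)

# A normal half-Senate election fills 6 of a state's 12 seats;
# a double dissolution fills all 12 at once, halving the quota.
print(f"half-Senate quota:        {droop_quota_share(6):.1%}")   # 14.3%
print(f"double-dissolution quota: {droop_quota_share(12):.1%}")  # 7.7%
```

With all 12 seats up at once, 1/13 of the vote (about 7.7%) locks in a seat, which is the "like 8%" mentioned here.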

119:34

>> Wow.

119:34

>> And if you're in a small state, that's

119:35

not a huge number of people. So we get

119:37

wacky people going to the Senate. And it

119:39

effectively has the same job that the

119:41

American Senate has.

119:43

>> Like it's a huge amount of power and you

119:45

get to veto things. You get to do

119:46

inquiries into stuff.

119:48

>> Yeah. We've had uh Pauline Hanson

119:50

is there at the moment. She's been there

119:51

for a while. We had Jacqui Lambie for a

119:53

long time. We get nutty interesting

119:55

people in the Senate. It's the only bit

119:58

where a bit of life and color gets into

120:00

our politics

120:01

>> cuz we've got uh Yeah. Our house, our

120:04

lower house is not as exciting as yours.

120:06

You get more. You get what's it? Jasmine

120:09

Crockett.

120:09

>> Yeah.

120:10

>> You get Jasmine Crockett in your

120:11

parliament. We don't get as much.

120:15

>> How locked down is politics in

120:18

Australia?

120:18

>> So locked down.

120:19

>> Yeah.

120:20

>> Uh

120:21

and there's a... So

120:23

>> it's not first past the post. You guys vote and you

120:25

just go first past the post and if you

120:27

get, you know, if someone gets 50% of

120:28

the vote, that's it. They've got it. We

120:30

do ranked voting. So it's like

120:32

>> You put in six. There's six people,

120:34

you put them in order and then like kind

120:36

of the least bad one, the one that the

120:39

least number of people dislike gets in.
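The elimination process described here (preferential or instant-runoff voting) can be sketched in a few lines; this is an illustrative sketch of the general method, not anything from the episode:

```python
from collections import Counter

def instant_runoff(ballots):
    """Winner under instant-runoff (preferential) voting.

    `ballots` is a list of rankings, each a list of candidate names
    from most to least preferred. Repeatedly eliminate the candidate
    with the fewest first preferences until someone holds a majority,
    so the winner tends to be the least-disliked candidate.
    """
    remaining = {c for b in ballots for c in b}
    while True:
        # Each ballot counts toward its highest-ranked surviving candidate.
        tally = Counter(
            next(c for c in b if c in remaining)
            for b in ballots
            if any(c in remaining for c in b)
        )
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()):
            return leader
        remaining.discard(min(tally, key=tally.get))

# A wins the most first preferences, but B is preferred by the
# broader field once C is eliminated:
ballots = [["A", "C", "B"]] * 4 + [["B", "C", "A"]] * 3 + [["C", "B", "A"]] * 2
print(instant_runoff(ballots))  # B
```

In the example, A leads on first preferences 4-3-2, but after C is eliminated C's voters flow to B, who wins 5-4: exactly the "least number of people dislike" outcome described.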

120:42

>> So you get really boring people and also

120:44

the parties don't primary and this is I

120:46

keep talking about how this is great in

120:48

America. You're like the only country

120:49

that does this.

120:50

>> Well, that was why it was a real problem

120:51

that the Democrats didn't do it.

120:52

>> They didn't do it at Yes. for the

120:55

president.

120:56

>> They didn't do it legitimately since

120:58

2016. But on a local level, someone like

121:01

>> 2016, it wasn't

121:02

>> AOC can get in to to be her.

121:04

>> Sure.

121:05

>> Like that's even that level of public

121:07

involvement

121:08

>> is globally unheard of.

121:11

>> No one else is doing that. I don't think

121:12

>> Right. Fetterman, those kind of people.

121:14

>> Fetterman should not... like, you just look on

121:16

a paper. There's no way the Democrats

121:18

wanted him to be their guy. There's no

121:19

way the people in charge of that party

121:21

said I think this is a guy who's going

121:23

to tow the party line. Well, I think

121:25

once he got in, he became much more

121:28

aware of how corrupt the system was.

121:30

Like talking to him was interesting.

121:32

He's a very nice guy, by the way. Like a

121:34

real genuine nice guy. And I've run into

121:36

him in other places. I ran into him at

121:37

the inauguration. He was wearing a

121:40

Carhartt hoodie and shorts at the

121:42

inauguration. I'm not bullshitting. I

121:44

gave him a big hug. He's a sweet guy.

121:45

Like a genuinely sweet guy. And I think

121:48

he got into that system and he's like,

121:50

"Hey, this is not what I like." That

121:51

guy's been doing like charity work his

121:53

whole life. Like a genuinely good

121:55

person.

121:56

>> And he got into it. He's like, "This is

121:58

not what I signed up for. This is this

122:00

whole thing is [ __ ] crazy." Like

122:04

>> when he also had the brain thing

122:05

happen. And then he I watched that

122:08

debate.

122:08

>> Mhm.

122:09

>> That he

122:10

won. Like I don't know how bad

122:13

>> Was it Dr. Oz that he was up against?

122:15

>> Yes.

122:15

>> That's got to hurt when you go up

122:17

against a guy who temporarily can't talk

122:19

at all.

122:20

>> Yeah. Well, he has a struggle

122:22

communicating, but I don't think the

122:24

struggle

122:25

>> way better now.

122:26

>> Yes. But I don't think the struggle is a

122:28

thinking thing. I think it's a

122:31

communication thing. And it's also like

122:33

he loses track of what you just said.

122:36

>> So like he has to have an iPad. So the

122:39

iPad listens to what you're saying,

122:41

translates it, writes it out, dictates

122:43

it, and then he looks to it

122:44

occasionally.

122:45

>> Okay?

122:45

>> He's like, "I'm sorry, what did what did

122:47

what did you ask me?" And then I'll have

122:48

to repeat the question. But it's not

122:50

that he's not there. It's just there's a

122:52

misfiring. But when the when it fires

122:54

correctly, he's very reasonable. He's

122:57

very rational, very smart guy, and I

122:59

think a really good guy. And I think he

123:01

opened up a lot of people's eyes like,

123:03

well, it is possible for someone

123:05

to

123:06

get in on either side and just be

123:09

rational and just have rational

123:11

positions on things and saying, I'm not

123:13

I'm not going to just vote the way

123:14

everybody votes because I don't agree

123:16

with that. I think

123:17

>> I think there's a much more nuanced view

123:19

of the world and so a lot of people like

123:22

on the right like him

123:24

>> cuz he broke party lines you know

123:26

>> I remember Obama came

123:29

in and tried to do that immediately when

123:31

he was a senator and I was reading a

123:33

thing about how like people just took

123:35

him aside and said you absolutely don't

123:36

[ __ ] do that. You have to stop doing

123:38

that now, okay? We want you to be the

123:40

future of this party. Shut up. But there

123:42

must be huge pressures on people not to

123:43

be individuals there

123:44

>> There were huge pressures on Tulsi Gabbard

123:46

to not even communicate with people on

123:48

the other side. She would like bring them

123:49

cookies and [ __ ] and just be nice. She's

123:51

like a sweet lady. She just wanted to be

123:52

friends with everybody and they were

123:54

like we don't do it that way.

123:55

>> Well, I mean John McCain seemed to do a

123:57

lot of that. He would hang out. He would

124:00

be on both sides of the aisle. People

124:01

liked him. There are a couple of

124:02

individuals.

124:03

>> Yeah, there's a couple individuals that

124:05

have made like little crossovers, you

124:08

know, a little bit. And you know,

124:10

>> you could ban the party system. I'd be

124:12

open to that.

124:13

>> Well, you need more than two. That's the

124:15

real problem. The real problem is

124:16

there's only two legitimate ones. If

124:18

you vote libertarian,

124:20

you're essentially voting protest.

124:22

You're saying, "Fuck these guys."

124:23

>> Yeah.

124:23

>> You know, and the Green Party.

124:24

>> I've done the libertarian thing a few

124:26

times. It's like you're just saying,

124:27

"Fuck these guys."

124:29

>> But then

124:31

if you can't like a two-party system is

124:33

so easy to rig. I mean, but could you

124:36

rig a five party system? Could you rig

124:37

if you had seven parties? Could you rig

124:40

that? I don't know. You know, and the

124:41

thing is is like you have the House and

124:43

you have Congress. It's like

124:46

the two-party thing is going to be so

124:48

tough to untangle. You know, it would

124:50

take some radically popular person who

124:53

went independent

124:55

>> who tried. Uh, Roosevelt.

124:59

>> Um

125:00

>> Ross Perot.

125:01

>> Ross Perot. But

125:02

>> Ross Perot [ __ ] it up for

125:03

>> He came close. But uh Roosevelt, Teddy

125:05

Roosevelt, he got real close,

125:08

>> right? But that was a long time ago. And

125:09

he was Teddy Roosevelt.

125:10

>> Yeah. But he won seats. He won states, I

125:13

think. I think he got whole states.

125:14

>> That's crazy.

125:15

>> The Dixiecrats did it, but they were

125:17

never going to pick up that many states.

125:19

>> It would have to be someone like that.

125:21

Someone that was like loved by a giant

125:24

percentage of the population. Like if

125:25

some let's make up a fictional person,

125:28

some amazing Oprah. If Oprah becomes

125:31

president or wants to run for president

125:32

and everybody's like because you

125:34

remember there was a thing during the

125:36

Trump administration, the first

125:37

administration, where I think NBC

125:39

tweeted, "This is our president." and

125:41

they showed a photo of Oprah. See if you

125:43

can find that. I'm pretty sure that's

125:46

true. And I remember thinking like, this

125:48

is so crazy that we're looking for

125:50

another famous person to counteract the

125:52

famous person.

125:53

>> They wanted The Rock.

125:54

>> Yeah. Oh, they talked to The Rock. They

125:57

came to the Rock. They came to The Rock

125:58

to try to get him to do it.

125:59

>> What? I mean, I don't know what The

126:00

Rock's politics are. He's, you know, a

126:03

kind guy who's probably very left on

126:06

certain things, but also very

126:09

disciplined and obviously really admires

126:12

and believes in hard work and and

126:14

dedication.

126:15

>> He'd be a great president if he wanted

126:16

to do it. Tweet on future Oprah

126:19

presidency, not meant to be political

126:20

statement. Okay. What

126:25

they said on Monday that a tweet touting

126:27

Oprah Winfrey as our future president

126:29

during the 75th Golden Globe Awards was not

126:31

meant to be a political statement. Of

126:33

course it is. Yeah, you literally said

126:35

president. That makes it political. Our

126:37

in all capital letters. Of all this, it's the

126:40

only one that's capitalized.

126:41

>> I really thought it could have been

126:42

Kanye for a while there.

126:43

>> Yeah, he could have made it.

126:44

>> His policies, some of them were

126:46

great. Some of them were genuinely good.

126:48

It's in reference to a joke made during

126:50

the monologue and not meant to be a

126:52

political statement. We have since

126:53

removed the tweet. Okay, so there was a

126:55

joke, but it was still a political

126:57

statement. Come on. Even if it was like

126:59

in reference to the joke, you saying

127:01

that in all caps, our president, it's

127:03

still a political.

127:03

>> They've got to find somebody. I mean,

127:05

just for the future of this. JD Vance can

127:07

talk to people. I've seen long form

127:09

interviews with him where he actually

127:10

seems like a normal human being.

127:11

>> I think there's a lot of people pushing

127:12

James Talarico now and you know, we had

127:15

him on the podcast too to talk to him

127:16

because I felt like, Texas guy,

127:19

>> Texas guy who has some really uh

127:22

important things to say particularly

127:24

about uh the potential for a religious

127:26

like a theocracy in Texas and that

127:29

there's these very wealthy Christian

127:31

fundamentalists that are driving this

127:33

like multi-billionaire guys that are

127:35

driving this, and that's how the Ten

127:35

Commandments got in schools, and he is a

127:39

very religious man and he does not

127:41

believe the Ten Commandments should be

127:42

in schools. He believes that if you put

127:44

the Ten Commandments in schools, it's

127:45

actually going to push people away from

127:47

Christianity because you're shoving it

127:48

in their face.

127:49

>> And he's like, "And it's also

127:50

disrespectful to all the other

127:51

religions. So you don't have their

127:53

tenets and commandments."

127:54

>> Have you seen the Ten Commandments in

127:55

the schools?

127:56

>> I have not.

127:56

>> We went out to look at some of the

127:57

schools. And it's fun because they like

127:59

they don't just put them up dryly on the

128:01

wall. Like they have pictures of all the

128:02

things,

128:03

>> all the things you're doing like sin.

128:04

>> Yeah. This is weird when it comes to

128:05

like don't covet your neighbor's wife

128:07

and there has to be like some weird

128:10

>> little sexy picture or something.

128:11

Really? Yeah. Is she like bending over

128:13

in the garden?

128:13

>> I think it was like a woman.

128:15

>> Oh

128:16

>> yeah, it was that was a strange one.

128:18

>> Well, how weird is that? They have to

128:19

draw it.

128:20

>> Americans are too stupid.

128:22

>> Like language. It was like

128:22

>> you got to draw it.

128:23

>> I think it was in like the Spanish class

128:25

where they had it written

128:27

in Spanish. The Ten Commandments.

128:28

>> Anyway, Talarico is interesting, you

128:31

know.

128:31

>> Yeah.

128:32

>> He had a very bizarre argument about

128:34

abortion that I felt like

128:36

doesn't jibe with how most people view

128:39

Christianity. What was his

128:40

>> Well, he felt... What did he

128:42

exactly say that was like super

128:43

controversial, Jamie? He said like

128:45

somehow that you think that it could

128:47

be biblically permissible.

128:50

>> I've heard this before. I've heard

128:51

people say that. I don't think it

128:53

>> it doesn't seem to make sense. If you

128:55

really want to live your life

128:56

biblically, it doesn't make sense.

128:58

>> But this is lefty Christians are always

129:00

like

129:00

>> they have to find a

129:01

>> Like people will go, there's nothing

129:03

in the... there's nothing

129:04

in scripture that says homosexuality is

129:06

wrong. And you go like, "Yeah, okay."

129:09

But like, what are we arguing that in

129:12

like,

129:13

you know, 2 BC Jerusalem it was just

129:17

chill to be a gay guy and they just

129:19

never wrote it down for some reason?

129:21

Like, I'm not saying, as to

129:23

however people want to live. That's

129:24

fine. But don't like come in and say the

129:26

religion insists that people be gay or

129:29

that like that the trans thing is

129:30

actually fine in the Bible because it

129:32

never says you shouldn't be trans. It's

129:34

like the absence of something in an old

129:36

book that hadn't occurred to people is

129:38

not an argument for its

129:41

permissibility.

129:42

>> There is talk of a man lying with a

129:45

man being an abomination.

129:46

>> And then they go, that's

129:48

about boys. It's not about

129:50

men. We've got a very special

129:52

translation that only we understand. I

129:54

don't

129:55

>> Is that what they say? Really?

129:56

>> Yeah. They say this is always

129:57

about boys. This is never about two men.

129:59

>> But it says man lie with another man.

130:01

Hey, I don't agree with them, but it's

130:03

always like that. I think if

130:06

you're going to have a religion, you

130:07

should like not just try and twist the

130:09

religion to be exactly what you think

130:11

>> it should be,

130:12

>> right?

130:13

>> Like that's kind of the point of

130:14

religion is that it

130:15

>> it's something bigger and stranger than

130:17

you that you're going to allow to like

130:19

you're going to develop as a person

130:22

>> with it rather than correcting it. Well,

130:24

I think if you look historically just in

130:26

this country, the

130:29

the attitude that we had about gay

130:31

people in this country was terrible.

130:34

Like in the 1930s and 40s and 50s, it

130:36

was terrible. Yeah.

130:37

>> Right. And then somewhere along the line

130:40

there's the gay rights movement and then

130:43

ultimately in modern times, gay

130:45

marriage. So there's this progression

130:48

where people realize like, hey, they're

130:50

just gay. Like it's always existed, but

130:52

people had to hide it forever. Like you

130:55

know the Turing test story, right? Alan

130:57

Turing, the guy who invented the

130:58

>> The Turing test, whether with an AI you can tell

131:01

if it's a person.

131:02

>> Yes. Yeah. Well, that guy was fed

131:04

chemical castration drugs cuz he was gay

131:06

in England in the 1950s,

131:09

>> right? So at some point in time, I

131:13

think you you have to like take into

131:15

consideration like how long being gay

131:18

was punished before people eventually

131:21

just got to this realization like you

131:22

you meet enough gay people, you know

131:23

enough gay people, you have a gay kid,

131:25

whatever, you realize like some people

131:27

are just gay.

131:28

>> There are obviously people who are

131:30

attracted to people of the same

131:31

>> 100%. That's all it is. And it's like

131:35

you have to look at things through a

131:37

cultural lens as much as you have to

131:39

look it through a biblical lens like

131:42

because it's not all God's word. It's

131:44

God's word written down by people. And

131:46

some of it is like some of it is just so

131:49

>> great Catholic are you? Yeah.

131:51

>> Ah, the Catholic coming out.

131:52

>> You have to look at it that way. It's

131:54

like there's just so much in it that

131:57

doesn't make any sense.

131:58

>> There's context and there's tradition.

132:01

This is what I like about the Catholic.

132:03

I became a Catholic like eight years

132:05

ago. Seven... it was a number of years

132:08

ago. I'm forgetting how many years. But

132:10

I had been like sort of nothing and then

132:11

sort of a Unitarian. And then uh but I

132:14

like this thing of like

132:15

>> what brought you from sort of nothing to

132:18

belief.

132:19

>> Uh I'd always believed there was

132:21

something, but then I started going to

132:23

mass cuz a friend was going, and when I

132:25

was on the road years before, I would

132:27

like be off on the road on a Sunday and

132:29

have nothing to do. So, I went to mega

132:30

churches for fun cuz they were very

132:32

funny and very strange. So, like I went,

132:34

>> "What are megachurches like in Australia?"

132:36

>> We [ __ ] We invented it. We got it

132:38

going.

132:38

>> Really?

132:38

>> Uh, Hillsong. Hillsong's... You guys

132:40

probably invented it, but we took it to

132:42

another level. We did Hillsong, which

132:43

is Hillsong. Hillsong was the biggest

132:45

one. Justin Bieber was a Hillsong guy.

132:47

>> That's Australia.

132:48

>> That's Australian.

132:49

>> Oh, I didn't know that.

132:50

>> Australian and New Zealand, guys. And

132:51

the like guitar music and the smoke

132:54

machines and the doing this.

132:55

>> Oh, and you guys brought that over to

132:56

America.

132:57

>> Yeah. I'm very sorry. Wow.

133:00

>> I'm not a big... but I would turn

133:01

up there or like or a little Baptist

133:04

church or something, but I would shop

133:05

around and try and see, you know, who's got

133:07

something going on. But the mega church

133:09

has offended me more than any. It was

133:10

like whatever is happening here is weird

133:13

and gross and I don't like it.

133:15

>> Like they would have two pastors come

133:17

out and they'd like riff and banter

133:19

together and they it was like a

133:22

breakfast radio show. Wow. Wow.

133:23

>> They're going like and they'd have like

133:25

big projectors and then I started going

133:26

to the... I went to the Latin mass

133:28

>> and it was like oh this is a very

133:30

strange ancient ritual

133:32

>> with like bells and I don't understand

133:34

what anyone is saying

133:35

>> right

133:36

>> and uh I just wanted to keep going to

133:37

that

133:40

>> I love it and the organ and the choir. I

133:43

think you made a really good point too

133:44

about people coming in to this candle

133:47

lit room and everything's beautiful and

133:50

ornate and just that alone probably has

133:52

a profound effect on your psyche.

133:55

>> They must have known that, right? They

133:57

must have known that when they're

133:58

creating these incredible

134:00

>> a stained glass window.

134:01

>> Yeah.

134:01

>> You haven't looked at a picture or a

134:03

television screen ever,

134:05

>> right?

134:06

>> And then you go into a building where

134:07

there is light shining out of a man's

134:09

face

134:10

>> and it's Jesus. Yeah.

134:11

>> Yeah. Yeah. And there's statues of him

134:13

with

134:13

>> covered in blood.

134:14

>> Yeah. He's on the

134:16

cross right in front of you with the

134:18

thorns dripping blood like holy [ __ ]

134:20

>> This is what I mean though about losing

134:22

it. That's the

134:24

Catholic thing. They always put him on

134:25

there. He's always suffering.

134:27

>> And at the mega churches they take him

134:28

off. They go it's a big plus sign out

134:29

the front.

134:30

>> What do you know? Like if at a

134:33

Protestant church they will have they'll

134:34

have a cross but there's no one dying on

134:36

that cross.

134:37

>> Oh,

134:38

>> it's just empty. It's just

134:39

>> only Catholics that have Jesus actually

134:40

nailed to the cross. Orthodox do it as

134:42

well, but like all the Protestant

134:43

megachurch people, they never show it.

134:45

>> That's interesting

134:46

>> cuz they're winners. They want to go

134:47

like we're increasing. We're getting

134:49

more stuff. And I I don't want to

134:51

exaggerate, but prosperity gospel

134:53

people.

134:54

>> Lenny Bruce had a great joke about that.

134:56

>> Was his

134:56

>> He had a great joke about Jesus coming

134:58

back and seeing you wearing a cross.

135:00

>> Hold on.

135:01

>> He said it's like having an electric

135:02

chair around your neck.

135:03

>> Was that Lenny Bruce?

135:04

>> Yeah. And then Bill Hicks had a version

135:05

of it. Yes.

135:06

>> Bill Hicks was like, "It's like going up

135:07

to Jackie with a rifle pinned on."

135:10

We're thinking of... I remember that

135:12

>> the oldest stained glass windows in the

135:14

world 7th century.

135:16

>> Yo,

135:16

>> that's what I'm about.

135:19

>> Germany Bavaria.

135:20

>> Wow. They figured it out. They're like,

135:22

"We got to make this place more

135:23

colorful. Bring in more people." They

135:25

didn't have pyrotechnics back then. They

135:27

had to figure out a way to make it more

135:29

cuz like if you see beautiful ancient

135:32

cathedrals like uh one of the things

135:34

that I really loved about Italy is uh

135:36

you could go to these ancient churches

135:38

and go and look around in them and

135:39

there's like amazing artwork, like

135:43

just the craftsmanship of

135:45

constructing these incredible buildings.

135:47

When you go inside of them it feels like

135:50

something bigger than you has created

135:52

this

135:53

>> this is more beautiful and ornate than

135:55

anything you ever see in your village.

135:56

Your village is filled with like boring

135:58

ass houses and like little [ __ ]

136:00

tables and little chairs and everyone's

136:02

sitting around eating spaghetti and then

136:04

you go to this place and this place is

136:06

insane and there's candles and you take

136:10

them off and you do this

136:12

>> and you put the money in the basket.

136:13

>> That's how I felt when I started showing

136:15

up

136:16

>> that it was some weird alien. It feels

136:18

like thousands of years old when they're

136:20

doing it in Latin and the priest isn't

136:22

facing you. He's facing away, like

136:23

you're all doing something together

136:25

>> and it's mysterious.

136:26

>> Have you been to the Vatican?

136:28

>> Never.

136:28

>> Ooh, you should go.

136:29

>> I would like to.

136:30

>> You need to go.

136:31

>> You should just see St. Peter's Basilica

136:35

in the flesh. It's beyond

136:38

comprehension. It took hundreds of years

136:40

to make. The craftsmanship is so

136:44

exquisite. It's like the artwork is so

136:47

incredible. You walk. First of all, it's

136:49

massive. I mean, massive and perfect.

136:53

You walk around, you're like, "What the

136:55

[ __ ] were you guys doing?" Like, who

136:58

made this? How long did this take?

137:01

>> That was Shane's reaction. Every time

137:02

Shane talks about it, he goes, "Yeah,

137:04

we're number one.

137:04

>> We're number one, bro. Look, pull up

137:06

some images of

137:09

like look at what

137:10

>> the wobbly, um, column.

137:12

>> God, it's so incredible, man. It's so

137:14

incredible."

137:15

>> And then it shits me when like the

137:18

Vatican II. I don't dismiss it. I don't

137:20

say it was wrong, but when people, you

137:22

know, like a modern church and it looks

137:23

like there's a, you know, a carpet

137:26

and straight walls and you know how much

137:28

art it takes.

137:29

>> That's love.

137:30

>> Do you know how much time it takes to

137:31

make something like that? I mean, that

137:33

is

137:34

fantastic artwork. When you walk into

137:37

that place, it's breathtaking. Like,

137:39

you walk in, you just go, "Wow, look how

137:42

small those people are. Look at the

137:44

people. Those people are walking, dude.

137:46

Look how tall that ceiling is. Look at

137:48

the light.

137:48

>> And acoustically, the guy giving

137:51

the homily, and people can hear him.

137:53

>> Yeah.

137:54

>> Like it's built in such a way like

137:56

people used to know something about

137:57

acoustics where you could

138:00

>> I mean that's so psychedelic.

138:02

>> It really is. Just looking at the the

138:05

geometric patterns on the columns and

138:07

the ceiling. It's like it makes you feel

138:09

like you're tripping. So if you were

138:10

there and you're like walk into this

138:12

place and you lived in some boring ass

138:14

house, you would really feel like you're

138:16

in God's house. I mean, if it it feels

138:19

like God's house when you're in there.

138:20

That's how good that's how much they

138:22

believed.

138:23

>> They they didn't they didn't cop out on

138:26

this at all. They went all in.

138:28

>> That one right there. Look at that.

138:29

>> I don't like it when people go like the

138:31

church should melt everything down and

138:32

give it to the poor. Like this is a gift

138:33

to the poor.

138:34

>> Yeah.

138:34

>> If you're poor, you get to go in there

138:36

and look at that. That's open to

138:37

everybody. They're not putting that in a

138:38

private collection. You should never take that down.

138:40

Whatever they did to do it, maybe they

138:42

shouldn't do it again.

138:45

>> Wherever they got that,

138:46

>> it's a better planet for having it

138:47

there.

138:48

>> Well, I mean, the Vatican controlled

138:50

armies for a long ass time. And it's

138:53

nuts that it's its own country. That's

138:54

weird.

138:56

>> Country so they can keep the pedophiles

138:58

there.

138:58

>> No,

138:59

>> they don't have to export them.

139:02

>> They've tried so hard to crack down on

139:03

the pedophiles.

139:04

>> Oh, good job, guys.

139:07

just so crazy that one section of

139:11

religion is commonly associated with

139:13

pedophilia.

139:14

>> The press was real bad because the

139:16

scandals were real and there are lots of

139:18

them. But I would say I mean when I talk

139:20

to priests and I look at Catholic

139:21

schools and what they've got in place at

139:22

the moment I would feel like they are so

139:26

>> on top of it.

139:27

>> So on top of it but there are definitely

139:28

parts of society that in 5 10 years

139:30

things will start coming.

139:31

>> Listen man they catch pedophiles at

139:33

Nickelodeon. You know, they catch

139:35

pedophiles wherever you think exes.

139:37

>> There's pedophiles everywhere. There's

139:39

there's a certain percentage of our

139:40

society that's [ __ ] sick

139:42

>> and they're sexually attracted to kids

139:44

and it's a sick [ __ ] horrible thing

139:47

that's real, you know? And it exists all

139:50

over the place. But the problem is it

139:53

exists like synonymously with the

139:55

Catholic Church. Like people think

139:56

because they've hidden those people.

139:58

They've shielded those people from

139:59

prosecution. They've taken them and

140:00

moved them to new places where they

140:02

molest more kids.

140:04

Uh I agree but I would also say it's the

140:06

only institution that de it was the it

140:08

was early to declare that that was

140:10

wrong.

140:10

>> Like before the Catholic Church you had

140:14

a pagan society where that was not it

140:16

was not questioned that that was

140:17

acceptable

140:18

>> acceptable

140:18

>> like in terms of like it introduces the

140:21

standard by which you can go it's wrong

140:23

to be a pedophile. It's wrong to have a

140:25

boy love because the Greeks and the

140:26

Romans were getting up to it.

140:28

>> Oh yeah.

140:28

>> It's not an excuse for people's behavior

140:29

but it's part of human nature that's

140:30

been with us for a long time. Well, I

140:32

think it was part of their nature also

140:34

when they would go on army campaigns and

140:36

there was no women for years at a time.

140:38

They just [ __ ] each other

140:40

>> in the legs.

140:41

>> They [ __ ] each other in the legs.

140:42

>> Intercrural. I'm going to travel their

140:44

legs together and then use their legs

140:46

like a titty [ __ ]

140:47

>> Yes.

140:47

>> Nice.

140:48

>> Cuz it was disrespectful to the soldier

140:50

to put it in his butt. He still has

140:52

to fight the next day.

140:53

>> Oh, really?

140:53

>> You don't want him having a mobility

140:54

issue.

140:55

>> So, they would just come in each other's

140:56

legs

140:56

>> in the legs.

140:57

>> That's not that bad.

140:59

>> That's just helping out a bro.

141:01

Worse things happen on boats now. Let's

141:04

see.

141:04

>> Well, that was there. They also had the

141:06

concept that if you were fighting next

141:08

side beside your lover, you would fight

141:11

harder to protect them than just another

141:13

man.

141:14

>> Yeah. I mean, we're not getting couples

141:17

to join up to the military now, though.

141:19

>> Well, right now we're not because

141:20

everyone's soft. But if we were at war

141:23

and you know how many guys would go

141:25

>> draft men and women,

141:26

>> you know how many guys would go gay if

141:27

you gave them three years with no women

141:29

at all? You know, you can just draft a

141:30

married couple. You're in the same

141:32

battalion.

141:32

>> Military men hard as a rock all the

141:35

time, filled with testosterone, running

141:37

off to some some part of the world to

141:38

kill people. No access to [ __ ] for

141:40

three years. It's not going to be 0% go

141:42

gay.

141:43

>> Uh

141:43

>> there's going to be a number.

141:44

>> I think numbers are hu There was that

141:46

test after World War II.

141:47

>> See how long it takes for you to go gay.

141:49

>> No, they did a huge Well, kind of cuz

141:51

everyone had just come back from being,

141:52

you know, like five years together in

141:54

the war

141:54

>> gaying it out. And they ran a big uh it

141:56

was like a survey on sexuality and

141:59

returned servicemen

142:00

>> and it was some huge number of like

142:02

>> gay guys.

142:03

>> It was not just gay guys but it was also

142:05

like bestiality was way bigger cuz a

142:08

lot of these guys had grown up on farms

142:09

and things and so they're asking like

142:10

have you ever had sex with a chicken?

142:11

And something like I'm going to get the

142:12

numbers wrong but it's something like

142:14

12% of guys being like yeah

142:16

>> yes

142:17

>> they [ __ ] a chicken. Oh

142:18

>> I don't want to be getting that wrong

142:20

but I think uh

142:21

>> how many women [ __ ] a chicken? Zero

142:23

>> you know. No, there's one lady in

142:25

Thailand who's still doing it to this

142:26

day. It wasn't her idea.

142:28

>> It's not out of love. She's not an

142:29

amateur.

142:29

>> Yeah, it wasn't her idea.

142:33

>> The guy that [ __ ] the chicken. That

142:34

was totally his idea.

142:35

>> This is a big thing in your act. This is

142:36

a through line in your act is that like

142:38

you're always like men are the

142:40

degenerate ones in these.

142:41

>> For sure. Well, that is a fact. That's a

142:44

fact. I mean, we start all the wars.

142:47

>> We're responsible for most of the

142:48

murders.

142:50

>> Yeah.

142:51

>> Yeah. Well, one of the funny ones I had

142:53

a bit about back in the day. I actually

142:55

had a conversation with this guy. He's

142:56

like, "Do you know that statistically

142:58

speaking, more men get raped than

143:00

women?" I'm like, "Right." By other men.

143:03

Yeah. [ __ ] idiot. I'm like, "They're

143:05

not getting raped by cheerleaders."

143:06

>> Wait, is that true?

143:07

>> Yeah. Yeah. Because

143:08

>> most rape victims are men.

143:09

>> Yeah. When you take into account prison.

143:12

>> Oh, yeah.

143:13

>> See, you take into account, you know,

143:15

sexual assault in

143:16

>> which is just accepted in this.

143:18

>> I guess it is. It's like that's part of

143:20

the punishment that everybody knows is

143:22

going on in prison. No real efforts to

143:24

stamp out.

143:25

>> Well, the crazy thing is woke got so far

143:27

that they let males identify as females,

143:31

intact males, and go into female prisons

143:34

because they're air quotes trans.

143:36

>> Yeah.

143:37

>> Which is the craziest loophole. Like you

143:40

would never think of all the things they

143:41

restrict you from doing in jail. You

143:43

can't even have a phone. But you can go

143:44

[ __ ] girls and pretend you're a girl. I

143:47

mean, once you know that exists as a

143:48

loophole, you'd be very silly not to

143:49

take it.

143:50

>> Also, would you

143:51

>> you're dealing with people that are

143:53

[ __ ] liars. They're prisoners.

143:56

They're in prison. They're criminals.

143:58

You say you're saying they rob banks and

144:00

sell meth, but they wouldn't lie about

144:01

their gender. That is an honor.

144:03

>> Has this been stopped now?

144:04

>> No. In California, there's at le

144:07

uh that I read last, there was 47

144:10

biological males that are housed in

144:11

women's prisons with hundreds on the

144:13

waiting list. But this is happening in

144:15

>> it happens in Canada.

144:17

>> There's a lot of it in Canada.

144:19

>> I mean, schools is a weird one where

144:21

like there are single sex schools and

144:23

then they'll have a trans person and

144:25

they'll admit them. But like

144:30

like you can you can be a M to F and

144:33

they'll accept you into a girl school.

144:34

But also if you're a girl at the girl

144:36

school and you say I'm a boy now,

144:37

they'll keep you at the school. So like

144:40

which just ideologically which is it?

144:42

Cuz if you are a single sex school, then

144:45

if a girl says, "I'm transitioning to a

144:47

boy," you should have to kick him out.

144:48

You should say, "We believe that you are

144:49

a boy. Get out of here. You don't belong

144:51

here." You know what I'm saying? Like, I

144:53

don't think there's an intellectual

144:54

consistency with any of this.

144:56

>> It's just people going, "This is making

144:57

me uncomfortable. Please do not get

144:59

angry at me."

145:00

>> Yes.

145:00

>> I'll give you whatever you want.

145:01

>> There's that. And then there's also

145:03

people that really do feel like they're

145:04

in the wrong body. Right. So, those

145:06

people have always existed. So, the

145:08

question is, what is that? And is it

145:11

possible that someone would lie about

145:12

that in order to gain access to the

145:14

women's room? And that's true. That's

145:16

that's a fact. So you always have to

145:17

look at that. Like as soon as you say,

145:19

"Oh, you have to believe them." Okay?

145:21

You you believe a murderer who's in jail

145:23

and you're going to pay for his boob job

145:25

now?

145:26

>> Okay. And you're going to let him go

145:27

into the women's prison because that's

145:29

what's happening in Canada, right?

145:30

They're doing that kind of [ __ ]

145:32

>> Doesn't everyone feel like they're in

145:33

the wrong? Like being instantiated in

145:36

flesh is a weird thing. Mhm.

145:38

>> Like it's uncomfortable to have a body.

145:40

>> It aches. It doesn't do the things you

145:42

tell it to do all the time. Like we're

145:43

all alienated from our body. And there

145:46

was an explanation for that for a long

145:48

time. Like with the gend with the trans

145:50

spike that like this is what the thing

145:52

that is wrong with you. This is why

145:53

you're uncomfortable in your body. But I

145:55

think the numbers have collapsed in the

145:57

last

145:58

>> Well, you know when they collapsed, it

146:00

coincided with Elon buying Twitter.

146:02

>> Okay. I didn't know that.

146:03

>> Yeah. Yeah. the the post 2024 numbers

146:07

have dropped off a cliff

146:08

>> when you stopped offering that as an

146:10

explanation.

146:10

>> Yeah. Well, you could not only that, but

146:12

you could talk about it now. Yeah.

146:14

>> Whereas before, if you if you literally

146:16

if you wrote on Twitter that uh a male

146:20

could never be a female, you'd be

146:22

banned.

146:22

>> Yeah.

146:23

>> You would like that's what happened to

146:25

Meghan Murphy. They they banned her. They

146:27

banned her from Twitter saying by saying

146:29

a man is never a woman. Well, I remember

146:31

they were banning people for saying what

146:32

JK Rowling had said, but they're like,

146:34

"We can't get rid of JK Rowling because

146:35

she's too big."

146:36

>> It would be

146:37

>> completely It was completely insane

146:39

because you should be able to talk about

146:41

anything and if you're wrong about that,

146:43

like other people going to correct you

146:45

or have a better argument than you have

146:48

and that's how you figure out who's

146:49

right and who's wrong. And for the

146:50

longest time, there was no talk of

146:52

detransitioners being upset. There was no

146:54

talk of these things are actually

146:56

chemical castration drugs they used to

146:58

use on pedophiles. That's what these

147:00

things are. Rapists and pedophiles used

147:01

to be forced to take these drugs that

147:03

you're now giving to prepubescent boys.

147:05

>> Yeah. Also, the new penises are

147:08

>> Oh, god.

147:09

>> I don't want to be sent any more of

147:10

those,

147:11

>> bro. The new penis.

147:12

>> Shane was sending new penises after

147:13

talking to you. I've seen them.

147:14

>> Both of them are It's genital

147:17

mutilation. And with with a lot of them

147:20

that these people have these thoughts

147:21

about being a girl or being a boy, they

147:23

try turns out they're just gay.

147:26

>> But do you I mean, but what All right.

147:28

theory, possible theory,

147:30

>> theory

147:30

>> is that the ruling classes have always

147:32

wanted eunuchs.

147:33

>> Oh god.

147:34

>> Do

147:34

>> you know what I mean? Like if you're

147:35

emperor of China.

147:35

>> Oh, you just put on the full tinfoil hat

147:38

roll.

147:38

>> Yeah, this is my tinfoil hat moment.

147:40

>> Roll on your hat.

147:41

>> It's good to have a eunuch advising you

147:43

cuz they're calm. We're talking about

147:45

this before. The sex urge is gone and

147:47

they can just use all.

147:49

>> Yes.

147:49

>> All dogs are trans.

147:50

>> Yes. And so are we. Is that Is that the

147:52

effort? Is that why you want to do it?

147:54

Is that why we have

147:55

>> Oh god. I don't think

147:57

>> that's a long-term play that the ruling

147:58

class are breeding a new eunuch class to

148:01

advise them and help. Anyway, it's just

148:02

a theory.

148:03

>> Well, I certainly think it's been

148:04

accelerated by various special interests

148:07

and I think some of them are foreign. I

148:10

think there's there's a lot there's real

148:12

evidence that China and other countries

148:14

have pushed on social media like trans

148:17

ideology.

148:18

>> Yeah.

148:18

>> And also like fought against anti-trans

148:22

people and attack them online. like you

148:24

you see it like these organized hate

148:26

groups.

148:27

>> Not in China though, only in America.

148:28

>> In America, like doing it in America

148:30

using uh different AI programs and but

148:34

LGBT issues are just one of the many

148:37

things that they do that with. They do

148:38

that with immigration. They do that with

148:40

us a try to disrupt our system by

148:42

getting us to argue with each other. So

148:43

they pose as us. Yeah. And argue,

148:46

>> you know, and say wild [ __ ]

148:47

>> And some of that is being added now that

148:49

on X you can see where people are from.

148:51

>> It's interesting, right? It's

148:54

>> Yeah, it's interesting. Not everybody

148:56

looks at it, but when you do look at it,

148:57

you go, "Oh, you're you're in Africa.

149:00

This is kind of crazy.

149:02

>> You're a white nationalist account in

149:04

China. That seems counterintuitive."

149:05

>> Yeah, it seems weird. There's a lot of

149:06

that. Renée DiResta did some

149:09

research on that with the Internet

149:10

Research Agency before the 2016

149:12

elections when they were talking about

149:14

how these um foreign countries had these

149:18

things that were set up that were just

149:20

designed to put posts on Facebook and

149:23

memes and it was just designed to like

149:25

sway the conversation towards a certain

149:27

direction. Yeah. And she's like and the

149:29

funny thing she saw like thousands and

149:30

thousands of these memes. She's like

149:32

some of them are really funny. Like

149:33

they're really funny made memes. Yeah.

149:35

Who's making these? They're being made

149:37

in Russia or somewhere.

149:38

>> This is what this is. When I'm on the

149:40

New York Times app, it feels like I know

149:43

what their agenda is all the time. And

149:45

it's so nice to be like,

149:48

>> I know where that's coming from. I know

149:50

that when I'm on X, it's like there's a

149:52

lot of reality coming at you at once.

149:54

And then there's also definitely bots on

149:56

there doing and it's

149:58

>> it's too

149:59

>> I feel overwhelmed. It is too.

150:00

>> It's too overwhelming. I try not to [ __ ]

150:02

with it anymore.

150:03

>> Every time I go on there, I just feel

150:05

bad. I just feel gross.

150:07

>> All of them. All of them. I try to stay

150:09

off of them as much as possible. I feel

150:11

better when I do. When I have like a day

150:12

or two,

150:13

>> you're in a valuable position of just

150:14

getting to talk to people who know

150:15

what's going on. You get to talk to I

150:17

remember Christopher Hitchens, someone

150:18

asked him like, "What newspapers do you

150:20

read?" And he said, "None. I just talk

150:22

to people who know things that I want to

150:23

talk to, who I trust, who know things."

150:25

You're a very well-connected. Not

150:26

everyone gets to You can have a phone

150:28

call with like an expert in something if

150:30

you want.

150:30

>> That's true. That's a huge plus to doing

150:33

this. Um, but it's also you have to find

150:36

out which expert is really honest.

150:38

>> Yeah.

150:39

>> You have two different experts. Like if

150:40

you have a some sort of a court case

150:43

while the defense will have an expert

150:44

and then the prosecution has an expert

150:46

too and they disagree. So wait a minute.

150:49

>> I thought it was all based on fact and

150:52

logic and science like you guys are

150:55

whether it's DNA evidence or all kinds

150:56

of evidence. There's like experts on

150:58

both sides. So, you're always going to

151:00

have some sign of dispute. If you have

151:03

complete,

151:04

>> if everybody just like completely agrees

151:06

with one narrative, there's something

151:08

probably going on. And generally

151:09

speaking, what's going on is that they

151:11

have control over that social media

151:14

application. Like Bluesky. Yeah,

151:15

>> Bluesky is a perfect example. If you

151:17

just go on Bluesky and type there is

151:19

only two genders, banned. You're gone.

151:22

You're over. Like they don't [ __ ]

151:24

around.

151:24

>> Which is why that one is being allowed,

151:25

I think, in Australia. Australia. So,

151:27

we're banning X for the under 16s, but

151:29

Bluesky is fine.

151:29

>> Yeah. You're going to turn people into

151:31

the most radical of progressives,

151:34

>> but they want they're saying, "Here are

151:35

the facts that we you can agree on, and

151:37

then you can you can have your

151:38

disagreement within that bubble, but

151:40

you've got to exist within a shared

151:42

reality,

151:43

>> right?

151:43

>> I'm

151:45

I'm getting freaked out by the New York

151:46

Times app, and I don't like it." Okay.

151:49

But so, they'll have ads in there, and

151:51

this is this they have ads for the New

151:53

York Times in the New York Times app,

151:55

right? That doesn't seem smart.

151:57

>> It's Well, they're off. They're saying

151:58

you should buy a friend of yours the New

152:00

York Times app. Okay, you should pay for

152:02

them to have it. And then it's like, why

152:04

should you do that? So you can talk, so

152:05

you can understand the news together. So

152:07

you can share the world together, right?

152:09

They're like, isn't it terrible when

152:11

someone has different facts to you?

152:13

Let's all have the same facts so that we

152:16

can know our children again. You should

152:18

buy your children the New York Times app

152:19

and bring them under the safe, warm

152:21

umbrella. And it is. is when I'm on

152:23

there. It's like being in a weird bath

152:24

or something where it's like a protected

152:27

zone. Well, I will be deleting it at

152:28

some point. I enjoy doing the Wordle, but

152:30

it's like I'm just getting a second of

152:33

cuz I've I've been in Austin for like 2

152:34

years now and most of my news has come

152:37

through talking to Kurt Metzger in the

152:39

green room or something. Do you know

152:40

what I mean? And so I was like, just

152:41

give me a taste of what like a normie

152:43

out there is experiencing as reality.

152:46

>> Well, the problem is those normies get

152:47

indoctrinated just as much as anybody

152:49

else does. And so they get indoctrinated

152:51

to thinking that the New York Times is

152:53

this the golden standard of accurate

152:56

news reporting and it's not biased and

152:59

this is the actual story that's going on

153:02

and no that's not always the case.

153:03

>> I would say at least on the right people

153:05

are getting indoctrinated by like

153:06

multiple different strange things like

153:09

the actual agreement. You can have

153:10

arguments and discussions about things

153:12

and people do in a you've seen that like

153:14

meme where it's like here's right-wing

153:16

thought and it's all [ __ ] over the

153:17

place and it's like here's the leftwing

153:19

thing. It's like one thought

153:20

>> and everything after that is Hitler.

153:21

>> Yeah.

153:22

>> Everything to the right of that is

153:23

Hitler. Yes.

153:23

>> Yeah. I've seen those.

153:24

>> I It's weird now that you seeing all

153:26

these right-wing people that are having

153:28

public feuds.

153:30

>> It's blown up. It's been a big week.

153:31

>> What's happening? Like why did everybody

153:33

lose the plot? It's weird.

153:36

>> Charlie Kirk was holding something

153:38

together and now it's really I think

153:39

people are I don't I think he was

153:41

>> Well, it seems like from his death out

153:45

there's a lot of chaos on the right. But

153:47

is that because of his death? What is

153:50

like why are all these people attacking

153:53

each other or is it because you know

153:55

there's people out there that are saying

153:56

wild [ __ ] and then

153:58

>> other people are being forced to defend

154:00

them whether it's Candace Owens or

154:02

whoever it is.

154:03

>> I think the conservative movement was

154:04

always a weird bringing together of

154:07

about three different things.

154:08

>> What are those things?

154:09

>> Uh like foreign policy hawks, social

154:12

conservatives, and big business people.

154:15

>> And William F. Buckley Jr.

154:18

>> is that his name? I'm getting that

154:19

right? But like the National Review, he

154:20

managed to purge all the John Birch

154:22

Society people and say this is mainline

154:24

conservatism going forward. And then

154:26

Reagan was able to like dovetail in him

154:28

with that. And there was

154:29

>> there was like a we there was a coming

154:31

together of two people who didn't

154:32

>> it didn't make a lot of sense for like a

154:34

religious conservative and a big city

154:36

finance guy to share a platform

154:39

together. But

154:40

>> Mhm.

154:40

>> under that project you could bring them

154:42

together and that that it breaks apart

154:45

and you can see it like there are a

154:47

couple things really breaking up like

154:50

where where is the right fracturing in

154:52

Arizona at the moment with it's like

154:55

Israel is a fault line. There's no

154:58

holding together the two wings of the

154:59

conservative movement under Israel

155:01

anymore. is there like you

155:05

the Tucker Carlson wing of that

155:07

discussion and the Ben Shapiro wing

155:10

don't seem to be able to harmoniously go

155:12

in lock step.

155:12

>> No, they hate each other.

155:13

>> They really hate each other. Uh there's

155:16

a conspiratorial wing and there's like a

155:18

big business wing that don't want to get

155:20

along. There are like there's

155:22

libertarians and there's conservatives

155:24

and those they match up on a couple

155:26

things but not a lot of things in terms

155:28

like you know what is a family? what is

155:32

what are our values going forward? What

155:33

should we have religious values in the

155:36

law? A lot of people on the right would

155:37

say yes. A lot of people on the right

155:38

would say that's the never. No. So

155:41

unless there's like a unifying like

155:44

I don't want to say strong man, but like

155:46

one unless there's a unifying figure to

155:48

bring those two disparate groups

155:49

together, I think their natural thing is

155:52

to fight with each other. And that's

155:54

what's happening now is that it's the

155:56

end of the Trump era. He's not going to

155:58

run again.

155:59

>> Mhm. He managed to build some sort of

156:01

coalition around himself. And that's I

156:04

think

156:05

Mr. Kirk's widow whose name I don't

156:07

remember who had the gold outfit.

156:09

>> Erica Kirk.

156:09

>> Erica Kirk who I don't watch a lot of

156:11

the speeches cuz I

156:12

>> I get all secondhand but she's going

156:14

like we need to get behind JD Vance.

156:16

He's going to be the future of holding

156:17

this together. And he's trying to really

156:19

stay out of it so that they he like he's

156:22

not making a call one way or the other.

156:24

He's trying to allow the two parties to

156:27

>> duke it out. see who rises.

156:29

>> I guess he'll see who who wins or like

156:32

>> Well, that's the thing. Someone has got

156:34

to win, right? Like something's gonna or

156:36

they're just gonna just like diffuse the

156:38

whole right-wing movement by being

156:41

constantly at war with each other where

156:43

there's no consensus.

156:45

>> Yeah. And this happens on the left as

156:46

well. Like the left like the AOC people

156:48

and the Nancy Pelosi people are not

156:50

natural bedfellows.

156:52

>> Like what do they have? What's the

156:53

consensus? Like what do they agree on?

156:55

They agree on immigration. They all

156:56

agree on immigration

156:57

>> kind of. I mean the people No, big

156:59

business people want heaps of illegal

157:01

immigration.

157:02

>> Oh, cheap labor.

157:03

>> But the big business people that is

157:05

true. There's some CEOs that have openly

157:08

discussed the fact that they need that

157:10

in order for their business model to

157:11

work.

157:11

>> Yeah. You've got like the Pat Buchanan

157:13

wing of the party going up against the

157:14

like HW Bush wing of the party.

157:17

>> So I don't even think they can get

157:18

around that.

157:18

>> Most people would say that having an

157:21

open border, most people on the right

157:22

would say have an open border is a real

157:24

problem. You need to close the border.

157:25

Like if you vote if you were a

157:27

right-wing person, you ran on let's open

157:28

up the border again. We need illegal

157:30

immigrants. We need the the labor. Yeah.

157:32

It would be over.

157:34

>> You would never win. You would never

157:35

win.

157:35

>> You could govern that way. And I think

157:37

people did for a long time,

157:38

>> but you could never have that as your

157:40

public,

157:40

>> right? You could let them sneak in, let

157:42

it slip and slip.

157:44

>> Well, like Biden was always saying,

157:45

we're we're tough on the border. And you

157:47

go, but

157:48

>> these numbers are very

157:50

>> gling. You definitely weren't. He wasn't

157:51

tough on [ __ ] but I also think he

157:53

wasn't running anything either, you

157:55

know? I mean,

157:56

>> it's hard to

157:57

>> imagine. Hard to imagine.

157:59

>> Yeah.

157:59

>> Yeah. No way. So, whoever was running it

158:02

wanted to keep running it and that was a

158:03

real problem. That was a real problem.

158:06

That's scary because then you you

158:07

realize even though it's crazy to have a

158:09

president, at least the ideas you voted

158:11

a president in, but if the president

158:14

doesn't do anything and it's really a

158:15

bunch of like as nutty as Trump is, at

158:17

least you know he's doing it. Like

158:18

nobody else is going to put gold all

158:20

over the White House, you know? He's

158:22

doing that. Nobody else 100%. He did the

158:25

auto pen thing. At the very least, you

158:27

know, it's him doing it. And you hate

158:29

him, you love him, whatever.

158:30

>> I think he wrote he wrote that Rob Reiner

158:32

tweet. I don't think anyone was in his

158:33

ear going.

158:35

>> I think I think you should take a big

158:36

stand against Rob Reiner today.

158:37

>> No, he wrote that. He wrote that. Um,

158:40

>> but it was Brennan.

158:43

>> Brennan and Clapper. Those are the

158:45

people that had the video with Rob

158:47

Reiner where he's like literally talking

158:48

to two spooks about how it's a real

158:50

problem that that Trump is the

158:51

president.

158:52

>> Something called the Committee for

158:54

Russian Investigation or something like

158:56

that. Rob Reiner did.

158:58

>> No one apologizes for the Russia stuff.

159:00

>> No, it's crazy what they did. And

159:03

>> the co stuff no one apologizes for.

159:05

>> No, they they completely lied. As much

159:07

as you can hate him about a lot of

159:08

things that Trump has done, you you

159:10

can't just let people get away with

159:12

making a fake story about him colluding

159:15

with Russia, like that's a fake story.

159:18

The Steele dossier was literally all that

159:20

stuff was funded by the Clinton

159:21

campaign. It's crazy.

159:22

>> Yeah. And the Epstein stuff coming out

159:25

now is I mean, we'll see what happens

159:27

with that, but

159:27

>> Well, you guys were talking right before

159:29

the podcast said Jamie said there was a

159:31

big dump. What happened with the big

159:32

dump?

159:33

>> Big dump.

159:33

>> You said there was a big dump today and

159:34

they [ __ ] up. That was your your take.

159:36

They [ __ ] up.

159:37

>> The [ __ ] up was that people have found

159:39

out that the redactions weren't really

159:40

redacted.

159:42

>> Like that's a big mistake. Like you can

159:43

copy and paste and put another document

159:45

and see the redactions.

159:47

>> Oh, like a Photoshop deal. Like you

159:48

could get the layers away.

159:50

>> Yeah.
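[Editor's note] The failure mode described here — a black box drawn over text rather than the text being removed — can be sketched in a few lines. This is a minimal illustration, not real PDF parsing: the `content_stream` below is a hypothetical, heavily simplified stand-in for a PDF page stream, but the principle is the same. Redaction rectangles are just painting operators layered over intact text-show operators, so any text extractor (or copy-paste) still recovers the "hidden" characters.

```python
import re

# Hypothetical, simplified PDF-style content stream. The 're f' line paints
# a black rectangle OVER the name — which is how bad redaction tools work —
# but the (Name: John Doe) Tj operator carrying the text is still present.
content_stream = """
BT /F1 12 Tf 72 700 Td (Name: John Doe) Tj ET
0 0 0 rg 70 695 120 14 re f
BT /F1 12 Tf 72 680 Td (Case: 12-3456) Tj ET
"""

def extract_text(stream: str) -> list[str]:
    # Pull every (...) Tj text-show operand and ignore all painting
    # operators, including the rectangle that visually "redacts" the name.
    return re.findall(r"\(([^)]*)\)\s*Tj", stream)

print(extract_text(content_stream))  # the "redacted" name comes straight back
```

Proper redaction rewrites the content stream and deletes the text operators before flattening; drawing an opaque layer on top (the "Photoshop layers" situation) leaves everything recoverable.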

159:50

>> Oh, whoopsies. That's what happens. You

159:52

get [ __ ] people working for the

159:54

government. They're dorks. Um then the

159:56

then which is like this like steps to

159:59

this if you're I wasn't following it all

160:01

but

160:02

>> uh the Department of Justice has tweeted

160:05

a couple interesting things

160:07

>> today starting with this one eight hours

160:10

ago. So it's like uh 6 a.m. or

160:12

something. Department of Justice has

160:14

officially released nearly 30,000 more

160:16

pages of documents related to Jeffrey

160:18

Epstein. Some of these documents contain

160:19

untrue and sensationalist claims made

160:22

against President Trump that were

160:24

submitted to the FBI right before the

160:25

2020 election. To be clear, the claims

160:29

are unfounded and false. And if they had

160:31

a shred of credibility, they certainly

160:32

would have been weaponized against

160:34

President Trump already. Nevertheless,

160:36

out of our commitment to the law and

160:38

transparency, the DOJ is releasing these

160:40

documents with the

160:42

legally required protections for

160:44

Epstein's victims.

160:45

>> Some of those documents have been

160:46

deleted now. Okay. So, they're saying

160:48

that 30,000 more pages of documents and

160:52

some of them contain untrue and

160:54

sensational claims made against

160:56

President Trump that were submitted to

160:58

the FBI right before the 2020 election.

161:00

Right. But by who?

161:02

>> That's people are just sort of taking it

161:04

with a grain of salt, saying like, what, so

161:06

nobody else it's all untrue about Trump.

161:09

Nothing nobody else. All the Bill

161:10

Clinton photos were definitely

161:12

>> The other one was picture came out of a

161:15

letter that seems to be a potential

161:18

suicide note written by Epstein written

161:20

to Larry Nassar.

161:22

>> The facts of that though were strange.

161:24

There's a postmark which is three or

161:25

four days after he died.

161:27

>> Wait a minute. Larry Nassar.

161:29

>> Yeah. Was also in jail.

161:31

>> He's the Olympic guy. Yeah. The doctor

161:33

that was a pedophile. Yeah. And it's

161:34

like a letter writing like, "Hey, I know

161:36

what you know why I'm in jail. I know

161:38

why you're in jail.

161:40

Boy, that seems weird that he's writing

161:42

a letter for the short and like that

161:44

starts off saying if you've gotten this,

161:46

you know I took the in quotes short

161:48

route out which

161:49

>> short route home, right?

161:51

>> Yeah. Uh but there's some weird detail.

161:53

People are like they said they're saying

161:55

this is fake or maybe fake.

161:58

>> Did they get a handwriting expert to

162:00

analyze it yet?

162:02

>> Peering doesn't that's I started asking

162:04

the questions like well then why how did

162:06

it get why did it come out? How you

162:07

know? Oh, so the FBI, it says the FBI

162:10

has confirmed this alleged letter from

162:12

Jeffrey Epstein to Larry Nassar is fake.

162:14

Fake in all caps. Trump wrote that the

162:17

fake

162:18

>> it gets busted by the fake caps.

162:21

>> Fake letter was received by the jail and

162:24

flagged for the FBI at the time. The FBI

162:27

made this conclusion based on the

162:28

following facts. The writing does not

162:30

appear to match Jeffrey Epstein's. The

162:32

letter was postmarked three days after

162:33

Epstein's death out of Northern Virginia

162:35

when he was jailed in New York. The

162:37

return address did not list the jail

162:39

where Epstein was held and did not

162:42

include his inmate number, which is

162:43

required for outgoing mail. The fake

162:45

letter serves as a reminder that just

162:47

because the document is released by the

162:48

Department of Justice, does not make the

162:51

allegations or claims within the

162:53

document factual. Nevertheless, the DOJ

162:55

will continue to release all material

162:57

required by law. Well, this is how they

163:00

probably should have done it from the

163:01

beginning, right? Release all material.

163:03

Yeah. And then refute whatever you say

163:06

is fake. And you say, "Okay, it didn't

163:08

have his inmate number. It's not his

163:10

handwriting. It's fake. It was 3 days

163:12

after his death. It was postmarked from

163:14

Virginia. He was in New York."

163:15

>> But don't make it look like you're

163:16

covering it up. Just

163:17

>> right. Release it. Although I will say I

163:20

have seen on Twitter people complaining

163:21

about like

163:23

>> like they're not meant to censor

163:25

anything due to embarrassment. But when

163:27

it's like Ghislaine Maxwell's boobs, they

163:29

will censor it out and go,

163:30

>> "This has been illegally censored. You

163:33

must

163:34

>> by the law of the United States. Show me

163:36

her boobs.

163:36

>> I need to see them areolas."

163:38

>> Is she She's in prison in Texas.

163:40

>> She's in You can kind of call her

163:41

prison. She does yoga, plays cards,

163:44

hangs out.

163:45

>> Is she allowed to talk to people? I

163:47

don't think so. She's not allowed to

163:48

podcast. I'm sure if that's what you're

163:50

getting at.

163:50

>> I am.

163:51

>> That would be a really exciting podcast.

163:53

>> If everybody wants to die, that would be

163:54

a really good podcast.

163:56

>> I think she's just a nice normal lady.

163:58

>> Do you think Trump on the way out

163:59

pardons her?

164:01

>> She's a nice woman. I wish her well.

164:03

>> I don't know. It's uh

164:05

>> the weird thing is she's in jail for sex

164:07

trafficking to

164:10

>> who? Epstein,

164:12

>> right?

164:13

>> But I is was it for that

164:14

>> from him? I think it was 16-year-old in

164:17

Florida and it was directly to him. I

164:19

was briefly I experimented with being

164:21

like

164:22

>> a non-believer.

164:24

>> Really?

164:25

>> Yeah. For about 2 weeks I

164:26

>> What did you What did you think was

164:27

going on?

164:28

>> I was like maybe he's just a pervert who

164:30

liked getting back rubs from

164:31

16-year-olds and he had famous friends

164:33

cuz everyone was like he's Mossad. He's

164:35

CIA.

164:36

>> What do you think now?

164:37

>> Yeah, he's obviously

164:39

something. It's way I just thought every

164:41

like everyone in the green room was

164:43

saying he's Mossad. I was like I could

164:46

be maybe the controversial thing would

164:48

be to not believe that the contrarian

164:50

position.

164:50

>> I just wanted to try experiment with the

164:52

contrarian position and it's getting

164:53

harder and harder to hold that.

164:55

>> Yeah. It seems like the more they dig

164:57

into his past, the more it feels like he

164:59

was part of some sort of intelligence

165:00

agency.

165:01

>> Well, like channeling offshore money for

165:03

people. Mhm. How about the fact that he

165:05

just got a slap on the wrist during the

165:07

first case when he caught a case and

165:09

then the whoever it was was the

165:11

prosecutor or the judge was told that he

165:15

was intelligence. There was a Yeah,

165:17

that's and then someone ret I listened

165:20

to a podcast on it from like some some

165:23

Matthew Schmitz who's at Compact magazine

165:26

and they were like they were making out

165:28

that it was uh it was a anti-semitic

165:30

plot to say that Epstein was secret

165:33

intelligence and it genuinely although I

165:36

don't agree with them it was one of the

165:37

best put together podcasts I'd heard and

165:40

I

165:41

>> look at this suicide watch observation

165:43

log, 2:15 a.m. Inmate states his cellmate

165:46

tried to kill him.

165:48

>> Inmate sitting on bed trying to

165:50

remember.

165:50

>> Retracted it saying he has no idea what

165:52

happened, but there's pictures of him

165:54

showing his wounds and stuff. I think he

165:56

also said he woke up and didn't know

165:57

where those wounds came from.

165:59

>> Oh, so that's the guy, too, by the way.

166:01

You know that. That's the cellmate. The

166:03

giant dude.

166:04

>> Oh, so the cellmate beat the [ __ ] out of

166:07

him. I don't see any wounds.

166:08

>> Oh,

166:08

>> where's the wounds?

166:11

>> Newly released document.

166:14

semi-conscious with neck injuries.

166:16

>> He had marks around his wrist. I think

166:17

they said his mean

166:18

>> we see his neck.

166:20

>> That's not a good picture.

166:21

>> It's a video.

166:22

>> Oh, okay. It's a video.

166:25

>> This picture his hands were swollen. I

166:26

think I said his ankles or feet were

166:28

swollen, too.

166:28

>> Oh, so the guy tried to grab his neck

166:31

and choke him,

166:32

>> but they said they investigated. They

166:35

didn't find anything.

166:36

>> Found no evidence of foul play. I didn't

166:38

do nothing. He says he didn't do

166:40

nothing. I don't know what to tell you.

166:41

You're okay. Get back in jail, you

166:43

pedophile. That's probably what they

166:45

did. But the guy probably tried to kill

166:47

him. I mean, it looks like a guy that

166:48

would try to kill you and he was

166:49

definitely a murderer.

166:50

>> Yeah, if you're in a jail cell with a

166:51

pedophile, I don't think that's unusual

166:52

to try and kill that guy.

166:53

>> Also, you're a big giant guy who's in

166:56

jail for murdering four drug dealers and

166:58

you're a cop. Like,

167:01

>> I was I was always saying that you get

167:03

him to kill that guy for like a pack of

167:04

cigarettes.

167:05

>> That guy's going to be jail for the rest

167:07

of his life. Forever. For sure. And you

167:09

can give him like awesome special

167:10

treatment if he waxed Jeffrey Epstein.

167:14

>> Man, I was really trying. I tried so

167:16

hard. I went on podcasts trying to say

167:18

he was Yeah.

167:20

>> I wish I hadn't.

167:21

>> I just I just thought it was a cool a

167:24

cool like bucking back against the grain

167:26

thing to say.

167:27

>> And I was saying he was charismatic.

167:29

>> Yeah.

167:30

>> Why wouldn't famous people want to hang

167:31

out with this charismatic man? Good

167:32

>> point.

167:33

>> That photo where he's with Michael

167:34

Jackson.

167:35

>> His loafers are incredible. He had a

167:37

great sense of style.

167:38

>> Right. Right.

167:39

>> But I do and then there's things about

167:41

him discussing with, you know, he's

167:42

talking to ex prime ministers of Israel

167:45

about how to move money around or

167:47

something. Yeah.

167:48

>> It's I

167:50

former prime minister of Israel used to

167:52

visit him at his Manhattan place with

167:54

like a mask over his face. He'd like

167:57

pull his [ __ ] have like one of these

167:58

things on. Do you ever see it?

168:00

>> No.

168:00

>> Yeah. See, pictures of him trying to

168:02

cover his face as he goes into Epstein's

168:04

house, which is what I always do when I

168:06

go to my friend's house.

168:07

>> You cover your face.

168:07

>> Yeah. You don't want anybody knowing.

168:09

>> You go to the Ring doorbell.

168:10

>> There's also there's apparently more

168:12

>> Nixon mask on.

168:13

>> More Prince Andrew ones now.

168:14

>> Oh, of course.

168:15

>> And he's uh

168:15

>> Well, there's a reason why they

168:17

literally kicked him out of the royal

168:18

family. They banished him to a

168:20

mansion somewhere in the hills.

168:23

>> I don't think he'd been

168:24

>> Yeah. It's not good. Hurts the It hurts

168:27

my regard for the beautiful royal family

168:29

who I love very much. I bet you do. You

168:31

like a good royal family.

168:33

>> I love a royal family.

168:34

>> Look at that dude.

168:35

>> Yeah. Well, he's dodging the paparazzi.

168:38

>> Oh, for sure.

168:39

>> Paparazzi are always in front of a

168:41

financial guy's house.

168:43

>> Bunch of chicks leaving.

168:44

>> A lot of people seem to love hanging out

168:46

with this guy. A charismatic guy.

168:48

>> Bet he's a lot of fun. Had cool people at

168:50

his parties.

168:51

>> I mean, the it was with Woody Allen. He

168:53

was hanging out.

168:54

>> Bill Clinton.

168:55

>> Bill Clinton seems to have a great time

168:56

in all the photos. There's a lot of

168:58

people seem like having a great time.

168:59

Michael Jackson was hanging out there.

169:01

>> Michael Jackson didn't look like he was

169:03

having a lot of fun though.

169:03

>> Well, I don't think he had a lot of fun,

169:05

period. Right. Michael

169:07

>> tortured individual.

169:08

>> He had a roller coaster. How could he be

169:10

unhappy?

169:14

>> I don't think that was for him. That

169:16

roller coaster was like,

169:17

>> I still know my turkey. When you go

169:19

turkey hunting, you put up a fake

169:20

turkey. No.

169:21

>> Bring in the turkeys. His father made

169:23

him dance too much and that's why he

169:26

wanted to spend the night with boys.

169:28

>> I can't defend Michael Jackson.

169:30

>> No, you can't.

169:31

>> Who can you defend easier, Michael

169:32

Jackson or Epstein?

169:34

>> Well, we don't have any. Yeah. I mean,

169:37

probably Michael Jackson because the

169:38

music was great.

169:39

>> The music was great and this his doctor

169:42

said he was chemically castrated. You

169:43

know that?

169:44

>> I don't.

169:45

>> Yeah. The doctor that went to jail for

169:47

giving him propofol that wound up killing

169:50

him.

169:50

>> A general anesthetic. Yes. Yeah. that

169:52

doctor um when he got out of jail spoke

169:56

publicly about the fact that Michael

169:57

when he was young was given chemical

170:00

castration drugs to protect his voice to

170:02

keep his voice from deepening.

170:04

>> I'm on the record saying that castrati

170:06

should be brought back. You

170:07

>> think so? You're on the record?

170:08

>> Yeah. No. Over and over again. I said if

170:10

we're going to have trans people,

170:12

>> make them sing.

170:13

>> Well, you you get it regarding how well

170:16

you can sing.

170:17

>> But you got to do it when you're really

170:18

young.

170:19

>> It's got to be before puberty.

170:20

>> Yeah. I don't really believe it, but I

170:22

do want to hear the castrato again. We got

170:23

one recording and it's not very good.

170:25

Weird.

170:25

>> Have you heard it?

170:26

>> It's eerie. Yeah, we played it on this

170:27

podcast a bunch of times. It's It's kind

170:29

of macabre,

170:30

>> but people loved it at the time.

170:32

>> They were sick people.

170:33

>> And only the Italians

170:35

>> because the Italians were bold and

170:38

>> What a crazy move.

170:39

>> What?

170:40

>> Cut son's balls off when he's young so

170:42

he could sing at a high pitch forever.

170:44

>> Well, I think they would crush them

170:45

because they didn't have antiseptic. I

170:47

think cut them off his uh

170:48

>> What' they do? They crush their balls. I

170:49

think they'd crush them and then put

170:50

them in a bath of milk. But do you know

170:54

about Do you know about the swan thing?

170:55

>> What they do to crush the balls? What

170:57

they use? They just smash them.

170:59

>> That thing you did with your hands. That

171:01

was terrible.

171:01

>> It's not good. But they would deny it.

171:03

The families would never cop to it cuz

171:04

it was illegal to castrate your son.

171:07

>> So you would you would come up with an

171:08

excuse and there's like one town in

171:10

Italy where over the course of a year

171:11

that like they reported hundreds of swan

171:14

attacks. That's what they would say. Oh

171:16

god. I would say a swan

171:18

>> flew into my son's testicles and that's

171:20

why he's now the best singer in Milan.

171:23

>> And they did it so their son could make

171:24

money just like a theater mom.

171:26

>> But the people loved it. Like when when

171:28

there was the last one and they were

171:29

going to retire it, people was chant

171:30

like

171:31

>> crowds screamed long live the knife.

171:33

They wanted it to keep going.

171:35

>> Do you know about this?

171:36

>> Long live the knife. The

171:38

>> nut widespread popular support not to

171:40

get rid of the castradi.

171:42

>> Oh my god.

171:43

>> People wanted to keep hearing it.

171:44

>> Bro, that's terrible.

171:46

But they must have sounded really good.

171:48

>> Well, we heard the recording. You want

171:49

to hear it?

171:50

>> Apparently, he was no good. Apparently,

171:52

he was one of the worst ones.

171:53

>> Many of these operations were performed

171:55

by local barbers.

171:58

>> The razor.

172:01

>> I guess I did use the razor.

172:02

>> Your nuts.

172:03

>> No. Yeah, they said this was an

172:04

operation.

172:04

>> I should have guessed you were across

172:05

the castrati.

172:06

>> The mouth.

172:07

>> I could have guessed that would have

172:07

come up on this show before. I didn't

172:09

know you'd played it a bunch of time.

172:10

>> Oh, yeah. We played it before. We'll

172:11

leave on this. We'll play it.

172:12

>> You I don't know if Can we play it?

172:14

>> I can't. This is one of those videos I

172:16

could have.

172:16

>> Yeah, somebody might have owned it.

172:17

>> I actually I got into an argument about

172:19

it because I put it on a video once and

172:21

I got challenged and I challenged it

172:22

back because it was recorded so long

172:24

ago.

172:25

>> Oh yeah, it should be in the open.

172:26

>> Do you know what I mean?

172:27

>> Whatever.

172:27

>> That's true.

172:28

>> There's a Wikipedia recording that's

172:29

totally open. No, I'm across it.

172:31

>> We don't want to deal with it though.

172:32

>> How come no rappers are sampling the

172:33

castrato? Danny Brown?

172:35

>> Maybe Diddy when he gets out.

172:38

>> Maybe you could.

172:39

>> I'm not even going to try and be a Diddy

172:41

defender.

172:42

>> I thought about it.

172:43

>> You're such a contrarian. You do think

172:44

about it.

172:45

>> Yeah, I I it would be nice. I just don't

172:48

have enough time to research it

172:49

properly.

172:50

>> But if I had all the time and if I

172:52

didn't have kids, I would be spending

172:53

all my time becoming the best Epstein

172:55

defender because it would be a cool

172:56

thing to say at parties very stridently.

172:58

>> Wouldn't that be that's such an

173:00

Australian thing to think? What do you

173:01

got here?

173:02

>> It's just a quick explan. I mean, they

173:03

really sum this up fast.

173:05

>> Oh, time roughly beginning the 17th

173:07

century to mid-19th century, an era

173:08

where the science of anesthesia

173:10

still had some way to go.

173:12

And here we go. Before making the first

173:14

cut, a surgeon would send a patient into

173:16

a semicoma state by plying him with an

173:19

opium-based drink and compressing his

173:21

carotid arteries.

173:22

>> Oh, that's the milk.

173:23

>> Then the boy would be plunged into a

173:26

bath of milk or hot water to soften the

173:28

necessary parts at which point speed was

173:31

of the essence. Cut the spermatic cords,

173:34

remove the testicles, tie the ducts, and

173:36

then fingers crossed. Oh god.

173:40

Oh god. But what is it about the

173:42

Italians that were the only people to do

173:43

it?

173:44

>> Why you why you [ __ ] with my people?

173:46

>> I know. I'm saying it's kind of a

173:47

greatness of spirit.

173:48

>> No.

173:49

>> To go that's how much you loved music.

173:51

>> It's disgusting.

173:51

>> Other people were trying to take over

173:52

the world and build empires. Not in

173:54

Italy. That's what you were doing in the

173:55

>> They just didn't know that AI could just

173:57

fake it. We could make an AI castrato.

173:59

Maybe we should close on that. Let's

174:00

have AI do a castrato.

174:02

>> I reject it. I reject AI castrato. I

174:05

want the real ones.

174:05

>> Rolling stone.

174:07

>> Can you do that?

174:08

>> Yeah. Let's do uh have AI make uh a

174:12

cover of Papa Was a Rolling Stone as an

174:16

opera. Castrato

174:18

>> or Castrata? Is it castrato or castrata?

174:22

>> I think it's castrati is the plural.

174:24

>> Castrati, right? But is it a castrato?

174:26

>> I think it's still a boy if you cut his

174:28

nuts off.

174:30

>> Well, you'll get in a lot of trouble in

174:32

Britain for saying the opposite, but

174:34

>> Mhm.

174:35

>> Yes. The ladies loved them. God.

174:38

>> And they got big can never get hard.

174:42

>> No, they could.

174:42

>> Really?

174:43

>> Yeah. Um,

174:44

>> how do you know?

174:45

>> I read a lot about it. Like

174:46

>> maybe they lied.

174:47

>> They would have sex AC. No, women would

174:48

like go and try and have sex with them,

174:50

>> but they couldn't get pregnant off the

174:52

back of that.

174:52

>> But how' they get a boner if they

174:53

didn't?

174:54

>> I think it like

174:55

>> testicles.

174:56

>> They still got They were still

174:57

testosterone in the body.

174:58

>> Like a tiny amount by the

175:01

>> They got real tall though. They got

175:03

huge.

175:03

>> They would be like 7 foot tall.

175:05

>> Really?

175:05

>> And they're This is why they could sing.

175:07

so well is the the bones in their rib

175:09

cage wouldn't fuse.

175:11

>> Like there's something in puberty that's

175:12

meant to come in and like stop your

175:13

bones growing. That happens when you're

175:15

a child. So they'd have like this huge

175:17

rib cage with huge lungs and a tiny

175:19

little boy voice. Yeah. But like huge

175:22

amounts of air flowing out.

175:23

>> Oh, that's crazy.

175:25

>> I'm just saying why can't we if we're

175:27

going to have all the trans kids,

175:29

doesn't one of them go I identify as a

175:31

castrato? Couldn't one do it?

175:34

>> Maybe you're planting the seed in

175:35

someone's head right now. I don't want

175:37

to do that.

175:37

>> Well, maybe they already went through

175:38

with the other thing and they're like,

175:40

"Well, let's make the most of this,

175:42

>> you know, make some lemonade."

175:47

>> Stred doing Have you got Can you really

175:48

just type it in and make a

175:50

>> Yeah. Yeah, but the the uh

175:53

>> How long does it take to render?

175:54

>> The problem is the lyrics.

175:56

>> The lyrics.

175:57

>> Those lyrics are copyrighted.

175:58

>> You could have a song.

175:59

>> Oh, we can't play it. Spangle won't make

176:02

this. That's a whole show on how you

176:03

make these songs. I don't want to get

176:04

into

176:05

>> How are they doing that? You don't want

176:06

to say it. Okay. All right. We're

176:08

wrapped.

176:08

>> Is it a secret?

176:09

>> Man, we're going to miss you. You'll be

176:11

back.

176:12

>> Got one. Hold on a second.

176:13

>> Oh, you got one. Oh, here we go. Here we

176:15

go.

176:16

>> It's not quite eerie enough.

176:20

>> That sounds like a regular guy.

176:22

>> When you hear them when you hear that

176:23

one guy, it is otherworldly.

176:25

>> It's creepy. All right.

176:26

>> It's creepy. All right.

176:26

>> You make good songs.

176:27

>> Men, I love you.

176:28

>> Thank you for having me. I really

176:29

appreciate it.

176:30

>> It's always fun hanging out with you and

176:31

I'm excited about tonight. We're going

176:32

to have some fun.

176:32

>> I think so. Yes, sir. Okay. See you in a

176:35

bit. All right. Bye, everybody.

176:41

[Music]

176:41

[Applause]

176:46

[Music]

Interactive Summary

The podcast features a wide-ranging discussion covering topics from ancient history to modern socio-political issues and technological advancements. The hosts delve into the differences between woolly mammoths and mastodons, the history of seeds, and criticisms of mainstream media's narrative control. They also explore the changing landscape of comedy, the complexities of poverty and crime in communities, and the historical issues of political corruption and rigged elections. Significant airtime is given to the ethical implications of AI, the degradation of food quality in America, and a critical look at religious beliefs and historical practices like bloodletting and castrati. The conversation touches on current events such as the Epstein document dump and the ongoing debates around transgender rights and free speech, particularly on social media platforms.
