Joe Rogan Experience #2444 - Andrew Wilson

Transcript

0:01

Joe Rogan podcast. Check it out.

0:03

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day.

0:15

Prove me wrong type things, you know,

0:18

way before they change my mind.

0:19

>> Change my mind. What does Charlie What

0:21

did Charlie say? Prove me wrong or

0:23

something like that.

0:24

>> It It was something akin to that. My

0:26

understanding was that

0:29

essentially uh TPUSA ripped that idea

0:32

off.

0:34

Yeah.

0:36

>> And then uh he would I think he feels a

0:40

lot of like responsibility for what

0:44

happened with Kirk because

0:46

>> Is he the Mossad?

0:48

>> What's that?

0:48

>> Is he the

0:48

>> Is he the Mossad? Yeah, exactly.

0:50

[laughter] That's so funny.

0:53

>> I don't know. Let's You got You got

0:54

Candace's number. We can ask her. We can

0:56

ask her.

0:57

>> Candace is getting uh she's getting

0:59

dragged on Twitter today because uh

1:02

she's like, "I've lived in

1:03

Connecticut. I've never seen this much

1:05

ice on trees and uh it's 30° out." And

1:08

everybody's like, "Yeah,

1:11

30 is freezing."

1:13

>> Yeah. [laughter] Yeah. It's so funny.

1:15

Ice on trees.

1:16

>> Do you see all the Miss Cleo memes?

1:18

>> It's Oh, it's so funny. You remember You

1:21

don't remember the Miss Cleo? Yeah. The

1:23

psychic. They keep on putting the Miss

1:25

Cleo memes out for Candace because she's

1:27

a psychic, you know.

1:28

>> That's hilarious. [laughter]

1:31

>> It is funny.

1:33

>> I think this lighter just [ __ ] the bed.

1:35

Can I borrow that other one? Thank you.

1:37

>> Yeah,

1:38

>> it's really funny.

1:40

>> Well, Candace has painted herself into a

1:42

weird corner where everything has to be

1:44

a wild conspiracy. Like, it has to be

1:47

Brigitte Macron's a man. Oh, yeah. Erika

1:49

Kirk killed Charlie. It h it has to like

1:52

one up the last one, you know.

1:54

>> Yeah. I I was It's really funny. He came

1:57

to the same conclusion that I did. So,

1:59

it's like I've seen those conspiracy

2:01

channels come up before

2:03

>> and then they they come up and they

2:05

crash out. And the reason is is because

2:09

like for her, I think she she had the

2:11

whole like um

2:13

uh she was involved with this, right?

2:15

She was involved intricately with with

2:16

Kirk. She knew him.

2:18

>> Yeah. And so that g gave a lot of

2:20

credibility to a lot of the things that

2:21

she was saying. But then once you start

2:24

moving back into like Mandela effect

2:26

stuff and and you know time travel and

2:28

this and that. Yeah. People are like ah

2:30

[laughter]

2:32

>> I mean you could do that if you're that

2:34

guy. If you're Art Bell, you know, if

2:36

you

2:37

>> Well, you know, but Bell I I I remember

2:39

I used to listen to Bell.

2:40

>> He's the goat on the wall over there.

2:42

>> You remember that intro? Boom.

2:46

Yeah. So he the Kingdom of Nye.

2:47

>> Yeah. Yeah. I remember I remember

2:49

listening to him [snorts] uh for years

2:52

when I drive around with my dad and he

2:54

was a like it was a big deal and I

2:57

remember the very first episode I heard

2:58

from him was something about the Nim. It

3:00

was like

3:01

>> what's the Nim?

3:02

>> The Nim were like this guy called in. He

3:04

was a time traveler, [laughter]

3:06

>> right? And he came back in time because

3:09

his whole his his whole thing was like

3:11

he had to stop the weather patterns from

3:13

destroying the future because the Nim,

3:15

an alien race of grays, had come and

3:17

they were heating up the planet slowly

3:20

to change it to be the conditions that

3:22

were necessary for them to then live on

3:23

the planet.

3:24

>> And you know, Art Bell, he's always

3:26

playing into it with the lunatics, you

3:27

know, and he's like,

3:29

>> "And does the CIA currently know that

3:33

you're there doing this?" You know,

3:34

[laughter]

3:35

the guy And the dude's just like, "Uh,

3:40

[laughter]

3:41

>> yeah, Art would give you all the rope."

3:43

>> Oh, yeah.

3:43

>> You call Art I'm a Werewolf.

3:45

Interesting.

3:46

>> Yeah.

3:47

>> Oh, you ever hear the Bigfoot episode?

3:49

>> No.

3:49

>> Oh my god. That's the funniest episode

3:51

you'll ever hear. So, Redneck calls into

3:52

Art Bell and talks about how he killed

3:54

Bigfoot and where he buried it. And the

3:56

guy has it's like I don't know if it was

3:58

early trolling like before trolling was

4:01

trolling but it was like this guy

4:04

>> he was like yeah you know me and Timmy

4:06

we we took him out back there we shot

4:07

him right in the chest twice and u there

4:10

was some young'uns and they spread out a

4:12

little bit and then we you know we

4:13

packed up the Bigfoot and buried him in

4:15

the backyard you know and Bell's just like

4:17

and you said there was young'uns.

4:20

[laughter]

4:21

>> The Bigfoot people are the weirdest.

4:24

Duncan Trussell and I went uh hunting

4:26

for Bigfoot once. We did this uh I used

4:29

to do this TV show for a while called

4:30

Joe Rogan Questions Everything. And I

4:32

would go like, "All right, tell me about

4:34

chemtrails, you know, and I go meet with

4:36

all the loons and all all the people

4:39

that are like really involved UFO,

4:40

anything like that." Yeah. And we went

4:42

and hung out with the Bigfoot people.

4:43

So, we went Bigfoot hunting for like two

4:45

days in the Pacific Northwest and talked

4:47

to all these people and

4:49

>> they're all like the same person. I just

4:52

said it's like a team of unfuckable

4:54

white guys.

4:56

>> It's like that's what you find. Like

4:57

these guys are just like they found

4:59

their calling. It's just like

5:01

>> looking for a mystery in the woods that

5:03

you'll never solve.

5:04

>> Well, the there was a guy I used to have

5:06

on your podcast and he was he was huge

5:08

for a long time and I think it still is.

5:11

It's a remember those missing cases?

5:13

>> Mhm.

5:13

>> Right. That was that's a big deal. And I

5:16

was always like anytime I heard anything

5:18

about that, I always was enthralled with

5:20

it because some of the stories were

5:21

demented. Yeah.

5:22

>> You know, like kids appearing 500 miles

5:24

away and all this.

5:26

>> But that guy always had you edged

5:28

because people would always go, "What do

5:29

you think's going on?"

5:30

>> You know, and he

5:31

>> What was that guy's name?

5:32

>> I can't I can't remember it, but he it

5:34

was like missing

5:36

>> Yeah.

5:37

>> 411.

5:38

>> Yeah. Missing 411.

5:40

>> He's I've seen him on Instagram or on

5:42

Twitter.

5:43

>> Yeah.

5:44

>> Yeah. Wow. He But he'd got all the park

5:46

records, you know, and he started going

5:48

through and he was like, "Uh, there's

5:49

some really weird stuff going on here

5:50

for how many people are missing in

5:52

national parks."

5:53

>> There is. There is, but the reality is

5:58

if you die in the woods, you get

6:00

consumed pretty quick. That's the

6:02

reality. That's why you don't find

6:03

mountain lion skeletons. Mountain lions

6:05

are a real thing. I've never found a

6:07

dead mountain lion skeleton in all the

6:09

times I've been hunting. Never. Not

6:10

once. You'll find uh elk bones. You

6:13

know, you'll find stuff like that.

6:15

>> I found some coyote coyote skeletons

6:18

before out in the Nevada desert.

6:20

>> But mountain lions are a real thing. You

6:22

very, very, very, very, very rarely find

6:24

a dead mountain lion.

6:26

>> Yeah.

6:26

>> And there's so many of them. Now, think

6:28

about how few people actually go like

6:31

hiking into the deep wilderness. Your

6:34

body just gets consumed.

6:35

>> Sure.

6:36

>> You know, there's so many animals that

6:37

come along, rats, all kinds of things.

6:39

Eat your bones.

6:40

>> Oh, yeah. It's a free meal.

6:42

>> Yeah. It's so easy.

6:42

>> And they can smell it for miles.

6:44

>> Sure. Bears.

6:45

>> Yeah.

6:46

>> Anywhere there's wild pigs and then then

6:47

it's over. Then there's nothing left.

6:49

>> Yeah. They can smell [snorts] that stuff

6:50

for miles.

6:51

>> But it's like people always want to

6:52

attach some crazy deeper weird, you

6:55

know, UFO Bigfoot meaning to it. It's

6:58

like no, it's you're in the wild and

7:00

nature has a whole plan for dead things

7:03

and it does a really good job of

7:05

>> I assume they don't last.

7:06

>> Not at all.

7:06

>> Well, that's the thing. If you live out

7:08

in the country, it's you see this all

7:10

the time. Uh, you know, raccoon will be

7:12

around getting in someone's trash.

7:13

They'll walk out. Bam. Raccoon's done.

7:15

They just go throw it in the bushes.

7:17

>> You don't need That's it.

7:19

>> That's it.

7:20

>> Problem problem solved.

7:21

>> Problem solved. And it disappears

7:22

quickly and the plants consume it. And

7:25

>> yeah,

7:25

>> that's it. It rots and things eat it.

7:28

>> Doesn't even take that long to rot.

7:31

>> It's pretty quick. It's pretty quick.

7:33

You ever seen like those timelapse

7:34

photos where they take a dead animal and

7:35

they let it sit there and you watch it

7:37

get consumed by maggots and it's very

7:39

quick. Yeah.

7:40

>> So, these poor people that go hiking,

7:43

you know, like if you go hiking and

7:44

you're by yourself and you break an

7:46

ankle and you're 15 miles in and you

7:49

don't have a compass and you're kind of

7:51

like roughly judging which hill you came

7:55

over and there's a lot of people that

7:57

just get ahead of themselves and they

7:58

really shouldn't be that far out there

8:00

and they just die. Happens all the time.

8:03

>> Yeah. You know, so like this idea that

8:05

it's like there's you could if you look

8:08

at all the data and you try to find a

8:11

pattern to it and you start imagining

8:13

that there's some grand conspiracy,

8:15

there's some watcher in the woods that's

8:16

consuming people, some demon that's out

8:19

there, you can you get pretty kooky with

8:22

your

8:23

>> I think the popular theory is it's wild

8:24

men.

8:25

>> Oh, wild men.

8:26

>> Wild men.

8:27

>> Oh, like humans.

8:28

>> Yeah. Well, or some human variant that

8:31

are

8:31

>> that's what this guy, this 411 guy

8:33

believes.

8:34

>> I'm not sure because he won't say they

8:37

don't he doesn't he doesn't [laughter]

8:38

actually give his here's what I think is

8:41

going on.

8:43

>> But people ask him and he's like, "Well,

8:44

I have my theories, but he never tells

8:46

you actually what the theories are, you

8:48

know."

8:49

>> Interesting. I wonder why he doesn't

8:50

want to tell. Maybe that's why he's not

8:53

more popular. If he just came out with

8:55

it like Candace. Yeah,

8:57

>> maybe it'd be huge.

8:58

>> Yeah, like those guys that used to be in

8:59

the 90s who were saying that we were

9:01

going underground and killing the

9:02

Nephilim. Remember those? Those guys

9:04

were great.

9:05

>> They You're going down. It's like a man.

9:07

And they were giants. They had three

9:08

rows of teeth and their special forces

9:10

are going down there and taking them

9:11

out. [laughter]

9:15

>> Yeah. There's a whole group of people

9:17

that believe that there's underground

9:18

creatures that live underground and come

9:20

out at night. And there's always been

9:22

like

9:23

>> Yeah. for whatever they are. You know,

9:25

people some people think the grays live

9:27

underground.

9:29

>> You know, there's like there's not a lot

9:31

of mystery left,

9:33

>> you know, outside of places like the

9:35

Amazon, the Congo that are super deep to

9:37

get to. Not a whole lot of mystery left

9:39

in terms of life.

9:40

>> Maybe ocean depths.

9:42

>> Yeah, ocean depths for sure.

9:44

>> That's like the whole new unexplored

9:46

frontier, right? Is ocean depths.

9:48

>> Yeah.

9:49

>> Saying the time I turn on the TV, it's

9:51

like, uh, look at this crazy creature.

9:52

I'm like, "That doesn't exist." And I

9:54

look it up, I'm like, "Wait, that

9:55

>> that exists?" I saw one the other day

9:57

tweeted out. I was like, "Mandela effect

9:58

has to be real." It was a

10:00

>> It's called a Siberian musk deer.

10:03

>> You ever seen a Siberian musk deer?

10:04

>> No.

10:05

>> They have fangs.

10:07

>> Oh, right. I I have seen a fang deer. I

10:10

forget what they call it. Apparently,

10:12

that's the uh You know, do you know what

10:14

elk ivories are?

10:16

>> Yeah.

10:16

>> Yeah. That's It used to be like a tusk

10:19

like way way back in the day.

10:20

>> It's so [ __ ] looking, dude. Yeah,

10:22

the fang deer. Yeah, they're weird. It's

10:24

very strange.

10:25

>> Yeah.

10:26

>> I wonder what they were there for.

10:29

>> Yeah. Well, there's v I found a video

10:30

cuz I was like, "No way, dude. Do these

10:32

things exist?" I thought I was being

10:33

memed, you know? So, [laughter]

10:35

look. And this thing is real. So, I

10:37

found a video of them fighting and they

10:39

use those things as weapons.

10:40

>> Oh, that makes sense. That's the only

10:42

thing that makes sense. Like gorillas.

10:43

The gorillas don't eat meat yet. They

10:46

have these massive fangs.

10:47

>> Yeah.

10:48

>> You know, it's nature is weird. so much

10:51

variation, you know, there's so many

10:54

different types of life. It's and that

10:56

the fact that they all sort of

10:59

synchronize like this one eats that one

11:01

and that one eats this one and this one

11:03

lives there and that one lives. It's

11:04

like it's very fascinating when you

11:06

really look at the just the wide variety

11:09

of species that exist.

11:10

>> Well, most people don't know anything

11:11

about it. Like most people have never we

11:15

live in such a comfortable world

11:18

>> that is completely guarded from

11:21

everything that's out there. And it's

11:24

like if people had a taste of out there.

11:26

>> Yeah.

11:27

>> And I think that the worldview of many

11:28

many people would change very quickly.

11:31

Especially feminist. I think that

11:32

feminists would immediately stop being

11:33

feminist if they just had a taste of

11:35

like well you know people actually did

11:37

have to shut themselves up at night from

11:39

wolves. Yeah.

11:40

>> That was a real thing. Wolves would come

11:42

in and eat you and so you would shut

11:44

yourself in so that that didn't happen.

11:46

>> Well, that's gone so far the other way

11:47

that [ __ ] retards are bringing wolves

11:50

into places.

11:51

>> Yeah, I know. [laughter]

11:53

>> It is so dumb. You know, I have a good

11:56

friend.

11:56

>> Didn't they take it over like in in

11:58

Yellowstone or some place? They

11:59

reintroduced wolves and it just

12:01

decimated the deer population,

12:02

>> the elk population. That's actually

12:05

arguable that that might have been a

12:08

good thing um in some ways because it

12:11

was getting to elk need natural

12:13

predators and mountain lions can only

12:15

kill so many elk. Yeah.

12:17

>> Um, but what's really interesting is

12:19

mountain lions kill way more elk when

12:21

wolves are around cuz the wolves find

12:23

the mountain lions and take their elk.

12:26

And so then the the mountain lions have

12:28

to go kill another deer or you know

12:30

whatever they

12:31

>> Why like just issue more hunting elk

12:33

permits though? Like why do that?

12:34

>> Well, you have to have some natural

12:37

predators in in a good healthy

12:40

ecosystem. And there's a good argument,

12:41

particularly in Montana, that at one

12:43

point in time it had gotten to a point

12:46

where you're going to have like rampant

12:48

disease because they were they were

12:50

issuing these uh they were issuing like

12:53

unlimited or a large amount of tags for

12:56

people in the mid winter so that you can

12:58

catch these elk in deep snow and just

13:00

peck them off because they were having

13:02

so many of them and that they they

13:04

weren't sustainable that they were

13:06

hitting these massive populations. So,

13:07

their populations are down to like

13:11

I want to say less than 40% of what they

13:14

were at their peak when they brought in

13:16

the wolves. But the problem is these

13:18

wolves, like what they did in Colorado

13:20

recently is the dumbest of all time

13:22

because they brought these [ __ ]

13:24

wolves outside of Aspen and they took

13:26

wolves from Washington State, Washington

13:29

State or Oregon, but whatever it was,

13:31

these these wolves from the Pacific

13:33

Northwest were wolves that already had

13:36

been killing cattle. So they captured

13:39

these wolves instead of killing them.

13:40

And then they relocated them to Aspen

13:43

where they're killing cattle. So they

13:45

they relocated him onto my buddy's

13:47

ranch. Like there's five of them.

13:49

>> And he had a cattle ranch, did he?

13:51

>> He didn't have he doesn't have cattle on

13:52

his ranch, but his [ __ ] neighbors do.

13:54

Okay. And his neighbors are losing

13:55

cattle left and right. And he's And so

13:57

now they've killed off a couple of them

13:59

and they're trying to It was a disaster.

14:02

And it's because the governor, the

14:04

governor's husband, he's a uh wildlife

14:07

lover and he thinks it would be amazing

14:10

if we had wolves.

14:11

>> You ever talk to those deer hunters in

14:13

Michigan? in Michigan.

14:14

>> Yeah, they've been pissed off for like

14:16

every deer hunter I know in Michigan has

14:18

been pissed off who's a native for years

14:21

because they all they all used to shoot

14:22

pheasant. That was the big deal in

14:24

Michigan was

14:25

>> pheasant.

14:26

>> And then here's the story I heard. I

14:28

don't know if it's true or not, right?

14:30

Um, but the DNR, the Department of

14:33

Natural Resources, imported a b a bunch

14:36

of western coyotes in order to thin out

14:40

the deer population because the deer

14:42

population was basically mangling all

14:44

these farm crops.

14:46

>> Oh boy.

14:46

>> And those now that's an all you can eat

14:49

buffet for a coyote in Nevada. These

14:52

ground birds that are just these fat

14:54

>> fat little ground birds and they

14:57

decimated the population. So you you'll

14:58

talk to these old deer hunters. Have you

15:00

seen any pheasant? No. Shut up.

15:02

[laughter]

15:03

Shut up.

15:04

>> The interesting thing about that though

15:05

is pheasant's an invasive species.

15:07

That's not a natural North American

15:09

species either.

15:10

>> They brought those [ __ ] over and they

15:12

are delicious. Yeah.

15:13

>> And it's fun to hunt them.

15:14

>> Well, they would always just walk those

15:15

train tracks, those old abandoned train

15:17

tracks, you know, and they'd have the

15:19

dogs. The dogs kick up the pheasant and

15:21

they'd shoot them from the track. Dog

15:22

would bring it up. That was like a

15:23

Michigan pastime.

15:24

>> Yeah. Yeah, the the coyote thing is a

15:26

real problem because coyotes are now

15:28

they used to be a western animal and now

15:29

they're in all 50 states.

15:32

>> Not only that, they're in virtually

15:33

every city in America.

15:35

>> Well, they've been wiping them out in

15:36

Michigan pretty good in the rural areas.

15:38

Oh, yeah. Well, what they do now is they

15:39

have the GPS trackers to put them on the

15:41

dogs.

15:42

>> Old boys will get in with AR-15s. Those

15:45

dogs will run them for 200 miles and

15:48

then they finally take a shot and they

15:50

just will do that all winter long. Man,

15:51

>> that's good. But it's hard to wipe them

15:54

out because what they do is you know

15:56

when you hear coyotes calling it's like

15:58

roll call

16:00

they're letting sometime there's a lot

16:01

of confusion to what they're doing. Some

16:03

people think that they're letting the

16:04

other coyotes know that they've killed

16:06

something that we have food but it's

16:08

also a roll call and when one of the

16:10

coyotes is missing the females have more

16:13

pups.

16:14

>> Really?

16:14

>> Yeah. Some weird natural reaction. Also,

16:16

their natural enemy is gray wolves.

16:20

And when um they evolved, they evolved

16:23

to when the gray wolves kill them because

16:26

the the gray wolves don't breed with

16:28

coyotes, but coyotes do breed with red

16:30

wolves. That's why you have these like

16:31

coywolves on the east coast.

16:33

>> A coyote is a wolf. It's a wolf. It's

16:35

just a small wolf. And so, their natural

16:38

inclination is when they're getting

16:41

chased, they move to a new area and then

16:43

they have even more pups. So that's how

16:45

they've spread out through the entire

16:47

country. So if you go back to like the

16:48

turn of the century, like the 1900s,

16:51

>> coyotes were exclusively a western

16:53

animal.

16:53

>> Yeah.

16:54

>> Now they're in [clears throat] New York

16:55

City.

16:55

>> Yeah. They're everywhere,

16:56

>> which is crazy. They have them in

16:58

Central Park. They have [ __ ] coyotes

16:59

running around Central Park.

17:01

>> Some lady this morning posted on X uh a

17:04

mountain lion in San Francisco sitting

17:05

on a porch

17:07

>> in the city of San Francisco. A big one

17:09

just sitting there. It's like [laughter]

17:12

>> just having a good time. [clears throat]

17:14

Well, that's just cuz California has the

17:16

dumbest [ __ ] laws when it comes to

17:17

those things.

17:18

>> Yeah. Well, they have terrible gun laws,

17:20

too. They have terrible terrible laws.

17:21

They have terrible laws, period.

17:22

>> They have terrible laws. Terrible

17:24

everything.

17:24

>> Terrible politicians. You know, it's a

17:26

shame, too. Like, I grew up in Santa

17:28

Rosa.

17:29

>> And um that's the most beautiful area,

17:33

>> the Napa Valley area. It's the most

17:35

beautiful area on planet Earth. The

17:36

weather's always perfect. It's January

17:39

15th. It might as well be July 15th,

17:42

right? It's always perfect. It's always

17:44

gorgeous. And they [ __ ] it all up.

17:46

>> They [ __ ] it all up.

17:47

>> They [ __ ] it all up.

17:48

>> Yeah.

17:48

>> And they [ __ ] it up real bad, too.

17:50

>> Oh, it's almost unfixable now.

17:53

>> Especially like San Francisco area.

17:56

>> Like the whole Pacific Northwest is

17:58

almost unfixable. It's like they double

18:01

down and they keep going. Like Seattle

18:04

now has a communist mayor.

18:05

>> Yeah.

18:06

>> Who's been living with her parents.

18:07

>> New York. [laughter] They all got

18:09

communists.

18:11

Black Lives Matter had they were their

18:13

head organizers. They were communist,

18:15

devout communists. Like

18:16

>> until it came to buying property with

18:18

Black Lives Matter money.

18:19

>> Yeah.

18:19

>> What's happening with that?

18:20

>> Well, then they're then they're very

18:21

much capitalistic.

18:22

>> How come they're not in trouble? I don't

18:24

understand that. Like they spent

18:25

millions of dollars of that money on

18:28

real estate.

18:29

>> Yeah.

18:30

>> There's only one UFC 325 this Saturday.

18:34

And on DraftKings Sportsbook, the number

18:35

one sports book for live betting. Once

18:38

it's over, your shot to get in on the

18:39

action is gone. DraftKings Sportsbook is

18:42

built for live betting, not just

18:43

pre-fight picks. Because in the UFC, one

18:46

moment can flip the entire fight. One

18:48

punch, one kick, one takedown. New to

18:51

DraftKings, new customers can bet just

18:54

five bucks and get $300 in bonus bets if

18:58

your bet wins with code Rogan. Download

19:00

the DraftKings Sportsbook app and use

19:02

the code Rogan. That's code Rogan to

19:05

turn five bucks into 300 in bonus bets

19:08

if your bet wins in partnership with

19:10

DraftKings. The crown is yours. Gambling

19:13

problem? Call 1-800-GAMBLER. In New York,

19:16

call 877-8-HOPENY or text HOPENY (467369). In

19:20

Connecticut, call 888-789-7777

19:23

or visit ccpg.org. On behalf of Boot Hill

19:25

Casino and Resort in Kansas, pass

19:27

through of per-wager tax may apply in

19:29

Illinois. 21 and over. Age and

19:30

eligibility varies by jurisdiction. Void

19:32

in Ontario. Restrictions apply. Bet must

19:34

win to receive bonus bets which expire

19:35

in 7 days. Minimum odds required. For

19:37

additional terms and responsible gaming

19:39

resources, see dkg.co/audio.

19:42

Limited time offer.

19:43

>> What's going on?

19:44

>> No idea. I have no idea why. Well, I

19:46

don't I don't know why the the heads of

19:48

many of these organizations aren't being

19:49

rounded up and summarily arrested.

19:51

>> Yeah.

19:51

>> I mean, we're watching these I've been

19:53

covering the riots non-stop.

19:55

>> I'm sorry. Protest that the completely

19:58

organic protests, which are totally

19:59

organic. Um,

20:02

>> and it's been interesting to watch. I

20:04

was watching one the other day. We were

20:06

live and it was Don Lemon and he had

20:09

showed up at Minnesota and the first

20:10

thing Don Lemon does, right? I hate Don

20:13

Lemon, by the way. But first thing,

20:14

>> one of the dumbest [ __ ] that

20:16

has ever gotten on television.

20:18

>> He's terrible. First thing he does, he

20:20

gets he drives up in this car. He's in

20:22

the back seat and he jumps out of the

20:23

car and he has this [ __ ]-eating lemon

20:25

smile on his face, you know, and he runs

20:26

over with Starbucks to these people and

20:29

he's like, "Here you go."

20:30

>> Yeah.

20:31

>> And then he jumps back in the car,

20:33

right? And they drive off.

20:35

>> Now, here's here's what's interesting

20:36

about this. He comes back and he's in

20:38

there with the protesters, you know what

20:39

I mean? And he's interviewing them. Most

20:41

of the protesters are saying, "We're

20:42

coming from out of town. We're from this

20:44

state. I'm from two states away. I'm

20:46

from three states away." You know, for

20:48

this totally organic protest. Well, the

20:50

cops, what they start doing, they have

20:52

these guard rails on the the sidewalk in

20:54

front of the ice facility, and there's

20:57

gaps inside of that barrier. And so,

20:59

they pull their police cruisers in just

21:00

to fill those gaps so that they stay

21:02

behind the barrier. And Lemon's like,

21:04

"Why would they do that? Why would they

21:06

why would they keep us compressed uh you

21:08

know, behind this barrier?" And I'm

21:11

thinking, "Because you just stopped your

21:12

car in the middle of the street to run

21:14

across the road and give these guys

21:16

Starbucks, you idiot." you know, they

21:18

want to keep the roadway clear so that

21:19

they can get their people in and out.

21:21

You literally stopped your car in the

21:23

middle of the road, ran across the

21:24

street to give these people Starbucks

21:26

and then got back in your car and

21:28

they're like, why why is it that they're

21:29

trying to get to keep us from getting

21:31

into the road, you [laughter] know? I'm

21:32

like, what are you talking about? I just

21:35

couldn't believe. I was like, what? It's

21:37

amazing when these people that are so

21:39

smug where they're protected by a large

21:42

organization by CNN and then they get

21:45

fired and then they get they're

21:47

basically like like a dog like Carl

21:50

getting released into the woods and then

21:52

they have to fend for themselves and you

21:54

see them in the world of podcasting

21:56

where you don't have anybody writing

21:58

things for you and you have to express

22:00

your own opinions. You're like, "Oh,

22:04

this is the real you." It turns out

22:06

you're a [ __ ] [laughter] that. Wow. I

22:08

didn't know. Whoa. You know, the whole

22:10

time, Matt, you thinking the whole time,

22:13

uh, you know, I never thought I'd be an

22:14

entertainer. I didn't think I'd do

22:15

anything with podcast. Never. Never in a

22:17

million years. I never would have

22:18

thought that.

22:19

>> You were a what were you? An engineer or

22:20

a robot?

22:21

>> Robotics mechanic. Yeah.

22:22

>> A robotics mechanic. Yeah.

22:24

>> How did you get involved in that? Uh

22:26

well, I was a I was a gunsmith for years

22:30

and um there there's no real applicable

22:33

skills outside of that for anything

22:35

actually. There it doesn't really carry

22:37

over in anything. It's really its own

22:39

thing, you know, bluing things like it

22:41

just doesn't carry over.

22:43

>> Um a friend of mine said, "Hey, look,"

22:45

because I told him, I was like, "I need

22:46

a job. Um you know, I I'm I'm not making

22:50

it. What do What do you think?" He's

22:51

like, "You know, you should apply to be

22:53

an industrial mechanic." And I was like,

22:55

I don't know much about it, you know.

22:57

He's like, we'll just go apply. So, I

22:58

did. Took an aptitude test. And so, uh, the guy

23:01

was like, well, I want to hire you at a

23:02

level three, which was high high, you

23:04

know, was like mid-range. Wasn't the

23:06

highest, wasn't the lowest. I was like,

23:09

damn. Okay. You know, what's the pay?

23:10

He's like, was like 30 an hour. You

23:13

know, that to me was was life-changing.

23:16

So, I took the job and I got I didn't

23:17

know what the hell I was doing, but they

23:19

trained me up well. And then, um, there

23:21

were some robots on the floor and I

23:22

started working on those. And then from

23:24

there they trained me in robotics and so

23:27

uh it was it was all done on site.

23:29

>> What kind of automation like automation

23:32

and it was all food related.

23:34

>> Food related.

23:34

>> Mhm. All food robots. Yep. So we weren't

23:37

dealing with Johnny 5. We were dealing

23:39

with like vacuum systems and ovens and

23:42

uh various robots which were associated

23:43

with those. Like for instance there was

23:45

a packaging machine that would just all

23:47

it would do is form boxes. That's it.

23:50

All it did it just that's it. But it

23:52

would form, you know, a thousand boxes a

23:53

minute. And it was a it was a giant

23:56

robot and it had a huge sequence of

23:58

functions on it. You know, when people

23:59

think robot, they always think humanoid.

24:02

But almost no robot

24:03

>> is is in any way humanoid at all. You

24:06

know, they're that's just not what

24:08

they're for.

24:08

>> It is weird, right, that we think of

24:10

robots as like movie robots. We think of

24:13

I, Robot.

24:14

>> Yeah. If if you came across a robot in a

24:16

factory, you would have no idea it was a

24:18

robot. You'd be like, "What the hell is

24:20

that? So, how did you go from that to

24:23

debating people online?

24:25

>> Uh, COVID. So, yeah, the lockdowns

24:28

happened and I was laid off. All the

24:31

food plants in Michigan were shut down,

24:32

especially the meat plants and that's

24:34

where I was. I was in the meat plants

24:36

>> and uh and they all shut down be because

24:39

of the draconian restrictions of one

24:42

Gretchen Whitmer. And um anyway, while

24:45

she was out with, you know, on a boat

24:46

partying with uh you know, with her

24:49

honey, we were all locked out of work,

24:52

right?

24:53

>> So, familiar story.

24:54

>> Yeah. We had the stay at home orders and

24:55

I would argue with these dumb liberals

24:57

on Facebook and uh and they man, they

25:00

pissed me off. And so, I started

25:02

crashing their panels and I would debate

25:04

with them and um you know, I had a lot

25:06

to say and those things started to

25:09

become more and more popular and they

25:10

would move over to YouTube. People would

25:12

clip it. Then I started getting invited

25:14

on to do debates with other people. And

25:16

I didn't know who these people were. It

25:17

wasn't my world. Like I didn't know who

25:18

any of these podcasters were, you know,

25:20

stuff like that. I'd listen to it maybe

25:22

occasionally

25:23

>> online, but like I I couldn't have told

25:25

you who like Vaush was or Destiny or any

25:28

of these people. Like I didn't know who any

25:30

of them were.

25:31

>> Um and I didn't care. To me it was just

25:33

some other dumb smug liberal, you know?

25:35

So um that's where I got my start. I

25:38

never would have foreseen at all that

25:40

I'd be sitting here with you. That's so

25:43

in Well, I never would have foreseen I

25:44

would have been here either.

25:46

>> It's weird, huh?

25:47

>> Yeah. Oh, it's weird. Very weird.

25:48

>> And I'll never get used to it. People

25:50

walk over and they're like, "You're

25:52

Andrew Wilson." And I'm like, "I'm

25:54

[ __ ] nobody." You know what I mean?

25:56

But it's nice to meet you. You know,

25:58

shake their hand, you have a chat with

25:59

them.

26:00

>> Uh, I'll never get used to it.

26:02

>> No, you probably shouldn't. It's

26:04

probably better to not get I'm not used

26:05

to it.

26:05

>> Yeah. Used to it.

26:06

>> It's probably better to not be used to

26:08

it.

26:08

>> Keep you sane.

26:10

>> And maybe keep you humble. Yeah, you

26:12

need something. You need something to

26:14

keep you humble. We all know people that

26:16

did not have something that kept them

26:18

humble and they lost their their way.

26:20

The wheels fall off.

26:21

>> Yeah. They lose their marbles.

26:23

>> Yeah.

26:23

>> Yeah.

26:24

>> Yeah. Especially as you get more and

26:25

more famous. It becomes more and more

26:27

unmanageable.

26:29

>> I was I feel like I I'm pretty well

26:31

grounded uh due to the fact that um I

26:34

didn't come from a political background.

26:36

There's no famous people in my family,

26:38

you know, there's just none of that. And

26:40

so I feel like the the grounding is

26:42

always there because uh you know even

26:44

even from the family you get the call

26:45

like from from my brother for instance

26:48

like he's been calling me the I don't

26:50

know if you can say the F-slur here so I

26:51

won't but he's been calling yeah [ __ ]

26:53

He's been you know he's been like the

26:54

the phone call since I was 15. What are

26:56

you doing [ __ ] Has not changed.

26:58

>> Good.

26:59

>> It has not changed. You know 42. He's

27:01

like happy birthday [ __ ] [laughter]

27:04

>> That's normal.

27:05

>> Yeah.

27:06

>> I remember you were having a

27:07

conversation. I think it was on Piers

27:09

Morgan who is the best cat wrangler in

27:12

the business.

27:13

>> That's what he does. He cat wrangles.

27:14

>> Yeah, I just talked to him briefly.

27:15

>> Is he okay

27:16

>> on Well, that's what I asked him. I just

27:18

Yeah, I sent him a DM and I was like,

27:19

"Hey,

27:19

>> for people who don't know, he fell."

27:21

>> Yeah, he fell

27:22

>> and really [ __ ] himself.

27:23

>> And it was the hip and at his age, the

27:25

hip,

27:26

>> you know, you don't want to Nothing with

27:28

the hip. Every time I see anybody who's

27:30

60s

27:31

>> Mhm.

27:32

>> they get the hip injury, it's it's bad.

27:35

>> Yeah. It's not good. I think they they

27:38

think your lifespan post hip surgery is

27:41

like 10 years.

27:42

>> Yeah, that's right. And that

27:43

>> he'll probably be better than that.

27:44

>> He will.

27:45

>> And I think he's mobile.

27:46

>> Oh, that's good.

27:47

>> He's mobile, but they're

27:48

really good at hip replacements now.

27:50

>> I was like, what's you know, are you are

27:52

you doing? He's like, yeah, you know,

27:53

I'm I'm doing okay. And and I was like,

27:55

don't [ __ ] around with the hips, dude.

27:57

>> It's crazy that he had to have a hip

27:58

replacement. Like, how bad was that

28:00

fall?

28:01

>> Uh yeah, I don't know. I don't know what

28:03

the details of it were, but

28:05

>> I fell like a sack of spuds, he wrote.

28:07

[laughter] Wound up needing a new hip

28:09

after fracturing the neck of his femur

28:12

and is recovering from surgery. In

28:13

addition to being on crutches for 6

28:15

weeks, he won't be allowed to take any

28:17

longhaul flights for at least 12 weeks.

28:19

He tripped on a small step inside of a

28:22

London restaurant in the tumble.

28:25

You think he's drunk?

28:26

>> He could have been a little fired up.

28:27

>> A little drunk. He also is, you know,

28:29

not the most fit or agile guy in the

28:32

world.

28:32

>> True.

28:33

>> You know,

28:33

>> Yeah.

28:34

>> He's only two years older than me.

28:36

>> No way.

28:36

>> Yeah.

28:37

>> Really?

28:38

>> Mhm.

28:39

>> Not crazy.

28:40

>> Yeah. Damn.

28:40

>> Yeah. Some people

28:42

>> don't take care of themselves.

28:43

>> Yeah. He got he got the crack. Yeah. I

28:45

got to quit these. But yeah, he got the

28:47

crack on the on the hip and I was like,

28:49

man.

28:49

>> Do you think that American Spirits are

28:52

better than Marlboros for you?

28:54

>> Probably.

28:55

>> Yeah.

28:56

>> But they taste like [ __ ] Do they? Is

28:58

there a difference? You really like the

28:59

Marlboro taste? You don't think you get

29:01

used to American Spirits?

29:02

>> I think I could.

29:04

>> Um

29:04

>> I've been trying the cigarillos. Those

29:06

have been helpful. Like the little mini

29:08

cigarette cigars.

29:09

>> Oh yeah. Ron White used to smoke those.

29:11

He just quit totally. He went to a

29:13

hypnotist, quit instantaneously. But

29:16

those are loaded with nicotine. Like way

29:20

more nicotine than a cigarette. He was

29:22

smoking those [laughter] little tins. We

29:25

have those.

29:25

>> What are those tins, Jamie? Do you know

29:27

what they are?

29:29

>> It's like a like a famous cigar company

29:32

sells tins of these little tiny cigars.

29:35

And it's great if you don't have the

29:37

time to smoke this. Like you get out of

29:39

a flight, you just want to have a small

29:40

But he inhales these [ __ ] like

29:43

a cigarette.

29:43

>> That's brutal, dude. [laughter] That'll

29:45

do you in. And then washes it down with

29:48

whiskey.

29:48

>> Well, he doesn't drink anymore. He quit

29:50

drinking. Yeah. I think he went

29:53

>> That was always in his bit, though. He

29:54

was always up there smoking and

29:55

drinking. And I always thought,

29:57

>> I love that. It was

29:58

>> Well, he did it. He did it till the

30:00

wheels fell off and then um the drinking

30:03

was the big one. You know, he went to a

30:06

doctor and the doctor's like, "You're

30:08

going to die."

30:09

>> Yeah.

30:09

>> Like your liver is not in good shape.

30:11

Like if you back off now, you'll

30:13

probably live. If you don't, you're not.

30:14

You got like a few years left.

30:16

>> Well, alcoholics too, they have it. I

30:17

don't know if he was one or not. He

30:19

might have just been like a heavy social

30:20

drinker, but like real alcoholics,

30:23

that's no way to live.

30:24

>> No. I mean, uh, they stink.

30:28

>> They They're like I mean, kind of

30:30

everything about a real alcoholic is

30:32

just they look completely unwell.

30:34

>> They're just kind of mangled, you know?

30:36

>> Yeah.

30:37

>> It's a weird disease, too. Uh, in that

30:40

that addiction is one that you can't

30:41

quit.

30:42

>> You can't just cold turkey.

30:44

>> You'll die.

30:44

>> You'll die. There's only a couple of

30:46

things that'll just kill you if you quit

30:48

right away. And alcohol is one of them,

30:50

which is really crazy because it gets

30:52

integrated into your biological system

30:54

where you need it to stay alive. Your

30:56

body's like, "Okay, we're going to use

30:57

this for fuel. Going to use this to

31:00

function."

31:00

>> Yeah. They've been weaning people off

31:02

alcohol with beer for centuries.

31:04

>> Is that what they use? Beer.

31:05

>> Yeah. They used beer. They would just

31:07

go, "Okay." And well, it was pretty

31:08

common to drink beer and ale with

31:10

dinner, you know?

31:11

>> They'd just wean you off with beer.

31:13

>> That makes sense.

31:14

>> They knew. They knew hundreds of years

31:15

ago there's books on how alcohol, you

31:18

know, what did they call it, consumption or

31:20

whatever they called it.

31:21

>> Uh they killed you if you just if you

31:24

just quit if you're an alcoholic and so

31:25

they'd wean off with beer.

31:26

>> Wonder when they started making hard

31:29

liquor cuz you would imagine like

31:31

fermented things like wine and beer were

31:34

like the first things that people

31:35

consumed.

31:36

>> I think it's been around for a thousand I

31:38

mean several thousand years.

31:39

>> I wonder I wonder like how they figured

31:41

it out. The biggest uh what

31:43

was it the biggest distributor in Europe

31:45

of wine was the Catholic Church.

31:47

>> Well, wine has certainly been around

31:48

forever, but like what about hard

31:50

liquor? Jamie, put that into our sponsor

31:52

perplexity. When was the first what

31:55

known? I mean, we don't really know

31:57

because there's so much weird [ __ ] about

31:58

history, but like what was the first

32:02

in like documented hard liquor like

32:07

whiskey, vodka, [ __ ] like that? That's

32:09

the That's the stuff that kills you. If

32:11

you die from beer, boy, you you're

32:13

[ __ ] you're going hard. Like Shane

32:15

Gillis will sit here on a podcast and

32:16

drink 16 Bud Lights.

32:20

Um, the first alcoholic drinks were fermented

32:23

things like beer, wine, mead. Okay.

32:25

Thousands of years before true liquor.

32:27

Okay. First recognizable liquor

32:30

appears when people began to distill.

32:33

Archaeological evidence shows fermented

32:35

drinks. Okay. That's around 7000 BCE.

32:38

So, clear evidence of true alcohol

32:41

distillation. Chinese rice beer

32:44

distillates

32:46

by about 800 BCE.

32:48

>> Yeah.

32:48

>> So a couple thousand years.

32:50

>> Couple thousand years. Yeah.

32:51

>> Yeah.

32:53

Okay. Wine into strong spirits. The Arab

32:57

alchemists.

32:59

Al-kuhl. Oh, interesting. Using the term

33:01

al-kuhl, the root of alcohol.

33:04

Well, they used they used alcohol as

33:07

base for for alchemy, too. That was a

33:10

base for trying to transmute metal.

33:12

Yeah.

33:12

>> I wonder if they were ever successful.

33:14

>> They were never successful.

33:15

>> Nothing.

33:15

>> No.

33:16

>> It seems like a crazy thing to waste so

33:18

much time on trying to turn lead into

33:20

gold.

33:20

>> I mean, there were whole kingdoms

33:23

spent trying to figure out how to do

33:24

this.

33:25

>> Wild.

33:25

>> And it's just like and they never I

33:27

mean, you think about it, it makes

33:28

sense, right? If you're the first one,

33:30

>> if you're the one who knows

33:32

>> Mhm. like you can just create as much

33:34

wealth for yourself as you want.

33:36

>> Oh yeah.

33:37

>> And uh

33:37

>> it's just amazing that they kept trying.

33:40

Must have been someone saying that they

33:41

got it. I got it, dude. Just give me

33:43

some money.

33:43

>> Oh, there's tons of frauds.

33:44

>> Yeah,

33:44

>> there were tons of frauds who were

33:46

alchemists who, you know, were that

33:48

century's version of a snake oil

33:50

salesman, you know. Yeah, of course we

33:52

can we can turn and there there was ones

33:54

even in the '90s who were like, we can

33:56

now turn, you know, base metals into

33:58

gold. I think there's something now

34:02

where they can make some gold, but I

34:06

think it takes an incredible amount of

34:08

energy

34:10

>> and

34:10

>> cost more to make than it's worth,

34:11

>> right? I think it's one of them deals.

34:14

Is that a fact? I feel like I've read

34:16

something like that fairly recently,

34:20

but here's the weird one. Why gold?

34:23

Like, why does anybody give a [ __ ] about

34:25

this metal that you can't even use?

34:26

>> There's not much of it,

34:27

>> right? That's true.

34:28

>> Is that what they're saying? There's um 90% of

34:31

all the gold ever discovered still in

34:32

circulation,

34:34

>> right? Yeah. Well, China just found a

34:37

huge vein of gold. An enormous amount.

34:40

But I mean, when you say enormous, it's

34:42

like relative. Yeah. Because I think the

34:44

entire world supply of gold will fit

34:46

inside of a football field.

34:48

>> Yeah. There's not much of it.

34:49

>> Yeah.

34:49

>> It's very I mean, and very little of it

34:51

is worth a lot. I mean, even if you

34:53

think a pirates treasure chest, you

34:54

know, it's not actually that much gold,

34:56

>> right? So it's yeah it's

34:59

>> it's a box of gold

35:00

>> extremely valuable and also you can do

35:02

things with it you can't do with other

35:03

metals and same thing with silver you

35:05

know silver

35:07

>> scientists mimicking the big bang

35:09

accidentally turn lead into gold yeah so

35:12

this is the thing but I mean again I

35:14

think mimicking the big bang like what

35:16

are they using a particle collider like

35:17

what are they doing

35:20

Okay, how they do it: how to steal a

35:23

proton. Protons are found in the nucleus of an

35:24

atom. So, extremely small amounts.

35:27

In fact, a total of some 29 trillionths

35:30

of a gram they made. Okay. Smashing lead

35:32

atoms into each other. Extreme. So, it

35:34

is a particle collider. I guess uh those

35:37

working on the ALICE experiment and the

35:40

Large Hadron Collider. Yeah, there it

35:41

is. In Switzerland. It incidentally

35:44

produced small amounts of gold.

35:45

>> You just need a you just need a

35:47

particle collider. That's all. No big deal. No

35:50

big deal. You just build it into a whole

35:52

mountain. And now they're building a

35:53

second one. They said they're building

35:55

into a new mountain,

35:56

>> right? [laughter]

35:58

>> Well, there was one they were putting

35:59

during the Clinton administration. They

36:01

were building a particle collider

36:03

somewhere in the middle of America. I'm

36:06

trying to figure out where it was.

36:08

>> I'm certain there's already more than

36:09

one.

36:10

>> Oh, there's many particle colliders.

36:11

>> Yeah. But I'm certain there's ones that

36:13

are

36:14

>> that are even probably larger and hidden

36:15

than the one that's currently there.

36:17

Yeah.

36:17

>> Really? You think so?

36:18

>> Yeah. Oh, for sure.

36:19

>> What do you think they're doing with

36:20

them? Well, I mean, the military

36:21

applications for that are like they're

36:24

enormous. The idea that uh you could

36:28

make like some kind of particle weapon,

36:30

you know, or something like this.

36:32

>> Oh, right. Right. Right.

36:33

>> Um Yeah. There's no way that the that

36:35

the US military is going to let

36:37

scientists have a gadget like that

36:39

somewhere that they don't have complete

36:41

control over. There's no way. I wonder

36:43

because I I don't know what kind of

36:46

military applications you would have for

36:49

particle colliders.

36:51

I mean, for sure,

36:52

>> big explosions, right?

36:54

>> There's probably Yeah, but you're just

36:56

you've got a giant loop and you're

36:58

slinging

36:59

>> smashing things together, right?

37:00

>> Yeah.

37:01

>> So, what I mean, what do you do

37:03

together? You can make them go boomy

37:05

boom, right?

37:06

>> Kind of. Well, the real concern with the

37:08

Large Hadron Collider is they were going

37:09

to create mini black holes that were

37:10

going to eat their way through the Earth

37:12

that you wouldn't be able to stop them.

37:13

They would just like slide through the

37:15

Earth,

37:16

>> you know.

37:16

>> Yeah, I heard that. I heard they were

37:18

concerned they were going to open up a

37:20

portal to a different dimension. I've

37:22

heard like I've Yeah, I've heard all

37:24

sorts of Yeah, we changed our timeline.

37:27

>> We're on the new timeline. You know, the

37:29

uh the whole nine yards. I I've I've

37:31

heard it all. I'm just saying that

37:33

anything. It's just been my experience.

37:35

Look, when I look through the historic

37:37

record that if there's any scientific

37:39

gadget out there that looks like it has

37:41

the potential to make something go boom,

37:43

the United States military has a version

37:45

of it somewhere.

37:45

>> Yeah, that makes sense. That makes

37:47

sense. But for whatever reason, they

37:49

abandoned this one during the Clinton

37:52

administration. I I don't remember why

37:55

they abandoned it, but people

37:57

can, if they have access to the area

37:59

where it's at, still go inside of it

38:02

and see like what they started to build,

38:04

but they never did. But it would have

38:05

been larger than the Large Hadron

38:07

Collider. I want to say it's in Georgia.

38:10

>> I don't remember though. But it's a it

38:12

was going to be an enormous particle

38:14

collider.

38:14

>> Yeah.

38:15

>> And for some reason they just stopped

38:16

funding to this thing.

38:19

But

38:19

>> well, they're talking about funding

38:21

another one that's twice the

38:22

size of the one that they have now. They

38:24

say that they need more room to smash

38:27

more particles together.

38:28

>> What are they trying to do?

38:29

>> I have no idea. Like that is way outside

38:32

of of my domain. Um, I can tell you I

38:36

probably the same things you've heard,

38:38

right, is uh they're trying to smash

38:41

small particles together to see what

38:42

happens. That's what the kind of the

38:44

official story is. But it's funny

38:46

because every time a new story comes

38:47

out, it's like

38:48

>> scientists smash this together with this

38:50

and this happens. And I'm always like,

38:51

"Okay, well, what does that mean?" And

38:52

you never get any of that. Right.

38:54

>> Right.

38:55

>> This one was in Texas.

38:56

>> Oh, was in Texas. Yeah. The Clinton.

38:58

Okay, that's it.

38:58

>> Yeah.

38:59

>> Yeah.

38:59

>> Spent $2 billion on it and abandoned it.

39:02

I wonder. See if you can find some

39:04

images of it.

39:05

>> Yeah, it's outside of Dallas.

39:06

>> Oh, it is outside of Dallas. Okay.

39:09

>> Abandoned superconducting super collider

39:11

site in 2008. Wow. I wonder if you could

39:14

buy it. That'd be [ __ ] awesome.

39:16

>> Get Rogan his own [laughter]

39:19

particle collider.

39:20

>> I mean, there's nothing there. It's just

39:21

concrete.

39:23

Finish it.

39:23

>> What's the big deal?

39:24

>> Like that weird time machine out in the

39:26

desert. That was really funny.

39:27

>> Let me set up an archery range inside of

39:28

it.

39:28

>> Confirmed this stuff last week.

39:30

Department of War confirms plans to

39:32

scale direct energy weapons. Did you see

39:34

that thing with China?

39:35

>> Why would they need a hadron collider

39:36

though, right? Or you know, particle

39:39

well because they want to make stuff go

39:41

boom.

39:41

>> Yeah, direct energy weapons.

39:44

>> Yes, Department of War has direct energy

39:45

weapons. Yes, we are scaling them. Wow.

39:48

>> Conspiracy theorist went wild over this.

39:50

>> Told you.

39:51

>> Well, that was a lot of people. The

39:52

really

39:53

>> Fallout rifle though. I do. I want my

39:55

plasma rifle, right?

39:56

>> I do. Like if they have plasma rifles,

39:58

you're going to buy one, right? Oh,

39:59

yeah. Oh, for sure.

40:00

>> You'd probably get a tax stamp. It'd

40:02

probably be like a lengthy thing to get.

40:04

>> Um I'm glad you're a gun guy because I

40:07

want I wanted to bring up this whole

40:08

thing with this guy uh Pretti. Yeah.

40:11

>> And I I haven't talked about it. We

40:12

haven't done a podcast since that guy

40:14

got killed. Um but that whole thing

40:19

there's a lot of people that don't

40:21

understand what's going on and um why

40:24

riots only in Minneapolis and why riots

40:28

in the place where there's an ungodly

40:31

amount of fraud that has been discovered

40:34

>> coincidentally right around the same

40:35

time.

40:36

>> Exactly. Like instantaneously afterwards

40:38

the narrative completely changes.

40:40

Everybody forgets about the fraud. Now

40:42

all anybody cares about is ICE and

40:44

fascists and Nazis.

40:46

>> Yeah.

40:46

>> And um

40:48

>> it's uh there's a you know what a color

40:50

revolution is

40:51

>> of course

40:51

>> and for people that don't it's it's a

40:54

coordinated effort to cause chaos and

40:58

this is a very coordinated thing. The

41:00

idea that this is an organic protest

41:03

that these riots are organic, is nonsense.

41:06

Um it's provably nonsense because now

41:08

they have access to the signal chats. So

41:10

they know that these So these people

41:13

that

41:13

>> Cam Higby, by the way, yes, he's

41:15

been on the front lines of this. Um, The

41:18

Crucible has been a big supporter of

41:20

that effort. Uh, my channel, um, I will

41:24

often snipe his coverage while it's

41:26

going on, send my audience over to send

41:28

in super chats in order to keep this guy

41:30

going. I think that that work is

41:32

critical.

41:33

>> Yes,

41:33

>> it's critical work. And there's not that

41:35

many people doing it anymore because of

41:37

how dangerous it has become.

41:39

>> Yes. And so I'm a big supporter of that.

41:41

Doesn't mean I agree with everything he

41:42

says politically, but what he's doing on

41:44

the ground there needs to happen,

41:46

>> right? We need you need to understand

41:48

that this isn't organic. Regardless of

41:50

how you feel, I don't feel that that guy

41:52

should have been shot. Um, but I

41:55

understand what happened and what

41:57

happened was chaos. So, what happened? First of

42:00

all, it wasn't ICE. People need to

42:02

understand that [clears throat]

42:04

it was uh Customs and Border Protection people.

42:06

So they were brought in to assist ICE.

42:08

Um, and they're telling this lady to

42:11

stand away and then this cop gets very

42:14

aggressive and shoves her. Um, you have

42:17

to understand the situation that they're

42:20

in, right? And this is not making an

42:22

excuse for any of it, but you have to

42:24

just just to put it into context. These

42:26

people are getting harassed outside of

42:28

any hotel they're at. People blow horns.

42:30

They try to smash into the hotel. They

42:33

dox them. That's why they're wearing

42:35

masks. It's a coordinated effort. I'm

42:37

not saying that guy should have shoved

42:38

that guy. I don't think he should have

42:39

or that woman. I don't think he should

42:41

have. And then pepper-sprayed. And

42:43

then the guy who got shot, Pretti, he

42:46

steps in, which is if you know anything

42:50

about concealed carry, if you are a

42:53

concealed carry holder and you are

42:55

carrying not just a pistol, but two full

42:57

magazines as well, you do not ever

43:01

physically engage with someone. You also

43:04

are supposed to carry your license on

43:05

you and you're supposed to uh you're

43:08

supposed to have ID on you. All right.

43:09

You you

43:10

>> and you're trained specifically for

43:12

this. By the way, I was a CPL instructor

43:14

for years.

43:14

>> Okay. So, you know about it.

43:16

>> The the thing is there's a framework

43:18

here, if you don't mind, if I add to your

43:19

framework.

43:21

>> The framework here is this is a

43:22

mathematical formula. So, I've been

43:24

following these extremely closely live

43:27

um and looking at at how this is done.

43:30

Let's go backwards in time. You remember

43:32

what was going on in California? Nobody

43:35

died in California. There was an ICE

43:37

raid on a Home Depot and they went nuts

43:39

and they started smashing police cars.

43:41

They were starting fires, right? This

43:44

was not over somebody dying. And now

43:46

the narrative, they're trying to make

43:47

the narrative shift. The Gestapo's in

43:49

here, you know, murdering American

43:51

citizens. Well, and what what was going

43:54

on in California then? Because there were

43:56

no American citizens getting murdered

43:57

there. What was going on there was they

43:59

did an ICE raid in a Home Depot, which

44:01

anybody who's been to California knows

44:03

that uh you know there's it used to be

44:06

that you'd drive down the street and

44:07

they would all hang out in front of the

44:08

Home Depot and you'd say two, right?

44:10

>> And they'd hop in the truck and you

44:12

would you know they would go Yeah. They

44:13

were day laborers, right?

44:15

>> So it didn't surprise me that they were

44:16

there doing daily raids. Okay, that

44:18

doesn't surprise me a bit.

44:19

>> Uh and they they all went ballistic.

44:22

>> Now here's what was very curious about

44:24

the coverage of that. And I had a debate

44:25

with a couple of leftists on this.

44:28

What I saw was what looked to me to be a

44:30

police stand-down order. There were people

44:32

who were breaking into I don't remember

44:34

if it was an Amoco or a 7-Eleven, but

44:36

they were busting into it. The cops were

44:38

were on the side corner watching this go

44:40

down and not doing anything. They didn't do

44:42

anything about it.

44:42

>> Right.

44:43

>> Okay. They if it got too rowdy, they'd

44:45

clear it out and then they let them

44:46

continue. It looked like a stand-down

44:48

order, like you don't you don't involve

44:50

yourself. Well, what I think these guys

44:52

have figured out is a mathematical

44:54

formula and it works like this. The if

44:56

the local police are not going to

44:57

protect the federal buildings, then it's

44:59

left to the federal police to do this,

45:01

right? In this case, ICE is going to

45:03

protect its own buildings. The FBI is

45:05

going to protect its own buildings. If

45:07

the local police aren't going to protect

45:08

it and it's surrounded, then who who

45:11

does the protection then? And this is

45:12

why Trump, he unleashes the National

45:15

Guard, but where? To those federal

45:17

buildings to protect those federal

45:19

buildings. That was the whole point of

45:20

it. That's And basically anytime he's

45:23

unleashed a National Guard that I've

45:24

seen, it's to federal buildings to

45:26

protect them. And so, uh, the

45:29

mathematical formula works like this.

45:31

The longer it is that protesters are

45:33

engaging with federal officers whose job

45:36

is not to do basic street cleanup of

45:39

thugs, that's the local PD's job. Uh,

45:42

the chances that there's an incident,

45:44

which is going to be a bad incident, is

45:46

going to occur. So basically the longer

45:47

you're there, the the more attrition

45:50

there is, the more engagements you have

45:52

with with these federal officers over

45:54

time, eventually yes,

45:56

>> there's going to be something which is

45:58

out of pocket that happens or something

46:00

which is escalatory that happens and

46:02

they're banking on that. And that's why

46:04

ICE is out in front of these or not ICE,

46:06

the Antifa people are still out in front

46:08

of the ICE buildings in many

46:10

states night after night after night.

46:12

And it's designed specifically to make

46:14

sure. It's just a math formula, right?

46:16

The longer we're here and the less the

46:18

local PD involves itself, the more

46:20

chance of incident between federal

46:22

officers and us.

46:23

>> You're knocking steel against flint.

46:25

>> Yep.

46:25

>> Yeah. You're you're

46:26

>> waiting for the fire.

46:27

>> You're waiting for sparks.

46:28

>> And um in this particular instance, this

46:32

guy uh clearly had been very involved.

46:36

Uh I don't know if he was a part of the

46:38

signal chats, but when you go to what's

46:41

supposed to be a peaceful protest and

46:43

you're fully armed like that with two

46:45

magazines, it's kind of crazy, right?

46:48

Like what do you why do you need so many

46:50

bullets?

46:50

>> Now the liberal said prosecond

46:52

amendment, too.

46:53

>> That is wild.

46:55

>> Wild, which I'm for.

46:56

>> He had every right.

46:58

>> Like tell He didn't last month, but

47:00

okay.

47:01

>> I like it. I like where it's going. I

47:03

like that cuz that's kind of a trap. Um,

47:05

[laughter]

47:06

did you see what MSNBC did to his image?

47:08

>> Yeah. Were they Were they gusty?

47:10

>> They did the opposite of what CNN did to

47:12

me. You know, CNN during the COVID times

47:15

turned me green

47:16

>> and uh they made me ugly and look like I

47:18

was dying and they made him handsome.

47:20

Yeah.

47:20

>> So people would be more sympathetic to

47:22

him getting shot, which is kind of wild.

47:24

Like are ugly people less valuable to

47:27

MSNBC?

47:28

>> Less marketable?

47:29

>> That is crazy to me. Like look at the

47:33

difference.

47:33

>> Yeah. Look at the difference. They

47:35

shortened up his face. They gave him a

47:37

little bit of a tan. They widened his

47:39

face a little bit. It seems like they

47:41

just made him a little handsomer.

47:43

>> Yeah. Little hotter.

47:44

>> They gave him gave him a bit of that

47:45

Chad jaw, didn't they?

47:46

>> They shrunk his nose a little, too,

47:48

didn't they? They did. They shrunk his

47:50

nose. Gave him a little bit of a

47:51

handsome jaw. So, he looks like

47:54

>> And they made him, if you look at the

47:55

shoulders, it even looks like they uh

47:57

they may have plumped up the shoulders

47:58

there a bit.

47:59

>> A little bit. Yeah. The one on the right

48:01

looks like looks like he's a little

48:02

plumper. Yeah.

48:03

>> Yeah. Yeah. They they changed the tone

48:06

of the color. Wild.

48:08

I mean, they look at they they changed

48:10

his [ __ ] teeth, man.

48:12

>> Communist news now.

48:13

>> They gave him veneers.

48:14

>> Yeah.

48:14

>> Like look at the difference in his

48:15

teeth. He's a much more handsome guy.

48:18

Like the one on the right is like the

48:20

handsome brother and the one on the left

48:22

is like, "Fuck, why couldn't I look like

48:24

the one on the right?"

48:25

>> Yeah. The one on the right, he they were

48:27

twins and he took more of the protein.

48:29

Right. Right. That was what that was

48:30

what happened. The thing is is like this

48:32

doesn't surprise me by the way. Uh this

48:35

is um what's going on, and this is a

48:39

well-orchestrated, well-crafted thing.

48:41

Yeah.

48:41

>> And the signal chats prove that but we

48:43

knew it anyway.

48:44

>> Yes.

48:45

>> Involving government by the way

48:47

allegedly at least involving Minnesota

48:49

state government.

48:50

>> Well, it involves Walz.

48:51

>> Yes.

48:51

>> So that's not alleged,

48:53

>> right? That's not alleged.

48:54

It's not alleged that it involves Walz.

48:55

It's not alleged that it involves Frey.

48:57

Right.

48:58

>> And it's not alleged. Well, what is

48:59

alleged is the fraud, of

49:01

course, but

49:02

>> but there would be a reason why you

49:04

would want to distract from all that

49:06

fraud and that would motivate you to do

49:09

something along these lines. So, let's

49:11

go back to the the instance.

49:13

>> So, um you've got these cops, these

49:16

CBP guys, that are on high alert,

49:20

right? There's a lot of tension, people

49:22

are screaming. If you're in an

49:23

environment like that all day, like I've

49:25

never been a police officer, but I was a

49:27

security guard. And when I was I was a

49:29

security guard for Great Woods, and by

49:31

the way, I'm not comparing this in any

49:33

way, but I'm just explaining my

49:34

mentality when I was there. It was very

49:38

much us versus them. It was a small

49:41

group of guys that were uh working at uh

49:46

I worked at Great Woods Center for the

49:48

Performing Arts in Mansfield,

49:49

Massachusetts. It's a concert venue in

49:51

Mansfield. And this was when I was

49:53

fighting. So it was me and uh a bunch of

49:57

guys from my taekwondo team got hired

50:00

to be security guards. One of the guys

50:02

came and said, "Hey, you guys want to uh

50:04

get a job working as a security guard?

50:06

It's great. You get to see concerts."

50:08

And it was like a good pay. And you

50:10

know, I was doing a bunch of random jobs

50:12

back then while I was competing just to

50:14

sort of pay bills. And I said, "Yeah,

50:16

okay. What do I have to do?" And like

50:17

it's nothing. You just go there and you

50:19

work. First day on the job. I go there,

50:22

some guy had stolen one of the security

50:24

golf carts. So there's this dude named

50:26

Alleycat. He was the head guy of

50:27

security. He was a [ __ ] character.

50:29

Hilarious. His main dream was to open up

50:32

a bar: Alleycat's Libations and

50:36

Victuals. He had this whole dream of

50:38

like just a real character. But this guy

50:40

was a hardcore [ __ ] And uh they

50:44

caught the guy who uh stole this golf

50:47

cart, tackled him to the ground, and he

50:49

was beating him in the face with a

50:51

walkietalkie. This is my first day on

50:53

the job. So I'm like, "Okay, so this is

50:56

what we're doing." And we kind of became

50:59

like almost like cops for this place,

51:01

but there was very much an us versus

51:03

them mentality. And uh it turns out it

51:06

was a lot more involved than I ever

51:08

thought it was. And then one day I was

51:09

at a Neil Young concert. I was working

51:11

the Neil Young concert and riots broke

51:13

out. There was fire. It was cold out and

51:16

there was like a grassy area. So there

51:18

was like a lawn. So it was like there's

51:20

the inside, not inside, it was like an

51:22

outdoor concert venue, but there was a

51:24

roof to part of it and then the back of

51:26

it was like this lawn area that was in

51:28

the back and these guys had started

51:30

bonfires up there and we we were

51:32

supposed to go in there and break up the

51:33

bonfires. And then my friend Larry, who

51:36

is like one of the most mildmannered

51:38

guys you would ever want to meet, but

51:40

you know, an elite black belt. He gets in

51:43

a fight with this guy and some guy

51:45

pushes him and he knocks this guy down.

51:48

And I'm like, okay, chaos is broken out.

51:50

Let's get the [ __ ] I'm like, let's quit.

51:52

Let's get the [ __ ] out of here. And I

51:53

used to wear a hoodie. I used to carry a

51:55

hoodie so I could just zip up the hoodie

51:57

over my security outfit and like, bye.

51:59

Um because I knew there was going to

52:00

come a time where I was like, I'm not

52:02

getting shot, stabbed, killed, whatever,

52:03

stomped.

52:04

for 20.

52:06

So I wound up leaving that day. But

52:09

>> There was a very clear sense of, oh, this

52:12

is probably what

52:14

happens with cops times a million. Like

52:16

you develop this us versus them because

52:18

it was very much us. We would meet up at

52:21

the beginning of our shift. We would all

52:22

talk about what's going down. We mostly

52:24

we were catching people that were

52:26

bringing in alcohol like women in their

52:27

purses would you know you know like uh

52:30

some Carly Simon or something be

52:33

playing. and they'd sneak in a bottle of

52:34

wine that you know and James Taylor, you

52:37

know, there was a lot of that. And so we

52:39

would we'd have like literal

52:42

[ __ ] trash cans filled with bottles

52:45

of wine and liquor at the end of the

52:46

night. We would get to keep them. We'd

52:48

take them home. And so this us versus

52:50

them.

52:51

>> That's a nice perk.

52:52

>> It was kind of fun.

52:52

>> Yeah, that's a nice perk. It's like

52:54

>> also it was illegal for me to drink. I was only

52:56

19 at the time. But

52:58

>> an even nicer perk. Yeah, [laughter] it

53:00

was very clearly us versus them and the

53:04

tensions were very high.

53:06

>> Like whenever some weird [ __ ] went down,

53:08

everybody puffed up their chest and

53:09

everybody was ready to throw down. And I

53:11

was like, "This job is not good." But it

53:14

it educated me. I was like, "Okay." And

53:16

In my mind I was like, "Okay, it

53:18

must be like this when you're a police

53:19

officer, again times a million." Uh, that's

53:23

the way to think of what's going on.

53:24

>> Why they have their codes, right? They

53:26

have the they have their oaths they take

53:28

and then they have their little codes to

53:29

each other too.

53:30

>> Exactly. Exactly.

53:31

>> But I wouldn't I don't blame them. Like

53:33

it seems completely there's a a certain

53:35

wisdom to this

53:37

>> like hey look that could be me

53:40

>> and so if it's you I'm going to be right

53:42

there with you and then if it's me

53:44

you're going to be right there with me.

53:45

I get it. Also, there's a tremendous

53:48

amount of social media content that

53:50

anybody could access at any given time

53:52

where a lot of these dorks are calling

53:54

for violence. You know, it's just it's

53:57

all over the place. You could find it.

53:59

The least likely people that would ever

54:01

be involved in any sort of an

54:03

altercation are on TikTok calling for

54:06

violence. We got to kill these

54:08

[ __ ] We got to shoot these

54:09

[ __ ] And these guys are out

54:11

there in the middle of that. All right.

54:13

So, tensions are high as [ __ ] And

54:15

they're getting screamed at all the

54:17

time. They're on red alert. They're

54:18

wearing vests. They're carrying guns.

54:20

>> Well, their wives are getting called and

54:21

threatened. And uh they're saying

54:22

they're gonna rape their kids. And

54:24

they're saying that they're going to

54:24

brutalize their family members. And they

54:26

give them calls in the middle of the

54:27

night. And they whisper to them, "Well,

54:29

how's your dad such and such doing, you

54:31

know, and just cryptic things like that

54:33

and let it go."

54:33

>> Uh-huh. Exactly. And this is very

54:36

coordinated. It's very coordinated and

54:38

organized. And the way they find out all

54:40

their information,

54:42

it's very creepy. So again, I don't

54:45

think this guy should have pushed that

54:46

lady. I mean, the way he did it was very

54:48

violent. She was a small woman and he he

54:50

shoved her very violently to the ground.

54:52

Then this other guy, Prey, gets in

54:54

between them, okay? Which again, if

54:57

you're a concealed carry holder, is a

54:59

giant no. You do not [ __ ] do that.

55:01

You do not engage with law enforcement

55:03

when you're armed. You shouldn't engage

55:05

with anyone

55:06

>> ever. Ever. I mean,

55:08

>> you should be avoiding You should be

55:09

trained to avoid conflict. Yes. That's

55:11

the whole thing is like uh if you're if

55:14

you're armed, you move into that next

55:15

level of you need to really be avoiding

55:18

conflict. You're not supposed to be in

55:19

bars drinking. Exactly. You're not

55:21

supposed to be uh you know at big

55:23

parties and things like this where

55:24

violent things can occur.

55:25

>> Exactly.

55:26

>> You know, you can take it to church,

55:27

defend the church. Other than that,

55:29

>> like you're supposed to be avoiding

55:31

conflict.

55:31

>> Exactly. So, uh, he gets in between the

55:36

officer and this woman, puts his hands

55:38

on the officer,

55:40

>> and then he gets pepper-sprayed.

55:42

They go to the ground. There's a lot of

55:45

scrambling going on. Now, you have to

55:47

understand what happens when you get

55:48

pepper-sprayed. Okay. Uh, I've never

55:50

been pepper-sprayed, but I did get

55:52

teargassed once during Fear Factor. We

55:55

did a Fear Factor stunt where these

55:56

people had to I forget what they had to

55:58

do but we we had built this there was

56:01

like a structure and they were inside

56:03

the structure and they released tear gas

56:04

in this chamber. I got hit with it. It's

56:06

pretty brutal but

56:07

>> it's painful.

56:08

>> Yeah. It sucks

56:09

>> and you can't breathe.

56:10

>> You can't breathe. Your eyes swell up.

56:12

You your nose starts running like crazy.

56:14

Yeah.

56:15

>> Um

56:15

>> and that [ __ ] stays on your clothes.

56:17

>> You don't think well when that happens.

56:19

So this this guy's clearly not thinking

56:21

well and he can't see and he's you know

56:24

and then they're on him, right? So

56:27

they're on him and then

56:29

one guy whether he yells out he's got a

56:33

gun or grabs the gun first. I I'm not

56:36

sure. But there he has a gun. So they

56:40

see his gun in the middle of the

56:41

scramble. The guy pulls his gun out and

56:44

moves off. Now this is where it gets

56:46

this is where it gets weird.

56:48

I believe the gun was a Sig P320.

56:52

A Sig P320 is known for having

56:57

accidental discharges. It is uh it has a

57:00

reputation for it. It has a very

57:02

specific type of striker. It doesn't

57:04

have uh a safety the way some other guns

57:07

do. And you can have negligent

57:10

discharges with SIGs.

57:11

>> Now, is the P320 a hammer-fired

57:14

model?

57:14

>> Yes.

57:15

>> Okay. So, it's not a striker fire.

57:16

>> No, wait a minute. No, no, it is a

57:18

striker fire. It's Let's Well, let's

57:19

let's pull Let's pull Let's pull it up

57:21

cuz I'm not I know the 365 is built very

57:24

differently. The the 320 breaks cleaner,

57:28

>> but

57:28

>> I thought the 320 had a hammer was

57:30

double action and single action. And

57:32

then I I

57:34

>> You might be right.

57:34

>> I didn't think it was a striker fire.

57:36

Most of the P models are not striker

57:38

fires that I'm aware of. I could be

57:40

wrong.

57:40

>> Well, the 365 the P365 is definitely

57:42

different than the 320. Yeah,

57:44

>> they have a different striking

57:45

mechanism. They're known for accidental

57:47

discharges.

57:48

>> Okay. Um I can't tell if that's a hammer

57:51

underneath the slide.

57:52

>> Well, let's um just just What is this?

57:56

What is the trigger mechanism of a Sig

57:59

P320? Put that in there. And what is it

58:02

that makes it prone to accidental

58:05

discharges? If you look up SIG P320

58:08

online in any search engine, accidental

58:10

discharge comes up very quickly.

58:12

>> Oh, it is striker fired. Yeah. Okay.

58:13

Okay.

58:14

>> So, it's a modular striker fire trigger

58:15

mechanism. Uh when pressed, the trigger bar

58:18

moves forward, disengaging the safety

58:20

lever and sear, releasing the striker.

58:21

Okay. So, it's a striker-fired pistol.

58:23

Okay. Got it.

58:24

>> So, as of uh 2017,

58:28

SIG changed the way they make their guns

58:31

because the trigger itself was heavier

58:35

than what it is now. And not just the

58:37

pull, but the actual mechanism of the

58:38

trigger was heavier if you drop it. So,

58:41

if uh this is the barrel of the gun

58:43

where the bullet comes out and this is

58:44

where you're holding on your hand, if

58:46

you drop it, it'll discharge, and

58:48

it'll discharge without

58:50

moving the slide, which is kind of crazy

58:52

because what happens is something in the

58:55

dropping it on the back where the handle

58:58

is,

58:58

>> releases the disconnector probably.

59:00

>> It causes that heavier trigger, the

59:02

heavier weight to the trigger to drop

59:04

down and it will discharge.

59:06

>> Okay. Uh, as of 2017, they made the

59:09

trigger lighter and it doesn't do that

59:10

anymore. And there's a whole YouTube

59:12

video where this guy explains it and

59:14

shows that you can do it with the older

59:16

models. If you drop them, if you drop

59:18

them on their side, they don't do it. If

59:20

you drop them barrel first, they don't

59:21

do it. But if you drop them handle first

59:24

and it hits the the back where you hold

59:26

it, where the, you know, the the the

59:29

what is it called? The beavertail

59:30

>> sits where the internal hammer drops

59:32

and bang.

59:33

>> Exactly. And it's one of the only guns

59:35

that does that. Yeah.

59:36

>> And uh so much so that I believe you

59:38

should search this. I believe the Dallas

59:40

Police Department stopped uh issuing

59:44

them to their officers. See if that's

59:46

true

59:48

before I go further because I don't I

59:49

don't want to get into any legal weeds

59:52

here. But I have one. I have a 320. I've

59:55

never had a problem with it. Uh I Here

59:57

it is. Dallas police suspends use of

59:59

pistol manufacturer. Okay. Yeah. And

60:01

it's because of that. So, um,

60:04

>> what are these chambered in? Nine? Forty?

60:06

>> It's nine.

60:07

>> Nine.

60:07

>> They're nines. But the, um, so that's

60:11

the gun this guy has.

60:12

>> Mhm.

60:13

>> So, when this CBP officer grabs his gun,

60:18

he's moving off and it appears, it's

60:22

very grainy the video. It appears

60:25

there's an accidental discharge.

60:27

Now, you can make an accidental

60:30

discharge of this gun without touching

60:32

the trigger. If there's any kind of

60:34

pressure on the trigger, if it is a

60:36

modified trigger, if there's anything

60:38

that engages with it, even even a slight

60:40

amount and you move the slide

60:43

>> at all, that gun will go off. And

60:45

there's videos of it online. You could

60:47

find videos online. See if you can find

60:49

videos of it online where a guy shows

60:52

how you can get that gun to to negligent

60:55

discharge because it will. It will, at

60:58

least the pre 2017 model. Um,

61:02

>> I don't I didn't see his hand go on the

61:04

slide of the gun, though.

61:05

>> Yeah. Well, he's holding it in a It's

61:08

the hard thing is it's [ __ ]

61:10

>> Yeah, you can't see. I looked at it from

61:11

both angles, but it looked to me like he

61:14

was holding it um by the handle with no

61:18

finger on the trigger. But it does seem

61:21

>> like, at least in some of the takes that

61:23

I've seen, I may be wrong, but it seems

61:25

like that gun might have negligent

61:28

discharged. Now,

61:30

>> usually when someone's holding a gun and

61:32

there's a negligent discharge, it's

61:34

because they they pulled the trigger,

61:35

right?

61:36

>> Right.

61:36

>> So, in this case, let's I'm I mean, I'm

61:39

going to assume it for a second. So, the

61:41

gun drops on the side, striker fired.

61:43

Let's say it's I don't know what the

61:44

mechanism is. Well, let's say the

61:45

disconnector that makes the disconnector

61:46

go. The hammer drops, bam, hits the

61:49

primer, gun fires.

61:50

>> Gun fires, hits the ground. These guys

61:53

think they hear gun. These guys think

61:55

this guy might have a gun in the

61:57

scramble. They don't know. This is all

61:59

split-second, high pressure. They opened fire

62:01

on him.

62:02

>> That's what I believe happened, you

62:05

know. So when people say, "Oh, they

62:07

straight up executed this guy.

62:09

>> I think there's a little

62:11

more nuance there.

62:12

>> There's more nuance to it. There's

62:13

chaos. There's the fog of chaos. You're

62:16

in the middle of this like very high

62:19

stress situation where you've already

62:21

pepper-sprayed this guy. Now you're in

62:22

a physical scramble. Someone says he has

62:24

a gun. Gun goes off. Bang bang bang.

62:27

You're just shooting." Mhm.

62:28

>> I'm assuming this is just a lot of, you

62:31

know, a lot of guesswork,

62:32

>> but that's a lot of bad stuff that has

62:34

to happen in sequence. Like the the fact

62:37

even if this gun is recalled as a model

62:40

uh that had these issues, right? I'm

62:42

guessing that it wasn't every every one

62:44

of them that had the issues. Some of

62:45

them, right? Probably not all.

62:48

>> Um and if it's because it has to have a

62:50

lot of force for it to go off or the

62:52

slide has to be,

62:54

>> you know, you have to be moving the

62:55

slide or something like this. What I saw

62:57

was him holding the pistol, how you or I

63:00

would hold the pistol with the finger

63:01

off the trigger.

63:03

>> I did not actually see like what would

63:05

have caused that force.

63:06

>> This is where it gets weird. So, there

63:09

have been documented instances with

63:12

the SIG P320.

63:15

There's a lot of legal stuff involved in

63:19

this. There's there's tons of cases.

63:21

Some of them are [ __ ] Like there

63:23

was one cop where they said the cop got

63:25

shot because the gun accidentally went

63:28

off and everybody's like, "Oh man, Sig's

63:29

in trouble." Turns out that cop had to

63:31

recant that and he accidentally hit the

63:34

trigger and shot a cop.

63:35

>> But he recanted. So this is one. So this

63:38

guy, his gun just goes off.

63:41

>> Now he doesn't have his finger on the

63:43

trigger. It just goes off. Now there's

63:45

another one where a cop is in the middle

63:47

of a precinct and he leans forward.

63:50

>> He's got the holster on the outside. He

63:52

leans forward and the gun goes off. He

63:54

does not have his hand on the trigger.

63:56

He's not touching the gun at all. See if

63:58

you can find the one where the cop does

63:59

it. So, there's a cop where he's in the

64:03

precinct. His gun is Now, here's the

64:05

question. Was there something touching

64:07

the trigger? Was it the holster bad? Was

64:10

there Was there debris in it? Was there

64:12

something? Was there was his his shirt

64:14

touching it? Did he Did he jam the gun

64:16

in the holster and maybe like his shirt

64:18

got stuck in it touched the trigger?

64:20

There's also the second gun theory.

64:21

>> And then he moves forward if the guy had

64:23

a second gun.

64:24

>> There's also the second gun theory. So,

64:26

I I understand what you're saying and

64:29

maybe it met all of the conditions for

64:31

that.

64:32

>> It does seem unlikely to me, but it's

64:34

possible um that he's just holding it

64:36

and it just happens to go off.

64:38

>> It would seem unlikely with any other

64:39

gun. So, if the guy had a Glock,

64:41

>> but both of these cases are from the

64:43

holster. This guy's grabbing from the

64:44

holster. He's grabbing from the holster.

64:46

Well, the cop had already pulled it out

64:47

of the holster and now he's holding it

64:49

>> and then it goes off,

64:50

>> right? But it's it's so low resolution,

64:53

it's hard to see what's going on with

64:54

his hands.

64:55

>> So, if there had been some funkiness

64:59

with the trigger, you know, who knows

65:02

where he got the gun, who knows whether

65:04

or not that gun had an aftermarket

65:06

trigger, who knows what's going on. But

65:09

as he's doing this, you're in the middle

65:11

of the chaos. You're ramped up with

65:12

adrenaline. Who knows if that guy

65:14

accidentally while he was holding it put

65:16

pressure on the slide and caused that

65:19

gun to negligent discharge. I don't

65:21

know. This is the speculation and the

65:23

reason why there's speculation, why

65:26

this is something we're

65:28

talking about is because it's a Sig P320

65:31

and there's so many stories about that.

65:33

>> It's not outside the realm of

65:34

possibility. In other words,

65:35

>> right, which is the worst case scenario,

65:37

right? You got all this chaos and you

65:39

got that [ __ ] gun and that gun goes

65:42

off. Well, let me ask you this. Let's

65:43

say we we adjust for this.

65:45

>> There's an investigation. Turns out that

65:48

this 320 model was one of the ones that

65:51

uh you know was Yeah. or something like

65:53

this or it was issued after the fact and

65:56

it's brand it's newer. Let's just say

65:57

it's it's newer and they've gotten this

65:59

design flaw out of there. Let's just

66:01

assume for a second.

66:03

>> All those things being equal now, right?

66:07

When a leftist points at that and says

66:08

that's an execution, what what's your

66:11

opinion then? If it if it's the case

66:13

there is no there is no negligent

66:15

discharge, there is none of that. What

66:18

like how would you view it then?

66:20

>> Well, it's an extremely unfortunate case

66:24

of what happens during chaos.

66:26

>> Yeah, I agree.

66:27

>> I don't think it's an execution. I don't

66:29

think they pulled the gun from him and

66:31

then just shot him.

66:32

>> But that's rhetoric being used, right?

66:33

>> It is. But you know which you you're

66:36

automatically going to have if you have

66:38

a guy get shot. Um

66:42

do we can we watch a video? Let's watch

66:45

a video and see if we can discern when

66:48

the shot fires off cuz does it before or

66:51

after they say he's got a gun? Cuz

66:54

someone says he has a gun, one of the

66:56

officers removes the gun and then a shot

66:58

goes off.

66:59

>> Yeah. Now, there's another speculation

67:00

that the guy who shot him had a

67:04

negligent discharge. He didn't like

67:05

maybe he had his hand on the trigger and

67:07

he got a little amped up and it went off

67:09

and then he just [ __ ] fired into him,

67:11

kept going. That's possible, too. I'm not

67:15

exactly sure. It's there's a ton of

67:17

angles, ton of different cell phone

67:19

angles. None of them are really crystal

67:21

clear. And the thing that's interesting

67:23

about this is I'm even willing to kind

67:25

of grant it to the left just on

67:27

appearance alone for a second just for

67:29

the sake of uh of like logically taking

67:31

this to its conclusion. Let's say that

67:33

the cops were totally wrong on this.

67:35

They they messed the whole thing up.

67:38

They screwed it up. They it was a

67:40

negligent discharge from the officer

67:42

himself. It killed this guy. It was

67:44

totally unjustified.

67:46

Okay. But now what? Right. Is it is it

67:49

the case that we're gonna what stop

67:52

deporting illegal immigrants? We're

67:54

gonna stop uh you know that ICE is gonna

67:56

stop Border Patrol is going to stop

67:57

doing its job? ICE is gonna stop doing

67:59

its job because of a single incident

68:01

even if all of the officers involved

68:03

were incorrect. Of course not. That's

68:05

ridiculous. Right. The thing about this

68:07

incident is it's being used as a

68:09

catalyst to now say they're

68:12

the Gestapo just like they were trying

68:13

to do with Renee. They're the Gestapo.

68:15

They're here to, uh, you know, be the

68:17

jackbooted thugs of the Trump

68:19

administration. That's being used now as

68:22

the new rallying cry and catalyst, and

68:24

it's post hoc justification.

68:26

That's what makes me so angry about is

68:28

it's like no no no. You're out here

68:30

doing all of this long before anybody

68:33

was getting uh shot by ICE. Okay. You

68:36

were doing this long before there was

68:37

any supposed abuses by ICE. It seems

68:39

like what they do is they

68:41

set up the reactions, right? They set up

68:43

the conditions, maximize the conditions

68:46

for horrible actions to happen. And then

68:48

when they do, they use those as the

68:49

justification for why they were ever out

68:52

there in the first place. And it's like,

68:53

what's going on here? That's that's what

68:55

bothers me.

68:56

>> Right? This is the quintessential

68:58

description of the color revolution. I

69:00

mean, that they're they're trying to

69:02

create chaos. And um again, it's very

69:06

well funded and very well organized.

69:09

It's not as simple as this is an organic

69:11

protest that people are fired up because

69:14

ICE is in their community. That's not

69:16

really what's going on

69:17

>> at all.

69:17

>> But I think there's a lot of good people

69:20

that are wrapped up in that that think

69:22

they're doing a good thing and they

69:23

really do think they're fighting fascism

69:25

because they exist in these bubbles and

69:27

they're they're

69:28

>> I believe them. Yeah,

69:29

>> I do believe them that they think that

69:32

they're fighting against fascism. And I

69:34

I've I've debated with enough of these

69:36

people on the what historically fascism

69:39

is in comparison to what they perceive

69:40

it as that I do think that they believe

69:43

that 100%.

69:45

>> It's I just think it's an unjustified

69:46

belief and I think it's uh it's

69:48

ridiculous, but

69:49

>> it's not accurate.

69:50

>> It's not accurate.

69:51

>> You know, one of the things that we went

69:52

over the other day um is we talked about

69:55

the deportations, right? and that

69:57

there's been somewhere in the

69:58

neighborhood of uh 2 million

70:00

deportations,

70:02

but 1.6 million of them were like self

70:06

deportations. 1.6 million of them were like

70:08

people were notified and they said,

70:10

"Well, just get the [ __ ] out of here. I

70:11

don't want to be in jail."

70:12

>> Yeah, good.

70:12

>> And then a half a million of them were

70:15

and then but people are saying very few

70:17

of them have been violent

70:19

criminals. But we found out there was

70:21

like 8%.

70:23

>> This is just 8% of what we know has been

70:25

caught. That is a lot of violent

70:28

criminals. If you go to half a million

70:30

people and 8% of them are murderers and

70:32

rapists and they snuck in during not

70:34

even snuck in because they were allowed

70:36

access to the United States over the

70:38

last four years. Somewhere to the tune

70:40

of let's be like super charitable. Let's

70:42

say it's only 10 million cuz I think

70:44

it's a lot more and they don't a lot

70:46

more.
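
[For scale, the figures quoted in this exchange work out as follows. This is only a back-of-envelope sketch using the speakers' own numbers (half a million removals, "like 8%" violent criminals), which are their claims, not verified statistics:]

```python
# Back-of-envelope check of the figures quoted above.
# Both inputs are the speakers' own claims, not verified data.
deported = 500_000       # "half a million" removals cited in the conversation
violent_share = 0.08     # "like 8%" described as violent criminals

violent = int(deported * violent_share)
print(violent)  # 40000
```

[That is, the 8% figure they cite would correspond to roughly 40,000 people out of the half million.]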

70:46

>> Yeah. They don't really know the number

70:48

because it's really the numbers that

70:49

they're giving are based on

70:51

interactions, right? But how many people

70:53

snuck through and they didn't have an

70:54

interaction with them? It's a lot, man.

70:56

It's a lot of people. And they did this

70:58

[ __ ] on purpose. And they did this [ __ ]

71:00

because they want more congressional

71:02

seats because the census doesn't count

71:04

citizens. It doesn't count legal

71:06

citizens. It just counts human beings.

71:08

So the more people you have in an

71:10

area, the more congressional seats you

71:12

have. And then there's places like

71:14

California that make it illegal to show

71:17

your ID. You're not allowed. Not only

71:19

are you not, which you should have to

71:21

show your [ __ ] ID when you vote,

71:24

right? So we know that you're legally

71:26

voting. They made it so you can't show

71:28

your ID. You

71:32

could steelman this to the end of time.

71:34

The only reason why you do that is

71:36

because you want to cheat. It's the only

71:37

reason.

71:38

>> Of course. Well, it's not just that. But

71:39

you make you make a good compelling

71:41

point here.

71:42

The idea, even if it was the case, let's

71:45

just say almost none of them are violent

71:48

criminals. Let's just give it to them

71:49

just kind of for the sake of argument

71:51

here. We'll give it to them. So what?

71:53

The people don't want them here. That's

71:56

it. You're these are supposed to be the

71:57

biggest believers in democracy and

71:59

republicanism ever. That's what they're

72:01

fighting against is the evil fascists.

72:03

It's like, well, here the people spoke.

72:05

Okay.

72:06

>> And the people said,

72:07

>> "We don't want illegal immigrants here.

72:10

We want them out of here. It doesn't

72:12

matter what the conditionals are for

72:14

violent criminality or not violent

72:16

criminality. If you're really a big

72:18

believer in the republic like you claim,

72:20

why is it that when Trump gets elected

72:23

to do exactly this job, you impede it at

72:25

every turn?

72:26

>> Yeah, they don't want it to happen

72:28

because it was a part of the strategy

72:30

for, you know, a uniparty. I mean, this is

72:34

Elon came on and was,

72:37

>> you know, was very passionate about

72:40

wanting to explain this to people. I

72:42

mean, it's one of the reasons why he did

72:44

it before the election. Like you have to

72:45

understand the plan that's in place. And

72:47

what they're doing is they're trying to

72:49

make it so that no one but the

72:50

Democrats can ever win ever again. And

72:53

one of the best ways to do that is ship

72:55

untold numbers of people to swing

72:57

states.

72:58

>> Yeah. Which is what they're doing.

72:59

>> It's what they did. They didn't just do

73:01

it. They flew them out there. They gave

73:03

them EBT cards. They put them on social

73:05

security. We had this woman, we

73:07

documented, we we talked about this

73:09

woman who uh worked for, God, I forget

73:12

which department, but her job was to

73:15

turn these people from illegal

73:17

immigrants into what she described they

73:19

described to her as clients.

73:22

>> And so you would tell these people, are

73:25

you Yes. So her her question was to

73:29

them, do you have a permanent

73:31

disability? So do you have headaches?

73:34

Does your back hurt? I get headaches. My

73:36

back hurts. I guess I'm permanently

73:38

disabled. And all you have to do is like

73:39

you don't have to have like like clear

73:42

evidence, like all your [ __ ]

73:44

discs are fused, you can't walk, or you

73:46

have Yeah. No, you just have to have a

73:48

[ __ ] back that hurts. Your back hurts.

73:50

Well, what [ __ ] man who's a laborer

73:53

who's uh 35 years old doesn't have

73:56

[ __ ] back pain like you all do. So

73:58

they come to you, they said, "You have

73:59

headaches and back pain."

74:00

>> What man who is an office worker doesn't

74:02

have back pain? [laughter]

74:03

Exactly. [ __ ] everybody does. You get

74:05

older, you get back pain, especially if

74:06

you don't take care of your back. And so

74:08

these guys are are all being roped into

74:13

the system and then they get money. They

74:16

get social security money. They get

74:17

money from taxpayers essentially

74:20

forever. So if you can get those people

74:23

to vote, they will most certainly vote

74:26

for the people that are giving them that

74:27

money, right? They'll most certainly vote for

74:30

the people that are moving them into the

74:32

Roosevelt Hotel in New York.

74:34

>> Just like how Muslims will vote, even

74:36

though at the local level they oppose

74:37

all leftist policy, they'll vote at a

74:39

national level for leftists because they

74:41

bring in their family members. They

74:42

bring in they they allow the the

74:44

importation of people that they want

74:46

here.

74:47

>> So yeah, they're they they utilize the

74:49

system

74:50

>> for uh for their aims. And for Democrats,

74:53

this is all good. And of course, for

74:55

Republicans, it's all bad. And Elon's

74:57

right. It is he is right that Democrats

75:00

and here's what I see, the bird's eye

75:02

view, right? Trump, what they're going to

75:05

do: Democrats are going to win the midterms

75:07

by hook or by crook. They're going to win

75:08

the midterms, and when they do, if they

75:11

have the power in the House to do this,

75:13

they're going to impeach him day one, and

75:15

we'll have, now it'll be, the thrice

75:17

impeached president, right? And they'll

75:19

obstruct him. They'll obstruct his agenda

75:22

every step of the way under this

75:24

elongated impeachment, and they'll just

75:25

run out the clock.

75:27

>> Yeah, let's just run it out.

75:29

>> It's It's all pretty [ __ ] crazy. It's

75:32

really crazy. Gad Saad has a great way to

75:36

describe this. He calls it suicidal

75:37

empathy. And you know, a lot of these

75:40

people that are on the left that are

75:43

self-described leftists, they're very

75:45

kind people. And they they want, you

75:48

know, everyone to have a chance to live

75:50

in America and be good people. and they

75:52

don't understand they're being used as

75:54

pawns by much more cynical people that

75:57

are just trying to get total control.

76:01

And if you want to know what total

76:02

control looks like and what kind of

76:04

restrictions could be imposed on a

76:06

western society, look no further than

76:09

the UK. Look what's going on in in

76:11

England right now. 12,000 people have

76:15

been arrested so far last year for uh in

76:18

the last year rather for social media

76:20

posts. Just social media posts

76:22

criticizing immigration. Um there was

76:25

some new thing that they uh just passed

76:28

that makes it so that you're supposed to

76:30

tell on people who are talking in pubs

76:35

who are having conversations in pubs

76:37

that you think are dangerous

76:39

conversations.

76:40

>> There was that woman in the UK who was

76:43

SA'd and then called the guy a name via

76:46

text.

76:46

>> Yes. She called him a [ __ ]

76:48

>> Yeah.

76:48

>> Yeah.

76:49

>> She called Yeah. She was

76:50

sexually assaulted. She called him a

76:52

[ __ ] and then she was arrested.

76:54

>> Yeah, she was arrested. I was and I

76:55

remember arguing on Pierce Morgan. I was

76:57

debating with a leftist on this. This

76:59

was the topic at the time.

77:01

>> Yeah.

77:01

>> And the leftist, who looked to me like

77:03

he was a [ __ ] too. Said uh he was

77:06

defending it tooth and nail, right? This

77:08

is a good thing because we want to get

77:11

rid of stigma. The idea is to try to

77:14

destigmatize the thing. You see, words

77:16

create stigma and stigma creates harm

77:18

values and harm values are evil. They're

77:20

bad. That's that's the whole moral

77:22

system. If it if we reduce harm, that's

77:24

moral. If we increase harm, that's

77:26

immoral.

77:27

>> So that it that's the zero sum way that

77:29

they look at this, right? If you're

77:30

increasing it bad. If you're decreasing

77:32

it good. So if we're decreasing

77:34

stigmatization of an activity that we

77:36

think is protected, then that's reducing

77:38

harm. Therefore, that's the moral

77:40

position.

77:41

>> Crazy.

77:41

>> They are crazy. That is actually a crazy

77:44

way to look at the world. Well, it's

77:45

very dystopian. It's it's very spooky

77:49

that it it's happening so quickly and

77:52

that the UK has become the the leader in

77:56

the world for arresting people for

77:58

social media posts. No one would have

78:00

ever seen that coming five, six years

78:02

ago. But this is what happens when you

78:04

get total control of a population. You

78:06

don't and you don't stop where you're

78:08

at. You continue to move forward. You

78:10

continue to try to get more and more

78:12

control. And this is this new thing

78:14

where they're trying to uh get people

78:16

to turn people in for bar talk, which is

78:20

just crazy. It's just crazy. So that's

78:22

where it goes. If you're really a

78:24

liberal, a real liberal, a real

78:26

progressive person who really believes

78:28

in free speech, you should believe in

78:30

all speech. And you have to. I mean,

78:32

this was the ADL's position way back in

78:34

the day when they would allow the Ku

78:36

Klux Klan to march. They would say,

78:38

"Look,

78:38

>> and then fight for the right to do so."

78:40

>> Yes. I mean, this is what it used to be.

78:42

Yeah,

78:42

>> there used to be an understanding that

78:43

as complicated as this thing is, you've

78:46

got to allow people to say horrible

78:48

things so that you can counter them with

78:50

better points and you make a better

78:52

argument and that people see your side

78:54

and then society moves forward in a

78:56

generally positive way.

78:57

>> you know in the online dialectic the way

78:59

that it moves between groups. And I

79:01

think that now uh online influencers

79:03

podcasters political commentators

79:05

actually do have political they have

79:07

some political capital now which can be

79:09

spent the same way low-level

79:11

politicians have political capital which

79:13

can now be spent. They actually are

79:15

connected often times with politicians

79:18

and operate as mouthpieces uh on behalf

79:20

of whatever that political arm is.

79:22

>> Well, you would say that about the right

79:23

too, wouldn't you?

79:24

>> Of course, but I don't see it as

79:25

prevalent as I do uh with the left. The

79:29

the left, for instance, there was a a

79:31

whole thing that used to go on on Twitch

79:34

where an organization came in and bought

79:35

up all the Twitch mouthpieces. Uh that's

79:38

what they did. And like this this is

79:41

this is something which has been going

79:42

on for a long time.

79:44

>> But what's interesting with the

79:46

political capital angle from these

79:48

leftists they don't care what the means

79:51

are. You see, the ends,

79:54

that's all they care about, right? The

79:56

the means to get there totally

79:58

irrelevant to them. From their view

80:00

though, that makes a sick sort of sense.

80:01

They believe that they're fighting

80:03

against Nazis,

80:05

>> literal Nazis. Right.

80:07

>> So, if you believed that you were in a

80:08

war with literal Nazis, what wouldn't

80:11

you do to complete that war? What what

80:14

means wouldn't you go to, what means of

80:16

sabotage would you not do? What cars

80:18

would you not blow up? What cops would

80:20

you not eliminate in order to stop the

80:22

rise of the new Hitler?

80:23

>> Right.

80:24

>> And it's like, and they're they're

80:25

expending their political capital on

80:27

that message. And that message has a lot

80:30

of influence on people.

80:31

>> Yeah. It also

80:34

there there's so many people that are

80:37

getting attention by feeding into the

80:40

rhetoric. There's so many people that

80:42

are making viral clips of them

80:44

threatening you like menacing like these

80:46

weird dorky liberal guys like these guys

80:50

that you would think of as pacifists are

80:52

literally calling for violence. I got

80:54

one of them because it's like the most

80:55

unlikely guy. Like you see this guy

80:58

doing this, you're like, "Hey buddy,

80:59

like who? What? What are you saying?

81:02

[laughter]

81:02

>> Who who who's following you into battle?

81:05

>> Yeah. Who's I'm gonna I'm gonna send you

81:07

because it's it's the way he says it too

81:09

is so like like he's watched too many

81:12

[ __ ] TV shows. Um this guy let's cuz

81:17

it's it's the terminology that he uses

81:19

that is it's actually kind of funny. If

81:21

it wasn't so scary. Put this on real

81:23

quick.

81:23

>> Sure. cuz it's so

81:26

[laughter] you see this guy's doughy

81:28

face and [snorts]

81:29

his understanding of real violence.

81:31

Listen to this.

81:32

>> When combat starts, we all roll

81:34

initiative.

81:36

I'm going to say that again. And anyone,

81:40

everyone knows what I'm talking about

81:41

when I say this. When combat starts, we

81:44

all roll initiative.

81:48

[laughter]

81:51

>> I mean, I hate to laugh. because it's

81:53

kind of [ __ ] serious because they're

81:55

they're they're inciting violence and

81:57

they're they're calling for

81:58

insurrection. They're calling for people

81:59

to, you know, take to the streets and

82:02

start violence. But that guy like what

82:04

what like

82:05

>> you're not going to roll initiative.

82:07

>> What does it even mean?

82:08

>> Well, I mean I think I think it just

82:10

means we all go,

82:11

>> right?

82:12

>> When violence start, we we all go.

82:13

>> Dungeons and Dragons reference.

82:15

>> Is it rolling? You're not going to roll.

82:16

>> Oh, [laughter] rolling.

82:19

>> No, no, no.

82:20

>> Wait a minute. It's

82:21

>> No. The tweet says Dungeons and Dragons

82:23

in it. I don't even know.

82:24

>> Oh god.

82:25

>> Says Dungeons and Dragons nerds means

82:27

business, I think.

82:28

>> Oh, so do a search on that. Does roll

82:32

initiative is that a part of Dungeons

82:33

and Dragons roll that? Cuz I only know

82:35

of Dungeons and Dragons from Stranger Things.

82:36

>> So, does his does his AR do 2d6 damage?

82:39

It's [laughter] like,

82:42

>> but it's just the the menacing way that

82:44

he stares into the camera in combat.

82:47

>> Rolling for initiative determines the

82:49

turn order in combat. Each player and

82:52

monster rolls a 20-sided die and adds

82:54

their dexterity modifier. Oh god. But I

82:58

mean, this is what I'm saying. It's like

83:00

it's a lot of it is cosplay.
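[The initiative rule read out above can be sketched in a few lines: each combatant rolls a 20-sided die, adds their Dexterity modifier, and turn order is the descending sort of the totals. A minimal illustration; the party names and modifiers here are made up.]

```python
import random

def roll_initiative(combatants):
    """D&D 5e-style initiative: each combatant rolls a d20, adds
    their Dexterity modifier, and turn order is the descending
    sort of the totals."""
    totals = {name: random.randint(1, 20) + dex_mod
              for name, dex_mod in combatants.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical combatants: name -> Dexterity modifier
turn_order = roll_initiative({"rogue": 4, "fighter": 2, "goblin": 1})
```
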

83:02

>> That does sound like he is saying

83:03

though, we all go there, right?

83:05

>> Yes. Yeah, it sounds like it's that's

83:07

what he's saying. And it's also

83:09

>> it gives meaning to people whose lives

83:12

do not have a lot of meaning, right?

83:14

Like all of a sudden you're a part of a

83:15

greater cause. You're you're a part of a

83:17

very important movement.

83:18

>> Yeah. You're stopping Nazis.

83:19

>> Yeah. You're stopping Nazis. And it's,

83:21

you know, relatively safe from the

83:23

comfort of your own home staring at your

83:25

phone on on TikTok.

83:26

>> Sure.

83:27

>> You know, and you get all excited about

83:29

it and you cheer. And these are the same

83:31

people that cheered when Charlie Kirk

83:32

got shot um for just talking like that

83:35

was fine, but this one is not good. You

83:38

know, it's it's it's all like it's very

83:40

[ __ ] up, man.

83:41

>> Well, and they're going to kill they'll

83:42

kill more commentators. They can get

83:43

away with it. Um, happily. I mean, part

83:47

of that whole Signal chat that's

83:50

dangerous that people aren't talking

83:51

about. That's probably the most

83:53

dangerous aspect of it. It it and I

83:55

can't prove this, but it's been my

83:57

experience that left-wing communities

83:59

and leftwing groups, especially online

84:01

communities and online groups, really

84:03

pander to the mentally ill in a big way.

84:06

Really pander to them. And it's I think

84:10

that it's a form of weaponization. They

84:12

want to attract the extremely mentally

84:14

ill into these communities and it it

84:16

helps with what is actually

84:18

radicalization and they play on the fact

84:21

that they're mentally ill in order to do

84:22

this.

84:23

>> Well, this is Antifa, right? Like this

84:25

is [clears throat] this is why

84:26

>> it's not just Antifa. It goes beyond

84:27

that. Like if you go to some of these

84:29

TikTokers' communities, you go to some

84:31

of the online political pundits

84:33

communities who are far left. Okay.

84:36

These people who are in there are Froot

84:38

Loops, man.

84:38

>> Yeah. They are lunatics and they're

84:41

pandered to. They're pandered to. Oh,

84:43

okay. You can't say this to them. That's

84:45

ableist. You can't tell them this is

84:46

you're a weirdo because that's mean. You

84:48

can't. And not only are they pandered to, but

84:50

I think that that's the source of the

84:51

weapon. If it's the case that these

84:53

people don't care about death. They

84:56

don't care like, oh, the outcome's going

84:58

to be death. That guy shot at the uh the

85:00

ICE agents not too long ago, remember?

85:03

He was on top of the roof. He was

85:04

shooting across with, I think, a Mauser

85:06

rifle, and they they dusted him. And they

85:08

killed him or he shot himself. I don't

85:10

remember which.

85:10

>> When When was this?

85:11

>> This was a few months back.

85:13

>> I don't know about this one, I don't

85:14

think.

85:15

>> No, there was a guy. He was

85:16

>> Is this how callous I've become?

85:17

>> Yeah. He was taking shots at uh I

85:20

believe it was ICE agents in front of

85:21

the ICE facility.

85:23

>> Oh, that's right. That's right.

85:24

>> Yeah. He was using I believe he was

85:25

using like a Mauser rifle or something.

85:27

>> Okay. Now I remember.

85:28

>> And early on with the Charlie Kirk

85:29

thing, they were actually making these

85:31

connections cuz he had used a a Mauser

85:33

as well, right, to shoot Charlie Kirk.

85:35

That was the So people were making those

85:38

early connections. Wait, is this is this

85:40

a sequence of events? Does the Mauser

85:42

mean something here? Does that

85:43

particular rifle have special meaning?

85:44

You know how people are online.

85:45

>> Yeah.

85:46

>> But anyway, uh the interesting thing is

85:48

like they don't care if they die.

85:50

>> They're they're dying martyrs. They

85:52

don't care.

85:53

>> And it's really easy to weaponize

85:54

mentally ill people that way because

85:56

they don't care. These are the same

85:58

people who have the high suicide rates

86:00

for a reason because they're already

86:02

mentally ill like the trunes and others

86:03

which many of them you find are

86:05

connected to trans people almost every

86:07

time.

86:08

>> Also SSRI.

86:09

>> Yep. And this is the other problem is

86:11

that how many of these people are on

86:14

these psychiatric medications that

86:18

violent ideation is a part of the side

86:20

effects of these suicidal or excuse me

86:23

these psychiatric drugs. There's a a a

86:26

lot of people that have uh psychotic

86:29

thoughts when they get on some of these

86:31

different SSRIs and and different

86:34

psychiatric medications. So, you've got

86:36

people that are already [ __ ] up

86:37

mentally and then you've got them on

86:39

these medications that cause them to do

86:41

all kinds of crazy things.

86:42

>> And aren't women aren't women taking

86:44

much more in the way of SSRI pills

86:47

than men are? And who do we see

86:49

on the bullhorns and loudspeakers at

86:51

most of these events? It's women. Well,

86:53

particularly liberal women,

86:55

I'm sure you've seen the statistics,

86:57

but they're lopsided. I saved them because

86:59

they're kind of nutty.

87:01

>> Uh the which what's what's interesting

87:03

is the like the least mentally ill in

87:08

terms of numbers is conservative men.

87:11

Conservative men, I think it's like

87:13

>> cuz they're normal. [laughter] I think.

87:15

Okay. Young liberal women, 56%

87:20

report a mental health diagnosis. Young

87:23

moderate women uh 18 to 29 28%. Young

87:27

conservative women 27% only slightly

87:30

less. Um so for men

87:33

it is uh 34% of all liberal men. 34%. So

87:39

a third of all liberal men are mentally

87:43

ill. 22% of moderate men and 16% of

87:47

conservative men.

87:47

>> Yeah. But do you know what the lunatics

87:48

argue when you bring that up? The these

87:50

lunatics, they'll argue, "No, no, no.

87:53

>> The conservative men are just as

87:55

mentally ill. It's just undiagnosed

87:58

because there's a stigma in conservative

88:00

communities about going to get your

88:02

mental illness diagnosed." And I always

88:04

point out, and I think this is an

88:06

interesting way to point this out, like

88:08

maybe they're not going to get diagnosed

88:10

because they don't have a problem. Did

88:12

you ever think of that? It's possible.

88:14

It's possible it's undiagnosed because I

88:15

think that is accurate though that there

88:17

is a stigma about mental health and

88:19

therapy and things along those lines in

88:21

conservative I mean if you want to like

88:23

>> I agree but I also think that what

88:25

happens is um when you're when you're

88:28

talking especially about the voodoo that

88:30

is psychology and it is it is voodoo. I

88:33

have very little respect for psychology.

88:35

I don't even consider it science. I

88:37

consider that there's scientific methods

88:39

used for data gathering, but I don't

88:41

consider psychology a science at all.

88:43

>> And that's psychology. Psychiatry gets

88:45

even weirder because then you start

88:47

adding medication. You're not just

88:49

talking about therapy.

88:49

>> It's all voodoo to as far as I'm

88:51

concerned. I think that men often,

88:53

especially conservative men, get as much

88:56

out of uh, you know, their close

88:58

relationships with friends and family as

89:01

they would going to a psychologist. In

89:03

other words, just I think just having

89:04

somebody to talk to who's a close friend

89:07

who's intricately familiar with your

89:08

situation probably gives you more value

89:10

than going to a complete stranger who

89:13

has learned manipulation techniques.

89:15

That's what they learn essentially is

89:17

manipulation techniques. Uh I think

89:19

there's more value there. And so I think

89:21

that the stigma which exists there

89:24

doesn't exist because it's like you're

89:25

not manly, which is how they try to

89:26

frame it. I think the stigma exists

89:28

there because so many conservative men

89:30

go well I tried that [ __ ] and it was

89:31

nonsense. I tried it and it sucked. I

89:33

tried it and it was worthless. I went to

89:35

marriage counseling, did nothing. Um,

89:38

sided with the wife, right? I went uh

89:40

for this issue, did nothing. Uh, but

89:42

when I went out and had some beers with

89:44

my friends, that actually helped relieve

89:46

some of these some of these issues.

89:48

>> I think the problem with that is there's

89:50

a lot of guys who don't have good

89:52

friends, you know, and you don't have

89:54

someone that you can count on,

89:55

unfortunately. You know, there's just

89:57

there's a lot of men out there that are

89:59

lost.

90:00

>> I agree. But I think that the

90:01

conservative men seem to like they have

90:03

closer longevity with friends than

90:05

progressive men do.

90:06

>> Yes. And they don't abandon them when

90:07

they change their opinions on things. Um

90:10

so here's self-reported data um from a

90:13

2022 survey analysis that found 51% of

90:16

conservatives report excellent

90:18

mental health [clears throat]

90:20

>> compared to 20% of liberals.

90:22

>> That's a big difference.

90:24

>> Huge.

90:24

>> It's a giant difference.

90:25

>> I don't think stigma could account for

90:26

that.

90:27

>> No, it can't. It It's like It's not just

90:29

stigma. It's like it's also like like

90:31

what is what does it mean to be

90:32

conservative? Does it mean you know

90:34

taking account for your own actions,

90:36

discipline, hard work ethic? All those

90:39

things are actually good for your mental

90:41

health like like pulling yourself up and

90:44

getting back to work and doing things.

90:47

>> I think I think now and I think maybe it

90:50

always should have been framed this way.

90:51

I think now, for the label of

90:53

conservative to apply, we really kind of

90:56

start with religious foundationalism.

90:58

Mhm. That's what is fast

91:01

becoming the delineation,

91:03

>> right? Having a framework.

91:04

>> Having the framework, right? And the

91:06

religious framework is almost instantly

91:09

going to put you in that moving towards

91:12

that conservative camp almost every

91:13

single time.

91:15

>> And uh and I think I think that that's a

91:17

necessary component. Now, if we're

91:18

trying to make these political

91:20

delineations, it becomes tough. What's a

91:22

Republican or a neocon versus a

91:24

conservative versus a this versus that?

91:27

It comes down to foundationalism of

91:29

framework, like you were just saying, and

91:31

the framework of Christianity, Christian

91:33

ethics: a huge delineation point between

91:36

the right and the left, who reject that

91:39

for harm principles, utilitarianism, and

91:42

various other uh sorts of frameworks.

91:44

>> Yeah. And they they'll also point to you

91:47

know [clears throat]

91:48

what Christianity has done throughout

91:50

history and the amount of harm that it's

91:52

caused. [gasps]

91:53

But it's kind of like every power

91:55

structure throughout history you could

91:57

point to in that way.

91:58

>> Well, it was what was there.

91:59

>> Yeah.

92:00

>> The thing is like the Catholic Church,

92:02

the Catholic Church gets a lot of [ __ ]

92:03

for this.

92:05

>> Well, look at all the horrible things

92:07

that the Catholic Church did. It's like,

92:08

well, the Catholic Church was the whole

92:10

known world once, right?

92:11

>> You know, all of Europe was the Catholic

92:13

Church, not, you know, like all of it

92:16

was. You can't have organizations which

92:18

span whole nations and countries um uh

92:22

ethnicities, cultures integrate

92:25

themselves into it and and not have

92:27

corruption. That I don't care what

92:28

system it is.

92:30

>> Pointing to it and saying it's because

92:31

they were Catholic. That's where it

92:33

becomes absurd. Right.

92:35

>> They were corrupt. There was corruption.

92:37

>> But because they're human, not because

92:39

they're Catholic.

92:40

>> Right. Right. Um one of the things that

92:42

I always try to point out to people,

92:43

they go, "Why do you go to church?" Like

92:45

because when I was younger, I was very

92:46

cynical about religion. And then I've

92:48

got older. One of the things that

92:49

I always say is if there was a pill that

92:53

could make you as nice as the people

92:55

that I go to church with, [laughter]

92:57

everybody would be on it.

92:58

>> Yeah.

92:59

>> They are the nicest [ __ ] people you

93:01

will ever encounter. When we leave the

93:02

church parking lot,

93:03

>> they're kind

93:05

and nice. They're they're all of the above.

93:07

They're like very friendly, happy

93:08

people. But when you leave the church

93:10

parking lot or even when you're

93:11

entering,

93:13

they're the they everybody lets

93:16

everybody in. It's like no one rushes

93:19

ahead. It's like you go ahead and then

93:21

you go ahead. It's like the most

93:23

self-organized

93:25

most charitable way of exiting a

93:28

parking lot I've ever experienced in my

93:30

life. The opposite of a concert. You go

93:32

to a great concert, everybody's like

93:33

[ __ ] on everybody's bumper trying to

93:35

weasel in. People are honking. [ __ ] you.

93:38

In church, it's like one person goes and

93:39

another person goes. No, you go wave and

93:42

then everybody's fine and everybody's

93:43

happy. It's like if that was if you

93:44

could take a pill that could do that to

93:46

you. If therapy could do that to you, we

93:48

should all be on therapy. We should all

93:49

take that pill.

93:50

>> Philosophy can do that for you because

93:51

the phenomenon that you're talking about

93:53

is the "me" philosophy. And so when

93:56

you're going to church, it's not all

93:58

about you,

93:58

>> right?

93:59

>> And that's why you have those types of

94:00

interactions with people. Wait, I'm

94:02

going you go to a concert. That's for

94:04

you. Mhm.

94:06

>> That's for me, not for these strangers.

94:08

I'm going there because I want to be

94:09

entertained. That's for me. You go to

94:11

church,

94:11

>> it's not for you.

94:13

>> Right.

94:14

>> And the thing is is um

94:16

>> it's the it's the kind of

94:20

materialism view. The materialistic view

94:22

of pure materialism always reduces to

94:25

me, me, me, me. Because what else can

94:27

there be, right? There's just me and

94:29

the material I engage with. There's

94:31

nothing outside of that. So why engage

94:33

as though there's something outside of

94:34

that? That doesn't just lead to

94:36

nihilism, but it's the beginning stages

94:38

of understanding the distinction between

94:40

religious foundationalism and uh

94:43

basically everything else. The reduction

94:45

doesn't come down to me. And that's why

94:48

those interactions seem so much better

94:50

because they are because people are

94:51

thinking about you, right?

94:53

>> It's like what a concept. Imagine a

94:56

world where people think about somebody

94:58

besides themselves

94:59

>> and they think about they think about

95:00

everybody as a part of a community and a

95:03

collective community that you care about

95:05

that has value to you and they're you

95:08

know there's

95:08

>> and then you go why is the mental health

95:10

rate so much better in these

95:11

communities? It's like well isn't it

95:13

interesting how much they think about

95:15

other people than just themselves and

95:17

duties to those people instead of just

95:20

me me. They're the kindest people you

95:22

you're ever going to come across. Yeah.

95:24

And uh I think there's a lot of value in

95:26

that. And I think the people that are

95:27

cynical about that

95:28

>> because they don't want to believe in

95:30

fairy tales or they don't want to be

95:31

stupid. They don't want to get duped by

95:33

like like there's a a foundation to

95:35

that. If you just look, forget about

95:38

some of the stuff that's in the Bible

95:40

that you know, it gets weird when you

95:42

go back into, like, the old

95:45

old stuff because like for sure human

95:49

beings had some sort of an influence on

95:50

what was written down and what wasn't

95:52

written down. But if you get to the

95:54

teachings of Christ, I can't find any

95:57

faults in it. Like it's all about being

96:01

kind. It's all about this this idea that

96:05

we're all in this together and that

96:07

you're supposed to lift each other up

96:09

and look after each other. There's no

96:12

faults in it. It's it's not like you

96:14

have to kill the non-believers. It's not

96:17

like you get to rape and pillage for the

96:19

non-believers and the infidels must die.

96:21

There's none of that. That's why the

96:24

that's why Christians believe in

96:25

objective truth that there must be

96:27

objective truth because otherwise why is

96:28

most of the world following this as

96:31

though it's objective truth. We seem to

96:33

be leaning towards this as though this

96:35

must be the thing which is objectively

96:37

real and objectively true and a thing

96:39

which we can point to that is because

96:42

when people are introduced to it like

96:43

you just said it's really hard and

96:46

difficult to find fault in it. It's not

96:47

just that. You know, it's interesting.

96:49

If we reverse it, if we say, "What could

96:51

I do that actually would be the best for

96:54

me me?" It would still be that.

96:56

>> Yeah.

96:57

>> Which is the funniest part of the whole

96:59

thing. It's like both ways. It works for

97:01

you if even if it's not all about you or

97:04

it works for you. Even if it is all

97:05

about you, it's still going to be the

97:07

better message out of the two.

97:08

>> It's it's definitely a better framework

97:10

for living your life. And um there's a

97:14

lot of people that just reject that that

97:16

are that think of themselves as

97:18

intelligent, you know, they think of

97:20

themselves as intelligent and well read

97:22

and educated and they just Yeah, I'm too

97:25

smart for that. Too smart for all that.

97:27

I'm an atheist.

97:27

>> Yeah.

97:28

>> Any atheist needs to take eight grams of

97:30

mushrooms. Just

97:33

>> do a little DMT.

97:34

>> Do a little DMT. And you're like, "Oh, I

97:36

don't know anything." You You think you

97:38

know things. You don't know a [ __ ]

97:40

thing. You just know what you've

97:42

experienced. And I think that this

97:48

the world is better off if people have a

97:51

great moral and ethical framework. I

97:53

think morals and ethics and being kind

97:56

is one of the most important values that

97:58

human beings can ever possess if you

98:00

want to live in a productive and healthy

98:02

community.

98:03

>> Completely agree. And uh I think that

98:05

kindness I make a delineation between

98:08

kindness and niceness

98:10

>> is I think it's often kind not to be

98:11

nice

98:12

>> but I do think that you can be nice and

98:15

it may not be kind

98:17

>> right

98:17

>> and so that's true.

98:18

>> So I make a delineation between those

98:19

things. I don't think that kindness

98:21

though has much variance.

98:22

>> Kindness is looking after the interest

98:24

of somebody who's not me

98:26

>> and it makes everybody it's it's

98:28

actually selfish because it makes you

98:30

feel good too.

98:30

>> Yeah. There's I mean there is something

98:32

you

98:32

>> could look at it that way.

98:33

>> Sure. From the position of

98:35

trying to convince the unbeliever,

98:37

right? Appealing to their self-interest

98:39

may not be the worst idea, right? You

98:41

know, appealing to like, well, has the

98:43

lack of community and the like, let's

98:45

just assume for a second. Let's just

98:47

assume

98:48

>> it's all [ __ ] and it's all nonsense.

98:50

>> Every bit of it is just totally made up.

98:52

We just like we just made it up, right?

98:55

But we all acted as though it was true.

98:58

If it's the case that your whole

98:59

framework is that we just want a society

99:01

that really works well and does the best

99:04

it can possibly do for everyone, then

99:06

shouldn't you by your own framework just

99:08

pretend it's true,

99:09

>> right? Yeah.

99:10

>> Shouldn't you just act as though it's

99:11

true anyway?

99:12

>> Jordan Peterson had a very good point

99:14

about that.

99:14

>> Yeah.

99:15

>> About believing in God that if you

99:19

believe if you act as if God is real,

99:22

you will have a better life. Like it it

99:24

it works. It really does work. almost

99:27

like a universal truth.

99:28

>> Yeah. It's it's very fascinating. It's

99:30

fascinating that um people that are

99:33

self-professed atheists and people

99:34

that think of themselves as too

99:36

intelligent for religion won't

99:38

acknowledge that. They don't want to

99:40

believe that. And so many of them that I

99:41

know that are self-professed atheists

99:43

are some of the most miserable people.

99:45

They're they're very depressed. A lot of

99:47

them are on psychiatric medications. A

99:49

lot of them are in therapy. A lot of

99:50

them are really [ __ ] up.

99:51

>> They're almost cursed. [laughter]

99:57

Almost seems like that, doesn't it? And

99:59

the thing, well, the thing is

100:01

interesting is like um I've talked with

100:03

a lot of atheists, debated with a lot of

100:05

atheists, especially on the effects of

100:07

Christianity in society against the

100:09

effects of atheism. And I know what pure

100:13

secular states have led to. That's what

100:14

communism was. That was a purely secular

100:16

state. Yes.

100:17

>> Where you really where you really wall

100:20

off the church from the state. But here

100:22

we pretend that it's secular and they

100:24

get all the benefits of it being quote

100:26

secular, but it's not secular at all.

100:28

Right?

100:28

>> Politicians are constantly voted in

100:30

based on the fact that they have an x

100:32

amount of value structure and that's

100:34

what they're going to implement

100:35

legislatively on you. The whole secular

100:38

thing is totally made up. And them

100:41

pretending that that's even real or

100:44

has ever existed as a real framework in

100:46

the United States just nonsense. Not

100:48

only that, but I think there is a

100:50

natural default in the human mind to be

100:53

attracted to a structure. And uh if that

100:56

structure is a Christian structure,

100:58

you're attracted to all the Christian

101:00

values that we've just discussed being

101:02

so positive and beneficial to you. But

101:04

if you're not and you go to a leftist

101:06

progressive structure, leftists in

101:08

particular, um like a Marxist structure,

101:12

what you know what you're seeing is a

101:15

complete lack of forgiveness. They don't

101:18

have that built into the system. You

101:20

know, one of the one of the beautiful

101:21

things about Christianity is forgiveness

101:23

and the recognition that we're all

101:25

sinners and we all [ __ ] up and we're all

101:27

human and we're all flawed and that you

101:30

could you could move on and be better

101:32

and you can atone for these sins and you

101:34

could recognize that, you know, yes,

101:36

you've made a mistake, but here's the

101:37

best way to move forward and be a better

101:40

person. Society at whole recognizes that

101:43

you are me and I am you and we're all

101:46

kind of the same thing. We all [ __ ] up

101:47

and we're all we're we're all just human

101:50

beings. But there's a pathway. There's a

101:53

pathway to forgiveness. There's zero

101:54

pathway in this in in leftism. That's

101:58

that's the most horrible thing when you

102:00

watch these pile-ons online over like the

102:02

most innocuous

102:04

>> indiscretion. What's funny with leftists

102:06

is their pathway is just everything's

102:08

permitted.

102:09

>> Yeah.

102:10

>> And the pathway from the Christian is

102:11

no, not everything is permitted, but

102:13

almost everything can be forgiven.

102:15

>> Right. And that I would see is the big

102:17

distinction. It might there, you know,

102:20

there's a a story uh that I heard

102:23

because I'm Eastern Orthodox. That's um

102:25

that's what I follow. And uh I heard a

102:29

it was a great story my priest told me.

102:31

And so basically how this went is there

102:33

was monks uh they lived in a commune and

102:36

one monk uh liked to get drunk. That was

102:38

his big vice, right? And he drank a lot

102:41

of beer. And he did this clear up until

102:44

the day that he died. And when he died,

102:47

uh, everyone was crying. And a monk

102:50

said, "Well, you know, why why is

102:51

everyone crying?" You know, he held that

102:52

vice clear up till the day he died. And

102:55

the head, the abbot, who was

102:58

there, he said, "Yeah, but the last few

103:00

years, he cut it in half. [laughter]

103:04

He was on the path."

103:05

>> Yeah. He was well he's just saying I'm

103:08

going to recognize all the progress that

103:11

this man who had this horrible vice uh

103:14

did, right? There was still progress. He was

103:16

still trying to move towards the virtue.

103:17

>> Yeah.

103:18

>> Now maybe he never got to it. But I'm

103:20

still going to recognize that he was

103:22

trying to and maybe he he was not able

103:25

to surmount it. He was not able to get

103:27

past his demons. Maybe he wasn't able to

103:29

overtake them all. But he was at least

103:31

attempting to.

103:32

>> Right. That's the thing.

103:33

>> That's the that's the thing. Well, it's

103:35

it's this idea of like someone being a

103:38

perfect person is just it's it's

103:39

nonsense. Doesn't exist. And so, if you

103:42

don't have a pathway to forgiveness and

103:44

if you don't if you don't have that

103:47

built into your society, you're always

103:48

going to have people pointing out the

103:50

people that are the bad people. And it's

103:53

going to keep moving in that direction.

103:54

And it's one of the things you see in

103:56

the left in particular, they eat their

103:58

own. And it it's drives me crazy when I

104:00

see that also from the right. I'm like,

104:02

you don't you see that the people that

104:04

you criticize are doing this and now

104:06

you're doing this? You guys are turning

104:07

on each other over the most innocuous

104:09

things and and forming tribes where

104:12

you're attacking each other even though

104:14

you have mostly shared values instead of

104:16

being charitable and recognizing that,

104:18

you know, these are just human beings

104:20

and they make mistakes.

104:21

>> Yeah.

104:22

>> But the left eats itself more than any

104:24

[ __ ] group that I've ever encountered

104:27

over almost nothing. and they love to

104:29

pile on because they're absolutely

104:31

terrified that it's going to come for

104:33

them. They're [ __ ] terrified and so

104:35

they will go out of their way to shame

104:39

and attack and to take some of the

104:41

energy away from them.

104:42

>> But do you think there's a unity in

104:44

that? Like if we were to if we were to

104:46

look at this again like from a bird's

104:48

eye view, I agree with you. The left

104:50

eats itself way more than the right

104:51

does. Though the right eats itself too,

104:53

right? And we've been seeing a lot of

104:54

that post Charlie Kirk's death. I though

104:57

I think that that was mostly power

104:58

vacuum based and who gets to fill the

105:01

power vacuum.

105:01

>> That's what I think.

105:02

>> I still think that it's it turned into a

105:05

dog eat dog for the power vacuum fight

105:07

and it was a criticism of values

105:09

foundationalism and all of that.

105:11

>> But from the left view, if you eat

105:13

>> if you're eating your own, right, and

105:15

you eat the message apart to the point

105:17

where you get down to the foundation and

105:20

now everybody's in lockstep, is

105:22

that better for political power or

105:24

worse? Like if you constantly are just

105:27

eating the wrong ones, nope, that message

105:28

isn't pure enough and they gobble them

105:30

up until you get the monster, right? Who

105:33

has the right message, they're all on

105:35

board. Is is that the better way to

105:37

achieve this kind of like political

105:39

paradigm that they want? That's my

105:41

question.

105:41

>> It's very naive. It's a naive

105:43

perspective that eventually you're going

105:45

to boil it down to a purity and you're

105:47

not going to. It's not going to happen.

105:48

You're never going to get far-left

105:50

enough.

105:51

>> There'll always be something else to eat

105:52

them over on.

105:53

>> Yeah. Well, also you're advocating for

105:55

communism and advocating for communism

105:57

is so wild. And people there's no

106:00

examples of it ever being done right.

106:02

It's there's zero. Imagine advocating

106:05

for something that has zero success.

106:09

There's zero. None. Like you can It

106:12

doesn't exist. It's never

106:14

happened. It's never been. Why? Well,

106:16

I'll tell you why. Because if everybody

106:18

has to share all the money, then who's

106:21

going to enforce that?

106:23

Who's going to do it? Who's going to

106:24

tell people that you have to give up

106:26

your house? Who's going to tell people

106:27

to give up [laughter] your life? The

106:28

state. The state.

106:29

>> And the state has guns.

106:30

>> Yeah. So, you're advocating for

106:31

violence. Well, you don't think you're

106:33

advocating for violence, but you are.

106:34

You're advocating for hard men with with

106:37

guns to enforce your will.

106:38

>> And those people are going to wind up

106:39

living in mansions and eating filet

106:41

mignon and everybody else is going to be

106:43

eating oats and gruel and uh that's

106:46

>> which is exactly what's always happened

106:48

when it's tried. And the thing the thing

106:50

is interesting too is there's other

106:51

there's other value set issues that are

106:53

really simple to point to like okay

106:55

nothing's worth anything like how do I

106:58

get my guitar

106:59

>> right it's nonsense

107:00

>> like just you know all the communist

107:02

nations were always setting their market

107:03

prices based on what capitalist would

107:06

markets would set for prices

107:08

>> and it's like how do I value a guitar if

107:11

it's supposed to just be

107:14

mine in the commune and then yours also

107:16

and his also how do we set a value

107:19

assessment here. What makes the Epiphone

107:21

better than the

107:23

>> um another Sorry folks, we had a crash,

107:25

software crash. Another problem is this

107:28

idea of the equality of outcome that

107:30

everybody should get an equal amount.

107:32

That is crazy talk because we all know

107:35

that equality of effort does not exist.

107:38

There's a reason why there's outliers

107:39

and the reason why they're so compelling

107:41

and so inspirational. It's like this

107:44

[ __ ] guy got up at five o'clock in

107:46

the morning and ran every morning before

107:48

work and hustled and and ate the right

107:50

food and and [ __ ] did the right

107:52

things and was thinking and pushing and

107:54

was open-minded and and he became

107:56

radically successful.

107:58

>> But from each according to their

107:59

ability, Joe, [laughter]

108:01

>> but that No, it's it's not even

108:03

maximizing everyone's ability because

108:04

you're you're basically giving a safety

108:06

net for [ __ ] lazy people and that's

108:08

not good for them either. No, being

108:11

inspired by others success is a good

108:13

thing. It's a good thing. And the only

108:15

way that happens is if you let someone

108:17

be exceptional. And the only way you let

108:19

somebody be exceptional, you have to

108:20

incentivize them. What's the incentive?

108:22

The incentive is they get more value out

108:24

of their hard work. They get more money.

108:26

They get a nicer house. They get like

108:28

what are you going to do? You going to

108:29

decide that people have to like mate?

108:32

Like that women [laughter] don't find

108:34

this guy attractive, but that's not

108:36

fair. So that they have to be with this

108:38

guy and they have to find him attractive

108:39

or that you know a woman has to find

108:41

this man attractive even though he's

108:43

he's a do-nothing [ __ ] lazy loser. Like

108:46

this is the way of the world and

108:48

competition is a good thing for human

108:49

beings. It inspires us. It's good. It it

108:52

it lets you know that there's a higher

108:54

bar that can be achieved. And you often

108:56

used to know who the lazy people were

108:57

based on the living conditions they had.

108:59

Yes. Isn't that interesting? Just like

109:01

you would often know if there was an

109:03

ugly kid that their parents probably

109:05

were pretty ugly, right? Uh but it's

109:08

true, right? It is true. The the idea

109:10

here is like um, in dating, people tend to

109:13

bat in their league, at least men do,

109:15

right? Or try to, right? [laughter]

109:17

Uh women above their league. But the

109:19

thing is is the reason you commonly see

109:22

good-looking people with good-looking

109:23

people and ugly people with ugly people

109:25

is because that's about what you can

109:27

get.

109:28

>> Yeah. But it's the same thing when it

109:29

comes to ability and skill in whatever

109:32

it is that you're doing, right? Yes.

109:34

The thing is, like oftentimes if you

109:37

ask a person, my dad used to say this

109:39

all the time. He was right. If you ask a

109:42

person, are you where you're at based on

109:44

things that happened to you or because

109:46

of you? He said 98% of people will say

109:49

because of things that happened to me.

109:51

And then when you ask them about what

109:52

those things are, you'll find out that

109:54

it's because of them. you'll find out

109:56

it's because of choices they've made,

109:58

things that they've done. That's

110:00

actually what's responsible for the

110:02

conditions that they're in.

110:03

>> Oh, and by the way, for people that's

110:05

not the case, like trust fund kids are

110:07

the most miserable [ __ ] I have

110:10

ever met in my life.

110:11

>> And they lose it all anyway.

110:12

>> A lot of them do. But a lot

110:14

of them are not fully formed human

110:16

beings. And the way I always describe

110:18

it, I I go, it's like if you give if you

110:21

make cement and you don't add all the

110:24

stuff in the right way, you can't fix it

110:26

later.

110:27

>> Right.

110:27

>> Right. So during the developmental

110:29

process, if you're [ __ ] Joffrey from

110:32

Game of Thrones, like what are the odds

110:33

that Joffrey is going to [ __ ] figure

110:35

it out and get his [ __ ] together and be

110:36

cool when he gets old?

110:37

>> You're just Dexter and that's it.

110:39

>> Exactly.

110:39

>> Right. You just like you have the

110:41

informed experience of the serial

110:42

killing and it's like there's just no

110:44

fixing you.

110:45

>> There's no fixing it.

110:45

>> Yeah. There's no fixing it. I get that.

110:47

I just the thing that's interesting is

110:50

like um when I look at the the communist

110:53

paradigm versus the capital, you know,

110:55

that's coming back. That paradigm is

110:57

coming back. And for a while it was kind

110:58

of shoved off as like that's boomer

111:00

[ __ ] you know? But the Cold War is

111:01

over, Grandpa, right? Cold War is over,

111:03

grandpa. There's no communist versus

111:05

capitalist versus that that's all. It's

111:08

like not it's not it's not done.

111:11

>> No, not at all.

111:12

>> It's not done. It's a story as old as

111:13

time and it keeps [ __ ] repeating

111:15

itself. And it's just weird that people

111:17

Well, I look, but also I believe in

111:19

social safety nets because I think that

111:21

there's a lot of people that are very

111:22

unfortunate and there's a lot of people

111:24

that do grow up with shitty parents or

111:26

parents that have a bad situation in

111:28

life. Maybe the father dies or the

111:30

mother dies and there's no like it's

111:32

good to be charitable and churches are

111:34

fantastic at that. It's one of the the

111:37

more pure charities that you're ever

111:38

going to find because their goal is

111:41

really just to help those people. Unlike

111:43

what what you think of as charities in

111:45

the modern sense, one of the grossest

111:48

[ __ ] things today is these enormous

111:51

charities that everybody thinks, "Oh,

111:53

I'm going to support this charity. It's

111:54

doing so much good."

111:55

>> 90% to the CEO.

111:57

>> Yeah, dude. I was watching this thing.

111:59

It was either li I think it was Live

112:00

Aid, you know, one of those uh one of

112:03

those concert things. What was Bono?

112:05

What was he involved in? Was it Live

112:07

Aid?

112:07

>> I don't remember. But Bono Bono I

112:10

remember his speech where he was like,

112:11

"Capitalism's done more to take people

112:13

out of poverty than anything else." I

112:15

thought that was funny.

112:16

>> It is funny.

112:16

>> Yeah. But uh I don't remember which one

112:18

he was involved in directly, but I know

112:20

what you're referencing. Well, Mike Benz

112:22

did this video today where he's

112:24

explaining how an enormous percentage of

112:26

that money went to regime change. Like,

112:30

it went to prop up CIA

112:33

operations. Like, the [ __ ] money that

112:35

people donated so generously to the LA

112:39

fires. Do you ever see where all that

112:40

went?

112:41

>> Where?

112:41

>> It went to like a hundred different

112:43

nonprofits. Like some of it was like uh

112:47

uh pro-immigration

112:50

that uh was like we talked

112:53

about it the other day. We had a whole

112:54

list of all the different things that

112:56

have been documented that that money Oh,

112:58

yeah. That's all. So very little money

113:00

is ever going to go to the actual people

113:02

that lost their house. Almost all the

113:04

money is going to go to these

113:05

nonprofits. All these nonprofits have

113:07

overhead. It goes to their employees. It

113:10

goes to the overhead costs. All these

113:12

people got bonuses.

113:14

Millions of dollars went to bonuses.

113:16

>> Yeah, but that's same with the state.

113:17

>> But how crazy is that? You get a bonus

113:20

for running a [ __ ] charity. That's

113:22

crazy.

113:23

>> Huge bonuses.

113:24

>> Huge bonuses.

113:25

>> Like hundreds of thousands of dollars or

113:27

millions in bonuses.

113:28

>> And that's the homeless situation. This

113:30

is the other thing about the homeless

113:31

situation in California. Oh, we're going

113:33

to help the homeless. It's really

113:34

important to donate to the homeless.

113:36

Let's help the homeless. California

113:38

spent $24 billion on the homeless

113:41

problem. It got worse. Not only did it

113:43

get worse, they can't account for the

113:45

money. And when the politicians have

113:47

unanimously voted to try to do an audit

113:50

to see where the money goes, Gavin

113:51

Newsom has vetoed it. It's wild. Well,

113:56

isn't it counterintuitive? Anyway, if

113:59

you're in an area and you say, "Look,

114:00

we're going to be really good to the

114:02

homeless here. We're going to give them

114:03

a lot of money, a lot of entitlements.

114:05

We're going to really help them get on

114:06

their feet." If you were homeless in a

114:07

neighboring state, where would you go?

114:10

[laughter]

114:12

>> Yeah. Yeah, you move to the place with

114:13

awesome weather, they'll give you money

114:14

>> where they're going to give you money.

114:16

>> And so, you know, and they do they do

114:18

just that. And so, that's why these

114:20

budgets become very bloated, right?

114:21

People are like, "Wait a second. There

114:22

used to be three homeless guys over

114:24

there. Now there's a tent city. What the

114:25

hell is going on?"

114:26

>> And they all get free crack.

114:27

>> Yeah. Yeah. Or free needles in their

114:29

needle exchange or whatever it is.

114:31

>> Um, it gets worse and worse. The state

114:34

does the same thing. State is allocating

114:36

tons and tons of cash that it gets in

114:38

social security tax. It's not going to

114:40

social security. It's like these

114:42

entitlements and entitlement spending.

114:44

>> Well, when people found out that social

114:45

security is going to illegal immigrants

114:47

an enormous amount of it, they were

114:48

like, "Wait, wait, what?" And they

114:49

denied it. They denied it and then, you

114:51

know, they had to fess up to it and

114:53

whistleblowers and

114:54

>> Well, why would you have a pay-go system?

114:56

Why would you have a system where you're

114:58

like, "This is social safety net that

115:00

you're paying into for your retirement

115:02

that you have to pay into. Why wouldn't

115:04

that go in a lock box? Why would you

115:06

have a Well, because uh we want access

115:09

to that money right now." and we'll

115:11

we'll pay it out later. That's what

115:12

we'll do. We'll pay it out later.

115:14

>> Uh it's like what? Why wouldn't you why

115:16

wouldn't you have it?

115:17

>> It's like misappropriation.

115:18

>> Of course. It's it's it's w it's wild

115:20

that they're allowed to access the

115:22

social security funds that are for your

115:24

retirement. And then they're like,

115:25

"Well, we're going to defund social

115:27

security. So, we'll shut the whole

115:28

government down, right?" Uh because then

115:30

you won't get your social security

115:32

checks. We'll weaponize the entitlement

115:34

that should just be in a lock box.

115:36

>> Yeah. You know it's it's well we'll pass

115:39

funding for it. Funding for the thing

115:41

they already paid.

115:42

>> Well the idea also is that you are

115:44

supposed to be paying into it so that

115:45

you will get money when you retire. But

115:48

your return on investment is so bad.

115:50

>> That's terrible

115:51

>> in compared to what would happen if you

115:53

spent that exact same money and put it

115:55

in like a fund a reliable fund. You

115:58

would get so much more money when you

115:59

retire. Like an enormous amount. Now,

116:02

well, now I almost feel like it's

116:03

hamstringing because if if it was the

116:05

case that they let you keep, you could

116:08

just opt out, you know, I don't want

116:09

social security, I want to keep it, and

116:11

then you took that and you put it in

116:13

those hedge funds and retirement

116:15

accounts and things like this, you would

116:17

way maximize over what you get in social

116:19

security.
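To make the gap they're describing concrete, here's a quick sketch comparing the same yearly contribution compounded at a market-like return versus a low implied return. The contribution amount, rates, and time horizon below are illustrative assumptions, not figures from the conversation.

```python
def future_value(annual_contribution, rate, years):
    """Future value of equal end-of-year contributions at a fixed annual rate."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + rate) + annual_contribution
    return total

# Illustrative assumptions: $5,000/year for 40 working years.
# ~7% is a common long-run estimate for a broad stock index fund;
# ~2% is a rough stand-in for the implied return on Social Security contributions.
fund = future_value(5000, 0.07, 40)
ss = future_value(5000, 0.02, 40)
print(f"Index fund:      ${fund:,.0f}")
print(f"Social Security: ${ss:,.0f}")
print(f"Ratio: {fund / ss:.1f}x")
```

Under these assumed numbers the fund ends up roughly three times larger than the low-return path, which is the kind of gap being pointed at; the exact multiple depends entirely on the rates you plug in.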

116:20

>> Yeah.

116:20

>> You know, so but you can't opt out even

116:22

though it's for you.

116:23

>> There's nothing the government does

116:25

good. Not a thing. Not a single thing.

116:28

So why would they be good at that? And

116:30

why would anybody support that? They're

116:31

just not good at it, especially when it

116:33

comes to money. There's always a bunch

116:34

of shenanigans that take place, you

116:37

know, and the idea that they would say,

116:38

"Oh, Social Security is sacred. This

116:40

this we're going to really treat we're

116:42

going to maximize your the amount of

116:44

investment and really take care of

116:45

people. We we really care about people."

116:48

>> Yeah. We care about them tons.

116:50

>> It's so naive. It's just It's so naive

116:52

and so obviously an ineffective and

116:55

possibly corrupt system.

116:57

>> Well, hasn't it become a weapon?

116:58

Entitlements are a weapon.

117:00

>> They're a political weapon.

117:01

>> Wow. It certainly helps. Yeah. And and

117:04

also again the suicidal empathy that Gad

117:06

Sad talks about. If you're on the left,

117:09

you think of it as being like you're an

117:10

empathetic person, a kind person. You

117:12

want people to have money when they

117:14

retire. You want people to have Medicaid

117:16

and you want people to have welfare and

117:17

you want people to have SNAP. Who was

117:20

the guy you brought up? Um I'm trying to

117:22

think of his name. It was like uh maybe

117:25

Professor Raft, something like that.

117:27

They brought up the Papua New Guinea

117:29

thing. Do you remember the Papua New

117:30

Guinea thing?

117:32

>> Yes, I do.

117:33

>> Yeah. So, on the that little island,

117:36

right, they have the Sambia people and the

117:39

Sambia people basically they molest young

117:41

boys. That's what they do, right?

117:43

>> Uh but apparently the young boys there,

117:45

they love it because it's a rite of

117:47

manhood, right? And it's all socially

117:48

conditioned in. The thing is with

117:50

suicidal empathy that's really funny

117:52

here to point out to a left is from

117:54

their paradigm there's nothing wrong

117:55

with that actually

117:57

>> where's the harm right that's part of

118:00

the suicidal empathy of the part of the

118:03

ideology of suicidal empathy is like for

118:05

me from my worldview it's like I don't

118:08

care if you don't think there's harm in

118:10

that there is we're stopping it emperor

118:12

Andrew done that's that's done no more

118:16

that's not allowed I don't care if it's

118:17

relativistic or Not it's over, right?

118:20

>> Yeah. I mean, it it's crazy to try to

118:23

defend that culture. That culture is so

118:27

wild. The the semen warriors of Papua

118:30

New Guinea.

118:30

>> Yeah. The semen warriors of Papua. For

118:33

people who don't know. And let's let's

118:36

instead of just talking about this,

118:37

let's read this from an actual source so

118:40

we can explain because they call the the

118:44

the children when I think they're six.

118:48

The boys have to live with a man that

118:50

they refer to as the anal father. And

118:54

this guy and in order for them to grow

118:56

strong, they have to consume semen both

118:59

orally and anally. And so they get mouth

119:02

[ __ ] and [ __ ] by this guy and

119:04

then they continue that when they grow

119:07

up

119:08

>> as part of their warrior culture.

119:10

>> And what stopped it?

119:12

>> You know what's what ended up finally

119:14

stopping a lot of that.

119:15

>> It's not going on anymore.

119:16

>> It still is, but it a lot of it was

119:18

stopped depending on the tribe you were

119:20

in because of Christian missionaries.

119:22

>> Interesting.

119:23

>> Because of Christian missionaries. But

119:24

here's the thing that cracks me up,

119:26

right? And this whole culturally

119:27

relativistic nonsense harm principles

119:29

stuff. you're Christopher Columbus and

119:32

you show up and if culture is doing

119:33

that,

119:34

>> don't you put them to the sword,

119:36

>> right?

119:37

>> Like if you see the pyramids and they're

119:38

cutting people's hearts out like we're

119:40

holding it up to this, to Ra

119:42

or whatever.

119:43

>> You're like, I'm supposed to feel bad

119:45

that they put you to the sword. Like

119:48

it's really hard for me to feel bad

119:50

about that, right?

119:51

>> You know, it's really hard for me to be

119:52

upset about that. But with the Sambia

119:54

people, same thing. It's like if you

119:56

went in there and you just used strong

119:57

armed force, this is why the libertarian

119:59

NAP and stuff like that I disagree with

120:01

cuz it's like if you went in there and

120:02

used strong armed force and just stopped

120:04

it immediately, so what? So what? How's

120:08

the world a worse place for this? How

120:10

is, you know, like and how is that not

120:12

ultimately stopping an egregious sinful

120:14

act uh that you can stop with ease,

120:17

>> right?

120:17

>> Like why not why not do that? Well, one

120:20

of the things uh I got really into uh

120:23

Aztecs recently and because I I did I

120:26

wasn't aware um that a lot of those

120:30

temples that they found You would think

120:32

that the people that build those

120:33

incredible pyramids and temples

120:35

>> No, they found them, didn't they?

120:36

>> Yeah. You would think they have to be an

120:37

incredibly sophisticated society. Well,

120:38

it turns out they didn't really build

120:40

them. They found them

120:42

>> and they referred to them as the place

120:43

where the gods were born. I didn't know

120:45

that. I I I was always told that they

120:47

built these incredible structures and

120:49

then the Spaniards came and they found

120:50

them and no when

120:52

>> no they were primitives who

120:54

found something that was extremely

120:56

advanced and then used it for their

120:58

primitive application.

120:59

>> Not just primitive but barbaric. Um when

121:02

they completed the consecration of the

121:04

temple of Tenochtitlan they killed

121:08

somewhere between 20,000 on the low end

121:12

and 80,000 on the high end. They

121:15

sacrificed 20,000 to 80,000 people

121:18

within four days. In four days. And this

121:22

was documented by God, I forget his

121:25

name. Um

121:27

something Diaz. He was a Spanish

121:29

chronicler because this is before Cortez

121:32

came. They you know they they they

121:35

started like trying to figure out what's

121:37

going on over there. And one one of the

121:39

things that this guy came back he said

121:42

this place is [ __ ] crazy. like they

121:44

killed 80,000 people and a lot of people

121:46

have disputed that 80,000 people but

121:48

then they found so many bones that

121:50

they're like okay it's probably

121:53

somewhere north of 20,000 which is crazy

121:56

enough they sacrificed them in four

121:58

[ __ ] days

121:59

>> morality is relative

122:01

>> right was 4,000 maybe

122:03

>> what's that

122:04

>> as many as 4,000 was as max as they got

122:06

to

122:07

>> do bonesies [clears throat]

122:08

yeah the people that if even if they did

122:10

20,000 I think the number I think I saw

122:11

was four people or it was like four

122:14

people every minute you would have had

122:16

to do or was it was something that

122:19

almost impossible to accomplish. They

122:20

just said the number was probably

122:21

exaggerated a lot,

122:22

>> right? But they said that the No, I

122:24

think if you

122:25

>> I looked I looked somebody brought up

122:26

>> but I looked it up too. I looked it up

122:27

yesterday actually. I looked it up

122:29

yesterday and they were saying that that

122:31

this guy who was the guy that um

122:35

Okay, we can

122:39

I can I know I have it saved so I can

122:41

find it in here. Um

122:45

so this guy uh this Diaz guy

122:50

>> who's chronicling this

122:51

>> Yeah. who chronicled it in

122:52

>> how long before was this?

122:54

>> Really really soon. Okay. Like within a

122:56

couple of decades before Cortez

122:59

Yeah.

123:01

Um,

123:04

oh god, I'm trying to find it.

123:07

But it

123:08

>> this does say maybe I I'll just read the

123:10

Perplexity thing said Spanish

123:12

sources claimed 80,400 victims in 1487,

123:16

>> right?

123:16

>> But modern estimates suggest 4,000 to

123:18

maybe 20,000,

123:20

>> right? So 20 Okay, they don't really

123:23

know. So 20 80,000 might be exaggerated

123:25

if you think about the number, but just

123:26

think about 20,000 people. Killing

123:28

20,000 people by cutting their hearts

123:29

out and throwing them down the steps of

123:31

the pyramid in 4 days. It's [ __ ]

123:33

crazy. So if you're the Spaniest,

123:37

the Spaniards, and you come here, you

123:38

don't feel bad about conquering those

123:40

[ __ ] You're like, "What are you guys

123:43

doing?" Or how about when they showed up

123:44

and they found the Mayans and they're

123:45

playing football with human heads.

123:47

>> Yeah.

123:47

>> Now, here's a funny one. They don't want

123:50

to believe that they played football

123:52

with human heads. So historians try to

123:54

say that they didn't play football with

123:55

human heads. Even though there's

123:57

artistic depictions of them playing

123:59

football with human heads, like no, that

124:02

was just symbolic. Well, did they

124:03

sacrifice humans? Yes, they did. But um

124:06

we think they played [laughter] football

124:07

with their heads. That would be rude.

124:08

>> It's that that whole myth of the noble

124:10

savage.

124:10

>> Exactly. And that whole myth of the

124:12

noble savage is something which is

124:15

utilized by the left in order to make

124:18

the claim that you are an imperialist

124:21

and an occupier and a person who yeah

124:25

you have colonized their land. And the

124:28

thing is is it's like

124:30

>> if that's what we colonized why do I

124:32

care? I ask this question all the time.

124:35

Why do I care if that's what I

124:37

colonized? If that's what my ancestors

124:39

colonized, why should I give a [ __ ]

124:42

about that?

124:42

>> Well, there's an amazing book about

124:44

Texas called Empire of the Summer Moon.

124:47

Uh, it's all about the Comanche. This

124:50

entire land used to be before Mexico

124:52

owned it. By the way, one of the funny

124:54

thing this lady said to me, you know,

124:56

uh, this all used to be Mexico. I'm

124:59

like, right. But do you know how for how

125:00

long? 15 years. Like, I've been here for

125:03

six. Like, you got to let that go.

125:05

That's not that long. 15 years. And by

125:07

the way, Mexico was only

125:08

>> and we had an Alamo over it. Okay.

125:10

>> Yeah. And by the way, Mexico was only

125:12

one when it owned Texas.

125:14

>> Mexico started

125:14

>> with its new constitution.

125:16

>> Yeah. Yeah.

125:17

>> Because then also you have the language

125:20

and the religion of your oppressors that

125:22

you're trying to say is this noble and

125:24

incredible culture that you're bringing

125:26

over to America.

125:27

>> And you're all Catholic.

125:28

>> And yeah, you're all Catholic. You all

125:29

speak Spanish. [laughter] They used to

125:31

have well first of all the people the

125:33

Native American people and the original

125:36

people and the Aztecs the Mayans the

125:38

Mexicans it's essentially the same kind

125:40

of people a lot of them are they look

125:43

the same was like if you look at sitting

125:46

bull looks like you could be working at

125:47

a taqueria

125:48

>> my wife calls them the pygmy people

125:50

>> oh they're tiny little people

125:53

they have flat noses and they like yeah

125:56

they look like pygmy people

125:57

>> well the the the original people of

126:00

Mexico had or what the land of Mexico

126:03

had over a hundred languages. The Mayans

126:06

alone had 30 different languages.

126:07

They're all lost. Like these languages

126:09

are all lost. And we're supposed to

126:10

think it's noble that this amazing

126:13

culture that have the language and the

126:15

religion of their oppressors and they

126:18

want to mo move here with this language.

126:20

They're colonizing. They're trying to

126:22

colonize a place. You've been colonized.

126:25

You're trying to colonize. And here's

126:26

the thing about colonizing everybody.

126:29

Everybody that doesn't live in Africa,

126:33

somewhere in their ancestry, there was a

126:36

colonizer. Like if you go to Minnesota

126:39

and you see these Somali communities and

126:41

everyone's speaking Somali, they have

126:43

Somali businesses. What do you think

126:46

that is?

126:47

>> It's a colony, of course.

126:49

>> It's just not a big one. It just hasn't

126:50

taken over the entire country. And when

126:52

it does, you'll think it's great because

126:53

they're not white,

126:54

>> right? Well, they're not colonizers.

126:57

They're not col you see white

126:59

immigrants. They're immigrants. Yeah.

127:01

>> Right. White people are colonizers.

127:02

>> Yeah. White white people are the

127:03

colonizers. Everyone else is the

127:04

immigrant.

127:05

>> Nobody feels bad for Swedish chicks with

127:07

big tits that are moving to America. You

127:08

don't think that those are colonists.

127:10

>> Yeah. Don't care. Yeah.

127:11

>> Don't care. It's only It's only the

127:13

colon. It's only the colonization. Oh

127:16

man. Well, the thing is funny moving

127:18

back to the to the myth of the noble

127:20

savage thing. How weaponized that is

127:23

when so much of it isn't true. Like for

127:25

instance, you've heard of the two

127:26

spirits. Yes.

127:27

>> Right. That's that's all that's all

127:29

[ __ ] too. The whole two spirit

127:31

people all [ __ ] Came from like one

127:33

guy. I don't remember started with a B,

127:36

right? Um who like Bardetche or

127:39

something like this that they called

127:40

them, right? And it was one tribe of

127:42

people who who uh had some like weird

127:46

thing that they did. That was it.

127:47

>> Well, it's probably gay guys.

127:49

>> Yeah. The gay guys that like dressing up

127:50

like women.

127:51

>> Yeah. That's where the whole two spirit

127:52

thing came. And then suddenly it's like

127:54

no, the Native Americans had the two

127:56

spirit. It's like, no, no, no, no, no,

127:59

no, no, no. That is not the case at all.

128:02

You just made it up because then you

128:04

could throw it in with your Skittles

128:05

[ __ ]

128:06

>> Right.

128:06

>> So [laughter]

128:07

that's the rainbow.

128:09

>> Yeah. Throw it in with the Skittles

128:10

[ __ ]

128:11

>> The rainbow. Yeah. Well, the thing about

128:14

this area here before Mexico owned it,

128:16

it was Comancheria. Uh, it was owned by

128:18

the Comanche. But you know how they

128:20

owned it? Cuz they killed the [ __ ]

128:22

Apache. That's That's how Yeah. Well,

128:25

they were [ __ ] brutal. That's why

128:27

this Empire of the Summer Moon book is

128:29

so good because it just shows you how

128:31

unbelievably barbaric the Comanche were.

128:34

They were the baddest [ __ ]

128:36

around cuz they had figured out horse

128:38

raising. Um, and by the way, they only

128:40

got those from Yeah. from Europeans,

128:43

right?

128:44

>> Yeah. Which is crazy because horses

128:45

actually originated from North America.

128:48

>> I thought they were in Europe and

128:49

brought here in from Europe. Horses. No,

128:51

horses originated in North America and

128:53

then made their way to Asia and then

128:55

were wiped out in North America and then

128:58

reintroduced Spanish.

128:59

>> But the but the natives didn't have

129:01

access to them until Europeans brought

129:03

Okay, that's what I thought.

129:05

>> Their culture was so incredibly wild if

129:08

you think about it. Like you you you

129:11

know, you're talking about where you you

129:13

think about Europe and Asia. You've got

129:16

people riding horses and building cities

129:19

and you've got like agriculture and all

129:21

these things. And in North America, you

129:24

basically have stone age people. It's

129:26

really kind of crazy. Really kind of

129:28

fascinating. And then they get horses.

129:30

And the Comanches were the first ones to

129:32

really figure out horse breeding. They

129:34

figured out how, you know, how to

129:36

castrate their horses

129:37

>> and they became Mongols.

129:38

>> Yeah. They basically became Mongols.

129:40

>> Yeah. They became Mongols once they had

129:41

access to horses. Well, that was what

129:43

that was the whole distinction. Anyway,

129:45

if you reduce it all between the two

129:47

civilizations, Europeans had

129:48

domesticable animals, natives didn't,

129:51

>> right?

129:51

>> And because we had domesticable animals,

129:53

we had labor, we built these these

129:56

amazing societies and they didn't. Yeah.

129:58

>> Like the difference that a work ox and a

130:02

workhorse can make in labor is

130:04

astronomical.

130:05

>> Yeah.

130:06

>> And so, you know, like um that's that's

130:09

the real difference. Same thing with

130:10

disease. They're like, ah, you know, the

130:12

whites brought over all their disease.

130:13

It's like, well, all those came from

130:14

animals, smallpox, all that. We got

130:16

immunity because of animal husbandry.

130:18

They didn't have any immunity to any of

130:20

that.

130:20

>> Not only that, there's real evidence

130:21

that syphilis came from Native Americans

130:24

and then they brought that at least some

130:25

forms of syphilis and they brought that

130:27

syphilis back to Europe and then all the

130:29

Europeans started going crazy and

130:32

getting holes in [snorts] their head and

130:33

losing all their hair and that's where

130:34

the big wigs came from

130:35

>> and then eating mercury pills to cure

130:37

themselves. [laughter]

130:40

>> It's crazy what people used to believe.

130:42

>> Yeah. It's really kind of fascinating.

130:43

But the point is even the people that

130:46

lived in America before those settlers

130:49

came, those people came from somewhere

130:52

else. They came from Siberia. You know,

130:54

everyone's a colonizer. Everyone all

130:56

over the world. People, you start in

130:58

Africa million years ago or whatever it

131:00

is, and then people start slowly moving

131:03

away from the people that were kicking

131:04

their ass looking for a better place to

131:05

live. But isn't the whole thing from the

131:07

leftist paradigm just to create um or to

131:11

delegitimize

131:13

uh the fact that you can say what do you

131:15

mean my my grandpa was born here?

131:18

>> His grandpa was born here, right? He was

131:20

a colonizer.

131:20

>> I'm an American and I have a right to my

131:23

nation because by birth I have a

131:25

birthright to the land that I'm on and

131:28

so do my fellow countrymen. And they

131:30

Nope. It's an attempt to delegitimize

131:31

that. Right. That's the whole point.

131:33

just to delegitimize your claim to your

131:35

own land.

131:36

>> Well, that's what we were talking about

131:37

earlier with lefts with leftists where

131:39

there's this purity test that no one can

131:41

ever pass because they'll always keep

131:42

pushing the boundaries further and

131:43

further. You're you're never going to be

131:46

there's there's no like real Americans.

131:49

Everyone who's white is a colonizer.

131:51

>> Yeah.

131:51

>> Yeah. It's just it's [ __ ] goofy and

131:54

it's just designed to point at someone

131:57

that someone is the bad person and this

131:59

is the reason why life sucks. and also

132:03

dismiss any of the terrible activities

132:05

that any of the other people participate

132:07

in because like oh they're they're just

132:09

oppressed. They're oppressed people so

132:10

they're lashing out.

132:13

>> Do you think what like if you had if you

132:16

again the bird's eye view what do you

132:18

think the left what do you think their

132:19

end goal is here?

132:20

>> I don't think they know. I don't think

132:22

their end goal their end goal is their

132:23

enemy is the the right and the right is

132:26

Nazis and fascists. They want to

132:27

eliminate the Nazis. In fact, they want

132:29

to roll initiative.

132:30

>> Roll initiative initiative, right?

132:32

>> Yeah. They want and they

132:33

>> 2d6 damage. It's happening.

132:35

>> And they think that once they get into

132:37

power, everything will be fine. But it's

132:39

not going to. And not only that, what

132:42

would be fascinating is if someone from

132:45

the left started behaving exactly like

132:48

the people that are on the right just

132:51

did it from a perspective of the left

132:53

where you would think, oh, this is okay.

132:55

And that's what we got during the Obama

132:57

administration. I sent you this thing,

133:00

Jamie, a little bit ago, the clip of

133:03

Obama talking about immigration. And by

133:06

the way, Obama do I and I was mistaken

133:08

on this. I thought that a lot of the

133:10

people that Obama deported were people

133:12

that were turned away at the border.

133:14

Uh-uh. That was a third. Most of the

133:17

people out of the I think it was 3

133:19

million over the course of his

133:21

presidency uh that were deported were

133:24

[ __ ] deported. Like arrested,

133:26

deported. A lot of people were killed.

133:28

Let's put on the headphones so we can

133:29

listen to the speech cuz this this

133:31

sounds very MAGA. Listen to this. There

133:35

are those [clears throat] in the

133:37

immigrants rights community

133:39

who have argued passionately

133:42

that we should simply provide those who

133:45

are illegally with legal status or

133:49

at least ignore the laws on the books

133:51

and put an end to deportation until

133:55

we have better laws. And often this

133:58

argument is framed in moral terms. Why

134:00

should we punish people who are just

134:02

trying to earn a living?

134:06

I recognize the sense of compassion that

134:10

drives this argument,

134:13

but I believe such an indiscriminate

134:15

approach would be both unwise and

134:17

unfair.

134:19

It would suggest to those tink thinking

134:21

about coming here illegally that there

134:24

will be no repercussions for such a

134:26

decision and this could lead to a surge

134:30

in more illegal immigration.

134:35

And it would also ignore the millions of

134:36

people around the world who are waiting

134:38

in line to come here legally.

134:44

Ultimately, our nation, like all

134:47

nations, has the right and obligation to

134:50

control its borders and set laws for

134:53

residency and citizenship.

134:57

And no matter how decent they are, no

135:00

matter their reasons, the 11 million who

135:04

broke these laws should be held

135:06

accountable.

135:09

That sounds so Republican. In 2010, that

135:13

was a Democrat saying that and everybody

135:15

was like, "Well, okay,

135:17

>> that's reasonable."

135:18

>> Yeah. And Tom Homan,

135:20

>> who is the head now, was the guy then

135:23

and he gave him a [ __ ] medal.

135:26

Find the clip of Hillary when she's

135:29

running in 2012 where Hillary is more

135:33

MAGA than Trump. The way she frames

135:36

things is so hardcore right-wing she

135:39

sounds to the right of Marjorie Taylor

135:41

Greene. If you've never seen this, have

135:43

you seen this one?

135:44

>> I think so, but I'm going to look again.

135:46

>> Wonderful. It's wonderful because it

135:48

just shows you how much horseshit. By

135:50

the way, how good was he? He was such a

135:52

good spokesperson. like the way he

135:54

talked was so it was so measured and so

135:58

noble in the way he phrased his his

136:01

sentences like it was really it's really

136:03

interesting how much perception plays a

136:06

factor in what you think of as like

136:07

someone being a good president because

136:09

everybody on the left thinks of him as

136:11

being like the most amazing president

136:12

ever this isn't the one and he wasn't

136:15

>> it keeps coming up though what I'm

136:16

looking for

136:16

>> but this isn't the one the one is she's

136:19

giving a speech

136:20

>> so I thought I was looking for but I

136:22

didn't even type in what I was looking

136:23

for. I just typed in 2012 and that's the

136:25

thing that keeps

136:25

>> Maybe it's not 2012. It might have been

136:27

2008.

136:29

Don't don't do uh Hillary is more MAGA

136:32

than Trump. See if you can find it.

136:36

It's I know it's on YouTube, but it's

136:39

this amazing campaign speech where you

136:42

got it.

136:43

>> This is 2008.

136:45

>> Is it 2008? Yeah, that's it. That's it.

136:47

Here it is. Listen to this. [laughter]

136:51

I love this one.

136:52

>> I think we got to have tough conditions.

136:55

Tell people to come out of the shadows.

136:56

If they've committed a crime, deport

136:59

them. No questions asked. They're gone.

137:01

If they

137:02

>> Cheers. Cheers from the Democrats.

137:04

>> If they've been working and are

137:06

law-abiding, we should say, "Here are

137:08

the conditions for you staying. You have

137:10

to pay a stiff fine because you came

137:11

here illegally. You have to pay back

137:13

taxes. And you have to try to learn

137:16

English. And you have to wait in line.

137:19

You going TO LEARN ENGLISH? EVERYBODY'S

137:21

CHEERING.

137:21

>> YEAH, they love it. [laughter] They love

137:23

it. And now

137:27

>> Trump Trump's a Nazi.

137:28

>> Yeah,

137:28

>> Trump's a Nazi.

137:29

>> That is more rightwing than Marjorie

137:33

Taylor Greene.

137:34

>> Yeah. The the Democrats were I mean

137:37

there used to be labor unions that would

137:39

put pressure on them, right? Uh this was

137:41

a big thing. Like there was labor

137:42

unions. That was what the Democrats had.

137:45

Yes. And the labor unions did not want

137:47

uh the cheap labor to come in and

137:49

displace them from having their nice

137:50

little highwage jobs. And so it was all

137:53

about we got to deport the illegals.

137:55

Like but what did Bernie Sanders say?

137:57

Mass illegal immigration

137:59

>> is a right

138:00

>> is a right-wing Koch brothers uh

138:02

conspiracy to bring in cheap labor. And

138:04

he wasn't wrong.

138:05

>> Right.

138:05

>> Okay. He wasn't wrong. But the thing is

138:07

is it's like

138:09

>> what the hell are we fighting over here?

138:11

Well, we're fighting over the fact that

138:13

uh the the left is just trying to satiate

138:16

itself with power and they don't really

138:18

care about what the moral paradigm is.

138:20

As long as they can get their people in

138:21

power, they'll use anything as a

138:22

lynchpin issue.

138:23

>> That's right. That's right. It's all

138:25

about power. The whole thing is about

138:27

power and that's what people need to

138:28

truly understand. You're being played.

138:30

You're being played. You're being played

138:31

in Minneapolis. You're being played all

138:35

around the country. It's it's about

138:36

power. It's about them getting power.

138:38

And if you think that once they get

138:41

complete, if they did, they were

138:42

successful, they imported millions more

138:44

to all these swing states, they allow

138:46

them to vote, they completely rigged the

138:48

system, now it's only, you think that's

138:49

going to be good for everybody, you're

138:50

out of your [ __ ] mind.

138:52

>> Well, let me ask you this. Do you think

138:53

then that Christians knowing this, they

138:56

know that these are bids for power when

138:59

you have Christian nationalism on the

139:01

rise and Christians moving towards that?

139:03

Doesn't that seem like it's a rational

139:05

and reasonable thing to do for them to

139:07

want the mindset of if we're not in

139:11

power, they will be in power.

139:14

>> It's rational from their perspective for

139:15

sure. What people are terrified of is

139:17

that it would restrict the freedom of

139:19

religion and that you would impose

139:21

Christianity on the entire country, you

139:24

know, and I don't think you should

139:25

impose any kind of religion on any

139:26

people. I think people should be free.

139:28

I've never seen the I know that they

139:31

there are of course the the people who

139:33

push that there has to be an established

139:35

theocracy in order for Christian

139:36

nationalism to work, but the frameworks

139:39

that I've seen that have political

139:40

legitimacy don't seem to push for that

139:42

at all. They push instead that the ideas

139:46

that Christians should not be hamstrung

139:48

from the ideals of of holding power

139:50

itself. that that does not make you bad

139:53

or evil or awful. No matter what the

139:55

left says, Christian or how Christians

139:57

are supposed to act and that when you

139:59

are in power, you should rule with

140:01

Christian ethics in mind. That's how

140:03

you're supposed to pass policy, public,

140:06

you know, public uh public policy of all

140:08

kinds is through those ethical means.

140:10

>> That certainly sounds like

140:11

>> not, hey,

140:12

>> it's going to be a theocracy, right?

140:15

>> Um that doesn't seem like it's a a

140:17

necessary component.

140:19

>> No. Well, people are afraid of the

140:20

concept of a theocracy and I think that

140:23

people are afraid of just human nature

140:24

and that if people did get into power

140:26

that that's what it would become just

140:28

like these people are just trying to get

140:30

into power that they would use

140:32

Christianity as as a vehicle and they

140:34

would just use that as an ability to

140:36

control people. The the the real concern

140:38

is just human nature. Human beings when

140:41

they get into any position of power like

140:43

to keep it and expand it. It's like

140:45

that's what they do. You know, I I tell

140:47

jokes. I talk [ __ ] That's what I do. I

140:49

like to talk [ __ ] I like to tell more

140:50

jokes.

140:51

>> But there have been good kings, right?

140:53

>> There have been. But boy, good luck.

140:56

Good luck finding a benevolent dictator

140:58

to

140:58

>> Well, not anymore. I don't I don't think

140:59

you would have to use a utilize a

141:01

dictatorship. But if it's the case that

141:03

we can point to like there were people

141:05

who had a lot of power who fundamentally

141:07

were pretty good. What was it that

141:09

they're pointing to that made them good?

141:11

Like

141:12

>> is there something we can point society

141:13

towards that could make our leaders a

141:15

bit better? that can make our leadership

141:18

not hyperfocus on uh the nonsense of

141:21

like gay marriage and stuff like that

141:23

which is completely and totally

141:25

unimportant at the political level and

141:27

isn't even up to the federal

141:28

government anyway.

141:30

>> Yeah. And I think it's a political tool

141:32

too. you know, uh, Anna Paulina Luna was

141:35

on the podcast and she said something

141:36

that I really didn't consider about

141:38

certain political problems that exist in

141:41

this country that they don't want to

141:43

solve them

141:44

>> because they want to use them to finance

141:46

their campaigns. They want to run on

141:48

those principles. They want to run

141:50

>> to always be there.

141:51

>> It needs to always be there.

141:52

>> It needs to always be there as an issue.

141:53

And it's like uh

141:54

>> I feel I I think a lot of these can be

141:56

solved like if we if we were to have

142:00

>> politicians en masse and their supporters

142:02

en masse who followed Christian ethics. I

142:04

do think a lot of those sub issues get

142:06

solved very quickly

142:07

>> if it's true Christianity if they really

142:09

do follow the teachings of Jesus Christ.

142:11

But I think what people are really

142:12

worried about is like when people think

142:15

about Christians, they think about the

142:16

worst-case scenario of Christianity,

142:19

which is like evangelicals on television

142:21

that just try to get private jets.

142:23

>> But how is electing atheists better or

142:25

electing socialists better or electing

142:27

any of these people better? It can't be

142:29

better. Right.

142:29

>> Right. It's not better like if someone's

142:31

a complete sociopath, it doesn't have

142:33

any moral framework like a Gavin Newsom

142:35

type guy. Like that's that's even more

142:37

terrifying.

142:38

>> Yeah. So, it's like if if I'm going to

142:40

be ruled, can I at least be ruled by

142:41

people who have my ethics or

142:44

>> really [laughter] who really believe and

142:45

that they're trying to make the world a

142:47

better place and they're not just trying

142:48

to acquire wealth and and help their

142:51

donors acquire more wealth. It's spooky.

142:54

It's spooky because people that have

142:56

power, you know, it scares the [ __ ] out

142:59

of everybody else. And it should because

143:00

historically it's never been good. It's

143:02

almost always when people have power,

143:04

they want more power and they want to

143:06

also support the people that help them

143:08

acquire that power. And then they want

143:09

to make sure they got that power locked

143:11

down. So what's the best way to do that?

143:12

Well, you restrict people's ability to

143:14

express themselves. You restrict

143:16

people's ability to travel. You take

143:18

away as much money as possible, tax them

143:20

as highly as possible so they're always

143:22

in this like state of constantly

143:25

struggling to pay their bills. You keep

143:26

them completely no one's comfortable

143:28

ever. and then, you know, have this

143:31

problem that we have to solve. This is

143:33

why you have a problem. It's uh these

143:35

people.

143:35

>> Yeah. And we're the we're the solution.

143:37

The causers.

143:38

>> Yeah.

143:38

>> The causers are the solution.

143:40

>> Do you you know, one of the things that

143:42

I I mean, you you engage in so many

143:44

[ __ ] debates, man. I've I've watched

143:46

I've consumed a lot of your content

143:47

online and I always wonder like what

143:50

does that wear on you after a while?

143:52

>> Constantly. Oh, yeah. All the time. All

143:54

the time. Well, the thing is is that

143:57

[sighs] so I [snorts] I argue from a

144:00

worldview. My worldview is Christian

144:02

ethics and this is a foundation from

144:04

which all other arguments are starting

144:06

and ending. Now, I'm happy to meet

144:09

people in the middle. A lot of people

144:10

want to argue in the middle, right?

144:12

We're going to get past all the

144:13

foundational stuff and we're going to go

144:14

to the menu or the middle of the

144:16

argument and start there. And I'm kind

144:18

of happy to do that to kind of move the

144:21

move everything backwards or forward so

144:23

we can either get to the end or we can

144:24

get to the beginning and get this

144:25

figured out. Yes. What wears on me the

144:28

most about it is there's a lot of people

144:31

who I debate with who I know don't

144:33

believe what they're saying.

144:35

>> I know I know for sure and I there's

144:38

moments where I catch myself where I

144:40

recognize it right then that moment in

144:42

the debate and then I'll hammer them.

144:44

But it happens all the time where I'm

144:46

like, "You don't believe that [ __ ]

144:48

There's no way." And then they'll come

144:49

back with a, you know, with a re I do

144:52

and and and you could just tell it's

144:53

disingenuous. Yeah. Right. I can't

144:56

logically show it. I There's no way for

144:58

me to logically show necessarily your no

145:00

your motivation maybe in extreme

145:02

context, but

145:03

>> yeah, man. There's people who are pretty

145:04

disingenuous about their view and

145:07

there's times where it comes out and the

145:09

whole audience can see it and you can

145:11

see it and you're just like just why?

145:14

Just why are you like you don't even

145:16

believe the [ __ ] yourself and you're

145:18

you're propagating it on other people

145:20

and you know people will follow it. You

145:21

know there's some cash there. You know

145:23

there's a but you're doing it anyway.

145:25

You're doing it anyway. Like I've always

145:27

thought in my head you take a guy like

145:28

Destiny, right? The coomer gremlin as I

145:31

like to call him. Okay.

145:32

>> What do you call him? Coomer gremlin.

145:33

>> What's that mean?

145:34

>> Well, like coomer, like he just all he

145:36

does is he he 'bates. He's like a sexual

145:38

degenerate, right?

145:39

>> Is that what a coomer is?

145:41

>> Yeah. Well, a coomer it's a it's a little

145:43

more mild than that. A coomer is just like

145:45

um kind of one of the higher values is

145:47

just kind of having sex with everyone

145:49

right around like that's what you do,

145:52

right?

145:52

>> Yeah. Yeah. He did. Yeah. Well, he's all

145:54

kinds of sexual apparently. But the

145:56

thing is is like um [snorts] I've often

145:58

thought that there's times where I'm

146:01

talking to the guy where I'm like I you

146:03

don't believe that. Like you just

146:05

there's no [ __ ] way you believe that

146:07

[ __ ] You're making it up and I know

146:09

you're making it up. Right. And you'll

146:11

catch him at times. You'll be like

146:12

whatever I got to say to win the

146:13

argument.

146:13

>> Yeah.

146:14

>> And it's like yeah I believe that.

146:17

>> Yeah.

146:17

>> I believe that. But it's like there are

146:19

people who genuinely believe their view

146:22

and are excellent debaters.

146:25

uh backing their view and I love those

146:28

engagements. You know, I live for those

146:29

engagements. The problem is it's like

146:32

it's 5% of them,

146:33

>> right? A lot of people are just trying

146:35

to win, right?

146:35

>> Well, not just trying to win, but I

146:38

don't even have a problem with going

146:39

into a debate with a mindset of winning

146:41

it. If you're representing a view you

146:43

believe,

146:43

>> right?

146:44

>> You want to win the engagement, whether

146:46

it's a conversation or it's a debate,

146:48

you want to you want to win people over

146:50

to your side. You want to even win the

146:52

person you're talking to over to your

146:53

side. you know, or maybe sometimes you

146:56

got to be brutal and destroy the view

146:58

completely so people don't move towards

146:59

it. Both of those are completely I I

147:02

consider them both fine and I think

147:04

they're both effective, but the issue

147:06

that I have ultimately is when you're

147:08

arguing with somebody and you know they

147:10

don't believe what they're saying.

147:11

>> Yeah.

147:11

>> And yeah, that that wears on you. And uh

147:14

it's not just that, but sometimes you

147:17

hear the same recycled arguments over

147:20

and over and over and I'm like, you

147:21

don't even have to tell me anymore. I

147:23

can get to the end before you can.

147:25

[laughter] I can tell you exactly where

147:26

you're gonna go, what you're gonna say,

147:28

why you're gonna say it, what your

147:29

justification's going to be, and I can

147:30

just get to the end and take care of

147:31

this right now.

147:32

>> It's got to be weird. Like, how old are

147:33

you?

147:34

>> Uh, 42. Just turned 42.

147:36

>> Weird like in your late 30s, early 40s

147:40

to like have entered into this world.

147:42

>> Oh, yeah. Dude,

147:44

>> so bizarre. It is. It is beyond bizarre.

147:46

I mean to be like a normal

147:48

>> working-class guy doing

147:51

>> literal nobody from nowhere, no

147:53

political experience,

147:54

>> nothing. I had no no entryway, nobody in

147:57

entertainment, nobody to help me along,

147:59

nothing.

148:00

>> And so was it just seeing how ridiculous

148:03

people were being during the COVID

148:06

pandemic that like motivated you to be

148:10

vocal about all this stuff?

148:11

>> It Well, and I had time.

148:13

>> Yeah. You know, time's a big one. But

148:15

with the with the layoff, it's like, oh,

148:18

well, I don't really have a lot to do.

148:20

And I'm listening to this and it's like

148:21

now I can maybe I can engage a little.

148:23

Maybe I can get involved a little bit.

148:25

You know, not much. I didn't think I

148:27

didn't think anything would ever come of

148:28

it. You know what I mean? I just wanted

148:30

I saw my view wasn't being represented

148:32

very well.

148:34

>> But did you have a a history of

148:37

education? Like were you were you just

148:40

reading books? Like where did you

148:42

develop these ideas?

148:43

>> Yeah. Well, it wasn't just from books,

148:45

right? I would listen to long-form uh,

148:47

you know, history podcasts. Um, I would

148:51

[clears throat] more than anything I

148:52

would be listening to, you know, the the

148:55

mediums changed, but I would listen to

148:57

what people had to say on a a variety of

148:59

issues and I would watch the news

149:01

incessantly and I would be able to pick

149:03

out what's true and what's not true

149:04

after a while. Uh, political education

149:07

comes from a variety of sources. You

149:09

can't get it from the news and you can't

149:10

get it from listening to just podcasts

149:12

and you can't get it by just

149:13

talking to people. You have to take a

149:15

sum total of everything. All of it in

149:18

order to at least be even moderately

149:20

politically savvy and understand what's

149:22

going on in the world. And I realize

149:23

most people make commentary on things

149:25

they have no [ __ ] idea what they're

149:26

talking about.

149:27

>> Right. And so how did you transition to

149:29

doing this as a job?

149:31

>> Um well it's about two years in to uh

149:35

doing this. So, I was like, "Look, I

149:37

sat down with my wife and I said,

149:40

I'm not making enough money on my

149:42

podcast to quit my job. There's no way."

149:45

You know, or doing debates to quit my

149:46

job. There's no way. But I think I

149:48

could. I actually think I could if I

149:51

just focused my time on it now. I think

149:54

I could do it. Replace my income with

149:55

ease.

149:56

>> God, that's a big risk, right?

149:57

>> It was a huge risk

149:58

>> because you have a family.

149:59

>> Yeah, huge risk. And she said, "Okay."

150:02

>> Wow. >> Gave me a kiss and uh next day I

150:06

went in and quit and I was like I don't

150:08

know what the [ __ ] I just did.

150:10

[laughter]

150:12

I you know it's um in some ways it was

150:15

like it it's like a "I'm going to go be a

150:18

big football star" screw you to

150:20

your boss, you know what I mean? And

150:21

it's like it wasn't the same exactly but

150:24

it was a big risk but I just thought you

150:26

know I really can make a go of this. If

150:28

I can focus my time and energy on this I

150:30

think I'll do really well at it. Well, I

150:32

think you have a very unique mind for it

150:34

and uh I think you're very good at it

150:37

and I also think you have really good

150:39

points that are very valuable for people

150:41

to hear and you're really good at

150:44

pointing out the logical fallacies and

150:46

pointing out the ridiculous thought

150:48

processes that a lot of these people

150:50

have. And I I you know that's important

150:53

man. It's it's important for society. You

150:56

probably don't think of it that way. You

150:58

probably just enjoy doing it and feel

151:00

like it's but it's valuable

151:03

>> because there's not a lot of people that

151:04

are good at it.

151:05

>> I get hundreds of DMs weekly from people

151:08

and they'll say things and this again

151:10

I'll never get used to it but what I do

151:12

this is my process. I sit down every

151:15

morning uh I have a cup of coffee and I

151:19

just respond to every DM that's sent to

151:21

me.

151:21

>> Wow.

151:22

>> So um

151:24

>> I used to do that

151:24

>> for me. Yeah. You probably get too many,

151:27

right? Yeah, it's unto

151:28

>> I get I thought so too. I thought, well,

151:30

if I start getting hundreds every day,

151:32

there's just no way. But I still do it

151:34

every morning.

151:35

>> How much time does it take you? It

151:36

>> takes me hours about two hours.

151:37

>> Wow.

151:38

>> Two hours every morning. I'll sit down

151:40

and I'll go through them and I can't

151:42

send back long paragraphs, but usually

151:44

I'll read exactly what they say and even

151:46

I'll just say something like, "Thanks

151:47

for the support." Or, "You know, I

151:50

really appreciate you saying that. Uh,

151:52

that means a lot." You know, cuz it

151:54

does.

151:54

>> Yeah. To me, it's my privilege to have

151:57

fans. It's not uh their privilege to be

152:01

one.

152:02

>> And so, when I started to see that and I

152:05

started to see, wait, this actually does

152:07

have a massive effect on people. I also

152:10

began taking it very much more seriously

152:12

because I understood uh you I can also

152:15

say things that do the opposite. They

152:17

could move people towards the opposite

152:19

of things which are which are good. You

152:21

know what I mean? or things which are uh

152:23

things which you should be moving

152:24

towards and so you know I do take it

152:27

seriously and I understand my job is to

152:29

represent a worldview and when I go into

152:31

a debate that's exactly what I think

152:32

millions of people are going to see this

152:34

worldview on display I'm representing it

152:36

I need to do the best I can to represent

152:38

it well

152:39

>> to be an intelligent reasonable person

152:42

who's both well read and has very good

152:46

points that you can express about social

152:49

issues societal issues just

152:52

it has

152:54

it's a massive thing. It's it's a very

152:57

important thing that um you know

153:01

mainstream media is not doing a good job

153:03

of filling that role. It just doesn't

153:06

you know there's not a lot of people out

153:08

there I mean Christopher Hitchens is

153:09

dead. There's not a lot of people out

153:11

there that are really good at debating

153:15

against ridiculous people and exposing

153:18

this and and it's it's so important for

153:20

people to sit down and see something

153:22

like that and to to recognize like, oh,

153:25

I've heard people like that talk. Oh,

153:27

I've always wondered like that doesn't

153:28

make any sense. Why doesn't someone tell

153:30

that guy to shut the [ __ ] up? Why

153:32

doesn't someone? And you do that.

153:33

>> And that's my job.

153:34

>> That's your job.

153:35

>> My job is to go in specifically and say,

153:37

why don't you shut the [ __ ] up? Because

153:38

what you're saying what you're saying is

153:41

so detrimental to people, too.

153:43

>> Yeah.

153:43

>> And it it's nightmare fuel for them. Like

153:45

I mean, people hear this stuff, man.

153:49

Like I I remember this one guy, he DM'd

153:50

me and he was like, Andrew, I hate these

153:53

[ __ ] people. Like I hate them. He

153:55

said, "I'll listen to him, man." And I

153:56

just [ __ ] rage in my truck. I'm like,

153:58

"I [ __ ] hate these bastards." He's

154:00

like, "But I can't stop listening." He's

154:02

like, and it his mindset was, "I want to

154:04

know what the enemy is thinking." Right.

154:06

That's his mindset. Yeah.

154:08

>> And I think a lot of it is maybe that

154:10

particular guy's addicted to rage or

154:12

whatever, [snorts] whatever you want to

154:13

frame it. I don't think so. I think the

154:15

truth is is that when people are trying

154:17

to get to the bottom of things, they're

154:19

trying to be like, "Why is this

154:20

happening? Why is this going on? Why do

154:22

these people think the way they do?" And

154:24

then they start listening to them.

154:25

Sometimes it's way worse than you

154:27

thought. It's like, "You really believe

154:29

that [ __ ] You really think that that's

154:31

the case? You really think that we

154:33

should be doing anything like this? What

154:35

is wrong with you?"

154:36

>> Yeah. And I I think that for a lot of

154:38

people that could be a a very kind of

154:40

like jarring experience for them. And I

154:42

I think that that's healthy though. I

154:44

think that's healthy for you to be kind

154:45

of jarred out of complacency a little

154:47

bit.

154:47

>> Well, it's certainly healthy for other

154:49

people to watch it cuz certain people

154:52

lean in one direction or the other and

154:54

they're not really exactly sure how they

154:56

feel about things. And sometimes someone

154:59

who has bad ideas can be very compelling

155:03

with these bad ideas because they're not

155:06

being confronted by someone who's better

155:07

at it,

155:08

>> you know, and I think that's a very

155:09

important

155:09

>> or even as good.

155:10

>> Yeah.

155:11

>> You know, even even some if you just

155:13

draw a stalemate,

155:14

>> it's like sometimes even that's good

155:16

enough.

155:17

>> Um because it's like, you know, maybe I

155:21

was leaning towards this or I was

155:22

leaning towards that, but I'm not sure.

155:24

Again, sometimes that's the best

155:26

thing, right? May maybe you shouldn't be

155:28

too sure on this side or that side,

155:31

right?

155:31

>> Um, but you know, before you commit, at

155:34

least maybe I can stop you from making a

155:37

commitment to this. Tons of people are

155:39

like, "Man, I was on the fence about

155:40

Christianity. I was on the fence about

155:42

Orthodox. I was on the fence about this.

155:44

I was on the fence about that. This

155:46

debate did it for me. Listening to what

155:47

these people had to say. This debate did

155:49

it for me. This debate did it for me."

155:51

you know, different fans have different

155:53

highlights that they like because

155:55

they're all coming from different walks

155:56

of life, but they're all very similar in

155:58

one aspect. Most of my audience are

156:00

married men, uh, you know, or of

156:02

marriageable age, but, you know, late

156:04

20s, uh, through 30s, early 40s. That's

156:07

that's about the demographic, but mostly

156:09

like 32 to 45. And so these people, they

156:13

have some life experience. They're not

156:15

dummies. And they're listening to and

156:17

they're like, "It's about time someone

156:18

told that." Yeah,

156:19

>> you know, somebody let them know what

156:21

was going on. Somebody challenged those

156:23

ideas. Somebody buried them. And um

156:26

yeah, that's that's what I'm effective

156:27

at doing. And that's what I'm going to

156:30

keep doing because these people are and

156:32

the higher here's what I've learned. The

156:35

higher I go in confrontation with the

156:38

higher level people, the dumber they

156:40

get.

156:40

>> Really,

156:41

>> the dumber they get. Back in the old

156:43

Twitch Blood Sport days when it was 50

156:46

live viewers and me against two leftists

156:48

and we were slugging it out, they were

156:50

smarter. They were these were much

156:53

smarter people than the high-level

156:54

academics. Like, I took on these two

156:56

academics recently at Debatecon. Both of

156:58

them are Ivy League graduates, right? It

157:01

was nothing. I could have I could have

157:04

easily destroyed them while enjoying a

157:06

hot bowl of soup. It would not have like

157:09

it was just it was it was

157:10

inconsequential. Why do you think that

157:12

is?

157:13

>> I Well, I think it's because of there's

157:15

a degree of asskissing and there's a

157:18

degree of people around you affirming

157:21

over and over and over how [ __ ] great

157:23

you are. That's where that egotism comes

157:25

in where I was saying earlier in the

157:27

podcast, you have to make sure you're

157:28

grounded. Make sure that your ego never

157:30

takes over. Make sure that you don't

157:32

become the thing that you hate. Right.

157:35

>> And it's so easy to do. But it's also I

157:38

think I think that as they go up, things

157:42

become more cerebral and academic rather

157:47

than applicable and those kind of old

157:50

debates that I was doing were people

157:51

living in it not external from it

157:55

>> and so they you know it did they had

157:57

real emotion behind it this wasn't just

157:59

a thing on a chalkboard

158:00

>> right

158:01

>> you know so uh and the other thing is I

158:03

think a lot of people get where they are

158:04

in media through connection

158:07

and not because of merit. I think a lot

158:09

of people who are in media and political

158:11

pundits have no [ __ ] business being

158:12

there at all. They're dumb as a box of

158:14

rocks and they're there because they had

158:17

connections or they had friends who

158:18

assisted them in getting in the position

158:20

they are. Uh and when they are actually

158:22

confronted on their views, they fall

158:24

apart.

158:25

>> They totally fall apart. I've seen

158:26

comedians comics who were on the road

158:29

for years do better in academic debates

158:32

than academics.

158:34

>> Yeah.

158:34

>> And I Well, how is this possible? Well,

158:36

it's possible because that guy has real

158:38

world experience. He's probably just as

158:40

well read as you. He had a lot of downtime,

158:42

right? So, he educated himself, but he

158:44

can do the thing you can't. He can apply

158:47

it. That guy had he has a way to apply

158:50

this knowledge in a in a framework that

158:52

works because he's part of the apparatus

158:54

of the world. And there was nobody there

158:57

where he was like he tugged on their

158:59

their shirt sleeve and said hey daddy or

159:01

hey you know Uncle Bucks or whoever you

159:04

know I want to be on Fox News and now

159:07

they have an in you know and I think a

159:09

lot of that in media happens. I think

159:11

it's very very a lot of nepotism there

159:14

and a lot of people just really got no

159:16

business being there at all.

159:17

>> Don Lemon

159:18

>> Don Lemon not just Don Lemon. I mean no

159:21

there's so many of them. I just like to

159:23

pick on him cuz he's picked on me.

159:25

>> Yeah.

159:25

>> Well, I don't know what Don Lemon's

159:26

doing picking on anybody. Like you would

159:28

you would same thing. You would destroy

159:30

Lemon while enjoying a hot bowl of soup.

159:32

It would just be nothing because Lemon's

159:34

biggest problem is he had never had any

159:38

business being there. He was he was a token.

159:40

Literally a token. He was the token gay

159:43

black guy.

159:43

>> He was not valued for his great insights

159:46

and wonderful political takes and the

159:48

fantastic way in which he broke down the

159:49

issues of our time. He was valued

159:51

because he was a gay dude who was black

159:53

who was, like, liberal talking points.

159:55

>> Yeah,

159:56

>> that's it.

159:57

>> Yeah, I agree. Hey, listen, man. I

159:59

enjoyed this. Let's do it again

160:00

sometime. Absolutely. And tell everybody

160:02

your show, The Crucible, where they

160:03

could find it.

160:04

>> Yeah, my show is The Crucible on

160:06

YouTube. Uh you can also make sure you

160:08

go and grab a copy of my wife's book,

160:10

Occult Feminism. It's fantastic. I brought

160:12

you a copy, Joe.

160:13

>> Cool.

160:13

>> And then uh I know you I know you love

160:15

feminists. That's why I brought that's

160:17

why I brought that copy. And then um you

160:20

can also catch me at uh Debate University.

160:22

It's the thing that I've done for

160:24

years. It uh it'll teach you how to

160:26

debate. You can go check that out as

160:28

well. debateuniversity.com. I really

160:30

appreciate the time.

160:31

>> Hey, I appreciate you being here, man. I

160:32

think what you're doing is great. I

160:33

really do. I enjoy it.

160:34

>> Thanks.

160:35

>> All right. Bye, everybody.

160:41

[music]

Interactive Summary

The podcast discusses various topics, including the nature of political discourse, the impact of social media, and societal issues. A significant portion of the conversation revolves around the perceived rise of left-wing ideology, its tactics, and its perceived negative impact on society. The speakers delve into topics such as the weaponization of social issues, the decline of traditional values, and the role of religion in public life. There's also a discussion about the effectiveness of different political ideologies and the importance of a moral and ethical framework for a functioning society. The conversation touches upon the nature of debate, the importance of intellectual honesty, and the challenges of navigating public discourse in the current political climate.
