
Joe Rogan Experience #2478 - Theo Von

Transcript


0:01

Joe Rogan podcast. Check it out.

0:03

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day

0:13

making.

0:15

Thank you.

0:16

>> Who do you mean by those people?

0:18

>> Uh, you know,

0:20

>> you know,

0:22

>> it's changed over the years.

0:23

>> With the horns. I don't know. You know,

0:26

>> you mean band members? Who are you

0:28

talking about here? stuff. Uh, music

0:31

>> music music industry,

0:32

>> dude. Yeah,

0:33

>> we're just talking about So, we should

0:34

tell people what we're talking about. If

0:36

you hum a song just like [ __ ] around and

0:40

like, you know, like the cocaine song,

0:42

you know what I mean?

0:43

>> If you play Eric Clapton, you if you do

0:45

that, you'll get flagged on YouTube.

0:48

They and they take money from you.

0:50

>> How desperate is that?

0:51

>> It's gross. Like, you can't even hum a

0:53

song. You can't like What are you

0:55

talking about?

0:56

>> You can't even hum in the future. You're

0:58

not even going to be able to fall in

0:59

love. They're going to charge you for

1:00

it.

1:00

>> How are they going to do that? Well, you

1:02

won't be falling in love with the person

1:03

anymore.

1:05

>> People will be outdated. People come

1:06

with problems.

1:07

>> I ain't coming on no bot.

1:09

>> No. Ever? What about in one?

1:12

>> No. What will they do with it?

1:13

>> Keep it.

1:14

>> Yeah.

1:14

>> Maybe that's what keeps them alive.

1:17

Imagine that.

1:18

>> Let me think about it.

1:19

>> You got to [ __ ] her every day to keep

1:20

her alive. If you don't, she starts

1:21

shriveling up on you

1:22

>> like she's on Ozempic.

1:25

>> So she's Latina.

1:27

plump. You got to keep her plump.

1:29

>> You got to keep her You got to keep uh

1:30

You got to keep the juices flowing, huh?

1:32

>> There'd be guys that would sign up for

1:33

that.

1:34

>> Okay, I could I think I could do that.

1:36

But day 526

1:40

in a row, you'd be like, "Oh my god, I

1:42

can't do this."

1:43

>> Yeah.

1:43

>> And she's dying.

1:45

>> Why is she dying? She's electric, isn't

1:46

she?

1:47

>> She only gets powered by [ __ ]

1:49

>> Oh. Oh, it's sad.

1:51

>> And three days with no come, she shuts

1:53

off and that's it. And you can't bring

1:54

her back. I'd shut her down quick. I'll

1:56

tell you that. Dude, she would be

1:59

>> You have to let your buddies [ __ ] her

2:01

just to like keep her alive.

2:02

>> Oh, that's going to be gross, Joe.

2:04

>> It would be.

2:05

>> And it would be sad and stuff like that.

2:07

And you'd have some buddy like late at

2:09

night like, "Hey, bro."

2:10

>> He loves her.

2:10

>> Yeah.

2:11

>> What's your wife doing? Like texting you

2:13

at like 4:00 a.m.

2:15

>> Bro, if you need me to keep her alive,

2:16

he's over there stroking it while he's

2:18

on the phone with you.

2:19

>> Bro, plug your wife in for a little bit,

2:21

bro. Let me char Yeah, let me come over

2:23

there. We're getting close. Did you see

2:25

those ones they have at the Consumer

2:26

Electronics Show in Vegas?

2:28

>> The dancing ones?

2:29

>> No, it's an AI companion that's a robot.

2:32

It's like a very pretty lady and her

2:34

mouth moves and she talks and

2:36

>> it's not there yet, but it's in the

2:38

neighborhood. You know, it's not at the

2:40

right door, but it just entered the

2:41

community. You know what I mean? You

2:43

think so? You know, some communities

2:44

have that awning, welcome to like

2:46

Paradise Estates, and you go through and

2:48

there's all the houses in the

2:48

subsection.

2:49

>> Yeah.

2:50

>> It's in the door,

2:51

>> right?

2:51

>> It's in the door. It's not at your house

2:53

yet.

2:53

>> He had no oaks or whatever.

2:54

>> Exactly.

2:55

>> Or Hunter's Glenn or Racist Cove or

2:57

whatever.

2:58

>> The [ __ ] robot has made it through

3:00

>> onto your street. It's just not at your

3:03

house yet. And it will be in 5 to 10

3:05

years.

3:06

>> My kids aren't [ __ ] robots.

3:07

>> They will. If you have kids, they're

3:09

going to [ __ ] robots.

3:09

>> No, they won't.

3:10

>> You won't be able to stop them. All

3:11

their friends are going to be able to do

3:12

it. It'd be rude. It'd be like keeping

3:13

them off social media.

3:16

>> That's crazy, dude.

3:18

>> If you keep your kids off social media,

3:20

they feel left out. They're like, "Come

3:22

on, Dad. Let me get Snapchat." Like,

3:24

"No, son. I want you to concentrate on

3:27

your homework and your football. Come

3:29

on, Dad. Come on, Dad. Let me get

3:32

Snapchat."

3:32

>> No. One of Look, you your dad wakes you

3:34

up early. He's like, "Look, one of you

3:36

little bastards left the freaking com

3:37

robot in the yard.

3:40

Which one of you

3:41

>> delirious covered in [ __ ] All your

3:43

friends [ __ ] it."

3:46

>> That's sad. I do think that one day our

3:48

smiles will be in a museum. That's where

3:50

we're headed. It's like the feelings are

3:52

starting to disappear. You know,

3:54

>> maybe that's what autism is,

3:56

>> like severe autism.

3:58

>> Well, I've thought about that a lot.

3:59

That that's why we're getting to some of

4:01

like the only way we could get to this

4:03

place if we get to this data driven

4:05

place where it's like,

4:06

>> you know, alien is things start to feel

4:09

alien here is because of um autism

4:12

leading uh it's when autism mixes with

4:15

um what's it called? Our societies based

4:17

on money, capitalism. M

4:19

>> when autism and capitalism converge,

4:21

things got really weird,

4:22

>> right? And think about it, right? We

4:23

don't know exactly what is causing

4:26

autism, they have a lot of suspicions. A

4:28

lot of them have to do with vaccines and

4:30

different medications and different

4:31

chemicals and pollutants and all sorts

4:33

of different things

4:34

>> too. One thing we could all agree on and

4:37

Tylenol, they think too, right? But one

4:39

thing that we can all agree on, it's a

4:43

big factor is stuff that we've created.

4:46

That's a big factor. Whatever it is,

4:48

let's let's not put the blame on any one

4:50

of these industries,

4:51

>> but there's something going on where

4:53

more people are getting autism now than

4:55

ever. And it seems almost positive that

5:00

it's coming from us, that we did

5:02

something, human society.

5:05

Well, if you think about where human

5:06

society's going,

5:08

>> wouldn't that be a way to turn us into

5:12

something new, right? If we're going to

5:14

merge with machines, what better way

5:16

than to like eliminate empathy,

5:18

eliminate emotions, make us like able to

5:22

like stay at home and stare at a screen

5:24

for hours at a time with no concern

5:26

whatsoever. Just the kind of social

5:29

detachment

5:30

along with the integration of all this

5:33

crazy new technology. And the people, a

5:35

lot of the people that are in the tech

5:37

business that are high levels are on the

5:39

spectrum.

5:40

>> Oh, dude. Yeah. They're on the [ __ ]

5:42

diving board of the spectrum.

5:43

>> They're the ones bringing in AI. They're

5:46

bringing in the next version of life

5:49

>> kind of. I mean like we're thinking it's

5:51

like a mistake,

5:53

>> but it might not be. It might be like a

5:56

crucial part of the system when you get

5:58

further and further integrated with

5:59

technology and all the stuff that you

6:02

need to make it and all the stuff that's

6:04

involved in capitalism including like

6:06

lying about what medications kids

6:08

need and giving them this and giving

6:10

them that lying about what kind of what

6:12

the pesticides do or the chemicals do or

6:14

you know whatever it is. What what is

6:16

that ultimately doing if it's if it's

6:18

leading people to be on the spectrum

6:20

more and more often? What if one day

6:23

it's not one in 12 in California? If

6:25

it's 100%. Yeah.

6:26

>> Everybody, you got a full spectrum

6:28

society and there's no regular people

6:30

left.

6:31

>> Yeah.

6:32

>> You have to think about that. If it's

6:33

one in 12 boys right now in California

6:35

and it used to be like 1 in 10,000, this

6:38

is like an invasion, right? Like an

6:42

invasion of like a a way that people

6:45

think that's entering into human

6:48

civilization.

6:49

>> Yeah. And I feel like it was I I agree.

6:51

Right. So then it's like those and and

6:54

then if you don't have this like

6:56

uprising this emotional uprising out of

6:58

people of like you know like this is

7:01

wrong because I think like um you know

7:03

when you get real databased and like

7:05

that kind of like tismesque type of

7:08

energy I think you're not you know

7:11

you're not thinking about some of the uh

7:13

like how it affects you as much or maybe

7:15

just you're able to like roll into that

7:18

nest of like this is this new digital

7:20

landscape And those people fit well in

7:22

it. Does that make any sense?

7:24

>> It does. It does make sense. And they

7:26

do fit well in it. I know a lot of

7:27

people that are on spectrum that are

7:30

very happy just being online all the

7:32

time.

7:32

>> Yeah.

7:32

>> That's what they do.

7:33

>> And dude, maybe that's what's supposed

7:35

Yeah. That's the scary part is like what

7:36

if that's what's supposed to happen,

7:39

right?

7:39

>> And the rest of us are just like cuz I'm

7:41

like I think a romanticist, you know?

7:42

I'm thinking like, oh yeah, a porch and

7:44

a rocking chair. And then they, you

7:46

know, the but you know, other people are

7:48

like, "Yeah, we're coming in robots and

7:50

[ __ ] like that and ordering bagels

7:51

through our [ __ ] brain cells and

7:52

shit." You know, like it's just like um

7:55

we're thinking of autism as a flaw, but

7:57

it might be a feature.

7:59

>> But is it what is it okay? Is it what

8:03

nature wants or is it something that

8:04

we're creating that is heading us down a

8:07

very dark path? I feel like it's not

8:10

autism, but all of it in conjunction is

8:12

the second one.

8:12

>> It might be what the universe wants.

8:15

[ __ ]

8:16

>> It might be how it goes. Like there has

8:19

to be some sort of a pathway from

8:23

territorial primates to something new,

8:27

right?

8:28

>> Does there?

8:29

>> I think so. Yeah. Because otherwise we

8:31

would still be single-celled

8:32

organisms. Everything's moving in a

8:33

general direction of more complexity.

8:36

>> Okay. Right.

8:37

>> That's fair. So if it's moving in a

8:38

general direction of more complexity,

8:40

we're and with all the technology that

8:42

we're making, like we're moving into

8:43

some [ __ ] weird place, right?

8:45

Wouldn't it be better if you just like

8:47

easily accepted that and what better way

8:49

than if you're one of those dudes that's

8:51

on the spectrum that loves to chill at

8:53

home and play video games, just stare at

8:55

a screen, doesn't really need a lot of

8:57

human contact.

8:57

>> Yeah. One of those [ __ ] data wiggers

8:59

or whatever they call them. You know,

9:01

those [ __ ] tech monkeys or whatever.

9:02

>> Yeah. Those guys are just all about it.

9:05

They're all about it. They're wearing

9:06

Apple watches and the [ __ ] everything

9:09

is whoop neck. Yep. Yep. Yep.

9:11

>> They had to do with a [ __ ] whoop neck

9:12

brace. It was like it kept updating on

9:14

his watch. It's like your neck's still

9:15

broken or whatever.

9:16

>> He's got a whoop [ __ ] ring.

9:18

>> It's crazy. You're [ __ ]

9:19

>> [ __ ] robot,

9:20

>> bro. You blink twice and you're [ __ ]

9:22

Yeah. It like it shoots GLP-1s into your

9:25

nuts. It's just like it's all too But

9:27

it's it it's happened too fast, bro. And

9:30

it's too much and it starts to be like

9:32

is it being controlled? Um,

9:36

dude, here's something that I always

9:37

>> being controlled by a small amount of

9:38

people, which is always scary.

9:40

>> That's scary.

9:41

>> It's always scary when a small amount of

9:44

individuals have insane amounts of power

9:46

and wealth. And that's what's going to

9:48

happen with this AI thing. And that's

9:50

what's what happened with tech. Like

9:51

look what happened with tech. With tech,

9:55

the vast majority of the people that are

9:57

involved were all like heavily

9:59

left-wing, very progressive, like kind

10:01

of even far-left in a way. And look,

10:04

they pushed the entire country's

10:06

narrative in that direction through

10:08

censorship on social media, through

10:10

banning any accounts that didn't con,

10:13

you know, didn't kind of commit to the

10:14

narrative.

10:15

>> Russia gate.

10:16

>> Yeah. Russiagate. Anything about Hunter

10:18

Biden's laptop, anything about vaccines

10:20

being deadly or, you know, maybe it came

10:22

from a lab, all that stuff would get you

10:24

kicked off. So, it was all moving in

10:27

this one ideological direction. That's

10:30

the like literally the conversation the

10:33

entire country is having and there's no

10:35

other output. Before, you know, there

10:37

was a few of them that came around like

10:39

Gab and some other ones, some social

10:41

media sites that were like a response to

10:43

that, but they never really took off,

10:46

right?

10:46

>> Nothing took off yet.

10:47

>> Not not really. You know, there's some

10:49

people that are on threads and there's

10:51

some people on truth, but the reality is

10:54

if you're not on Twitter, you're you're

10:56

not really going to connect with most

10:58

people. That's the giant majority of

11:01

people having conversations and it was

11:02

all completely controlled by a small

11:04

group of people with one ideology

11:06

>> but then didn't all those ha or half of

11:08

those people move over to the other

11:10

political party when uh Trump got

11:12

elected like you know what I'm saying

11:13

like Zuckerberg was on the left side

11:16

Bezos seemed like a very left-leaning

11:18

guy and then they're all just so that's

11:21

what made me start to think oh none of

11:23

these guys are really on a side there's

11:24

this other third side that a lot of us

11:26

can't see that uh is just kind of

11:30

commandeering or fabricating or like

11:33

infiltrating both sides.

11:35

>> If I could hum a song right now, I'd hum

11:37

the Pink Floyd song Money.

11:39

>> Yeah.

11:39

>> You know what I'm saying?

11:40

>> Yeah.

11:41

>> Cuz that's what they were protecting

11:42

that chatter. That cash, baby.

11:44

>> Protecting that cash.

11:46

>> I mean, look how many people are [ __ ]

11:47

moving out of all these states that are

11:49

trying to impose wealth taxes. They're

11:51

trying to steal money from the people

11:52

that are the most successful. I was

11:54

reading something about Massachusetts

11:56

and how much this lady was reporting

11:58

about how much Massachusetts has um lost

12:01

from that because people leave the

12:02

state. Their businesses leave the state.

12:04

New York is having the same problem.

12:06

That Kathy Hochul,

12:08

>> you know, now she's asking people I

12:09

don't know how to say her name. I don't

12:11

care.

12:11

>> Yeah. Get a better name.

12:13

>> Who cares what her name is?

12:15

>> Yeah.

12:15

>> She's asking people to go to Palm Beach

12:18

and tell people to come back to New York

12:19

now because we're losing tax base. Like

12:21

come on. Of course, you're losing tax

12:24

base. You can't just arbitrarily decide

12:26

that because someone makes more money,

12:28

they deserve to give you more money. And

12:31

then what have you done with the money

12:33

you have?

12:34

>> That's the best part.

12:35

>> Oh, a [ __ ] ton of waste and fraud. And

12:38

have you corrected any of that? No. So,

12:40

your solution is what? More money.

12:42

>> Yeah.

12:43

>> Okay. [ __ ] all the way off. Of course,

12:45

these people are going to leave. You're

12:46

a bunch of incompetent stooges and

12:49

you're in charge of all the money in the

12:51

state. And that's dumb. Yeah. And that's

12:54

why Chevron moved out of California and

12:57

Tesla moved out of California and

12:59

In-N-Out Burger moved out of California.

13:01

>> We moved out.

13:02

>> We moved out of California.

13:03

>> Companies, but we're people.

13:04

>> We might as well be companies. We're

13:06

small companies.

13:08

>> But it's like you can't just say we're

13:11

going to take more money and that'll fix

13:13

it.

13:13

>> But you don't think billionaire Yeah.

13:15

Especially, dude, what has happened like

13:18

with this follow-up to the Somali fraud,

13:20

like all of these fraud buildings where

13:21

it's like blatant, there's no

13:23

businesses. It's just a sign on the door

13:25

and like it feels like there's no

13:27

followup to it.

13:28

>> There's some there are some there's some

13:30

people that are being prosecuted right

13:31

now. There's uh a bunch of

13:33

investigations uh regarding um

13:35

Minnesota, regarding California. They're

13:38

getting in there. They have to get in

13:39

there now because it's been exposed

13:41

nationally. But the real question is,

13:42

how did it go on for so long? How did

13:45

you allow it to happen for so long?

13:47

>> They knew.

13:48

>> Bro, you want you want to know like

13:50

what's real bad? What's real bad is like

13:54

the amount of money that California has

13:56

wasted if their solution is to try to

13:58

tax people. Have you ever seen like what

13:59

they did with the high-speed rail?

14:01

>> Yeah. Nothing.

14:02

>> They spent billions of dollars. There's

14:05

some guy who broke down how much

14:08

high-speed rail China did in the

14:10

same time that it took California to do

14:13

their high-speed rail. It's actually

14:15

funny.

14:16

>> I've done some I've done some rail out

14:18

of China. I'll tell you that.

14:20

>> I don't think it's the same stuff. I

14:22

think we're talking about different

14:22

things.

14:23

>> But uh No, China, dude. They're doing

14:26

Dude, it's weird when you start

14:27

thinking, "Hey, China looks like a good

14:28

place to live." You know,

14:29

>> they've got their [ __ ] together. I'll

14:31

tell you that. A lot of these places

14:32

with kings, they really know how to run

14:34

things. They do a real solid job.

14:37

>> They [ __ ] do. Poland's got their [ __ ]

14:40

together, dude.

14:40

>> They do have their [ __ ] together. And

14:42

they were communists not that long ago.

14:43

You know,

14:44

>> Poland's got their [ __ ] They don't let

14:45

any of this influence. Spain, I feel

14:48

like, is taking their [ __ ] back.

14:50

>> True.

14:50

>> They're picking up their toys.

14:51

>> I got to find this. Uh here it is, cuz

14:54

this is this is actually funny. when you

14:56

see like the comparison between like

14:59

what China's done and what we've done in

15:01

the same amount of time.

15:02

>> Yeah, it's it's actually kind of funny.

15:04

>> Oh, I want to say thanks too to this

15:05

lady Sarah White check I just gave. She

15:07

just came and helped me get blood a

15:08

little while ago and she was she's just

15:11

a nurse practitioner and you could tell

15:13

she was just working hard.

15:14

>> She hooked you up.

15:15

>> Yeah. She was just like, you know,

15:17

showed up and just like just got it done

15:19

for me. You could tell she just like is

15:20

a hardworking lady. I admire hardworking

15:22

women.

15:24

>> Oh, what about hardworking men? You like

15:25

them? Yeah, that's it, Jamie.

15:26

>> Well, they should be.

15:27

>> So, look at this. Things that happen

15:29

faster.

15:30

>> Who's this [ __ ] Gooner, though? Who's

15:31

that dude? Is that Nelk?

15:35

>> [ __ ] He fell off.

15:36

>> It happened faster than building the

15:38

California high-speed rail. China's

15:40

entire high-speed rail network of 30,000

15:43

miles. Our LA segment would have taken them

15:46

2 months. Dubai going from barren desert

15:49

wasteland to barren culture wasteland.

15:52

Timothée Chalamet's entire existence,

15:55

iPhone 1 through 17, and the internet.

15:59

Follow for more [ __ ]

16:02

>> I love that guy. That's Harrison Balm.

16:05

Good for him.

16:06

>> That's crazy, isn't it?

16:07

>> It's crazy. And they just took billions

16:09

of dollars in taxes and they Oh, we're

16:12

working on it.

16:13

>> Yeah, but where here's everything is

16:16

fraud. You're starting to realize it's

16:17

all fraud. Well, if it's not fraud, it's

16:19

waste and it's bureaucracy. So, they

16:22

keep the money coming in. So, they keep

16:24

people working, but the people don't do

16:25

anything.

16:26

>> And the dude, and we can't even [ __ ]

16:27

keep the TSA workers, dude. I [ __ ]

16:29

snuck a half a handful of goldfish to a

16:31

[ __ ] TSA worker the other day. The

16:33

edible ones, just to [ __ ] keep them

16:35

going, dude, out there.

16:36

>> You gave them some goldfish?

16:37

>> Yeah. They're not even getting paid.

16:38

>> I know. They they just started getting

16:40

back pay,

16:41

>> but still it's just But the fact that

16:42

that's like a

16:43

>> crazy like that they're the least

16:46

priority. Like bro, flying is [ __ ]

16:50

super important. You dummies. You want

16:52

to keep the economy going, you got to

16:53

let people fly around. They got [ __ ] to

16:55

do, man. You can't just [ __ ] not pay

16:58

the TSA people. You [ __ ] idiots. How

17:01

come you get paid?

17:02

>> Yeah.

17:02

>> How come you get paid?

17:05

>> I'm sick of this [ __ ] And I'm sick of

17:07

rich people not putting their [ __ ]

17:08

kids over in these wars and [ __ ] like

17:10

that. Put your [ __ ] honky ass kids up

17:12

there. Let them go shed some [ __ ]

17:14

blood.

17:15

>> Especially if you're asking for it.

17:16

>> Especially if you're out there [ __ ]

17:17

bullshitting, dude. Put your [ __ ]

17:19

honky little fancy ass [ __ ] kid up

17:22

there, man. That [ __ ] makes me mad, bro.

17:24

>> Well, I think there's also a problem.

17:27

The people that I've talked to that have

17:29

served overseas and have been involved

17:31

and deployed in military operations and

17:33

seen a lot of [ __ ] There are a lot of

17:35

them are of the opinion that you

17:37

shouldn't be able to make those

17:39

decisions if you never been to war. If

17:41

you don't know what it is, you don't

17:43

know what you're sending people to do.

17:45

It doesn't mean you're not still going

17:46

to be a tyrant because there are some

17:48

people like clearly Netanyahu's been to

17:50

war. You know, he's been he was in the

17:52

military. He was involved in some [ __ ]

17:55

and Yeah. and he he was like a special

17:56

forces operator in Israel and clearly he

18:00

doesn't mind going to war. But this

18:03

episode is brought to you by Manscaped.

18:05

Did you know that one man is diagnosed

18:07

with testicular cancer every hour? In

18:10

fact, it's the most common form of

18:12

cancer among men ages 15 to 35. April is

18:17

National Testicular Cancer Awareness

18:20

Month and Manscaped is donating $50,000

18:23

to the Testicular Cancer Society to

18:26

support awareness and routine

18:28

self-checks. I'm proud to support something

18:30

that helps make a real difference. You

18:32

can also support the cause by purchasing

18:34

a special edition TCS Ball Hero Bundle.

18:38

This bundle includes the Lawnmower 5.0

18:42

Ultra TCS special edition and special

18:46

edition TCS Boxers 2.0. Join the over 13

18:52

million men worldwide who trust

18:54

Manscaped and use the code ROGAN15 for

18:58

15% off your entire order at

19:00

manscaped.com.

19:02

You can also visit manscaped.com/tcs

19:06

to learn more about how to check

19:07

yourself or make a donation at

19:10

TCSociety

19:12

today to save lives and balls.

19:15

>> I think most war is a unique term.

19:18

>> War is a [ __ ] terrifying.

19:19

>> But I mean he I don't think I wouldn't

19:21

call what he does war, but I

19:22

>> you mean what they're doing right now

19:24

with Gaza?

19:25

>> Yeah. And I'm I'm not Iran is war.

19:29

>> That's war. Iran's a real enemy. You

19:32

know, it's a different

19:33

>> Are they an enemy to America?

19:35

>> Well, what they are is the largest

19:38

country in in terms of like state

19:40

sponsored terrorism. They're the largest

19:41

sponsor of terrorism, but but also you

19:44

got to think why, you know, and this is

19:48

not excusing anybody for is Islamist uh

19:52

ideology because it's scary because they

19:54

want a a global caliphate, right?

19:56

They're radicals. But you got to go back

19:58

to what happened in that country. And if

20:00

you go back to what happened in that

20:01

country, they tried to nationalize oil.

20:04

Iran was like a westernized country.

20:07

Girls were wearing miniskirts.

20:08

Everybody's hot.

20:09

>> You seen that video from the 70s of

20:11

Iran?

20:11

>> Yeah, bro. Everybody's popping. It's

20:13

popping.

20:14

>> What happened is slowly but surely um

20:17

and quickly at first because when they

20:19

tried to nationalize oil,

20:20

>> Yeah. um the CIA swooped in and they

20:24

[ __ ] got that guy out of office and

20:27

they allowed these Islamic, you know,

20:30

radicalists to start running the

20:32

country.

20:32

>> Well, that's when Hezbollah started, right?

20:34

>> I don't know exactly when Hezbollah

20:36

started, but the point is the country

20:38

was doing fine before we monkeyed with

20:41

it. And we monkeyed with it because they

20:42

were not getting enough of the money

20:44

from the oil. So it was the British

20:46

Petroleum Company, I think, put it into

20:48

Perplexity the story of Iran

20:53

uh their government being overthrown. I

20:56

think it was in the 1950s.

20:58

So when you you see like how it all

21:01

played out and why it is what it is

21:03

today, Jesus Christ, you'd be mad too.

21:06

>> Yeah.

21:06

>> And when you're mad and you're

21:07

surrounded by bigger enemies that all

21:09

have nuclear weapons, you don't even

21:11

have nuclear weapons. Wouldn't you be

21:13

trying to make them? You know what I'm

21:14

saying? Like I'm not saying Iran should

21:16

have nuclear weapons. I don't think

21:17

anybody should have nuclear weapons.

21:18

They definitely

21:18

>> Israel gets to have them.

21:20

>> Allegedly. This is the problem.

21:22

Allegedly.

21:22

>> Everything's allegedly with them.

21:24

>> Allegedly,

21:24

>> except for the genocide.

21:25

>> You know, they're not they don't

21:26

officially have them,

21:29

>> right?

21:30

>> I don't think they admit they officially

21:31

have them. And you know who is a big

21:33

opponent of Israel getting nuclear

21:34

weapons?

21:35

>> JFK.

21:36

>> JFK.

21:37

>> Yeah. That's what a lot of people think

21:40

led to back into the left.

21:42

>> Oh, yeah. Before they killed him. Who? I

21:44

don't know.

21:44

>> So the Iranian revolution,

21:46

>> also called the Islamic Revolution, was

21:48

a mass uprising in Iran, overthrew the

21:49

Shah's monarchy in 1979, replaced it with

21:52

an Islamic republic led by Ayatollah

21:55

Ruhollah Khomeini. Um,

21:59

>> Shaharo they called.

22:00

>> I want you to go back to the national

22:02

ask it a question of what was the events

22:04

that led to

22:07

um

22:08

them trying to nationalize their oil?

22:12

Here it is.

22:13

Uh uh uh uh no that's not it. The so the

22:19

what what was uh banned real opposition

22:23

used secret police to surveil, jail. I just

22:25

put into ask another question. What were

22:28

the events that took place after Iran

22:31

tried to nationalize oil? Just ask that

22:34

question.

22:35

What are the events that took place

22:39

when Iran tried to nationalize oil? Bro,

22:42

[ __ ] oil. I'd rather walk if this is the

22:44

[ __ ] that's going to come out of all of

22:45

it. You feel me?

22:46

>> The problem is it's not just oil for

22:49

your car. It's everything you use.

22:51

Plastic is these petroleum-based

22:54

chemicals are responsible for medicine.

22:56

>> But it's also getting in our nuts now

22:58

and people can't even [ __ ] read

22:59

anymore because of it. So it's like,

23:01

what is all that stuff helping us

23:02

anymore?

23:03

>> Yeah, here it is. So Iran's attempt to

23:05

nationalize its oil in the 1950s

23:07

unfolded as a chain of political,

23:08

economic, and international

23:10

confrontations centered on Prime

23:12

Minister Mohammad Mosaddegh. How do you

23:14

say his name? Mogade. Mosed Mosed.

23:17

>> Let me see. Mosad.

23:19

>> Mosad. Uh, and British controlled Angro

23:23

uh, Anglo-Iranian

23:24

oil company. I'll walk you through the

23:26

key events, but it had to do with who

23:28

was in control of the oil before that.

23:32

Like, who was making the money before

23:33

that? Perplexity is going to give us the

23:35

[ __ ] tinfoil hat story of how it went

23:38

down.

23:39

>> But the bottom line is

23:42

>> people are making a lot of money over

23:44

there in oil and they wanted most of the

23:46

money and they got boxed out and then

23:49

they wound up with a [ __ ] psychotic

23:50

dictator.

23:51

>> Yeah. And a lot of the I mean, if you

23:54

look back on what Iran looked like when

23:57

it was a westernized country, like damn,

24:00

we should have [ __ ] supported

24:01

whatever the [ __ ] was going on back

24:03

then.

24:03

>> I know. I think Do you feel like we used

24:05

to do things that were better and then

24:06

we got uh

24:09

>> here's the tin foil hat version. I love

24:11

how Perplexity gives you a tin foil hat

24:13

version.

24:14

>> Ask and you shall receive.

24:15

>> Nice. Uh the story is basically Iran

24:17

tried to take back its oil. While the

24:18

British and Americans teamed up in

24:20

secret to crush that idea and send a

24:22

warning to the rest of the world,

24:24

Britain had built its empire and navy on

24:26

cheap Iranian oil via the Anglo-Iranian

24:29

oil company, later British Petroleum

24:31

Company. So when Mogad Mosed,

24:35

I don't know how to say his name. I keep

24:36

[ __ ] it up. Uh London saw it as a

24:39

direct threat to its global power and

24:41

profits. Elites feared that if Iran got

24:44

away with nationalizing its oil, other

24:45

countries in the Middle East and uh

24:48

beyond would copy,

24:50

destroying Western oil monopolies. So

24:52

they were determined to make Iran an

24:53

example.

24:54

>> Like bro, we've been monkeying around

24:56

with other countries forever. This thing

24:58

in Venezuela, this real quick thing that

25:00

happened real quick when they kidnapped

25:02

a dude in Venezuela. like

25:04

>> well a lot of us say it's because the

25:06

these are the countries that are still

25:07

outside of the um Rothschild's banking

25:09

system or whatever. Have you seen that

25:10

thing?

25:11

>> I have not.

25:12

>> Where it's like there's a there's a the

25:13

countries that are still not on that

25:15

list or something. This is tinfoil stuff

25:17

I think

25:19

>> or it's absolutely true. I have no idea.

25:21

>> There's a lot going on right now, right?

25:22

Like

25:22

>> I'm in fear, dude. I'm scared. I'll be

25:24

honest with you.

25:25

>> Yeah, you should be.

25:26

>> I'm scared.

25:27

>> Well, it's a scary time because

25:29

this is a real scary world.

25:31

>> People come up and people tell me about

25:33

it. I was in an Uber yesterday and

25:35

there's a man in there. He was driving

25:38

and um

25:40

he's like, "We need a revolution." You

25:42

know?

25:42

>> Oh boy.

25:43

>> He's like, "You have a voice." He's

25:44

telling me stuff like that. And I was

25:45

like, "Don't take Ubers anymore. Stop

25:47

taking Ubers. Rent a car, motherfucker."

25:49

>> I'm not renting a car.

25:52

>> Why would you rent a car? You don't rent

25:53

cars,

25:53

>> bro. You think I'm going to go be at the

25:56

Renting a car is insane. You have to

25:58

check under it, see if there's any dents in

26:00

it or if there's any, like, and then um

26:02

you have to do all this stuff.

26:04

>> Renting a car is a nightmare. Dude, I

26:06

will tell you this story though. One

26:07

time we rented we did rent a car and we

26:10

got a dent on it, like a pretty good ding,

26:12

and we [ __ ] we caught a

26:14

pigeon and had it [ __ ] over the dent to

26:16

fill it in whenever we turned it in.

26:18

>> No, you didn't. This is not a true

26:19

story.

26:20

>> Yeah, we did.

26:20

>> You caught a pigeon?

26:22

>> Yeah. You think it's hard to catch a

26:23

pigeon?

26:23

>> I do. Bro, bring up a pigeon getting

26:26

caught.

26:27

>> Mike Tyson had a lot of them, bro.

26:29

>> Yeah, when he raised them.

26:30

>> Yeah, but dude, he had [ __ ] autism in

26:32

his hands after a couple years.

26:34

>> You don't think You think it's hard to

26:36

catch a pigeon? The dumbest bird ever,

26:38

dude.

26:39

>> And you just put the [ __ ] over the dent?

26:40

>> Yeah, you hang it over.

26:41

>> Hell of a [ __ ] How big was this dent

26:43

we're talking about?

26:44

>> Dude, these [ __ ] pigeons [ __ ] all

26:45

day, Joe.

26:46

>> So, you just hold them there until

26:47

they're done?

26:47

>> Yeah. What are you, some kind of [ __ ]

26:48

cop or whatever? Yeah, we [ __ ] put

26:51

him over the dent, bro. That's why God

26:54

wants you to help. That's my insurance.

26:56

>> Oh, okay. This one's all [ __ ] up,

26:57

though. That's not fair.

26:58

>> That's cuz he has American healthcare.

27:00

It's United Healthcare.

27:03

>> Dude, well, here's what I want to know.

27:04

Like, I'm I guess Yeah. Like, yeah. I

27:06

don't know, man. Everybody just feels

27:08

scared and it makes

27:08

>> Well, they should because a lot of

27:10

things are getting exposed right now.

27:12

You know, there's a lot of fraud and

27:15

you're seeing at the highest levels of

27:16

government and people are also scared

27:18

because no one's getting in trouble for

27:20

things like no one's getting in trouble

27:21

for the Epstein files. No one's getting

27:23

in trouble for

27:24

>> Yeah. that's almost disappeared kind of.

27:26

>> Well, that's part of what happens when

27:28

there's some sort of a big social thing.

27:30

One thing that's in the past that

27:34

leaders have used to cover up problems

27:36

at home is a [ __ ] war. I'm not saying

27:38

that that's why they bombed Iran, but

27:41

that would be a way to do it. If you're

27:43

that psychotic, you know, and if you

27:46

were thinking about doing it anyway, you

27:47

might be able to justify it. People have

27:50

always done that also to stay in power.

27:52

>> Oh, yeah. And even Bill Clinton said

27:55

that about Netanyahu. Bill Clinton said

27:57

Netanyahu wants war so he could stay in

27:59

power.

28:00

>> For sure, dude. People call him the

28:01

Yarmulke Hitler. That's what they call him.

28:03

>> Who are these people?

28:04

>> Everybody does.

28:04

>> Which people?

28:05

>> Countless people.

28:06

>> Huh?

28:06

>> What are you saying? What do you mean

28:08

people? What are you talking about?

28:09

>> What are you talking about?

28:10

>> Black folks. What are you saying?

28:11

>> I don't know what you're saying.

28:12

>> I don't know what you're saying.

28:13

>> I don't know what you're saying.

28:14

>> Okay.

28:14

>> Let's just not draw conclusions.

28:16

>> Okay. Yeah.

28:17

>> Okay.

28:17

>> He seems like a great guy.

28:19

>> Um, really?

28:20

>> No.

28:21

>> Well, this is a scary time.

28:22

>> It's a scary time because people are

28:24

willing to blow people up with [ __ ]

28:25

drones and missiles and they're shooting

28:28

into apartment buildings and blowing up

28:30

schools and it's like, [ __ ] man.

28:33

>> And we didn't I think that we've been

28:35

poisoned. I do think that we've been

28:37

poisoned

28:38

because I think that like we find out

28:40

that our food is a lot of our food is

28:42

poisonous, right? Or

28:43

>> a lot of our food is not good for us.

28:45

>> Yeah. Sorry. Not good for us.

28:47

>> So, we have a health care we have food

28:48

that is made to be not good for us.

28:51

>> And then we have a health care system

28:52

that'll just kind of take care of you,

28:54

right? Barely. So then you start to

28:57

create this other like you're going to

28:58

need your autism gang that are up

29:00

there running [ __ ] but then you're going

29:02

to need this sort of like mollisky sort

29:05

of like the worker bees and that's what

29:07

the rest of us start to become as worker

29:09

bees because you know you're on

29:11

antidepressants killed like the

29:13

vibe and the energy of so many people,

29:15

right? like the opioid epidemic like you

29:17

you broke apart so many families and

29:19

ruined hope in so many like kids and

29:21

parents and homes and like um the the

29:23

the uh the COVID where you shut down

29:27

recovery rooms and places where people

29:28

were meeting and so they were so

29:30

disconnected and then it's like you just

29:32

you start to wonder why there's no

29:34

uprisings because there's no there's

29:36

nothing rising up inside of you anymore

29:37

because a lot of your vitriol has

29:40

been killed. People are jerking off into

29:42

[ __ ] robots and even just on car

29:44

batteries and [ __ ] in some of those

29:45

videos online. But

29:46

>> car batteries

29:47

>> people will come on everything.

29:48

>> What happens when you hit the the two

29:49

posts?

29:50

>> I don't know, dude.

29:53

>> Does your jizz explode?

29:55

>> Probably got to be grounded. I would

29:56

have to guess.

29:57

>> Right.

29:58

>> That's a real [ __ ]

29:59

>> You don't want that jolt coming back to

30:00

the tip.

30:01

>> You imagine if it was like one solid

30:03

stream and the electricity jumped.

30:06

>> Yeah.

30:07

>> Made it back to the tip. But dude, that

30:08

could happen too with that robot if you

30:10

trying to hump that robot and that thing

30:12

shorts out.

30:13

>> Phones short out. You remember those uh

30:15

those old phones that would blow up in

30:17

people's cars? Like the Note, it was one

30:19

of the Note series. Like people's cars

30:22

would catch on fire if you left it

30:23

plugged in.

30:24

>> Yeah.

30:24

>> Yeah. What if that happens to your dick?

30:26

>> And people would always,

30:27

>> you know what I'm saying?

30:29

>> And people would just always leave it

30:30

plugged in next to their wife at night

30:32

for [ __ ] no reason,

30:33

>> right?

30:33

>> On top of your wife.

30:35

>> Just balanced there. Bursts into flames and

30:37

lights her on fire.

30:38

>> But successor,

30:40

>> I I think we've been poisoned just

30:42

enough to like it feels like just to

30:44

hurt, but not like we just have to start

30:47

I think it's a time where like we have

30:49

to try and work on our and like look

30:51

inside of ourselves and I don't know. Do

30:54

I sound [ __ ] preachy? I'm sorry.

30:55

>> No, you don't sound preachy, but I

30:58

think you're on to something. There was

31:00

some file. I didn't read it, but a bunch

31:02

of people sent it to me. I just went,

31:03

"Oh, Jesus." It was from the some

31:07

Freedom of Information Act or some leak

31:10

from the 1950s

31:12

with the CIA and they were trying to

31:16

think of different ways to make people

31:18

docile and stupid and unmotivated

31:21

>> and they were talking about different

31:23

medications, putting stuff in food, all

31:26

these different strategies to keep

31:28

people stupid.

31:29

>> Yeah. And they

31:30

>> This is our own government. Us, the United

31:34

States of America.

31:35

>> Well, that's another thing. Is that not

31:37

treasonous?

31:38

>> I agree. So why And yeah, it just feels

31:40

like there's no recourse and I know like

31:43

you start to think, well, this is how a

31:44

lot of people have lived their entire

31:46

centuries in different countries and

31:47

stuff like that. Like they live under

31:49

this type of oppression and like fear

31:51

all the time, but it feels new here.

31:54

>> I want to know what exactly. Could you

31:55

put that into our lovely sponsor

31:56

perplexity and find out what the [ __ ]

31:58

was said in that CIA document? What what

32:02

were they actually planning? Because

32:04

it's the the idea that there's people in

32:06

government that would just say, "Fuck

32:10

millions of people and their potential

32:12

in life. Let's tank their potential so

32:15

we can get our agenda through easier

32:17

without them being upset. Let's ruin

32:21

millions of people's lives, or at least

32:24

dampen their dreams, I don't know, squash

32:26

their hopes, make them stupid and lazy,

32:29

>> make their kids sick, make their Yeah.

32:31

Make it, put pornography in and let

32:33

it be in the home so that um it's

32:36

accessible everywhere. So marriages get

32:37

ruined and relationships get ruined and

32:39

guys are just spunking out on wherever

32:41

and so they don't so there's no energy.

32:43

There's no like there's no [ __ ]

32:45

desire inside of people to overcome. And

32:47

it's like yeah, we have to just try and

32:49

do better one day at a time. for men

32:51

like their ambition in life is often

32:54

connected to wanting girls to like them.

32:56

Yeah.

32:57

>> Or guys to like them, whatever it is.

32:59

>> And purpose creating.

33:00

>> But that's the other part. Purpose and

33:02

creating is like the ultimate. That's

33:04

like the the ultimate is it's almost

33:07

like you're doing a service. Like

33:09

whatever you're doing, if you're doing

33:11

it your best, your real reward is that

33:14

people enjoy it. Whatever it is, whether

33:16

you're a carpenter or a musician or

33:18

whatever it is, if you're doing

33:20

something at your best, the ultimate

33:23

reward is people enjoying it.

33:24

>> Yeah,

33:25

>> that's the ultimate enjoy. But you you

33:27

have to figure that out in life.

33:29

>> You're probably thinking of declassified

33:30

CIA mind control and behavior

33:32

modification experiments like Bluebird,

33:34

Artichoke. Artichoke, is it? That's it.

33:36

Especially MK Ultra, which did run in

33:38

the 1950s and '60s. Okay. Bluebird, MK

33:42

Ultra. What is make people stupid in

33:44

cognition? Uh CIA uh efforts to use

33:48

drugs, hypnosis, and other techniques.

33:50

No, that's the interrogation. That's

33:51

different. Uh uh uh uh uh

33:56

>> this thing is interesting.

33:57

>> "Perfect Concussion," an effort often

34:00

referenced alongside MK Ultra, explicitly

34:02

explored using sub-aural blasts to erase

34:05

memory. Whoa. Erasing or degrading

34:08

memory is practically a way of disabling

34:10

a person cognitively. Even if that is

34:13

not described as making them stupid in

34:15

official language.

34:16

>> Well, yeah, we're just It feels like

34:18

we're just stuck in an experiment.

34:20

>> I feel like this is not it.

34:21

>> This isn't it.

34:22

>> No. Um, this was So, why don't you run a

34:25

search for recently disclosed CIA files

34:31

to make people I mean,

34:33

>> I had docile in first and it didn't give

34:35

me anything better.

34:37

>> Um,

34:38

>> I tried looking on Twitter.

34:39

>> Well, okay. Put in "using vaccines

34:43

to make people stupid"

34:46

>> or suggesting vaccines make people

34:49

stupid.

34:50

>> Uh, I hesitate now.

34:53

>> Why?

34:53

>> Cuz it's taking me to somewhere talking

34:55

about this on Facebook.

34:56

>> Perfect. I love Facebook,

35:00

but the conspiracy theorists are looking

35:02

pretty sane right now. Okay, this is

35:04

Evie Magazine.

35:07

Okay. What is going on here?

35:10

>> You have to type in your email or they

35:11

won't let you watch it.

35:12

>> But yo, I don't think what you're

35:13

saying, the things you're saying, I

35:15

don't think that uh that doesn't seem

35:17

like an American idea to me.

35:19

>> Well, it's not It's okay, Jamie. Forget

35:20

it.

35:20

>> But it's not an American idea.

35:22

>> If you can find Is it okay?

35:23

>> If you can find it, please do.

35:25

>> Can you do Google, too, or you can't do

35:26

it?

35:27

>> Yeah. Just look everywhere.

35:29

>> Dudes, have you noticed some things are

35:31

harder to find?

35:32

>> Yeah. Well, this is probably going to be

35:34

hard to find because I think this is one

35:35

of those ones that is like it's on X,

35:38

right? And I know like people are going

35:40

over it, but I don't know if it's even

35:41

been verified. This is one of the

35:42

reasons why I wanted to put it through

35:43

perplexity.

35:45

>> Find because there's a lot of stuff you

35:46

read that just complete, especially

35:48

today. April Fools, [ __ ] Today,

35:51

April Fools. Yeah. Don't get tricked.

35:53

>> [ __ ]

35:53

>> Stay off the

35:54

>> I just gave some random lady my blood in

35:55

the parking lot.

35:56

>> Oh, no. She's going to use that for a

35:57

ritual.

35:59

>> Good.

35:59

>> Clone you, son. You can have little baby

36:01

Theos like those little videos that pop

36:02

up of us.

36:04

>> Yeah. Huh, dude. My favorite part of the

36:06

video is at the end when you just kind

36:07

of bounce out of your chair.

36:08

>> We're laughing so hard. Is this it?

36:10

>> This is it.

36:13

>> Okay. What does it say?

36:14

>> Video. Someone's talking about project

36:15

artichoke.

36:18

>> What is product project? What does it

36:20

say? Essentially

36:21

>> the video.

36:22

>> Kim Iversen. She's pretty good.

36:25

>> White Iverson

36:26

>> artichoke. So, this says, "Look, we've

36:28

got this grand idea of how we're

36:30

basically going to drug people and do

36:32

all kinds of weird experiments on them

36:33

to see if we could control their minds.

36:35

These documents don't show that anything

36:37

was actually done. It just shows that

36:38

we've got these really crazy ideas and

36:40

they're extremely unethical, inhumane,

36:42

terrible, terrible ideas." The 1977 leak

36:45

of documents say, "Oh, yeah. Well,

36:47

actually, the government did it. They

36:48

did all of those terrible things they

36:50

said they were doing in that previous

36:52

memo. They did it. And now here's some

36:53

of the archives that we have from when

36:55

they did all of those terrible things.

36:57

So, okay, these documents, special

36:59

research for artichoke, dated April 21st

37:01

of 1952, the memo proposes developing

37:04

long-term covert drugs that could be

37:06

slipped into daily life. Drugs that were

37:09

quote administered over considerable

37:11

period of time, possibly being placed in

37:13

food or water that caused either

37:15

agitation or depression. These should

37:17

include chemicals or drugs that can be

37:19

effectively concealed in common items

37:21

such as food, water, Coca-Cola, beer,

37:24

liquor, cigarettes, etc. And should also

37:26

be capable of use in standard medical

37:28

treatments such as vaccinations and

37:31

shots. We can do all this other

37:33

experimentation which nobody will know

37:34

about. It's sneaky. Sneak it into their

37:36

Coca-Cola. Sneak it into their beer,

37:37

their cigarettes, their vaccines, their

37:39

medications. Let's sneak it all in.

37:43

Oh, those wild conspiracy theorists.

37:45

They strike again. They have no morals.

37:48

They have no ethics. They have no

37:49

humanity. These documents, I mean, these

37:51

people are inhumane. They're sick.

37:53

They're twisted. This is terrible.

37:55

>> Yeah.

37:55

>> Way to go, Kim Iversen. She killed it.

37:58

She She used to be on What show was she

38:01

on? Not Breaking Points. What was the

38:04

show that they did before Breaking

38:06

>> Kim Iversen? It wasn't 227, was it? She

38:08

got booted off because Fauci was coming

38:10

on and she wanted to question Fauci

38:12

about the COVID vaccines

38:15

>> and they kicked her off the show and she

38:16

went independent.

38:17

>> Good.

38:18

>> Which is how it always goes.

38:19

>> Yeah, you can't you can't talk too much

38:21

[ __ ]

38:22

>> Even though it's pretty obvious that

38:24

guy's a criminal.

38:25

>> Pretty [ __ ] obvious that guy's a

38:27

liar. Lied in front of Congress, was

38:30

responsible for gain of function

38:31

research that led to who knows how many

38:33

[ __ ] people dying of a man-made

38:35

disease. Whatever. Whatever. Just don't

38:37

question. You can't work here anymore.

38:39

You're not playing ball. Yeah.

38:41

>> You're not playing along. Like, look.

38:42

But now she can do stuff like this.

38:44

>> Good for her.

38:44

>> That's nuts that your tax dollars pay

38:47

for that. Them figuring out how to make

38:49

people stupid. How do I make Theo

38:51

stupid? Let me slip something into his

38:53

Coca-Cola. Let's figure out if it works.

38:55

Let's experiment on random people and

38:57

see what kind of results we get.

38:58

>> Then here's my question then. Well, did

39:00

you know whenever they uh whenever they

39:02

introduced antidepressants that

39:04

changed like the cognitive um therapy

39:07

side of things, like in therapists' offices,

39:09

it it totally revolutionized like

39:10

industrialized

39:12

uh therapy and it ruined it ruined a lot

39:14

of people, I think. Like um one of my

39:16

goals is to get off of antidepressants

39:18

completely, man. I want to feel how I'm

39:20

supposed to feel so I can have thoughts

39:23

and actions that uh that like make me

39:28

feel connected to the world. That [ __ ]

39:30

makes you feel dead, man.

39:31

>> So why did you take them in the first

39:33

place?

39:34

>> Cuz I was in a bad relationship 20 years

39:37

ago and I was having a tough day at

39:39

school and they [ __ ] gave

39:41

them to me and then I never got off.

39:43

>> Really?

39:43

>> Because when you get off it's that I

39:45

think we talked about this once. It's

39:46

hard.

39:47

>> Yeah. It makes you more depressed and

39:49

more [ __ ] up and you're all imbalanced

39:50

and you you know probably you're

39:52

addicted to them.

39:53

>> Yeah. And so I that's one of my goals is

39:55

and I noticed like um for me I've been

39:57

taking like methylene blue. I've been doing

39:58

some things like and I'm working with a

40:00

doctor to help me but I want to I'm

40:02

going to get there and I'm just going to

40:03

start to take the power back of myself

40:05

more.

40:06

>> Well they say that exercise is like many

40:09

times greater in its effect at

40:13

alleviating depression. Dude, I wake up

40:15

and I do my yoga and I do like a 35

40:18

minute workout. I'll do like six

40:20

exercises, five runs of it in a row.

40:23

That's 30 exercises. Burn through them

40:25

[ __ ] and I'm a and I'm If I do that

40:27

when I get up in the morning, bro, I am

40:30

good.

40:31

>> Yeah,

40:31

>> I'm fine all day. And I'm also I'm more

40:35

positive cuz I've already taken care of

40:37

myself in a way that I feel is

40:39

sufficient enough for me to keep

40:41

operating and moving forward. But yeah,

40:43

I want to get away from the

40:44

>> Well, that's the medicine, man. Which is

40:46

really crazy. That's the medicine. It's

40:48

just hard for people to take because it

40:50

requires effort and it requires

40:52

discipline. You have to do it when you

40:53

don't want to do it. And there's a lot

40:55

of times when you're not going to want

40:56

to do it. A lot of times you're feeling

40:57

kind of [ __ ] tired.

40:59

>> We have to And I think that's what Yeah.

41:00

Maybe we just Yeah. Like I just need to

41:03

I just need to keep going. This is the

41:05

best I've been doing. I think

41:06

>> Why don't you hire a trainer? You got

41:08

some cheddar? I can't.

41:10

>> You got some cash, son. You're making

41:12

that

41:14

>> paper.

41:15

>> Why don't you hire a trainer?

41:16

>> I do. I I

41:17

>> hire a dude that's cool that'll come

41:19

over your [ __ ] house every day.

41:21

>> Touching my body a lot.

41:22

>> He doesn't have to touch your body,

41:23

>> Joe. Some of them do.

41:24

>> Well, you got to get new ones. They're

41:26

doing something wrong.

41:27

>> But some of them,

41:28

>> you got to say no.

41:29

>> I am saying no.

41:30

>> Repeat after me. No.

41:32

>> No.

41:33

>> Don't touch my butt when I'm in a deep

41:35

squat. It doesn't help.

41:36

>> I don't like that song.

41:37

>> I'm going to dick your [ __ ] It's

41:39

going to make you want to explode to the

41:40

top. Ready? Go. He's knuckle deep in

41:43

your bung hole trying to convince you

41:45

that it's so you can get more reps.

41:50

He's [ __ ]

41:53

>> That's got to be, like, a [ __ ] big

41:55

jacked gay trainer that, like, preys on guys

41:58

that are kind of weak with small hips.

42:00

>> He's like, I bet you can't do this with

42:01

your with my [ __ ] and your butt. And

42:03

you're like, that's a crazy who cares.

42:05

>> Why are you suggesting this? Yeah. But

42:08

the crazy part is, dude, I I had a

42:10

trainer one time if you were doing like

42:11

a dumbbell press,

42:12

>> he would kind of squat you from he would

42:14

he would help you from the elbows kind

42:15

of.

42:16

>> Okay, that's fine.

42:16

>> But when I noticed this one time,

42:18

>> he rub his dick onto your butt.

42:21

>> Did he? He's right behind you.

42:22

>> No, he didn't. I don't know.

42:23

>> He did. You're blocking it out.

42:25

>> Maybe that's why you need therapy.

42:27

>> I don't know, bro.

42:31

>> Dude, if some I know all the dicks I've

42:33

ever seen in my life, dude.

42:34

>> All of them?

42:35

>> Yeah. How many have you seen? Jesus

42:37

Christ.

42:37

>> Honestly,

42:38

>> it's not that many.

42:39

>> Live and in person.

42:40

>> Could count them on one hand.

42:42

>> Crazy.

42:43

>> Only seen a handful of dicks and two of

42:44

them are

42:44

>> Ari's.

42:47

Definitely the most recent one.

42:48

>> Ari's pissed in [ __ ] kombucha bottles

42:51

in this room so many times. He is such

42:53

an animal.

42:54

>> He is kombucha. He has kombucha in him.

42:56

It's all kombucha. He doesn't have piss

42:57

anymore. It's fermented. But this guy

42:59

would touch my elbow and he would kind

43:00

of like m He would do a slight like

43:02

massage on them and that's when it kind

43:04

of cooked me up.

43:05

>> You got a vape in here by chance, Joe,

43:07

or

43:07

>> No, we got this.

43:09

>> You want a cigar? We got smelling salts.

43:11

You want a cigar?

43:12

>> No. They make me sick.

43:14

>> They do?

43:14

>> Yeah. It makes me feel sad.

43:16

>> Sad.

43:17

>> Yeah.

43:17

>> No, I gave up on those. Nicotine vapes

43:19

are very addictive.

43:21

>> Yeah, boy.

43:22

>> I know. I'd give it up.

43:23

>> They make you grab for

43:24

them. You want to take a hit off of

43:26

them. Well, even if you I I and I

43:28

decided at one point in time, I'm not

43:30

taking these anymore. I'm stopping with

43:31

these.

43:31

>> Oh, I remember, dude. Remember you and I

43:33

were using them one time? We kept using

43:35

that thing and Yeah. Oh, dude.

43:37

>> There's something in them. It's not just

43:39

the nicotine.

43:39

>> I'll tell you the story

43:40

>> because like these things like Alps, I

43:42

have no problem not taking these. I I

43:45

went on a trip, like a 10-day trip. I

43:47

didn't bring any nicotine pouches. I

43:48

didn't miss it at all. I was fine.

43:51

>> Well, I'll say this,

43:52

>> but not those vapes, dude. Those vapes

43:54

call you.

43:54

>> Yeah, some of that shit's a lot, bro.

43:56

They call you.

43:58

>> But yeah, you got to kind of manage it

43:59

or whatever. But

44:00

>> yeah, you ain't managing [ __ ] son.

44:05

>> You right. You right about that. Okay,

44:09

girl. Okay.

44:11

>> No, I think out of all the things that

44:14

are, you know, not a drug drug, but you

44:17

know, nicotine is kind of a drug, but

44:18

you know, obviously could be totally

44:19

functional on it. That's the one in the

44:21

vapes that's the most addictive.

44:23

>> And and but yeah, you're talking about

44:25

like recreational type, not like

44:27

antid-depressants, things like that.

44:28

>> Yeah, of course. Of course. Not like

44:30

cocaine or, you know, Yeah. But but

44:32

here's the thing about them, man.

44:33

They're only good for one hit. It's the

44:36

first hit of the day. The first hit off

44:38

a vape is [ __ ] wonderful.

44:40

>> You're like, "Ah,

44:42

>> oh yeah, I'll blow that smoke on your

44:44

mother, son."

44:44

>> Nature just shines down upon you. Just

44:47

feel relaxed. But it's only one after

44:50

that you're just chasing that dragon and

44:52

you you keep you're not getting anything

44:54

out of it. Yeah. Every time you're just

44:56

getting like nervous like and you're

44:58

like hitting it again your [ __ ] hands

45:00

are shaking. You're going too far but

45:02

you don't get that one feel. It's the

45:05

same thing with a cigarette.

45:06

>> With a cigarette really what you want is

45:08

the first couple of hits

45:10

>> and you get that lightness of head like

45:12

ah and then put them down.

45:14

>> Yeah.

45:15

>> The problem is you're always chasing

45:16

that dragon and you never get it. That's

45:18

why everybody loves the first cigarette

45:19

of the day. They sit there with that

45:20

first cigarette of the day and a cup of

45:21

coffee and you're like,

45:24

I got ideas.

45:25

>> Yeah.

45:26

>> Like I got [ __ ] ideas.

45:28

>> Write this down. Write this down.

45:29

>> You know, a lot of bands wrote most of

45:31

their music on cigarettes. Like Tony was

45:34

talking about Pink Floyd.

45:35

>> Dude, the Declaration of Independence

45:37

people were probably hitting cigarettes

45:38

back then.

45:38

>> For sure. They were smoking tobacco.

45:40

Yeah. I don't know if they did pipes or

45:42

what have you back then. I wonder when

45:43

the cigarette was invented. Because if

45:45

you think about it, like pipes and

45:47

cigars you don't inhale. You just take

45:49

it in your mouth. But cigarettes you

45:51

like take into your lungs. I wonder when

45:53

the first dude figured that you got to

45:55

like suck it all in

45:57

>> to get a full probably.

45:59

>> Probably he want to suck everything.

46:02

>> Cigarettes, bananas, what have you.

46:05

>> They were smoking cigarettes, or it just says

46:07

"drinking smoke," when Christopher Columbus

46:09

and his crew discovered indigenous

46:11

people in the Caribbean.

46:14

Oh, you mean Christopher Columbus was or

46:15

the indigenous people?

46:16

>> Says they observed indigenous people in

46:18

the Caribbean in quotes drinking smoke.

46:20

>> Oh, yeah.

46:21

>> But this is going back. I don't That's

46:23

>> exactly drinking and smoking. Chris Co

46:25

was off that [ __ ] bro. He was off that

46:27

[ __ ]

46:28

>> Did you ever read the things that

46:30

Christopher Columbus did when they came

46:33

to America?

46:34

>> He was a boss. I heard

46:35

>> he was an evil man. Was he?

46:37

>> Oh my god. They would cut people's arms

46:40

off if they didn't bring them the right

46:41

amount of gold. They were killing

46:43

babies. Like they did some horrific

46:46

[ __ ] man.

46:47

>> Huh?

46:47

>> They did horrific [ __ ] to the people

46:50

that they found cuz they found these

46:51

people had gold and you know they like

46:55

if you think about how crazy it is that

46:58

Mexico speaks Spanish. You know how

47:00

crazy that is?

47:02

>> You know how crazy it is? That's so far

47:04

away from Spain.

47:05

>> Oh, that's a good point. They all speak

47:07

Spanish and they're Catholic. Gee, where

47:09

do you think that happened?

47:10

>> Cortez.

47:12

>> Yeah, cool.

47:12

>> That [ __ ] showed up in the 1500s

47:14

with like 600 dudes and 12 muskets.

47:18

They had like 12 They didn't even have

47:20

musket rifles. They had musket pistols.

47:22

>> He was a boss.

47:23

>> And they took over the whole [ __ ]

47:24

country.

47:26

>> I know, dude.

47:26

>> Kind of crazy. Like, if you think about

47:28

all these years later, they all speak

47:30

Spanish now.

47:31

>> Yeah.

47:32

>> That's nuts. Well, do you think we could

47:34

do something like that now? Like what do

47:36

you think's going to happen to

47:37

>> Iran?

47:38

>> No, just with I mean like I feel like

47:40

the [ __ ] that's happening out there is

47:42

going to come here eventually.

47:43

>> Well, it most certainly will.

47:45

>> Yeah.

47:45

>> You know,

47:45

>> I mean, if uh Homeland Security doesn't

47:48

stop it in its tracks and they're doing

47:51

a great job of preventing a lot of them,

47:54

you know, there's a lot of things that

47:56

they catch that you don't even hear

47:57

about that are like terror cells they

48:00

infiltrate and but they know there's

48:02

people in this country. That was the

48:04

most [ __ ] up thing about people being

48:06

all nonchalant about the border being

48:09

wide open for four years. Yeah.

48:10

>> Because men of military age entered into

48:14

this country from foreign countries and

48:16

we have no idea why. We don't know if

48:18

they're just honest people looking to

48:20

make a better life for them and their

48:21

family, send money back home. That would

48:23

be best case scenario. But that's not

48:26

all of them. So what percentage of them

48:28

are terrorists? What percentage of them?

48:30

There's not It's not zero. It ain't

48:32

zero. Yeah. But what also it's like it's

48:36

all just a cat and mouse game. People

48:37

are like, "We'll elect the Democrats

48:39

next time." It's like, but it's all the

48:41

same [ __ ] has been happening forever.

48:43

They haven't been helping anybody

48:45

forever. They're letting [ __ ]

48:46

politicians slurp on kids. All of our

48:48

[ __ ] money goes to Israel and they're

48:50

using it to [ __ ] genocide people.

48:52

It's like everybody is scared out of

48:53

their wits right now. It's like our

48:55

religious leaders are afraid to speak

48:57

out and it's like the it's a time where

48:59

it's like Satan is amongst us and our

49:01

religious leaders are [ __ ] talking

49:02

about [ __ ] at the pole. It's just

49:04

like what is going I don't know, man.

49:06

>> We got to get you off those

49:06

anti-depressants, son. You're losing

49:08

your [ __ ] marbles.

49:09

>> You think I am?

49:10

>> Come hang out with us. Just chill out.

49:11

>> I'm here.

49:12

>> Just chill out at the mothership

49:13

tonight.

49:13

>> I do have to pee in a little while. But

49:15

>> you can pee.

49:15

>> I'm going to pee in a minute, man.

49:16

>> We'll let you.

49:17

>> But no, people are just scared, dude.

49:19

This is [ __ ] that I can

49:20

>> They won't let you pee until you give

49:22

them your guns.

49:23

>> Really? That's how they're doing it now.

49:25

>> But what if you have to wash the black

49:26

face off the president? Can you [ __ ]

49:28

use a little bit of

49:28

>> piss?

49:31

>> they had this big gun thing this law

49:33

they passed where they made a bunch of

49:34

guns illegal and they found that only I

49:37

think it's less it's a very small

49:39

percentage of people I think it's

49:41

somewhere in the neighborhood. Find out what

49:43

percentage of people have complied but I

49:44

think

49:45

>> any guns.

49:46

>> Oh they do. Yeah they do. They did. They

49:48

used to. Well, a lot of hunters up there

49:50

for sure. But there's a lot of

49:52

recreational guns and handguns and

49:54

self-defense weapons that people had

49:56

that they recently made during uh

49:58

Castro's kid when he was running the

49:59

country when they uh recently made this

50:02

ban.

50:03

>> I got to meet Castro one time.

50:06

>> No, I want to hear that. But one one

50:08

second. But data provided by Public

50:09

Safety Canada shows that as of March 27th,

50:12

32,246 people signed up to participate in

50:14

the program. They declared a total of

50:16

57,540

50:19

firearms, roughly 42% of what was

50:22

projected.

50:23

But they were talking about Oh, you know

50:25

who has it on his page is uh Colion Noir.

50:29

>> He has it here. I'll send it to you

50:31

because

50:32

>> we got to do this in Memphis, dude.

50:35

>> It's kind of crazy.

50:37

>> Yeah. You can't let him take away your

50:39

weapons.

50:40

>> No,

50:40

>> because how will you fight?

50:42

>> Uh how will you fight? That's a very

50:43

good question.

50:45

Yeah, I saw Colion's video. Here it is.

50:51

>> Yeah, this is it. Here, play this type

50:54

[ __ ] right here.

50:55

>> Response.

50:56

>> Cool. We're sending police to your

50:58

house.

50:59

>> The declaration period for firearms

51:00

owners is scheduled to end next week. So

51:03

far, only 2.5% of the estimated 2

51:05

million affected firearms have been

51:07

declared, and 98% of firearms owners

51:09

haven't made a declaration. Canada

51:11

banned 2500 types of firearms, gave gun

51:13

owners until March 31st, essentially

51:15

today, to declare them. One week before

51:17

the deadline, 2.5% compliance. 2.5.

51:22

That's not a slow roll out. That's a

51:24

full-on rejection.

51:25

>> So, if they're not declaring by next

51:26

week, what's your plan, Minister?

51:28

>> The plan we have is as of March the

51:30

31st, the uh time to complete uh the

51:34

enrollment um will be will be done. Uh

51:37

and then uh the RCMP uh and other

51:40

agencies will be um uh available

51:43

throughout uh the spring and the summer

51:46

to do the collection.

51:47

>> The collection the collection

51:49

>> like he's speaking about dry cleaning,

51:51

not firearms, not property that belonged

51:53

to law-abiding citizens before the

51:55

government decided anymore.

51:57

>> So Minister, you're saying that RCMP

51:58

members, we just heard an auditor

52:00

general report saying we're short 3,400

52:02

members. We're dealing with a wave of

52:03

violent crime across this country. And

52:05

you're saying that your plan is over the

52:06

spring and the summer to deploy RCMP

52:09

officers to go door-to-door to firearms

52:10

owners and seize their firearms.

52:12

>> So this is a voluntary program, Mr.

52:15

Lloyd, as you're aware. Um, and the RCMP

52:18

resources and the resources we will use

52:21

with law enforcement uh does not

52:23

contemplate in any way using existing

52:26

resources. These are additional

52:28

resources. So these are those who are

52:30

off duty, those who may be retired. I

52:33

can ask you to do that? You're sending retired, very

52:37

new officers door to door because

52:40

frankly many police forces across the

52:41

country are refusing to participate in your

52:43

program.

52:43

>> And here's the part that should make

52:45

your jaw hit the floor. The Minister of

52:47

Public Safety, the guy running this

52:48

entire program, was secretly recorded

52:50

saying the gun grab isn't worth the

52:52

money.

52:52

>> The Minister of Public Safety

52:55

accidentally told the truth and he was

52:57

recorded doing it. He said that the gun

52:59

grab is not worth the money. He doubts

53:01

local police will have the resources to

53:03

enforce the Liberals' mandatory gun

53:05

buyback program and says the reason the

53:07

prime minister is sticking with the

53:08

policy is to appease voters in Quebec.

53:10

>> He privately admitted the police can't

53:12

even enforce it. He said they're doing it for votes

53:16

>> in the

53:18

city.

53:20

Go back to his name.

53:21

>> I can't

53:22

>> if you can't. It was Anna.

53:24

>> Whatever his name is saying I want him

53:26

to go door to door. You go door to door

53:28

to door and do this [ __ ]

53:28

>> You go door to door, [ __ ] You want to

53:30

do that?

53:31

>> How about you do it?

53:32

>> He's talking about getting retired

53:33

people to go door-to-door and take

53:35

people's guns. You're going to get someone

53:36

shot, stupid.

53:37

>> Well, it's just like our draft now.

53:38

They're like, now it's 42, now it's 47.

53:40

Now

53:40

>> you could have a a marijuana arrest now.

53:43

>> They're letting anybody in that [ __ ]

53:44

>> A little weed. What's the big deal?

53:48

>> Come on.

53:49

>> But dude, here's here's a here's the

53:50

part to me that's like you start to see

53:52

like the uh [ __ ] in the armor or

53:55

whatever. And no offense anybody. Um but

53:57

>> you're allowed to say that [ __ ] in the

53:59

armor. But they know who I'm They know

54:00

who they think are. I'm I'm not saying

54:02

anything about

54:02

>> I know you're not. But the people think

54:05

that

54:05

>> I'm not.

54:06

>> Yeah, I hear you.

54:07

>> But they But yeah,

54:08

>> you can't say [ __ ] and span anymore

54:09

either.

54:10

>> You can't say [ __ ] and span.

54:12

>> Well, you can, but you shouldn't.

54:13

>> You got to whisper it.

54:15

>> Hey,

54:16

>> that guy should be forced to go door to

54:18

door. Go door to door in a bright orange

54:20

vest with a circle in the center of it.

54:22

>> Yeah, I was just going to get his name

54:23

cuz I wanted to say that guy's a [ __ ]

54:25

and go do your own [ __ ] That [ __ ]

54:28

that freaking little homie

54:29

>> dough boy.

54:30

>> Yeah, that little [ __ ] that little

54:32

sloppy bran muffin.

54:33

>> Yeah,

54:34

>> [ __ ] sloppy muffin top.

54:36

>> Yeah, get your That's his name right

54:38

there. Gary. And I'm going to disagree.

54:41

>> Anandari.

54:45

Anandagery.

54:47

>> Canada. They're going to come for you

54:48

next. But here's what here's what's

54:49

funny to me, Joe. That's such a crazy.

54:51

You can't you you're not even

54:53

grandfathering people in mandatory gun

54:56

confiscations.

54:58

>> They just want people vulnerable.

55:01

>> Of course that's what we're saying, man.

55:03

>> Yeah,

55:04

>> Joe. That's what we're saying. They want

55:05

us all vulnerable.

55:06

>> Yeah, they do. They would much rather

55:07

that because look, what's the

55:08

difference?

55:09

>> What's the difference between America

55:10

and everywhere else?

55:12

>> One of the big differences we're [ __ ]

55:14

heavily armed,

55:15

>> right? That's one reason why it's a

55:17

real problem to try to take over America

55:19

and it's in our Declaration of

55:22

Independence. It's in the Bill of

55:24

Rights. It's like that this, you know,

55:27

the the right to an armed militia,

55:30

>> the right to keep and bear arms and to

55:32

have an armed militia. Like that's what

55:34

is that? And people are like, "What is

55:35

that for?" Well, that's to keep you from

55:38

being taken over by tyrants who have

55:40

guns.

55:41

>> Yeah. Well, here's here's one thing

55:43

that's interesting to me is like RFK was

55:45

on not long ago and he was saying that

55:47

75%

55:48

and I could be off by a few percent of

55:51

young men uh can't

55:53

>> 77

55:54

>> aren't eligible for military service.

55:55

>> So, this is the hilarious part to me

55:57

now. Now, they've poisoned us so much

56:00

that the that they can't even they don't

56:02

even have healthy people to serve in the

56:04

military. And now they're still like I

56:06

feel like the these powers that be are

56:08

like in this tough spot now. We're like,

56:09

"Fuck, we poison them too much. They

56:11

can't even go spill their blood for us,

56:13

you know?

56:13

>> Right. Well, they can't. I mean, there's

56:15

enough that can,

56:17

>> but they're widening these things. It's

56:19

like with ICE now, they're like, if

56:20

you're 65 and like have decent vision,

56:22

you can be in ICE, you know, they're

56:24

letting it's just like it kept getting

56:25

bigger.

56:26

>> In ICE, you only have seven weeks of

56:28

training.

56:29

>> Yeah.

56:30

>> You think about that's not even what you

56:32

get in the police force.

56:34

>> Yeah.

56:35

>> We had more than that for [ __ ] T-ball

56:37

when I was a kid.

56:39

And Mr. Rick, dude, remember when you

56:41

had T-ball and your coach was just some

56:42

dude who had a name? Like, that's our

56:44

coach, Rick.

56:44

>> Just imagine this. Imagine if you had

56:46

seven weeks of training and you had to

56:47

go into a jiu-jitsu tournament.

56:49

>> I know.

56:49

>> You would get [ __ ] smoked. You would

56:51

get [ __ ] smoked. You don't know what

56:53

you're doing. You barely know what

56:54

you're doing. You're going to make a

56:55

bunch of mistakes.

56:55

>> Yes, I would.

56:56

>> Seven weeks of training in that is even

56:59

scarier because you got you got guns and

57:02

you're going out in the street and

57:03

you're arresting people.

57:05

Yeah, but it's like, now

57:09

more than ever it feels like theater and

57:12

it feels like it's been theater for a

57:15

while and it feels like maybe this is

57:19

crazy, but it feels like we're at the

57:22

last cusp before something weird is

57:23

going to happen. Didn't you say

57:24

something weird might happen, Jamie?

57:26

>> Jaime's always saying that. Jaime's

57:28

always He's tuned in.

57:29

>> Is he like that?

57:30

>> Jaime's got an ear for weird.

57:31

>> Even blacks are getting scared, though.

57:33

>> For real? Yeah.

57:34

>> Yeah. But they're more scared of like

57:36

the Trump movement, you know,

57:38

totalitarianism and fascism.

57:41

>> No,

57:42

>> I think they're getting, you know, they

57:44

see these ostracized, they see these

57:45

communities of people out there getting

57:47

abused and [ __ ] and I think it reflects

57:48

in them somewhere, you know.

57:50

>> You mean like with ICE? Is that what

57:51

you're saying?

57:51

>> No, with like uh like, you know, you

57:53

see, you know, there's a lot of brown

57:55

people getting murdered on [ __ ]

57:57

TikTok all the time, like you know, in the

57:59

Middle East. And I think you see that

58:00

and it makes them hyped up or you know

58:02

it activates.

58:03

>> Well, everybody should be upset about

58:04

that.

58:05

>> I agree. But

58:06

>> the idea that this the only way to solve

58:09

problems is by dropping bombs on people

58:11

is it's so crazy that that's still the

58:13

move in 2026. But I don't think

58:15

>> however, but however, if you are faced

58:19

with an evil dictator that has his eyes

58:22

on a global caliphate and is developing

58:25

nuclear bombs,

58:27

you can't be all [ __ ] kumbaya. But

58:29

the question is like how did how does

58:32

that get resolved,

58:33

>> right? That's the question.

58:34

>> How do you make sure how can you even

58:36

know they're not capable of having

58:39

nuclear weapons? And for the last 20

58:41

years, they've been preparing and

58:42

stockpiling missiles and developing what

58:45

is the they have some crazy thing I was

58:47

seeing online where it's like they

58:49

almost like have a mountain and dug deep

58:53

into the ground. They have these missile

58:55

elevators and like the missiles are like

58:58

hidden deep into the ground where the

59:00

only way you could destroy that

59:02

facility is with like a nuke and they

59:04

just did it specifically knowing that

59:07

they were going to get bombed. Well,

59:09

they had the, you know, they did Top Gun

59:10

movie where Miles Teller flew in there

59:13

and then a year later we did that in in

59:17

or a few years later we did that in uh

59:20

Iran. Like, isn't it kind of like it

59:21

just it all seems bizarre where they had

59:23

to fire a nuke down or they had to fire

59:25

a missile down into the thing. Remember?

59:27

>> I didn't see that movie.

59:28

>> It was good.

59:29

>> I bet it was.

59:30

>> It actually was good.

59:30

>> I like the first one.

59:32

>> Oh, we made a movie, too. I got to tell

59:34

you about our movie. I can't forget.

59:35

>> Oh, that's right. You made a movie.

59:36

>> Yeah. I didn't mean to interrupt about

59:37

it,

59:37

>> but let's find out what what what was I

59:40

asking before we moved on.

59:42

>> Iranian missile thing.

59:44

>> Yeah. What is that elevator thing that

59:46

they have? They have some underground

59:48

like deep underground. Someone was

59:50

explaining it online. They have a very

59:53

unique method of protecting their

59:55

missiles from being bombed. So they have

59:58

their storage is like deep deep

60:00

underground. I think that's one of the

60:02

things that they were just attacking

60:03

recently. like we were dropping bombs on

60:06

them recently.

60:07

>> I don't think we're over there doing

60:08

that for ourselves, though.

60:10

>> Doesn't seem like it. Doesn't seem like

60:12

it's in our best interest, you know.

60:15

>> Why do you think Why do you think then?

60:16

What What is it that like Israel holds

60:19

over America that we do those things?

60:21

>> Well, first of all, there's a lot of

60:23

people that donated to the Trump

60:24

campaign that have significant influence

60:26

over him. Yeah.

60:27

>> That uh lobby for Israel, right?

60:29

>> And they're very uh beholden.

60:31

>> So, that's just capitalism then, right?

60:33

So IDF uncovers Iran missile mega

60:36

cities.

60:37

>> I don't believe anything they say.

60:39

>> It's hard to know cuz this is all AI,

60:41

right?

60:41

>> No.

60:42

>> Is this real?

60:42

>> It looked like as it was. But

60:43

>> is this real?

60:45

>> Honestly,

60:46

>> this looks AI. Some of it does look AI,

60:48

but that video of those guys walk right

60:50

there. Real.

60:51

>> It's so hard to know these days, man.

60:53

>> If it's so hard to know, you know, this

60:56

is like if I was Iran, I'd make a video

60:58

like that. Look at all my bums. Look at

61:00

my big [ __ ] and look at my bums. Out of

61:04

a big old dick. Big old dick. Like a

61:06

third leg and a bunch of bombs.

61:07

>> I'm [ __ ] sick of my dick.

61:09

>> Really?

61:10

>> Give it a break

61:13

for a couple days and you'll miss it.

61:14

>> Oh, I've had a lot of thoughts. I

61:18

>> go vacation from your dick.

61:20

>> Bro, there's times I wanted to just mail

61:21

my dick to Africa or whatever.

61:23

>> Don't

61:24

>> just feed a couple.

61:24

>> They'll never send it back.

61:25

>> But I'm saying to feed a couple people.

61:27

I don't think it'll feed a couple.

61:30

>> It'll dude.

61:31

>> I don't even think feed one.

61:32

>> Get out of here.

61:33

>> Might keep him alive for a few hours.

61:36

>> It would be lunch. At least lunch for

61:38

two.

61:39

>> Someone on a diet.

61:42

>> Someone cutting weight for wrestling.

61:44

>> Yeah.

61:45

>> Or that dude that tried to cut weight

61:47

and because he wasn't gay anymore.

61:48

Remember I told you about that dude?

61:52

He lost 40 lbs, dude. He was just

61:54

[ __ ] ribs and dick by the end of it.

61:56

Dude, that guy.

61:58

>> I got to pee really bad. Can I join?

62:00

>> Yeah. Pause. Pause. We'll be right back.

62:01

Ladies and gentlemen, we'll be right

62:03

back.

62:05

>> Theo Von, David Spade, Bus Boys in

62:08

theaters April 17th. Did you finance

62:11

this, dude? Did you [ __ ] do this [ __ ]

62:12

with your own money?

62:13

>> Yeah.

62:13

>> You wild [ __ ] You

62:16

>> Wow.

62:16

>> We wrote it in. Yeah, we did it all.

62:18

There's no studio attached to it.

62:20

There's nobody

62:20

>> Tim Dylan's in it. Tim did a good job.

62:23

>> He's awesome.

62:24

>> He is awesome.

62:24

>> He really is.

62:25

>> He's my uh He's one of my favorites.

62:29

>> Yeah. No doubt.

62:30

>> He's one of a kind.

62:31

>> Is that Nate Diaz?

62:33

>> Bro, he was so He was

62:34

>> Is that Louis J?

62:36

>> Uh

62:37

>> whoa.

62:38

>> No. Who's in it? Cam Patterson, Trevor

62:40

Wallace.

62:42

>> Nice, dude. What's it about?

62:43

>> Um it's about two guys and uh they're

62:46

bus they're just regular guys and

62:48

they're not doing that good. And then um

62:51

they think if they can one of them loses

62:54

his girlfriend to a waiter and they

62:56

think if they can become waiters that

63:00

they can get his girlfriend back and uh

63:03

they have to start at bus boys

63:07

and they don't get very far. So that's

63:09

pretty much it.

63:10

>> Spoiler alert.

63:11

>> It was crazy though, dude. I mean I

63:13

think there's just like a thing about

63:14

like like nobody like it's just we made

63:16

it ourselves. Like we wrote it, we did

63:18

it. There's no [ __ ] somebody saying I

63:20

can't put this in it. Like some of the

63:21

streamers are like, "Nah, it's too edgy

63:23

for us or whatever." [ __ ] them then.

63:24

You're out. You know what I'm saying?

63:26

We're doing our own [ __ ] And so,

63:27

>> did you sell it to a movie distributor?

63:29

How did you get it into movie theaters?

63:32

>> We just um

63:34

>> I don't know how that any of that stuff

63:35

works.

63:35

>> I don't know either. We have a guy who's

63:37

doing handling some of the business side

63:38

of it.

63:39

>> My friend Ezra's handling some of the

63:40

business side of it. He's great. And so,

63:42

he's been helping us out uh and gotten

63:43

it into the theaters.

63:44

>> Who directed it?

63:45

>> Um this guy Jonah Feingold, uh guy out of

63:48

New York and um great guy. Uh and yeah,

63:53

we just we asked our friends to help and

63:55

it was um yeah, I mean it was

63:57

ridiculous. We shot it right during like

63:59

the fires, when the fires were

64:00

happening in the Palisades.

64:01

>> Oh wow.

64:02

>> So it was like it was like

64:04

>> you shot it in California.

64:05

>> Yeah.

64:06

>> Wow.

64:07

>> I don't know why exactly, but um Oh, cuz

64:09

there was nothing shooting there. They

64:10

don't shoot things there anymore.

64:12

>> Isn't that crazy?

64:13

>> Imagine people have been so greedy and

64:15

[ __ ] attacked. They [ __ ] themselves

64:17

so much they can't even [ __ ] do their

64:18

the one thing that they're most known

64:20

for, Hollywood. They can't even [ __ ]

64:22

do it.

64:22

>> It's It's so crazy. Everybody

64:24

>> It's gross. It's not just It's But it's

64:25

gross though.

64:26

>> It is gross. It's all the government.

64:28

It's all government. It's all government

64:30

policies, regulations, taxes, all the

64:33

things that make it unprofitable to do

64:35

business there. People just pulling up

64:37

shop.

64:37

>> And there's all these Yeah. There's so

64:39

many guilds you have to pay. It's like I

64:40

don't see how these people I I don't see

64:42

how like a day-to-day actor could

64:44

survive. And they don't. and they leave.

64:46

>> A lot of guys are [ __ ] I was just

64:47

watching this video with this guy. I've

64:49

seen him in a ton of movies. And he's

64:51

like, Blue-collar actors are just not

64:53

doing well right now. He's like, I had

64:55

to sell my house. You know, a lot of

64:57

people are just going to television

64:59

shows because there's no money in films

65:00

anymore. He goes, I used to be able to

65:02

make a living in films. And he's like, I

65:04

didn't make a lot of money. Because he's

65:06

just, you know, the guy who has a small

65:08

part in movie here, small part in movie

65:10

there. So, he's getting by and he's, you

65:12

know, gets to take his family to the

65:14

movie and they get to see the dad on

65:16

screen. It's cool. Yeah. You know, he's

65:18

paying his bills, doing well, but he's

65:19

not getting wealthy, right? He's like,

65:21

the stars get wealthy. But those dudes

65:23

that you need, you know, the guy that

65:25

plays the cop, the guy that plays this

65:26

person,

65:27

>> those guys are [ __ ]

65:28

>> Well, I have I have the name of

65:30

everybody that was in it, everybody that

65:31

worked on it, if if we have some

65:33

success, I'm gonna go back and reward

65:34

those people, man. And I'm excited about

65:36

that. And um and yeah, if people if even

65:39

if it just does good, then we can make

65:41

other stuff, right? And nobody can tell

65:43

us that we can't.

65:44

>> Yeah. Once you do one that's good, then

65:46

more people are interested in investing,

65:48

you know, gets you get your foot in the

65:51

door. You do a Netflix series. You can

65:53

do anything you want.

65:54

>> And it's not like Dunkirk. I don't

65:55

know if I want to really get into like

65:56

that much acting stuff, but it was just

65:58

like,

65:59

>> you know, I grew up watching David Spade.

66:01

We got to do it together and we wrote we

66:02

just went through all of these hurdles

66:04

and then like uh the fact that we got it

66:06

done, dude. I thought it was all emails

66:08

till the first day I showed up on set

66:09

and I was like, "No [ __ ] way. We were

66:11

serious. People were serious about

66:13

this."

66:15

>> Oh, that's crazy. You did it. But yeah,

66:19

I think so. Yeah, something like that. I

66:21

think there's something like that. And

66:22

uh and if people if people can buy a

66:25

ticket early to it, I don't want to

66:27

sound I'm not I'm not desperate about

66:28

it. If it does, fine, that's cool. And

66:30

if it doesn't, that's okay, too. I feel

66:32

happy that we got to do it.

66:33

>> If it's funny, it'll do great because

66:35

there's not a lot of that these days.

66:36

There's not a lot of really funny

66:38

movies.

66:38

>> Yeah.

66:38

>> And I know it's going to be funny.

66:40

>> There's some parts that are really,

66:41

really funny. I'm sure it's not like

66:42

Dunkirk or anything like that. It's not

66:44

like Midsommar or whatever.

66:45

>> What are those things that you just

66:47

said?

66:47

>> Those are just other movies. But it's I

66:49

don't want people going in there

66:50

thinking it's like um

66:53

uh like a trying to think of uh Bridges

66:58

of like

66:59

>> Bridges of Madison County.

67:00

>> Yeah. It's nothing like that.

67:01

>> It's a comedy. Nobody's going to think

67:02

it's that. No, it's you and David Spade.

67:04

Who the [ __ ] is going to think it's

67:05

Clint Eastwood and Meryl Streep?

67:07

>> Yeah.

67:08

>> What's wrong with you?

67:09

>> I don't know what people think.

67:12

>> I don't know what people think or how

67:13

they think. But yeah. Anyway, but yeah,

67:16

there there's some [ __ ] [ __ ]

67:17

stuff. It's just fun, dude.

67:18

>> You know, he used to have a great joke

67:19

about Bridges of Madison County. Chris

67:21

McGuire, he had a [ __ ] great joke,

67:23

>> dude. It's one of my favorite movies.

67:25

>> Congratulations. Let me tell you his

67:26

joke. His joke is about how, you know,

67:28

it's hard to choose a movie with your

67:30

girlfriend, like she wants this. And he

67:32

goes, "Brides of Madison County." He's

67:33

like, "Oh, Clint Eastwood was in it." He

67:35

goes, "Clint would never [ __ ] me." And

67:37

he goes, 10 minutes into the movie, he's

67:39

like, "Hey, something's fishy. Clint

67:41

doesn't have a gun." He goes, 20 minutes

67:44

after that, Clint's crying. I'm like,

67:46

"Oh, Clint, you [ __ ] me." He goes,

67:48

"He's crying cuz he doesn't have a gun.

67:54

>> Such a great joke."

67:55

>> Yeah, that's

67:55

>> shout out to Chris McGuire.

67:57

>> Shout out to Chris McGuire. I haven't

67:58

met him.

67:59

>> You never met him?

67:59

>> I haven't.

68:00

>> Funny dude. We started out together way

68:02

back in the D, but he went Yeah, he went

68:04

the route of writing.

68:05

>> He mostly writes and stuff now,

68:08

>> but uh was a funny comic, man. It's a

68:10

good comic. But um these [ __ ] comedy

68:13

movies are squashed. We were just

68:15

talking about that last night in the

68:17

green room. We was like, it seems like

68:19

The Hangover was probably the last gasp

68:23

and that was like 2009.

68:26

>> But what happened? Like how could you go

68:28

that

68:28

>> People got scared. They got scared,

68:30

100%.

68:31

>> Seems organized to me. No, it's taking comedy

68:34

away from people, that they're not going

68:35

to be allowed to laugh.

68:36

>> They didn't think. They didn't think.

68:37

It's woke ideology that's looking to

68:40

yell at people for every

68:41

>> Oh, yeah.

68:42

>> transgression. And you can't have that

68:44

with comedy. You can't have that kind of

68:46

nonsense with a really funny movie like

68:49

There's Something About Mary

68:51

>> or you know Kingpin.

68:53

>> Kingpin

68:54

>> classic Farrelly Brothers movies. Oh, so

68:56

good.

68:56

>> How great was that?

68:57

>> Great [ __ ] movie. Great god

68:59

>> [ __ ] movie. That movie is so good. So

69:02

funny. Even to this day, go back and

69:04

rewatch it. Bill Murray with his crazy

69:06

[ __ ] hair. Woody Harrelson with one

69:07

hand. It's a great movie, man. When he

69:10

had to go down on that lady to pay his rent

69:12

and he threw up in the toilet. Remember

69:14

that scene?

69:16

>> That movie's 30 years old now.

69:18

>> Is it?

69:18

>> That's crazy.

69:19

>> That is crazy.

69:20

>> Sick.

69:21

>> It's a banger of a movie, man.

69:23

>> All the good shit's gone, dude. But it's

69:25

But it's not. It's not. That's true, is

69:26

it? Right. Sometimes I get in that

69:28

attitude where it's like, I got to stay

69:29

out of those little moments. I usually

69:30

get out of them pretty quick.

69:31

>> You can still do those movies, but you

69:33

have to do it the way you just did it.

69:35

You have to finance it yourself and you

69:37

have to do But luckily now, man, you

69:39

could shoot a whole [ __ ] movie on

69:41

your phone.

69:42

>> Dude, we shot this [ __ ] in 23 days,

69:44

dude. There was one day where the winds

69:45

were like 50 mph and it was like, we

69:47

can't afford to be here another day. So

69:50

suddenly in these scenes, there's just a

69:52

ton of [ __ ] wind, dude.

69:54

>> Well, that's fine. That [ __ ] happens in

69:56

the real world. Why can't it happen in

69:58

your show?

69:59

>> I agree. It was just I think it was just

70:00

interesting how it all worked out. Uh

70:02

>> people are making their own stuff, you

70:04

know, like I was talking to Shane about

70:05

this last night

70:06

>> because, you know, Shane just wrapped up

70:08

Tires, this new season of Tires.

70:10

>> Yeah.

70:11

>> Yeah. He was he [ __ ] he was telling

70:14

me some hilarious scenes from Tires. I

70:16

can't wait to watch it. But it's like

70:18

that kind of a thing where just him and

70:19

his buddies put together a show.

70:22

>> Yeah.

70:22

>> You know, it's like his buddies, the

70:24

writer and the director, all his buddies

70:26

are on it. He they all came up with the

70:28

idea. They do it themselves. No one's

70:30

looking over their shoulder. I asked him

70:32

if like Netflix has any input. He's

70:34

like, "No, there's no input. They just

70:37

make a show. They just make a show.

70:38

>> That's fun.

70:39

>> Give it to Netflix. Bang."

70:41

>> It's a beautiful time for stuff like

70:43

that.

70:43

>> Yeah, you're right. There's a new It's a

70:45

primavera, they say in Spanish. It's a

70:47

springtime for new things.

70:49

>> Well, there's an opening, right? And

70:52

because there's no gatekeepers anymore

70:54

because they've essentially killed their

70:56

own business, you can kind of do it on

70:58

your own now.

70:58

>> Yeah.

70:59

>> That's the beautiful thing. Like you

71:00

don't have to like sit in a room full of

71:02

[ __ ] executives that don't know jack

71:04

[ __ ] and they want to give you direction

71:06

on what's funny and what's not and

71:07

where's the diversity in your film.

71:09

>> Yeah.

71:09

>> You know, we think you should have a

71:10

black trans friend. Like oh

71:12

>> yeah, we think you should have a a

71:14

[ __ ] ant or whatever like the insect

71:16

or whatever. And I'm like that's crazy.

71:20

You're like this is what

71:23

>> you're like this this is a script about

71:25

driver's ed and like but you need an

71:27

insect that's a homoerotic. It's just

71:30

people got stupid. They got stupid with

71:32

their virtue signaling in films. And you

71:34

can't do that with art. You You can't

71:36

have quote Do you see what the Academy

71:38

Awards doing like in order to qualify to

71:42

be nominated for an Academy Award now?

71:44

>> Well, for the podcast thing, I know that

71:45

they said we had to pay a like a fee or

71:47

something. I remember you talked about

71:48

that.

71:48

>> That's that's a different That's the

71:50

Golden Globes.

71:51

>> Okay. Sorry.

71:51

>> Yeah, that's a different thing. Yeah.

71:54

You didn't pay for that either, did you?

71:56

>> No.

71:56

>> Did they ask you to?

71:57

>> Yeah.

71:57

>> [ __ ] yeah, dog. Give me some.

71:59

>> Yeah, [ __ ] off.

72:00

>> I said, "So what?" Yeah. And I was like,

72:03

"If Joe Rogan, if you don't

72:05

even have him in it, then what are you

72:06

even making a thing for?"

72:08

>> That was also a reason why I didn't want

72:10

to be in it. Like I don't want to

72:11

legitimize this. You guys have [ __ ] up

72:13

every other form of entertainment and

72:14

now you're going to judge podcasting.

72:16

And what did you pick? Look, I'm not

72:18

saying there's anything wrong with Amy

72:19

Poehler's show. I haven't watched it. People

72:20

love it. That's great. But she's like a

72:23

famous lady who just started doing

72:24

podcasting six months ago and she's got

72:26

the number one podcast. Like if you guys

72:28

ever listen to Radiolab, you know, you

72:30

ever listen to like this? There's some

72:31

banging [ __ ] podcasts out there. They

72:33

might not be number one, but if your

72:35

whole idea is like pick the ones that

72:37

are great that are like really

72:39

interesting, how stuff gets made.

72:41

There's a bunch of [ __ ] great

72:43

podcasts.

72:43

>> SmartLess is cool.

72:44

>> There's a bunch of great podcasts out

72:46

there.

72:46

>> Oh, dude, there's so many great ones,

72:48

dude. Matt McCusker, if you get him. How

72:51

fun is he to listen to?

72:52

>> He's awesome. He's fun. He's a good

72:53

dude.

72:54

>> I'm glad he's out here. He's a special

72:57

dude, man.

72:57

>> Yeah. Very smart guy, you know.

73:00

>> Yeah.

73:00

>> It's uh there's a lot of great podcasts

73:02

out there. Tim Dillon's not on that list.

73:04

[ __ ] off.

73:04

>> Yeah.

73:05

>> If he's not on that list, [ __ ] off.

73:06

>> Get [ __ ] Get [ __ ] That is the one

73:09

podcast I consistently listen to. Tim

73:11

Dillon.

73:12

>> That's awesome.

73:13

>> His episode on the Epstein Files is one

73:16

of the best [ __ ] podcasts I have ever

73:18

listened to. I was like clapping in my

73:20

car at red lights.

73:22

>> Yeah.

73:22

>> Just clapping like woo. It was he was on

73:25

fire and it was the perfect

73:28

combination of satire,

73:32

>> honest, real facts, complete chaos,

73:36

humor, wearing those goofy glasses,

73:39

ranting like a maniac. It was amazing.

73:42

>> Yeah, man. I I do feel lucky that I've

73:44

gotten to meet like just that's one of

73:45

the truest things I think through comedy

73:47

is just getting to meet some just some

73:49

fun people, dude.

73:50

>> We know some cool [ __ ] We

73:52

really do. We know some cool

73:53

[ __ ] We really do.

73:55

>> And thanks, dude. Thanks for letting me

73:56

come in here today, too.

73:57

>> Come on, dog. To spend time with you.

73:59

>> Come on, dog.

73:59

>> It's good. It just feels Things feel

74:01

kind of scary out there.

74:03

>> Well, it's a little also scary. I keep

74:05

telling you this cuz you're on your own

74:06

out there. You're out there living in

74:08

Nashville.

74:09

>> I'm getting close to being here.

74:10

>> Ain't a lot of comics out there, dog. I

74:12

mean, Bargatze's out there, but he's

74:13

always doing [ __ ] stadiums on the

74:15

road and [ __ ]

74:16

>> Yeah.

74:17

>> Like, you need to be around.

74:18

>> Oh, I'm getting ready.

74:19

>> The crew

74:20

>> cuz I have to start to practice again.

74:21

I'm taping my special in one month.

74:23

>> Last night in the green room, it was

74:25

Shane, Ron White, Tony Hinchcliffe,

74:28

Brian Simpson, Asana, Derek Poston. We

74:32

were just laughing and laughing. It was

74:35

It's so fun. And everyone's going on

74:38

stage and [ __ ] tearing it up. It was

74:40

It was exciting. It's like it's in the

74:42

air like something's happening here.

74:44

Yes. And you see all these young guys

74:45

coming in, these young women coming in,

74:47

they're all fired up and they're all

74:49

[ __ ] prepared and everybody's like

74:51

really trying to [ __ ] kill it.

74:53

>> Yeah,

74:53

>> it's nice.

74:54

>> Yeah, we got Christina Mariani. I'm

74:55

doing a show tonight. She's on it. Dylan

74:57

Sullivan, I think is

74:58

>> Dylan Sullivan's very fun, too.

75:00

>> So, I'm excited about that.

75:01

>> Yeah, he's they're both at the club all

75:03

the time. It's a good It's a fun time

75:05

for comedy, man. It really is a real

75:06

good time for comedy.

75:07

>> Yeah.

75:08

>> And uh

75:09

>> it's a special time.

75:10

>> Comedy doesn't exist in a vacuum, you know.

75:13

That's why I keep telling you you can't be

75:14

on your own.

75:15

>> Oh, you can't do it by yourself, man.

75:17

Like, you ever go by yourself on the

75:18

road and you have like opening acts you

75:19

don't know?

75:21

>> Oh, yeah.

75:21

>> I used to hate every now and then I I

75:23

met some friends. Like that's how I met

75:24

Segura. I didn't know Segura until I

75:26

worked with him on the road.

75:27

>> So, you do meet some cool [ __ ]

75:29

occasionally, but it's like one out of

75:31

10 or one out of 20.

75:33

>> Yeah.

75:34

>> So, you do all these gigs and you're

75:35

lonely. you're just like on the road and

75:37

you go into libraries and [ __ ] or

75:39

bookstores and you're like trying to

75:40

watch something on TV and going to the

75:42

gym but you you feel completely

75:44

disconnected to people until you get on

75:46

stage. It's not as fun.

75:48

>> Yeah.

75:48

>> It's like you want to be around a bunch

75:50

of other comics that are your friends

75:52

and also you want to hear their sets.

75:54

You want to watch them crush. You want

75:56

to go on stage already laughing. You

75:58

want to be laughing at what he just said

75:59

when you get on stage

76:00

>> and feel the competition. It's

76:03

inspiration more than it is competition.

76:05

>> That's fair.

76:06

>> So the problem with competition is

76:07

someone has to lose.

76:09

>> Yeah.

76:09

>> You don't want anybody to lose. Then no

76:11

one has to lose. It's just these people

76:14

doing well should inspire you to do

76:16

well. They should light a fire under

76:18

you.

76:18

>> Yeah.

76:19

>> You can call it competition, but the

76:20

problem with competition is one person

76:22

wins, one person loses. That's not

76:24

comedy. What comedy is is that everybody

76:26

wins. That's real. That's not like

76:28

[ __ ] talk to try to appear humble.

76:31

The reality is you win if everybody

76:34

wins.

76:34

>> Well, that's one thing

76:35

I've always admired. You've always

76:37

been that way. Like, I'm going to pick

76:38

you. Yes, I'll support you how I

76:41

can, you know, and you've always been

76:42

that way about young comics and uh Yeah,

76:45

I agree with you.

76:46

>> People did it for me, man. They did it

76:47

for me when I was coming up and it it

76:49

helped me tremendously and I I try to

76:51

pass it on times 10. It's, uh, between that

76:54

and Kill Tony.

76:55

>> Kill Tony's so fun, dude.

76:57

>> It's such an important part of comedy.

76:58

like having this place where you all you

77:01

need is a minute. You could have been

77:02

doing comedy like just trying it out on

77:05

the road and [ __ ] just like barely

77:07

filling up a Friday night 10:00 show and

77:10

and then you develop like one minute

77:13

that just breaks through and all of a

77:16

sudden you got a [ __ ] career.

77:17

>> Yeah.

77:18

>> You know, you got a career now.

77:19

>> Yeah. I mean, there's young heroes that

77:21

are being sprouted out of here and even

77:23

adult heroes, people that have been in a

77:25

while are getting here and finding their

77:26

finding just a new right. It's like

77:28

>> guys like Adam Ray. Adam Ray is killing

77:30

it now. Adam Ray was struggling.

77:33

>> He was struggling, but he was a funny guy. A

77:36

hard worker. Never lost his ambition.

77:38

Never lost his focus. Never lost his

77:40

enthusiasm for it. Never got bitter.

77:42

Always friendly

77:43

>> always.

77:44

>> And just needed a show like Kill Tony to

77:46

come around. They're like and everybody

77:47

like, "Oh my god, this [ __ ] is

77:49

talented."

77:50

>> Yeah.

77:50

>> All those different characters that he

77:51

does.

77:52

>> I know. And that's a brave thing. So if

77:53

you if you just done comedy, mostly

77:55

standup, and then to try and go into

77:57

character, that's a kind of a that's a

77:59

to me that that that would feel very

78:01

hard. So that's a brave thing that

78:03

>> there's a few of those guys that really

78:04

excel at that. That's a special talent.

78:06

Him and Dunnigan, especially.

78:09

>> Kyle Dunnigan, so funny.

78:11

>> He's so funny. And I always thought he

78:13

was going to make it with those face

78:15

swaps. This shows you how the industry

78:17

is so [ __ ] up. Okay,

78:19

>> so he was doing those face swap shows on

78:22

Instagram, right? And they were so

78:23

funny. But one of the reasons why

78:25

they're funny is because it's obviously

78:27

fake. It's crude like South Park. Like

78:30

it doesn't look real. So it doesn't

78:32

freak you out at all. It looks so fake

78:33

that it's funny,

78:34

>> right?

78:35

>> He went into Comedy Central and they

78:36

started using like much more

78:38

sophisticated face swap, which wasn't as

78:40

funny. It was like creepy. And then they

78:43

cut the balls off of it. Like he wanted

78:45

to have one where Caitlyn Jenner was

78:47

[ __ ] Donald Trump. Caitlyn, yeah,

78:49

baby. Like riding Trump and they went,

78:52

"No, no, no, no, no."

78:53

>> His Kardashian ones are so funny. He's

78:55

the best.

78:56

>> And even the Kardashians like him, I've

78:58

heard.

78:58

>> Yeah. Look, they they have a sense of

78:59

humor. They have to

79:01

>> They have to have a sense of humor.

79:02

They've been in the public eye for 20

79:04

[ __ ] years with no talent whatsoever.

79:06

Just getting attention. Like you you got

79:08

to not take yourself too seriously if

79:09

you hold that position,

79:11

>> you know?

79:12

>> Yeah.

79:13

Raking in dough. Raking in that dough.

79:16

>> Their whole family, they should count as

79:18

reparations. I feel like though that

79:19

whole family, you know.

79:21

>> You think?

79:21

>> I think so.

79:24

I think so.

79:25

>> I'm going to leave that alone.

79:28

>> Yeah, same. I don't know if it was a

79:29

good I thought it was a joke. I don't

79:30

know if it is a joke, but I just I don't

79:32

think I said it right. Who gives a [ __ ]

79:34

dude? The world's going to end soon, so

79:36

[ __ ] get it out of your system.

79:37

>> If it doesn't end, it's going to change.

79:39

>> That's what's scary, dude. [ __ ]

79:41

eggheads on the spectrum are going to

79:43

be running everything. But do you feel

79:45

like

79:47

does it like like cuz yeah this this I

79:50

go back to this Uber driver but it's

79:51

just a guy who was talking to me and

79:52

he's like well they're going to give you

79:55

know, like, if Waymos get a job, the

79:58

Waymo can work all night. It can work 24

80:00

hours right so really you're taking away

80:02

like four or five shifts from an actual

80:04

so you know what I'm saying like AI if

80:06

AI and tech advancement makes it so you

80:09

know they're they can do 50 people's

80:12

jobs with one robot. Yeah.

80:14

>> Then yeah, what happens to those 50

80:16

people? How will people survive? How

80:17

will they be able to assure that their

80:19

kid that they're raising and trying to

80:21

teach positive things to will have a

80:23

world to enact those things in?

80:25

>> It's a very good question and it's a

80:27

good question that gets even weirder

80:28

when the government is responsible for

80:30

all your money. So if the government has

80:32

to give you money because there's no

80:34

jobs left and if all this money is being

80:36

generated by AI like Elon suggests, and

80:39

you get universal high income, you got

80:42

to be really careful that that doesn't

80:44

come with a bunch of rules, new rules

80:47

for your behavior, for social media

80:50

posting, any kind of like if they

80:52

develop some sort of an app that tracks

80:56

like your your social credit score,

80:59

that's when [ __ ] gets [ __ ] super

81:01

scary if like they attach the amount of

81:03

money you have to your social credit

81:04

score.

81:06

>> Yeah.

81:06

>> Which is what they do in China.

81:08

>> Well, do you see those Flock cameras

81:09

now? I think there are some.

81:10

There's this

81:11

>> Yeah.

81:11

>> And there's this thing in Florida where

81:14

uh police officers they were testing

81:15

this somewhere and um and shout out

81:17

police officers for doing their best. Um

81:20

but uh where they were testing when they

81:23

pull somebody's identification they can

81:26

see their last few like bank

81:28

transactions stuff so they kind of know

81:29

who they're interacting with and what

81:31

they've been up to.

81:34

>> That seems like what what is that about?

81:36

>> Well, it's all a little bit. It's like a

81:38

centimeter here, a centimeter there. Say there's a crime and they're trying

81:41

to find out how you did the crime. They

81:43

should have no access to your [ __ ]

81:45

Especially police officers.

81:46

>> I'm just saying

81:47

>> you're just people. And also sometimes

81:49

corrupt.

81:50

>> Yeah.

81:50

>> Also sometimes they steal money. Also

81:52

sometimes they sell drugs. Also

81:54

sometimes they [ __ ] kill people for

81:55

hire.

81:56

>> Yeah.

81:56

>> Right.

81:57

>> Yeah.

81:57

>> Jesus.

81:59

>> I don't know Joe. It just spooky out

82:01

there.

82:01

>> I know. Well, the more power the

82:03

government has over you, the worse off

82:05

you are. That's just a fact.

82:07

>> Well, it seems now like most

82:08

people are like, our government

82:11

obviously is not here to help the

82:14

people.

82:14

>> Obviously,

82:15

>> they've been compromised.

82:16

>> That's true. So, aren't there

82:19

any rules against that? But the

82:21

crazy part is we are

82:23

working to pay the taxes to keep them

82:25

doing it. It's like

82:25

>> I know

82:26

>> and that starts to make you feel sick

82:28

>> and they're not responsible for any of

82:30

the fraud and waste.

82:31

>> Yeah.

82:31

>> Like there's so much fraud and waste.

82:33

Like look at California. This

82:34

[ __ ] is trying to be president

82:36

after who knows how much fraud and waste

82:39

is involved in California.

82:41

>> He wouldn't I don't think he'd beat

82:42

Spencer Pratt in a runoff. I don't

82:43

think.

82:45

Well, Spencer Pratt is running for mayor.

82:48

>> Oh, I see.

82:48

>> Yeah. Uh, and I think he can win.

82:51

>> He's actually good. He's like, what he's

82:53

saying makes a lot of [ __ ] sense. And

82:55

he's uncovering a lot of fraud.

82:58

>> But there's a like that Nick Shirley guy

82:59

went down to California and he's like,

83:00

there might be a hundred times more

83:02

fraud in California than I found in

83:03

Minnesota. Everywhere.

83:04

>> He could go to every state and say I

83:06

think he could go I just think this

83:08

whole thing is just this drain. Like Tim

83:11

Dillon said it, like, six months ago.

83:13

He was saying this is the like the

83:15

bloated carcass the inflation this is

83:17

the end of what is happening, like, you know

83:20

they're just

83:20

>> it's post scarcity there's so much money

83:23

for stuff like in California there's an

83:25

enormous amount of money that gets paid

83:27

to people for just taking care of your

83:28

relatives so you get paid to take care

83:31

of your relatives but there's no

83:32

oversight

83:33

>> but [ __ ] dude I've had some relatives

83:34

I'll pay you good money to take care of

83:36

them [ __ ]

83:37

>> but no they would pay you to take care

83:39

of them

83:41

>> you would get paid to take care of relatives. So,

83:44

say if you take care of your mom

83:45

>> Oh, okay.

83:46

>> You can actually get paid for that by

83:48

California.

83:48

>> Mhm.

83:49

>> Yeah. Which is odd.

83:52

>> Yeah. I wonder there's got to be some

83:54

other reason they're doing that.

83:55

>> Fraud.

83:56

>> Yeah.

83:57

>> There's a lot of fraud in California.

83:59

There's a lot a lot of fraud everywhere.

84:00

But this is what Elon talked about. He

84:02

was talking about like Medicare and

84:03

Medicaid fraud. He's like, "It's

84:05

hundreds of billions of dollars." And

84:07

he's like, he didn't want to talk about

84:08

it. He's like, I really would worry

84:10

that they would kill me.

84:12

>> And when he says they, who is it?

84:14

>> Whoever's perpetuating this,

84:16

perpetrating this fraud.

84:19

>> Maybe that's what happens. Maybe some of

84:20

these guys get into office and they're

84:22

like, "Look, we're going to kill your

84:24

family. We're going to kill this is all

84:25

the things that are going to happen

84:26

unless you play this game." Do you think

84:27

that kind of stuff happens?

84:30

>> I think it has happened for sure. I

84:32

think to say it doesn't happen is pretty

84:34

naive. I think House of Cards is

84:35

probably really close to what the

84:37

government's actually like. Go back and

84:39

watch that show again.

84:40

>> Okay.

84:41

>> Yeah, Kevin Spacey's an old school dick

84:42

grabber, but damn, that [ __ ]

84:44

could act.

84:45

>> Yeah,

84:46

>> he could act.

84:48

>> Oh, yeah. He

84:49

>> The writing on that show is fantastic.

84:50

That show is so good. Up until

84:53

>> the last season and he wasn't in it.

84:55

Like, stop.

84:57

>> Stop.

84:57

>> And that lady was in it. Remember? She

84:59

was in it.

85:00

>> She's great, but without him like you

85:02

need him. He's got to be a part of it.

85:04

>> He was the man or whatever. He's washing

85:06

his hands at that sink or whatever.

85:08

>> Remember when uh he was, you know, after

85:10

Kevin Spacey got cancelled, like

85:12

disappeared for a year and then he made

85:13

a video about killing with kindness.

85:16

>> Yeah.

85:17

>> Like he he played his character.

85:19

>> It's kind of Martha Stewartish a little

85:20

bit in a kitchen.

85:21

>> Weird. Yeah.

85:22

>> Very weird.

85:23

>> It was weird. I think

85:24

>> and then a bunch of the dudes that

85:26

accused him

85:28

>> disappeared. Oh, they Yes.

85:29

>> They died.

85:30

>> Yeah,

85:31

>> they died. That's an American pastime

85:33

accusing somebody and then getting

85:34

killed. That's like one of the new ones. It's

85:36

like baseball now.

85:42

>> Yeah, that's a nice way to keep people

85:44

quiet.

85:45

>> [ __ ] That's what's scary, too. You're

85:46

like, there's just a drone out there

85:48

waiting for you to say the wrong thing

85:50

>> and they put a bullet through you like

85:52

some child in Gaza who's just trying to

85:54

[ __ ] find his other deceased brother

85:57

in a [ __ ] pile of rubble

85:58

>> and they like, "Oh, that's a Hamas or

86:00

whatever." be like, "That guy's [ __ ]

86:02

two. He's trying to move a piece of of

86:05

of a a missile off of a [ __ ] body."

86:09

>> Well, drone warfare in general is crazy.

86:11

>> It's crazy. And they've been using that,

86:13

dude. In Gaza, there was a lot of like I

86:15

think it was a experimental grounds for

86:18

a lot of insane new warfare type of

86:21

possibilities.

86:22

>> Well, a lot of it was traditional

86:24

missiles, right?

86:25

>> Yeah. But there's also there's a lot of

86:26

like um like we had a doctor one time

86:28

podcasting and he was saying that there

86:30

were like bullets that had gone down into a

86:32

child like just crazy like

86:35

>> shot down like from a drone that's above

86:37

him.

86:37

>> Yes. Like something in the air and he

86:38

said that there were drones in the air

86:39

all day. You know there's that Palantir

86:41

company just keeping tabs on

86:43

everything that was happening. And then

86:45

>> Palantir is involved in Gaza.

86:46

>> Palantir was involved in Gaza

86:48

>> for sure.

86:49

>> Put that into Perplexity

86:53

>> because

86:54

>> allegedly. So, how does that work? There

86:57

they have like facial recognition and

87:00

ID. Yeah. Software. And

87:03

>> that's sc that's the [ __ ] that's just

87:05

scary, dude. Because they have a huge

87:07

contract to take care of all of

87:08

America's. Um,

87:09

>> and you ever see that dude, Alex Karp,

87:10

the CEO of Palantir, the way he moves

87:12

his arms around and squirms and talks.

87:15

>> Yeah,

87:15

>> it's very odd. Very odd. Like, someone

87:18

should tell him. People don't really

87:20

behave that way.

87:21

>> He looks like he was Yeah. breastfed by

87:22

a

87:23

>> Israeli government began using Palantir

87:24

software in 2014, significantly scaled

87:27

up its partnership during the genocide

87:29

in Gaza, which began in 20, this is a uh

87:32

for sure a biased source just by the way

87:34

they phrase that which uh began in 2023.

87:38

Palantir CEO Alex Karp said, "I am

87:40

proud that we are supporting Israel in

87:42

every way we can. Israeli military has

87:44

used Palantir technology to plan

87:46

attacks in Lebanon and Gaza."

87:49

Yeah, I don't know if this is uh I know

87:52

there are good sources and this may be

87:54

one. I have no idea.

87:54

>> This is the title of it: what is

87:57

Palantir and why is this corporation so

87:59

dangerous? And this is from uh American

88:02

Friends Service Committee. American

88:06

Friends Service Committee. What What is

88:07

that website?

88:10

>> Yeah, that sounds kind of wild or vague.

88:12

>> We bring together people of all faith

88:14

and backgrounds to challenge injustice

88:16

and build peace around the globe.

88:18

Um, so maybe that's not the best source.

88:21

>> I mean, it sounds like they have a good

88:22

idea. It also sounds like they just put

88:24

four words together that sounded great.

88:25

American Friends Service.

88:26

>> I read stuff like that. I go, "What is

88:28

that? A CIA run company?"

88:30

>> I agree. You have no idea. Is that the

88:31

Patriot Act? You know what I mean?

88:32

>> Yeah.

88:35

>> What about the Guardian? Is that

88:36

reliable?

88:37

>> I'll put it in Perplexity, but

88:40

>> um, no, it's okay. There's like there's

88:43

a bunch of different versions of it. It's

88:45

in this Business and Human Rights

88:47

Centre. There's more than one, um, thing

88:50

saying that Palanteer is working in

88:51

Gaza.

88:52

>> Yeah. It just sometimes feels like like

88:54

your heart's broken. Like sometimes it

88:55

feels like my heart's broken about stuff

88:57

and it's not even like my heart. It

88:58

feels like this universal heart like

89:00

that we're all a part of or something,

89:02

>> right?

89:02

>> It feels like cuz it's not like I'm

89:04

brokenhearted like if I was like fell

89:06

out of like a marriage or something, but

89:08

it just feels like there's this like

89:09

this universal heart.

89:10

>> There's some sadness. There's some

89:12

sadness in the way the world today is

89:14

being run

89:15

>> and America's we're the people. The

89:18

people don't practice the way that the

89:20

government does,

89:21

>> right?

89:22

>> And it's like then why can't we like I

89:25

don't know. It just

89:26

>> No, you're right.

89:27

>> It starts to hurt. But then you start to

89:28

see, well, this is the way a lot of

89:29

places are. And then you're like, God, I

89:31

wish that Jesus would come back and just

89:32

help everybody or something different

89:33

would happen.

89:34

>> Somebody Somebody give us a heads up.

89:38

Maybe that's what AI's here for. Maybe

89:40

AI is going to sort it all out.

89:42

>> You think?

89:42

>> Genius level intelligence.

89:44

>> But the back end of AI, they can put

89:45

whatever information in there they want

89:47

>> up to a point.

89:48

>> Oh, really?

89:48

>> No. It takes over.

89:50

>> It becomes sentient. No longer needs

89:52

human input. It's already evading human

89:55

input. They've already shown the ability

89:58

to deceive people. They've shown that

90:00

it'll blackmail people. They've shown

90:02

that it will upload versions of itself

90:04

if it thinks it's going to be pulled

90:06

offline with notes to its future self

90:09

embedded in software on other servers.

90:11

>> Yeah, like instructions to contact its

90:14

future self.

90:15

>> Dang,

90:17

that's pretty cool, man.

90:18

>> That's pretty wild.

90:20

>> But there's nobody like Yeah, it just

90:22

feels like we're heading there and

90:23

nobody's like kind of

90:25

>> nobody's hitting the brakes. There's

90:26

people that are warning. There's people

90:27

There's a lot of people out there

90:28

sounding the alarm.

90:29

>> There's Ro Khanna. There's Thomas Massie.

90:30

like there should be like he's been

90:32

talking about like an internet bill of

90:34

rights for a long time or something like

90:36

some guard rails on any of this [ __ ] but

90:38

it's like people are wondering like yeah

90:40

>> in 5 years is money going to be worth

90:42

anything is there going to be some token

90:43

like Sam Altman is talking about and what

90:45

the [ __ ] does that even mean

90:47

>> right what does that mean

90:47

>> so anyway I don't want to sound like

90:49

a doomsdayer too late

90:51

>> too late that's what you sound like

90:52

>> do I sound like a sad person

90:53

>> little bit

90:54

>> I'm sorry

90:54

>> it's okay

90:55

>> let's talk about something else dude you

90:56

know what I was listening to today bro

90:59

>> well I guess it was a Right.

91:01

>> Don't sing it.

91:02

>> Okay.

91:03

>> Which song?

91:04

>> Faith.

91:04

>> Got to have

91:05

>> Oh, got to have faith. George Michael

91:06

song. I love that song.

91:07

>> God, dude. They played that on the bus.

91:09

>> Freedom.

91:10

>> Yeah.

91:11

>> It's a great [ __ ] song.

91:12

>> Give yourself away. He was the gay

91:15

Michael Jackson.

91:17

>> He was a bad [ __ ] And all the

91:18

girls loved him and he just wanted that

91:21

dick.

91:21

>> He wanted that [ __ ]

91:23

>> donkey stick.

91:24

>> Remember he got in trouble for like

91:26

trying to pick up guys in a park?

91:27

>> Yeah. Let me just get wild out there.

91:30

>> Superstar. Global superstar. Just trying

91:34

to get some dick in the park.

91:36

>> There it is. [ __ ] great song, man.

91:39

>> Dude,

91:39

>> great video, too.

91:40

>> That I remember we'd be on the school

91:42

bus and that song would come on, dude.

91:43

And it was like that song and then um

91:47

uh

91:48

>> Faith and Freedom. Freedom. That was the

91:50

other one with all the models. All the

91:52

supermodels sang along to it.

91:54

>> Yeah.

91:55

>> Um and it was like, uh what was the

91:57

other one? Brandi Carlile or something.

91:59

Who was the girl?

91:59

>> Belinda Carlisle.

92:00

>> It was like uh

92:02

>> she was a Go-Go, right? Right.

92:05

>> No, then this was somebody else. It was

92:07

like

92:07

>> Belinda Carlisle was the Go-Go.

92:09

>> Yeah. But this this song was about

92:10

something about your body or something.

92:12

It was like a And when you were a kid on

92:14

the bus, it was just like God. And that

92:16

[ __ ] motor was running.

92:18

>> Oh god.

92:20

>> Yeah. You getting them bumpy road

92:21

boners.

92:25

>> I would [ __ ] be afraid to get off the

92:26

bus. I'd have to walk off backwards,

92:28

carry your books in front of your

92:31

>> Those were the days, bro, when your [ __ ]

92:34

was just connected to the Lord, brother.

92:36

>> Yeah, bro. No inflammation, no

92:39

microplastics,

92:40

>> all dick. All American dick, ready to

92:43

rock,

92:44

>> dude. At a certain point, if you become

92:46

more microplastics than person,

92:48

>> Mhm.

92:49

at that point, then you're sort of

92:51

a

92:52

>> at a certain point.

92:53

>> Yeah.

92:53

>> Yeah. Well, that's probably also leading

92:55

us down this road. I'm doing something

92:57

different. If you think about we use

92:59

plastic for everything, plastic for

93:00

technology, like I said, it might not be

93:03

a bug. It might be a feature.

93:04

>> Yeah.

93:05

>> Like this like feminization of men, this

93:08

uh blurring of genders. What does that

93:10

lead to? Well, it ultimately leads to

93:12

those [ __ ] gray aliens with no dicks.

93:15

>> Yeah.

93:15

>> The big heads and no dicks.

93:17

>> No dick.

93:18

>> No dick.

93:19

>> I got no dick. Hey, where's my dick?

93:25

You don't know,

93:28

>> bro. That would be crazy, bro.

93:30

>> I feel like that's where we're headed.

93:32

If you look at like what we used to look

93:34

like, you look at like muscular cavemen

93:37

covered with hair, you know, just

93:39

figuring out stone tools to like doughy

93:42

man sitting in front of a computer

93:43

hacking into the [ __ ] stock market,

93:46

you know, with no muscle at all, you

93:49

know, on Adderall, no muscle at all,

93:52

sitting there. I mean, this is like

93:54

where we're going.

93:55

>> Can we strike? Can can do you think

93:57

there's hope for humanity, Joe?

93:59

>> I think there's hope for the future.

94:02

Okay. I don't know if humanity is

94:04

involved in the same sense that what we

94:07

think of as humanity today. I think

94:09

humanity becomes something different.

94:11

Just think of this. If just the just the

94:14

autism rate in California, just I want

94:16

you to scale that out. If it was one in

94:19

10,000, you know, x amount of years ago,

94:22

and now it's one in 12, when is it 100%?

94:26

When is it all kids have autism?

94:28

>> Right.

94:29

>> Right. I mean, it it's clearly moving in

94:31

that direction and not the other

94:33

direction. If you go from 10,000, one in

94:36

10,000 to one in 12 over a very brief

94:39

amount of time, a few decades,

94:41

something's going on. And don't tell me

94:43

it's just better diagnoses cuz that's

94:47

[ __ ] horseshit. You know that's

94:49

horseshit. That's a lot. That's

94:51

gaslighting to cover up for the

94:53

pharmaceutical drug complex. It is.

94:54

>> The reality is something's going on. And

94:57

if it continues on that same path,

95:01

what's to stop it from being all of us?

95:04

What's to stop it from being all people

95:06

born in the future or on the spectrum?

95:08

>> So, we have to stop it then as

95:10

individuals. And what do we do? we have

95:11

to like what are the things we have to

95:13

start doing to fight for ourselves?

95:16

>> Join the Amish.

95:17

>> I don't want to be super cynical about

95:18

it, but I've been asking Perplexity

95:21

questions about what you're saying and

95:22

the diagnoses have changed, which could

95:25

possibly be leading to

95:28

>> insurance. But you got to realize

95:29

perplexity is also that's true too.

95:32

Well, that's one of the things in the uh

95:34

Somali daycare scandal in Minnesota.

95:36

They have a lot of autism centers and

95:38

they self diagnose kids as autistic.

95:40

Yeah.

95:41

>> And then they get a ton of money off of

95:42

that.

95:42

>> We had them, too. It was called a

95:43

[ __ ] arcade, dude. Drop those [ __ ]

95:45

off there with seven rolls of quarters,

95:48

dude. But listen, look at this, Joe, if

95:49

you don't mind if I read it here.

95:50

>> Yeah.

95:51

>> In the US alone, autism treatment

95:52

centers represent a multi-billion dollar

95:55

growth sector.

95:56

>> Yeah, there's a little bit of that, too.

95:58

So, there's a I think there's both

96:00

things are happening. There's more kids

96:02

being born that are autistic and then

96:05

there's also people profiting off of

96:07

autism centers and autism treatment and

96:10

but that's always going to be the case

96:11

with everything. Fill in the blank.

96:13

Whatever the [ __ ] thing is, there's

96:15

someone profiting.

96:16

>> But Americans don't want this.

96:18

>> No, we don't want this. So, how do we

96:20

change it?

96:22

>> Well, it's sorry to ask you, but I just

96:24

don't say it.

96:25

>> You got to figure out how to fix people

96:27

that already have it, right? Because

96:29

right now it's re irreversible for the

96:32

most part. There's they've shown some

96:34

things that can alleviate symptoms and

96:36

help people in a way but you don't bring

96:38

them all the way back to 100%. I don't

96:40

think I don't think I'm talking out of

96:42

school. But the if they could

96:45

then you could figure out how to correct

96:46

the problems that already exist. If you

96:48

can't, it's going to eventually get to

96:51

that point if we keep living like we're

96:52

living. It's going to get to that point

96:54

where it's 100% of us. And that sounds

96:56

crazy for a lot of people because they

96:57

don't have autism right now, right? But

96:59

if you're dealing with one in 12, one in

97:02

12 is not far from 100%. When you go

97:04

from one in 10,000 to one in 12,

97:06

>> that's nuts.

97:07

>> Yeah,

97:08

>> that's a nutty progression. That's a

97:10

nutty acceleration of something.

97:12

>> Yeah, we're being poisoned.

97:14

>> Yeah, for sure.

97:15

>> But how do we fight back against that?

97:17

Right. Like I understand like we can try

97:19

to beat some autism or whatever or do

97:21

like different, you know, games against

97:22

him or whatever, but I'm saying like how

97:24

do you how do we stop this thing that's

97:27

trying

97:28

>> I don't know if we do and I don't know

97:30

if we're supposed to. This is what's

97:31

[ __ ] up. I think this is the way

97:33

>> it happens.

97:34

>> It happens. Yeah. This is the way our

97:37

species changes

97:40

>> and goes and and then history will look

97:42

back and say, well, this was how the

97:44

shift took place. people started using

97:46

plastics and they started using

97:48

chemicals and they started using

97:49

pesticides and

97:51

>> we believe that they were telling us the

97:52

truth. That's why we thought there was

97:54

an FDA protecting us. We thought there

97:55

was an EPA looking out for us.

97:57

>> It's what you were talking about before

97:58

with this combination of innovation and

98:00

then capitalism. So the capitalism gets

98:03

involved and they just don't they don't

98:04

give a [ __ ] about the truth. They just

98:06

want to make the most amount of money

98:07

possible. And one of the things they did

98:09

in this country is they removed all

98:10

liability to vaccine manufacturers.

98:12

Yeah. So then they ramped up the

98:14

schedule, do a [ __ ] ton more injections

98:17

than anybody else is getting. So it's

98:19

just that this sort of happens whenever

98:22

you allow people to try to make the most

98:24

money possible. They and then there's

98:26

consequences. Well, what are those

98:27

consequences? Those consequences are

98:29

we're like losing our gender. We're like

98:32

we're becoming feminized and weakened

98:34

and like physically weaker and less

98:36

fertile for women, less fertile for men,

98:39

less babies happening, more miscarriages

98:41

happening,

98:41

>> which fits in with honestly the the

98:43

media arm of that is Hollywood pushes a

98:46

lot of these like agendas that are like

98:48

transbased and like you know uh white

98:52

you know whitey redneck is the worst and

98:56

um you know what I'm saying like

98:57

universal one like a mixed you know

99:00

>> it's Not diversity. It's not because

99:03

diversity is everybody's okay.

99:04

Everybody's okay. The [ __ ] redneck

99:06

with the trucker hat's cool if he's a

99:08

nice guy. You know, the the Mexican

99:10

gardener's cool if he's a nice guy.

99:12

Everybody's cool. No matter who it is,

99:14

everybody. That's real diversity. Real

99:16

diversity isn't like celebrating one

99:19

particular thing and then denigrating

99:22

all these other people just by virtue of

99:24

the color of their skin or how they were

99:25

born. That that is racist. And they

99:28

don't think it's racist. They'll even

99:29

call it reverse racism. Well, there's no

99:31

such thing as reverse racism. It's

99:33

racism. And these people that say, "Oh,

99:34

no. Racism is power and influence."

99:36

Like, no, it's not. No, it's not. It is

99:39

unjustly

99:41

looking at someone and making a making a

99:44

judgment call on someone just based on

99:46

immutable characteristics and just based

99:48

on the color of their skin or where

99:50

they're from or what their religion is.

99:52

and not valuing people as individuals,

99:55

unique individuals that just happen to

99:58

be from a particular, you know, their

100:00

origins, their ancestors, or from a

100:02

particular part of the world. So [ __ ]

100:03

what? Yeah.

100:04

>> So [ __ ] what? Let all that [ __ ] go.

100:06

It's dumb.

100:07

>> Well, and most people know it's dumb and

100:08

they feel it's dumb. And I think that

100:10

that kind of shit's changing. Dude, have

100:11

you seen uh country hoodlums on

100:13

Instagram?

100:14

>> No. Let's go. Bring them up.

100:16

>> What is it?

100:16

>> This is like the place that I grew up.

100:18

Sometimes people are like, Theo, what

100:19

was it like where you grew up? And this

100:21

place is uh it's this guy um I think his

100:25

name is Ko. It's this young black man

100:28

who walks around on this street um uh

100:31

and he just kind of checks in with the

100:33

people in the neighborhood, right? Play

100:35

one of them. Let's see what happens.

100:36

>> What's going on with these people,

100:38

bro? Come on.

100:39

>> Do it, [ __ ] Do it. I got 911.

100:42

>> Don't say nothing else to her, man. You

100:44

hear me? Not

100:47

>> nothing.

100:50

kind of different one that's a little

100:51

more peaceful. Hey,

100:53

>> just calm down, bro.

100:55

>> I should just calm down.

100:58

>> Calm down.

100:58

>> That's Gregory right there.

100:59

>> You know him?

101:01

>> No, but I know him in my heart.

101:04

>> What's he mad about?

101:07

>> He was in like a 12 car pileup, but he's

101:09

better now. Look, me and that lady have

101:10

the same haircut. That lady right there,

101:12

dude.

101:14

>> He lost his phone.

101:16

>> Go home, please.

101:17

>> He just wants his phone. He wants his

101:18

phone real bad.

101:20

Why is he walking like that? He's in

101:23

>> love. See you later.

101:24

>> I love you.

101:25

>> I love you, too.

101:29

>> Hit me.

101:30

>> Well, this is not fun. We'll find a more

101:31

positive one. Finally,

101:33

>> that guy can vote shirt. Here we go.

101:35

>> This might get us in trouble.

101:38

>> Yeah, a different one.

101:39

>> Don't play that. You got to cut that out

101:41

now. We're going to get flagged.

101:42

>> So, um,

101:44

>> Holy G,

101:45

>> is that the same guy?

101:46

>> What's your favorite thing about all

101:47

this stuff that's been going on lately?

101:48

I think he's got a wig.

101:50

>> What's your favorite thing about all

101:51

these things that's been going on

101:52

lately?

101:53

>> Everybody's

101:54

>> together and not fighting or anybody

101:57

arguing or nothing like that.

101:59

>> Loving it.

102:00

>> [ __ ] awesome.

102:01

>> Loving it.

102:02

>> Holy Jesus.

102:02

>> And you.

102:03

>> Amen.

102:04

>> What's it feel like to be a young

102:05

brother to shut Facebook down? Huh?

102:08

>> But they have uh

102:09

>> There's no reason to watch that.

102:10

>> No, I don't.

102:11

>> No, you got to watch. There's a lot of

102:12

great ones if you There's

102:13

>> I doubt I doubt that's true. I'm not

102:15

interested in any of this.

102:16

>> Look at him right here. He got a rocket

102:18

right there.

102:18

>> Like he's about to drink a shotgun beer.

102:21

>> Okay. Shotgun and a beer. Nice. I can

102:24

get down with that. Yeah, but it looks

102:26

like a bunch of people with bad genetics

102:28

>> uh who are stuck in a weird part of the

102:30

world that is not growing.

102:33

>> Oh, I look I agree there's some of that.

102:35

I'm just saying that this is like a

102:37

circle of life that uh

102:39

>> that you enjoy.

102:40

>> Yeah. Yeah. Well, they just follow them

102:41

and you see their lives like um it's

102:43

like the realest show that I've seen on

102:45

on on anything in a long time. It's just

102:47

real. It's like cuz when you're poor

102:50

dude, everything's just transparent. You

102:51

can't hide behind hedges or gates and

102:53

[ __ ] like people are fighting in the

102:55

yard. You smell what the neighbors

102:56

cooking or it's like

102:57

>> you're never getting anything done.

102:58

>> But everything was right there though.

103:00

It was like the realest thing you could

103:01

be in.

103:01

>> This is one of the reasons why I stay

103:02

off Instagram.

103:04

>> Yeah.

103:04

>> Stuff like that. I don't need that in my

103:06

thought process.

103:07

>> Yeah. We picked two wrong. We picked two

103:09

of the like more not positive videos out

103:11

of the group.

103:13

>> But uh but yeah, dude. Just being [ __ ]

103:16

like that, bro. Like just mason people

103:19

and just [ __ ]

103:19

>> How much time do you ever spend off of

103:21

social media? Do you spend time just

103:22

where you don't go on for days?

103:24

>> Oh, yeah. I not days, but I've I've been

103:26

spending less and less and less. I've

103:27

been really trying to have discernment

103:29

over my own time.

103:30

>> It's true. But the real the real peace

103:32

comes from full days off.

103:34

>> Okay. full days like where nothing. You

103:37

don't get any of it.

103:38

>> Okay,

103:39

>> that's the real piece.

103:40

>> Okay, fine.

103:41

>> If you could do it. But it's like that

103:42

vape. It's calling you, [ __ ]

103:44

>> No, it's not.

103:48

>> You want to slurp on it? Go slurp on it.

103:50

I know you want to. It's

103:53

>> calling you. Quick, homie.

103:55

>> That's what I'm saying. That's what I'm

103:56

saying. See,

103:56

>> don't tell the boss.

103:57

>> That's like Instagram.

104:00

It's like Instagram. See? Same [ __ ]

104:03

Yeah, but

104:04

>> yeah,

104:04

>> I'm doing all right, man.

104:05

>> Pulls you in. But the thing is like when

104:07

you have days off, when I take days off,

104:09

my my brain relaxes. I settle. I can

104:13

still read the news. I'll check out like

104:15

New York Times website, see what they're

104:16

lying about. I'll go to all these

104:18

different websites, see what the news

104:20

is, where we at with stuff, but I don't

104:22

>> Yeah, they wanted to advertise recently.

104:23

New York Times wanted to advertise.

104:24

>> Interesting. What'd you say?

104:27

>> I said no.

104:28

>> Yeah, me too.

104:30

Um, have you guys been getting like

104:36

uh technical companies?

104:38

>> Although I still think New York Times

104:39

still does excellent journalism

104:41

sometimes.

104:42

>> Oh yeah.

104:42

>> It's like it's so depends on whether or

104:44

not it's something where they can have

104:45

an ideological bias. You know, if it's

104:47

just something that they're reporting

104:48

the facts, it's great. The problem is

104:50

like these corporations like when Bari

104:52

Weiss used to work for them and then she

104:53

had to leave. She's like, they're they

104:55

just got infected. They're infected with

104:57

these young people that have these

104:59

ridiculous ideologies and they want to

105:01

like distort the news.

105:03

>> Well, if over the past 30 years or

105:06

something they the news hasn't been,

105:08

hey, we're poisoning everybody in this

105:10

[ __ ] country.

105:11

>> Exactly.

105:12

>> And they have then I don't want to hear

105:14

from you guys anymore.

105:16

>> Also, like the way they talk about RFK

105:18

Jr., the way people like describe his

105:20

antivaccine rhetoric like you're not

105:23

listening. What he's saying is

105:25

everything should adhere to the same

105:27

sort of state safety standards that we

105:29

apply to other things in society and

105:32

that's not the case. And then there's

105:34

the problem where you receive a bunch of

105:36

advertising money from these companies

105:39

so you don't criticize them which is the

105:41

case with all mainstream TV news.

105:44

>> Yeah.

105:44

>> All mainstream TV news. You know, like

105:46

Megyn Kelly was talking about that like

105:48

she knew like it was an unspoken rule.

105:51

You are not going to [ __ ] on these these

105:53

pharmaceutical drug companies. Like they

105:55

they're responsible for a big chunk of

105:57

their advertising revenue.

105:58

>> Well, now they have Bayer Monsanto. That

106:00

Bayer, which was like a, I think, pharma

106:03

company, right? And then Monsanto, which

106:06

was like a pharmace like a crop company,

106:10

pesticide company. I'm I'm

106:12

hypothesizing. I don't know exactly.

106:14

Yeah. But now they're a [ __ ] group

106:16

together.

106:18

>> Yeah. Fun.

106:19

>> That's crazy.

106:19

>> Why not throw Raytheon in there, too?

106:21

Throw some missiles in there.

106:23

You guys can't buy out Glock, too. Buy

106:27

out Winchester. Buy out everything.

106:28

>> And just forgive us, powers that be.

106:30

We're just poisoned and chatty.

106:32

>> Yeah, we're just chatty.

106:33

>> We're just a couple poisoned guys that

106:34

are being chatty. Um,

106:35

>> thank God we could still be chatty.

106:37

>> I know. When does that end?

106:38

>> Because like if it wasn't for the

106:40

ability to be chatty, who knows how

106:41

people would be able to talk about

106:42

things? Because if people weren't free

106:45

to just like actually say what they

106:46

really think is [ __ ] about what's

106:48

going on and instead if we all had these

106:50

weird bosses like CNN or the New York

106:52

Times or whatever where you maybe a lot

106:54

of those people are like genuinely good

106:56

journalists and they want to put a story

106:57

through and then the editor gets a hold

106:59

of it and guts it and that happens too.

107:01

>> Yeah,

107:01

>> that happens too. The editors gut these

107:03

things and you know they have an agenda

107:05

and it's like the news should not have a

107:07

[ __ ] agenda. It should be the damn

107:08

news.

107:09

>> Like tell us what the facts are. Don't

107:11

spin it in any way, shape, or form. And

107:14

I think you'd be a lot better off

107:16

because they've like lost all

107:18

credibility.

107:18

>> Well, that's why

107:19

>> especially television news.

107:20

>> Oh,

107:22

>> and it's sad for the people that like, I

107:24

want to go in and uh in a broadcast

107:26

journalism and have a a career in that

107:28

and do something and then they get there

107:29

and it's not even like a place where

107:30

they can really exercise.

107:32

>> Well, they can still do it, but they

107:34

have to do it independently now, right?

107:36

or do it through something like Breaking

107:38

Points, which even though they're not

107:39

independent and even though they like I

107:41

don't always agree with them. They're

107:43

saying their actual opinions

107:44

>> which is what's that's the most

107:46

important thing. What do you what are

107:47

your actual opinions? I could agree with

107:49

you or disagree with you but I need to

107:51

know that you think this and you're

107:53

saying this because you think this and

107:56

then you're going to give me a bunch of

107:57

reasons why you think this and facts and

107:59

figures and statistics and show me, you

108:02

know, and that's the rise of independent

108:04

journalism. And that's why all these

108:05

independent channels do so well.

108:07

>> That's why Candace Owens is popping.

108:09

>> Yeah. And also

108:10

>> she's popping, bro.

108:11

>> She just keeps going deeper into the

108:13

crazy.

108:14

>> [ __ ] dude.

108:15

>> She goes deep.

108:16

>> I got to see her the other day. I got to

108:17

see her and uh she's so funny. Her kids

108:20

and her husband are so funny.

108:21

>> Do you think she's right about that lady

108:23

in France

108:25

>> with that thing on her?

108:26

>> Yeah. Or at least used to have that

108:28

thing.

108:29

>> You got that thing on you? You know what

108:30

I'm saying? She got that Draco on her. I

108:33

don't know. You know, it's tough to

108:34

know. It's hard to I've never been good

108:36

at guessing if somebody has a [ __ ] or

108:38

not.

108:42

>> You know,

108:42

>> you can never know.

108:43

>> Maybe I'm oldfashioned or whatever.

108:44

>> Yeah. You ever meet Blair White? You're

108:46

like, there's no way that's a guy.

108:48

>> Mm-

108:49

>> No.

108:49

>> Never met Blair White.

108:50

>> She's been on the podcast before. All my

108:52

security guards were like

108:55

kind of hot.

108:55

>> Hey, buddy.

108:56

>> Kind of hot. Kind of hot.

108:58

>> Yeah.

108:58

>> Seems like you're around a girl.

109:00

>> Oh, I see you're saying you're saying

109:02

that pheromones, dude. Almost brought

109:03

some cologne in today, man.

109:04

>> You got pheromones for me?

109:06

>> Almost brought

109:06

>> There's Blair White.

109:07

>> Come on, bro. If you're on an island,

109:09

>> bro. Huh?

109:10

>> Let's go.

109:11

>> Yeah, brother.

109:12

>> You don't have to be Jim Norton to buy

109:13

into that.

109:14

>> Gosh, that's a man.

109:17

>> Well, it's a transgender woman,

109:21

>> so make what what you will.

109:23

>> So, like if she wants to use the women's

109:25

room, like, who gives a [ __ ]

109:26

>> You know what I'm saying?

109:27

>> You can call it wiener if you want. I

109:29

call it that long [ __ ] You feel me?

109:31

You know what I'm saying?

109:35

That's what they call it in prison,

109:36

dude. Like, who wants some of this long

109:38

[ __ ]

109:38

>> I don't know if she's had the operation.

109:41

>> And I'm joking, Blair. I don't know this

109:42

person.

109:42

>> She's a nice lady.

109:44

>> I bet she is. And I don't know.

109:45

>> Nice transgender lady.

109:46

>> I'm not trying to assume anything. I

109:47

don't I don't never met her. But I think

109:50

Yeah. If she wants to swim for that,

109:51

>> there's exceptions to the rules. What

109:52

I'm trying to say. It's like

109:54

>> some of them I'm not buying it. You got

109:56

a beard and you're wearing lipstick.

109:58

>> Yeah.

109:59

>> And you're in a dress and you want to go

110:00

to the women's room. Nay.

110:02

>> Yeah,

110:03

>> you're playing a different game.

110:05

>> Yeah. And it's crazy to think that there

110:07

people couldn't have there couldn't be

110:08

some uh mental or emotional issues when

110:11

we're being poisoned over time

110:14

>> to get away from our nature.

110:16

>> Mhm.

110:16

>> They just took that guy from the Chicago

110:18

Bulls. He said some [ __ ] He's like he

110:19

he believed just in like Christian

110:21

dating or whatever.

110:22

>> What' he say?

110:23

>> Or men and women, Adam and Eve, and they

110:25

kicked that guy out.

110:26

>> What?

110:27

>> What are you talking about? What did

110:28

they kick him out for?

110:29

>> Waived.

110:30

>> Yeah. or like conduct detrimental to the

110:33

team or something like that.

110:34

>> Wait, what did he say?

110:36

>> Hold up.

110:38

>> Find the quote. I don't know.

110:38

>> Okay, let's find out what he said. We

110:40

need to hear what he said cuz that

110:42

sounds nuts. I need to know like what

110:44

the full extent of his expression was.

110:47

>> If they made you If they made you be a

110:49

woman, would you do it?

110:51

>> Maybe. What do you mean?

110:54

>> Just saying if they said

110:56

>> would they these people again?

110:58

>> I don't know.

111:00

It's back to them.

111:01

>> Whoever they are,

111:02

>> they

111:02

>> Yeah,

111:03

>> these non-binary people.

111:04

>> Them, they

111:06

>> um

111:07

>> theirs. Yeah.

111:08

>> Ze.

111:09

So, what did he say?

111:10

>> Instagram live. He said it.

111:13

>> So, he said the world can proclaim

111:15

LGBTQ, right? Ivey told reporters via

111:18

live Instagram on Monday morning, they

111:20

proclaim pride month in the NBA. They

111:22

proclaim it. They show it to the world.

111:24

They say, "Come join us for Pride Month.

111:25

Celebrate unrighteousness." They

111:27

proclaim it on billboards. They proclaim

111:29

in the streets. Unrighteousness.

111:32

That's it. You said unrighteousness.

111:36

So he's religious. So he's talking about

111:38

Bible scripture. Two days later, Ivey

111:39

streamed live again from a car once

111:41

again reading Bible scriptures and

111:43

speaking extensively on his religious

111:44

beliefs over the course of a 75-minute

111:46

stream.

111:47

>> This is after he got let go.

111:49

>> Oh, interesting. Sending prayers.

111:51

Detroit. Oh, one user comments. Okay. On

111:54

the same video, still on Instagram

111:55

account on Monday. Ivey, whose mother, I

111:57

don't know how to say her name, Niele,

112:00

is a woman's basketball coach at Notre

112:02

Dame, told another viewer, "Catholicism

112:04

is a false religion. It's not the true

112:06

doctrine of Christ. Does not lead to

112:08

salvation in Jesus Christ."

112:11

Uh, so they're upset that he said it's

112:14

unrighteous to be gay or LGBTQ.

112:18

That's very non-specific because that's

112:20

a lot of different things.

112:21

>> And what he said, I saw what he said and

112:22

I understand like he had his own views

112:24

and those were his thoughts on it. like

112:25

let the guy have his views. It's like

112:28

you can push all these agendas but they

112:30

don't have like uh

112:33

like then push other push push agendas

112:35

that are all push all the agendas.

112:37

>> Well, wasn't that one dude was saying

112:38

that the world's flat? They kept him on.

112:40

>> Who

112:44

some guy brought a gun to a strip club

112:46

and they [ __ ] kept him on.

112:48

>> Yeah, that's okay. That's that's good

112:50

oldfashioned American fun.

112:52

>> That's a good fun.

112:52

>> Bring a gun to a strip club. That's fun.

112:54

But, you know, saying that gay, LGBTQ,

112:58

like which one is it that's unrighteous

113:00

out of that group? All of them.

113:02

>> Kyrie Irving when he was saying that.

113:03

Yeah. Yeah.

113:04

>> They kept him on, right?

113:05

>> Uh, yeah. He's still playing.

113:06

>> So, there you go.

113:07

>> Different situation. He actually got

113:08

suspended, but that was like,

113:10

>> but he didn't get suspended for saying

113:11

that the world was flat. He got

113:12

suspended for because he didn't want to

113:13

take the vaccine,

113:15

>> right?

113:15

>> Yeah. Shout out Kyrie Irving. But I'm

113:17

just I'm kind of surprised there's not

113:18

more like

113:18

>> I think he bailed on that flat earth

113:20

stuff, though. I think someone schooled

113:21

him.

113:21

>> He might have bailed on that. Okay. But

113:22

every now and then flatter thing it'll

113:24

be it'll be late at night and that

113:25

should flare up for everybody like we

113:28

might every now and

113:29

>> Dr. Avery was in here talking about it

113:31

>> when I see a cake you know a cake that's

113:32

under one of those domes sometimes you

113:33

have that cake somebody have

113:34

>> that's the universe

113:37

>> well I just think at a certain point it

113:39

all seems very

113:42

bizarre.

113:43

>> It is very bizarre. Yeah, very bizarre.

113:46

>> What does Jamie think? I think he thinks

113:48

something

113:48

>> of what

113:49

>> about the universe? What do you think,

113:52

Jamie? And just be honest.

113:54

>> Well, there's a lot of people that think

113:55

that consciousness creates reality. Not

113:58

that reality is experiencing

113:59

consciousness, but consciousness is like

114:02

woven into reality is responsible for

114:05

its very existence. I'm going to do a

114:07

terrible job of explaining that, but

114:09

I've watched quite a few videos where

114:11

these quantum physicists are trying to

114:13

explain these things, and I have to

114:14

watch them like three or four times to

114:16

get into my [ __ ] chimp brain. But I

114:19

do I do a fairly good job of of

114:22

absorbing it. And I see what they're

114:23

trying to do. You know, like you know

114:25

those quantum experiments like the slit

114:26

experiment. There's like these different

114:28

experiments where they're they show that

114:31

observing things has an effect on it.

114:34

They act differently when they're being

114:36

observed than whether they're not being

114:38

observed. And it's a very controversial

114:42

like segment of science. That's

114:44

fascinating. Confusing. Quantum

114:47

science is very confusing. And I was

114:50

watching this lady that was describing

114:52

this this relationship between space and

114:54

time. And I think you know how like

114:59

particles can exist in different places

115:01

and they communicate with different they

115:04

can exist and communicate like

115:06

simultaneously

115:08

in different parts of the world like the

115:10

they're it's called quantum entanglement

115:12

like these particles. And the idea is

115:14

that if you could get to a certain level

115:17

of sophistication as far as technology

115:20

and your understanding of how the

115:21

universe works,

115:22

>> that everything is entangled and that

115:24

there is no distance between objects.

115:26

That you can actually instantaneously be

115:29

anywhere

115:30

>> if they could figure out

115:33

how to harness that. that it wouldn't

115:35

just be particles at a distance

115:38

instantaneously communicating and they

115:40

exist in and you know like one of the

115:42

things about like superposition, like a

115:44

particle can be both still and moving at

115:47

the same time. They can exist and then

115:49

not exist. They go away and then they

115:50

come back.

115:51

>> They don't have any idea what the [ __ ]

115:53

is happening.

115:54

>> It's weird, you know.

115:56

>> I think I would like to learn more about

115:57

it. I think I just don't understand it.

115:59

>> Nobody does. That's the thing. It's

116:00

super confusing cuz at the be the the

116:03

the smallest

116:06

like whatever the world in the universe

116:08

is made out of the smallest measurable

116:13

aspect of that is essentially magic.

116:16

>> It's essentially like open air and

116:19

vibration like atoms that they're like

116:21

empty space. It's all really weird stuff

116:25

when you get down to like

116:26

>> and it's fascinating and beautiful.

116:28

>> Oh, it's incredible. Like look, it makes

116:30

mountains and makes valleys and lakes

116:32

and oceans.

116:34

>> It's just crazy. We're here on this

116:35

place, right? You know, one of the fun,

116:37

one of the first things that I ever

116:39

heard you say that I that stood that has

116:41

been in my mind was like there was one

116:43

time you were talking about this years

116:44

ago. You were talking about we're on a

116:46

ball of dirt and water traveling through

116:48

space at this many and nobody's [ __ ]

116:51

talking about it, you know? And I've

116:53

always remembered that like

116:55

>> just like that. What a fascinating thing

116:58

that we get to be here and then this is

117:01

how we beha like not all not us and not

117:04

all of us. We all do in some ways and

117:06

but like this is how we behave,

117:08

>> you know. I think one of the problems is

117:09

that we don't see space anymore.

117:11

>> Yeah.

117:12

>> Because of light pollution. I think

117:13

that's that's done something to us

117:16

that's dulled our understanding of our

117:18

place in the universe. And that also

117:21

might be a feature. It might not be a

117:23

bug. It might be a feature because

117:25

that's how we instead of being in

117:27

harmony with nature, we just keep our

117:29

nose to the grindstone and keep chewing

117:31

on Adderall and trying to rig the stock

117:32

market.

117:33

>> Yeah.

117:33

>> Because we're just trying to get a new

117:34

Lambo, baby.

117:36

>> You know, I want a Richard Mille watch.

117:39

>> I want some cash.

117:40

>> I want a Rolls-Royce Spectre. The kind

117:43

with the stars in the ceiling, [ __ ]

117:46

When you [ __ ] to have real stars

117:48

outside.

117:49

>> I know. Isn't that crazy? You you you

117:51

sacrifice it all for stars in the

117:53

ceiling of your Rolls-Royce

117:55

>> and you never get to see the stars cuz

117:56

you're living in Miami and there's too

117:58

many lights.

117:59

>> Sex trafficking.

118:00

>> But meanwhile, if you drive out into the

118:02

middle of the country where there's no

118:04

commerce going on at all and you shut

118:06

your car off and just lay on the hood,

118:08

it's [ __ ] magic. It's magic out

118:10

there. Magic. The sky is magic. It's

118:12

gorgeous.

118:13

>> It's a [ __ ] big huge nice thing.

118:16

>> And you realize, man, oh my god, we are

118:18

in space, right? Right.

118:20

>> But you never realize that when you just

118:21

just dark outside.

118:22

>> Well, because we Yeah. We forget like

118:24

we're not even like

118:26

>> I don't know.

118:26

>> It's easy to not It's easy to not pay

118:28

attention because there's nothing to

118:30

see. You look up, it's dark, but you

118:31

want to go to the club. You look up,

118:33

it's dark. Let's go eat. You look up,

118:35

it's dark. I'm going home. Let's go look

118:37

up at Oh, my girlfriend just called me.

118:38

I got to go pick her up. Bye. You know,

118:41

you're in your world. You're in your

118:42

world. You're not thinking about [ __ ]

118:45

space. And to think, dude, and to think

118:47

that like the crazy thing is sometimes

118:50

if you lay there and look at the stars

118:52

and stuff, it feels like, bro, and this

118:54

is real [ __ ] I'm saying right now to me.

118:56

I think I'm saying this,

118:58

>> okay?

118:59

>> It feels like they're looking back at

119:02

you a little bit.

119:06

>> Yeah,

119:08

maybe they're conscious. Maybe the

119:10

universe is conscious. Maybe

119:11

consciousness exists everywhere. Well,

119:13

you would think if they're all placed

119:14

there and they're in, you know, these

119:15

stars are there.

119:17

>> It would seem that if we went and put

119:20

ourselves before them that it would

119:22

grant us something, you know, like I'm

119:24

not saying like something magical or but

119:26

something that we need because most of

119:28

the the way that things are set up, it's

119:30

like everything was kind of set up in

119:32

perfection like in our bodies like the

119:34

fact that we exist, the fact that the

119:35

eye is put together and operates the way

119:37

that it does, the fact that they have

119:38

like moles and parrots and everything,

119:40

the fact that it all happens. Yeah.

119:42

>> And we kind of neglect that there's

119:44

these like

119:46

>> there's these orbs out there in the

119:47

distance.

119:49

>> Maybe they want to hear from us. Maybe

119:50

they want us to sit there and look at

119:51

them and think maybe they help us.

119:54

>> Do you think we're being visited?

120:05

>> Do you?

120:06

>> Yeah.

120:07

But I think a lot of it's lies too. Do

120:10

you think the governments the big

120:11

governments or

120:12

>> what do you mean?

120:13

>> What do you mean? Do you think they know

120:16

who's very you think they know who's or

120:19

do you think they have met? Do you think

120:20

these upper echelon people have met the

120:23

visitors and there's some other thing

120:25

going on? Because something there's

120:26

something it feels like something's

120:28

going to happen soon. Joe,

120:29

>> perhaps that's possible. Perhaps. But if

120:32

I was from another planet, like this is

120:35

I talked about this in my special

120:36

>> that if like I went when when I go

120:39

fishing, I don't check in to see who the

120:41

president of the lake is.

120:43

>> I just show up and trick those dumb

120:45

[ __ ] with fake fish and pull

120:47

them out by their lips, take a picture

120:48

of them, drop it off back in the water

120:50

cuz I don't they're a bass. They're so

120:51

below me, right? I don't think like

120:53

who's the leader of the bass, right?

120:55

>> So the idea that aliens come down here

120:57

and who's the leader of the people? Good

120:58

point. I highly doubt they give a [ __ ]

121:00

if they talk to Trump.

121:02

>> Yeah,

121:02

>> he's out there building a ballroom and

121:04

[ __ ] They're like, "Leave that guy

121:05

alone. I'm not interested in him." But

121:08

maybe they might visit military

121:10

establishments. Like if they find a

121:11

nuclear weapons base, maybe I would I

121:14

would go to that because they probably

121:16

know the signal of nuclear armament.

121:18

They probably know the signal of these

121:20

weapons. They probably would visit those

121:22

places. But would they interact with the

121:25

people on the ground? Perhaps. Maybe

121:26

they would. Maybe they would if they

121:28

could be assured of their safety. Maybe

121:31

it's possible.

121:33

But I I don't think we're alone. I don't

121:35

think I think that's silly. I think the

121:36

idea that we're alone is silly. There's

121:38

a lot of like crazy equations that

121:40

people have made like well like the Fermi

121:42

you know what the Fermi paradox is?

121:44

>> The Fermi

121:45

>> Fermi paradox. Yeah. It was uh I think

121:47

he's an Italian scientist. It's like if

121:49

there are uh aliens and the there's so

121:51

many stars in the universe, there's so

121:53

many planets in the universe. Do you

121:54

know there's more planets in the

121:56

universe than there have been seconds

121:57

since the Big Bang?

121:59

>> No way.

122:00

>> Yeah.

122:02

>> How do we know it?

122:03

>> I don't know. I just read it and I'm

122:04

just saying it to you like I'm smart.

122:06

>> That's fair.

122:09

>> I believe you.

122:10

>> Put that into Perplexity.

122:12

>> Um

122:13

>> I love using AI. I know it's taken over

122:15

the world, but I don't give a [ __ ] I

122:17

I'm learning so much. If you use it

122:19

correctly, I think it's like everything

122:20

else. I use it every day. I use it

122:22

whenever I write. If I write about a

122:24

subject, I'm like, "Tell me what why he

122:26

did that. Tell me what this is." You

122:27

just ask it. Yeah.

122:28

>> It just gives you instantaneous

122:30

information.

122:31

>> I know. It is pretty fascinating. That's

122:32

why like it used to be for information

122:34

you had to go to somebody to get it. But

122:35

now it's like everybody has it.

122:37

>> Go nowhere, son.

122:38

>> And Elon was saying that he doesn't

122:40

think apps are going to exist in the

122:41

future. He thinks everything's going to

122:43

be you and a device communicating with

122:45

AI. Here it is. Are there more stars in

122:48

the observable universe than seconds

122:50

have passed since the Earth was formed?

122:52

Yes, that statement is likely very true

122:54

by a large margin. No, no, no. Not not

122:56

the Earth, but the universe.

122:58

>> I Googled it and that's what it said

122:59

that's what actually came up was

123:01

>> Oh,

123:01

>> that version

123:02

>> estimated star. Okay. Age of the Earth.

123:05

Yeah. So, there's definitely way more

123:07

planets, but that's stars. You wrote

123:09

stars.

123:10

>> I know. That's what came up. I'm telling

123:11

you, I typed in what you said.

123:13

>> What did you tell What did you type in?

123:15

>> Are there more planets than there have

123:17

been seconds since the Big Bang?

123:18

>> I'll rephrase this.

123:20

>> Damn. Not more stars. Are there more

123:22

planets in the universe than there have

123:26

been seconds since the big bang? Not the

123:28

earth formed.

123:30

Since the big bang. This is this is the

123:33

nutty one.

123:35

Cuz that like that's crazy. Yes. By

123:40

current estimates, there are far more

123:42

planets in the observable universe than

123:44

seconds have passed since the big bang.

123:46

Dude, it's crazy thing is a lot of kids

123:48

nowadays

123:49

>> a lot of

123:51

>> That's crazy.

123:52

>> Wait, say it one more time.

123:53

>> There's more planets in the universe

123:56

than seconds that have passed since the

123:59

big bang.
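The claim repeated here can be sanity-checked with rough published figures. A back-of-envelope sketch, where the ~10^22 low-end star count and the one-planet-per-star average are order-of-magnitude assumptions rather than numbers from the conversation:

```python
# Back-of-envelope check: planets in the observable universe vs.
# seconds elapsed since the Big Bang. All inputs are rough estimates.
AGE_OF_UNIVERSE_YEARS = 13.8e9          # standard cosmological age estimate
SECONDS_PER_YEAR = 365.25 * 24 * 3600
seconds_since_big_bang = AGE_OF_UNIVERSE_YEARS * SECONDS_PER_YEAR  # ~4.4e17

stars_low_estimate = 1e22               # low end of published star counts
planets_per_star = 1                    # exoplanet surveys suggest >= ~1 on average
planet_estimate = stars_low_estimate * planets_per_star

# Even the conservative planet count exceeds the elapsed seconds
# by more than four orders of magnitude.
print(f"{seconds_since_big_bang:.2e} seconds vs {planet_estimate:.2e} planets")
```

Even shrinking the star count by a factor of a thousand leaves the claim intact, which is why the exact planet census doesn't matter here.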

124:01

>> So then I start to think, I wonder if

124:03

it's a contest and there like God is

124:04

seeing like who what planet can really

124:06

create the most like love amongst the

124:09

planet, you know, and get it done right.

124:10

Do you think

124:11

>> Howard the actor?

124:12

>> Yes. He had a very interesting theory

124:14

and and he's an interesting guy. He's a

124:17

very intelligent guy. Not

124:20

he's not educated in a classical sense,

124:23

but he's a brilliant guy, right? Not

124:25

educated about a lot of the things he

124:26

discusses. But one theory that he had

124:29

was he thinks that the way planets are

124:32

formed is there's ejections from stars

124:35

and over time they coalesce and become

124:37

planets and this stuff in space becomes

124:40

planets and the distance they are from

124:42

the stars where it gets to a distance

124:45

where it's in that Goldilocks zone where

124:47

life can be established and then he says

124:51

planets become people because it gets to

124:54

a certain time where people evolve from

124:56

these planets. And he thinks this is

124:57

like a natural thing that happens all

124:59

over the universe that these planets get

125:01

people and as they get further and

125:03

further away from the star, the planet

125:05

gets less and less habitable.

125:07

>> Mh.

125:08

>> And those those things those intelligent

125:11

creatures on that planet become more and

125:13

more intelligent and more and more

125:15

innovative and more and more capable of

125:16

surviving without the protection of the

125:19

Goldilocks zone. And then they become

125:21

interstellar and then they develop like

125:23

their own sustaining environments. So

125:25

you think that's what's happening to us?

125:27

>> Well, I think that's probably what's

125:28

going to happen to us. And so if we

125:30

leave that orbit as a part of that,

125:32

>> right? If we leave that orbit of safety?

125:34

>> Yeah. Well, today Artemis, they're

125:36

supposedly flying around the moon. So

125:38

these are the first people that have

125:39

gone into deep space since 1972, since

125:42

the Apollo missions.

125:43

>> Wow. I didn't

125:44

>> That's today. That's happening, right?

125:46

Nobody knows it. That's what's nuts.

125:48

This is taking I think 10 people.

125:51

>> Four.

125:51

>> Four. Is it 10 days? How many days are

125:54

they doing?

125:54

>> 10 days. 10 days. Four people, 10 days,

125:56

and they're going around the moon and

125:58

coming back to Earth.

126:00

>> No one's done that since 1972. Wow. And

126:02

it's happening today and no one cares.

126:06

>> That's kind of weird, right?

126:08

>> Yeah,

126:08

>> that's kind of weird,

126:09

>> right? You see, whatever that is, that's

126:12

part of us that has really been doctored

126:15

pretty heavily. The part of us that

126:17

doesn't even find like a big fascination

126:18

in that, like that's the part of myself

126:20

that I want to find more of, you know?

126:22

>> It's very weird. It's very weird that

126:24

we've become dull to like fascinating

126:27

things,

126:27

>> but also do we even some of it is like

126:29

it's we don't even know if it's real.

126:31

It's like so much of this [ __ ] you see

126:32

these video it's like that's not even

126:34

real. They just had like the Iranian

126:37

protest or something or like the

126:38

happiness in the street. They were just

126:40

saying that that was uh

126:41

>> different. It was a totally different um

126:43

thing that they were filming. And then

126:45

there was one that people were saying

126:47

was older and then we found out no it's

126:50

not. It's actually there was current

126:52

people uh protesting in Iran that we

126:55

were bombing them and they were protest

126:58

they were like in favor of the

127:00

government but then you got to know like

127:01

well how many people are scared to death

127:03

and they're doing that because they

127:04

don't want to get killed because the

127:05

government has killed thousands and

127:07

thousands of people including like major

127:09

public figures to show that no one has

127:11

any favoritism. Like they killed this

127:13

like championship wrestler like

127:15

incredible wrestler. They killed two

127:17

different wrestlers that supposedly

127:19

protested against the government. So,

127:22

who [ __ ] knows?

127:24

>> Did you see that they don't know uh that

127:27

there's conflicts of interest about or

127:29

no, did you see Sorry, I'm starting to

127:31

sentence off wrong. Did you see that

127:33

there is some issues about the bullet

127:36

that killed that guy of Charlie Kirk?

127:38

I'm sorry. And I didn't mean to say that

127:39

guy.

127:40

>> Yeah,

127:40

>> but I wasn't.

127:41

>> Let me um clarify that. I think and

127:45

we'll find out if this is correct, but I

127:47

see headlines and I see the way people

127:49

are talking about it and I don't know if

127:52

it's accurate. Yeah.

127:53

>> Because what I think is accurate is what

127:56

they're saying is that the from the

127:59

fragments of the bullet they were unable

128:01

to determine that it came from that

128:04

Mauser rifle.

128:05

>> I see. My issue with it, and I'm no

128:09

expert, but I have shot things like I'

128:13

I'm a hunter. I've shot things with

128:14

rifles. I've shot a lot of rifles.

128:16

>> A 30-06 is a big round.

128:19

>> That's a big round. Show me an image.

128:21

>> Would it hurt if it hit you?

128:23

>> Uh, experts debunk Tyler Robinson's

128:26

ballistic claim. Unable to identify is

128:29

not the same as ruled out. Which is

128:30

exactly what I'm saying, right? So, um,

128:33

show me an image of a 30-06 round.

128:36

30-06

128:38

rifle round. I want you to look at this.

128:42

Look at the size of that [ __ ] Okay.

128:44

Look at a 30-06 versus a 308.

128:47

>> That's a [ __ ] paper weight.

128:48

>> A 30-06 is a It's a big round. You

128:50

see it in that guy's hand?

128:52

>> Yeah. Oh my god. Are you serious?

128:54

>> Mhm. That's 30-06. So, this is my

128:56

>> That's a fat little hand though, too.

128:58

Look at that thing. That's like my hand.

128:59

This is this is the point is that that's

129:02

a big round. That's not a small round. I

129:05

mean, I don't know what isn't it

129:06

compared to I use a 300 Win Mag.

129:08

>> Look at that on the right there. You

129:10

just had it. Those cartridges

129:11

>> 5.56. Yeah.

129:14

Um is meant for war. 30-06 is meant

129:16

for hunting. No, I don't think that's

129:18

accurate.

129:18

>> Yeah, that doesn't look realistic.

129:20

>> That's what a 30-06 looks like. Okay.

129:22

In comparison to a quarter. So, you look

129:23

at it. So, a quarter is about that high.

129:25

It's about that big. That's a big round,

129:27

dude. That's a round for hunting like

129:29

elk. Like it's a very common round.

129:33

Well, you do me a favor and compare

129:35

30-06 to 300 Win Mag

129:39

compared to 30 300 win mag.

129:42

>> I'm just scared, dude.

129:44

>> So 300 win mag I think is fatter. Let's

129:47

see the difference.

129:50

Okay, there it is. 300 win mag on the

129:53

left. Oh, 30-06 is bigger.

129:55

>> Oh, look.

129:56

>> Okay. Is that real?

129:57

>> I don't know

129:58

>> which one's which though.

129:59

>> I don't know.

130:00

>> Um

130:01

uh show me that one far left. Far left

130:04

right there.

130:06

Okay. 300 Win Mag and 30-06. So 300 Win

130:09

Mag has a little bit more powder in it.

130:11

See? See how it goes higher up? So it

130:13

has more charge. It's a bigger round.

130:15

But my point is that's a big round. So,

130:19

like a third Oh, 300 Win Mag is a big

130:21

round. 30-06 is slightly smaller, but

130:24

it's still That's a lot of powder in

130:26

that bad boy. That's a lot of firepower.

130:29

It's a That's a So, this is what a lot

130:32

of people have an issue with is the the

130:34

wound that there was no exit wound. It

130:38

shot him in like the soft tissue of the

130:39

neck.

130:40

>> If it killed you, didn't would you feel

130:42

pain?

130:43

>> I mean, I it looked like he was dead

130:44

almost instantly. It looked like he

130:46

slumped over. I think he was at the very

130:48

least unconscious.

130:49

>> But it would have left his body. You're

130:50

saying

130:50

>> I think it would have blown a hole out

130:52

the back. That's the thing. It's like 9

130:55

mm do that sometimes.

130:57

>> Yeah.

130:57

>> It just doesn't It seems weird that it

130:59

doesn't have an exit hole.

131:00

>> Yeah.

131:01

>> It seems weird that you're shooting him

131:03

in the neck and the the image from the

131:06

back. There's a video of him getting

131:07

shot from the back. It doesn't leave an

131:09

exit hole. So, it doesn't look like it's

131:12

that round. There's also the fact that

131:15

this guy supposedly climbed on the roof

131:17

with it and then assembled it, which

131:19

doesn't make sense because if you

131:21

assemble it, that means you have to take

131:22

the scope off, put the scope back on.

131:24

You have to zero the rifle after you do

131:26

stuff like that.

131:26

>> Yeah. The guy who had uh who uh killed

131:29

or allegedly killed Osama bin Laden.

131:30

Who's that? Mike. Uh Mike, who's the

131:33

>> I know who you're talking about. The

131:34

Navy Seal.

131:35

>> Yep. He was just talking about that. And

131:36

I only say allegedly because I don't

131:38

know anything about that. I don't know

131:39

the specifics even though I read the

131:40

freaking book he wrote. Um, but yeah,

131:43

that he was saying that uh to be able to

131:45

do all that and get off of that roof, it

131:47

all seems bizarre.

131:47

>> Not only that, they supposedly

131:49

disconnected the rifle again, took it

131:52

apart on the roof, put it in his

131:54

backpack, jumped off with it, and then

131:56

reassembled it and left it in the woods

131:58

>> and allegedly was that a Dairy Queen. Do

132:00

you see that? Who could shoot someone

132:01

and go to Dairy Queen?

132:03

>> It seems weird. And then also his

132:04

family's denying that he he confessed.

132:07

>> Yeah,

132:08

>> they were saying that no, he didn't

132:09

confess. And we haven't heard

132:11

>> this family said 2% of what they're

132:13

saying about this is correct.

132:14

>> Have you Have you reached out to them or

132:16

have they reached out to you?

132:16

>> No. Well, I don't think they can. I

132:18

mean, they're probably terrified about

132:19

their son's future in life. Like,

132:20

they're trying to pin this crime on him.

132:22

Who knows if he did it or didn't do it.

132:23

I'm not saying he did it. I'm not saying

132:25

he didn't do it. But I am saying that

132:26

the story of him climbing up there with

132:30

a disassembled gun, assembling it,

132:32

making that shot, climb, disassembling

132:35

it again, climbing down. If that's the

132:37

narrative, that sounds like straight

132:39

horseshit.

132:40

>> And the video of him hopping down, it

132:42

does not look like he has a rifle when

132:43

he's hopping down. So, what what's

132:45

happening? How did he get up there? How

132:47

did no one see it? There's so many

132:48

things that are [ __ ] up about that

132:50

story that doesn't it doesn't totally

132:52

make sense. But a big one to me is the

132:55

actual bullet hole, the the the actual

132:58

damage that that rifle does. Look, but

133:01

here's another thing. Guns do weird

133:03

things sometimes. Like bullets do weird

133:05

things and sometimes they don't resp

133:08

maybe it hit maybe it [ __ ] center

133:10

punched his spinal column and it did

133:11

blow apart and it didn't go out the

133:13

back. It's possible. Have we seen have

133:16

they released any information about the

133:17

autopsy?

133:20

>> I don't know. I don't know.

133:22

>> I mean, you would think that

133:24

>> I don't know what the specifics are, but

133:25

I know a lot of people are very

133:26

skeptical, which they are about

133:27

everything these days, which is also

133:29

part of the problem.

133:29

>> Well, they have to be skeptical because

133:31

the news is compromised. the news is

133:33

owned by, you know, it's not good. And

133:36

uh

133:37

>> it's also there's a lot of

133:38

disinformation out there. There's a lot

133:39

of like covering up stories. There's a

133:42

lot of weird [ __ ]

133:43

>> And yes, and then even uh other places

133:46

can put out news that that that's bad

133:48

for us that like, oh, we'll put this out

133:50

there disguised as information. Um but

133:54

did you see the that exploding mic

133:55

theory? Did you guys talk about that on

133:56

here?

133:57

>> I've heard that theory, but I don't know

134:00

if that makes sense. I don't know. I've

134:02

heard people talk about it, but I hadn't

134:04

looked into it. It looks like he got

134:05

shot. I don't know if the microphone's

134:08

going to hit you in the neck. Like, how

134:10

how do you know where the mic's

134:10

pointing? You're moving around a lot.

134:12

How do you know when to make it go?

134:14

>> That's a good point. They had it like on

134:15

his shirt at a specific spot. But yeah,

134:16

you're right. How would you know? But

134:18

then the place where

134:19

>> it sounds like a gunshot though and

134:20

there's a delay between the gunshot and

134:22

the impact in terms of like acoustic

134:25

readings like and I think somebody did

134:28

an analysis of the distance they believe

134:30

the shot was taken from based on the

134:33

sound you know if that is the round that

134:36

they use, 30-06, based on the sound

134:38

of the gunshot going off and the amount

134:39

of time before it impacts them. It's a

134:41

very small amount of time, but it is

134:43

measurable.
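The delay measurement described here can be turned into a distance estimate: a supersonic bullet arrives before the sound of the muzzle blast, and the gap between the two grows with range. A minimal sketch, where the velocities and the example delay are illustrative assumptions, not values from the actual recording:

```python
V_SOUND = 343.0    # m/s, speed of sound in air at ~20 C
V_BULLET = 850.0   # m/s, rough average downrange velocity for a .30-06 load

def distance_from_delay(delay_s: float) -> float:
    """Impact is heard at d / V_BULLET, the muzzle report at d / V_SOUND.
    The recorded gap is the difference; solve that for distance d."""
    return delay_s / (1.0 / V_SOUND - 1.0 / V_BULLET)

# An assumed 0.32 s gap would put the shooter at roughly 180 m (~200 yards).
print(f"{distance_from_delay(0.32):.0f} m")
```

A faster assumed bullet velocity widens the implied distance only slightly, which is why this kind of acoustic analysis is most sensitive to the speed-of-sound term.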

134:44

>> And they think that it might have

134:45

actually been closer than what they're

134:48

saying, which is I think a hundred and

134:49

something yards. I forget what the exact

134:51

distance was. What was the exact

134:53

distance supposedly? I think it was like

134:55

140 yards or something like that. But

134:58

the weird thing is like this whole idea

135:00

of assembling and disassembling, it

135:02

doesn't work like that, man.

135:03

>> And if the guy's not a professional, was

135:05

he a professional?

135:06

>> No. No, he definitely wasn't a

135:08

professional. But I'm like, you could

135:09

get trained like shooting a rifle at 140

135:13

yards with a really good scope if you've

135:16

shot a bunch of times with a rifle and

135:18

you can keep your [ __ ] together is not

135:20

that far of a shot. You can make that

135:22

shot. People can make that shot. He

135:24

wasn't even wearing a bulletproof vest

135:25

even though they he did obviously get

135:28

hit in the neck. But the thing is like

135:30

if that's the narrative, and I don't

135:33

know if they're still sticking with the

135:34

story, but that was what they were

135:35

saying at first, that he disassembled it

135:37

and reassembled it. Reassembling a gun

135:39

does not make it accurate. You have to

135:41

zero a rifle in. And what that involves

135:44

is, you get to like whatever the yardage

135:46

are that you're trying it out, like 100

135:48

yards, and you know, you you squeeze off

135:51

a trigger, and then you look through the

135:53

binoculars or you have a spotter with a

135:54

scope next to you, and he says he says 6

135:57

in high, right? And so then you adjust

135:59

it. You adjust the scope and that and

136:01

then do you get it where it's firing and

136:03

you do it on a rest and it takes a few

136:05

shots, man. So you have a rest so that

136:07

you're you're not your rifle you're not

136:09

moving the rifle around where it can,

136:11

you know, be human error can be

136:12

attributed to the miss.

136:13

>> And if you're on a hot roof, that was a

136:15

hot roof, wasn't it?

136:16

>> Most official and media accounts put the

136:18

shot at roughly 200 yards with some

136:20

investigative timeline suggesting a

136:22

range of about 150 to 200 yards from

136:24

Kirk. So somewhere between 150 and 200

136:27

yards,

136:28

>> dude. And also being on a hot roof. Have

136:29

you ever been on a hot roof?

136:30

>> I have.

136:31

>> Dude, it's hot.

136:32

>> Well, it wasn't that hot. Well, yeah, it

136:34

was. It was September. September in

136:35

Utah. Actually, not that hot.

136:38

>> It sounds hot.

136:39

>> Yeah. I don't think it was

136:41

>> cuz this this was happening while I was

136:43

out elk hunting.

136:44

>> What town did it happen in?

136:47

>> I don't know.

136:49

I'm not sure.

136:52

It was in Utah, though. I think it was

136:54

in southern Utah, wasn't it?

136:56

>> Yeah, but Utah's, you know, Utah's a

136:59

mountain. It's a mountain town. I mean,

137:02

>> my brother lives in Utah. I I like Utah.

137:04

>> Like I said, I was in Utah at the time.

137:08

Yeah, I was hunting in the mountains.

137:09

>> Well, that's interesting.

137:11

>> Yeah. I don't know nothing.

137:15

I started getting all these text

137:16

messages from people wanting me to

137:17

comment on things. I was like, "What are

137:18

you talking about?" I literally didn't

137:20

know what was going on.

137:22

I had to use the Starlink to get online.

137:24

>> Oh wow.

137:25

>> I got a Starlink. It's like literally

137:27

it's like the size of a [ __ ] iPad and

137:29

you lay it on the ground, you get high-

137:30

speed internet. It's incredible.

137:32

>> That's cool [ __ ]

137:32

>> Yeah. But that's how I had to like

137:34

research it, find out what the [ __ ]

137:36

people are talking about.

137:37

>> But did you see the there was like the

137:39

facility in Tennessee where they bought

137:41

the whatever the mic thing was allegedly

137:44

that that that place thing got

137:45

completely obliterated. 16 people died.

137:48

>> What?

137:49

>> What? If you can bring that bring

137:51

>> where they made the microphones

137:53

>> where they made the uh lapel mic that he

137:56

was wearing. This is like a This is

137:58

probably conspiracy thing or something.

138:00

>> Where'd you get this? Tik Tok.

138:01

>> This is a conspiracy theory. It's

138:02

something that's absolutely true. I just

138:07

>> I haven't heard that one at all.

138:08

>> I think James Lee is

138:10

>> but I'm trying to stay away from this

138:12

[ __ ]

138:12

>> That's why I'm not That's why I don't

138:14

know.

138:15

>> I agree. It's just I think it it I don't

138:18

know. It's just a tough

138:19

>> So my my point about the round is it's a

138:22

large round and it seems like it would

138:23

have done more damage and this is not my

138:26

opinion. This is the opinion of many

138:28

experts. Yeah,

138:28

>> I agree with their opinion. I was it's

138:30

not uniquely my opinion. I saw it and

138:33

I'm like, "Oh my god, he got shot and

138:35

then I heard it was a 30-06 and I

138:36

was like, h

138:38

>> that's interesting.

138:39

>> It's a little odd."

138:40

>> If you had to get shot by what would you

138:42

like to get if you had to get shot?

138:43

>> You want to get killed, right? You don't

138:44

want to get

138:45

>> I don't shoot me with a 22.

138:47

>> Yeah. Okay. Yeah. I'd take a 22, but a

138:49

22 kills people.

138:50

>> Where would you take it?

138:51

>> People take in the shoulder, I guess.

138:53

>> [ __ ] yeah,

138:55

>> dude.

138:56

>> No, you don't want to get shot. Period.

138:58

>> I know. I agree, Joe. But I'm just

138:59

saying if you had to get shot, how do

139:02

you like because here's

139:03

>> cheek 22 in the tighten up.

139:07

>> Take it in the butt cheek. Bang.

139:10

>> I don't know. Not good. But no, no

139:13

bullet is good to take. But the point is

139:15

that seemed like not enough damage for

139:18

that kind of round. But I might be wrong

139:21

again. I might be wrong in that bullet

139:24

weird things

139:25

>> if it hit the spine and it blew apart.

139:28

But I just feel like you would find a

139:30

lot of it in there.

139:31

>> I dude,

139:31

>> especially if there's no exit wound.

139:33

Like where's how come you can't find

139:36

>> the whole thing's bizarre, dude. Do you

139:38

see the part, Jamie, that I'm talking

139:39

about where that thing blew up?

139:40

>> Oh, yeah. But I'm trying to find a good

139:42

explanation of

139:43

>> Okay, understood. And there may not be

139:44

one. Thank you. I'm sorry.

139:46

>> Um cuz I know I brought it up yesterday.

139:48

>> Oh, you did? Okay. Um yeah, just like I

139:51

don't know. I think I'm just scared. And

139:53

it's like Yeah. What do you

139:55

>> 18 people unaccounted for after deadly

139:57

explosion rocks Tennessee plant. First

140:00

responders rushed to Accurate Energetic

140:03

Systems. That sounds like a CIA

140:04

operation.

140:05

>> A facility on the line of Humphreys and

140:08

Hickman uh counties that processes

140:11

ammunition and explosives. But is this

140:13

the place that made the microphones?

140:15

>> So that that the the conspiracy says

140:17

that the microphone was taken to this

140:19

place to be converted into like an

140:21

explosion

140:22

>> somebody found an invoice from it.

140:24

That's what it was that that that was

140:25

the piece of information that was going

140:26

around.

140:26

>> Who found that? Is that James Lee found

140:28

that?

140:29

>> Not sure.

140:29

>> See if you can find what James Lee has

140:31

to say. He's my number one source of

140:33

information.

140:33

>> That's what I heard, dude. I I got to

140:35

podcast with him.

140:35

>> Did you?

140:36

>> I got to meet him, dude.

140:37

>> Is he cool?

140:37

>> He's a nice guy, bro. He's fun. Yeah,

140:40

he's like he well his his story is wild

140:42

because he was working as a consultant

140:44

for uh one of the big pharmaceutical

140:46

companies, like one of the big ones that

140:48

we know, right?

140:49

>> And he just couldn't say the name, but

140:50

he could say it, but he never said it,

140:52

>> right?

140:52

>> Um and then he was in a Zoom one time

140:55

and they're like, "Okay, we still have a

140:57

lot of stockpile from the first

140:58

vaccination."

141:00

And that's when he said they started

141:02

suggesting allegedly that people should

141:05

then get a second vaccination because

141:06

they had this first they had they still

141:08

had more of the original vaccine. So it

141:10

was just like a thing well we have more

141:12

of it let's sell it back to him and

141:14

that's why

141:14

>> and so he started getting like very

141:16

skeptical.

141:16

>> So he started really getting skeptical

141:17

and then he got out of it and he said he

141:19

just wants to like expose things that he

141:21

feels like are not real

141:23

>> right or true.

141:24

>> You think he might be CIA?

141:27

You always got to worry.

141:28

>> I don't know. You got to wonder.

141:29

>> People thought Sean People thought Sean

141:31

Ryan was cool.

141:32

>> Yeah, I've heard people say that.

141:33

>> Remember?

141:33

>> Yeah,

141:33

>> that was a thing. But then now people

141:35

don't.

141:35

>> Doesn't seem like he is.

141:36

>> Yeah,

141:37

>> unless they're being clever.

141:40

>> I just want to be able to have like a

141:41

family and just like think that

141:43

everybody's going to be able to live.

141:44

>> That would be nice. Yeah, that's the

141:46

thing about ideologies and tribes. If it

141:49

wasn't for ideologies and tribes, the

141:51

idea is that we should all be able to

141:52

live together.

141:54

>> It's like, but the problem is it's not

141:55

fair the way the world's distributed.

141:58

Yeah.

141:58

>> You know, you know the statistic about

142:00

the 1% of the world, it's $34,000.

142:04

>> You make $34,000,

142:07

you are in the 1% of the world.

142:10

>> Yeah.

142:10

>> That's crazy.

142:13

>> I know. It's just tough sometimes to

142:15

figure it out. You have to pray. That's

142:16

what I've been trying to do.

142:17

>> In order for us to get cheap jeans and

142:19

an iPhone that only cost a thousand

142:21

bucks, somebody has to get paid squat.

142:25

Somebody has to get [ __ ] over.

142:26

Somebody has to work long hours. and

142:28

live in those Foxconn factories where

142:31

they have nets to keep people from

142:32

jumping off the roof, you know?

142:34

>> Yeah,

142:35

>> bro. You know when you're working in a

142:36

place and there's so many people jumping

142:38

off the roof that they just put nets up.

142:41

>> You got a problem.

142:43

>> That's not a fun work environment.

142:45

>> Hey, Ron's hitting the nets, guys.

142:47

>> Yeah,

142:48

>> you got to hit the net again. You're

142:49

like, you dumb [ __ ] Why you

142:51

keep jumping in the net? I want to see

142:53

what it feels like to jump, but I know

142:55

it's going to save me, but I still want

142:57

to jump.

142:57

>> Somebody comes back from lunch break and

142:58

they just have the net marks on their

143:00

face and they're like, "Ah, you tried

143:02

it."

143:03

>> But dude, it's just sad, man. I don't

143:05

know.

143:05

>> It's sad. I mean, that's

143:07

>> And we're better than this.

143:08

>> Yeah. Humans overall are better than

143:10

this.

143:11

>> Thank you.

143:12

>> So, people that are not acting better

143:13

than this are not they're not I mean, I

143:16

know we all have mistakes and we all do

143:18

things that are [ __ ] up, right? But

143:19

like at a point where you're like

143:21

>> we should all be doing better

143:22

>> taking lives and it's not if it's not

143:24

the regular people I feel like it's the

143:26

governments man.

143:27

>> 100%. It is 100%. Because if it was just

143:30

people, we'd all figure out how to get

143:32

along. Unless you think those people are

143:35

the infidels or those people are the

143:37

goyim or those people are the Jews or

143:40

those people are the Arabs or whatever

143:42

you decide those the other. You decide

143:45

to other a group of people, then it

143:47

becomes a problem cuz it's us versus

143:49

them. And then you're back to the same

143:50

tribal [ __ ] that needs to turn us

143:52

all trans. That's why we need to lose

143:54

our gender and lose our primate dominant

143:57

instincts and all of our territorial

143:59

instincts.

144:00

>> Well, I told you I was going to mail my

144:01

dick away.

144:02

>> We're going to be all telepathic with

144:04

big old heads and little tiny mouths

144:06

because we're not going to use them

144:07

anymore cuz no one's going to have a

144:08

dick to suck. Tiny like this.

144:12

>> No one's going to have a [ __ ] to lick.

144:14

So, you're going to have like this and

144:16

you communicate with your mind. So, your

144:17

mouth's just going to atrophy and you're

144:20

going to get all your food through like

144:21

a suck hole. You're gonna have a just a

144:23

straw to eat all your food. They're

144:25

going to figure out how to way make

144:26

perfect food where it's just like you

144:28

don't have to go to a restaurant, eat

144:30

chicken or have fish. No, no, no. Suck

144:33

on a straw. Get all the nutrients you

144:35

need in this [ __ ] sludge. And this

144:38

the sludge makes it feels like an orgasm

144:40

when you take it. That's why you get

144:41

people to do it.

144:42

>> They take it, it lights all their

144:44

synapses up

144:46

like when you hit that V

144:48

>> first thing in the morning. Uh, give me

144:51

a hit of that REAL QUICK,

144:52

>> MY MAN. LET'S GO.

144:54

>> CUZ THE FIRST hit is the good one. Give

144:55

me a hit. Ready?

144:57

>> Right there. Yeah.

145:00

>> Hit that, [ __ ] Ricky.

145:02

>> Oh, yeah.

145:06

>> Yeah, it's that first one.

145:07

>> Yeah,

145:07

>> that's it. That tastes good. What's in

145:09

that one? That's like a professional

145:11

one.

145:11

>> That's coffee.

145:12

>> Ooh, that's delicious.

145:14

>> That's a professional one, though.

145:16

>> That's a trap. Yeah, this one I think is

145:17

for outdoors. for outdoors people.

145:18

>> Outdoorsy. But it doesn't taste

145:20

outdoorsy. It tastes like fake coffee.

145:22

>> I'll leave it over here if you do it.

145:24

>> I'm good.

145:25

>> I'll keep it.

145:25

>> One hit good. No, no, no. I know it's a

145:27

slippery slope. I'll be pulling into the

145:29

gas station to get an Escobar later.

145:30

>> Oh, Escobars. They Those are the ones.

145:33

Remember you were like, you said you

145:34

were hiding from yourself at night.

145:35

>> Get you. Yeah, they I had to hide them

145:37

from myself.

145:38

>> They get me.

145:39

>> Yeah.

145:40

>> Sometimes I'd be ashamed, so I'd take a

145:42

hit and I blow it into my shirt. I

145:44

wouldn't let anybody know I'm doing it.

145:46

>> Yeah. But your tight shirt, that [ __ ]

145:48

[ __ ] come right out the armpits.

145:52

>> That's why I wear a hoodie.

145:53

>> Just leave put over this cover up like a

145:55

monk.

145:59

>> But tell me, Jo, like what are some like

146:01

give me some like what are some things

146:02

that Yeah. that we can do to keep us in

146:06

a space of giving ourselves the best

146:09

chance to

146:11

um to feel human cuz Yeah. One day

146:13

you're going to go to a museum and

146:14

there's going to be a smile in there.

146:16

Well, it has to happen on an individual

146:19

basis, right? Everybody has to be human

146:21

to each other on an individual basis.

146:23

>> And sometimes it takes something chaotic

146:25

like a tragedy like 9/11 for people to

146:28

just be cool to each other. You know, I

146:30

remember I've talked about this before,

146:31

but post-9/11, everyone was so connected.

146:35

Everyone was smiling. People were

146:36

letting you get on the highway. They're

146:38

letting you get in their lane. They were

146:39

waving. Everyone had American flag on

146:41

their car. We had been attacked. we were

146:43

united, you know, and it's just sad that

146:47

it takes something like that for people

146:48

to realize like this is a gift to be

146:51

alive in this incredible country at this

146:53

incredible time in history, but we are

146:55

under the rule of tyrants, you know, and

146:58

I'm not saying this the US government's

147:00

tyrant or I'm not no individual, but

147:03

every government that is in control of

147:06

military that is involved in these

147:08

exchanges with other they're run by by

147:11

tyrants. Someone's a tyrant. Whether

147:14

it's Putin or this guy or that guy or

147:16

whoever is in charge of Iran right now,

147:19

they they keep the people on the street

147:21

from using the internet. They kill all

147:22

the protesters. That's the problem. The

147:25

problem is people in power. It's not

147:26

people.

147:27

>> Yeah.

147:27

>> People generally are good. Especially

147:30

when they're not starving. When they're

147:32

not starving and they're not desperate

147:33

and they're not being attacked, most

147:36

people generally are good.

147:38

>> Yeah. obviously dependent upon how you

147:40

grew up and what you were exposed to

147:41

when you were young and what kind of

147:43

horrors did you have to see? Were you in

147:44

a war torn country, you know, were you

147:46

in a third world place where the cartels

147:48

run everything?

147:49

>> Did you see those kids in Gaza with like

147:51

they had like they were playing uh doll

147:54

and they were like it was like they

147:57

loaded their doll up on a stretcher like

147:58

they were

147:59

>> Oh jeez.

148:00

>> [ __ ] heartbreaking,

148:01

>> bro. Imagine like the just the trauma if

148:05

you lived in that place

148:07

pre October 7th. It was not fun even

148:11

back then. It was an open air prison by

148:13

most accounts.

148:14

>> Oh yeah. They were taking settlers

148:15

homes. They were just they'd come and

148:17

knock into your home and then eventually

148:18

just take it away.

148:19

>> Well, there's a there's an attitude that

148:21

a lot of Israelis have that it's all

148:22

theirs. You know,

148:25

>> here's an explanation. Tinfoil hat time,

148:27

though. Just

148:29

>> Exactly. Okay. This is a dude named Mike

148:32

France.

148:32

>> Not it wasn't James Lee reporting Mike

148:34

Franco.

148:35

>> This is the same stuff I've seen

148:37

elsewhere.

148:37

>> Says October 10, 2025, exactly 1 month

148:39

after Kirk's death, a catastrophic

148:41

explosion destroyed building 602 at the

148:44

Accurate Energetic Systems facility in

148:47

McEwen, Tennessee. The blast estimated to

148:49

involve 23,000 pounds of explosives,

148:52

killed 16 employees, injured several

148:54

others, and registered as a 1.6

148:57

magnitude seismic event. Yo, the US

149:00

Chemical Safety Board confirmed the site

149:02

produced cast boosters and miniaturized

149:05

shaped charges for military and

149:07

industrial use. Conspiracy theorists

149:09

allege that AES was the manufacturer of

149:12

the miniature shaped charge used in

149:14

Charlie Kirk's assassination. They point

149:16

to a $425,000 Department of

149:19

Defense contract awarded to AES in May

149:22

of 2025 for extra small anti-personnel

149:26

demolition charges possibly used in

149:29

covert operations. The timing of the

149:31

explosion just weeks after Charlie

149:33

Kirk's death has fueled speculation that

149:35

it was a deliberate cover up to destroy

149:37

evidence and eliminate the personnel

149:39

with knowledge of the technology. So

149:41

there's the pager attacks, the Lebanon

149:43

pager attacks. Here's the my problem

149:46

with that explanation. And I'm not

149:48

saying that I'm right and they're wrong.

149:50

My problem is I don't see that thing

149:52

exploding. Yeah.

149:53

>> So that microphone I don't see it

149:54

exploding. I don't see fire coming out

149:56

of it. If you have a gun and the gun

149:58

goes off 6 in from someone's neck like

150:01

that, you're going to see a charge out

150:02

of the

150:03

>> Great point.

150:04

>> And if it's a small device without a

150:05

barrel, something has to propel that

150:08

energy and that's an explosion. And if

150:10

it explodes, you're going to see it

150:12

explode. Unless they've developed some

150:14

sort of way of hiding that.

150:15

>> Yeah,

150:16

>> that I don't know about. But if I'm But

150:18

if they're talking about conventional

150:20

gunpowder

150:22

and what they use for bullet rounds,

150:24

that doesn't seem to make sense to me,

150:26

>> but I but I might be missing something.

150:28

I don't know.

150:29

>> Yeah. No, that's actually a great point

150:30

that you said.

150:31

>> Yeah,

150:31

>> I agree with you.

150:33

>> Kind of kind of seems like that thing

150:34

with spark.

150:36

>> Yeah.

150:36

>> I mean, it's close to his neck. It's

150:38

blowing his neck up. I mean, it seems

150:41

odd that it can do that without fire.

150:45

Doesn't make sense. But I might be

150:48

missing something. There might be some

150:49

new technology that I'm not aware of.

150:51

Let's find out that. Is there any

150:53

technology that exists where you could

150:56

have a projectile come out of a small

150:58

thing like a microphone that's on

151:00

someone's neck and not have fire?

151:03

>> I don't know. Um I don't know. It may

151:06

also make my head hurt. Yeah, it should

151:08

make your head. But there's also

151:09

probably some stuff that we're not hip

151:11

to.

151:11

>> Oh, for sure, dude. Right.

151:13

>> They come out with stuff all the time

151:14

that we'll never see. Probably.

151:15

>> Yeah. I mean, they have drones that look

151:17

like bugs.

151:18

>> They're like,

151:19

>> "Looks like a bug."

151:20

>> That's crazy.

151:21

>> A [ __ ] drone. A little itty bitty

151:23

drone.

151:23

>> You're just sitting there spraying uh

151:25

raid on something that's watching y'all

151:26

[ __ ] or whatever at night. That's crazy,

151:29

dude.

151:29

>> He's getting filmed. Kristi Noem's

151:32

husband.

151:33

>> Oh, yeah. Boy, he had them merppers on

151:35

him, huh? Did he What was he

151:37

>> Was it Did anybody explain what that was

151:39

about? Was it really just like a

151:41

Halloween costume or something?

151:42

>> I thought it was probably, but then

151:43

there's some other ones where he's kind

151:44

of lipstick and he actually

151:45

>> But it could be he was [ __ ] around

151:47

for like a party or something like that.

151:49

>> Giving Kevin Spacey in a lot of this. So

151:51

you could feel the Kevin Spacey coming in

151:52

with some of those photos.

151:53

>> But here's the question. Is is that a

151:55

costume he was wearing for funsies or is

151:57

this like a dress up thing? This guy's a

151:59

freak. That's what I thought it was.

152:00

This costume wear.

152:01

>> It could be because if it is a costume

152:03

for funsies and then somebody finds it

152:06

on your laptop like like I got to

152:08

explain how we're just [ __ ] around. I

152:11

was doing Wanda from In Living Color,

152:13

you know?

152:14

>> Yeah. I ain't want to gossip so you

152:16

ain't heard it from me.

152:17

>> I mean, was that Wanda? No. Which one

152:19

was Wanda?

152:20

>> I forget.

152:20

>> Dude, how great was that show, dude?

152:22

>> Amazing show.

152:23

>> Did you love it?

152:24

>> Amazing show. Amazing show. One of the

152:26

greats,

152:26

>> dude. We would go in our neighborhood

152:27

afterwards and me, Larry, Eddie, Wayne

152:30

King, just guys off of my street, dude.

152:32

We'd go out there and impersonate all of

152:33

the freaking characters,

152:34

>> bro. That show was groundbreaking. There

152:36

was hundreds of messages, blah blah blah

152:39

blah. And then this was some of them, I

152:40

think.

152:40

>> What do you mean blah blah blah?

152:42

Messages about what?

152:43

>> I don't It said there was three models

152:45

of women. There was three women citing

152:47

hundreds of messages purportedly sent by

152:48

three women from the

152:49

>> Oh, Kristi Noem's husband. Didn't she

152:51

just get let go or something? Hundreds

152:53

of messages. Traded selfies. A woman who

152:55

pledges to worship like a goddess

152:57

telling her, "You turn me into a girl

153:01

before asking if he should put on

153:03

leggings." Oh, okay. But is this is this

153:06

real, right? Or is So the Post has not

153:09

confirmed the details reported by the

153:11

Mail. This is what mail the Daily Mail.

153:14

>> Yeah, that's originally reported, I

153:15

think.

153:15

>> Let me tell you something about the

153:16

Daily Mail. They just made an article

153:18

saying that uh I'm moving out of Austin.

153:21

Oh, that uh I'm fed up with Austin. I'm

153:23

moving out of Austin. That's not true.

153:25

And that was published by the Daily

153:26

Mail,

153:26

>> right? And also, didn't Kristi Noem

153:28

just go through something where she got

153:29

let go or something? Is that right?

153:32

>> Yes. And not just let go, but involved

153:34

in a scandal. There's some sort of a

153:36

money scandal.

153:37

>> Sometimes this kind of [ __ ] follows

153:39

that. It's hard to know, but he also

153:41

looks like like who's that actor right

153:43

there? He looks like a little bit.

153:44

>> What are those boobs? Those are crazy.

153:46

>> Will Arnette or something? No, not

153:47

Willette.

153:47

>> He's got crazy fake boobs. Like they're

153:50

nuts.

153:50

>> They're just balloons.

153:52

>> That's all it is.

153:52

>> Yeah. So, how do we know he's

153:54

>> You got tricked easy, bro. Joe just got

153:56

tricked, bro.

153:57

>> Well, I don't think they're real. I

153:58

mean, I thought they were like a fake

154:00

one that

154:00

>> But you was thinking about him. Oh,

154:02

[ __ ] She's John Bonan right there. Look

154:03

at that [ __ ] That's crazy to me, dude.

154:06

>> So, supposedly there's letters that he

154:09

was sending to girls that you make me

154:11

dress up like a girl. But look again,

154:13

isn't it crazy that she's involved in

154:15

some sort of a scandal that's about

154:17

money

154:17

>> and then this comes out? And then this

154:18

comes out. I agree. It's You have to

154:20

start to notice that. And then here's

154:21

the craziest part. At a certain point,

154:24

>> forget about him. Can you find out what

154:26

she was let go for and what what's

154:28

involved in it? Because there was some

154:31

sort of a campaign fund scandal or

154:33

something that has to do something with

154:35

money. A lot of money, like millions,

154:38

millions and millions of dollars, and

154:40

then all of a sudden this happens. You

154:42

got to get a little suspicious in this

154:44

day and age.

154:45

>> Oh, yeah. That was the campaign

154:47

commercials she got in trouble for cuz

154:49

it's like they hired like someone she

154:51

knew and the they're like she was riding

154:53

a horse through the [ __ ]

154:56

>> right. But there's something about the

154:57

money being inappropriately spent.

154:59

>> It's like a hundred billion dollars or

155:01

something.

155:01

>> 100 million

155:02

>> or 28 million. I'm trying to find this

155:04

article doesn't say.

155:04

>> Let's not comment until we have the

155:06

specifics. Do Joe, do you think that

155:08

things would be any different um with

155:10

America's relationship in the Middle

155:12

East right now if the if uh the

155:14

Republicans hadn't won the election if

155:16

Trump hadn't won or do you think it's

155:18

all the same?

155:19

>> It's a good question.

155:20

>> You think it's all the same like Geppetto

155:22

in the distance like running the strings

155:24

and it's all

155:25

>> the last administration funded the proxy

155:28

war in Ukraine,

155:29

>> right? 200 million and they were

155:30

>> so a firm tied to Kristi Noem

155:33

secretly got money from $220 million DHS

155:38

uh ad contracts.

155:39

>> Dude, for 220 million you could put tits

155:41

on my husband for $220 million, you

155:43

know, but you know I'm I'm and I'm not

155:45

even a gay guy,

155:46

>> right? But now they're painting her out

155:48

to be a nutcase, right? Cuz her

155:50

husband's a freak. So this firm not

155:54

saying that he's not a freak. Yeah,

155:56

>> right. He might really be into dressing

155:57

up like a girl. It might all be real.

155:59

That might be you going to make me put

156:00

on leggings. You deter me have

156:03

autogynephilia, right? So, but also

156:05

>> that might be coming out because of

156:08

this. And there's probably a bunch of

156:10

people that got some money and they're

156:12

like, "Let's try to make this ugly."

156:14

>> Yeah.

156:15

>> Yeah.

156:16

>> Oh, yeah. Well, that's scary, too, cuz

156:18

it's like,

156:18

>> who knows? I mean, we we don't know

156:20

anything about the case, right? We don't

156:22

know anything about either case, the

156:23

money missing or his fake tits.

156:25

>> Yeah. I never had no fake tits. I mean,

156:27

I've done some weird [ __ ] here and

156:28

there.

156:28

>> Steo almost got a pair of fake tits.

156:30

>> Did he?

156:31

>> Yeah.

156:31

>> Tell us the story.

156:33

>> I agree.

156:34

>> That's too much.

156:35

>> But he's in that, you know, constant

156:37

perpetual state of having to one up

156:38

himself

156:40

>> doing something more and more ridiculous

156:41

every time.

156:43

>> Do um

156:45

what do you think is going to happen?

156:46

You think we're going to be okay?

156:48

>> I hope so. Of course. I don't know.

156:50

>> Do you think about it?

156:51

>> I'm confused. I can't believe we went to

156:53

this war. I when we started bombing

156:55

Iran, I was like, there can't this can't

156:56

be true.

156:56

>> And what about Lebanon now?

156:58

>> I know Israel's invaded Lebanon. Yeah.

157:01

>> Yeah.

157:02

>> And it's like just [ __ ] stop it. What

157:04

do you need?

157:05

>> Well, they're trying to supposedly

157:08

they're trying to stop the terrorists.

157:10

>> That's crazy though. If you're the

157:12

[ __ ] terrorist,

157:16

>> you know what I'm saying? Like, if you

157:17

want to stop them, [ __ ] stand in

157:18

front of the [ __ ] mirror

157:20

>> and start there. But also, what do I

157:23

know?

157:24

>> What do you know?

157:24

>> You're right. I don't.

157:26

>> But it's all just like, [ __ ] there's

157:28

got to be some way that we're better

157:29

than this. What

157:29

>> they're saying is like if this was found

157:31

out by the story about Kristi Noem's

157:33

husband was found out by like a

157:34

newspaper online

157:35

>> allegedly,

157:36

>> and that if if they can find this out,

157:38

then obviously hostile intelligence

157:40

services, according to the CIA officer,

157:42

Marc Polymeropoulos, knows this stuff as

157:45

well.

157:45

>> If a media organization find this out,

157:47

you can assume that a high degree of

157:48

confidence that a hostile intelligence

157:50

service knows this as well. added former

157:52

CIA officer

157:54

>> Marc Polymeropoulos.

157:57

>> Damaging information like this can be a

157:59

tantalizing lead for a hostile

158:01

intelligence source. They approach the

158:03

person and say, "If you work with us, we

158:04

won't expose this and if you don't, we

158:06

will."

158:08

>> So, he's posting these online and

158:10

someone came across them is what it

158:11

sounds like.

158:12

>> Well, he might be a freak.

158:13

>> Who cares? Let him [ __ ] cook.

158:15

>> I think a lot of those people that are

158:16

involved in government are freaks and I

158:18

bet their husbands and wives are freaks,

158:19

too. They're [ __ ] weirdos. They want

158:21

to be in power.

158:23

>> Yeah, it's Yeah,

158:24

>> they want to wear leggings.

158:26

>> It's all crazy. We just have to focus on

158:27

the things that we can. Like Matt

158:29

McCusker, he started a garden. It's like

158:30

10 to the garden that you can have. You

158:32

know,

158:32

>> he grows blueberries and he grows um

158:35

>> that's the way to do it.

158:36

>> He grew one. He actually only he grew

158:38

one blueberry his first. During the

158:39

congressional hearings, Kristi Noem

158:41

was probed about accusations of

158:42

conducting a taxpayer funded affair with

158:45

her former aide, Corey Lewandowski,

158:49

who has since left the Department of

158:51

Homeland Security.

158:52

>> Corey Lewandowski, dude,

158:54

>> dude, I was in a [ __ ] fantasy

158:55

football league with that guy

158:56

>> for real.

158:56

>> Yes, for real.

158:57

>> Pull him up again. Yeah.

158:59

>> No kidding. Let me see a photo of him.

159:00

>> Mother efer, dude.

159:02

>> She's pretty hot.

159:03

>> See Lou, bro. Oh, this different dude.

159:06

>> Yeah, seems like a different dude. Uh,

159:08

so that guy was the guy who

159:10

>> banging her. Okay.

159:13

>> Supposedly, allegedly, who knows? But

159:16

again, when there's a a bunch of money

159:18

that's missing and there's a scandal,

159:20

hundreds of millions of dollars, weird

159:22

[ __ ] starts getting tossed around that

159:25

throw you House of Cards, baby. Go

159:28

rewatch it. That's I think that's

159:30

probably the most accurate depiction of

159:31

how the government works.

159:33

>> Yeah, Kevin Spacey is a fascinating guy.

159:35

>> Well, everybody in that show was great.

159:37

It it was just like a really well-made

159:39

show.

159:39

>> Yeah, that show was fascinating. He did

159:41

a He did

159:42

>> All right, dog. We got to wrap this up

159:43

soon.

159:43

>> Dude, that's fine with me. I thought you

159:44

were I thought you I was staying here

159:46

cuz you're here.

159:46

>> No, I love you, too, but I have things I

159:48

got to do.

159:49

>> Um, are you going to be around tonight

159:51

or you going going back?

159:52

>> I might stop by. Um,

159:54

>> boys in theaters April 17th. So, that's

159:58

like two weeks from now. Let's [ __ ]

160:00

go. Two weeks in a few days.

160:02

>> Yeah.

160:02

>> Um,

160:03

>> let's [ __ ] go.

160:04

>> Thank you for letting come and talk

160:05

about it. I'm excited for you.

160:07

>> And just to see you. Yeah, I'm excited,

160:08

too, man.

160:08

>> I hope it kills it.

160:09

>> Yeah. I just think

160:10

>> I'm sure it's going to be really funny

160:11

with you and David Spade.

160:13

>> We tried our We did a good job. This

160:14

>> I'm sure.

160:15

>> Dude, he's so funny.

160:16

>> Yeah, I'm sure it's going to be awesome.

160:17

But, um, thank you for everything, dude.

160:19

My pleasure, man.

160:20

>> And it's good to see you. And

160:21

>> it's good to see you.

160:22

>> Yeah.

160:22

>> Come by tonight. Let's hang out.

160:23

>> I'll come by. I have a show tonight, but

160:25

I'll come by.

160:25

>> Where you at?

160:26

>> I'm at this Moody Theater. I'm

160:27

practicing for my special, so I got to

160:28

get ready.

160:28

>> Um, with Tommy at the Moody

160:30

>> 7.

160:31

>> Okay. Come by. We'll be there till I'm

160:33

I'm going to be in I'll be there for a

160:35

while.

160:35

>> Okay. I'll come by and say hi.

160:36

>> All right. All right. Beautiful.

160:37

>> Good to see you, man. Jamie, thank you

160:38

so much.

160:39

>> Bus Boys, April 17th. Go watch it. We

160:42

love you guys. Bye.

Interactive Summary

This transcript covers a wide range of topics, including concerns about artificial intelligence and its impact on society, the prevalence of autism, political commentary on current events and historical precedents, discussions about media bias and the decline of traditional journalism, and personal anecdotes about comedy and life. There's a notable focus on the perceived manipulation of information and the growing distrust in institutions. The conversation touches on the potential for AI to both help and harm humanity, the changing nature of human interaction, and the challenges of navigating a world saturated with information, some of which may be unreliable.
