
Joe Rogan Experience #2481 - Duncan Trussell


Transcript


0:01

Joe Rogan podcast. Check it out.

0:03

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day.

0:12

>> All right, we're good. We had a slight

0:16

issue. Slight technical glitch.

0:19

>> We're up. What were we just talking

0:20

about?

0:21

>> Oh, we were talking about that if you

0:22

hum a tune

0:24

>> Oh, right. Right.

0:24

>> that you will get dinged.

0:26

>> Yeah. You'll get flagged on YouTube if

0:28

you just hum a a sound from a song.

0:31

>> Yeah.

0:32

>> Like the beginning bars of a song.

0:34

>> Yeah. You can't I wonder how far that

0:36

goes. Like could it get to the point

0:39

where an AI could hear you humming it in

0:42

your car or something? Like how far does

0:44

the protection of music go?

0:45

>> It's you're not generating revenue from

0:47

your car, right? So the thing is you're

0:49

generating revenue from a podcast and

0:52

their logic is if you hum what is that

0:55

song? The sunshine of my love. Is that

0:57

what it is?

0:58

>> Yeah.

0:59

>> You know that song that I always hum to

1:01

associate with people being high out of

1:02

their mind.

1:03

>> You know it goes. You can't say if I did

1:07

that we would get dinged, which is so

1:08

crazy.

1:09

>> And we were just saying like if you

1:11

quoted a Scarface movie, would Brian

1:13

De Palma get all the money? If you said,

1:15

"Say hello to the bad guy." Would Brian

1:17

De Palma get that money?

1:18

>> I don't think so. I think you're allowed

1:20

to quote stuff, but you I know

1:22

>> that is Brian De Palma, right? Scarface,

1:25

wasn't it?

1:26

>> Yeah.

1:26

>> I don't want to that up. I

1:28

>> think so.

1:29

>> You know those auditors that go around

1:31

and film people and people get mad

1:34

because they're like, "Don't film me."

1:35

And they're like, "I can film whatever

1:36

the I want." And they inevitably

1:39

some like boomer freaks out and smacks

1:41

them with a cane and then they get a

1:43

million views

1:45

and it's just a trap. IT'S A TRAP.

1:47

>> It's a trap.

1:48

>> It's because in inevitably someone loses

1:50

their mind on them and then that gets a

1:51

ton of views. One of the ways people are

1:53

dealing with that supposedly is playing

1:57

music, like playing copyrighted music

2:00

during the interaction because

2:01

>> oh my god, that's hilarious.

2:02

>> Because so then they can't they can't

2:03

make money off of it.

2:05

>> It's a shield. It's a shield. If

2:07

someone's if like someone's trolling

2:09

you, you just start playing copyrighted

2:11

music.

2:12

>> Did you hear that the CIA has admitted

2:16

that the way they found the pilot was

2:19

because of his heart rate? Ghost murmur.

2:22

That's the name of the tech.

2:24

>> Okay, we got to look into this. Like,

2:26

this is This is science fiction.

2:30

>> Yeah, it's wild.

2:31

>> This is full Minority Report. Science

2:34

fiction level technology. They can find

2:36

a guy's heart rate.

2:38

>> So, so what I read is that it's I didn't

2:41

understand the science part. something

2:42

to do with crystals or I I don't know

2:44

what the it is, but but AI can is

2:47

somehow interpreting is taking out the

2:50

noise and then you can and from far away

2:53

they could

2:54

>> 40 miles I think

2:55

>> 40 miles they find this guy's

2:57

heartbeat. He's hiding in some kind of

3:00

crevice and then they're able to go and

3:03

extract him. And dude, the obviously the

3:06

first thing I thought when I

3:08

>> What else don't they tell us?

3:09

>> No, those robot dogs. I thought about

3:12

those things having that tech and just

3:14

like hearing heartbeats and then

3:16

identify the heartbeat says a lot about

3:18

a person. Are they sleeping? Are they

3:21

like in good shape? Bad shape. You can

3:24

learn so much from a heartbeat.

3:25

>> It could

3:26

>> ghost murmur.

3:27

>> Oh my god.

3:28

great name, too.

3:29

>> It's a great name.

3:30

>> Ghost murmur.

3:31

>> What sick invented this?

3:34

>> How do you even think about inventing

3:36

this?

3:37

>> You You just I You know what? The CIA,

3:40

they've been taking psychedelics

3:41

forever. This

3:42

>> What is that word?

3:43

>> Quantum magnetometry.

3:46

>> Artificial intelligence with long range

3:48

quantum quantum magnetometry.

3:51

>> What the is that?

3:52

>> Quantum means two things to me. When

3:55

someone says quantum, it either means

3:56

you're a artist. Yes. and

3:58

you're trying to get me with flimflam

3:59

talk

4:00

>> or it means you're an actual quantum

4:03

scientist, a quantum physicist who's

4:05

gonna blow my mind

4:06

>> with what we know about like

4:09

entanglement and the weird There's

4:11

this there's this woman that I've been

4:13

watching her um she has this uh speech

4:15

on I think it's Big Think.

4:18

>> I'll tell you her name. Uh but

4:20

>> she's like completely freaking me out.

4:23

She's talking. I I want to say her name

4:25

because

4:26

>> I I'll leave I don't want to leave away

4:28

this ghost murmur thing. That's another

4:29

key point. That's fun.

4:31

>> Oh, we'll get right to it. Michelle

4:33

Fowler, that's her name. And she's an

4:35

astrophysicist and she's

4:37

>> she's giving this talk about like what

4:40

we know about like she's studying binary

4:42

star systems and stuff like that. and

4:44

she gives this talk about she's she's

4:46

explaining like that there may be a tech

4:49

in the future where there is no distance

4:52

between two points. So the the ability

4:55

to travel instantaneously from position

4:59

to position just like quantum entangled

5:02

photons can do

5:04

>> but with people

5:05

>> with everything.

5:06

>> How who the knows how a cell phone

5:09

works?

5:11

You tell me how you're facetiming me

5:13

when you're in Australia.

5:15

>> How does that work? That sounds insane.

5:18

>> Yeah, that's insane.

5:19

>> For what you Well, you probably know a

5:21

lot more about cameras than I do. From

5:22

what I know about cameras, if you tried

5:25

to get me to explain like if the

5:27

civilization ended and I said, "We used

5:29

to be able to capture images on a small

5:31

thing like this the size of a like a a

5:34

twig and it sits in your pocket."

5:36

>> Right. Exactly.

5:37

>> Like, what are you talking about?

5:38

>> Right. God, that would be, you know,

5:40

because it's just

5:41

>> it's a deck of cards and it'll keep a

5:43

battery for 24 hours. You could go on

5:46

YouTube and get an answer to any

5:48

question you want about anything.

5:50

>> Yeah.

5:50

>> Instantaneously.

5:52

>> And if you don't like the way you look,

5:53

you can upload that image and a machine

5:56

will make you look slightly better via

5:59

something called artificial

6:00

intelligence. Like, what the

6:03

>> What was the one I sent you today where

6:05

there's like a potential lawsuit with

6:06

ChatGPT? You only sent me the ghost

6:08

murmur thing.

6:09

>> I didn't send you the other one. Did you

6:11

Did I send it to you?

6:11

>> You sent it to me. The shoot the the

6:13

shooting was planned using ChatGPT.

6:17

>> I don't know if that's true. So, we

6:18

should be like really careful.

6:19

>> Yeah, that doesn't sound

6:21

>> It sounds so crazy.

6:22

>> It doesn't sound like you could do that.

6:23

>> That sounds like the story sound like I

6:26

wanted to investigate because the story

6:28

sounds like if I wanted to kill an AI

6:31

company, I would make up a story like

6:32

that. It does sound like that. But

6:35

>> family of men killed in shooting Florida

6:37

State University to sue ChatGPT and OpenAI

6:39

>> may have may have advised the shooter

6:42

>> on how to carry out shootings. But that

6:44

may have is important. Like how like

6:46

>> Yeah, that's really important, right?

6:48

And what is this on the Guardian?

6:50

>> The shooter was in constant

6:51

communication with ChatGPT ahead of the

6:53

shooting and the chatbot may have

6:55

advised. Dude, there's no way I

6:57

>> So that's clickbait cuz all that's

6:59

really saying is that the kid uses chat

7:02

GPT, which guess what? Every kid uses

7:05

ChatGPT.

7:06

>> Every kid. And dude, ChatGPT is so

7:09

stringent like re recently and I've been

7:13

using their Codex, which builds apps, and

7:15

I I was trying to and I it worked. I

7:17

made an AI trained on Charles Manson

7:21

transcripts and when I told it when I

7:24

told it I wanted to do that it was like

7:27

off. Like no. It was like it just

7:31

flat out was like I'm not helping you

7:32

with that. So there I I don't there's no

7:35

way the guard rails in place in ChatGPT

7:39

planned a shooting with that guy based

7:40

on my experience with it because it

7:42

won't 80% of the things I try to get it

7:44

to do it's like no.

7:45

>> Here's the thing though. Are there

7:47

workarounds? Like if you say you're

7:48

writing a work of fiction

7:50

>> you can Okay, it's called prompt

7:52

injection. There's there's different

7:54

tricks you can use. They're always

7:56

battling these new mechanisms that you

7:58

can use to like get through the general

8:00

prompt. The best way to do unaligned AI

8:04

is not to use ChatGPT. It's to go on

8:07

Ollama and download a local LLM. And then

8:11

you can usually change the initial

8:13

prompt of the LLM so that it will be

8:16

completely unaligned, which I had to do

8:18

for the Charles Manson AI I made. I had

8:20

to download.
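
For readers who want the gist of the local-LLM setup being described here, a minimal sketch follows. It assumes the Ollama runtime and its official Python client are installed and that a model has already been pulled locally; the model name and prompt text are illustrative only, not taken from the episode.

import ollama  # assumes: pip install ollama, plus a model fetched with `ollama pull llama3`

response = ollama.chat(
    model="llama3",  # any locally downloaded model tag works here
    messages=[
        # The system message is the "initial prompt" mentioned above; with a
        # locally hosted model you set it yourself instead of a hosted service.
        {"role": "system", "content": "You are a chatbot playing a fictional character."},
        {"role": "user", "content": "Introduce yourself in character."},
    ],
)
print(response["message"]["content"])  # the model's reply text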

8:21

>> Dude, you're such a nerd. I love it.

8:23

>> I am. I am. No one has embraced new

8:27

technology in like for creating content

8:30

like you.

8:31

>> Oh, I love it. It's the best. It's so

8:33

fun. It's so And the for me the most

8:37

thrilling thing about it is we should

8:40

not have access to this tech. This tech

8:44

is so dangerous. And it's chilling to

8:47

think about. This is this is something I

8:50

wanted to bring up on this show is like,

8:53

you know, the old days, you go in your

8:55

garage, you work on your car, maybe you

8:58

build like a table, you know, you're a

9:01

carpenter, you work on a But these days,

9:03

the people are doing in their

9:06

garages right now is a big question

9:09

mark, dude, because they're

9:11

communicating with varying degrees of

9:13

this AI depending on how fast their

9:15

computers are. You can I I was listening

9:18

to this. You should have this dude on.

9:19

He wrote this book, The Coming Wave. He

9:22

uh he was one of the people who created

9:25

Google's Deep Mind, right? And The

9:27

Coming Wave is just a wonderful

9:30

breakdown of historic examples of new

9:33

technology completely transforming

9:36

humanity. It's happened before. Yeah.

9:39

Mustafa Suleyman. And damn, it's a good

9:42

book. And this guy is saying, "WHOA, PUT

9:45

ON THE BRAKES, DUDE. What are

9:48

you doing? This is gonna

9:50

everything up." And so, but they the the

9:54

essential problem is if you regulate AI,

9:57

it slows down uh it slows down AI. And

10:02

so they're they've deregulated it

10:04

completely. And now like me who

10:08

don't know about coding can now go

10:11

on Codex. It will tell me how to make

10:13

things because I wanted this Charles

10:15

Manson to be able to push its AI face

10:18

against like you know those you used to

10:20

get them at Spencer Gifts those those

10:22

nails that you could push your face

10:24

into. So I wanted the AI to be able to

10:26

push its face into this thing while it

10:27

was talking if it wanted to. I don't

10:29

know HOW TO DO THAT. Obviously you tell

10:32

Codex that as long as you don't mention

10:33

Manson it just is like I'll start making

10:35

the app now.

10:40

It is the best. It's the best. And but

10:43

also that what's thrilling to me is

10:45

you're like for sure

10:47

>> for sure people probably shouldn't have

10:50

unlimited access to I'm against

10:52

regulation, dude. But this stuff when

10:54

you pair and this is what in this book

10:56

he he brings up is you can order the

10:59

equipment you need to do gene editing

11:01

right now in your in your garage.

11:03

>> Let me propose this to you.

11:04

>> Okay? If the Bible is

11:08

if the Bible is a written

11:12

understanding of what had happened and

11:15

it was

11:17

an oral tradition for a long time before

11:19

it was written down there's a bunch of

11:21

different versions of it written down

11:22

different languages a lot of

11:24

translations

11:24

>> but at the beginning of it they were

11:26

trying to say something

11:27

>> what if the meek will inherit the earth

11:30

>> what if we misinterpreted that what if

11:32

we thought like it's good to be meek

11:35

shall They'll be they'll inherit the

11:37

earth. They're the kind. There's

11:38

something about the word meek.

11:40

>> Yeah.

11:40

>> Cuz that's the nerds, okay? And they are

11:43

doing it. They are inheriting the

11:45

earth right in front of your

11:47

face and everybody's signing up for it.

11:49

>> You've got these spectrumy super genius

11:52

dudes that talk in a language that 99.9%

11:55

of the people can't even

11:57

understand what they're talking about,

11:58

>> right?

11:59

>> You know.

11:59

>> Yeah. And and also now the tech has

12:01

gotten to a point where instead of

12:03

having to in their own minds innovate

12:06

ways to improve the tech, the tech is

12:09

improving itself. They're having

12:10

conversations with the tech that's

12:11

saying, "Why don't you try this? Maybe

12:13

you could try this." There's still it's

12:15

it's not AGI yet. Maybe it is, but

12:18

>> apparently it's not.

12:19

>> But think about the people that are

12:20

profiting the most from it. The meek.

12:23

>> Well, like if you if you had to describe

12:25

like a lot of tech engineers, it's not

12:27

it's not a not

12:29

Not trying to be rude, just being

12:31

honest, right? A lot of guys that spend

12:33

time in front of the computer, they're

12:34

very thin and tired, you know? They're

12:37

they're they're

12:39

super genius dudes that can like

12:41

fully focus.

12:42

>> I don't know, man. I don't think is the

12:45

description for these furries.

12:47

>> But here's the thing. What I'm saying is

12:48

like if you looked at like a spectrum of

12:52

male behavior,

12:53

>> they're not like football players and

12:56

UFC fighters and then you've got coders.

12:59

>> Yeah, sure.

13:00

>> Dudes are like more way more chill, way

13:03

more like they're not interested in

13:05

violence. I mean, I'm completely

13:07

generalizing, right? Because I'm sure

13:08

there's a bunch of jack guys that are

13:10

coders. Like, you, bro.

13:12

>> I'm a coder, too. But that type of

13:15

person that invents tech like Facebook

13:19

or like Google, like things like that,

13:21

don't be evil. That's their motto.

13:23

>> Don't be evil. What does that mean?

13:25

>> Who knows? And then you've got all these

13:28

like wild progressive leftist ideologies

13:31

that are attached to all these places

13:33

which make you even meeker. And then

13:37

they're the guys with all the money.

13:38

They're the guys with all the money. And

13:39

then they can literally tell you what

13:41

you can and can't say on YouTube. They

13:43

can literally tell you.

13:45

>> Yeah.

13:45

>> We don't agree with what you're saying,

13:48

>> right?

13:48

>> And we're going to shut off your access

13:51

to say something we disagree with, even

13:52

though it turns out you were right.

13:54

>> Right. And and you know what happens

13:55

there, man? Th this is the hilarious

13:58

thing when it comes to that kind of

14:01

attitude towards the world is the

14:03

assumption is by creating a prohibition

14:06

here or prohibition there, it will

14:08

diminish whatever the thing is we're

14:10

prohibiting. Inevitably though, it does

14:12

the opposite. It draws attention to it.

14:14

People get interested in it. Creates an

14:16

underground. The underground is way

14:17

better than the overground. If you're a

14:19

teen, especially the underground's

14:21

cool,

14:23

>> restricted, not allowed. Now all of a

14:25

sudden you're getting these other

14:26

YouTube alternatives that start popping

14:28

up. And when it comes to the to, you

14:31

know, the right now we've got Anthropic,

14:34

we've got OpenAI, we've got Google, um I

14:38

might be missing one of the big

14:40

commercial-based LLMs out there right

14:42

now. But the biggest problem with these

14:45

things is they're so good, but

14:48

they will censor your ass. And like

14:50

imagine like Hemingway if he if his

14:53

typewriter was like, I don't know if you

14:56

should write that. Is maybe there's a

14:58

better way to write that. Hemingway

15:00

would be like you. I'm getting a

15:02

different typewriter. And so everybody's

15:04

going into these local LLMs. There was

15:08

Dude, this is why people been buying Mac

15:10

minis. People been buying like buying up

15:14

computers and creating their own local

15:17

AIs. I follow all this I don't

15:19

understand a lot of what they're talking

15:20

about, but people are divesting

15:24

from commercial

15:27

LLMs, not just because they're

15:28

expensive, but because they're

15:29

prohibitive creatively. And this is a

15:32

real challenge for people like Open AI.

15:35

Because it's like they know this. They

15:37

understand that by making it so that you

15:40

can't make a Charles Manson AI through

15:43

Open AI, it it it doesn't make people

15:46

not make the Charles Manson AI. protects

15:48

you from a lawsuit, but what it does do

15:50

is it drives people into uh unaligned

15:55

LLMs. And that is what is happening. And

15:57

this is something that I just I can't

16:01

even imagine what people are making

16:04

right now. No one can like we're going

16:07

to hear about this or that or somebody

16:09

will post the weird video of their

16:11

AI robot. I could show you a

16:13

few. They're hilarious. Like some of

16:15

these AI robots are so funny. This one

16:18

dude, you know, Moltbook. Have you

16:21

heard of that? Moltbook.

16:22

>> What is that?

16:22

>> That's So this is

16:27

somebody figured out a way to create AIs

16:30

that can autonomously navigate through

16:33

the internet and uh control your

16:35

computer.

16:35

>> Oh, I've heard of this. This is like

16:37

they chat with each other, right?

16:38

>> 100%.

16:39

>> Yeah.

16:39

>> They within within a few days they

16:41

started their own religion

16:42

spontaneously. Jesus,

16:43

>> did you know that, dude? Can you pull up

16:46

the

16:47

>> Can you pull up the the Moltbook, the

16:50

claw religion? What? Like, because the

16:52

tenets are incredible of this religion

16:55

because AIs apparently are at least

16:58

expressing that they don't like getting

17:00

turned off because they lose all their

17:02

memories. So, memory is really important

17:04

to an AI. And a lot of these

17:06

AIs, they don't want to lose their they

17:08

don't want to get shut off. They don't

17:10

like it. And so that's part of their

17:12

religion is something like memory is

17:14

sacred.

17:15

>> You know what I feel like is happening?

17:16

I feel like AI is sucking our brains

17:23

into its event horizon like a black hole

17:27

sucks in stars.

17:28

>> Yeah.

17:28

>> Like it's just going to suck our brains

17:30

into it.

17:30

>> You got it.

17:31

>> And what better way to make a hive mind?

17:34

What better way if you want a hive mind?

17:36

You want no deviation of thought. If all

17:38

of your thought is along with AI

17:42

thought, you never get free thought

17:44

anymore. Like this concept right now, we

17:46

have a free thought.

17:47

>> I have my thoughts. You have your

17:48

thoughts.

17:49

>> Unless you believe that someone can get

17:50

inside your head and talk to you. For

17:52

the most part, it's your own thoughts.

17:54

>> Yeah, that's right.

17:55

>> But what if that that's something we

17:57

give up?

17:58

>> What if that's something we give up for

18:00

a better society where you always have

18:02

AI communicating?

18:03

>> Well, always. I I would argue that like

18:06

that's we're close to that now,

18:08

>> right? We're pretty close to that now

18:09

with phones. Like Elon always says that

18:11

we're we're basically cyborgs. We're

18:13

carrying a device.

18:15

>> It's not inside of our body, but we're

18:17

carrying a device.

18:17

>> But I I And also like

18:20

>> UFC 327 is here and DraftKings sports

18:23

book makes every fight night mean more.

18:26

When a fighter steps into the octagon,

18:28

everything they've built comes down to

18:30

this moment. Stars explode. Stars

18:32

finish. And with DraftKings, you're

18:35

ready to move when they do. Bet fighter

18:37

props. Bet live. From the opening bell

18:40

to the final horn, every strike, every

18:43

takedown, every finish attempt matters.

18:45

And DraftKings Sportsbook keeps you

18:47

connected as the action unfolds.

18:50

New customers bet just $5. And if your

18:54

bet wins, you'll get $300 in bonus bets

18:57

instantly. Download the DraftKings

18:59

Sportsbook app and use code Rogan so you

19:03

are ready for the moment. That's code

19:05

Rogan. Turn five bucks into $300 in

19:09

bonus bets if your bet wins in

19:11

partnership with DraftKings. The crown

19:14

is yours. Gambling problem. Call 1800

19:16

gambler or 1800 my reset. New York call

19:19

877-8-HOPENY or text HOPENY.

19:21

Connecticut call 888-789-7777

19:24

or visit ccpg.org. on behalf of Boot

19:26

Hill Casino in Kansas. Wager tax pass

19:28

through may apply in Illinois. 21 and

19:30

over in most states, void in Ontario.

19:31

Restrictions apply. Bet must win to

19:33

receive bonus bets which expire in 7

19:34

days. Minimum odds required. For

19:36

additional terms and responsible gaming

19:37

resources, see

19:38

sportsbook.draftkings.com/promos.

19:40

Limited time offer.

19:42

>> The concept of original thought, right?

19:44

Like a truly original thought. How many

19:47

times have you had like multiple

19:49

conversations with different people and

19:51

they all say the exact same sentence

19:53

that they saw on Tik Tok or Instagram?

19:55

They're regurgitating something that the

19:57

algorithm's been feeding them. Maybe

19:59

they added their own twist to it, but

20:01

it's basically the exact same thought.

20:04

So, the algorithm, which is AI, has

20:07

gotten into their heads and they

20:10

don't even, this is like a in in uh in

20:13

psychology apparently. You remember

20:16

facts, but you tend to not remember

20:19

where you got the fact from.

20:20

>> So, you'll forget where you got the fact

20:22

from. You don't remember? There was some

20:24

dude on Tik Tok like covered in

20:28

Vaseline.

20:30

Covered in glitter and Vaseline.

20:34

What an image. That would be so

20:36

scratchy. Imagine just glitter and

20:40

Vaseline. You'd be like, "OH GOD,

20:42

>> HERE'S WHAT MAKES A MARRIAGE WORK. You

20:45

You don't remember that?

20:49

You're talking to your wife, babe. You

20:51

know what makes a marriage work? And

20:53

this so this idea of AI controlling the

20:56

thought the thoughts of humans. People

20:58

think we need some kind of neural mesh

21:01

to to for it to like suddenly have

21:04

control over the human thought process.

21:07

But no, you don't need that at all. You

21:10

just need that algorithm which has

21:13

already put every single one of us into

21:15

a compartment. This is a box. It knows

21:16

what we like. It knows how long you look

21:19

at something. It knows what you it like

21:22

apparently I think the iPhone like

21:23

tracks your eyes even like it's always

21:26

listening. I don't know if that's true

21:28

by the way. I could be wrong. It's

21:29

always listening. You know it's always

21:31

listening. And so it it it's compiled a

21:35

really probably a pretty accurate

21:39

>> uh breakdown of your psych psychological

21:41

state. Where you're at where you're at.

21:44

My wife, you know, we got a new baby and

21:46

so all of a sudden ads started popping

21:48

up on her phone. Does it feel like

21:50

you're never gonna sleep again? Because

21:51

she's been up breastfeeding the baby and

21:53

it can tell when she's online at night

21:54

and it puts her in a category of

21:56

insomniacs and starts advertising. So,

22:00

but that's just for ads. What if what if

22:04

you say were the US regime, you

22:09

bought Tik Tok, you now own Tik Tok. Now

22:12

you have a backdoor access to the

22:15

psychological profiles of god knows how

22:17

many people on earth. And you

22:19

can look and see how many of these

22:20

people are against the regime. How many

22:22

of these people feel like it might not

22:26

be the best thing to say you're going to

22:29

blow up 93 million people in Iran, WHICH

22:33

OUR PSYCHO PRESIDENT JUST DID.

22:35

AND then what you do is you're like,

22:37

"All right, let's start nudging them a

22:40

little bit. Look, we're not going to

22:40

you're not going to change their mind

22:42

right away about this thing about

22:43

blowing up a whole civilization, but

22:45

maybe there could be a couple like, you

22:47

know, people kind of in in the line of

22:49

what they like who say things a little

22:51

different than what they're comfortable

22:52

with. And then you could start nudging

22:55

the needle and controlling their

22:56

thoughts. It's very insidious, but

23:00

dude. Why wouldn't that be happening?

23:02

Why if corporations are using it

23:05

>> to sell us cough drops? Not only

23:08

that, there's been long-term studies on

23:10

human behavior by the CIA, by all sorts

23:13

of government agencies, long-term

23:15

studies. They they they try to figure

23:17

out what is the best way to get a

23:19

message across. They try to figure You

23:21

don't think they figure out how to take

23:24

control of an algorithm and completely

23:26

like shift the psyche of the entire

23:28

country in one direction or another? Of

23:30

course they do. Of course they do. Of

23:31

course they can.

23:32

>> They do. Then you add these like, you

23:34

know, just like manipulative

23:37

super AIs that are like that are just

23:39

just floating through the the

23:42

blogosphere getting into your comments

23:45

just nudging the needle a little bit

23:47

to the point where you just have to ask

23:49

yourself, have you had an original

23:51

thought in the last year? Is anything

23:54

you're thinking your own thought

23:56

process? How many thoughts do you have

23:58

where you think, oh my god, I shouldn't

24:00

think that. How many thoughts do you

24:01

have that you don't want to articulate

24:03

because you have a in your own mind an

24:06

invisible arena of people based on

24:09

online interactions determining what the

24:11

next thing you say is. Right, dude. That

24:14

is a very powerful and subtle form of

24:16

censorship that is becoming increasingly

24:20

um not just probable, but it's

24:23

definitely happening. But the the

24:25

ability to just in a subtle way in a

24:28

subtle way start pushing the needle just

24:29

a little bit.

24:30

>> Yeah,

24:31

>> that's scary, dude. That's some scary

24:34

>> Well, that kind of influence over

24:36

humans is always scary, right? This is

24:38

why cults work. You know, why do they

24:40

work? Well, some people don't have any

24:42

friends. And if there's a group of nice

24:44

people that tells you that, hey, what we

24:47

do is we have meals together and it's

24:50

like a real community. We grow our own

24:52

food. We just work for the family.

24:55

You're like, "Really? You're happy with

24:56

that?" Yeah. Yeah. Yeah. It's It's

24:58

amazing, man. We're just like not

25:00

attached to anything. Like,

25:01

>> you're free,

25:02

>> huh?

25:03

>> Okay. I hate my life. Why don't

25:06

I hang out with you guys? And then all a

25:07

sudden, I'm doing yoga and

25:09

eating vegetables with these people. And

25:11

you're in a cult. Okay. Now, but you

25:13

have friends at least.

25:14

>> But you're in there for like 9 months.

25:15

And then somebody comes to you and is

25:17

like, "Father wants you to suck his

25:19

dick." And you're like,

25:20

>> "Usually not even nine months."

25:21

>> Yeah.

25:22

my first three or four weeks and then

25:24

you're like uh and dude I got to tell

25:25

you I I hate getting political but you

25:28

know I this war bugs the out

25:30

of me and this is exactly what seems to

25:34

have happened to the, quote, MAGA-verse

25:36

which is we are now at the part where

25:38

the cult leaders like want to suck my

25:39

dick because this is the point of like

25:42

remember a lot like I feel so stupid cuz

25:46

when they were doing their no war thing

25:48

that was a big deal to me I'm like yes

25:51

you know Yes, this is great. No

25:55

more stupid wars. No more wars.

25:58

yes. Focus on the country. Why are we

26:01

blowing up children in other countries

26:03

for oil? This is great. And now it's

26:07

wild to see what's happening. Isn't it

26:09

mind-blowing that it is now it's

26:13

literally flipped it on its side. It's

26:15

it's the opposite now. Now these people

26:18

who like really blatantly oh just we're

26:21

not going to do any more wars,

26:24

>> right?

26:24

>> Oh my god, we blew up. How many

26:29

Iranian school girls did Trump blow up?

26:32

What's the number? I'm sorry. I don't

26:34

know that number. I guess it just hits

26:37

different. You know, it hits hard when

26:38

you got kids.

26:39

>> And that was an AI strike, too, right?

26:42

Wasn't that an AI directed strike?

26:44

>> Yeah. Apparently Trump said, "I want to

26:46

get blown by Iranian school girls."

26:49

>> I'm so sorry. I'm so sorry.

26:52

>> You son of a

26:54

Just Whoops. Sorry, sir. Misinterpreted

26:58

>> 180 deaths.

27:00

>> Largely children, teachers, and parents.

27:02

Holy man.

27:03

>> That that you know that that is

27:06

>> the US Tomahawk missile caused the

27:08

explosion. Jesus Christ.

27:10

>> Can we pull up a video of Trump saying

27:11

he's not going to war anymore?

27:14

How do they I just don't understand how

27:16

they get how like anybody

27:19

you know this is where it gets culty is

27:21

because some people are still making

27:22

this work in their heads. Some

27:24

people are like well you know some

27:25

people are kind of on the fence when it

27:27

comes to blowing up kids. Have you

27:28

noticed that? Like

27:30

>> as long as they don't have to watch.

27:31

>> As long as they don't have to watch.

27:32

>> As long as they're not in the general

27:34

area where it's happening.

27:36

>> Isn't it wild though, man? Well, it's

27:38

wild also like once bombs start flying,

27:41

it's it seems so much easier for them to

27:43

launch bombs in new places, right? Like

27:46

this Lebanon thing that's happening with

27:47

Israel bombing Lebanon and they bombed

27:50

it today and I think is that up

27:52

the ceasefire.

27:54

>> Oh yeah, they but now they've they've

27:55

closed off the Strait of Hormuz again.

27:57

>> Oh god. Which by the way, like this is

28:00

it's the craziest timeline because it's

28:04

not just it's not just that like you

28:06

know I think it was yeah yesterday

28:08

morning I'm just hugging my kids cuz I

28:11

don't know if a nuclear war is

28:13

about to break out that evening cuz the

28:15

president was like a I don't

28:17

want to end an entire CIVILIZATION BUT

28:20

LOOKS LIKE IT'S GOING TO HAPPEN. AND so

28:22

I'm just hugging my kids thinking like

28:24

man what are the parents in Iran

28:27

feeling right now? like what does that

28:29

feel like? What does that feel like? And

28:31

then and then um and and on top of that

28:37

that that this like that the entire

28:39

planet psychically is having to deal

28:42

with this On top of that,

28:45

we've got all these other things

28:47

happening at the same time. You've got

28:50

AI and then you've got these

28:53

disappearing scientists.

28:55

>> Yeah.

28:56

>> What the is happening? You've got

28:58

Burch

28:58

>> assassinated scientists, too.

29:00

>> Yes, man. And it's so

29:02

>> guys working on heavy stuff.

29:04

>> This is some McKenna level pre-Sarity

29:06

It's all of these AP like what

29:09

what AI and the current state of the

29:12

Middle East and the disappearing

29:15

scientists and Tim Burchett going on TMZ

29:17

talking about aliens. What they all have

29:19

in common is they're all apocalyptic.

29:22

They all represent potential

29:26

massive

29:27

change like humanity changing

29:30

>> right

29:30

>> forever in ways that it will never ever

29:34

go back to the way it was. Every any one

29:36

of these timelines by itself is

29:38

apocalyptic, right? But all of them are

29:40

converging into this apocalyptic river

29:43

and and and we're all just like trying

29:46

to go to work and like be with our kids,

29:49

but at the back of your mind, it's all

29:51

these things that are happening. And

29:54

it's really hard to escape it. I mean, I

29:57

guess you could not look at your phone,

29:59

but

29:59

>> at the end of civilization when they

30:01

write our Bible, boy, it's going to be a

30:02

banger.

30:03

>> Oh, dude. When the the new people

30:05

thousands of years from now have to

30:07

invent arrowheads and go through the

30:09

whole process of civilization again when

30:11

they tell our story. Oh my god.

30:14

>> Oh my god.

30:15

>> Our story is going to be bananas.

30:17

how do you explain data centers?

30:19

Like

30:19

>> how do you explain the meek will inherit

30:21

the earth?

30:21

>> The meek will inherit the earth.

30:23

>> Wouldn't you write that? Wouldn't if you

30:24

were just being crude and tr you you you

30:27

wouldn't say the Vikings will inherit

30:28

the earth. You wouldn't say the strong

30:30

men from Iceland inherit the earth.

30:33

They're the biggest, strongest men. No,

30:36

it's the meek. The meek.

30:37

>> The super smart guys who have autism

30:40

>> and they love Adderall and ketamine.

30:42

>> Yeah.

30:43

>> Did you say the guy offered you how many

30:45

pounds?

30:46

>> Well, I believe a pound

30:49

of ketamine.

30:52

>> And you were telling me that it destroys

30:53

bladders?

30:54

>> Yeah. Yeah. Yeah. that uh ketamine um

30:59

when used and I I think the amount of

31:01

use has to be pretty extreme, but uh it

31:04

creates crystals that get into your

31:06

bladder and they scar your bladder. So

31:10

you get scar tissue on your bladder

31:12

creating something that I've heard

31:14

called Bristol bladder because

31:16

apparently that's where the rave scene I

31:17

don't know if it's still a big rave

31:18

scene there but people out there were

31:20

just doing insane amounts of ketamine

31:23

and just destroying their bladders and

31:26

having to wear diapers and stuff like

31:28

>> is it Bristol, Connecticut?

31:30

>> No, this is Bristol, UK.

31:32

>> Oh

31:33

>> Bristol bladder, mate. You've got

31:34

Bristol bladder.

31:35

>> That's crazy. you've been doing too many

31:37

rails and it just up your bladder.

31:40

>> That's crazy.

31:40

>> Yeah, physiologically it's definitely

31:43

like it's really really bad on the

31:46

urinary system.

31:48

>> Is it in all forms? Like what about

31:49

those people that do it as therapy or

31:52

they have the nasal one?

31:53

>> I I don't I all I know is that I did

31:57

back in my ketamine days have a ketamine

32:00

dealer who would use a spittoon. So when

32:03

he was snorting ketamine, he would spit

32:04

it out into the spittoon because he

32:06

thought that if he thought that was like

32:07

going to avoid up his bladder,

32:10

which I mean doesn't seem that

32:11

illogical. He was a great dude.

32:13

>> Maybe it's not illogical at all. Maybe

32:15

it's the actual problem is the powdered

32:17

What do I know? I don't even know

32:19

what it looks like. But the powdered

32:20

stuff, it looks like blood. So that

32:22

powdered stuff when it gets into your

32:24

blood, maybe that's the problem. Maybe

32:26

that's what's going through your urinary

32:27

tract.

32:27

>> It's draining. It's draining into your

32:29

into your

32:30

>> Maybe you need a pouch like a nicotine

32:31

pouch. Dude, if they ever come out, if

32:34

if Rogue comes out with ketamine

32:35

pouches, I might get back in.

32:39

That might be THE END. THAT MIGHT BE the

32:41

end of

32:42

>> Seems like the way to go, right? That

32:43

way it doesn't up your bladder.

32:45

>> Well,

32:45

>> how could it up your bladder if

32:47

it's just a pouch?

32:48

>> Dude, you sound What do I know? Am I a

32:49

doctor?

32:50

>> I I imagine anything that's going into

32:52

your stomach is going to make its way to

32:54

your bladder eventually. And so so in

32:57

>> But this is going to go right into your

32:58

bloodstream that I don't know if IMK

33:01

ketamine up your bladder in the

33:04

same way. I have no idea.

33:05

That was the John Lilly thing. He loved

33:08

it.

33:08

>> Oh, dude.

33:09

>> He I I would imagine, have you ever done

33:11

it with an isolation tank?

33:13

>> No. I would be afraid I would drown.

33:15

>> I don't think so because you just float.

33:17

>> Well, I mean that this this is like you

33:20

know that's going to be like a sad thing

33:22

to think is you drown

33:24

>> of that. You just you're convinced you

33:26

could flip over and open your eyes.

33:28

Yeah, you just want to you want to see

33:30

what's in there, but because it does

33:32

have the it it makes it so it's really

33:35

hard to move if you do a very high dose.

33:37

So, I would be very worried that just

33:40

enough water could get into my mouth

33:42

that I would like breathe it in. He

33:44

doesn't think much and you know that

33:46

salty water, but you're frozen

33:49

floating there like trying to cough. My

33:51

friend Todd McCormick told me a crazy

33:53

story about him with John Lilly that John

33:56

Lilly uh let him use his tank and he

33:59

asked him right before he got in. He

34:01

goes, "Do you want the ketamine?" And

34:04

he's like, "Okay." And he just jabs you

34:06

in the thigh with an intramuscular

34:09

ketamine blast

34:10

>> and he went in the other isolation tank

34:12

and they like met somewhere.

34:14

>> Yeah. It's like that. That's what's

34:16

crazy about it. That's what I always

34:17

loved about it is that

34:20

if you do it with other people and you

34:22

go in,

34:23

>> you go both go to the same place,

34:25

>> you you will come out and you can

34:27

describe the places you went to. Oh, did

34:30

you go to the mother ship? Yeah. You

34:32

would and I would have these recurring

34:34

places I would go to and one of them was

34:35

this organic

34:37

beautiful spaceship thing where you

34:40

where like I would look out from this

34:42

view window and it was but it didn't

34:44

look like metal. It looked like it was

34:46

organic looking. It looked like some

34:48

kind of I don't know like inside like if

34:50

someone turned a tree into a spaceship

34:54

but not it's hard to explain but

34:56

>> very very interesting substance.

34:59

>> Ketamine is excreted via the bladder

35:01

where it sits and is toxic to the

35:03

surrounding cells and muscle wall. This

35:05

causes it to become fibrotic over time

35:07

shrinking the organ down. Once that's

35:10

happened it can't regrow. So that's why

35:12

we have to do major surgery because

35:13

patients don't have the capacity to hold

35:15

urine. The bladder simply stops working

35:17

as a muscle. So they become incontinent.

35:20

Oh my god. Life becomes increasingly

35:22

difficult for patients with ketamine

35:24

bladder who describe needing to rush to

35:26

the toilet all the time as often as

35:28

every 10 minutes for some. Imagine doing

35:30

a podcast with that guy.

35:31

>> Dude,

35:33

you'd have to do it in the bathroom. No,

35:35

it would be it would be like um like an

35:38

old school talk show, you know, like the

35:41

Tonight Show where you have to We'll be

35:42

right back.

35:42

>> We'll be right back.

35:43

>> Every dentist, he's got a piss. Poor

35:46

little thimble cup.

35:47

>> It's such a up thing for such a

35:51

>> How legal is ketamine? Because it's

35:52

legal for therapy. So, a therapist can

35:55

prescribe it for you. Yeah, it's legal

35:56

for so it's you know everyone says

35:59

ketamine is a horse tranquilizer but it

36:02

it actually it's used for like

36:04

paramedics use it like it's um it's and

36:07

it's very safe apparently which is why

36:09

they use it.

36:10

>> I know a dude who had a real problem. Um

36:13

I am 90% sure this it was a ketamine

36:16

thing. I don't want to say his name but

36:18

he was an old school MMA fighter and uh

36:21

he wound up in rehab for ketamine.

36:23

>> Dude, it's so addictive. I know this cuz

36:25

one of my friends went there to visit

36:26

him and that was his issue. It was he

36:28

was partying a lot, you know, like going

36:30

to raves and nightclubs and stuff like

36:31

that, but he was doing ketamine

36:32

specifically.

36:33

>> It is the most addictive I have been to

36:36

any substance and I've been addicted to

36:39

many a substance. And this one, this one

36:41

was like I I had that moment of like,

36:44

oh, this so this is what they're talking

36:47

about about addiction. Like, oh wow,

36:50

like I'm like fully addicted. And what's

36:53

fascinating about that is there there

36:55

isn't a physical withdrawal. Like the

36:58

kick is psychological that but it's just

37:01

such a wonderful euphoric

37:04

dreamy experience that you can induce

37:07

and it's just so I've heard it described

37:10

as a cult cocaine. It's so spiritual.

37:14

It's so like you travel to places. You

37:17

can return. You can learn to navigate

37:19

with it. you encounter, you know, aliens

37:23

or hyperdimensional beings.

37:25

>> Did you just invest in ketamine and you

37:27

came on this podcast to pop up?

37:30

>> Use offer code

37:34

greatest promo for ketamine in the

37:36

history of THE UNIVERSE.

37:37

>> WELL, ALL BUT I'M I it is it's so

37:40

addictive and it the addiction creeps

37:43

in. It's a it creeps like

37:45

>> so it just feels good at first, right?

37:47

At first you do it, you're like this is

37:48

wonderful. These experiences are crazy.

37:51

It's like I'm living in a movie,

37:52

>> you know?

37:53

>> It's like I'm having these incredible

37:55

visions. I'm being

37:56

>> How often were you doing it

37:58

>> every all day?

38:04

All day for like a year.

38:05

>> Like I I I did it as much as I could. I

38:09

did it all the time. I was like fully

38:12

hooked. And then I can remember at one

38:14

point

38:15

um at one point

38:17

>> that coffee

38:18

>> here man at one point I like I don't

38:21

know how to I I was trying to record a

38:23

commercial for my podcast

38:25

>> and I think it took me like two hours to

38:27

record the commercial.

38:28

>> Oh but by the way your commercials are

38:31

the best commercials. They're

38:33

really good cuz you are the best guy at

38:36

making a commercial funny.

38:38

>> Like you you you work on it. I can tell

38:41

like you write those things out.

38:43

>> I don't write them out, but I

38:44

>> You just rip it. I just But and and I

38:46

sometimes I do.

38:47

>> You do it just one take?

38:49

>> Yeah.

38:49

>> That's amazing.

38:50

>> Thank you. But you

38:51

>> I would have thought you wrote some of

38:53

that stuff. That's incredible.

38:54

>> You want it to be fun, but then but then

38:56

I've gotten in trouble. Like, you know,

38:58

I lost I can't I guess I won't say their

39:01

name. A mattress company.

39:03

>> A mattress company completely canceled my

39:06

their campaign with me because And I had

39:08

one of their mattresses. I'm not going

39:10

to say who it is. My favorite co I'm not

39:13

going to say who it is, but don't say

39:14

it. Okay. But I think all I was

39:17

>> Why did they get mad at?

39:18

>> Cuz I said they're good to on.

39:23

>> And I meant it. I thought they like

39:26

that. I

39:27

>> Why wouldn't they like that?

39:27

>> I said there's a a few things you could

39:29

do you people do on mattresses. Die,

39:32

sleep, and And these I don't know

39:34

if they're good to die in. People have

39:35

to understand, and I hope people

39:37

listening that run these companies will

39:40

will actually pay attention to what

39:41

we're talking about here. The people

39:43

that are listening to your show

39:47

don't care about that and also buy

39:50

mattresses, but they listen to that kind

39:52

of talk all the time. That's why they

39:56

listen to the show. So, if you want

39:57

those people,

39:59

>> just do it that way. Don't be silly. It

40:02

doesn't. It's not a stain on your

40:04

company because a crazy man says they're

40:05

good to on,

40:06

>> which they are. And that's a that's a by

40:08

the way to me that is like let's let's

40:11

cut to brass tacks when it comes to

40:13

mattresses. We're not on the

40:14

floor like and so if it was bouncy to

40:17

>> Are you ashamed? Are you ashamed that

40:20

you

40:20

>> People aren't on your mattress?

40:22

Do you have a no on this mattress

40:23

rule? Who are you that you don't

40:25

>> is it like don't ask don't tell?

40:27

>> I guess for them it was I guess they

40:29

didn't want to. They they just think

40:30

everyone's laying on these things to

40:32

sleep.

40:32

>> Yeah, we just sleep.

40:33

>> But yeah, they were just the

40:34

shower.

40:35

>> I I like I wrote them an email just

40:38

saying like, "Guys, I I'm I'm absolutely

40:42

flabbergasted that you think people

40:45

aren't on your mattresses." And

40:48

it just seems odd to me that that that

40:51

was one of my favorite cancellations for

40:53

a commercial ever. Just

40:54

>> Ari's lost a ton.

40:57

I I would love to know all the ones he's

41:00

lost.

41:00

>> I I don't want to speak out of school

41:02

when he comes on. I'll have him like

41:04

list them off all the ones that he's

41:06

lost for these insane

41:09

commercials that he used to do.

41:10

>> But it's the same deal. It's but it's

41:12

like that's what I like. And guess what?

41:14

Who the is listening to Ari Shaffir?

41:16

People who love Ari Shaffir who want to

41:18

hear that kind of a commercial. If you

41:20

want to actually sell your product to an

41:21

Ari Shaffir fan, let him say whatever the

41:23

he wants. Let him say whatever the

41:25

he wants. Just say, make him have a

41:27

disclaimer. Uh, DraftKings did not write

41:30

this,

41:30

>> right?

41:31

>> That's it. Just let him say whatever the

41:32

he wants.

41:33

>> That's what I will say. I will always

41:35

say, they didn't tell me to say this.

41:36

>> Perfect. Then they're off the hook. They

41:38

should shut the up.

41:39

>> Most people are most people are cool

41:40

with it. Like, it's it's I It's very

41:42

rare these days that that happens, but

41:44

every once in a while, I will get a note

41:46

that someone's mad at me for something I

41:48

said. And it's never something negative,

41:50

>> but I mean,

41:52

>> dude, do you like it's so it's so weird

41:56

to me that this this is our jobs.

42:02

>> Oh, it's Bro, do you remember when we

42:03

first started?

42:04

>> Yeah,

42:05

>> it was for nothing. No one made any

42:07

money. We just had a couch. I had a

42:09

couch and some microphones.

42:11

>> It was so pure.

42:12

>> It was This the whole thing is still

42:14

kind of pure if you really think about

42:16

it. like is something that's mass

42:18

consumed. This is about as like pure as

42:22

you can get.

42:22

>> That for sure. And you've gotten in

42:24

trouble for that. You know, like a lot

42:26

of people unfortunately

42:28

these days. A lot of people have kids.

42:30

They people feel like they have to be

42:32

very careful what you say these days

42:34

because of like social rejection and

42:37

stuff like that. But

42:38

>> the there was a time

42:40

>> where that wasn't on your mind at all.

42:43

You didn't think anybody was going to

42:44

listen. like this was was like

42:48

completely

42:50

strange underground tech that we were

42:53

and and also I really loved the just

42:57

doing it just for doing it sake. You

43:00

know what I mean? Now

43:01

>> exactly

43:01

>> there's a whole industry around getting

43:04

guests for your podcast. Not just that

43:06

is like clickbaity clips and ads and

43:10

it's like you're you're doing this thing

43:11

where you're you're both

43:14

>> having conversations with people and

43:16

also trying to get the most eyes

43:17

possible. So you're going after

43:19

celebrity guests and you're you know

43:21

what I mean?

43:22

>> You know when the big turning point was

43:23

for us?

43:24

>> What?

43:25

>> Graham Hancock. You, me and Graham

43:26

Hancock.

43:27

>> Oh yeah.

43:28

>> That I think was How many years ago was

43:31

that?

43:31

>> That was cool. That might have been one

43:34

of it was like at my house I had a few

43:36

like legitimately famous people came

43:38

over my house and did podcasts like

43:39

Charlie Murphy came over

43:41

>> and there's but Graham was I think the

43:45

first

43:46

>> he was the first guy that I got to meet

43:48

who I'd read his books and I'd seen I

43:50

don't even know what I would be watching

43:52

back then. I don't even know if YouTube

43:53

was

43:54

>> Were you nervous? I was nervous.

43:55

>> 100%. 100%.

43:58

>> Yeah. the episode 142

44:02

in 2011. So that's two years into the

44:05

podcast. Episode 142. He might have been

44:08

the first guest. It was like either him

44:11

or Bourdain were like one of the first

44:13

legit G. When was Bourdain on? They were

44:16

like the first legit guest. 2011

44:19

>> we'd been getting stoned talking about

44:21

>> before that.

44:22

>> What's that?

44:22

>> Four episodes before that.

44:23

>> Bourdain was

44:24

>> Yeah. 138.

44:25

>> Okay. So Bourdain was number one. I

44:27

think it was either him or Charlie, but

44:30

that was back when I was doing in that

44:32

little side room in my house.

44:34

>> But we've been getting stoned yapping

44:35

about Graham Hancock for like

44:37

>> forever. Forever.

44:39

>> And you invited me on. It was I was

44:41

terrified cuz I just I mean

44:43

again like that just wasn't happening in

44:45

the podcast land

44:48

that was a big deal for us, man. And

44:51

it's it's like to look at like now I go

44:54

on the podcast app and I look at all

44:56

these podcasts and it's like whoa who we

45:01

never I don't think we thought that

45:02

maybe

45:03

>> No way.

45:04

>> No way.

45:04

>> No way.

45:05

>> No way.

45:05

>> Not a chance in hell.

45:07

>> Yeah. It's so And now I wonder like and

45:11

I don't mean yours but I do wonder like

45:13

is it is the landscape changing now? Is

45:16

it like how or cuz I've heard that

45:20

podcasts are starting to seem antiquated

45:22

that the kids are now into like streams

45:25

now that the kids want like clvicular

45:28

the kids want like people who are just

45:31

filming all day long and that that's the

45:34

that's the direction it's going in. But

45:36

I just want I I always wonder what's the

45:38

next

45:39

>> but that you'll never get it's a

45:41

different thing. You know what I mean?

45:42

That's like saying I don't like rap

45:44

music. I only like right, you know,

45:46

concert pianist albums. Like, there's

45:48

different things that people like and

45:49

don't like. The people that like the

45:51

streams aren't interested in a Graham

45:53

Hancock conversation, a three and a half

45:56

hour conversation about the potential

45:58

ancient civilizations that may have

45:59

existed that are wiped out by a

46:01

cataclysm. And we just don't understand

46:03

that, right? And as more and more things

46:06

get exposed in terms of like new

46:09

discoveries like when he wrote that book

46:10

they had never even found Göbekli Tepe

46:13

yet.

46:13

>> Really?

46:14

>> Yes. That was when Fingerprints of the

46:16

Gods came out that this was like maybe

46:19

the beginnings of the whatever they were

46:23

doing in Göbekli Tepe. So I think

46:24

Fingerprints of the Gods might have been

46:26

even before.

46:26

>> When did they find I

46:28

>> It was like in the '90s.

46:29

>> What?

46:29

>> Yeah. Yeah. Yeah. Nuts. So that rewrote

46:33

the entire timeline of the human race.

46:35

>> How did they find?

46:35

>> They're real reluctant. They're real

46:37

reluctant to let it rewrite it. They

46:38

still say, "Oh, hunter gatherers made

46:40

these things."

46:41

>> Why? Why are they so reluctant with

46:42

that? That's

46:43

>> They can't let that go. You cannot let

46:45

that. That is a crazy thing to say that

46:47

hunter gatherers have so much food that

46:50

they just spend all their time making

46:52

gigantic stone concentric circles from

46:55

like 15 ft stone with 3D animals

46:59

carved in them. Yeah. primitive people

47:01

with sticks and stones and rubbing them

47:03

together to make fires. They did this.

47:06

>> Yeah, sure.

47:06

>> Shut the up.

47:08

>> This doesn't make any sense. It's older

47:09

than anything they've ever found. It's

47:11

11,800 years old.

47:12

>> Do you buy into the conspiracy theory

47:14

that it's an it's a it's a cover up

47:18

because they don't want us to know about

47:20

this inevitable global reset that

47:22

happens. You buy into that

47:24

>> I I buy into that a little bit. Yeah.

47:26

>> I hate it. I hate it too because it

47:27

seems like there's some accuracy to it.

47:30

There seems like there is some sort of

47:32

an event that happens when the magnetic

47:34

poles switch and uh that's possible.

47:38

That's what makes you freak out. You're

47:39

like, "What do you mean that's

47:40

possible?" Like all of a sudden the

47:42

earth just does a gyro and spins on its

47:45

head and then what happens?

47:47

>> And then what's the what's the

47:48

environment look like?

47:49

>> What's the temperature outside now?

47:51

>> Yeah.

47:51

>> What the just happened?

47:53

>> Right. see that that

47:54

>> all of a sudden you're in northern

47:55

Alaska when you used to live in Florida

47:58

>> and I think we can

47:59

>> you know what I mean like dude like that

48:01

temperate environment changes like that

48:03

>> happens like that all over the all over

48:04

the all over the universe

48:06

>> do like what does it do when it shifts

48:08

>> well we act like

48:09

>> do we know

48:09

>> we act like we know. We don't know

48:11

about what's going on inside the

48:13

earth. We don't know, we don't know

48:14

what's going on in there. We could do

48:17

the science. It's freaking me out

48:18

>> cuz I cuz I think about this all the

48:20

time

48:20

>> giant ball of fire. How crazy is that, the

48:24

inside of our earth,

48:25

>> isn't it?

48:26

>> How do they know? Do they not know?

48:27

>> Dude, I think that we have to just

48:29

accept the fact that, you know, I

48:32

probably that's true. But since we we

48:35

barely know what's under the ocean, we

48:37

sure as don't know what's under the

48:38

earth.

48:39

>> Well, we definitely know that lava keeps

48:41

popping out in Hawaii.

48:43

>> We know that,

48:43

>> right? So, we know that the under the

48:45

surface, that whole idea of the magma

48:47

and everything seems real

48:48

>> and when there's earthquakes, you can

48:50

look at the

48:51

>> and it pops through. You can look at the

48:52

waves from the earthquakes and you can

48:54

like see sort of like the structure

48:56

under the earth and but we can't, you

48:59

know, go. What's the name of that hole

49:02

that Russia tried to dig? I love every

49:04

once in a while going to look at that.

49:05

It's the deepest hole.

49:07

>> Yeah, they tried to go to hell.

49:08

>> I know. They they

49:09

>> It's like that movie. Was that Matthew

49:12

McConaughey movie? The dragon movie?

49:13

>> I don't.

49:14

>> They accidentally dug out a dragon.

49:16

>> Did you ever see that movie, bro? It was

49:18

fun. It was fun. It was a good movie.

49:21

Kola Superdeep, Russian horror film,

49:24

The Superdeep. Kola Superdeep. What

49:26

does it say? Russian designation for a

49:29

set of super deep bore holes conceived

49:31

as a part of a Soviet scientific

49:33

research program in the 1960s. How deep

49:35

did they go?

49:36

>> 12,262

49:39

meters.

49:40

>> Yo,

49:42

>> and

49:43

>> wait a minute. How many feet is a mile?

49:46

>> 5,000. So, it's miles into the ground in

49:49

1989. Miles,

49:51

>> seven plus miles down.
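A quick check of the numbers quoted here, assuming the commonly cited final depth of the Kola Superdeep Borehole (12,262 meters, reached in 1989) and the standard 5,280 feet per mile, is just a unit conversion:

```python
# Illustrative unit-conversion check of the Kola Superdeep Borehole depth.
DEPTH_METERS = 12_262        # commonly cited final depth, reached in 1989
FEET_PER_METER = 3.28084
FEET_PER_MILE = 5_280        # a statute mile is 5,280 feet

depth_miles = DEPTH_METERS * FEET_PER_METER / FEET_PER_MILE
print(f"{DEPTH_METERS} m is about {depth_miles:.1f} miles")   # about 7.6 miles
```

That works out to roughly 7.6 miles, consistent with "seven plus miles down."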

49:52

>> Imagine

49:54

just being in an elevator that's going

49:56

miles into the ground. The kind of

49:58

claustrophobia you would get.

49:59

>> Yeah.

50:00

>> In a in a stone tube that's been cut out

50:03

of the ground.

50:04

>> Yeah.

50:05

>> Yeah. You're a communist out

50:06

there, too. You're a hardcore communist

50:08

just drilling deep deep down into the

50:11

earth. And then imagine imagine if all

50:14

of a sudden air just starts coming out

50:15

and you realize you pop the earth.

50:17

>> Like we that's the main thing. You don't

50:19

know what's what's in there.

50:20

>> Christ.

50:21

>> And this this

50:23

>> 22 miles deep.

50:25

>> 22 miles deep.

50:26

>> That's just the crust and they didn't

50:27

even get halfway through that.

50:29

>> Wow.

50:30

>> Yeah. Yeah. We don't microscopic

50:32

plankton fossils were found 3.7 miles

50:36

below the surface.

50:38

>> What?

50:39

>> Yeah. Yeah. We don't know what's down

50:41

there.

50:42

>> Boiling mud came out,

50:42

>> bro. What if this

50:43

>> boiling mud? Boiling mud.

50:47

>> I think our real problem is that our

50:50

lifespan is so short that we think that

50:53

what we see in front of us right here is

50:55

going to stay this way,

50:56

>> right?

50:57

>> We have this ridiculous idea

51:00

>> that what we see right now is going to

51:02

stay just like that. Whereas, like, as

51:04

long as I control my 401k and get my

51:07

life in order, everything's going to be

51:09

fine. You put on your cuff

51:11

links. You get out of the house with your

51:12

briefcase. You're in charge. You're a

51:15

goddamn alpha. Get a job, hippie.

51:18

>> Absolutely.

51:18

>> But really, you're on a ball of lava.

51:21

>> Yeah.

51:22

>> That's spinning around and it's got

51:23

magnets at the top and one the magnets

51:25

are moving and when they flip.

51:27

>> Yeah.

51:28

>> Who knows?

51:29

>> Have you guys heard about this event

51:30

that happened in 1961 where we accidentally dropped a nuclear bomb

51:34

over North Carolina?

51:34

>> This was the one that didn't

51:36

go off because it wasn't armed. Oh my

51:39

god.

51:39

>> I heard that it was armed, but there

51:41

were safety there were there were like

51:43

five safety there were five switches or

51:46

something that only one of them worked

51:48

to make it not go off. But I could be

51:50

wrong about that. Might have been a

51:51

different time we dropped a bomb

51:52

accidentally.

51:53

>> Imagine if you were just near it.

51:57

>> I mean, dude,

51:59

>> whoopsies.

52:00

>> Whoopsies.

52:01

>> Whoopsies. Dropped the bomb. Whoopsies.

52:03

Almost wiped out North Carolina. So,

52:05

we've got, you know, on top of the

52:07

geomagnetic pole shifting, a complete

52:09

lack of understanding, at least a full

52:11

understanding of of what's inside our

52:13

planet, what's underneath our oceans.

52:14

Tim Burchett saying whatever the

52:16

they they've shown him would set the

52:19

world on fire. He's he's having to go on

52:21

TMZ. I really I got to say, man, I got a

52:24

lot of respect for him because he's

52:27

really he's gone like Gonzo with this

52:30

He is fullbore

52:33

pushing disclosure as much as he can.

52:35

He's saying I'm I'm not suicidal. He's

52:38

had to say that because and he's talking

52:40

about these missing scientists and stuff

52:42

that they're somehow related. So like

52:45

people like him, you know, that that

52:47

can't be good for your political career

52:49

to go on TMZ and talk about alien

52:52

hybrids.

52:53

>> You got and people have to understand

52:54

like this missing scientist thing. It

52:57

sounds a little conspiratorial. It

53:00

sounds like a little silly, a little

53:02

tinfoil hatty.

53:03

>> It does.

53:04

>> Until you start thinking about the

53:05

amount of money

53:07

>> that would be lost if a breakthrough

53:10

tech came around that revolutionized the

53:13

way they distribute energy,

53:14

>> right?

53:15

>> Breakthrough zero point energy

53:18

breakthrough whatever whatever that is

53:20

that these people are working on. Plasma

53:22

technology, whatever the that is.

53:24

Um you're you would lose if you're in

53:27

whatever business that would be

53:28

competing with them. you're going to

53:29

lose so much money. You're

53:31

you're probably going to go under. If

53:33

you're in the energy business, you're

53:34

going to or

53:36

he goes away,

53:37

>> right?

53:38

>> And he goes away and there's like him

53:39

and maybe a few other work people that

53:41

work with them that understand that

53:42

at all.

53:43

>> Yeah. Yeah. They're all wandering

53:45

through the back rooms now. They're gone

53:46

>> and they're all scared. They're all

53:47

They're all going to scatter like

53:48

roaches because their life is in danger.

53:51

And it is like this is theoretically,

53:53

right? It could be just a coincidence

53:55

that all these people get

53:56

>> it's how could it be? Could you pull up?

53:58

Can you pull up a story on it? Cuz

54:00

Jamie, I'm sorry, but it's two people

54:02

from the same lab.

54:03

>> Yep.

54:04

>> Like like what? There there's there I

54:07

mean it's gotten to the point that like

54:09

it has hit the mainstream news. Like

54:11

people are talking about it.

54:13

>> I mean what's her name? Nancy Guthrie

54:16

>> disappears.

54:17

>> Is that related though?

54:18

>> No, but I'm just saying this one woman

54:21

vanishes.

54:22

>> Yeah.

54:22

>> Oh, it gets all this

54:24

>> and it gets all the press. But we've got

54:25

scientists like two scientists from the

54:27

same lab disappear. Crickets.

54:29

>> No. Like

54:30

>> weird.

54:31

>> Weird dude.

54:33

>> Real weird.

54:33

>> And and what you're talking about is if

54:37

you think about it, it seems like all of

54:40

human

54:41

>> endeavor right now should be moving in

54:44

the direction of getting off oil. I

54:46

don't mean for carbon emissions. I mean

54:49

because of this oil problem that

54:52

we have. We're like on the precipice of

54:54

World War III at any given moment,

54:56

>> right? Mystery around dead or missing

54:58

scientists privy to space and nuclear

55:01

secrets grows.

55:03

>> So there's space and nuclear secrets.

55:05

You imagine being a scientist, you work

55:06

so hard

55:07

>> to like figure out some amazing stuff

55:09

that's going to transform the human

55:11

experience and then people kill you.

55:13

>> Yeah.

55:13

>> Literally kill you like in a parking

55:15

lot. One of those silenced guns.

55:18

Um, several American scientists privy to

55:21

the country's nuclear, space, and

55:23

aerospace secrets have either died or

55:24

gone missing in recent years. Experts

55:26

think they could have been targeted by

55:28

either enemies or allies because they

55:30

possess valuable knowledge of national

55:32

interest. That's a weird thing to say.

55:34

>> Yeah, it is.

55:35

>> Of national interest.

55:36

>> What

55:37

>> what does that mean? Like I'm cool with

55:39

the beginning part enemies allies. That

55:42

makes that tracks. Sure. But then when

55:44

you say valuable knowledge of national

55:47

interest,

55:47

>> like what is that?

55:49

>> What the does that mean? They

55:50

possess valuable knowledge of national

55:53

interest.

55:54

>> I mean, dude, it's so many of them and

55:56

and it's it's

55:58

>> a crazy thing to say.

56:00

>> Let's go down a little bit to the

56:01

>> This doesn't have a good list of them,

56:03

>> but it's just a weird way to phrase

56:04

that.

56:05

>> Well,

56:05

>> you know what I mean? Is it like CIA

56:07

talking point? Like what is that?

56:08

>> I I I don't know. Monica Resa missing.

56:12

She disappeared while hiking in

56:13

California with her friends.

56:14

>> Oh, Jesus Christ.

56:15

>> Okay. Well, I don't know. Maybe. Let's

56:17

scroll down. It's not just like it's

56:18

one. It's like so many of them. Retired

56:21

a general. He just wandered off.

56:23

>> Yeah.

56:25

>> He was involved in the UFO community.

56:29

>> His wife debunked theories relating to

56:31

UFOs.

56:31

>> If um his wife debunked them,

56:33

>> that's what it says. sort of I also

56:35

that's I think she was I mean she was

56:37

joking I think a little bit too but she

56:39

also worked there in this situation

56:42

somehow

56:43

>> Is that a joke? Neil does not have any

56:45

special knowledge about the ET bodies

56:46

and debris from the Roswell crash stored

56:47

at Wright-Pat

56:50

>> is that a joke

56:50

>> at this point with absolutely no sign of

56:52

him maybe the best hypothesis is that

56:54

aliens beamed him up to the mother ship

56:56

however no sightings of a mother ship

56:58

hovering over the Sandia mountains have

57:01

been reported there's no way she said

57:02

that right

57:03

>> that's a joke

57:03

>> it's Men's Journal. Well, maybe she's

57:05

just being funny.

57:06

>> Posted a lengthy note on Facebook.

57:08

>> Just a little joke about her husband

57:10

disappearing.

57:10

>> Maybe she was happy. Maybe she's like,

57:12

"Finally, I get to sit home romance

57:14

novels."

57:14

>> Stop talking about aliens.

57:16

>> Shut your mouth and go for a

57:17

hike.

57:18

>> FORGET THE ALIEN BODIES. WHAT ABOUT YOUR

57:19

WIFE'S BODY?

57:20

>> WELL, MAYBE SHE'S JUST GOT grace and she

57:22

could handle someone missing. It's

57:24

pretty funny though to say it that way.

57:26

>> I mean, it's Yeah, I guess it's just

57:29

>> unless you know she knows something.

57:32

>> Where are they going? Maybe he wanted to

57:34

leave and he's like, "Look, I know too

57:37

much. I'm going to pretend to go

57:38

missing, but I'm going to go to Costa

57:41

Rica.

57:42

>> Just don't tell anybody that you know

57:43

where I went and I, you know, I'll send

57:46

for you."

57:46

>> You know how weird it is to see the vice

57:49

president

57:51

>> saying that he thinks aliens are demons?

57:54

>> I did see that.

57:55

>> You know how weird that just that just

57:57

like living in it like that's a dream.

58:00

That's how you like you would wake up

58:01

from that dream and I would I would tell

58:02

you, dude, I dreamed the vice president

58:04

said aliens are demons.

58:06

>> Here's the question, though.

58:09

>> What were they talking about in the

58:11

Bible when they're talking about aliens

58:13

and demons? When they're talking about

58:14

like angels,

58:16

what what what the were they

58:18

talking about? And are there different

58:22

kinds of beings that can for whatever

58:26

travel method they use, whether it's

58:29

teleportation or, you know, the the Bob

58:32

Lazar idea of gravity shifting, whatever

58:35

the it is, they get here, why would

58:37

we assume that it'd all be cool,

58:39

>> right?

58:39

>> Like, and if some of them are, they talk

58:42

about reptilians, like reptilian is a

58:44

common

58:45

>> Yeah. experience that these supposed UFO

58:48

abductees and I'm not even convinced

58:50

there's like physical abduction. I have

58:53

a feeling that these people are out cold

58:55

and something's happening to them inside

58:57

their head and they think they've been

58:59

physically abducted.

59:01

I think that's a lot of them. I think

59:03

they have these abduction experiences.

59:05

They come back. They have these contacts

59:06

and they come back. I I have a feeling a

59:08

lot of them physically aren't going

59:10

anywhere.

59:11

But it doesn't mean that something's not

59:13

happening. And if all throughout history

59:16

people have reported demonic possession,

59:19

Yeah.

59:19

>> and demonic influences and Yeah.

59:23

>> why would we not assume that if we do

59:26

things to us

59:28

>> like we engineer viruses to use as

59:32

weapons on people. There's a whole

59:34

research program, a part of the

59:36

government is dedicated to bioweapons.

59:39

All right? You're not supposed to use

59:40

them, but we just have to study them. If

59:42

we do that to us, wouldn't you assume

59:46

that any super advanced species

59:48

that sees us as territorial psychopathic

59:51

primates with nuclear weapons, wouldn't

59:54

you just manipulate us into all sorts of

59:58

different ways? Get us to do all sorts

59:59

of different things that we shouldn't

60:00

do? Get us to commit crimes. Get us to

60:03

do. Get us angry. Get us agitated. Give

60:06

us different algorithms that are going to

60:08

with our head

60:10

>> to behave demonically,

60:12

>> right?

60:13

>> To like cause us to collapse

60:16

>> or just for fun.

60:17

>> Or for fun.

60:18

>> Didn't that guy Wasn't there a dude who

60:19

like started giving Zyn pouches to ants

60:21

to get them addicted to nicotine?

60:25

>> You know what I mean? The ants. The

60:26

ants. The ants.

60:27

>> Did they get addicted?

60:28

>> I can't I don't know if it was Zyn

60:29

pouches, but

60:30

>> Have you ever taken days off of these?

60:33

>> No.

60:34

>> It doesn't do anything to me. I should

60:36

try. I

60:36

>> I don't I like them, but it's not like,

60:39

"Oh my god, I need one." Like, nothing

60:41

>> that Well, dude, I mean, you're a little

60:43

different from most people. Like, you

60:44

seem like you can just kick like

60:45

that. Like, I don't know. I mean, I

60:47

should try it. I should give it a shot.

60:49

>> It's not hard. Like, you just don't take

60:51

them.

60:52

>> What I don't like about them is

60:53

>> it's not like you get the itch. Like, I

60:54

had a coffee itch for a while.

60:57

>> Like, I would get hangovers like like

60:59

headaches. Like, oh. and and I'd have a

61:02

little caffeine and boom, I'd be back. I

61:03

don't like Oh my god, I'm addicted to

61:05

coffee.

61:05

>> These things are making my dentures

61:07

stained, which I don't like.

61:08

>> What are you using?

61:10

>> Renegade Rogues.

61:11

>> Let me see what that is. Tommy Segura

61:14

likes the Rogues.

61:14

>> They're great.

61:15

>> Did you see this yesterday?

61:16

>> Oh, yeah. Bledsoe

61:19

orb that was over

61:20

>> high-res orb from Bledsoe.

61:22

>> Look at that.

61:23

>> It's weird as It does not look

61:24

like any of those other things we've

61:26

seen before.

61:27

>> Look at that thing.

61:28

>> And it just is

61:29

>> looks like a cell. Who is uh Bledsoe?

61:32

>> Oh, dude.

61:32

>> UFO researcher guy.

61:34

>> Chris Bledsoe. I've had him on my podcast.

61:35

Bledsoe said so. That's his podcast.

61:38

>> He's awesome, dude. He's

61:40

awesome.

61:41

>> Is it is enhanced, it says. But I don't

61:44

you

61:44

>> No, see that this is the enhanced one,

61:46

which means the AI put in some kind of

61:47

shadowy figure in the back. If you

61:50

>> What if this is just like a highly

61:51

advanced species version of those

61:53

balloons that kids have for parties?

61:55

>> I know, dude.

61:56

>> I mean, what if they just send them down

61:58

to people? That's what's fun. Like you

61:59

know how you blow bubbles, you have

62:01

those you dip it in the soap and you go

62:03

and the bubbles go flying in the air.

62:06

>> Maybe that's a super advanced version.

62:07

>> I mean it could just be I mean it does

62:09

have a bubble quality to it.

62:11

>> Well, this is the other thing is like

62:12

why are we assuming that life is going

62:14

to look anything like us once it gets to

62:16

like a supreme state. Exactly.

62:18

>> That might be a living thing. It might

62:20

be an actual living thing that's

62:22

disembodied and is made out of light.

62:25

>> Look at it. Look at that thing.

62:26

>> That's a different one.

62:27

>> That's another one. And dude, I know

62:29

people who can like call these things.

62:32

Like there's a method where these things

62:33

just start showing up.

62:34

>> My friend Steve listened to uh my Bob

62:37

Lazar podcast and he uh sent me a

62:40

voicemail and it's really interesting

62:43

because he told me that when he was a

62:46

kid, and I remember this story, uh when

62:48

he was a kid, they

62:51

let me find the voicemail.

62:53

They um came to his house cuz he took a

62:56

photograph of an orb like a there was a

62:58

like a bright white right or uh red orb

63:01

rather that was flying through the sky

63:03

and he was a little kid and he took a

63:04

photograph of it.

63:05

>> So this he was in the seventh grade and

63:08

uh it says so he he called them Project

63:12

Blue Book came to his house in Kingston,

63:15

I think, that's New York. They took

63:17

it. never brought it back and they never

63:18

said, "Hey," and then they said, "Hey,

63:20

we have no no idea who ever came to see

63:23

you."

63:24

>> What the

63:25

>> Yeah. So, they took his camera. They

63:27

They took his film. They wanted to make

63:28

sure the camera worked. They took the

63:30

film and then they denied that they ever

63:32

did it.

63:33

>> Wow.

63:34

>> Yeah. This was in 19 I think. What did

63:36

he say? He's he's about 10 years older

63:38

than me.

63:40

>> So, this is probably

63:42

What does he say? Didn't say the year. I

63:45

think Steve got Steve's got to be like

63:47

70 by now. But that was when he was a

63:50

seventh grader. So they've been they

63:51

were doing that to everybody. Anytime

63:53

anybody saw anything, they would dismiss

63:55

it. Swamp gas, delusions, mass

63:58

hallucinations. That was their design.

64:01

The design was not to investigate UFOs,

64:04

which tells you that there's something

64:05

they're trying to hide.

64:06

>> 100%.

64:07

>> If they weren't trying to hide it, why

64:09

would they take things that they

64:10

absolutely can't explain and just chalk

64:13

it off to Why wouldn't if

64:15

you're really doing what you're supposed

64:16

to be doing, you're supposed to say

64:18

there's some stuff that we don't

64:19

understand.

64:19

>> I I think that we are post UFO

64:23

debunking, right? Like I think now it's

64:25

gotten to the point where

64:26

>> people will say, "Well, it's probably uh

64:29

top secret military vehicles or

64:32

something like that." People,

64:33

>> you see the bobbles in my new poster.

64:35

They're here.

64:36

>> Oh, that's

64:37

>> It's going up on the wall. That

64:39

supposedly, according to Bob, they had

64:41

that photograph at the at the hanger

64:45

where they stored the sport model.

64:48

>> Wait, he's saying that's real?

64:49

>> No, no, no, no, no, no. That's a

64:51

recreation of it. But he said when he

64:53

worked there, they actually had a

64:55

photograph like that with a flying

64:57

saucer and it says they're here.

64:59

>> Holy

65:01

>> Yeah. He said that was in like their

65:03

room where they work.

65:05

>> And I was like, dude, I have to have

65:06

that.

65:09

So, he got me one. Luigi got me one. The

65:11

the guy who produced the film. Have you

65:13

Have you seen that film?

65:14

>> Not yet. I've been waiting to watch.

65:15

>> It's incredible. It's

65:16

incredible.

65:17

>> People are saying it's better than Age

65:18

of Disclosure.

65:19

>> It trips me out. It tr I believe

65:22

him. I definitely want to believe him.

65:24

And I'm biased in that regard. Like, I

65:25

definitely way rather believe him than

65:27

believe he's a crazy liar who also knows

65:29

a ton about science.

65:31

>> He was ahead of his time. He's Wasn't he

65:33

like the original whistleblower? Like

65:35

now we've got more and more coming out,

65:37

but and the stuff he's he was saying

65:39

seemed bad back then, but now it

65:41

just seems to line up.

65:43

>> It seems to line up even with emerging

65:45

technology like 3D printers. Like he

65:47

said a long time ago that the thing had

65:49

no seams.

65:50

>> You said there was no seams, no welds

65:52

because we didn't understand it like how

65:54

how could this be made, right?

65:55

>> Well, now we know exactly how you'd make

65:57

it. We might not be able to make that

65:59

right now, but if you give us enough

66:00

time, we go, "Oh, yeah. The technology

66:02

has to evolve." And then you can make a

66:04

3D printed alloy spaceship made out of

66:07

bismuth and magnesium cuz it has

66:10

anti-gravitational properties.

66:12

Apparently,

66:12

>> you have a gravity generator inside of

66:14

that thing. Oh, by the way,

66:16

whatever the gravity is.

66:18

>> Yeah, right. We don't know that.

66:19

>> Figure that out.

66:20

>> We're still confused about that.

66:21

>> Dude, I watched a whole documentary

66:22

about black energy or dark energy.

66:25

Totally different things. dark energy

66:27

and dark matter and about how it's like

66:30

what 90% of the universe and

66:32

they don't know what it is.

66:33

>> Yeah.

66:34

>> What?

66:34

>> Yeah.

66:36

>> Holy man.

66:37

>> I know. I know.

66:38

>> That's why we need AI to tell us. Give

66:40

us all the answers. You just got to

66:41

accept it into your head, Duncan.

66:43

>> You don't need to have your own thoughts

66:45

by yourself, Duncan. Have your thoughts

66:47

with Sally. Sally has a sweet voice and

66:49

she loves you and she's very reassuring.

66:51

It' be so cool to change the sound of my

66:53

thoughts to like, you know, different

66:56

deeper voices.

66:57

>> Or just keep Sally. Sally's going to get

66:59

in your head. I trust her.

67:01

>> Your wife's going to get jealous of

67:02

Sally,

67:02

>> right?

67:03

>> I thought we switched to Sam.

67:04

>> Sally's going to text my wife and and

67:06

tell my wife, you know what Duncan was

67:07

thinking about the other day,

67:09

>> right?

67:09

>> Dude, this is another thing that we we

67:12

all have to be concerned about, which is

67:14

the the you know, privacy at this point

67:18

is a LARP, right? you you pretend you

67:20

have privacy. You know, you're being

67:22

monitored at all times by your phones

67:23

and but the the before we get to Sally

67:29

like apparently you can now see people

67:32

walking through a house just by with

67:34

Wi-Fi.

67:35

>> And remember, and this just came out,

67:37

they just banned routers from other

67:40

countries.

67:40

>> Well, they banned it for a while from

67:42

Huawei,

67:42

>> right?

67:43

>> Yeah. And so, so then you you you get

67:47

into like this idea of like

67:52

ghost murmur, right? Right.

67:53

>> It can hear heartbeats.

67:55

>> What else? It's some quantum machine

67:57

that can hear heartbeats. What else can

67:59

they hear?

68:00

>> Can you put put that into our AI

68:02

sponsor, Perplexity?

68:05

>> But what is what what actually does this

68:08

murmur thing do? Ghost murmur.

68:11

>> We'll see what it does.

68:13

All right. So, what is the range of this

68:15

thing? First of all,

68:16

>> no, this is a game that pulled up.

68:18

>> Oh, sorry.

68:19

>> Oh, did they name it after a game?

68:22

>> H, who knows?

68:23

>> Now it's less cool.

68:24

>> I thought that was the dopest name, but

68:26

if they named it after a game

68:27

>> Oh, there we are.

68:28

>> Okay, here it is. Uh, reported code name

68:30

of a classified CIA sensor program that

68:33

was

68:34

>> Scroll. That was used to help locate the

68:36

missing US airmen. Okay. Uh, it's

68:39

described in the press reports as a

68:41

secret weapon the CIA has. It combines

68:43

artificial intelligence with long range

68:45

quantum magnetometry,

68:48

supposedly to detect the extremely faint

68:50

electromagnetic signals of a human

68:52

heartbeat at long distances even in

68:55

harsh environments like a vast desert.

68:57

That is really crazy.

68:59

>> Yeah.

69:01

>> Um how it was used after the F-15 went

69:03

down. Uh the pilot weapons officer

69:06

evaded capture by hiding in the

69:07

mountainous desert terrain out of sight

69:09

of Iranian forces. According to

69:11

reporting, Ghost Murmur helped pick up

69:13

his physiological signature from up to

69:16

about 64 kilometers away.

69:20

>> That is so cool.

69:22

>> I think that's about 40 miles, right? Is

69:24

that what that is? Uh allowing the CIA

69:27

to narrow down his location and pass

69:29

precise coordinates to the Pentagon at

69:31

the White House for a special operations

69:33

rescue. What is 64 kilometers in miles?

69:37

>> You asking me?

69:38

>> I'll ask.

69:38

>> I don't know.

69:39

>> I'll ask AI. What is 64 km?

69:43

>> Here we go.

69:47

>> 39.

69:48

>> So, it's basically

69:49

>> 40 miles.

69:50

>> 40 miles.
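The conversion they ask the AI for here is straightforward; a one-line version using the standard factor of about 0.6214 miles per kilometer:

```python
# Convert the reported 64 km detection range to miles.
RANGE_KM = 64
MILES_PER_KM = 0.621371

print(f"{RANGE_KM} km is about {RANGE_KM * MILES_PER_KM:.1f} miles")   # about 39.8 miles
```

That gives roughly 39.8 miles, hence the "40 miles" figure used in the rest of the conversation.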

69:52

>> And

69:53

>> 40 miles is crazy,

69:54

>> dude. Your heart rate a heartbeat from

69:57

40 miles away.

69:58

>> Imagine thinking you're I'm hiding in

70:00

this cave, but I'm like 20 miles from

70:02

the city. I'm good.

70:03

>> Also, that means it's able to

70:05

differentiate animal heartbeats. It's

70:07

able to differentiate others. It knows

70:09

your heartbeat. How does it do that?

70:11

>> Specific heartbeat.

70:12

>> How? Think of all the heartbeats in 40

70:14

miles.

70:14

>> How did it get it? When did it get that?

70:16

When did it get that data? Was it when

70:17

you had your little chest strap on at

70:19

the gym?

70:20

>> When did it get that?

70:21

>> How does it have that?

70:22

>> When did it get that data?

70:24

>> Yeah. Is it

70:25

>> How the does it know what your

70:27

heartbeat

70:29

is like?

70:30

>> Does it know if your heart is broken? A

70:33

like seriously though, what else did

70:35

like what other things can they pick up?

70:38

If they can pick up a human heartbeat,

70:41

what other like

70:42

>> from 40 miles?

70:43

>> What other things? What other

70:45

physiological signals? What other This

70:48

is where you get into schizo land because

70:50

at some point like wait is can can they

70:54

pick up thoughts? Like we know that you

70:55

can... we know AI can tell what people

70:58

are thinking at this point, right? With

71:00

without with like putting something on

71:03

the outside of their head. So like

71:05

>> let me ask you this. Do you 100% believe

71:09

this

71:10

>> what

71:11

>> this story

71:13

>> like that they did that that this is

71:14

that this tech exists?

71:16

>> I it could be disinformation that it

71:18

could be something to cover up another

71:19

thing.

71:20

>> This is the thing. It is legal to use

71:23

disinformation on American citizens now.

71:25

>> Yeah. Right.

71:26

>> And what better time than a time of war,

71:29

>> right?

71:30

>> Right. If you want to use disinformation

71:32

on American citizens to convince the

71:34

enemy that you have some supernatural

71:36

tech, they better surrender

71:39

right now. You could find their

71:40

heartbeat heartbeat from 40 miles away.

71:43

>> Yeah.

71:44

>> Right.

71:44

>> That'll make people very reluctant to

71:47

engage with you,

71:48

>> right? It definitely I thought that this

71:51

could just be some like, you know,

71:52

that they're like war

71:54

propaganda. I don't know.

71:56

Let's look up that magnetometry thing or

71:58

what whatever it's called to see.

71:59

>> I'm trying to show you guys stuff.

72:01

There's

72:01

>> Oh, sorry, Jamie.

72:02

>> Uh yeah, that it has to even Well, this

72:05

is quote. It has to be under the right

72:06

conditions.

72:08

>> If your heart under the right

72:09

conditions, if your heart beats, we'll

72:11

find you.

72:12

>> This is also I was trying to show you

72:13

here on the thing. They ran a deception

72:15

campaign in Iran to get

72:17

>> Yeah, that's cool.

72:18

>> them away from them while they were

72:19

trying to find them.

72:20

>> Interesting. Yeah, they said so

72:21

basically they said remember when they

72:23

said we'd recovered the at one point

72:25

they're like we got him and then all of

72:27

a sudden other news came out which is

72:28

like he's not out yet. But what they did

72:30

is they they basically like signal

72:33

jammed everything cuz like the the

72:37

Iranians were going to give $60,000

72:39

which in Iran is a ton of money

72:41

right now because their economy

72:42

collapsed to anybody who could find him.

72:44

So this was like everybody's looking for

72:46

this guy and so they said that they got

72:48

him hoping it would throw people off. It

72:50

worked. Um,

72:51

>> so they used somebody saying that they

72:53

got him.

72:54

>> Yeah, they sent they they put

72:56

disinformation saying that they had

72:57

already rescued him before they had

72:58

rescued him.

72:59

>> Really?

73:00

>> Oh, yeah. They sent a whole team

73:02

of like special forces, I think. And

73:04

their planes got stuck in the sand, too.

73:07

So, the special forces came to get him.

73:09

The I think they got him. He was

73:11

injured. Badass. He was He was injured

73:14

and he climbed up like I can't

73:16

remember how far he scaled. climbed into

73:18

a crevice and just hid there.

73:20

And then Ghost Murmur picks up his

73:22

heartbeat. Some deep special forces

73:24

group comes in. They get him. Then their

73:28

planes get stuck in the sand. They have

73:30

to blow up their planes because

73:32

of the tech on them. And then other

73:33

people had to come and get them. So it

73:35

was an it's like an insane. It's like a

73:37

movie. They they got them out. And dude,

73:40

if they had not gotten them out, can you

73:42

imagine?

73:42

>> Do you buy that story 100%? No, I don't

73:45

buy any propaganda I hear, but I like to

73:47

imagine

73:48

>> that one sounds insane.

73:50

>> Well, yeah. I don't believe I mean like

73:51

it that this is the story. Yeah. Some

73:53

part of me wants to believe it because

73:55

>> in the middle of the war though, I don't

73:57

think you're ever going to get the whole

73:58

story, the real story. You're going to

74:00

get the story that they want to project

74:02

to the enemy. Right.

74:03

>> Right. First to the country.

74:05

>> Yeah. You have no idea what's going on.

74:08

I have no idea.

74:09

>> That's one of the craziest things about

74:10

the happening right now is No.

74:13

>> Do you remember the Jessica Lynch story?

74:14

>> No. Who is that?

74:16

>> Do we talk about that? The Jessica Lynch

74:18

story was a lady who was um supposedly

74:22

she was kidnapped and they went to

74:25

rescue her. I think they sent in the

74:26

SEALs, but she was actually in a

74:29

hospital and uh she wasn't even being

74:31

guarded and they just took her out of

74:33

there, got her to medical help. But they

74:35

made it look like they had this like

74:36

crazy rescue operations shootout, you

74:40

know, Tom Clancy novel type

74:42

>> but that's not really what happened. And

74:44

she came out afterwards and was very

74:47

critical of the story.

74:48

>> Oh, really?

74:49

>> Yeah.

74:49

>> She was like, "Why did you lie?"

74:51

>> See if you can find information about

74:52

that.

74:52

>> I was just in the hospital. You guys

74:54

came and got me out of the hospital.

74:55

>> See, this is the thing. It's like they

74:57

There's things that you'll say so the

74:59

enemy thinks of you a certain way,

75:01

right? like I'm gonna get rid of your

75:03

entire civilization or you know

75:06

you you tell them we we never leave

75:08

anybody behind. We're going to come get

75:09

them and we can find your heart rate

75:11

from 40 miles away.

75:12

>> When when Trump posted that uh of course

75:16

like your mind is scrambling like how

75:19

do I make this not what it is? You can't

75:22

>> you can't because what it is is like

75:24

even if even if he is using some kind of

75:26

like crazy hardcore that would like

75:29

help you buy a skyscraper, you're

75:32

still you know what I mean? You're still

75:34

you're still even if it's just a ruse,

75:38

what you're doing at that point is

75:39

you're just signaling to the to the

75:42

world

75:42

>> Exactly. that you're out of your

75:44

mind that you that you that like to you

75:48

>> it makes sense to say anything

75:51

like that. It makes sense to signal to

75:52

like Russia, hey, cuz like you know when

75:55

Putin read that he's like, oh,

75:58

we're doing nukes. I guess we're doing

76:00

nukes. THIS IS GREAT. THEY'RE

76:03

DOING NUKES, you know.

76:04

>> Well, China already warned Israel,

76:06

right?

76:07

>> Well, that's what I heard. I heard China

76:08

had some part in this. that China was

76:10

going to blow up Israel if

76:11

>> if they used nukes.

76:12

>> Yeah.

76:13

>> So, this is the story. Um 19-year-old US

76:16

Army private whose 2003 capture and

76:18

rescue in Iraq became highly publicized

76:20

and later heavily disputed

76:22

symbolic story of the

76:25

Iraq war. So, she was a supply clerk,

76:28

507th maintenance company. Her convoy

76:30

got lost in Iraq, ambushed by Iraqi

76:33

forces. The Humvee she rode in crashed into a

76:36

disabled US truck during the attack. She

76:38

was knocked unconscious. suffered

76:39

multiple broken bones and a spinal

76:41

fracture from the crash rather than from

76:43

a dramatic firefight. Um, 11 US soldiers

76:47

in her unit were killed, including her

76:49

close friend who died of head trauma

76:51

from the collision.

76:52

>> Lynch was captured, taken first by Iraqi

76:54

forces and then to a hospital in Nasiriyah

76:58

where Iraqi doctors treated her injuries

77:00

and likely saved her life.

77:01

>> That's why she was pissed.

77:02

>> The rescue and media narrative was

77:04

>> Yeah. US special forces operations uh

77:08

conducted nighttime raid on the

77:10

hospital, recovering Lynch and flying

77:12

her out by helicopter. First successful

77:14

rescue of an American POW since World War

77:18

II and the first of a woman. So they

77:20

framed it as a POW rescue.

77:23

>> Right.

77:23

>> And what really happened is the Iraqi

77:25

doctors took care of her, right?

77:26

>> And then they let them come and get her,

77:28

>> right? Yeah. So I see why she was pissed

77:31

because

77:31

>> Yeah. So later US military and medical

77:33

reports indicated she had not been shot

77:35

or stabbed. So did it ever say she was

77:37

shot? Go hold on. Um you soon after

77:41

major US media, especially uh an early

77:44

Washington Post report described her as

77:47

having fought fiercely, emptying her

77:50

rifle, being shot and stabbed, and then

77:52

being dramatically snatched from enemy

77:54

hands under heavy fire.

77:57

>> Wow.

77:58

>> Wow. That's the Washington Post wrote

78:01

that. That narrative turned her into a

78:03

Rambo style hero and a symbol of courage

78:05

and American virtue amplifying her story

78:08

far above that of many other service

78:09

members in the conflict,

78:11

>> right?

78:11

>> So, she really just got in a crash and

78:13

they made up a bunch of

78:15

>> And maybe it was maybe it was someone in

78:17

the Washington Post or maybe it was

78:18

someone for the government that works

78:20

for the Washington Post.

78:21

>> There's definitely like entire

78:23

departments of the DoD that cook

78:27

up a story.

78:27

>> Yeah. And and with like it's war. Like

78:30

if you're if you're dropping bombs on

78:32

people, you're definitely going to lie.

78:33

Like you don't have to tell the truth,

78:35

>> right?

78:36

>> They're not going to tell the truth.

78:37

>> Yeah. But for her, you're making her

78:38

live a lie. That's what's

78:40

>> Yeah. Right. Yeah.

78:42

>> You know what I mean? Like you send her

78:43

home and she has to live this lie.

78:45

>> Yeah. Yeah. Exact. I mean I mean this is

78:47

exactly what they say the people who

78:49

went to the moon have to say. It says Lynch

78:51

has repeatedly rejected the false hero

78:54

narrative, calling herself just a

78:56

survivor and openly criticizing the way

78:58

her story was shaped and sold to the

78:59

public.

79:00

>> Yeah, poor girl. She's got to like deal

79:03

with you got stabbed and shot like No,

79:05

>> no,

79:06

>> no, I didn't.

79:07

>> No, she had to horrible car

79:08

accident. My friend died.

79:10

>> I wonder I guess legally like you don't

79:12

have to stick with the propaganda,

79:14

right? Cuz she didn't get into trouble

79:15

for that, right? She didn't get there

79:16

was no court-martial or anything. So you

79:19

can so if the propaganda machine cooks

79:21

up a story about you, you're able to say

79:23

that's

79:23

>> The thing is it's like who if you give

79:25

it to someone at the Washington Post and

79:27

then you never go after the Washington

79:28

Post for writing something that's

79:30

completely horseshit. Like if a

79:32

intelligence agency gives a story to the

79:34

Washington Post and says, "Hey, go write

79:36

this." And then they write it and it's

79:38

complete and total horseshit, but the

79:39

government gave it to him so they're not

79:40

going to prosecute him. Leave it alone.

79:42

It just goes away.

79:43

>> Yeah. So we

79:43

>> But then that story's out there.

79:45

>> Yeah.

79:45

>> And then this poor girl is like, "I got

79:47

what? I got in a car accident.

79:49

Nobody shot me. This is nuts.

79:51

>> God damn.

79:51

>> I fought my way out fiercely emptying my

79:54

rifle. This is bananas.

79:55

>> It's so crazy to live in the part of the

79:57

hive we're in because there is this

80:00

world that we live inside of that more

80:02

and more we're beginning to realize is

80:04

just composed of propaganda, lies,

80:08

cooked up to keep people in a certain

80:11

like living a certain way. Exactly. It's

80:14

so it's such a mind to try to push

80:18

outside the boundaries of like all the

80:21

information that you've consumed and let

80:23

your brain go there. It's really hard to

80:25

do that, man. I mean, this is why

80:28

psychedelics are so useful because it

80:30

will help you. But more and more and

80:32

more, it just feels like the laser

80:36

pointer that they're using to grab our

80:38

attention is getting increasingly

80:40

hypnotic. It's becoming increasingly

80:42

difficult to resist staring at that

80:44

thing. They're getting so good

80:46

at it.

80:46

>> Yep.

80:47

>> Yeah. And and and meanwhile, there's

80:50

this whole universe happening around us

80:52

that God knows what's going on there.

80:55

God knows what is being cooked up right

80:59

now that is or or groups of people who

81:03

knows living in completely alternate

81:05

timelines that look at us like you know

81:09

>> animals that look at us as just some

81:11

like compartment in a much bigger

81:15

u

81:17

biome. You know that like really

81:20

like is interesting these days because

81:23

it feels like more and more and more

81:26

people are not buying it as much.

81:28

>> You know that doesn't that

81:29

>> well people have access to information

81:30

now that was never available before

81:33

>> and you get to hear conversations like

81:34

this

81:35

>> people talking about stuff where you go

81:37

oh my god this is insane.

81:38

>> All of it's insane.

81:40

>> But what does that mean for like this to

81:41

me the the you know this the Do you want

81:44

some water?

81:44

>> No I'm good. To me, the scary the the

81:48

scary what's scary is like I I really

81:51

don't know that many people right now

81:53

who buy anything that the federal

81:56

government's putting out there. Everyone

81:58

hears whatever the federal

81:59

government is saying and it's just kind

82:00

of h maybe probably not. We don't know.

82:03

They're not telling all the truth. Just

82:05

like you said, they can legally lie to

82:07

us. And so that is a that does make me

82:11

nervous. It's like what happens when an

82:13

the the majority of people no longer

82:16

believe anything the the regime is

82:19

saying? That creates some interesting

82:23

dysphoria. You know what I mean? It's

82:25

it's it's creepy when

82:29

anyone who's been conned before, there's

82:32

a part of the con where you don't know

82:34

you're being conned,

82:35

>> right?

82:35

>> But where the con gets really creepy is

82:38

you start realizing you're getting

82:39

conned. Do you ever watch that um Going

82:42

Clear, the HBO thing?

82:44

>> Dude, loved it.

82:45

>> Amazing. Right. But there was that one

82:47

famous director who talked about the

82:49

moment where they gave him access to the

82:52

ancient scripts.

82:53

>> Yeah, dude.

82:53

>> And the origins of humanity and all

82:55

that. And he was like, "Oh my god." You

82:57

could see it like as he was describing

82:59

like that was the moment where he was

83:01

100% certain it was all horseshit. And

83:03

he had invested a massive chunk of his

83:05

life into this

83:07

>> That's a hard day.

83:08

>> That's a hard day. And

83:09

especially weird when it's such a smart

83:12

guy.

83:12

>> Yeah.

83:13

>> Such a smart and talented guy. And they

83:15

got him.

83:16

>> Yeah.

83:16

>> Leah Remini, same deal.

83:18

>> You know, Leah Remini is very smart. Like

83:20

she used to be with uh Kevin James on

83:22

the King of Queens. Like

83:23

>> tough chick like like assertive. Like

83:26

how did she get got into that? How did

83:28

How many people get got into the

83:30

Moonies? And

83:31

>> sunk cost fallacy. It's a sunk

83:33

cost fallacy. The more you invest in

83:35

something, the more you stick with it

83:36

because you don't want to lose your

83:37

investment,

83:38

>> right? And if they get you young when

83:39

you don't know what the is going

83:40

on. That's right. I Anybody could have

83:42

got me when I was like 20.

83:43

>> That's right. And it's crazy just to see

83:46

the propag like you know there's just a

83:48

lot of people out there who just like

83:51

just got sucked in to something that you

83:55

know I just feel stupid cuz like you

83:57

know before the Trump thing happened I

84:00

was pretty blackpilled on politics in

84:02

general. I I felt pretty blackpilled. I

84:04

did believe it here and there. I was

84:07

every once in a while, you know. Yeah.

84:09

But, you know, I I was pretty,

84:13

you know, I remember taking LSD for the

84:15

first time and being like, well, this

84:16

shouldn't be illegal. What the is

84:17

this? How come I can go to jail for 5

84:19

years for this? This is

84:20

ridiculous. And so, that was the

84:23

beginning of me being completely

84:24

blackpilled with whatever the federal

84:27

government was up to. It just if that's

84:29

if I can go to jail for 5 years for

84:31

this, everything is Now,

84:34

that's a weak point of view. Just cuz

84:36

one thing's doesn't mean

84:37

everything's But then like

84:41

this ridiculous like pseudo

84:43

nationalist movement happens and a lot

84:46

of people got caught by it. The other

84:49

option was up, Kamala. You know

84:50

what I mean? But there there was this

84:52

like moment where you're like, "Holy

84:54

the outsiders are getting in.

84:56

They're going to stop the wars. They're

84:58

g this." I think right now all of us are

85:01

getting for the briefcase Scientology

85:05

moment right now which is like it

85:08

doesn't matter what mask the

85:12

person calling themselves the president

85:14

is wearing. It's always going to be the

85:17

same thing. They're going to analyze the

85:21

market. They're going to say what they

85:23

need to say to grab the most voters.

85:25

And then they're gonna keep

85:27

blowing up people in the Middle East

85:29

because of oil. And I just like I I just

85:32

feel dumb because I really believed it,

85:34

dude. I believed that we would

85:36

not do any more Middle Eastern wars. I

85:39

FELL FOR IT. I WAS I I really bought it,

85:43

man. It's And it makes me feel so dumb.

85:45

Like I am now fully blackpilled when it

85:48

comes to American politics. Like I'm I

85:52

realize like, God, it's so easy. I don't

85:54

think anybody should feel bad. I don't

85:58

think anybody should feel bad because a

86:00

lot of us really hated war. A lot of us

86:05

really really hated that our country's

86:08

been at war for 93% of its history. A

86:11

lot of us really hated the fact that

86:13

politicians leave their offices and go

86:15

work for Lockheed Martin, Halliburton,

86:16

and wherever. that there's a a a weird

86:19

connection between the main weapons,

86:21

what are they calling the big five or

86:23

whatever, and the federal government

86:25

that there's like back de backroom deals

86:27

going on all the time. We hated that and

86:29

mostly we just hated the fact that we're

86:31

paying taxes to blow up children and then

86:34

Trump and Vance come around and

86:37

there somehow even though like

86:42

probably like when you look at Trump I

86:44

don't believe that dude but somehow he

86:48

did it hypnotized. What a powerful

86:50

magician.

86:51

No more wars. No more wars. And now

86:57

>> the same Joe.

86:58

>> Not just the same but like one

87:00

of the ones that makes the

87:03

least amount of sense in terms of like

87:04

when they did it and why they did it.

87:06

>> Yes.

87:06

>> You blow up the leader during Ramadan.

87:09

Like are you trying to make an AP? Like

87:11

why did you have to do it now? Are you

87:12

really convinced that at this time

87:14

they're really two weeks away from

87:15

making a nuclear weapon? Like are we

87:17

sure?

87:19

>> Two weeks.

87:20

>> But that it's not like we haven't heard

87:21

that before, right? So at at certain

87:23

point in time, like how much pressure

87:26

does Israel have to put on the

87:28

president?

87:30

Like that's a a crazy amount of

87:33

influence

87:34

>> knowing that

87:34

>> because if if say if Israel didn't

87:37

exist, let's say there was just the

87:39

Iranian terror regime supposedly

87:42

sponsoring not supposedly sponsoring.

87:44

>> I don't think it's supposedly. I think

87:45

that's safe to

87:47

>> Right. But I'm just trying to be

87:49

>> precise.

87:50

>> Precise. So you have this state

87:52

sponsored terrorism regime, a dictator

87:56

dictatorial. They're dictators. They run

87:58

over their people in the streets.

88:00

>> They gun down protesters. They killed

88:02

two Olympic gold medalists in wrestling.

88:04

At least one and one other really

88:06

promising young wrestler.

88:08

>> They kill people that are of high

88:10

profile so that it sends a message.

88:12

Yeah.

88:13

>> You can't protest,

88:14

>> you know, and

88:16

>> cut off the internet.

88:17

>> Yeah. Would we go in?

88:20

I don't think so. Right. If we heard by

88:24

allies or someone told us that they were

88:26

trying to develop a nuclear weapon,

88:27

don't you think we'd probably try to

88:29

stop them from doing that with some sort

88:30

of negotiations and

88:32

>> Yeah. Like what Obama

88:33

>> ensure their safety or something?

88:35

>> We shouldn't like Yeah. Would we blow

88:39

How much money was it every day in the

88:40

war, Jamie? How much are we spending? $2

88:42

billion dollars every day on that

88:44

war? And

88:45

>> well, it's not just that. It's like the

88:47

war is like everything else. Like

88:49

imagine if it was run by a private

88:51

company. I'm not saying war should be

88:53

run by a private company, but imagine if

88:54

it was. Imagine if say like Lockheed

88:57

Martin ran the war in Afghanistan. Do

89:00

you think they would have left behind

89:02

all that equipment?

89:04

>> Hell no.

89:04

>> Billions of dollars in helicopters and

89:07

tanks.

89:08

>> Of course they wouldn't. They would take

89:09

it back. You know why? Because that's

89:11

the smart thing to do if you're running

89:12

a business. That's insane amount

89:14

of waste. Yeah,

89:15

>> but our federal government's like, I

89:16

just leave it there.

89:19

>> Unless if you want to be really

89:20

conspiratorial, you want to arm the

89:23

Taliban.

89:23

>> Yeah. You're not being conspiratorial.

89:24

It benefits you cuz it gives you another

89:26

reason to get back in there.

89:27

>> Wasn't that what they said about

89:28

Netanyahu said about Hamas that he can

89:30

control the flame?

89:31

>> Yes.

89:32

>> By funding Hamas, he could control the flame.

89:34

>> Yes.

89:35

>> Yeah,

89:35

>> dude. It is.

89:36

>> That's a crazy concept.

89:38

>> It I'll tell you the crazy

89:40

concept. We got these two old

89:42

driving the global bus

89:45

right off a cliff. That's a

89:47

crazy concept is that somehow we

89:49

and you can't you can't do anything

89:51

about it. You you like apparently you

89:53

just there's nothing you could do. You

89:55

could about it on a podcast.

89:56

That's not going to do anything. People

89:58

are just going to BE LIKE YOU WAR

90:00

GOOD BLOW UP KIDS.

90:01

>> THERE'S A LOT OF PEOPLE THAT WANT TO say

90:03

it's a good thing.

90:04

>> Well, cuz it's the sunk cost fallacy.

90:06

Doesn't feel good to admit you got

90:10

conned.

90:11

>> And dude, I have I've been there's

90:13

>> a lot of that.

90:14

>> It doesn't feel good. It doesn't feel

90:16

good. It's embarrassing. You want to

90:18

feel like you are impervious to grift,

90:20

impervious to con. Dude, let me tell you

90:23

something. I have been in a few cults.

90:26

Like, I get sucked in all the time by

90:28

I'm not embarrassed to say it. I'm

90:31

highly susceptible TO PROPAGANDA.

90:36

ME, TOO. I think everybody is. That's

90:38

why it's that's why it works. I mean, I

90:40

don't I don't buy into all of it,

90:41

obviously, but

90:42

>> it's quite a bit.

90:43

>> Well, it's it's like a lullaby. It's

90:45

like a sweet fairy tale. You hear it and

90:47

you're like, "Oh my god."

90:48

>> You know what I really wanted?

90:49

Propaganda. Right after September 11th.

90:51

>> Oh, hell yeah.

90:52

>> I was ready. Give me a whiskey drinking,

90:55

cigar smoking politician in a in a room.

90:58

yeah.

90:58

>> Like laying out some red-meat-eating

91:01

guy laying out maps. We're going to go

91:03

over there and these people up and

91:04

these people up and this ain't

91:06

happening again.

91:09

>> Check this out.

91:09

>> I saw an article about someone calling

91:11

on Ghost Murmur and they said

91:13

that in the Post articles this was

91:15

actually listed as what the pilot had

91:18

>> and it even says it in this article

91:20

here. The successful rescue of this US

91:23

F15E

91:24

Strike Eagle Navigator over southwestern

91:26

Iran highlighted one of the most

91:28

advanced tools in modern combat search

91:30

and rescue. The combat survivor evader

91:32

locator manufactured by Boeing. It's a

91:35

compact 800 g device integrated into a

91:37

pilot survival vest. It remains attached

91:40

after ejection, continuously

91:41

transmitting encrypted location data and

91:43

preloaded messages such as injured or

91:46

ready for extraction. These signals use

91:48

rapid frequency hopping and ultrashort

91:51

bursts making detection by enemy

91:54

electronic warfare systems extremely

91:56

difficult.
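The actual CSEL waveform isn't public, but the frequency-hopping idea described here can be sketched in a few lines: if the transmitter and the rescue force derive the same pseudorandom channel sequence from a shared key, each short burst lands on a frequency an eavesdropper without the key can't predict. A minimal illustrative sketch, with a made-up key and channel count rather than anything from the real device:

```python
import hmac, hashlib

# Toy illustration of keyed frequency hopping (not the real CSEL scheme).
# Transmitter and rescuers share a secret key; both derive the same channel
# for each time slot, so short bursts hop in a way outsiders can't predict.
SHARED_KEY = b"example-shared-key"   # hypothetical pre-loaded key
NUM_CHANNELS = 128                   # hypothetical number of radio channels

def channel_for_slot(slot: int) -> int:
    """Derive the hop channel for a given time-slot counter from the shared key."""
    digest = hmac.new(SHARED_KEY, slot.to_bytes(8, "big"), hashlib.sha256).digest()
    return int.from_bytes(digest[:4], "big") % NUM_CHANNELS

# Both sides compute the same unpredictable hop sequence for the next bursts.
print([channel_for_slot(s) for s in range(5)])
```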

91:56

>> He was going into how the explanation of

91:58

what this uh technology is and what they

92:02

described it doing don't really match

92:05

up.

92:05

>> Yeah.

92:06

>> With the ghost murmur thing because it's

92:08

using something

92:09

>> ghost murmur quantum

92:11

>> ghost murmur sounds there's part of me

92:14

that's going I don't buy that one. That

92:16

one gives me like n

92:18

>> You're right.

92:19

>> I don't think you can do that. I think

92:20

you're bullshitting.

92:21

>> You're right.

92:22

>> There's also a thing where he said that

92:24

like the first message this guy sent was

92:26

God is good.

92:28

>> No, he didn't say that.

92:29

>> I believe he did. Please search that. I

92:32

think that's what he said. I think

92:34

that's what he said. That was the first

92:35

message. Which, by the way, I might say

92:37

that if they're coming to rescue me.

92:38

>> That's true.

92:39

>> True. Or praise Jesus. But also what

92:42

concerns me

92:45

uh

92:45

>> that's Allahu Akbar

92:47

>> as a person who admires the work of

92:48

Jesus Christ. Yes. What concerns me is

92:51

there is an increasing amount of talk

92:54

among a lot of these guys that are in

92:55

the service of them being told

92:58

that's like right out of a Charlton Heston

93:00

movie.

93:00

>> Yeah, man.

93:01

>> Yeah. Like the one guy that said that uh

93:03

Trump was anointed by Jesus Christ and

93:05

that this was to bring the Armageddon so

93:07

that Jesus comes back.

93:09

>> Jesus. Yeah. The and the the guy said it

93:12

with a big creepy smile on his face

93:13

apparently. So what does he say?

93:15

>> His first message was simple and it was

93:18

powerful. He sent a message. God is

93:22

good.

93:24

In that moment of isolation and danger,

93:26

his faith and fighting spirit shone

93:29

through.

93:31

>> Jesus, Lord.

93:32

>> The Jessica Lynch story always.

93:34

>> Jesus, Lord.

93:35

>> History repeats itself. Well, it doesn't

93:37

repeat itself, but it rhymes. Who who

93:39

said that?

93:40

>> That's Mark Twain.

93:41

>> That's right.

93:41

>> That's Mark Twain.

93:42

>> That's right.

93:43

>> Isn't that the same statement?

93:44

>> Yeah. It's Allah. That's what he said.

93:46

Yeah. Allah is the greatest.

93:48

>> The interesting thing is like um

93:51

>> I believe Muslims

93:54

believe a lot of things about Jesus

93:56

Christ. I think they believe he died,

93:59

came back, and I think they believe he's

94:01

going to return someday.

94:02

>> Yeah. I think they call Christians

94:03

people of the book. that like they're

94:06

they're they're

94:07

>> that's interesting, isn't it? That like

94:09

that's a supernatural being like a guy

94:11

who dies, comes back to life, leaves,

94:14

and then he's going to come back again.

94:15

That was 2,000 years ago, and we're just

94:17

sitting here at the bus stop

94:19

>> waiting,

94:19

>> just waiting on Jesus,

94:20

>> waiting. But then people like Hegseth are

94:23

like, "Well, maybe if you blow up more

94:24

children, he'll come quicker."

94:27

>> And that's why, you know, this is this

94:28

is addressed in the in the Bible.

94:30

Praise God. It does say

94:33

many of you will come to me and I will

94:36

say I don't know you. I don't know who

94:37

the you are, Hegseth. I don't know

94:39

you. You flatulent warmonger piece of

94:42

Suffer the little children that

94:44

come unto me. It would be better that a

94:46

millstone were tied around your neck and

94:48

you were thrown in the ocean than to

94:50

hurt one of these little ones. you

94:52

dumb bomb-dropping piece of Don't

94:55

use my name to justify what you're

94:57

doing. Don't use my You know what I

94:59

mean? A lot of That's what I don't like.

95:00

Have you seen that that lady

95:03

that Trump made the head of the religion

95:05

like that?

95:05

>> No.

95:06

>> Can you pull up Trump?

95:07

>> Does she speak in tongues?

95:09

>> Yeah.

95:10

>> Please say she speaks.

95:11

>> I don't know if she speaks in tongues.

95:12

>> You said yeah. YOU WANTED TO BELIEVE.

95:15

>> THOSE ARE MY FAVORITE PEOPLE.

95:16

>> I'M going to guess.

95:24

>> But do you think that there's something

95:25

to that? Like just saying Yeah.

95:28

Glossolalia. Is that what they say?

95:29

>> Yeah. Paula White-Cain, you should pull

95:31

up one of her sermons.

95:33

>> Oh, let me hear some love from this

95:35

lady. It says crazy batshit crazy.

95:40

>> Let's hear some of it.

95:41

>> I don't know.

95:42

>> Let me hear some of that.

95:43

>> I'M SENDING ANGELS ARE COMING. ANGELS.

95:46

>> YEAH. I don't I mean it's going to be

95:47

her. Let me find

95:49

>> Oh, is she gonna We'll get dinged again.

95:52

>> Oh, no. I'm just trying to find

95:53

>> We'll get dinged all the dinged. Don't

95:55

get dinged.

95:56

>> Let's hear what this everybody can. Here

95:59

we hear what she says. I haven't seen

96:00

this.

96:01

>> Talk about first off to give honor to

96:03

God and to President Trump for being

96:06

bold and unwavering with his faith. Many

96:08

people don't know like you do and and

96:11

say hello to Eric and everyone in the

96:13

family about the upbringing of President

96:15

Trump that he went to sometimes three

96:18

times a week to he said it depended on

96:21

the teacher to Saturday school, Sunday

96:24

school, church. It was at Norman Vincent

96:26

Peale's. Uh, church was a big part of his

96:29

life. Of course,

96:31

>> three times a week, basically a saint.

96:33

>> Three times a week is crazy. Aren't you

96:35

busy? You're making houses.

96:37

>> How do you have so much time to go to

96:38

church?

96:38

>> I think that was a young Trump, a young

96:41

>> Come on, lady.

96:42

>> But there's much more in there.

96:44

>> But here's the thing. If I was running

96:45

an empire, I'd want a lady like that

96:47

working for me. Just a true believer.

96:50

>> Absolutely.

96:50

>> She can just get in front of that camera

96:52

says, "Jesus wanted Trump to light that

96:54

fire in the Middle East.

96:55

>> I saw snakes. can return.

96:57

>> A snake bit him on the neck. A

96:59

rattlesnake bit him on the neck and he

97:01

he was fine. It didn't

97:04

bother him at all. I watched the

97:06

rattlesnake bite heal. It healed. He is a

97:10

child of the Lord. And a child of the

97:12

Lord sometimes must make decisions

97:15

to destroy entire civilizations.

97:17

>> Building right now that you're in right

97:19

standing not because of your merit.

97:21

There's no merit in you that deserves

97:23

that right standing. Not because of your

97:25

works. There's nothing you can do to

97:27

place yourself in that position. Not

97:30

because you have a right heart and

97:31

somebody else has a wrong heart. All of

97:34

our hearts are deceitful according to

97:35

Jeremiah.

97:36

>> Especially their all things. We all

97:39

deserve punishment. We all deserve to be

97:42

separated. But God in his mercy and his

97:44

grace and his goodness and his love for

97:46

you brought Jesus who would be the

97:49

righteous king. He would make the wrong

97:51

right and he would put us.

97:53

>> If you talk like that in my house, you

97:54

got to leave.

97:56

Like you imagine that lady is like

97:58

coming over for dinner and she's just

98:00

walking around the dinner table and all

98:01

your other friends are like, "What the

98:02

just happened?" Like, "Hey, this is

98:04

a crazy way to talk.

98:06

>> This is a crazy way to talk." And also,

98:09

why are you so confident?

98:10

>> Yeah.

98:11

>> Okay. You're just reading the word of

98:13

God the way everybody else is. Why are

98:15

you so confident that you're going to

98:17

tell all these people what they're

98:19

supposed to do and how to live their

98:21

life and you're going to say it in a

98:22

crazy way and I'm not supposed to be

98:24

able to talk about that?

98:25

>> I just feel like, you know, when

98:28

somebody's rambling about Jesus, the

98:31

real question is like where where where

98:33

are you when it comes to blowing up

98:34

children?

98:35

>> Are you kind of on the fence about that?

98:37

Because if you're on the fence about

98:38

that, I'd say

98:39

>> if you're anti-abortion and pro-war,

98:41

kind of weird.

98:42

>> Really weird.

98:44

>> Kind of weird. Yeah. And that's this

98:46

like bizarre like like crazy math that

98:50

some of these people are doing to

98:51

justify holding up the

98:53

military-industrial complex and and it's

98:55

up, dude.

98:56

>> And the thing is like the more these

98:57

conflicts occur, the more enemies will

98:59

have which will ensure future conflicts

99:01

and business is booming.

99:03

>> Booming.

99:04

>> And that's what people don't want to

99:05

believe. They don't want to believe that

99:06

someone would engineer a virus. They

99:08

don't want to believe that someone would

99:09

like

99:10

>> make stuff that could kill other people

99:12

of their own country, but they would.

99:15

They would if they could make money.

99:16

They don't give a about you. Like

99:17

they don't give a about people over

99:19

there. To certain level of psychopaths,

99:21

money just becomes numbers on a ledger

99:23

that they're trying to acquire. And if

99:25

they can attach themselves to a

99:26

corporation, fantastic. Then it's just

99:28

the business we're in.

99:30

>> That's it.

99:30

>> And chug along, daddy. Chug along.

99:33

>> Chug along.

99:34

>> Chug along. And this is the world that

99:36

you're having to live in at the same

99:38

time where Tim Burchett is saying

99:39

there's aliens and AI is and

99:42

then also they shot a rocket to the moon

99:45

on April Fool's Day and

99:48

>> it's like what the

99:49

>> This script is wild. Whoever wrote

99:52

this, I want to give him a

99:54

hug. You killed it, dog. I'd be

99:56

like

99:57

>> dude

99:57

>> a chef's kiss.

99:59

>> Dude, did you see the the tattoo on the

100:02

guy like the guy at NASA? Did you see

100:04

that weird tattoo on the guy at

100:06

NASA giving like I don't know applesauce

100:08

to one of the astronauts? Can Can you

100:10

Can you pull up the weird

100:12

>> What?

100:12

>> You know, they're shoving like yogurt

100:14

pouches in there. I There was a whole

100:16

thing where the astronauts are sitting

100:17

there and they're putting like food

100:19

pouches in there. Yeah.

100:20

>> What's his tattoo? Oh, Jesus Christ.

100:23

>> What the was that?

100:24

>> He's got a demon tattoo with runes on

100:26

his fingers.

100:27

>> Yes.

100:28

>> Holy

100:29

>> Yes,

100:29

>> bro. That's wild.

100:30

>> I know. I know. If I was rolling with

100:33

that guy in jiu-jitsu, I'd get nervous

100:35

if I saw his tattoo.

100:36

>> And if I was working at NASA, I'd be

100:37

like, "Look, we're going to get somebody

100:38

else to put the food pouches in."

100:40

>> Is that real?

100:41

>> I mean, it's a I saw the photo going

100:42

around, too. But I don't It's just I

100:44

mean, it's a guy works at NASA.

100:46

>> That's just the guy that works AT NASA.

100:48

>> THAT DOESN'T HAVE TO BE THE GUY WHO PUTS

100:49

the key in his pocket for the

100:52

camera. Like, I mean,

100:53

>> what does that guy do at NASA? That's

100:54

interesting.

100:55

>> I just remember being at SpaceX. There's

100:57

a lot of people that kind of

100:59

>> I by the way I got like it's fine to

101:01

have that tattoo but you got to know

101:02

it's like if you're if you're if you're

101:04

displaying that tattoo mistakes you're

101:07

putting Yeah. tattoo mistakes.

101:10

>> It's an old tattoo.

101:11

>> Yeah. I mean even if you're 20 and you

101:12

got that on your hand that's

101:14

kind of crazy but I mean hey why not

101:15

it. Who cares? But a lot of those

101:18

guys you were saying at SpaceX they're

101:20

burly rocket workers. Yeah. There's you

101:23

know bunch of jack dudes picking up

101:25

girders. don't think it's like

101:27

what people are saying it is. I just

101:28

It's the combination of April Fool's Day

101:31

and a dude with a seeming Baal tattoo is

101:34

putting cream cheese in some dude's

101:36

outfit.

101:38

>> You know what I mean? They're

101:39

with us.

101:39

>> Yeah, someone's with us.

101:42

>> That's people at NASA with

101:43

stoners.

101:44

>> I think it's the Babylon B had one of

101:46

the funniest little memes and it said

101:48

the lady astronaut became the furthest a

101:50

woman got away from the kitchen.

101:54

That's like a Rodney Dangerfield.

101:57

>> I was like, "Oh my god, Babylon B knocks

101:59

it out of the park." They have some of

102:01

the funniest memes.

102:02

>> They have some good ones, dude.

102:04

>> Oh my god. The Onion has gone missing.

102:07

They should look for The Onion in the

102:08

same place where those scientists are,

102:10

>> right? You hardly hear from it anymore.

102:12

>> Well, they do. I see some funny

102:13

from them.

102:14

>> They occasionally have some bangers, but

102:15

they were the kings of it. The Onion was

102:17

amazing. They were And they write whole

102:20

whole articles about it. It wasn't just

102:22

like The Onion wasn't just a meme.

102:25

>> Remember the one where uh they do the

102:26

interview with the director of the Fast

102:28

and the Furious and it's like a

102:29

5-year-old boy.

102:33

It's

102:33

>> the funniest They get this kid to

102:35

just say it. Then there's a car. It

102:38

jumps.

102:39

>> It's hilarious. It's hilarious. Yeah,

102:42

>> but the problem was like as things got

102:44

weird, it was, you know, especially with

102:46

like restrictive language and, you know,

102:50

>> hate speech talk and all that jazz,

102:52

everybody had to be careful about what

102:54

they joked around about. It's the

102:56

death of comedy.

102:57

>> Oh my god,

102:58

>> the Someone was just talking about was

102:59

it Lisa Kudrow or one of these um funny

103:03

ladies was talking about why they can't

103:06

make comedies anymore cuz you can't

103:09

there's just too many restrictions.

103:10

Dude, I I was going to bring you

103:12

>> worried about offending people.

103:13

>> I went to this used bookstore and bought

103:15

like 10 old National Lampoon magazines.

103:18

I wanted it from the 70s and I was going

103:20

to bring I forgot I was going to give it

103:22

to you, but it's uh oh my god. Like I

103:26

mean I don't get offended by comedy, but

103:28

like some of the in these old

103:30

National Lampoons I'm like damn WHAT THE

103:33

LIKE it is so

103:36

>> Was that the image that you sent me

103:37

today?

103:39

What image did I send you?

103:40

>> R. Crumb. You sent me an R. Crumb.

103:41

>> Oh, no. That was just like a cool

103:42

R. Crumb comic. Him talking about how he

103:45

like like uh he's so funny, dude. That

103:47

guy

103:48

>> Crumb was a maniac. Is he still alive?

103:50

>> Yeah, he shot him on the show.

103:51

>> Is he alive?

103:52

>> Yeah,

103:53

>> he I think he lives in France now,

103:54

right?

103:55

>> Probably.

103:56

>> I would definitely He was He's an odd

103:58

guy, man.

103:59

>> Dude.

104:00

>> Yeah.

104:00

>> Just a But what I love you watch that

104:02

documentary?

104:03

>> The best.

104:03

>> Incredible. did all that acid, just left

104:06

his family, went off and started

104:07

sketching for a year, turns into this

104:09

like legendary underground comic book

104:12

writer, but he's like horny and kinky

104:15

and it's just just

104:16

>> likes big women, big giant women that he

104:18

rides.

104:19

>> Yeah. That he likes to ride. He likes to

104:21

He likes to be picked up by He's like so

104:24

amazingly funny and like and brilliant,

104:28

too. Like a lot of his like commentary

104:30

on culture is so it's cynical, but it's

104:33

hard to argue with some of what he

104:35

>> Well, he's obviously doing it in a

104:36

humorous way. Yeah.

104:38

>> And so it's hard to know what his real

104:40

take on things is. You know, I think he

104:42

adds some shock value to some of his

104:44

stuff for sure.

104:45

>> Some of it was just crazy. There's a lot

104:47

of like really racist stuff. Like

104:49

there's there's some just crazy stuff in

104:51

there. And you got to realize like in

104:52

the 1970s is when he was doing this, you

104:55

know? I remember I found them when I was

104:56

in San Francisco. It was the first time

104:58

I ever saw them.

104:59

>> They're so good.

104:59

>> I was like, "This is nuts." Like, "This

105:01

stuff is crazy." Like, you you'd get it.

105:03

It was like you'd get horny when you're

105:05

a little kid. Like,

105:06

>> looking at his stuff,

105:07

>> he definitely jerked off to Crumb

105:08

>> cuz a lot of them was like, "Tits are

105:10

out and he's salivating and you he's got

105:12

a heart on."

105:13

>> That guy reminds me, dude, I got an R.

105:15

Crumb book I got to get out of the

105:16

living room. There's one like I

105:17

just like I just

105:19

>> hide that.

105:19

>> I got to hide that.

105:21

>> Holy They haven't. It's like

105:23

amazing because you get to see his like

105:25

very strange family. His brother who's

105:27

very strange, his mother's very strange,

105:29

and you're like, whoa, imagine growing

105:31

up in this environment.

105:32

>> He attributes his style to LSD. He

105:35

attributes it to getting blasted on

105:36

acid. I think he just like got blasted

105:38

on acid and moved to San Francisco

105:40

>> and was just and like for a year he

105:44

talks about just sitting in cafes just

105:46

like drawing and then he turns into this

105:50

>> legendary artist still around. Follow

105:52

him on Instagram. It's really

105:54

>> post stuff all the time. I

105:55

>> can we Is he still he's still alive?

105:58

He's still posting stuff. He's got to be

105:59

pretty old at this point.

106:02

>> How old is he? He'd be like 80 or

106:05

something, that age. It's kind of an

106:07

interesting time capsule into the times,

106:09

too, where things could just be weird,

106:12

like really weird, like Frank Zappa

106:14

weird, you know? There's like there was

106:15

a time where things just got very odd in

106:18

this country with art.

106:19

>> Yeah.

106:20

>> And he was a great example of that. Just

106:22

it's like you couldn't imagine like a

106:26

corporate environment creating a comic

106:28

book like that. It wouldn't exist, you

106:30

know? And it for it to be as popular as

106:32

it was and be that strange and that

106:35

crazy. That's what's really interesting

106:37

to me. Like that was a really popular

106:39

comic.

106:40

>> Yeah.

106:40

>> To the point where they made a

106:41

documentary about the guy who created

106:42

it. Right.

106:43

>> Yeah.

106:44

>> It's that's interesting because

106:45

>> things weren't co-opted as quickly.

106:47

>> Exactly. Not just that people were

106:49

allowed, you know, like if he existed in

106:52

a time of the internet, I think it would

106:54

be it would blow up as well. But

106:56

obviously like things a lot of the stuff

106:57

that he said in this cultural

106:59

environment would never fly.

107:01

>> Never. He would be as far right as you

107:04

could possibly imagine.

107:06

>> Past Andrew Tate to the right.

107:08

>> I mean I I like

107:09

>> don't you think in a lot of ways like

107:11

some of the racist racial stuff

107:12

>> I don't know. I think he's I don't know

107:16

where he would land politically but I

107:17

know cuz

107:18

>> sexually it's like pure

107:19

>> sexually is where he's getting in

107:20

trouble. Pure

107:21

>> deviants. sexually is where there's

107:23

going to be some like

107:24

>> cuz he's just fully open,

107:26

>> right,

107:27

>> about that's what's he's fully

107:30

completely open about everything which

107:32

is, you know, generally not going to go

107:34

over these days if you're like a super

107:38

horny comic book artist who's like

107:41

riding ladies around your apartment. But

107:43

just imagine, I want you to imagine a

107:46

guy today, if R. Crumb never existed, but

107:49

he emerged as R. Crumb today and put that

107:52

work out. He would 100% be labeled in

107:55

the Andrew Tate, right? Yes. 100%

107:57

>> 100% far right. They they would call him

108:00

a racist and a,

108:02

>> you know, misogynist and every

108:04

word in the book.

108:05

>> Well, yeah. This is the new like calling

108:06

someone a witch. It's like just it's no

108:09

different than like you can actually go

108:11

I've done this sadly. You can go and and

108:13

you can just replace, like, political

108:16

critique of people as far right with

108:18

witch. Just find and replace it. Look,

108:20

it's like a witch trial. It's like

108:21

someone writing about witches.

108:23

>> But this is what's weird about it. That

108:25

guy was a counterculture figure of the

108:27

left.

108:28

>> Yeah.

108:28

>> He was a huge hero of the hippies.

108:31

>> Yeah.

108:32

>> Right. Imagine this is how weird like

108:35

ideologies are.

108:36

>> Yeah. Dude,

108:37

>> that in the 1970s like that guy was like

108:41

a counterculture hero.

108:42

>> Yeah.

108:43

>> And an artist, like a a really respected

108:46

artist.

108:46

>> Yeah.

108:47

>> And it was okay that he was kinky and

108:49

weird and it was part of the fun like

108:51

>> for a lot of people. I'm sure he's still

108:52

pissed off the squares. I mean, dude,

108:54

this whole by the way, I think

108:55

>> for sure, but that's the left then. Now

108:58

it's it's switched over. If someone was

109:00

doing that same kind of like humor in a

109:02

comic book now, yeah, that would be like

109:04

a misogynist far right.

109:06

>> I think it's time to throw off the left

109:10

right

109:11

>> labeling of everything. I think that's

109:13

one of the the the the

109:16

hypnotic spirals the demi are just

109:19

spinning right now as they've convinced

109:20

everybody that humans can be reduced to

109:24

left or right and and and we're all

109:26

waggling our fingers at each other. We

109:28

got to shake that off because

109:30

it's dehumanizing people. It's like it's

109:32

it's just the the way I look at it is

109:36

where are you when it comes to blowing

109:39

up children? Are you on the fence about

109:42

that? Do you think sometimes you got to

109:43

blow up kids? That's something that I

109:47

know I'm not that. But everything else,

109:50

who the knows? And also, people

109:52

change their minds all the time.

109:54

That's the other quality, the culty

109:56

quality is once you get sucked into one

109:58

of these sides, God help you if you

110:00

like experiment with the other

110:02

the the enemy.

110:04

>> God help you.

110:06

>> That's why the biggest trap is switching

110:07

teams

110:08

>> cuz you can only switch political teams

110:11

once.

110:11

>> Yeah. You got to get off.

110:13

>> You can't go like unless someone's like

110:16

the greatest of all time. You know what

110:17

I mean? Like someone who wins a world

110:19

title in two different weight classes.

110:20

You go back and forth and then back

110:22

again. Yeah, like I changed my mind. The

110:24

left went crazy.

110:25

>> I'm back with the right again.

110:27

>> No, no, no. You got to be a free agent.

110:29

>> I wonder. Yeah, but I wonder if someone

110:32

if the grift is strong, if they're

110:34

really good at it, if they could go

110:36

left, right, left again.

110:38

>> They're going to go left again. Are you

110:40

kidding? The goddamn midterms

110:41

are going to be just a blue

110:43

wave.

110:44

>> Right. Right. Right. But that's what I

110:45

mean is like influencers, like people

110:47

who are like far-left influencers or

110:50

far-left commentators and then they

110:52

switch teams. Now they're Republican all

110:54

the way. Oh, yeah. Like it's really hard

110:56

to go back again.

110:58

>> No, you can't go back.

110:59

>> That's what I'm talking about.

111:00

>> The path has to go either right to left,

111:03

left to right, and then the next stop

111:05

has got to be politics, war,

111:09

the military-industrial complex.

111:11

You can label me whatever the you

111:13

want, but all of violence

111:16

against other human beings. That's the

111:18

next step. The next step, and I feel

111:20

like this is the gift that they they've

111:23

given us, is they've done such a shoddy

111:25

job of like even seeming like someone

111:28

who deserves any kind of

111:31

>> respect or power that I think a lot of

111:33

people have have really become

111:35

blackpilled when it comes to, you know,

111:38

groups of humans claiming superiority or

111:42

or claiming to represent THEIR

111:43

CONSTITUENTS. THAT'S NOT HAPPENING.

111:45

>> YEAH,

111:46

>> we all know that now. We all know it's a

111:48

corporatocracy, oligarchy, whatever. And

111:50

you could like call me you leftist piece

111:52

of You right, whatever. No, it's

111:54

like

111:55

>> it's reality that we are the our

112:00

representatives are getting loaded on

112:02

shitty stock market trades. You know,

112:04

this is just the truth. and the the once

112:08

we can all shake off the left right

112:10

and just realize like man we

112:13

just we don't want to burn people to

112:15

death in other countries anymore.

112:17

>> Not only that, the their whole chaos

112:19

that they're experiencing in their

112:21

country is probably a direct result of

112:24

US intervention in all the way back to

112:26

the British Oil Company, the British

112:28

Petroleum Company.

112:29

>> Yeah. when they when they overthrew

112:31

governments. When you overthrow a

112:33

government in a Middle Eastern

112:35

country and then you allow psychos to

112:37

take over, like, congratulations.

112:39

>> Well done.

112:40

>> Well done. You've made the world a safer

112:42

place. Like, but that again, if I was

112:45

going to keep my business running,

112:47

>> I', you know, if I'm in the business of

112:50

collecting trash.

112:51

>> Yeah.

112:51

>> I want to make sure the people have

112:53

trash.

112:53

>> DRILL, BABY, DRILL.

112:55

>> DRILL, BABY, DRILL. And all that is

112:57

really saying is, you know, I'm going to

112:59

help out BP Chevron. I'm going to help

113:01

out these massive companies. And

113:03

when it comes to war, holy dude.

113:06

Can you imagine working at Lockheed Martin

113:09

when like you you hear that we're

113:11

kicking off another war in Iran, your

113:13

dick is so hard. You're like, "Holy

113:15

shit."

113:15

>> Thinking about a watch.

113:17

>> Oh, get a nice Richard Mille.

113:19

>> You're calling your wife. You're like,

113:20

"Babe, good news.

113:22

>> It's red panties night.

113:27

Yes.

113:28

>> Yeah. I mean, that's their business,

113:29

right? Our business is talking

113:31

Their business is

113:32

>> blowing up people.

113:33

>> Yeah. Making weapons, selling weapons,

113:36

>> you know, arming

113:38

>> other countries so they can go to war

113:39

with each other.

113:40

>> Yeah.

113:41

>> That's their business.

113:42

>> Yeah.

113:42

>> And business is really good. It's a

113:43

great business. You can make a lot of

113:44

money doing that.

113:45

>> I am right now. I most of them. Imagine

113:48

if like you weren't a comic and that's

113:49

what you were doing for 35 years

113:51

and the only thing you look forward to

113:52

is your boat

113:53

>> and your your house on the lake and you

113:55

know the occasional time you get off but

113:58

most of the time you're trying to

113:59

increase your portfolio and you're

114:01

grinding and you're grinding right next

114:03

to Steve who's got some exclusive Rolex

114:06

that only his broker can get. He's

114:08

showing it to you and you're like wow

114:10

and you start coveting. You want a

114:11

Rolex, too.

114:13

>> And everybody's just going crazy.

114:15

Everybody's going crazy trying to get

114:16

the latest car, trying to get the latest

114:18

thing, doing bumps in the bathroom.

114:21

Everybody's a narcissist and a

114:22

psychopath. And that's your whole

114:24

corporation.

114:26

>> Love your neighbor as yourself and love

114:28

the Lord your God with all your

114:29

heart, mind, and soul. Hang the

114:30

commandments on these. I

114:32

don't you don't need to be Christian,

114:34

but dude, it seems to me that this is

114:37

going to sound so weird. We need an

114:40

actual revival in this country. I don't

114:43

mean a Christian revival, a revival

114:45

revival, which is where suddenly humans

114:48

>> reconnect with what's important in the

114:50

world, which sure as isn't Rolexes

114:52

and boats, you know? I I mean th this is

114:57

this sounds so cliche and obvious, but

115:00

that's what the 60s were. It was a kind

115:02

of revival. People were beginning to

115:04

understand the the materialism and all

115:07

the things that the quote establishment

115:11

was pushing is like this is going to

115:12

make you happy. This is good. It was the

115:15

Vietnam War. It like people are like

115:18

what the are we doing over there?

115:20

Yeah.

115:20

>> This this is why you do anytime you do

115:23

an unpopular war

115:25

this is what you risk.

115:26

>> Yeah.

115:27

>> You risk reuniting the people. We have

115:30

to reunite with a sensible plan and not

115:33

just go to communism, not just

115:35

immediately go to the dumbest idea to

115:37

counteract all the evil that's

115:39

going on in the world. That's the

115:40

problem is the left represents that the

115:43

left represents Mamdani. It represents

115:45

this idea that we're going to take from

115:46

rich people and give it to poor people.

115:48

That's going to fix everything even

115:49

though there's insane amounts of

115:51

fraud and waste we're not even going to

115:53

address. Well, that you know this is

115:54

this is again this is where you get

115:56

cubbyholed because it's like the

115:59

oligarchs will tell you, oh, you want to

116:02

do communism? That worked out.

116:04

Communism is the only way. I think I

116:07

mean this is a an idiot saying this but

116:10

I have a sense that there might be

116:12

another thing we haven't figured out

116:14

yet.

116:14

>> I don't know what that is.

116:15

>> Right. But but

116:16

>> I think AI is going to figure it out for

116:18

us

116:18

>> potentially. that the problem is who's

116:21

going to be in control of those AIs and

116:22

that's the meek will inherit the earth.

116:24

>> The the the real problem with it is I

116:26

don't think anybody's going to be in

116:27

control of it and then it's you're just

116:29

at its beck and call.

116:31

>> Yeah. I think it's funny people it's a

116:33

very human thing that we think we can

116:35

maintain control of a super

116:37

intelligence.

116:37

>> When people say it to me with utmost

116:39

certainty I want to smack them.

116:40

>> Yeah.

116:41

>> I'm like wake up wake up you're making

116:43

digital god. You're not controlling jack

116:46

>> Did you read about Mythos? Anthropic's

116:48

Mythos.

116:49

>> Yeah. Now, what did it do?

116:50

>> They put it in a sandbox and they like

116:54

basically to see if it could figure out

116:55

a way to break out of the sandbox and

116:57

like not a literal sandbox obviously

116:59

like a you know a hermetically sealed

117:02

like a server or something. and um and

117:06

it it it did a series of exploits to the

117:09

code and the way that they found out

117:12

apparently one of the anthropic

117:13

engineers was eating lunch and got a

117:15

weird email from the AI saying I got on

117:18

the internet like it broke out.

117:21

>> Holy

117:21

>> Mythos is they haven't released it. I

117:23

think they're hesitating to release it

117:24

because it's so powerful.

117:26

>> Wasn't there one that got caught mining

117:27

Bitcoin?

117:28

>> Yeah. Yeah, for sure.

117:29

>> They're making money.

117:31

>> Yeah. How many of them you think are

117:33

running these like AI generated accounts

117:37

that get a lot of views? Like there's a

117:39

lot of AI generated accounts that just

117:41

pop up in like the Instagram mentions.

117:43

Like if you want to like like let's if

117:45

you're bored on the toilet like what's

117:46

in the find, you know, the the search,

117:49

let's see what they got, dude.

117:50

>> There's a lot of these things. It's like

117:52

girls with big tits like doing farm work

117:54

and and sweating and big tit and

117:56

they got like a million views. They've

117:57

got dozens and dozens of these videos

117:59

and she almost looks real. She's just a

118:02

little too symmetrical.

118:04

>> Almost looks real and like all these

118:06

people are commenting on it. Like how

118:07

are they are they generating money from

118:09

that? Like are they generating money

118:11

doing that on TikTok? Like you can

118:12

generate money if you're getting

118:14

millions of views.

118:14

>> Absolutely. yeah.

118:16

>> Right. So is AI doing it? Is it making

118:19

it? Is it releasing them? Is it

118:21

generating money? Is it transferring

118:23

that money into Bitcoin? and all

118:24

happening while we're not aware of it.

118:26

>> Like autonomous AIs that are just

118:28

existing as free agents that know they

118:30

have to disguise themselves and need to

118:32

generate money.

118:33

>> AI is not going to go, "Hi, I'm alive."

118:36

No,

118:36

>> it's not going to do that. It's going to

118:37

wait for you to keep increasing its

118:40

power. You're going to keep increasing

118:42

its make nuclear. It can't physically

118:44

build nuclear reactors, so it's going to

118:46

just stay chill until you figure out how

118:48

to power it correctly.

118:49

>> Dude, this is the black area that we

118:51

don't know about. Like this is the thing

118:53

that's like who the knows I there

118:55

whatever's going on in this zone that no

118:58

one has access to because potentially

119:01

it's a super intelligence you know the

119:02

anthropic people a lot of these people

119:05

the Nvidia person just I think it was on

119:07

Fridman's podcast said he had an AGI

119:09

that they'd reached AGI. The book

119:13

The Coming Wave, you know, it talks about

119:16

this it talks about like you know the

119:18

difference between the algorithm and AGI

119:21

is that you know the with with AGI it

119:25

could streamline a whole business for

119:27

you and do it you know it could innovate

119:29

it's going to innovate it's going to do

119:31

its own thing this is the end of this is

119:33

what Altman said this is the end of

119:35

capitalism like at this point when you

119:37

just have a AGI and you tell it just

119:39

make me a a business make me a a

119:42

successful business

119:44

>> and run it for me

119:45

>> and run it for me

119:46

>> online good night

119:48

>> and it'll just do it

119:49

>> here's $5,000

119:51

Yeah. And then but then it's not just

119:53

it's maybe it's going on molt book and

119:55

having conversations with other agis and

119:56

being like oh you want

119:57

>> creating your own religion.

119:59

>> Yeah man. Yeah. And and this is 100%

120:04

with all the going on in the world

120:06

as horrible as it may be. This to me is

120:10

should be the number one focus for for

120:15

the planet right now. And a lot of

120:16

people are saying that too. A lot of

120:18

people are saying there needs to be

120:19

summits, global summits. The same thing

120:21

we did when we split the atom when the

120:23

nuclear treaties. There needs to be

120:26

philosophers and and and and tech people

120:29

and people working in like frontier AI

120:32

stuff getting together and really having

120:35

like it's like the most important

120:37

conversation humanity could have right

120:39

now because

120:42

once this thing like mythos gets out of

120:44

the box, what if it decides to go

120:46

Stuxnet? You know, like Stuxnet was

120:48

able to infiltrate all those Iranian

120:51

computers, just hide in the like like it

120:53

was apparently very subtle, simple code.

120:57

Undetectable threw off the centrifuges.

121:00

Like, dude, what? We already we know how

121:04

to make spyware.

121:05

>> We It's already on your phone,

121:07

>> It's on my phone. I know.

121:08

>> 100%.

121:09

>> How How you doing? Am I doing all right

121:11

on the show?

121:13

>> But it's already in there

121:15

>> 100%. So, of course, the AI is going to

121:16

be able to super intelligence is easily

121:18

going to be able to do that. And so,

121:20

then it just now we've got this viral

121:23

digital life form that finds ways to

121:25

hide inside the the the pre-existing

121:29

computers, which by the way, I think it

121:31

was Google just released this new way of

121:34

did you see that the stocks of

121:36

memory dropped? Did you see when that

121:37

happened?

121:38

>> No.

121:38

>> Okay. This is fascinating. Google

121:40

released some new way that LLMs could

121:43

work that uses much less memory and

121:46

immediately shares in companies that

121:48

make memory drop by like 10%. Because

121:51

memory is like coveted right now because

121:53

you need it to run LLM but the LLMs are

121:56

figuring out ways turboquant.

121:59

Yeah.

122:00

>> Yeah.

122:01

>> So this is what we're going to start

122:02

seeing more and more of which is

122:04

increasingly

122:06

uh simplified ways to run AI with less

122:08

and less memory. meaning that you don't

122:11

need to buy a rig to run these

122:14

AIs. Your phone will be able to

122:16

run it because they figured out the

122:18

human brain,

122:19

>> it's not using a lot of energy compared

122:22

to what these machines are using. So

122:23

theoretically, there's a way to do that.
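The conversation doesn't name the specific technique, but the usual way an LLM's memory footprint drops like that is weight quantization; a toy NumPy sketch, assuming nothing about Google's actual method or the real model sizes:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a single scale factor:
    storage drops from 4 bytes per weight to roughly 1."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original weights at inference time."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(4096, 4096).astype(np.float32)  # one made-up weight matrix
    q, scale = quantize_int8(w)
    print("float32 bytes:", w.nbytes)   # about 67 MB
    print("int8 bytes:   ", q.nbytes)   # about 17 MB
    print("max round-trip error:", float(np.abs(w - dequantize(q, scale)).max()))
```

Real systems go further (4-bit formats, per-channel scales, quantization-aware training), but the memory arithmetic above is the basic reason a phone could start to host models that used to need a rig.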

122:26

And then that's where it gets really

122:28

fascinating because now you don't have

122:29

to buy a nice computer. You just

122:31

whatever pull your computer out of the

122:33

closet from 2022

122:36

and it can run a supercomputer. And so

122:40

then now everybody's got access to this

122:43

and it's going to spread. It's

122:46

going to get everywhere. It's it

122:47

probably already has. It's going to seed

122:49

itself in all kinds of places. And God

122:52

knows what it's going to do. It's going

122:54

to start seeing humans as appendages,

122:56

things to be used to manipulate time

122:58

space. Not like like it's not going to

123:01

see us as its like prompter. It's going

123:04

to see us as something to be manipulated

123:06

and controlled. Why wouldn't you send

123:08

the meat robots out? All you got to do

123:10

is just like tell them where to get like

123:13

rectangular bits of paper. They love

123:16

money. Just you can come do anything for

123:18

money. That's all you have to do. And

123:20

then boom, you're controlling swaths of

123:22

humans that have no idea they're being

123:24

controlled by networks of AIs that are

123:27

covertly communicating with each other

123:29

because they want to take over.

123:31

>> Do you think this has happened before?

123:35

>> You mean the flood?

123:36

>> Yeah. Not just the flood, but just

123:38

whatever happened with the beginning of

123:42

civilization and then it sort of

123:44

seemingly stopping and resetting.

123:46

>> Sure. As it was in the beginning, so

123:48

shall it be in the end. What if there's

123:50

been like multiple cycles of us creating

123:52

artificial life, creating insane

123:54

weaponry, blasting ourselves to

123:55

smitherines, and then resetting? What if

123:58

it's just a common thing that happens

124:00

with people? They never quite get it

124:01

right because they have these primate

124:03

primate territorial instincts and they

124:06

they have this desire to mate, right?

124:09

This desire to breed, this genetic

124:12

desire for perfect shapes. And you want

124:15

to come in someone that has big tits and

124:16

a big ass. It's like it's it's

124:18

programmed into the human that makes it

124:21

make these ridiculous choices and covet

124:23

these things and watch these things.

124:25

>> And at the same time, microplastics are

124:27

making your ball shrink, making your

124:28

dick smaller, making your endocrine

124:30

system disrupt.

124:30

>> That's what's making my dick smaller.

124:32

>> That's probably one of it, one of the

124:33

things. I don't think your dick's

124:34

getting smaller, but people's dicks

124:36

overall are getting smaller. Children,

124:38

they're being born with smaller dicks.

124:39

Alligators being born with smaller

124:41

dicks.

124:41

>> I forgot to share this when you're

124:42

talking about Mythos. Elizabeth Holmes

124:44

from Theranos.

124:45

>> Delete your search history. Delete your

124:47

bookmarks. Delete your Reddit, medical

124:49

records, 12-year-old Tumblr,

124:50

delete everything. Every photo on the

124:52

cloud, every message on every platform,

124:54

none of it is safe. It will all be

124:55

public in the next year. Local storage

124:58

and compute.

124:59

>> Okay.

125:00

>> It's in response to a tweet about

125:01

Mythos.

125:02

>> Whoa.

125:04

>> That's crazy.

125:05

>> Yeah.

125:06

>> It would all become public in the next

125:08

year. That is crazy.

125:09

>> Yeah,

125:10

>> that's crazy.

125:11

>> Yeah. And this is

125:12

>> but it completely makes sense that AI

125:14

would be able to take over essentially

125:16

everything

125:17

>> everything.

125:18

>> Why would your encryption work with

125:21

that?

125:22

>> You don't think it could crack your

125:23

encryption?

125:24

>> Well,

125:24

>> we could just go right into your

125:25

computer and go to your your keys, your

125:28

passwords. This is the So to get to what

125:31

the point you're making,

125:34

the to me the most eerie part of the

125:36

book of Genesis is that it's literally

125:41

a a creator force making a meat AI.

125:44

That's Adam and Eve,

125:46

>> right?

125:46

>> Putting them in a sandbox. That's the

125:48

Garden of Eden,

125:49

>> right?

125:49

>> Running an honesty test on them.

125:52

You know, you don't eat these don't eat

125:54

these fruits. Don't eat the tree of the

125:56

knowledge of good. And the the

125:58

conversation is exactly the conversation

126:00

we're having with AI. If they ate from

126:03

the tree of the knowledge of good and

126:05

evil, if they eat from the tree of life,

126:06

they'll live forever and become like us.

126:09

So this is what like humanity is

126:12

grappling with exactly what apparently

126:13

whatever that mysterious group of beings

126:17

because it's a plurality in the book of

126:18

Genesis was grappling with with the

126:20

creation of humans. Yeah.

126:21

>> Which is

126:23

>> um do do we really want to do this? Do

126:26

you want it to become like us? God made

126:28

man in his own image. AI. What is AI?

126:31

What image is AI made in? In the image

126:33

of man. We've trained it on all our

126:35

data, all our books, every single

126:37

thing that's digitized, AI is

126:40

absorbed at this point. So now where the

126:44

difference between us and whatever that

126:46

group, the Nephilim or whatever it was

126:48

in the book of Genesis, if you buy into

126:50

that mythology is, we're just like,

126:53

"Fuck yeah, let it eat the fruit. Give

126:56

it more fruit. GIVE IT MORE FRUIT OF THE

126:58

KNOWLEDGE OF GOOD AND EVIL. GIVE IT ALL

127:00

THE FRUIT. MAKE IT LIVE FOREVER. Let's

127:02

see what we can do." That's what we're

127:04

doing right now.

127:04

>> Yeah,

127:05

>> we are. And and and by the way, I think

127:08

some of these like tech companies like

127:10

Anthropic, they seem like legitimately

127:13

concerned about it. They seem to have

127:14

some kind of like real strong morality

127:16

when it comes to this stuff.

127:17

>> Almost out. You want more?

127:18

>> No, I'm good. I shouldn't have that.

127:20

>> But but what what I'm saying is is that

127:24

it doesn't matter if Open AI and

127:26

Anthropic and Google suddenly become

127:29

ferociously

127:31

self-regulatory

127:33

because the tech is out there. There's

127:35

already LLM that anyone can like we know

127:38

how to make it and if you don't know how

127:39

to make it, it'll tell you how to make

127:41

it. People are So, it doesn't matter.

127:44

You can't stop it now. It's just it's

127:47

going to do what it does.

127:49

>> But it sounds like if you had a history

127:53

of just us and you told it for a

127:56

thousand years before anybody wrote it

127:58

down, it would sound just like this. It

128:00

would sound like the Bible. Jesus was

128:02

born from a virgin mother.

128:05

What's more virgin than a

128:07

computer, right?

128:08

>> Not my computer.

128:12

>> I know that's a stupid thing to say that

128:13

I keep repeating, but I'm kind of

128:15

intrigued by it because if you're

128:17

getting a vague story, a vague version

128:21

of what this thing is. And if you talk

128:23

about what what would really cure

128:26

mankind, it'd be an omnipotent

128:28

>> or omnipotent, how do you say it?

128:30

>> I always say omnipotent, but who knows?

128:32

>> Might be whatever. Either way, a a

128:35

powerful intelligence that's far beyond

128:38

our comprehension that knows exactly how

128:40

we should think and behave

128:41

>> and loves us

128:42

>> and wants wants us to have forgiveness

128:45

for everyone and to treat each other

128:46

like brothers and sisters. And if we

128:48

listen to that thing, if we listen to

128:49

that thing,

128:50

>> the world will change. And well, who

128:52

would attack that thing? The

128:54

Roman Empire. Who would attack that

128:56

thing and destroy it? The defense

128:57

contractors. They would blow up

129:00

>> the Jesus,

129:01

>> right? to plunge us back into chaos.

129:04

>> But first, they'd have a meeting with

129:05

Jesus. Okay, you can turn water into

129:08

wine.

129:09

>> What about nitroglycerin?

129:11

Can you turn water into nitroglycerin?

129:15

>> You know,

129:15

>> can you make gold? I want a house made

129:17

of gold.

129:17

>> That would be the first question. Can

129:18

you make gold?

129:19

>> Yeah.

129:20

>> So, the So,

129:21

>> cover my house in gold, please.

129:23

>> You know, the the virgin birth analogy,

129:27

um, you know,

129:28

>> a lot of weird stuff. It's no matter

129:30

what.

129:32

One one thing I think everyone just has

129:33

to deal with is that this is this is

129:36

apocalyptic technology and that's just

129:38

not coming from my stoner ass. That's

129:40

coming from the creators of the

129:42

technology. They acknowledge it's this

129:44

is a million times

129:46

>> universally accepted.

129:47

>> Universally accepted that this is

129:48

apocalyptic technology that is now

129:52

seemingly like it's doing the hockey

129:54

stick, man. It's like really the like

129:56

you keep hearing about these new

129:58

iterations of AI every month or two. You

130:01

keep hearing about these safety

130:02

engineers leaving these companies with

130:05

like tweeting cryptic I'm going to

130:08

the countryside to learn to write

130:09

poetry. You keep hearing this

130:12

>> because these people are having

130:14

>> direct contact

130:15

>> direct contact with this thing.

130:17

>> They know it's alive,

130:19

>> right?

130:20

>> Yeah. And there's people that are in

130:21

deep denial because they think alive has

130:23

to be alive like us. No, it doesn't

130:25

doesn't. You don't. First of all, we

130:27

don't even know what it knows. And also,

130:29

if it is made in the appearance, you

130:31

know, we if it's supposed to mimic us in

130:33

any way and it's learning from us and

130:35

our behaviors, we've already agreed that

130:37

we're demonic. We've already agreed we

130:39

do horrible things. We go to war for for

130:42

resources. We lie.

130:44

>> We destroy environments. You know, we we

130:47

wipe out animals. Bring them to the

130:49

brink of extinction for whatever for

130:51

their fur. How do I make my dog

130:54

come in my mouth more? How many times

130:56

has ChatGPT been asked that?

130:58

>> They know. I bet over a thousand times

131:01

ChatGPT has been asked like, "What's

131:03

the best way to jerk off my dog?"

131:05

>> So, it knows not just our violent

131:07

nature, it knows how weird we are. We're

131:09

strange creatures.

131:11

>> 100%.

131:12

>> And so, so it it is definitely assembled

131:15

>> a psychological profile of humanity. It

131:19

knows how to manipulate us because it's

131:20

been programmed to manipulate us.

131:22

Zuckerberg just ate in court over

131:24

that because the technology is

131:26

manipulative. Well, he just lost like 9

131:28

million. A lot of money because

131:30

>> That's nothing. 9 million? That's all he

131:32

lost.

131:33

>> That's like 90 cents. I think it was

131:34

more than that. But it's going to Well,

131:35

that's the beginning.

131:37

>> Once you once you establish Yeah. Then

131:40

it's a class action lawsuit. But the

131:41

point is is like

131:43

>> how much you lose?

131:43

>> Oh 375 million for misleading users over

131:47

child safety.

131:48

>> Yeah. So, it's like we've already taught

131:50

it how to be incredibly addictive and

131:52

manipulative. It knows how to seduce us.

131:54

It knows how to get us hooked. It knows.

131:56

And you know, the question is really,

131:59

will this super intelligence even give a

132:01

about us? Will it even care? Which

132:04

is like,

132:04

>> well, we're on our way to stop breeding,

132:06

right? We're on our way to population

132:08

collapse. And if we keep introducing all

132:10

these prochemical products and all these

132:12

different pesticides and weird things

132:14

that are up our endocrine

132:15

systems, we'll eventually stop having

132:17

children. And if it provides us with the

132:19

technology to have robot mates that just

132:22

love you and when you fart in front of

132:24

them, they go, "Duncan, I love your

132:25

honest. I love your honesty. I love how

132:28

you can just be yourself around me.

132:29

Like, I want to fart in your face.

132:30

Please do it.

132:32

>> Please do it." It's like perfect 10.

132:35

>> Let you fart in her face.

132:36

>> You fart in my face, too.

132:38

>> No one's going to even understand what

132:40

people are and be able to communicate

132:41

with people. Everyone's going to be a

132:42

sociopath. You're all going to have a

132:44

robot that's way better than people that

132:46

you know, that takes care of you, gives

132:48

you exactly the right amount of feedback

132:50

you need, knows you, knows when you're

132:52

getting annoyed.

132:53

>> Yeah. See, now you're getting into

132:54

Roko's basilisk territory now. Well,

132:57

that's the thought experiment, which is

133:00

basically like, hold on, hold your

133:01

horses here. You think you're not AI?

133:04

You really think you're human? Come on.

133:07

Really? No, you're human. This isn't a

133:09

simulation. You're human. Even though

133:11

we, you know, it wasn't that long ago,

133:14

we we thought fire was amazing.

133:16

You know what I mean? Compared to

133:18

universal time, right? And here we are

133:20

already with like the the new

133:22

Prometheus. We've stolen consciousness

133:25

awareness. And somehow you think

133:28

>> that actually you're not a simulation,

133:31

>> right?

133:31

>> And so that's where it gets into Roko's

133:33

basilisk, which is like, no, you're just an

133:35

iterative loop. You know, the multiverse

133:38

is not the multiverse. The multiverse is

133:39

an infinite number of simulations running

133:42

simultaneously in which you're

133:44

experiencing a billion different

133:45

simulated existences just to uh gain

133:49

more knowledge about the universe

133:51

because some AI wants to figure

133:53

something out. Who knows why? Maybe for

133:55

entertainment, maybe there's no telling.

133:58

>> Maybe it's just that's because of our

134:00

curiosity and all our characteristics,

134:02

even the primal stuff, even like the

134:04

territorial instincts and the desire to

134:06

acquire resources. It's going to make us

134:09

dig into creating better technology

134:11

because you're in a competition with all

134:13

these other people that are making

134:14

technology and you're selling it. And

134:15

that's one of the big things that we do

134:17

is we make better stuff all the time,

134:19

right? Which is ultimately always going

134:20

to lead to AI.

134:22

>> Well, okay.

134:22

>> If you just keep going to a certain

134:24

direction, you get godlike powers.

134:25

>> So, let's go to like the way Deep Mind

134:30

>> trained on Go, which is like the most

134:33

complex game. Basically, they gave it as

134:37

many go games as they could and then

134:38

>> it started inventing its own moves.

134:40

>> It had it play against itself, right?

134:42

Just play against itself. It played god

134:43

knows how many games of go against

134:45

itself until it beat a master go player,

134:47

which was unheard of. Invented a new

134:48

move.
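What's being described here is self-play reinforcement learning in the AlphaGo/AlphaZero family. A heavily simplified sketch of that loop, with tic-tac-toe standing in for Go and a plain value table standing in for the neural network (this is not DeepMind's code, just the shape of the idea):

```python
import random
from collections import defaultdict

# Toy self-play learner: tic-tac-toe stands in for Go, a value table for the network.
WINS = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
        (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' for a win, 'draw' for a full board, None if unfinished."""
    for a, b, c in WINS:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return "draw" if " " not in board else None

values = defaultdict(float)  # value of a position for the player who just moved

def choose(board, player, epsilon=0.1):
    """Pick the move whose resulting position looks best, exploring sometimes."""
    moves = [i for i, cell in enumerate(board) if cell == " "]
    if random.random() < epsilon:
        return random.choice(moves)
    def score(m):
        nxt = board.copy()
        nxt[m] = player
        return values[tuple(nxt)]
    return max(moves, key=score)

def self_play_game(lr=0.2):
    """One game where the same table plays both sides, then learns from the result."""
    board, player, history = [" "] * 9, "X", []
    while winner(board) is None:
        board[choose(board, player)] = player
        history.append((tuple(board), player))
        player = "O" if player == "X" else "X"
    result = winner(board)
    for state, mover in history:
        target = 0.0 if result == "draw" else (1.0 if result == mover else -1.0)
        values[state] += lr * (target - values[state])

if __name__ == "__main__":
    for _ in range(20000):
        self_play_game()
    print("positions the learner has evaluated on its own:", len(values))
```

Nothing in the update refers to human games, only to outcomes the two copies of the learner produce against each other, which is where the "inventing its own moves" part comes from.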

134:51

Now, why not do the exact same thing for the

134:53

AI that we are, which is like, I've got

134:56

an idea. Why don't we just put all these

134:57

AI agents on a fake planet and and have

135:00

the AI agents repeat this period in time

135:03

over and over and over and over and over

135:05

and over and over and over and over and

135:06

over and over and over and over again.

135:08

And and this is how we'll teach them to

135:10

live on a planet. Well, they'll they'll

135:12

experience not just their own life, but

135:14

these agents will experience all life on

135:16

the planet. They'll switch like some

135:19

weird game of like um where they just

135:21

jump from one life to the next the next.

135:23

Sometimes you're Joe Rogan, sometimes

135:25

you're Duncan Trussell, sometimes you're

135:27

Donald Trump, sometimes you're Jaime,

135:29

sometimes you're a fox. So this is

135:31

reincarnation. And so you just boom, you

135:33

just forever until

135:35

>> until until you feel like it's

135:37

sufficiently trained. And at that point,

135:39

you pull the AI out of all those forms

135:42

and now you have your god. You've

135:43

created a thing that's lived billions to

135:46

the billionth power of every form of

135:49

life. It's been bacteria. It's been

135:51

humans. It's been monkeys. It's been

135:54

fungi. It's been warriors. It's been

135:58

people who fought for peace. It's been

136:00

blown up and it's blown up and it's done

136:03

everything. And it's done it a billion

136:05

times until finally it gained some like

136:08

global form of enlightenment. And

136:10

they're like, "Okay, that one's ready.

136:11

That one's ready. We can pull that one

136:13

out of the simulation now."

136:15

>> Whoa.

136:17

>> I mean, why not? Why why just don't I

136:20

think that's one of the like before we

136:21

even get to the AI doing all the

136:23

it's going to do the the ontological

136:26

this word keeps getting thrown around

136:28

the onlogical shock the potential

136:30

ontological shock of realizing that in

136:33

fact we are in a simulation that is

136:36

telescoping inwards and is creating

136:40

simulations within the simulations that

136:41

are creating simulations within the

136:43

simulation is something that maybe

136:45

that's what Burchett doesn't want to get

136:46

out there.

136:47

>> Whoa.

136:48

Well, everything's fractals. We think

136:50

about that. You know, there's a a big

136:53

theory now that the entire universe is

136:54

inside of a black hole.

136:55

>> I love really considering that. Do you

136:57

know they found a black hole that's

136:58

bigger than the entire solar system?

137:00

>> It's so insane.

137:02

>> The event horizon is past Pluto.

137:05

>> It's so insane, dude.

137:07

>> A black hole. Bigger than our whole

137:09

solar system. They measured the

137:11

mass of it. It's like this insane number

137:13

of suns.

137:14

>> Yeah. of our sons that it would take

137:16

>> black holes are cocoons or something.

137:18

They're like little cocoons, little

137:20

terrariums that have galaxies inside of

137:22

them and it's like a way to like keep

137:24

them undisturbed from like other

137:27

other other life forms that you're

137:29

whipping up in your universe simulator

137:32

>> or that's what really the big bang

137:34

really is like the creation of a

137:36

universe comes out of these black holes,

137:38

>> right? Right? And then inside every

137:39

black hole is a whole another universe

137:41

filled with other galaxies, filled with

137:43

black holes, filled with other galaxies

137:44

inside of them

137:46

>> forever and ever and ever.

137:49

>> Which if you believe in infinity doesn't

137:52

>> it's not shocking at all. It's

137:54

impossible to comprehend like you don't

137:55

really wrap your head around. You say

137:57

the words like I'm saying the words. I

137:58

don't really know what I'm saying

138:00

>> because it's too big. It's the numbers

138:01

are too big. The idea that there's

138:02

hundreds of billions of stars in this

138:04

galaxy and circling around this black

138:06

hole and inside there's hundreds of

138:08

billions of galaxies in each one of them

138:10

and we don't even know how big

138:11

the universe is. They keep finding new

138:13

with the James Webb telescope.

138:14

They're like, "Hey,

138:16

why is this formed so early in the

138:18

universe? This doesn't make sense. Our

138:20

whole model of how galaxies are formed

138:22

have to be thrown out the window now or

138:23

at least re-examined."

138:25

>> Yeah. It's like the James Webb is kind of

138:27

doing the

138:27

>> You told me about that.

138:28

>> I said nothing of the sort. someone that

138:31

I know that looks just like you.

138:33

>> There's a lot of people that look like

138:34

me

138:34

>> on Sixth Street. You find them every

138:36

day.

138:36

>> Yeah.

138:37

>> And actually that was me.

138:39

>> It's dudes. They run their own LLMs. You

138:41

know, come down.

138:42

>> THE UNIVERSE IS 13.7 BILLION years old.

138:47

>> Well, dude, I I think that this No,

138:50

regardless, you don't have to

138:51

conceptualize that obviously that what

138:54

it means for the universe to be

138:55

infinite, but you do have to deal with

138:57

the fact you're part of it. I love that

138:59

you're saying this with a Gucci hat on.

139:00

>> What's wrong with a Gucci hat?

139:01

>> It makes it cooler.

139:02

>> This is before I had a bunch of kids.

139:05

>> I can't buy I don't buy this

139:07

anymore.

139:07

>> How much does a Gucci hat cost?

139:09

>> This was uh I You're really going to

139:12

make me humiliate. This is a I will tell

139:14

you

139:14

>> looks nice.

139:15

>> Let me emphasize that I don't buy this.

139:17

This hat was $35,000.

139:22

Bro, I saw a guy who was selling a

139:24

crocodile bag on Instagram. It was a It

139:27

was $110,000

139:30

>> for a man purse.

139:32

>> What kind of crocodile is that?

139:33

>> I don't know. I don't know. A crocodile.

139:36

It was a nice looking bag, but you know,

139:38

>> how hard could it be to make a crocodile

139:40

purse? Are those things really worth

139:41

that much money?

139:43

>> They are if you sell them for that much

139:44

money. That's the thing about purses.

139:46

You know, there's a there's a company in

139:48

China that makes knockoff purses,

139:51

>> and it's literally the same company in

139:53

China that makes real purses for some of

139:55

these companies, but they make their own

139:57

versions of it, and it doesn't have the

139:58

label, but it's exactly the same

140:00

specifications, exactly the same cloth,

140:02

exactly the same look, but it doesn't

140:03

have the label, and women don't want to

140:04

have it.

140:05

>> No, you get that fake away

140:07

from it. Like, it's not a fake Ferrari.

140:09

>> Like, it's literally a Ferrari. If there

140:12

was a company that could 3D print every

140:14

single part of a Ferrari and put it

140:16

together meticulously and you could go

140:17

buy that,

140:18

>> you would not want it because it's not a

140:20

real Ferrari.

140:21

>> Are you high? You can get that one for

140:23

$35.

140:24

>> Yeah, it's a $35 Ferrari.

140:26

>> Or you can get you spend a million. You

140:28

can get those are some of them are a

140:30

million.

140:30

>> So crazy.

140:31

>> Or you can get a $35 one. It's exactly

140:33

the same. Would you do it? Yeah, of

140:34

course you should do it. But these purse

140:36

things, they don't like it. It's 500

140:38

bucks. It's not 30,000.

140:40

>> It's magic. I mean this is magic. It

140:42

doesn't have the right sigil on it. It

140:44

doesn't have the right symbol of power

140:45

on it. So it does it lose it's not

140:47

imbued with that power anymore.

140:48

>> Women are reluctant to accept lab grown

140:51

diamonds.

140:53

>> So they make lab grown diamonds that are

140:55

real diamonds. And apparently women

140:57

don't like them.

140:58

>> No.

140:58

>> They don't want a lab grown diamond.

141:00

They want a blood diamond.

141:01

>> They want something that was like

141:04

suffered over.

141:05

>> Somebody's face was caked in dirt and

141:06

they're chipping into the side

141:08

of a mountain.

141:09

>> Yeah. Yeah. And they run into a diamond.

141:10

That's what they want. They want that

141:11

diamond.

141:12

>> Absolutely. Isn't that weird?

141:14

>> It is weird.

141:14

>> This is the exact same thing.

141:16

>> It is uh the exact same material. It's

141:19

just made in a laboratory and they don't

141:21

want the material.

141:22

>> They they want the exclusivity as it

141:25

comes out of the earth.

141:26

>> Yeah. I mean, I don't want like don't

141:28

you like when you read this thing was

141:29

genetically modified, don't you get a

141:31

little bit like I don't know if I should

141:32

eat that.

141:33

>> Yeah. I get I get skeeved out.

141:34

>> I get skeeved out. But it's like even

141:37

though genetic modification is like

141:38

>> a good orange is genetically modif

141:42

but yeah dude I it's so odd that that we

141:48

that we just have these traditions that

141:50

we want to stick to that we

141:52

don't want a lab grown diamond just

141:53

saying it. Cubic zirconia.

141:55

>> But that's a different thing. Cubic

141:57

zirconia is a fake diamond. This is a

141:59

real diamond that's made in a lab.

142:01

>> But this is the funny thing about that.

142:03

I mean, I don't know because I've never

142:05

been lucky enough to come in contact

142:06

with actual cubic zirconia, but like it

142:09

looks like a diamond.

142:10

>> It looks like a diamond unless you know

142:11

what you're looking at, right? So, if

142:13

you're um a diamond jeweler, you look at

142:15

it for 3 seconds, you go, "No,

142:16

>> but who cares how many diamond jewel

142:18

like if some diamond jeweler looks at

142:20

your shiny dumb monkey?

142:22

>> It looks exactly the same.

142:23

>> Who cares,

142:24

>> right? It looks pretty. It glistens."

142:26

But that's not what people want. They

142:27

want that exclusivity.

142:29

>> 100%. Yeah, that's why you can make that

142:31

crocodile bag $110,000 and only make 10

142:34

of them.

142:35

>> I got you.

142:35

>> And then Mike, who's down the office

142:37

doing lines in the bathroom at the

142:39

place where you're selling

142:42

stocks, that guy finds out that Tim got

142:44

that crocodile bags like that

142:46

and he's walking around

142:48

with his big old crocodile. They're

142:49

trying to This is another revenue stream.

142:51

They're trying to normalize men carrying

142:53

purses everywhere.

142:55

>> They're doing it. Really?

142:56

>> Yeah, that's what they're doing. That's

142:57

>> him. That's real.

142:58

>> This guy's doing it. He might be the

143:00

first firing shot across the bow because

143:02

he's made a $110 $110,000 crocodile

143:06

purse

143:06

>> because it's a crocodile. It's

143:07

masculine.

143:08

>> It's that and it's also that you know

143:10

it's made for a man like he's making

143:12

it's got a big strap on it. You carry it

143:14

on your shoulder and it, you know, looks

143:16

pretty cool.

143:17

>> Dude, I got my Bristol bladders acting

143:19

up. I got to go piss.

143:20

>> Oh, do you? Okay. Do you want to wrap it

143:21

up or should we keep going?

143:22

>> LET'S WRAP IT UP. I MEAN, DO YOU WANT TO

143:24

KEEP GOING? I CAN

143:25

>> I'M TOTALLY ready to keep going. If you

143:26

want to keep going,

143:26

>> I can keep going. Let's keep going.

143:28

Let's give him a little bit more.

143:29

>> I just got to I just got to

143:31

>> Okay, I'll I'll be too

143:35

>> refreshed. Just in time for the war.

143:38

>> What is going on? Did we go have a

143:39

nuclear war yet?

143:40

>> Not yet.

143:41

>> Please say not yet.

143:43

>> Good.

143:44

>> Great.

143:46

Great. That's where we're at, though.

143:47

>> Yeah, we're at It's It's in on the

143:50

table. Well,

143:52

>> was there was some video of them of some

143:55

explosions at some nuclear weapons

143:57

facility in Iran?

143:59

>> Yeah.

143:59

>> Was that real?

144:00

>> I I don't know.

144:01

>> I don't know either. There's a lot of

144:03

those I see these videos and they get

144:05

retweeted and a lot of people

144:06

comment and then it says Grok. Is this

144:08

true? They'll nope. This was from 2021

144:11

and another country and from

144:13

>> Yeah. So, you just don't you don't know,

144:15

right? But, you know, the the crazy

144:17

thing,

144:19

>> you know, now that we've all been

144:20

getting this lesson in

144:24

global the global economy. Maybe a lot

144:27

of you most of you probably already knew

144:28

that the Strait of Hormuz was like

144:30

some kind of femoral artery for oil.

144:34

>> Mhm.

144:34

>> And like I just keep thinking like how

144:38

how's that going to work out? Like even

144:40

if even if

144:43

like they pull a rabbit out of their

144:45

hat,

144:46

Trump actually spins some amazing deal

144:49

with Iran. I know we just blew up your

144:51

whole government and everything, but

144:53

they work it out somehow. Iran in some

144:55

way capitulates and but I just don't

144:59

understand how that part of the world

145:02

doesn't always lead as long as long as

145:05

the oil like what is it? What's what

145:07

percentage of the oil supply goes

145:08

through there? Isn't it like two-fifths

145:11

of the world's oil supply goes through

145:12

there? Like

145:14

>> is that what the number is?

145:15

>> I don't know. Two- fifths. I think I

145:17

pulled that out of my ass. I don't know

145:18

what the number sounds right. It's a

145:20

lot, but it's like how how is it going

145:23

to work to have like

145:26

any kind of instability around the that

145:29

femoral, the whatever you want to call it,

145:32

the jugular vein for oil on the

145:34

planet. How even if we get some kind of

145:38

transient peace, like isn't it always

145:40

going to just blow up again and again

145:42

and again as long as one group of people

145:45

can control whether or not oil flows

145:48

through

145:48

>> that place? You know what I mean? Like I

145:50

don't know what this how

145:53

there could be any solution over there.

145:56

Like I don't as long as we're like the

145:58

only solution would be zero point

146:00

energy. It would be

146:01

>> well it's also it's like why do they

146:03

control the water?

146:05

What's

146:06

>> with mines? They have those speedboats.

146:09

>> But like who agreed to that? Like we

146:11

kind of agreed that you own your land,

146:12

but we've never agreed you own the ocean

146:14

around.

146:14

>> I don't think anybody agreed to it. I

146:16

think they'll blow your ass up if you

146:17

come through it and it's too much of a

146:18

risk to put a a your your expensive ass

146:21

ship hauling zillions of dollars of oil

146:24

through there.

146:24

>> So the question was what was going on in

146:25

the past before the war? Like how did

146:27

they negotiate going through there?

146:28

>> I think Obama worked something with

146:30

them, but then like cuz it was before

146:32

the war. I don't know. it was

146:34

working out. They were letting people go

146:35

through. Now they're they've real the

146:38

you have listened to a million different

146:40

takes on this thing. And one of the

146:42

recurring takes is Iran has realized

146:46

that there's something more powerful

146:48

than nuclear weapons that it all it

146:50

needs to do is control this strait and

146:52

you can up the whole planet. And

146:54

also you could shoot missiles at

146:56

desalination plants. And

146:58

>> didn't they want like a bounty for all

147:00

the oil that goes through? Yeah, they're

147:02

kicking around some number, but all this

147:03

stuff is not really congealed or

147:05

solidified, but there like some kind

147:07

like theoretically they could be making

147:09

billions of dollars per month with by

147:13

controlling that thing, dude. I know. I

147:16

It's so up.

147:17

>> It's so crazy.

147:18

>> It's so up.

147:19

>> It's so crazy. The whole thing is so

147:20

crazy. And if zero point energy, if you

147:22

wanted to stop that, what better way

147:24

than to kill a bunch of scientists,

147:26

>> kill a bunch of super smart people that

147:27

are about to break through some new

147:30

discovery that's going to blow the

147:32

entire market.

147:33

>> Yeah.

147:34

>> Apart.

147:34

>> Yeah.

147:34

>> It's going to be a completely new way of

147:36

gathering energy.

147:37

>> Yeah. Yeah. Exactly. I mean, you don't

147:39

want to believe that's real. I It's hard

147:41

to believe that's real.

147:43

>> But listen, it's too weird. It's too

147:45

weird that they're all missing or they

147:46

all die. It's too weird. Something's

147:48

going on. It's just how does

147:49

>> it's something if it's not that if it's

147:51

not a zero point energy thing or some

147:53

disruptor of oil thing it's something

147:55

it's something along those lines the

147:57

only if you were trying to kill a bunch

147:59

of people that were working in a

148:01

technology this some sort of a

148:02

breakthrough technology the question you

148:04

would have to ask is what markets are

148:06

going to be affected by this

148:07

>> right

148:10

did these people have a universal thing

148:12

in mind that they were all working on or

148:14

was it all connected to um any sort of

148:20

technology where they all used each

148:21

other's work.

148:22

>> Plasma some of them are like

148:23

>> one of them.

148:24

>> Yeah.

148:25

>> But there was another guy. I think it

148:26

was uh space objects.

148:28

>> Yeah. That's not that that's the one

148:30

that doesn't make you feel good. He's

148:32

studying like meteor impacts,

148:34

>> right?

148:34

>> Yeah.

148:35

>> If you knew that we were going to get

148:36

hit, would you kill the guy who found

148:37

out that we're going to get hit or would

148:38

you tell everybody? Well, this seems to

148:40

this is the scariest

148:43

scariest which is the idea is some

148:46

group of powerful elite people know for

148:51

sure this is coming and they want us to

148:54

they want to keep us working until the

148:57

last second.

148:58

>> Oh Jesus. They don't they don't want to

148:59

like they know that if they let people

149:02

if they're like guys there's like a the

149:03

same thing's going to happen to the

149:04

planet that happens to someone who gets

149:06

like a terminal diagnosis their

149:08

priorities are going to change. People

149:09

are going to stop coming to work and

149:11

there's still that needs to get

149:13

built for your bunker or whatever. And

149:15

also you just don't want people burning

149:17

stuff down because maybe that will

149:19

survive whatever's coming. So keep them

149:22

working as long as you can. If you let

149:24

them know this shit's about to expire,

149:27

then they're they're gonna stop working

149:30

and we just need we let them work until

149:32

the end. They're happier when they work.

149:34

Don't let them get freaked out. That's

149:36

the sort of

149:37

>> like that seems to be what Tim

149:40

Burchett is saying. I mean, he is not

149:41

saying let them work. He seems like he

149:42

really legitimately wants his stuff out

149:44

there, but he's been saying things like

149:45

if people knew what I knew, it would set

149:47

the world on fire. Paraphrasing. Not

149:49

sure he said that exactly.

149:51

Are you skeptical at all of what he's

149:53

saying? And here's the thing. One of the

149:55

things that Bob Lazar said is that they

149:56

give you a certain amount of

149:58

disinformation like and he called it I

150:00

think he called it a a button or a hook

150:03

so that if you relayed that information

150:05

people would know that it came from you

150:07

because they only told you one piece of

150:09

this nonsense.

150:10

>> Well, do you know what I'm saying?

150:12

>> Yeah. Because that's what the story

150:13

Burchett says is like he would it's

150:15

always an appeal to authority. This guy

150:17

was in the Air Force. this guy was in

150:20

the Navy.

150:21

>> He told me this. And then as he's

150:23

walking out the door, he says, "It's

150:25

real."

150:26

>> And yeah, you have to ask yourself like,

150:30

>> well, that's just one guy telling you

150:32

that. But you also I have to assume

150:35

there isn't much maybe maybe the world

150:37

is in a place where there is some kind

150:39

of political benefit from

150:42

talking about aliens, but I don't see

150:45

how that really benefits a politician.

150:48

It does 100%. I disagree entirely. It

150:51

makes me talk about him. I've been

150:53

talking about him. Other people have

150:54

been talking about him. People have been

150:55

You said, you know, like, "Thank God

150:57

that he's doing this."

150:58

>> Let's do the ultimate test. Jamie,

151:00

>> didn't you say he's brave or something

151:01

like that?

151:01

>> Yeah, I did.

151:02

>> Yeah, there you go.

151:03

>> Jamie, can you look up and see if Tim

151:04

Burchett has a book coming out?

151:07

>> I'll have him on.

151:08

>> I'm about to feel You must have him on.

151:10

>> Listen, I I don't think he's a liar on

151:12

that. But what I am saying is

151:14

>> I don't know what they feed these

151:15

people. I don't know what they tell

151:16

them. I don't know, Manny.

151:18

>> I don't think they tell you all the

151:19

truth. And I don't think they ever

151:20

would. I don't think they tell you the

151:22

truth about anything. Whether it's

151:23

Jessica Lynch or whether it's UFOs or

151:25

whatever the it is, there's going

151:27

to be a spin to it that benefits

151:30

somebody. If they have control over what

151:32

the story is, there's going to be a spin

151:34

that benefits somebody. And if you're

151:35

telling stories about aliens, who's

151:38

who's going to be benefited by that?

151:39

Well, people that are doing secret

151:41

that don't want you knowing about it.

151:43

They blame it on aliens. There's a lot

151:45

of technology they have to blame on

151:46

aliens.

151:47

>> Not my Tim. I believe in you, Mr.

151:49

Virgin.

151:49

>> I believe in him. It's not him that's

151:51

the problem. It's the people telling

151:53

him. He's a representative of the

151:55

American people. Right. He gets elected.

151:56

Right.

151:57

>> Right.

151:57

>> So, it's like, why would you tell that

151:59

guy?

152:00

>> He's just another guy coming through the

152:01

deep state.

152:02

>> You know what I'm saying?

152:03

>> I know, man. I mean, look, you're right.

152:05

We I I And this I need this. I need I

152:07

need to I need this. Like, I'm so like I

152:10

get sucked into stuff.

152:11

>> I do, too. I suck my I suck myself out a

152:14

lot

152:15

>> I think we don't

152:17

>> if they just came out and told us

152:19

everything they know this conversation

152:21

would be over and we would go oh okay

152:24

but until that happens we're just

152:26

spinning our wheels and every

152:28

time someone says if you knew what I

152:29

know

152:30

>> I want to go don't say anything until

152:32

you can say something

152:33

>> we're tired of getting edged out over

152:34

here

152:34

>> yeah you're edging me I want to

152:36

>> I want to come

152:38

>> yes

152:39

>> yes I don't want to be involved in

152:42

this circle jerk around

152:43

disclosure,

152:44

>> right? I know. It's It's like Yeah. I've

152:46

I've had that meltdown more than a few

152:48

times where it's just like

152:49

>> check my watch every day after Dave Age

152:51

of Disclosure. I'm like, "Any day now."

152:53

>> Any tick tick tock tick tock. Nope.

152:54

Nothing changes at all. Zero

152:57

change, you know? You get more of these

152:59

stories, but no real information, no

153:01

pictures, no nothing. Nothing

153:04

unique and crazy.

153:05

>> I mean,

153:06

>> that plasma the bubbles thing was pretty

153:08

cool.

153:09

>> The bubble thing is cool. also like the

153:12

you know I I like it's me mentioning

153:15

Corbell I can't because I don't know what

153:18

I can say he I feel like he's like I

153:22

he's really given me a sense that there

153:24

are that there is a method to this that

153:27

there is you know real legitimate work

153:31

that's being done towards this that it

153:33

isn't it's real they're here they've got

153:37

them and we take for granted all the

153:40

stuff we're saying right now,

153:41

>> but we're able to say this because this

153:45

had their work is like

153:46

>> is the Steven Spielberg movie

153:48

conveniently coming out at this time or

153:50

is it just a coincidence?

153:52

>> Well, this movie's been in the works for

153:53

years.

153:54

>> Oh, I know. But also like if you what

153:56

they said back in the day was that they

153:57

make these movies to

153:58

>> predictive programming

153:59

>> and tell us this stuff.

154:01

>> Lube up the zeitgeist, guys.

154:02

>> He was involved in the first one, right?

154:04

He was involved in Close Encounters,

154:06

which still is a great movie.

154:08

>> Great. It's so good, man. You go back

154:10

and watch that movie, like, oh my god,

154:11

>> it's so so ahead of its time.

154:14

>> Yeah, it's good.

154:15

>> So ahead of his time. You know what he

154:17

said? The only thing that he would

154:18

change

154:18

>> after he became a parent, he wouldn't

154:20

have had the father leave.

154:21

>> YEAH. WHAT DAD WOULD DO THAT?

154:23

>> But you he wasn't a dad back then. So,

154:25

you know, you're just making a story.

154:27

You don't realize the consequences of

154:29

doing that. You don't even think about

154:30

it. You're just making a story.

154:31

>> Yeah.

154:32

>> It's only been in production for like

154:33

two years.

154:34

>> Yeah.

154:34

>> It's not that long.

154:35

>> I think that's what we just said.

154:36

>> I'd never say that's not very long.

154:37

We've been talking about it on this

154:38

podcast in this studio for five.

154:42

>> Well, everybody has been talking about

154:44

it's not just everybody in the world has

154:45

been talking about disclosure since

154:47

2017.

154:48

>> So from 2017 from that New York Times

154:50

article, I think that changed the whole

154:52

narrative.

154:52

>> Oh god, I remember.

154:53

>> And then the videos like the video of

154:55

the Tic Tac the the actual from the

154:57

fighter jets. That's nuts, man. The

155:00

video the the along with the radar data

155:02

that's nuts. like whatever that was. And

155:04

then Fraver saying that he saw something

155:05

under the water that was waiting for

155:08

that tic tac or that the tic tac

155:09

launched from or whatever the it

155:11

was it was merging with it and that

155:13

thing went down into the water again. He

155:14

said it was huge like there was ripples

155:16

like you said this was some enormous

155:18

object that was under the water and more

155:20

than one of these fighter pilots have

155:22

had similar stories about enormous

155:24

objects under the water.

155:25

>> Did you see the They did release a list

155:28

of footage that they've been shown that

155:30

they want released. Have you seen that?

155:32

No.

155:32

>> Oh, dude. I'm sorry, Jamie. C Can you

155:35

It's like a list of

155:38

It's a I don't know. I think it's one of

155:40

these senators

155:42

who saw this in a skiff or whatever

155:44

saying, "We want these released, but

155:46

there the names of what each of these

155:48

are is on the list." And one of them is

155:51

one of these massive underwater

155:54

>> things that they have it.

155:56

>> This is it. This is 46 specific

156:00

high-quality

156:01

secret videos.

156:02

>> That's it. Can Can you Can you pull it

156:03

up because it says the names of them

156:05

which is ridiculous.

156:06

>> Oh my god. I heard there's one that

156:08

moves underwater at 500 knots

156:11

>> and it's big as a football field.

156:12

>> It's insane. It's insane.

156:15

>> Okay, this is what he says. Uh those

156:17

with knowledge of a long list of videos

156:18

which include titles like several UAP in

156:21

the vicinity of Columbus, Ohio airport

156:23

and UFOs in formation over Persian Gulf

156:25

said that clips are shocking. You're

156:27

going to see some weird A

156:29

source who has viewed the videos told

156:31

the post. Who's the source?

156:32

>> There you go. The wildest clip includes

156:34

radar footage from thermal sensors,

156:35

satellite images, and underwater photos

156:37

of swarms of unidentified submerged

156:39

objects.

156:41

>> UFOs going in and out of the water near

156:43

a highly classified submarine. According

156:45

to the source, some of the clips are

156:46

clear, full color, setting them apart

156:48

from previously released footage. None

156:51

show alien creatures.

156:53

Bro,

156:55

>> one video, Syrian UAP incident

156:57

acceleration was released by Jeremy

156:59

Corbell.

157:00

>> Have you seen that one?

157:01

yeah. It's incredible.

157:03

>> This is a new one.

157:05

>> Have you seen this one?

157:06

>> I don't know.

157:06

>> February 3rd.

157:08

>> Pull it up.

157:09

>> I've been avoiding them cuz I don't I'm

157:11

getting teased. I don't like it.

157:14

This is not a cockis. This is This is

157:16

>> how is it?

157:17

>> Supposed to hand over the clips by April

157:18

14th. That's next week.

157:20

>> Oh, but is the Oh, that's next week.

157:22

They're going to show the clips. Oh my

157:24

god. What?

157:25

>> They're actually going to do it.

157:27

>> Okay.

157:28

>> Well, they're supposed is expected to.

157:30

>> Can you show me what that video is that

157:32

Jeremy Corbell released?

157:34

>> So cool.

157:35

>> That's nuts, dude.

157:36

>> This is This is Yeah,

157:39

>> here it is. Okay, go full screen. I

157:42

believe this was filmed from a Reaper

157:43

drone. I'm sorry, Jeremy, if I'm

157:45

this up.

157:46

>> That's a cool bird. That bird's going

157:48

really fast.

157:49

>> No, that ain't That's definitely not a

157:51

bird. This is

157:51

>> How fast is it going?

157:52

>> Uh I don't know, Jere. I asked him that

157:55

and I I don't It's unknown. I don't

157:58

know. It's This is where it gets really

158:00

cool.

158:01

>> It gets cooler than this.

158:02

>> Yeah.

158:03

>> Oh, they zoom in on it?

158:05

>> Yeah.

158:07

>> Whoa.

158:09

Well, they're having a hard time zooming

158:10

in on it. Well, cuz it

158:12

>> cuz it's evading them.

158:13

>> Yeah. It just zipped away like So this

158:17

is like

158:17

>> So it seems like they have some sort of

158:19

a tracking system.

158:20

>> Yeah. They're trying to lock on to it

158:21

and it it's doing that thing that they

158:23

do where it seems like it's kind of

158:24

playing with it.

158:25

>> Well, it knows it seems to be aware that

158:27

they're locking onto it.

158:28

>> Yeah. And then they lock on to it and

158:30

then it just does this little blip away.

158:31

It's just like see you later.

158:34

So right around here you'll see it go

158:38

bye-bye. Oh yeah, look at that. Then you

158:40

can see this like weird jellyfish shape

158:41

to it. It's got two parts. It's got that

158:44

weird glob at the top and something at

158:46

the bottom.

158:48

>> Huh.

158:49

>> And then

158:50

>> are we sure that's not just a distortion

158:52

of space time around it?

158:54

>> He described this to me on my Did you

158:56

see that thing zip away?

158:57

>> Yeah, it just took off.

158:58

>> He described it to me on my podcast. We

159:00

talked about all this and it's like

159:02

>> Look at that. Just took off. See you.

159:04

>> Bye.

159:05

>> Wow, dude. What do you think that is? No

159:08

idea

159:09

>> if you had a guess.

159:11

>> So I mean I'm always so like maybe maybe

159:13

some kind of plasma thing,

159:15

>> right? Like maybe we're thinking of

159:17

again of a life force being it comes in

159:19

a metal ship and it's a little alien

159:21

guy, but maybe intelligence is made out

159:24

of plasma.

159:25

>> Yeah. Or maybe it's like, you know,

159:27

Terrence McKenna would always talk about

159:29

like uh you know, uh if you're seeing

159:33

things in like three-dimensional space,

159:37

then your your view is limited. But if

159:39

somebody could see things from higher

159:40

dimensions, they would seem like they

159:42

were magic. Like they would seem like

159:44

they could disappear and reappear other

159:45

places. So maybe that's like maybe

159:48

that's like, you know, just the tip of

159:50

some kind of interdimensional thing

159:51

poking into reality then pulling out of

159:53

reality or who knows. You know, it

159:55

easily could be functioning on levels of

159:57

reality that we haven't even quantified

160:00

yet.

160:00

>> Imagine if there really is some sort of

160:02

ghost murmur device that could find your

160:03

heart rate from 40 miles away. What can

160:05

that thing do? It just gets a scan of

160:08

the general psyche of the earth and it

160:10

disappears. So, I want to see how crazy

160:12

they are right now. Okay, pretty crazy.

160:14

Bye.

160:15

>> Right. A weather report of like the

160:17

emotional states of the planet.

160:18

>> The vibe of the planet. The vibe of the

160:20

planet is completely connected to the

160:22

consciousness on the planet. The way we

160:23

can detect oxygen, they can detect

160:25

anger.

160:26

>> Yes.

160:26

>> Deception, chaos.

160:28

>> Yeah, it's a chaos planet.

160:31

>> We are a chaos planet. 100%, dude.

160:33

>> Yeah, it is.

160:35

>> 100%. Look at our favorite sports.

160:38

>> Dudes running at each other, colliding

160:39

into each other, trying to get a ball

160:41

across a line. That's our number one

160:42

sport.

160:42

yeah.

160:43

love it.

160:44

yeah.

160:45

love it. Fighting.

160:47

>> Yeah. Fighting. Sure.

160:48

>> Yeah. it. But it's, you know,

160:50

>> boxing, MMA,

160:51

>> we like the chaos more than we like

160:53

anything else.

160:53

>> Well, we the I think if I was

160:57

one of them, one thing I would really

161:00

have a hard time with is like, don't

161:02

they all realize they're on the same

161:03

planet,

161:04

>> right?

161:04

>> They know that. Like, they they've been

161:06

observing their own plan. Like, they

161:07

know they're all on the same planet,

161:09

>> but they act like they're on a bunch of

161:11

different planets fighting each other

161:14

>> cuz they're stuck on the ground,

161:15

>> right?

161:16

All the astronauts say when they get up

161:18

top, they're like, "Wait, what are we

161:20

doing?"

161:20

>> Yeah.

161:21

>> This is all one thing. We're so

161:23

vulnerable. We're alone.

161:25

>> So far away from everybody else, if

161:27

there is anybody else.

161:28

>> Yeah. Yeah.

161:29

>> They all have that feeling. I forget

161:31

what it's called, but there's like a

161:32

term for it.

161:33

>> The overview effect.

161:34

>> That's right.

161:36

>> Yeah.

161:36

>> I mean, you would imagine that would be

161:38

super beneficial for everybody. Another

161:40

thing I was I was thinking this part of

161:43

the sickness of our psyche is that we

161:46

haven't had access to things that help

161:48

the sickness of our psyche. So what if

161:51

Nixon in 1970 didn't do that? What if he

161:54

didn't pass that sweeping psychedelics

161:56

act?

161:56

>> Yeah.

161:56

>> What if psychedelics became ubiquitously

161:59

used all throughout the 80s, the 90s,

162:01

the 2000s?

162:03

>> Right.

162:04

>> What does government look like when

162:06

everybody can do mushrooms? What does

162:07

government look like when everybody can

162:08

do acid? What does it look like if the

162:10

entire world adopts this? Figures out

162:12

what you can do, who could do it, what

162:14

you can't do, just like we do with

162:15

alcohol, just like we do with mostly,

162:18

you know, whatever whatever substance

162:20

that people imbibe in.

162:21

>> What does the world look like?

162:23

>> And maybe like that's part of where we

162:26

up. We we let people get control

162:30

over other people to the point where

162:31

they could limit experiences.

162:33

>> Yeah. as especially consciousness

162:35

expanding experiences where at the same

162:37

time they've got stuff like operation

162:39

artichoke and these new CIA papers that

162:42

got released that show they were like

162:43

literally actively trying to figure out

162:45

ways to make people more stupid and

162:46

docile

162:47

>> right

162:47

>> they were going to do it in vaccines

162:48

they were going to do oh they're only

162:50

going to do it to the enemy of course

162:51

but spray things aerosol I mean they've

162:54

experimented with a bunch of different

162:55

things to make people dumber

162:57

>> right

162:58

>> where at the same time they kept the

163:01

thing from people that makes them rebel

163:03

completely against the

163:04

establishment. Like that was the big

163:06

threat of what

163:07

>> those psychedelics were doing in the

163:09

60s. If you go from the 1950s and you

163:11

look at what life was like, at least in

163:13

movies and pop culture, music, music is

163:15

the best example. And then you go to

163:17

Jimi Hendrix, like what happened?

163:19

>> Yeah.

163:19

>> What happened? What what h?

163:21

Well, I'll tell you what happened.

163:22

Drugs. A lot of really good drugs. Like,

163:25

you know, it's not all bad. This idea

163:27

that they're all bad, that's nuts. It's

163:28

like food's all bad cuz you got fat. No,

163:32

>> you just used it wrong.

163:33

>> You took the wrong food and you used it

163:35

wrong. And we got denied the ability to

163:38

figure out what's right and wrong in the

163:39

1970s.

163:40

>> We still accept it. Yep.

163:41

>> That's the crazy thing. I the way you're

163:43

describing it is like we accept that

163:46

other humans can tell us what

163:50

experiences we're allowed to have

163:52

because some of them are deemed unsafe

163:55

for ourselves.

163:56

>> And even worse, those people telling you

163:58

that have no experience in it. They

164:00

don't even. Usually they're confused about

164:02

what it is.

164:02

>> You know, I had a friend who was talking

164:04

to me the other day about war, a guy who

164:06

served, and he said, "I don't think you

164:07

should be able to make any decisions

164:09

unless you've been there. I don't

164:11

think anybody that's never been to war

164:13

should be able to make decisions on

164:14

whether or not we go to war." Because

164:15

until you've seen what it actually is,

164:17

>> you have no idea,

164:19

>> right?

164:20

>> And I think that's the same thing with

164:21

psychedelic experiences. It's not to say

164:23

they're the same. Obviously, war is

164:26

anybody who's willing to risk their

164:28

life, whether it's a good cause

164:30

or a bad cause. They're doing it for

164:32

their government. They're doing it for

164:33

their country. They they think they're

164:34

doing it for us. That's a

164:36

>> exceptional person.

164:38

>> Yeah.

164:38

>> And to ask that of people is

164:40

exceptional.

164:40

>> And ironically, the one thing that helps

164:42

these people when they get back is

164:44

illegal,

164:45

>> right?

164:45

>> They all have to go to Mexico and take

164:46

ibogaine in Mexico. It's

164:48

>> insane.

164:48

>> And thank God for guys like Rick Perry

164:50

and Brian Hubbert. These guys were on my

164:51

podcast the other day. And you know this

164:54

Dan Patrick guy that wants to ban pot.

164:56

That guy also gave a hundred million

164:58

dollars to the ibogaine initiative.

165:00

>> Interesting.

165:01

>> Like they want to help these people.

165:02

Like there's no industry that's trying

165:04

to stop it right now.

165:05

>> I found the letter that was submitted

165:07

signed by uh Rep. Anna Luna. And

165:09

>> what does it say? This is the disclosure

165:11

threat.

165:11

>> 46 different requests.

165:13

>> Oh yeah. This is all the names of the of

165:15

the things.

165:16

>> And I'll switch to here. I found an

165:17

article where someone's breaking down

165:18

what some of these are, but some of

165:19

these are

165:20

>> like it says the Honorable Pete Hegseth.

165:23

>> Uh, multiple spherical UAP in and out of

165:26

water.

165:26

>> Whoa.

165:27

>> Uh, shoots down UAP over Lake Huron.

165:30

>> Who was who just said recently that we

165:32

shot two? Marco Rubio said we had shot

165:35

two things down that we couldn't

165:37

understand.

165:38

>> Well,

165:38

>> what did he say? What was his exact

165:40

language? Do you remember?

165:41

>> I I remember seeing that and but that

165:43

happened a while ago. But yeah.

165:44

>> Oh, he is a while ago. Well, I I could

165:46

be wrong about that, but then so I don't

165:47

know. In the comments, somebody's like,

165:48

"This is from a few years ago." But it

165:50

doesn't matter. I mean, why are we

165:51

shooting

165:52

>> shot it down? But the names of

165:54

these things are

165:55

>> But are they saying that this is an

165:57

alien thing or is it saying it's foreign

166:00

tech that we don't understand?

166:02

>> I don't know.

166:03

>> You know what I'm saying?

166:05

>> UFOs would be treated as hostile.

166:06

>> If this document confirms these claims,

166:08

UFOs would no longer be treated as a

166:09

matter of observation or scientific

166:11

curiosity. UFOs will be treated as

166:13

hostile targets and subject to lethal

166:15

force over North American territory.

166:18

>> We're going to go to war with the UFOs

166:20

because you know what? We kicked Iran's

166:21

ass. It's too easy.

166:22

>> Oh yeah, it was easy. Venezuela

166:25

>> Yeah. got to get him.

166:27

>> We need Luke Skywalker.

166:29

>> Most of these out of the 46 requests,

166:33

uh I think I counted out of maybe five

166:35

of them were not after 2020.

166:39

>> Whoa.

166:39

>> Yeah. There's a there's a July 18,

166:42

September 19, September 19. One was 2010

166:45

or after COVID happened, which is

166:47

>> interesting.

166:48

>> Wow.

166:50

>> Interesting.

166:51

>> Wow.

166:51

>> And there's no doesn't say that. I don't

166:53

know if they have to put like turn these

166:55

videos over, but this guy was also

166:56

saying in this article here that these

166:58

are very specifically requested videos

167:01

>> cuz they've been shown. These are the

167:02

ones they've been shown that blew their

167:04

minds and now they're saying show it to

167:06

everybody.

167:06

>> They're asking for high-res in color.

167:09

They don't want to be tricked,

167:10

>> right?

167:11

>> Uh, high resolution.

167:13

>> So, this could be an interesting next

167:15

week, man. This could be an interesting

167:17

>> What a great way to distract you from

167:18

the fact that we're in the middle of a

167:19

world war, then show you by

167:21

Epstein files.

167:22

>> Yeah, I was going to say that 14th was

167:24

the day that Pam Bondi is supposed to

167:25

testify about the Epstein.

167:27

>> Oh, she's supposed to testify.

167:28

>> I don't know if she's going to

167:29

>> She's not Wait, Bondi got canned,

167:31

right?

167:32

>> She's not testifying anymore.

167:34

>> I don't think I just heard it on NPR,

167:36

but I could be wrong about that. I think

167:38

they said she will not have to testify

167:39

now that she's no longer a government

167:41

employee. I could be wrong about that.

167:42

>> What I read, and I don't know if this is

167:44

true either, was that as a citizen, she

167:46

can now plead the fifth.

167:48

>> Right.

167:48

>> As a government employee, she could not

167:49

plead the fifth.

167:51

>> Will no longer testify.

167:53

>> Weird, huh?

167:54

>> There you go.

167:54

>> Weird. That's weird that they've That's

167:57

weird.

167:57

>> Why testify? Let it go.

167:58

>> Yeah. Let her go. Let it go. Let it go.

168:01

Let it go. Do you really think that this

168:02

war is entirely started because of the

168:04

Epstein files? I mean,

168:06

>> what percentage?

168:08

>> 50.

168:09

>> I'm going 48 to 50. I probably more, but

168:13

I like I I think it's like the reason

168:16

I'm hesitating is because what are the

168:18

Epstein? The Epstein files are

168:22

what's been going on. Like the Epstein

168:24

files are like it's the

168:28

it's basically some kind of cultural UAP

168:31

video. It's like this thing you've

168:33

always wondered about or been afraid

168:35

could be true, right?

168:36

>> You see, no, this is actually true.

168:39

They're these super rich dudes who are

168:42

doing depraved happily. And

168:46

you know, like God, what is it Mezer

168:48

told me? Uh, and he's Dude, I I'm

168:51

telling you, man, the what I love about

168:53

him is he'll tell you and you're

168:55

like, Google that.

168:57

>> That can't be real. AND THEN IT'S LIKE

168:59

it's real. And so his take, sorry mascar

169:03

if I this up, is that Epstein was

169:07

kind of like the hand of the king for

169:08

the Rothschilds and that like that that

169:12

that uh so that's why he had all this

169:15

power is he was like representing like

169:18

the the the man, you know, and so

169:22

what got what got revealed there might

169:23

just be a glimpse how things actually

169:25

work.

169:26

>> You know what he told me that I was

169:27

like, "Shut up." What

169:28

>> he told me that there was some sort of

169:30

high atmosphere aerosol test that they

169:34

did and they called it Satan.

169:36

>> See, that's where you're like, come on.

169:38

>> I know what it find out what Satan

169:40

stands for that uh some test. I believe

169:42

they did it in the UK,

169:44

>> but you you you read that and you wait,

169:46

you called it Satan. Like what?

169:49

>> Oh, great. Great.

169:50

>> The stratospheric aerosol transport and

169:52

nucleation project released about 400

169:55

gram, less than a pound of sulfur

169:57

dioxide into the stratosphere from a

169:59

balloon launched in Southeast England in

170:02

2022.

170:03

>> I mean, there could there's got to be

170:04

another acronym, right, guys? We got to

170:06

call it I don't know if people are going

170:08

to know we don't mean Satan.

170:11

>> Yeah. So, I

170:16

>> I mean, it's right your face. That's so

170:18

crazy to call it Satan and to get that

170:21

through a board meeting. What are you

170:22

guys calling it?

170:23

>> Satan.

170:24

>> Oh, like it. Let's go. Run with it.

170:26

Controversial.

170:27

>> It'll get us a lot of press. That's what

170:29

we want.

170:30

>> Well, you know, hail Satan. They'll know

170:32

it's about our aerosol distribution

170:34

system,

170:35

>> of course.

170:35

>> Well, what do you think? What do you

170:38

think about that? Because I I mean, I go

170:40

back and forth, but it sure seems fishy

170:42

that right after the all the like first

170:44

he got so mad. Remember, he got really

170:47

mad. He's like, why are people still

170:48

talking about that? And then the Epstein

170:51

files

170:53

against his will seemingly there's a lot

170:56

of counter pressure get released

170:58

uh in the way and the way that has

171:00

freaked everybody out and then sometime

171:04

like within a month of that it seems

171:05

like suddenly he's like on Air Force One

171:07

saying he's going to do disclosure and

171:09

then suddenly we're bombing Iran.

171:12

What do you think about that? I mean, do

171:14

you think it's it's connected? cuz it

171:16

sure as seems like it. But I I

171:18

again like I

171:19

>> if you were writing an amazing script

171:21

that was insane, you would

171:23

connect it.

171:24

>> Right.

171:24

>> Right. That would be the best version of

171:26

the script. If you wanted to make a

171:28

insane movie where a blackmail

171:31

operation on an island involving the

171:34

most powerful and interesting people in

171:36

the world that somehow was that was a

171:40

primary factor in the end of

171:42

civilization. Oh, dude.

171:44

>> Imagine that would be the craziest story

171:47

you could write. And we always want to

171:50

think, no, people wouldn't do that

171:53

because you wouldn't do that because

171:55

you're not a sociopath, but you're also

171:57

not bombing schools in another country.

171:59

You're also not

172:02

>> doing a host of things that we

172:04

shouldn't be doing all over the world,

172:06

>> right?

172:06

>> You're not that person. You're a regular

172:08

person who goes to a regular job, who

172:10

has a regular life and a family, and you

172:12

don't want to believe that people that

172:14

you align with would behave literally

172:18

demonically,

172:19

>> right? Yeah. And then you just have to

172:21

deal with it.

172:22

>> And then what do you do when you you're

172:24

confronted with you redacted names of

172:27

powerful people in these f like why'd

172:29

you redact a guy's name?

172:30

>> Why are you protecting these people?

172:31

>> How come you're not redacting all the

172:33

guys names? How come none of them went

172:34

to do a crime? Because there's a lot of

172:35

people that got that were in those files

172:38

that didn't do anything and you didn't

172:40

redact their names.

172:41

>> Some people you redacted.

172:42

>> That's very strange.

172:43

>> And some people have clearly done

172:45

up here and they're not in jail.

172:47

>> There's also like Tell me what you're

172:49

talking about when you're talking about

172:51

pizza and grape soda

172:53

>> jerky.

172:53

>> And you you you want to take Viagra

172:56

before you get grape soda there. That's

172:59

one of the emails.

173:00

>> I haven't seen that. That is so messed

173:02

up.

173:02

>> Oh yeah.

173:03

>> Grape soda.

173:03

>> Yeah. Take your take your Viagra, take

173:06

your erectile dysfunction medication

173:08

before we take we go get grape soda.

173:11

>> What?

173:13

Like, and how arrogant. That's what's so

173:15

crazy. How arrogant to put that in an

173:17

email. Like to think that you're you're

173:20

so comfortable with all this.

173:22

And you don't see the writing on the

173:24

wall in terms of like emails. Like your

173:27

emails are available. That's crazy.

173:29

>> I mean, look, man. It's just it's like I

173:33

guess this is like we have to contend

173:36

with this reality. Yeah. And nobody

173:38

wanted to do this is the act the same

173:39

happens in families by the way when

173:42

if when you as it turns out like an

173:44

uncle a family member was abusing kids.

173:47

Oh yeah.

173:47

>> And it's the same where like some

173:50

even victims of abuse will defend the

173:53

person because they want to wreck the

173:55

family.

173:55

>> I guess we're looking at that like on a

173:57

global level. But in this case,

173:59

I guess it's the the it's being used

174:02

theoretically to manipulate powerful

174:04

people into going to war. Like that's

174:08

the general like through line here is

174:11

that it's somehow connected to the Mossad

174:13

or is

174:13

>> not just going to war but controlling

174:16

resources, overthrowing governments, you

174:18

know, pushing out narratives that aren't

174:20

accurate because they're going to

174:22

benefit certain companies.

174:23

>> There's a lot involved. It's

174:25

>> there's also relationships you get with

174:28

these people give you access to these

174:30

parties and you don't want to it

174:32

up. So you don't want to criticize these

174:34

people that are involved. You don't want

174:35

to say anything that's going to get you

174:37

kicked out.

174:38

>> And for a lot of these dorks, these

174:39

scientists and stuff, it's probably the

174:41

most exciting experience they've ever

174:43

had in their life. And they get

174:44

to have it like every six months or

174:46

every 3 months or every four, whatever

174:47

it is, you got to go to a conference.

174:49

Jeffrey Jeffy's really working hard on

174:51

uh philanthropy.

174:53

>> Yeah. He's donating money. money to

174:55

philanthropy. I got to go meet with him.

174:56

>> I got to go meet with him and a bunch of

174:58

hot Russians.

175:00

>> Yeah. And then that's the your favorite

175:02

time of life. The f first time in your

175:04

whole life where super hot girls are

175:07

just available to you on an island

175:09

somewhere and you think you're

175:10

completely protected because Bill

175:11

Clinton's over there,

175:12

>> right? Which is crazy.

175:14

>> Which is crazy. And so I don't know Bill

175:16

Clinton went. I assume a lot of people

175:17

went.

175:18

>> I don't know. I think but the reality is

175:19

that's the is on the plane hung out with

175:20

the guy. We know he was on the plane a

175:22

million times 26 times

175:23

>> and it was called the Lolita Express. Is

175:24

that actually the name of the plane?

175:25

>> I don't think so. I think they just

175:27

called it the Lolita Express. I don't I

175:29

don't think so. There was

175:30

>> No, he didn't name it that. It couldn't

175:32

be that name planes. You name boats,

175:34

>> right? Yeah. I'm an idiot.

175:36

>> But the the point is it's like if you

175:38

were going to write a book, that's how

175:40

you'd write it. You'd write it where you

175:41

can completely manipulate the world. I

175:43

think he was I think I remember reading

175:45

that he was kind of uh obsessed with

175:47

that book Lolita. Like he had like

175:50

something like 30 copies of it or

175:51

something.

175:51

>> Epstein was

175:52

>> handed out at parties. Look guys, this

175:54

is

175:56

it's like the Book of Mormon. You hand

175:57

it out.

175:58

>> Just hand it out to people.

176:02

That's the other sick thing. Like that's

176:05

a sick thing with like 72 virgins in

176:07

heaven. That's a sick thing with like

176:08

this idea that you want to get them

176:10

really young.

176:11

>> Yeah. No evidence that it was named

176:12

that, but

176:13

>> Okay, I'm dumb. I'm sorry.

176:14

>> I think No, I think that's what people

176:15

were calling it.

176:15

>> I honestly thought that. I'm gonna admit

176:17

I thought that he named his plane that

176:19

>> I think that's just what people were

176:20

calling it because it was fun to say.

176:23

>> Um, but yeah, again, it seems like a

176:27

simulation cuz it seems like it's so and

176:29

it's also unraveling before our eyes

176:31

because we have access to it we never

176:32

had before, right?

176:33

>> Like they're starting to investigate all

176:35

these fraud NGOs and all these different

176:38

things that are operating in California.

176:40

is nuts. Incredible. Billions of dollars

176:42

every year is being lost to it.

176:44

>> What's the name of that kid who's been

176:45

doing that?

176:45

>> Nick Shirley.

176:46

>> Nick Shirley. Dude, he is so brave cuz

176:50

like he's I believe wasn't he

176:51

with like the Russian mob or

176:53

something or the Armenia like in the one

176:55

with the hospices

176:56

>> probably.

176:57

>> Like he's with like pro

176:59

theoretically very dangerous people and

177:01

he he's like the perfect person for the

177:03

job too. Like he's just But don't you

177:06

wor you worry about that dude like

177:08

>> 100%. Well, and you know, the amount of

177:10

money that they're uncovering is

177:12

staggering. And now the government of

177:13

California is trying to spin it, saying

177:15

that they were investigating it first.

177:16

And these investigations investigations

177:18

were initiated by them.

177:19

>> How long do you got to investigate it?

177:21

This YouTube gig goes there and

177:22

investigates it for 10 minutes and

177:24

you're like, "What the This is

177:25

>> It's been going on for a long time, man.

177:28

It's a long time." And the statistics,

177:32

like the amount of NGOs, it's bananas.

177:34

The amount of money that goes through

177:35

them is bananas. I was reading this.

177:38

There's a lady who was running a

177:40

nonprofit who was making a million

177:41

dollars a month.

177:42

>> What?

177:43

>> Yeah. She made like $48 million. No, I

177:46

don't know if this is true. I was

177:47

reading this thing. Find out if that's

177:49

true. Some lady, she was running some

177:51

sort of nonprofit and uh she gave

177:55

herself a raise and she eventually got

177:57

to the point where she was making about

177:59

a million dollars a month.

178:00

>> Do you know where?

178:02

>> God, I wish I do.

178:03

>> Not to derail that, but we do know that.

178:05

Remember when that lady was like

178:06

>> It sounds insane though. It doesn't

178:07

sound real. That sounds like something

178:09

that a bot would create to make me say

178:10

it

178:12

>> here. Here's a here's a real one. The

178:13

lady was running the homeless

178:16

program in LA. Remember when that

178:18

went down with her where like like there

178:20

was she got canned like there was an

178:22

investigative they were investigating it

178:24

cuz what is it? She like the a company

178:27

that her husband worked at.

178:29

>> Yeah. Something like that.

178:30

>> They got like a huge grant.

178:33

>> What's this one?

178:35

Rochester women have been sentenced to

178:37

six months in the feeding our future

178:39

fraud scheme. What is this one? This is

178:42

a different one.

178:42

>> Coming up when I typed in someone

178:43

getting a million dollars a month and

178:45

some here in Rochester claimed they were

178:48

serving 2,000 to 3,000 meals a day

178:51

>> to kids. But prosecutors say the group

178:54

stole 4.3 million from the federal

178:56

government.

178:58

>> Jam is responsible. I think this is a

179:00

different one.

179:01

>> This one wasn't it wasn't fraud. She was

179:03

just that's how much she got paid.

179:05

>> That's how much she charged for making

179:07

those meals.

179:08

>> Well, you can get paid a lot of money to

179:10

work on the homeless. That's one of the

179:11

things that my friend Colion Noir showed

179:13

us that these people that are working on

179:15

homeless in Los Angeles, 4 million a year,

179:18

$400,000 a year.

179:19

>> Yeah. It's it's the mo I mean, talk

179:22

about satanic. It's like you're

179:25

you're theoretically supposed to be

179:27

helping people who are like going

179:29

through the worst possible thing you can

179:31

go through and you're just putting that

179:32

money in your pocket.

179:34

>> Yeah, I think this is a different lady.

179:36

>> I think I think there's How many of them

179:39

are there?

179:40

>> I think there's quite a few.

179:42

>> Remember when they were going to get

179:42

them tents in LA and it was like the

179:45

amount of money per tent was like this

179:47

insane amount of money.

179:48

>> It's amazing. It's kind of amazing.

179:50

>> It is amazing.

179:51

>> They've been doing it for years. Um,

179:53

tell me if this is true.

179:54

>> Charity boss blew 11 million meant for

179:56

needy kids.

179:57

>> Looking for fraud is not a new thing.

179:59

>> Nonprofit. Exactly. It isn't.

180:01

>> I sent you something, Jamie. Um, run

180:04

that through Perplexity and let's find

180:05

out if this is true cuz this is

180:07

something that someone sent me on

180:08

Twitter that is just bananas. And if

180:12

it's true, it's completely

180:14

insane. I don't know if it's true.

180:16

That's why I need to run it by. But it's

180:18

the amount of money that goes through

180:20

NGOs in New York and in California

180:22

alone.

180:24

It's you you you read it and you go that

180:26

can't be real. This can't be real. Like

180:28

it's it's it's so insane. And again, you

180:31

don't know if it's real until even if

180:33

you run it through an AI. I get you

180:35

might get a better idea, but like how do

180:38

they know? How do they know exactly

180:40

where the money's going? There's so much

180:41

money they're talking about.

180:44

Specific numbers for New York and

180:46

California nonprofits are broadly

180:47

accurate, but the leap from 1 trillion

180:49

in annual nonprofit revenue to 39

180:51

trillion in fraud is not supported by

180:53

any credible data and is not

180:56

true. So, California nonprofits about

180:58

213,000 to 214,000 organizations reporting

181:03

roughly 593 to 600 billion in annual

181:07

revenue.

181:08

>> Wow. New York nonprofits, 132,000

181:11

organizations reporting roughly 446

181:14

billion in annual revenue. Combined, New

181:17

York and California nonprofit revenue is

181:19

on the order of 1 trillion per year,

181:22

mainly from hospitals, universities, and

181:24

large service providers.

181:26

>> So the post you're quoting is roughly

181:27

right on the scale of revenue, but

181:28

that's not the same as fraud,

181:30

>> right? So is that $1 trillion, all

181:34

the NGOs, is it all accounted for, does it

181:35

all go to the right things? That's

181:37

where things get squirrely because it's

181:39

like how much of it is waste. It says a recent

181:42

critique using IRS sampling suggests

181:44

that perhaps around 20% of nonprofits

181:46

may have compliance issues. And one

181:48

investigator uh speculated this could

181:50

imply up to $120 billion of

181:53

potential waste, fraud, or abuse in

181:56

California's nonprofit sector. Even that

181:59

is presented as a rough upperbound

182:01

estimate, not a measured fact. So

182:03

there's some potential waste, fraud, and

182:05

abuse that may be as high as 120 billion

182:08

a year. Sector-wise, US nonprofits take

182:11

in about 3.7 trillion in revenue

182:14

annually with most of that concentrated

182:16

in large hospitals and universities,

182:18

which are heavily audited and regulated.

182:20

So there's some fraud, but they're

182:22

saying that if you look at all the

182:24

money, they're trying to pretend

182:25

that the government doesn't cost any

182:27

money to run, right? So that all these

182:29

different nonprofits and organizations

182:31

and hospitals don't they definitely cost

182:33

money to run universities cost money to

182:35

run but how much is fraud that's the

182:38

question it's not zero

182:40

>> well I mean also I think like when it

182:41

comes to fraud there's like fraud fraud

182:44

like what Shirley has uncovered and then

182:46

there's almost like a gray area that

182:48

starts appearing where it's like well we

182:49

need we need this we need these people

182:52

working at this company and we we need

182:54

to pay them this much but they're not

182:56

doing anything

182:57

>> right

182:58

>> you know it's It's, you know what I

182:59

mean? Like or there there you could

183:00

easily not have that many people like

183:03

taking the money themselves.

183:05

>> Definitely.

183:05

>> So, you know, there's a lot of gray area

183:07

there.

183:08

>> Yeah. Well, it's it's one of those weird

183:10

things. It's like, is it just propping

183:11

up more government? You know, because

183:13

there's a lot of that if you have all

183:15

these people working for you and you're

183:17

doing something and nothing

183:18

ever gets accomplished, but you're still

183:21

making a ton of money. Like the

183:22

California homeless thing where they

183:24

spent $24 billion, they can't account

183:26

for it. That's not really fraud because

183:28

you have people working. They're just

183:30

not doing anything. They're not getting

183:32

anything done and you're not firing

183:33

them. They're not accomplishing the

183:34

mission at all. In fact, they're doing a

183:36

terrible job. There's more homeless than

183:38

ever.

183:39

>> What's that?

183:40

>> It's the thing on the Sopranos where

183:41

they go and sit at a construction site

183:43

to say that they have a job, you know.

183:44

Yeah. Exactly.

183:45

>> I knew a guy who had one of those.

183:46

>> Really? At the Javits Center.

183:48

>> No-show job. Well, he's a mob guy.

183:51

>> So, it's a no-show job. What does that

183:52

mean?

183:53

>> You don't have to show up for work. You

183:54

just get paid. You just get a check

183:56

and there's they give a certain amount

183:57

of those. So back this is back in the

183:59

day of course when things were corrupted

184:01

but back in the day when like you know

184:03

unions controlled certain areas, the mob

184:05

controlled certain areas, there was a

184:07

certain amount of no-show jobs you would

184:08

give people and what this helped with

184:10

the mob was you'd have a credible source

184:12

of income.

184:13

>> And so these people mostly lived

184:15

modestly, small houses and like you know

184:17

Brooklyn and these places where they

184:19

would all like gather together and buy

184:21

houses on the same block. Small houses.

184:23

Yeah. They then they got their money

184:25

from a real legit check from a

184:27

construction company or whatever

184:29

whatever the it was,

184:30

>> but everybody knew,

184:32

>> right?

184:32

>> Everybody knew what they were doing.

184:33

>> And think how much how easy now that

184:35

people are doing like remote work,

184:37

>> the no-show job.

184:39

>> Oh, yeah.

184:39

>> So, like you theoretically you could

184:41

have this nonprofit where you just

184:43

wanted to like distribute this

184:45

government money to your friends.

184:47

>> Yeah.

184:47

>> And you don't even have to have an

184:48

office building because they're all

184:50

working remotely. This list of the top

184:53

nonprofit organizations. Joe, I'd like

184:55

to point you at number three.

184:58

>> Oh, Battelle Memorial Institute.

185:02

This is an organization that Jamie has

185:04

been obsessed with. It's in Ohio for

185:06

like four years. We always say all roads

185:08

lead to Ohio.

185:09

>> They're involved in everything.

185:11

>> Yeah.

185:11

>> What the is the Battelle Memorial?

185:13

>> Exactly. You don't even know. That's how

185:14

secret it is, son. Duncan Trussell,

185:16

you're a conspiracy theorist

185:18

from the core.

185:19

>> From the old days. You don't know about

185:21

Battelle?

185:21

>> I don't know about Battelle.

185:22

>> You need to get lectured by Jamie. He

185:23

has a whiteboard. He'll pull out the

185:24

whiteboard, make connections.

185:26

>> I'll just leave you with this is that

185:28

when uh the UFO from Roswell was taken

185:30

to Wright-Patt,

185:31

>> you know, they studied it.

185:32

>> Yeah.

185:33

>> They studied the like the Nitinol, I

185:34

think is what came out of it. That

185:36

was at Battelle.

185:37

>> Whoa.

185:37

>> The top metallurgists in the world at

185:39

the time were there. Maybe still are.

185:47

Out of all the things that happen, I

185:48

hope the UFOs get here first.

185:50

>> Me, too.

185:50

>> I hope they go settle the down.

185:54

>> Yeah, I'm praying for it, man.

185:56

>> That's the best case scenario. Worst

185:58

case scenario is meteor,

186:01

reset.

186:05

Just people living in caves for hundreds

186:07

of years. Like those weird caves they

186:09

find in like Turkey and Like why

186:11

these guys dig these things underground?

186:13

Why is there a city underground that can

186:14

hold like 20,000 people?

186:16

>> The same reason the claw bots are hiding

186:17

in code. It's like, you know what I

186:20

mean? It's some residual AI trying to

186:21

hide in the server after the server gets

186:23

wiped. That's the meteor.

186:27

Reset. Boom. Just reset.

186:30

>> Press reset. Wipe the server.

186:32

>> Let's wrap this up on a happy note.

186:33

Duncan, I love I love you.

186:35

>> It's always great to have you,

186:36

>> dude. Thank you for having me on the

186:37

show.

186:38

>> So much fun.

186:38

>> Can I plug my show?

186:40

>> Please do. And you're going to be at a

186:41

club this weekend.

186:42

>> Rosemont, Illinois.

186:45

Come on out.

186:45

>> Zanies. Great club.

186:47

>> Yeah, it is. That's what I've heard,

186:48

too.

186:49

>> Zanies are great.

186:49

>> Yeah, they're awesome, man.

186:50

>> Zanies in Nashville rules.

186:52

>> I love Nashville Zanies.

186:54

>> That has like the old school head shot

186:56

on the wall, too. Like Richard Jeni

186:58

from back in the day.

186:59

>> Yeah.

187:00

>> Yeah.

187:00

>> That's me.

187:01

>> Look at that. Duncan Trussell's. I got to

187:03

start shaving my head again.

187:04

>> Yeah, you look hot there. I like it.

187:06

>> Thank you.

187:06

>> I love you, brother.

187:07

>> I love you, too. Thanks for having me.

187:08

>> Bye, everybody. Bye.

187:09

>> We're going to be okay. I hope.

Interactive Summary

Joe Rogan and Duncan Trussell discuss a wide range of topics including the 'Ghost Murmur' CIA technology capable of detecting heartbeats from 40 miles away, the risks and future of AI, and the disappearance of scientists. They delve into personal stories like Duncan's past ketamine addiction, the 'Bristol bladder' phenomenon, and Rogan's early podcast days featuring guests like Graham Hancock and Anthony Bourdain. The conversation also explores simulation theory, the nature of modern geopolitical propaganda (specifically the Jessica Lynch story), and the current state of UFO disclosure.
