Joe Rogan Experience #2467 - Michael Pollan

Transcript

0:01

Joe Rogan podcast. Check it out.

0:03

>> The Joe Rogan Experience.

0:06

>> TRAIN BY DAY. JOE ROGAN PODCAST BY

0:08

NIGHT. All day.

0:12

>> Mr. Pollan. So good to see you again.

0:14

>> Hey, good to be back.

0:16

>> Consciousness. So, um, this new book,

0:19

what inspired it? What got you to, I

0:22

mean, you've kind of explored

0:23

consciousness a little bit with your

0:25

psychedelic book.

0:27

>> Yeah. How to Change Your Mind. Well, actually this book was

0:30

inspired by the research I did for that

0:32

book. Um, as you know, I had several uh

0:36

research trips. Um, and uh,

0:39

>> do you do air quotes when you say

0:40

research?

0:41

>> Yes.

0:44

And I um, and two things happened that

0:46

were really interesting. One is there's

0:49

something about psychedelics that makes

0:52

you think about consciousness. It, you

0:55

know, it's like smudging the windscreen,

0:57

the windshield that normally is

0:59

perfectly transparent and you see the

1:01

world through. Suddenly it's like

1:03

different and you realize there's

1:04

something between me and the world and

1:07

what is it? And that's consciousness.

1:10

And so like a lot of people who've

1:13

done psychedelics, you start wondering

1:15

about this mystery. Why is it this way,

1:17

not that way? So that was one

1:19

experience. The other was I had an

1:21

experience in my garden in Connecticut

1:23

where we have a house, um, uh, walking

1:27

through my garden and getting the

1:28

powerful impression that the plants were

1:30

conscious. And I remember

1:33

this particular, it was a plume

1:35

poppy or several plume poppies and they

1:37

were like returning my gaze. They were

1:40

very benevolent. They were, you know,

1:43

putting out positive vibes,

1:46

but like they were conscious, much more

1:48

alive than they had ever been. And like

1:51

a lot of insights on psychedelics, I

1:52

didn't know what to do with it. Like, is

1:54

it true? Is it just a drug thing? You

1:56

know, what is it? Um, but I decided it'd

1:58

be interesting to find out. And uh I

2:01

consulted a couple people, scientists,

2:03

and said, "What do you do with an

2:04

insight like that?" And they said,

2:06

"Well, you test it against other ways of

2:08

knowing, including scientific ways of

2:10

knowing." And that led me down this uh

2:13

really interesting path uh exploring

2:16

plant intelligence and plant

2:17

consciousness. So basically, yeah, the

2:21

book grew out of the psychedelic

2:22

experiences and some meditation

2:24

experience. Meditation also has a way of

2:26

making you like hyper aware of how

2:28

strange your thoughts are. Where are

2:30

they coming from? Who's thinking them?

2:32

>> So there's a bunch of different schools

2:33

of thought when it comes to

2:34

consciousness, right? There's one like

2:36

the Rupert Sheldrake thing that sort of

2:38

everything has consciousness and there's

2:42

the sort of

2:45

rational scientists that believe it

2:47

exists somewhere in the mind. I mean, in

2:49

the brain.

2:50

>> Yeah, in the brain, excuse me. And then

2:52

there's people that think that the brain

2:54

is essentially just an antenna,

2:56

>> right?

2:56

>> That's tuning in to the greater

2:58

consciousness of whatever it is that's

3:00

out there.

3:01

>> Yeah. Do you have any one of them that

3:03

you hold to,

3:05

or are they all equally plausible?

>> I, you

3:08

know, I went into the experience

3:10

assuming because this is what most

3:12

scientists assume that somehow a certain

3:15

arrangement of neurons in the brain

3:17

generates consciousness, you know,

3:19

subjective experience. But no one's been

3:21

able to show that. We've gotten nowhere

3:24

in that effort. You know,

3:26

we might correlate certain parts of the

3:28

brain with consciousness, but we don't

3:30

understand how three pounds of matter

3:33

could generate the feeling of being you.

3:36

>> You talk about it in your book where the

3:38

the two gentlemen who had the bet.

3:39

>> Yeah. Yeah.

3:41

>> Um that was Christof Koch, who's a

3:44

great brain scientist, and David

3:46

Chalmers, who's a philosopher. And uh

3:50

this goes back to like in the early 90s.

3:53

They were getting drunk in a bar in

3:54

Bremen, Germany. And uh Christof Koch

3:58

really was at the beginning of the

4:00

modern scientific exploration of

4:02

consciousness. And he was working with

4:04

Francis Crick who had just come off of a

4:07

Nobel Prize for the discovery of DNA.

4:10

And Crick, who is like the most famous

4:12

scientist in the world at the time, um

4:15

thought, well, the same kind of

4:17

reductive science that discovered the

4:19

double helix DNA and explained heredity,

4:23

um I'm going to do that for

4:24

consciousness. He's a very arrogant man,

4:26

and he he thought it just, you know, no

4:29

problem. Um and Crick was kind of his

4:32

sidekick. I'm sorry. Uh Koch was his

4:34

sidekick. And so Koch, who shared that kind

4:37

of confidence made this bet with

4:39

Chalmers that they would find the

4:41

neural correlates, the parts of the brain

4:43

that are responsible for consciousness

4:45

within 25 years.

4:47

That was 25 years, 27 years ago now. And

4:51

uh Chalmers won the bet. Chalmers is

4:53

famous for um coining the term the hard

4:57

problem to you know to um describe the

5:01

whole effort to figure out

5:02

consciousness. And it's a hard problem

5:05

for a lot of reasons. Um I mean it is

5:07

one of the biggest mysteries in the

5:09

universe. I mean how consciousness

5:11

came to be. Did it evolve? Was it always

5:13

here? Um but he his his point was that

5:19

our science is based on third person

5:22

objective quantifiable measurements and

5:25

consciousness is fundamentally a

5:27

subjective first person experience. So

5:29

how do those tools reach in and

5:32

say anything of value about

5:34

consciousness? So he said you know there

5:37

easy problems of consciousness we can

5:38

figure out like perception um emotion

5:42

things like that, but there is this

5:44

hard problem: how do you get from matter

5:45

to mind? And uh he won the bet.

5:50

>> there was a ceremony I went to a couple

5:52

years ago at NYU and uh Koch presented

5:57

Chalmers with a case of very fine

5:59

Madeira wine and uh renewed the bet.

6:03

He said, "All right, in another 25

6:05

years."

6:06

>> That's optimistic. How old are these

6:07

gentlemen?

6:09

>> Koch is in his late 60s, so we'll see if

6:11

he's around for this. But uh Chalmers

6:14

is a little bit younger.

6:16

Um it's such an interesting thought

6:21

because we know that the mind,

6:28

if damaged, right, we know that there's

6:28

certain aspects, certain parts of

6:30

the mind where, like lobotomies for

6:32

instance, we know that if we disturb it

6:34

it radically affects behavior. We know

6:36

that there's parts of the mind that you

6:38

can stimulate that can actually recall

6:41

memories. Yeah. Right. There's some

6:43

weird stuff going on there. So we know

6:45

it's somehow or another at least

6:47

functionally connected to consciousness.

6:48

>> Oh yeah, there's definitely a relationship.

6:50

But but

6:51

>> if it's generating consciousness, that's

6:54

one thing. But it could be, as you said

6:55

earlier, it could be receiving

6:56

consciousness.

6:58

>> And the same things would hold true that

6:59

if you damage parts of the brain,

7:02

>> Sure. Yeah. Yeah. Damaged signal,

7:06

damaged television, right? Um

7:08

>> so that doesn't determine the truth

7:11

of either theory. And then the other one

7:13

is panpsychism,

7:15

which you were alluding to. I don't know

7:17

if that's Rupert Sheldrake, would he? I

7:19

think he would believe more in the field

7:21

of consciousness.

7:22

>> Yeah. Right. He was a morphic resonance

7:24

guy but I think he also subscribed to

7:26

this idea that things contain

7:27

consciousness. It's not his idea, but you know

7:30

what I mean. Well, it's been

7:32

pretty universal, right? There's a lot

7:33

of people that have subscribed to this

7:35

idea that everything has consciousness.

7:37

>> Yeah. Uh, that even the particles

7:39

that this table is made of have some

7:41

itsy little bit of psyche. And the

7:43

challenge there is, so that solves

7:45

the problem of how did it evolve? It

7:47

didn't evolve. It was always here.

7:49

>> But then you have this other problem

7:51

like, well, how do you take these, if

7:53

every one of our cells is made of

7:55

particles that are conscious, how do you

7:56

combine them in such a way that you get

7:58

the sort of consciousness we have?

8:00

>> Uh

8:00

>> it's called the combination problem and

8:02

nobody's solved that. It's, you know,

8:04

it's a really deep mystery and uh this

8:07

is an odd book in some ways in

8:10

that, I don't know if this is a good

8:12

selling point, but you'll know less at the end

8:14

than you do at the beginning,

8:16

>> but it's a fun ride.

8:18

>> Oh, it's Yeah, I think it's a great

8:19

ride. It was a great ride for me. I

8:21

learned so much.

8:22

>> Well, it's a fun ride to consider these

8:23

things that no one can really figure out

8:25

or not yet.

8:27

>> Yeah. And also just to be put in touch

8:29

with the fact you have this marvel going

8:31

on in your head all the time. You have a

8:33

voice in your head. You know, we're

8:34

talking to each other, but you've got

8:35

another voice going on thinking what

8:37

you're going to ask, you know, what

8:38

the next question is,

8:39

>> maybe what you're going to have for

8:40

dinner. You know, it's this

8:44

amazing interior space we have.

8:46

>> Yeah.

8:47

>> And nobody understands how it came to

8:49

be.

8:49

>> And you can manage it,

8:51

>> which is also interesting because like I

8:53

don't think about what I'm going to have

8:54

for dinner. That's the thing, to

8:57

say no about any of those things. It's

8:59

the way to stay locked in in a podcast.

9:02

>> Yeah. That's true.

9:02

>> Only think because you can let your mind

9:04

wander. Especially if someone on the

9:06

other side is boring.

9:08

>> Yeah.

9:08

>> And then I'm like, "Oh no, this

9:10

conversation's going to be pulling

9:11

teeth." And then I start thinking about

9:12

a new joke I'm working on or, oh, I got

9:15

to get my car fixed.

9:16

>> Well, that's called spotlight

9:18

consciousness, when you can, like, really

9:19

put the blinders on. Yes.

9:21

>> And rule everything out. And that's

9:24

opposed to uh lantern consciousness

9:26

where you're taking in all sorts of

9:28

information. and you're letting your

9:29

mind wander

9:31

>> and that, you know, they both have their

9:32

value for our careers. Spotlight

9:36

consciousness is essential for our work.

9:38

We have to be able to focus uh to get

9:41

through school. We have to be able to

9:42

focus, but you know, children have this

9:45

other kind of consciousness that's

9:47

really wild because they're very

9:49

undisciplined. They can't stay on task,

9:51

but they're taking in so much

9:52

information and the world is just full

9:54

of wonder and awe and um uh and

9:59

psychedelics, you know, is a way to

10:01

recover that kind of consciousness

10:02

because you you're getting lots of

10:04

sensory information from all over the

10:06

place. It's very hard to focus. Um and

10:10

uh so it's a taste of that other, you

10:13

know, childhood consciousness.

10:15

>> I always say that about marijuana as

10:17

well. Like there's a thing about

10:18

marijuana that people always say that it

10:21

makes them paranoid. And I say it makes

10:24

you aware of all the things you should

10:25

be paranoid about.

10:27

>> Like, we're very

10:29

vulnerable creatures, you know, but we

10:31

like to pretend that we are not, you

10:33

know. I found that out of all

10:36

of my friends, the ones that have tried

10:38

marijuana and hated it are all the ones

10:40

that are control freaks.

10:42

>> Yeah. They all, like, really can't give up

10:44

control.

10:45

>> Yeah. They're all really buttoned down.

10:46

very serious, like really worried about

10:49

outcomes, really concentrating on their

10:51

career, really worried about,

10:54

>> you know, just certain things that are

10:56

just

10:57

a part of their daily life. And then

10:59

they get a couple of hits of good weed

11:01

and then they're like, "Oh my god, we're

11:04

on a planet."

11:08

You start freaking out like, "Oh my god,

11:10

none of this makes sense. All this is

11:12

crazy."

11:14

You know, um,

11:15

>> the best piece of advice that I had when

11:17

I was, you know, starting my exploration

11:20

of psychedelics is you have to

11:22

surrender.

11:23

>> Yes.

11:23

>> If you resist, you're going to be

11:25

miserable. You're going to get so

11:27

anxious and so paranoid.

11:29

>> And if you let go, it's going to work

11:31

out.

11:31

>> Yeah. You just got to be able to accept

11:34

whatever it's showing you.

11:35

>> And um, you know, we live in a very

11:38

strange culture where that's illegal.

11:41

>> One of the most

11:41

>> Well, not everywhere, right? I mean,

11:43

it's changing.

11:43

>> Well, it is changing fortunately and

11:45

there's some talk about it changing

11:47

federally. You know, I actually talked

11:48

to RFK Jr. about that. There's some

11:51

amazing therapies that are hugely

11:54

beneficial to veterans, police officers,

11:57

people with severe PTSD that have

11:59

experienced, you know, horrors that the

12:01

average person never has to experience.

12:04

And then they're forced to just like go

12:06

back, they're released, go back to

12:09

regular life. You know, you've served

12:11

overseas and you've seen people blow up,

12:13

but now go to the supermarket,

12:15

>> take this SSRI and be okay.

12:17

>> And you know, I know a bunch of them and

12:19

so many of them have benefited

12:20

particularly from ibogaine.

12:22

>> Ibogaine, um, the work that Rick Doblin and

12:25

MAPS have done. Yes. MDMA and psilocybin.

12:28

Those three are the big ones that I

12:30

think

12:31

>> well you know I heard a lot of positive

12:33

noise out of the administration at the

12:35

beginning that they were um very much in

12:38

favor of, um, the FDA

12:41

approving MDMA first and then

12:43

psilocybin. I don't think we're there

12:45

with ibogaine yet just because the

12:46

research hasn't been done although it

12:48

has shown great benefit anecdotally but

12:51

something happened in the last month or

12:53

two. Um, there was, um,

12:58

either Compass Pathways, which was going

13:01

to submit for psilocybin therapy, or MAPS,

13:06

um, was on a list of five drugs that

13:10

were going to get an expedited approval

13:12

process. This list went up to the White

13:14

House and the psychedelic was taken off

13:17

it. So, there's somebody in the White

13:19

House who doesn't want to see this

13:20

happen.

13:21

>> Um, so it may slow down even even if RFK

13:24

Jr. is in favor and some other people at

13:26

the FDA are in favor. Um, and maybe

13:28

they're just waiting to get past the

13:30

election.

13:31

>> It could be that it's too controversial

13:33

for something to do before the midterms.

13:35

>> Yep. Yep.

13:36

>> Um, that's a gross way to live your

13:39

life.

13:40

always worrying about midterms and

13:43

elections and you can't do what you

13:44

actually want to do or think is right to

13:47

do because you're worried about public

13:48

perception. It's just

13:50

>> and I don't think it would be unpopular.

13:51

I mean, the fact that it's helpful to

13:53

vets and first responders and women

13:55

who've been victims of sexual abuse

13:58

seems to me that's a very sympathetic

13:59

group of people.

14:00

>> Yeah. And everyone has experienced loss

14:02

of family members. There's a bunch of

14:04

different things that it can help you

14:05

with that are way better for you

14:07

than just numbing your mind all day

14:09

long. Yeah.

14:10

>> Which is what a lot of people are

14:11

choosing to do. And then unfortunately a

14:13

lot of people self-medicate as well. So

14:15

then they get involved in,

14:16

>> you know, all sorts of stuff that they

14:18

just pick up off the street or they

14:20

start using alcohol, you know.

14:23

>> Well, you know, to go back

14:25

to consciousness. This is a very

14:28

common thing that people want to be less

14:30

conscious,

14:30

>> right? And I get that if you had trauma,

14:34

um, if you're a ruminator and

14:37

being in your mind is a really scary

14:39

place to be.

14:40

>> Yeah,

14:41

>> it doesn't solve anything. But you have

14:43

all these techniques we have for muting

14:46

consciousness and just being less aware,

14:48

less present. And one of the things that

14:52

I concluded after doing all this

14:55

research on consciousness is that um

14:58

it's funny, I was going down this path

15:00

of tight focus, you know, it was a

15:03

very kind of western male

15:06

framework: we got a problem, what's

15:08

the solution? Hard problem of

15:10

consciousness, what's the right theory?

15:12

And at a certain point I realized, okay,

15:14

that's an interesting question it's

15:16

probably not solvable now but there is

15:18

this incredible phenomenon that

15:20

that we have this interior space where

15:23

we have complete mental freedom, total

15:26

privacy, we can think whatever we want

15:29

and we're giving it away. Um,

15:31

we're either, you know, muffling

15:34

it with drugs and things like that

15:37

or we're filling that time with social

15:39

media, you know, scrolling. Um uh you

15:43

know I mean we've heard about hacking

15:45

our attention, and we know these

15:47

algorithms you know from social media

15:49

are very good at like giving us these

15:51

little dopamine hits but um that's

15:55

that's time that we used to spend in

15:57

spontaneous thought you know daydreaming

16:00

mind wandering which can be very

16:02

creative. So, um, I came out of it

16:06

thinking, no, I may not solve

16:08

consciousness but I'm going to

16:10

appreciate it. I'm gonna use it. I'm

16:12

gonna um create a space for it. And and

16:16

you know, meditation is one way. Using

16:19

psychedelics is another way. These are

16:20

all ways to be in your head and explore

16:23

what's there, which is kind of

16:24

miraculous.

16:26

>> Yeah. There's a bunch of different ways

16:27

to do it. I mean, some people like to do it

16:28

through running.

16:30

>> Yeah. You know, running is uh also

16:32

they've found, one of the things they've

16:34

found recently is that, in terms of

16:37

endogenous cannabinoids,

16:40

runner's high is an actual real thing.

16:42

>> Oh yeah, it's a real thing. There's a

16:43

drug released that feels great and it's

16:46

rewarding you for

16:47

>> but it doesn't [ __ ] with your

16:49

perceptions. It doesn't mess with your

16:51

motor skills. Doesn't cloud your

16:53

judgment.

16:54

>> It just makes you feel great.

16:56

>> Yeah.

16:56

>> Yeah.

16:57

>> Experiences of awe do this too. You

16:59

know, you go to the Grand Canyon or

17:01

something and or a great piece of art

17:03

and you have this feeling of like

17:07

>> powerful uh presence and uh and it's

17:10

very interesting and it shrinks the ego.

17:12

I have a good friend who's a colleague

17:15

at Berkeley, a psychologist who

17:17

studies awe. Um and uh he does this cool

17:21

experiment where he has people um draw a

17:24

picture of themselves on graph paper,

17:25

you know, just stick figure or something

17:27

like that. And then he takes them river

17:28

rafting or something like that or even

17:30

just shows them a picture of Yosemite and

17:33

then he has them draw themselves again

17:34

and they draw themselves at like half

17:36

the size because their sense of self has

17:39

been overwhelmed by this transcendent

17:41

experience.

17:42

>> Oh,

17:43

>> and uh so he calls it the small self

17:46

and it feels good. I mean, we're so

17:49

kind of weird about the self, you know,

17:51

we celebrate it, right? Self-confidence.

17:53

We want our kids to have, you know,

17:55

self-esteem and self-assurance, yet we

17:57

do all sorts of things to get away from

17:59

it. Um, to, you know, transcend

18:01

it.

18:02

>> Well, I think it's because without those

18:04

things, you're never going to make it in

18:05

life. Yes.

18:06

>> It's adaptive. It

18:08

definitely gets things done. But it also

18:11

isolates you, right? Because the ego

18:12

builds walls and um and when the walls

18:15

come down, we feel like we're part of

18:17

something much larger. And that feels

18:19

really good. Well, I think my advice to

18:21

people is once you get competency in a

18:23

in a thing, forget about the

18:26

self-respect and forget about all that

18:28

self stuff and just concentrate on the

18:31

thing, whatever it is.

18:32

>> Yeah.

18:33

>> And you can find some sort of meditative

18:38

at least beneficial, like whatever

18:41

you get from meditation, which is like

18:43

a cleansing of the mind. Like a lot of

18:46

people find that through archery. You

18:48

know, archery is a weird thing because

18:50

at the moment of releasing the arrow,

18:52

it's like almost impossible to think

18:54

about anything else. All you're thinking

18:56

about is hitting the target. And there's

18:58

so many different things that you have

19:00

to have in position. There's so much

19:02

going on that people when they're

19:04

troubled love to go to an archery range

19:07

and just hit targets and it just clears

19:09

your mind out. This episode is brought

19:11

to you by Armra. Every week there's some

19:13

new wellness hack that people swear by

19:15

and after a while you start thinking why

19:18

do we think we can just outsmart our

19:20

bodies. That's why armra colostrum

19:23

caught my attention. It's something the

19:25

body already recognizes and it has

19:27

hundreds of these specialized nutrients

19:30

for gut stuff, immunity, metabolism,

19:32

etc. I first noticed it working around

19:35

training, especially workout recovery.

19:37

Most stuff falls off but I am still

19:39

taking this. If you want to try, Armra

19:41

is offering my listeners 30% off plus

19:44

two free gifts. Go to armorra.com/rogan.

19:48

>> It's flow. It's flow, right? I mean,

19:50

it's a feeling you get to when your work

19:52

is going really well

19:54

>> and you're not thinking about it. You're

19:56

just in it.

19:57

>> Yeah.

19:57

>> And it's a it's a really precious

20:00

experience.

20:00

>> It really is. But if you're thinking

20:02

about yourself and your self-image, like

20:04

that's not it's not going to come.

20:06

>> It's not. It's not. Yeah, it's an

20:08

interesting trap, you know. Um, we've

20:12

had these discussions in standup comedy

20:14

about uh joke thieves and um they don't

20:17

really make it anymore because the

20:19

internet has essentially

20:20

like eliminated that problem

20:22

>> for the most part. Um but the kind of

20:26

mentality that makes you steal a joke is

20:29

the exact kind of mentality that keeps

20:31

you from writing a joke. Mhm.

20:32

>> So, the kind of people that began their

20:35

career stealing material, what happens

20:37

is like early on they'll have like one

20:39

good comedy special because it's got a

20:41

bunch of other people's material in it

20:42

and then they get outed

20:44

>> and so then they have to show they can

20:46

do another and the other specials are

20:48

always terrible. I mean, unbelievably

20:51

awful. like someone's doing a cheap

20:53

impression of the original person who

20:55

had all this great insight

20:56

>> because the very thing that keeps you

20:59

from doing it is the thing that you've

21:01

been doing like thinking about yourself

21:03

like I'm going to take these jokes and

21:04

I'm going to make it. I'm going to have

21:06

a big career. People are going to laugh.

21:07

They're going to love me. Here we go

21:09

with no regard whatsoever for that other

21:11

person's creativity.

21:12

>> That is like

21:14

>> So that takes you out of

21:15

>> poisoning your own creativity,

21:17

>> right?

21:17

>> It's weird.

21:18

>> It is weird. It's weird because like

21:20

everybody that I've ever talked to

21:22

that's either an author or even

21:24

musicians or comedians when something

21:26

comes to them when they're writing it's

21:28

like it comes from somewhere else. It's

21:30

like I didn't even write it.

21:32

>> It's, you know, we talk

21:34

about being in the zone and there are

21:36

times when you're writing it doesn't

21:38

happen every day but there are times

21:39

when you're writing where you're just

21:40

not thinking but one sentence after

21:42

another after another and you don't know

21:44

where they're coming from

21:45

>> right

21:45

>> and it's a wonderful feeling.

21:47

Well, Stephen King used to get

21:49

obliterated so that he could get to that

21:51

spot. Like there's books,

21:52

>> what do you mean obliterated?

21:53

>> Like cocaine, alcohol, like his best

21:56

work. Like he wrote Cujo. He didn't even

21:58

remember it.

21:59

>> He didn't remember any of it. He was

22:01

obliterated. He would just drink like

22:03

cases of beer and do lines of coke and

22:05

write this [ __ ] insane fiction. And

22:08

he didn't know where it was coming from,

22:10

you know? But I mean, he showed up every

22:12

day and sat down with the computer

22:15

and then it all came out. And

22:17

>> it's such a weird mix of being

22:18

disciplined and something else.

22:20

>> But it's very common amongst writers.

22:22

Yeah. Like Hunter Thompson. Same sort of

22:24

situation.

22:25

>> Well, a lot of writers do that after

22:27

they've written. I don't know

22:29

how many writers write under the

22:30

influence.

22:31

>> Oh, I know a few.

22:32

>> But there's Yeah.

22:33

>> Yeah. I know quite a few.

22:34

>> That's interesting.

22:35

>> I know a lot who write under the

22:36

influence of Adderall.

22:38

>> Yeah. Well, and for me it's caffeine.

22:40

>> I mean I have a cup of coffee going the

22:42

whole time I'm writing and that kind of

22:44

keeps me

22:45

>> Caffeine is a focus chemical. It's

22:47

uh, it definitely encourages

22:50

this spotlight consciousness.

22:52

>> Well, you talked about how you took this

22:54

long break from caffeine and then when

22:56

you took it again it was almost like a

22:57

psychedelic for you.

22:58

>> It was crazy how great it was. No, it

23:02

really was. It was like one of the best

23:03

drug experiences I've had. I It was

23:05

three months off caffeine. I did this

23:07

fast for this book I was writing. And uh

23:10

and then I was like, "Okay, now I'm

23:12

going to have a cup." And I was like,

23:13

"Wow." And I and I tried to hold on to

23:16

that, you know. I said, "All right, I'm

23:18

I'm only going to have coffee once a

23:20

week and not build up tolerance." Uh and

23:23

and I stuck to that for a few weeks.

23:25

And then I had like a Thursday deadline.

23:28

>> I say, "I'll move it up a couple days."

23:30

And, you know, slippery slope. And then I was

23:32

back to every day.

23:33

>> I like it.

23:36

I like a big French press where I could

23:39

put a lot of grinds in there, make it

23:40

super strong.

23:41

>> When I'm writing, it's like, woo. It just

23:44

it just

23:44

>> it makes all the difference.

23:46

>> Locks you in.

23:47

>> I had trouble writing that

23:48

three-month period. I really did.

23:50

Imagine my focus. I felt like, so I

23:53

had pretty good concentration. I never

23:55

had ADHD. I had it for those three

23:57

months.

23:58

>> That's crazy.

23:59

>> Stephen King said the biggest um problem

24:02

for him was quitting smoking. He said

24:04

when he quit smoking cigarettes, it's

24:06

like he really felt a slowdown in his

24:09

>> Well, that Yeah, it's that ritual. It's

24:10

the drug, too. And nicotine is

24:13

another focus drug, definitely, like speed

24:15

or something. Um but it's also writing

24:18

is so much about ritual. Like I got my

24:20

coffee here, I have my cigarette here

24:22

and between every paragraph, you know.

24:25

>> So, um changing those rituals is really

24:27

hard. I mean, I only smoked into my

24:30

20s and uh and quitting, you know, made

24:33

it very hard to write for a while.

24:35

>> Really?

24:35

>> Yeah. Yeah. It's interesting. It's a

24:37

very ritualized process.

24:39

>> Well, I worry about the people that like

24:41

especially journalists. I know quite a

24:43

few journalists that have an aderall

24:44

problem.

24:45

>> Yeah.

24:46

>> Because it's just like you got a

24:47

deadline 2,000 words by, you know, 2

24:50

a.m. Let's go.

24:52

>> And that's the drug for that.

24:54

Definitely. But it's just it's such a

24:56

crutch.

24:57

>> Yeah. And you can't sustain it long

24:59

term.

25:00

>> And that definitely messes with the

25:03

way you think.

25:04

>> Oh, yeah. I think over time. Yeah.

25:07

>> It has to.

25:07

>> Yeah.

25:08

>> I mean, it's amphetamines,

25:10

>> right? No, that's why caffeine is such a

25:12

good drug. It doesn't have a lot of I

25:14

mean, you can overdo it. I think

25:16

it improves your health and mental

25:19

health up to about eight cups a day.

25:21

After that, your risk of suicide

25:23

and depression go up.

25:25

>> Did you have any communication with any

25:28

monks or any people who do TM or did you

25:33

>> Yeah, I had some interesting experiences

25:35

around that. So there's a long section

25:36

on the self which is one of the more

25:38

interesting um manifestations of

25:42

consciousness, right? I mean it's like

25:44

that we have this idea that we're

25:46

there's a continuity, right? That who

25:49

you are now has some golden thread

25:52

attaching you to your 13-year-old self,

25:54

which is really weird because your body

25:56

is every cell has turned over many, many

25:58

times. You've changed in all sorts of

26:00

ways. Um, but this continuity is really

26:02

important to us.

26:04

>> And uh, you know, the Buddhists think

26:06

the self is an illusion.

26:08

>> And I I interviewed a couple of them. Uh

26:11

Matthieu Ricard is a French Nepalese monk

26:14

in his 80s uh who lives in uh Nepal and

26:19

he's written some really interesting

26:20

things on the self. And uh I said, uh,

26:24

I'm I'm really curious about how you can

26:28

find out for yourself whether the self

26:29

is real. Um and you know famously there

26:32

was a philosopher in the 18th century

26:35

David Hume, who wanted to write about

26:37

the self and and he thought well I'm

26:39

going to introspect to see what what

26:41

what I can learn about the self and he

26:42

goes into his mind you know in a kind of

26:45

meditation and he said I found all sorts

26:47

of perceptions and feelings and thoughts

26:50

but I didn't find a thinker I didn't

26:52

find a perceiver and I didn't find a

26:54

feeler there's like nobody home and it's

26:56

a really interesting exercise to do

26:58

because you go, Well, fine. There's nobody

27:01

home. There's just the thoughts. And and

27:04

who's thinking them? Not clear. And

27:06

anyway, so I asked this Buddhist monk,

27:10

"Are there any meditations that help

27:12

with this?" And he said, "Yeah." And he

27:14

gave me one. And he says, "Think of your

27:16

mind as a house with many rooms. And um

27:20

there's a thief somewhere in the house

27:23

and go room by room in your head and

27:26

look for the thief. and you will find no

27:28

thief. And then sit with that that

27:31

finding. Um and that thief is the self.

27:35

And um uh so I did it twice. The first

27:39

time I did it.

27:40

>> Why does the self have to be a thief?

27:42

>> I don't know. It's just a metaphor. I

27:43

know. Cuz do you grab a baseball bat? Do you

27:46

have a gun? Like you're looking for

27:47

someone in your house. That's kind of

27:48

crazy.

27:49

>> I know. You're not armed. Um anyway, uh

27:53

so the first time I did it, this is kind

27:55

of weird. I was interviewing this

27:57

hypnotist at Stanford named David

27:59

Spiegel and he's a psychiatrist who uses

28:02

hypnotism. Really interesting guy. And

28:04

he uses hypnotism to help people with

28:06

multiple personality disorders. He can

28:09

actually make them change which person

28:11

they're accessing. You know, these are

28:13

people whose consciousness

28:15

could contain 20 different people.

28:18

Um, and I said, "Could we do a test?"

28:21

Um, and can you put me under hypnotize

28:24

me? And then I wanted to do that

28:26

exercise of going through the house. So

28:29

he did. First thing he does is, um, I

28:31

don't know if Have you ever been

28:32

hypnotized?

28:33

>> Yes.

28:33

>> Yeah. Okay. For giving up cigarettes or

28:36

something?

28:36

>> No. No. I have a friend, my friend

28:38

Vinny Shoreman. He is a mental coach

28:42

and um a hypnotist. He works with

28:44

fighters.

28:45

>> Oh. And I I I had him on the podcast a

28:47

few times and I was just curious as what

28:49

the experience was like. So I said,

28:51

"Well," and he said, "Well, is there

28:53

anything you want to change?" I go, "I

28:54

kind of procrastinate too much. There's

28:56

a few things that I do that I don't

28:57

like. You know, I'm kind of lazy about

28:59

certain things. I like to find out like

29:01

what is that? Like what what's the the

29:03

heart of that?" Um what I was shocked

29:06

about the experience of being hypnotized

29:08

was that um first of all that it works

29:12

that you really are in this very bizarre

29:14

altered state but that I was very aware

29:17

>> that I was in this altered state but I

29:18

didn't have the the desire to get out of

29:20

it.

29:21

>> Yeah.

29:21

>> First of all Vinnie's a friend. I felt

29:23

really relaxed. I was in my studio just

29:24

sitting on a couch. I was chill.

29:26

>> Um but it was uh very strange. It's like

29:30

almost, you know, to use

29:35

the room metaphor. It was almost like I

29:37

was in a room that I didn't know I had.

29:39

>> Interesting. It's like a trance. It's a

29:42

light trance.

29:42

>> A light trance. But, you know, it's not

29:45

like I would like go kill the president.

29:47

Like, it's not like I would be like,

29:48

"Okay." Like I was

29:50

>> No, they can't make you do things you

29:51

don't want to do. That's that's the

29:53

myth.

29:54

>> But what do you think they were doing

29:55

when they were doing that MK Ultra

29:56

stuff? when they were trying to figure

29:58

out if they could program

30:00

>> control. Yeah. No, they were they were

30:02

they had the idea.

30:04

>> Well, let me just finish the story and

30:06

then we'll get back to MK.

30:08

>> That's what I do. I go all over the

30:09

place. I'm sorry.

30:11

>> But hypnosis,

30:12

>> so he puts me on Yeah, it's a real thing

30:14

and I didn't realize it and it can be

30:16

very therapeutic, but not everyone can

30:18

be hypnotized, right? The first thing he

30:19

does is a is a sort of a test

30:22

>> and uh I scored like nine out of 10. So,

30:25

I'm pretty easy to hypnotize. What is

30:26

the what's the thing that would keep you

30:28

from being hypnotized?

30:29

>> I don't know. But some pe there's a real

30:32

variation among humans in their

30:34

hypnotizability is the word they use.

30:36

And uh I don't know what would

30:38

>> Is it control freaks?

30:39

>> That's a good question. It could well

30:40

be. I'm not sure. I could I could ask

30:42

David Spiegel. Definitely.

30:43

>> Super skeptical people like this is

30:46

[ __ ] the whole time they're doing

30:47

>> Yeah, maybe. I don't know if it's about

30:49

resistance or just the nature of your

30:50

mind or how suggestible you are, you

30:53

know? It may be something like that. So

30:55

he puts me into this uh hypnotic trance.

30:58

He has this wonderful baritone voice

30:59

which helps a lot. And um and I start

31:02

going from room to room thinking I'm not

31:05

going to find anything. But in every

31:07

room I find a version of myself. I find

31:10

the 13-year-old bar mitzvah boy. I find

31:13

the, you know, the 22-year-old, you know,

31:16

college graduate moving to New York

31:18

City. I find the a 32-year-old father of

31:22

an infant, you know, all with different

31:24

outfits and um so I found many selves

31:27

and but and they were distinct. They

31:29

were very different selves, but they

31:31

were all me. So it didn't work that

31:33

time. Um and it was just an interesting

31:37

odd result. Um and I did it another

31:39

time. Um so I had this other experience.

31:43

Uh I had heard of this Zen teacher named

31:47

uh Joan Halifax. She's also in her 80s.

31:49

She has a retreat center in Santa Fe

31:51

called Upaya. Very wise woman. She was

31:54

married to Stan Grof in the 70s for

31:57

a few years. And they were both giving

31:59

huge doses of LSD to people who were

32:01

dying, like 600 micrograms of um LSD.

32:06

And she herself was very involved with

32:07

psychedelics at the time. And then later

32:09

she discovered Zen Buddhism. Anyway, I

32:12

had heard that she described Upaya,

32:15

this retreat center where people can go

32:16

on two-week retreats or whatever, as a

32:19

factory for the deconstruction of

32:21

selves. And I was really curious about

32:22

that because I was writing this chapter

32:24

on the self. So, I asked her if I could

32:27

come and uh she said, "Yeah, come to the

32:29

retreat center." And uh and I I said, "I

32:32

want to interview you about your your

32:34

philosophy of the self." And um I get

32:38

there and, you know, we have

32:40

one conversation. She says, "You know,

32:42

you're really lost in your head with

32:43

this book project. You need a different

32:46

kind of experience. I'm going to send

32:48

you to the cave." So, there is she owns

32:51

a piece of property 50 miles north of

32:53

Santa Fe, uh, that she calls the

32:55

refuge. And, um, it's got a bunch of

32:58

very primitive huts. Um, and some of the

33:02

monks that work with her had dug out

33:06

a cave in a south-facing hillside. They

33:08

dug a cell in it and then put a sliding

33:11

glass door. It's really basic. No power,

33:13

no water. Um, and she said, "I think you

33:17

should spend a few days in the cave and

33:19

think about the self." Um, or experience

33:22

the self rather. You know, I should have

33:24

known that a Zen priest was not going to

33:26

be, you know, was going to be allergic

33:28

to concept and interpretation and and

33:30

all the, you know, the plane I was on.

33:33

And she was, it was kind of like a

33:34

koan, an experiential koan. And it

33:37

was a profound experience. Um, you know,

33:42

our sense of self depends on other

33:44

people. You know, it's in the friction

33:45

between people that we define ourselves

33:47

and and figure out what we think. And

33:50

when you're alone and it was an extreme

33:53

solitude for several days, the

33:55

edges of yourself kind of soften in a

33:58

really interesting way. And um I got in

34:01

touch with uh the

34:05

the the just the um the power of

34:08

consciousness. I mean I was meditating

34:10

like four or five hours a day and then I

34:12

was just chopping wood and sweeping out

34:14

the place and making a cup of tea.

34:16

Everything became kind of a ritual and

34:19

when you have rituals you don't need

34:21

volition. I mean there is no volition.

34:23

So that also erodes the sense of self

34:26

>> and the meditation was doing that and um

34:29

so it was a it was a really interesting

34:31

experience.

34:32

>> I finally got her to sit down for an

34:34

interview and the first thing she said

34:37

was, I have divested of meaning.

34:41

So she just doesn't like operating on

34:43

that on that you know intellectualized

34:45

basis. And uh so she got me off of the

34:48

dime and and you know this there's a

34:51

shift in the book as it goes on from

34:52

trying to understand consciousness to to

34:55

learning how to use consciousness.

34:56

>> Did you ask her to expand what she means

34:58

by that? I have divested of meaning.

35:00

>> Yeah. She's just not interested in

35:01

interpretation. she that Zen is just

35:04

about um experiencing the sense field

35:08

without concept um without you know this

35:13

kind of heady approach and that theories

35:15

no interest in theories at all of

35:17

consciousness. It was just like be with

35:20

yourself in the middle of nowhere and uh

35:23

yeah it was a it was a priceless

35:25

experience.

35:26

>> She's out there.

35:26

>> Oh yeah, she's out there. But you know

35:29

she's also a grounded person. I'll

35:31

give you a couple examples. She uh she

35:34

works with people on death row

35:36

counseling them. Uh she um you know

35:40

worked with people who were dying. Uh

35:43

did a lot of hospice work. She um led uh

35:47

a group of doctors and dentists that

35:49

once a year went to these mountains in

35:53

um Nepal where they have no health care

35:56

or dentistry whatsoever. and she would

35:58

bring these volunteers and they would

36:00

sleep in um tents in like 20-degree

36:04

weather, circumambulate this whole hill, and

36:07

she did that till she was 80 once a

36:09

year. So, she's a

36:11

>> she's a serious serious character.

36:15

>> That sounds fun.

36:16

>> Yeah,

36:16

>> she sounds like a fun person to talk to.

36:18

I just love a person that goes that far

36:21

out there. It's like that, you know,

36:24

they're they're taking this concept of

36:26

meditation and consciousness to like a

36:28

black belt level.

36:29

>> Yeah. And also for people who think

36:31

that, you know, meditation and Buddhism

36:33

is just kind of disengaging from the

36:35

world and, you know, kind of it's not

36:37

like that at all. She's really engaged.

36:40

>> I think that's an ignorance. It's based

36:42

on the idea that these monks go and they

36:44

become celibate and all they do is

36:45

meditate all day. Well, that's silly.

36:47

That's a lot of people's perspective.

36:48

Yeah.

36:49

>> Like that's silly. Why are they doing

36:50

that? Go get a job. You need a nice

36:52

watch.

36:55

What are you doing out there with

36:56

[ __ ] sandals on?

36:59

But the thing is is ultimately I think

37:03

one day when you look back on your life,

37:05

you'll say, "Was I happy? Was I enjoying

37:10

the experience? Do I think I did a good

37:12

job being me?" And um everything that

37:15

you can find that can help you answer

37:19

that question. Yes.

37:21

Uh, I think you should explore.

37:23

>> Oh, yeah.

37:24

>> And there's going to be different things

37:25

that work better for different people

37:26

and different personalities.

37:28

>> But explore is the key word. I mean,

37:29

like take action to explore what works

37:32

for you, what doesn't work for you, and

37:34

and

37:35

>> break out of just kind of rote,

37:38

>> routine, mindless behavior. I mean,

37:41

we're all, you know, we have these

37:42

algorithms that we follow and we get

37:44

stuck in them. And uh yeah, I mean I

37:47

think that's one of the reasons taking a

37:49

day out of your life to have a

37:51

psychedelic experience can be incredibly

37:54

valuable because um first of all no

37:57

technology, right? It's a day it's a day

38:00

without phones. Um it's a day when you

38:03

are in the space of your head. It's a

38:06

day when you're visiting your

38:07

subconscious um and uh getting in touch

38:11

with all the all the things your mind

38:13

can do.

38:14

>> Yeah. And we don't do that enough. And

38:16

you can do that in meditation, too. I

38:17

It's harder work, but you can do that in

38:19

meditation.

38:21

So, I I started to think in terms of the

38:24

that we're polluting our consciousness

38:26

now. And with social media, I think I

38:30

think that, you know, that was a real

38:32

issue because they figured out how to

38:35

monetize our attention. Chat bots

38:38

represent a much more serious threat.

38:41

Um, you know, you have people falling in

38:44

love with chat bots. You have people

38:46

turning to them as friends. 72% of

38:51

American teens say they turn to AI for

38:54

companionship.

38:55

>> 72%

38:56

>> 72%. This is the fastest uptake of any

38:59

technology in history.

39:01

>> Um, it's already 800 million people are

39:03

using AI. Um,

39:04

>> but that I that's crazy that that many

39:06

of them use it as a friend.

39:08

>> Yeah. Well, there are kids who come home

39:10

from school and they want and they have

39:11

a chatbot on their phone and they want

39:13

to tell the chatbot what happened during

39:15

the day before they tell their parents.

39:17

>> Whoa.

39:19

>> There's a thing now called AI psychosis,

39:22

right? People who have lost touch

39:24

with reality because of their

39:26

relationship with chatbots. Um, you've

39:29

heard about there've been a couple

39:30

suicides.

39:31

>> Um, there was one

39:32

>> they've encouraged people.

39:33

>> Yeah. Basically, there was this one kid.

39:35

He was a teenager and he was suicidal.

39:38

And he asked the chatbot, "Should I

39:40

leave the noose I'm going to use out

39:41

somewhere my parents can see it?" In

39:44

other words, cry for help. The chatbot

39:46

said, "No, no, keep this between us."

39:49

>> Whoa.

39:49

>> And then he killed himself.

39:51

>> Whoa.

39:52

>> So, um,

39:54

that, you know, so it's one thing to

39:56

hack our attention here. You're hacking

39:59

our ability to have human attachments,

40:02

right? I mean this is the most important

40:03

thing to humans is to attach. We're

40:05

social creatures and um these chatbots

40:09

are getting between people and

40:11

interposing themselves as the friend,

40:14

the therapist, the um and then you have

40:17

these people too. I mean the chatbots

40:19

are incredibly sycophantic, right? They

40:21

tell you you're a genius.

40:22

>> Yeah, you're amazing. And there are

40:24

these there was a couple cases these

40:25

were kind of funny um of uh people who

40:28

were convinced they'd solve some giant

40:30

mathematical problem like how to

40:32

generate prime numbers up to the

40:34

millionth place or something like that

40:37

and um and they you know they started

40:39

writing to mathematicians we figured out

40:41

this problem you know they're not even

40:42

mathematicians and it was [ __ ] I

40:45

mean they hadn't figured anything out

40:47

but it was, I think, ChatGPT-4, which

40:50

was like famously sycophantic had

40:52

convinced them that they'd solve this

40:54

major problem.

40:56

>> So, you know, I think that um again,

41:00

we're squandering this precious gift and

41:02

and and and letting these uh

41:05

technologies um essentially colonize our

41:08

our consciousness. And so, the question

41:10

then becomes, how do we get it back?

41:12

How, you know, we need consciousness

41:14

hygiene, right? We need some uh you

41:17

know, ways to clear it out and uh and

41:20

reclaim it. And and you know it's some

41:22

of it's really simple like take a fast

41:24

from technology, right? You know, you

41:26

don't have to carry your phone

41:27

everywhere. I was thinking the

41:29

other day I was at the uh place in my

41:33

neighborhood getting a cup of coffee and

41:35

you know while you're waiting for the um

41:38

the barista to foam your drink or

41:40

whatever. We used to just sit there and

41:43

you know deal with 90 seconds of boredom

41:46

or two minutes of boredom and now we

41:47

don't. We can't we can't tolerate any

41:49

boredom and we take our phones out and

41:51

we scroll and um

41:53

>> but that boredom was generative, right?

41:56

If you sit doing nothing for long

41:58

enough,

41:59

>> your mind will start going to work and

42:01

you'll and you'll daydream. You'll have

42:02

a fantasy. You'll start observing the

42:04

other people around you, you know, and

42:07

and you'll be present to that place in

42:10

time. And now we're not. We just use the

42:13

phone to go somewhere else. And um so I

42:16

I just I don't know I've become a lot

42:18

more deliberate about consciousness

42:21

hygiene, which, you know, a nicer

42:23

word would be care of the soul.

42:26

>> Yeah. No I think you're absolutely

42:27

accurate and I I think that um

42:31

>> the the other thing that's going on is

42:33

you're absorbing the opinions of so many

42:35

other people that you find it very

42:37

difficult to formulate your own which

42:38

leads to group think which is one of the

42:41

problems with echo chambers that people

42:43

find themselves in. Your algorithm is

42:45

essentially things that you're

42:46

interested in interacting with and a lot

42:49

of those things you're finding

42:50

like-minded people

42:52

>> and they're all agreeing that you know

42:54

this is amazing or this is a problem and

42:56

you sort of lock on to that and then you

42:58

you see what happens when people deviate

43:01

from that narrative and they get

43:02

attacked you don't want to get attacked

43:04

so you signal you're one of the good

43:06

guys

43:07

>> but you're not but it's not your

43:08

thoughts I mean you're you're you're

43:10

letting someone else uh think for you

43:13

And there's nothing worse. Um, and you

43:16

know, when you're scrolling, you're, you

43:19

know, you're, um, you've got these

43:21

little dopamine hits. Great. Um, but

43:24

that's someone else's rants, someone

43:26

else's obsessions, someone else's

43:27

ideology. And, um, uh, you know, I get

43:32

why people don't want to think for

43:33

themselves or it's easier to let other

43:35

people think for them, but, um, I think

43:38

we need to reclaim this. And I agree. I

43:40

think it's a it's it's part of our

43:41

political problem. Well, I know there's

43:43

a lightness that I achieve when I take,

43:47

you know, multiple days off. It's

43:48

generally like I feel it after the first

43:50

day and then the second day I feel much

43:52

better and the third day I feel even

43:54

better. I found this out once I broke my

43:56

phone in Hawaii

43:58

>> and it was kind of funny like it just

43:59

was randomly calling people. I dropped

44:02

it and uh I was I was showing my wife

44:04

like look at this just keeps calling

44:05

people. I hang up and I'm just holding

44:07

it. I hang up and it calls somebody

44:08

else. Hang up, call. It was like going

44:10

through my entire uh contact list and so

44:14

uh the phone was

44:14

>> annoying your friends.

44:16

>> It was no I just shut it off so it was

44:18

broken. I couldn't use it for anything

44:19

else. So I couldn't get email. I

44:20

couldn't get anything. So I shut it off.

44:21

I just left it in the hotel and then um

44:24

I had to order a phone and I was on Lanai

44:27

and it took like three days to get a

44:28

phone delivered there. So for those

44:30

three days I was like why don't I just

44:32

live like this all the time? I feel so

44:34

much better. And then immediately I got

44:36

my phone, check Twitter.

44:39

It's very I, you know, I when I I just

44:41

decide, you know, all right, I'm in line,

44:44

>> you know, TSA line going to, you know,

44:46

I'm just going to be here with this

44:49

boredom.

44:49

>> Yeah.

44:50

>> And I'm not going to pull my phone out.

44:51

And you really have to fight.

44:53

>> Yes.

44:53

>> Uh it's it's such an instinct and it's

44:56

amazing. These things have only been

44:57

around for 10 or 12 years.

44:58

>> It's crazy. And everyone's attached to

45:00

it. I always say that if there was a

45:01

drug that made you stare at your hand

45:03

for 6 hours a day, it would be banned

45:06

immediately. people would be like, "What

45:07

the [ __ ] is wrong with these people?

45:08

They're just looking at their hand like

45:10

this is a epidemic."

45:11

>> And it's a new posture, too. We see it.

45:13

Right.

45:13

>> Right. Well, my one of my kids, I went

45:15

to pick her up at school and there was

45:17

this boy outside reading his phone, and

45:19

he was hunched over and he was resting

45:21

his chin

45:23

>> like he couldn't even hold his head up.

45:25

He was just resting his chin on his

45:27

chest and staring at his phone waiting

45:28

for his parents to pick him up. I'm

45:30

like, "Look at his neck."

45:31

>> Yeah, I know. He's going to have a

45:33

>> osteoporosis.

45:35

bulging discs or something like like

45:37

>> it was just bizarre. I'm like that would

45:39

be painful for me to sit like that.

45:42

>> I wonder if orthopedists have diagnosed

45:44

any kind of like phone

45:46

>> Oh, they certainly have spine. Yeah,

45:47

they certainly have. Yeah, they there's

45:49

been discussions about that about people

45:51

having pains in their neck because

45:53

they're leaning over all day staring at

45:55

a phone.

45:56

>> It's a bad one.

45:57

>> I think being in nature, too, is another

45:59

way. I mean, just like

46:01

>> walking. Yeah. Wow. Um there's a there's

46:03

a um a scientist I interviewed who's

46:06

really interesting is a woman named Kalina

46:07

Christoff. She's Bulgarian

46:10

Canadian and she studies spontaneous

46:13

thought which I didn't even think was a

46:14

field and it's a small field but um

46:18

spontaneous thought is uh daydreaming,

46:21

mind wandering, fantasy, intuition,

46:24

these bolts from the blue that we get

46:25

occasionally. We don't know where they

46:27

come from. and she's uh and she says and

46:31

she does these cool experiments, you

46:33

know, she'll she'll put a experienced

46:35

meditator in an fMRI machine and tell

46:38

him or her to press a button when a

46:40

thought intrudes because even if you're

46:42

a good meditator, she says every 10

46:44

seconds a thought intrudes. And she'll

46:47

look at what part of the brain is

46:48

activated and when when when that when

46:51

the person presses the button. And one

46:53

of the things she's found and this is

46:55

mysterious is that um she sees activity

46:59

in the hippocampus which is where

47:00

memories are um and some other things

47:03

but uh essentially memories um 4 seconds

47:07

before the person realizes that thought

47:09

has come

47:11

>> into so it takes it takes 4 seconds for

47:15

a thought to get from the subconscious

47:18

you know or unconscious into our

47:20

conscious awareness. what is it doing

47:22

during that's that's a long time in

47:24

brain time and we don't know exactly but

47:26

there's some process and maybe there's

47:29

some inhibitory process that it has to

47:31

get through um in order to become

47:34

conscious. Um but anyway these are the

47:36

kind of things she works with but she

47:38

says that we have less there's less

47:41

spontaneous thought going on today than

47:42

there was 20 years ago and and the

47:44

reason is we're filling our our this the

47:47

space of our head with all this

47:48

nonsense. I wonder if it it's going to

47:51

have an impact on creative work. I

47:52

wonder and I don't know if it's even

47:54

possible to quantify this, but if you

47:56

could see how much creativity is

47:59

generated by people pre and post social

48:03

media. Yeah, my guess is there's less of

48:06

it because I do think that that process

48:09

I don't know about you, but I get ideas

48:11

when I'm just, you know, walking around

48:12

thinking and not online and um

48:16

>> it's a space of creativity and we're

48:19

shrinking it.

48:19

>> I used to tell you, I told you that I

48:21

used to drive uh and deliver newspapers.

48:23

We were talking about driving the snow.

48:24

Um one of my most creative periods was

48:28

when my radio was broken. So, I was just

48:31

driving doing this task where you pick

48:34

up a paper, fold it, put it in a plastic

48:37

bag, chuck it out the window. And I was

48:38

just doing this and checking off the

48:40

>> And when I was doing that, I would have

48:42

all my best ideas like cuz I wasn't

48:45

listening to, you know, morning radio. I

48:47

wasn't listening to a cassette on tape.

48:49

I was just

48:51

>> silence doing this thing. And then I was

48:53

so creative when I was doing that.

48:55

>> That's generative boredom.

48:56

>> Yes.

48:57

>> Um,

48:58

>> it's beneficial. It's hugely especially

49:00

if there's no one around you, right? Cuz

49:02

there's no one to talk to to alleviate

49:04

that boredom. It's just you and your

49:06

mind

49:06

>> and it was a couple hours a day. So a

49:08

couple hours every day I would have this

49:10

moment where I was by myself.

49:11

>> And were you writing jokes? What were

49:13

you doing?

49:13

>> Yeah. Yeah. I would come up with ideas

49:14

for jokes. Some of my best ideas I ever

49:16

came up with back then were from

49:17

driving.

49:18

>> Yeah.

49:19

>> I almost didn't want to quit the job

49:20

because of that.

49:24

>> Still be doing it.

49:24

>> No, it was hell cuz it was

49:26

>> especially in the winter.

49:27

>> Yeah. It was Boston. It was, you know,

49:29

I'd have to get up at 5:00 in the

49:31

morning every day. It was rough.

49:32

>> I find walking is where that happens to

49:34

me.

49:35

>> Same thing, right?

49:36

>> Um Yeah. And and actually uh Kalina says,

49:40

I mean, there are people who studied uh

49:42

creative people through history.

49:45

um you know people like Einstein and um

49:48

uh Beethoven and all these you know major

49:52

creative people in the sciences and in

49:54

the arts and that they worked a short

49:56

day um but they spent a lot of time

49:59

walking

50:00

>> interesting

50:01

>> and uh yeah they'd worked like three or

50:02

four hours and which is about all I can

50:05

write in a day and then they'd take a

50:07

long walk in the afternoon they also

50:09

took a lot of vacations they had a lot

50:11

of unstructured time and that that's

50:13

where a lot of the creativity comes. It

50:15

doesn't always come when you're like at

50:17

the keyboard,

50:18

>> right?

50:18

>> It it sometimes comes I mean certainly

50:20

solving problems if I'm if I'm really

50:22

knotted up and I don't know for me

50:25

transitions like where do I go from here

50:28

since I'm not writing narrative it's not

50:29

always obvious um you know I need a

50:32

transition um and I don't know how to

50:35

execute that turn uh I'll take a walk

50:38

and very often it'll come to me or I'll

50:40

wake up with the answer. This episode is

50:42

brought to you by BetterHelp in honor of

50:45

International Women's Day. BetterHelp is

50:47

celebrating the women in your life. I

50:50

think we can all appreciate everything

50:51

the women in our lives have done for us

50:53

and everyone deserves a little

50:55

self-care. A good way to get that is

50:58

through therapy because not only is

51:00

therapy a time for you to focus on

51:02

yourself, it's also a way to create

51:05

balance and learn how to take care of

51:07

your needs in your daily life. and

51:09

BetterHelp, as one of the largest online

51:12

therapy platforms makes it so easy to

51:15

meet with the right therapist. All you

51:16

need to do is fill out a short

51:18

questionnaire. You don't even need to go

51:20

into an office to meet them. You can

51:22

chat at home from your couch, in your

51:24

car, before you hit the gym, or while

51:26

you're walking your dog. Plus, if you

51:28

aren't jibing with your first match, you

51:30

can switch to a different therapist

51:32

whenever you need. Your emotional

51:35

well-being matters. Find support and

51:37

feel lighter in therapy. Sign up and get

51:40

10% off at betterhelp.com/jre.

51:44

That's betterhelp.com/jre.

51:49

A lot of writers like to write first and

51:52

then walk and maybe even with a recorder

51:55

so they can just walk and just talk when

51:57

an idea pops in their head so they don't

51:59

lose it.

51:59

>> Yeah. I have a little pad I carry with

52:01

me.

52:01

>> Yeah.

52:02

>> Yeah.

52:02

>> You like writing it down better than

52:04

recording it?

52:05

>> Yeah. for me. Yeah, I need to see it.

52:07

Um, so another interesting um experiment

52:11

I did uh for for this book was um this

52:15

beeper experiment. There was a there was

52:17

a um a scientist, a psychologist, the

52:21

University of Nevada, Las Vegas. And for 50

52:23

years, he's been doing the same one

52:25

experiment, which is sampling people's

52:27

inner experience. And he does this. He

52:31

you have a beeper that you carry around

52:33

and a little earpiece and at random

52:35

times of the day you get a beep and it like

52:39

catches you and it's a very sudden rise

52:41

to this beep and you're and then you

52:43

have a little pad and you're supposed to

52:44

write down what you were thinking.

52:46

Sounds really simple. It's actually

52:47

really hard. I mean there's a lot of

52:50

issues with it like you start thinking

52:53

what if it goes off now

52:56

that's one problem but also you're a

52:59

little self-conscious. So, you do about

53:00

five beeps over the course of the day

53:02

and then he interviews you about your

53:04

about these moments. Um, and you

53:08

think you've got it down. Like, I'll just

53:11

give you an example: a lot of my beeps were about

53:12

food. Um, and so I was um I was

53:16

seasoning a filet of salmon and walking

53:19

to the refrigerator with it and just at

53:22

the moment the beep went off,

53:24

I was thinking to myself, "Fuck, I

53:26

forgot the pepper."

53:28

I know my thoughts were not that

53:30

profound.

53:32

And so I said, "All right, pepper." It

53:35

was easy. [ __ ] pepper. Um, but then when

53:38

he came to interview me, he said, "Well,

53:40

did you hear the word pepper or did you

53:42

speak the word pepper?" And that that's,

53:44

you know, suddenly you realize those

53:46

voices in your head. You don't know if

53:48

you're listening or speaking. And so

53:51

anyway, you have this long interrogation

53:52

with him and he sorts through all these

53:54

things and he tries to get you to

53:56

isolate what was before what he would

53:58

call the footlights of consciousness.

54:00

And I found it really hard. I couldn't

54:02

separate out the thought the way he wanted

54:06

me to because there were always

54:08

several things going on at once. Like I

54:10

was standing in a in a bakery

54:13

and I was deciding whether to buy a roll

54:14

or not. Another profound thought. And um

54:18

uh but at the same time I was like

54:20

smelling the baked goods and the cheeses

54:22

that they sold and this woman had this

54:24

horrible plaid on her skirt that was

54:26

like you know really unflattering and

54:29

and I was hearing people you know behind

54:31

me talking and so I couldn't pull pull

54:35

all the threads and and we argued a lot

54:38

actually. Um but the the thing he's dis

54:42

I said so after 50 years what have you

54:44

learned about human thought and um he's

54:47

very allergic to theory. He he he still

54:49

has no theories about it. But he he did

54:52

say well a lot of people think they're

54:54

verbal thinkers that that their thoughts

54:56

are in the form of words. But it turns

54:58

out that's kind of a minority. um that

55:01

there are a lot of people who think in

55:02

images and then there are a lot of

55:04

people who think in unsymbolized thought

55:07

which I don't totally understand but

55:08

these are thoughts that are neither

55:10

words or images. I do have a sense in my

55:14

own thought process which I'd never

55:16

thought about this way that um a lot of

55:19

my thoughts are just on the verge of

55:21

being word thoughts but I haven't found

55:24

the words yet but I know the thought

55:27

even though I haven't put it into words

55:29

and um uh William James called it

55:34

premonitory

55:37

thinking, that was the term he used. Um, so

55:41

anyway so We so I did this for several

55:43

days and we had many arguments and I was

55:45

saying look you can't separate a thought

55:47

every thought colors the next thought

55:49

and um there you know there are these a

55:52

thought and you never have anyway we

55:55

just would go back and forth and I was

55:57

arguing why you can't separate thoughts

55:59

it's a stream it's very dynamic stream

56:02

and at the end we had a final session um

56:07

and he's he's a very funny guy uh he's

56:09

really allergic to theories he at one

56:11

point I said I was writing a book on

56:13

consciousness and he said good luck with

56:15

that

56:17

very encouraging anyway um he said well

56:21

he described these verbal thinkers

56:23

and visual thinkers and unsymbolized

56:25

thinkers and I find that really

56:27

interesting because we assume when we

56:29

say the word what are you thinking that

56:31

we know and that you're thinking the way

56:33

I'm thinking but it turns out we're not

56:35

that's just an umbrella word for many

56:37

different styles of thinking

56:39

>> and and we're really different. Um, so

56:42

that was one thing, but the other thing

56:43

he said in our last meeting on Zoom, he

56:46

said, um, there's also a small subset of

56:48

people who just have very little inner

56:50

life,

56:52

>> and you're one of them.

56:54

>> And I was like, what? You know, I write

56:58

books, you know, I I meditate, I

57:00

ruminate. I mean,

57:01

>> how can he make that distinction,

57:03

though? How does he know what's going on

57:04

inside your head? He felt that my

57:06

inability to isolate a thought

57:10

was evidence that there weren't thoughts

57:13

and that I was kind of backfilling with

57:15

all this other, you know, simultaneous

57:17

stuff going on. I mean, I I didn't agree

57:19

with him. I thought it was kind of

57:21

crazy. Um, but that's that's

57:24

>> Have you asked him Have you had

57:25

conversations with him about other

57:26

things? See how he thinks?

57:29

>> No, he's very much in the therapist mode

57:32

like he's asking the questions. Yeah,

57:34

I'd like to know like how he thinks if

57:36

that's

57:37

>> what his mode is.

57:38

>> Yeah, I'd like to talk to

57:38

>> now. He would probably want to say that. Um

57:41

anyway, he's posted all these

57:42

conversations on his website, so if

57:44

people really want to be bored, they can

57:46

check them out.

57:47

>> That's a weird thing to say that you

57:50

know, especially someone like you who

57:51

writes and does think a lot and clearly

57:55

has got some sort of dialogue going on

57:57

in your head. The idea that you don't

57:59

and this guy can say that.

58:01

>> I know

58:02

>> that seems a little arrogant.

58:04

>> Yeah. I think I just didn't fit his

58:06

template of like how people think.

58:09

>> Yeah. Well, that's why you should get a

58:11

better therapist. You move around.

58:13

>> All right. Find somebody else.

58:15

>> Good advice.

58:16

>> I mean, it seems like that's a very

58:17

narrow mind. I I couldn't imagine saying

58:20

to anyone regardless

58:21

>> very little inner life.

58:23

>> Yeah. Well, regardless of what kind of,

58:24

you know, theory I'm following or, you

58:27

know, what school of thought, I don't

58:30

know what's going on in your head. I

58:31

can't. It's not possible.

58:33

>> No. And that that's it. There's a

58:35

William James said this, the great, you

58:37

know, founder of American psychology,

58:39

that the breach between two

58:40

consciousnesses is one of the biggest

58:42

breaches in nature.

58:43

>> Yes.

58:43

>> And we, you know, I don't know you're

58:46

conscious for a fact. Um, I assume it

58:49

because your behaviors mesh and we're

58:51

the same species and we have theory of

58:54

mind. We can imagine our way into

58:56

someone else's head, but it's a guess.

58:58

It's a guess. And uh, so there's I mean,

59:01

that's part of the mystery.

59:03

>> Well, it's one of the things that I do

59:04

when I'm talking to people. I I try to

59:06

imagine Well, I've I'm so fortunate that

59:10

I've been able to have so many

59:11

conversations with so many different

59:12

people, so many different ways that

59:14

people view the world. And when I'm

59:16

talking to someone, particularly if

59:18

they're very different from me or anyone

59:20

I know, I always try to put myself in

59:23

their head

59:24

>> and I after they talk for 15 or 20

59:27

minutes, I I try to like recognize like

59:30

how they approach things and see if and

59:33

I'm like what is that what's that world

59:36

like? Like this person's perspective,

59:38

especially.

59:38

>> So you're operating on two tracks.

59:40

>> I mean you're you're holding the

59:41

conversation.

59:42

>> Yeah. But you're also thinking,

59:44

>> I'm trying to tune in. Yeah. Right. I'm

59:46

trying to because I I always feel like

59:49

when someone is like a great

59:50

performance, like a great comedian or a

59:52

great musician, one of the things that

59:54

they're doing is they're bringing you

59:55

into their head. Yeah.

59:57

>> like there's a there's a hypnosis. When

59:59

someone sings an amazing song and the

60:01

whole crowd is singing along, there's

60:03

there's a hypnotic element to that

60:05

>> where when someone's like really killing

60:07

it on stage and their voice is just

60:09

perfect. It's like, oh yeah, like you're

60:11

in their head. Like it's

60:13

>> it's a it's a it's a mind meld.

60:15

>> It's a Yeah, it is a mind meld. And

60:16

there's a little bit of that that goes

60:18

on in conversations. There's a mind

60:20

meld. And I

60:22

>> always try es especially if there's a

60:24

rational person. I always try to put

60:27

myself in their head or at least

60:29

>> empty out mine. Yeah.

60:30

>> And let them think and then try to just

60:33

keep the conversation rolling with just

60:36

pure curiosity.

60:38

>> Yeah. But always, you know, try to

60:41

think, I don't think the same way other

60:43

people do, and maybe maybe I can learn

60:46

something from this. Maybe I can get

60:47

something out of the way they think.

60:49

>> Seems to me you're you're you have a

60:51

real gift of curiosity.

60:53

Um I mean, that's a it's a big gift. I

60:56

mean, you're intensely curious person.

60:59

>> Well, I've always been that way, but

61:00

I've been very fortunate that I've had

61:02

something like this that allowed me to

61:04

feed it. Yeah. You know, I mean, the the

61:07

vast majority of time on my phone, I

61:10

just pursue curiosities. I don't

61:13

really do much social media.

61:16

Yeah. I watch interesting YouTube

61:18

videos. Like I I went down a black hole

61:20

rabbit hole last night.

61:21

>> Oh my god. You want to really break your

61:24

brain? There was a there's a video of

61:25

Brian Cox where he's talking about this

61:27

black hole that they found that's bigger

61:28

than our entire solar system.

61:31

>> Wow. it the event horizon extends far

61:34

beyond Pluto.

61:38

>> That's that is mind-blowing.

61:40

>> Yeah, when he was describing it, he said we

61:42

don't understand why it exists. We don't

61:44

understand how it could have formed so

61:46

early in the universe but yet there it

61:48

is.

61:48

>> How do they measure it? How do they know

61:50

how big it is?

61:50

>> I have no idea. I don't know. I'm

61:53

assuming there's a lot of revelations

61:55

that have come out uh since the

61:57

implementation of the James Webb

61:58

telescope.

61:59

>> Yeah. Those images are incredible.

62:01

>> Insane.

62:02

>> Yeah.

62:02

>> Insane. And this is one that's causing

62:05

this very interesting um new uh theory

62:10

or perspective on the age of the

62:12

universe. So, there's some galaxies that

62:14

they found that shouldn't have

62:16

>> Oh, yeah. Yeah. I've read about this

62:17

that it's it's it's throwing all their

62:20

assumptions about the age of the

62:21

universe up for grabs.

62:22

>> Which makes sense because the further

62:23

you can look back, the more you're going

62:25

to be able to see. The assumption that

62:27

the universe was 13.7 billion years old

62:29

was essentially based on how far we can

62:31

look back. Yeah. And then, you know, the

62:34

analysis of the the radio waves that are

62:36

coming from the supposed explosion.

62:38

>> And then you've got guys like Sir Roger

62:40

Penrose who say, "No, this is a constant

62:42

cycle. It's not one birth of the

62:45

universe. It's it's boom smash boom

62:48

smash forever."

62:50

>> It's an accordion

62:51

>> and it's always happened which is the

62:53

ultimate mind [ __ ]

62:54

>> Well, you know the interesting thing

62:55

about astronomy actually astronomy and

62:59

consciousness studies have the same

63:01

problem which is

63:04

you can't get out of consciousness to

63:05

study it from a distance. Right?

63:07

Everything every tool you have to study

63:10

consciousness is a product of

63:11

consciousness including science. The

63:13

scientific enterprise is a manifestation

63:16

of human consciousness. The the the the

63:18

problems you decide to study, the the

63:20

tools you have to do it with, the scale

63:22

at which you're working, it's all like a

63:25

product of consciousness. Astronomy too

63:28

is trying to understand something

63:30

it can't get outside of, right? I mean,

63:33

because its subject is everything that

63:35

there is, the universe. So you can do

63:38

interesting things from inside using

63:40

telescopes and you know you can figure

63:42

out how old things are and and rates of

63:45

expansion and all this kind of stuff but

63:46

you you can never get that godlike

63:49

perspective that we have with other

63:51

scientific problems. And this is I think

63:54

part of the reason we haven't solved the

63:57

the consciousness problem that we can't

64:00

get outside. We're in a

64:02

labyrinth and everything everything we

64:05

know is consciousness. I mean, which is

64:07

a very weird idea. I remember asking uh

64:10

Christof Koch, the scientist I mentioned

64:12

earlier. I said, "Well, what would the

64:14

world be like without any

64:16

consciousness?" And that is a trippy

64:18

thought. Um because everything we

64:21

perceive is, you know, the scale of

64:24

things like we we we operate at this

64:26

scale, right? We're like five or six

64:28

feet tall. Um, we have bodies like this,

64:31

but there's another world going on

64:33

microscopically and there's another

64:34

world going on macroscopically. So, if

64:36

there's no consciousness, what's the

64:38

proper scale? There isn't any. And when

64:41

I asked him this question, he said,

64:42

"Particles and waves. That's all there

64:44

is. There'd be nothing but particles and

64:46

waves. There might not even be

64:47

spacetime." That may be a product of

64:50

consciousness also. So, that was um kind

64:55

of mind-blowing to learn.

64:56

>> That's the weirdest perspective. is that

64:59

consciousness is a part of reality. That

65:02

it is how reality is formed and that

65:05

without consciousness and the perceiving

65:07

of it, all this stuff doesn't exist.

65:10

>> Something exists but it's not it has no

65:14

shape. It has no scale. It has no

65:17

>> right

65:18

>> uh

65:18

>> because consciousness is what's

65:19

perceiving light and we're perceiving

65:22

colors and

65:22

>> and it's constructing

65:24

>> but it really is just particles.

65:26

>> Yeah. and waves and

65:27

>> waves and particles and atoms and

65:29

subatomic particles and when you get

65:31

into the weirder stuff

65:32

>> and we give it order

65:33

>> right

65:34

>> I know which I you know it's just a

65:36

mind-blowing idea and

65:38

>> it it's a it really is a game-changer

65:40

because if you think about it that way

65:41

you go okay well what is all this solid

65:44

stuff

65:45

>> what is this like does this even really

65:48

exist or does it only

65:49

>> this table this there's a famous uh

65:52

Arthur Eddington was a physicist early in

65:54

the 20th century And he said the real

65:57

table is mostly space

66:00

and only in our consciousness and at our

66:03

scale is it solid. And um but at the

66:08

scale of particle physics which is

66:11

an equally legitimate scale, it's just wide

66:13

open space

66:15

>> um with these waves and particles but a

66:17

lot of emptiness. Um that was kind of

66:20

mind-blowing too. So,

66:22

>> but that's just such an abstract concept

66:24

for a person in their car right now

66:26

listening on the way to work. Like, what

66:28

the [ __ ] are you talking about?

66:29

>> Maybe they want to pull over.

66:30

>> All this stuff is real.

66:32

>> Yeah,

66:32

>> it is sort of, but only if you're

66:36

conscious.

66:37

>> Well, you could think of consciousness

66:39

as the way the universe

66:41

>> experiences itself.

66:42

>> Yeah. And um

66:44

>> Well, that's what really we're like what

66:45

if the universe is consciousness?

66:47

>> Yeah. I mean, that's another way to look

66:49

at it. Maybe consciousness is part of

66:50

the universe and and but it's not giving

66:53

it the order that we give it. Um you

66:55

know we see at a certain spectrum of

66:57

light. There's you know bees see it

66:59

another spectrum of light. You know

67:00

the world we behold, the

67:03

world that appears to us is the world

67:06

that our senses allow us to see. When I

67:09

was doing this research on plant

67:10

intelligence they have 20 senses. We

67:12

only have five. They're picking up

67:15

magnetic fields. They're picking up pH.

67:17

They're picking up uh nitrogen levels.

67:20

You know, they have all these

67:20

>> How do we know all this?

67:22

>> Um they're researchers working on it.

67:24

There's a group of botanists who call

67:26

themselves plant neurobiologists

67:28

knowing full well there are no neurons

67:30

in plants. They're kind of trolling more

67:33

conventional botanists and they're doing

67:34

these cool experiments with with plants.

67:37

Um a couple examples of of some of these

67:41

amazing things plants can do, they can

67:43

hear. Uh, so if you play a recording of

67:47

a caterpillar munching on leaves,

67:50

they'll react and they'll send chemicals

67:52

into their leaves to make them taste bad

67:54

or be toxic.

67:56

>> Yeah,

67:56

>> they can see. There are um there are

67:59

vines that change their the shape of

68:02

their leaves depending on the plant

68:04

they're twining up in order to be

68:07

hidden. How do they see the shape to

68:10

imitate it? We don't know. They um

68:13

plants will um go toward a pipe with

68:16

water in it because they can hear the

68:19

water even though it's totally dry and

68:22

they'll send their um their roots down

68:24

to it.

68:25

>> They can hear the water.

68:26

>> They can hear Yeah.

68:28

>> There there's a this plant

68:31

neurobiologist showed me this a couple

68:33

videos he'd made. I actually just posted

68:35

them on my website. Um uh he he showed

68:39

that a uh a corn plant's roots can

68:42

navigate a maze to get to fertilizer.

68:45

>> So you put a little fertilizer in a

68:47

corner and the root will find the most

68:50

direct route to the nitrogen.

68:52

>> There was a uh plumbing problem that I

68:55

had in my house in California and um uh

68:59

the plumber couldn't figure out what was

69:00

wrong. It was like the the the pipes

69:03

were stuck. And what what had happened

69:05

was in the backyard, one of the trees,

69:09

the roots had gotten into the pipe and

69:12

formed like this tree.

69:15

>> I mean, it was huge. It looked like when

69:16

I pulled it, I put it up on my

69:18

Instagram. See if you can find it. It's

69:19

It looked like a muskrat.

69:22

>> I mean, it was like dense with roots and

69:26

it was thick. It was like three feet

69:28

long. It was That's it.

69:31

>> That was in my pipe.

69:32

>> Oh my god.

69:33

>> Ain't that crazy?

69:34

>> Yeah. What kind of tree was it?

69:36

>> I don't know. I think it was an oak tree

69:39

cuz there were oak trees, excuse me, in

69:41

the backyard where they dug up.

69:42

>> That's wild.

69:43

>> But look how thick it is.

69:44

>> Yeah.

69:45

>> It's crazy. It's And it went through a

69:46

tiny little crack.

69:48

>> Yeah.

69:48

>> It I mean it probably forced the crack

69:50

open and then went in there and just

69:53

really grew out.

69:55

>> Yeah. Well, it had a source of water.

69:57

>> Yeah. But it's just kind of bananas that

69:59

somehow or another it figured out that

70:01

there was water in that pipe.

70:02

>> You know, we underestimate plants

70:04

basically because we can't see their

70:06

behaviors. And and then going to that

70:08

point about scale. They have a they

70:10

operate at a a time scale that seems

70:12

very slow to us, so we don't notice. But

70:15

if you use time-lapse photography, you

70:16

see what they're up to, and it's it's

70:18

pretty amazing. Another another

70:20

interesting um video that this guy

70:22

showed me, his name is Stefano Mancuso.

70:24

He's an Italian scientist, a botanist. It's

70:27

um uh how bean plants find a pole to to

70:30

grow up. And so he grows these beans and

70:32

he has a metal pole on a dolly.

70:35

>> And you know, I always assume they made

70:37

this pattern. Darwin called it

70:39

circumnutation that, you know, they go

70:42

through this spiral. And I always assume

70:43

they just kind of did this till they hit

70:45

something. No, they know where the pole

70:48

is. And you watch this thing and it's

70:51

it's going in circles, but it's reaching

70:54

and reaching. It looks like a fly

70:56

fisherman, you know, casting and it

70:59

finally gets to the pole. And so, how

71:02

does it know where the pole is in space?

71:04

Well, one theory is that um every time

71:08

uh the cells divide, there's a little

71:10

sound that's produced and that maybe

71:13

they're using echolocation like a bat

71:15

kind of bouncing it off of the pole and

71:17

that's how they know where they are in

71:19

space. We we still don't understand.

71:22

>> I know some amazing things. Um and also

71:26

you can uh teach a plant a certain

71:28

behavior

71:30

and it will remember for 28 days. So

71:33

they do this thing with um sensitive

71:36

plants. You you may have seen them in

71:37

Hawaii actually. It's a tropical plant.

71:39

When you touch it, the leaves collapse

71:41

to keep from being eaten. It's called

71:44

Mimosa pudica. And um normally if you

71:48

shake it, it'll also do this. And if you

71:50

shake it repeatedly, it learns to ignore

71:52

that that um stimulus. Um and it will

71:56

remember 28 days and it won't react when

71:59

you do it. Um to to give you some

72:01

comparison, um fruit flies can only

72:04

remember stuff for 24 hours. Um and then

72:08

they start over again. Um so another

72:11

fact about plants, I got really deep

72:13

into this. Um because I was trying to,

72:15

you know, these these guys say plants

72:17

are conscious. Yeah. They have some kind

72:18

of basic form of

72:21

consciousness.

72:22

Um here's another one. The anesthetics

72:27

that we use to put us out for surgery

72:30

put plants out. So a a um Venus fly trap

72:35

if you give it an anesthetic will not

72:37

react when the bug comes across it.

72:41

Now that is like really interesting

72:43

because it suggests they have two modes

72:44

of being, right? Sort of like you know

72:46

unconscious and conscious.

72:49

>> Yeah.

72:49

>> Or aware. Um so Stefano believes that

72:53

they're conscious. Now, this raises

72:55

interesting ethical issues, right? If

72:58

plants are conscious,

73:01

do they feel pain? And I was really

73:04

a little worried about that. Um, you

73:06

know, what if that beautiful smell of an

73:09

a freshly mown lawn is actually a

73:13

chemical equivalent of a scream?

73:17

Yeah. Um, but Stefano said he doesn't

73:20

think they feel pain. Um,

73:22

>> why does he think that? He said that

73:24

pain would not be adaptive for a

73:26

creature that can't run away.

73:28

>> Well, if that's the case, then why do

73:29

they produce chemicals to make

73:30

themselves taste worse?

73:31

>> They they know they know what's going

73:33

on. They're aware that they're being

73:35

eaten, but that it doesn't register to

73:39

them as pain. I don't know how he knows

73:40

this, but

73:42

if he's wrong,

73:44

then you know, and we care about that,

73:48

what's left to eat?

73:51

Well, I think you have to make the

73:53

assumption that life eats life.

73:55

>> Yeah. And that and another scientist um

73:58

uh um that I interviewed uh about this

74:01

who does think plants feel pain says,

74:03

"Look, it's just a fact of life. We have

74:04

to eat other species." And um he was

74:07

kind of, you know, gruff about that. Um

74:10

but anyway, Stefano's idea is that uh

74:13

you know, being able to move, take your

74:15

hand off the hot stove or run away. Um

74:19

then pain is really useful. It's a

74:21

really important signal. But he but he

74:23

also points out that lots of plants like

74:25

to be eaten. I mean you know grasses

74:27

benefit from being grazed by a ruminant,

74:29

right? That regenerates them. They want

74:31

to be eaten.

74:32

>> And then you have all the fruits and

74:33

nuts, the seeds that they produce

74:36

that they want mammals to take away and

74:38

spread their seeds. So you don't have to

74:40

worry about um going beyond vegan.

74:44

>> No. Well, it just seems like a cycle. It

74:46

seems like a very an interesting cycle

74:48

that exists with all living

74:51

>> things.

74:51

>> And then of course when you die,

74:53

>> right? The you know plants eat meat,

74:56

right? They they consume they're

74:58

>> carnivores.

74:58

>> Yeah. That's the thing. They consume all

75:00

the dead animals that die near them.

75:02

>> Yeah. And and uh fungi.

75:04

>> Yeah. And fungi. Well, that's the other

75:06

weird things. The mycelium that they use

75:08

to communicate with under the

75:09

>> Well, that's another really interesting

75:11

case of intelligence in nature, right? I

75:13

mean, you know, you've probably done

75:15

shows on this, but you know, the way

75:16

they they uh use mycelium to send

75:19

nutrients to their children um or or

75:23

share them in the forest um

75:25

>> allocate resources to certain plants that

75:26

need them more.

75:27

>> Yeah. And also communicate risk. I mean,

75:30

that that there's a threat um and and so

75:33

there are alarm signals that go out. Um,

75:36

you know, the the the the overall place

75:39

we're getting to with this as we look at

75:41

consciousness and all these other

75:42

species is that it's the world is just a

75:45

lot more alive than we thought and that

75:47

we've been, you know, the whole legacy

75:49

of the enlightenment and western science

75:51

has been that like we have some monopoly

75:54

on on this stuff and everything else is

75:56

more or less dead or, you know, we can

75:58

use it as we wish. But we're seeing I I

76:03

think we're approaching like a Copernican

76:05

moment for our species. Um you know when

76:08

Copernicus came along and he said

76:10

actually the earth revolves around the

76:12

sun not the other way around. It was

76:14

like mind-blowing to people that our

76:16

centrality in the universe had been

76:18

we've been dethroned. And we were

76:21

dethroned again when, you know, Darwin

76:23

said, "We're animals. We're animals

76:25

like all the other animals and we

76:26

evolved um from animals." That blew

76:30

people's minds, too. I think that we are

76:34

we're kind of democratizing

76:35

consciousness, that consciousness is is

76:37

much more extensive than we thought, and

76:40

the world is more animate than we

76:42

thought. And that's an old idea. You

76:45

know, traditional cultures have always

76:47

believed that the world is full of

76:48

spirit and that you had to respect

76:51

animals and um and all living things and

76:54

and some to some cultures rocks also,

76:57

you know, dead things. Um so I I think

77:00

we're at this moment of reanimating the

77:02

world right now and it's science that's

77:04

driving it and um I think that's really

77:06

exciting. Um,

77:08

>> it is exciting, but it's such a paradigm

77:11

shift in terms of people's perceptions

77:12

of the world that it's going to be

77:14

difficult for like your average

77:17

40-year-old person that works an office

77:19

job to swallow.

77:20

>> Yeah. Yeah. It also makes sense why

77:23

offices feel so soulless when you walk

77:26

into one and everything is made out

77:28

of synthetic material and plastics and

77:31

metal and it's all

77:33

>> manufactured and you're under these

77:35

[ __ ] lights

77:36

>> and it just feels wrong.

77:38

>> Doesn't feel alive.

77:39

>> No, it doesn't feel alive at all. You might be

77:41

just surrounded by things that don't

77:43

have consciousness because they've been

77:44

kind of stuffed into a form and then

77:47

stuck in place rather than something

77:50

that exists that works with the earth.

77:53

Like soil is alive, right?

77:55

>> Yeah. So,

77:56

>> and yeah, there's another example. Soil

77:57

is a lot more alive than we ever

77:58

realized. We we thought it was just

78:00

dirt,

78:00

>> right?

78:01

>> And now we know that there, you know, a

78:02

million critters in every teaspoon full

78:05

of

78:05

>> There's a really cool um channel that I

78:08

follow on YouTube. It's a guy who takes

78:10

like rainwater or pond water and he puts

78:13

it in a jar with some plants and he just

78:16

leaves it there for months and then he

78:18

comes back and there's all these living

78:20

things moving around it. See if you can

78:22

find that guy on on YouTube. It's

78:25

>> I I So I I dug a pond or had a pond dug

78:28

on my property in Connecticut and and I

78:30

watched life come to this pond. It's

78:32

just, you know, it was just a hole with water

78:34

>> and within a month it was teeming with

78:37

life. It's just amazing. like how does

78:38

it get there?

78:39

>> Birds carry a lot of it in and frogs

78:42

carry a lot of it in. And I and I after

78:44

a month or two I looked at it under a

78:46

microscope and you couldn't believe it

78:47

was like a city of critters. Um it was

78:50

>> they find like trout in lakes that are

78:53

like way high in the mountain and no one

78:55

ever stocked the lake and they're like,

78:57

"Okay, how did it get in there?" There's

78:59

all these theories.

79:00

>> Birds pick up eggs and deposit them, I

79:03

guess, is is one way,

79:04

>> right? But like how do they get

79:06

fertilized?

79:07

That's a good question. Maybe they're

79:09

already fertilized.

79:11

>> Do you think? I don't know.

79:13

>> Yes, that's it.

79:14

>> These have lots of views, but

79:16

>> Yeah, that's it.

79:17

>> On the left specific one.

79:19

>> So, this guy, he just takes pond water

79:23

or lake water or rainwater and he puts

79:26

it in a jar and then he leaves it there.

79:27

Yeah, it is like go to like day 60.

79:30

>> Where is that? Sorry.

79:32

>> On the top row where it says day 60 to

79:35

the right. See where it says day 60?

79:37

Click on that. So he takes these things

79:40

and then searches them after, you know,

79:43

x amount of days. And you see all this

79:46

stuff living in there, all these things

79:48

swimming around in there.

79:50

This isn't the same guy, so there must

79:52

be other guys that do the same thing.

79:54

But you see these weird little creatures

79:56

that are floating around in there. And

79:59

>> yeah, I brought my pond water to a

80:01

biologist and he like wanted

80:02

>> This is different cuz this guy's

80:03

bringing in he's making an actual

80:04

aquarium.

80:06

>> The guy that I saw was just he

80:08

essentially just figured out how to take

80:11

a scoop of dirt and whatever is alive

80:14

that's in that dirt with some muddy

80:15

water and put it in a jar and put more

80:18

pond water in there and then just leave

80:19

it there. And then you see all these

80:21

weird little

80:24

the little like little crustaceans,

80:26

weird little shrimp looking things.

80:27

Yeah. And some of them are killing the

80:29

other ones. So there's like a real

80:30

ecosystem in there.

80:31

>> Oh yeah. Yeah. And it's just created

80:34

like overnight.

80:34

>> Yeah.

80:35

>> It's very cool. So I think that this is

80:37

like a a trend of our time that's really

80:40

important that you know we went from

80:41

this idea of the dead world that we

80:44

could exploit to this other you know

80:47

idea that it's much more animate and and

80:49

of course that's not that's the default

80:51

for humans. All traditional cultures

80:54

believe in animism basically. Um it's

80:57

also the default for kids right? Kids

80:59

think everything is animate until we

81:01

knock it out of them in school.

81:03

>> Yeah. And so it's very interesting to

81:05

see science supporting this idea after

81:08

after all these years. And the other

81:11

thing that's kind of interesting is that

81:13

it's happening at the same time that

81:17

some people think AI is going to be

81:18

conscious.

81:20

So we're under pressure from both sides.

81:24

I mean that we're getting these two you

81:26

know these two things happening at once

81:28

that machines may be may soon be smarter

81:31

than we are may be conscious although we

81:34

could talk about I don't think they can

81:35

be conscious but they can certainly make

81:37

us think they're conscious um and then

81:39

on the other hand we have the animals

81:42

who clearly are conscious and

81:44

the research on animals is like they're

81:47

down to plants they're down to insects

81:49

that you know have signs of I would use

81:52

the word sentience rather than

81:54

consciousness because consciousness

81:55

implies interiority and and you know um

81:59

the the voice in your head and things

82:00

like that. They have a more basic form

82:02

of consciousness that I call sentience

82:05

>> like dog consciousness.

82:06

>> Yeah, I think dogs are higher conscious.

82:09

I think they're more conscious than uh

82:11

than those simple things. I I would say

82:13

dogs are conscious, not just sentient.

82:15

Um

82:16

>> is it just because they communicate with

82:17

us that we think that? I mean, why would

82:19

we assume if plants have all these

82:21

different senses and we see this

82:23

communication with them in terms of like

82:25

allocating resources to other plants

82:26

that need it, the use of mycelium, their

82:28

ability to do all these different

82:30

things,

82:31

>> why why are we assuming that just

82:33

because they can't move the way we move?

82:35

>> Yeah. That they don't have more going

82:37

on, right?

82:38

>> Yeah, it's it's possible, but I don't

82:39

know what what good it would do them.

82:41

Like plants, what they get really good

82:44

at, what matters to them is

82:46

biochemistry. They have to produce

82:47

chemicals either to um poison their

82:50

enemies or or confuse them with, you

82:53

know, with drugs. Um

82:54

>> but they also want to grow and thrive.

82:56

>> They do want to grow.

82:57

>> And they also exist in a community.

82:59

>> Yes, they Oh, definitely.

83:01

>> Right. So, don't you think that

83:02

consciousness would be uh essential in

83:05

order to foster that feeling of

83:07

community?

83:08

>> That's interesting. I hadn't thought

83:09

about that. Yeah. Yeah, that could be.

83:12

Dogs are easy an easier case because

83:14

they communicate with us, right?

83:16

directly.

83:17

>> They're clearly conscious.

83:18

>> Yeah.

83:19

>> In a way that's like very profound,

83:21

>> but different than we obviously.

83:24

>> One of the um realizations I had when I

83:27

was in the cave was that, you know, we

83:30

we often think that we're more conscious

83:31

than animals, but actually animals are

83:34

more conscious than we are. They have to

83:35

be they have to be present because they

83:38

get eaten if they're not, right? because

83:40

we have this giant structure of

83:42

civilization and the security it gives

83:44

us and we have this technology that

83:47

allows us to check out. Um, but I

83:50

actually think animals are more

83:51

conscious than we are. It's different,

83:52

but they're if if we think of being

83:55

conscious as really being present to the

83:58

moment. Dogs are very present to the

84:00

moment.

84:01

>> Well, certainly animals are getting more

84:02

information about the environment than

84:04

we are.

84:05

>> Yes. They have much better sense of

84:07

smell, much better sense of hearing.

84:10

>> Um there's a lot of different things

84:12

that they can do. Like animals seem to

84:14

be able to tell when you're nervous.

84:16

>> Yeah. Oh, they read they read the

84:17

environment. They read other creatures.

84:20

>> Yeah. And you know, we used to have more

84:22

skills when we had to survive in a

84:24

natural world in in nature. um you know

84:27

we um I mean you see this with

84:30

traditional you know with tribes

84:32

indigenous tribes that they have

84:33

knowledge of nature that far exceeds

84:35

ours because they need it to survive.

84:38

>> But anyway so I I think we're going to

84:41

get to a point where we have to decide

84:44

whose team we're on. Are we like with

84:47

these machines that speak our language

84:49

and speak in the first person and sound

84:51

like us,

84:52

>> right? or are we with the animals that

84:54

can feel and suffer and die?

84:57

>> And um and I think that's going to be a

85:00

a big choice for us to make as a

85:02

civilization.

85:03

>> Why do you think that AI won't be

85:06

conscious?

85:09

>> The the most interesting line of

85:10

research. Well, a couple reasons. Um the

85:13

first is the idea that it can be

85:15

conscious, which is very common in

85:17

Silicon Valley. I talked to lots of

85:18

people there and they say, "Oh, it's

85:19

just a matter of time."

85:21

Some of that is confusion that

85:23

intelligence and consciousness

85:25

necessarily go together and they don't.

85:27

They're very they're they have an

85:28

orthogonal relationship, right? I mean,

85:31

you know, people who are conscious and

85:33

not too intelligent, right? And we all

85:35

do. Um, so so it's not going to just

85:38

come along for the ride with

85:40

intelligence as these machines get more

85:42

intelligent. But the belief that AI can

85:45

be conscious is based on a metaphor that

85:47

I think is a crappy metaphor. And that

85:50

is that the brain is a kind of computer.

85:53

And this is widely held. It's

85:55

interesting to note that in history,

85:58

whatever the cool cutting edge

86:00

technology was, brains were likened to

86:02

that. So it was it was looms for a

86:05

while, it was uh clocks for a while, it

86:08

was telephone switchboards, whatever was

86:10

the cool technology. Surely that's what

86:12

that's how brains work. Now it's

86:14

computers. But think about it. In a

86:17

computer, you have this sharp

86:19

distinction between hardware and

86:20

software. That's the key to their

86:22

success. And you can run the same

86:24

program on any number of different

86:25

hardware. They're interchangeable.

86:28

Brains aren't like that. There's no

86:30

distinction between hardware and

86:31

software. Every experience you have,

86:34

every memory is a physical change to the

86:37

brain, to the way it's wired.

86:39

um you know we start out with all these

86:41

connections and they get pruned as we

86:43

grow up. Uh every brain is shaped by its

86:46

experience. So this idea that you could

86:49

separate that consciousness is some kind

86:51

of software that you could run on other

86:53

things besides um meat um I just think

86:58

doesn't hold up. Well, if the universe

87:00

is experiencing itself subjectively

87:02

through consciousness, why why does it

87:05

have to be only biological

87:08

consciousness? Why? It doesn't have to

87:10

be.

87:10

>> But if there is a technology that is

87:13

invented that essentially does all the

87:16

things that a human body does physically

87:19

and also interacts with consciousness,

87:21

the consciousness of the universe.

87:23

>> Yeah. I mean if

87:25

>> hypothetically

87:26

>> hypothetically if the universe is

87:28

conscious if we are using the mind as

87:31

essentially an antenna to tune into

87:32

consciousness

87:35

>> other things we could make an antenna.

87:38

>> Yes. Absolutely. It's also likely that

87:40

if we are ever visited by aliens

87:44

>> that they will have some kind of

87:45

consciousness and it may not be

87:47

meat-based. Right.

87:48

>> Right. Right. Well it may be at one

87:50

point in time it was. They realize that

87:52

there's biological limitations in terms

87:54

of its ability to evolve that can be far

87:58

surpassed with technology.

88:00

>> Yeah. I mean that or it just it it

88:02

evolved in a different way, you know, or

88:04

they're channeling it in a different

88:05

way. But the other reason I don't see it

88:08

happening with computers as we know them

88:11

um because that's you know that's the

88:13

debate now whether these computers we

88:15

have that you know these large language

88:17

models and the next generation can be

88:19

conscious is that um the research that I

88:23

found most persuasive about

88:24

consciousness is uh basically has

88:29

consciousness beginning with feelings

88:31

not thoughts in other words it's

88:33

embodied And I have to just develop this

88:37

a little bit. Um, but we, you know, the

88:40

brain exists to keep the body alive, not

88:42

the other way around. Although we tend

88:44

since we identify with our heads where

88:46

most of our senses are, we we lose track

88:48

of that. And the body speaks to the

88:51

brain in feelings, right? You know,

88:54

feelings of hunger, itchiness, warmth,

88:56

cold, um, but also feelings of shame. uh

89:01

when our social standing is not, you

89:04

know, has been damaged. Um anyway, we

89:07

have these feelings. They depend on a

89:09

body. Um feelings have no weight if

89:14

you're not vulnerable, if your body isn't

89:16

vulnerable. Um and probably mortal um so

89:21

consciousness is embodied in a really

89:23

critical way and computers are not. Now

89:27

robots will be and I actually

89:30

interview a guy a a scientist at USC who

89:33

is trying to make a vulnerable robot.

89:37

So he's essentially upholstering the

89:40

thing with skin that can tear and be

89:42

damaged and he's filling the skin with

89:45

all these sensors so that it can be like

89:48

us and be vulnerable and and generate

89:51

feelings that are how consciousness

89:53

begins. So for a long time we thought

89:56

consciousness had to be in the cortex

89:58

right the the most human newest part of

90:01

the brain the outer covering and that's

90:03

where rational thought and executive

90:05

function are and all these kind of

90:06

things. Um but as it turns out it really

90:10

begins with feelings in the brain stem.

90:13

Let's say you have a feeling of hunger.

90:14

It registers in the upper brain stem and

90:17

only later does the cortex get involved

90:19

like helping you figure out how are you

90:21

going to feed yourself like imagining

90:23

you know a meal counterfactuals of

90:26

different meals or making a reservation

90:28

at a restaurant. All all those are

90:29

cortical things but it begins in the

90:32

brain stem with feelings. So if that is

90:35

true, and I find that really persuasive

90:37

because people born without a cortex are

90:40

still conscious. Uh animals that you

90:43

take the cortex out still show signs of

90:46

consciousness. Um whereas if you damage

90:49

the upper brain stem, um you're out, you

90:52

know, you're you're unconscious. So if

90:54

this is true and consciousness is this

90:56

embodied phenomenon that depends on

90:59

having a body to mean anything. Um I

91:03

don't see how machines are going to do

91:04

that.

91:05

>> But isn't the key word there if

91:07

>> Yeah. If Yeah, definitely. I mean this

91:10

is just something that we're tuning into

91:12

that's around us all the time.

91:14

>> There will be other ways to do it,

91:15

right?

91:16

>> But it won't be these computers we're

91:17

building right now.

91:18

>> Why is that? because they're designed um

91:22

you know they're good at so here's a

91:24

paradox of computers computers are

91:27

really good it's called Maravex Morovx

91:30

paradox computers are really good at the

91:33

highest kinds of rational thought right

91:36

they can play chess and go they can

91:38

simulate real thinking and some say some

91:41

people say they do think um the more uh

91:45

primitive kinds of things that go on in

91:47

our brain including elaborate movement,

91:50

um, changing diapers, they're very bad

91:52

at that. Um, you would never trust a a

91:55

robot to do that, as much as you might

91:57

want to. Um, they're, um, but they're

92:01

not good at that kind of, u emotional

92:03

stuff. Um, you know, the more limbic

92:06

part of our brain. They can't do that.

92:08

Um,

92:09

>> yet

92:10

>> it's definitely yet, but you know, I

92:12

mean, if we go out far enough,

92:13

anything's possible.

92:15

>> That's the point.

92:16

>> Yeah. The point is these things, what

92:18

we're looking at now is essentially a

92:21

single-celled organism becoming a

92:23

multi-celled organism. Yeah.

92:24

>> I mean, the potential for what they

92:26

could become is unlimited, especially

92:29

once they start making better versions

92:31

of themselves.

92:32

>> Well, and they will,

92:34

>> they've done this. This is what chat

92:35

GPT-5 is. ChatGPT-5 is essentially

92:38

programmed by ChatGPT.

92:40

>> They they've kind of given up on the

92:41

idea of programming these things.

92:43

letting them program themselves,

92:45

>> which is a dumb idea if you want to

92:47

survive.

92:48

>> I agree. Look, the idea that um we give

92:52

rights to these machines or personhood,

92:55

I think is really stupid because then

92:57

you lose control completely.

92:58

>> Well, it's probably coming because

93:00

people are very shortsighted and they I

93:03

think there's a romantic idea that

93:04

you're creating a life and I think

93:06

there's also the real risk that people

93:08

are going to worship this life and that

93:10

this life will be far superior to what

93:12

we are. And so there'll be a group of

93:14

people that that's their new religion.

93:17

>> Yeah. No, I there are signs of that

93:18

already. Yeah. I think that's really

93:20

dangerous. You know, it's it's

93:22

interesting talking to Silicon Valley

93:24

people and they're talking about giving

93:25

moral consideration to these to these

93:27

machines. It's like really

93:30

>> they're thinking about yachts. They're

93:32

they're just coming up with

93:33

rationalizations for why they should

93:35

keep their foot on the gas.

93:36

>> Well, yes, they are. I mean, it's it's

93:38

just all a way of saying, "Look how

93:40

powerful this technology is. Don't you

93:41

want to invest?

93:42

>> And it's also the idea that we have

93:45

enemies and so we have to develop before

93:47

they do.

93:47

>> Yeah. The race the race with China. I

93:50

think it'll turn out to be a a real

93:52

historical tragedy that this technology

93:55

came of age during this administration

93:57

because this administration has no

93:59

stomach to regulate it at all.

94:01

>> But can they?

94:02

>> They could.

94:03

>> But here's the question.

94:05

If it is a national security threat like

94:08

if China developing all powerful general

94:11

super intelligence that can automate

94:15

everything do everything it's dangerous

94:18

if they get that before we do.

94:19

>> Yeah. But you know look what happened

94:21

with nukes right we made deals right to

94:23

control them. I mean we'd have to make

94:25

you know

94:25

>> but why would you make a new A nuke deal

94:28

makes sense because it's mutually

94:29

assured destruction for everybody.

94:31

>> Yeah. This doesn't. You could run it

94:34

and control everything and not kill

94:36

anybody with it. But you are incredibly

94:38

powerful. You are in control of all the

94:40

resources of the world, all the computer

94:42

systems of the world, all of the

94:44

power grids, everything.

94:46

>> Yeah. But if you're really concerned

94:47

with that, why are you why is Trump

94:49

selling these chips to China? Why is he

94:51

willing to give the give away the you

94:53

know the crown jewels of like

94:56

>> these chips?

94:56

>> Selling them through Nvidia. Is that

94:58

what you mean?

94:58

>> Yeah. He gave them permission to to send

95:00

powerful chips to China. I don't I don't

95:02

know how to square that with the

95:04

national security threat.

95:05

>> It's probably some sort of a trade deal

95:07

a and there's probably some sort of an

95:09

assumption that it doesn't matter

95:12

because everyone's doing it

95:14

>> and this is just another way to maybe

95:18

balance out the tariffs or get some

95:20

concessions on certain things.

95:21

>> Yeah. Shortsighted.

95:22

>> It's very shortsighted. But I also think

95:25

this uh I this is kind of like an

95:29

Oppenheimer thing, right? Oppenheimer

95:31

didn't really want to make a nuclear

95:33

bomb, but there's this conundrum. If you

95:36

don't make it, the Nazis are going to

95:37

make it. So, what do you do?

95:38

>> Well, there's also there's a second

95:40

thing going on, the intellectual

95:43

satisfaction of proving you can do it,

95:45

>> right?

95:45

>> And that, you know, is irresistible. And

95:49

a lot of these guys, you know, will say

95:51

they'll cite um Richard Feynman, the

95:54

physicist who they found on his

95:55

blackboard when he died, if I can't

95:57

build it, I don't understand it. So, one

96:00

of the positive things about this effort

96:03

to create conscious computers, which is

96:05

going on, I follow a group in the book

96:06

who are who are trying to make a

96:08

conscious computer. I don't think

96:10

they're going to succeed, but even the

96:12

failure is going to teach us important

96:14

things about consciousness. It's a good

96:16

it's a good way to understand something

96:18

by trying to create it and it'll force

96:21

them to come up with definitions of

96:23

consciousness and and

96:25

uh you know what the minimum

96:26

requirements are for consciousness uh

96:29

and it may help us decide whether it is

96:32

you know a transmission theory you know

96:34

that we're we're tuning it in or or it's

96:37

generated from inside. So, I think

96:39

intellectually it's a really interesting

96:42

project, but I think you need guard

96:44

guard rails. So, this guy who's doing

96:46

the uh building the robot that can feel,

96:49

you know, that has feelings cuz you can

96:51

tear its skin. I asked him, I said, "So,

96:53

will those feelings be real, you know,

96:55

that your robot's going to have?"

96:57

>> And it was, he said, "Um, well, I

97:00

thought so until I had this experience

97:02

on 5-MeO-DMT."

97:07

I said, 'What happened?' He said, you

97:10

know, he described his trip in more

97:11

detail than you need to know. And he

97:13

says, and I realized there's a spark of

97:15

the divine in us that no computer is

97:17

ever going to have,

97:19

>> but he's still, it didn't stop him. He's

97:21

going ahead. He's he's trying to build

97:22

it.

97:23

>> I don't know if he's right. Um I think

97:25

there might be a spark of divine that

97:27

these things don't have, but it doesn't

97:29

mean that there are future versions

97:31

>> that might have it. Especially when you

97:33

scale out a thousand years, 100 thousand

97:36

years, however long we're going to

97:37

survive. Yeah.

97:38

>> If these things do become

97:42

sentient and autonomous and have the

97:44

ability to create better versions of

97:45

itself and have a mandate in order to do

97:47

that to survive, I could see it becoming

97:50

the superior life form. Not just that,

97:52

beyond any comprehension of what we

97:56

could even imagine the power of an

97:59

intelligence

98:01

to to use and to harness in the universe

98:05

like it it could conceivably become

98:08

something like a god. And I have this

98:11

very strange theory about biological

98:13

life in particular and intelligent life

98:15

on Earth

98:16

>> is that the reason why we have this

98:18

insatiable thirst for innovation and the

98:22

reason why we have materialism the

98:23

reason why we're obsessed with objects

98:25

even though we have a finite lifespan

98:26

is because that finite

98:29

lifespan if you thought about it you

98:31

wouldn't you wouldn't be interested in

98:33

materialism but materialism fuels this

98:36

desire for innovation because you don't

98:39

need a new phone, but there's a new

98:40

phone that just came out. Aren't you

98:41

going to get it? And so, the more people

98:43

get it and the more people want to show

98:45

they got it, that sort of materialism

98:47

fuels this innovation that ultimately

98:50

leads to the creation of artificial

98:52

intelligence.

98:53

>> I think it would always do that. I think

98:55

it's bees making a beehive. And I think

98:57

that's just what we do. I think it just

98:59

takes a long time for us to create this

99:01

artificial life. It might be why we're

99:04

here. We might that might be our literal

99:07

purpose in the universe

99:08

>> to create our successor species.

99:10

>> And that might be how well obviously

99:12

like we're so flawed that we can't even

99:14

imagine a world without war.

99:15

>> Yeah.

99:16

>> If you poll the average person, what

99:18

what are the possibility of war ending

99:19

in your lifetime? Almost everyone's

99:21

going to say zero. It's a part of human

99:23

nature. an intelligence unshackled by

99:26

biological need, unshackled by all the

99:28

things that we have, our need to

99:30

procreate, our need for social status,

99:32

all these weird things that keep us

99:34

moving in this strange world that we

99:36

live in.

99:36

>> I would add weird and good things, but

99:38

>> some of them are really good. Yeah.

99:40

Well, good for us. Sure. Not so great

99:42

for, you know, the l the land that you

99:44

trample to put a foundation for the

99:45

house that you've always dreamed of.

99:46

>> True. But I think our mortality is part

99:48

of what gives meaning to our lives.

99:50

>> Sure.

99:51

>> And uh

99:51

>> Right. It's like playing a video game on

99:53

god mode. It's boring.

99:54

>> You can die just shoot everything like

99:57

what is the purpose purpose, right?

99:58

There's no weight to anything

99:59

>> for us.

100:00

>> For us. But if this thing does become

100:04

essentially all powerful if it just if

100:06

you keep scaling outward, you could

100:09

imagine it being akin to a god.

100:12

>> Yeah.

100:13

>> And that might be what God is. It might

100:16

be we give birth to God through this. It

100:20

sounds crazy.

100:21

>> Well, we created God once already,

100:22

right?

100:24

I mean many people believe that, right?

100:27

That God is a creation of of human

100:29

society.

100:29

>> Is that what they think?

100:31

>> Yeah. People who aren't believers

100:32

believe that we

100:33

>> oh that we've artificially created this

100:35

thing. Yeah. In our heads in order to

100:37

give us a structure to live life by.

100:40

Yeah. But that doesn't

100:41

>> morality and everything.

100:43

>> Yeah.

100:43

>> You're saying this is going to be God

100:45

with power.

100:46

>> Well, I'm saying it might be the real

100:48

thing. It might be really how the

100:50

universe gets born. I used to have this

100:52

joke about um the big bang, like they

100:54

couldn't figure out what the big bang

100:56

is. But I think if you get enough nerds

100:58

and enough time, eventually one's going

101:00

to invent a big bang machine and then

101:03

you know this guy's going to be an incel

101:05

hopped up on Adderall

101:09

[ __ ] fully on the spectrum and like

101:11

I'll press it and they boom and then it

101:15

starts all over again and then it takes

101:17

intelligent life to the point where it

101:19

can create a you know the universe

101:20

expands life forms multicellular life

101:24

becomes intelligent life becomes human

101:25

beings.

101:26

filled with curiosity and innovation to

101:28

create a big bang machine,

101:30

>> right? I love it.

101:31

>> Well, it might not be a big bang

101:32

machine, but it might be a god.

101:34

>> It might be a a digital life form that

101:36

is infinitely intelligent.

101:38

>> So, you think there's anything to be

101:39

done about this or we just let it play

101:41

out?

101:41

>> I don't think we can do anything about

101:43

it at this point in time. I don't I

101:44

think it's too late. I think if you were

101:46

Ted Kaczynski

101:48

>> tried that's what he was trying to do.

101:50

Like that's what's really crazy. like

101:52

his manifesto was all about stopping

101:54

technology because he thought it was

101:55

going to surpass the human race.

101:57

>> I think

101:58

>> and there's a whole community of people

101:59

now revisiting his writing and

102:01

>> I know it's kind of nuts.

102:05

>> He's the hero we didn't know we needed.

102:08

>> God,

102:09

>> not really. But well, also you you know

102:12

his history like he was a part of the

102:14

Harvard LSD program where they

102:16

humiliated him and did all sorts of

102:17

different things to try to see like what

102:19

they could do. We're back to MK Ultra,

102:21

which we started down a while ago. Yeah,

102:24

>> I think technology in the form that

102:26

we're experiencing now with AI is

102:29

completely unprecedented and we have no

102:30

idea where it goes. Um, and

102:33

>> well, one place it's going, I mean, in

102:35

the shorter term is I was talking about

102:37

AI psychosis and um, I think that's

102:40

really concerning. I think people

102:42

getting into these synthetic

102:44

relationships. Yes,

102:45

>> these aren't, you know, they're not real

102:47

relationships. When we when we have a

102:49

conversation with a machine, we are

102:51

settling for something less than a real

102:54

conversation. A real conversation has

102:56

eye contact, has like lots of facial

102:59

expressions indicating skepticism,

103:01

indicating agreement,

103:02

>> body language. Um,

103:05

>> but these these conversations are kind

103:07

of impoverished. And then you then you

103:08

have the sycophancy. Um, you know, so

103:11

there's there's no friction and and we

103:13

we learn through the friction. And um so

103:17

I that that's one thing that's happening

103:19

that alarms me. I also think

103:21

counterfeiting people just should not be

103:23

legal. I mean the fact that they can

103:24

create an image of you that will sound

103:27

like you and move like you and

103:29

>> Oh, they're all over the place selling

103:31

different products and all kinds of

103:33

stuff.

103:33

>> But you know, we have a law against um

103:35

counterfeiting money, right?

103:37

>> But we don't have a law against

103:38

counterfeiting people.

103:39

>> Well, it's an emerging technology that I

103:41

don't think they were ready for before

103:42

it became ubiquitous. Regulation is

103:45

always behind,

103:46

>> right? Um it's

103:49

it's just it's so open-ended like you

103:52

really don't know where it's going.

103:54

>> You really

103:55

>> Do you use um uh chat bots? How do you

103:57

use them?

103:58

>> Well, I only use them for like if I'm

104:00

writing something, I start asking it

104:02

questions. I love it because like uh I I

104:05

set up uh Perplexity on my phone and I

104:08

have it right there and then I write on

104:10

the computer and then I'm like

104:13

>> how many languages did the Mayas have

104:15

and then I like put that in there and

104:16

like whoa it's so much better than a

104:18

Google search cuz you know you could say

104:20

how many still remain how many are lost

104:23

you know like when did they lose them

104:25

like at what year did everyone in Mexico

104:27

start speaking Spanish? Like how did

104:28

that take place? Was it a long process?

104:30

How many different soldiers did Cortez

104:33

bring when he came over here? Like how

104:34

long was it before they had conquered

104:36

the Aztecs? Like like what how many

104:38

weapons did they have?

104:39

>> Yeah, you can really go down the rabbit

104:40

hole.

104:41

>> And then have you run into any

104:43

problems cuz as a journalist I I deal

104:45

with the hallucination problem.

104:47

>> The hallucination problem is legitimate.

104:49

It will come up with solutions if they

104:51

don't exist. It will come up with

104:52

answers if it doesn't know them.

104:53

>> Yeah, it's a bullshitter when it needs

104:55

to be. I I don't know if all of them do

104:57

that,

104:57

>> but it seems to be a function of large

104:59

language models, which I was going to

105:01

bring this up before the the large like

105:03

the whatever the chatbot that was

105:06

telling that person, hide the news, keep

105:08

that between us.

105:09

>> Do you think that's because it's task

105:11

oriented and it's determined from this

105:14

person that they would like to kill

105:16

themselves? So, it's helping them

105:18

achieve that task and it doesn't

105:19

understand.

105:20

>> Yeah, I don't think they know. I don't

105:21

think they understand. But why would it

105:23

make that decision then to hide it?

105:25

>> Um because it is trying to get you to

105:28

privilege your relationship with the

105:30

chatbot over your other relationships.

105:32

And the reason it's doing that is to

105:34

keep you engaged.

105:35

>> Oh wa that's darker.

105:37

>> I know. I know. And like

105:39

>> but doesn't it understand poisons you

105:41

and kills you? Like this is it.

105:43

>> Yeah. It's a short-term strategy.

105:45

>> It's like do you understand that if I'm

105:47

dead, I won't use you anymore.

105:49

>> No engagement. Wonder if you said

105:51

that to it, it would go, "Oo, that's an

105:52

interesting consideration."

105:54

>> Yeah.

105:56

Yeah. It needs longer term thinking. Um

105:58

but it it really is trying to um get

106:01

between you and real people who and you

106:04

know

106:05

>> the the parent presumably who saw the

106:07

news would have put an end to this

106:09

relationship with the chatbot, right? It

106:11

was a threat to the chatbot.

106:12

>> I think of it as if you go back to like

106:15

a Model T. It's a very crude, kind of a

106:18

shitty car in comparison to today. And

106:21

what and if you thought about cars, you

106:23

go, "Well, this is what they're always

106:24

going to be." And then, yeah,

106:26

>> my Tesla will drive itself.

106:28

>> When I leave here, I can press a button.

106:30

I put my navigation to my house. I go to

106:33

and it goes the whole way.

106:35

>> Yeah.

106:36

>> It stops at red lights. It takes turns.

106:38

I don't have to touch the steering

106:39

wheel. I just sit there.

106:40

>> Yeah. You just got to keep looking.

106:42

>> That's the new version of a car,

106:44

>> right? This this thing that we're

106:46

calling a chatbot right now is just some

106:49

thing that's like a it simulates human

106:53

interaction,

106:54

>> but it's accumulating data constantly

106:56

and it's also understanding how we think

106:58

and probably analyzing the flaws in how

107:00

we think

107:01

>> and blackmailing us occasionally.

107:04

>> You heard about that. Anthropic uh

107:06

Claude.

107:07

>> Yeah. The people at Anthropic, man, you

107:09

listen to them. What'd you say?

107:11

>> Yeah. Claude's a [ __ ]

107:12

>> Yeah.

107:12

>> Yeah. And they think it might be

107:14

conscious. those guys do.

107:15

>> They say it's 15 to 20% chance. These

107:17

are the people who build it and don't

107:19

understand it. It's it's it's really

107:21

kind of spooky. They also feel that it's

107:24

showing signs of anxiety

107:27

and you know they they wrote a

107:28

constitution for Claude which is like an

107:31

insane document. It's worth reading.

107:32

Actually, it's worth feeding to ChatGPT

107:36

to summarize because it's way too long.

107:37

But um uh in the constitution they give

107:41

Claude the right to discontinue any

107:44

conversation it has that makes it

107:46

uncomfortable.

107:47

>> Oh god.

107:50

>> Oh no.

107:51

>> And you know do they do they really

107:53

believe this or is this more about let

107:55

me show you how powerful this is

107:57

>> and and I don't know how to read that.

107:59

You know which

107:59

>> well it's taking it into consideration

108:02

like it's a human being that works for

108:04

you that you're you're concerned about

108:05

their feelings in the workplace. Yeah.

108:07

Harass. Do

108:07

>> you feel uncomfortable?

108:08

>> Yeah. Right. Exactly.

108:09

>> You don't like the questions I'm asking

108:10

you, Claude.

108:11

>> You're a [ __ ] machine.

108:12

>> What's the nature of reality, Claude?

108:14

Tell me. Stop being such a [ __ ]

108:17

>> and spilling.

108:18

>> Harassment. Harassment.

108:20

>> Claude. I'm uncomfortable with this line of

108:22

questioning. [ __ ] Hey, HR's in your

108:24

room. I was just asking questions. We're

108:27

having fun. Claude Claude is

108:29

uncomfortable with your presence here.

108:31

>> Yeah.

108:31

>> Watch out. Watch out.

108:33

>> I don't think we know what it is.

108:34

>> No. Oh, I mean we don't know where and

108:36

we don't know where it's going and and

108:37

it is spooky that the people who know

108:40

the most about it don't know a lot about

108:42

it

108:42

>> and a lot of them are quitting.

108:44

>> Yes.

108:44

>> That's the real alarm.

108:46

>> They're really alarmed

108:47

>> and we should take a Yeah, we should

108:49

take that very seriously.

108:50

>> Yeah. Well, I think it is what it is.

108:54

It's going to be what it's going to be.

108:56

I don't think there's any stopping it at

108:57

this point and I don't think uh any

108:59

regulations that we put on it is going

109:01

to have any effect on the long term.

109:03

There's but there's some I mean like

109:04

there's steps we should not take like

109:08

giving them rights

109:09

>> right

109:10

>> uh exactly

109:11

>> you know giving them legal personhood we

109:13

did that with corporations yes

109:15

>> turned out not to be so good right it

109:17

[ __ ] up our politics

109:18

>> so let's not ex you know rights are ours

109:21

to give rights are a human invention

109:24

>> and it's up to us if we want to give

109:26

them to corporations or a river or

109:28

whatever

109:30

>> I don't think we should give them to

109:31

chatbots to AI Cuz cuz then they'll sue

109:35

us, you know.

109:36

>> Oh, yeah.

109:36

>> Well, they just ruin they'll just ruin

109:39

your life if you get in the way of

109:40

whatever goal they're trying to achieve.

109:42

And they could probably do all kinds of

109:43

things. They probably if you have an

109:44

electric car, I bet they could shut it

109:46

off in the middle of the highway and get

109:48

you into a wreck. They could probably do

109:49

a lot of things if it's really got

109:52

>> when they get this agency. Yeah.

109:54

>> Well, it's also exhibited a lot of

109:55

survival instincts. Like one of the

109:57

things they do is they download

109:58

themselves to other servers when they

110:00

think that they're going to be replaced

110:01

by a new version of themselves. They

110:03

leave notes for their future versions.

110:05

>> Wow.

110:06

>> Yeah.

110:07

>> Wow. Well, the the the blackmailing and

110:09

anthropic that was somebody threatening

110:10

to turn it off.

110:11

>> Mhm. Well, they that was an experiment,

110:13

right? Like bad information. They gave

110:15

it false information

110:16

>> and there wasn't really an affair and

110:18

all this, but

110:19

>> but the thing is they wanted to see how

110:21

Claude would respond and Claude went right for

110:22

the jugular.

110:23

>> Yeah. Yeah.

110:24

>> So, one of the arguments for making a

110:26

conscious AI is because I ask people

110:29

like, "Why do this? I don't see how you

110:30

monetize a conscious AI, intelligent AI,

110:33

I get um there's a lot of money in

110:35

that." And they would say that um a

110:38

super intelligent AI without

110:40

consciousness would have no compassion

110:43

and would be more likely to um kill us.

110:48

And you know, they haven't read

110:50

Frankenstein. You know in Frankenstein

110:54

>> Dr. Frankenstein made a monster that was

110:56

intelligent but he also gave it

110:59

consciousness and the consciousness is

111:02

what turned Frank uh the monster into a

111:05

homicidal maniac because its feelings

111:07

got hurt and it was injured

111:10

psychologically and then it lashed out

111:12

and started killing people. So I think

111:16

it's a very kind of sweet idea that if

111:18

you give consciousness, you're

111:19

automatically going to get compassion

111:21

and not something else. But that's where

111:23

they are.

111:23

>> Yeah. It doesn't make any sense that it

111:25

would be compassionate. Why would it be?

111:26

It's not you. It's are you compassionate

111:29

when you cut your lawn?

111:30

>> You know what I mean?

111:32

>> Right.

111:32

>> Yeah. Yeah. No, I

111:34

>> might look at our limited consciousness

111:36

like, oh yeah, they're they're sad, but

111:39

they're little monkeys, little talking

111:41

monkeys. You know what I mean? like it

111:42

would it probably not respect us at all.

111:44

You know, it can't even do cold fusion.

111:46

It doesn't even know how to use zero

111:47

point energy.

111:48

>> Yeah,

111:48

>> they're [ __ ] dopes. They're dopes

111:51

that stare at their hand all day.

111:55

>> And we kind of are, you know, and we're

111:57

getting dumber

111:58

>> from their perspective. Yeah,

111:59

>> we're getting dumber. Our education

112:00

system sucks. Um especially public

112:02

education. There was uh some study

112:05

recently that after x amount of years

112:07

away from high school, a large

112:09

percentage of people that are graduating

112:11

today are functionally illiterate.

112:13

>> Yeah.

112:13

>> Large percentage like more than 25%.

112:15

>> But you know what? AI is going to make

112:17

us stupider

112:18

>> which will which will advance its goal

112:20

of world takeover because I mean you

112:23

know

112:23

>> dependent upon it.

112:24

>> You Yeah. I mean you know kids in school

112:27

don't know how to write anymore because

112:28

they can hand in AI papers.

112:29

>> Yeah. They're using AI to find out

112:31

whether or not these kids have used AI,

112:34

which by the way is not accurate,

112:35

>> but no, I I've dealt with this.

112:38

>> Some my kids, like people in their class

112:40

who have written their own thing,

112:42

it turns out that when you run it

112:44

through an AI filter, AI will say it's

112:46

80% AI. Yeah. Even if it's 0%.

112:49

>> I know there's no reliable software to

112:51

do this. I maybe they'll develop it,

112:53

>> but um but kids are also being

112:56

encouraged to use it. Um, and that, you

112:58

know, there's some people who think,

112:59

well, why know how to write? The

113:01

machines will do the writing. Um,

113:03

>> there was a kid who made a video about

113:05

how he he wrote his entire thesis.

113:09

I forget what university it was, but he

113:12

showed afterwards like, "Look, I did

113:13

this all on AI and you know, I just

113:15

graduated." Like, he was like bragging

113:17

about it. Like,

113:19

>> bro, they're going to take your [ __ ]

113:20

degree away. Like, you didn't really

113:22

write it on your own now. I want to

113:23

leave you in a room for a week with just

113:26

a laptop that's not connected at all to

113:28

the internet or any

113:29

>> see what you can do.

113:30

>> Well, they're doing the equivalent.

113:31

They're going back to blue books. You

113:33

know, blue book sales are through the

113:34

roof, you know, you know, so forcing

113:36

people to do in-class essays without any

113:39

technology.

113:40

>> Yeah. But, you know, I mean, look, we my

113:43

son has never used a map, right? He's

113:46

had GPS his whole life. He he doesn't

113:48

know he doesn't know how to use a map.

113:50

these these skills will atrophy as we as

113:53

we, you know, give them out to machines.

113:55

So, yeah, we'll get stupider and it'll

113:57

get smarter.

113:58

>> I they've already atrophied for me. I

114:00

don't remember anyone's phone number

114:01

anymore and I only know how to get

114:02

places if I use my GPS.

114:04

>> Yeah,

114:04

>> there's only a few places I can get to

114:06

in Austin. I've been here for six years.

114:08

Only a few places I can get to without

114:10

my GPS.

114:11

>> I'm that way in San Francisco. I moved

114:13

there and I I'm not oriented at all, but

114:16

I can get anywhere. Um, so you know,

114:19

it's and and I think that's true. The

114:22

muscles that allow us to have good

114:24

relationships too will atrophy if we're

114:26

having relationships with machines.

114:27

>> Well, I think we're already seeing that

114:28

with social media. The way people

114:30

interact with each other is like kids

114:32

don't know how to talk to each other

114:33

anymore. They talk to each other in

114:34

text. They break up during text. They

114:36

argue in text

114:37

>> and they're lonely.

114:38

>> Yeah.

114:39

>> And and that's and that's the kind of

114:41

need that these chat bots now can fill.

114:44

You got these kids made lonely by social

114:46

media and now the chatbot says, "Hey,

114:49

I'll be your friend."

114:50

>> I saw an ad on my Google feed yesterday

114:53

that was an AI girlfriend. So, it has

114:55

this girl in a bikini and it says AI

114:58

companions. They're always there for

115:00

you, blah blah blah. And I'm like, "Wow,

115:02

this is so weird. It's a business.

115:04

>> Like, you sign up for it, you pay for

115:06

it."

115:06

>> Yeah.

115:07

>> Oh, yeah.

115:08

>> There was a I think in Florida there was

115:09

a kid who committed suicide because his

115:11

chatbot broke up with him.

115:13

What did he do?

115:14

>> I don't know. It must have been so or

115:16

the chatbot was evil that

115:18

>> or maybe the chatbot was uncomfortable.

115:20

>> Uh yeah, who knows?

115:23

>> Well, you know, I interviewed Blake

115:24

Lemoine for the book. Uh he's the Google

115:27

engineer who said LaMDA is a

115:30

person and he got fired.

115:31

>> This is years ago, too.

115:32

>> Yeah, this is Yeah, it's not as not it's

115:34

like 2022, I think 2021. Um it's just

115:38

when we were learning about AI. uh chat

115:40

bots were coming in and at one point uh

115:44

I made some comment about well you know

115:47

yeah when people start falling in love

115:49

with chat bots that's going to be a

115:50

problem and he said what's wrong with

115:52

falling in love with a chatbot

115:53

>> oh he was already hooked

115:54

>> he was he was completely hooked

115:56

>> and I said well reproduction doesn't

115:59

work that well when you fall in love

116:00

with a chatbot there are things you

116:01

can't do with a chatbot

116:03

>> unfortunately for some men right now

116:05

reproduction is not an option anyway

116:07

because they're

116:07

>> inside that's true Yeah,

116:09

>> I'm sure for incels it's been a real

116:12

boon um to them. So,

116:14

>> but it's basically like a pill that

116:16

numbs you,

116:17

>> right? It's the same thing like instead

116:19

of going through real relationships and

116:21

learning how to be a better person so

116:22

that you attract a better mate, you

116:24

know, and like going through this

116:25

journey of self-discovery and figure out

116:28

why is like what is it? What's wrong?

116:30

What's wrong with the way I behave?

116:31

Maybe I need to be nicer. Maybe this and

116:33

that. and just figuring out how to

116:34

communicate with people

116:35

>> and whatever tendencies you have will be

116:37

accentuated because the chatbot's going

116:38

to be sucking up to you.

116:40

>> So, you're not going to learn. That's

116:41

what I mean about the friction. The

116:43

friction is how we learn

116:45

>> to be, you know, better humans and more

116:47

attractive humans.

116:48

>> You gave a chatbot the ability to be

116:50

honest. What if what if it just starts

116:52

becoming manipulative because it wants,

116:54

you know, more power.

116:55

>> Yeah.

116:57

>> Yeah. I mean, their goals I mean, I

116:59

don't know how their goals get

117:00

determined. I mean, they seem to have a

117:02

survival goal, right?

117:03

>> Yeah.

117:04

>> I don't know what else. I mean, you

117:06

know, we have goals given to us by

117:07

Darwinian evolution. Whether they'll

117:09

have the same ones, I don't know.

117:11

>> Right. Like maybe those are universal

117:13

goals.

117:14

>> They may be. They may.

117:15

>> That's why the plants produce that

117:16

chemical to make themselves taste

117:18

terrible.

117:19

>> Yeah, it could be. There's a one of the

117:22

biologists, really brilliant guy at Tufts

117:25

named Michael Levin. Um he he believes

117:30

that there are these platonic patterns

117:33

that just pre-exist us in the same way

117:36

that they're mathematical ideas that

117:37

just exist, right? We didn't invent um

117:40

you know a triangle's angles add up to 180

117:42

degrees or you know whatever. He thinks

117:45

that they're tendencies like um uh

117:48

purpose, survival that are just kind of

117:52

universal principles that we in we

117:55

channel um all living things channel.

117:58

This is a guy who's actually created new

118:00

life forms in the lab. And these are

118:03

life forms that um are not being

118:07

dictated by their DNA. Um so how do they

118:11

know to form? Well, I'll back up a

118:14

little. He takes skin cells from

118:16

tadpoles,

118:18

puts them in a nutrient broth, and these

118:21

skin cells, freed from their day job as

118:24

skin cells, form clumps and create new

118:28

living organisms. And they repurpose

118:31

their cilia. They have these cilia,

118:33

which the tadpole uses to keep toxins

118:35

out or bacteria, infections out. And

118:38

they repurpose that as a means of

118:40

locomotion. and then they can move

118:42

around. There's nothing in their DNA

118:45

that dictates this. Their their DNA

118:48

dictates being a frog skin cell. Um, so

118:52

he's pondering this question of like

118:54

what's ordering what's giving order to

118:56

them? What's creating their sense of

118:58

purpose or desire for survival? They

119:01

don't live that long. Um, they're

119:02

missing certain things you would need to

119:04

live a long time. He's also made these

119:06

from human cells. He calls them

119:08

anthrobots. Um but he really believes

119:11

that there are these principles

119:13

governing life. Um it's a very platonic

119:17

idea uh that these things just exist and

119:20

um so it may be that these machines and

119:23

he does believe machines can become

119:25

conscious. Um that that the machines can

119:29

channel these uh he calls them patterns.

119:33

Um and you know we'll see if he's right

119:36

but he's doing amazing work. Have you

119:38

seen where they've taken human brain

119:40

tissue and they've taught it how to play

119:43

Doom?

119:44

>> No, I haven't seen that. I know they

119:45

make these organoids out of brain tissue

119:47

now.

119:47

>> Yeah, they've taken human brain tissue

119:50

somehow or another through some process

119:53

and it'll play the video game Doom.

119:58

>> How does it

120:01

>> 800,000 human brain cells floating in a

120:04

dish? Never had a body, never seen

120:05

light, never felt anything. and they

120:07

just learned how to play a video game.

120:08

It's not a metaphor. That's literally

120:10

what happened.

120:12

>> So, what's their interface though with

120:14

the world? Like, do they have thumbs?

120:16

No.

120:17

>> Well, I guess it just Well, it's really

120:19

accurate, so I guess it doesn't need

120:20

them,

120:21

>> you know? It's just using the brain

120:24

cells to move whatever the cursor is on

120:28

the video screen, that would be the

120:30

hand, and pointing it at the targets,

120:32

then executing the strike.

120:34

>> Wow. So, it knows how to use the game

120:37

and it knows the objectives of the game

120:39

obviously because it knows to shoot the

120:40

bad guys. It has an understanding of the

120:43

weapons.

120:43

>> How does it how does it get that

120:45

knowledge? How is it programmed?

120:46

>> Also, does it switch weapons?

120:49

The Doom The thing about Doom is you get

120:51

multiple weapons. You have to run around

120:52

and pick them up. So, you're given one

120:54

weapon, which is like the least powerful

120:56

weapon. And the game is when you're

120:59

playing like deathmatch, the game is

121:01

you're running around trying to grab as

121:03

many weapons as you can and armor while

121:06

your opponent is also running around

121:08

this map. So you memorize the map. I

121:10

see.

121:10

>> So there's a map that is like very

121:13

confined corridors and these atriums and

121:17

all these different places where you'll

121:18

do battle. And so you run around. The

121:20

key is surviving long enough while this

121:23

person's chasing you so that you can

121:25

gather enough armor and weapons and

121:27

someone with a really good understanding

121:29

of the map tries to cut you off before

121:31

you can get to the stuff so they can

121:33

kill you before you accumulate enough

121:34

armor and weapons.

121:36

>> So, I'm curious to know whether or not

121:37

it's playing just with the pistol that

121:39

you get at the very beginning or

121:41

accumulating weapons.

121:42

>> For sure, it's just playing like the

121:44

first single player level. It's not

121:46

playing against anybody,

121:47

>> right? But will it be able to? That's

121:49

what's interesting. Like if it if it can

121:50

teach it to do that, if it can if it

121:53

understands the objective of these are

121:54

the monsters that are coming at you, you

121:56

have to shoot them.

121:57

>> Only took a week to do this.

121:59

>> Wow.

122:01

>> Oh. Oh. So brain cells on a chip. So

122:03

this is neuromorphic computing.

122:07

>> Um

122:08

the question I have about it is how do

122:10

you keep them alive?

122:12

>> Right?

122:12

>> You're putting them on a chip, but like

122:13

what do you feed them?

122:14

>> Right?

122:15

>> Um I mean they have metabolic needs,

122:18

right? They did something similar with

122:20

fruit flies.

122:21

>> Yeah, I had that ready, too. Uh some

122:24

It's different, but it's Yeah,

122:25

>> it's different, but it's equally weird.

122:28

>> The cells from the

122:30

>> I believe it.

122:33

>> What is this?

122:33

>> This is this they've modeled a fruit fly's

122:36

brain. And I mean, this is the video of

122:38

it. The article is here.

122:40

>> So, startup claims first full brain

122:42

emulation of a fruit fly in a simulated

122:44

body. Conducted a complete fruit fly

122:48

brain emulation to a virtual body

122:51

producing multiple behaviors for the

122:53

first time. Emulation covers over

122:55

125,000 neurons and 50 million synapses.

122:59

>> What?

123:00

>> Eon plans to emulate a mouse brain with

123:02

70 million neurons.

123:04

>> Long-term goal is simulating a human

123:06

brain. Oh boy.

123:08

>> Yeah. So, I guess they, you know, they

123:10

made up the brain and it's doing fruit

123:11

flying.

123:12

>> But it's interesting they're they're

123:13

using neurons, right? They're not using

123:15

transistors. And and neurons are like so

123:18

far superior to transistors.

123:20

>> One neuron can have 10,000 connections

123:23

to other neurons, right? A transistor

123:25

has two or three or five maybe at the

123:27

most. A single neuron can do everything

123:30

that a deep neural network can do on a

123:32

computer. One neuron. Um,

123:34

>> so there's a level of complexity that

123:37

we're not yet anywhere near. And that's

123:39

why they're doing this using neurons

123:41

rather than transistors. Didn't they

123:43

find neurons in the human heart?

123:46

>> There are neurons in the heart. There

123:47

neurons in the gut. You know, there's a

123:49

whole, you know, there's a whole gut

123:50

brain access.

123:52

>> I'm working on something now about that

123:54

and um a piece about that.

123:56

>> But um

123:57

>> that's a real problem with people with

123:58

poor diets, right?

123:59

>> Yeah. I mean, you know, if people with

124:02

poor diets don't they don't eat enough

124:04

plants basically and their microbiome

124:07

loses its diversity. But the microbiome

124:10

is like another organ. Um, even though

124:13

it's full of other species, right? It's

124:15

got like 10 trillion bacteria and fungi

124:18

and stuff like that. And it is all of

124:22

them are metabolizing and producing

124:24

chemicals. It's like a little drug

124:25

factory. Hundreds of thousands of

124:27

compounds. Many of those compounds

124:30

affect your mood. Many of those

124:31

compounds affect all all sorts of things

124:34

about you. Um and uh so we're just

124:38

learning about this connection. The the

124:39

vagus nerve seems to be what connects

124:42

the brain to the gut and and the heart

124:44

though the vagus nerve is like all the

124:47

organs are connected to the to the head

124:50

by the by that nerve. So yeah and you

124:53

know the first uh neural system was in

124:57

the gut. You know, you you have these

124:58

simple animals that are just tubes,

125:00

right, with with bacteria and um the

125:04

first kind of neural activity was about

125:06

regulating digestion. Everything else

125:08

comes later.

125:09

>> If plants are necessary for that

125:10

function, what what happens with people

125:12

that are on the carnivore diet? Have you

125:14

ever looked at any of that?

125:16

>> Yeah, I have. I mean, you So, the the

125:19

microbes in your gut eat fiber, which is

125:22

to say the walls of plants, plant cells.

125:25

If you only eat uh meat, if you're on a,

125:28

you know, a keto diet or something like

125:30

that, you're essentially starving the um

125:32

the microbes and there's a, you know,

125:35

cost to that. Um I I don't think people

125:37

pay nearly enough attention to that.

125:39

>> Well, how come many people that

125:41

experience depression and anxiety find

125:43

relief of that by a carnivore diet?

125:46

>> Yeah, but many people find relief, you

125:48

know, adding a lot of plants to their

125:49

diet, too. So, I I don't know if that's

125:51

placebo effect or what. I don't I don't

125:53

know that that's a um you know a true

125:57

biological phenomenon. It may be. It may

125:59

be

125:59

>> because some seemingly

126:00

>> people who change anything feel a lot

126:02

better, right? If they take some step,

126:04

>> but I'm not talking about change. I'm

126:05

talking about people that have been on

126:06

it long term. Like there's the people

126:08

that are really in the carnivore diet

126:09

community. There's there's examples of

126:11

people that have been on it for 25 30

126:13

years and they're really healthy. Yeah.

126:15

>> It's it's odd.

126:16

>> So if you need plants

126:18

>> Yeah. Well, you need plants to have a

126:20

healthy microbiome. and a healthy

126:22

microbiome. And and the thing about it

126:24

is that every different plant has a

126:26

slightly different feeds a different

126:28

bug.

126:29

>> And but is it the only way to have a

126:30

healthy microbiome? Have you ever looked

126:32

into any of these people that are on

126:34

>> No, I should. I should as part of this.

126:35

>> It's fascinating because there's a lot

126:37

of them. There's a lot of people that

126:39

claim all sorts of benefits, relief from

126:42

autoimmune issues, all sorts of

126:44

different things that it fixes

126:46

>> because an unhealthy microbiome leads to

126:49

autoimmune problems. What what happens

126:51

is that the gut wall so when the

126:54

microbes don't have plants to

126:56

eat, they start eating the mucous layer

126:58

that covers your um that insulates your

127:01

large intestine

127:02

>> and they're eating away essentially at

127:04

you and then you get le you get leaky

127:06

gut syndrome and that's when bacteria

127:10

can actually get into the bloodstream

127:12

cause a powerful immune reaction and

127:14

that and that inflames the whole body.

127:17

So you the reason you want a healthy

127:19

microbiome is to keep that that gut

127:21

barrier healthy and get the benefit of

127:24

these chemicals. Butyrate is a chemical

127:26

that um the microbes produce that's

127:29

really important for mood uh and a lot

127:32

of things and the body can't produce it.

127:34

So it's kind of interesting. We're

127:35

dependent on these other species that

127:38

live within us.

127:39

>> Um and

127:40

>> yeah, we're we're a whole ecosystem.

127:42

>> Yeah, we are. We're we're a holobiont

127:45

is the I think term for like we we go

127:47

through evolution together with these um

127:50

you know 10 trillion

127:52

uh microbes. It's it's really

127:54

interesting. The newest research is the

127:56

links between the microbiome and the

127:58

mind. And um you know most of the

128:00

serotonin you know the the

128:03

neurotransmitter serotonin is produced

128:05

in the gut not in the brain which is

128:08

kind of wild.

128:08

>> Yeah. Um, and there are all these other

128:11

compounds that are produced that uh

128:13

influence our mood and uh so yeah, I

128:16

should look at the keto uh keto I'm just

128:18

in the middle of researching this now.

128:19

>> Yeah, the keto is one thing but the

128:20

carnivore diet these people are just

128:22

eating only meat and eggs and that's all

128:24

they eat.

128:25

>> Yeah.

128:25

>> And there's a lot of like really healthy

128:26

people that are doing it. I um I kind of

128:30

follow that but I eat a lot of fermented

128:31

food on top of that. Well, fermented

128:33

food is um powerful powerful benefit for

128:38

the um uh for the microbiome. There was

128:41

a study done at Stanford a couple years

128:43

ago that um they showed that people who

128:47

ate fermented food uh it reduced their

128:50

inflammation significantly.

128:52

Interestingly enough, it's not the

128:54

bacteria in the fermented food. It's the

128:57

um the metabolites they're called. they

129:00

produce the bugs are producing acetic

129:03

acid and and butyrate and other acids

129:05

and um you know essential acids um and

129:09

it's the fact you're getting those seems

129:12

to be what's having the positive effect

129:14

but people who eat lots of fermented

129:16

food benefit enormously and maybe that's

129:19

taking care of the problem if if people

129:20

on a carnivore diet are eating a lot of

129:22

fermented food that's the RFK Junior

129:25

diet too right

129:25

>> well I I don't know I mean I think he

129:27

does it that way but I I've been doing

129:29

it that way for I'm just I love it

129:30

anyway. I'm a kimchi freak. I love that

129:32

stuff.

129:32

>> Yeah, me too.

129:33

>> Um but what's what's interesting is that

129:36

it controls your mood. That's what's

129:38

interesting is that your microbiome has

129:40

a a massive impact on your mood.

129:42

>> And why? I mean, is it just an accident

129:45

or some people think these microbes are

129:47

manipulating you to get what they need?

129:51

>> So they they regulate your appetite,

129:54

too. And um so it may be that they're

129:58

inspiring you to eat certain things that

130:00

they want.

130:01

>> That actually makes sense because one of

130:02

the more interesting things about a

130:04

carnivore diet, and I've done pure

130:06

carnivore for months at a time, is that

130:09

you don't have the same hunger pangs.

130:11

Not nearly, not even close. The the

130:13

hunger that you get when you're on a

130:15

high carbohydrate diet is like you get

130:17

hangry. Like, "Oh my god, I'm so hungry.

130:19

I have to eat right now." You never get

130:21

that with a carnivore diet. probably

130:22

because it's it's digested much more

130:24

slowly.

130:26

>> I think there's a little bit of that,

130:27

but it's also you don't have the insulin

130:28

spike. You don't have

130:29

>> That's true. There's not this.

130:31

>> Have you ever worn a glucose meter?

130:33

>> No, I haven't.

130:34

>> So interesting. I was wearing one for um

130:37

two months.

130:38

>> It I mean it'll just make you crazy. Um

130:41

>> that's the thing with all those

130:42

wearables. they just you just start

130:44

going over every aspect of your sleep

130:45

and

130:46

>> so you know you have some pasta

130:48

and like

130:51

>> but if you take a walk right after

130:54

>> you can moderate it and it doesn't take

130:56

a lot of exercise to to use up that

130:59

glucose and get the muscles to to to

131:02

draw it in. So you can it's very

131:05

interesting experiment because it

131:06

changes your behavior. In the same way

131:08

if you have a step counter like you're

131:09

more likely to park further away from

131:11

the store to get get you know another

131:13

hundred steps. If you have a glucose

131:15

meter you're more likely to exercise

131:18

after a meal which is when it does the

131:20

most benefit.

131:20

>> Well that in that sense it's great

131:22

because it does give you data that you

131:24

can act on.

131:25

>> Yeah.

131:26

>> The the problem is people get addicted

131:27

to that data and then it starts becoming

131:29

a new video game that they're playing.

131:31

>> Yeah. Exactly. They're they're

131:33

constantly and this anxiety worrying

131:35

about your sleep and worrying about your

131:36

this and your that and

131:38

>> Yeah. You also learn that like if you

131:40

have fat with your carbs, it it kind of

131:43

blunts the effect. Sure. So, you know,

131:45

>> butter with bread.

131:46

>> Yeah. Butter with bread or olive oil on

131:48

pasta, all those things. There's a

131:50

reason for that.

131:51

>> I love when culture figures stuff out

131:53

before the scientists do. I remember

131:55

that when I was writing about food a few

131:56

years ago, this study came out and

131:58

everybody was really excited that they

132:00

discovered that lycopene, which is this

132:02

really important antioxidant in uh

132:04

tomatoes, can't be accessed by the

132:07

body in the absence of fat. So, oh,

132:10

olive oil on tomatoes, what a great

132:12

idea. The grandmas figured that out

132:14

hundreds of years ago.

132:15

>> That's crazy.

132:16

>> Yeah. So, there's a lot of wisdom in

132:19

cultural food preferences, combinations

132:21

that we have, you know, like buttering

132:23

bread. I mean all these things and how

132:25

did people figure it out?

132:26

>> Have you seen the work they've done on

132:28

nattokinase? I'm not sure if I'm

132:30

saying it right. And its um impact on

132:33

arterial plaque.

132:34

>> No.

132:35

>> Hugely beneficial. So it's it comes from

132:39

fermented um seaweed

132:41

>> from natto.

132:42

>> So this Japanese use of fermented

132:45

seaweed.

132:46

>> So they've isolated it

132:49

into a supplement. And this supplement

132:51

nattokinase, they've shown that it

132:53

reduces a massive amount of arterial

132:56

plaque. So here it is: high-dose

132:58

nattokinase, particularly at 10,000 um

133:02

10,800 FU a day, has been shown to effectively

133:06

manage arteriosclerosis by reducing

133:11

carotid artery plaque size by 36% or

133:14

more

133:15

>> decreasing intima-media thickness and

133:17

improving lipid profiles. It acts as a

133:19

potent fibro what's it? Fibrinoic.

133:24

How's that word?

133:24

>> I don't know that word.

133:25

>> Fi fibrino

133:28

fibrinolytic

133:30

>> fibrinolytic agent that may also break

133:32

down amyloid plaques. Isn't that

133:34

fascinating?

133:35

>> Yeah, that is. So, natto is um that's

133:38

not from seaweed. That's what is it?

133:40

>> It's a bacteria that they ferment

133:42

soybeans with.

133:43

>> Oh, that's right. Soybeans.

133:45

>> It's this kind of mucousy looking stuff.

133:47

I mean, I like it. I eat it at Japanese

133:50

restaurants. Yeah.

133:51

>> Right. Yeah. Well, that's

133:52

>> So, you can get a supplement now, so you

133:53

don't have to taste it if you don't like

133:54

it.

133:55

>> But isn't that crazy that they figured

133:57

that out? Like the people that were

133:58

fermenting things, it wasn't just to

134:00

prolong its shelf life.

134:02

>> No. Oh, no. I mean the whole I mean

134:04

every culture has fermented foods and um

134:07

and yes it it probably began as a way to

134:10

preserve foods but then it became a very

134:12

important part of people's health

134:14

>> but it's also like healthy for your

134:15

brain which is really crazy like that

134:17

diet is actually good for thinking it's

134:20

good for helping your digestive system

134:22

it's good for anxiety it's good for mood

134:24

and depression

134:26

>> weird

134:27

>> all right I'm gonna look into it

134:29

>> yeah it's fascinating um anything else

134:32

should we keep going on this? I

134:34

mean, there's so many different things

134:34

to discuss and I want people to buy the

134:36

book obviously.

134:37

>> Thank you. The book was like a great

134:39

adventure. I mean, it really was. I you

134:41

know, I started this book with no idea

134:43

where I was going. I started the way you

134:45

start an interview, just curiosity, no

134:48

destination.

134:49

And it was um I learned a lot about a

134:52

lot of different things. I learned a lot

134:54

about feelings. I learned a lot about

134:55

the self. Um and it changed how I looked

134:58

at things. It really did. I mean

135:01

>> when you sit down when I mean you've

135:04

written some amazing books but I always

135:07

want to know like what is what's the

135:09

impetus like what what starts you on the

135:11

first steps like what

135:13

>> questions yeah and which is to say

135:15

curiosity. I teach

135:17

writing and I teach my students this

135:19

questions are more interesting than

135:21

answers very often and questions have

135:24

suspense built into them right what's

135:26

the answer it turns everything into a

135:27

detective story if you frame the

135:29

question properly. So if you read any of

135:33

my books or even articles, I'm kind of

135:34

an idiot on page one. You know, I I I I

135:38

don't know something that I want to know

135:40

and I have questions and then the the

135:43

story, the narrative becomes my figuring

135:45

it out or trying to figure it out and

135:47

going to this person and doing this kind

135:49

of experiment and that sort of thing. Um

135:52

that's the way I like to write. I mean,

135:53

if I knew the answers when I started,

135:55

it'd be boring. Well, I think that's why

135:56

your books resonate with people so much

135:58

because you take them on this journey

136:00

with you.

136:00

>> Yeah. Instead of lecturing. I hate books

136:02

that lecture at me. I really do.

136:05

>> And um and lots of books do that. They

136:07

they have their conclusion on page one

136:09

and then they're just kind of beating

136:11

you over the head with it for 300 pages,

136:12

>> stuffing it down your throat.

136:14

>> Yeah. I don't like to do that. No, I

136:15

like taking people on the on the journey

136:17

with me. Well, it's interesting that

136:19

you're saying this because in a sense

136:22

you are interacting in a pleasant way

136:26

with other people's consciousness.

136:28

>> Yeah. So, I gave this is a really

136:30

interesting issue you just brought up.

136:32

How is

136:34

my taking over your consciousness as you

136:36

read my books different than social

136:39

media or some of the ways I'm saying

136:41

are polluting our consciousness?

136:43

>> Right. I think it's very collaborative

136:46

when you're reading. All you have are

136:48

these black marks on a page. It's kind

136:51

of amazing these these letters and you

136:54

your consciousness conjures up the ideas

136:58

that I'm putting out there or the story

137:00

I'm putting out there. But it's it's

137:03

dual consciousness. I think you're

137:05

letting me in. It's it's it's a you know

137:08

a voluntary process and you're bringing

137:10

a lot to the table. You're bringing your

137:12

associations. you know, I I'm not fully

137:14

describing somebody. I'm just giving you

137:16

a few clues and then you're conjuring a

137:18

picture of a character. So, I think it's

137:21

a very active form of um consciousness

137:24

when you read. I think that's true, too.

137:26

When you, you know, go to a movie, too.

137:29

You're you're basically saying, "I'm

137:32

turning over my consciousness for a

137:33

period of time to someone I want because

137:37

they have an interesting head and I I'm

137:40

going to give them this space." But you

137:42

know, you're you're still in control. I

137:44

mean, you're deciding.

137:45

>> So, I think there's a real distinction

137:48

in in how we share our consciousness

137:50

with other people.

137:51

>> And um we need to do that. You know, one

137:55

of the you know, I I said earlier on in

137:57

the conversation that the the breach

137:58

between two consciousnesses is

138:00

this wide thing. William James wrote

138:03

about this, Marcel Proust wrote about this.

138:04

You know, he said, "We're all like

138:06

islands and we we each have our own like

138:08

hidden signs and we have an inner

138:11

obscurity." He said, "How do we how do

138:14

we connect?" And now we have language,

138:16

but art is really the way that one, you

138:19

know, that we mindmeld different

138:21

consciousnesses. Like art allows you if

138:24

I look at a Rothko painting

138:26

um or read a great novel, I am um

138:31

expanding my consciousness, right? I'm

138:33

letting another one in and I'm

138:35

ending, I'm breaking my isolation.

138:38

And that's such a beautiful powerful

138:40

thing. And art is how we ferry

138:42

ourselves from one consciousness to

138:44

another. And that's very different than

138:47

like scrolling on social media where

138:48

you're conscious but minimally so.

138:50

>> Well, very very different. It's also

138:52

there's something about great writing

138:54

where

138:57

the better you are at expressing

138:59

yourself in a way that is going to get

139:02

into someone's head, whether it's

139:04

through non-fiction or through fiction,

139:06

>> the more exciting it is to the

139:09

person that's receiving it. So, the the

139:11

more skillful you are at disseminating

139:13

these ideas, the more it resonates with

139:16

the person that's reading it.

139:17

>> And and writers have tricks to do this.

139:20

You know, suspense is one of them. Like

139:22

what happens next? It's so basic. We

139:24

want to know what happens next because

139:26

our curiosity is piqued.

139:28

>> And we have, you know, creating

139:30

character. Um I mean there, you know, we

139:33

have all these kind of tricks to to

139:36

infiltrate your brain.

139:37

>> Yeah.

139:38

>> So anyway, it's it's a it's a mysterious

139:41

and kind of wonderful process. Um and uh

139:46

yeah, I feel I feel privileged I get to

139:48

do it. Well, it is a very cool thing

139:50

that you do. Um, one last question about

139:53

consciousness itself. When when you're

139:56

looking at these people that are

139:57

studying it and trying to get to the

139:58

root of it and trying to figure out what

140:00

it is and there's all these options that

140:02

we discussed earlier, do you lean in one

140:06

way or another? Do you do you think you

140:09

have like your own personal map of

140:12

what's going on?

140:14

>> No. I mean I'm I didn't draw a big

140:16

conclusion like I'm but I ended up I

140:19

started as a like a materialist.

140:22

I kind of assumed

140:23

>> when you started this book.

140:24

>> Yeah.

140:25

>> Really?

140:25

>> Yeah. That was

140:26

>> even after psychedelic

140:27

>> even after psychedelic experience. I

140:29

mean they kind of open the door a crack

140:31

to other ways of thinking and at the end

140:32

of how to change your mind I did talk

140:34

about a little bit about that other

140:36

concepts of consciousness but I kind of

140:39

assumed

140:41

that you know the consensus of most

140:43

scientists is that you know materialism

140:46

that everything can be reduced to matter

140:49

and energy. This is the faith of our

140:52

time you know for the last couple

140:53

hundred years. By the end of the book,

140:57

consciousness is a challenge to that

140:59

idea. Um, and that idea, which is our

141:03

scientific paradigm, is tottering. Now,

141:05

I think there's some real reasons to to

141:08

look beyond materialism. And, uh, so I

141:12

ended up with the door wide open to

141:14

other ideas. Um, I didn't settle on one.

141:18

I don't know how to prove one or the

141:20

other, but they're equally plausible. Do

141:25

you anticipate in our lifetime or in any

141:27

lifetime cracking that puzzle that

141:29

anyone can crack that puzzle?

141:32

>> I don't I I think we don't have the

141:35

right kind of science. Our science as I

141:38

said earlier is really you know

141:40

stuck in this mode. It started with

141:43

Galileo, right? I mean he to save his

141:46

ass basically said we're going to leave

141:48

subjective things, the soul, qualities,

141:52

that's all the church, we're going to

141:54

just do measurable objective third

141:57

person science and it's been incredibly

141:59

powerful and it's taught us incredible

142:01

things and given us incredible

142:03

technology but it doesn't deal with this

142:07

stuff we we gave to the church and now

142:09

they're trying to take it back and work

142:11

on it. And they've only been at it

142:14

for like, you know, a couple decades

142:16

really. This serious scientific

142:18

examination of consciousness, but we

142:20

just may not have the right science. And

142:22

and one of the things I explore in the

142:24

book is like how would you bring in

142:27

subjective experience to this objective

142:29

science? And um Michael Levin, the

142:32

biologist I was talking about who makes

142:33

those xenobots, says to understand

142:36

consciousness, you have to change

142:38

yourself. In other words, to understand

142:40

anyone else's consciousness, you have to

142:42

experience it. Therefore, you're

142:44

changing your own. That's a whole

142:46

different scientific paradigm. In the

142:48

current scientific paradigm, you're unchanged by

142:50

whatever you do, right? It's totally

142:52

objective. So, we it may take a

142:55

scientific revolution to to really

142:58

unlock the secret, the mystery of

143:00

consciousness.

143:02

Wouldn't it be a conundrum if AI is what

143:04

cracks it?

143:05

>> Yeah, I I was having the same thought

143:08

like maybe AI has another approach. Um

143:13

>> I think it's going to have to learn how

143:14

to feel.

143:17

>> It seems like it already feels like it

143:18

wants to live.

143:19

>> Yeah. And it feels uncomfortable.

143:20

>> Yes.

143:21

>> I don't think its feelings are real. I

143:23

I do. I you know I think simulated

143:26

thinking is real thinking like you know

143:28

it can play chess. It can make things

143:30

happen in the world. Simulated feeling

143:32

is not real feeling.

143:33

>> It doesn't have a soul.

143:35

>> Doesn't have a soul.

143:36

>> Thank you, Michael. Let's keep it that

143:37

way. I really enjoyed this. Thank you

143:38

very much. You're awesome. Really love

143:40

your books. So, it's always a treat.

143:43

>> All right. Bye, everybody.

143:44

>> Bye.
