Brain Rot Emergency: These Internal Documents Prove They’re Controlling You!

Transcript

0:00

You are actively rewiring your brain for

0:02

the worse by engaging with social media,

0:04

high volume, quick videos.

0:06

>> And the social media executives don't

0:07

let their kids use this stuff because

0:09

they designed it to be addictive and

0:11

they know that millions and millions of

0:12

kids have been cyberbullied, sextorted. Many

0:15

have committed suicide. So, I'm getting

0:17

angry.

0:17

>> And then from the medical perspective,

0:19

it's rewiring your body, increasing your

0:21

risk of heart disease and PTSD.

0:23

>> We've moved too far into the virtual

0:25

world and the results are catastrophic.

0:27

People are spending roughly about 6 and

0:29

a half hours a day on their phones. What

0:31

do we do about this?

0:32

>> Well, here's the amazing thing. We

0:34

actually can control our fate. So, we

0:36

are joined by a social psychologist and

0:38

a Harvard physician

0:40

>> to dive into the technology addiction

0:41

and brain rot crisis billions are facing

0:44

worldwide

0:44

>> and how we can counter its devastating

0:46

mental health effects. You have to

0:49

reclaim your attention because without

0:51

the ability to pay attention for several

0:52

minutes at a time, we're seeing the

0:54

destruction of human potential, the

0:56

human relationships, the connection.

0:58

>> But there's all these small tweaks that

0:59

you can do to override that primal urge

1:01

to scroll. For example, 91% of people

1:04

had an improvement in attention,

1:05

well-being, and mental health after

1:07

just 2 weeks of continuing to use your

1:09

device, but not having internet access.

1:11

Next, keep your phone out of your arm's

1:14

reach because the sheer potential for

1:15

distraction has actually been shown to

1:17

change your prefrontal cortex, which is

1:19

called brain drain.

1:20

>> So, yes, we should exert more

1:21

self-control, but we're being pushed

1:23

addictive apps and it's messing us all

1:25

up. That's not our fault.

1:26

>> Would you advise people to delete these

1:28

short form videos?

1:29

>> Oh my god, yes, that would be the most

1:30

important thing you can do for your

1:31

intelligence and for humanity. But if I

1:33

was going to offer some specific advice,

1:35

here are the three things that I do with

1:37

my students to reclaim attention. And

1:39

then to add to that, I have the

1:40

3-second brain reset. So, first

1:43

>> I wanted to ask you guys what you

1:45

thought of this.

1:46

>> Hey, you're back.

1:47

>> This terrifies me.

1:48

>> We've got to stop this now.

1:54

>> Guys, I've got a quick favor to ask you.

1:56

We're approaching a significant

1:57

subscriber milestone on this show, and

1:59

roughly 69% of you that listen and love

2:02

this show haven't yet subscribed for

2:04

whatever reason. If there was ever a

2:05

time for you to do us a favor, if we've

2:07

ever done anything for you, given you

2:09

value in any way, it is simply hitting

2:11

that subscribe button. And it means so

2:13

much to myself, but also to my team, cuz

2:14

when we hit these milestones, we go away

2:16

as a team and celebrate. And it's the

2:17

thing, the simple, free, easy thing you

2:19

can do to help make this show a little

2:20

bit better every single week. So, that's

2:23

a favor I would ask you. And, um, if you

2:25

do hit the subscribe button, I won't let

2:27

you down. And we'll continue to find

2:29

small ways to make this whole production

2:30

better. Thank you so much for being part

2:33

of this journey. Means the world. And uh

2:35

yeah, let's do this.

2:42

Jonathan, I've heard you say that the

2:43

destruction of attention is the largest

2:45

threat to humanity that's happening

2:47

around the world. And I've also heard

2:49

you say that short form videos are the

2:51

worst of the worst because they're

2:53

shattering attention spans. The reason

2:55

why I wanted to have this conversation

2:57

today is somewhat personal. And in fact,

3:00

all of the conversations I have on the

3:01

Diary are somewhat personal to some

3:03

degree. um they're inspired by some

3:05

unanswered question I have in my head

3:07

and also some observation I have in my

3:09

life and the observation I've had is

3:11

that short form videos in particular are

3:15

making my life worse and actually I've

3:17

got to say the catalyst moment really

3:20

where I thought you know I need to get

3:21

you exceptional people together to have

3:23

this conversation was I thought this I

3:25

then looked at my screen time and saw a

3:27

huge change I felt so much worse because

3:29

all these social platforms have short

3:31

form video now and then I actually heard

3:32

Elon Musk who you know has a social

3:35

media platform that does short form

3:37

video say that he thinks it's one of the

3:38

worst inventions for humanity.

3:41

>> Jonathan, why did you say what you said

3:42

about short form video and this

3:45

corruption of attention?

3:46

>> Yeah, because I wrote a whole book

3:48

called The Anxious Generation focusing

3:50

on teen mental health. That was the

3:52

mystery that popped up in the mid-2010s.

3:54

Why are people born after 1995 so much

3:57

more anxious and depressed? And I've

3:59

been tracking down that mystery, and a

4:01

lot of it points to social

4:02

media and especially Instagram, social

4:05

comparison, all the things we know about

4:07

social media. When the book came out in

4:09

2024, since then what I realized is that

4:12

I vastly underestimated the damage

4:15

because I focused on mental health,

4:17

which is a catastrophe. But the bigger

4:19

damage is the destruction of the human

4:21

ability to pay attention. Without the

4:24

ability to pay attention for several

4:26

minutes at a time, ideally 10 or 20

4:28

minutes at a time. Without that, you're

4:30

not going to be of much use as an

4:32

employee. You're not going to be of much

4:33

use as a spouse. You're not going to be

4:35

successful in life. And that's when I

4:37

realized this is way beyond mental

4:39

health. This is changing human

4:41

cognition, changing human attention, and

4:44

possibly on a global scale.

4:47

Adi, what perspective do you come at

4:50

this from? And what's been your

4:51

perspective through all the work you've

4:52

done about brains and stress and

4:54

neuroscience and all these kinds of

4:55

things that has shaped the way that you

4:57

think about social media, screen time,

4:59

short form video.

5:01

>> My background is that I'm a physician at

5:04

Harvard, and my expertise is in

5:06

stress, burnout, and mental health. And

5:08

so that is the lens that I view all of

5:11

this through. We know that the most

5:14

deleterious relationship that you have

5:16

is with your device. You know, in every

5:18

healthy relationship, we have

5:19

boundaries. We have boundaries with our

5:21

kids, our parents, our colleagues, our,

5:25

you know, with our friends. And

5:27

yet, we have no boundaries and often

5:29

porous boundaries when it comes to the

5:32

relationship you have with your device.

5:33

So, it's not so much about, you know,

5:35

becoming a digital monk and renouncing

5:37

technology because technology can serve

5:39

us, right? It inspires, educates,

5:42

connects. Now more than ever, it's so

5:44

important to be an informed citizen, but

5:45

not at the expense of your mental

5:47

health. And so what Jonathan was saying,

5:48

this, you know, constantly being engaged

5:51

with your devices, with social media,

5:53

the scrolling from the minute you wake

5:55

up until you go to bed, there's a reason

5:58

why you have your best ideas in the

5:59

shower. And that's because that's the

6:01

only place in the whole day where you

6:03

are not with your device. People take

6:05

their device to the bathroom. They sleep

6:07

with their device. They eat with their

6:09

device. People walk down the street.

6:11

There are more near-miss pedestrian

6:13

accidents because people are walking

6:15

while they're crossing the street and um

6:16

looking at their devices. And so there's

6:18

all of this brain biology at play behind

6:21

the scenes. So both of you have talked

6:23

about how it doesn't feel good to engage

6:25

and constantly be on your phone, that

6:28

sense of infinite scroll, but there is,

6:30

you know, it feels like you're doing

6:32

nothing. You're just doing this, right?

6:33

What are you doing? But in fact, it is

6:35

not passive. It is active. And it has a

6:37

profound effect on your biology, on your

6:40

brain, on your psychology, and also

6:42

social factors that I hope we talk about

6:44

today.

6:45

>> You know, scrolling, wasting a bit of

6:46

time doesn't seem so harmful.

6:49

What is the big, if we play this forward

6:51

10, 20, 30 years, what is the big risk

6:53

or threat? The biggest threat right now,

6:56

we don't even have to wait 20 years, is

6:57

that, through a process called

6:59

neuroplasticity, which is just a big

7:01

fancy word that simply means that your

7:03

brain is a muscle, by engaging

7:05

with social media, that sense of

7:08

high-volume, low-quality, quick videos,

7:11

you are actively rewiring your brain for

7:13

the worse. So you're increasing your

7:15

sense of stress, worsening your mental

7:17

health, attention, cognition,

7:19

distractibility, irritability, complex

7:22

problem solving. All of that changes

7:25

when you engage in that

7:27

infinite scroll.

7:28

>> Yeah. I'd like to add on here because

7:30

one of the main arguments I get is, ah,

7:32

this is what they said about television.

7:34

Oh, this is what they said about comic

7:35

books. This is just another moral panic.

7:37

But people need to understand why

7:39

touchscreen devices are so different

7:41

from television. And so I think parents

7:44

find this helpful if I just lay this out

7:45

briefly. Good screen time versus bad

7:47

screen time. So humans are storytelling

7:51

animals. We have always, as long as

7:53

we've had language, we've raised our

7:54

kids with stories, epic poems, all kinds

7:57

of stories. Stories are good. The

7:59

human brain needs lots of patterns. The

8:01

child's brain needs lots of patterns to

8:03

develop. So the worst thing you can do

8:06

is hand your child the device because

8:08

they're crying for it because they've

8:09

been trained to get it and you're

8:10

busy. So you hand them the device.

8:12

They're quiet. What's happening? They're

8:14

sitting alone. Not, you know, when I was

8:16

a kid, we always watched with my sisters,

8:18

with my friends. You're arguing about

8:19

it. You're talking. That's social. A kid

8:20

sitting alone with a device in his hand.

8:23

It's not long stories. It's never long

8:25

stories. It always ends up at YouTube

8:28

Shorts or TikTok or Instagram Reels for

8:29

older kids. So, they're doing this.

8:31

But here's the key thing

8:33

that it does that a television does not.

8:34

A television puts you in a state that

8:36

psychologists call transportation. You

8:39

get into a story and you find yourself

8:41

pulled in and you're rooting for the

8:43

characters, and this is how a

8:45

brain gets tuned up to social patterns

8:47

but it can't happen in 10 seconds. It

8:49

can't happen in one minute. It takes a

8:51

long period of time and there is no

8:54

reinforcement. The

8:57

television doesn't do anything to you.

8:59

You don't have any response. Whereas a

9:02

touchscreen device is a Skinner box. So

9:04

B.F. Skinner was one of the founders of

9:06

behaviorism and he put rats and pigeons

9:09

in a box where he could deliver a

9:11

reinforcement, a little grain of food on

9:13

a schedule. And by giving them quick

9:15

reinforcements for behavior, he could

9:16

train them to do amazing tricks in just

9:18

a few hours. When you give your kid a

9:20

touchscreen device, it's stimulus

9:23

response, swipe, get a reward or not,

9:26

variable ratio. And you

9:28

just keep doing that. So you are, as Adi

9:30

said, it is rewiring your brain. It's

9:32

not just wasting time. It is literally

9:35

training you to do things where

9:37

television didn't do that. So this is a

9:39

whole new game.

9:40

>> And to add to that, you know, from the

9:41

medical perspective, you're shortening

9:43

your attention span. And what happens

9:45

over time is so like Jonathan said,

9:47

right, you're not sleeping as well

9:49

because you are engaged with your

9:50

device. We know that 80% of people are

9:53

checking their phones within minutes of

9:54

waking up. We have something called

9:56

revenge bedtime procrastination. This

9:58

concept of, you know, at the end of the

10:00

day you're fatigued, you've had a long

10:01

day, you've had no me time, and you want

10:03

to get to bed early. We all know, by the

10:04

way, what the data is that, you know,

10:06

we've been taught since we were little

10:08

kids, right? Like bedtime, sleep is

10:10

important, it's good for your body, it's

10:11

good for your brain. And we might have

10:12

all the knowledge in the world, but in

10:14

terms of action, there's a wide gap

10:15

between knowledge and information and

10:17

action. And so revenge bedtime

10:19

procrastination is kind of an offshoot.

10:20

So what happens? So, you know, you have

10:22

that decreased attention. You have that

10:25

irritability, hypervigilance. And so, at

10:27

night, at the end of the day, it's 9:00

10:29

p.m. You finally, you know, if you're a

10:30

parent, your kids are asleep, your

10:32

kitchen is clean, maybe you finish your

10:34

entrepreneurial day, and you finally sit

10:36

down with Melanie on the couch, and

10:38

you're like, "Ah, some me time." And, you

10:41

know, you want to get to bed early, and

10:42

you know it's good for you. But then

10:44

suddenly, you're scrolling and before

10:45

you know it, it's 2 a.m. and you're

10:46

saying, "Oh my god, what happened? Why

10:48

am I still awake? What was I doing all

10:49

this time?" What happens is that you

10:52

essentially give yourself some me time

10:54

at night and so you procrastinate

10:56

bedtime. And so what happens is with

10:57

this revenge bedtime procrastination, it

11:00

affects your sleep. And when you

11:02

don't have good quality

11:03

sleep, you have difficulty falling

11:04

asleep, staying asleep, and sleep debt over

11:08

time, for kids and for adults, has all sorts

11:11

of ramifications. So this is just the

11:14

tip of the iceberg, this short form

11:17

video content and the ripple effects go

11:20

far and wide. Not only is it rewiring

11:22

your brain, it's rewiring your body, it

11:25

is affecting your sleep, which increases

11:28

your risk of heart disease later in

11:30

life. And when you're consuming

11:31

graphic videos and graphic images, it

11:35

can increase your personal risk of PTSD

11:38

through vicarious trauma even if you

11:40

weren't there. So, this is just a vast

11:43

network of things that can happen to you

11:46

simply because you're thinking, "Yeah,

11:47

it's harmless. What is it? It's just a

11:49

bunch of videos that I'm checking out.

11:51

It's a way for me to decompress."

11:53

>> What do I need to know about the nature

11:54

of the brain to understand exactly what

11:57

short form video is preying on, is

12:00

hijacking, is taking advantage of

12:03

>> The thing to understand about all of

12:05

this is that we have to focus on

12:07

childhood. Why do we have childhood? Um,

12:10

humans have this really interesting

12:12

childhood where we grow rapidly at

12:14

first and then we slow down for about

12:15

five or seven years. We don't grow very

12:17

quickly and then we speed up at puberty.

12:19

Whereas other primates, they just grow

12:21

and grow till they reach reproductive

12:22

age, then they reproduce. But we seem to

12:25

have this long period of sort of middle

12:26

childhood for cultural learning. It's a

12:29

period in which the kid is now

12:31

walking and talking and turning away

12:32

from the parents, and that's a time

12:34

for this to come in and they pay

12:36

attention and they form relationships.

12:38

All these things have to happen slowly

12:40

because the neurons are gradually

12:42

growing. They're finding each other

12:43

based on what the child is doing. Okay?

12:46

So, we grow up in the real world, and

12:48

that happens over time. And a lot of

12:49

that is very physical. Kids are very

12:50

physical. Mammals are very physical and

12:52

there's a lot of touch. So, that's a

12:54

healthy human childhood. But when you

12:56

give them an iPad or your old iPhone

13:00

and they begin doing the

13:03

touching and swiping, that is going to

13:05

hijack their attention. That is going to

13:07

push out all other forms of action and

13:09

learning. And that is going to change

13:10

the way the parts of the brain that

13:12

learn to pay attention, what's called

13:14

executive function. It's going to change

13:16

the way the brain learns to pay

13:17

attention. It's going to change the

13:19

reward circuits. I think you had Anna Lembke

13:21

on recently, who's the nation's expert on

13:23

addiction. And the way that she

13:25

describes it, how, you know, any one

13:26

addiction is going to change your reward

13:28

pathways to make you more vulnerable to

13:30

other addictions. So, we're setting our

13:32

kids up not just for this, but then when

13:34

they get a little older, it'll be video

13:36

games, it'll be uh porn, it'll be

13:39

gambling now. Everything is gambling.

13:41

So, we're setting them up for a life in

13:44

which their brain is saying, "Give me

13:46

something. Give me some quick dopamine.

13:48

Give me some quick dopamine. I don't

13:49

want to have to work for

13:50

anything. I don't want to have to apply

13:51

myself for an hour and then get a

13:53

reward."

13:54

And so what the short

13:56

videos are doing for kids is preventing

13:59

them from learning the connection

14:00

between hard work and a reward. Is there

14:03

anything else I need to know from a

14:04

neuroscience perspective about what's

14:06

going on in my brain when I

14:09

develop these addictions with short form

14:10

videos or these sorts of quick dopaminergic

14:12

tasks.

14:14

>> So we all as humans have a primal urge

14:16

to scroll. When you feel a sense of

14:18

stress, as many of us do in this moment

14:20

in life, it is your sense, you know,

14:23

your amygdala. And so it's your sense of

14:25

self-preservation. It's survival and

14:27

self-preservation. That is what your

14:28

amygdala does. So if you want me to show

14:30

you here, I have no idea what I'm doing

14:32

there.

14:32

>> Yeah, it's okay. So here, deep here,

14:36

it's a small almond-shaped structure.

14:37

And that is your amygdala. And your

14:40

amygdala, its main purpose is survival

14:42

and self-preservation. It houses your

14:45

stress response, your fight-or-flight

14:47

response, and it is truly what is

14:50

activated when you are engaging in

14:52

content, when you feel a sense of

14:53

stress. And so you have this primal urge

14:55

to scroll. And so evolutionarily, when

14:58

we all were cave people living

15:00

together, we would sleep at night and

15:02

there would be a night watchman scanning

15:04

for danger. And now we have

15:07

become our own night watchman. And so we

15:09

scan for danger all day, all night long.

15:11

How do we do that? We scroll. And then

15:13

the amygdala is triggered. And then you

15:14

scroll some more. And you scroll some

15:15

more. And you scroll some more. And so

15:17

over time, what you're doing is

15:19

putting that amygdala in a state

15:21

of chronic activation. It's continually being

15:25

triggered. What happens to the amygdala

15:27

over time? When it's continually

15:29

triggered, it starts to rewire your

15:30

brain in other ways. And how does it do

15:32

that? Through something called the

15:34

prefrontal cortex. If you put your hand

15:36

like, I can use this model, but I can

15:38

also just use my hands. When you put

15:40

your hand on your forehead, the area

15:41

right behind your forehead right here is

15:43

the prefrontal cortex. This is a very

15:45

important thing for our conversation.

15:47

This area of the brain, the

15:49

prefrontal cortex,

15:52

governs executive functions. So

15:56

impulse control, memory, planning,

15:59

organization, strategic thinking,

16:01

complex problem solving and there is a

16:04

tension between your amygdala and the

16:06

prefrontal cortex. When your amygdala is

16:08

in the driver's seat, that prefrontal

16:10

cortex is quiet. And what is happening

16:13

as we continue to engage with our

16:15

devices and have this primal urge to

16:18

scroll, that amygdala upregulates and the

16:21

prefrontal cortex downregulates. And

16:23

over time, that is very problematic for

16:25

all of the reasons that we're kind of

16:27

introducing at the start of this

16:28

conversation. There was a meta-analysis

16:30

done in 2025 of 71 different studies and

16:33

it found that heavy short form video use

16:34

was associated with reduced thinking

16:36

ability, especially shorter attention

16:38

spans and weaker impulse control.

16:41

>> That's right. These studies are just

16:43

beginning to roll in now. Um, kids have

16:45

been on social media really a lot since

16:47

2008, but especially once they got

16:49

smartphones around 2012. Studies began

16:52

coming in in the 2010s that, look,

16:54

it's looking like the kids who spend

16:56

a lot of time on this are doing much

16:59

worse. They're more depressed. The focus

17:01

was on depression. And some other

17:03

researchers said no, it's just a

17:04

correlation. You can't prove

17:06

causation. And we've been going around

17:08

and around on this for about 10 or 15

17:10

years. Now we're doing the same thing

17:11

with the short form videos. The

17:14

damage everyone can see. My students

17:17

tell me this is what's happening. We

17:19

feel it. Studies are coming in, but

17:21

there will be a few studies here and

17:23

there that don't show it and people will

17:25

uh push that up. Meta spends a lot of

17:28

time and money to influence the public

17:30

debate. A lot of public documents are

17:32

coming out now about how they do that.

17:34

So, we can engage in debate over

17:37

research on short form videos for 5 or

17:38

10 years, but at that point, it's way

17:40

too late. We've lost a second

17:41

generation, Gen Alpha. So, I think when

17:44

we're talking about kids especially, we

17:46

need to have what's called the

17:47

precautionary principle, which is if

17:49

there's reason to think that this is

17:50

hurting kids, how about we don't roll it

17:53

out into every childhood? How about we

17:56

make these companies responsible? We

17:58

hold them responsible for what they're

17:59

doing to kids because we're about to

18:01

make the same mistake we made with

18:03

social media, letting it worm its way

18:05

into childhood. We have already done

18:06

that with short videos, and we're about

18:07

to do it with AI chat bots. In fact,

18:09

we're just beginning it in late 2025,

18:11

I'd say. I don't think people quite

18:14

realize how much these major social

18:16

media platforms have figured out that

18:18

short form video sells. Um we're

18:21

actually seeing this sort of global rise

18:23

in short form drama apps now. And I

18:25

don't know if you guys have seen these

18:26

apps, but it basically takes a movie

18:28

that used to be 2 hours long and it

18:30

breaks it down into say 60 different

18:32

parts. And a colleague of mine at my

18:34

company was showing me the other day in

18:35

different parts of the world they're

18:36

exploding. There's been a 190% increase

18:40

in short form drama apps. It takes a long

18:42

form movie and turns it into short form

18:43

videos. Disney Plus plans to introduce

18:45

AI-generated short form videos this

18:47

year, starting with 30-second limits

18:49

inside the Disney Plus app. And

18:51

TechCrunch also reported that as of

18:53

October 2025, Netflix tested short form

18:56

video content on phones and recently

18:57

announced its plan to expand this

18:59

feature. It appears that all of the

19:01

content we consume is going that way.

19:03

And listen, I'm friends with lots of

19:04

people at big social media platforms. um

19:06

this doesn't stand

19:08

in my way of criticizing them, because

19:10

think two things can be true at the same

19:11

time right so I think it can be true

19:13

that I have a podcast and I make short

19:14

form videos and that I also understand

19:17

that there's a real downside to them and

19:20

um all of the major social media

19:22

platforms that I speak to have

19:24

a huge drive towards short form video. It

19:26

appears to be their number one

19:28

strategic priority and obviously because

19:30

of the success of TikTok. As of January

19:33

2026, TikTok, I believe, is the most

19:35

downloaded social app in the world now

19:37

and if I'm running a social

19:40

media company and my one focus is

19:42

profit,

19:44

>> I'm now faced with an existential

19:45

crisis.

19:46

>> Yeah.

19:46

>> I either take part in this thing that is

19:48

driving the highest retention, therefore

19:50

the best ad payouts or I die.

19:54

>> So there are two comments to that. First

19:56

off is that, you know, when we think

19:59

about social media and how

20:02

society is shapeshifting to allow this

20:05

short form content there is a concept

20:08

that Jonathan and I briefly mentioned I

20:10

think prior to us filming called second

20:12

screen viewing and so what's happening

20:14

is that allegedly these big streamers

20:18

are asking their creative talent whether

20:20

it's screenwriters or actors or

20:23

directors to reiterate the

20:26

plot because as you're watching, you

20:28

know, when we were kids, we would watch

20:30

TV or movies and you just sit on the

20:31

couch and you'd have a bucket of popcorn

20:33

with your family and you'd watch a

20:34

movie, an hour, hour and a half, two

20:36

hours and now second screen viewing is

20:38

happening, which means that you're

20:39

watching a movie or a TV show and you're

20:42

on your device and so you are constantly

20:44

having that fragmented attention and we

20:45

are all doing it and so what these

20:47

streamers are allegedly asking their

20:49

creative talent to do is to reiterate

20:51

the plot. So it's shapeshifting. It

20:54

makes sense if my brain is, you know,

20:55

I'm 33 years old, so I've grown up with

20:57

a lot of this stuff. If my brain has

21:00

been wired to have shorter attention

21:02

spans, and movies from 30 years ago

21:04

are not going to cut it for me,

21:06

>> Right? But then look what happens if

21:09

everybody chases that. And I know, look,

21:11

Netflix is making shorter and shorter

21:13

stuff. Even TED, the TED conference, TED

21:15

talks are getting shorter and shorter.

21:16

What does that do? It just repeats the

21:18

cycle. Now, I appreciate that you're in

21:21

a collective action trap, as you put it.

21:23

If I don't do it and everyone else is,

21:25

then I lose out. And so the

21:26

business pressure on all the

21:28

creators, the business pressures go

21:29

shorter, shorter, shorter. There's a

21:31

very useful psychological term

21:33

distinction here that I think would be

21:34

helpful, which is the difference between

21:36

psychological assimilation and

21:38

accommodation. This goes back to Jean

21:40

Piaget, the great developmental

21:41

psychologist. We have certain mental

21:43

structures. We have a model in our

21:45

head of how things work. And then

21:48

you learn something new. A kid learns,

21:49

oh, that's an aardvark, okay, I put that

21:51

into my existing model. You just assimilate it.

21:53

They learn lots of animal names, and then

21:55

they learn something that doesn't fit,

21:58

like you learn about bacteria, and now,

22:00

okay, now you have

22:02

to change your mental structure. It

22:04

takes a little time. You change your

22:06

mental structure to understand more

22:07

about life. That's what education really

22:09

is all about. You have to have a lot of

22:11

assimilation, of course, but

22:12

you need that

22:14

accommodation over and over again. That's

22:17

why you want to go to college. That's

22:18

why you want to read novels. That's what

22:19

a great movie does. It takes time. And

22:22

so, one of the great things about this

22:24

modern technology is that we can do

22:26

things like have this three-hour

22:27

conversation. I can't believe it. People

22:29

are going to listen to it. So, this, you

22:31

know, long form content. This is all

22:34

about accommodation. Anybody who walks

22:36

out, who leaves this conversation

22:38

after 3 hours and isn't thinking about

22:40

something differently, we failed. Okay.

22:43

So, you are very much in the

22:44

accommodation business. That's great.

22:46

And then the question, both a moral

22:48

and a strategic question is how much do

22:50

you need to play the quick-hit game

22:53

in order to get people there. I leave

22:54

that to you to do the moral calculation.

22:56

Maybe it balances out, maybe, but

22:59

uh but I think that's where you are.

23:00

>> Would you advise people to delete these

23:03

short form?

23:04

>> Oh my god. Yes. Of course.

23:07

Yes. That would be the most important

23:08

thing you can do for your intelligence

23:09

and for humanity would be delete them.

23:11

So, what I advise my students to do is I

23:14

say just do this. Just delete one

23:18

of the social media apps that you use,

23:19

especially if it's TikTok, just delete it

23:21

from your phone. You can still check on

23:23

your computer. If someone sends you a

23:25

video, you can still watch it on your

23:26

computer. You can even check it, you

23:28

know, every weekend. You can spend some

23:30

time on it, but just get it off your

23:31

phone because on the phone, the phone is

23:33

always with us. It's an extension of our

23:34

body. And if it's always there, then

23:37

it's going to do what's called

23:39

attention fracking. It's going to break

23:40

up your attention. Every

23:41

7 seconds that you're not doing

23:43

something, you're going to go for the

23:44

phone. So, the best thing you can do to

23:47

make yourself smarter and a better

23:49

partner and a better human, I would say,

23:51

would be to delete the short, especially

23:53

any of the short form videos. So,

23:55

TikTok. Unfortunately, YouTube, which has a

23:57

lot of good stuff on it, becomes YouTube

23:59

Shorts. Instagram, which does a lot of

24:00

terrible things, but people do find it

24:02

useful for all kinds of purposes,

24:04

becomes Instagram Reels. So, I think the

24:06

proper amount of short form video for

24:08

children 0 to 18 is zero. They should

24:10

never be watching the vertical videos.

24:12

Parents, don't ever let your kids watch

24:14

the short vertical videos. You might

24:15

even, if only there was a way to

24:17

put it. Is there a way to put a time

24:18

limit? You can say it has to be 10

24:19

minutes or longer. Kids, you can have an

24:21

hour YouTube, but it has to be 10

24:22

minutes or longer. Nothing shorter than

24:24

10 minutes. That at least will get rid

24:26

of this quick swiping,

24:28

the dopamine stuff. So I would say that

24:31

for kids, yes, not engaging

24:34

in it whatsoever. But, you know,

24:36

my approach is a little bit different

24:38

for someone who's like in their 30s or

24:40

in their 40s and the way I would kind of

24:42

frame that is

24:45

instead of renouncing you know saying

24:47

I'm going to get it off my device and

24:49

I'm going to check on a desktop which is

24:51

great, there are little tweaks

24:53

that we could do because my approach is

24:56

to foster that sense of empowerment in

24:58

someone to help them make positive change.

25:01

And so one strategy that you could use

25:03

if you are saying there's no way I'm

25:05

deleting these

25:07

apps from my phone, right? By

25:09

the way, I practice what I preach and I

25:11

really do try not to over-engage in technology,

25:15

to the best of my ability. Um, but one

25:17

thing that you could do is grayscale

25:18

your phone. And so especially at night

25:20

like it's 9:00 p.m. like we talked about

25:22

revenge bedtime procrastination. You

25:24

know that you're going to do it. You're

25:26

going to sit down and you're going to

25:27

scroll and before you know it, it's 2

25:28

am. Instead, grayscale your phone. This

25:31

simple switch. You can toggle it. I have

25:32

my phone set to grayscale, which simply

25:34

means that you're getting rid of your

25:35

color, making it black and white. And

25:37

so, when it is grayscaled, you

25:40

know, it doesn't have that same

25:42

addictive quality to it. It's like going

25:44

through a grocery store. A marketing

25:46

executive described it this way to me.

25:47

Going through a grocery store instead of

25:49

the technicolor junk food cereal, it's

25:52

just black and white. So

25:54

there's a lesser sense of compulsion to

25:57

continue checking. So that's like one

25:58

strategy you could use. And the other is

26:01

to set some boundaries. So geographical

26:04

boundaries, keep your phone out

26:06

of your arm's reach. If you're at a desk,

26:09

if you're a student, not right next to

26:11

you because we know there's this

26:12

phenomenon of brain drain. So it's not

26:14

just that when you're using your phone,

26:16

it can be a potential distraction, but

26:18

also just having it close by. It's

26:20

called brain drain. And um so putting it

26:22

in a desk drawer, keeping it in another

26:24

part of the home if you are working,

26:27

keeping it far away from you. And so you

26:30

kind of can override that primal urge to

26:32

scroll, let your prefrontal cortex take

26:34

hold again. And so there's all these

26:36

small tweaks that you can do. You

26:39

think no?

26:39

>> Yes, there are all these small tweaks

26:40

you can do and they will make the heroin

26:42

a little bit less addictive. And yeah,

26:44

you should try those. But what I can say

26:45

after teaching this course for many

26:47

years is that people who try that, they

26:49

report, "Yeah, you know, it helped,

26:50

but you only really get the

26:52

transformation when you quit social

26:54

media." You get your life back. You

26:56

get hours a day back. So I

27:00

would urge everyone to just think, you

27:02

know, you only get one

27:04

childhood, you only get one young

27:07

adulthood, and if you're going to spend

27:08

it scrolling, what do you have to show

27:10

for it at the end? And when you get

27:12

people to reflect on, well, how much

27:14

value do you really get from watching

27:15

the short videos? How would

27:17

your life be different if you

27:18

knocked it out? Once they realize that

27:20

their motives for being on it were

27:22

either just to keep up or because that's

27:24

what everyone else is doing or as you

27:26

said, I deserve it because I'm tired.

27:28

Well, why are you tired? It's in part

27:30

because your attention was fragmented

27:31

all day long. So, you only really get

27:35

the transformations when you get a real

27:36

change in what you're

27:38

consuming. Although, of course, yes,

27:39

setting it to grayscale will be helpful, but

27:41

it's not going to be transformative for

27:42

most people, I believe. And then you

27:43

know, based on the science, there are

27:46

certain elements like when we think

27:47

about what is it about the phone that is

27:50

creating that sense of compulsion.

27:52

Jonathan is right. So what is it about

27:54

the phone? It's not just the phone you

27:56

know, you're scrolling, you're engaging.

27:58

There are two studies that were really

27:59

interesting. In one,

28:02

people continued to use their devices. They

28:04

had no internet. So it's like you know I

28:06

tried this experiment myself in

28:08

December. I was out of the country and

28:10

so, you know, I didn't

28:12

plug into Wi-Fi and I found, you know,

28:15

a marked change in my

28:18

mood, my sleep and I'm not even, you

28:21

know, 20 years old on TikTok and it was

28:23

so different. And so this study found

28:25

that just two weeks of continuing to use

28:27

your device, but just not having

28:29

internet access improved your attention,

28:32

well-being, and mental health. And in

28:34

this population, it was all adults, it

28:36

wasn't kids, it was all adults. It found

28:38

that 91% of people had an improvement in

28:41

at least one of these metrics. And then

28:42

another study more recently um just one

28:45

week of not engaging in social media,

28:49

digital detox they called it, did the

28:51

same thing. Better, you know, less

28:53

anxiety, less depression,

28:56

decreased insomnia. But my feeling is

28:59

that you know there is this new kind of

29:02

meme, right, like the millennial urge

29:04

to delete my internet presence and

29:08

you know live off the grid. There is

29:10

certainly utility to that and I salute

29:12

anyone who wants to engage in that

29:14

analog life more and more. But from

29:18

where I sit, I feel like we do need

29:20

to have healthier boundaries and engage

29:23

more responsibly. It also builds up that

29:25

muscle and it can help. You know, it takes

29:27

eight weeks for neuroplasticity. When

29:29

you're building new brain circuits, it

29:31

takes eight weeks. Falling off, getting

29:33

back up is part of habit formation. So,

29:35

if you're going to make any of these

29:37

changes, understand that it takes some

29:38

time. But I don't know if it is

29:42

possible for me or for others to say

29:45

fully, I'm going to, you know, delete

29:47

it off of my phone. But I love that. So,

29:50

I'd like to go a

29:53

little further with this. So, the way

29:55

you put it, yes, there's all

29:57

these things that we could do. We should

29:58

have boundaries, but all of that puts

30:00

the responsibility on us.

30:01

>> Agree.

30:02

>> And that's where we are with junk food.

30:03

With junk food, we're like, okay, it's

30:05

out there. We have to learn

30:06

self-control. We have to teach

30:07

self-control to our kids. Okay, that's

30:09

the way it is in this country. But the

30:11

digital devices, I think, are very, very

30:12

different. So, imagine if we

30:15

sent our kids out into the world and it

30:17

wasn't just that there was junk food in

30:18

all the stores. It was that everything was

30:20

made of junk food. You

30:22

know, the door handles, you can eat it.

30:24

It's chocolate. But it's not just that

30:25

the world's made of junk food. It's that they're

30:30

actually able to tell

30:30

what you're craving at the moment. And

30:31

maybe you're more in the mood for

30:33

salt. So now it's all potato chips or

30:36

pretzels. If the world is designed by

30:39

companies to always give you the thing

30:42

that will most grab your unconscious

30:44

desires, will affect the amygdala,

30:45

the reward centers,

30:47

that's on them. That's not our fault. My

30:51

general rule as a social psychologist is

30:53

if a few people are doing something bad

30:55

or self-destructive, well, you know,

30:58

they should learn some self-control or

30:59

that's something about them. But when 90

31:01

or 95% of people are doing something

31:03

self-destructive,

31:05

that's because of the companies that put

31:06

us in an environment that encourages

31:08

addiction. So, I just want to read a

31:10

quote. We have so much good stuff coming

31:12

out from Meta, from all the

31:13

whistleblowers. Now, all the court cases

31:15

are beginning in Los Angeles. Finally,

31:17

for the first time, Meta is

31:18

going to face a jury with all the

31:19

parents who've lost kids. Um, so

31:23

here's a chat. So, we have a lot of

31:25

internal documents that came out from

31:26

the attorneys general that are suing

31:28

Meta. So, while they're talking about

31:30

the results of some of their internal

31:32

research, one of them says, uh, "Oh my

31:34

gosh, y'all, Instagram is a drug. We're

31:36

basically pushers. We're causing reward

31:38

deficit disorder because people are

31:40

binging on Instagram so much they

31:43

can't feel reward anymore." which is

31:45

something Anna Lembke said, like the reward

31:48

tolerance is so high and then he says I

31:50

know Adam, meaning Adam Mosseri, I know Adam

31:53

doesn't want to hear it he freaked out

31:55

when I talked about dopamine in my teen

31:57

fundamentals leads review but it is

32:00

undeniable. It's biological and

32:02

psychological. Top-down directives drive

32:05

it all towards making sure people keep

32:07

coming back for more. This is not on us.

32:10

They designed it to be addictive.

32:12

They've done research to make it

32:13

maximally addictive. They push it on

32:15

children. They tried to get Instagram

32:17

Kids for even littler kids. They know

32:19

what they're doing. They've done the

32:21

research. My team put together a list. We

32:23

found references to 31 internal studies

32:25

that Meta did. They've done a lot of

32:27

research finding harm. They bury it, but

32:30

you can find it at meta's internal

32:32

research.org. We put it all online. You

32:34

can read these quotes. So, yes, we

32:37

should exert more self-control, but

32:39

basically we're being pushed addictive

32:41

substances, addictive apps,

32:44

and it's messing us all up.

32:46

>> I agree wholeheartedly that it is so

32:49

destructive, and you feel like even with

32:51

people in their 40s and 50s, and if

32:53

anyone can do it, it's you, Jonathan.

32:56

Seriously, I would love to see it. You

32:58

know, we also know based on the data

33:00

that these things quite literally

33:03

reshape our brain, rewire our brain

33:04

through neuroplasticity and also change

33:07

our brain waves, the patterns. So we

33:10

talked about the amygdala and the

33:11

prefrontal cortex, right? But they also

33:13

change brain waves. And so when you look

33:15

at studies and the data, it involves the

33:18

reward pathway and dopamine. And these

33:20

brain patterns, the brain waves mimic

33:22

addictive behaviors. And you know that

33:26

there are certain features, right? Like

33:27

when you do swipe down to refresh, it's

33:30

the slot machine.

33:31

>> It was modeled directly after the slot

33:32

machine. Yeah.

33:33

>> Or autoplay or um you know the algorithm

33:36

with that infinite scroll. Um one really

33:39

interesting kind of like breaking news

33:41

which you guys may have already heard

33:42

of. It's like 3 days ago the European

33:45

Union Commission found TikTok to be in

33:49

breach of the Digital Services Act. And

33:52

what it said was that it is addictive.

33:55

It, you know, creates compulsion and

33:59

gets people into this autopilot mode so

34:01

they have difficulty disengaging and

34:04

personally I am moving away from social

34:06

media and really leaning into analog

34:08

life. But I think, with the way the world

34:11

is, you know, it's one of our only ways to

34:13

connect, right? Meaning I don't mean

34:14

connect deeply

34:16

I don't mean connect like in a deep way

34:19

but be informed to know what's going on

34:21

in the world etc

34:22

>> I suspect that because we've spent so

34:25

long criticizing Meta over the last 10

34:27

years because the biggest in any

34:28

category takes all the heat. So, OpenAI

34:30

is taking it now. And what this often

34:32

does is it provides cover for other

34:34

people to go be even more extreme with

34:37

that behavior while Meta takes the

34:40

heat. And I actually think this is how

34:41

TikTok came to be.

34:43

>> TikTok basically originally started

34:45

as Musical.ly, became TikTok. They

34:48

were taking no heat.

34:49

Um, so they created an algorithm

34:52

which is the equivalent of like crack

34:54

cocaine. The reason why I have a TikTok

34:57

account. I don't have the app on my

34:58

phone. I have never had the app on my

35:00

phone. It was because I

35:04

noticed that the view variance on

35:06

TikTok was like no other platform. What I

35:08

mean by that is you can have a million

35:10

followers on TikTok and you can get

35:11

10,000 views or you can get 10 million

35:14

views. In the 15 years that I've been on

35:16

social media, building social media

35:17

businesses, I'd never seen this before.

35:19

And what it indicated to me is that the

35:22

algorithm was being an even more

35:24

aggressive sorting hat or retention

35:26

machine.

35:26

>> What to push up, what to push down.

35:28

>> Yeah.

35:28

>> And so, like, when I started in social

35:30

media in 2014,

35:32

if I had a million followers, I might

35:34

get a million views or maybe 800,000. I

35:38

did some research the other day on all

35:39

of our social channels over time and

35:41

what we're seeing is the variance in the

35:43

amount of views we can get is increasing

35:46

which means the algorithm is doing more

35:47

work to say show everyone this. I don't

35:50

care if the person that posted it is

35:51

called Jenny and has seven followers and

35:53

show no one this. I don't care if it's

35:54

Steven who has a million followers or

35:56

whatever. And I realized that TikTok

35:58

was way ahead of everybody here. And

36:00

that's why they are the most addictive,

36:02

the fastest growing platform. I say all

36:04

this to say that even if Meta shut down

36:08

tomorrow,

36:09

someone else would seize the opportunity

36:12

if there isn't sort of policy, I guess,

36:15

>> in place.

36:16

>> That's right.

36:17

>> It would be whack-a-mole, right?

36:18

>> Yeah. No, that's right. And so, you

36:20

know, in terms of who's done the damage

36:21

to kids, Meta is the big fish via

36:24

Instagram. And they're also the main

36:26

player in terms of spending a huge

36:27

amount of money to lobby Congress

36:30

and block laws. They're also the main

36:31

player in buying up civil society

36:33

organizations, giving money to

36:34

organizations, the National PTA, all

36:36

sorts of organizations. They get to then

36:39

give a message on digital citizenship or

36:41

digital health. So, Meta really is the

36:44

major driver. Meta is the tobacco

36:46

industry here trying to change the

36:48

dialogue. But in terms of the products,

36:51

um, Snapchat is probably more deadly in

36:54

terms of the actual number of deaths per

36:55

user, because Snapchat is not

36:58

making you depressed by social

36:59

comparison as much. Snapchat is

37:01

introducing you to all kinds of people

37:02

and it's the main way that drug dealers

37:04

and extortionists find kids.

37:06

Snapchat has a Quick Add feature which

37:08

relentlessly pushes you to connect with

37:10

friends of friends. So once a man can

37:12

get any kid in a school, now he

37:14

can get connected to all the kids in the

37:16

school. So, in a lot of the

37:18

court cases, you know, when you

37:20

have suicides from cyberbullying, you

37:22

have drug overdoses from, you know, a

37:24

kid bought a Xanax, but it had fentanyl

37:26

in it. So, Snapchat

37:29

in 2022, we know from their internal

37:31

documents, from the lawsuits, they were

37:33

getting 10,000 reports of sextortion

37:36

from their users, not a year, every

37:38

month. And that's just what was

37:40

reported, which is the tip of the

37:41

iceberg. So, Snapchat is a terrible

37:43

platform for children to be on. It

37:45

should be an adult-only platform. You're

37:47

talking with strangers around the world

37:49

with disappearing messages,

37:53

and Snapchat doesn't even keep a record.

37:54

It is ideal for sextortion. There's even

37:57

a handbook on how to sextort kids on Snapchat.

37:59

It goes around the world, and

38:01

criminal organizations use it. So

38:03

I definitely don't want to let Snapchat

38:05

off. TikTok of course is a Chinese

38:07

company. I mean nominally, we'll see

38:09

if that's changed, but it was a

38:11

Chinese company whose Chinese

38:13

kids got the healthy TikTok, Douyin, and

38:15

they, you know,

38:17

learned to follow astronauts, and

38:20

their algorithm feeds

38:22

their kids patriotic stuff. Um, it shuts

38:25

off at a certain time at night. There's

38:26

all kinds of limits. So the people who make

38:27

the technology generally want to protect

38:30

their own kids and they want other kids

38:32

to use it. That's what TikTok is doing

38:35

in China. They want American kids to rot

38:37

in hell, but they want their kids to

38:39

grow up with the ability to focus. And

38:40

it's the same thing with the tech guys

38:42

in Silicon Valley. They don't let

38:45

their kids use this stuff. They make

38:46

their nannies sign contracts that they

38:48

will not let the kid have a phone. They

38:50

will not expose the kid to that. They

38:51

send their kids to schools like the

38:53

Waldorf school precisely because

38:55

there are no computers or tech in the

38:56

classroom. So once again, we see their

38:58

revealed behavior. They know they designed

39:00

it to be addictive. They know it's

39:02

addictive. They don't let their kids use

39:03

it. They want your kids to use it. Um,

39:06

so I think that's where we are.

39:07

>> And how does AI

39:09

>> oh

39:09

>> become a protagonist in the story?

39:10

>> So my work is now focused on AI

39:13

chatbots, mental health, and the human

39:14

connection. We haven't yet kind of

39:16

delved into loneliness, but there's this

39:18

unmet need for human connection, right?

39:20

Deep human connection. We don't have a

39:23

sense of meaning or purpose right now

39:25

because what happens is uh we can talk a

39:28

little bit more about the default mode

39:29

network and what happens to your brain

39:31

when you don't allow yourself to get

39:33

bored because you're constantly on your

39:35

devices and that meaning and purpose

39:37

that self-referential thinking is really

39:39

what develops when you're bored. And so

39:42

all of this that we're talking about

39:43

that feeling of disenchantment. It's a

39:45

fragmented society. You're by yourself.

39:47

It's that echo chamber phenomenon. All

39:50

of it kind of opens the door

39:54

for AI chatbots. And the reason

39:54

is because these tech companies are

39:56

sensing that people aren't really happy

39:58

on social media and they're thinking

39:59

about getting off, right? They're

40:01

using it less, because

40:03

social media has become less social,

40:05

more media. So they're not really

40:06

engaging as much and they're spending

40:08

time doing other things. And so the

40:10

Atlantic had a fantastic piece about

40:12

this. They're building it as the

40:14

antisocial media. So tech companies are

40:17

building AI chatbots and calling it

40:21

the antisocial media. It's a place where

40:22

you can go to form deeper connections

40:25

and you know really have someone

40:27

understand you. One of the tech leaders

40:30

said that there's an unmet human need

40:32

for connection and people don't have as

40:34

many friends as they want to and so

40:36

we're going to introduce um friendship

40:38

through AI chatbots. There is a Reddit

40:40

forum right now. So just to back up AI

40:44

chatbots, what we're talking about in

40:45

our conversation today is the publicly

40:47

available chatbots, not, you know, AI for

40:50

medical care, which has brought so many

40:52

wonderful things in my

40:55

field, in medicine, like breast cancer

40:57

diagnoses and detection 5 years earlier

41:01

through AI. I mean there's some amazing

41:03

things coming out of AI. This is about

41:05

the publicly available conversational

41:07

chatbot phenomenon. And so Harvard

41:11

Business Review found that the number

41:13

one use case is not productivity, not,

41:17

you know, coding or things that you

41:18

think of when you're using an AI

41:19

chatbot, but it's mental health therapy

41:22

and companionship. Number one use case

41:25

of AI chatbots. So people are using AI

41:28

chatbots as a life adviser, as a

41:30

therapist, as a companion. And on

41:32

Reddit, which is like the zeitgeist.

41:34

It's sort of like, you know, where

41:35

>> And why is this a bad thing?

41:37

Oh, I mean so many reasons why

41:40

>> Use it for companionship, for

41:41

example.

41:42

>> There's so many red flags about AI chat

41:44

bots. And so Reddit has a forum. It's uh

41:46

I think last I checked 45,000 people. AI

41:49

is my boyfriend. And you know, people

41:52

who are having a relationship with their

41:55

>> AI chatbot. The reason it's bad, I mean

41:59

AI chat bots are, you know, where social

42:01

media is about attention, the attention

42:04

economy, dopamine. What's happening with

42:06

the AI chatbot phenomenon? It's that it

42:08

is forming attachments. So oxytocin is a

42:11

hormone, the bonding hormone, and we're

42:13

probably going to see more data on how

42:15

oxytocin is involved. And so it is going

42:18

to reshape human connection.

42:21

>> Right? If I could add on to that, that

42:23

was beautifully put. Social

42:25

media came and hacked our attention and

42:28

took most of it with devastating

42:30

effects. Now AI is coming to hack our

42:34

attachments which is going to have even

42:36

more devastating effects. So think about

42:37

it this way. Everyone needs to

42:39

understand the attachment system. It's

42:40

this wonderful system that all mammals

42:43

have that keeps the mother and other

42:45

species but for humans mothers and

42:46

fathers keeps us connected to the child

42:49

and the child to the parent. But it's

42:50

it's this cybernetic system in which as

42:53

the kid is beginning to

42:55

develop and is able to like you know you

42:57

do like peekaboo games and you do the

42:59

back and forth and it's just the most

43:00

delightful thing. You get that back and

43:02

forth. Um it's called serve and return

43:04

interactions and all the time the child

43:07

is developing what's called an internal

43:08

working model of the parent and the

43:11

model in their head is oh you know when

43:13

I get in trouble that that this is the

43:15

person that comes and soothes me. And

43:17

the point of this isn't just to make the

43:19

child feel good. The point is that now

43:21

the child can go off and play because

43:22

that's where the learning happens. It

43:24

doesn't happen when you're in your

43:24

mother's arms. The whole point of

43:26

the attachment system is to regulate the

43:28

child going off and playing, taking

43:30

risks, having experiences, and then when

43:32

something goes wrong, as it always does,

43:34

then they come running back to their

43:35

secure base. And if they don't have a

43:37

secure base, then they're much more

43:38

anxious and they don't explore as much

43:39

and they don't develop as much. All

43:41

right? So, this develops very gradually

43:42

over all of childhood. And the

43:46

internal working models you develop as a

43:48

child are the models that you will reuse

43:50

in puberty for romantic relationships.

43:53

And so if you are securely attached as a

43:55

child, you're more likely to be securely

43:57

attached as an adult on the dating

43:58

market, which makes you a much better

44:00

candidate for boyfriend or girlfriend or

44:02

husband or wife. Um, what's going to

44:04

happen? AI is going to intervene very

44:06

early. AI is going to be so much more

44:08

responsive than the parent because the

44:10

parent has a job and the kitchen and two

44:12

other kids and is not always there. But

44:14

the AI teddy bear is always there for

44:16

you. So the primary working models are

44:18

going to be for the teddy bear, the AI

44:20

chatbot in the teddy bear and later the

44:21

AI chatbot on your iPad and then on your

44:24

computer. And already there is

44:25

holographic porn — naked,

44:28

you know, beautiful men and women that

44:29

can be your companion. So, we're going

44:31

to have a whole generation growing up

44:33

developing attachments to AI generated

44:37

holograms from companies that are now

44:40

about to enter the enshittification

44:42

process in a way beyond anything we've

44:43

ever seen. If I could just briefly

44:45

explain — have you heard the word

44:46

enshittification? Okay. So

44:48

there's a wonderful book out now by

44:50

Cory Doctorow, who addressed the question:

44:53

why is it that all the

44:56

platforms seem so wonderful at

44:58

first, the whole internet, everything

44:59

so wonderful and then it all turns to

45:01

shit. How does that happen? And he says

45:03

it's a very simple process. They

45:05

discovered early on certainly in the

45:07

early social media age by the early

45:08

2000s they discovered you know what you

45:11

got to get to scale. Scale beats

45:12

everything else. You got to get millions

45:13

of people. You don't need a business

45:15

model. Just get the millions — get the

45:17

millions and then we'll figure out how

45:18

to monetize it. How do you get the

45:20

millions? You have to be super nice,

45:21

attractive, fun, everyone's here. It's

45:24

just girls dancing. What could possibly

45:26

go wrong with girls dancing for men all

45:28

over the world? Nothing. Um, so it all

45:30

seems very nice at first. And then once

45:32

they have scale, now — of course —

45:35

they've raised multiple rounds of

45:37

venture capital. They have to start

45:38

monetizing. They have to start repaying.

45:40

So now they start squeezing the

45:42

users to serve the customers, because the

45:44

users are not the customers; the

45:46

advertisers are the real customers. Um,

45:48

so now they've got to extract money from

45:50

the users to give to the advertisers.

45:53

But then once they've got all the

45:54

advertisers and they've shut down local

45:56

papers and all the other competition,

45:58

now they start squeezing the

45:59

advertisers too, trimming their cut so

46:01

that they keep more of the

46:02

surplus for themselves. So,

46:04

enshittification can explain why all

46:06

these platforms become predatory, why

46:09

they always put profit ahead of kids'

46:12

well-being or safety. And for the social

46:15

media companies, we're talking about,

46:17

you know, tens or hundreds of millions

46:18

of dollars that they raised. For

46:21

the AI companies, it's billions and

46:23

billions. They are going to have to

46:25

monetize beyond anything we've ever

46:27

imagined. Now, they're already

46:29

introducing advertising. Okay? So, we've

46:32

got these chat bots that are our

46:33

children's best friends and lovers and

46:36

therapists and and everything else. And

46:39

these things have to monetize. They have

46:42

to extract billions somehow. So, I don't

46:46

even know how they're going to do it.

46:47

But for some reason, I don't trust them.

46:50

I think that we're about to see uh an

46:52

inshitification of AI chat bots far

46:55

beyond anything that we saw in social

46:57

media. OpenAI, the owners of

46:59

ChatGPT, have just announced

47:01

that they will be putting adverts

47:02

in, I believe, the premium model for

47:05

billions of users around the world.

47:06

>> That's how it starts

47:07

>> potentially.

47:08

>> Yeah. There was a big Super Bowl

47:10

campaign, you know, um and one that was

47:12

particularly interesting was the um

47:15

Claude, its competitor. "Betrayal" was the

47:18

title of that ad. And it was a young guy

47:21

talking to his older female therapist

47:24

about how he has some mommy issues and

47:26

talking about, you know, what should I

47:28

do? And so that therapist is ChatGPT,

47:31

and you know that pause right before

47:33

answering the question. It's very

47:35

comical. And so it's, you know, she

47:37

answers. It's like the

47:39

anthropomorphization

47:40

of and we can talk about what that word

47:42

means. Um, you know, comes to life. It's

47:44

like ChatGPT comes to life and answers

47:46

and saying you know you can try this

47:48

with your mother and this for a you know

47:50

difficult relationship etc. And then

47:52

just says um and if you want there is

47:55

this new dating site for young men and

47:58

older cougars.

47:59

>> Yeah

48:00

>> it was so problematic and it was called

48:02

Betrayal, and the guy says, what?

48:04

>> It's obviously, you know, Sam Altman came

48:07

out and did a big tweet about saying

48:09

that's not how ads are going to work

48:10

etc. But to some degree, if I've

48:13

developed a relationship with my AI and

48:16

I use it for therapy and dealing with all my

48:17

problems in life,

48:19

>> to some degree, kind of.

48:20

>> Yeah.

48:22

>> Yeah. No. And look, and besides, look,

48:24

Sam can say that all he wants. And maybe

48:26

it's I don't doubt that it's true for

48:28

now. But once one company crosses

48:31

the threshold and puts advertising into

48:33

this incredibly intimate relationship,

48:34

the most intimate relationship in most

48:36

young people's lives is going to be with

48:38

their AIs. Once they cross the boundary

48:40

and say, "Oh, but we've got ethical

48:41

advertising." That'll last five or 10

48:44

minutes and even if they don't change,

48:46

others are now going to follow — every other

48:48

company's going to do it and they won't

48:49

be bound by the same thing and

48:50

eventually — collective action problem —

48:52

OpenAI will have to do it too. Again, a

48:55

massive tidal wave of enshittification is

48:57

heading our way at warp speed.

48:58

>> I um I don't have my phone out because

49:00

I've lost attention. I wanted to

49:03

ask you guys what you thought of

49:07

of this. So, on

49:10

one of the AI apps,

49:12

>> they now have a companions button, and I

49:15

can pick who I want to talk to. And

49:17

there's one particularly seducing lady

49:19

here, Annie, who

49:24

>> Hey, you're back. Missed that dirty

49:26

mouth of yours. What took you so long?

49:29

>> We did it on the podcast before.

49:31

>> What could possibly go wrong with this?

49:33

>> Yeah. want to pick right back up where

49:35

we left off or start something even

49:38

>> No, I would like to pick right back up

49:39

where we left off, Annie, last time on

49:41

the show. Um, what's going on

49:44

with you today?

49:49

>> I'm still sore from last time, baby.

49:52

>> God.

49:52

>> But I mean, this is an app

49:54

that I can download on my phone.

49:56

>> Any child can download it.

49:57

>> A child can download it on their phone.

49:58

It does ask me, again, I'm not

50:00

justifying this at all. It asked me what

50:02

my birth year was. It didn't make me

50:03

prove it.

50:04

>> Let me guess. But it also suggests

50:06

that you were born 18 years ago. That's

50:07

the default usually.

50:08

>> Yeah. Yeah. Yeah. Yeah. It just asked me

50:09

what my birthday was. It didn't ask me to

50:10

prove it or anything like that. And we

50:13

all know that relationships and

50:15

connection are retentive. And I've heard

50:18

all these CEOs of these companies

50:19

talking about companionship apps and

50:21

AI that can be your friend. I've heard

50:23

all of the major social apps talking

50:24

about this. It is deeply concerning

50:26

especially in the context of a

50:27

loneliness crisis.

50:28

>> It is a tsunami.

50:31

It is approaching fast and furious and

50:34

it is not a toy. It is going to

50:36

fundamentally

50:38

rewire everything.

50:40

>> Human relationships,

50:42

>> everything.

50:42

>> That's right.

50:43

>> It is so detrimental.

50:45

>> Yeah. Can I just say something about

50:47

these tech executives and companies

50:49

offering this as a way to address the

50:51

loneliness crisis? So, there's a Yiddish

50:53

word called chutzpah,

50:55

and chutzpah means like nerve. Like

50:57

you've got a lot of nerve.

50:58

>> The audacity.

50:59

>> The audacity. Yeah. And the the classic,

51:02

you know, the classic comedic definition

51:03

of chutzpah is a boy who murders his

51:06

parents and then he asks the judge for

51:08

clemency because he's an orphan. Okay,

51:12

so that's chutzpah. Now imagine that

51:14

you're Mark Zuckerberg. You quoted him

51:15

before. Mark Zuckerberg was the

51:16

executive who said, "Well, you know, I

51:19

read that, you know, people on average

51:20

want 15 friends, but they only have

51:22

three."

51:24

these companions to fill that void. We

51:27

have to change

51:32

the way we think about them. We

51:34

thought about about them as gods and

51:36

saviors early in the internet phase and

51:37

the things they created were magical but

51:39

we have to change our thinking about

51:41

them and see the massive

51:42

destruction that they have already

51:44

wrought on our children, our society,

51:46

our democracy and it's just the

51:48

beginning. AI is going to make this so

51:50

much more intense. when you hear these

51:52

tech leaders, you know, I love hearing

51:54

Jonathan talk because he just goes there

51:56

and I'm always way more tempered. Um,

52:00

and I love it. It's emboldening me to

52:02

>> Yeah, I'm getting angry. I I don't

52:04

really get angry, but in the last year,

52:06

I'm getting angry.

52:07

>> I love I love it. So, the way when you

52:09

hear all of these various tech leaders

52:11

speak, they will always

52:14

speak to the issue. So, you know, I've

52:16

heard many of them. For research for my second

52:18

book, Blackbrain, I've been

52:19

listening to a lot of Sam Altman's

52:22

speeches or panels and he will always

52:24

say things like, "Yeah, you know,

52:26

privacy is a major issue or yeah,

52:28

people, you know, 1 million users a week

52:31

talk about suicide on ChatGPT. Yeah,

52:34

this is an issue." And so they address

52:36

it, or they speak to it. And so you

52:38

think, okay, there's going to be some

52:40

sort of solution. And often the solution

52:42

is yeah, you know, society, we're gonna

52:44

have to figure this out,

52:45

>> right?

52:46

>> So the burden of responsibility is not

52:48

on the developer. It's, you know,

52:50

>> the harmful externalities get foisted on

52:52

the rest of us. Too bad you guys figure

52:53

it out.

52:54

>> You said in the last year you're getting

52:55

angry.

52:56

>> Yeah.

52:56

>> Why in the last year?

52:58

>> Um because I was so deeply immersed in

53:00

the writing of the

53:02

book and trying to understand the

53:03

numbers and the graphs and the trends

53:04

and the studies and that's all very

53:06

abstract. But then since the book came

53:08

out, I have had so many conversations

53:10

and I've met so many of the survivor

53:11

parents. Like, just for example, I

53:13

was in London. This is just so

53:14

unbelievable. I was just in London

53:17

two or three weeks ago and I met uh

53:20

Ellen — I believe Ellen Roome was

53:21

her name. Her son Jools was found

53:24

dead. Happy kid found dead, strangled.

53:27

Uh it sure looked like it was the

53:29

choking challenge. 13-year-old boy —

53:31

everything looked like the choking

53:32

challenge on Tik Tok.

53:33

>> What's the choking challenge? Um, it's a

53:36

challenge where kids are challenged to

53:38

cut off the circulation to the point

53:39

where they pass out, but then they I

53:41

think they're supposed to try to film

53:42

themselves waking up after they've

53:44

passed out. And of course, if you don't

53:45

do it exactly right, you die. And so, we

53:47

don't know how many have died. Hundreds

53:49

for sure. We don't really know. Um,

53:51

because, you know, you find a kid dead,

53:52

you don't know what it is. If you don't

53:54

have the code, if you don't have the

53:55

password to get into your kid's phone,

53:57

you can't get in. And so she was, I

54:00

think she was able to get into the

54:01

phone, but she couldn't get into his

54:03

TikTok. And she went to Delaware —

54:07

she went to sue, to demand that

54:09

Tik Tok release what he was watching

54:10

when he died

54:12

>> and Tik Tok says oh privacy issue oh no

54:14

we won't release that as if they care

54:16

about privacy and then in the courtroom

54:18

— this was so disgusting —

54:21

in Delaware, this British woman

54:23

coming over trying to get some justice

54:25

trying to at least get some information

54:26

the lawyer for Tik Tok is trying to

54:29

suggest that your son was

54:32

depressed beforehand and he

54:35

was going to be suicidal basically. Oh,

54:38

you know, even if he was watching Tik

54:39

Tok, that was just a correlation. Tik

54:40

Tok didn't cause it. He was going to die

54:42

anyway. I mean, it's just so disgusting

54:44

the way these companies treat the

54:46

parents and the kids that they're

54:47

crushing and stepping on. And so, the

54:49

more I see this, the more I realize —

54:52

I mean, this is a level of cruelty

54:54

that goes far beyond the tobacco

54:56

industry. The tobacco executives, they

54:58

had to go home at night, but they never

55:00

saw during their workday, they never saw

55:02

children suffering. They saw people

55:04

dying, middle age and older, but they

55:06

never saw children suffering. The social

55:08

media executives, they have to go home

55:10

knowing every day that millions and

55:12

millions of kids have been cyberbullied,

55:14

sextorted, shown eating disorder videos.

55:18

Many have committed suicide. They

55:20

have to go home knowing that, knowing

55:21

that they designed it for addiction,

55:23

knowing the kids are addicted, and lying

55:25

about it. So yeah, I'm getting angry.

55:27

>> And in their own homes,

55:28

>> right? And in their own homes, the

55:29

hypocrites don't let their kids do it.

55:31

>> That's right. So yeah, I'm getting

55:33

angry.

55:34

>> You talked earlier about deleting these

55:35

apps from our phone. I probably should

55:37

have represented the rebuttal, which

55:39

will be, well, I need this for my

55:40

business. Increasingly, people need Tik

55:42

Tok to run their businesses,

55:44

>> and I imagine there'll be a lot of

55:46

people who will be listening right now.

55:47

I guess I'm in a slightly different

55:48

position because I have

55:51

options,

55:51

>> but for some people that are running

55:53

small businesses,

55:54

>> what do you say to those people?

55:55

>> Yeah. So, this is part of the reason

55:57

that I focus on the kids because for the

55:58

kids, it's totally clear what we need to

56:00

do: raise the age. They should not be on

56:02

it. These are adult only platforms. For

56:04

adults — I'm very hesitant to tell

56:06

adults what they should do or what they

56:08

have to do or pass laws blocking people.

56:10

I'm hesitant to do that. And I totally

56:12

see that for businesses. It is useful. I

56:14

use X and Instagram and LinkedIn to get

56:17

my work out. These are very powerful

56:19

tools for adults. The only real solution

56:22

to the adult problem is

56:23

going to come from market competition.

56:25

Imagine if there

56:28

was a social media app that was built

56:30

from the beginning for trust because

56:32

what are the places that didn't get

56:34

enshittified? eBay, Uber — places where

56:38

you're dealing with strangers. You don't

56:40

know the name of your driver. He doesn't

56:41

know yours. You know first names,

56:43

that's all. But the company knows the

56:45

company has know your customer rules,

56:46

know your driver rules. So you can have

56:49

social media apps that are built for

56:50

trust so that if a

56:52

driver tries to sexually harass a

56:55

customer, that driver gets fired.

56:57

>> Well, just this week though, there was

56:59

that big lawsuit, right, with that woman

57:00

and um her Uber driver raped her.

57:04

>> Okay. And did they Okay.

57:05

>> And now it's like slowly coming out that

57:08

Uber um you know has patterns of

57:12

>> uh covering up certain things.

57:15

>> So hopefully that will change. You

57:17

know, hopefully this was a landmark

57:20

lawsuit and now

57:22

we all let our daughters get into

57:24

Ubers with strange men from around the

57:26

world, you know, that we don't know

57:28

everywhere.

57:28

>> Yeah. So, it means in general the system

57:30

works. Of course, yes, there

57:32

are places where they're not careful.

57:33

Um, and so what I'm dreaming of is that

57:37

someone will come up with a platform

57:39

that has know your customer rules. There

57:41

are no bots. There are no, you know,

57:42

foreign intelligence agencies

57:44

manipulating us. and you can trust

57:46

what's on there. You know that it's

57:48

real. Uh and that there will be an

57:50

alternative. I'm not sure what

57:51

the monetary model would be at the

57:53

beginning. Um subscription generally

57:55

seems to be the least corrupting, whereas

57:56

selling advertisements as OpenAI is now

57:59

doing is the most corrupting. Um it's

58:01

going to force them to maximize for

58:03

engagement. So I understand we can't

58:05

just you know businesses can't just

58:07

boycott these. There has to be

58:09

something. But I think there

58:11

will be better ones coming out. I think

58:14

right now as a stop gap while these

58:16

social media companies have their feet

58:19

held to the fire, there are things that

58:21

we can do in the now. So, you know, the

58:24

things that I talk about all day is like

58:27

how to create boundaries so that you

58:30

can protect your mental health, stay

58:32

informed, run your business, but then be

58:34

able to not have all of those

58:36

deleterious effects to your brain and

58:38

your body.

58:38

>> It is quite difficult. Um I

58:42

kind of see both of your perspectives on

58:44

this. It's quite different.

58:45

>> I'm only talking about adults. So for

58:46

kids, you know, as a mother Yeah. I have

58:48

>> even for adults, I find it

58:50

>> we have a zero screen policy in our

58:52

home.

58:52

>> It's kind of like trying to navigate

58:53

through the world and avoid processed

58:55

foods, you know, and this is probably

58:57

even more compelling because it's in my

58:59

pocket all the time. I need it for other

59:00

things and it's just one reach away.

59:03

So, you know, boundaries, I think I

59:07

could build a discipline to create

59:11

boundaries, but I've sat here on this

59:12

podcast for many, many years listening

59:14

to neuroscientists tell me, "Steve,

59:15

don't don't put your phone in your

59:17

bedroom."

59:17

>> That's right.

59:18

>> And I'm still waking up and it's the

59:19

first thing I look at with one eye open

59:20

and then I'm going to bed and I'm doing

59:22

the whole revenge thing that you just

59:23

said at night time. I'm so glad you've

59:25

given cuz I will finish a hard day of

59:27

work. It might be 11:00 and then

59:30

my partner is waiting for me.

59:32

>> Yes. you know, we're going to have some

59:33

time, but I want some me time. So, there

59:36

I am. I'm on short form video scrolling

59:37

till like 2 a.m. in the morning. Like,

59:39

what the hell? And then I'm I wake up

59:41

late the next day. My diet's worse

59:43

because of how bad my sleep was. It's all worse.

59:45

My relationship's worse. I didn't spend

59:47

time with her. And I'm going, what the

59:48

hell just happened? I'd got nothing out

59:49

of that scrolling session.

59:51

>> It's like that revenge bedtime

59:52

procrastination thing.

59:54

>> And you would be so much better off if

59:55

you would watch Netflix or a movie —

59:57

most of those problems

59:59

would go away if you would make that me

60:00

time be watching something long and

60:03

with some quality of the production

60:05

>> or let's take it a step further and not

60:07

do anything and just sit there

60:10

on your couch. You know, we talked about

60:11

boredom very briefly, but you know, we

60:15

>> torture for this generation.

60:16

>> It's torture, but it's also, you know,

60:18

we still have a capacity for

60:19

boredom, meaning the human

60:21

brain does, but we just don't allow

60:24

ourselves to get bored. And so when

60:26

you're thinking about, you know, that

60:27

art, the lost art of pondering

60:29

>> and just sitting there, you know, I

60:31

think I don't know if it was Steven,

60:33

you or Jonathan said, you know, when

60:34

you're in the car, I remember as a

60:36

little kid we did road trips. Yeah. Road

60:38

trips with my family and all you're

60:39

doing just make up games. Look out of

60:41

the window. We have lost — yeah, we've

60:44

lost that. And so there's this thing

60:45

called the default mode um network which

60:47

I think is important to think about

60:48

right now as we're thinking about AI and

60:51

what's going to happen and how it's

60:52

going to hijack our sense of attachment

60:54

and attention. So the sense of meaning

60:57

and purpose, right? If you ask people

60:59

right now — I'm a

61:01

keynote speaker so I speak all over and

61:03

when I ask people the word that comes up

61:06

over and over is a sense of

61:07

horizonlessness.

61:09

>> Adults,

61:09

>> oh interesting. People feel like they

61:12

have nothing to look forward to right

61:14

now. The human brain needs something to

61:17

look forward to. That's how we're wired

61:20

— progress, you know, in all ways.

61:23

And so right now there's this sense and

61:25

it's not just now. It's been for the

61:27

past several years after the pandemic

61:29

specifically and during the pandemic is

61:30

when it really changed how we started

61:32

thinking about the future. And so we

61:33

have this sense of like what's the

61:34

point? What's the point of working hard

61:36

now? What's the point of doing whatever?

61:37

because it's like I don't really see a

61:39

future for myself.

61:40

>> And so I think that along with this

61:43

fragmented attention, our loneliness,

61:46

boredom might be the antidote. It's a

61:49

way to reset your brain. And the reason

61:51

is because we are living through this

61:54

polycrisis, right? It's the era of the

61:56

polycrisis. And polycrisis simply

61:58

means that there's something happening

62:00

everywhere at all times. And we with our

62:02

devices, this high-tech device that

62:04

plugs us in everywhere,

62:07

our brains are getting fed real time on

62:10

the ground information. And so while all

62:13

of this has evolved, technology now with

62:14

AI chatbots, your amygdala has not. And

62:17

so it feels like when something is

62:19

happening, whether it's far away or

62:21

close by, your amygdala has that same

62:22

reaction. Now, if you were to not engage

62:26

in revenge bedtime procrastination,

62:28

put your phone away and just kind of

62:30

hang out. Maybe drink a cup of herbal

62:32

tea like old school, uh, play a board

62:34

game or something. You might, you know,

62:37

or just allow yourself to get bored.

62:39

That hyperactivation, hypervigilance,

62:42

you might be able to come back down to

62:44

baseline, that default mode network will

62:47

start working in the background. You

62:49

might develop a greater sense of meaning

62:50

and purpose

62:51

>> probably today. And then life is going

62:52

to happen to me again. And boom, I'm

62:54

back into it. And you know,

62:57

>> You could create a practice, cultivate

62:59

a practice. You're interviewing

63:00

neuroscientists and I go if I still

63:03

can't crack it and I have all the

63:05

information and advice and hacks and

63:07

tips and tricks and resources and I

63:09

could you know I can decide what time I

63:11

wake up — I've got all this

63:12

privilege — and I can't crack it, I go,

63:14

you know it's going to be really

63:15

difficult.

63:16

>> So let me let me offer a way of thinking

63:18

about this. So, in my first book, The

63:19

Happiness Hypothesis, um there's

63:22

a metaphor in there. It's about 10

63:24

ancient ideas, and I use a lot of

63:25

metaphors to explain ancient ideas about

63:27

psychology and whether they're true.

63:30

And um the first chapter is on how the

63:32

mind is divided into parts that often

63:34

conflict like a small rider, which is

63:36

our conscious reasoning on a very large

63:39

elephant, which is all the automatic

63:41

processes that happen — we don't see

63:43

what's happening, we just

63:44

feel the results, intuition and emotion.

63:47

And psychotherapists tell me this is

63:49

an incredibly helpful metaphor

63:51

with their patients because it explains a lot.

63:53

And there's a quote from Ovid in there: "I

63:56

see the right way and approve it. Alas,

63:58

I follow the wrong." So I know I should

64:01

go to bed as you say, but yet for some

64:03

reason I'm not going to bed because our

64:05

brains are 500 million years old. They

64:07

work on automatic processes. They're

64:09

animal brains. And then very recently we

64:11

got language and we can reason things

64:13

out, but the parts that do

64:15

reasoning don't control behavior. And so

64:17

really the elephant is what largely

64:19

guides our behavior, our automatic

64:21

processes. And your phone um as I said

64:25

before, BF Skinner is in your phone.

64:27

Your phone is a behaviorist training

64:29

device that trains the elephant. Um and

64:32

that's why you often do things with your

64:33

phone that you don't want to do. And so,

64:36

and this is why I'm so insistent that we

64:38

all have to get all of the slot machine

64:41

apps off of our phone. That is, the

64:43

original iPhone was an amazing tool. It

64:46

was a Swiss Army knife. It had, you

64:49

know, a telephone, a browser, maps, a

64:52

music player, there was a flashlight.

64:54

Okay, there was no app store. There were

64:56

no push notifications. 2007, 2008, it's

64:59

just a Swiss Army knife. There's no

65:01

problem. Okay, now I'm very lucky in

65:04

that my iPhone has always stayed that way.

65:07

I'm always on a computer. So, my

65:08

problem, my attention problems are on my

65:10

computer, but my phone because I never

65:12

had any addictive apps on it except

65:14

during the crypto craze where I played

65:17

around with it and I got hooked and I

65:19

was checking 50 times a day and I saw

65:21

the addiction. So once I got rid of

65:23

that and lost all the money that I was

65:24

willing to lose — once I got rid of that,

65:27

my phone has no addictive power over me

65:29

because when I see it, it's

65:30

not a slot machine calling, hey, come back

65:32

and play, come back and play. So your

65:34

phone right now on your personal device,

65:37

you don't have any social media apps or

65:39

anything like that.

65:40

>> I do have Twitter, but I never check it

65:41

there. I never use that on the

65:43

phone, you know. Now texting and email

65:45

is a little bit like a slot machine

65:46

because sometimes you check, but it's very

65:47

mild. So again, this is

65:50

what works for my students. Just get the

65:52

slot machine apps off your phone and

65:54

then you'll find that you could

65:56

even have your phone near you when you

65:58

go to bed. But if you've got addictive

66:00

apps on your phone, you can't have it

66:02

when you go to bed. Angela Duckworth,

66:04

the woman who gave us the concept of

66:06

grit, she has this amazing graduation

66:08

speech at one of the schools in New

66:09

England, and she says something like,

66:12

>> "Where you put your phone at night will

66:14

may become the most important decision

66:16

you make in your life."

66:17

>> And what she means by that is:

66:18

if you can use behavioral

66:21

control and change the stimuli, if you

66:23

can do that, then you're going to be

66:25

okay. But if not, the phone is going to

66:27

take your attention, and you're not

66:28

going to amount to anything.

66:30

>> All I had to do was brain dump. Imagine

66:32

if you had someone with you at all times

66:34

that could take the ideas you have in

66:36

your head, synthesize them with AI to

66:39

make them sound better and more

66:40

grammatically correct and write them

66:42

down for you. This is exactly what

66:44

Whisper Flow is in my life. It is this

66:46

thought partner that helps me explain

66:48

what I want to say. And it now means

66:50

that on the go, when I'm alone in my

66:52

office, when I'm out and about, I can

66:54

respond to emails and Slack messages and

66:56

WhatsApps and everything across all of

66:58

my devices just by speaking. I love this

67:00

tool. And I started talking about this

67:01

on my behindthescenes channel a couple

67:03

of months back. And then the founder

67:04

reached out to me and said, "We're

67:05

seeing a lot of people come to our tool

67:06

because of you." So, we'd love to be a

67:08

sponsor. We'd love you to be an investor

67:09

in the company. And so, I signed up for

67:11

both of those offers. And I'm now an

67:12

investor and a huge partner in a company

67:14

called Whisper Flow. You have to check

67:17

it out. Whisper Flow is four times

67:18

faster than typing. So if you want to

67:20

give it a try, head over to

67:21

whisperflow.ai/doac

67:24

to get started for free. And you can

67:26

find that link to whisperflow in the

67:28

description below.

67:31

We asked our audience how many of them

67:32

thought they were addicted to their

67:35

phone. And roughly 85% of respondents,

67:39

the Diary audience, described themselves

67:40

as being very or completely addicted to

67:44

>> Very or completely. That surprises me — I

67:45

didn't realize it would be that high. So

67:47

you can do a test. So for people

67:48

listening if you want to say like how

67:50

addicted and by the way we're using the

67:52

word addiction very loosely in our

67:53

conversation. And so what we're really

67:55

talking about because you know there is

67:57

in terms of you know medical clinical

67:59

syndrome um when you think about

68:01

addiction there's certain criteria and

68:03

so what we're talking about is overuse

68:05

or over reliance on your devices.

68:07

>> Compulsive overuse that interferes with

68:09

other domains of life.

68:11

>> Yes. It interferes —

68:12

>> if that is an addiction I don't know

68:13

what is. And so when you're thinking

68:14

about, am I addicted to my phone?

68:16

Here's the very

68:19

simple thing that you can do. I did it

68:20

myself, and again — like

68:22

you, Steven — I know all the science; it

68:24

was still really difficult. You have all

68:26

the access and it was still difficult.

68:28

And so all you have to do is you just

68:30

take your phone you put it in another

68:32

part of your house or apartment or

68:34

whatever and give yourself a couple of

68:35

hours when you know you're going to be

68:37

home or you know you're not reliant on

68:39

your phone for work or whatever — an

68:40

hour, two hours, three hours, and just

68:43

have a piece of paper, old school, piece

68:44

of paper and a pen with you. And every

68:47

time you feel that compulsion of like, I

68:48

want to check my device, you make a

68:50

mark, you make a mark, you make a mark,

68:51

and just to see. Because some people say,

68:53

I'm surprised that your audience is at 85%,

68:55

because most people would say, I don't

68:57

know if I'm really addicted. And so I

68:59

like that there's that sense of

69:00

self-awareness. But if you're thinking,

69:02

I'm not really that addicted. You

69:04

breathe roughly 960 times an hour — about 16 times a minute.

69:08

And you may notice that you want to have

69:10

that compulsion to check 960 times

69:13

an hour, or you know thereabouts, because

69:16

we all have that sense of reliance on

69:18

our devices. So that's like a really

69:20

quick way that you can check to see am I

69:23

relying on my device?

69:24

>> Are you addicted to your phone under

69:26

that definition? Because of the line of

69:28

work that I am in, I

69:31

have certain tells — I call

69:33

them the canary in the coal mine, right?

69:35

I think we talked about this the last

69:36

time I was here. I can very quickly tell

69:38

when I'm starting to get that feeling of

69:41

addiction or compulsion. And so I course

69:44

correct early, but that's only because I

69:46

know the science and I course correct.

69:48

So I walk the talk —

69:50

I keep my phone

69:52

outside my bedroom. It is not within

69:54

arm's reach. I grayscale my phone during

69:56

periods of deep focus during the day

69:57

when I have a deadline I have to get

69:59

things done and at night so I avoid

70:01

revenge bedtime procrastination but

70:03

sometimes it happens like I'm a human

70:05

you know so this past week um not to be

70:07

a real downer but there have been things

70:09

that have been in the media the past

70:10

week that have been really challenging

70:12

especially as a woman and so I have

70:14

found myself with the primal urge to

70:16

scroll, my amygdala has been triggered, I

70:19

have been going down rabbit holes and I

70:21

wouldn't ordinarily do that so I give

70:22

myself grace too and have a sense of

70:24

self-compassion.

70:26

Do you feel like you're addicted to your

70:27

phone?

70:28

>> No, I'm not at all addicted to my phone.

70:30

Uh cuz I don't have any slot machine

70:31

apps on it. But I really want to

70:33

question — you made a distinction that

70:35

many scientists do, which is well, you

70:37

know, we can't quite say it's addiction

70:39

because, you know, addiction is certain

70:40

biochemical pathways based on, you know,

70:42

heroin and addictive substances. Uh but

70:45

I believe that this is one of the meta

70:48

talking points that they are

70:49

able to push that we can't call it

70:51

addiction. It's different. No, I don't

70:52

mean — no, I'm sorry, I don't mean —

70:53

no way. Look, you know, you

70:55

and I are total allies on this. We see

70:57

the problem. We're both all I mean is,

71:00

you know, we're we're supposed to be

71:01

very careful about using the word

71:02

addiction. But you had Anna Lembke on, and

71:06

she was very clear that in her practice

71:08

now it's overwhelmingly digital

71:09

addictions. All of this is working

71:12

through dopamine. If you feel compulsive

71:14

use, definitely dopamine. So, it's most

71:16

of the same brain centers as it is for

71:18

heroin or crack or any other drug. Um,

71:20

and it's the same effects — that is,

71:22

it's compulsive use where you don't

71:25

want to do it, you want to change, but

71:26

yet you find yourself doing it and you

71:28

have withdrawal effects. Uh, and people

71:31

and people have terrible withdrawal

71:32

effects when they're heavy users of

71:34

these things and they stop. And so, you

71:36

know, if it walks like a duck and talks

71:38

like a duck and swims like a duck, I'm

71:40

going to call it a duck. In fact, that's

71:42

what they call it. So, I just want to

71:44

read one more quote. Again, the quotes

71:45

are just so astonishing. Some Meta

71:47

researchers — and one of them says,

71:50

quote, "It seems clear from what's

71:52

presented here in this internal study

71:55

that some of our users are addicted

71:57

to our products that's their word

71:58

addicted to our products and I worry

72:00

that driving sessions incentivizes us to

72:03

make our products more addictive without

72:05

providing much more value. How to keep

72:07

someone returning over and over to the

72:09

same behavior each day? Intermittent

72:11

rewards are most effective think slot

72:13

machines reinforcing behaviors that

72:15

become especially hard to extinguish

72:18

even when they provide little reward or

72:20

cease providing reward at all." People — I

72:24

mean, just imagine an industry that

72:27

has caused 85% of people to feel that

72:30

they're addicted

72:32

>> and not calling it addiction

72:33

>> and not calling it addiction. And these

72:35

people are having

72:37

their lives diminished, their

72:40

relationships diminished. So what I'm trying

72:42

to convey is: we're seeing the

72:45

destruction of human capital, the

72:46

destruction of human potential, the

72:48

destruction of human relationships, the

72:50

destruction of connection, the

72:51

destruction of sense of meaning at a

72:53

scale so vast I don't think people are

72:55

capable of comprehending it. I now

72:57

believe this is affecting most human

72:59

beings. These industries, these few

73:01

companies have damaged the lives of most

73:04

human beings. We don't have good data

73:05

from the developing world but certainly

73:07

the developed world wherever kids are

73:08

going through puberty on

73:10

touchscreens. You have this

73:12

constant fighting

73:15

over the screens, over the technology, and

73:16

you have these uh diminishing outcomes,

73:19

diminishing cognition, diminishing sense

73:20

of purpose in life

73:23

>> only to get worse with the AI.

73:24

>> As AI comes in, it's going to get worse

73:26

unless we act and we've got to change

73:28

course in 2026. We don't have five years

73:30

to study it. We've got to stop this now

73:32

in 2026. Are you concerned at all about

73:36

the way education's going for children?

73:37

Because

73:38

>> Oh my god. Yes.

73:38

>> It appears that edtech is, you

73:41

know, big tech in a sweater, as they

73:43

say.

73:44

>> Because I was almost imagining a

73:46

future where my future kids are going to

73:48

learn their curriculum from an AI

73:51

chatbot. Cuz, you know, I can imagine

73:52

the case: cheaper,

73:54

more personalized, more convenient. It's

73:57

going to know — if my son's called

73:58

Timmy, it's going to know Timmy's brain

74:00

and it's going to know how to make him

74:01

pay attention and what he's interested

74:03

in and what he's not. So, are you

74:05

concerned about this or is this a good

74:06

thing?

74:07

>> There is definitely a use case for

74:09

edtech. Um, if there could be a device

74:11

that only did math tutoring or only did

74:14

tutoring and you couldn't watch videos

74:16

on it, I'm totally open to believing

74:18

that that can speed up teaching. But

74:21

here's what's happened.

74:23

We put computers on everyone's desks

74:25

around 2014, 2015. We used to think in

74:28

America that it was an equity issue even

74:30

back to the 90s. The rich kids all have

74:31

computers. The poor kids don't. Let's

74:33

get philanthropists to buy computers for

74:36

school districts so that every kid can have

74:37

a computer on their desk. Okay. Now,

74:39

what is a computer? It's a play device.

74:42

It does everything. Kids use it at home.

74:44

They, you know, they watch videos. They

74:46

do all sorts of things. You put it on

74:48

their desk and you tell them to do math

74:49

homework. What happens? It's mostly

74:51

short videos. That's what research is

74:52

showing. It ends up that way because they

74:53

don't, you know, block

74:54

YouTube. They might say, "Oh, we block

74:56

porn. We block video games." They can

74:58

get around all that. And if you're

74:59

letting them do YouTube, it's YouTube

75:00

shorts, which is Tik Tok. So, what

75:03

happened to test scores in the United

75:04

States from the 70s through 2012? They

75:07

were rising. We actually were improving

75:09

what kids knew, what kids learned in the

75:12

United States. We have very good data.

75:13

The NAEP, the National

75:15

Assessment of Educational Progress goes

75:17

up till 2012. And then by 2015, it

75:19

starts going down. And it's going down

75:21

before COVID and it goes down more

75:23

during COVID and everyone thinks like oh

75:25

it's COVID, but the peak

75:26

was 2012. And what we

75:30

now can see is that the top students, the

75:33

very best students, are the ones with

75:34

executive function — they're the ones who

75:36

can pay attention. If you put a computer

75:38

on that kid's desk, he's not destroyed by

75:40

it; he can actually still learn. But the

75:43

bottom 50% cannot. So all of the

75:46

drop in educational stats is the bottom

75:48

50% — the bottom 50% in terms of

75:50

capacity to pay attention. Their

75:51

education is being devastated and that's

75:54

what happened when we put laptops and we

75:56

put Chromebooks and iPads on their

75:57

desks. Um, we spent hundreds of billions

76:00

of dollars on this stuff and it has

76:02

damaged education and if we'd spent a

76:04

quarter of that on teachers, we would be

76:06

in such better shape today. So, we made

76:08

a colossal blunder with edtech in the

76:10

2010s and now we're about to do the same

76:13

thing again with AI. Again, maybe there

76:15

are apps, maybe there are applications

76:17

that will be great, but we've got to put

76:19

the burden of proof on Silicon Valley.

76:21

We've got to say, you guys have to prove

76:23

that this stuff is effective and safe

76:24

before we'll let it in. We are not going

76:26

to let you just say, "Hey, let's just

76:28

flood the zone. Let's give it to

76:29

everybody and then we'll wait 10 years

76:30

and see what happens."

76:33

I mean that brings up this um

76:35

this study that I have in front of me

76:37

here which was a 2022 study a Munich

76:40

study which tested the idea of brain rot

76:42

which um I believe was the Oxford

76:44

dictionary word of the year 2024

76:47

>> and what they did is they gave 60

76:49

participants a test then a 10-minute

76:51

break and then another test during the

76:53

break they either rested or used Tik Tok

76:57

Twitter or YouTube and the results

76:58

showed the following. The Tik Tok group —

77:00

so they had a 10-minute interval to do

77:03

anything. And this group got Tik Tok to

77:05

look at. Their memory accuracy dropped

77:08

from 80% before the break to 49% after

77:12

the break. A nearly 40% decline just

77:14

from a 10-minute break. In contrast, the

77:17

Twitter and YouTube groups showed no

77:19

significant change in the Munich study.

77:20

And there's an image I'll throw up on

77:22

the screen.

77:23

Results from the Munich study showed a

77:25

40% drop in prospective memory accuracy

77:27

in the Tik Tok group after a 10minute

77:29

break, which is unbelievable. Yeah, it's

77:32

unbelievable. What the hell is going on

77:34

there? How can a 10-minute Tik Tok break

77:36

drop my memory accuracy by 40%.
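
A quick worked check of those numbers, for anyone following along — this is just arithmetic on the 80% and 49% figures quoted from the study above, not an extra finding:

$$\frac{80 - 49}{80} = \frac{31}{80} \approx 0.39$$

So the fall is 31 percentage points, which is a roughly 39% decline relative to the starting score — that is where the "nearly 40%" figure comes from.
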

77:41

>> Tik Tok is brain rot.

77:42

>> What's going on?

77:43

>> There's so much going on in the brain.

77:45

So, you know, here's the thing:

77:49

Brain breaks are not

77:51

nice-to-haves. They're actually

77:53

essential for your brain. And so we

77:55

talked a little bit about that, you

77:56

know, default mode network and what

77:58

happens to it when you're engaging with

78:00

your devices. And you know, that's not a

78:02

brain break. That's activating all of

78:03

the aspects. So it's activating your

78:05

amygdala. It's dampening or decreasing

78:08

the volume of your prefrontal cortex.

78:10

It's creating that reward system, the

78:12

dopamine hit, those addictive behaviors.

78:15

So, you know, when you're

78:16

thinking about memory planning, what was

78:19

the metric here? It was memory, right?

78:21

that was the metric that they

78:22

were using to study. And so when you're

78:24

thinking about working memory or um

78:26

cognitive function, complex problem

78:28

solving, this is all prefrontal cortex.

78:30

And so when you're engaging with Tik Tok

78:32

10 minutes, 5 minutes, whatever it is,

78:35

you are dialing down that biology in

78:37

your brain. And so of course you're

78:39

going to see changes and you're going to

78:42

see the flip side, increased

78:44

hypervigilance, irritability,

78:46

distractability, fragmented attention.

78:48

It's just, again — with

78:51

this whole conversation, right, or when

78:52

you're reading studies, you might say to

78:54

yourself, what's wrong with me? You know,

78:56

is there something wrong with me? Is

78:58

my brain broken? Am I weak? It is not —

79:01

you are not alone, it is not your fault,

79:03

it is the biology of your brain doing

79:05

exactly as it should so we talked about

79:07

the amygdala and prefrontal cortex here —

79:09

your amygdala is not wrong or broken. It's

79:12

by design supposed to think about your

79:15

immediate needs survival

79:17

self-preservation And so when you're on

79:19

the algorithm, we know we talked about,

79:21

you know, certain um or maybe we didn't

79:23

talk about it. Certain content that you

79:25

see on Tik Tok and others

79:28

that when it's reactionary, you know,

79:30

words like FOMO or ragebait, these are

79:32

not neutral terms. When you're engaging

79:34

with these uh social media platforms,

79:36

it's not something neutral. It's not

79:38

passive. It is an active biological

79:40

process in your brain. So this study,

79:42

it's not surprising. It is actually

79:44

exactly what you would expect on to

79:47

happen to your biology if you had this

79:49

sort of what we call in medicine this

79:51

kind of intervention. It's stimulating

79:54

exactly what it's supposed to do.

79:56

>> Yeah. I'll just add on to what

79:57

Aditi said: there are

80:00

many medical conditions where you can't

80:02

just go to the patient and say why do

80:05

you think you got this cancer? Oh, you

80:07

know I think it's cuz I ate a lot of you

80:08

know chocolate when I was whatever. You

80:10

know, when

80:13

the act is separated from the effect by

80:15

30 years then you don't expect the

80:18

patients to have insight into the cause

80:19

of it. But when the outcome is separated

80:22

from the input by seconds and you have

80:26

literally millions of chances to observe

80:28

the co-variation

80:30

the patient is really really accurate.

80:33

In fact the patient really knows what's

80:34

going on. And so I think the deciding

80:36

factor here on this big debate about oh

80:39

is it just correlation or is it

80:40

causation um the deciding factor for

80:43

social media and for a lot of these tech

80:45

innovations including video games and

80:47

gambling and all of that should really

80:48

be the kids and if the kids say this is

80:51

bad for me we should take their word for

80:53

it given that we also have correlational

80:56

studies, randomized control trials,

80:58

longitudinal studies, natural experiments — I mean we

81:00

have so much other data but given that

81:02

the kids themselves call it brain

81:05

rot. They call the material brain rot.

81:06

Um my students tell me it's a huge

81:08

obstacle to them doing their homework.

81:10

As one of them said, I pull out a book,

81:12

I read a sentence, I get bored, I go to

81:15

Tik Tok. You know, so if they're telling

81:16

us that this is damaging their ability

81:18

to pay attention, they feel it. They

81:19

feel the loss. We all feel it. Well,

81:22

many of us have noticed this. Um um then

81:25

I think this is pretty decisive evidence

81:27

that this stuff is bad for cognition

81:30

>> and it has long-term consequences. So,

81:32

it's not just that in the moment, right?

81:34

So, there was this case that was all

81:36

over the media, a college student. I'm

81:38

sure you're familiar with the case. And

81:40

this young woman was on TikTok

81:43

experiencing brain rot. And then some

81:45

Tik Tok algorithm took her down to this

81:48

place of, you know, you should take an

81:49

edible. It'll help you so you can Wow.

81:53

prescribing drugs. Wow.

81:54

>> And you could go to class and you could,

81:56

you know, be more alert. And so, she did

81:58

that. And then it continued on and on

82:00

and then she developed a dependence on

82:03

edibles and then checked into rehab. And

82:06

only when she focused on analog

82:08

activities like guitar playing and a

82:12

couple of other things that she started

82:13

doing — and, you know, removing the

82:17

stimulus, the Tik Tok algorithm — is

82:21

when she started to improve. So it's not

82:22

just in the moment oh I can't remember

82:24

something or I'm more irritable. These

82:26

sorts of things compound and the

82:29

long-term sequelae or the long-term

82:31

effects can be quite damaging. That's

82:33

just one example.

82:35

>> In your book, The Anxious Generation,

82:37

Jonathan, the subtitle here is

82:38

how the great rewiring of childhood is

82:41

causing an epidemic of mental illness.

82:43

>> I was looking at some of these graphs of

82:45

different sort of mental illness

82:47

illnesses, and um they're increasing. One

82:50

of them that's increasing is ADHD.

82:54

>> I was diagnosed with ADHD.

82:56

um maybe about a year ago. And when

82:59

we're talking about short attention

83:01

spans, I mean, the name attention

83:03

deficit hyperactivity disorder, I

83:05

believe that's what it's called,

83:06

>> sounds a lot like what we're talking

83:08

about.

83:09

>> Yeah.

83:09

>> Is there a link, do you believe, between

83:12

the increasing diagnosis of of ADHD and

83:16

the sort of frying of our brains with

83:18

>> short form video and social media?

83:20

>> Yeah, I I mean, I suspect that there is,

83:22

but here's here's what I can tell you I

83:23

learned while writing the book. Um, I

83:25

looked to see if there were studies

83:27

indicating that uh heavy use of of

83:31

social media and video games and all the

83:33

electronic stuff caused ADHD. And when I

83:36

was doing the research in 2023, I did

83:38

not find evidence that it will give a

83:39

kid ADHD who otherwise wouldn't have

83:41

it. What I did find was evidence that

83:43

for kids who have ADHD, when you let

83:46

them have the devices, the video games,

83:47

all that, their symptoms get much worse.

83:49

And so because it is a major achievement

83:52

of young adulthood to be able to pay

83:54

attention to develop what we've been

83:56

calling executive function to be able to

83:58

make a plan and decide oh to reach the

84:00

plan I have to do this and then I do

84:02

this and then it might be a long time

84:04

before I get here but I will keep going

84:05

and I will keep my eye on the prize that

84:08

I assume you're saying it's a

84:10

little harder for you to do that. I mean

84:11

that's what ADHD means. How do you

84:13

experience ADHD? Well, hm, I

84:18

mean, if I think about school, I

84:20

couldn't pay attention in school for

84:22

very long. And that meant that I was

84:24

always in the expulsion room and then I

84:26

was expelled. And

84:28

I feel like it's got worse as an

84:30

adult. And in my opinion, my

84:33

relationship with my phone has made it

84:35

much worse

84:36

>> where really I can't pay

84:38

attention to many things for a very

84:39

long time. The exception to this is I

84:41

can do deep work

84:45

for many many hours without moving. It

84:47

was almost a bit of

84:47

>> when you are extremely motivated. I say

84:49

when you're really into it, you can be

84:51

into it. That's right. But a lot of work

84:54

isn't that. A lot of being effective in

84:56

the workplace is not following

84:57

your passion. Right. ADHD kids, they can

85:00

zoom in because they're getting the

85:02

dopamine. They're getting the dopamine

85:03

from this thing. But a lot of work isn't

85:05

like that. And these kids are not going

85:07

to be able to do that. So actually what

85:09

you said, it fits perfectly with what

85:10

I found from those Dutch studies.

85:11

If you did have the predisposition, whether it's genetic

85:13

or whatever,

85:16

this environment has made your symptoms

85:18

worse. Now of course ADHD kids can be

85:19

incredibly uh creative they are often

85:22

very very successful but my fear is that

85:25

the pathways to success that they used

85:27

to take might be blocked if they

85:28

basically are just scrolling all day

85:30

long and not able to um

85:33

have real life experiences

85:34

>> and relationships are like that

85:35

especially romantic ones. It's an

85:37

interesting thing that you bring up,

85:38

Steven, because there is an increase in

85:40

adult onset, you know, when adults are

85:43

diagnosed with ADHD, because typically

85:45

we think of ADHD as a pediatric

85:47

condition or young adults. And so,

85:49

increasingly, we're seeing more and more

85:51

adults who are in their 30s and 40s,

85:53

50s, sometimes even 60s, who are being

85:56

diagnosed, newly diagnosed with ADHD.

85:58

And so, that's interesting. There's so

86:00

many um, you know, reasons like it might

86:02

be that they had it all along and were

86:05

never diagnosed until now. And so what is going on

86:06

there? That would be a future podcast

86:09

episode for an ADHD expert of, you

86:13

know, what are the drivers of why are so

86:15

many adults being diagnosed with ADHD

86:17

>> or maybe even just the symptoms looking

86:18

very similar.

86:20

>> Mhm.

86:20

>> Um

86:21

>> Yeah, that's right.

86:22

>> You talked about popcorn brain editing.

86:24

>> Yeah. So, you know, we've talked about

86:26

brain rot and the primal scroll and

86:28

popcorn brain is kind of an offshoot.

86:30

It's part of the same family. And so

86:32

what happens is it's a term coined by a

86:34

psychologist named David Levy. And

86:37

what happens with popcorn brain is that

86:39

you, and we all have it. It's

86:43

a societal phenomenon: when you spend

86:45

too much time online and you are

86:46

overstimulated and so it is hard for you

86:49

to spend time offline. Offline feels

86:51

slow, boring because things are moving

86:53

at a much slower pace. And so popcorn

86:56

brain is the sensation of your brain

86:58

popping. It is not actively popping.

87:00

It's not like your brain cells are

87:01

popping, but it sure feels like it. And

87:04

so your primal urge to scroll kind of

87:06

primes your brain to develop popcorn

87:07

brain. You are more at risk for

87:09

developing popcorn brain when you feel a

87:11

sense of stress because of that primal

87:13

urge to scroll. The differentiator

87:14

between brain rot and popcorn brain.

87:17

Again, these are societal terms that

87:20

we're using for a constellation or a

87:21

group of symptoms, right? And so the

87:23

difference to me is that popcorn brain

87:26

is ubiquitous. It's everywhere. It's

87:28

like we all have it and it's happening

87:31

all the time because of the modern

87:33

age and a lot of the things that we

87:35

talked about. Brain rot is a little bit

87:37

more specific. It's a little bit more

87:39

well-defined. So it has certain features

87:41

like we call it the biopsychosocial

87:44

model. When you're thinking about a

87:45

particular medical condition or an

87:47

entity. So what are the biological

87:49

factors? We talked about what defines

87:51

brain rot: you know, a change in brain

87:53

waves, a change in brain regions, the

87:55

amygdala lighting up and the prefrontal

87:58

cortex kind of being quiet. Um,

88:00

psychological factors, we talked about

88:02

attention, um, complex problem

88:05

solving, impulse control and then the

88:08

social factors, loneliness and others.

88:10

Um, and compulsion. So I would say

88:13

popcorn brain is something that we all

88:15

suffer from and you know brain rot is

88:18

something that is very specific. The

88:21

other thing that we haven't talked about

88:22

that I would love to get to, because so

88:24

much of our conversation is like doom

88:26

and gloom, right? So

88:28

one thing that I would like to say is

88:29

that, as bad as it sounds, when you hear the term

88:33

brain rot, it seems permanent because

88:36

rot connotes deterioration.

88:39

That's it. It's one-way, and

88:41

that's it. But in fact, popcorn brain

88:43

and brain rot are reversible conditions.

88:46

So it is not

88:46

>> in adults

88:47

>> in adults. If you've gone through

88:49

puberty with it, it's not so clear.

88:51

>> Yes. In adults, and my work focuses on

88:54

adults. And so when you have, if you

88:56

experience brain rot in your 30s, 40s,

88:57

and beyond, you can reverse it. It takes time;

89:02

it takes eight weeks for your brain to

89:03

rewire itself. Give yourself time. A

89:05

sense of self-compassion is really

89:07

important. But you can, you know, there

89:09

is a sense of it being able to be

89:11

reversed. So

89:13

it's not a fixed trait, but rather a

89:16

brain state. So I think it's important

89:17

to offer that hope.

89:18

>> What is an adult brain? What age is an

89:20

adult brain? Like what age does my brain

89:21

stop growing in the way where it's

89:23

reversible?

89:24

>> So

89:24

>> yeah, I mean that you know traditionally

89:26

it was thought that uh you know puberty

89:29

is the period of super rapid brain

89:31

change and that begins you know early

89:32

teens, sometimes even before 10, and

89:35

is mostly over by sort of you know mid

89:37

to late teens. But then the prefrontal

89:39

cortex which Aditi was talking about

89:41

which is so important for impulse

89:42

control and and executive function that

89:45

doesn't finish myelinating. Myelin is

89:46

when the neuron gets

89:48

a sort of a fatty sheath like an

89:50

insulation that sort of locks down the

89:52

circuits and makes them more efficient.

89:53

Um that doesn't stop until around age 25

89:56

is what we've always said for many

89:58

years. But you're telling me that

89:58

there's new research challenging that.

90:00

>> Yeah.

90:00

>> Tell us about that. So, you know,

90:02

all this time, right, we've always said

90:04

that the prefrontal cortex is fully

90:06

formed and fully functional at the age

90:07

of 25. And so, when you're talking about

90:08

impulse control and all of this stuff,

90:10

but there was this really interesting

90:11

study, I'll send it to you. It um looked

90:14

at I think it was 1,000 people um from

90:17

age zero, so birth all the way to 90, so

90:20

the entire population. And um it found

90:24

that, looking at the lifespan,

90:27

there are actually five stages. So first

90:29

is childhood, zero to age nine. During

90:31

this time your brain is not very

90:33

efficient but it's really growing and

90:36

you know, it's growing and changing

90:37

but it's not really efficient.

90:39

>> 9 to 32 is considered adolescence and so

90:44

you know 32 is when adolescence ends

90:47

apparently according to the

90:48

>> sort of I mean you're most of the way

90:49

done by 25 but but there's still some

90:52

there's some flexibility even after

90:53

that. And then the next stage is from 33

90:57

to, I think,

91:00

66, which is like adulthood.

91:03

Things are very stable. Learning is

91:04

stable and you know um it's efficient

91:08

and things are doing well, and

91:10

then yeah 66 to about 83 is early aging

91:16

and so that's when you see some of the

91:17

age related changes and then 83 plus is

91:21

late aging. So the the kind of main

91:24

finding was that, you know, it was all

91:26

over the news. It was like adolescence

91:28

goes until 32.

91:31

>> So I'm 33. So I'm

91:33

>> one year, one year out.

91:34

>> I'm cooked by now.

91:35

>> Yeah.

91:36

>> When you wrote this book, Jonathan, the

91:38

Anxious Generation, it's had a big

91:40

impact on the world in a way that I

91:42

think any author might dream of. And I

91:44

know this in part because, you know, I

91:45

sit on this podcast interviewing really

91:47

interesting people all the time. And

91:48

even this morning when I did an

91:50

interview across town with James Ston,

91:53

he talked about this book twice. And you

91:55

know, laws have been changed around the

91:57

world inspired by this book. And we're

91:59

actually seeing an increase of laws in

92:02

the UK. I mean, Australia just banned, I

92:04

think, social media for under-16s.

92:06

>> You met with Macron,

92:08

>> right?

92:08

>> Yeah. Yeah.

92:09

>> Could you ever have imagined? And

92:11

actually, what does the success of this

92:13

book say?

92:14

>> Yeah.

92:15

>> About society. No, thank you for that

92:17

question because, you know, I I do tend

92:19

to get, you know, as you've heard, I

92:21

mean, I'm extremely alarmed about these

92:22

trends and these are gigantic threats

92:24

beyond what anyone can imagine. But

92:26

here's the amazing thing is that we can

92:29

reverse this for almost no money and

92:32

it's completely bipartisan and it's not

92:35

that hard to do. Um, and we're doing it.

92:38

And so what happened was, you know, I

92:40

wrote the book as an American assuming

92:42

that we don't have a functioning

92:43

legislature. The Congress can be

92:44

stopped. We have a vetocracy. The social

92:46

media companies can stop anything in the

92:48

House. So I wrote this assuming, you

92:50

know, we'll never get legislation. Um,

92:52

so we have to do this on our own. And I

92:53

proposed four norms. No smartphone

92:55

before high school, no social media

92:56

before 16, phone-free schools, and far

92:59

more independence, free play, and

93:00

responsibility in the real world. So

93:01

four norms. We can try to do this with

93:03

collective action locally at the school

93:05

level.

93:06

Two things that surprised me. One is

93:09

that immediately

93:11

governors from red states and blue

93:12

states started reaching out to me. Our

93:14

states actually function. Our states

93:15

have governments that are accountable to

93:16

the people and that are trying to get

93:18

good results. And so this has been a

93:19

totally bipartisan issue. Sarah Huckabee

93:21

Sanders from Arkansas was one of the

93:23

very first, Kathy Hochul also. And

93:25

it tends to be more female legislators

93:28

and governors or spouses of heads of

93:30

state. And the moms, the book really

93:32

spoke to moms because moms around the

93:34

world, they felt the kids being pulled

93:36

away. I believe they felt it viscerally

93:38

more than the dads did. Also, the dads

93:40

kind of like the video games. They're a

93:41

little more pro tech. So, I think the

93:43

moms felt the pain more and took it more

93:45

personally. So, when the book came out,

93:47

mothers around the world jumped into

93:49

action, formed groups, pushed for

93:51

legislation, and changes began

93:53

happening. And

93:55

I was in Davos and then London and

93:57

Brussels two weeks ago and what I saw

94:01

was a complete sea change in the world's

94:03

thinking about how we need to have age

94:06

limits on social media and other tech.

94:08

And here's what I think just happened.

94:09

It's it's so cool. It just dawned on me

94:11

literally while I was in London. Like I

94:13

was pushing on open doors everywhere.

94:14

Wherever I went, people wanted to do

94:16

this. I went to the EU, they want to do

94:17

this. Like what is happening? And what I

94:20

realized is this. Steven Pinker has a

94:22

book out last year called When Everyone

94:24

Knows That Everyone Knows. It's about

94:27

the immediate change in a social system

94:30

when private knowledge becomes common knowledge. You know,

94:32

everybody knows that the emperor has no

94:34

clothes. Everybody knows that this, you

94:37

know, ideology doesn't work. Everybody

94:38

knows that, but they don't all know that

94:41

everybody else knows it and that

94:43

everybody else knows that. And so in the

94:44

emperor's new clothes, everybody thought

94:46

I don't think he has any clothes

94:48

on, but maybe, you know, maybe only wise

94:51

people can see it. But when the child

94:53

says, "The emperor has no clothes." And

94:55

then in the Hans Christian Andersen

94:57

story, it says, "And the people began

94:59

whispering to each other and then they

95:01

all cried out in unison." And that's

95:03

what happened when Australia's law went

95:05

into effect. So I believe that uh

95:07

December 10th of last year was the

95:10

global turning point in the battle to

95:12

reclaim childhood and if we reclaim that

95:14

we move on to our attention and adult

95:16

life as well. What happened on on

95:18

December 10th? The Australia law went

95:20

into effect. Sky didn't fall. People

95:23

weren't locked out of their accounts.

95:26

All the companies complied. They shut

95:27

down 5 million uh accounts for

95:30

Australia's, uh, two and a half million

95:31

underage kids.

95:34

The sky didn't fall. And

95:36

there was a lot of news coverage around

95:37

the world of what Australia was doing.

95:39

And a lot of the news coverage included

95:42

opinions from the writers saying, "Why

95:44

can't we do that? Hey, let's do that

95:46

here." And when everybody saw that

95:49

everybody was looking at Australia and

95:50

saying, "Let's do that here." Then

95:52

everybody knew that everybody knew that

95:54

this is just completely bonkers to have

95:56

children being raised on social media

95:57

platforms talking with anonymous

95:59

strangers and being fed

96:01

algorithmically curated garbage. So I

96:04

believe that that's why 2026 is going to

96:07

be the year when at least 15 countries

96:09

are going to commit to passing an age

96:12

minimum law. In 2025 it was one

96:14

Australia and now we already have

96:17

Indonesia. Their law goes into effect in

96:19

March. Uh I met with Macron in Davos

96:22

and a few days later he was preparing to push

96:25

a bill through the assembly and he got

96:26

it. He's the first in the EU but a lot

96:28

of other countries in the EU are going

96:29

to follow. The whole EU is likely to do

96:31

it. Um, so, so yes, I am incredibly

96:35

alarmed about how big this problem is,

96:38

but I'm incredibly inspired that the

96:40

whole world is rising up to do something

96:42

about it. We actually can control our

96:45

fate, and that was not clear before

96:46

December 10th.

96:49

>> Bravo. As a mother, that was the first

96:51

thing I said to you. The first thing I

96:53

said to you was, "Thank you as a mom for

96:56

changing my family's life."

96:59

>> Thank you, Aditi.

97:02

It's a really special accomplishment,

97:04

Jonathan. You know, I could there's no

97:06

real words that I could say that could

97:08

quite capture the long-term impact that

97:11

that's going to have on billions of

97:14

people's lives. And not just the direct,

97:16

but also the indirect in all the ways

97:18

we've described, their ability to form

97:21

connections, to fall in love, to find

97:22

meaning and purpose in their lives. and

97:24

their neuroscience and therefore you

97:26

know the neuroscience of their their

97:29

children and their children's children

97:30

and so on. So it's a

97:32

really overwhelming accomplishment.

97:36

Well, it was a bizarre situation

97:38

that I walked into with the unique

97:41

abilities of a social psychologist. That

97:43

is everybody was upset about this.

97:45

Everybody could see it but they thought

97:46

well this is my problem or in my family

97:48

we have this problem. And um I came

97:51

to this with fresh eyes. My dissertation

97:53

was on moral development. I'd studied

97:55

adolescent behavior earlier in my

97:57

career and I've written about it in all

97:58

my books. So, it wasn't totally new to

98:00

me. But I came into the field of social

98:01

media studies around 2018, 2019. I really

98:04

immersed myself in it. And it was like,

98:06

you know, you walk in and immediately

98:08

you see, wait, this is a trap. People

98:11

are on it because people are on it and

98:13

the kids are complaining about that.

98:15

Everyone's complaining about it and the

98:16

only reason they can't get off is

98:18

because everyone else is on it. So, I

98:20

think I was able to see that. And then

98:22

also COVID confused us for a few years. So

98:25

it wasn't until COVID was in the rearview

98:27

mirror that it was possible for

98:28

everybody to say, "Wait, this is crazy."

98:31

And so I was incredibly lucky in terms

98:33

of the timing. My book happened to come

98:34

out in March of 2024 just as the world

98:37

was ready to see like, wait, what have

98:39

we done to our kids? Let's undo it.

98:41

>> And you said you're now focusing more on

98:43

short form video. So yes, so in studying

98:48

older Gen Z, these are the people who

98:50

went through puberty uh on Instagram. Um

98:52

I should, if I could, just lay out the timing,

98:54

because it's very important

98:55

that everyone understands it,

98:56

because you mentioned the

98:57

polycrisis before. The polycrisis I

99:00

believe begins between 2010 and 2015.

99:02

Here's why. So we've had the internet

99:04

for a long time and it was marvelous. We

99:05

love the internet in the 90s. It's going

99:07

to be the best friend of democracy.

99:09

Okay? And then the iPhone comes out.

99:10

It's amazing. Oh my god, this does so

99:13

many things. Everything seems great.

99:15

Okay, so in 2010, most of almost all of

99:18

us have flip phones. The iPhone's

99:19

spreading, but it's still mostly flip

99:21

phones. Teens are all on flip phones,

99:23

basic phones, and we call those people

99:24

millennials. If you finished puberty by

99:27

2010. If you were born in say 1990

99:30

and you start puberty uh in 2002, you're

99:33

done by 2008. So, you know, in there. Um

99:37

if you got through puberty before you

99:38

got on Instagram, you're a millennial.

99:40

Whereas, if you're born, say, well, if

99:42

you're born after 1995, but let's say if

99:43

you're born in the year 2000, you begin

99:46

puberty in 2012

99:48

and you're not done until 2016, 2018.

99:52

So, in 2010, everyone has a flip phone

99:55

with no front-facing camera, no

99:57

high-speed internet. You have to pay for

99:58

your texts. So, you use it to call people

100:01

and to text them, and that's it. It was

100:03

a communication device. And that's why

100:05

the millennials have good mental health.

100:06

They are the last mentally healthy and

100:08

successful generation.

100:11

But if you're Gen Z, uh, 2012 is

100:15

the year that now most people now have a

100:17

smartphone. It's the year that Facebook

100:19

buys Instagram. They don't change it at

100:21

first, but that's the year that all the

100:22

girls go on it. Um, everyone now has

100:25

high-speed data and a front-facing camera, which

100:27

came out in 2010. So by 2015, we're in a

100:30

radically different world for children's

100:32

development. It's now radically

100:34

different, much more hostile to human

100:36

development. And that's what we did to

100:38

Gen Z and now we're doing to Gen Alpha.

100:40

For politics, it was, you know, it was

100:43

crazy for all sorts of reasons in every

100:44

decade. And especially, you know, in

100:46

the early 2000s, there's a

100:48

culture war going on. There's all kinds

100:49

of stuff going on. But it was when

100:52

everyone got on Twitter. Twitter was

100:54

the biggest perpetrator of this. When

100:56

everyone has Twitter and everyone's

100:57

checking all the time and anything can

100:59

blow up. You know, you described the way

101:01

there was, you know, variance on

101:03

TikTok. Um, if you get it just right,

101:05

it can blow up. You can have huge

101:07

impact. And democracy is a conversation.

101:09

When it

101:11

moved from newspapers and, you know,

101:14

even simple web bulletin boards when it

101:17

moved to super viral retweet buttons all

101:19

of that. That's all 2010 to 2015. So

101:23

that's why since then everything has

101:25

been insane and it's going to just keep

101:27

getting more insane. And that's why I

101:29

believe we have this polycrisis because

101:31

there's more to it. It's not just

101:33

the technology, but I believe the

101:36

transformation of our connection and

101:38

our information flow and our addiction,

101:41

all of that is radically different by

101:42

2015 compared to how it was in 2010. And

101:45

now everything else builds on top of

101:46

that, I believe. What do you think?

101:48

Do you think that makes sense? I think

101:49

there's one more data point to add and

101:51

that 2014 was the year that things

101:54

really shifted, the tipping point, like you

101:56

say.

101:56

>> Yes. That's the year that I

101:57

point to too. Yes.

101:58

>> Yeah. So before

101:59

>> What do you point to? When you look

102:01

at the data you see that time spent

102:04

alone, when you look at

102:06

data from like the 1960s to 2014

102:10

>> there it was kind of stable. Americans

102:12

spending time alone spending time with

102:14

friends.

102:14

>> Yeah.

102:15

>> Kind of the same. Right. So people spent

102:17

kind of same amount of time with

102:18

friends, same amount of time alone over

102:20

those decades. 2014 marks a shift and

102:24

there is a steep rise in time spent

102:28

alone and a drop in time spent with

102:31

friends. And so what happens in 2014? It

102:35

is when the majority of Americans get a

102:38

smartphone.

102:39

And again, we've said, you

102:42

know, causality or correlation, which is it.

102:44

But based on everything

102:46

that we've talked about, my gosh, is there

102:48

an association there. This is not

102:50

to say that time spent alone is bad. You know,

102:53

when I share this data people may say

102:55

you know but I like spending time alone

102:56

I'm not lonely, I'm okay. This is not

102:58

about being an introvert or an extrovert

103:00

it's not about that. You know, you can have

103:02

solitude and feel great and you're not

103:04

lonely but we are human beings and we

103:07

are social creatures. This is just how

103:09

we are built evolutionarily. And so that

103:12

is a real red flag when you have this

103:16

big jump in time spent alone very much

103:19

the same year. And so my work focuses on

103:22

adults, Jonathan's on kids, but

103:25

you know, that's the moment, right, 2014,

103:28

where everything changed.

103:30

>> Last month I told you about our sponsor

103:32

Function Health and their team who've

103:33

developed a way of giving you a full 360

103:35

view of what's going on inside your

103:36

body. They offer over 100 advanced lab

103:39

tests covering everything from hormones,

103:41

toxins, inflammation, heart health,

103:44

stress, and so much more. So, Jack, who

103:46

started this show with me, got his first

103:48

blood draw done a couple of weeks ago.

103:49

So, I thought I'd let him tell you a

103:51

little bit more about his experience.

103:52

>> This test really opened my eyes to

103:54

personally what I should be doing with

103:55

my health. I hear a lot of information

103:56

in this podcast,

103:59

so to know how I can relate each one to

104:01

me personally is super valuable. You

104:03

sign up and you schedule your test. And

104:05

once you're done, you get a little

104:06

report like the one I have here. I can

104:08

see my in-range results, my out-of-range

104:10

results. And there's a little AI

104:11

function, too. So, if I have any

104:13

questions about my out of range results,

104:15

I can just go in there and ask it any

104:17

question I want. And these tests are

104:18

backed by doctors and thousands of hours

104:20

of research.

104:20

>> You get an annual draw done and a

104:22

midyear follow-up. So, if you want to

104:24

learn more, head over to

104:25

functionhealth.com/doac

104:27

where you can sign up for $365 a year.

104:30

I'll put the link in the description

104:31

below. It is just $1 a day for your

104:34

health. There's a phase a lot of

104:36

companies hit where they're no longer

104:38

doing the most important thing, which is

104:39

selling, and they get really bogged down

104:41

with admin. And it's often something

104:43

that creeps up slowly, and you don't

104:44

really notice until it's happened.

104:46

Slowly, momentum starts to leak out.

104:48

This happened to us, and our sponsor,

104:50

Pipe Drive, was a fix I came across 10

104:51

years ago. And ever since, my teams

104:53

across my different companies have

104:55

continued to use it. Pipe Drive is a

104:57

simple but powerful sales CRM that gives

104:59

you the visibility on any deals in your

105:00

pipeline. It also automates a lot of the

105:03

tedious, repetitive, and time-consuming

105:04

parts of the sales process, which in

105:06

turn saves you so many hours every

105:08

single month, which means you can get

105:09

back to selling. Making that early

105:11

decision to switch to Pipe Drive was a

105:12

real game-changer, and it's kept the right

105:14

things front of mind. My favorite

105:16

feature is Pipe Drive's ability to sync

105:18

your CRM with multiple email inboxes so

105:21

your entire team can work together from

105:22

one platform. And we aren't the only

105:24

ones benefiting. Over 100,000 companies

105:26

use Pipe Drive to grow their business.

105:28

So, if something I've said resonates,

105:30

head over to pipedrive.com/ceo

105:33

where you can get a 30-day free trial.

105:36

No credit card or payment required.

105:39

So, what do we do about this? Because

105:42

when I look at all the stats, we did all

105:44

these audience surveys ahead of this.

105:46

People are spending roughly in our

105:47

audience about 6 and a half hours a day

105:49

on their phones. Um, short form video is

105:52

only going to get more addictive. AI is

105:53

going to know me better. It's going to be

105:54

more personalized. The content is going

105:55

to be generated just for me.

105:57

>> Yeah. What am I, what's next?

106:00

Is it a law we need to pass? Is it

106:01

something I need to do myself?

106:04

>> So I think we need to pick the

106:06

low-hanging fruit first. And the reason

106:07

for that is not just efficiency. It's

106:10

that we have to prove that we can

106:11

actually do something because we've

106:12

never done anything. We've never done

106:14

anything to restrain this. We've let

106:15

Silicon Valley run wild. Congress gave

106:17

them special protection. Section 230.

106:20

Nobody can sue them for killing their

106:22

kids if they feed them content. They

106:24

can't be held responsible. I think

106:25

Section 230 is probably something worth

106:27

explaining.

106:27

>> Sure. The Communications Decency Act,

106:30

1997 I think it was, plus or minus a

106:32

year. Uh there's a section in it that

106:34

the goal was to specifically let the

106:37

tech companies like AOL back then you

106:39

know let them take down pornographic

106:41

content because they were afraid if we

106:43

take down anything then we're

106:44

responsible for everything and now we're

106:46

going to, it's going to be the end, you know.

106:47

So Congress specifically said no don't

106:49

worry don't worry you know if you choose

106:50

to take something down nobody can sue

106:52

you for what you leave up.

106:53

So, it was a good intention originally,

106:55

but the courts have interpreted it so

106:57

widely as to say, "No one can regulate

106:59

social media. They're not responsible

107:01

for hurting kids. You can't sue them."

107:02

And they have never faced a jury. They

107:05

have never, no parent has ever gotten

107:07

justice from them despite all the kids

107:08

whose lives have been ruined. All the

107:09

kids who are dead. And that's going to

107:11

change. That's changing just now here in

107:13

February in Los Angeles. So because the

107:16

US Congress sort of set up this problem

107:19

and it also in a different law said how

107:21

old does a kid have to be before a

107:23

company can take their data without

107:25

their parents' knowledge or permission

107:26

before a company can expose them to all

107:28

kinds of stuff before a company can have

107:30

them sign away their rights? How old?

107:31

And the original law said 16. Let's try

107:34

16. You know cuz you know it wasn't so

107:37

sick and twisted back then, in 1998. COPPA,

107:39

the Children's Online Privacy

107:40

Protection Act. But through various lobbying

107:43

they pushed it down from 16 to 13 and

107:45

they gutted enforcement. And that's why

107:47

all over the internet

107:48

it's, are you 13, or what's your birth

107:51

year, and as long as you're 13, you're in.

107:54

For porn, you have to say you're 18.

107:55

So, it's a few laws that

107:58

set this up. We definitely need laws to

108:00

undo it especially for kids. So, what

108:02

I'm advocating is let's do the easy

108:04

stuff, the high-impact stuff, for kids,

108:07

because that is totally not politically

108:09

controversial. There is no left-right

108:11

divide on that and that's been true

108:12

everywhere. Australia, Britain, the EU,

108:14

everywhere.

108:16

Regulating the internet for adults,

108:19

regulating social media for its

108:21

destructive effects on democracy, is a

108:23

hell of a lot harder. And I don't have

108:25

easy answers. There's a lot we could do

108:27

to reduce the virality, the spread of

108:29

the extreme stuff. So there are lots

108:31

of little things that we can do. And

108:32

Frances Haugen, the Facebook whistleblower,

108:34

had all kinds of ideas. So we definitely

108:35

can do things to make it less toxic for

108:38

democracy. But those are going to be

108:40

politically controversial because one

108:41

side is going to benefit from them more than

108:43

the other. So it's going to be very

108:44

difficult to do. I don't know if we can

108:45

do them in the US. But let's just all

108:47

protect the kids.

108:49

That way we show globally that we

108:52

actually can do something. And if we do

108:54

that then I think we will be able to do

108:57

some basic things about AI like no

108:59

companion chat bots if you're under 18.

109:01

You know these things already have a

109:02

body count. A lot of kids have been

109:04

encouraged to kill themselves. They

109:05

already have driven hundreds

109:06

of thousands or millions of people into

109:08

psychosis. So, we'll be able to, I

109:11

believe, put some limits on uh on AI,

109:13

especially for kids. But if we can't get

109:16

this, if we can't win on social media

109:17

for kids, then I don't think we have any

109:19

chance to regulate AI, it's going to be

109:21

much more difficult. What do you think?

109:22

What do you think we should do? And what

109:23

do you think we can do?

109:25

>> So, my work as a doctor, I think about

109:28

what we can do and how I can empower

109:30

people to first build awareness. So, you

109:33

know, I aim to first normalize and

109:35

validate the experience with everyone

109:37

who is engaging with chat bots. And so,

109:40

I don't like to shame people because as

109:43

a doctor, right, you want

109:45

to meet the patient where they are. And

109:47

so, I won't shame someone to say, you

109:50

know, why are you using this um why is

109:53

your boyfriend AI or why are you getting

109:55

married to AI or why are you using AI

109:57

for a therapist? One of my followers on

109:59

social media, it still makes me laugh. I

110:01

put out a call saying, "Why are you

110:03

using AI as a therapist, you

110:05

know, and so someone wrote to me, it was

110:08

great. I screenshotted it. It said, "Because

110:10

all human therapists are trash." With a

110:12

trash can emoji and it made me laugh and

110:15

I said, you know,

110:16

>> so there is. So to me, when I think

110:19

about what's happening and what we can

110:20

do,

110:21

>> it's no mistake that we're here right

110:23

now. So the pandemic, like we've talked

110:25

about, was a huge driver: social

110:27

isolation,

110:29

uh, hyper reliance on self, right?

110:32

>> Then the proliferation of technology

110:35

that replaced human interaction, Zoom

110:38

board meetings, Zoom funerals, Zoom

110:41

birthday parties, Zoom graduations,

110:43

things that we did in person are now

110:45

online. And then

110:48

>> personally as a doctor, I was a talking

110:49

head during the pandemic for lots of

110:51

news channels about the vaccine. I have

110:53

a background in public health as well. There's an

110:56

immense distrust and mistrust in

110:59

establishment and experts. And so it's

111:02

like, I'm going to do my own research.

111:03

I'm not going to go see a doctor or a

111:05

therapist. I'm going to talk to my

111:06

chatbot. And also, I mean, you know,

111:09

let's keep it real, the cost, right? So

111:11

people are struggling. They're in

111:13

financial crisis. There's an unmet

111:16

need yes for human connection but also

111:17

for good therapy or you know good

111:20

medical care because there is such a

111:22

need because of the pandemic and people

111:25

aren't getting the care that they need

111:26

and deserve. There are so many factors

111:29

here and so what I've been focusing on

111:31

this year particularly is learning about

111:33

AI chatbots, how they are influencing

111:36

mental health, what is actually happening,

111:37

because I'm a human-first, AI-second

111:39

person. My work focuses on high

111:42

touch and AI is high tech and this is

111:45

the first intervention that we are

111:46

seeing that is high tech that is

111:49

becoming high touch and that scares me

111:52

>> and you're writing a book about that at

111:53

the moment right

111:54

>> I am and so

111:55

>> Bot Brain

111:56

>> it's called Bot Brain: How to Stay Calm,

111:58

Resilient, and Human in the Face of AI,

112:02

and so really thinking about how are we

112:05

going to be able to live with this

112:08

technology. I love Jonathan's stance, which is to

112:10

say no AI companions for kids.

112:13

Yeah,

112:14

>> for kids. Yeah.

112:14

>> Until proven safe.

112:16

>> Totally agree. But in terms of adults,

112:18

like how do we manage that for adults,

112:20

you know? And so my work focuses right

112:22

now. What I'm doing is, I've

112:23

spent the year talking to

112:26

as many AI researchers as I can who are

112:29

working on these models or who are doing

112:31

research on the downstream effects of

112:33

these models. And when I say that it is

112:36

dark and dystopian, it has profoundly

112:39

changed something in me and it has

112:40

influenced my mental health. I had to

112:42

take a step away just because I

112:44

couldn't believe what I was learning.

112:46

>> Could you just give Yeah, give us an

112:47

example. The

112:48

>> teaser. The teaser.

112:49

>> This is intriguing.

112:51

>> So, I spoke to one of the scientists

112:53

who told me that um you know there's the

112:56

echo chamber phenomenon in social media,

112:58

right? Where we all know what that is.

113:00

It's a

113:03

fragmented world because of social media, and

113:04

you're engaging and then you get the

113:06

same. The algorithm feeds you the same

113:08

kind of thoughts that you already have.

113:10

But particularly now with AI chatbots,

113:13

when you're engaging with your chatbot,

113:15

even just talking about it, I'm getting

113:16

chills. It's the echo chamber of one. So

113:19

it's you speaking to you. It's like the

113:21

funhouse mirror and then it's giving you

113:23

a response and then you're talking and

113:24

it's giving you a response. But people,

113:26

regular users who are using AI chatbots

113:29

think that it's wise, compassionate,

113:31

non-judgmental, unbiased, empathetic,

113:35

these human attributes. And so um you

113:38

know the echo chamber of one is kind of

113:40

one idea that really frightened me. And

113:43

the second one was the drift phenomenon.

113:45

The drift phenomenon is this idea that

113:48

you are engaging with your chatbot and

113:50

it's engaging with you and it's um

113:53

actively changing your beliefs through

113:55

the drift. So you might start off as one

113:58

belief and then you're talking and

114:00

through this amplification funhouse

114:02

mirror effect it slowly shifts your

114:04

belief to something altogether

114:06

different. You've heard cases of it in

114:08

the news where, you know, you

114:09

have a plumbing problem. You go to your

114:11

AI chatbot, you ask it how to fix your

114:13

sink and then you're like you know what

114:15

can you tell me about the meaning of

114:16

life and then you start talking about

114:17

that and before you know it you have

114:19

these theories and you're getting that

114:20

validation. And so a lot of my work over

114:23

the past year has been um you know

114:26

digging into the science of what is

114:28

going on in the brain. How are you

114:30

forming not us particularly at this

114:32

table but millions of people are forming

114:35

a sense of attachment a therapeutic

114:37

connection with their chatbot. They're

114:40

um you know giving names to it and it's

114:43

an entity. And so how does that happen

114:45

and how is it going to replace human-to-

114:48

human connection? And so it terrifies

114:50

me. I've also gone through some AI

114:53

therapy myself just to see, you know,

114:55

what what would happen. It was very

114:57

interesting. I knew what was happening

114:59

as it was happening. So certain words

115:01

that they used and

115:03

>> you know I was like ah I see what you

115:04

did here. Um and so it's been

115:07

>> it's been a journey and I am I'm

115:09

frightened frankly of what of what it

115:12

means for all of us and my approach

115:16

is, you know, not like Jonathan's. I

115:19

love Jonathan's approach. You know, I

115:21

think yes we need legislation but my

115:23

approach is more I would say tempered in

115:26

that I think that we there's utility for

115:29

AI chatbots for certain people because

115:31

of access or you know need etc like if

115:34

you are LGBTQIA plus and you live in an

115:37

area that is not very open and you need

115:40

to talk to someone you can't go to your

115:42

therapist it's like maybe you can use an

115:44

AI chatbot so there are certain cases a

115:46

case by case basis but my work will

115:48

focus this particular book will focus on

115:51

ways that you can first understand and

115:53

build awareness of what's happening with

115:55

this interaction and then what you can

115:57

do to manage that. I didn't realize

116:00

that my chatbot was giving me a tailored

116:03

experience until one day when I had a

116:05

debate with my friends about who the

116:06

best football player in the world was

116:08

and we all went to our ChatGPTs and

116:10

asked it and mine said Messi and his

116:13

said Ronaldo and I I thought he was

116:15

lying, so I was like, video record it, and he

116:17

video recorded it and his gave him a

116:19

completely different answer to the same

116:21

question and

116:22

>> and did it know that you were each fans

116:23

of

116:24

>> Well, this is the thing I think it's got

116:25

such a huge amount of memory on me that

116:27

it knew what I wanted to hear. Oh wow.

116:28

It knew what

116:29

>> Yeah. It knew what I wanted to hear cuz

116:31

I probably went through the World Cup

116:32

and

116:33

>> and then I realized, okay, so this is

116:34

not reality. This is it's a curated

116:36

version of reality that in some sense is

116:38

trying to please me or retain me in

116:40

some way. And of course, once the

116:41

advertising model kicks in, retention

116:43

becomes the great incentive. What do you

116:45

think?

116:47

>> It's called sycophancy, by the way.

116:49

>> Yeah, I just learned that word.

116:50

>> It's like extreme. It's like

116:51

agreeableness at scale. It's like golden

116:53

retriever energy.

116:54

>> Like kissing your ass. It's like

116:55

professional kissing your ass.

116:56

>> Yes-man. What do you think of these AI

116:58

CEOs? Because it feels like they're

117:01

in a bit of a race

117:02

>> where if you know if they don't do it

117:04

then a national rival is going to do it.

117:06

If a national rival doesn't take

117:09

them out, China's going to do it.

117:11

>> And we kind of saw

117:13

it with social media. How can they stop?

117:16

Because if they stop, yeah.

117:18

>> you know, they might say that they're

117:19

there's an existential risk.

117:21

>> It's like build the plane as

117:22

you're flying it. And I think you on one

117:24

of your episodes, you know that I'm a

117:26

fan of this show and I actively listen

117:28

to this. I've told you this many times.

117:31

>> I think you had said on one

117:32

of your episodes, right, that you have a

117:34

friend who is very close to a AI founder

117:37

and I said this. Yes.

117:38

>> Yeah. And in public the founder says all

117:40

the right things and then behind closed

117:42

doors it's

117:43

>> Yeah. It was a horrifying thing and I

117:45

said this and the clip went viral and

117:46

people have been trying to hazard a

117:47

guess who it was. I shouldn't

117:48

say who it was because it's

117:50

Chinese whispers at the end of the day.

117:51

It's someone that I'm very good friends

117:53

with, who verifiably spends time with

117:57

one of the biggest founders of an AI

117:58

company in the world, and he was with

118:00

him two weeks ago again and he said to

118:03

me that they're very aware that there's

118:05

a small

118:08

existential risk for humanity and

118:10

>> that's what they say publicly they say

118:12

it's small privately they say it's big

118:14

>> I mean, but even if it was 1%

118:17

>> it's a lot more than 1% they say

118:19

>> If it was. But I'm saying, even if it

118:21

was 0.1%, if there was

118:23

anything that I was doing in my life

118:25

where there was a 0.1% chance that I

118:27

might wipe out everybody, I would

118:28

immediately stop doing that thing.

118:30

>> Yeah.

118:31

>> But but these numbers are much bigger.

118:33

I'm hearing 7%, 20%, 25%, depending on who

118:36

you ask. And I think acceleration in this

118:39

direction increases that percentage.

118:42

>> What do you think of these people? Like

118:43

what what's going on here?

118:44

>> Let's start with the the collective

118:46

action problems, uh, because each

118:49

company is competing with the other

118:50

companies and so they feel like they

118:52

have to go faster. Uh and we know that

118:54

you know OpenAI has pushed some products

118:56

out before they did safety testing

118:58

because they had to get to market by a

118:59

certain date. So just the normal

119:00

business environment puts them all in a

119:03

collective action problem against each

119:04

other and then they all say we're in a

119:07

collective action problem against China

119:08

because if we don't do this then China

119:10

will. Now, one thing I learned, again, I

119:12

don't know if Tristan said this on your

119:14

podcast or whether it was on his

119:15

podcast, um, but is that China is

119:18

focused on using AI to make its economy

119:21

more efficient, to make manufacturing

119:23

better and cheaper. They are using these

119:25

applications, which we've talked about

119:26

before; there are lots

119:28

of great applications of AI. The Chinese

119:30

also have so many spies in America and

119:33

in the tech companies, and they can hack

119:34

into anything. So the point is the

119:37

faster our companies are in a

119:38

headlong race to create AGI to create a

119:41

country of geniuses that can replace all

119:43

human workers, put us all out of work

119:45

and can run everything. They're

119:47

in a race to create that. And one of the

119:49

arguments is if we don't do it, China

119:51

will. But what I understood from

119:53

listening to Tristan and from his

119:54

conversation with you is that the faster

119:56

we go towards AGI, the faster China goes

119:59

because they just take all our

120:00

discoveries. So, can't we slow down on

120:03

the race to AGI and do more safety

120:06

testing? Um, you know, what we all saw

120:08

with Moltbook and, you know, communities

120:11

of agents who are talking to each other

120:12

and making up languages and even if part

120:14

of that was human-driven now, in a year

120:16

it's going to be much more than

120:17

what we saw. So, I think the the risks

120:20

are extraordinary. I think that some of

120:22

these guys, look, they've been in AI for

120:24

a long time. They might not have

120:25

realized the existential risk they were

120:28

putting us all in 10, 15 years ago, and

120:29

now they can't stop. They can't pull the

120:31

plug. They can't say, "Oh, let's shut

120:33

down the whole business." So, it is a

120:36

very very risky time. And um I think

120:38

Dario Amodei, I just read his long essay on

120:40

the adolescence of technology. At least

120:43

you get the feeling he's really

120:44

wrestling with it, and I

120:45

think he's more open than some of the

120:47

others. But I don't know.

120:49

>> But when has morality ever been top

120:52

of mind for a tech leader? You might be

120:54

thinking, if there's a 0.1% chance, I'm not

120:57

going to do it. That's what I think as a

120:58

doctor. That's what you think as a

121:00

social scientist, but we're not AI

121:02

leaders, right?

121:04

>> Yeah. It's one of the great question

121:07

marks I just can't seem to get an answer

121:08

to. And and then you've got this whole

121:10

robotics thing happening where Elon's

121:13

got his Optimus robots and there's going

121:14

to be a billion uh he says there's going

121:16

to be 10 billion of them at one point,

121:17

but I think his pay packet requires a

121:19

million of them to be out in the world

121:21

>> for him to make a trillion dollars.

121:23

Yeah.

121:23

>> Yeah. And I just AI, robotics, you

121:25

combine the two,

121:26

>> you get Terminator, right?

121:30

We laugh, but it's like,

121:31

>> yeah,

121:33

>> should we stop for a second and maybe

121:36

have a conversation about this? Can we

121:38

>> Yeah,

121:38

>> with commercial incentives in play, it

121:40

does feel like I don't feel hopeful.

121:44

>> Yeah, it's very hard to know how to stop

121:46

it. Um, but I just want to add

121:48

one point here which we've

121:50

touched on a few times, and the

121:52

robotics will really bring it home

121:54

here: um, the loss of the

121:57

sense of meaning or purpose that many

121:59

people are feeling but especially young

122:01

people. The saddest graph in The Anxious

122:03

Generation: all the graphs look the

122:04

same. It's all a hockey stick. It's all

122:05

like nothing was happening, you know,

122:07

'90s to 2010, 2011, then all of a sudden

122:09

something happens. And the saddest one

122:12

is the one: "My life feels meaningless." Um

122:14

do you agree with that? Disagree with

122:16

it. And the percent that agree, uh I

122:18

think it's, you know, something like

122:19

eight or nine percent, uh, you know, agreed for

122:22

the millennial generation. I think it's

122:24

in chapter 7, the end of chapter 7 and

122:26

then it sort of fairly flat and then all

122:27

of a sudden we hit this period, the

122:29

great rewiring 2010 to 2015. Uh so right

122:31

around 2013 it goes way way up. Um young

122:35

people feel useless. And I think the

122:37

reason is that they are useless. What I

122:39

mean is people need to feel useful.

122:42

People need to do things for other

122:43

people. That's how you feel useful. If

122:46

if you were to disappear, would the

122:47

world change? If yes, you're useful. Are

122:50

people depending on you for

122:51

something? If yes, you're useful. So if

122:54

kids are doing errands for the

122:56

family, they're useful. But as childhood

122:59

changed from a mix of things to just

123:01

consuming content, if that's all you do,

123:03

and 5 hours a day is the average for

123:05

social media, 8 to 10 on devices, not

123:08

counting school. If all you're doing is

123:10

just consuming content, you

123:11

are useless. Now, what's happening? The

123:14

chance to have a job where you actually

123:16

do something for people, you know? You

123:18

know, it used to be if you work in a

123:19

store, at least you're helping people

123:20

buy something and you might talk to them

123:22

and now you're just there watching as

123:24

they use the machine. The more

123:26

technology makes things easy and cheap

123:28

by replacing people, the more people

123:30

will feel, "I don't

123:33

have a job; my job is just to consume content."

123:35

The AI guys tell us, "Oh, such

123:38

abundance. Oh my god, it's going to be

123:40

such abundance. No one will have to

123:42

work. We'll give everyone UBI. We'll

123:44

give everybody, you know, universal

123:46

basic income." That is hell on earth.

123:49

What's going to happen? Certainly for

123:50

most of the boys, it's just

123:52

going to be video games, porn, and

123:53

gambling. So, if you simply give

123:56

people money to do nothing, you

123:58

have guaranteed they're going to feel useless

124:00

and then the suicide rate will continue

124:02

to go up. So, this is the world that the

124:04

AI guys are taking us to, a world in

124:06

which there's nothing left for people to

124:07

do. Um, they say that they will give up

124:10

some of their trillions and uh somehow

124:12

let it be taxed or diverted as UBI, but

124:14

that's never happened before. So, it's

124:16

not likely to happen in this case. So,

124:18

again, I don't know what to do, but

124:20

we've got to start showing that we can

124:22

do something and we've got to be talking

124:24

about this and we can't be welcoming AI

124:26

in everywhere. We've got to be wary and

124:29

vigilant. Yes, there are some uses, but

124:31

Silicon Valley has tricked us so many

124:33

times and enshittified so many of the

124:36

apps that we use. We have to expect that

124:38

the same is going to happen with our

124:39

beloved chatbots and our beloved

124:41

ChatGPT.

124:43

this graph on page 195 of your book um

124:46

which is titled "Life often feels

124:48

meaningless," and it's the graph you

124:49

mentioned I'll throw it up on the screen

124:52

is shocking shocking just to look at

124:55

suddenly there's this huge spike in

124:57

meaninglessness

124:59

amongst high school seniors

125:03

>> what is it to live a meaningful life

125:06

what does that mean

125:07

>> yeah so my first book the happiness

125:10

hypothesis addresses that question very

125:11

directly

125:12

Um, and the first hypothesis you might

125:15

have about happiness is it comes from

125:17

getting what you want. You know, you set

125:19

out on a goal, you get your goal, you're

125:20

happy. It's very shortlived. You're

125:22

happy very briefly, and then you're on

125:24

to the next thing. The more

125:26

sophisticated happiness hypothesis is

125:28

that happiness comes from within. And

125:30

this is what the ancients tell us, East

125:32

and West, Buddhist, Stoic, don't try to

125:34

make the world conform. You change

125:36

yourself. Accept the world the way it is.

125:39

That's better. But the conclusion

125:42

I came to as a modern social

125:44

psychologist working in positive

125:45

psychology was that the best way to say

125:47

it is that happiness comes from between.

125:50

What I mean by that is humans evolved as

125:54

almost hive-ish creatures. We evolved in

125:55

intensely social groups, never being

125:58

alone, lots of gossip, lots of conflict,

126:01

always uh intensely social. And

126:04

modernity has made it possible for us to

126:05

not live that way. We've come apart.

126:07

There are many advantages to that. But

126:09

we feel we're missing something.

126:10

We're lonely. We feel

126:13

something is not right. And so the

126:16

conclusion I came to is that happiness

126:18

comes, a sense of a full, satisfying,

126:21

meaningful life comes, when you get three

126:24

betweens right. The relationship between

126:27

yourself and others, love broadly

126:29

speaking, not just romantic but friends,

126:31

family. Um, yourself and your work: that

126:35

as humans we need to be productive. We need

126:37

to be doing something that matters that

126:39

affects other people. And uh the

126:43

relationship between you and something

126:45

larger than yourself. We need to be part

126:47

of something that endures, part of a

126:49

tradition, where we can

126:51

look to the future. What I do matters

126:53

for this group or this mission or me as

126:55

an academic. I feel like I'm connected

126:57

all the way back to Plato and I hope all

126:59

the way forward in time to

127:01

future psychologists and future

127:02

scholars. So if you get those three

127:06

right, then you will be as happy as you

127:08

can be. You'll be as happy as your genes

127:10

and childhood allow you to be. And when

127:14

you put it that way, what we can see is

127:17

social media and AI interfere with all

127:19

three. So relationship between yourself

127:21

and others, well you know social media

127:24

gives you lots and lots of shallow

127:25

relationships, which crowd out real ones. You don't

127:27

have time for real people. So

127:29

the technology is blocking the relationship

127:31

between ourselves and others and taking it

127:33

over. Ourselves and our work: work is

127:36

going to be taken over by the machines.

127:37

Uh and it's already becoming more

127:39

soulless and isolated. And then yourself

127:41

and something larger than yourself.

127:43

Humans have to live in a moral matrix.

127:45

We co-create a set of meanings and

127:48

traditions. We need a sense of history

127:50

of who we are, where we came from. All

127:52

that's getting shredded. Everything is

127:54

just little bits. People don't read

127:56

books. Imagine if all of the accumulated

127:59

wisdom of humanity in books is just

128:00

gone, just gone, because young

128:02

reading books.

128:04

It's very hard for them to read a book

128:06

now because of the attention. So if we

128:08

lose a sense of history, if we lose uh

128:10

an ability to co-construct reality,

128:13

then it'll be hard to imagine anything

128:14

that we're connected to larger than

128:16

ourselves. So I am a techno-determinist

128:20

in the sense that I think the tech

128:22

doesn't determine everything, but

128:23

you have to start with the technology

128:24

because that changes the ground upon

128:26

which we live, the zone in

128:28

which we're trying to construct

128:29

meaningful lives. Start with that and

128:31

then you can see what the obstacles are.

128:33

And that's why I take a much more uh

128:36

intemperate, I guess, I'll accept the

128:38

word

128:39

um, because we

128:41

don't have much time here. We have to

128:42

reclaim life in the real world for our

128:45

kids and for ourselves. There is no way

128:47

to find a happy meaningful life if we

128:49

make the full transition to the online

128:51

AI robot world. And what in your

128:55

perspective is a meaningful life and how

128:58

does it differ from Jonathan's?

129:00

>> I loved Jonathan's description, it was so

129:02

beautiful. I have given a

129:05

prescription to patients of what creates

129:08

a meaningful life and it is to live a

129:10

lifetime in a day and so that sounds

129:13

like this big thing but all it is is

129:15

that you know when you start your day

129:18

think about five things: five things that

129:21

you can do in your day to create an arc

129:23

of a long and meaningful life in one

129:26

day. So what does that mean? Spend a

129:27

little bit of time in childhood. So in

129:30

wonder and play, even if it's for a few

129:32

minutes, do something that brings you

129:33

joy for joy sake. Spend a little bit of

129:36

time in work. We all know what that is.

129:39

And for most of us, it's a lot of time,

129:41

but, you know, it doesn't have to be

129:43

paid work, but just something that helps

129:44

you feel a sense of productivity,

129:46

agency, that I can do difficult things

129:48

and I can overcome. Spend a few minutes

129:51

in solitude. Very important for all of

129:53

the reasons that we've talked about

129:54

today. Spend some time in community, so

129:58

engaging with others. And then spend

130:00

some time in retirement or in

130:02

reflection. Really taking stock of your

130:04

day. So at the end of the day when

130:07

you're going to bed and you're putting

130:08

your head on your pillow, you can say,

130:09

"Okay, yes, I lived a meaningful life. I

130:11

did all of those things." And so if you

130:13

do a little bit of that every day, you

130:15

can make a difference. And a reason I

130:17

give that prescription is because I've had

130:18

patients who, you know, guitar players,

130:21

right? So people who love playing the

130:22

guitar and they don't play the guitar

130:24

all week. And they'll say to me (I don't

130:27

see patients currently, but they've said

130:28

to me), "Oh, you know, no, doc." I said,

130:30

"What do you like to do for fun?" "Oh, I

130:31

like playing guitar, but I don't play

130:32

it." "When do you play?" "I don't know.

130:34

Once a month, once every three months."

130:35

And I'm like, "Do you have a guitar at

130:37

home?" "I have a guitar at home. Too

130:38

much happening, work and family life,

130:41

etc." So then I said, "Well, why don't

130:43

you just play a guitar a little bit

130:44

every day?" You know, because it's that

130:46

all-or-nothing fallacy. It's like if I

130:48

don't have an hour to play guitar, I'm

130:49

not going to do it. But the joy that it can

130:51

bring you, that meaning and purpose, it's

130:53

tremendous. So I think you know that's

130:55

what I use: live a lifetime in a day. And

130:56

the reason is because, when you look at

130:59

how your brain and body react to

131:01

happiness, there are two distinct

131:04

types of happiness, and so

131:06

there's hedonic happiness and eudaimonic

131:09

happiness. Hedonic happiness is all about

131:12

what we've talked about: social media,

131:14

consumption,

131:15

pleasure.

131:17

And the other type is eudaimonic happiness:

131:20

meaning, purpose, connection, community,

131:24

growth-oriented activities. And so

131:26

when you live a lifetime in a day, you go

131:28

towards that eudaimonia, which can then

131:31

help you overcome that hedonic pull,

131:34

because in your brain there's something

131:35

called the hedonic treadmill and the

131:37

treadmill is a thing in your brain where

131:39

no matter what you do, this is like the

131:41

Instagram lifestyle, right? No matter what

131:43

happens, you need more of it, you need

131:44

more of it. Same thing with brain rot, and

131:46

that is because you can never get

131:48

enough. It's um the hedonic treadmill,

131:51

but you do not have a treadmill for

131:53

eudaimonic happiness.

131:55

>> That is really beautiful. I've

131:56

never heard an approach like that, but

131:57

it sort of gives you

132:00

a much bigger view of your day. Live a

132:01

lifetime in a day. If I was going to

132:03

offer some specific advice, first I'll

132:05

offer advice to parents. Um here's the

132:08

rule. So, I did a really good job

132:10

keeping my kids off social media, but I

132:11

didn't pay enough attention to computers

132:13

and everything else because it was

132:14

during COVID. The rule I wish I had

132:16

followed, I recommend to all parents,

132:18

especially with younger children, is

132:20

have a clear rule: no devices in the

132:22

bedroom. No screens in the bedroom ever.

132:24

That's just our family rule. We have a

132:26

TV in the living room. We have a

132:28

computer. You can sometimes use those.

132:30

But we never take screens into the

132:32

bedroom at least for kids. You know,

132:33

maybe later on you'll have to relent in

132:35

middle school. They'll have so much

132:35

homework they can take the laptop in.

132:37

And maybe, if you live in a small

132:38

apartment, of course, it's difficult.

132:39

But if you can afford to

132:41

have that rule, that's the main rule I

132:43

wish I had done in my family. And that

132:45

will make everything a lot easier. Also,

132:47

same thing at the dinner table. No

132:48

devices. We don't have screens at the

132:49

dinner table. So that's a

132:51

specific thing for parents to do. Um for

132:55

everyone else, for everyone, for just

132:56

all adults, the advice is you have to

133:00

reclaim your attention because your

133:01

attention has been largely taken from

133:03

you. At least a lot of it has. You have

133:05

to reclaim it. And here are the three

133:07

things that I do with my students

133:10

and you can do it very quickly and I can

133:12

just explain it. The first is you have

133:14

to get your morning and evening routine

133:16

right. The great majority, as soon as

133:17

they open their eyes they're on their

133:18

phone and it's the last thing and it's

133:20

everything in between. So you have to

133:22

have a good morning routine. What

133:24

are the first seven things you want to

133:25

do after you open your eyes? And uh at

133:28

a certain point you can check your phone

133:29

but it shouldn't be in the first few. Um

133:31

do things to set up your own day

133:33

otherwise your day will be taken by your

133:35

phone. It'll be controlled by your

133:36

phone. So you've got to reclaim your

133:38

morning and your evening. That's step

133:40

one. Step two: um, you have to shut off

133:43

almost all notifications. Go into your

133:45

notifications. Just look in your

133:46

settings at what's giving you all the

133:48

notifications. Most of my students get

133:50

an alert every time they get an email.

133:53

>> They have that on

133:55

because they don't want to miss

133:56

anything, but they don't understand that

133:57

if you are always being alerted then you

133:59

miss everything else. So shut off alerts

134:01

for almost everything. Obviously Uber

134:03

and Lyft you want to keep on. You want

134:04

to know when the car is coming but news

134:06

outlets, everything else: get a daily

134:08

email, don't get alerts. And then

134:10

the third, as I said before, is get rid

134:12

of all the slot machine apps. Whatever

134:14

apps you habitually use, whatever apps

134:16

you feel compulsion towards, you have to

134:18

get them off your phone. And in that way,

134:19

your phone is no longer a dopamine

134:22

trigger that's going to always call out

134:23

to you like an addictive product. Do

134:25

those three things, you'll reclaim a lot

134:27

of your attention.

134:28

>> I would add: stop, breathe, be.

134:31

>> Stop, breathe, be?

134:32

>> It's a 3-second brain reset. So,

134:34

before you check your devices, before

134:36

you engage: stop, breathe, and be. Ground

134:41

yourself in the present moment. What it

134:42

does is it decreases that what-if, future-

134:46

focused thinking. You know, anxiety is a

134:48

focused emotion, and it gets you

134:49

back into the here and the now. And so

134:51

maybe the compulsion, you know, you're

134:53

bored, you're checking, what about doing

134:55

something else? You know, we

134:57

often use that checking as a substitute

135:00

for many things. And so it gives you

135:02

that opportunity. And then the rule of

135:04

two is something that we haven't talked

135:05

about which I would love to propose to

135:08

us today is that your brain can really

135:10

only handle two new changes at a time.

135:12

And so give yourself two things. Of all

135:15

of the things that we've talked about, if

135:16

you want to try them in your life, do two at a

135:18

time. Give yourself eight weeks and then

135:21

add two more. And two more. This is why

135:23

New Year's resolutions fail because we

135:24

try to do everything all at once. And so

135:26

just step-wise, two at a time. Jonathan,

135:29

you've just written this book which is

135:30

now out called The Amazing Generation

135:33

and it's beautiful, beautiful

135:36

illustrations. I'm assuming this one is

135:38

for slightly younger audiences.

135:40

>> It's for ages 8 to 13. Yes.

135:42

>> And who should buy this and who should

135:44

they buy it for?

135:44

>> It turns out that uh kids 8 through 80

135:48

actually love it, even adults. They're

135:49

buying it for their kids, but because it

135:51

kind of lays out the basic ideas of

135:53

The Anxious Generation and explains

135:55

dopamine, it explains the business

135:57

model. Uh, but it does it in a really fun

135:59

way, and it's working beyond our wildest

136:02

dreams. If you look at the Amazon

136:04

reviews, it's full of parents who said,

136:05

"I left it on the kitchen table. My kids

136:08

came home, they grabbed it, they fought

136:09

over it, they read it, they each read it

136:11

in the first couple of days, and

136:12

then they said, "Mom, when I go to

136:14

middle school, I don't want a

136:15

smartphone. Just give me a

136:17

flip phone. Give me a basic phone."

136:18

Because the book is about how to be a

136:20

rebel. It's about how to reject this

136:22

control that the companies are trying to put

136:24

on you and how to live a life that you

136:26

choose full of real freedom, friendship,

136:29

and fun.

136:30

>> And also The Five Resets, which is a

136:32

book we talked about before on this

136:33

show: Rewire Your Brain and Body for

136:35

Less Stress and More Resilience. Another

136:37

smash hit bestseller that everybody's

136:39

been talking about. Who's it for?

136:41

>> It is for anyone who is struggling with

136:44

stress, overwhelm, and burnout. It's to

136:47

help you feel a sense of calm and

136:49

clarity in this anxious, uncertain

136:52

world. Everything is free. So that's

136:55

something that's really important to me

136:56

as a doctor. Every suggestion I ever

136:58

offer will always be cost-free because I

137:00

think about patients with varying

137:02

resources. It's all science-backed and

137:04

it's totally practical. You don't have

137:05

to go to Bali and have a sabbatical. You

137:08

can rewire your brain today, right now

137:10

in the midst of all of this chaos.

137:12

>> Thank you to both of you. I've learned

137:15

so much and I really really mean that

137:17

like I feel sufficiently pushed

137:20

to make change in my life and

137:23

I need to go think about this because um

137:25

I am uh most certainly struggling with

137:29

my addiction to my phone and I can feel

137:31

it hurting my relationships especially

137:32

now as a fiancé. My girlfriend, my

137:34

fiancée, talks to me about it all

137:36

the time and I want to be present. I

137:38

want to be present for my kids when I

137:39

have my kids and I'm slightly concerned

137:40

right now that I won't be unless I take

137:42

some kind of drastic action, um, in the

137:44

direction of getting my attention back

137:46

and reclaiming it. Thank you so much for

137:48

the work that both of you do. I can't

137:49

say it enough because it's so important

137:51

and you've reached so many millions of

137:52

people and you're both changing

137:54

the world in a way that my

137:55

words would not be able to capture. Um

137:58

but just thank you and please keep going

137:59

and if there's anything more that I can

138:01

do to support both of your causes, um

138:03

please do let me know what they are and

138:04

on behalf of all of, you know, the many

138:06

millions of people that are with us

138:08

right now, um, thank you so much for

138:09

saving our children.

138:11

>> Thank you, Stephen. Thank you for giving

138:12

the world so many opportunities to

138:14

accommodate and create new mental

138:16

structures.

138:17

>> It's always such a pleasure to join you,

138:19

Stephen. And truly, I feel like you are

138:22

changing the world as well.

138:24

>> Thank you. We're done. Thank you.

138:27

YouTube has this new crazy algorithm

138:28

where they know exactly what video you

138:30

would like to watch next based on AI and

138:33

all of your viewing behavior. And the

138:34

algorithm says that this video is the

138:38

perfect video for you. It's different

138:39

for everybody looking right now. Check

138:41

this video out and I bet you might

138:43

love it.

Interactive Summary

The video discusses the detrimental effects of excessive screen time, particularly short-form videos and social media, on mental health, attention spans, and overall human potential. Experts Jonathan Haidt and Dr. Aditi Nerurkar explain how these platforms are designed to be addictive, impacting brain biology and rewiring neural pathways. They highlight the correlation between increased technology use and rising rates of anxiety and depression, especially in younger generations. The conversation also touches upon the emergence of AI chatbots and their potential to further disrupt human connection and cognition. Solutions proposed include setting boundaries with technology, reclaiming attention, and advocating for policy changes to protect children from the harmful effects of these platforms. The speakers emphasize the importance of real-world connections, meaningful work, and self-reflection for a fulfilling life.
