We Didn’t Ask for This Internet | The Ezra Klein Show

Transcript

0:00

When was the last year that the internet felt good to you? I think everybody has different answers to this. Mine, I think, go fairly far back, maybe to the heyday of blogging.

0:11

>> ... and getting up to 40,000 hits on his blog a day.

0:14

>> At least before the moment when Twitter and Facebook went algorithmic.

0:20

>> What we're trying to do is give everyone in the world the best personalized newspaper we can.

0:24

But whatever your answer to it is, I have not found many people who think that 2026, right now, this internet with all of its anger and its outrage and its AI slop, is what we were promised.

0:38

>> A glitch spread graphic, violent videos to unsuspecting Instagram users.

0:43

>> This is living at the technological peak.

0:46

>> In just three clicks, I can go from date-night ideas to mind-control seduction.

0:53

But even if there is this growing consensus that something went wrong with the internet somewhere, and that it is driving our society somewhere we don't want it to go, there's not really a consensus on what to do about it. What to do about these giant platforms, increasingly spammed up with ads and sponsored results, boosting content that will keep us hooked and angry, isolating and dividing us and deranging our politics, making a few billionaires ever richer, held up by an army of low-wage workers in warehouses and on delivery bikes. Something has gone so wrong. But what do we do about it?

1:29

My guests today have two theories of the case. Cory Doctorow is a longtime blogger, an activist with the Electronic Frontier Foundation, and a science fiction writer. His new book is Enshittification: Why Everything Suddenly Got Worse and What to Do About It. Tim Wu worked as a special assistant to President Biden for technology and competition policy. He's a professor at Columbia Law School and the author of influential books on technology, including his latest, The Age of Extraction: How Tech Platforms Conquered the Economy and Threaten Our Future Prosperity.

1:59

I wanted to have them both on because their books feel to me like they talk to each other. They feel like they're describing something similar, but in importantly different ways. And so I wanted to think through their theories, and through what might be done about it. As always, my email is ezrakleinshow@nytimes.com.

2:24

Tim Wu, Cory Doctorow, welcome to the show.

2:27

>> Thank you very much.

>> Great to be here.

2:29

>> So I just learned that you both went to elementary school together.

>> Yep.

>> Yeah, that's true.

2:35

>> In suburban Toronto. A weird little school with like 80 kids. It was kindergarten to 8th grade in one classroom. Older kids taught the younger kids. We more or less were left to go feral and design our own curriculum. They chucked us out of the school on Wednesday afternoons to take our subway pass and find somewhere fun in Toronto to go do stuff. It was great.

2:54

>> Is there anything about that school that would lead people to becoming sworn enemies of our tech overlords?

3:00

>> Well, we loved tech at the time. I mean, we were early in on the Apple II.

>> Mhm.

3:05

>> And frankly, that's where it all started, in a way. You know, both of our books have this kind of pining for a lost age. And I think some of it is this early era of computing, when we were just so excited, so optimistic. Everything was just going to be so amazing. And that, to me, a little bit, was fifth grade, or grade five as we say: programming the Apple II.

3:25

>> Can I slightly problematize that? We were both also science fiction readers back then, and so I was pretty alive to the dystopian possibilities of computers at the time. So I wouldn't call myself optimistic. I would call myself hopeful and excited, but not purely optimistic. And I would also like to say that, per John Hodgman, nostalgia is a toxic impulse. When I think about what I like about those days, it's not that I want to recover those days. It's more that I kind of dispute that the only thing an era in which people had lots of control over their computers could have turned into is one in which the computers had lots of control over them. That there is probably something else that we could have done.

4:13

>> When you're spending time on the internet these days, Cory, what feels bad to you about it?

4:19

>> So what I would do is contrast what happens when things aren't great now with how I felt about what happened when things weren't great before. When I was on the early internet and I saw things that sucked, I would think: someone's going to fix this, and maybe it could be me. And now, when I see bad things on the internet, I'm like: this is by design, and it cannot be fixed, because you would be violating the rules if you even tried.

4:45

>> Tim, how about you?

>> I feel it's like a tool I cannot trust. You know, the tools I like in my life, like a hammer: I swing it, it does something predictable. The internet seems like it's serving two masters. I search for something, I get a bunch of stuff I don't really want, and I don't really know what I'm getting. I go to write one email or check one thing, I end up in some strange rabbit hole, and like three hours go by and I don't know what happened. So I feel like I'm constantly at risk of being manipulated or taken from, and I don't trust the tools to do what they say they're going to do. And that makes using it kind of like living in a fun house. So I don't like that.

5:32

>> So I want to make sure I give voice to somebody who is not in the show at the moment, because this is going to have the flavor of...

>> The prophet Elijah has entered the chat.

>> Yeah, right. Three middle-aged guys who think the internet went wrong somewhere along the way. When I was working on this episode with my producer, one of the interesting tensions behind the scenes was that she doesn't think the internet is bad. She thinks TikTok is, she said, a perfect platform. She has young kids and feels Amazon is a godsend for a young parent. Obviously there are many people like this who are using these platforms freely, of their own volition, happily. So what do you say to somebody who says, "What are you [ __ ] all talking about?"

6:13

>> Yeah, I guess I'll start. On the middle-aged "things used to be better" front, I don't want to fall into that sort of trap. I just think the deal is not what it could be. And maybe, as a consumer who sort of lightly uses this, the internet is still useful. But I have children, too, and I think it's hard to deny that social media has been tough on kids and has had all kinds of negative effects there, and that really started accelerating over the last 15 years or so. I think we have a highly polarized political structure, which is, you know, made worse by social media. I think we have a problem with inequality, which has gotten worse and worse and is accentuated by the fact that the margins are just so thin for independent business. And I also think about this vision that it would be this equalizer, this leveler: this technology that made a lot of people rich, not just a few people rich; that made it, maybe not easy, but reasonable and lucrative to start your own business; that it would sort of change some of the challenges of inequality and class structure in the United States. Now, maybe those were very high hopes.

7:29

This is a key concept in my book, and I think key to understanding the economics of our time: the importance of platforms, which are, you know, any space or any institution that brings together buyers and sellers, speakers and listeners. Every civilization has had platforms. I was in Rome a few weeks ago, and you go to the Roman Forum and there it is. It's all together: the buyers, the sellers; they have the courts; they have where people gave their speeches. You know, they're kind of the core of every civilization.

8:04

And at some level, why I wrote this book is that I was interested in this question of what our fundamental platforms look like and how that reflects on the civilization we are building, because I do think they have a large impact. I think that's kind of undeniable. But I think that things have gotten worse in many dimensions. And I guess it relates to my view of the state of the country as well. I think we've been in a better place in other periods of American history. And I think the internet's not the only cause, but I think it's part of it.

8:37

>> If I were having this conversation with your producer, and we had some time to talk about it, I would probably walk them through a couple of the undisputed ways in which some people have found the internet get worse for them. So Tim has talked a little about margins for small businesses. There are also people who are performers, who found that the take that's being sucked out of their pay packet every month by the platforms is going up and up. There are people who would really not like to be snatched by ICE snatch squads, who installed ICEBlock on their iPhone, only to have Tim Cook decide that ICE officers were members of a protected class and remove that app. And now you can't install that app, because the iPhone only lets you install official apps. And I'd say: just because this hasn't hit you, unless you have a theory about why you are favored by these platforms, you should at least be worried that this would come.

9:31

And I would follow up by saying: let's not fall into the trap of vulgar Thatcherism. You know, Thatcher's motto was "there is no alternative," and I think tech bosses would like you to believe that too. That if you're enjoying having a conversation on Facebook with your friends, which I stipulate lots of people do (I think that's absolutely the case, and we should value and celebrate it), you just have to accept that there is no way to have a conversation with your friends that Mark Zuckerberg isn't listening in on. That to ask for otherwise would be like asking for water that's not wet; it's just not possible. And what I'm advocating for is not "you don't like that thing you like." It's "I like that you like the thing you like. I want to make it good, and I also want to guard it against getting worse." Because just because it hasn't happened to you yet, it would be, I think, naive to think that it would never come for you.

10:25

>> So your books are two frameworks for understanding what I would call the corporate capture of the internet: the way we went from the dream of a decentralized, user-controlled internet to something that a small number of corporations really run and have enormous power over. And Tim, the term you focus on is extraction. Cory, the term you focus on is enshittification. So I'd like you both to define those terms for me. What is extraction, Tim? What is enshittification, Cory?

10:55

>> So extraction is actually a technical economic term that refers to the ability of any entity or any firm with market power or monopoly power to take wealth or other resources far in excess of the value of the good being provided, and not only the value being provided, but also its cost to provide it. That's the technical definition. So you might have a pharmaceutical company: there's a rare disease, they have the only treatment for it, and maybe they're extracting as much as they can. You know, $100,000 a year is about the usual. And I think the idea of it comes from a sense, something I get from teaching at business school sometimes, that American business has, in my view, moved increasingly to focus its efforts on trying to find points of extraction as a business model, as opposed to, say, improving the product or lowering the price. You try to find the pain points where your customers really have no choice, and then take as much as you can, kind of like in a poker game when you go all in because you've got the good hand. Now, there's always been a little bit of that in business, or maybe a lot, like in the Gilded Age. But the question is: what is the ratio? How much of business is providing good services for good prices (making a profit, that's fine), and how much is that different thing, extraction?

12:20

>> So, Tim, before I move on to Cory, I want to zoom in on something you said there, because a lot of that definition seems to turn on how you define value.

>> Yeah.

12:28

>> And I mean, a lot of economists would say price is a method of discovering value. If you have a pharmaceutical people are willing to pay $70,000 for, that means they value it at $70,000, even if you think that is extractive.

>> Mhm.

12:40

>> So how do you know when a price, when a profit, is actually extractive, versus when we're just seeing that people value that product very highly, and bully for the producer for creating something people value so highly?

12:58

>> Yeah. So if someone, for example, has no choice, they are desperate, let's say, for water, and someone is able to sell them a bottle of water, because they're dying, for $100,000 or something like that, then yes, that person does value it at that level. But an economy full of nothing but maximized monopoly prices, where firms are in a position to extract, is inefficient for two reasons. One is that too much money gets spent on that water versus other things, like maybe pursuing an education. And second, the entity that holds that much power actually has an impulse to reduce supply, reduce output, and therefore produce less of the stuff, so that they can extract the higher price. This is just classic monopoly economics I'm getting into. I mean, everyone inside themselves has something they are willing to pay, but that doesn't mean it's a good society when you're constantly paying the maximum you are willing to pay in every situation. It's actually a very oppressive economy, I think.

14:00

>> So, Tim, when we're talking about extraction: for many of these platforms, for a Facebook, for a TikTok, we're not paying for them. So when you say they are extractive, what are they extracting, and from whom?

14:11

>> When you use Facebook, you are constantly being mined for your time, attention, and data, in a way that is extraordinarily valuable and that yielded something like $67 billion in profit last year. So, you know, things that feel free: Is it free when you suddenly spend hours wandering around random things you didn't intend to? Is it free when you end up buying stuff that you didn't really want and wonder why you got it later? Is it free when you feel that you've had your vulnerabilities exploited? I would say none of that's free. You're poorer, both in your own consciousness and in terms of your attention and your control over your life. And you're probably poorer in misspent money.

14:57

>> Cory, how about enshittification?

>> Well, before I do that, I also wanted to react to something that you were sort of feinting at, Ezra, which is this idea of revealed preferences, which you often hear in these discussions. That if you let Facebook spy on you, no matter what you say about how you feel about Facebook spying on you, you have a revealed preference. And Tim used the word "power" when he responded to that. And I think that if you ask the neoclassicals, they'll say: well, we like models, and it's hard to model qualitative aspects like power, so we just leave them out of the model and hope that it's not an important factor. And this is how you get these incredibly bizarre conclusions, like: if you sell your kidney to make the rent, you have a revealed preference for having one kidney. But what we actually know, when we give people choices, when the state intervenes or when there's countervailing power, is that often you get a different revealed preference. You know, when Apple gave Facebook users the power to tick a box and opt out of Facebook spying, 96% of Apple users ticked that box. So the argument that Facebook users don't mind being spied on is, I think, blown out of the water when you actually give them a way to express preferences. And I assume the other 4% were either drunk, or Facebook employees, or drunk Facebook employees, which makes sense, because I would be drunk all the time if I worked at Facebook. But I think it's hard to deny that people really don't want to be spied on, if they can avoid being spied on.

16:21

>> All right. I think that's a good setup to enshittification.

16:23

>> Yeah, enshittification. It's really a label I hung on both an observation about a characteristic pattern of how platforms go bad and, I think much more importantly, why they're going bad now, because we didn't invent greed in the middle of the last decade. So something has changed. My thesis is that some exogenous factors have changed.

16:42

The pattern of platform decay is that platforms are first good to their end users while locking them in. That's stage one. And once they know that the users have a hard time departing, when they face a collective action problem or when they have high switching costs, you can make things worse for the end users, safe in the knowledge that they are unlikely to depart, in order to lure in business customers by offering them a good deal. And so far, so good. I think a lot of people would echo that, but they would stop there. They would say, "Oh, you're not paying for the product, so you're the product." That this is about luring in users and then getting in business customers who will pay for it. That's not where it stops, because the business customers are also getting screwed. The business customers get locked in, and this power that the platforms end up with over their business customers is then expressed in stage three, where they extract from those business customers as well. They dial down the value left behind in the platform to the kind of minimum, like a homeopathic residue, needed to keep the users locked to the platform and the businesses locked to the users, and everything else is split up among the executives and the shareholders. And that's when the platform's a palace of [ __ ]

17:48

But the more important part, as I say, is why this is happening now. Broadly, my thesis is that platforms used to face consequences when they did things that were bad for their stakeholders. And those consequences came in four forms. They had to worry about competitors, but we let them buy those. They had to worry about regulators, but when a sector is boiled down to a cartel, they find it very easy to agree on what they're going to do and make their preferences felt, because they have a lot of money, because they're not competing with one another, and they capture their regulators. They had to worry about their workers, because tech workers were in very scarce supply, and they were very valuable, and they often really cared about their users. They could really say, "No, I'm not going to enshittify that thing I missed my mother's funeral to ship on time," and make it stick, because there was no one else to hire if they quit, and they were bringing a lot of value to the firm. But of course, tech workers famously thought that they were temporarily embarrassed founders. They didn't unionize; they didn't think they were workers. So when the power of scarcity evaporated, they had not replaced it with the power of solidarity. And so now you have 500,000 tech layoffs in three years, and tech workers can't hold the line.

18:53

And then, finally, there was new market entry. There were new companies that could exploit something that I think is exceptional about tech. I'm not a tech exceptionalist broadly, but I'm an exceptionalist about this: every program in your computer that is adverse to your interests can be neutralized with a program that is beneficial to your interests. And that means that when you create a program that is deliberately bad, you invite new market entrants to make one that's good. Right? If you lock up the printer so it won't take generic ink, you just invite someone to not only get into the generic-ink business, but get into the alternative-printer-firmware business, which eventually could just be the I'm-going-to-sell-you-your-next-printer business. But what we've done over 20-plus years is monotonically expand IP law until we've made most forms of reverse engineering and modification without manufacturer permission illegal. A felony. My friend Jay Freeman calls it "felony contempt of business model." And as a result, you don't have to worry about market entry, with this incredibly slippery, dynamic character of technology.

19:53

And when you unshackle firms from these four forces of discipline, when they don't have to worry about competitors or regulators or their workforce or new market entry through interoperability, the same CEOs go to the same giant switch on the wall in the C-suite marked "enshittification," and they yank it as hard as they can, as they've done every day that they've shown up for work. And instead of being gummed up, it has been lubricated by an enshittogenic policy environment that allows it to go from zero to 100 with one pull. And that's how we end up where we are today.

20:25

>> All right, I want to bring these out of theory, though. Cory, I applaud how well structured that was on the fly. And I want to have you both walk through this with an example that you use in your books.

>> Sure.

20:37

>> And Cory, I want to start with you. Walk me through how you see enshittification as having played out on Facebook itself. Not all of Meta, but Facebook: where it started, when it was adding value to users in the early days, to where you feel it has gone now. Tell me your Facebook story.

20:55

>> Yeah. So Facebook, really, its big bang was 2006. That's when they opened the platform to anyone, not just people with a .edu address from an American college. And Mark Zuckerberg needs to attract users, and his problem is that they're all using a platform called MySpace. So he pitches those users, and he says: "Look, I know you enjoy hanging out with your friends on MySpace, but nobody should want to use a surveillance-driven social media platform. Come to Facebook and we'll never spy on you. We'll just show you the things that you asked to see."

21:24

>> We need to give people complete control over their information, right? People need to be able to say exactly who they want to share each piece of information that they're sharing, who they want to share that with.

21:34

>> So that's stage one. But part of stage one, remember, is that there's a lock-in. It's just the collective action problem, right? You love your friends, but they're a pain in the ass. And if the six people in your group chat can't agree on what bar to go to this Friday, you're never going to agree on when it's time to leave Facebook or where to go next. Especially if some of you are there because that's where the people with the same rare disease as you are hanging out. And if some of you are there because that's where the people in the country you emigrated from are hanging out. And some of you are there because that's where your customers or your audiences are, or just because that's how you organize the carpool for the kids' Little League. And so we are locked in.

22:08

And so that ushers in stage two: making things worse for end users to make things better for business customers. So think about advertisers. Advertisers are told: "You know, remember we told these rubes that we weren't going to spy on them? Obviously that was a lie. We spy on them from [ __ ] to appetite. Give us pennies and we will target ads to them with exquisite fidelity." And so the advertisers pile in. Publishers pile in, too. They become locked to the platform. They become very dependent on it.

22:34

And in stage three, advertisers find that ad prices have gone way up, ad-targeting fidelity has fallen through the floor, and ad fraud has exploded to levels that are almost incomprehensible. Publishers famously now have to put their whole article there, not just an excerpt. And woe betide the publisher that has a link back to their website, because Facebook's downranking off-platform links as potentially malicious. And so they don't have any way to monetize that except through Facebook's own system. And we've got a feed that's been basically denuded of the things we've asked to see. It has the minimum calculated to keep us there.

23:10

And this equilibrium is what Facebook wants, but it's very brittle, because the difference between "I hate Facebook and I can't seem to stop coming here" and "I hate Facebook and I'm never coming back" can be disrupted by something as simple as a live-streamed mass shooting. And then users bolt for the exits, the Street gets nervous, the stock price starts to wobble, and the founders panic, although, being technical people, they call it pivoting. And, you know, one day Mark Zuckerberg arises from his sarcophagus and says: "Harken unto me, brothers and sisters, for I've had a vision. I know I told you that the future would consist of arguing with your most racist uncle using this primitive text interface that I invented so I could nonconsensually rate the fuckability of Harvard undergraduates. But actually, I'm going to transform you and everyone you love into a legless, sexless, low-polygon, heavily surveilled cartoon character, so that I can imprison you in a virtual world I stole from a 25-year-old comedic dystopian cyberpunk novel, that I call the metaverse." And that's the final stage. That's the giant pile of [ __ ]

24:08

>> All right, Cory, you got a good rant there, my man.

24:12

>> Cory could be a rapper if he decided to get into that.

>> I give you real props on it: the world is crying out for a middle-aged technology-critic rapper.

24:21

>> Let me ask you at least one question here, so I'm not just too taken in by your charisma. I think the counterargument somebody would offer is, well, two things. One is, for all the pivots, all the scams, by the way (I mean, I was a publisher during the era of the Facebook fire hose to publishers, and the era of pivot-to-video, when Facebook videos were getting these astonishing view counts)...

>> Fraudulent view counts.

24:50

>> That's what I was about to say. One, they kept, first, all the money. They promised everybody: come get this huge scale, we're giving you all this traffic, you can build a business here. There was no business to build there at any significant scale. And two, it turned out that the video view counts were fraudulent, right? And so a huge amount of the news industry, among other things, pivoted to video, and it was based on lies.

>> And scams. There's a recent Reuters report that Facebook was actually charging advertisers more for these things that they knew were scams.

>> 10% of their ad revenue is ads for scams, by their own internal accounting.

25:23

>> I'm really not here to defend Facebook as an actor. But one of the crazy things amidst all of this, a thing you really focused on there, was moving from showing us what we had asked to see to showing us what, I would say, Facebook wants us to see. There was just the FTC v. Meta case (Tim was, of course, involved in that), and one of the statistics that came out during it is that only 7% of time spent on Instagram is spent on things your friends and family have actually shown you, things people you follow are showing you. Similarly, on Facebook itself, it's under 20%. I forget the exact number, but it's very low.

25:58

They have moved, under competition from TikTok specifically, although not only, to these AI-driven algorithmic feeds showing you not what you have asked to see, but what they find will keep you there. And what they are finding is that it will, in fact, keep you there. People are coming back to it, and they spend more time on Instagram when you turn the feed into this algorithmic feed. This is the whole revealed-preference thing that you were talking about earlier. My personal experience of Instagram when I go on it now (one reason I try to go on it less) is that I can actually feel how much more compelling it is. I like it less, but the feeling of getting pulled into something is much stronger.

26:36

And so I think if you had Mark Zuckerberg, risen from his...

>> Sarcophagus.

>> I was going to say office, because I'm a more polite person. Here, he would say: we did this under competitive pressure. TikTok was eating our lunch. We stole a bunch of things from TikTok, and now we're doing better. We also stole a bunch of things from Snapchat, and now we're doing better. Because, in fact, we are under a lot of competition, and we are incredibly good at responding to that competition in ways that our user base responds to. This is not enshittification. This is the magic of competition itself, and you know that because look at our profit margin and look at how much we've changed.

27:12

>> So let me say that I don't think competition is a good unto itself. I think it is absolutely possible to compete to become the world's most efficient human rights violator. The reason I like competition is because it makes firms into a rabble instead of a cartel. So in 2022, two teenagers reverse-engineered Instagram, and they made an app called OG App. And the way OG App worked is: you give it your login and password, it pretends to be you and logs into Instagram, it grabs the session key, it grabs everything in your Instagram feed, it discards the ads, it discards the suggestions, it discards all of the stuff that isn't a chronological feed of what the people you followed had posted recently. Facebook, or Meta, sent a letter to Apple and Google, who obliged them by removing the app, because there's honor among thieves.

28:03

So if you want to find out what people actually prefer, you have to have a market in which people who disagree with the consensus (that people are kind of gut flora for immortal colony organisms we call limited liability corporations, and that those are entitled to dignity and moral consideration as beings unto themselves) have to be offering some of the alternatives, to find out what people want. But because, under modern IP law, something called the Digital Millennium Copyright Act, it is a felony to modify the app without permission, when Meta sent the letter to Apple and Google, they agreed that they would side with Meta. And because you can't modify those platforms to accept apps that haven't run through the store, that was the end of the road for OG App.

28:52

>> But I think this is a little bit of a narrow example. As somebody who gets a huge number of press releases for all these pro-social apps that are built to compete with Instagram and TikTok and all of them, apps that are meant to respect your attention, apps that are meant to be virtuous in a way these apps are not, I watch one after another after another basically go nowhere, get out-competed. The point I'm making is that in the example you're giving, they were able to basically say there was a terms-of-service violation. Maybe they should not be allowed to do that. But people do like these platforms. And this is where I want to make sure my producer has a voice: there are people who just absolutely like TikTok. There are people who like Instagram. They know there are other things out there, and they're not clamoring for a competitor or an alternative. I think suggesting that there is no capacity to switch is going a little far.

29:37

>> No, I'm not saying there's no capacity to switch. I'm saying the higher the switching costs are, the lower the likelihood that people will leave. You know, when we had pop-up ads in our browsers, real pop-up ads, the Paleolithic pop-up ad that was a whole new browser window that spawned one pixel square, autoplayed audio, ran away from your cursor, the way that we got rid of that was that it was legal to modify browsers to have pop-up blockers. More than 50% of us have installed an ad blocker in our browser. Doc Searls calls it the largest consumer boycott in human history. And as a result, there is some moderation upon the invasiveness of what a browser does to you. That is in marked contrast with apps, because reverse engineering an app (because it's not an open platform) is illegal under American copyright law. It violates Section 1201 of the Digital Millennium Copyright Act. And so when we talk about how these platforms have competed their way into toxicity, we're excluding a form of competition that we have made illegal: for example, ad blockers; for example, privacy blockers; for example, things that discard algorithmic suggestions, and so on. Taking those off the table means that the only competitors you get are firms that are capable of doing a sort of holus-bolus replacement, convincing you that, no, you don't want to use Instagram anymore, you want to use TikTok instead, as opposed to: you'd like to use Instagram, but in a slightly different way, one that defends your interests against the firm's interests.

31:02

But I think that we mustn't ever forget that within digital technology, and within living memory, we had a mode of competition that we prohibited, one that often served as a very rapid response to specifically the thing you're worried about here. You know, I have a friend, Andrea Downing, who has the gene for breast cancer, and she's part of a breast cancer previvor group that was courted by Facebook in the early 2010s, and they moved there. And this group is hugely consequential to them, because if you have the breast cancer gene, you are deciding whether to have your breasts removed, your ovaries removed; the women in your life, your daughters, your sisters, your mothers, are dying or sick, and you're making care decisions. This group is hugely important. And she discovered that you could enumerate the full membership of any Facebook group, whether or not you were a member of it. This was hugely important to her friends there. She reported it to Facebook. Facebook said: "That's a feature, not a bug. We're going to mark it 'won't fix.' We're going to keep it." They sued. It was non-consensually settled when the FTC settled all the privacy claims. And they are still there, because they cannot overcome the collective action problem that it takes to leave. Now, they will eventually. When Facebook is terrible enough, that community will shatter, and maybe it will never re-form. That is not a good outcome.

32:18

>> All right, Tim, I want to go to your story here. One of the core tales you tell in your book is about Amazon. So walk me through the process of moving a platform from a kind of healthy, constructive platform to becoming an extractive platform, through your story of what happened with Amazon.

32:38

Um Amazon is you may remember was once

32:41

upon a time a bookstore.

32:42

>> I do remember that actually. It's how

32:44

old I am.

32:45

>> And uh you know their basic idea was be

32:47

bigger and we'll sell more stuff. At

32:49

some point they opened the marketplace

32:52

um the Amazon marketplace which was

32:54

different because it was a platform. In

32:56

other words, it was a place that people

32:58

could come and sell their stuff. At

33:00

first it was used books. Then it spread

33:02

into other markets. And they uh realized

33:06

a few things. one is that fulfillment

33:08

would be very important. You know, eBay

33:10

in the old days, the the sellers had to

33:12

wrap it themselves and and send it off.

33:14

So, that that wasn't a very scalable

33:15

model. Um, and uh they understood they

33:19

had a good search engine. Amazon vested

33:21

hard in search and it worked and more

33:24

and more sellers came, more and more

33:26

buyers came and so the Amazon

33:28

marketplace um took over eBay and became

33:31

very successful and at that point I

33:33

would say you know maybe around 2010 or

33:36

something like that was fulfilling what

33:38

I think you know would would call the

33:40

dream of of the internet age which is a

33:43

lot of people would be able to go on

33:44

this place you know start their thing uh

33:47

make a lot of money it's it coincides

33:49

with the rise of the blog and and small

33:51

online magazines you know that whole era

33:53

that we are are talking about during

33:55

that period Amazon's take was below 20%.

33:59

It kind of depends how you count, but

34:01

you know somewhere between 15 to 20 20%

34:04

>> their take of what a small business is

34:06

>> of the sales of the sales. Yeah. Yeah.

34:07

So if you sold like $100 they take $20.

34:09

I mean it depend a little bit. There

34:11

were some storage fees and so on. So you

34:13

know it was it was a good place to make

34:15

money

34:16

>> and um what changed I think was once

34:20

Amazon had confidence that it had its

34:23

sellers and it had its buyers more or

34:25

less locked up. Um, and this is

34:29

basically over the the the the 2010s.

34:32

They bought a couple companies that were

34:34

potential threats to them. Diapers.com,

34:36

for example. It might seem ridiculous,

34:38

but Diapers, you know, could have been a

34:40

kind of a way in to threaten them.

34:43

>> Why don't you tell the diapers.com story

34:45

for a minute? It's a kind of famous

34:46

story in Amazon, but I think it's worth

34:48

telling. So there was a platform

34:51

launched to be an alternative to Amazon

34:53

and their thought was you know new

34:55

parents diapers you know every parent

34:58

needs diapers delivered quickly so why

34:59

don't we make that the beginning in the

35:01

same way Amazon started with books and

35:03

then Amazon saw this thought it was kind

35:05

of threatening and in the strategy of

35:08

the day just bought them um of course

35:11

the founders pretty happy um and Amazon

35:14

managed basically to capture this market

35:16

and and that's when I think it turned uh

35:18

to the extraction phase. Well, in the

35:20

last 10 years, Amazon's strategy has

35:23

just basically been for its marketplace

35:26

to to turn the screws and increase the

35:29

fees, change the margins so that many

35:32

sellers are paying, you know, over 50%

35:35

or more, you know, basically the same as

35:36

as brick and mortar businesses. And

35:40

Amazon prices are rarely uh any lower.

35:42

they they actually have done a lot to

35:44

try to prevent uh being priced anyone

35:47

pricing lower and I I think the one

35:49

thing I would focus on is their what

35:52

they call advertising which may be

35:54

familiar to you as as sort of the

35:55

sponsored results that you get when

35:57

you're searching. So what's going on

35:59

there is that sellers are bidding

36:01

against each other bidding down their

36:03

own margins to get higher up in the

36:05

search results. And that little trick,

36:08

that sort of one weird trick has become

36:10

this extraordinary cash cow. It's more

36:13

profitable than Amazon Web Services,

36:14

which is sort of surprising.

36:16

>> Last year it was 56 billion.

36:21

>> Just paying Amazon for higher for higher

36:23

rankings in their search results was 56

36:25

billion.

36:25

>> 56 billion. It's looking like it's going

36:28

to be over $70 billion. Corey, when I'm

36:30

searching on Amazon and I see that

36:32

Amazon's choice looks like a little

36:35

prize like that that product won a

36:37

competition uh where a bunch of editors

36:38

chose it as best one. What am I looking

36:40

at there?

36:42

>> So that is broadly part of this thing Tim was discussing, where they're piling on junk fees for the right to be at the top of the results. If you're not paying for Prime, and paying for Fulfillment by Amazon, and paying for all these other things, you aren't eligible. And the more of these you buy, the greater chance you have of being chosen.

36:58

>> But are they literally paying to be Amazon's top choice? I mean, as a dumb consumer, maybe I look at that and I think: oh, this is some algorithmic combination of, is it the best seller, what are its reviews, etc.

>> Mhm. So you're right that it is algorithmic, but the algorithmic inputs are not grounded primarily in things like quality or customer satisfaction. They're grounded in how many different ways you've made your business dependent on Amazon, in such a way that every dollar you make is having more and more of that dollar extracted by Amazon. There's some good empirical work on this from Mariana Mazzucato and Tim O'Reilly, where they calculate that the first result on an Amazon search engine results page, on average, is 17% more expensive than the best match for your search. So what you're seeing is basically that the Amazon top choice is the worst choice.

37:52

>> So this really feels to me like a place where, to use Cory's word, things enshittified. When I go around the internet now, when I play something in a Spotify playlist, or click on a song I like and move to the radio version of Spotify, or when I search something on Google, or when I search something on Amazon: these used to be very valuable services to me. To search for something on Amazon and see rankings weighted by how popular the product is, how high the reviews are. I took the weighting of the search as, to some degree, a signal of quality. Certainly with Google, the whole idea was that what comes first in search was built on PageRank, and it was going to be quality. And now there is so much sponsored content in every one of these results, and it is so unclear what is what, and who is paying for what, and why I'm getting this song or that result. This whole industry, or part of the industry... you know, one reason I ended up on these platforms is because I trusted these results. And now I trust nothing.

39:01

>> Yeah. I mean, going back to the definition of extraction: we are kind of paying $70 billion collectively to make search worse.

39:10

>> So when does this move from "this is just their business model, and if you want to find something else, go buy something at Walmart, go buy something at Target, go buy something at Best Buy" (you can do all those; I've done all those; I just ordered a blender from Kohl's) to "we've moved to extraction, and we should see it as a public policy problem"?

39:29

>> Yeah, I think that's a really great question. It's the kind of question we've faced repeatedly in history. When a business model starts to settle down, you see less real disruptive competition possible. And Amazon is still, you know, a great way to find a lot of product. It's the world's largest marketplace. But I would say they're running themselves like an unregulated monopoly. And I guess I would compare it to electricity. I mean, we'd all say the electric network is great. We can't do without it; they provide this incredible service. But would we really say, okay, we're just going to put up with whatever prices they charge? I don't think we would. And I think at some level, once a market has settled, at some point you've got to call a limit. And we do that in many other markets.

40:22

>> Both of you spend a lot of time on the number of small acquisitions that these companies make. So not where Google buys Waze, but where Google buys something very modest, and maybe many of them get shut down, or they acqui-hire the top people, but they're also things that might have grown into something bigger. On the other side, sometimes it really is the case that a big player buying something smaller can scale it up into something new. Like Google (this was actually a fairly big acquisition) bought Waymo, and, kind of amazingly, they seem to have made driverless cars work, and I think access to Google's compute and other things was not insignificant in that. And you can look at other cases where these companies, buying something small, are able to build it into something that ends up being a great option in Microsoft Office, or in Google Docs, or whatever it might be. So how do you think about the ways in which that harms competition? But also, you know, I've known founders who get acquired and are excited to get acquired, because they think it will give them scale and the capacity to compete in a way they wouldn't versus Google just trying to do it itself. The antitrust level is one thing, but the sort of anti-competitive-versus-pro-scale level is a much bigger challenge, the way Silicon Valley now works. And I'm curious to hear you talk through the pros and the cons of that.

41:47

>> You know, Joseph Shumpeder back in 1911

41:50

wrote a book about entrepreneurs

41:52

basically. And he said, you know, these

41:53

very unusual people who are willing to

41:55

go out and start a, you know, take these

41:57

kind of risks. They have some vision.

41:58

They do this kind of thing. and uh he

42:01

thought they were essential to economic

42:03

growth that they were these kind of

42:05

unusual almost like superheroes and

42:08

would do they do do these things and go

42:10

out and take these chances. the United

42:12

States economy in general has thrived

42:14

because it has a lot of those kind of

42:16

individuals and they can start things

42:18

and I think we've aired too far in

42:22

having all the brains under one roof and

42:25

you know it's starting to remind me of

42:27

of AT&T in in the '60s or IBM where they

42:30

they sort of became much more

42:33

centralized about innovation and big

42:35

ideas would never be developed. It

42:37

became kind of group thinky I think when

42:39

the justice department did a deal sued

42:42

AT&T tried to break them up and they

42:44

forced them AT&T to stay out of

42:47

computing forever and also license all

42:49

of their patents including the

42:51

transistor patent and all kinds of

42:53

people started quitting their jobs and

42:54

saying I'm going to start a

42:54

semiconductor firm and there lies the

42:57

origins of US semiconductors and also

42:59

frankly US computing without AT&T. So I

43:03

think we have done much better with

43:05

divided technological leadership. I I

43:08

frankly think that you know LLMs might

43:11

never have gotten started without open

43:13

AI being an alternative force because

43:15

they're obviously threatening to

43:16

Google's business model though. Don't in

43:18

a way you have to give Google some

43:19

credit on LM specifically. They you were

43:22

talking about transistors a minute ago,

43:24

but Google does the fundamental research

43:26

in transformers and releases it publicly

43:29

and and and creates in many ways the

43:31

industry

43:31

>> but doesn't do anything with it

43:33

internally until there's a competitor

43:34

that threatens them.

43:35

>> Yeah, that's right. That it's just

43:36

striking how good of an actor they were

43:38

for a period on AI specifically, right?

43:41

Like treating it like like they had a

43:42

Bell Labs.

43:44

>> I agree with that. It actually is a lot

43:45

like Bell Labs in the sense that Bell

43:46

Labs kept inventing stuff. I mean, Bell

43:48

Labs collected these, you know, a lot of

43:49

amazing people and then never let things

43:52

come to market. The internet being

43:54

probably the best example of it.

43:56

>> Yeah. I I um so I I I think when you

44:00

look at these companies and their

44:01

acquisitions, what you see is that these

44:03

companies very quickly suffer from what

44:05

both Brandeise and Tim called uh the

44:07

curse of bigness. Um, they find

44:09

it very hard to bring an actual product

44:11

to market that they invent in-house. When

44:13

you look at Google, they've had like one

44:15

really successful consumer-facing

44:16

product launch and that was in the

44:19

previous millennium and almost

44:21

everything they made in this millennium

44:22

failed, right? It

44:24

either didn't launch or, after it

44:26

launched, they shut it down. Whereas

44:28

their giant successes um their video

44:30

stack, their ad tech stack, documents,

44:32

collaboration, maps, um their uh

44:36

navigation, server management, mobile, all of

44:38

this stuff, right?

44:41

These are companies they acquired from

44:42

someone else and operationalized. And

44:44

I'm an ex-ops guy. I'm a recovering

44:46

sysadmin. So I'm not going to say that

44:47

that's nothing, right? It's it is a

44:50

skill unto itself. The careful work to

44:52

make things work and make them resilient

44:54

and scale them.

44:56

>> But the idea that that has to happen

44:57

under one roof, I think is is um a false

45:01

binary, right? I mean, one of the things

45:02

Google did arguably far more efficiently

45:05

than they hired

45:07

innovators is they hired operations

45:09

people. uh and and those are the people

45:11

who really do the yeoman service at Google

45:13

cuz the innovators, the product managers

45:15

never get to launch. They only get to

45:17

buy other people's products and refine

45:19

them. You know, it comes down to what

45:22

you think of as the track record, I

45:24

guess, of monopolized innovation. And it

45:26

has some hits, but I'm saying a much

45:29

more mixed model, I think, historically

45:31

is a lot stronger. Um, if you look at

45:33

the '70s, '80s, if you look at the entire

45:35

track record of US innovation, I think

45:37

monopoly innovation, you know, leads you

45:40

towards AT&T, Boeing, um, you know,

45:43

General Motors kind of model as opposed

45:45

to what the best of Silicon Valley has

45:47

been.

45:48

>> And meanwhile, I think you mentioned

45:49

acqui-hires. For people who aren't

45:52

unfortunate enough to be steeped in the

45:54

business of Silicon Valley, an acqui-hire

45:55

is when a company is purchased not for

45:58

the product it makes but because the

45:59

team who made it have proved they can

46:01

make a product and then they shut down

46:02

the product and they hire the team and

46:06

acqui-hires are, I think, a leading

46:08

indicator of pathology in tech and

46:11

investment. An acqui-hire is basically a

46:14

postgrad project where venture

46:17

capitalists sink some money into you

46:18

pretending that you're going to make a

46:20

product. It's a science fair demo in the

46:22

hopes that the company will buy you and

46:25

in lieu of a hiring bonus will give you

46:26

stock, and in lieu of a finder's fee will

46:28

give them stock. But no one's trying to

46:31

actually capitalize a product or a

46:33

business. I think anytime you see a

46:36

preponderance of acqui-hires in your

46:37

economy that should tell you that you

46:39

need to sit down and figure out how to

46:40

rejigger the incentives because your

46:41

economy is sick.

46:43

Cory, we've been talking here about

46:45

these markets as really having two

46:49

players in them, which is well maybe

46:51

three. We've been talking about users,

46:53

sellers, and platforms.

46:55

But something your book focuses quite

46:57

a bit on is a fourth, which we need to

46:59

talk about too, which is labor.

47:01

>> There are huge numbers of people working

47:02

for these companies, huge number of

47:04

people delivering Amazon uh packages and

47:07

Walmart packages.

47:09

And one thing that that both of you

47:11

focus on is the way in which as these

47:14

companies become bigger and more

47:15

dominant, their labor practices can

47:18

become, I don't know if enshittification is

47:21

the term you would use there,

47:22

but shittier or more extractive.

47:26

Can you talk a bit about that side of

47:28

it? What has happened to the labor

47:29

practices?

47:31

>> Yeah, I I mean we could we could talk

47:32

about the other tech workers, right? The

47:35

majority of tech workers drive uh for

47:37

Uber or for Amazon or work in a

47:39

warehouse and they certainly don't get

47:41

like free kombucha and massages and a

47:43

surgeon who'll freeze their eggs so they

47:44

can work through their fertile years.

47:46

They're in a factory in China with

47:48

suicide nets around it. But uh I think

47:50

if we want an example that kind of pulls this

47:52

all together, how you get monopoly,

47:54

regulatory capture, the degradation of

47:57

labor with technology that is uh relies

47:59

on blocks on interoperability. I I think

48:01

we could do no better than to talk about

48:04

nurses. Um, and I'm going to be making

48:06

reference here to the work of Veena

48:08

Dubal, who's a legal scholar who coined

48:09

a very important term, algorithmic wage

48:11

discrimination. In America, hospitals

48:14

preferentially hire nurses through apps,

48:18

and they do so as contractors. So,

48:20

hiring contractors means that you can

48:22

avoid the unionization of nurses. And

48:26

when a nurse signs on to get a shift

48:28

through one of these apps, the app is

48:31

able to buy the nurse's credit history.

48:33

And the reason for that is that the US

48:36

government has not passed a new federal

48:39

consumer privacy law since 1988 when

48:42

Ronald Reagan signed a law that made it

48:44

illegal for video store clerks to

48:45

disclose your VHS rental habits. Every

48:48

other form of privacy invasion of your

48:51

consumer rights is lawful under federal

48:53

law. And so among the things that data

48:56

brokers will sell anyone who shows up

48:59

with a credit card is how much credit

49:01

card debt any other person is carrying

49:03

and how delinquent it is. And based on

49:06

that, the nurses are charged a kind of

49:08

desperation premium. The more debt

49:11

they're carrying, the more overdue that

49:14

debt is, the lower the wage that they're

49:16

offered on the grounds that nurses who

49:20

are facing economic privation and

49:24

desperation will accept a lower wage to

49:26

do the same job. Now, this is not a

49:29

novel insight, right? Paying more

49:30

desperate workers less money is a thing

49:32

that you can find in like Tennessee

49:35

Ernie Ford songs about 19th century coal

49:38

bosses. But the difference is that if

49:40

you're a 19th century coal boss who

49:41

wants to figure out the lowest

49:44

wage each coal miner you're hiring is

49:46

willing to take, you have to have an

49:47

army of Pinkertons that like are

49:49

figuring out the economic situation of

49:51

every coal miner. And you have to have

49:53

another army of guys in green eye shades

49:55

who are making uh annotations to the

49:56

ledger where you're calculating their

49:58

pay packet. It's just not practical. So

50:00

automation makes this possible. And you

50:02

have this vicious cycle where the poorer

50:05

a nurse is, the

50:08

lower the wage they're offered. And as

50:10

they accumulate more consumer debt,

50:12

their wage is continuously eroded.
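A minimal sketch, in Python, of the wage-setting logic being described here. It is purely illustrative: the field names, coefficients, and wage floor are hypothetical, invented for this example rather than taken from any real staffing app.

# Illustrative sketch of algorithmic wage discrimination as described
# above: the offered shift rate falls as the worker's purchased
# financial profile looks more desperate. All numbers are hypothetical.

def offered_wage(base_rate: float, card_debt: float,
                 days_delinquent: int) -> float:
    """Return an hourly offer discounted by a 'desperation premium'."""
    # Made-up rule: more debt and deeper delinquency lower the offer,
    # down to a floor of 60% of the base rate.
    desperation = min(card_debt / 10_000, 1.0) + min(days_delinquent / 90, 1.0)
    discount = min(0.20 * desperation, 0.40)
    return base_rate * (1 - discount)

print(offered_wage(60.0, card_debt=0, days_delinquent=0))        # 60.0
print(offered_wage(60.0, card_debt=12_000, days_delinquent=120)) # 36.0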

50:15

And I think we can all understand

50:17

like intuitively why this is unfair and

50:19

why as a nurse you might not want it,

50:21

but also like do you really want your

50:23

catheter inserted by someone who drove

50:25

Uber till midnight the night before and

50:27

skipped breakfast this morning so they

50:28

could make rent? This is a thing that

50:29

makes everyone except one parochial

50:31

interest worse off. And this is not a

50:34

free floating economic proposition. This

50:36

is the result of specific policy choices

50:39

taken in living memory by named

50:41

individuals who were warned at the time

50:43

that this would be the likely outcome

50:45

and who did it anyway.
>> I want to stay on

50:48

the labor question on a couple

50:49

other levels, but I I want to ladder

50:51

this one up for a second, Tim.

50:52

>> Sure.

50:53

>> Which is because I think this is getting

50:54

at something we're starting to hear a

50:55

lot about, which is anger over

50:58

algorithmic pricing of various kinds.

51:00

So, when I was walking up to do the

51:01

podcast today, the chyron on CNN was

51:04

about uh an investigation finding that

51:06

Instacart was uh charging many different

51:08

people many different prices.

51:10

>> And so, the price you were seeing on

51:12

Instacart wasn't the price, it's your

51:14

price. And I could imagine a

51:17

neoclassical economist sitting in my

51:19

seat right now and saying

51:22

pricing becomes more efficient when it

51:25

discriminates. That the market will be

51:28

more efficient if it can charge, you

51:30

know, Ezra a higher price for kombucha,

51:34

uh, if I'm getting that delivered, uh,

51:36

because of things it knows about me and

51:37

my kombucha habits. And it charges

51:39

somebody else a lower price because it

51:40

knows they value the kombucha less, or

51:42

pays a nurse a higher

51:44

wage or a lower wage depending on their

51:47

uh, situation. That in fact, we're just

51:49

getting better and better and better at

51:50

finding the market clearing price. And

51:53

this is what economics always wanted,

51:55

right? we're we're finally hitting the

51:57

utopia of every person having, you know,

51:59

the market clearing wage and the market

52:01

clearing price. Uh why don't you agree

52:03

with that?

52:04

>> Yeah, I mean the fundamental question is

52:06

is that really the kind of world you

52:08

want to live in? In other words, do you

52:10

constantly want to live in a place where

52:12

you are being charged the maximum you

52:15

would pay for something? Now you know

52:17

that could redound to the benefit of

52:18

people who are very poor. But in

52:21

economic terms it's always only about

52:24

producers taking everything from the

52:27

market. And, you know, just moving

52:29

away from the

52:31

efficiency of it, I think it

52:34

makes for a very unpleasant

52:37

lifestyle to be constantly feeling

52:39

you're being exploited. And the other

52:41

thing I'll say is there's also a huge

52:43

amount of effort people make trying to

52:44

move what category they're in um and you

52:47

know, pretend to be poor. So I think

52:48

it is overrated

52:51

and relies on, I guess,

52:56

overly simplistic models of what makes

52:58

people happy.
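What the neoclassical framing celebrates is, in effect, first-degree price discrimination. A toy sketch of the mechanism under debate, with invented numbers:

# Toy model of personalized pricing: each shopper is charged close to
# their estimated willingness to pay (WTP) instead of one posted price.
# The WTP estimates and the 0.95 markup rule are hypothetical.

def personalized_price(cost: float, estimated_wtp: float) -> float:
    """Charge just under the buyer's estimated WTP, never below cost,
    so the seller captures nearly all of the surplus."""
    return max(cost, 0.95 * estimated_wtp)

# Two shoppers, same kombucha, different inferred budgets:
print(round(personalized_price(cost=2.00, estimated_wtp=8.00), 2))  # 7.6
print(round(personalized_price(cost=2.00, estimated_wtp=3.00), 2))  # 2.85

In the textbook model every sale above cost still happens, which is why it scores as efficient; the objection in this conversation is distributional, about who keeps the surplus.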

53:00

There's a way in which efficiency

53:03

is an interesting term in

53:05

economics

53:06

because in economics as in life you want

53:11

things to be somewhat efficient

53:14

>> but too much efficiency becomes truly

53:16

inhuman. Uh I find this even in the the

53:20

the very modest uh example of like

53:22

personal productivity uh efforts. You

53:25

know, it's great to have a to-do list.

53:27

>> If I really force myself onto the

53:29

scaffolding of a to-do list at all

53:31

times, I I feel like I cease to be a

53:32

human being and become a kind

53:34

of machine always just getting things

53:36

done and responding to the emails. And

53:38

this is a place where I think it was

53:40

important when you said it raises

53:42

the question of what kind of world you

53:44

want to live in because the truth is

53:46

that I don't want to live in a maximally

53:49

efficient world. I have other competing

53:51

values. You know the competitive

53:54

efficient market is good up to a point

53:55

and after a point it becomes something

53:58

corrosive to human bonds, human

54:00

solidarity. Just-in-time scheduling

54:03

makes sense from the perspective of

54:05

economic efficiency and not if you want

54:07

healthy families in your society.

54:10

And I think being able to

54:12

articulate that question of what kind of

54:14

world you want to live in, not just what

54:15

kind of economy works on models, I think

54:18

is important and often a

54:21

lost political

54:22

art in my view.

54:24

>> Yeah, I I I agree. And I feel there's,

54:26

you know, there are some intuitive

54:28

feelings like people feel it's unfair.

54:30

People don't like being ripped off, but

54:33

people hate paying junk fees. The the

54:35

original word for that, by the way, was

54:36

[ __ ] fees, but inside

54:39

government we felt we

54:41

couldn't have the president say that.

54:43

So, yeah, I think that gets to the heart of

54:45

the matter. I mean you had also talked

54:47

about you know human attention and human

54:50

attention turns out to be quite

54:52

commercially valuable. But do

54:55

you really want every second of your

54:56

time and every space you inhabit to

55:02

be mined for your attention and its

55:04

maximum value even if that contributes

55:07

to the I guess overall GDP of the

55:10

economy? I mean, I'd like to have some

55:11

time for my kids and friends in which no

55:13

one's making any money. And you know,

55:15

it's an example of a commodity that is

55:16

very close to who we are. Um, at the end

55:20

of your days, you know, what your life

55:22

was was what you paid attention to. And

55:25

the idea that you can with maximum

55:26

efficiency mine that at every possible

55:30

moment seems to me a recipe for a very

55:32

bad life.
>> I think one way to frame this

55:35

rather than around efficiency is around

55:38

optimization.

55:39

And I I think that we can understand

55:41

that for a firm the optimal arrangement

55:45

is one in which they pay nothing for

55:48

their inputs and charge everything for

55:50

their outputs. So with optimization, things

55:53

are optimal from the perspective of the

55:55

firm when they can discover who is most

55:58

desperate and pay them as little as

56:00

possible or who is most desperate and

56:02

charge them as much as possible. But

56:04

from the perspective of the uh users and

56:08

the suppliers, things are optimal when

56:11

you get paid as much as possible and are

56:13

charged as little as possible. And so

56:16

much of kind of the specific

56:18

neurological injury that arises from

56:21

getting an economics degree is organized

56:24

around never asking the question sort of

56:26

optimal for whom. I mentioned before

56:28

that we don't have any privacy law in

56:30

this country. One of the things that a

56:31

privacy law would let us do is to become

56:34

unoptimizable.

56:35

All optimization starts with

56:37

surveillance. Whether it's things like

56:40

TikTok trying to entice your kids to

56:42

spend more time than they want to

56:43

spend there, or whether that's

56:45

advertisers uh uh finding ways to follow

56:49

you around and hit you up with things

56:50

that you're desperate for, or whether

56:53

it's discrimination in hiring or in

56:55

lending. All of this stuff starts with

56:57

an unregulated surveillance sector. We

57:00

have platforms that take our data and

57:01

then sell it and use it and and recycle

57:03

it and become sort of the Lakota of

57:05

information where they use the whole

57:07

surveillance package. Uh and uh we do

57:10

nothing to curb that behavior. It is not

57:14

an incredible imaginative lift to say

57:16

that we might tell them to stop.
>> I want

57:18

to pick up on surveillance because when

57:20

you talk about the

57:24

harms to an economy working in a human

57:25

way, I think that the

57:29

new frontiers in how you can surveil

57:31

workers

57:32

>> Mhm.

57:33

>> are, I think, going to become a

57:36

very big political issue and probably

57:37

should be already.

57:39

>> So the category that this

57:40

falls into, it's broadly called bossware

57:43

and there's a whole lot of different

57:44

versions of it. Like if your firm buys

57:46

Office 365,

57:48

Microsoft will offer your boss the

57:50

ability to stack rank divisions within

57:52

your firm by like how often they move

57:54

the mouse and how many typos they make

57:55

and how many words they type. And then

57:57

this is amazing. They will tell you how

57:59

you perform against similar firms in

58:01

your sector, which is like the most

58:03

amazing thing I can imagine that that

58:04

Microsoft is finding customers for a

58:06

sales pitch that says, "We will show you

58:08

sensitive internal information about

58:10

your competitors." And apparently none

58:11

of those people are like, "Wait, doesn't

58:13

that mean you're going to show my

58:14

competitors sensitive commercial

58:16

information about me?" So you have this
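As a sketch of the kind of stack ranking being described: score each division on crude activity telemetry, then sort. The metric names and weights below are hypothetical and do not reflect any vendor's actual formula.

# Hypothetical bossware stack ranking: divisions scored on raw
# activity counts and ranked against one another.

telemetry = {
    "sales":       {"mouse_moves": 91_000, "typos": 310, "words": 48_000},
    "engineering": {"mouse_moves": 52_000, "typos": 120, "words": 61_000},
    "support":     {"mouse_moves": 77_000, "typos": 540, "words": 39_000},
}

def activity_score(m: dict) -> float:
    # Made-up weighting: reward raw activity, penalize typos.
    return m["mouse_moves"] * 0.001 + m["words"] * 0.002 - m["typos"] * 0.5

ranked = sorted(telemetry, key=lambda d: activity_score(telemetry[d]),
                reverse=True)
for rank, division in enumerate(ranked, start=1):
    print(rank, division, round(activity_score(telemetry[division]), 1))

So you have this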

58:18

on the kind of the the, you know,

58:20

broad-strokes level, but I have this

58:22

notion I call the shitty technology

58:24

adoption curve, right? If you've got a

58:26

really terrible idea that involves

58:29

technology that's incredibly harmful to

58:31

the people it's imposed on, you can't

58:33

start with me. I'm a mouthy white middle

58:36

class guy with a megaphone and when I

58:38

get angry, other people find out about

58:40

it. You have to find people without

58:41

social power and you grind down the

58:44

rough edges on their bodies. You start

58:46

with prisoners. You start with uh people

58:48

in mental asylums. You start with

58:50

refugees, and then you work your way up

58:52

to kids and then high school kids and

58:53

blue-collar workers and pink-collar

58:55

workers and white-collar workers. And it

58:56

starts with like the only people who eat

58:58

dinner under a CCTV are in supermax. And

59:00

20 years later it's like no, you were

59:02

just dumb enough to buy a home camera

59:04

from like Apple or Google or god help us

59:05

all Facebook. Right? So that is the

59:07

shitty technology adoption curve. And if

59:09

you want to know what the future of

59:10

workers is, you look at the least

59:11

privileged workers at the bottom and

59:13

then you see that technology working its

59:15

way up. If you look at drivers for

59:17

Amazon, they have all these sensors

59:18

pointed at their faces, sensors studded

59:20

around the van. Um, they're not given a

59:22

long enough break even to deal with

59:24

things like period hygiene. And so, uh,

59:27

women who drive for Amazon who go into

59:28

the back of the van to deal with their

59:30

periods discover that, uh, that's all on

59:32

camera cuz that's all being recorded.

59:35

Um, all of this stuff is subject to both

59:37

manual and automated analytics. And at

59:39

one point, Amazon was docking drivers uh

59:41

for driving with their mouth open

59:43

because that might lead to distraction

59:45

while driving. And so, as you say, it

59:47

kind of denudes you of all dignity. It

59:50

really is very grim. And, you know, Tim

59:51

and I used to ride the Toronto Transit

59:53

Commission buses to school in

59:56

the morning when we were going to

59:57

elementary school, and we loved the

60:00

drivers who would sing and tell jokes

60:03

and remember you. This is the thing that

60:05

makes uh working uh in the world, being

60:08

in the world great. It's having a human

60:10

relationship with other humans, not

60:13

having standardized labor units that

60:15

have been uh automated and standardized

60:18

to the point where they can be swapped

60:19

out. You know, if you give a a cashier a

60:22

cash register, instead of making them

60:23

add up things on the paper, you could

60:25

give them the surplus to talk with the

60:27

customers and have a human relationship

60:29

with them. Or you could speed them up so

60:32

that you fire nine in ten of the cashiers

60:34

and you take the remainder and you make

60:36

them work at such an accelerated pace

60:37

that they can't even make eye contact.

60:40

>> There were things in Cory's description

60:42

there, in his answer, that in my

60:46

view we should just make a social

60:48

decision to outlaw. Like I am willing to

60:50

say politically like I want to vote for

60:52

the people who think you can't

60:54

surveil workers' eyeballs

60:56

>> and if other people want to stand up and

60:58

say the surveillance of workers' eyeballs

61:00

is great, that's a good values

61:02

debate to have in a democracy and and

61:04

and I know where I I fall on that. Then

61:06

there are other things. You know, I'll

61:08

build on

61:10

the cash register example to say that

61:14

I really struggle with what I think as a

61:16

public policy measure one should think

61:20

about the rise of automated checkout in

61:22

the way we've seen it. I watch people

61:24

turned into these managers of machines.

61:26

So, they've gone from being somebody

61:27

who, you know, did check out with me and

61:29

asked me how my day was and I asked them

61:30

how their day was, and now they get

61:33

called over because the three apples I

61:36

put on the weighing machine didn't weigh

61:38

in correctly, and it seems dehumanizing

61:40

to them, dehumanizing to me. I also get

61:43

it. How do you think about weighing

61:46

that? Right. There's the stuff that is

61:48

genuinely like grim and dystopic and

61:50

maybe we should just outlaw. And then

61:52

there is stuff like

61:54

the just generalized automation in which

61:56

there genuinely can be a consumer

61:58

surplus from that. Like time is a

61:59

surplus for me. Things moving faster is

62:01

a surplus for me. More checkout stations

62:03

is a surplus for me. And there's a cost

62:06

on the other side of it.

62:08

>> Well, the first thing I'd say is we

62:09

should be making more of these kind of

62:11

decisions

62:12

>> about what we really care about and what

62:14

kind of world we want to inhabit. I

62:16

mean, one of the things that I think

62:17

happens is by default, um, we don't pass

62:20

any laws or have new ethical codes. I

62:23

mean, ethics does a lot of work and we

62:25

just sort of allow a trump card to

62:28

new stuff because it's new. And, you

62:30

know, I I get that you don't want to ban

62:32

everything new that shows up, but I feel

62:34

that we have over the last 15 years or

62:36

so sometimes just kind of taken a

62:39

position that, you know, the people

62:41

don't get to vote on this. I mean, a

62:43

good example is everything to do with

62:44

children. Um, you know, I don't think

62:46

there's a lot of people uh who think

62:49

it's a great thing to surveil children

62:51

and, you know, uh have targeted ads for

62:54

children and, you know, try to

62:57

create addictive technologies for

62:59

children. Uh, you know, when I worked in

63:00

government, we tried to pass just basic

63:03

even child privacy laws. We couldn't get

63:04

a vote ever. And so, one of the things

63:06

that's going on is we're not even

63:08

deciding these things as a society. And

63:11

that that gets to, you know, the problem

63:12

of Congress not taking votes on popular

63:15

issues. But I also think this relates to

63:18

our conversation earlier about

63:21

competition and when it's good and when

63:23

it's bad because I think for almost any

63:26

endeavor, there's such a thing as

63:29

healthy competition and such a thing as

63:31

toxic competition. You know, we

63:33

were talking about attention

63:35

markets earlier. What is good healthy

63:37

competition in attention markets? It's

63:39

like making really great movies,

63:42

new TV shows that people love, you know,

63:45

podcasts that people want to listen to.

63:48

Uh toxic competition was the stuff

63:49

you're talking about. Essentially,

63:50

different forms of manipulation and

63:53

addiction. And we've had this kind of

63:55

like hands-off, "we cannot try to direct

63:58

things in a positive direction" attitude. I think

64:00

that has been a giant mistake. So, first

64:01

I would say we have to even try to make

64:04

the decisions. You know, how would I do

64:06

the tradeoff? I mean, I guess I would

64:08

start with the most irredeemably toxic

64:11

stuff and ban that first and then see if

64:14

we can. You know, I mean, that's maybe

64:16

easy, but we haven't been able to even

64:18

do that. And I I was sort of shocked

64:20

when I worked in government that we just

64:21

could not get a vote on what seemed like

64:23

obvious stuff, like privacy laws. I mean,

64:26

even national security was really into

64:27

this stuff. They're like, it's too easy

64:28

to spy on everybody. And um you know,

64:31

that's a problem for us as a national

64:33

security issue. And we just could not

64:36

get a vote on even the most basic

64:38

anti-surveillance law, which would suggest

64:39

like if you download a dog walking app

64:42

it shouldn't be just like tracking you

64:44

and uploading every kind of information

64:45

about you. That should be illegal.
>> I

64:48

have been, uh,

64:52

disturbed that we've not been

64:53

able to do more on surveillance and

64:54

privacy and I've also been struck by how

64:57

badly what has been done elsewhere seems

65:00

to have worked out. Um, I call this

65:02

terms and conditions capitalism where

65:04

you just move the burden onto the the

65:06

the consumer. So, Europe uh has put out

65:10

some very sweeping rules that have given

65:12

me the opportunity to individually

65:14

decide which of the 303 cookies on every

65:16

website I visit might be good or might

65:18

be bad. Um, similarly, nobody's ever in

65:21

my view to a first approximation read an

65:23

iOS terms and conditions update. And I

65:26

have found that very

65:28

often it seems to me where policy makers

65:31

end up after the debate is saying well

65:34

as long as there is disclosure then the

65:36

consumer can decide but the consumer in

65:39

a very rational way does not want to

65:41

decide.

65:42

>> Yeah.

65:43

>> So it has ended up, I think, in a very

65:44

dispiriting place. Instead of creating a

65:49

structure in which I'm confident what

65:51

companies are are doing is well-bounded,

65:55

it has demanded of me a level of

65:57

cognitive work I'm not willing to do and

65:58

I think nobody else is willing to do to

66:00

oversee those companies myself with not

66:04

really great options if I don't like

66:07

what they're doing. And and and so I'm

66:09

curious how you think about that.

66:11

>> No, I couldn't agree more. I feel like

66:13

if the byproduct of government action

66:16

is that you are clicking on more little

66:19

windows, that is government failure.

66:22

And I would trace it to

66:25

frankly a lack of courage on on the part

66:29

of government um and the regulators or

66:32

the officials, you know, to make

66:35

decisions that are really supposed to

66:37

help people. It's much easier to

66:39

say, well, you know, I'm afraid to do

66:41

something, so I'm gonna let them

66:42

decide. So, I agree. I think the GDPR

66:45

has actually failed to prevent

66:46

surveillance.

66:47

>> That being the European regulation that

66:48

created all those pop-ups.
>> Yeah. GDPR,

66:52

the European privacy law, succeeded,

66:55

you know, in creating a lot of pop-ups

66:57

and things to mess with, and succeeded in

66:59

making it harder to challenge um big

67:02

tech companies in Europe because they're

67:03

overregulated and the little guys have

67:05

to also go through all this

67:07

stuff. And so yes, I think this has been

67:09

a failure. I think for people to start

67:11

to believe in government again, it has

67:14

to help us in situations where we are

67:17

not strong enough to deal with something

67:20

much more powerful or something that has

67:22

a lot more time to think about it. I

67:24

mean, it's like we're playing poker

67:26

against experts. You know, at some point

67:28

we need to get backbone and have

67:31

government on people's side. Now, I'm

67:32

starting to sound like a politician, but

67:34

but I mean it. People say that,

67:35

but really doing it means, you

67:37

know, helping people when they are

67:40

powerless or distracted or don't have

67:42

energy to deal with things.

67:44

>> Cory?
>> So look, I love you both, but I

67:46

think you're dead wrong about the GDPR

67:48

just as a factual matter about where it

67:50

comes from, what it permits, what it

67:52

prohibits, and why it failed. Cuz I I

67:54

agree it failed. So, you may ask

67:56

yourself, how is it that GDPR compliance

67:59

consists of a bunch of cookie compliance

68:00

dialogues? And the answer to that is

68:03

that European federalism allows tax

68:05

havens to function within the

68:07

federation. One of the most notorious of

68:09

those is Ireland. And almost every

68:11

American tech company pretends that it's

68:12

Irish so that its profits can float in a

68:15

state of untaxable grace in the Irish

68:17

Sea. And because of the nature of the

68:20

GDPR, enforcement for uh these [ __ ]

68:24

cookie popups, which are the uh progeny

68:27

of the big American tech companies,

68:30

starts in Dublin with the Irish data

68:31

commissioner, who to a first

68:33

approximation does nothing.
>> That

68:35

sounds bad, but I want to get you to

68:36

explain the core mechanism you're

68:38

describing here better cuz I actually

68:39

don't know it. Because that bill

68:42

did pass and then all of a sudden the

68:43

entire internet filled with these

68:44

pop-ups.
>> So that's only because the

68:47

companies went to Ireland, broke the

68:49

law, and said, "We're not breaking the

68:51

law, and if you disagree, you have to

68:53

ask the Irish data commissioner to

68:54

enforce against us." A few people, um,

68:57

Johnny Ryan with the Irish Council for

68:59

Civil Liberties, uh, Max

69:02

Schrems with noyb, this none of your

69:05

business, this, uh, European

69:07

nonprofit. They've dragged some of those

69:08

cases to Germany. More importantly,

69:10

they've got the European Commission to

69:12

start modifying the way the law works.

69:13

So you can just tick a box in

69:15

your browser preferences and it can come

69:16

turned on by default that says I don't

69:18

want to be spied on and then they're not

69:20

allowed to ask you. So the answer is

69:22

just going to be no.
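The real-world version of that checkbox is the Global Privacy Control signal, which participating browsers transmit as the HTTP header Sec-GPC: 1. A minimal sketch of a server honoring it; the handler and its return strings are hypothetical.

# Minimal sketch of honoring a browser-level opt-out. Global Privacy
# Control really is sent as the header "Sec-GPC: 1"; everything else
# here is a simplified stand-in.

def handle_request(headers: dict) -> str:
    if headers.get("Sec-GPC") == "1":
        # The user opted out once, in the browser. No cookie pop-up,
        # no per-site dialog: the answer is just no.
        return "serve page with no tracking and no data sale"
    return "serve page; tracking permitted, subject to other law"

print(handle_request({"Sec-GPC": "1"}))
print(handle_request({}))

And so I think that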

69:25

corporations want you to think that it is

69:28

transcendentally hard to write a good

69:31

law that bans companies from collecting

69:34

data on you. And what they mean is it's

69:36

transcendentally hard to police

69:38

monopolies once they've attained

69:40

monopoly status because they are more

69:42

powerful than governments. And if that's

69:44

their message, then a lot of us would be

69:45

like, well, we need to do something. We

69:47

need to turn the cartel into a rabble

69:49

again. As opposed to, God, I guess

69:51

governments just have no role in solving

69:53

this problem.
>> The one place where I do

69:55

disagree with you having covered a lot

69:56

of different cartels and rabbles

69:59

lobbying Congress. I mean, it's not

70:01

easy to regulate the association of

70:04

community banks for instance. When you

70:06

have something where there are in every

70:09

single district like individual leaders

70:12

of the district who will come and lobby

70:13

their member of Congress, it's really

70:14

hard. I am not saying that monopolies

70:16

are good because they make it easier to

70:18

to regulate. I'm just saying that it

70:19

doesn't solve the problem that

70:22

government runs on money and influence

70:25

and on top of that it's very hard to get.

70:28

>> Yeah. So we can do that. I want

70:30

to ask but I I want to build on this and

70:31

ask Tim about a separate but related

70:34

question. Tim, you mentioned a second

70:36

ago sort of the entertainment industry

70:38

and one of the questions about to come

70:40

up is whether Netflix should be able to

70:43

buy all of the assets of or all the

70:46

entertainment assets I should say of

70:48

Time Warner. And this is one where I

70:53

think people who care about the quality

70:56

of the media we consume seem for reasons

71:00

that seem compelling to me very very

71:01

worried about having that happen. How

71:03

would you think about that? And is this

71:05

a place where we need to be say making

71:08

values judgments that are different than

71:09

our antitrust judgments? Is this a place

71:11

where the antitrust laws can suffice? Is

71:14

everybody just worried about something

71:15

they don't need to be worried about? How

71:16

do you see it?

71:17

>> Yeah. No, I I think this is a place

71:19

where if the antitrust laws are enforced

71:22

uh correctly and fairly, the

71:25

merger or the acquisition would be

71:27

blocked. And I I'd say that this is not

71:30

a particularly exotic situation in the

71:33

sense that you have the number one, you

71:36

know, premium streaming company wanting

71:38

to buy the number three or number four.

71:40

And if you do the numbers under the

71:42

guidelines which the government issues

71:44

to tell people when their mergers are uh

71:47

presumptively illegal, the result is

71:49

that this is a presumptively illegal

71:52

merger.
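For the curious, the arithmetic being gestured at uses the Herfindahl-Hirschman Index: sum the squared market shares of every firm. Under the 2023 DOJ/FTC merger guidelines, a post-merger HHI above 1800 combined with an increase of more than 100 creates a presumption of illegality. The market shares below are invented for illustration, not estimates for streaming.

# Worked HHI example under the 2023 merger guideline thresholds
# (post-merger HHI > 1800 and a change > 100). Shares are made up.

pre = {"A (acquirer)": 30, "B": 20, "C (target)": 15,
       "D": 13, "E": 12, "F": 10}  # percent shares, summing to 100

def hhi(shares: dict) -> int:
    return sum(s ** 2 for s in shares.values())

post = dict(pre)
post["A (acquirer)"] += post.pop("C (target)")  # A absorbs C: 45% share

delta = hhi(post) - hhi(pre)
print(hhi(pre), hhi(post), delta)  # 1938 2838 900
print("presumptively illegal:", hhi(post) > 1800 and delta > 100)  # True

The reason I do think it's bad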

71:54

is I I think that Netflix and Time

71:56

Warner have frankly over their history

71:58

been some of the most innovative

71:59

interesting outlets. Um, and often in an

72:03

oppositional role, you know, this goes

72:05

way back, but like you know, Time Warner

72:07

took a chance on sound film back in

72:09

the '20s. In the '50s, they took a

72:12

chance on television, which people

72:14

thought was, you know, useless. And then

72:16

prestige television in the early 2000s

72:18

with HBO and the golden age. So, they've

72:21

taken a lot of bets. Netflix has done a

72:22

lot of innovative stuff. Really

72:25

interesting obviously and frankly you

72:27

want to talk about good tech over the

72:29

last 20 years? How about, you know, not

72:31

having to wait until your show comes on.

72:34

Um that's a form of efficiency I I can

72:36

agree with. And I think it would be a

72:38

tragedy to have these two companies who

72:41

are often so oppositional, you know,

72:43

combined into one. I think culturally it

72:46

would be a great mushification. At the

72:48

economic level, well, just to continue

72:50

on this, I I think it's usually going to

72:52

be those two companies who are bidding

72:54

for the most interesting shows. So, if

72:55

you had a new version of, I don't know, White

72:57

Lotus or something or The Wire, who are

72:59

going to be bidding for it? It's going

73:01

to be Netflix and Warner uh HBO, Netflix

73:04

or others. So, you know, the elimination

73:07

of one bidder is just the definition of

73:10

a loss of useful competition. So, yeah,

73:13

I think it's pretty straightforwardly

73:15

illegal. I don't think it's that

73:16

complicated.
>> Cory, you looked like you

73:18

wanted to jump in on that.
>> No,

73:20

I think that um one of the things we

73:23

should probably anticipate Time Warner

73:24

saying uh in defense of this merger is

73:27

the same thing that Simon & Schuster

73:28

and Penguin Random House said in defense

73:30

of their failed merger that was blocked

73:32

under the Biden administration. They

73:33

said, "Oh, well, we'll still internally

73:36

bid against one another within our

73:38

divisions for uh the most premium uh

73:41

material and that we'll be exposed to

73:43

discipline that way." And I love what

73:45

Stephen King had to say about this when

73:46

he testified. He said, "That's like me

73:48

and my wife promising to both bid

73:50

against each other on the next house we

73:51

move into."

73:54

Tim, one thing I was thinking about

73:55

while I was reading your book was the

73:57

metaphor you use of a gardener. That the

73:59

way to think about economic regulation

74:01

and antitrust and a bunch of the

74:02

different buckets of solutions we're

74:04

talking about is that it is like a gardener

74:08

who is trying to prune certain species

74:10

and plants from taking over

74:11

their garden. And the gardener has to

74:14

make judgments. And you know the there

74:16

are some decisions you make as a

74:17

gardener where you don't want blight

74:19

getting all over your garden and killing

74:21

everything. But others are made for

74:23

aesthetic reasons and others are made

74:24

because you want to have native species

74:26

and not invasive species. And and there

74:27

are all these sort of decisions being

74:29

made. And having been around

74:31

conversations of economic regulation and

74:33

tech regulation for a long time,

74:36

I I've come to this view that there is a

74:39

fetish in them for truly neutral rules.

74:42

That what people always seem to be

74:44

looking for is a rule that you don't

74:45

have to apply any judgment on. You can

74:47

just say if you get over this line,

74:48

everybody knows it's bad.

74:50

>> As opposed to actually having to say we

74:52

have views about how the economy should

74:54

work. We have views about how our

74:55

society should work.

74:57

We want the interest of small businesses

74:59

to prosper and they'll prosper more if

75:01

they don't have to give 30 cents of

75:02

every dollar to Apple or Google or you

75:05

know if you're selling on the Facebook

75:06

marketplace, Facebook. And yet, I mean,

75:09

you've been a policy maker, Tim. I think

75:11

that there has been kind of

75:14

a defensive crouch particularly among

75:16

Democrats, and you know Lina Khan and

75:17

others were an exception to this but a a

75:20

sort of effort to describe everything

75:21

neutrally when sometimes you just don't

75:23

want to be neutral on how fundamental

75:26

companies and and and markets in your

75:27

economy are are working. You want to be

75:29

able to have values that those serve as

75:32

opposed to your values being subservient

75:34

to your economy.

75:36

>> Yes, I know. I I agree with that and I

75:38

think it's an astute observation. I

75:41

think it kind of comes as I said earlier

75:44

from a lack of courage or vision. Um,

75:48

it reminds me of what you said when you

75:49

were talking about, well, okay, we'll just

75:51

create a bunch of windows and let

75:52

everybody decide what options they want

75:54

for their privacy and hope that works.

75:57

Um you know because it comes from that

76:00

same impulse that we don't actually want

76:03

to arrive at a vision of the good

76:05

society. Um, it's it's one of the flaws

76:09

of classical liberalism, frankly, if you

76:11

get into the political theory. And

76:13

frankly, the gardener metaphor is

76:16

targeted at that. It's not just like let

76:18

it all run and see what happens. Um, it

76:22

is one where you have some idea of what

76:25

kind of world we want to live in and

76:26

what kind of society we think is good

76:30

and you have to make decisions based on

76:32

that. I think we need a vision of what

76:34

we want and what a good country looks

76:37

like and a good place to live.

76:40

>> So I think that um bright line rules

76:44

make a lot of sense particularly where

76:46

you have questions that have to be

76:47

frequently adjudicated. The thing we

76:49

really want to be asking before we ask

76:50

any of these other questions is how

76:52

often are you going to have to answer

76:53

this question? So lots of people are

76:54

like oh we should just ban hate speech

76:56

and harassment on platforms. Well,

76:58

that's hard, not because we

77:00

shouldn't do it, but because agreeing

77:02

what hate speech is, agreeing whether a

77:04

given act is hate speech, agreeing

77:06

whether the platform took sufficient

77:07

technical countermeasures to prevent it

77:09

is the kind of thing you might spend 5

77:10

years on. And hate speech happens 100

77:12

times a minute on platforms. Meanwhile,

77:14

if we said we're going to have a bright

77:16

line rule that platforms must allow

77:18

people to leave but continue to

77:20

communicate with the people they want to

77:22

hear from, then people who are subjected

77:24

to hate speech, who are currently there

77:26

because the only thing worse than being

77:27

a member of a disfavored and abused

77:29

minority is being a member of a

77:30

disfavored abused minority who is

77:31

isolated from your community, those

77:33

people could leave and go somewhere

77:35

else. And it's not that we shouldn't

77:37

continue to work on hate speech in

77:38

parallel, but if you think that a rule

77:40

that takes 3 years to answer a question

77:43

is going to solve a problem that happens

77:45

100 times a second, you're implicitly

77:47

committing to full employment for every

77:49

lawyer in the world to just answer this

77:52

question.
>> One thing I admire about both

77:55

of your books is that you you spend a

77:56

lot of time on solutions and so I don't

77:58

think we can go through every one, but

78:00

let me do it this way for each of

78:03

you. And Cory, why don't we start

78:04

with you?
>> Mhm.

78:05

>> If you were king for a day, what are the

78:08

the three areas or the three policies,

78:12

you can define it the way you want

78:14

>> that you think would make the most

78:15

difference?
>> One would be um getting rid

78:18

of this anti-circumvention law um in

78:20

America. It's Section 1201 of the Digital

78:22

Millennium Copyright Act, and saying that

78:24

it should be legal to modify things you

78:26

own to do things that are legal and that

78:28

it shouldn't be the purview of the

78:29

manufacturer to stop you from doing it.

78:31

Uh, another one would be to create a

78:33

muscular federal uh, privacy right with

78:36

a private right of action so

78:38

that uh, impact litigators like the

78:40

Electronic Frontier Foundation as well

78:41

as aggrieved individuals could bring cases

78:44

when their privacy rights were violated.

78:46

And I guess the third would be an

78:49

interoperability mandate specifically

78:51

for social media. So it would be a rule

78:54

and we've had versions of this. The

78:55

ACCESS Act was introduced um, I think

78:57

three times, various versions. They're

78:59

all pretty good. Mark Warner, I think,

79:01

was the main senator behind them. But a

79:02

thing that just says that you should be

79:04

able to leave a social media network and

79:06

go to another one and continue to

79:08

receive the messages people send to you

79:10

and reply to them the same way you can

79:11

leave one phone carrier and go to the

79:13

other. And there's a lot of technical

79:14

details about what that standard looks

79:15

like and how you avoid embedding um you

79:18

know, parochial interests of incumbents

79:19

and so on. I don't think they're

79:21

insurmountable. Uh and I think that the

79:23

trade-offs are more than worth it.
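As a sketch of what such a mandate imagines technically, loosely modeled on the ActivityPub-style federation that Mastodon already uses, where delivery targets a portable address rather than a platform. The directory and payload shape here are hypothetical simplifications.

# Minimal sketch of interoperable messaging: a sender reaches a user
# through a portable handle, so switching platforms does not sever the
# connection. Loosely ActivityPub-flavored; all names are invented.

directory = {
    # portable handle -> current inbox URL, updated when the user moves
    "alice@example.net": "https://newplatform.example/inbox/alice",
}

def deliver(sender: str, handle: str, text: str) -> dict:
    inbox = directory[handle]  # resolves to wherever the user lives now
    message = {"to": inbox, "from": sender, "content": text}
    # A real implementation would sign this and POST it to the inbox.
    return message

print(deliver("bob@oldplatform.example", "alice@example.net", "hi!"))

>> Tim?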

79:27

>> So, I'll say three things. First, I

79:28

think we need the confidence to ban the

79:31

worst and most toxic business models

79:33

that are out there. You know, whether

79:34

it's exploitation of children, whether

79:37

frankly it's it's some of this um total

79:40

absolute price discrimination you're

79:42

talking about, which may technically

79:43

already be illegal. Number two, I think

79:46

that it's unquestioned that the

79:48

platforms have become essential to

79:50

commerce, the main tech platforms.

79:52

I'm not in any way thinking you

79:55

can do without them. And so I think we

79:58

need to understand which of them need to

80:00

be treated more like utilities

80:02

>> and which of them need to be not allowed

80:03

to discriminate in favor of

80:06

themselves or as between customers to

80:08

try to maximize their extraction.

80:10

>> Can I hold you on that one for a minute

80:12

before you go? Because when I

80:15

hear this, it makes sense to me,

80:17

>> and then I think to myself, do the

80:18

people I know who focus on how utilities

80:21

act and are regulated seem happy with

80:23

the situation? And the answer is no.

80:26

They all think it's a total disaster. So

80:27

when you say they should be treated as

80:29

utilities, well, you know, you worked in

80:31

the Biden administration, you know,

80:32

everybody who works on say green energy

80:34

will tell you that the models and

80:37

regulatory structures of the utilities

80:39

are like a huge, huge problem. What

80:42

specifically do you mean?

80:44

>> It's a good question and

80:46

I've spent a lot of my life exposed to

80:47

that. But I think what's important about

80:50

utility regulation is what it doesn't

80:52

allow to happen. Like the electric

80:54

networks, the electric utility

80:56

regulators are not perfect. On the other

80:58

hand, if you think about the electric

81:00

network, it has been an extraordinary

81:02

foundation for people to build stuff on,

81:04

you know, and the reason they're able to

81:06

build on it is they don't think the

81:08

electric network is going to take like

81:09

half their profits if you invent the

81:11

computer on top of it. Or they don't

81:14

think that, for example, the electric

81:16

network is going to decide that, you

81:17

know, it likes Samsung toasters instead

81:19

of like LG, I don't know, whoever else's

81:22

toasters. Zenith, something like

81:24

that. So they don't discriminate

81:25

between manufacturers on the electric

81:27

network and so I think we need to

81:29

understand and look carefully at which

81:32

part of the platforms are the most like

81:35

the electric network or the broadband

81:36

network where they are essential to the

81:38

rest of business and therefore need to

81:40

play by different rules and some of

81:42

those main rules the most obvious are

81:45

duties of treating everybody the same so

81:47

they don't play favorites and then if

81:49

you've got it figured out you get to the

81:51

question of price regulation. And maybe

81:53

Amazon's margin gets capped at 30% or

81:55

something like that.

81:56

>> And then number three for you.

81:58

>> Number three, I uh I think we need a

82:00

constant, you know, I'm an anti-

82:02

monopoly kind of guy, constant pressure

82:05

on the main tech platforms so that they

82:08

stay, I guess, insecure in their

82:12

position and aren't able to easily

82:15

counter new forms of competition. I

82:18

think you have to take out of the

82:21

picture the easiest ways of uh tamping

82:25

down or eliminating challenges to your

82:28

monopoly. I think that's been a really

82:30

important thing in US tech uh since

82:34

AT&T, since IBM, since Microsoft of

82:37

keeping the main dominant market players

82:40

uh insecure and forcing them to

82:45

succeed, to improve themselves as

82:46

opposed to buying off their competitors

82:48

or excluding them. So that's my third.

82:51

>> So before we wrap here, I want to return

82:52

to something we've sort of been

82:54

circling, which is what kind of

82:56

competition do we want to be encouraging

82:59

among these platforms? Tim, one thing

83:01

you said earlier was that there can be

83:03

this difference between healthy

83:04

competition and toxic competition, which

83:07

if you read a lot of economic commentary

83:09

from the early 20th century, you hear a

83:11

lot about that. And I feel like we don't

83:12

talk about it that much anymore. But but

83:14

this is a place where I've been

83:15

skeptical of the argument that many

83:18

problems would be solved by breaking up

83:20

the big particularly attentional social

83:22

media and algorithmic media giants.

83:26

I mean, I don't think Instagram

83:28

has gotten better under pressure from

83:29

TikTok. Uh I don't think that more

83:32

ferocious innovation and entrepreneurial

83:35

work to capture my attention or my

83:37

children's attention is necessarily

83:38

good. Maybe the problem is that the

83:41

entire thing that the companies are

83:43

trying to do, whether there are two of

83:44

them or 50 of them, is negative.

83:48

>> Yeah, it's it's a really good point and

83:50

a good question. I think in the

83:52

markets you're talking about, we have a

83:54

serious failure to wall off, discourage,

83:59

ban, um or ethically consider wrongful

84:04

the most toxic ways of making money.

84:07

So there is such a thing as healthy

84:08

attentional competition, like making a

84:10

great movie that keeps the audience

84:12

enraptured for two hours you know

84:15

producing a great podcast. That is good

84:17

attentional competition. And frankly the

84:19

attentional market, you know, includes all

84:20

these forms. But we have just

84:23

allowed the flourishing of uh negative

84:26

models. So I think you know if you had a

84:29

world in which you had uh many more

84:32

limits on what counted and what was

84:35

frankly legal in terms of manipulating

84:37

your devices you would see more positive

84:39

competition if you broke up some of

84:41

these companies. I just think the entire

84:43

marketplace of social media is cursed by

84:46

the fact that we haven't gotten rid of

84:48

the most brutal toxic and damaging

84:50

business models for our country and for

84:53

our children and for individuals.

84:54

>> I think that is a nice place to end. So

84:57

as always, our final question: what are

84:58

three books you'd recommend to the

84:59

audience? And Tim, why don't we

85:01

begin with you?

85:02

>> Sure. I'd start with E.F. Schumacher's uh

85:05

Small Is Beautiful: Economics as If

85:08

People Mattered. And I say that because

85:10

it, you know, targets this question of

85:12

what kind of world do we want to live

85:13

in? And you know, I think our efficiency

85:16

obsession uh is taking us in one

85:18

direction. I think we should choose a

85:20

different direction. A second book um is

85:23

more recent. Cass Sunstein wrote a book

85:25

on manipulation that uh I think

85:29

is underrated and is really good for

85:32

understanding what we have allowed to

85:34

happen. It's called Manipulation: What It

85:37

Is, How to Stop It. The last book, I guess,

85:40

this is where I got some ideas about uh

85:42

tech platforms and the big picture is

85:44

from Paul Kennedy, The Rise and Fall of the

85:46

Great Powers. I feel everything is

85:49

on a cycle and you know every empire has

85:52

its destiny, its golden age, its

85:55

decline, its stagnation and fall and I

85:57

feel like understanding imperial

85:58

dynamics is very important to

86:00

understanding the technological empires

86:03

of our time.

86:04

>> Cory?
>> Yeah. Uh so my first pick is

86:07

Sarah Wynn-Williams's book Careless People.

86:09

Um and uh it's a great example of the

86:11

Streisand effect: when a company

86:13

tries to suppress something, it brings

86:15

it interest. So uh Wynn-Williams, she was

86:18

a minor uh diplomat in the New Zealand

86:20

diplomatic corps. She became quite

86:22

interested in how Facebook could be a

86:24

player geopolitically. She started to

86:27

sort of nudge them to give her a job as

86:29

like an

86:31

international governmental relations

86:32

person. No one was very interested in

86:34

it, but she just sort of kept at it

86:35

until she got her dream job. And then

86:37

the dream turned into a nightmare. My

86:40

second choice is a book by Bridget Read.

86:42

It's a book called Little Bosses

86:43

Everywhere. And Little Bosses Everywhere

86:45

is a history of the American pyramid

86:47

scheme. And it's an argument that the

86:49

American pyramid scheme is kind of the

86:52

center of our current rot. And

86:55

everywhere you look in the MAGA

86:57

movement, you find people who have been

86:59

predated upon by the kinds of scams that

87:02

are characteristic of this and who've

87:04

adopted the kind of toxic positivity

87:07

that comes with it. Uh it is an

87:10

incredibly illuminating, beautifully

87:12

researched book. And then the final book

87:14

is a kids book by my favorite kids book

87:17

author ever, this guy called Daniel

87:20

Pinkwater. And last year he had a book

87:22

out from Tachyon Press called, uh, let me

87:24

find the title here, Jules, Penny and

87:26

the Rooster. And recapping the plot of

87:29

this book would take 10 minutes because

87:31

it is so gonzo and weird. But suffice it

87:33

to say it revolves around a young woman

87:36

and a talking prize dog who find a

87:39

haunted woods nearby where uh the young

87:43

woman is welcomed, in a sort of Beauty

87:45

and the Beast story, as a kind of savior,

87:47

but wants no part of it. It's funny.

87:49

It's madcap. It's gonzo. It's full of

87:51

heart. Um, it is like everything great

87:54

about a kids book. I read my daughter so

87:57

many Daniel Pinkwater books when she was

87:59

little. They are so fun to read at

88:01

bedtime. It's a middle-grade book and uh

88:03

I cannot recommend it highly enough.

88:04

Jules, Penny, and the Rooster by the

88:07

incredible Daniel Pinkwater.
>> Cory Doctorow

88:09

and Tim Wu, thank you very much.

88:11

Thank you.

88:12

>> Thanks, Ezra.


