We Didn’t Ask for This Internet | The Ezra Klein Show
When was the last year that the internet felt good to you? I think everybody has a different answer to this. Mine, I think, goes fairly far back, maybe to the heyday of blogging.
>> Words and getting up to 40,000 hits on
his blog a day.
>> At least before the moment when Twitter
and Facebook went algorithmic.
>> What we're trying to do is give everyone
in the world the best personalized
newspaper we can. But whatever your answer is, I have not found many people who think that 2026, right now, this internet with all of its anger and its outrage and its AI slop, is what we were promised.
>> A glitch spread graphic violent videos
to unsuspecting Instagram users.
>> This is living at the technological
peak.
>> In just three clicks, I can go from date
night ideas to mind control seduction.
But even if there is this growing
consensus that something went wrong with
the internet somewhere and that it is
driving our society somewhere we don't
want it to go, there's not really a
consensus about what to do about it. What
to do about these giant platforms
increasingly spammed up with ads and
sponsored results boosting content that
will keep us hooked and angry, isolating
and dividing us and deranging our
politics, making a few billionaires ever richer, held up by an army of low-wage workers in warehouses and on delivery bikes. Something has gone so
wrong. But what do we do about it? My
guests today have two theories of the
case. Cory Doctorow is a longtime blogger, an activist with the Electronic Frontier Foundation, and a science fiction writer. His new book is Enshittification: Why Everything Suddenly Got Worse and What to Do About It. Tim Wu worked as a special assistant to
President Biden for technology and
competition policy. He's a professor at
Columbia Law School and author of
influential books on technology,
including his latest, The Age of
Extraction: How Tech Platforms Conquered
the Economy and Threaten Our Future
Prosperity.
I wanted to have them both on because
their books both feel to me like they
talk to each other. They feel like
they're describing something similar,
but in importantly different ways. And
so, I wanted to think through their
theories
and through what might be done about it.
As always, my email is ezrakleinshow@nytimes.com.
Tim Wu, Cory Doctorow. Welcome to the
show.
>> Thank you very much.
>> Great to be here.
>> So, I just learned that you both went to
elementary school together.
>> Yep.
>> Yeah, that's true.
>> In suburban Toronto, a weird little school with like 80 kids. It was
kindergarten to 8th grade in one
classroom. Older kids taught the younger
kids. We more or less were left to go
feral and design our own curriculum.
They chucked us out of the school on
Wednesday afternoons to take our subway
pass and find somewhere fun in Toronto
to go do stuff. It was great.
>> Is there anything about that school that
would lead people to becoming sworn
enemies of our tech overlords?
>> Well, we loved tech at the time. I mean, we were early in on the Apple II.
>> Mhm.
>> And frankly, that's where it all started, in a way. You know, both of our books have this kind of pining for a lost age. And I think some of it is this early era of computing, when we were just so excited, so optimistic, everything was just going to be so amazing. And that, to me, a little bit, was fifth grade, or grade five as we say, you know, programming the Apple II.
>> Can I slightly problematize that? So, we were both also science fiction readers back then. And so I was pretty alive to the dystopian possibilities of computers at the time. So I wouldn't call myself optimistic. I would call myself hopeful and excited, but not purely optimistic. And I would also like to say that, per John Hodgman, nostalgia is a toxic impulse. And when I think about
what I like about those days, it's not
that I want to recover those days. It's
more that I kind of dispute that the
only thing an era in which people had
lots of control over their computers
could have turned into is one in which
the computers had lots of control over
them. That there is probably something else that we could have done.
>> When you're spending time on the internet these days, Cory, what feels bad to you about it?
>> So, what I would do is contrast what
happens when things aren't great now
with how I felt about what happened when
things weren't great before. So, I think when I was a young person on the early internet and I saw things that sucked, I
would think someone's going to fix this
and maybe it could be me. Uh, and now
when I see bad things on the internet,
I'm like, this is by design and it
cannot be fixed because you would be
violating the rules if you even tried.
>> Tim, how about you?
>> I feel like it's a tool I cannot trust. You know, I feel
like the tools I like in my life, like a
hammer, you know, I swing it, it does
something predictable. The internet
seems like it's serving two masters.
>> You know, I search for something, I get
a bunch of stuff I don't really want,
and I don't really know what I'm
getting. I go, I want to write like one email or check one thing, I end up
in some strange rabbit hole and reading
and like three hours go by and I don't
know what happened. So, I feel like I'm
constantly at risk of being manipulated
or taken from, and I don't trust the
tools to do what they say they're going
to do. And I feel that makes using it much, you know, kind of like living in a fun house. So, I don't like that.
>> So, I want to make sure I give voice to somebody who is not on the show at the moment, because this is going to have the flavor of
>> the prophet Elijah has entered the chat.
>> Yeah. Right. Three middle-aged guys who think the internet went wrong somewhere along the way. When I was working on this episode with my producer, one of the interesting tensions behind the scenes was she doesn't think the internet is bad. She thinks TikTok is, she said, a perfect platform. She has young kids and feels Amazon is a godsend for a young parent. Obviously there are many people like this who are using these platforms freely, of their own volition, happily. So, what do you say to somebody who says, "What are you [ __ ] all talking about?"
>> Yeah. I mean, I guess I'll start. The middle-aged "things used to be better" thing, I don't want to fall into that sort of situation. I just think the deal is not
what it could be. And I think that, you
know, maybe as a consumer who sort of lightly uses this, the internet is still useful. But, I mean, I have children, too. And you know, I think it's hard to deny that social media has been tough on kids and has had all kinds of negative effects, and that really started accelerating over the last 15 years or so. But I think we have a highly
polarized political structure which is,
you know, made worse by social media. I
think we have a problem with inequality
which has gotten worse and worse and is
accentuated by the fact that the margins
are just so thin for independent
business. And I also think there was this vision that it would be this equalizer, this leveler, this technology that made a lot of people rich, not just a few people rich; that it was, you know, more or less easy, but reasonable and a lucrative thing to do, to start your own business; that it would sort of change some of the challenges of inequality and class structure in the United States. Now, maybe those were very high hopes.
This is a key concept in my book, and I think key to understanding the economics of our time: the importance of platforms,
which are, you know, any space or any institution that brings together buyers and sellers, speakers and listeners. Every civilization has had platforms. I was in Rome a few weeks ago, and you know, you go to the Roman forum and there it is. It's all together: the buyers, the sellers; they have the courts; they have where people gave their speeches. Platforms are kind of the core of every civilization.
And at some level, why I wrote this book is I was interested in this question of what our fundamental platforms look like and how that reflects on the civilization we are building, because I do think they have a large impact. I think that's kind of undeniable.
But I think that, you know, things have gotten worse in many dimensions. And
I guess it relates to my view of the
state of the country as well. I think
we've been in a better place in other
periods of American history. And I think
the internet's not the only cause, but I
think it's part of it.
>> If I were having this conversation with your producer and
we had some time to talk about it, I
would probably walk them through a
couple of the undisputed ways in which
some people have found the internet get
worse for them. So Tim has talked a little about margins for small businesses. There are also people who are performers, who found that the take that's being sucked out of their pay packet every month by the platforms is going up and up. There are people who would really not like to be snatched by ICE snatch squads, who installed ICEBlock on their iPhone, only to have Tim Cook decide that ICE officers were members of a protected class and remove that app. And now you can't install that app, because the iPhone only lets you install official apps. And I'd say, just
because this hasn't hit you: unless you have a theory about why you are favored by these platforms, you should at least be worried that this would come. And I would follow up by
saying, let's not fall into the trap of vulgar Thatcherism. You know, Thatcher's motto was "there is no alternative," and I think tech bosses would like you to believe that too. That if you're enjoying having a conversation on Facebook with your friends, which I stipulate lots of people do (I think that's absolutely the case, and we should value and celebrate that), you just have to accept that there is no way to have a conversation with your friends that Mark Zuckerberg isn't listening in on, and that to ask for otherwise would be like asking for water that's not wet. It's just not possible. And what I'm advocating for is not, you don't like that thing you like. It's, I like that you like the thing you like. I want to make it good, and I also want to guard it against getting worse, because just because it hasn't happened to you yet, I think it would be naive to think that it would never come for you.
>> So your books
are two frameworks for understanding
what I would call corporate capture of
the internet. The way we went from the
dream of a decentralized, user-controlled internet to something that a small number of corporations really run and have enormous power over. And Tim, the term you focus on is extraction. Cory, the term you focus on is enshittification. So I'd like you both to define those terms for me. What is extraction, Tim? What is enshittification, Cory?
>> So extraction is actually a technical economic term that refers to the ability of any entity or any firm with market power or monopoly power to take wealth or other resources far in excess of the value of the good being provided; not only the value being provided, but also its cost to provide it. That's the technical definition. So you might have a
pharmaceutical company. You know, there's a rare disease, they have the only treatment for it, and maybe they're extracting as much as they can; $100,000 a year is about the usual. And I think the idea of it comes from a sense, something I get from teaching at business school sometimes, that American business has, in my view, moved increasingly to focus its efforts on trying to find points of extraction as a business model, as opposed to, say, improving the product or lowering the price. You know, try to find the pain points where your customers really have no choice and then take as much as you can, kind of like in a poker game when you go all in because you've got the good hand. Now, there's always been a little bit of that in business, or maybe a lot, like in the Gilded Age. But the question is, what is the ratio? How much of business is providing good services for good prices, you know, making a profit, that's fine, and how much is just that different thing of extraction?
>> So Tim, I want to, before I move on to Cory, zoom in on
something you said there because a lot
of that definition seems to turn on how
you define value.
>> Yeah.
>> And I mean a lot of economists would say
price is a method of discovering value.
If you have a pharmaceutical people are willing to pay $70,000 for, that means they value it at $70,000, even if you think that is extractive.
>> Mhm.
>> So, how do you know when a price, when a profit, is actually extractive, versus when we're just seeing that people value that product very highly, and bully for the, you know, producer for creating something people value so highly?
>> Yeah. So if someone, for example, has no choice, they are desperate, let's say, for water, and someone is able to sell them a bottle of water because they're dying, for $100,000 or something like that, yes, that person does value it at that level. But an economy full of nothing but maximized monopoly prices, where firms are in a position to extract, is inefficient for two reasons. One is that too much money gets spent on that water versus other things, like maybe pursuing an education. And second, the entity that holds that much power actually has an impulse to reduce supply, reduce output, and therefore produce less of the stuff so that they can extract the higher price. So this is just classic monopoly economics, I guess, that I'm getting into. I mean, everyone
inside themselves has something they are
willing to pay, but that doesn't mean
it's a good society when you're
constantly paying the maximum you are
willing to pay in every situation. It's
actually a very oppressive economy, I
think.
>> So, Tim, when we're talking about extraction for many of these platforms, for a Facebook, for a TikTok, we're not paying for them. So, when you say they are extractive, what are they extracting, and from whom?
>> When you use Facebook,
you are constantly being mined for your
time, attention, and data in a way that
is extraordinarily valuable and that,
you know, yielded something like $67 billion in profit last year. So, you know, things that feel free: Is it free when you suddenly spend hours wandering around random things you didn't intend to? Is it free when you end up buying stuff that you didn't really want, and wonder why you got it later? Is it free when you feel that you've had your vulnerabilities exploited? I would say none of that's free. You're poorer both in your own consciousness, in terms of your attention and your control over your life, and you're poorer, probably, in misspent money.
>> Cory, how about enshittification?
>> Well, before I do that, I also wanted to react to something that you were sort of feinting at, Ezra, which is this idea of revealed preferences, which you often hear in these discussions, right? That
if you let Facebook spy on you, no
matter what you say about how you
feel about Facebook spying on you, you
have a revealed preference. And Tim used
the word power when he responded to
that. And I think that, you know, if you ask the neoclassicals, they'll
say, well, we like models, and it's hard
to model qualitative aspects like power.
So, we just leave them out of the model
and hope that it's not an important
factor. And this is how you get these
incredibly bizarre conclusions like if
you sell your kidney to make the rent,
you have a revealed preference for
having one kidney. But what we actually
know, when we give people choices, when the state intervenes, or when there's countervailing power, is that
often you get a different revealed
preference. You know, when Apple gave
Facebook users the power to tick a box
and opt out of Facebook spying, 96% of
Apple users ticked that box. So the argument that Facebook users don't mind being spied on, I think, is blown
out of the water when you actually give
them a way to express preferences, and
I assume the other 4% were like either
drunk or Facebook employees or drunk
Facebook employees, which makes sense
cuz I would be drunk all the time if I
worked at Facebook. But I think it's
hard to deny that people really don't
want to be spied on if they can avoid
being spied on.
>> All right. I think that's a good setup to enshittification.
>> Yeah, enshittification. It's really a
label I hung on both an observation about a characteristic pattern of how platforms go bad and, I think much more importantly, on why they're going bad now, because we didn't invent greed in the middle of the last decade. So something
has changed. My thesis is that some
exogenous factors have changed. So the
pattern of platform decay is that
platforms are first good to their end
users while locking them in. That's
stage one. And once they know that the
users have a hard time departing when
they face a collective action problem or
when they have high switching costs, you can make things worse for the end
users safe in the knowledge that they
are unlikely to depart in order to lure
in business customers by offering them a
good deal. And so far so good. I think a
lot of people would echo that, but they
would stop there. They would say, "Oh,
you're not paying for the product, so
you're the product." That this is about luring in users and then getting in business customers who will pay for it.
That's not where it stops because the
business customers are also getting
screwed because the business customers
get locked in and you know this power
that the platforms end up with over
their business customers is then
expressed in stage three where they
extract from those business customers as
well. They dial down the value left
behind in the platform to the kind of
minimum like homeopathic residue needed
to keep the users locked to the
platform, the businesses locked to the
users, and everything else is split up among the executives and the shareholders, and that's when the platform's a palace of [ __ ] But the
more important part as I say is why this
is happening now. Broadly, my thesis is
that platforms used to face consequences
when they did things that were bad for
their stakeholders. And those
consequences came in four forms. They
had to worry about competitors, but we
let them buy those. They had to worry
about regulators, but when a sector is
boiled down to a cartel, they find it
very easy to agree on what they're going
to do and make their preferences felt
because they have a lot of money,
because they're not competing with one
another and they capture their
regulators. They had to worry about their workers, because tech workers were in very scarce supply, they were very valuable, and they often really cared about their users. They could really say, "No, I'm not going to enshittify that thing I missed my mother's funeral to ship on time," and make it stick, because there was no one else to hire if they quit, and they were bringing a lot of
value to the firm. But of course, tech workers famously thought that they were temporarily embarrassed founders. They didn't unionize; they didn't think they were workers. So when the power of scarcity evaporated, they had not replaced it with the power of solidarity.
And so now you have 500,000 tech layoffs
in three years and tech workers can't
hold the line. And then finally, there
was new market entry. There were new
companies that could exploit something
that I think is exceptional about tech.
I'm not a tech exceptionalist broadly,
but I'm an exceptionalist about this,
which is that every program in your
computer that is adverse to your
interests can be neutralized with a
program that is beneficial to your
interests. And that means that when you
create a program that is deliberately
bad, you invite new market entrants to
make one that's good. Right? If you lock
up the printer so it won't take generic
ink, you just invite someone to not only
get into the generic ink business, but
get into the alternative printer
firmware business, which eventually
could just be the I'm going to sell you
your next printer business. But what
we've done over 20 plus years is
monotonically expand IP law until we've
made most forms of reverse engineering
and modification without manufacturer
permission illegal, a felony. My friend Jay Freeman calls it felony contempt of business model. And as a
result, you don't have to worry about
market entry with this incredibly slippery, dynamic character of
technology. And when you unshackle firms
from these four forces of discipline,
when they don't have to worry about
competitors or regulators or their
workforce or new market entry through
interoperability, the same CEOs go to the same giant switch on the wall in the C-suite marked "enshittification" and yank it as hard as they can, as they've done every day that they've shown up for work. And instead of being gummed up, it has been lubricated by an enshittogenic policy environment that allows it to go from zero to 100 with one pull. And that's
how we end up where we are today.
>> All right. I want to bring these out of theory, though, Cory. I applaud how well structured that was on the fly.
>> And have you both walk through this
with an example that you use in your
books.
>> Sure.
>> And Cory, I want to start with you. Walk me through how you see enshittification as having played out on Facebook itself. Not all of Meta, but Facebook, where it started, when it was adding value to users in the early days, to where you feel it has gone now. Tell me your Facebook story.
>> Yeah, so Facebook, really its big bang was 2006. That's when they opened the platform to anyone, not just people with a .edu address from an American college. And Mark Zuckerberg needs to attract users, and his problem is that they're all using a platform called MySpace. So he pitches those users and he says, "Look, I know you enjoy hanging out with your friends on MySpace, but nobody should want to use a surveillance-driven social media platform. Come to Facebook and we'll never spy on you. We'll just show you the things that you asked to see."
>> We need to give people complete control
over their information, right? People
need to be able to say exactly who they
want to share each piece of information
that they're sharing, who they want to
share that with.
>> So, that's stage one. But part of stage one, remember, is that there's lock-in. It's just the collective action problem, right? You love your friends, but they're a pain in the ass. And if the six people in your group chat can't agree on what bar to go to this Friday, you're never going to agree on when it's time to leave Facebook or where to go next. Especially if some of you are there because that's where the people with the same rare disease as you are hanging out, and some of you are there because that's where the people in the country you immigrated from are hanging out, and some of you are there because that's where your customers or your audiences are, or just that's how you organize the carpool for the kids' Little League. And so we are locked in. And so
that ushers in stage two: making things worse for end users to make things better for business customers. So think
about advertisers. Advertisers are told, you know, remember we told these rubes that we weren't going to spy on them? Obviously that was a lie. We spy on them from [ __ ] to appetite. Give us pennies and we will target ads to them with exquisite fidelity. And so the advertisers pile in; publishers pile in, too. They become locked to the
platform. They become very dependent on
it. And in stage three, advertisers find
that ad prices have gone way up. Ad
targeting fidelity has fallen through
the floor. Ad fraud has exploded to
levels that are almost incomprehensible.
Publishers famously now have to put their whole article there, not just an excerpt. And woe betide the publisher that has a link back to their website, because Facebook downranks off-platform links as potentially malicious. And so they don't have any way to monetize that except through Facebook's own system.
And we've got a feed that's been, you know, basically denuded of the things we've asked to see. It has the minimum calculated to keep us there. And this equilibrium is what Facebook wants, but it's very brittle, because the difference between "I hate Facebook and I can't seem to stop coming here" and "I hate Facebook and I'm never coming back" can be disrupted by something as simple as a livestreamed mass shooting. And then users bolt for the exits. The street gets nervous. The stock price starts to wobble. The founders panic, although being technical people, they call it pivoting. And you know, one day
Mark Zuckerberg like arises from his
sarcophagus and says, "Harken unto me,
brothers and sisters, for I've had a
vision. I know I told you that the
future would consist of arguing with
your most racist uncle using this
primitive text interface that I invented so I could nonconsensually rate the fuckability of Harvard undergraduates. But actually, I'm going to transform you and everyone you love into a legless, sexless, low-polygon, heavily surveilled cartoon character so that I can imprison
you in a virtual world I stole from a
25-year-old comedic dystopian cyberpunk
novel that I call the metaverse. And
that's the final stage. That's the giant
pile of [ __ ]
>> All right, Cory, you got a good rant there, my man.
>> Cory could be a rapper if he decided to get into that.
>> I give you real props on it. The world is crying out for a middle-aged technology critic rapper.
>> Let me ask you at least one question here, so I'm not just too taken in by your charisma. I think the counterargument somebody would offer is two things. One is, for all the pivots, all the scams, by the way, I mean, I was a publisher during the era of the Facebook fire hose to publishers
>> and the era of pivot to video, when Facebook videos were getting these astonishing view counts
>> and fraudulent view counts.
>> That's what I was about to say. One, they kept all the money. They promised everybody, you know, come get
this huge scale. We're giving you all
this traffic. You can build a business
here. There was no business to build
there at any significant scale. And two,
it turned out that the video view counts
were fraudulent, right? And so a huge
amount of the news industry, among other
things, pivoted to video and it was
based on lies
>> And the scams: there's a recent Reuters report that Facebook was actually charging advertisers more for these things that they knew were scams.
>> 10% of their ad revenue is ads for scams
by their own internal accounting.
>> I'm really not here to defend Facebook as an actor. But one of the crazy things amidst all of this, a thing you really focused on there, was moving from showing us what we had asked to see to showing us what I would say Facebook wants us to see. There was just the FTC versus Meta case. Tim was, of course, involved in that.
>> And one of the statistics that came out
during it is that only 7% of time spent
on Instagram is spent on things your
friends and family have actually shown you, things people you follow are showing you. Similarly, on Facebook itself, it's under 20%. I forget the exact number, but it's very low.
>> They have moved, under competition from TikTok specifically, although not only, to these AI-driven algorithmic feeds, showing you not what you have asked to see but what they find will keep you there. And what they are finding is that it will in fact keep you there, and people are coming back to it, and they spend more time on Instagram when you turn the feed into this algorithmic feed. This is the whole revealed preference thing that you were talking about earlier. My personal experience of Instagram when I go on it now, and one reason I try to go on it less, is that I can actually feel how much more compelling it is. I like it less, but the feeling of getting pulled into something is much stronger.
>> And so I think if you had Mark Zuckerberg rise from his
>> sarcophagus,
>> I was going to say office, because I'm a more polite person. Here, he would say: We did this under competitive pressure. TikTok was eating our lunch. We stole a bunch of things from TikTok and now we're doing better. We also stole a bunch of things from Snapchat and now we're doing better. Because, in fact, we are under a lot of competition, and we are incredibly good at responding to that competition in ways that our user base responds to. This is not enshittification. This is the magic of competition itself, and you know that because look at our profit margin and look at how much we've changed.
>> So let me say that I don't think competition is a good unto itself, and I think it is absolutely possible to compete to become the world's most efficient human rights violator. The reason I like competition is because it makes firms into a rabble instead of a cartel. So in 2022, two teenagers reverse engineered Instagram and made an app called OG App. And the way
OG App worked is you gave it your login and password. It pretended to be you and logged into Instagram. It grabbed the session key. It grabbed everything in your Instagram feed. It discarded the ads. It discarded the suggestions. It discarded all of the stuff that wasn't a chronological feed of what the people you followed had posted recently.
Facebook, or Meta, sent a letter to Apple and Google, who obliged them by removing the app, because there's honor among thieves. So if you want to find out what people actually prefer, you have to have a market in which people who disagree with the consensus, who think that people aren't kind of gut flora for immortal colony organisms we call limited liability corporations, who think they are entitled to dignity and moral consideration as beings unto themselves, those people have to be offering some of the alternatives to find out what they want. But because, under modern IP law, something called the Digital Millennium Copyright Act, it is a felony to modify the app without permission, when Meta sent the letter to Apple and Google, they sided with Meta. And because you can't modify those platforms to accept apps that haven't come through the store, that was the end of the road for OG App.
>> But I think this is a little bit of a narrow
example. As somebody who gets a huge number of press releases for all these pro-social apps that are built to compete with Instagram and TikTok and all of them, apps that are meant to respect your attention, apps that are meant to be virtuous in a way these apps are not, I watch one after another after another basically go nowhere, get outcompeted. The point I'm making is, in the example you're giving, they were able to basically say there was a terms of service violation. Maybe they should not be allowed to do that. And people do. This is where I want to make sure my producer has a voice.
There are people who just absolutely like TikTok. There are people who like Instagram. They know there are other things out there, and they're not clamoring for a competitor or an alternative. I think suggesting that there is no capacity to switch is going a little far.
>> No, I'm not saying there's no capacity to switch. I'm saying the higher the switching costs are, the lower the likelihood that people will leave. You know, when we had pop-up ads in our browsers, real pop-up ads, the Paleolithic pop-up ad that was a whole new browser window that spawned one pixel square, autoplayed audio, ran away from your cursor, the way we got rid of that was that it was legal to modify browsers to have pop-up blockers. More than 50 percent of us have installed an ad blocker in our browser. Doc Searls calls it the largest consumer boycott in human history. And as a result, there is some moderation upon the invasiveness of what a browser does to you. That is in marked contrast with apps, because reverse engineering an app, because it's not an open platform, is illegal under American copyright law. It violates Section 1201 of the Digital Millennium Copyright Act.
And so when we talk about how these platforms have competed their way into toxicity, we're excluding a form of competition that we have made illegal: for example, ad blockers; for example, privacy blockers; for example, things that discard algorithmic suggestions, and so on. Taking those off the table means that the only competitors you get are firms that are capable of doing a sort of holus-bolus replacement, convincing you that no, you don't want to use Instagram anymore, you want to use TikTok instead, as opposed to: you'd like to use Instagram, but in a slightly different way, one that defends your interests against the firm's interests. But I think that we mustn't ever forget that within digital technology and living memory, we had a mode of competition that we prohibited, one that often served as a very rapid response to specifically the thing you're worried about here. You know, I have a
friend, Andrea Downing, who has the gene for breast cancer, and she's part of a breast cancer previvor group that was courted by Facebook in the early 2010s, and they moved there. And this group is hugely consequential to them, because if you have the breast cancer gene, you are deciding whether to have your breasts removed, your ovaries removed; the women in your life, your daughters, your sisters, your mothers, are dying or sick, and you're making care decisions. This group is hugely important. And she discovered that you could enumerate the full membership of any Facebook group, whether or not you were a member of it. This was hugely important to her friends there. She reported it to Facebook. Facebook said: that's a feature, not a bug. Won't fix. We're going to keep it. They sued. It was non-consensually settled when the FTC settled all the privacy claims. And they are still there, because they cannot overcome the collective action problem that it takes to leave. Now, they will eventually. When Facebook is terrible enough, that community will shatter, and maybe it will never re-form. That is not a good outcome.
>> All right, Tim, I want
to go to your story here. One of the core tales you tell in your book is about Amazon. So walk me through the process of moving a platform from a kind of healthy, constructive platform to an extractive platform, through your story of what happened with Amazon.
Amazon, you may remember, was once upon a time a bookstore.
>> I do remember that actually. It's how
old I am.
>> And, you know, their basic idea was: be bigger and we'll sell more stuff. At some point they opened the marketplace, the Amazon Marketplace, which was different because it was a platform. In other words, it was a place that people could come and sell their stuff. At first it was used books. Then it spread into other markets. And they realized a few things. One is that fulfillment would be very important. You know, on eBay in the old days, the sellers had to wrap it themselves and send it off, so that wasn't a very scalable model. And they understood they had a good search engine. Amazon invested hard in search, and it worked, and more and more sellers came, more and more buyers came, and so the Amazon Marketplace overtook eBay and became very successful. And at that point, maybe around 2010 or something like that, it was fulfilling what I would call the dream of the internet age, which is that a lot of people would be able to go on this place and start their thing and make a lot of money. It coincides with the rise of the blog and small online magazines, that whole era we are talking about. During that period, Amazon's take was below 20%. It kind of depends how you count, but somewhere between 15 and 20%.
>> Their take of what a small business is...
>> Of the sales. Yeah. So if you sold $100 of stuff, they take $20. It depended a little bit; there were some storage fees and so on. So it was a good place to make money.
>> And what changed, I think, was once Amazon had confidence that it had its sellers and it had its buyers more or less locked up. And this is basically over the 2010s. They bought a couple of companies that were potential threats to them. Diapers.com, for example. It might seem ridiculous, but diapers, you know, could have been a kind of way in to threaten them.
>> Why don't you tell the Diapers.com story for a minute? It's a kind of famous story in Amazon lore, but I think it's worth telling.
>> So there was a platform launched to be an alternative to Amazon, and their thought was: new parents, diapers, every parent needs diapers delivered quickly, so why don't we make that the beginning, in the same way Amazon started with books. And then Amazon saw this, thought it was kind of threatening, and in the strategy of the day just bought them. Of course, the founders were pretty happy. And Amazon managed basically to capture this market, and that's when I think it turned to the extraction phase. Well, in the last 10 years, Amazon's strategy for its marketplace has basically been to turn the screws and increase the fees, change the margins, so that many sellers are paying over 50%, basically the same as brick-and-mortar businesses. And Amazon prices are rarely any lower; they have actually done a lot to try to prevent anyone from pricing lower. And the one thing I would focus on is what they call advertising, which may be familiar to you as the sponsored results that you get when you're searching. So what's going on there is that sellers are bidding against each other, bidding down their own margins, to get higher up in the search results. And that little trick, that sort of one weird trick, has become this extraordinary cash cow. It's more profitable than Amazon Web Services, which is sort of surprising.
>> Last year it was $56 billion.
>> Just paying Amazon for higher rankings in their search results was $56 billion?
>> $56 billion. It's looking like it's going to be over $70 billion. Cory, when I'm searching on Amazon, I see "Amazon's Choice." It looks like a little prize, like that product won a competition where a bunch of editors chose it as the best one. What am I looking at there?
>> So that is broadly part of this thing Tim was discussing, where they're piling on junk fees for the right to be at the top of the results, where if you're not paying for Prime and paying for Fulfillment by Amazon and paying for all these other things, you aren't eligible. And the more of these you buy, the greater chance you have of being chosen.
>> But are they literally paying to be Amazon's top choice? As a dumb consumer, maybe I look at that and I think, oh, this is some algorithmic combination of: is it the best seller, what are its reviews, et cetera.
>> No. So you're right that it is algorithmic, but the algorithmic inputs are not grounded primarily in things like quality or customer satisfaction. They're grounded in how many different ways you've made your business dependent on Amazon, in such a way that every dollar you make is having more and more of that dollar extracted by Amazon. There's some good empirical work on this from Mariana Mazzucato and Tim O'Reilly, where they calculate that the first result on an Amazon search results page is, on average, 17% more expensive than the best match for your search. So that's what you're seeing: the Amazon top choice is basically the worst choice.
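The ranking dynamic Cory and Tim describe, placement driven by fees paid to the platform rather than by match quality, can be sketched as a toy model. Everything below is hypothetical: the product names, scores, and weights are invented for illustration, not Amazon's actual algorithm.

```python
# Toy model of fee-weighted ranking: NOT any real platform's algorithm,
# just an illustration of how paying for placement can outrank the
# best match for a query.

def rank_listings(listings):
    """Sort listings by a score dominated by fees paid, not quality."""
    def score(item):
        # Quality (relevance, reviews) gets a small weight; the ad fees
        # a seller bids to the platform get a large one. Weights invented.
        return 0.2 * item["quality"] + 0.8 * item["ad_bid"]
    return sorted(listings, key=score, reverse=True)

listings = [
    {"name": "best-match blender", "quality": 0.95, "ad_bid": 0.10, "price": 49},
    {"name": "sponsored blender",  "quality": 0.60, "ad_bid": 0.90, "price": 57},
]

ranked = rank_listings(listings)
# The sponsored listing takes the top slot despite lower quality and a
# higher price, the pattern the Mazzucato and O'Reilly finding points at.
print(ranked[0]["name"])  # sponsored blender
```

With the quality weight this low, the seller who bids the most against their own margin wins the top result, which is why the first result can be dearer than the best match.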
So this really feels to me like a place where, to use Cory's word, things enshittified. When I go around the internet now, when I play something in a Spotify playlist, or click on a song I like and move to the radio version of Spotify, or when I search something on Google, or when I search something on Amazon: these used to be very valuable services to me. To search for something on Amazon and see rankings weighted by how popular the product is, how high the reviews are; I took the weighting of the search as, to some degree, a signal of quality. Certainly with Google, the whole idea was that what comes first in search was built on PageRank, and it was going to be quality. And now there is so much sponsored content in every one of these results, and it is so unclear what is what, and who is paying for what, and why I'm getting this song or that result, that this whole industry, or part of the industry... you know, one reason I ended up on these platforms is because I trusted these results, and now I trust nothing.
>> Yeah. I mean, it goes back to the definition of extraction. We are kind of paying $70 billion collectively to make search worse.
>> So when does this move from: this is just their business model, and if you want to find something else, go buy something at Walmart, go buy something at Target, go buy something at Best Buy. You can do all those. I've done all those. I just ordered a blender from Kohl's. Versus: we've moved to extraction, and we should see it as a public policy problem.
>> Yeah, I think that's a really great question. It's a kind of question we've faced repeatedly in history: when a business model starts to settle down, you see less real disruptive competition possible. And Amazon is still a great way to find a lot of products; it's the world's largest marketplace. But I would say they're running themselves like an unregulated monopoly. And I guess I would compare it to electricity. I mean, we'd all say the electric network is great, we can't do without it, they provide this incredible service. But would we really say, OK, we're just going to put up with whatever prices they charge? I don't think we would. And I think at some level, once a market has settled, at some point you've got to call a limit. And we do that in many other markets.
>> Both of you spend a
lot of time on the number of small acquisitions that these companies make. And so not where Google buys Waze, but where Google buys something very modest, and maybe many of them get shut down, or they acqui-hire the top people, but they're also things that might have grown into something bigger. On the other side, sometimes it really is the case that a big player buying something smaller can scale it up into something new. Like Google bought, and this was actually a fairly big acquisition, Waymo, and kind of amazingly they seem to have made driverless cars work, and I think access to Google's compute and other things was not insignificant in that. And you can look at other cases where these companies buy something small and are able to build it into something that ends up being a great option in Microsoft Office or in Google Docs or whatever it might be. So how do you think about the ways in which that harms competition? But also, you know, I've known founders who get acquired and are excited to get acquired, because they think it will give them scale and the capacity to compete in a way they wouldn't versus Google just trying to do it itself. I think the antitrust level is one thing, but the sort of anti-competitive versus pro-scale level is a much bigger challenge in the way Silicon Valley now works. And I'm curious to hear you talk through the pros and the cons of that.
>> You know, Joseph Schumpeter, back in 1911, wrote a book about entrepreneurs, basically. And he said, these are very unusual people who are willing to go out and take these kinds of risks. They have some vision. And he thought they were essential to economic growth, that they were these unusual, almost superhero-like figures who go out and take these chances. The United States economy in general has thrived because it has a lot of those kinds of individuals, and they can start things. And I think we've erred too far in having all the brains under one roof, and it's starting to remind me of AT&T in the '60s, or IBM, where they became much more centralized about innovation, and big ideas would never be developed. It became kind of group-thinky. When the Justice Department sued AT&T, tried to break them up, and forced AT&T to stay out of computing and to license all of their patents, including the transistor patent, all kinds of people started quitting their jobs and saying, I'm going to start a semiconductor firm. And there lie the origins of US semiconductors, and also, frankly, US computing without AT&T. So I think we have done much better with divided technological leadership. I frankly think that LLMs might never have gotten started without OpenAI being an alternative force, because they're obviously threatening to Google's business model, though.
>> Don't you, in a way, have to give Google some credit on LLMs specifically? You were talking about transistors a minute ago, but Google does the fundamental research on transformers and releases it publicly, and in many ways creates the industry.
>> But doesn't do anything with it internally until there's a competitor that threatens them.
>> Yeah, that's right. It's just striking how good of an actor they were for a period on AI specifically, right? Like treating it like they had a Bell Labs.
>> I agree with that. It actually is a lot like Bell Labs, in the sense that Bell Labs kept inventing stuff. I mean, Bell Labs collected a lot of amazing people and then never let things come to market, the internet being probably the best example of it.
>> Yeah. So I think when you look at these companies and their acquisitions, what you see is that these companies very quickly suffer from what both Brandeis and Tim have called the curse of bigness. They find it very hard to bring to market an actual product that they invent in-house. When you look at Google, they've had one really successful consumer-facing product launch, and that was in the previous millennium, and almost everything they made in this millennium failed, right? It either didn't launch, or after it launched, they shut it down. Whereas their giant successes, their video stack, their ad-tech stack, documents, collaboration, maps, navigation, server management, mobile, all of this stuff: these are companies they acquired from someone else and operationalized. And I'm an ex-ops guy, I'm a recovering sysadmin, so I'm not going to say that that's nothing, right? It is a skill unto itself, the careful work to make things work and make them resilient and scale them.
>> But the idea that that has to happen under one roof, I think, is a false binary, right? I mean, one of the things Google did arguably far more efficiently than hiring innovators is that they hired operations people. And those are the people who really do the yeoman service at Google, because the innovators, the product managers, never get to launch. They only get to buy other people's products and refine them.
>> You know, it comes down to what you think the track record of monopolized innovation is, I guess. And it has some hits, but I'm saying a more mixed model, I think, is historically a lot stronger. If you look at the '70s and '80s, if you look at the entire track record of US innovation, I think monopoly innovation leads you toward the AT&T, Boeing, General Motors kind of model, as opposed to what the best of Silicon Valley has been.
>> And meanwhile, I think you mentioned acqui-hires. For people who aren't unfortunate enough to be steeped in the business of Silicon Valley, an acqui-hire is when a company is purchased not for the product it makes, but because the team who made it has proved they can make a product; and then they shut down the product and they hire the team. And acqui-hires are, I think, a leading indicator of pathology in tech and investment. An acqui-hire is basically a postgrad project where venture capitalists sink some money into you pretending that you're going to make a product. It's a science-fair demo, in the hopes that the company will buy you and, in lieu of a hiring bonus, will give you stock, and, in lieu of a finder's fee, will give them stock. But no one's trying to actually capitalize a product or a business. I think anytime you see a preponderance of acqui-hires in your economy, that should tell you that you need to sit down and figure out how to rejigger the incentives, because your economy is sick.
Cory, we've been talking here about these markets as really having two players in them, well, maybe three. We've been talking about users, sellers, and platforms. But something your book focuses quite a bit on is a fourth, which we need to talk about too, which is labor. There are huge numbers of people working for these companies, huge numbers of people delivering Amazon packages and Walmart packages. And one thing that both of you focus on is the way in which, as these companies become bigger and more dominant, their labor practices can become, I don't know if enshittification is the term you would use there, but shittier, or more extractive. Can you talk a bit about that side of it? What has happened to the labor practices?
>> Yeah. I mean, we could talk about the other tech workers, right? The majority of tech workers drive for Uber or for Amazon, or work in a warehouse, and they certainly don't get free kombucha and massages and a surgeon who'll freeze their eggs so they can work through their fertile years. They're in a factory in China with suicide nets around it. But if we want an example that pulls this all together, how you get monopoly, regulatory capture, and the degradation of labor with technology that relies on blocks on interoperability, I think we could do no better than to talk about nurses. And I'm going to be making reference here to the work of Veena Dubal, a legal scholar who coined a very important term: algorithmic wage discrimination. In America, hospitals preferentially hire nurses through apps, and they do so as contractors. So hiring contractors means that you can avoid the unionization of nurses. And when a nurse signs on to get a shift through one of these apps, the app is able to buy the nurse's credit history. And the reason for that is that the US government has not passed a new federal consumer privacy law since 1988, when Ronald Reagan signed a law that made it illegal for video store clerks to disclose your VHS rental habits. Every other form of invasion of your consumer privacy is lawful under federal law. And so among the things that data brokers will sell anyone who shows up with a credit card is how much credit card debt any other person is carrying and how delinquent it is. And based on that, the nurses are charged a kind of desperation premium: the more debt they're carrying, and the more overdue that debt is, the lower the wage they're offered, on the grounds that nurses who are facing economic privation and desperation will accept a lower wage to do the same job. Now, this is not a
novel insight, right? Paying more desperate workers less money is a thing you can find in Tennessee Ernie Ford songs about 19th-century coal bosses. But the difference is that if you're a 19th-century coal boss who wants to figure out the lowest wage each coal miner you're hiring is willing to take, you have to have an army of Pinkertons figuring out the economic situation of every coal miner, and you have to have another army of guys in green eyeshades making annotations to the ledger where you're calculating their pay packet. It's just not practical. So automation makes this possible. And you have this vicious cycle where the poorer a nurse is, the lower the wage they're offered, and the poorer they become; as they accumulate more consumer debt, their wage is continuously eroded. And I think we can all understand intuitively why this is unfair and why, as a nurse, you might not want it. But also, do you really want your catheter inserted by someone who drove Uber till midnight the night before and skipped breakfast this morning so they could make rent? This is a thing that makes everyone except one parochial interest worse off. And this is not a free-floating economic proposition. This is the result of specific policy choices, taken in living memory, by named individuals, who were warned at the time that this would be the likely outcome
and who did it anyway.
>> I want to stay on the labor question on a couple of other levels, but I want to ladder this one up for a second, Tim.
>> Sure.
>> Which is, I think this is getting at something we're starting to hear a lot about, which is anger over algorithmic pricing of various kinds. So when I was walking up to do the podcast today, the chyron on CNN was about an investigation finding that Instacart was charging many different people many different prices.
>> And so the price you were seeing on Instacart wasn't the price; it's your price. And I could imagine a neoclassical economist sitting in my seat right now and saying: pricing becomes more efficient when it discriminates. The market will be more efficient if it can charge, you know, Ezra a higher price for kombucha, if I'm getting that delivered, because of things it knows about me and my kombucha habits, and charge somebody else a lower price because it knows they value the kombucha less, or offer a nurse a higher or lower wage depending on their situation. That, in fact, we're just getting better and better at finding the market-clearing price, and this is what economics always wanted, right? We're finally hitting the utopia of every person having the market-clearing wage and the market-clearing price. Why don't you agree with that?
>> Yeah. I mean, the fundamental question is: is that really the kind of world you want to live in? In other words, do you constantly want to live in a place where you are being charged the maximum you would pay for something? Now, that could redound to the benefit of people who are very poor, but in economic terms it's always only about producers taking everything from the market. And just moving away from the potential efficiency of it, I think it makes for a very unpleasant lifestyle, to be constantly feeling you're being exploited. And the other thing I'll say is there's also a huge amount of effort people make trying to move what category they're in, you know, pretending to be poor. So I think it is overrated, and it relies on overly simplistic models of what makes people happy.
There's a way in which efficiency is an interesting term in economics, because in economics, as in life, you want things to be somewhat efficient, but too much efficiency becomes truly inhuman. I find this even in the very modest example of personal productivity efforts. You know, it's great to have a to-do list. But if I really force myself onto the scaffolding of a to-do list at all times, I feel like I cease to be a human being and become a kind of machine, always just getting things done and responding to the emails. And this is a place where I think it was important when you said it raises the question of what kind of world you want to live in. Because the truth is that I don't want to live in a maximally efficient world. I have other competing values. You know, the competitive, efficient market is good up to a point, and after a point it becomes something corrosive to human bonds, to human solidarity. Just-in-time scheduling makes sense from the perspective of economic efficiency, and not if you want healthy families in your society. And I think being able to articulate that question, what kind of world you want to live in, not just what kind of economy works on models, is important, and often a lost political art, in my view.
>> Yeah, I agree. And I feel there are some intuitive feelings here: people feel it's unfair, people don't like being ripped off, people hate paying junk fees. The original word for those inside government, by the way, was [ __ ] fees, but we felt we couldn't have the president say that.
So, yeah, I think that gets to the heart of the matter. I mean, you had also talked about human attention, and human attention turns out to be quite commercially valuable. But do you really want every second of your time and every space you inhabit to be mined for your attention and its maximum value, even if that contributes to the overall GDP of the economy? I mean, I'd like to have some time for my kids and friends in which no one's making any money. And, you know, attention is an example of a commodity that is very close to who we are. At the end of your days, what your life was is what you paid attention to. And the idea that you can, with maximum efficiency, mine that at every possible moment seems to me a recipe for a very bad life.
>> I think one way to frame this, rather than around efficiency, is around optimization.
And I think we can understand that, for a firm, the optimal arrangement is one in which they pay nothing for their inputs and charge everything for their outputs. So things are optimal from the perspective of the firm when they can discover who is most desperate and pay them as little as possible, or who is most desperate and charge them as much as possible. But from the perspective of the users and the suppliers, things are optimal when you get paid as much as possible and are charged as little as possible. And so much of the specific neurological injury that arises from getting an economics degree is organized around never asking the question: optimal for whom? I mentioned before that we don't have any privacy law in this country. One of the things that a privacy law would let us do is to become unoptimizable.
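The desperation premium Cory attributes to Veena Dubal's work, a wage offer that falls as the worker's surveilled financial distress rises, can be sketched as a toy model. The function and numbers below are invented for illustration; no real app's formula is being quoted here.

```python
# Toy model of algorithmic wage discrimination: the wage offered falls
# as the worker's purchased credit data suggests more desperation.
# Hypothetical thresholds and weights; not any real platform's formula.

def offered_wage(base_rate, card_debt, months_delinquent):
    """Offer a lower hourly wage to workers whose credit history
    suggests they are desperate enough to accept it."""
    desperation = min(1.0, card_debt / 50_000 + 0.1 * months_delinquent)
    return round(base_rate * (1 - 0.3 * desperation), 2)

# A nurse with no visible debt is offered the full rate...
print(offered_wage(60.0, 0, 0))       # 60.0
# ...while one carrying delinquent debt is offered less for the same shift.
print(offered_wage(60.0, 25_000, 3))  # less than 60.0
```

The vicious cycle Cory describes falls out of the model directly: a lower wage makes it harder to pay down the debt, which raises the desperation input on the next shift.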
All optimization starts with surveillance. Whether it's TikTok trying to entice your kids into spending more time than they want to spend there, or advertisers finding ways to follow you around and hit you up with things that you're desperate for, or discrimination in hiring or in lending, all of this stuff starts with an unregulated surveillance sector. We have platforms that take our data and then sell it and use it and recycle it and become sort of the Lakota of information, where they use the whole surveillance package. And we do nothing to curb that behavior. It is not an incredible imaginative lift to say that we might tell them to stop.
>> I want to pick up on surveillance, because when you talk about the harms to an economy working in a human way, I think the new frontiers in how you can surveil workers,
>> Mhm.
>> are, I think, going to become a very big political issue, and probably should be already.
>> So the category this falls into is broadly called bossware, and there are a whole lot of different versions of it. Like, if your firm buys Office 365, Microsoft will offer your boss the ability to stack-rank divisions within your firm by things like how often they move the mouse and how many typos they make and how many words they type. And then, this is amazing, they will tell you how you perform against similar firms in your sector, which is the most amazing thing I can imagine: that Microsoft is finding customers for a sales pitch that says, "We will show you sensitive internal information about your competitors." And apparently none of those people are like, "Wait, doesn't that mean you're going to show my competitors sensitive commercial information about me?" So you have this
on the broad-strokes level. But I have this notion I call the shitty technology adoption curve, right? If you've got a really terrible idea involving technology that's incredibly harmful to the people it's imposed on, you can't start with me. I'm a mouthy, white, middle-class guy with a megaphone, and when I get angry, other people find out about it. You have to find people without social power, and you grind down the rough edges on their bodies. You start with prisoners. You start with people in mental asylums. You start with refugees. And then you work your way up to kids, and then high school kids, and blue-collar workers and pink-collar workers and white-collar workers. And it starts with: the only people who eat dinner under a CCTV camera are in supermax. And 20 years later, it's like, no, you were just dumb enough to buy a home camera from Apple or Google or, God help us all, Facebook. Right? So that is the shitty technology adoption curve. And if you want to know what the future of workers is, you look at the least privileged workers at the bottom, and then you see that technology working its way up. If you look at drivers for Amazon, they have all these sensors pointed at their faces, sensors studded around the van. They're not given a long enough break even to deal with things like period hygiene. And so women who drive for Amazon who go into the back of the van to deal with their periods discover that that's all on camera, because that's all being recorded.
All of this stuff is subject to both manual and automated analytics. And at one point, Amazon was docking drivers for driving with their mouths open, because that might lead to distraction while driving. And so, as you say, it kind of denudes you of all dignity. It really is very grim. And, you know, Tim and I used to ride the Toronto Transit Commission buses to school in the morning when we were going to elementary school, and we loved the drivers who would sing and tell jokes and remember you. This is the thing that makes working in the world, being in the world, great: having a human relationship with other humans, not being standardized labor units that have been automated and standardized to the point where they can be swapped out. You know, if you give a cashier a cash register, instead of making them add things up on paper, you could give them the surplus to talk with the customers and have a human relationship with them. Or you could speed them up, so that you fire nine-tenths of the cashiers and take the remainder and make them work at such an accelerated pace that they can't even make eye contact.
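The activity-based stack ranking Cory mentioned earlier, scoring workers on mouse movement, words typed, and typos, can be caricatured in a few lines. This is a hypothetical sketch of the logic of such "productivity scores," not Microsoft's actual metrics or formula.

```python
# Hypothetical caricature of bossware "productivity scoring": crude
# activity proxies stand in for actual work. Invented weights and data;
# not any real product's formula.

def productivity_score(mouse_moves, words_typed, typos):
    """Score a worker on surveilled activity, penalizing typos."""
    return mouse_moves * 0.01 + words_typed * 0.1 - typos * 0.5

def stack_rank(divisions):
    """Rank divisions by average surveilled 'productivity'."""
    averages = {
        name: sum(productivity_score(*w) for w in workers) / len(workers)
        for name, workers in divisions.items()
    }
    return sorted(averages, key=averages.get, reverse=True)

divisions = {
    # (mouse_moves, words_typed, typos) per worker -- invented data
    "billing": [(5_000, 1_200, 10), (4_000, 900, 5)],
    "support": [(9_000, 2_500, 40), (8_000, 2_200, 30)],
}
print(stack_rank(divisions))  # ['support', 'billing']
```

The point of the caricature is that the score rewards surveilled motion, not output: the division that moves the mouse more "wins," regardless of what it actually produces.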
There were things in Cory's answer there that, in my view, we should just make a social decision to outlaw. Like, I am willing to say, politically, I want to vote for the people who think you can't surveil workers' eyeballs. And if other people want to stand up and say the surveillance of workers' eyeballs is great, that's a good values debate to have in a democracy, and I know where I fall on it. Then there are other things. I'll build on the cash register example to say that I really struggle with how, as a public policy matter, one should think about the rise of automated checkout in the way we've seen it. I watch people turned into these managers of machines.
So, they've gone from being somebody who, you know, did checkout with me and asked me how my day was, and I asked them how their day was, and now they get called over because the three apples I put on the weighing machine didn't weigh in correctly, and it seems dehumanizing to them and dehumanizing to me. But I also get it. How do you think about weighing that? There's the stuff that is genuinely grim and dystopic and that maybe we should just outlaw. And then there is stuff like just generalized automation, in which there genuinely can be a consumer surplus. Like, time is a surplus for me. Things moving faster is a surplus for me. More checkout stations is a surplus for me. And there's a cost on the other side of it.
>> Well, the first thing I'd say is we should be making more of these kinds of decisions about what we really care about and what kind of world we want to inhabit. I mean, one of the things that happens is that, by default, we don't pass any laws or have new ethical codes. I mean, ethics does a lot of work, and we just sort of allow a trump card to new stuff because it's new. And, you know, I get that you don't want to ban everything new that shows up, but I feel that over the last 15 years or so we have sometimes just taken a position that the people don't get to vote on this. I mean, a good example is everything to do with children. I don't think there are a lot of people who think it's a great thing to surveil children, have targeted ads for children, and try to create addictive technologies for children. When I worked in government, we tried to pass even just basic child privacy laws. We couldn't get a vote, ever. And so one of the things that's going on is we're not even deciding these things as a society. And that gets to the problem of Congress not taking votes on popular issues. But I also think this relates to
our conversation earlier about
competition and when it's good and when
it's bad because I think for almost any
endeavor, there's such a thing as
healthy competition and such a thing as
toxic competition. You know, we were talking about attention markets earlier. What is good, healthy competition in attention markets? It's making really great movies, new TV shows that people love, podcasts that people want to listen to. Toxic competition is the stuff you're talking about: essentially, different forms of manipulation and addiction. And we've had this kind of hands-off attitude, that we cannot try to direct things in a positive direction. I think that has been a giant mistake. So, first I would say we have to even try to make the decisions. How would I do the tradeoff? I guess I would start with the most unredeeming, toxic stuff and ban that first, and then see what we can do from there. That's maybe easy, but we haven't been able to even do that. And I was sort of shocked, when I worked in government, that we just could not get a vote on what seemed like 90-10 stuff, like privacy laws. I mean, even national security was really into this stuff. They're like, it's too easy to spy on everybody, and that's a problem for us as a national security issue. And we just could not get a vote on even the most basic anti-surveillance law, which would say that if you download a dog-walking app, it shouldn't just be tracking you and uploading every kind of information about you. That should be illegal. I
have been very disturbed that we've not been able to do more on surveillance and privacy, and I've also been struck by how badly what has been done elsewhere seems to have worked out. I call this
terms and conditions capitalism, where you just move the burden onto the consumer. So Europe has put out some very sweeping rules that have given me the opportunity to individually decide whether each of the 303 cookies on every website I visit might be good or might be bad. Similarly, nobody, in my view, to a first approximation, has ever read an iOS terms and conditions update. And I have found that very often, it seems to me, where policymakers end up after the debate is saying, well, as long as there is disclosure, then the consumer can decide. But the consumer, in a very rational way, does not want to decide.
>> Yeah.
>> So it has ended up, I think, in a very dispiriting place. Instead of creating a structure in which I'm confident that what companies are doing is well-bounded, it has demanded of me a level of cognitive work that I'm not willing to do, and that I think nobody else is willing to do, to oversee those companies myself, with not really great options if I don't like what they're doing. And so I'm curious how you think about that.
>> No, I couldn't agree more. I feel like if the byproduct of government action is that you are clicking on more little windows, that is government failure. And I would trace it, frankly, to a lack of courage on the part of government, the regulators, the officials, to make decisions that are really supposed to help people. It's much easier to say, well, I'm afraid to do something, so I'm going to let them decide. So, I agree. I think the GDPR has actually failed to prevent surveillance.
>> That being the European regulation that created all those pop-ups.
>> Yeah, the GDPR, the European privacy law, succeeded in creating a lot of pop-ups and things to mess with, and succeeded in making it harder to challenge big tech companies in Europe, because the little guys are overregulated too and have to go through all this stuff. And so, yes, I think this has been
a failure. I think for people to start to believe in government again, it has to help us in situations where we are not strong enough to deal with something much more powerful, or something that has a lot more time to think about it. I mean, it's like we're playing poker against experts. At some point, we need to get a backbone and have government on people's side. Now, I'm starting to sound like a politician, but I mean it. People say that, but really doing it means helping people when they are powerless or distracted or don't have the energy to deal with things.
>> Cory?
>> So, look, I love you both, but I think you're dead wrong about the GDPR, just as a factual matter, about where it comes from, what it permits, what it prohibits, and why it failed. Because I agree it failed. So, you may ask yourself, how is it that GDPR compliance consists of a bunch of cookie compliance dialogues? And the answer to that is
that European federalism allows tax
havens to function within the
federation. One of the most notorious of
those is Ireland. And almost every
American tech company pretends that it's
Irish so that its profits can float in a state of untaxable grace in the Irish Sea. And because of the nature of the GDPR, enforcement for these [ __ ] cookie popups, which are the progeny of the big American tech companies, starts in Dublin with the Irish data commissioner, who, to a first approximation, does nothing.
>> That sounds bad, but I want to get you to explain the core mechanism you're describing here better, because I actually don't know it. The GDPR did pass, and then all of a sudden the entire internet filled with these pop-ups.
>> So that's only because the
companies went to Ireland, broke the
law, and said, "We're not breaking the
law, and if you disagree, you have to
ask the Irish data commissioner to
enforce against us." A few people, Johnny Ryan with the Irish Council for Civil Liberties, and Max Schrems with NOYB, "none of your business," a European nonprofit. They've dragged some of those
cases to Germany. More importantly,
they've got the European Commission to
start modifying the way the law works.
So you can just tick a box in your browser preferences, and it can come turned on by default, that says: I don't want to be spied on. And then they're not allowed to ask you. So the answer is just going to be no. And so I think that corporations want you to think that it is transcendentally hard to write a good law that bans companies from collecting data on you. And what they mean is that it's
transcendentally hard to police
monopolies once they've attained
monopoly status because they are more
powerful than governments. And if that's
their message, then a lot of us would be
like, well, we need to do something. We
need to turn the cartel into a rabble
again. As opposed to, God, I guess
governments just have no role in solving
this problem.
>> The one place where I do disagree with you, having covered a lot of different cartels and rabbles lobbying Congress: it is not easy to regulate the association of community banks, for instance. When you have something where there are, in every single district, individual leaders who will come and lobby their member of Congress, it's really hard. I am not saying that monopolies are good because they make it easier to regulate. I'm just saying that it doesn't solve the problem that government runs on money and influence, and on top of that, it's very hard to get votes.
Yeah. So we can do that. But I want to build on this and ask Tim about a separate but related question. Tim, you mentioned a second
ago sort of the entertainment industry, and one of the questions about to come up is whether Netflix should be able to buy all of the entertainment assets, I should say, of Time Warner. And this is one where I think people who care about the quality of the media we consume seem, for reasons that seem compelling to me, very, very worried about having that happen. How
would you think about that? And is this a place where we need to, say, be making values judgments that are different from our antitrust judgments? Is this a place where the antitrust laws can suffice? Is everybody just worried about something they don't need to be worried about? How do you see it?
>> Yeah. No, I think this is a place where, if the antitrust laws are enforced correctly and fairly, the merger or the acquisition would be blocked. And I'd say that this is not a particularly exotic situation, in the sense that you have the number one premium streaming company wanting to buy the number three or number four. And if you do the numbers under the guidelines, which the government issues to tell people when their mergers are presumptively illegal, the result is that this is a presumptively illegal
merger. The reason I do think it's bad is that I think Netflix and Time Warner have, frankly, over their history, been some of the most innovative, interesting outlets, and often in an oppositional role. This goes way back, but Time Warner took a chance on sound film back in the '20s. In the '50s, they took a chance on television, which people thought was useless. And then prestige television in the early 2000s, with HBO and the golden age. So, they've
taken a lot of bets. Netflix has done a
lot of innovative stuff, really interesting, obviously. And frankly, if you want to talk about good tech over the last 20 years, how about not having to wait until your show comes on? That's a form of efficiency I can agree with. And I think it would be a tragedy to have these two companies, which are often so oppositional, combined into one. I think culturally it would be a great mushification. At the economic level, just to continue on this, I think it's usually going to
be those two companies who are bidding
for the most interesting shows. So, if you had a new version of, I don't know, White Lotus or The Wire, who is going to be bidding for it? It's going to be Netflix and Warner's HBO, or a few others. So, you know, the elimination of one bidder is just the definition of a loss of useful competition. So, yeah, I think it's pretty straightforwardly illegal. I don't think it's that complicated.
>> Cory, you looked like you wanted to jump in on that.
>> No, I think that one of the things we should probably anticipate Time Warner saying in defense of this merger is the same thing that Simon & Schuster and Penguin Random House said in defense
of their failed merger that was blocked
under the Biden administration. They said, "Oh, well, we'll still internally bid against one another within our divisions for the most premium material, and we'll be exposed to discipline that way." And I love what Stephen King had to say about this when he testified. He said, "That's like me and my wife promising to bid against each other on the next house we move into."
Tim, one thing I was thinking about
while I was reading your book was the
metaphor you use of a gardener: that the way to think about economic regulation and antitrust and a bunch of the different buckets of solutions we're talking about is like a gardener who is trying to prune certain species and plants to keep them from taking over the garden. And the gardener has to make judgments. There are some decisions you make as a gardener because you don't want blight getting all over your garden and killing everything. But others are made for aesthetic reasons, and others are made because you want to have native species and not invasive species. And there are all these sorts of decisions being made. And having been around
conversations of economic regulation and
tech regulation for a long time,
I've come to this view that there is a fetish in them for truly neutral rules: that what people always seem to be looking for is a rule that you don't have to apply any judgment to. You can just say, if you get over this line, everybody knows it's bad.
As opposed to actually having to say: we have views about how the economy should work. We have views about how our society should work. We want the interests of small businesses to prosper, and they'll prosper more if they don't have to give 30 cents of every dollar to Apple or Google, or to Facebook if you're selling on the Facebook marketplace. And yet, and you've been a policymaker, Tim, I think there has been a kind of defensive crouch, particularly among Democrats, and Lina Khan and others were an exception to this, but a sort of effort to describe everything neutrally, when sometimes you just don't want to be neutral on how fundamental companies and markets in your economy are working. You want to be able to have values that those serve, as opposed to your values being subservient to your economy.
>> Yes, I agree with that, and I think it's an astute observation. I think it comes, as I said earlier, from a lack of courage or vision. It reminds me of what you said when you were talking about, well, okay, we'll just create a bunch of windows and let everybody decide what options they want for their privacy and hope that works. It comes from that same impulse: that we don't actually want to arrive at a vision of the good society. It's one of the flaws of classical liberalism, frankly, if you get into the political theory. And frankly, the gardener metaphor is targeted at that. It's not just: let it all run and see what happens. It is one where you have some idea of what kind of world we want to live in and what kind of society we think is good, and you have to make decisions based on that. I think we need a vision of what we want and what a good country looks like, and a good place to live.
>> So I think that bright-line rules make a lot of sense, particularly where you have questions that have to be frequently adjudicated. The thing we really want to be asking, before we ask any of these other questions, is: how often are you going to have to answer this question? So lots of people are like, oh, we should just ban hate speech and harassment on platforms. Well, that's hard, not because we shouldn't do it, but because agreeing on what hate speech is, agreeing on whether a given act is hate speech, agreeing on whether the platform took sufficient technical countermeasures to prevent it, is the kind of thing you might spend five years on. And hate speech happens a hundred times a minute on platforms. Meanwhile,
if we said we're going to have a bright-line rule that platforms must allow people to leave but continue to communicate with the people they want to hear from, then people who are subjected to hate speech, who are currently there because the only thing worse than being a member of a disfavored and abused minority is being a member of a disfavored and abused minority who is isolated from your community, those people could leave and go somewhere else. And it's not that we shouldn't continue to work on hate speech in parallel. But if you think that a rule that takes three years to answer a question is going to solve a problem that happens a hundred times a second, you're implicitly committing to full employment for every lawyer in the world just to answer this question.
>> One thing I admire about both of your books is that you spend a lot of time on solutions, and I don't think we can go through every one. But let me do it this way for each of you. And Cory, why don't we start with you?
>> Mhm.
>> If you were king for a day, what are the three areas, or the three policies, you can define it the way you want, that you think would make the most difference?
>> One would be getting rid of this anti-circumvention law in America. It's Section 1201 of the Digital Millennium Copyright Act. It should be legal to modify things you own to do things that are legal, and it shouldn't be the purview of the manufacturer to stop you from doing it.
Another one would be to create a muscular federal privacy right with a private right of action, so that impact litigators like the Electronic Frontier Foundation, as well as aggrieved individuals, could bring cases when their privacy rights were violated. And I guess the third would be an interoperability mandate, specifically for social media. So it would be a rule, and we've had versions of this. The Access Act was introduced, I think, three times, in various versions. They're all pretty good. Mark Warner, I think, was the main senator behind them. But a thing that just says you should be able to leave a social media network and go to another one and continue to receive the messages people send to you, and reply to them, the same way you can leave one phone carrier and go to another. And there's a lot of technical detail about what that standard looks like, and how you avoid embedding the parochial interests of incumbents, and so on. I don't think they're insurmountable. And I think that the trade-offs are more than worth it.
>> Tim?
>> So, I'll say three things. First, I think we need the confidence to ban the worst and most toxic business models that are out there, whether it's exploitation of children or, frankly, some of this total, absolute price discrimination you're talking about, which may technically already be illegal. Number two, I think it's unquestionable that the platforms, the main tech platforms, have become essential to commerce. I'm not in any way thinking you can do without them. And so I think we need to understand which of them need to be treated more like utilities, and which of them need to be not allowed to discriminate in favor of themselves, or as between customers, to try to maximize their extraction.
>> Can I hold you on that one for a minute before you go? Because whenever I hear this, it makes sense to me, and then I think to myself: do the people I know who focus on how utilities act and are regulated seem happy with the situation? And the answer is no. They all think it's a total disaster. So when you say they should be treated as utilities: you worked in the Biden administration, and everybody who works on, say, green energy will tell you that the models and regulatory structures of the utilities are a huge, huge problem. What specifically do you mean?
>> It's a good question, and I've spent a lot of my life exposed to that. But I think what's important about utility regulation is what it doesn't allow to happen. The electric utility regulators are not perfect. On the other hand, if you think about the electric network, it has been an extraordinary foundation for people to build stuff on. And the reason they're able to build on it is that they don't think the electric network is going to take half their profits if they invent the computer on top of it. Or they don't think that, for example, the electric network is going to decide that it likes Samsung toasters instead of LG's, or Zenith's, or whoever else's toasters. They don't discriminate between manufacturers on the electric network. And so I think we need to understand, and look carefully at, which parts of the platforms are the most like the electric network or the broadband network, where they are essential to the rest of business and therefore need to play by different rules. Some of those main rules, the most obvious, are duties of treating everybody the same, so they don't play favorites. And then, if you've got that figured out, you get to the question of price regulation: should Amazon's margin be capped at 30 percent, or something like that?
>> And then number three for you.
>> Number three, I think we need constant, you know, I'm an anti-monopoly kind of guy, constant pressure on the main tech platforms, so that they stay, I guess, insecure in their position and aren't able to easily counter new forms of competition. I think you have to take out of the picture the easiest ways of tamping down or eliminating challenges to your monopoly. I think that's been a really important thing in US tech since AT&T, since IBM, since Microsoft: keeping the main dominant market players insecure, and forcing them to succeed, to improve themselves, as opposed to buying off their competitors or excluding them. So that's my third.
So before we wrap here, I want to return to something we've sort of been circling, which is: what kind of competition do we want to be encouraging among these platforms? Tim, one thing you said earlier was that there can be this difference between healthy competition and toxic competition, which, if you read a lot of economic commentary from the early 20th century, you hear a lot about. And I feel like we don't talk about it that much anymore. But this is a place where I've been skeptical of the argument that many problems would be solved by breaking up the big platforms, particularly the attentional social media and algorithmic media giants, in that I don't think Instagram has gotten better under pressure from TikTok. I don't think that more ferocious innovation and entrepreneurial work to capture my attention, or my children's attention, is necessarily good. Maybe the problem is that the entire thing the companies are trying to do, whether there are two of them or 50 of them, is negative.
>> Yeah, it's a really good point and a good question. I think in the markets you're talking about, we have a serious failure to wall off, discourage, ban, or ethically consider wrongful the most toxic ways of making money. There is such a thing as healthy attentional competition, like making a great movie that keeps the audience enraptured for two hours, or producing a great podcast. That is good attentional competition, and frankly, the attentional market includes all these forms. But we have just allowed the flourishing of negative models. So I think if you had a world in which you had many more limits on what counted, and what was frankly legal, in terms of manipulating your devices, you would see more positive competition if you broke up some of these companies. I just think the entire marketplace of social media is cursed by the fact that we haven't gotten rid of the most brutal, toxic, and damaging business models, for our country and for our children and for individuals.
>> I think that is a nice place to end. So, always our final question: what are three books you'd recommend to the audience? And Tim, why don't we begin with you?
>> Sure. I'd start with E.F. Schumacher's Small Is Beautiful: Economics as if People Mattered. And I say that because it targets this question of: what kind of world do we want to live in? I think our efficiency obsession is taking us in one direction, and I think we should choose a different direction. A second book is more recent. Cass Sunstein wrote a book on manipulation that I think is underrated and is really good for understanding what we have allowed to happen. It's called Manipulation: What It Is, How to Stop It. The last book, I guess this is where I got some ideas about tech platforms and the big picture, is Paul Kennedy's The Rise and Fall of the Great Powers. I feel everything is on a cycle, and every empire has its destiny: its golden age, its decline, its stagnation and fall. And I feel like understanding imperial dynamics is very important to understanding the technological empires of our time.
>> Cory?
>> Yeah. So my first pick is Sarah Wynn-Williams's book Careless People. And it's a great example of the Streisand effect: when a company tries to suppress something, it brings interest to it. Wynn-Williams was a minor diplomat in the New Zealand diplomatic corps. She became quite interested in how Facebook could be a player geopolitically. She started to sort of nudge them to give her a job as an international governmental relations person. No one was very interested in it, but she just sort of kept at it until she got her dream job. And then the dream turned into a nightmare. My second choice is a book by Bridget Read.
It's a book called Little Bosses Everywhere, and it's a history of the American pyramid scheme. And it's an argument that the American pyramid scheme is the center of our current rot. Everywhere you look in the MAGA movement, you find people who have been preyed upon by the kinds of scams that are characteristic of it, and who've adopted the kind of toxic positivity that comes with it. It is an incredibly illuminating, beautifully researched book. And then the final book
is a kids' book by my favorite kids' book author ever, this guy called Daniel Pinkwater. And last year he had a book out from Tachyon Publications called, let me find the title here, Jules, Penny and the Rooster. And recapping the plot of this book would take 10 minutes, because it is so gonzo and weird. But suffice it to say, it revolves around a young woman and a talking prize dog who find a haunted woods nearby, where the young woman is welcomed, in a sort of Beauty and the Beast story, as a kind of savior, but who wants no part of it. It's funny. It's madcap. It's gonzo. It's full of heart. It is like everything great about a kids' book. I read my daughter so many Daniel Pinkwater books when she was little. They are so fun to read at bedtime. It's a middle-grade book, and I cannot recommend it highly enough: Jules, Penny, and the Rooster, by the incredible Daniel Pinkwater.
>> Cory Doctorow and Tim Wu, thank you very much.
>> Thank you.
>> Thanks, Ezra.