
How AI Could Freeze Progress with Hilary Allen | Masters in Business


Transcript


0:02

Bloomberg Audio Studios: podcasts, radio,

0:06

news.

0:10

This is Masters in Business with Barry

0:13

Ritholtz on Bloomberg Radio. I'm Barry

0:17

Ritholtz. You're listening to Masters in

0:19

Business on Bloomberg Radio. My extra

0:22

special guest this week is Hilary

0:24

Allen. She is a professor at the

0:26

American University Washington College

0:29

of Law in DC, where she specializes in

0:33

financial regulation, banking law,

0:36

securities regulation, and technology

0:38

law. She published a book, Fintech

0:42

Dystopia, a summer beach read about how

0:45

Silicon Valley is ruining things,

0:48

covering the intersection of finance,

0:51

technology, law, regulation, and

0:53

politics. It's a perfect subject for us

0:56

to talk about. Hilary Allen, welcome to

0:58

Bloomberg.

0:59

>> Thank you so much for having me.

1:01

>> So, fascinating conversation,

1:03

fascinating uh topic that you write

1:05

about. Before we jump into that, let's

1:08

spend a few minutes going over

1:10

your background. You got a Bachelor of

1:13

Laws from the University of Sydney in

1:15

Australia, a Master of Laws in

1:18

securities and financial regulation law

1:21

from Georgetown here in the States. Um,

1:24

and you graduated first in your class

1:26

there. What was the original career

1:28

plan? Was it simply I'm going to go be a

1:30

lawyer? What were you

1:31

thinking?

>> The original career plan was

1:33

I'm just going to be a lawyer. Uh, and

1:35

then I loved law school and I practiced

1:38

for 7 years and discovered there wasn't

1:40

so much law always in the practice of

1:42

law and I'm a nerd and I missed it. And

1:45

so the drive was to go back to

1:48

Georgetown, get my masters, do some

1:50

academic writing and then launch a

1:52

career as a professor where I could

1:54

really sort of think slowly about the

1:56

law. And

1:56

>> And you practiced, you were in London,

1:58

you were in Sydney, uh, Shearman and

2:00

Sterling here in New York. Tell us a

2:02

little bit about the sort of legal work

2:03

you were doing when you were a

2:05

practicing attorney.

2:06

>> So basically there's sort of two broad

2:08

categories of the work I did. Um I did

2:10

transactional work, banking

2:11

transactional, typically acting for

2:13

banks in leveraged buyouts. Um, but the

2:17

work I think I enjoyed more was the

2:19

regulatory compliance advisory. Um so

2:22

there was more law in that, um,

2:24

especially when you had new financial

2:26

laws being handed down in Australia and

2:28

changes in the US with Dodd-Frank and

2:31

sort of trying to figure out how to

2:32

comply with those new rules.

2:34

>> So, how do you go from practicing

2:37

bank transactions and some regulatory

2:39

law to ultimately working with the

2:42

Financial Crisis Inquiry Commission?

2:44

Tell us a little bit about your

2:45

experiences there.

2:47

>> So, that was a series of

2:49

fortunate events. Um, while I was doing

2:52

my masters at Georgetown, I had a

2:54

professor who was tapped to be um on the

2:58

staff of the Financial Crisis Inquiry

3:00

Commission and he pulled me in to work

3:02

with them two days a week and we were

3:03

investigating the causes of the 2008

3:06

financial crisis to put together the

3:08

report uh that came out which really was

3:10

sort of

3:10

>> it's a nice thick book that they

3:11

published.

3:12

>> It's a really thick book with a really

3:13

thick index even. Um, and the idea was

3:16

to tell the story and that's really sort

3:18

of stuck with me throughout my career,

3:20

the importance of being able to explain

3:23

complex things and how they knit

3:24

together to cause things.

>> So, working

3:27

with the FCIC, how did that affect how

3:31

you looked at regulation in general, but

3:34

more specifically the government's

3:37

response to technology, new financial

3:40

products, um, the regulatory world in

3:43

general?

3:44

>> So the gift that I got from working with

3:47

the Financial Crisis Inquiry Commission

3:48

is sort of understanding that there are

3:51

a lot of things that come together and

3:53

you need to really look very broadly to

3:55

understand systemic changes. Um, another

3:59

gift that it gave me was, I think, a

4:01

healthy skepticism of innovation

4:02

rhetoric, right? Because if you think

4:04

back to 2008 and what caused it, you

4:07

know, there were all these stories about,

4:09

well, these new financial products,

4:10

these complex new derivatives, we don't

4:12

need to regulate them. They're

4:14

innovation, sophisticated parties

4:16

involved. We don't want to tamp down on

4:18

innovative potential. And so that

4:20

skepticism has been a helpful skill set

4:22

as I've been navigating the sort of

4:24

post-2008

4:26

financial world where you have the

4:28

innovation rhetoric from Silicon Valley

4:30

infiltrating into financial services.

4:34

>> Uh, you raise a really interesting

4:36

issue that I have to ask about. So how

4:41

much of what we see as regulation is

4:45

either an adherence to an ideology

4:49

that sometimes says regulation is good

4:53

and guardrails on capitalism, uh, and

4:56

the other, uh, ideology says regulation is

4:59

expensive and anti-innovative and reduces

5:03

job creation. It seems like regardless

5:06

of the facts on the ground, each side

5:08

has their belief system. How do you

5:11

contextualize that?

5:13

>> Well, I mean, I don't think

5:15

there were too many people in the depths

5:18

of the 2008 crisis who were saying

5:20

there's too much regulation, right? I

5:22

think it's a function of where you are

5:25

in a particular time. I think people's

5:27

memories fade really quickly

5:29

>> and as soon as regulation has solved the

5:32

problems it was intended to solve or the

5:35

crisis that spurred the regulation has

5:38

dissipated,

5:39

people quickly forget why that

5:42

regulation is in place and then it

5:44

becomes much easier to see it as

5:46

something that is just a hindrance,

5:48

something that is just expensive that

5:49

doesn't have a role to play. But I think

5:52

what we're actually seeing right at this

5:54

moment is the erosion of the securities

5:58

laws that really have stood investors in

6:01

good stead since the 1930s. Not to say

6:03

they're perfect, but the general

6:06

sort of investor protection regime that

6:08

the Securities and Exchange Commission

6:09

has always implemented has really

6:11

encouraged trust in the US um stock

6:15

market and sort of made it the envy

6:17

of the world and people wanted to list

6:18

here. That's really getting peeled back

6:21

right now. And so I think, you know,

6:24

it'll be

6:26

pretty soon a moment where we realize

6:28

why we had all that regulation and we'll

6:30

miss it.

6:32

>> So, so heading into the financial

6:34

crisis,

6:35

um, I recall looking at some of what I

6:38

called radical deregulation prior, and

6:42

this is by no means the sole cause of

6:45

the financial crisis. Lots of factors

6:47

led to this, but you had the Commodity

6:50

Futures Modernization Act, which allowed

6:53

what was essentially an insurance

6:55

product to be issued without any

6:58

insurance reserves. Seems kind of risky.

7:01

And then you had the repeal of

7:02

Glass-Steagall that kept, uh, depository

7:05

banks separate from speculative uh Wall

7:08

Street banks. Probably didn't cause the

7:10

crisis, but certainly allowed it to get

7:13

much bigger, at the very least.

7:16

Um, and yet there didn't seem to be any

7:20

desire after the crisis to say, hey, maybe we

7:23

should put these things back into place.

7:25

Maybe we should repeal what was added

7:27

and restore what was repealed. Nobody

7:31

wanted that; they went a totally different

7:32

direction.

>> Well, I think again this is a

7:34

story of political economy and there are

7:36

still a lot of people who are mad at the

7:38

Obama administration for prioritizing

7:40

health care over financial reform

7:42

>> Because basically they had one shot at

7:44

doing something big. Um, and if they had

7:48

and I'm not weighing in to say that

7:51

this was the right or the wrong move,

7:53

but if they had gone right out of the

7:56

gates with financial reform, I think we

7:58

would have seen more of the bigger

8:00

structural things that you're talking

8:02

about. So, you know, in that immediate

8:04

aftermath of the 2008 crisis, you had um

8:07

Sandy Weill, who had been the head of

8:09

Citigroup and had sort of engineered

8:12

the end of the Glass-Steagall, um,

8:14

legislation. And this may be

8:17

apocryphal, but apparently he had a

8:19

deal toy that said, um, "Shatterer of

8:22

Glass-Steagall" that he kept on his

8:24

desk. And again, this may be apocryphal,

8:26

but I heard that he basically sort of

8:28

had a conversion after 2008 and said,

8:30

"Ooh, yeah, probably shouldn't have done

8:32

that."

8:32

>> Well, a lot of people did. Alan

8:34

Greenspan famously said, "I incorrectly

8:38

assumed people's concern over their own

8:40

reputation would have prevented some of

8:42

the excesses we've seen." I'm

8:44

paraphrasing, but that was pretty close

8:46

to what he said.

8:46

>> Yeah. He said, "The world sort of didn't

8:48

work the way I thought it did." And I

8:50

think, you know, had they gone straight

8:51

out of the gates with financial reform,

8:53

you might have seen some of that

8:54

structural reform, but by the time they

8:57

got around to it, you know, DoddFrank

8:58

wasn't passed until 2010. You know,

9:01

then the political economy calculus had

9:04

shifted. Um, the industry was in more of

9:06

a position to sort of argue for weaker

9:09

rules and and fewer structural changes.

9:11

It's amazing how rapidly memories

9:14

fade and people just quickly,

9:16

>> Oh no, that was then, now it's new.

9:19

You've worked inside the global

9:21

financial system as well as studying it

9:23

from the outside. How did being part of

9:26

the FCIC affect how you perceive

9:29

technology, new financial products,

9:32

regulation,

9:33

um, and deregulation? How how did that

9:36

affect your your perspective?

9:38

>> You know, I didn't think a ton about

9:39

technology at that time. And that's sort

9:41

of been a later addition to the work

9:42

that I do. But the broader themes of

9:44

financial innovation, um, regulation,

9:48

deregulation,

9:50

you know, I see the value in financial

9:53

stability regulation in particular. So

9:55

financial stability regulation are the

9:57

rules that are supposed to prevent

9:58

financial crises and they work often

10:00

sort of handinhand with investor

10:02

protection regulations, but they also

10:04

aim to do something differently. And

10:07

part of the challenge when you're trying

10:09

to prevent a financial crisis is this

10:12

silo mentality where people just think

10:15

about their own little piece of the

10:16

world and okay we can deregulate our

10:19

little piece and we won't think

10:20

about the flow-on consequences and what

10:22

incentives it'll create, etc. And so

10:26

you know my real takeaway was always to

10:29

have the most holistic perspective

10:32

possible to break down that silo

10:34

mentality. And later in my career that

10:36

meant learning about the new

10:38

technologies that are sort of

10:40

infiltrating the the financial system.

10:42

>> So, I want to talk about technology

10:44

and I want to talk about fintech

10:46

dystopia. But there is a quote from

10:49

within that that applies directly to

10:52

what you're describing with stability

10:54

which was it's the economic precarity

10:57

stupid. Um, paraphrasing James Carville,

11:00

tell us a little bit about the economic

11:03

precarity.

>> Yeah. So I think a mistake

11:07

that we have made collectively in recent

11:10

years is to say well look the economy is

11:13

doing well everything's fine and that

11:15

really doesn't

11:18

you know mesh with many people's

11:20

experience of the economy. So it used to

11:22

be well probably not always the case but

11:25

closer to the case in the Clinton years

11:27

where there was less economic inequality

11:30

than there is now that you could sort of

11:32

say a rising tide lifts all boats but

11:34

now what we're seeing is over half of

11:36

Americans live from paycheck to paycheck

11:39

even in a good economy right and so in

11:42

that kind of circumstance

11:45

the financial system and the economy

11:47

aren't working for everybody and so I

11:49

think when we think about what we're

11:52

trying to achieve with our financial

11:55

system, it should be

11:58

that we are trying to find a solution

12:00

to this economic precarity. And also

12:03

that begs the question of whether the

12:05

financial system and investing is in

12:07

actually the way to get there

12:08

>> and maybe we need broader public

12:10

policies to address that economic

12:12

precarity so that no one or at least not

12:15

half of the population are just scraping

12:18

by. So, we just passed a new set of laws

12:21

that include $1,000 accounts for

12:25

newborns. Isn't that going to solve

12:28

financial inequality? All these kids, by

12:30

the time they're 30, they'll be worth

12:32

millions.

12:34

>> Um, I think you might need to offset

12:36

against the people losing their health

12:38

insurance subsidies. I don't think that

12:40

$1,000 is going to go very far,

12:42

>> Right? And what's fascinating is

12:44

watching um just a parade of

12:46

billionaires come out and say, no, no, we need

12:49

to supplement that $1,000. So first it

12:51

was Michael Dell and then it was Ray

12:54

Dalio. I don't know who else is going

12:55

to step forward. But it appears, hey,

12:58

we're not really paying a whole lot in

13:00

taxes. We might as well throw some money

13:01

at some babies. That seems to be

13:04

the philosophy.

13:05

>> Yeah. I mean I don't love

13:09

philanthropy in that sense.

13:10

supplementing democratically sort of

13:13

elected policies. You know, it gives

13:16

a lot of sort of discretion and power to

13:18

people as to how they want to distribute

13:19

their largesse, and to some degree

13:21

that's fine. But again, when we have a

13:23

society where half of the population is

13:25

barely scraping by, I don't think their

13:29

livability should be predicated on the

13:32

whims of billionaire largesse.

>> Fair

13:34

enough. Um, you talked about

13:37

technological

13:38

innovation in your book. You argue that

13:42

financial technology innovation

13:45

is driven largely by legal design rather

13:49

than technical brilliance. Explain that

13:52

a little bit. What is it about

13:54

fintech that seems to be working from the

13:57

perspective, uh, of an attorney rather

14:00

than an engineer?

14:01

>> Yeah. So this was something that as I

14:03

said I came to a little later in my

14:05

career. I think earlier in my career

14:06

when I first started looking at fintech

14:08

I generally accepted you know the party

14:10

line. This technology is revolutionary.

14:13

This technology is making things more

14:15

efficient. This technology is fixing

14:17

things. And then I realized that the

14:20

people who were saying that had

14:21

something to sell and I probably should

14:23

learn a little more about the technology

14:25

because if you want to work on financial

14:27

regulatory policy now you need to

14:29

understand the extent to which the

14:30

technology actually lives up to what is

14:34

claimed it can do. And so sort of my

14:37

first sort of foray into this was that I

14:39

looked really in detail at blockchain

14:41

which is truly, frankly, a terrible

14:44

technology. It's a clunky database and

14:46

and it's not something you would

14:48

ever choose for any kind of financial

14:52

market infrastructure, but for the fact

14:54

that it's been very easy to convince

14:56

regulators not to regulate it. And so

14:58

the value-add that comes from crypto has

15:01

never been blockchain technology as a

15:03

technology. It's been whipping up

15:06

stories about that technology that have

15:07

justified avoiding regulation. And we

15:10

see it in other instances as well. You

15:13

know, there is, um, fintech lending that

15:16

is replicating

15:18

um some of the the predatory payday

15:20

lending that we've seen before.

15:22

>> The buy now pay later sort of financing

15:24

or

15:25

>> Well, payday loans have been

15:27

around a lot longer than that. Um, this

15:30

is sort of like a $400

15:32

loan that you get to bridge you over

15:34

till your next payday.

15:37

Um, and you know there's been a lot of

15:39

predation in that market and some states

15:41

had banned those products

15:43

essentially.

15:43

>> You think 29% interest is not fair? You

15:47

have a problem with that? We're just

15:48

trying to make a profit here.

15:50

>> Some of these interest rates are 300%.

15:52

>> Get out.

15:53

>> Yeah.

15:54

>> That's insane. And and what does New

15:55

York top out at? Like 19% something like

15:58

that?

15:58

>> I don't know about New York. Yeah. Um,

16:00

but normally anything, you know, mid

16:03

double digits is thought of as

16:05

usurious. 300% is just next level.

16:08

>> Yeah. I mean, it's not set

16:10

as an interest rate per se. They're

16:11

fees, but once you actually convert that

16:13

into a per annum rate, they can be in the

16:15

hundreds of percent. And so, that

16:17

has always been a problem. And we have

16:19

had states act. And then we've had new

16:21

fintech lenders saying, "Well, actually,

16:23

we're different from payday lenders

16:24

because we use AI to screen our

16:26

borrowers and so you should treat us

16:28

differently." Um, and yet they're

16:30

charging interest rates that are

16:32

equivalent to what payday lenders do.

16:34

And then you mentioned buy now pay

16:35

later. Again, they say, "Well, we're

16:37

we're not even extending loans. This

16:39

isn't a loan at all, so we shouldn't

16:40

have to comply with the laws around

16:43

lending, around disclosure, around that

16:45

kind of thing."

16:46

>> How is that not a loan? You're buying a

16:48

product that you don't have money for.

16:51

Someone is paying for that. Isn't that a

16:54

loan?

16:54

>> I would say so.

16:56

>> Okay.

16:56

>> But what's the counter to this

16:59

isn't a loan? This is a pre-layaway

17:03

essentially.

17:04

>> Yeah. You know, we don't charge interest.

17:06

There are late fees if you don't pay,

17:08

but that's not the same as interest, you

17:10

know.

17:10

>> Oh, that's fair. Like, we bought a

17:13

couch, no interest for 6 months. So, as

17:16

long as you pay it off within 6 months,

17:18

that sort of thing seems to be interest

17:21

free.

17:22

>> But then when you look at the business

17:23

model and you see that a significant

17:25

chunk of the people are incurring these

17:27

late fees, then

17:28

>> Well, that's their fault, isn't it?

17:30

That's human nature. You can't blame

17:32

us if we take advantage of people

17:35

procrastinating and not paying off their

17:37

uh fees in time.

17:38

>> Well, it's not that they're

17:39

procrastinating. It's that they're

17:40

choosing between paying rent or paying

17:42

this off. So, this

17:43

>> Food, medicine.

17:45

>> Exactly. So, this is coming back to

17:47

it's the economic precarity,

17:49

stupid. Right. If people are in these

17:51

dire straits, we should not be surprised

17:53

that fintech firms are trying to

17:55

capitalize on that and profit from it.

17:57

which is why I think, you know, what we

18:00

need are

18:02

some kind of public safety nets, um,

18:06

and a higher minimum

18:08

wage and higher social security

18:10

benefits.

18:10

>> Coming up, we continue our conversation

18:12

with Professor Hilary Allen discussing

18:15

her new book, Fintech Dystopia, a summer

18:19

beach read about Silicon Valley and how

18:22

it's ruining things. I'm Barry Ritholtz.

18:25

You're listening to Masters in Business

18:27

on Bloomberg Radio.

18:42

I'm Barry Ritholtz. You're listening to

18:44

Masters in Business on Bloomberg Radio.

18:47

My extra special guest this week is

18:48

Hilary Allen. She teaches at the

18:51

American University Washington College

18:53

of Law in Washington DC where she

18:56

specializes in financial regulation

19:00

and technology law. So, let's talk

19:03

about the digital-only book. Ironic,

19:08

right? Um, Fintech Dystopia, where you

19:12

describe modern financial technology

19:15

simply as Silicon Valley ruining things.

19:19

Explain. That seems like an extreme

19:22

example and and give us some examples of

19:25

how Silicon Valley is ruining things.

>> So

19:27

just to be clear, not all modern

19:29

technology is ruining things. There's a

19:31

particular business model approach that

19:33

I think is ruining things and that is

19:36

derivative in many ways of the venture

19:38

capital model in Silicon Valley.

19:40

>> Venture capital,

19:41

>> Just venture.

19:42

>> Okay.

19:42

>> Yeah. Venture capital model in Silicon

19:44

Valley. So, it's sort of got this sheen

19:47

around it that's iconoclastic and they

19:49

they make bets on these moonshots

19:52

that'll, you know, save all of humanity

19:54

and yada yada yada. But in fact, it's

19:56

it's pretty well established as a

19:59

playbook at this point. Um, you know,

20:02

there's a lot of subsidies that go to

20:04

venture capital by virtue of their

20:07

having access to pension funds um by

20:10

virtue of sort of um capital gains

20:12

taxation. And so, sort of,

20:14

especially in low interest rate

20:16

environments, they attract a lot of

20:17

money. So they have pretty cheap money

20:18

available to them.

20:20

>> And then they go shopping. Um, and what

20:22

they go shopping for is not the

20:24

iconoclastic sort of outlier that we

20:27

think of, but what we've seen and what

20:29

the evidence shows is that they tend to

20:31

go shopping for the same things that

20:32

their friends are going shopping for.

20:34

And they go shopping for the businesses

20:35

that their friends have developed. And

20:37

so there's this sort of very sort of uh

20:40

insular mentality in what they're

20:42

looking for. And they're also looking

20:43

for something that they can cash out of

20:45

very quickly. Um because the you know

20:48

the average venture capital fund has a

20:50

what, a 10-year, sometimes 12, but usually

20:52

10-year duration. That's really not that

20:55

much time to find something to invest in

20:57

have it grow and then cash out and so

21:00

they're not looking for things that are

21:02

going to take decades to develop.

21:04

They're looking for things that they can

21:06

grow quickly and get out of in about

21:08

five or six years.

21:09

>> So give us a few examples. What do you

21:11

think is uh the sort of um you know not

21:15

adding a whole lot of value

21:16

venturebacked businesses?

21:19

>> So

21:20

not intentionally but it just turned out

21:22

that way as I wrote this this book.

21:25

Almost every fintech business I looked

21:27

at had been funded by Andreessen Horowitz. Um,

21:29

they had been sort of the lead. So you

21:31

know they

21:33

>> They're the hot VC these days. I've

21:36

full disclosure, I've interviewed Andreessen.

21:39

I've interviewed Kupor. I've

21:40

interviewed Horowitz. So, I've sat with

21:42

them and talked about a lot of their

21:45

businesses, but the past few years

21:48

they've been very front and center, very

21:50

active.

21:50

>> Yeah. No. And they sort of have

21:53

their marquee name. As you said,

21:55

they're the hot VCs. Once they say they

21:57

like something, they can basically

21:59

attract other venture capital to those

22:02

those businesses. And so, they're

22:03

essentially taste makers. Um,

22:05

>> Which is fascinating you say that

22:07

cuz before that it was Sequoia, before

22:09

that it was Kleiner Perkins. Like you

22:12

work your way there's a hot firm for a

22:15

decade. The '90s had it, the 2000s had

22:17

it, the 2010s had it. Um, they tend not

22:20

to maintain that position forever.

22:23

Although to Andreessen Horowitz's credit,

22:25

they've been the it girl for a good

22:28

run so far.

22:29

>> Yeah. I mean, I wouldn't say that that's

22:32

a good thing, but, um, yeah. So, you know,

22:35

they they basically built the crypto

22:37

industry. So, you know, the

22:39

narrative around crypto is that it's this

22:41

organic sort of community of cypherpunks

22:44

and libertarians, but they really

22:46

built that industry. They were um early

22:50

investors in Coinbase. That was their

22:52

first crypto investment. And then they

22:53

have plowed a lot of money into the

22:55

industry, and sort of their seal of

22:57

approval has been what's attracted

22:58

people to it. And you know, part of what

23:01

Andreessen Horowitz does is it doesn't

23:03

just invest. It does aggressive

23:05

marketing campaigns for the things that

23:07

they've invested in, aggressive lobbying.

23:10

Um so they've really been at the

23:11

forefront for trying to get the laws

23:13

changed um to accommodate their business

23:16

models. Um, so yeah, there's

23:20

crypto but they've also been at the sort

23:23

of the forefront of, um, there's

23:26

one of the do-not-pays, I think it's a

23:27

firm that's theirs. I always get

23:29

mixed up. Um, they were very, um,

23:33

early investors in Robinhood, um, the

23:37

fintech trading stock app

23:39

>> which originally started out as a stock

23:41

app and then it became eventually a

23:43

crypto app and now it's a bet on

23:46

anything app.

23:47

>> Yeah. And again that is a company that

23:51

by the time it IPOed had racked up all

23:53

kinds of fines from the SEC and FINRA

23:56

because it was violating laws left,

23:58

right, and center. Um, you know, it

24:01

was one of the first to offer

24:04

commission-free brokerage. Um, but as

24:07

the chestnut goes, if you're not paying

24:10

for the product, you are the product.

24:12

and it makes most of its money from

24:13

payment for order flow and was not clear

24:16

with its customers in the early years

24:18

about how that was going on and how

24:21

they get paid a lot more for your

24:23

options trades than your regular stock

24:25

trades because um

24:28

>> More profitable.

24:29

>> Yeah. More profitable for the Citadel

24:31

securities of this world to to take

24:32

those. Yeah.

24:33

>> Huh. Really kind of interesting. Um, and

24:37

yet at the same time you have a chapter

24:39

in your book, Silicon Valley Welfare

24:42

Queen.

24:43

>> Explain. I thought that these are, you

24:46

know, Ayn Randian

24:48

um, libertarians that don't want to

24:51

suckle off the teat of big government

24:53

and these are people that are builders

24:55

and self-made um, people. You're arguing

25:00

not so much. Well, they don't want us

25:02

suckling on the teat of the state

25:04

because they might have to fund that

25:05

with taxes, but but they're okay

25:07

suckling themselves,

25:08

>> Right? So, give us a few examples.

25:10

What companies started out as as welfare

25:13

queens?

25:14

>> Well, I mean, again, the whole story of

25:17

of tech, the the internet and smartphone

25:20

boom is very much based on technologies

25:24

developed by the government,

25:25

>> DARPA and the whole internet.

25:27

>> Exactly. And, you know, I think if

25:30

you look at the iPhone a lot of the

25:31

individual technologies that went into

25:33

that again came from

25:34

>> Everything with microwaves comes out of

25:35

NASA right

25:37

>> So, you know, first of all, this

25:40

entirely self-made story falls apart

25:42

right there because as I mentioned

25:45

earlier if you've only got six years to

25:47

turn around a technology you're not

25:49

really investing in prototypes in

25:51

thinking really hard about physical

25:53

hardware and how that works. you're

25:55

really looking for a software thing that

25:57

you can gin up pretty quickly. And so

25:59

the really long-term investment comes

26:02

from the state, um, and has always

26:05

done and then it's commercialized. Um

26:07

you know and I think that that sort of

26:09

has worked well except that you get to

26:12

the point where, you know, the venture

26:14

capitalists who are commercializing are

26:15

saying well we shouldn't have to pay any

26:17

taxes to fund the state that develops

26:19

these technologies. They also benefit as

26:21

I said enormously from um laws that they

26:25

lobbied for in the late '70s, I believe,

26:29

changes to ERISA, um, which allowed

26:32

pension funds to, uh, invest in

26:35

venture capital basically didn't exist

26:37

before and at that same period they were

26:39

lobbying for changes to the capital

26:41

gains taxation

26:42

>> well you have the carried interest

26:43

loophole which continues to persist um

26:47

I'm drawing a blank on the author's name

26:49

there's the book Americana 400 years of

26:53

technological innovation that makes the

26:55

argument you're making go back to the

26:58

telegraph, funded by Congress, go back to

27:00

railroads. Like every major technological

27:03

innovation, or most major innovations,

27:06

got seeded by the government and then

27:09

eventually uh the private sector takes

27:12

over and what has changed in recent

27:15

years is that public private partnership

27:18

seems to have broken

27:20

>> Yeah, actually. So, the book I really

27:21

like on this is Margaret O'Mara's book,

27:23

The Code. She does a great

27:25

history of Silicon Valley. Um, and yeah,

27:28

I think the

27:31

the understanding that there was a

27:34

quid pro quo has sort of fallen away. So,

27:39

always the private sector has

27:42

commercialized this this technology. But

27:45

if we have an unwillingness to sort of

27:49

pay any taxes, if we have an

27:50

unwillingness to invest in government

27:53

capacity to invest in universities where

27:56

so much of this stuff is developed, you

27:57

know, you take Marc Andreessen, he, you know,

28:00

he got his start because he was happy or

28:03

sorry, lucky enough to be a student at

28:06

the University of Illinois at the time

28:08

where they had a special grant to look

28:10

at the beginnings of the internet. He

28:12

worked on a team there that developed a

28:15

prototype internet browser and then he

28:18

went into the private sector and they

28:19

let him build one from the private

28:21

sector and that was Netscape and that's

28:22

how he made his fortune. So he was sort

28:24

of in the right place at the right time

28:26

to take advantage of public investment

28:30

in this kind of thing and yet this is

28:32

the kind of thing that we're seeing that

28:34

these leading venture capitalists want

28:35

to shut down.

28:36

>> Really interesting. Since we've been

28:38

talking about books, you've you've

28:41

criticized abundance, which is by Derek

28:44

Thompson and Ezra Klein, as the whole

28:47

concept of abundance is sort of a sexy

28:50

way to make excuses for technosolutions.

28:54

Tell us a little bit about that. Yeah.

28:55

So this is this is a something I get

28:57

into a lot of conversations with people

28:59

these days because I think there are

29:01

some elements of the original sort of

29:03

abundance agenda that are very appealing

29:05

to people in terms of for example

29:06

increasing housing capacity and I I do

29:09

think that that is something that needs

29:11

to happen and has to be done in the

29:13

right way.

29:15

>> But if you look at who is funding the

29:18

abundance movement they have conferences

29:20

etc. It is Andreessen Horowitz and other

29:22

people from Silicon Valley

29:24

>> and it seems to be this attempt to

29:27

essentially um put a a happier face on

29:32

the deregulatory project that Silicon

29:35

Valley is looking for to sort of make it

29:37

seem kinder, gentler, and more

29:39

progressive because the abundance

29:41

movement sort of in a nutshell is

29:43

supposed to be well we shouldn't have

29:45

artificial scarcity. We should build

29:47

more of what we want, that we

29:49

should take away some of the roadblocks

29:51

that are getting in our own way. And

29:53

when you say it like that, it's sort of

29:54

hard to disagree with.

29:55

>> Well, that works for housing. You you

29:57

have NIMBYism with housing, but when you

30:01

take that away, it also means you're

30:03

going to end up with perhaps high-rises

30:07

or multifamily units in a suburban

30:09

area that some people don't want in

30:11

their neighborhood. There's always a

30:13

series of trade-offs with people who are

30:14

already there versus people who want to

30:16

get there. What is the specific problem

30:19

with abundance as a philosophy towards

30:23

building more of what we want as a

30:26

society?

30:28

>> Because it's who gets to decide what

30:30

more of what we want is. And if you look

30:32

at who's funding the abundance agenda,

30:34

it is the billionaires and the tech

30:36

elite. And these are people who have

30:39

really shown that they are quite willing

30:41

to run roughshod over regulations that

30:43

are there to protect the public from

30:44

harm if that enables them to profit. And

30:48

so I am just skeptical that a movement

30:50

that is funded by these people is really

30:54

going to be prioritizing the kinds of

30:56

projects that would benefit the

30:57

economically precarious. I think it's

31:00

more likely that it'll be benefiting

31:02

themselves and will lose protections for

31:05

people with less voice that are

31:07

currently in place.

31:08

>> So, what sort of overhyped products do

31:11

you think best explain um the problems

31:16

with this approach? Like what are these

31:18

companies putting out that either is a

31:22

result of regulatory capture or just

31:24

don't do what they promise? Because you

31:26

would think that in the world of venture

31:29

either your product finds um an

31:31

audience, it finds a customer base or it

31:34

doesn't and fails and that goes out of

31:36

business.

31:37

>> Yeah. So that's sort of the perverted

31:39

part of this is that that market logic

31:41

like you know survival of the fittest

31:43

because of all the subsidies that

31:46

benefit venture capital, that logic doesn't

31:48

really apply anymore. So, you

31:50

know,

31:50

>> give us an example.

31:51

>> Crypto,

31:52

>> crypto should have died many times

31:55

already. Particularly, it should have

31:57

died in 2022 when we had the big crypto

32:00

winter.

32:01

>> Mhm.

32:01

>> At that time, particularly Andreessen

32:05

Horowitz's crypto fund had this huge war

32:07

chest of funds that they had raised and

32:10

they stopped investing in crypto

32:11

startups at that point because, you

32:13

know, everything was more abundant. But

32:14

what they started using that money for

32:17

was lobbying, political spending. Um,

32:21

and they really worked very hard on

32:24

members of Congress to essentially

32:26

create laws that would allow the crypto

32:29

industry to keep doing what they're

32:31

doing, which was not allowed under the

32:34

securities laws as they were. So the

32:36

whole business model was regulatory

32:37

arbitrage.

32:38

>> Mhm. Um they wanted laws that would sort

32:41

of give a patina of legitimacy and

32:44

hopefully encourage institutional

32:46

investment, attract more money to the

32:48

space um

32:51

but not actually make them have to for

32:53

example. Like, Coinbase combines

32:55

the functions of a broker dealer and an

32:57

exchange. That's not allowed under the

32:59

securities laws. You can see why there's all

33:01

kinds of conflicts of interest that go

33:02

>> right either you're an exchange or a

33:04

brokerage firm, not both.

33:05

>> But in crypto you're both, right? And so

33:07

if you applied the securities laws to

33:09

crypto, they would have to disaggregate

33:12

and basically would probably destroy

33:14

their business model. So what they

33:16

wanted was a law that said, "No, it's

33:18

fine. Crypto's special. You do both." Um

33:21

and so that that really

33:24

an industry that should have failed is

33:28

you know again rising being propped up

33:31

all through this sort of aggressive

33:33

political spending and and um I mean I

33:37

I've talked to people in Congress off

33:39

the record who have said that they've

33:41

only voted for these laws because

33:42

they're afraid that if they don't that

33:46

the crypto industry will target them. Hm.

33:48

Um, what other products do you think are

33:51

are overhyped and and fail to satisfy

33:54

their markets?

33:56

>> Well, right now the obvious answer is a

33:57

lot of the AI products. Um, the anything

34:01

sort of it's hard when you talk about AI

34:03

because it's such an umbrella term for

34:05

so many different things, right?

34:06

>> I have Perplexity on my phone. It

34:09

does a better job with search than

34:11

Google does. I get better, more

34:13

comprehensive

34:15

answers. Um, what's wrong with AI?

34:18

>> Well, let me disaggregate it first

34:20

because there's plenty of AI that

34:21

there's nothing wrong with, right? So AI

34:23

is not intelligent in any way, shape, or

34:25

form. That's a marketing

34:26

term. What it is is an applied

34:29

statistical engine.

34:30

>> You have an algorithm that looks for

34:32

patterns in data and then acts

34:34

accordingly.

34:35

>> Um, and that kind of technology has been

34:38

around for a long time. It does like for

34:39

example, it's great for fraud detection

34:41

in a bank um for credit card

34:43

transactions for example. So that that

34:44

you know that's that's an A+ use of of

34:46

AI.
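The pattern-in-data, fraud-detection use of AI that Allen calls "an A+ use" can be sketched as a tiny statistical outlier check. This is purely illustrative; real bank fraud models use far richer features than a z-score on transaction amounts, and the function name here is invented for the example.

```python
# Flag credit-card transactions whose amounts are statistical outliers
# relative to a cardholder's history -- a toy version of the
# "applied statistical engine" idea (illustrative only).

def flag_outliers(amounts, threshold=3.0):
    """Return indices of amounts more than `threshold` standard
    deviations from the mean of the history."""
    n = len(amounts)
    mean = sum(amounts) / n
    var = sum((a - mean) ** 2 for a in amounts) / n
    std = var ** 0.5 or 1.0  # avoid dividing by zero on constant history
    return [i for i, a in enumerate(amounts)
            if abs(a - mean) / std > threshold]

history = [12.5, 8.0, 15.2, 9.9, 11.3, 14.1, 10.7, 9500.0]
print(flag_outliers(history, threshold=2.0))  # prints [7]: the 9500.0 charge
```

The point of the sketch is that this kind of classification has been around for decades and involves no language generation at all.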

34:48

>> But the last few years everybody has

34:50

been pouring everything they've got into

34:53

these um LLM based tools these large

34:56

language model based tools. So

34:58

these are tools that can um you know old

35:03

AI tools would just sort of classify

35:05

something, put something in a group or

35:07

or predict something but but now we have

35:09

these tools that generate content um

35:13

particularly text but also you know

35:14

video um music etc. And there are so

35:20

many problems with this technology

35:22

because it's being sold as technology

35:25

that can replace humans, right? That

35:27

that can basically it's worth throwing

35:30

trillions of dollars into this because

35:32

of the productivity gains that we'll get

35:34

by firing all the humans essentially is

35:36

the story they're telling. Um, first

35:39

of all, that wouldn't be great,

35:40

>> right? That's a problem in and of

35:42

itself. the the way I have heard it

35:45

described that's a little less um

35:50

catastrophic is this is going to make

35:53

everybody more efficient, more

35:54

productive. It'll make companies more

35:57

profitable and we'll all be able to do

36:00

more with our existing um staff than

36:03

having to go out and hire hundreds of

36:05

more people.

36:06

>> But that is not true sadly. That's the

36:08

pitch line, right? So these these tools

36:12

make a lot of mistakes. Um you know even

36:14

the very best ones make mistakes.

36:17

>> We we've seen a lot of attorneys, you

36:19

and I are both attorneys. A lot of

36:20

judges have been calling out attorneys

36:22

who theoretically are supposed to be

36:25

doing this on their own and instead are

36:27

outsourcing it to AI and all of its

36:30

hallucinations and citing cases that

36:32

don't exist. The assumption is that's

36:36

going to get better eventually,

36:37

>> but it won't. So this is this is the

36:39

problem.

36:39

>> But it won't

36:40

>> but it won't. So these things are

36:42

statistical engines, right? They they

36:45

can't check for accuracy because they

36:47

don't understand accuracy as a concept.

36:49

>> Mhm.

36:49

>> Right. There's no reasoning. It's it's

36:51

literally: the most statistically

36:54

most likely word after the last word I

36:57

gave you is this word.
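The "most statistically likely next word" description can be made concrete with a toy bigram model. This is a drastic simplification of an LLM (real models condition on long contexts, not one word), but it shows the point being made: nothing in the procedure checks truth, it only counts what tends to follow what.

```python
# Toy next-word predictor: count bigrams in a corpus, then always
# emit the most frequent follower. No notion of accuracy appears
# anywhere in the procedure.
from collections import Counter, defaultdict

def train_bigrams(text):
    words = text.split()
    table = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        table[prev][nxt] += 1
    return table

def next_word(table, word):
    followers = table.get(word)
    return followers.most_common(1)[0][0] if followers else None

corpus = "the court held that the court erred and the court ruled"
table = train_bigrams(corpus)
print(next_word(table, "the"))    # prints court: the most frequent follower
print(next_word(table, "court"))  # ties broken by order first encountered
```

A model like this will happily continue "the court held that..." whether or not any such holding exists, which is the hallucination problem in miniature.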

37:00

>> There is no way to make that care about

37:03

accuracy because it's it's it's not a

37:05

it's not a thinking machine. And I think

37:09

there's increasing acceptance that these

37:11

these models have hit a wall and they

37:13

are as accurate as they are going to

37:14

get.

37:15

>> Really?

37:15

>> Yeah, that's kind of that's kind of

37:17

fascinating. Uh my concern was at least

37:20

on the legal side, hey, you have this

37:22

existing body of work and all this

37:24

research and brief writing and arguments

37:27

that exist as of now. If you're going to

37:30

replace people from doing that, are you

37:31

going to freeze the state of legal

37:34

knowledge at 2026 and five or 10 years

37:37

from now? If you don't have people

37:39

writing these briefs, you don't have

37:41

people writing these decisions, how can

37:43

AI respond to what's taken place over

37:47

the past 10 years, if we don't have the

37:49

humans actually doing the grunt work?

37:52

>> Yeah. I mean, there's a there's I mean,

37:53

I think those kinds of concerns have

37:55

been expressed very much in the cultural

37:57

context. You know, if we

38:00

disincentivize creators from making new

38:02

music and new art, is is this it? Are we

38:05

stuck with with what we've got?

38:07

>> With something like the law, one of the

38:09

challenges is that, you know, these

38:11

large language models,

38:13

>> they don't get updated on a day-to-day

38:15

basis. You know, there's there's sort of

38:16

a stop point and then they they don't

38:18

know well, they don't know anything that

38:21

they don't have the data from after a

38:23

certain date. So, that that's a

38:25

limitation. But the thing I worry most

38:27

about with the law is that you

38:31

have to be able to spot the

38:34

hallucinations or you're going to get

38:35

yourself in very big trouble. And I

38:37

think this is true for a lot of

38:38

different um fields. And and this is

38:41

again just to digress a little

38:43

>> why the the profitability narrative is

38:45

not true,

38:47

>> right? Because the only place where you

38:49

can just put this content out and just

38:51

leave it there is in very low stakes

38:54

places, right? where it doesn't matter

38:55

if you get something wrong. But even,

38:59

you know, things that you wouldn't think

39:00

are such a big deal have proved to be

39:02

quite high stakes. So, Air Canada had a

39:04

chatbot that told a um customer that if

39:08

they wanted to apply for a bereavement

39:11

discount for a flight, they could do

39:13

that after their flight was done. Now,

39:15

that's not Air Canada's policy. They

39:17

they you had to do it in advance. And so

39:19

this customer tried to get their refund

39:21

after the fact and Air Canada said,

39:23

"Well, the chatbot got it wrong. Too

39:25

bad. So sad for you." And

39:27

>> it's your chatbot. You own You're

39:28

responsible for it.

39:30

>> Exactly.

39:30

>> Not not my mistake, your mistake.

39:32

>> Exactly. And so even in these sort of

39:34

reasonably low stakes customer service

39:36

interactions, there's reason to be

39:38

really worried about inaccuracy. Now you

39:40

start dialing up to things to medical

39:42

advice, legal advice, you know, it's

39:45

just you you can't rely on them. And I

39:49

worry that we're putting people in a

39:51

very difficult position because it's a

39:53

it's a lot easier to get something right

39:55

when you write it yourself than it is to

39:57

find mistakes in something someone else

40:00

has put together. Right. So, let me push

40:02

back a little bit cuz I've been watching

40:05

the

40:07

AI reading medical scans and at some

40:11

point last year um or maybe it was two

40:14

years ago, the the technology

40:18

theoretically passed the accuracy rate

40:21

of humans. um fewer false positives,

40:24

more identifying

40:26

um false negatives that should have

40:29

been positive than people. Is is that

40:32

not accurate or where where are we with

40:35

with the medical application of that?

40:37

>> So, this is why I think it's so

40:38

important to disagregate the different

40:40

kinds of AI because that is not sort of

40:43

LLM based AI and some, as I said, some

40:46

of those tools are great. I can't weigh

40:47

in on medical imaging and things like

40:49

that. So, it may very well be the case.

40:52

What I'm talking about is, you know,

40:53

what if you've got um, you know, a

40:56

doctor coming up with instructions for a

40:59

care plan for their patients

41:02

>> and they let the AI do it, right? If

41:05

there's a mistake in there, they're much

41:07

less likely to catch it if the AI wrote it,

41:09

because you know, you know how things

41:11

go. You'll be expected to look at more

41:13

of these because you're not generating

41:14

them yourself, right? And it's always

41:17

easier to get things right when you do

41:20

it yourself than when you're reviewing

41:21

someone else. I mean, when we were

41:23

lawyers, we used to that's why you want

41:24

to have the pen on contracts. You want

41:26

to you want to hide things from the

41:27

other side. And now it's the AI

41:29

hiding stuff from you. And I worry that

41:33

especially with younger lawyers coming

41:35

up through the ranks who are encouraged

41:37

to rely on these tools from the

41:39

beginning who won't actually develop the

41:41

skills

41:42

because you don't learn well when you

41:45

sort of don't process it yourself. So if

41:47

you you spent your whole career using

41:49

AI,

41:50

>> you're not going to be able to spot the

41:52

problems in the AI.

41:53

>> You're not going to have the skill set.

41:55

>> No. And so then I'm worried about, you

41:57

know, those young lawyers getting sued

41:58

for malpractice because they missed

42:01

something that the AI generated, but

42:02

they were never even given the

42:03

opportunity to learn how to spot it

42:05

themselves.

42:06

>> It's it's a problem with the rungs on

42:08

the ladder being removed, especially

42:12

um we see that now manifesting itself.

42:14

The unemployment rate of the under 30 is

42:17

about double what it is for the national

42:19

unemployment rate. And I can't help but

42:22

wonder how much of that is somehow

42:24

related to the proliferation of AI tools

42:28

for white collar jobs.

42:30

>> I think, you know, Cory Doctorow, who

42:33

does a lot of work in the tech space has

42:35

a great quote on this that I'm going to

42:36

butcher a little not say it quite as

42:38

well as he does it but he said the AI

42:40

can't do your job but the AI salesman

42:44

can convince your boss to replace you

42:46

with AI that can't do your job. Right?

42:49

So it's I think you're right that there

42:52

is at this moment you know I mean it's

42:56

also hard to say how much of this is

42:58

AI washing as opposed to real AI

43:02

displacement right

43:03

>> the economy is not in a great place

43:04

right now. People don't want to hire

43:06

anyway. It looks a lot better if you say

43:08

well we're not hiring because we're

43:10

replacing them with AI than just we're

43:12

having a rough time. We're not hiring.

43:14

AI washing is a phrase I haven't heard

43:16

used in modern parlance yet, but it

43:18

certainly makes a whole lot of sense.

43:20

The line I heard, and I don't know where

43:22

I'm stealing this from, is you're not

43:24

going to be replaced by AI. You're going

43:27

to be replaced by somebody with a

43:29

greater facility working with AI than

43:31

you have. And it sort of creates a

43:33

self-fulfilling arms race to make sure

43:37

you you learn how to use that tool.

43:40

Otherwise, you're at risk for being

43:41

replaced by somebody who knows how to

43:44

use that tool.

43:44

>> I've heard that, too. But I don't think

43:46

these tools are that hard to use, right?

43:48

I mean, that's a failure on the part of

43:50

the AI companies if they're so hard to

43:51

use, right? It wasn't hard to use Google

43:54

search

43:55

>> Perplexity and even ChatGPT is

43:58

absolutely easy as pie to use. I don't I

44:01

don't find them difficult. Sometimes you

44:03

have to keep changing the prompts to get

44:07

an improved answer. Like if you just ask

44:09

a question and walk away, well then

44:11

you're getting what everybody gets. But

44:13

if you... I don't really buy into

44:17

the prompt engineer job title. But a

44:21

little exposure is the more you ask it

44:23

and the more you vary it, you get a

44:25

variety of answers and eventually you

44:27

come up with something. Oh, that's

44:29

interesting and different. Let me let me

44:31

take a look at that. So I mean I have

44:33

strong feelings about this as an

44:34

educator because if these tools are

44:36

worth their salt,

44:38

>> it shouldn't take our students long to

44:40

figure out how to use them. Right.

44:41

>> Right.

44:41

>> So why are we bringing them into

44:43

education where what they really need to

44:45

learn is how to spot hallucinations, how

44:47

to think critically so that if they are

44:49

going to use these tools later, they can

44:51

use them to the best of their abilities.

44:53

This whole arms race sense of well they

44:55

need to use them in in school so they

44:57

don't get left behind. I'm like, it

44:58

didn't take long to learn how to Google.

45:00

they'll be fine.

45:01

>> You've been pretty critical of things

45:04

like crypto and stable coin. We're gonna

45:06

get to those in a moment. I want to talk

45:09

about some other things you've

45:11

discussed. Um you've brought up the

45:15

whole idea of um technology as a

45:19

branding exercise phrases like

45:21

democratizing finance, disruptive

45:25

technology,

45:27

banking the unbanked. you've described

45:29

these as just, you know, marketing and

45:32

not really accomplishing anything. Tell

45:34

us a little bit about those and and give

45:36

us some examples.

45:38

>> Sure. I mean, I think at the heart of

45:40

all this is is innovation speak and

45:42

innovation worship, right? When we

45:44

alluded to that earlier, this sense that

45:46

anything that is innovative is

45:50

inherently good and must therefore be

45:51

permitted at all costs. And that is sort

45:54

of the font of a lot of the rhetoric and

45:58

narrative that we get out of Silicon

46:00

Valley that ultimately is there to

46:03

attract funding. Yes. But also to

46:06

procure

46:07

legal treatment that facilitates what

46:09

they want to do. It actually creates

46:12

often an unlevel legal playing field

46:14

where you have the incumbents who have

46:16

to comply with all the laws and then the

46:18

disruptors as you say um who don't have

46:22

to comply with all the laws and can

46:24

succeed on that basis even if their

46:26

product isn't superior in the way we

46:28

would typically expect a disruptor's um

46:31

product to be. So yeah, I mean

46:33

disruptive innovation

46:36

um you know goes back to Clayton

46:38

Christensen and and the innovator's

46:40

dilemma. This this sense that if you if

46:43

you stay still and just make good

46:44

products, you'll be out competed by

46:46

someone who is trying to um do things a

46:50

little differently. But you know there

46:53

there's no real formula that you can

46:57

take away from that as to what counts;

46:58

disruptive is in the eye of the

47:00

beholder. So, so let me push back on

47:02

that a little bit. Um, and all my VC

47:05

friends, I could just hear their voices

47:06

in my head. Um, and the push back is

47:10

look, most new companies fail. Most new

47:14

technologies crash and burn. Most new

47:16

ideas never make it. And even the best

47:18

of the best VCs, they'll make a hundred

47:21

investments for that one moonshot that

47:24

works out. and most of the other 99 are

47:27

at best break even but mostly losers.

47:31

How could you say uh this is true? Oh,

47:34

and real innovation often finds itself

47:38

in between the regulatory regime because

47:42

the technology that's being created was

47:45

never anticipated by the regulators or

47:47

or anybody else. Fair push back.

47:51

>> A lot of points that I would quibble

47:52

with there. That's fair. Um,

47:54

>> quibble away.

47:54

>> Quibble away. All right. So,

47:57

there's this idea that the law is a

47:59

barrier to innovation because law is old

48:01

and innovation is new and the law

48:03

couldn't possibly have contemplated the

48:05

innovation.

48:07

The story about the innovation is what

48:09

makes it new. Right? Most of the things

48:11

that we're seeing in the fintech space,

48:14

>> they're not that new. Right? As I said,

48:16

you know, we've got fintech lending has

48:18

a lot of the things that we didn't like

48:19

about payday lending, right? Why

48:23

shouldn't the laws from payday lending

48:24

apply? Crypto basically, I mean, the the

48:28

crypto markets for all the world look

48:30

like the stocks and bonds and the

48:31

unregulated markets of the 1920s. We saw

48:33

how that ended. They ended in such a

48:36

spectacular crash that we ended up with

48:37

the securities laws. Why shouldn't they

48:39

apply? What's what's so different?

48:42

Right? So this construction of novelty

48:45

is something that is done intentionally

48:47

as a narrative. Now I fully

48:50

appreciate that we need the optimists in

48:53

this world who are going to try new

48:54

things, and I say that very early

48:57

on in the book. These

48:59

stories are useful because they attract

49:02

funding to new things. So I'm not saying

49:04

we should do away with it completely. My

49:06

argument is that the yin and yang,

49:08

the balance between the optimists and

49:13

the realists is badly out of whack

49:15

because we give so much deference to

49:17

the stories about innovation, about

49:20

disruption, about how technology can

49:22

solve problems that have been with us

49:24

for centuries. We can magically get rid

49:26

of intermediaries now with blockchain

49:28

technology apparently. Well, that was

49:30

one of the that was one of the story

49:32

narratives, this disintermediation, and

49:35

until um it no longer was a story, but

49:38

but let's talk about some specific

49:40

companies that you've mentioned that

49:42

you've written about um and I and I want

49:44

to get your sense on it and and the the

49:47

oldest one was PayPal. To this day, and

49:51

I was a PayPal user back in the 1990s

49:54

with eBay and those sort of things. Um,

49:57

to this day, I don't understand what

49:59

they did that was any different than a

50:02

credit card other than being a bit of

50:04

middleware

50:06

uh that eventually became a rentier.

50:09

Why not just use a credit card? Why

50:12

do I need PayPal between me and Amazon

50:14

or me and eBay? So this is really an

50:18

interesting story and I learned a whole

50:20

lot about this in research for this book

50:22

um by reading Max Chafkin's book, The

50:24

Contrarian, about Peter Thiel and the

50:26

start of the beginning of of PayPal and

50:29

in fact it the idea for PayPal came from

50:32

the same place that the idea for crypto

50:34

has come from which is this this

50:36

technolibertarian idea of we don't like

50:39

regulation we don't like central banks

50:41

we would like to have private money and

50:44

we would like technology to help us have

50:47

private money. And PayPal wasn't the

50:50

only one of these kinds of startups back

50:52

in the the early bubble. So, PayPal, I

50:55

think, succeeded because it sort of

50:58

lucked into this deal with eBay, as you

51:00

said, right? It it sort of had no

51:02

distinguishing features, as far as I can

51:04

tell, that made it any superior to the

51:06

Beenz and the Flooz of this world. It

51:09

lucked into this deal with with eBay. Um

51:12

and so

51:13

>> and eventually eBay buys them um to

51:17

solve their I guess credit card

51:20

management problem. I don't really

51:21

understand I still you know 20 or 25

51:24

years later I still don't understand

51:26

>> why they were necessary.

51:28

>> I think yeah I mean my my knowledge of

51:30

this comes primarily from reading Max

51:32

Chafkin's book, which I highly recommend

51:34

but that's that's my understanding too.

51:36

And so you know they are a payments

51:39

technology. Um I too struggle to sort of

51:44

understand what they offer that a credit

51:47

card doesn't in many ways. Um one thing

51:52

they are though is they are sort of the

51:55

regulatory arbitrage um story in fintech

51:58

right so you know I've said so much of

51:59

fintech is actually about arbitrageing

52:01

the law rather than technological

52:02

superiority.

52:04

PayPal from the beginning was flouting

52:06

quite aggressively the banking laws

52:08

because only banks are allowed to accept

52:11

deposits and people were keeping money

52:14

in their PayPal wallets, and for

52:16

all the world that looks like keeping a

52:18

deposit. Peter Thiel from the beginning

52:20

was very aggressive on the um lobbying

52:23

to make sure that that was not

52:25

considered deposit taking. Early on

52:27

there were multiple states that were

52:28

investigating it because they thought it

52:29

was the unlawful taking of deposits. um

52:32

he lobbied heavily in Congress and

52:35

lobbied heavily at the FDIC and

52:36

ultimately um you know that worked and

52:41

so I think that has sort of been the

52:42

prototype, that blitzscaling prototype. I

52:45

think people

52:47

>> perhaps underestimate the degree to

52:49

which blitzscaling is really about

52:51

playing on an unlevel legal

52:54

playing field.

52:54

>> Let's talk about stable coins. What

52:57

sort of value do they provide? again um

53:00

unless you are trying to do illicit

53:02

transactions or gamble not a whole lot

53:04

right so

53:05

>> well a stable coin is worth a dollar and

53:07

it promises to always be worth a dollar

53:10

don't we have dollars why do I need a

53:13

stable coin

53:14

>> well you need a stable coin often to do

53:16

illicit payments um so if you want you

53:19

know if you're they're very popular for

53:22

example with all kinds of drug cartels

53:24

and um they're good for sanctions

53:26

evasion

53:28

>> they're also very good if you want to

53:30

gamble in crypto and you want to use it

53:32

as sort of a cash management tool in

53:34

between crypto investments. Kind of like

53:35

a money market mutual fund in your

53:37

brokerage account for parking funds in

53:39

between crypto gambling.

53:41

>> Mhm.

53:41

>> Um but they've really never had any

53:45

utility in any big way as a legal

53:47

payments mechanism.

53:49

>> All right. So what about you mentioned

53:51

the blockchain? I keep reading that

53:54

blockchain uh is going to allow us to

53:56

use smart contracts and have things

53:58

happen automatically that now have to be

54:01

manually. What what's the problem with

54:03

blockchain?

54:04

>> Well, first of all, smart contracts can

54:06

work without a blockchain. Smart

54:08

contracts predate blockchains. They can

54:10

run on all kinds of databases. So, if if

54:13

you want that kind of functionality, and

54:15

it has pros and cons, and I've written

54:16

about this a ton, um you can have that

54:19

without a blockchain. The reason why you

54:22

don't want to have it on a blockchain

54:24

and this is something that does not get

54:25

anywhere near the attention it needs is

54:28

that there's all kinds of operational

54:29

risks associated with the blockchains

54:31

themselves. So blockchains are software.

54:33

They are maintained by

54:36

>> in the case of the the Bitcoin

54:39

blockchain just a few individuals. In

54:40

the case of the Ethereum blockchain,

54:42

it's the Ethereum Foundation. They're

54:44

not regulated at all. They have no

54:47

obligation to invest in cyber security,

54:51

to invest in getting their blockchains

54:54

up and running again should something go

54:56

wrong. you're just you're really sort of

54:59

as I sometimes say, YOLO-ing operational

55:01

risk with regards to these um these

55:04

blockchains. And so if you want smart

55:06

chain... sorry, smart contract functionality,

55:09

like don't use a blockchain.
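The point that smart-contract functionality predates blockchains and can run on ordinary databases can be sketched with a hypothetical escrow: funds release automatically when a delivery condition is met, using nothing but SQLite. All table and function names here are invented for illustration; this is a sketch of the concept, not any production escrow system.

```python
# Smart-contract-style automation on an ordinary database:
# an escrow whose funds release automatically once delivery
# is recorded -- no blockchain involved.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE escrow "
             "(id INTEGER PRIMARY KEY, amount REAL, "
             "delivered INTEGER, released INTEGER)")
conn.execute("INSERT INTO escrow VALUES (1, 500.0, 0, 0)")

def settle(conn):
    # the 'contract': release funds for every delivered, unreleased escrow
    conn.execute("UPDATE escrow SET released = 1 "
                 "WHERE delivered = 1 AND released = 0")

def mark_delivered(conn, escrow_id):
    conn.execute("UPDATE escrow SET delivered = 1 WHERE id = ?",
                 (escrow_id,))
    settle(conn)  # automation fires on the state change

mark_delivered(conn, 1)
released = conn.execute(
    "SELECT released FROM escrow WHERE id = 1").fetchone()[0]
print(released)  # prints 1: payment released without any blockchain
```

The design choice being illustrated: the conditional logic ("if delivered, then pay") lives in ordinary code against a conventional datastore, so you get the automation without taking on the operational risks of an unregulated blockchain.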

55:11

>> Huh. Coming up we continue our

55:13

conversation with Professor Hillary

55:14

Allen discussing her new book Fintech

55:18

Dystopia a summer beachread about

55:21

Silicon Valley and how it's ruining

55:24

things. I'm Barry Ritholtz. You're

55:26

listening to Masters in Business on

55:28

Bloomberg Radio.

55:43

I'm Barry Ritholtz. You're listening to

55:45

Masters in Business on Bloomberg Radio.

55:48

My extra special guest this week is

55:49

Hillary Allen. She teaches at the

55:52

American University Washington College

55:54

of Law in Washington DC where she

55:57

specializes in regulation of financial

56:00

and technology law. So we mentioned

56:03

stablecoins, we've mentioned blockchain.

56:06

Is there any value in any of the crypto

56:09

coins be it Bitcoin or Ethereum? Um I

56:13

know we can't actually describe the

56:17

last hundred coins that are out there on

56:19

the radio. Um, we'll violate George

56:22

Carlin's seven words you can't say on TV

56:25

or radio. But outside of the

56:29

you know, Dogecoins and everything below

56:32

that, what's the value of the first five

56:36

or so cryptocurrencies? Is there

56:38

anything worthwhile to these or is this

56:41

just a solution in search of a problem?

56:44

>> It's a solution in search of a problem.

56:46

I mean, essentially, even Bitcoin, which often

56:49

is seen as the most credible of these

56:51

because it's been around the longest and

56:52

has the largest

56:53

>> Bitcoin and ETH. Those are the two I

56:55

hear about the most.

56:56

>> Um but both of them are essentially

56:59

Ponzis in the sense that there's

57:01

nothing backing them. The only reason

57:03

they have value is because someone else

57:06

might buy them from you. If they choose

57:08

not to, it could go to zero. And

57:10

actually someone put it to me this way.

57:12

It's not that they could go to zero.

57:13

They could go to less than zero because

57:14

they don't even have any assets that

57:16

could be used to administer a winding

57:18

up. Right. Right.

57:20

>> And that's expensive. Um, you

57:22

know, you're going to get the

57:23

lawyers and the courts and everybody

57:24

involved that

57:25

>> Well, you're not suggesting that if you

57:27

own Bitcoin, you may have a liability

57:29

down the road. Is that is that the

57:31

implication?

57:31

>> No, I'm just saying that if someone

57:34

was trying to work out the end of one of

57:37

these things, there wouldn't even be,

57:39

you know, office furniture you could

57:41

sell to pay the lawyers.

57:45

>> Okay. Um, you've written about uh startups like

57:48

Theranos. I remember Juicero. You

57:51

>> Juicero is the best.

57:52

>> Tell us a little bit about those two.

57:54

And was that just uh, you know, one of

57:57

these products that just didn't work

57:59

out? What What's the problem with that

58:02

technology solution to our uh, juicing

58:06

problems?

58:07

>> So, Juicero is just my favorite metaphor

58:08

for all of this. So, for those of you

58:10

who are unfamiliar with the gift

58:12

that is Juicero, so basically this was

58:15

a machine. It cost hundreds of dollars.

58:17

It was Wi-Fi enabled.

58:19

>> Well, roll back. The guy, and you

58:21

described this in the book, the guy who

58:23

invented this previously had set up a

58:25

fairly successful

58:27

um was it a juicing uh chain of

58:30

companies that got bought. And so, he

58:33

had some credibility in the space. And

58:36

now, I'm not going to run restaurants.

58:38

I'm going to create a technology that

58:41

people can juice at home

58:42

>> and it was venture funded. They put a

58:44

lot of money into this

58:45

>> 100 plus million dollars

58:47

>> and what it did was it

58:49

squeezed these juice pouches and the

58:51

problem was that people could just

58:53

squeeze the juice pouches with their

58:55

bare hands and get all the juice. There

58:57

was there was a notorious Bloomberg

58:59

article about this,

59:01

but it raises the question: did the

59:05

company already squeeze the juice and

59:07

put it in these pouches? Why didn't they

59:10

like why wasn't this set up so that you

59:12

can actually put in fresh fruit? Like,

59:15

doesn't it defeat the purpose if you're

59:17

buying pouches? Or was the whole idea

59:20

the razor blade model?

>> So I mean the

59:23

reason why I love this as a metaphor is

59:25

it it really gets at this this

59:27

technosolutionism which is one of the

59:29

concepts that I'm really coming for in

59:31

in this book. And technosolutionism is

59:34

this idea that everything in our world

59:37

can be reduced into a technology

59:39

problem. And that the only reason we

59:41

haven't solved certain things is because

59:42

we haven't spent enough time and money

59:44

on developing the the technology. And

59:46

and what that does is it sort of

59:49

flattens problems: it gets rid of

59:52

the human messiness. It flattens

59:55

problems. It ignores domain expertise.

59:56

People who've been working in particular

59:58

fields for a long time and know a lot of

60:00

non-tech stuff. It sort of dismisses

60:02

their expertise.

60:04

And sadly, you know, there's just this

60:07

magic associated with technology at this

60:09

point. And and as I said, I'm not

60:10

anti-technology. A lot of it's great,

60:13

but it doesn't deserve the level of sort

60:15

of magical deference that we give it.

60:16

It can't solve all our problems. And

60:19

when we get into this mindset where we

60:21

think that if we throw enough money at

60:23

technology, it can solve anything and it

60:24

will always be the best solution, we

60:27

end up squeezing pouches with a machine

60:30

that we could squeeze with our bare

60:31

hands. And a joke that I try and

60:34

make in the book is like with with AI,

60:36

we may be better off squeezing things

60:38

with our bare minds.

>> Mhm. So, one more

60:40

company I have to ask about um Theranos.

60:44

Uh I love the book Bad Blood, which

60:46

really went into details about how

60:49

corrosive and co-opting the

60:53

company itself was for everybody around

60:55

it, including the attorneys and and all

60:57

sorts of other bad actors. Um why wasn't

61:01

Theranos just an idea that didn't work?

61:03

that you can't if you want to draw blood

61:06

from a vein, you have to draw blood from

61:09

a vein, you can't just prick your

61:11

fingertip and think that's going to be

61:14

the same as venous draws.

61:16

>> Well, so that's the thing with this

61:17

technosolutionism. It presumes that

61:19

everything is a tech problem waiting to

61:21

be solved. It doesn't even countenance

61:24

the possibility that there may not be a

61:27

technological solution for what you want

61:29

to do, that the technology you want may

61:31

not be able to do the thing you want it

61:33

to do. And when you have that sort of

61:36

collective sense that I think we have

61:39

now that if we throw enough money at any

61:42

technology, it can solve any problem we

61:44

give it. You can see how people get so

61:46

susceptible to being sort of drawn in to

61:49

the stories that outright con people

61:53

like Elizabeth Holmes might be telling,

61:55

but also the stories that we're being

61:57

told about, you know, about AI right now

62:00

and about crypto. You know, the more you

62:03

know about these technologies, the less

62:05

impressive they seem and the more

62:09

clearly it becomes illuminated that that

62:12

they just can't do a lot of the things

62:14

that they're supposed to do. But that's so

62:16

counter to how we typically talk about

62:18

technologies that it feels a

62:20

bit weird to talk like that. Um and and

62:24

you sort of you're going against

62:25

societal norms in a way. And so one of

62:27

the things that I really wanted to do

62:29

with this is to start making it easier

62:33

to talk about these things critically, to

62:36

not be such an outlier, to express your

62:38

frustrations. And I think we're actually

62:39

having a moment like that about AI

62:42

because so many people really hate it.

62:45

>> Really? So, you use the phrase

62:47

technosolutionism

62:50

and Theranos is really the poster child

62:52

for that because as you're describing a

62:55

lot of these things I am recalling the

62:58

story, especially what you're referring

63:01

to with domain expertise: she had no

63:04

medical or medical device training

63:08

>> none of the VCs who put money into um

63:12

Theranos were healthcare, biotech, or medical

63:16

device investors; they all passed. Um

63:19

eventually she hired a number of people

63:21

with some background, but they

63:25

seemed to turn over pretty quickly

63:27

because, no, you can't do that. When

63:30

you're just pricking the skin, you're getting

63:32

all the interstitial tissue and fluids

63:34

and you're corrupting the sample that

63:36

you want to test for something.

63:38

The reason we draw from the vein

63:41

is very medically specific. Um, and yet

63:45

it attracted

63:47

uh Henry Kissinger and all sorts

63:50

of big law firms and everybody plowed

63:53

in. She's the next Steve Jobs, the

63:55

youngest self-made female billionaire.

63:59

What is it about us that we're just so

64:02

susceptible to buying into these

64:04

narrative tales that turn out to be

64:07

nonsense?

64:08

>> So, I mean, part of it is that we're

64:10

humans and humans have often sort of

64:16

been snowed by things that are flashy

64:18

and shiny and exciting. I mean, that

64:19

that's just very much the human

64:21

condition. Um, some of the stuff I talk

64:24

about in the in the book that I really

64:26

enjoyed working on was the cognitive

64:29

psychology aspects of it. You know, sort

64:31

of

64:32

when we hear certain stories,

64:36

it's very difficult to budge ourselves

64:39

and be contrarian. As I

64:41

was saying earlier, you sort of need

64:43

a collective tipping point where

64:46

people start to question it, so you

64:48

don't feel like the outlier but the

64:51

norm when you start to question these

64:53

things. And so I think there's a role

64:54

for media here. I think there's a role

64:56

for education. Unfortunately, the people

64:58

who benefit from technosolutionism also

65:00

know this and have a very big media

65:02

presence and invest a lot in education.

65:04

So, it's it's an uphill battle to start

65:07

talking about these things differently.

65:10

Um, but you know, ultimately we we are

65:13

all human. Um, and it's nicer to believe

65:16

that something will succeed than

65:19

that it will fail. I mean, you might not

65:21

think I'd be much fun at cocktail

65:22

parties, although I am.

>> And the book is

65:25

available for free at

65:27

fintechdistopia.com.

65:30

Um, let's jump to our final questions,

65:33

our favorite questions we ask all of our

65:35

guests. Uh, starting with tell us about

65:38

your mentors who helped steer your

65:40

career.

65:42

>> Um, so my first mentor is probably my

65:44

first law firm partner boss um in

65:48

Australia, Steven Kavanagh. Um, and I

65:52

had thought I was going to be an IP

65:55

lawyer. Um, but we had a rotation system

65:58

and I ended up in his

66:00

financial services practice. Um, and he

66:04

was just a wonderful person to work for.

66:05

It was a time when the law had just

66:07

changed in Australia, and he really

66:10

was willing to hear what I had to say

66:13

about this this new new law. And so it

66:16

was just I just felt very invested in

66:19

and that that was that was lovely. And

66:21

then I think as an academic

66:24

Patricia McCoy, who I adore. Um, I

66:27

had a very non-traditional path to

66:29

academia. I had more practice experience

66:31

than is usually the case. I had fewer of

66:32

the bells and whistles credentials that

66:34

people usually have. And again she just

66:37

saw in me someone who was really

66:39

passionate about preventing financial

66:42

crises, about sort of systemic risk, um

66:45

and sort of was willing to look

66:48

through the fact that I wasn't as

66:49

polished as most of the other people

66:51

trying to enter academia and support me

66:53

and I was very grateful for that.

66:54

>> We've talked about a run of different

66:56

books. Uh what are some of your

66:58

favorites? What are you reading right

67:00

now?

67:01

>> Oh, I was an English lit major, so I

67:03

have many favorites. I'm very into

67:06

the dystopian track. So, The Handmaid's Tale,

67:09

1984. Yeah. I just finished Parable of

67:11

the Sower in that vein, which was

67:13

>> Parable of the

67:14

>> The Parable of the Sower.

67:16

>> Um, Octavia Butler. Um, I also have

67:20

always had a soft spot for really good

67:22

children's literature. So, um, Philip

67:24

Pullman's His Dark Materials trilogy is one

67:26

of my favorites and and right now I'm

67:28

reading with my kids um, Katherine

67:31

Rundell's books um, Impossible

67:33

Creatures and um, The Poisoned King and

67:36

it's just they're just so good. And then

67:39

work-wise, I've just started Jacob

67:41

Silverman's Gilded Rage, which is very

67:43

much on point for the conversation we're

67:45

having.

67:45

>> Gilded Rage. You know, we talked about a

67:47

few crypto-related books. Did you see uh

67:51

Zeke Faux's

67:53

>> Number Go Up?

67:54

>> It really is just an

67:56

astonishing work. Um, what sort of

68:00

advice would you give to a recent

68:02

college grad interested in a career in

68:06

uh whether it was law uh financial

68:08

technology, regulation? What's your

68:10

advice to those people?

68:11

>> It's a really hard time for them, and I

68:14

talk to my students a lot about the

68:15

careers, and, you know, the

68:17

ground is shifting under our feet and in

68:18

this time of uncertainty,

68:20

it's really hard um to figure out what

68:22

to do. So I would recommend investing in

68:26

the fundamentals. Um, and I think

68:27

it's hard to do when AI is being pushed,

68:29

but becoming a good communicator,

68:32

learning how to write and speak to

68:34

people clearly will never, I think, go

68:36

out of fashion. And investing in

68:38

relationships, again, we're in this time

68:40

where everything is sort of becoming

68:41

technologized and atomized, etc. But in

68:44

my career, having good relationships

68:47

with people, and I'm pretty sure you'll

68:48

agree with this, has been one of the

68:49

most successful things um that has

68:52

helped me along the way. Um, and so just

68:55

investing in personal relationships I

68:57

think is is always good advice.

68:59

>> And our final question, what do you know

69:01

about the world of fintech investing

69:06

regulation today that might have been useful

69:08

20, 25 years ago?

69:11

>> Well, honestly,

69:14

I'm not sure that there's much because

69:16

the world was very different 20 to 25

69:18

years ago. You know, I always just

69:20

invested in um

69:23

in index funds, basically, and you

69:25

know, and that worked out frankly great

69:27

for me. Um the challenge is and I study

69:31

financial crisis.

69:33

>> The challenge is that when things go

69:34

horribly wrong, everything is

69:36

correlated.

69:38

>> Everything is correlated.

69:39

>> All correlations go to one in a crisis

69:41

for sure.

69:41

>> And I think we're on the brink of a

69:45

crisis.

>> When you say on the brink, days,

69:48

weeks, months, years,

69:49

>> Ah, well, John Maynard Keynes said that

69:51

the markets can stay irrational longer

69:53

than you and I can stay solvent. So, I

69:55

will never put a time frame on it. But

69:57

I, you know, all warning indicators are

70:00

flashing red at the same time as we are

70:01

pulling back all regulatory apparatus.

70:03

So, I think it's safe to say we're on

70:05

the brink of a crisis.

70:07

>> How could that ever go wrong?

70:09

>> Deregulation unleashes the animal spirits.

70:13

As long as we're talking about Keynes.

70:15

Um, it's all good.

70:18

>> Perhaps not.

70:19

>> Perhaps not. Um, Hillary, thank you so

70:22

much for being so generous with your

70:24

time. We have been speaking with Hilary

70:26

Allen, professor of law at American

70:29

University Washington College of Law in DC and

70:32

author of the book available for free

70:35

online, Fintech Dystopia, a summer

70:39

beach read about how Silicon Valley is

70:42

ruining things. If you enjoy this

70:45

conversation, well, check out any of the

70:47

600 previous discussions we've had over

70:51

the past 12 years. You can find those at

70:54

iTunes, Spotify, YouTube, Bloomberg, or

70:57

wherever you find your favorite podcast.

71:01

I would be remiss if I didn't thank our

71:03

crack staff that helps put these

71:04

conversations together each week. Alexis

71:08

Noriega is my video producer. Shawn

71:11

Russo is my researcher. Anna Luke is my

71:15

podcast producer. I'm Barry Ritholtz.

71:19

You've been listening to Masters in

71:21

Business on Bloomberg Radio.

Interactive Summary

Hilary Allen, a professor at American University Washington College of Law, discusses her book "Fintech Dystopia." She critiques the tech industry's impact on finance, arguing that much of fintech innovation is driven by legal design and regulatory arbitrage rather than genuine technological advancement. Allen expresses skepticism about the hype surrounding technologies like blockchain and AI, highlighting their limitations and potential for misuse. She also touches upon economic precarity, the role of venture capital, and the dangers of "technosolutionism," where complex societal problems are oversimplified as purely technological challenges. Allen emphasizes the importance of critical thinking and robust regulation to prevent financial crises and protect consumers.
