Richard H. Thaler — The Winner’s Curse and Going Against the Establishment (with Nick Kokonas)

Transcript

0:00

There's a story I tell about a dinner

0:02

party and I bring out a bowl of cashew

0:05

nuts. People started nibbling as

0:08

they do. And at some point I realized

0:10

their appetite was in danger. And so I

0:14

grabbed the bowl of cashew nuts and went

0:17

and hid them in the kitchen. People

0:19

thanked me. So you removed choice. And

0:22

then because this is a group of

0:24

economists, somebody mentions that well

0:26

we're actually not allowed to be happy

0:28

about that because more options is

0:30

always better and we used to have the

0:33

option to eat nuts and now we don't.

0:35

Well, you can imagine the principle is

0:37

interesting that sometimes

0:41

we prefer

0:43

not to have options. I'm excited to dive

0:47

in and I thought Nick we would let you

0:50

take the reins because you had this

0:52

idea of starting from first principles

0:54

or at least fundamentals and I think

0:56

that is a great place to start because

0:59

maybe the things we think we understand

1:01

we don't understand or the things we

1:02

think we've defined for ourselves we

1:04

haven't defined. When I got to college

1:06

at a liberal arts college and didn't

1:08

know what I was going to study but I

1:10

knew I enjoyed business quote unquote or

1:13

might work in business of some sort

1:15

you're left with the study of economics

1:17

there like you're not getting an

1:18

undergrad MBA or something so you get to

1:21

economics class and it's not that at all

1:25

you know it's a bunch of models it's a

1:27

bunch of all that so I thought what we

1:29

would start which is first principles

1:31

what is the study of economics

1:34

and we're going to the best source in

1:36

the world on that. But it's really basic

1:38

yet I think fundamentally misunderstood.

1:41

>> I think that's actually a great place to

1:43

start and especially

1:45

it's not really possible to talk about

1:48

what behavioral economics is without

1:50

understanding what economics is.

1:53

Economics is really two things. It's

1:58

people interacting in markets and then

2:01

what are those people doing?

2:04

And what happened is

2:07

sometime right after World War II,

2:12

economists started getting interested in

2:15

making their models more rigorous and

2:19

more mathematical.

2:22

And

2:24

the easiest model to write down of what

2:27

somebody is doing

2:29

is to write down a model in which

2:31

they're doing it perfectly. So if you

2:34

open up any economics textbook,

2:38

you'll see the three letters max

2:43

and that's short for maximize.

2:46

And all models start with that. So, we

2:50

assume that when Nick goes to the

2:53

grocery store, what he chooses is the

2:57

best things he could choose.

3:00

And that's a simplifying assumption.

3:03

It's simplifying for the economist

3:06

because that's the easiest model to

3:08

write down.

3:09

>> And so, what are they modeling though?

3:11

Like even more fundamentally,

3:12

>> what they're modeling is whatever you

3:14

do. So, what route do you take to drive

3:19

from home to the golf course? The best

3:22

route. Which home do you choose to buy?

3:25

The best one. What mortgage do you

3:27

choose? The best one.

3:28

>> Yes.

3:29

>> Look, economists are jealous of physics.

3:34

>> And they're jealous of physicists. Many

3:37

economists started out in school as a

3:40

math major or physics major or

3:42

engineering major and then decided, oh,

3:45

this is too hard, but they kind of

3:48

admire that.

3:50

So, they want a model that's as accurate

3:54

as the model you use to send up a

3:56

rocket.

3:58

And the the problem is that that problem

4:02

is solvable. How much stuff do you need

4:05

to get a rocket to go up there? That's a

4:07

solvable problem. Figuring out what

4:10

people do. If you open a book, an

4:12

economics textbook, you actually don't

4:14

see the word people,

4:15

>> right?

4:16

>> You see the word the agents. Economics

4:19

starts with Adam Smith, 1776.

4:22

And it's that way about until 1950. Now,

4:27

we have an equation that says exactly

4:30

what a smart person is going to do. And

4:33

so the agents in these models are

4:36

getting smarter and smarter because the

4:38

norm is my model is better than your

4:41

model if my agents are smarter than your

4:44

agents.

4:45

>> And so what does it mean to be rational

4:48

within those models?

4:50

>> Well, it means to solve the problem the

4:53

way an economist would. And I don't mean

4:56

that economists think that they're the

4:58

smartest, though they may. But if it's

5:02

an economic problem,

5:05

like how to adjust your thermostat so

5:08

you're comfortable and spend the least

5:11

money, right? That's a little practical

5:13

economic problem. And an economist and

5:17

an engineer might solve it. And knowing

5:20

you, Nick, you you could easily get

5:22

absorbed with figuring out how to really

5:25

do it, but most people can't figure out

5:28

how to use that easy-to-use thermostat in

5:31

their house, much less solve it

5:32

themselves. So people will take

5:35

shortcuts

5:38

and

5:40

you know my joke is instead of writing

5:43

down max suppose we wrote down meh

5:46

because what people are doing isn't

5:48

really max right it's you know I'll do

5:52

something where I come in on that part

5:55

of the story is okay if people are not

6:00

capable or interested in solving

6:04

and they're doing something else, taking

6:06

some shortcut

6:08

then then what? So that's principle

6:12

number one. Principle number two is

6:14

economists again for simplicity

6:17

have assumed that people are selfish.

6:20

Most of us care more about ourselves

6:22

than anybody else. Maybe our family,

6:24

some family members, you know, but we

6:28

give money to charity. You know, NPR

6:31

collects money.

6:33

>> Churches collect money, right?

6:34

>> We might care about fairness.

6:36

>> We might care about fairness, right? I'm

6:37

sure we're going to have a discussion

6:39

about fairness. And we might care about

6:41

being treated fairly.

6:44

And so that was left out of the model

6:48

again

6:50

because it seemed like a simplifying

6:52

assumption to just start out.

6:54

>> Yeah. You're making the rocket equation

6:57

and you don't really care about the

6:58

astronauts at this point,

7:00

>> right? You just got to get the rocket.

7:02

>> You got to get the rocket up. I'll

7:03

mention a third thing, which is these

7:07

agents don't have any self-control

7:10

problems. So, they eat just the right

7:13

amount. They exercise just the right

7:15

amount. We wouldn't need these new fat

7:19

drugs because people would already

7:22

>> be optimizing for their health. It would

7:24

be perfectly fit and you wouldn't have

7:27

sold half as many books, Tim,

7:30

if people were those agents and you know

7:34

even the other kinds of things you're

7:36

interested in implicitly in this idea

7:39

that the agents are maximizing means

7:42

they don't need any advice.

7:44

>> They're doing it right. They're getting

7:46

the labor leisure tradeoff right.

7:50

They don't need any help in getting

7:51

along with their spouse because they're

7:54

they're

7:54

>> maximized. I've optimized my marriage.

7:58

>> And in fact, our wives would be happy to

8:00

testify that we are doing a wonderful job.

8:03

>> We're both perfect. Really, we couldn't

8:06

be better husbands. I've been looking

8:09

forward to this conversation that I've

8:12

always sort of furrowed my brow at the

8:15

agents all as rational and selfish

8:18

because I just don't see that behavior

8:20

if you look at your neighbor or your

8:21

friend or someone else. So my question

8:24

though is not so much to dive into that.

8:27

We could and the the story of your

8:30

friend who got hay fever Richard when he

8:32

mowed his lawn is a pretty funny one

8:34

from New York Times. Maybe we'll bring

8:36

that up. But suffice to say, people

8:38

self-sabotage. They care about fairness.

8:40

There are all these things that seem to

8:42

invalidate getting the rocket to the

8:44

moon or that approach to economics. And

8:47

I'm wondering, were they just

8:49

force-fitting precision to something in

8:52

order to defend it as more rigorous and

8:55

it was a waste of time? Or was it more

8:57

like Newtonian physics versus quantum

8:59

mechanics where it's like, well, you can

9:00

actually use Newtonian physics for a lot

9:03

of good things. Is there anything

9:04

productive that came of these incorrect

9:06

assumptions about all agents being

9:09

rational and selfish as a bedrock

9:11

assumption?

9:12

>> I would say sure. Supply and demand

9:14

still works. All economics starts with

9:16

supply and demand. If you raise the

9:18

price, you're going to sell less almost

9:21

always. When you write down these more

9:24

formal models and make more precise

9:26

predictions

9:28

then the question is are you adding

9:31

predictive power through that and I

9:36

think what happened is as this norm

9:40

we're starting in the 50s and I would

9:42

say

9:44

rationality peaked in the '90s maybe

9:48

where this norm that a model with

9:52

really, really, really smart people is

9:55

the best possible model. Eventually,

9:58

people start to realize, well, maybe

10:00

there's some drawbacks to that. But you

10:03

can argue, and of course, I've spent my

10:05

career arguing about how wrong this is.

10:09

You know, the great Chicago economist

10:11

Milton Friedman

10:13

had this defense. He would say, "Look,

10:18

I just want a model that people are

10:21

behaving as if they were maximizing."

10:26

So he would say, "It doesn't matter if

10:28

they literally know how to do it if

10:30

their behavior is close enough." And so

10:33

the real debate over my career has been

10:37

about that question. Well, let's go back

10:39

to the start then I think of sort of

10:44

your origin story and thus the origin

10:47

story of behavioral economics itself

10:51

because at some point psychologists

10:54

start getting involved and they start

10:56

looking at these models and they start

10:58

saying

11:00

yeah but people don't really act this

11:02

way and so this could be great in a

11:04

laboratory or on a piece of paper in a

11:07

spreadsheet

11:08

but it might not work in the real world

11:10

and there's real consequences to those

11:12

things. So let's go back to when you

11:14

were a young academic and started coming

11:18

across those ideas.

11:20

>> Yeah, I guess this is like in grad

11:22

school. So, there's a story I tell about

11:25

a dinner party with some other economics

11:27

graduate students and there's some roast

11:31

in the oven. It smells great and there's

11:35

some adult beverages and I bring out a

11:38

bowl of cashew nuts and people start

11:42

nibbling as they do. And at some point I

11:45

realized that their appetite was in

11:48

danger. And so I grabbed the bowl of

11:53

cashew nuts and went and hid them in the

11:56

kitchen.

11:57

And then I came back into the living

11:59

room and people thanked me. Oh, thank

12:03

God you got rid of those nuts. We were

12:04

going to eat them.

12:05

>> Yeah. So you removed choice.

12:06

>> Yeah, I removed choice. And then because

12:08

this is a group of economists,

12:11

they start analyzing it. There's a rule

12:13

of thumb. You don't want too many

12:15

economists at any dinner party. And this

12:18

is a good example of it. Somebody

12:20

mentions that, well, we're actually not

12:21

allowed to be happy about that because

12:24

more options is always better. And we

12:27

used to have the option to eat nuts and

12:28

now we don't. Well, you can imagine. So,

12:31

but the principle the discussion wasn't

12:34

that interesting but the principle is

12:35

interesting that sometimes

12:39

we prefer

12:41

not to have

12:44

options and so I started with this list

12:47

of stuff like that

12:51

and then the work comes into all right

12:55

well how can you go beyond a story

12:59

so yeah that's an amusing story, but so

13:03

what?

13:04

>> You want to change the framework of

13:07

economic theory without throwing out the

13:08

rigor, but now you're introducing

13:10

something super messy, which is humans

13:13

and psychology and irrationality and all

13:15

of those things. So, how do you do that

13:17

without getting rid of the rigor? Yeah,

13:19

just to make sure I'm tracking it seems

13:21

like so you've created this list of

13:23

sacred cows that you would put on trial,

13:25

but the question was how to do it

13:27

quantitatively or in some way like Nick

13:29

said rigorously without just leaving it

13:31

as an anecdote.

13:32

>> Well, there are two parts to it. One is

13:34

can you show that people are really

13:36

doing that?

13:37

>> And then second, can you create rigorous

13:41

models

13:43

that describe that behavior? Mhm.

13:46

>> And I think we might as well stick to

13:48

the demonstration part. So we can go

13:51

from that the cashew story and say,

13:54

well, what does that have to do with the

13:56

real world? And we can talk about

13:59

retirement saving as a first principle.

14:03

Americans don't save

14:06

unless the money is taken from their

14:08

paycheck and put into a retirement plan.

14:11

Now,

14:12

economic theory would say it doesn't

14:14

matter. People are going to save the

14:16

right amount. There have been two Nobel

14:19

prizes for theories that basically say

14:22

people save the right amount. So, they

14:24

they take their income and they decide,

14:28

okay, I'd like this consumption path

14:31

over my lifetime, and now how much do I

14:33

have to save to get that? And and then I

14:36

keep reoptimizing.

14:38

Market goes up, I can save a little

14:40

less.

14:40

>> I mean, that seems so obviously wrong.

14:42

So, were you frustrated at this point?

14:44

Like, I read some of your old papers in

14:45

preparation for this, and I saw these

14:47

little backhanded mentions that

14:50

were kind of snide. I mean, it's funny

14:52

reading 40-year-old academic papers and

14:54

reading the snark in them, right? I

14:57

mean, there's actual snark in Young

14:58

Thaler long before any of this.

15:00

>> It never escaped.

15:02

>> It never escaped. So, you know, tell me

15:04

a little bit about that because it seems

15:06

to me like in hindsight, you know, the

15:09

first time we met, we played golf

15:10

together after like a Twitter exchange.

15:14

And I remember thinking to myself as you

15:17

were saying this, I would go like, yeah,

15:19

well, obviously. And then you'd look at

15:20

me like, "No, no, no. You don't

15:22

understand. For 150 years, that wasn't

15:23

obvious." And so within the context of

15:26

this academic world, why was any of

15:29

applying what seems to be pretty logical

15:32

stuff, why was it resisted so much? Why

15:35

is that system built like that?

15:37

>> I can tell you while I was living

15:39

through that, the emperor has no clothes

15:43

was a recurring thought. Why am I seeing

15:47

that and no one else does?

15:50

And the no one else was just economists.

15:53

I remember giving a talk in the

15:56

psychology department at Cornell where I

15:58

was teaching and I was talking about

16:00

this theory of how people save and the

16:03

audience just starts laughing.

16:05

>> Yeah, that's what I did

16:06

>> and I was like pretty easy laughing and

16:10

one of my economist friends was there

16:12

and he had to assure them that I wasn't

16:15

making this up and that this wasn't a

16:18

caricature

16:20

of economic theory. No, there are

16:22

economists one floor up from here who

16:25

actually believe this is the way people

16:27

behave.

16:28

>> But they didn't even think of it as an

16:30

abstraction in the model. They actually

16:32

thought like, hey, this is how humans go

16:34

through life

16:35

>> or as if. Remember those magic?

16:38

>> Yes.

16:39

>> They don't have to know how to do a

16:41

present value, but they're acting as if

16:44

they knew.

16:45

>> That's right. That has a bit of a like

16:47

maxing works in mysterious ways type of

16:51

ring to it.

16:52

>> Yeah.

16:53

>> Is that defensible as an argument the as

16:55

if or is that just kind of a wiggle?

16:59

Look, it was the winning argument

17:03

when I started on this. And in fact, in

17:06

my first paper, which was published in

17:08

1980, my first behavioral economics

17:10

paper, it ends with a long response to

17:14

Milton Friedman's as if. And he talks

17:17

about a billiards player. It's an

17:20

expert billiards player, I should point

17:21

out, that he talks about. He says he may

17:23

not know physics or trigonometry

17:27

but he acts as if he did.

17:29

>> Is it really inferring that like the

17:32

like law of large numbers or crowd

17:34

intelligence or whatever you want to

17:36

call it where you go like well it

17:38

doesn't matter what the individual does

17:39

as an aggregate.

17:41

>> Okay so

17:42

>> when we look at a model it will average

17:44

out that the smart people and the idiots

17:46

all get to the midline which is the

17:47

model.

17:48

>> There are two things here. One is when

17:51

he talked about this expert billiards

17:53

player, I pointed out in this article,

17:55

you know, we actually study regular

17:58

people, not experts. So, you're a pretty

18:02

good golfer. I'm a mediocre golfer.

18:06

Neither of us play like Tiger Woods,

18:09

right? So, even though you're a pretty

18:10

good golfer, we wouldn't want to predict

18:13

the way you're going to hit a shot by

18:14

saying, "What would Tiger do?"

18:16

>> Thousand%. Yeah,

18:17

>> that was my first point about the

18:19

billiards player is let's just go to a

18:22

bar and try to predict what this guy is

18:26

going to do. Is the model going to be

18:29

the one that is optimizing or is it the

18:33

model of the regular guy at a bar?

18:36

>> Right?

18:37

>> And if we're studying investors,

18:39

they're not Warren Buffett.

18:42

You know, they're pretty far from Warren

18:44

Buffett. The second thing is and it's

18:47

sort of another version of the same

18:48

thing which is if we're trying to

18:52

describe behavior

18:54

whose behavior is it? So you know

18:57

there's a lot of discussion

19:00

in say monetary policy about

19:04

expectations.

19:06

So the the Fed will say we have to

19:09

change interest rates because we're

19:11

worried that if prices go up, people

19:14

will expect them to go up further. I'm

19:17

always asking my friends who are in that

19:19

field, whose expectations are they

19:21

talking about? If it's the, you know,

19:24

guy walking down Michigan Avenue, they

19:28

have no expectations about inflation.

19:31

They may have impressions of what's

19:33

going on now, like, oh, meat's high now.

19:37

>> Eggs,

19:38

>> eggs are high, right? Gasoline, right?

19:41

>> You know, I have an electric car. Even

19:43

I'm aware of the price of gas because

19:44

it's posted in those big signs that you

19:48

walk by, right? So, we know kind of the

19:51

level. Do we have real forecasts about

19:54

the future? No. Going back a little bit

19:57

now, how did you then go about designing

20:00

thought experiments, actual lab

20:02

experiments, experiments out in the

20:04

public

20:06

to take these erratic, if you will, or

20:11

nonoptimal

20:12

behaviors

20:14

and go back to the models that you

20:17

questioned and improve them, alter them,

20:21

change them. If you could give a couple

20:22

examples because I think they're kind of

20:24

fun, too. Let's talk about loss

20:26

aversion. Here's the first survey I ever

20:29

did, in my thesis, which was a very

20:32

traditional bit of economics although on

20:34

a kind of exotic topic. It was on the

20:36

value of saving lives. So if we make a

20:40

highway safer and we save 10 lives a

20:43

year, how much should we be willing to

20:44

pay for that? And I decided it might be

20:49

interesting to ask people a question. So

20:52

I ask people suppose by attending this

20:57

lecture today you've been exposed to a

21:00

one in a thousand risk of dying. You

21:02

have this disease and there's one in a

21:05

thousand chance you're going to die a

21:07

quick and painless death next week. But

21:09

I have a cure here that I can sell. How

21:13

much would you pay for it? That was one

21:15

question. Another question was over at

21:18

the med school who were studying that

21:20

same disease.

21:22

We'd like to know how much you would

21:24

have to pay you to expose yourself to a

21:27

one in a thousand chance of getting that

21:30

disease. And there's no cure here. Now,

21:33

economic theory says the answers to

21:35

those two questions have to be the same.

21:39

So the amount I'm willing to pay to get

21:42

rid of it or the amount I'd have to

21:45

be paid to do it should be approximately

21:48

the same. They're nowhere near the same.

21:51

So people would say, "Oh, I'd pay $1,000

21:55

to get that cure. I wouldn't do that

21:57

experiment for a million dollars." Now

21:59

they're lying because they drive they

22:03

>> Yeah. They do all sorts of things. Yes.

22:05

But they wouldn't choose to be in that

22:09

experiment for a million. So, okay. So,

22:12

that's buying and selling prices are

22:14

wildly different. Now, how do we get

22:16

that down to something more real? You

22:18

asked about an experiment. There's a a

22:20

famous experiment I did with my friend

22:24

and mentor Danny Kahneman and our friend

22:26

Jack Knetsch. And the way it works is

22:29

very simple. We go into a classroom

22:32

and we did some of these at Cornell. We

22:36

would go and put a Cornell coffee mug of

22:41

the sort you can get at any campus

22:42

bookstore. We put it on every other

22:45

desk.

22:47

And then we say, "All right, if you have

22:49

a mug, we ask you at each of the following

22:53

prices are you willing to sell?

22:56

Start at $10 and go down. And if you

22:58

don't have a mug, you get the same form

23:00

and say, "At each of the following

23:02

prices, are you willing to buy?"

23:04

And now the mugs are assigned at random.

23:09

People have had this mug for 30 seconds.

23:11

It's not their grandma's mug. It's been

23:14

in their possession for 30 seconds. And

23:17

what do you find? Well, the people who

23:20

have a mug demand twice as much to give

23:23

it up than the ones who don't have a mug

23:25

are willing to pay to get it. Why is

23:27

that, do you think?

23:28

>> Well, if I've got it, I don't want to

23:30

give it up.

23:30

>> That's it.

23:31

>> But I wouldn't pay much to get it,

23:32

>> right? The variance between retaining

23:35

something and acquiring it,

23:37

>> right,

23:37

>> are really wide. Like, what are the

23:39

consequences of that?

23:40

>> Well, it means there's much less trading

23:45

and much less change than we would

23:48

expect because we hold on to the stuff

23:52

that we have because we don't like

23:54

giving it up. But when there's a big

23:57

fire like they had in LA last year,

24:00

people are going to have to decide, all

24:01

right, now they don't have the option of

24:03

moving into the old house. What are they

24:05

going to do? The interesting thing about

24:07

these is that the way that we met is

24:10

that I was running experiments of loss

24:12

aversion with a restaurant. So I had

24:15

these restaurants, I had people making

24:17

reservations, and if they had absolutely

24:20

not a single penny in, they didn't care

24:22

about anything. But you could take the

24:24

richest person in the world and once

24:25

they had $5 in for their reservation as

24:28

a deposit, it took the no-show rate from

24:30

14% to under 3%. And I wrote about that

24:34

and published it and these economists

24:37

from Northwestern

24:39

published an article saying that I was

24:41

an idiot and I should just run an

24:43

auction. And I replied to them

24:46

suggesting that maybe there's a little

24:48

bit of human behavior and and psychology

24:52

involved in this. and I think that I've

24:53

got it right and I have hundreds of

24:55

thousands of of examples as to why this

24:58

is working for my business. And Thaler

25:01

read this and tweeted at me, but at the

25:04

time I didn't know who he was. And so

25:06

finally people said, "Hey, you know,

25:08

you've got one of the best economics

25:10

professors in the world who who really

25:12

wants to talk to you about this." And so

25:14

I was just doing it out of intuition and

25:16

experimentation,

25:18

but they're the same sorts of

25:20

experiments in a practical way that you

25:23

were abstracting into these traditional

25:25

models.

25:26

>> Kahneman and I wrote another paper where

25:28

we tried to find out what people think

25:30

is fair.

25:31

>> Yeah, fairness is a really interesting

25:33

concept. You know, the Northwestern

25:35

economists that were dumping on Nick

25:39

thought that what he really should do is

25:42

just auction off the tables at 7:30 on

25:45

Saturday night for whatever price he

25:47

could get

25:48

>> because I'd be maximizing my utility.

25:50

>> Well, no. You'd be maximizing your

25:52

profit,

25:52

>> right?

25:53

>> And there is some rich guy who will pay

25:57

$2,000, especially then.

25:59

>> You know, you go back and said, "Yeah,

26:01

but they might not come back." The

26:03

questions that we asked in this paper

26:08

were scenarios like there's one there's

26:11

a hardware store that's been selling

26:14

snow shovels for $20 and there's a

26:17

blizzard and they raise the price to

26:19

$30. Is that fair?

26:22

And people say no. You know, but there's

26:26

one exception. You know, there's a group

26:29

that say absolutely yes, and that's

26:32

business school students.

26:35

So, I teach a class in decision-making

26:39

and each week I show them, look, here's

26:41

the data from some experiment. You think

26:44

these people are idiots, but look, you

26:47

do it the same. So, they may be idiots,

26:50

but so are you.

26:51

>> What's the example? But any of them, any

26:53

of these other experiments except this

26:56

one on fairness, the business school

26:59

students are different from the idiots

27:02

because they think of course you should

27:06

raise the price of snow shovels after a

27:08

blizzard. We learned that in micro.

27:11

>> Yeah. Well, that's the Uber surge

27:12

pricing. Tim, you know something about

27:14

that.

27:15

>> Yeah, surge pricing. I know it well.

27:17

>> Surge pricing. I thought at the time

27:21

that there's nothing wrong with surge

27:24

pricing, but you have to put a limit on

27:27

it. And the example I gave, I tried to

27:30

convince the owner of Uber of this. I

27:34

said, suppose Uber existed on 9/11 and

27:38

you had Ubers charge $5,000 to drive

27:42

people back to Greenwich. How many days

27:45

would Uber still be in business? Right.

27:47

minutes. You can't do that.

27:49

>> You can't do that. And that's the

27:51

fairness principle.

27:52

>> That's right.

27:53

>> That proves the rule that we are

27:55

psychological. Everyone is a

27:57

psychological creature when it comes to

27:59

markets and interaction.

28:00

>> It might be the guys that are in that

28:03

Uber for five grand. But even they are

28:06

going to be a little pissed.

28:08

>> Yeah.

28:08

>> Yeah.

28:09

>> The thing is at the time when they would

28:12

have these surges like of 10x

28:17

They were not making any money off of

28:19

that. It would be fleeting. So, they'd

28:23

make a little bit of money just like if

28:26

Nick had sold one dinner reservation for

28:29

10 grand. Yeah, he'd make 10 grand, but

28:33

he'd have thousands of people writing

28:35

articles, thousand%. So Uber was making

28:38

a little bit of money and pissing off

28:42

millions of people and that was dumb in

28:46

a business where they had to fight city

28:48

by city to get permission to take people

28:51

to the airport. Mhm.

28:53

>> And so I think the important lesson is

28:57

that if you're doing business in the

29:01

real world and you have customers and

29:05

employees that are people, not agents,

29:09

then you have to do things a bit

29:12

differently. That's like the one

29:14

sentence summary of behavioral

29:15

economics. Richard, could you for people

29:18

listening and for me give an example or

29:21

two of how you take the research and

29:23

then apply it in the real world? You

29:27

mentioned effectively forced savings

29:30

earlier. Maybe that's a domain we could

29:33

explore.

29:34

>> When my father worked, he was an

29:35

actuary. He worked at a big insurance

29:38

company. He had the pension that was

29:42

prevalent at that time, defined benefit

29:45

pension plan, the old-fashioned kind

29:48

where how much you got in your pension

29:51

just depended on how long you worked and

29:53

what your final salary was. No

29:55

decisions.

29:57

And we gradually started shifting over

30:00

to the new 401k type that's called

30:03

defined contribution. Meaning you put

30:06

money in and invested and then you get

30:10

what you have at the end. Now,

30:14

when I started working in this area, one

30:16

problem we noticed was

30:19

lots of people weren't joining this

30:23

savings plan even though their employer

30:27

was matching contributions dollar for

30:30

dollar up to say 6% of their salary.

30:33

That's like the stupidest thing you

30:35

could ever do, right? You're making

30:37

$100,000. They'll say, "I'm going to

30:40

give you $6,000 as long as you put

30:43

>> save 6,000."

30:44

>> Yeah. In a tax deferred.

30:46

>> So, an economist would say, "Well,

30:48

>> 100% of people do.

30:49

>> Everybody will do it." And what we

30:51

noticed is in a lot of companies, only

30:54

half of new workers would sign up within

30:57

the first year.

31:00

So, how can we fix that? Well, remember

31:03

we talked about status quo bias. So,

31:05

here's a simple way. The way it worked

31:07

at that time was in order to join you

31:09

have to fill out a form and choose some

31:12

investments and then sign. And this was

31:14

a piece of paper at the time. We said, "How

31:17

about if we just change the form and say

31:19

there's this plan we're going to put you

31:22

in unless you fill out a form saying you

31:24

don't want it." Again, economic theory

31:26

says that won't make any difference.

31:28

Everybody's going to join. And certainly

31:31

just filling out a piece of paper,

31:33

>> it's not enough friction to change

31:34

things, right? I mean, we're giving you

31:36

$6,000, right?

31:38

>> But the first company that did that,

31:42

>> new employees now joined at 90% instead of

31:45

50%. I wrote a book called Nudge, and

31:48

that's an example of a nudge.

31:51

>> I am fascinated by nudges. And tell me

31:53

if I'm defining this correctly, but some

31:55

feature of the environment that improves

31:57

decisions but doesn't force anyone to do

31:59

anything. Is that a fair definition? Yes.

32:00

>> I think I'm trying to quote correctly,

32:02

so hopefully it's accurate. I'm pretty

32:04

sure I wrote those words.

32:05

>> Yeah, I think you did. One of the

32:07

examples that I've heard you

32:10

discuss, I think this started in the

32:13

Netherlands, but it is the fly etched or

32:17

otherwise put inside of urinals to

32:19

reduce spillage because a lot of guys

32:20

are on autopilot. Turns out they like to

32:22

aim at things.

32:22

>> I love that. That's what, of

32:24

everything that you read, that's what

32:25

you chose to pick.

32:27

>> Well, I picked it because at least most

32:30

guys listening have seen this. Yes, a

32:33

thousand percent.

32:34

>> And my question is,

32:37

is there a certain halflife to the

32:40

effectiveness of nudges? Because I

32:42

remember the first time I saw one of

32:44

these, I was like, I'm definitely going

32:45

to get that fly. I remember it. And then

32:47

after a while, I was like, okay, I

32:48

realize this is just painted on enamel

32:50

or etched into the enamel. It's no

32:52

longer that interesting. And not to

32:54

extrapolate from myself to everyone, but

32:56

I'm wondering if you need to refresh

33:00

nudges as you might refresh many other

33:03

things that maybe Nick has experimented

33:05

with in the realm of business. How do

33:06

you think about the durability of these

33:09

types of nudges?

33:10

>> There's a good example of a nudge of

33:12

that sort here in Chicago. When Nick and

33:16

I drive back home, we're going to go on

33:19

Lake Shore Drive and there's a bendy part.

33:24

And it's a beautiful, beautiful road. And

33:27

a lot of people wipe out around these

33:29

bends. You really can't go more than

33:31

about 30. And it's a sixlane road. So

33:34

people think they can go fast. So what

33:36

somebody did around the time we wrote

33:38

that book, a little before, is they

33:41

painted lines on the road

33:44

that get closer and closer together.

33:46

That gives the illusion that you're

33:49

speeding up.

33:50

>> That's clever. And so you just

33:53

instinctively tap the brake and then

33:57

don't wipe out your car. That's good.

33:59

Right now those lines, they keep

34:02

repainting them.

34:03

>> No one pays attention anymore.

34:05

>> Well, I don't know.

34:06

>> I don't know either.

34:07

>> I don't know. I think the fly in the

34:09

urinal probably

34:12

won't have any effect in the toilet you

34:17

use at your place of work where you know

34:19

you see it several times a day or

34:22

whatever.

34:24

>> But for the pension thing if we only

34:26

have to get you to sign up once that's

34:28

enough.

34:29

>> Yeah. So yes, attention. It may be that

34:33

we have to do something different to get

34:35

your attention this time, but there's a

34:38

rule which is if you want people to do

34:41

something, make it easy. That's a rule

34:44

that's always true. The more complicated

34:48

you make things, the less people are

34:50

going to do it. I think that's pretty

34:52

much automatic in terms of

34:56

capturing attention.

34:58

That's what the business of advertising

35:01

is constantly trying to do.

35:06

And you know, clickbait on ads on social

35:12

media. Social media itself is in the

35:14

business of that,

35:15

>> right? Keep it simple is a formula that

35:17

always works and

35:20

getting your attention always works. But

35:23

it won't be the same thing that will

35:26

keep getting your attention.

35:28

>> So this turned into a whole field from

35:30

relatively simple concepts like that

35:32

called choice architecture.

35:34

>> And you've done consulting with various

35:37

companies, the NFL, all sorts of people.

35:40

I don't even know which ones I'm allowed

35:42

to talk about or not, so I have to be

35:43

careful. But

35:44

>> tell us a little bit about like when

35:47

does that become a bad thing? Can you

35:50

turn the nudge or can someone that's

35:52

malicious turn the nudge into something

35:55

that takes advantage of the lack of

35:58

self-control in these models?

36:00

>> Yeah. Sure. We always say we didn't

36:02

invent nudging. Adam and Eve, right?

36:05

>> Yeah. Yeah.

36:06

>> Then the serpent, right? There was, you

36:09

know, the apple. So, human nature has

36:13

been there all along. Hustlers have

36:16

existed forever. Charles Ponzi didn't

36:20

read our book, didn't read any of my

36:22

papers, neither did Bernie Madoff. When

36:25

we wrote Nudge, it was saying, "Look,

36:29

here are some basic principles of human

36:31

behavior.

36:33

Can we use those to help people make

36:36

better decisions?" So practically

36:38

speaking, how do you then go into one of

36:41

the businesses that you've consulted for

36:42

and come up with through your framework

36:46

what they have overlooked?

36:48

>> Well, you want people to do more of

36:50

that. Why are you making it hard for

36:53

them to do it? I mean, right, that's the

36:55

answer. But where I was going with that

36:56

was the same principles can be used to

37:01

harm people. So if you go into a casino,

37:05

the whole casino has been designed to

37:08

get people to bet as much as possible

37:13

and to bet on things that

37:16

>> have the worst possible outcomes,

37:17

>> right?

37:18

>> Yeah.

37:18

>> And now we have online gambling and we

37:22

have places like Robinhood that have

37:25

made investing

37:26

feel a lot like casino gambling.

37:30

>> Yeah. They gamified it. Yeah, they're

37:32

making it easy, right? They've made it

37:34

easy to bet. It used to be you had to go

37:37

find a bookie. Now you open your phone

37:40

and you can bet on the game that you're

37:43

watching. And that's very tempting. So

37:47

the principles of understanding the

37:51

customer and then designing the product

37:54

can be used for good or evil. I take no

37:57

responsibility

37:59

for somebody optimizing

38:03

an online gambling app to make it as

38:08

attractive as possible for people to

38:11

lose all their money. Don't blame me.

38:14

That's what's going to happen in a

38:16

competitive market with consumers who

38:19

are humans.

38:20

>> Richard, question for you. How long have

38:22

you been teaching your or how long did

38:24

you teach? It sounded like it was

38:25

current day. The decision-making class?

38:28

>> 40 years.

38:29

>> Okay. You've had time to work on your

38:30

material.

38:31

>> Yeah, I should be better at it, right?

38:33

>> Well, I mean, I wasn't going to go that

38:35

far. I was going to ask you, what seems

38:38

to be the stickiest of what students

38:42

repeat back to you from that class as

38:46

concepts, frameworks, stories, could be

38:49

anything at all. And I suppose the

38:50

precursor question is what are they

38:52

hoping to gain from the class in the

38:54

first place? What's the promise of the

38:55

class? But I'd be I'd be curious to know

38:57

what sticks.

38:58

>> First thing I will say is nobody thinks

39:01

they need a class in decision-making

39:04

because they're great at decision-making. Why

39:06

would they need a class in that? Do I

39:08

need a class in breathing? Although

39:10

you're going to tell me actually you

39:12

don't know how to breathe, right? And

39:13

yeah,

39:15

>> I've got a frictionless ecourse for you

39:17

with lots of inapp purchases. I do hear

39:19

from people who took a class from me at

39:24

Cornell 40 years ago, which is very

39:27

gratifying. I'm glad that they even

39:30

remember that they had such a class.

39:33

What do they remember? They remember

39:36

stories. That is the only thing people

39:39

remember. They do not remember a

39:43

formula. They don't remember some

39:45

abstract concept. They remember a story

39:49

or they remember a demonstration.

39:52

Take the concept of the winner's curse.

39:55

>> This is an obvious move on my part since

39:58

I have a new book that's called the

39:59

winner's curse. But let's talk about the

40:01

winner's curse because it's a great

40:03

example. The way you do this in a class

40:05

is you bring in a jar of coins and you

40:09

say, "I'm going to auction this off."

40:10

You get the money in the jar.

40:12

>> You mean like the high bidder gets

40:13

>> high bidder, right? High bidder gets the

40:15

money. So, there's $75 worth of coins in

40:18

there and the high bidder gets 75 bucks

40:21

and they pay me

40:22

>> something. That's what we're getting to,

40:24

>> right?

40:24

>> Yeah.

40:24

>> Do they know that it contains 75 or it's

40:26

an unknown?

40:27

>> It's like a jelly bean estimation or

40:29

something. Exactly.

40:29

>> Yeah, I got it.

40:30

>> You can use jelly beans or whatever

40:32

paper clips.

40:34

>> So, what what do you find in that? You

40:37

always make money on this.

40:38

>> The creator of the jar makes money.

40:40

>> Yeah. The professor always makes money.

40:42

Yeah.

40:43

>> Because you have this jar. It's worth

40:45

$75. There will be somebody that'll bid

40:48

a hundred or 150

40:50

>> and they win.

40:51

>> And they're the winner.

40:52

>> They win. Yes.

40:53

>> That experience, you can tell people

40:55

this abstract concept of something

40:58

called the winner's curse. They won't

40:59

even remember what it means cuz it's got

41:03

a weird name. It doesn't have anything

41:05

to do with cursing or witches. But they

41:09

remember, oh yeah, that that guy who bid

41:13

a lot too much

41:14

>> bid too much. Now this concept was not

41:17

discovered by psychologists. It was

41:20

discovered by

41:22

>> engineers at Arco, an oil company. They

41:26

were bidding for leases

41:30

in what I'm going to insist on

41:32

continuing to call the Gulf of Mexico.

41:35

And what they discovered was the leases

41:39

that they won had less oil

41:43

than the engineers and geologists had

41:47

told them would be there.

41:50

And they said, "Gee, that's weird

41:53

because we thought we had great

41:54

geologists

41:56

and what's the problem?" And the

42:00

problem they figured out, which is very

42:02

subtle, which is that the auctions you

42:07

win are not a random sample of the

42:10

auctions you bid in. They're the ones

42:12

where you're the highest bidder. And if

42:14

you're the highest bidder, there's a

42:16

good chance that

42:17

>> you bid too much.

42:18

>> You bid too much. That leads to an

42:21

interesting conundrum or you know like

42:26

it's almost like war games where the

42:28

only way to win the game is not to play

42:30

if you're Arco which means you should

42:32

just go out of business.

42:33

>> Well, so

42:34

>> you know so how do you win that if you

42:36

are in that market where you have to bid

42:38

on these things?

42:39

>> That's a great question. So all right

42:41

it's 1970 or something whenever they

42:44

published that paper. they get this

42:46

finding. What should they do?

42:49

And right, one would be to go into some

42:51

other line of business. Another would be

42:54

to bid less, but then they're not going

42:58

to win very many auctions.

43:00

They came up with a pretty clever

43:02

solution.

43:03

>> Was it collusion?

43:04

>> No,

43:05

>> cuz that wouldn't work. But I bet it's

43:06

something like that.

43:08

>> Major League Baseball.

43:09

>> Major League Baseball does that.

43:10

>> That was their solution

43:12

>> and they were outed on that. No, their

43:15

solution was to write a paper. Think

43:17

about it, you know?

43:18

>> So, they made everyone aware of it,

43:20

>> right? So, instead of going to all the

43:22

other team owners and say, "Hey guys,

43:25

when Catfish Hunter becomes a free

43:28

agent, don't bid."

43:30

You know, that's illegal. But publishing

43:32

a paper saying people are bidding too

43:36

much and the more bidders there are, the

43:38

less you should bid. That's perfectly

43:40

legal and useful.
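[The jar auction and the Arco leases are the same selection effect, and it can be sketched in a few lines of Python. The bidder count and noise level below are made-up illustrative numbers, not figures from the class:]

```python
import random

# Winner's curse sketch: each bidder estimates the jar's value with
# unbiased noise and bids their estimate. Every individual estimate is
# right on average, but the WINNING bid is the maximum of all estimates,
# so it is biased upward -- the winner has very likely overestimated.
random.seed(0)

TRUE_VALUE = 75.0   # dollars of coins in the jar
N_BIDDERS = 30      # assumed class size
NOISE = 25.0        # assumed spread of each bidder's estimation error

def winning_bid():
    estimates = [random.gauss(TRUE_VALUE, NOISE) for _ in range(N_BIDDERS)]
    return max(estimates)  # high bidder wins at their own estimate

bids = [winning_bid() for _ in range(2000)]
avg = sum(bids) / len(bids)
print(f"true value ${TRUE_VALUE:.0f}, average winning bid ${avg:.0f}")
# The average winning bid lands well above the jar's true value,
# which is why the professor always makes money.
```

[This also captures the advice in the Arco paper: the more bidders there are, the higher the maximum estimate runs, so the more you should shade your bid down.]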

43:43

Now, it turns out that there's a funny

43:47

story about this, which is the version

43:50

of this book, The Winner's Curse, that I

43:53

published in 1992.

43:55

The editor who bought that went to

43:58

Princeton University Press. And then

44:01

when Nudge came along, there was an

44:04

auction for the rights to it. And

44:06

did he pay too much?

44:07

>> No, he didn't bid. And I said, Peter,

44:10

how come you didn't bid on this book? I

44:12

think it's going to sell. He said, "No,

44:14

I read The Winner's Curse." I knew

44:16

>> I can't bid on your book.

44:18

>> Yeah. Right. And no, don't bid in

44:20

auctions. So, I said, "Well, you know,

44:22

maybe this one should have been an

44:23

exception. I haven't forgotten your

44:25

question." I don't know whether people

44:27

will learn that theoretical lesson, but

44:31

they'll remember the jar of coins and

44:34

they'll remember stories. You know, I

44:36

had two psychologist mentors, Amos

44:40

Tversky and Danny Kahneman, now both

44:42

dead. But Amos sadly died at 59.

44:47

At his funeral, his son read a little

44:51

note that Amos had given him that said

44:54

something like, "I'm not going to get

44:56

this exactly right, but he had cancer

44:59

and had a few months where he knew he

45:02

was dying and was spending time talking

45:04

to his family about it." and he wrote a

45:06

note saying that he thinks the time

45:08

they've been spending talking has been

45:10

useful and that he thinks people learn

45:14

through stories.

45:16

And I've put that little note in my

45:21

first class ever since then. And I say

45:25

to people, look, people will tell you

45:28

don't take this class. All he does is

45:29

tell stories.

45:31

And I said, that's true. And talk about

45:34

sports. That's also true. But here's

45:37

this line from Amos, smartest man on

45:39

earth. That's the way you learn. You're

45:42

going to learn through the stories. We

45:45

show people that they're overconfident

45:47

and

45:47

>> in their decision- making.

45:48

>> Yeah. Or in judgments. I mean, you

45:52

can ask people what's the length of

45:56

the Amazon River and give 90% confidence

46:00

limits. meaning give a high and low

46:03

estimate so that you're 90% sure that

46:06

the correct answer lies somewhere. Yeah.

46:09

>> And the right answer will be within it

46:12

not 90% but like 60%.

46:15

>> Yeah. I would not wager on that. I have

46:17

no idea.

46:18

>> Yeah. So you know you have no idea but

46:20

still the limits are too narrow. By

46:23

the way, the same is true for CFOs

46:28

of Fortune 500 companies. I have two

46:30

friends at Duke who do a survey twice a

46:34

year of CFOs and they're asked what's

46:38

going to be the return on the S&P 500

46:40

over the next year and they are asked

46:44

for a high and low estimate and the

46:47

correct answer comes out between those. I

46:51

think they asked for 80% limits and it's

46:54

like

46:55

>> yeah no

46:55

>> a third of the time I mean now it's true

46:58

that that's an impossible task meaning

47:01

nobody can predict the market but you

47:05

should know that you can't predict the

47:06

market. So a correct answer for 80% is

47:10

well it's going to be

47:12

somewhere between up 20% and down 10.

47:16

That's a reasonable forecast.

47:17

>> Yes.

47:18

>> But instead they say up 10%, minus 2%.

47:23

Yeah, there were there was a whole

47:24

decade where the average downside

47:29

scenario was zero.

47:31

>> Well, it's a recency bias, right? Like

47:33

whatever happened the last couple years,

47:34

people tend to extrapolate into the

47:36

future.

47:36

>> Well, they were doing that right up

47:37

until the financial crisis.

47:39

>> Yes.

47:40

>> Right.

47:40

>> Yeah. Yes. Yeah.

47:41

>> So, they were most overconfident

47:44

before the [ __ ] hit the fan. Exactly.

47:46

Right.

47:47

>> So, that was Kokonas saying the [ __ ] hit

47:49

the fan. Not

47:50

>> I'm allowed to swear on this podcast.

47:51

>> Oh. Oh, okay. That's good. Oh yeah, you

47:53

can swear. Yeah, you can feel free to

47:54

fire away.

47:55

>> You're fine.

47:57

>> So, the winner's curse sounds like an

47:59

abstract concept, but Nick knows I wrote

48:02

a paper about the NFL draft

48:07

that applies exactly that concept. Teams

48:12

really think it's valuable to have the

48:15

first pick or one of the top 10 picks.

48:18

And then you just cited the Chicago

48:20

Bears and their quarterback picks and

48:22

that's all you needed to do.

48:24

>> Yeah. I mean, and you know, I think the

48:27

Bears traded up twice to pick

48:30

quarterbacks at least.

48:32

>> This is always It's not just the Bears,

48:34

though.

48:34

>> It's not just the Bears. No, it's not.

48:36

This is available.

48:39

We live in Chicago, right? But they're

48:41

not the only team that does this. and my

48:44

co-author and I of that paper and and

48:49

somebody else have been again updating

48:53

that and nothing's changed.

48:56

>> But then people actually then hire you

48:58

to tell them this because for some

49:02

reason they can't believe it.

49:04

>> Yeah. But then the problem is that

49:06

there's an owner. Let me ask you,

49:08

Richard, about the the hiring just for a

49:11

second because the example with Arco

49:12

involved writing a paper that draws

49:14

attention to the fact that if you bid

49:16

the most, you're likely going to be

49:17

overpaying, which is a very interesting

49:20

stratagem. I'm wondering in the case of

49:22

say an NFL team, what is it that they

49:24

can do? How can they change their

49:27

behavior or bidding behavior based on

49:32

you describing the winner's curse and

49:34

sort of all the connective tissue around

49:35

it? If they have a top pick, they can

49:38

trade down.

49:40

>> If you have the first pick, you can

49:41

trade it for the seventh and eighth

49:43

picks or five, count them, five second

49:48

round picks. And those five players will

49:50

cost you about the same

49:52

>> in dollars in contracts. Yeah.

49:54

>> Right. And if you look, I mean, any

49:58

sports fan can rattle off the number of

50:04

very high picks, quarterbacks, and

50:07

others that have been complete busts.

50:10

Here's the one statistic from that paper

50:12

that I think is most compelling.

50:15

take the players at any one position,

50:19

let's say running backs, and rank them

50:23

in the order in which they were picked.

50:25

So, we have the first down to whatever.

50:28

Now, we ask, what's the chance the

50:30

higher one picked is better than the

50:33

next one. My co-author Cade and I used

50:35

to call this the better than the

50:36

next guy stat.

50:38

>> It's like a tennis ladder,

50:39

>> right? If teams are perfect at

50:41

predicting, it'll be 100%.

50:43

>> Yeah.

50:44

>> Right. If we rank it tallest to

50:46

shortest, that'll be 100%. Right?

50:50

>> If they're flipping coins,

50:51

>> it's 50%.

50:53

>> Sure.

50:54

>> It was 53%.

50:55

>> All that work, all of the prediction,

50:58

all of the people, all the scouting, all

51:01

the combine, pretty much coin flip.

51:03

>> Yeah, it's pretty much coin flips. That

51:05

means more picks are better. So Tim's

51:08

podcast is really about taking, you

51:11

know, as he always says at the beginning

51:12

of everyone, the high performers and the

51:14

people who see things differently and

51:16

trying to take the nuggets to that

51:18

people can apply to their lives. Mh.

51:20

>> And so I know that like some of what

51:23

you've studied and done,

51:26

you've looked at people's habits like we

51:29

were saying at the very beginning where

51:30

everyone makes perfect decisions

51:34

and of course that's not the case and

51:36

that's really what the whole podcast is

51:38

about. How to change those bad habits

51:40

into positive habits. And so what kind

51:44

of frictions can we create in our lives

51:47

where we can improve our

51:50

decision-making? We can be more like

51:52

that ideal agent that actually cares

51:55

about our economic utility without, you

51:58

know, going nuts and sitting in a room

52:00

with spreadsheets. But how do you take

52:03

these things that you've studied in

52:05

human nature for 40 years and apply them

52:09

like to my life normally?

52:11

>> Let's go back to the cashews. This is

52:13

stuff everybody knows. Your mother told

52:15

you. If you're trying to quit smoking,

52:17

you don't have cigarettes around. If you

52:19

are drinking too much,

52:21

>> lock the wine cellar.

52:22

>> Yeah. Lock the wine cellar. Make it

52:25

harder to do the stuff you want to do

52:27

less of and make it easier to do the

52:30

stuff you want to do more of.

52:32

>> Yeah. I mean, that seems obvious.

52:33

>> Well, not so much for economists.

52:36

>> Basically, everything I've done

52:39

has seemed obvious after the fact.

52:42

selling

52:44

reservations at a restaurant instead of,

52:47

as you used to say, having five people

52:50

you pay to say no on the phone. That

52:53

seems like an obvious thing to do. It

52:56

does, but I will say that

52:58

since I have sold the company, we'll

53:01

talk about the law of one price, right?

53:02

A thing, if it's identical, should

53:04

cost the same kind of all over the

53:06

place. And that's where arbitrage

53:07

opportunities come from and all of that.

53:09

And classical economics would say, well,

53:12

those get scrubbed out like because of

53:14

perfect information and all of that. But

53:16

as it turns out, you have to then

53:18

convince business owners that, hey, this

53:21

is not a controversial idea and you can

53:24

indeed charge a deposit and change the

53:27

economics of your business. And I spent

53:29

over a decade doing that. And it was

53:31

very difficult actually. And no matter

53:33

how easy we made that choice

53:34

architecture for them as business

53:37

owners, their psychology was that well

53:40

this is a controversial topic. And then

53:42

since I've left the company, what I've

53:45

watched is that one of the big

53:46

competitors is now simply going to other

53:49

restaurants, some of the premier

53:51

restaurants, and they're saying, "Well,

53:51

we'll give you $10,000 to leave Tock,

53:55

upfront cash." I would go, why would they

53:58

want to give me free money? There is no

54:00

such thing as a free lunch, but it works

54:02

remarkably well. And that sort of thing

54:05

is also an interesting psychological

54:07

problem.

54:08

>> You know this better than anybody, but

54:10

people are good at something

54:13

like being a chef. Many restaurants are

54:17

run or owned by the chef.

54:20

And being a good chef doesn't make you a

54:23

good business person.

54:25

And the same is true of

54:29

being a coach.

54:32

You don't get to be the coach of a team

54:37

just by being smart. You almost always

54:39

have to have played that sport.

54:43

And that doesn't make you a good

54:45

decision maker. You know, it's

54:47

interesting the field of behavioral

54:49

economics and the field of sports

54:52

analytics.

54:54

You can think of think of Michael

54:55

Lewis's book Moneyball. It's the same

54:58

field. Why do I say that? Well, again,

55:00

people optimize, right? So, economists

55:03

would say, well, teams are all going to

55:06

do the strategy that maximizes their

55:09

chance of winning. Let's take

55:10

basketball. There was an innovation 40

55:13

years ago, the three-point shot. Before

55:16

that, all shots are worth two points.

55:19

Now, you have a shot that's 50% better.

55:23

Now, every team had somebody who could

55:26

make 40% of their three-point shots,

55:30

and teams average about half of their

55:32

two-point shots. Now, Nick, see if you

55:36

can keep up with the math here.

55:37

>> Yeah.

55:38

>> 40% of three

55:40

>> expected value

55:41

>> is greater than 50% of two.

55:44

>> Yeah. How long did it take them to

55:46

figure that out?

55:47

>> Basically 40 years.

55:48

>> That's right.

55:49

>> Steph Curry.

55:50

>> Steph Curry. Yeah. Well, right now I'm

55:52

going to say the words Michael Jordan.

55:54

Give me your image

55:56

that comes to mind and I can tell you

55:59

what it is. It's Michael

56:02

taking some last second shot

56:05

somewhere mid-range with two guys

56:08

hanging on him. Now that even if you're

56:11

Michael Jordan, that's a low percentage

56:13

shot. Steve Kerr, who's now the coach of

56:17

the Warriors, was on the team with

56:20

Jordan,

56:22

for an entire year, his three-point

56:25

shooting percentage was 50%.

56:27

>> Was that true? Really?

56:28

>> Yes.

56:29

>> I had no idea.

56:30

>> And how many shots a game do you think

56:32

he got?

56:32

>> Like one and a half or something.

56:34

>> Yeah. Right. Yeah. Right.

56:35

>> Right. So if you have a look at a plot

56:38

of three-point attempts over time, it's

56:42

been going up but very slowly.

56:45

>> Yeah.

56:46

>> And so I'm friends with Daryl Morey, who's

56:49

the general manager of the 76ers. I

56:54

always tease him that he got to be rich

56:56

and famous because he was the first guy

56:59

to calculate that 0.4 × 3 was greater

57:02

than 0.5 × 2. He's actually a really smart

57:06

guy. But that's kind of true.
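[The arithmetic behind that, with the round numbers from the conversation, is a one-line expected-value comparison:]

```python
# A 40% three-point shooter produces more points per attempt
# than a 50% two-point shooter.
three_pt_ev = 0.40 * 3  # expected points per three-point attempt
two_pt_ev = 0.50 * 2    # expected points per two-point attempt
print(round(three_pt_ev, 2), ">", round(two_pt_ev, 2))  # prints: 1.2 > 1.0
```

[A 20% edge per possession, sitting in plain sight for decades.]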

57:09

>> And then that happens everywhere around

57:11

us.

57:12

>> Yes, there are examples of that. And

57:16

again, you know, when I came from

57:19

Cornell to Chicago, I came and gave a

57:22

job talk. It's called an interview and

57:25

you present a paper. and they were

57:27

taking me to lunch and we walk out the

57:29

door and there's literally a $20 bill

57:31

lying on the ground, and people think I'm making

57:34

up the story because it's sort of an

57:37

apocryphal economic story that

57:39

economists look at that it can't be real

57:42

because otherwise somebody would have

57:44

already picked it up. I picked it up. So

57:47

economists really they think there

57:49

aren't these $20 bills on the street and

57:52

there kind of are.

57:53

>> There are. But then what I was going to

57:55

say is where do you put that? So I want

57:56

to just touch on a little bit my

57:59

favorite concept of yours of all because

58:02

it comes up in my household and in my

58:04

businesses like once a week and that is

58:06

mental accounting.

58:07

>> Oh yeah.

58:07

>> And if you could just go over because I

58:09

think this one might be the most

58:12

applicable to every single person that I

58:15

know because people are incredibly

58:17

irrational about this. Explain what that

58:19

is. In economic theory, there's money

58:24

and it has no labels.

58:26

There's just you have wealth W and then

58:32

you figure out and it doesn't matter

58:34

where it is or how you got it or that's

58:37

it.

58:38

>> Now, humans think about money as sort of

58:44

coming in categories. And let's suppose

58:48

you take out a pair of jeans you haven't

58:51

worn in a long time and you find there

58:53

are $300 in there. You don't know

58:55

exactly when you left them there. Oh,

58:58

that feels like a windfall.

58:59

>> Jackpot, right?

59:00

>> I can go have a nice meal.

59:02

>> So again, the the standard theory is

59:06

money has no labels. Now, here's a

59:08

policy version of this question. In the

59:11

financial crisis, the Obama White House

59:14

had to there was going to be some tax

59:17

refund to stimulate the economy. And the

59:20

question was, should we give it in a

59:22

lump

59:24

or should we spread it out? Now, the

59:29

economists will say, does it matter?

59:31

It's W. That's it. Right.

59:34

>> Yeah,

59:35

>> it matters. I'm not saying I know

59:38

exactly what the right answer is. It's

59:40

kind of a complicated question,

59:41

>> but the point is is that people take

59:44

sort of money and how they acquired it

59:47

matters to them,

59:49

>> right?

59:49

>> Like if I win $100 off you at golf, I

59:53

might go like, "Well, I'll buy a bottle

59:54

of wine with that." But really, it's

59:55

just part of my cumulative wealth. And I

59:58

should have just done that anyway

59:59

because I had another $100. But that

60:01

comes true like we're selling our house

60:03

right now. And

60:05

>> that money, I'm pretty sure you ought to

60:06

just give that to me. So, my house is

60:08

going to get sold. And so, there's this

60:10

concept now that, well, that's the money

60:14

for the next house, right?

60:16

>> Or the condo we're buying in Chicago as

60:18

we downsize. Somehow the budget is tied

60:21

from one house to the other even though

60:23

it's completely irrelevant. Like, the

60:25

money is going to come in from the house

60:27

sale. And I can use any pool of it's

60:30

just in the big swimming pool. It

60:31

doesn't matter which drop you take.

60:33

Right.

60:33

>> Right. And in our companies, I think

60:35

businesses do a terrible job of that.

60:37

People get budgets. You know, they own

60:39

that budget and they look at tax savings

60:43

that the company might get as completely

60:45

different than earnings that they might

60:47

get and they spend it differently and

60:49

they think about it differently. And

60:50

boards I've been on are like,

60:52

>> yeah,

60:53

>> are talking about all this. And what we

60:54

all said in our businesses, we tried to,

60:56

Steve Bernaki, I'll give you a shout

60:57

out. Every dollar spends the same.

61:00

They're all the same. So, I got to know

61:02

the CEO of an airline. I won't mention

61:05

which one. And I was trying to convince

61:08

them before COVID that they should get rid

61:11

of change fees. I think I was also

61:15

lobbying for getting rid of baggage fees. And

61:18

he told me, well, you know, there's a

61:20

guy

61:22

they have a billion dollars a year in

61:23

baggage fees.

61:24

>> Yeah.

61:25

>> There's a guy who owns that.

61:28

>> He ain't going away. Yes. of course

61:31

owns. What does that mean? It's not that

61:33

the money goes to him.

61:35

>> No,

61:35

>> he's the baggage guy.

61:37

>> Yeah. Yeah. Yeah. He's pricing out the

61:38

baggage.

61:39

>> No, it's it would be like if in your

61:41

restaurant Well, I'm sure there was a

61:44

beverage manager, but that money is the

61:47

same as the

61:47

>> same money as all the other. Yes.

61:49

>> So people the mental accounting concept

61:52

is don't do mental accounting basically,

61:55

right?

61:55

>> I mean, now it can be helpful. So,

61:59

putting money into

62:03

uh children's education account,

62:07

that can be smart and treating that as

62:09

offlimits.

62:11

>> Mhm.

62:11

>> Some people have trouble spending too

62:14

much. Most people have that problem.

62:16

Some have the opposite problem. And so

62:20

it's just like we were talking about you

62:22

want to hide the booze and put the

62:24

exercise equipment somewhere where it'll

62:27

be easy to use. It's the same with the

62:30

money. So you can have a fiction.

62:33

>> So there could be good fictions and bad

62:35

fiction.

62:35

>> Yes. Yes. And now you know part of

62:38

mental accounting probably the biggest

62:40

mental accounting thing is the so-called

62:42

sunk cost fallacy. And the idea is

62:47

if you paid for something, so we go out

62:51

to dinner and we've bought some dessert

62:53

and we realize, you know, God, we're

62:55

really full and neither of us need to

62:59

weigh more. We'll just say that, but you

63:03

know, we paid 30 bucks for that dessert,

63:05

so we got to eat it, right? That's dumb.

63:09

And again, every economist

63:12

teaches that. And this is this sort of

63:15

discussion I used to have in the old

63:16

days. I said, "Look, why do you have to

63:19

teach people the sunk cost fallacy

63:23

and then assume they already know it?"

63:25

>> Yeah.

63:25

>> You know, people would say, "What do you

63:27

mean? I can't waste that."

63:30

>> I mean, I fully admit your wine example.

63:32

I do.

63:33

>> I fully admit this. And every time I do

63:35

it, I think of the sunk cost fallacy

63:38

because, you know, I've got this old

63:39

bottle of wine. It's now worth $500 or

63:41

$600. I would not pay $500 or $600 to

63:43

acquire it, but I will gladly drink it.

63:46

But I won't go buy a $500 bottle of

63:48

wine, right? And that's it in a

63:50

nutshell, right? And that's literally I

63:52

built the whole company off that. Like

63:54

the entire company, Tock, was built off

63:56

that one concept.

63:57

>> Wait, Nick, could you expand on that?

63:59

How is that built off of that?

64:00

>> Well, you know, the big friction in

64:04

>> and maybe you could explain it. I would

64:06

have mentioned it briefly in the intro,

64:08

but since it's come up a few times, the

64:10

reservation platform Tock, maybe you

64:12

could just give a little bit of

64:13

background.

64:13

>> Well, he doesn't own it anymore now, so you

64:15

know, he doesn't need to plug it.

64:17

>> No, no, no. I'm not I'm not trying to

64:18

plug it. That's how we met. I got into

64:20

the restaurant industry by accident in

64:23

some ways. And then when I got there, I

64:24

saw all these sort of irrational

64:26

behaviors, right? And one of them was

64:30

that people would make reservations for

64:32

restaurants and then simply not show up.

64:34

And it's a big number. It's like 12 to 14%

64:37

of the people just wouldn't show up. And

64:38

then even at a destination place like

64:40

Alinea that I used to own, you know,

64:43

six, seven, eight percent of the people

64:44

wouldn't show up. And what I realized

64:47

very quickly was that if people had paid

64:49

for it, they would show up. And they

64:52

would show up at all costs. Like the dog

64:55

could have died and like you know the

64:58

snowstorm is happening but they're going

65:00

to figure out a way to get there because

65:02

they have paid some amount of money

65:05

whether it's the whole or the half it

65:06

doesn't really matter and it's

65:09

fascinating because

65:12

if something more important lines up or

65:14

something has more economic utility to

65:16

you, you should in classical theory just go,

65:20

well screw it like that's already done

65:22

I've already spent that $300 or whatever

65:24

it is.

65:24

>> Yeah.

65:25

>> and now I have something that's more

65:26

important or more valuable.

65:29

>> But people cling to that thing very very

65:33

very very strongly.

65:34

>> I'll tell you a funny story. My daughter

65:36

lives in a neighborhood where there was a kid who

65:42

grew up to be a pitcher for

65:42

the Mets

65:44

and

65:45

he was pitching in some first round

65:47

playoff game and I noticed that and I

65:52

said, "I think I can get you tickets to

65:55

this game. Why don't you go? That'd be

65:57

fun." And she said, "Oh, that's great.

65:59

That's great." So I'd look online at one

66:02

of these ticket sites and tickets are

66:06

this was a first round. It wasn't that

66:08

expensive. So $300 $400. But then I

66:12

wasn't sure which ones she would want

66:14

and how to get them to her. So I say,

66:15

"Okay, I'll send you $1,000.

66:18

You pick which tickets you want and take

66:21

the rest to buy hot dogs." So she texts

66:24

me back.

66:25

>> Now she has a choice.

66:26

>> She texts me back and says, "Oh, this is

66:28

just like in your book. If you send me

66:30

$1,000, I'm not gonna spend it on

66:32

baseball tickets.

66:34

>> Right. Right.

66:34

>> Just last week, I learned my lesson.

66:37

We're David Byrne fans in my family, and

66:40

David Byrne had a show where she was

66:42

playing. I sent her the tickets.

66:45

>> Yeah. Yeah. Yeah. Yeah.

66:46

>> And she enjoyed it. They were free.

66:49

>> Well, they're free. They're mentally

66:50

accounted for as zero.

66:52

>> Right. So, that that was the best gift

66:54

ever.

66:55

>> Yes.

66:56

>> I'd love to talk about cognitive biases

66:58

for a second. A few things have come up

67:00

already.

67:01

>> Sunk cost fallacy. I think maybe you

67:04

were referring to something that I might

67:05

put under the category of endowment

67:07

effect with maybe the mugs.

67:09

>> Yeah,

67:10

>> might be mixing that up. But my question

67:12

is what are good examples? I can think

67:14

of a few for myself actually as a

67:17

backstory. I bought books on cognitive

67:18

biases and the framing around the

67:21

reading for me was things to avoid,

67:23

right? These are things that I want to

67:24

avoid. These are yellow flags. But what

67:27

I realized, at least for myself, and

67:29

maybe I'm misapplying the term, but I

67:33

could basically do what Nick did to his

67:36

customers making reservations to myself.

67:39

For instance, I could prepay for

67:42

personal trainer or something like that

67:45

and it would make me more inclined to do

67:46

the thing that I say I want to do that's

67:50

good for me.

67:52

I know of, and actually wrote about, the case

67:55

of two engineers. They worked at tech

67:57

companies. They made perfectly good

67:59

money, but they bet each other

68:01

effectively. It was a bet of $1. So, it's

68:05

kind of like trading places, but if they

68:07

would show up at the gym at the same

68:08

time to do something like 15 minutes of

68:11

treadmill and if somebody didn't show

68:13

up, they had to pay the other person a

68:15

dollar. And these are two people who had

68:17

failed at every exercise regimen prior

68:20

to that. and they both ended up losing

68:21

50 plus pounds, even though they didn't

68:23

really know the nuances of exercise or

68:27

anything like that. So, I'm I'm curious

68:29

if any examples come to mind where you

68:31

can actually use cognitive biases to

68:34

your advantage.

68:35

>> I'm a big believer in that that you

68:38

know, a good way to get yourself to do

68:41

something is have a commitment,

68:45

>> pay for it,

68:45

>> and pay for it.

68:46

>> It's a monetary commit. It's pain. It's

68:48

a little bit of pain,

68:49

>> right? I have some young colleagues who

68:51

wrote a paper called "Paying Not to Go to

68:54

the Gym." So yes, I do Pilates and if I

68:59

make an appointment with my trainer then

69:02

I go. There's a clever experiment by a

69:06

young colleague of mine called Katie

69:07

Milkman who's big in this behavior

69:10

change space and she ran an

69:14

experiment

69:16

with getting people to go to the gym

69:18

where what she did was she gave them the

69:24

Hunger Games audio book and they could

69:27

only listen to it when they were on the

69:30

treadmill.

69:32

The idea is you pair something good with

69:35

something that you don't want to do.

69:37

>> Mhm.

69:38

>> So if you go then

69:39

>> you get to hear the next chapter.

69:41

>> You get to hear the next chapter.

69:42

>> Yeah.

69:43

>> And it's like if you're binging: imagine

69:47

that to watch the next episode, first you

69:49

have to run around the block.

69:52

Temptation bundling. That's what you

69:54

>> That's it. Yeah. It strikes me that a

69:56

lot of the experiments you've done

69:58

required finding groups of people and

70:01

then testing them methodically and then

70:04

putting rigor to some things that were

70:06

maybe a little amorphous and whatnot so

70:07

that the academic community would

70:10

accept them as rigorous enough to be its

70:12

own, you know, department and ultimately

70:15

win a Nobel Prize.

70:17

And at the end of the day, that seems

70:21

like those tests and experiments are so

70:24

much easier now with social media, with

70:26

the internet, with the ability to engage

70:28

with huge populations of people. Is that

70:31

true? Has the field like sort of

70:34

utilized that? I know you've cited eBay

70:38

auctions in some of your papers. Yeah, I

70:41

think the big thing, as you know, is I took

70:43

on a kind of weird task of

70:46

taking a book I wrote in 1992

70:50

and taking on a young co-author and

70:52

going back and saying did we make all

70:55

that up or how does it hold up?

70:57

>> How does it hold up? It holds up and the

71:00

kind of encouraging thing is that we can

71:04

go from the lab and now to the field.

71:07

So, we were talking about mental

71:08

accounting. Here's a funny mental

71:10

accounting result. During the financial

71:13

crisis,

71:16

the price of gasoline fell

71:19

like by 50%.

71:22

So,

71:24

what do people do now? Remember, it's a

71:27

financial crisis, right? So, people are

71:28

tight for cash, but their gasoline

71:32

budget

71:33

is overflowing. They have a little

71:35

compartment in their head that's for

71:36

gas.

71:37

>> So let's say they spend a hundred bucks

71:39

a week at the gas tank

71:43

>> and

71:45

now it's 50 bucks. So what do they do?

71:48

Well,

71:50

>> Over time they start treating their car to

71:53

occasional tanks of high test.

71:55

>> Oh, really?

71:57

>> Now that's really stupid. Your Honda

72:00

or Prius, it's not gonna do any better with

72:04

>> premium gas.

72:05

>> With premium gas, it's made to run on

72:08

regular. No matter what the gasoline

72:12

companies are telling you, the more

72:14

expensive gas isn't better for 90% of

72:17

cars. But what they found was when the

72:20

price went down, they would buy more

72:24

expensive gas. Now, you know, you and I

72:27

would say if we were going to do some

72:28

mental accounting with that, we say,

72:30

"All right, we could upgrade the wine."

72:32

>> Yes. I always say that.

72:35

>> That's always we would always do it

72:37

anyway. Or they buy better olive oil,

72:40

you know, instead of the store brand. And

72:43

the bigger picture is that's kind of one

72:46

of the lessons in this new book is all

72:48

the stuff that we found in thought

72:50

experiments and laboratories now because

72:54

of big data

72:56

you can find in the real world. And like

73:00

this paper, they had data from millions

73:04

of shoppers at a large big-box chain store.

73:09

And so they could show not only are they

73:11

upgrading the gas, which is stupid, but

73:14

they're not upgrading the orange juice

73:17

>> or purchasing in bulk to save money

73:20

during a crisis. Yeah.

73:21

>> Right. Right. You're right that it's

73:24

much easier to run experiments now. And

73:27

of course, companies are running these

73:31

experiments every minute. The largest

73:34

economics department in the world is now

73:36

at Amazon.

73:37

>> Mhm.

73:38

>> 100 PhDs in economics working at Amazon,

73:42

>> which could be good or bad for them.

73:45

>> Well, according to my

73:47

>> I was going to say it depends if they're

73:49

the right stripe, right?

73:51

>> Yeah. I think they're getting pretty

73:52

good economists. uh how many do you

73:55

think work and maybe the label of

73:58

economists is too confining here but in

74:01

terms of working with mass data sets in

74:05

the real world, say at Palantir? I don't

74:07

know if those numbers are public but I

74:09

would imagine they also have an entire

74:11

army of people who are working on this

74:13

stuff

74:14

>> and you know, the mix of data scientists

74:18

and economists... some of them got their training

74:19

in economics departments, so I don't know exactly what

74:22

their training is, but there are people

74:26

with the equivalent of PhDs in economics

74:30

or computer science working at all these

74:34

companies.

74:34

>> To rewind the clock quite a ways, you've

74:38

done a lot of amazing things in your

74:40

career. I was looking at an interview

74:43

with you on Nobelprize.org

74:46

and there's a line here I'd love for you

74:48

to explain. And my thesis adviser

74:51

famously said when interviewed about me

74:53

and my time in graduate school that quote

74:55

we did not expect much of him end quote.

74:58

So why is that the case? I was not the

75:03

best grad student in my class and I

75:07

wasn't in the best department. Actually

75:09

I wasn't a great student in any way but

75:12

I certainly knew

75:14

I was not the best grad student in my

75:18

class. Why is that?

75:20

>> I was good in math but not as good in

75:22

math as the people who go to get PhDs in

75:27

economics. Mhm.

75:29

>> And I was better at

75:33

noticing

75:34

the problems with economics

75:38

than

75:39

you can think about it as you could be

75:44

somebody who can draw

75:48

perfectly

75:50

or you can be somebody who thinks of a

75:54

different way of drawing.

75:55

>> Mhm. And I was more that guy. The only

75:59

way I managed to succeed, even get a job

76:04

as an economist

76:06

and get tenure, much less get a Nobel

76:09

Prize, which was certainly never

76:12

on my radar when I was a young person,

76:15

was to think of a different way of doing

76:17

economics.

76:20

and I more or less had to invent

76:23

behavioral economics

76:26

to have a career

76:28

>> otherwise I would have done something

76:29

else. I was reading in preparation for

76:31

this a bunch of his old source material

76:34

papers like I've read Nudge I read

76:36

misbehaving and all of these and I went

76:39

back to some of the source papers and I

76:41

was literally laughing out loud. I mean,

76:42

they were written 30, 40 years ago. And

76:45

some of these same problems and the same

76:47

human nature shows up again and again

76:50

and again. And I it's just a fascinating

76:53

thing that within this entire academic

76:56

discipline for hundreds of years, no one

76:58

said, well, the emperor doesn't have the

77:00

clothes on this. And I think that's what

77:01

you've done really, really well time and

77:04

again.

77:04

>> Thanks, Nick. I always say that

77:08

I never changed anybody's mind. What I

77:10

was saying was heresy. It was the

77:14

emperor has no clothes. I had been

77:16

thinking, look, see that mole on

77:19

his belly? You know, you can't see that.

77:22

You're talking about the three-piece

77:24

suit. But sarcasm doesn't really

77:28

convince people. So the strategy I

77:31

adopted at some point I mean I had to

77:34

write some papers but the strategy I

77:36

adopted to

77:38

broaden the field I always say instead

77:42

of changing people's minds I would

77:44

corrupt the youth.

77:47

One example of that is there's a

77:49

foundation in New York called the

77:51

Russell Sage Foundation

77:53

and they wanted to support behavioral

77:56

economics

77:57

when we were just getting started and

77:59

they gave us some money and they said

78:00

you can do whatever you want with it and

78:02

what we decided to do is start a 2 week

78:07

summer camp. That's not the official

78:08

name but everybody refers to it as the

78:10

summer camp. So it's two weeks. We got

78:13

30 grad students from around the world,

78:17

best students in the best departments

78:19

and we would teach them about behavioral

78:22

economics. There are graduates from that

78:25

graduates I mean attendees

78:27

>> alums

78:28

>> alums in the best economics departments

78:31

around the world. They're editing

78:33

journals now. The new chairman of the

78:37

Berkeley economics department was at one

78:40

of those and I think it's still the

78:42

truth that people my age they never got

78:47

convinced

78:48

and it's the 30 and 40 year olds

78:52

>> Mhm. The other thing I did was there was

78:56

a new journal called the journal of

78:58

economic perspectives and it tells you

79:01

something about economics that this

79:03

journal had to be created. Journal

79:05

articles had gotten so arcane and

79:07

technical that the papers were not

79:11

understandable

79:12

unless you were in the sub field.

79:16

you know a macroeconomics paper was not

79:18

understandable to a labor economist or a

79:21

finance professor. So they started this

79:24

journal and the idea was the articles

79:27

would be written in a way that would be

79:29

accessible to any economist or uh grad

79:32

student or even advanced undergrad. My

79:35

friend Hal Varian, who was the chief

79:37

economist at Google. He was an editor at

79:39

this journal and he and I were having

79:42

lunch one day and we got the idea they

79:45

were going to have some regular features

79:48

and the idea was I would write a column

79:51

in this journal on anomalies. So these

79:55

were pokes. There was one on the

79:59

endowment effect that we've talked about

80:01

that buying and selling prices are

80:03

different. There was one about the fact

80:06

that stocks that have gone down a lot do

80:09

better than ones that have gone up a

80:11

lot. So I started writing this when I

80:13

was about 40. And it it's kind of an old

80:16

man thing to do to

80:20

write stuff like that. And there was a

80:23

colleague of mine at Cornell who I

80:27

overheard telling somebody about this

80:30

journal. Well, I don't know whether

80:32

articles in that journal should count.

80:35

And I'm thinking,

80:36

>> what are they counting?

80:37

>> Yeah. What are they counting?

80:39

>> Right. Right. Right. Imaginary economist

80:41

points.

80:42

>> Right. Yes. You know, now there is

80:44

something you can count which is

80:45

citations. A citation is when somebody else

80:48

writes an article and cites your

80:50

article, those are counted. And actually

80:53

publications in this journal get a lot

80:56

of citations because people read them.

80:58

People can read them

81:00

>> because they're readable. Yes, they can

81:02

read them

81:02

>> because right first idea, write an

81:05

article somebody can understand.

81:08

>> Make it easier.

81:09

>> Now, it is the case that if the article

81:12

is

81:13

too easy, we've mentioned my friends

81:17

Kahneman and Tversky, who were writing the

81:19

psychology articles that inspired me a

81:22

lot. A lot of people would look at those

81:24

articles and say, you know, what's the

81:26

big deal? There just wasn't enough rigor

81:27

to them within the

81:28

>> and it seemed so obvious. So

81:31

they have this idea availability that

81:34

you're going to think something is more

81:35

likely if examples of it come to mind.

81:40

So ask people what's the ratio of

81:43

homicides to suicides and they think

81:46

like two or 3:1. Turns out there are

81:48

twice as many suicides as homicides.

81:51

But suicides are quiet and I'm betting

81:55

you know more than one person either

81:58

directly or you know in your community

82:02

who is a suicide victim and the chances

82:06

are you don't know any homicide victims.

82:08

>> Yeah.

82:09

>> but nevertheless you might give that

82:10

same answer and the obvious reason is

82:13

that we read about homicides all the

82:15

time and suicides are kind of quiet. So

82:18

the thing is their papers look too easy.

82:21

>> I don't even know if you know this Nick,

82:23

but when I was at Princeton undergrad,

82:26

one of the many ways that I got together

82:29

little bits of money here and there was

82:31

by volunteering at mostly Green Hall in

82:35

the psychology department and I was a

82:37

subject for some of Danny Kahneman's

82:40

studies.

82:42

>> So you were there when Danny was

82:43

teaching there?

82:44

>> I was. Yeah, I was there. Did you take a

82:47

class?

82:48

>> I did not take a class with him, which

82:49

is one of my great regrets.

82:51

>> That was a bad move, Tim.

82:53

>> I know. It was a bad move.

82:54

>> Next time, get that right.

82:57

>> Exactly. And people may recognize the

83:01

name popularly from Thinking Fast and

83:04

Slow, which has been recommended by

83:06

presidents and so on. But why is he so

83:09

notable? What did he do or show or

83:12

explain that made him so noteworthy?

83:17

>> The early work was done jointly with

83:19

>> Amos Tversky.

83:20

>> Mhm.

83:21

>> And they are the reason why you're

83:23

talking to me because I had that list of

83:27

weird behavior,

83:29

but I didn't know what to do with it.

83:31

And then I went to a conference and one

83:33

of their students, this is back in the

83:36

70s, one of their students was telling

83:39

me about the work they were doing. And I

83:42

went back home and read a bunch of their

83:45

papers, which you had to do by going to

83:48

the library and finding the psychology

83:51

section in the library, which I had

83:53

never been to. And

83:56

a big light bulb went on. And the light

83:59

bulb was it's the phrase systematic

84:02

bias. So let me explain to an economist

84:07

if people make a mistake that's no big

84:10

deal because, fine, they'll admit it. And in

84:14

fact if you give economists like a half

84:18

a glass of wine they'll admit even

84:21

traditional economists, that most of the

84:23

people they know are idiots.

84:26

>> See what I mean?

84:29

certainly their students and their

84:31

spouse and their dean and the president

84:36

of the university and actually here

84:39

there's a funny story about Amos. Amos

84:41

and I are at this conference and there's

84:44

an economist at dinner and he starts

84:47

going into this rant. Oh, Amos had

84:50

set him off and said, how's your wife's

84:53

decision-making? And the guy starts telling

84:56

stories. Then Amos asked him about the

84:59

president of the university and the

85:00

president at the time, I don't remember

85:02

who it was. We're getting like this

85:04

half-hour-long

85:07

rant about the irrationality of all

85:10

these people,

85:10

>> right?

85:11

>> And then it's like Amos is having him

85:14

walk the ledge

85:16

and then pulls it out and says, "See,

85:20

let me see if I can understand this." So

85:22

basically

85:23

everybody you know you think is dumb

85:27

but the people in your models are all

85:29

brilliant.

85:31

>> So that's the systematic bias.

85:32

>> Yeah. And the systematic bias is like

85:35

back to the availability we were talking

85:37

about. Right. So the fact that I can ask

85:39

you a question are homicides or suicides

85:42

which is more common? I can predict

85:45

that. Right? And that's a mistake and

85:49

it's not a random error. So, it's not

85:51

that people are dumb, you know. I don't

85:54

really think people are dumb. I think

85:55

the world is hard, but people deal with

85:59

this hard world using shortcuts and so

86:04

forth. And the shortcuts are useful,

86:08

but not perfect. And they lead to

86:10

predictable mistakes like the sunk cost

86:13

fallacy. The more you paid for the play

86:19

you were going to go to,

86:22

the less willing you are to skip it, no

86:24

matter how good the alternative is. A

86:27

friend you haven't seen for 20 years

86:29

calls and says, "My flight got

86:32

cancelled. I'm in Chicago tonight."

86:35

>> We bought tickets the day I started

86:37

talk. This is really true. We bought

86:39

tickets to a movie with the kids that

86:42

they wanted to go to and I went on

86:44

Fandango, bought the tickets. I don't

86:46

like superhero movies. It was some

86:48

superhero movie and I did not want to

86:50

go. I would have easily paid the $150 to

86:53

not go at 9 in the morning. Then I

86:56

bought the tickets and it was pouring

86:59

rain outside at like 6:00 and everyone's

87:03

looking at each other and they're

87:04

comfortable on the couch and everyone's

87:05

like, "Do you really want to go out in

87:06

this?" I was like, "We are going to that

87:08

damn movie." Like, how can you not?

87:11

Literally that moment, I went, "We are

87:14

putting deposits down on every damn

87:16

person that goes to the Aviary." And I

87:18

walked in and like my CFO is like, "This

87:20

is what I was talking about."

87:22

>> I hope you didn't go to the movie.

87:23

>> We did go and I hated it,

87:25

>> but that's because

87:26

>> you didn't know me then.

87:28

>> But it is absolutely true that that is a

87:31

real thing that we all succumb to. So

87:34

that was the big idea from Kahneman and

87:37

Tversky. And by the way, everybody knows

87:40

Michael Lewis and Moneyball and many of

87:43

his other books like The Big Short, my

87:46

favorite movie. People don't realize I'm

87:48

a I have a cameo in that movie. It's not

87:52

the one with Margot Robbie. But an amazing

87:55

book Michael wrote was about Kahneman

87:58

and Tversky called The Undoing Project.

88:01

and I kept telling him, "You can't write

88:04

a book about two psychologists

88:06

talking to each other,

88:09

but he's an amazing writer and it's an

88:12

amazing story." So, if you're curious

88:14

about those two people who are two of

88:19

the greatest 20th century scientists,

88:22

I recommend that book. It's an easy

88:24

read. And

88:25

>> can we bring up a difficult subject?

88:27

>> Is there anything I could do to stop

88:29

you?

88:29

>> Absolutely. You can say no. Oh, okay.

88:32

Yeah, sure. Bring it up.

88:34

>> Yeah, we can always edit it out as well.

88:36

>> Yeah. No, I mean, look, I say it with

88:38

respect, but you know, so it became

88:40

public, I guess, earlier this year, and

88:43

I literally just found this out a couple

88:45

hours ago that Danny chose assisted

88:50

suicide. And I've known that for a

88:52

little while. but as a friend, as a

88:54

mentor, that had to be incredibly

88:57

difficult and something to struggle with

89:00

when he told you that he was going to do

89:02

this. Furthermore, he wasn't actually

89:06

like tremendously ill or anything like

89:08

that. Are you comfortable talking about

89:10

that a little bit?

89:11

>> He had been

89:13

a friend and mentor. He was my best

89:15

friend for 40 years. Yeah. He calls me

89:20

one day and says, "Ah, that's it." And

89:24

he had just turned 90.

89:29

And you know, one of his findings

89:34

was that the our memory of an experience

89:39

is determined by two factors.

89:43

The peak and the end. Like you go to one

89:47

of those meals

89:49

at a three-star restaurant. What was the

89:52

best thing? That's the peak.

89:56

And how was it at the end? I think those

89:59

restaurants don't get the end part right

90:00

because they give you too much food.

90:03

>> But anyway, Danny was concerned. He took

90:06

this part seriously and he was mostly he

90:11

didn't want to lose control. And at 90,

90:15

I can tell you he was still the smartest

90:17

guy I knew. He had lost nothing. So, we

90:21

spent a week or so arguing and

90:26

I thought I was winning and he said,

90:30

"Okay, you're getting annoying." So, I

90:34

flew to New York. I was in California. I

90:36

flew to New York. Took him out for a

90:39

good dinner. Bought him a bottle of

90:41

wine. 1998 luin that I thought this is

90:47

worth living for. So that was my

90:50

attempt. I wasn't allowed to try and

90:52

argue with him anymore.

90:53

>> Yeah. I figured he'd probably put the

90:55

kibash on that.

90:55

>> Right. So no arguing, but we went out to

90:58

dinner together. He did think the wine

91:01

was good, but wasn't going to change his

91:03

mind. And then the next day we spent

91:07

figuring out how to manage the next

91:09

month or so.

91:12

And our goal was that the obits

91:18

weren't about the way he died.

91:20

>> And they weren't.

91:21

>> And they weren't

91:22

>> until that came out.

91:24

>> Yeah. Then a year later, there was an

91:26

article in the Wall Street Journal. I

91:28

think the writer shouldn't have included

91:30

the letter he sent to the email he sent

91:33

to friends. But anyway, I mean, Danny

91:36

had great 90 years and he was great up

91:38

until the end. And I would have liked a

91:42

few more, but

91:45

I respected

91:47

his right to

91:50

end it his way. I kept sending him

91:52

emails saying, you know, tell me how the

91:55

chocolates are in Switzerland, but he

91:59

didn't reply.

92:01

>> Richard, what was his

92:03

argument for doing it? Did he feel like

92:06

he was slipping? Did he want to just

92:10

head that off at the pass altogether?

92:12

>> He wanted to be able to decide when he

92:16

was going to do it. And his argument was

92:20

yes, he realizes that

92:24

it's premature, but it would be

92:26

premature

92:29

whenever he decided to do it.

92:33

And so he's going to do it now. And I

92:36

will say like the last month of his life

92:39

might have been his happiest.

92:42

So maybe he got it exactly right. He

92:47

went to Paris for two weeks with his

92:50

partner

92:51

and then his Israeli family, his

92:54

daughter

92:56

lives in Tel Aviv and she and her family

93:01

came and spent a week with him in Paris,

93:03

which is where he grew up as a kid. Then

93:06

he went off to Switzerland. So yeah, I'm

93:09

a greedy man. I would have liked a few

93:12

more, but I had 45 years. So that's

93:15

pretty lucky.

93:17

>> I won't spend too much more time on

93:18

this, but I am curious. What was his

93:23

belief around

93:25

death? Was it lights out? That's it.

93:27

Just like before you were born. Was it

93:29

something else? Was he afraid of dying

93:31

or did he not have a fear of it?

93:32

>> I think he had no fear of it. He didn't

93:36

want

93:38

to

93:42

go through a phase where he didn't have

93:45

his full faculties. And

93:48

>> you explained to me when you first told

93:50

me about this because I think there's

93:52

this innately human thing which you know

93:54

Tim is reacting to as well and I

93:56

certainly did which is we are so

93:58

ingrained to protect life and the life

94:03

of ourselves and others that we love

94:07

no matter what. Right? And he feared the

94:12

cognitive decline. The thing he valued

94:14

the most was wrestling with ideas. And

94:18

you told me that he feared that more and

94:21

the control over how that ended than

94:23

anything else. It's not like he was

94:26

worried about no longer being the

94:29

smartest guy in the room as much as he

94:33

thought that

94:35

he might be slipping. And then, I mean,

94:39

my attempt at an

94:40

intervention was to

94:44

create

94:46

a group of people he loved and trusted

94:50

to say, all right, when certain

94:55

steps are there, we buy you the ticket.

95:01

But he wanted to be the one who got to

95:03

decide when that was going to be. And

95:06

that was with all his faculties.

95:09

And so that was it.

95:10

>> Yeah. Thank you, Richard. We can shift

95:12

gears, but thank you for being willing

95:14

to share that. I mean, I was taken aback

95:16

when I read the piece and have just been

95:19

very very curious as someone who was in

95:22

the same hallways but never took a

95:24

class, which is a real shame on my part.

95:26

In any case, thanks for being

95:28

willing to talk about that.

95:30

>> No problem.

95:31

>> What keeps you going, Richard? Like what

95:33

is what gets you excited?

95:36

>> There's a transition, Tim.

95:38

>> Yeah. Yeah.

95:39

>> Not saying you should buy a ticket to

95:41

Switzerland. I'm just saying.

95:43

>> Yeah.

95:44

>> What is it that gives you the feeling of

95:47

aliveness? Is it the wrestling with

95:49

ideas? Is it something else? Is it

95:51

corrupting the youth in

95:52

productive ways?

95:53

>> Corrupting the youth. I took on this

95:57

possibly wacky task of rewriting a book

96:02

I published in 1992

96:04

about those anomalies columns.

96:06

>> Mhm. And part of that was there's

96:10

something in in psychology called the

96:13

replication crisis

96:15

>> that there are some experiments that

96:17

just don't replicate and there are some

96:20

people that have been proven just to

96:22

have made stuff up

96:24

>> and I wanted to see whether the stuff we

96:29

had built everything on could stand

96:32

scrutiny. So, I corrupted a young

96:35

colleague of mine, Alex Imas, who just

96:38

turned 40, and we took some of those old

96:42

things, two pieces I wrote with Danny

96:44

and one with Amos, and then some others,

96:47

and then gave it the hard look. Does

96:51

this hold up? Is it true out of sample?

96:54

Is it true in the real world? And that's

96:58

what keeps you thinking. I like that in

97:01

the book

97:02

at the end of every one of these

97:04

chapters where they go through the rigor

97:05

of updating it and seeing if it holds

97:07

up, they also say for the economist and

97:10

it's like one sentence. Here's your

97:12

takeaway if you're an economist. And

97:14

then it's like for everyone else, here's

97:16

one sentence that's a takeaway. You can

97:18

read the whole book, but you can also

97:20

read those and get an awful lot out of

97:22

it, which is really good because those

97:24

conclusions are the nuggets that that

97:26

kind of propel the book forward. I think

97:28

as well

97:29

>> the way we wrote it is yeah takeaway for

97:33

humans and for economists

97:36

we don't say whether we think economists

97:39

are not humans but

97:41

>> that actually preempts in a way my

97:43

question, but I'll ask it anyway. Who is

97:45

this book for? Like, who is this

97:48

book for? Who's the reader?

97:49

>> I think we tried very hard to write it

97:52

in a way that it's not a thriller and

97:56

it's not a self-help book, but I don't

97:59

think it's as hard as Thinking, Fast and

98:02

Slow, which was tough. It's a great

98:05

book, but it's dense.

98:07

>> Yeah.

98:07

>> And this book is much funnier than that.

98:12

>> I think corrupting the youth is always

98:15

on my mind. So, I'm giving a series of

98:18

talks at universities. So, I have a trip

98:21

next week, Cornell, Penn, and Princeton.

98:25

So, your alma mater. I'll be there in

98:28

Green Hall. I like interacting with the

98:32

young people. I officially went

98:35

emeritus

98:36

July 1.

98:38

So I'm not teaching but I still,

98:42

you know, I divide my time between

98:43

Chicago and Berkeley. I still like going

98:47

to workshops and interacting with my

98:51

colleagues and having them sharpen me.

98:55

>> I mentioned this to Thaler when we were

98:57

on our way here is that I was struck by

99:00

the fact that these anomalies were

99:02

pointed out 30, 40 years ago, something

99:05

like that. And every single one of them,

99:09

I could think of an example of a person

99:13

or myself or a business that fell victim

99:17

to one of these issues, if you will. And

99:21

so it almost like shines a light on our

99:24

own, as you were saying, cognitive

99:26

biases in a way that takes something

99:30

that's a little squishy like, you know,

99:31

psychology and this and that, and then

99:33

just applies it to something that

99:36

impacts all of our lives, markets,

99:40

business, the way we conduct our own

99:43

households, and does so in a pretty

99:45

basic way.

99:47

>> And just for people, I'll give the title

99:48

again. And I'll mention it also towards

99:50

the end, but The Winner's Curse:

99:51

Behavioral Economics Anomalies, Then and

99:53

Now. Is this the subject matter,

99:54

Richard, of the talks that you're giving

99:56

at these various schools?

99:57

>> Yeah. So, it's essentially a little book

100:00

tour, but no point in going to

100:01

bookstores.

100:03

I'd rather have

100:06

300 young students' minds to corrupt.

100:12

>> Is there anything else, Nick or Richard?

100:14

I'll kick it to Nick first that you'd

100:16

like to cover with Richard before we

100:19

wind to a close or Richard, anything

100:21

else that you'd like to mention, point

100:23

people to, requests of my audience,

100:25

anything like that that you'd like to

100:27

mention? Nick, you want to go first? I

100:29

was going to ask the Tim question,

100:33

which is what books, if you're if you're

100:35

new to understanding this topic of

100:38

behavioral economics or even just

100:40

traditional economics, what are your

100:42

favorite sources other than your own, of

100:46

course, and you've already mentioned

100:47

Danny's book and all that, but there

100:49

must be some that are kind of the

100:50

foundational books that you go to or you

100:53

suggest to these young folks that you're

100:54

trying to corrupt.

100:55

>> I mentioned this journal, the Journal of

100:57

Economic Perspectives. Most academic

100:59

journals you can't get. That one is

101:03

posted online. Anybody can read it. And

101:07

if you're modestly interested in

101:10

economics, it's a fantastic journal.

101:13

There's a guy called Timothy Taylor who

101:16

they hired brilliantly. They call him

101:19

the managing editor. I call him the

101:20

writing editor. And he quickly

101:24

adopted the strategy of taking your

101:27

article and then just rewriting it

101:30

and he would say you know it's like in

101:32

Microsoft Word with track changes but

101:35

the version you would get is the one his

101:38

version, and you could restore it, but we

101:42

know status quo bias works. So, he's

101:45

still at it, and that's a fantastic place

101:49

to learn about economics. It's four times a

101:51

year. Typically, there's a symposium on

101:54

some topic and it's a resource nobody

101:58

knows about and is fantastic. Yeah, I

102:02

mentioned Michael Lewis's book, The

102:04

Undoing Project, and it's a great

102:08

insight into Kahneman and Tversky. And I

102:12

think I'm not going to mention any other

102:13

books because whichever one I mention, I

102:15

will piss off 12 other people. So, I'm

102:18

going to

102:20

I'll keep the friends I have.

102:22

>> Well, Richard and Nick, thanks so much

102:25

for taking the time today for a very

102:27

wide range of conversation. There's a

102:28

lot more that I could ask about, but

102:31

since we're racking up some decent

102:32

mileage on this conversation, I'll keep

102:34

it to this duration for round one.

102:38

And people can find The Winner's Curse:

102:41

Behavioral Economics Anomalies, Then and

102:43

Now, which is co-authored with Alex, is

102:46

it Imas? Am I saying that correctly?

102:47

>> Imas. Imas.

102:49

>> Imas. With Alex Imas. And we'll link to

102:52

that in the show notes. You can find

102:53

Richard on X, the artist formerly known

102:56

as Twitter, x.com/R_Thaler,

103:00

t h a l e r. And as usual, everybody, I

103:03

will link to anything that came up in

103:05

the conversation in the show notes at

103:07

tim.blog/podcast.

103:08

You can just search t h a l e r. And

103:12

Nick has been on the show I think at

103:14

least now this would be the third or

103:16

fourth time. So, if you want to delve

103:17

into all the background on Nick, you

103:19

have ample opportunity.

103:21

>> Hey, Thaler, thanks for doing this. I

103:23

really appreciate it. I always love

103:25

spending time with you.

103:27

>> It was great having Tim here to make me

103:30

sound better at asking questions. It is,

103:32

I will say to the audience, it is much

103:33

much harder what Tim does than to be a

103:36

guest on the show. And so, great respect

103:38

to you because week after week, I listen

103:40

to your podcast and you do a wonderful

103:41

job.

103:42

>> Oh, thanks, man. Thanks, Nick. And we're

103:44

overdue for an in-person catch-up. So, I

103:47

look forward to making that happen.

103:49

>> And I look forward to meeting you in

103:51

person as well.

103:52

>> That would be great. I do spend some

103:54

time in Chicago. I also spend time

103:57

occasionally in NorCal. I got a lot of

103:58

friends at Berkeley, so I would suspect

104:01

we'll cross paths.

104:02

>> Yeah, I think we both know Michael

104:04

Pollan, right?

104:05

>> Yeah, absolutely. I'm involved with the

104:07

the center there on a couple of levels.

104:09

So, lots of overlap. I really appreciate

104:11

the time, guys.

104:12

>> Cheers, Tim. Thanks, too.

104:13

>> Thank you.

104:14

>> And enjoy your dinner. I will talk to

104:16

you guys soon. Take care.

104:17

>> Sounds good. Bye-bye.

104:19

>> Take care everybody.

Interactive Summary

The discussion delves into the foundational principles of economics, particularly highlighting how traditional economic models often make simplifying assumptions about human behavior, such as rationality and selfishness. It explores the origins of behavioral economics, stemming from observations that real people don't always act according to these idealized models. Key concepts like loss aversion, the endowment effect, mental accounting, and the sunk cost fallacy are explained through anecdotes and experiments. The conversation also touches upon the application of these behavioral insights in real-world scenarios, from retirement savings and business strategies to public policy and even sports analytics. Finally, it reflects on the evolution of economic thought and the ongoing efforts to integrate a more realistic understanding of human psychology into economic theory.
