
Joe Rogan Experience #2422 - Jensen Huang

Transcript

0:01

Joe Rogan podcast. Check it out.

0:04

>> The Joe Rogan Experience.

0:06

>> Train by day. Joe Rogan podcast by

0:08

night. All day.

0:12

>> Hello. Hey, Joe.

0:14

>> Good to see you again. We were just

0:15

talking about... Was that the first time we

0:17

ever spoke, or was the first time we

0:19

spoke at SpaceX?

0:20

>> SpaceX.

0:20

>> SpaceX. The first time when you were

0:22

giving Elon that crazy AI chip,

0:24

>> Right? The DGX Spark.

0:26

>> Yeah. Oo, that was a big moment. That

0:27

was a huge

0:28

>> That felt crazy to be there. I was like

0:30

watching these wizards of tech like

0:32

exchange information, and you're

0:36

giving him this crazy device, you know,

0:38

and then the other time was uh I was

0:40

shooting arrows in my backyard and uh

0:43

randomly get this call from Trump and

0:45

he's hanging out with you. President

0:46

Trump called and I called you.

0:47

>> Yeah. It's just

0:48

>> we were talking about you. [laughter]

0:51

>> It's just... he was talking

0:52

about the UFC thing he was going to

0:54

do in his front yard.

0:55

>> Yeah. And he pulls it out. He's like, "Look

0:58

at this design." He's so proud of it. And

1:01

I go, "You're going to have a fight in

1:04

the front lawn in the White House." He

1:05

goes, "Yeah, yeah, you're going to come.

1:07

This is going to be awesome." And he's

1:08

showing me his design and how beautiful

1:10

it is. And he goes, and somehow your

1:14

name comes up. He goes, "Do you know

1:16

Joe?" And I said, "Yeah, I'm going to be

1:18

on his podcast." He Let's call him.

1:21

[laughter]

1:23

>> He's like a kid.

1:24

>> I know. "Let's call him." It's so... He's

1:26

like a 79-year-old kid.

1:28

>> Oh, he's so incredible.

1:30

>> Yeah, he's an odd guy. Just very

1:33

different, you know, like what you'd

1:35

expect from him. Very different than

1:38

what people think of him. And also just

1:40

very different as a president. A guy who

1:42

just calls you or texts you out of the

1:43

blue. Also, when he texts you...

1:45

You have an Android, so it won't go

1:47

through with you, but with my iPhone, he

1:49

makes the text go big.

1:50

>> Like, you know, USA is respected again.

1:53

like [laughter]

1:55

all caps, and it makes the

1:58

text enlarge. It's kind of ridiculous.

2:00

>> Well, the one-on-one President Trump

2:04

is very different. He surprised me.

2:07

First of all, he's an incredibly good

2:09

listener. Almost everything I've ever

2:11

said to him, he's remembered.

2:13

>> Yeah. People only want to

2:16

look at negative stories about him or

2:19

negative narratives about him. You know,

2:21

you can catch anybody on a bad day. Like

2:22

there's a lot of things he does where I

2:24

don't think he should do. Like I don't

2:25

think he should say to a

2:26

reporter, "Quiet piggy." Like that's

2:29

pretty ridiculous. Also objectively

2:32

funny. I mean, it's unfortunate that it

2:34

happened to her. I wouldn't want that to

2:35

happen to her, but it was funny. Just

2:38

ridiculous that the president does that.

2:39

I wish he didn't do that. But other than

2:41

that, like he's he's an interesting guy.

2:43

Like he's a lot of different things

2:45

wrapped up into one person, you know?

2:49

You know, part of part of his charm,

2:51

well, part of his genius is that he says

2:53

what's on his mind.

2:54

>> Yes.

2:55

>> And which is like an anti-politician in a

2:57

lot of ways.

2:58

>> So, you know, what's on his mind is

3:00

really what's on his mind,

3:02

>> which

3:04

I do... some people would

3:06

rather be lied to.

3:07

>> Yeah. But I like the fact that

3:09

he's telling you what's on his mind. Um,

3:11

almost every time he explains something,

3:13

he says something,

3:15

he starts with his, you could tell, his

3:18

love for America, what he wants to do

3:20

for America. And everything that he

3:24

thinks through is very practical and

3:26

very common sense. And, you know, it's

3:29

very logical and um

3:32

I still remember the first time I met

3:34

him, and so this was... I'd never known

3:37

him, never met him before. And um,

3:39

Secretary Lutnick called and we met right

3:43

before right at the beginning of the

3:45

administration. He told me what

3:47

was important to President Trump that

3:49

the United States

3:53

manufactures onshore, and that was

3:56

really important to him because

3:58

uh it's important to national security.

4:00

He wants to make sure that the

4:02

important critical technology of our

4:03

nation is built in the United States and

4:06

that we re-industrialize

4:08

and get good at manufacturing again

4:10

because it's important for jobs.

4:11

>> It just seems like common sense, right?

4:13

>> Incredible common sense. And that

4:14

was like literally the first

4:16

conversation I had with Secretary Lutnick,

4:19

and he was talking about how

4:26

he started our

4:29

conversation with, "Jensen, this is

4:32

Secretary Lutnick, and I just want to

4:32

let you know that you're a national

4:34

treasure. Uh Nvidia is a national

4:37

treasure and whenever you need access to

4:41

the president or the administration,

4:44

you call us. We're always going to be

4:46

available to you." Literally, that was

4:49

the first sentence.

4:50

>> That's pretty nice.

4:51

>> And it was completely true. Every single

4:54

time I called, if I needed something, I

4:57

want to get something off my chest, um,

4:59

express some concern, uh, they're always

5:01

available. Incredible. It's just

5:03

unfortunate we live in such a

5:04

politically polarized society that you

5:07

can't recognize good common sense things

5:09

if they're coming from a person that you

5:11

object to. And that, I think, is what's

5:13

going on here. I think most people

5:15

generally, as a country, you know, as a

5:18

giant community, which we are, it just

5:20

only makes sense that we have

5:24

manufacturing in America, especially

5:26

critical technology like you're talking

5:28

about. Like it's kind of insane that we

5:30

buy so much technology from other

5:32

countries.

5:33

>> If the United States doesn't grow, we will

5:36

have no prosperity. We can't invest in

5:40

anything domestically or otherwise. We

5:43

can't fix any of our problems. If we

5:45

don't have energy growth, we can't have

5:48

industrial growth. If we don't have

5:50

industrial growth, we can't have job

5:52

growth. It's as simple as that,

5:54

>> right?

5:55

>> And the fact that the fact that he came

5:57

into office and the first thing that he

5:58

said was drill baby drill. His point is

6:01

we need energy growth. Without energy

6:03

growth, we can have no industrial

6:05

growth. And that saved

6:08

the AI industry. I got to tell you

6:10

flat out, if not for his pro-growth energy

6:15

policy

6:16

we would not be able to build factories

6:18

for AI, not be able to build chip

6:20

factories, we surely won't be

6:22

able to build supercomputer factories.

6:24

None of that stuff would be possible.

6:26

Without all of that,

6:28

construction jobs would be challenged,

6:30

right? Electrical, you know, electrician

6:32

jobs, all of these jobs that are now

6:34

flourishing would be challenged. And so I

6:36

think he's got it right. We need energy

6:38

growth. We want to re-industrialize the

6:40

United States. We need to be back in

6:42

manufacturing. Every successful person

6:45

doesn't need to have a PhD. Every

6:47

successful person doesn't have to have

6:48

gone to Stanford or MIT. And I think I

6:51

think that, you know, that

6:53

sensibility is um, spot on. Now, when

6:57

we're talking about technology growth

6:59

and energy growth, there's a lot of

7:00

people that go, "Oh, no. That's not what

7:02

we need. We need to, you know, simplify

7:04

our lives and get back." But the the

7:06

real issue is that we're in the middle

7:07

of a giant technology race. And whether

7:10

people are aware of it or not, whether

7:11

they like it or not, it's happening. And

7:13

it's a really important race because

7:16

whoever gets to

7:18

whatever the event horizon of artificial

7:22

intelligence is, whoever gets there

7:23

first has massive advantages in a huge

7:26

way.

7:28

Do you agree with that?

>> Well, first, the

7:30

part I will say is that we are in a

7:32

technology race and we are always in a

7:34

technology race. We've been in a

7:36

technology race with somebody forever.

7:39

>> Right.

7:39

>> Right. Since the industrial revolution,

7:40

we've been in a technology

7:41

>> since the Manhattan project.

7:42

>> Yeah.

7:43

>> Or or you know, even going back to the

7:45

discovery of energy, right? The United

7:48

Kingdom was where the industrial

7:51

revolution was, if you will, invented

7:54

when they realized that they can turn

7:55

steam and such into energy, into

7:58

electricity.

8:00

All of that was invented largely in

8:04

Europe and the United States capitalized

8:07

on it. We were the ones that learned

8:09

from it. We industrialized it. We

8:12

diffused it faster than anybody in

8:14

Europe. They were all stuck in

8:17

discussions about

8:20

policy and

8:22

jobs and disruptions. Meanwhile, the

8:25

United States was forming. We just took

8:26

the technology and ran with it. And so I

8:29

I think we were always in in a bit of a

8:31

technology race. World War II was a

8:32

technology race. Manhattan Project was a

8:35

technology race. We've been in the

8:36

technology race ever since during the

8:38

Cold War. I think we're still in a

8:40

technology race. It is probably the

8:42

single most important race. It is the

8:45

technology... uh, it gives you

8:48

superpowers.

8:50

you know whether it's information

8:51

superpowers or energy superpowers or

8:55

military superpowers, it's all founded in

8:57

technology and so technology leadership

8:59

is really important

9:00

>> Well, the problem is if somebody else has

9:02

superior technology, right? That's

9:04

the issue. It seems like with the AI race,

9:08

people are very nervous about it. Like,

9:10

you know Elon has famously said there

9:13

was like an 80% chance it's awesome, 20%

9:16

chance we're in trouble and people are

9:18

worried about that 20%, rightly so. I

9:20

mean that you know if you had 10 bullets

9:23

in a revolver and, you know, you

9:28

took out eight of them and you still

9:30

have two in there and you spin it,

9:32

you're not going to feel real

9:32

comfortable when you pull that trigger.

9:34

It's terrifying,

9:34

>> right?

9:35

>> And when we're working towards this

9:38

ultimate goal um of AI,

9:42

it it just it's

9:45

impossible to imagine that it wouldn't

9:46

be of national security interest to get

9:48

there first.

9:49

We should... The question is, what's there?

9:51

That's the That was the part that

9:52

>> What is there?

9:53

>> Yeah. I'm not sure.

9:54

>> And I don't think anybody I don't think

9:56

anybody really knows.

9:57

>> That's crazy though. If I ask you,

10:00

>> you're the head of Nvidia. If you don't

10:02

know what's there, who knows?

10:04

>> Yeah. I I think it's probably going to

10:06

be much more gradual than we think. It

10:09

won't It won't be a moment. It won't be

10:12

It won't be as if um somebody arrived

10:15

and nobody else has. I don't think it's

10:17

going to be like that. I think it's

10:18

going to be things that just get better

10:20

and better and better and better just

10:22

like technology does.

10:23

>> So, you are rosy about the future.

10:25

You're you're very optimistic about

10:27

what's going to happen with AI.

10:29

>> Obviously, will you make the best AI

10:31

chips in the world?

10:32

>> You probably better be.

10:33

>> Uh, if history is a guide, um, we

10:37

were always concerned about new

10:39

technology.

10:41

Humanity has always been concerned about

10:42

new technology. There's always

10:44

somebody who's thinking... there are always a

10:46

lot of people who are quite concerned.

10:47

who were quite concerned. And so if

10:51

history is a guide, it is the case

10:56

that all of this concern is channeled

10:59

into making the technology safer.

11:02

And so for example, in the last several

11:05

years, I would say AI technology has

11:08

increased probably in the last two years

11:12

alone, maybe 100x. Let's just give it

11:14

a number, okay? It's like a car two

11:18

years ago was 100 times slower. So AI is

11:22

100 times more capable today. Now, how

11:25

did we channel that technology? How do

11:27

we channel all of that power? We

11:29

directed it to um causing the AI to be

11:33

able to think, meaning that it can take

11:37

a problem that we give it, break it down

11:39

step by step.

11:41

It does research before it answers. And

11:44

so it grounds it in truth.

11:47

It'll reflect on that answer. Ask

11:49

itself, is this the best, you know,

11:52

answer that I can give you? Am I certain

11:54

about this answer? If it's not certain

11:56

about the answer or highly confident

11:58

about the answer, it'll go back and do

11:59

more research. It might actually even

12:02

use a tool because that tool provides a

12:04

better solution than it could

12:06

hallucinate itself. As a result, we took

12:08

all of that computing capability and we

12:11

channeled it into having it produce a

12:14

safer result, safer answer, a more

12:16

truthful answer because as you know, one

12:19

of the greatest criticisms of AI in the

12:20

beginning was that it hallucinated,

12:22

>> right?

12:23

>> And so if you look at the reason why

12:25

people use AI so much today is because

12:28

the amount of hallucination has reduced.
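[Editor's note: for readers who want a concrete picture of the "break it down, research, reflect, check confidence, try again" loop described above, here is a minimal illustrative sketch. The function names, the research tool, and the confidence threshold are hypothetical placeholders for this illustration, not anything from the conversation or from any particular product.]

```python
# Illustrative sketch only: a toy "think, check, retry" loop of the kind
# described above. `ask_model`, `run_research_tool`, and the confidence
# threshold are hypothetical placeholders, not any real product's API.

def answer_with_reflection(question, ask_model, run_research_tool,
                           max_rounds=3, confidence_threshold=0.8):
    notes = []
    draft = ""
    for _ in range(max_rounds):
        # Step 1: draft an answer, grounded on whatever research notes we have.
        draft = ask_model(
            f"Question: {question}\nNotes: {notes}\nAnswer step by step.")

        # Step 2: reflect, asking the model to rate its confidence in the draft.
        confidence = float(ask_model(
            f"Rate 0-1 how confident you are that this answer is correct:\n{draft}"))

        if confidence >= confidence_threshold:
            return draft  # confident enough: return the answer

        # Step 3: not confident, so do more research (e.g. call a search tool)
        # and try again with the extra grounding.
        notes.append(run_research_tool(question, draft))

    return draft  # best effort after max_rounds
```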

12:30

You know, I use it almost... well, I used

12:32

it the whole trip over here. And so I

12:35

think the

12:37

uh, the capability... most people

12:40

think about power

12:42

and they think about, you know, maybe

12:44

an explosion of power, but the technology

12:47

power, most of it is channeled towards

12:49

safety. A car today is more powerful but

12:52

it's safer to drive. A lot of that power

12:55

goes towards better handling. You know,

12:58

I'd rather have a... Well, you have a 1,000

13:01

horsepower truck. I think 500 horsepower

13:04

is pretty good. No, a thousand's better.

13:06

I think a thousand is better.

13:07

>> I don't know if it's better, but it's

13:09

definitely faster.

13:10

>> Yeah. No, I think it's better. You can

13:12

get out of trouble faster. Um,

13:17

I enjoyed my 599 more than my 612. It

13:21

was... I think it was better. Better

13:23

horsepower is better. My 458 is better

13:25

than my 430.

13:27

More horsepower is better. I think

13:29

more horsepower is better. I think it's

13:30

better handling. It's better control. In

13:33

the case of in the case of technology,

13:34

it's also very similar in that way, you

13:37

know. And so if you if you look at what

13:38

we're going to do with the next thousand

13:40

times of performance in AI, a lot of it

13:43

is going to be channeled towards more

13:46

reflection, more research,

13:49

thinking about the answer more deeply.

13:51

So when you're defining safety, you're

13:53

defining it as accuracy,

13:55

>> functionality.

13:56

>> Functionality. Okay.

13:58

>> It it does what you expect it to do. And

14:01

then you take all the technology

14:04

and the horsepower, you put guardrails

14:05

on it, just like our cars. We've got a

14:08

lot of technology in in a car today. A

14:10

lot of it goes towards, for example,

14:12

ABS. ABS is great. And so, uh, traction

14:16

control, that's fantastic. Without

14:18

a computer in the car, how would

14:20

you do any of that,

14:21

>> right?

14:22

>> And that little computer, the computers

14:24

that you have doing your traction

14:25

control is more powerful than the

14:27

computer that went to Apollo 11. And so

14:30

you want that technology,

14:32

channel it towards safety, channel it

14:34

towards functionality. And so when

14:36

people talk about power, the advancement

14:38

of technology, oftentimes I feel

14:41

what they're thinking and what we're

14:43

actually doing is very different.

14:45

>> Well, what do you think they're

14:46

thinking? Well, they're thinking somehow

14:49

that this this uh this AI is being

14:52

powerful and their their mind probably

14:55

goes towards a sci-fi movie. The

14:58

definition of power, you know, often

15:00

times the definition of power

15:02

is military power or physical power. But

15:06

in in the case of technology power when

15:09

we translate all of those operations

15:11

it's towards more refined thinking you

15:14

know more reflection more planning more

15:17

options

15:18

>> I think the big fears that people have

15:20

is... one big fear is military

15:22

applications. That's a big fear

15:24

>> because people are very concerned that

15:25

you're going to have

15:27

>> AI systems that make decisions that

15:29

maybe an ethical person wouldn't make or

15:31

a moral person wouldn't make based on

15:33

achieving an objective versus based on,

15:36

you know, how it's going to look to

15:39

people.

15:41

>> Well, I'm I'm happy that that uh our

15:44

military is going to use AI technology

15:46

for defense, and I think that um,

15:51

Anduril building military technology...

15:53

I'm happy to hear that. I'm happy to see

15:56

um all these tech startups now

15:57

channeling their technology capabilities

15:59

towards defense and military

16:01

applications. I think you needed to do

16:03

that.

16:03

>> Yeah, we had Palmer Lucky on the

16:05

podcast. He was demonstrating some of

16:06

the stuff. I put his helmet on. And

16:08

he showed some videos of how you

16:09

could see behind walls and stuff like

16:11

it's nuts.

16:12

>> And he's he's actually the perfect guy

16:13

to go start that company.

16:15

>> 100%. [laughter] Yeah. 100%. It's like

16:17

he was born for that. Yeah. He came in

16:20

here with a copper jacket on. He's a

16:22

freak. [clears throat] It's [laughter]

16:23

awesome. He's awesome. But it's also

16:25

it's a you know an unusual intellect

16:27

channeled into that very bizarre field

16:30

is what you need, you know. And I think it's

16:32

uh... I'm happy that we're

16:35

making it more socially acceptable.

16:38

You know, there was a time where when

16:40

somebody wanted to channel their

16:41

technology capability and their

16:43

intellect into defense technology, uh

16:47

somehow they were vilified. Um, but we

16:50

need people like that. We need people

16:51

who enjoy that part of uh

16:54

application of technology.

16:55

>> Well, people are terrified of war, you

16:58

know. So it depends.

16:59

>> Best way to avoid it is excessive

17:01

military might.

17:03

>> Do you think that's absolutely the best

17:04

way? Not not diplomacy, not working

17:07

stuff out.

17:08

>> All of it.

17:08

>> All of it. You have to have military

17:11

might in order to get people to sit down

17:12

with you.

17:13

>> Right. Exactly. All of it.

17:14

>> Otherwise, they just invade.

17:15

>> That's right. [laughter] Why ask for

17:17

permission?

17:18

>> Again, like you said, history. Go back

17:20

and look at history. Um, when you look

17:22

at the future of AI and and you just

17:26

said that no one really knows what's

17:27

happening, do you ever sit down and

17:30

ponder scenarios?

17:32

>> Like what do you what do you think is

17:33

like best-case scenario for AI over the

17:37

next two decades?

17:40

Um

17:43

the best case scenario is that AI

17:46

diffuses into everything that we do and

17:51

uh our

17:54

everything's more efficient but

17:58

the threat of war remains a threat of

18:01

war.

18:02

Uh, cyber security remains

18:07

a super difficult challenge.

18:09

Somebody is going to try to

18:12

breach your security. You're going to

18:15

have thousands of millions of AI agents

18:19

protecting you from that threat.

18:22

Your technology is going to get better.

18:25

Their technology is going to get better.

18:26

Just like cyber security. Right now,

18:28

while we speak, we're

18:32

seeing cyberattacks all over the

18:34

planet on just about every front door

18:35

you can imagine.

18:38

And

18:39

and yet you and I are sitting here

18:43

talking. And so the reason for that is

18:46

because we know that there's a whole

18:48

bunch of cyber security technology in

18:50

defense. And so we just have to keep

18:52

amping that up, keep stepping that up.

18:55

This episode is brought to you by

18:56

Visible. When your phone plan's as good

18:59

as Visible, you've got to tell your

19:01

people. It's the ultimate wireless hack

19:03

to save money and still get great

19:05

coverage and a reliable connection. Get

19:07

one line wireless with unlimited data

19:10

and hotspot for $25 a month. Taxes and

19:14

fees included, all on Verizon's 5G

19:18

network. Plus, now for a limited time,

19:20

new members can get the Visible plan for

19:23

just $19 a month for the first 26

19:26

months. Use promo code switch 26 and

19:30

save beyond the season. It's a deal so

19:33

good you're going to want to tell your

19:35

people. Switch now at visible.com/rogan.

19:38

Terms apply. Limited time offers subject

19:41

to change. See visible.com for plan

19:44

features and network management details.

19:47

That's a big issue with people, the

19:50

the worry that technology is going to

19:51

get to a point where encryption is going

19:53

to be obsolete. Encryption is just it's

19:56

no longer going to protect data. It's no

19:57

longer going to protect systems. Do you

19:59

anticipate that ever being an issue or

20:01

do you think there's it's as the defense

20:03

grows, the threat grows, the defense

20:06

grows, and it just keeps going on and on

20:07

and on and they'll always be able to

20:09

fight off any sort of intrusions?

20:15

>> Not forever. Some intrusion will get in,

20:18

and then we'll all learn from it.

20:20

And you know the reason why cyber

20:22

security works is because of course the

20:24

technology of defense is advancing very

20:27

quickly. The technology of offense is

20:29

advancing very quickly. However, the

20:33

benefit of the cyber security defense is

20:36

that socially the community all of our

20:40

companies work together as one. Most

20:43

people don't realize this.

20:45

There's a whole community of cyber

20:48

security experts. We exchange

20:53

ideas. We exchange best practices. We

20:55

exchange what we detect. The moment

20:58

something has been breached or maybe

21:00

there's a loophole or whatever it is, it

21:03

is shared by everybody. The patches are

21:05

shared with everybody.

21:06

>> That's interesting.

21:07

>> Yeah. Most people don't realize this.

21:08

>> No, I had no I had no idea. I've assumed

21:10

that it would just be competitive like

21:12

everything else.

21:13

>> We work together.

>> Interesting. Has that

21:15

always been the case?

21:17

>> Uh, it surely has been the case for

21:18

about 15 years. It might not have

21:21

been the case long ago, but this this

21:24

>> what do you think started off that

21:25

cooperation?

21:27

>> Um, people recognizing it's a challenge

21:29

and no company can stand alone.

21:32

>> And the same thing is going to happen

21:33

with AI. I think we all have to decide

21:37

working together uh, to stay out of

21:40

harm's way is our best chance for

21:42

defense. Then it's basically everybody

21:45

against the threat.

21:46

>> And it also seems like you'd be way

21:48

better at detecting where these threats

21:50

are coming from and neutralizing them.

21:52

>> Exactly. Because the moment you detect

21:54

it somewhere,

21:55

>> you're going to find out right away.

21:56

>> It'll be really hard to hide.

21:57

>> That's right.

21:58

>> Yeah.

21:59

>> That's how it works. That's the reason

22:00

why it's safe. That's why I'm sitting

22:02

here right now instead of, you know,

22:03

locking everything down at Nvidia.

22:05

[laughter]

22:07

>> Not only am I watching my own back,

22:10

I've got everybody watching my back. And

22:12

I'm watching everybody else's back.

22:13

>> It's a bizarre world, isn't it? When you

22:15

think about that cyber threat,

22:16

>> this idea about cyber security is

22:19

unknown to the people who are talking

22:21

about AI threats. I think when

22:24

they think about AI threats and AI cyber

22:26

security threats, they have to also

22:27

think about how we deal with it today.

22:29

Now, there's no question that AI is a

22:33

new technology

22:35

and it's a new type of software. In the

22:37

end, it's software just it's a new type

22:39

of software and so it's going to have

22:41

new capabilities but so will the defense

22:44

you know where you use the same AI

22:45

technology to go defend against it.

22:48

>> So do you anticipate a time ever in the

22:51

future where it's going to be impossible

22:54

where there's not going to be any

22:56

secrets where the bottleneck between the

23:00

technology that we have and the

23:01

information that we have. Information is

23:02

just all a bunch of ones and zeros. It's

23:04

out there on hard drives and the

23:06

technology has more and more access to

23:07

that information. Is it ever going to

23:09

get to a point in time where there's no

23:12

way to keep a secret?

23:14

>> I don't think

23:14

>> because it seems like that's where

23:15

everything is kind of headed in a weird

23:17

way.

23:17

>> I don't think so. I think the quantum

23:18

computers were supposed to... Yeah,

23:21

quantum computers will make it

23:22

so that the previous

23:25

encryption technology

23:28

is obsolete. But that's the reason why

23:31

the entire industry is working on

23:33

post-quantum

23:35

encryption technology.

23:37

>> What would that look like?

23:39

>> New algorithms.

23:40

>> But the crazy thing is when you hear

23:41

about the kind of computation that

23:44

quantum computing can do.

23:45

>> Yeah.

23:46

>> And the the power that it has. Yeah.

23:47

>> Where you know you're looking at

23:49

>> all the supercomputers in the world. It

23:51

would take billions of years and it

23:52

takes them a few minutes to solve these

23:54

equations. Like how do you make

23:56

encryption for something that can do

23:58

that? I'm not sure, but there's

23:59

[laughter]

24:00

but I've got a bunch of scientists who

24:02

are working on that.

24:02

>> Boy, I hope they [snorts] could figure

24:04

it out.

24:04

>> Yeah, we got a bunch of scientists who

24:05

are expert in that. And

24:06

>> is the ultimate fear that it can't be

24:08

breached that quantum computing will

24:10

always be able to to decrypt all other

24:13

quantum computing encryption?

24:16

>> I don't think that

24:16

>> it just gets to some point where it's

24:18

like, stop playing the stupid game. We

24:20

know everything.

24:21

>> I don't think so.

24:22

>> No,

24:22

>> Because, you know, history is a

24:25

guide.

24:26

>> History is a guide before AI came

24:28

around. That's my worry. My worry is

24:30

this is a totally, you know, it's like

24:31

history was one thing and then nuclear

24:33

weapons kind of changed all of our

24:35

thoughts on war and mutually assured

24:37

destruction came, which got

24:40

everybody to stop using nuclear bombs.

24:42

>> Yeah.

24:43

>> My worry is that

24:44

>> The thing is, Joe, that AI is not

24:46

going to... it's not like we're cavemen and

24:49

then all of a sudden one day AI shows

24:51

up. Every single day we're getting

24:54

better and smarter because we have AI

24:57

and so we're stepping on our own AI's

24:59

shoulders. So when that whatever

25:01

that AI threat comes, it's a click

25:05

ahead. It's not a galaxy ahead,

25:08

>> you know, it's just a click ahead. And

25:10

so I think the idea

25:14

that somehow this AI

25:17

is going to pop out of nowhere and

25:20

somehow think in a way that we can't

25:22

even imagine thinking and do something

25:25

that we can't possibly imagine I think

25:28

is far-fetched. And the reason for that

25:30

is because we all have

25:32

AIs and you know there's a whole bunch

25:34

of AIs in development. We know

25:36

what they are and we're using them, and

25:38

so every single day we're getting

25:40

closer to each other.

25:42

>> But don't they do things that are very

25:44

surprising?

25:46

>> Yeah. But so you you have an AI that

25:48

does something surprising. I'm going to

25:49

have an AI and my AI looks at your AI

25:52

and goes that's not that surprising.

25:53

>> The fear for the lay person like myself

25:55

is that AI becomes sentient and makes

25:57

its own decisions

25:59

and then ultimately decides to just

26:03

govern the world, do it its own way.

26:06

They're like, "You guys, you had a good

26:08

run, but

26:09

>> we're taking over now."

26:12

>> Yeah, but my AI is gonna take care of

26:14

me. I mean, [laughter]

26:15

so this is the cyber security

26:19

argument.

26:19

>> Yes.

26:20

>> You have an AI and it's super smart,

26:23

but my AI is super smart, too. And and

26:25

maybe your AI...

26:28

let's pretend for a second that we

26:30

understand what consciousness is and we

26:32

understand what sentience is and and

26:34

that in fact

26:34

>> and we really are just pretending.

26:36

>> Okay, let's just pretend for a second

26:37

that we believe that. I don't believe

26:39

actually, I actually don't believe

26:40

that, but nonetheless, let's pretend we

26:42

believe that.

26:42

>> So your AI is conscious and my

26:45

AI is conscious, and let's say your

26:47

AI, you know, wants to, I don't know, do

26:51

something surprising.

26:53

My AI is so smart that it won't... it might

26:56

be surprising to me, but it probably

26:57

won't be surprising to my AI. And so

27:00

maybe my AI

27:02

thinks it's surprising as well, but it's

27:05

so smart the moment it sees it the first

27:07

time, it's not going to be a surprise

27:08

the second time, just like us. And so I

27:11

feel like... I think the idea that

27:15

only one person has [clears throat] AI

27:16

and that, compared to that one person's AI,

27:20

everybody else's AI is Neanderthal

27:24

[snorts] is um probably unlikely. I

27:26

think it's much more like cyber

27:28

security.

27:30

>> Interesting.

27:31

>> I think the fear is not that your AI is

27:34

going to battle with somebody else's AI.

27:36

The fear is that AI is no longer going

27:38

to listen to you. That's the fear is

27:41

that human beings won't have control

27:42

over it after a certain point if it

27:45

achieves sentience and then has the

27:47

ability to be autonomous

27:49

>> that there's one AI.

27:51

>> Well, they just combine.

27:53

>> Yeah. Becomes one AI

27:54

>> that it's a life form.

27:55

>> Yeah.

27:56

>> But that's the there's arguments about

27:57

that, right? That we're dealing with

27:59

some sort of synthetic biology that it's

28:01

not as simple as new technology that

28:03

you're creating a life form.

28:04

>> If it's like a life form,

28:07

let's go along with that for a while. I

28:09

think if it's like a life form, as you

28:11

know, all life forms don't agree. And so

28:14

I'm going to have to go with your life

28:16

form and my life form aren't going to agree,

28:18

because my life form is going to want to

28:19

be the super life form. And and now that

28:22

now that we have disagreeing life forms,

28:25

uh, we're back again to where we

28:26

are.

>> Well, they would probably cooperate

28:28

with each other.

28:31

It would just the reason why we don't

28:33

cooperate with each other is we're

28:35

territorial primates.

28:37

But AI wouldn't be a territorial

28:39

primate. It would realize the folly in

28:41

that sort of thinking and it would say,

28:44

"Listen, there's plenty of energy for

28:46

everybody. We we don't need to dominate.

28:49

We don't need We're not trying to

28:51

acquire resources and take over the

28:52

world. We're not looking to find a good

28:54

breeding partner. We're just existing as

28:58

a new super life form that these cute

29:02

monkeys created for us."

29:04

Okay. Well, that would

29:07

be, um, a superpower with no ego,

29:12

>> right? And and if it has no ego,

29:17

why would it have the ego to do any harm

29:19

to us?

29:20

>> Well, I don't assume that it would do

29:22

harm to us, but the the fear would be

29:25

that we would no longer have control and

29:27

that we would no longer be the apex

29:30

species on the planet. this thing that

29:32

we created would now be. [laughter]

29:35

>> Is that funny?

29:36

>> No.

29:37

>> I just think it's not gonna happen.

29:38

>> I know you think it's not gonna happen,

29:40

but

29:41

>> it could, right? And here's the other

29:43

thing is like

29:44

>> if we're racing towards could Yeah.

29:46

>> And could could be the end of human

29:50

beings being in control of our own

29:51

destiny.

29:53

>> I just think it's extremely unlikely.

29:55

>> Yeah.

29:55

>> That's what they said in the Terminator

29:57

movie [laughter]

29:58

>> and it hasn't happened.

29:59

>> No, not yet. But you guys are working

30:01

towards it. Um the the thing about

30:04

you're saying about conscience and

30:05

sensience that you don't think that AI

30:08

will achieve consciousness or that the

30:11

question is what's the definition?

30:12

>> Yeah. What's the definition of

30:14

>> what is the definition to you?

30:16

>> Um uh

30:19

consciousness

30:21

um

30:23

uh, I guess first of all, you need to

30:26

know about your own existence.

30:31

Um,

30:36

you have to have experience, not just

30:39

knowledge and intelligence.

30:47

The concept of a machine

30:49

having an experience.

30:52

I'm not well, first of all, I don't know

30:54

what defines experience, why we have

30:56

experiences, right?

30:57

>> Yeah. and why this microphone doesn't

31:00

uh, and so I think... well, I

31:05

think I know what

31:08

consciousness is the sense of experience

31:11

the ability to know self versus

31:16

um

31:18

uh the ability to be able to reflect

31:21

to know our own self, the sense of ego. I

31:25

think all of those human

31:27

experiences

31:29

uh probably is what consciousness is

31:35

but why it exists versus

31:39

the concept of knowledge and

31:41

intelligence which is what AI is defined

31:44

by today. [clears throat] It has

31:45

knowledge, it has intelligence: artificial

31:48

intelligence. We don't call it artificial

31:49

consciousness

31:51

artificial intelligence: the ability to

31:54

uh, perceive, believe, recognize,

31:58

understand,

32:00

um, plan,

32:04

uh, perform tasks.

32:07

Those things are foundations of

32:09

intelligence

32:11

to know things, knowledge.

32:14

I don't, it's clearly different than

32:17

consciousness.

32:18

>> But consciousness is so loosely defined.

32:20

How can we say that? I mean, doesn't a

32:22

dog have consciousness? Yeah.

32:23

>> Dogs seem to be pretty conscious.

32:25

>> That's right.

32:25

>> Yeah. So, and that's a lower level

32:27

consciousness than a human being's

32:29

consciousness.

32:30

>> I'm not sure. Yeah. Right. Well,

32:32

>> the question is what lower level

32:34

intelligence? It's lower level

32:35

intelligence, but I don't know that it's

32:37

lower level consciousness.

32:38

>> That's a good point. Right.

32:39

>> Because I believe my dogs feel as much

32:42

as I feel.

32:42

>> Yeah. They feel a lot. Right.

32:45

>> Yeah. They get attached to you. That's

32:47

right. They get depressed if you're not

32:48

there.

32:49

>> That's right. Exactly.

32:50

>> There's There's definitely that.

32:52

>> Yeah. Um, the concept of experience,

32:56

>> right?

32:56

>> Um but isn't AI interacting with

32:59

society? So, doesn't it acquire

33:01

experience through that interaction?

33:04

>> Um I don't think interactions is

33:06

experience. I think

33:10

experience is a collection of feelings.

33:13

I think

33:15

>> you're aware of that AI um I forget

33:17

which one where they gave it some false

33:20

information about one of the programmers

33:22

having an affair, cheating on his wife, just to

33:24

see how it would respond to it and then

33:25

when they said they were going to shut

33:26

it down it threatened to blackmail him

33:28

and reveal his affair and it was like

33:31

whoa like it's conniving like if that's

33:33

not learning from experience and being

33:37

aware that you're about to be shut down

33:38

which would imply at least some kind of

33:40

consciousness, or you could kind of define

33:43

it as consciousness if you were very

33:45

loose with the term and if you imagine

33:47

that this is going to exponentially

33:49

become more powerful. Wouldn't that

33:52

ultimately lead to a different kind of

33:54

consciousness than we're defining from

33:56

biology?

>> Well, first of all, let's just

33:58

break down what it probably did. It

34:01

probably read somewhere. There's

34:03

probably text that, in these

34:07

circumstances,

34:09

certain people did that. I could imagine

34:12

a novel,

34:12

>> right?

34:13

>> Having those words related.

34:16

>> Sure.

34:16

>> And so inside

34:18

>> It realizes its strategy for survival is

34:20

>> it's just a bunch of numbers

34:22

>> that it's just a bunch of numbers that

34:24

in the collection of numbers

34:28

that relates to a husband cheating on a

34:30

wife. Um

34:33

has subsequently a bunch of numbers that

34:37

relates to blackmail and such things.

34:39

However, whatever the revenge was,

34:41

>> right?

34:42

>> And so it has spewed it out.

34:44

>> And so it's just like, you know, it's

34:47

just as if I'm asking it to write me a

34:50

poem in the style of Shakespeare. It's just whatever

34:52

the words are in the world, in that

34:55

dimensionality, this dimensionality is

34:57

all these vectors in

34:59

multi-dimensional space. These words

35:04

that were in the prompt that described

35:07

the affair um subsequently led to one

35:11

word after another led to um you know

35:14

some revenge and something but it's not

35:16

because it had consciousness or you know

35:18

it just spewed out those words generated

35:20

those words

35:21

>> I understand what you're saying that

35:23

patterns that human beings have

35:25

exhibited both in literature and in real

35:27

life

35:27

>> that's exactly right

35:28

>> But at a certain point in time one

35:30

would say, "Okay, well, it couldn't do

35:32

this two years ago and it couldn't do

35:34

this four years ago." Like when we're

35:36

looking towards the future, like at what

35:38

point in time when it can do everything

35:40

a person does, what point in time do we

35:42

decide that it's conscious? If it

35:44

absolutely mimics all human thinking and

35:48

behavior patterns,

35:49

>> that doesn't make it conscious.

35:50

>> It becomes indiscernible. It's

35:52

aware. It can communicate with you the

35:54

exact same way a person can. Like, is

35:57

consciousness... are we putting too much

35:59

weight on that concept because it seems

36:01

like it's a version of a kind of

36:03

consciousness.

36:04

>> It's a version of imitation.

36:06

>> Imitation consciousness, right? But if

36:08

it perfectly imitates it,

36:10

>> I still think it's a per it's an example

36:12

of imitation.

36:12

>> So it's like a fake Rolex when they 3D

36:14

print them and make them

36:15

>> indestructible.

>> The question is, what's the

36:17

definition of consciousness?

36:18

>> Yeah.

36:19

>> Yeah.

36:20

>> That's the question. And I don't think

36:21

anybody's really clearly defined that.

36:23

That's where it gets weird, and

36:25

that's where the real doomsday

36:27

people are worried that you are creating

36:29

a form of consciousness that you can't

36:31

control.

>> I believe it is possible to

36:35

create a machine

36:38

that imitates

36:41

human intelligence

36:43

and

36:44

has the ability to

36:47

understand information,

36:50

understand

36:52

instructions, break the problem down,

36:54

solve problems, and perform tasks. I

36:58

believe that completely.

37:00

I believe that, um, we could have a

37:07

computer that has a vast amount of

37:09

knowledge. Some of it true, some of it

37:12

not true.

37:14

Some of it generated by humans, some of

37:17

it generated synthetically. And more and

37:19

more of knowledge in the world will be

37:23

generated synthetically going forward.

37:25

You know, until now the knowledge that

37:27

we have is knowledge that we

37:31

generate and we propagate and we send to

37:33

each other and we amplify it and we add

37:35

to it and we modify it. We change it. In

37:39

the future,

37:42

in a couple of years, maybe two or three

37:45

years, 90% of the world's knowledge will

37:47

likely be generated by AI.

37:49

>> That's crazy.

37:50

>> I know. But it's just fine.

37:52

>> But it's just fine.

37:53

>> I know. And the reason for that is this.

37:56

Let me tell you why.

37:57

>> Okay?

37:57

>> It's because um what difference does it

38:00

make to me that I am learning from a

38:04

textbook that was generated by a bunch

38:06

of people I didn't know or written by a

38:10

book that you know from somebody I don't

38:12

know uh to uh knowledge generated by AI

38:17

computers that are assimilating all of

38:18

this and resynthesizing things. To me, I

38:21

don't think there's a whole lot of

38:22

difference. We still have to we still

38:24

have to fact check it. We still have to

38:26

make sure that it's you know based on

38:28

fundamental first principles and we

38:30

still have to do all of that just like

38:31

we do today.

38:32

>> Is this taking into account the kind of

38:34

AI that exists currently? And do you

38:37

anticipate that just like we could have

38:40

never really believed that AI would be

38:42

at least a person like myself would

38:43

never believe AI would be so

38:45

ubiquitous and so worth it. It's so

38:48

powerful today and so important today.

38:50

We never thought that 10 years ago.

38:52

Never thought that,

38:53

>> right?

38:53

>> You imagine like what are we looking at

38:55

10 years from now?

39:00

>> I think that if you reflect back 10

39:03

years from now, you would say the same

39:05

thing that we would have never believed

39:07

that

39:08

>> but

39:08

>> in a different direction,

39:09

>> Right? But if you go forward nine

39:12

years from now

39:15

and then ask yourself what's going to

39:17

happen 10 years from now, I think it'll

39:19

be quite gradual. Um, one of the things

39:22

that Elon said that makes me happy is

39:25

he believes that we're going to get

39:28

to a point where it's not

39:31

it's not necessary for people to work

39:34

and not meaning that you're going to

39:37

have no purpose in life, but you will

39:39

have in his words universal high income

39:42

because so much revenue is generated by

39:44

AI that it will take away this need for

39:50

people to do things that they don't

39:52

really enjoy doing just for money. And I

39:54

think a lot of people have a problem

39:56

with that because their entire identity

39:59

and how they think of themselves and

40:01

how they fit in the community is what

40:02

they do. Like this is Mike. He's an

40:04

amazing mechanic. Go to Mike and Mike

40:06

takes care of things. But there's going

40:08

to come a point in time where AI is

40:11

going to be able to do all those things

40:12

much better than people do. And

40:14

people will just be able to receive

40:16

money. But then what does Mike do? Mike

40:18

is, you know, really loves being the

40:21

best mechanic around. You know, what

40:23

does the guy who, you know,

40:26

codes, what does he do when AI can code

40:29

infinitely faster with zero errors? Like

40:32

what what happens with all those people?

40:34

And that is where it gets weird. It's

40:37

like because we've sort of wrapped our

40:38

identity as human beings around what we

40:41

do for a living.

40:42

>> You know, when you meet someone, one of

40:44

the first things you meet somebody at a

40:45

party, hi Joe. What's your name? Mike.

40:47

What do you do, Mike? And, you know, Mike's

40:49

like, "Oh, I'm a lawyer." "Oh, what kind

40:50

of law?" And you have a conversation,

40:52

you know, when Mike is like, "I get

40:54

money from the government. I play video

40:55

games."

40:56

>> Gets weird.

40:57

>> Mhm.

40:57

>> And I think um the concept sounds great

41:01

until you take into account human

41:03

nature. And human nature is that we like

41:06

to have puzzles to solve and things to

41:08

do and and an identity that's wrapped

41:10

around our idea that we're very good at

41:13

this thing that we do for a living.

41:16

>> Yeah. Yeah, I think um let's see, let me

41:20

start with the more mundane and I'll

41:21

work backwards, okay? Work forward.

41:24

Uh so one of the predictions from uh

41:30

Geoff Hinton, who started the whole

41:33

deep learning phenomenon the deep

41:36

learning technology trend

41:38

and uh, an incredible

41:41

researcher, a professor at the University of

41:44

Toronto

41:45

uh, he discovered, or invented,

41:48

the idea of backpropagation,

41:51

which uh, allows the neural network

41:54

to learn.

41:56

And um

42:00

And as you know, uh, for the

42:03

audience,

42:05

software historically was humans

42:08

applying first principles and our

42:10

thinking to uh describe an algorithm

42:15

that is then codified just like a recipe

42:19

that's codified in software. It looks

42:22

just like a recipe. How to cook

42:23

something looks exactly the same just in

42:25

a slightly different language. We call

42:27

it Python or C or C++ or whatever it is.

42:32

In the case of deep learning, this

42:35

invention of artificial intelligence,

42:37

we put a structure of a whole bunch of

42:40

neural networks and a whole bunch of

42:43

math units

42:45

and we make this large structure. It's

42:49

like a switchboard of little

42:53

mathematical units, and we connect it

42:55

all together.

42:57

Um, and we give it the input that

43:03

the software would eventually receive

43:06

and we just let it randomly guess what

43:10

the output is. And so we say, for

43:12

example, the input could be a picture of

43:15

a cat.

43:17

And um, one of the outputs of the

43:20

switchboard is where the cat signal is

43:23

supposed to show up. And all of the

43:25

other signals, the other one's a dog,

43:28

the other one's an elephant, the other

43:29

one's a tiger.

43:31

And all of the other signals are

43:33

supposed to be zero when I show it a

43:35

cat. And the one that is a cat should be

43:38

one.

43:40

And I show it a cat through this big

43:43

huge network of switchboards and math

43:46

units and they're just doing multiply

43:49

and adds, multiplies and adds. Okay?

43:53

And uh, this thing, this

43:55

switchboard is gigantic.

43:58

The more information you're going to

43:59

give it, the bigger this

44:01

switchboard has to be. And what Geoff

44:03

Hinton discovered, or invented, was a

44:06

way for you to

44:09

correct that guess. You put the

44:11

cat image in, and that cat image, you know,

44:15

could be a million numbers because it's

44:18

you know a megapixel image for example

44:20

and it's just a whole bunch of

44:22

numbers and somehow from those numbers

44:26

it has to light up the cat signal. Okay,

44:29

that's the bottom line. And if it the

44:33

first time you do it, it just comes up

44:35

with garbage. And so it says the right

44:39

answer is cat. And so you need to

44:43

increase this signal and decrease all of

44:45

the other and back propagates the

44:48

outcome through the entire network. And

44:51

then you show another. Now it's an image

44:54

of a dog and it guesses it takes a swing

44:58

at it and it comes up with a bunch of

45:00

garbage and you say no no no the answer

45:03

is this is a dog I want you to produce

45:05

dog and all of the other switch all the

45:09

other outputs have to be zero and I want

45:11

to back propagate that and just do it

45:14

over and over and over again. It's just

45:16

like uh, showing a kid: this is an

45:18

apple, this is a dog, this is a cat. And

45:20

you just keep showing it to them until

45:22

they eventually get it. Okay. Well,

45:25

anyways, that big invention is deep

45:26

learning. That's the foundation of

45:28

artificial intelligence, a piece of

45:31

software

45:33

that learns from examples. That's

45:36

basically machine learning, a machine

45:38

that learns.
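[Editor's note: as an illustration of the training loop just walked through (show the network an input, let it take a swing at the answer, compare the guess against the right label, and back-propagate the error to nudge the weights), here is a minimal sketch of a tiny classifier trained that way. The data is random noise standing in for images, and every name in it is a placeholder for illustration, not anything Nvidia or anyone else ships.]

```python
# Illustrative sketch only: a tiny "switchboard" of multiplies and adds,
# trained by guessing and back-propagating the error, as described above.
import numpy as np

rng = np.random.default_rng(0)
classes = ["cat", "dog", "elephant", "tiger"]
n_pixels, n_hidden, n_classes = 64, 32, len(classes)

# Fake dataset: 200 "images" (random numbers) with random labels.
X = rng.normal(size=(200, n_pixels))
y = rng.integers(0, n_classes, size=200)

# Two layers of weights connecting inputs to the output signals.
W1 = rng.normal(scale=0.1, size=(n_pixels, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))

lr = 0.1
for step in range(500):
    # Forward pass: the network takes a swing at the answer.
    h = np.maximum(0, X @ W1)                       # hidden layer (ReLU)
    logits = h @ W2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)               # softmax "cat signal", etc.

    # The target: the correct class should be 1, all the others 0.
    target = np.zeros_like(p)
    target[np.arange(len(y)), y] = 1.0

    # Backward pass: push the error back through the network (backprop).
    d_logits = (p - target) / len(y)
    d_W2 = h.T @ d_logits
    d_h = d_logits @ W2.T
    d_h[h <= 0] = 0                                 # ReLU gradient
    d_W1 = X.T @ d_h

    W1 -= lr * d_W1                                 # nudge the weights toward
    W2 -= lr * d_W2                                 # the right answer

print("predicted:", classes[int(p[0].argmax())])
```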

45:42

Uh, and so one of the big

45:44

first

45:46

applications was image recognition and

45:48

one of the most important image

45:50

recognition applications is radiology.

45:53

>> And so uh, he predicted about five

45:59

years ago that in five years' time the

46:02

world won't need any radiologists

46:05

because AI would have swept the whole

46:06

field.

46:08

Well, it turns out AI has swept the

46:10

whole field. That is completely true.

46:13

Today, just about every radiologist is

46:16

using AI in some way. And what's ironic

46:20

though, what's interesting is

46:22

that the number of radiologists has

46:24

actually grown.

46:27

And so the question is why? That's kind

46:30

of interesting, right?

46:31

>> It is. And so the prediction was in fact

46:34

that

46:36

30 million radiologists will be wiped

46:38

out.

46:40

But as it turns out, we needed more. And

46:42

the reason for that

46:43

[clears throat and cough]

46:44

is because the purpose of a radiologist

46:47

is to diagnose disease,

46:49

not to study the image. The image

46:51

studying is simply a task in service

46:57

of diagnosing the disease. And so now

47:01

the fact that you could study the images

47:04

more quickly and more precisely

47:07

without ever making a mistake and never

47:08

gets tired.

47:10

You could study more images. You could

47:13

study it in

47:15

3D form instead of 2D because you know

47:19

the AI doesn't care whether it studies

47:20

images in 3D or 2D. You could study it

47:23

in 4D. And so the now you could study

47:26

images in a way that

47:28

radiologists can't easily do and you

47:31

could study a lot more of it. And so the

47:33

number of tests that people are able to

47:35

do increases and because they're able to

47:38

serve more patients, the hospital does

47:42

better. They have more clients, more

47:44

patients. As a result, they have better

47:46

economics. When they have better

47:48

economics, they hire more radiologists

47:50

because their purpose is not to study

47:52

the images. their purpose is to diagnose

47:55

disease. And so the question, what

47:58

I'm leading up to, is ultimately, what is

48:00

the purpose? What is the purpose of the

48:03

lawyer? And has the purpose changed?

48:07

What is the purpose? You know, one of

48:09

the examples that I

48:11

would give is, for example, uh, if my car

48:15

became self-driving

48:17

will all chauffeurs be out of jobs? The

48:20

answer probably is not because for some

48:22

chauffeurs, for some

48:25

people who are driving you they could be

48:26

protectors; for some people, um, they're part

48:29

of the experience, part of the service. So

48:31

when you get there they you know they

48:33

could take care of things for you and so

48:35

for a lot of different reasons not all

48:37

chauffeurs would lose their jobs, some

48:40

chauffeurs would lose their jobs, and uh

48:42

many chauffeurs would change their jobs,

48:45

and the type of applications of

48:47

autonomous vehicles will probably

48:49

increase you know the usage of the

48:51

technology will find new homes. And so

48:54

I think you have to go back to what is

48:56

the purpose of a job you know like for

48:58

example if AI comes along I actually

49:00

don't believe I'm going to lose my job

49:01

because my purpose isn't to... I have to

49:05

look at a lot of documents I study a lot

49:08

of emails I look at a bunch of diagrams

49:11

you know um the question is what is the

49:14

job and and uh the purpose of somebody

49:17

probably hasn't changed. A lawyer, for

49:19

example, helps people; that probably hasn't

49:21

changed. Studying legal documents,

49:23

generating documents, it's part of the

49:26

job, not the job.

49:27

>> But don't you think there are many jobs

49:29

that AI will replace

49:31

>> if your job is automation

49:33

>> yeah if your job is the task

49:35

>> right so automation

49:36

>> Yeah, if your job is the task,

49:39

>> that's a lot of people

49:40

>> it could be a lot of people but it'll

49:42

probably generate like for example

49:45

>> Uh, let's say I'm super

49:47

excited about the robots Elon's

49:50

working on.

49:52

It's still a few years away.

49:54

When it happens, when it happens,

49:58

um

50:00

there's a whole new industry of

50:02

technicians and people who have to

50:05

manufacture the robots, right?

50:06

>> Mhm.

50:07

>> And so that that job never existed. And

50:10

so you're going to have a whole industry

50:12

of people taking care of like for

50:15

example, you know, all the mechanics and

50:17

all the people who are building things

50:18

for cars, supercharging cars, uh that

50:23

didn't exist before cars and now we're

50:24

going to have robots. You're going to

50:26

have robot apparel. So a whole industry

50:29

of [laughter] Right. Isn't that right?

50:30

Because I want my robot to look

50:31

different than your robot.

50:32

>> Oh god.

50:33

>> And so [laughter] you're going to you're

50:35

going to have a whole, you know, apparel

50:36

industry for robots. You're going to

50:38

have mechanics for robots and you have

50:40

you know, people who come and maintain

50:42

your robots

50:43

>> automated though.

50:43

>> No,

50:44

>> you don't think so? You don't think

50:45

[clears throat] they'll be all done by

50:46

other robots

50:47

>> eventually? And then there'll be

50:49

something else.

50:50

>> So you think ultimately people just

50:52

adapt except if you are the task

50:56

>> which is a large percentage of the

50:58

workforce.

50:59

>> If your job is just to chop vegetables,

51:01

a Cuisinart is going to replace you.

51:02

>> Yeah. So people have to find meaning in

51:05

other things. Your job has to be more

51:07

than the task.

51:08

>> What do you think about Elon's belief

51:10

that this universal basic income thing

51:13

will eventually become necessary?

51:18

>> Many people think that. Andrew Yang

51:19

thinks that

51:21

>> he was one of the first people to sort

51:22

of sound that alarm during the the 2020

51:24

election.

51:30

Yeah, I I guess um

51:34

yeah, both ideas probably won't exist at

51:37

the same time and and um as in life,

51:40

things will probably be in the middle.

51:42

One idea, of course, is that there'll be

51:45

so much abundance of resource that

51:47

nobody needs a job and we'll all be

51:49

wealthy.

51:51

On the other hand, um we're going to

51:54

need universal basic income. Both ideas

51:57

don't exist at the same time,

51:59

>> right?

52:00

>> And so we're either going to be all

52:01

wealthy or we're going to be all

52:04

>> How could everybody be wealthy though?

52:05

But

52:06

>> Because in that scenario, wealthy not because you

52:08

have a lot of dollars, wealthy because

52:09

there's a lot of abundance. Like for

52:11

example, today we are wealthy of

52:14

information.

52:16

You know, this is some a concept several

52:17

thousand years ago only a few people

52:20

have. And so, uh, today we have wealth

52:23

of a whole bunch of things, resources

52:25

that that historic point. Yeah. And so,

52:27

we're going to have wealth of resources,

52:29

things that we think are valuable today

52:32

that in the future are just not not that

52:34

valuable, you know, and so it because

52:36

it's automated. And so I think I think

52:39

the question

52:41

maybe maybe partly it's hard to answer

52:45

partly because

52:48

it's hard to talk about infinity and

52:50

it's hard to talk about a long time from

52:51

now and and the reason for that is

52:54

because

52:57

there's just too many scenarios to to

52:59

consider. But I think it I think in the

53:01

next several years, call it 5 to 10

53:03

years,

53:06

there are several things that I I

53:08

believe in hope. Um, and I say hope

53:11

because I'm not sure. One of the things

53:13

that I believe is that the technology

53:16

divide will be substantially collapsed.

53:22

And of course the alternative

53:26

viewpoint is that AI is going to

53:29

increase the technology divide.

53:32

Now the reason why I believe AI is going

53:34

to reduce the technology divide.

53:37

I is because we have proof

53:40

the evidence is that AI is the easiest

53:43

application in the world to use. Chat

53:45

GPT has grown to almost a billion users

53:48

frankly practically overnight. And if

53:51

you're not exactly sure how to use it,

53:53

everybody knows how to use ChatGPT. Just

53:54

say something to it. If you're not sure

53:56

how to use ChatGPT, you ask ChatGPT how to

53:59

use it. No tool in history has ever had

54:03

this capability. A Cuisinart, you

54:06

know, if you don't know how to use it,

54:07

you're kind of screwed. You're going to

54:09

walk up to it and say, "How do you use a

54:10

Cuisinart?" You're going to have to

54:11

find somebody else. And so, but an AI

54:14

will just tell you exactly how to do it.

54:16

Anybody could do this. It'll speak to

54:18

you in any language. And if it doesn't

54:20

know your language, you'll speak it in

54:22

that language and it'll probably figure

54:24

out that it doesn't completely

54:25

understand your language. It goes and learns it

54:28

instantly and comes back and talk to

54:29

you. And so I think the the technology

54:32

divide has a real chance finally that

54:35

you don't have to speak Python or C++ or

54:38

Fortran. You can just speak human and

54:41

whatever form of human you like. And so

54:43

I think that that has a real chance of

54:45

closing the technology divide. Now, of

54:48

course, the counternarrative would say

54:51

that

54:53

AI is only going to be available for the

54:58

nations and the countries that have a

55:00

vast amount of resources because AI

55:02

takes energy

55:04

and AI takes um a lot of GPUs and

55:08

factories to be able to produce the AI.

55:11

No doubt at the scale that we would like

55:13

to do in the United States. But the fact

55:15

of the matter is your phone's going to

55:17

run AI just fine all by itself, you

55:21

know, in a few years. Today, it already

55:23

does it fairly decently. And so the the

55:26

the fact that every every country, every

55:29

nation, every every society will have

55:32

the benefit of very good AI. It might

55:34

not be tomorrow's AI. It might be

55:36

yesterday's AI, but yesterday's AI is

55:39

freaking amazing. You know, in 10 years

55:41

time, 9-year-old AI is going to be

55:43

amazing. You don't need, you know, 10

55:45

year old AI. You don't need frontier AI

55:48

like we need frontier AI because we want

55:50

to be the world leader. But for every

55:52

single country, everybody, I think the

55:54

the capability to elevate

55:56

everybody's knowledge and capability and

55:58

intelligence, uh, that day is coming.

56:00

The octagon isn't just in Las Vegas

56:02

anymore. It's right in your hands with

56:04

DraftKings Sportsbook, the official

56:06

sports betting partner of UFC. Get ready

56:09

because when Dvalishvili and Yan face

56:12

off again at UFC 323, every punch, every

56:16

takedown, every finish, it all has the

56:19

potential to pay off in real time. New

56:21

customers bet just $5. And if your bet

56:24

wins, you get paid $200 in bonus bets.

56:27

And hey, Missouri, the wait is over.

56:29

DraftKings Sportsbook is now live in the

56:32

Show Me State. Download the DraftKings

56:34

Sportsbook app and use promo code Rogan.

56:37

That's code Rogan to turn five bucks

56:39

into 200 in bonus bets if your bet wins.

56:43

In partnership with DraftKings, the

56:45

crown is yours. Gambling problem? Call

56:47

1-800-GAMBLER. In New York, call 877-8-HOPENY

56:50

or text HOPENY (467369). In

56:53

Connecticut, help is available for

56:54

problem gambling. Call 888-789-7777

56:57

or visit ccpg.org. Please play

57:00

responsibly. On behalf of Boot Hill Casino

57:01

and Resort in Kansas, pass through of

57:03

per wager tax may apply in Illinois. 21

57:05

and over. Age and eligibility varies by

57:07

jurisdiction void in Ontario.

57:08

Restrictions apply. Bet must win to

57:10

receive bonus bets which expire in 7

57:11

days. Minimum odds required. For

57:13

additional terms and responsible gaming

57:14

resources, see dkg.co/audio.

57:17

Limited time offer.

57:19

>> And also energy production, which is the

57:22

real bottleneck when it comes to third

57:24

world countries and

57:25

>> that's right,

57:26

>> electricity and all all the resources

57:29

that we take for granted.

57:31

>> Almost everything is going to be energy

57:32

constrained. And so if you take a look

57:35

at um

57:37

one of the most important technology

57:39

advances in history is this idea called

57:41

Moore's law. Moore's law

57:45

started basically in my

57:48

generation

57:50

and my generation is the generation of

57:52

computers. I graduated in 1984 and that

57:56

was basically at the very beginning of

57:58

the PC revolution.

58:01

And the microprocessor and and um

58:06

every single year it approximately

58:08

doubled

58:10

and we describe it as every single year

58:12

we double the performance. But what it

58:14

really means is that every single year

58:17

the cost of computing halved.

58:20

And so the cost of computing in the

58:24

course of five years reduced by a factor

58:27

of 10. The amount of energy necessary to

58:31

do computing to do any task reduced by a

58:33

factor of 10. Every 10 years, 100,

58:38

a thousand, 10,000,

58:41

100,000, so on and so forth. And so each

58:45

one of

58:47

the clicks of Moore's law, the amount of

58:49

energy necessary to do any computing

58:51

reduced. That's the reason why you have

58:53

a laptop today, when back in 1984 it sat on

58:57

the desk, you got to plug in, it wasn't

58:59

that fast and it consumed a lot of

59:01

power. Today, you know, it is only a few

59:03

watts. And so Moore's law is the

59:06

fundamental technology, the fundamental

59:08

technology trend that made it possible.
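To make the compounding he is describing concrete, here is a minimal back-of-the-envelope sketch in Python (illustrative arithmetic only, not Nvidia's or anyone's official figures): if the cost of a unit of computing halves at a steady cadence, the savings multiply into very large factors.

    # Illustrative only: steady halving of cost compounds quickly.
    def times_cheaper(years, doubling_period_years):
        # Each doubling period cuts the cost of the same task in half.
        return 2.0 ** (years / doubling_period_years)

    # The classic Moore's-law cadence of doubling roughly every 18 months
    # gives about 10x per 5 years and about 100x per decade.
    for years in (5, 10, 20):
        print(years, "years:", round(times_cheaper(years, 1.5)), "x cheaper")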

59:10

Well, what's going on in AI? The reason

59:12

why Nvidia is here is because we

59:14

invented this new way of doing

59:16

computing. We call it accelerated

59:17

computing. We started it 33 years ago.

59:19

Took us about 30 years to really make a

59:22

huge breakthrough. In that in that 30

59:25

years or so

59:27

we took computing you know probably a

59:31

factor of, well, let me just say in the last

59:33

10 years the last 10 years we improved

59:36

the performance of computing by 100,000

59:40

times.

59:41

Whoa. Imagine a car over the course of

59:44

10 years that became a 100,000 times

59:46

faster or at the same speed 100,000

59:50

times cheaper or at the same speed

59:54

100,000 times less energy. If your car

59:57

did that, it doesn't need energy at all.

60:00

What I mean what what I'm trying to say

60:02

is that in 10 years time the amount of

60:06

energy necessary for artificial

60:08

intelligence for most people will be

60:10

minuscule

60:12

utterly minuscule and so we'll have AI

60:15

running in all kinds of things and all

60:16

the time because it doesn't consume that

60:18

much energy and so if you're a nation

60:21

that uses AI for you know almost

60:24

everything in your social fabric of

60:26

course you're going to need these AI

60:27

factories but for a lot of countries I

60:29

think you're going You're going to have

60:30

excellent AI and you're not going to

60:32

need as much energy. Everybody will be

60:34

able to come along is my point.

60:36

>> So currently that that is a big

60:38

bottleneck, right? Is energy.

60:40

>> Yeah, it is the bottleneck.

60:41

>> The bottleneck is this. So was it Google

60:43

that is making nuclear power plants to

60:47

operate one of its AI factories?

60:50

>> Oh, I haven't heard that. But I think in

60:51

the next six, seven years, I think

60:53

you're going to see a whole bunch of

60:55

small nuclear reactors.

60:57

>> And by small, like how big are you

60:58

talking about? Hundreds of megawatts.

61:00

Yeah.

61:01

>> Okay. And that these will be local to

61:04

whatever specific company they have.

61:06

>> That's right. Will all be power

61:07

generators.

61:08

>> Whoa.

61:09

>> You know, just like just like your you

61:12

know, somebody's farm.

61:13

>> It probably is the smartest way to do

61:15

it, right?

61:16

>> And it takes the burden off Yeah. takes

61:18

the burden off the grid. It takes and

61:20

you could build as much as you need

61:22

>> and you can contribute back to the grid.

61:24

It's a really important point that I

61:25

think you just made about Moore's law

61:27

and the relationship to pricing because

61:31

you know a laptop today like you can get

61:32

one of those little MacBook Airs.

61:34

They're incredible. They're so thin,

61:36

unbelievably powerful. Battery life is

61:38

charge it.

61:39

>> Yeah. Battery [laughter] life's crazy.

61:40

And uh it's not that expensive

61:43

relatively speaking. Like something like

61:44

that.

61:45

>> I remember.

61:45

>> And that's just Moore's law, right?

61:47

>> Then there's the Nvidia law.

61:49

>> Oh,

61:49

>> just right. the the the law I was

61:51

talking to you about, the computing that

61:53

we invented,

61:54

>> right?

61:54

>> The reason why we're here, this new this

61:57

new way of doing computing

61:59

>> is like Moore's law on energy drinks. I

62:02

mean, it's [laughter]

62:04

it's like Moore's law

62:07

it's it's like Yeah. Moore's law and Joe

62:09

Rogan.

62:10

>> Wow. That's interesting.

62:12

>> Yeah. That's us.

62:13

>> So, explain that. Um this this chip that

62:16

you brought to Elon, what what's the

62:18

significance of this? It's like why is

62:19

it so superior? And so

62:23

in 2012, Geoff Hinton's lab, this

62:26

gentleman I was talking talking about,

62:29

um Ilya Sutskever, Alex Krizhevsky, um

62:34

they made a breakthrough in computer

62:37

vision in literally creating a

62:42

piece of software

62:44

called AlexNet.

62:47

And its job was to recognize images. And

62:50

it recognized images

62:52

at a level. Computer vision, which

62:55

is fundamental to intelligence. If you

62:57

can't perceive, you can't it's hard to

62:59

have intelligence. And so computer

63:01

vision is a fundamental pillar, not

63:03

the only one, but a fundamental pillar, of intelligence. And

63:05

so breaking

63:07

computer vision or breaking through in

63:10

computer vision is pretty foundational

63:11

to almost everything that everybody

63:13

wants to do in AI. And so in 2012,

63:17

their lab in Toronto

63:20

uh made this made this breakthrough

63:23

called AlexNet. And AlexNet was able to

63:26

recognize images

63:29

so much better than any human created

63:33

computer vision algorithm in the 30

63:36

years prior. So all of these people, all

63:39

these scientists and we had many too

63:42

working on computer vision algorithms

63:45

and these two kids, Ilya and Alex, under

63:49

the the uh

63:51

under uh Geoff Hinton took a giant

63:55

leap above it and it was based on this

63:58

thing called AlexNet, this neural network.

64:01

And the way it ran,

64:03

the way they they made it work was

64:06

literally buying two Nvidia graphics

64:08

cards

64:09

because Nvidia Nvidia's GPUs we've been

64:12

working on this new way of doing

64:14

computing and our GPUs application

64:18

and it's basically a supercomputing

64:20

application. Back in 1984,

64:25

in order to

64:28

process computer games and what you have

64:31

in your racing simulator that is called

64:34

an image generator supercomputer.

64:37

And so Nvidia started our first

64:40

application was computer graphics and we

64:43

applied this new way of doing computing

64:46

where we do things in parallel in

64:47

instead of sequentially. A CPU does

64:50

things sequentially. Step one, step two,

64:52

step three. In our case, we break the

64:55

problem down and we give it to thousands

64:58

of processors.

65:00

And so our way of doing computation

65:05

is much more complicated.

65:08

But if you're able to formulate the

65:11

problem in the way that we

65:14

created called CUDA, this is the

65:16

invention of our company. If you could

65:18

formulate it in that way, we could

65:20

process everything simultaneously.

65:23

Now, in the case of computer graphics,

65:25

it's easier to do because every single

65:28

pixel on your screen is not related to

65:31

every other pixel. And so, I could

65:33

render multiple parts of the screen at

65:36

the same time. Not not completely true

65:39

because, you know, maybe maybe the way

65:41

lighting works or the way shadow works,

65:43

there's a lot of dependency and and

65:44

such. But computer graphics with all the

65:48

dis with all the pixels, I should be

65:50

able to process everything

65:51

simultaneously. And so we we took

65:54

this embarrassingly parallel problem

65:56

called computer graphics and we applied

65:58

it to this new way of doing computing.

66:01

Nvidia's accelerated computing.
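A minimal sketch of the data-parallel idea he is describing, written in plain Python rather than CUDA (a toy illustration of mine, not Nvidia's code): shading every pixel is the same small computation repeated on independent data, so it can be expressed as one batched operation instead of a sequential loop.

    import numpy as np

    # A tiny "frame": each pixel's value depends only on its own coordinates.
    H, W = 108, 192
    xs, ys = np.meshgrid(np.linspace(0, 1, W), np.linspace(0, 1, H))

    # Sequential, CPU-style: one pixel after another.
    frame_seq = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            frame_seq[y, x] = 0.5 * xs[y, x] + 0.5 * ys[y, x]   # toy "shader"

    # Parallel-style: the same shader applied to every pixel at once.
    frame_par = 0.5 * xs + 0.5 * ys

    assert np.allclose(frame_seq, frame_par)   # same image, different execution model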

66:05

We put it in all of our graphics cards.

66:08

Kids were buying it to play games. We're

66:11

you probably don't know this, but we're

66:13

the largest gaming platform in the world

66:15

today.

66:15

>> Oh, I know that. Oh,

66:16

>> okay.

66:16

>> I used to make my own computers. I used

66:18

to buy your graphics cards.

66:19

>> Oh, that's super cool.

66:20

>> Yeah. [laughter] set up SLI with two

66:22

graphics cards.

66:23

>> Yeah, I love it. Okay, that's super

66:24

cool.

66:24

>> Oh, yeah, man. I used to be a Quake

66:26

junkie.

66:26

>> Oh, that's cool.

66:27

>> Yeah.

66:28

>> Okay, so SLI, I'll tell you the story in

66:30

just a second and how it led to Elon.

66:33

I'm still answering the question. And

66:35

so, anyways, these these two kids

66:38

trained this model using the technique I

66:40

described earlier on our GPUs because

66:42

our GPUs could process things in

66:44

parallel. It's essentially a supercomputer

66:48

in a PC. The reason why you used it for

66:51

Quake is because it is the first

66:54

consumer supercomputer. Okay. And so

66:56

anyways,

66:59

they made that breakthrough. We were

67:00

working on computer vision at the time.

67:02

It caught my attention

67:05

and so we went to learn about it.

67:08

Simultaneously this deep learning

67:10

phenomenon was happening all over all

67:12

over the country. Universities after

67:14

another recognized the importance of

67:16

deep learning and all of this work was

67:19

happening at Stanford, at Harvard, at

67:21

Berkeley, just all over the place. New

67:24

York University, Yann LeCun, Andrew

67:27

Ng at Stanford, so many different

67:29

places. And I see it cropping up

67:32

everywhere.

67:34

And so my curiosity asked, you know,

67:38

what is so special about this form of

67:40

machine learning? And we've known about

67:42

machine learning for a very long time.

67:43

We've known about AI for a very long

67:45

time. We've known about neural networks

67:46

for a very long time. What makes now the

67:50

moment? And so we realized that this

67:54

architecture for deep neural networks

67:56

backpropagation, the way deep neural

67:59

networks were created. We could probably

68:02

scale this problem, scale the solution

68:05

to solve many problems.

68:08

that is essentially

68:10

a universal function approximator. Okay?

68:13

Meaning meaning you know back when

68:16

you're in in in school you have a you

68:18

have a you have a box inside of it is a

68:21

function you give it an input it gives

68:23

you an output and and the the reason why

68:26

I call it universal function

68:27

approximator

68:29

is that this computer instead of you

68:31

describing the function a function could

68:34

be an equation, F equals ma, that's a

68:37

function you write the function in

68:39

software. You give it input, mass and

68:43

acceleration, it'll tell you the force.

68:45

Okay? And

68:48

the way this computer works is really

68:50

interesting.

68:52

You give it a universal function. It's

68:55

not F equals ma, just a universal function.

68:57

It's a big huge deep neural network

69:01

and instead of describing the inside,

69:05

you give it examples of input and output

69:08

and it figures out the inside.

69:11

So you give it input and output and it

69:13

figures out the inside. A universal

69:15

function approximator. Today it could be

69:17

Newton's equation. Tomorrow it could be

69:19

Maxwell's equation. It could be Coulomb's

69:22

law. It could be thermodynamics

69:24

equation. It could be you know

69:25

Schrödinger's equation for quantum physics.

69:28

And so you could put any you could have

69:30

this describe almost anything so long as

69:33

you have the input and the output. So

69:36

long as you have the input and the

69:37

output or it could learn the input and

69:39

output.
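Here is a minimal sketch of that idea in Python (an illustration of mine, with made-up network sizes and learning rate, not anyone's production code): a tiny neural network is shown only example inputs and outputs of F = m times a, never the formula itself, and gradient descent figures out "the inside."

    import numpy as np

    rng = np.random.default_rng(0)

    # Examples of input and output: inputs (m, a), output F = m * a.
    X = rng.uniform(0.0, 2.0, size=(1024, 2))
    y = (X[:, 0] * X[:, 1]).reshape(-1, 1)

    # One small hidden layer; this is the "inside" the network has to learn.
    W1 = rng.normal(0, 0.5, size=(2, 32)); b1 = np.zeros((1, 32))
    W2 = rng.normal(0, 0.5, size=(32, 1)); b2 = np.zeros((1, 1))
    lr = 0.05

    for step in range(5000):
        h = np.tanh(X @ W1 + b1)            # forward pass: current guess
        pred = h @ W2 + b2
        err = pred - y                      # how wrong the guess is

        # Backpropagation: nudge the weights to shrink the error.
        dW2 = h.T @ err / len(X); db2 = err.mean(axis=0, keepdims=True)
        dh = (err @ W2.T) * (1.0 - h ** 2)
        dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0, keepdims=True)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    # Never told "F = m * a", only shown examples, yet it approximates it.
    m, a = 1.5, 1.2
    h = np.tanh(np.array([[m, a]]) @ W1 + b1)
    print((h @ W2 + b2).item(), "vs true", m * a)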

69:40

>> And so we took a step back and we said,

69:42

"Hang on a second. This isn't just for

69:46

computer vision. Deep learning could

69:48

solve any problem.

69:50

All the problems that are interesting so

69:52

long as we have input and output. Now

69:55

what has input and output?

69:58

Well, the world. The world has input and

70:01

output. And so we could have a computer

70:03

that could learn almost anything.

70:05

Machine learning, artificial

70:07

intelligence. And so we reasoned that

70:09

maybe this is the fundamental

70:11

breakthrough that we needed. There were

70:14

a couple of things that had to be

70:15

solved. For example, we had to believe

70:17

that you could actually scale this up to

70:19

giant systems. It was running on, they

70:22

had two graphics cards, two GTX 580s,

70:26

[laughter]

70:28

which by the way is exactly your SLI

70:30

configuration. Yeah. Okay. So, that GTX

70:34

580 SLI was the revolutionary computer

70:38

that put deep learning on the map.

70:41

>> Wow.

70:41

>> It was 2012 and you were using it to

70:44

play Quake.

70:45

>> Wow. That's crazy.

70:46

>> That was the moment. That was the big

70:48

bang of modern AI. We were lucky because

70:52

we were inventing this technology, this

70:54

computing approach. We were lucky that

70:56

they found it.

70:58

Turns out they were gamers and it was

71:00

lucky they found it. And it it was lucky

71:03

that we paid attention to that moment.

71:06

It was a little bit like, you know, that

71:10

Star Trek,

71:12

you know, first contact.

71:16

The Vulcans had to have seen the warp

71:18

drive at that very moment. If they

71:20

didn't witness the warp drive, you know,

71:23

they would have never come to Earth and

71:26

everything would have never happened.

71:28

It's a little bit like if I hadn't paid

71:29

attention to that moment, that flash.

71:31

And that flash didn't last long. If I

71:34

hadn't paid attention to that flash or

71:35

our company didn't pay attention to it,

71:38

who knows what would have happened, but

71:40

we saw that and we reasoned our way into

71:42

this is a this is a universal function

71:44

approximator. This is not just a

71:46

computer vision approximator. We could

71:48

use this for all kinds of things. if we

71:50

could solve two problems. The first

71:51

problem is that we have to prove to

71:53

ourselves it could scale. The second

71:56

problem we had to

72:01

wait for I guess contribute to and wait

72:04

for is

72:08

the world will never have enough data

72:11

on input and output where we could

72:15

supervise

72:16

the AI to learn everything. For example,

72:19

if we have to supervise our children on

72:21

everything they learn, the amount of

72:24

information they could learn is limited.

72:26

We needed the AI, we needed the computer

72:28

to have a method of learning without

72:31

supervision.

72:33

And that's where we had to wait a few

72:35

more years, but unsupervised

72:38

AI learning is now here. And so the AI

72:41

could learn by itself. And and the

72:44

reason why the AI could learn by itself

72:45

is because we have many examples of

72:48

right answers. Like for example,

72:51

if I want to learn uh if I want to teach

72:54

an AI how to predict the next word, I

72:57

could just grab it, grab a whole bunch

72:59

of text we already have, mask out the

73:01

last word and make it try and try and

73:04

try again until it predicts the next

73:06

one. or I mask out random words inside

73:09

inside the text and I make it try and

73:11

try and try until it predicts it. You

73:12

know, like uh Mary uh Mary goes down to

73:17

the bank. Is it a river bank or a money

73:21

bank? Well, if you're going to go down

73:23

to the bank, it's probably a river bank.

73:25

Okay. So, and it it it might not be

73:27

obvious even from that. It might need

73:31

and

73:32

uh and uh and caught a fish. Okay. Now

73:36

you know it must be the riverbank. And

73:38

so so you give you give these AIs a

73:41

whole bunch of these examples and you

73:42

mask out the words, it'll predict the

73:44

next one. Okay? And so unsupervised

73:47

learning came along. These two ideas,

73:49

the fact that it's scalable and

73:50

unsupervised learning came along.
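A minimal sketch of the masking trick he just described (illustrative only; the sentence and the crude tokenization are mine, and no actual model is trained here): the "right answer" comes from the text itself, so no human has to label anything.

    import random

    random.seed(0)
    tokens = "mary goes down to the bank and caught a fish".split()

    def make_masked_example(tokens):
        # Hide one word; the hidden word itself becomes the training target.
        i = random.randrange(len(tokens))
        inputs = list(tokens)
        target = inputs[i]
        inputs[i] = "[MASK]"
        return inputs, target

    for _ in range(3):
        x, target = make_masked_example(tokens)
        print(" ".join(x), "->", target)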

73:53

We were convinced that we ought to put

73:56

everything into this and help create

73:58

this industry because we're going to

74:00

solve a whole bunch of interesting

74:01

problems. And that was in 2012. By 2016,

74:06

I had I had built this computer called

74:08

the DGX-1. The one that you saw me give

74:11

to Elon is called DGX Spark. The DGX-1

74:17

was $300,000.

74:20

It cost Nvidia a few billion dollars to

74:22

make the first one.

74:25

And instead of two chips SLI,

74:30

we connected eight chips with a

74:32

technology called NVLink, but it's

74:34

basically SLI supercharged.

74:38

Okay.

74:38

>> Okay.

74:38

>> And so we connected eight of these chips

74:40

together instead of just two. And all of

74:43

them work together just like your Quake

74:46

rig did to solve this deep learning

74:49

problem to train this model. And so I

74:52

create we created this thing. I

74:54

announced it at GTC

74:56

at one of our annual events

75:00

and I described this deep learning

75:01

thing, computer vision thing and this

75:05

computer called DGX-1.

75:07

The audience was like completely silent.

75:09

They had no idea what I was talking

75:11

about. [laughter]

75:14

And I was lucky because I I had known

75:17

Elon and uh uh I helped him build the

75:21

first computer for Model 3

75:23

uh uh the Model S. And uh and when he

75:27

wanted to start working on autonomous

75:29

vehicle, I helped him build the computer

75:31

that went into the the Model S AV

75:34

system, his full full self-driving

75:36

system. We were basically the FSD

75:40

computer version one. And so

75:44

we we're already working together and um

75:47

when I announced this thing, nobody in

75:50

the world wanted it. I had no purchase

75:52

orders. Not not one. Nobody wanted to

75:54

buy it. Nobody wanted to be part of it

75:57

except for Elon. He goes, he was at the

76:00

event and we were doing a fireside chat

76:02

about the future of self-driving cars.

76:06

I think it's like 2016. Yeah, 20 maybe

76:09

at that time it was 2015. and he goes,

76:12

"You know what?

76:14

I have a company that could really use

76:16

this."

76:17

I said, "Wow, my first customer." And

76:20

so, so I was pretty excited about it.

76:23

And he goes, "Uh, yeah. Uh, we have this

76:26

company. It's a nonprofit company."

76:30

And all the blood drained out of my

76:32

face. Yeah. [laughter]

76:34

I just spent a few billion dollars

76:36

building this thing. Cost $300,000. and

76:40

you know the chances of a nonprofit

76:43

being able to pay for this thing is

76:44

approximately zero. And he goes, you

76:46

know, this is a it's an AI company and

76:48

uh it's a nonprofit and and uh we could

76:52

really use one of these supercomputers.

76:54

And so I I picked it up. I built the

76:57

first one for ourselves. We're using it

76:58

inside the company. I boxed one up. I

77:01

drove it up to San Francisco and I

77:02

delivered to Elon in 2016. A bunch of

77:05

researchers were were there.

77:08

Pieter Abbeel was there, Ilya was there,

77:10

and there was a bunch of people there.

77:12

And uh I walk up to the second floor

77:15

where they were all kind of in a room

77:17

this smaller than your place here. And

77:20

and uh uh that place turned out to have

77:23

been OpenAI

77:25

>> 2016.

77:26

>> Wow.

77:26

>> Just a bunch of people sitting in a

77:29

room.

77:30

>> It's not really uh nonprofit anymore,

77:32

though, is it?

77:33

>> They're not They're not nonprofit

77:34

anymore. Yeah.

77:35

>> Weird how that works.

77:36

>> Yeah. Yeah. But anyhow, anyhow, Elon was

77:39

there. The Yeah, it was it was really a

77:41

great great moment.

77:42

>> Oh, yeah. There you go. Yeah, that's it.

77:45

[laughter]

77:45

>> Look at you, bro. Same jacket.

77:48

>> Look at that. I haven't aged.

77:50

>> Not not a lick of black hair, though.

77:53

>> Uh the size of it is uh it's

77:56

significantly smaller. That was the

77:57

other day. SpaceX.

77:59

>> Oh, yeah. There you go.

78:00

>> Yeah. Look at the difference.

78:01

>> Exactly the same industrial design. He's

78:03

holding it in his hand

78:06

here. Here's the amazing thing. DGX-1 was

78:10

one petaflop. Okay, that's a lot of

78:14

flops. And DGX Spark is one petaflop.

78:20

Nine years later.

78:22

>> Wow.

78:23

>> The same the same amount of computing

78:25

horsepower

78:26

>> in a much smaller

78:27

>> shrunken down. Yeah.

78:28

>> And instead of $300,000, it's now

78:31

$4,000. And it's the size of a small

78:33

book.

78:34

>> Incredible.

78:35

>> Crazy.
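The arithmetic behind that comparison, as a quick illustrative check using the prices quoted in the conversation: same one petaflop of compute, nine years apart.

    # Same compute, two price tags quoted above.
    dgx1_price, spark_price = 300_000, 4_000
    print(f"roughly {dgx1_price / spark_price:.0f}x cheaper per petaflop")   # ~75x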

78:37

>> That's how technology moves. Anyways,

78:38

that's the reason why I wanted to get

78:40

give him the first one

78:41

>> because I gave him the first one 2016.

78:43

>> It's so fascinating. I mean you if you

78:45

wanted to make a story for a film I mean

78:49

that would be the story that like what

78:52

what better scenario if if if it really

78:54

does become a digital life form how

78:57

funny would it be that it is birthed out

79:00

of the desire for computer graphics for

79:03

video games [laughter]

79:05

>> exactly

79:06

>> kind of cra it's kind of crazy

79:08

>> kind of crazy when you think about it

79:09

that way

79:10

>> because

79:12

it's just

79:13

>> perfect origin

79:14

Computer graphics was one of the hardest

79:18

computer supercomputer problems

79:21

generating reality

79:22

>> and also one of the most profitable to

79:24

solve because computer games are so

79:27

popular.

79:28

>> When Nvidia started in 1993,

79:32

we were trying to create this new

79:33

computing approach. The question is

79:35

what's the killer app?

79:38

And

79:41

the the problem we wanted to the the

79:43

company wanted to create a new type of

79:45

computing architecture, a

79:48

new type of computer that

79:51

can solve problems that normal computers

79:54

can't solve.

79:56

Well,

79:58

the applications that existed in the

80:01

industry in 1993

80:03

are applications that normal computers

80:05

can solve because if the normal

80:07

computers can't solve them, why would

80:08

the application exist?

80:11

And so, we had a mission statement for a

80:16

company that has no chance of success.

80:19

[laughter]

80:21

But I didn't know that in 1993. It just

80:23

sounded like a good idea,

80:25

>> right?

80:27

And so if we created this thing that can

80:30

solve problems, you know, it's like

80:34

you actually have to go create the

80:35

problem.

80:37

And so that's what we did in 1993. There

80:41

was no Quake. John Carmack hadn't

80:43

released Doom yet. You

80:46

probably remember that.

80:47

>> Sure. Yeah.

80:49

>> And and uh there were no applications

80:51

for it. And so I went to Japan because

80:55

the arcade industry had this at the time

80:59

of Sega, if you remember.

81:00

>> Sure.

81:01

>> The arcade machines, they came out with

81:03

3D arcade systems, Virtua Fighter,

81:08

Daytona, Virtua Cop, all of those

81:12

arcade games were in 3D for the f very

81:14

first time. And the technology they were

81:17

using was from Martin Marietta, the

81:20

flight simulators. They took the guts

81:22

out of a flight simulator and put it

81:24

into an arcade machine. The system that

81:28

you have over here, it's got to be a

81:31

million times more powerful than that

81:33

arcade machine. And that was a flight

81:35

simulator for NASA. Whoa. And so they

81:39

took the guts out of that. They were

81:42

they were using it for flight simulation

81:43

for jets and, you know, space shuttle

81:46

and and they took the guts out of that.

81:49

and Sega uh had this brilliant computer

81:52

developer. His name was Yu Suzuki.

81:56

Yu Suzuki and Miyamoto. Sega and Nintendo.

82:01

These were the, you know, the incredible

82:04

pioneers, the visionaries, the

82:06

incredible artists, and they're both

82:08

very, very technical.

82:11

They were the origins really of of the

82:14

gaming industry. And Yu Suzuki

82:17

pioneered 3D graphics gaming and um

82:22

so I went we we created this company and

82:25

there were no apps

82:27

and we were spending all of our

82:29

afternoons you know we told our family

82:31

we were going to work but it was just

82:33

the three of us you know who's going to

82:34

know and so we went to Curtis's my one

82:38

of one of the founders went to Curtis's

82:40

townhouse and uh Chris and I were

82:42

married we have kids I already had

82:44

Spencer and Madison. They were probably 2

82:46

years old. And um

82:50

and uh Chris's kids are about the same

82:52

age as ours. And we would go to work in

82:56

this townhouse. But you know, when

82:58

you're a startup and the mission

83:00

statement is the way we described,

83:02

you're not going to have too many

83:03

customers calling you. And so we had

83:06

really nothing to do. And so after

83:09

lunch, we would always have a great

83:10

lunch. After lunch, we would go to the

83:13

arcades and play the Sega V, you know,

83:16

the Sega Virtua Fighter and Daytona and

83:17

all those games and analyze how they're

83:20

doing it, trying to figure out how they

83:22

they were doing that.

83:24

And so we decided, um, let's just go to

83:27

Japan and let's

83:30

convince Sega to move those applications

83:33

into the PC.

83:35

and we would start the PC gaming the 3D

83:38

gaming industry partnering with Sega.

83:42

That's how Nvidia started.

83:43

>> Wow.

83:44

>> And so uh in exchange for them porting and

83:47

developing their games for our computers

83:52

in the PC, we would build a chip for

83:56

their game console. That was the

83:58

partnership. I build a chip for your

84:01

game console. you port the Sega games to

84:03

us and um

84:07

and then they paid us a you know at the

84:09

time a quite a significant amount of

84:11

money to build that game console

84:14

and that was kind of the beginning of

84:18

Nvidia getting started and we thought we

84:20

were on our way and so so I started with

84:23

a business plan a mission statement that

84:25

was impossible we lucked into the Sega

84:28

partnership we started taking off

84:31

started building our game console. And

84:33

about a couple years into it, we

84:35

discovered our first technology

84:38

didn't work.

84:40

It was it it would have been a flaw. It

84:42

it was a flaw. And all of the technology

84:45

ideas that we had

84:47

the architecture concepts were were

84:49

sound, but the way we were doing

84:52

computer graphics was exactly backwards.

84:55

you know, instead of

84:57

I won't bore you with the technology,

84:58

but instead of inverse texture mapping,

85:01

we were doing forward texture mapping.

85:04

Instead of triangles, we did curved

85:07

surfaces. So, other people did it flat,

85:10

we did it round. Um,

85:14

other technology, the technology that

85:16

ultimately won, the technology we use

85:18

today has Z-buffers. It automatically

85:21

sorted.

85:23

We had an architecture with no Z-buffers.

85:25

The application had to sort it. And so

85:27

we chose a bunch of technology

85:29

approaches

85:30

that

85:32

three major technology choices. All

85:34

three choices were wrong. Okay. So this

85:36

is how incredibly smart we were. And so

85:39

[laughter]
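To make the Z-buffer point concrete, here is a minimal sketch in Python (a toy rasterizer of my own for illustration, not Nvidia's or Silicon Graphics' design): keep the depth of the nearest thing drawn so far at every pixel and only overwrite when something closer arrives, so the application no longer has to sort its geometry back to front.

    import numpy as np

    H, W = 4, 8
    depth = np.full((H, W), np.inf)        # nearest depth seen so far per pixel
    color = np.zeros((H, W), dtype=int)    # which primitive "won" each pixel

    def draw_rect(y0, y1, x0, x1, z, prim_id):
        for yy in range(y0, y1):
            for xx in range(x0, x1):
                if z < depth[yy, xx]:      # closer than what is already there?
                    depth[yy, xx] = z
                    color[yy, xx] = prim_id

    # Draw a far rectangle, then a nearer overlapping one; draw order does not matter.
    draw_rect(0, 4, 0, 6, z=5.0, prim_id=1)
    draw_rect(1, 3, 3, 8, z=2.0, prim_id=2)   # nearer, so it wins where they overlap
    print(color)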

85:40

and so in 1995, early to mid '95,

85:44

we realized we were going down the wrong

85:46

path. Meanwhile,

85:49

the Silicon Valley was packed with 3D

85:52

graphics startups because it was the

85:54

most exciting technology of that time.

85:57

And so 3dfx and Rendition and Silicon

86:01

Graphics was coming in. Intel was

86:02

already in there and you know gosh like

86:06

what added up eventually to a hundred

86:08

different startups we had to compete

86:10

against. Everybody had chosen the right

86:12

technology approach and we chose the

86:15

wrong one. And so we were the first

86:17

company to start. We found ourselves

86:21

essentially dead last with the wrong

86:22

answer.

86:24

And so

86:26

the company was in trouble

86:30

and um

86:33

ultimately we had to make several

86:34

decisions. The first decision is

86:38

well

86:41

if we change now

86:44

we will be the last company.

86:49

And

86:52

even if we changed into the technology

86:54

that we believe to be right, we'd still

86:57

be dead. And so that argument,

87:01

you know,

87:03

do we change and therefore be dead?

87:05

Don't change and make this technology

87:07

work somehow or go do something

87:10

completely different.

87:13

That question stirred the company

87:15

strategically and was a hard question. I

87:18

eventually, you know, advocated for we

87:22

don't know what the right strategy is,

87:23

but we know what the wrong technology

87:25

is. So, let's stop doing it the wrong

87:27

way and let's give ourselves a chance to

87:29

go figure out what the strategy is. The

87:31

second thing, the second problem we had

87:34

was our company was running out of money

87:36

and I had I was in a contract with Sega

87:39

and I owed them this game console

87:43

and if that contract would have been

87:45

cancelled, we'd be dead.

87:48

We would have vaporized instantly.

87:52

And so so uh uh I went to Japan and I

87:56

explained to uh the CEO of Sega, Shoichiro

88:01

Irimajiri, really great man. He was the

88:04

former CEO of Honda USA. Went back to

88:08

Sega to run Sega. Went back to Japan to

88:10

run Sega. And I explained to him that I

88:15

was uh I guess I was what 30 33 years

88:18

old.

88:20

you know, when I was 33 years old, I

88:22

still had acne. And I got this this, you

88:25

know, Chinese kid. I was super skinny.

88:30

And he he was already kind of elder.

88:34

And uh I went to him and I said I said,

88:37

"Listen,

88:38

I've got some bad news for you." And and

88:42

first, the technology that we promised

88:45

you doesn't work.

88:50

And second,

88:55

we shouldn't finish your contract

88:57

because we'd waste all your money and

89:01

you would have something that doesn't

89:02

work. And I recommend you find another

89:05

partner to build your game console.

89:06

>> Whoa.

89:08

>> And so I'm terribly sorry that we've set

89:10

you back in your product roadmap.

89:15

And third,

89:19

even though you're going to I'm asking

89:20

you to let me out of the contract,

89:24

I still need the money

89:27

because if you didn't give me the money,

89:29

we'd vaporize overnight.

89:34

And so

89:37

I explained it to him humbly, honestly.

89:40

I gave him the background.

89:43

explain to him why the technology

89:45

doesn't work, why we thought it was

89:47

going to work, why it doesn't work.

89:51

And um and I asked him

89:55

to uh

89:58

convert the last $5 million that they

90:02

were to complete the contract to give us

90:05

that money as an investment

90:09

instead.

90:11

and he said,

90:14

"But it's very likely your company will

90:16

go out of business, even with my

90:18

investment."

90:21

And it was completely true. Back then,

90:24

1995, $5 million was a lot of money.

90:27

It's a lot of money today. $5 million

90:29

was a lot of money. And here's a pile of

90:31

competitors doing it right. What are the

90:34

chances that giving Nvidia $5 million

90:38

that we would develop the right strategy

90:40

that he would get a return on that $5

90:41

million or even get it back? 0%.

90:45

You do the math. It's 0%.

90:49

If I were sitting there right there, I

90:50

wouldn't have done it.

90:53

$5 million was a mountain of money to

90:54

Sega at the time.

90:57

And so

90:59

I told him that that that um

91:03

uh

91:05

if you invested that $5 million in us,

91:08

it is most likely to be lost.

91:12

But if you didn't invest that money,

91:14

we'd be out of business and we would

91:16

have no chance.

91:18

And I I told him that I

91:24

I don't even know exactly what I said in

91:26

the end, but I

91:30

told him that I would understand if he

91:33

decided not to, but it would mean the

91:35

world to me if he did. He went off and

91:38

thought about it for a couple days and

91:39

came back and said, "We'll do it."

91:41

>> Wow.

91:44

strategy to how to correct what it was

91:47

doing wrong. Did you explain that to

91:49

him?

91:49

>> Wait, oh man, wait until I tell you the

91:50

rest of it's scarier. Even scarier.

91:54

>> Oh no. [laughter]

91:57

>> And so so um

92:00

so what he what he decided was was uh

92:07

Jensen was a young man he liked. That's

92:09

it.

92:10

>> Wow. to this day.

92:13

>> That's nuts.

92:14

>> I was

92:15

>> Boy, do you owe what the world owes that

92:17

guy.

92:18

>> No doubt,

92:20

>> right?

92:21

>> Well, he's celebrated today

92:23

in Japan.

92:25

>> And if he would have kept that five

92:28

>> the the investment, I think it'd be

92:30

worth probably about a trillion dollars

92:32

today.

92:36

I know. But the moment we went public,

92:39

they sold it. They go, "Wow, that's a

92:40

miracle." So, [laughter]

92:42

>> wow.

92:42

>> They sold it. Yeah. They sold it at

92:45

Nvidia valuation about 300 million.

92:48

That's our IPO valuation. 300 million.

92:51

>> Wow.

92:53

>> And so, so anyhow,

92:56

I was incredibly grateful. Um,

93:00

and then now we had to figure out what

93:02

to do because we still were doing the

93:04

wrong strategy, wrong technology. So

93:07

unfortunately we had to lay off most of

93:09

the company. We shrunk the company all

93:10

back. All the people working on the game

93:13

console, you know, we had to shrink it

93:14

all. Shrink it all back.

93:17

And um

93:20

and then and then somebody told me that,

93:23

but Jensen,

93:25

we've never built it this way before.

93:27

We've never built it the right way

93:28

before.

93:31

We've only know how to build it the

93:32

wrong way.

93:34

And so nobody in the company knew how to

93:37

build this

93:41

supercomputing image generator 3D

93:43

graphics thing that Silicon Graphics

93:45

did. And so so uh I said, "Okay,

93:50

how hard can it be? You got all these 30

93:53

companies, you know, 50 companies doing

93:54

it. How hard can it be?" And so luckily

94:00

there was a textbook written by the

94:01

company Silicon Graphics.

94:04

And so I went down to the store. I had

94:06

200 bucks in my pocket. And I bought

94:09

three textbooks, the only three they

94:11

had, $60 a piece. I bought the three

94:14

textbooks. I brought it back and I gave

94:17

one to each one of the architects and I

94:18

said, "Read that and let's go save the

94:20

company."

94:21

>> [laughter]

94:23

>> And so

94:25

[gasps and sighs] so they they they read

94:27

this textbook, learned from the giant at

94:31

the time, Silicon Graphics, about how to

94:33

do 3D graphics. But the thing that was

94:36

amazing and what makes Nvidia special

94:39

today is that

94:42

the people that are there are able to

94:44

start from first principles,

94:47

learn best known art, but reimplement it

94:51

in a way that's never been done before.

94:55

And so when we re-imagined

94:58

the technology of 3D graphics, we

95:01

reimagined it in a way that manifests

95:04

today as modern 3D graphics. We really

95:07

invented modern 3D graphics, but we

95:10

learned from previous known arts and we

95:13

implement it fundamentally differently.

95:16

>> What did you do that changed it?

95:18

Well, you know, the ultimately

95:21

ultimately the um uh the simple the

95:24

simple answer is that the way silicon

95:27

graphics works uh the geometry engine is

95:31

a bunch of software running on

95:32

processors.

95:34

We took that and

95:40

eliminated all the generality,

95:43

the general purposeness of it and we

95:46

reduced it down into the most essential

95:48

part of 3D graphics

95:51

and we hardcoded it into the chip. And

95:55

so instead of something general purpose,

95:56

we hardcoded it very specifically into

96:00

just the limited applications,

96:03

limited functionality necessary for

96:06

video games.

96:08

And that capability that super and and

96:11

because we reinvented a whole bunch of

96:13

stuff, it supercharged the capability of

96:15

that one little chip. And our one little

96:18

chip was generating images as fast as a

96:23

$1 million image generator. That was the

96:26

big breakthrough. We took a million

96:27

dollar thing and we put it into the

96:30

graphics card that you now put into your

96:32

gaming PC. And that was [snorts] our big

96:35

invention.

96:36

And then and of course the question is

96:38

is um

96:41

uh how do you compete against these 30

96:43

other companies doing what they were

96:44

doing?

96:46

and and there we did we did several

96:49

things. One

96:51

uh instead of building a 3D graphics

96:56

chip for every 3D graphics application,

96:59

we decided to build a 3D graphics chip

97:01

for one application. We bet the farm on

97:04

video games.

97:06

The needs of video games are very

97:08

different than needs for CAD, needs for

97:10

flight simulators. They're related, but

97:12

not the same. And so we narrowly focused

97:15

our problem statement so I could reject

97:17

all of the other complexities and we

97:19

shrunk it down into this one little

97:21

focus and then we supercharged it for

97:24

gamers. And then the second thing that

97:26

we did was we created a whole ecosystem

97:30

of working with game developers and

97:32

getting their their games ported and

97:34

adapted to our silicon so that we could

97:37

get turn essentially what is a

97:40

technology business into a platform

97:43

business into a game platform business.

97:45

So we, you know, GeForce is really today

97:48

it's also the most advanced 3D graphics

97:50

technology in the world, but a long time

97:53

ago GeForce is really the game console

97:55

inside your PC. It's, you know, it runs

97:59

Windows, it runs Excel, it runs

98:00

PowerPoint, of course, those are easy

98:02

things, but its fundamental purpose was

98:06

simply to turn your PC into a game

98:08

console. So we we were the first

98:10

technology company to build all of this

98:14

incredible technology in service of one

98:16

audience gamers. Now of course in 1993

98:21

the gaming industry didn't exist. But by

98:24

the time that John Carmack came along

98:26

and the doom phenomenon happened and

98:29

then quake came out as you know

98:34

that entire world oh that entire

98:36

community boom took off. Do you know

98:38

where the name Doom came from?

98:40

>> It came from this se there's a scene in

98:42

the movie The Color of Money where Tom

98:45

Cruise, who's this uh elite pool player

98:47

shows up at this pool hall and this

98:49

local hustler says what he got in the

98:51

case and he opens up this case. He has a

98:53

special pool cue. He goes in here and

98:55

he opens it up. He goes, "Doom.

98:57

>> Doom." [laughter]

98:58

>> And that's where it came from. Yeah. Cuz

99:00

Carmack said that's what they wanted

99:01

to do to the gaming industry.

99:02

>> Doom.

99:03

>> That when Doom came out, it would just

99:05

be everybody be like, "Oh, we're

99:06

fucked."

99:07

>> Oh, wow.

99:07

>> This is Doom.

99:08

>> That's awesome.

99:09

>> Isn't that amazing? That's amazing.

99:10

>> Cuz it's the perfect name for the game.

99:12

>> Yeah.

99:12

>> And the name came out of that scene in

99:14

that movie.

99:14

>> That's right. Well, and then of course,

99:17

uh, Tim Sweeney and

99:20

>> Epic Games and, uh, and the 3D gaming

99:24

genre took off.

99:24

>> Yes.

99:25

>> And so, if you just kind of in the

99:28

beginning was no gaming industry. We had

99:31

no choice but to focus the company on

99:33

one thing. That one thing,

99:35

>> it's a really incredible origin story.

99:37

>> Oh, it's it's amazing. Like you must be

99:40

like look back

99:41

>> a disaster is what

99:42

>> a $5 [laughter] million that pivot with

99:44

that conversation with that gentleman if

99:46

he did not agree to that if he did not

99:48

like you what would the world look like

99:50

today that's crazy then then our entire

99:54

life hung on another gentleman

99:58

and so so now here we are we built so

100:01

before GeForce it was RIVA 128. RIVA 128

100:05

saved the company. It revolutionized

100:07

computer graphics

100:09

The performance cost performance ratio

100:12

of 3D graphics for gaming was off the

100:14

charts amazing.

100:17

And

100:21

we're getting ready to to ship it. Get

100:24

well, we're we're building it, but we're

100:27

so as you know, $5 million doesn't last

100:30

long. And so every single month, every

100:33

single month, uh we were drawing down

100:39

You have to build it, prototype it. You

100:42

have to design it, prototype it,

100:45

get the silicon back,

100:48

which costs a lot of money. Test it with

100:51

software

100:53

because without the software testing the

100:55

chip, you don't know the chip works.

100:58

And then you're going to find a bug

101:00

probably

101:01

because every time you test something

101:03

you find bugs,

101:06

which means you have to tape it out

101:07

again, which is more time, more money.

101:11

And so we did the math. There was no

101:13

chance anybody was going to survive it.

101:15

We didn't have that much time to tape

101:17

out a chip, send it to a foundry TSMC,

101:20

get the silicon back, test it, send it

101:23

back out again. There was no no shot, no

101:25

hope.

101:27

And so the math, the spreadsheet doesn't

101:31

allow us to do that. And so I heard

101:34

about this company and this company

101:37

built this machine.

101:40

And this machine is an emulator.

101:43

You could take your design, all of the

101:48

software that describes the chip, and

101:52

you could put it into this machine. And

101:54

this machine will pretend it's our chip.

101:57

So I don't have to send it to the fab,

101:59

wait until the fab sends it back, test.

102:01

I could have this machine pretend it's

102:03

our chip and I could put all of the

102:06

software on top of this machine called

102:07

an emulator and test all of the software

102:12

on this pretend chip and I could fix it

102:15

all before I send it to the fab.

102:18

>> Whoa. And if and and if I could do that

102:21

when I send it to the FAB, it should

102:23

work.

102:24

Nobody knows, but it should work. And so

102:27

we came to the conclusion

102:30

that let's take half of the money we had

102:33

left in the bank. At the time it was

102:35

about a million dollars. Take half of

102:38

that money and go buy this machine.

102:42

So instead of keeping the money to stay

102:44

alive, I took half of the money to go

102:46

buy this machine. Well, I call this guy

102:48

up. This company's called IKOS.

102:52

Call this company up and I say, "Hey,

102:54

listen. I heard about this machine.

102:57

I like to buy one."

102:59

And they go, "H,

103:02

that's terrific, but we're out of

103:03

business." I said, "What? You're out of

103:06

business?" He goes, "Yeah, we had no

103:08

customers."

103:10

[laughter]

103:12

I said, "Wait, hang on a sec. So, you

103:14

never made the machine?" They [snorts]

103:16

can say, "No, no, no. We made the

103:17

machine. We have one in inventory if you

103:20

want it, but we're out of business." So,

103:22

I bought one out of inventory.

103:26

Okay. After I bought it, they went out

103:28

of business.

103:29

>> Wow.

103:30

>> I bought it out of inventory.

103:32

And on this machine, we put Nvidia's

103:36

chip into it and we tested all of the

103:39

software on top.

103:41

And at this point, we were on fumes.

103:45

But we convinced ourselves that chip is

103:47

going to be great.

103:49

And so I had to call some other

103:50

gentleman. So I called TSMC.

103:54

And I told TSMC

103:56

that listen, TSMC is the world's largest

103:58

foundry today. At the time they were

104:01

just a few hundred million dollars

104:02

large,

104:06

tiny little company.

104:12

And I explained to them what we were

104:13

doing. And um I explained to him I told

104:18

him I had a lot of customers. I had one,

104:20

you know, Diamond Multimedia,

104:23

probably one of the companies you bought

104:24

the graphics card from back in the old

104:26

days. And I I said, you know, we have a

104:29

lot of customers, and the demand's

104:30

really great, and

104:33

we're going to tape out a chip to you,

104:37

and I like to go directly to production

104:41

because I know it works.

104:44

>> [snorts]

104:45

>> And they said, "Nobody has ever done

104:47

that before.

104:50

Nobody has ever taped out a chip that

104:52

worked the first time.

104:54

And nobody starts out production without

104:56

looking at it."

104:58

But I knew that if I didn't start the

105:00

production, I'd be out of business

105:02

anyways. And if I could start the

105:05

production, I might have a chance.

105:08

And so

105:10

TSMC

105:12

decided to support me and uh this

105:15

gentleman is named Morris Chang. Morris

105:17

Chang is the father of the foundry

105:20

industry, the founder of TSMC. Really

105:23

great man.

105:27

He decided to support our company. I

105:30

explained to them everything.

105:32

he decided to support us frankly

105:34

probably because they didn't have that

105:36

many other customers anyhow but they

105:39

were grateful and I was immensely

105:41

grateful and as we were starting the

105:44

production

105:46

Morris flew to United States and uh

105:50

he didn't in so many words ask me, but

105:53

he asked me a whole lot of questions

105:55

that was trying to tease out do I have

105:58

any money

106:00

but he didn't directly ask me that you

106:02

know and so the truth is that we didn't

106:06

have all the money but we had a strong

106:09

PO from the customer and if it didn't

106:14

work some wafers would have been lost

106:16

and I'm you know I I'm not exactly sure

106:19

what would have happened but we would

106:20

have come short it would have been it

106:23

would have been rough but they supported

106:25

us with all of that risk involved we

106:28

launched this chip. It turned out to have

106:32

been completely revolutionary.

106:35

Knocked the ball out of the park. We

106:37

became the fastest growing technology

106:40

company in history to go from zero to $1

106:43

billion.

106:44

>> So wild that you didn't test the chip.

106:46

>> I know. We tested afterwards. Yeah, we

106:49

tested afterwards.

106:50

>> Afterwards, but [laughter]

106:52

production already. But by the way, by

106:55

the way, that methodology that we

106:58

developed to save the company is used

107:01

throughout the world today.

107:02

>> That's amazing.

107:03

>> Yeah, we changed we changed the whole

107:05

world's methodology of designing chips.

107:07

The whole world's uh rhythm of designing

107:10

chips. Uh we changed everything.

107:13

>> How well did you sleep those days? It

107:16

must have been so much stress,

107:18

[laughter]

107:19

>> you know. Um,

107:24

what is that feeling where where uh the

107:27

world just kind of feels like it's

107:29

flying? It you you have this what do you

107:33

call that feeling? You can't you can't

107:36

stop the the feeling that everything's

107:38

moving super fast and you know and

107:41

you're laying in your laying in bed and

107:44

the world just feels like you know it

107:47

you and you're you you feel deeply

107:50

anxious

107:51

uh completely out of control. Um

107:56

I've felt that probably a couple of

107:57

times [laughter] in my life. It's during

108:01

that time.

108:02

>> Wow.

108:03

>> Yeah. It it was incredible.

108:05

>> What an incredible success story.

108:06

>> But I I learned I learned a lot. I

108:08

learned I learned about I learned

108:09

several things. I learned I learned uh

108:11

how to develop strategies.

108:14

Um I learned how to

108:16

uh uh and when I when I you know our

108:19

company learned how to develop

108:20

strategies. What are winning strategies?

108:22

We learned how to create a market. We

108:24

created the modern 3D gaming market.

108:28

We learned how and and so that exact

108:31

same skill is how we created the modern

108:34

AI market. It's exactly the same.

108:37

>> Wow.

108:37

>> Yeah. Exactly the same skill. Exactly

108:40

the same blueprint. And

108:43

uh we learned how to uh deal with

108:45

crisis, how to stay calm, how to think

108:49

through things systematically.

108:52

We learned how to remove all waste in

108:56

the company and work from first

108:58

principles and doing only the things

109:00

that are essential. Everything else is

109:02

waste because we have no money for it

109:05

to live on fumes at all times. And the

109:09

feeling

109:12

no different than the feeling I had this

109:13

morning when I woke up that you're going

109:16

to be out of business soon. that you're

109:19

you know the phrase 30 days from going

109:21

out of business I've used for 33 years

109:24

because

109:25

>> you still feel that.

109:25

>> Oh yeah. Oh yeah. Every morning. Every

109:27

morning.

109:28

>> But but you guys are one of the biggest

109:30

companies on planet earth. But the the

109:32

feeling doesn't change.

109:33

>> Wow.

109:34

>> The the sense of vulnerability, the

109:36

sense of uncertainty, the sense of

109:39

insecurity. Uh

109:42

it it doesn't leave you.

109:44

>> That's crazy. We were, you know, we had

109:46

nothing. We had nothing. We were dealing

109:49

with giants.

109:50

>> Oh, yeah. Oh, yeah. Every day, every

109:52

moment.

109:52

>> Do you think that fuels you? Is that

109:54

part of the reason why the company's so

109:56

successful? That you have that hungry

109:59

mentality,

110:04

that you never rest, you're never

110:06

sitting on your laurels, you're always

110:08

on the edge.

110:12

I have a greater drive from not wanting

110:17

to fail

110:19

than the drive of wanting to succeed.

110:22

[laughter]

110:25

>> Isn't that like success coaches would tell

110:27

you that's completely the wrong

110:29

psychology?

110:29

>> The world has just heard me say that

110:31

out loud for the first time.

110:33

>> But but it's true.

110:35

>> Well, that's fascinating. Fear of

110:36

failure drives me more than

110:41

the greed or whatever it is.

110:43

>> Well, ultimately that's probably a more

110:45

healthy approach now that I'm thinking

110:48

about it because like the fear

110:50

>> I'm not ambitious for example,

110:52

[laughter] you know, I just want to stay

110:54

alive, Joe. I want the company to

110:57

thrive, you know, I want us to make an

110:59

impact.

110:59

>> That's interesting.

111:00

>> Yeah.

111:01

>> Well, maybe that's why you're so humble.

111:03

That's what maybe that's what keeps you

111:04

grounded, you know, because with the

111:07

kind of spectacular success the

111:08

company's achieved, it would be easy to

111:10

get a big head.

111:11

>> No.

111:12

>> Right. But isn't that interesting? It's

111:13

like the if you were the guy that your

111:16

main focus is just success. You probably

111:20

would go, "Well, made it. Nailed it. I'm

111:23

the [laughter] man.

111:24

>> Drop the mic."

111:25

>> Instead, you wake up, you're like, "God,

111:27

we can't [ __ ] this up."

111:28

>> No. Exactly. Every morning. Every

111:30

morning. No. Every moment. Yeah. That's

111:32

crazy.

111:33

>> Before I go to bed.

111:34

>> Well, listen. If I was a major investor

111:36

in your company, that's what I'd want

111:37

running it. I'd want a guy who's

111:41

>> Yeah.

111:42

>> That's what I work. That's why I work

111:43

seven days a week. Every moment I'm I'm

111:46

awake.

111:47

>> You work every moment.

111:48

>> Every moment I'm awake.

111:50

>> Wow.

111:51

>> I'm thinking about solving a problem.

111:53

I'm thinking about

111:55

>> How long can you keep this up?

111:57

>> I don't know. But so [laughter]

111:59

could be next week. >> Sounds exhausting.

112:02

>> It is exhausting.

112:03

>> It sounds completely exhausting.

112:04

>> Always in a state of anxiety.

112:07

>> Wow.

112:08

>> Always in a state of anxiety.

112:10

>> Wow. Kudos to you for admitting that. I

112:12

think that's important for a lot of

112:13

people to hear because, you know,

112:15

there's probably some young people out

112:17

there that are in a similar position to

112:21

where you were when you were starting

112:22

out that just feel like, oh, those

112:25

people that have made it, they're just

112:27

smarter than me and they had more

112:29

opportunities than me and it's just like

112:31

it was handed to them or they're just in

112:33

the right place at the right time. And

112:36

>> Joe, I just described to you somebody

112:37

who didn't know what was going on,

112:39

[laughter]

112:39

actually did it wrong.

112:41

>> Yeah. Yeah. And the ultimate diving

112:43

catch like two or three times.

112:45

>> Crazy.

112:46

>> Yeah.

112:47

>> The ultimate diving catch is the perfect

112:48

way to put it.

112:50

>> You know, it's just like the edge of

112:51

your glove. [laughter]

112:53

>> It probably bounced off of somebody's

112:55

helmet and landed at the edge.

112:58

[laughter]

113:00

>> God, that's incredible. That's

113:02

incredible. But it's also it's really

113:04

cool that you have this perspective that

113:06

you look at it that way because you know

113:10

a lot of people that have delusions of

113:12

grandeur or they have you know

113:15

>> and their rewriting of history

113:19

oftentimes had them somehow

113:23

extraordinarily smart and they were

113:25

geniuses and they knew all along and

113:27

they were they were spot-on. And the

113:29

business plan was exactly what they

113:30

thought. And

113:31

>> yeah,

113:31

>> they destroyed the competition and you

113:34

know and they emerged victorious.

113:36

[laughter]

113:39

>> Meanwhile, you're like, I'm scared every

113:40

day.

113:41

>> Exactly. [laughter]

113:43

Exactly.

113:44

>> That's so funny. Oh my god, that's

113:47

amazing.

113:47

>> It's so true, though.

113:48

>> It's amazing.

113:49

>> It's so true.

113:50

>> It's amazing. Well, but I I think

113:52

there's nothing inconsistent

113:55

with being a leader and being

113:56

vulnerable. You know, I the company

114:00

doesn't need me to be a genius right all

114:03

along, right?

114:06

Absolutely certain about what I'm trying

114:08

to do and what I'm doing. The the

114:10

company doesn't need that. The company

114:12

wants me to succeed. You know, the thing

114:14

that and we started out today talking

114:17

about President Trump and I was about to

114:19

say something and listen, he is my

114:23

president. He is our president. We

114:25

should all and we're talking about just

114:27

because it's President Trump, we all

114:29

want him to be wrong. I think that

114:31

United States, we all have to realize he

114:34

is our president, we want him to succeed

114:37

because

114:37

>> no matter who's president attitude.

114:40

>> That's right.

114:41

>> We want him to succeed. We need to help

114:43

him succeed because it helps everybody,

114:45

all of us succeed.

114:48

And

114:51

I'm lucky that I work in a company where

114:54

I have 40,000 people who wants me to

114:56

succeed.

114:58

They want me to succeed and I can tell

115:00

and they're all working every single day to help

115:02

me overcome these challenges

115:05

trying to realize

115:07

realize what I describe to be our

115:09

strategy doing their best. And if it's

115:12

somehow

115:14

wrong or not perfectly right to tell me

115:17

so that we could pivot and the more

115:20

vulnerable we are as a leader the more

115:23

able other people are able to tell you

115:26

you know that Jensen that's not exactly

115:27

right or

115:28

>> right right

115:29

>> have you considered this information or

115:31

and the more vulnerable we are

115:35

the more able we're actually able to

115:37

pivot if we put ourselves into this

115:39

superhuman capability then it's hard for

115:41

us to pivot strategy,

115:42

>> right?

115:43

>> Because we were supposed to be right all

115:44

along.

115:45

>> And so if you're always right, how can

115:47

you possibly pivot? Because pivoting

115:49

requires you to be wrong. And so I've

115:51

got no trouble with being wrong. I just

115:54

have to make sure that I stay alert,

115:57

that I reason about things from first

115:58

principles all the time. Always break

116:01

things down to first principles.

116:03

Understand why it's happening.

116:05

Reassess continuously. The reassessing

116:09

continuously is kind of partly what

116:11

causes continuous anxiety,

116:14

>> you know, because you're asking

116:15

yourself, were you wrong yesterday? Are

116:17

you still right? Is this the same? Has

116:20

that changed? Has that condition is that

116:22

worse than you thought?

116:23

>> But God, that mindset is perfect for

116:25

your business, though, because this

116:27

business is ever changing

116:29

>> all the time. I've got competition

116:30

coming from every direction. So much of

116:33

it is kind of up in the air

116:36

and you have to invent a future where

116:41

a 100 variables are included and there's

116:44

no way you could be right on all of

116:45

them. And so you have to be

116:47

>> you have to surf.

116:49

>> Wow. That's a good way to put it. You

116:51

have to surf. Yeah. You're surfing waves

116:54

of technology and innovation.

116:55

>> That's right. You can't predict the

116:57

waves. You got to deal with the ones you

116:59

have.

116:59

>> Wow. And but skill matters and I've been

117:04

doing this for 30 I'm the longest

117:05

running tech CEO in the world.

117:07

>> Is that true? Congratulations. That's

117:08

amazing.

117:10

>> And you know people ask me how. One is

117:12

don't get fired. [laughter]

117:16

That'll cut it short in a heartbeat.

117:19

And then two don't get bored.

117:22

>> Yeah.

117:22

>> Well, how do you maintain your

117:23

enthusiasm?

117:28

Well, the honest truth is it's not always

117:30

enthusiasm. It's, you know, sometimes is

117:32

enthusiasm. Sometimes it's just good

117:35

oldfashioned fear and then sometimes,

117:37

you know, a healthy dose of frustration,

117:40

you know, it's whatever keeps you

117:42

moving.

117:43

>> Yeah. Just all the emotions. I think,

117:45

you know,

117:46

>> CEOs, we have all the emotions, right?

117:49

you know, and so probably probably

117:53

jacked up to the maximum because you're

117:55

you're kind of feeling it on behalf of

117:57

the whole company. I'm feeling it on

117:59

behalf of everybody at the same time.

118:02

And it kind of, you know, encapsulates

118:04

into into somebody. And so I have to be

118:08

mindful of the past. I have to be

118:09

mindful of the present. I've got to be

118:11

mindful of the future. And um you know,

118:14

it can't it's not without emotion.

118:18

It's not just it's it's not just a job.

118:20

Let's just put it that way.

118:23

>> It doesn't seem like it at all. I would

118:25

imagine one of the more difficult

118:26

aspects of your job currently now that

118:29

the company is massively successful is

118:31

anticipating where technology is headed

118:34

and where the applications are going to

118:36

be.

118:36

>> Yeah.

118:37

>> So, how do you try to map that out?

118:40

>> Yeah. there there um there there's a

118:43

whole bunch of ways and and it takes it

118:46

takes um

118:50

takes a whole bunch of things but let me

118:51

just start

118:53

uh you have to be surrounded by amazing

118:55

people and Nvidia is now you know if you

118:59

look at look at look at um the large

119:02

tech companies in the world today

119:05

most of them have a business in

119:08

advertising or social media or you know

119:12

content distribution and at the core of

119:15

it is really fundamental computer

119:18

science and so the company's business is

119:22

not computers the company's business is

119:25

not technology. Technology drives the

119:27

company. It is the only company in the world

119:29

that's large whose only business is

119:32

technology. We only build technology. We don't

119:34

advertise. The only way that we make

119:36

money is to create amazing technology

119:39

and sell

119:40

And so

119:43

to be that to be NVIDIA today, you're

119:46

the number one thing is you're

119:48

surrounded by the finest computer

119:49

scientists in the world. And that's my

119:52

gift. My gift is that we've created a

119:55

company's culture,

119:57

a condition by which the world's

120:00

greatest computer scientists want to be

120:01

part of it because they get to do their

120:04

life's work and create the next thing

120:06

because that's what they want to do.

120:08

because maybe they're not they don't

120:11

want to be in service of another

120:12

business.

120:13

>> They want to be in service of the

120:15

technology itself. And we're the largest

120:16

firm of its kind in the history of the

120:18

world.

120:19

>> Wow.

120:20

>> I know. It's pretty amazing.

120:21

>> Wow.

120:22

>> And so so one, you know, we have a we we

120:26

have got a great condition. We have a

120:27

great culture. We have great people. And

120:30

then now now now the question is how do

120:32

you systematically

120:36

um

120:37

be able to see the future stay alert of

120:42

it

120:44

and uh reduce the

120:48

likelihood of missing something or being

120:51

wrong.

120:53

And so there's a lot of different ways

120:54

you could do that. For example, we have

120:56

great partnerships. We we have

120:57

fundamental research. We have a great

120:59

research lab, one of the largest

121:00

industrial research labs in the world

121:02

today. And we partner with a whole bunch

121:04

of universities and other scientists. We

121:07

do a lot of open collaboration

121:09

and so I'm constantly working with

121:12

researchers outside the company.

121:15

We have the benefit of having amazing

121:17

customers and so I have the benefit of

121:20

working with Elon and you know and

121:22

others in the industry and we have the

121:25

benefit of being the only pure-play

121:28

technology company that can serve uh

121:31

consumer internet

121:33

industrial manufacturing

121:36

um scientific computing healthcare

121:39

financial services all the industries

121:42

that we're in. They're all signals to

121:44

me. And so they all have mathematicians

121:48

and scientists and and so because I I

121:51

have the benefit now of a radar system

121:54

>> that is the most broad of any company in

121:56

the world working across every single

121:59

industry from agriculture to energy

122:03

to video games.

122:05

And so the ability for us to have this

122:08

vantage point,

122:10

one doing fundamental research ourselves

122:13

and then two working with all the great

122:15

researchers, working with all the great

122:17

industries, the feedback system is

122:19

incredible. And then finally,

122:22

you just have to have a culture of

122:23

staying super alert. There's no easy way

122:27

of being alert except for paying

122:29

attention.

122:31

I haven't found a single way of being

122:33

able to stay alert without paying

122:34

attention. And so, you know, I probably

122:37

read several thousand emails a day.

122:42

>> How do you have time for that?

122:44

>> I wake up early. This morning I was up

122:46

at 4:00.

122:47

>> How much do you sleep?

122:49

>> Uh, six, seven, six, seven hours.

122:53

>> Yeah.

122:54

>> And then you're up at 4 reading emails

122:56

for a few hours before you get going.

122:58

>> That's right. Yeah.

122:58

>> Wow.

123:00

Every day.

123:01

>> Every single day. Not one day missed.

123:03

[sighs] including

123:06

Thanksgiving, Christmas.

123:06

>> Do you ever take a vacation?

123:09

>> Uh, yeah. But they're um my definition

123:13

of a vacation is when I'm with my

123:14

family. And so if I'm with my family,

123:17

I'm very happy. I don't care where we

123:19

are.

123:19

>> And you don't work then or do you work a

123:21

little?

123:21

>> No. No. I work a lot. [laughter]

123:24

>> Even like if you go on a trip somewhere,

123:27

you're still working.

123:28

>> Oh, sure. Oh, sure.

123:29

>> Wow. Every day.

123:30

>> Every day.

123:32

>> But my kids work every You make me tired

123:34

just saying this.

123:34

>> My kids work every day.

123:38

Both of my kids work at Nvidia. They

123:39

work every day.

123:40

>> Wow.

123:40

>> Yeah. I'm very lucky.

123:42

>> Wow.

123:43

>> Yeah. It's brutal now because, you know,

123:45

it used to be just me working every day. Now we

123:47

have three people working every day and

123:49

they want to work with me every day and

123:51

so it's it's a lot of work.

123:54

>> Well, you've obviously imparted that

123:57

ethic into them.

123:58

>> They work incredibly hard. I mean,

123:59

there's no unbelievable.

124:01

>> But my parents work incredibly hard.

124:05

>> Yeah. I was I was born with the work

124:07

gene,

124:08

>> the suffering gene. [laughter]

124:10

>> Well, listen, man. It has paid off. What

124:14

a crazy story. It was just It's really

124:16

an amazing origin story.

124:19

It really I mean, it has to be kind of

124:21

surreal to be in the position that

124:22

you're in now when you look back at how

124:24

many times that it could have fallen

124:25

apart and humble beginnings. >> But Joe,

124:28

this is great. It's a great country. You

124:31

know, I'm an immigrant. My parents sent

124:34

my older brother and I here first.

124:37

We were in Thailand.

124:40

I was born in Taiwan, but my dad had a

124:43

job in Thailand. He was a chemical and

124:46

instrumentation engineer, incredible

124:48

engineer.

124:50

And his job was to go start an oil

124:52

refinery. And so we moved to Thailand,

124:54

lived in Bangkok.

124:56

And um in 19

125:00

I guess 1973 1974 time frame,

125:05

you know how Thailand every so often

125:06

they would just have a coup. You know,

125:08

the military would have an uprising and

125:12

all of a sudden one day there were tanks

125:14

and soldiers in the streets and my

125:15

parents thought, you know, it probably

125:17

isn't safe for the kids to be here. And

125:19

so they contacted my uncle. My uncle

125:22

lives in Tacoma, Washington.

125:25

and um we had never met him and my

125:28

parents sent us to him.

125:30

>> How old were you?

125:31

>> Uh I was about to turn nine and my older

125:35

brother uh almost turned 11 and so the

125:39

two of us came to United States and we

125:42

stayed with our uncle for a little

125:45

bit while he looked for a school for us

125:49

and my parents didn't have very much

125:50

money and they never been to United

125:53

States. my father was. I'll tell you

125:55

that story in a second.

125:57

And um

126:00

and so my my uncle found a school that

126:04

would accept foreign students and

126:08

was affordable enough for my parents.

126:11

And that school turned out to have been

126:13

in Oneida, Kentucky, Clark County,

126:17

Kentucky, the epicenter of the opioid

126:19

crisis today.

126:21

Coal country.

126:24

Clark County, Kentucky is

126:27

was the poorest county in America when I

126:31

showed up. It is the poorest county in

126:33

America today.

126:36

And so we went to the school, it's a

126:38

great school, um, Oneida Baptist

126:41

Institute

126:43

in a town of a few hundred. I think it

126:46

was 600 at the time that we showed up.

126:49

No traffic light.

126:51

And um I think it has 600 today. It's

126:54

quite an amazing feat actually.

126:58

The ability to hold your population for

127:01

[laughter]

127:02

when it's 600 people. It was quite a

127:04

magic quite a magical thing. however

127:06

they did it. And and so uh the school

127:11

had a mission of being an open school

127:16

for any children who would like to come.

127:20

And what that basically means is that if

127:23

you're a troubled student, if you have a

127:26

troubled family,

127:28

um if you're,

127:32

you know, whatever your background,

127:35

you're welcome to come to Oneida Baptist

127:38

Institute, including kids from

127:41

international who would like to stay

127:43

there.

127:44

>> Did you speak English at the time?

127:45

>> Uh, okay. Yeah. Yeah. Okay. Yeah. And so

127:50

we showed up

127:53

and uh

127:57

my first my first thought was gosh there

128:01

are a lot of cigarette butts on the

128:02

ground. 100% of the kids smoked.

128:05

[laughter]

128:09

So right away you know this is not a

128:10

normal school.

128:11

>> Nine-year-olds?

128:12

>> No, I was the youngest kid.

128:14

>> Okay. 11 year olds.

128:15

>> My roommate was 17 years old. Wow.

128:19

>> Yeah. He just turned 17. And he was

128:22

jacked

128:23

and and um

128:29

I don't know where he is now. I know his

128:31

name, but I don't know where he is now.

128:32

But anyways, uh that night we got and

128:36

and the second thing I noticed when you

128:37

walk into the into your dorm room

128:41

is uh there are no drawers and no closet

128:44

doors.

128:47

just like a prison.

128:50

And

128:52

there are no locks

128:55

so that people could check check up on

128:56

you.

128:58

And so I go into my room and he's 17 and

129:03

uh you know get ready for for bed and he

129:06

had all this tape

129:09

all over his body and uh turned out he

129:12

was in a knife fight and he's been

129:15

stabbed all over his body and these were

129:17

just fresh wounds.

129:19

>> Whoa. And the other kids were hurt much

129:22

worse.

129:24

And uh so he was my roommate, the

129:27

toughest kid in school, and I was the

129:29

youngest kid in school. It was a it was

129:32

a junior high,

129:34

but they took me anyways because if I

129:38

walked about a mile across the Kentucky

129:41

River, the swing bridge, the other side

129:45

is a middle school that I could go to

129:47

and then I can go to that school and I

129:50

come back and then I stay in the dorm.

129:53

And so basically Onita Baptist Institute

129:55

was my dorm when I went to this other

129:57

school. My older brother went

130:00

to um went to the junior high. And so we

130:03

were there for a couple of years. Um

130:05

every kid had every kid had chores.

130:09

My older brother's chore was to work in

130:11

the tobacco farm, you know. So

130:14

they raised tobacco so that they could

130:15

raise some extra money for the school.

130:18

Kind of like a penitentiary.

130:19

>> Wow. And my job was just to clean the

130:22

dorm. And so I I was 9 years old. I was

130:26

cleaning toilets. And for a dorm of 100

130:29

boys, I

130:33

I clean more bathrooms than anybody. And

130:35

I just wish that everybody was a little

130:37

bit more careful, you know. [laughter]

130:42

But anyways, I was the youngest kid in

130:43

school. The my memories of it was really

130:46

good. Um, but it was a pretty tough It

130:49

was a tough town.

130:50

>> Sounds like it.

130:51

>> Yeah. Town kids, they all carried

130:53

Everybody had knives.

130:55

>> Everybody had knives. Everybody smoked.

130:58

Everybody had a Zippo lighter. I smoked

131:00

for a week.

131:01

>> Did you?

131:02

>> Oh, yeah. Sure.

131:02

>> How old were you?

131:03

>> I was nine. Yeah.

131:04

>> When you nine? You were nine, you tried

131:05

smoking.

131:06

>> Yeah. I got myself a pack of cigarettes.

131:08

Everybody else did.

131:09

>> Did you get sick? No. I I got used to

131:11

it, you know, and I learned how to blow

131:14

blow smoke rings and, you know, [snorts]

131:19

you know, breathe out of my nose, you

131:20

know, take it in out of through my nose.

131:22

I mean, there was a all the different

131:24

things that you learned. Yeah.

131:26

>> At nine.

131:27

>> Yeah.

131:27

>> Wow. You just did it to fit in or it

131:29

looked cool.

131:30

>> Yeah. [clears throat] Because everybody

131:30

else did it,

131:31

>> right?

131:31

>> Yeah. And and then I did it for a couple

131:34

weeks, I guess. And I just rather have I

131:38

had a quarter, you know, I had a quarter

131:41

a month or something like that.

131:44

I just rather buy popsicles and fried

131:46

sickles with it. I was nine, you know,

131:48

[laughter]

131:48

>> right?

131:49

>> I chose I chose the the better path.

131:52

>> Wow.

131:53

>> That was our school. And then my parents

131:55

came to United States two years later

131:57

and um we met them in Tacoma, Washington.

132:01

>> That's wild. It It was a really crazy

132:04

experience. What a strange formative

132:07

experience.

132:08

>> Yeah. Tough kids.

132:10

>> Thailand to one of the poorest places in

132:14

America or if not the poorest

132:17

as a 9-year-old.

132:20

>> Yeah. It was my first experience with

132:21

your brother.

132:22

>> Wow.

132:23

>> Yeah. Yeah. No, I I remember and what

132:26

breaks my heart probably the only thing

132:28

that really breaks my heart

132:32

about that experience was

132:35

so

132:37

we didn't have enough money to make you

132:40

know international phone calls every

132:41

week and so my parents gave us this tape

132:44

deck, this Aiwa tape deck, and a tape

132:51

and so every month we would sit in front

132:54

of that tape deck, and my older

132:56

brother Jeff and I,

132:59

the two of us would just tell them what

133:01

we did the whole month.

133:04

>> Wow.

133:06

>> And we would send that tape by mail

133:09

and my parents would take that tape and

133:12

record back on top of it and send it

133:14

back to us.

133:17

>> Wow.

133:17

>> Could you imagine if for two years

133:20

>> Wow. is that tape still existed

133:23

of these two kids just describing their

133:25

first experience with United States.

133:28

Like I remember telling my parents

133:31

that that uh I joined the swim team and

133:37

uh

133:40

my roommate was really buff and so every

133:42

day we spent a lot of time in the in the

133:44

gym and so uh uh every night 100

133:48

push-ups, 100 sit-ups every day in the

133:50

gym. So, I was nine years old. I was

133:51

getting I was pretty buff

133:54

and I'm pretty fit. And uh

133:58

and so I joined the soccer team. I

134:00

joined the swim team because if you join

134:03

the team, they take you to meets and

134:06

then afterwards you get to go to a nice

134:08

restaurant. And that nice restaurant was

134:10

McDonald's.

134:11

>> Wow.

134:12

>> And and I recorded this thing. And I

134:15

said, "Mom and dad, we went to the most

134:17

amazing restaurant today.

134:20

This whole place is lit up. It's like

134:22

the future."

134:24

And [snorts] the food comes in a box

134:28

[laughter]

134:31

and the food is incredible. The

134:32

hamburger is incredible. It was

134:34

McDonald's. [snorts] But anyhow, it it

134:37

wouldn't it be amazing?

134:38

>> Oh my god. Two years recording. Yeah.

134:40

Two years. Yeah. What a crazy connection

134:44

to your parents, too. Just sending a

134:46

tape and them sending you one back and

134:48

it's the only way you're communicating

134:50

for two years.

134:51

>> Yeah. Wow. Yeah. No, I've My parents are

134:56

incredible actually. They're just

134:58

they're uh they grew up really poor and

135:01

um when they came to United States, they

135:03

had almost no money. Uh probably one of

135:06

the most

135:09

impactful memories I have is is uh we

135:12

they came and we were we were staying in

135:14

an apartment complex

135:21

and they had just rented, back in

135:23

the day, I guess people still do rent, a

135:26

bunch of furniture

135:29

and

135:31

we were messing around

135:36

and uh

135:38

we bumped into the coffee table and

135:40

crushed it. It's made out of particle

135:42

wood and we crushed it.

135:46

And I just still remember the look on

135:49

my mom's face, you know, because they

135:51

didn't have any money and she didn't

135:52

know how she was going to pay it back.

135:54

And but anyhow, that's that kind of

135:56

tells you how hard it was for them to

135:58

come here. They they left everything

136:00

behind and all they had was their

136:02

suitcase and the money they had in their

136:05

in their pocket and they came to United

136:07

States.

136:08

>> How old were they when they pursued the American

136:09

dream? >> They were in their 40s.

136:10

>> Wow.

136:11

>> Yeah. Late late 30s.

136:13

>> Pursued the American dream. This is this

136:15

is the American dream. I'm the first

136:17

generation of the American dream.

136:19

>> Wow.

136:20

>> Yeah. It's hard not to love this

136:21

country.

136:23

>> That's

136:23

>> it's it's hard not to be romantic about

136:25

this country.

136:26

>> That is a romantic story. That's an

136:28

amazing story.

136:29

>> Yeah. And and my dad found his job

136:32

literally in the newspaper,

136:34

you know, the ads and he calls people.

136:38

Got a job.

136:39

>> What did he do?

136:40

>> Uh he was a consulting engineer at

136:43

a consulting firm and they helped

136:45

people build oil refineries, paper mills

136:49

and fabs. And that's what he did. He was

136:51

an instrumentation engineer, really good

136:55

at factory design. And so he's

136:59

he's brilliant at that. And so he did

137:01

that and my mom uh worked as a maid and

137:05

uh they found a way to raise us.

137:08

>> Wow.

137:10

That's an incredible story, Jensen. It

137:12

really is. Every all of it from your

137:15

childhood to the perils of Nvidia almost

137:19

falling. [laughter]

137:21

It's really incredible, man.

137:22

>> It's a great story. Yeah. I I've lived a

137:25

great life.

137:26

>> You really have. And it's a great story

137:28

for other people to hear, too. It really

137:30

is.

137:30

>> You don't You don't have to go to Ivy

137:33

League schools to succeed.

137:36

This country creates opportunities. Has

137:38

opportunities for all of us. You do have

137:40

to strive.

137:43

You have to claw your way here.

137:45

>> Yeah.

137:46

>> But if you put in the work, you can

137:48

succeed.

137:49

Nobody works with

137:50

>> a lot of luck and a lot of

137:51

>> a lot and

137:52

>> good decision-making

137:53

>> and the good graces of others.

137:55

>> Yes, that's really important.

137:57

>> Yeah. You and I spoke about two two

137:59

people who are very dear to me. Um but

138:02

the list goes on. the people the people

138:05

at NVIDIA who have have uh helped me um

138:10

uh many friends that are on the board uh

138:13

the decisions you know them giving me

138:15

the opportunity like when we were

138:16

inventing this new computing approach

138:19

I tanked our stock price because we

138:22

added this thing called CUDA to the chip

138:24

we had this big idea we added this thing

138:26

called CUDA to the chip but nobody paid

138:28

for it but our cost doubled and so we

138:31

had this graphics chip company and we

138:34

invented GPUs, we invented programmable

138:37

shaders, we invented everything in modern

138:39

computer graphics,

138:42

we invented real-time ray tracing. That's

138:44

why it went from GTX to RTX.

138:48

We invented all this stuff, but every

138:50

time we invented something,

138:53

the market doesn't know how to

138:54

appreciate it, but the cost went way up.

138:56

And in the case of CUDA that enabled AI,

139:00

the cost increased a lot. But we

139:03

really believed it, you know

139:06

and so if you believe in that future and

139:09

you don't do anything about it you're

139:10

going to regret it for your life

139:13

and so we always you know I always tell

139:15

the team do you believe what do we

139:17

believe this or not and if you believe

139:19

it and so grounded on first principle is

139:21

not random you know hearsay and we

139:25

believe it we've got to we owe it to

139:26

ourselves to go pursue it if we're the

139:29

right people to go do it if it's really

139:31

really hard to do. It's worth doing and

139:33

we believe it. Let's go pursue it.

139:36

Well, we pursued it. We we launched the

139:38

product. Nobody knew. It was exactly

139:40

what like when I launched DGX1 and the

139:42

entire audience was like

139:45

complete silence. When I launched CUDA,

139:48

the audience was complete silence. No

139:52

customer wanted it. Nobody asked for it.

139:56

Nobody understood it. Nvidia was a

139:58

public company.

139:59

>> What year was this? This is uh

140:02

uh let's see 200

140:05

2006

140:07

20 years ago

140:10

2005.

140:12

>> Wow.

140:14

>> Our stock price just went

140:18

our valuation went down to like two or

140:21

three billion dollars

140:23

>> from

140:24

>> from about 12 or something like that.

140:28

I crushed it.

140:29

>> [laughter]

140:30

>> in a very bad way.

140:32

>> Yeah.

140:32

>> What is it now though?

140:34

>> H Yeah, it's higher. [laughter]

140:38

>> Very humble of you. [gasps]

140:40

>> It's higher. But it changed the world.

140:43

>> Yeah,

140:43

>> that invention changed the world.
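
[Aside for context, not part of the conversation: CUDA is NVIDIA's programming model for running general-purpose code on the GPU. As a rough, hypothetical sketch of what "adding CUDA to the chip" meant for developers, here is a minimal CUDA program; the kernel, sizes, and values are illustrative assumptions, not anything discussed in the episode.]

// A minimal CUDA sketch (illustrative only): scale-and-add two vectors,
// the kind of data-parallel math that GPUs were later applied to for AI workloads.
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one element of y = a*x + y.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                           // about a million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));        // memory visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // 4096 blocks of 256 threads
    cudaDeviceSynchronize();                         // wait for the GPU to finish

    printf("y[0] = %f\n", y[0]);                     // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The same one-thread-per-element pattern, scaled up, is what made these chips useful far beyond graphics.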

140:46

>> It's a It's an incredible story,

140:47

Jensen. It really is.

140:50

>> Thank you.

140:51

>> I like your story. It's incredible. Ah,

140:53

>> my story is not as incredible. My story

140:55

is more weird,

140:57

you know.

140:59

It's much more fortuitous and weird.

141:01

>> Okay. What are the three

141:04

most important milestones that led to

141:06

here?

141:10

>> That's a good question. Um,

141:12

>> what was step one?

141:13

>> I think step one was seeing other people

141:17

do it. Step one was in the initial days

141:20

of podcasting, like in 2009 when I

141:23

started, podcasting had only been around

141:25

for a couple of years. Um, the first was

141:28

Adam Curry, my good friend, who was the

141:31

podfather. He he invented podcasting.

141:34

And then, you know, um, I remember Adam

141:37

Carolla had a show because he had a

141:38

radio show. His radio show got cancelled

141:41

and so he decided to just do the same

141:42

show but do it on the internet. And that

141:44

was pretty revolutionary. Nobody was

141:45

doing that. And then there was the

141:47

experience that I had had doing

141:49

different morning radio shows like Opie

141:51

and Anthony in particular because it was

141:55

fun and we would just get together with

141:56

a bunch of comedians, you know, I'd be

141:59

on the show with like three or four

142:00

other guys that I knew and I always

142:02

just looked forward to it. It was

142:04

just such a good time and I said, "God,

142:07

I miss doing that. It's so fun to do

142:08

that. I wish I could do something like

142:09

that." And then I saw Tom Green setup.

142:12

Tom Green had a setup in his house and

142:14

he essentially turned his entire house

142:16

into a television studio and he did an

142:19

internet show from his living room. He

142:21

had servers in his house and cables

142:22

everywhere. Had to step over cables.

142:24

This is like 2007. I'm like, Tom, this

142:26

is nuts. Like this is

142:28

>> and I'm like you got to figure out a way

142:29

to make money from this. Like this

142:31

everybody I wish everybody in the

142:32

internet could see your setup. It's

142:34

nuts. I just want to let you guys know

142:35

that [laughter]

142:37

>> it's not just this.

142:39

>> Yeah. So that was the the beginning of

142:41

it is just seeing other people do it and

142:43

then saying all right let's just try it

142:44

and then so the beginning days we just

142:47

did it on a laptop had a laptop with a

142:49

webcam and just messed around had a

142:51

bunch of comedians come in we would just

142:53

talk and joke around and I did it like

142:56

once a week and then I started doing it

142:57

twice a week and then all of a sudden I was

143:00

doing it for a year and then I was doing

143:01

it for two years then it was like oh

143:03

it's starting to get a lot of viewers a

143:05

lot of listeners you know and then I

143:08

just kept doing It's all it is. I just

143:10

kept doing it because I enjoyed doing

143:12

it. >> Well, was there any setback?

143:15

>> No. No, there was never really a setback

143:17

really.

143:17

>> No,

143:18

>> it must have been. Or you kind of

143:19

>> You're just You're just resilient.

143:22

>> Or you're just tough.

143:23

>> No. No. No. No. It wasn't tough or hard.

143:26

It was just interesting. So, I just it

143:28

the the

143:29

>> You were never once punched in the face.

143:30

>> No, not in the show. No, not really. Not

143:32

Not doing the show.

143:33

>> You never did something that got

143:37

big blowback. Nope.

143:40

Not really. No, it all just kept

143:42

growing.

143:43

>> It kept growing and the thing stayed the

143:46

same from the beginning to now. And the

143:48

thing is, I enjoy talking to people.

143:50

I've always enjoyed talking to

143:51

interesting people.

143:52

>> I could even tell just when we walked

143:53

in, the way you interacted with

143:55

everybody, not just me.

143:57

>> Yeah, that's cool.

143:58

>> People are cool.

143:59

>> Yeah, that's cool. You know, I I it's a

144:02

an amazing gift to be able to have so

144:06

many conversations with so many

144:07

interesting people because it changes

144:09

the way you see the world because you

144:11

see the world through so many different

144:12

people's eyes and you have so many

144:15

different people have different

144:16

perspectives and different opinions and

144:18

different philosophies and different

144:20

life stories. And you know, it's an

144:24

incredibly enriching and educating

144:26

experience having so many conversations

144:30

with so many amazing people. And that's

144:33

all I started doing. And that's all I do

144:36

now. Even now, when I booked the show, I

144:39

do it on my phone. And I basically go

144:41

through this giant list of emails of all

144:44

the people that want to be on the show

144:46

or that request to be on the show. And

144:48

then I factor in another list that I

144:50

have of people that I would like to get

144:52

on the show that I'm interested in. And

144:53

I just map it out and that's it. And I

144:56

go, "Oh, I'd like to talk to him."

144:58

>> If it wasn't because of President Trump,

144:59

I wouldn't have been bumped up on that

145:00

list. [laughter]

145:01

>> No, I wanted to talk to you already. I I

145:04

just think, you know, what you're doing

145:06

is very fascinating. I mean, how would I

145:08

not want to talk to you? And then today,

145:09

it proved to be absolutely the right

145:11

decision.

145:12

>> Well, you know, listen, it's it's

145:14

strange to be an immigrant one day.

145:18

going to Oneida Baptist Institute

145:21

with with the students that were there

145:24

and then here

145:26

Nvidia's one of the most consequential

145:29

companies in the history of companies.

145:32

>> It is a crazy story.

145:34

>> It has to be that journey is is a and

145:37

it's very humbling and

145:39

>> and um I'm very grateful.

145:41

>> It's pretty amazing man.

145:42

>> Surrounded by amazing people. You're

145:44

very fortunate and you've also you seem

145:46

very happy and you seem like you're 100%

145:49

on the right path in this life. You

145:51

know,

145:51

>> you know, everybody says you must love

145:54

your job. Not every day. [laughter]

145:56

>> That's not that's part of the beauty of

145:59

everything is that there's ups and

146:00

downs. It's never just like this giant

146:02

dopamine high.

146:03

>> We leave we leave this impression here.

146:06

Here's here's an impression I don't

146:07

think is healthy. We we um people who

146:12

are successful leave the impression

146:13

often that that

146:16

our job gives us great joy. I think

146:19

largely it does

146:22

that in our jobs we're passionate about our

146:25

work.

146:27

Um and that passion relates to it's just

146:30

so much fun. I think it largely is, but

146:35

it distracts from the fact that a lot of

146:38

success comes from really really hard

146:42

work.

146:42

>> Yes,

146:44

>> there's long periods of suffering and

146:49

loneliness and uncertainty and fear and

146:54

embarrassment and humiliation, all of

146:57

the feelings that we most do not love. And

147:02

creating something

147:04

from the ground up and and Elon will

147:07

tell you something similar very

147:09

difficult to invent something new

147:12

>> and people people don't believe you all

147:15

the time you're humiliated often

147:17

disbelieved most of the time and so so

147:21

people forget that part of success and

147:24

and I don't think that's healthy. I think

147:26

it's it's good that we pass that forward

147:29

and let people know that that it's just

147:31

part of the journey.

147:32

>> Yes.

147:34

>> Suffering is part of the journey.

147:35

>> You will appreciate it so these horrible

147:37

feelings that you have when things are

147:39

not going so well. You will appreciate

147:41

it so much more when they do go well.

147:43

>> Deeply grateful.

147:44

>> Yeah.

147:45

>> Yeah. Deep deep pride. Incredible pride.

147:49

In incredible incredible gratefulness

147:51

and and and surely incredible memories.

147:54

Absolutely. Jensen, thank you so much

147:56

for being here. This was really fun. I

147:58

really enjoyed it and your story is just

148:01

absolutely incredible and very

148:02

inspirational and and I you know, I

148:06

think it really is the American dream.

148:07

It is the American dream.

148:08

>> It really is. Thank you so [music] much.

148:10

Thank you. All right. Bye, everybody.

148:15

[music]

