
OpenAI Kills Sora & Hits $100M ARR on Ads | Oura Going Public & Whoop Raises at $10BN


Transcript


0:00

So we start with Anthropic's monster

0:02

week.

0:02

>> We may be at the stage where we throw

0:04

the humans under the bus, not the AI

0:06

anymore, which I think at some level is

0:07

pretty terrifying.

0:08

>> We move to OpenAI killing Sora.

0:10

>> I think shooting it in the head is even

0:12

more significant.

0:13

>> A big part of the whole strategic

0:14

direction of the company was flawed.

0:16

>> Agreed. You're seeing the economists,

0:18

the accountants have wandered into the

0:20

room and they said, "We have a scarce

0:21

resource here. Let's optimize it. Let's

0:23

devote this compute to the people who

0:25

can pay the most for it." And then we

0:26

finish on the man with the biggest balls

0:28

in tech, Masa. You haven't lived till

0:30

you've seen an 85% decline in an index.

0:33

>> This is one where it's just

0:35

bass-ackwards. I don't believe there's

0:36

right or wrong in money. There's just

0:38

money.

0:38

>> I just don't think raising at 5 or 8

0:40

billion when you're at 80 million or 100

0:42

million of suspect ARR is the most

0:43

exciting accomplishment in the world.

0:45

>> Let me be direct. Get the [ __ ] over it. You

0:47

should conform your company around your

0:48

customers and your model, not your VCs.

0:51

Being mean to a billionaire is actually

0:52

a feature.

0:53

>> Ready to go?

1:06

Boys, welcome back. It is this week in

1:08

Anthropic, otherwise known as the SaaS

1:10

OGs, which has been renamed. Um, I want

1:13

to start with, you guessed it,

1:15

Anthropic. Unbelievable 28-day month of

1:18

February where they did 6 billion in

1:22

revenue which was more than Databricks

1:25

has done in their entire lifetime. You

1:27

know what I think was the most

1:28

interesting news out of anthropic this

1:29

week? It was actually the the accidental

1:31

leak of Claude Mythos uh essentially

1:34

3,000 unpublished assets leaked. Um it's

1:38

a 10 trillion parameter model apparently

1:41

that is this next level step changing

1:43

capabilities that they're not releasing

1:46

because of how powerful it is. This is

1:49

by far the most interesting to me.

1:50

Jason, how did you think about this

1:52

news?

1:53

>> Well, look, obviously it's embarrassing,

1:55

right? To Anthropic, to leak it. Um

1:59

I actually just think we're going to see

2:01

more and more of this accelerate. Um the

2:04

faster we vibe code, the faster we ship,

2:06

the more corners we cut in general on

2:10

application level security, it happens.

2:12

I mean so many folks are accidentally

2:15

uploading code to insecure GitHubs, to

2:18

databases, to Supabases that are by

2:20

default open. So this is this is

2:23

accelerating uh our data which is just

2:25

open on the internet and and you could

2:27

say god this shouldn't happen at the

2:29

Anthropic level and I'm sure someone will

2:30

get will get scolded, right? I'm sure it

2:33

will. But overall, this is accelerating

2:35

and it's going to accelerate even more

2:37

as we let our AI agents make decisions.

2:39

Our agents are going to decide where to

2:41

put code. They're going to decide what

2:42

level of security to use. And this is

2:45

going to become happen stance. And you

2:48

know, it's funny. I mean, people were

2:49

like, "Oh, how could Anthropic have a

2:52

new um security agent and have this

2:55

happen at the same time?" I think

2:56

they're they're I think it makes perfect

2:58

sense. The Anthropic AI security agents

3:00

which I've basically used in Replit, are

3:02

very very good and it also makes sense

3:04

as we rush we're going to leak source

3:06

code data PII right it was I don't know

3:08

whether it's happened it was reported

3:10

today all of Mercor's data leaked, it's

3:12

being held hostage all of it every

3:14

single interview every single piece of

3:16

PII every single piece of humans and so

3:20

are you know we used to mock these I

3:21

think it's going to start happening

3:23

daily and weekly in the agentic era and

3:25

it doesn't excuse it but but it's Um, it

3:29

it's a reality. Agents agents are goal

3:31

seeking and agents are going to make not

3:33

only are they going to make the same

3:34

mistake as humans, they're going to work

3:35

a thousand times faster. So even if they

3:37

make the mistakes 10% as often, Rory,

3:39

help me with the math. If they do a

3:41

thousand times more productive, they're

3:42

still going to make a hundred times more

3:44

mistakes. So we're going to see it

3:45

everywhere. Again, just for perspective,

3:47

because there's two things going on

3:49

here. Anthropic. Some data leaked from

3:51

Anthropic about their new model Mythos

3:53

which of itself is meant to be amazingly

3:55

powerful in dealing with cyber security

3:58

and there was a whole consequence that

3:59

we'll talk about in a second in terms of

4:01

how that impacted cyber security stocks.

4:03

But as Jason pointed out, the level of

4:05

irony here is acute because it was an

4:07

inadvertent leak. So you had the

4:09

situation where a a model that's meant

4:11

to be amazing for cyber security

4:14

actually leaks via cyber security leak.

4:16

So that's kind of so we're toggling

4:18

between the two on the cyber security

4:20

leak. It was noteworthy. Anthropic quote

4:22

unquote blamed human error, right? We

4:25

may be at the stage where we throw the

4:27

humans under the bus, not the AI

4:28

anymore, which I think at some level is

4:30

pretty terrifying, but and and you know

4:32

exactly what happened. You see this I

4:33

mean not down in the weeds you often see

4:35

this where you know you're about to do a

4:38

big announcement you have your content

4:40

management system you stage all the

4:41

assets you know be it their Fed press

4:44

release or in the UK it happened on the

4:45

budget if you remember Harry the budget

4:48

you have the press release ready to hit

4:49

play the minute the budget is ended and

4:51

someone inadvertently forgets and put it

4:53

on the public side in advance. It's the

4:55

same thing here. So it probably was a

4:56

human error. with a whole bunch of

4:58

content ready for I don't know pick a

5:00

date the March the May 15th announcement

5:03

of Mythos they forget to secure it

5:05

correctly and out it goes so so that's

5:08

the first thing right so that's kind of

5:09

the that's the embarrassing part of it

5:12

and then the interesting part of it and

5:13

you really do have to do this not

5:14

without sniggering despite the fact that

5:17

it all leaked you also have to

5:18

separately talk about the fact there's

5:20

some big claims on mythos right and on

5:22

topic we're making via again via this

5:24

leaked memo reminder no one else has

5:26

seen seen it h the actual model um at

5:30

least not publicly available. Obviously

5:32

some people have seen it but not

5:33

publicly available and even I was trying

5:34

to get copies of the leaked memo.

5:36

There's just a few screenshots at this

5:38

stage. It's hard to track it down at

5:39

least quickly anyway. But the statement

5:41

is it's way more powerful. Second thing

5:44

is it's going to be way more expensive

5:45

for them to serve and therefore it's

5:47

going to be way more expensive for

5:48

customers to buy. And then the third

5:50

thing is a particular focus on cyber

5:52

security. It's meant to be quote unquote

5:53

extremely good at detecting cyber

5:55

issues. And the result of that and the

5:59

result of that was a four or 5% decline

6:01

in the average cyber security stock last

6:04

Friday when this leak happened.

6:05

>> Yeah, just two maybe just two other

6:07

things on the league just for for this

6:09

tradeoff. Um, you know, I'm dating

6:11

myself, but when I was at Adobe and we

6:13

were acquired, we were an early customer

6:15

of GitHub. Um, and so we were putting

6:17

source code in the cloud and that was

6:19

banned in Adobe at the time. It was

6:20

banned because the source code was their

6:22

crown jewel. It was pretty easy to to

6:24

make a crappy PDF reader or a crappy

6:27

image generator. But to do what

6:29

Photoshop or Adobe Acrobat did, all the

6:31

exceptions, all the corn the thous tens

6:33

of thousands of corner cases was the

6:35

crown jewel, right, of of the company.

6:37

And so everything we got the first

6:39

exemption to be able to use source code

6:41

in the cloud and pros and cons. But when

6:43

when they used this onrem source code

6:46

management tool, it took a month to do a

6:48

release. a month a month. Okay, now

6:51

we're doing 60 releases a day, right? Or

6:54

even Anthropic, fastest growing

6:56

enterprise company of all time, is still

6:58

doing massive releases every month or

7:01

two and dropping features every day,

7:02

right? So, we went to something that

7:03

took 30 days at a tech leader to

7:06

something that takes hours. There's tra

7:08

there's tradeoffs there and I I'll take

7:10

them, but we're going to see it explode

7:12

in terms of like the stuff that was

7:13

published today.

7:15

Going back a few threads on show number

7:17

50 to Rory. One of the things that I

7:19

thought was pretty cool in Chyros was

7:20

two things. Um, always on background

7:23

assistant that works constantly. Our AI

7:26

is working with us 24/7 and agents that

7:29

can sleep, wake, and self-prompt.

7:32

The autonomous agents, which I've been

7:34

talking about how this is going to

7:37

consume orders of magnitude more tokens

7:38

and change our life. I I'm excited to

7:40

see more coming. And you know, Open Claw

7:43

was just this this brief thing that woke

7:46

us up to what Anthropic appears to be

7:48

all in on, right? Truly autonomous

7:50

agents running 24/7, hopefully safely,

7:54

hopefully not leaking all of our source

7:55

code, but it's coming soon, right? Not

7:58

these, you know, this whole idea that

7:59

we've been doing. When we started this

8:01

podcast, you went on to ChatGPT or

8:03

Claude. No one had heard of Claude when

8:04

we started this. I was a quirky guy

8:06

using Claude. And you talk to it and go

8:08

back the next day. The next release,

8:10

it's gonna be on all the time. All the

8:14

time. Debating Harry's latest

8:16

investment. Was it big enough? Is he too

8:18

concentrated in the fund? Where should

8:20

he go? What was Rory thinking on that

8:22

deal? Right.

8:24

>> Why was Rory abusing Harry by email

8:26

again for the second time in a day?

8:28

>> Laugh, but this is the future. I'm

8:30

excited to see it coming sooner when our

8:32

agents are 24/7. Like they're literally

8:35

around us and we give up all of our

8:37

personal freedoms and autonomy as part

8:38

of it. I hear you on the embarrassment

8:40

of it being leaked and you know the the

8:43

human error element but while anthropic

8:46

has mythos which is supposedly as

8:49

powerful as it is you're juxtaposing

8:51

that with OpenAI [ __ ] around with

8:54

killing Sora kind of ads not really

8:57

working and people being unhappy with it

8:59

and it's seeming like this massive chasm

9:02

of the progression of force that is

9:04

Dario and Anthropic continuing faster

9:07

and harder than ever with a faltering,

9:09

confused and dazed OpenAI

9:12

wandering around the product desert

9:14

trying to find some water.

9:16

>> You're just being mean. I mean, again,

9:18

as I said last, and I'm sorry to repeat,

9:20

>> is that not fair?

9:21

>> Yeah, but again, narrative is overdone

9:24

on both sides, right? So, um I think

9:27

some parts of it are true. Obviously,

9:29

you're true in a bunch of different

9:30

things. The decision to shoot Sora in

9:32

the head, right? Almost certainly a good

9:35

decision. Look, it's obviously

9:37

embarrassing to say something is going

9:38

to be amazing less than four or five

9:39

months ago and then shoot it in the

9:40

head. But if it's a mistake, give him

9:42

credit for at least saying it's a

9:43

mistake. Move on, right? And yeah, that

9:46

relationship with Disney again, I think

9:47

I I'm going to give I I it wasn't me. I

9:50

was sneering at it in real time when it

9:51

happened. I think someone else in this

9:53

podcast said it's really significant.

9:54

Just saying, right? It's clock.

9:56

>> I do I think it's massively significant.

9:58

>> I think shooting it in the head is even

10:00

more significant.

10:02

I think it's saying that a big

10:04

part of the whole strategic direction of

10:06

the company was flawed.

10:08

>> Agreed.

10:09

>> The whole that we are going all in on

10:10

consumer — that from what I read Sora made

10:13

singledigit millions of revenue, right?

10:15

And was consuming a million a week,

10:17

which actually sounds way too low,

10:19

right? It must it must have consumed

10:21

billions and made single digit millions.

10:23

It makes no sense as a product either in

10:25

the short term or the long term. But if

10:26

you want to own the whole consumer

10:28

experience with AI, you've they decided

10:30

we have to own image and video and

10:32

Anthropic never even attempted to do it

10:34

right. So it's a massive retreat. It

10:37

doesn't mean it's the right it's

10:38

probably the right decision to your

10:39

point. In fact, almost certainly is. But

10:41

man, that's a our strategy was wrong.

10:43

Like this is a huge own goal. Our

10:46

strategy was wrong.

10:47

>> Agreed. But I and I agree with that. But

10:49

I still think that, as I say, I still

10:52

think Harry's kind of over-egging

10:54

it a little bit cuz look, you made a

10:56

comment about ads that I think is kind of

11:00

effectively

11:02

implying that the ad strategy hasn't

11:04

worked. That's a bit of a a bigger leap.

11:06

I mean, Sora hasn't worked. They've

11:07

killed it. I think I'm with Jason. I

11:09

think that's smart because I think one

11:11

of the things you're seeing right now is

11:12

in a world of scarce compute and

11:14

astonishingly despite all the

11:16

investments that we've seen in terms of

11:18

actual available compute for people to

11:22

sell AI on we're in a scarcity mode. You

11:24

don't devote compute to things that are

11:28

highly compute intensive and low revenue

11:29

intensive. I mean and Sora was almost

11:33

the definition of that. video generation

11:34

is extraordinarily compute-intensive

11:37

relatively speaking and the revenue is

11:40

almost minuscule conversely you know

11:43

codegen, while it is compute-intensive, is

11:45

orders of magnitude less compute-

11:46

intensive and there's real dollars

11:48

attached to it so you're literally

11:49

what's happening right now I actually

11:51

think at a at a higher level it's

11:52

actually very healthy you're seeing the

11:55

economists the accountants have wandered

11:57

into the room and they said we have a

11:58

scarce resource here let's optimize it

12:01

let's let's devote this compute to the

12:03

people who can pay the most for it. So

12:05

that's the Sora comment. On the ads

12:07

comment, Harry, it's early days for, you

12:11

know, ChatGPT ads, but again, I cite that

12:14

quote from Brian Kim that I thought was

12:16

really good. Of course, they're going to

12:17

run damn ads because there's no other

12:19

way to build a mass consumer business

12:21

and they have no choice, right? because

12:24

you know their their consumer conversion

12:25

rates run roughly 5%. that gets them to

12:28

a, I think, roughly 10 to 15 billion dollar

12:31

consumer business, right? And you know,

12:34

out of their 500 million uniques or

12:36

whatever it is. So, one of two things

12:37

has to happen in the consumer business.

12:39

Again, I'm going to leave the enterprise

12:40

business out on the consumer business.

12:42

Either A, they take that conversion rate

12:44

to a number we've never seen before from

12:46

a typical consumer business. I think

12:48

that's unlikely. I don't think most

12:50

consumers are going to pay 20 bucks a

12:52

month for this. Or option B is you make

12:55

an ad business work. They got no choice

12:57

to make it work. And by working, I don't

12:58

mean a hundred million dollars. People

13:00

are kind of ragging on the hundred

13:02

million. It's in the noise. It's scale.

13:03

Big picture here. Facebook and Google

13:06

each do 200 billion plus or minus a year

13:09

in digital ads. If these guys aren't

13:12

doing 20 billion within a couple years,

13:14

they're not even in the game. And to get

13:16

to the market cap of I mean remember

13:17

Facebook has a 1.7 whatever it is

13:19

trillion market cap doing 200 billion.

13:22

Alphabet/Google has a three trillion

13:24

market cap doing 260 billion plus thing.

13:27

If they're going to grow into the market

13:29

cap on the consumer side, 20 billion is

13:31

not enough. They have to do 50 billion

13:33

70 billion of ads. So unlike Sora, this

13:37

is not going to be a try the ads and

13:38

then fold. This is a there's only two

13:41

existential bets for this company. One

13:43

of them is ads to make the consumer

13:44

business work. And then the other is oh

13:46

my god, we should have done coding all

13:48

along. Let's get a competitive coding

13:50

and enterprise model out there and

13:52

compete with Anthropic on that side. So,

13:54

those are the only two things they're

13:56

doing and they're the only two things

13:56

they should be doing. It's

13:58

straightforward. I mean, I actually see

13:59

this as good news. Like, at least

14:01

they've like we've gone from the let's

14:02

wander around the woods feeling cool

14:04

building [ __ ] to there's only two things

14:06

to do. Let's get them done. I It's net

14:07

net. It's a positive. Better late than

14:09

never,

14:10

>> man. Did they had the Wall Street

14:11

Journal this week, they had uh they had

14:13

a story of why Dario left uh OpenAI.

14:16

Did you

14:18

Yeah.

14:19

>> I mean,

14:19

>> yes, I did.

14:20

>> The amount of tension at OpenAI, the

14:23

fact that

14:24

>> Greg Brockman

14:26

>> recruited them and no one would work for

14:28

him. He and D and his sister would not

14:31

work for Greg Brockman, would not talk to

14:33

him. They would not allow him to be part

14:36

of the LLM or GPT groups. Um, then Sam

14:40

had to constantly tell each of them that

14:42

they were in charge, right? Told Dario

14:45

he was the boss. Then told Ilya and Greg

14:48

they could fire him at any time if they

14:50

wanted to fire Sam, right? Then begging

14:53

Dario to come back. Then Dario

14:55

saying he would stay only if he directly

14:57

reported to the board and nobody else.

15:00

I mean the level and then and then

15:04

firing Sam and then bringing him back

15:06

and then Sora and D Sora and we're not

15:09

doing coding. It's just I mean I'm

15:11

exhausted. I I maybe I'm wrong. I have

15:14

to think at least someone like me would

15:15

feel much more comfortable in anthropic

15:17

where it appears there's a much more

15:19

consistent process in leadership, right?

15:22

Same founders, same things, same goals.

15:25

It just I have to think a company

15:27

organized like that's just going to

15:28

out-execute someone with that level of

15:30

drama. I I I almost can't take it.

15:32

>> You're going to kill me for this, Rory.

15:35

Is it not the best thing for OpenAI to

15:37

buy Sierra, incorporate that as its

15:40

customer support product and have Bret

15:42

Taylor come in as the day-to-day CEO and

15:45

Sam can be fundraiser. Sam can be master

15:47

of

15:48

>> I'm not in the boardroom. So, you know,

15:49

I I hear look at the end of the day,

15:53

>> at the end of the day, I think you're

15:54

right, Harry, and I would favor that as

15:55

a board member, but I'm not going to say

15:58

that publicly because I don't want Sam

16:00

to break my balls. I am too unimportant

16:02

for Sam to even give a [ __ ] about.

16:04

Right? So, I don't worry about that at

16:05

all. So, let me say this delicately.

16:08

That amount of board level and senior

16:10

team level turnover over an extended

16:13

period of time is probably the highest

16:16

warning signal that you could have as a

16:18

board member about how your CEO is

16:20

doing, right? And if if it was anything

16:23

other than a founder, let's put it this

16:25

if it was anything other than a founder

16:27

company and this level of drama was

16:29

going on, you'd probably be having you

16:31

probably sitting down with the CEO and

16:32

asking how's it going at least and what

16:33

are you thinking of doing about this? Um

16:35

I don't think you turn on people just

16:37

when things go to [ __ ] But you probably

16:41

want to cut down the drama from here,

16:44

build a team and try and call a shot and

16:46

play it for more than 20, you know, for

16:48

more than 6 months at a time. when

16:50

you've worked at or observed startups

16:51

where the CEO is spending so much of

16:53

their time load balancing talent that

16:55

can't work together versus when you've

16:57

worked at one or with one where the

16:58

talent's rowing in the same direction.

17:00

To say that it's night and day would be

17:02

an understatement, right? It's like the

17:03

the the backside of Pluto and the front

17:05

side of Mercury and it's just ex and I

17:08

think Sam we can criticize him actually

17:10

when I read the everything I've seen and

17:11

then when I read the Wall Street Journal

17:12

it's like my god this guy has spent so

17:15

much time load balancing the drama of

17:18

these extremely brilliant personalities

17:21

that just oh my god that that can

17:24

consume number most of your time as CEO

17:26

most of your time load balancing

17:28

>> you're exactly right it is the drama of

17:30

you know we're not dealing with a bunch

17:32

of people just trying to crank out some

17:33

B2B software and make a paycheck. We're

17:35

dealing with people who are angsting

17:36

about whether this is going to change

17:38

the world, who have fears about the

17:40

technology, who have yeah, desires to be

17:42

seen as credited for the technology

17:44

despite their fears about it. This is a

17:46

I mean, as is often the case,

17:48

extraordinary talented people come on an

17:50

extraordinarily high bandwidth,

17:52

a a demand on attention and care and

17:54

feeding. It's it's been a real slog, I'd

17:56

say.

17:58

>> Okay. The man with the most balls in

18:01

investing, Masa Son. SoftBank gets $40

18:05

billion bridge loan to buy OpenAI

18:08

stock. How deep can Masa go? He'll go

18:11

as deep as they let him. I mean, that's

18:13

the one thing we know. If they give him

18:14

another 20, he'll borrow that, too,

18:16

right? I mean, look, this is a high I

18:19

checked SoftBank. You've got SoftBank

18:21

Holdings. Have to be careful.

18:23

There's the telco group which is, you know,

18:25

reasonably levered at the Japan level

18:27

and then SoftBank Group is around — it's

18:29

around 2x levered right one and a half

18:31

to 2x levered in terms of equity right

18:33

what that means is a 30 40% decline you

18:37

know wipes them out right it's a very

18:40

aggressive stance right it would be like

18:43

me taking our $800 million venture $900

18:45

million venture fund borrowing 1.8

18:49

billion and investing it all. And if

18:52

it works, I really juice my

18:54

return but if it goes wrong by 30% I'm

18:57

done right and it's just it's it's super

19:00

aggressive I mean I suppose his lesson

19:02

is Masa survived 2002 when, I remind

19:05

everyone the NASDAQ went down 85%

19:08

you haven't lived till you've seen an

19:10

85% decline in an index right and

19:13

obviously if that happened or anything

19:15

like it

19:17

um you'd just be way underwater,

19:19

right? So, it's it's a it's a fairly

19:21

high amount of leverage for an

19:24

investment fund to say the least.
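To make the leverage arithmetic above concrete, here is a minimal sketch using the round numbers mentioned in the discussion (roughly a $900 million equity base levered about 2x). The exact figures are illustrative, not SoftBank's actual balance sheet.

```python
# Rough sketch of the leverage arithmetic discussed above.
# Numbers are illustrative (~$0.9B equity, ~2x debt), not actual SoftBank figures.

equity = 0.9            # $B of the fund's own capital
debt = 1.8              # $B borrowed (~2x the equity)
assets = equity + debt  # $2.7B deployed in total

# The asset decline that fully wipes out the equity cushion:
wipeout_decline = equity / assets
print(f"Assets deployed: ${assets:.1f}B")
print(f"Decline that erases the equity: {wipeout_decline:.0%}")  # ~33%, i.e. the 30-40% zone mentioned

# Conversely, gains are levered up ~3x relative to the equity base:
for gain in (0.10, 0.30):
    equity_return = gain * assets / equity
    print(f"A {gain:.0%} asset gain is a {equity_return:.0%} return on equity")
```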

19:26

>> Yeah, I mean for for sure it's it's

19:27

dramatic. Having said that, you know,

19:29

real estate investment funds get the

19:30

maximum leverage they can by design,

19:32

right? That is how they work. I would

19:34

imagine if venture had access to more

19:37

debt, we we'd all we'd all load up on

19:39

it. if we all could if we all could do

19:42

the growth rounds in your hottest

19:43

company and uh may maybe we would and we

19:46

could get we could get all the carry

19:47

from it with uh and the worst thing is

19:49

we leave the keys to fund seven on the

19:51

table we might we might load up too I'm

19:53

not sure but certainly real estate funds

19:55

load up as much as they can

19:56

>> but just pushing back again because real

19:58

estate funds load up because the cash

20:00

flows are predictable right at the end

20:02

of the day I mean look in

20:03

>> well and they can but they can because

20:05

the cash flows are predictable they can

20:06

load up

20:07

>> agreed

20:08

>> right we don't we it's just harder for

20:09

my little fund to go to Silicon Valley

20:12

Bank and borrow 200 million against the

20:14

>> inium of risk. I would argue the Soft

20:16

Bank portfolio, not the telecom company

20:18

at the subsidiary level, but I would

20:20

argue the Soft Bank portfolio is more

20:22

like Jason fund than it is a real estate

20:23

fund. So, I think it's a high level of

20:25

risk.

20:25

>> Well, plusy, what did he lose on

20:26

WeWork? 12 billion. He knows, he

20:29

knows what it's like.

20:30

>> Yeah. The two big assets from memory are

20:33

um obviously the open AI position and I

20:36

think the ARM position which I still

20:37

think is in um the the the in the

20:40

holding company and yeah but I mean

20:43

amazing companies, world-class companies,

20:46

easily imaginable either would decline

20:48

30%. So yeah it's a hell of a way to

20:51

live. Speaking of declining 30% and

20:54

being in the hole, we we touched on it

20:57

earlier, but obviously Mythos leak

20:59

hammered cyber stocks. Uh, CrowdStrike,

21:02

Palo Alto, Zscaler all down 6%, Okta,

21:05

Netskope down 7%, Tanimal down 9%. Was

21:09

this a justified dip or is this an

21:13

unjust reaction to Anthropic news? I was

21:16

listening carefully to the name because

21:18

I was listening carefully to the names

21:19

and there's different aspects of

21:21

security and some of them I can say yeah

21:23

maybe that overlaps and then some I go

21:26

that's just a different thing right and

21:27

when you listen to all the names been

21:29

thrown out you say that's just you know

21:30

baby with the bath water cuz step back

21:33

how does Anthropic you know make

21:36

security better at the code development

21:38

stage they can look at code and find

21:40

security flaws so there are companies

21:43

that upfront

21:44

um do something like that and you know

21:46

application security companies and you

21:48

could argue that this is a different way

21:50

of doing that maybe some of those guys

21:51

will be impacted right what they're not

21:53

doing for example is real time perimeter

21:56

defense they're not in a run in a real

21:58

time basis you know blocking people like

22:00

a firewall right nor are they doing what

22:03

for example Okta does right which is

22:04

single sign on and authentication that's

22:07

simply not what they do it's a different

22:09

thing and the fact that both of those

22:11

kind of stocks sold off says it's just a

22:14

kind of knee-jerk reaction rather than

22:16

anything thought through. It will have an

22:18

impact. If you were doing application

22:21

security or code review for secure

22:25

security code review, you're probably

22:27

going to have to either incorporate how

22:29

this works in your analysis or you'll be

22:31

redundant just as the the coding

22:33

companies anyone kind of just as GitHub

22:35

had to roll in complete models and

22:37

figure out how to adopt it. Right? So

22:38

for some of them this is really going to

22:40

matter and then for others it's like

22:41

it's just a different thing. Stepping

22:43

back, I think we're in the panicky

22:45

stage, right? I think we're in the stage

22:46

of because these companies are doing so

22:48

well, because they're private, no one

22:50

sees the numbers, because AI is so sexy

22:52

and so potentially amazing. We're at the

22:55

stage now where everything, anything can

22:56

cause a panic.

22:57

>> Robinhood was down like 10% because

23:00

Elon potentially didn't give them the

23:03

tender and was going straight through

23:04

E-Trade. And that alone was like a

23:07

massive hit for them.

23:09

>> Obviously, there's a panic in the

23:10

market. And the question is, is the

23:11

panic justified, right? The panic is

23:13

that this revenue is not durable, right?

23:14

That's the panic. The cyber security

23:16

one's really interesting in my in my

23:18

experience and opinion. This is one

23:20

where it's just bass-ackwards. Um

23:24

because if you're in the agentic world,

23:27

this is the golden age of security. The

23:29

number of security threats and issues is

23:31

going up orders of magnitude. Claude

23:33

leaking its source code, it doesn't

23:34

matter. the number of of apps exploding.

23:37

The number of like there's so many

23:38

mobile apps that app store is like it's

23:40

like a month to get your app reviewed

23:42

versus a week. It is everything is

23:44

exploding. These apps are being built by

23:46

agents. They're being built in

23:47

unpredictable ways. Folks aren't looking

23:49

at the code. The pace of features being

23:51

shipped, products being shipped, corners

23:54

being cut. This is a golden age of

23:56

taking any mature category and

24:00

acknowledging good news for us. There's

24:02

more threats. Good. whether I don't care

24:04

whether it's application level perimeter

24:07

and like the good news is threats are

24:09

exploding and that the whole shtick of

24:12

cyber I'm not a real cyber security

24:13

expert although I'm doing another

24:14

investment right now for just this

24:16

reason the whole shtick in my whole

24:18

lifetime has been look you've got to

24:20

constantly buy new products because new

24:22

threats keep emerging like this is

24:24

there's been a golden goose of cyber

24:26

security that has allowed new entrants

24:28

to come into a conservative category

24:29

someone like Wiz will show up and say

24:31

guys we know how to do this on the web

24:33

And people are so terrified of new

24:35

threats, they'll take the meeting.

24:36

Right? This should be the golden age for

24:38

new and existing investors because the

24:41

threats are terrifying. And you can't

24:42

stop the rogue engineers that vibe coded

24:45

something that accessed your data. This

24:48

this should benefit everybody. Like

24:49

everyone should be a rocket ship. Like

24:51

everybody built like everybody

24:53

monetizing GPUs is a rocket ship. We

24:55

have and the fact that the market

24:56

doesn't see it shows we're in a in my

24:58

opinion we're in a true panic which is

25:00

hard to predict a bottom. But I don't

25:02

get it. Everyone should be benefiting

25:03

when you see an explosion in application

25:06

production and a change in the paradigm.

25:08

The change in the paradigm is good for

25:09

everybody except you know Windows

25:12

Defender from 1996. Like it probably

25:14

doesn't help that product or whatever

25:15

the hell they have but anyone everyone

25:16

with engineers should benefit.

25:18

>> I broadly agree with Jason. I mean there

25:20

might there are more than Windows

25:22

Defender 2006 that might be impacted. As

25:25

I say some of the application security

25:27

code review stuff could be but big

25:29

picture Jason's right. Instead of having

25:32

people trying to get into your firewall,

25:34

we everyone's now downloading an agent,

25:36

giving it full root access to their

25:38

computer and telling it have a go. And

25:40

as Jason just pointed out, work

25:41

overnight. It's going to I mean no one

25:44

yet it's funny we my colleague Aar who

25:46

does a lot on the security we've been

25:47

looking at a lot of these companies no

25:49

one yet knows the exact approach that

25:52

we're going to have to take to defend

25:53

against agents running within the

25:55

organization but everyone 100%

25:57

understands that this is this is a

26:00

emerging mega threat because of the

26:03

velocity adoption times the power of the

26:06

solution so I I agree with Jason it it

26:08

mightn't be the old guard that takes

26:10

advantage of it But there's no but they

26:13

tend to be I mean one of the things I

26:14

admire about the security companies is

26:16

the CrowdStrikes, the Palo Alto Networks

26:18

of this world is they know damn fine

26:20

that when a new threat emerges and a new

26:22

solution emerges for that threat when

26:25

when an earlier winner comes out you

26:26

better spend your 300 million bucks your

26:28

500 million bucks and just swoop up the

26:30

winner and add it to your product right

26:32

so I think there'll be a ton of fast

26:35

acquisitions

26:36

as agent security solutions emerge you

26:39

know and and people will be doing if

26:41

they're smart And I think those two

26:42

companies are extraordinary smart.

26:44

They'll be doing acquisitions long

26:45

before it's quote certain because you're

26:49

going to have CIOS come and talking to

26:51

you, right? One thing worth mentioning

26:54

on that is it was interesting again I

26:56

and somebody leaked information from

26:58

Anthropic. They're masters at selling

27:00

fear. One of the things they're doing is

27:02

they're releasing the mythos model first

27:04

to CISOs within companies. It's kind of

27:07

like, oh, it's so scary. we're going to

27:09

give you this model and give you time to

27:11

figure out how to use it. Of course,

27:13

part of that time will involve giving a

27:14

million bucks to Anthropic for um so

27:18

it's just great marketing. So, they're

27:19

actually leaning into that and saying to

27:21

the CISOs, you're going to have to

27:22

figure this out. This is the new

27:24

terrifying weapon we've invented. Please

27:26

give us a million dollars and we'll let

27:28

you defend yourself with it. also great

27:30

marketing and but it speaks to the

27:32

perceived to Jason's point it speaks to

27:34

how correctly afraid every security for

27:39

um, CISO should be given the pace of

27:42

agentic AI adoption in the enterprise

27:44

>> the golden age of cyber

27:47

>> I mean it's just it's just how hard is

27:49

it to get a meeting whoever you are if

27:51

you have any established brand we we

27:53

we've got a new agentic product we're

27:55

going to help protect you from this

27:56

you're going to get a meeting that

27:57

afternoon,

28:00

right? Wish I bought them over Figma.

28:02

That's a depressing chart that I'm

28:04

looking at. Um,

28:05

>> you need to let go, Harry. You need to

28:06

let go.

28:07

>> Down 30% in a month, Rory. It's hard to

28:09

let it go after 30% in a month.

28:11

>> Okay, no crying in the casino. Move on.

28:14

>> We we I do want to discuss revenue kind

28:18

of questionability. Uh, we've got

28:20

anthropic recognizing revenue in a very

28:21

different way to open AI. And then you

28:23

also have questionability around

28:25

emergent labses um and is it okay if

28:28

error is kind of

28:31

questionable in sorts of how it's

28:32

accounted for. Um how do we think about

28:35

that? You can choose which one you want

28:37

to take.

28:38

>> Let me just can I just maybe Rory can

28:40

dig into it but I'll I'll tell you

28:41

there's one startup I invested in that's

28:43

over 100 million ARR and I get them I I

28:46

I own just enough to get the investor

28:48

updates. It's not I'm not on the board.

28:49

And I get three numbers every month.

28:52

Three revenue numbers. I don't know what

28:54

the hell they are over 100 million, but

28:57

the smallest one is ARR.

29:00

Now, I invested at seed. I don't really

29:02

care. I'm in the money. I don't I don't

29:04

have a choice. But I I can't understand.

29:06

This company's doing great, but I can't

29:08

I for the life of me, I cannot

29:09

understand these three numbers. And

29:10

there's asterisk and daggers, and

29:12

there's charts that go every, but they

29:13

keep going up and to the right, which I

29:15

think was on this emerging thing. We

29:16

could talk about our next. I think

29:18

that's what some of the investors said.

29:19

Who cares? But I can't tell the hell the

29:21

difference what a what an ARR is in

29:23

2026.

29:24

>> Well, what I always get is like pipeline,

29:26

which is complete [ __ ], then

29:28

contracted and then there's live. So,

29:31

first of all, stepping back to be fair

29:32

to both Anthropic and OpenAI, they have a

29:34

very clear and sensible

29:37

>> way they define ARR. What they take is

29:39

they take the last the average of the

29:41

last four weeks to smooth out times 13

29:44

because there are 13 four-week periods

29:46

in a year which is more sensible than

29:48

monthly because you have these varying

29:50

months. So they're basically what

29:51

they're saying is realized revenue for

29:53

the last four weeks averaged you know

29:57

the average of the last four weeks times

29:58

13 — sorry, obviously if it's the weekly

30:00

average, it's times 52 — but basically it's

30:02

actual GAAP revenue. What did we bill for

30:05

the last four weeks?

30:07

The average calculated across

30:10

the last four weeks to take into account

30:11

how it is — that's their run rate, right? So

30:13

it's actually pretty it's not commit to

30:15

be fair to them it's not committed or

30:16

any of the [ __ ] kind of higher level

30:18

stuff it's actual money flowing through

30:20

the system and Anthropic is roughly at 19

30:23

billion according based on that kind of

30:25

trailing four-week metric and

30:28

open AI is around 25 but now let's talk

30:30

about your thing there was this kind of

30:32

whole meme of OpenAI reports net on

30:36

their partner revenue and Entropic

30:39

reports gross. And what they're saying

30:41

there is if OpenAI sells through

30:43

Microsoft and Microsoft takes some money

30:46

off the top, OpenAI only reports the net

30:48

amount. If Antropic sells through AWS

30:52

and they sell $100 worth of revenue,

30:55

they report the gross amount and then

30:57

they give $20 back to Amazon as a

31:00

cost of sales. So there's two different

31:02

methods for what looked like the same

31:04

revenue mix or same revenue

31:07

approach.
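A minimal sketch of the two conventions just described: the trailing four-week run-rate ARR, and the gross-versus-net treatment of the same partner-channel sale. The $100/$20 split is the AWS example from the discussion; the weekly figures, the 20% assumption applied to both, and the function names are illustrative only.

```python
# Sketch of the run-rate ARR convention described above: average the last
# four weeks of actual billed revenue, then annualize (weekly average x 52,
# equivalently the four-week total x 13). Weekly figures below are made up,
# chosen only to land near the ~$19B run rate cited for Anthropic.

def run_rate_arr(last_four_weeks_revenue):
    weekly_avg = sum(last_four_weeks_revenue) / len(last_four_weeks_revenue)
    return weekly_avg * 52  # same as (4-week total) * 13

print(run_rate_arr([350, 360, 370, 380]))  # ~18,980 ($M), i.e. roughly a $19B run rate

# Gross vs. net treatment of the same $100 of partner-channel sales,
# per the AWS / Microsoft example in the discussion ($20 to the partner):
partner_sale, partner_cut = 100, 20
gross_reported = partner_sale              # report $100, book the $20 as cost of sales
net_reported = partner_sale - partner_cut  # report only the $80 that reaches you
print(gross_reported, net_reported)        # 100 vs 80 for the identical underlying sale
```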

31:08

>> I thought you were going to extend that.

31:09

I thought part of where you're going was

31:10

to Michael Cannon-Brookes' point on the

31:12

show was that a lot of this revenue is

31:13

getting double or triple counted because

31:15

it's being recognized and not only does

31:17

this happen then cursor is selling it

31:18

again and recognizing the revenue right

31:20

the same to people keep reselling these

31:22

tokens again and again and recognizing

31:24

them as their own ARR. Um how many times

31:26

do we get to resell these these poor

31:28

little tokens? I think that's actually a

31:30

great point, Jason. I hadn't got, but

31:32

you're exactly right. No, it's like the

31:34

everyone's got amazing revenue growth

31:36

because it's the same little token going

31:38

to this little token.

31:39

>> I mean, if we all agree to have

31:41

essentially 0% gross margins, an

31:43

infinite number of us can keep reselling

31:44

tokens to each other, can't we?

31:46

>> This is our new 20 VC scale faster demo

31:49

day. We all resell a million tokens to

31:52

each other on the first week. So

31:53

everyone in batch O001 has a million ARR

31:57

its first week because we just resold

31:58

our tokens to each other. So it's

32:00

completely fair. The VCs don't mind. And

32:03

>> and you're exactly right. And the

32:04

sentence that you added in passing is

32:06

the key one. Until we all have to get

32:08

profitable, all this, you know, can

32:10

continue. And then at some point, that's

32:11

why I said I think you're starting to

32:12

see it. Someone's going to have to say,

32:14

assuming we want to have a net present

32:16

value and a cash flow, what's going on

32:18

here? And then

32:20

and then all this becomes more clear. I

32:22

I didn't comment on the um emergent labs

32:25

fastest to 100 million.

32:27

>> Jason, you actually tried You tried it,

32:29

didn't you? You thought it was good.

32:30

>> I did. I thought I mean listen, it's

32:32

hard for me to know the criticism,

32:34

right? You know, some folks in the press

32:36

in the India B2B environment tried to

32:38

make this some sort of scandal, right?

32:40

Because and in a sense, fair enough. If

32:42

you go to emergent labs and emergent

32:43

labs is sort of an Indian competitor to

32:45

Replit and Lovable, which I'll show you

32:47

what I learned in a minute, right? And

32:48

if you go right now to the homepage,

32:49

they say 0 to 100 million I think in 8

32:51

months. It's right there. It's the

32:52

biggest banner. So, in all fairness, if

32:55

you're going to put yourself out there,

32:56

not not just as a tweet, but if it's

32:58

going to be right there on your website,

33:00

one would expect 70 to 80% accuracy in

33:03

that number, ideally higher, right? So,

33:05

if it's lower than that, I think it's

33:07

fair that some daggers came out. But I

33:10

was curious. Um, but I don't actually

33:12

know what happened. Is it triple

33:13

counting to I I can tell you one thing

33:15

that I learned which I don't love, which

33:17

is that what a and a lot of AI startups

33:20

do this. So this is not unique to

33:21

Emergent. They kind of — instead of

33:24

getting you to use the free version, they

33:26

try to get you to immediately do a free

33:28

trial instantly that says it's $0 and

33:30

$20 a month thereafter. Now so many

33:33

folks do this. It is not unique to them.

33:35

It's probably best practice in most

33:36

accelerators. But I'm pretty sure that

33:39

means they recognize all $240 in ARR

33:43

that first month when you're paying

33:44

zero. and and they trick you because you

33:47

don't even Yeah, you do have to click on

33:48

the Stripe link, but you almost think

33:50

you're just using the free product. So,

33:52

is that if I do a $0 a month product um

33:56

that's discounted as a marketing cost

33:57

and I churn after 30 days, does that

33:59

count as $240 of ARR? I think for a lot

34:01

of startups it does. Okay, so that's a

34:04

fair criticism. I'm not saying this is

34:06

what emerges, but a lot of startups will

34:08

instantly recognize that as $240 in ARR,

34:11

which is how they rocket if you're

34:12

self-serve. That's how — otherwise

34:14

you can't get there that quickly, right?
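To illustrate the trial-to-ARR arithmetic just described: a $0 first month on a $20/month plan can be booked as $240 of ARR the day the trial starts, even if the user churns before ever paying. A toy sketch; the signup count and churn rate below are hypothetical.

```python
# Toy sketch of the free-trial ARR arithmetic described above.
# A $0 first month on a $20/month plan gets annualized to $240 of "ARR"
# at signup; the signup and churn figures are made up for illustration.

monthly_price = 20
trial_signups = 10_000
booked_arr = trial_signups * monthly_price * 12    # $2.4M of headline ARR on day one

churn_after_trial = 0.70                           # hypothetical: 70% never pay
paying_arr = trial_signups * (1 - churn_after_trial) * monthly_price * 12

print(f"Headline ARR at signup: ${booked_arr:,}")
print(f"ARR from users who actually convert: ${paying_arr:,.0f}")
```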

34:15

So, so they clearly did that. I will say

34:18

what was interesting is I overall I

34:20

think the criticism is probably

34:22

unfounded because I thought the product

34:24

was pretty good, much better than make,

34:27

like an order of magnitude better than

34:29

the disaster of make. Um, because I I do

34:32

a five-part test, a six-part test. The

34:34

first part is awareness test. So, I ask

34:35

it to redo the SaaStr.ai homepage. Uh,

34:38

actually, of all the platforms, it did

34:40

the best job. It it beat it beat it beat

34:43

all of them all of the leaders because I

34:44

redid this recently. I redid it and and

34:46

they're all good at it. Replit, Lovable,

34:48

v0, they're all good at it, they all pass

34:49

the test but it actually it actually was

34:52

probably the best and it and it passed a

34:53

bunch of the other tests. So I'm not

34:55

going to switch to emergent labs, but I

34:59

would say it's in the top 10% of vibe

35:02

coding apps. That's pretty good. So that

35:04

tells me it's a legit business. like

35:06

they did they did the work and a lot of

35:08

these they're just the the truth is if

35:10

you play with a lot of these even from

35:11

leaders like Nick's not the only one

35:13

that's crappy okay because they're

35:15

basically relying on the fact that

35:16

Claude Code does 90% of the work for you

35:19

right they're just putting the the

35:20

simplest wrapper around this and so they

35:23

did a good job but is I really didn't

35:25

like the the way they do the billing but

35:27

we'd probably have to to shoot half our

35:29

portfolio companies that that do like

35:31

PLG AI because I think it's a sus

35:34

practice I just don't like tricking you

35:36

with this $0 for the first month when

35:38

you think you're using a free trial,

35:40

right? That's the sus part. I I don't

35:42

love that kind of gray art. Um, but the

35:44

product's pretty good.

35:45

>> You know what I don't like when it comes

35:46

to confusing? I was wondering whether to

35:47

go off on one in this show and then I

35:49

thought, "Fuck it. Let's go off on one.

35:50

It's been a long day." I'm pissed off by

35:52

these tranched rounds. I see them all

35:55

the freaking time. The amount of Sequoia

35:58

rounds where it's like, oh, you know, X

36:00

raises money from Sequoia at 5 billion.

36:02

Trust me, Sequoia got in at one, but

36:04

they just club it together and then

36:05

announce the sum and then the latest

36:08

valuation and it's just very misleading.

36:11

The tier ones get in early, a tier 2,

36:14

tier three instantly marks it up saying

36:16

>> same as crypto, isn't it? For years,

36:18

what's the difference?

36:20

>> Well, I think the Andre crypto fund uh

36:23

the you know, essentially 80% off the

36:25

the token. What's the It's the same

36:26

thing, isn't it? This you're paying for

36:28

the paying for the signal. I think if

36:31

you break it down

36:33

first of all just so everyone's on the

36:35

same page because interestingly neither

36:37

Claude nor GPT was on the same page and

36:40

didn't know what a tranched round was and

36:42

they gave the old conventional venture

36:44

tranched round based on performance

36:46

milestones you know BS from back in the

36:49

day when we actually ran businesses

36:51

right so didn't have a clue about this

36:52

so let's be clear on the practice here

36:54

the practice here is when a a company a

36:56

hot company raises a round where there

36:59

are effectively two different prices per

37:02

share. A first, let's call it a first

37:04

close and a second close even if they're

37:06

at or near contemporaneous where the

37:08

first one might be at 250 pre

37:12

and the second one is at a billion pre

37:14

and you know the highlight and the

37:16

headline is always at a billion pre.

37:19

There's two impacts of this. First,

37:21

let's do the simple one where there's

37:22

just a single participant in the round,

37:24

right? That's where, you know, if I'm

37:25

the new investor, I want to pay 600.

37:29

The company wants a headline of a

37:31

billion.

37:32

And to win the deal, someone says,

37:34

"Okay, let me put some money in at 250,

37:36

some money in at a billion. I can do

37:38

math because I'm paid to do math because

37:40

I'm an investor. So, I know my overall

37:41

basis is a billion, sorry, 600 million.

37:44

So, I'm getting what I want and the

37:46

company's getting what it wants, which is

37:47

a headline number of a billion, right?

37:49

It's silly, but that's all that's

37:52

happening in that case. That's the

37:53

single-participant tranche deal, right?
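A minimal sketch of the blended-basis arithmetic in that single-participant case: money goes in at two different prices, the headline is the higher one, and the investor's effective entry valuation is the blend. The dollar split below is hypothetical; only the $250M and $1B prices and the roughly $600M blended basis come from the discussion, and dilution subtleties are ignored.

```python
# Sketch of the blended entry price in a two-tranche round, as described above.
# The dollar split is hypothetical; only the $250M / $1B prices and the ~$600M
# blended basis come from the discussion.

def blended_valuation(tranches):
    """tranches: list of (dollars_invested, valuation); ignores dilution subtleties."""
    total_dollars = sum(d for d, _ in tranches)
    total_ownership = sum(d / v for d, v in tranches)
    return total_dollars / total_ownership

# e.g. ~$22M in at the $250M price and ~$78M at the $1B headline price:
print(blended_valuation([(22, 250), (78, 1000)]))  # ~600 (i.e. a ~$600M blended basis)
```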

37:56

And it's and for some if a company wants

37:57

a headline, that's what they get, right?

38:01

Generally those things come back to bite

38:02

you because by definition, if you are

38:04

the company, just as the investor can

38:06

do math, presumably you can do math. If

38:09

you accept that combined deal, you're

38:10

implicitly saying, "I know I'm only

38:12

worth 600, but I'd like the optics of a

38:14

billion." You better be damn sure that

38:15

your next round you're at one and a half

38:17

billion. Otherwise, you'll have the

38:18

optics of a down round. And if you're an

38:20

optics believer, that's probably worse

38:22

than the uptick. Right? So that's kind

38:25

of the single participant version. The

38:28

much more annoying version that Harry

38:30

clearly was getting on his high horse

38:32

about is when you have the same

38:34

structure,

38:35

but access to those rounds where the the

38:39

lead investor maybe does all of the 250

38:41

pre- round and only half of the billion

38:44

round and then some new investors just

38:46

get to do the billion round. So

38:48

literally at the same time the lead

38:51

investor is investing at 600 million and

38:54

the follower investor, the less-marquee

38:58

investor is investing in the same asset

39:01

at a billion and there's I don't believe

39:03

there's right or wrong in money there's

39:04

just money right that's where at the

39:08

minimum you have to look yourself in the

39:09

mirror as the other investor and saying

39:11

wow that's the price of being cool right

39:14

that's the price of access I'm paying

39:16

50% more because I just can't access

39:18

that deal, right? And that feels like

39:22

pretty invidious thing to I mean again

39:24

going back to the comment —

39:26

you got to remember if you think about

39:28

and again trying to avoid morality and

39:30

saying oh because it would feel shitty.

39:31

I mean you really would feel like a

39:33

loser if you did that but let's play it

39:34

out. This is a situation where the the

39:38

the lead investor let's say it's sequoia

39:40

because everything good and strong

39:41

should be Sequoia. They're admitting

39:43

it's only worth 600 on average and

39:45

they're just doing this fakey

39:46

transaction. The company is admitting

39:48

it's only worth 600 on average because

39:50

they're taking the money at a blended

39:52

cost of 600. So what you're saying doing

39:54

at a billion is you're either saying

39:56

either I have a lower cost of capital

39:57

and I'm willing to take a lower return

39:59

than everyone else or the only positive

40:01

spin you can come up with is the company

40:04

thinks it's worth 600. Sequoia thinks

40:06

it's worth 600. But I am smart enough

40:08

even though I don't have access. I am

40:10

smart enough and clever enough to know

40:12

that it's really worth a billion and I

40:14

should do it at a billion even though I

40:15

can't get to 600 and I'm willing to put

40:17

up with the upfront tax and foolishness

40:21

look because I think 6 12 months from

40:24

now it'll be obvious that I bought at a

40:26

great price and maybe I look like a

40:27

genius.

40:27

>> Yeah, but we've well we've entered an

40:29

era though the I think the meta thing

40:31

maybe this wasn't exactly what you were

40:32

queuing up Harry but it is tough. We've

40:34

entered an era where so many founders

40:37

are obsessed about headline prices.

40:39

Obsessed. They're obsessed coming out of

40:41

demo day. They're obsessed. They're

40:44

obsessed once they cross a billion,

40:46

which I think should be a moment to take

40:48

a pause because of M&A options. They're

40:51

obsessed about driving to 11 billion and

40:53

9 billion and one-upping their

40:54

competition. And the numbers have become

40:57

a joke to many founders, right? They

40:59

just don't — 'joke' is the wrong term — they

41:01

don't think through any of the

41:03

ramifications of the valuation they're

41:05

hitting and they don't care. And I'm not

41:06

even saying that's bad. I mean, I think

41:08

burning the bridges is a good way to

41:10

have a big outcome, but it's become

41:12

utterly gamified on many levels, right?

41:14

It's just become gamified. And so this

41:16

tranched round thing is just part of

41:18

gamifying it, right? It's been true of

41:20

YC since I started investing there.

41:22

There was always a cheaper price before

41:24

demo day if you're reasonably hot, a

41:26

higher price at demo day, and then a 20%

41:28

or 30% after demo day. So that version

41:30

has just become institutionalized and so

41:33

be it. If it's what the founders want,

41:34

if they want to gamify it, so be it,

41:36

right? I just don't think raising at

41:38

five or eight billion when you're at 80

41:40

million or 100 million of suspect ARR is

41:43

the most exciting accomplishment in the

41:44

world. Like I'm not going to like I'm

41:46

I'm going to send a few thumb emojis on

41:48

the email, but that's about it.

41:52

>> That's about it. They're all They're all

41:54

fake anyway. They're all just bets,

41:55

right? These are not public companies.

41:57

It goes back to your point though on

41:58

emergent labs and the graph doing the

42:00

eight months to 100 million. The

42:01

gamification of like the race to 100

42:03

million. I'm not choosing Emergent Labs

42:05

but

42:05

>> they listen I think they built a good

42:06

product. I think I'm sure they've been

42:08

overly liberal cuz whether it's 100 or 80

42:11

or 60 I don't care. It's pretty damn

42:12

good, right? Whatever it is, but if

42:14

you're going to if you're going to if

42:15

you're going to do that, you deserve the

42:17

the daggers to come out when it's not

42:19

100%. Right. One that I thought was

42:21

fantastic, exciting. I always like to

42:23

see a potential IPO — Oura to IPO shortly.

42:27

Um I thought this was fascinating. It's

42:30

been an incredible journey actually from

42:31

like you know Scandinavia these founders

42:34

building this business. It's had a

42:35

couple of CEO changes. Um the business

42:38

is actually in incredible shape both

42:40

actually and Whoop announced today that

42:42

they raised I think it was 500 million

42:44

at 10 billion fitness and health data.

42:46

Do you know what actually Rory Jason's

42:49

annoyingly right again? I don't know if

42:51

you remember his predictions, but he

42:53

predicted, if I'm not wrong, that 2027

42:56

would be the year for like human

42:58

healthcare data and longevity.

43:00

>> Yes. And it looks like it might even be

43:01

2026.

43:03

And Yeah. Know and you know, the great

43:05

thing about both stories is very defend.

43:08

I mean, you know, Johnny Iverside very

43:10

defendable from this is not an AI envy

43:13

story. This is I mean they use AI and

43:15

what they do but these are fundamentally

43:17

standalone products with a clear

43:19

consumer value proposition and they're

43:22

you know they're not going to be Claude-

43:23

coded on Friday. I totally see it and

43:25

they clearly have had critical mass in

43:26

terms of revenues. I think it's awesome.

43:28

I think the question, listen, the

43:29

interesting thing for these products

43:32

uh obviously it's exploded is

43:34

they are recurring you know going back

43:35

to the topic of ARR,

43:38

these are recurring revenue products

43:39

right um for the most part right fairly

43:42

expensive subscriptions um and they're

43:45

exciting until like Peloton when they

43:47

aren't, right? Now, there's not a

43:49

$2,000 cost here um but um and I'm not

43:53

being critical I I think that they're

43:55

exciting but there's also a faddishness.

43:57

People can switch. So the RR, the ARR,

44:01

the pirate ARR, um, uh, what multiples do

44:05

these companies deserve, what it is, I'm

44:07

not smart enough to know, but the

44:09

acceleration is a force of nature right

44:11

I'd love I'd love to be a seed investor

44:12

don't get me wrong

44:13

>> Do you think there's a faddishness in

44:14

the same way I think we as

44:16

>> I think you can switch. Harry,

44:17

you're into fitness. I'm not so

44:20

much but I run 360 days a year 5 miles a

44:22

day for 10 years so if there were a

44:24

better treadmill a better device a

44:27

better thing I would switch and and

44:29

whatever. You're fairly fit, Harry,

44:30

like if you wear Oura and you

44:32

love it but Whoop is better and you

44:34

care you're going to switch. So it's not

44:36

it's not ServiceNow ARR, right? It's not

44:40

you will you're loyal you're loyal but

44:43

but there's just some disruption risk,

44:45

like look at Peloton when Peloton blew

44:47

up but actually as the world changed

44:50

even though people love Peloton right

44:52

super high NPS. You remember the

44:54

Peloton addicts of 2020 on Zoom they

44:57

they loved it but when the world changed

44:59

they just the simple answer to Peloton is

45:01

they just switched they they just

45:02

switched. And uh Whoop is different than

45:05

Oura, and there could be a Whoop or

45:07

an Oura, and uh maybe one is on your

45:10

ankle and it has your AI rock from

45:12

Jony Ive in it and we'll switch. Two

45:14

comments on this. One, disclosure: we are

45:17

lucky enough to have a small investment

45:19

in Oura through the acquisition of one of our

45:21

companies so I don't have a ton of

45:23

information so I'm not going to breach

45:25

any confidentialities, but just out of an

45:26

abundance of caution I'm not going to

45:28

comment on numbers at all, right? Great

45:30

products right but to your point Jason

45:33

on Um,

45:35

you know, it's not ARR like ServiceNow.

45:37

let me be direct. Get the [ __ ] over it.

45:39

Right. Not every business on the planet,

45:41

Hang on. Not every business on the

45:43

planet has five-year lock-in designed in. If

45:44

you're running a bar down the street,

45:46

every night I can go drink at a

45:48

different bar. If you're selling

45:49

Coca-Cola every day, I can switch to

45:51

Pepsi, right? If

45:53

you're running Amazon consumer, every

45:56

day I can go search and go on

45:57

Walmart, right? Not every business is

46:00

going to have enduring kind of long-term

46:02

lock in. And that's obviously you'd

46:03

prefer to have lock in, right? But there

46:06

are lots of businesses that have been

46:07

around for 50 years where every day they

46:09

have to earn the right for the consumer

46:11

to go to them, right? And I think

46:13

there's no doubt in my mind that any

46:15

kind of consumer hardware software

46:17

combination product has some residual

46:19

asset from the subscription. But then

46:20

yeah, that every new device has to be

46:22

awesome. You're in competition with

46:24

other awesome products. It turns out

46:26

capitalism is hard. You know, if you

46:28

want to make 10 billion in value, you

46:29

got to deliver value for your consumers.

46:31

And I think, for what it's worth, on

46:33

Peloton, I actually think what really

46:35

happened to them, it's a little like the

46:36

Zoom story, is demand that would

46:39

have been wonderful. It would have been

46:40

the greatest stock ever had that demand

46:42

been spread out over five or six years

46:45

increasing at 20% a year. We'd be

46:47

talking about the Peloton compounding

46:49

machine. Instead, everyone bought the

46:51

damn thing at the same time. They

46:53

staffed up to kind of meet that demand.

46:56

The market was wildly saturated and then

46:59

the stock went down and it broke the

47:00

narrative. So I I do agree, you know, I

47:04

yeah, there's nothing you can do to make

47:05

a market bigger than what it is. But I I

47:07

I think they got whiplash by virtue of

47:10

the COVID demand spike followed by demand

47:13

fall off.

47:14

>> No, I the question I think the meta

47:16

question listen, Oura, I as Harry said

47:19

I guess I called it these are great

47:20

markets. They're large markets.

47:22

markets where people will pay actually

47:24

relatively high subscription fees for

47:26

data, right? A lot of attractiveness. The

47:28

meta question for venture is you know

47:31

the classic Peter Thiel Zero to One, only

47:33

competition's for losers is what Dr.

47:36

Thiel said competition's for losers.

47:38

Competition destroys profits.

47:41

Monopolies drive innovation. You want to

47:43

invest in monopolies. And so that's just

47:47

my meta anxiety is if these are

47:50

unmonopolizable markets, are they good

47:52

ones for venture or not? And I I I

47:56

obviously there's two sides to it, but

47:57

we hope I would feel more

47:59

comfortable investing in things that

48:01

become monopolies. I mean it's a it's a

48:03

better landing place um than uh

48:08

than investing in bars

48:10

>> And you can't ascribe the

48:12

same durability of revenue to this as

48:13

you can what like as much as I love

48:16

>> but on the other hand you can't ascribe

48:17

super high growth and you can't ascribe

48:19

a big TAM, right? Just look, if there

48:21

were enough monopolies to do even one

48:23

good monopoly a year I'd be in right and

48:26

you know speaking of, Founders Fund is

48:28

about to get the all-time prize because

48:30

they invested in the space monopoly and

48:32

20 years later they're going to cash in

48:33

their chips, right? Monopolies are

48:35

better businesses than competitive

48:37

markets. But I do think you can still

48:39

build dollars of value from a high-quality

48:42

consumer product, right? And there are

48:44

lots of prior examples of that, you

48:46

know, and yet we all understand the

48:48

dynamics of I think it's much less comp.

48:50

I mean, actually, for what it's worth, I

48:53

think if you look at consumer products

48:55

that flame out, like the GoPro, it's

48:58

much less, and I'm doing this on the

49:00

fly, but it's much less a competition

49:02

issue. It's not like GoPro died because

49:05

a competitor to GoPro emerged, right?

49:07

It's that saturation is as big a problem

49:10

as anything else.

49:11

>> Well, DJI might disagree with you. I

49:13

mean, there was a whole step function in

49:14

the industry that they got left

49:16

behind, right?

49:17

>> Yeah. Would you prefer $2 billion in

49:20

consumer hardware revenue or $2 billion

49:22

worth of five-year contracts? Um, like

49:25

Palantir. Yeah, I'll take the contracts

49:26

with the 90% gross margin and the 5-year

49:28

lock-in, please. Your starter for

49:30

10. But you got to give a lot of

49:31

respect.

49:32

>> I think maybe the more interesting

49:33

question, Rory, that you brought up. Um,

49:35

because so much has changed. This is our

49:37

50th show. So much has changed, right?

49:38

When we started the show, uh, uh,

49:42

durable public company revenue, despite a

49:44

slowdown in the top line was the gold

49:45

standard, right? It was the best revenue

49:47

out there. Fast forward to today, do we

49:50

Oura going public, do we give a crap what

49:53

type of ARR it is? Because the durable

49:55

software stuff is trading lower than the

49:57

S&P 500. Maybe I'd rather have ring

50:00

revenue, and I'm fine with a somewhat

50:02

suspect uh customer lifetime value

50:04

because the software value is so

50:06

low. Maybe I don't care where my ARR comes

50:08

from, right? It used to matter. It used

50:10

to matter, right? We'd be in

50:12

board meetings where you would torture

50:13

companies so that they would have more

50:15

ARR and that they would have less

50:17

variable revenue. I mean that seems

50:19

archaic today.

50:20

>> Yeah. And and I remember doing that. I

50:22

remember telling people not to do that

50:24

because I'm a big believer that you

50:26

should sell your product the

50:28

way the customer wants to buy it. And I

50:29

agree one of the things I hated about

50:30

venture was when people would say oh

50:32

make it all recurring revenue. But then

50:34

the fun one that's actually really

50:35

relevant right now is you remember

50:37

everyone would say oh you know it's a

50:39

hardware product but all the value is in

50:41

the software so we're really like a

50:42

software company and now hilariously

50:44

everyone's going oh thank god I've got

50:46

hardware because hardware is defensible

50:47

not software right and I think it's a

50:50

big picture comment is you should

50:52

conform your company around your

50:54

customers and your model not your VCs

50:56

because I agree with you this kind of

50:58

pretend it's ARR but then next year we

50:59

hate it is just a total waste of time for

51:01

entrepreneurs. Things are what they are

51:04

and you do best in business if you

51:06

actually say what they are and just live

51:08

and die by that. Most consumer products

51:11

have high volatility associated with

51:13

them. You better have a damn good R&D

51:15

function and continue to build great

51:17

products.

51:17

>> We looked today this week they also

51:18

talked about how I think Allbirds was

51:20

was it acquired for less than 30?

51:22

>> It was acquired. Yeah, I was literally

51:23

about to bring this up Jason. It was

51:24

acquired by AMX for $39 million. So my

51:27

question is if a company like Oura goes

51:29

public and you see weakness in a

51:31

quarter, should you dump this thing

51:32

instantly like Allbirds um versus forgive

51:36

a little bit of weakness in a in a

51:37

Salesforce or ServiceNow?

51:39

>> Yeah, I I I'm going to avoid any

51:40

specifics genuine comment here, right?

51:42

Because it's not appropriate. But I

51:44

would say something unlike the other two

51:47

guys, I ran a textile manufacturing

51:49

company 30 years ago. The technology

51:50

required to make an Allbirds or a shoe

51:53

is not the same as the technology

51:54

required to make a modular electronic

51:56

device that sits on the human finger and

51:58

measures blood. Either of these kinds of

52:00

consumer electronic products, they're

52:02

not a monopoly in the same way Nvidia

52:04

is. But it's a pretty rare

52:07

number of companies that can do that.

52:09

They're not go down. Put it this way,

52:11

Jason. I'll name a wearable, you'll name

52:13

a wearable, and then I'll name a

52:15

sneaker, and you'll name a sneaker.

52:16

We'll be done with wearables long before

52:18

we're done with sneakers cuz there's a

52:20

lot of different sneaker companies. And

52:21

yeah, turns out sneakers are easier to

52:23

make than wearables, which are easier to

52:25

make than Nvidia GPU chips.

52:27

>> Speaking of like, do we care? What do we

52:29

actually care about? There were two that

52:32

I I don't know if you guys know this,

52:33

but I have wonderful partners, and one

52:35

of my partners is much more intelligent

52:37

than me, which Rory, you're going to

52:38

make some form of gag about, but he

52:40

helps me put together some of the

52:41

schedules, too. And he was like, "Whoa,

52:43

I had no idea about this." He was like,

52:45

"Whoa, Epic Games laid off 25%."

52:49

I didn't even hear about that.

52:51

>> Yeah. And then I had Mark

52:53

Andreessen on your last pod sort of laughing

52:55

about how we all overhired in 2021.

52:58

>> Well, Marc Andreessen was very clear. He

53:00

thought that we were all using AI as an

53:02

excuse and that we were all overstaffed

53:05

by 50 or at least 75%.

53:08

>> Did any of his portfolio companies do

53:10

that overhiring?

53:10

>> No, Harry. Logically, it's it would be

53:13

75 or at least 50%. Overstaffed by 50 or

53:16

at least 75. This doesn't make any

53:18

logical sense, but keep going. Just

53:20

picking up on the errors here.

53:22

>> Rory,

53:23

>> he's pissed now. He's pissed.

53:25

>> I would love to see you do a day of my

53:27

life.

53:27

>> I would love

53:30

I know you've been cranking.

53:32

>> I will give you two hours sleep for 6

53:35

hours a day running two companies at

53:37

once and then you come.

53:38

>> Now I'm feeling guilty. Move on. But no,

53:40

>> don't worry. But but point being like

53:45

it went completely under the radar.

53:46

>> They didn't try and do an AI [ __ ]

53:49

story. They basically said, you know, um

53:51

daily active use of their

53:54

Fortnite game and their games is down,

53:55

so your revenue is down, so you take

53:57

your expenses down. It struck me as

53:59

a no [ __ ] layoff announcement. It's

54:01

like, you know, we sell less stuff, we

54:03

have less people. It sucks, you know. And

54:05

again, I really do try never to be

54:07

cavalier about people losing their jobs

54:09

because every one of those has to put

54:10

food on the table. They're not earning

54:12

the kind of money we're earning and now

54:14

they got to go out and find another job

54:15

in a shitty job market. It sucks. But

54:17

the lesson is and that's why I respect

54:19

them. It's like we're selling less so we

54:22

got to do what we got to do to keep the

54:23

company profitable.

54:24

>> Guys, we keep talking about these

54:25

layoffs and these big numbers. I mean,

54:27

it was over a thousand people laid off

54:30

in this layoff. A thousand. The numbers are

54:33

relatively meaningless and we've had so

54:35

many of these conversations.

54:37

What happens to the labor markets?

54:39

>> Well, one thing on the Epic thing if

54:41

you um and the Wall Street Journal did a

54:43

good article on this one too this week

54:45

on the permanent decline of Hollywood

54:48

employment. It's permanently in decline.

54:50

It's

54:53

in decline because fewer uh movies

54:55

and TV shows are being made. TikToks

54:57

and YouTubes are doing it. And it's

54:58

in permanent decline because every other

55:01

country provides larger subsidies,

55:03

right? And so there's this permanent

55:04

decline in Hollywood labor. I think

55:06

entertainment sort of shows us

55:08

the future. Epic Games is

55:10

entertainment too, right? And uh they

55:13

will absorb as much AI and technology as

55:17

they can to adapt. And

55:20

it's just early. It's just early.

55:21

They've had to adapt to YouTube. They've

55:23

had to adapt to social gaming. And I

55:25

think uh we talk about these you know

55:27

thousand people at lasting or whatever

55:28

but I think Epic Games is just it's I

55:31

think it's a more interesting view of

55:32

the future than block. We talk about

55:34

folks might vibe code a B2B app but

55:36

content's already being massively

55:38

disrupted

55:39

>> And some part of that is, as you pointed

55:41

out to me when I got it wrong a few

55:42

episodes back, AI-related in terms of

55:44

recommendation engines. But I think a

55:46

lot of it is just a very competitive

55:48

attention economy. You're right.

55:50

Fortnite was the game everyone

55:52

talked about. Now it's not. It's the

55:54

nature of the gaming industry. So yes,

55:57

what does that mean?

55:58

>> It's the Fortnite circle coming for

56:00

everybody at the end of the game for

56:02

everybody. Even Fortnite. The Fortnite

56:04

circle has come for Fortnite itself. It's

56:06

surrounded itself. Poor Epic Games is in

56:08

the middle of its end game of

56:10

Fortnite.

56:12

>> It's just hidden content creators

56:13

shooting it out at the very end. It's

56:15

coming for all of us. The Fortnite

56:17

circle is coming for all of us.

56:19

The other one that kind of

56:21

relatively was I think maybe a little

56:23

bit overlooked is reports of Manus

56:25

founders. Manus obviously for context

56:27

being bought by Meta recently. Um Manus

56:30

founders trapped or kept in China.

56:33

>> So just again give people context and

56:35

then put out one question mark there.

56:38

Manus was a company originally based in

56:40

China had some Chinese investors then

56:42

redomiciled to Singapore. Benchmark

56:45

invested effectively refounded as a US

56:48

Singapore company. Meta acquired it. I

56:51

want to say and I use the word past

56:52

tense acquired because my understanding

56:53

is the transaction's closed and the

56:55

money's moved, though interestingly neither

56:57

ChatGPT nor Anthropic were clear on that, but

57:00

my understanding is that's what happened

57:01

but then now the latest thing is two of

57:03

the founders. The Chinese government takes a

57:05

dim view of this because they don't want

57:08

Chinese talent leaking overseas and

57:10

going to the US and effectively not

57:13

being Chinese anymore and they

57:15

feel it as a brain drain. So they did

57:17

something that was pretty coercive in

57:19

the sense of two of the key founders of

57:21

Manus I think were either in China or

57:24

summoned to China and they're no longer

57:26

able to leave. So those are the

57:29

facts and yeah of course you care. I

57:31

mean I think that well starting from

57:33

scratch I mean that sucks. I wish them

57:35

the best because that's not a pleasant

57:37

place to be. I mean, I think you've had

57:40

the Jack Ma thing of, you know, at

57:42

Alibaba of effectively going, as it

57:44

were, under the radar for a few years

57:46

when you kind of incurred the

57:47

displeasure of the administration. You

57:50

also have people who've had

57:51

significantly worse consequences than

57:52

that. So, let's start with the basics.

57:54

You wish them all the best, right? Um,

57:57

but

57:57

>> I don't think another deal like this

57:58

would happen, do you? I think this whole

58:00

Singapore washing thing is over. It's

58:02

over.

58:02

>> I I totally agree. That's where I was

58:03

going to go with that long preamble.

58:05

I'll tell you who did notice. Maybe no

58:07

one in America spent any time thinking

58:08

about it, but every Chinese founder who

58:11

was thinking about doing this is going,

58:12

"Hm

58:14

I don't know how I feel about this. I

58:17

don't know if I can do this deal. I do

58:18

know if I do this deal, I am never going

58:20

home again." But I'm with you, Jason. I

58:22

think all these other China washing

58:24

deals, they're put on pause or they're

58:26

put on re-evaluation. Or the next thing is

58:29

going to sound harsh. It's a fairly

58:30

coercive regime. If your family's not

58:32

out of the country, do you have exposure

58:34

there? Right. I think it just

58:37

shows I mean authoritarian governments

58:40

can take pretty drastic steps to

58:42

impact their citizenry if they want to.

58:43

And I agree, Jason, it makes it really

58:45

hard to imagine doing another one of

58:48

these deals without being worried about

58:50

this consequence. But hopefully they'll

58:51

kind of go, naughty, you pay 50%. Like you

58:54

know California makes it hard to leave

58:56

too but if you pay them 13% they'll let

58:58

you go to Nevada right? Yeah, hopefully

59:00

it turns out to be something like that. And

59:02

please God, it's not something more, you

59:03

know, coercive. But I I agree, Jason.

59:05

Wouldn't do another one.

59:06

>> You know, in venture, you take risk,

59:07

right? It's part of the job. So, we've

59:09

all had deals where there's some rule,

59:12

some corner that was cut and we talked

59:14

ourselves into it's okay, right? This

59:17

this is there's something

59:19

weird about this company, but we

59:21

convince ourselves, by talking to

59:24

some mediocre lawyer or asking an LLM

59:26

today that it's okay. So like the

59:28

Singapore washing must work, right?

59:30

They've moved to Singapore. It's got to

59:31

work. And you convince yourself. You

59:32

talk to a few people and you take the

59:34

risk and it bounced. It appears to

59:36

have bounced the right way for Benchmark

59:38

and Friends, right? It appears they've

59:39

gotten their money, but you don't do the

59:41

next one, right? And there's 242,000

59:43

millionaires in Singapore. The majority

59:45

of the inflow is Chinese. You don't do

59:47

the next deal. Maybe other capital does

59:49

the deal, and that's fine, right?

59:51

Capital is fungible. But you just

59:52

eventually just don't you just can't do

59:53

the next one like this. It's too risky.

59:55

What do you do if you're Meta? Part of

59:57

the asset you're acquiring is the team.

59:59

>> Two billion is not a lot for Meta and

60:01

they have the product.

60:02

>> Yeah. What are you going to do, Harry?

60:04

What would you recommend? Getting angry

60:06

at the Chinese. That'll work well for

60:07

them, right? I mean, I think it'll be,

60:10

you know, yet another acquisition that

60:13

looked clever, but in retrospect wasn't

60:16

amazing.

60:17

>> Well, listen, for Meta, I'll just say

60:18

one thing. I I I only have a tiny bit of

60:20

information, but I I it appears to me

60:22

Manus is running mostly smoothly

60:25

as an application and a company. Now, I

60:28

don't know if the founders are working

60:29

out, you know, I I certainly feel

60:31

strongly when you lose your founders,

60:32

you lose the heart and soul of your

60:34

company, but in the short term, I I

60:36

don't think it's a big deal for Meta

60:38

outside of the founders because it's

60:39

running smoothly, right? In

60:41

the short term, it's not down. The

60:44

team's functioning, they're running and

60:45

um but it's crazy

60:47

>> And at the risk of being Pollyanna but

60:49

also wanting to assume the best of

60:52

people, I would hope that the Meta

60:54

management team and board to the extent

60:56

they do have any influence can help

60:57

these guys come to an amicable end. And

60:59

if it requires a tax settlement or

61:01

whatever, you know, just you you don't

61:04

want to leave

61:05

people you just acquired in limbo. At

61:07

some zoomed-out level when you listen to

61:09

the rhetoric in both capitals you just

61:13

have to realize that trying to tread

61:15

between these two countries is

61:16

pretty hard right now right you know you

61:19

know we have China hawks in the US

61:20

government they obviously have a whole

61:22

ton of US hawks or whatever the

61:24

equivalent is there's a real perception

61:26

of competition you know we don't let

61:28

them buy the Nvidia chips etc etc you're

61:32

playing with fire in that thing and

61:33

sometimes it bites you

61:35

>> I just think overall it's natural

61:37

given the outcomes in AI and given the

61:39

growth. I think it's tied to taking

61:42

the highest levels of risk we've also

61:44

taken, because the payoffs seem to be

61:45

there and when this deal happened folks

61:47

kind of thought this was aggressive

61:49

Benchmark's never done a deal like this

61:50

why are they doing a deal like this it's

61:52

not even very cheap right it seems a

61:54

little crazy and they're like well we've

61:55

never seen anything grow like this and

61:56

the team's incredibly talented right so

61:58

they took a little bit of risk and um

62:00

and they made their

62:02

profit we're all taking more and more

62:04

risk, folks. You know, now it's a

62:06

week of revenue at a demo day. I did a

62:07

million dollars my first week, it's

62:09

amazing. What about the second week? I

62:10

don't know. But as long as it

62:12

all works out in the aggregate. Um and I

62:15

think this is why nobody cares to

62:17

Harry's point. I cared

62:18

about Manus. I added it to the

62:20

list. I don't think anybody cares. We're

62:21

all focused on getting a million dollars

62:23

our first week.

62:24

>> Just a good realization that the worst

62:25

thing that can happen is not just oh you

62:27

lose your money. There are worse things

62:30

than that.

62:31

>> I mean speaking about calling their shot

62:33

and making billions of dollars. Steve

62:35

Jurvetson. He's tied his career to Elon

62:37

very smartly. So that's not in any

62:38

negative way in terms of the investments

62:40

that he has. Plowed, tripled, doubled,

62:43

quadrupled, everything in between. Um

62:45

leaves California, buys the most expensive

62:48

home in Incline Village and these

62:50

were Jason's notes. Will anyone with

62:52

liquidity be left in California? What if

62:55

California is structurally bankrupt?

62:58

>> Well, I mean, yeah,

63:00

>> it's not a great sign when they keep

63:02

leaving, is it? It's not a

63:03

positive

63:05

>> Rory staying Jason.

63:07

>> I mean, look, first of all, you were

63:09

exactly right. All credit to Steve and

63:10

all more power to him. I've known him

63:13

intermittently for 30 years. He made a

63:14

brilliant call to align with SpaceX,

63:16

been on the board of Tesla and SpaceX

63:19

for a while and then came off

63:21

those boards, obviously, back in the day. But

63:23

SpaceX too, yeah, he's he's put his

63:25

money in a compounding machine and now

63:27

he's clearly hit the DPI moment, right?

63:29

But yeah, going back to the thing, yes,

63:31

capital, I mean, the truth is this,

63:34

that's why we said last week, ultra

63:36

high net worth people have a high degree

63:38

of mobility. And unfortunately, if you

63:42

put the hammer up too high, they can

63:45

leave and choose to go across the border

63:47

to Incline Village and save 13% on um

63:52

any realized gains. Plus, as we pointed

63:54

out, 5% on all gains if this wealth tax

63:57

passes, you know, at the margin, why

64:00

wouldn't you? You know, it's not like

64:02

you need to be in California to be a

64:03

Tesla board member or a SpaceX board

64:05

member given they're down in Texas. So

64:07

yeah, actions have consequences.

64:09

>> Well, it's interesting also this week um

64:11

Washington state did pass their 9.9%

64:14

state income tax for millionaires and

64:15

the governor explained why he

64:18

signed it, because there's

64:19

a lot of folks who said don't do it,

64:20

right? I mean already Howard

64:22

Schultz left. He said today, well,

64:24

they just deserve to pay more. And that

64:27

may well be true. It it may well be

64:29

true. Like I don't want to debate that.

64:30

This is not political, right? I'm I'm

64:33

more concerned about the tipping point

64:35

when uh we kill golden geese. You know,

64:39

there've been Washington and California

64:42

and to a lesser extent New York have

64:44

been the golden geese. It's uh you

64:45

know, Washington said they're going to

64:47

lose money. They're not going to make

64:48

money on this. It appears that

64:50

most folks that are neutral or right

64:52

have said California will lose money on

64:53

the billionaire tax. Everyone's left and

64:55

the tax itself assumed massive

64:57

amounts from Larry Ellison who's been

64:58

gone a half decade, right? So, no one's

65:01

It's just I do worry they're all

65:04

leaving.

65:07

Everyone that doesn't work at OpenAI or

65:08

Anthropic uh you know on this show we've

65:11

done 50 and I said in the beginning

65:14

of this that you'll leave after the

65:15

Series B, and I now see that used

65:17

again and again by these folks who are

65:18

on the right on it. They say all the

65:20

founders will leave after the Series B.

65:21

But it may happen by show 100. And one

65:24

of the arguments I make is because you

65:26

know the truth is this: articulating the

65:28

argument to, you know, the activist

65:31

on the other side as being you're being

65:32

mean to the billionaires is of genuinely

65:35

no interest and being mean to a

65:36

billionaire is actually a feature right

65:39

but I think the real articulation is

65:41

this

65:43

if you actually are losing revenue

65:45

that won't be available to California

65:48

and the marginal dollar in California

65:50

probably goes into you know payment for

65:53

homelessness services, payment for young

65:54

kids, payment for foster homes,

65:56

payment for marginal social welfare

65:58

services that are easy to defund when

66:00

times are tough, right? And by choosing

66:02

to obtusely tax without any attention to

66:05

ability to collect that money, you've

66:08

actually reduced the revenue that's

66:10

available to you, right? And that's the

66:12

argument you have to make to

66:13

someone on the other side of the table.

66:15

you have literally chosen something

66:17

instead of getting, you know, 200, 50, I

66:20

don't know, pick a number, 50 million, from

66:21

the Larrys and Sergeys and the Jurvetsons

66:23

of this world you went for 200 million

66:25

and now you're going to get zero and

66:27

what that means in real terms is

66:29

somewhere down the line long after all

66:31

these changes have been made somewhere

66:33

in Sacramento someone will zero out a

66:35

line item on the budget and let me give

66:37

you a clue it won't be payments to the

66:39

teachers it won't be payments to firemen

66:40

it'll be marginal services to marginal

66:42

people, that your crass stupidity and

66:46

desire to make a political point has

66:49

ended up costing them money and that's

66:50

the only argument that moves the needle

66:52

because it's true and you're

66:54

right, what you're saying is that it will

66:56

have a net negative return. How do

66:57

you feel, Rory? If you were Steve,

67:00

would you have left?

67:01

>> I think from my perspective I'm just so

67:03

glad to be in California it's so

67:04

wonderful I moved around a lot early in

67:06

my life I have my friends here have my

67:07

life here at the margin the whole point

67:09

of having money is to be able to do what

67:11

you want and for three or four or five

67:13

or even 15% of your income. Do you

67:16

really want to leave? Now, I will say

67:19

that's why you can tax income

67:22

relatively highly because it comes all

67:25

the time and you can't control timing

67:27

and therefore you have to uproot your

67:29

whole life for the rest of your life to

67:31

avoid it and I don't think it's worth

67:32

it. So, I wouldn't move to avoid income

67:33

tax. Conversely, if you have this

67:36

pending capital event where literally in

67:39

one year you're going to sell quote all

67:41

your SpaceX stock and realize a $2

67:43

billion gain and you're going to pay an

67:47

extra 13% of that in California, which

67:49

is $260 million, maybe you turn to your

67:52

wife and say, "Honey, for the next two

67:53

years, why don't we live in Incline

67:55

Village 165 days? I'll pay for the

67:58

plane. We'll go back every week. You

68:00

won't lose contact with anyone and we

68:02

will save $260 million and you go hm

68:06

that's real coin. So, and that's the

68:08

point about, you know, that's not the

68:10

life I live. That's not the situation

68:11

I'm in. But that's the argument you

68:13

make. It's like it's not crazy.

68:16

>> That's real coin, baby.

68:18

>> That's real coin.
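A minimal back-of-the-envelope sketch of the arithmetic Rory describes above; the $2 billion gain, the 13% California rate, and the roughly $260 million savings are the figures quoted in the conversation, and the small Python helper below is purely illustrative, not tax advice.

def california_savings(realized_gain: float, ca_top_rate: float = 0.13) -> float:
    # State tax avoided by realizing the gain outside California, using the
    # top marginal rate quoted on the show.
    return realized_gain * ca_top_rate

gain = 2_000_000_000  # the hypothetical one-year SpaceX sale discussed above
print(f"Approximate state tax avoided: ${california_savings(gain):,.0f}")  # ~$260,000,000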

68:19

>> Is there any story that I

68:21

haven't hit on, guys, that we should hit

68:23

on?

68:23

>> I just have to bring up the Ron Conway

68:25

Matthew Prince one because it was so I

68:28

highlighted that one on Twitter. It was

68:29

just the funniest thing in the world.

68:31

And uh you know I don't know Ron Conway

68:33

call for context. You want to provide

68:34

some context?

68:35

>> Yeah. I don't know Ron Conway but he's

68:36

certainly viewed as one of the Silicon

68:38

Valley gems right seed investor in so

68:40

many leaders. Always out there as an

68:41

advocate everywhere. Uh probably could

68:44

have retired years ago, right? Very

68:45

founder centric. And he wrote that he

68:48

had helped Cloudflare navigate some very

68:50

significant issues earlier in the day I

68:52

think on Jack Altman's podcast. Yeah. On

68:54

Uncapped. And they asked Matthew

68:56

Prince, CEO of Cloudflare, the question.

68:58

And he said, "Well, maybe I don't

69:00

remember any of that."

69:03

And it's just it's not and he wasn't

69:05

mean. Matthew can be fairly uh sharp as

69:08

Harry knows these days. It wasn't

69:09

meant to be mean. The tweet was not mean. He

69:11

literally just meant he couldn't

69:12

remember getting any help from this

69:15

beloved VC. And I I think it just said

69:17

so much to me about VCs adding value,

69:20

but also VCs thinking they add value.

69:23

VCs possibly adding a modest amount of

69:26

value, but founders not really thinking

69:28

that modest value was consistent with

69:30

the bravado of the VC. It just uh it

69:33

just crystallized the whole

69:35

value-add idea to me in a single

69:37

tweet. It wasn't mean. It's just I don't

69:39

remember any of it. I don't remember Ron

69:40

helping, but maybe he did.

69:42

>> Yeah, I you're right, Jason. I I did

69:44

laugh at that and I think it does I

69:46

think actually my bigger take, to your

69:48

point is both to some extent are right

69:50

is that you know as a VC you all want to

69:52

have agency. They all want to feel we

69:54

help and you know want to be good people

69:55

and you look at it and go, hey, I spent some of

69:57

my time helping the CEO I feel I helped

70:00

but from the company's perspective

70:02

they're founding a company they're doing

70:04

a million things, and on one or two things on

70:06

a 10-year journey you helped. You

70:09

remember that vividly. They're like, dude,

70:10

it just fades into the background of you

70:12

know a hundred things and you know

70:14

better than me Jason they have to do

70:15

every day right and the truth is this I

70:18

one of the proofs of this, an interesting

70:20

way to check it is I often read business

70:22

biographies

70:24

and business stories of great companies,

70:27

venture-backed companies and how they

70:28

formed and what happened and you know

70:30

what I notice in them every single one

70:32

of them very little mention of VCs

70:35

if you just read them you eyeball

70:37

them and say oh that's a biography yeah and

70:39

they crop in and come out a couple of

70:41

times right and I think that's right

70:42

because realistically in the journey of

70:44

what's going on the only significant

70:46

things we've done I've said this before in

70:47

the podcast we put in the money and we

70:49

put in more money when they need it we

70:51

decide to hire or not hire and fire

70:53

the CEO, we agree to broad strategic

70:55

direction and anything after that is at

70:57

best an assist, right? And if you read

71:00

the biographies of businesses, right,

71:02

what you generally see is the only time

71:03

the VCs come in is on some version of

71:05

those, right? And it's, you know, five

71:07

pages of the journey early on

71:09

interspersed around 200 pages in the

71:11

first five chapters and by the time they

71:13

get to the IPO, it doesn't even rise to

71:14

the level of a thing, right? I was

71:16

reading the, you know, the OpenAI

71:18

biography, a bunch of them recently, and

71:20

that's just the way it is. Microsoft

71:22

like same thing right you know so don't

71:26

I mean, and you, the VC, can feel those

71:28

five minutes of impact were amazing and

71:29

they feel really good about them and you

71:31

feel warm and fuzzy but you know

71:35

the only thing founders really remember

71:37

for better or worse is oh my god our backs

71:39

were to the wall and no one would put in

71:40

money and they put in money they

71:42

remember that

71:43

>> Sometimes in my experience

71:46

sometimes

71:47

>> sometimes they even forget that but to

71:49

your point

71:49

>> at least half the time they forget that

71:51

>> if they forget that, they're definitely

71:53

going to forget the time you made that

71:54

phone call to help them connect with XYZ

71:56

and that helped them do something cuz

71:57

that's something that happens 100 times

71:58

a day. No, you're right.

71:59

>> Yeah.

72:01

>> Yeah.

72:01

>> We're not the stars in the drama. We're

72:03

bit players who get well paid for

72:05

our part.

72:06

>> Boys, as always, the most humbling 90

72:08

minutes of my week.

72:10

>> You'll get more. You'll be humbled

72:12

tomorrow.

72:13

I'd

72:14

be surprised.
