SpaceX's Financials Leaked: Is it Worth $2TN | Meta Debuts Muse Spark: Are They Back in the AI Race?

Transcript

0:00

I don't buy Dario anymore. He may well

0:02

be the second greatest founder of all

0:04

time behind Elon, but I am just so

0:06

burned out on the boy who cries wolf.

0:08

>> Starting off on the agenda, Anthropic

0:10

unveils Mythos, but withholds it from

0:13

public release because it's too good at

0:15

hacking. Number two, public software

0:17

stocks tumble to new lows with Citi

0:20

saying there really is no floor.

0:22

Optimistic. And then finally, Meta

0:24

debuts Muse Spark. It's Alex Wang's

0:27

first model from Meta's Super

0:28

Intelligence Labs. Does it save Meta in

0:32

the race to catch up?

0:33

>> So, I'm pretty bullish actually on

0:35

OpenAI in the enterprise.

0:36

>> I think it's a two-way fight. Anthropic

0:38

has the advantage of clarity and focus.

0:40

OpenAI has the advantage of the consumer

0:42

business.

0:43

>> If your agents are only 60% as good,

0:45

you're in a slow death spiral. It

0:47

appears to be the most expensive IPO at

0:49

scale of all time.

0:49

>> The Elon discount rate is zero and the

0:52

Elon probability of failure rate is zero

0:54

to get to 2 trillion. I can't open the

0:56

Strait of Hormuz myself. I can't do

0:57

this like enough already. Let me just

0:59

use my tokens.

1:02

Ready to go.

1:14

Guys, I am so excited for this show. Uh,

1:17

as we always have, we're going to start

1:19

with Anthropic. What else could we start

1:21

with but Anthropic unveiling Mythos with

1:23

the preview withheld from public release

1:27

because it is too good at hacking. It

1:31

discovered thousands of zero-day

1:32

vulnerabilities; admittedly, some were

1:34

quite old. Um how did we think about

1:38

this?

1:39

Did it deserve the reaction that it got?

1:43

Which reaction are you talking about

1:44

Harry?

1:46

>> I would say widespread fear that there

1:48

was then shown in a loss of market cap

1:50

of a lot of public companies in the US.

1:52

Let's leave to one side the was it a

1:54

marketing stunt, whether they do or don't

1:55

have compute. Let's focus on what Mythos

1:58

does in terms of cyber security and what

2:00

your correct response to that would be.

2:02

And you know if you read a lot of the

2:04

stuff it it finds a whole ton of

2:05

vulnerabilities including some that have

2:08

been lying there for years, right? So

2:09

that's kind of the oh my god that's

2:10

scary and then you see and that's why

2:13

they withheld it and shared it with a

2:15

bunch of security vendors right and then

2:17

you see a bunch of kind of

2:18

counterarguments that basically some

2:20

version of this which is using older

2:22

models and using them well you can

2:25

actually get to the same outcome right

2:26

you can find the same security

2:28

vulnerabilities right and that's the

2:29

counterargument. And there was a whole

2:30

bunch of tweets that said this is not

2:32

a big deal and you know I'm processing

2:34

this from the outside, and my

2:36

conclusion is those people who said it's

2:38

not a big deal are wrong and Anthropic is

2:40

right. And I was thinking about the

2:41

metaphor here today, right? Because what

2:43

they were saying is and it's actually

2:44

very interesting about the agentic

2:45

revolution. It's kind of a microcosm

2:47

that allows us to talk about a lot of

2:49

things. It's like basically they are

2:52

right which is sorry the the naysayers

2:54

are right which is that you can take an

2:56

older model you can point it at some of

2:57

these issues you can kind of query you

2:59

can direct it a couple of times and

3:01

someone actually did the exercise of

3:02

here's how I found the same bugs. I have

3:05

to steer the model a little bit and you

3:06

got it right and but the comment is

3:09

Mythos just kicked off on its own

3:11

agentically goes and looks at all the

3:13

code and finds them on its own right and

3:15

the metaphor I was trying to look at

3:16

here is very simple it's like it's the

3:18

difference between a rifle and a machine

3:19

gun in one sense both of them can kill

3:22

someone right but one shoots one bullet

3:25

and then stop and reload and the other

3:27

just spews bullets out. And in the

3:29

first world war you know we all

3:30

tragically learned what machine guns are.

3:33

It might be the same thing, but quantity

3:35

makes a huge difference. And I think

3:37

that's what's really going on here. The

3:39

speed at which this can process reason

3:42

across large code bases means that

3:44

they're just going to find more bullets.

3:46

They're going to shoot more bullets. So

3:49

it's not so. The kind of Twitter

3:52

cynical "it's not that different" isn't

3:54

true, because it's the

3:56

capabilities to do so much so quickly

3:58

with so little human direction that makes it

4:00

definitely a quantum step difference in

4:03

terms of real capability. My big aha was

4:05

it's not overblown in the

4:08

sense it can find stuff. I think that AI

4:10

is enabling every single breach

4:13

possible, every security hole to be

4:15

found. Not a subset of the hottest

4:17

companies, not folks trying to attack uh

4:20

open AI APIs, but everyone. And like for

4:23

example, you know, the other day

4:24

MyFitnessPal bought Cali, right? Cool

4:27

story, right? What was it, Harry? $100

4:28

million. 19-year-old kid from Miami,

4:30

something like that, right? A great

4:32

story. Two days later, it was instantly

4:34

breached. All the records were stolen.

4:36

3.2 million records. Every single

4:38

user. All the data on you, all

4:40

your HIPAA data, every single thing on

4:42

you was stolen within days and it became

4:45

a sport for a hacker. They just stole it

4:47

all. Now the root cause was and actually

4:50

this is surprisingly common. It's an

4:52

issue Supabase and others had to deal

4:53

with. Um they didn't have any

4:54

authentication on Firebase. But
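As an aside for readers: Firebase's Realtime Database does expose a plain REST endpoint, and a rough sketch of how a misconfiguration like the one described here might be probed could look like the following (the project ID, path, and `is_publicly_readable` helper are hypothetical, not taken from the breach):

```python
import urllib.request


def public_read_url(project_id: str, path: str = "") -> str:
    # Firebase Realtime Database REST endpoint: appending ".json" to a
    # path returns its contents as JSON whenever the security rules allow it.
    return f"https://{project_id}.firebaseio.com/{path}.json"


def is_publicly_readable(project_id: str, path: str = "") -> bool:
    # Hypothetical probe: if the rules contain {"rules": {".read": true}},
    # this GET succeeds with no auth token at all -- the kind of
    # misconfiguration described above.
    try:
        with urllib.request.urlopen(public_read_url(project_id, path), timeout=5) as resp:
            return resp.status == 200
    except Exception:
        return False
```

The usual fix is to require authentication in the rules (for example `".read": "auth != null"`) rather than leaving the database world-readable.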

4:57

as most databases are now built by AI,

5:00

as more and more apps are built by AI,

5:04

um the number of issues is going to

5:05

explode. And if Mythos and Friends lets

5:08

bad actors find every site the second it

5:12

launches with any PII and steal it, I

5:15

think we may enter an era later where

5:18

sites get more secure, as it's flipped

5:20

on the other side. But I think we're

5:22

going to go through a transition phase

5:23

where security is just getting worse and

5:25

worse and worse because every single

5:27

website can be instantly hacked and

5:28

stolen from. And the whole Mythos run

5:31

they said, I think Claude said, it took

5:33

them, I don't know, I'm sorry, I'm going to

5:35

misquote the numbers, it took

5:37

$20,000 of credits or a couple hours or

5:39

something like that. Right. And hey

5:41

that's that's enough that I'm not going

5:42

to do it against Scale's website. But if

5:44

I could if I could simplify that and

5:46

distribute against every single thing

5:48

with any PII on it, you know, bad actors

5:50

are just going to hit everybody. I think

5:52

it's a big deal. Whether this is a

5:54

publicity stunt for not having enough

5:56

capacity, I don't know, maybe a little

5:58

bit, right? But um but everything's

6:01

going to be found every security hole.

6:03

Right. Agreed. Which is why the second

6:05

comment I'll make is that the reaction

6:11

to it in terms of security stocks going

6:13

down to me didn't make sense because I'm

6:16

like what this says is there's no doubt

6:19

that going forward, part

6:22

of the process of security will be to

6:25

use Anthropic or another code model to

6:27

check your code before you deploy to

6:29

find these vulnerabilities. This will be

6:31

a thing, right? Um, but someone's going

6:33

to have to administer that. Someone's

6:34

going to have to build frameworks and

6:36

harnesses to do pre-screening of code. And
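A pre-deploy harness of the kind described here might, as a rough sketch, look like the following. The model call itself is hypothetical and left out; only the git plumbing and the prompt construction are shown:

```python
import subprocess


def changed_files(base: str = "main") -> list[str]:
    # Files touched relative to the base branch, via plain git.
    out = subprocess.run(
        ["git", "diff", "--name-only", base],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]


def build_scan_prompt(path: str, source: str) -> str:
    # Prompt a hypothetical code model would be handed for each changed
    # file; a CI deploy gate would fail if the model flags anything.
    return (
        "Review this file for security vulnerabilities "
        "(missing auth checks, injection, unsafe deserialization).\n"
        f"File: {path}\n---\n{source}"
    )
```

In a CI pipeline, each file from `changed_files()` would be read, passed through `build_scan_prompt()`, sent to whatever code model the team uses, and the deploy blocked on any reported finding.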

6:39

then, but then more importantly,

6:42

everyone's going to have to operate on

6:43

the assumption that if you miss

6:44

anything, they're going to find it,

6:46

which is different than if you miss

6:48

anything and you're really strategic,

6:50

they might find it. Right? To Jason's

6:51

point, if the other side now have

6:53

machine guns, then you've got to build

6:54

tanks, right? So what security is might

6:58

change. The the vendors who step up and

7:01

meet the challenge will triumph and the

7:03

ones who don't will fall away. If

7:05

Anthropic says that their model now

7:07

allows anyone to find any

7:09

vulnerabilities and they're going to

7:10

withhold it for 6 months, that means

7:12

that in 6 months and one day every bad

7:14

guy in the planet is going to be pinging

7:16

your code and trying to find the

7:18

bad bits, right? So you bet you're going

7:20

to be investing in cyber. So I think the

7:22

part that made sense was this is a

7:24

big deal. The part that didn't make

7:26

sense is that cyber stocks should go down

7:29

because I think you're going to want way

7:30

more defenses because the bad guys are

7:32

more heavily armed. And yes, as I say,

7:33

it is an arms race. Do you buy Dario's

7:37

it's too powerful. We can't release to

7:39

the public. Is it just great marketing?

7:41

I don't buy it anymore. I'll tell you

7:43

something. One thing that changed with

7:44

me with the Mythos thing, for what it's

7:46

worth, um I I don't buy Dario anymore.

7:50

What I mean is, listen, he may well be

7:52

the the second greatest founder of all

7:55

time behind Elon. Look what he's done in

7:57

five years, right? Five years to 30

7:58

billion. The greatest

8:01

grudge startup of all time, right? I

8:03

mean, it's hard, as a founder, for

8:06

your jaw not to fall on the ground. But

8:08

I am just so burned out on the boy who

8:11

cries wolf. Every job's going to be

8:13

destroyed. Everything is insecure.

8:15

Everything like enough already. And like

8:17

I've heard it so many effing times. And

8:19

then about Mythos, I have to hear that

8:21

like he's created the spawn of evil if

8:24

we're not careful. Like I just can't

8:25

like, I've rotated back to

8:28

team Sam after all this because I just

8:30

can't take the endless

8:33

boy who cries wolf. It's like even if

8:34

you're right, I can't open the

8:37

Strait of Hormuz myself. I can't do

8:38

this like enough already. Let me just

8:40

use my tokens. Seriously, I've lost

8:42

confidence, not in him as a

8:45

CEO, but this endless marketing machine.

8:47

I'm tuning it out now. I don't care

8:49

anymore what he says about this stuff. I

8:50

don't care.

8:51

>> What specifically what specifically do

8:53

you not buy? I'm just trying to

8:54

understand.

8:55

>> Listen, Dario's like, 80% of

8:57

jobs are going to be destroyed in two

8:59

years, we'll need no programmers

9:01

by next week. Okay, that endless thing.

9:03

Maybe he's right, but what can I do

9:05

about it? I heard you. I heard you the

9:07

11th time. I heard you the 80th time. I

9:10

heard you on Joe Rogan. I heard you on

9:11

on TBPN. I heard you on Harry. I just

9:15

can't. And then the Mythos

9:16

thing and they were holding it back and

9:18

it's like, I believe you're a

9:20

safety guy but if you talk your game too

9:22

much um I just got to check out at some

9:24

point. Show me something that's

9:26

inspiring. Like I actually I honestly

9:28

feel like his message is uninspiring.

9:30

That's the problem. It's uninspiring.

9:33

>> I'm going to push back a little on that.

9:36

Um, but in the following way: I think a

9:40

lot of the doom

9:43

warnings are wrong and the doom warnings

9:45

to date have been wrong

9:49

if you look at the unwillingness to

9:51

release GPT-2, which in retrospect

9:53

was overdone, but I think the

9:56

concerns are sincerely held right and

9:59

yeah, and it also is good

10:02

marketing, I acknowledge that too. But I

10:04

think the starting point is there is

10:06

belief here that these things could

10:07

happen. To be very concrete, I think

10:09

he's totally wrong about the economic

10:10

50%. I think that's beyond madness and

10:12

I'm not worried about it in the slightest.

10:14

But I do believe, and I thought about

10:16

this a lot because what I realized is if

10:18

I'd met them at the seed, which I didn't

10:20

because it was outside our price

10:21

bracket, but if I'd met them, I would

10:23

have done exactly what you did, Jason; I

10:25

would have listened to the doom warnings

10:26

and I said, "That's all silly and wrong.

10:29

Therefore, I won't do the deal." Right.

10:31

And what I've learned is something more

10:32

nuanced. I think a lot of Silicon Valley

10:34

companies have a culture

10:37

that's overreaching and you listen you

10:39

go, the grandiosity. If you're kind of

10:40

a grounded person you reject the

10:42

grandiosity but what I've internalized

10:44

is the grandiosity is a rallying cry,

10:48

sincerely held because I don't believe

10:50

you can portray grandiosity consistently

10:52

for 5 years if you don't believe it. So

10:54

unless you're really psychopathic, or

10:56

sociopathic I should say. So, um I I

10:59

think it's sincerely held and I think it

11:01

has a huge unifying effect on a company.

11:03

Like take for example Elon and we're

11:05

going to Mars. Like the minute we had to

11:07

file an S1 and someone had to say you

11:09

might have to go to prison if you say

11:10

things wrong. We said we're not going to

11:12

Mars. We're going to the moon. Right? So

11:14

you could be cynical. If I'd looked at

11:15

that deal much earlier on, I would have

11:17

said the cynical but incorrect

11:20

approach would have been to say I don't

11:22

think they're going to Mars for here's

11:23

10 reasons. Therefore, I'm not going to

11:25

do the deal. The more evolved approach,

11:27

and this is why I'm pushing back on

11:29

Anthropic, is I don't think they're going

11:31

to go to Mars, but I do think the Mars

11:33

vision over 20, 30 years is a rallying

11:36

cry that allows them to do amazing things

11:38

in the short term, which they've

11:40

done. And I think the same thing applies

11:41

with Dario here. It's like I think it's

11:43

all overblown. I think half of Silicon

11:45

Valley, I'm going to say it here, is

11:46

running around thinking they're

11:48

inventing the next thing after the atom

11:49

bomb. And I simply don't. I don't think

11:52

we're going to unemploy 50% of white

11:54

collar workers. I think it's madness. I

11:55

think it has some legitimate dangers in

11:57

cybersecurity and bioterrorism, but

11:59

they're manageable. I think it's

12:01

all overwrought. But that overwrought,

12:05

I won't say hysteria, that overwrought

12:07

intensity, right, has allowed them to

12:10

build a culture where it had no churn

12:12

it's given them mission clarity and then

12:14

it's given them a $30

12:17

billion

12:18

revenue line and a possibly trillion

12:20

dollar market cap and what I've learned

12:22

and it's really hard for me because I

12:24

find all this problematic. Like

12:25

it's like the Airbnb. Remember the

12:26

Airbnb was the sharing economy and Uber

12:29

was too. You remember the sharing

12:30

economy. It sounded like a bunch of

12:31

communism. We're all going to sleep on

12:33

each other's, you know, air mattresses.

12:36

That was a visionary fairy tale.

12:38

It turns out what really is going to

12:39

happen is people are going to buy houses

12:40

and rent them out, right? So what I've

12:42

learned, you know, Steve Jobs was a

12:44

bicycle for the mind. It turns out we're

12:45

all just going to sit on our phones and,

12:47

you know, watch Instagram and get

12:49

depressed about other people's lives.

12:50

But the visionary, oomphy bit just helps

12:53

keep the machine of innovation churning

12:55

here people. So what I've learned to do

12:57

which is really hard is literally listen

12:59

to the idealism.

13:01

Don't ask, do I agree or not; ask

13:03

myself, will it motivate people enough to

13:05

do something where there is economic

13:07

advantage to be obtained and that's a

13:08

very cynical old person's perspective.

13:10

But that's the context in which I say I

13:12

think Dario

13:15

believes all that stuff and I think it's

13:16

useful for them and I think it's wrong.

13:18

But it's damn useful and it's worked.

13:21

>> Jason, I get you. What do you want him

13:24

to say then? Like you said, oh, I don't

13:26

find him inspiring enough. What would

13:27

what would make you happy? What do you

13:29

think he should say? And I think before

13:32

things got tougher, I think Sam was good

13:34

at teasing at this. I want him to take

13:35

us to Mars. I want to see the good side.

13:38

Even Vinod, who is very direct that

13:41

there's going to be a lot of job losses.

13:42

Right or wrong, Rory disagrees, but

13:43

Vinod's very direct; his point is it's

13:45

going to be okay, right? We will figure

13:47

this out with AGI. Everyone will pay

13:49

more taxes even in California. It's

13:51

okay. Marc Andreessen, his point is we're

13:55

entering an era of deflation and

13:56

abundance, right? I don't

13:58

need too much on the other side. I don't

13:59

need the fluff, but I just need

14:01

a little inspiration that there's some

14:04

good. Um, and I'm not saying, maybe

14:07

the third hour of Dario's speeches

14:09

have it, but everything I see on social

14:11

media feels like he's an inveterate

14:13

Debbie Downer. And um it's just uh I'm

14:16

tuning out. I'm just tuning out now. And

14:18

maybe in the enterprise he's got to

14:20

we'll just see. Listen, I don't run a $

14:21

30 billion company. It may almost

14:24

have to change as the years go on as

14:26

this works less well with the

14:28

million-dollar customers, right? You may

14:30

have to be more positive about the

14:31

benefits in your workflow. I think

14:34

Jason, the odd thing is we're actually

14:35

agreeing on one thing. tune out the

14:38

noise and just, you know, in the words

14:41

of Haldeman, you know, don't look at

14:43

what we say, look at what we do. I

14:44

always love quoting the Nixon White

14:45

House as what used to be the most

14:47

cynical White House we've ever seen, but

14:48

we can come to that another day. And you

14:51

know, ignore the people

14:55

that are saying, "Oh my god, this

14:57

could eliminate white collar jobs." And

14:58

ringing their hands and saying, "This is

15:00

awful. Look at what we're doing. We're

15:01

shipping code. We're shipping software.

15:02

We're doing 30 billion and run rate. Oh

15:04

it's amazing." Turns out you're

15:06

not buying the guilt. I mean, it's

15:07

you're not buying the guilt. You're not

15:09

buying the hand-wringing. You're actually

15:10

buying the revenue. And the revenue is

15:13

amazing.

15:14

>> I'm with you.

15:15

>> It's pretty amazing.

15:16

>> I've spent a lot of the last year

15:20

attempting to convince founders that they

15:22

genuinely need to move more quickly,

15:24

that they are too complacent in their

15:26

approach to AI, and that

15:29

they have at best a 60% solution, 60%

15:32

answer to the problem. I've

15:34

tried to vibe code in public. I've tried

15:36

to build my own apps. I've tried to

15:37

share how we've rebooted our teams to

15:39

three humans in 28. I've done all this

15:41

and I know it's profoundly helped

15:43

a lot of people. I get so many messages,

15:45

so many public company CEOs,

15:47

leaders reach out to me. Even me, I'm

15:49

like, I'm almost done with this phase.

15:51

Like, I have alerted you, okay? If after

15:54

me with my 10 trillion tweets and 2,000

15:56

blog posts and 54 20VC pods together,

16:00

if you haven't heard the message that

16:02

you got to catch up in

16:04

AI, maybe I'm no Dario, but even I'm

16:07

ready to to move on to the new world.

16:09

I'm leaving the past behind. And if

16:11

we're all going to live in a world of

16:12

robots and AI lawyers, so be it. Like uh

16:15

you know, I'm ready to move on to

16:18

the new world. And I'm

16:18

frankly ready to write off a lot of

16:20

portfolio companies and a lot of public

16:22

companies. It's time to move on, guys.

16:23

If you're not going to get there in

16:26

April of 2026, then so be it. So be it.

16:30

Well, let's do a markdown and call

16:31

it a day. And good luck to you. I'm

16:33

going to close with the Oppenheimer

16:35

quote. All of these founders have their

16:37

Oppenheimer moment. They want to be

16:38

Vishnu, destroyer of worlds. And that's

16:40

been, you know, they all reference the

16:42

book. Obviously, everyone, you know, so

16:44

so obviously channeling Sam and

16:46

channeling their kind of

16:48

kind of Oppenheimer moment, you know,

16:50

with with the new atomic bomb. My

16:52

favorite moment in that movie was when

16:53

Harry Truman says, "Get that crybaby out

16:55

of the White House." In the end, Harry

16:57

correctly says, "I dropped the bomb."

16:58

The equivalent of that is if a whole

17:00

bunch of people are fired, Jamie

17:02

Dimon will fire them. He doesn't

17:03

need you wringing your hands with guilt,

17:05

Dario. It's okay. Right? Other people,

17:07

and it was a great moment when Harry

17:08

Truman correctly said, "History won't

17:10

say, Robert Oppenheimer, you killed all

17:12

those people." History will say, "You

17:14

built the bomb." And history will say,

17:15

"I dropped it." Right? And so, in other

17:18

words, get over your guilt. Ship the

17:20

product in a methodical fashion. Do be

17:23

careful. I do think he was right to keep

17:24

that product back, but in the end, you

17:26

know, it's not stoppable. I appreciate

17:30

that. I actually think he's very

17:30

thoughtful about it. So, I'm on his side

17:31

on being thoughtful. I don't think it's

17:33

fake, but this is going to happen and

17:35

other people are going to own the

17:37

problem. Onwards.

17:39

>> Onwards.

17:41

The final element which is connected but

17:44

not the same, which is Mythos was

17:45

trained entirely on Amazon's Trainium.

17:47

>> I don't think that's correct.

17:49

straightforwardly. What makes you...

17:51

I'm sorry, I cut you off there

17:53

because I've had too much coffee, but

17:54

whatever. Um

17:56

>> It feels like a three-cup

17:58

morning for Rory, doesn't it, Harry?

17:59

What are you saying?

18:01

>> Sorry, keep going. Keep going cuz

18:03

I'll let you finish your sentence.

18:05

>> Well, well, um apparently Mythos was

18:07

trained entirely on Amazon's Trainium

18:08

chips. Um Jassy disclosed that it's now

18:11

a $20 billion annualized business

18:13

growing triple digits. Um, Trainium now

18:17

is nearly sold out, Uber being one of

18:20

their biggest customers. Question being,

18:22

if this is the case, are we slightly

18:25

seeing a loosening of Nvidia's

18:26

stronghold on the market? And does this

18:28

change how we feel about Nvidia? First

18:30

of all, I think you need to be really

18:31

precise here cuz I checked I didn't know

18:33

this point, so I checked it. It

18:35

sounds like you're saying a couple things.

18:37

It sounds like you're saying, "Oh my

18:38

god, they're shipping Trainium chips to

18:40

others who are buying

18:42

chips and using them." They're not. They

18:43

don't to a rounding error have a

18:45

merchant silicon business. They are not

18:46

competing directly with Nvidia. What is

18:49

true is Amazon instead of buying Nvidia

18:53

chips is buying its own chips and then

18:55

offering cloud hosting services and you

18:57

know inference services and model

18:58

training services. So it's not

19:03

so, think of it less as, oh my god,

19:04

someone's buying chips and competing

19:06

with Nvidia, and more that Amazon is not buying

19:08

Nvidia chips, instead using its own chips,

19:10

and most of that revenue is internal

19:12

purchasing. So what they're really

19:13

saying is Amazon is saying we have a

19:14

capex budget of 200 billion a year this

19:17

year which probably means about half of

19:19

that typically is chips. It's 100

19:21

billion. So where we can we're buying

19:23

our own chips, of course we are. And

19:25

where we're not, we're going to have to

19:27

buy Nvidia just like everyone else. So

19:29

that's true. And then in terms of who's

19:30

quote unquote a customer, all they're

19:33

saying is, if they're doing training

19:36

runs for Anthropic, which I'm sure they

19:38

are, or they're doing inference runs,

19:40

which I'm definitely sure they are cuz

19:42

that's a product that they offer through

19:43

Bedrock. When they're doing that,

19:44

they're running it on their chips. So

19:46

yes, in that sense, in that sense, some

19:49

of the Mythos model was probably trained

19:53

on Trainium chips, but not because

19:55

Anthropic said, "Yo, I love Trainium." It's

19:57

because to the extent that Amazon is

20:00

offering them compute, some of that

20:01

compute is Trainium. That's all that's

20:02

happening here. But, all that said,

20:05

it's still $20 billion that

20:07

didn't go to Nvidia that went to Amazon.

20:09

So it is, at the margin, meaningful.

20:11

It's 10% of kind of Nvidia's revenue, a

20:13

little less than 10%. And so

20:16

it's not a mega competitor. It's just an

20:18

in-house bundled product at some

20:20

significant scale.

20:23

>> I'm loving this, Rory. Jason, aren't you?

20:25

It's like

20:26

>> You know, 10% is material. We don't

20:28

need to dwell on it, but

20:29

10% is material, right? And I guess

20:32

>> The bear case is just everyone's

20:34

building their own chip or deploying

20:35

their own chip. Everyone's trying,

20:37

especially on

20:38

inference, and, you know, this just means

20:41

Nvidia is dented. It's

20:43

dented sufficiently to see multiple

20:45

compression. It's dented sufficiently

20:46

that our 401ks go down more right it's

20:49

really just that when

20:51

things are priced to perfection there's

20:52

a dent, right? Fortnite, maybe.

20:55

>> Fortnite. Nvidia may have its own

20:57

Fortnite moment, as crazy as it

20:59

sounds. It just may be the first 30

21:00

seconds of the game on a on a relative

21:02

basis. But uh yeah, but but no one knows

21:05

this better than Jensen, right? No one

21:07

is friends with his frenemies

21:10

and partners

21:12

better than Jensen. No one's

21:14

played this game in the history of

21:16

mankind of being kind to everybody,

21:18

pulling back, being right, understanding

21:20

the dynamics, and still winning. So

21:22

Crazy good at it, right? He doesn't

21:24

get his dander up on this stuff like

21:26

most of us do.

21:27

>> And remember, I think Amazon and

21:29

Nvidia have a famously

21:32

difficult relationship. So, they're

21:33

probably the company most interested in

21:34

not buying from Nvidia. So, I think I

21:36

think what you're saying is fair at the

21:37

margin. It's $20 billion; 10%

21:39

is not meaningless market share.

21:41

>> Yeah. But, you know, the big picture

21:43

comment is compute is scarce, chips are

21:46

scarce, they're pretty much sold out,

21:48

the stock's at 194, and you know, it

21:51

didn't super accelerate when they made

21:53

that trillion-dollar backlog comment,

21:55

but it didn't go down either. So, you

21:57

know, I I I think we're up against the

21:59

constraint limit rather than anything

22:01

else.

22:02

We're going to stick on Anthropic, but

22:04

moving on, Anthropic to now compete with

22:06

Lovable directly. They launched recently

22:08

in the last 48 hours a competitive

22:10

product. Um,

22:12

>> You sure it was launched?

22:14

>> Well, they announced it.

22:16

>> Okay.

22:16

>> Harry, see Harry's a media guy. He

22:18

thinks when things are announced that

22:19

they're real. Jason is a software guy.

22:21

He actually thinks you have to ship

22:22

product.

22:23

>> No, it's all about the announcement.

22:25

Surely you've seen that in the last few

22:26

days.

22:27

>> As the Lovable/Replit guru here. Look, I

22:30

mean Claude Code is clearly directly

22:32

competitive with Cursor and then you

22:33

have this slightly different segment.

22:34

I'd love your opinion, Jason, on kind of

22:36

the Lovable/Replit segment. What do

22:38

you think the competitive

22:40

threat is from Claude, from

22:44

Anthropic, to those players? I mean, cuz

22:47

>> Well, look, Eric from Bolt, on these

22:49

screenshots, said, "Oh,

22:51

we all, I was at a dinner

22:53

with the CTO of Lovable. We all knew this

22:54

was coming. It was just a question of

22:56

when." And then I asked him if he

22:56

thought the screenshots were real and he

22:58

said it didn't really matter,

23:00

because it was coming, right? So

23:02

that's a number three

23:04

or number four player's view. The meta

23:07

question, you know, if this

23:09

were 52 episodes ago, um I'd be like,

23:12

well, you know, it could happen, but

23:13

it's not important enough. They're

23:15

going to have to get into databases and

23:16

hosting and identity management and OAuth

23:19

and end-user support, like

23:22

consumer-level end-user support. They

23:23

like it's a whole bunch of things

23:24

culturally that they don't want to do,

23:26

right? But the pace of innovation at

23:29

Anthropic is so intense that on a

23:32

whiteboard, it's hard not to want to

23:34

grab a couple billion of extra revenue

23:36

from vibe coding, right? Because because

23:38

you're just you're just a database and

23:39

an OAuth and a few

23:42

other, you know, for them it's 30 days

23:44

of work, right? Um so I don't know, but

23:48

the classic

23:49

old-school VC would be like

23:52

it's distracting. Even if they launch

23:54

it, they're not going to maintain it.

23:55

They're not going to have support.

23:56

They're not going

23:57

to put all the resources you need to

23:59

maintain it. But the truth is, and

24:01

maybe Eric from Bolt's point is maybe if

24:03

they don't directly compete at the

24:05

prosumer level, right? Even if they don't

24:06

build a Base44, Lovable, or Replit, they

24:09

might just go halfway there and that

24:11

might be enough. Like it might be a it

24:13

might be something that developers use

24:15

who just want to get something going,

24:17

right? It might be something that that

24:19

more technical product teams use, which

24:21

is like the number one highest ROI

24:23

category for Replit and Lovable: these

24:25

product teams. And they may only

24:27

target the nerdier, more

24:28

technical part of the market and that

24:30

might be sufficient to, again, maim

24:33

the folks, right? Um, it

24:35

doesn't have to be 100% they don't have

24:37

to replace shopping sites and stuff like

24:39

that to have a material presence so but

24:42

you know that Anton and Amjad and

24:44

everyone thinks about this 26 hours a

24:47

day how do we stay ahead right how do we

24:50

stay ahead um and um and it may just be

24:54

maiming. If it happens, it may just be

24:56

maiming, but maiming hurts.

24:58

>> But Jason, those product teams could

24:59

just use Figma Make, right? I mean,

25:02

>> They can't use Make.

25:03

>> Don't feed. Now you're just Now you're

25:05

just poking the bear. Harry,

25:07

please leave the bear alone.

25:08

>> Well, I'll tell you.

25:10

>> Well, we could talk about

25:11

>> Annie's off. Annie's off.

25:13

>> No, I just um

25:16

>> Well, well, let's stay on this topic. It

25:17

actually ties to If we talk about

25:19

software stocks, why I've become more

25:21

pessimistic since the last show on them,

25:23

but

25:23

>> Oh, no. Why have you become more

25:25

pessimistic? But because we are kind of

25:28

ahead in this agentic thing, right? At

25:29

least in the real world. I do talk to

25:31

lots of teams, lots of senior product

25:33

teams, lots of CEOs at massive, more

25:36

than I ever done in the last 10 years

25:38

combined. Okay, every week multiple

25:40

zooms, multiple calls, and there are

25:43

exceptions for for sure. But I would say

25:46

here's why I'm pessimistic and why I

25:47

think that the drawdown is accurate.

25:50

Even though I don't understand the

25:51

public markets, I think almost

25:52

everybody's building a 60% solution and

25:56

Make is an example. You look at

25:58

what Claude did, or what prompting

26:02

did last November and you spent the last

26:05

four to 6 months building something

26:07

that's kind of similar to what

26:08

these products were like 5 to 6 months

26:10

ago, but you can't really afford

26:13

the tokens. You're worried about the

26:15

cost. You can't build all the features.

26:17

Um, you're trying to use cheap models to

26:19

bring cost down. You're trying to limit

26:21

it, and you end up with Make, but

26:22

everyone has a Make. Why is Make so

26:25

crappy? It's because you didn't care

26:27

enough to spend the money or put the

26:29

team, and Make is a good copy of Replit

26:32

or Lovable from last summer, right?

26:34

That's what happens. And and by the time

26:36

Make catches up to Replit and Lovable

26:38

today, and then they decide it's

26:39

too expensive and then they decide they

26:41

have to lock it down because they can't

26:42

afford the agents and tokens. Now you're

26:44

9 months behind and 12 months behind.

26:46

And so what I mean is, it's

26:48

not just that you have a 60%

26:50

competitive solution. Here's the meta

26:52

problem for the incumbents. You can't

26:53

charge for a 60% solution. Here's the

26:56

problem. If you could charge 60% of what

26:58

Claude charges, or Lovable charged, or

27:02

Replit charged, that'd be great. That

27:04

would be enough to charge 60%. But a

27:06

60% product has to be free. It has to be

27:08

included with your base charge. And

27:11

while we're recording this, for example,

27:12

today, HubSpot's launching its next

27:14

group of AI agents, right? And I'm

27:16

excited to try them and I will be

27:18

supportive. If they're only 60% as good

27:20

as standalone solutions, HubSpot

27:22

cannot really charge for these things.

27:23

You can't get away with charging another

27:26

$20,000, $40,000, $60,000 to a HubSpot customer. If

27:29

your agent is fine, it works like in

27:32

isolation. When I meet with internal

27:34

product teams at large companies at

27:35

scale, they are so effing proud of

27:37

themselves. They show me their agent

27:39

that they built. They show me their vibe

27:41

coding thing and if I didn't use any

27:43

other products, I'd think they were

27:44

great, too. Or if this was 51 weeks ago,

27:46

I would think these products were great.

27:48

And they're so insular at their 2,000

27:50

person company, in

27:52

their fancy campus wherever they are

27:54

bringing their mugs

27:56

to the meetings. Because at their

27:58

pace and and and they're failing even as

28:01

they're proud of themselves with their

28:02

60% solution because the market will not

28:05

pay. They will use it like they'll use

28:07

your 60% solution, but

28:09

they're not going to pay for it. They're

28:11

not going to pay for it. And so you're

28:13

stuck in this doom loop of

28:15

yes I have an AI product as a public B2B

28:18

company but no one is willing to pay for

28:19

it for a 60% solution. They're not

28:21

willing to pay 60%. And so we can say

28:24

all these companies have moats, and

28:26

ServiceNow has the biggest moat of all.

28:28

So it shouldn't be sold off and this and

28:30

that. But if your agents are only 60% as

28:32

good, you're in a slow

28:34

death spiral. And that's what I see. I

28:37

can think of maybe one or two

28:39

exceptions of everyone at scale where

28:41

their agents are as good as either a

28:43

standalone company or just what I can do

28:44

in Claude. Now, I can't maintain Claude,

28:46

there's a whole bunch of issues, but if

28:48

it's only 60% as good, there's no way

28:50

I'm going to pay this AE that just

28:52

called me up 100 grand for it. I'm not

28:53

going to do it. It's not good enough.

28:55

You checking the box does not work with

28:58

agents. The check the box feature cannot

29:00

be monetized in the AI era. And this is

29:03

why I think they're all properly

29:05

sold down, because they

29:07

all have 60% solutions. All of them. All

29:11

of them. And they should be

29:13

it's do or die, guys, because you

29:16

can't sell these things. You can't sell

29:18

these things, right? And, you know,

29:20

there are two counterexamples. We

29:21

could argue over Agentforce, and

29:24

the Base44/Wix one may not save

29:26

Wix, right? which has repurchased like

29:28

30 or 40% of its company, but at least

29:30

they made a bet that got them beyond a

29:32

60% solution, right, to 9 figures in

29:34

revenue.

29:35

>> I agree. I think

29:39

I I think there's a lot to unpack on

29:41

what's happening in SaaS, but I think I

29:42

I've internalized that Jason has

29:44

articulated one of the big truths, which

29:46

is unless you have a product that's good

29:49

enough to charge for independently,

29:52

you won't have revenue acceleration of

29:54

any meaningful scale. And if you don't

29:56

have revenue reacceleration, then you're

29:58

in a different valuation metric. And

29:59

I'll talk about how to value it value

30:01

mature companies with probably

30:04

persistent users, but stockbased comp

30:07

issues and no growth issues. And you can

30:09

do that. And you know, one of my rules

30:10

is: price clears all markets. There's a

30:12

price at which ServiceNow and

30:14

Salesforce are all quote unquote worth

30:16

something. So zoom out a million miles.

30:18

Jason is right. If you can't charge for

30:20

your product, you won't reaccelerate. And if

30:22

you reaccelerate, you instantly move to

30:24

another valuation bucket. Right? So I

30:26

actually I've listened to you a few

30:28

times and I think there's a clarity of

30:30

simplicity there because you can talk a

30:32

lot about, hey, we're doing this or the

30:33

other, but the gut level test is can you

30:36

charge for it? So I I I really it's a

30:38

big freaking comment, right? Because

30:40

I've been wrestling with what do you use

30:42

to sort out all these public SaaS

30:44

companies and you know what are the how

30:46

do you think about it, right? And if

30:48

you're in a market where there is high

30:50

if you're in a very workflow-centric world.

30:56

now there might be other things that are

30:58

more payments related or stuff like that

31:00

where I think there's different dynamics

31:02

but if you were in a workflow-centric

31:03

world for the last two decades like

31:05

Salesforce and ServiceNow, you are in an

31:07

agentic world now and if you're in a

31:09

world now you better have exactly what

31:11

Jason says agents that are worth the

31:14

money which means they do the work. It's

31:15

actually a very brilliant test and I

31:17

think if I was I'm literally looking at

31:18

the horizontal software applications

31:20

list for Morgan Stanley and Jason you're

31:23

right that it was probably the first if

31:24

I'm thinking about how to value this

31:26

bucket of $1.6 trillion. That is the

31:28

first test. If yes, then you're on the

31:31

increasing value scale and you can make

31:33

it out. And it's still going to be hard.

31:34

See Wix for details. If no, then you're

31:37

in the how do you value

31:40

a company with, um, you know, mid-

31:44

single to high single-digit growth rates

31:46

at best probably I think an example

31:48

sales are very sticky and you know you

31:50

might be hit and I think that's a

31:52

separate valuation question right I

31:54

think that you know at nine times these

31:56

things are trading now at 8 to nine

31:57

times cash flow you might be hitting a

31:59

point where just the money allows you to

32:01

be a deep value player. You know,

32:03

the P/Es of something like Salesforce,

32:05

excluding stock-based comp, are 11 or

32:08

12 times forward PE when the market's

32:10

20, right? This is unparalleled, right?

32:13

So, if you want to make yourself a value

32:16

play, money can still be made, but it's

32:18

it's a grim way to make money. The only

32:21

way to still be a growth play is to pass

32:23

the Jason test. So, that's my zoom out

32:25

comment here, right? And again, you said

32:27

it in financial terms a few weeks ago,

32:30

which is reacceleration, but today you're

32:32

actually articulating the

32:34

predecessor test. If you have agentic

32:36

workflows, then you will have

32:37

reacceleration. Then you'll be in the

32:39

Jason Happy bucket. And if not, then

32:41

you'll be in the tragic value

32:43

bucket and you will have to do hard

32:45

things that would make Dario and Sam

32:48

cry. If you're going to make this thing

32:49

cash flow, it's going to involve SBC

32:52

reduction. It's going to involve

32:53

headcount reduction. It's going to

32:54

involve a bunch of grim stuff. And you

32:55

can probably still make money, but

32:57

you're also topped out. Nothing magical

32:59

will ever happen to a high singledigit

33:01

growth rate tech company that's kicking

33:03

off cash. At best, you build a mini

33:04

version of IBM or CA, and the top five

33:07

executives make money. It's not going to

33:09

be fun. Right. So, you're right, Jason.

33:13

Would you buy Service Now?

33:15

>> No, I wouldn't buy any of them.

33:18

I wouldn't buy, because I'm looking

33:22

I'm looking for products that are more

33:24

than a 60% solution. It's just

33:27

the world's moving too fast and no one

33:29

wants um when we started this this pod I

33:33

wasn't sure if I believed it or not. I

33:34

was probably on the fence, but there was

33:36

definitely a sense that the models might

33:37

plateau, right? That that that there'd

33:39

be parody. Um that they're all pretty

33:41

good. They all basically

33:43

could do a chatbot. Um, it's

33:45

clearly not the case today. If we tie

33:47

it to the beginning of this

33:48

conversation, the models have radically

33:50

accelerated their power since December,

33:52

right? Since Opus 4.5

33:54

and more, and we haven't used Mythos or

33:56

whatever, it's going to be even more

33:57

powerful. And so I'm not optimistic that

34:01

that anybody um building to last year's

34:05

spec slowly can compete. It's too

34:07

furious. It's too furious. And um and I

34:11

also think that the problem with moats is

34:12

they keep your customers in, but they

34:14

don't lure any new ones in. No one's

34:15

excited to cross the moat except the

34:17

folks that want to breach the castle

34:19

walls. This whole moat discussion, I

34:21

think, is at the edge of moronic, right?

34:23

It's at the edge of hooray, you have a

34:24

moat and your customer signed a 5-year,

34:27

you know, the average ServiceNow deal

34:29

is between three and five years. So what? That

34:32

doesn't bring in anybody new. It doesn't

34:33

bring in agentic revenue. It just means

34:35

I'm trapped. Prisoners don't

34:37

create growth uh other than at the

34:38

margin, right? Other than

34:40

at the margin.

34:41

>> First of all, I love the moat analogy.

34:42

That's great. And what you're basically

34:44

say because I've listened to this is

34:45

that what you're basically saying is

34:47

Jason is not a buyer of any stock that's

34:49

not a growth story,

34:50

>> right? I think it's good, right? And I

34:53

think so and I and I I've come to the

34:55

conclusion you were right on that. And

34:56

one of the reasons I enjoy doing this

34:57

part is when we argue sometimes I change

34:59

my mind, right? Because let's I want to

35:02

play out the value track just for

35:03

another few minutes. Right? I think the

35:05

problem with the value play is, I think, at

35:07

this price you make money on

35:09

Salesforce, but unless they get regrowth,

35:12

you're going to make single digits

35:13

returns, and the overall Ibbotson small-cap

35:15

is 11, so you are going to

35:17

underperform. I actually think the other

35:19

thing you're wrestling with

35:22

in terms of these stocks is the

35:23

following: the weird thing right now is

35:26

the public markets don't have access to

35:29

the growth side of software

35:30

So, right now, the trade is

35:32

sell SaaS, buy semis, which

35:35

effectively means sell software, buy the makers of AI,

35:39

right? What you don't have yet in the

35:41

public markets is AI-native companies,

35:44

starting with Anthropic and OpenAI, right?

35:47

And therefore, you know, Mythos is

35:50

a wonderful word because in fact you're

35:52

comparing the practical values of owning

35:55

Salesforce with the mythical values of

35:58

owning this company that's growing 10x

35:59

where you've never seen the actual

36:01

financials. No one's seen GAAP financials,

36:03

but oh my god it's amazing And

36:04

everyone's always going to want the

36:06

myth, you know, pick your girlfriend,

36:08

boyfriend analogy. It's the, you know,

36:10

the the practical realities of the

36:12

person you're with now versus the

36:14

mythical example of something that could

36:15

be. So my other aha is these stocks

36:19

aren't going to trade in the same

36:20

fashion until five or six of these are

36:23

public: two of the foundation models and

36:25

four or five other, um, gen AI native

36:29

companies are public. And then you can

36:31

finally as a public market investor say

36:32

okay, now I can choose. Do I want

36:35

Anthropic growing at

36:38

probably 5x, at 30 times

36:40

revenues with huge losses and big

36:41

stock-based comp, or do I want boring-ass

36:44

Salesforce growing at 10% with 30%

36:46

operating margins and at that point no

36:48

stock-based comp loss? Right, at least now

36:51

you can have a choice. And then

36:54

what'll happen is we'll find out

36:55

how to evaluate those two things and

36:57

that you know it's probably going to

36:58

take 6 to 12 months after the IPOs.

37:00

My point is until then what you're

37:03

dealing with is the mythical desire for

37:06

the as yet unrealized relationship,

37:09

right? So these stocks are going to

37:11

trade for until then is my big aha

37:13

because no one's going to be able to

37:14

value them, right? The only people that

37:15

get out of that mess now are what Jason

37:17

said. If you manage to claw your

37:19

way to growth as a

37:23

public SAS company, then you do actually

37:25

have some kind of chance of trading up.

37:27

But if you've got to trade on

37:29

fundamental value,

37:32

you're always going to be chasing this

37:33

kind of fear that the model companies

37:36

can do everything you can. I don't think

37:37

they can. I think when the model

37:39

companies go public, people are going to

37:40

realize, oh yeah, Salesforce in its

37:42

current form, it's probably going to be

37:43

there for the next couple of decades.

37:44

It's worth it cash flow. But Jason is

37:46

still right. Unless it gets its

37:48

together and makes great agents, it's

37:50

another IBM. And you know, when did

37:52

you last even I don't even know the

37:54

price of IBM cuz why would you? I

37:56

believe someone's probably made money,

37:57

but it's not my problem, right? You just

37:59

become a boring ass old thing. So, I

38:01

think Jason, summary, you're right,

38:03

Jason. If you don't have gold, and it's

38:05

an interesting question is, and Wix is

38:07

interesting because

38:10

because I think they're kind

38:11

of they've done what you suggested,

38:14

Jason, they're trying to get growth and

38:15

they're getting growth. And as yet,

38:16

you're right, they did a buyback and the

38:18

stock's down 20% since then. So, my aha

38:21

from that is

38:22

>> They've done 1.6 billion to buy back nearly

38:24

30%. They bought it at $92, and the

38:28

stock fell 23% on the week. Whenever

38:31

that happens you got to say to yourself

38:32

that wasn't a good week right it's

38:34

kind of like an inverse Bill

38:36

Gurley. Instead

38:38

of selling stock and then it goes up you

38:40

buy stock and then it goes down. That's

38:41

like the IPO premium but the other way

38:43

it's like oh my god that hurts even more

38:45

talk about leaving money on the table

38:46

and I think the aha is to Jason's point

38:50

they did one thing brilliantly which is

38:51

figure out how to get the product out the

38:53

door. Maybe they should have sat on

38:54

their capital for another 6 or 12

38:57

months, kept it in reserve to maybe buy

38:59

another AI product or invest behind

39:01

their AI product. Because I

39:04

think these stocks are going to bounce

39:05

around in value trap land for a long

39:07

time. So now you we may be wrong if they

39:10

if they really nailed their growth on

39:13

the new product and the kind

39:15

of vibe coding product really

39:17

accelerates and you know you look back

39:19

six months from now and you know

39:21

growth's at 15% and the stock's way up

39:23

you'll go oh yeah it was fine it was

39:24

just the next day reaction thing but the

39:28

point is: fix the Jason problem, which

39:32

is core growth before you try and do

39:34

financial engineering. You can, and in

39:36

Salesforce's case they did. Financial

39:38

engineering is useful, but it's not the

39:40

solution. In the end, if all you can do

39:42

is grow at 8%, you can screw

39:44

with your balance sheet to your little

39:46

heart's content. You're never going to

39:48

matter a damn. As I said, you'll be IBM.

39:49

They've screwed around with that stock

39:51

forever, but nobody cares. You got to do

39:53

the Jason thing first and put all your

39:55

effort on that. It is tough that, you

39:57

know, Salesforce did a big buyback, too,

39:59

right? I think they did

40:00

$25 billion of debt to buy back stock. On a

40:03

spreadsheet, this looks brilliant,

40:04

right? We can support it, right? Um,

40:07

yeah, we wish the debt was a little bit

40:08

cheaper, right? It was a little but but

40:10

but this is the simplest thing we can

40:11

do. Retire a significant amount of our

40:13

shares, right? Drive up our EPS and keep

40:16

going our agentic transition. But um,

40:19

arguably, you know, the market at best

40:21

shrugged it off. At best, it already

40:23

priced it before it happened, but in the

40:25

short term, no benefit. All this

40:27

financial engineering that looks great

40:28

on a spreadsheet amounted to nothing

40:30

in the short term, but $25 billion of

40:32

debt.
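
The buyback mechanics discussed here can be sanity-checked with a toy calculation. This is a minimal sketch using made-up numbers (net income, share count, and repurchase price are illustrative assumptions, not Salesforce's actual financials): retiring shares mechanically lifts EPS while leaving the underlying income, and therefore the growth story, untouched.

```python
# Hypothetical illustration of a debt-funded buyback (all inputs assumed):
# fewer shares outstanding raises EPS even though net income is unchanged.
net_income = 6.0e9        # annual net income, dollars (assumed)
shares_before = 1.0e9     # shares outstanding before the buyback (assumed)
buyback_dollars = 25e9    # debt-funded buyback, as discussed in the segment
share_price = 250.0       # assumed average repurchase price

shares_retired = buyback_dollars / share_price   # 100M shares
shares_after = shares_before - shares_retired    # 900M shares

eps_before = net_income / shares_before
eps_after = net_income / shares_after

print(f"EPS before: {eps_before:.2f}")  # 6.00
print(f"EPS after:  {eps_after:.2f}")   # 6.67
```

The point of the exercise matches the conversation: EPS rises about 11% on the spreadsheet, but revenue growth, the thing the market is actually pricing, is identical before and after.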

40:33

>> I totally agree, J. Not only that, but I

40:36

think one of the big advantages you have

40:38

as a cash-flow-positive public company

40:40

is your financial flexibility.

40:42

I wouldn't trade that for anything. Cuz

40:44

look, one of two things happens. If

40:47

the AI wave rolls on without a blip,

40:50

then, if you don't have

40:51

a growth story as a public SaaS company, then

40:53

your stock's going to be cheap two years

40:54

from now. There's no hurry to buy it.

40:55

>> Yeah.

40:56

>> Right. The second thing is, if there's a blip,

40:58

then the guy in the market with a public

41:00

currency and $25 billion in cold, hard

41:02

cash, maybe you buy five big things in

41:06

private land that allows you to compete

41:07

maybe you buy not Anthropic but a second-

41:10

tier foundation model. Maybe

41:13

you buy one of the big apps companies

41:14

>> If you can afford it.

41:17

Salesforce is one of the few people that

41:19

can make a big bet here. It's one of the

41:21

>> The point is, that's why I would have

41:22

kept my $25 billion in my back pocket,

41:24

right like it's what you

41:26

around short term with your

41:28

stock and the number of shares. The only

41:30

people who care about that don't matter.

41:32

You got to win the war. And there's some

41:34

chance that in the next two years

41:35

there's a blip in the market. All these

41:37

AI companies are burning money. And if

41:39

you're sitting there with $25 billion,

41:41

you could have been saying, "Come to

41:42

daddy. I got money. Let's talk about how

41:45

I'm going to be an AI behemoth." And I

41:47

think that would be a better use of your

41:48

time and your money. Come to daddy, I

41:51

got money. Uh, on that topic: Alex

41:54

Wang, uh, founder of Scale, obviously now

41:57

at Facebook following the acquisition of

41:58

Scale. Meta debuts Muse Spark, the first

42:02

model from Meta Super

42:05

Intelligence Labs, which obviously Alex

42:07

runs.

42:08

I mean, the candid truth is it did

42:11

okay. It was decent. My question to you

42:15

on the back of this it was decent. Not

42:18

quite as good as the others but good

42:20

enough. Is this Facebook back in the

42:23

game? And is this an encouraging sign for

42:26

Facebook where you feel more optimistic

42:28

post it or less given it didn't blow

42:32

anything out of the water?

42:35

I think it's a win. If you're not in the

42:38

game and then you get back in the game,

42:41

that's a win. So if you're fifth, you're

42:43

in the game. I I So I told I mean you

42:45

read all the reviews, you know, I

42:47

haven't got hands on the model. I don't

42:48

have the level of sophistication to

42:50

evaluate, but I did read a lot of the

42:51

reviews of people who did. And you're

42:52

right, the summary is good in some of

42:54

the things that they were working on at

42:56

Scale AI a year ago, not so good at some

42:59

of the newer things that the advanced

43:00

labs have been developing for the last

43:01

12 months, which totally makes sense.

43:03

You took your knowledge from a year ago

43:05

and you implemented it, right? But I

43:07

mean, leaving aside the question of does

43:09

it make sense to be in this game at this

43:11

level, leaving that aside, if you if you

43:13

decide as Mr. Zuckerberg that you want

43:15

to be in this game, then you achieved

43:17

your mission. You spent $14 billion and

43:20

you're back in the game. Now you got to

43:21

move up the league table. But yeah, I I

43:23

felt this was a phew, cuz, you know, had you

43:26

done all this and you did a Llama 4,

43:29

which was disappointing where people are

43:31

like it's not even you know credible

43:33

then you'd have felt like a and

43:35

you don't. So I think it was a win. Also

43:38

worth noting that they're talking about

43:40

being much more closed-source, which is a

43:41

significant thing because at some point

43:44

someone's going to need the American

43:45

version of open source, and Llama was that,

43:48

and now they're pivoting more to being

43:49

more closed-source, which has

43:50

implications across the ecosystem and is

43:52

a bit of a bummer. But, yeah, you know, I think

43:54

it was like a phew, an exhale

43:57

um not sure why we're playing this game

43:59

but if we're going to play it I'm glad

44:00

we're not losing anymore. Stepping back

44:02

though to where we're going toward the

44:04

back half of 2026.

44:06

Um, if I'm Zuck and

44:09

I'm looking at how Google owns its own

44:11

models which have become extremely

44:12

competitive, right? Um, that's one

44:15

of my big direct and adjacent

44:17

competitors. If I want to be in the big

44:20

leagues, maybe I just have to own this.

44:22

Like, I'm not Apple. I

44:24

don't want to be stuck buying tokens

44:26

from Anthropic or OpenAI. I'm Meta. I

44:30

have one of the dominant consumer

44:32

advertising platforms on the

44:33

internet. I have the dominant social

44:34

networks, and I sure

44:37

better own this. It turns out it is not

44:39

a commodity. And maybe this

44:42

is way too much money down the

44:44

drain, but this is core to our

44:45

existence, just like it is

44:47

for Google. And I don't I don't want to

44:49

wither. That may not have

44:52

been where this all started, right? Um

44:54

but the goal of Facebook is not to

44:56

provide, uh, API tokens like Anthropic.

44:59

This is not the direct goal,

45:01

right? Or it certainly isn't to uh to

45:04

encourage the open source community to

45:05

rise up and use Meta products. Um, this

45:08

is to stay in the top echelon of

45:11

consumer software

45:13

companies, and it's worth $14 billion.

45:16

It's worth $14 billion, right? If you

45:18

just don't want to become Apple and

45:20

dependent on everyone else's models, you

45:21

just don't want to be that, right? And

45:23

um, you know, if you can just do better

45:25

than the the Kimmy open-source, etc.

45:28

stuff Cursor is doing, it might be worth

45:30

it just for that. Just to be in

45:32

that zone between what I

45:34

can just rework purely from open source

45:36

and what I can buy from Anthropic if

45:38

I'm close enough and this is my core it

45:40

might be worth owning.

45:42

Ruthless. This is as competitive a

45:44

company as exists on planet Earth, right?

45:45

this might be the most competitive

45:47

company on planet Earth, Meta, right?

45:49

Ruthless.

45:49

>> yeah I mean if you look at if you look

45:50

at this if you look at the big play

45:52

scoreboard for that it's like bet a

45:54

billion on Instagram, win 100 billion

45:56

plus, maybe 400 billion. Bet 70 billion on

46:00

the Metaverse and VR, lose it all. Bet

46:04

14 billion on this and clearly have a

46:06

win and you know somewhere in the middle

46:08

between those two outcomes and you feel

46:10

good. I agree. The other thing just and

46:12

again it's not totally my expertise but

46:14

when I look back to when we

46:15

started this show a lot of folks thought

46:17

AI would kill Google search and maim

46:20

Facebook that Facebook was dying as a

46:22

platform, and that Google search was

46:24

of course dead, ChatGPT was going to

46:26

destroy Google, right? Fast-forward to today:

46:28

these are record growth businesses now

46:29

right Google search we started the show

46:32

talking about AI overviews and other

46:33

things and Google search is a better

46:35

business than it has ever been, as are

46:36

Facebook and Instagram. So it makes sense

46:39

to triple down there. This is not

46:42

a time to retreat. This is not a time to

46:43

retreat for either of them. It is not a

46:45

time to retreat. It is a time to get

46:46

that lance out, go straight into battle,

46:48

just knock those other guys off.

46:51

>> Meta surpassed Google, I'm sure you both

46:53

saw, as the largest ads engine in the

46:55

world. I think Meta's at 243 billion.

46:57

>> AI didn't kill it yet. I

47:00

hope that stock price comes

47:02

back.

47:05

Um, speaking of ads engines, OpenAI

47:08

projects 2.5 billion in ad revenue for

47:11

2026. The ads pilot was 100 million

47:14

annualized in just 6 weeks. There were

47:16

600 advertisers. We touched on it

47:18

before, but they're guiding to 11

47:21

billion in 2027,

47:23

25 billion in 28, 53 billion in 29. Is

47:27

ads the great comeback for OpenAI in H2

47:30

2026? Is this the shining light that we

47:33

should be directed towards?

47:35

>> It's both obvious and inevitable that a

47:39

consumer product like OpenAI's ChatGPT is

47:43

going to have to be ad supported. So

47:45

yes, they're making exactly

47:47

the moves you'd expect, right? And yeah,

47:49

And the little chatter of, oh my

47:51

god, that's a bad idea or it's not doing

47:53

enough, that we saw a few

47:56

weeks ago, it's all said. This is just

47:58

going to happen. Three people have done

47:59

it at super scale already. Google has

48:02

done it in 2004. Um, Meta themselves

48:05

have done it in 2008,

48:08

really probably 2007 and 2008, and then

48:09

in 2012 in mobile. And then we always

48:12

forget: Amazon, I think, got to about 100

48:13

billion plus on ad revenue in the last

48:16

you know kind of half a decade or

48:18

probably seven or eight years at this

48:20

point, right? So the movie is clear,

48:22

right? And yes, they've got to get there, and

48:24

they're talking about getting to hundred

48:25

billion dollars in four years. It all

48:28

makes sense right The funny thing is,

48:32

and I looked at all that and I thought,

48:33

"Yep, that tracks." And, you know, it's

48:34

funny. I'm mentally giving them 100%

48:36

credit for getting that. OpenAI is

48:38

typically not unaggressive in its

48:40

projections. So, let's assume these

48:42

progressives are aggressive." You this

48:44

going to sound really awful and I hate

48:45

even saying it. Oh my god, 100 billion

48:47

is amazing, but it's not enough relative

48:49

to the market cap. In other words, so

48:51

the bigger aha for me is you can build a

48:54

hundred billion dollar ad business

48:57

in chat GPT which makes sense and in the

48:59

context of a total trillion dollars of

49:01

total ads with Meta already having 300

49:05

of that, Google already having 200 of that,

49:07

Amazon already having 100 of that and

49:09

TV's got to eat too, you know, right?

49:11

That's probably a realistic high-end

49:12

estimate, right? What it means is you

49:15

need another hundred billion plus from

49:16

your enterprise business. That was the

49:18

big aha for me is consumer alone ain't

49:21

going to be enough to feed this beast

49:23

because your competitor who's all in on

49:25

enterprise is you know already at 30

49:30

right? And so that was my take. As I say,

49:33

I almost feel like a jerk saying it. It's

49:34

like hey congratulations on your hundred

49:36

billion dollar ad business probably one

49:38

of the three or four best ad businesses

49:40

ever one of the best launches ever but

49:43

it may be that corporations want to buy

49:44

more intelligence than consumers do and

49:49

that 100 billion consumer ads won't

49:50

support your burn. You need more.
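The back-of-envelope ad-market math in this exchange can be sketched in a few lines. These are the round figures quoted on the show; the split is illustrative, not audited data:

```python
# Round numbers quoted in the conversation; illustrative only.
TOTAL_AD_MARKET_B = 1000  # ~$1 trillion of worldwide ad spend

incumbents_b = {"Meta": 300, "Google": 200, "Amazon": 100}
openai_target_b = 100  # OpenAI's implied end-of-decade ad goal

openai_share = openai_target_b / TOTAL_AD_MARKET_B
left_for_everyone_else_b = (
    TOTAL_AD_MARKET_B - sum(incumbents_b.values()) - openai_target_b
)

print(f"OpenAI target share: {openai_share:.0%}")                 # 10%
print(f"Left for TV and the rest: ${left_for_everyone_else_b}B")  # $300B
```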

49:52

>> The 100 billion is what 15% of ad spend,

49:55

right? Of

49:56

>> 10. 10 at a trillion worldwide.

49:57

>> 10, something like that. I like this as a goal

50:00

for 2030 cuz it's clear what people

50:01

should be doing. We can't just add 100

50:03

million of ads to ChatGPT.

50:06

We have to build something that is

50:07

essentially as big as several of our

50:09

competitors. It's well understood why it

50:11

works. We're not directly in commerce.

50:12

We don't have the advantages that Amazon

50:14

does, right? Um, but can we achieve the

50:16

scale of of Facebook and others? Yes.

50:19

This is our job. This is our job, guys.

50:21

And every week we're going to iterate on

50:22

it. We're going to improve it. We're

50:24

going to make it better. It is

50:25

mathematically possible. This is not as

50:27

aspirational as the enterprise stuff,

50:29

right? It's mathematically possible.

50:30

This is our effing job. We're going to

50:32

review it every week. We're going to put

50:33

some of our best team on it. And, you

50:36

know, it's it's doable. And if it comes

50:38

up short a year or two like Elon, I

50:40

mean, it would suck for the IPO, but

50:42

it's not the end of the world. It's doable. It is

50:43

achievable, right? And so I think

50:44

because it is achievable, it will be

50:47

achieved. I actually think it will be

50:49

achieved.

50:50

>> I I think you're you're exactly right.

50:52

And it's clarity a whole bunch of things

50:53

they've been lacking.

50:54

>> Clarity on the enterprise side. I will

50:56

say one thing. You know, there was this

50:57

memo from this week that that that

50:59

leaked from Denise Dresser, right? Who's

51:01

the CRO president saying, "Wow, well,

51:03

first of all, Anthropic's overstating

51:05

their revenue. We're still ahead." And

51:07

saying, "Hey, we have all you know, we

51:08

have the capacity. They're out of

51:09

capacity." And blah, blah, blah. Okay.

51:11

At first blush, this memo to me, I mean,

51:15

it it's it it it seemed like a flashback

51:17

to something that Mark Benoff might

51:19

write, who I love, but more appropriate

51:21

for Salesforce than for OpenAI when I

51:23

first read it, right? And her last gig

51:24

was CEO of Slack, right? So, my first

51:26

blush, I thought it seemed out of place

51:27

at an AI leader, but then I thought

51:29

about it and I'm like, this is the exact

51:33

type of messaging you want to win

51:35

traditional enterprise customers. So,

51:38

I'm pretty bullish actually on OpenAI in

51:40

the enterprise because all this

51:42

enterprise DNA they have which probably

51:45

didn't help a snail's inch the

51:48

last 12 months in the future when all

51:51

the models are so powerful and big

51:53

enterprises are trying to make decisions

51:55

between a couple of top brands. I think

51:58

I think the ability to

52:01

sell this directly to enterprise versus

52:03

coming in from the bottom coming in

52:04

through the CTO coming through the other

52:05

coming in from functional groups is

52:07

going to be very powerful. So I think it

52:08

may be hiring the you know OpenAI said

52:11

they're going to double in size right

52:12

and a lot of it's around selling motions

52:14

to the enterprise. I think it's going to

52:16

work. I think they're going to run a lot

52:17

of the traditional playbook which is

52:19

going to work better and better in 2027

52:22

than when it did when the last year was

52:23

ask your developer. That was the

52:25

land, and that's why Anthropic won. Ask

52:27

your developer: I don't want

52:29

Copilot,

52:30

I want Opus. And your developer picked

52:33

everything, picked it for your

52:35

app. I'm using the old Twilio

52:37

mantra because it worked for years until

52:39

it didn't, and I think it's going to work

52:40

for LLMs until it doesn't,

52:42

until you go deep into traditional

52:44

enterprises and if open AI which is you

52:48

know it's going to be the number one or

52:49

number two brand for as long as we do

52:51

this show I think they may be able to

52:53

outsell it versus uh hey the world's

52:55

going to end, from Dario. Thanks for

52:57

letting me in the lobby. Everyone's

52:58

going to be unemployed next week.

53:02

That one may cast a chill in the CIO's

53:04

office. Thanks guys for having me. Most

53:07

of you won't have a job next week. I I

53:10

would have two comments because I'm I'm

53:12

not quite in that place. Maybe three

53:14

comments, right?

53:16

And one is the one aha I had from this

53:21

is that, you know, the comment that

53:23

compute is the ball game, right? You

53:26

know whatever about the long-term

53:27

overinvestment and I still angst about

53:29

that there's no doubt in my mind that

53:31

right now

53:33

everyone is compute scarce and there's

53:36

going to be no restraint no one's going

53:38

to blink on their investments for 2026

53:40

you actually know the next four quarters

53:42

of Nvidia announcement you know the next

53:44

four quarters of everyone one of the

53:46

infrastructure advant which is every

53:48

single thing we can make if you can

53:49

count the wafer starts at TSMC you can

53:51

predict Nvidia's revenues everything is

53:53

going to be sold out from now and for

53:54

the next 12 months because if the rate

53:56

limiting constraint is compute then

53:59

everyone's just going to buy compute

54:00

right. So that was the first big thing:

54:02

compute is the ball game. And yeah, they've got some

54:03

interesting position relative to

54:05

anthropic in that they were more

54:07

aggressive so they have more compute so

54:09

yeah that's that that's one thing it

54:11

also means by the way the second

54:12

consequence is you're going to see some

54:14

compute rationing. People are going to start

54:16

allocating tokens to the highest

54:17

effectively the highest bidder. You're

54:18

going to see a lot of throttling like

54:20

you're going to see these plans like

54:22

that's why they throttled Sora. You're going

54:23

to see some of the Claude plans get

54:25

throttled down. You know that's what

54:26

money is for. It's to allocate scarce

54:28

resources. It's actually the definition

54:29

of economics. It's the study of the

54:31

allocation of scarce resources right and

54:33

they're going to start allocating those

54:35

scarce resources of compute via price.

54:36

So that's one big trend. Um, the second

54:39

one, I don't know if you're going to see

54:41

that level of flip from, oh my god, it's

54:43

all Anthropic, to, oh my god, it's all OpenAI.

54:44

I think it's a two-way fight.

54:46

Anthropic has the advantage of clarity

54:48

and focus.

54:50

Open AAI has the advantage of the

54:52

consumer business, right? And they're

54:55

going to have to slug it out, right? I

54:58

think, hang on, but I think Anthropic

55:02

has the the little just the way they've

55:04

played the last 12 months have a

55:05

slightly better lead right now both in

55:08

terms of perception and in terms of

55:10

developer friendliness, but OpenAI is

55:11

not going to roll over and die, right? I

55:13

think I want to come back to a point I

55:14

made earlier which

55:17

I've only fully processed now, but

55:18

sometimes it's important to say the big thing

55:20

very clearly right if you do

55:23

consumer versus enterprise typically the

55:24

biggest things have been in consumer. I mean,

55:27

if you look at Google, Google's

55:29

consumer business and ads is two-thirds

55:31

of the value and you know maybe the

55:32

cloud is maybe one-third roughly, right? And

55:35

the big money has been in consumer right

55:40

bear with me when I say the obvious but

55:41

consumers actually don't want to they

55:43

want really great ChatGPT. There might

55:45

be some models like you know kind of

55:48

friends and companion models but

55:50

fundamentally when I go home I want to

55:52

buy Netflix when I go to work I want to

55:54

buy intelligence it may well be that

55:57

that the zoom out comment from all this

55:58

is enterprise is two-thirds of the

56:01

ball game in AI and consumer is one-third

56:04

or less which is the flip of the last

56:06

time because you would have thought two

56:07

or three years ago when ChatGPT

56:10

exploded that that was a really great

56:11

launching point right but if in fact

56:14

enterprise is the better place then yeah

56:16

you're going to have a good consumer

56:17

business but the ball game may be

56:20

compute but the ball game from a

56:22

customer's perspective may be two-thirds

56:24

enterprise, one-third consumer, which is the

56:26

mirror opposite of the internet. And

56:28

that's just one of those huh big picture

56:30

comment because I realize I don't go

56:31

home and want to do cognition. I go home

56:33

and I want Netflix. I go to work and I

56:35

want thinking and they're selling

56:37

thinking. It's an enterprise business.

56:39

And by the way, you get into a really

56:40

fun and interesting discussion which is

56:43

above my pay grade, but I started to see

56:44

comments about it, which is: do the things you

56:47

do to make the model consumer

56:49

friendly, you know, the happy tone, does

56:51

the same model continue to work really

56:54

amazingly well for both? You know, if

56:56

the enterprise wants clarity and

56:57

concision, if the consumer wants a

56:59

little more friendly answers, do you

57:00

really find that divide the tone and the

57:03

persona of the consumer model and the

57:05

enterprise model start to become

57:06

different? I don't know if that has

57:07

implications, it's above my pay grade,

57:09

but it's in a category of something to

57:10

think about.

57:11

>> I think I think very much so. And I

57:13

think you've seen it on like consumer

57:14

research studies. So, they've seen that

57:16

actually younger people like OpenAI

57:18

because it's much more supportive of

57:19

their emotional challenges

57:21

>> You're exactly right. And in

57:23

business. I don't want support. I want

57:24

to be told these five things are good

57:26

and these three things are bad. Like

57:28

when we're doing investing, we don't

57:30

want supportive. We want decision. It's

57:31

almost the exact opposite. I want harsh

57:33

critique. This is a stupid deal. I want

57:35

the AI to say, "This is a stupid deal,

57:37

Rover. You looked at this three years

57:38

ago. It was dumb then, it's dumb now.

57:40

Stop, you idiot. Here are five

57:42

fact-based reasons why this is wrong."

57:45

And if you give that in a consumer app,

57:47

you're not going to have great lifetime

57:48

value or retention.

57:51

>> That was interesting. You know, Aaron

57:52

Levy had a tweet this

57:54

week about his latest road show meeting

57:56

with CIOS from Box. And um he talked

57:58

about how so many of the CIOs he's meeting

58:00

with now are token maxing. And

58:03

what he meant was they are now

58:06

creating ideally fixed token

58:08

budgets. Really, you know, they're

58:10

dollar budgets rather than

58:12

numerically tokens, for the coming year

58:15

and they're making the departments fight

58:16

it out per project. Now, this is where I

58:19

think the game is going to change again

58:20

with the leaders because when most of

58:22

the budget is essentially rogue, when

58:24

it's developer teams picking anthropic

58:26

almost universally last year, when

58:28

you're giving them the budget because

58:30

the throughput is so intelligent,

58:31

you're finding budget for that team. Um,

58:34

or you're using discretionary budget to

58:37

bring in to bring in little agents. It's

58:39

one thing. When the CIO takes control

58:41

again of how many tokens across a large

58:44

enterprise, that's a very different

58:46

calculus of which vendor I choose from.

58:47

And if the CIO prefers to buy OpenAI

58:50

because it's got a more traditional

58:51

sales motion, it's packaged better. It's

58:54

better used. And there will be

58:55

exceptions. There will be exceptions in

58:57

departments. They will get exceptions.

58:58

But overall for the enterprise, I'm

59:01

standardizing on OpenAI for 2028. It is

59:03

the right choice for our Fortune 2000

59:06

company. It's just a different world

59:08

when you know whatever the 60 billion in

59:11

OpenAI and ChatGPT revenue, all the

59:13

enterprise stuff is still in some sense

59:14

rogue today. So much of it is rogue. It

59:16

is out of budget. It is out of band. It

59:19

and and as that changes, it will change

59:21

which vendor we buy from.

59:23

>> If you are correct, and I'm not sure you

59:24

are, what it says is the people who

59:27

should go into guidance counseling are

59:29

Microsoft and Open AI because they need

59:32

to get their relationship back together

59:34

again. Cuz who is the dominant path to

59:37

every single enterprise in the world?

59:39

It's Microsoft. They need to get some

59:41

couples therapy. They need to accept

59:43

that they're different. They have

59:44

differences but they can reconcile and

59:46

they need to start because otherwise

59:48

they're going to get the clock cleaned

59:49

and frankly, I think that's an

59:51

indulgence that OpenAI can no longer

59:53

afford. Right. And

59:55

because you're right is that the other

59:57

guys have stolen a

60:00

march on the kind of developer love.

60:02

Microsoft can go top down. I don't care

60:05

about your long-term competitive

60:06

dynamics. You need to kiss and make up

60:08

here people. So if you want it and if if

60:11

we agree that enterprise is twothirds of

60:13

the game, Microsoft is the key and so

60:16

you know figure it out, go to therapy,

60:18

talk through your issues and get this thing

60:19

back together again. And then the other

60:21

comment to make is I just want to give

60:22

an advert for Aaron: I read his

60:25

tweet last week about his you know

60:27

comments from the road show and look I

60:29

said this before we were lucky enough to

60:31

back Aaron 16 years ago, and

60:34

in fact last year we had him back at our

60:36

annual meeting just to

60:38

talk. Aaron is a walking investment

60:39

insight. I mean, I read his tweets and

60:41

I'm like literally I send it out to the

60:42

group and say, "This is the latest

60:44

thinking on what you should be thinking

60:45

about about what CIOS are thinking

60:47

about. Just read this and then you'll

60:48

know, right?" It's just so that was such

60:50

an insightful tweet about agents. He's

60:52

not a doomer on employment at all. He's

60:55

like, "People are going to be rolling

60:57

this thing out and if you understand

60:58

what they're doing, you will have a role

60:59

here." And he has a really good feeling

61:01

that. You know, he talked about token maxing.

61:03

It's like beyond

61:05

Silicon Valley, there are going to be

61:06

meaningful budgets. there are going to

61:08

be constraints and this thing, you know,

61:09

just a real sense of how CIOS on the

61:12

front line are rolling out AI. I just I

61:14

just think it's excellent. So, I I

61:15

follow him all the time.

61:17

>> I thought it was so excellent. I

61:18

actually messaged him and said, "Do you

61:19

want to come on the show this week and

61:21

do it?" And he said, "I would love to.

61:22

Let's do it tomorrow." And that was this

61:23

morning. So, I'm actually doing a show

61:25

with him tomorrow about this tweet

61:27

thread.

61:27

>> He's in that rare combination of

61:29

frankly, you know, grounded enough in

61:32

terms of 15 years calling on CIOS. 15

61:35

years calling on CIOS to really know

61:37

what they think and at the same time

61:39

frankly young enough and flexible enough

61:41

to really understand what AI is doing.

61:43

It's literally like an investment

61:44

insight. It's like a one of my

61:46

colleagues even said it when he spoke

61:48

last. He said it literally

61:49

was just

61:50

like an investment memo spewed out on

61:52

enterprise AI. Good for you.

61:54

>> Yeah. Here's the thing now. Just the one

61:56

tough thing and I I almost don't want to

61:57

say it uh because I love

61:59

>> you can say it.

62:00

>> But if he can't reaccelerate Box with his

62:03

incredible depth of like 10 out of 10,

62:06

right? If he can't, what hope is there

62:08

for so many other leaders, so many other

62:11

unicorns and others if he can't get

62:13

Box to 20 to 30% growth, I'm

62:16

giving up on on the rest of the world.

62:18

>> It's a totally fair comment. given up.

62:19

>> Absolutely.

62:20

>> And I hope he is. Look, I admire

62:22

him so enormously, and I really hope

62:24

they can reaccelerate,

62:25

>> right? Because I do think he's

62:27

>> he knows everything that's happening and

62:28

he's not pretending. He's not

62:30

like so many folks who are

62:31

pretending. He's as engaged as he's ever been,

62:34

right? As stressed as he's ever been. He

62:36

can't work any harder. If he can't get

62:38

this reaccelerated,

62:41

uh good god, who the hell can who who

62:44

the hell can? We had the financials for

62:47

SpaceX leaked a 5 billion loss on 18.5

62:51

billion in revenue. The reason I

62:53

think it's so important for the audience is

62:54

there are so many endowment funds and

62:57

managers who hold SpaceX in some way who

63:00

are awaiting the IPO later this year. Um

63:03

the loss is driven by the XAI

63:05

acquisition not by operations important

63:06

to add. Uh but the $2 trillion potential

63:10

IPO is a big number and the math needs

63:12

to be worked out.

63:14

>> Okay, keep going.

63:16

So 18.5 billion in revenue at 2 trillion

63:18

is 108x.
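The multiple quoted here is just the two leaked numbers divided, a quick sanity check:

```python
# $2T rumored IPO valuation over $18.5B of leaked revenue.
valuation = 2_000_000_000_000
revenue = 18_500_000_000

multiple = valuation / revenue
print(f"{multiple:.0f}x revenue")  # prints "108x revenue"
```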

63:21

Did these numbers change a perspective?

63:24

Did they confirm an opinion? Didn't

63:27

change confirmed.

63:30

But let's break it apart a little bit.

63:32

The first is okay all technical

63:34

accounting on you now, right?

63:36

When did XAI close? Because pooling

63:39

accounting is gone. Pooling accounting doesn't

63:41

exist anymore where you retroactively

63:43

recast the financials as if the

63:44

companies were together. So that's only

63:46

the loss from when XAI was acquired,

63:49

right? And I think it was late last

63:51

year, early this year. So we actually

63:52

don't know the actual run rate, right?

63:54

Do you understand me? In other words, if

63:56

the deal closed in in October 1st, then

63:59

that would only reflect one quarter of

64:01

loss. So anyone hypothesizing on it's

64:03

profit or it's profit excluding that or

64:04

it's only or it's quote unquote only a

64:06

$5 billion loss until I see the actual

64:08

gap financials I don't know right

64:11

let's start with

64:11

>> it could be 20

64:12

>> right exactly

64:13

>> $20 billion loss

64:14

>> but but it but I think the rough

64:16

trajectory of it will be something like

64:18

the following

64:20

we have an amazing we have a an amazing

64:24

launch business with a near monopoly on

64:27

cost effective launch and that price

64:28

could come down with the next generation

64:30

rocket

64:31

We have an amazing business on Starlink.

64:33

I think the whole XAI thing, in

64:36

retrospect I think you'll look back and

64:37

go I'm not sure I would have paid 250

64:39

billion for XAI. And then the question

64:41

as you say is how do you value that

64:44

relative to 100 times revenues and you

64:45

know we've talked about this before. I'm

64:48

not going to say it's right or wrong

64:49

because I think what I would say is

64:53

obviously very few things trade at 100

64:55

times revenues for any extended period

64:56

of time. Let's just say that. So, it's

64:58

clearly underwriting a level of growth.

65:00

It's underwriting a reacceleration of

65:02

growth even of the Starlink business,

65:04

which is plausible based on the future

65:06

things they're doing, but feels like a

65:09

lot, he said gently.

65:11

>> It appears to be the most expensive IPO

65:13

at scale of all time.

65:17

>> Yeah, it appears no one at scale

65:19

has ever IPOed at

65:21

a revenue multiple approaching this,

65:23

right? I mean the case will obviously be

65:25

all the future things you know what is

65:28

space and you read the bold case and as

65:29

I say as I say I'm trying to avoid the I

65:31

don't believe it and I try to express my

65:34

concern rather than saying oh I think

65:36

that's crazy I just say you have a you

65:38

have the existing business and then you

65:39

have a series of new initiatives around

65:42

direct to cellular and all of which and

65:44

then obviously data centers in space you

65:45

can articulate a massive market and

65:50

the question as I've said before is so

65:52

you take these adjacent TAMs. And if

65:54

you give them a 100% probability of

65:56

happening and a 100% probability of them

65:59

happening right now, in other words, no

66:00

NPV because it takes five years to make

66:02

it happen, then you probably get to $2

66:03

trillion. If on the other hand, you

66:06

apply a probability of it not happening

66:07

in a time value of money, you get to a

66:09

lower number. And maybe that's a good

66:10

way to reduce it. I mean, the Elon

66:12

believers are saying, "These are the

66:14

future things, and I ascribe 100%

66:16

probability of success." And it's like,

66:18

I'm going to give them credit for today,

66:20

even though it's going to take three or

66:21

four more years. And so it's basically

66:24

the Elon discount rate. The Elon

66:25

discount rate is zero. And the Elon

66:28

probability of failure rate is zero to

66:31

get to 2 trillion. If you put a more

66:33

conservative number in both of those,

66:34

you probably end up in a different

66:35

place. You still have the upside. You

66:37

still have the long-term story, but are

66:38

you getting paid for the risk? And

66:39

that's a way of framing it. That's not,

66:41

oh, I think it's silly because a 100

66:42

times revenues is just too much. I think

66:44

that's a reductionist argument. I think

66:47

that what's really what you're really

66:49

saying is I'm looking at all this future

66:51

TAM and perhaps being more sober about

66:53

the probability of it happening. I'm

66:55

revising my prior from oh 100 is crazy

66:57

which is just too simplistic Rory to

67:01

what are you saying about these other

67:04

markets when you feel that it should be

67:06

at 30 times revenue you're effectively

67:08

saying maybe it takes four years for the

67:11

data centers to happen and the direct to

67:13

sell to happen and maybe the discount

67:14

rate for that is 15% and maybe the

67:17

probability of success is 70 but not 100

67:20

and pretty soon, you know, you multiply all

67:21

that. You've got to get paid for the risk.
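The probability-and-discount framing above reduces to one formula. A minimal sketch, where the 70% odds, 15% discount rate, and four-year horizon are the purely illustrative inputs floated in the conversation, not a real model:

```python
def risk_adjusted_value(value_b, p_success, discount_rate, years):
    """Probability-weighted present value, in billions of dollars."""
    return value_b * p_success / (1 + discount_rate) ** years

# "The Elon discount rate is zero and the probability of failure is zero":
full_credit_b = risk_adjusted_value(2000, p_success=1.0, discount_rate=0.0, years=0)

# A more sober underwriting of the same $2T success case:
sober_b = risk_adjusted_value(2000, p_success=0.7, discount_rate=0.15, years=4)

print(f"Full credit today:        ${full_credit_b:,.0f}B")  # $2,000B
print(f"70% odds, 15%, 4y out:    ${sober_b:,.0f}B")        # ~$800B
```

Same upside story, very different present value: that is the "are you getting paid for the risk" framing in one line of arithmetic.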

67:25

Jason, what topic do you think we should

67:27

discuss that we have left?

67:30

>> Um, some private stuff.

67:33

>> There's uh uh some private stuff.

67:36

>> No, some private market stuff. Simple

67:37

humble venture

67:38

>> The first for what it's worth, the topic

67:41

I put, but I actually I don't think it's

67:43

as interesting as a larger topic.

67:44

AppLovin, 898 employees. This is

67:48

not a brand new AI company. Last

67:50

week, 4.5 million revenue per head. I've

67:52

been thinking a lot about this. You

67:53

have, you know, the block memo and and

67:56

what Jack Dorsey wants to do and you

67:58

have, you know, every Andre chart of the

68:00

week is showing how efficient the next

68:02

generation is, right? How 11 Labs and

68:04

everyone's so efficient. Um, just the

68:07

meta thing. My captain obvious learning

68:09

from all the conversation I have is um

68:11

it's a choice.

68:14

Everyone wants to be small by choice.

68:17

And this is what I think is going to be

68:19

disruptive for the next year and a half.

68:21

As VCs, you get really excited when you

68:23

see an efficient company because all

68:24

things being equal, hey, they don't need

68:26

to fund raise as much. I'm going to be

68:27

diluted less. Um, it's less risky,

68:29

right? Everyone loves, everyone wants to

68:31

invest in the next version of Viva. We

68:33

raised 3 million and got to 30

68:35

billion. That's the no matter what

68:36

anybody says, that's the venture dream.

68:38

That's good.

68:39

>> We raised three million and we're worth

68:40

30 billion. I don't care what you do.

68:42

Absolutely. Yeah. Whether it's bagels or

68:44

healthcare software, three. So, we want

68:46

and and so we see hints of this in this

68:48

employee thing. Um it's not quite that

68:51

simple when the gross margins are lower,

68:53

but I think what I I'm seeing everywhere

68:56

is everyone just wants to be smaller by

68:59

choice. They just and and AI is an

69:02

enabler because it lets my best

69:03

engineers do more. AI is an enabler

69:05

because I can get rid of those SDRs. Um,

69:08

but um,

69:09

>> Jason, I actually tweeted last night,

69:12

"The core test when evaluating a team is

69:14

knowing what I know now about the person

69:16

having worked with them, would I hire

69:19

them again?"

69:19

>> Yeah.

69:20

>> And you said that's not the question.

69:22

What did you say was the question? Would

69:24

I replace them with an agent? Would I

69:25

rather work with them or replace them

69:27

with an agent? It's the same thing. I'd

69:28

rather have an agent than a

69:30

mediocre person. Right. Everyone thinks

69:31

that. Not everyone says

69:34

it out loud. But provided it wasn't a

69:36

mediocre agent as you've articulated

69:37

earlier.

69:38

>> Yeah, but I know how to build a good

69:39

agent now.

69:40

>> Okay.

69:41

>> As will everyone in 18 months. They

69:43

don't know how to today. Everyone in 18

69:44

months will figure out how to build an

69:45

agent because they'll get easier and easier

69:48

to train. Like, and we we touched on

69:50

this briefly, but like you don't need

69:51

prompt engineers anymore, right? Um like

69:54

for fun yesterday on Replit, I

69:57

built a fully functional website in

70:00

about six minutes that has video, audio,

70:02

and everything. My prompt was, um,

70:05

create a whole website

70:07

around my theme of the recycled mediocre

70:09

in this post. Recycled mediocre or when

70:12

you keep hiring the same mediocre folks

70:13

again and again. And it it did the whole

70:16

thing from that prompt. It created the

70:17

whole site. Pulled up which mate can't

70:19

do. Pulled up all the context, created

70:21

an incredible horror image, then created

70:23

the video out of it, then created the

70:24

connection, then pulled up all the

70:25

context. And so my point there is you

70:28

don't need to. The most mediocre

70:30

prompts in the world do magic. Now, you

70:32

don't need to be a prompt engineer. And

70:34

right now, getting an agent to work, you

70:36

need you need an FTE and it takes weeks

70:38

or sometimes even months and lots of

70:40

training. It shouldn't be

70:43

true in 18 months, right? It should be

70:45

as magical as prompts are today. So, I

70:46

think we're all going to be good at

70:47

agents in 18 months. And so, we're all

70:50

going to say to Harry's point, do I want

70:52

to work with that person again or would

70:54

I rather replace them with an agent?

70:55

We're going to it's not and it's not

70:56

about the money. We're just going to

70:57

choose to be leaner. We're going to

70:59

choose to be leaner for many reasons. I

71:01

think you are right on the

71:02

directionality, right? I just want to

71:04

make a kind of a business point

71:06

which is the simplistic revenue per

71:09

employee is just not a useful metric

71:11

because, uh, you know, you

71:14

can't compare the efficiency like

71:17

someone like Cursor has very low

71:19

employee count but very massive gross

71:21

margin, right? Gross margin, cost of

71:23

sales, right? I mean, watch this: Cursor

71:26

does 4 million per employee Salesforce

71:28

does 700,000 per employee. Oh,

71:31

Cursor must be more efficient. Well, it

71:33

turns out Salesforce has 30% operating

71:35

margins and Cursor is losing a lot of

71:36

money. Why? Because they spend a whole

71:38

ton on tokens, right? So, it's not I

71:40

mean you can compare two companies in the

71:43

same business on an efficiency metric,

71:45

but the, I mean, the one

71:48

size fits all revenue per employee

71:50

doesn't really cut it, right? Um that

71:53

said, two comments. One is AppLovin.

71:55

AppLovin is an amazing business

71:57

because they actually have fairly high

71:58

gross margins and low employee count.

72:00

They're just one. It's one of those

72:01

businesses where I've Yeah. You just

72:05

have to go away and understand how it

72:06

fits at that interstitial moment in

72:08

mobile ad networks. And it just it's

72:10

it's the only at this point given Trade

72:12

Desk's downturn, it's the most

72:14

successful ad network business by far.

72:16

And we could digress onto why that is,

72:19

but it's a one-of-a-kind business. It's

72:20

it's a 4.5 million per whatever it is

72:23

per employee business where unlike

72:25

Anthropic or OpenAI, no capex, unlike

72:28

cursor, no token cost. It's just a money

72:31

printing machine. Um I'm jealous. But

72:34

Jason, the second comment is, you are

72:36

still exactly right that

72:38

the trend everywhere is grind down the

72:39

headcount. Do you need them? What can be

72:42

automated? So you know the truth. So

72:44

maybe reflecting in my own mind maybe

72:47

the way to say it is it's not fair to

72:50

say to a SaaS company, hey, um, you know,

72:53

AppLovin does 4 and a half million,

72:56

Cursor 4 million, you're at 500,000. But

72:58

what is fair to say is last year you

73:00

were at 500,000. This year you better be

73:02

at 600,000 per employee because Jason's

73:05

telling you and next year maybe you got

73:06

be at 800. Right. So I do agree the

73:08

metric if you're not making progress on

73:10

that metric as a software company you

73:13

are not with the program. So on that

73:15

basis, I think you're right.
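The margin-adjustment point above can be sketched with the round numbers tossed around in the discussion. These are illustrative assumptions, not actual company financials; the "Cursor-like" negative margin is hypothetical.

```python
# Why raw revenue-per-employee misleads: adjust for operating margin.
# All figures are illustrative round numbers from the discussion,
# not actual company financials.

def operating_profit_per_employee(rev_per_employee: float,
                                  operating_margin: float) -> float:
    """Operating profit each employee generates, given a margin."""
    return rev_per_employee * operating_margin

# Hypothetical "Cursor-like" firm: $4M revenue per employee, but a
# negative operating margin from heavy token spend.
cursor_like = operating_profit_per_employee(4_000_000, -0.20)

# Hypothetical "Salesforce-like" firm: $700K per employee, 30% margin.
salesforce_like = operating_profit_per_employee(700_000, 0.30)

print(f"Cursor-like:     {cursor_like:>12,.0f} per employee")
print(f"Salesforce-like: {salesforce_like:>12,.0f} per employee")
```

On these assumed numbers, the "less efficient"-looking company earns roughly $210K of operating profit per head while the flashier one loses $800K per head, which is the point being made: one-size-fits-all revenue per employee only works when comparing companies with similar cost structures.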

73:17

>> Obviously, I'm just soaking up the

73:19

knowledge of smarter people than me.

73:20

Would you buy AppLovin today?

73:22

>> I don't know now. I mean, you always

73:24

worry ad network ad network businesses

73:26

over the medium-term get ground down,

73:29

but it's been able to survive for the

73:30

longest time. It's gotten rid of its

73:32

game business. It's purely focused on

73:34

this. And you know, for some reason,

73:37

it's found a way to exist in the Apple

73:38

ecosystem with all the privacy issues as

73:40

being the only way to do some of this

73:42

targeting. So, I need to know I need to

73:44

spend a lot more time thinking about it,

73:46

but it's been an astonishing run for it.

73:49

It's probably the standalone the biggest

73:52

beneficiary of mobile networks, mobile

73:54

ads clearly mobile ads, you know, after

73:56

maybe Meta and that. Yeah, amazing win.

73:59

There's two that I just wanted to touch

74:00

on. One was Thoma Bravo shutting down the

74:02

growth equity business and is this uh

74:05

foreshadowing of a load of other growth

74:07

equity businesses shuttering or is this

74:09

just Thoma Bravo-specific?

74:11

>> Well, first before you guys answer, can

74:13

you educate me because I'm I'm ignorant.

74:14

I don't understand the whole Thoma Bravo

74:16

empire and what it truly means they're

74:19

shutting down growth equity versus the

74:21

other vehicles. I don't understand it.

74:23

>> I think it's pretty straightforward. The

74:25

core business that puts 90% of the money

74:27

on the table is buying control positions

74:30

in software companies and you know with

74:32

some leverage and running them you know

74:35

adding doing build and add-on other

74:38

companies to them and ultimately selling

74:40

them either to another PE buyer. That's

74:42

most of what they do. it's 90% of the

74:43

money. They started doing non-control

74:45

and minority positions in late-stage,

74:48

high-growth companies

74:50

and it's just a different business. But

74:52

I just think it's different enough

74:55

from the control like in in a control

74:57

position business you're trying to

74:58

you're trying to buy value. You know, we

75:00

may look back and say many of the prices

75:02

they paid for those control positions in

75:04

21 and 22 weren't value, but you're

75:06

trying to buy value where you have

75:08

control and you're going to be EBITDA

75:10

positive and you're trying to pay down

75:11

the debt and do all those things. You

75:13

know, classic venture growth. You're

75:15

still hopefully growing 50 to 100%

75:18

minimum per our discussions. You're

75:20

probably still losing money. You're not

75:22

in a controlled position as the PE

75:24

investor. And in a period like right now

75:26

where your core business is threatened,

75:28

the first rule when threatened is you

75:30

retreat to the core. PE guys regularly

75:32

come into growth venture at the top of

75:34

markets thinking this looks easy and

75:37

they regularly retreat from those markets

75:39

when they discover it's hard, right? And

75:42

and

75:42

>> shouldn't they be going back in in 2026?

75:44

Isn't this a time to to be re-entering?

75:47

>> Possibly. But and again, not but two

75:50

reasons why not. One is your core

75:51

business is under threat. Two is that,

75:54

you know, the one thing

75:57

you don't want to be doing with your

75:58

investors is saying we have this thing

76:00

that 90% of your money is in but we're

76:02

fussing around with this other 10% even

76:04

if it's a good business it doesn't

76:05

matter right we gave you $10 billion to

76:08

invest in control positions in software

76:10

companies and we gave you half a billion

76:12

dollars to screw around doing something

76:13

else. You're having problems in your core

76:14

business; how about you fix that? It

76:16

doesn't even rise to the level of is it

76:17

a good opportunity or not it's not the

76:19

opportunity we have.

76:20

>> You said retreating to core, Rory. I

76:22

completely agree

76:24

with you. Core is the business. I'm sorry,

76:26

I'm never down on

76:28

businesses, but challenged businesses

76:29

like Coupa, like Anaplan, like

76:32

Medallia. I mean, I would say,

76:35

>> can you help me? Like, this feels like a

76:38

cluster of pain. Separate comment.

76:40

Yes. I mean, I think look, the one of

76:43

the things is that well, they're the

76:46

same type of companies that were hit hard

76:48

in the public market. So the same

76:50

discussion we had in the public markets

76:52

applies here. In other words, these are

76:54

mature plain vanilla SAS companies with

76:56

singledigit growth rates, right? They're

76:59

just not traded every day in the public

77:00

markets; they're traded in the privates. So

77:02

the question is what does that mean? The

77:04

first thing is these kind of companies

77:05

are trading excuse me at two to four

77:07

times revenues. Right? So the same

77:10

equivalent companies in private

77:12

should probably, quote, be valued today at

77:13

that, and many of them were bought at 10

77:15

times and have leverage. Right? So

77:17

that's a pretty tough place to be. The

77:18

negative spin is if you apply the same

77:21

math of three or four times revenues and

77:24

then deduct the debt, you have little or

77:27

no enterprise value, right? And that's

77:30

terrifying. And that means you could you

77:31

could see big losses in some of these PE

77:34

funds. Now, the positive spin that they

77:36

would give, which kind of goes back to

77:38

Jason's thing, I'm not sure I fully

77:40

believe it, is if these software marks

77:41

in the public markets are totally wrong

77:43

and then two years from now they're back

77:44

to eight times, then you know it'll be a

77:47

tree that fell in the forest, no one

77:49

will know, and in two years time they'll

77:50

be able to go public with Anaplan again

77:52

at eight times. And maybe that happens

77:54

and maybe not. Um, but that would be one

77:56

part of why it's going to be okay.
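The negative-spin arithmetic above, bought at 10 times revenue with leverage, now marked at two to four times, can be sketched with hypothetical numbers. None of these figures refer to any actual deal.

```python
# Sketch of the LBO math described above: equity is enterprise value
# minus debt, floored at zero. All numbers are hypothetical.

def equity_value(revenue: float, multiple: float, debt: float) -> float:
    """Equity left after marking the company at revenue * multiple."""
    return max(revenue * multiple - debt, 0.0)

revenue = 300e6              # hypothetical $300M-revenue SaaS company
entry_ev = 10 * revenue      # bought at 10x revenue in 2021-22
debt = 0.5 * entry_ev        # half the price funded with leverage

for m in (2, 3, 4, 8):
    eq = equity_value(revenue, m, debt)
    print(f"marked at {m}x revenue -> equity ${eq / 1e6:,.0f}M")
```

At two to four times revenue, the $1.5B of assumed debt swallows the entire enterprise value and the equity is wiped out; only if multiples recover to eight times, the "tree fell in the forest" scenario, is there meaningful equity left.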

77:58

If I was articulating, as Thoma Bravo, why

78:00

it's going to be okay, the first comment

78:02

would be it's way overdone and these

78:04

things are really worth eight times

78:05

revenues because they're profitable and

78:07

in the end things trade at 15 times cash

78:09

flow and not nine times, we'll all be

78:11

okay. That's one argument. And then the

78:13

second argument could be some version of

78:15

the Jason one which is sometimes the

78:17

advantage of private ownership is acute

78:19

clarity. And if you're going to make the

78:21

transform to AI bet, I bet you these

78:24

guys are going to articulate that we

78:26

will make that happen because we will

78:27

own these things. We will replace

78:29

management if they're not capable of

78:30

doing it. We will hire other people that

78:32

can do it. Maybe we'll buy assets. I'm

78:34

not sure I buy that, but that's probably

78:36

part of the argument, which is PE's

78:38

argument has always been transformation.

78:40

Right now, the to date the

78:43

transformation has been about cutting

78:45

cost and being more efficient. I don't

78:47

know if they can pull off transformation

78:49

where transformation is as Jason says

78:52

taking Wix and adding, you know, not a

78:54

60%, to Jason's point, which I realize is not

78:57

adding a 60. If you add a 60% agent as a

79:01

privately held company, you haven't

79:02

passed the Jason test. The

79:04

question you have to ask these guys is

79:05

can they add a 100% agent that you can

79:08

charge for if they can and they rekindle

79:10

growth to 20 then they'll have earned

79:12

their massive carry right if they can't

79:15

and these things don't bounce back then

79:17

you're right you could have a train

79:18

wreck and that's the game that's their

79:20

ball game right now right which is why

79:22

they're all looking for AI experts it's

79:24

why on a going forward basis they're

79:26

pitching buying new companies and kind

79:28

of AI enabling them but for their

79:30

existing portfolio it's all about you

79:31

know, what do you add to Coupa or

79:34

Anaplan to make it AI-first, AI-forward at

79:36

least, not AI-first. On the one hand, I

79:39

think we're going to look back on this

79:41

and see it's all a shame

79:44

because if you have 10,000 happy

79:45

customers, 50,000, 100,000,

79:47

150,000 who are reasonably happy. Not

79:49

thrilled,

79:50

>> but reasonably happy.

79:51

>> And you've had now 12, 15, 18 months to

79:55

build them an agentic product. You've

79:57

had access to the LLMs. You've been able

79:59

to carve out 50 of your best engineers

80:01

to work on it. And you didn't take

80:03

advantage of your installed base for

80:05

real. Not in the moat way, not

80:07

prisoners, but I mean, if you didn't

80:08

take advantage of the fact that 90% of

80:10

your customers are not at the bleeding

80:11

edge of AI and sell them an agent, this

80:14

is such a missed opportunity for the

80:15

leaders. It's tragic. It's tragic

80:18

because most folks have not made their

80:19

decisions on agents. It is, and we're

80:22

going to look back and we're going to

80:23

see these teams were so mediocre and so

80:25

paralyzed. And I got to tell you, when I

80:27

talk with folks, they're so out of ideas. People

80:29

should not be asking me what they should

80:31

do with their agents. They should be

80:33

showing me their agents and asking me

80:35

for constructive criticism on them.

80:37

Right? People are are are paralyzed with

80:39

fear. They don't want to work twice as

80:41

hard as they used to and they don't know

80:43

what to build. And it's a tragedy

80:44

because even today selling to the

80:46

install base is much easier than finding

80:48

a new customer. If they're happy,

80:50

>> yeah,

80:50

>> just call them up. They will take the

80:51

meeting. And this is the great tragedy.

80:53

And I think a lot of private equity

80:55

firms are probably pretending pretending

80:57

their playbook's gonna work. I'm going

80:58

to hire this AI expert from Stebbins,

81:01

Driscoll, and Lumpkin. It's 2 million a

81:03

year. They're coming in with their ties

81:05

and their checkered shirts, and they're

81:07

going to teach us how to do AI and get

81:08

us to a 60% solution by the end of the

81:11

year. It slips a bit and it's just it's

81:13

it's it's a tragedy and I'll tell you

81:15

why. It's a triple tragedy. Okay, this

81:17

is something I didn't know. I mean,

81:19

granted, in my brief tenure at Adobe as

81:21

a VP, okay, and that was not a high

81:24

point for Adobe. It was during the

81:25

transition to the cloud. But I will tell

81:27

you,

81:28

>> they said the same

81:29

>> and I never, you know, I never

81:31

underestimate competitors or big

81:32

companies. I got to be careful, right?

81:34

But I will tell you what I learned at

81:35

Adobe that folks don't realize: they had 100

81:37

surplus amazing engineers, either by

81:40

design or accident. Either by accident

81:42

they were working on projects that

81:43

weren't quite going to get there, right?

81:45

Or they were available. They were

81:47

available more. Now, sometimes they were

81:50

on the back half of their career.

81:51

Sometimes they weren't quite

81:53

as sharp as the top engineers at Replit

81:56

or Lovable or Cursor, but, I mean, great.

81:58

My CTO, the toughest critic, would

82:01

actually say, let's go

82:03

steal these five guys. They're actually

82:04

great. They exist. So it is a crying

82:07

shame you can't take a team of six at

82:11

Coupa or whatever, build the world's best

82:14

product, and ship it to your 10, 20, 50,

82:17

100,000 customers. We're watching

82:19

tragedies in the making, and it's sad.

82:22

It's sad because they're still deep down

82:23

running the dated playbook of a big

82:26

release every four to five years and a

82:28

quarter release which changes a few

82:30

pixels and adds some workflow. They're

82:32

deep down all the companies I've talked

82:33

to are still running that playbook and

82:35

it's a tragedy because they

82:37

have the opportunity, but

82:39

Stebbins Driscoll Lumpkin AI consulting is

82:42

not going to get them there and that's

82:44

what PE firms want to do, bring in these

82:46

guys. That's not going to work. And

82:48

and everyone is right. They have the

82:50

base. They have the opportunity. And um

82:52

you know, and they're going to end up in

82:53

these Medallia death spirals where

82:55

they're defaulting on on debt and

82:57

defaulting on billions of dollars and

82:59

and they can't afford it,

83:01

and it's just, um

83:03

>> it's not too late to sell to your

83:05

install base.

83:06

>> To make the math work on the

83:08

LBO, they don't need to attract a whole

83:10

bunch of new customers. You know, they

83:11

already, once you do the PE deal,

83:13

you've already accepted that you're not

83:14

a growth story

83:16

anymore. But Jason's

83:17

exactly right. If you can upsell, you

83:20

know, 20, 30, 40% more by delivering

83:23

this, you know, 100% agent, it might not be

83:26

the next most amazing company, but

83:27

you'll be cash flow positive. You'll pay

83:29

off your debt. You'll create enterprise

83:30

value, and 5,000 to 10,000 customers aren't

83:33

going to have to do a migration in two

83:34

years when you file for bankruptcy. So,

83:36

I I agree. It's a It's a bounded

83:38

problem. It should be solvable, but it's

83:40

not going to be in many cases.
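The upsell math being described, 20 to 40% more contract value sold into the installed base with no new logos needed, can be sketched with hypothetical round numbers.

```python
# Sketch of upselling an agent product into an installed base.
# All inputs are hypothetical round numbers, not any real company.

def agent_upsell_arr(customers: int, acv: float,
                     attach_rate: float, price_uplift: float) -> float:
    """Extra ARR from customers who buy an agent add-on priced as a
    fraction of their existing annual contract value."""
    return customers * acv * attach_rate * price_uplift

base_arr = 10_000 * 50_000                            # 10k customers at $50K ACV
extra = agent_upsell_arr(10_000, 50_000, 0.40, 0.30)  # 40% attach, +30% price

print(f"base ARR  ${base_arr / 1e6:,.0f}M")
print(f"agent ARR ${extra / 1e6:,.0f}M (+{extra / base_arr:.0%})")
```

Even a partial attach rate on a reasonably happy installed base adds double-digit ARR growth and cash flow to service the debt, which is why the hosts call the failure to ship these agents a missed opportunity.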

83:43

>> What?

83:43

>> I'm going to ask you two questions.

83:45

>> Oh, God. And you got the binaries on

83:47

them.

83:48

>> So, who's going to go out first? Open AI

83:51

or Anthropic?

83:52

>> Anthropic.

83:55

>> SpaceX, Anthropic, OpenAI, in that

83:56

order.

83:57

>> It appears that SpaceX has already filed

83:59

and they're on track. So, we already

84:00

know the answer there. The fact that

84:01

Anthropic just added the Novartis CEO to

84:04

the board, that's a sign they're getting

84:05

ready to IPO as soon as they can. I

84:07

mean, I'm sure he's going to add value

84:08

in healthcare, but that means nothing,

84:10

but we're trying to IPO very soon,

84:12

right? That, and finding who the hell

84:14

will chair the audit committee are clear

84:16

signs you're going to IPO as soon as

84:17

possible. So given that they have that

84:20

and OpenAI is sending out war memos. I'm

84:22

just voting that they go out first.

84:25

>> Yeah, not even a difficult question.

84:28

>> Will Sarah Friar, the CFO of OpenAI, be

84:31

there when they go out? Yes or no?

84:33

>> Look, I do know one thing. CEOs and CFOs

84:37

have to, A, be wildly

84:40

aligned, and, B, the CFO should probably

84:42

report to the CEO, right? And right now

84:45

I believe the CFO in this case reports

84:47

to the president VJ which seems an

84:50

anomalous arrangement, right? And so

84:52

maybe rather than kind of doing "so

84:54

and so is in or out," what I would say is

84:56

this.

84:58

If that IPO is going to happen and if

84:59

this team is going to make it happen,

85:01

then they need to be in absolute sync

85:03

and they probably need to have a more

85:05

traditional reporting structure so that

85:07

people don't have one more thing to

85:08

think about or why this company is

85:09

weird. I mean, I always tell my CEOs up

85:11

and down, dude, on the things where

85:13

you're unique and different, you know,

85:16

where you're changing the world, do it

85:17

as different as you like, but all the

85:19

boring stuff, just give the market what

85:21

it wants. It wants the CFO

85:23

reporting to the CEO. They want to be in

85:24

sync. It makes everyone's head hurt if

85:26

the CEO and CFO are saying different

85:28

things about something as fundamental as

85:30

where when we're going to go public that

85:32

you know stop the leaking, stay in sync.

85:35

If, you know, that's just not a thing.

85:38

No one wants to hear that. No one wants

85:39

to hear that because those are going to

85:41

be the two people on the road show. They

85:43

should be able to finish each other's

85:44

sentence. You should be able to put them

85:45

in separate rooms like a police

85:47

interrogation room and each of them

85:48

should say exactly the same thing and

85:49

they should stick to their story. The

85:51

idea that you have separate stories from

85:53

the two, it's just not it's just

85:55

palpably absurd. So rather than saying

85:57

who's in and who's out, that's what you

85:58

got to do. Again, back to the same

86:00

therapist that they're using for their

86:02

relationship with Microsoft could

86:03

actually do some internal relationships

86:05

too. Interesting. But I will say two

86:07

things. One, on the one hand, in

86:08

isolation, if I'm running something like

86:11

anything at scale, but

86:12

especially at OpenAI, I want no daylight

86:14

between me and my top lieutenants. No

86:16

daylight. Okay. Now, you could argue a

86:18

CFO's job is to create a little bit of

86:20

distance to be that objective person in

86:22

the room and and there is some truth to

86:24

that, right? But there can't be daylight

86:26

or it's not going to work. So, if there

86:28

is that daylight and it were that

86:30

simple, you make a change and you make a

86:31

change before the IPO so it's least

86:33

disruptive. Now having said that if you

86:35

are running a company at scale and there

86:37

is already a ton of transition on the

86:38

senior team which there has been ton of

86:41

turnover I've seen a lot of times roles

86:43

like CFO and others where you're like

86:45

listen I just don't want to change one

86:47

more thing like yeah there's some issues

86:50

here but Sarah is so experienced that

86:52

things work. Like, Workday finally

86:55

works, we finally got all these things to

86:57

work. I got 99 problems and this

87:00

is not one that I want to tackle. So I

87:02

have often seen something like this

87:03

where you shouldn't have the daylight,

87:05

but it's not

87:08

poltergeists streaming through, and

87:10

it's just not enough of a problem. Um if

87:12

your management team is super stable,

87:14

you have time to work on these things.

87:16

But sometimes islands of stability in

87:18

your management team, even if they're

87:20

not perfect, um it's just not worth

87:23

another turnover on the senior team. It

87:25

takes its every time you replace

87:26

someone, it takes its toll on the team,

87:28

especially if they're popular,

87:30

especially if they're liked and

87:31

respected. It just especially if there

87:33

if if everyone thinks they're terrible,

87:34

you move them out and and there's a

87:36

there's cake and a party on Friday. But

87:38

I wouldn't be surprised if she's pretty

87:39

popular in her own way and it takes its

87:41

toll. The raw ingredients of success are

87:43

there. The raw ingredients of success

87:45

are there to have OpenAI be an amazing

87:48

mega IPO. Just, you know, and I say, literally,

87:52

you get your therapist to make up with

87:54

Microsoft. Get your therapist to get you

87:56

aligned with your CFO. Focus on the

87:58

two big things which are getting the ads

88:00

product out and getting the enterprise

88:02

cranking, and stay the course. You have the

88:05

compute. Get it done. If you have an

88:07

executive that's truly arguing with the

88:09

CEO in public in the media, right? And/or,

88:12

like Dario back in the day at OpenAI

88:14

going directly to the board with

88:16

craziness, they got to go. Like it

88:18

doesn't matter. They got to go. They you

88:20

cannot be out in the media arguing with

88:22

the CEO. And you cannot be Dario, no

88:24

matter how smart you are, going to the

88:25

board and saying, "I will only stay at

88:27

OpenAI if I directly report to the

88:28

board." Like, it doesn't matter how good

88:30

you are. And this is brutal. Sometimes

88:31

you have to let some of your best people

88:33

go because it's too dysfunctional. They

88:35

got to go in those situations. Whenever

88:36

you get a phone call as a board member

88:38

from a VP, you're like,

88:40

"Okay, there's a problem here." Now,

88:41

maybe the CEO goes, maybe the VP goes,

88:44

maybe there's a problem you can solve

88:45

it, but you I'm not going to be as

88:47

absolute as you got to go, but your

88:48

antenna go up, you know, an entire notch

88:51

when you get that call. And you're

88:52

right. And as for briefing, I mean,

88:54

look, most companies aren't interesting

88:55

enough to have a media briefing. It's

88:58

almost like these companies have become,

89:01

you know, political level drama. And I

89:04

just saw a fun tweet from Martin Casado

89:06

was just basically saying, you know,

89:08

enjoy all the drama, enjoy all the pettiness,

89:10

backstabbing, and all that. It's because

89:12

this is such an exciting moment that the

89:13

media is focused on. And because they're

89:15

focused on it, you get all this is just

89:16

what happens when you're in the center

89:18

of the universe in terms of tech. So

89:20

roll with it. But you are right, Jason.

89:23

You want to be the tight-run ship in this

89:26

sloppy sloppy world. And whatever, if

89:28

any VPs get this far on the pod,

89:30

whatever you do, do not reach out to

89:31

your VCs to say there's problems with

89:33

your CEO. You're losing your

89:35

job. And not only are you losing your

89:37

job, it doesn't matter if you're right.

89:40

At some level, you're probably

89:42

right. If you're a passionate VP

89:44

and you see issues in

89:46

the company, you reach out to Rory or

89:47

Harry on the board, odds that you're

89:49

100% wrong are 0%. But you

89:52

are not going to... it's not worth it. You

89:54

are not going to what are they going to

89:55

do? Fire the CEO over you? Zero.

89:57

Unless there's fraud, 0.0%. You're gone.

90:01

You're maybe in three months or it may

90:03

be that afternoon. You're gone. Just

90:04

don't do it. VP, just resign. Just

90:06

resign with grace.

90:08

>> Yeah, it's a much longer discussion, but

90:09

yes.

90:10

>> What's the Shakespeare quote, Rory?

90:12

"All the world's a

90:14

stage, and all the men and women merely players." Yes.

90:16

>> Oh, he had to finish on a Shakespeare

90:17

quote. Incredibly cultured. Thank you so

90:19

much, guys.

Interactive Summary

In this discussion, the hosts cover Anthropic's decision to withhold the Mythos model due to security risks, the critical 'Jason Test' for SaaS companies to avoid the '60% solution' trap, and Amazon's pivot to in-house Trainium chips. They also analyze OpenAI's massive ad revenue targets and the ambitious $2 trillion valuation of SpaceX based on its future potential.
