OpenAI Buys TBPN & Their Management Team Reboot | Mercor Hack & Why Now is the Time for Cyber

0:00

I'm going to call [ __ ] start to

0:02

finish on this whole discussion.

0:04

>> So, what do we have on the agenda this

0:05

week? OpenAI reboots management team.

0:08

OpenAI buys TBPN.

0:10

>> I thought the acquisition was just

0:12

insane. Owning a media asset invariably

0:15

takes way more time than you think for

0:16

way less money than you expect. See Jeff

0:19

Bezos for details.

0:20

>> There's no way that deal is going to

0:21

happen today. Like, it's dead because of

0:23

management change. Anthropic hits a

0:25

whopping $30 billion in revenue,

0:28

surpassing OpenAI.

0:29

>> Their training costs are a quarter of

0:31

OpenAI's. It really feels like the

0:32

investors in Open AI got a much worse

0:34

deal in the last round than the

0:35

Anthropic ones did.

0:36

>> And then SpaceX finally confidentially

0:39

files for IPO targeting a $2 trillion

0:42

valuation.

0:43

>> The big three, SpaceX plus OpenAI plus

0:45

Anthropic, their value at IPO will

0:47

exceed every other IPO for the last 20

0:50

years combined. Ready to go,

1:05

boys. Welcome back. I've been looking

1:07

forward to this one. I was doing this

1:08

schedule over the weekend and last night

1:09

and I was like, "Wow, this week we

1:11

really have a lot of meat to get into."

1:13

So, I want to start with OpenAI and

1:16

Anthropic. So, Anthropic now has 30

1:19

billion in revenue, obviously

1:21

surpassing OpenAI. It's all intertwined

1:24

with the subsequent things that we will

1:26

discuss with Open AI, but as Jason put

1:30

in an email to us all, holy cow, Jason,

1:33

holy cow indeed. What did you think?

1:36

>> Even in an era where we're getting

1:38

inured and anesthetized to crazy

1:40

numbers, this one I did fall out of my

1:41

chair, right? Uh getting to 30 billion

1:45

up from 9 billion at the start of the

1:46

year. I mean, Salesforce is the largest

1:49

software company, right, at least the

1:52

cloud one, and it took them 25

1:54

years to get

1:56

there. Anthropic got there in five, but

1:58

maybe they really got there in three,

1:59

depending on how you count. But you just

2:03

you know, it was

2:05

incredible to see where they were in

2:07

February. We couldn't believe it. And

2:09

then essentially adding 10 million of

2:11

net ARR. Let's not debate whether

2:13

it's how many Rs there are and whether

2:16

it's recurring at this level of growth.

2:18

It really doesn't matter.

2:20

It doesn't matter. So, and that

2:23

they're still capacity constrained and

2:25

that Claude still shows us when we're in

2:26

there that it can't finish chats and

2:29

that uh you know every engineer in tech

2:32

has been told to consume more tokens and

2:33

move faster, right? The crazy thing is

2:35

what will it be at this rate at the end

2:37

of next year, right? It's crazy,

2:40

right? It grew 3.3x in four

2:43

months, we need Rory's math help to

2:45

figure out what Anthropic's run rate will

2:46

be at the end of '27.

2:49

>> The estimates that we were just

2:51

looking at two months ago just look

2:53

incredibly wrong at this stage.

2:55

>> Yeah.

2:55

>> So, yeah. No, these are all

2:57

amazing numbers, right?

2:59

>> Yeah.

3:00

>> And I think a bunch of other interesting

3:02

things start to happen here. One is you

3:04

kind of bunching in some stuff. One is

3:05

their announcement on OpenClaw and not

3:08

allowing that to be in the base plan. I

3:11

think it kind of gets back to they're in

3:13

a massively interesting situation now.

3:15

The revenue is exploding. Despite the

3:18

revenue explosion, they're still compute

3:20

constrained. In other words, they could

3:22

sell more if they had more, right? And

3:24

what do you do when you can sell more if

3:26

you had more, but you can't make more,

3:28

right? You can't magically make data

3:29

centers, though obviously they have that

3:31

big announcement to do that. What you

3:33

start doing is allocating capacity based

3:36

on money, right? And one of the first

3:39

things they figured out is these folks

3:40

using these OpenClaw-type agents are

3:43

consuming vast amounts of tokens on, you

3:46

know, fixed-price plans, and they

3:49

probably want to stop that which is what

3:51

they've done. So you're going to see

3:52

them do exactly what anyone in economics

3:55

would say to do, which is try and find a way

3:57

to maximize and extract even more

4:00

revenue. And you know, we saw it even

4:01

with OpenAI last week where you

4:03

deemphasize things like video which

4:05

consumes huge amounts of compute for

4:07

small amounts of revenue. In Anthropic's

4:10

case, obviously they have much less of

4:11

that pure slop, but you deemphasize

4:15

things like OpenClaw access where it

4:17

consumes a lot of your compute and

4:19

doesn't make you a ton of money. And I

4:21

think you're just going to see a

4:22

continued trend toward pricing tokens

4:26

closer to the value, right? Not

4:29

a huge trend, because,

4:31

I mean, you don't want to

4:32

overcompensate, because you want to get

4:34

people addicted to the product, because

4:35

the truth is the thing you have in your

4:37

favor with any digital good is the

4:39

complete certainty that prices per token

4:41

go down over time but you do at least

4:43

want to start allocating it a little

4:45

more sensibly while you're constrained.

4:47

So that's kind of I think the trend

4:49

here,

4:49

>> You know, we could talk a little

4:51

bit about the OpenClaw stuff. But just,

4:53

before we get there, on

4:55

the growth in Anthropic, the other

4:56

interesting thing was the Wall Street

4:57

Journal today had a bunch of leaks on

4:59

the financials for Anthropic and OpenAI.

5:01

And the one that jumped out at me when I

5:04

contrast it with the fact that Anthropic

5:06

has caught OpenAI, right, in half the

5:08

time is that their training costs are a

5:11

quarter of OpenAI. Their training costs

5:13

for models are a quarter of OpenAI's.

5:15

Now maybe that's because they're

5:16

focused. They don't have to do video.

5:18

They don't have to do images. They don't

5:19

have to do a lot of consumer stuff. But

5:21

if you just think about it for a moment,

5:23

the compounding effects of catching

5:25

OpenAI in half the time, right? At

5:27

roughly the same revenue or more, 30

5:29

billion in 5 years, and having training

5:32

costs that for now are a quarter of it,

5:34

you know, that's a double code red.

5:37

It's one thing if you have

5:40

two classic startups where one is bleeding

5:42

money and it's artificial or there's

5:43

other things, but if you have a dramatic

5:45

cost benefit and you're out accelerating

5:47

your competitors, um and there's

5:50

management team turmoil at your

5:51

competitor, uh it really feels like the

5:55

investors in OpenAI got a much worse

5:57

deal in the last round than the

5:58

Anthropic ones did. Just crazy

6:00

having both. You usually don't have both

6:01

together against your competitor: you're

6:02

out-accelerating your competitor and

6:04

your training costs are a

6:06

fraction of your competitor's. Good god,

6:08

that just compounds.

6:11

>> That's actually a good point because you

6:12

take the Uber-Lyft struggle, right?

6:14

Uber had the "oh my god, we're out-

6:16

accelerating, but by God, we're

6:18

spending every dollar we have to do

6:20

it and we show no fear," right? In

6:22

this case, you're out accelerating the

6:24

opposition while being more efficient on

6:26

a bunch of interesting measures. No,

6:28

you're right, Jason. That's a scary fact

6:30

pattern. If you're, you know, if you're

6:31

running the game theory and you're the

6:33

other guy, it's like, hm, that's not

6:35

good, right? They're growing faster than

6:38

us. The gross margin economics are

6:40

roughly the same or slightly better, and

6:43

their costs below the line, which to a

6:45

rounding error are compute and the

6:47

scientists who run the compute for

6:49

training, are better. And that's a

6:50

bad fact pattern.

6:52

>> Have we ever seen a bigger seeming chasm

6:54

between where they're at? I mean, with

6:58

the greatest of respect, it seems like

6:59

Anthropic is accelerating faster than

7:02

ever, and OpenAI is having more

7:04

challenges than ever all at once. I'll

7:07

tell you the one I think about, the

7:08

thing I didn't realize when we did

7:10

this show last, because the press is

7:12

always focused on the headline stuff,

7:14

right? Yeah. The OpenAI raise was barely

7:16

real.

7:17

>> Yeah, you said

7:18

>> barely real. And Andreessen's money appears to

7:20

be real. It came in up front, right?

7:22

Good. The, you know, the 134

7:25

billion, whatever they put in, that's

7:26

real. 11 billion. You know, all they

7:29

have to do is get 20% carry and double

7:31

that and it's a nice side bet

7:33

in an SPV, but that was real. The

7:36

SoftBank money comes in tranches. They have

7:38

to borrow money to pay it. The Amazon

7:40

money is tranched in part on IPO or AGI,

7:44

right? And the Nvidia money is almost

7:46

all not money. It's almost all offsets

7:49

in compute. So,

7:51

you know, I thought about it, but then

7:52

in the context of Anthropic's growth,

7:55

you know, I think OpenAI would

7:57

rather have had all the cash. Like,

7:59

it's not a sign of strength where the

8:01

majority of the round is not cash up

8:03

front. Like, I don't think

8:04

that's a sign of strength. That's a sign

8:06

of like classically at least barely

8:09

getting the round done. Barely getting

8:10

the round done. Um, versus getting

8:13

because why wouldn't you want all cash

8:14

up front? Why wouldn't you want 140

8:15

billion up front? A bit harsh on the

8:17

barely because I thought they tacked on

8:18

another, I can't believe I said the

8:20

sentence, they tacked on another 10

8:22

billion. Think about that sentence

8:23

sometime. That, I think, was cold hard

8:25

cash. So I agree your

8:28

comment is correct, Jason. The vast bulk

8:31

of the dollars weren't cash, but enough

8:33

money changed hands that it represented

8:36

a bonafide price at the time. But yeah,

8:39

you are right.

8:41

And again, I mean, look, Anthropic does

8:43

some of the same stuff in the sense of

8:45

given that your biggest expenses are, you

8:47

know, compute and then distribution, you

8:49

know, from Microsoft on with OpenAI to

8:51

all the recent Anthropic deals, there's a

8:53

lot of this round-tripping business. But in

8:56

both, I mean, I'd make two comments:

8:58

perhaps in both cases there was enough

9:01

hard dollars changed hands that

9:04

both of them represent price estimates.

9:06

But to your point, based on what

9:09

you know now, given the

9:10

rough revenue equivalents.

9:12

We shouldn't assume until OpenAI releases

9:14

their numbers, maybe they've exploded,

9:16

too. But definitely

9:19

Anthropic at 370 billion feels a little

9:21

more comfortable, let's just say, than

9:23

OpenAI at 820 or 840 or whatever the

9:25

final closing was, right?

9:27

>> Well, OpenAI did say 2 billion last

9:29

month. I think that's why Anthropic

9:31

rushed out the 30,

9:32

>> right? That they're at a $2 billion run

9:34

rate. And look, we have the whole gross

9:36

net thing, but the bottom line is this.

9:37

When you look at those two graphs, you

9:40

definitely don't say to yourself, I

9:41

mean, I think, if this was a

9:43

public stock, let me put it this way. If

9:45

this was a public, if both of these

9:46

companies were public, there would be a

9:48

bunch of those, yeah, New York hedge

9:50

funds shorting OpenAI, going long

9:53

Anthropic and saying they have the perfect

9:55

AI bet, right? You know, would

9:58

you short OpenAI at

10:00

870 and go long Anthropic at 372? I'm

10:04

not a risky guy, but even I would

10:05

contemplate doing that. It feels like a

10:07

no-brainer bet. You have roughly the

10:09

same revenue, a better trajectory and a

10:11

management team for half the price. Hm.

10:13

And if you short the one and long the

10:15

other, you're kind of

10:16

diversifying away the AI overall risk,

10:19

and you're just making a relative

10:20

performance bet. That's probably it. Yeah,

10:23

that would be an interesting one. What

10:25

would you say to an OpenAI employee who

10:27

is now looking at their incredible stock

10:30

price appreciation with tens of millions

10:32

of dollars in equity that they now have

10:36

at the 820 price? Sell it all at 820 the

10:39

minute a tender comes. What would you

10:41

say to them?

10:42

>> I think I wouldn't pile on. In

10:45

general, look, I have some things in

10:47

OpenAI I want to pile on this time. And

10:49

if you recollect in the last couple of

10:51

weeks, I've tried to avoid the pile on

10:53

when someone's down. And I think you'd

10:54

always want to be more tempered. But I

10:57

always say to everyone in any

10:59

private, you know, in any private

11:02

company, I say when the liquidity window

11:04

opens, take it seriously because it

11:05

might not open again for a while, right? So

11:08

yeah, when the liquidity window opens at

11:10

$0.8 trillion,

11:12

the alert reader should say, you know,

11:15

if you're planning to buy that house in

11:16

San Francisco, you might need an extra

11:18

few million just based on what I'm

11:20

seeing in the market now in terms of

11:21

house prices. So take advantage of this

11:23

thing, 'cause all your brethren have, right?

11:26

You know, as I said, I

11:29

don't want to pile on

11:30

the individual employees. I mean, the

11:32

company's still doing a lot of great

11:33

stuff. You've got a lot of turmoil, a

11:35

lot of drama at the top, we'll talk about

11:37

that. But, you know, I think you take

11:39

advantage of liquidity just because you

11:42

should always take some advantage of

11:43

liquidity.

11:44

>> Let's knock it on the head. Let's talk

11:45

about the drama at the top. I mean, talk

11:47

about a management team turnover: you

11:49

have Brad, the COO, who's been moved to

11:51

special projects.

11:54

>> My dream assignment, being moved to special

11:56

projects.

11:57

>> I'm going to be the SVP of special

11:58

projects for 20 VC in my next phase of

12:00

life.

12:01

>> Jason, I would love you to be the SVP of

12:03

special projects.

12:04

>> Just special projects. Yes, special

12:06

projects.

12:06

>> We have the CMO stepping down due

12:09

to health reasons. We have the CRO

12:12

out. We have Fiji, who's head of apps,

12:15

taking a short leave of absence with

12:17

health problems.

12:20

how do we read this very significant

12:22

multitude of changes at the management

12:25

layer?

12:26

>> If we step back a minute, it ties to

12:28

Anthropic passing them.

12:31

You don't just sit there and

12:33

make no changes on the team when your

12:35

competitor uh over the last 6 months has

12:38

radically changed the competitive

12:39

posture. So, look, I don't think any of

12:42

us like that amount of change in any

12:44

management team, right? It feels almost

12:45

a wholesale change at some level. And

12:49

it's risky. But, you know, calling

12:52

code red three or four

12:53

months ago didn't magically change the

12:55

trajectory here. So, it ties: you've

12:58

got to try to mix things up in

12:59

some fashion. Hopefully you can do it

13:00

with the team you have. But in in the

13:03

context of of of

13:05

anthropic now out accelerating OpenAI,

13:08

it just makes sense to reboot the

13:09

team. It just makes sense.

13:11

>> Yeah. There's some rebooting, I mean,

13:14

to use your phrase, the reboot. I

13:16

mean, who's been hired? What's

13:17

the additive reboot? Well,

13:20

the dramatic one, which is always risky

13:22

for like any startup, is you take Denise

13:24

Dresser, who was CEO of Slack, who came

13:25

from Salesforce just a couple months

13:27

ago, and you put her in charge of

13:29

basically everything go-to-market and

13:31

related, right? That's a good bet on a

13:33

seasoned executive, but that's the type

13:35

of change that we've all seen as

13:37

investors is like super risky, right?

13:39

You bring in the person with the

13:41

perfect LinkedIn, right? And the perfect

13:42

background that's still getting to know

13:44

the product. They're still on a get-to-

13:46

know-you tour. They haven't quite been

13:48

to the New York office yet.

13:50

They're getting to know the product and

13:51

all of a sudden you give them this

13:52

massive portfolio because they're a proven

13:54

executive. In my experience, I don't

13:57

know what you guys think. In my

13:58

experience, that has about a 30% chance

13:59

of success, just roughly:

14:01

bringing in the perfect LinkedIn,

14:03

giving them a massive portfolio, and

14:05

attaching them

14:07

to something in tumult. If it's executing

14:10

to perfection, it always seems to work,

14:13

bringing in Mr. or Ms. LinkedIn.

14:16

But when you're in tumult there's

14:18

not a lot of time to learn everything,

14:21

right? There's not a lot of time for the

14:22

get-to-know-you tour. So it's just risky,

14:25

but it's a play. Like, it

14:27

is a play. I get where you're going

14:29

there, Rory, which is, for the

14:31

replacements to be additive there needs

14:33

to be great talent added, and there seems

14:35

to be a lack of people coming onto the

14:37

field when they're coming off. Agreed. And,

14:39

you know, as I say, I have a

14:41

couple of comments. One is, I'm always loath

14:42

to comment on the

14:45

illness-related ones, because you just

14:46

don't know what's going on in people's

14:48

lives, and that's tough, and people have

14:49

challenges, and you wish people

14:51

all the best, especially with these kinds of

14:52

chronic diseases, and hope they can get

14:54

back to full health. Let's just

14:56

start with that, 'cause that sucks, right? You

14:58

know, at the same time, you have

15:00

a lot going on here, right? You have a lot

15:02

going on. And, to me, with

15:06

all these people changing, I'm

15:09

tempted to make the famous

15:11

Oscar Wilde quote from The Importance of

15:13

Being Earnest, you know:

15:15

"to lose one parent might be an accident,"

15:17

when the woman was

15:19

talking to the orphan, "but to lose both

15:21

parents smacks of carelessness," right?

15:23

Well, you know, you you you are getting

15:25

to the stage of carelessness here,

15:27

right? But I actually don't think that's

15:29

the real issue. It's fun to say. I'll

15:31

tell you, we haven't mentioned

15:33

the two most surprising things in the

15:34

last week on Open AI. One is I'm just

15:37

going to say it. I thought the

15:38

acquisition of TBPN was just insane.

15:43

Not on the

15:45

particulars, it doesn't matter, but you

15:48

don't

15:49

launch a code red edict and a focus

15:51

edict and, you know, a no-more-side-

15:54

projects edict and then within the space

15:56

of a week do something that's so

15:58

obviously a side project

16:00

to me, no matter what you get. I mean, we

16:03

can discuss whether it's stupid on its

16:04

face and whether buying media

16:06

assets is the way to go, and I

16:09

acknowledge the articulated

16:11

thesis that you have to control

16:13

the media story, though it doesn't seem

16:15

that Anthropic has any need to do that. But

16:17

stepping back one level, you're running

16:19

a $25 billion company, the most exciting

16:22

company on the planet. And

16:25

you just told your entire internal team

16:27

that you need to focus, and then, buying:

16:29

there's nothing that's more of a vanity

16:31

project than buying a media company,

16:34

right? I mean, look, you know,

16:35

>> Just one thing, we could talk about it

16:36

more or less. The one thing just to add:

16:38

when you look at the press, this is

16:40

interesting. That deal, the

16:42

outreach was in January. That's a lot of

16:45

time in OpenAI and AI time, right? Fiji

16:48

was new, thought this would be a great

16:50

thing to elevate OpenAI, in January. Now

16:53

it's April and maybe the deal

16:55

seems a lot different, but in January it

16:58

was a different world, right?

16:59

>> At some point, I mean, I remember, yes.

17:02

>> It didn't happen last

17:04

week is my only point. It happened in

17:05

January. It took some time to close, right?

17:07

and I will say one thing I'm 90% sure it

17:09

wouldn't happen today. To your point,

17:11

priorities, if nothing else,

17:13

change, right? It probably

17:15

wouldn't happen today.

17:16

>> You know, if you only noticed in the

17:17

last week, if you only noticed in the

17:20

last week that you need to focus, then

17:22

yes, I'll give you that, right? But you

17:24

didn't just notice in the last week you

17:25

need to focus, right? And if you did,

17:27

maybe you need to focus and see fire

17:30

comment, right? If you haven't realized

17:32

you're in code red for the last two or

17:34

three months, and if you have realized

17:35

this is the kind of thing you don't do

17:36

when you're in code red, then you're

17:38

just not paying attention. So, I

17:40

challenge that. I think it's a vanity

17:41

project and absurd. And then the other

17:43

kind of weird

17:44

>> Can we actually just pause on

17:45

that?

17:46

>> And I know you want to because you want

17:47

like where's your 200 million, Harry?

17:48

But yeah, let's pause on your lack of

17:50

200.

17:50

>> No, I know. I just want to actually

17:52

articulate a bull and a bear case

17:54

rationally for an audience for how this

17:57

acquisition could be seen from both

17:59

sides because it is very confusing. So,

18:01

if we were to start with a bull case,

18:03

Rory, and Jason, please chime in, too,

18:06

because you're the master also of

18:07

kind of media and venture as well. Um,

18:10

what is the bull case first?

18:11

>> I'll give you the bull case. There are

18:13

two. The strategic one's

18:15

more interesting, but let me hit the

18:17

tactical one, because Rory

18:19

made a good point. This is,

18:22

look, this is not going to make or break

18:23

the company, right? There are certain

18:24

acquisitions that can, right? Let's

18:26

stipulate that,

18:29

but there are some things you acquire

18:31

where they run almost on

18:34

autopilot. They are not massive

18:36

distractions and if the price is small

18:39

relative to what you hope to get out of

18:41

it that does factor into the equation.

18:43

If you have to rebuild your whole team

18:45

it's a total distraction. You're going

18:46

to rip out your guts. That's a big deal.

18:49

Once in a while, and it's pretty

18:52

rare, you can acquire something that

18:53

isn't massively distracting at the

18:55

management team level. So even if it's

18:58

not the perfect acquisition, I don't

18:59

think it's huge: it's not going to

19:01

require a huge amount of senior

19:02

executive time. So that's just

19:03

important, in general, to the calculation.

19:06

The one point I'll make, and I wrote a

19:09

post, is that every profitable public

19:11

company should do a deal like this, of

19:13

which OpenAI is neither, right? It is

19:15

clearly not profitable; it is clearly not

19:16

public. But other than that, let me

19:18

tell you why, Rory, and you might end up

19:20

agreeing with me on this. Um because,

19:22

and this is why the Barstool deal

19:24

almost worked but failed, right? If you

19:26

are a profitable B2B company,

19:28

especially,

19:30

you are under insane pressure to get

19:33

more profitable. Like, I actually

19:35

can't overstate how intense the

19:38

pressure is. Like they're looking at

19:39

every headcount, every sales efficiency,

19:41

everything. It is brutal, right? And

19:44

your cash is trapped on your balance

19:46

sheet and so it is very difficult to

19:49

increase marketing spend. It is very

19:52

difficult to spend another hundred

19:53

million this year on marketing. But at

19:56

least in the short term if you can buy a

19:59

marketing asset that is

20:01

at scale, you can turn your balance sheet

20:05

into marketing, which is hard to do. It's

20:08

hard to do. And I think maybe TBPN is not

20:12

the most successful way to get OpenAI's

20:14

brand out there. We could debate that

20:16

but it but it is a way to turn a balance

20:19

sheet into a marketing asset.

20:21

>> I'm going to call [ __ ] start to

20:24

finish on this whole discussion.

20:27

OpenAI is the most known company on the

20:30

planet perhaps other than Apple. Right?

20:32

Within the last two years, the CEO of

20:35

OpenAI has been able to meet every world

20:38

leader he wants, right? He's gone on

20:40

world tours. He's met Macron, he's met

20:42

the president, he's met every single

20:44

prime minister of India, whatever,

20:46

right? They get constant attention,

20:49

constant. The AI story has been, you

20:51

know, the entire zeitgeist for the last

20:55

3 years and they're the leader of the AI

20:58

story, right? So, in terms of media

21:00

minutes, there's nothing left to get.

21:03

Right now, if what you're saying is, I

21:05

don't like what they're saying about me.

21:06

Oh, they were mean to me, then yeah,

21:09

maybe you can pay these guys to say

21:10

nicer things about you than on average.

21:12

But you don't need more. It's not like

21:14

you're making [ __ ] widgets in the

21:15

Heartland here, right? You are the most

21:18

exciting tech story on the planet. You

21:20

don't need a little bit of help to

21:23

just get out and get covered. I

21:26

mean, literally everything Sam does gets

21:27

covered, right? So I hear you, Jason, most

21:30

of the time, but not for these guys. If

21:32

you were to pick the one company who

21:34

doesn't need media attention and does

21:36

need to focus, it would be OpenAI, and

21:38

this is non-focused and getting media

21:40

attention. So I'm like just from a

21:42

signaling perspective we're 100%

21:45

aligned. The only thing I would come

21:46

back to you with saying is they have

21:48

consistently shown an inability with how

21:51

to respond on social to negative

21:54

moments. Whether it's lemonade stand,

21:57

whether it's Anthropic adverts, they've

21:59

consistently messed up crisis PR and

22:02

crisis communications and made

22:04

themselves not look great. The only way

22:07

I could justify this is by saying they

22:09

are vibe maintenance for those [ __ ]

22:12

times to make us better, cooler, better

22:15

responders to bad things because they

22:18

have no editorial control. Like this is

22:20

the most important thing. Andreessen is

22:21

right about the importance of owning media, but

22:23

they have no ability to own the content

22:25

to influence it to impact it in any way.

22:29

It is editorially completely impartial.

22:32

So they have zero of the benefits. This is the

22:35

only reason this does not make any

22:36

sense. If they had the ability to own

22:39

the media properly, it would make sense,

22:42

but they have zero impact on it.

22:44

>> History is riddled with people who, you

22:46

know, buy media assets to try and change

22:49

outcomes. And, you know,

22:52

generally, my

22:53

observation is, owning a media asset

22:55

invariably takes way more time than you

22:57

think for way less money than you expect,

23:00

right? See Jeff Bezos for details, right?

23:03

And you end up getting abused;

23:04

it just is a sinkhole, right? And,

23:07

Harry, if you can't control the story,

23:10

then hire a better storyteller, you know,

23:12

hire a better comms person, hire a better

23:14

marketing person, think before you speak

23:16

and before you hit send on your [ __ ] about

23:19

lemonade stands. But look, it's

23:21

in the noise. And, Jason, you are right,

23:23

they're not going to spend a lot of time

23:24

managing it in this case, at least in the

23:26

short term. My comment is more:

23:30

it's just really silly when you say

23:32

we've really got to focus, nothing else

23:34

matters but these two or three big

23:36

things, oh, but by the way, here's

23:37

one last plaything project. The

23:40

one thing I will say at a meta level, to

23:42

founders especially who have listened to

23:43

this: honestly, this is why you should

23:46

default to yes on a good deal. Let

23:48

me be clear. I'm pretty sure this is

23:50

this deal; I just read the press.

23:52

Things at OpenAI, there was stress

23:54

in January, but it's not like today.

23:56

Okay. It was not like today. Fiji

23:59

comes in. She has an idea. This is not

24:01

the biggest bet the company's going to

24:02

make, but they have a team meeting.

24:03

She's like, I love TBPN. What if we

24:05

brought them in for a little bit of

24:07

good promotion? And everyone around the

24:08

corner is like, whatever. Yeah, let's

24:11

go talk about buying some OpenClaws or

24:13

something. But they say fine and

24:15

things are good and they kind of shake

24:16

hands on a deal. It takes a little while

24:18

to happen and it closes last week.

24:20

There's no way that deal is going to

24:22

happen today. Like it's dead because of

24:24

management change. And I can't

24:26

tell you how many times I've seen this

24:27

for portfolio companies, and it's even

24:29

happened to me twice, where it's not

24:32

just time that's the enemy of deals. It's

24:34

management turnover, right? Priority

24:36

turnover. So the meta lesson is, I just

24:38

don't think this deal would have

24:39

happened today. It has nothing to do

24:40

with the team at TBPN. Just

24:43

so when you say no to an attractive

24:45

deal, just be sure you're okay if it's

24:47

no, never, because the odds that the VP that

24:50

wants to do the deal is there in 12

24:52

months and that their priorities have

24:53

not changed approach single digits.

24:57

>> Yeah, it's back to the liquidity window

24:59

comment. You're exactly right, Jason.

25:00

>> Yeah, but my god, I think it's worse for

25:02

M&A because so many times in M&A that

25:04

guy just isn't there next year. But I'll

25:06

tell you, one of the things about being

25:08

the big boss is that even when you're a

25:11

long way down, if you don't think it

25:12

suits what you're doing now, you should

25:14

stop it. I remember, fun story, 25

25:16

years ago, we were selling a company to

25:19

GE. I'm not going to name the company,

25:21

right? And it was a mediocre company and

25:22

we were darn lucky to get the bid,

25:24

right? And it was going all the way

25:26

through and it went every level at GE,

25:29

right? And then it came to the CEO and

25:32

you know, he's not perfect Jack Welch,

25:33

but he's willing to take a tough

25:34

decision. We were a long way down.

25:36

Everyone was about to sign and be all

25:37

happy happy. He looked at the numbers

25:39

and said, "No." And I remember thinking,

25:41

"Damn, I thought we'd get away with it,

25:42

but he's right. I should have, you know,

25:45

right?" And at some point, I remember

25:47

thinking, "Oh, that's impressive. All

25:48

these people were in. He was a long way

25:50

down with the process." And he just

25:51

said, "I'm thinking no." Right? And if

25:55

you could, this is one where you say,

25:56

"I'm thinking no."

25:57

>> I'm going to give you a hard one before

25:59

we move to SpaceX. You have the chance

26:01

to buy Anthropic at 850 or OpenAI at

26:05

380. Which would you rather buy?

26:08

>> You may have that opportunity in the

26:09

secondary market as we speak.

26:12

I'm

26:13

>> I wouldn't be surprised.

26:16

I think, I mean, having said I'd do

26:18

Anthropic last time, I mean, six months

26:20

ago, at the 300-something thing, I think

26:22

I'd go the other way this time. Because

26:24

again, if the choices were

26:25

Anthropic at the last OpenAI price of

26:28

850 post, or whatever it is, 820-something

26:30

post or

26:33

OpenAI at the last Anthropic price of 370

26:36

post or 380 post, I would argue that you

26:39

would buy OpenAI, on one proviso: you

26:42

could sit down with the board and say

26:43

what are you going to do about this

26:44

because it doesn't take a lot to fix

26:46

this thing, right? To just stop screwing

26:49

around and focus, right? And you know,

26:53

>> No, no, no, no. Because you've then got to

26:54

stop a machine that is Anthropic,

26:56

that is now picking up more and more

26:58

pace with every day that goes by and

27:00

being. First of all, you still have the

27:02

consumer asset where you are by far the

27:03

dominant thing. Again, this goes back to

27:05

what we said last week. You have to do

27:06

two things. You have to figure out a

27:08

consumer monetization model and you just

27:10

have to get Codex, the Codex

27:13

competitor to Claude, out there. The

27:15

mission clarity is pretty simple.

27:16

You do have one big advantage we didn't

27:18

talk about though it's changing a little

27:19

bit. And give Sam credit: he was more aggressive

27:22

on compute purchases and I'll admit I

27:25

was someone thinking from the peanut

27:26

gallery, hmm, is that a bit aggressive? But

27:29

now it looks like compute constraint is

27:31

a real thing in '26 and early '27. You

27:33

have that asset maybe you figure out how

27:35

to deploy that aggressively with Codex.

27:38

So there's buttons you can press,

27:39

there's things you can do if you focus

27:41

and do them.

27:43

Jason, you've got that same choice.

27:46

>> I'd say buy both at if you can invert if

27:48

you can invert the valuations.

27:50

>> Yeah, that's actually

27:51

>> that's what all the growth VCs are

27:52

doing if they can get away with it

27:53

anyway. So, let's invert.

27:54

>> That's amazing. But you can only buy

27:56

one.

27:57

>> Um, yeah, but conflicts aren't important

28:00

in our firm anymore. They don't

28:01

matter at pre and they don't matter

28:03

at growth.

28:04

>> Jason, you only have one check left.

28:06

>> Well, look, I mean, I've said the same

28:08

thing on the show. I'm just not into

28:11

the the the tumult at OpenAI. I'm not

28:13

into the drama. I'm not into a non-tech,

28:17

non-technical

28:19

founder leadership. It's just not my

28:20

vibe. Like I wouldn't invest in anything

28:22

like OpenAI at a high price. It doesn't

28:24

matter what it is cuz it's just I just I

28:27

just find it so risky that that the

28:30

turnover and not being led by a deeply

28:33

technical CEO. That's just, in my life

28:35

at investing, I ain't doing those risks

28:37

anymore, right? And maybe I'll miss

28:39

a lot of opportunities. It's just, you

28:41

know, I want someone Dario or smarter

28:43

technically running these companies or I

28:45

just, it's just too much change.

28:47

You get too lost on

28:49

the

28:51

TBPNs,

28:52

although I don't think Sam had anything

28:54

to do with TBPN, in all fairness. I think

28:55

he said fine in a meeting and moved on.

28:58

No. And related to that, just for

29:00

folks: I don't know how M&A works at

29:02

OpenAI. It's not that sophisticated.

29:04

Okay. But I will tell you when I was at

29:06

Adobe a long time ago for M&A, basically

29:08

every senior executive got a big

29:11

chip and a small chip. Okay, the big

29:14

chip was a big deal. Back then it was

29:15

maybe a billion dollar deal. Okay, that

29:17

would move the needle. If it doesn't

29:19

work, you get fired. It's that simple,

29:21

right? And everyone got a small chip

29:23

could could be like 50 to 200 million

29:25

deal. And you had to justify it and you

29:28

didn't get five; as a forcing function,

29:30

you got one. But you really weren't

29:32

challenged that much to do the

29:34

smaller chip. You picked one a year and

29:36

you didn't get fired if it didn't work

29:37

out. There was there was a there was an

29:38

idea that maybe 20% of them would work

29:40

out. And so I bet, this was

29:42

a small-chip deal, he spent 5 minutes

29:44

on it. This is the one. Is this the one

29:46

that you really want to do this year,

29:47

Fiji? Then just do it. Let's move on. We

29:48

got bigger fish to fry. That's why I

29:50

don't think it's that big of a deal. It

29:51

was a small chip deal, right? And no one

29:54

loses their job at Adobe over the small

29:57

chip deal. Otherwise, it would never

29:58

happen. No one would take any risk in

30:00

buying an emerging company, right? They

30:02

just wouldn't do it.

30:03

>> Okay, we've got to move on. Other things

30:05

did happen. SpaceX finally finally

30:09

confidentially files for IPO targeting a

30:12

$2 trillion valuation. Um, it would be

30:16

the largest IPO in history, surpassing

30:18

Saudi Aramco. They could raise up to $75

30:20

billion. This obviously includes xAI,

30:24

otherwise known as Twitter, which

30:26

obviously incorporated earlier this

30:27

year. 2025 revenue: 15 to 16 billion, 8

30:32

billion of EBIT. Uh, at 2 trillion it's

30:35

125x revenue, so coming in punchy, um,

30:42

to say the least. Feels like a Series A

30:44

these days. Rory, uh, how do we feel when

30:48

we hear this? Well, I'll just tell you

30:49

one insight. I think to say

30:53

that at least venture is different or

30:55

remade is an understatement. Like the

30:58

big three, SpaceX plus OpenAI plus

31:00

Anthropic, assuming they all IPO, I mean

31:02

certainly SpaceX will in the next 12

31:03

months. Uh their value at IPO will

31:07

exceed every other IPO for the last 20

31:09

years combined. All of them. All of the

31:12

last 25 years. These, the big three, versus

31:14

every other little

31:17

deal. I mean, Rory's had some great

31:19

IPOs. There's been tons of them out

31:21

there. Um, but this exceeds all of them

31:23

combined, right? So,

31:26

I found it almost depressing, in a

31:28

way, when I thought about it this way,

31:30

because it was like, who's the guy we

31:32

had earlier in the show from Slow

31:33

Ventures who kind of bothered me a

31:34

little bit,

31:35

>> Tom Les.

31:36

>> Yeah. And he kept saying Box doesn't

31:37

matter. And then he said, "Open AI

31:39

doesn't even matter. It's not that

31:40

important." He was very triggering. I

31:42

tried not to get triggered directly, but

31:43

he kind of rattled in my head. I was

31:44

like, "Maybe the guy's right. Maybe

31:47

nothing we're doing matters because the

31:49

big three dwarf the last 25 years

31:52

combined. Like what are we doing guys?

31:54

What are we what are we doing here?

31:56

First of all, I think that is a real

31:57

phenomenon, and what you're simply saying

31:59

is, you know, especially in SpaceX's

32:02

case, the longer the holding period, the

32:05

more dispersion sets in, which is:

32:08

the big become bigger

32:10

and the little ones fade out. And you

32:12

know, and you're exactly right at the

32:14

tail end of a power law, it does mess

32:16

with your head because the combined

32:18

value of the top three privately held

32:20

companies are larger than everything

32:21

else. In much the same way, it's even

32:23

more concentrated than the public

32:25

markets which are more concentrated than

32:27

they've ever been where the market cap

32:29

of, you know, the top four or five,

32:32

Nvidia, Apple, Microsoft, um, Alphabet

32:35

and, I think, Meta, is, you know,

32:37

approximately 30% of the total S&P,

32:40

right? Which is everything for the last,

32:42

you know, x-hundred years, right? And you're

32:44

right. Psychologically,

32:47

the thing about a power law they don't

32:49

tell you is you can have the third best

32:51

outcome in venture history and be only

32:54

onetenth as large as the largest outcome

32:58

in venture history. And if you're going

32:59

to let that in your head, it's going to

33:00

be it's just going to be very tough

33:02

business psychologically for you because

33:04

you can have a life-changing event

33:06

that has you down in the noise of 10 or 20

33:09

billion dollar outcomes, which can be

33:10

enormously great for you and your family

33:13

and for your co-investors and for

33:15

everyone involved. And if you're going

33:17

to let it in your head that it's not $2

33:18

trillion, then you're doomed and you're

33:20

just going to need therapy. I wrestle

33:22

with these things all the time. I mean,

33:23

it is the thing that your mama told you,

33:25

right? She's right. You just have to not

33:27

let other people define you. I mean, you

33:30

said it really well. It is a psychologically

33:31

weird thing, right? You're going to have

33:33

these three deals go public and um

33:37

they're going to be worth literally

33:38

everything else that's happened in the

33:40

last 20 years if they trade anything

33:41

like their current prices. Will SpaceX

33:43

rip and hit the two trillion when it

33:45

does go out?
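The power-law point in this exchange, that the third-best outcome in venture history can still be only a tenth of the largest, is easy to sanity-check with a toy Zipf-style tail. The exponent below is purely illustrative, chosen so that rank 3 lands near one-tenth of rank 1; it is not fitted to any real fund data.

```python
# Toy power-law (Zipf-style) model of venture outcome sizes by rank.
# alpha is an illustrative exponent, not an empirical estimate.

def outcome_by_rank(rank: int, largest: float, alpha: float) -> float:
    """Size of the rank-th largest outcome under value(k) = largest / k**alpha."""
    return largest / rank ** alpha

largest = 2_000_000_000_000   # a hypothetical $2T top outcome
alpha = 2.1                   # with this tail, rank 3 is roughly 1/10 of rank 1

third = outcome_by_rank(3, largest, alpha)
print(f"3rd-best: ${third / 1e9:.0f}B ({third / largest:.0%} of the largest)")
```

Even a couple of ranks down the tail, the absolute number is still enormous, which is the "your mama was right" point: dispersion at the top says nothing about whether a multi-hundred-billion outcome is life-changing.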

33:47

>> I try and not spend time talking down,

33:51

you know, an amazing company, right? Not

33:54

least because I'm going to be a buyer of

33:55

SpaceX 15 days from the IPO. That may surprise

33:59

you, because 15 days from the IPO, it's

34:01

coming into the QQQ. I have a big QQQ holding,

34:04

right? If you're an index fund, you're

34:06

getting this thing in 15 days, right?

34:09

And I don't know when it'll be for

34:11

the S&P, but it'll be fairly soon

34:12

thereafter, right? So, we're all going

34:14

to be buyers of this thing, right? So,

34:17

in terms of the valuation, I I you know,

34:20

you don't know. Look, you do any kind of

34:23

meaningful analysis, sum of the parts,

34:24

and you come up with a lot lower number

34:26

and then as we we've discussed this

34:28

before and then the gap between what you

34:30

think the assets are worth on any kind

34:32

of normal basis and $2 trillion is huge

34:34

and it's all Elon premium and what

34:37

you're really asking therefore is how

34:39

big is the Elon premium sometime in June,

34:42

and I don't know I think that you're

34:44

definitely seeing the Elon premium come

34:45

off on Tesla which has been worth

34:47

pointing out, and, you know, it's,

34:49

it's down significantly year to

34:53

date, and, you know, it

34:55

was interesting to see JP Morgan put an

34:57

actual sell on Tesla with a prediction

34:59

of a 60% price decline. So

35:03

really what you're asking me Harry is

35:05

what is the Elon premium in June, and,

35:07

hell, I don't know.

35:08

>> Well I'm asking actually do you think it

35:09

will materialize in public markets and

35:11

they hit that two trillion. Well, Rory,

35:14

here's my thought.

35:15

Obviously, I don't think either of us

35:16

have worked on an IPO quite of this

35:17

scale, right? But

35:19

>> no, by definition, no one has in the

35:20

[ __ ] universe, because it's the first time

35:22

it's ever happened,

35:23

>> right? So, the process is going to be

35:24

different. But here's my point. At some

35:26

level, there is a

35:31

tough negotiation sometimes

35:35

between the company and the underwriters

35:36

on valuation, right? Um, and often times

35:38

some companies are like, whatever

35:41

the Lord brings, and some are extremely

35:42

aggressive on the number they want,

35:44

right? And depending on the situation,

35:46

sometimes the co wins those debates,

35:48

gets out with the valuation, the

35:49

underwriters are very uncomfortable with

35:51

and sometimes it works and sometimes

35:53

they stumble because of it. I think

35:54

Elon, what Elon said publicly on X, it

35:57

ain't going to be two trillion. Now

35:58

maybe he'll change his mind. He said two

35:59

trillion was too high. So whatever his

36:01

number is, I think he's going to get it

36:03

on IPO day. He's going to will it into

36:05

existence. The underwriters are not

36:06

going to be able to argue with him for

36:07

more than 5 minutes and there'll be

36:09

enough demand, between retail, 30% of the

36:13

IPO, it's a lot, right? There'll be

36:15

enough whipped up demand, I think, to

36:17

support it for one day. He will will it

36:19

into existence. Whether that valuation

36:21

is there in 30 days uh or possibly even

36:24

in one day um I don't know but I do

36:26

think the sheer force of will the lack

36:29

of power of underwriters and the 30%

36:31

retail, will will his 1.75 trillion into existence

36:34

for one day at least one day

36:36

>> I think that's quite correct it's worth

36:38

pointing out I think less than 12 months

36:39

ago there was a meaningful transaction

36:41

in SpaceX at 400 billion right then

36:44

there was this much smaller (I don't know if it

36:47

even happened in the end) secondary at

36:48

800 billion

36:50

Then in conjunction with the merger with

36:53

um, come on, X, yeah, X/Twitter, SpaceX

36:59

was valued at a trillion to value the

37:00

other asset, with its negative 12 billion

37:03

in cash flow at 250 billion. So they

37:06

added that in to get to 1.25 and now you

37:09

know you're at um you're talking about

37:12

1.7, 1.8, and it's all been walked up in a

37:14

very interesting way. It is worth

37:15

remembering that the last time the

37:17

useful asset was valued on a standalone

37:19

basis, it was worth $400 billion. So I

37:22

think if the deal went public at 1.5, 1.6,

37:27

less than the whisper number, I still

37:29

think they'd have done a magnificent job

37:31

of walking the value of the asset up

37:33

because it's not clear to me that the

37:36

xAI asset has a positive NPV in anything

37:39

like the near-term. We just had a long

37:41

conversation on Anthropic versus OpenAI

37:45

and they're kind of number one and two

37:46

in this space and Gemini Google is

37:50

almost certainly number three. So xAI

37:53

is number four in the kind of model LLM

37:56

space at best burning $12 billion a

37:59

year. So I don't know what that's worth

38:02

but I would argue that they

38:05

won't be talking about that in page one

38:06

two or three of the slide deck at the

38:08

IPO. they'll be talking about SpaceX

38:10

which means the entire addition of that

38:12

probably was net negative. So I go back

38:14

to my comment is I think you're right

38:17

Jason they'll will something amazing

38:19

into existence for a short period of

38:21

time because you know this has all the

38:24

leverage and the drive and I think you

38:26

know only in the long term are markets

38:28

weighing machines in the short term

38:29

they're voting machines and we'll see

38:31

over time how it settles as you know

38:34

people just look at the dynamics of a

38:36

you know 20 billion plus or minus

38:37

business, cash flow positive apparently,

38:41

well, EBITDA positive, capex not,

38:44

excluding xAI, and then add in xAI, and

38:47

it'll settle into a long-term value over

38:48

time. What happens on the day? I think

38:50

you're right. It'll be much more a

38:51

function of the will and it's a small

38:53

float. So, and people will push.
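The valuation arithmetic quoted through this segment is worth laying side by side; all figures below are the ones the speakers cite in the conversation, not audited numbers.

```python
# Back-of-envelope SpaceX IPO math using the figures quoted in the discussion.
target_valuation = 2.0e12      # $2T IPO target
revenue_2025 = 16e9            # top of the quoted $15-16B revenue range
print(f"Revenue multiple at target: {target_valuation / revenue_2025:.0f}x")

# The "walk-up": SpaceX marked at $1T in the merger, xAI/Twitter added at $250B.
spacex_mark = 1.0e12
xai_mark = 0.25e12
print(f"Post-merger mark: ${(spacex_mark + xai_mark) / 1e12:.2f}T")

# Gap to the last standalone SpaceX mark, i.e. the size of the "Elon premium".
standalone_mark = 400e9
print(f"Target vs standalone mark: {target_valuation / standalone_mark:.0f}x")
```

The 125x figure quoted on the show is exactly this first ratio at the top of the revenue range; the gap between the $400B standalone mark and the $2T target is the premium the speakers debate.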

38:56

>> I'm switching it up here. We're going to

38:57

go to, we're going to go to private

38:59

markets.

38:59

>> And we had big news from Sequoia

39:02

this week. For context (always like to

39:05

set context), Doug Leone had taken a step

39:07

back from the firm, back from day-to-day,

39:09

back from investing and um you know Pat

39:13

and Alfred had recently taken over the

39:15

leadership from Roelof, and now Doug is

39:18

back in an investing capacity, not in a

39:20

leadership capacity. That's still very

39:22

much with Pat and with Alfred, but

39:23

Doug's back in the firm investing. Um

39:27

which is very big news given he is one

39:29

of the OGs. How do we read Doug back and

39:33

back in the trenches? Well, look, I

39:36

don't know. Rory may have more thoughts.

39:37

I don't know, but, um, from a

39:41

distance, it feels like something to

39:43

calm the LPs. I mean, everyone is

39:45

raising so much capital, so much change

39:48

there. Um that uh you know, I I think I

39:54

mean, you guys have even more experience

39:55

than I do. LPs are uncomfortable with

39:59

change. LPs say that they're looking at

39:59

the new generation and the vanguard, but

40:02

they are comfortable when the old

40:03

leadership is still actively involved in

40:05

the fund. It does make LPs more

40:06

comfortable. Um, whether they're

40:09

investing half the fund or a few

40:11

deals. So it struck me as that simple:

40:14

you bring back someone that

40:16

makes the LPs comfortable and you get

40:18

through this crazy amount of fundraising

40:20

everybody's doing. But I could be wrong.

40:22

I could be wrong. But I don't think it's

40:24

just to get somebody on your slack and

40:26

get a little wisdom. You don't need

40:27

to bring them back just to get an

40:29

hour or two of insights on

40:31

deals that you already have.

40:32

>> I think it was a sensible move. I don't

40:34

think it's an earthshaking move. I mean,

40:35

they've made the changes they've made

40:37

already as a firm and it all made sense.

40:40

I think at the margin, you're right, it

40:42

helps a bunch of different things. It

40:43

just provides some continuity.

40:46

Absolutely. Which is important, I think,

40:47

for LPs, for the firm, even for

40:50

entrepreneurs. Also, let's not lose

40:52

sight of the fact he's a damn good

40:53

investor, right? I mean, one of the

40:55

questions we always ask when we're

40:57

hiring someone and thinking about it, in

40:58

this case, you are effectively hiring

41:00

someone, is do you think the next check

41:02

that they'll write will be better than a

41:03

check that one of us will write? And I

41:05

think Doug Leone's proven that he can

41:07

write pretty good checks. So, I think

41:09

even at the margin from a check writing

41:11

perspective kind of makes sense. The

41:13

transition: having made one transition to

41:16

Roelof and having had to make another

41:17

transition abruptly means the first

41:19

transition wasn't that successful. I'm

41:21

sure there's an element of scratching

41:22

the itch. Yeah, want to come back and

41:24

make it work. You know, he's put a

41:25

lot of his life into this firm. They've

41:27

done an amazing job and it just felt a

41:29

little janky late last year when that

41:31

transition happened. So, if a couple

41:32

more years can help manage that

41:34

transition and send a continuity

41:36

message, why not do it?

41:38

>> With the greatest respect, I spend a lot

41:40

of time with LPs. Um, the

41:43

insatiable appetite from LPs for Sequoia

41:47

has never been more prominent. Um,

41:49

and so, respectfully, I don't

41:50

think it's LPs. I think it's actually

41:52

just like in the face of increasing

41:54

competition from a Founders Fund, who've

41:56

got an Anduril and a SpaceX at their

41:58

back, tailwind wins for founder brand, and

42:01

which are more attractive than ever for

42:02

founders. You ask the question, how can

42:05

we be more competitive? And Doug is the

42:07

ultimate winner of deals. He is the

42:10

>> You telling me the kids at YC have heard

42:11

of Doug Leone or even know how to spell

42:13

his last name? I doubt it. I'm telling

42:15

you, when Doug Leone goes to that

42:16

meeting with them, whether it's

42:18

Christian Hecker at Trade Republic in

42:21

Germany or whether it's the team at

42:23

Wiz, he [ __ ] closes the deal. Yeah,

42:26

cuz maybe not the YC founder who's 24,

42:29

but you're right. Look, let's

42:31

get real here. Across the venture and

42:33

tech ecosystem, this is someone who's

42:36

had wild success and even in a meeting

42:39

can bring knowledge to bear that would

42:40

move the needle on a close. I I agree. I

42:42

mean it's look as I say don't make

42:45

>> well let's call it gravitas whether it's

42:46

the founders or the LPs it is adding

42:49

gravitas back into sequoia in a time of

42:52

a lot of change right and

42:54

>> that's that's exactly

42:54

>> It says they needed more gravitas. That's

42:56

just what we need, a little more gravitas,

42:58

guys. Who can we bring in?

42:59

>> Yeah, one of the things I admire about

43:01

Sequoia, and I've always said this, is

43:03

like, literally, even if they're winning

43:05

on every round but one they'd be like

43:06

well how do we win on that round as well

43:08

right? And as you say, Harry, it's

43:11

never been more competitive: there are

43:12

wildly talented, similar-sized firms. Why

43:15

not, you know, even if you have 10

43:17

great players, why not get an 11th?

43:19

>> Jason, you mentioned the youngest

43:21

founders from YC.

43:23

Some very young founders from YC

43:24

obviously founded Delve, a top two

43:27

compliance business. Um, sorry Rory,

43:30

don't look pissed at me, but it is.

43:32

>> Okay, good. Um, very young, 21-year-olds.

43:35

Um, and

43:38

as everyone knows, uh, Delve has been in

43:41

the news for not providing a product in

43:44

the SOC 2 compliance space that they

43:46

said they were. A lot of problems around

43:48

that. Uh, YC have since kicked them out

43:51

of the YC community, which was announced

43:53

this week or leaked this week from

43:55

Bookface, YC's internal product, which

43:58

obviously wasn't meant to be leaked. Um,

44:00

is this the ultimate sign of their

44:02

guilt? Um, Insight invested 32 million

44:05

bucks into the company uh, you know,

44:08

within the last year. Um, should there

44:10

have been more diligence from an

44:11

investor perspective? How did we think

44:13

about this?

44:13

>> My guess is, you know, listen,

44:15

obviously a lot of things went wrong,

44:17

right? One was making up a lot

44:20

of audits with AI. We're going to find

44:21

more portfolio companies did that.

44:24

Uh, the second one was stealing from a

44:27

fellow company, stealing IP, forking a

44:29

fellow company. And I think, listen, I

44:31

don't know how you manage it with YC

44:32

with thousands of companies, but there's

44:34

a limit where you cross the

44:36

bro code or the girl code or the

44:39

founder code with other folks at fellow

44:42

portfolio companies. And there's a

44:44

line you just can't cross. And whether

44:46

they see it as open code theft or what

44:47

happened, whether you're manipulating,

44:49

you can't allow that within the core

44:51

portfolio. And I think they were ejected

44:53

for the combination. And it wasn't just

44:56

some young kids misusing AI. I think it

44:58

was the second. I think it was breaking

45:00

the code and that's why they were just

45:03

there was no need to comment more. You

45:04

broke the code, you're out. You're out

45:05

of the team. I totally

45:08

agree, Jason. I mean, look, these things

45:10

are going to happen. I mean, I was just

45:11

running the math in my head. You know,

45:13

YC: 200 companies a quarter. So

45:16

that's 800, 900 a year, right? Step back.

45:19

United States, we have 300 million

45:21

people here. We have approximately 3

45:23

million people incarcerated at any one

45:25

point in time. So we run roughly 1%, you

45:28

know between felons and misdemeanors

45:30

right across the whole population. So if

45:33

you just index to that that means out of

45:36

the 800 YC founders a year statistically

45:39

if they're just no better or no worse

45:40

than the rest of the country there's

45:42

eight of them that, you know,

45:44

will in the course of their life commit

45:45

some kind of crime. It's going to

45:47

happen. You're going to have fraud. And

45:49

you know at the end of it, you know,

45:51

when you have a portfolio of 30

45:53

companies or 40 companies as we do, then

45:55

most VCs avoid it and every once in a

45:56

while one VC gets unlucky. If you have

45:59

200 companies a year, it's going to

46:00

happen to you a lot, right? So first of

46:03

all, no drama there. No, I mean, I saw

46:05

all this, oh YC is bad cuz this guy is a

46:08

fraud. But dude, when you have this

46:09

number of companies statistically, it's

46:11

just going to happen. So that's the

46:12

first comment. And then the second

46:13

comment, Jason, I love what you said.

46:15

You're exactly right. What do you do if

46:16

you're running YC? You can't stop this [ __ ]

46:18

up front, right? And especially when a

46:22

lot of your value add to entrepreneurs is

46:27

um you know the community, right? That

46:30

is what you're selling, and you

46:31

do business with each other. Anyone who

46:33

did business with these guys was at the

46:34

very least discombobulated and

46:36

embarrassed because you rely on this for

46:38

SOC 2 compliance and then it wasn't

46:40

true and then on top of that you stole

46:42

from another YC bro. You're exactly

46:44

right. It's like in the Old West when

46:46

there wasn't, you know, much law. Um,

46:49

you know, you have to take the law into

46:50

your own hands and hang the cattle

46:51

thieves, right? This is the same thing,

46:53

right? Dude, you broke the code of the

46:56

West, you're out. And I think from a

46:58

enforcement perspective,

47:01

you know, I can totally see why they did

47:02

it. You know, now you can talk about

47:05

should other people have known? Should

47:06

you really buy compliance software from

47:08

21-year-olds? That's an interesting comment. But

47:11

fundamentally, I I I think you're

47:13

exactly right, Jason. You're going to

47:15

have this thing and the only way you can

47:16

deal with it is not a priori policing but,

47:19

especially when you break the bro code (or

47:22

whatever the non-sex-loaded term for bro

47:26

is), when you break that code, you

47:29

just got to be pretty ruthless about it.
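The base-rate math from earlier in this segment works out as stated. The inputs are the speakers' round numbers (300 million people, roughly 3 million incarcerated, 200 companies a batch), not verified statistics, and one founder per company is assumed for simplicity.

```python
# Rough base-rate arithmetic from the conversation; all inputs are the
# speakers' ballpark figures, not verified data.
us_population = 300_000_000
incarcerated = 3_000_000
base_rate = incarcerated / us_population       # roughly 1%

yc_companies_per_year = 200 * 4                # "200 companies a quarter"
expected_bad_actors = yc_companies_per_year * base_rate
print(f"Base rate: {base_rate:.0%}; expected per YC year: {expected_bad_actors:.0f}")
```

The point of the exercise is just that at 800 founders a year, a population-average rate of misconduct guarantees a few cases; no screening process changes that expectation much.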

47:31

So yeah,

47:31

>> I really think it was the part two that

47:32

did it. You know, there was a

47:34

>> stealing, you know, taking a customer

47:37

Sim Studio that is also a YC company,

47:40

maybe even a batchmate. Taking their

47:42

open source software, not attributing it

47:44

back and claiming it's your own software

47:46

to like your own batchmate or your own

47:47

customer. That's, you know, we've all

47:50

thrown a few things into Claude and

47:51

pretended we did the work. Like all

47:52

three of us have done that, but this one

47:54

breaks the code. You took the

47:56

open source code from your batch mate

47:58

and said it was your own software. I

47:59

mean, and they were your customer.

48:01

That's you can't you can't you can't

48:03

handwave that one away.

48:05

>> Move on. Exactly.

48:06

>> You can't handwave.

48:06

>> Moving on. OpenRouter, a very well-known

48:09

company; for those that don't know, a

48:11

marketplace for LLMs, so to speak. Uh, at

48:14

1.3 billion, priced at 50 million of ARR,

48:17

up from 10 million in October. Um so

48:20

obviously 10 to 50 in, whatever that's

48:22

been, 6 to 7 months feels quite cheap for

48:26

an AI leader. Jason, I'm intrigued to

48:28

hear your thoughts specifically on this

48:29

one.

48:30

Uh, I love OpenRouter. I mean, I use

48:33

it. Um, and it's just very interesting.

48:36

You know, it's a very simple way to

48:38

dynamically pick which LLM to use,

48:40

right? And going to our conversation

48:41

from last week, sometimes it doesn't

48:42

matter if you're not price sensitive for

48:44

certain workloads. Sometimes, not only

48:46

does it matter, but it's incredibly

48:48

helpful to not have to do all this work

48:49

yourself. Oh my god, which model should

48:51

I pick? How should I do it? And OpenRouter

48:52

lets you do it dynamically, or you

48:55

can pick different LLMs for different

48:57

use cases and it just makes it elegant

48:59

and, uh, what I love: it's also really

49:02

cheap, right? It's quite cheap. Um, I

49:05

suspect the cheapness is why it's not

49:07

worth 10 billion, right? When you have

49:09

such a low take rate from such high GMV,

49:13

you do naturally get a little nervous

49:16

about the true TAM, even

49:19

though we've given up on TAM that would

49:20

be my guess. But, um, you know,

49:23

they've become the market leader in this

49:25

space. It's cheap and it works and adds

49:27

a lot of value. You you got to love it,

49:29

right? Um it's just when the flip side

49:31

is you know you one of the reasons

49:34

anthropic I mean god 20 billion right is

49:36

a really good anthropic call at the API

49:39

level is a buck. It's a buck. Okay

49:42

here's my simplification. You can do so

49:44

much on your $20 a month Claude

49:46

subscription or $200 but I can tell you

49:47

on all the apps I've built the complex

49:49

stuff it's a dollar. So that scales

49:52

massively if you're taking uh 1 to 5% of

49:56

a subset of that um you could you you

49:59

know there is in theory a ceiling if you

50:02

don't expand it. What I like is if you

50:03

get market leadership in this kind of

50:05

thing and you're not that expensive

50:06

there's no reason to switch. It's not

50:08

worth switching for a for a tiny amount

50:10

more basis points. It ain't worth it.

50:12

Right. And just for listeners context,

50:16

what the company does is this: if

50:18

you're building an application, this

50:20

product, OpenRouter, acts as an interface

50:22

between you, the builder of whatever

50:24

software product you're building, and 50

50:26

to 60 different LLMs such that it can

50:29

dynamically pick in real time which LLM

50:31

is the right one for whichever call

50:33

you're making. And it charges around 5 to

50:36

5.5% of the money you pay the ultimate

50:39

model provider. So, if you're building

50:42

this app and you're spending, you know,

50:43

$100,000 a year on LLM calls

50:48

using these guys, you pay 5% to them,

50:51

but in return, instead of having to

50:53

access each LLM

50:56

separately, you get access to them all

50:58

in one kind of API call. And so, it kind

51:01

of just totally makes sense to me. It's

51:03

kind of in that Stripe/Twilio business

51:05

model of an interface. Twilio was an

51:08

interface between um an app builder and

51:12

all the complexities of telco. And these

51:15

guys are an interface between an app

51:17

builder and all the complexities of

51:18

LLMs. And you know, Twilio's gross

51:20

margins, because they accounted for gross revenue,

51:22

were 20, 30, 40% plus. They were pretty

51:24

darn good. Whereas in this case they're

51:25

only booking the net revenue at 5%. So

51:28

maybe there is actually room for margin

51:30

expansion there over time you know. So I

51:33

so it's an interesting business kind of

51:35

the world needs it. The other thing

51:36

that's interesting about it which gets

51:38

to the wider question is a number of

51:39

folks have backed into figuring out what

51:41

are the most common models and you see a

51:43

lot of the Chinese open source models

51:45

now right which gets to so I always

51:47

think I'd love to spend time thinking

51:49

about, and I just haven't, is: OpenRouter must

51:52

have a pretty good sense

51:52

of what things do you need

51:55

state-of-the-art models and what things

51:57

can you do easily on you know much

52:00

cheaper um open source models right

52:03

>> well they can even turn it on for you

52:04

that's one of the reasons I think OpenRouter's

52:05

so clever. If you want, they will

52:07

just decide which

52:08

model to use for a workflow.

52:09

>> And my point is

52:10

>> you don't even have to figure it out.
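For listeners who want a concrete picture of "one API in front of many models," here is a minimal sketch of the kind of request an OpenRouter-style router accepts. The endpoint URL and the `openrouter/auto` dynamic-routing model id reflect OpenRouter's public documentation, but treat the specifics as assumptions; the body is only constructed here, never sent, so no API key is involved.

```python
import json

# Hypothetical illustration: OpenRouter exposes an OpenAI-compatible
# chat-completions endpoint, and "openrouter/auto" asks the router to
# pick the model dynamically. Endpoint and model ids are assumptions
# taken from public docs, not verified here.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str, model: str = "openrouter/auto") -> dict:
    """Build (but do not send) the JSON body for one chat completion."""
    return {
        "model": model,  # or pin a specific provider/model id instead
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_request("Classify this support ticket as billing or technical.")
print(json.dumps(body, indent=2))
```

The point the speakers are making is visible in the single `model` field: switching providers, or letting the router decide, is a one-line change rather than a per-vendor integration.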

52:11

>> Yeah. At some point the people spending

52:14

$30 billion a year on Anthropic, um,

52:18

corporate IT is going to wake up and say

52:19

do I have to spend all this money on

52:22

Anthropic, or can I pass some of these

52:24

calls to a cheaper model and something

52:26

like you know just given the size of

52:28

spend that OpenAI and Anthropic are

52:31

getting there is at least the

52:32

opportunity for corporate purchasing to

52:34

think about is any of this doable on a

52:36

cheaper model

52:37

>> I'm a super fan, right, a super fan of OpenRouter.

52:39

Like, great software, super easy to deploy,

52:42

everything. It's like ElevenLabs, just, like,

52:44

super easy to use, super easy to deploy.

52:46

I give it a 10 out of 10. What I've

52:48

learned from another investment we can

52:49

chat about is okay, so they're at 50

52:51

million ARR, they said. Um, and the

52:54

nominal take rate's 5%. But some

52:57

folks probably pay less, right? And in

52:59

some cases, you don't have to pay

53:00

anything. So they might be needing to

53:02

manage $2 billion in inference just to

53:05

get to $50 million in revenue. So how do

53:08

you build? It's easy to see how you get to a

53:10

couple hundred million in revenue right

53:12

in today's world what I worry about

53:14

companies like OpenRouter is how do you

53:16

get to a billion in revenue right and do

53:17

you just wave your hands and say these

53:19

are great founders they're at the heart

53:21

of AI or do you say oh my god like even

53:23

if Anthropic keeps growing, and some

53:26

folks won't use it because they'll get

53:28

big enough they'll do their own things

53:29

how the hell does this get 20x bigger

53:31

when it's already managing $2 billion

53:32

of inference
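The worry above is easy to sanity-check with arithmetic. A quick sketch, using only the round numbers from the conversation (a nominal 5% take rate, and a blended rate of perhaps 2.5% as an assumption, if some customers pay less or nothing):

```python
# Back-of-the-envelope: how much gross inference spend a router must
# sit on top of to book a given revenue at a given effective take rate.
# Figures are the conversation's round numbers, not company-reported data.

def implied_gmv(revenue: float, take_rate: float) -> float:
    """Gross marketplace volume needed to produce `revenue`."""
    return revenue / take_rate

revenue = 50e6  # roughly $50M ARR, as stated
for rate in (0.05, 0.025):  # nominal 5% vs. an assumed blended ~2.5%
    gmv = implied_gmv(revenue, rate)
    print(f"{rate:.1%} take rate -> ~${gmv / 1e9:.0f}B of inference managed")
```

At a blended 2.5%, the $2 billion of managed inference falls straight out of the division, which is why a 20x revenue target implies tens of billions of dollars of flow.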

53:33

>> I I'm going to give you the argument

53:34

which I'm not sure I believe but look if

53:36

you believe in a world where look you

53:39

just look at the OpenAI and Anthropic

53:40

projections which cumulatively add up to

53:43

north of, in 2029, on the quoted estimates,

53:46

$400, $500 billion plus. Let's call it $500

53:49

billion in API across both companies,

53:52

right? As you take out the ChatGPT consumer

53:54

business, maybe $300, $400 billion of

53:56

enterprise API calls across, um,

54:01

OpenAI and Anthropic. I don't know, if 10

54:04

or 20% of that went open source, that's

54:07

$40 to $80 billion, and $40 billion at 5% is,

54:10

let's see, $2 billion, right? So if you

54:13

could and now that's 100% of the market.

54:15

So you're right.

54:16

>> 100% of the market.
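The 2029 scenario the two of them are trading can be written down the same way. All inputs below are the podcast's own round numbers (an assumed $300 to $400 billion of enterprise API spend, 10 to 20% shifting to open source, a roughly 5% take), not forecasts:

```python
# Sketch of the ceiling argument: revenue if a router kept ~5% of the
# slice of enterprise API spend that moves to open-source models, AND
# captured 100% of that slice (the caveat raised above).

def router_ceiling(api_market: float, open_source_share: float,
                   take_rate: float = 0.05) -> float:
    """Annual revenue at full capture of the open-source slice."""
    return api_market * open_source_share * take_rate

api_market = 400e9  # the high end of the $300-400B guess
for share in (0.10, 0.20):
    ceiling = router_ceiling(api_market, share)
    print(f"{share:.0%} open source -> ${ceiling / 1e9:.0f}B/yr at 100% capture")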

54:17

>> That's fair. That's fair. You got to get

54:18

100%. I mean maybe there's maybe there's

54:21

$40 to $80 billion of value going

54:24

to open-source LLMs and maybe you can

54:26

get 5% of that. Now the other question

54:29

to your point Jason is right now

54:31

amazingly all these open- source models

54:34

are primarily Chinese open source models

54:37

and, you know, until either Llama...

54:40

until, if Meta reintroduces an

54:42

up-to-date open source model, or someone

54:44

like Reflection ships one, there'll be a

54:46

US equivalent. But right now,

54:49

ironically the Chinese Communist Party

54:51

is effectively subsidizing you know the

54:54

American small independent software

54:56

vendor by providing cheap open source

54:58

models. God bless them. Right? And

55:00

because if you look at the the the kind

55:01

of winner list on OpenRouter, it's all

55:04

Qwen, Kimi, and all the other open source

55:06

products.

55:07

>> What I think about OpenRouter, just for

55:09

investing right the news is I do really

55:12

think about you know small takes of

55:15

large TAMs is intellectually confusing.

55:18

So Harry and I are both investors in a

55:20

company called RevenueCat. I was the

55:21

first investor and they have about 50%

55:23

market share in managing mobile

55:24

subscriptions. If you have a mobile app

55:26

that is paid, 50% chance they have

55:29

RevenueCat deployed. Okay, it is

55:31

it's competitive and their net take rate

55:34

is like half a percent up to 1%. Right?

55:37

Even with all of that, they're only so

55:39

much bigger than OpenRouter. Now, they

55:41

grew 40% last month because of AI. Like

55:44

it's great, but like and I love the

55:46

company. I love it. They have a clear

55:47

path to a billion in revenue now. But my

55:49

learning from that is sometimes it's

55:52

hard to do the math intuitively. If your

55:54

if your product is very cheap in a

55:56

massive market, but you don't get all of

55:59

it, OpenRouter could be one of the

56:02

greatest $200 million ARR companies,

56:04

right? It's just a risk that I think

56:05

about more than I used to. Um that

56:08

that's fair, but I will give you the

56:10

counterpoint, which is the two best

56:12

financial businesses on the planet are

56:14

Visa and Mastercard. I sit literally 30

56:17

yards away from the Visa headquarters.

56:19

They don't even get the 2 and a half percent because

56:21

most of that goes to the banks. They

56:23

get, you know, 15 20 bips, but on every

56:26

dollar every human spends on the planet,

56:28

it turns out to be a remarkable

56:29

>> No, I'm with you. I'm just I guess my

56:31

personal intellectual limitation is that

56:33

the notional bips math doesn't always

56:35

translate to the real-world bips math,

56:37

right? That's the thing that the

56:38

Revoluts and Visas sound great, but

56:40

sometimes products that seem mass

56:43

scale are more niche in practice. And if

56:46

your product, if your product is

56:47

$200,000 a year or $100,000 a year, who

56:50

cares, right? You'll figure it out

56:51

later. If your product is dirt cheap,

56:54

you really really got to like own

56:56

everything. Own everything when it's

56:59

dirt cheap. And I think we're we're

57:00

we're all making a lot of AI mistakes

57:02

here. And our investments are

57:04

being flattered by high ACVs right now.

57:07

like the ones that have high ACVs all

57:10

seem to be doing great because they're

57:11

$50 to $100K per check. Getting to

57:14

where ElevenLabs got from the early days is

57:16

much harder than for a lot of

57:18

Loras and Harveys, just because the

57:20

large ACV flatters the

57:23

inputs and the outputs to achieve that

57:25

scale.

57:25

>> Agreed. Not sure I could trace it back

57:27

to OpenRouter, but I agree with what

57:29

>> well I'm just nervous I'm personally as

57:30

an investor and this may be one of my

57:32

many flaws. It's a long list. I'm

57:34

nervous about exciting AI investments that

57:37

have very low ACVs right now. I think

57:40

their actual TAMs may end up being

57:42

smaller than they look despite the epic

57:44

numbers when we started this

57:45

conversation. Despite, you know,

57:47

Anthropic getting to $30 billion in 5

57:49

years, the little tiny crumbs we get out

57:51

of this $30 billion may not

57:53

make a whole loaf of bread sometimes.

57:55

Like a bad analogy, but some truth to

57:58

that. That's just so rather than shoot

58:00

from the hip when it's a $7 million post

58:02

back in the old days, if I've got to

58:04

shoot from the hip at a $100 million post

58:06

in the pre-seed, maybe I got to really

58:08

believe that that small that small ACV

58:10

will scale up.

58:12

>> Do you think OpenRouter will be a 10

58:13

billion dollar company?

58:14

>> It's always a weird question because if

58:16

I knew for certain I'd go do the deal

58:17

and not sit here talk to you, right? You

58:20

know, because that's what they're paying

58:21

me to do. I think we're in a world right

58:22

now where everyone is just doing the

58:24

buildout as quickly as possible. And

58:27

what that means is everyone on that

58:29

journey can attract some capital right

58:31

and get some revenue because if you're

58:33

solving a problem that's

58:37

a rate-limiting step in terms of

58:39

getting the AI buildout done you can get

58:41

revenue you can grow quickly and I think

58:42

OpenRouter is an example of that, right?

58:45

and you know you can put on your

58:46

intellectual MBA hat and say in the end

58:49

when things settle out maybe a lot of

58:51

these businesses get commoditized and

58:53

you can you can worry about that and a

58:55

certain amount of that worry is

58:56

legitimate and there's a whole bunch of

58:58

markets like I'll give them all there's

58:59

kind of the labeling marketplace there's

59:02

the inference marketplace there's

59:03

products like OpenRouter, where you

59:05

say oh when things settle down and

59:07

people start getting more efficient,

59:09

then all these businesses will get

59:11

scrunched a little bit and that's true

59:13

intellectually but my advice and I say

59:14

it internally is please don't overthink

59:17

it because while that is true at the

59:19

same time in the short term this

59:22

explosive lift in demand gives you a

59:24

chance to be relevant and It's your job

59:26

to add products on top of that such that

59:28

when the great crunch does come and it

59:30

will come in a couple of years, you've

59:32

just delivered enough value. You I mean

59:34

like, in other words: do I think OpenRouter

59:37

will get to $10 billion in value

59:38

on just what they do today? No. And and

59:42

if they just keep doing what they're

59:43

doing today, no more than the inference

59:44

guys, no more than the labeling guys,

59:46

when things slow down, all these

59:48

businesses will get crunched, right?

59:50

When people start to optimize, but

59:51

you're you have a chance to parlay. You

59:54

have you're building you're building

59:55

relationships with a whole bunch of app

59:57

developers in OpenRouter's case, right?

60:00

The your job is to find the add-on

60:02

products on top of this that over the

60:04

next two or three years, you know, give

60:06

you value or do the adjacent

60:07

acquisitions that give you value. Maybe

60:09

you start doing inference, maybe you

60:11

start hosting stuff on top that allows

60:13

you to extract more value from those

60:15

customers such that when the thing slows

60:17

down, you're the survivor. I mean, we

60:19

saw

60:19

>> Yeah, I think that's the challenge with

60:20

these investments. I mean, the one that

60:22

if I'm running it, it's a dream. 50

60:23

people, right, at 50 million in revenue

60:25

at the center of this. Like it's I I if

60:26

I was a founder, I'd be there's a dream

60:28

job, right? Um but I think my learning

60:31

is Rory's point. The reality is you

60:32

have to go truly multi-product earlier

60:34

in this type of situation. Not not just

60:37

a little feature, right? Not just a

60:38

little enhancement. Um but you literally

60:41

probably have to build five distinct

60:43

products to get to that billion. And not

60:46

um you know, not all founders are

60:48

actually up for that. They say they are

60:49

but not but you need a very distinctive

60:52

founder to run the AI Rippling playbook

60:54

and say hey I want to break something up

60:57

in some ways that's crushing it with 50

60:59

people if they have it. I mean again my

61:01

dream job and say we're going to do five

61:02

of these and we're not going to wait 2

61:04

years. We're not going to we're not

61:05

going to like just focus focus focus

61:07

focus and um I think if they're up for

61:10

it I would hold my stock. Harry you

61:11

probably have no choice. If you see this

61:14

sort of Stewart Butterfield-esque

61:16

reluctance to go multi-product, which

61:17

was very rational at at that time and

61:20

place, then uh I would be less excited

61:22

to hold stock. I think you've got to run

61:25

these businesses right now like you're

61:27

in this insane

61:30

kind of period of time when money is

61:33

just raining down on everyone and all

61:35

the time you should be saying to

61:37

yourself at some point the music will

61:39

stop and twothirds of the people will be

61:42

will will have to go how do I make sure

61:44

I'm the one-third that make it right and

61:46

that's what you know the smart inference

61:48

providers are doing that's what the

61:49

smart up and down the stack should be

61:50

doing how do I lock in because look the

61:52

truth even when the crunch comes, the

61:54

foundational model companies make it

61:56

because they they're on top of the heap.

61:57

They have the high intellectual property

61:59

asset, right? They're going to make it.

62:01

Everyone else one level down has got to

62:02

be saying to themselves when people

62:04

sober up, they're going to say, "Oh my

62:06

god, this is a commodity. There's a

62:07

bunch of adjacencies. How do I make sure

62:09

I win in that world?" Speaking of, "Will

62:12

this become a commodity in a future

62:13

world?" We've seen you the need and the

62:17

explosion of databases. Um, we've seen

62:19

some people like your Lovables and your

62:20

Replits incorporate them, build it

62:22

themselves. Some people um outsource to

62:25

Supabase. Supabase at $10 billion.

62:29

Jason, you're the man for this. The man

62:31

who's used more Replit instances than

62:34

anyone else. Is Supabase at $10 billion

62:37

a good buy?

62:39

How do

62:40

>> I think it might I think I like it. I

62:41

mean, I do think it's it's an

62:42

interesting buy. Um, first of all, you

62:45

know, huge credit to the team. I mean

62:47

this is one this is one I call an AI an

62:50

AI tailwind to the maximum. Supabase,

62:52

founded I think in 2020 right this is

62:54

pre-AI, and they're like, oh well, we'll

62:57

do another fork of Postgres, which is

62:59

open source and free, and we'll make it

63:02

easier to use and easier to deploy. I

63:04

mean, who the hell I mean, I know it was

63:06

it was a hot YC company, which kind of

63:08

uh you know, a lot of folks want to say

63:10

it's the unhot ones that take off, but

63:11

you know, sometimes it is the hot ones,

63:13

but I don't know. That

63:13

certainly wouldn't have been

63:15

obvious to me in 2020 that we needed

63:17

another forked version of an open

63:19

source database, Postgres. I mean,

63:20

everyone was having issues with

63:21

Postgres at the low end and the high

63:23

end. Folks were having to shard it and

63:25

it got complicated for big and it was it

63:27

was reasonably difficult to deploy at

63:29

the low end. So their idea but then that

63:31

just worked with agents like they built

63:33

a product that could basically

63:35

self-deploy a Postgres database, and

63:37

it's what every agentic product needed

63:39

right they needed to spool up a database

63:41

without humans and they leaned the hell

63:43

into it, right? They didn't

63:46

get Replit, which went with Neon, which Databricks

63:49

bought, but everyone else standardized on

63:50

Supabase, right? They supported them

63:52

and then they let everyone, Lovable and

63:55

Emergent and all these other ones, white

63:57

label it a couple months ago um and Now

63:59

I don't have the exact numbers but I

64:01

know more databases are being created by

64:03

agents than humans. So that is the trend

64:06

you're betting on. All right. Database

64:08

is a fundamental category of software.

64:09

It always has been. Right now, the number

64:12

of databases we're creating I mean it's

64:14

it's it's it's an order of magnitude

64:16

more than it's been 12 months ago. So

64:18

why the hell wouldn't you want to bet on

64:19

the leader in that trend? Right? Every

64:21

app needs a database. like every and

64:23

even the ones and what's interesting now

64:25

is, I'm not sure if this is true of

64:28

Lovable or v0, but Replit changed it a

64:30

little while ago where every single app

64:31

has a database whether you use it or not

64:32

they found that enough of them are using

64:34

databases no matter what they build

64:36

right that it's not worth adding a

64:38

database later so whether you even

64:39

realize you have a database all the

64:42

millions and millions and millions of

64:43

vibe-coded apps have a database in the

64:46

background. So, um, with Supabase they

64:49

get to monetize them all like they're

64:51

charging these guys for every single

64:53

database. So I I do like this one. I I

64:56

do like it. This is one where the agents

64:58

are so far ahead of humans now. There

65:00

are categories where the agents are

65:02

doing, like vibe coding, and everyone's talking

65:04

about what the world will be like in

65:05

four years, right? Database is a world

65:07

where already the agents are creating

65:08

more databases than humans. We've

65:10

already we've already crossed that line

65:11

and so why wouldn't you want to invest

65:12

in the leader?

65:14

>> Agree. And I think it is really

65:17

an excellent example of two things we've

65:19

talked about. One is that kind of thing

65:21

I just mentioned which is you start with

65:23

something and you have to parlay and

65:25

then the other thing is Jason that

65:26

you've talked about a lot is being a

65:28

pre-AI company that, you know, brilliantly

65:30

finds a way to co-attach. These guys

65:32

co-attached to the trend as you say did

65:35

the deals with many of the vibe coding

65:37

things and now their job in the next two

65:39

years is before the music stops be

65:42

perceived, just as MongoDB was the

65:45

right database for the kind of SaaS era

65:48

and for cloud you want to be the right

65:50

database for vibe coded and agent apps

65:52

in 2026, '27, or '28. And at some point, when

65:56

things slow down enough

65:59

for the Lovables and the Replits and the

66:01

other folks to say, "Hey, maybe we

66:02

should just back in and do this

66:04

ourselves." You want to superbase be in

66:06

a position to say, "No, every developer

66:08

on the planet uses us. Every agent

66:11

framework supports us. Why would you do

66:13

this? Your users will rebel." Right? The

66:16

playbook is super clear. As I say, it's

66:18

literally, it is just like the

66:20

SaaS and cloud playbook, but on

66:23

super-fast speed. You know, this is all

66:25

going to happen in two or three years

66:27

and make make sure that you know before

66:30

things slow down you are a lot more than

66:32

you are today in the eyes of your users

66:35

think about how hard it classically has

66:38

been to deploy a database I mean Oracle

66:40

is still massive, right? I mean, I've

66:42

never deployed Oracle but I can only

66:44

imagine how difficult it is to deploy

66:46

Oracle database, right? [ __ ], these

66:49

products are work, and that was disruptive.

66:50

These things are work. Even [ __ ] has

66:53

a vector database product, and I

66:55

deployed it for one of our apps and it

66:57

it only took a few hours but it took

66:59

head scratching and headaches and not

67:01

everyone could do it. Supabase you

67:03

could do in 5 seconds. I mean it's so

67:05

disruptive.

67:06

>> All these databases start being easier

67:08

than the prior alternative. I mean I

67:10

don't... shock, horror... relational started

67:13

because it was in the 70s but I remember

67:15

even in the early 90s

67:17

>> Arthur Rock used to talk about it

67:18

though. I remember the relational

67:19

database days

67:20

>> and then I but I do remember when

67:22

MongoDB started and it was just the

67:25

drop-dead simple cloud-based alternative to

67:28

a lot of these you know to some of the

67:31

other alternatives at the time not so

67:32

much directly competitive relational

67:34

databases but for some of the newer use

67:36

cases and then they get more complex

67:37

>> or a DBA for someone that could spend a

67:39

month configuring it and getting it

67:40

going is disruptive right

67:42

>> yeah because it it didn't take a month

67:44

it took a few hours it was easy it was

67:46

JSON it was whatever and you're right

67:48

now It's 5 minutes. So, it's just now

67:50

I've no doubt

67:51

>> or actually it's invisible. You don't

67:53

even know. Here's what's interesting.

67:54

You don't even know you have a database

67:56

until you need it. It's lurking in the

67:58

background now. You build an app without

68:00

a developer and you didn't even know you

68:02

needed a database because you're not a

68:03

developer and it's already there and

68:05

configured and has all your data. It's

68:06

pretty cool.

68:07

>> That's true. But the odd point I was

68:09

trying to make is the tragedy is that's

68:11

great, but over the medium term, a white

68:13

label business to five or six vibe

68:16

coders won't be enough. So they're going

68:17

to have to expand beyond that. And

68:19

ironically, over the next 5 years, that

68:20

will mean adding complexity, adding

68:22

functionality. And in 10 years time,

68:24

someone, and it won't be me at that

68:26

point, will be saying, "Oh my god, those

68:27

legacy Supabase products, they're

68:29

almost as bad as MongoDB, and there'll

68:32

be a new alternative at that point." But

68:34

that's just the movie. And this is

68:36

Supabase's time to crank. Good for

68:38

them.

68:39

>> I think it's also a reminder just that

68:41

we've given up on worrying too much

68:43

about intellectual durability in these

68:45

in these investments, right? It's a

68:47

winner. The growth is is exciting. The

68:49

NPS is high. We're not... the fact that

68:53

everyone else may build their own

68:54

Postgres databases or other things may

68:56

change. We're not we don't even care

68:57

anymore.

68:58

>> I mean, I'd say differently, by the way,

68:59

just to be clear, it's not that we don't

69:01

care. It's that you just don't have the

69:02

luxury of... there are very few things

69:05

where you can say, "Oh, this is

69:07

something that is highly different. It

69:09

has that level of defensibility."

69:11

Arguably, LLMs themselves did because

69:13

there was only a small number of people

69:14

who knew how to make the magic. But

69:16

you're right, most of the time right

69:18

now, I mean, I can regret the fact that

69:20

there's not a lot of barriers to entry

69:24

or I can just accept that that's just a

69:25

reality that exists today. And the

69:27

barrier to entry is, as Brian from Andre

69:30

said, is speed. And if you execute well,

69:32

you create these barriers to entry over

69:34

time. But you're right, Jason. Right

69:35

now, most of the deals you look at is in

69:38

the short term, the barriers to entry

69:39

are low. And what that means is if you

69:41

stumble, you lose,

69:43

>> right? Because if there's five of you

69:45

going out of the gates, one of them

69:46

won't stumble and they'll win,

69:48

>> right? And over time by winning, they'll

69:50

be able to create. I I believe

69:52

downstream there will be barriers and

69:54

modes created, second order modes, but

69:56

out of the gate, you're exactly right.

69:57

It's a race. And I don't know who the

69:59

Supabase competitor was, but they didn't

70:01

get the two or three key white label

70:03

deals. And there you are.

70:04

>> I think also this is why I view a lot of

70:06

VCs today as enablers.

70:08

It really bothers me because

70:11

Rory's point is accurate. If you stumble

70:13

today, you may lose forever, right? And

70:15

I see way too many the classic VC thing

70:18

is guys, keep pushing. You you've got

70:20

time. Keep at it. You know, a bad

70:23

quarter or two falling behind the

70:24

competition. Like everyone, it's not

70:27

that I don't think you should be

70:28

supportive of your portfolio companies

70:29

of of course you have to be, right? And

70:31

what cho what choice do you have? I just

70:32

see too many VCs running a pre-AI enabler

70:36

playbook where,

70:39

when folks do fall behind a tick or two

70:43

you see kumbaya activity instead of

70:45

code red activity. I don't think I'm a

70:48

kumbaya. But I also, I was interested

70:50

to see where you're going with that.

70:52

>> I think it goes to your point of what

70:53

you said before, which is like there's

70:55

no point in being difficult with

70:58

founders or being opinionated in the way

71:00

that you've been before because they

71:02

don't listen. You said it yourself, they

71:04

don't listen. And so why

71:05

>> No, but I'm making you're I'm making a

71:06

slightly different point. For example,

71:07

I'm an investor in a company

71:09

that's crossed nine figures in revenue,

71:11

but it is hitting massive

71:13

AI competition and issues. Okay, it

71:15

happens, right? and they have a new

71:16

investor on the board that that doesn't

71:19

really know the space that well and

71:21

doesn't really want to learn and frankly

71:22

isn't as close to some of the AI changes

71:24

as as we are and every email and

71:27

conversation is great job guys keep at

71:30

it he like you know don't you understand

71:31

the disruption in the space don't you

71:33

understand the issues and he's become an

71:34

an enabler an inadvertent enabler by

71:36

being a cheerleader right I just worry

71:39

about it because so many folks are still

71:41

hiding and I don't think having enablers

71:43

around the table even if it feels good

71:46

on a given month is helpful today. I I

71:48

think enablers can can can enable a

71:50

death spiral that you feel good about as

71:53

you approach the event horizon and your

71:54

startup implodes.

71:55

>> I was thinking about what you said and

71:57

you know frankly just checking myself

71:58

have I at times I mean because you know

72:00

what one man's description of enabler

72:02

can be another person's description of

72:04

being supportive right and at times when

72:06

things are tough you want to be

72:07

supportive. So I was I was thinking

72:09

actually Jason because I actually think

72:10

it's a very important comment and I

72:12

think what makes a difference is this.

72:14

You have to be very cleareyed with your

72:16

companies on where the competition is

72:18

and understand exactly, you know, what

72:20

the other guys are doing and therefore

72:22

how well and how far behind you stack

72:24

up, right? And that takes it away from

72:26

enabling it's it's it's kind of a

72:28

judgmental term is am I being you know

72:31

because also being a jerk is also a

72:32

judgmental term, right? I think what

72:34

you're saying that is correct is you're

72:37

not a useful board member unless a you

72:40

understand what the company does and how

72:42

it compares to the direct competitors.

72:44

ideally with hands-on experience of the

72:46

products. And then B, you find a way

72:48

without being a jerk

72:51

to keep the company honest about where

72:53

they are relative to the competition.

72:55

You know, what are they seeing?

72:58

What do our product what do our

72:59

competitors products do? What do the

73:01

adjacent products do? How do we think

73:03

about that? Not in a kind of defeist

73:04

kind of way, but you know, how do you

73:07

distinguish between, oh, these guys

73:09

leapfrogged us for a month and we need to

73:11

get our act together versus we are a

73:13

year, year and a half behind in a

73:15

market, we may never catch up. We should

73:16

sell while we can, right? And I think

73:19

that's actually that I think that's kind

73:22

of threading the needle between you

73:23

don't want to be an enabler, but you

73:24

don't want to be a defeatist. You want to be

73:26

supportive.

73:28

It's kind of a slogan we have

73:29

internally. It's not so much founder

73:30

friendly, it's founder honest, right?

73:33

But also, I think actually as I think

73:34

about it, it has to be founder

73:36

fact-based. The number one thing, and

73:38

I'm even thinking on a couple of my

73:39

deals where I have to do some work this

73:40

week, do you really understand where

73:42

your two direct competitors are and the

73:44

strengths and weaknesses of your product

73:46

and what the last three win/losses say

73:48

about how you're really doing in the

73:50

field? Because if you don't, you're

73:52

just, you know, cosplaying a board

73:54

member

73:55

>> and are you being honest about what has

73:56

to change?

73:57

>> Yeah, agreed.

73:58

>> Right. Okay, champs, there there are

74:00

many other topics that we can discuss.

74:02

Um, I often get chastised for my

74:04

selection, so I'm going to

74:06

>> No, I want you to make the selection.

74:07

You make the selection, Harry. I'm too

74:08

tired to decide. I

74:10

>> I think we will have to at some point

74:11

address it. I think someone in a comment

74:14

put like, "Oh, Harry, you often shill

74:16

them and so you can't not talk about

74:18

them when there's trouble." I don't like

74:20

to do that. So, for context, Mercor,

74:23

obviously a data provider to some of the

74:25

largest companies in the world, most

74:28

notably Meta, who they have since

74:30

reportedly lost, in part or in full, as

74:34

a customer of their data. Um,

74:37

it was also unfortunately at the same

74:40

time that Forbes released their

74:41

billionaires list of young billionaires

74:44

where the founders of Mercor are on that

74:46

list. Um very unfortunate timing there

74:49

for them. Um

74:52

Jason, I'm sure you've got an opinion on

74:53

this in terms of, bluntly, the Mercor hack

74:57

and then losing Facebook as a customer.

75:00

Well, just two thoughts. One, not that

75:01

long ago, I was with um part of the

75:04

management team of one of the leading

75:05

hyperscalers, and what he

75:10

said, and I was with him when there

75:12

was a minor security issue with the

75:13

third party vendor, relatively minor,

75:15

right? Not like this, not like

75:18

all of the private data of all of Mercor

75:20

being exposed and what he said was

75:23

there's not much higher on our list with

75:25

partners, and when this happens,

75:26

we have no

75:28

tolerance. It's not worth it. There is no

75:30

tolerance. And in this case, they got a

75:32

pass because it was a relatively minor

75:35

issue with a vendor that was honest and

75:37

they fixed it and it did not lead to

75:38

actually any internal data issues it was

75:40

just an external issue but it was

75:42

crystal clear: there is no tolerance;

75:44

it is not worth it for us. So that is

75:48

troubling. And the other thing and I'm

75:49

not an expert in the data labeling

75:50

space. What I don't know is how

75:52

fungible the products are at some

75:54

level. The more fungible it is, right,

75:57

the more freedom you have to route that

75:59

to what's left at Scale or whatever the

76:01

guys at Handshake want to do or whatever

76:03

they want to do. If it's if it's not

76:04

fungible, you can bang

76:06

your chest, but you're

76:08

still stuck using them. But

76:10

but that moment like that when I was

76:12

with that hyperscaler,

76:13

>> My number one criticism of the space is

76:15

that all the largest customers are

76:17

customers of all of them and so they are

76:19

generally heavily fungible, and

76:21

>> so I would be very worried because this

76:22

comment was chilling from this executive

76:24

is like this is the highest thing on our

76:26

list with third party partners is

76:27

security. We're exposing our

76:30

applications to folks we'd rather not

76:32

expose it to because of our ecosystem.

76:34

We'd rather have no partners because

76:35

there is so much confidential stuff

76:37

flowing through what we do at all

76:39

levels. So we just have no

76:41

tolerance for anything that's material.

76:43

I just don't see how you would come back

76:45

from this once you've crossed it. There's just

76:47

no way back, unless you have to. I

76:49

think it's potentially death. And

76:51

the reason I bring it up is in the old

76:52

days like through 2023 in our lifetimes

76:56

you always got a pass once as a vendor,

76:58

even for the worst breaches the worst

77:00

issues, unless your app was down for

77:03

weeks. You'd get called into the

77:06

CISO's office you'd get yelled at. It

77:08

was brutal. But you always got at least

77:09

one because it was just the reality of

77:12

working with emerging vendors. But man,

77:15

I just don't know that you get a

77:17

second one here. I just don't know. A

77:18

lot at the margin depends on how you

77:20

handle it. I don't know at this point.

77:22

Do we have any sense of, you know, was it

77:23

a state actor? Was it a malicious

77:25

employee? Was it just a stupid

77:27

configuration breach? So I don't have

77:29

any sense of the forensics here. It was

77:31

an organization I think

77:33

they're called Latipus, which basically holds

77:36

your data for ransom, and you have

77:38

to pay to get it back. It's a commercial

77:39

activity.

77:40

>> Gotcha. It's commercial, like decent,

77:42

honest criminals, right. Got

77:45

it. That's

77:45

>> very successful. I asked the team

77:47

if we could invest in them. It seems

77:49

this is one of many very successful

77:51

>> actually. You'll find, Harry, they

77:52

don't actually

77:53

>> No, they're very good. They're very

77:54

good.

77:54

>> They don't need much outside capital.

77:56

Right. That in one sense sucks. On

77:58

the other hand, the good news is at

77:59

least they're going to be rational and

78:01

you can buy them off. Um, I think, is it

78:05

fatal? Hopefully not. I think in these

78:07

things you generally pay a pretty

78:09

significant penalty. How you handle it

78:12

is a key part of it where you're

78:13

straightforward and honest with your

78:15

suppliers and with your customers and

78:17

let them know what's happened. Was it

78:20

your fault entirely?

78:22

Were you crassly stupid? Was it bad

78:24

luck? You know, where on that

78:25

continuum was it? So you can

78:29

manage through it, I think. It is hard

78:31

when you have four or five vendors of

78:33

roughly the same thing. On the other hand,

78:35

I don't mean this cynically but there

78:37

also appears to be right now an

78:38

insatiable demand for labeling data. So

78:42

it may be okay. I think the Meta

78:44

statement didn't say they've left; it said

78:45

they've paused, which makes sense.

78:47

Everyone's going to pause, right? And my

78:49

guess is the likely outcome is

78:52

you lose a fair amount of revenue you

78:53

lose a fair amount of time. You're going

78:55

to have to spend a ton of money to

78:57

bolster your defenses, but if you do the

78:59

right thing, you'll be able to earn your

79:00

way back slowly and over time, right?

79:03

That's the likely outcome here. You

79:06

know, hopefully not fatal. Um, but my

79:10

guess is that's it. And, maybe to tie it back to the start, I

79:12

think Sam Altman

79:14

said something this week:

79:15

that massive cybersecurity attacks

79:17

from AI, right, are coming. I

79:20

genuinely think that most B2B companies

79:22

are going to get hit worse than Mercor.

79:25

Okay, this LiteLLM was one of the

79:28

weaknesses that they had. Like tons of

79:30

B2B folks have it. How strong, how

79:32

state-of-the-art and strong are their

79:34

security teams? They're not;

79:36

they're relying on a hodgepodge of open-

79:39

source and other products that are

79:41

barely monitored in many cases. They're

79:43

busy. They're under pressure for

79:44

profitability. They're managing their

79:46

teams. And maybe Mercor, I

79:49

mean it's probably a small company they

79:50

probably don't have a huge team but my

79:51

point is I think um it's going to be

79:54

really tough on a lot of startups and

79:56

scaleups because they just don't have

79:57

the teams to deal with the levels of

79:59

threats that are coming from AI this is

80:01

a start; it's going to get worse. This

80:02

LiteLLM incident is going to happen

80:04

to everybody. And as soon as they figure

80:05

out they can hold all these B2B companies

80:07

and others hostage, they and their

80:10

network of thousands of affiliates are

80:11

going to do it. Well, I don't know;

80:13

the average pretty-good-to-mediocre company

80:15

doesn't even really have a

80:17

security team. How the hell are they

80:18

going to keep up when they don't even have

80:19

a team? Remember when Gainsight was

80:21

offline for a month? Drift permanently

80:23

was destroyed. And this was pre all this

80:26

craziness. Like there was an old saying

80:27

my CTO told me back in the day. The only

80:29

reason we've never been hacked is no one cares

80:31

about us. And that has resonated in my

80:33

ears for years.

80:35

>> With AI, you can hack anybody you want.

80:37

I think, Jason, you're exactly right,

80:40

because I think you know in general you

80:44

know, security purchases tend to lag. You

80:47

know, new stuff gets deployed; people

80:49

should think about security up front,

80:51

but they tend not to. They deploy, bad stuff

80:54

happens, and then they panic and think

80:55

about security. That's the way it's

80:57

always been in every cycle. And I think

80:59

we're just about to hit that stage of it

81:01

now. And I think there are kind of two

81:03

separate vectors there. First of all, the

81:05

AI apps themselves have security needs

81:07

that are different than pre-AI apps.

81:10

You know, the whole prompt injection

81:12

kind of issues. But I think the bigger

81:13

issue is Jason's point, which is

81:15

everything else. Now, using AI,

81:19

the bad guys can just automate attacks,

81:21

automate phishing, you know, automate,

81:24

you know, duplication of voice, all that

81:26

kind of stuff. I think Anthropic

81:27

referenced this and you know that and

81:30

it's the Red Army quote that I

81:32

often use that quantity has a quality

81:33

all its own. And with AI, you can make

81:36

quantity of fake people. You can make

81:38

quantity of attacks. You can do this

81:40

thing. So, I think it's not so much

81:41

going to be you're getting attacked

81:43

because of your AI apps. It's much more

81:45

going to be AI apps are going to attack

81:47

you. A lot of the AI doomerism I kind of

81:49

discard, but this is a legitimate and

81:51

big issue, I think, right? The ability

81:53

of AI to escalate the velocity and

81:57

ferocity of attacks, right? So, which is

81:59

why, by the way, I think the whole

82:02

response of security stocks going down

82:05

to the Anthropic announcement was

82:07

absurd. I think anyone who's cutting

82:09

back on the security budget in 2026 is

82:12

missing the point, right? Because not

82:14

only are the attacks more ferocious, but, thinking

82:16

again, Harry, to your point,

82:18

the second statement is really Captain

82:19

Obvious, but it's worth saying

82:22

this stuff gets more important every

82:24

year because more and more of our

82:26

stuff's online. You know, the percentage

82:28

of things that we do that are done

82:30

online just continues to escalate. It's

82:33

like stupid stuff. It's like all the

82:34

stuff in your house, all the stuff in

82:36

your financial life, there's nothing

82:38

that matters that isn't done online at

82:40

this point, right? Which means that

82:42

attacks there can attack more and more

82:44

of what counts, right? So I think

82:46

absolutely you're going to see you

82:48

should be seeing, you know,

82:52

a real acceleration of investment in a

82:54

different class of security to cope with

82:56

a different class of threat. Because, look,

82:58

it's a fatal error if you don't. There's

83:00

only really two things that can destroy

83:01

one of these companies: the app goes down

83:03

for a long period of time or the app

83:05

gets hacked grievously. Those are the

83:08

two kind of red-card fatal errors, right?

83:11

and that's where you're going to have to

83:13

put the money. There's a wave of the

83:15

second tier that's coming. It's just

83:17

massive, right? Even, to tie it off, I

83:19

know we've got to wrap, but even

83:20

Supabase, which I'm a super fan of,

83:22

right? Um, a lot of times folks turn off

83:24

the default security, right? And we've

83:26

seen these issues. And in the old

83:28

days, no one would find it because they

83:29

didn't care about our little Rory and

83:31

Harry and Jason's app. Now AI will

83:32

find it in seconds, right, on the

83:34

internet. And it will use every possible

83:37

way to penetrate it, to steal the data,

83:39

to create malicious acts, everything that

83:41

is possible on the face of the earth

83:43

that can be delivered with software. Um

83:45

Maybe there's nothing we can do, but I

83:47

don't think so. I

83:48

think we've all underinvested in the

83:50

attacks to come.

83:51

>> It used to be hacker farms of people in

83:53

fill in the blank: the Philippines,

83:55

Russia. Now it's going to be hacker

83:57

farms of AI agents cranking 24/7. It's

84:00

going to be miserable.

84:01

>> Guys, you can choose one more. Okay,

84:03

Jason. Rory doesn't feel like choosing,

84:06

so he's delegated it to you.

84:07

>> Absolutely. Okay, there's the GLP-1

84:10

two-person company. Again, for people who

84:12

use this as their, like, news on tech: a

84:15

two-person company that used AI

84:18

intelligently and scaled to $1.8 billion in

84:20

revenue selling GLP-1s.

84:22

Interesting. A lot to unpack there. Wix

84:25

buying back 31.6% of its shares given

84:29

its low stock price. Oracle getting

84:32

rid of 20,000 to 30,000 employees via a 6

84:35

a.m. email. Jason, any that stand out,

84:37

baby?

84:38

>> Obviously, at some level, uh people are

84:40

so excited about the one or two person

84:42

billion-dollar company. Facts were

84:44

glossed over, points were missed and all

84:46

that, right? Oversimplified.

84:48

Um

84:50

and they used deepfakes. They made

84:51

representations about doctors they

84:53

shouldn't. They did all the wrong

84:54

things. They did all the crappy

84:56

affiliate marketing stuff people have

84:57

been doing for 20 years and they did it

84:59

at scale with AI and built a big

85:01

business on it, right? Um, I get it. Um,

85:04

but what is interesting about it is if

85:06

you, let's not overglamorize this

85:08

company that maybe is at the edge of

85:09

fraud in many ways in its marketing

85:11

tactics. The fact that they could use AI

85:13

to scale this with two people and maybe

85:16

some consultants and stuff on the side.

85:19

It is. We are seeing the future.

85:21

Why? AI is completely changing

85:24

marketing. Marketing has not gone away.

85:26

To Rory's point, Dario

85:28

and Sam are everywhere because marketing

85:30

matters as much now as ever. And this is

85:33

a different version of how marketing is

85:35

changing. And if you don't adapt, right?

85:37

Maybe buying TBPN is a bad idea, but you

85:39

got to adapt to the new world of

85:40

marketing. And, you know,

85:43

there are a number of

85:44

startups that have tried to automate all

85:46

this marketing at scale with AI.

85:47

Most of them are terrible, right? They

85:48

don't quite work. They create boring

85:49

assets. They look like awful art.

85:51

They look kind of fake.

85:53

They're that bad. A lot of the assets,

85:55

um, they don't have context. They use

85:57

too much stock art. But there's no reason in a

85:59

year everyone shouldn't have the AI

86:01

power to blanket the entire internet

86:04

with the best hyperpersonalized

86:06

marketing in the world. Like,

86:07

hopefully in a year we will all have the

86:09

power to do some of what, um, Medvy have

86:12

done. And I

86:15

think, if we step back, it's a chance to

86:17

see 12 months into the future, when

86:18

all of us will have more of this

86:20

power.

86:22

>> In other words, just because they're

86:24

illegally selling weight-loss

86:26

drugs that are off-label to

86:30

people without the appropriate FDA

86:31

safeguards doesn't mean they're not

86:33

great marketers.

86:34

>> I talk to so many CMOs who are

86:36

still struggling to run the 2023

86:38

playbook and falling further and further

86:40

behind. You have to adapt. And

86:45

I think crappy automation

86:48

doesn't work anymore, right? But these

86:50

guys that were able to do it at mass

86:52

scale and target everybody, this is the

86:54

future that we all should know,

86:57

and, um, marketing, I guess the coda

87:00

is marketing appears to be more powerful

87:02

than ever in the age of AI and humans

87:04

have to control it. But if you don't

87:06

leverage how marketing will be done in

87:09

the future, you're going to be

87:10

stuck in the past. You're

87:12

going to be killed. You're

87:13

going to be killed by folks running the

87:14

old playbook. So, um, but yes, I mean,

87:18

anyone that achieves some sort of scale

87:20

through marketing, as dodgy as some of

87:23

these consumer guys are, there's

87:24

something to learn, right? Whether it's

87:25

multivariate testing, whether it's AI,

87:27

whether it's personalization at scale,

87:30

uh, I think it's a waste as a marketer

87:32

if you don't learn from it, right? And

87:33

this is what I think is going to happen.

87:34

Just like it

87:37

is inexcusable to send a dated sales

87:40

outreach cadence from 2021 today, the

87:44

way we're doing advertising and

87:45

marketing will seem incredibly dated in

87:47

two years. Only the creakiest

87:49

companies will be doing marketing and

87:51

advertising the way they do it today in

87:52

two years. It makes no sense. It makes

87:55

no sense. I should be getting the GLP-1

87:57

ad specifically targeted to you. Jason,

88:00

you don't technically need it, but I

88:01

noticed your jawline on 20VC could be a

88:03

little bit tighter. Uh, what if we just

88:05

microdosed you today, so you

88:08

could get some of that t-shirt look that

88:09

Harry has. Like, I'd click on that in 60

88:11

seconds. 2 seconds. You

88:12

>> would, you know,

88:13

>> I shouldn't get that. And versus stock

88:15

art of some grandpa running with his

88:17

golden retriever on a beach, right? So,

88:19

I both think this is fraud,

88:22

and over-headlined, and the

88:25

future. We're watching the future.

88:26

>> I think you're right. I mean, I would

88:28

argue that a huge percentage

88:30

of it is not

88:32

on the fraud side but on the hairy edge-

88:35

of-regulation side. But I actually agree:

88:37

upon reflection, what you're saying is

88:38

correct. It's funny, it's a little like

88:40

the last comment. What we're basically

88:42

saying is the most entrepreneurial

88:45

people on the planet out there right now

88:47

are the, yeah, the hackers and

88:52

the kind of dodgy marketers that are on

88:54

the front line of pushing things. But

88:56

the tactics that they're using in the

88:57

way of leveraging AI, everyone's going

88:59

to be doing within a year or two, or they're

89:00

going to be left behind. I

89:01

could go along with that. Yeah. There

89:03

was an era in the old days

89:04

when the best affiliate marketers knew

89:06

things nobody else did and built

89:08

billion-dollar companies out of it,

89:10

right? Uh then there was an era which

89:12

has now faded when some of the best

89:13

companies used SEO in a way nobody had

89:16

used before. Millions of pages, right?

89:18

You know, even things you could, you

89:20

know, maybe it's only worth 10 billion.

89:21

Digital Ocean was built entirely on SEO

89:23

farms. So was Zapier and others. And

89:24

there was a group of folks that knew how

89:26

to do this and it worked. The next group

89:28

of folks will know how to do this mass

89:30

personalization at scale that really

89:32

works to millions and millions of people

89:34

and they will win like the prior

89:36

affiliate marketers. They will crush folks

89:38

and the rest of the world just won't get

89:39

it. They'll think it's dark arts and um

89:42

and they need to become agentic

89:44

marketing experts. But in the same way

89:46

that great affiliate marketers actually

89:49

largely originated from porn and Viagra

89:51

if you want the

89:52

>> dark

89:54

and here I think probably some of the

89:56

best next-gen AI marketers will spawn

89:59

from GLP-1s or anything at the border-

90:02

lines of, you know, next-generation

90:05

e-commerce in some ways.

90:06

>> Yeah. But Jason's insight, I

90:07

agree with that, by the way. I

90:09

mean there's no doubt that criminals I

90:12

mean like many of these new technologies

90:14

are adopted by crime or by porn or dodgy

90:17

marketers. But I think Jason's insight

90:18

is a profound one which is the marketing

90:22

tactics that are used by them, and look very

90:25

kind of dark-artsy, over the next 10 years

90:27

tend to be adopted by everyone and now

90:30

as you say, everyone in corporate

90:32

America is an SEO expert whereas 20

90:34

years ago it was a dark art and I think

90:36

you're right Jason in the next two or

90:37

three years, if you're not adopting agentic

90:40

marketing as a digital marketer, you're

90:42

just going to be left way behind and

90:44

personalized marketing and leveraging

90:45

the technology. Yeah, I mean, I

90:47

knew it already intellectually but

90:49

actually I will say you kind of

90:50

crystallized it in my mind, because I've

90:52

lived it. I remember when SEO

90:55

marketing was literally something that

90:57

only the lead gen companies did, and

90:59

there was a two- or three-year period where

91:00

they had, you know, hypergrowth $100

91:02

million businesses kicking off 30% cash

91:05

and then those businesses went away as

91:07

normies adopted the technology and the

91:10

correct play was to be the software

91:11

provider helping the normies tool up and

91:14

I think the same is true here in agentic

91:16

marketing. So that's my stolen insight

91:19

from you for the day, Jason, as I go

91:21

look for those companies.

91:22

>> Well, it's just interesting. All

91:23

Medvy has to do to make 400 bucks is

91:26

drive a GLP-1 lead to someone that buys a

91:28

product that, just like tokens, the

91:30

world can't consume enough of.

91:33

And most of these folks

91:34

are selling the exact same product and

91:36

they're fungible, right? And so

91:37

there's a moment

91:39

in time where it's a great marketing

91:40

arbitrage. If you're excellent at this,

91:42

you get three to 400 bucks for

91:43

delivering a customer that already

91:45

wants to buy your product, right? It's a

91:46

moment. That's how the math works. It

91:49

kind of ties back to our OpenRouter or

91:50

another conversation. If you only made

91:52

10 bucks, this wouldn't be such an

91:53

exciting business. But

91:55

they're so profitable, but also so

91:57

competitive. When you get 400 bucks for

91:58

just delivering a customer from a

92:00

Facebook ad, man, you know, you want

92:02

to run this.

92:04

>> You do what it takes.

92:06

>> Boys, a lot to discuss this week. Thank

92:08

you for being so good. It's so good to

92:10

see
