Are SaaS Companies Cooked: Which Thrive & Which Die | Aaron Levie

Transcript

0:00

I would still probably be loading up on all of the frontier rounds. These numbers could continue to get much larger.

0:05

>> Now, I think Aaron Levy is one of the

0:07

luminaries on AI pervading enterprise.

0:10

And he did a viral tweet the other night

0:12

and I said, "Dude, we've got to do a

0:13

show on this." So, this is specifically

0:16

on how AI will impact the biggest

0:18

enterprises in the world, how agents

0:20

will be introduced into the largest

0:22

enterprises. And we couldn't have anyone

0:24

better than Aaron, founder and CEO of

0:26

Vox, one of the public companies of the

0:28

last decade. This is an incredible

0:29

discussion.

0:30

>> What we are in is a commercial and

0:32

economic race. We haven't removed humans

0:34

from the loop. We've just changed where

0:35

they enter the loop. Everybody is so

0:37

myopic about this. I want to just like

0:39

shake the industry. There are going to

0:40

be more lawyers in the next 5 years than

0:42

we have today. The workflow needs to be

0:44

redesigned for agents, not for people.

0:46

The budget of tokens will have to move

0:48

out of it spend and into regular OPEX

0:51

spend. Is your job harder than ever?

0:53

>> Yes. If you're in software or

0:55

infrastructure or building agents, it's

0:56

a year of complete unrelenting

0:58

execution. Ready to go.

1:11

>> Aaron, dude, it's so lovely to have you on the show. You know that we have Rory on every week, and he's just like, "Aaron is the greatest." I'm not going to do his accent because I suck at them, but he's like, "the greatest." You can't easily do an Irish accent.

1:25

>> Well, you know, I can't really do the Irish accent so well, but that's right. Exactly. But you basically, I'm sure, bought Rory one of his houses, so no wonder he's grateful.

1:36

>> We got more out of Rory than he got out of us.

1:41

>> Exceptional man. But I wanted to start, and we were just chatting about this: I was running around the park listening to the Dwarkesh and Jensen episode, and I was like, I don't think Jensen came out very well. Do you agree with me that Jensen didn't come out very well from that episode?

1:55

>> I think this is the greatest Rorschach test of all time of where somebody is mentally on AI. I happened to see a bunch of the tweets before I watched it, so I was a little bit biased in advance, but if I hadn't seen any of the commentary and had just watched it, I would have been very confused by the commentary post-interview. And to be clear, I kind of jumped to the more salacious part, on China and that topic, but I'm probably 80% with Jensen, and my way of thinking through the logic actually runs much closer to his. You know, the idea that we're in some kind of existential race, where a month or two of advantage is going to change the total outcome of AI progress and what everybody does between us and China, I just don't agree with. I think what we are in is a commercial and economic race, obviously with safety built into that; there's no question. And I think we actually have a lot more power globally if it's our technology stack that's powering AI. So I'm more in the camp of Jensen on his lines of logic. And Dwarkesh kind of oversimplified a few components. He said, well, with Mythos, if we get early access to that, then we can go and upgrade all of our systems. And, with again great respect to Dwarkesh, upgrading software is a multi-year effort. So unless they somehow keep Mythos closed for the next decade, there's not some magical moment where you can just secure everything. This is ongoing, endless, until the end of time. You're always in this sort of leapfrogging between the defensive side and the offensive side. So I just don't think these things are as binary, and I'm more inclined to Jensen's view of that. And then Jensen had a really key point that didn't go viral. So maybe you could kick it off, but he had this little vignette, about 90 seconds in the whole conversation, where he said: we're going to do ourselves a disservice if we scare people out of engineering, if we scare people out of radiology, if we scare people out of healthcare, because they think all these jobs are going to get eliminated with AI. That is not helping us. It's doing a disservice to the next generation. It's doing a disservice to society as a whole. We don't yet know any way to use AI in a capacity other than augmenting our work, where we still eventually have to go and review the work in some form. Maybe you don't have to review the tiny little parts of it anymore; you can review a bigger part of the work product that happens. But we haven't removed humans from the loop; we've just changed where they enter the loop. And I think Jensen has a more pragmatic view of the technology. We should be very thoughtful about how we make these systems safe, but I much more land in Jensen's camp on the overall contours of the debate.

5:04

>> Oh Jesus, dude. You didn't leave me much. You didn't quite... Okay, first: a disservice by discouraging people from going into categories like radiology or engineering. Do you think you will have more engineers at Box in 5 years' time?

5:22

>> We will, and I think the part that everybody misses is that everybody is so myopic about this. I want to just shake the industry. We are so myopic and self-interested, and we think that the entire industry is the tech industry. When you go around the country or the world and you talk to a tractor company and a bank and a pharma company, and you ask them, "Do you think you have enough engineers to go and automate what is going to happen in your industry going forward?", they absolutely, unequivocally, universally always say no. And so what the breakthroughs of Claude Code or Codex or others are doing is making it so those companies can now actually do the same kind of engineering that Silicon Valley has been able to do. And so we are myopic because we think that tech is the only use of engineers, and tech is only, I don't know what the right number is, 8, 10, 12, 15% of GDP in the economy. What happens when 85% of the economy now gets access to engineering like tech has always had? That is what will happen. And so yes, maybe if you're graduating from, name your computer science school, today, you don't go immediately to Google. You go to literally John Deere or Caterpillar or Eli Lilly, but the skills that you have are going to be just as relevant, just in a different domain. You're not going to be building a little app with little buttons. You're going to be automating pharmaceutical research. You're going to be doing AI for the future of farming and industrial equipment. So we're just too myopic about how this works. And you can already start to see this playing out. There was a really funny FT article, which is that lawyers are being inundated by all of these AI responses that they're now getting from their clients saying, "Hey, can you review this contract, or can you review this memo, or can you look at this case?" Well, guess what happens when everybody thinks that they're a lawyer? Do you know what the ultimate constraint is? The ultimate constraint is the actual number of lawyers that are able to go and review all of this stuff being produced. So I would take the other side. There are going to be more lawyers in the next 5 years than we have today, because we've made it easy to generate legal content, but it has not gotten any easier to actually get any of that approved by any court system, or file a patent, or any of the things that law actually ends up relating to. So this is where I just differ from the rest of the industry.

7:47

>> Do you really think so? With the greatest of respect, we are seeing the eradication of lower-ranking legal positions.

7:54

>> And that is a different issue, which is: how do you do the next generation of mentorship and apprenticeship when AI does automate the maybe traditional tasks that those workers are doing? A big question, a big question facing every bank in the world, every law firm in the world, anybody who had a sort of apprenticeship model. I don't doubt that that's a real issue, but that's different from the constraints that all of this work ends up resulting in that you still have not been able to automate. We had a customer conversation two weeks ago, and this is just going to sit with me forever; I'm going to always have this example. They've automated, or they're working on automating, patient referrals: when you want to go and see the radiologist or the high-end doctor for whatever issue you have. They're automating that, which is awesome. So now you don't have to be on the phone for, you know, a week or whatever. Well, guess what: you can automate anything, but if an appointment is still 18 months out, your ultimate constraint is still the healthcare institution and the number of doctors we have, and actually the amount of real labor we have across those organizations. So yes, maybe you don't want to stake your career on being a frontline customer service rep in healthcare right now, but first of all, that same person will have a lot of other types of jobs that they'll have access to. And you still will end up having all of these other constraints that eventually we will need to produce more and more jobs to go and resolve. So automation is going to actually just force us to see the next set of bottlenecks in all of these industries, bottlenecks we didn't perceive that we had before, because everything was so slow and manual.

9:27

>> What job title does not exist today that will be incredibly prominent in 5 years' time?

9:34

>> Yeah. So I'm workshopping this, and a bunch of people are doing the same, so this is not my invention, but I'm workshopping it.

9:40

>> Aaron, you've got to take attribution. As a venture investor, it's all about coining a term. Okay, this was your original thought in the shower. Aaron Levie's. Share it with me.

9:51

>> I've been influenced by nothing I've seen online. This is all from me. So, there's some kind of, and who knows if this sustains as a full-time role or where it gets diffused into, I'm not 100% clear on that, but there is 100% a role right now that there are going to be 500,000 to a million jobs created for, and it's basically some kind of agent operator. And this person is actually going to need to be somewhat technical. They're going to have to be deep in the AI world. They're going to have to understand MCPs and CLIs, and they're going to have to know how to write skills. They're going to have to understand AGENTS.md files. It's going to be this group of people that will know how to go into your marketing team or your legal team or your operations team or your life sciences research team, and this is the person that is basically going to enable that function to get leverage from agents. And the problem that the real world has, which startups and frankly many of your guests don't understand, is that when you start a company from scratch, the world is your oyster, right? You can design your workflows however you want. There's really no risk if something goes wrong, because you don't have much scale to begin with. There's no real regulator calling on you to say, "Hey, are you doing things the right way?" It's effectively infinite upside in white space. When you go into a Fortune 1000 pharma company or bank or consultancy, that's just not the case. These guys are regulated. They have data fragmented across their organization. They have employees that are wired to do workflows a particular way. So there needs to be somebody that can basically say, "Hey, if we actually want to get real leverage from automation, we need to start to redesign the workflow, and the workflow needs to be redesigned for agents, not for people." And so what do you do when you reimagine a business process where the agent is now doing much more of the work than what the human used to do in that process? That means it's a very different sort of implementation cycle. There's real change management. You've got to get data organized in the right way. You've got to connect up systems in the right way. And guess what: the second a new model drops, your workflow probably breaks, because the way you prompt that agent is now different; there's a different way it wants its syntax handled. So it requires care and feeding, and a real level of technical and business-process acumen. So I think we're going to create an untold number of jobs that look like that. Some of those people will come from IT. Some will come from operations. Some will come from engineering, if you're in a maybe more technically inclined company. It's the next generation of: there's a limit to the amount of software you need to build that looks like an app on your phone; there's an unlimited amount of software you need to build that looks like a background system process connecting different data sources and automating workflows. That's where the work is going to go.

12:58

>> Well, this was really going to be one of my main questions, which is: Jensen very clearly said AI won't kill software, it will explode the amount of software needed. And when I thought about that, the thesis is obviously that you have this kind of core AI that crawls over 15 SaaS tools, and they really become databases that agents crawl on top of. Is that what it looks like? And are they not just valueless SaaS tools then?
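The "skills" Levie describes for agent operators have no single standard shape yet; as a minimal illustrative sketch (every name here is hypothetical, not any specific product's API), an operator might maintain a team's workflow as a prompt template plus an allow-list of tools, so that when a new model changes prompting conventions, only the registry entry needs editing:

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    """A reusable unit of agent behavior an operator maintains for a team.
    Hypothetical structure, for illustration only."""
    name: str
    prompt_template: str  # how the agent is instructed
    allowed_tools: list = field(default_factory=list)  # e.g. MCP servers, CLIs

class SkillRegistry:
    def __init__(self):
        self._skills = {}

    def register(self, skill: Skill):
        self._skills[skill.name] = skill

    def render(self, name: str, **kwargs) -> str:
        # When a new model drops and your workflow "breaks", the operator
        # edits the template here instead of in every downstream workflow.
        return self._skills[name].prompt_template.format(**kwargs)

registry = SkillRegistry()
registry.register(Skill(
    name="contract-review",
    prompt_template="Review the contract at {path} for {risk_area} clauses.",
    allowed_tools=["document-store-mcp", "clause-db-cli"],
))
print(registry.render("contract-review", path="/deals/acme.pdf",
                      risk_area="indemnification"))
```

The point of centralizing the template is exactly the "care and feeding" Levie describes: one place to adapt prompts and tool access per team as models change.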

13:20

>> Yeah, I'm sympathetic to that argument in some categories. I think there's some software where, because the person was the user of the software and they were clicking all the buttons, your ratio of buttons to underlying APIs was more in favor of buttons. And I'm oversimplifying, but there are some tools where you open them up and there are like 93 features that you're clicking around on, and the user has been so accustomed to exactly how to do that, that the software's value proposition was correlated to roughly that mass. In a world of APIs, and a world of agents being able to do more of the work that you used to do by clicking those buttons, the value again goes more to the API layer. So then the question is: how many APIs do you have? Not in a "you just need a thousand APIs" sense, but how robust and useful and proprietary are they, and how much business logic is embedded in those APIs, versus just calling a database and pulling a record? Does the API surround a set of business logic, like it actually secures the data, or it knows exactly which attributes each person should have access to inside the organization? At the end of the day, all software has a database behind it, so you could oversimplify and be quite reductive about that. But there's a lot of business logic in the layer above the database that software players have. If you're an ERP system, you're way more than a database at this point, because you've written a tremendous amount of business logic about how your supply chain should be automated and work, and how you should do accounting. None of that goes away. So then what changes is the user interface that either the user or the workflow is interacting with. The user interface might be that now you're just chatting with an agent. I think increasingly the right way to do this is some kind of agent in the background that's connecting multiple systems. So the user is maybe not even seeing half the value that's happening, but the agent is working across an ERP system, a CRM system, an HR system, a document repository, and then doing work across those systems. Which means the value proposition has to be: how good are your APIs, how well-designed are they, are they ready for agents, and then can you monetize that in some way that makes sense? And we are treating software too much like one gigantic monolithic industry, and it'd probably be better to have some kind of 2x2, which is: how much business logic is there, and how much human-to-agent collaboration does there need to be? The reason I bring that up is that the moment you have human and agent collaboration, you usually need something the user can pop into to experience the work that the agent did, and that probably doesn't go away so much. And then, when more agents are working on the software, which parts of the software do the agents need? They need those APIs even more than humans ever did. And I think there are a lot of categories of software where agents using the tools is actually a massive boon for the technology, as opposed to a dilemma.
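The distinction Levie draws, between an API that merely pulls a record and one that wraps business logic such as per-person access control, can be sketched with a toy contrast (the record store and permission model here are invented purely for illustration):

```python
# Toy contrast: a "database-style" fetch versus an API that embeds
# business logic (per-role, per-attribute access control).
RECORDS = {"inv-1": {"amount": 1200, "customer_ssn": "xxx-xx-1234"}}
PERMISSIONS = {"analyst": {"amount"}, "auditor": {"amount", "customer_ssn"}}

def fetch_record(record_id):
    # Raw database call: returns everything, no policy applied.
    return RECORDS[record_id]

def fetch_record_for(record_id, role):
    # API with business logic: filters attributes to what this role may see.
    # An unknown role sees nothing rather than everything.
    allowed = PERMISSIONS.get(role, set())
    return {k: v for k, v in RECORDS[record_id].items() if k in allowed}

print(fetch_record_for("inv-1", "analyst"))  # {'amount': 1200}
```

An agent calling the second function inherits the policy for free, which is why the business-logic layer, not the database underneath it, is where the value sits.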

16:19

>> Where will agents use the tools more than humans do, and those API calls become much more frequent?

16:25

>> Yeah, an easy one is just unstructured data. Agents are going to be this incredible consumer and creator of your unstructured data. They're going to read through every one of your contracts and generate all of your contracts. They're going to generate marketing assets. They're going to write reports for you. And so when it becomes trivially easy to generate all this new information, or have agents review it all, well, guess what: you still need a backbone that manages and coordinates and creates the guardrails for those workflows and all the agents doing that work. So we're about to see an explosion of unstructured data, as an example.

16:58

>> With the greatest of respect, Aaron, can I just interject?

17:01

>> 100%.

>> Does that increase the value of your business? When I think about that, I asked Aaron from Monday: if you become, you know, a data repository which agents crawl on top of, how do you retain value in that? I would ask the same to you, with respect.

17:15

>> 100%. It's the question on the mind of every investor on the planet right now. So we're used to it, and it's not a scary question. One thing that helps us is that we've always had an API, maybe not first, but equal, strategy. If I told you the number of API calls we did last year, or had you guess first, you'd probably be off by an order of magnitude. So the volume of API usage on our system is already enormous, and already outsized relative to any of the end-user interactions on the system. And that's just a virtue of content being used in a variety of applications and workflows that far exceed people opening up their Finder and uploading a document. An ERP system generates files. In a wealth-management portal, you have clients uploading documents into the portal, and they never see Box. You have workflows of invoice processing happening behind the scenes. So the headless version of Box has been alive and well almost since the day we started the company, and agents to me just represent a force multiplier on that. So it's actually an exciting proposition for us. We already know how to monetize it. The question is: will the exact dollars and cents be the same between an agent user and a previous application user? We don't know. But we do know that if the number goes up by 100x or 1,000x, that's actually more opportunity for us in the future. Now, that's not the same for all software providers. But for where we sit in the workflow, you generate a document, it needs to go somewhere, you have to secure it, you have to protect it, you have to govern it over the long run; that's just more data going into our platform, and that's why it's just all upside for us.

18:57

>> You said "secure and protect it." We mentioned our mutual love for Rory O'Driscoll. I do a show with Rory and Jason every week, and Jason has bluntly said that this will be the golden age for cybersecurity, because the security threats are going through the roof. Are you concerned with the system vulnerabilities and the security threats that are coming with AI, and what do we not know about security that we should know?

19:18

>> I am concerned, but not in any kind of newly concerned sense. This, to me, was priced in the moment we were generating code with AI. If you can generate code, you have two problems. One, you're going to generate way more code than anybody's ability to review that code. So, starting with GitHub Copilot six years ago, or whatever the date was, five years ago, that was just priced in: as soon as AI writes most of the code, and then 90% of the code, and then 95% of the code, then by volume we're just going to produce this unbelievable amount of code, and any change in a system is a risk. Everybody kind of thinks about security as: is there a zero-day where there was an unpatched component of your technology, or somebody found a clever new package that you could slip into. But every time you ship a new feature, you have a chance of a security vulnerability, because the AI could have written in, "Oh, we want to actually open up that port in the system because we need to do something," and maybe that was the wrong decision for the agent to make. So we're going to be living in this new world of cyber risk in the form of using agents more. And then, on the other side, obviously, if you have the offensive side able to use AI, probably more in the form of open models and whatnot, then they can find more vulnerabilities, because they can scan across the internet far faster than before. So you actually have two new forms of risk in the development process, and you only have one benefit, which is that agents can also review the code and try to keep it secure. So it's going to be a very dynamic period. For better or worse, agents are the solution to the problem that agents have caused, and that's why there's going to be a lot of money made in agentic security as well.

21:12

>> You said agents are the solution to the problem that agents have caused. It almost reminded me of when Jensen went on TV and was like, "Oh, every engineer should be spending..." I can't remember the amount. I think it was either 250 or 500,000.

21:24

>> Yeah, or like half the salary, essentially.

21:26

>> Yeah, yeah. And it's kind of like, you know, drug dealer: "You should buy drugs." Well, okay. No [ __ ]

21:33

>> Again, obviously, listen, we love Jensen for that level of grandiosity and charisma. But whether he's off by half or not, directionally the idea is actually pretty salient, which is: you're going to be spending more on compute per person in the future than you ever thought, and than you certainly are today.

21:55

>> What percentage of salary are you going to spend on compute at Box in five years?

22:01

>> Great question. I don't think we've modeled that out to five years, and, obviously, the joy of being public...

22:08

>> Well, this is your chance.

22:09

>> No, no, totally. I was told not to model long-term financial projections on podcasts. So let me... Yeah, it's a weird SEC financial thing.

22:18

>> So boring.

22:20

>> Don't ever go public if you don't want to model on podcasts. This is why the Collisons don't. Everything else is great. They just didn't podcast. Yeah. Cheeky P.

22:29

>> I don't know if you'd be able to pin Patrick or John on the same question for their five-year view, but, you know, it'll be a larger number for sure than it is today.

22:37

>> I'll smash them with four tequilas and then ask them. You said one of the observations in your very viral tweet was about token maxing and token allocations within enterprises. I'm really intrigued: how do you think about advising CIOs on token allocation, token maxing, what we should know that we don't know? How do you think about that?

22:57

>> This one's tough. It's it's it's you

22:59

know, the the general advice will end up

23:02

sounding kind of like um uh you know,

23:04

kind of generic by definition. Um it you

23:07

know, usually I mean it's going to have

23:08

it's going to have something to do with

23:10

with your tokens will have to correlate

23:13

to where there is the most amount of you

23:15

know, value generated for your company.

23:17

Like like most bland statement of all

23:19

time, but just obviously has to be true.

23:21

So in the software industry, we're into

23:23

token maxing because guess what? Like

23:25

generally the value proposition of your

23:27

company will correlate to how much

23:28

software can you produce. And so so you

23:31

know if you're trying to drive a lot of

23:32

change and you want to make sure

23:34

everybody's shipping lots of soft

23:35

software and you want to be able to

23:36

teach the best practices faster, then

23:38

token maxing and leaderboards are an

23:40

interesting way to do that. I you know

23:42

it's not obvious that you're going to

23:43

see that across every industry. Um uh

23:46

we've we've seen a couple of interesting

23:47

examples. One company had this sort of

23:49

like Shark Tank pitchathon type thing

23:51

which is you know teams have to show up

23:53

and they have to go pitch for

23:54

compute, you know, token budget, and then

23:57

you kind of allocate it in some central

23:58

fashion like a VC would and then you

24:00

sort of you know I don't know their

24:02

exact interval but I would imagine you

24:03

review that 3 months 6 months in being

24:05

like okay did you get the upside that

24:07

you thought on that token usage. So

24:10

that's an interesting one. I think

24:12

another company had a kind of

24:14

a view of, you know, some

24:16

kind of like natural stratification of

24:20

you know 5% of your users are doing the

24:22

most valuable things 20% are doing the

24:25

next tier of most valuable things and

24:27

then everybody else is sort of doing

24:28

general productivity I'm making up their

24:30

numbers but but the idea would then be

24:32

like well for that five or 10% give them

24:34

the best models with unlimited

24:36

capacity. For the next 20%, have

24:39

some limits, maybe a little bit

24:41

more efficient of a model and for

24:43

everybody else it's sort of like we're

24:44

going to just use the cheapest thing on

24:46

the market. It's not going to be

24:48

like the game changer of the employee

24:49

productivity. Um and so I think

24:51

everybody's kind of working their way

24:52

through this. The part where Silicon Valley,

24:54

again, sometimes has a, let's

24:56

just say, positively

25:02

naive view is: the real world

25:04

has budgets, and they have

25:06

annual

25:07

budget planning cycles because

25:09

they have EPS numbers they commit

25:11

to Wall Street and so you don't get to

25:13

just be like oh we're going to token max

25:15

across the enterprise where everybody

25:17

gets unlimited token budgets because

25:18

obviously then that company would just

25:20

miss their earnings throughout the

25:22

year. So you have to like wait for the

25:24

earnings. You have to wait for the

25:25

budget cycle. You have to figure out

25:26

what teams are most

25:28

interested and have the best use

25:30

cases. That's a natural

25:31

journey. One final

25:33

bookmark, um, that I think

25:36

is well understood now at this point

25:38

is the budget of tokens will have to

25:40

move out of IT spend and into regular

25:44

kind of OpEx spend. This can't be

25:46

treated like a oh, you know, I'm going

25:48

to trade off between Salesforce licenses

25:50

or compute tokens. It's

25:53

going to more be I'm going to trade off,

25:55

you know, this next marketing campaign

25:57

uh and and instead I'm going to go and

26:00

drive more automation in our in our

26:01

marketing engine. It's going

26:02

to be that kind of set of trade-offs.

26:04

>> What happens to that token budget when

26:07

it transitions to that different spend

26:09

category? Um well first of all it goes

26:11

up because, well, because IT

26:14

spend as a percentage of revenue of

26:16

large enterprises is

26:17

>> but is this the same as the kind of

26:19

classic VC blog post which every [ __ ]

26:21

firm has written which is like you know

26:23

AI it's moving from software budgets to

26:26

labor budgets and every partner goes and

26:29

likes the tweet and there's like no

26:31

[ __ ] [ __ ] like really.

26:34

>> Yeah. Uh I mean if you do it in that

26:35

voice it sounds kind of

26:38

maybe, you know, simple, but

26:40

yeah, that's just

26:42

like a very big deal in technology.

26:43

There's never, or at least rarely,

26:45

been a technology that you could

26:47

sell into an enterprise where you

26:49

weren't capped by that company's

26:51

corporate IT budget. And so now for the

26:53

first time ever, you have a technology

26:54

where you can go into the line of

26:56

business and you can say, I can now

26:58

offer you a new tool in the form of

27:01

an agent that will augment a workflow

27:04

that will make you 50% or 100% more

27:06

productive. And so maybe I should be

27:08

able to get 5% of your opex budget this

27:11

year to go and do that. Like that that

27:13

is a new budget to tap into. And I don't

27:16

think it, you know, 10xes the size

27:18

of IT spend or technology spend

27:20

globally but it certainly doubles it.

27:22

>> I mean current enterprise technology

27:24

spend is estimated between 10 and 12%.

27:26

To see that going to like 20% as you

27:28

said there is like I think relatively

27:30

feasible. You said about kind of

27:31

companies being like based on earnings

27:32

per share and actually having budgets

27:35

that they have to adhere to. Very

27:36

strange not to have venture funded

27:38

companies.

27:38

>> Yeah. They don't have a limited VC to go

27:40

and solve this.

27:41

>> Can't we just go to our venture investor

27:43

and ask for more money? Um, the one

27:45

thing that I worry about is we see this

27:46

insane demand side pull. Every company

27:49

in the world needs an AI story. Everyone

27:51

wants to kick the tires with something.

27:53

And I think we project the same demand

27:56

side pull and extrapolate it

27:58

continuously. Do you worry that we are

28:01

in a momentary 18-month period on the

28:03

demand side pull and that may not always

28:06

be lasting?

28:07

>> You know, it's very possible I should be

28:09

more sensitive to that. Um, but I

28:14

would take the opposite side of

28:16

that particular wager at the moment

28:18

because um partly because we I already

28:21

saw one diffusion cycle with cloud and

28:24

actually how long that ended up taking

28:26

and the kind of

28:28

spiky early nature. You would have just

28:30

been like, oh my god, this is on

28:33

fire, and how could this last?

28:35

and 20 years later it lasted and got way

28:38

bigger than we ever realized. If it

28:40

works, the market's always larger than

28:41

you ever think. And um, then the only

28:44

part why 18 months is like

28:46

not even a relevant window to me is I

28:48

think diffusion is going to take longer

28:49

than Silicon Valley thinks. And it's

28:51

back to the very first kind of that new

28:53

role idea. When you go to most

28:54

companies, they can't yet just deploy an

28:57

agent to do you know full uh you know

29:00

financial proposals for all of their

29:02

their clients without a human reviewing

29:04

the thing. And because the SEC will just

29:07

show up and be like, "Hey, you

29:09

just gave this person bad

29:11

financial advice and you're going to

29:12

lose your license." That will

29:14

just start to happen kind of across the

29:16

board. And so that's

29:19

why, you know, people take time. That's

29:21

why there's a lot of regulatory

29:23

controls and compliance teams, security

29:25

teams have to figure this out. That just

29:27

takes time in the economy.

29:28

>> I had, I think, Matt Fitzpatrick

29:31

from Invisible, which is like a Turing

29:33

or a Mercor competitor. Okay. And

29:35

>> he said you cannot sell into enterprise

29:37

without an FDE (forward-deployed engineer) model. It is impossible.

29:39

>> I mean it rounds to being true. So

29:43

>> yeah,

29:44

>> super interesting to hear that because

29:45

we're seeing like the rise of oh we go

29:47

PLG and then we like seep up into

29:50

enterprise.

29:51

>> Well, sorry, I wouldn't

29:54

think of those as mutually

29:55

exclusive, for what it's worth. I guess

29:57

what I'm saying is when you think about

29:59

adoption within the largest enterprises,

30:01

aren't AI services companies the best

30:04

positioned companies of the next 5

30:06

years?

30:07

>> As in you're saying like traditional

30:08

professional services?

30:10

>> Yeah, I'm saying Accenture's AI team

30:12

that come into Bank of America.

30:14

>> No, 100%. These spaces are

30:16

going to be again both bigger and more

30:18

sustainable and robust than people

30:20

realize. Back to the

30:22

myopic thing. We're so myopic. We're

30:23

like AI will replace all of this stuff

30:25

because it just does it for you. And

30:26

it's like, I'm trying to think of

30:29

maybe my most recent

30:32

experience with the best models in the

30:33

world. I probably had to go and change

30:35

15% of the thing that was

30:38

the output. And so

30:42

we're nowhere near eliminating the

30:44

human from the workflow. And so in a

30:46

world where you don't eliminate the

30:47

human then there's a lot of like real

30:50

change management of like where should

30:51

the human enter that business process?

30:53

How would you want to review that that

30:55

work output? How do you wire up your

30:57

systems to make them effective for a for

31:00

the agent and human collaboration? How

31:02

do you connect all of these data sources

31:04

together? One thing that we see is

31:06

um you know, if you wanted an agent

31:09

right now in a Fortune 500 company to go

31:12

and give you an answer to where is

31:16

the most risk you have in your upcoming

31:18

renewals for your contracts.

31:20

that agent might find 10 different

31:22

systems that contain contracts in them.

31:25

And half those systems will be like

31:27

legacy technologies that don't work well

31:29

with the agent. They're kind of low

31:30

throughput or maybe you can't even wire

31:32

them up. They're on network file shares.

31:33

They're in legacy document management

31:35

systems. So, first of all, half your

31:37

data estate is not even ready to work

31:39

with the agent. The other half of the

31:41

data estate is probably fragmented

31:42

because you have two decades of

31:44

employees bringing their own tools. And

31:46

so the agent will just go and find the

31:47

wrong document or the wrong contract or

31:49

the wrong piece of data because you

31:51

never really cared to have some kind of

31:53

standardized system for your contract

31:55

because people could just always go and

31:57

find what they were looking for. Agents

31:59

can't do that. I mean, they'll find

32:00

what they're looking for, but they'll

32:02

just as often find the wrong thing as

32:03

the right thing. So they have to be

32:05

targeted. They have to have that

32:06

information get curated. They need to

32:08

understand the context of what is the

32:10

process that they're doing. That is like

32:12

what I just described right now is 10

32:14

years of work for Accenture in every

32:16

enterprise on the planet, or the next-gen

32:18

Accenture that does this in particular

32:20

industries or workflows like we have to

32:22

go upgrade your systems. We have to

32:24

start to understand and organize your

32:26

data in the right way. We have to start

32:28

to describe these workflows to the agent

32:30

itself. We have to figure out where the

32:31

human is in the process. That is just

32:33

real change management that every

32:35

organization will have to go through.
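The curation step described above, where an agent must be restricted to systems that are both agent-ready and the designated source of truth rather than crawling every legacy store and surfacing the wrong contract, can be sketched roughly like this. The system names and fields are hypothetical, just to make the filtering concrete.

```python
# Hypothetical sketch: filter an enterprise's contract systems down to the
# ones an agent should actually be allowed to query. Names are illustrative.
from dataclasses import dataclass

@dataclass
class ContractSource:
    name: str
    agent_ready: bool      # has a usable API / acceptable throughput
    source_of_truth: bool  # the curated, canonical repository

def curated_sources(sources):
    """Keep only systems an agent should be permitted to search."""
    return [s for s in sources if s.agent_ready and s.source_of_truth]

# A fragmented estate like the one described: legacy shares, a legacy DMS,
# a modern contract-lifecycle system, and ad-hoc team tools.
inventory = [
    ContractSource("network-file-share", agent_ready=False, source_of_truth=False),
    ContractSource("legacy-dms", agent_ready=False, source_of_truth=True),
    ContractSource("clm-system", agent_ready=True, source_of_truth=True),
    ContractSource("team-wiki", agent_ready=True, source_of_truth=False),
]

print([s.name for s in curated_sources(inventory)])  # ['clm-system']
```

The design point is the one made in the conversation: humans could tolerate the fragmentation because they knew where to look; an agent needs the targeting encoded explicitly.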

32:36

>> We also have to have someone to blame.

32:39

>> 100%. This is why a lot of these

32:41

industries last, which is like I have

32:43

lawyers, not because I can't necessarily

32:45

write an NDA. It's because it's your

32:47

freaking fault if anything goes wrong.

32:49

>> Yes. No, literally. And we don't

32:51

know. Like, I promise you,

32:54

you're not going to be able to blame

32:55

Anthropic when something goes wrong. And

32:57

so if you can't blame Anthropic when

32:59

something goes wrong, then at some

33:02

point it doesn't really work to tell

33:04

your

33:06

customer, well, that sort of system

33:08

that we set up screwed up your data or

33:11

it automated something the wrong way or

33:12

create a security vulnerability because

33:14

the company will just say, well, I'm

33:15

never working with you again. So then

33:16

you have to have some accountability in

33:18

your own organization for who is liable

33:21

when something goes wrong. And

33:23

and the moment you have to have any

33:25

liability, you have to have some amount

33:26

of ownership and accountability, and

33:29

people have to have, sort of,

33:31

you know, they have to roll up to

33:32

somebody who has more liability and more

33:34

ownership and more accountability. Like

33:35

this hasn't really changed the

33:37

fundamental pattern of human behavior

33:40

and contract law and, you know, the

33:43

regulatory regimes that everybody's a

33:45

part of. We've just sort of given

33:48

our computers a machine gun to go

33:50

generate way more information and work

33:51

with all of our data.

33:53

>> You said before when I've tried the

33:55

latest model, it's got like 85% of the

33:57

way there. I speak to many of the best

33:59

early-stage and more mature West

34:02

Coast-based companies and they say,

34:03

"Hey, we use frontier models to set

34:05

where we can be and then we use

34:08

open-source Chinese models to get as

34:10

close as we can to that frontier

34:12

benchmark." Is Silicon Valley being

34:15

funded by a generation of CCP-funded

34:18

open models?

34:19

>> I mean that that must be kind of

34:21

empirically true. Um, I don't

34:25

have the same kind of like uh oh that's

34:27

so scary you know kind of element. Now

34:29

obviously, again, holding out some

34:31

element of risk of some backdoor

34:34

weights that can get triggered at

34:36

some moment or some parameters but like

34:38

I'm just, that's

34:41

not how I'm perceiving it. But

34:44

also, yeah, I would

34:46

say that's kind of orthogonal to my

34:48

point about like the best frontier model

34:50

still will go and do the wrong thing. Uh

34:52

and so thus I have

34:54

to be in the workflow loop to make sure

34:56

that I review its work.

34:58

>> You know, as a venture investor I

35:00

specialize in making bold statements

35:02

with little substantive evidence. Um

35:06

>> it's worked for the greats. So

35:08

>> do you know what I'm just following

35:09

their lead. Jason Lemkin, my dear friend,

35:12

says, "Why has no public company created

35:16

any good agent product? Everyone creates

35:19

60% [ __ ] agents, but he's like the one

35:22

person who's done it, Palantir, and no

35:24

other public company has created a

35:26

sufficiently good agent product." Why is

35:29

that?

35:29

>> Um, you know, I don't know that I can

35:31

fully endorse the point, but I'll

35:33

give you the answer, because I would

35:36

argue our agent is sort of the best

35:37

agent for working with content. You

35:39

know, this is a very fast-moving space

35:42

and you have to be kind of wired in at a

35:44

level that I don't think you've

35:46

ever had to be wired in, in tech. Um,

35:49

like I am, you know, and and the

35:52

information sources aren't the classic

35:54

ones. It's not the rollup

35:57

review two weeks later from your

35:59

traditional news publication that is

36:00

going to give you any kind of alpha.

36:02

It's the practitioner who's

36:04

literally the engineer at

36:08

the, you know, agent sandbox company and

36:11

their long-form article on how

36:13

they are handling, you know, memory and

36:15

the harness, and, like,

36:19

if you're not wired into that ecosystem,

36:22

it's very hard to then have your team

36:25

you know be at the kind of forefront of

36:27

all of what is happening. And so it just

36:29

is a different pattern than what

36:31

we've ever had to do. Like like you know

36:33

COVID was pretty crazy, like we all had

36:35

to kind of like hunker down and be

36:37

paying attention to daily news cycles on

36:39

on COVID stuff. Um, but it wasn't like a

36:42

tech problem like it wasn't hard

36:43

technologically. Um, so

36:45

there's not been a moment

36:47

before where the speed of change and

36:50

responsiveness you have to have is quite

36:53

literally on a

36:56

multiple-times-a-week cycle. Is your job

36:58

harder than ever?

36:59

>> Yes.

37:00

>> Because of that speed of transience of

37:02

superiority of technology.

37:04

>> Yes. Uh, you basically have this

37:07

component of, one, there's a

37:09

tsunami of change that you can just feel

37:12

and so you're like, okay, we got to

37:14

like run faster than ever before. And

37:16

then there's just

37:18

like the pure technical underpinnings

37:20

which some of it has business and

37:22

strategy implications some of it has

37:23

product implications. Some of it has

37:24

partner ecosystem implications.

37:27

uh because of that tsunami that you have

37:29

to very quickly kind of wire up what you

37:31

are doing about that shift, and

37:34

where the market is going. Uh at the

37:36

exact same time you have to also be like

37:38

you know find a way to be a bridge for

37:39

your customers that, you know, also

37:42

don't want to get crashed into by

37:44

the tsunami, and want to be able

37:46

to have a bridge into the

37:47

future. And so you're

37:48

juggling a lot right now.

37:50

>> You said your agent product is the best

37:52

product. Again, Jason and Rory said

37:53

this, and Rory might kill me for this

37:55

because he gets a little bit more

37:56

sensitive about when I quote him or

37:58

misquote him more appropriately, but

38:00

like he basically says if, and this is

38:02

Jason again, if you can't like charge

38:04

way more for your agent product

38:07

then Wall Street doesn't give a [ __ ].

38:09

You have to reaccelerate revenue

38:11

with agent products, can you charge

38:13

significantly more for an agent product?

38:15

The answer is yes, but there's a

38:18

little bit of nuance, which is our

38:20

business model is we have a new plan

38:22

tier that we just introduced last year

38:25

that basically houses our you know best

38:28

workflow capabilities, our business

38:30

automation, you know, uh our application

38:32

development capabilities and then the

38:34

agent is sort of central to that because

38:36

it's going to help you automate the work

38:37

that you're actually doing with

38:38

your content. So it'll read a document

38:40

and extract metadata from it. it'll

38:42

process information inside of

38:44

a workflow. So that is actually causing

38:47

a reacceleration of our revenue

38:49

growth. Last year we saw an

38:51

inflection in our revenue growth. And so

38:53

it's already happening in our

38:55

business. Um, and so we are

38:57

doing the thing that I think Rory is

38:59

sort of probably saying is the new

39:00

benchmark. Now to be fair to what's

39:03

what's happening though is I think Wall

39:04

Street still is sort of saying we kind

39:06

of need to just step back and see where

39:09

everybody lands in this because of how

39:11

much change there is. So this is

39:14

very much a year where if you're in

39:16

software or infrastructure or building

39:18

agents, it's a year of

39:20

complete unrelenting execution.

39:22

>> Do you look at the ticker?

39:24

>> Yeah.

39:25

>> Every day.

39:26

>> Yeah.

39:30

But I was like a day trader.

39:31

>> I've never met a public

39:33

company CEO who hasn't. The Nan CEO was

39:36

on the other day and he's like multiple

39:37

times a day. Multiple multiple.

39:39

>> Yeah. No, 100%. But like,

39:42

partly I just have, you know, ADHD

39:45

or something, and so I just need, I'm

39:46

like, will we look back on this

39:49

period and be like, what the [ __ ]

39:50

Companies trading at three times cash

39:52

flow like way over exaggerated or not?

39:55

>> Well, three times cash flow is very

39:58

much overexaggerated. I would say that

40:00

we're in a period right now where

40:03

basically the market is being treated

40:05

roughly as a kind of

40:08

indiscriminately

40:10

bucketed sector, and

40:12

the next year two years or whatnot

40:14

you'll start to see some separation and

40:16

parsing between the companies because as

40:18

I noted in the beginning agents will be

40:20

really good for some parts of software

40:22

and agents will put pressure on others

40:24

other parts of software. And

40:25

it'll mean some companies have to fully

40:27

pivot and some companies can just sort

40:29

of ride their wave and and if they

40:30

respond, you know, effectively, like

40:32

clearly 3x free cash flow is, like,

40:34

you know, that seems like

40:36

aggressively low territory. But I also

40:38

think that at times in software things

40:40

have been aggressively overvalued um

40:42

beyond the realm of likely what

40:45

the terminal value is of a particular,

40:48

you know, category or company as

40:50

well. So I think

40:52

there's just a pendulum that needs

40:54

to kind of find its equilibrium right

40:56

now. Um, and that'll play out

40:58

over the next year.

40:59

>> Okay. I'm going for spicy. Uh I I

41:02

interview most of the public company

41:04

CEOs that you know and I know and Jason

41:06

Lemkin said on the show, if Aaron Levie is

41:08

the best, which you are, no, but

41:11

you're a phenomenal AI-first mind and

41:13

leader. Uh to just roll with me on this

41:16

and because it gets worse. Sorry. And

41:18

even he is struggling.

41:19

>> Wait, why am I struggling?

41:21

>> Well, I mean

41:22

>> I just said I'm tired. I mean, Wall Street

41:25

does its thing, like,

41:27

we're not, like, we're cranking

41:29

>> dude I it's Jason blame him

41:32

>> but like, I do think that

41:34

there's a little bit

41:35

>> I think this generation of CEOs that you

41:37

have around you though is equipped for

41:39

the AI transformation that is ahead

41:42

because, like, I think I'm not

41:45

blowing smoke up your ass you are you're

41:46

so versed in this you're so fluid but a

41:49

lot are like now one said to me the

41:51

other day no we don't have the AI chops

41:53

in house, we might need to bring it in.

41:55

>> This one's hard. I think you

41:57

still have a lot of kind of founder or

42:00

tech-forward, you know, whether

42:03

they were an engineer or just they're

42:05

just very technical, you know, category

42:07

of folks that are pretty dialed in

42:09

and, like, I have Slack channels

42:12

and WhatsApp groups where people on

42:14

the weekend are just working

42:16

with Claude Code or Codex building stuff

42:18

and they're public company CEOs. So

42:21

they are clearly wired

42:23

in, tapped in, they can feel the

42:25

technology and they are not going to let

42:27

their company lose. Um, you know,

42:29

assuming that that as a as a category

42:31

they're in a spot where where there's a

42:33

lot of upside. Um, so yeah, but like

42:35

every technology wave, there's

42:37

winners and losers. I don't know that

42:38

this won't be any different. Um, I

42:41

just think that you just have

42:44

to be super dialed in and

42:45

work through it.

42:46

>> Hard one before we do a quick fire.

42:48

Okay. Who has the world turned their

42:49

back on who you think should be much

42:52

more appreciated?

42:53

>> You know, I'll give

42:55

maybe a shout out to Atlassian

42:57

as an example. Um, I

43:00

think that feels like oversold

43:03

territory. You know

43:05

>> you think 78%'s a bit harsh.

43:07

>> I think

43:09

possibly. And, you know, it's

43:11

in the category of, like,

43:14

they've been fighting this

43:16

narrative, and I'm not going to speak too

43:18

much for them but like I think what I

43:19

perceive is, oh no, engineering gets

43:22

commoditized, and so where in the

43:23

stack was their engineering, you know,

43:25

revenue generation. And again, with my

43:28

headset I'm like no there's going to be

43:30

more engineers and so now does that mean

43:32

that Atlassian's product set will

43:34

look exactly like it does today? No,

43:36

like obviously it's got to evolve and

43:37

whatnot but but I think if you're like a

43:39

company selling infrastructure for

43:41

engineering to be more automated. Uh

43:44

like that seems like a good spot to be

43:46

in. And you know, you look at what

43:48

Linear is doing and it's fantastic

43:50

and it's awesome to watch. Um, but I

43:52

think there'll be, you know,

43:53

multiple plays in that space just given

43:55

how big the market is. Um I think right

43:57

now this is a moment where you need to

43:59

be deep in the workflow and you need to

44:01

have data. You have

44:03

to have data in your platform

44:06

and you have to be the best place for

44:08

that data to go and you have to be the

44:09

best place where agents want to work

44:10

with that data. That's like the mandate

44:12

right now is if you are not the best

44:14

place that an agent would

44:17

sort of intentionally choose for working

44:20

with data of that particular category or

44:22

automating the workflow in that

44:23

particular you know area. That's a tough

44:25

spot to be in, and that's the

44:27

job for all of us. You know, if you're

44:29

building software

44:30

>> and the best place where agents want to

44:31

work is defined by great API,

44:33

>> great APIs, great pricing models, um,

44:37

uh, like the surrounding

44:39

features to the API. So, if you were to

44:42

say, hey, I want to be able to wire up a

44:45

workflow where this is a, you know, the

44:47

Box sales pitch. I want to be able to

44:48

wire up a workflow where an agent is

44:50

interacting with FINRA compliant

44:51

documents, where FINRA

44:54

compliant means the thing gets

44:56

generated or seen or shared with the

44:58

customer and it can't ever be deleted

45:00

or removed for a certain

45:02

amount of time. Then, on one hand, the

45:04

APIs have to be super clean for the

45:05

agent on the other hand you have to have

45:07

a bunch of surrounding capabilities to

45:08

ensure that that company can go to,

45:11

you know, their regulator or

45:12

auditor and say, yeah, we are complying

45:14

with FINRA. So that combination is what

45:16

makes it so that you would build that

45:18

kind of agent on something like Box and

45:20

that that persists across a variety of

45:21

industries.
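The retention constraint in that pitch (once a document is generated or shared with a customer, it cannot be deleted for a set period) could be enforced by something like the following. This is a hypothetical sketch with made-up class and method names, not a real Box or FINRA API.

```python
# Hypothetical sketch of a retention-enforcing document store: sharing a
# document starts a retention clock, and deletion is refused inside the
# window. Names and structure are illustrative assumptions only.
import datetime as dt

class RetentionStore:
    def __init__(self, retention_days: int):
        self.retention_days = retention_days
        self._docs = {}  # doc_id -> timestamp when shared

    def share(self, doc_id: str, now: dt.datetime):
        """Record that a document was generated/shared; retention starts."""
        self._docs[doc_id] = now

    def delete(self, doc_id: str, now: dt.datetime):
        """Refuse deletion until the retention window has expired."""
        shared_at = self._docs[doc_id]
        if now < shared_at + dt.timedelta(days=self.retention_days):
            raise PermissionError(f"{doc_id} is under retention; cannot delete")
        del self._docs[doc_id]

store = RetentionStore(retention_days=7)
t0 = dt.datetime(2025, 1, 1)
store.share("proposal.pdf", t0)
# Deleting inside the 7-day window raises; after the window it succeeds.
```

The point of the surrounding capabilities he mentions is exactly this: the clean API is for the agent, while rules like the one above are what let the customer face their auditor.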

45:23

>> I'm going to do a quick fire around with

45:24

you. Uh you have to go and be a public

45:27

company CEO. I know. Um so what have you

45:29

changed your mind on in the last 12

45:31

months most significantly?

45:33

>> I do think that I've become

45:37

more convinced that software is headless

45:40

in the past year than I was maybe three

45:42

years ago. And it's because of the

45:45

level of agentic capabilities on tool

45:47

calling and searching across systems and

45:49

the accuracy of that. And that

45:51

has happened faster than I would have

45:54

uh perceived. So two to three years ago,

45:57

if you were to kind of, you know, wire

45:59

up an agent and tell it, hey, go work

46:00

inside of Box and find a document to

46:02

work with and do some process, it would

46:05

basically almost always find

46:07

the wrong document and it wouldn't be

46:09

able to handle actually like cracking

46:11

open the file and reading through it.

46:13

And so thus, you know, going headless

46:15

wasn't sort of the most urgent

46:17

priority uh from an agentic standpoint.

46:20

And in the past year, those capabilities

46:22

have just absolutely accelerated to the

46:24

point where I'm fully convinced that

46:26

you just have to be, you know,

46:28

headless first as a software platform.
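A toy sketch of what "headless-first" can mean in practice: the platform's capabilities are exposed as tools an agent calls directly, with no UI in the loop. The tool names, the tiny in-memory corpus, and the dispatch shape are assumptions for illustration, not Box's actual API.

```python
# Minimal tool-calling sketch, assuming a model emits calls shaped like
# {"name": ..., "args": {...}}. All names and data here are hypothetical.
TOOLS = {}

def tool(fn):
    """Register a function as an agent-callable tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def search_documents(query: str) -> list[str]:
    # Stand-in for a real content-search endpoint.
    corpus = {"q3-contract.pdf": "renewal terms q3", "notes.txt": "meeting notes"}
    return [name for name, text in corpus.items() if query in text]

@tool
def extract_metadata(doc: str) -> dict:
    # Stand-in for a metadata-extraction capability.
    return {"doc": doc, "type": "contract" if "contract" in doc else "other"}

def dispatch(call: dict):
    """Execute a model-emitted tool call against the registry."""
    return TOOLS[call["name"]](**call["args"])

result = dispatch({"name": "search_documents", "args": {"query": "renewal"}})
print(result)  # ['q3-contract.pdf']
```

The accuracy improvement he describes (tool calling, cross-system search, reading files) is what makes exposing the platform this way, rather than through a UI, the priority.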

46:31

>> What acquisition did you not make that

46:33

you wish you had made over the Box

46:35

journey? Jensen said in the show, "Oh, I

46:38

wish we'd invested in frontier models."

46:40

That was my big mistake. What

46:41

acquisition did you not make that you

46:43

wish you had done?

46:44

>> I honestly don't think I have

46:46

any M&A regrets. Actually,

46:50

it's the deals that I wanted to do

46:53

that we ended up not doing that I

46:56

don't regret, um, is probably more the

46:58

situation. So

47:00

>> which one is which one is that?

47:01

>> I'm not going to tell you those. But

47:04

there are some where left to my own

47:05

devices I would have done and I look

47:08

back and I'm like oh thank god that

47:10

there was more rational

47:11

logic in the process.

47:13

>> Who is going to win the enterprise race

47:15

OpenAI or Anthropic? Oh god, that

47:17

that's impossible. Back to the cloud

47:19

piece, you know, I think it's

47:22

it's totally fair to think about it as a

47:23

race and certainly if if you're in

47:25

either of those companies, you have to

47:26

treat it like a race, because you

47:28

know like you obviously want 80% market

47:31

share, not 55% market share. So like you

47:33

have to treat this as a you know we got

47:35

to dominate. Uh, that's exactly how they

47:37

should be executing. That way, everything

47:39

is going according to plan if you

47:40

compare it to other areas of compute.

47:43

Um, and I ran this analysis recently:

47:46

in 2010.

47:48

2010, not like, you know, maybe you were

47:51

12, but like the rest of us, we were

47:52

just like in companies doing things. Uh,

47:55

in 2010, AWS made $500 million in

47:58

revenue. Azure had just launched, and

48:02

GCP was called Google App Engine, and

48:05

it had a little like a turbine logo with

48:07

like wings or something. So that was the

48:09

state of cloud. Fast forward to this

48:12

year and it's a couple hundred billion

48:13

dollar a year revenue ecosystem, right?

48:16

So, in 15 years, right? And we were

48:19

in that moment being like who's going to

48:21

win, AWS or Azure or GCP? How's

48:23

this all going to play out? And it just

48:25

turns out the market was so large like

48:28

obviously it was due to their execution

48:30

that they kept it going and kept it

48:31

large and the competition kept up, but

48:33

it it just didn't really matter. Like

48:35

everybody everybody kind of won. And so

48:37

I sort of think of AI in a similar

48:39

fashion, which is, I can't predict if

48:42

it's going to be OpenAI 60% and

48:44

Anthropic 40%, or it gets flipped, or I'm

48:46

off by another 10% here or there. But

48:49

no matter what, these markets are

48:51

just fantastically large. Companies are

48:53

going to adopt multiple of these

48:55

systems. They don't want to be

48:57

single vendor. They

48:58

don't want a single vendor in this

48:59

stack. One service goes down, or one

49:01

changes its APIs, or one has a new

49:03

commercial model. It's

49:05

going to be a multi-vendor, multi-AI

49:07

world, and so that's why

49:09

it's very hard to call

49:11

it at this stage.

49:12

>> What does everyone think they know about

49:14

enterprise adoption with AI that they

49:16

get totally wrong?
>> What they think is

49:19

that the outcomes

49:23

that you're seeing in AI coding will

49:26

quickly come for other areas of

49:28

knowledge work, and that is a

49:32

slight misread of the other areas of

49:34

knowledge work. Some of it

49:37

is the idiosyncrasies of coding, and

49:39

some of it is the broad

49:43

elements of the rest of work and

49:45

how it happens.
>> If you were a venture

49:47

investor today, which category would you

49:50

be most excited to invest in? Obviously,

49:51

I'm just hypothetically

49:54

speaking.

49:54

>> I mean, I think I would still be

49:56

probably loading up on all of

49:58

the frontier rounds.

50:01

These numbers could

50:04

continue to get much larger, and

50:06

then I think that the

50:07

>> Could they get much larger? I

50:09

mean, this is where, in my mind, at 850

50:11

billion, you've got a 3x to, like, a

50:14

2.1 trillion valuation.

50:17

>> You know, I always think it's hard,

50:18

because I kind of have said that

50:21

on the way up of many companies,

50:24

like, no, just...

50:26

>> I did it with crypto: how much

50:28

further can it go?

50:29

>> Yeah, well, that one I'm going to put

50:31

in a different category, because that

50:33

can just sort of be meme'd.

50:36

>> No, but I actually think,

50:37

to the point on Atlassian,

50:39

and bluntly you and your whole category,

50:42

it's the casinoization of the stock

50:44

markets, which is, if you're a

50:46

momentum trader today, you still buy

50:49

Palantir because the market's a casino

50:52

right now.

50:53

>> Yeah. Well, to be a little bit more

50:56

fair, I think you have some one-off

50:59

companies that have done an

51:01

amazing job capturing the zeitgeist on

51:02

that. But I think

51:05

the broad story right now is the

51:06

sector rotation story, which is

51:09

sort of, hey, this AI thing is happening

51:12

right now; I can get a higher return if I

51:14

get closer to the semi stack and

51:16

to where

51:18

the workloads are going and where the

51:19

data center buildout is happening, and I

51:21

get less of a return if I'm in

51:23

software with pure

51:25

licensing. So I think that is

51:27

probably more of the color of what

51:29

we're seeing now. Some of the

51:32

data center and infra names

51:35

maybe have been meme-ified also, and

51:37

that's kind of helping the case. But

51:39

it's just a really weird

51:40

time overall that's

51:42

hard to think

51:43

through.

51:44

>> So, you would not buy Allbirds as an AI

51:46

company.

51:47

>> I mean, maybe you would, because of that

51:49

exact point. So I think

51:50

that will be in the kind of one-off

51:52

category. So, New Bird AI or whatever

51:54

it's called. But I think, no,

51:56

I still am, and I know this is a generic

51:59

statement: there will still be a lot of

52:03

money to be made in the companies that

52:05

can take the innovation that we're

52:08

seeing in Silicon Valley and in the labs

52:10

and apply it to the real-world

52:12

work that happens inside of enterprises,

52:14

whether that looks like vertical AI or

52:16

whether that looks like the new

52:19

kinds of tooling that companies

52:21

will need. There's a company

52:24

and a new category emerging around

52:26

agent observability and evaluations.

52:26

I'll give a shout-out to Braintrust as

52:28

an example; I'm not an investor. I

52:31

can just kind of sit back and be like,

52:32

"Shit, we thought that agent

52:35

builders were going to need evals." So

52:37

that's like a Silicon Valley TAM. And

52:39

then I'm like, "Oh, actually, everybody

52:41

on the entire planet, if you're putting

52:42

agents into an enterprise workflow, needs

52:45

evals, because you need to know if all of

52:47

a sudden your agent just stopped

52:49

producing, say,

52:52

loan origination documents the

52:54

right way." So that's a

52:56

category where it's probably

52:58

not going to be owned by one of the

52:59

labs. You kind of want it to work across

53:01

all the labs. It's

53:04

a very relevant new form

53:07

of infrastructure for an agentic

53:09

enterprise. I think you're going to see

53:11

a dozen, two dozen, five dozen things

53:13

like that start to emerge.

53:15

>> I've known you for a while now, and you've

53:17

put up with me for multiple

53:18

sessions. So I want to finish on

53:20

something a bit off script, but you're a

53:23

phenomenal CEO. You're a public-company

53:25

CEO. The pressure that you have on you

53:27

is intense. You're also married and

53:30

have a great relationship.

53:32

Biggest advice on marriage when it's

53:35

super, and I'm being serious, when it's super

53:36

stressful, it's hard, and you also have

53:38

to show up and be a great husband.

53:40

What's the advice on marriage?

53:42

>> It feels dangerous if I actually

53:44

acknowledge the great-husband piece

53:46

and the other parts that were embedded

53:48

in that. That feels like you

53:49

need a full 360 eval. I'll

53:52

just say, from my perspective,

53:54

I'm very lucky to have an amazing wife

53:56

and family, and you

53:59

are in a grind in one of these roles,

54:01

so obviously having a

54:04

strong support base helps

54:07

a ton. We try and make time

54:10

for the fun side of life

54:13

as much as possible, but obviously

54:15

that gets constrained in the kind of

54:16

window that we're in. But I've been with

54:19

my wife for, I don't know, 15 years or so,

54:23

16 years, so she's seen the whole

54:25

grind all the way, and

54:28

she has her own grind in her

54:31

business, and so it's just lots of

54:33

fun.

54:34

>> Dude, you're my hero. I want to be you

54:35

when I grow up. Thank you for being so

54:38

great. I really appreciate it. And I was 14

54:40

in 2010. Okay. All right.

54:42

>> So I almost called it. Yeah.

Interactive Summary

This video features a discussion with Aaron Levie, CEO of Box, about the transformative impact of AI on large enterprises. They explore the shift from manual workflows to AI-driven agentic processes, the evolving role of enterprise software, and the ongoing challenges of security, regulation, and workforce adaptation. Levie emphasizes that while AI will augment human work rather than eliminate it, the primary constraint for enterprises will remain the need for human accountability and complex, long-term change management. He also discusses the future of tech budgets, the competition between frontier models, and the importance of well-integrated, secure APIs for enterprise-grade AI adoption.
