
Mitchell Hashimoto’s new way of writing code


Transcript


0:00

What was your experience back then of

0:02

AWS? Your honest view.

0:04

>> AWS was really arrogant. Felt like they

0:07

were doing us a favor. Subtle vibe of we

0:09

will spin up a product and kill your

0:11

company.

0:11

>> Terraform just seemed to be everywhere.

0:13

Why do you think that sudden popularity

0:15

was?

0:15

>> One of the things that frustrated me was

0:16

like, oh, they only won cuz they were

0:17

first to market. We were like seventh to

0:19

market.

0:20

>> It feels like most of open source will

0:22

have to change because of AI. AI makes

0:25

it trivial to create plausible looking

0:28

but incorrect and low-quality

0:30

contributions. Open source has always

0:31

been a system of trust. Now it's just

0:33

default deny and you must get trust.

0:35

>> Do you think Git will be around in a few

0:37

years?

0:37

>> What's interesting is this is the first

0:39

time in like 12 to 15 years that anyone

0:42

is even asking that question without

0:44

laughing.

0:48

If AI agents can write code, open pull

0:50

requests and ship features, do we even

0:52

need open source contributors anymore?

0:53

Mitchell Hashimoto, the co-founder of

0:55

HashiCorp, has been thinking deeply

0:57

about this, the future of open source

0:58

and how to efficiently integrate AI into

1:01

his day-to-day workflow. Mitchell built

1:03

the tools that power modern cloud

1:04

infrastructure, Terraform, and the Hashi

1:06

stack. He also created a popular

1:08

terminal, Ghostty, and I consider him to

1:11

be one of the most thoughtful voices in

1:12

the industry on how AI is changing the

1:14

craft of software engineering. In

1:16

today's episode, we cover the original

1:18

story of HashiCorp, a failed university

1:20

research project, a notebook of unsolved

1:22

problems, and an email from his future

1:24

co-founder that he answered in two

1:26

minutes. His honest unfiltered take on

1:28

working with AWS Azure and Google Cloud

1:30

as partners, both the arrogance and also

1:32

the brilliant engineers who never

1:34

thought about the business, how he's

1:36

adapted to AI coding tools, why he

1:38

always keeps an agent running in the

1:39

background, and his practical advice for

1:41

engineers who have not yet warmed up to

1:42

AI agents, and many more. If you're

1:45

interested to hear from one of the most

1:46

hands-on builders in the industry and

1:48

want to know where AI tools are useful

1:50

versus not, then this episode is for

1:52

you. This episode is presented by

1:54

Statsig, the unified platform for flags,

1:56

analytics, experiments, and more. Check

1:58

out the show notes to learn more about

1:59

them and our other season sponsors,

2:01

Sonar and WorkOS. Mitchell, welcome to

2:04

the podcast. It's awesome to be here in

2:07

person.

2:08

>> Yeah, it's it's cool to meet you in

2:09

person after so many years of following

2:11

you.

2:12

You've had such a massive impact on

2:15

the tech industry on software engineers,

2:17

but how did it start?

2:18

>> I think the high level is the same story

2:19

as a lot of people. I self-taught uh

2:22

around 12, 13, early teens, motivated by

2:25

video games. Same like same as a lot of

2:26

people. Um although I really quickly

2:28

realized that I liked web, you know, web

2:30

was new. Google wasn't out yet. I think

2:32

web was new. And so I I kind of like

2:33

really quick I I never became a video

2:36

game programmer. I really quickly just

2:37

became a web programmer, PHP, um, Perl,

2:40

that sort of stuff. And uh because I was

2:42

so young, the only way I could learn was

2:44

through whatever code was published

2:46

online. And so that's how I got

2:48

acquainted with open source. I didn't

2:50

know that's what it was called then, but

2:52

a kid with no job, no money. Um parents

2:54

didn't want to buy, you know, uh

2:56

professional books were like I don't

2:57

know what they are now, but they were

2:58

like 50 bucks then, right? And and so

3:01

they were like, "No way, right? This and

3:03

also they didn't believe I was going to

3:04

read it." And so there was no way

3:06

they're gonna buy that. So, um, yeah,

3:08

just anything I found online was my

3:11

"in" into coding. I'd walk to school

3:13

every day with a group of friends.

3:14

There's a period of time where I printed

3:16

out the first or second chapter of the

3:19

PHP manual. I remember it was about 30

3:22

to 40 pages of of paper and I never

3:26

programmed. So, all this stuff, and I'm 12,

3:28

it's very confusing. So, I read the

3:31

whole 40 pages every walk to school. And

3:33

I don't remember how long it took me,

3:34

but I did that a long time before, you

3:37

know, I remember this one moment where I

3:38

was walking to school where suddenly I

3:39

understood

3:41

what these dollar sign things were. I

3:43

like it like for whatever reason it just

3:46

came in.

3:46

>> Those are variables, right?

3:47

>> Variables. Yeah. Yeah. And I I really

3:49

understood I never heard that word

3:51

before. Like you don't hear the word

3:53

variable as a 12-year-old out in any

3:56

context. And finally at one point it

3:58

like hit me that they store things and

4:02

things could change and I remember just

4:04

like weeks of reading this thing and not

4:06

understanding it getting to school so

4:08

excited being like it it triggered and

4:10

then after that I remember stuff

4:11

happened really quickly.

4:12

>> What what kind of stuff did you build?

4:14

Websites.

4:14

>> Yeah, websites. It was gaming related

4:16

websites. It was like a lot of like

4:18

game cheat stuff, forum software. Yeah, I

4:21

mean I had a lot of fun cloning

4:23

websites, you know, in a poorly, but

4:25

like uh PayPal was out and then and I

4:27

really wondered like how does money get

4:29

transferred over the internet? How does

4:30

that work? So I tried to build like

4:32

copies of cloning websites. I did like

4:35

masquerade as an 18-year-old on, um, uh,

4:38

like freelance websites. And so I got,

4:40

you know, 100 bucks here, 50 bucks here

4:43

to do like image like upload stuff. I

4:46

decided to study computer science in

4:47

college. Um went to University of

4:49

Washington. I mean, I guess that's when

4:51

you would call it serious, but I was I

4:53

was like really I mean, I was coding

4:56

every day as much as I could through

4:58

high school.

4:58

>> Oh, okay.

4:59

>> Yeah, that's impressive. Were you alone

5:03

with this within your friend group there?

5:04

Were there other people doing it or was

5:06

it kind of lonely?

5:07

>> It was lonely. It was very lonely. It

5:08

was It was lonely in the real world and

5:10

then I quickly found online friends

5:12

through like MSN Messenger and AOL

5:14

Messenger and forums. I found online

5:17

friends which many I I have met now and

5:19

I still keep in touch which was cool but

5:21

no I mean like back then I mean being a

5:25

being a programmer when no one knew that

5:27

word but but being into computers was

5:29

like a social death kiss and so uh even

5:32

my closest friends didn't know, my best

5:34

friends and stuff like I hid it from all

5:36

of them and I didn't talk about it at

5:37

school and stuff like that so it was

5:39

just a secret until I went to college

5:40

and college is when I decided to like

5:42

let it all out. The big like break that

5:44

I got was I blogged and uh my freshman

5:49

year, late freshman year heading into

5:50

summer, um, of college, someone

5:53

just emailed me out of the blue and I

5:55

kind of thought it was a scam. It was

5:57

just like do you want to, you know, it

5:59

was do you want to be a Ruby on Rails

6:00

programmer? And I didn't know Ruby. I

6:02

was a PHP programmer. Um, I had never

6:04

done Ruby. I'd never done Rails. But I

6:06

got this email and I'd never been like

6:08

head-hunted before. Like I didn't know

6:09

what this was. I was also 18. So I

6:11

didn't really know what to think about

6:12

it. I probably would have not responded

6:14

except that the person who contacted me was

6:16

in LA and so I did respond and we set up

6:21

a meeting like a real physical meeting

6:23

and I met him and met the company and

6:25

realized this is real and they're

6:27

serious and genuine and I took that job

6:30

and uh yeah I mean that was I learned a

6:33

lot on the job there. So that was a huge

6:35

change. Um

6:35

>> was it a startup or small company

6:37

something like that?

6:38

>> No, it was a consultancy. So, it's kind

6:39

of like one of those standard like this

6:41

like 2007. Ruby on Rails had blown

6:44

up. It was already very popular and uh

6:47

there was all these consultancies that

6:49

that appeared out of nowhere that was

6:50

basically like we'll build your minimum

6:52

viable product and yeah and we're one of

6:54

those shops. So great job for a college

6:56

student cuz we'd see a client for like 2

6:58

months and I would build a YouTube style

7:01

website and then I would build like a

7:03

philanthropy website and then I'd build

7:04

an e-commerce website and like it was

7:06

just like I got to learn all these

7:07

different technologies and different

7:09

scale challenges and different like you

7:11

there wasn't a lot of scale because

7:11

we're building MVPs but different like

7:13

thinking of scale problems. Um yeah it

7:16

was it was great. How did eventually

7:19

HashiCorp start, or what happened

7:21

between like getting getting this this

7:23

Ruby job to a few years later?

7:25

>> It kind of starts with this Ruby job. Um

7:28

there was one guy that worked at the the

7:31

company and and he's he's pretty into

7:33

his privacy so I won't share his name

7:34

but he was my boss and there was no

7:37

Heroku, there was no Engine Yard, so you

7:38

had to like self-host and Ruby on Rails

7:40

hosting then was kind of like difficult.

7:42

So he was the guy who got all these

7:44

projects hosted on on dedicated servers

7:47

and I didn't know anything about that

7:49

and I and he ran Linux and he had long

7:52

black hair and he like didn't use a

7:55

mouse and all these things that were so

7:57

weird to me and I was just intrigued. I

7:59

just he sat in the corner. He didn't

8:00

want to talk to anybody. Um, and I just

8:03

wanted to know more about what that

8:04

world was. And luckily, despite

8:07

appearances, he's very nice. And so, um,

8:11

yeah, I I think as soon as I showed a

8:13

genuine interest, started asking a lot

8:14

of questions, he started just giving me

8:17

challenges like, well, the first

8:19

challenge I remember he did is he

8:20

unplugged my mouse. And it's funny cuz I

8:21

I don't think there's an era of time

8:23

where if you did that, it probably would

8:25

have been some kind of harassment or

8:26

something. But he literally

8:27

unplugged my mouse and said

8:30

you're never going to work with a mouse

8:31

again. So figure it out. I'm not going

8:33

to tell you how. Just unplug my mouse,

8:35

restart the computer, your problem now.

8:38

And took the mouse away.

8:39

>> Mhm. Um took me about a week and I got

8:41

really good with the keyboard.

8:42

>> Harsh lesson.

8:43

>> Harsh lesson. And once I got good with

8:45

the keyboard, um he said, "Okay, here's

8:47

he he installed screen on my, you know,

8:49

early on. He installed screen in my

8:51

terminal and said, "Figure this out.

8:53

You're going to use this now." you know,

8:54

there's no questions like you will use

8:56

this. And he just slowly instilled

8:59

it, um, on me, and as we got there, then it

9:02

became you know here's SSH here's a

9:05

package manager he's like it slowly

9:07

taught me more and more and that got me

9:09

just in. I loved it, like, immediately. It

9:10

was like this is super cool super fun so

9:12

that long-winded process got me into

9:15

infrastructure and then simultaneously

9:17

or very shortly afterwards I joined a

9:19

research project at the University of

9:20

Washington called the Seattle project

9:22

which is a terrible name cuz you can't

9:24

Google it, but it's called the Seattle

9:26

project. It was I'm sure it doesn't

9:28

exist anymore. And it was again another

9:31

popular thing during this time was uh

9:33

kind of like, um, Folding@home. It was

9:35

this idealized Folding@home, which is

9:38

can a bunch of people donate compute from

9:40

different you know it could be your home

9:42

machine it could be a unused rack it

9:44

could be in your basement it could be

9:46

around the world, but can you

9:48

donate all this heterogeneous hardware

9:50

and then can you generalize a scheduler on top

9:53

of it so that academic institutions

9:55

across the world could just run

9:57

workloads and not just like not just

9:59

research like the job I got was

10:01

basically, very vaguely, to create not

10:06

the scheduler component but like create

10:09

the ability to spin up all these nodes

10:11

um, and a bunch of other stuff.

10:13

It's very vague but but it was this

10:15

infrastructury problem and I completely

10:17

failed at it. like I I tried for a

10:20

quarter but um from a technical side I

10:22

just failed, and I wrote down in this

10:25

notebook like what I thought the pieces

10:27

were missing that I couldn't solve this

10:30

problem in a quarter, in a 10-week period,

10:32

like why well we need this we need this

10:34

we need this it's interesting to see how

10:36

structured Mitchell was in his approach

10:38

in defining components that would later

10:39

become parts of the Hashi stack, and this

10:41

leads us nicely to our season sponsor

10:43

WorkOS. One thing I've learned from

10:45

studying great engineers, Mitchell

10:47

included, is that they're very

10:48

deliberate about what they choose to

10:50

build. Great engineers don't just ship

10:52

fast. They think in systems. They

10:54

understand leverage and they're careful

10:56

about what becomes part of their

10:57

long-term surface area. If you're

10:59

building SaaS, especially an AI product,

11:01

authentication, and enterprise identity

11:03

can quietly turn into a long-term

11:05

investment. SAML edge cases, directory

11:07

sync, audit logs, and all the things

11:09

enterprise customers expect. WorkOS

11:11

provides these building blocks as

11:13

infrastructure so your team can stay

11:15

focused on what actually differentiates

11:16

your product. Great engineers know what

11:19

not to build. If identity is one of

11:21

those things for you, visit

11:22

workos.com. And with this, let's get

11:24

back to Mitchell's notebook with all the

11:26

components he would end up building at

11:27

HashiCorp. And I still have this

11:29

notebook um at my house here, but the

11:31

problems are really like, you know, I

11:33

have no way to declaratively manage the

11:36

different resources that are out there.

11:37

I have no way to network these together

11:39

in a private network. Um, you know, I

11:41

wrote these things down and there was a

11:43

lot of stuff there that I never ended up

11:44

building, but a subset of that was

11:47

ultimately what HashiCorp would end up

11:48

building. And I shared this with my

11:51

undergraduate, like, boss, who was Armon,

11:54

who was my co-founder.

11:55

>> Yeah, later became your co-founder.

11:57

>> Yes. He was my boss on the

12:00

undergrad side. And I shared it with him

12:02

as kind of an exit interview like this

12:05

is what it is. And then some period of

12:08

time passed, not much, weeks passed and

12:10

he emailed me out of the blue and was

12:11

like, "Do you want to do a startup

12:13

together?" That, you know, you're a

12:14

teenager and you have no idea what this

12:15

commitment is.

12:16

>> Well, you're like 21 or something at

12:17

this point.

12:18

>> Uh, probably not even. Probably probably

12:19

19 or 20. Yeah. And he emailed me out of

12:21

the blue like, "Do you want to do a

12:22

startup?" Like person you never met or

12:24

you barely met, never met personally,

12:26

like all this stuff. It's so funny. And

12:27

he emailed me that at like 11:30. You're

12:29

in college. I emailed him back in 2

12:31

minutes and said, "Sure."

12:35

and he remembers thinking, "Wow, he

12:36

responded so fast that he's just in.

12:38

He's ready to go." That was sort of the

12:40

start of our friendship. And then uh and

12:42

uh again like there's overlapping pieces

12:44

here, but I was also at the time working

12:46

on something called Vagrant. And Vagrant

12:48

was, you know, came out of the

12:50

consultancy, less so the research

12:52

project. It was solving the problem in

12:53

this consultancy where we had new

12:55

clients every two months and we had

12:56

different teams. How do we create

12:58

reproducible dev environments so I could

12:59

go help somebody without a lot of

13:01

billable hours? So, so this is a

13:03

development environment that you could

13:05

spin up quickly, right?

13:06

>> Yeah. Yeah. Yeah. The metaphor I always

13:08

had was I didn't use Windows then, but

13:09

the metaphor I always used was how could

13:10

I double-click and open a dev?

13:12

>> Yep.

13:13

>> That was a metaphor I used because

13:14

>> it's a good one.

13:15

>> Yeah. What the problem we're

13:17

having was

13:17

any hour wasted in a consultancy that you

13:19

can't bill is just a waste. And so it

13:21

was basically like if somebody else is

13:23

behind schedule, how can I jump in, help

13:26

implement a feature, and jump out? And

13:28

we were in that era, just setting up the

13:31

dev environment for a project might take

13:34

you half a day.

13:35

>> And you couldn't bill that to the

13:36

client, right? The client would only pay

13:38

for the work.

13:38

>> Yeah. You couldn't bill that to the

13:39

client. So it'd be like 4 hours of work

13:41

wasted and it would probably mess up

13:44

your dev environment for your actual

13:46

client because you would be on a different

13:47

Ruby version, a different Rails version

13:49

and so you would kind of destroy both

13:51

ends. And so Vagrant came out of that

13:53

which was I just need to go over there

13:55

and what ended up becoming vagrant up

13:57

sweet, you know, a few minutes, let's help

14:00

you for the next two hours and then
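(Editor's aside, for readers who never used Vagrant: the workflow described above amounts to checking one Vagrantfile into the project, so that `vagrant up` builds the same VirtualBox VM for every teammate. A minimal hypothetical sketch follows; the box name, port, and packages are illustrative, not taken from the episode.)

```ruby
# Hypothetical minimal Vagrantfile (illustrative only).
# One file in the repo gives every teammate the same VM.
Vagrant.configure("2") do |config|
  # Base image; "hashicorp/bionic64" is a publicly published example box.
  config.vm.box = "hashicorp/bionic64"

  # Forward the app's port so the host browser can reach it.
  config.vm.network "forwarded_port", guest: 3000, host: 3000

  # Install the project's toolchain on first boot.
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update -y
    apt-get install -y ruby-full
  SHELL
end
```

Running `vagrant up` boots and provisions the VM in minutes, and `vagrant destroy` throws it away without touching the host's own Ruby or Rails install, which is exactly the "destroy both ends" problem described later in the conversation.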

14:02

>> and how how did you build it back then

14:03

was it some kind of virtual machine or

14:05

>> yeah it was with VirtualBox. Oracle,

14:07

well, it wasn't, it was Sun then, but um

14:09

VirtualBox, and that's

14:11

another cool constraint which is that I

14:13

was a college student so I had no money

14:14

so

14:15

>> this was expensive back then right

14:17

>> uh virtualization was expensive. VirtualBox

14:19

was free and open source. I don't

14:21

care about the open source side um for

14:23

that. I was I was never going to read

14:24

it. But yeah, it was free. That was why

14:26

I did it and and that's why I did that

14:28

and not like EC2, which had come out

14:30

by then, but I didn't do EC2 cuz I I

14:32

didn't have money to pay for these

14:33

instances. So, um yeah, that's that was

14:35

the constraints and and I like bringing

14:37

that up because I think so much of

14:40

software engineering is understanding

14:42

constraints and working with these

14:43

constraints. And your prior podcast

14:45

there were, you know, what you called the forces,

14:47

like static and dynamic forces. It's

14:48

that and and I think that helps create

14:51

better software um when you have

14:53

constraints and that was my constraint.

14:54

So yeah, so that was: we have Vagrant, we

14:57

have this failed infrastructure project.

14:59

Um we have uh sort of the my boss at

15:02

consultancy getting me into

15:03

infrastructure and all of the and then I

15:05

mean externally we had the cloud being

15:08

introduced, AWS. I went to school at the

15:10

University of Washington so

15:11

>> oh

15:11

>> I was right there

15:12

>> right in the epicenter of it Amazon was

15:15

next door right

15:15

>> Amazon was right next door they donated

15:17

a bunch of credits right away I knew

15:19

about the launch um most of the CS

15:21

students at UW interned at Amazon

15:24

not necessarily AWS but also including

15:26

AWS, but all over. Armon, uh, interned at

15:30

uh AWS and so like I I was in the bubble

15:35

of like cloud cloud cloud AWS as a when

15:38

people were pronouncing S3 like s cubed

15:40

like people didn't know how to pronounce

15:41

it right that's how that's how new it

15:43

was and so yeah all this stuff kind of

15:44

came together and uh kind of led me on

15:47

the path to to build tooling to better

15:49

manage it and

15:50

>> at that moment in time when you saw

15:52

cloud you know you saw it was being big

15:55

did you know or have a conviction that

15:57

it would be big or as big as cloud has

16:00

become cuz this was I'm just trying to

16:02

put yourself back like this was very

16:03

very new back then right

16:04

>> totally

16:05

>> and I think, you know, like if I

16:07

imagine I assume more people would have

16:08

been skeptics or think that is just a

16:10

fad or whatever. What was it like? Can you

16:12

can you bring us back a little bit there

16:14

compared to today, it was

16:17

very unpolished I guess as I describe it

16:19

you know like EC2 was I mean I used AWS

16:22

in general was very unreliable um S3 was

16:24

the only ever reliable piece everything

16:26

else was was totally unreliable. Um, and

16:29

there was only a few services like EBS

16:32

didn't even exist when we started. So,

16:33

there was no durable storage besides S3

16:35

when when I first started with it. It

16:37

just felt very raw. Um, and I don't I

16:40

don't I never really viewed it as this

16:42

is going to be big. I mean, eventually I

16:44

thought it was going to be big. What I

16:45

viewed it as is this is the better way

16:48

to do it. This feels like the better way

16:49

to do it. Just Yeah. at a base level

16:52

like whether this wins or loses in the

16:55

realm of markets and social like

16:59

popularity I don't know but this felt

17:01

good and so that that's what kind of

17:03

pushed me towards it is and I say this

17:06

over and over I'm I'm really motivated

17:08

by like what's the most fun and what

17:10

like feels right and that it just felt

17:13

right to me. Um I think where I started

17:16

making the bet, me and Armon both

17:19

started making some kind of bet was not

17:21

just when we started HashiCorp, but we

17:22

started HashiCorp on the basis of like

17:24

multicloud and

17:27

I really like to like contextualize that

17:30

at the time we were starting this which

17:32

was like 2011 2012 which is that AWS was

17:35

huge Azure didn't really exist and

17:38

Google cloud didn't really exist. There

17:40

was Google App Engine, right? It wasn't

17:41

even cloud.

17:42

>> Correct. Correct.

17:42

>> I I used to use that when it was App

17:44

Engine. Yeah.

17:45

>> Yeah. Yeah. And so in that context, as

17:49

we were pitching these cloud agnostic

17:50

tools, I mean, we got a lot of raised

17:52

eyebrows being like, "This is a waste of

17:54

time because AWS is the only player in

17:56

town." And our conviction was at that

17:59

point cloud is going to be huge and

18:02

anything that's economically huge, other

18:05

people want a piece of that pie. And so

18:07

you're not going to just have AWS. it'll

18:09

be huge, but you're going to have these

18:10

others pop up and Microsoft is not going

18:12

to sleep on it and Google's not going to

18:14

sleep on it and who knows who else and

18:17

who knows and that was our conviction.

18:18

That was our bet and uh it mostly played

18:21

out that way.

18:22

>> So when you decided to start HashiCorp,

18:24

you had Vagrant like was the idea to

18:26

like you know like invest in

18:27

commercializing Vagrant, and did you go

18:29

out to raise money or did you start

18:30

doing it with you know bootstrap? How

18:32

did that go?

18:33

>> It wasn't to commercialize Vagrant. But

18:35

what we had done is Armon and I both

18:36

worked at this mobile ads company

18:38

startup. Um there's like less than 30

18:41

people and we had built like with Python

18:45

and C like these really um rough

18:49

prototypes of these ideas that I had in

18:51

this notebook of like service discovery

18:52

and um like an early version of

18:54

Terraform we called Launchy. So we did

18:56

DNS-based service discovery

18:58

by connecting an off-the-shelf

19:01

DNS server with Postgres, and we did all

19:03

these like hacky things but they felt

19:05

good. And again, we get back to

19:07

this like how things feel to me to

19:09

motivate me like it felt right

19:10

directionally right. I graduated, and the

19:14

environment in Seattle was not very

19:16

startup heavy at the time. It was

19:17

basically everyone was like are you

19:18

going to work for Amazon or are you

19:19

going to work for Microsoft? Yeah, that

19:21

was like kind of it and and like to a

19:23

certain extent Facebook was starting to

19:24

show up up there, but that was it. I

19:26

knew I wanted to work for startups. So,

19:27

I had I I moved to to San Francisco. So,

19:30

I moved to San Francisco, found a

19:32

startup that would hire me, which was a

19:33

mobile ads thing. Um and uh just wanted

19:36

to learn. So, that that's the short step

19:38

there. So, I ended up in San Francisco.

19:40

Um and I convinced Armon. He was actually

19:42

going to do a PhD at Berkeley and he was

19:45

accepted and in, and this was a huge

19:47

deal, huge deal. I mean, incredible

19:49

program. Um, and so he was going to go

19:51

there and he would have done amazing

19:52

things there. But I convinced him to

19:54

join this mobile ad startup. He actually

19:56

deferred the PhD for a year. He's

19:58

like, I'll give it a year.

19:59

>> Yeah.

19:59

>> I'll join this mobile ads

20:01

>> and I'll go back for sure.

20:02

>> If it doesn't work, I'm going to go

20:03

back. And what ended up happening in

20:05

that year is is now where we get to. Um,

20:08

which is that we had this

20:10

hodgepodge of prototype tools

20:13

that felt right. and we were going to

20:16

all these little startup mingling

20:18

parties, you know, it's like things like

20:19

GitHub drinkups, but also just like our

20:22

this is such a San Francisco thing and

20:24

that's why I think it's even though I

20:25

don't want to live there again, it was

20:27

so magical at the time. Um was like

20:29

across the street was this company that

20:32

was called Zimride at the time

20:33

ultimately became Lyft and they invited

20:35

us over to get drinks and have pizza to

20:39

demo this new app with a mustache that

20:42

like didn't have a name and

20:44

>> Wow. Yeah. So, like stuff like that.

20:45

>> You were there when it was born.

20:47

>> Yeah. Yeah. Yeah. And like that happens

20:49

all the time. Like all the time in San

20:51

Francisco and and it's not unique to me

20:53

at all. Like Yeah. There there's a bunch

20:54

of stories there that I think aren't

20:55

worth getting into. It's just like it's

20:57

fun. But I went to all these things and

20:59

people would just talk. They're all

21:01

bunch of tech guys, right? And and you'd

21:03

be like, "What what are you working on?"

21:04

And and there's two things I realized.

21:07

One is all these companies are cloud

21:09

first. They're all just adopting AWS

21:11

first. There was no there was no

21:13

dedicated

21:14

>> this was like in in 2011 2012 or so like

21:16

like they just went and paid

21:18

for cloud, which was brand new

21:20

right the previous generation just had

21:22

on-prem, server rooms and

21:24

and server admins they had roles for

21:26

those all all that jazz

21:28

>> that was just gone gone like

21:30

>> that must have been a massive shift

21:31

>> I I literally can't think of one social

21:33

event I went to where there was somebody

21:35

that had dedicated servers the only one

21:37

is

21:37

>> Twitter yeah but I I think you like we

21:39

probably have to emphasize that that

21:41

This is a massive shift in the industry,

21:42

right? And it probably was was only

21:44

happening in Silicon Valley or like

21:45

>> probably

21:46

>> yeah well well ahead of of everyone else

21:49

>> at a scale that was larger than anywhere

21:51

else. It was probably in Silicon Valley.

21:52

The joke used to be cuz AWS is so

21:54

unreliable. The joke used to be that

21:56

when AWS went down uh all these startups

21:58

finally became more cash flow neutral

22:02

and they would lose less money. Um so

22:05

there would be like a huge you know US

22:06

east outage and and everyone would be

22:08

like are you going to migrate regions?

22:10

like no, we're saving money right now.

22:13

But yeah, getting back to it, uh,

22:16

everyone was cloud first, cloud-born,

22:19

cloud native, whatever you want to call

22:20

it. And, uh, the other thing was they

22:23

were hitting all the same challenges

22:25

that we were hitting and they didn't use

22:28

our tools cuz they were just like

22:29

internal prototype tools, but

22:30

>> but I knew that our tools felt good. So,

22:33

I had these two things come together

22:34

where I had some ego, some hubris where

22:36

I'm like, I'm pretty sure we're building

22:37

the right thing along with I think the

22:39

industry is moving in that direction and

22:41

like we could we could come together.

22:42

Um, and so that led to let's start a

22:46

company based around that. The fact that

22:49

I had Vagrant was more of like an

22:52

industry respect. I mean, Vagrant wasn't

22:56

that big then, so that's not saying

22:57

much. Um but it was it I just had some

23:00

foundation publicly to give some

23:03

credibility to head in this direction.

23:05

Um that was about it and we we started

23:07

HashiCorp.

23:08

>> And then when you decided you

23:09

incorporated you know got the things did

23:11

you decide to raise money cuz again back

23:13

then I guess it wasn't as common wisdom

23:15

you know, Y Combinator was probably

23:17

starting around that time. So like

23:18

startups were startups a big thing or

23:20

was it a given that okay if you start a

23:21

startup you're going to raise money

23:23

>> in my social bubble it was pretty much a

23:25

given. Um, and and not not just that.

23:28

So, we incorporated, um, I self-funded

23:32

um, I transferred $20,000 from my

23:35

savings account into this corporate

23:37

account, initial funding. Um and I

23:39

worked off of that. I didn't I paid

23:41

myself $0 for the first 6 months. So,

23:43

the 20,000 was purely towards whatever

23:46

things the company needed. That was the

23:47

first 6 months. And then Arman joined

23:49

after 6 months. Um and and we decided to

23:53

raise uh and the motivation there really

23:56

is there weren't many

23:58

other options. There were basically three

24:00

options as I saw it then which was uh

24:03

bootstrapping, um, right, just build

24:06

something, make money, and as it becomes

24:08

affordable, continue to reinvest and

24:10

grow. VC was on the other side,

24:13

and then in the middle was like what I

24:14

called patronage, which was not like

24:17

Patreon-style stuff today; like,

24:20

that infrastructure didn't exist there

24:21

was no subscriber donate type

24:24

infrastructure then um patronage was

24:26

more like you might be able to convince

24:28

a company like VMware to pay your salary

24:32

for you to work on some idea and the

24:34

best example is Redis at VMware, and

24:36

yeah and we kind of laid out this plan

24:38

that we wanted to do, um, which

24:41

at inception of the company included

24:43

Terraform, Consul... no, it included

24:46

everything but Vault. Uh, Vault came a

24:48

little bit later and we looked at that

24:51

and said if we bootstrap this even if we

24:53

hit it out of the park this is going to

24:55

take us like a decade just to like build

24:57

the software, and that's the best-case

24:59

scenario, this is just going to be slow.

25:01

And and the problem with slow is that

25:02

things have a window and and cloud was

25:04

growing so fast that if we were that

25:06

slow, someone else was going to do it

25:09

their own way. I mean, that was I guess

25:10

that was the primary issue is we really

25:12

just wanted to go fast.

25:13

>> You knew you needed to.

25:15

>> Yeah. We needed to hire many

25:18

engineers right away and start building

25:19

right away. And so VC was the route we

25:22

chose.

25:22

>> Can you talk us through the the first

25:24

several products and and what they do?

25:25

you know, we know Vagrant, but just

25:27

for those who are less aware of what

25:29

became the Hashi stack later,

25:31

right?

25:32

>> Yeah. Let me see if I can still get

25:33

these in order. I'm pretty sure I can.

25:34

So, Vagrant predated it. The

25:36

first product that came out of HashiCorp

25:38

itself was a product called Packer. Um,

25:40

kind of understated

25:42

publicly, but kind of underpins a lot of

25:45

things in the industry to this day.

25:47

That's an image building uh tool. So,

25:50

building Amazon images, VMware images,

25:53

etc. Um, I'm not even sure how much like

25:56

publicly came out, but there are whole

25:58

cloud like multi-billion dollar cloud

26:01

platforms that all of their official

26:03

images are like the service images are

26:05

built with packer. Everyone was trying

26:07

to utilize this horizontal scaling

26:09

autoscaling nature of AWS. That was the

26:11

dream. And if you were, it's kind of

26:14

like the cold start problem

26:16

with serverless today. If you were

26:18

waiting tens of minutes for your server

26:20

to be ready, you couldn't react. Um, and

26:22

so my idea was do that, snapshot the

26:26

image and then next time just spin up

26:27

that image. Um, and so that was Packer.

26:29

>> That was Packer.

26:30

>> So Vagrant Packer. The next one that

26:32

came out was Consul. Um, Consul was, uh,

26:37

solving the networking problem and not

26:39

networking. It was more solving the

26:40

service discovery problem which was you

26:43

have all these machines coming and

26:44

going. Before, again, like to

26:46

conceptualize this, before you would

26:47

have a static set of machines that had

26:49

IPs, and you would probably use DNS or

26:50

something, but the IPs didn't change

26:52

that much. So, you could be like, "Oh,

26:53

my database is here and it's not

26:55

moving." But if you're in this world

26:56

where web servers and load balancers and

26:59

databases are just breathing, you know,

27:01

that's how I always describe it:

27:02

breathing. They're creation,

27:04

destruction. Creation, destruction, like,

27:05

constantly. Then things are happening at

27:07

a scale where the service discovery

27:09

needs to be much faster. Um, and not

27:11

just faster, but you want to have

27:13

better guarantees that when you get a

27:15

response that oh, it's at this IP

27:16

address, so that IP address is like

27:18

ready. Yeah, I think

27:20

this is also kind of more mainstream

27:22

with like Kubernetes readiness checks

27:23

and health checks and things like that.

27:24

It was bringing that to more like

27:28

physical servers or cloud servers,

27:29

virtual machines and things like that.
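The service-discovery loop Mitchell just described (instances constantly appearing and disappearing, lookups returning only addresses that are actually ready) can be sketched as a toy registry. This is an illustrative Python sketch, not Consul's real API or protocol; all names and addresses are invented:

```python
class ToyServiceRegistry:
    """Toy in-memory service registry: instances "breathe" in and out,
    and discover() only returns addresses whose readiness check passed.
    Illustrative only; Consul adds gossip, health checks, DNS, and more."""

    def __init__(self):
        # service name -> {address: ready?}
        self._instances = {}

    def register(self, service, address):
        # New instances start not-ready until a health check passes.
        self._instances.setdefault(service, {})[address] = False

    def mark_ready(self, service, address):
        self._instances[service][address] = True

    def deregister(self, service, address):
        self._instances.get(service, {}).pop(address, None)

    def discover(self, service):
        # Only hand out addresses that are ready to serve traffic.
        return [addr for addr, ready
                in self._instances.get(service, {}).items() if ready]

reg = ToyServiceRegistry()
reg.register("db", "10.0.0.5:5432")
reg.register("db", "10.0.0.6:5432")
reg.mark_ready("db", "10.0.0.5:5432")  # only one instance is healthy
print(reg.discover("db"))              # ['10.0.0.5:5432']
```

The key guarantee he mentions is encoded in `discover`: a returned address is not just known, it is ready, which is the same idea Kubernetes readiness checks later made mainstream.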

27:31

And so that was Consul. Then after

27:32

that, I think we did Terraform. Um,

27:35

Terraform, uh, spins up infrastructure as

27:38

code: describe your infrastructure. In

27:40

AWS parlance, it was things like all

27:43

the attachments to your EBS volumes,

27:45

gateways, VPCs, subnets and like

27:48

connecting them all together. Like the

27:49

idea was I wanted to have an empty AWS

27:51

account or any cloud account and I

27:53

wanted to have this text and I wanted to

27:55

say make this text reality and that's

27:57

what Terraform is. And you would wait

27:59

whatever amount of time it took AWS and

28:01

you would blink and you would have

28:02

thousands of resources. And then, with

28:05

one command again you could just tear it

28:06

down to zero. That was Terraform. So

28:08

that came out like 2014. Um so that was

28:11

the next thing. Uh, and then came Vault.
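The Terraform idea just described, "make this text reality," is at its core a reconciliation between declared state and actual state. A toy Python sketch of that loop (not Terraform itself; it ignores dependencies, in-place updates, and provider APIs, and the resource names are invented):

```python
def plan(desired, actual):
    """Compare the declared state (the "text") with what actually
    exists, and compute the actions needed to converge. Toy version
    of the core Terraform loop, for illustration only."""
    to_create = sorted(set(desired) - set(actual))
    to_destroy = sorted(set(actual) - set(desired))
    return ([("create", r) for r in to_create] +
            [("destroy", r) for r in to_destroy])

# Declared infrastructure vs. an empty cloud account: create everything.
desired = {"vpc-main", "subnet-a", "ebs-vol-1"}
print(plan(desired, actual=set()))
# Declaring nothing against those same resources tears it all down.
print(plan(set(), actual=desired))
```

The "one command to tear it down to zero" property falls out naturally: an empty desired state turns every existing resource into a destroy action.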

28:15

>> Yep. Um, Vault is the easiest to

28:17

describe as secrets management at its core.

28:20

Secrets management, encryption; it grew to do

28:22

a lot more things than that.

28:23

>> So it's like, well, we have, on

28:25

your local developer machine, you have

28:26

your environment variables,

28:28

and doing that at scale, at a team level,

28:30

at a company level; all these services

28:32

need to access all this stuff

28:34

securely.

28:35

>> Yeah, it was much more focused on the

28:37

like the production environment

28:39

secrets. Um, I had dreams and visions of

28:43

really solving the developer secret

28:44

problem, but Vault really never

28:46

did that well.
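The core secrets-management idea here, one audited store with policy-gated reads, can be sketched as a toy store. This is illustrative Python, not Vault's real API; Vault's actual model adds encryption at rest, leases, dynamic secrets, and much more, and the token names and paths below are invented:

```python
class ToySecretsStore:
    """Toy sketch of secrets management: secrets live in one place,
    every read is audited, and a caller needs a token whose policy
    covers the path it asks for."""

    def __init__(self):
        self._secrets = {}   # path -> value
        self._policies = {}  # token -> set of allowed path prefixes
        self.audit_log = []  # (token, path, allowed) for every access

    def put(self, path, value):
        self._secrets[path] = value

    def grant(self, token, prefix):
        self._policies.setdefault(token, set()).add(prefix)

    def get(self, token, path):
        allowed = any(path.startswith(p)
                      for p in self._policies.get(token, ()))
        self.audit_log.append((token, path, allowed))
        if not allowed:
            raise PermissionError(f"token not allowed to read {path}")
        return self._secrets[path]

store = ToySecretsStore()
store.put("prod/db/password", "s3cret")
store.grant("web-app-token", "prod/db/")
print(store.get("web-app-token", "prod/db/password"))  # s3cret
```

Note that even denied reads land in the audit log; centralizing that trail, rather than scattering secrets across environment variables, is the production-side problem Mitchell says Vault focused on.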

28:47

>> Mitchell just talked about secrets

28:48

management, which turned out to be a

28:50

pretty important focus area for him. In

28:52

general, security is both very valuable,

28:54

but also pretty hard to do well. This

28:56

leads us nicely to our season sponsor,

28:58

Sonar. Looking at where we are today,

29:00

we've now moved past tap completion into

29:03

the era of Agentic AI. Autonomous agents

29:05

are opening pull requests. One big

29:07

question. How do we get the speed of AI

29:09

without inheriting a mountain of risk?

29:11

Sonar, the makers of SonarQube, has a

29:14

really clear way of framing this:

29:16

vibe, then verify. The vibe part is about

29:19

innovation, giving your teams and your

29:21

AI agents the freedom to build and

29:22

iterate at high velocity. The verify

29:25

part is the essential automated

29:26

guardrail. As agents start contributing

29:28

more of our codebase, independent

29:30

verification that checks every line,

29:32

human or machine generated, against your

29:34

quality and security standards, is more

29:36

critical than ever before. Helping

29:38

developers and organizational leaders

29:39

get the most out of AI while ensuring

29:41

quality, security, and maintainability

29:43

is one of the main themes of the

29:45

upcoming Sonar Summit. This isn't just a

29:47

user conference. It's where devs,

29:49

platform engineers, and engineering

29:50

leaders are coming together to share

29:51

practical strategies for this new era.

29:53

I'm excited to share that I'll be

29:55

speaking there as well. If you're trying

29:56

to figure out how to adopt AI without

29:58

sacrificing code quality, join us at the

30:00

Sonar Summit. To see the agenda and

30:02

register for the free virtual event on

30:04

March the 3rd, head to

30:05

sonarsource.com/pragmatic/Sonarsummit.

30:09

And with this, let's get back to

30:11

HashiCorp and why the company decided to

30:13

raise 6 months after founding. But yeah,

30:15

it it's just basically like, yeah, where

30:17

do you store your secrets? And and the

30:18

secrets were not just, I forgot the

30:20

words I used to describe this, but

30:21

secrets were not just like passwords,

30:23

but it was also like PII. So how do you

30:26

protect emails and addresses and stuff

30:27

for your customers

30:28

>> or credit card numbers?

30:29

>> Credit card numbers. Um so vault was

30:31

core to all of that and continues to be

30:34

>> that that is a part to build something

30:37

like that.

30:37

>> Yeah, we were really scared when we

30:39

built that actually cuz um we kind of

30:41

hid the fact we never lied about it but

30:44

nobody on the team that built Vault had

30:46

more than one quarter of

30:49

undergraduate security experience. There

30:50

were no professional security engineers

30:52

from industry. There was no professional

30:54

security academics and uh yeah we built

30:56

it. We got a lot of audits because of

30:58

that. Like we were scared. So we did get

31:00

a couple. For us it was very expensive

31:02

as a startup. We paid a couple firms

31:04

tens of thousands of dollars for vault

31:06

0.1 to audit it. We paid two. Um, we

31:10

shared the early beta with a lot of

31:11

people who were security experts in

31:14

order to review it. Not publicly, just

31:15

privately. Um we got a lot of good

31:17

feedback. Um but yeah, we we didn't want

31:20

that exposed in a sense.

31:22

>> I understand, but I mean it kind of

31:24

validates that you can build good stuff

31:26

with I guess people who might not have

31:29

the experience, but I guess people were

31:30

learning, right?

31:32

>> Yeah, the security stuff ended up, you know, we really

31:34

quickly hired professionals that helped

31:36

with that part of the product, and the security

31:37

stuff was always pretty solid. Um, but

31:39

but I think what it really showed was

31:41

what the security industry needed was a

31:44

shift in user experience more than a

31:46

shift in like what it did because like

31:48

what we were doing was not fundamentally

31:49

different than

31:51

existing multi-hundred-million or

31:55

billion-dollar companies that already existed

31:57

but the experience the way you interface

31:59

with it was dramatically different and

32:01

that was I think a good example of that.

32:03

Yeah.

32:04

>> And after Vault came

32:06

>> Nomad

32:06

>> Nomad. Yeah. Nomad which was our

32:09

scheduler which was a couple years late

32:13

to the market. Yeah.

32:14

>> Why do you say scheduler, was it not an

32:16

orchestrator?

32:17

>> I always described it as scheduling.

32:19

>> What what did it do?

32:20

>> Simple thing. You have a pool of

32:21

compute. It finally solved that problem

32:23

that I had in undergraduate. You have a

32:25

pool of compute. You have an app that

32:27

has a certain set of requirements and it

32:28

needs to find a place to run it.

32:29

>> Yeah. Yeah. The undergrad problem

32:31

we talked about. And as you're building

32:33

out like these, you said like some of

32:35

these took years like how did the

32:38

business, like, HashiCorp as a business,

32:39

work? Like, did you start to

32:41

generate some business?

32:43

>> There was, so, like, all right, tell me

32:45

about this one.

32:46

>> Yeah. I think we waited too long to

32:48

develop a business but um for four years

32:51

there was actually revenue

32:53

from a couple random sources but there

32:55

was no real reproducible growing

32:58

business. So you were just building this

32:59

vision of, you know, the

33:02

founders' vision of, like, all right, we

33:04

need all these things that would have

33:06

taken like a decade bootstrapped. Let's

33:07

build it

33:08

>> build in 5 years and figure it out.

33:10

>> That was literally it.

33:11

>> Yeah that was literally it and and you

33:12

know it was it was all open source and I

33:15

always had this mentality which was

33:16

like, if the company fails, it

33:19

doesn't matter because if there are good

33:21

ideas the open source community will

33:22

just continue. Um, and so I don't think

33:25

I would ever tell that to my investors

33:26

at that time, but you know, I had this

33:28

idea which is like the technology was

33:29

the most important thing to get out into

33:31

the world. Um, the business I really

33:34

sure hoped we could figure it out, but

33:36

it's not the most important thing. And

33:38

for those engineers who are thinking of

33:40

becoming founders or you know might

33:43

might be founders, how did this work

33:44

with your investors? You know, when they

33:46

put in money like did they get some

33:47

board seats? Did you have to manage

33:49

expectations? Cuz, kind of, I'm

33:51

hearing just putting a bit of my

33:52

business hat on is like you know for 4

33:53

years you're building these cool things.

33:55

You don't exactly have a business plan.

33:57

How did that work or or they just

33:59

believe that eventually you guys will

34:00

figure it out or or they saw some kind

34:02

of traction with like open source. It's

34:04

>> Traction. And I don't think what we did

34:06

was atypical for Silicon Valley. So the

34:09

really broad handwavy way I like to

34:11

describe it is you know your seed is

34:13

about building the product. You don't

34:15

even know if it's product market fit.

34:16

you're just guessing, no, you're

34:18

making educated guesses, but you're

34:19

building something. Getting the A, you've

34:21

sort of proven hints of product market

34:23

fit but you definitely don't have it

34:25

yet. You've proven hints and then when

34:27

you get the B you've proven product

34:29

market fit and now you haven't really

34:31

proven like repeatable revenue. You've

34:33

you you now have hints of revenue but

34:36

you know the product is useful. You know

34:37

people like the product and want to use

34:39

the product and maybe want to pay for

34:40

the product, but you don't know exactly

34:42

how to get everybody to pay for the

34:44

product. and then CD and so on is just

34:46

continued to build the repeatable

34:48

revenue machine. And so with that

34:50

framework in mind, we were on the right

34:52

track. It was basically like to build

34:54

the product. Um we had clear product

34:56

market fit by the A, um, in terms of the

35:00

open source, right? We had millions of

35:02

downloads, a lot of stars on GitHub, all

35:05

sorts of signals that showed that this

35:08

was resonating. We had zero revenue. And

35:11

so, you know, it was raise money and

35:13

slowly slowly get closer and closer to

35:16

solving the business problem. And I

35:17

think we were just a year or two late or

35:19

like later than the average startup, but

35:22

the general key frames were the same,

35:24

just on this slightly wrong timeline, I

35:26

guess.

35:27

>> And then when you decided to do a

35:29

business, this was you already had the

35:31

the hashy stack and then you built

35:33

managed offering. I remember.

35:34

>> Yeah. Our first foray into

35:36

commercialization was a total failure.

35:39

It was this

35:40

>> Oh, really?

35:40

>> Yeah. We had this product that some

35:42

people You had You would have to have

35:44

been a diehard HashiCorp product

35:46

fan to know this, but we had this first

35:48

product that was called Atlas. And the

35:50

idea was commercially shipping the

35:53

vision of running all the products. And

35:55

so the, you know, there were a couple death

35:57

knells there. Um, one of them was that

35:59

you had to run all the products. And so

36:01

if you were just like a vault user, you

36:03

had a really impossible time buying or

36:06

buying into our commercial product. And

36:07

the second was just that it was just a

36:09

huge problem to, like, attach onto,

36:12

regardless of the adoption required.

36:13

You're trying to solve the problem that

36:16

multiple different buying organizations

36:18

in a company were fighting over. So like

36:21

even the people who had adopted all our

36:22

tools, we ran into the problem of who

36:24

pays for it.

36:24

>> It wasn't as simple as engineers paying

36:26

for it.

36:26

>> Correct. And I think um one of the

36:28

lessons that I would have, you know, I

36:29

would have for engineers that become

36:31

founders that don't have a business

36:32

background. One of the tough lessons I

36:33

had to learn is that companies want to

36:35

pay for software, but they will fight

36:38

over whose budget owns that.

36:40

>> Budgets are important, right?

36:41

>> Yes. So, the budget has to exist and if

36:44

it looks like a networking problem,

36:45

they're going to say, "Oh, networking

36:46

should pay for that." So I have

36:47

more budget to buy my other toys that I

36:50

want

36:50

>> or I can hire more people or uh yeah, it

36:53

could get broken down into like vendor

36:54

budget. So, it could already be

36:56

earmarked for external purchase. Yeah.

36:58

So, we have this product that was like,

36:59

does security pay for it? Does

37:00

networking pay for it? Does um

37:02

infrastructure pay for it? Like,

37:03

does dev tooling pay for it? Like, where

37:05

does this go? And it's just that

37:07

Spider-Man meme where everyone's

37:08

pointing at each other. Ultimately, you

37:09

don't sell anything. And so, that was a

37:13

failure for that reason. So, I don't

37:15

remember the total time we chased this

37:18

down, but uh we had a board meeting for

37:20

sure on a Friday, and board meetings

37:23

were usually on Fridays. And and we had

37:25

this board meeting. We're based in the

37:26

city of San Francisco. Um board meetings

37:29

were an hour south in in um real Silicon

37:32

Valley. And it didn't go well. It wasn't

37:35

there was no no yelling. There was

37:37

nobody saying you guys are messing up.

37:41

There was nothing like that. It was just

37:42

the way I describe it is when your

37:45

parents aren't happy with you, but they

37:47

don't have to say that they're not happy

37:48

with you.

37:49

>> You know,

37:49

>> but you know they're not happy with you.

37:52

We had this board meeting. We drove

37:53

home. Arman and I, the complete drive home

37:56

was silent. And, you know, it's Friday

37:59

night so usually what we do is we'd go

38:01

straight to... Arman lived in the city

38:03

and I lived in LA already but we'd go

38:05

straight back to Arman's place and

38:07

just like have a glass of wine debrief

38:10

talk through things and we didn't talk

38:12

on this car ride home Arman drove

38:14

straight to the office I didn't question

38:16

that uh we went into the office um sat

38:20

at a table not much larger than this

38:22

only difference was there would be a

38:23

whiteboard

38:24

I think one of us at that point said,

38:27

"Well, that didn't go well." Uh, we both

38:31

knew it. We didn't feel good. And, uh,

38:35

the the sequence of events here is now

38:36

very fuzzy. But at a certain point, we

38:39

decided,

38:41

let's play this experiment where if

38:43

there was no sunk cost, if we were

38:45

starting from scratch, what would we do

38:46

differently today? We whiteboarded all

38:48

this stuff out. What we whiteboarded out

38:50

was per product enterprise products and

38:53

doing vault first and all this stuff. We

38:55

wrote it out, spent some amount of time

38:57

there. It's still Friday. It might be

38:59

Saturday in terms of the time of day,

39:01

but it's still Friday. I think it was

39:02

Arman who looked at the board and goes,

39:05

"Why don't we just do that?" Like, why

39:07

not? Like, and and and I was like,

39:09

"Yeah, why not?" So, we decided over the

39:12

course of that weekend to just throw it

39:15

all away. Just throw everything we were

39:17

doing before away. We had two paying

39:18

customers. We're like just breach

39:20

contract. I don't know. Like figure it

39:22

out. Like get out of it. We're done. And

39:25

we convened an all hands meeting on

39:26

Monday. Probably only about 20 or 30 people

39:29

in the company that time, but we

39:30

convened an all hands meeting over Zoom.

39:33

Um I think and we might not have used

39:34

Zoom then, but whatever video chat. And

39:36

we said, "Okay, we're switching

39:37

directions. We are now enterprise as our

39:41

customer open core per product." We

39:44

would have this open source and we would

39:46

have a forked version internally that

39:48

had closed source features. Yeah, it was

39:50

a fork but yeah um open core business

39:52

model. Arman and I thought people would

39:55

quit like we thought we would lose like

39:58

we didn't have an exact number. We thought

39:59

it would uh shatter some level of

40:02

confidence and like wow these these guys

40:04

have no idea what they're doing. We

40:05

didn't have any idea what we're doing.

40:06

Um, and you know, yeah, open core even

40:09

then had a bit of a like icky taste in

40:12

people's mouths. And so like we thought

40:13

people would just like philosophically

40:15

quit being like, "No, I came here to

40:16

work on open source. I'm not going to do

40:17

open core." Um, enterprise was kind of

40:19

just like a stodgy, boring thing. There were,

40:21

like, multiple facets of why people

40:23

might quit. Nobody quit. Uh, the vibes

40:27

in Slack were amazing, super positive.

40:31

>> Oh, what happened, you think? Like, why,

40:34

people internally...

40:34

>> We asked about it in one-on-ones and

40:36

follow-ups. We asked about it and it was

40:38

really like everyone was kind of just

40:40

like buzzing that we had a clear

40:42

direction and a conviction and you know

40:45

there was fear of the unknown. But but

40:47

before there was this feeling of like

40:49

we're just we're just throwing darts at

40:52

the wall and doing this thing and we

40:54

don't know who exactly who our customer

40:56

is. And there was all this uncertainty

40:57

in a different way. And now it was like

40:59

we don't know if this will work but at

41:00

least we're just gonna sprint towards

41:03

this like there's these clear things

41:04

which was like definitely enterprise

41:06

definitely open core definitely vault

41:07

like all these things were set in stone

41:08

that gave us a different set of

41:10

certainty that suddenly the company was

41:12

like let's go um so yeah nobody quit it

41:15

went super well and um we started I

41:18

don't, I don't know the time of year,

41:19

but it was like in the fall we

41:21

built Vault Enterprise um by the new

41:23

year within like the first quarter of of

41:26

trying to do sales we could just

41:28

tell that it was different. It wasn't

41:29

like obviously successful yet, but just

41:31

the caliber of conversation we're

41:34

having, the the distance we were getting

41:36

in the buying process and the speed

41:38

we're doing it, it just felt different.

41:41

>> And what was different of this approach?

41:42

Yeah, I mean part of it just comes down

41:44

to like the classic startup like listen

41:46

to your customer, and we should have

41:48

listened from the beginning because uh

41:50

our potential customers were

41:52

screaming at us to do what we ended up

41:54

doing which is we would give these

41:55

pitches about adopt all the products and

41:57

buy this pie in the sky thing and and

41:59

there were so many meetings where

42:00

someone would be like okay I'll think

42:02

about that but how do you replicate your

42:05

secrets in Vault? You know, they would just

42:07

like ask these questions where if you if

42:09

I was just listening I was so blinded a

42:12

lot of us were blinded, but I was so

42:13

blinded. If I was just listening, I'd be

42:14

like, "Wait, a lot of people are asking

42:16

about secrets replication." And that's an

42:19

at-scale problem. Maybe we could close

42:23

source that, right? Like that's what we

42:25

ended up doing is that was our first

42:27

feature with with secrets replication.

42:29

Uh not even across data centers. The

42:30

first feature was just like a cluster of

42:33

Vault servers in a single, uh, region. You

42:36

would sell this more focused product,

42:38

but now kind of the problems I talked

42:40

about earlier, security was definitely

42:42

the buyer. There was an obvious budget,

42:44

obvious person you were talking to.

42:46

There was a feature that it resonated

42:49

with that scale. And so we were just

42:51

having much higher quality meetings in

42:54

terms of, uh, getting this done. Mitchell

42:58

just talked about how HashiCorp managed to

42:58

build a product that enterprise

42:59

customers cared about and wanted to buy

43:01

because it resonated with their scale.

43:03

This brings us nicely to our presenting

43:04

partner for the season, Statsig. Statsig

43:07

offers engineering teams tooling for

43:09

experimentation and feature flagging

43:10

that used to require years of internal

43:12

work to build and is especially

43:14

important at enterprise scale. Here's

43:15

what it looks like in practice. You ship a

43:17

change behind a feature gate and roll it

43:19

out gradually, say to 1% or 10% of users

43:22

at first. You watch what happens. Not

43:24

just did it crash, but what did it do to

43:26

the metrics you care about? Conversion,

43:28

retention, error rate, latency. If

43:31

something is off, you turn it off

43:32

quickly. If it's trending the right way,

43:34

you keep rolling it forward. And the key

43:36

is that the measurement is part of the

43:38

workflow. You're not switching between

43:40

three tools and trying to match up

43:41

segments and dashboards after the fact.

43:43

Feature flags, experiments, and

43:45

analytics are in one place using the

43:47

same underlying user assignments and

43:49

data. This is why teams at companies

43:50

like Notion, Brex, and Atlassian use

43:52

Statsig. Statsig has a generous free tier

43:54

to get started, and pro pricing for

43:56

teams starts at $150 per month. To learn

43:59

more and get a 30-day enterprise trial,

44:00

go to statsig.com/pragmatic.

44:03

And with this, let's get back to the

44:04

episode and what came after they built

44:06

Vault.

44:07

>> And I get asked on the open source side

44:09

all the time, but these buyers, like

44:11

corporate buyers, do not care at all

44:14

about open source. They don't care at

44:16

all. Like, they need a commercial

44:19

agreement. And so the the closed source

44:22

nature of it like some people needed

44:26

like legal protections around like code

44:27

escrow in terms of downtime and stuff

44:30

like that. That was about the extent of

44:31

it. Otherwise they were like you know we

44:33

need support. We need proof of concept

44:35

to prove it works. Um we need some white

44:37

papers in terms of like other customers

44:39

scale blah blah blah. Um and yeah that's

44:42

what we had to build up after that and

44:43

get going.

44:44

>> And then so you started selling at Vault

44:46

and then you did it for the other

44:47

products as well, right? Yeah, we did

44:48

Terraform and we did Consul. Um,

44:52

we had it for all the products but but

44:54

you know all this data is public. You

44:55

could look at it, well, for a

44:57

period of time it was public. You could

44:58

look in like the public reports of when

44:59

HashiCorp was a public company. Um, you know,

45:01

it really broke down to all Terraform.

45:03

>> One thing I I remember is Terraform just

45:06

became so so so popular across the

45:09

industry. So like you know like there's

45:11

a Hashi stack, but I only learned later that

45:14

all the other parts existed cuz like

45:15

Terraform just seemed to be everywhere.

45:17

Why why do you think that sudden

45:19

popularity was?

45:20

>> It's so funny to hear that because I

45:23

accept and know that now and I feel the

45:25

same way that you feel now that

45:26

Terraform is this huge thing. But for

45:28

the longest time like we were the

45:30

Vagrant company; like, all the other tools

45:32

were like no one knew the other tools

45:34

and not only that like Terraform uh I

45:37

one of the things that kind of

45:38

frustrates me I haven't heard it

45:39

recently but for a period of time one of

45:41

the things that frustrated me was like

45:42

oh they they only won because they were

45:44

first to market. I hear that a lot and

45:46

we were like seventh to market. Okay. So

45:50

like

45:50

>> to market in in in what category?

45:52

>> In terms of that infrastructure as

45:53

codes.

45:55

>> So there were like other like players

45:57

who you know

45:58

>> so many Yeah. Yeah. And and no one was a

45:59

clear winner. It was a warring market.

46:01

But like that first year 2014 when we

46:05

came out Terraform I you know at that

46:07

time one of my marketing strategies was

46:09

I was at every conference I I went to I

46:12

traveled an obscene amount. I was

46:15

speaking wherever I could, but even if I

46:17

couldn't speak, I was going just to talk

46:18

to people. Um, and there's actually a

46:21

little anecdote here was when the COVID

46:23

lockdowns happened in March 2020, my

46:25

wife and I had nothing to do at night.

46:27

We didn't have kids yet. And we opened

46:29

up our calendars and we realized that it

46:32

was, we had been dating since 2012, and it was

46:35

the first time in almost 10 years of

46:37

our relationship that I would have been

46:41

in the same place longer than 8 days.

46:44

No, for almost 10; it was at 9 years.

46:46

For 9 years straight, I had been

46:48

somewhere different at least every eight

46:50

days.

46:50

>> That's how much you traveled.

46:51

>> That's how much I traveled. Yeah. And I

46:53

know there's consultants that travel a

46:54

lot more and stuff, but like I was

46:56

traveling a lot. I was coding a lot. I

46:58

was like doing all these things. Um

46:59

>> you must have like coded while you

47:01

traveled as well.

47:02

>> All all the time. Yeah. I had a whole

47:03

system. When I started traveling,

47:04

inflight Wi-Fi didn't exist.

47:05

>> Yeah. Yeah. Exactly.

47:07

>> Even now it's kind of patchy.

47:08

>> Yeah. So I wrote these scripts that I

47:09

ended up iterating on but mostly used. Um,

47:12

where I downloaded all the GitHub issues

47:14

and I categorized them and I would just

47:17

break it down into tasks, none of which took

47:19

more than 10 to 15 minutes and I just

47:22

created this list and and when I was on

47:24

the plane I would just one by one bust

47:26

them out.
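His exact scripts aren't public, but the triage step he describes (pull the issues down while online, keep only small self-contained work, and cap the list at what fits a flight) might be sketched like this; the "bug" label filter and the flat 15-minute-per-task estimate are assumptions:

```python
import json
from urllib.request import urlopen


def fetch_open_issues(repo: str) -> list[dict]:
    """Pull open issues for a repo while still on the ground (hypothetical
    stand-in for his private scripts; uses the public GitHub REST API)."""
    url = f"https://api.github.com/repos/{repo}/issues?state=open&per_page=100"
    with urlopen(url) as resp:
        return json.load(resp)


def plan_offline_session(issues: list[dict],
                         minutes_per_task: int = 15,
                         flight_minutes: int = 300) -> list[int]:
    """Keep only small, self-contained work (bug fixes, no heavy design work)
    and take as many issues as fit the flight at ~15 minutes apiece."""
    small = [
        i for i in issues
        if "bug" in {label["name"] for label in i.get("labels", [])}
    ]
    capacity = flight_minutes // minutes_per_task
    return [i["number"] for i in small[:capacity]]
```

Each planned issue then gets fixed and committed locally in flight; the push, and the burst of issue-closed notifications, happens on landing.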

47:28

>> Uh there's no internet so just commit

47:29

them locally.

47:30

>> Yeah.

47:30

>> And then I would get back and and some

47:32

people used to notice this cuz I would

47:33

land and you would get this push and

47:35

people would get these email

47:36

notifications where like 30 issues were

47:38

closed all at once.

47:39

>> Wow. But I found the key was

47:41

pre-planning what issues you were going

47:43

to work on. I did that online on the

47:46

ground.

47:46

>> Yeah.

47:47

>> And then breaking them down into

47:48

15-minute chunks because I found it was

47:50

really hard to get into like multi-hour flow

47:53

even when I was traveling to like Japan

47:54

or something. It's really hard to get

47:55

into like multi-hour flow on an

47:57

airplane. So I was like, I'm only going

47:59

to work on the stuff that isn't like

48:00

heavy design work, none of that. It's

48:02

just like bug fixes, right? Like just

48:04

cleaning stuff up. And so that was my

48:05

process. In 2021, HashiCorp went

48:09

public. What is it like to go public?

48:11

Both in terms of preparing for it, how

48:13

did it feel? What changed after on the

48:16

prep side? I don't have the full answer

48:18

because I also stepped down from the

48:20

executive team about maybe 6 months

48:22

before we went public. So, I was part of

48:23

some of the planning and obviously I was

48:25

very aware that we were planning to go

48:27

public and but um like for example, I

48:30

wasn't part of the road show or any of

48:32

that. But yeah, you know, from my seat,

48:35

the the parts that I was part of, the

48:37

parts I had visibility on to, I mean,

48:38

it's it it takes over a year to to do

48:41

it. So, there's a lot of prep and and

48:44

there's some funny things that you do

48:45

like you start running like a

48:48

public company at least two quarters

48:51

before you're public.

48:52

>> Um, I don't remember what the drop

48:53

dead date is, but there's a date where

48:55

you could just like cancel going public

48:57

and it's pretty close. Like, it's like

48:59

very close to to when you actually like

49:01

have that day. So you you kind of run

49:04

like a public company and to the point

49:06

where you do mock earnings calls like

49:09

you actually with a conference room

49:12

table. Your investors are the public

49:15

investors that aren't in the room. They

49:17

go somewhere else and they talk over the

49:19

speaker phone and ask you the types of

49:20

questions. Um your CFO or VP of finance

49:23

gives the full report of the quarter.

49:26

They try to frame the types of questions

49:28

you get. You run it and you try to

49:30

figure out like whether it's running

49:31

well enough, I guess. And that's sort of

49:34

what the prep feels like. And there's a

49:36

an obscene amount of secrecy because um

49:39

from a regulation standpoint, you can't

49:40

talk about any of this. And so I mean

49:42

you could look back at even the dumb

49:44

stuff like hacker news comments like I

49:45

just went radio silent. It's the clearest signal

49:47

that a company's going to go public cuz

49:49

I went radio silent on every topic

49:51

because everything became questionable.

49:54

I remember there was just a point cuz I

49:56

there was a Hacker News comment I gave

49:59

like eight months before it went public

50:01

and our general counsel like in the

50:04

middle of the night was like you have to

50:05

delete that after he talked to me I was

50:06

like I could see how that might affect

50:08

things, but like I didn't realize it

50:10

mattered, and I ended up deleting it.

50:11

>> And is is this because you're not

50:14

supposed to give public information away

50:17

or something like that?

50:18

>> I don't remember the exact regulation to

50:20

be honest.

50:20

>> Yeah. But there there's some regulation

50:21

about like uh like not leaking

50:25

information or

50:26

>> Not really. I mean, it is, it's

50:28

all information but it's more about like

50:30

you can't influence the market uh in any

50:33

way. And so yeah and you can't make

50:35

promises because if you say oh we're

50:37

going to go public it might cause even

50:38

private funding to froth up and it's

50:42

a form of fraud. Um so yeah

50:45

basically like I just stopped talking

50:47

about everything. I don't know how

50:49

seriously other people take it, but I

50:51

took it to the point where I planned

50:53

this trip to New York to go public and I

50:57

invited my parents and I didn't tell my

50:58

parents why we were going to New York

51:00

and I just told them I want you to go to

51:02

New York. It's really really important.

51:04

It has to do with HashiCorp and they were

51:05

like, "Sure." And I said, "I can't tell

51:08

you about it." And they said, "Sure."

51:10

And I told them maybe a month in

51:12

advance. We had a dog. We had to get our

51:13

dog sat by my aunt and I just told them

51:17

we're going on a family vacation up to

51:19

the point we left. Like, I didn't

51:20

tell anybody; nobody except my parents knew,

51:23

basically. Um none of my friends,

51:26

nothing. Um except the friends that

51:27

worked at the company. Um but yeah, it

51:29

it that's what it's like leading up to

51:31

it. Yeah, I I was at Uber when we went

51:34

public and and previously I read that uh

51:36

while before going public uh HashiCorp

51:39

VMware made an offer earlier way

51:42

>> early that was like in super early

51:44

>> like two years into the company. We went

51:45

probably like 10 years in the company.

51:46

So yeah.

51:47

>> Yeah. So so like when when they tried to

51:48

to to buy you like what was it like did

51:50

you almost sell at some point? Was there

51:52

any point where where you were close to

51:53

potentially selling?

51:54

>> It felt close, and

51:57

I got a lot of accounts afterwards that

51:58

it was very close. It came down to like

52:00

one vote on the VMware board was what I

52:02

heard, about two years into the company. We

52:03

were only three people, including

52:05

me and Armon. So we had one employee; I

52:07

guess it was two founders and one employee,

52:08

three of us. We got approached by

52:10

VMware. Um you know I didn't know what

52:13

this would be like. And what

52:15

it isn't is: they don't show

52:18

up and say, we would like to buy you.

52:20

>> No

52:20

>> no no

52:21

>> that would be too obvious. The way it

52:23

happens is you get an email from some

52:25

low-level business development person

52:27

that wants to just like talk vaguely.

52:30

And the vague talk is they're not

52:32

interested in buying you. The one of the

52:34

jobs of BD people at large companies is

52:37

just to have an understanding of the

52:38

ecosystem. So it's really just like

52:40

let's have an understanding. They might

52:42

have had an executive tell him or her to

52:44

go talk to this company. There might

52:46

already be an executive kind of poking

52:47

around, but yeah. So it kind of starts

52:49

out that way. It turns into would you

52:52

like to come by our offices and meet in

52:53

person? Um oh our VP of engineering

52:57

swung by let's talk to him nice to meet

52:59

you blah blah that uh then I think this

53:02

is like our actual timeline and then I

53:04

think uh there was a dinner where there

53:06

was three VMware executives at the

53:08

dinner. Um at that point we thought they

53:12

might be interested but it was still so

53:15

so much dancing. Oh, and this is

53:17

just months before there

53:19

was even an offer. It was still so

53:21

social like we drank, we talked about

53:23

our hobbies and interests, and not really

53:26

about, I mean, only very basic stuff about tech.

53:30

It's really more vibes. They go to

53:32

dinner. Uh and then it started to get

53:34

more serious. We spent more time in Palo Alto

53:36

at the VMware offices where we started

53:37

talking about partnerships about how how

53:40

can VMware help our products more and it

53:43

starts about partnerships and then it

53:44

turns into like hypothetical if you had

53:47

the resources of VMware what what would

53:49

you do you know we're like six meetings

53:53

in at this point there's no no offer of

53:55

anything and then at a certain point um

53:58

honestly we were getting tired of it

53:59

because nothing was happening anyway

54:00

>> sounds like you're a startup and you're

54:02

going to all these meetings

54:03

>> Oh, I didn't even live in the Bay Area.

54:05

So I was flying up all the time. It was

54:06

a waste of time. And um and to a lot of

54:09

founders that is the warning I give them

54:11

is M&A becomes a waste of time. So I

54:12

have another M&A anecdote.

54:14

>> M&A becomes a waste of time.

54:16

So I'll tell you another anecdote after

54:18

this but um ultimately we kind of

54:21

politely had the, like, okay, let's do this

54:24

or get off the pot kind of

54:26

conversation and they uh put an LOI in

54:29

front of us which is a uh letter of

54:32

intent. Letter of intent. The LOI was it

54:35

was one page. Um, you know, it was it's

54:38

basically like a semi-binding

54:40

promise that we're pursuing buying you.

54:44

Uh, no number on there. It's just like

54:46

kind of vague.

54:47

>> Still no number.

54:48

>> Yeah. Well, verbally, but they're not

54:50

they're not writing anything down.

54:51

They're not putting anything in email,

54:52

none of that. It's just verbal. And so

54:54

at that point, verbally,

54:55

>> we had gotten a number of $20 million,

54:58

which

54:59

>> doesn't sound that much. Well, yeah, but

55:01

we're 23 years old. Oh, yeah. The three

55:04

of you 23 years old.

55:05

>> I'm 23 years old. Me and Armon together

55:08

own 70% of the company. Um

55:11

>> Okay. Yeah.

55:12

>> Yeah. It you know it it sounds

55:15

interesting to say the least. What I

55:17

tell people is you start

55:21

thinking about the things you will buy.

55:22

That's

55:25

a dangerous path. That's what

55:27

happens. And we had advice from people

55:30

who said it's like phenomenally too low,

55:33

like wildly too low, so go ask much

55:36

higher. And we asked, I don't remember

55:39

anymore, but we asked for maybe like 40

55:40

or 50 or something. And they just said

55:42

yes. They said okay. And then, you know,

55:44

it's like way too low.

55:47

And uh and that was verbal, too. So,

55:48

there was nothing binding about that.

55:49

Yes. It was just it was it wasn't like

55:51

yes. It was more like okay, we'll work

55:53

on that, you know, but very positive.

55:55

>> Yeah.

55:56

like in this indirect way. It's indirect.

55:58

>> In an indirect business sense. Indirect.

55:59

Yes. And it turned into come meet the

56:01

CEO of VMware, you know, like clearly

56:03

they're interested cuz we're like

56:04

climbing still. Armon and I kind of

56:07

started getting cold feet because it's

56:10

it's it's the way we described it is

56:12

it's a dreamkilling amount of money.

56:14

It's like you would take the money but

56:16

you're too small to be important to a

56:18

company like VMware. So they're gonna

56:20

just

56:20

>> because because even though it's like so

56:22

much money but

56:23

>> personally it's so much money

56:24

>> but but you know that at VMware level I

56:26

guess you see their revenue their all

56:27

that you realize that for them it's not

56:29

a big deal.

56:29

>> It's meaningless to them. Yeah. It's

56:31

meaningless.

56:32

>> It's crazy. That's that messes with your

56:33

mind you know.

56:34

>> Yeah. Yeah. So it becomes this thing

56:35

where it's like personally your life

56:37

could change but this thing that we both

56:39

were truly passionate about like the

56:42

thing I wanted to work on more than

56:43

anything else would end in a sense

56:45

because you know I would probably get

56:48

thrown into like working on ESX or

56:50

something you know and

56:51

>> You would get a manager at VMware,

56:53

not even the CEO

56:54

>> the executives make it sound like

56:55

they're going to do all this stuff with

56:56

your products but like that's just one

56:58

executive in a cog of corporate

57:00

machinery. So we started getting cold

57:02

feet, being like if they're interested

57:04

maybe we're on to something. If we're on

57:07

to something, why would we sell out early

57:09

and sell out in a way where our dream

57:11

dies? That's why it's like a dream

57:12

killer. Armon very maturely, and he's two

57:15

years younger than me so he's 21 at this

57:17

time.

57:17

>> No, he's he sounds like the older one.

57:19

>> Yeah. Yeah. Yeah. Yeah. Yeah. He's very

57:20

mature. And Armon very maturely came up

57:23

with the uh I forgot where it comes from

57:26

but the, not the risk minimization, um,

57:28

the regret minimization

57:31

framework. He was like what personally

57:33

on your own go think and I'll do the

57:35

same and let's come up with a number

57:38

that if we walked in the next day and

57:41

they said we're killing everything.

57:43

You're going to go work on ESX for the

57:44

next four years, cuz we were

57:46

going to have a lock up no matter what.

57:47

you're gonna work next four years that

57:49

we would be like cool this was worth it

57:51

like what's the minimum no regret or

57:54

minimum regret we came back and I don't

57:55

remember exactly what our numbers were

57:56

but they were pretty close and we ended

57:58

up at 100 and we and so we're like it

58:01

felt so wrong like how could we possibly

58:02

ask for 100 but we're like we said this

58:04

is what we're going to do and we stuck

58:05

to it so we went back we asked for 100

58:09

and it wasn't a no

58:11

>> and it wasn't a yes. This one had

58:13

a lot more hesitance it was a lot more

58:14

like

58:15

>> uh we'll get back to you right like I

58:18

don't know but it wasn't a no

58:21

and basically they came back to us and

58:24

said this requires board approval so

58:27

we're convening a board meeting next

58:29

week, like, unplanned, that's not when

58:31

their board normally meets, like we're convening the

58:32

VMware board we're going to vote on this

58:34

and then we heard that uh the vote

58:36

didn't pass. That was that.

58:37

>> it's just crazy how

58:39

>> so such small things could you know like

58:42

influence if if that was an extra yes

58:45

>> who knows what your

58:47

one person you you might have, you know,

58:49

like it's hard to but you know in VMware

58:51

you might have been plugging away on

58:53

like this this project.

58:54

>> Yeah. Yeah. I mean we didn't build

58:55

Terraform yet. So Terraform

58:57

>> Terraform probably never

58:59

would have existed. High confidence I

59:00

know who the vote was. I know why they

59:02

voted that way. Like I know a lot more

59:03

details but it's like I it's it worked

59:05

out obviously in my favor. But yeah.

59:07

>> So you've left HashiCorp, and

59:10

you're you're independent. And one thing

59:12

that's cool about being independent is

59:13

you're just very honest about stuff. And

59:15

there was this really interesting thread

59:17

where on on Twitter you wrote about you

59:20

said like ask me anything about the big

59:21

cloud providers because at HashiCorp

59:23

you work with all of them. What was your

59:25

experience back then uh of you know like

59:28

Azure, AWS, Google Cloud like like your

59:31

kind of honest view of how they work

59:33

back then and possibly like how has your

59:35

views changed on them. The precursor to

59:37

that is while I was at HashiCorp, I

59:39

obviously had to be very careful about

59:41

what I said about any of the cloud

59:42

providers because we're partners with all

59:44

of them. We're partners and I didn't

59:46

want to insult anyone and and so I was

59:48

just very professional about all their

59:50

relationships and like

59:51

>> we like all of them like

59:53

>> yeah, or just say nothing. If you have

59:55

nothing nice to say don't say anything

59:56

at all. And then I left and I was still

59:57

I kept that up because it was too close.

59:59

I was I was still flying too close to

60:01

the sun as they say. and then enough

60:02

time passed where I was like ah like my

60:05

opinion doesn't really matter and um

60:08

yeah so my to answer your question um my

60:11

broad view of all of them was that AWS

60:14

was really arrogant annoyingly arrogant

60:18

was how I describe it

60:19

>> and and when you say arrogant like can

60:21

you help us understand like how you work

60:23

with them or what part of them or like

60:25

is just general I

60:27

>> I'll start disclaiming this though that

60:29

you know we worked with so many people

60:30

there, and there are individuals at all

60:34

of them who are awesome and nice and

60:35

kind. And so I'm not trying to make like

60:36

individual judgments here. It was just

60:38

more of like how all of it came together

60:40

and how it felt as a as a whole. So by

60:43

arrogant I mean it always felt like they

60:45

were doing us a favor at every turn in

60:47

terms of partnerships, in terms of just

60:49

getting a meeting with them. It always

60:51

felt like you should be thankful that

60:54

we're spending time talking to you. And

60:55

not just that, but also like there was

60:57

always this subtle vibe of like we will

61:00

just spin up a product and kill your

61:02

company. You know, it felt that way; no

61:04

one ever said that. Um well, it kind of

61:06

got to a point where it was sort of like

61:08

if we don't come to terms, we're going

61:09

to build this service. It it did kind of

61:11

come to that,

61:11

>> but you know, we did see that later on

61:13

with Elastic and

61:15

>> Oh, that had already happened.

61:17

>> Oh, it happened already.

61:18

>> Yeah. Just not with us, but with other

61:20

open search.

61:20

>> Yeah. and they always publicly spun it

61:22

as like, oh, it's so great and

61:25

builds the ecosystem larger

61:27

and we're doing it by the letter of the

61:29

license, and you know, it all has truth

61:31

elements to it, but it's still not a

61:33

nice thing.

61:33

>> No, I I I think like um I I I don't

61:36

think people paying attention to open

61:38

source appreciated what Amazon did with

61:41

with it. It it really hurt Elastic's

61:43

business and it showed how open source

61:45

can be weaponized against a company that

61:47

spends, you know, their blood, sweat,

61:48

and tears. And I guess you know HashiCorp

61:50

you had the same thing right cuz cuz you

61:51

you were publishing permissive but I

61:52

mean open source needs to be permissive

61:54

so

61:54

>> it was MIT or MPL license. Yeah.

61:57

>> So like Amazon could have spun up any

61:58

anything they wanted.

62:00

>> Yeah.

62:00

>> There was like a 2-year period where I I

62:04

think for the entire two years the

62:05

entire leadership team was terrified

62:08

that at any moment there would be like a

62:10

vault service or something would pop up.

62:14

Um and so yeah that's that's sort of my

62:16

characterization of AWS. It really took

62:18

like for example teeth pulling to get

62:20

them to help with the AWS Terraform

62:23

provider. Um we had I don't remember the

62:25

exact number but we had something like

62:27

five full-time engineers employed

62:30

working on only the AWS provider for

62:32

Terraform which you know maths out full

62:35

benefits and everything to like a

62:36

million dollars a year.
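As a sanity check on that figure: at an assumed fully loaded cost of roughly $200k per engineer per year (he only states the total, so the per-head number is our assumption), five engineers do come to about a million dollars a year:

```python
engineers = 5
cost_per_engineer = 200_000  # assumed fully loaded cost (salary + benefits + overhead), USD/year
annual_cost = engineers * cost_per_engineer
print(annual_cost)  # 1000000
```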

62:37

>> Um and all of that was pure open source

62:42

pure integration with a commercial

62:43

entity and they were not helping us at all.

62:47

and and they were the last of any of the

62:49

cloud providers to to provide any sort

62:50

of help there. And it it came down to

62:52

some drama where we went to a meeting

62:56

and basically said that we're going to

62:59

publicly say that the AWS provider is

63:01

deprecated and we're done. Like the

63:04

community could pick it up or whatever,

63:05

but we're not we're going to

63:07

>> Yeah. Cuz you didn't get any help from

63:08

them.

63:08

>> Yeah. And it's taking up too much work

63:09

and there's too many bugs and you're

63:10

shipping. Honestly, AWS is shipping

63:12

features too fast and like it's just

63:13

like not worth it. And that freaked them

63:15

out and finally they started helping.

63:17

You know, they might recount their side

63:19

of things differently, but that's pretty

63:20

much it felt like no movement for years

63:23

and we said that and movement started

63:25

happening really fast. So yeah, there

63:26

was that. Um Microsoft I would I have

63:30

the most positive view on Microsoft.

63:32

They had a really hairy technical

63:33

product is how I describe it. It was

63:35

very difficult to use

63:36

>> Azure

63:37

>> Azure and a lot of nouns like like

63:41

principals, and I still to

63:43

this day and I've integrated with the

63:44

service, don't fully understand the IAM

63:48

hierarchy of Azure

63:51

um I just kind of bolted it and got it

63:53

working with a team and and that was

63:55

that but so technically kind of h but

63:58

from the business side competent um

64:01

professionals and team players that's

64:04

like how I describe it. They we we went

64:06

into every meeting with them and a lot

64:08

of our meetings the first question was

64:10

how do we both win? That was like the

64:12

first question and and yeah, very

64:14

pleasant. Awesome. They were the first

64:16

people to jump on board uh supporting

64:18

Terraform. Sure, that's some kind of

64:20

bias, but like they were consistent

64:22

throughout the years. So positive on

64:24

Microsoft. Um and Google Cloud, you

64:28

know, my my or yeah, Google Cloud in

64:30

general, it was always like the best

64:32

technology, the most incredible

64:35

technology and architectural thinking.

64:37

And I swear none of them, it felt like

64:41

none of them cared or thought about the

64:43

business at all. It was like every

64:46

partnership meeting we'd spend hours

64:48

talking about the coolest edge cases and

64:52

scalability and how this is going to

64:53

work and like I I think the the best

64:56

public example that you could just see

64:57

in history was they were the only

64:59

company that when they partnered with us

65:01

to write the provider they spent a lot

65:04

of time building this very good I think

65:06

they called it magic something they they

65:07

they fully automated the whole thing. So

65:09

when they shipped the new Google Cloud

65:11

thing, it had a Terraform provider

65:12

resource right away and not just like it

65:15

didn't feel automated. It felt very

65:17

ergonomic and like it was good. It was

65:19

really good. And so that they had that.

65:22

But whenever we would get into how do we

65:24

do co-sell? How do we like attribute your

65:29

sales engineers quota to selling like

65:32

infrastructure that's spun up by

65:33

Terraform? Like how do like how do we do

65:35

this? to like the business side of

65:36

things.

65:37

>> Crickets like impossible to get anyone.

65:40

Not just impossible, it was like even if

65:42

you got someone, they would say

65:44

something for 20 minutes and be like,

65:45

"Okay, cool. We have two more hours.

65:46

Let's figure this other thing out." And

65:49

yeah, it was it that's what it felt

65:50

like. And the other disclaimer I give is

65:52

all this knowledge was circa, I don't

65:54

know, 2019,

65:56

something like that. So maybe in the

65:58

past seven years things have

65:59

dramatically changed, but that's what it

66:01

felt like.

66:01

>> Yeah. Going to open source. You're

66:04

you're actively involved in in open

66:06

source and open source today and it

66:09

seems open source is changing a lot

66:12

especially with with with AI and and you

66:14

know you're seeing stuff at Ghostty like

66:15

can can you tell us like how you know

66:18

open source has changed with Ghostty

66:20

with AI contributions and and what what

66:21

are you seeing with with open source

66:23

maintainers? Seems like there's a bit of

66:24

a you know like drama or worrying stuff

66:26

happening. Well, I would say more

66:28

broadly the issue facing open source

66:31

today um is I mean there's there's

66:34

multiple but the one that I feel is most

66:37

prevalent across industries right now is

66:40

AI contributions and, specifically,

66:45

the signal-to-noise ratio

66:47

being incredibly low. In other words,

66:50

just being super noisy with low quality

66:52

contributions. It's just stressing the

66:54

system quite considerably. And yeah.

66:56

>> And so after you left

67:02

HashiCorp, you started Ghostty. Uh how

67:02

many years ago was that? Was that like

67:03

two years or so?

67:04

>> Well, I left HashiCorp over two years ago

67:06

or a little over two years ago. I had

67:08

like

67:09

>> poked around with prototypes of Ghostty

67:11

like maybe 3 years ago. But after I left

67:13

Hashorb, I started just like kind of

67:15

working on it like 20 hours like much

67:17

more um just because it was the thing

67:19

that I had.

67:20

>> What drew you to Ghostty? What was your

67:22

kind of vision? Why did you start

67:23

working on it? It's a

67:25

better terminal, right?

67:26

>> It's a terminal. Better is subjective.

67:30

>> Well, I I installed it cuz I I I like it

67:32

better. But yes, a a terminal and

67:34

opinionated terminal, right?

67:35

>> Opinionated. um very modern in terms of

67:38

like supporting as many of the newer

67:40

specs as possible that enable

67:42

functionality like displaying images or

67:46

um you know clicking on your prompt to

67:47

move the cursor and but like dozens more

67:50

uh examples like that. The original

67:52

thing that drew me to it is is the exact

67:55

opposite of good advice that people

67:57

usually give to people which is that you

68:00

find the problem and you build a

68:02

solution, and you pick the

68:04

best technology to then solve that.

68:05

What I did was I found a set of

68:07

technologies and I was like what could I

68:08

build with these technologies? I went

68:10

the opposite direction and I had spent

68:12

over 10 years, 12 years, at HashiCorp

68:16

incorporated, and 3 years prior to that

68:18

doing infrastructure open source. So 15

68:20

years in total just thinking almost all

68:22

the time about infrastructure and cloud

68:24

services and things like that. And so I

68:26

had felt that I was rusty. I had sort of

68:28

like my skills had weakened on

68:31

desktop software systems programming to

68:34

a certain extent because I was so

68:35

constrained by networking challenges

68:37

distributed systems. So like low-level

68:39

systems programming had

68:41

atrophied. Um I had never really worked

68:43

with GPUs and GPUs I guess crypto was

68:46

happening but I kind of ignored that

68:48

whole trend. Um, but this is pre-AI, so

68:51

um but GPUs were obviously in use and I

68:53

I just felt like I had no idea how they

68:55

worked so I wanted to go to desktop. So,

68:56

I picked all these like different

68:57

technologies and I said, "Okay, Zig."

69:00

Cuz it looked cool to me. I just wanted

69:02

to try it.

69:03

>> Can for those of us I'm I'm not into

69:05

Zig. I heard good things about it. Can

69:07

you explain why Zig is so interesting,

69:10

innovative, and why does it grab so many

69:12

so many devs attention? I don't know why

69:14

it grabs other people's attention, but

69:16

for me, it just felt like

69:18

the best "better C" that I saw out there.

69:21

And I am someone that's coming from the

69:23

position where I actually enjoyed

69:24

writing C. So, a better C sounds great

69:27

to me. To me, it's it's not very

69:30

annoying in terms of like if I want to

69:32

blow my own foot off, please let me blow

69:33

my own foot off. You know, a bunch of

69:35

qualities came together where I thought

69:36

on the surface it looked cool, but it's

69:38

very hard to judge a programming

69:39

language on the surface. So, I wanted to

69:41

build something with it. And so, yeah, I

69:42

I picked the GPUs, desktop software,

69:46

what could I build? Uh for for all my

69:47

time at Hashorp, um I built CLIs. And I

69:51

was like, well, I live in a terminal.

69:53

Like, what does it take? I live

69:54

in a terminal and yet I understand very

69:56

little about a terminal. So why don't I

69:57

just like build a toy project that's a

69:59

terminal. That's how it started. And and

70:01

as as with a lot of stuff I find that

70:04

once you dig beneath the layer of taking

70:07

something for granted, you realize that

70:09

everything is way more nuanced and

70:11

complicated than you imagined it to be.

70:14

And terminals were the same way. Once

70:15

I dug beneath the surface, I realized

70:18

how much they were doing, how brittle

70:20

some things were, how much better

70:23

certain things could be, and I I I got

70:26

sucked into being like, I want to do

70:27

this better. So,

70:28

>> okay, for like someone who's a a dev,

70:30

you know, like I I use terminals as

70:32

well. I'm going to ask the stupidest

70:33

question. How hard could it be? What

70:35

does a terminal actually do? And then

70:37

can you maybe tell us like how Ghostty is

70:40

is structured or like what what are the

70:42

things that it it needs to do? just give

70:43

a little empathy of like actually all

70:46

the work that you're doing.

70:47

>> Yeah. Yeah. Yeah. I I actually get that

70:48

a lot. I I get that question a lot. So,

70:49

it's definitely not a dumb question.

70:50

It's really like it gets asked less now,

70:52

but a lot of people are like, "I thought

70:53

they were done" is usually the most common

70:55

feedback I get. Like, what is there to

70:56

do in a terminal? Um so, at a basic

70:59

level, they don't do a lot. The problem

71:01

is that the functionality's grown

71:04

significantly of what terminal

71:05

developers want to do. But let me let me

71:07

just give um what they do. It's kind of

71:09

like an application development

71:10

platform, right? It's

71:12

not an operating system. You're not

71:13

dealing with like hardware level

71:14

problems, but it is like an application

71:17

sandbox on top of that, and other

71:20

applications run within it and need to

71:21

render text. They need to render colors

71:24

and images and widgets and mouse events

71:27

and all this stuff. Like, the best

71:30

description is it's like it's like a

71:31

browser but for text content. And so all

71:33

of the complexities that a browser has,

71:36

a terminal has similar ones, a smaller

71:38

scale but similar ones. And if you try

71:41

to extend what a terminal is capable of,

71:43

then it it gets, you know, you start

71:46

bringing in more and more problems. Like

71:47

as soon as you brought images into a

71:48

terminal, you introduce like a whole new

71:50

ecosystem of problems. But the

71:51

tongue-in-cheek answer I like to give to

71:53

Ghostty's complexity is that it's 30% a

71:57

terminal and 70% a font renderer.

72:00

Uh and uh yeah, that's what it feels

72:03

like. It's really like a problem of uh

72:05

you know that terminal screen you see

72:06

whether it's GPU or CPU rendered that

72:08

terminal screen you see is like you're

72:10

drawing on a canvas so you are building

72:13

a renderer for text uh in there

72:15

everything kind of bubbles from there so

72:17

from a rough architecture standpoint of

72:19

Ghosty I like breaking it down in terms

72:21

of threads, because Ghostty is multi-threaded,

72:23

which most terminals are not, um, but

72:25

I'm not saying that as a positive point

72:26

it's just a good way to describe the

72:28

architecture we have a central UI thread

72:30

which just draws the windows and stuff

72:32

that's pretty standard for desktop

72:33

software and then we have an IO thread

72:35

which runs the actual shell that you're

72:37

seeing. So any bytes that we send or it

72:39

sends back to us, it's processed by the

72:41

IO thread and then we have a renderer

72:44

thread which is actually drawing it. So

72:46

it's it's the best way to think of it as

72:48

is, it's on a VSYNC clock, whether 30, 60, or

72:51

120 frames per second, it's just

72:52

sampling what the terminal state is and

72:54

then drawing it. And the renderer itself

72:56

uses a font subsystem on the same

72:59

thread. But we have to take the fact

73:01

that this grid has this character at

73:04

these sets of characters and map them

73:06

into fonts and do that all on our own. A

73:08

lot of people think, oh, doesn't the

73:10

operating system solve that for you? But

73:11

they don't unless you're much higher

73:14

level like you know you can't just draw

73:17

easily, you know, monospace text in that

73:19

way. You have to really put pieces

73:20

together. That's the the big picture. Um

73:23

it's quite simple at that level. and

73:24

then just you know extend all the

73:26

functionality the terminals have into

73:27

that.
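The three-thread split described here can be sketched roughly like this. This is a heavily simplified illustrative model in Python (Ghostty itself is written in Zig, and every name in this sketch is invented, not Ghostty's actual code):

```python
import threading
import queue
import time

# Shared terminal state, written by the IO thread and sampled by the renderer.
state = {"text": ""}
state_lock = threading.Lock()
incoming = queue.Queue()
frames = []

def io_thread():
    # Fold bytes coming back from the "shell" into the terminal state.
    while (chunk := incoming.get()) is not None:
        with state_lock:
            state["text"] += chunk

def render_frames(n_frames, hz=120):
    # On each vsync-like tick, snapshot the state and "draw" it
    # (here: just record the snapshot).
    for _ in range(n_frames):
        with state_lock:
            frames.append(state["text"])
        time.sleep(1 / hz)

io = threading.Thread(target=io_thread)
io.start()
for chunk in ["$ ls", "\n", "README.md"]:
    incoming.put(chunk)
incoming.put(None)  # the shell exited
io.join()
render_frames(2)    # sample the final state twice, inline for determinism
```

The point of the sketch is only the decoupling he describes: the renderer samples terminal state on a clock rather than being driven by every byte the shell emits, while a separate IO thread does the byte processing.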

73:28

>> So you're kind of like building a

73:30

2D graphics engine a little bit that has

73:33

like very focused on fonts.

73:35

>> Yeah. Yeah. It's it's a from a renderer

73:38

side it's very simple. The renderer is

73:40

actually not that complicated, and I

73:41

won't overcomplicate it. The hardest part

73:42

is actually maintaining the terminal

73:44

state. So the way terminals work is

73:46

they're a they're a grid of monospace

73:48

cells. So you'll have like 80 by 24, 80

73:51

columns, 24 rows, and there's commands

73:54

that the program could send to move the

73:56

cursor or say if I like to say think of

73:58

it like a paintbrush that could say make

74:00

the paintbrush red and bold and

74:01

everything after that is red and bold

74:03

and now change it and you're just

74:04

maintaining the state and drawing around

74:06

and then there's all the scroll back

74:08

right which people are used to in

74:09

terminals going back and that's where

74:11

the challenge is is doing that in a fast

74:13

performant way and that's what I try to

74:15

do with Ghostty. And I show this, there's

74:18

so many benchmarks we run, but one of

74:20

the most obvious ones that shows the

74:22

speed, which also gets a lot of

74:23

criticism, um, is just catting a large

74:26

file, reading a large file. If you just like

74:28

dump a bunch of text, how fast can it

74:30

get through it? And you'll see a stark

74:32

difference between modern terminals. I'm

74:34

not just going to say Ghostty here, like

74:35

if you take Ghostty, Kitty, um,

74:38

Alacritty, um, any of these newer

74:40

terminals, they're all going to do great

74:42

compared to Terminal.app on macOS, um,

74:46

or traditional like Linux terminals. The

74:48

criticism is why does that matter? And

74:51

you know the the easy answer is when you

74:54

um, when you accidentally cat a file, like,

74:56

a lot of people will force close. Um the

74:58

creator of Redis posted a great comment

75:00

for me, a great, uh, comment on Hacker News

75:03

about why he loves Ghosty, which is that

75:05

he previously used to tail production

75:07

Redis logs, and you know it just spews

75:09

logs out and he used to have to send

75:11

them to an intermediary file and then

75:14

read them out later so he could render

75:16

it.

75:16

>> So he could render it and actually work

75:17

with it. And he doesn't have to do that

75:18

anymore because Ghostty is fast enough

75:20

that he could just let it dump while

75:22

he's going through it, parsing it, like

75:25

like mentally parsing it, things like

75:26

that. and that just saves him time and

75:30

um yeah

75:30

>> there's something to be said at some

75:32

point we should probably talk more about

75:33

the fact that a lot of software these

75:35

days does not care about performance and

75:37

I think it's refreshing to actually have

75:39

examples and I I hope we will at some

75:42

point maybe get back to it you know

75:43

we'll talk about AI but that might not

75:45

help but there's a level of

75:46

craftsmanship right just like not

75:48

wasting resources or being efficient or

75:50

I I think we all like I I see in in my

75:52

day-to-day life like we have more

75:54

powerful resources, laptops, phones,

75:57

and they're not getting any faster and

75:59

it's just frustrating at times.

76:01

>> It's kind of like the love of the game.

76:02

I mean, a lot of lot of ghosty is just

76:04

the love of the game. Um like like I

76:06

like to say like our renderer cuz cuz

76:08

like like I disclaimed before like it's

76:10

not complicated. I'm not I'm not ever

76:12

going to say that Ghostty is like a 2D

76:15

game because a 2D game from a rendering

76:17

standpoint is much more complicated. Um,

76:19

but I do care a lot about the render and

76:21

we got our renderer down to for a full

76:24

screen set of grids on my Mac, um.

76:27

Each frame updates in roughly I don't

76:31

know it's like it's something like 9

76:34

microseconds or something. Um, that

76:36

doesn't include the draw time. That's

76:38

just like taking the state and

76:39

submitting work to the GPU. It's about 9

76:41

microseconds, and the GPU takes some time. 120

76:44

hertz, 120 frames per second, a frame is

76:46

8,333

76:48

microseconds. So if you have nine, you

76:51

know, again, we don't have the number of

76:52

how long the GPU takes, but it's super

76:55

performant, it doesn't take much time at all.
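The frame-budget arithmetic behind that claim is easy to make explicit. This is just the math from the conversation worked out in Python, not anything from Ghostty's codebase:

```python
def frame_budget_us(hz):
    # A display refreshing at `hz` gives 1,000,000 / hz microseconds
    # of budget per frame.
    return 1_000_000 / hz

budget = frame_budget_us(120)   # ~8,333 microseconds at 120 Hz
update_us = 9                   # the ~9 microsecond CPU-side update mentioned above
print(round(budget))            # 8333
print(f"{update_us / budget:.2%} of the frame budget")  # about 0.11%
```

Which is his point: even a 2,000 microsecond update would comfortably fit a 120 Hz frame; getting it under 10 is craft, not necessity.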

76:57

>> You're leaving a lot of options and work

76:59

for

77:00

>> what I'm saying is like we could have

77:01

made it 2,000 microseconds and it

77:03

wouldn't have mattered. It like you

77:05

would you would still get that

77:06

performance, but that's not fun. Like I

77:08

want to make it sub 10.

77:10

>> I I I I like the fun.

77:11

>> Yeah. So, we spent a lot of time just

77:13

like it was a big I I blogged about it.

77:15

It was this thing where we got it down

77:17

from, it used to be about 800 microseconds

77:19

and got it down to like nine and uh I

77:22

thought that was awesome. Even though

77:23

for end users it doesn't make a

77:25

difference.

77:26

>> But as you say the the craft and the

77:28

level of the game. So when you started

77:30

out building Ghostty, that was around the

77:32

time when I think ChatGPT was out.

77:34

There were some tools. How did your tool

77:35

set change in terms of how you're

77:37

developing day-to-day?

77:38

>> There's two sides to that. So one AI

77:40

gave a huge boost to terminals which is

77:43

a funny thing like a like oh how so the

77:46

number, because of Claude Code and all

77:48

these things the amount of time spent in

77:49

a terminal has gone up which if you told

77:52

me in 2023 terminal usage would go up I

77:54

would say no it's not going to go up um

77:56

I had no delusions that I was going

77:58

to like save terminals and I didn't

78:00

right? Like, AI came out, and out came all

78:03

these CLI tools. And even when you're

78:06

seeing, like, Codex apps and Claude apps,

78:09

like, leaving the terminal, they're

78:12

still executing so many things in a

78:14

pseudo terminal. The number of terminals

78:18

out there is massively larger than

78:21

there was in 2023, which is hilarious.

78:23

>> Oh wow.

78:24

>> Yeah.

78:25

>> So random.

78:26

>> Super random. And so that's part of why

78:28

uh one of the things I'm doing with

78:29

Ghostty is extracting, it's actually

78:31

extracted already, what I've called

78:33

libghostty, which is, everyone reinvents this

78:36

very small surface area of a terminal

78:37

and because they do it breaks like all

78:40

sorts of things break like if you run a

78:41

docker build or push to a platform like

78:44

Heroku and you do enough weird things in

78:46

the terminal that aren't actually that

78:47

weird, just like drawing a progress bar, it

78:49

renders it like chaos

78:50

>> all over the place

78:51

>> all over the place. Yeah. Um and it's

78:53

just because they've poorly implemented

78:55

a tiny subset of a terminal because

78:56

they're more complicated than people

78:58

think. And so libghostty is this minimal zero

79:00

dependency library that people can embed

79:02

terminals anywhere.

79:03

>> Oh, cool. And yeah, yeah, MIT license

79:05

and just it's really like I'm tired of

79:07

seeing broken terminals everywhere, so

79:08

please use this. Um, so okay, that's the

79:11

one angle. Really funny. But the other

79:12

angle is actually AI usage. It's hard to

79:14

say. I'm a I'm a big fan, but you know,

79:16

within the right categories of things.

79:18

Like I think that it's a revolutionary

79:20

tool and I get a lot of joy using it. It

79:24

Yeah, I use it every day. Um, I use

79:26

tools like Claude Code and Amp and Codex

79:28

and and the chat tools like every day

79:31

for some aspect of my life. And it's

79:33

really allowed me to choose

79:35

what I want to actually think about,

79:39

right? I think that's the most important

79:40

thing is that I always felt limited in

79:41

terms of, oh, I'm going to have to spend

79:44

the next two hours, I don't know, doing

79:46

this annoying boilerplate stuff that

79:50

I don't want to learn about. But now I

79:53

don't have to learn about it, which is

79:54

yeah, I'm not I'm not like getting skill

79:57

formation in that category, but I could

79:59

now spend those two hours doing

80:00

something else and and that's the best

80:02

to me.

80:03

>> In your workflow, do you just use a

80:05

single agent? Do you use multiple

80:06

agents? Have you have have you

80:07

experimented with them?

80:08

>> I've tried a bit of everything. I would

80:10

say my standard workflow. What I try to

80:12

do is, I endeavor to always have an

80:16

agent doing something at all times. Uh

80:18

maybe not when I sleep. I don't go that

80:20

far. A lot of people do go that far. I

80:21

don't go that far. Um, but while I'm

80:23

working, I basically say I want an agent

80:26

if I'm coding, I want an agent planning.

80:28

If they're coding, I want to be

80:31

reviewing or, you know, that there

80:34

should always be an agent doing

80:35

something.

80:36

>> So, you have it in a separate tab.

80:38

>> Yeah. Separate tab. And and sometimes

80:40

it's multiple. I don't there's a lot of

80:42

work that I do around cleaning up what

80:44

agents do. And I don't run, like, Gas

80:47

Town-esque things. And so I'm the

80:50

the mayor, so to speak. And so I don't

80:52

want to run too many. I don't find it

80:54

that fun to clean their stuff up. But

80:56

periodically I'll I'll run two um in

81:00

competition with each other because I

81:02

it's a it's a harder task and I I don't

81:03

have a high confidence that they're

81:05

going to just like crush it. So I'll

81:06

just run Claude versus Codex or

81:08

something like that. Or I'll have one

81:09

coding, I'll have one doing like some

81:11

sort of research task. Um I absolutely

81:13

love them for research. That's awesome.

81:16

Um, and then I'll be doing something

81:17

else, but no more than two, I would say.

81:20

Yeah.

81:20

>> The code that they generate, do you

81:21

always review it or have you kind of got

81:24

a bit more loose? And, you know, some

81:25

people swear by closing the

81:28

loop, having validation for it, or are

81:30

you like still like, all right, I I want

81:32

to see the exact code and I'll review if

81:34

it's correct and what I expected.

81:36

>> Uh, it matters what I'm working on. And

81:38

>> if it's ghosty, I'm reviewing everything

81:40

that's going into it. Um, if it's like I

81:43

set up a personal wedding website for

81:45

one of my family members, I don't care

81:46

at all what the code looks like. Did it

81:48

render right in the three browsers

81:49

that I tried? Yes. Did it render right

81:51

like on my phone? Yes. Don't care what

81:53

the code looks like. Does it make any

81:54

network calls? No. It has no secrets access. I

81:56

don't care. Like ship it. It's only

81:58

going to be online for 2 months. So,

82:00

ship it.

82:00

>> Yeah. And then how did the AI policy at

82:02

Ghostty change? I remember that maybe

82:05

a year ago or so, you asked for

82:08

disclosures if someone is using it. And

82:10

just very recently you kind of

82:13

cracked down and said like all right no

82:15

more.

82:15

>> Yeah we're going to change again too.

82:17

Well not gonna change we're gonna

82:18

iterate. Um so yeah a year ago started

82:20

asking for disclosure and people you

82:22

know, the very fair question

82:25

there is what does it matter how the

82:28

code is produced? And the reason to me

82:30

it always mattered was because it

82:33

dictates how much effort I go into

82:35

fixing it. Because if if you produce the

82:39

code with AI and you did it really

82:40

quickly, then I'm not going to spend

82:43

hours fixing up your code. You

82:46

spend your time fixing.

82:47

>> Yeah. Cuz cuz you know that that person

82:49

didn't put in much time, not much human

82:51

time. You're kind of trying to mirror it,

82:52

right?

82:53

>> It's effort for effort. If you put in

82:54

hours, I'm going to put in hours back

82:56

and I'm going to help you. But if you

82:57

put in a few minutes and never read

82:58

anything and throw it over the wall,

82:59

then I should be able to read it in a

83:01

few minutes, say no thank you, and close

83:03

it. It's it's fair and I need to better

83:06

understand what that is. And you know

83:09

it's not about bad code because open

83:11

source has always gotten bad code

83:13

contributions. But the difference before

83:16

is usually those bad code contributions

83:17

came from people that were genuinely

83:20

trying their best and put in a lot of

83:22

effort just to get to that bad code

83:23

point. And so I I people behave

83:25

differently. I would always try to

83:26

reciprocate by being like this is

83:28

someone very junior or this is someone

83:29

just new to the project. And I would try

83:31

to educate them, be like, "Okay, we

83:33

should do this better and and give these

83:34

careful reviews, but if it's bad code

83:36

where there was low effort, I'm not going

83:38

to give a careful review." So again,

83:40

like I wanted to know these things. And

83:42

and the disclosure worked decently well.

83:44

The issue wasn't the disclosure. The

83:46

issue was that the quantity of

83:50

low-quality AI PRs that we were getting

83:54

reached a point where it was too high.

83:57

Like do you know why that might have

83:59

happened? More people instructed agents

84:01

to contribute a PR to fix an issue they

84:04

had like do you have theories or

84:05

actually like like seen evidence of why

84:07

this happened? I have theories and I've

84:10

seen some evidence, but yeah, I mean, I

84:12

think obviously there's the rise of just

84:14

AI usage in general, but the real trend,

84:17

a step change that I saw at a certain

84:19

point, and I don't know when it happened

84:20

cuz I don't use agents in this way, but

84:22

at a certain point they started opening

84:24

PRs. You know, before it was like you

84:26

generate code and maybe they commit and

84:28

stuff, but you would still like push it

84:30

to a branch and then open the pull

84:31

request. At a certain point, they

84:33

started opening PRs. And there is a dead

84:36

giveaway that it's AI, because at least to this

84:39

day, at the point we're recording

84:40

this, the way Claude opens a PR is it

84:43

opens a draft with no body and then it

84:48

edits a body later and then reopens it

84:50

for review,

84:51

>> which is not how a human would do it.

84:53

>> Oh, like one human a year would do that.

84:55

And now it's happening three times a

84:57

day. And so even if they're not

84:58

disclosing AI or they're hiding it, it's

85:01

like, oh, and it happens at a speed

85:02

that's unrealistic. It opened the body

85:04

came in less than a minute later and it

85:06

reopened less than a minute later. Like,

85:07

>> yeah,

85:08

>> pure AI. I I just tweeted about this a

85:10

couple days ago, which is just like I I

85:12

wish that these agentic tools would put

85:15

a pause on opening PRs for a second. Um

85:18

because I think that's the point where

85:19

it's really causing a lot of friction.
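As a purely hypothetical illustration, the tell described above could be encoded as a heuristic over PR timeline events. Every field name and threshold here is invented for the sketch; this is not anything GitHub or any forge actually ships:

```python
from datetime import datetime, timedelta

def looks_machine_opened(events):
    # Hypothetical heuristic: a PR opened with an empty body, whose body
    # edit and ready-for-review flip both land within a minute of opening,
    # matches the machine-opened pattern described above.
    opened = next(e for e in events if e["type"] == "opened")
    if opened.get("body"):
        return False  # a human-written description at open time
    followups = [e for e in events
                 if e["type"] in ("body_edited", "ready_for_review")]
    return bool(followups) and all(
        e["at"] - opened["at"] < timedelta(minutes=1) for e in followups)

t0 = datetime(2025, 1, 1, 12, 0, 0)
timeline = [
    {"type": "opened", "body": "", "at": t0},
    {"type": "body_edited", "at": t0 + timedelta(seconds=20)},
    {"type": "ready_for_review", "at": t0 + timedelta(seconds=45)},
]
print(looks_machine_opened(timeline))  # True
```

The speed signal is the interesting part: the individual steps are things a human could do, but a human would almost never do all of them inside a minute.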

85:21

>> How did you change the policy? Are are

85:23

you considering closing down PRs? You

85:25

mentioned recently that

85:27

the thought crossed your

85:29

mind.

85:29

>> I would say I was crashing out in that

85:31

moment.

85:33

Uh but I but kind of um so we shipped

85:36

this policy update where PRs written by

85:39

AI are no longer allowed anymore unless

85:41

they're associated with an accepted

85:43

feature request. So you can't just drive

85:46

by and be like I did this thing that

85:48

I've never talked to you about. Here you

85:49

go. And we get about two or

85:52

three of those a day. And so we just

85:53

close this thing. I don't even I

85:55

literally don't even read the content. I

85:56

could see it's AI. Uh I can see there's

85:58

no fixes issue number. I just close it.

86:01

No idea if the code is good. Don't care.

86:03

It's just policy. Don't have time for

86:05

that. That's pretty much where we landed

86:06

on currently.

86:08

And uh we're recording this in the

86:10

middle of another transition, which I

86:12

already have the PR open. Um where we're

86:14

going to switch to an explicit vouching

86:18

system for the community. So you're no

86:20

longer able to open a PR at all. AI or

86:23

not, don't care anymore. Which is I

86:25

think the people who criticize where it

86:27

came from doesn't matter. It doesn't

86:28

matter anymore. Now, all that matters is

86:30

that another community member has

86:33

vouched for you. Um, and if they vouched

86:35

for you, you're added to a list where

86:36

forever or indefinitely you could open a

86:39

PR. If you behave badly, then you, the

86:43

person who invited you, and the entire

86:44

tree of people they ever invited are

86:47

blocked forever from the repo.
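A minimal sketch of that vouching-with-tree-bans idea. This is my own illustrative data structure, under the assumption that each contributor records exactly one voucher; it is not the actual Ghostty implementation:

```python
class VouchRegistry:
    def __init__(self, maintainers):
        # parent[user] records who vouched for them; maintainers are roots.
        self.parent = {m: None for m in maintainers}
        self.banned = set()

    def vouch(self, voucher, newcomer):
        if voucher not in self.parent or voucher in self.banned:
            raise ValueError("voucher has no standing")
        self.parent[newcomer] = voucher

    def can_open_pr(self, user):
        return user in self.parent and user not in self.banned

    def ban_tree(self, bad_actor):
        # Ban the bad actor, whoever vouched for them, and the entire
        # subtree of people that voucher ever invited. (A real system
        # would presumably special-case maintainer roots.)
        root = self.parent.get(bad_actor) or bad_actor
        stack = [root]
        while stack:
            user = stack.pop()
            self.banned.add(user)
            stack.extend(u for u, p in self.parent.items() if p == user)

reg = VouchRegistry(["maintainer"])
reg.vouch("maintainer", "alice")
reg.vouch("alice", "bob")
reg.vouch("alice", "carol")
reg.ban_tree("bob")      # alice vouched for bob, so alice and carol go too
print(reg.can_open_pr("carol"))       # False
print(reg.can_open_pr("maintainer"))  # True
```

The design choice the transcript describes is exactly this collective stake: vouching is cheap, but a bad vouch costs you and everyone downstream of you.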

86:48

>> This reminds me a little bit of, you

86:49

know, the social site Lobsters.

86:51

>> Lobsters. Yes. That's what it's based

86:52

off of. So, the idea is that you're

86:55

putting your own reputation on the line

86:57

by vouching for somebody else. I'm a

87:00

reasonable person. If if this happens

87:02

and I or one of our maintainers or

87:04

community made a mistake, if you just

87:06

like hop into Discord or email and and

87:08

seem like a reasonable apologetic

87:10

person, like I'm not going to spend a

87:11

lot of time like there's not going to be

87:13

like a I don't know a mock like court

87:16

type session. I'm just going to be like,

87:17

"Okay, I'll give you another chance." So,

87:18

yeah, we're we're sort of moving that

87:20

system. I think one thing that's a

87:21

little bit different is um I should say

87:23

that this is one inspired by lobsters

87:25

but specifically in the AI space is

87:27

inspired by this project called pi. Um,

87:30

they do this uh well they do they

87:32

>> call it is built on pi it's a

87:33

self-improving

87:35

>> like uh build your own agent toolkit. Um

87:37

so you know kind of ironically it's it's

87:39

an AI tool but they care a lot about

87:41

code quality and anti-slop and things

87:43

like that. So they have a similar

87:46

mechanism, a little bit less of the tree

87:48

and some other things, but a similar: you can't

87:50

open a PR unless you're vouched for. And

87:52

the other difference here that we're

87:54

going with is in addition to vouching

87:56

where you could positively mark someone,

87:58

you could actually denounce users. So if

88:01

there's a bad actor, you could actually

88:03

ban them. Not not just like you can't

88:06

even attempt to contribute again. And um

88:08

that's just a yeah, we had one yesterday

88:11

where someone opened a PR, we closed it

88:13

because it violated the policy, they had no

88:15

associated issue uh and it was AI and

88:17

then they just reopened it like not the

88:20

same one. They resubmitted a new branch

88:21

and reopened it like less than 10

88:23

minutes later. I was like, "Oh my gosh."

88:25

So um stuff like that is just it's the

88:27

problem is it's just wasting time.

88:29

>> It feels like most of open source will

88:31

have to change because of AI, right?

88:34

like it's you you probably know more

88:36

more maintainers, but I hear this, your

88:38

story is not the only one you know like

88:39

projects closed down, uh, PRs. GitHub is

88:43

I think just shipping a feature that

88:45

projects can automatically close or

88:48

reject PRs.

88:50

>> Yeah, I think open source will have to

88:51

change in a lot of ways. I mean I think

88:53

I forget who wrote this, but you know one of

88:54

the logical extremes is if agents are so

88:56

good you don't need open source anymore

88:58

because you could just build it right

89:00

>> theoretically. Yes,

89:01

>> that's that's the extreme. I don't

89:02

subscribe to that extreme, but that's one of

89:04

the extremes. The issue is there used to

89:06

just be this natural back pressure in

89:09

terms of effort required to submit a

89:11

change and that was enough

89:12

>> and now that that has been eliminated by

89:15

AI. I like the wording that pi uses,

89:18

which is that AI makes it trivial to

89:21

create plausible looking but incorrect

89:23

and lowquality contributions. And that's

89:25

the that's the fundamental issue. You

89:27

know, open source to a certain extent

89:28

has always been a system of reputation,

89:31

right? like you you earn some trust and

89:33

you get more access that you know and

89:35

that's how it's supposed to work. Um but

89:37

yeah, it's been that reputation system

89:39

has been taken advantage of in a certain

89:41

sense with with AI um or the default

89:44

allow PRs has you know has been taken

89:46

advantage of. And so I think like like

89:49

this vouching system that that we're

89:51

proposing for my project I think it's

89:53

like very true to what open source is

89:55

which is that open source has always

89:57

been a system of trust. Before we've had

89:59

a default trust and now it's just a

90:01

default deny and you must get trust by

90:04

somebody.

90:05

>> Do you think we might see a lot more

90:06

forking happening though?

90:07

>> I hope so. I hope so

90:09

>> because until now, you know, people would

90:11

only fork off a little

90:13

bit, because it was a lot of effort

90:15

to keep up, like it never

90:18

seemed viable to fork a proper project.

90:21

Right.

90:21

>> Yeah. And I and okay I I am separate

90:23

from AI and everything I have always

90:26

been a huge proponent uh or I guess in

90:28

the past few years I've been a huge

90:30

public proponent of there should be a

90:32

lot more forks like a lot more forks

90:34

because open source I think one of the

90:38

reasons maintainers have been taken

90:39

advantage of to some extent is that

90:41

contributors have some sort of

90:44

entitlement you know whether it's toxic

90:46

entitlement or not but there's some sort

90:47

of entitlement which is I've made a

90:49

valuable change so you should and it's

90:51

clean and it works great so you should

90:53

accept it but you really don't have to

90:57

like you absolutely don't have to. And

90:59

then I've seen this time and time again

91:00

where you have a high quality PR, like

91:02

perfect PR, but you say no and there's

91:05

anger in the community.

91:08

>> But the thing is I I've said this since

91:10

10 years ago at HashiCorp: hitting the

91:12

merge button is the easiest step.

91:14

Getting to and hitting the merge

91:15

button is the easiest step. Like

91:17

undergraduates should be able to do

91:18

that. It's after that it's the years of

91:21

maintaining whatever you just merged

91:22

within the context of your road

91:24

map. um the bugs um customer needs all

91:28

that stuff like that's the hard part

91:30

like you're signing up to keep this

91:31

forever. It's very hard to remove

91:32

features, or remove

91:34

anything. So the core privilege you get

91:36

with open source like OSI open source is

91:38

forking, and you should take it. That's

91:41

the right you've got: you should fork

91:43

it and maintain your own software.

91:44

>> Yeah. One interesting impact of AI as

91:47

someone tweeted about how there's a

91:50

rumor that big tech is looking into

91:51

rearchitecting their monorepos because

91:53

of agentic tooling AI tooling just a lot

91:56

more code being turned out. What's

91:58

actually what's actually happening?

91:59

What's the problem with git? The problem

92:00

with git, I mean there I think there's a

92:02

lot of problems with git, but uh the

92:03

monorepo problem with git is that git

92:06

is relatively bad at very large

92:08

repositories because you you pretty much

92:09

have to clone the entire repository.

92:11

There's there's some extensions to like

92:12

fix that, but like official mainline git

92:14

can't really do that, right? And so for

92:17

very large, uh, very large

92:19

repositories um it's sort of annoying to

92:22

maintain. And then if you have a lot of

92:24

churn in it, it's very hard to get

92:25

changes into whatever your trunk is,

92:27

your main, your master branch, right? The

92:29

concept of a rebase merge queue solves that to a

92:31

certain extent. I think merge queues work

92:34

for humans at a certain scale, but the

92:37

merge queues could get quite deep. But

92:38

then if you sort of 10x that, like

92:42

conservatively, I think 10x that. And

92:44

then if you buy into like hype cycles

92:46

and you 100x or 1,000x that, I think

92:48

it gets completely untenable in terms of

92:50

how are you ever getting any semblance

92:54

of cohesiveness onto the main branch um

92:57

quickly. And and so yeah, I I think

93:00

there's a confluence of problems there,

93:01

which is the merge queue

93:03

problem, the disk space problem, the, like,

93:07

branching review type problem. Oh, I I

93:10

also tweeted the other time where, like,

93:13

git has this you branch and you push up

93:16

your branches, but the branches are only

93:17

the positive. Like when you when you

93:19

close a PR and you you don't accept it,

93:22

like you pretty much lose the branch. On

93:23

GitHub, you could reaccess closed PRs,

93:25

but you a lot of people don't even get

93:27

to the PR stage. They experiment.

93:28

They're like, "Oh, this isn't the right

93:29

way." And they never push the branch.

93:32

And and that's like relatively important

93:34

information. Relatively important. It's

93:37

not as important as the positive, but

93:39

there like I I think there should be a

93:41

lot more branches in git, and a lot more

93:42

information that we just never throw

93:44

away. Like, to me we're sort of

93:47

at the like Gmail moment for email for

93:50

version control where like you used to

93:52

really have to like curate delete all

93:54

this email and then Gmail came out gave

93:56

a gig away for free to everybody who

93:58

never had to think about

93:59

>> their tagline or something was like

94:00

never delete email. I remember seeing

94:02

that in some sort of marketing, it was like,

94:03

just archive it right never delete it

94:05

and that's where I feel like where we

94:06

should be at with code which is like

94:07

just these huge repos, lots of context; we

94:11

need better tooling in order to find

94:13

relevant context in that git repo um or

94:16

version controlled repo. I would say

94:18

that. You asked for, like, real

94:19

examples. I do advise um a company

94:24

that's currently stealth but working in

94:25

this space, and the real examples

94:27

are driven by the highly agentic

94:29

companies. The companies that are like

94:31

going really all-in and drinking the

94:32

Kool-Aid, and they're struggling in terms

94:36

of the amount of churn that these agents

94:38

are causing is so much greater than

94:40

humans. And it's not a AI review problem

94:42

or anything. It's really just like a

94:44

release problem like managing the merge

94:46

cues, humans getting access to the right

94:49

set of data in the repository and things

94:50

like that. So,

94:51

>> So, are the problems performance problems

94:54

mainly with git, or just

94:56

like even the workflow of

94:57

>> Yeah. Yeah. All of it. Performance for

94:59

sure, but workflow. Yeah. I mean like

95:00

every time you pull, you can't, you

95:03

can't push because every time you pull

95:04

there's another change, like every time

95:06

you push it's

95:06

>> there. Yeah. There there's a lot of

95:08

parallel work happening as well. Do you

95:10

think Git will be around

95:11

in a few years?

95:12

>> Who knows? But what's interesting is

95:15

this is the first time in like 12 to 15

95:18

years that anyone is even asking that

95:21

question without laughing.

95:23

>> We're not laughing,

95:24

>> right? Like like if you if 5 years ago

95:26

you said, "Will Git be around in 5

95:27

years?" You'd be like, "Are you serious? Of

95:29

course it'll be around?" Like that's

95:30

crazy to think, right? But now people

95:32

could ask that question. And of course

95:34

some people will laugh, but like there

95:35

are people that critically think that

95:37

Git might not be around in 5 years.

95:38

Well, I think you do want to save the

95:40

prompt history because often reading the

95:41

prompt is what matters; if it's a bunch of

95:43

generated code, the pull request is

95:45

meaningless.

95:45

>> Changes will happen. Like, Git and GitHub

95:48

forges in their current form do not work

95:53

with agentic infrastructure today and

95:56

it's nascent today. So yeah, that's where change

95:59

will happen and I'm not exactly sure and

96:01

it's not something I'm trying to change

96:03

myself, but you know, I'm on the

96:05

receiving end as an agent user

96:07

and a maintainer where I'm like this

96:09

isn't working. What other engineering

96:11

practices, which have been

96:13

relatively stable for like 10, 20, or even

96:15

more years, do you think have to change,

96:17

or are likely to change? Thinking of things

96:19

like CI/CD, testing, code review, other

96:23

ways of working.

96:24

>> Yeah, you know, AMP has this saying which

96:26

is,

96:28

it's kind of clickbaity but it's so true

96:31

everything is changing. And this

96:32

is the first time really where it feels

96:34

like it is the first time in my, you know,

96:39

relatively short compared to other people,

96:40

but still 20-year professional career

96:43

that so much is on the table for change

96:46

at one time. And I'm an optimist, so

96:48

it's really exciting to me. Um I it's a

96:50

lot of fun, but it's we've never seen so

96:53

much editor mobility. Editors used to be

96:56

one of those things that once someone

96:57

picks an editor, it's very hard to get

96:58

them off that editor. They're like

97:00

stuck. The level of editor mobility in

97:02

the past few years between like VS Code

97:03

and cursor and and just jumping around

97:05

is unreal. So there's a bunch of

97:07

mobility there in terms of uh I mean

97:09

cursor itself is a great example of a

97:11

company that reached an insane valuation

97:13

that you could never have gotten pre-AI

97:15

on an editor product. So editors, forges,

97:17

um, CI/CD for sure. And I think that

97:19

testing in general because to make an

97:22

agent better it needs to be able to

97:23

validate its work. And so tests change.

97:27

Even the best test scenarios don't...

97:29

I mean, the best, I guess, have

97:31

full coverage, but that's a very

97:33

extreme case. The very good test

97:35

scenarios just test like one of the edge

97:38

cases and one of the happy cases and, you

97:40

know, a bad case, and they just kind of

97:42

go through, and if it passes, it's

97:44

probably good, paired with a human who's

97:47

thought about the problem. But AI is

97:50

more goal oriented in terms of I want

97:52

this feature to work this way that if it

97:55

doesn't see a spec somewhere or a test

97:57

somewhere saying other things should work

97:59

in a different way, it'll just break them

98:01

on its path to its own goal. And so

98:04

uh I've heard this called a lot of

98:05

things. I mean the one I like the most

98:07

is like kind of like harness engineering

98:09

which is like

98:10

>> harness engineering.

98:11

>> Yeah. That's it. And one of

98:13

my goals for this calendar year has

98:15

been to spend more time doing that,

98:16

which is that anytime you see AI do a

98:19

bad thing, try to build tooling that it

98:22

could have called out to have

98:24

prevented that bad thing or course

98:25

corrected that bad thing. And so sort of

98:27

like moving from the product to working

98:29

on the harness for the product or

98:31

product development. And so, yeah, there

98:33

there's there's a lot of that where I

98:35

think testing has to change to be far

98:38

more expansive, but CI/CD is not set up

98:41

just resource- and performance-wise to be

98:43

able to do stuff like that. Um, so yeah,

98:46

I'm I'm not sure how it changes, but

98:47

that's going to change, too. So,

98:49

everything is on the table. It's really

98:50

interesting.
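The "harness engineering" idea described above can be sketched as a single check-runner the agent calls after each change, returning structured failures it can act on. This is a hypothetical illustration, not a tool Mitchell describes; the check names and commands are made up.

```python
import subprocess
import sys

# Minimal sketch of a harness: run a list of (name, command) checks and
# return machine-readable failure reports, so an agent can validate its
# own work and course-correct instead of shipping a broken change.

def run_checks(checks):
    """Run each (name, command) pair; return reports for the failures."""
    failures = []
    for name, command in checks:
        result = subprocess.run(command, capture_output=True, text=True)
        if result.returncode != 0:
            failures.append({"check": name, "output": result.stderr.strip()})
    return failures

if __name__ == "__main__":
    # Illustrative checks: one that passes and one that always fails.
    checks = [
        ("noop", [sys.executable, "-c", "pass"]),
        ("always-fails", [sys.executable, "-c", "import sys; sys.exit(1)"]),
    ]
    for report in run_checks(checks):
        print(f"check failed: {report['check']}")
```

The point of the structured return value is that the agent can be told "call this tool and fix anything it reports" rather than relying on a human to catch the mistake later.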

98:51

>> Yeah. And a lot of tools to be built.

98:54

One other thing, observability.

98:57

>> Yeah. And then, I guess, on that same

98:59

topic, I mean, of the volume and scale

99:01

and observability it's also like the

99:02

sandbox like I didn't think even being

99:07

in infrastructure and being heavily into

99:08

infrastructure you know containers blew

99:11

up the amount of like minimal compute

99:13

units we had like floating around

99:15

everywhere. I didn't think that was

99:17

going to go up. I mean it'd go up like

99:19

predictably up but I didn't think it was

99:20

going to like slope change up. And it

99:23

has like slope changed up already just

99:25

due to the sandbox environments that

99:27

agents need. And yeah, I mean that's

99:30

super interesting to me because that

99:31

stresses a whole lot of new systems. I I

99:34

think you know the things that I worked

99:36

on like all the products I worked on but

99:37

also things in the ecosystem like Docker

99:39

but like Kubernetes they're going to be

99:41

stressed significantly because they're

99:43

engineered for some level of scale but

99:45

this is a different type of particularly

99:47

non-production workload scale that you

99:49

have to support. So, um, yeah, it's it's

99:53

fun, fun problems.

99:54

>> Going back to hiring, you've hired a

99:57

lot of engineers and you previously

100:00

talked about something really

100:01

interesting. This was, I think, in the

100:02

context of maybe Hashi Corp, how some of

100:04

the best engineers you've hired had

100:06

really boring backgrounds. Can you talk

100:08

about that? Like who who were the best

100:10

engineers you hired and like how how

100:12

>> That's a better way to frame it. Yeah, I

100:13

stand by this. Most of the best

100:16

engineers I can remember from my time at

100:19

HashiCorp, but also just in every job that

100:21

I've had are notoriously private. And

100:24

not because they want to be private,

100:25

because they just don't care to be

100:26

public, I guess would be the better way

100:28

to put it. I don't want to like

100:29

carefully describe anyone without giving

100:31

them away, but you know, they're just

100:32

they don't have social media profiles

100:34

very often. They honestly are 9-to-5

100:36

engineers. They go home and they don't

100:38

code at night. they just spend time with

100:39

their family, but because they don't do

100:41

anything else during their working time,

100:43

they're like locked in and and they're

100:45

really good. And it's not about putting

100:46

the hours, it's also just skill-wise,

100:48

um, super strong. Um, so yeah, I always

100:51

found like when I when I was reviewing

100:54

resumes and stuff, when you find the

100:55

person that has a resume where they like

100:57

they don't have any GitHub, even a

100:59

GitHub account, like some people are

101:00

like, "Oh, you have to have public

101:01

contributions to stand out." Like

101:03

that is a way to stand out. But also, if

101:05

you have zero public contributions and

101:07

you've just worked at companies that

101:09

I've also never heard of before, it kind

101:11

of is interesting to me, which is like,

101:13

okay, you might know something

101:17

deep. Um, so yeah, I think that, you

101:20

know, the problem is, and the funny,

101:22

the ironic thing is, I spend a lot of

101:23

time on social media and these engineers

101:26

are better than me. Um, but the the

101:27

funny thing is every moment you spend on

101:29

social media, time is zero-sum. So

101:32

every moment you spend on social

101:33

media is taking away from something else,

101:34

and the issue is it's not one for

101:36

one because as every engineer knows the

101:38

time it takes to really get your mind

101:41

into flow to get going with something is

101:44

it varies but it takes time and so when

101:48

you context switch to social media if

101:49

you if something's compiling and you tab

101:51

over and you spend time, you've given

101:54

something up in terms of thinking. I

101:56

think one of the best things... I do spend

101:58

a lot of time on social media, maybe an

102:00

unhealthy amount of time on

102:03

social media, but I also spend an unhealthy

102:04

amount of time, um, at night. I don't

102:08

have insomnia, but it takes me a long

102:09

time to fall asleep. And and it's

102:11

because I just sit there in the dark,

102:13

and I love it. Some people do this in the

102:15

shower, but it's not long enough for me.

102:17

I love to just sit in bed, lights off,

102:19

my wife's sleeping, and I just think

102:21

through like I'm writing code in my

102:23

head. I'm thinking through products. I'm

102:25

thinking through website copy. I'm

102:26

thinking through I'm running CLI in my

102:28

head of how it's going to feel. And

102:30

sometimes... last night I went to bed at

102:33

9:30 because I'm a dad, so I go to

102:36

bed early

102:37

>> and you have to wake up and you don't

102:39

know when you have to wake up.

102:40

>> Yeah. Yeah. And I didn't even feel like

102:43

I was up that long and I was like I got

102:45

to go to the bathroom. I should go I

102:46

should really actually like go to sleep

102:47

and I looked and it was 12:30 and all I

102:49

was thinking about was uh it's so dumb

102:52

but all I was thinking about was this

102:53

vouching system of how vouching might

102:55

work and might not work. And I've always

102:58

had this thing where I'm willing to...

103:01

I like competing. I think competition's

103:03

fun. Um but I always feel fair game to

103:06

compete with anyone in product building

103:07

space because I think I'll spend more

103:09

time thinking about it than they will. I

103:11

think people turn it off and I I don't I

103:13

try not to turn it off. So um yeah, I

103:15

mean I think the point of all that is

103:17

the best engineers are the ones that

103:20

context switch the least. Probably

103:22

>> Having used AI agents, do you think

103:25

this might change? Because, you know, like

103:28

these agents can go on and think or or

103:31

or do work for you? Like how would you

103:33

hire in this in in this new world where

103:36

where using AI is kind of a given? Most

103:38

devs will prompt, and fewer and fewer write,

103:41

right? Even though the best devs clearly know

103:44

how to write code as well.

103:46

>> Um I would definitely require competency

103:50

with AI tools. You don't need to use

103:52

them for everything. that's not

103:54

important to me. But it's an important

103:56

tool to understand the edges of like

103:58

it's like any other tool where sometimes

104:00

it's useful and sometimes not useful,

104:02

but if you ignore it completely, you're

104:04

going to do something suboptimal at

104:06

times. I mean, the best example to me is

104:10

proof of concepts. Like, constantly

104:12

in real product organizations, you have

104:14

an idea and you need to like demo it out

104:16

to figure out if it works. I would much

104:18

rather someone just like throw slop at a

104:21

wall that you're never going to ship and

104:23

spend a day doing that, you know, less

104:26

than a day doing that rather than spend

104:28

a week doing it organically as a

104:30

human, because you're going to throw it

104:32

away anyway. You might even

104:34

throw it away because it's a bad idea,

104:35

but I'd rather prove it out. And so just

104:37

slop it up. And so this is why it's so

104:39

nuanced. I get so

104:42

worked up about sloppy PRs to open

104:44

source but it's because there's a time

104:46

and place for them and that's not the

104:47

time and place for them but there is and

104:49

so I would hire in that way and I think

104:52

the other thing, that I don't know if

104:54

it's the right thing to do, but I would

104:56

strive for that goal that I have: I

104:58

would strive for everyone to have an

105:01

agent running all the time. Again, like

105:03

it doesn't need to be coding but to be

105:05

doing something extra for you I would

105:07

strive for that because

105:09

uh, I do it driving. That's my biggest

105:11

one. On the drive here, I had some deep

105:13

research going. And it's like I will

105:14

always spend 30 minutes on the

105:17

boundaries. When I wake up and before I

105:19

stop working, before I leave the house

105:20

or something, I spend 30 minutes asking:

105:23

what can my agent be doing

105:25

next that's slow? What's a slow

105:28

thing my agent could do for the next

105:29

time? And I knew I was going to drive

105:30

here for an hour. It finished far faster

105:32

than an hour, but you know, it was just

105:34

like, oh, I need to do some library

105:36

research. Um, okay. find all the

105:38

libraries that have these properties

105:40

that are licensed in this way and I was

105:42

looking up some, like, HTTP/3 stuff, QUIC

105:45

stuff, and so build that ecosystem

105:48

graph for me. Um, right before I left, I

105:50

was working on something to do with this

105:52

vouching system and I didn't quite

105:54

understand the edge cases of what I was

105:56

doing. And I will think about that

105:57

manually, but why not just start just

105:59

start an agent to like look at this repo

106:01

and I use AMP to like consult the Oracle

106:03

like think deeply about um what the edge

106:06

cases might be, what am I missing? If I

106:07

had another two hours to work, I

106:08

wouldn't need the agent to do that. I

106:09

would have done it myself, but I don't.

106:11

So, why not have it do it? So, it's just

106:13

part of my goal to always have one

106:14

going. And I unfortunately don't have

106:16

one going cuz they finished it all right

106:17

now.

106:18

>> Interesting. And so this agent running

106:21

there... do I understand

106:22

correctly that it's now so natural that

106:24

it doesn't get in the way of your own

106:26

thinking like you do your own thinking

106:28

and you do your work but every now and

106:30

then you glance and you you ping it or

106:31

you start it or it's it's now it's so

106:33

it's not distracting, right? Cuz I think

106:35

that's

106:35

>> Yes. Actually, all the agentic

106:37

tools do this, and I turn off the desktop

106:39

notifications. Yeah. Um I I think the

106:42

desktop notifications are for the most

106:43

part a mistake. Um so yeah, I turn those

106:46

off. I choose when I interrupt the

106:50

agent; it doesn't get to interrupt

106:52

me. Um, so for sure and and then there's

106:55

another aspect where I think my

106:57

engineering has changed where I try to

106:58

identify

107:00

the tasks that don't require thinking

107:02

and the tasks that do require thinking

107:04

and and just delegate like delegate the

107:08

work to an agent. like it sometimes

107:12

it just feels productive to do the the

107:14

non-thinking tasks and you're like,

107:16

"Yeah, I did a lot today. I got

107:18

this." But a lot of times I just try

107:19

to just delegate that out. There's a lot

107:21

of people that, you know, say

107:22

you think less. And I think if you use

107:24

the tools wrong, you do think less

107:25

because you just like launch an agent

107:27

and I don't know, go watch YouTube or

107:29

scroll social media or something. But if

107:31

you instead view it as a way to choose

107:33

what you think about, then I think that

107:35

you don't need to sacrifice that

107:36

thinking. But I think the the problem is

107:38

uh the majority of the population

107:40

probably won't do that.

107:41

>> Yeah. But it's still I think it's good

107:43

food for thought and it's good to hear

107:45

from you on how you're using and it's

107:47

working for you. When did you start

107:49

to have this second agent running? What

107:50

made the switch? Was it the models

107:52

getting better or

107:53

>> Yeah, I don't remember which model it

107:55

was, but there was a certain I tried

107:57

cloud code right when it came out. It

107:58

was just like March or May last year.

108:00

>> Yeah, it was March the beta. Yeah. And

108:02

the May public release.

108:03

>> Okay. I don't think I used the beta,

108:04

so it was probably May. Um, I

108:07

wasn't super impressed, honestly. Um, and

108:10

then I mean really quickly by like the

108:11

summer at some point during the summer

108:13

oh I remember I remember um I saw so

108:16

many positive remarks about it that then

108:19

I started to get scared that I would be

108:22

behind on how to use a tool. And so I

108:25

actually started forcing myself to use it. I

108:29

still didn't believe in it. So, I would

108:30

do everything manually, but I was

108:32

forcing myself to figure out how to

108:34

prompt the agent to produce the same

108:37

quality result. I was working much

108:39

slower because I was doubling the work

108:42

and it was more than double because it

108:44

was they're slow and it's we're going

108:45

back and forth and I already had the

108:47

work done and all this stuff, but I was

108:49

forcing myself to do it, and I found

108:52

stuff where I couldn't figure it out. I

108:53

couldn't like it just wasn't there yet.

108:54

But then I found other stuff where it's

108:56

like, oh, I naturally got to the same

108:58

point that thousands of other people got

109:00

to, which like, oh, if I do a separate

109:01

planning step, it does so much better.

109:05

And everyone got there. And then I

109:07

figured out, oh, if I have a better test

109:10

harness for it to execute, it does a lot

109:13

better. And then, you know, I I think

109:15

everyone starts with like no AGENTS.md

109:17

or CLAUDE.md or anything. Same thing. I

109:20

realized, oh, if it makes a mistake and

109:22

I add that just to AGENTS.md, it never

109:26

makes that mistake again. Like, oh, and

109:28

like these these are just like

109:29

incremental things that I recognize when

109:32

I see people that are new or I've

109:35

watched a couple live streams like

109:36

lurked on live streams where like kind

109:38

of anti-AI people, like, try AI, and it's

109:42

one of those things where I'm like

109:43

they're just swinging the hammer way way

109:45

off, right? Like it's because you

109:47

haven't it's the the thing is like it's

109:50

it's as if someone tried to like adopt

109:51

Git and they used it for an hour and

109:53

decided they weren't more productive

109:54

with it. Like it takes much longer than

109:56

an hour to get proficient with Git, but

109:58

you put in the effort and then you reap

110:00

the rewards later. And it's sort of the

110:02

same thing to me with AI tools.
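The AGENTS.md habit described above, adding each observed mistake so it never recurs, might look something like the sketch below. The specific rules are hypothetical examples for illustration, not taken from the interview or from any of Mitchell's repositories.

```
# AGENTS.md

## Project conventions
- Run the full test suite before declaring a task done.
- Do a separate planning step before writing code for non-trivial tasks.

## Mistakes to never repeat
- Do not edit generated files; change the generator instead.
- Use the project's error types, not raw strings, when returning errors.
```

Each entry under "Mistakes to never repeat" is appended right after the agent makes that mistake once, which is the incremental loop he describes.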

110:03

>> What would your first advice be for

110:05

someone who's like not

110:06

>> My first advice would be reproducing

110:07

your work with an agent. And if you

110:09

really really don't want an agent to

110:10

code, reproduce the research part of

110:13

your work with an agent. Um like there

110:15

there's a lot of people it's like I

110:16

don't want it to write code for me for

110:17

whatever reasons like uh but yeah just

110:21

kind of delegate some of the other

110:23

research part. There's so many places it

110:25

could be helpful. So it doesn't need

110:27

to take... you know, you don't need to pick

110:29

up on the "it must replace you as a

110:30

person" kind of propaganda. You could

110:33

just find the the corners of where you

110:36

work and replace those parts. One

110:39

thing that you give people is you

110:40

give advice on for potential founders

110:43

because you're a successful founder. You

110:45

you've had an exit. You built up this

110:46

awesome company. You get a bunch of

110:48

emails from people asking, "Hey, I want

110:49

to be a founder." What's your advice?

110:51

And you you wrote about this. You shared

110:53

the email, but but can you tell us like

110:55

what advice you typically give people

110:57

and how is it received?

110:59

>> Uh well, I usually ask for something

111:00

more specific. Uh because yeah, if

111:03

someone's like, "What could I do to be

111:04

successful?" one, I will always disclaim

111:06

that you're consulting someone with

111:08

survivorship bias. So, you need to take

111:10

that into account. Um, but I'm willing

111:12

to share my experience as a survivor,

111:15

but just understand that there's

111:16

survivorship bias. Um, but usually I ask

111:18

for, like, what's something more

111:19

specific like what are you trying to do?

111:21

Um, and so we usually get to like should

111:22

I open source my project or not or

111:24

should I be remote or not or should I do

111:26

enterprise and and I don't know. Um, but

111:29

my my the most general advice I usually

111:32

give people is startups are much longer

111:34

than you think. Um, you're going to

111:36

probably work on it for... I say imagine

111:39

10 years. A lot of people say 5 years,

111:40

but I say imagine 10 years. Like is this

111:42

really something you want to work on for

111:43

10 years? And is it something that like

111:46

you need to have a certain amount of

111:47

hubris in order to say I'm going to work

111:50

on this for 10 years and I truly believe

111:53

I'm going to do it better than anyone

111:54

else. There's nothing behind that, no

111:57

substance behind it other than hubris.

111:59

So you need to have a certain amount of

112:01

of ego and hubris in your head to make

112:03

that but not too much where you'll be

112:05

blind to change coming in. So that's

112:08

usually like the first advice I give cuz

112:10

a lot of people have cool ideas but

112:12

they're going to burn out you know

112:13

relatively quickly. So uh that's where I

112:15

start. So currently you're advising some

112:17

companies. What are you seeing with

112:19

them? Like, what are founders doing

112:21

these days? What are they doing

112:22

differently than you know like earlier?

112:24

How's that landscape? Uh again it's

112:28

really contextual in terms of like if

112:29

you're an AI startup it's it's very very

112:31

different.

112:32

>> How how are AI startups

112:34

>> working differently?

112:35

>> There's a lot of pressure to go

112:37

faster than I've ever seen for any startup,

112:40

though. Um, I think the industry is

112:42

moving so fast that I I don't advise any

112:44

AI startups, but I've talked to some of

112:46

them and it it's even as an adviser, I

112:48

feel like it's too much pressure because

112:50

they are just being pushed to prove

112:53

themselves quickly, whether it's through

112:55

traction or revenue or something. It's

112:56

sort of like there's this mentality

112:58

within that ecosystem where AI should

113:01

allow you to go crazy fast. And in

113:04

addition to that, there are a lot of

113:06

companies moving crazy fast. So, um, the

113:08

change is happening. I think that's the

113:10

one thing. Outside of that, I mean, like

113:11

I said, it's just it's just a ton of

113:13

opportunity in every space. Otherwise,

113:15

it's a lot of the same stuff. I mean,

113:16

it's remote versus non remote, open

113:18

source versus not open source. Do you

113:20

see the role of software engineers

113:21

changing now, especially at the AI-native

113:24

companies, where engineers like

113:26

yourself, they're actually being way

113:28

more productive? They they can produce a

113:29

lot more code, a lot more output. Are

113:33

they being pushed into being like, you

113:34

know, like wearing more hats, talking to

113:36

the business, or being a bit more like a

113:38

mini founder, if you will?

113:39

>> I hesitate to say more productive. I I I

113:42

view that there's an expectation they

113:43

could do more. I don't think that's

113:45

necessarily more productive, but it's

113:47

more like you should be able to, for

113:50

example, build a full demo, design,

113:53

everything yourself. You don't need a

113:55

team to do that anymore, right? Like you

113:57

should be able to do that at least from

113:58

a demo perspective. There's no reason

114:00

not to, because again you could ship slop

114:02

for that. That's fine. I mean this is

114:04

still the same but you should be able to

114:05

research effectively and and in a sense

114:08

handle more vague tasks. I'm seeing that

114:10

a lot more which is like just the

114:13

capacity to experiment is so much higher

114:15

I would say. But then when it turns into

114:18

productionizing something uh it feels

114:21

similar to what it's always been. I I

114:24

think that there's a lot of companies

114:25

that are eating the, you know, dog

114:28

food of the AI companies, of

114:30

shipping whatever and I think that's a

114:32

little scary.

114:33

>> Yeah. They look at Anthropic and they're

114:34

like, oh, they built Claude Cowork in 10

114:36

days and it'll be a billion-dollar

114:37

company. They're freaking out about why

114:40

they're not doing that

114:41

>> there. I think a big change is from like

114:42

a pre-seed perspective, yeah, a pre-seed

114:45

perspective, where you would be like, I

114:46

need to raise a seed in order to build a

114:48

prototype. Now it's like, show me the

114:51

prototype, because yeah, you should be

114:52

able to build that really quickly for

114:54

most things. There's still hard tech out

114:55

there where you can't do that.

114:56

>> So, you do a bunch of coding, you do a

114:58

bunch of thinking about coding as well,

115:00

even as you're trying to fall asleep.

115:02

What refills your bucket outside

115:04

of coding, outside of tech?

115:06

>> Obviously, like the stereotypical things

115:08

like just taking breaks and being with

115:11

my family and things like that, but I

115:12

mean I think the biggest thing is you

115:13

know I am introverted so just quiet solo

115:17

time um refills the most energy for me.

115:20

I live pretty close to the beach and

115:22

just if I'm in a bad mentality, things

115:26

aren't working, I'm feeling unproductive

115:28

or some something's going on, like just

115:30

closing my laptop and taking a walk

115:33

outside,

115:34

stuff like that helps a lot. I

115:36

have a lot of hobbies and stuff, but

115:38

it's I think like just as a general

115:40

recharge, it's it's that more than

115:41

anything. I know there's a lot of people

115:42

it's like going out with friends or

115:44

something like that, then I like that,

115:46

but that's not the full recharge for me.

115:49

And what's a book that would you

115:50

recommend and why?

115:52

>> Um, so I pretty much only read

115:55

fiction, outside of news. Um,

115:57

>> great.

115:58

>> Great. Okay. Uh, the most recent book of

116:01

fiction I read is an older book and it

116:03

it is an easy read, so I hope people are

116:05

like not like, "Oh, he's an idiot for

116:06

reading this." But, um, it was uh, what

116:08

is it called? The Something Life of

116:11

Addie LaRue. It's just like kind of a

116:13

romantic type of fiction novel, but

116:16

yeah, it's about... I think

116:18

it's like 10 years old. It's older now.

116:20

Um, but it's just about a woman who

116:22

kind of sells her soul to live forever,

116:25

but the cost was no one remembers her

116:27

once they walk out the room and yeah,

116:30

it's just going through her whole life

116:32

of losing all human connection, but she

116:35

gets to live forever. Um, what that is

116:38

like. And, you know, I like reading fiction,

116:40

though.

116:40

>> I I like reading fiction at night. It I

116:44

don't know. I don't know if it's

116:45

escapism or just like you just like, you

116:48

know, you get a little bit different.

116:50

Well, it's so different from the coding

116:52

or anything. It maybe just helps me

116:53

turn off the thing. I personally

116:55

probably read way more fiction than I do

116:57

professional non-fiction, honestly.

116:59

>> Yeah. Yeah. I'm I'm the same way. It's

117:01

my version of TV, too. TV to me is more

117:04

a social activity. Like, if if my wife

117:06

wants to watch something together, like

117:07

we'll watch a show. But if I'm alone,

117:09

I'm not going to watch a show. I'm gonna

117:11

read probably.

117:12

>> Awesome. Well, well, thanks so much for

117:14

going through all all of these details.

117:17

It was just great to hear how

117:19

you're working, the history of Hashi

117:21

Corp. This was all just really

117:23

interesting and motivating.

117:24

>> Yeah, thank you. Thank you.

117:26

>> I hope you enjoyed this long and

117:28

interesting conversation with Mitchell.

117:30

One thing that really stuck with me from

117:31

this conversation is Mitchell's own rule

117:33

for himself. Always have an agent that

117:36

does something. not necessarily coding,

117:38

just doing something. For example, while

117:40

he was driving to this podcast

117:41

recording, he had Deep Research running.

117:43

Before he leaves the house, he asks

117:45

himself, "What's a slow task that my

117:47

agent could do while I'm gone?" An

117:48

important part to all of this, he turns

117:50

off all notifications. The agent does

117:52

not get to interrupt him. He interrupts

117:54

the agent when he's ready. Mitchell is

117:56

in charge and he has a buddy who does

117:58

the work that he has delegated while he

118:00

focuses on the problem that he is

118:02

solving. This is a nice challenge for

118:03

anyone listening. Next time you step

118:05

away from your desk, before you close

118:07

the laptop, ask yourself, what slow task

118:09

could an agent be doing while you're

118:11

gone? If you enjoy this episode, share

118:13

with a colleague who's thinking about

118:14

where software engineering could be

118:15

heading. And if you've not subscribed

118:17

yet, now is a good time. We have more

118:19

conversations like this one coming.

118:21

Thanks and see you in the next

Interactive Summary

Mitchell Hashimoto, co-founder of HashiCorp and creator of Ghostty, shares his journey from self-taught coding to building modern cloud infrastructure. He recounts HashiCorp's origins in a failed university research project, its pivot after initial commercialization failures, and candidly discusses partnership experiences with AWS, Azure, and Google Cloud. A major focus is the transformative impact of AI on open source, which is forcing changes to contribution trust systems and new policies like Ghostty's vouching model. Mitchell also details his AI-integrated personal workflow, always keeping an agent running, and offers advice for aspiring founders along with reflections on the evolving landscape of software engineering and hiring.
