DHH’s new way of writing code

Transcript

0:00

I feel that you very much value software

0:03

engineering as a craft

0:04

>> Hugely. I mean, I think aesthetics is

0:07

truth. When something is beautiful, it's

0:09

likely to be correct. I think this is

0:11

true in mathematics. This is true in

0:12

physics. This is true in a lot of

0:14

different domains.

0:15

>> I wonder if there's a part of AI about

0:17

the impact of doing work that we would

0:18

not have done before.

0:20

>> The number of projects we have tackled

0:21

internally that we would never even have

0:23

contemplated starting is legion.

0:25

Jeremy, one of our most agent

0:27

accelerated people, went like, "We're

0:29

going to do P1. We're going to optimize

0:30

P1." Literally the fastest 1% of

0:33

requests, we're going to make them even

0:35

faster. There's a bit of tension right

0:36

now is that most of the people I find

0:38

who are all in, they're working harder

0:39

than they ever have. And I've seen that

0:41

with myself now, too. When you can be

0:44

this effective and impactful on an hour

0:46

of supervision of these agents, it's

0:48

really intoxicating. And I need to go,

0:51

do you know what? This is not like a

0:52

limited sale. On Lex Fridman's

0:54

podcast. You were still rightfully so

0:56

very skeptical of AI.

0:57

>> This is a nuanced point and maybe it's

0:59

self-serving, but I don't actually

1:00

think my opinions have changed. What

1:02

has changed is

1:06

how has the creator of Ruby on Rails

1:08

changed how he builds software now with

1:10

AI agents? David Heinemeier Hansson, often

1:12

referred to as DHH, created Ruby on Rails

1:15

and Omarchy, and is a co-founder of 37

1:17

signals. He bashed the capabilities of AI

1:19

coding tools on Lex Fridman's podcast six

1:22

months ago. Then over the course of a

1:24

few weeks over the winter break, he did

1:25

a 180 turn and went AI first on

1:27

everything. In today's conversation, we

1:29

cover how David and his team at 37

1:31

Signals build software today and how AI

1:34

tools are making them more ambitious

1:35

than ever before. Why Ruby on Rails

1:37

and Linux could become even more popular

1:38

than they are today as they are both

1:40

well suited to working with AI agents. Why

1:42

taste and beautiful software are

1:44

becoming more important and why both

1:46

standout designers and engineers who

1:47

care about the craft could become more

1:49

in demand, and much more. If you're

1:51

interested in what one of the most

1:52

experienced builders in the tech

1:54

industry thinks about the practical

1:55

utility of AI tools and how these tools

1:57

could impact software engineers who care

1:59

about the craft, then this episode is for

2:01

you. This episode is presented by Statsig,

2:03

the unified platform for flags, analytics,

2:05

experiments and more. Check out the show

2:06

notes to learn more about them and our

2:08

other season sponsors, Sonar and WorkOS.

2:11

David, it's awesome to have you here.

2:13

>> Thanks for having me. Thanks for coming.

2:15

I should actually say you're in

2:16

Copenhagen. That's my city of choice at

2:19

the moment. It's a beautiful city. It's

2:21

got so much going for it. And so, what

2:23

have you been up to?

2:24

>> I'm always building stuff. I have been

2:26

building stuff for a goddamn three

2:29

decades now on the internet. I got

2:31

started back in 94, I think it was, when

2:34

I first got exposed to it and basically

2:36

just never stopped. And in the past six

2:38

months, I've been building a variety of

2:41

things. One of them is a new Linux

2:44

distribution called Omarchy.

2:46

I switched to Linux about a little over

2:49

two years ago, I think, now. First spent

2:51

some time on Ubuntu, having fun with

2:53

that, and then realizing I actually

2:54

wanted to make my own system from

2:56

scratch, building it on top of Arch and

2:59

Hyprland. So, I put a lot of time into

3:01

Omarchy. It got started as a summer

3:03

project in between racing at the 24

3:05

Hours of Le Mans. There's a lot of downtime

3:07

in that week. So, I just started hacking

3:10

on it and it really took off very

3:12

quickly thereafter. It's been a truly

3:15

inspiring ride to see that even in a

3:18

market as crowded as Linux

3:20

distributions, there's about 7,000

3:22

different distributions out there, some

3:24

of them with long pedigrees and many of

3:26

them even based on sort of kind of

3:28

similar vibes to some extent. There's

3:32

room for something new and it's a great

3:34

reminder that all the ideas in the world

3:37

may be taken and it doesn't matter

3:38

because your spin on it isn't. And I put

3:40

my spin on Linux. With Omarchy, I built the

3:44

perfect computer system for me and saw

3:48

exactly the same thing I've always seen.

3:50

And whenever I build something that

3:52

really just hits the spot for me

3:54

personally, there are thousands of

3:57

others just like me or close enough to

3:59

what I like that they find the same

4:01

pleasure and joy in it. Whether it was

4:03

Ruby on Rails, Kamal, getting out of the

4:05

cloud, any of these things, it's the

4:06

same syndrome.

>> Yeah. With Rails,

4:09

you were literally scratching your own

4:10

itch. You were just building your own

4:12

components and then open sourcing them.

4:14

Is that how it started?

>> Basically, I

4:16

picked up Ruby in the early 2000s and

4:20

really put it to the test in 2003 when

4:23

we started building Basecamp and I did

4:26

not have a mandate of what to use to

4:28

build it. Prior to that, I'd been

4:30

working for a lot of client projects

4:32

that would say, well, we're building

4:34

this in PHP because we have someone who

4:37

knows that. So, this is what you have to

4:39

use. And then we were building our own

4:40

system. We were building Basecamp. And I

4:42

was free to choose. So I chose Ruby and

4:46

at the time Ruby didn't have any tooling

4:49

or, well, not very much when it came to web

4:52

applications. So I had to build it all

4:53

myself and that turned into Ruby on

4:55

Rails which is still going strong. I'm

4:57

still very heavily involved with that. I

4:59

think in some ways Ruby on Rails is having

5:01

a little bit of a renaissance now that

5:04

it is one of the most token efficient

5:06

ways of building web apps. It's ideally

5:09

suited for the agent workflows we're

5:11

dealing with now. We'll see how long

5:13

that lasts. Maybe all the agents are

5:15

going to be writing machine code or

5:16

assembler in about five minutes. So

5:18

maybe that comes to an end. But for the

5:20

moment, token efficiency still matters.
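The convention-over-configuration idea behind that token efficiency can be sketched in a few lines of Ruby. This is an illustrative toy, not Rails's actual implementation: names like table names and routes are derived from the model class, so an agent never has to spell them out in generated code.

```ruby
# Illustrative sketch of convention over configuration (not Rails itself):
# derived names mean less code for an agent to emit and a human to review.
class Convention
  # Derive a table name from a class name, the way Rails derives
  # "articles" from an Article model.
  def self.table_name(klass)
    klass.name.downcase + "s" # naive pluralization, good enough for a sketch
  end

  # Derive the resource route from the same convention.
  def self.route(klass)
    "/" + table_name(klass)
  end
end

class Article; end

puts Convention.table_name(Article) # => "articles"
puts Convention.route(Article)      # => "/articles"
```

Because both names fall out of one class definition, nothing here had to be configured, which is the property that keeps Rails code short in tokens.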

5:22

And it still matters whether the agents

5:24

produce code that humans are able to

5:26

read and verify. That may also come to

5:29

an end at some point. But as it is right

5:32

now, it's uh been a fun ride to just see

5:35

these kinds of projects where I'm

5:37

scratching my own itch resonate with a

5:39

much larger community of people who then

5:42

show up and want to help. I mean for

5:44

Omarchy, which has only been around for

5:46

what is that just over 6 months now we

5:48

have what 400 contributors who've made

5:50

code changes to the distribution and on

5:54

top of that we have tens of thousands of

5:56

people who've installed it and use it as

5:57

their daily driver. So, I always love

6:00

that discovery of something new, novel,

6:04

and inspiring like Ruby or it sounds

6:07

weird to talk about discovery of an

6:09

operating system that's been around

6:11

since what 91, but for a lot of people,

6:13

Linux now is that discovery because they

6:16

have not been using it on their personal

6:18

computer. So, they're seeing it for the

6:19

first time. And for me to help a new

6:22

cohort of Linux users and hopefully even

6:26

enthusiasts come to be because I'm

6:29

flattening the curve a little bit. I'm

6:30

making it easier to get started. I'm

6:33

making the default installation just

6:35

look amazing so that they don't feel

6:37

like they have to invest 100 hours into

6:40

tweaking the system to get going is

6:43

really fun. But what's also fun of

6:45

course is that both of these things,

6:47

both Ruby and Rails and Omarchy were not

6:50

just hobby projects. I love hobby

6:52

projects and I will always do those, but

6:54

I also like to apply them to business.

6:55

So at 37 Signals, we built an entire

6:58

business for 20 plus years on top of

7:01

Ruby and Rails. We're now running Linux

7:04

on the majority of developer machines

7:06

because we now have our own distro.

7:09

>> So it's optional?

7:11

People can choose, right? Can they?

7:13

>> Well, sort of, kind of. We started

7:15

with an open choice and then at some

7:17

point it just doesn't make sense anymore

7:19

in the same way it would not make sense

7:21

for someone to be at 37signals and say I

7:23

want to write this thing in Django we're

7:25

going to use Python and this other

7:26

framework even if you have Ruby and

7:28

Rails and you're doing that so we

7:30

pivoted from an early invitation to play

7:33

around that was what when I first

7:34

switched to Linux just said like hey if

7:36

you want to check it out check it out

7:37

then when things got a little more

7:39

serious with Omarchy, I just said let's go

7:42

all in for everyone who's on the

7:44

technical side of things, not the iOS

7:46

developers of course, but anyone who's

7:48

working with the web, who's working with

7:49

Ruby, who's doing DevOps, they should be

7:52

on Linux because first of all, that's

7:56

closer to what we deploy. We've always

7:58

deployed on Linux. We've been a Linux

7:59

shop on the server side since day one.

8:02

For developers and system operators, I

8:06

actually think it is a material

8:07

advantage to be closer to your

8:08

production environment and just be more

8:10

familiar with the tools. Then on top of

8:12

that, of course, we are building this

8:14

distribution and we should have as many

8:16

hands help out as possible. And given

8:18

the fact that I'm the CTO of this

8:20

company, I get to set the technical

8:22

direction and this is the direction

8:23

we're going.

8:26

>> Can you do a very short recap of where

8:28

the business is as a whole right now? You

8:33

keep building, you keep launching new and

8:35

exciting and just cool stuff. I think

8:36

Fizzy was the latest one.

8:38

>> Yes. So 37signals was founded in 1999.

8:41

It started as a web design firm and then

8:45

I joined up in 2001, two years after and

8:49

for a couple years, collaborated with

8:51

Jason on these consulting projects and

8:53

then it was in 2003 that we started work on

8:56

Basecamp, released it in 2004.

8:58

Actually, either the day after or the

9:00

day before Facebook went live, which is

9:03

kind of a funny coincidence that we were

9:06

of that same time and cohort. And within

9:09

about a year, we realized this thing was

9:11

taking off and we went full-time and

9:14

switched from being a consultancy to

9:15

being a software company.

9:16

>> Awesome.

9:17

>> And that's now 22 years ago, a little

9:20

more than that. And in that time, we've

9:24

released a ton of products. Basecamp

9:26

was the first. Remains the biggest and

9:30

most important, which is also kind of

9:32

funny because you sometimes perhaps have

9:35

this delusion that as you learn more and

9:38

as you get more experience, you'll get

9:40

smarter and you'll have better ideas.

9:41

And like, no, there's tons of people for

9:44

whom their first idea was the best idea.

9:46

And I have no shame in saying that

9:49

Basecamp was the best idea objectively in

9:52

terms of a business that we've ever had.

9:54

And I'm incredibly proud that we've been

9:56

able to keep that going and growing and

9:59

flourishing for over 20 years. Very few

10:02

software companies, let alone software

10:04

products, can boast of that longevity

10:07

and legacy. But we've tried a ton of

10:09

things over those years and had some

10:11

other great successes. We launched

10:12

hey.com our email service back in 2020

10:16

which was a crazy mission when you think

10:18

about it.

10:19

>> Here is a sector completely dominated by

10:22

a single player Google with Gmail that's

10:24

a good product.

10:26

>> It hasn't really changed in 17 years but

10:28

it was really solid and lots of people

10:30

are perfectly content with it. They

10:33

think they hold this duality in their

10:35

head where at once they both hate email

10:38

but somehow don't connect it to the fact

10:40

that they're using Gmail which I find

10:42

curious but either way we launched this

10:45

that is not only a competitor to this

10:47

very entrenched product that has

10:50

probably a greater grasp on market share

10:54

in any major category than any other

10:56

product I can think of. In the US,

10:58

I think Gmail is something like 85% of

11:01

all email traffic, which sounds insane.

11:03

Maybe it's 80%. It's incredibly high.

11:06

It's basically Gmail and then all the

11:10

rest is in this tiny little part of the

11:12

graph. So, we thought that after using

11:16

Gmail, I used it since I don't know when

11:19

I signed up, a few weeks into it, I got

11:20

one of those invite codes. That was a

11:22

really clever launch and I used it ever

11:23

since. So, that's literally 17 years or

11:26

something of that of Gmail usage. And

11:27

over that time, I built up a lot of

11:29

opinions about things that didn't work

11:30

quite like I would prefer it to work.

11:32

And we put all those opinions into a new

11:34

software product. We spent almost two

11:36

years developing it. Millions of dollars

11:38

in cumulative R&D funds. And launched

11:41

it in the summer of 2020, which by the

11:44

way, what a time to launch a product.

11:47

2020 wasn't great for a whole host of

11:50

different reasons. We were kind of

11:51

trying to slot it in: can there just be a

11:54

week where the whole world is not just

11:56

insane.

11:57

>> Yeah.

11:57

>> We finally picked a week. We went live

11:59

and then we had the battle of our lives

12:02

with Apple.

12:02

>> With Apple. I remember that.

12:04

>> And ultimately

12:06

>> they didn't want to approve your your

12:07

app.

12:07

>> They didn't want to approve our app

12:08

unless we paid the toll fee, the 30%.

12:12

>> And they were basically willing to say

12:14

you can't be in the app store, which for

12:16

an email product like that is a death

12:18

sentence.

12:18

>> Yes. you have to be on not just mobile

12:21

phones but specifically the iPhone. This

12:24

is true today. The majority of HEY

12:27

paying customers are iPhone users

12:29

because that's the largest most affluent

12:32

market in the US and the US is the most

12:34

affluent software market in

12:36

the world. So for that business to work

12:38

we needed to be on the iPhone. After a

12:41

two-week epic struggle back and

12:44

forth, thankfully timed to perfection

12:47

with WWDC, where Apple presumably didn't

12:51

want to look like the Goliath squashing

12:54

a

12:55

>> developer, tiny developer, we ended up

12:58

being allowed in and Apple sort of

12:59

rewrote the rules after the fact to make

13:02

it fit. Um, it was a small victory, not

13:05

the ultimate victory, but at least it

13:07

allowed us to be there. And HEY

13:10

ended up being an enormous success. In

13:12

part, ironically, because Apple gave us

13:14

wall-to-wall coverage for two weeks.

13:16

When I look back upon that, I think I

13:19

wouldn't have gambled like that because

13:20

the outcome would have been zero, right?

13:23

Like, uh, Apple refuses our app.

13:25

>> We sign up 200 people and the app is

13:29

dead. What instead happened was they

13:31

gave us a multi-million dollar launch

13:34

campaign and coverage in all major media

13:36

and we signed up tens of thousands of

13:38

people in those first weeks. That was uh

13:41

an insane event but uh also very

13:44

satisfying. And the other satisfying

13:46

thing was I just love HEY. I use it

13:49

every day. I basically use Basecamp in

13:52

terms of web applications. That's where

13:54

we do all our collaborative work. And

13:55

then my number two app and many days

13:57

it's my number one app, is HEY, because I

13:58

just do all my stuff in email. I am

14:00

constantly communicating with people.

14:03

I'm writing. I'm doing a lot of stuff in

14:05

email as many people do. And having that

14:08

be a pleasurable experience and a nice

14:11

environment and my inbox being a little

14:14

more sacred than what happens with Gmail

14:16

where total strangers around the world

14:19

can just make your pocket buzz if you

14:20

have notifications turned on which they

14:22

are by default. Just seems insane to me.

14:24

Right. this idea that there's direct

14:26

access to one of my most important daily

14:29

priority lists like anyone can put

14:31

something on that. Insane. Anyway, HEY

14:34

doesn't do that. We have the screener

14:35

and no one gets to reach your inbox

14:37

before you've said I want to hear from

14:39

this person. And most of the time I say

14:41

no to most people, right? Like things

14:42

end up in the in the screener and we

14:43

have thumbs up. I will hear from this

14:45

person, thumbs down, I'll never hear

14:47

from that person again.
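The screener workflow described here can be sketched in a few lines of Ruby. This is an illustrative toy, not HEY's actual implementation: a first-time sender is held for screening, and a one-time thumbs-up or thumbs-down is remembered for every future message from that address.

```ruby
# Minimal sketch of a screener (illustrative only, not HEY's real code):
# unknown senders are queued for review; a decision is permanent.
class Screener
  def initialize
    @decisions = {} # sender address => :approved or :blocked
  end

  # Route one incoming message based on the remembered decision.
  def deliver(sender, message, inbox, screening_queue)
    case @decisions[sender]
    when :approved then inbox << message
    when :blocked  then nil # silently dropped, never seen again
    else screening_queue << [sender, message] # first contact: hold for review
    end
  end

  def thumbs_up(sender)
    @decisions[sender] = :approved
  end

  def thumbs_down(sender)
    @decisions[sender] = :blocked
  end
end

screener = Screener.new
inbox, queue = [], []
screener.deliver("stranger@example.com", "hello", inbox, queue)
# first contact lands in the screening queue, not the inbox
screener.thumbs_down("stranger@example.com")
screener.deliver("stranger@example.com", "hello again", inbox, queue)
# blocked senders never reach the inbox or the queue again
```

The point of the design is that the decision is made once per sender, not once per message, which is why the daily review stays small.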

14:48

>> This this is how I reached out. I mean,

14:50

we were I'm not sure we were connected

14:52

on on X, but I I sent an email cuz your

14:54

email is out there and your screener

14:56

seems to have worked cuz it gave me the

14:57

thumbs up.

14:58

>> It did because the screener is me. So,

15:00

there's not even AI trying to sus out

15:03

whether I want to hear from you or not.

15:04

Because what turns out to be true is

15:06

it's actually not that onerous to once

15:08

a day go through your screener and say

15:11

thumbs up or down because there aren't

15:13

that many people in the world. And if

15:15

you say no to the annoying pestering

15:18

salespeople who within Gmail managed to

15:21

reach your inbox seven times, then the

15:24

workload is much less. And it's very

15:26

satisfying, I will say too, because when

15:28

I was using Gmail, I would get roped

15:30

into this sales tactic that they of

15:32

course rely on, which is that like you

15:34

write back and say like, "No, thank you.

15:35

I'm not interested." And then they would

15:37

respond again. And now you feel like,

15:38

"Wait, am I now obligated to respond to

15:40

this person? I kind of feel like I am."

15:42

And occasionally I would end up writing

15:44

and even if I wouldn't write, they still

15:46

have access to my inbox. So I would hear

15:48

from them again next week. They have a

15:49

whole drip campaign. They all [ __ ]

15:50

do, right? That any outreach is seven

15:53

emails. It's not one email. It's seven

15:55

emails. And if you show any sign of

15:56

life, it's probably 52. That's just not

15:59

how it works. In HEY, I say thumbs down

16:01

one time, never hear from that person

16:02

again. It's actually amazing how quickly

16:04

you can curate your garden from that

16:06

weed. And then suddenly there's just

16:08

beautiful flowers. Suddenly email is not

16:10

a chore. So you want to go smell the

16:12

roses. Suddenly the majority of things

16:13

that end up in my email or things I want

16:15

to read is from people I want to hear

16:17

from. And that was really the

16:19

fundamental mission for us with HEY: can

16:21

we make email lovable again? Email is so

16:24

hated by so many people because the

16:27

systems are so poor because they're

16:29

based on the original premise that email

16:32

is just what universities use for

16:34

scientists to talk to each other and

16:36

scientists have really good manners and

16:37

will not pester you 52 times about some

16:41

stupid app they want to sell you. No,

16:43

they're respectful and beautiful, right?

16:46

beautiful ideal, beautiful thought,

16:49

beautiful protocol designed for those

16:52

norms and those people. Then you let it

16:54

into the world at large and you realize

16:56

ah not everyone is endowed with such

16:58

norms and such politeness and especially

17:00

when sales people get involved. you need

17:02

better defenses and for me and for us

17:04

and for all our many customers hey is

17:06

that defense it is a way to love email

17:08

again and I find that it's really

17:10

important actually to have a grand why

17:13

this is all the way back to Viktor

17:15

Frankl, Man's Search for Meaning. Finding

17:19

a why allows you to walk through the

17:23

snow when it's cold and uncomfortable

17:26

and annoying which many things are when

17:29

you're building with computers they are

17:31

cold and uncomfortable and annoying.

17:33

Now, it shouldn't be that most of the

17:34

time, but occasionally that will be

17:35

there. And if you have a really strong

17:37

why, why are we building this? Who is it

17:39

for? What are we trying to do to improve

17:42

the world? Even if that's not more grand

17:45

than just letting people love email,

17:47

it's a lot easier and it's a lot more

17:49

enjoyable to then carry whatever burdens

17:52

you got to pack if you can set it up

17:54

that way.

17:55

>> This is a good time to talk about our

17:56

season sponsor, WorkOS. Having a strong

17:59

why is what gets you to building

18:01

something great. But after you build it

18:03

and start selling it to enterprise

18:05

customers, they expect things like SAML,

18:07

SSO, directory sync, audit logs, and

18:10

fine-grained permissions. And those are

18:12

not small features. They're systems.

18:14

Systems that can take months to build

18:16

and maintain. WorkOS gives you APIs,

18:18

enterprise-ready auth, and user

18:20

management in days instead of months.

18:22

All designed to fit cleanly into your

18:24

product. That's why companies like

18:25

OpenAI and Anthropic and Cursor run on

18:28

WorkOS. Focus on building your product.

18:30

Let WorkOS handle the enterprise

18:32

infrastructure. With this, let's get

18:34

back to David and the old way of

18:35

thinking versus the new way of thinking.

18:38

Putting your developer hat on,

18:40

can you talk me through how

18:43

you built it? You said it was two years,

18:45

but was it just one or two people

18:46

starting to build it? I'm sure as tech

18:48

you obviously must have used Ruby on

18:49

Rails a lot, and then

18:52

probably some native stuff as well,

18:54

but the two years seems a lot especially

18:55

because you know you're you're a small

18:57

company. You're a nimble. You're a great

18:58

developer. You hire great

19:00

developers. Suddenly it's been two

19:02

years. What took so long? And of

19:05

course it's beautiful product. But right

19:06

on the surface I think as developers we

19:08

might have this this thing where I look

19:10

at it as like two years with with a

19:13

talented team.

19:13

>> That's the hacker news quip to basically

19:16

everything, right? Like I could have

19:18

built that in a weekend. I mean famously

19:20

stated with Dropbox that I could have

19:22

built that in a weekend. We saw that with

19:24

the original iPod when it launched.

19:26

It was like 5 GB, no wireless,

19:30

less space than a Nomad. Lame. So

19:32

I get that because I also have that same

19:34

instinct. I think that is our hubris as

19:36

developers. We think we are gods and we

19:40

can make anything happen in no time at

19:41

all. And you totally could. You can make

19:43

a prototype happen these days faster

19:45

than a weekend, right? Like

19:47

in a few hours we should be able

19:49

to have

19:50

>> kick off an agent. Yeah.

19:51

>> But figuring out what you actually want

19:53

to build takes a lot longer and arriving

19:55

at something that's worth publishing

19:57

takes longer still. At least it does for

20:00

us and I think it does for anyone who

20:02

arrives at anything good and the

20:04

original hey construction was just me on

20:07

the technical side. This is actually how

20:09

we've started the majority of our major

20:12

products is either it's just me

20:14

sometimes it's one additional developer

20:16

but it is a tiny, tiny team until we have

20:19

a shape, until we have an architecture

20:23

and we have a direction of where the

20:24

product is going to go. I've found that

20:27

you actually go slower if you pour a

20:29

bunch of people into a direction that is

20:32

uncertain. If you don't know what you

20:34

want, a million people is not going to

20:36

build it for you. You have to figure out

20:38

what you want. We can talk about this

20:39

later, but this is where AI's very

20:42

recent progress is changing things

20:44

dramatically. It is now quicker to

20:45

arrive at what do I want? But for hey,

20:49

it was me and then it was Jason and uh

20:52

one designer, two designers, very very

20:54

small team trying to figure out the

20:56

shape, trying to figure out if you're

20:59

taking on Gmail, you can't just do Gmail

21:02

in blue. No one's going to buy that. No

21:05

one's going to be interested in that.

21:06

It's got to be novel, which means it's

21:08

well, not just novel, it's got to be

21:09

good. It's got to solve problems that

21:12

people haven't even articulated they

21:14

have with Gmail because the articulation

21:17

people have of their problems with Gmail

21:19

is I hate email, which as we talked

21:22

about is a bit of a misdirection. My

21:24

contention is you hate Gmail. And not

21:26

just Gmail, but most email systems built

21:29

on the old way of anyone has access to

21:31

your inbox and all that stuff. But

21:33

figuring that out, figuring the shape

21:34

out takes a while and it's also fun to

21:37

do in this way where you noodle with it

21:41

and you don't have infinite capacity.

21:44

The original Basecamp was built the same

21:45

way. It was just me on the technical

21:48

side. Is this a Shape Up analogy? There's

21:50

Shape Up thinking in trying to actually

21:55

endow the designer with an intention of

21:57

how should it work not just how should

21:59

it look and figuring out it's also how

22:01

it should look product should be

22:03

beautiful and they should be unique and

22:05

appealing and so forth. So that also

22:06

takes time. But figuring out how it

22:08

should work is primary. Figuring out

22:10

where's the epicenter, what's the most

22:11

important part and teasing all that

22:13

apart. But with Hey, as with all the

22:16

major products we've done, we start with

22:18

an absolutely tiny team, often just one

22:20

individual on the programming side and

22:22

then one or two individuals on the

22:24

design side. And then we go, we go, we

22:26

go, we go. Suddenly something clicks and

22:29

we go like, this is good. There's

22:31

something here. And then there's a bit

22:33

of a ramp. we take on a few more people

22:35

and then when we get within maybe the

22:37

last 20% we go okay now we know what the

22:41

terrain looks like we can go way faster

22:43

if everyone piles in. So one thing that

22:46

is super interesting and you might take

22:48

it for granted but it's very different

22:50

to how most startups uh that raise VC

22:53

money which I'm very familiar with uh

22:55

and and big companies Uber Facebook you

22:58

name it the way projects would start

22:59

there is you take the product manager

23:02

>> who works with maybe half a

23:04

designer

23:05

>> and comes up with a spec and then

23:07

developers get involved later and what

23:09

I'm hearing what is very novel to me is

23:11

you take one or two designers and a

23:13

developer how you think about designers

23:14

Even you recently hired a designer,

23:17

Zoltan actually, who I'm chatting

23:19

with on the side. A great guy.

23:22

>> But my sense is you think of designers a

23:25

little bit different than potentially

23:26

the rest of the industry does.

23:27

>> We very much do. Designers at 37 Signals

23:30

are not just here to make a spec look

23:33

pretty. They're here to find what the

23:35

spec should be. They're product managers

23:37

in many ways. They are the finders of

23:41

the how and the why in many cases

23:44

deducing in some cases customer feedback

23:46

in other cases just pure intuition and

23:49

distilling that into what should we

23:51

build and how should it work and then on

23:53

top of that they're also responsible for

23:55

building it they're responsible for

23:57

doing the CSS they're responsible for

23:58

doing the HTML they're quite often

24:01

responsible at least dabbling in the

24:02

JavaScript and the Ruby code to get to

24:05

something functional now with agent

24:08

acceleration. They do the whole thing,

24:10

not necessarily as it will be merged,

24:13

but the whole thing in terms of here's

24:15

the final shape and design of what it

24:18

should look like. But I do think we are

24:20

very peculiar in this sense. And we have

24:22

found this when we've been trying to

24:23

hire designers that many designers

24:26

working other companies are not used to

24:28

also wearing the product manager hat,

24:30

figuring out what we should build and

24:32

wearing the implementation hat, shaping

24:34

it into CSS and HTML. I found that when

24:38

you combine these three hats into one,

24:40

you have an individual who knows the

24:43

materials they're working with, knows how

24:45

they stretch, knows which way the seam is

24:48

supposed to be cut, and therefore works

24:50

natively with the fabric of the

24:53

internet. When you're working directly

24:55

in CSS, when you're working directly in

24:56

HTML, you're just much more in tune with

24:59

what this medium wants. And I find that

25:02

that's probably quite similar if you're

25:04

a jewelry designer. You should know the

25:06

properties of gold. You should know how

25:08

it bends and the strength. An architect

25:10

should have some engineering

25:12

understanding of loadbearing structures

25:15

and so on. Not to the degree that the

25:18

architect is just going to design the

25:19

whole thing and then we start pouring

25:21

concrete. you still have uh engineers

25:23

helping you out, but the more you

25:25

understand the materials you're working

25:27

with, the more you're likely to come up

25:29

with something that cuts along the grain

25:32

and therefore ends up feeling correct,

25:35

feeling good. Just a quick hop to Apple.

25:37

I think this is one of the reasons why

25:39

some of the historic super fans like

25:41

Daring Fireball and others uh Gruber

25:44

have been disappointed by the new

25:46

direction is that Apple used to stand

25:48

for these exquisitely designed native Mac

25:53

applications, which are a dying breed,

25:56

like they're essentially dead. Now we

25:58

have Electron which we can talk about

26:00

that too gets way too much hate in my

26:02

book. There's crappy implementation of

26:05

that, but it's just a web in a box. But

26:07

the disappointment with losing that

26:10

sense, and I think it's about the same

26:11

thing that the Mac, its native

26:15

feel has a stretch to it. Like the

26:18

button placements, everything you would

26:20

call a native application either feels

26:23

synthetic or it feels authentic. And

26:26

today, it's all synthetic. There's

26:28

nothing authentic about it left. And I

26:31

think for the web it's the same thing.

26:33

Now the web is a much much larger

26:35

platform and therefore it's gotten much

26:36

more attention. So there are way more

26:39

people working on that quality of it.

26:42

But at the large companies it's

26:43

exceptionally rare to non-existent to

26:46

have that kind of dynamic. I think some

26:48

of that is going to change. Agent

26:50

acceleration is going to empower

26:52

designers to be more capable in these

26:54

ways. So the industry is coming a little

26:56

towards our fundamental stance which is

26:59

funny too because the same is true on

27:01

the programming side. When I talked

27:03

about Basecamp being a product of just

27:05

me on the programming side for launch

27:08

that for so long sounded unambitious or

27:12

even wrong or even to the point of lying

27:15

from some quarters of the internet like

27:17

yeah but you can't build anything real

27:19

anything meaningful anything big unless

27:21

you have a team that's much larger

27:24

because it's just going to be a toy

27:26

product right and my insight from the

27:28

start was that's of course [ __ ]

27:30

because you just haven't used Ruby on

27:31

Rails you just haven't used the

27:33

acceleration that's possible if you use

27:35

better tools. Now we're all realizing

27:38

oh, so if you

27:40

use agent acceleration a single

27:42

individual actually can build something

27:45

highly valuable, too.

27:46

>> Yes.

27:47

>> And that's just fun to see that like the

27:50

industry is coming towards oh smaller

27:52

teams are better because now the cost

27:54

savings you have on the logarithmic

27:56

curve on communication cost starts to be

27:59

relevant. And this is one of the things

28:00

maybe we can talk about this where agent

28:02

acceleration is really changing the

28:05

bargain between junior developers and

28:07

senior developers. Let's talk about

28:08

this. But before we go into that: I

28:11

feel that you very much value software

28:15

engineering as a craft, which is very

28:16

obvious, but what I'm sensing is you're

28:19

valuing design, user experience design,

28:23

software design, like, you

28:25

know, building stuff that feels

28:26

good. Be that software or hardware,

28:29

you also value that as a craft, and

28:31

you look for it. Like, these two

28:32

things. Do I sense this correctly?

28:33

>> Hugely. I mean, I think aesthetics

28:37

is truth. When something is beautiful,

28:39

it's likely to be correct. I think this

28:41

is true in mathematics. This is true in

28:43

physics. This is true in a lot of

28:45

different domains. When you arrive

28:47

at something that has the correct

28:49

aesthetic quality, it's like we have an

28:53

intuition that guides us towards that

28:55

level of beauty because it also happens

28:58

to be correct and noble and something to

29:01

aspire for. I also happen to believe

29:03

it's what makes people happy. Being

29:05

surrounded by beautiful, well-functioning

29:09

objects is a key part of happiness. In

29:12

fact, I'll put it in a negative way,

29:13

too. One of the great sources of anxiety

29:16

and frustration is when everything is

29:19

[ __ ] When everything is laggy, when

29:22

that touch interface doesn't register,

29:25

when you have to restart it, when you're

29:27

calling a travel agent, they can't do

29:29

something because their old shitty

29:30

COBOL system won't let them. Right? The

29:33

world is full of not just

29:36

enshittification. That is, things that went

29:38

from being good to being bad, but also just

29:40

plain bad, just plain awful. And I think

29:44

it is a serious source of malaise for

29:48

civilization, that we could literally

29:51

raise the bar of human happiness if we

29:54

were surrounded by more beautiful items,

29:57

more beautiful systems. Both in the

30:00

sense of its aesthetic exterior

30:02

qualities, but just as much in terms of

30:04

its aesthetic interior qualities,

30:06

because I find those two things are

30:08

usually in perfect harmony. The reason

30:10

why Steve Jobs cared about the inside of

30:13

the box was because he intuitively knew

30:16

that the kind of people who care about

30:18

the layout of the printed circuit board will be

30:21

the kind of people who sweat the details

30:23

on the user interface will be the kind

30:25

of people who sweat the ergonomics of

30:27

opening the case. So I think there's

30:31

essentially no choice if you are a

30:34

person who is attracted to these

30:38

aesthetics which I think is everyone.

30:39

there's just varying levels of

30:41

awareness about whether you are or not

30:43

but that you want to make it all

30:45

beautiful and for me Ruby in particular

30:48

has been this seminal language because it

30:50

produces the most beautiful code in my

30:52

book there's barely even competition

30:54

like there are other things that can be

30:56

beautiful in a way like I find looking

30:59

at Smalltalk, for example, very beautiful

31:02

in its minimalism, but it's not the house I

31:05

want to live in. Ruby is the house I want

31:07

to live in because it's got that

31:08

aesthetic quality while not being rigid

31:12

about its ideology which is a very rare

31:14

aspect too. I more often find now we can

31:18

refer to Ive again, is that when someone

31:20

is obsessed in this way they are a

31:22

little narrow-minded like that's the

31:24

trade-off that's the price and I find

31:26

that Ruby has somehow managed to be both

31:29

broad-scoped yet also intensely focused

31:32

on this. But overall, we have to have

31:36

beautiful things we have to work with

31:38

beautiful tools we have to produce

31:41

beautiful fluid interactions this is how

31:45

we should see ourselves as craftspeople

31:48

that we care about polishing it until

31:50

there are no splinters left. How is AI

31:53

changing how you work and how do you

31:56

think it's changing your craft or just

31:58

let's just talk about the craft. Again,

32:00

you're hiring people at 37signals

32:02

who similarly care about design

32:05

and software craft quality. How

32:07

it's changing what you get out of the

32:09

craft or how it's how it's making it

32:11

better or worse in some ways. I

32:13

just want to, you know, start with, like,

32:15

how has your view changed? Because the

32:17

last time you talked at length

32:19

about this, that was on Lex Fridman's

32:21

podcast, and you were still, rightfully so,

32:23

very skeptical of AI. It was a

32:25

different set of tools. It didn't work

32:26

as well, and I think you went there

32:28

bashing it pretty hard but things have

32:31

changed since.

32:31

>> This is a nuanced point, and maybe it's

32:33

self-serving, but I don't actually think

32:35

my opinions have changed. What has

32:36

changed is the circumstances and the

32:38

facts, which is something I called

32:41

out on that show and in many other

32:43

writings: right from the get-go, I

32:46

could see that we had something new and

32:48

novel here that was going to change

32:50

things. ChatGPT's launch, what, three

32:54

years ago was clearly and obviously even

32:57

at the time something you would mark on

32:59

a timeline. You're like here are all the

33:01

important things that happened in the

33:02

history of computer science or the

33:04

world. Yoinks, there is the launch of

33:06

ChatGPT, and interacting with computers

33:09

in this way and seeing them reason, even

33:13

if that's still a disputed term perhaps,

33:15

but to me it seemed obvious that these

33:17

things were freaking smart, smarter than

33:19

me in many ways, whether those smarts

33:22

came from parroting

33:24

weights and data.

33:27

So what? We don't know how human

33:29

consciousness works. We don't know how

33:31

human wisdom or intelligence works.

33:33

Barely. So, let's not be so categorical

33:36

about what constitutes consciousness or

33:39

intelligence. At least, I find no

33:41

utility in that distinction, even if

33:42

it's fun to ponder. But what I found

33:45

with the early models and the early

33:49

ergonomics where it was autocomplete,

33:51

where it was co-pilot and cursor in your

33:55

editor trying to guess the next

33:57

character,

33:58

>> It would sometimes be littering it.

33:59

Right.

34:00

>> Yes. I found it infuriating. I found it

34:02

as we're trying to have a conversation.

34:04

You won't let me finish a sentence.

34:06

You're constantly trying. Was this what

34:07

you meant? Was this what you meant?

34:09

You're like, shut the hell up. Can I

34:11

just finish a thought? And I thought,

34:13

even if it is capable of occasionally

34:16

accelerating, it's also wrong so often

34:20

that that acceleration feels like a

34:22

nuisance, even if it's somehow net

34:24

positive, which it wasn't for me. Or

34:26

maybe I gave up too soon. But I just did

34:28

not enjoy that. I didn't think the

34:30

models were good enough. I thought the

34:32

way of using the models with

34:33

autocomplete versus agent harnesses was

34:36

just dreadful, annoying. In fact, to the

34:40

point that I got a little pessimistic

34:42

about the direction of the industry for

34:44

a hot second because I thought this was

34:46

what we were all going to do. We're all

34:47

going to sit and do tap tap tap.

34:50

No, thank you.

34:50

>> Well, Cursor even had that. I even

34:53

got one of these; one of their swag items was

34:55

a tab key.

34:56

>> Exactly. Which felt very... I

34:58

got it from them. It's

35:00

really cool, very well designed and all

35:01

that. Beautiful design, but

35:03

>> but dystopian

35:04

>> dystopian

35:05

>> when I see that and I remember that was

35:07

a meme for a while just we only need

35:08

three characters on the keyboard, right?

35:10

I thought of that episode of uh The

35:12

Simpsons where Homer puts a mechanical

35:16

bird on the keyboard that just dips down

35:19

and hits enter because all he's been

35:22

doing is hitting enter. Except suddenly

35:25

there's a warning about the nuclear core

35:28

overloading and the bird just hits enter

35:30

and the whole thing burns down. I'm

35:31

like, "Wow, that's quite a parallel."

35:33

The Simpsons really does predict

35:34

everything. But I did not like that

35:36

style of using it. As much as I retained

35:40

my enthusiasm for the general direction

35:42

of travel because it truly is amazing

35:44

and the amazement to me I tried to

35:47

embrace as a tutor model as a pair

35:50

programmer who doesn't drive. It was

35:52

amazing to have ChatGPT and the other

35:54

models just be there for, like, I don't

35:56

understand this fully here's a piece of

35:58

code here's a question can you tell me

36:00

why it works like that can you tell me

36:02

what's wrong with it because that's how

36:04

I've been using the internet since day

36:06

one right that's what Google was for

36:07

here's an error message. Here's a

36:09

concept. Maybe I find something on Stack

36:11

Overflow with some passive aggressive

36:12

nerd telling everyone why he's so smart

36:14

and then at the bottom there's the

36:16

solution I'm looking for. Or I don't

36:18

find it at all and that's just kind of

36:19

frustrating. With the ChatGPT model, I

36:22

very often got a really good

36:23

explanation. Yeah, this was actually I

36:25

talked with a game developer Jonas

36:27

Tyroller, who built this really cool

36:29

bestselling game. I loved playing it and

36:31

this was during this time of of the tab

36:33

completion, and he said that the

36:35

way he works is he just turned off all

36:37

autocompletions in his IDE because he

36:39

got annoyed by it and then every now and

36:41

then he went to ChatGPT to ask

36:43

something or have a longer thing and

36:45

then he had the mode of like I'm

36:46

thinking and I'm doing this stuff oh I

36:49

need some help okay here's the specifics

36:51

and I'm taking and somehow it felt that

36:53

you know like he just he was in the zone

36:54

the whole day by controlling it and and

36:57

somehow those habits sounds like You

36:59

know, you're saying the same thing. It

37:00

kind of took it away from you. Us.

37:02

>> Exactly. Exactly. And I did get a little

37:04

worried that that was going to be the

37:05

direction that we were all going to be

37:07

the bird and I didn't want to be the

37:09

bird. Then I was like, well, what should

37:10

I do instead? Maybe like farming

37:11

potatoes. Like that's a long tradition

37:13

here in Denmark. Maybe I could take that

37:15

up.

37:16

>> But then thankfully two things happened.

37:19

A: Claude Code, which, what is that, starts in

37:22

the spring, gets going sort of over the

37:24

summer, then by the fall has some

37:27

traction on a new way of using agents to

37:30

help you code where with the agent

37:31

harnesses, right? This is really where

37:33

we transition from AI to agents.

37:36

Suddenly the AI has tools. It can use

37:39

bash. It can use everything you got on

37:42

your terminal. It can call the internet

37:44

in for appropriate information. It

37:46

just is capable of doing more than just

37:48

reasoning about a thing you gave it uh

37:51

or input from a source context file. And

37:54

then the models: Opus 4.5, to me, is

37:58

one of the other points we're

37:59

going to have on the timeline, where it's the

38:02

first model that continuously and

38:05

consistently would shock me with the

38:08

quality of its output: the quality of its

38:12

analysis on the basis of vague inputs

38:16

and even more importantly the quality of

38:20

its output. It produced code I wanted to

38:22

merge without

38:25

very much if any alteration and if I did

38:28

want to do alteration I could tell it

38:31

and it would remember and it would not

38:33

make the same mistake next time. That, to

38:35

me the combination of those two things

38:38

was the unlock

38:38

>> And you have a high bar. Like, you

38:40

have a really high bar.

38:40

>> Incredibly high bar. As we've talked

38:42

about now at length like the aesthetics

38:44

of the output really matters if I'm

38:46

going to look at it and I'm going to

38:48

review it I'm going to give you another

38:49

anecdote in a second where those things

38:51

don't even play in. But when I'm using

38:54

agents to work on Ruby code I want their

38:57

code to look as good as mine I'm not

38:58

going to merge their stuff if it's

39:00

sloppy no more than I would merge the

39:03

work of a junior developer who has not

39:06

yet fully internalized our style and so

39:08

forth. So I wanted it to be on par, at

39:10

parity, and the early models just

39:12

couldn't. That didn't mean they couldn't

39:13

produce working software. At least some

39:15

of the time they could. Very

39:17

impressive. I mean I remember when I did

39:19

my first snake game and I'm like holy

39:21

smokes. I've been wanting to do this

39:23

since I was 6 years old. Like I've been

39:25

wanting to I have this idea. I want to

39:27

get it into a game and I was able to see

39:29

that in, I don't know, 30 seconds.

39:31

It was done with the game. Copy-paste

39:33

the HTML. Magical experience, right?

39:37

>> So, I think that ramp was very

39:41

interesting because it actually took a

39:43

while until we found this form factor of

39:46

the agent harness of the terminal

39:49

interface.

39:50

That to me was the the big unlock from

39:53

this is interesting. I want to have a

39:55

conversation with it, to, I want it to

39:57

write my code. I will now start any

40:01

project I'm starting with. I'm starting

40:02

agent first and that's a massive shift

40:06

and it just happened from November 27th

40:09

I believe, is when Opus 4.5 dropped. Now

40:11

there are other people who have

40:12

different points they felt, like, oh, was it

40:14

Opus 4, or

40:16

>> maybe some people talk about Sonnet 3.7.

40:19

There are other earlier checkpoints but

40:21

there I do feel like there's a general

40:22

consensus I can lean up against that

40:25

Karpathy and others have expressed, like, yep

40:27

it was right around end of November

40:29

early December. Everyone who

40:30

worked at larger tech companies: it was

40:32

the winter break because people just you

40:34

know like like the whole industry shuts

40:37

down for 2 weeks, save for a few places

40:39

where you're on call but again no

40:40

production work happens across the

40:41

>> industry to play with this.

40:43

>> My sense was that people were playing

40:44

with it because you give it your side

40:45

project you never finish expecting not

40:47

to finish and then they also got

40:49

shocked.

40:49

>> Yeah.

40:50

>> You're done and that was just a complete

40:54

sort of break, right? Right? Like if

40:55

this was a movie, you'd hear the scratch

40:56

sound like you're like, "Wait, what?

40:59

Rewind, what happened?"

41:00

>> I feel it was the most collective shock

41:02

which happened individually and then

41:04

people came back in January and everyone

41:08

especially because a lot of the decision

41:09

makers who are, you know, like CTOs,

41:12

engineers, etc., were usually not as hands-on, but

41:14

now they were hands-on. And a lot of them,

41:16

it's this weird thing where they came

41:17

back and they start to mandate or like

41:19

say all right you guys need to use this

41:20

because I've seen the future. I've

41:22

literally used it you need to see it.

41:23

So, we're going back to a little

41:24

bit like hardware, where people were trying

41:26

to give, you know, like the the new

41:27

hardware into people's hands saying you

41:29

need to experience it, cuz you're

41:30

not going to believe it, right? There's

41:32

something with this as well where you

41:33

really don't believe it. We can talk

41:35

about this and whoever's not tried it or

41:37

not had that aha moment. I don't think

41:38

we can convince them.

41:40

>> This is another one of those cases where

41:42

words just are not effective. You need

41:45

to sit down in front of Open Code or

41:48

whatever harness that you use, use one

41:50

of the frontier models, start with that.

41:52

Start with Opus. I'd say start with

41:54

Opus. It's the best frontier model.

41:56

Other models are better at other things,

41:58

blah blah. But if you're just going to

41:59

work on a piece of code and you want to

42:01

see what the current frontier is and if

42:03

you I mean I'd be shocked if any of your

42:05

listeners haven't done it already, but

42:06

if there should be some left, now is the

42:08

time. And I don't even want to say in

42:10

the sense, I found it really off-putting

42:12

this trend on X where unless you've

42:15

internalized everything there is about

42:16

AI, like you've been left behind. Shut

42:18

up. First of all, patently not true. You

42:21

could literally pick up everything in

42:23

the next three weeks. This is the other

42:25

magical thing about this kind of

42:26

progress, right?

42:29

if we had been having this conversation

42:30

in spring of last year, everyone would have been

42:32

like, MCPs, MCPs, MCPs. And do you know

42:36

what? You can now manage to just have

42:38

jumped over that entire thing and gone

42:40

straight to CLI and skills. That's just

42:42

worth having in mind that this FOMO that

42:46

unless you're up on all of it as it

42:48

happens play by play, you're left behind

42:50

is complete and utter nonsense. That

42:53

being said, I can still appreciate that

42:55

some people were early. And for me, Tobi

42:59

Lütke at Shopify is the main individual

43:02

who saw this and saw the changes that

43:06

were coming from it way earlier than I

43:08

did, and has really helped drag me into

43:11

this by constantly sending me like,

43:13

"Hey, you look at this, look at this."

43:15

And I do think that's actually quite

43:16

helpful. It's quite helpful to be

43:18

surrounded by people who have a higher

43:22

faith or maybe their eyes are a little

43:23

further up. Like my eyes tend to be

43:26

relatively close to the road like right

43:28

in front of me and some people have a

43:29

gaze that's a little higher up, and

43:30

sometimes they see things that don't

43:32

come to pass. In this case, Toby saw

43:35

exactly where we were going two years

43:37

ago. And I finally saw it because the

43:41

road came to me in December. And it's

43:44

funny because along the way I kept

43:47

saying like, "Yep, when the models get

43:49

good enough, when they can do all this

43:50

thing, it's going to be amazing." and

43:52

thinking wow it's going to be I don't

43:54

know 18 months two years maybe it's five

43:56

years it's very hard to predict these

43:58

inflection points, and I think the

43:59

industry itself didn't even predict the

44:01

inflection point, right? You have an

44:03

entire city Silicon Valley and

44:05

surrounding areas San Fran focused on

44:09

making this happen but predicting

44:11

exactly when the hockey stick starts

44:13

hockeying is very difficult but then it

44:16

happened and now my daily work is very

44:19

different

44:20

>> So, so what is your daily work now?

44:22

>> My daily work is

44:25

agent first on everything.

44:28

>> Going agent first is a good time to

44:30

mention our season sponsor, Sonar. When

44:32

shifting to agent first work, one thing

44:34

that inherently comes up is the quality

44:36

of the code. Sonar, the makers of

44:38

SonarQube, is deeply rooted in the core

44:40

belief that code quality and code

44:42

security are inherently linked. High

44:44

quality code is naturally more resilient

44:46

and as agents start writing code at a

44:48

massive scale that verification layer

44:50

becomes your most important security

44:52

perimeter. This is where solutions like

44:54

SonarQube Advanced Security are

44:55

valuable. With this new malicious

44:57

package detection, advanced security

44:59

provides a real-time circuit breaker

45:01

automatically stopping agents from

45:02

pulling in unverified or risky

45:04

thirdparty libraries before they ever

45:06

hit your pipeline. The impact is

45:08

measurable, too. Developers who verify

45:10

their code with Sonar are 44% less

45:12

likely to report experiencing outages

45:14

due to AI, per Sonar's State of Code

45:16

Developer Survey 2026 report. It's

45:19

really about closing the gap between the

45:20

speed of AI and the reality of

45:22

production security. What else is Sonar

45:24

doing to help reduce outages, improve

45:26

security, and lower risk associated with

45:28

AI and agentic coding? Head to

45:29

sonarsource.com/pragmatic

45:31

to find out. With this, let's get back

45:33

to David's agent first workflow.

45:35

>> So, specifically Claude Code. Uh, I

45:38

use Open Code. Open Code. You use Open

45:40

Code.

45:40

>> That's my main harness. I also use Claude

45:41

Code a little bit. They unfortunately

45:43

got that early lead. Opus is currently

45:45

the best model. So then they started

45:47

thinking a little bit in that like the

45:49

game is single match instead of thinking

45:52

it's multiple rounds and yanked their

45:55

subscription from open code. So if you

45:56

want to use your Max subscription, you

45:58

kind of have to use their harness, which

46:00

I don't love it. I think it's a mistake.

46:02

But leave that be for a second and let's

46:04

just celebrate the fact that they have

46:05

the best model. And Opus 4.5, well, 4.6 is

46:09

also nice, but 4.5 to me was the

46:10

inflection point

46:11

>> and it creates a lot of competition

46:12

because everyone wants to catch up and

46:14

overtake them now

46:14

>> of course and especially because you see

46:16

Anthropic's revenues I think start of

46:18

the year they're at 9 billion; a few weeks

46:21

later they're at like whatever 14 now

46:23

they're at 19 or something it's just the

46:24

craziest rocket ship you could possibly

46:26

imagine which is inspiring all this

46:29

capital to be deployed for competitors

46:32

and so forth which is wonderful great to

46:34

see so even if I don't love everything

46:36

that they do, and Claude Code is not my

46:38

preferred harness, manage to hold two

46:41

things in your head at the same time

46:42

this is what I also try to do even with

46:44

Apple, which I have serious grievances about

46:46

how they operate and act as the

46:48

gatekeeper and all the other nonsense

46:49

we've talked about and then I also keep

46:51

my I just love computers hat on and go I

46:54

like the new Neo I might even buy a new

46:56

Neo and just see what is possible at

46:59

$500. For Opus, I have no qualms about

47:01

using Opus. In fact, whenever I feel

47:03

like uh this is a really hard problem, I

47:06

go to Opus right now, but I also use

47:08

other models. And one of the things I've

47:10

incorporated into my flow is to kind of

47:12

have two models going at the same time

47:15

at different speeds. So, I use tmux and

47:19

I have this layout thing that's built

47:21

into Omarchy where it'll start my Neovim

47:24

editor on the left side and then it'll

47:26

start two panes on the right side. On

47:28

the top is Open Code running Kimi K2.5

47:32

and on the bottom is Opus running in

47:35

Claude Code, and then at the very bottom I

47:37

have a strip of terminal and almost

47:39

everything I start in one of the

47:41

agents, and I tell them what I want.
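For readers who want to reproduce the split layout being described, a rough sketch in tmux commands follows. This is an assumption-laden sketch, not Omarchy's actual script: the pane commands nvim, opencode, and claude are guesses based on the tools named in the conversation.

```shell
# Hypothetical sketch of the described layout: editor on the left, two agent
# panes stacked on the right, and a thin plain-terminal strip at the bottom.
tmux new-session -d -s agents 'nvim'           # left pane: Neovim
tmux split-window -h -t agents:0 'opencode'    # right top: Open Code harness
tmux split-window -v -t agents:0 'claude'      # right bottom: Claude Code
tmux select-pane -t agents:0.0                 # back to the editor pane
tmux split-window -v -l 5 -t agents:0          # bottom strip: plain shell
tmux attach -t agents
```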

47:43

I hop over to Neovim. First I do space

47:47

gg to look at the lazygit diff on

47:52

it. Once I see what's changed, if it looks

47:54

correct, I'll just commit. We're

47:56

done. Great. And then sometimes it

47:58

doesn't look correct, and I'll go

47:59

in and alter the code myself. But the

48:01

ratio and how quickly the ratio changes

48:04

is still astounding. I went from early

48:07

November last year, I'm code first

48:10

everything. I started the editor,

48:13

I'll spend whatever long it is and then

48:14

at some point if I get stuck or if I

48:16

want a second opinion, I'll go ask my

48:19

friendly clanker to give me a second

48:21

opinion. That's just not how it is

48:22

anymore. Now I start with the agent. Now

48:24

it'll give me the draft. I'll review the

48:26

draft and I'll make alterations if need

48:29

be. And then just recently I flipped it

48:32

even further. So we're working on a CLI

48:35

for Basecamp so we can get full agent

48:37

accessibility for Basecamp. It's

48:40

astounding. First, actually, let me

48:42

rewind. As soon as I got pilled on how

48:46

good the agents were and how capable

48:50

they were, I immediately tried to raise

48:53

my gaze up towards the end of the road

48:54

and think, do we even need MCP? Do we

48:57

even need CLI? Do we even need anything?

48:59

Can't the agent just figure it all out?

49:01

This was when I installed OpenClaw. So,

49:03

I installed OpenClaw on a VM and I

49:06

thought, what should I do here? Let's

49:09

see how far we can push it and what it

49:10

can do by itself. So I thought I want

49:13

this claw in Basecamp. I want this claw

49:15

in Fizzy. Let me just try to invite it

49:17

as it was a human. So I just wrote it.

49:20

Can you sign up for Fizzy? I'm not

49:22

giving you any tools. I'm not giving you

49:23

any MCP. I'm not giving you a CLI. I'm

49:25

just telling you it's at fizzy.do. Go

49:28

sign up. And you see it chug along.

49:30

And then yeah, I've signed up, but it's

49:32

asking for an email address or I'm

49:33

trying to sign up. It's asking for an

49:34

email address. I'm like, oh yeah, right.

49:36

You need an email address. An agent

49:37

doesn't have an email address. Hey, go

49:39

sign up for hey.com. I'm like, it's

49:41

going to fail this one. And it chugs,

49:43

chugs, chugs. Uh, I've signed up for

49:45

Hey.com. Here's the password. Write it

49:48

down somewhere safe. I'm now also signed

49:50

up for Fizzy. I got the confirmation

49:53

email in my inbox. We're all good. What

49:56

do you want me to do? I'm like, what?

49:58

Are you telling me that you could

50:00

one-shot signing up through a browser to

50:03

these things? Now, maybe that shouldn't

50:04

be surprising. Maybe that was already

50:07

possible with Sonnet 3 or one of the

50:09

early models. I don't know. But when you

50:11

experience it yourself on your own damn

50:12

claw that you're just telling over

50:14

Telegram to do something and it's

50:17

signing up for products autonomously,

50:21

that's pretty startling. It was for me.

50:23

And then the next step I went like,

50:24

well, if it can sign up for Hey, and can

50:26

sign up for Fizzy, let me invite it to

50:28

Basecamp. So, I send it an invitation

50:30

to its own email address. Here's the

50:32

invitation link to Basecamp. Can you

50:34

just jump into the AI Lab

50:36

project that we have and introduce

50:37

yourself to the team? It goes, "Hey, I'm

50:40

David's assistant. It's very nice to

50:43

meet you all. I've read back the

50:45

transcript a little bit. I see you're

50:46

all excited about these things." And you

50:49

just go again, "What? What?" And that

50:53

was fun because it showed me that even

50:56

if it was going to take a while, it did

50:58

take a while. It took a while. This is

51:01

um agent terms. It took I don't know,

51:02

seven minutes. That was like, "Oh, it

51:04

feels like eternity." But it was able to

51:07

do it. And that seems like the end

51:08

state. The end state is that agents will

51:10

not need any of our accommodations. They

51:12

do not need any on-ramp. They're not

51:13

coming in a little wheelchair.

51:16

They'll be coming on bionic legs and

51:17

running five times as fast as you in

51:19

about 2 seconds, which we'll get to in a

51:22

second to the speed aspect of it. But

51:24

then you also realize, okay, well, I

51:26

can't just sit around twiddling my thumbs

51:28

until AGI happens. Let's build for

51:31

today. And that's what we've been

51:32

building for Basecamp. We've been

51:34

building CLI. We're going to build it

51:35

for Hey, we're going to build it for

51:36

Fizzy. We're going to build for

51:37

everything, even probably some of the

51:38

legacy products. And what I love about

51:40

the CLI, as much as I also love it about

51:43

these harnesses, is that they validated

51:45

the fundamental Unix philosophy from

51:47

like whatever 71. You should just build

51:50

small tools that can interoperate with

51:51

pipes and you can

51:53

>> that's philosophy, right?
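That philosophy, small programs composed with pipes, can be sketched with ordinary POSIX tools. The request-log lines below are invented sample data, not output from any real system:

```shell
# Each small program does one narrow job; pipes chain them together.
printf 'GET /up 200\nGET /posts 500\nGET /posts 500\nGET /up 200\nGET /login 500\n' |
  awk '$3 >= 500 { print $2 }' |  # keep only the failing requests
  sort | uniq -c |                # count occurrences of each path
  sort -rn                        # most frequent failures first
```

The same shape is what an agent gets once every product grows a CLI: it can compose those tools the way these commands compose here.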

51:54

>> It's the total Unix philosophy. And that

51:57

is actually the magic to me about seeing

52:00

everything having a CLI. It's not that

52:02

Basecamp is easier to use now with a

52:04

CLI. No, no, it's that GitHub also has a

52:06

CLI and Sentry, I don't know if they

52:09

have a CLI, but they have an MCP, so that

52:10

you can tie all these things together

52:12

and now you can tell an agent, hey, we

52:14

have some errors in Sentry. Can you go

52:16

check them out? Then post a write up to

52:18

Basecamp enumerating what's wrong. Then go

52:21

in GitHub, come up with a pull request,

52:24

post a comment back to Basecamp when

52:26

you're done. And now we have a central

52:27

right, Basecamp, where we're following the

52:29

work as it's going on while we have an

52:30

agent doing work looking things up. And

52:34

again, when we try to talk about it and

52:37

relay it, I guess some people can see

52:39

it. And now OpenClaw has enough videos

52:41

on YouTube and so forth so you can get

52:43

at least a passenger ride. But try it

52:45

yourself with your own product, with

52:47

your own tasks and with your own prompts

52:50

and you will be pilled. You will be

52:54

simultaneously

52:56

incredibly excited for what we've been

52:59

able to make sand do. The silicon, the

53:02

chips, the weights, the whole thing.

53:06

And then also a little bit anxious about

53:09

where it's all going to go. And it's in

53:10

that tension that I and probably anyone

53:13

else who's been pilled on this live,

53:15

right? Wait a minute. If we're already

53:17

here, what does 18 months from now

53:19

look like? Like, if the last 3 months

53:22

we've upended my entire understanding of

53:24

what's possible with computers, what's

53:26

the next 3 months look like? What do the

53:27

next nine months look like?

53:28

>> Yeah. This is where, like, I

53:30

was a little bit on your end for a

53:32

long time, and I think I still am, where I

53:33

believe what works and I'm always

53:35

skeptical of projections. Moore's law

53:38

broke down at some point. I lived

53:40

through everyone saying it would continue

53:41

forever, and, you know, then it broke

53:43

as we all suspected it would

53:45

>> but then it found another way. I think

53:46

that's the good point about Moore's law,

53:48

right? It broke for individual cores.

53:50

Yes. How much can you push that? And

53:51

then we just went, well, what if you

53:53

just had, what's the latest chip, 256 cores on

53:55

the AMD chips, right?

53:57

>> And even when performance broke, we

53:59

went into power consumption and size

54:01

and all of those things. So, yeah, like

54:03

but it's harder for me to just

54:05

say, oh, it's going to stop here

54:07

because we've seen it grow. We know

54:10

the approaches that they're taking, these

54:12

larger and larger training sets, and it's

54:14

been working so far. And there's also

54:15

the bitter lesson, which I

54:17

think is such a short paper

54:20

that it's just so worth reading. I think

54:21

it's one of probably the most popular

54:23

papers outside of academic circles.

54:25

>> Yes.

54:25

>> Because it just lays out this thing that

54:27

we don't want to believe. We

54:29

want to believe that our knowledge, our

54:31

understanding is superior that you know

54:32

you and me knowing how to code or me

54:34

putting in these 15 years or however

54:36

long it's been, is special. Sometimes

54:38

it shows that it's not as special.

54:40

What's interesting actually is like

54:42

right this second this snapshot in time

54:44

it is, a little bit, and this is a funny

54:46

bifurcation that's happening: junior

54:48

versus senior developer is that the most

54:53

successful and applicable agent

54:55

acceleration that I've seen at 37signals

54:57

has been from the most senior people the

54:59

people who are able to validate whether

55:02

what the agent produces is suitable to

55:04

be deployed to millions of people. There

55:07

was just this story yesterday about some

55:09

of the major outages at Amazon.

55:12

>> Yeah.

55:12

>> And Amazon's own internal analysis

55:15

essentially pinned that we can no longer

55:17

let junior programmers ship agent

55:19

generated code to production without

55:21

review. And the problem with that is

55:24

first of all I think that's the

55:26

realization most companies are now

55:27

having across the industry. Whenever

55:30

it's mission critical for something of

55:31

that nature, we cannot yet rely on the

55:35

agents to vet it at all, and

55:39

junior programmers are not capable of

55:40

figuring it out. Therefore, their role

55:44

is suddenly more tenuous than it was six, nine

55:48

months ago because a senior programmer

55:51

can and this is why senior programmers

55:53

are getting so much more acceleration.

55:55

They're able to first of all work in

55:57

parallel with lots of agents but

55:59

critically examine the quality of the

56:01

agent output and have a high degree of

56:03

confidence of whether this is going to

56:04

work or not and redirect them if not

56:06

because this is what made them senior in

56:07

the first place. This was the role that

56:09

they had that they had the uh long

56:12

insight and history and overview of the

56:14

architecture. How does it all fit in? Is

56:15

this going to work? Is this not going to

56:16

work? This was the role they played to

56:18

junior programmers. But now they can

56:19

play that role to agents and agents are

56:24

faster at following instructions and

56:28

redirections. And suddenly you have

56:32

senior developers who can 5x 10x their

56:37

individual productivity. And now this is

56:40

the second order effect. If you manage

56:42

to 5x or 10x a senior developer, that

56:46

person's value per hour just went up

56:48

10x. Now take that hour instead of that

56:51

person spending it with the agents just

56:53

shipping stuff and making things better.

56:55

They spend that hour as they would

56:57

before teaching a junior human how to do

57:00

things better. There's something in that

57:01

equation that's in play right now and

57:03

it's not clear how it's going to map

57:06

out. Now one way it could map out is

57:08

that the agents will get so good that

57:10

they stop making mistakes. They become

57:14

senior in their capacity to ship working

57:16

code. This is what my bet would be if we

57:18

look x amount of time forward because

57:21

this is what just happened with cars. So

57:23

self-driving Teslas now drive better

57:25

than humans do. Not all humans, not in

57:27

all circumstances, but on average. It's

57:29

very possible that if we're able to

57:31

delegate the mortal risk, the highest

57:34

criticality we basically deal with on a

57:36

daily basis sitting in a metal tube

57:38

along other metal tubes that go 60 miles an

57:41

hour, where you can die if someone makes

57:42

a mistake, we delegate that to an agent.

57:44

Well, they can probably figure out how

57:45

to make the code work too, right? So, I

57:47

do think it's coming, but who knows

57:49

when, who knows how. Right now, we're at

57:52

a stage where the bulk of the benefits

57:55

are accruing to the most senior

57:57

developers. And also I wonder just like

57:59

with self-driving like you realize

58:01

there are always caveats. So, for example,

58:03

inside companies where it matters. When

58:05

you're a startup, you have zero

58:06

customers. It doesn't matter. You can

58:07

one-shot it, and it doesn't matter if it

58:08

doesn't work and it, you know, it

58:10

crashes. But inside these companies, uh,

58:12

at Uber, um, I just got details on how

58:15

they're adopting AI, and they have

58:17

all these tools, Claude Code and all

58:19

these things. But what we realize as

58:20

well when you just put it in there, they

58:22

have all these internal monorepos.

58:24

They have their ticketing systems. They

58:25

have their slack. They have so much.

58:26

They have their RFCs, design documents

58:29

on how and why they have this jumble

58:31

of a mess, uh, with microservices, which

58:33

was the fun way that we originally

58:36

connected, like, many years ago. But

58:38

what they found is they built a bunch of

58:40

internal systems, a lot of it to help

58:43

feed these agent harnesses, and now

58:45

they're working better. But, you know,

58:47

where we are right now is,

58:49

and this is why, if you're a

58:51

senior engineer in one of these

58:52

companies or a staff engineer at like

58:54

Uber and you move to Google suddenly

58:57

you're not going to be as valuable as

58:59

efficient for a while until you learn

59:00

all the systems. So I wonder if, just

59:03

like with self-driving, you know,

59:04

self-driving works great as well. I was

59:06

in SF and LA, and Waymos, they

59:08

drive so nice. Like

59:10

>> My Tesla was driving in LA, driving us

59:13

to the airport every time. The whole

59:14

family. I sit peacefully, watch the road,

59:18

but do not steer at all on that entire

59:20

journey. Well, except my Waymo got

59:23

stuck because a truck was parking

59:26

on a narrow street and a car was in the

59:28

way, and I knew that it

59:30

should not go there, but it didn't know.

59:32

So, a human operator came in. But

59:34

anyway, but even with Waymos, you know,

59:35

like, there are things, like,

59:37

they drive in pretty good weather.

59:39

They've been mapped out. So I wonder if

59:40

in software engineering this has

59:43

these parallels, where we have,

59:44

you know, like, these companies

59:46

have their specialized

59:49

landscape and once you map it once you

59:51

do all the tools once you figure out

59:52

these things and with self-driving it

59:54

took 10 years, right? Like, I was

59:55

at Uber when they bought the

59:57

self-driving thing and we were hearing

59:59

in the news that you know next year it's

60:00

all going to be over for drivers and no

60:04

>> Yes, there are not going to be steering

60:05

wheels anymore which by the way is an

60:07

amazing anecdote because it just shows

60:09

Elon's total faith in his mission

60:13

because in '17, when he made that

60:15

proclamation, it wasn't AI. It was

60:17

500,000 lines of hand-coded C++,

60:20

>> right?

60:20

>> Like that model was never ever going to

60:23

get us to the full self-driving. But he

60:25

had just total faith in the vision. And

60:27

then eventually, hey, along

60:29

comes AI, and it's so good. And if you

60:31

train it on billions of hours of road

60:34

use, it actually can do it. And it can

60:36

do it better than most humans. In fact,

60:38

I'm a pretty good driver. I'd like to

60:40

say I'm not the best chauffeur because

60:44

my, I don't know, impatience has a

60:47

tendency to provoke the throttle. Uh,

60:49

that's not always as pleasant for

60:51

passengers as it is fun for me. And when

60:53

I let uh the Tesla autopilot drive, it's

60:56

just the best chauffeur in the world.

60:58

It's just perfectly

60:59

>> better than you.

61:00

>> Better than me, better than the queen's

61:02

chauffeur, I think. Like, its throttle

61:04

actuation and deceleration is godlike.

61:08

It's actually AGI-like, or ASI-like, in its

61:12

application within that narrow domain.

61:14

And of course, when we get these

61:16

anecdotes and these examples of holy

61:20

smokes... no, it didn't take 10 years for

61:23

the self-driving. It took 10 years from the

61:25

proclamation, but what they were doing

61:26

for seven of those years had nothing to

61:28

do with what they're doing with FSD now

61:30

because the FSD that's based on AI

61:33

hadn't been running for that long. But

61:35

the inflection point of I think it was

61:37

13.1, FSD 13.1, like the first version,

61:40

you're like, "Wow, this is pretty good,

61:41

but, like, I better pay attention." 13.2,

61:45

14.0, 14.2,

61:48

over the course of 18 months we went

61:50

from yeah it's pretty good but like I'm

61:52

going to pay attention here to why is

61:54

there a steering wheel? And that

61:57

acceleration that short period of time

61:59

of course is something people look to

62:01

when it comes to programming go like

62:02

well if we're here now and senior

62:05

programmers still have to review it

62:06

because otherwise you're going to get

62:07

all your, whatever, high-severity

62:10

downtimes at AWS because some AI pushed

62:12

out some nonsense. What is it going to

62:14

look like when they take the jump that

62:16

FSD did over the same period of time?

62:17

Now, I also think you can go completely

62:19

crazy trying to just sit and soak in all

62:22

of that. This is what I tried to do over

62:25

the past year. Go, I'm really excited

62:27

for where this is going, but I'm also

62:29

going to deal with what's possible today

62:30

and what's enjoyable today and what we

62:32

do right now. I'm not going to try to

62:34

plan what my life looks like 12 months

62:36

from now when maybe we do have AGI or we

62:38

don't. Now, there are other people who

62:40

do that very well. I just watched an

62:42

interview with Leopold on Dwarkesh from

62:45

last year. He's thinking like what does

62:47

2030 look like? What does the whatever

62:49

10-gigawatt data center look like? I'm

62:51

like, I'm very glad we have individuals

62:53

who put thought into that because that's

62:55

not my favorite spot to be and I think

62:59

most people are not that good at

63:01

polishing the crystal ball.

63:02

>> No. Well, I mean, this is a little bit

63:04

unsettling as a software engineer in the

63:06

sense of like clearly this is where the

63:07

industry wants to go. This is where a

63:09

lot of effort will be put. There will be

63:10

a lot of businesses, software businesses

63:12

built on this. A lot of VC money raised

63:13

on this by the way who are going to

63:15

tackle this and they will either like

63:17

succeed or die. That's what

63:18

these companies do. But today, what do

63:21

you see at 37signals, uh, with

63:25

software engineers? You, of course,

63:27

have mostly experienced engineers,

63:29

although you did hire junior engineers

63:30

as well. How is their kind of work

63:32

changing? How is their satisfaction with

63:35

with work change? Because that's also a

63:37

thing, right? We keep arguing about like

63:38

is it making us more miserable at

63:41

these things? Is it what we want to do?

63:42

And how's it changing for you? Right. I

63:44

think it's

63:45

>> that's the biggest revelation actually

63:46

more than even the capacity of the

63:48

agents is my enjoyment running them.

63:51

When I was on that Lex interview last

63:53

summer, I was talking about you know

63:54

what I don't want to be a project

63:55

manager for agents because I had the

63:58

mental model of a project manager of

64:00

humans and I thought like that's not

64:01

what I enjoy. I don't want to be that

64:03

far away from the production. I want to

64:05

be in the mix. I want to have my hands

64:07

in the code. What I failed to realize at

64:10

the time was that running a bunch of

64:13

agents feels less like being a project

64:16

manager for agents and more like

64:18

stepping into this super mech suit where

64:21

suddenly I don't just have two arms. I

64:23

have 12 and I can now look at seven

64:25

screens at the same time running five

64:27

keyboards. I'm still the one doing it

64:30

even if I'm not typing every keyword

64:33

in a program. I have been hyper

64:35

accelerated as a programmer. It's a

64:38

different kind of programmer, but it

64:39

still has the same affinity to

64:42

aesthetics, at least when I'm producing

64:43

Ruby code. And I'm able to combine that

64:46

while being vastly more productive on a

64:49

bunch of things. It's also like getting

64:51

an incredible brain upgrade on even

64:55

assessing issues. One of the pilling

64:58

moments I had was before the release of

65:01

Omarchy 3.4.

65:03

I went into GitHub and we had I don't

65:06

know 250 PRs pending and I kind of just

65:11

sighed a little bit and like 250 PRs if

65:13

I spend I don't know 15 minutes on each

65:16

PR like how long is it going to take

65:18

before I get to the end of it and I

65:20

thought you know what let me try

65:21

something else let me just try to ask

65:24

Claude to... I'm not even doing anything

65:26

with a system. I just do "review URL", and

65:29

the URL is the issue or the PR. I was

65:32

shocked. In

65:34

90 minutes, I think it was, I processed

65:38

100 PRs. And it wasn't that I merged all

65:41

of them. In fact, I'd say I merged a

65:43

small minority. Maybe 10% got merged as

65:46

is. Then maybe 20% got merged. But with

65:51

Claude's implementation,

65:54

>> the programmer had correctly identified

65:56

an issue

65:57

>> but hand rolled some code that I could

66:00

see I didn't want to keep or sometimes I

66:02

couldn't even see it. I just asked

66:03

Claude and they say like ah it's not

66:05

quite right. And then I just asked

66:06

Claude, can you just clean room this?

66:08

>> This is the right problem. Let's fix it

66:10

but let's do it right. It would do it

66:11

right away in exactly the style as I

66:15

would have written the rest of Omarchy.

66:16

Now, this isn't high code or

66:17

something. It's mostly just bash code,

66:19

but there's still a shape to bash code

66:21

and how you want it to look and can it

66:23

feel coherent with the rest of the project.

66:24

Agents, Opus in this case, would just nail

66:27

it. And then the second half of it was

66:30

split between 25% things I then just

66:32

realized I just don't want this,

66:34

we shouldn't have it. And 25%

66:36

Claude telling me maybe there's

66:38

something here, but it's really not a

66:39

good implementation. We don't have a

66:41

straight shot to making a great one. 100

66:44

issues in 90 minutes. And I sat back.

66:47

This would have been a week's worth of

66:49

work, days at the very least. What the

66:53

heck? And even more than that, Claude's

66:56

analysis of at least half the issues

67:00

pertained to things I knew nothing about

67:03

where it was undeniably

67:06

a smarter, better reviewer, programmer

67:11

than I could ever dream to be. Well, not

67:13

dream to be, but... wasn't that a

67:14

moment? >> No, but you would have not put

67:16

in the effort. This was why the PR sat

67:19

in the first place. In many cases, I

67:21

would look at it and go,

67:23

I think there's something here, but, like,

67:25

then I now have to read up on this debug

67:27

thing. I have to figure out is this the

67:28

right way of doing it. I don't want to

67:30

just merge something that then has other

67:32

issues. And to be able to do that agent

67:35

accelerated was one of my top 20

67:39

programming moments. >> I like how you

67:42

put agent accelerated and it sounds like

67:45

it's especially efficient for work that

67:47

is waiting on you but you don't want to

67:49

do it, or you're not as skilled at doing

67:51

it but it's a hassle to delegate because

67:53

again, like, you have a team, right?

67:55

But you probably didn't

67:57

delegate it because you probably knew

67:58

that it wouldn't make it faster or

68:00

better. So I wonder if there's a

68:02

part of AI that because we talk a lot

68:04

about, like, you know, companies love

68:06

to measure, especially larger ones,

68:08

efficiency, PRs, and they want to see

68:09

impact, but what about the impact of doing

68:12

work that we would have not done before.

68:14

>> That's the kicker for me. That's the

68:16

fact that the pie is just exploding

68:18

right now. It's not growing. It's

68:20

exploding. The number of projects we

68:21

have tackled internally that we would

68:24

never even have contemplated starting on

68:28

are legion. We had a great project where

68:30

normally on performance work you worry

68:32

about uh P50, P95, P99. Jeremy, one of

68:36

our most agent accelerated people went

68:39

like what about P1? What about the

68:41

floor? Can we fix the floor? What is the

68:44

floor? And he went like well right now

68:46

our floor is I forget what it was 4

68:48

milliseconds. Let's say that, right?

68:50

Well, actually 4 milliseconds can add up

68:53

if you have a bunch of fast requests.

68:55

They can still it still matters. and he

68:57

just went like, "We're gonna do P1.

68:58

We're gonna optimize P1 literally the

69:01

fastest 1% of requests. We're going to

69:03

make them even faster." He took it from,

69:05

I think it was four milliseconds to less

69:07

than half a millisecond. He 10x'd the

69:09

performance, and I was like, I would

69:11

never have signed up on this and he did

69:12

the P1 project over a couple of days as

69:14

like a side quest, because now he could.
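The percentile vocabulary here (P50/P95/P99 for the slow tail, P1 for the floor, the fastest 1% of requests) can be sketched in a few lines of Python. The latency numbers below are invented for illustration, not 37signals data:

```python
def percentile(samples, p):
    """Nearest-rank percentile: smallest value covering p% of the samples."""
    ordered = sorted(samples)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Invented latencies in milliseconds: a fast floor, a typical middle,
# and a slow tail.
latencies_ms = [0.4] * 10 + [4.0] * 80 + [40.0] * 9 + [400.0]

p1 = percentile(latencies_ms, 1)    # the floor: fastest requests
p50 = percentile(latencies_ms, 50)  # the median request
p99 = percentile(latencies_ms, 99)  # the slow tail
print(p1, p50, p99)
```

Most performance work chases P95/P99; the P1 project described here aims at the first number instead, on the grounds that even the fastest requests add up at volume.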

69:17

>> Now he could because he had a hunch. He

69:20

had an intuition that there was

69:22

something here. He let agents run with

69:25

it and the number of PRs that like all

69:28

right we fixed this we fixed this I

69:29

think, in total, the P1 project, I

69:32

maybe misremember, but I think it was

69:34

like 12 PRs like just fixing all sorts

69:36

of things where I look at the single PRs

69:38

I'm like yeah actually okay yeah makes

69:40

sense I look at the total sum of it

69:42

you've changed 2,500 lines of code, and you're

69:44

like you've done that in a few days

69:46

>> It's... I've never heard anyone do P1,

69:48

because it just feels like a vanity

69:51

exercise. It makes no business sense.

69:53

This is not true, right? Because

69:54

everything adds up. But you know

69:55

what I mean, right?

69:56

>> I know exactly what you mean. And this

69:58

is exactly why the explosion of the pie

70:00

suddenly lets us look at problems we

70:01

would never have contemplated looking

70:03

before. It's funny. I remember this

70:05

scene from Terminator 2 where they found

70:08

this chip from the Terminator in the

70:10

first movie and he goes like, "This

70:12

thing gave us ideas we would never have

70:15

investigated before." And like there's

70:17

some beautiful parallels here about like

70:19

maybe we're about to build the

70:20

Terminator, the cliche, but also we're

70:24

getting ideas, we're getting ambitions

70:26

we would never have looked at before

70:28

because suddenly the cost of exploring a

70:32

hunch has just dropped by a

70:34

thousandfold. I do this all the time

70:36

now, too. I'll give it some vague crappy

70:40

instructions just because like I have

70:42

this fleeting idea. I haven't even

70:44

crystallized it into a neat prompt. I

70:46

just want to see something. And

70:48

then I go like, oh yeah, delete, as in

70:52

revert code back to normal. I like

70:54

before I would be a little more precious

70:56

about 75 lines of code because it would

70:59

have taken me two hours to do them. Now

71:02

there's no residual value to any of this

71:04

stuff and I can just go like show me a

71:06

draft. I feel like a little bit like a

71:07

king where you just go like show me the

71:10

the analysis of the far-flung regions.

71:12

Where are we with the tax receipts? And

71:14

this, uh,

71:16

servant is like, "All right, yes, I shall do so

71:18

and return in 3 weeks." Except, like, you

71:20

can just wave your hands around. And

71:22

agents just come back with answers to

71:24

stupid questions, terrible ideas. Then

71:28

suddenly it wasn't so terrible. It was

71:29

actually a great idea. And you go like,

71:30

"Whoa, I did this... I haven't even

71:33

pulled the trigger on it yet." But one

71:35

of the things with Omarchy people have

71:36

been asking for since the beginning is

71:38

dual boot. being able to install Linux

71:41

next to the Windows installation so that

71:43

they can still play all their games. And

71:45

I just went like, do you know what? I

71:46

have more than one computer so when I

71:48

play games, I can just do it on the

71:50

PC. It's not a me problem.

71:52

>> Yeah,

71:52

>> I totally get why a bunch of people

71:53

wanted it. I'm not heavily inclined to

71:55

spend four hours figuring it out. And I

71:59

just uh a little while ago went like,

72:01

oh, this is exactly the kind of problem

72:03

like I don't have to figure it out. Just

72:05

made the agents figure it out. So I

72:07

kicked off initially the process of just

72:09

coming up with a plan. This is a pretty

72:10

good big change, right? Like if you [ __ ]

72:12

up someone's boot records or you

72:13

overwrite their partition, criticality

72:15

high, which was one of the reasons I

72:17

didn't want to engage with it. Secondly,

72:19

it's a little finicky if you want, uh, LUKS

72:21

encryption on the Linux partition, but

72:23

the Linux partition doesn't own the

72:25

whole drive. It's a little hairy. I

72:27

didn't want to take on the criticality.

72:28

I'm like, this is perfect for the kind

72:30

of agent stuff. So it started off

72:31

basically just having Opus and Codex

72:33

ping-pong a plan. Like, I asked

72:36

Opus first like come up with a plan for

72:37

this. It thinks for minutes and minutes

72:39

and comes up with a good plan, and then I

72:41

kick it over to Codex, like, critique

72:42

the plan, and then I had them ping-pong

72:44

back and forth a couple times and at the

72:46

end, looking at the plan, going like, yep,

72:48

that's a good plan, we should totally do

72:50

that. And I can't wait to kick that one

72:52

off and just go, yeah, it now does dual boot.

72:57

Not because I did it, but, uh, thanks to

73:00

your helpful clankers. >> That level of

73:02

ambition is still something I've yet to

73:05

internalize. Like even just that that

73:07

like, hey, here are these hunches or

73:10

demands, projects that I would like to

73:13

do and maybe someday and you could kick

73:16

it up on a hunch while you go to lunch.
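The plan ping-pong described a moment earlier, one model drafting and another critiquing for a few rounds, has a simple control-flow shape. This is a hypothetical sketch: `ask` is a stand-in stub, not a real Opus or Codex API call, so the loop is runnable on its own:

```python
def ask(model, prompt):
    # Hypothetical stub: a real version would call the model's API.
    return f"[{model}] response to: {prompt[:40]}"

def ping_pong_plan(task, drafter="opus", critic="codex", rounds=2):
    """One model drafts a plan; another critiques; the draft is revised."""
    plan = ask(drafter, f"Come up with a plan for: {task}")
    for _ in range(rounds):
        critique = ask(critic, f"Critique this plan: {plan}")
        plan = ask(drafter, f"Revise the plan given this critique: {critique}")
    return plan

final = ping_pong_plan("add encrypted dual boot next to Windows")
print(final)
```

The value of the pattern is that the human only reads the final plan, after the two models have already argued the obvious weaknesses out of it.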

73:18

That is a new world. Which is also one

73:22

of the reasons I think a lot of people

73:23

are thinking, well, the model continues

73:25

to improve, but even if we somehow hit a

73:27

wall tomorrow, the bitter lesson is no

73:30

longer true. There's actually a limit.

73:31

It's 19 trillion tokens. That's how much

73:33

they can learn. Not true at all. But if

73:35

it was and we had to be stuck with these

73:37

models, we would spend the next decade

73:40

just getting more and more out of them

73:42

learning how to use these tools. You see

73:44

this actually with vintage computers. So

73:47

the kind of games they were able to make

73:49

on the Commodore 64 when that was

73:50

released back in '81 to '85, I think, was

73:54

the main run. I know they made it a

73:56

little longer, but then the Amiga and

73:57

other machines came out. There were great

73:59

games. I mean, I got interested in games

74:01

on the Commodore 64, Yie Ar Kung-Fu, and all

74:03

that stuff. The stuff they were able to

74:05

do 20 years later when someone had just

74:08

noodled out all the secrets and tweaked the

74:11

1 MHz processor

74:12

>> When they're building games for the old

74:14

hardware

74:14

>> Yeah.

74:15

>> are so much more technically impressive

74:17

because we just know so much more about

74:19

the machine. I mean, same thing with the

74:21

PlayStation: you look at the first games that come out

74:23

on launch, and the last games before we go to

74:25

PlayStation 2, and they look like they're

74:27

from different generations. We

74:28

could totally continue to do that with

74:30

the models, but we're not going to have

74:32

that particular enjoyment because

74:33

there's a new model dropping in 3

74:35

months. But this is interesting because

74:36

if we just run with this thought like of

74:38

course we know new things are going

74:40

to come but the point is like we will be

74:41

spending so much time learning applying

74:44

them building either our internal

74:46

systems changing how we build things

74:48

taking on new project like if you're an

74:50

existing team now that people can do

74:52

more work and more ambitious work. How

74:54

are you thinking of the team taking

74:57

on more work, launching more products?

74:59

Are you thinking of potentially

75:00

growing the team or keeping it as is? >> My

75:02

best assessment for our setup is that

75:05

the same people can do much more.

75:07

Let's internalize that. But that's also

75:09

enough. Already we were doing enough.

75:12

Already we had margin that we could hire

75:14

way more if we had enough good ideas for

75:17

that. So all this extra productivity

75:21

we're getting out of the team allows us

75:23

now to do things like P1 and these other

75:25

projects that are awesome and they're

75:27

going to improve the product faster too.

75:29

Of course they are. The old way of

75:31

thinking like it's going to take 2

75:32

months to deliver a major feature. I

75:34

mean that's out the door. Of course

75:36

there's going to be rapid acceleration

75:38

that's going to filter all the way into

75:39

our software methodology process like

75:41

Shape Up was built on two-month cycles.

75:44

That doesn't make sense in the same way

75:45

at all anymore. We have not fully

75:47

rewritten those scripts yet because the

75:49

acceleration is still so fast. No

75:52

company really has rewritten the scripts

75:54

on on all that. When you're shipping

75:56

that much faster, you need a way to

75:58

control what goes live and measure

76:00

whether it's working. This is a good

76:02

time to mention our presenting sponsor

76:03

Statsig. Experimentation and feature flags for

76:06

teams that ship fast. Statsig built a

76:08

unified platform that enables both

76:10

experimentation and continuous shipping.

76:12

Built-in experimentation means that

76:14

every roll out automatically becomes a

76:16

learning opportunity with proper

76:17

statistical analysis showing you exactly

76:19

how features impact your metrics.

76:21

Feature flags let you ship continuously

76:24

with confidence. And because it's all in

76:26

one platform with the same product data,

76:28

teams across your organization can

76:30

collaborate and make data-driven

76:31

decisions. To learn more, head to

76:34

statsig.com/pragmatic.

76:36

With this, let's get back to the shift

76:38

about to hit developers. But I still

76:40

think software developers are delusional

76:42

if they do not think a shift is coming

76:45

where before they were the constraint on

76:49

how much could be produced and therefore

76:51

could command

76:52

>> the salaries that flow to the

76:56

constraints. If suddenly those

76:58

constraints now loosen, especially if we

77:02

fast forward a little bit where the

77:03

product manager is actually able to

77:05

produce changes that can be shipped and

77:08

work, things are going to change. I do

77:11

actually think if I was going to bet

77:13

we've seen peak programmer in terms of

77:16

the learned guild of programmers who

77:20

went to either school or spent hours

77:24

getting really good at it. We're not

77:26

going to need the same number of them to

77:30

do the same amount of work. Now, Jevons

77:32

paradox where as the price of something

77:34

goes down, you get more of it or you get

77:35

more demand for it is true, but that

77:38

doesn't mean that all programmers are

77:40

going to get bailed out by it just

77:41

because more software than ever is going

77:44

to be produced. That's for sure. By the

77:45

way, I think
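The Jevons paradox point above can be made concrete with back-of-the-envelope numbers: total spend on programming only holds up if demand grows faster than unit cost falls. A toy calculation (all numbers invented for illustration):

```ruby
# Toy numbers, purely illustrative. Suppose the cost of a unit of software
# (a "feature") drops 10x with AI assistance.
old_cost_per_feature = 10_000
new_cost_per_feature = 1_000

# If demand only grows 4x, total spend shrinks despite more software existing.
spend_with_modest_demand = new_cost_per_feature * 4   # 4_000 vs the old 10_000

# If demand grows 20x, total spend grows: the Jevons-paradox "bailout".
spend_with_strong_demand = new_cost_per_feature * 20  # 20_000 vs the old 10_000
```

The caveat in the quote is exactly this: more software will certainly be produced, but programmers as a group are only "bailed out" in the second scenario.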

77:47

>> GitHub has gotten a lot of flak

77:50

lately.

77:50

>> A lot

77:51

>> justifiably so. I saw a chart saying

77:53

they had a 92% uptime, which sounds

77:56

insane. I'm not sure exactly what that

77:58

was measuring, but I feel it. I have a

78:00

little bit of sympathy in that. I also

78:02

think some mistakes were made,

78:04

but also that the amount of software

78:08

that's currently being produced is on a

78:09

rocket ship. We are producing as a

78:12

civilization globally way more software

78:16

than we've ever done before. I mean,

78:18

OpenClaw itself, I thought he said

78:21

it was 400,000 lines of code. That used

78:23

to take 10 years and 2,000 people. Yeah.

78:25

To get to that.

78:26

>> Well, not 2,000, but yes, it

78:29

took a long time.

78:29

>> I mean, a long time, right? Like you

78:31

look at, I think, the main monolith

78:33

at Shopify is 3 million lines of code.

78:35

That's 20 years. And if you collectively

78:37

sum up all programmers who've worked on

78:39

that, probably like 20,000 people. Yeah.

78:40

Big shifts are coming right now. Lots

78:43

of software is being produced. I can see

78:44

why it's creaking a little bit over

78:46

there because like the pushes are just

78:47

going to accelerate, right? And we

78:49

haven't even seen anything yet. If you

78:51

look at AI adoption

78:53

curves, basically no one's using it.

78:55

Like, we all in our little bubble on X

78:58

are like, "Oh, everyone's using it." No, they're not.

79:00

Most companies in the world are

79:02

just not doing it. Notwithstanding that

79:04

I think ChatGPT got to 800 million

79:07

users very quickly. Obviously, there's

79:08

adoption, but nothing on the scale of

79:10

what the companies that are furthest

79:12

along are doing and how much they're

79:13

accelerating with it. So, I do think it

79:15

is correct for the average programmer to

79:19

think maybe we've seen the best of the

79:23

golden days. Certainly there will be

79:26

pressures on price, because on one hand there are

79:29

companies like ours that have

79:31

essentially unlimited scope to come up

79:33

with new features and do more and we can

79:35

then plow in all that additional

79:36

productivity into just do more. There's

79:38

also a lot of companies who just need to

79:40

do a thing and if they can do that thing

79:42

at a tenth of the cost that's actually

79:46

their advantage, right? They just need

79:47

to do this thing. It's very neatly

79:49

scoped and defined. It's a cost center.

79:53

Anywhere where software development is a

79:55

cost center, which is actually probably

79:57

the majority of software development in

79:58

the world,

79:59

>> they're going to face these pressures.

80:00

>> Yeah. Sounds like if I'm a software

80:02

engineer right now and I'm worried about

80:04

like well, you know, like just want to

80:06

make sure that I'm at a place where

80:08

things are going to be better. You want

80:11

to be at a place where you want to

80:12

either get out of a cost center or

80:14

become really valuable there. Obviously,

80:16

you know, brush up your skills. And also

80:17

I'm wondering if if the shape of

80:20

software engineers who will be hired

80:21

will be changing, because if I just

80:23

look back from like the '90s, right, like

80:25

even if you look at the movies you you

80:27

saw the stereotypes they were the nerd

80:29

who didn't talk to anyone but they knew

80:30

how to code they knew how to do assembly

80:32

and then we went in the 2000s it was

80:35

still based on languages and over time I

80:37

think in the 2010s startups started to

80:40

not hire for languages but just hire for

80:42

algorithms because you could learn the

80:43

stuff and now I'm seeing companies uh

80:46

some of the latest VC-funded

80:48

companies have for product engineers

80:49

where they're actually asking for

80:51

like empathy communication on top of

80:53

like it's kind of a given that you

80:55

know how to code or whatever. So I

80:57

wonder if I'm just looking at

80:59

this curve, right? If I'm just painting

81:00

it up like you're starting to get people

81:03

Oh, and the developers I meet at

81:04

all these companies, they're all really

81:06

pleasant. They're all just very

81:07

communicative, very oh and they talk

81:09

with customers, most of them just it's

81:11

it's not even a drag. It's like and more

81:14

and more of them love doing it. That's

81:16

the constraint value. Now the constraint

81:19

value is figuring out what should we

81:21

build, how should it be built, which

81:23

customers should we be talking to, where

81:24

should we be focusing. It's product

81:26

management. It's so funny for me too

81:28

because historically I've not

81:30

necessarily had the highest esteem for

81:32

product management as a function. I

81:34

thought there was a lot of [ __ ] and

81:35

I thought it was a lot of people who

81:37

maybe didn't do as much, right? And one of

81:40

the reasons was that they couldn't,

81:42

because the constrained resource was the

81:43

implementation. The product manager

81:45

could find out that they wanted to do

81:46

something, "I want to do this feature," and

81:48

then they had to wait four weeks for

81:51

some very expensive programmers to make

81:53

that reality happen and in those four

81:55

weeks I mean I guess they could go talk

81:56

to some customers. They were underutilized. They

81:58

were not the constraint, right? The

82:00

constraint was on the implementation.

82:02

That absolutely is going to

82:04

switch

82:05

>> and now pure implementation

82:08

is going to be solved at some point.

82:11

I'm not claiming it is right now, and

82:13

anyone who claims it is has not tried to just

82:16

deploy vibe-coded stuff with no review to

82:19

major code bases but as the lesson of

82:22

last summer on Lex I'm not going to put

82:24

my head on the block and say that's

82:26

not going to happen before next summer

82:27

>> again this is just like common sense but

82:29

implementation will

82:30

be solved for the general use case; for

82:33

the edge cases it will take longer and

82:35

for some cases it will not make sense

82:37

same thing as how self-driving is

82:39

fine for cars of this size, but for

82:41

trucks it'll either take longer, or

82:43

if you specialize you can do it. But the point

82:44

is like there will be pockets where but

82:46

those pockets will be smaller. Yes, I do

82:49

think the stereotype of I just want to

82:51

sit and code. You have to be John Carmack

82:55

levels of good to retain that privilege

82:57

to just I just want to sit and code.

82:59

>> And even John Carmack is also super

83:02

AI-pilled.

83:03

>> Well, but also he saw some

83:06

trends that he could do like for example

83:09

like just like you know the type of

83:11

games that people would buy, right? Like

83:13

he needed to have some business skills

83:14

or just surrounded by people who did

83:15

that. Totally, totally, totally.

83:17

>> But like you you need to literally be

83:19

the very best. And not just the very

83:21

best, but you need to be better than the

83:22

agents, right? Like for you to get the

83:24

privilege to just be an implementer, you

83:26

have to be better than what's available

83:28

off the shelf from from agents. So, who

83:30

are the very best? And you're a good

83:32

person to ask because whenever you

83:33

advertise a position, and this was even

83:35

well before AI, I remember that you

83:37

put out a job for both a software

83:40

engineer and a designer. And actually I

83:42

want to interview the

83:44

designer you hired, because

83:47

you published the salary, which is a San

83:49

Francisco salary. You put the exact

83:52

number; you can go check it. You have

83:53

a social media presence, so it kind of

83:55

goes wide and you get a lot of

83:56

applications and you do a pretty good

83:58

job; as I understand, you try to be

84:00

very fair. You put a lot of effort into

84:02

it. So, what did it take to get hired at

84:05

37 signals? Because now you are trying

84:07

to hire some of the best and based off

84:10

of this, what advice do you give to

84:12

people who are like, okay, I want to be

84:14

the best in in this age right now.

84:17

>> Incredibly good question. No one has

84:18

figured it out; we haven't cracked it. And I

84:20

say that as someone who has run an

84:23

organization where we must have

84:24

looked at tens of thousands of now, of

84:26

course, if you're running Google, you've

84:27

looked at millions, but we've looked at

84:29

tens of thousands of candidates. The

84:31

number of candidates we've hired is

84:33

quite small. I mean total number of

84:35

programmers that's been through 37

84:36

signals over its entire lifespan. What's

84:38

that going to be like I don't know 100

84:40

150 at the most? I haven't even

84:41

>> How big is your team right now?

84:43

>> Uh we're 60 people at the entire company

84:46

and we are what is that going to be like

84:47

20 programmers something like that.

84:49

>> Yeah, that's probably about right.

84:50

>> Oh, so who are the other 40

84:52

folks?

84:53

>> Uh we have designers

84:56

probably like 10 of those.

84:58

>> Wow. Wow. And then we have customer

85:00

support which is at 14. Then we have a

85:03

bunch of support functions, HR, finance,

85:07

and then we have operations. Operations

85:09

is quite large. We have 10 folks

85:11

managing all our servers. And yeah,

85:13

that's about it. But yeah,

85:15

it's probably about 100 people in total

85:17

that I've worked with or employed at

85:20

the company as programmers

85:21

>> out of tens of thousands

85:23

>> we've looked at. And not even all those

85:25

hires panned out in the long term.

85:28

Like I'd actually say I think I looked

85:29

at this recently. Our batting average at

85:32

best I think is slightly better than

85:35

50/50.

85:36

So half of even those hires

85:38

>> you go through all of because you have a

85:39

really long and thorough process. You

85:41

you you put in a lot of effort, right?

85:43

>> No one has figured out how to hire with

85:47

such efficiency that they don't make

85:50

mistakes. There's a great paper that

85:53

Google published quite a long time ago

85:55

now where they tried all sorts of

85:58

different hypotheses. Well, can we

86:00

predict employee outcomes on the basis

86:02

of Ivy League education background, on

86:05

GPA on all of these things? And the

86:08

conclusion was basically like we know

86:09

nothing.

86:10

>> We can't predict it on any of these

86:12

things. We can't predict it on

86:13

LeetCode. We can't predict it on any of

86:15

these metrics.

86:17

What I'd say is I've clearly been

86:19

spoiled by working with some very good

86:21

people, not just at my company, but in

86:23

open source in general.

86:24

>> Yeah. Oh, yeah.

86:24

>> And therefore, I've ended up with

86:26

occasionally a twisted perspective of

86:29

what the average programmer is capable

86:32

of. And when we do hiring rounds, I am

86:34

sometimes, well, not sometimes, I mean,

86:36

every time, I'm kind of surprised how

86:38

poor the majority of the submissions

86:41

are, how little effort is put into

86:45

being presentable. And that can sound

86:48

really boomery very quickly, but it's

86:51

also just the reality of trying to get a

86:54

job. Like, you got to stand out. And I

86:58

understand that that's uncomfortable,

87:00

right? Like, who wants to look at this as,

87:02

"well, the odds are kind of

87:04

against me." But it's also a trap to

87:07

actually fall into thinking of this in

87:09

terms of odds because what I've seen the

87:12

miscalculation happened time and again

87:14

is people go like okay so you have a

87:16

thousand applicants there's only one who

87:19

gets the job or maybe two who gets the

87:20

job, so that's a 0.1% chance. No, it's not, not

87:25

at all. With that math, you had a 0% chance.

87:27

>> yes

87:28

>> Zero. And the very best?

87:30

They probably had a 10% chance, 20%, 30%

87:33

chance. It is not equally distributed. It

87:36

is not a lottery. We don't just like

87:38

pick a thing out and be like, "Oh, it's

87:40

going to be this person because they

87:41

happen to be the one drawn from the

87:43

bunch." Not at all. We discard off the

87:46

bat probably at least half the

87:48

applications. Maybe it's two-thirds just

87:50

because they're either not addressing

87:52

the job directly, they are not following

87:54

the instructions in the relatively

87:56

clearly written openings that we have,

87:58

right? they're obviously not right for

88:00

it or or whatever or we get some other

88:02

smells. Then there's like perhaps a

88:04

third left and then we start looking at

88:06

some of the submissions. Then we narrow

88:08

it down historically to a pool of around

88:10

20 people that we give an at-home test.
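The funnel described here, with half or more discarded immediately, roughly a third surviving to review, and about 20 reaching the take-home, is why treating an opening as a 1-in-1,000 lottery is the wrong math. A toy model of the stages (all pass rates invented for illustration):

```ruby
# Invented numbers for illustration: a 1,000-applicant opening.
applicants        = 1_000
survives_triage   = (applicants * 0.33).round # addressed the job, followed instructions
reaches_take_home = 20                        # the pool given the at-home test
hires             = 1

# The "lottery" framing treats every applicant as interchangeable:
lottery_odds = hires.to_f / applicants          # 0.001

# But a candidate who makes the take-home pool has very different odds:
take_home_odds = hires.to_f / reaches_take_home # 0.05
```

The strong candidate's real odds come from conditioning on passing each stage, which effort directly influences, rather than dividing hires by total applicants.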

88:13

The at-home test is wonderful. Some

88:15

people hate it. They feel like it's free

88:17

labor. I'm like, what the [ __ ] are you

88:19

talking about? I'm not going to use your

88:20

submission to a code test. What? I'm

88:22

going to deploy it to production. How do

88:24

you think we came up with that code test

88:25

because it already exists in the system?

88:27

I say that a little harshly. I also

88:29

sympathize with, like, I don't want to put

88:31

six hours into making a test if it's not

88:33

going to go anywhere. Okay, I get it.

88:35

But there's no way around it because if

88:38

you have it in your head that you just

88:40

send in a resume, someone's going to

88:42

call you up on the phone, have a

88:43

30-minute conversation with you and go,

88:45

you're hired, sir. I don't know if that

88:47

ever existed, but certainly does not

88:49

exist today. It never existed in the

88:50

lifetime I've been in this. Well, the

88:52

only time it exists, right, is

88:54

>> through a very warm referral where

88:57

correct

88:57

>> where you're essentially

89:00

skipping the whole pipeline.

89:01

>> And when you skip the whole pipeline, it

89:02

typically only happens at the very

89:03

beginning of a company when you're

89:04

founding a company and often it goes

89:06

both ways where it's very risky and then

89:08

you say like this buddy of mine, I work

89:10

with this person for two years straight.

89:12

I would like trust them with my eyes

89:14

closed. So that's actually the black

89:17

pill on the whole hiring process. If we

89:19

look at the long-term success rates, we

89:21

have had more long-term employees from

89:25

I've worked with this person for 2

89:27

years, we should hire them than we have

89:28

from the open calls. It has actually

89:30

been exceptionally difficult

89:33

for us to find the kind of

89:35

programmer who thrives in our

89:37

environment from an open call. It has

89:39

happened. We have hired people that way

89:41

and I continue to want to believe even

89:44

if the odds seem insanely long when you

89:47

start doing the math of like oh my god

89:48

we've looked at tens of thousands and

89:50

how many then got hired and how many

89:51

then didn't work out like Jesus there's

89:53

only like a handful left from starting,

89:55

that's kind of

89:56

black-pilling. But then hiring directly on

89:58

the base of warm referral as you call it

90:01

has worked very well, and the hit

90:03

rate there is really high but how does

90:05

that help anyone right like that's not a

90:07

very actionable advice, except to

90:08

say: get as good as you can get and put

90:12

in as much effort as you can and work

90:14

with someone because I want to say that

90:17

as a counter. Some people have this

90:18

notion in their head that if they work

90:20

at a place they consider shitty, they

90:22

shouldn't try.

90:24

You're shooting yourself in the foot, buddy. If

90:27

you show up at the shitty place of work,

90:29

and we can even be objectively in unison

90:32

about that, that it is a shitty place of

90:34

work, and you then go like, "Well, I

90:36

should just try to shirk. I should just

90:37

try to goof off. I should just try to

90:39

read X or Reddit all day, right?

90:41

Everyone else you work with, they're

90:43

going to watch that. You know where that

90:44

warm referral is going to come from? It's

90:46

going to come from someone who worked

90:47

with someone else at a shitty job, but

90:50

identified that that individual still

90:52

showed up and did as best as they could

90:54

to learn, to ship, to do all of this

90:57

stuff. There is no shortcut here. You

91:00

simply just have to be good. And you

91:02

will not get good if you do not

91:03

practice. And if you think your place of

91:05

employment is not worthy of your best,

91:08

you're cheating yourself.

91:10

>> If you're not helping, even if it's a

91:11

shitty place, if you're not helping that

91:13

place get better, why would a great

91:14

place hire you who only hires people to

91:17

to further raise the bar?

91:18

>> This is total cope. And it's cope both

91:20

on the side of I work at a shitty place

91:22

so I don't want to put things in. You

91:23

could be annoyed. I'm not telling you

91:25

you have to love your boss. I'd actually

91:27

say the majority of people I used to

91:28

work for, I didn't have the warmest

91:29

feelings about them. I still tried

91:31

really hard for my own edification, for

91:34

my own education, for my own sense of

91:37

I'm the kind of person who shows up and

91:39

does a good job just that I will be

91:41

ready when the opportunity arrives when

91:43

all my talents are needed and all my

91:45

skills are honed. Right. Well, well, was

91:47

this not how you ended up at 37 Signals

91:49

where it was just a contract job or

91:51

something and you know like on a

91:53

contract job you have no ownership and

91:55

correct but you showed up and

91:56

>> correct and Jason ended up realizing

92:00

>> okay this uh punk better get some equity

92:02

otherwise he's out the door now that's a

92:04

seminal story, and you shouldn't

92:06

extrapolate everything from that I mean

92:07

all founder stories by the way are

92:09

seminal stories in that regard, but the

92:11

fundamental principle is still the same

92:13

show up do as good as you can learn

92:16

more. There also was, to my

92:20

chagrin, something that to some extent I perhaps

92:22

contributed to a bit for a while,

92:24

which was this notion that you can be a

92:27

great programmer and not really like

92:28

programming. That you don't have to ever

92:32

care about programming outside working

92:34

hours. Was this what you thought of,

92:36

or like

92:37

>> Well, I thought of it mistakenly because

92:40

I was pushing back on the overwork

92:43

100hour week, 120 hour week maniacal

92:46

obsession, which by the way never was my

92:48

experience. We did not start Basecamp

92:50

that way. We have worked on a 40-hour

92:53

week rolling average over those 25

92:56

years. But also, as I said at the very

92:59

beginning, I really like computers. So,

93:00

I play with computers in my free time. I

93:03

look at computer things in my free time.

93:04

It's not work in the sense that I'm

93:07

whatever shipping features to Basecamp

93:09

customers like just 24/7. That's not what

93:12

it is. But I am playing with computers.

93:13

I am looking at new things. I am

93:15

exploring new systems and whatever. And

93:18

I think there was for a while in the

93:20

2010s a misconception that you didn't

93:23

have to do any of those things. you

93:25

could just show up and do your work and

93:28

you would be so sought after because

93:31

programming was such a valuable activity

93:33

and there were so few people who could

93:34

do it that they'd take anyone even

93:36

people who barely gave a [ __ ] and I

93:38

think that's over if it ever was true

93:41

and I think it was true

93:42

>> the boot camps were the perfect, like,

93:45

catalyst, or like they were the canary,

93:47

when

93:48

>> which also by the way is how the economy

93:50

is supposed to function when salaries

93:51

are really high it means that there's

93:53

not enough supply of labor Therefore, we

93:55

should get labor into the pool.

93:57

>> Exactly.

93:57

>> And so, I'm not I don't even have any

94:00

qualms about internet. I'm just saying

94:02

like that's over.

94:02

>> No, I think, looking, you know, we're

94:04

talking about like is it is the golden

94:06

age of the programmer? Have you passed

94:08

peak programmer? And I wonder if peak

94:09

programmer really meant that almost

94:11

anyone who wanted to get into the

94:13

industry and was willing to put in some

94:15

effort, a few months or maybe a few years,

94:18

could do it. You could learn how to

94:19

code. You could go to either college or

94:21

to a boot camp or put in the hours and

94:23

you could get hired at a place because

94:25

the interviews were such that references were

94:28

not needed. We didn't check. And

94:30

it's probably coming to an end. You do

94:31

need references. I think more

94:33

and more companies will be doing

94:35

reference checks as part of their process,

94:36

and it's not just going to be "have you

94:38

worked there." I've had

94:39

these calls; like, Databricks is

94:41

famous for reference calls. They don't

94:43

only check for references. They drill

94:45

you not just would you work with this

94:47

person again, but what were their

94:48

weaknesses,

94:49

>> right?

94:50

>> Where would you hire them, etc., etc.

94:51

And

94:52

>> no, I understand it. The weird thing is

94:56

peak programmer sounds like this is

94:59

something that affects all programmers.

95:01

It does not. The best programmers are

95:03

not even the best as in like it's 10

95:05

people around the world. really good

95:07

programmers are currently more valuable

95:10

than ever because they're the ones who

95:12

are able to get the most out of the AI

95:14

acceleration. And this was the kicker

95:16

for me in changing my perspective on

95:19

this is that I've also found and maybe

95:21

it's not universally true, but certainly

95:23

within 37 signals in my own experience,

95:25

I'm enjoying my time as a programmer

95:27

more than any time since early 2000s

95:30

when I just discovered Ruby. This has

95:32

the I just discovered Ruby feel to it

95:35

that it is so satisfying to be able to

95:38

move this fast on so many levels at the

95:42

same time to be able to explore the P1s

95:44

to be able to think about dual booting

95:46

Omarchy, to do all of that stuff, that the

95:49

work itself has gotten vastly more

95:51

enjoyable and I've seen the same thing

95:52

for the most AI-forward programmers that

95:55

we have. They maybe also have some of these

95:57

anxieties but they're kind of pushed to

95:59

the side just out of sheer enjoyment

96:01

working with the new capacities. So

96:03

there is a bifurcation here where we

96:06

should all feel like well we don't know

96:07

what's going on and for some people

96:09

that's going to produce some degree of

96:11

anxiety. I understand that especially

96:12

when it's your livelihood and you're

96:14

like well I'd also like to be able to

96:15

pay for my kids college in seven years.

96:17

What does that look like? I get it.

96:19

You're not going to be able to manifest

96:22

that anxiety into anything productive

96:23

unless you just plow it into leaning in.

96:25

Right? Because if you just sit and spin

96:27

around, try to think about what the

96:28

world's going to look like seven years

96:29

from now, you're wasting your time. So

96:32

that's the only path. The only path is

96:35

to either get excited about this, which

96:36

I don't even think takes that much

96:38

effort. As we said, if you sit down with

96:39

these models, you pull out one of your

96:42

hobby projects from the closet

96:43

>> that you never finished,

96:44

>> that you never finished, and you just

96:46

give it a try, I don't see how you

96:49

really like computers and not find that

96:51

experiment enjoyable. And I've seen

96:54

this with people who are getting

96:56

into it. Kent Beck is such a great

96:58

example. He's been a programmer for 52 years

97:00

and he is saying like he he loves doing

97:02

it and he found this balance between

97:04

using the agents to build something

97:06

ambitious that he always wanted to build.

97:07

He's building his Smalltalk server,

97:09

which used to take forever, and now

97:11

it's getting closer, and it's still

97:12

taking a long time and then in between

97:14

he's chilling; he has his house

97:15

on the lake and he just goes and, like,

97:17

just looks at the birds for two hours

97:18

and then gets back to it. It's

97:20

beautiful. Kent is, by the way, one of

97:22

my all-time heroes. This was right when

97:24

I got started in programming, right

97:27

before I was picking up Ruby. I saw

97:30

Kent speak at a Danish conference in

97:32

2001 on stage and I was completely

97:35

mesmerized by his command of both the

97:38

material, how bold he was and how great

97:42

of a speaker he was. And this was after

97:45

having read Extreme Programming and many

97:48

of these other things. Smalltalk Best

97:50

Practice Patterns is my number one

97:51

recommendation for any programmer who

97:53

wants to learn the nitty-gritty of how to

97:54

structure a method and a class and the

97:57

rest of it. Smalltalk Best Practice Patterns,

97:58

which is Kent's book from 95, I think,

98:01

or 96, is to this day my favorite book

98:04

of all time on tactical programming

98:07

patterns. So, it's wonderful to hear

98:10

him being agent-pilled while also enjoying

98:13

the birds. I mean, I try to do that,

98:15

too. And there's actually a bit

98:16

of tension right now, which is that most of

98:18

the people I find who are all-in, they're

98:20

working harder than they ever have. And

98:22

I've seen that with myself now too. When

98:25

you can be this effective and impactful

98:28

on an hour of supervision of these

98:30

agents, it's really intoxicating. If you

98:32

have an active dopamine loop up there

98:34

that gets triggered when something is

98:36

shipped, it is just hyperactive right

98:39

now. And I need to go, do you know what?

98:41

This is not like a limited sale. AI

98:44

is going to be here next month and the

98:46

months after that. Like I cannot just

98:48

operate as though it is a limited sale

98:49

and I need to get all the dopamine

98:51

harvested within the next two weeks.

98:53

That I actually think is the main

98:55

challenge right now for the people who

98:56

are furthest along and most pilled on it

98:58

is like remember that this is as bad as

99:01

they're ever going to be, as the cliché

99:03

goes, right? You damn well better find a

99:05

way not to get consumed entirely by

99:07

it, as exciting as it is. And then

99:09

yeah, this consuming

99:11

is a big deal. Like with Steve Yegge, he

99:13

looks a bit more drained,

99:15

like you can see it on the video, but

99:17

he's honest that he's being

99:19

pulled into this. He has

99:21

friends who are too, and when you're on

99:22

the edge, you're there. You've clearly

99:24

been AI-pilled, but how are you finding

99:27

keeping a balance? Of like, all right,

99:28

stepping away, you know, like I I know

99:30

you've I think you previously talked

99:32

about the importance of sleep.

99:33

Apparently you don't have an alarm.

99:34

>> Correct. I don't use an alarm, although

99:36

my wife now does because the kids need

99:38

to go to school on a regular basis. But

99:42

yeah, for me, eight hours a night is the

99:45

best investment you can make in your own

99:46

cognitive capacity. So, I just am

99:49

reminded every single time I do not get

99:51

eight hours that it is such a poor

99:54

trade. If you go from the eight to the

99:56

six, I go like, well, I'm going to be

99:58

awake, in that case, for 18 hours. What is

100:01

the drag I'm gonna carry for all those

100:04

18 hours for getting one more hour, two

100:07

more hours by cutting back on the sleep?

100:09

It is such a bad piece of math. It makes

100:12

no sense. Now, occasionally it's

100:14

involuntary. I have actually had,

100:15

especially around this AI stuff, I've

100:17

had a couple of times, very rare, I can

100:19

count on two hands the number of times

100:21

where I've been sleepless, like

100:24

the brain racing a little too much.

100:26

That's not typical for me and it's still

100:29

not typical. But I have had a couple of

100:30

them, right? So, I get where some of

100:32

that excitement comes from. But I'd also

100:34

say the last thing you should trade is

100:36

sleep and then you should not trade your

100:39

health. You should not try to save the 3

100:42

hours a week of working out to do more

100:45

agent work. That's a very poor trade.

100:47

Keep in good condition. Like, there's

100:49

nothing that can be more important, if

100:51

you want to keep sharp up there,

100:53

than that the rest of the system is

100:55

operating, if not at peak capacity, then

100:58

at a good sustainable level, right?

101:01

and I do think there are some

101:02

individuals right now who are at risk of

101:04

running themselves ragged

101:06

>> on something that we're going to be

101:07

dealing with for years. Like, slow down, buddy,

101:09

it's not, again, a limited sale. The

101:12

next 10 years we're going to see more

101:13

and more it's going to get crazier and

101:14

crazier so don't squander your health

101:16

don't squander your sleep don't squander

101:18

your diet in the service of anything

101:20

because even on the short term, it does

101:22

not work. You cannot get more productive

101:25

within 3 weeks, let's say, by trying to

101:28

cut back two or three hours of sleep

101:29

every night and then think there's

101:31

anything coherent left after 3 weeks.

101:34

You will be a hot mess. So, let's close.

101:36

We talked about the stuff that we don't

101:38

know. A lot of things we don't know, but

101:39

let's close with what you do know. So

101:42

you could have retired a long time

101:44

ago and just, you know, kicked back and

101:46

listened to birds. What is it that

101:48

keeps you building, what keeps you

101:51

getting up every day? Before

101:53

AI, you would open your terminal, I think

101:55

you shared, and you would go

101:56

and write. Now you're doing it with

101:58

agents. What drives you? And

102:01

looking ahead, what are things

102:02

you're excited about?

102:03

>> My drive continues to be a deep love of

102:07

computers. This is simply the best way,

102:10

the most fun way to spend my time. I

102:12

could spend my time on a lot of things.

102:14

I do spend my time on a lot of things. I

102:15

don't just do computers. I drive race

102:18

cars. That takes up lots of time. I have

102:20

three kids. We enjoy all of that stuff.

102:23

But if I'm going to fill eight hours

102:25

every day with an activity, my best bet

102:28

is computers. And it has been so since I

102:30

was literally 5 years old. Whether it's

102:33

video games or what now feels a little

102:35

bit like a video game actually

102:37

instrumenting all these agents and

102:39

playing a little bit of StarCraft with

102:40

moving them around and

102:42

>> Toro.

102:42

>> Yes, exactly. So, I just really like

102:45

computers. So, whether I need to do so

102:47

for economic reasons or not, I will

102:48

continue to play with computers, see

102:51

what makes them tick and make things. I

102:54

think that's the other big misconception

102:55

that some people have about wealth is

102:58

that they conceive of it as some sort of

103:00

checkpoint. Like once you've made it,

103:02

then you can just kick back in leisure

103:05

as though that was happiness. We simply

103:07

have a hundred years of psychological

103:09

studies telling us no, that's misery. If

103:12

you have all the time in the world and

103:14

no purpose, no mission,

103:18

leisure is not going to cut it. It's not

103:20

going to be fulfilling. And this

103:22

should be obvious by example of

103:24

literally every entrepreneur who sells

103:26

their business. They sit on the beach

103:28

for three weeks and then they're back

103:29

into the game, right? Because this is

103:30

actually not just something they do in

103:33

pursuit of a goal. It's the goal itself.

103:37

It is the mission itself. It is the

103:40

satisfaction. It is the affirmation of

103:42

being a human that I'm not just a blob

103:45

laying around. I am a useful individual

103:49

who put my skills to the best use

103:51

possible. So, I'm going to continue to

103:53

do that. And I'm going to continue to do

103:55

it whether I'm sitting typing at the

103:56

keyboard, whether I'm instrumenting

103:58

these agents, whether they're teaching

103:59

me. Whichever way it is, I want to

104:02

play with computers, and I'm going to continue

104:04

to do that. And then even more

104:05

specifically after the last three

104:07

months, I'm leaning in hard now with

104:09

agent accessibility. For example, this

104:11

is what I've been doing the last few

104:12

weeks. We've been working on the new

104:13

CLI, which also taught me we're not

104:15

quite at AGI yet, right? You think like,

104:17

well, just ask your agent to make a CLI.

104:19

It will, but like it's not quite there,

104:22

right? I want it to be just right,

104:24

and the agents still need a little bit

104:26

of help. I'm very happy to provide that

104:28

help to these agents and we'll release a

104:30

great CLI for Basecamp very

104:31

shortly. By the time this is out,

104:33

it'll probably be out, and for the rest

104:34

of them too and I want to lean into all

104:36

of this. How can we use this as much as

104:37

we possibly can and then right now I'm

104:42

also just an incredibly curious

104:43

person. I wake up every morning I have a

104:46

new ritual which is not to pull my phone

104:48

up and start hopping on X right

104:50

when I wake up. I don't actually think

104:52

that is great. But it takes tremendous

104:55

willpower to not do so because I'm just

104:57

so curious about what happened. There's

104:58

so much happening right now. I want to

105:00

know. I want to know. I want to be

105:03

enjoying it. Be a part of it. So I don't

105:06

foresee that ending. I don't foresee a

105:08

love of computers evaporating. In fact,

105:11

if anything, right now I'm seeing like a

105:14

flourishing of it. I'm liking

105:16

computers more than I did five years

105:18

ago. And that's amazing.

105:21

>> Amazing, David. This was awesome. Thanks a

105:22

bunch.

105:23

>> All right. Thanks for having me. This is

105:24

really great.

105:25

>> This was a fascinating conversation and

105:27

I love the energy that David has. I hope

105:29

some of this energy that is obvious in

105:30

person also came across to you. I really

105:33

appreciated that David was open that his

105:34

stance on AI did not change because

105:36

his philosophy changed. It's just that

105:38

the tools became good enough to do

105:40

useful stuff. AI for autocomplete was

105:42

annoying for experienced developers. AI

105:44

agents that can produce pretty good

105:46

working code by themselves, on the other

105:47

hand, are now pretty useful. And yet

105:49

David kept coming back to taste,

105:51

judgment, and craft. He wasn't

105:54

saying just let the model write

105:55

whatever. It's the opposite. He has a

105:58

very high quality bar and he wants the

106:00

output to be code that he would actually

106:02

be proud to merge. It feels like AI

106:04

might make good judgment even more

106:06

valuable than before. I also really

106:07

liked how David thinks about the

106:09

importance of design. At 37signals,

106:11

designers help figure out what should be

106:13

built, how it should work, and

106:15

increasingly even decide how it gets

106:17

implemented. I wonder if 37signals is a

106:20

step ahead of the industry in thinking

106:22

about designers a bit like developers,

106:24

and developers a bit like designers

106:26

as well. Finally, I found David's take

106:29

that we might have hit "peak software

106:31

engineer" an interesting argument. David

106:34

thinks we'll produce more software than

106:35

ever. But his observation is that we

106:38

might be nearing the end of the time

106:39

when developers could command high

106:41

compensation simply because they were

106:43

the bottleneck. My two cents is that

106:46

there will surely be high demand for

106:48

professionals who can build profitable

106:50

software. But this will mean software

106:52

engineers who are not only good at

106:53

coding or using AI to generate code, but

106:56

can oversee building complex systems,

106:58

have taste, and have business sense as well.

107:00

If you'd like to hear more from David,

107:02

check out a bonus episode with him

107:03

linked in the show notes. Also, check

107:05

out the show notes for related

107:06

Pragmatic Engineer deep dives on

107:08

software craftsmanship and practical

107:09

ways of building software. If you

107:11

enjoyed this podcast, please do

107:12

subscribe on your favorite podcast

107:14

platform and on YouTube. And a special

107:16

thank you if you also leave a rating on

107:17

the show. Thanks, and see you in the next episode.

Interactive Summary

David Heinemeier Hansson (DHH), creator of Ruby on Rails and CTO of 37signals, discusses his transition from an AI skeptic to an "agent-first" software builder. He explores how modern agent harnesses like OpenCode and Claude Opus have transformed his development workflow into a "mech suit" for productivity, allowing for massive efficiency gains in tasks like PR reviews and performance optimization. DHH also details the unique culture at 37signals where designers act as product managers and implementers, and he shares his perspective on the industry hitting "Peak Programmer," where the value of software engineers shifts from pure implementation to high-level judgment, taste, and craft.
