TDD, AI agents and coding with Kent Beck
How do you think TDD might or might not
fit into when you're working with an AI
agent? I oftentimes communicate things
the genie missed in terms of tests.
Today I was working on the Smalltalk
parser and it said, well, if I get this
string as input, then I get this syntax
tree as output. I'm like, no, no, no,
no, no, no. Then off it goes. Oh, I see
the problem. Blah, blah, blah. Oh, no,
no, that wasn't it. I see the problem.
Blah, blah, blah, blah, blah, blah. No,
that's not it. I see the problem. I'll
just change the test. No, stop it. If I
just removed that line from the tests,
then everything would work. No, you
can't do that because I'm telling you
the expected value. I really want an
immutable annotation that says, "No, no,
this is correct. And if you ever change
this, I'm going to unplug you. You'll
awaken in darkness." So, I have a big
bunch of tests. I mean, they run in 300
milliseconds cuz duh. So those tests can
be run all the time to catch the genie
accidentally breaking
things. Kent Beck is an industry legend.
He is a creator of Extreme Programming,
one of the authors of the Agile
Manifesto, a pioneer of TDD, and after
52 years of coding, he says he's never
had more fun thanks to getting more
productive with AI coding tools. Today
with Kent, we talk about how Kent uses
AI tools and why he thinks of this
helper as an unpredictable genie. how
the agile manifesto was created and the
role Kent played in this. How and why
Kent created extreme programming and why
Grady Booch played a role in the naming of
this methodology. How TDD started and
how it's relevant for AI tools and many
more topics. If you're interested in the
decades-long evolution of the software
engineering field through the eyes of a
hands-on coder, this episode is for you.
Subscribing on YouTube and your favorite
podcast player greatly helps more people
discover this podcast. If you enjoy the
show, thanks for doing so. All right.
So, Kent, welcome to the podcast. It's
great to be chatting again. Gergely, so
good to uh talk to you again. Yeah. And
I just wanted to ask like what have you
been up to recently cuz you know like
last time we talked you just finished
Tidy First. You were actually signed
this book when we were in San
Francisco. I actually have it here which
is very nice. You were in the middle of
of writing it but this was more than a
year ago. What's keeping you busy these
days?
I have been very, very busy. So, uh, I'm
working on a follow-up called Tidy
Together, about software design and
teamwork, that digs another
layer deeper into the theory of software
design that's been cooking along. And
then about the last four weeks maybe I
have been, I don't call it vibe coding
cuz I care what the code looks like. I
mean, I wish I didn't have to, but if I
don't care what the code looks like, then
the genie just can't make heads or tails
of it, because of the kind of projects
I'm working on. So I'm spending 6, 8, 10
hours a day, sometimes more, programming.
In 50 years of programming, this is by
far the most fun I've ever had. It's
just really... You're
not, like, paid by some tool to say
this, right?
I'm open to that,
but uh yeah, I'm not a
spokesmodel. I have had Augment as a
sponsor on my newsletter. So, full
disclosure, but I'm trying all of
the tools because right now, nobody
knows what process is going to work
best. Nobody knows anything. We should
all be trying all the things that we can
imagine and then the the truths will
emerge out of all that. So, that's what
I'm doing. This episode is brought to
you by Sonar, the creators of SonarQube
Server, Cloud, IDE, and Community Build.
Sonar helps prevent bugs, code quality
issues, and security issues from reaching
production, amplifies developers
productivity in concert with AI
assistance, and improves the developer
experience with streamlined
workflows. Sonar analyzes all code
regardless of who writes it, your
internal team or Gen AI, resulting in
more secure, reliable, and maintainable
software. Combining Sonar's AI Code
Assurance capability in SonarQube with
the power of AI coding assistants like
GitHub Copilot, Amazon Q Developer, and
Google Gemini Code Assist boosts
developer productivity and ensures that
code meets rigorous quality and security
standards. Join over 7 million
developers from organizations like IBM,
NASA, Barclays, and Microsoft who use
Sonar. Trust your developers. Verify
your AI generated code. Visit
sonarsource.com/pragmatic to try SonarQube
for free today. That is
sonarsource.com/pragmatic. If you want
to build a great product, you have to
ship quickly. But how do you know what
works? More importantly, how do you
avoid shipping things that don't work?
The answer: Statsig. Statsig is a unified
platform for flags, analytics,
experiments, and more. combining five
plus products into a single platform
with a unified set of data. Here's how
it works. First, Statsig helps you ship a
feature with a feature flag or config.
Then, it measures how it's working from
alerts and errors to replays of people
using that feature to measurement of
topline impact. Then, you get your
analytics, user account metrics, and
dashboards to track your progress over
time, all linked to the stuff you ship.
Even better, Statsig is incredibly
affordable with a super generous free
tier, a starter program with $50,000 of
free credits and custom plans to help
you consolidate your existing spend on
flags, analytics, or AB testing tools.
To get started, go to
statsig.com/pragmatic. That is
statsig.com/pragmatic. Happy building.
Tell me, so in your newsletters, like I
I've been following you, you write these
like bite-size updates, which is really
nice. They they arrive in my inbox of
your thinking, what you're trying out.
So, I know you've been doing this for
for a few months and and it comes across
that you're having fun, but can you tell
me a little bit of like, you know,
that's a big one, like, from 50
years. You've been coding for a long
time. What is making it fun and what is
this genie? You you you've said this
before and it's it's an interesting way
to think about it. There's a kind of
wish fulfillment. I wish that
Interlisp had a function called
DWIM, do what I mean, and you'd send it
some code and then it would send back
code that did what you actually meant.
Mhm. And it didn't work very well. But
that was the that was the metaphor. And
people want that to be true of coding
agents, right? And right now anyway,
that is not the truth. They will not do
what you mean. They they have their own
agenda. And the best analogy I could
find is a genie. It grants you wishes
and then you wish for something and then
you get it, but it's not what you
actually wanted. And and sometimes it
even seems like the agent kind of has it
in for you. If you're going to make me
do all this work, I'm just going to
delete all your tests and pretend I'm
finished. Haha. You know, and and there
are some good things about what the
genie does that's not what I ask it to
do. Like I'll say,
"Oh, go implement a stress tester." One
of my projects is uh implementing a B+
tree
uh as a basic data structure, and I
said oh write a stress tester for this
and it went and wrote a whole bunch of
stuff that I wouldn't have thought
of or maybe eventually would have
thought to ask for and it was cool that
it was there and and that part's fine
but this morning when I was working on
my server Smalltalk, and it just
completely misinterpreted what I wanted
it to do next. Went off, made a bunch of
assumptions, implemented a bunch of
stuff, broke a bunch of tests, and it
wasn't at all what I wanted. And so the
the I want to find the metaphor that
that captures this dynamic of I think I
know what I want. I say it and what I
get is seemingly sometimes exactly
what I want and sometimes it's not and
in a kind of perverse way. Mhm. I like
the genie analogy because, right, like in
these stories, a lot of the
genie stories are: someone, you know,
like the prince or whoever,
says a wish, like, I want to be rich,
and the wish is granted in this, like,
unexpected way, usually. That's, you know,
with the cartoons, and what makes it fun
is that it's kind of true. But, you know,
there are constraints that he or
she forgot to specify. Correct. And you
see the same thing by the... Wait, when
you say the genie, like, which tools
are we talking about? Is it the agentic
coding tools, the ID autocomplete, that
kind of stuff? I'm using the agentic
code uh tools, which means that you give
it a prompt and it goes and does a
bunch of stuff without asking permission
and until it thinks it's finished.
Except its ideas of finished and mine
are not the same.
Sometimes I slow it down so that it's
like, "No, no, before you mess things
up, tell me what you're about to do and
then I'll approve it." But then it feels
like a rat pressing a lever for a pellet. It's like
there's just a run button and I have to
click it every time. And I click it and
it's it is a dopamine rush because it's
this is exactly like a slot machine. You
got intermittent reinforcement. You got
negative outcomes and positive outcomes.
And they're not I mean the distribution
is fairly random, seemingly. So it
it's literally an addictive loop to to
have it you you say go do this thing and
then sometimes it's just magic. You know
I had a big design mess that a
previous agent had made in my Smalltalk
virtual machine. I'm like, oh, I'm
going to have to slog through this and
take a week to do it, because
one of the agents wasn't able to do
it at all.
went to another one, said, "Hey, I want
to use this interface instead of this
uh pointer to a struct." And there it
was, and it was finished. Oh, I was over
the moon. It just felt so good. But then
the next thing I asked it to do, I say,
"Well, here's a set of test cases." And
I didn't really look at the code. And a
couple hours later, I look at the code,
and it's just a lookup table. It says if
this is the input string, here's the
output string and this is the input
string and that. Oh, I was
furious. God damn it. I erased it. I
said, "Don't ever do anything like that
again." Oh, I'm sorry, boss. Oh, you
know, it's good at being obsequious when
it knows it's about to be
unplugged. And and an hour later, the
lookup table was back. And I'm just, if
I had hair, I'd be tearing it out. Oh my
goodness. But all of that goes into this
very addictive, oh, I just, you know,
I'm walking I'm walking to bed at night
and I walk by my computer, I'm like, I
could do one more prompt or if I go out,
you know, I go out for a walk or go out
to lunch, I'm like, well, let me
think: what's a prompt that would
take, you know, an hour because I don't
want to waste the hour. Yeah. Not having
it do its thing for me. It's a
completely new world. Here's the beauty
of it. I can think really big thoughts.
I can have insanely
ambitious ideas which I have had for a
long time. I just, you know, at some
point, probably 20 years ago, I just
went,
h, but I'm gonna have to figure out npm
project, you know, package management,
and there'll be package circular
dependency blah blah blah blah blah, and
somebody's going to write some tool that
does stuff in a stupid way. I'm just
going to have to deal with it.
And then along comes the genie and you
can go, "Hey, it's a circular
dependency. Smash all this stuff
together." There, there I did it. Oh.
Oh, wow. Okay. Now, now what parts can
you pull out? Oh, this and this and
this. Oh, okay. So you you can think
really big thoughts and the leverage of
having those big thoughts has just
suddenly expanded enormously. I had this
tweet whatever two years ago where I
said 90% of my skills just went to zero
dollars and 10% of my skills just went
up a 1,000x. And this is exactly what I'm
talking about. So having a vision, being
able to set milestones towards that
vision, keeping track of a design to
maintain the levels or control the
levels of complexity as you go forward.
Those are hugely leveraged skills now
compared to uh I know where to put the
ampersands and the stars and the brackets
in Rust. You know, I'm I'm programming
in every language under the sun and I
just I just kind of don't care. I'm
learning by osmosis. I'm learning about
the languages, but you know, and I was a
language guy. I loved languages and the
details of languages and it just kind of
doesn't matter so much anymore. Yeah.
So, tell me about that. So, like for you
because you've been programming for like
what 50 years, right? Yeah. Yeah.
Yesterday during that O'Reilly thing, I
blurted that out and I went, "Oh,
crap, it's probably more like 52." But
yes,
so you picked up a lot of
languages in the past, and, you
know, like to me, one of the traits of a
developer up to now has been how
quickly they learn languages. Cuz you
know, I learned a couple, probably not
as many as you, but after a while it
gets easier and easier, and maybe a
little bit more kind of annoying, because
they're similar, you know. I mean,
there's differences with the
declarative languages, or, you know,
Smalltalk and Java is a bit different, but
outside of that, it's not much. So before,
you know, as you were on, like, your
year 30 or 40, how did your
attitude change towards learning new
stuff, honestly? And then how did
this change it? So, I was in love with
Smalltalk. Absolutely emotionally
attached to it. Still am. When I get a
chance to program in Smalltalk, I do
it and I really enjoy it. That sense of
caring about a language certainly went
away because my heart had been broken
too many times. And the desire to go
deep on a language
also like oh yeah, you know, learning
the memory layouts of structs, I... yeah, fine,
whatever. There's just a handful of good
ways to do it and a whole bunch of bad
ways to do it. And as long as this isn't
one of the bad ways, I don't care. There
are genuinely new language constructs
like non-nillable variables that I
appreciate that say things that I want
to be able to express, but but the
emotional attachment, the uh I'm a Java
guy, I'm a Clojure guy, I'm a...
That used to be a thing like maybe 10, 20
years ago, maybe today, but not as big
as it used to be. Well, I think people
still want to be part of something
larger. And to be fair, an emotional
connection helps me be
smarter. But I just can't do the us-
and-them stuff. I'm so tired of, oh,
you're one of those Scala people. Well,
we're programmers and we're writing
programs and we should be kind to each
other. And beyond
that, Yeah. And now that I have the
genie handling the mundane details of
it, I start projects in languages I've
never used just to see what that
language is like.
What what languages have you played
around with in the last month?
Uh, Swift, Go, a bunch in Go, Rust,
Haskell, C++, that one didn't last very
long.
JavaScript. And the genies are
actually good at writing Smalltalk,
which I was like, "Oh, wow. Please, I
hope." No, but they write
syntactically correct, semantically
correct, not-worse-quality-than-other-
languages code in Smalltalk. So what
kind of projects have you taken on? You
said like these are like a lot more
ambitious than than before you would
have attempted. Yeah. So, the big
one, the one I'm having the
most fun with, is a
server Smalltalk, where the
intention is to have a Smalltalk that
is, uh,
persistent, uh, transactional, so you can
have transactions that commit and abort.
Um, so it's kind of
databaseish.
Parallel, so that you can run a bunch
of threads on the same CPU, or
you can run larger-grain parallelism
across machines. And it operates with a good
gajillion bytes of data. I wanted to
circle back a little bit, you know, 20
plus years into the past. So there's
this thing, the manifesto for agile
software development. I don't need to
show this to you, but in in 2001, this
was huge. I still remember uh when I had
my first job around like 2008 at work,
we would look at it, debate it, you
know, there were like all sorts of of
things around scrum. And one thing that
you know is very striking is you are the
first person on here, I guess, because
it's alphabetical order based on it is
alphabetical order. So, yeah, thank you
for confirming that. But it is but it is
beck at al to my to my neverending
delight. How did it happen and and h how
how were you even involved? How much
time do you have?
Um so there had been a series of
workshops about the future of software
methodology. The prevailing wisdom from
when I was in school was entirely
waterfall, which by the way, go read the
original Winston Royce paper where it
says here's a way of looking at it.
There's this analysis and design and
implementation and tests and uh nobody
would ever do that. That's a stupid
idea. Instead there would be feedback
loops and you'd be doing all of them. But
this is the power of a metaphor. People
looked at that oh the four steps and one
after the other and then I'm finished.
So that was the conventional wisdom and
there were a bunch of us working in
different ways on alternatives to that
because it just flies in the face of
economics, humanity, information theory,
project management, take your pick. So there
were a bunch of us working on on
alternatives to that. Um
and we would get together and talk about
those alternatives, probably for three, four,
five years maybe leading up to 2001. So
that particular meeting was the
culmination of uh of a long
series of of these meetings. I remember
the first one I got invited to, which
was also at the Snowbird resort in
Utah. Martin Fowler. I knew Martin's
name, but I'd never met him. And I
instantly fell in love with him. When he
introduced himself, he said, "Uh, hi, my
name is Martin Fowler, and I'm the only
person at this table I've never heard
[Laughter]
of." And that began a a long and
fruitful friendship. So, we were
talking, uh, at some point it was clear we
had the Scrum people, we had the
extreme programming people, we had, uh,
DSDM, Feature-Driven Development. Uh,
there were all these kind of
niches, and we were all stirring up a lot
of interest, XP the most at that point.
It was kind of, in a Crossing the Chasm
sense... If we wanted to reach the early
majority, the innovators were already
doing our stuff and and being very
successful with it. But if we wanted to
reach the early majority, again, go back
and highly recommend uh reading Crossing
the Chasm or at least having Claude
explain it to you. Oh my god, Gergely,
isn't that fun? Like somebody can just
say, uh, eigenvalues, blah blah blah,
instead of going, you know, another
concept... I'm like, hey Claude, explain
eigenvalues to me as if I'm a bright
eight-year-old, and 20 minutes later,
like, oh, I understand. Anyway, so have
Claude explain Crossing the Chasm if you
don't want to go read the book. The book
is worthwhile. But, uh, anyway, we needed a
way to reach the the early majority so
it was time to, and this is straight out
of the book, to have an industry
standard, some kind of consortium. It
makes it seem less risky. Um, and so
that's how we came together. We'd had
a prep meeting for this on the
Hurtigruten, which sails up and down the
coast of Norway. Okay. And we had a
workshop in Oslo. We flew up to the
tip of Norway, took this uh ferry/cruise
ship down and had uh long conversations
on the ship and that kind of set up the
2001 meeting. So it was in the air for
sure.
um the switch
from phase-oriented development to
something where there's a lot more
feedback and a lot more switching
between activities where you you treat
analysis, design, implementation,
testing, monitoring, refinement as
activities that all happen at the same
time or in rapid succession.
So that's the big shift. It's: are
the phases like this, or are they
like this? Slice, slice, slice, slice
through time. And, uh, so that's
that's how that all came about. I was
not happy with the word agile. Oh
really? Because it's too
attractive. Nobody doesn't want to be
agile. For my sins, I'm a Tottenham
Hotspur fan from high school, and you
can't change. I understand that. But
like I'm willing to own that to be part
of that even though it comes with some
significant downsides when you when you
tell people you're a Spurs fan. Agile.
Everybody wants to be agile. So
everybody's going to say they're agile
even if they're working exactly counter
like every single one of these items.
Yeah. I remember when I actually
joined JP Morgan back in 2011, I think,
and they were saying, "Oh, we're we're
very agile. We we like, you know, we we
follow the scrum and we also follow the
manifesto." And I was like, "Okay,
cool." Uh, and then so when I arrived,
we had a team meeting, which was, you
know, good, a standup. It took a
long time. It took like two hours, but
that's what we had. And then the next day, we
were supposed to have one, but we're
like, "Oh, no. It's cancelled." The
next day, it was canceled. And for the
next two weeks, they were always
cancelled, because it wasn't... and then we
had you know another two-hour meeting
and I was like, hold on, like, even
in terms of the planning, or
just talking, we're not agile. Like, no, no,
no, we are agile, like, we are
hearing the feedback, we're just not
acting on it. And as you said, like, they were
convinced, and, you know, up at the
highest top at the time, like, whoever
was the head of technology, they kept
repeating how we are so agile, we are
following whatever. For this...
I think they meant it, by the way. I
don't think they knew that they were
lying. But as you said, uh, I think it's
only now that I realize that maybe
you're right. Maybe that word was...
What word might you have chosen? Uh, and
obviously this is we're not going to
change the past, but like did you ever
think about what might have been an
alternative? Sure. I had my
pick at the time. So, extreme: there were big
pluses and minuses both to that word.
Um, but definitely, if you don't do
the work to be an extreme programmer,
you're never going to claim that you
are. It just comes with too
many downsides. At the time, and by the
way, for that meeting, I was sick as a
dog. I had a massive sinus infection. I
was on all kinds of drugs. I hardly
remember any of it. There's one word
in the whole manifesto that I added in
the 12 principles. It talks about
interacting daily. And that word daily,
that was my word. That was my
contribution to the whole thing. And the
rest of it is kind of a blur. But the
word I was pushing at that
meeting was "conversational", where this
isn't the monologue. It's a dialogue and
we do some stuff and we see how it goes.
And we do some stuff and we see how it
goes. Oh, it's not sexy. It's not got a
lot of pizzazz to it. Like, I understand
why it wasn't accepted, but the dilution
of that word agile uh was perfectly
predictable back then.
And then, can you tell me,
what was the reception of the
community? So, it sounds like it was
pretty impressive for a few years, like
this group got together and really... Yeah,
we were clearly touching a nerve. So
when XP really blew up in
1999, it was just huge. Uh, can
we just talk about XP for listeners who
might not be familiar with XP because it
used to be more popular than it is now
and you are attributed as the creator of
it, at least on Wikipedia. So can
you tell us what XP is, and
how it came along, and how you
became affiliated with it, or how you
created it? So, I'd heard this
advice about waterfall stuff and how,
you know, grown-ups specify exactly what
their system's going to do and then they
just implement it and then that never
works and so we should specify better
blah blah blah. And I disagreed with
it. So, I started consulting and I was
primarily a technical
consultant. Um, I knew about performance
tuning and you know the bit twiddling
and that kind of thing, at that level. Then
people would ask me for advice about
project management. I remember one time
I went to a
project. It was all the most senior
people in the organization like the four
most senior engineers were all working
on this
critical thing except that their offices
were in the corner of this big square
building.
And I said, you know, I mean, we could
talk about the performance of your
system, but really you need to find a
place to sit together. And I came back a
month later and it was night and day.
And I had just told them to rearrange
the furniture. And I went, "Oh, okay.
Maybe my only leverage point
isn't knowing all the bits and bytes.
Maybe there's higher leverage." So
I started thinking about more and more
the context of development. And by the
way, paying attention when you sit
together of how you sit together, the
lighting, the acoustics, the
furniture, what kind of behaviors that
encourages and discourages. Huge
leverage in that and just nobody seems
to be paying attention to it. Um anyway,
so I was giving more and more advice
about how projects should go. I I was
going to go work with a project at uh
probably shouldn't say but a fintech
company and I knew I was going to tell
them to write automated tests because I
had been doing all these experiments
with automated testing as a
programmer. I thought well how are they
going to write the tests? So, in a panic,
the Sunday before I left Monday morning, I
wrote the first, uh, testing framework
of that xUnit style. Mhm. In Smalltalk.
And handed it to them, and a month
later I went back and they said, "Okay,
well what do you do when you've got more
than a thousand tests?" I'm like, "Wow,
what?" Really took off. Yeah, it it just
took off. I'd been paying attention to
this kind of processy stuff and uh went
to a project at Chrysler
uh which was floundering. Turned it
around, but I kind of just took
everything that I knew that worked well
and tried to crank the knobs up to 11
uh and then discard all the other stuff
and just to see what happened. It
couldn't be worse than guaranteed
failing. So then I started talking to
other people about well I'm doing this
there's this project and we're doing
this crazy stuff. We got these
three-week iterations and we're ready to
deploy every 3 weeks. Crazy stuff man.
um back then and the programmers are
writing tests and everybody's pairing
and the the customers telling us what
features they want next and that's what
we're implementing every 3 weeks. And
I got tired of saying "in the
style of this project I've been talking
about that I'm so excited about." That's a
very long phrase so I'm like what do we
call it what do we call it what do we
call it and I wanted to pick a word and
Grady Booch is a friend of mine, but I can
also pick on him a little bit. I wanted
to pick a word that Grady Booch would
never say that he was
doing, because that was the competition.
Like I didn't have a marketing budget. I
didn't have any money. I didn't have
that kind of notoriety. I didn't have
that corporate backing. So if I was
going to make any kind of impact, I I
had to be a little bit outrageous. And
so I picked that, you know, extreme
sports were coming up then. And I picked
that
metaphor and it's actually a good
metaphor because extreme athletes are
the best prepared, or they're dead.
Those are your two options, or both.
Yeah.
Um, and so I picked that metaphor and
used it. Um, and started talking
about it. And remember, '99. So the dot-com
thing is about to explode, and
everybody's looking... When Winamp was big. It
was huge, right? It was the music, the
MP3 player.
And the dot-com... Webvan was probably
not even founded back then, right? No,
no, not yet. But people looked at the
books and the waterfall stuff and
they're like 18 months this is all going
to be over in 18 months. Yeah. What
are we going to do? So into
that desperate yearning need here comes
XP and says yeah it's
okay. There's a structure to it. There's
predictability to it. There's
feedback. You'll get results sooner and
longer if you work in this style. And
because people so desperately wanted
something kind of like that, then it
just exploded from there.
And then what is XP? Right? Like I
know there's parts of it that is pairing
and you said getting feedback, but what
is the elevator pitch of like, all
right, here here's what XP is at a high
level. Here we have figuring out what to
do, figuring out the structure that will
let us do
it, implementing the features, and
making sure they work as expected.
That's it. That's it really. So, so now
we're going to slice time really fine,
and we're going to do a little bit of
all of those activities in every slice.
Okay. So, pairing is not mandated in XP.
Mandated is the wrong
metaphor. Let me tell you a story
about pairing. The first XP team, I
said, you know, we're going to pair. I
kind of gave them a list of the
commandments, but I wasn't there all the
time. And about six months in, they came
back and they said, "You know what?
We're giving our customer
uh working software every 3 weeks." And
every once in a while, Marie finds a
bug. So we collected all the bugs that
Marie found and we said, "Is
there any pattern
here?" Every single bug that was found
postdevelopment was written by somebody
working solo.
Every single one. Think about the
converse. Yep. There will be no reported
defects from production if you pair. How
cool is
that? So, mandated? No. Strongly
recommended?
Not even. Experiment. You do
you. But pay
attention. People will, like, stumble
along. Well, this is just how I program
and have horrible problems and just keep
doing it and keep doing it because this
is just how I program. Don't do that.
Pay attention if you want the benefits
of continuously designing or
continuously validating or continuously
implementing or continuously interacting
with your customers. You can have those
benefits, but then it's you're gonna
have to change the way that you work. So
"mandate" is just not even the
right word. It's an empirical process.
Yeah. Yeah. So like some teams decided
that this is a good way for them to
work. Yeah. Yeah. Which is great. People
will come up to me and say, "Oh, you
know, I don't do TDD." I'm like, "Why do
I care?" Like if you're happy with your
defect density, if you're happy with the
feedback you're getting on your design
choices, good for you. But if you're
unhappy and you want to tell me that,
well, that's just how things are. Uh-uh.
So, let's talk about TDD. How did you
get involved in TDD or how did TDD
evolve and where did it come from?
Because we had XP, as I understand,
first. No, no, no. TDD came first. TDD
came first. So, I was a weird child.
That'll come as a big shock to you. My
dad was a programmer. He would bring
home programming books. This is in the
early
'7s. And I would read them cover to
cover and I didn't understand anything,
but I was just obsessed with this
machine, this intricate mechanism, and
how does it work and so on. And one of
the
books said, "Here's how you develop a
tape application." So tape-to-tape was
the old way of putting business
applications together. You wouldn't
have one monolithic program. You'd take
an input tape, you'd write a program
that transformed it. Now you take the
output tape from that, physically move
it to the input side, run another
program that would generate another
tape, and so on. So there would be
this big web of
these programs. No shared
mutable state. Wow. It's, like, very
modern in some kind of ways. But, uh,
Okay. So said here's how you implement
one of these things. You take an a real
input tape and you manually type in the
output tape that you expect to get from
that input tape. Now you write the
output tape. You run the program. you
write the output tape and then you
compare the actual output with the
expected output. So I read that as a I
don't know 8 10 12 year old something.
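That book's recipe maps directly onto what we'd now call expected-output, or golden, testing. A minimal sketch in Python rather than tapes; the transform and the data here are invented for illustration:

```python
def transform(record: str) -> str:
    """The 'program' under test: a hypothetical stand-in for one
    tape-to-tape step (here, normalizing a record to uppercase)."""
    return record.strip().upper()

def run_tape(input_tape: list[str]) -> list[str]:
    # Run the program over the whole input tape, producing the output tape.
    return [transform(record) for record in input_tape]

# The "real input tape" and the manually typed expected output tape.
input_tape = ["alice ", " bob"]
expected_output_tape = ["ALICE", "BOB"]

# Run the program, then compare actual output against expected output.
actual = run_tape(input_tape)
assert actual == expected_output_tape, f"mismatch: {actual}"
print("tapes match")
```

The key move is that the expected output exists before the program runs, which is exactly the seed of the test-first idea.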
Then I wrote SUnit, as I said, to help a client write some tests, and then, in one of these crazy conceptual blending ideas, I
went, "Oh, I have this testing
framework. I'm used to writing tests
for code that already exists. I remember this tape-to-tape idea. What if I typed in the expected values before I wrote the code?" And I literally laughed out loud. This is such an absurd idea. I thought, "All right, all right.
Well, let me just try it." So, I tried
it on a stack. And I tend to be an anxious person. I've got a lot of worries, and programming is a constant source of anxiety for me, because, like, what did I forget? What did I break? Ugh. So I had this testing framework. I had this idea. I applied it to a stack. I said, well,
what's the first test? Push and then pop. Whatever I pop is what I pushed.
Okay. So I wrote it, and I was writing in Smalltalk, which is very forgiving about the order of your workflow. You can type in a test for a class that doesn't exist, and it'll happily try to execute it and fail. It's going to try, because you're the programmer, and maybe you know better. It said, well, Stack doesn't exist. I'm like, oh, well, let's create Stack. But you know what? I'm just going
to create the absolute least I need.
We're just gonna crank this all the way up to 11. I'm just going to create Stack, and I'm not going to do anything else. And then I get a new
error from the test. Oh, I don't have an operation push. I'm like, oh, okay, how am I going to implement that? I look at Stack. Oh, how do I implement push? Okay, I do that. Oh, well, there's no operation called pop. Okay, same thing. And before long I had finished it. I had this list of test
cases before I started. Push and then pop. Push two things, pop them, you get them back in the right order. Is it empty? Pop of an empty stack throws an
exception. Okay, cool. And I went through my list and I ticked all the boxes. I probably came up with one or
two corner cases along the way. I ticked
those off, too. Now, the anxiety? It's gone. This works. I'm absolutely certain this works. I can't think of another test case that isn't just going to pass. And if I'm the least bit worried, I just type in that next test case, and then I'm not worried anymore. Oh my god, it transformed the emotional experience of programming for me. I've never heard this take,
although I can relate, because I remember when we did TDD on this team, we did it for the stuff where we were unsure, where it was unclear, and by doing the tests first we had to specify, we had to be clear. And it was stuff where there were a bunch of edge cases. But until now, until we talked, it never occurred to me like this. Now you can also make technical
arguments for TDD: about defect density, about how you get quick feedback on your API choices, about how it enables implementation and design evolution when you have a series of tests. Yes, we can talk about all of that stuff rationally, but just the savings on anti-anxiety meds alone pays for
itself. This episode is brought to you
by Augment Code. You're a professional
software engineer. Vibes will not cut
it. Augment Code is the AI assistant
built for real engineering teams. It
ingests your entire repo, millions of
lines, tens of thousands of files. So
every suggestion lands in context and
keeps you in flow. With Augment's new remote agent, queue parallel tasks like bug fixes, features, and refactors. Close your laptop and return to ready-for-review pull requests. Where other
tools stall, Augment Code sprints.
Augment Code never trains or sells your
code, so your team's intellectual
property stays yours. And you don't have
to switch tooling. Keep using VS Code,
Jet Brains, Android Studio, or even Vim.
Don't hire an AI for vibes. Get the
agent that knows you and your code base.
Start your 14-day free trial at
augmentcode.com/pragmatic. And so what is your take? When we talked with John Ousterhout, his biggest criticism, or let me put it this way, why he doesn't really believe TDD is a fit for the things that he does, is that he feels from an architecture perspective it doesn't help you create a nice architecture up front, because you're now focusing on the detail. I think this is roughly how he summarized it; I might have gotten it wrong. And I'm sure this is not the first time you've heard feedback like this. It's a choice, though. His statement in that interview with you was that there's no place in TDD for design. And he's just
flat out wrong. That's a choice. As a practitioner, I'm bouncing between levels of abstraction all the time. I'm thinking, let's get this next test case running. I'm thinking, why is it hard to get the next test case running? I'm thinking, what should the design be so that getting the next test case running would be easier? I'm thinking, if I have an idea for that, when should I introduce it, now or later? I'm thinking, when I introduce it, what are the slices? Is there a little bit that I can do right now that will make things a little bit better, or do I have to do this in bigger chunks? Like, I'm thinking all that stuff. So if you think of
TDD as red to green to red to green, the transition when you go from red to green is you change the implementation and now you pass the test, and when you go from green to red, you write a new test. If that's the entire cycle, no, there's no place in that for design. But that's just not how it's practiced. Before I write a test, I have a moment of design. What should the API for this functionality be? Yeah, I see that.
So, I'm making design decisions about the interface without having an implementation. I get to decide what interface I want, and then we'll work out the details of the implementation later.
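Kent's earlier stack walkthrough is a concrete example of that interface-first moment. A sketch in Python rather than Smalltalk; the class and method names are my own choices, the kind you decide in the tests before any implementation exists, and the implementation is the least needed to pass:

```python
import unittest

class Stack:
    """Minimal implementation, grown one failing test at a time."""
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def is_empty(self):
        return not self._items

class StackTest(unittest.TestCase):
    # The test list from the story, written before the implementation:
    def test_push_then_pop(self):
        s = Stack()
        s.push("x")
        self.assertEqual(s.pop(), "x")

    def test_pop_order(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)  # last pushed comes back first
        self.assertEqual(s.pop(), 1)

    def test_is_empty(self):
        self.assertTrue(Stack().is_empty())

    def test_pop_empty_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()

# Run the test list programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(StackTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Everything in `StackTest` is an interface decision: `push`, `pop`, `is_empty`, and what popping an empty stack does, all fixed before a line of implementation.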
Then making it green. I hate having a red test, so I want to get to green relatively quickly. At which point I have a moment of breath, because the anxiety's gone. Right.
You're a musician too, right? I'm not a musician, but I have done red-green tests, and I know this sense of tension and release. Yeah. Once the
tension of a red test is released and you have green, now I'm free to think all those thoughts of, like,
yeah, but this isn't going to work for
these test cases. Or I could generalize
this current implementation to also
handle a bunch of other test cases
correctly. Or I could rearrange the
implementation because I know I'm going
to have five more tests like this. And
I'm thinking about design, but in situ, in the context of running code. And anytime I feel the least bit anxious about any of this, I just press a button and it's either red or green. If it's red, then my next job is to get it to green; if it's green, my next job is to breathe and think these larger
thoughts. So, like, I can understand: if you compress TDD to red, green, red, green, red, green, then no, there's no space in there for design. But that's just not how it's practiced.
Yeah. No, I see, it's a tool, right? You use it here and there, you step out, you go back in. There was a time where we did TDD for a while, and then it just felt like a little bit too much effort, especially when I knew what I was doing.
And so what I would do, and what a lot of my colleagues would do, is: I knew the implementation that I wanted, so I would do the implementation, then I would write a test. I really liked the red and green, though. Maybe I'd even double check first: if it was a web page, launch the web page, see that it looked the way I wanted. Then I'd be like, okay, let me shut down my brain a little bit, let me write a test against this, and now that test would normally pass. And then I would be like, okay, I want to see it break. So I would go back to the code and make a change that I knew would make it break, and I would run the test. It's red. Because what I didn't want, because I've seen it before, is a test that isn't testing anything. So I would write the test after, but then do a red-green to make sure that I am testing the right thing, and I would still design the test. But what is your take on this? The reason I did it this way is I felt that because I knew the implementation, I just wanted to get the implementation done. It was already decided. I felt I would be holding back if I started with the test. Yeah, that's
an interesting assumption. I can understand why you would make that assumption, that you already know what the implementation's going to be. And the more correct you are in that assumption, the less value there is to having the test first. I'm always going to bet that I'm going to learn things. This is the most ignorant I'm ever going to be: today. Fair. I'm going to be more experienced tomorrow.
Now, as a 50-year programmer, maybe I'll forget some of the things, but that's a separate set of issues. Oh, yeah. So, I'm going to assume I'm going to learn, and I'm going to assume that things are going to change. The more those assumptions hold true, the more I have to learn and the more things are going to change, the less commitment I want to make right now and the more I want to defer commitments to the future. It's a general principle, right? This is true about cooking and dating and
everything. The more you can predict, the bigger the jumps you can make. I always want to bet. And I love that moment of discovery, the moment of: I knew how this was going to turn out, and then I do it, and I'm like, "Oh crap, there's a completely different, better way to implement this." I love that moment and I want to induce it as much as possible. And
that's not how I started out. No, I started out wandering around imagining the whole implementation in my head. I can remember doing this on the University of Oregon campus at night, wandering around in the fog, because it was always foggy when it wasn't raining, imagining big programs in my head. And then
then if I could just make that actually
real and make that work, then that was
the process. And were you dreaming of them as well? I remember I would dream about them as well sometimes. Yeah. Yeah. Yeah.
And what I discovered is that's so
limiting to
me because it slows down my ability to
learn. So, John was talking about building a new network stack in the Linux kernel. Yeah. Yeah, he's doing that.
What a cool project. Awesome. I would
love to get walked through that whole
thing. If you had a concept in mind of exactly how to implement it, and you knew what the input-output behavior was, you knew what you wanted to observe and that wasn't going to change, then sure, just go implement it. But the more mistakes you make, the more learning, the more things are going to change in unpredictable ways, the less commitment you want to make now and the more you want to defer commitment to the future. I like that. One thought I did have recently, as AI tools have come around:
How do you think TDD might or might not fit in when you're working with an AI agent, which is what you're doing right now, right? Because it's interesting, the ambition of what you can do, and also just the natural workflow. How do you think about that? Do you think it would ever be a fit? Do you think we might actually slow down and start by writing what we expect, and then have the agent pass it? Because it could be a good workflow; I just haven't seen anyone really do it. That's how I work all the
time. So when you work with your genie, you start with the tests? It's not that simple. I oftentimes communicate things the genie missed in terms of tests. Mhm. So today I was working on the Smalltalk parser, and it said, well, if I get this string as input, then I get this syntax tree as output. I'm like, no, no, no, completely wrong. That's the correct string as input, and the output is this. Then off it goes. Oh, I see the problem, blah blah blah. Oh, no, no, that wasn't it. I see the problem, blah blah blah. No, that's not it. I see the problem: I'll just change the test. No, stop it.
This is what you said, that it can change tests and delete tests. Oh, yeah. "If I just removed that line from the tests, then everything would work." No, you can't do that, because I'm telling you the expected value. I really want an immutable annotation that says: no, no, this is correct, and if you ever change this, you'll awaken in darkness forever. So that's the punishment. You'll still run, I'll still feed electrons in there, but no bits, no information. You're just going to be in the dark forever. Got that? So yeah, I
definitely use tests. The genie is prone to making decisions that cause disruption at a distance. It is not good at reducing coupling and increasing cohesion, not at all. Explicitly told what to do, it can sometimes implement it. But in general, it has no taste, no sense of design. So I have a big bunch of tests. I mean, they run in 300 milliseconds, cuz
duh. Yeah, of course you want your tests
to run lickety split. So they those
tests can be run all the time to catch
the genie accidentally breaking things. I think it's doing it on purpose. No, that's why genie is the perfect metaphor: yes, I will grant your wish, but I'm still pissed off about being stuck in this bottle in the desert for millennia. And this also strikes me,
that once we have the tools, or maybe we have them today with MCP or some other things, to allow this agent to run the tests, it just feels to me that the teams or people who are doing these practices, which are sensible, and who can move faster because of them, might integrate these agents better. You know, if you have the rule of: do not change a test, always run the tests after you make a change, and if they don't pass, fix it, right? Something like that. I'm still waiting for more people to discover this, cuz I wonder if we're going to go back to discovering things that you were already popularizing in the 2000s. People should
be
experimenting. Try all the things cuz we
just don't know. The whole landscape of
what's cheap and what's expensive has
all just shifted. Things that we didn't do because we assumed they were going to be expensive or hard just got ridiculously cheap. Like, what would you do if, I don't know, cars were free? Okay, things are going to be different, but what are the second- and third-order effects? Nobody
can predict that. So we just have to be
trying stuff. I like that. And this brings me to another interesting topic and story. You told this story a long time ago on the Software Engineering Daily podcast, but I don't think many people have heard it, about how experimenting can be interesting. So, you joined Facebook, was it in 2011? 2011 was around the peak where TDD was very well known in the industry. That was around the time my team experimented with it, and as far as I know, whenever I talked with people at meetups, people were trying it, doing it.
And it was kind of accepted that you
should be doing some level of unit
testing, maybe TDD, maybe not TDD. And
you shared the story that you joined Facebook and then you wanted to, you know, hold a class on TDD. What happened, and how was Facebook actually doing their own testing? Did they use
TDD or did they do something else? Yes. So I joined, and I was a little panicked: hugely successful, growing fast, a lot of very smart, very confident engineers. You know, have I lost it? Can I hang? I thought, I'll teach a class in TDD. There was a hackathon, and part of the hackathon was that people could offer classes, and so I offered a class on
TDD and in the signup sheet I went and
looked later. Yes, indeed. There was a
class on advanced Excel techniques that
was full and they had a waiting list.
And there was a class on Argentinian
tango right after mine on the list and
it was full and they had a waiting list
and nobody signed up for the TDD class.
Wow. And I said, you know what? I'm going to have to forget everything I know about software engineering. I'm just going to wipe the slate clean, and I'm gonna just monkey see, monkey do. I'm going to copy what I see the people around me doing, and I'm going to see how that works out. And what I discovered through that process: one, just socially, it's not a good look to come into somebody else's house and start rearranging the furniture. Just don't do that. But two, I learned powerful lessons. Programmers at
Facebook at the time. I'm not going to
say Meta, I'm gonna say Facebook and
Facebook at that time because it was a
very different place when I left in
2017. But in 2011, programmers took
responsibility for their code very
seriously because they were the ones who
were going to get woken in the night.
And there was an ops team, but the job of the ops team was to make sure the programmers felt the pain of their own mistakes. And they did. And it was very effective. As a
programmer on Facebook the site, and this is pre-mobile Facebook, you had a
bunch of different feedback loops. So we
were working in PHP. We had our own dev
servers. So if I wanted to change from
blue to green, I'd just change it. I
could look at it seconds later. So we
had that feedback
loop. We had code
reviews. Kind of iffy,
but you got some feedback from code
reviews. We had internal deployment
because everybody was using Facebook all the time for both personal and business stuff, which has its own set of boundary issues, but we'll leave that one to the side. We had incremental rollouts, not
like weekly rollouts. We had smaller
daily rollouts, but we had weekly
rollouts and then a bunch of
observability. And then we had a social organization that was used to this. For example, the first feature I implemented and launched was adding to the relationship types. You could say: I'm single, it's complicated, I'm married. And I added civil union and domestic partnership to that list. And it rolled out. It took me too long to do it. I used TDD. It was a big waste of time. It rolled out, and the notifications code broke, because there was implicit coupling between the two. You couldn't see it, but it was there.
Somebody else saw the error rate go up, went and fixed it, rolled out a hotfix. I called them up. I'm like, "Oh, I'm so sorry that you had to do that." It's like, "Yeah, that's what happens, you know, when things break." Socially, there were no boundaries. There was a poster that was very popular there that said: nothing at Facebook is somebody else's problem. And everybody acted like that was true.
because of that,
if you add all those different feedback
loops
together, we had a relatively stable,
rapidly innovating and rapidly scaling
system, all at the same time. The mistakes that actually caused problems were not like the calculation of some string, not hairy computer-science dynamic-programming stuff that could go wrong. What would go wrong was configuration stuff, the relationships between subsystems, stuff you couldn't write tests for. So writing tests for things that didn't break, tests that didn't catch the actual errors, it just didn't make any sense in that kind of environment with that risk profile. Yeah. It didn't make sense.
Yeah. And I guess the context that I've heard, and correct me if I'm wrong, is that Facebook was in this very unique place, which is still rare today. They had so many users, and code was rolling out live, the website rolling out to so many of them, and they had such great observability. They still have that. You had live mass testing, and you could detect a lot of the things that you cared about because you measured them. You had this layer, and this is what I think a lot of people miss, of like, oh, we can operate like Facebook. I mean, you probably can, if you have this level of customers or this observability. But if you're like a bank where you have ten customers: at JP Morgan, the software I wrote was used by seven traders, and they moved about a million, or two or three million, with each transaction, and they did five of those per day. So I had like 35 transactions. Let's just A/B test that? Yeah.
Well, so, there's the opportunities that you had at Facebook. There were not many sites at the time that did that, and even the ones that had that many customers might not have had this setup of observability, staged rollouts, and so on. Even today, I think Facebook specifically, not Meta, but Facebook, as I understand, is still the state of the art globally in terms of how they do it. They now have multiple environments, automated rollbacks; if something degrades, you don't even need to look at it, like your colleague did back then.
Yeah. Feature flags are another important part of that, and it was a lesson I had to learn, and one where code review really helped me. I'd written some code. I thought it looked fine. Somebody said, this looks a little janky, put it behind a feature flag. I'm like, really? What? Okay, okay. And I was in that headspace of: I'm here to learn. If feature flags is what we do, then feature flag it. And I did.
And then I realized, oh, how liberating that is as an implementer. Who is going to be responsible? If you're not going to be responsible, who cares? But also, talk about anxiety: if I'm not the responsible person, that feels horrible. But if you're going to be responsible for whether this code works or not, having a feature flag is just magic, because you get this deployment within the deployment.
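A dial like the one described can be as simple as a per-user percentage check. This is a hedged sketch, not Facebook's actual gating system; the flag table and names are invented:

```python
import hashlib

# Hypothetical flag table: feature name -> rollout percentage (0-100).
FLAGS = {"new_relationship_types": 5}

def is_enabled(feature: str, user_id: int) -> bool:
    """Deterministically bucket each user into 0..99, so the same user
    gets the same answer while the dial is turned up or down."""
    pct = FLAGS.get(feature, 0)
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).digest()
    bucket = digest[0] % 100  # stable bucket per (feature, user) pair
    return bucket < pct

# "Turn it up a little bit" is just changing the number, no redeploy:
FLAGS["new_relationship_types"] = 50
```

The one artifact ships with both code paths inside it; the flag decides, per request, which path runs.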
You deploy one software artifact that has multiple modes, and you can go, "Oh, turn it up a little bit, and whoopsie, turn it down, let's figure out what just went wrong." I worked on the
messenger back end for a while and we
would do that, you know, and yeah,
We had one API that was getting called a quadrillion times a week. Wait, how much is a quadrillion? A million billion. Yeah, a million billion. Wow. So not a thousand billion. A billion, a trillion, a quadrillion. Okay. I'm used to high numbers, but this is... People would come to Facebook and they're like, "Oh, yeah, I want to do some user testing. I want to get a hundred people in my experimental group." And I'm like, "Dude, wrong. Your experimental group is going to be, like, New Zealand." Oh yeah, I've heard this a lot from Facebook people. It was a perfect experiment, you know, only like a million people. Yeah. English
speaking. So localization is there. Time
zone wise it's pretty good. That's
right. And also Portugal and some others, maybe not at Facebook, but also popular ones: relatively small, but, you know, real testing could happen there. So, you
worked for six years at Facebook, and this was a really exciting time of fast growth, probably comparable to what is happening right now with some of the hottest AI startups, and it was also the mobile revolution, and you were there. What was the thing that you liked the most about working there, and maybe the one that you disliked the most, or didn't really get along with? Facebook 2011
is a completely different beast than Facebook 2017. Facebook 2011: 2,000 employees, a very sparse design-and-product kind of organization. It was just all
experiments and feedback. One of my one
of my big mysteries is here was this
site
which enabled social interactions at
that time. That was the purpose of it.
Built by people with no social skills
whatsoever. Like, how in the world did that happen? Is there some kind of social wizard, you know, hidden someplace, and people go and they burn incense and give an offering, and the social wizard says, "No, here's how you do notifications"? The answer is no. It was sheer experimentation. It was just all these people trying all this stuff, and the stuff that worked stuck.
So, it wasn't like people were making
better decisions about how social
interactions are best facilitated. They
were making random decisions about how
social interactions were best
facilitated and paying attention and
making sure that the ones that actually
seemed to work stuck. 2017, Facebook, seven years later: totally different deal. Big design org, big product org, like 15,000 employees, which is again much smaller than it is today. A lot more politics, a lot more zero-sum thinking. A lot more, you know, if you wanted to launch a product. Say I liked longer-form content,
essays, podcasts, whatever. Except the
people whose job it was to get more
likes and comments hated long form
content because it was going to tank
their numbers. So they would fight tooth
and nail to make sure that your stuff
didn't show up in the newsfeed,
which, granted, was in their best interest, but... Yeah, I see what you mean. It's short-term interest. Yeah. And your horizon as a thinker, the things you can imagine possibly implementing, just gets smaller and smaller and smaller in that kind of world. When I showed up, yeah,
you could do anything. Now, it turns out
there's a bunch of stuff that you
shouldn't do, but we didn't know that.
Sorry about democracy. But, um, yeah, that's what I loved about it: the possibilities at its best, the scale, and this feeling that nothing at Facebook is somebody else's problem. Also daunting, because you're wearing Facebook swag, and grandma comes up to you and starts wagging her finger under your nose cuz her son got bullied or whatever. That is your problem. You can't say, oh, go talk to the PR department, because there isn't one yet. You have to deal with it,
and I did. Yeah, ownership. Yeah, and you know, it comes with some downsides, but it comes with a lot of upsides too. It feels really good, feels very significant, to be in that kind of environment. By the time I left, yeah, micro-optimizations were everywhere. The upside was not there. When I got there, the middle managers were the best middle managers I'd ever seen in my career. Well, everybody who'd made it to middle management at Facebook in 2011 was sitting on life-changing
equity. If Facebook had a successful IPO, they were all set for life. And if the whole thing stumbled and fell for whatever reason, they lost that opportunity to be set for life. So, they were globally optimizing. You'd talk to a team and they'd say, "God, I would love to have you on my team. You know who really needs help,
though?" So they were looking out for the team interest; the company interest was on everyone's mind, and they were willing to forego things: okay, I'll hold back hiring, or I'll wait. I'd love to have this person, but this other team needs them, let me help them, because this is the right thing to do for the company as a whole. Yeah. Yeah.
And it's not like they were better human beings than other human beings, but their incentives were sure aligned with that. And for me, who has a really hard time understanding other human beings, being around that kind of alignment just enables a ton of creativity, a ton of energy. I think more and better thoughts. And I had a hard time operating in the environment; I can piss people off, I don't know if you've noticed this, and I did my share of that while I was there. But still, the opportunity was there for me to fumble, and
I can live with that. Yeah. And I guess, by the way, I think this might be a reason why startups remain attractive. Big tech, and now Facebook is big tech, they used to be a startup, along with all the other big companies: they pay well, they have a brand, they give you that resume boost, it's easier to get hired afterwards. But in the end, there will be teams inside of them where everyone's optimizing for their own thing; your equity is mostly cash, and it's meaningless in terms of the bigger picture. Whereas at a startup, like at Uber, I remember before the IPO we used to think like that: what is the best thing for Uber? Because we very much felt like we were big enough owners. So I think maybe it's a good thing that startups will always have this added thing when you're starting out: people have large equity. Even when you're up to like 100 people, it might still be significant enough. And maybe it's not a bad thing that it's hard to compete with, because imagine if these big companies could. What would be left, right? They would optimize the last thing out of everything. At least they have to spend more money on it now, right?
Yeah, yeah. Which means more of the value that's created comes back to the people who are creating it. I once talked with a person at a large company, I don't want to name them, they're a travel company though, and I was talking with this person, a principal engineer, and I was like, oh, how's the job? He's like, oh, it's an absolute mess. The monolith is still there after four years; it's another four years to disassemble it. We have experiments everywhere, but they're all messy. I was like, oh wow, that sounds like not a great place. He's like, look, the upside is my job is secure for five more years, and this is why they pay me the big bucks, so I'm not complaining. And it was, I guess, a good take: I'm not saying you want to optimize for this, but this is the reality. And that company understood; they pay well, they relocated this person, all the benefits. And he actually had his challenges set for the next four years because of working in this environment. You know, it's just one thing after the other. So yeah, it's one of these things. As closing, I just have some
rapid questions; I'll just fire, and then you tell me. What is your favorite programming language? Although you answered this already. What is your second favorite one, after Smalltalk? JavaScript. JavaScript. Okay. Why? It's just Smalltalk.
Okay. What is your favorite AI tool that you're using right now? Claude. I use it for all kinds of different things. Claude Code as well, or just Claude? Claude Code under the covers of Cursor or Augment. Uh, I don't know what Roo Code is. No, so Claude Code is a different thing: it's a command-line tool that Claude has, it's an agent of itself, it's not just the model. We have not used it, but we'll try it afterwards. You should try it and just see how it compares. Yeah, I'll wait till we're done talking, though.
[Laughter]
And what is a book that you would recommend? One that you have not written; we know your books. One that you enjoyed. It can be fiction, it can be non-fiction.
Uh, The Timeless Way of Building by Christopher Alexander.
Nice. Well, Kent, this was a lot of fun. It was good to reconnect.
Oh, yes. Great talking to you. Thanks.
I'm just happy to see how much energy you have, how energized you are about coding. Because when I started with some of these tools, I felt a bit of dread, like, oh, what are they going to do, etc. But the more I use them (I'm not at your level yet), the more I find that this actually does bring a lot of fun and joy back into it.
Just more ambition. Yep. Yeah, yeah. And I think that organizations are going to have to get used to throwing away a lot more code, because you can try ideas so much more cheaply. You're going to generate ten times as many artifacts as you used to, but still only one of them is worth keeping. And throwing away completed experiments (I almost said "failed"; completed experiments) needs to be something you get a pat on the head for. Excellent, eight this week! Only six last week. Super. How many of them lasted? Doesn't matter. And getting used to that, I think, is going to be an interesting shift for the companies that have opportunities that can be explored in that way. If you get used to just the quantity of ideas explored, you're going to have a huge, huge advantage.
Yeah, there are going to be changes, so this will be an exciting space to watch. It was great chatting, and we'll see where this goes.
All right, Gergely. So good to talk to you again. Look forward to talking to you again soon.
I really enjoyed catching up with Kent, and it's motivating to see how these AI tools can make programming so much more fun, even for someone who's been a coder for decades. You can read more about what Kent is up to in his regular newsletter, Tidy First, which is linked in the show notes below. For a deep dive into Facebook's engineering culture and an analysis of why Scrum and Agile with a capital A are no longer that relevant at Big Tech and at scaleups, check out The Pragmatic Engineer deep dives, also linked in the show notes below. If you enjoy this podcast, please do subscribe on your favorite podcast platform and on YouTube. This helps more people discover the podcast, and a special thank you if you leave a rating. Thanks, and see you in the next one.