Luke talks RAM Pricing, Tech Youtube and LTT Developer hiring process | The Standup
Today we have a very special episode of
the standup. We have with us Luke from
Linus Tech Tips. Say hi.
>> All right. There he is. Right there. We
have
>> I thought it was Luke Tech Tips. Was I right?
>> Not to be confused with LTT. He has LTT. This is LT, not LTT.
>> Okay. Different LT. Got it. LT. Okay, that makes sense. Thank you.
>> There's some other guy who no one knows
who's like a co-host or something on
that. The main channel, I think. I don't
know. But Luke's the main dude.
>> Luke does the heavy lifting.
>> That's real.
>> All right. Thanks for the intro
interruption. Yeah.
Uh, anyways, sorry. Also with us is TJ. And I interrupt things.
>> Thank you. Yep. TJ, you're wearing a sweater that can no longer be bought. It must be very exclusive.
>> Arch, by the way,
>> Arch forever.
>> TJ already has children, so it's okay.
Um, all right. Well, anyways, today we're going to be — Can I just say though, you have to get married and have kids before you install.
Otherwise, it's GG, boys. Your line,
your family line is finished. Okay, your
ancestors are disappointed. They hunted
and gathered for nothing. You ended the
line just because of your operating
system. Wait till you have kids before
you start Arch. Thanks.
>> That's just the PSA for today.
>> That's a good PSA. Honestly, I think
more people need to hear that. Um, all
right. So, uh, today we're going to be talking about RAM prices, get some thoughts on where we think it's going, along with probably SSDs and some other things. Are there any other unknown costs coming? And then after that, I'm going to probably ask Luke a little bit about running Linus Tech Tips, what it's like hiring people, all the good stuff. Uh, what does he kind of experience in today's age of AI and such, especially in the YouTube scene? Uh, just because it's a much different set of technology, and how you develop stuff, and kind of the software you run. Uh, very
curious about all that stuff. So, let's
start off obviously with the main flagship topic, which is RAM prices. I
don't know about you, but the moment I heard that AI loves RAM, the first thing I did was go out and buy MU, uh, which is a stock — Micron, which exclusively does, uh, RAM. And guess what? It was the best investment I have ever made in my entire lifetime. And so, there must be something there.
If the stock market says there's
something there, there must be a there
there. And so,
>> number go up.
>> The number does go up. Now, there's only
one person in this call that I know for
a fact can not only build the PC, but
also knows how to order a GPU from the
internet, which is an impossible task
these days. I have no idea how to order
a GPU. So, I figured that we'd bring
Luke on and he could uh kind of give us
some of his thoughts about this whole RAM price debacle and where he thinks it's kind of going, and maybe inform us, uh, uneducated non-GPU buyers about the hardware market. The permanent underclass. Permanent underclass right here.
>> Sure. Yeah. Um, I'm not surprised by certain moves, like, uh, Micron's Crucial. So, Crucial is owned by Micron. Uh, Crucial stepped out of the RAM market. That was not too surprising. I think they've honestly wanted to do that for a long time. If you look at how, honestly, somewhat specifically DDR5 was even designed, it was not designed from a consumer standpoint. It was designed from an enterprise standpoint. The goal has
been enterprise for a long time. This is
mostly a convenient and high profit exit
from the consumer space for Crucial. Um,
and I don't personally suspect that
prices are going to come down from
supply being super high for a pretty
considerable amount of time. It takes a
really long time to get fabs online. Um
and they are also going to resist the
fall of the price because they want it
to be high. Some of the, like, most collusion in effectively any market anywhere has been in RAM. Uh, these companies love working together to price fix. They love working together to, um, keep supply not too high, to not increase fab capacity too much. That's another thing they do to, um — I don't know if you can say artificially or not. There you go. Nice. Uh
>> I'm listening. I'm listening.
>> This is super well documented if you're
if you're interested in it. Um it
there's been a lot of legal suits over
it. Um it's it's not even a secret at
this point, let alone an open secret. So
you can it's Yeah, you can dive into it
if you're interested.
>> Commish, you got to listen to me.
>> Oh no. Now one more comment from you.
I'm done rebasing your mistakes. You're
on junior CSS duty until further notice.
>> Commish, you can't do this to me.
>> Keep talking and you'll be doing store
procedures for a month now. Get out of
here. Take him with you.
>> Fun fact, CSS is actually Turing complete.
>> Larry, Gary, Tango, Mary, I'm just
pulling your request. It looks good to
me. You're clear to ship. Thanks. You're
welcome. Next.
It's an awfully big PR
>> for an intern.
>> Well, I just bumped some dependencies.
It's nothing major.
>> Hey, can I get a quick stamp on this?
>> Yeah, don't worry about it.
>> Quick approval. Oh, not on my watch. I'm
on your disc like a pee on slur. Not
this again. It's literally just a hex
code change. Just to prove it. Just to
prove it. Squish. Perf resisting review.
Oh, I know you're the tipler. I've seen
vibe coding, but that AIN'T IT.
>> MERGE COP. NO,
>> I hate merge cop. He always makes
reviewing take forever. We have Code
Rabbit. Oh, come on. I wasn't even merging to prod. It was a hex code change. We have Code Rabbit. We don't need real people reviewing such simple changes. Code Rabbit can do it for us. Our engineers' time is better spent solving problems for customers. You can try it too at coderabbit.ai.
>> Next week on Merge Cop.
>> Now my plan to merge a dip so big you're
the dipler and I always knew it.
>> They have also worked together to yeah
not increase fab capacity. Now right now
there is enough demand that yeah they're
looking into fabs. There are partnerships from at least one brand to work with a much smaller fab company to try to increase their capacity and throughput. There's one happening in Taiwan. Um,
but it's going to take a while for us to
really see the benefits of that. I think
the thing that might happen first, and I could be wrong here, is a slowing down of investment in, um, data centers, and the need for RAM. Um, I don't see that coming anytime
soon. I don't personally necessarily see
a like massive market crash that some
people are predicting for AI happening
super soon. uh due to a lot of the
companies that are really driving this
forward being hugely profitable
regardless of AI. Um
but yeah, I guess that's my like my
quick thoughts. Quick question. You said
that DDR5 was designed more for
enterprise, less for consumer. I have no
idea what that could possibly mean. Can
you say a little bit more about that for
me? Like what does it have like parental
controls or what? Like
>> I also just to be fair I think DDR
stands for Dance Dance Revolution. So
like that's how much I know about RAM.
So I'm like I'm very far
>> number five. Sick. I thought they'd have
more editions out by now. But that makes
sense.
>> It's a very popular game.
>> It is it is disappointing that they
don't have more Dance Dance Revolution
editions. Um, no, there's a really good video from — sorry, um — from Wendell from Level1Techs. If you check out his YouTube channel and scroll down a bit, because it's not very new — where is it? Hold on. Uh, it's not that old either. Um, it's called "Your DDR5 memory could be at risk." All about DDR5. Um, 20 minutes
long deep diving what the heck is going
on with DDR5. I think it's interesting
that he has that video about issues that
we're finding especially in the consumer
space but definitely on the enterprise
space, with, uh, DDR5. And also, um, Linus Torvalds was on our channel not that long ago, and he was mentioning that he thinks a lot of the problems that users have with Windows is actually users with bad RAM. Um, which was very interesting. And these two discussions, in my opinion, merge together, because one of the points that comes from it being designed enterprise first is it's not really designed for what most of us have in our, like, desktop chassis. Um, which is
honestly not a ton of air flow,
especially compared to a like server
environment where you don't care about
uh fan noise, you don't care about um
much to be honest. You just want the
performance and ideally low power draw,
but that's often a uh an afterthought.
Um, so a lot of desktop DDR5 is like
overheating or having various other
problems and Wendell and some of the
level one tech forum crew and whatnot
have designed these like I don't know if
you were into desktops back in the like
DDR3 Corsair Dominator era where they
had those metal fan brackets where you'd
have those little tiny fans that would
go over your RAM. Um, well those might
actually like matter now. Um, and
they're the the community is making 3D
printed uh shrouds for their RAM so that
they can then mount like small I think
it's 80 mm, but it's been a while since
I've looked into this. Um, little tiny
fans and point them directly at the RAM
up close um to get more air flow on
there, then they're getting um less
errors, less problems. Um because if
if it's doing error correction like on
the actual stick before it gets to the
CPU, all that type of stuff, and it's
overheating, that's not going to perform
as well, and you might have more issues
is my like fairly not amazing
understanding of what's going on in
there.
>> So, hold on. I just want to rewind that
for a second. Mhm.
>> You were telling me that some of the
Windows issues I have not
>> Before you say that though, it's the
stand up. Can you say we're going to
circle back on that? That's a little bit
more work appropriate. We don't rewind
here.
>> Please, I don't want to boil the ocean
right now. Okay.
>> Okay. Thank you.
>> But okay, because my, uh, my ability to maximize windows decided to no longer work anymore. And so that's my life as a Windows user right now: the ability to only have windows one size. Uh, so that's RAM.
>> Yeah, that's all that's all RAM. That's
all RAM.
>> What?
>> Probably not RAM.
>> Yeah, this is crazy. I thought again, I
thought I was a good programmer. Wind
>> is just bad. Okay.
>> Okay. Okay, fair.
>> There's only so much we can do.
>> Yeah.
>> All right.
>> I do like that though. I like Microsoft coming out and saying, "Actually, what you guys need is little fans on your RAM. That's going to fix it. Sort it out yourself."
>> Yeah. No, I think it's more like
crashes, blue screens, um application
crashes, application errors. If you look
into stuff in event log and there's like
things going wrong, uh, it might be related to that. But it shouldn't be, like, a random function not working, like being able to full screen windows.
>> That, obviously.
>> Have you asked Cortana and or Copilot to
full screen it? It might work that way
instead. I actually haven't, but I am
very curious about doing it. I did have
>> Please record it. Please record it. I
need to see I need to see the results.
>> Yesterday I actually did take my first AI swing at, like, an application. And, uh, I noticed that Gemini has been added to all Google Chromes and, uh, Windows. So I was like, "Oh my gosh, uh
I can't log into Frame.io and so I want
to delete this site's cookies." So I
clicked on Gemini and said, "Delete this
site's cookies." And it said, "I can't
do that." And that was like my that's
been my only experience thus far,
>> which is very disappointing.
>> I just wanted the cookies gone. So then
I had to go and ask Gemini on uh on
what's it called on Google. Truly,
>> it is a futuristic AI then though
because it's like it is denying your
requests. That's what the end game is
for AI, right? So they they achieved it.
>> They achieved it. Yeah. Lore accurate
for AI. So
>> that's funny. Uh, so we are, like — uh, we're year one, effectively, into the great ramming. Uh,
>> I'm not
>> That's a Oh my god, that's a great
title.
>> Thank you. I just made that up right
now. But anyways, the great ramming is
happening right now.
>> How long do you think this can go before
RAM prices level out? Because I was
looking at it and MU and Western
Digital, which is SSDs, they are up 300%
in their stock price. So like people are
obviously pricing in what they think
this you know this value is going to be
and it it has not looked like it's
slowing down at all on the stock market.
Is this reflective of what RAM prices
will be? Are we going to see a 3x cost?
>> Aren't they already pretty much at that?
>> As per usual, this is not financial
advice, but
>> is the stock market predictive of
anything these days? Um, I mean, you you
had a good guess with with investing in
MU early, which is why you own the
office building you're sitting in right
now.
>> Thank you. Thank you.
>> But but it's like
>> he's leasing. Okay, Luke, don't don't
let him get out of
>> smart with my money. Okay, I lease
>> commercial real estate's terrible.
>> Yeah. Yeah. Um,
yeah. I don't know. I I mean, I think I
think the demand and price is going to
be high for a long time. I think that's
one of the main things that I meant with
my like little intro thing there. um is
that I I don't think it's just going to
randomly come down. Um I think for the
last few years we've been seeing a
general all computer hardware shift
towards enterprise and when you do that
and and consumer becomes more and more
of an afterthought, uh, consumer gear is going to be more expensive for the price-to-performance you're getting. Uh, we
saw Intel recently talk about this where
they're like Intel just openly was like
you know what we've been giving the
consumer market like too much attention.
We're going to focus more on
hyperscalers. Uh we're we're not
interested in that anymore. You're
seeing crucial step out of the market.
You're seeing Nvidia, like, almost begrudgingly still talk about GeForce cards. These companies are seeing the bag that is, um, working with enterprise. It
was actually I mean even for me like
being in the hardware space
predominantly it was it was fascinating
not that long ago to find out that like
a lot of these water cooling companies
that I thought of as these like scrappy
little consumer water cooling brands
sold the vast majority of their revenue
of products through enterprise doing
water cooling for servers and data
centers. Um, it's it's an interesting
world and there's a lot more money in it
than, like, um, you know, us waiting for deals on freaking Newegg or whatever. Uh, it's, yeah, a totally different thing when you're willing to buy, like, just some incredible amount of money worth of — I don't know, some of these reservations. Even during coin mining, some of these massive operations were buying pallets and pallets and pallets of GPUs, and if they had issues with them, kind of, like, almost shrugging it off. Um, whereas consumers, like, you
know, if you have an issue with it,
there's a there's a Reddit thread with
7,000 upvotes and you're screaming from
the mountains and all this kind of
stuff. Like, why why deal with us
annoying low people who lack money uh
when you can deal with the the
hyperscaler
boys with all of the money and and less
of the problems. Yeah. You know, it's funny you should say that, because if you have looked at OpenAI's kind of ouroboros investing — uh, Nvidia has promised they're going to pay them a progressive amount, up to one hundred billion dollars, and apparently in one of their agreements — I was really confused trying to find out the exact number, and the only number I found was they're going to pay them back 10 gigawatts worth of purchases. And I was like, I don't know what that number means, cuz typically I use USD. I'm not used to whatever this AI money currency is. But a gigawatt — apparently, again, this is ChatGPT, I asked, like, what the hell's a gigawatt — and it said one nuclear power plant worth of energy. So I was just like, oh my gosh,
that's how much they're going to be
buying is 10 nuclear power plants and
then I was like, did they make this deal elsewhere? And then apparently with Oracle, their 300 billion will also result in 6 gigawatts of data center, or six nuclear plants. And I'm like, 16 nuclear power plants — first off, that sounds like a metal band. Second off, the United States has 39 total. Like, that's like 50% of the entire power grid worth of new nuclear power plants. And I'm probably not even saying 'nuclear' correct.
>> So
>> it's nuclear.
>> Nuclear.
>> It's nuclear.
>> Nuclear power. Yeah.
>> Nuclear.
>> Nuclear power.
>> Nucleular.
>> Nuclear.
>> There's a lot of
>> nuclear. Yeah.
>> I don't know if you want me to inject
some random technical stuff in here or
not. Yes, please. Certainly can. Okay.
So, I don't study any of this stuff.
Like, hardware is definitely not my
monkeys and not my zoo. I just look at it and go, like, look, the worse the hardware is, the better it is for people like me who like to talk about programming performance, because it just means you have to be better, right? So,
I'm fine with it — like, great. If you have to start programming for a 10-year-old laptop, I'm happy about that. But
>> yeah, actually a typical data center, my
understanding is that you're in the like
100 to 200 megawatt range for the data
center. Like like a typical modern data
center, that's what you would be looking at for the total number of megawatts. Whereas AI data centers are like 10x that for power consumption. So when they're building those, they're looking at things like a gigawatt or, you know, multiple — like, up to over one gigawatt worth of inflow of power to this data center, right? That's how much it consumes. So when they talk about gigawatts, they're literally talking about, like, okay, 10 gigawatts might be five data centers or something like that. Five new data center buildouts worth, or something like this, right? If I'm just, like, ballparking those numbers based on what I've seen. Uh, so when you
think about that when they're
guaranteeing that kind of purchase, I
mean that's an ungodly number of GPUs,
right? You know, think think literally
five data centers or something or more
worth of these racks upon racks of
Nvidia Rubin like you know slotted
things or whatever you know they're on
at that point.
And so that's just — I mean, I don't know, we'd have to go break out a calculator to even try to figure out what kind of an outlay that is for Nvidia, but it's massive, right? Like, it's a massive amount.
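If we do break out the calculator: a quick sketch of the back-of-the-envelope math above. These are the ballpark figures from the conversation (100-200 MW typical, AI buildouts roughly 10x that), not official specs.

```python
# Ballpark figures from the discussion above; illustrative, not official specs.
TYPICAL_DC_MW = 150              # a typical modern data center: ~100-200 MW
AI_DC_MW = 10 * TYPICAL_DC_MW    # AI buildouts described as roughly 10x that

committed_gw = 10                # the 10-gigawatt purchase figure discussed
committed_mw = committed_gw * 1_000

# How many buildouts is that, roughly?
ai_campuses = committed_mw / AI_DC_MW            # ~6.7 AI-scale data centers
traditional_dcs = committed_mw / TYPICAL_DC_MW   # ~67 traditional data centers

print(round(ai_campuses, 1), round(traditional_dcs))  # 6.7 67
```

So the "10 gigawatts might be five data centers" ballpark above lands in the same range: a handful of AI-scale campuses, or dozens of traditional data centers' worth of power.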
Um tying that back though to the thing
we're actually talking about which is
memory. My understanding was that OpenAI
actually signed some kind of nutso deal
where they were going to buy up to
900,000
wafers a month of DDR memory. What's a wafer, and how much memory's on a wafer?
>> Uh, that's a Okay, so again,
>> not my monkeys, not my zoo. I'm doing my
best.
>> It's classified prime. It's classified.
>> And
it is — if you actually want to know the yield per wafer, like, they have the number of, like, bits per wafer. These are things you pay analyst firms to get, and they are constantly moving around because people are trying to increase their yield and so on.
>> A wafer is a piece of silicon. It's
circular. You've seen them.
>> Is that the circular one that we talk
about?
>> Yep.
>> So they're just buying circles.
>> Yep.
>> They're buying the output of the
circles.
>> Yeah.
>> Okay. I was about to say I don't know
how to take a circle and turn it into a
stick. Like that's very difficult.
Casey, I've never seen a GPU that's
round. So, I don't know if I believe
this.
>> Well, there are some — uh, there are people who make wafer scale. They're kind of prototypes, but they make wafer-scale AI accelerators, and they are circular.
>> That's pretty cool.
>> Yeah. But that's just nutso stuff. I
mean, I don't
>> They're designed after Sam Altman's orb.
>> Yeah.
>> So, it's like it's circular.
>> You can only do it.
>> I think, technically, Prime, you're more right than you think you are. You were trying to make a joke, but I believe the OpenAI deal is literally for uncut wafers. They have been patterned, but they are uncut, and OpenAI is going to be, like, controlling where that gets shipped to and how it ends up getting stacked and packaged. The last time I read it, they were uncut patterned wafers, not cut and packaged, which is very unusual. I could be wrong, but that's my recollection.
>> No, it sounds like you're right. I
didn't actually realize.
>> I believe I am. Yeah. Yeah.
>> I'm thinking about Sam Altman as Scarface right now.
>> Yep.
>> Just uncut wafers all over his desk.
>> Um, so
>> Again, to give a really bad explanation, because I'm the wrong guy to give this explanation — you really want to get, like, you know, uh, Dev Patel.
>> What's the guy's name — Dylan Patel? Dylan Patel from, um, SemiAnalysis or something. You want to get him, he would know, right?
>> It depends. Yeah. Like we're we're more
on the consumer side of things. Like I
don't I don't get way into this too
often. I do know that like the the
difficult part is making the wafer. Um
like there's there's somebody online who
made their own stick of of computer
memory um using using chips they got
from you know somebody else. Um, so it's
not like it's not crazy surprising that
they bought just the wafers. I just
thought they would, you know, cut to the
chase and get them to actually hand them
over like functioning sticks, but um,
>> well, they're not sticks is the problem,
right? Because they are using HBM. So,
>> Oh, right. Yeah. I mean,
>> what happens? Yeah. So,
>> for us normies,
>> okay, so, uh,
>> this is again way out of my league, and I'd have to guess what the bandwidth is for.
>> Yeah. So, what it is —
>> Okay.
>> What it is is, like — if you think about normally how memory works, right —
it's on a stick and you basically have
these — the individual DRAM chips that have been fabricated are, like, in a line on the stick, and then the, like, you know, the connection is the little pins it slots in, and that's how it's talking to the CPU. So, if you think about it, you've got, essentially: the CPU is on a package, the DDR is on a package on a little thing, you've got those connections, they go through the PCB, right, your motherboard. And there's, like, some small number of connections connecting them, right? So there's, like, traces on the PCB that are going to drive over to those, you know, places you slot them in. So GDDR, right, which is the graphics DDR — not the kind you slot into your thing, right? So the kind that goes on a GPU
is a little bit different. That one is, like, welded right on — it's, like, soldered onto the board and directly connected, and closer. And the reason for that is it's basically the same kind of memory for all intents and purposes, as far as I know, but the signaling is much, much faster. So they drive the data rate up — the total bandwidth they drive up by increasing the speed at which it can transfer things back and forth. And so it has, like, basically higher quality physical signaling to get things back and forth. But otherwise, same basic idea as the kind you would slot into your motherboard. Otherwise, not a huge difference. Beyond that,
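The wider-versus-faster tradeoff being described here falls out of the standard peak-bandwidth formula: bandwidth = bus width x data rate. A minimal sketch; the example bus widths and transfer rates are typical published figures I'm assuming for illustration, not numbers from the conversation.

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak bandwidth = (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * data_rate_gt_s

# Slotted DDR5: one 64-bit channel at 6.4 GT/s -> modest width, modest rate.
print(bandwidth_gb_s(64, 6.4))    # 51.2 GB/s

# GDDR soldered next to a GPU: e.g. a 384-bit bus at 21 GT/s.
# Bandwidth comes from driving the signaling rate way up.
print(bandwidth_gb_s(384, 21.0))  # 1008.0 GB/s

# One HBM stack: ~1024-bit interface at 6.4 GT/s.
# Bandwidth comes from being enormously wide instead of fast.
print(bandwidth_gb_s(1024, 6.4))  # 819.2 GB/s
```

Same formula each time; GDDR buys bandwidth with speed, HBM (discussed next) buys it with width.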
>> some people were asking for visual
representation. And if you if you look
up if you if you wiki I don't know if I
can share my screen on this thing, but
uh if you wiki high bandwidth memory,
there is a very good photo if you scroll
down to the interface section um that
shows how it's like 3D stacked as well.
Um so you can
>> Well, we haven't gotten to HBM yet.
>> Okay. Totally different than both of these. Totally different than these two things.
>> So HBM is completely different from those two things.
>> Completely different.
>> It's the same sort of memory. Like it's
still the idea is still that there's a
capacitor and a transistor per cell of
memory. Like so the actual thing you're
fabbing is somewhat similar, but it's
very different in two very important
ways. One, like Luke just said, it's
stacked. And in order to stack it, it
needs to be uh manufactured with this
sort of different kind of uh
connectivity. It's got these things called TSVs, or through-silicon vias. They're, like, these connections that go through the stacks so that you can kind of have each stack talking to the next stack. They tunnel through, right? So it's wider, right? It's, like, the actual physical footprint for the same amount of memory is a little bit bigger, because it's got to have space for this. The normal DRAM doesn't have to have those, right? Those TSVs. So,
that's a thing. But the much bigger
thing, I mean, although that's that's uh
obviously a slight difference there, the
much bigger thing is the stacking and
packaging is just a way harder problem.
So, first you have to be able to stack
them and this reduces yield apparently
for I don't know the reasons why but
like
>> again can you explain yield?
>> I I forget just like a quick like
oneliner. Why do you not like
>> percentage of stuff coming out based on
what you put in?
>> Like just because some of it's bad. It
just comes out like broken. So they're
just like this wafer is broken, bro.
>> We're different levels quality.
>> Yeah.
>> Yeah. Computer chip small, hard to make. We mess up sometimes. Sometimes bad. That's
>> okay. Okay. Okay.
>> The answer is that's what I needed.
That's what I needed. Okay. Thank you.
>> The answer is I don't know. Normally
when we talk about yield, I understood
it fairly well from the old like what's
the yield on a wafer? Cuz you figure you
make a wafer, there are defects on the
wafer. So you you know, you imagine
you're patterning this thing and you've
got defects. some of the defects or
maybe could be, you know, some kind of
impurity or something went wrong or
maybe when you were patterning the light
just hit something or there was one
little speck of dust in there or
whatever. So, you have certain things
that didn't quite work as well as they
should on this wafer. And so, some
number of the things that you put on the
wafer aren't are just going to fail. And
so normally when you talk about yield,
you talk about like all right, if
there's n number of defects on a wafer,
we expect those to fall fairly randomly.
Sometimes they're they're distributed
differently like more towards the
outside than the center because of the
way the reticles work or all these other
sorts of things. Who knows? But so you
only get a certain number of those
chips, right? Only a certain number of
them will work or a certain number of
them will perform better than others for
certain reasons. All those sorts of
things. And all that goes into what you
call your yield, right? And bin. So bin
is right the process where you say like
some of these chips work better than
others for whatever reasons, run at
higher clock rates, things like that.
Others are like they literally don't
work. There's a defect on them. We can't
get it working at all. And yet others
are designed to like have certain things
constructively turned off. think GPU
where it's like okay there's this many
processing units on the GPU and they're
designed to have some of those fail and
be turned off right so it's like okay
this just has less cores now and it gets
slotted as a different thing blah blah
blah. That's the normal way that, like, I have heard yield talked about. For this, apparently, you have to, like, grind down the silicon and, like, stack it on top of each other and all that, and I guess there's a separate yield loss that
happens. It has nothing to do with
whether the original chips were working
or not, maybe. But then there's also the
fact that I got the sense when I tried
to look at this stuff cuz um I knew we
were doing this podcast. So, I was like,
"Let me go see like what what are they
talking about with some of these memory
things?" And I wasn't really able to
understand it because I wasn't sure if
what they were talking about was yield
loss from we stack these wafers on top
of each other and that process produces
problems or if what they were talking
about is we can't test the memory prior
to the stacking for some reason. Let's
say I'm making that up. I don't know.
And so when we stack eight things on top
of each other, we just multiplied our
total failure rate by eight, because if one of them's bad, they're all bad, right? Because the vias won't work, or who knows what. So I don't know which of those they're talking about, but what I've
seen put out for numbers is when you're
making one of these HBM modules, your
your yield rate is awful compared to
normal. They said three times as much
wafer space is required for the same
amount of HBM memory as for regular DDR5
or something like that. So the same
number of bits stored if you're storing
them in HBM, you required three times
the input wafer size and patterning to
get that out. And that's a huge yield hit. Like, that's massive, right? I mean, just think about that number. It means you're doing all the same work, but you end up with three times more chips if you were producing DDR5 than if you did HBM.
Make sense?
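The two yield effects described above can be sketched numerically. The per-die yield here is a made-up percentage purely for illustration; the only figure taken from the conversation is the roughly 3x wafer-area claim.

```python
# Hypothetical per-die yield, chosen only to illustrate the compounding effect.
per_die_yield = 0.95

# If an 8-high stack is only good when every die in it is good, the stack
# yield compounds multiplicatively (for small failure rates this is close
# to "multiplied our total failure rate by eight", as described above).
stack_yield = per_die_yield ** 8
print(round(stack_yield, 3))   # 0.663 -> roughly a third of stacks lost

# The ~3x wafer-area figure from the conversation, read as a cost ratio:
# the same number of bits in HBM consumes ~3x the patterned wafer area of DDR5.
ddr5_bits_per_wafer = 1.0                      # normalized
hbm_bits_per_wafer = ddr5_bits_per_wafer / 3
print(round(hbm_bits_per_wafer, 3))   # 0.333 -> a third of the bits per wafer
```

Both effects push in the same direction: each HBM bit ties up several times the fab capacity of a DDR5 bit, which is exactly why it has to cost more.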
>> Okay, that does make sense. So, that
sucks, right? That that is really really
bad. And it means that the price of
those things has to be much higher
because you're doing a ton more work for
each one. You're doing three times more
work to get the same amount of bit
storage. So if you want a gigabyte of HBM versus a gigabyte of just the random GDDR or DDR that we're sticking in the computer —
>> It's — right.
>> Sorry, just — okay, just to finish this ridiculous tangent on how it works.
>> So the thing about HBM — what I tried to get to at the beginning, before we went into that yield thing — is it doesn't go outside the chip. It's not connected to the chip in the sort of way that we think of, like, with GDDR, which is soldered on the PCB, or DRAM, which is slotted in. It's on the package. So the way that it works is — you literally, you know, think about — I don't know if you guys have ever seen, like, die shots of something like a modern AMD processor where there's chiplets. There's, like, little chiplets in there. So it's like you delid the processor and there's, like, two, you know, CPU dies and something on there, right?
That's how HBM works. It's on the
package. So, the CPU and the memory are
all together in one module. They're not
out. They don't they're not things you
plug in or put on a PCB. Does that make
sense? Is this uh like
>> this could be way off, but some of the
stuff with like Mac minis and all these
some of the new Mac stuff where they
have like the unified memory, which is
why a bunch of people are using them for
running local LLM stuff because they
have this higher bandwidth connection to
CPU that actually lets you run like
local models at reasonable speeds. Is is
this related at all? I don't know.
>> I'm just asking.
>> I am not sure. I want to say that there may be — I thought there was a, uh — I'm not sure. I don't want to say, because I'm not sure about that. I know that there are some, like, A-series, uh, chips, I think, that use on-chip memory, which is very different and super duper fast, way faster than even HBM. Um, but that's a different thing, which we could go into later, but it's not relevant to a DRAM shortage at all.
>> Yeah.
>> Yeah. They're expanding cache in a lot of ways, but like you said, that's not really related.
>> Yeah, it's a different type of — it's completely different. It's SRAM. It's not even the same kind of
>> Can I ask a dumb question?
>> Yes, please.
>> I have been asking dumb questions the
entire time because this stuff, like I
said, not my monkeys, not my zoo. And
you look into it and you're like, Jesus
Christ. Like it's just like this huge
morass of stuff where you're like, oh
god. Okay,
>> so
>> me the consumer, I don't just want a
wafer. I want it like nicely packaged
and then have like little LEDs on top of
it. And then I want NZXT or something
LEDs.
>> You're not an LED. Come on.
>> Dude, I have LEDs right now, baby, with
my with my little rings.
>> Oh, no.
>> Just like, you know, the whole nine
yards, right? And that's me as a
consumer. So, why in the world as a
producer would I ever sell to consumers
ever for any reason if Samuel Jippity
Alman is just like, "Give me the
circles. I don't even want like I don't
even want you to put LEDs on them. I
just want I just want the circles.
>> And
>> that's the that's that's the short. You
just described the shortage. Yes.
>> Yeah. That's what's happening, right?
You nailed it, Prime.
>> Good job.
>> Very done.
>> Episode's over. We can all go home. So,
we're like literally never
>> Crucial was the consumer, specifically the consumer side of Micron.
And they were just like, "Yeah, screw
it. Why do I want to deal with you guys?
I can just sell to the to the
hyperscalers. Um there is if you look at
the amount of companies that are
actually making wafers um versus the
amount of companies that are selling in
in you know the most consumer side of
things, the sticks of RAM. There's an
incredible amount of companies selling
sticks of RAM. Uh there are very very
very few actually making wafers. Um the
the hard part is the the making of the
wafers, not the making of the sticks. um
or in the in the HBM sense, uh it
continues to be hard the whole way
through. Um we're we're we're
specifically talking in this case about
about the sticks. You were talking about
putting LEDs. You're not putting LEDs on
HBM. Um so I'm I'm talking about sticks,
but
>> they're missing out.
>> They're missing out. Data center themed things, like, you know, primarily coolers that go on top of HBM or something like that. But that's probably what you might see. But
>> when Razer gets into the HBM market and
there we go.
>> I'm I'm imagining
>> when I get my ChatGPT subscription, there's
an additional tier and I get like a live
video feed into the data center and I
have like my rack and it has my cool
LEDs on it for an additional price. Like
they're missing out on these secondary
effects that they could be selling, you
know?
>> I like that. Yeah. So that means the,
you know, the AI future always shows
these data centers that are like glowing
with bright lights. So Sam's not going
for that style of LED future. He's going
for like the dark scary one.
>> Yeah. Yeah.
>> Okay. And we don't need light where
we're going. Okay. That's pretty
interesting. So does this is there is
there any computer part that's protected
from this AI revolution? Like is there
anything that's going to remain normal
priced or am I just screwed? Because I
have to buy a new computer here soon.
Um, I don't I should probably buy it
sooner than later. Is it It seems like
But also, is there like anything that's
going to be cheap or is it all Is it all
gone?
>> Well, I mean, one kind of sort of nice
thing is that the whole RAM situation is
sort of creating this nice bottleneck.
So it's unclear how much CPU prices would be affected long term, because TSMC... okay, so there is a way in which the production capacity for things like CPUs is implicated by this, but I don't fully know to what extent. So when you do these HBM modules, that stack that we were talking about: the
bottom of that stack is not memory. So
the things that are like up the whole
way, that's memory, but the bottom is
actually a logic die. So it's kind of
like one logic layer and then a bunch of
like memory cell layers. And that bottom
die supposedly is actually like advanced
process logic process. So, my
understanding is that TSMC actually is going to be fabbing a lot of, like, the base dies for HBM4 or something. They're actually going to be fabbing the bottom layer.
So, it's like TSMC fabs the bottom layer
of the stack. SK Hynix, Micron, and
Samsung, if their stuff ever works, will
be fabbing the other layers, and then
they get like packaged together and then
stuck on the, you know, chip, the wafer, the CoWoS packaging thing or whatever they're
going to be using at that point. And so
I suppose that could have some negative
consequences and that that adds more
stress to the TSMC side, but I don't
know like to what extent that competes
with things like, you know, the logic dies
for CPUs and stuff. So it seems like if
you're sitting around waiting for memory
all the time because you're memory
supply constrained then that would mean
that like the the fab processes that are
used for CPU and GPU dies that aren't
the parts that aren't the memory. That
part seems like it wouldn't necessarily
go up, right? Cuz they're they just have
extra capacity at that point because you
can't make new accelerators without the
memory and you need a lot of that memory
and they just don't have the fab capacity.
So that could be someone out there has
done that analysis but I again it's
probably something you have to pay for
the research.
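The yield tangent from earlier can be illustrated with a toy model: if every layer of a stack, plus the base logic die, has to come out good independently, the per-step yields multiply. A hedged sketch with made-up numbers, not real HBM yield data:

```python
# Toy compounded-yield model for a stacked part (illustrative numbers only).
def stack_yield(per_layer_yield: float, layers: int) -> float:
    """If each layer/bond step must succeed independently, overall yield is the product."""
    return per_layer_yield ** layers

# Hypothetical 95% per step across a 12-high memory stack plus a base logic die:
print(f"{stack_yield(0.95, 13):.1%}")  # roughly 51%
```

Even a high per-step yield erodes quickly as stacks get taller, which is part of why HBM stays expensive relative to planar DRAM.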
>> Yeah, there's a lot of analysts. I think
like theoretically pretty much
everything could be impacted to a
certain degree. Like if we're if we're
building all this stuff, basically
everything in a consumer computer, those
brands, a lot of them, not all of them,
uh are are also in the hyperscaler
space. But the amount of impact on like
a company that makes uh you know, server
chassis and desktop computer cases is
going to be effectively nothing because
you can scale that up no problem. Um,
and there's a lot of potential
manufacturing in that space. So, it's
just kind of whatever. Um, the amount of
things impacted is going to be like a a
computer fan, for example. Uh, they
might have lots of fans, but who cares?
You can make tons of fans all the time
basically everywhere. There's there's
there's not really going to be any
actual real impact there. Um, so it'll
it'll probably stay mostly with um, you
know, the things on or attached to or
very close to the board. Um, which is
mostly impact that we've already seen.
But I I do agree. I think it's
bottlenecked.
Casey, so are you saying since we have
to pay a lot of money for memory and our
CPUs are just going to be waiting all
the time, JavaScript is so back. Is that
like CPU doesn't even matter? We're just
waiting around.
We're just waiting around. It doesn't
even matter how long it takes anymore.
It's just chilling out in memory.
>> Or other way around. You'll only be able
to like new machines will only have 512
megabytes of memory cuz that's all
you'll be able to afford. Like each year
the memory will go down and down and
down until eventually you're running
like the Commodore 64 is like roughly
what you get. You 64K. That's all you
you know you plebs can afford.
>> I'm afraid normies are not allowed to use a graphical gigabyte. You don't get a gigabyte of memory. What are you talking about? What are you, you think you're, you know, some kind of, uh, the president or something? You don't get a gigabyte.
>> Maybe uh maybe Bill Gates's old fake
quote was actually right, but it wasn't
right because it's most of what we need.
It'll be right because it's most of what we can actually get.
>> Yeah. It's like that's what you get.
>> I like this.
>> 640K will be enough
>> for anybody because it is all you will
get. Yeah. Yeah.
>> All right. So, the final question,
>> own almost nothing and be happy.
All right. So, the followup question I
think is um is pretty interesting
because uh Google Stadia obviously they
they killed Google Stadia. Uh Netflix
actually went into this whole idea of
doing um games on the server and they're
they're doing it quite successfully. Uh
a lot of these places it kind of seems
like we're moving more and more to the
server. Uh does this mean that within
the next 5 years we could see something
along the lines where consumers are
getting so priced out that it's better
just to rent uh your workstation. It's
better just to rent your time on some
sort of server computer cuz I know
there's a lot of this being pushed with
like VS Code and GitHub and everything
where you can just you rent out your
development environment as opposed to
you actually owning your development
environment. You're just like I can
program anywhere now on any you know I
just need a terminal and boom you're up
and running. And so, um, is this like a
future that we're kind of forcing people
into being like, "Oh, you want a
computer, brother? That's $20,000. Like,
you you better just go rent one for
life."
>> I think there's some desire there from a
variety of companies. But I I think when
we're looking at um at least some of the
comms that I'm seeing from these like
RAM wafer companies and stuff like that
is that they they want to scale up
because of this, like they would like
both markets. Um I but I don't know. I
mean there's a there's a Twitch chatter
mentioning Chinese manufacturers coming to the memory market. Like
there we might also start seeing
competition like that. Um I wouldn't be
too surprised if the gap between what
the enterprise has available to them and
what the consumers have available to
them or at least what the consumers
reasonably have available to them
continues widening. um for for a while
there, unless you're looking at, like, what was the name of that CPU, Knights Landing or whatever, um, unless
you're looking at like the extreme end
of of server specific compute hardware
uh consumers were pretty interested in
that stuff for a long time and now we
have GPUs that are just mind-blowingly
expensive and CPUs that are
mind-blowingly expensive on the extreme
high end um that just in a lot of cases
while those things are relevant isn't
going to be super necessary on the
consumer side of things. Um, so
yeah, I I don't think that's necessarily
going to happen because
there will always be a section of the
market where someone could make money
selling to those people. Um, and I think
the the Chinese who are both getting
into CPU, not both, but uh all getting
into CPU, memory, GPU, all that type of
stuff, uh, they might come for that
market. Someone else might, not sure.
But yeah, I don't I don't think that's
going to happen.
CXMT is supposedly not far behind modern
memory standards like right now. Um, and
so they're kind of predicted to be able
to ship at least like DDR4.
>> Yes.
>> Style memory, right? Or something like
this. So they're they're not
>> Chinese memory manufacturers aren't like
way behind. And so it's very possible
that especially if you start talking
about well you know no one can afford
DDR5 so you know maybe you could ship a
cheaper thing uh or a less good thing
and that would be fine right
>> and maybe you're not able to buy computers that are as high-performance, relatively, as now or whatever, but maybe
Casey's stuff gets more popular and
programming goes more more uh you know
efficiency and performance and stuff
like that we start making Roller Coaster
Tycoon again.
>> Yeah.
>> Um
>> there we go. Yeah,
>> unless, really, consumer desktop is just, like, gone. Uh, or consumer compute, I should say.
>> I was going to say it's actually really
uh this this whole kind of crunch on
everything and and not being able to buy
the next greatest MacBook uh MacBook Pro
every single six months from the Silicon
Valley. Uh it is actually kind of a
positive thing because people might
actually start experiencing the uh the
crap they make on devices that aren't
the premium of the premium. followup
question, which is will when do you
think or do you think this is going to
hit phones soon? Because I kind of
realize like people buy phones fairly
frequently in America.
>> Mhm. Like at what point does Verizon
say, "Yeah, sorry. We're not giving you
a phone with a 2-year contract anymore.
It's just like too damn expensive now."
Like is there is there a world where
this is coming soon? Because I assume
all those parts are equally susceptible
to whatever is happening. The rammoning.
Yeah,
>> it would make sense that it would. Um, I
haven't really seen it happen yet. Uh,
you you've seen it a lot of other
worlds. I don't know if that's because
like they have contracts far enough out
that it hasn't really impacted their
their available stock at this time. Um,
so I I don't know when that would
happen, but it it Yeah, it would
absolutely make sense to me that it
would.
There's also like supposedly and this is
kind of I I don't know to what extent
they're talking about this but I guess
the AI buildout doesn't only affect DRAM
it also affects like NAND flash storage
and things like this like apparently
like you know these machines that
they're prepping I don't know what a
typical AI data center machine looks
like but it's not like oh it just has
high bandwidth memory, it also has regular DDR5, and it also has NAND flash, like SSDs. So they're taking capacity for all
of those things. And I think the memory
side is what we talk about probably most
because that HBM stuff has such the bad
yield stuff and all that. So probably
like that hit first. But the other sort of storage technologies
are also just facing limited supply for
the same reason. Like AI just wants to
buy a lot of it and there weren't enough
people. there wasn't enough slack like
to pick that up. So, I would imagine
that phones will have all of these same
problems. Like they'll in a year or two
when their pre-, you know, when their allotments have run out, uh, they will be
like, "Oh crap." Like the flash storage
is more expensive. The memory is more
expensive. About the only thing that
won't get more expensive probably is
that SRAM part, that's on-chip memory, like in a Bionic, like in a, um, Apple A series chip or whatever, right? Um but
they, you know, that's just part of
their memory architecture. They still have
the other parts, so they're still going
to have problems with that.
>> Yeah.
>> Western Digital, uh, which does a lot of, I assume, that NAND flash, they're
also up just an absurd amount in the
last 6 months, they're up 300 plus
percent. Uh, just today they're up over
10%. Like they're obvious they're
actually they actually have risen faster
than Micron as far as
>> That was because of the podcast, by the
way.
>> Huh.
>> That's because of the podcast, by the
way.
>> We got the motion. Uh, all right. Well,
I mean, I this could be a good time to
kind of jump off some of the hardware
stuff, unless if Luke, you have anything
else you want to talk about because I I
I do have some questions about YouTube
and tech and all that kind of stuff for
you as well.
>> Can I make two stupid jokes before we go
to the next one? Because I had some, but
I just didn't get a chance to put them
in. People were saying smart things.
When we were talking about yield, I
really wanted to ask Luke what his
thoughts were about roundabouts.
>> That's pretty good. Cuz you have to
>> Oh, round. I like roundabouts.
>> Okay, cool. Nice.
>> That didn't That didn't go as well as I
thought.
>> You don't have to yield. You don't have
to
>> yield. You have to yield at them instead
of stop. That's
>> I mean, if you're just confident enough,
you don't
>> No, I like I love roundabouts.
Personally, that's a great point.
Self-driving cars, baby. They just all
They just do the thing. It's fine.
>> In my experience, people stop for me.
>> It's done.
>> That's just cuz you put the LTT thing on
the side and it's coming through.
Yeah. Yeah.
>> I wasn't going to be able to focus for
the rest of the episode. Okay. It was
just in my head the roundabout joke
mostly. So, you had to be there.
>> Yeah.
>> Yeah.
Had to be there. Well, looks like we all
got to be there. That was very
fantastic. Thanks.
>> Brian, ask your next question. That'd be
sweet.
>> All right. All right. we should shift a
little bit of gears and talk more about
kind of uh LT cuz you you guys obviously
have uh some developers on board and all
that and you're doing I assume the tech
and the things that you're building is
much different experience than say
something like Netflix where they've
just been working on one product and
ancillary products for 20 years or
Google, 20, right? Like all these companies have been just working on the thing they're doing. Well, Google kills a bunch of projects too, but imagine a company that doesn't kill a bunch of things; they've been doing one thing and that's it. Uh, even though AI's
had a uh there's a lot of negative
impacts to things. How has it impacted
kind of this hiring what work looks
like? Do you guys feel like you're doing
more things? Like the positives, the
negatives? I'm I'm curious.
>> I don't feel like it's changed like
I don't know. Hiring has gotten really
weird. Um hiring has gotten extremely
strange. Uh, I mean, being like as
public as we are, I guess, um, the
hiring has always been a little bit odd.
Uh, I've I've always been fairly open to
hiring remote devs. Um, so even like I
don't know, six years ago, putting up a
position for remote dev, we'd get 2,000
applicants. Um, and and a lot of them
are, you know, um,
people who are not prepared and are just
like, I want to put an application. and
I'll mop your floors. I don't care. Blah
blah blah blah.
>> It's like, okay, but I actually just
need a an experienced developer. I'm
sorry. Like, I don't know. Um,
>> the camera pans around and there's like
13 people mopping this one tiny piece of
floor. It's like, guys, we don't need
anymore.
>> Come on, write code. Come on. Uh, so you
can sift through a lot of them pretty
fast but like it's it has really taken
like I think especially over the last
year um the influx of what are like
obviously just garbage like crapped out
AI written um AI submitted applications
has gotten crazy, uh, and then sifting
through that it was already difficult
because we had so many applicants for
certain certain positions um if if it's
like hyper local and specialized it's we
don't get that many. Um Canada doesn't
have all that many people. Uh and then
this area is quite expensive to live in.
So it's it's a little bit, you know, if
it's if it's a local position, it's not
that much. If it's global, it's it's a
ton. Um but yeah, sipping through them
trying to find the stuff that's actually
real has gotten more difficult. And the
temptation to like counter sort
basically by okay, well, you're spamming
me with AI applications. Let me sort
through it with AI is high. Um, but then
the error rate there would suck if you
lose someone who's really good because
they got AI sifted out. Um, that would
be very unfortunate. So, there's some
tension there. Um, we have a a coding
test that we do for incoming applicants
of of every type of developer that we've
hired, which is a decent range. Uh,
we've had to make different ones. uh but
they're made in-house and we've ran that
coding test in a bunch of different
models to try to see what that output
looks like. Um and we try to
kind of detect how hard someone leaned
on it. I don't personally care like if
you look at the coding test and you get
AI assistance and you make the best
result out of everyone.
>> I think if you made the best result out of
everyone. um it doesn't really matter to
me too much, but there is a little bit
of sussing that I want to do to figure
out like did did you just take its
output and just slap it in and not even
uh, in Teej's situation, review it
properly? Um did you just did you just
throw it through? Um, and if that's
true, then you know, I'm not that
interested because the even if it did
even if it did one shot super
effectively. Um, I'm concerned about
your ability to maintain things and keep
them good over time. Um, so yeah, hiring
is definitely changed. It was difficult
and now it's way harder.
So I I've been reliably informed by Sam
Alman that uh the problems of AI will in
fact be solved by AI. And so you're
telling me that you you aren't solving
your AI problems with AI?
>> No. Yeah, not so much. Not right now. Um
you know, we'll see if that happens
eventually, but I kind of doubt it.
There's there's there like there's the
basically inherent error rate and for
certain things that's uh frustrating to
deal with. All right. So, I actually do
want to ask a real question here, which
is that Okay, so I think
>> Wait, what were all Wait, what were all
the other questions?
>> Fake questions. All those were answers to get you riled up and go on a graphic 10-minute rant.
>> Oh god.
>> All right,
>> get wrecked, by the way. Okay, so the I
think the obvious elephant now in the
room, and I'm sure a lot of people would
appreciate this in the audience, which
is okay, I am stuck in effectively elo
hell, right? like you're playing League
of Legends except for it's with your
career and you're a junior and you're
trying to figure out how to get out of
ELO hell because everybody is just dog
piling in and quitting mid-game. So, how
do I stand out at least in your process
without giving away too many? Because
then the AI will be like, "Got it." But
like, how would you make yourself stand
out in a sense that when reviewing you
go, "Oh, this is actually really good."
Um, the application I guess is how I'll
take this instead of the the code.
>> I assume the application is the
beginning of ELO hell, right? That
that's the bronze level right there.
>> Yeah, that's how I took that.
>> Would would it be possible just for
those of us who don't who aren't that
familiar with it? Also, what exactly
like what kind of people are you trying
to hire? Exactly. These are programming.
These are primarily programmers and
programmers for
what kind of stuff?
>> A, to most people, fairly surprising variety of things. So, so we have
multiple websites. Um, so we have
floatplane.com, which is the first thing that we made, which is, uh, like Patreon but video-first, um, you know, a creator support platform, uh, where
people can upload videos have their own
channel and get direct funding from
fans. We also have lttlabs.com
which is where a lot of our
I don't even know if I can say a lot.
Some of our testing data from our local
uh testing lab gets put up on there and
you can do comparison runs and and all
these other types of things which is
kind of neat. Uh then we have somebody
in chat pointed out the LTT store. That
one's a lot more basic because it's on
top of Shopify. It's not just like our
own thing. Um but then we have tied that
in with like a system that we have which
is currently called merch messages but
it might be renamed where if we are on a
live stream and you are buying something
on the store as you're checking out you
can leave a a message and that message
will go to the stream to be go to a
stream dashboard. So, it can be curated
and then selected and then um the
producer can just respond to them in
line and then they'll show up in like
the the lower third banner at the bottom
>> or they can curate them and then Linus
or myself or whoever is hosting the
stream um can respond to them directly
through like voice. So, we have that tie
in. We have a couple other things like
that. Flip plane and the LT store link
together for like exclusive sales and
blah blah blah blah blah. Um that's most
of our web development. We also have
internal tools development um which
ranges through a bunch of things. We've
done a bunch of uh you know
internal development on top of Snipe-IT for inventory stuff. We wrote a GUI for Whisper back in the day because our
editors didn't like doing command line
stuff. Um we helped automate things
around the office. Just little things
like that. And then the lab itself also
has a pile of developers um that work on
like data ingest uh both for handling
data from um well it's entirely for
handling data from like benchmarking and
testing of computer components but then
also making sure that like the website
receives the right stuff and not the
embargoed things and whatnot. Um, and
also making a variety of benchmarking
tools like Markbench is the the name of
one of the the tools that we have which
does a bunch of automations for testing
hardware. So like you you put a new
graphics card on a test bench. You run
Markbench. Markbench will kind of try to
verify that everything is set up
correctly and then automatically go
through and run all the benchmarks. It'll set up the game, make sure the
settings are correct, scroll through the
settings. um and take screenshots so
that if something is strange, if we're
looking at the results, we're like,
"This result seems really weird." We can
go back through and check, okay, was
G-Sync enabled on the on the computer?
It's a fairly common um you know, when
you're trying to test things um error
vector. Um is is this random setting in
the game wrong? Um like is this a game
where we had to manually select the
settings and and the person clicked
beside the drop down instead of on it?
Uh so it didn't actually select the
thing that they wanted. uh or whatever
else because the most frustrating thing
is like, okay, we have five days to like
fully test this suite of new graphics
cards. We're on day four, everything is
done. We notice this one result is super
weird.
>> Why?
>> Right?
>> Like, is it that one result is super
weird or is it all the other ones are
super weird and this one result is the
real one? So being able to go through
and figure out like dive through all the
information make sure everything's all
correct is super valuable. We also do um
some amount of like AI checking to
verify results. We only use that as an
early warning system. So that's running
as test results are coming in based on
our expectations. Um and that will try
to tell us if it thinks something is
wrong early. But that hasn't removed any
human checks, um, to verify all the data as well. Uh, so we have developers working on that too. So we have a fair bit of range of people that
we're hiring in the development space.
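That kind of automated early-warning check on incoming benchmark results doesn't have to be fancy; conceptually it can be as simple as flagging runs that deviate from the rest of the batch. A minimal sketch with hypothetical thresholds and data, not LTT's actual labs or Markbench pipeline:

```python
# Minimal sketch of an "early warning" check on incoming benchmark runs.
# Thresholds and numbers are hypothetical; real pipelines compare against
# expectations for the specific hardware under test.
from statistics import mean, stdev

def flag_outliers(results: list[float], z_threshold: float = 3.0) -> list[int]:
    """Return indices of runs that deviate suspiciously from the batch mean."""
    if len(results) < 3:
        return []  # not enough runs to judge
    mu, sigma = mean(results), stdev(results)
    if sigma == 0:
        return []  # all runs identical, nothing to flag
    return [i for i, r in enumerate(results) if abs(r - mu) / sigma > z_threshold]

fps = [142.1, 140.8, 143.0, 141.5, 98.4]  # one run looks wrong
print(flag_outliers(fps, z_threshold=1.5))  # [4]
```

A flag here only prompts a human to go back and check things like G-Sync or in-game settings; as described above, it doesn't replace the manual verification pass.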
Um and you asked uh what stands out for
me. It's the same thing personally. It's
the same thing that has kind of always
stood out. Uh my focal point keeps
moving. Sorry about that. Um I changed
my camera setup last night. Very smart.
Um but it it it it's been portfolio
stuff for me always. Like my my
first major specifically developer hire
was very very largely off of portfolio.
They had no post-secondary experience at
all. They had no work experience, but
their portfolio was just insane. And I
gave them a way too difficult project.
We've we've drawn this back a lot since
back then, but we gave them a way too
difficult take-home project and they
just nailed it. My Oh man, I am evil for
this. Uh, but my first like take-home
project was make a game um and and
submit it. And I had I had some more
details than that, but it was brutal. It
was very brutal. And he made a um
>> it was actually really fun going through
those submissions. It was very fun. But
it was it was if I remember correctly,
it was it was I attempted to tone it
down so it wasn't like way too
>> AAA game 1 million in sales.
>> No back.
>> It was supposed to be web based. I think it was supposed to be small. Oh man,
>> roughly GTA,
buddy.
>> I know.
>> Yeah. Speaking of wanted levels in GTA,
you really didn't try hard enough.
>> Yeah. Well, it's I I don't remember the
exact details, but it was web based and
there was like some some stuff that
tried to keep it relatively relaxed, but
my god, that was way too crazy of a
scope. So that Yeah, that never happened
again. But he made a uh competitive
asteroids game. Um, and he set up a a
little bot that set off an alarm in his
house when I joined the lobby. Um, so I
I like loaded the website and joined the
lobby and it like went off and he ran to
his computer and then like fought
against me and there was chat in the
sidebar. So we like talked about the
game and I was just like this is insane.
This has to be the the guy. This is
crazy. Um, so
>> the alarm that is so brilliant of him to
actually create a a physical alarm. Like
that alone is like okay yeah you're
hired. That was a great That's a great
>> The whole thing is just nuts. I'm like I
already knew from his portfolio that I
was like very sold on this person and
then that happened. I was just like, "Oh
my god,
>> you already knew." And then you said,
"Go make a game." I mean, look.
>> I was, you know, I have changed. I've reformed. I'm aware of this. Did you
>> Is he about to get a new lore drop? When
this thing comes out, it's going to be
really
>> Yeah.
>> Do you So, strictly out of curiosity, do you pay well? Can we poach this guy? Sounds pretty good.
>> I'm not I'm not going to disclose who it
was. We
>> We're not Yeah,
>> we try to do what we can. We're not
super competitive with It's It's tough,
man. The YouTube space is very different
than like Silicon Valley. If uh if
people want to like
>> really really push and be in like hyper
competitive environments, there's more
money out there. Um
>> Yeah, I was joking.
>> Yeah. Yeah.
>> It's just I've been joking, but also
seriously, what's his name? But also, if
you happen to tell me who it was, I
might pass.
>> But also, I have a game.
>> I might pass him along to some people.
I'm not saying
>> he's still on our team. I'm happy about
that.
>> Nice. All right. So, uh, a question I
really am actually very curious about,
which is I I don't know how to explain
this. I sent TJ a frantic message last
night and I've just been feeling it for
months on end about this, which is I
feel a change coming in the specifically
more in the tech YouTube space, but I
don't know what that change is, but I
know it's happening.
>> Wait, what?
>> Well, mean
the means in which people uh consume
content, what their expectations of what
content should look like,
>> what like all those things coming in. I
just feel like there's a change coming,
but I just don't know what it is more in
the tech space. just because there's a
big influx of people. Uh, you know, I
think there's
>> who knows what's going to happen. And so
I'm I'm actually very curious on your
thoughts and really I don't know how
much of the tea you can spill on LTT, but I do think the second T does stand for tea. But, like, what is your
guys' thoughts about this next year or
these next two years? Is it business as
usual or do you think that there is like
some pivots you may have to make or
changes you have to make to remain more
competitive?
>> Yeah, I can talk pretty openly about
that. Um, one thing question that I
would throw back at you real quick is
that I often hear the the the software
dev space referred to as tech YouTube.
Um, because of you guys talking about
being in tech um for for tech YouTube is
usually like hardware stuff. Um, so when
you're talking about the change you're
feeling in tech YouTube, are you talking
more on the like development side or the
like uh consumer electronics and and
server stuff and whatnot side?
>> I would say either or honestly because I
think I think there is a just a general
uh going to be something is going to
happen here in this space that's just
going to kind of change maybe how people
consume information. I'm not really sure
yet. Yeah, I mean I can talk about some
of the like history of of LTT and and
the the change that we are still going
through uh which has been kind of from
the beginning. So when Linus and I
first started working together, we were
part of a computer shop up in Canada
called NCIX. Um we would show up to
like a filming day and in this in the
studio in a corner would just be a a
mountain of products. Like, the logistics guys would just throw them in the corner and they would literally, like, pile up sometimes up to near the top of the windows.
Like it was it was a lot and we could
just like old school Linusectives videos
and even some of the old school NCX tech
videos um were just like
okay, pull something out of the mountain, talk about it while taking it out of the box, end video, upload. Uh, and we've kind of retroactively defined that as, like, product-driven video. Back then, there was so much product coming out so often, you're getting a new major graphics card launch every nine months, something like that. Um, you could
really just lean on: the products are interesting, let's talk about the products, let's show you the products, and that is your content. Around
2015-ish, maybe, um, we started having these conversations internally about, like, we kind of have to make the story now. Um, that's when we started filming things like Scrapyard Wars. Scrapyard Wars was like 2013, I think, um, but we kind of saw the writing on the wall a little bit early. But Scrapyard Wars
is a series where you have a fixed
budget and you go and you try to buy
used hardware and you compete against
each other to who can have the better
system. And that was us trying to like,
okay, let's create a story, let's create
an interesting thing instead of the
interesting thing being the name and the
title that is the new product that's
coming. Um, and
>> was that shift driven by necessity or
was that just you guys saying we could expand, or were you saying, like, no, the products aren't that interesting anymore, or there's too many people doing this? Like, why did you decide to do that?
>> Yeah, it's a good question. It was definitely both. Um, so we were detecting, you know, a wider release cadence and kind of less interesting product releases, both of those things kind of happening at the same time. And
also, you know, this is our full-time
thing now. Originally, when we were part
of NCIX, this was... I was going to school. Uh, Linus had a job as a product manager, um, or a guy who buys product. I don't remember what his title was. Uh, that's a little different than a product manager.
>> Yeah. Yeah. Yeah.
>> At least at NCIX, you would buy, like, SSDs and sell them through the store or whatever.
>> Got it. Yeah.
So, this was our part-time thing. So,
there's only so much we could put into
it. And unboxing stuff was, you know, approachable. Now that we're doing this full-time, that was probably the main starter for something like Scrapyard Wars. It was like, "Oh, we have time. We could do this
interesting thing." And we always wanted
to promote uh you know, you can buy
stuff used. You don't have to buy the
super flashy things all the time. And
it's been a little bit frustrating for
us forever that, like, you know, the 5090 video is going to get all of the views, and the card that people are actually buying is not going to get watched at all, because people want to see what we call Lamborghini content.
People want to see the big, expensive halo products. Um, and from there they'll go, like, well, this card is really interesting, therefore I want Nvidia, and then they'll go buy the thing that they can actually afford. Um,
so we wanted to find a way to make it
more entertaining
to see content that is about not just
blowing your wallet on everything. Um,
so that was the main reason for that.
But the reason why we really continued doing it, beyond the fact that it got a ton of views, was because there wasn't a lot of other content we could as easily make. Um, GPU timelines, for example.
>> Yeah,
>> I said nine months. That was around when
I first started doing content. Now it's
like two years. Um, so that has been a
sliding scale the whole time. Um, and
also, I find the releases are getting a lot less interesting. Um, like when the 8800 series GPUs came out, the performance jump that happened was mind-blowing. And now it's like, uh, well, you're paying 50% more dollars and you're getting 50% more performance. Hooray. Um
>> if even
>> Yeah, exactly. So it's not nearly as interesting. Um, so now we have to try to, yeah, create stories, do weird projects, do stuff like that. Um, and the competitiveness of it, maybe a bit. Um, Linus, like, notoriously just doesn't watch YouTube of almost any form. Um, so, like, seeing what other people are doing and then doing it is
>> not so much of a thing. Um, he'll read comments on videos, but doesn't really watch them, which has gotten him in trouble a couple times, but, uh, you know. Um, it's... I mean, I watch a lot of YouTube, but, um,
>> Yeah, I don't think, like, what other creators in the tech space are doing has really driven a lot of what we do. Um, there are plans
for the future. We want to resurrect
some channels that we have on hiatus.
Uh, we want to start making more videos
again. Um, but a lot of it's like, man,
if the products aren't there, we're
going to have to make something. Um,
that just is what it is. And if you're
trying to release, like if we want to
get back to say six videos a week, for
example, which is what we did for a
decade, um,
>> the vast majority of those are going to have to be not, uh, you know, new-product-review videos. Maybe product-focused, but not new-product-review type videos, because there just isn't enough interesting stuff.
>> Very good. Sorry, I was just thinking
about it. I was
>> like waiting for a response or did he
like the answer? Did he hate it? I don't
know. I can't tell.
>> Well, the mustache did not move. It
stayed completely stationary.
>> I thought maybe... I identified with, uh, Linus quite a bit, cuz I too notoriously have been known for not watching YouTube. Uh, true. It is a thing that
>> And I like watching YouTube. I can relate a lot to you, Luke. I can relate a lot.
>> Yeah, I know. TJ's my Luke. Uh,
>> there you go. Nice.
>> Also, the the important one in the
group, too. Uh, so uh
>> Okay, okay. That's interesting. Um, I was just curious about it, just because there's so much content now that's getting pumped out, and even, you know, so much GPT content getting put out right now that's getting millions of views, which is crazy, that people are just watching these... I mean, I guess, again, I don't watch YouTube, so what am I saying? Why am I even saying it's crazy, right? Uh, good for them.
They're getting all that kind of stuff.
But I wonder, like, how that changes people's perception, how they're going to interact with videos. Does it need to be a non-stop flow of factoids? How does it optimize? What is it going to be like? I don't know. I was just very curious about how
you guys are going to approach that,
especially since, uh, you're not tech YouTube. Obviously, you're hardware YouTube.
>> Tech.
You guys might define tech differently. But
>> I think we're tech YouTube. No, I mean, it's interesting, right? Like
YouTube, in my opinion, is trending
towards extremes. Uh, we've talked about this a little bit publicly, where we feel like YouTube channels like the Mr. Beasts of the world are
doing super well. The NFL channel on
YouTube, those types of things are doing
extremely well. And then smaller, more
specified niche channels are also doing
really well. And this does kind of make sense to me, because if you think about what viewers would go for, they're either going for the most, like, kind of brain-off, absorb-the-fun-thing content, or they want to learn about something and really, really dive into something. Um, and
we've traditionally kind of floated in
the middle, uh, which on YouTube was
pretty good for a long time, um, and is
getting a little bit less good now. So,
that's kind of informing some of our
choices. Like I mentioned, we're looking
into resurrecting some channels. Um,
>> and there is some planning going on of, like, trying to have channels that are more specialized. LTT has always been a generalist channel. Um,
>> Linus will review a phone, and then he'll make a 7 Gamers, 1 CPU computer. Uh, like, it's a huge range of stuff, right? And that's just
less rewarded in the current YouTube
ecosystem. We're also finding that
channel momentum is less important now
than it's like almost ever been. Uh,
which is a very weird space, very uh
uncertain space to be in. Um, where, like, back in the day, we would go to CES, the Consumer Electronics Show, and we'd release 30 videos, um, and they would all do a little bit worse than normal, because we're just flooding the channel, and the impact of that would be lower views for a while, and we'd see our view curve kind of go up over time as we recovered that base. And these days that
doesn't happen, which is good and also
really terrifying. It's good because if you release one bad video, it's not going to negatively affect your future videos as much, but it's
really bad because the predictability of how well your videos are going to perform goes down. Thus, your ability to, you know, um,
>> yeah,
>> have sponsors and all these other types of things be confident in your, like, product, which is your ability to release videos, um, is going to go down. It makes monetization less reliable. It makes big companies like ours a little bit more sketched out, because, you know, we have 120 mouths to feed. So, um, if the ad rev or whatever slips
because we have a few videos in a row
that don't do super well, that can
impact us a lot, which is again one of
those reasons why we want to get back to
more video releases because,
you know, if you release once every two
weeks and it really doesn't hit and it
performs really, really terribly, that
impact on you is going to be really
high. If you're releasing all the time,
it can hopefully come out in the wash.
Uh but it's difficult to release that
much content effectively. We have a big team of editors and a big team of writers. But, um, as we get
away from product-review-focused content, the content is harder to make.
Um you have to dive deeper into things.
You have to come up with better ideas.
You have to come up with ideas that take
a lot of work, etc. Um
so, yeah, we're trying to scale towards that, but it's a tough problem. YouTube's always been that way, though. The landscape is constantly changing, what the algorithm favors is constantly changing, and if you want
to survive, you have to kind of ride
that wave. So, it is what it is.
>> Great.
>> Yeah, it is interesting.
>> Yeah, I think, like, you just sort of see now that the momentum for a channel doesn't matter as much. Like,
sometimes you just put out a really good
video and it just does really well and
it's like... which is awesome and, like you're saying, terrifying, cuz you're like, okay, well, it's really hard to predict what next week's videos are going to do in terms of what's going to happen next. But it is fun, because it
does feel a lot of times like when you
know a video is going to be good it does
like, hit well, which is fun. Uh, which makes the game pretty fun to play. I don't know what got into Prime yesterday, though, I'll just say as well: you were just reading too many Anthropic blogs, bro.
>> Oh my gosh. I'm a big-time mirror, and so, uh, if I hang out with people that are really sad, I feel more sad, and if I hang out with people that are more happy, I become more happy. I like hanging out with TJ; he's always smiling, it makes me always smile too. It's also why I like hanging out with Twitch chat. They, uh,
>> they tend to be golden retrievers, and it makes me... Look at Luke's face right there. Oh my god.
>> And so, uh, it's very, very exciting.
I'm just a simple farmer, right? But, uh, I started reading Dario's blog,
>> and I'm just sitting there thinking, man, like,
>> Also, it's just like, I just got blackpilled so hard for, like, 30 minutes of my life there, and I was just like, dang, man, this Dario guy. I just now have a whole new opinion about everything. Uh, but
>> That's okay. I sent him a happy message afterwards, and he was right back.
>> We're so back. I just needed a little bit of an upgrade. Anyways, yeah, sorry. The AI stuff got me down hard last night, but now I'm back. We're so back.
>> To jump back to the spiky views topic for a second: this section of our channel I find very interesting. If you go to our channel, go to Videos, scroll down a little bit, and you see the "Steam Machine won't cost what you think" video from two months ago.
>> Yeah.
>> That got 2.8 million views.
>> The next video got 547,000. So, like, boom, it slammed down. And then the next one got 848,000. The next one got 5.3 million.
>> The next one got 1.6 million. The next one got 900,000. It's just like, whoa. It really depends on just how the audience
resonated with that video. Um, and there
is some informing of the algorithm for your channel, and what type of content you make, and who they might first seed it to, and all that type of stuff. Like, there is some influence there. But there's no momentum. Um, but, dang. Yeah, it's, uh, a very different world these days.
>> I think they picked the wrong year to release the Steam Machine. If they want to put any RAM in it, to tie it back to our original topic, it's going to have, like, 256K. It's going to be like, here's the Steam Machine, guys. It's
>> $500 with no memory. Bring your own.
>> I've really been hoping that they already have their allotment. That's... who knows. I have no idea, but I've really been hoping that's true, because I want it to do well. I suspect everyone in here is at least a little bit Linux-pilled. Um,
>> and it would be nice to have, like, a quite mainstream product like that. I
think Steam Deck has done like
immeasurable good for the, like, year of the Linux desktop movement.
>> Absolutely.
>> Yes. That produced PewDiePie, and PewDiePie produced a Linux video, and that
>> that's huge. That was massive.
>> Glorious SSH.
>> But, uh, yeah, I mean, I think Steam
Machine can only move that forward. So,
hopefully it's good. Hopefully, it's
affordable.
>> We'll see.
>> We're actually doing a conference this
year about the year of the Linux desktop. I can't say any more live, because there are some details still getting closed out, but
it's pretty exciting.
>> Cool. Nice.
>> Yeah. All right.
>> Right, Brian. We're going to... That's why I'm wearing this.
>> I don't want... Yeah, I don't want to say anything more. We're going to... You got to be quiet.
>> No, I'm hoping that you leak and then I
can uh I can yell at you.
>> You'll be like, "Gosh dang it. Why did
you say this explicit thing right here
that I'm now repeating? I'm not going to
do it."
>> Um All right. Well, Luke, thanks for
joining us on this. This was an awesome
standup. Learned a lot. So, if you want
to enjoy the whole episode, go to
Spotify for all the extras. Uh but the
main story, of course, is always on
YouTube. Thanks so much for joining us,
Luke. Thank you so much. Uh, is there
anywhere special people can find you?
>> Uh, the internet. Uh, no. Yeah, Linus
Tech Tips. Great point. Um, WAN Show in general,
>> I'm I'm around. Yeah.
>> All right. Awesome. And of course, Tee
and Casey, thank you very much for
joining us. Thank you everybody for
watching.
That is The Standup. I've never... We... I don't know. We don't really have, like, a sign-off line.
>> Yeah, I'll try and get better jokes next time. Sorry, chat, I missed a few this time. Hey, no blockers. Galactis. Love Galactis.
>> Byebye.
>> Errors on my screen, terminal coffee, and living the dream.
Luke from Linus Tech Tips joins the panel to discuss the current state of hardware markets, focusing on the rising costs of RAM and SSDs driven by the AI boom. The conversation covers technical aspects of High Bandwidth Memory (HBM), the shift from consumer-focused products to enterprise-grade technology, and how AI is impacting the developer hiring process. Finally, Luke shares insights into the changing landscape of tech YouTube, emphasizing the move from simple product reviews to story-driven content as hardware release cycles slow down.