Does Meta even AI..?
Meta is about to spend $130 billion this
year, mostly towards data centers. And
we know that Meta has been far from
relevant in comparison to other frontier
labs. And yet, Meta is doubling down to
spend up to $600 billion through 2028, which is nearly 2% of a single year of US GDP. And currently, Meta is
finishing up their $30 billion data
center in Louisiana. That brings up
their number closer to 30 data centers
in the US alone. So, why would Meta
continue to dump all this money towards
AI? Welcome to Kilobytes Code, where
every second counts. Quick shout out to
Nvidia. More on them later. We don't
necessarily think about Meta when we
talk about AI, but Meta's contribution
to AI actually goes further back than
OpenAI and Anthropic: in 2013, Meta formed a research group called FAIR that released important libraries like PyTorch, every version of Llama from Llama 1 through Llama 4, as well as different variants of GANs, and much more. As you
can see, unlike what we currently think
about Meta right now, Meta used to be at
the forefront of AI research and
innovation until closed proprietary labs
like OpenAI and Anthropic started to
dominate in performance. And what's
worse is that Meta also fell behind in
open models against Chinese frontier
labs like DeepSeek, Alibaba, Moonshot, and MiniMax. And more recently, Nvidia has started to overtake Meta in open-model contributions according to Hugging Face, given its work on Nemotron, Cosmos, and Earth-2. So
after a decade of contributing to AI research and innovation, Meta fell behind in both performance against proprietary labs in the US and research against open labs in China, leaving it stuck in the middle with the worst of both worlds. Now, when we look at Meta's
capital expenditures over time, we can
see an inflection point right around the DeepSeek moment, when they started to increase their capex heavily. DeepSeek released their R1 model back in January
2025 which proved to the world strong
capabilities while keeping the cost down
to a claimed $6 million in GPU training cost. And only three months after the DeepSeek moment, Meta finally released their next
Llama 4 model, which basically flopped
even though benchmarks showed otherwise.
And that April sparked a code red moment
for Meta to do something drastic. Around
June 2025 was when Meta started to
change their business model entirely.
Mark Zuckerberg went on a spree of
acquisitions, taking a 49% stake in Scale AI and poaching Alexandr Wang to lead a new initiative called Meta Superintelligence Labs. And he
followed up making insane offers to top
AI talents to join his team. And since
then, Meta's capex has not only
increased towards AI, but the projected
capex over the next four years will add
up to an insane number in spending. So
much so that recently Meta announced a
multi-billion-dollar deal with AMD, as well as a multi-year deal with Nvidia, to purchase CPUs, GPUs, and Ethernet
switches to fill their growing data
centers across the country. And speaking
of GPUs from Nvidia, if you're
interested to learn more about AI,
Nvidia's GTC conference is coming up
from March 16th to 19th. You can join
the session deploying AI agents at
Enterprise Scale online for free. I'll
be attending the GTC this year thanks to
Nvidia's invitation to host me with a
complimentary badge and stay. And this
year's GTC has a lot of interesting
topics to go through from data centers
to robotics as well as all the sessions
surrounding Nemotron models as you can
see here. And if you sign up for any
session using the link to participate
online, I'm doing a giveaway courtesy of
Nvidia for an RTX 4080 Super GPU. And
joining sessions online is completely
free. So make sure to sign up for the
giveaway and take a selfie to qualify.
Link in the description. Now, despite
Meta's strong cash position, people are
still very doubtful towards Meta's
ability to actually execute on their AI
initiatives. The closest hint that we
have of innovation is in the Meta AI glasses, which use Llama 3.1. And their first
proprietary model that's supposed to
drop in early 2026 called Avocado, which
supposedly already beat other top open
models, comparing base model against
base model. But around mid-2025, as more and more compute became available, we started to see innovation speed up among frontier labs,
where new checkpoints were being
released every month or every 2 months.
So if you factor in other Frontier Labs
that all release on their own schedule,
we practically have a new model release
every other week. So, in a way, Meta faces a dilemma: go the route of Apple and Microsoft, licensing LLMs from other labs, or compete directly by spending hundreds of billions of dollars
in infrastructure. Meta does have a huge
upside when we think about their
distribution network on platforms like
Facebook, WhatsApp, and Instagram where
each has more than 2 billion active
users at a given time. This kind of
scale is something that OpenAI and Anthropic simply do not have. Which means even if Meta is slightly behind on innovation, Meta's ability to convert that innovation into revenue is significantly faster and easier than OpenAI's or Anthropic's. This family of apps also gives Meta
huge access to training data that exists
in the form of texts, images, and videos
that other Frontier labs might not as
easily get their hands on. And even if
the underlying model isn't as strong,
the growing compute demand from agentic
use cases will require Meta to use their
data centers across the country. But
meanwhile, people are still highly
skeptical towards Meta given their track
record of Metaverse and Llama 4 that
both flopped. There are other reasons
why people are still nervous about Meta,
especially around how they are financing
their data centers. Meta's recent data center deal involved off-balance-sheet maneuvering to secure financing to cover construction costs while preserving their credit rating, as if they didn't have direct exposure to $30 billion of debt.
And this isn't necessarily because people think Meta can't make the payments per se, but rather it's the involvement of private credit firms like Blue Owl Capital, which, unlike a cash-rich company like Meta, could crumble, potentially triggering a huge domino effect across the rest of the market and the economy. What
do you think? Does Meta have a good chance of actually working through all of its difficulties?