
Does Meta even AI..?


Transcript


[0:00] Meta is about to spend $130 billion this year, mostly on data centers. And we know that Meta has been far from relevant in comparison to other frontier labs. Yet Meta is doubling down, planning to spend up to $600 billion by 2028, which is nearly 2% of a single year of US GDP. Currently, Meta is finishing up its $30 billion data center in Louisiana, which brings its count to close to 30 data centers in the US alone. So why would Meta keep pouring all this money into AI? Welcome to Kilobytes Code, where every second counts. Quick shout-out to Nvidia; more on them later.

[0:35] We don't necessarily think of Meta when we talk about AI, but Meta's contribution to AI actually goes back further than OpenAI and Anthropic: in 2013, Meta formed a research group called FAIR that released important libraries like PyTorch, the different versions of Llama from Llama 1 to Llama 4, as well as various work on GANs, and much more. As you can see, unlike what we currently think of Meta, it used to be at the forefront of AI research and innovation, until closed, proprietary labs like OpenAI and Anthropic started to dominate in performance. What's worse, Meta also fell behind in open models against Chinese frontier labs like DeepSeek, Alibaba, Moonshot, and MiniMax, and more recently Nvidia actually started to overtake Meta in open-model contributions according to Hugging Face, given its work on Nemotron, Cosmos, and Earth-2. So after a decade of contributing to AI research and innovation, Meta fell behind in both performance against proprietary labs in the US and research against open labs in China, and is now stuck in the middle with the worst of both worlds.

[1:46] Now, when we look at Meta's capital expenditures over time, we can see an inflection point right around the DeepSeek moment, where they started to increase their capex heavily. DeepSeek released their R1 model back in January 2025, which demonstrated strong capabilities to the world while keeping the cost down to less than $6 million in GPU run cost. Only three months after the DeepSeek moment, Meta finally released their next model, Llama 4, which basically flopped even though benchmarks showed otherwise. That April sparked a code-red moment for Meta to do something drastic.

[2:23] Around June 2025 was when Meta started to change their business model entirely. Mark Zuckerberg went on an acquisition spree, taking a majority ownership stake in Scale AI and poaching Alexandr Wang to form a new initiative called Superintelligence Labs. He followed up by making outsized offers to top AI talent to join his team. Since then, Meta's capex has not only shifted toward AI, but the projected capex over the next four years adds up to a staggering amount of spending. So much so that Meta recently announced a multi-billion-dollar deal with AMD, as well as a multi-year deal with Nvidia, to purchase CPUs, GPUs, and Ethernet switches to fill their growing data centers across the country.

[3:04] And speaking of GPUs from Nvidia: if you're interested in learning more about AI, Nvidia's GTC conference is coming up from March 16th to 19th. You can join the session "Deploying AI Agents at Enterprise Scale" online for free. I'll be attending GTC this year thanks to Nvidia's invitation to host me with a complimentary badge and stay. This year's GTC has a lot of interesting topics, from data centers to robotics, as well as all the sessions surrounding the Nemotron models, as you can see here. If you sign up for any session using the link to participate online, I'm doing a giveaway, courtesy of Nvidia, for an RTX 4080 Super GPU. Joining sessions online is completely free, so make sure to sign up for the giveaway and take a selfie to qualify. Link in the description.

[3:47] Now, despite Meta's strong cash position, people are still very doubtful of Meta's ability to actually execute on its AI initiatives. The closest hint of innovation we have is the Meta AI glasses, which use Llama 3.1, and their first proprietary model, called Avocado, supposed to drop in early 2026, which has supposedly already beaten other top open models when comparing base model against base model. But around mid-2025, as more and more compute became available, we started to see innovation speed up among frontier labs, with new checkpoints being released every month or two. If you factor in the other frontier labs, which all release on their own schedules, we practically have a new model release every other week. So in a way, Meta faces a dilemma: go the route of Apple and Microsoft and license LLMs from other labs, or compete directly by spending hundreds of billions of dollars on infrastructure.

[4:48] Meta does have a huge upside when we think about its distribution network across platforms like Facebook, WhatsApp, and Instagram, each with more than 2 billion active users at a given time. This kind of scale is something OpenAI and Anthropic just do not have. That means even if Meta is slightly behind in innovation, Meta's ability to convert innovation into revenue dollars is significantly faster and easier than OpenAI's or Anthropic's. This family of apps also gives Meta huge access to training data in the form of text, images, and videos that other frontier labs might not as easily get their hands on. And even if the underlying model isn't as strong, the growing compute demand from agentic use cases will put Meta's data centers across the country to work.

[5:38] But meanwhile, people are still highly skeptical of Meta given its track record with the Metaverse and Llama 4, both of which flopped. There are other reasons people are nervous about Meta, especially around how it is financing its data centers. Meta's recent data center deal involved off-balance-sheet maneuvering to secure financing for construction costs while preserving its credit, as if it had no direct exposure to $30 billion of debt. And this isn't necessarily because people think Meta can't make the payments per se; rather, it's the involvement of private credit firms like Blue Owl Capital which, unlike a cash-rich company such as Meta, could crumble, and that might trigger a huge domino effect through the rest of the market and the economy.

[6:20] What do you think? Does Meta have a good chance of actually working through all of these difficulties?

Interactive Summary

Meta is embarking on a massive AI investment, planning to spend $130 billion this year and up to $600 billion by 2028, primarily on data centers. Despite its current position, Meta historically led AI research through its FAIR group, developing key technologies like PyTorch and Llama. However, it has fallen behind proprietary labs like OpenAI and Anthropic, as well as open models from Chinese labs and Nvidia. The 'DeepSeek moment' of January 2025, followed by the flop of Llama 4 that April, triggered a 'code red,' leading Mark Zuckerberg to drastically shift strategy: taking a majority stake in Scale AI, forming Superintelligence Labs, and aggressively recruiting top AI talent. While skepticism persists due to past failures like the Metaverse and Llama 4, and concerns about its off-balance-sheet data center financing, Meta holds significant advantages. These include an unparalleled distribution network with over 2 billion users on each of Facebook, WhatsApp, and Instagram, enabling rapid monetization, and vast access to diverse training data. Meta now faces a critical decision: either license LLMs or compete directly with substantial infrastructure investments.
