Inside the Closed-Door Deals Running OpenAI and Anthropic

Transcript

0:00

We have been watching the AI race for three full years now. At first, it was all about the opening moves: the models, the breakthroughs, the chips, GPT versus Claude, Gemini versus DeepSeek. But that part of the game is over. We're past the opening and entering the middle game. We're no longer competing on the models; we're competing on the infrastructure, on who controls the compute, who controls the clouds, and how the contracts being signed decide who gets to play. This year we saw two AI mega-deals: Microsoft and OpenAI, and Anthropic's web of partnerships. These are so much more than headlines. We're watching a formidable power play in a much larger match. Today is about uncovering what is really happening on the chessboard and who controls whom. Let's dive in.

1:03

On October 28th, OpenAI announced an extensive restructuring in which it officially granted Microsoft a 26% equity stake. This move underscores how the global AI race is evolving: from a competition of models to a contest of ecosystems, partnerships, and deals. At the core of this partnership is the fact that OpenAI's original structure became increasingly unsustainable. Their unit economics do not work. They cannot survive without extensive capital injection from the outside. They're losing roughly $2 on every dollar of revenue and burning through investments faster than they can monetize. I spoke about this at length in my earlier videos on the unit economics of ChatGPT; if you want to learn more, check them out.

In 2019, OpenAI created OpenAI LP, a limited partnership set up as a capped-profit subsidiary. What is OpenAI LP? It is a subsidiary created by OpenAI whose purpose was to balance fundraising with the nonprofit mission. Through the capped-profit structure of OpenAI LP, they could access the significant capital they needed to develop advanced AI models without abandoning the core mission to benefit humanity. This was the first step OpenAI took to balance commercial viability with ethics.

Fast forward to now: Microsoft holds approximately 30% of OpenAI, and their stake is valued at $135 billion, which makes it one of the largest technology partnerships in history. What this stake means is that Microsoft now has significant influence over OpenAI's operations and development. On top of this, Microsoft will be taking 20% of OpenAI's direct revenue. As a refresher, the largest chunk of OpenAI's revenue comes from ChatGPT, not API sales. The reason I'm emphasizing API sales is that the API is the heart of B2B adoption, and OpenAI is not doing very well on API sales, meaning they're predominantly a consumer business. So, back to Microsoft: they're going to be taking 20% of OpenAI's revenue through 2030, and this means they're going to have a huge influence over OpenAI's go-to-market and pricing decisions. But this goes both ways: Microsoft will also pay OpenAI around 20% of revenue from Azure OpenAI services and Bing AI features.

3:53

If you're building a company, you already know scaling means hiring. With every hire, you're looking at payroll setup, benefits enrollment, IT provisioning, access to 12 different tools, a corporate card, compliance paperwork. Onboarding one person takes your ops team about 6 hours, but you're hiring 15 this quarter. That's 90 hours of manual work. Most companies are running on 100-plus disconnected tools. Each one solves a problem, but together they create silos. Rippling, one of the fastest-growing HR, IT, and spend platforms out there, calls this SAD: software as a disservice. To combat this, Rippling built a unified platform across four systems that run your company: one source of truth for all employee data that allows you to manage HR, payroll, spend, and IT in one unified system. You can hire contractors in more than 185 countries and full-time employees in more than 80 countries and pay them in different currencies, which really matters because your next hire probably isn't local. A new hire gets their laptop, all their accounts, and the correct permissions based on their role, all automatically. Someone leaves? Everything gets deprovisioned instantly. Real-time visibility into company spending: no more waiting for expense reports, no more budget surprises. Rippling is trusted by more than 20,000 companies across a range of industries, including Cursor, Barry's, Chess.com, and Liquid Death. Every fragmented system in your stack is holding back execution. Rippling provides four core systems to manage your entire staff end to end: HR, payroll, IT, and spend management. All connected, all automated, helping you increase savings and make better decisions. And that's how you build a company that can actually move as fast as you scale. Check out stopsad.com to see if SAD is affecting your organization and what you can do about it.

5:44

Now, why was this partnership architected, and who controls whom? The real value of this partnership comes through several layers. The first is the revenue share: $865 million captured in the first 9 months of 2025. Look at that number. The fact that Microsoft captured this much revenue in just 9 months shows both the scale of the financial potential from a single foundational AI company and the inevitable dependency it creates. First, it shows that OpenAI is a massive revenue driver for Microsoft and a monetization engine for Azure, because Microsoft extracts a significant cut from every dollar OpenAI makes, regardless of OpenAI's own profitability. Second, this flow of hundreds of millions of dollars per quarter means that OpenAI's growth directly contributes to Microsoft's cloud business and its ability to justify further infrastructure expansion, which becomes their moat in AI. Third, the size and cadence of these payments prove how acute the pressure on OpenAI's unit economics is: their inference costs already exceed revenue, and now they're handing another 20% to Microsoft, which will make it extremely difficult for them to reach sustainable business economics. This revenue share illustrates how partnerships and deals in the AI economy are far less about profit sharing and far more about deep control over the economics and trajectories of foundational AI companies.
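To make that squeeze concrete, here is a minimal back-of-the-envelope sketch in Python using only the rough figures quoted in this video (about $2 of cost per $1 of revenue, 20% of OpenAI's revenue going to Microsoft, and roughly 20% of Azure OpenAI revenue flowing back). The function and the numbers are purely illustrative assumptions, not anyone's actual financials.

```python
# Illustrative only: rough figures from the video, not real financial statements.

def openai_net_position(revenue: float,
                        cost_per_revenue_dollar: float = 2.0,   # "losing $2 on every dollar of revenue"
                        share_to_microsoft: float = 0.20,       # 20% of direct revenue through 2030
                        azure_openai_revenue: float = 0.0,      # Microsoft-side Azure OpenAI / Bing AI revenue
                        share_from_microsoft: float = 0.20):    # ~20% of that flows back to OpenAI
    """Net cash position for a given slice of OpenAI revenue under the quoted terms."""
    costs = revenue * cost_per_revenue_dollar
    paid_to_microsoft = revenue * share_to_microsoft
    received_from_microsoft = azure_openai_revenue * share_from_microsoft
    return revenue - costs - paid_to_microsoft + received_from_microsoft

# Example: $1B of direct revenue plus $500M of Azure OpenAI revenue on Microsoft's side.
print(openai_net_position(1_000_000_000, azure_openai_revenue=500_000_000))
# -> about -1.1B: every incremental dollar of revenue deepens the hole unless costs fall
```

Under these assumed figures, the share flowing back from Microsoft softens the loss only slightly; the dominant term is the cost side, which is exactly why the compute contracts matter more than the headline equity numbers.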

7:39

The second layer is the guaranteed sale of infrastructure, because OpenAI has committed to purchasing $250 billion in Azure services. What this means is that OpenAI is not just a big client that Microsoft managed to land. Microsoft's whole business model is transforming: they used to sell software, that's what they've been known for, and they're now becoming a compute-infrastructure rental business. That's neither good nor bad; it's just fascinating to watch the AI race become a race for computing power. When OpenAI makes up 50% of Azure's revenue growth, Microsoft becomes dependent on a partner that is losing $2 on every dollar of revenue and burning through compute resources faster than it can monetize. Add the whole AGI narrative on top of this, all the effort and money being spent on reaching that magic entity of AGI (if you're curious about AGI, watch the previous video), and there is a lot of capital burning happening here. This creates a risk that cloud businesses do not typically face: Microsoft cannot easily replace OpenAI's demand. OpenAI is the largest AI client on the planet and the one with the biggest computing needs. Signing them is a double-edged sword. Yes, you get the largest AI client on the planet and you're selling more than you ever have, but you're also scaling your operations to cater to this one client, and if this client leaves, you're screwed. To find the next OpenAI-level customer, say Anthropic or DeepSeek, you would need years for them to scale to OpenAI's needs.

And now let's turn this around, because OpenAI is actively seeking diversification through CoreWeave and Oracle, and they're doing so to reduce their dependency on Microsoft. The reason they're looking to diversify is that this extreme financial strain makes it risky for OpenAI to depend solely on Microsoft. If Microsoft raises prices or changes the terms of this partnership, OpenAI's whole survival could be at risk, and this risk keeps growing. But the same risk goes for Microsoft. If OpenAI's usage drops, diversifies to other clouds, or the company completely collapses under its own financial structure, which, yes, is extremely unlikely but not impossible, Microsoft loses both the revenue stream and the growth narrative that supports its $4 trillion valuation.

10:23

Yep, don't forget that Microsoft's valuation went up because of this partnership as well. You may ask: why did Microsoft do it, given such a massive risk? They did it because they were catastrophically behind Google and Amazon in the AI race. At the time of the initial $1 billion investment in 2019, Microsoft held only 29% of the cloud market versus AWS's 37%, and they did not have any competitive AI research capability to match Google's DeepMind or Amazon's Alexa. The partnership with OpenAI delivered an instant 10-year leap: Microsoft got exclusive access to frontier models it couldn't build internally and a chance to embed AI into Microsoft 365 and Teams. By doing this, they created a distribution advantage that neither Google nor Amazon could replicate. And to top it off, they got the ability to position Azure as the only cloud with OpenAI API access. This was a gamble, and I applaud everyone who architected it, because it is paying off. Azure grew 40% annually through 2025 compared to AWS's 19%, and Microsoft's stock gained over $2 trillion in market cap since the partnership began. Microsoft now controls the enterprise AI distribution layer through Copilot, and as a B2B product manager who works at an enterprise, where the vast majority of enterprises are on Azure and the Microsoft stack, it is hard to overstate the reach Microsoft has into enterprise tech.

12:14

And finally, number four: the competitive moat, which is arguably the highest currency of all. Microsoft got exclusive API rights through Azure until AGI is verified, which means they're locking enterprise customers into Microsoft's cloud. Microsoft's exclusive API rights to OpenAI mean that any enterprise that wants to use OpenAI's models in production must route all traffic through Azure. This creates a dependency loop, an almost unbreakable one, where purchasing access to, say, GPT-5 for your business automatically makes you an Azure customer, regardless of whether you're on AWS, Google Cloud, or on-prem. This is beyond billing; this is an architectural lock-in, because when you purchase enterprise-level GPT access, you have no choice but to accept Azure's private networking, Azure's compliance, Azure's data residency rules, Azure's authentication, and Azure's pricing, because OpenAI's API literally cannot be run anywhere else until AGI is verified by an independent expert panel. On top of everything I just said, once an enterprise builds applications on the Azure OpenAI service, it naturally integrates with Azure Cognitive Search, Azure Functions, Azure Key Vault, and Azure role-based access control, and the switching costs become enormous, because you'd need to rewrite your entire application and rebuild your data pipelines. Vendor switching is already painful with the current combo of Microsoft 365, Copilot, and Teams; when it has to happen, companies hire entire teams of people just to manage the switch. I mean, I was recently switching from Apple to Google Workspace and I wanted to shoot myself in the head. Microsoft used OpenAI's virality to its advantage to force cloud migration toward Microsoft, so much so that there are plenty of companies that picked AWS as their cloud provider back in 2020 and are now running significant Azure workloads, because their engineering teams at some point said they wanted GPT, and the only compliant, enterprise-grade path to GPT requires full Azure adoption. This is why Azure captured 62% of GenAI use cases despite holding only 29% of cloud share. Microsoft isn't just betting on the infrastructure; they're betting on a perfectly legal exclusive distribution channel for the world's most in-demand AI models. And that exclusivity doesn't end until someone declares AGI. Which brings us back to AGI: Microsoft has every reason and every incentive to delay the declaration of AGI indefinitely.
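As a concrete illustration of what routing through Azure means at the code level, here is a minimal sketch with the openai Python SDK. The endpoint, deployment name, API version, and keys are placeholder assumptions; the point is only that the enterprise path authenticates against, and is billed by, an Azure resource rather than OpenAI directly.

```python
from openai import OpenAI, AzureOpenAI

# Direct developer path: requests go straight to OpenAI's own API.
direct = OpenAI(api_key="sk-...")  # placeholder key
direct.chat.completions.create(
    model="gpt-4o",  # model name used here purely for illustration
    messages=[{"role": "user", "content": "hello"}],
)

# Enterprise path described above: same SDK, but every request is routed to an
# Azure resource endpoint, authenticated by Azure, priced by Azure, and addressed
# by an Azure "deployment" name rather than a raw model id.
azure = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder Azure resource
    api_key="<azure-key>",                                   # or Azure AD token auth
    api_version="2024-02-01",                                # placeholder API version
)
azure.chat.completions.create(
    model="my-gpt4-deployment",  # Azure deployment name configured on the Azure side
    messages=[{"role": "user", "content": "hello"}],
)
```

Moving off Azure later means changing not just this client but everything wired around it, the networking, the Key Vault secrets, the role-based access control, which is exactly the switching cost described above.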

15:18

So that was the OpenAI-Microsoft situation. Now let's look at OpenAI's rival, Anthropic, and see what they've come up with. Anthropic went in a completely different but nevertheless brilliant direction. I personally have huge respect for Anthropic as a company and I prefer their models to any other LLM, and it's interesting how they're balancing the power game while trying to maintain independence. They secured investments and compute commitments from all three hyperscalers: Google Cloud, AWS, and Microsoft. Google became a major investor and cloud partner. Through Google, Anthropic got access to 1 million custom tensor processing units, or TPUs, coming online by 2026, plus 1 gigawatt of power. To put this in perspective, 1 gigawatt is comparable to the output of a large nuclear power plant, which means an enormous scale of AI compute capacity dedicated to powering Anthropic's models. As for AWS, they made an $8 billion total investment in Anthropic, becoming their lead financial backer. AWS's motive was, first of all, securing Anthropic as one of the largest clients for its cloud business, but also strengthening AWS's AI services with Anthropic's Claude. On top of this, the deal included a multi-billion-dollar cash infusion and commitments for AWS to supply 1 million Trainium 2 processors, Amazon's custom chip specifically designed for AI training, which in practice made AWS Anthropic's principal cloud provider.

17:02

And lastly, Microsoft and Nvidia. In November 2025, Microsoft and Nvidia announced a new partnership with Anthropic that involves several moves. Anthropic agreed to a $30 billion commitment to use Microsoft's Azure cloud for future compute needs, meaning Anthropic will spend at least $30 billion on Azure infrastructure over multiple years. This guarantees Azure a large, long-term stream of revenue: now they're not only OpenAI's cloud provider, they're also Anthropic's. Taken together, this means Anthropic is the only foundational model company available on all three of the world's most used cloud services. And as we keep rotating the board, Anthropic has committed to $50 billion in future compute spending, which means they're locking themselves into these providers for years to come.

18:05

There is a fundamental divergence between Anthropic's multicloud approach and the Microsoft-OpenAI partnership, and the delta is in the revenue model: who they sell to and how they make money. OpenAI has a consumer-first business model. 73% of OpenAI's revenue comes from consumer subscriptions, ChatGPT Plus and ChatGPT Pro, with 27% (some sources say 15%) from API and enterprise. They have 800 million weekly users, and they love the weekly-user metric, but only 5% of them pay. This consumer-oriented business model automatically implies dependency on one cloud provider, and not just any provider: it must be always-on, because serving 800 million weekly users means you cannot run a consumer-facing product at this scale on multiple clouds without catastrophic user-experience fragmentation. OpenAI has locked itself into complete Azure dependence.

19:05

Anthropic, on the other hand, is enterprise-oriented. Anthropic's revenue model is the inverted version of OpenAI's: around 85% of their revenue comes from enterprise API calls, with the rest from consumer subscriptions. Enterprise customers access Claude through AWS Bedrock, Google's Vertex AI, Azure AI Foundry, or the direct API. The point is that it doesn't matter which cloud runs the inference; Claude is everywhere.
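To illustrate that point, here is a minimal sketch with the anthropic Python SDK, which ships Bedrock and Vertex client variants alongside the direct client. The model IDs, region, and project are placeholder assumptions, and a real deployment would pull credentials from the respective cloud's auth machinery rather than hard-coded keys.

```python
import anthropic

prompt = [{"role": "user", "content": "hello"}]

# Direct API: Anthropic's own endpoint and billing.
direct = anthropic.Anthropic(api_key="sk-ant-...")  # placeholder key
direct.messages.create(model="claude-sonnet-4-20250514", max_tokens=256, messages=prompt)

# The same model family served out of AWS Bedrock: credentials and billing are AWS's.
bedrock = anthropic.AnthropicBedrock(aws_region="us-east-1")  # uses standard AWS credentials
bedrock.messages.create(
    model="anthropic.claude-sonnet-4-20250514-v1:0",  # placeholder Bedrock model id
    max_tokens=256,
    messages=prompt,
)

# And out of Google Cloud's Vertex AI: credentials and billing are Google's.
vertex = anthropic.AnthropicVertex(region="us-east5", project_id="my-gcp-project")  # placeholders
vertex.messages.create(
    model="claude-sonnet-4@20250514",  # placeholder Vertex model id
    max_tokens=256,
    messages=prompt,
)
```

The application code barely changes across providers, which is what makes a supplier-style, multicloud relationship workable for Anthropic in a way it is not for a single always-on consumer product.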

19:36

why the multicloud model is viable for

19:39

anthropic enterprise workloads are batch

19:41

oriented, latency tolerant and already

19:44

distributed across clouds based on a

19:46

company's IT infrastructure. It's a lot

19:48

easier for a business or an enterprise

19:51

to use claude than chajbt. And as Ben

19:53

Thompson observed, anthropic's lack of a

19:56

strong consumer play means that it is

19:58

much more tenable, if not downright

20:00

attractive, for them to have a supplier

20:02

type of relationship with AWS. Only

20:05

Anthropic can pull this off right now.

20:07

OpenAI cannot replicate this model, because their consumer success is both their blessing and their curse. Now, if you paid attention to these numbers, the question that may have formed in your brain is: it seems like Anthropic is doing better than OpenAI. Anthropic's multicloud partnerships, despite the complexity and the operational overhead, do deliver superior unit economics, because the multicloud model optimizes costs. OpenAI, by contrast, is serving 800 million weekly users, and that volume requires a unified infrastructure that cannot tolerate delays from cross-cloud routing, which forces them to accept Azure's premium pricing even though their inference costs run at roughly 200% of revenue. This is why Anthropic's multicloud setup is its key to a path toward profit margin.

21:01

The real war, the real race in AI, is control over the infrastructure. The big players are no longer competing for the best models. Microsoft and AWS do not have frontier models of their own, and while Google is investing quite a bit into Gemini, even they are competing to own the compute layer where all models must run. Anthropic's committed compute spending across all three hyperscalers is the only strategy that keeps them independent, and even that locks Anthropic into infrastructure dependencies for the next decade. We're entering the middle game now, where all the players are making their moves and fighting over who owns the board. The models are just the pieces. The board, the infrastructure, decides who is going to win. As always, we hope this was helpful. Till next time. Bye.

Summary

The AI race has shifted from competing on models and breakthroughs to controlling infrastructure, compute, and cloud partnerships. OpenAI's partnership with Microsoft highlights a deep dependency due to OpenAI's unsustainable unit economics and consumer-first business model, leading to Microsoft's significant influence and architectural lock-in for enterprise clients. In contrast, Anthropic has adopted a multi-cloud strategy, securing investments and compute commitments from Google Cloud, AWS, and Microsoft, viable due to its enterprise-oriented API revenue model. This allows Anthropic to optimize costs and maintain better unit economics, emphasizing that the true competition in AI is now over who owns the underlying compute infrastructure.
