AI Race: OpenAI vs Anthropic
So, this just happened, but we don't
really want to talk about it. Actually,
let's talk about it. The history between
Sam Altman and Dario Amodei goes all the way
back to 2016, when Dario left Google as
a researcher to join a new startup
called OpenAI. And before ChatGPT was
released, Dario left to start his own
company called Anthropic. The
competition between them started to
become a bit silly: after OpenAI
started running ads on their platform,
Anthropic followed up with a whole ad
saying that Anthropic will never run ads
on theirs. Sam Altman then wrote a
rather long post on X defending his
position, making it a bigger deal than
it should have been. And since this picture
is making all of us more uncomfortable,
let's move on to something more
substantial, which is really breaking
down the competition between OpenAI and
Anthropic. Welcome to Kale Brightes Code
where every second counts. Whether
you're on team OpenAI or team Anthropic,
chances are you have some financial
exposure to both of these companies.
If you look at the overlap of investors
in OpenAI and Anthropic, Mag 7
companies like Google, Amazon,
Microsoft, and Nvidia all have a stake in
either or both of these companies. And
even outside of the Mag 7 companies,
publicly traded banks have also
serviced their debt, which means having a
managed retirement account or even an
S&P 500 fund will likely give you some exposure
to what's happening between Anthropic
and OpenAI. But when we look at the
competition between OpenAI and
Anthropic, one of the major differences
comes down to data centers.
While Anthropic relies mostly on
partnerships with AWS and Google Cloud
for training and inference, OpenAI has
famously been trying to become vertically
integrated, including working with
Broadcom to make their own custom AI
chip. So, right off the bat, we're
looking at a significant difference in
business model between OpenAI and
Anthropic. According to Cushman &
Wakefield's 2025 data center report,
the construction cost of data centers
has been rising since 2020. Material
costs for power supply equipment and
cooling have risen more than 40%
across the board since 2021. And if you
factor in the lead time that goes into
actually acquiring the basic materials that
go into construction, the two main
takeaways here are: first, OpenAI's
partners like Crusoe and Oracle, which are
actually constructing these data centers,
can't fumble on this, since major delays
in construction directly affect
OpenAI's business plan. And second, just
how impressive xAI's construction of
their Colossus facility was, looking back,
when it took them only 122 days to turn an
old factory into an AI data center filled
with 100,000 H100s. But delays in data
center construction isn't the only thing
OpenAI needs to execute on; there's
also financing. Ever since the official
announcement of Stargate back in January
2025, people like Elon Musk have been
questioning how OpenAI will actually
secure $500 billion to finish their
10-gigawatt data center plan by 2029. So far,
OpenAI has raised about $164 billion to get
to where they are today, and they're
still only about 10% of the way as they
continue to expand their Texas facility.
Oracle is finishing up their Wisconsin
data center, Michigan is about to start
construction along with Ohio, and we're just
getting started on the Stargate plan.
All of this requires continual
funding, which means that even though OpenAI
has raised about $164 billion to date,
they likely need to secure another $400
billion by 2029 to see this entire
thing through. And getting this kind of
funding requires continued interest from
venture capital, banks, private
equity, and other companies. Which
means we need a CEO who can not only
create positive sentiment around AI, but
also build strategic alignment with cash-cow
companies like Amazon and Nvidia that
can help OpenAI invest in their dream.
But recently, OpenAI receiving
investment interest from Amazon led to
Nvidia actually walking back their
$100 billion commitment, because Nvidia
is likely only interested in investing
in OpenAI as long as OpenAI commits to
using Nvidia chips to fill most of
their data center needs. But OpenAI's
partnership with Amazon likely leads to
using Amazon's training chips, which
in effect reduces Nvidia's commitment
to OpenAI. As you can see, when we
look at the data center strategy
alone, unlike Anthropic, which
partners with companies to meet most of
their compute needs, OpenAI has their
hands full. Well, I guess not that full.
Now, things look very different when we
turn to Anthropic. Looking at
Anthropic's revenue sources, they've
positioned themselves as a market leader
in enterprise AI and coding. Ever since
their first Claude model release in
March 2023, which nobody really cared
about since ChatGPT was the buzz,
Anthropic has had a pretty impressive run-up
in revenue, driven largely by
their API availability for coding and
growing enterprise-level adoption in 2025.
And with the release of Claude Code and
wider AI adoption, Anthropic grew their
run-rate revenue to $14 billion. On the
other hand, OpenAI's 2025 estimate for
their expected ARR is $20 billion, which
is different from their actual 2025
revenue, likely closer to $14 billion.
But regardless, what OpenAI is saying is
that there seems to be a linear relationship
between compute, measured in gigawatts,
and annualized revenue. Looking at this,
it's hard not to think their implication
is that by the time they finish their
10-gigawatt facility, we could expect a
hundred billion dollars in revenue following
that linear projection, which is hard to
believe will hold without mixing in other
sources of revenue, like running ads.
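That back-of-the-envelope projection can be sketched in a few lines. The $10-billion-per-gigawatt slope below is a hypothetical figure, back-derived from the $20 billion ARR estimate against an assumed couple of gigawatts of compute; it is not a number either company has published.

```python
# Hypothetical sketch of the linear compute-to-revenue projection
# described above. All figures here are illustrative assumptions,
# not reported numbers.

def project_revenue_billions(gigawatts: float, slope_b_per_gw: float = 10.0) -> float:
    """Project annualized revenue (in $B) from deployed compute (in GW),
    assuming a strictly linear relationship with the given slope."""
    return gigawatts * slope_b_per_gw

# Assumed slope: ~$20B ARR on ~2 GW of compute -> $10B per GW.
# Extrapolating to the full 10 GW buildout:
print(project_revenue_billions(10))  # 100.0, i.e. ~$100B
```

The point of laying it out this way is to see how fragile the claim is: the entire $100 billion figure rests on the slope staying constant across a 5x scale-up.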
OpenAI's opt-in to run ads for free and
Go subscription users was rather
controversial. But what running ads
really shows about OpenAI, more than
Anthropic, isn't just their revenue
growth; it's that OpenAI quickly needs
to stop the bleeding from free users
they're essentially subsidizing eating
away at their margin. Because the more they
subsidize, the more they need to look
for financing for costs that aren't going
into their data center buildouts. This
is the kind of precarious situation
Dario commented on in a recent podcast,
where he said that even a slight
miscalculation of capex on data centers
could essentially sink the company, which
is why Anthropic is taking a more
measured stance in how they unfold
their business plan. So far, Anthropic's
most recent plan of $50 billion in data
centers across Texas and New York seems
rather small compared to OpenAI, which
wants to spend up to a trillion dollars
over the next 10 years. So the race
between Anthropic and OpenAI is not only
about how they source revenue from
users, but also how they source their
compute and financing. And for
both of these to work, they not only
need to keep proving their relevance
in model performance and benchmarks,
they also need to win the hearts of end
users, especially the early adopters and
enterprises that tend to spend the most
on their platforms through subscriptions
and APIs. And recently, a lot of early
adopters have grown more disgruntled
at Anthropic's crackdown on OAuth use
of their Max subscription plans, especially
as we look at the proliferation of OpenClaw
use cases, given that OpenAI quote-unquote
acquired the creator of OpenClaw.
This sentiment has only grown
stronger as the cost of
intelligence has been dropping, because
cheaper alternatives from Chinese models
are becoming more and more
viable. And whereas before, people easily
signed up for multiple subscriptions at
the same time, more options mean people
are seeing the value in being able to
hot-swap or load different models from
different providers. But cracking down
on high-paying users to only allow them
to use the model through the
subscription is starting to not play
well among users, especially given that
the alternative option is API pricing,
which is extremely unfavorable for
Anthropic, since their pricing is
much, much higher than their competitors'.
So, given what we covered, do you think
Anthropic's measured approach will win,
or do you think OpenAI's economies-of-scale
approach is actually the better
method? What do you think?