Azure Update 3rd April 2026
Hey everyone, welcome to this week's
Azure update. It's the 3rd of April. As
always, we have the chapters so you can
jump to any particular update you care
about the most.
New videos this week. So obviously you
should go to the movies and see Project
Hail Mary. It was phenomenal. The book
was great, the movie was great. But
that's not one of my videos. In my
videos, I dove into Work IQ.
So when we think about how we work, it's
not just the knowledge and the artifacts
in our documents, our chats, and our
meetings, but a personalization layer
that learns how artifacts relate to each
other, how people relate to them, how
people relate to people, how we talk,
how we function, our rhythm of business.
Then there are special inferencing
engines and skills around document
creation, deep knowledge retrieval, and
creating really good PowerPoints. That's
all part of Work IQ. Obviously the
Microsoft Copilots are grounded on it,
Co-work is grounded on it, but you can
also ground your own agents on it.
And then, what is Copilot Co-work? I
don't often say this, but it's a really,
really cool demo. It's about 8 minutes
in if you want to skip ahead in the
video, but go and look at the demo. It
shows how the Co-work capability can
just be given the outcomes you want. I
can interrupt it. It runs as a cloud
agent, but it's using these long-running
deep reasoning models to go and do a
bunch of cool stuff, including creating
a web app, all from a single prompt. So
it's kind of cool.
And then I kind of finished a flow of
videos I was trying to create around AI
and the Microsoft AI ecosystem. So I
just created sample.ai. It's just my
YouTube content, about 15 or 16 videos,
but it's a very curated path if you're
trying to learn, very much the
Microsoft-focused way of leveraging AI.
All right, so onto what's new on the
compute side.
So the Azure Red Hat OpenShift offering,
that's the jointly created, managed, and
supported offering between Red Hat and
Microsoft, is now available in Indonesia
Central. It's really useful to have this
in new regions when I think about
needing to run a workload closer to the
customers, maybe from a latency
perspective, but it may also be about
meeting different regulatory
requirements.
And this is kind of an interesting one.
For virtual machines and virtual machine
scale sets, there's now a full caching
capability for ephemeral OS disks, in
preview. Remember, an ephemeral OS disk
is not a durable managed disk; it
creates the disk on local host
resources, the temp space or the cache
space. So ordinarily, the writes we make
to that ephemeral OS disk go to the
local storage of the node the VM runs
on, but the main OS image it's reading
from is still remote. What this new
feature does is cache the entire OS disk
to the local storage, which removes any
remote storage dependency. That
obviously increases the resiliency of
the running VM against remote storage
failures, but it also improves
performance with super, super low
latencies, because everything is just
local on that storage. The way it works
is the caching occurs in the background
once the VM boots, so obviously there's
a bit of time while it caches. And it's
just an enable-full-caching flag that I
set for the ephemeral OS disk.
And remember why we use ephemeral OS
disks: it's where there's nothing in the
OS disk state that we care about. It's a
tin soldier; if something's wrong with
it, we just rebuild another one in its
place. That's really common with virtual
machine scale sets, where we constantly
create and delete the virtual machines.
So I don't want to pay for a managed
disk, and I want better performance.
Ephemeral OS disks are fantastic for
that.
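As a rough mental model of what full caching does, here's a minimal sketch of a read-through cache. This is purely illustrative and not how Azure implements the feature: reads go remote until a block is cached locally, and once the background caching completes, every read is served from local storage.

```python
# Minimal sketch of a read-through cache, as a mental model for
# ephemeral OS disk full caching. Illustrative only -- NOT the
# actual Azure implementation.

class FullCachingDisk:
    def __init__(self, remote_blocks):
        self.remote = dict(remote_blocks)   # the remote OS image
        self.local = {}                     # local host storage (cache)

    def background_cache_all(self):
        # After VM boot, the platform copies the whole OS disk locally.
        self.local.update(self.remote)

    def read(self, block_id):
        # Once fully cached, reads never touch remote storage,
        # which removes the remote dependency and cuts latency.
        if block_id in self.local:
            return ("local", self.local[block_id])
        data = self.remote[block_id]        # remote read (slower)
        self.local[block_id] = data         # cache on first access
        return ("remote", data)

disk = FullCachingDisk({0: b"boot", 1: b"kernel"})
assert disk.read(0)[0] == "remote"   # first read goes remote
disk.background_cache_all()
assert disk.read(1)[0] == "local"    # fully cached: served locally
```

The key property the sketch shows is the last line: after the background caching finishes, the remote side could disappear entirely and reads still succeed.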
Networking.
So App Configuration now has an Azure
Front Door integration in preview.
Remember, App Configuration enables all
of those different configurations for
our applications to be centrally
managed and then delivered to client
applications. Well, Azure Front Door is
that global layer 7, anycast, split-TCP
delivery layer with caching. When App
Configuration integrates with it, it
enables the delivery of those
configurations to scale to millions of
clients, and I don't have to try and
develop my own proxy layer anymore.
Obviously this can work with single-page
apps (SPAs), mobile apps, and a whole
bunch of other scenarios. What happens
here is the App Configuration endpoint
is established on Azure Front Door, and
then the Azure App Configuration store
is set as its origin. You use a managed
identity to secure that communication,
and then Azure Front Door will simply
retrieve the selected key-values,
feature flags, whatever is required. It
can then cache them and respond with
them. So it avoids you having to worry
about secrets and other things, but
gives you a really scalable, secure
service.
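The caching role Front Door plays in front of the App Configuration origin can be sketched as a simple TTL cache. The key name and TTL below are hypothetical; in reality the service handles this for you, the point is just that one origin fetch can serve a huge number of clients.

```python
# Sketch of the edge-caching role Front Door plays in front of an
# App Configuration origin. Key names and TTLs are hypothetical.
import time

class EdgeConfigCache:
    def __init__(self, origin_fetch, ttl_seconds=30):
        self.origin_fetch = origin_fetch  # call to the App Config origin
        self.ttl = ttl_seconds
        self.cache = {}                   # key -> (value, fetched_at)
        self.origin_hits = 0

    def get(self, key):
        entry = self.cache.get(key)
        if entry and time.monotonic() - entry[1] < self.ttl:
            return entry[0]               # served from the edge cache
        self.origin_hits += 1
        value = self.origin_fetch(key)    # one origin call, many clients
        self.cache[key] = (value, time.monotonic())
        return value

store = {"feature/dark-mode": "true"}     # stand-in for the config store
cache = EdgeConfigCache(lambda k: store[k])
for _ in range(1000):                     # millions of clients in practice
    cache.get("feature/dark-mode")
assert cache.origin_hits == 1             # origin shielded by the cache
```

That final assertion is the whole value proposition: a thousand client reads, one origin request, and no proxy layer you have to build or secure yourself.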
On the storage side, premium SSD v2 is
GA in new regions. Remember, premium SSD
v2 is all about sub-millisecond latency,
but with separate dials for the IOPS,
the throughput, and the capacity, and I
can dynamically change the IOPS and
throughput while the disk is in use. We
now see these in South India and US Gov
Arizona. So when I think about scenarios
where I need high IOPS and low latency,
like databases, big data, analytics,
gaming, this is really useful. I would
use it in VMs, but I can also use it in
containers where I need some kind of
durable state.
The user delegation shared access
signature for tables, files, and queues
is now GA. That was already available
for blob, but now it's been brought to
the other storage services. The big deal
here is that a user delegation SAS is
more secure than the regular account or
service SAS, because both of those are
tied to the master storage account key,
whereas this is tied to an Entra ID
identity. The storage account key is
all-powerful; it can do anything. This
is a subset of the permissions of the
identity creating it, and can never be
more. And it's only valid for up to 7
days maximum. So there's a lot more
control; it's a better-secured method.
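The two constraints that make a user delegation SAS safer can be shown in a few lines: a capped lifetime, and permissions that can never exceed those of the Entra identity that issued it. The function and field names below are made up for the sketch; this is not the storage SDK.

```python
# Illustration of user delegation SAS constraints: lifetime capped at
# 7 days, and permissions limited to a subset of the issuing identity's.
# Function and field names are hypothetical, not the real SDK.
from datetime import datetime, timedelta, timezone

MAX_SAS_LIFETIME = timedelta(days=7)

def issue_user_delegation_sas(identity_permissions,
                              requested_permissions, expiry):
    now = datetime.now(timezone.utc)
    if expiry - now > MAX_SAS_LIFETIME:
        raise ValueError("user delegation SAS capped at 7 days")
    if not set(requested_permissions) <= set(identity_permissions):
        raise ValueError("SAS cannot exceed the identity's permissions")
    return {"perms": set(requested_permissions), "expiry": expiry}

now = datetime.now(timezone.utc)
# Identity can read and list; asking for read only is fine.
sas = issue_user_delegation_sas({"read", "list"}, {"read"},
                                now + timedelta(days=1))
assert sas["perms"] == {"read"}

# Asking for write when the identity can only read is rejected.
raised = False
try:
    issue_user_delegation_sas({"read"}, {"read", "write"},
                              now + timedelta(days=1))
except ValueError:
    raised = True
assert raised
```

Contrast that with an account-key SAS, where the key itself can do anything, so a leaked token is only bounded by whatever the token's own claims say.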
Azure Data Box can now import into Azure
Files provisioned v2 storage accounts.
Remember, the whole goal of Azure Data
Box is: hey, I need to move a massive
amount of data. Very often that's how
I'm migrating from on-prem to the cloud
without doing it over the network, but
it can work from the cloud back to
on-prem as well. So now, as part of a
big data migration, I can target a
provisioned v2 account. The big thing
about provisioned v2 was that, like
premium SSD v2 disks, the capacity, the
IOPS, and the throughput can all be set
separately. So now I can use those
higher-fidelity storage accounts as part
of Data Box.
Azure NetApp Files cool access has some
enhancements in preview. The whole goal
of cool access is that less-used data
can be moved from the Azure NetApp Files
storage, which is considered the hot
tier, to regular Azure storage, which is
considered the cool tier, to help drive
cost savings. With the premium and ultra
service levels of Azure NetApp Files,
they've enhanced the quality-of-service
algorithms that drive the allocation of
throughput, to really try and minimize
any performance impact you see as a
result of that cooling.
On the database side, Cosmos DB for
PostgreSQL is being retired at the end
of March 2029, so about three years from
now. The writing was on the wall for
this; it's been replaced by PostgreSQL
elastic clusters. An elastic cluster is
built on the same Citus extensions we
know for distributed sharding of
PostgreSQL databases, but it has the
built-in HA, backups, DR, and all the
future engineering investment. So just
make sure you migrate. Honestly, you
want to migrate as soon as possible, but
you definitely need to migrate before
then, and there is migration tooling
available to help you achieve this.
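The core idea behind that Citus-style distribution is hash sharding: each row is routed to a shard by hashing a distribution column, so every node holds only part of the table. A toy sketch of the routing (purely illustrative; Citus does this inside the database engine, and the tenant names are made up):

```python
# Toy sketch of hash sharding, the idea behind Citus-style distributed
# PostgreSQL. Illustrative only; the real routing lives in the engine.
import hashlib

NUM_SHARDS = 4

def shard_for(tenant_id: str) -> int:
    # Hash the distribution column value, map it onto a shard.
    digest = hashlib.sha256(tenant_id.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

# Place some hypothetical tenants onto shards.
shards = {i: [] for i in range(NUM_SHARDS)}
for tenant in ("contoso", "fabrikam", "tailwind", "northwind"):
    shards[shard_for(tenant)].append(tenant)

# The same tenant always routes to the same shard, so single-tenant
# queries touch exactly one node.
assert shard_for("contoso") == shard_for("contoso")
assert 0 <= shard_for("fabrikam") < NUM_SHARDS
```

The determinism is what matters: queries filtered on the distribution column go to exactly one shard, which is what lets the cluster scale out writes and storage.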
Event Grid has a number of updates.
Remember, the whole point of Event Grid
is that I can build event-driven
solutions at massive scale, and the app
that wants to trigger off the event
doesn't have to hammer-poll, constantly
asking the source, "do you have
something? do you have something?"
Event Grid takes care of that and then
calls the, often serverless, handler
that's going to do something with the
event data. So what they've done is a
whole bunch of MQTT enhancements. Now
available in GA, there's in-order
message delivery within a client
session, connection limiting of one
connection attempt per second per client
(so it throttles the number of
requests), up to 15 MQTT topic segments,
and cross-tenant delivery. In preview,
there's MQTT OAuth 2.0 for the
authentication, there's custom webhook
authentication, and there are static
client IDs. It can also now use managed
identities for webhooks, there's
cross-tenant webhook delivery, and it
supports network security perimeters
now. Remember, network security
perimeters are about: hey, I have a
bunch of different Azure PaaS services,
and if I put them in the same network
security perimeter, they're all allowed
to talk to each other, and then I can
also control inbound and outbound
communication for them as a group of
services. So it's a really nice set of
rule controls I can do there.
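The "one connection attempt per second per client" limit is a classic per-key throttle. The broker enforces this for you; this sketch just shows the behavior, with hypothetical client IDs:

```python
# Sketch of a per-client connection-attempt throttle, illustrating the
# "one connection attempt per second per client" MQTT limit. The Event
# Grid broker enforces this itself; client IDs here are made up.
class PerClientConnectLimiter:
    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval
        self.last_attempt = {}            # client_id -> last allowed time

    def allow(self, client_id, now):
        last = self.last_attempt.get(client_id)
        if last is not None and now - last < self.min_interval:
            return False                  # throttled: too soon
        self.last_attempt[client_id] = now
        return True

limiter = PerClientConnectLimiter()
assert limiter.allow("sensor-1", now=0.0) is True
assert limiter.allow("sensor-1", now=0.5) is False   # within 1 second
assert limiter.allow("sensor-2", now=0.5) is True    # limit is per client
assert limiter.allow("sensor-1", now=1.1) is True    # interval elapsed
```

Per-client (rather than global) limiting is what protects the broker from one misbehaving, reconnect-looping device without penalizing everyone else.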
So Copilot Co-work is now available in
Frontier. I did a whole separate video
on this. It is crazy, crazy good. And
again, I don't often push people to go
and watch one of my videos, but go and
watch the Co-work video. Even if you
skip me waffling on, go to about minute
eight for the demo. It runs in the
cloud, so it makes no use of your local
machine; it's not using local resources,
and it doesn't have free rein over your
local machine. I can interact with it
while it's doing its work. It's grounded
in Work IQ, Outlook, Teams, all these
fantastic data sources, and I just tell
it the outcomes I want. It goes and
works out the whole plan and then just
does it. It is a game-changer; I've done
some really cool things with it already
just this week. But yeah, go and watch
the demo. It's crazy good.
Azure Speech Neural HD 2.5. This is all
about giving more choice in the regions
where you use it, the quality, the
performance, and the expressiveness,
for where I want really low latency,
think real-time interaction. There are a
whole number of different speaking style
updates for English content: things like
struggling and skeptical styles, and it
can do things like sighing and yawning,
which again will be important if I'm
trying to do a voiceover for my content.
Just a lot of work on those synthetic
voices.
And then I'm only mentioning this
because I thought it was really well
named. There are constantly new models
being added to Foundry; that's one of
the whole points of the Microsoft
strategy in general: model choice.
There's no such thing as the best model.
There might be the best model this week
for this type of requirement, but nearly
every scenario is using multiple models,
and they're going to evolve over time.
But I love the name of this one. It's a
new NVIDIA model, Nemotron 3 Super
120B-A12B.
So it's a mixture of experts. The whole
point of a mixture of experts is there
are 120 billion parameters, but it only
activates about 10% of them, so 12
billion, for any specific inference. So
it's actually fairly compact in its
resource utilization, but based on what
the inference request is, it can choose
whichever experts are the most
applicable to what it's been asked to
do. It has a 1 million token context,
and it's really geared towards text
generation. Obviously I could then pass
that to a text-to-speech model, or have
a speech-to-text model in front of it,
but its specific goal is text
generation. Super, super powerful.
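The parameter arithmetic behind that can be sketched in a few lines. The 120B total and 12B active figures are the ones quoted above; the expert split and the routing are a toy (real routers are small learned networks that score every expert per token and pick the top-k), so treat this as illustration only.

```python
# Back-of-envelope sketch of mixture-of-experts activation: only ~10%
# of total parameters are active per inference. Expert count and
# routing are toy assumptions, not NVIDIA's actual architecture.
TOTAL_PARAMS = 120_000_000_000
NUM_EXPERTS = 10                    # toy split: 12B params per expert
PARAMS_PER_EXPERT = TOTAL_PARAMS // NUM_EXPERTS
ACTIVE_EXPERTS_PER_TOKEN = 1        # toy top-1 routing

def router(token: str) -> int:
    # Toy router: hash the token to pick an expert. A real router is a
    # small learned network choosing the top-k most relevant experts.
    return hash(token) % NUM_EXPERTS

active_params = ACTIVE_EXPERTS_PER_TOKEN * PARAMS_PER_EXPERT
assert active_params == 12_000_000_000   # 12B active out of 120B
assert active_params / TOTAL_PARAMS == 0.1
assert 0 <= router("hello") < NUM_EXPERTS
```

That 10:1 ratio is why the model can have 120B parameters of knowledge while its per-token compute cost looks more like a 12B dense model.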
And that is it. As always, I hope this
is useful. Amazing, amazing, amazing for
all the updates this week, and until the
next video, take care.