The Epochal Ultra-Supercritical Steam Turbine
In 1993, Japan broke through with the first commercial-scale
Ultra-Supercritical steam turbines.
For thirty years, turbines operated at merely supercritical temperatures,
limited by the properties of the steels they were made from.
It took nearly two decades of R&D for Japan to develop the
technologies to bring that steel to the market.
In today's video, we explore a coal-centric technology: the
30-year march from supercritical to ultra-supercritical steam turbines.
## Beginnings
Power-generating steam turbines are a technology well over a hundred years old.
Inside a thermal power plant, a heat source warms water in a boiler. That
heat source can be coal, oil, nuclear, or geothermal. Most of the time, it is coal.
The heated water turns into steam. That steam then hits the turbine blades and violently expands,
producing mechanical energy that spins a generator and creates electricity.
The steam - now cooler and under less pressure - is condensed back into a
liquid state and returned to the boiler. This loop is the Rankine Cycle.
Thermal plants are not as efficient as a hydroelectric plant - which can
get to as high as 90% as compared to a thermal plant's 30-60%. But thermal
plants are cheaper, smaller, and less location-specific.
## Getting More Efficient
We define a steam turbine's efficiency as how much
of the fuel's heat input is turned into usable electricity.
Like all heat engines, steam turbines have a maximum efficiency - the Carnot
limit - set by the difference between the temperatures of the steam entering and leaving the turbine.
The temperature of the steam leaving the turbine tends to be fixed,
as it is tied to the turbine's environment. So the most practical thing to do is raise the
other side of the equation: the energy of the steam going into the turbine.
So over the past 70 years, that has been the general summary of the steam turbine's technical
evolution: Getting the steam hotter and putting it under higher pressure to make it more energetic.
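We can sketch that ceiling with a few lines of Python. This is a minimal illustration of the Carnot limit, assuming a condenser temperature of about 30 degrees Celsius (a round number for "the turbine's environment", not a figure from the video); real plants land well below this bound.

```python
# Illustrative sketch: the Carnot ceiling on a steam turbine's efficiency.
# The 30 C condenser temperature is an assumption for illustration.

def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum fraction of heat convertible to work between two temperatures."""
    t_hot_k = t_hot_c + 273.15   # convert Celsius to Kelvin
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Steam temperatures mentioned in this video, coldest to hottest:
for steam_temp in (541, 566, 593, 700):
    eta = carnot_efficiency(steam_temp, 30)
    print(f"{steam_temp} C steam -> Carnot limit {eta:.1%}")
```

Notice the bound creeps up only a few points per hotness tier, which is why every extra degree of steam temperature was fought for so hard.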
There was an interesting efficiency side-quest involving reheat cycles.
After the steam expands in a high-pressure turbine,
it is sent back to the boiler for reheating.
The reheated but lower-pressure steam then hits a second, intermediate-pressure turbine,
and maybe even a third, low-pressure turbine.
Only after that is the steam finally condensed back to liquid.
Reheat cycles both add thermal efficiency and protect the turbine blades by reducing
the steam's moisture content. Though they do add considerable complexity to the turbine design.
Anyway, the more efficient the turbine is,
the less coal or oil we need to burn to get the same number of megawatts. Which is a big
deal because fuel is the dominant factor in the cost of electricity.
One additional percentage point of efficiency can save millions of tons
of coal from burning each year - reducing carbon dioxide emissions by about 2-3%.
## The Big Beautiful Boil
A major efficiency problem soon emerged, however: boiling.
Imagine a pot of water on the stove at sea level. You add heat to the pot until the water reaches
its boiling temperature of 100 degrees Celsius. And right on cue, the water starts to boil.
Now suddenly you find that adding more heat no longer increases the temperature. More heat
energy only produces more bubbles. It does not give you hotter steam,
which is what you need to further raise the turbine's efficiency.
But if the water's pressure and heat go beyond a certain
critical point - 22.1 mega-pascals and 374 degrees Celsius - something weird happens.
It becomes a dense, fog-like phase that we call a supercritical fluid - bestowed
with the properties of both gases and liquids. It effectively becomes steam without boiling. We
can now raise input steam temperatures with more heat energy - achieving ever better efficiencies.
Going supercritical also made these boilers somewhat safer. Older, subcritical boilers
had something called a steam drum. After water boils, the drum separates the steam from the remaining liquid water.
Going supercritical means we no longer need the steam drum. Removing it not only
reduces the system's weight but also improves safety: a drum full of hot,
highly pressurized fluid posed an explosion risk. As many a train has discovered.
We call such designs "once-through" boilers - first patented by the engineer Mark Benson in
the 1920s - because water passes through the heating system only
once. It goes into the boiler cold and leaves as supercritical steam.
And yes, we probably shouldn’t call supercritical boilers “boilers” anymore,
since they are not actually boiling. But inertia is a powerful thing.
## Philo Unit 6
In the 1950s, the American Electric Power company joined with turbine-maker General
Electric and boiler-maker Babcock & Wilcox Company to build the first
commercial supercritical power generation unit: Philo Unit 6.
With a max capacity of 125 megawatts, Unit 6's feedwater was pressurized at
37.9 mega-pascals. Prior to this, no steam power plant had breached 22.1 mega-pascals.
The steam's main operating temperature reached 621 degrees Celsius - 28 degrees higher than
what had been previously possible. At these temperatures, the steel pipes literally glow red.
The unit's thermal efficiencies touched 40% - also a clear cut above anything else then available.
After Philo Unit 6, a second supercritical power generation unit fired up in 1961:
Eddystone Unit 1. Built by the Philadelphia Electric Company, Eddystone's steam reached
operating temperatures of 649 degrees Celsius and pressures of 34 mega-pascals.
The Soviets also got into the fun, launching a prototype turbine called the SKR-100 in 1968.
It achieved steam conditions of 30 mega-pascals and 650 degrees Celsius.
But Unit 6 led the way. A manager for mechanical engineering at the
successor company AEP would later say about it:
> For its day, Philo 6 was like flying to the moon without taking the intermediate
> steps of first orbiting the Earth and then sending up an unmanned space ship.
But perhaps as the metaphor implies, it soon became clear that Unit 6 and
its early peers had taken a step too far. Such intense heat and
pressures were not sustainable. The reason had to do with the steels.
## Going Too Far
Steam turbines share a few common steel components.
The casings and shells are big pieces of steel that offer structural support and
hold in steam. Being so big, their steels cannot be too expensive.
Bolts hold things together. These have to be highly resistant to
stress and might find themselves exposed to very high temperatures.
The blades spin around. They face the hot steam directly, but are thin and cooled by the flow.
So it is the turbine rotor - the part that spins the blades - that experiences the hottest
temperatures. The rotor is thick and solid and receives heat conducted into it from the
blades. It is one of the biggest challenge areas from a metallurgical perspective.
Turbines and boilers are made from a limited range of steels to ensure a good match of thermal
properties. Supercritical units of the day largely used two types of steel.
For most components, they used 2.25Cr-1Mo steel,
or T22 steel. The name refers to its alloying content: 2.25% chromium and 1% molybdenum.
T22 belongs to a class of steels known as "ferritic steels".
They are called that because they have a body-centered cubic crystalline structure.
T22 is a good steel that welds easily and offers good creep strength - creep
meaning the tendency of steel to slowly deform under long periods of high temperature and
strong mechanical stress. Creep is a very serious problem for steel and people alike.
But above 560 degrees Celsius, T22 starts to lose that strength. So the hottest parts
of the turbine like the rotors were made from what are called Austenitic steels.
These steels have a face-centered cubic crystalline structure.
Austenitic steels contain high amounts of nickel and chromium and are very temperature-resistant.
They are a bit expensive, thanks to that high nickel content but
there were two other more serious problems.
First, these Austenitic steels expand a lot in high heat while also poorly
transferring that heat. So when the turbines start up and shut down,
their thick-walled components will have hotter outsides but cooler insides. The
hotter outer areas expand more than the cooler inner areas, leading to cracking.
Second was oxidation. Superheated steam can oxidize the steel, creating oxides
on its surface that eventually flake off. These flakes then either build up in the boiler tubes,
blocking their flows, or chip away at the turbine's insides - breaking them.
## The Steel Stall
These were complicated problems.
Therefore, subsequent turbines ramped down to 23.8 mega-pascals and main steam temperatures between
541 and 566 degrees Celsius. This neighborhood is generally classified as "supercritical".
The old neighborhood - anything over 25 mega-pascals and 593
degrees Celsius - formed a new category called "Ultra-supercritical". Though I must admit that
the borders between the categories are quite fuzzy. Various sources have their own numbers.
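Taking the video's numbers at face value, the categories can be sketched as a tiny classifier. The thresholds below are the fuzzy ones cited above, not an industry standard - as noted, other sources draw the borders differently, and some plants (like Matsuura later in this video, at 24.1 mega-pascals) get called ultra-supercritical on temperature alone.

```python
# Rough classifier using the (admittedly fuzzy) borders cited in this video:
# water's critical point at 22.1 MPa / 374 C, and the ultra-supercritical
# border at about 25 MPa / 593 C. Thresholds are illustrative only.

def classify_steam(pressure_mpa, temp_c):
    if pressure_mpa > 25.0 and temp_c >= 593:
        return "ultra-supercritical"
    if pressure_mpa > 22.1 and temp_c > 374:
        return "supercritical"
    return "subcritical"

print(classify_steam(16.6, 566))   # 1970s Japanese oil-fired plants
print(classify_steam(23.8, 566))   # the post-Philo "ramped down" era
print(classify_steam(31.4, 595))   # Phase 1A target conditions
```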
Getting back to the ultra-supercritical category would take great work. But with coal being so cheap, there was little
economic incentive in the US for this efficiency gain. The vast majority of America’s thermal
plants remained subcritical. By the 1960s and 1970s, just 15% were supercritical-class.
So for twenty or so years, the operating temperatures and pressures of the world’s
top coal-fired power plants remained steady. Instead,
American utilities focused on scaling turbine size and capacity to 600-1000 megawatts.
Even this hit limits by the end of the 1970s. The insane physics and tolerances on metals
inside such a huge turbine caused unexpected maintenance and extended downtime costs.
## Japan
In the 1950s and 1960s, Japanese companies
gained technical proficiency through technology transfers with the West.
By the 1970s, Japanese oil-fired thermal plants operated with steam at 16.6
mega-pascals and 566 degrees Celsius - short of supercritical conditions,
but good enough for 35% thermal efficiency - on par with similar plants in the US and Europe.
Then came the Oil Crises of the 1970s. High oil costs - plus an
international ban on new oil-fired thermal plants - forced a transition
to a diverse energy portfolio of imported coal, LNG, and nuclear.
Thus the government funded a rapid transition to supercritical-class turbines, the first of
which were two 500-megawatt units at the Matsushima Power Plant.
The Japanese government then embarked on an R&D project to make
Ultra-supercritical power generation a reality - seeing it as a way to enable coal
diversification while also meeting internal carbon emissions goals.
## Going Ultra-Supercritical
After feasibility studies, the research program at the Wakamatsu institute began in 1982.
As I said, the ultimate issue was whether the turbine steels could sustain the high thermal and mechanical
stresses over long periods of time. It would take over ten years to develop the metals for this.
Phase 1 of the project spanned until 1994 and was split into two steps,
which I shall call 1A and 1B.
Phase 1A studied older Ferritic steels to achieve conditions of 31.4
mega-pascals and 595 degrees Celsius.
Meanwhile Phase 1B looked at Austenitic steels with the hope of achieving 34.3
mega-pascals and 650 degrees Celsius (with 595 degrees in the intermediate and
lower-pressure reheat cycles). There was great hope in this, initially.
But in the end, the Austenitic steels in phase 1B failed to work.
Though blends were found that could sustain those temperatures, tests concluded that
their thermal expansion coefficients would still cause them to eventually warp and break.
After building test turbines at Wakamatsu,
Mitsubishi Heavy Industries leveraged the Phase 1A learnings to build the first commercial-scale
Ultra-supercritical turbine in the world: the 700-megawatt Hekinan Unit 3.
Unit 3 operated at temperatures of 593 degrees Celsius. It first fired up in April 1993,
marking a long-awaited return to Ultra-Supercritical conditions.
## Some Special Steel
Meanwhile, in Phase 2, the Japanese program - a joint venture between Wakamatsu, Mitsubishi and
the Japanese steel company Kobelco - discovered a line of "Advanced 12Cr" Ferritic steels.
There are four of these steels, but the most well-covered one is Mitsubishi's TMK1 steel. I
am in awe of this steel. I don't know how many people care, but this is some special steel.
TMK1 descends from a 12% Chromium steel originally made in England by William
Jessop & Sons for jet engine discs, called MEL-TROL H46. Produced with a proprietary blend
of nine alloying additives - from carbon to boron to vanadium to tungsten - H46 had good creep strength.
General Electric took this steel and in 1965 reduced the proportion
of Niobium. This produced a line of 12% Chromium steels sometimes used for rotors
during the supercritical 566 degree Celsius era. They called it 12CrMoVNb.
To create TMK1, the Japanese simply had to do one thing:
Tune the amount of molybdenum in relation to how much tungsten was in the steel.
This drew on experiments on H46 done by a Professor Fujita in the 1970s.
Fujita discovered that raising molybdenum content from 1% to 1.5%
helped hold together the steel's internal structure and keep it from creeping.
Too much molybdenum however would cause the steel to create delta-ferrite structures that
undermine that long-term creep strength. So 1.5% precisely. No more, no less.
Fujita's steel, called TAF, was not suitable for large items like rotors. So Mitsubishi and
Kobelco worked together to scale up the methods to produce larger steel items.
Producing this steel requires some insane skills. The raw steel mix is first melted
using electricity in a vacuum, which removes hydrogen and nitrogen gas
impurities. The output is then cast into a solid intermediate product.
This metal is then carefully re-melted by turning it into an electrode and passing
AC electric current through it - a method known as electroslag remelting. The metal
melts drop-by-drop, so it can be cast into the cleanest, most uniform ingot possible.
The massive ingot is then forged into shape while it is still hot. Then finally the metal
is heated, cooled, and reheated four times at temperatures of 700 to 1,100 degrees Celsius.
These heat treatments are to create and lock in the steel crystal microstructures
necessary for the steel to survive insanely high temperatures, pressures,
and mechanical forces without failing for over 100,000 hours.
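Nobody runs a 100,000-hour (11-year) creep test before shipping a rotor. In practice, metallurgists run shorter, hotter tests and extrapolate - commonly with the Larson-Miller parameter, P = T·(C + log10(t)), with T in Kelvin and t in hours. The sketch below is a generic illustration of that extrapolation with an assumed constant C = 20; it uses no actual TMK1 test data.

```python
# Generic Larson-Miller extrapolation sketch. C = 20 is a commonly assumed
# constant for steels; the test numbers below are illustrative, not TMK1 data.
import math

C = 20.0

def larson_miller(temp_c, hours):
    """Larson-Miller parameter: equal P implies roughly equal creep damage."""
    return (temp_c + 273.15) * (C + math.log10(hours))

# A short, hot accelerated test...
p_test = larson_miller(650, 1000)

# ...maps to a longer life at the cooler service temperature (593 C),
# by solving P = T * (C + log10(t)) for t:
t_service = 10 ** (p_test / (593 + 273.15) - C)
print(f"Equivalent service life at 593 C: ~{t_service:,.0f} hours")
```

The design-side logic then runs in reverse: pick the stress at which the extrapolated life clears the 100,000-hour bar with margin to spare.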
Mitsubishi Heavy Industries used TMK1 for the rotors of the Matsuura
Thermal Power Plant Unit 2. It began operations in 1997. At 1,000 megawatts,
it was the first large-scale ultra-supercritical steam plant.
With operating steam pressures of 24.1 mega-pascals and temperatures
of 593 degrees Celsius, its thermal efficiency of about 42% broke new
ground. Recall that Japan's earlier plants managed about 34-35% efficiency.
Matsuura's success kicked off a new turbine boom in Japan. Better
steel recipes came out with improved heat and creep resistance. And by 2001, there were 13
Ultra-supercritical power generating units online in Japan. By 2013, 25 units.
This includes some of the most efficient coal-fired power plants in the world. A notable
one is the Isogo Thermal Power Plant Unit 2, a 600-megawatt ultra-supercritical turbine.
Produced by Hitachi, the turbine operates at a
wonderful 25 mega-pascals and 600 degrees Celsius. Notably, it adds one reheat cycle,
raising the steam to 620 degrees. The thermal efficiency is a record-breaking 45%.
## Advanced Ultra-Supercritical
It is worth noting that the European community
has also spent effort developing Ultra-supercritical thermal plants.
The core of the European effort was a program called AD700.
Begun in 1998, it sought to produce a demonstration plant with steam operating
at 34 mega-pascals and the neighborhood of 700 degrees Celsius. Maybe even 750 degrees.
Which is basically basaltic lava flow range. Such a system would have thermal
efficiencies of 50%. They dubbed this category "Advanced Ultra-supercritical".
The limits of ferritic steels and others are around 620 degrees Celsius,
so the focus has shifted to nickel-based superalloys. Their
production requires even more demanding melting processes and heat treatments.
However, by the 2000s the European Community had started a shift towards renewables. The AD700
demonstration plant was eventually built in 2005 - COMTES700 in Germany, which operated for about
four years. But continued work on 50% efficient coal-fired thermal plants has been de-emphasized.
These technologies have since moved over to Asia. Japan remains a technology pioneer, but
it is increasingly adopting LNG. So these thermal plants thrive most
in India and the People's Republic of China - countries that still heavily rely on coal.
## Conclusion
Coal-fired thermal plants occupy a funny place in the portfolio.
The steam turbine is the OG turbine, but there are others out there. A notable one
is the gas turbine, which directly takes in natural gas or other refined fuels
to produce electricity. Lots of overlap with jet engines.
There are even versions where the heat in a gas turbine's exhaust is captured to power
a steam turbine a la Human Centipede: A Combined Cycle Gas Turbine or CCGT.
Measured on point-to-point efficiency, CCGTs beat standalone steam turbines at about 55-60%
for the former compared to 42-45% for the latter. And since they cook with gas directly,
you skip a big boiler - though you still need to produce steam.
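The arithmetic behind that 55-60% figure is simple enough to sketch: the steam turbine's fuel is heat the gas turbine would otherwise throw away. The component efficiencies and heat-recovery fraction below are assumed round numbers, not figures from any specific plant.

```python
# Sketch of combined-cycle efficiency: gas turbine output plus steam turbine
# output generated from recovered exhaust heat. All inputs are assumptions.

def combined_cycle_efficiency(eta_gas, eta_steam, recovery=0.9):
    """Fraction of fuel heat converted to electricity by both cycles."""
    exhaust_heat = 1.0 - eta_gas   # fuel heat left in the gas turbine exhaust
    return eta_gas + exhaust_heat * recovery * eta_steam

eta = combined_cycle_efficiency(eta_gas=0.40, eta_steam=0.35)
print(f"Combined cycle efficiency: {eta:.1%}")
```

Neither machine is especially remarkable on its own here; stacking them is what pushes the total near 60%.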
But nothing quite beats coal's versatility,
storability, and low sticker price. Yeah CCGTs are more efficient,
but dead cheap and plentiful coal beats more expensive LNG - carbon pricing not included.
Moreover, these steam turbines are huge. Gas turbines generate anywhere between 100 and 400
megawatts of power. Modern steam turbines can get to a staggering 1,500 megawatts,
operating for weeks or months on end to provide steady baseload power to the grid.
Despite the rise in world renewables capacity, coal remains a dominant power source - and
I struggle to see a path away from it entirely despite the carbon footprint.
So producing more electricity from less coal should be a key goal in the future.