Tear-Down of Rare ATi HD 4870 X2 Prototype & History
This is a dual-GPU engineering sample of an AMD ATI HD 4870X2 from back in 2008.
We bought this from a viewer, and although AMD never shipped this exact variant of what the card eventually became, we've never seen an engineering sample of it pop up online, and this one's got some interesting differences. The retail version of this was the single fastest graphics card when it launched and forced Nvidia to slash prices of its GTX 280 by $220 and its GTX 260 by $110 to contend with AMD's offerings, according to TechReport and AnandTech from the time. The release of the retail version of this dual GPU came at a pivotal moment for AMD: less than two years after acquiring ATI and only two months before spinning out its foundry business. AMD hadn't yet begun to contend with the aftermath of the Great Recession, because it was the beforemath, or the middlemath, of the Great Recession. The 2008 financial crisis was only really just beginning, and it would kick off in greater form one month after this dual GPU launched, when Lehman Brothers collapsed. And just to be clear, this card didn't cause Lehman Brothers to collapse. Although, in a certain way, that would be sort of funny, except for the part where the financial crisis happened; otherwise, it would be funny. AMD's stock was worth about $5 per share in August of 2008, falling to $3.50 per share just two months after this card launched, setting it up for failure long term. Following October of 2008, the price plummeted to around $2 per share by February of 2009. The GPU that we're looking at today was part of AMD's last flagship launch before the company entered a free fall. It was also around this era that AMD dumped its real estate in a sale-leaseback deal, signaling what almost became a death spiral. So, this dual-GPU video card that we're working on today, which our viewer found in what I think was a thrift shop or something like that, would have launched right at the beginning of a long period of turmoil: for the industry in general, but in particular for AMD, long before Ryzen was ever even someone's idea. Ryzen would only come to save AMD nearly a decade after the launch of this card. We're fortunate enough to have been able to buy this from a viewer. We've actually gotten a lot of really unique hardware from viewers over the years. So, if you find stuff that you think we might find interesting for videos and you want to loan it to us or sell it to us, you can email our tips line, tips@gamersnexus.net. Our thanks to Muzz for sending this over to us; we bought it from them. And we're going to get started with a teardown of this engineering sample dual-GPU card from AMD to see what the differences are from retail and try to get it to turn on.
Before that, this video is brought to you by ID-Cooling and the FROZN A720 cooler. The A720 air cooler performed well in our testing last year. It's a relatively high-end dual 140mm air cooler with seven heat pipes. We found the use of larger fans can be beneficial to acoustic performance given the thermals, although you'll want to check your case for compatibility given the taller nature of the cooler and its fans. ID-Cooling uses an all-black look for its A720 and includes mounting hardware for all modern sockets. Learn more at the link in the description below. So, we had a lot of trouble
getting this card to work. We're going
to talk about that in a little bit here,
but today what we're focusing on is some
of the history of this card. A lot of
nostalgia from this era for a lot of
people. Personally, I had a system with
two 4870s. So, they were in Crossfire, and I think the case was called the Antec Skeleton. I think it was the big version; they had a mini and a large one. That was my build for that era. I remember playing Battlefield Bad Company, whatever number it was, shortly after in the 2010 to 2012 era with those two 4870s. And this is kind of
familiar to that except it's one card
and an engineering sample which I
personally could have never dreamed of
having back at that time. So we're going
to try to get it to boot. We ran into a
lot of problems though. We'll talk about
that. Uh this is also a good opportunity
though for the history and the tear
down. Back when AMD's website looked
like it cared about gaming, rather than
using AI to spit out maps about Nebraska
and Texas during keynotes about AI for
education, the company announced its ATI
Radeon HD 4870X2.
The site feels like a throwback.
Gradients used for the buttons on the
website, a borderless styling, red and
black everywhere, and simple navigation.
It was a simpler time. It's a throwback.
AMD released the ATI Radeon HD 4870X2 in
August of 2008. Some things don't change
apparently, including naming schemes for
computer hardware. AMD marketed the card
towards what they said were ultra
enthusiasts and positioned its $550 dual
GPU against Nvidia's 9800 GX2 and GT200
series lineups. Today, that'd be about
$820 with inflation. This followed Nvidia's famous 8800 GPU series, best known for being able to run Crysis reasonably well. At the time, AnandTech commented, quote, "It looks like Nvidia's standards have changed, largely thanks to AMD, and now the key players in Nvidia's lineup are priced more realistically," end quote. AMD's marketing noted that the card had, quote, "unparalleled anti-aliasing and anisotropic filtering," end quote, with the features tab noting DirectX 10.1 support, giving us a timestamp of an era.
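That inflation figure can be rough-checked against CPI. A minimal sketch, assuming approximate CPI-U index values (roughly 219 for August 2008 and roughly 322 for 2025; both values are assumptions, and the exact result shifts with the index and month used):

```python
# Rough CPI adjustment of the HD 4870 X2's $550 launch price.
launch_price = 550.0
cpi_aug_2008 = 219.0   # approximate CPI-U for August 2008 (assumed)
cpi_recent = 322.0     # approximate CPI-U for 2025 (assumed)

adjusted = launch_price * cpi_recent / cpi_aug_2008
print(f"${adjusted:.0f} in today's dollars")  # lands near the ~$820 cited in the video
```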
The officially launched 4870X2 spec sheet lists a 750 MHz GPU clock, a 900 MHz memory clock, 800 shading units, 40 TMUs, 16 ROPs, 1 GB of GDDR5 memory, and a 256-bit bus width per GPU. And again, that's with two GPUs total, so you're getting 2 GB of GDDR5 across the whole board. The card also has a 286 W TDP, a PCIe 2.0 x16 interface, two DVI ports, and one S-Video port for display. Each GPU can't access the other's resources, so these specs apply to each GPU individually.
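Those numbers imply the per-GPU memory bandwidth. GDDR5 transfers four bits per pin per command-clock cycle, so a quick sketch of the arithmetic (the 115.2 GB/s result matches the retail HD 4870's listed bandwidth):

```python
# Per-GPU memory bandwidth from the HD 4870 X2 spec-sheet numbers.
mem_clock_mhz = 900       # GDDR5 command clock
bus_width_bits = 256      # per GPU
gddr5_data_rate = 4       # GDDR5 moves 4 bits per pin per command-clock cycle

effective_mtps = mem_clock_mhz * gddr5_data_rate             # 3600 MT/s
bandwidth_gbs = effective_mtps * bus_width_bits / 8 / 1000   # GB/s per GPU

print(f"{effective_mtps} MT/s x {bus_width_bits}-bit = {bandwidth_gbs} GB/s per GPU")
# -> 3600 MT/s x 256-bit = 115.2 GB/s per GPU
```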
Crossfire means that you could put two of these together, which would theoretically give you four GPUs in two slots. AMD basically took two RV770 processors, from what we could tell, from its already released ATI HD 4870 non-X2s, doubled the memory of each from 512 MB to 1 GB per GPU, and combined them on a single PCB along with a PLX PCIe switch between the GPUs. Following the same strategy introduced in its HD 3800 series, and confirmed by AnandTech at the time, AMD competed at the high end by multiplying its GPUs, effectively planting two on the same PCB, rather than the traditional approach of using a single flagship GPU at the high end and then cutting it down for the mid-range and maybe low-end market segments, if not spinning out something different entirely. AMD's stated strategy was to build "perfect" solutions for $200 to $300, which would be $300 to $450 today. AMD stuck with that strategy later as well, like with the RX 480. AMD's single-card approach for two
GPUs granted users some key advantages
over traditional Crossfire
configurations by requiring fewer PCIe
slots and fewer PCIe power cables in
addition to eliminating the need for motherboard Crossfire support, which is something reviewers of the time noted regarding AMD's newfound plan of attack. TechReport wrote, quote, "So, does AMD's approach invalidate Nvidia's big monolithic GPU strategy? Not exactly. While it is true that two RV770s can outperform a single GT200 in many cases, you could also make the argument that two GT200s could outperform anything that AMD could possibly concoct. Three- and four-way Crossfire scaling isn't nearly as good as two-way. AMD's strategy makes sense for AMD, but it's fundamentally no different from what Nvidia is doing. AMD is simply targeting a different initial market and scaling up or down from there," end quote. Early
marketing for the HD 4870X2 heavily pushed Crossfire Sideport, a new technology planned to release with the card, with AMD claiming it increased the total interconnect bandwidth from 6.8 GB/s to 21.8 GB/s, a massive increase. TechReport explained how it was supposed to work, reporting, quote, "AMD says the Sideport is electrically similar to PCI Express, but is simpler because it's only intended as a peer-to-peer link between GPUs. This link augments the bandwidth already available via the X2's Crossfire Bridge Interface (CFBI in the diagram), which is only used to pass final frames from one GPU to the next for compositing, and its PCI Express lanes. The Sideport connection should help improve performance in cases where multi-GPU applications have typically had performance scaling problems, such as when texture synchronization between the GPUs becomes a problem," end quote. Despite its initial marketing, AMD disabled the feature at launch due to minimal performance uplift despite increased power requirements and costs for AIB vendors, as reported via AnandTech.
On the card's release day, PC
Perspective noted, quote, "The current
plan is to offer a software update for
the card that will enable the technology
for better performance. The problem is
that AMD is allowing board vendors an
option to include the Sideport
technology or leave it out for a cost
savings. This means that there'll be
some HD 4870X2 cards on the market
that'll be able to use Sideport
technology and some that will not. Distinguishing between the two will be nearly impossible unless the card vendors explicitly indicate their card status," end quote. As far as we could
tell, AMD never enabled Sideport with
driver updates and ultimately abandoned
the technology after its HD 4800 series
launch. At that time, it would also have
been in the fight for its life as a
company. The big challenge with dual GPU
cards is that if you have a lot of
memory between two GPUs, it doesn't mean
one GPU has all that memory. So if
you've got 48 gigabytes on the board in
total, but it's 24 per GPU, although
still useful, it's not like you have a
single contiguous pool of 48 gigabytes
of memory. And that was true back here too, where GPU A can't use GPU B's memory on this card if GPU B isn't doing anything useful with it. So, that's always been a challenge for these dual-GPU solutions. Now, the performance of
most dual GPUs is also largely dependent
on driver support, which is kind of
shoddy, and whether the game or the
application knows how to utilize both
devices. So, despite their intrigue and
really a lot of the desire to make these
types of cards good, because it'd be awesome if you could have a dual-GPU card and then put in two of them, so now you've got four GPUs in two slots. Despite all of that, they've just never been common, and they've become even less common over time. The first multi-GPU cards that we could remember were 3dfx's Voodoo2 and Quantum3D's Obsidian2 X-24, both released in 1998. It wasn't
until around 2006 with Nvidia's 7950 GX2
that they really started to pick up
steam. The dual GPU trend would remain
for the next decade or so on and off
with the HD 4870X2 launching amidst the
height of the trend. Two of the last
dual-GPU cards we can remember from that time are Nvidia's GTX Titan Z and AMD's R9 295X2, both released in 2014. Nvidia also had the GTX 690 earlier than that, in 2012. Since then, most dual GPUs have been restricted to workstation cards like AMD's Radeon Pro Duo cards, Huawei's Atlas 300I Duo, and Maxsun's Intel Arc Pro B60 Dual. Now, getting into
trying to use it. Unfortunately, that is
where we ran into the big problems. We
could get the card to boot in a separate
slot as long as video was put out by
something else. The system recognized
the card and even powered it on in
various platforms we assembled with
different operating systems in our
attempts to boot with it. But the card
had issues putting out display and also
had issues getting drivers installed
onto the system. Additionally, any
attempts to force install the drivers
last officially supported on Windows 8
would crash our system and even
corrupted the operating system on
multiple occasions. Although in some instances, like the Windows 8.1 install, nothing of value would be lost.
The engineering sample that we purchased
from a viewer did arrive with some
relatively minor shipping damage along
the PCIe bracket. It doesn't really seem
to be enough to affect the card's
functionality in any way. It's just the
metal that was affected that we can see.
But maybe it explains some of the
display troubles we ran into if there's
some underlying damage we can't see. We
did try a lot of things to get it to
work. So, we tried Windows 11, Windows 10, and varying versions of Windows 8, and experimented with PCIe bifurcation support, checking on the support on the motherboard. So, that's another interesting challenge of these dual-GPU devices: you're taking the PCIe slot and basically splitting it so that each GPU gets eight lanes if you're on a x16 slot, and that also requires some form of logic on the motherboard. So, we checked all that stuff and we just could not get this thing to work. Maybe it was an engineering sample for a reason. It turns on, the fan spins, and that's about it. All right, so now we're going to take apart the RV770 engineering sample. We'll start with kind of a walk-around of the card to see what the design looked like. Really simple stuff.
So, this is a longer card with a standard-height PCIe bracket, so it's not protruding past the slot here. They were running a blower fan, and that's just going to push the air straight through, over both GPU cores, and out.
That would mean that this one runs
warmer, which it would do anyway because
it's further back, but also it's going
to get the heat dumped on it from this
core. As for the rest of the cooler design, I mean, this is about as bad as you can get in terms of thermal design, even back then. But if they were designing these to be stacked multiple in a system, then a blower is still the best way to do it, just because if you've got another card right here, there's no room for axial fans to bring air in and cool the heat sink.
question is going to be whether this is 2 or 4 GB of memory in total, because two is what it should be, but GPU-Z was showing this as four. We think it's a software issue. If this physically has four for real, though, then that would have been another reason, maybe, that this was an engineering sample. So, we'll start with actually disassembling it. Now, I'm going to track these on the mod mat, which, if you want to grab one of these anti-static mats we make, they're on store.gamersnexus.net. There's a grid here.
We're going to be tracking the front and
the back screws as I take them out. Just
because sometimes these engineering
samples, they have different types of
screws in weird locations. So, we've got a Phillips #1. Let's go around the whole backplate first.
It's interesting to see the fat
capacitors externally.
All right, that's a good start.
This is better than the MSI card I just took apart in the CyberPower pre-built review, where that card used a plastic backplate. This one is at least metal. So, this is aluminum, and they're sinking into it. It's not going to do a whole lot, but since there's memory directly on the back with nothing else cooling it, keeping the heat down on those is going to be a challenge, so they need to do at least something. And this is kind of the bare minimum. So, we've got four modules for
each GPU. There's going to be some on
the other side as well. So, these are
debug LEDs that might be unique to this card, because on engineering samples, they'll often put additional LEDs or hookups. There's some unoccupied four-pin connector spots on the PCB here. Over here, there's Kapton tape covering two DIP switches, so that's going to be unique to the engineering card. They're labeled 1 and 2, with on and off positions. Those would have been used for something in the debug and engineering process. Let's get the cooler off. Get a smaller driver. That is definitely not fully torqued. I'm not sure if that's because someone else took it apart before us or if they just didn't assemble it all the way.
So, these are spring retention, which is
pretty much how it's been done for a
long time with GPUs. And then there's
just some plastic bumpers here. These are probably to prevent a short, so those are just going to stop the bracket here, which is most likely steel, from bridging any components on the board. I'm wondering if the DIP switch enables or disables one of the GPUs.
So, this specific module isn't in the data sheet. It's really close, though. There are a few, four of them, that have the exact same character string, but with one or two letters changed, which might just be to do with pre-production versus production. The format seems to match the 1-gigabit modules. There are eight bits in a byte, so if there are four modules on this side and four on the flip side, that'd be eight total modules per GPU, which would be 1 GB per GPU, which is what we would expect. This is supposed to be a 2 GB card in total, one per core. It does detect as four in GPU-Z, but unfortunately, it looks like we do not have a gold mine of memory right now in 2026 on this card. I think we've seen enough here to confirm that. So, now I'm going to take apart the rest of this.
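The module math works out cleanly. A quick sketch, assuming 1 Gbit, 32-bit-wide GDDR5 modules (the density the markings appear to match):

```python
# Capacity and bus-width check for the suspected 1 Gbit GDDR5 modules.
modules_per_gpu = 8      # 4 on the front, 4 on the back
gbit_per_module = 1      # density suggested by the part markings (assumed)
bits_per_byte = 8

gb_per_gpu = modules_per_gpu * gbit_per_module / bits_per_byte  # 1.0 GB
total_gb = gb_per_gpu * 2                                       # two GPUs on the board
bus_width = modules_per_gpu * 32                                # GDDR5 modules are 32 bits wide

print(f"{gb_per_gpu:.0f} GB per GPU, {total_gb:.0f} GB total, {bus_width}-bit bus per GPU")
# -> 1 GB per GPU, 2 GB total, 256-bit bus per GPU
```

The 256-bit result also cross-checks against the per-GPU bus width on the spec sheet.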
Oh, that was
a little crusty.
Yeah, look at that. That doesn't look like good-quality copper. Yeah, we've got thermal paste applied to two GPU cores. This is pretty cool. Is that a PEX chip? That might be a PEX chip.
Yeah. So, this is a PLX chip, a switch for PCIe. This is what allows the two GPU cores to share their lanes through a single PCIe slot. These were expensive. I don't know what they cost at this point in history, but the last time I looked up the price of these, they were $30 to $50, back when they were really in demand for specific applications. The IHS on this is larger than the GPU silicon. Then we've got our other four memory modules per core here, so that's going to be eight total, front and back, same memory. So, that is going to be a 2 GB card in total.
And then for the rest of this, it looks like they've maybe changed some caps or moved some things around. So, between these two power connectors, if we look at the reference images on TechPowerUp's website, where they have some board shots, this region looks different. So, that may be unique to the engineering sample here.
I'm actually not sure what this is. I'm going to have to look that up. Let's look up... U, or V? Oh, it's VTEC.
>> Honda.
>> Variable cam timing.
>> Huh, is that what that is?
>> VTEC, yeah.
>> Multi-phase SMD coupled inductor. No, it's a coupled inductor.
>> That's not right.
>> That's not... yeah. Wrong Honda.
Oh, apparently these were best known for use on the 4850s and 4870s. I'm just looking at some data sheet entry for it. So, it's a coupled inductor; it's part of the VRM. If you look straight down here, those are going to be the inductors themselves. So, it looks like it might be a group of, I don't know, three or something, maybe more than that. But that's just a big, fat inductor that's all grouped into one thing. That's kind of cool. I haven't seen that, or at least I haven't seen it in a long time. Oh, another coupled component. Yeah, that's another one. Oh, nice. Plastic. There's plastic in the thermal pad. That's good, that'll conduct. So, this is going to be three phases to the three inductors in the coupled inductor. So, without probing it directly, it's probably a three-phase. And then we've got a plus-one over here; maybe that's for memory. It's possible that this is for memory. One of these is going to be a controller, and one's going to be for memory. There's another three phases over here; that is probably for the other GPU. All right. So, this is pretty easy. We're just going to see if it says anything on the die. It does. Nice. All right. Let's see what it says.
ATI. A sight for sore eyes. There's the ATI logo. So, this says diffused in Taiwan, made in Taiwan. So, they were already doing stuff there. It does say engineering sample, 0818G, and it's a B3 stepping, whatever that would have meant for this particular card. So, that's cool. This should be just the same exact thing.
This might be the first paste it's gotten in nearly 20 years when we redo it. So, same thing, just flipped. And rotating it like that would allow them to kind of rotate the memory and the VRM and everything so that they can just mirror them. This has some very orange copper heat sinks. Normally, that's not a great sign.
That can be an indicator of plating; in this case, I thought it was copper-plated. And some crusty paste.
There's your heat sink. We're going to
go through and just pull the shell apart
to look at what the fin stack looks like
inside. Good use of thermal pads.
They've got a large thermal pad right in
the middle that would have been
contacting that PLX
chip. That's for the fan.
They really wanted this held together. A couple of things. First of all, they drove that screw right through the memory contact patch of the heat sink. So, a little bit of lacking coverage there, but a steel screw doesn't really sink that much heat. This is an aluminum base plate, kind of a traditional design. Look at that. It landed right there. Perfect.
Again, you can grab your mod mat on the
store for moments like this. Aluminum base plate. There you go. That's not bad. They've got some fat fins here; that's going to help with the PLX. So, that is sitting right on top of the PLX chip. Those did get warm, from what I remember. I don't know what the power draw was for them, but they're throwing a pad at it. And just as a reminder, that's what that looks like. So, that larger square pad contacts the PLX chip in the middle. This is going to help sink the heat. Wider fins will reduce the resistance as you're pushing air through, compared to the tighter fin stack here and the tighter fin stack here. These are those pads we were looking at earlier, so those are going to help with memory. There's a little bit of contact directly underneath to some of the MOSFETs and some of the memory components, plus that blower fan, and otherwise this is all they were cooling it with. This thing would have run pretty hot. So, this was
also in an era where AMD and Nvidia
weren't shamed enough yet to really
improve the cooling on their GPUs. So,
that's going to be it for the history lesson on this unique ATI-AMD card from right after AMD bought ATI, before they kind of shuttered the brand, and before one of AMD's worst moments in its history financially. After this would have been Bulldozer, Piledriver, Steamroller, all of those CPUs before the Ryzen launch in 2017 and the AM4 launch in 2016. So, AMD was doing okay, and they managed to buy ATI, which was a big acquisition, but then they hit the 2008 crisis. Not a good time for AMD, and Nvidia was much smaller at that time. So, it's a historically really interesting point in time for this card, because it came out just before massive drops in AMD's stock price and before AMD had to do a lot of cutbacks, which is why I think you see some of the technologies that were supposed to be in cards like this never really coming to fruition: they probably made a bunch of cuts to their tech ambitions and just went towards sort of trying to survive through the 2008 crisis. So, really interesting stuff. That's going to be it for this one. If you have any other older GPUs or
CPUs or anything you're interested in
where we could do a historical recap
plus maybe a tear down or something and
if we can get it working, which
unfortunately we couldn't get this one
to actually run anything successfully
without a major crash or corruption or
something. But if we can get it working,
let us know what kinds of parts that
you're interested in from older eras.
We'll see if we can get them or if we
already have them and maybe do a history
breakdown of it and some testing. And
otherwise, if you want to see more like
this, check out our 3DFX Voodoo GPU that
we looked at previously. That one was
handmade by someone who salvaged GPUs
from 3dfx that were never officially
released. We'll link that below. Thanks for watching. Subscribe for more. You can go to store.gamersnexus.net to support us directly, or patreon.com/gamersnexus, and we'll see you all next time.