Polaris vs. Hawaii - RX 570 takes on the R9 290

Nightfire

2[H]4U
Joined
Sep 7, 2017
Messages
3,160
Steve at Hardware Unboxed has another good video showing how an old favorite does in newer games.


The RX 570 manages to best the older card, but you have to wonder how things would look if Hawaii were the newer card and more of the games were optimized for that architecture. Take a look at the results:
Screenshot_20190402-231449_YouTube.jpg


Just look at how the DX12 games are actually better on the R9 290. Division 2, Metro Exodus, and FC New Dawn are the exceptions, but those are BRAND NEW games. You have to wonder how well Hawaii would perform if it were still being optimized for and had been upgraded with more advanced memory and a newer GPU node.
 

Nolan7689

[H]ard|Gawd
Joined
Jun 5, 2015
Messages
1,694
Neat, but I don’t wonder at all what would be if they were still working with Hawaii, just based on the fact that they’re all essentially GCN and similar apart from small tweaks. Hell, the 570 is missing 20% of the stream processors that the 290 has (2048 vs 2560) and makes up for it via clock speed and process node size.

To put it more in perspective, the 570 has the same count (2048) as the original GCN flagship, the venerable 7970, which was refreshed as the 280X and tested last fall against newer cards in its respective space. https://m.hardocp.com/article/2018/09/20/amd_gpu_generational_performance_part_2/

It doesn’t have the 570, but I think the results speak for themselves. Also keep in mind that brute-forcing more cores onto the chip hits walls pretty quickly, à la Fury X/Fury and Vega 64/56 (4096/3584), where the performance increase between those came purely from node shrinks allowing higher clocks and memory bandwidth.
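Quick back-of-envelope math on that trade-off (reference boost clocks assumed: 1244 MHz for the RX 570, 947 MHz for the R9 290; partner cards vary):

```python
# Peak FP32 throughput = stream processors x 2 ops per clock (FMA) x clock.
# Clocks below are the AMD reference boost specs; partner cards clock higher.
def peak_tflops(stream_processors, clock_mhz):
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

rx_570 = peak_tflops(2048, 1244)  # ~5.1 TFLOPS
r9_290 = peak_tflops(2560, 947)   # ~4.85 TFLOPS
print(f"RX 570: {rx_570:.2f} TFLOPS, R9 290: {r9_290:.2f} TFLOPS")
```

Fewer shaders at higher clocks puts the 570 right on top of the 290's raw throughput, which is why the two trade blows despite the shader deficit.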


All that said though, the 7970/290 era of cards did have some amazingly long legs to continue being such strong performers for so long.
 

ManofGod

[H]F Junkie
Joined
Oct 4, 2007
Messages
12,298
This video is why I wish I had kept my R9 290 (flashed to an X) a lot longer than 2 years. I bought it in October 2013, sold it 2 years later, and am seeing now that it was a mistake. That said, this is the very reason I will be keeping my Vega 56 for at least 5 years or so.
 
Joined
Feb 20, 2017
Messages
992
The problem is that AMD hasn't updated its delta color compression since 2014.

As a result, the GPU slams into a memory bottleneck very quickly.

AMD's workaround has instead been to use expensive HBM memory.
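A rough sketch of what that costs (bus widths and data rates are the reference specs; the Polaris compression ratio is an assumed illustrative number, not a measured one):

```python
# Effective bandwidth ~= raw bandwidth x average color compression ratio.
# Hawaii (R9 290) predates delta color compression, so its ratio is 1.0;
# the 1.3x for Polaris is an assumed figure for illustration only.
def bandwidth_gbs(bus_width_bits, data_rate_gbps, compression_ratio=1.0):
    raw = bus_width_bits / 8 * data_rate_gbps  # GB/s
    return raw * compression_ratio

r9_290 = bandwidth_gbs(512, 5.0)        # 320 GB/s raw, no DCC
rx_570 = bandwidth_gbs(256, 7.0, 1.3)   # 224 GB/s raw, ~291 GB/s effective
```

The point being: compression lets a narrower, cheaper bus punch close to a 512-bit one, and standing still on it since Tonga is why the big GCN parts keep needing exotic memory instead.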
 

Nightfire

2[H]4U
Joined
Sep 7, 2017
Messages
3,160
Neat, but I don’t wonder at all what would be if they were still working with Hawaii, just based on the fact that they’re all essentially GCN and similar apart from small tweaks. Hell, the 570 is missing 20% of the stream processors that the 290 has (2048 vs 2560) and makes up for it via clock speed and process node size.

To put it more in perspective, the 570 has the same count (2048) as the original GCN flagship, the venerable 7970, which was refreshed as the 280X and tested last fall against newer cards in its respective space. https://m.hardocp.com/article/2018/09/20/amd_gpu_generational_performance_part_2/

It doesn’t have the 570, but I think the results speak for themselves. Also keep in mind that brute-forcing more cores onto the chip hits walls pretty quickly, à la Fury X/Fury and Vega 64/56 (4096/3584), where the performance increase between those came purely from node shrinks allowing higher clocks and memory bandwidth.


All that said though, the 7970/290 era of cards did have some amazingly long legs to continue being such strong performers for so long.

I was really only pointing out that AMD has not come very far in all those years (outside Vega) if you factor in not just raw performance, but game optimization overall and optimization specifically between a card and a game.
 

Nightfire

2[H]4U
Joined
Sep 7, 2017
Messages
3,160
It cost me $269 new (March 2015) with some free games I can't remember. It was a soft launch ahead of the R9 390/390X, and it seems to have the 390X GPU under the hood, with Samsung memory that runs at 8000 MHz; the box says 750 watt minimum power supply, and the card is rated at 375 watts.

20190405_192808.jpg
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
4,734
I hold to this day that the 295X2 was one of the finest video cards ever made. Hawaii was a beast for its time. I even went so far as jumping on a PowerColor Red Devil X2 for a secondary machine a few years back when Newegg was selling them at a stupidly cheap $650 - it still destroys anything at 3K.
 

Dayaks

[H]F Junkie
Joined
Feb 22, 2012
Messages
8,288
I hold to this day that the 295X2 was one of the finest video cards ever made. Hawaii was a beast for its time. I even went so far as jumping on a PowerColor Red Devil X2 for a secondary machine a few years back when Newegg was selling them at a stupidly cheap $650 - it still destroys anything at 3K.

It’s a great GPU given a bad name by its reference cooler. However, the 295X2 gets destroyed these days... and that’s pretending crossfire works most of the time.
 

Shadowed

Limp Gawd
Joined
Mar 21, 2018
Messages
506
The problem is that AMD hasn't updated its delta color compression since 2014.

As a result, the GPU slams into a memory bottleneck very quickly.

AMD's workaround has instead been to use expensive HBM memory.

Holy crap, Tonga is from 2014. I'm not sure why, but it feels more recent.
 

evolucion8

Gawd
Joined
Mar 15, 2006
Messages
916
Hawaii's biggest bottlenecks are bandwidth (due to the lack of any compression technology), tessellation, and the command queue processors. So the fact that the RX 570's much smaller resources are able to match it is a testament to the performance gains from minor tweaks. Even the Fury X, which is much wider, barely outperforms the RX 590, and it's not because of clock speed alone; Fiji has bottlenecks in tessellation and the command queue processor as well, plus something funky with the HBM1 (latency, maybe?).
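You can see the "wider isn't faster" point in a quick compute comparison (reference clocks assumed: 1050 MHz for the Fury X, 1545 MHz boost for the RX 590):

```python
# Fiji has far more shaders and double the bandwidth of Polaris 30, yet
# barely wins in games, so the limit has to be elsewhere (tessellation,
# command processor, etc.). Same peak-FP32 formula as before.
def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

fury_x = peak_tflops(4096, 1050)   # ~8.6 TFLOPS, 512 GB/s HBM1
rx_590 = peak_tflops(2304, 1545)   # ~7.1 TFLOPS, 256 GB/s GDDR5
```

On paper Fiji should win by ~20% plus its bandwidth advantage; that it doesn't is exactly the front-end bottleneck argument above.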
 

Dayaks

[H]F Junkie
Joined
Feb 22, 2012
Messages
8,288
I see why support from Nvidia has dropped off for SLI: they never reached the level AMD was at with frame pacing, which was made for the HD 7970 because it changed the game, so to speak, and evolved without the need for the ribbon, and all the latest tech is there, like FreeSync. Also, going back and playing games from 8 to 10 years ago is the only true way to see the improvements, and how the price scales back to what you would have paid back then.

But the real meaning is Nvidia wants to sell you tax-refund-priced video cards. So with the price of the cheap little RX 570 and gold-level driver support, I can afford to own a Crossfire rig even though I paid for it in 2010, and that power supply is that old also (Corsair TX). Lol, but saying that, it still has its place today, and my single-card rig is an RX 580. But I will say I'll 100% Crossfire with any of my 3 RX 570s when I want to scale up performance.

Multi-GPU is garbage from both vendors. I’ve had crossfire, trifire, and SLI at multiple points. There are tons of frame time charts throughout history showing this, and since the RX 480, support has fallen off sharply.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,710
I see why support from Nvidia has dropped off for SLI: they never reached the level AMD was at with frame pacing, which was made for the HD 7970 because it changed the game, so to speak, and evolved without the need for the ribbon, and all the latest tech is there, like FreeSync. Also, going back and playing games from 8 to 10 years ago is the only true way to see the improvements, and how the price scales back to what you would have paid back then.

AMD never caught up to SLI. Nvidia had been focused on frame pacing since their first DX10 GPUs, and it took AMD over half a decade to even acknowledge how badly their cards were doing.

Further, while CF/SLI were sold as an 'upgrade path', they were never anywhere near perfect. They only ever made sense if no single GPU existed to provide the desired performance and the desired applications had proven support. This was more common in the past and is not at all common now. If you want a faster experience with AMD GPUs, buy a faster AMD GPU.
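For anyone who hasn't followed the frame pacing saga: this is why frame time charts matter more than FPS counters. A toy example (the traces below are made up, purely for illustration):

```python
# Two frame time traces with identical average FPS but very different pacing.
def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_ms(frame_times_ms, pct=99):
    s = sorted(frame_times_ms)
    return s[min(len(s) - 1, int(len(s) * pct / 100))]

smooth = [16.7] * 100            # well-paced, ~60 FPS
stutter = [8.0, 25.4] * 50       # alternating fast/slow frames (microstutter)

print(avg_fps(smooth), avg_fps(stutter))          # nearly identical averages
print(percentile_ms(smooth), percentile_ms(stutter))  # stutter exposed
```

Both traces report ~60 FPS on an average-FPS counter, but the 99th percentile frame time shows the second one delivering every other frame late, which is exactly what badly paced CF/SLI felt like.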
 

Araxie

Supreme [H]ardness
Joined
Feb 11, 2013
Messages
6,452
I see why support from Nvidia has dropped off for SLI: they never reached the level AMD was at with frame pacing, which was made for the HD 7970 because it changed the game, so to speak, and evolved without the need for the ribbon, and all the latest tech is there, like FreeSync...

WTF? The HD 7000 series had one of the worst frame pacing records ever in mGPU. Nvidia made software frame pacing solutions as far back as, I believe, the GeForce 8000 series in 2006. In fact, it was with Kepler (the GTX 600 series, competitor of the HD 7000 series) that Nvidia made the biggest frame pacing optimizations for SLI (and, another fact, Nvidia created the FCAT monitoring software), partly due to HardOCP's reports about multi-GPU solutions. Frame pacing for AMD was introduced in Catalyst 13.8, IIRC, and Crossfire was still a stuttering mess: great performance but terrible frame times. At the time, one of the biggest selling points for Nvidia vs AMD was that SLI was simply better and Crossfire sucked. The true change from AMD came with Hawaii (R9 290 and 290X), which ditched the typical Crossfire bridge for the new hardware XDMA engine allowing direct access between GPUs without the bridge. Still, frame pacing support was via a software solution: good, but still not great.

The real change started with Fiji (the Fury series), with frame pacing done in hardware. Nvidia always had better mGPU frame time consistency than AMD. The drop in support for mGPU solutions, whether Crossfire or SLI, came from the devs: initially most games didn't support SLI or Crossfire, and when that support was added, games were plagued with bugs. It happened because of consoles: games are made for consoles, where SLI and Crossfire aren't used, so supporting those technologies on PC was an investment that almost no studio decided to make.

So everything you said is just another wave of BS and fanboy trash talking.
 
Last edited:

Nolan7689

[H]ard|Gawd
Joined
Jun 5, 2015
Messages
1,694
Yep. That’s how I remember crossfire. Even if you had higher frames than SLI, the experience was often worse due to stuttering.
 

crazycrave

[H]ard|Gawd
Joined
Mar 31, 2016
Messages
1,026
I lived my Crossfire Eyefinity iRacing sim days with HD 7950s, so take your BS elsewhere, snowflake.
 

FlawleZ

[H]ard|Gawd
Joined
Oct 20, 2010
Messages
1,350
Because you can buy the R9 Fury so cheap now, I've thought about buying another to pair, just to toy with crossfire. For 1080p it's hard to beat at ~$120 used.
 

Araxie

Supreme [H]ardness
Joined
Feb 11, 2013
Messages
6,452
BF4 was amazing with Xfire, but everything else pretty much sucked.

When it worked… I remember BF4 was unusable at certain points with Mantle, and SLI/Crossfire worked only sometimes: some patches yes, some others no, and only under DX11, as DX12 was also broken for Crossfire. I tested multiple times with Crossfired 290Xs and was never able to play that game stably in Crossfire. Sometimes the performance scaling was even negative versus a single GPU; that's when I sold the pair of 290Xs and just kept a single 280X for a long time.
 