RX 590 spotted

Unless AMD has tech to make Crossfire work on every game out there (or, hell, even half of them) a dual GPU card would be dumb. They're expensive, they're incredibly hot, they require a lot of power, and they don't work in every situation. Dual-GPU cards have always been dumb, even when both companies were heavily pushing multi-GPU support in games.

Yeah, I'm not sure why people keep asking for a dual-GPU card when almost no new game supports it out of the box. Until they make a dual-GPU card that works out of the box, there is no point in it. Maybe one day they will get dual dies working on a single card in a way that just works with any software; until then, developers aren't likely to support it. So I'm not sure why it makes any sense whatsoever to make a dual-GPU card. They do have a pro card, but that's because pro applications support dual GPUs and it's helpful there. It's useless in new games.
 
Unless AMD has tech to make Crossfire work on every game out there (or, hell, even half of them) a dual GPU card would be dumb. They're expensive, they're incredibly hot, they require a lot of power, and they don't work in every situation. Dual-GPU cards have always been dumb, even when both companies were heavily pushing multi-GPU support in games.
I 100% agree. Doesn't stop my love for them. I'm definitely not the average consumer, though; I'm afflicted with a hardware addiction, I'm afraid. I would equate the ridiculousness of a 295X2 to a supercar that gets 5 miles per gallon. It's not practical, but it's cool.

It's all BS though, I get it.
 
The ONLY way they can get a dual GPU card work in every game is to perform some hack in the software or VBIOS to have the card be seen by the OS as a single card, and all the AFR/SFR logic is done on the actual card. Essentially, mGPU needs to be invisible to the OS.
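To make that concrete, here's a rough Python sketch of what "invisible to the OS" would look like. Every name here is made up for illustration; no real driver or VBIOS exposes an interface like this:

```python
# Rough sketch (all names hypothetical): a dual-die board that presents
# itself to the OS as a single GPU and handles alternate-frame rendering
# (AFR) internally.

class Die:
    """Stand-in for one GPU die's command processor."""
    def __init__(self, name):
        self.name = name

    def execute(self, command_buffer):
        print(f"{self.name} renders {command_buffer}")

class SingleCardFacade:
    """What the OS sees: one device. Frames alternate between dies."""
    def __init__(self, die_a, die_b):
        self.dies = [die_a, die_b]
        self.frame_index = 0

    def submit_frame(self, command_buffer):
        # AFR: even frames go to die 0, odd frames to die 1.
        die = self.dies[self.frame_index % 2]
        die.execute(command_buffer)
        self.frame_index += 1
        # The hard part (waved away here): any resource written last frame
        # lives on the *other* die, so frame-dependent effects (temporal AA,
        # motion blur) force inter-die copies. That is exactly where AFR breaks.

card = SingleCardFacade(Die("die0"), Die("die1"))
for f in range(4):
    card.submit_frame(f"frame {f}")
```

The closing comment is the whole problem: hiding the second die means the card itself has to manage cross-die data movement that today is left to the driver and the game engine.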
 
The ONLY way they can get a dual GPU card work in every game is to perform some hack in the software or VBIOS to have the card be seen by the OS as a single card, and all the AFR/SFR logic is done on the actual card. Essentially, mGPU needs to be invisible to the OS.

Yup. Their last thought, that they could just make medium-sized GPUs and stitch them together to compete, died in the ashes of the GTX 590 :D.

Their saving grace may be three-fold: Polaris is a decent raster engine, ray tracing is relatively easy from a design and software perspective, and VR will probably be the next real domain of multi-GPU. When they get the pesky software figured out...
 
A real question though:

If AMD can't compete on the high end, is it worth creating a flagship product even if it doesn't make sense/money?

Is it better to lose money creating an impractical dual-GPU flagship graphics card, or to back out of the high-end market completely for at least another year until 7nm Navi?

I'll never buy a Ford GT, but it sure does help sell me on a Ford Focus.

I believe AMD still needs its supercar, now more than ever.
 
A real question though:

If AMD can't compete on the high end, is it worth creating a flagship product even if it doesn't make sense/money?

Is it better to lose money creating an impractical dual-GPU flagship graphics card, or to back out of the high-end market completely for at least another year until 7nm Navi?

I'll never buy a Ford GT, but it sure does help sell me on a Ford Focus.

I believe AMD still needs its supercar, now more than ever.


This happens SO much at retail.

The RX 570 is in every way a better card than the 1050 Ti. It's upwards of 50% faster, it's cheaper, and it supports FreeSync; there is literally no reason you should buy a 1050 Ti.

But way more people buy the 1050 Ti, because Nvidia. Nvidia makes the fastest GPUs, so when you buy Nvidia, you're buying the fastest GPUs. AMD is seen as the 'alternative' brand, the 'bargain bin', 'great value', 'generic' graphics card.

The times when AMD had the fastest GPU were fleeting, and only by a few percent; it never had the knock-out win. Nvidia has the knockout technique, and honestly, I want to see AMD have the killer card, but I doubt it will happen.
 
there is literally no reason you should buy a 1050 Ti.

Hey, I bought one of these!

...to go in my server, to transcode Plex...

...because 4GB 1030s aren't 'a thing'.

But yeah, I'd be positioning a 1060 against an RX 570, not a 1050 Ti.
 
First of all, Vega is a different architecture than Polaris, so I don't even understand why you're going there. It's been discussed ad infinitum why Vega is a poor gaming chip, architecturally.

The point is to compare chips within the same architecture to derive performance summaries.

If you compare within Polaris 10/11 (RX 560, RX 570, RX 580), the scaling is quite linear to CU count, when normalized for core speed. If you compare Pascal GTX 1070 Ti, GTX 1080, GTX 1080 Ti, it's also quite linear.
 
The only factor that is even remotely capable of raising performance in a linear fashion is clock speed. More ALUs will not yield a linear increase in performance because the front-end can't keep them occupied at a 100% utilization rate. Harsh reality aside, math rate is not the only bottleneck for games. Bottlenecks shift even within the same frame. You could be limited by draw calls one moment and VRAM bandwidth the next.
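To put rough numbers on that shifting-bottleneck point, here's a back-of-the-envelope roofline check in Python. The peak figures are RX 580-class ballpark numbers and the two passes are invented, purely to illustrate how the binding resource flips with arithmetic intensity:

```python
# Back-of-the-envelope roofline check: which resource binds a render pass
# depends on its arithmetic intensity (FLOPs per byte of VRAM traffic).
# Peak figures below are RX 580-class ballpark numbers, illustrative only.

PEAK_FLOPS = 6.2e12   # ~6.2 TFLOPS FP32
PEAK_BW = 256e9       # 256 GB/s GDDR5

def limiting_factor(flops, bytes_moved):
    """Compare time attributable to math vs. memory; the larger one binds."""
    t_math = flops / PEAK_FLOPS
    t_mem = bytes_moved / PEAK_BW
    return "compute-bound" if t_math > t_mem else "bandwidth-bound"

# Two hypothetical passes within the same frame:
print(limiting_factor(flops=2e9, bytes_moved=50e6))    # shading-heavy pass: compute-bound
print(limiting_factor(flops=50e6, bytes_moved=200e6))  # blit-like post pass: bandwidth-bound
```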
 
The only factor that is even remotely capable of raising performance in a linear fashion is clock speed. More ALUs will not yield a linear increase in performance because the front-end can't keep them occupied at a 100% utilization rate. Harsh reality aside, math rate is not the only bottleneck for games. Bottlenecks shift even within the same frame. You could be limited by draw calls one moment and VRAM bandwidth the next.

Then please explain why the entire Polaris line scales very linearly with CU count. As for your latter points, the assumption is that VRAM bandwidth should increase linearly as well, which, for the most part, happens in the Polaris line.
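For reference, the spec-sheet arithmetic behind that linearity claim, using the published CU counts, boost clocks, and memory bandwidths for these cards (GCN packs 64 shaders per CU and does two FP32 ops per shader per clock):

```python
# Spec-sheet arithmetic behind the "linear with CU count" observation.
# GCN: 64 shaders per CU, 2 FP32 ops per shader per clock.

cards = {
    # name:   (CUs, boost MHz, GB/s)
    "RX 560": (16, 1275, 112),
    "RX 570": (32, 1244, 224),
    "RX 580": (36, 1340, 256),
}

for name, (cus, mhz, bw) in cards.items():
    tflops = cus * 64 * 2 * mhz * 1e6 / 1e12
    print(f"{name}: {cus} CUs -> {tflops:.2f} TFLOPS, {bw} GB/s")

# Output: ~2.61, ~5.10, ~6.17 TFLOPS. Bandwidth rises roughly in step with
# compute (112 -> 224 -> 256 GB/s), which is why measured game performance
# can track CU count so closely across the line.
```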
 
The only factor that is even remotely capable of raising performance in a linear fashion is clock speed. More ALUs will not yield a linear increase in performance because the front-end can't keep them occupied at a 100% utilization rate.

While there is likely a small amount of truth in this statement, it presupposes that the front-end of a GPU with a higher CU count would not be scaled up to match, and it ignores the great many products of varying core counts that show very linear differences in performance.

And at the solid mid-range segment we're looking at here, there really is no reason that the rest of a hypothetical up-cored Polaris GPU cannot be properly scaled to get full utilization of some additional CUs. Hell, that's the minimum we should expect. Should improvements be made to clock regions, cache sizes, or overall clockspeed as would be expected over time, we should really expect a bit more.
 
Mid-range has always been its own market, and miners took it over. Many of us have never sampled Polaris because prices scaled out of reach; now an RX 580 8GB with bundled games is $200 new, with great driver support, and the image quality was worth the wait for me paired with a Ryzen. The two vendors took different paths to mid-range: Polaris is its own design and can always scale up, while Nvidia's GTX 1060 is scaled down from above, and there's only so much they can add back without meeting the 1070 head-on. I see Polaris moving the bar up, and Navi could be the choke hold, since Nvidia would then need a new mid-range card that is its own breed of design to make money with cheap chips.

Vega is in every 2200G and 2400G, so it has found a home and fresh new life at baseline performance, and Nvidia can't play in that space.
 
Vega is in every 2200G and 2400G, so it has found a home and fresh new life at baseline performance, and Nvidia can't play in that space.

I read somewhere that "Vega" in the 2200G and 2400G is actually a Polaris chip. The collaboration with Intel is actually a Vega GPU (note the HBM2 stack).
 
Don't remember Fury costing less than $300. Can we wait for the price? lol. What if they launch this at $250 and drop everything else? Would that be a bad deal? I say hell no!
It didn't at launch. But I paid $259 brand new at the Egg for mine. After a $40 gift card it was $219. Felt like a steal to me.
 
