XFX Radeon RX 590 Fatboy OC+ Video Card Review @ [H]

If this isn't the market you are in, why even bother taking the time to post? As if everyone at [H] didn't already know this wasn't a 1080/2070 competitor before the reviews even landed? What are we supposed to do, gather around and congratulate you on not giving a crap about the majority of gamers (those still using 1080p<->1600p)? This article was extremely relevant to the vast majority of the market at large, whether or not you bother to accept that as fact.

Your post was full of truth and fact. Not gonna debate you on this. I posted cause I'm butthurt LOL
 
For people concerned about the additional power used compared to a 1060, consider that the 590 uses about one additional kilowatt-hour per 8 hours of gaming. So, gaming an average of 3 hours a day, every day, means about 130-140 additional kilowatt-hours used per year, equating to $13-28 in additional electrical costs per year at $0.10 - $0.20 per kWh.
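For anyone who wants to sanity-check that arithmetic, here's a quick back-of-envelope sketch. The 125W figure is my own assumption matching the "one kilowatt-hour per 8 hours" above, not a measured number:

```python
# Back-of-envelope sketch of the cost math above. The 125W extra draw and
# 3 hours/day of gaming are assumptions matching the post, not measurements.
extra_watts = 125                                # ~1 kWh per 8 hours of gaming
hours_per_year = 3 * 365                         # 3 hours a day, every day
extra_kwh = extra_watts / 1000 * hours_per_year  # ~137 kWh/year

for rate in (0.10, 0.20):                        # $/kWh, low and high ends
    print(f"${extra_kwh * rate:.2f} per year at ${rate:.2f}/kWh")
```

That works out to roughly $13.69 to $27.38 per year, right in the $13-28 range quoted.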

Newegg is currently selling 1060's, with only a single free game, for about $250 - $330, excluding some pricing outliers on both ends. In general, and ignoring the better performance of the 590, you would need to game on a 590 for literally years before the 1060 would become the better price value.

It blows me away when I see how many people have literally no concept of this fact. Literally the only concerns I have when it comes to power draw are 1) heat raising the temperature of the room (moderate increase no doubt, but still), and 2) power supply requirements. It blows me away that people will spend extra on Nvidia due to “power efficiency”, seemingly oblivious of the fact that they will never recoup the cost differential during a normal upgrade cycle. It really blows me away.

That said, AMD does seriously need to fix that about their cards, particularly with the Vega series, which consumes an embarrassing amount of power compared to Nvidia’s offering.
 
Looks like a quality board and cooler.

That marketing name though. First my brain auto-corrected it to "Fratboy", and then once I got it right, I kept picturing the fedora neckbeard meme...
 

I'm pretty sure it's all just distraction. I doubt most of them put tens of thousands of their own dollars where their mouths are, like I did, on renewable energy generation for their home. I'm not buying a 590 right now because I don't need one, but the modest power requirements of video cards of today are of almost no concern to me. The difference between 150 and 250 watts is about what I was using just to light up a room around a decade ago. I think actually trying to minimize power use is quite the positive, but most people could save more electricity every day just by spending a few less minutes in the shower, running the water a little cooler, or even turning the water off during parts of it, than they'll save by having a 150 watt card instead of a 250 watt card and gaming 8 hours a day.
 
You can pick up anything cheaper used; I don't understand why people compare brand new cards to used ones. I can pretty much bet these will be down to 250ish in a month or two, once AMD clears out the RX 580s that have been selling for 200 in the days closing in on this product's launch, since they have had excess inventory of the older cards. I think AMD priced it to move more buyers toward the RX 580. They will probably drop these to 250 once they clear out a lot of the RX 580 cards, and likely make the 580 a 200 MSRP card when they do. I can already see it coming.

The relevance was that the 1060 having been on the market so long means there is a used market that doesn't exist right now for the 590. For someone who prefers to shop used to find a bargain, the 1060 is still an option. I understand it's still apples and oranges which is why I pointed out that the 1060 can also be found well under MSRP new. Once the 590 also drops to sub-MSRP pricing, it'll be a lot more fair to crown it the 1060 killer.
 

Yeah I agree. People need to realize that this isn’t a fridge or an air conditioner, where 30% power differences between brands are visible monthly. 30% power differences on a video card might work out to like $10 per year for most gamers in North America.
 

Most people will not run their GPU at full load 24/7; most of the time their GPU will be at idle.
 
We have an official response back from XFX on the default position of the BIOS switch for this video card.

"Just got confirmation that the position closer to the power connectors is the Quiet BIOS, and is the current default shipping position."

So there you have it, the default position is "Quiet" mode on this video card, which has the switch closer to the power connectors. That is how our card came shipped to us, that is how we tested it, and that is how the card will ship to everyone. Quiet mode will keep the fans at lower RPM, but it will not affect performance. If you want a more aggressive fan profile you can flip the BIOS switch over to "Performance" mode. The difference is about 10°C on GPU temp.

No data needs to be re-captured for the review. I will update Page 14 with this new information.
 
Room heat is real, and another 100w is felt. That's just the reality.

I endorse the formulation in this link. These are the load formulas I use to calculate room load for massive cooling systems for large building chillers.

Link to calculate room load based on 1000W of pure heat energy

The example link above uses a 1000W electric space heater to figure out how long it would take a closed room to rise 10 degrees. A 1000W computer is nearly as efficient at turning electricity into heat.

With the kind of minimal power levels we are talking about spread out over an entire house, you guys are basically nuts for worrying about the difference.

A typical AC in an average 2,500 sq ft home is about 6 tons. That's 21,101 watts of cooling power. At 500 watts, you are looking at a 1/2 degree F difference at most.

Even if your system ran a FULL kW more for 3 hours, you are looking at 72 cents per day in increased energy cost ($0.12/kWh to run + $0.12/kWh to cool = $0.24/kWh × 3 kWh = $0.72). You aren't even running close to these numbers today.
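As a rough illustration of why a sealed room warms up while a ventilated house barely notices, here is a sensible-heat sketch. The 10x12x8 ft room and air-only assumption are mine; it deliberately ignores the thermal mass of walls and furniture, which is why real rooms warm far more slowly than this worst case:

```python
# Sensible-heat sketch: time for 1000W to warm just the AIR of a sealed
# 10x12x8 ft room by 10°F. Assumptions are mine; real rooms rise far more
# slowly because walls and furniture absorb most of the heat.
room_m3 = (10 * 12 * 8) * 0.0283168   # room volume, ft^3 to m^3
air_kg = room_m3 * 1.2                # air density ~1.2 kg/m^3
cp = 1005                             # specific heat of air, J/(kg*K)
dT_kelvin = 10 / 1.8                  # 10°F expressed in kelvin
joules = air_kg * cp * dT_kelvin      # energy needed to warm the air
minutes = joules / 1000 / 60          # at 1000W of heat input
print(f"{minutes:.1f} minutes (air-only worst case)")
```

Around three minutes for the air alone; in practice the room's surfaces soak up most of that energy, which is exactly why proper building-load formulas are the right tool.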

You are overreacting.
 
Room heat is real, and another 100w is felt. That's just the reality.

"Oh no, Intel is letting its CPUs run as fast as possible! Someone stop them!"

This behavior is desirable. Particularly if the cooling solution and power delivery can handle it, which most can; hell, what if it's running in a cooler environment?

The key here is stability, and that has not presented itself to be an issue. You can quibble about the marketing but the performance stands.

So, when Intel blows past their advertised TDP (by 79W when tested by TechSpot), it's desirable. When AMD uses 120W, it's fucking world-ending.

You're not even worried about appearing unbiased.
 
Completely disagree. Age is completely relevant. $280-300 for a rehash of a product that came out 2.5 years ago and sold for $229 on release? That's not a win imho, but I'm not one to judge what people spend their money on. It's too close to the 1070, which completely obliterates it performance-wise imho. This should have been $229 like the original 480 8GB. Then it would have been a revolution.

The price could be better and I think we all expect it to drop closer to the $250 mark after a little time. What you're not considering is the fact that since 480 hit, we had the mining craze which moved the bar across the board on prices. Yes, it's coming down but the market pricing is still not where it was 2 years ago.
Man, looking back at the R9 Fury in this case: they should have grabbed that chip out of the EOL bin, slapped it on 12nm, and worked it with some GDDR5X and sent it out the door. That GPU with higher clocks wants to go fast and would have yielded better performance vs the Polaris refresh.

A Fury refresh, or at the very least an 8GB model, would have been really nice to see. I think a Fury refresh with higher clocks and 8GB HBM might have been too close for comfort to Vega 56 performance. AMD likely didn't want to pursue a route that could undercut their own product.
 
I wanted to ask Brent a question: in the taskbar AMD Radeon driver settings for Graphics profile, is this ever set to Optimize Performance for benchmarking, and what does it offer over the other settings?
 

No, we leave it at the default setting, same as all the other settings in AMD and NV drivers. We test the "out of box" experience. That means installing the video card and testing as is, with no hardware or software manipulation: installing drivers, and using the default driver configurations AMD and NVIDIA have set up. It's the only way to "fairly" compare hardware. I'll leave it up to AMD and NVIDIA to optimize drivers respectively, and then evaluate the experience that delivers. It also means our readers can set up hardware and software just like we do and compare results very easily.
 
Nice goal-post move!

I'm pointing out the difference in a room. Here, yes, 100w makes a difference. It will be felt. I don't game in a warehouse, kudos if you do.



No bias in physics ;)

I didn't do squat. I showed the math, and your assumption and understanding of it are weak.

Even if I narrowed it down to a 10x12 room you would go up a few degrees at most in a perfectly sealed room with no ventilation. Are you telling me this is your room with any seriousness?
 

100w will be felt.

Your math is simplistic. Sorry.

Ventilation still assumes a gradient, of which the user is physically on the hotter side.

And no, I'm not claiming that an extra 100w is 'too much', my sig should be clear evidence of that, but paying an extra 100w for very little performance gain does seem a bit silly. It's also established AMD engineering: they've totally blown their efficiency curves for the last five years at least.

open a window and get gaming dammit!

Nobody, nobody, wants that ;)
 
*Update 11/17/2018 - Power Testing*

I have some more follow-up power testing to share with you all. Prompted by the feedback on power draw in this thread, and this tweet, I spent time on Friday doing some more power testing with the XFX Radeon RX 590 Fatboy.


My goal was to downclock the RX 590 Fatboy to Reference Radeon RX 480 clock speeds and test power in comparison. The goal is to see how much the 12nm process is improving power efficiency.


VERY BIG DISCLAIMER: It is quite difficult to match the 590 perfectly to the 480. The Reference Radeon RX 480 video card does not hold a 1266MHz clock speed consistently when gaming. This video card has thermal throttle issues that cause the clock speed to fluctuate wildly while gaming. This fluctuation in clock speed does affect power draw, and ultimately it will not be as high as if the GPU were running a consistent 1266MHz. On the other hand, the XFX RX 590 can run 1266MHz consistently, so all performance is delivered at all times. This creates the first “difference” in power draw between the cards that is hard to align. Second, the GPU voltage runs differently between the GPUs. We found very different GPU voltages on the RX 590 at its default speed versus downclocked to 1266MHz, and both differed from the GPU voltage on the RX 480. This is the second major hurdle that is hard to align. There are also other hidden differences you must be aware of: memory voltage could be different (we don’t know), and there could be other power-regulated components that differ in draw between the boards themselves. Take the numbers shown below with a side of salt; I am not confident how meaningful they really are.


Reference Radeon RX 480 Video Card
Peak Wattage: 269W (averages in the 250’s)
VDDC: 1.1687V
Clock speed not consistent at 1266MHz.


XFX Radeon RX 590 Fatboy OC+ at Default Settings
Peak Wattage: 350W (averages in the 320’s-340’s)
VDDC: 1.2125V
Clock speed is consistent at 1580MHz.


XFX Radeon RX 590 Fatboy OC+ downclocked to 1266MHz (no other changes)
Peak Wattage: 283W (averages in the 270’s)
VDDC: 1.1500V
Clock speed is consistent at 1266MHz.


XFX Radeon RX 590 Fatboy OC+ downclocked to 1266MHz, plus voltage lowered to approximately 1.1687V to match the RX 480’s voltage (not exact, but close, keep in mind)
Peak Wattage: 256W (averages in the 240’s)
VDDC: Fluctuated but peaked at near to 1.1687V (not 100% precise)
Clock speed is consistent at 1266MHz.


The summary is thus: clock speed does make a big difference in wattage, but GPU voltage is an important factor, and it is different between the 480/580/590. The XFX Fatboy 590 runs a slightly higher GPU voltage than the RX 480, so that plus the increased clock speed means higher wattage. When we downclock the 590 to 1266MHz, voltage drops to 1.1500V, which is lower than the RX 480, but the power draw is still higher. OTHER FACTORS ARE AT WORK HERE, such as clock speed consistency and other unknown voltage differences in regard to board power. Keeping the voltage down to the same as the RX 480, along with the clock speed, results in the RX 590 achieving a lower wattage while gaming, a difference of around 13W at similar voltages and clock speeds.


There you go. I am personally not confident that this really changes anything or that it is really that meaningful. There are inconsistencies and unknowns that we just cannot align for exact measurements.
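One way to read those numbers is through the usual dynamic-power approximation, P ∝ f·V². A quick sketch (my own rough model, not part of the testing above; board power includes memory and VRM losses that don't scale this way, which is part of the "other factors" mentioned):

```python
# Crude P ~ f * V^2 scaling of the measured default peak down to 1266MHz.
# This only models GPU core dynamic power; memory and VRM draw do not
# scale with core clock/voltage, so the measured number sits higher.
p_default = 350                          # W peak at 1580MHz / 1.2125V
f_ratio = 1266 / 1580                    # clock ratio
v_ratio = (1.1500 / 1.2125) ** 2         # voltage-squared ratio
p_predicted = p_default * f_ratio * v_ratio
print(f"{p_predicted:.0f}W predicted vs 283W measured peak")
```

The ~30W gap between the scaled prediction (~252W) and the measured 283W peak is consistent with the fixed, non-scaling board power the disclaimer warns about.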
 

Nobody is going to sit there and claim amd is awesome at power efficiency compared to nvidia. We all know Pascal is more efficient.

That said, it's at best a trivial issue cost-wise and heat-wise, unless you put your face near the exhaust output of your computer case.

The ONLY time I experience heat is if I run my 850 watt supply while gaming inside a closed cabinet with only a small 5x5 vent hole at the rear and the fan turned off. My system still runs full tilt in my nearly closed-off writer's secretary. If and only if I do this can I feel heat pour out when I open the door. Luckily I have a small low-speed fan which is usually on, so this is never an issue. If you don't believe me I'll show you photos of how closed off it is.

As for the math being overly simplistic: these are well-accepted formulations used by thermodynamics engineers and architects for building load. These numbers are used to calculate how many tons of cooling you need for a building, and are backed by ASHRAE.
 
XFX Radeon RX 590 Fatboy OC+ downclocked to 1266MHz (no other changes)
Peak Wattage: 283W (averages in the 270’s)
VDDC: 1.1500V
Clock speed is consistent at 1266MHz.


XFX Radeon RX 590 Fatboy OC+ downclocked to 1266MHz, plus voltage lowered to approximately 1.1687V to match the RX 480’s voltage (not exact, but close, keep in mind)
Peak Wattage: 256W (averages in the 240’s)
VDDC: Fluctuated but peaked at near to 1.1687V (not 100% precise)
Clock speed is consistent at 1266MHz.
That's weird.
 
No bias in physics ;)

Never claimed there was, but you know that, it's why you didn't bother actually responding, instead favoring a pithy response.

YOU are biased. When Intel uses extra power for extra performance, it's "desirable". When AMD does it, it's a "blown efficiency curve".

You add nothing to this conversation; you have no interest in being an objective party. Instead of trying to talk out of both sides of your mouth, why not just stop talking for once.
 


I was under the impression that a die shrink was not as efficient if the design was not intended for the smaller process in the first place. Then again I guess AMD had plenty of time to properly adapt Polaris for 12nm.

I can't remember any examples of a "pure die shrink" on a GPU. Is this the first time?
 
This opens so many doors... Imagine a RTX 2080 Ti ThiccGurl Edition. XD
 

Take a look at Zen vs Zen+, the same kind of die shrink. It is not even known where this 12nm version comes from, and if you look at the data it does not matter either. Polaris is not a greatly designed GPU; it does what it needs to do and that is it.
 
The award is based on performance, product positioning, gameplay experience delivered at 1080p and 1440p compared to the competition, and being the performance leader in this price segment, not AMD's business practices. Point of contention: it is a genuine refresh, not a rebrand. A new manufacturing process and raised clock speeds make it a refresh in the true sense of the word. A rebrand would be just a renaming with no change. The RX 590 definitely has changes over the RX 580, as a fact.

The key statement there is 'compared to the competition'. Otherwise, as a ~5-7% improvement over the 580 being tested, I was underwhelmed. It will be interesting to see what Nvidia does with pricing leading up to the holiday season, with their stock in decline and increased pressure on the mainstream market segment. If they were able to deliver a more competitive product in the sub-$300 category, it would be a serious blow to the next year of AMD's product stack.
 
I think the 1070, 1070ti, vega 56 would be much better investments for not much more. They have all been dipping under $350 and seem to be a pretty nice bump up in performance. The card itself looks good. I think the midrange is getting interesting with the dropping prices of the previous higher end.
 
Seems like a nice card for a reasonable price if you are interested in the free games that come with it!
I kinda hope they sell a lot of them so Nvidia can't sell their leftover 1060/1070s without deep discounts.
 
I think the 1070, 1070ti, vega 56 would be much better investments for not much more. They have all been dipping under $350 and seem to be a pretty nice bump up in performance. The card itself looks good. I think the midrange is getting interesting with the dropping prices of the previous higher end.
Don't forget we're not all Americans. Slightly north of you, it is $350 for a 580 ($400+ for a 590) and $550-600+ for the others you mentioned.
 
If the Vega 56 deals for $349 keeps up, I don't see the point of the 590....these sales have been going on for some time now, $70 for whole other level of performance, if you're passing on that, that bag of crack rocks you're smoking must be mighty tasty.
Exactly this, if you can take advantage of it. For most it's not a huge stretch, but remember not all can make that jump, especially when this settles down to 250 or lower.
Also, specials can't be counted on. I picked up some TridentZ RGB CL15 DDR4 3000 (16/3200 with a setting change) for $134 @ 16GB. Try getting the 3200 sticks at that price now; not happening. The 3000s will sell out soon too.

So with this in mind, I picked up NKD's barely used (no-volts OC and heat-spot etc. pre-tested + TR tested and benched only) V64 for 375... + shipping to my part of the world. Grateful for that deal; I have been waiting to pull the trigger for over a year now and I don't see a better deal than that for a while, plus stocks usually get low well before Navi 10 comes by, which will probably be V56 range. It's nearly a year off.
They already upped the price $40 from the $400 deal again and threw games in. Games are great (would love res2) but not worth shipping to me, plus I'm a very, very patient gamer (will get gta5 next special.. lol).
 
I see 12nm Vegas or possibly 7nm Vegas in q2 2019.

Lisa already said no 7nm Vega for gamers. They didn't do 12nm for datacentre, so they sure as fuck won't do 7nm in just the same vein. All you'll get will be Navi 12 or 10 or whatever it is now, around a V56.
This is why I pulled the trigger on a V64 that dipped below 400; stock will be getting lower, so in coming months it'll be harder to get a cheap new card. They are not being restocked in my country either (and are still clearing out stock at 700USD lol).
 
Don't forget we're not all Americans. Slightly north of you it is $350 for a 580($400+ 590) and $550-600+ for the others you mentioned.
Ah damn I always forget that's the case for a lot of places. Thanks for the reminder, the 590 would make a lot more sense in that situation.
 

In all fairness they were/are selling the 64 for $399 on ebay:
https://www.ebay.com/itm/SAPPHIRE-R...h=item5914a7ec17:g:hMEAAOSw0QBbyyjA:rk:4:pf:0

the 56 is still $349 with games, we are going on week 3 of this now:
https://www.newegg.com/Product/Prod...=vega 56&cm_re=vega_56-_-14-930-006-_-Product

Canadian pricing is bizarro, but I found a 580 for $279 with 2 games; when you do the math it's only $212 USD, which isn't too far off from what we are paying. The 590 still makes no sense at $399 Canadian... Still waiting for that compelling reason why it's a good buy. I imagine someone from Uzbekistan is going to tell me a story, but I will wait for that one.
https://www.newegg.ca/Product/Produ...cription=580&cm_re=580-_-14-131-720-_-Product
 
As an eBay Associate, HardForum may earn from qualifying purchases.
Room heat is real, and another 100w is felt. That's just the reality.
If you are Harry Potter, perhaps.
If you're buying GPUs and 100W is actually an issue all of a sudden, then you should look at better ventilation, or run an externally dumped loop if you're actually [H].

Don't forget we're not all Americans. Slightly north of you it is $350 for a 580($400+ 590) and $550-600+ for the others you mentioned.
Not really an excuse; import it yourself from a trusted [H]'r and be creative.

That's weird.
Could be starting to see memory power use..



Lol yeah, very true. I just find specials are not always reliable - last time I looked, the $400 V64s were gone on the egg, but that V56 with the 3x decent games is a hell of a deal. Much better than a V64 @ 440 with 3x games TBH...
Got me realizing they could drop $40 off the V56 price one day too with that in mind, although it's probably a subsidised loss leader. I remember people working the BOM out and there isn't much margin in it at the V64 price, let alone V56.
 
Lisa already said no 7nm Vega for gamers. They didn't do 12nm for datacentre, so they sure as fuck won't do 7nm in just the same vein. All you'll get will be Navi 12 or 10 or whatever it is now, around a V56.
This is why I pulled trigger on a V64 that dipped below 400, stock will be getting lower, so in coming months it'll be harder to get a cheap, new card. They are not being restocked in my country either (and are still clearing out stock at 700USD lol).

NVIDIA also claimed no new cards were coming about 1 month before the 20xx series was announced. You endanger sales of the current product stack with an early release of next-gen products.

The amount of money invested to create a 7nm Vega cannot be justified for the pro market alone. There will be an excess capacity of Vegas @ 7nm for sure.
 
On the conclusion page, in case you want to correct it.

"but GPU voltage is an important factory"
 