Fury X voltage control coming soon

Best post over there.
[image: qWYKdxN.png]
 
Wow! The power draw increase is massive for barely any additional performance. Looks like AMD doesn't have much headroom beyond the stock clocks:

As you can see, power ramps up very quickly, much faster than maximum clock or performance. From stock to +144 mV, the power draw increases by 27%, while overclocking potential goes up only by 5%, and real-life performance increases by only 3%.


Looking at the numbers, I'm not sure if a 150W power draw increase, just to get an extra 3 FPS, is worth it for most gamers.
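
To put that trade-off in perspective, here's a quick back-of-the-envelope sketch in Python, using only the +27% power / +3% performance figures quoted above:

```python
# Back-of-the-envelope perf-per-watt check, using only the percentages
# quoted from the TechPowerUp write-up above.
power_scale = 1.27  # power draw at +144 mV relative to stock (+27%)
perf_scale = 1.03   # real-life performance relative to stock (+3%)

efficiency_change = perf_scale / power_scale - 1
print(f"Perf-per-watt change at +144 mV: {efficiency_change:+.1%}")
# -> about -19%: you give up nearly a fifth of the card's efficiency for ~3 FPS
```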
 
I'm not on Reddit right now, too much going on at work, but I thought this would be useful here. Link?
 
That is an impressive collection of video cards.

I'll reserve judgment until I see more results with the uncorked voltage. A bit ironic that Fury seems bandwidth-starved despite the massive 512 GB/s from HBM.
 
I will wait to see what AMD's own unlocks and drivers do. I have a feeling there are some major driver optimizations and voltage/clock unlocks to come that will at least make the Fury X OC competitive against the 980 Ti and possibly match the Titan X in some cases. I mean, Nvidia has had almost a year of optimizations with their current gen, so AMD is lagging behind in that area (while also being short-staffed).
 
I will wait to see what AMD's own unlocks and drivers do. I have a feeling there are some major driver optimizations and voltage/clock unlocks to come that will at least make the Fury X OC competitive against the 980 Ti and possibly match the Titan X in some cases. I mean, Nvidia has had almost a year of optimizations with their current gen, so AMD is lagging behind in that area (while also being short-staffed).
You can't really "optimize" how the silicon responds to voltage like you are suggesting. The best they can do through software is optimize power consumption or improve 3D performance through drivers. They can't improve GPU overclocking headroom through drivers.
 
If AMD releases the unlock and the results are the same as this link suggests, i.e. poor overclocking, then I will NEVER buy an AMD card again. If they lied, then I fell for it... but NEVER again.
 
If AMD releases the unlock and the results are the same as this link suggests, i.e. poor overclocking, then I will NEVER buy an AMD card again. If they lied, then I fell for it... but NEVER again.

What unlock are you talking about? AMD has never had, and will never have, voltage control in CCC. All voltage overclocking on AMD and Nvidia cards is done through third-party applications. Still, this is one card and, so far, just one game. We'll have to wait and see what the silicon lottery looks like for this card.
 
If AMD releases the unlock and the results are the same as this link suggests, i.e. poor overclocking, then I will NEVER buy an AMD card again. If they lied, then I fell for it... but NEVER again.

I especially liked the part where they called it an overclocking monster on stage with a straight face.
 
Looks like there is a built-in hardware safeguard against overvolting; I doubt these "findings" mean anything.
 
Looks like there is a built-in hardware safeguard against overvolting; I doubt these "findings" mean anything.

Overvolting is working. If not, then how do you explain the increased power consumption? This is news straight from W1zzard; he isn't some amateur just making things up or fiddling around with tricks.
 
It looks like you get 50% of your OC percentage as a real-world performance increase.

10% OC = 5% perf boost, 20% OC = 10% perf boost.
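
As a quick sketch of that rule of thumb (the 0.5 factor is just this thread's observation, not anything official):

```python
# Rule of thumb from this thread: real-world gains run at roughly half the
# core overclock percentage. The 0.5 factor is an observation, not official.
def estimated_perf_gain(oc_percent, scaling=0.5):
    """Estimate real-world performance gain (%) from a core OC (%)."""
    return oc_percent * scaling

for oc in (10, 20):
    print(f"{oc}% OC -> ~{estimated_perf_gain(oc):.0f}% perf boost")
```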
 
Overvolting is working. If not, then how do you explain the increased power consumption? This is news straight from W1zzard; he isn't some amateur just making things up or fiddling around with tricks.

I know, I'm not saying W1zzard isn't doing it correctly. I'm just wondering if AMD put in a hardware safeguard to keep from frying the chip or VRMs: the card draws more power but throttles to protect itself.
 
I know, I'm not saying W1zzard isn't doing it correctly. I'm just wondering if AMD put in a hardware safeguard to keep from frying the chip or VRMs: the card draws more power but throttles to protect itself.
You would see that reflected in the GPU core clock speed. If they just throttled the voltage back and didn't throttle the clock speed, the GPU would crash.
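
Purely for illustration, here's a toy model in Python of how a power-limit throttle would show up in the core clock readout instead of as a crash; the power limit and the constant k are made-up values, not AMD's actual governor:

```python
# Toy model, NOT AMD's actual algorithm: dynamic power grows roughly with
# V^2 * f, and once it exceeds the board limit the governor pulls the clock
# down. POWER_LIMIT_W and k are made-up illustrative values.
POWER_LIMIT_W = 300.0

def sustained_clock_mhz(requested_mhz, core_voltage, k=1.9e-7):
    power_w = k * core_voltage**2 * requested_mhz * 1e6  # crude P ~ V^2 * f
    if power_w <= POWER_LIMIT_W:
        return requested_mhz                  # inside the limit: full clock
    return requested_mhz * POWER_LIMIT_W / power_w  # over it: clock drops

print(sustained_clock_mhz(1050, 1.20))  # 1050 MHz: within the limit
print(sustained_clock_mhz(1050, 1.35))  # ~866 MHz: visibly throttled
```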
 
His GTA V results.

https://www.reddit.com/r/AdvancedMi...ry_x_fiji_voltage_scaling_techpowerup/cti3dx3

So I've spent the last 4 hours running GTA V....

It takes about 3 minutes to load, another 4 minutes to run. I need to do one warmup run and then the real testing run.

If the card is overclocked too high, it will either bluescreen or black screen. So no recovery, which means a reset or power off/on, start Windows, set clocks and voltages, start GTA V, and hope it works this time.

Why didn't I just reply "too much trouble, sorry" .. t.t

Edit: Finally got results

GTA V 4K

1050 / 500: 32.1 FPS

1195 / 500: 32.3 FPS

1194 / 560: 33.0 FPS
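
To make the scaling explicit, here's a small Python sketch over those three runs (clocks are core/memory in MHz). A ~14% core OC buys well under 1% FPS, while the ~12% memory OC accounts for most of the gain, which fits the bandwidth-starved theory earlier in the thread:

```python
# Relative gains from the GTA V 4K runs above (core MHz, memory MHz, FPS).
runs = [
    ("stock    (1050/500)", 1050, 500, 32.1),
    ("core OC  (1195/500)", 1195, 500, 32.3),
    ("core+mem (1194/560)", 1194, 560, 33.0),
]

_, base_core, base_mem, base_fps = runs[0]
for name, core, mem, fps in runs:
    print(f"{name}: core {core / base_core - 1:+.1%}, "
          f"mem {mem / base_mem - 1:+.1%}, fps {fps / base_fps - 1:+.1%}")
```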
 
Doesn't seem worth it at all. If these things get pushed like that just to gain a couple of frames, I see a shit ton of RMAs in the near future.
 
Feels like the architecture hits both its peak power/performance and its peak overall performance at default clocks. Something other than pure clock speed is holding back performance in there. Interesting results. Seems like a no-brainer to save your card and forget about the OC...
 
lol

Called it! These cards are overclocked from the factory, just like the Hawaii chips.

This is why they use an AIO cooler: because it's required.

AMD looks to be a year (at least) behind Nvidia in terms of architecture and performance per watt.
 
AMD flubbed this launch on almost every count.

The only hope for Fury is if it somehow leaps ahead of 980 Ti in DX12 games, though I personally see no reason NV won't massively benefit from DX12 as well.
 
Isn't Fury X at least competitive currently? I know they bragged about destroying the Nvidia cards, and clearly didn't, but they are similar in speed, no?
 
AMD flubbed this launch on almost every count.

The only hope for Fury is if it somehow leaps ahead of 980 Ti in DX12 games, though I personally see no reason NV won't massively benefit from DX12 as well.

WHAT DX12 games? By the time DX12 is relevant (years from now), Fury will be a distant memory and/or AMD will either be out of business or acquired.
 
lol

Called it! These cards are overclocked from the factory, just like the Hawaii chips.

This is why they use an AIO cooler: because it's required.

AMD looks to be a year (at least) behind Nvidia in terms of architecture and performance per watt.

Being maxed out from the factory just to be able to keep up was obvious to anyone who isn't emotionally attached to AMD or nursing an irrational bug up their ass about Nvidia. I've been saying it since day one, because so many signs and so much of AMD's strange corporate doublespeak pointed to it.

“You'll be able to overclock this thing like no tomorrow,” AMD CTO Joe Macri
 
WHAT DX12 games? By the time DX12 is relevant (years from now), Fury will be a distant memory and/or AMD will either be out of business or acquired.

Enjoy your $400 low-end Nvidia cards when that happens.
 
DX12 games should be coming out this year. It is not going to be years until adoption. It's used in Xbox One, so future console ports will have it.

Holiday 2015 according to this report http://www.vg247.com/2014/03/20/dir...he-market-around-holiday-2015-says-microsoft/


So much here is incorrect and outdated. The highly customized version of "DX12" on the Xbone will have no parallel to the Windows version except marketing speak.

As long as there is still a sizeable pool of Windows 7 machines, the development target will remain DX11. "But free Windows 10 upgrade" doesn't change that; many people won't install a new version of Windows, for many different reasons.

Once again, it will be years before there's a critical mass of developers targeting DX12. The guy at the eye of the storm of DX12/Vulkan, Johan Andersson at DICE, who helped create Mantle with AMD and is at the leading edge of infusing Frostbite with DX12/Vulkan, is saying Holiday 2016 at the earliest for any EA titles with it. More conservative developers seeking maximum ROI to hit the widest number of PCs will drag their feet on DX12 even longer.
 
That's pretty weird how the clocks give no extra performance... something else must be at play. Throttling?

Personally, I find a single Fury X isn't so impressive. Two with CFX scaling is where they really shine, though, and that's where my money is going.
There's simply no way for me to stick to the one-card, one-screen mantra this time around when I have been spoilt by 4K at 40"+... once you see it, you can't go back :(
 
Yeah, almost no increase with increased clocks... something is not right there.

What about lower resolutions?

Could be a current driver limitation as well.
 
WHAT DX12 games? By the time DX12 is relevant (years from now), Fury will be a distant memory and/or AMD will either be out of business or acquired.

Detroit Doctor
Ashes of the Singularity
 
I'd be curious to see if the results were any different at 1080p or 1440p. I'm wondering if 4K is hitting a bottleneck, either with ROPs or VRAM...

But either way this isn't promising.
 
Well, The Witcher 3, Batman, and Project Cars are all getting the DX12 treatment. Fable Legends is coming out with DX12. All of the major game engines have a DX12 path now. Personally, I think Nvidia and AMD video cards will benefit equally from DX12, as will Intel and AMD CPUs.

Fury is what it is because of the physical space constraints due to the HBM taking up a large amount of real estate.
 
Well, The Witcher 3, Batman, and Project Cars are all getting the DX12 treatment. Fable Legends is coming out with DX12. All of the major game engines have a DX12 path now. Personally, I think Nvidia and AMD video cards will benefit equally from DX12, as will Intel and AMD CPUs.

Fury is what it is because of the physical space constraints due to the HBM taking up a large amount of real estate.

Were any of those three actually confirmed to get DX12? I know there were rumors for Batman and The Witcher 3, but I don't remember seeing anything official from the devs. Considering all the work WB is putting into making AK work, I wouldn't expect them to go through the effort of adding DX12 to the game anytime soon, if at all.
 
Enjoy your $400 low-end Nvidia cards when that happens.

It's going to be just like Intel's monopoly. Even with their hold on CPUs for the past 6 years or more, they ended up lowering the cost of a 6-core CPU from $580 to $400. Even Fry's has a deal right now that makes the 5820K cheaper than a 4790K.

If AMD could actually compete, the possibilities for what Nvidia/Intel would have to dish out would be endless. Sadly, though, we have to deal with two giants and one flailing whale of a competitor: a competitor that says one thing about its products and then, when the time comes to show what it offers, fails miserably.

"An Overclockers Dream"
 
I really like AMD; I've had many of their cards along with Nvidia ones. I like to support the underdog like anyone else, but this release sucked balls.
I understand now how a Cleveland sports fan feels: constantly missing the mark and hoping there is something "next year" when it never seems to happen.

Picture appropriate:
[image: 1351227739002_7339282.png]
 
It's going to be just like Intel's monopoly. Even with their hold on CPUs for the past 6 years or more, they ended up lowering the cost of a 6-core CPU from $580 to $400. Even Fry's has a deal right now that makes the 5820K cheaper than a 4790K.

If AMD could actually compete, the possibilities for what Nvidia/Intel would have to dish out would be endless. Sadly, though, we have to deal with two giants and one flailing whale of a competitor: a competitor that says one thing about its products and then, when the time comes to show what it offers, fails miserably.

"An Overclockers Dream"

I remember the top-end CPUs only being a few hundred, not $500+ (except the "Extreme" rip-off ones).

Look at the performance "gain" from the last few years of Intel CPUs. It's been tiny gains every year, even with massive chip-size reductions. Hell, [H] uses a 3770K for all its testing.
 
I remember the top-end CPUs only being a few hundred, not $500+ (except the "Extreme" rip-off ones).

Look at the performance "gain" from the last few years of Intel CPUs. It's been tiny gains every year, even with massive chip-size reductions. Hell, [H] uses a 3770K for all its testing.

Part of that is due to Moore's Law being dead, or at least hitting a massive wall of diminishing returns. The megahertz race hit that wall big time, and now the core race has as well. As long as we're using silicon, we will likely never again see those kinds of huge jumps between years. We've hit that wall with GPUs as well. The days of 100% increases generation over generation are dead.
 