AMD Fury series coming soon.

GCN needs some serious improvements. Even with HBM and water cooling (which shaves off a few more watts thanks to the much lower GPU temperatures), it still draws about 30 W more than a Titan X for similar performance.
 
also, trying to pull up my post history results in a forum error, if any of the mods wanted to know that lol

edit: seems to be fixed now
 
GCN needs some serious improvements. Even with HBM and water cooling (which shaves off a few more watts thanks to the much lower GPU temperatures), it still draws about 30 W more than a Titan X for similar performance.

I suspect we'll see that next year with the 14 nm shrink. That is, I think we'll see a GCN 2.0 rather than a 1.x.
 
GCN 3.0; today's hardware already has GCN 2.0.

Sure, I guess. A name is meaningless, but what I mean is that we will see a large architectural change at 14 nm, be it 1.2/1.3 to 2.0, or 2.0 to 3.0 (or -3.5 to 123.456 for that matter).
 
GCN needs some serious improvements. Even with HBM and water cooling (which shaves off a few more watts thanks to the much lower GPU temperatures), it still draws about 30 W more than a Titan X for similar performance.

Well, there you have it. A single review showing Fury power use within 30 W of Titan X? Still not good enough, and AMD should just hurry up and die already.

I'm glad to see Kyle is finally starting to crack down on all the shit this launch has caused. Hopefully you'll keep running your mouth and earn yourself a vacation.

In the meantime, thanks for your concern; the Nvidia section is that way -->
 
Being that it is $350 less, IT IS.

I think that award goes to the 980 Ti at this point.

If the Fury X costs the same as the 980 Ti, performs the same at stock, but gets dog-walked when overclocked, then the 980 Ti is the Titan X killer.
 
I hope the benches, when released (current driver, all that rot), are very competitive.

Doesn't matter if people here are red team or green; close benchmarks will make for lower prices.
 
I think that award goes to the 980 Ti at this point.

If the Fury X costs the same as the 980 Ti, performs the same at stock, but gets dog-walked when overclocked, then the 980 Ti is the Titan X killer.

But, but, but, 300+ W from a card is WAYYYYYYYYYY too much heat to be putting out from your PC, a la the anti-Hawaii argument.
 
Well, those are far better benchmarks than VmodTech's.

If it's within 5% of a Titan X at 4K in most games, then mission accomplished I guess.

That would be a rather optimistic viewpoint, and it runs into the same issue that left people confused about the relative lack of interest in the cut-price 290X versus the much more expensive GTX 980. The GTX 980 Ti will cause the same positioning problem for AMD that the GTX 970 does.

There have been some "doom and gloom" complaints about the Fury X (e.g. less VRAM, no HDMI 2.0, higher power consumption than GM200, etc.), but those are not actually the problem.

The actual issue is that the Fury X looks like it is going to perform roughly on par with the GTX 980 Ti (and comparisons of which is faster will get even more muddled once you introduce overclocking and custom models). Basically, even if it is "faster", it is not faster in a meaningful way, nor, at its price, will it offer a huge performance/price advantage (as the 4xxx/5xxx series did).

This means that, from a broader perspective, what sells the Fury X versus the GTX 980 Ti will lie outside of raw performance numbers and/or price/performance (given current prices).

Certainly AMD's Fury X has selling points of its own (e.g. the form factor and AIO cooling, AMD's own ecosystem specifics, etc.), but how will the market as a whole weigh those against Nvidia and its GTX 980 Ti?
 
GCN needs some serious improvements. Even with HBM and water cooling (which shaves off a few more watts thanks to the much lower GPU temperatures), it still draws about 30 W more than a Titan X for similar performance.
Why do people keep saying this? Physics doesn't work like that. Lowering temperatures in absolutely positively no way reduces power. A 1 kW space heater with a giant heat sink covering the entire surface area of a room's walls will only be a few degrees warmer than the room and pleasant to touch, whereas a more standard 1 kW space heater, with its heating coils in a small box glowing red hot, will melt your skin if you touch it, because the temperature of those coils is hundreds of degrees above room temperature. Both use the same power and both will heat the room at the same rate.

Seriously, take some thermodynamics classes.
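
For what it's worth, the heater comparison can be sketched in a few lines; this is just a back-of-the-envelope illustration with made-up thermal resistances, not real heater specs:

[CODE]
# Rough sketch: same power, different thermal resistance between the
# heating element and the room. Numbers are illustrative, not measured.

def surface_temp(power_w, ambient_c, r_theta_c_per_w):
    """Steady-state surface temperature: T = T_ambient + P * R_theta."""
    return ambient_c + power_w * r_theta_c_per_w

POWER_W = 1000.0    # both heaters dissipate 1 kW
AMBIENT_C = 22.0    # room temperature

# Huge radiating surface -> tiny thermal resistance (assumed value)
print(surface_temp(POWER_W, AMBIENT_C, 0.005))  # ~27 C, pleasant to touch

# Small glowing coil -> large thermal resistance (assumed value)
print(surface_temp(POWER_W, AMBIENT_C, 0.5))    # ~522 C, will burn you

# Either way, the full 1000 W goes into the room at the same rate.
[/CODE]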
 
Why do people keep saying this? Physics doesn't work like that. Lowering temperatures in absolutely positively no way reduces power. A 1 kW space heater with a giant heat sink covering the entire surface area of a room's walls will only be a few degrees warmer than the room and pleasant to touch, whereas a more standard 1 kW space heater, with its heating coils in a small box glowing red hot, will melt your skin if you touch it, because the temperature of those coils is hundreds of degrees above room temperature. Both use the same power and both will heat the room at the same rate.

Seriously, take some thermodynamics classes.

Can't argue with internet experts :p
 
Why do people keep saying this? Physics doesn't work like that. Lowering temperatures in absolutely positively no way reduces power. A 1 kW space heater with a giant heat sink covering the entire surface area of a room's walls will only be a few degrees warmer than the room and pleasant to touch, whereas a more standard 1 kW space heater, with its heating coils in a small box glowing red hot, will melt your skin if you touch it, because the temperature of those coils is hundreds of degrees above room temperature. Both use the same power and both will heat the room at the same rate.

Seriously, take some thermodynamics classes.

I believe he is referencing semiconductor efficiency as it relates to temperature.
 
Why do people keep saying this? Physics doesn't work like that. Lowering temperatures in absolutely positively no way reduces power. A 1 kW space heater with a giant heat sink covering the entire surface area of a room's walls will only be a few degrees warmer than the room and pleasant to touch, whereas a more standard 1 kW space heater, with its heating coils in a small box glowing red hot, will melt your skin if you touch it, because the temperature of those coils is hundreds of degrees above room temperature. Both use the same power and both will heat the room at the same rate.

Seriously, take some thermodynamics classes.

well the "theory" (not so much theory, actually) is that the hotter these GPUs get, the more power "leakage" occurs, and thus needs to consume more energy as it isn't using energy at a 1:1 ratio to what is actually being generated into math.

the cooler these chips are kept, the less leakage occurs, and thus slightly lower power consumption

there was an article just the other day posted in this thread about this, i can't be bothered to dig it up
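
Roughly, the shape of that claim looks like this; the leakage model and every constant below are illustrative assumptions, not measured GPU data:

[CODE]
import math

def total_power(p_dynamic_w, p_leak_ref_w, temp_c, ref_temp_c=25.0, k=0.02):
    """Crude model: leakage grows roughly exponentially with die temperature.
    k and the reference leakage figure are assumptions for illustration only."""
    p_leak = p_leak_ref_w * math.exp(k * (temp_c - ref_temp_c))
    return p_dynamic_w + p_leak

# Same workload (same dynamic power), different die temperatures:
print(total_power(220.0, 20.0, 50.0))   # water-cooled die at 50 C -> ~253 W
print(total_power(220.0, 20.0, 85.0))   # air-cooled die at 85 C  -> ~286 W
[/CODE]

Same chip doing the same work, but the cooler die draws a bit less from the wall because the leakage term is smaller.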

I believe he is referencing semiconductor efficiency as it relates to temperature.

what this guy said ^
 
LoL, arguing that leakage can't happen because of thermodynamics...
I'm losing hope for this forum.

*puts hands together tapping finger to finger in an ascending order*

 
LoL, arguing that leakage can't happen because of thermodynamics...
I'm losing hope for this forum.




Semiconductor efficiency can't be measured while also measuring temperature?

Really? It's impossible to measure the correlation?
 
AMD has never given out GCN version numbers; it is the media who have opted to number GCN iterations between GPUs to denote architecture upgrades within GCN.

Eh, depends on who you are talking to and what documentation you are reading.
They have certainly muddied the waters with GCN.
 
Why do people keep saying this? Physics doesn't work like that. Lowering temperatures in absolutely positively no way reduces power. A 1 kW space heater with a giant heat sink covering the entire surface area of a room's walls will only be a few degrees warmer than the room and pleasant to touch, whereas a more standard 1 kW space heater, with its heating coils in a small box glowing red hot, will melt your skin if you touch it, because the temperature of those coils is hundreds of degrees above room temperature. Both use the same power and both will heat the room at the same rate.

Seriously, take some thermodynamics classes.

I'm speaking about semiconductor physics, not thermodynamics.
 
Why do people keep saying this? Physics doesn't work like that. Lowering temperatures in absolutely positively no way reduces power. A 1 kW space heater with a giant heat sink covering the entire surface area of a room's walls will only be a few degrees warmer than the room and pleasant to touch, whereas a more standard 1 kW space heater, with its heating coils in a small box glowing red hot, will melt your skin if you touch it, because the temperature of those coils is hundreds of degrees above room temperature. Both use the same power and both will heat the room at the same rate.

Seriously, take some thermodynamics classes.



You are forgetting resistance and heat: V = IR (Ohm's law). In semiconductors, as the temperature of the silicon goes up, resistance increases; increased resistance increases leakage, and to overcome the signal loss the chip has to draw more power.

So if you can keep a chip cooler, e.g. with water cooling, it will have less leakage and hence use less power than at a higher temperature.
 
In semiconductors, an increase in temperature actually decreases resistance (hence the existence of thermal runaway).

The power dissipated by the silicon is VI (voltage × current), so the hotter the chip is, the more heat it generates due to increased current, which leads to even more heat, eventually destroying the chip (thermal runaway in a nutshell). In a typical conductor, resistance increases with temperature, which decreases the current, so it eventually balances itself out.

Running a silicon chip too hot can result in catastrophic damage, which is why chip cooling is so important.
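
That feedback loop is easy to sketch as a fixed-point iteration: the die temperature sets the power (modeled here with a simple exponential, leakage-style term), power times thermal resistance sets the next temperature, and if the loop gain gets too high there is no stable operating point. All constants are made up for illustration:

[CODE]
import math

def power_at(temp_c, p_dynamic_w=200.0, p_leak_ref_w=30.0, k=0.02, ref_c=25.0):
    # Power rises with temperature through the leakage term (assumed model).
    return p_dynamic_w + p_leak_ref_w * math.exp(k * (temp_c - ref_c))

def simulate(r_theta_c_per_w, ambient_c=25.0, steps=50):
    temp = ambient_c
    for _ in range(steps):
        # Each pass: current temperature -> power -> new steady-state temperature.
        temp = ambient_c + r_theta_c_per_w * power_at(temp)
        if temp > 150.0:
            return "runaway (no stable operating point)"
    return f"settles near {temp:.1f} C"

print(simulate(0.15))  # decent cooler: converges to a stable temperature
print(simulate(0.40))  # poor cooler: each pass adds more heat than the last
[/CODE]

With the better cooler the loop converges to a fixed temperature; with the worse one the temperature climbs past any reasonable limit, which is the runaway case.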
 