6 Generations of GeForce Graphics Compared

I saw where you were going before on dispersion and get your point. I can only go by experience, but I know the GTX 480 would make my room hotter faster than my 980 Ti (while getting above 90C faster). I believe, and again this is just anecdotal, it may have had something to do with the 480 heating up and going to full load quicker.

I think you might just be right. I would also assume that getting the hot air into the room faster with a good cooler lets the room's A/C carry it away faster. Thus you never reach the heat saturation point unless your A/C is inadequate, which adds to the perception that the card is running cooler.

Thanks for sharing your experience with the GTX 480. ;)
 
In the games where the 780 Ti was performing competitively with the 290X in their active years, that still persists. The drop in GPU optimizations for older GPUs on Nvidia's part, and the focus on the more recent technology and architecture, is what made every GCN card outperform its Nvidia counterpart in newer games. So you can find things like a 280X obliterating a GTX 780 in Far Cry 4, but then in AoS you find that a mere GTX 670 still competes with an HD 7970 just as it did in the past, and that's under DX12, which is where AMD always shows its "strong" points, isn't it? GPU and driver optimizations are what make a card competitive or not over time. AMD has used the same architecture from the HD 7000 series up to today, and even Polaris is just yet another revision of it. So don't be surprised if, with the launch of Pascal, Nvidia stops driver and GPU optimizations for older GPUs, making the 290X perform closer and closer to, and eventually outperform, its Nvidia counterparts. There will come a moment where you may see a 390X performing on par with the 980 Ti if AMD keeps using GCN as their main architecture; Nvidia has moved through Kepler, Maxwell, and Pascal since then.



This still applies to the case, actually. You want to compare the 290X directly to the 390X, but the only reason the 390X performs up to on par with the GTX 980 (STOCK) is that the 390X is a pre-overclocked 290X with more vRAM. In the latest Fury review from [H] you can find that it's the R9 Fury that is competitive with the GTX 980, not the 390X. The GTX 970 still competes with the R9 290/290X and the 980 with the 390/390X; nothing has changed. YET...


It highlights how forward-looking AMD's GPU architecture has been over the years: competitive not just at the time of launch, but long into the future as well, versus Nvidia chips with their much more short-lived performance.



And that pre-overclocked 290X in the 390X was also a good deal cheaper than the stock vanilla 980. Want overclocked 980s? Spend even more. The 980 was a bad choice, and plenty still chose it over a better option.
 
I think you might just be right. I would also assume that getting the hot air into the room faster with a good cooler lets the room's A/C carry it away faster. Thus you never reach the heat saturation point unless your A/C is inadequate, which adds to the perception that the card is running cooler.

Well, yes, more or less like that. The faster the heat is dissipated, the less impact on the overall room temperature, and a combination of factors has a greater or lesser effect on this behavior: type of GPU cooler, case location, case cooling, and so on. This is, of course, speaking entirely about the same type of card compared in different environments. To keep the conversation about GTX 480 versus GTX 980 Ti overall power consumption and energy wasted: I explained on the last page that the overall efficiency of the components has to be taken into consideration, because it isn't the GPU core alone that's consuming 250W. Every component on the card uses a share of that energy, which directly affects how the heat is dissipated and how it feels in the room.

Quick example: turn on a 60W light bulb, and after a certain period of time you will find the area around that light hot, and after a while it will have a direct effect on the room temperature, right? Then put a fan near that light bulb and wait the same period of time: you will find the area around the bulb cooler than before, and the effect on the overall room temperature will also be less noticeable, because that fan is helping dissipate the hot air much faster across the room. The same 60W are there, just less noticeable.
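A rough way to see the bulb analogy in numbers: a minimal steady-state sketch (all figures assumed purely for illustration) where the fan raises the convective coefficient h, lowering the surface temperature, while the power dumped into the room stays 60W either way.

```python
# Steady state: power = h * A * (T_surface - T_room), so the surface
# temperature settles at T_room + P / (h * A). The fan raises h; the
# 60 W delivered to the room is unchanged.

def surface_temp(power_w, h, area_m2, room_c):
    # Solve the steady-state balance for the surface temperature.
    return room_c + power_w / (h * area_m2)

POWER = 60.0   # W, same in both cases
AREA = 0.015   # m^2, assumed bulb surface area
ROOM = 25.0    # C, assumed room temperature

still_air = surface_temp(POWER, h=20.0, area_m2=AREA, room_c=ROOM)   # no fan
forced_air = surface_temp(POWER, h=100.0, area_m2=AREA, room_c=ROOM)  # fan on

print(f"still air: {still_air:.0f} C at the bulb surface")
print(f"with fan:  {forced_air:.0f} C at the bulb surface")
# Either way, the room still absorbs the full 60 W.
```

The h values are made up, but the point survives any choice of numbers: forced air only changes where the temperature gradient sits, not how much energy ends up in the room.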
 
What does it matter? What matters is what happens at a given time, when a person is ready to buy. As games change what they stress on a GPU, performance will change too based on those aspects; if a buyer doesn't understand how that will affect their purchase decision, that is a problem on their end. It also requires forecasting what game developers will do, so it's not entirely doable. It's great in hindsight, but in hindsight everything looks either good or bad depending on what you are looking for.

Also in hindsight, why didn't anyone do a comparative performance analysis of the G80 and G92 lines? They too had a similar longevity to the GCN family. But what you are getting at is this: they might last one generation more, but they're still staying in relatively the same performance bracket, even given a 10% swing in performance in their favor. On nV's side, the 780 Ti is a big enough step away from Maxwell 2 that nV would have had to focus its per-game optimizations on Maxwell 2 over Kepler. The good news for Maxwell 2 going to Pascal is that, outside of dynamic load balancing, the pipelines remain similar; the downside is that Maxwell 2 has quite a bit less compute performance, which is where games are going to push in the coming years.

AMD is still using GCN 1.2 pipelines, so it should be OK for them too, at least from a driver point of view.


Well, it should matter to everyone who does not upgrade their GPU every cycle. I get it if you drop top cash on the top GPU every year or less and just pawn off the old GPU; that would make more sense, and then who cares. But how many people do that versus people who hold onto their GPUs for 2-3 years? For them, how well a GPU holds up over time seems EXTREMELY relevant. All the people who were suckered into getting Kepler and Maxwell parts got hosed relative to people who went AMD. That's why such a comparison would be useful. The only people who would not want such a thing shown are Nvidia apologists who would not like that reality made more graphic with... graphics.
 
The problem is you don't know what is going to happen in the future. Banking on something that might or might not have legs for future games is a bad way to purchase graphics cards, not to mention anything else. Get what you need now and be happy with it; it's not like you can foresee what the developers will do and when they will do it (it's easy to see where things will go, but not the timing). What you are saying is the same thinking as when SLI first came out: people were saying it's great because they can throw another card in their system at a later date and it's all fine. But there is a caveat: games might not work with SLI. By then they have already spent money on a motherboard that supports SLI, a bigger power supply, etc., and then they find out it's totally useless because a next-gen card can do what SLI'd older-gen cards could do, without all the issues.
 
It highlights how forward-looking AMD's GPU architecture has been over the years: competitive not just at the time of launch, but long into the future as well, versus Nvidia chips with their much more short-lived performance.



And that pre-overclocked 290X in the 390X was also a good deal cheaper than the stock vanilla 980. Want overclocked 980s? Spend even more. The 980 was a bad choice, and plenty still chose it over a better option.

That's right, the GTX 980 was always a bad value card; nobody with half a brain will say the contrary.

It highlights how forward-looking AMD's GPU architecture has been over the years: competitive not just at the time of launch, but long into the future as well, versus Nvidia chips with their much more short-lived performance.

That depends on how far into the past you want to look. Can you say the same about the HD 6900 series today? Have they scaled as well as the HD 7000 series? Remember that both Fermi generations are present in that review, so it would also be fair to include both the HD 5000 and HD 6000 series, don't you think? In my opinion GCN is an awesome architecture; it has proven to be really scalable through many revisions, but that's not precisely due to forward looking. It's because AMD doesn't have the money to release a full range of products with each new generation (in fact, just a revision) of the GCN iteration, as Nvidia could do with Kepler, Maxwell, and probably Pascal, from mid-range to high-end. Instead they keep rebranding their old high-end one tier down, that's all: the HD 7970 rebranded as the 280X, the 280X failing to be replaced successfully by the 380X, the 290X rebranded as the 390X. The Fury series wasn't even a new architecture; they just brute-forced as many compute units as possible into one package and called it done. HBM is what truly helped the Fury series. Again, that's not forward looking; that's a lack of money to research and develop an architecture good enough to successfully replace the current GPU generation without resorting to rebrands and minor architecture improvements forever. Nvidia has the money and market share to research and position cards wherever they want, which is what let mid-range chips like the GTX 670/GTX 680, GTX 780, GTX 970/GTX 980, and GTX 1070/GTX 1080 be released as higher-end parts than what they truly are. The GTX 1080 has such a huge performance improvement over the card it directly replaces that it can be placed, priced, and launched at the high end, as the fastest and most powerful card on the market, when it's truly a mid-range chip; that's where it deserves to sit, the same as the GTX 970 and GTX 980 before it. That's not forward looking from AMD; that's a lack of true competition.
 
I like the comparison, but I do kind of wish the Titans were included in this, even if they are technically a different model, since they are the top of their class.
 
The problem is you don't know what is going to happen in the future. Banking on something that might or might not have legs for future games is a bad way to purchase graphics cards, not to mention anything else. Get what you need now and be happy with it; it's not like you can foresee what the developers will do and when they will do it (it's easy to see where things will go, but not the timing). What you are saying is the same thinking as when SLI first came out: people were saying it's great because they can throw another card in their system at a later date and it's all fine. But there is a caveat: games might not work with SLI. By then they have already spent money on a motherboard that supports SLI, a bigger power supply, etc., and then they find out it's totally useless because a next-gen card can do what SLI'd older-gen cards could do, without all the issues.


You can never know the future, but you can look at the past and make inferences. They are not ironclad certainties, but if every generation over the years has had steeper falloffs in newer titles compared to the competition, that's a sign of something.

Also, while it did not really show itself in DX11, the console connection does seem to favor AMD in terms of optimizations. We'll see how that pans out. I expect cross-platform releases to be much better at taking advantage of GCN going forward, especially now that the updated console variants are probably using some modified Polaris. Whatever hidden ISA changes they made to that chip are probably in the new consoles as well. Does that make a difference? No idea, but the days of CPU overhead and other handicaps on AMD GPUs are coming to an end in modern games... at least until Nvidia revamps GameWorks for DX12 and starts stacking the deck in favor of Pascal/Volta/etc. down the road.
 
I saw where you were going before on dispersion and get your point. I can only go by experience, but I know the GTX 480 would make my room hotter faster than my 980 Ti (while getting above 90C faster). I believe, and again this is just anecdotal, it may have had something to do with the 480 heating up and going to full load quicker.

From all the research I've done, watts is watts, but it's very possible that a 980 Ti might run at a lower "load" than a 480 did, depending on the application. I know when I play WoW my GPU hovers around 25% load, whereas mining, FurMark, etc. push it to 100%.
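A ballpark sketch of the "lower load" point, assuming (purely for illustration) that board power scales linearly between an idle floor and the rated full-load figure. Real power curves are not linear and the wattages here are assumed, so treat this as a rough estimate only.

```python
# Rough estimate of average board power as a function of GPU load,
# interpolating linearly between an idle floor and full-load power.

def est_power(idle_w, load_w, utilization):
    # utilization is 0.0 (idle) to 1.0 (full load); linear is a
    # simplification, real cards don't scale exactly this way.
    return idle_w + (load_w - idle_w) * utilization

# Assumed figures for illustration
wow_980ti = est_power(idle_w=11, load_w=250, utilization=0.25)    # light game
furmark_480 = est_power(idle_w=54, load_w=250, utilization=1.00)  # stress test

print(f"980 Ti in WoW (~25% load): ~{wow_980ti:.0f} W")
print(f"480 in FurMark (100% load): ~{furmark_480:.0f} W")
```

Same nominal 250W boards, but the application decides how much of that you actually draw, which matches the "watts is watts, but load varies" observation.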
 
You can never know the future, but you can look at the past and make inferences. They are not ironclad certainties, but if every generation over the years has had steeper falloffs in newer titles compared to the competition, that's a sign of something.

Also, while it did not really show itself in DX11, the console connection does seem to favor AMD in terms of optimizations. We'll see how that pans out. I expect cross-platform releases to be much better at taking advantage of GCN going forward, especially now that the updated console variants are probably using some modified Polaris. Whatever hidden ISA changes they made to that chip are probably in the new consoles as well. Does that make a difference? No idea, but the days of CPU overhead and other handicaps on AMD GPUs are coming to an end in modern games... at least until Nvidia revamps GameWorks for DX12 and starts stacking the deck in favor of Pascal/Volta/etc. down the road.

nV already has DX12 GameWorks, and now VRWorks, which is much ahead of AMD's LiquidVR, with full demos; they launched both with Pascal.

There are no optimizations for DX11 really; AMD's own driver overhead screwed them up. Yes, you have tessellation in GameWorks, which screws with AMD's polygon throughput, but they have a driver fix for that.
 
It's instructive to see that the 1080 has twice the performance of my last GPU, the 780 Ti.
It isn't only because of the improved Maxwell GPU on 16nm FinFET and the higher clock speeds; Kepler drivers were also crippled, and the performance gap had already opened up before.
GPU 2012 Benchmarks - Compare Products on AnandTech (480 is missing)

So why is the GTX 980ti considered a cool running card and the GTX 480 considered hot and unbearable? Faster fan to dissipate the heat into the room?
Well, if the physical hardware/architecture changes and the different power-saving/clock profiles didn't make a difference, then people would just be seeing an even bigger (and inefficient) GeForce Kepler.


I told you before, it's because of the competition. The R9 290, 390, and Fury all draw a shit ton of power with worse performance than the 970, 980, and 980 Ti.
Let's see: the R9 290 was $100 cheaper, after discount, than a 390 or a 970, and it performed at 90-95% of them at 1080p and 100% of the 970 at 1440p, while using a 1.5-year-old architecture with even better DX12 performance. The Hawaii PRO GPUs were released to compete with GK110, which they now beat (thanks to Nvidia drivers ;) ). The R9 390X was $100 cheaper than the GTX 980 while matching its performance, so the true equivalent is the R9 Nano, which uses the same amount of power as the GTX 980 while performing better in most games.


the 980 uses almost nothing compared to other cards and performs amazingly.
So the R9 Nano is even better: same price, outperforms it, and uses barely 20W more.

The 980Ti is where they went for maximum performance without compromise, but it still uses less than whatever AMD has.
No, it doesn't.

Why do I bring up the FX9370 and R9 290? Because it's funny how you focus so much on power consumption with those parts, while completely ignoring the differences in competition today versus back then.
Three years ago: a 780 Ti faster than a (throttling) R9 290X.
Now: a (throttling) R9 290 beating a 780 Ti.

That's what you mean, right?




What does it matter? What matters is what happens at a given time, when a person is ready to buy. As games change what they stress on a GPU, performance will change too based on those aspects.
Or they start gimping some image details via the driver to get higher performance, or just claim they need to disable a functional feature of an API because their hardware can't manage it.

If a buyer doesn't understand how that will affect their purchase decision, that is a problem on their end. And of course it also requires forecasting what game developers will do, so it's not entirely doable; it's great in hindsight, but in hindsight everything looks either good or bad depending on what you are looking for.
Well, when it uses GameNotWork it's good, but when it's optimized for DX12 it's bad, no?

Also in hindsight, why didn't anyone do a comparative performance analysis of the G80 and G92 lines? They too had a similar longevity to the GCN family.
How long has it been since the 8000/9000-series cards released? Because since then no Nvidia GPU has achieved that. Also, games back then were less demanding than now (except for Crysis).

AMD is still using GCN 1.2 pipelines, so it should be OK for them too, at least from a driver point of view.
Geometry processors, the GCP, added L2 cache, and improved ALUs are changes that are public, but there still isn't any public information about the architecture itself.

In the games where the 780 Ti was performing competitively with the 290X in their active years, that still persists. The drop in GPU optimizations for older GPUs on Nvidia's part, and the focus on the more recent technology and architecture, is what made every GCN card outperform its Nvidia counterpart in newer games. So you can find things like a 280X obliterating a GTX 780 in Far Cry 4.
It isn't only FC4 (which needed driver optimization to reach that performance); it's in many games from 2015 (the Maxwell release) and newer.

But then in AoS you find that a mere GTX 670 still competes with an HD 7970.
DX11 limiting draw calls, which are key in an RTS game, doesn't that make you wonder why?...

Just as they did in the past, and that's under DX12, which is where AMD always shows its "strong" points, isn't it? GPU and driver optimizations are what make a card competitive or not over time. AMD has used the same architecture from the HD 7000 series up to today, and even Polaris is just yet another revision of it. So don't be surprised if, with the launch of Pascal, Nvidia stops driver and GPU optimizations for older GPUs, making the 290X perform closer and closer to, and eventually outperform, its Nvidia counterparts.
So those GK owners who paid $550 for a midrange GPU, and 30-100% more for the flagship, should be OK with having paid for a card whose driver support won't be maintained as soon as the new architecture isn't much better than the old one or shares the same node?

There will come a moment where you may see a 390X performing on par with the 980 Ti if AMD keeps using GCN as their main architecture; Nvidia has moved through Kepler, Maxwell, and Pascal since then.
Why? They are just using Maxwell on a node shrink!



This still applies to the case, actually. You want to compare the 290X directly to the 390X, but the only reason the 390X performs up to on par with the GTX 980 (STOCK) is that the 390X is a pre-overclocked 290X with more vRAM.
A 1.5-year-old card matching/beating a new architecture...

There is no optimizations to dx11 really, AMD's own driver overhead screwed them up. Yeah you have tessellation in gameworks which screws with AMD's polygon through put but they have a driver fix for that.

Raja Koduri already said that DX11 performance was improved on Polaris, AND Polaris improves tessellation performance.

As for GCN 1/2/3, DX11 performance has been optimized in almost every driver release since Windows 10 and 15.7.1.
 
nV already has dx12 gameworks, and now VRworks which is much ahead of AMD's lliquid vr with full demos they launched both with Pascal.

There is no optimizations to dx11 really, AMD's own driver overhead screwed them up. Yeah you have tessellation in gameworks which screws with AMD's polygon through put but they have a driver fix for that.

Glad this is turning into a bash-AMD thread even though we are talking about Nvidia.
 
Seeing the 680, a BS midrange card at a high-end price, is just salt in the wound. That people continue to fall for it is mind-boggling to me.
 
Seeing the 680, a BS midrange card at a high-end price, is just salt in the wound. That people continue to fall for it is mind-boggling to me.
I was wondering when the thread was going to get a post on precisely this topic.

It did not disappoint
 
Guess I'm waiting for a 1080Ti now :cautious:
My brother, who is a very reliable source, confirms that the 1080 Ti will overclock to 2200MHz using a passive cooler and an orangutan blowing on it. There will be a special Nvidia Founders Edition that replaces the orangutan with a macaque monkey for SFF builds.
 
There is zero logic in some of those numbers. For instance, the 980 Ti is 40% faster than the 980 in Crysis 3 at 1920x1080 but only 20% faster at 2560x1600.
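For what it's worth, those percentages are just frame-rate ratios, and both can be internally consistent. Here's a quick sketch with made-up fps values (not the review's numbers) showing how the figures are computed.

```python
# "X% faster" from two frame rates: (fps_a / fps_b - 1) * 100.

def pct_faster(fps_a, fps_b):
    return (fps_a / fps_b - 1) * 100

# Hypothetical Crysis 3 frame rates, chosen only to illustrate
print(f"{pct_faster(70, 50):.0f}% faster at 1920x1080")   # prints 40%
print(f"{pct_faster(42, 35):.0f}% faster at 2560x1600")   # prints 20%
```

A gap that narrows at higher resolution usually points to the bottleneck shifting (memory bandwidth, fillrate, or even a CPU limit at the lower resolution) rather than an error in the numbers.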
 
On the GTX 480 vs GTX 1080 heat generation argument, I found some interesting thermal imaging pics to compare:

Thermal images of the GTX 480
Thermal images of the GTX 980 Ti
Thermal images of the GTX 1080

The thing that sticks out to me about how the 480 is different from the 980 Ti, despite similar power consumption at load, is the extreme difference in temps at idle: 57.8C for the 480 and 25C-35C for the 980 Ti.

Didn't Nvidia switch the cooler tech between the 480 and 980 Ti? Something to do with a vapor chamber?
 
My brother, who is a very reliable source, confirms that the 1080 Ti will overclock to 2200MHz using a passive cooler and an orangutan blowing on it. There will be a special Nvidia Founders Edition that replaces the orangutan with a macaque monkey for SFF builds.

Unless they put a color-changing LED on the monkey blowing the air, I really don't see a reason I should pay an extra $100. If I'm going to cough up that much money, I *demand* that my monkey blinks, damn it.
 
On the GTX 480 vs GTX 1080 heat generation argument, I found some interesting thermal imaging pics to compare:

The thing that sticks out to me about how the 480 is different from the 980 Ti, despite similar power consumption at load, is the extreme difference in temps at idle: 57.8C for the 480 and 25C-35C for the 980 Ti.

Huh. That rather explains why it was more helpful to have a side intake fan over the mid of my GTX 580 than it was to have it at the front of the case blowing directly at the end of the card.
 
The Nvidia cooler has obviously improved over the years, but the 980 Ti also downclocks EXTREMELY at idle; older cards like the 480 didn't downclock/undervolt all that much as far as I can remember.

Edit: looking at the TechPowerUp review, it seems the 480 did downclock a lot; however, idle power consumption is 54W vs a mere 11W for the 980 Ti.
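A back-of-envelope on what that idle gap means over time, using the 54W/11W figures above and an assumed 8 idle hours per day (the duty cycle and electricity price are assumptions, not measurements).

```python
# Yearly energy cost of the idle-power gap between the two cards.

IDLE_480 = 54.0     # W, GTX 480 at idle (TechPowerUp figure above)
IDLE_980TI = 11.0   # W, 980 Ti at idle
HOURS_PER_YEAR = 8 * 365  # assumed 8 idle hours per day

def kwh(watts, hours):
    # Convert watt-hours to kilowatt-hours.
    return watts * hours / 1000.0

diff = kwh(IDLE_480, HOURS_PER_YEAR) - kwh(IDLE_980TI, HOURS_PER_YEAR)
print(f"extra idle energy per year: {diff:.0f} kWh")
# At an assumed $0.12/kWh that's roughly $15 a year just idling, and
# every one of those watt-hours also ends up as heat in the room.
```

Small per hour, but it compounds, and it goes some way toward explaining why the 480 felt like a room heater even when it wasn't doing much.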
 