6 Generations of GeForce Graphics Compared

TechSpot has put together a comparison of six generations of GeForce graphics cards, from the GTX 480 all the way up to the new GTX 1080. This article is handy if you are upgrading from an old card like a GTX 680 or the like.

Now, with the release of Pascal, the time has come to revisit history and see how six generations of Nvidia GeForce graphics cards compare. To streamline testing we'll be sticking to DirectX 11 titles, which are supported by all GeForce series, old and new, so we can compare them accurately.

 
Had no idea that a GTX 980 Ti was just as power-hungry as a GTX 480. Mind blown. I bet that when Nvidia unleashes the full Pascal experience, it will pull just as much as the GTX 480, like the 980 Ti did. All this time people were telling me how much cooler and more energy-efficient the Maxwell cards were. Le sigh.

[Power consumption chart: Power_01.png]
 
Before someone says that the cooler on the GTX 480 was the reason it was hated so much, watch this video on case airflow. Skip to the 5-minute mark.

 
It's instructive to see that the 1080 has twice the performance of my last GPU, the 780 Ti.
 
Wow, I don't think I knew about the 780 Ti at all, or maybe it was just too pricey for me to care at the time...
 
Yeah, the 980 Ti has the same power draw as the 480, but it's also over 4x faster, which you cannot see in that chart.
They should have had a chart showing the same render load on each generation along with power drawn. You'd see that the 980 Ti would provide the same performance as the 480 (same render load) at a quarter of the power. That's where the "higher efficiency" comes in: each pixel pushed needs less power to generate.
Not to mention the fact that the 480 only had 1.5GB of memory while the 980 Ti has 6GB, which takes a bit of power too.
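
To put rough numbers on that, here is a minimal Python sketch. The performance figures are placeholders derived from the "over 4x faster at the same ~250W" claim above, not data from the article, and the linear load-to-power scaling is an assumption:

# Illustrative only: relative performance figures are placeholders based
# on the "over 4x faster at the same ~250 W" claim, not article data.
cards = {
    "GTX 480":    {"relative_perf": 1.0, "power_w": 250.0},
    "GTX 980 Ti": {"relative_perf": 4.0, "power_w": 250.0},
}

baseline = cards["GTX 480"]
for name, c in cards.items():
    perf_per_watt = c["relative_perf"] / c["power_w"]
    # Power needed to match the GTX 480's output (same render load),
    # assuming draw scales roughly linearly with load -- an assumption.
    iso_perf_power = baseline["relative_perf"] / perf_per_watt
    print(f"{name}: {perf_per_watt:.4f} perf/W, "
          f"~{iso_perf_power:.0f} W at GTX 480-level output")

Under those assumptions the 980 Ti lands at roughly 62W for GTX 480-level output, which is the "quarter of the power" figure. Perf per watt and power at iso-performance are two ways of stating the same efficiency gain.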
 
Yeah, the 980 Ti has the same power draw as the 480, but it's also over 4x faster, which you cannot see in that chart.
They should have had a chart showing the same render load on each generation along with power drawn. You'd see that the 980 Ti would provide the same performance as the 480 (same render load) at a quarter of the power. That's where the "higher efficiency" comes in: each pixel pushed needs less power to generate.
Not to mention the fact that the 480 only had 1.5GB of memory while the 980 Ti has 6GB, which takes a bit of power too.

So why is the GTX 980 Ti considered a cool-running card and the GTX 480 considered hot and unbearable? A faster fan to dissipate the heat into the room?
 
So why is the GTX 980 Ti considered a cool-running card and the GTX 480 considered hot and unbearable? A faster fan to dissipate the heat into the room?
I don't know. I never owned either card, so I have no way to reference or discuss their heat output. My only guess is that the 480 was considered "hot for the performance provided", thus a "less efficient" use of the power drawn. But really, I don't know.
 
I don't know. I never owned either card, so I have no way to reference or discuss their heat output. My only guess is that the 480 was considered "hot for the performance provided", thus a "less efficient" use of the power drawn. But really, I don't know.

Same shoes as you. I did own GTX 460s in SLI, a day-one release overnight shipped from Newegg. They didn't have nearly the power draw of a GTX 480, so I never put two and two together. I just accepted that the GTX 480 was a space heater from listening to owners.
 
Yeah, the 980 Ti has the same power draw as the 480, but it's also over 4x faster, which you cannot see in that chart.
They should have had a chart showing the same render load on each generation along with power drawn. You'd see that the 980 Ti would provide the same performance as the 480 (same render load) at a quarter of the power. That's where the "higher efficiency" comes in: each pixel pushed needs less power to generate.
Not to mention the fact that the 480 only had 1.5GB of memory while the 980 Ti has 6GB, which takes a bit of power too.


The 480 was on the old 40nm process node, which didn't even mature until the 580. The 980 Ti was on a very mature 28nm process node. Comparing apples to oranges here.
 
Probably have to go back another generation (GTX 285?) to see how the power usage compared with the 480. As I recall, power draw jumped a fair bit, but the performance improvements didn't seem to match the increased power use (kind of like Pentium 4 clocks increasing without much performance improvement back in the day, before the Core 2 Duo). I mean, the 480 was faster for sure, but it was considered a space heater. But my memory could be faulty.
 
So why is the GTX 980 Ti considered a cool-running card and the GTX 480 considered hot and unbearable? A faster fan to dissipate the heat into the room?

I'm not entirely sure, but if I recall correctly, the 480 ran very hot (a steady 90-95C under load) and the stock cooler was really loud. This heat caused a number of hardware failures (maybe not many, but enough for horror stories to spread on the interweb), which obviously gave the card a bad rep. And if I remember correctly, it wasn't even a huge leap in performance versus the previous gen (or maybe it was that AMD had better perf/watt at the time? Not sure now, but I remember it was basically not a very good deal).
 
I always heard it was best to upgrade every other generation, and based on this article, doing so nets you an average of 30fps at the same settings. Not a terribly bad rule of thumb.
 
So why is the GTX 980 Ti considered a cool-running card and the GTX 480 considered hot and unbearable? A faster fan to dissipate the heat into the room?

Because the competition uses even more power and performs worse. Maybe think about the differences in the GPU world today versus back then.

Also, it seems convenient to ignore the GTX 980 there and just focus on the 980 Ti.
 
Before someone says that the cooler on the GTX 480 was the reason it was hated so much, watch this video on case airflow. Skip to the 5-minute mark.



Watched the whole video. Yeah, unless you have no fans at all, or you box the case into some tight space with no airflow, cable management and case airflow don't really matter that much in a big case. But I disagree with his remark not to use this as an excuse for bad cable management. I find cable management ANNOYING. Zip ties, cable sleeves, and whatever are all tools of the devil, IMHO. Yeah, it looks neat, but nothing is easily accessible when you need to change something. I do have good cable management in my case, but I am already dreading the job I'll have when my new SSD finally arrives and I have to take the bloody thing apart again.

*edit* typo fix
 
Because the competition uses even more power and performs worse. Maybe think about the differences in the GPU world today versus back then.

Also, it seems convenient to ignore the GTX 980 there and just focus on the 980 Ti.
This. Ask yourself why there was no 480 Ti or 490, and you won't need to ask why Fermi was considered inefficient compared to Kepler and Maxwell.
 
Because the competition uses even more power and performs worse. Maybe think about the differences in the GPU world today versus back then.

Also, it seems convenient to ignore the GTX 980 there and just focus on the 980 Ti.

The GTX 980 would be the comparable card to the GTX 1080, as it is its direct replacement. If anything, the GTX 1080 increased power draw despite shrinking to 16nm technology. Might be why Nvidia put safeguards on the voltage and won't allow GALAX to use their design to bypass the limitations. /shrug
 
Had no idea that a GTX 980 Ti was just as power-hungry as a GTX 480. Mind blown. I bet that when Nvidia unleashes the full Pascal experience, it will pull just as much as the GTX 480, like the 980 Ti did. All this time people were telling me how much cooler and more energy-efficient the Maxwell cards were. Le sigh.

[Power consumption chart: Power_01.png]

So having 3-6 times more performance at the same power level is not efficient?
 
The GTX 980 would be the comparable card to the GTX 1080, as it is its direct replacement. If anything, the GTX 1080 increased power draw despite shrinking to 16nm technology. Might be why Nvidia put safeguards on the voltage and won't allow GALAX to use their design to bypass the limitations. /shrug

The GTX 1080 is 60% faster with 10% more power usage. I don't know wtf kind of engineering you are trying to do, but there are limitations. I'm confused why you are complaining about power usage: you are running an FX-9370 and an R9 290, some of the most power-hungry components, which perform worse than their competition.
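
Taking those figures at face value (they are the poster's numbers, not verified here), the implied efficiency gain is easy to work out:

# Poster's claimed figures, not verified: +60% performance, +10% power.
perf_ratio  = 1.60
power_ratio = 1.10
print(f"Perf/W gain: {perf_ratio / power_ratio:.2f}x")  # ~1.45x

So even granting the numbers, that is roughly a 1.45x perf/watt improvement, not a regression.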
 
So having 3~6 times more performance at the same power level is not efficient?

Well, the point I was trying to make is that they both pull the same wattage, yet the 980 Ti is a cool-running card and the GTX 480 is considered a space heater. That's what I was trying to get across to you. But if you want to argue that it isn't making as much heat because it is making a lot more frames with the energy used, then that's your choice.
 
The GTX 1080 is 60% faster with 10% more power usage. I don't know wtf kind of engineering you are trying to do, but there are limitations. I'm confused why you are complaining about power usage: you are running an FX-9370 and an R9 290, some of the most power-hungry components, which perform worse than their competition.

What does an FX-9370 and an R9 290 have to do with a GTX 980 Ti and a GTX 480 pulling the same wattage at the wall in a test? I believe I asked a legitimate question, without malice or ill intent. And you're turning this into a my-computer-is-better-than-yours pointing match to deflect from answering the question.
 
What does an FX-9370 and an R9 290 have to do with a GTX 980 Ti and a GTX 480 pulling the same wattage at the wall in a test? I believe I asked a legitimate question, without malice or ill intent. And you're turning this into a my-computer-is-better-than-yours pointing match to deflect from answering the question.

I told you before: it's because of the competition. The R9 290, 390, and Fury all draw a shit ton of power with worse performance than the 970, 980, and 980 Ti. The GTX 480 was barely better than the HD 5870 and ran much hotter while using more power. Look at that graph: the 980 uses almost nothing compared to the other cards and performs amazingly. The 980 Ti is where they went for maximum performance without compromise, but it still uses less than whatever AMD has. While it may put out the same amount of heat, the card itself is not at 90C+ all the time. I would rather have the heat floating around my room than in a tiny case.

Why do I bring up the FX-9370 and R9 290? Because it's funny how you focus so much on power consumption while running those parts, yet completely ignore the differences in competition today versus back then.
 
I told you before: it's because of the competition. The R9 290, 390, and Fury all draw a shit ton of power with worse performance than the 970, 980, and 980 Ti. The GTX 480 was barely better than the HD 5870 and ran much hotter while using more power. Look at that graph: the 980 uses almost nothing compared to the other cards and performs amazingly. The 980 Ti is where they went for maximum performance without compromise, but it still uses less than whatever AMD has. While it may put out the same amount of heat, the card itself is not at 90C+ all the time. I would rather have the heat floating around my room than in a tiny case.

Why do I bring up the FX-9370 and R9 290? Because it's funny how you focus so much on power consumption while running those parts, yet completely ignore the differences in competition today versus back then.

So does the GTX 480 pump just as much heat into your room as a GTX 980 Ti? That is the question. Yes or no will suffice. You can even use the third choice: "I don't know".
 
Well, the point I was trying to make is that they both pull the same wattage, yet the 980 Ti is a cool-running card and the GTX 480 is considered a space heater. That's what I was trying to get across to you. But if you want to argue that it isn't making as much heat because it is making a lot more frames with the energy used, then that's your choice.

That's how efficiency is measured: performance per watt on higher-end parts. If you want to compare with older generations of cards, you should also compare against the direct competitor in both performance and cooling capability; you can't compare current GPU coolers with five-year-old coolers so directly. A cool-running card isn't the same as a cold card, either. All those terms blew up with the 290 and 290X, 290W cards with a bad cooler running over 90C out of the box. As a purer example, the Fury X is a cool-running card because it has a water cooler on it, but it is still a 290W+ card (320W+ under stress) that stays below 55C. That's well above the 250W of a 780 Ti or 980 Ti, which also had better reference coolers and better updated AIB coolers.

Efficiency is simply that: how much performance you achieve versus the direct replacement at the same power consumption, or how much power it takes at the same performance. Those are the efficiency metrics.
 
So does the GTX 480 pump just as much heat into your room as a GTX 980 Ti? That is the question. Yes or no will suffice. You can even use the third choice: "I don't know".

Well, here's the thing: whatever power a silicon product uses is put directly into the room as heat. The chip temps don't really matter, because the cooler holds the chip at a certain temperature while all the heat produced gets dumped into the room. So if you have one card using 250 watts versus another card using 250 watts, both are putting the same heat into the room, but that doesn't mean one chip can't run hotter than the other. This is why it's more important to look at the delta between chip temps and power usage than at raw chip temps.
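
One way to formalize that delta: a cooler can be summarized by its thermal resistance, the steady-state temperature rise per watt dissipated. A minimal sketch, with the resistance values invented purely for illustration:

# T_chip = T_ambient + P * R_th, where R_th (degC/W) summarizes cooler
# quality. The R_th values below are invented for illustration only.
def chip_temp(power_w, r_th, ambient_c=25.0):
    return ambient_c + power_w * r_th

# Two 250 W cards dump identical heat into the room; only the chip
# temperature differs, depending on how good each cooler is.
print(f"Weak cooler (R_th 0.28): {chip_temp(250, 0.28):.0f} C")  # ~95 C
print(f"Good cooler (R_th 0.22): {chip_temp(250, 0.22):.0f} C")  # ~80 C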
 
So does the GTX 480 pump just as much heat into your room as a GTX 980 Ti? That is the question. Yes or no will suffice. You can even use the third choice: "I don't know".

Basically, yes, it can and will. Both are 250W parts, but that's only part of the story: you have to consider OVERALL card power. Not every watt is consumed by the GPU core; VRMs, vRAM, MOSFETs, even little things like resistors and capacitors contribute to the card's power, and all of it ends up as heat pumped into the room even though it isn't directly related to the GPU core itself. vRAM efficiency matters a lot, and so does the power-delivery circuitry: a constrained power-delivery design will produce a lot more heat than a more robust one. So when you look at the power consumed by the card versus the power wasted by the card, there's a lot to check.
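
As a toy illustration of that split (every number below is invented), total board power includes VRM conversion losses, memory, and sundries, not just the GPU core:

# Hypothetical 250 W card; all figures invented to show the split.
gpu_core_w = 180.0
vram_w     = 30.0
misc_w     = 15.0   # fans, caps, resistors, display I/O, etc.
vrm_eff    = 0.90   # a constrained VRM converts less efficiently

delivered_w   = gpu_core_w + vram_w + misc_w   # power reaching the components
board_power_w = delivered_w / vrm_eff          # power drawn at the slot/plugs
vrm_loss_w    = board_power_w - delivered_w    # wasted purely as heat

print(f"Board power {board_power_w:.0f} W, VRM loss {vrm_loss_w:.0f} W")

Here a tenth of the board power (25 W) is VRM loss alone, heat that the GPU core temperature never reflects.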
 
Really would have liked to see 4K, but I guess anything under the 980 couldn't run it.
 
Well, here's the thing: whatever power a silicon product uses is put directly into the room as heat. The chip temps don't really matter, because the cooler holds the chip at a certain temperature while all the heat produced gets dumped into the room. So if you have one card using 250 watts versus another card using 250 watts, both are putting the same heat into the room, but that doesn't mean one chip can't run hotter than the other. This is why it's more important to look at the delta between chip temps and power usage than at raw chip temps.

Yeah, this. The overall GPU architecture and engineering have to be taken into consideration when talking about power used, power wasted, and heat produced/dissipated.
 
I still haven't seen the comparison I want to see done: AMD vs. Nvidia cards over time.

I remember the 780 Ti thrashing the 290X... no longer.

I remember the 980 thrashing the 290X... no longer.

May as well toss in the midrange and upper midrange too. No one bothers to do that comparison.
 
What does it matter? What matters is what happens at a given time, when a person is ready to buy. As games change what they stress on a GPU, performance will change based on those aspects; if a buyer doesn't understand how that will affect their purchase decision, that is a problem on their end. It also amounts to forecasting what game developers will do, so it's not entirely doable. It's great in hindsight, but in hindsight everything looks either good or bad depending on what you are looking for.

Also, in hindsight, why didn't anyone do a comparative performance analysis of the G80 and G92 lines? They too had a longevity similar to the GCN family's. But what you are getting at is this: they might last one generation more, but they stay in relatively the same performance bracket, even given a 10% swing in performance in their favor. For Nvidia, the 780 Ti is a big enough step away from Maxwell 2 that Nvidia would have had to focus its per-game optimizations on Maxwell 2 over Kepler. The good news going from Maxwell 2 to Pascal is that, outside of dynamic load balancing, the pipelines remain similar; the downside is that Maxwell 2 has quite a bit less compute performance, which is where games are going to push in the coming years.

AMD is still using GCN 1.2 pipelines, so it should be OK for them too, at least from a driver point of view.
 
I remember the 780 Ti thrashing the 290X... no longer.

In the games where the 780 Ti was competitive with the 290X in their day, that still holds. The drop-off in optimizations for older GPUs on Nvidia's part, and the focus on the newest technology and architecture, is what made every GCN card pull ahead of its Nvidia counterpart in newer games. So you can find things like a 280X obliterating a GTX 780 in Far Cry 4, but then in AoS you find that a mere GTX 670 still competes with an HD 7970, just as it did in the past, and that's under DX12, which is supposedly where AMD always shows its "strong" points, isn't it? GPU and driver optimizations are what keep a card competitive over time. AMD has used the same architecture from the HD 7000 series to today, and even Polaris is just another revision of it. So don't be surprised if, with the launch of Pascal, Nvidia winds down driver and GPU optimizations for older GPUs, letting the 290X perform closer and closer to, and eventually outperform, its Nvidia counterparts. There may come a moment where you see a 390X performing on par with a 980 Ti if AMD keeps GCN as its main architecture; Nvidia has gone through Kepler, Maxwell, and Pascal in the same span.

I remember the 980 thrashing the 290X... no longer.

This still applies, actually. You want to compare the 290X directly to the 390X, but the only reason the 390X performs close to on par with the (stock) GTX 980 is that the 390X is a 290X with pre-overclocked GPU and vRAM. In the latest Fury review from [H] you can find that it's the R9 Fury that is competitive with the GTX 980, not the 390X. The GTX 970 still competes with the R9 290/290X, and the 980 with the 390/390X. Nothing has changed. YET...
 
So does the GTX 480 pump just as much heat into your room as a GTX 980 Ti? That is the question. Yes or no will suffice. You can even use the third choice: "I don't know".

I owned both a 980 Ti and a GTX 480 (three of them at one point). I can only speak anecdotally, but all of my 480s ran at hotter average temps and put out boatloads of heat, especially if overclocked in the slightest. Further, there was no gentle increase in temps to >90C; they would jump up within 15-20 seconds of load. The only comparable card in terms of heat output in my experience was my 290X. My 980 Ti ran nowhere near as hot.
 
So why is the GTX 980 Ti considered a cool-running card and the GTX 480 considered hot and unbearable? A faster fan to dissipate the heat into the room?

Who ever said that the 980 Ti is a "cool-running card"? Maxwell is relatively cool and efficient, and even in the graph that you posted the 980 has the lowest power consumption. The 980 was the card in the same price bracket as the 480, not the 980 Ti.

The real key is that the GTX 480's power consumption and heat output were necessary at the time for Nvidia to produce a competitive part. There was no option to buy an efficient Fermi card. There are, however, plenty of efficient Maxwell cards. Fermi's heat and power consumption were a necessity for anyone who wanted to go Nvidia that round. Maxwell's heat and power consumption only become an issue when we are talking about $700 luxury cards like the 980 Ti.

They could have gotten away just fine with the 980 (non-Ti) as their fastest card, but since they did have leeway in terms of heat and power, there was no reason not to make an even faster and more expensive card, aka the 980 Ti, for those who wanted it.
 
Who ever said that the 980 Ti is a "cool-running card"? Maxwell is relatively cool and efficient, and even in the graph that you posted the 980 has the lowest power consumption. The 980 was the card in the same price bracket as the 480, not the 980 Ti.

The real key is that the GTX 480's power consumption and heat output were necessary at the time for Nvidia to produce a competitive part. There was no option to buy an efficient Fermi card. There are, however, plenty of efficient Maxwell cards. Fermi's heat and power consumption were a necessity for anyone who wanted to go Nvidia that round. Maxwell's heat and power consumption only become an issue when we are talking about $700 luxury cards like the 980 Ti.

They could have gotten away just fine with the 980 (non-Ti) as their fastest card, but since they did have leeway in terms of heat and power, there was no reason not to make an even faster and more expensive card, aka the 980 Ti, for those who wanted it.

Dude, people have been saying on these forums for years how buying a 980 Ti makes their builds run so much cooler than anything else they have ever bought before. Hell, they had me thinking it was this cool-running card that didn't generate much heat. Not everyone, of course. I'm not going to name names; they know who they are.
 
I owned both a 980 Ti and a GTX 480 (three of them at one point). I can only speak anecdotally, but all of my 480s ran at hotter average temps and put out boatloads of heat, especially if overclocked in the slightest. Further, there was no gentle increase in temps to >90C; they would jump up within 15-20 seconds of load. The only comparable card in terms of heat output in my experience was my 290X. My 980 Ti ran nowhere near as hot.

I did not study thermodynamics, but I have a good question for y'all. I'm going to make some numbers up; don't take this as gospel. Say your GTX 480 had a cutoff point of 95C, and let's say Nvidia set the GTX 980 Ti's to 85C. Both pull the exact same average wattage. Here is my question.

The GTX 480 would make you feel hotter if your case is close to you, due to the higher temps coming off the card, but the overall temperature of your room should average out to the same as with the 980 Ti, assuming the same power is used by both. Am I thinking about this wrong? I didn't study thermodynamics, so hell, I could be as wrong as two left shoes. Thus, this is a question. If someone has a good link to any example, even if it's just boiling water, it would be appreciated.

Thanks!

And yes, I know that the 290X has a higher average wattage load than the 980 Ti; I was just trying to discuss Nvidia cards since the thread is about Nvidia cards. ;)
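
For what it's worth, here is a back-of-the-envelope sketch of exactly that question. It assumes a sealed room where only the air heats up; real rooms leak heat through walls and furniture, so the actual rise would be much smaller. The room size and duration are invented:

# Every watt a card draws ends up as heat, so equal average wattage
# means equal heat into the room. Chip temperature (85 C vs 95 C) only
# reflects how well each cooler moves that heat off the die.
power_w  = 250.0                   # same average draw for both cards
hours    = 2.0
energy_j = power_w * hours * 3600  # 1 W = 1 J/s

room_volume_m3 = 4 * 4 * 2.5            # a 4 m x 4 m room, 2.5 m ceiling
air_mass_kg    = 1.2 * room_volume_m3   # air density ~1.2 kg/m^3
c_p            = 1005.0                 # specific heat of air, J/(kg*K)

delta_t = energy_j / (air_mass_kg * c_p)
print(f"Idealized air temperature rise after {hours:.0f} h: ~{delta_t:.0f} C")

So the intuition in the question is right: at the same average wattage, both cards put the same heat into the room; the 480 just concentrates more of it at the die and at the exhaust.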
 
I saw where you were going earlier on dispersion, and I get your point. I can only go by experience, but I know the GTX 480 would make my room hotter faster than my 980 Ti (while also getting above 90C faster). I believe, and again this is just anecdotal, it may have had something to do with the 480 heating up and going to full load quicker.
 