NVIDIA GPU Generational Performance Part 1 @ [H]

"We are referring to NVIDIA’s "Ti" line-up in the high-end, the cards that let loose the full potential of each architecture."

Isn't this more accurate of Titan cards than Ti?
 
Tesla to Fermi was a change in NVIDIA's philosophy toward raw power above all else. They threw efficiency out the window, and it showed in the lack of performance gains and the heat generated. Kepler was the first architecture developed under their new philosophy of a balanced approach to efficiency. NVIDIA had gotten complacent after the reign of the Tesla architecture. Still, ATi saw similarly "anemic" performance gains during that time as they dealt with their own issues in design philosophy.

So you have code samples and application profiling results of these games to share with us to prove that?

This has been debunked several times. Lack of updates to a previous architecture's code base does not mean "gimping."

The testing parameters are laid out right in the article and they are consistent across the spectrum of tests, including game patches. What subjectivity are you referring to, specifically?

No, that is quality assurance at play because the hardware is not supported and the developer doesn't want to get complaints or support tickets for unsupported hardware. GPU architecture is more than the total number of transistors in the core.

The only variable in [H]'s testing is the RNG in the game loop, because they want to show us, the readers, what one can expect in real-world usage. Canned benchmarks can't provide that. The most recent case of a benchmark being vastly different from the actual experience was Deus Ex: Mankind Divided, and it is a perfect example of why benchmarks should not be used as a metric for gameplay experience.
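To make that distinction concrete, here is a minimal sketch (my own illustration, not [H]'s actual tooling; the helper name and the capture numbers are hypothetical) of the kind of summary a real-world frametime capture supports: average FPS plus the 1% low, which is exactly where a scripted benchmark loop and actual gameplay tend to diverge.

```python
# Hypothetical sketch: summarizing a per-frame time capture (in milliseconds)
# taken during a real gameplay run-through rather than a built-in benchmark.

def summarize_frametimes(frametimes_ms):
    """Return (average FPS, 1% low FPS) from per-frame times in milliseconds."""
    if not frametimes_ms:
        raise ValueError("no frames captured")
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # 1% low: average frame rate over the slowest 1% of frames (the stutter you feel).
    worst = sorted(frametimes_ms, reverse=True)
    slice_len = max(1, len(worst) // 100)
    one_pct_low_fps = 1000.0 * slice_len / sum(worst[:slice_len])
    return avg_fps, one_pct_low_fps

# Made-up example: mostly ~60 FPS frames with a handful of heavy stutters.
capture = [16.7] * 950 + [33.4] * 40 + [50.0] * 10
avg, low = summarize_frametimes(capture)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```

A canned benchmark can report a healthy average while hiding exactly those worst-case frames, which is why the run-through numbers tell a different story.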

Seeing as you seem to lurk the FS/FT section and come out randomly every couple years to post thoughts in other threads, I've probably wasted my time with this post. But it seems that [H] is not the hardware site for you.


Or perhaps, just perhaps, I don't bother to say something unless something needs to be said, and I don't blindly accept everything I see as gospel even if I happen to respect the source. Question authority or be sheep, your choice. BTW, all of your assertions rest on a narrow contextual argument driven more by emotion and fanboyism than objective reasoning.
 
"We are referring to NVIDIA’s "Ti" line-up in the high-end, the cards that let loose the full potential of each architecture."

Isn't this more accurate of Titan cards than Ti?
Sure.
 
Who on earth calls it "2k"?

This whole K nonsense is a TV thing introduced with 4K, and it is just plain dumb. I can see using it for 4K, but I'd prefer if people just called it 3840x2160. Anything prior to 4K should just never use that terminology.
I had a 2k monitor then; it was a big upgrade from my 1k monitor.
 
"We are referring to NVIDIA’s "Ti" line-up in the high-end, the cards that let loose the full potential of each architecture."

Isn't this more accurate of Titan cards than Ti?

Certainly, but even Nvidia themselves have now de-emphasized gaming as part of the Titan brand. The original one in 2013 was the GeForce Titan; now they've dropped the GeForce name from the Titan altogether. It's just the "Nvidia Titan", and it is being marketed as a professional product for machine learning and the like.

The sticker price has rocketed way way up, with the Titan V costing $3k...
 
Who on earth calls it "2k"?

This whole K nonsense is a TV thing introduced with 4K, and it is just plain dumb. I can see using it for 4K, but I'd prefer if people just called it 3840x2160. Anything prior to 4K should just never use that terminology.
Just sick of people calling 2560x1440 "2K" when "2K" really refers to 2048x1080 (DCI 2K), though it also gets used for 1920x1080.
 
Certainly, but even Nvidia themselves have now de-emphasized gaming as part of the Titan brand. The original one in 2013 was the GeForce Titan; now they've dropped the GeForce name from the Titan altogether. It's just the "Nvidia Titan", and it is being marketed as a professional product for machine learning and the like.

The sticker price has rocketed way way up, with the Titan V costing $3k...

It's not the first time that a Titan was $3000, remember the Titan Z? But to be fair, that was SLI on a stick.
 
Meh, I'm still rocking a 780Ti / i7 4770 / 16GB DDR3 and a 512GB SSD and games still run fine. But that is on a 1080P 60" TV screen...

Think I'll spin up Crysis 3 and W3 to see what numbers I get. If I gamed at a higher resolution I would definitely get a better card.

PS: Good article
 
Great article, but it would be so much better if you could also merge the test suite results across games into a summary, instead of just splitting the performance commentary between old and new games. It would give an idea of what to expect from the cards as a whole.
 
Definitely, Kepler aged badly; the fact that the 290X fares much better in current games only shows the FineWine effect. Maxwell is aging as well, though not as badly, but the fact that the GTX 980 trails the 390X more often than not in current games, compared to the old games, shows two things.

1) nVidia is not gimping their old GPUs; it's just that they aren't profiling and optimizing for the older architectures anymore.

2) Pascal GPUs are less reliant on driver optimizations than previous architectures in order to perform well. Back with Kepler and Maxwell, the compiler and the software scheduling were either friend or foe for the GPU when optimizing for games. Pascal is still built on the same basic concept, but at least it has improvements in scheduling and resource utilization. And from what I've read about Turing, nVidia is heading back toward nearly a full hardware scheduler, kind of like what AMD has been doing for the last 7 years, but in a hybrid, more efficient way.
 