Nvidia's nerfed drivers for Kepler cards.

socialjazz

Nvidia has a history of not bringing their previous-gen cards to full potential in their drivers. For example, Fallout 4 has been released and Kepler SLI users can enjoy full framerates at Ultra settings simply by setting SLI to AFR2, yet the SLI profile that ships with the driver is broken. Why? Simply because Maxwell cards don't have SLI figured out yet, and since it doesn't look good to have Kepler cards outperforming Maxwell cards, they nerfed the drivers for Kepler users.
I've noticed that this practice has been going on for some time. The GTX 590 initially outperformed the GTX 680, but after a year or two the GTX 680 outperformed the 590 in every game.
 
Nvidia has a history of not bringing their previous-gen cards to full potential in their drivers. For example, Fallout 4 has been released and Kepler SLI users can enjoy full framerates at Ultra settings simply by setting SLI to AFR2, yet the SLI profile that ships with the driver is broken. Why? Simply because Maxwell cards don't have SLI figured out yet, and since it doesn't look good to have Kepler cards outperforming Maxwell cards, they nerfed the drivers for Kepler users.
I've noticed that this practice has been going on for some time. The GTX 590 initially outperformed the GTX 680, but after a year or two the GTX 680 outperformed the 590 in every game.

Right...Maxwell users can also set SLI to AFR2 and get good FPS, but the game becomes unstable.

They've publicly said that the game needs some tweaks to support SLI.
 
We've been discussing this in other threads. It seems that their new GameWorks effects tessellate excessively for little graphical improvement. AMD users have a tessellation slider in the drivers, so it doesn't matter as much for them. Unfortunately, Nvidia users are SOL.

You should take your concerns to the GeForce forums, Twitter, and Reddit. They only seem to listen to bad publicity. Sorry about your loss of performance. Your system is still beast though! :)
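
For what it's worth, that AMD tessellation slider is conceptually just a clamp on whatever tessellation factor the game requests before it reaches the hardware. A rough sketch of the idea (not AMD's actual driver code, just an illustration of the clamp):

Code:
#include <stdio.h>

/* Illustration only: the idea behind a driver-side tessellation cap.
 * A real driver clamps per-patch factors inside the tessellation pipeline;
 * this only shows the arithmetic of the user-facing slider. */
static float clamp_tess_factor(float requested, float user_cap)
{
    return requested > user_cap ? user_cap : requested;
}

int main(void)
{
    /* e.g. a GameWorks effect asks for 64x, the user caps it at 16x in CCC */
    printf("effective tessellation factor: %.0fx\n",
           clamp_tess_factor(64.0f, 16.0f));
    return 0;
}

That's the whole point of the complaint: AMD users can skip most of the extra tessellation work with one slider, while Nvidia's control panel doesn't expose an equivalent knob.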
 
Nvidia has a history of not bringing their previous-gen cards to full potential in their drivers. For example, Fallout 4 has been released and Kepler SLI users can enjoy full framerates at Ultra settings simply by setting SLI to AFR2, yet the SLI profile that ships with the driver is broken. Why? Simply because Maxwell cards don't have SLI figured out yet, and since it doesn't look good to have Kepler cards outperforming Maxwell cards, they nerfed the drivers for Kepler users.
I've noticed that this practice has been going on for some time. The GTX 590 initially outperformed the GTX 680, but after a year or two the GTX 680 outperformed the 590 in every game.

Looks like you're a victim of AMD shill marketing. Kepler and Maxwell both have issues with the game crashing when SLI is set to AFR 2. NVIDIA is waiting on Bethesda to make game code changes before issuing an SLI profile.
 
I've noticed that this practice has been going on for some time. The GTX 590 initially outperformed the GTX 680, but after a year or two the GTX 680 outperformed the 590 in every game.

Consider that old cards have more mature drivers and less room to improve. So when you compare a new card to an 18-month-old card and performance is equal, the new card has more potential to improve with future drivers. Driver optimizations have diminishing returns over time; cards don't keep getting faster forever after the initial release.
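
To put some rough (completely made-up) numbers on that, assume each major driver release recovers about 30% of whatever untapped performance is left. A new card with plenty of headroom keeps posting visible gains for a while; a mature card barely moves:

Code:
#include <stdio.h>

/* Toy model with invented numbers, purely to illustrate diminishing returns.
 * Assumption: each major driver release recovers 30% of the remaining headroom. */
int main(void)
{
    double headroom_new = 0.20;  /* hypothetical 20% untapped perf at launch */
    double headroom_old = 0.05;  /* mature 18-month-old card: ~5% left       */
    double gained_new = 0.0, gained_old = 0.0;

    for (int release = 1; release <= 6; release++) {
        gained_new += headroom_new * 0.30;  headroom_new *= 0.70;
        gained_old += headroom_old * 0.30;  headroom_old *= 0.70;
        printf("after release %d: new card +%4.1f%%, old card +%4.1f%%\n",
               release, gained_new * 100.0, gained_old * 100.0);
    }
    return 0;
}

Both curves flatten out; the new card just has further to climb before it does.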
 
Consider that old cards have more mature drivers and less room to improve. So when you compare a new card to an 18-month-old card and performance is equal, the new card has more potential to improve with future drivers. Driver optimizations have diminishing returns over time; cards don't keep getting faster forever after the initial release.

Watch some jerk take this as a flippant fanboy remark, but AMD cards across the board just seem to get faster and faster every new OS and driver release. If the little company can optimize, then Goliath should be able to do the same in my mind.
 
SLI profiles will come out soon enough. Those still running OG Titans can set up AFR2 and get a smooth ride. For now, I'm happy to wait.
 
Watch some jerk take this as a flippant fanboy remark, but AMD cards across the board just seem to get faster and faster every new OS and driver release. If the little company can optimize, then Goliath should be able to do the same in my mind.

If you start from a worse spot with bad drivers, you also have more room to optimize over time :p
 
Watch some jerk take this as a flippant fanboy remark, but AMD cards across the board just seem to get faster and faster every new OS and driver release. If the little company can optimize, then Goliath should be able to do the same in my mind.

But there is also a difference one must take into account:

nVidia replaces an entire line-up when a new micro-architecture is released, whereas AMD often releases rebrands. Our latest example is the 900 series (all Maxwell based) vs the 300 series (all rebrands of the 200 series).

In light of driver optimisation, there is an obvious financial incentive for AMD to keep improving the same GPU for longer than nVidia would, since they will be making money off it for longer.

I'm not condoning what nVidia is doing, nor condemning what AMD is doing, but I think there are some financial incentives to consider.
 
But there is also a difference one must take into account:

nVidia replaces an entire line-up when a new micro-architecture is released, whereas AMD often releases rebrands.

In light of driver optimisation, there is an obvious financial incentive for AMD to keep improving the same GPU for longer than nVidia would, since they will be making money off it for longer.

I'm not condoning what nVidia is doing, nor condemning what AMD is doing, but I think there are some financial incentives to consider.

CUDA is CUDA at its core. GCN is GCN at its core. It would seem that if you make an improvement to basic CUDA, it would improve all CUDA-based cards? When AMD improves GCN, all of their cards get faster. Guess what else? Not everything is a rebrand; they actually release new cards to go along with the rebrands. Look at the lineup.

I think that Nvidia can do better for their customers; I just firmly believe that it's not in their financial interest to do so. Look at what happened with the 970 fiasco: the boards were filled with anger, and with people bragging that they had bought EVGA cards so they were eligible to step up to 980s. /facepalm

When you try your hardest to fail your customer base, and they trade their old cards in for more expensive cards, you know that you're putting the right shit in the Kool Aid.

There is absolutely no earthly reason for Nvidia to prop up last year's cards with better drivers. Better to let the AMD cards surpass the older cards and watch the flock buy new upgrades. That's what it looks like to me. Maybe someone can spin some GodRays on it. :)
 
This AMD shill theory of NVIDIA not supporting older Kepler cards has been put to rest many times. In some games Maxwell does better, while in others big Kepler keeps up with the 970/980; simple as that. AMD has no choice but to keep optimizing, as most of their cards are GCN rebrands with minor tweaks. And not only that, AMD cards start out weak and take years to catch up. Just look at Fallout 4: when AMD finally gets around to addressing its severe performance delta, AMD fans will claim, "See, AMD gains performance!" Same old tired story.
 
We've been discussing this in other threads. It seems that their new GameWorks effects tessellate excessively for little graphical improvement. AMD users have a tessellation slider in the drivers, so it doesn't matter as much for them. Unfortunately, Nvidia users are SOL.

You should take your concerns to the GeForce forums, Twitter, and Reddit. They only seem to listen to bad publicity. Sorry about your loss of performance. Your system is still beast though! :)

Sorry, but reviews say you are wrong:
http://hardocp.com/article/2015/11/11/fallout_4_performance_image_quality_preview/6#.VkVyxzZdF9M

This image shows that actual objects in the game, like the lamp post in the middle, are more defined with "Ultra" Godrays. With "High" Godrays it looks more washed out, but with "Ultra" Godrays the object itself is defined and stands out with better contrast.



It is very hard to see the advantages of something like Godrays in still images. In motion, in-game, Godrays look a lot better with "Ultra" selected; it is something you have to see as you move about in the game.

Find another fallacy to fuel your "cause", please.

And the thread title is also misleading, OP...when did ignorance get to set the standard?

EDIT: Added video
https://www.youtube.com/watch?v=HdTobIV2sZ8
 
What a load of BS.

I did some extensive research; in absolute terms, older nVidia GPUs gain a little performance with newer drivers, new cards just gain more.
 
What a load of BS.

I did some extensive research; in absolute terms, older nVidia GPUs gain a little performance with newer drivers, new cards just gain more.

Thumbs up!

It's nice to see someone not just parroting, but actually looking into the facts.

This thread wouldn't exist if everyone did the same.
 
Watch some jerk take this as a flippant fanboy remark, but AMD cards across the board just seem to get faster and faster every new OS and driver release. If the little company can optimize, then Goliath should be able to do the same in my mind.

Let's look at that first point for a moment. Nvidia has a much shorter time between major drivers; every month you see the GPUs gaining performance. AMD releases very few major drivers per year, so the increments per major driver are bigger for that reason alone. Say Nvidia ships a major driver with performance improvements and optimizations every two months, roughly six a year (a rough number, don't take it too seriously), while AMD ships one or two major releases annually. How much time does it take for drivers to fully mature in Nvidia's hands versus AMD's? Roughly one full year on the Nvidia side, three or more on the AMD side.

Second point: GPU architectures. Don't forget AMD hasn't really changed since TeraScale 2 and 3 with the HD 5000 and HD 6000 series. GCN was introduced with the HD 7000 series and they are STILL using it; the codenames don't matter, each new codename changes just that, the name, with a few improvements here and there, but the architecture stays the same. Taking that into consideration, since GCN 1.0's introduction at the end of 2011, AMD has had almost four full years of architecture optimization and maturity. Nvidia, over that same time, has gone through Kepler and now Maxwell (and even Fermi, if we consider that AMD launched the HD 7000 series in Fermi times); any optimization they make for Maxwell will not work for Kepler, and vice versa. Can you say that the HD 6000 series is still improving and receiving better performance? AMD also forgot their last generation once the HD 7000 series arrived; don't forget that tiny (big) difference. And at this point the HD 7000 series and most of the R9 200 series (leaving aside the 290 and 290X) are starting to show their age; they are reaching the ceiling of the performance they can offer.

Maxwell was kind of an anomaly: they skipped the 800 series entirely, and they already had somewhat mature drivers thanks to all the time the tiny GTX 750 (Ti) had been on the market. If we go back to Kepler, the 600 and 700 series were all rebrands except big Kepler GK110, so every improvement they made for Kepler also worked for the GTX 600 series, especially in the mid-to-low-end segments between the GTX 650 Ti and GTX 660 Ti.

On the other hand, we have to consider the hardware itself. It's well known that Nvidia is more restrained in terms of raw power; they know exactly how to design their cards to hit a certain performance level at a certain power target. In my opinion AMD builds bigger, more powerful architectures, which makes them somewhat more futureproof: they have always had wider buses, more vRAM, more ROPs, TMUs, etc., which allows the hardware to age better over time. I'm one of those who say AMD has amazing hardware, but sadly that hardware is paired with the wrong driver team. To put it in real-life terms, imagine a tiny Japanese car with a small engine, great maneuverability and efficiency, versus a typical heavyweight American muscle car with a big engine and a lot of horsepower. Everyone assumes the Ford Mustang will dust off the little Camry V6, and then the Camry's V6 ends up beating the Mustang's big V8. So yes, AMD has a lot of muscle to exploit over time versus Nvidia's measured power and efficiency; sadly, again, all that uber hardware is in the wrong hands.
 
Sorry, but reviews say you are wrong:
http://hardocp.com/article/2015/11/11/fallout_4_performance_image_quality_preview/6#.VkVyxzZdF9M



Find another fallacy to fuel your "cause", please.

And the thread title is also misleading, OP...when did ignorance get to set the standard?

EDIT: Added video
https://www.youtube.com/watch?v=HdTobIV2sZ8


Use your eyes.

Are you saying that this blurry mess

1447201224JcH13vVZ7d_6_5_after.png


has more defined edges than these?

1447201224JcH13vVZ7d_6_5_before.png


God rays in Fallout 4 are another crappy way to gimp AMD cards for no visual gain at all. The problem is that it's starting to backfire on Nvidia.

Also, look at the comments from the video you've posted. Most people can't see a difference at all.
 
How is it a gimp when nVidia cards also suffer a large drop in performance while still, as you put it, giving no visual gains at all?
 
Did you read my whole post?

Your post consisted of 2 images, 1 quote and 5 lines of text, so it was [H]ard to miss anything.

Your conclusion, according to your post, is that because God-rays on High look better than on Ultra, AMD is gimped for no visual gain.
 
Your post consisted of 2 images, 1 quote and 5 lines of text, so it was [H]ard to miss anything.

Your conclusion, according to your post, is that because God-rays on High look better than on Ultra, AMD is gimped for no visual gain.

And you still managed to miss where I said that it is backfiring on them (Nvidia)...
 
No, I am curious as to where you got the 'gimp AMD' conclusion from.
 
So, they're doing something that's gimping their own cards, the competition, and looks worse, for no reason?

You mean, instead of malicious, they're stupid. Ok.
 
Nvidia has a history of not bringing their previous-gen cards to full potential in their drivers. For example, Fallout 4 has been released and Kepler SLI users can enjoy full framerates at Ultra settings simply by setting SLI to AFR2, yet the SLI profile that ships with the driver is broken. Why? Simply because Maxwell cards don't have SLI figured out yet, and since it doesn't look good to have Kepler cards outperforming Maxwell cards, they nerfed the drivers for Kepler users.
I've noticed that this practice has been going on for some time. The GTX 590 initially outperformed the GTX 680, but after a year or two the GTX 680 outperformed the 590 in every game.

Game updates need to happen for proper SLI support

http://hardforum.com/showpost.php?p=1041968206&postcount=122
 
In regard to Ultra godrays, you are missing what the setting is ultimately doing. It is about how the light wraps around objects and how the shafts or rays of light fill in based on distance. The sign in that screenshot is well above the player's eye height, so naturally it looks a bit blurry as you look into the distance; think of it as a depth-of-field effect, which is what it is adding. Closer objects are clearer than objects in the distance; that is the definition of depth of field. It also takes the intensity of the light into account: naturally, the brighter something is, the less visible it is. That is what it is doing.
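
For anyone wondering what "the rays fill in based on distance and intensity" means in practice, the generic screen-space light-shaft technique (in the spirit of the GPU Gems 3 approach) marches from each pixel toward the light's on-screen position and accumulates brightness with a decay term. A rough CPU-side sketch of that idea, not Fallout 4's or GameWorks' actual shader:

Code:
#include <stdio.h>

/* Sketch of a generic screen-space god-ray accumulation loop.
 * Assumptions: occlusion_at() returns 1.0 for open sky and 0.0 where the
 * light is blocked; coordinates are normalized screen space (0..1). */
typedef float (*occlusion_fn)(float x, float y);

static float god_ray_brightness(float px, float py,   /* pixel position */
                                float lx, float ly,   /* light position */
                                occlusion_fn occlusion_at, int samples)
{
    float dx = (lx - px) / samples;
    float dy = (ly - py) / samples;
    float decay = 0.95f, weight = 1.0f, sum = 0.0f;

    for (int i = 0; i < samples; i++) {
        px += dx;
        py += dy;
        sum += occlusion_at(px, py) * weight;  /* shafts fill in where the light is visible */
        weight *= decay;                       /* farther samples contribute less */
    }
    return sum / samples;
}

/* Trivial scene: open sky in the upper half of the screen (y < 0.5), blocked below. */
static float horizon_occlusion(float x, float y) { (void)x; return y < 0.5f ? 1.0f : 0.0f; }

int main(void)
{
    printf("shaft brightness: %.3f\n",
           god_ray_brightness(0.5f, 0.9f, 0.5f, 0.1f, horizon_occlusion, 32));
    return 0;
}

That distance-based falloff is part of why the effect reads as soft or washed out in stills but looks different once the camera and the occluders move.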
 
In regard to Ultra godrays, you are missing what the setting is ultimately doing. It is about how the light wraps around objects and how the shafts or rays of light fill in based on distance. The sign in that screenshot is well above the player's eye height, so naturally it looks a bit blurry as you look into the distance; think of it as a depth-of-field effect, which is what it is adding. Closer objects are clearer than objects in the distance; that is the definition of depth of field. It also takes the intensity of the light into account: naturally, the brighter something is, the less visible it is. That is what it is doing.

Maybe if their assets, textures and models, were higher definition, it would have looked better. Introducing a blurring effect only masks your engine's problems; it doesn't solve them...

Btw, can you do a comparison shot on an AMD card between ultra god rays and ultra god rays with tessellation limited to x16 in CCC, and show the difference in performance between them?
 
So, they're doing something that's gimping their own cards, the competition, and looks worse, for no reason?

You mean, instead of malicious, they're stupid. Ok.

If both look bad at the same time, I would classify that as gimping, period, not gimping any side in particular.

Your post sounded as if Godrays were designed specifically to 'gimp' AMD GPUs.
 
Maybe if their assets, textures and models, were higher definition, it would have looked better. Introducing a blurring effect only masks your engine's problems; it doesn't solve them...

Btw, can you do a comparison shot on an AMD card between ultra god rays and ultra god rays with tessellation limited to x16 in CCC, and show the difference in performance between them?

This, please, x 10000... I decided to hold off on ordering FO4 even at the discounted price because I could NOT stand the shitty Gamebryo engine in Skyrim, and as I feared, they took the cheapskate route and reused it (I know it has SOME tweaks)..

I would really like to see what capping the tessellation to 16x will do for AMD performance..
 
Watch some jerk take this as a flippant fanboy remark, but AMD cards across the board just seem to get faster and faster every new OS and driver release. If the little company can optimize, then Goliath should be able to do the same in my mind.

It all depends on the starting point at release, though. Sadly, AMD is usually behind the eight ball by a large margin, which makes any improvement look bigger than it actually is. But such is life in the hardware/software world.
 
Use your eyes.

Are you saying that this blurry mess

1447201224JcH13vVZ7d_6_5_after.png


has more defined edges than these?

1447201224JcH13vVZ7d_6_5_before.png


God rays in Fallout 4 are another crappy way to gimp AMD cards for no visual gain at all. The problem is that it's starting to backfire on Nvidia.

Also, look at the comments from the video you've posted. Most people can't see a difference at all.

What part of "You cannot see the difference in stills, but need to see in motion" don't you understand? :)

This kind of FUD has been around since F.E.A.R....people posted stills, claiming it looked like garbage...then people posted videos and the FUD got all silent.

A shame we still haven't gotten any further.
 
Higher-quality godrays are definitely a 'see it in motion' sort of improvement. Static images don't really do them justice.
 
What part of "You cannot see the difference in stills, but need to see in motion" don't you understand? :)

This kind of FUD has been around since F.E.A.R....people posted stills, claiming it looked like garbage...then people posted videos and the FUD got all silent.

A shame we still haven't gotten any further.

Are you fucking kidding me? Read my post again; I wrote about the video.

The top comment is about not seeing any difference, and there are 11 replies that agree with him.
 
I see rough edges on the bottom of the sign with Ultra, and that same area is smooth under the High setting.

I have always believed Nvidia gimps the performance of older cards as a marketing tool to sell newer cutting-edge cards, and the numbers don't lie: Nvidia leads in market share because its user base always needs to upgrade. The lifespan of older cards is not what it is with AMD, which keeps adding performance to older cards; that's why the HD 7950 / R9 280 can still compete with the GTX 960, and that was what, a 2011-2012 card?
 
I have always believed Nvidia gimps the performance of older cards as a marketing tool to sell newer cutting-edge cards, and the numbers don't lie: Nvidia leads in market share because its user base always needs to upgrade. The lifespan of older cards is not what it is with AMD, which keeps adding performance to older cards; that's why the HD 7950 / R9 280 can still compete with the GTX 960, and that was what, a 2011-2012 card?

Ahhh, you sir have stumbled upon the activity known as "Marketing".

I always find it funny that we get *NEW* mid-tier GPUs every year when logically the previous generation's top tier should drop to mid-tier. I know it's to try to sell off castrated new chips, but the net effect is that new chips are not necessarily any faster than older chips, and are oftentimes slower.
 
I have yet to experience a goddamn "nerf" on my 670s. I can't max out newer games as much anymore, but such is life when I'm on cards that are 3.5 years old now.
 
This reminds me of The Witcher 3. Everyone bitched and moaned about the stills, but when you actually play the game it's gorgeous. Still images and videos on the Internet never do a game justice. This is nothing new....

In the [H] review I could definitely see a difference between High and Ultra.
 
Are you fucking kidding me? Read my post again; I wrote about the video.

The top comment is about not seeing any difference, and there are 11 replies that agree with him.

To the people who cannot see any difference: I suggest that they, and you, drop PC gaming and buy a console.

Problem solved.

I see the difference and it's quite large IMHO.
 