Nvidia Nerfing 10-Series With New Drivers?

Granted, I'm always up for a good conspiracy theory.

But this title caught my attention because I haven't played Far Cry 5 for months.

Just played it again the other day after the newest drivers, and my frame rate, which used to be in the 90s, is now in the low 60s.

Or maybe it's the Windows update or something.

It doesn't matter, because 60 fps with G-Sync still looks great, but I thought it was odd.

Gaming is more fun if you don't know what the frame rate is.
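If you want to check whether a driver or Windows update actually cost you frames, rather than eyeballing an overlay, a frame-time log makes the before/after comparison concrete. A minimal sketch, assuming a plain text file with one frame time in milliseconds per line; the file name and format are placeholders for whatever your capture tool writes:

```cpp
// Minimal sketch: average FPS and "1% low" FPS from a per-frame time log.
// Assumes "frametimes.csv" holds one frame time in milliseconds per line;
// adjust the name/format to whatever your capture tool actually produces.
#include <algorithm>
#include <cstddef>
#include <fstream>
#include <functional>
#include <iostream>
#include <vector>

int main() {
    std::ifstream in("frametimes.csv");
    std::vector<double> ms;
    for (double t; in >> t;) ms.push_back(t);
    if (ms.empty()) { std::cerr << "no samples\n"; return 1; }

    double total = 0;
    for (double t : ms) total += t;
    double avg_fps = 1000.0 * ms.size() / total;

    // "1% low": average FPS over the slowest 1% of frames (the stutter number).
    std::sort(ms.begin(), ms.end(), std::greater<double>());
    std::size_t n = std::max<std::size_t>(1, ms.size() / 100);
    double worst = 0;
    for (std::size_t i = 0; i < n; ++i) worst += ms[i];
    double low_fps = 1000.0 * n / worst;

    std::cout << "avg FPS: " << avg_fps << "   1% low: " << low_fps << "\n";
    return 0;
}
```

Run it on a capture of the same scene and settings on the old and new driver; a drop from the 90s to the low 60s will show up clearly in both numbers.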
 
I mean, if AMD's driver team is badly underfunded and they somehow manage to make their cards better over time, how could a company that is not underfunded not do the same thing? I do see what you are saying, but I find it a little hard to believe that Nvidia is near 100% optimized at launch and can't do anything else to improve the drivers as time passes.
You are basing your assumptions on the fact that you find something hard to believe.

Nvidia's release drivers are usually damn near perfect, but it is impossible to prove they don't keep updating their drivers to find further performance on older hardware. AMD probably doesn't either; they just benefit from sharing the same ancient architecture across all generations.

Either way, if it were true, it's a petty argument to make. If a company releases a product you are happy with that is better than the competition, what does it matter if they aren't finding further performance gains from it later? It's performing exactly as advertised.
 
You know, I would link an article about damn near perfect drivers, but what's the point? I don't just believe that they can't release damn near perfect drivers, I know they can't, and neither can AMD. My point was that the drivers are not optimized for old cards; that is a fact. And if someone is not planning to buy a card every year, that is a factor that should be considered. I buy a new card every year, as I'm sure most on [H] do, but for those that don't, it is once again a fact that Nvidia drivers fall off as the cards get older.
 

I highly doubt most people on here buy a new card every year. I bet it's maybe 10%, especially considering there's not really a new card to buy every year.
 

I'm sorry but your reasoning is very flawed. Good day to you.
 
Read my other post; I am not saying they are nerfing cards, I never said that. All I am saying is that Nvidia does not optimize drivers for old cards, and that is why they fall off as time passes. Blind ignorance takes on a new meaning when you can't even read what someone is saying.

It's not so much that they don't optimize for older cards; it's that when their cards launch, what you see is what you get. AMD's driver team has much less funding to work with, so sometimes it can take them a while to pull out the "true" performance of their cards. This tends to go down the line as well, since the cards are all using variants of the GCN architecture. AMD isn't focusing on older cards any more than Nvidia; it's just a product of improvements to newer cards also affecting older ones due to the shared architecture. They both tend to offer driver support for roughly the same length of time - both currently support cards over 6 years old - and neither is optimizing code for those old cards.
 
I highly doubt most people on here buy a new card every year. I bet it's maybe 10%, especially considering there's not really a new card to buy every year.
I buy multiple cards every year . . . :p

As for Nvidia nerfing drivers this round, it does not really look like it. On a side note, the best time to have a driver favor the new generation, even slightly, is during the review cycle, when first impressions are formed - impressions that are set in stone for anyone looking up those reviews on the internet later. I just don't see that being the case this time around.

As for AMD increasing performance over time - that I do believe was due to GCN being forward-looking and the hardware initially not being used effectively. Now GCN looks obsolete for gaming, or at least in need of a very big update - just look at the size of the Vega chip compared to GP104 in the 1080, the memory bandwidth differences, and the power differences, and yet in gaming they are very comparable.
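To put rough numbers on that comparison - a quick sketch using approximate public launch specs for Vega 64 and the GTX 1080, so treat the figures as ballpark, not gospel:

```cpp
// Rough back-of-the-envelope comparison of Vega 64 vs GTX 1080 (GP104),
// using approximate public launch specs; treat the numbers as ballpark only.
#include <cstdio>

int main() {
    // die area (mm^2), memory bandwidth (GB/s), board power (W) -- approximate
    const double vega_area = 486, gp104_area = 314;
    const double vega_bw   = 484, gp104_bw   = 320;
    const double vega_w    = 295, gp104_w    = 180;

    std::printf("die area:  %.0f vs %.0f mm^2  (%.2fx)\n", vega_area, gp104_area, vega_area / gp104_area);
    std::printf("bandwidth: %.0f vs %.0f GB/s  (%.2fx)\n", vega_bw, gp104_bw, vega_bw / gp104_bw);
    std::printf("power:     %.0f vs %.0f W     (%.2fx)\n", vega_w, gp104_w, vega_w / gp104_w);
    // Roughly 1.5-1.6x the silicon, bandwidth, and power for comparable gaming
    // performance, which is the point being made above about GCN's efficiency.
    return 0;
}
```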

Nvidia took a very big chance this round introducing AI and RT cores, which could revolutionize the industry - for me this is very exciting and much appreciated. I just can't justify spending the money on an RTX card at this time; maybe next year, or if there's a very good sale. I'm looking more toward the second generation of RTX, 7 nm, etc. I think AMD has to support RT as well if they want to be taken seriously as a contender for promoting gaming and its future development.
 

I certainly agree that AMD needs to support some form of Ray Tracing in future products. I also applaud Nvidia for taking the big jump to support it. I hope they do moving forward, as this could be one of the most exciting times for gaming in years.
 
I think AMD has to support RT as well if they want to be taken seriously as a contender for promoting gaming and its future development.

You had me right up till there. Nvidia could end up shooting themselves in the foot with it too. AMD has the majority of people playing games on their chipsets: Xboxes and PlayStations. $800 and $1,200 video cards are not going to determine this outcome in such a small market. RT right now is the very definition of a "bolt-on." Honestly, it could fade into the night and we could see a move to a more AI-based solution. RT simply takes up too much die area. You can't get the kind of performance you're getting from a 2080/Ti on a $500 console. Just bolting on RT cores would add an extra $200-300 per console over the next couple of years.

I think Nvidia's version might pass like a fart in the night. Sure, some games will be enabled for NV RT, but AMD has been working on it too - with 6-core/12-thread CPU loads and in Vulkan, from the articles I have researched. Just search AMD ray tracing; it's not new to them. Like hardware PhysX, it will turn software-based and become a part of everything. IMO, of course.
 
You could very well be right. The cost of implementing it at this stage is just astronomical. Prices will have to come down for this to ever become mainstream. The thing is, though, every developer I have read has said this is the holy grail of lighting effects. I truly believe that if hardware pricing in future generations comes down (by quite a bit), then yes, it definitely will gain traction. I absolutely believe that.
 

I think we'll see RTX move down in the product stack once 7nm comes around. This first round is just to get the tech out there and start devs in the right direction.
 
I mean, what does it not have to do with it? You can buy two types of cards, AMD and Nvidia. If AMD scales better over time, they have a better product or better drivers for older cards. However, Nvidia is not releasing optimized drivers for their older cards, and in this case we use the other manufacturer to compare the falloff of the cards over time as new games are released.

Read my other post; I am not saying they are nerfing cards, I never said that. All I am saying is that Nvidia does not optimize drivers for old cards, and that is why they fall off as time passes. Blind ignorance takes on a new meaning when you can't even read what someone is saying.

The AMD cards take a year to get optimized, and at least partially catch up to the nVidia cards... Been that way for years, for some games. AMD finally improving drivers a year later, for a game I've already played, doesn't really do me any good. nVidia cards/drivers run at full speed the day the game is launched in most cases.

Running as fast as they can from day 1 isn't "falling off" a year later when the other guy makes improvements and finally optimizes his shit.

I would rather have all the performance that I can from day 1 than have to wait a year. This is precisely why I haven't bought AMD in many years.

The only nVidia cards which may show a performance "fall off" are Fermi (480, 580), which got its last driver update in March 2018. Not a fast card in today's games at any rate, and optimizing for +10% performance would only equal 3 or 4 fps, i.e. no point in spending time and money for so little change. That GPU came out in March 2010, so it had driver updates, including game optimizations, for 8 full years. That's pretty impressive and longer than AMD releases drivers for its products: for the same timeframe, the HD 5000 series came out Sept 2009, followed by the HD 6000 series in Oct 2010. AMD's final drivers for those? WHQL July 2015 (Catalyst 15.7.1), beta March 2016 (16.2.1). So AMD's cards came out later and had drivers for less time: 6.5 years vs 8. And those drivers that DID come out were probably not optimized for games released in the preceding couple of months...
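For what it's worth, the support-window math from those dates works out as follows (a quick sketch using only the dates quoted above):

```cpp
// Quick sanity check of the support-window math, using the dates in the post.
#include <cstdio>

// whole months between (y1, m1) and (y2, m2)
int months_between(int y1, int m1, int y2, int m2) {
    return (y2 - y1) * 12 + (m2 - m1);
}

int main() {
    // Fermi (GTX 480/580): March 2010 launch -> March 2018 final driver
    std::printf("Fermi:   %.1f years\n", months_between(2010, 3, 2018, 3) / 12.0);
    // HD 5000: Sept 2009 launch -> March 2016 final beta driver
    std::printf("HD 5000: %.1f years\n", months_between(2009, 9, 2016, 3) / 12.0);
    // HD 6000: Oct 2010 launch -> March 2016 final beta driver
    std::printf("HD 6000: %.1f years\n", months_between(2010, 10, 2016, 3) / 12.0);
    // prints 8.0, 6.5 and 5.4 years respectively
    return 0;
}
```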

Keep drinking the Kool-Aid, though.
 

My last cards were a 780, a 970, and a 1060. No Kool-Aid here. I am referring to games that come out later - say you keep your card until Cyberpunk 2077 comes out, and the AMD offering from the same era as your Nvidia card handles that new game better.
Like I said, I buy a card every cycle and I buy what I need at the time, so this does not affect me, but it may affect others. No company is bulletproof, and this is one of Nvidia's shortcomings, regardless of the reason it occurs.
 
You had me right up till there. Nvidia could end up shooting themselves in the foot with it too. AMD has the majority of people playing games on their chipsets: Xboxes and PlayStations. $800 and $1,200 video cards are not going to determine this outcome in such a small market. RT right now is the very definition of a "bolt-on." Honestly, it could fade into the night and we could see a move to a more AI-based solution. RT simply takes up too much die area. You can't get the kind of performance you're getting from a 2080/Ti on a $500 console. Just bolting on RT cores would add an extra $200-300 per console over the next couple of years.

I think Nvidia's version might pass like a fart in the night. Sure, some games will be enabled for NV RT, but AMD has been working on it too - with 6-core/12-thread CPU loads and in Vulkan, from the articles I have researched. Just search AMD ray tracing; it's not new to them. Like hardware PhysX, it will turn software-based and become a part of everything. IMO, of course.

It's sort of already there. Incoming RTX RT games are supposed to run on top of DXR, which is an extension of DX12, meaning all DX12-capable hardware should be able to run it, since it can be treated as a purely compute workload. The performance hit of not having dedicated hardware is a completely different matter, of course.
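The "extension of DX12" part is literal: DXR support is reported through the ordinary D3D12 feature-check path, so an application can simply ask the driver whether a ray-tracing tier is exposed, regardless of what hardware sits behind it. A minimal sketch (assumes a Windows SDK recent enough to define the OPTIONS5 feature struct, and linking against d3d12.lib; error handling kept to a minimum):

```cpp
// Minimal sketch: ask the D3D12 driver whether it exposes any DXR tier.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device available");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR reported as supported");
    } else {
        std::puts("D3D12 device present, but no DXR tier reported");
    }
    // Whether the tier is backed by dedicated RT hardware or a compute
    // fallback is the driver's business; the API itself doesn't care.
    return 0;
}
```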

And no, I disagree that consoles dictate the movement of graphics technologies, because developers never really 'develop' on consoles: they develop on PCs, then attempt to scale and deploy on consoles. Making better graphics and technologies available to developers (and subsequent adoption) makes developers demand those features for their future target hardware. Consoles don't just ship hardware with features that will never be used - they're always designed around what developers are actually asking for. Nobody makes something that doesn't have demand... even if they have to show the target audience that they wanted it in the first place.

RT is just that - developers have long desired it, but it's never been feasible for real-time applications before. Now that it's been shown to be possible, adoption should pick up, especially for the next generation.
 



Well, back in the day we had consoles with next to no VRAM, and we had developers who made games on PC and then shrank the texture sizes for consoles. So one would assume they could just use the big textures on PC and the smaller versions on consoles...

But what we actually had in that period was 95% of games shipping with console textures in their PC versions, a problem that only magically disappeared around the time of the PS4 and XB1...

So that lets me conclude that all triple-A development is for the consoles and PC is still second - so RTX is doomed, IF nVidia doesn't pay the developers for it.

Because they still have to add a non-RTX path...

So guess where their focus lies.
 

RT is not doomed. Microsoft wouldn't waste their time with DXR if that was the case. Consoles will eventually use RT and until then you'll see it in PC games. The genie is out of the bottle and RT will start to proliferate slowly but surely over the next couple of years.
 

You've gotten the idea in reverse. Developers have always wanted more because their development PCs could do more. However, it's not always economically feasible to design and manufacture a console with the best hardware - thus the XB1 and PS4, which are now extremely close to normal PC hardware due to the low cost of decent, capable hardware, bringing PC-to-console development costs ever lower and carrying more PC graphics technology forward to the masses.
 
RT is not doomed. Microsoft wouldn't waste their time with DXR if that was the case. Consoles will eventually use RT and until then you'll see it in PC games. The genie is out of the bottle and RT will start to proliferate slowly but surely over the next couple of years.

At their current prices, RTX cards won't drive innovation or adoption; they represent about 3% of the Steam hardware survey, which will not entice any studio to bother with that small an install base. With some money tossed at them, a few will add it in, but unless Nvidia can move lots of these cards, most developers will pass, because even Nvidia can't afford to toss money at everyone to get more companies to use its tech. If Nvidia controlled the consoles, then maybe, but without that I just don't see companies expending the resources without Nvidia forking over lots of cash.
 

NV has lots of cash and they'll use it. This is just the first wave of hardware and it will move down in the stack next go round. Also, developers have wanted this capability for a long time and those that want to push the envelope will do it. DICE said it didn't take them too much time to hack it into BF5 so I expect it isn't too difficult.

I guess we'll have to wait and see who's right.
 
I just don't get the detractors for ray-tracing, Montu. It's like they have no idea how long this has been coming - Nvidia has been working on it for a decade, or so they said.

We should not expect raster performance to increase much from here. We should also expect competitive ray-tracing hardware from AMD soon; from a complexity and driver perspective, Nvidia has done most of their work for them!

And yeah, we should see it in consoles as well. It's already in the engines. It's coming in the games this year.

Much to my surprise- and perhaps most in the community, we can simply say:

It's here.
 

It isn't here yet; the hardware might be, but nothing was enabled last I checked (yesterday). So you cannot simply say it's here to most of the community; you can only say it's partly here, or almost here.
 

The hardware's here and the games have been demonstrated.

So I'll agree to meet in the middle with 'partly here' ;).

But really, partly is like 98%, and the big encouraging factor is just how quickly various houses got it running on real hardware. A handful of weeks to get a convincing BF:V demo?

It's hard to bank against it. Nvidia appears to have pulled this one off spectacularly.
 
NV has lots of cash and they'll use it. This is just the first wave of hardware and it will move down in the stack next go round. Also, developers have wanted this capability for a long time and those that want to push the envelope will do it. DICE said it didn't take them too much time to hack it into BF5 so I expect it isn't too difficult.

I guess we'll have to wait and see who's right.

It will be interesting to see how it turns out.
 

For the first round of games I imagine they are mostly concerned with getting playable performance out of it. After that we will see better integration and more titles using it.

These first few games are getting it patched in, so it's not exactly a baked-in feature just yet.
 

I get what you're trying to claim, but they have playability, and ray-tracing isn't hard to integrate. Performance optimization and effects optimization (sometimes 'real-looking' lighting is boring!) are what's left, and with game engines already supporting ray tracing, I expect pretty much everything to be using it by this time next year, on top of all the AAA titles we already know are releasing with ray-tracing support. Assuming an 18-month hardware refresh, which might be realistic for Nvidia with a die shrink, we're looking at dozens of top-tier titles taking advantage of ray-tracing on Turing hardware, alongside whatever AMD releases.
 

Eggzackerly, RT is like every new feature that's come out in hardware. Software support dribbles out slowly and then becomes a mainstay. This first gen is just to get the ball rolling and next gen will be where it moves down the stack and becomes more ubiquitous.
 
I just don't get the detractors for ray-tracing, Montu. It's like they have no idea how long this has been coming - Nvidia has been working on it for a decade, or so they said.

We should not expect raster performance to increase much from here. We should also expect competitive ray-tracing hardware from AMD soon; from a complexity and driver perspective, Nvidia has done most of their work for them!

I don't want to spend $800 for the feature right now either, but I'm glad the cards are out there. 32-bit color, bump mapping, AA, tessellation, DSR, etc. all used to be usable only on the most expensive cards, and now they're available on everything.

If AMD had released a GPU with ray-tracing capability first, people would be going crazy about how monumental an achievement it is and how we need AMD doing things like this to push graphics forward and to show that they are better than NV. But since it's NV, most of these people immediately revert to "it's just evil Nvidia".
 

Nope dude, just nope. It is the stupidly high prices that are earning them the criticism, and it is justified. However, people are still buying them, so I suppose it does not really matter anyway. *GPP*
 

People haven't a problem with ray tracing; it's the way Nvidia are going about it that people are taking issue with. It's the consumer taking the hit to be beta testers for the new hardware. I know it's business, but it doesn't sit well with some people when they see the massive amounts of money that Nvidia are making.

Second, Nvidia says they have been working on this for 10 years. Yeah, they say a lot of things. What about saying that they would never use Mantle because they were focusing on DX12 and getting as many games out as soon as possible? And staying with DX12, do you remember the launch of DX12, where he stood on stage and said they had been working on DX12 with Microsoft for 4 years?

And what's this BS about Nvidia doing the work for AMD from a driver and complexity perspective? LOL, seriously?


If AMD had released a GPU with ray-tracing capability first, people would be going crazy about how monumental an achievement it is and how we need AMD doing things like this to push graphics forward and to show that they are better than NV. But since it's NV, most of these people immediately revert to "it's just evil Nvidia".

If AMD had launched like this, you and the others in this thread would have complained bitterly about how bad it was, how crap AMD were to launch a product with no games available to use the hardware, and how dare they charge such a high price - look at how stupid AMD are.

Don't bother denying it, because that's exactly what you would have done if AMD had released a GPU with Ray Tracing first.
 
People haven't a problem with ray tracing; it's the way Nvidia are going about it that people are taking issue with. It's the consumer taking the hit to be beta testers for the new hardware. I know it's business, but it doesn't sit well with some people when they see the massive amounts of money that Nvidia are making.

Perhaps they're just a tad bit shortsighted then. Ray-tracing is coming along the same way DX9 and DX10 did: by having hardware out around the same time as driver support and game availability hits. Oh! And the cards are still faster too!

Second, Nvidia says they have been working on this for 10 years. Yeah, they say a lot of things. What about saying that they would never use Mantle because they were focusing on DX12 and getting as many games out as soon as possible? And staying with DX12, do you remember the launch of DX12, where he stood on stage and said they had been working on DX12 with Microsoft for 4 years?

They were. The functionality in DX12 has been desired for a very long time, like ray-tracing, but unlike ray-tracing, the low-overhead APIs require a significant amount of work on the developers' side. How many titles used Mantle? It's like DX8.4: AMD tacked on their own standard, few supported it, and when developers did come around to supporting that approach, it came in a different standard (Vulkan), like how the premise of programmable shaders came in DX9.

And what's this BS about Nvidia doing the work for AMD from a driver and complexity perspective? LOL, seriously?

Chicken and egg problem: Nvidia has actually put real resources on the line with massive die sizes for functionality that requires both OS/API support and game support. Without that hardware, the other two would never happen. And that driver work was done in DirectX - this isn't something proprietary; AMD, Intel, and anyone else that implements ray-tracing in hardware will benefit from Nvidia's trailblazing here. Hell, they're even porting it all to Vulkan!

If AMD had launched like this, you and the others in this thread would have complained bitterly about how bad it was, how crap AMD were to launch a product with no games available to use the hardware, and how dare they charge such a high price - look at how stupid AMD are.

Who are you talking to? 'The others'? Why troll? How do you know how people would respond to new technology?

Don't bother denying it, because that's exactly what you would have done if AMD had released a GPU with Ray Tracing first.

You shouldn't bother making such a partisan accusation. Most of us have been around to see all of the companies involved here dominate this market, as well as others that have fallen by the wayside. If AMD got off their ass and started innovating effectively, like really leading, they'd be cheered on.
 
People haven't a problem with ray tracing; it's the way Nvidia are going about it that people are taking issue with. It's the consumer taking the hit to be beta testers for the new hardware. I know it's business, but it doesn't sit well with some people when they see the massive amounts of money that Nvidia are making.
Yes, NVIDIA always releases new hardware with no R&D beforehand. Do you understand what you're saying?
Second, Nvidia says they have been working on this for 10 years. Yeah, they say a lot of things. What about saying that they would never use Mantle because they were focusing on DX12 and getting as many games out as soon as possible? And staying with DX12, do you remember the launch of DX12, where he stood on stage and said they had been working on DX12 with Microsoft for 4 years?
There was no focusing on Mantle because it was a hardware-specific API. All vendors work with Microsoft on DirectX because they all have features they want supported, and the vendors need to know what those features are and how they work so they can design their hardware around them. DirectX 12 didn't magically pop out of the air when AMD was experimenting with Mantle.
If AMD had launched like this, you and the others in this thread would have complained bitterly about how bad it was, how crap AMD were to launch a product with no games available to use the hardware, and how dare they charge such a high price - look at how stupid AMD are.

Don't bother denying it, because that's exactly what you would have done if AMD had released a GPU with Ray Tracing first.
Launched like what? This launch has been no different from any other new hardware launch I have watched or been a part of for at least the last 15 years. Aside from that, AMD does have ray tracing; they just haven't been pushing it for games.
 
If AMD had launched like this, you and the others in this thread would have complained bitterly about how bad it was, how crap AMD were to launch a product with no games available to use the hardware, and how dare they charge such a high price - look at how stupid AMD are.

Don't bother denying it, because that's exactly what you would have done if AMD had released a GPU with Ray Tracing first.

Yep, and they did back in the day. Took a while for devs to get TruForm into games, if they did. Curved surfaces were a new area back in the day and ATI had a nice approach. https://en.wikipedia.org/wiki/ATI_TruForm
It was a novel and great idea. It worked well until they killed it themselves in Catalyst 5.9, since devs were not fully taking advantage of it and tessellation advanced to the next level. Morrowind was one of the most popular games to use N-patches to add TruForm.

AMD would do the same if it had cash flow like Nvidia's to push its technology, which it just doesn't have today. So yes, AMD does the same thing when it can.
 

ATi GPUs were the DX9 reference (though perhaps not for the best reasons, they won out on effectiveness) and essentially dictated the market until Nvidia got back on their feet with the 6000 series. Nvidia didn't really pull ahead until they released the 8000 series, and they've more or less been leading the market since, having not (yet) had another FX 5800 moment, while ATi, now under AMD, has arguably had several. The hot-running GTX 480 is about as close an argument as one can make for Nvidia there, but even then they had the performance crown.

What I find to be a shame is that AMD wasn't ready for ray-tracing with hardware. The software is arguably simpler than something like Mantle/Vulkan/DX12, and the hardware is certainly not that complex - I've no doubt that they could come up with a 'hybrid' solution to the problem that leverages their traditional strengths in raw GPU compute while putting on a good raster and ray-tracing show.
 

Ray tracing is done on pro cards, and AMD has supported that for quite some time. What Nvidia is doing is not ray tracing alone but a hybrid approach mixing rasterization and ray tracing. They have to do that since the card would absolutely choke at full ray tracing; you would need massively more compute power than any one card is delivering anytime soon. What's funny is people going on and on about how great it is when we can't even use it yet to see. Ray tracing is great, but this isn't full-on ray tracing, because that still takes a huge rendering farm. We'll see once it actually comes out and we can use the hardware, but right now it still looks like a massive loss of performance is the price.
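To make the "hybrid" part concrete, the per-frame split looks roughly like the sketch below. Every type and function name in it is a placeholder, not a real engine or API; the only point is which work stays rasterized and which gets ray-traced and then denoised.

```cpp
// Conceptual sketch of the hybrid raster + ray-tracing split described above.
// All types and functions are placeholders, not a real engine or API.
struct Frame {};
struct GBuffer {};     // depth, normals, albedo, motion vectors from raster
struct RayResults {};  // sparse, noisy reflection/shadow/AO samples
struct Image {};

// Classic raster pass still produces the bulk of the image.
GBuffer rasterize_gbuffer(const Frame&)                         { return {}; }
// Small ray budget, spent only on effects raster handles badly.
RayResults trace_selected_effects(const Frame&, const GBuffer&) { return {}; }
// Denoiser cleans up the low sample count before shading.
RayResults denoise(const RayResults&, const GBuffer&)           { return {}; }
// Shading merges raster buffers with the denoised ray data.
Image shade_and_composite(const GBuffer&, const RayResults&)    { return {}; }

Image render_frame(const Frame& frame) {
    GBuffer gbuf   = rasterize_gbuffer(frame);
    RayResults ray = trace_selected_effects(frame, gbuf);
    return shade_and_composite(gbuf, denoise(ray, gbuf));
}

int main() { Frame f; (void)render_frame(f); return 0; }
```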
 
What Nvidia is doing is not ray tracing alone but a hybrid approach mixing rasterization and ray tracing.

Was that not what we've been talking about?

Are you really quibbling over a lack of specificity that comes from an established general understanding?

You can always do more; the latest theatrical releases show that quite clearly. What we're seeing is the process being applied to the whole frame.

but right now it still looks like a massive loss of performance is the price.

Only if you believe that price/performance should always increase linearly, which it very rarely does. That's what made the 1080 Ti a great deal despite the US$700 MSRP!

Just remember that people bought similarly performing Titans before it for significantly more. The price for the 'top end' this generation is certainly not an aberration.
 
Second, Nvidia says they have been working on this for 10 years. Yeah, they say a lot of things. What about saying that they would never use Mantle because they were focusing on DX12 and getting as many games out as soon as possible? And staying with DX12, do you remember the launch of DX12, where he stood on stage and said they had been working on DX12 with Microsoft for 4 years?

DX12 didn't magically appear out of nowhere. DX12 had been in the works for a while, and AMD wanted to get the jump, so they released their own version of it and called it 'open source'. Intel and NV definitely wouldn't support it, since not only is it hardware-specific, but DX12 development was already well underway, so it would have been redundant.

If AMD had launched like this, you and the others in this thread would have complained bitterly about how bad it was, how crap AMD were to launch a product with no games available to use the hardware, and how dare they charge such a high price - look at how stupid AMD are.

Don't bother denying it, because that's exactly what you would have done if AMD had released a GPU with Ray Tracing first.

I can't speak for others, but I definitely would not have. RT has been a long-awaited feature, and if AMD got it first, then so be it, let them be the pioneer. I would still have my doubts that they can push adoption as quickly as NV does, but they can be the pioneer and I wouldn't complain even if it was slow.

Because man, even if it's just demos and other non-interactive stuff right now, the fact that something like ILMxLAB's Reflections demo is running on a SINGLE CARD (used to be 4x V100s) at those kinds of framerates is just all sorts of mind-blowing, coming from someone who's closely followed and marveled at graphics technologies for at least 2 decades.

EDIT: They were V100s, not Titan Vs, but that's pretty much the same GPU with less memory
 
Nvidia has had a group working on driver testing for well over a decade now. They run benchmarks on driver sets day in and day out, checking performance on relevant games along with image-quality checks. While this process has been automated, including ML to scrutinize IQ more closely, the job remains the same. It's a never-ending loop that keeps feeding back to the driver teams to make improvements. Until a generation of chips gets officially dropped from support, you can count on driver fixes and optimizations long after RTX.
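In practice that loop amounts to something like the sketch below: run the benchmark suite on the candidate driver, compare against the last-known-good numbers, and hold the build back if anything regresses past a threshold. The games, numbers, and threshold here are invented purely for illustration.

```cpp
// Rough sketch of a driver performance-regression gate: compare a new driver's
// benchmark scores against a stored baseline and flag big drops.
// All data and the threshold are made up for illustration.
#include <cstdio>
#include <map>
#include <string>

int main() {
    const double kMaxDrop = 0.03;  // flag anything more than a 3% regression

    std::map<std::string, double> baseline = {  // avg FPS on previous driver
        {"Far Cry 5", 92.0}, {"BF1", 120.0}, {"Witcher 3", 105.0}};
    std::map<std::string, double> candidate = { // avg FPS on candidate driver
        {"Far Cry 5", 63.0}, {"BF1", 121.0}, {"Witcher 3", 104.0}};

    bool pass = true;
    for (const auto& [game, base_fps] : baseline) {
        double delta = (candidate[game] - base_fps) / base_fps;
        if (delta < -kMaxDrop) {
            std::printf("REGRESSION  %-10s %6.1f -> %6.1f (%+.1f%%)\n",
                        game.c_str(), base_fps, candidate[game], delta * 100);
            pass = false;
        }
    }
    std::puts(pass ? "driver build passes perf gate" : "driver build held back");
    return 0;
}
```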
 
I don't think so. I think trying to mix in all the new 2080 features just scrambled their driver code some. I'd give it 3-5 more releases before making true comparisons.
 