NVIDIA’s Lower-Tier 2000-Series Cards May Not Support Ray Tracing

In a private forum I'm involved in, I stated my case on the total bullshit Nvidia is doing and is going to do. It got to the point that I went ahead and purchased a used, four-month-old EVGA 1070 SuperClocked video card for $240.00, including tax and shipping. That is about what a lightly used 1070 should cost. This should NOT be the exception, but right now it is...

I am not going to pay these overpriced 10-series prices for what the cards are. I am not responsible for their overstock of GPUs because they (Nvidia and the video card companies) got greedy... This is two-year-old tech. I've done my data mining. You may take my advice with a grain of salt, but from here on out you are going to hear more about the "reasons" why video cards are going to stay around this price range.

I am not paying a mere 10 to 15% off MSRP for new video cards, and I am not paying only 20 to 25% off for used ones either. The 900 series of video cards will be fine with current gaming content.
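For what it's worth, the discount math I'm talking about works out like this (the MSRP below is hypothetical, for illustration only, not a real listing):

```python
# Sketch of the "percent off MSRP" math above (hypothetical MSRP, not a quote).
def target_price(msrp, discount_pct):
    """Price after taking discount_pct percent off MSRP, rounded to cents."""
    return round(msrp * (1 - discount_pct / 100), 2)

msrp_1070 = 379.00                    # hypothetical launch MSRP for illustration
print(target_price(msrp_1070, 15))    # 322.15 -> what sellers are offering
print(target_price(msrp_1070, 25))    # 284.25 -> roughly used-card territory
```

Point being: even at the "discounted" end, two-year-old cards are still selling well above what a used one actually goes for.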

10 months ago I was on an RX270+ which handled all of my games nicely. I then purchased a new EVGA 1050 Ti for $155.00 (including tax and shipping), and it's a good card as well.

Now I have a used 1070 that will double the performance of what I have now... I can wait until the stupidity dies down.
Unless, between now and December, I can get the kind of deal you should normally get on a new 10-series video card, or the 20 series comes down from its stupid price levels...

I can wait.

NOTE: Video card pricing and other things below... :)

https://pcpartpicker.com/trends/price/video-card/

 
Got a link? I don't remember this. I'm not being sarcastic.
Fx6800 series. It ran hotter than %-%+*@(_ and was slow. Nvidia replaced it within a few months it was so bad. Reviewers didn't even get stable copies.
 
Why would the lower cards support RT?? It’s extremely demanding tech. Only the high end 2000 GPUs can even attempt it right now, and even they’ll be struggling.
Silly thread.
 
Got a link? I don't remember this. I'm not being sarcastic.
My game requires Transform and Lighting. Does my card support this feature?



Transform and Lighting (T&L) is a hardware feature present in only certain graphic card models. The following NVIDIA based graphic cards do not support Transform and Lighting:

Riva 128
Riva TNT
Riva TNT2 Family
Vanta

If you have one of the above graphic cards/chips in your PC, then you will not be able to play games which require Transform and Lighting. Since this is a hardware feature, you will need to replace your graphics card with a card that supports Transform and Lighting.
 
Well, Nvidia, so what's in it for your partners besides the arbitrary small percentage left for them, maybe less than 3% now? You know, the partners that promoted your chips through their marketing, developed brands like ROG and such, put some pizzazz into the plain reference design, and generation after generation made the GPUs, cards, and tech perform like winners? The ones wringing out performance, standing behind the product with RMAs, bug reporting and fixes, and making stuff like SLI work with their chipsets and drivers?

SO RTX IS A WAY TO DISINCENTIVIZE THEM FROM ANY FUTURE INNOVATION! Because any money they spend on developing these cards beyond beefing up the cooler will not be recouped. NV even made the new dual-card cable new and proprietary, like an Apple Lightning cable, with a new high price for consumers! Hmmm, do we want this adopted by everyone so competitors can only dream of matching our superior dual-card performance? NO! Make it proprietary, GOUGE the price, and take it away from the partners! Consumers and partners have had it too good for too long, and the new plan is to get as much as possible from consumers and keep as much as possible from the partners!
Please tell me when Crossfire cables could be used for SLI.
Got a link? I don't remember this. I'm not being sarcastic.
https://hardforum.com/threads/ray-tracing-game-changer-or-overhyped.1966537/page-5#post-1043819516
 
Doesn't surprise me. The lower tier cards are not likely to be able to play any games with RT turned on at a framerate above slideshow. So why include it? To check a box? "Oh hey, you have this tech that you'll never be able to use!"

Yeah, I would be surprised if even the 2070 were able to run full RT playably in many titles.
 
Fx6800 series. It ran hotter than %-%+*@(_ and was slow. Nvidia replaced it within a few months it was so bad. Reviewers didn't even get stable copies.
Do you mean the 6800GT? I had one and it was a great card. Lasted for years, when it died I baked it back to life and still worked for a couple more years.

I guess you mean the FX5800 DustBuster. But that had nothing to do with TnL, IIRC.
 
Do you mean the 6800GT? I had one and it was a great card. Lasted for years, when it died I baked it back to life and still worked for a couple more years.

I guess you mean the FX5800 DustBuster. But that had nothing to do with TnL, IIRC.

That was it. (The FX5800. I typo'd.)

What was the T&L scandal?
 
Right, instead they should enable it, and then the same people would bitch, "Lol, why did they even bother putting this on the lower-end cards if it's 7 FPS."
HA!!! It wouldn't even do 7 fps, more like 60 spf (seconds per frame).
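For scale, the joke is just the reciprocal relationship between frame rate and frame time:

```python
# fps and seconds-per-frame (spf) are reciprocals of each other.
def seconds_per_frame(fps):
    """Frame time in seconds for a given frames-per-second rate."""
    return 1.0 / fps

def fps_from_spf(spf):
    """Frames per second for a given frame time in seconds."""
    return 1.0 / spf

print(seconds_per_frame(7))   # ~0.143 s per frame at 7 fps
print(fps_from_spf(60))       # 60 s per frame is a mere ~0.017 fps
```

So 60 spf is about 400x slower than the already-unplayable 7 fps.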
 
Do you mean the 6800GT? I had one and it was a great card. Lasted for years, when it died I baked it back to life and still worked for a couple more years.

I guess you mean the FX5800 DustBuster. But that had nothing to do with TnL, IIRC.
The 6800GT was a great value card, and you could soft-mod unlock it with RivaTuner too, since they weren't laser-cut. Good old days!
 
My game requires Transform and Lighting. Does my card support this feature?



Transform and Lighting (T&L) is a hardware feature present in only certain graphic card models. The following NVIDIA based graphic cards do not support Transform and Lighting:

Riva 128
Riva TNT
Riva TNT2 Family
Vanta

If you have one of the above graphic cards/chips in your PC, then you will not be able to play games which require Transform and Lighting. Since this is a hardware feature, you will need to replace your graphics card with a card that supports Transform and Lighting.
How is this a scam?
 
You are asking the wrong person. I only quoted back what he was referring to. Hardware T&L wasn't supported by the lower-end cards. Much like ray tracing now.
Except those were prior generation cards.
 
Got a link? I don't remember this. I'm not being sarcastic.

From Wikipedia:
Without broad application support at the time, critics pointed out that the T&L technology had little real-world value. Initially, it was only somewhat beneficial in certain situations in a few OpenGL-based 3D first-person shooter titles, most notably Quake III Arena. Benchmarks using low budget CPUs like the Celeron 300A would give favourable results for the GeForce 256, but benchmarks done with some CPUs such as the Pentium II 300 would give better results with some older graphics cards like the 3dfx Voodoo 2. 3dfx and other competing graphics card companies pointed out that a fast CPU could more than make up for the lack of a T&L unit. Software support for hardware T&L was not commonplace until several years after the release of the first Geforce. Early drivers were buggy and slow, while 3dfx cards enjoyed efficient, high speed, mature Glide API and/or MiniGL support for the majority of games. Only after the GeForce 256 was replaced by the GeForce 2, and ATI's T&L-equipped Radeon was also on the market, did hardware T&L become a widely utilized feature in games.

The GeForce 256 was also quite expensive for the time and it didn't offer tangible advantages over competitors' products outside of 3D acceleration. For example, its GUI and video playback acceleration were not significantly better than that offered by competition or even older Nvidia products. Additionally, some GeForce cards were plagued with poor analog signal circuitry that caused display output to be blurry.[citation needed]

As CPUs became faster, the GeForce 256 demonstrated that the disadvantage of hardware T&L is that, if a CPU is fast enough, it can perform T&L functions faster than the GPU, thus making the GPU a hindrance to rendering performance. This changed the way the graphics market functioned, encouraging shorter graphics card lifetimes, and placing less emphasis on the CPU for gaming.

https://en.wikipedia.org/wiki/GeForce_256
 
Except those were prior generation cards.
Yes, he was talking about this happening before: the high-end cards had a feature that wasn't implemented in the lower-end cards, and that feature didn't become widespread until later generations. I suspect he is saying the same will happen with ray tracing, because if only some cards have it, game makers won't want to spend money supporting something that isn't widely used.
 
That would be disappointing, but I get it, you've gotta pay the entry fee to get the latest tech. I wonder if that means the lower-end cards will be a lot cheaper.
 
From Wikipedia:
Without broad application support at the time, critics pointed out that the T&L technology had little real-world value. Initially, it was only somewhat beneficial in certain situations in a few OpenGL-based 3D first-person shooter titles, most notably Quake III Arena. Benchmarks using low budget CPUs like the Celeron 300A would give favourable results for the GeForce 256, but benchmarks done with some CPUs such as the Pentium II 300 would give better results with some older graphics cards like the 3dfx Voodoo 2. 3dfx and other competing graphics card companies pointed out that a fast CPU could more than make up for the lack of a T&L unit. Software support for hardware T&L was not commonplace until several years after the release of the first Geforce. Early drivers were buggy and slow, while 3dfx cards enjoyed efficient, high speed, mature Glide API and/or MiniGL support for the majority of games. Only after the GeForce 256 was replaced by the GeForce 2, and ATI's T&L-equipped Radeon was also on the market, did hardware T&L become a widely utilized feature in games.

The GeForce 256 was also quite expensive for the time and it didn't offer tangible advantages over competitors' products outside of 3D acceleration. For example, its GUI and video playback acceleration were not significantly better than that offered by competition or even older Nvidia products. Additionally, some GeForce cards were plagued with poor analog signal circuitry that caused display output to be blurry.[citation needed]

As CPUs became faster, the GeForce 256 demonstrated that the disadvantage of hardware T&L is that, if a CPU is fast enough, it can perform T&L functions faster than the GPU, thus making the GPU a hindrance to rendering performance. This changed the way the graphics market functioned, encouraging shorter graphics card lifetimes, and placing less emphasis on the CPU for gaming.

https://en.wikipedia.org/wiki/GeForce_256

wut...
 