NVIDIA’s Lower-Tier 2000-Series Cards May Not Support Ray Tracing

Megalith

While this has already been speculated, recent comments by NVIDIA executive vice president and CFO Colette Kress support the possibility that ray-tracing capabilities will be exclusive to the RTX 2080 Ti, 2080, and 2070. TechPowerUp is calling this decision a “tremendous misstep,” in that most gamers play on lower-tier GPUs.

As Kress puts it, "We'll start with the ray-tracing cards. We have the 2080 Ti, the 2080 and the 2070 overall coming to market," which, in context, seems to point toward a lack of ray-tracing hardware in lower-tier graphics cards (apparently, those based on the potential TU106 silicon and lower-level variants). Failing to add RT hardware to lower-tier graphics cards would exclude a huge portion of the player base from ray-tracing effects.
 
Considering the performance toll that ray tracing will take on the higher-end cards, is there much point in adding it to the lower-end ones?

I got a 2050 man, and when I drop down to 1280x720 I can turn on Raytracing and duuuude the reflections in battlefield are sweet.
 
The 2060 might be a rebadged 1080, or it might just be those chips where the tensor core section of the die had issues, so they disable it and still use the GPU for the low-end cards.

They already have 3 cards with ray tracing, and the lineup typically has what, 5 slots? 1050 (super low end), 1060 (low end), 1070 (mid range), 1080 (upper end), 1080ti (high end).

Those low-end cards are what, sub-$100? I really doubt ray tracing will be a make-or-break feature for someone on that kind of budget...
 
Doesn't surprise me. The lower-tier cards are not likely to be able to play any games with RT turned on at a frame rate above a slideshow. So why include it? To check a box? "Oh hey, you have this tech that you'll never be able to use!"
 
The question needs to separate the RT cores and the Tensor cores. I can see them ditching the RT cores on a 2060 because of cost, and because if you reduce the RT core count, ray tracing becomes pointless. But the Tensor-core-enabled DLSS seems like something you want to keep at all costs because of the massive speed benefits.
 
This is kind of a big "well, duh." RTX barely works at 1080p on the big cards now, and if you look at the chip, the RTX hardware takes up considerable die space.
 
It's not supported on AMD cards either, so the market is split regardless of what they do. This just confirms that this "ray tracing" tech won't be a "real" thing until next gen at the earliest.
 
I can't wait to buy a Geforce GTX 2060 to stick it next to my Geforce 4 MX.

 
Not unexpected, though it does seem very likely to undermine the success of their ray-tracing shtick.
 
Just more evidence that RT is still just another 10 years away. lol

Seriously though. I am as excited as anyone that at least someone is trying to push actual new features (that aren't 100% gimmick) again and not just a few more FPS... but yeah, heavy use of RT in any game is still a few years off. I'm sure the RT options in games coming over the next few years will use it sparingly and will likely look better with it off and everything else cranked at a higher resolution, even on 2080 Tis.
 
Not bothered by this. Since Nvidia's 2060 equivalent will pretty much be better than their competitor's top-of-the-line card, you end up competing with yourself.

I'm sorry, but sometimes when you want the newest things, you have to pay higher prices. Most everyone is basically saying ray tracing is a first-gen fad and not to expect much from it... so I don't see why people would be butt-hurt by Nvidia releasing a card that performs as fast as their current 1080 for $249, but without a feature they don't plan on using anyway. Want to use it? It's like VR: pay the extra price and have fun. It's not that much more.
 
After reading reports here that the 2080 struggles at 1080p with RTX on, I'm not surprised by this news. I'm OK with a non-RTX 2060 as long as it performs well at a good price.

All that being said, it doesn't look like we'll see good ray tracing until 2020.
 
Makes sense. If the 2080 Ti barely runs 1080p at 60 fps, that puts the 2070 somewhere between 30 and 40 fps, so what's the point of offering RTX cards running at sub-30? Besides, that would make them direct competition for the 10 series, which will co-exist alongside the 20 series for quite some time.
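That 30-40 fps guess lines up with a rough back-of-the-envelope scaling, sketched below. This is purely illustrative: the ~60 fps 2080 Ti baseline and the assumption that RT-limited frame rate scales linearly with the quoted gigaray figures or RT-core counts are guesses, not measurements.

```python
# Illustrative scaling of the post's numbers: assume an RT-limited frame rate
# scales linearly with either the quoted gigarays/sec or the RT core count.
# Both proxies and the 60 fps baseline are assumptions, not benchmarks.

GIGARAYS = {"2080 Ti": 10, "2070": 6}   # NVIDIA's launch-slide figures (approx.)
RT_CORES = {"2080 Ti": 68, "2070": 36}  # RT core counts per card

baseline_fps = 60  # assumed 2080 Ti at 1080p with ray tracing on

by_gigarays = baseline_fps * GIGARAYS["2070"] / GIGARAYS["2080 Ti"]
by_rt_cores = baseline_fps * RT_CORES["2070"] / RT_CORES["2080 Ti"]

print(f"2070 estimate via gigarays: {by_gigarays:.0f} fps")  # ~36 fps
print(f"2070 estimate via RT cores: {by_rt_cores:.0f} fps")  # ~32 fps
```

Either proxy lands in the 30-40 fps window the post describes.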
 
Right, when you have a sea change, it takes a while to make it into the entire range. Like when it took Nvidia a year to introduce the GeForce2 MX and bring T&L to the low end.

For programmable shaders, Nvidia just left them out of the GeForce4 MX line, while ATI castrated a perfectly good 8500 and called it the 9000. The 9000 was cut down, but it was genuine DX8 hardware, and it delivered better performance than the terrible GeForce4 MX, which wore the GeForce4 name without the shaders to back it up.
 
Ray-tracing for His Lordship, and the peasantry get stencil shadows and ambient occlusion? Look, nVidia's oppressing me!

Really though, just as 4k has become almost inescapable for the living room, and 1440p+/high frame rate is standard gamer fare, nV pushes tech that barely runs on their highest end still-in-shrinkwrap hardware at 1080p? Red/green market segmentation is bad enough (the Freesync/G-sync wars right now, PhysX, TressFX, various AA schemes), but segmenting your own lineup of cards seems like a great way to get people to pass on this whole generation. The doubled performance claimed in the article seems fantastical/fanatical even from nV's previous PR charts, but as well-covered [H]ere we won't really know until boards actually hit the wild.
 
TechPowerUp is calling this decision a “tremendous misstep,” in that most gamers play on lower-tier GPUs.

A lot of drama over something that is rolling out the only realistic way it could.

The RT and Tensor cores take up significant die space even on high-end chips, and to maintain any kind of decent ray-tracing performance they can't be cut down much. So you really can't make a small-die RT chip in this generation.

The first generation is expensive, early-adopter territory. It has to start somewhere.

By the time generation 2 arrives on 7 nm and makes it down to x60-class GPUs, there will be a lot more software that takes advantage of it.
 
Feels like Skylake-X: let's price it way high, and on the lower SKUs we'll turn off something that's expected with the platform. With Skylake-X it was the PCIe lanes that were cut back.

How did that work out, Intel? It also seemed like Intel spent more time on pricing tiers and SKUs than on actually making the product more compelling or adding value, with the "better" SKUs coming in over a grand.

Nvidia seems to be overplaying their dominance with this release, and they're trying to do an array of things at the same time: a new, higher-performing GPU, and GDDR6 that outperforms GDDR5 without the yield issues and expense of HBM. So do they release it and do the standard, open "my 2080 can beat your 1080 Ti and my 2080 Ti will beat your Titan X"? No! It has to be released with a big price increase, with the benchmarks held back, and with access limited to reviewers who sign an agreement that limits their objectivity. Oh, and only talk about the new feature that no one else has, including the previous NV lines: the ray tracing. They're also using the release to break ranks and not put out a traditional reference card that the bigger manufacturers could outperform with a refined BIOS and superior cooling. Instead we get a reference-plus card with multiple fans and ICX-like tech to shrink the performance gap between reference and partner cards, squeeze the partners' margins, pull some sales back from the partners' side of the street, and basically recoup the margins that EVGA's and ASUS's superior offerings over reference used to command.

Well, Nvidia, what's in it for your partners besides the arbitrarily small percentage left to them, maybe less than 3% now? You know, the partners that promoted your chips through their marketing, built brands like ROG, put some pizzazz into the plain reference design, and generation after generation made the GPUs, cards, and tech perform like winners? The ones wringing out performance, standing behind the product with RMAs, bug reporting and fixes, and making stuff like SLI work with their chipsets and drivers? So RTX is a way to disincentivize them from any future innovation, because any money they spend developing these cards beyond beefing up the cooler will never be recouped. NV even made the new dual-card cable new and proprietary, like an Apple Lightning cable, with a new high price for consumers. Hmm, do we want this adopted by everyone so competitors can only dream of matching our superior dual-card performance? No! Make it proprietary, gouge the price, and take it away from the partners! Consumers and partners have had it too good for too long, and the new plan is to get as much as possible from consumers and keep as much as possible from partners!

Rant, rant, rant... we've seen other companies that think they're above it all and can be heavy-handed with customers, press, and industry partners. The market will retreat and seek what it craves elsewhere. Ask Intel how business as usual and the traditional "we dictate the market" hubris is working out; I think they're not winning like before. Even 8086K owners can thank AMD for performance at a reasonable price: if not for AMD's resurgence, that SKU would have been priced well above $500 like all the previous Extreme Editions, and Skylake-X would have been priced at Broadwell-E levels or higher.

Kenny
 
If the 2080 Ti can barely handle ray tracing at 1080p, it's not like supporting it on lower-end cards is going to do any good. No one out there has 720p monitors.
 
Makes sense now that they've gotten their BIG marketing headlines out! Now on with business, where they will sell 80%+ of the new lineup without the special sauce.
Nvidia needed something that would create hype to sell in a non-die-shrink, virtual-monopoly situation. The masses watching YouTube videos of RT appear to be it.
 
So... no ray tracing on lower cards, and higher-card purchasers won't use it because they'll want to run 1440p at 100+ fps/Hz. Congratulations, Nvidia, on advancing nothing and creating something no game developers are going to support, especially when the console ports those games come from will be running on AMD GPUs.
 

But.....Buy more save more!
 
I get why nVidia is going this route. The dies for the RTX 2080 (545 mm²) and especially the RTX 2080 Ti (754 mm²) are insanely large and thus expensive to produce, and ray tracing is only targeting 1080p resolutions on these parts. Hacking off enough units to produce a more budget-oriented part with a sub-350 mm² die would hack off more than half of the ray-tracing performance, because a good portion of the die is not dedicated to raw compute (the PCIe controller and the like) and still has to be there.
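To make that die-area point concrete, here is a purely illustrative sketch. Only the 754 mm² figure comes from the post; the amount of fixed, non-compute area is a guess, and assuming ray-tracing throughput scales with the remaining compute area is a simplification.

```python
# Illustrative only: how much compute/RT area survives a cut to a ~350 mm^2
# die if a chunk of the die (PCIe, display, memory PHYs, etc.) can't shrink.
# The 100 mm^2 fixed-area figure is an assumption, not a die-shot measurement.

full_die_mm2 = 754    # RTX 2080 Ti class die (figure from the post)
fixed_mm2 = 100       # assumed non-scaling blocks: PCIe, display, PHYs, ...
target_die_mm2 = 350  # the budget die size floated in the post

scalable_full = full_die_mm2 - fixed_mm2      # area available for compute/RT
scalable_target = target_die_mm2 - fixed_mm2  # what's left after the cut

retained = scalable_target / scalable_full
print(f"Compute/RT area retained: {retained:.0%}")  # ~38%, well under half
```

With numbers in that ballpark, the compute and RT share drops to well under half, which is the "more than half of the ray-tracing performance" the paragraph above is getting at.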

I would, however, argue for an RTX 2060 capable of 5 gigarays/sec, with 6 GB of memory on a 192-bit bus and 2048 ALUs, for a $400 MSRP, based on the same dies as the RTX 2070 and 2080. That'd only be capable of ray tracing at sub-1080p resolutions, but it would be enough to get the technology out to developers who have no budget. So far it has been the big AAA houses that have taken an interest in the RTX lineup, and they have the budget for these expensive cards or even their Quadro versions. The indie scene needs to be thrown a bone once in a while to get ideas and technology out there. I would not advise going lower than that, due to the expected poor ray-tracing results.

I would also advise the idiots in nVidia's marketing department against releasing both a GTX 2060 and an RTX 2060. I have a feeling they would contemplate such a move, and it would be stupid. In fact, nVidia should have moved to an xxx5 naming scheme and called the RTX 2080 Ti the RTX 2090.

A GTX 2055 would follow the more traditional generational improvement, with performance moving down a tier in pricing: the GTX 2055 would deliver performance just below a GTX 1070 for less money, the GTX 2050 would mimic the GTX 1060, and so on.
 
This will keep developers from bothering to add support for it until the next generation, save for a few special games that Nvidia paid to add it. As we suspected, this is a gimmick this generation.
 