Rumors Suggest Nvidia Could Launch Turing Cards Without Raytracing

If it still has DLSS, then it might be worth it.

Say you have a (currently nonexistent) 1180 for $450-500 with no DLSS, which performs the same as a $700-850 2080 with DLSS. Who in their right mind is going to pay all that extra money just for DLSS? I can certainly put up with TAA for one generation of cards to save $250.
 
a) We don't know where this is going to be priced.

b) Your theoretical 1180 would actually be an 1160, and it would have a fraction of the raster performance of the 2080.
 
So release a 2060 with RT cores that you can't even play with at 1080p; the RT cores are pointless. But hey, Nvidia has to charge more money somehow. This just makes me think they will likely price the RTX 2060 at $399 to justify the lower-tier 1160. Price that at $299?

The only reason I see the 2060 coming out with RT cores is so Nvidia can charge more, even though it won't play ray-tracing games worth shit. It may be a dumb move to us, but they know fanboys will shell out money and pay more for useless RTX cores on the 2060. I won't be surprised if the 2060 gets a few more CUDA cores to justify the price.
 
If it still has DLSS, then it might be worth it.

Worth it for what? How many DLSS games are currently out? I'm tired of the DLSS talk with nothing to show for it. Not sure why people justify a purchase on tech that is not automatic and not guaranteed for every game.
 
They have successfully shifted all the GPU prices up a level, so a GTX 960-level card like the 1060 is priced like a previous-gen GTX 970-level card. I expect them to try it again and price the GTX 1160 close to the GTX 1070.

Good luck with that nonsense. People can only afford to pay so much, and if you insist on making consoles a better value, people will go that route.
 
Indeed. Also... I was a nonbeliever before, but I recently got to try out both Project Stream and Nvidia GeForce Now, which both deliver a console-like gaming experience without any need for special hardware. Now granted, I love my PC rigs and laptops with decent power, and there will always be folks like us here at [H] who want the very best graphics, responsiveness, etc. But for the masses, when they see that GPUs are going to cost $400+ to get the mid-tier to great stuff, they may just decide that $2 per hour is better. (I know... the long-term math is not good, but when have the masses been good at logic and math?)
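
To put rough numbers on that long-term math, here's a minimal break-even sketch in Python. The $2/hour streaming price and the $400 card price come from the post above; the hours-per-week figure is my own assumption.

```python
# Break-even between paying $2/hour to stream and buying a GPU outright.
# All inputs are the post's hypotheticals plus an assumed weekly play time.
GPU_PRICE = 400.0      # up-front cost of a mid-tier card, USD
STREAM_RATE = 2.0      # streaming cost, USD per hour
HOURS_PER_WEEK = 10.0  # assumed gaming time

break_even_hours = GPU_PRICE / STREAM_RATE
weeks = break_even_hours / HOURS_PER_WEEK
print(f"Streaming matches the GPU's cost after {break_even_hours:.0f} hours "
      f"(~{weeks:.0f} weeks at {HOURS_PER_WEEK:.0f} h/week)")
# -> 200 hours, i.e. about 20 weeks; past that, the bought card is cheaper.
```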
 
Curious, if this card came out as the GTX-1180 for $399, would it sell?

a) We don't know where this is going to be priced.

b) Your theoretical 1180 would actually be an 1160, and it would have a fraction of the raster performance of the 2080.

I'm talking about something like the card listed above, which should be more powerful than the 1160. I don't believe the 1160 is going to perform 12% better than a 1080 the way the linked card does; I'm guessing somewhere in the 1070/1070 Ti range.

And my point is that if Nvidia released a 1180 with 30% more raster performance than a 1080 for ~$500, people would buy it. People aren't going to buy a 1080-equivalent 2070 for $500 just because of its hocus-pocus RTX features.
 
As much as I hate Nvidia's pricing this generation, I think it was the right move based on their current situation. Everywhere I look, 2080 Tis are selling well. Further, having the 2080 Ti so overpriced has helped move their backstock of 1080 Tis and other older cards. It sucks for me, as I tend to upgrade my GPU every other generation, and my 980 Ti is getting quite long in the tooth.

I get to build a rendering rig for work come 2Q19, and it still might end up with a 2080 Ti in it, or two of whatever AMD comes out with. On a personal level, I want to support AMD over Ngreedia, but it ultimately comes down to the price:performance ratio and what the cards can do. If AMD really comes out with a 1080 Ti equivalent for $400ish, then it will make way more sense to have two of those in the rig than a 2080 Ti, as I'm not using the rig for AI (just a lot of Photoshop/Premiere work). Heck, even one 1080 Ti equivalent at a fraction of the cost of a 2080 Ti would probably be enough for the workload the machine will be doing, but dammit, I want 4K 100+ FPS in gaming on it too :)
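
As a rough illustration of that price:performance math, here's a small sketch. The prices echo the post's hypotheticals; the relative performance numbers and the multi-GPU scaling factor are made-up placeholders, not benchmarks.

```python
# Hypothetical perf-per-dollar comparison: one RTX 2080 Ti vs. two
# "1080 Ti equivalents" at ~$400 each. Performance is normalized so
# a single 1080 Ti = 1.0; every number here is an illustrative guess.
options = {
    "1x RTX 2080 Ti":        {"price": 1200.0, "perf": 1.35},
    "2x 1080 Ti equivalent": {"price": 800.0,  "perf": 2 * 1.0 * 0.9},  # ~90% scaling assumed
}

for name, spec in options.items():
    ratio = spec["perf"] / spec["price"] * 1000
    print(f"{name}: perf {spec['perf']:.2f} for ${spec['price']:.0f} "
          f"-> {ratio:.2f} perf per $1,000")
```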
 
It would be great if Nvidia decided to pretend to care about mainstream gamers and release a true midrange card.
If the 1160/2060 has die area wasted on ray tracing, it will perform worse than the $450 GTX 1080 that is 2.5 years old.
 
Yeah, those days are gone, mainly because of import taxes and companies trying to post record profits for the sake of stock tickers.

It's not just that; the die size of the top GPUs now is ridiculous. They weren't really pushing the envelope before, but they are now that die shrinks don't do much. Same situation as the CPU market.
 
I want 2080 Ti performance without ray tracing for $700. I'm in at that price and performance.
 
It's not just that; the die size of the top GPUs now is ridiculous. They weren't really pushing the envelope before, but they are now that die shrinks don't do much. Same situation as the CPU market.

Agreed. From 14nm to 12nm to now, we really haven't seen a breakneck increase that means something. Sure, resolution capabilities have increased, but they always have; we haven't had anything like the jump from vector graphics to rasterization. Ideally, ray tracing would be the next major leap in how things are rendered, but as Nvidia proved, we are nowhere near the level of relative power needed to push such a technology, not without creating cores specifically designed to calculate the rays, and even then we get subpar performance due to compute restrictions.
 
This was always needed...?

We're just going to get more RT cores relative to shader cores for rasterization in future releases.

Yeah, I don't see that happening. Those are compute cores in workstation cards, and they don't want to cannibalize the higher-profit product lines. I don't know about you, but $3-5k for a gaming card isn't my cup of tea, and the AMD neophytes would be out in full force.
 
Those are compute cores in workstation cards

They can make cards without RT cores for compute, or 'blend' the cores if there's enough similarity between them.

I don't know about you, but $3-5k for a gaming card isn't my cup of tea

Where do you get this from? Smaller processes mean more transistors in the same space. Price shouldn't shoot up for consumer cards.
 
They can make cards without RT cores for compute, or 'blend' the cores if there's enough similarity between them.



Where do you get this from? Smaller processes mean more transistors in the same space. Price shouldn't shoot up for consumer cards.

Perhaps I didn't explain well enough?

While it's true you can have unified-shader compute or blended cores, a specialized core designed and optimized for a function will always be better at it, with less overhead, than a more generalized core. You can certainly try ray tracing on raster cores, but the performance is abysmal (Nvidia confirmed as much). If you want verification of my line of thought, you can hit up one of AdoredTV's videos where he explains this. While what you say may be possible, I don't believe it is with current lithography methods at this very moment; maybe in two years, when 7nm EUV has matured enough.

For the second part, selective quoting doesn't help here. My point was that loading a gaming card with high-level compute cores can rival, and so compromise, a workstation-class card that costs significantly more money. Why? As a corporate entity, would you allow a lower-level product to undercut a higher-tier product that sells for a lot more? The die size of the 2000 series is enormous on 12nm, but the $1,200 price tag likely doesn't come from that alone; it's probably a symptom of adding higher-level compute to the die. Why should I buy an $8k workstation card, when everyone wanted the Ti model to be $800, if that lower-tier product has enough compute to look better for my business? Would six hours of ray-traced rendering time make that huge a difference when it used to take several days to a week?
Do you by chance remember the EVGA 9800 GX2 SSC? Many people were buying them up for compute instead of workstation cards because they were $650 apiece rather than more than $1k. I believe Nvidia won't make that mistake again.

Sorry if I was confusing or didn't fill in much depth; a cell phone is not ideal for long explanations.

The $3-5k figure is basically a joke. People want 4K 144Hz ray tracing now, which isn't really possible, and even if Nvidia could do it, people would bitch, because a card with that much compute would cost $3-5k. Like I said, Nvidia wouldn't be stupid enough to compromise higher-revenue products like their workstation lines.
 
Worth it for what? How many DLSS games are currently out? I'm tired of the DLSS talk with nothing to show for it. Not sure why people justify a purchase on tech that is not automatic and not guaranteed for every game.

Preach. But this is how Nvidia's marketing works: look at this awesome feature! Just ignore the fact that no games use it yet... Within a year there might be two games that use it, tops, because very few people will have these GPUs. So, pay us full price now for some benefit a year later in maybe two games! Don't wait until those games are actually available with our feature; by then GPUs will be discounted, and that's just not cool, brah.

I went through this with PhysX. I switched from an RX 480 to a 1060 because I wanted PhysX in whatever Batman game launched at that time (the puffy smoke clouds looked awesome), and some other game I can't even remember; that's how relevant it ended up being to my gaming. In hindsight it was a bad choice, but it worked out anyway: thanks to the mining craze, I gained nearly $200 when I sold the card for way more than I paid for it (thanks, miners!).

Now I'm not falling into the Nvidia promise trap. RT, DLSS, whatever: nothing uses it yet, so I'll wait five years and see if the market ends up embracing it. Meanwhile, I'm buying Navi for good-enough ultrawide 1080p performance at a cheaper price.
 
They should have dropped the cards down a tier. The 2070 should have been the new 2060, and so on, with the price per tier lowered as well. But it was new and shiny, and I guess they had to pay for that decade of R&D somehow.
 
Nvidia is just pushing mid-tier chips and trying to pass them off as high-end chips. They started this when the GeForce x80 card went from using a G_00 die to a G_04 die. For example, the GTX 580 used the high-end GF110 chip, while its successor, the GTX 680, used the mid-tier GK104; the TRUE successor of the GTX 580 was the GK110-based Titan. They eventually used the GK110 in the stripped-down GTX 780, but prices during the GTX 7-series were kinda ridiculous, a less extreme version of what we are going through today.

The closer parallel to today would be the GTX 200 series from about a decade ago. The GTX 280 is a nice parallel to the RTX 2080, and the GTX 260 parallels the RTX 2070. If you account for inflation, the GTX 280's MSRP is about $705, with the GTX 260's MSRP being about $530.
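
For anyone who wants to reproduce that adjustment, here's a minimal sketch. The launch MSRPs ($649 for the GTX 280, $399 for the GTX 260) and the single CPI multiplier are my assumptions, so the output won't match the quoted figures to the dollar; the exact numbers depend on which CPI series and end year you use.

```python
# Inflation-adjusting launch MSRPs with a single CPI multiplier.
# Launch prices and the multiplier are assumptions, not sourced data.
CPI_MULTIPLIER = 1.17  # rough US CPI growth, mid-2008 -> late 2018

launch_msrp = {"GTX 280": 649.0, "GTX 260": 399.0}
for card, msrp in launch_msrp.items():
    adjusted = msrp * CPI_MULTIPLIER
    print(f"{card}: ${msrp:.0f} at launch -> ~${adjusted:.0f} in 2018 dollars")
```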

So is it out of the question that AMD will respond with an RX 3080 that performs better than the RTX 2070 for $350, and an even cheaper RX 3070 for $220? Because that's how AMD responded with the 4870 and 4850, again adjusting for inflation.
 
It's 'cause they have all that overstock, so they're gonna change one or two parts and sell it for more.
 