Is Nvidia marketing to blame for the RTX complaints?

Any hopes Nvidia had of getting me to spend money on these died when they decided to price them at their current price points. Nvidia had always priced their new cards more or less around the previous gen's MSRP, but decided to give these new cards a significant bump this time. I'm currently running a 1080, which I got for $669 (CDN) back in November 2017. If the 2080 Ti was closer to $1,000 CDN I'd be a little more tempted to bite, but it's currently sitting at $1,600 CDN. I originally spent around $1,600 for my whole system when I first put it together, not for a single GPU.
There is no real competition to drive prices down, so GPUs getting more expensive is inevitable.

Anyways, I think price is one of the biggest issues for these cards right now.
U sure?


Yes, some day ray tracing may be the thing, but today it is still pretty much the same end goal it was in the '80s and '90s, albeit slightly closer. Once we have ray tracing actually implemented on a broad scale, not just a shitty EA game and some vague promises of it, we will be there; currently we are not.
To have raytracing in games you first need actual hardware to run it on.
Another year without those "useless" RTX features would mean another year without anything being done to push graphics in games to the next level.
 
So, ignoring all of this die size talk, I think the 2000 series is just priced this way, at least partially, to help clear out all of the old Pascal stock without dropping prices too much. How they want to go about doing that without driving people over to used cards, I don't know.

I think both companies are trying to maintain the current price/performance level as long as they can, because the mining crash bit them in the butt.

Personally it doesn't matter much to me, but I'm one of those crazy people who would rather just turn the settings down in a game or play something older than spend a ton of money on a fancy new GPU. Heresy, I know...
 
RTX is NVIDIA's moniker... RT (ray tracing) is built into DX12 by Microsoft... RT is THE holy grail in graphics... so no.

DLSS also doesn't affect gameplay (unlike PhysX)... A.I. is also coming (DLSS uses deep learning) whether you like it or not.

Let me put it this way:
RT and deep learning are coming, and if you don't like that, you are SOL.


ok, BUT..

PhysX never affected gameplay, it was visual eye candy and nothing more.. at least in nV's hands.
And RT is the holy grail of graphics... when we have gen 2 or 3.. not in the current state. At all.

RT and deep learning are coming? My god.. keep 'em coming, I haven't laughed that hard in a while.. this really is PhysX all over again, only worse..
 
ok, BUT..

PhysX never affected gameplay, it was visual eye candy and nothing more.. at least in nV's hands.
And RT is the holy grail of graphics... when we have gen 2 or 3.. not in the current state. At all.

RT and deep learning are coming? My god.. keep 'em coming, I haven't laughed that hard in a while.. this really is PhysX all over again, only worse..
GPU PhysX was a closed standard, while RTX is a hardware implementation of DXR, which is available to any company that wants to implement it in their products.
Unreal Engine already has support for DXR and Unity will follow soon.
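Since "DXR is built into DX12" keeps coming up, here is a minimal, purely illustrative sketch of what that looks like from the application side: a plain D3D12 feature query, no vendor-specific API involved. It assumes a Windows 10 SDK new enough to ship D3D12_FEATURE_D3D12_OPTIONS5 (1809 / 17763 or later); it is a sketch, not production code.

[CODE]
// Illustrative sketch: ask the default D3D12 device whether it supports DXR at all.
// Windows only; needs a Windows 10 SDK that ships D3D12_FEATURE_D3D12_OPTIONS5.
// Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // nullptr = default adapter; FEATURE_LEVEL_11_0 is the D3D12 minimum.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12-capable device found.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        if (opts5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED)
            std::puts("DXR: not supported on this device/driver.");
        else
            std::printf("DXR: supported (raytracing tier enum value %d).\n",
                        static_cast<int>(opts5.RaytracingTier));
    }
    return 0;
}
[/CODE]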

RTX as some super-duper feature set that justifies higher prices did fail, but ray tracing and neural-network-generated graphics are the natural way computer graphics will evolve, and there is no other way forward.

It was obvious the RTX 20x0 series is a specific product released to clear the waters for this technology. NV could afford to make this bold move of sacrificing rasterization speed for these features because of the lack of competition... or rather their technical superiority and GPU market dominance.

And PhysX... it was a strong selling point for NV cards and the main reason many people bought a GeForce over a Radeon. There was a time when AMD simply had better cards, back when the HD 5xxx series came out, and PhysX helped NV sales. So was it as much of a failure as some people present it to be? No!

AMD will probably fail to implement RT in their products any time soon, which only proves they are two years behind the competition despite having a process-node advantage. When games with DXR support start coming, it will only make NV's dominance even stronger.
 
ok, BUT..

PhysX never affected gameplay, it was visual eye candy and nothing more.. at least in nV's hands.
And RT is the holy grail of graphics... when we have gen 2 or 3.. not in the current state. At all.

RT and deep learning are coming? My god.. keep 'em coming, I haven't laughed that hard in a while.. this really is PhysX all over again, only worse..

Could you name the most used physics middleware...just for the fun of it.

RT is built into DX12... it is called DXR (DirectX Raytracing).
RTX is NVIDIA's implementation of DXR.
AMD hasn’t gotten around to doing much with DXR, but I have a funny feeling your tune will change when they do.

Unless your account is fake that is.
 
GPU PhysX was a closed standard, while RTX is a hardware implementation of DXR, which is available to any company that wants to implement it in their products.
Unreal Engine already has support for DXR and Unity will follow soon.

RTX as some super-duper feature set that justifies higher prices did fail, but ray tracing and neural-network-generated graphics are the natural way computer graphics will evolve, and there is no other way forward.

It was obvious the RTX 20x0 series is a specific product released to clear the waters for this technology. NV could afford to make this bold move of sacrificing rasterization speed for these features because of the lack of competition... or rather their technical superiority and GPU market dominance.

And PhysX... it was a strong selling point for NV cards and the main reason many people bought a GeForce over a Radeon. There was a time when AMD simply had better cards, back when the HD 5xxx series came out, and PhysX helped NV sales. So was it as much of a failure as some people present it to be? No!

AMD will probably fail to implement RT in their products any time soon, which only proves they are two years behind the competition despite having a process-node advantage. When games with DXR support start coming, it will only make NV's dominance even stronger.



I agree.. we just have a different timeline for when..

This will be great once it's ready.. and until then it's really a waste of money. And again, I'll bet that when the software is ready, faster hardware will be out. Simple really.

I also think that AMD will fail to get ray-tracing hardware out in time for the Navi release.. but that will hurt nVidia too.. it will make it much more expensive to "persuade" the different software makers to use a feature that only the 0.1 percent have..

Again, back to PhysX.. you might argue that it sold cards.. and it did.. it just sucked... it tanked performance in most games it was used in..

But hey, that's just my humble opinion; feel free to disagree..

I would have LOVED the 2080 with CUDA cores instead of tensor cores.

The 2080 Ti is also a super sexy card, just priced out of my price range since I tend to buy a new card yearly-ish.
 
Are you trolling?
PhysX has been able to run on CPUs since the AGEIA days?!

I usually turned it off to avoid it hammering my CPU cores. I remember the Metro 2033 benchmark running at a crawl because PhysX was on. Turning it off got me back to 60 fps.

If I owned an Nvidia GPU, I would just set PhysX to the GPU in the NVCP. But if I didn't, I had to turn it off entirely.
 
I usually turned it off to avoid it hammering my CPU cores. I remember the Metro 2033 benchmark running at a crawl because PhysX was on. Turning it off got me back to 60 fps.

If I owned an Nvidia GPU, I would just set PhysX to the GPU in the NVCP. But if I didn't, I had to turn it off entirely.

You seem to be confused about the performance gap between CPUs and GPUs.

Try running graphics on your CPU... to keep up your argument you would have to blame DirectX for the poor performance of rendering the graphics on your CPU... and that would make me question your level of knowledge/insight into the topic...
 
1. No one outside of hardware enthusiasts cares about die size, so the continual argument about the 2070 being a larger die is moot; those details do not justify or help sell a card to the general consumer.
It's not moot. The price the market will bear does not care what the cost to manufacture is, but you can be sure that the manufacturer does care! If the die is really large, the number of ICs per wafer and the yield rate both go down. That drives the cost to manufacture up. Add in all the profit margin and other costs, and it must clear a company's internal rate-of-return hurdle. If it can't, then they simply don't make the product. There needs to be enough slack between what the general consumer market will pay and the total cost plus the profit the company desires, or the product simply won't exist. Products that fail this test are mistakes.
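To put some purely illustrative numbers on that, here is a quick sketch using the usual dies-per-wafer approximation and a simple Poisson yield model. The wafer cost, defect density and die areas below are assumptions picked for illustration (roughly GP104-class vs TU102-class areas), not NVIDIA's or any foundry's real figures:

[CODE]
// Back-of-the-envelope: a bigger die means fewer candidates per wafer AND lower
// yield, so the cost per good die rises much faster than the area alone suggests.
// Every number here is an illustrative assumption, not a real foundry/NVIDIA figure.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979;

// Classic dies-per-wafer approximation for a round wafer (ignores reticle details).
double gross_dies(double die_area_mm2, double wafer_diameter_mm = 300.0) {
    const double r = wafer_diameter_mm / 2.0;
    return kPi * r * r / die_area_mm2
         - kPi * wafer_diameter_mm / std::sqrt(2.0 * die_area_mm2);  // edge loss
}

// Simple Poisson yield model: yield = exp(-D0 * area).
double yield(double die_area_mm2, double defects_per_mm2) {
    return std::exp(-defects_per_mm2 * die_area_mm2);
}

int main() {
    const double wafer_cost = 8000.0;       // assumed $ per processed wafer
    const double d0 = 0.001;                // assumed defect density, defects/mm^2
    const double areas[] = {314.0, 754.0};  // roughly GP104-class vs TU102-class, mm^2

    for (double a : areas) {
        double good = gross_dies(a) * yield(a, d0);
        std::printf("%.0f mm^2 die: ~%.0f good dies/wafer -> ~$%.0f per good die\n",
                    a, good, wafer_cost / good);
    }
    return 0;
}
[/CODE]

With these made-up inputs the larger die ends up costing several times more per good die, which is the whole point about slack between manufacturing cost and what the market will pay.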

The problem for consumers with RTX is that it's not much more than 1.3x the previous gen (extra cores) plus RT/DLSS. At least for now, that's a hard sell due to the limited value added in actual games. It's not terrible if you are upgrading from several generations ago. Specifically it means that, ignoring model numbers and focusing only on price, there really isn't a performance increase, just a feature increase of highly subjective value. Personally, I'm on a 1080 Ti and waiting for this architecture to hit 7 nm.
 
It's not moot. The price the market will bear does not care what the cost to manufacture is, but you can be sure that the manufacturer does care! If the die is really large, the number of ICs per wafer and the yield rate both go down. That drives the cost to manufacture up. Add in all the profit margin and other costs, and it must clear a company's internal rate-of-return hurdle. If it can't, then they simply don't make the product. There needs to be enough slack between what the general consumer market will pay and the total cost plus the profit the company desires, or the product simply won't exist. Products that fail this test are mistakes.

The problem for consumers with RTX is that it's not much more than 1.3x the previous gen (extra cores) plus RT/DLSS. At least for now, that's a hard sell due to the limited value added in actual games. It's not terrible if you are upgrading from several generations ago. Specifically it means that, ignoring model numbers and focusing only on price, there really isn't a performance increase, just a feature increase of highly subjective value. Personally, I'm on a 1080 Ti and waiting for this architecture to hit 7 nm.

It is moot to the consumer; it is not a selling point unless there is a benefit to the larger size, and there isn't much of one if you take the rest of my post into account.
 
It is moot to the consumer; it is not a selling point unless there is a benefit to the larger size, and there isn't much of one if you take the rest of my post into account.
You aren't looking at the big picture.
Mainstream gamers don't care about die size, but they do care about the price-to-performance ratio, and this Turing generation is not doing well on that front, based on sales data.
Nvidia chose to spend almost a third of the die on a feature that gamers don't care about.
Most gamers care about performance and price, and these cards would have sold like hot cakes if that 33% of die space had gone to more CUDA cores.
 
You aren't looking at the big picture.
Mainstream gamers don't care about die size, but they do care about the price-to-performance ratio, and this Turing generation is not doing well on that front, based on sales data.
Nvidia chose to spend almost a third of the die on a feature that gamers don't care about.
Most gamers care about performance and price, and these cards would have sold like hot cakes if that 33% of die space had gone to more CUDA cores.

Go back and read my original post.
 
1. No one outside of hardware enthusiasts cares about die size, so the continual argument about the 2070 being a larger die is moot; those details do not justify or help sell a card to the general consumer.

2. Could nVidia have combated this by bumping the models up, i.e. the 2080 becomes the 2080 Ti, the 2080 Ti becomes the Titan, and the 2070 becomes the 2080? A partial solve, but it would highlight how little performance increase there is this generation.

3. The biggest problem is the lack of RTX/DLSS titles; we are in a chicken-and-egg moment: which comes first, the hardware or the software? So the consumer is paying for a bunch of tech that cannot be used yet, on the hope that it will be used in the future. nVidia took this new tech and upped the price rather than cut their margin, a short-sighted move IMHO, because if you want RTX to take off you need market penetration, which isn't happening at the moment.

I bought a 2080 Ti and am quite happy with it. Was it worth the cost compared to my Titan Xp? No, but it is the fastest and I do notice the difference.

Nvidia should have seen all this coming. What Nvidia should have done was work with game developers and forgo DLSS for now. I think they should have added more CUDA/RT cores instead of tensor cores dedicated to DLSS; I believe that is where they went wrong. Or not hype RTX that much. If each card had more CUDA cores and another 15-20% or so of performance, it would have been excellent, and then they could have mentioned, "hey, our cards will also do ray tracing and we are working with developers; it will be a slow process but we will slowly get there." I think the 2080 Ti being 50% or so faster than the 1080 Ti would have been money. DLSS is not really well received at this point. More CUDA cores would have gone a long way, along with RT.
 
You seem to be confused about the performance gap between CPUs and GPUs.

Try running graphics on your CPU... to keep up your argument you would have to blame DirectX for the poor performance of rendering the graphics on your CPU... and that would make me question your level of knowledge/insight into the topic...

PhysX is enabled by default for that game. If I didn't use an Nvidia GPU, performance tanked with that setting on. I got better performance on my GTX 650 in some scenes than on an R9 290 if PhysX was on for Metro, because it was running on the CPU; the 650 didn't have that problem.
 
PhysX is enabled by default for that game. If I didn't use an Nvidia GPU, performance tanked with that setting on. I got better performance on my GTX 650 in some scenes than on an R9 290 if PhysX was on for Metro, because it was running on the CPU; the 650 didn't have that problem.
CPU PhysX is the main physics engine, yes... but that has NOTHING to do with the GPU PhysX effects you're talking about.

FUD and ignorance are the new black on forums:
https://www.tomshardware.com/reviews/nvidia-physx-hack-amd-radeon,2764.html
 
CPU PhysX is the main physics engine, yes... but that has NOTHING to do with the GPU PhysX effects you're talking about.

FUD and ignorance are the new black on forums:
https://www.tomshardware.com/reviews/nvidia-physx-hack-amd-radeon,2764.html

From the article you posted:

"This is where our current dilemma begins. There is only one official way to take advantage of PhysX (with Nvidia-based graphics cards) but two GPU manufacturers."

This basically sums up what I said.

This image from the article is basically my experience:

[Benchmark chart from the Tom's Hardware article]


If you didn't have an Nvidia card (AMD only), PhysX was CPU-only.

Having this setting enabled by default is silly.
 
Take a few days off to work on your attitude
From the article you posted:

"This is where our current dilemma begins. There is only one official way to take advantage of PhysX (with Nvidia-based graphics cards) but two GPU manufacturers."

This basically sums up what I said.

This image from the article is basically my experience:

[Benchmark chart from the Tom's Hardware article]

If you didn't have an Nvidia card (AMD only), PhysX was CPU-only.

Having this setting enabled by default is silly.

Hey muppet... turn off the EXTRA GPU physics and the CPU physics will run fine.
You should buy yourself a console and sell your PC; you clearly lack the neurons to be gaming on a PC.
 
This is only an issue if you insist on getting the matching number. If you have a 1080, just buy a 2070 for equivalence. Don't worry about impressing people with your specs.
 
I think they did a TERRIBLE job on the model naming for the RTX line.

One of the main concerns is that the RTX 2080, for example, replaces the GTX 1080, yet it performs and costs about as much as a GTX 1080 Ti.
The same goes for the rest of the line, except the 2080 Ti, which is a special case, so to speak.

As it is now

RTX 2060 replaces the GTX 1060, costs like a GTX 1070
RTX 2070 replaces the GTX 1070/1070 Ti, costs like a GTX 1080
RTX 2080 replaces the GTX 1080, costs like a GTX 1080 Ti
RTX 2080 Ti replaces the GTX 1080 Ti, costs an arm and a leg, a kidney, a lung and both retinas.

But what if the RTX 2060 had been named the RTX 2070 instead?
Suddenly it wouldn't sound so bad, as it would cost as much as a GTX 1070 and perform faster; not by much, but still.

And if the RTX 2070 had been named the RTX 2080?
Guess what, it would still be faster than the GTX 1080 and cost as much.

And you get the idea.

I guess Nvidia feared that consumers would view Turing as just Pascal + RTX/DLSS with only a small bump in performance for the same price. So they chose to position the cards as a large performance increase + RTX/DLSS, but also with a big jump in price.

And now we know how it turned out...


I've thought similar things about this. Part of it, I think, has to do with the fact that they promoted the Titan to non-gaming enterprise/machine-learning/compute territory, so the Ti is now what the Titan used to be. If you think of it that way throughout the lineup and just bump every card up one level, then the prices don't seem as outrageous.

On the flip side, however, if you bump them up one level in the lineup, then the generation-to-generation performance increase is even more disappointing than it already was.
 
From the article you posted:

"This is where our current dilemma begins. There is only one official way to take advantage of PhysX (with Nvidia-based graphics cards) but two GPU manufacturers."

This basically sums up what I said.

This image from the article is basically my experience:

[Benchmark chart from the Tom's Hardware article]

If you didn't have an Nvidia card (AMD only), PhysX was CPU-only.

Having this setting enabled by default is silly.


Is this 2010?

I have found that having a dedicated PhysX card has had little to no benefit for 7-8 years now.

It was cool for a short period about 10 years ago, but nowadays it doesn't seem to lighten the load on the main GPU at all, and sometimes performance even goes down if you use a dedicated PhysX card.

That, and I feel like PhysX is nowhere near as ubiquitous as it once was. Very few titles seem to use it at all these days.
 
I've thought similar things about this. Part of it, I think, has to do with the fact that they promoted the Titan to non-gaming enterprise/machine-learning/compute territory, so the Ti is now what the Titan used to be. If you think of it that way throughout the lineup and just bump every card up one level, then the prices don't seem as outrageous.

On the flip side, however, if you bump them up one level in the lineup, then the generation-to-generation performance increase is even more disappointing than it already was.

I don't think the performance increase is disappointing at all. I mean, the RTX 2080 Ti is a monster.
 
I don't think the performance increase is disappointing at all. I mean, the RTX 2080 Ti is a monster.

I guess it depends on what you are coming from. Compared to my Pascal Titan X, which I bought for the same price as an MSRP 2080 Ti TWO AND A HALF YEARS AGO, you gain, what, 20-30% on average?

So let's say I'm getting ~52 fps, which is common for many titles with high settings at 4K; that puts me just a hair above 60 fps in those titles.

Sure, being able to vsync at 60 fps is nice for those of us who don't have adaptive-sync screens, but yet another $1,200 upgrade for 8 fps just doesn't seem worth it, and is a bit disappointing.
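Just to spell that math out (using the ~52 fps baseline and 20-30% uplift above, which are estimates rather than benchmarks), a trivial sketch:

[CODE]
// Spelling out the upgrade math: a 20-30% uplift applied to a ~52 fps baseline.
// Both the baseline and the percentage range are estimates, not measured results.
#include <cstdio>

int main() {
    const double baseline_fps = 52.0;
    const double uplifts[] = {0.20, 0.30};

    for (double u : uplifts) {
        double fps = baseline_fps * (1.0 + u);
        std::printf("+%2.0f%% -> %.1f fps (a gain of %.1f fps over %.0f)\n",
                    u * 100.0, fps, fps - baseline_fps, baseline_fps);
    }
    return 0;
}
[/CODE]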
 
I guess it depends on what you are coming from. Compared to my Pascal Titan X, which I bought for the same price as an MSRP 2080 Ti TWO AND A HALF YEARS AGO, you gain, what, 20-30% on average?

So let's say I'm getting ~52 fps, which is common for many titles with high settings at 4K; that puts me just a hair above 60 fps in those titles.

Sure, being able to vsync at 60 fps is nice for those of us who don't have adaptive-sync screens, but yet another $1,200 upgrade for 8 fps just doesn't seem worth it, and is a bit disappointing.
Well, there are several cases where the Ti is capable of running games at 4K 60+ fps at max settings where the RTX 2080 just can't. So at least there's that.

In the end, like Neo said, "the problem is choice"... err, I mean, "price" :D:D
 
Well, there are several cases where the Ti is capable of running games at 4K 60+ fps at max settings where the RTX 2080 just can't. So at least there's that.

In the end, like Neo said, "the problem is choice"... err, I mean, "price" :D:D

Well, if we ever get choice, the prices will fall.

It's great that AMD has been resurgent on the CPU front, but I really hope they get their act together on the GPU front as well. They've been without a high-end product for way too long.

Heck, it would be amazing if both Intel and AMD could compete in GPUs. I miss the days when we had 3-4 legitimate GPU competitors. It's been a while, but there was a time when 3dfx, ATi and Nvidia were all competing for high-end graphics, and Matrox wasn't that far behind them.
 
Well, if we ever get choice, the prices will fall.

It's great that AMD has been resurgent on the CPU front, but I really hope they get their act together on the GPU front as well. They've been without a high-end product for way too long.

Heck, it would be amazing if both Intel and AMD could compete in GPUs. I miss the days when we had 3-4 legitimate GPU competitors. It's been a while, but there was a time when 3dfx, ATi and Nvidia were all competing for high-end graphics, and Matrox wasn't that far behind them.
I agree.
I don't expect Intel to compete at the high end on its first try (well, actually its 2nd try, or is it the 3rd?). But I can see the sub-$300 market getting really competitive, and that means good news for us all.

So Navi vs. Intel vs. Ampere. It seems 2020 will be exciting.
 
I guess it depends on what you are coming from. Compared to my Pascal Titan X, which I bought for the same price as an MSRP 2080 Ti TWO AND A HALF YEARS AGO, you gain, what, 20-30% on average?

So let's say I'm getting ~52 fps, which is common for many titles with high settings at 4K; that puts me just a hair above 60 fps in those titles.

Sure, being able to vsync at 60 fps is nice for those of us who don't have adaptive-sync screens, but yet another $1,200 upgrade for 8 fps just doesn't seem worth it, and is a bit disappointing.


I always subtract the cost of the old card when I upgrade. Plus, if you are coming from a Titan X you should be aiming for another Titan. Yep, it's sad that it's expensive, but it is what it is. You are right, it's not worth the cost for you unless you have money to throw at a Titan. But for someone like me it was a 70%+ performance increase. Plus, I paid less than $1,000 for my 2080 Ti.

For you, though, I would probably take the wait-and-see approach. Hopefully there is enough competition next year for Nvidia to rethink their strategy.
 
I always subtract the cost of the old card when I upgrade. Plus, if you are coming from a Titan X you should be aiming for another Titan. Yep, it's sad that it's expensive, but it is what it is. You are right, it's not worth the cost for you unless you have money to throw at a Titan. But for someone like me it was a 70%+ performance increase. Plus, I paid less than $1,000 for my 2080 Ti.

For you, though, I would probably take the wait-and-see approach. Hopefully there is enough competition next year for Nvidia to rethink their strategy.


Agreed, and that is what I am doing.

I think you missed the part of our conversation where we were comparing GPUs by cost parity, generation to generation.

So, my Pascal Titan X was $1,200 when I bought it new in August of 2016. If I were to replace it with a GPU of the same cost today, that would be an FE 2080 Ti. So, dollar for dollar, the performance upgrade has been somewhat disappointing over the last 2.5 years.

In 2013 I bought the original Kepler Titan for $1,000 at launch in order to try to keep up with my 30" 2560x1600 screen. Nothing else at the time could do it. Then in the summer of 2015 I tried SLI again, going with dual 980 Tis to support my new 4K TV. They were $675 apiece, I think, for the EVGA OC models I went with? Can't remember, but let's say $1,350 for the pair. It was a disappointing experience, so when the Pascal Titan X launched in August of 2016 I jumped on it for $1,200.

If I wanted to go from Titan to Titan today, a Titan RTX would cost me $2,499. Could I afford it? Yes. But damn. I just can't bring myself to do it. It's just too nuts.
 
Agreed, and that is what I am doing.

I think you missed the part of our conversation where we were comparing GPUs by cost parity, generation to generation.

So, my Pascal Titan X was $1,200 when I bought it new in August of 2016. If I were to replace it with a GPU of the same cost today, that would be an FE 2080 Ti. So, dollar for dollar, the performance upgrade has been somewhat disappointing over the last 2.5 years.

In 2013 I bought the original Kepler Titan for $1,000 at launch in order to try to keep up with my 30" 2560x1600 screen. Nothing else at the time could do it. Then in the summer of 2015 I tried SLI again, going with dual 980 Tis to support my new 4K TV. They were $675 apiece, I think, for the EVGA OC models I went with? Can't remember, but let's say $1,350 for the pair. It was a disappointing experience, so when the Pascal Titan X launched in August of 2016 I jumped on it for $1,200.

If I wanted to go from Titan to Titan today, a Titan RTX would cost me $2,499. Could I afford it? Yes. But damn. I just can't bring myself to do it. It's just too nuts.

Yeah, that was my point as well when I said it's expensive. Nvidia basically asked top-end buyers to stretch their wallets, which is basically bullshit. But they also pushed Turing out on 12 nm, which makes for a big die, and they still want to keep their profits sky high, I guess. I guess they saw it as an opportunity to milk buyers. But Turing didn't work out so well for them, as the numbers show. I think they should have waited until they depleted the Pascal stock and priced these lower, but they tried to double-dip and didn't want to take any less profit. I can bet they are making a good double profit on the 2080 Ti Founders Edition; it probably costs around $600 to make.
 
For me it is this: the price is double that of the last gen. The Ti is now in the price range of Titan cards. RTX does look good, but it seems to perform very poorly, especially on anything less than a 2080 Ti. The 1080 Ti is still very much alive and performing to the same degree as the RTX 2080 and below, just with more RAM on it (same as the new Ti), and no RTX, obviously. DLSS is a gimmick for now. Two or three titles out now support RTX. Launching the 2060 and up as high-performance, high-fps RTX cards misleads customers. A lot could be mended with a more humane price, perhaps; no wonder Nvidia's profits are down, with the mining craze dialing down and prices doubling.
 
People want a 144 Hz 4K GPU. The 2080 Ti wasn't it, so there's basically no point in it existing. It's too slow for ray tracing and too slow for 144 Hz 4K, so what's the point?

It adds no value.
 
People want a 144 Hz 4K GPU. The 2080 Ti wasn't it, so there's basically no point in it existing. It's too slow for ray tracing and too slow for 144 Hz 4K, so what's the point?

It adds no value.

No, I game on a 1440p 144 Hz monitor. I don't give two shits about 4K performance. 4K 144 Hz is a few gens away, so I'm not sure what you were expecting. Plus, a 4K 144 Hz monitor costs $2k; if you can afford that, you can surely afford a Titan as well. The 2080 Ti is overpriced for sure, but it is a viable option for 144 Hz 1440p. If you can grab it under $1k it's not a bad option.

Speak for yourself. There aren't a lot of people running a $2,000 4K 144 Hz monitor. It adds no value to you.

We all define value differently; it's not the same for everyone. But you have every right to speak for yourself, though, not for people in general.
 
People want a 144 Hz 4K GPU. The 2080 Ti wasn't it, so there's basically no point in it existing. It's too slow for ray tracing and too slow for 144 Hz 4K, so what's the point?

It adds no value.

Because the 2080 Ti is the only one that will even do at least 4K/60 fps+ in newer titles. You jumped straight to 4K/144 Hz without even recognizing that the 2080 Ti is the first to do 4K/60 Hz properly.

I have an X27. You're correct, in the latest titles I can't hit even 100+ fps with a 2080 Ti. At the very least I'm holding above 60, though.

Additionally, due to link limitations, you can't even hit 144 Hz at 4K unless you downgrade to 4:2:2; alternatively you can set it to 120 Hz with 8-bit RGB or, if you've got software that can use it, 98 Hz with 10-bit RGB.
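For anyone wondering where those limits come from, here is a rough sketch. DP 1.4 HBR3 gives 4 lanes x 8.1 Gbit/s = 32.4 Gbit/s raw, and 8b/10b encoding leaves about 25.92 Gbit/s for pixel data; the ~5% blanking overhead below is an assumption rather than exact CVT-R2 timing, so treat the borderline cases as ballpark only:

[CODE]
// Rough check of which 4K modes fit through DisplayPort 1.4 without DSC.
// DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s = 32.4 Gbit/s raw; 8b/10b coding leaves
// ~25.92 Gbit/s for pixel data. The 5% blanking overhead is an assumption.
#include <cstdio>

int main() {
    const double link_gbps = 25.92;   // DP 1.4 HBR3 payload after 8b/10b
    const double blanking  = 1.05;    // assumed ~5% overhead for blanking intervals

    struct Mode { const char* name; double hz; double bits_per_px; };
    const Mode modes[] = {
        {"3840x2160 @ 144 Hz, 8-bit RGB",   144.0, 24.0},
        {"3840x2160 @ 144 Hz, 8-bit 4:2:2", 144.0, 16.0},
        {"3840x2160 @ 120 Hz, 8-bit RGB",   120.0, 24.0},
        {"3840x2160 @  98 Hz, 10-bit RGB",   98.0, 30.0},
    };

    for (const Mode& m : modes) {
        double gbps = 3840.0 * 2160.0 * m.hz * m.bits_per_px * blanking / 1e9;
        std::printf("%-34s needs ~%5.1f Gbit/s -> %s\n",
                    m.name, gbps, gbps <= link_gbps ? "fits" : "does NOT fit");
    }
    return 0;
}
[/CODE]

Which lines up with the X27's options: full 4:4:4 RGB only fits up to roughly 120 Hz at 8-bit or 98 Hz at 10-bit, and 144 Hz needs the drop to 4:2:2.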

Ultimately, the 2080 Ti was the only card that would at least keep me above 60 fps in most titles, and to me that adds value. My 1080 Ti would dip regularly into the 30-40 fps range in titles like AC Odyssey, and even with G-Sync, once you get into that 30-40 fps range it becomes noticeable.

Finally, as more titles that properly utilize ray tracing, like Metro, come out, it'll make that feature set worth it, and the 2080 Ti is the only card that will give acceptable framerates at 4K running ray tracing without sacrificing much quality by having to turn down settings elsewhere. The ray-traced global illumination in Metro is absolutely stunning, and it's the first title in quite a while whose graphics left me wowed. I haven't really been wowed by graphics since Crysis first came out. Seeing HDR for the first time in a game wowed me too, but that was more of a display feature, not a GPU compute feature.
 
People want a 144 Hz 4K GPU. The 2080 Ti wasn't it, so there's basically no point in it existing. It's too slow for ray tracing and too slow for 144 Hz 4K, so what's the point?

It adds no value.

That's not happening for a few more generations, and it's not something everyone wants or needs. Not to mention there are only two displays that can do those refresh rates at 4K in the first place, and they have limitations as well.

It's mistaken to say that everyone chases the fastest framerates their display can handle. As long as games run above 60 fps most of the time I'm good; G-Sync/FreeSync handles the dips below that pretty nicely. I went from a 980 Ti to a 2080 Ti because I wanted to be a bit more future-proof for when I get a 4K desktop display later this year. I've got a 1440p 144 Hz display, yet I've decided to downscale from 4K for most games instead, because I appreciate the increased visual fidelity more than running at 100+ fps versus 60+ fps. I don't play multiplayer shooters where framerate is king. I'm definitely the type who would enable all the RTX options because they look better, even if they incur a heavy hit to the framerate.
 