Nvidia SUPER Refresh: Faster 2060, 2070, 2080, Faster GDDR6 and $100 Off

1- He was talking about hot hardware while owning a 9900K, LOL "nonsense".
2- RTX = DOA, like 3D Vision was back in the day, and G-Sync is going to be...
3- Was I talking to you? Is he your boyfriend/relative or something...? Hm... talking about getting triggered, eh?
BTW, I wasn't off-topic; I said that I'm not going to buy it. That's why your "boi" got triggered. Now I'm going to ignore you. Keep lurking.

3D Vision was around before and outlasted the latest cinema3D trend. It was NEVER DOA. Nor was Gsync. I don’t think you know what DOA means.
 
1- He was talking about hot hardware while owning a 9900K, LOL "nonsense".
2- RTX = DOA, like 3D Vision was back in the day, and G-Sync is going to be...
3- Was I talking to you? Is he your boyfriend/relative or something...? Hm... talking about getting triggered, eh?
BTW, I wasn't off-topic; I said that I'm not going to buy it. That's why your "boi" got triggered. Now I'm going to ignore you. Keep lurking.

1) I don't have a 9900K.
2) RTX is being talked about in reference to AMD as well.
3) Love you.
 
I hope Nvidia plays the John Williams Superman Theme at the announcement.
 
The real reason: an update to the 2080 Ti would kill the RTX Titan.
The Titans are moving well, and I am waiting to hear back from nVidia on my 20% discount offer on the purchase of two of them. If there are any discounts on the 20xx-series cards, they will come in the form of mail-in rebates, not direct price drops, and would be done to reflect cost reductions in components, not really on the GPU chips themselves. I really wouldn't be surprised if the better memory starts finding its way into the 2080 Ti and Titans, just allowing for better OCs.
 
and would be done to reflect cost reductions in components, not really on the GPU chips themselves

DRAM production cycles should be highlighted: essentially, when the fabs move on to something new, supply of the old stuff drops and prices rise accordingly. If GDDR6 is now hitting good yields, for example, then GDDR5X parts may be phased out; similarly, if GDDR6 is now hitting higher frequencies, then perhaps there's room for a new tier of SKUs.
 
RTX is less used than VR, and VR was way overhyped as the next big thing. What they will find out is the same thing: no one is itching to drop a grand to have a subpar experience.

Wow, closed-minded much? I use RTX tech when I play games that support it. And DLSS, and just plain better performance than my 1070. I enjoy the card I have; the cost I paid for it is already forgotten. I know I'll get a year or two or more out of this card, so I'm content.

Sure, I overpaid a bit for new tech. That's just the way it works with new tech. It's always more expensive out the door.
 
Nvidia SUPER... Sorry, we tried to move pricing upmarket, but here's a better version for the same money, and something that basically counters anything AMD will launch with their RDNA... we hope, lol.

Unbiased opinion:
Why is Nvidia even bothering to launch anything? They're clearly so far ahead that it makes the timing interesting. 7nm Samsung/Ampere is 2020, so keep things as usual, right?
 
Wow, closed-minded much? I use RTX tech when I play games that support it. And DLSS, and just plain better performance than my 1070. I enjoy the card I have; the cost I paid for it is already forgotten. I know I'll get a year or two or more out of this card, so I'm content.

Sure, I overpaid a bit for new tech. That's just the way it works with new tech. It's always more expensive out the door.

Problem is, you bought first-gen tech that will be dropped hard when they find a better way to implement it, much like Crytek has shown. It also comes down to the frame rate you can live with; with RTX, the frame rate is far too low for my liking unless I am playing Civilization or something slow like that. I just see it as a waste, as much better tech that can actually run it will be out shortly, and I also have not seen it visually add that much for the load it causes. We just have a different opinion on its usefulness.
 
You are very confused.

Nope, not at all, and at least he used the proper "no one" spelling, unlike the foolish "noone", which does not even exist, so it is not even a word. :D RTX tech is limited at best, and I agree with him: it will be dropped as quickly as possible when something better comes along. Besides, I would not buy a $1200 video card just to play at 1080p. Nope, not me. :D

We can all feel free to spend our own money as we see fit, but that does not make him wrong.
 
You are very confused.

You are I am not, we're not swimming in RTX features on game releases for a reason: it's not worth the effort for the tiny install base, and the performance hit is far too great. A hundred bucks off and maybe faster RAM is not going to change that. First-gen hardware never lives up to its promise; it takes a generation or two, and even then the standard may change, making first-gen tech useless.
 
You are I am not

:ROFLMAO:

First-gen hardware never lives up to its promise; it takes a generation or two, and even then the standard may change, making first-gen tech useless.

I recall my 9700 Pro being useful for plenty of later DX9 titles, and my 8800GTS being useful for plenty of later DX10 titles... T&L in the GF3 kept working...

I could overrun the character limit with counter-examples.

And the big one would be this: RT hardware is simple. Most of the above were difficult to balance in terms of hardware, software, and drivers, but RT is so straightforward from a hardware and software perspective that it's difficult to see current generation hardware becoming unusable going forward. Maybe AMD's first implementation will struggle in drivers as their new architectures usually do, but since Nvidia is the default hardware reference, it's doubtful that they'll have trouble.
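To put something concrete behind "RT hardware is simple": the core operation the RT cores accelerate, alongside BVH traversal, is a small fixed-function ray/triangle test. Below is a minimal CPU sketch of that test (Möller–Trumbore); the vector helpers and sample triangle are made up for illustration, and this shows the principle only, not Nvidia's implementation.

```cpp
// Minimal sketch of the ray/triangle intersection test (Moller-Trumbore) that
// RT cores accelerate in fixed-function hardware, alongside BVH traversal.
// Illustrative CPU version only; not Nvidia's implementation.
#include <array>
#include <cmath>
#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                            a.z * b.x - a.x * b.z,
                                            a.x * b.y - a.y * b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns the hit distance t along the ray, or nothing on a miss.
std::optional<float> intersect(Vec3 origin, Vec3 dir, const std::array<Vec3, 3>& tri) {
    const float kEps = 1e-7f;
    Vec3 e1 = sub(tri[1], tri[0]);
    Vec3 e2 = sub(tri[2], tri[0]);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;  // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(origin, tri[0]);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;   // outside barycentric range
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = dot(e2, q) * inv;
    return t > kEps ? std::optional<float>(t) : std::nullopt;
}

int main() {
    // Triangle sitting at z = 5, ray shot straight down +z from the origin.
    std::array<Vec3, 3> tri = {Vec3{-1, -1, 5}, Vec3{1, -1, 5}, Vec3{0, 1, 5}};
    if (auto t = intersect({0, 0, 0}, {0, 0, 1}, tri))
        std::printf("hit at t = %.2f\n", *t);  // expected: hit at t = 5.00
}
```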
 
I’m just still pissed nVidia didn’t just put more CUs and more texture rendering silicon into the 2080Ti instead of the RT cores.

The tech is just not there yet imo.
 
Problem is you bought a first gen tech that will be dropped hard when they find a better way to implement it, much like Crytek has shown. Also it comes down to frame rate you can live with, with RTX the frame rate is far too low for me to like unless I am playing Civilization or something slow like that. I just see it as a waste as much better tech will be out shortly that can actually run it but I also have not seen it visually add that much for the load it causes. We just have a different opinion on it's usefulness.

You see it that way. My experience has been that the performance is fine at 1440p. I'm not playing high-end competitive games. If your market is one where it's 120Hz refresh or it's just too slow, then yes, you will not get everything you want from today's crop of RT titles.

I've seen performance improvements and NOT had the crazy slowness people seem to be claiming to date. Maybe it's because my games are all running off of NVMe drives? Maybe it's some magic in the sauce, or 30 FPS is fine with me? I don't know. But for my casual gaming experience I've enjoyed it a lot.

My boss also plays at 4K on an RTX 2080, and he's very happy as well.

No, we're not pro gamers with refresh-synced monitors and such. Such is life, I suppose.
 
You see it that way. My experience has been that the performance is fine at 1440p. I'm not playing high-end competitive games. If your market is one where it's 120Hz refresh or it's just too slow, then yes, you will not get everything you want from today's crop of RT titles.

I've seen performance improvements and NOT had the crazy slowness people seem to be claiming to date. Maybe it's because my games are all running off of NVMe drives? Maybe it's some magic in the sauce, or 30 FPS is fine with me? I don't know. But for my casual gaming experience I've enjoyed it a lot.

My boss also plays at 4K on an RTX 2080, and he's very happy as well.

No, we're not pro gamers with refresh-synced monitors and such. Such is life, I suppose.

Eew, you sound like an adult.
 
Ring of Elysium's patch notes dated June 4 have possibly leaked a 2060 Ti and a 2070 Ti.

[Image: screenshot of the patch notes]


https://steamcommunity.com/games/755790/announcements/detail/1609387561737527006
 
Problem is, you bought first-gen tech that will be dropped hard when they find a better way to implement it, much like Crytek has shown. It also comes down to the frame rate you can live with; with RTX, the frame rate is far too low for my liking unless I am playing Civilization or something slow like that. I just see it as a waste, as much better tech that can actually run it will be out shortly, and I also have not seen it visually add that much for the load it causes. We just have a different opinion on its usefulness.

What better way? It will be several YEARS before games can move beyond a hybrid raster/RT approach. Until then, dedicated ray-tracing hardware like the RT cores in Turing will probably be the best option for handling ray tracing. Not the cheapest option, but likely the one that offers the best performance. First-gen tech is what it is; by the time RT has moved beyond "that's neat" in a handful of games, these cards will be outdated, and I think most people are aware of this. That said, I doubt there will be anything "shortly" about a better, higher-performance way to approach RT in games.
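For reference, "hybrid raster/RT" in today's titles means the rasterizer still produces most of the frame, and rays are spent only on the effects rasterization handles poorly (reflections, shadows, GI), followed by denoising. The sketch below is schematic: the function names are illustrative stand-ins, not any real engine's API.

```cpp
// Schematic sketch of a hybrid raster/RT frame loop as used by current RTX
// titles: rasterize most of the image, then spend the ray budget only on the
// effects rasterization handles poorly. Names are illustrative stand-ins.
#include <cstdio>

struct GBuffer { /* depth, normals, albedo, roughness ... */ };
struct Image   { /* resolved color target */ };

GBuffer rasterize_gbuffer()               { std::puts("raster: G-buffer pass");   return {}; }
Image   shade_direct(const GBuffer&)      { std::puts("raster: direct lighting"); return {}; }
Image   trace_reflections(const GBuffer&) { std::puts("RT: reflection rays");     return {}; }
Image   trace_shadows(const GBuffer&)     { std::puts("RT: shadow rays");         return {}; }
Image   denoise(const Image&)             { std::puts("denoise noisy RT output"); return {}; }
Image   composite(const Image&, const Image&, const Image&) { std::puts("composite + post"); return {}; }

int main() {
    // One frame: the rasterizer still does the bulk of the work; rays are an
    // overlay, which is why RT cores sit beside the shader cores rather than
    // replacing them.
    GBuffer g     = rasterize_gbuffer();
    Image direct  = shade_direct(g);
    Image refl    = denoise(trace_reflections(g));
    Image shadows = denoise(trace_shadows(g));
    composite(direct, refl, shadows);
}
```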
 
What better way? It will be several YEARS before games can move beyond a hybrid raster/RT approach. Until then, dedicated ray-tracing hardware like the RT cores in Turing will probably be the best option for handling ray tracing. Not the cheapest option, but likely the one that offers the best performance. First-gen tech is what it is; by the time RT has moved beyond "that's neat" in a handful of games, these cards will be outdated, and I think most people are aware of this. That said, I doubt there will be anything "shortly" about a better, higher-performance way to approach RT in games.

It's more of a brute-force approach rather than one of finesse. I also don't see many studios going to the effort of RT until the next gen of consoles comes out, and even then it may be very limited. Crytek showed a way of getting RT without the massive hardware need, but it still required a GPU beyond what most of the mainstream has.
 
What better way? It will be several YEARS before games can move beyond a hybrid raster/RT approach. Until then, dedicated ray-tracing hardware like the RT cores in Turing will probably be the best option for handling ray tracing. Not the cheapest option, but likely the one that offers the best performance. First-gen tech is what it is; by the time RT has moved beyond "that's neat" in a handful of games, these cards will be outdated, and I think most people are aware of this. That said, I doubt there will be anything "shortly" about a better, higher-performance way to approach RT in games.

Not the cheapest option is the key phrase. I sure as hell am not going to pay fucking $1200 for "that's neat" features in a handful of games. And you'll never convince me that the generational difference between the $699 1080Ti and the $1199 2080Ti is worth a $500 difference, no matter what sort of whiz-bangery they throw into it. They lost themselves a customer here. I'm surprised more people didn't vote with their wallets rather than drink Nvidia's Kool-Aid.

This is coming from someone who has had everything from a Ti4200 to a 1080Ti.
 
Not the cheapest option is the key phrase. I sure as hell am not going to pay fucking $1200 for "that's neat" features in a handful of games. And you'll never convince me that the generational difference between the $699 1080Ti and the $1199 2080Ti is worth a $500 difference, no matter what sort of whiz-bangery they throw into it. They lost themselves a customer here. I'm surprised more people didn't vote with their wallets rather than drink Nvidia's Kool-Aid.

This is coming from someone who has had everything from a Ti4200 to a 1080Ti.

I think you've justified for yourself just fine not spending the money to have the fastest card available. Everyone else agree?

Enjoy.

I still think the tune about RTX being a gimmick will go away in a flash as soon as they show a game that uses RT-driven reflections to see around corners.
 
Not the cheapest option is the key phrase. I sure as hell am not going to pay fucking $1200 for "that's neat" features in a handful of games. And you'll never convince me that the generational difference between the $699 1080Ti and the $1199 2080Ti is worth a $500 difference, no matter what sort of whiz-bangery they throw into it. They lost themselves a customer here. I'm surprised more people didn't vote with their wallets rather than drink Nvidia's Kool-Aid.

This is coming from someone who has had everything from a Ti4200 to a 1080Ti.

I can't say I particularly like the price, but no other card on the market can do 100+ FPS at 1440p in modern games with max/near-max settings, so it was mostly worth it for me.
 
I also don't see many studios going to the effort of RT until the next gen of consoles comes out, and even then it may be very limited. Crytek showed a way of getting RT without the massive hardware need, but it still required a GPU beyond what most of the mainstream has.

PS5/Xbox2 won't be a serious part of the RT conversation, much as they'll try to crowbar RT onto the bullet-point lists like they did with 4K for the current gen.

And Crytek's RT demo was unfortunately misleading and got people's hopes up; it wasn't until almost two months later that they explained the technical mechanics of it and had to admit there wasn't anything necessarily magical about their proprietary take on RT (SVOPI), and that there is still no RT free lunch.

Vega 56 did 1080p30 Medium reflections (but the YouTube video was 4K, which made people assume it was rendered in real time at 4K), and they stated RTX cards would do fullscreen 4K since SVOPI would take advantage of RT cores. https://www.cryengine.com/news/how-we-made-neon-noir-ray-traced-reflections-in-cryengine-and-more#

Unfortunately, SVOPI will never be implemented widely, let alone outside of CryEngine, as DXR looks like it will be the standard. But really, the more the merrier when it comes to RT.

Bottom line, RTX cards and RT cores won't be suddenly obsoleted by some magical new take on RT -- that's nonsense and implies a fundamental misunderstanding of how the tech works.
 
PS5/XboxTwo won't even be part of the RT conversation, despite the efforts they'll go through to crowbar RT onto their marketing bullet-point lists like they did with 4K for the current gen.

And CryTek didn't do AMD fans any favors by getting their hopes up with that on-rails Neon Noir demo video for GDC, since it wasn't until almost two months later that they actually explained the technical mechanics of it and had to admit what was already apparent to everyone else: that there wasn't anything necessarily magical about their proprietary take on RT (SVOPI), and that there is still no RT free lunch. Vega 56 did 1080p30 Medium reflections (but they uploaded the video to YouTube at 4K, which made people assume it was rendered at 4K), and they stated RTX cards would do fullscreen 4K since SVOPI would take advantage of RT cores to accelerate certain functions. https://www.cryengine.com/news/how-we-made-neon-noir-ray-traced-reflections-in-cryengine-and-more#

Unfortunately, SVOPI will never be implemented outside of CryEngine, as DXR looks like it will be the standard. But really, the more the merrier when it comes to RT.

Bottom line, RTX cards and RT cores won't be suddenly obsoleted by some magical new take on RT -- that's nonsense and implies a fundamental misunderstanding of how the tech works.

Yes, yes, rah rah Nvidia. The reality is that RTX still is not cutting it, no one thought Neon Noir was running at 4K with ray tracing on a Vega 56, and Crytek makes money by licensing their tech as well as making games. RT is very limited in games and is struggling even at that, and even the next gen won't be a ton better. Software to run the hardware is far more important than most realize, and that will take a while; most will likely have long moved on from the 2000 series by then. The current prices should concern you far more than the meager RT performance anyway, as they will cause far more harm to the PC gaming market than anything else.
 
Perhaps the introduction of PCIe 4.0 and the increased bandwidth will make RT add-on cards a reality? I would love to see that. Then you could buy whatever card you wanted and add that manufacturer's RT daughter card (or even upgrade your ray-tracing performance with a newer card).

Until then, the technology being embraced and performing at acceptable levels for the majority of the market will bring costs down as yields improve and more coders embrace RT as a tool within their engines.
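For scale on the bandwidth part of that idea, here is a back-of-envelope comparison of what a PCIe 4.0 x16 slot offers versus a GPU's local memory, assuming nominal figures: 128b/130b PCIe encoding at 8 and 16 GT/s per lane, and a 2080 Ti-style GDDR6 setup at 14 Gbit/s per pin on a 352-bit bus.

```cpp
// Back-of-envelope bandwidth comparison. Assumed nominal figures:
// PCIe 3.0/4.0 use 128b/130b encoding at 8/16 GT/s per lane;
// a 2080 Ti-style card runs GDDR6 at 14 Gbit/s per pin on a 352-bit bus.
#include <cstdio>

// Theoretical PCIe throughput in GB/s, per direction.
double pcie_gbytes_per_sec(double gtransfers_per_sec, int lanes) {
    return gtransfers_per_sec * (128.0 / 130.0) * lanes / 8.0;
}

// Theoretical GDDR6 throughput in GB/s.
double gddr6_gbytes_per_sec(double gbits_per_pin, int bus_width_bits) {
    return gbits_per_pin * bus_width_bits / 8.0;
}

int main() {
    std::printf("PCIe 3.0 x16 : ~%6.1f GB/s per direction\n", pcie_gbytes_per_sec(8.0, 16));
    std::printf("PCIe 4.0 x16 : ~%6.1f GB/s per direction\n", pcie_gbytes_per_sec(16.0, 16));
    std::printf("GDDR6 on-card: ~%6.1f GB/s\n", gddr6_gbytes_per_sec(14.0, 352));
    // PCIe 4.0 roughly doubles the slot to ~31.5 GB/s, but that is still an
    // order of magnitude below the ~616 GB/s of local memory a daughter card
    // would be competing with for scene data.
}
```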
 
Perhaps the introduction of PCIe 4.0 and the increased bandwidth will make RT add-on cards a reality? I would love to see that. Then you could buy whatever card you wanted and add that manufacturer's RT daughter card (or even upgrade your ray-tracing performance with a newer card).

Until then, the technology being embraced and performing at acceptable levels for the majority of the market will bring costs down as yields improve and more coders embrace RT as a tool within their engines.

Separate cards just don't fly. This isn't the '90s; just ask Ageia.

Until it's baked in, it is a niche-at-best thing and won't be overly useful.
 
How about a 2080 Ti update to support 4K120 HDMI 2.1 VRR on the new LG OLEDs?
 
I can't say I particularly like the price, but no other card on the market can do 100+ FPS at 1440p in modern games with max/near-max settings, so it was mostly worth it for me.

I guess I understand that. But it really made me question whether or not I cared enough about video games to spend $1200 for a little extra eye candy. In the end, I didn't. When one single part of a gaming computer costs more than every other part combined, there's a problem.
 
When one single part of a gaming computer costs more than every other part combined, there's a problem.

To a point, and depending on purpose, I agree. Note that I haven't purchased an RTX card; for my uses, my 1080 Ti is plenty fast. If I still had, say, a 980Ti (which I didn't, but if I still had 970 SLI, shudder...), I'd be looking much more closely at them, likely at the 2080 non-Ti.

I'll also note that CPUs can be expensive, depending on purpose; monitors are expensive; and don't get me started on sound, including headphones... :D
 
So, well, you didn't upgrade to it; you stayed on the 1000 series just like me. Actions speak louder than words.

I have a 1080Ti. I can upgrade, and if there was a game that appealed to me, I would.

And this is tech. The system I want to upgrade the most is my ultrabook, so I don't consider myself to be a decent subjective barometer. Everyone's needs are different. Objectively speaking, Nvidia's RTX tech works great.
 
If people didn't buy it, then the technology would never become widespread. If it doesn't become widespread, prices do not fall. You have to take the plunge sometime & there will always be something better (and price drops & increases!) just around the corner. The fastest is only the fastest for so long, in every walk of life.
 
For you. A luxury class, flagship part is a choice, a splurge. By no means a minimum requirement.

If and when a price is a problem for enough other people, the price will come down.

But see, that's what I don't get... The same "luxury class, flagship part" jumped $500 in price from one generation to the next. If it were $800 with new fancy features, so be it; I'd buy one. At $1200, an instant buy became a no-buy. I can't believe people defend the pricing for RTX "features."
 
I can't believe people defend the pricing for RTX "features."

I can't defend price / performance, nor the utility of RTX in specific use cases.

What I've said is that I understand how they got to that pricing. The RTX GPUs are quite large; they offer increased performance over their predecessors and they have significant die space reserved for hardware RT. Further, I don't really care what Nvidia is charging: as has been obvious, by pricing RTX cards the way they have, their unit sales have dropped.

That's on them.
 