So you're not their market for RTX.
I doubt they care; you'll probably buy the card anyway for the faster non-RTX performance.
Or for epeen, 'cause it's faster than the card you currently brag about in your sig.

My post went completely over your head. No one is buying a $1200 GPU to run 1080p at 60 FPS. There is NO market.
 
Feels like this is one of those features that everyone gets worked up about but won't actually be usable until 3 generations from now. Just hope the 2080ti is able to provide a significant performance uplift in non-raytracing situations.
What we've seen so far is that it's 30-40% faster than the previous generation equivalent in rasterized games (1080 -> 2080).
As others have said, it's a bit of a concern if they're giving "1080p, 60 FPS" as the target for a $1200 card in 2018. How much better will it REALLY look with this tech to justify what seems to be an incredible performance hit? Furthermore, if you're launching brand new cards supposedly built with dedicated cores for this sort of raytracing and AI, and charging an arm and a leg over previous-generation MSRP, shouldn't the quality and viability of the features in question be exceptional, rather than struggling to reach what used to be an old baseline with them enabled?

Lastly, I'm even more concerned about how different the "RTX" raytracing and AI tech will be. Is this another GSync/PhysX/GameWorks-style thing where they are pushing yet another proprietary way to do things? If so, it means the game experience for those with AMD cards or whatnot may be very different in titles that make use of these features - assuming, of course, they really look/feel different - forcing developers to make a choice. If this does come to pass, I'm more interested in seeing what sort of ideally open alternative AMD and others will embrace instead, and I hope the industry prefers that interpretation to whatever proprietary crap Nvidia may decide to push yet again.
No, it's not. RTRT is part of both the DX12 and Vulkan APIs. RTX is just NVIDIA's hardware pipeline and middleware.
RTRT is part of the DX12 specification, so any hardware can run it. The question will be how fast, and when ATI and Intel decide to bring their answer, probably later in 2019 for ATI.

It has happened plenty of times that new technologies take a while to catch on. There are lots of good discussions going on here on [H] as to what is likely to happen: the 7nm process is not too far off and could boost Nvidia's performance, and ATI must be working on something, but we can only guess at a lot of this right now.

The Radeon 3870 was a great card for DirectX 9. It supported 10.1, but was so slow that you'd run a game and think, "Wow, this will be cool when a card can actually run it," then set it back to DX9.
Speak for yourself. I ran everything in DX10 that I could get my hands on when I got my pair of 8800GTX.
They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.
They're sold out everywhere except the 2080, last I checked. As with previous generations the 2080 is the middle child in the high end and is seen as the worst deal.
That's not a 2x performance increase as-is, that's a 2x performance increase over whatever unspecified AA method they are using. They could be doing 4x super sampling for all we know.

While I do find it cool and interesting, I don't think it's unfair to be disappointed in the performance of a card that takes a major hit doing something it has dedicated hardware for. By all accounts the new range don't seem that much faster than the 10xx series if all you want is max frames. Definitely not for the money.

I was waiting for this range (and expected Ti model ~8 months down the line) to buy my next card. I love AMDs generally better value offerings but I have more disposable income nowadays and was looking to this launch to be the decider on the next card I bought. Now it looks like I'll just wait and pick up a 1080 Ti for cheap instead.

We're all enthusiasts here, and I appreciate people are looking for different things when new tech comes out. But I don't think anyone is unjustified in being disappointed with nVidia's offerings when all they wanted was the next best card to watch their FPS counter go up, especially when they announced the prices. If you just want the bleeding edge and some fancier shadows and reflections, and are willing to take an abnormally high hit to the wallet, and a slightly shocking hit to frame rate, be my guest. Someone has to fund the early adoption. But your somewhat derogatory attitude towards people who place value in different things to you comes across as the whining you derided earlier.
It is supersampling (Deep Learning Supersampling). If I remember correctly 4K with DLSS is supersampled from 1080p.

You're missing the fact that without the dedicated hardware the performance hit would be even larger. If Metro: Exodus runs 60 FPS at 1080p on the 2080 Ti, then the 1080 Ti without the dedicated hardware would run at 7-10 FPS with the same settings.

That is your choice as a consumer.

As I said above, this new generation will still be the fastest cards in rasterized games out there (by 30-40% based off what we've seen so far).
My post went completely over your head. No one is buying a $1200 GPU to run 1080p at 60 FPS. There is NO market.
Oh, really? I preordered a 2080 Ti and I will happily play RTX-supported games at 1080p 60 FPS and "normal" games at 4K 100 FPS.
 
After all is said and done I'll just be happy when the real reviews starting rolling out on all of this. Hard part is not pulling the trigger. Got the Pavlov thing going on when I see a new x80TI now.

On the other hand, my display obsession seems like it'll pay off, since I still have an old 1080/120hz/3D Asus hooked up to one rig and it's the one I was planning on switching GPUs out on anyway. I've got all major resolutions covered: 1080p/120hz, 1440p/144hz/G-Sync, 4k/60hz/HDR10. ;)
 
IMO that isn't a bad thing at all for anyone. It's time to stop worrying about 150 FPS or 160 FPS and get back to looking at IQ. I remember how excited I was the first time I had a card doing hardware lighting in Quake. I would love to get back to buying new hardware and getting excited about turning on new IQ stuff that I couldn't turn on with my old gear.

Well then you're wrong. It's definitely not bad for everyone, far from it, I love new tech. But 1080P and "targeting" 60FPS is just too much of a cost for some, and I'd imagine most, people. At least if it was 1440P it would be something.

I agree that anyone playing games at over 100FPS at their chosen resolution could probably afford to lose some frames for IQ and new tech, but those people are generally not playing at 1080 as it is.

We're talking £1000 here for the Ti; 60FPS at 1080P and I'd be expecting it to suck my dick as well, raytracing or not. I'm happy for those of you/us who have that much money to burn on a significant frame hit for a significant, but arguably still slight, IQ increase. But I don't think that is the majority of people who were waiting for this line up for an upgrade.
 
Well then you're wrong. It's definitely not bad for everyone, far from it, I love new tech. But 1080P and "targeting" 60FPS is just too much of a cost for some, and I'd imagine most, people. At least if it was 1440P it would be something.

I agree that anyone playing games at over 100FPS at their chosen resolution could probably afford to lose some frames for IQ and new tech, but those people are generally not playing at 1080 as it is.

We're talking £1000 here for the Ti; 60FPS at 1080P and I'd be expecting it to suck my dick as well, raytracing or not. I'm happy for those of you/us who have that much money to burn on a significant frame hit for a significant, but arguably still slight, IQ increase. But I don't think that is the majority of people who were waiting for this line up for an upgrade.
Well, then you were never the target consumer for these cards to begin with. Anyone who thinks this is just a "slight" increase in image quality needs to get their eyes checked. Even with the horrible compression that Youtube does you can still easily see the benefit. A video is never going to accurately portray what it looks like on your own screen.
 
The market for high-end equipment is like the fashion industry, really. It's all about the bling: streamers, e-sports, etc. The average person is going to get what's in budget, and gaming companies are going to develop for that budget player's hardware but try to include implementations of new tech so the blingers can show off how awesome the game is. Like every new advance, it will take (like others said) a couple of years to get off the ground for real. The biggest thing will be how gaming companies deal with it; if AMD is again in bed with the console vendors, you can expect light development efforts in the coming years. It's all about time/money.
 
It is supersampling (Deep Learning Supersampling). If I remember correctly 4K with DLSS is supersampled from 1080p.

You've missed what I'm saying. Because of nVidia's insistence on giving us no details whatsoever about their performance numbers, there is no confirmation. But there is no way that enabling any kind of AA improves performance. The numbers for the 1080 are definitely with some form of AA/supersampling, same with the 2080. The "+DLSS" is showing how much better the performance is when using DL supersampling on the 2080 instead. It's the only way the numbers make sense.

You're missing the fact that without the dedicated hardware the performance hit would be even larger. If Metro: Exodus runs 60 FPS at 1080p on the 2080 Ti, then the 1080 Ti without the dedicated hardware would run at 7-10 FPS with the same settings.

You're missing the fact I wouldn't be enabling raytracing in this case. I'd be playing it at 1440P and at significantly more than 60FPS to boot (I'd expect, I don't know what the requirements are).

And regardless, I'd be doing it for half/nearly half the price, depending on prices when I buy/bought a 1080 Ti.

As I said above, this new generation will still be the fastest cards in rasterized games out there (by 30-40% based off what we've seen so far).

Oh I don't disagree, except I think 30% is being overly optimistic. I'm expecting more like a 20% at best increase, model-to-model. Still the fastest cards around though for sure. Just with a price that more than makes up for it.
 
Well, then you were never the target consumer for these cards to begin with. Anyone who thinks this is just a "slight" increase in image quality needs to get their eyes checked. Even with the horrible compression that Youtube does you can still easily see the benefit. A video is never going to accurately portray what it looks like on your own screen.

Definitely no argument there, so already your "IMO that isn't a bad thing at all for anyone" is wrong. And it's not like I'm the only one in that boat.

And I said arguably slight. It's still the same textures, same models, same animations, etc. The demos (worth pointing out the importance of that word as well) are certainly impressive, but I wasn't complaining about graphics in the first place.
 
You've missed what I'm saying. Because of nVidia's insistence on giving us no details whatsoever about their performance numbers, there is no confirmation. But there is no way that enabling any kind of AA improves performance. The numbers for the 1080 are definitely with some form of AA/supersampling, same with the 2080. The "+DLSS" is showing how much better the performance is when using DL supersampling on the 2080 instead. It's the only way the numbers make sense.



You're missing the fact I wouldn't be enabling raytracing in this case. I'd be playing it at 1440P and at significantly more than 60FPS to boot (I'd expect, I don't know what the requirements are).

And regardless, I'd be doing it for half/nearly half the price, depending on prices when I buy/bought a 1080 Ti.



Oh I don't disagree, except I think 30% is being overly optimistic. I'm expecting more like a 20% at best increase, model-to-model. Still the fastest cards around though for sure. Just with a price that more than makes up for it.

DLSS might still be useful. From what I’ve seen it’s really really good but I need to see it in motion.

Assuming DLSS is 30% faster than a 2080ti not using it, and the 2080ti is normally 25% faster than a 1080ti, that's 63% faster than a 1080ti. 100% more money for 63% faster is pretty decent value (where applicable). But reviews should be out before launch, and that number could be much higher; we'll have to see. Based on the 1080/2080 benchmarks from nVidia, the 2080ti could be up to around 150% faster than a 1080ti with DLSS, I would think.
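
A quick sanity check on that compounding, treating both percentages as assumptions until real reviews land:

```python
# Back-of-the-envelope compounding of the two assumed speedups above.
# Both factors are guesses, not measured numbers.
raster_gain = 1.25   # assumed: 2080 Ti ~25% faster than a 1080 Ti without DLSS
dlss_gain = 1.30     # assumed: DLSS adds ~30% on top of that

combined = raster_gain * dlss_gain
print(f"Combined: {combined:.3f}x, i.e. ~{(combined - 1) * 100:.1f}% faster than a 1080 Ti")
# prints ~62.5%, which is where the "63% faster" figure above comes from
```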

Value is per person. To some people $1200 is 0.5% of their disposable income over two years... others it’s 25%. One guy will not think twice. If you’re happy with your 1080ti ride it out.
 
My post went completely over your head. No one is buying a $1200 GPU to run 1080p at 60 FPS. There is NO market.

My thoughts exactly. The performance without ray tracing better be substantially better to justify the price.
 
You're missing the fact I wouldn't be enabling raytracing in this case. I'd be playing it at 1440P and at significantly more than 60FPS to boot (I'd expect, I don't know what the requirements are).

And regardless, I'd be doing it for half/nearly half the price, depending on prices when I buy/bought a 1080 Ti.
I was addressing this point you made:
I don't think it's unfair to be disappointed in the performance of a card that takes a major hit doing something it has dedicated hardware for.

The dedicated hardware is for doing RTRT, and the performance hit comes from turning it on. It's hard to be disappointed with the performance of a brand new graphics quality feature that we have no prior benchmarks to compare to. I came up with my example assuming NVIDIA's claim that the 2000 series is 6-8x faster than a 1000 series card at RTRT.
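
The 7-10 FPS estimate falls straight out of that claim; a quick sketch, taking NVIDIA's 6-8x number at face value:

```python
# Derive the hypothetical 1080 Ti figure from NVIDIA's claimed 6-8x RTRT advantage.
rtx_2080ti_target_fps = 60                         # the quoted Metro: Exodus target on the 2080 Ti
claimed_speedup_low, claimed_speedup_high = 6, 8   # NVIDIA's claimed range

low_estimate = rtx_2080ti_target_fps / claimed_speedup_high   # 60 / 8 = 7.5 FPS
high_estimate = rtx_2080ti_target_fps / claimed_speedup_low   # 60 / 6 = 10 FPS
print(f"Estimated 1080 Ti RTRT frame rate: {low_estimate:.1f} to {high_estimate:.1f} FPS")
```
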
Oh I don't disagree, except I think 30% is being overly optimistic. I'm expecting more like a 20% at best increase, model-to-model. Still the fastest cards around though for sure. Just with a price that more than makes up for it.
Going by the 14.2 TFLOP/s figure NVIDIA provided for the shader cores alone, the 2080 Ti is about 30% faster than the 1080 Ti at 11 TFLOP/s.
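
For reference, the division behind that figure (spec-sheet TFLOP/s only, so it says nothing about real-game scaling):

```python
# Ratio of quoted FP32 shader throughput.
tflops_2080ti = 14.2
tflops_1080ti = 11.0

ratio = tflops_2080ti / tflops_1080ti
print(f"2080 Ti vs 1080 Ti: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}% more raw shader throughput)")
# prints ~1.29x, i.e. roughly 30% on paper
```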
 
Let's be honest.

Consoles are what drive the market. Not PC. I know that statement triggers people here, but it's the truth. Games aren't going to make heavy use of raytracing until and unless the consoles support it, just like games didn't go 64-bit en masse until the PS4/Xbox One came out, and just like games didn't stop using DirectX 9 until the Xbox One came out. You might see a few games with cursory support for RTX and minor visual enhancements, but they aren't going to go all out investing massive programmer hours in supporting a feature that a minority (2xxx owners) of a minority (PC gamers) use.

By the time consoles do support raytracing, the RTX2080 will be several generations old and too slow to handle the games that use it.
 
Too little, too soon?

nVidia make it look amazing by going on about ray tracing, and how hard that is, and pricing the cards to the moon, yet we are only really talking about a few effects: shadows, some lighting, some weapon effects, etc. That's not what the tech media are saying, however; all I keep hearing is the overly positive nVidia narrative telling us how great ray tracing is and how much of a game changer the RTX cards are.

The bottom line is this...

These RTX cards use a combination of mostly traditional rasterised graphics with a few ray-traced effects blended into the scene. This is not full ray tracing, and the tech media needs to better educate buyers on this fact. So many think that by simply buying a 2080ti they are going to be playing fully ray-traced games that look totally different from what existing graphics cards produce.

Now at the same time, we are told how good the ray traced effects are from a realism and detail point of view, yet we now face gaming at 1080p and forcing our monitors to scale the image into a blurry mess. Hardly worth it this generation, is it? Maybe wait for the next iteration or so before jumping on this marketing train.
 
They didn't specify: 1080p 60 FPS is the target for which RTX card? The 2080 Ti? The 2080? Or the 2070? Because this makes all the difference.

I don't think it does, at least from my POV.

The 2070, with everything we know about it, should be at least as fast as a 1080, and if we're talking about price, it certainly is a "1080".

The 1080 also can basically crush every known title at 1080P.

That means the ray tracing load is so bloody taxing a setting that it basically causes a ~50% performance hit on a "1080".

And if it's a 2080Ti? Well, then all of a sudden that ray tracing tax is upwards of 75%, given the card is a good clip faster than a 1080Ti.
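
To make those percentages concrete, here's the rough arithmetic with assumed (not measured) non-RTX frame rates for each card:

```python
# Hypothetical non-RTX frame rates at 1080p; both are assumptions for illustration only.
rtx_on_fps = 60                     # the quoted ray tracing target
assumed_raster_fps = {
    "2070 ('a 1080')": 120,         # assumption: a 1080-class card clears 120 FPS at 1080p
    "2080 Ti": 240,                 # assumption: a good clip faster than a 1080 Ti
}

for card, fps in assumed_raster_fps.items():
    hit = (1 - rtx_on_fps / fps) * 100
    print(f"{card}: ~{hit:.0f}% frame-rate hit with ray tracing on")
# prints ~50% for the 2070 and ~75% for the 2080 Ti, matching the rough figures above
```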

This basically makes the hairworks tax look like a joke.

That being said, I am really impressed with the IQ and speed improvement RTX brings - I am not impressed with the performance of the "improved" CUDA cores.
 
I thought the whole point of the Tensor cores was to fill in noise from lower-fidelity ray tracing. I don't understand why the ray tracing elements can't support adaptive fidelity/resolution independent of the rasterization resolution. At 1080p the 4300-some CUDA cores are going to have a 4ms render time and just sit sleeping, waiting for the ray tracing engine to do its thing. Why can't we perform all the raster work at 4K and render the RTX lighting elements at 1080p?
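
Just to put pixel counts on that idea (plain arithmetic, nothing to do with how RTX actually schedules its work):

```python
# Pixel budgets at the two resolutions under discussion.
raster_4k = 3840 * 2160     # ~8.3 million pixels of traditional raster work
rt_1080p = 1920 * 1080      # ~2.1 million pixels' worth of ray-traced effects

print(f"4K raster pixels : {raster_4k:,}")
print(f"1080p RT pixels  : {rt_1080p:,}")
print(f"Tracing at 1080p instead of 4K is {raster_4k // rt_1080p}x less ray work per frame")
```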

Sub-native shadows and reflections have been around forever and are a great way to get more performance while keeping render resolution high.

This is stupid.
 
That's not a 2x performance increase as-is, that's a 2x performance increase over whatever unspecified AA method they are using. They could be doing 4x super sampling for all we know.



While I do find it cool and interesting, I don't think it's unfair to be disappointed in the performance of a card that takes a major hit doing something it has dedicated hardware for. By all accounts the new range don't seem that much faster than the 10xx series if all you want is max frames. Definitely not for the money.

I was waiting for this range (and expected Ti model ~8 months down the line) to buy my next card. I love AMDs generally better value offerings but I have more disposable income nowadays and was looking to this launch to be the decider on the next card I bought. Now it looks like I'll just wait and pick up a 1080 Ti for cheap instead.



We're all enthusiasts here, and I appreciate people are looking for different things when new tech comes out. But I don't think anyone is unjustified in being disappointed with nVidia's offerings when all they wanted was the next best card to watch their FPS counter go up, especially when they announced the prices. If you just want the bleeding edge and some fancier shadows and reflections, and are willing to take an abnormally high hit to the wallet, and a slightly shocking hit to frame rate, be my guest. Someone has to fund the early adoption. But your somewhat derogatory attitude towards people who place value in different things to you comes across as the whining you derided earlier.
Well thought out response. I respect everything you said. I don't know that I agree with the whining part, my response was merely reflecting on a lot of very evident bias from a certain group here - but if that's how it came across that's a little pie on my face.

We don't know what the comparison was, especially since it was distributed by nVidia. Nonetheless, they are claiming a 50% increase in performance in "regular gaming", and I would hope that doesn't mean against 4x super sampling. I wouldn't put it past them - they are loose with numbers - but we'll find out soon enough.

I have no problem with AMD, although I'll admit I have not owned an ATI card since AMD acquired them (but many CPUs); it's the blind loyalty and misleading complaints that annoy me. Some here do not want to accept that, as of right now, nVidia is king and this new GPU seems to be a winner - extending the gap between the two. Anyone who wants to complain about the price point, I'm 100% in, but whining because a new technology that could push graphical fidelity forward for the first time in a long while isn't 4K60? Sorry AMD doesn't have an answer today, but let's not disparage what is happening here.
 
I am debating whether I want to buy one or go SLI. Have we heard anything about the new bridge and SLI? Does it help ray tracing?

If the preorders would stop selling out in seconds that would help too...
 
While wandering around for SLI info I found this:
[attached image: chart comparing ray tracing performance across the RTX cards]

I was wondering if there was a ray tracing differential between the cards. Apparently there is a decent difference: the 2080ti is 66% faster than a 2070. So if they optimize for a 2070, a 2080ti should do fine at 1440p, IMO. We'll see.
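
Rough sanity check of that, assuming the chart reflects the announced Giga Rays/s figures (10 for the 2080 Ti, 6 for the 2070 - that's my reading, not a quote):

```python
# Assumed Giga Rays/s figures; compare against pixel-count scaling from 1080p to 1440p.
gigarays_2080ti = 10.0   # assumed announced figure
gigarays_2070 = 6.0      # assumed announced figure

rt_advantage = gigarays_2080ti / gigarays_2070      # ~1.67x, the "66% faster" above
pixel_scale = (2560 * 1440) / (1920 * 1080)         # 1440p has ~1.78x the pixels of 1080p

print(f"2080 Ti ray budget vs 2070 : {rt_advantage:.2f}x")
print(f"1440p pixels vs 1080p      : {pixel_scale:.2f}x")
# The two are in the same ballpark, so a title tuned for 1080p RTX on a 2070
# could plausibly land near 1440p on a 2080 Ti.
```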
 
I think my main objection is to how it has been marketed, in terms of raytracing. Okay, if this is first-generation new tech that is only now somewhat possible thanks to new purpose-built elements of the GPUs, that's fine. However, it seems odd to focus so much on that - the thing with such a performance hit that it runs at 1080p 60fps at best even with a $1200 card - instead of the general overall performance of the cards. Let's face it, if it takes a 2080 Ti to run raytracing mode at 1080p 60fps, then pretty much anyone with anything else is NOT going to be able to do so at any sort of playable framerate. Thus, it will not be something included in many games, because you know 90+% of your buyers aren't going to be able to run it right now. With this in mind, it seems to me that it would be better to focus on general-use, apples-to-apples comparisons of the 2080 Ti and other RTX cards, with a blurb about "Oh by the way, they also have hardware for new-generation raytracing which will be useful in specialized circumstances" as something of a special feature for the minority.

Regarding the RTX branding/middleware: so it is pretty much confirmed that this is NOT going to be a meaningfully proprietary implementation? I'm glad to know there are some raytracing standards out there and that RTX seems to be somewhere in their arena at least, but it wouldn't be the first time that an Nvidia product's implementation, hardware or software, was just different enough that developers couldn't simply develop for the standard. I'm hoping we can move away from the days of manufacturer-specific implementations or functions (i.e. GameWorks, etc.) and instead just stick to standards and open development, no matter how each side chooses to name how they get there. I really hope that divergent raytracing implementations are not the next battlefield, so to speak.

Oh, by the way, did anyone notice that the Asus 2080 Ti STRIX model doesn't seem to be available to order? For the 2080 Ti only the "Dual" models are available; the base 2080 offers the STRIX pre-order though.
 
I remember when 1080p HD was the buzzword... and PCs were already gaming above that on CRTs. PC gamers were like... HD? What a joke... then 2K, then 4K... 8K off on the horizon... it seemed like we were finally pushing the envelope, but now we're back to a 1080p target because of RTX? So in the near future, benchmarks for $1,000+ cards will be tested at 640x480... to get the highest RTX fps... lol. We are DE-evolving.
 
Speak for yourself. I ran everything in DX10 that I could get my hands on when I got my pair of 8800GTX.

At the time I had an 8800 GTS 512 that ran DX10 great. I like to have lots of hardware and was seeing what the much cheaper ATI 3870 could do; for the price it ran DX9 great, so it did have some value. Later, the ATI 4XXX cards were so much faster in DX10 that it was a quantum leap.

I was just pointing out that it takes time for companies to implement new features, and it usually comes with a price premium. There may be quite a few people who hold off on version 1.0 of RTRT; as it gets more common, we will see cards in the $600 price range beating the 2080 Ti. Same as it ever was.
 
They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.

And yet some people did pre-order them. And I heard some people have to wait until November for their pre-order to actually arrive. Either Nvidia is severely limiting the number of RTX cards they're selling right now, or they did sell like crazy already.
 
Well, then you were never the target consumer for these cards to begin with. Anyone who thinks this is just a "slight" increase in image quality needs to get their eyes checked. Even with the horrible compression that Youtube does you can still easily see the benefit. A video is never going to accurately portray what it looks like on your own screen.

I saw the video and was completely unimpressed. The game looks good enough without RTX effects, and without them, will run at higher resolutions and higher framerates, which is more significant. What's important? GAMEPLAY? Or pretty pictures?
 
And yet some people did pre-order them. And I heard some people have to wait until November for their pre-order to actually arrive. Either Nvidia is severely limiting the number of RTX cards they're selling right now, or they did sell like crazy already.

Waves of disappointment are inevitable... if these offer the marginal performance boost in most games that the specs seem to indicate. Notice how ALL of their marketing compares the new "Ti" card to the plain 1080? "50% faster than a 1080" = 15% faster than a 1080ti. For $1200. No thanks.
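
The arithmetic behind that, assuming a 1080 Ti is roughly 30% faster than a plain 1080 (my ballpark, not an NVIDIA figure):

```python
# Convert "50% faster than a 1080" into an implied gain over a 1080 Ti.
gain_vs_1080 = 1.50          # NVIDIA's marketing claim for the new Ti card
gap_1080ti_vs_1080 = 1.30    # assumption: typical 1080 Ti advantage over a 1080

implied = gain_vs_1080 / gap_1080ti_vs_1080
print(f"Implied gain over a 1080 Ti: ~{(implied - 1) * 100:.0f}%")
# prints ~15%
```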
 
I saw the video and was completely unimpressed. The game looks good enough without RTX effects, and without them, will run at higher resolutions and higher framerates, which is more significant. What's important? GAMEPLAY? Or pretty pictures?

The cool thing is people can decide for themselves what's more important and turn the options on or off. To me, realtime reflections represent a huge boon for gameplay.
 
I thought the whole point of the Tensor cores was to fill in noise from lower-fidelity ray tracing. I don't understand why the ray tracing elements can't support adaptive fidelity/resolution independent of the rasterization resolution. At 1080p the 4300-some CUDA cores are going to have a 4ms render time and just sit sleeping, waiting for the ray tracing engine to do its thing. Why can't we perform all the raster work at 4K and render the RTX lighting elements at 1080p?

Sub-native shadows and reflections have been around forever and are a great way to get more performance while keeping render resolution high.

This is stupid.
So I've been thinking about this and why adding ray tracing would force a massive cut to raster render resolution when it's supposed to be performed independently by fixed-function hardware.

I think in order to do ray tracing you need to render tons of off-screen elements that would normally be culled in a traditional raster-only render. For example, all that cool shit in the reflections of the BF5 demo would normally not be rendered by the GPU because it's off-screen. With ray-traced reflections, not only does the ray-tracing hardware have to bounce rays off that geometry, but all the other traditional raster work, pixel shading, etc. has to happen to it as well.
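
A toy sketch of that culling difference (purely conceptual; it doesn't reflect the actual DXR/RTX API):

```python
# Illustrative only: why ray-traced reflections inflate the per-frame working set.
# A raster-only renderer can cull anything outside the view frustum, but a
# reflection ray can hit geometry that is behind or beside the camera.

scene = [
    {"name": "player_weapon",           "in_view_frustum": True},
    {"name": "explosion_behind_camera", "in_view_frustum": False},
    {"name": "building_across_street",  "in_view_frustum": False},
]

def working_set(objects, ray_traced_reflections):
    if not ray_traced_reflections:
        # Raster-only path: only on-screen geometry needs shading work.
        return [o["name"] for o in objects if o["in_view_frustum"]]
    # RT path: off-screen geometry can still appear in reflections, so it must
    # stay in the acceleration structure and be shaded when a ray hits it.
    return [o["name"] for o in objects]

print("Raster only:", working_set(scene, False))
print("With RT    :", working_set(scene, True))
```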

Brutal.
 