NVIDIA’s RTX Speed Claims “Fall Short,” Ray Tracing Merely “Hype”

I think the failure to include HDMI 2.1 comes down to it including variable refresh rate tech that isn't their own. Whether or not anyone else sees it that way, any card at this price had better hit all the buttons and not miss any. I'll wait to see the reviews, but I'm looking at how the cards do on current-gen "as is" tech, and then I'll weigh the price to get there. Likely as not I won't care to spend this kind of money, but reviews are likely to examine a lot of things that may be convincing. I guess the part we are all arguing is that many of us are usually first adopters and this one isn't quite hitting the numbers for us. The cards are in a silly price bracket, one where I believe they either wanted it so high that people drained the remaining 1xxx stock, or because they have this dream that mining-era pricing should be the new "normal". Let's see how they do in a few weeks, but I'm not rushing this one. In a year or two these will be outdated and half price...
 
Generally, I would agree that most new effects are introduced to the marketplace well before they can be used in games. However, in this case, if NVIDIA is doing that, they've greatly miscalculated, IMHO.

I posted in another thread that ray tracing really needs to work at 1080p@40-60fps (with the other settings managed) on any of the RTX GPUs. Unlike those other historical features, NVIDIA really is selling all the RTX cards as capable of useful RT, and they've priced them as such. They've also got a relatively huge list of games that will have it, and it would be awful marketing if most of the RTX cards couldn't handle it.

I do fully expect that the amount of RT will be adjusted to whatever RTXs can do and future games will have better/more RT.

My scaling from that other thread was:

RTX 2070 - 1080p@40-60fps with managed settings... not max.
RTX 2080 - 1080p@60+fps at max settings, or managed 1440p@40-60fps
RTX 2080 Ti - 1440p@60fps at max settings, or close enough.
2x RTX 2080 Ti (assuming NVLink works out) - 4K@60fps high, but not max

If you look at the RT gigarays or CUDA cores scaling across the RTX GPUs, their scaling actually fits that performance model above.
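As a back-of-envelope check on that (purely a sketch, using NVIDIA's quoted peak figures of roughly 6/8/10 gigarays/s for the 2070/2080/2080 Ti, which are marketing numbers and won't reflect real scenes), the rays available per pixel per frame shake out like this:

[CODE]
# Rough sketch: peak rays available per pixel per frame, assuming NVIDIA's
# quoted ~6/8/10 gigarays/s for the RTX 2070/2080/2080 Ti. Real-world
# throughput will be lower and heavily scene-dependent.
RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
GIGARAYS_PER_SEC = {"RTX 2070": 6e9, "RTX 2080": 8e9, "RTX 2080 Ti": 10e9}

def rays_per_pixel_per_frame(rays_per_sec, pixels, fps):
    """Peak ray budget each pixel gets per frame at a target frame rate."""
    return rays_per_sec / (pixels * fps)

for gpu, rps in GIGARAYS_PER_SEC.items():
    for res, px in RESOLUTIONS.items():
        budget = rays_per_pixel_per_frame(rps, px, 60)
        print(f"{gpu:12s} {res:6s} @ 60 fps: ~{budget:4.0f} rays/pixel/frame")
[/CODE]

Even at those peak numbers you're looking at a few dozen rays per pixel at best, which is why the hybrid approach (rasterize most of the frame, trace only reflections/shadows/GI, then denoise) is the only way any of these tiers works at all.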

It is possible that NVIDIA greatly miscalculated the professional market and must hoodwink the gaming market in order to sell all the broken Quadros. I just couldn't see them launching like this otherwise; it would be extremely foolish, IMHO.
 
I think the failure to include HDMI 2.1 comes down to it including variable refresh rate tech that isn't their own. ...
That is the conspiracy theory, but I thought the HDMI 2.1 spec really isn't ready yet? I mean, it exists, but you couldn't submit a product today and get it certified, as the testing program is all but non-existent. At best you could get provisional certs.

Edit: not that provisional or "close enough we didn't bother testing it" ever stopped a manufacturer from claiming they are compliant!
 
Perhaps it is because they are both relying on the tensor cores to function. There are only so many tensor cores, and if they are all being used to perform RTX ray tracing, there will be none left available to perform DLSS AA.
That wasn't made clear in the NVIDIA presentation; in fact, their color coding completely disagreed with that interpretation. However, after looking at die shots and other information, I believe you are correct. There are tensor cores on the die and they can do RT or DLSS, and only in the rare case (if at all) where a game uses so little of both could the two work together.

I've been very skeptical that 10 gigarays on the 2080 Ti is really enough to do full RT across the scene, and I expect the RT workload will be scaled to accommodate that at least until 7nm... probably even for a few generations.
 
Can't agree. Ray tracing is the future, and has been the future for about 40 years now. I for one am excited that the future is almost here. Once cards are fast enough to run it in real time @ 4k 60 fps, it will be worth the wait. But to get to that point, we first need to take steps to get there. This is one of those steps.
Holy shit is this guy drinking the koolaid OR WHAT. WEW LADDY
 
I think the failure to include HDMI 2.1 comes down to it including variable refresh rate tech that isn't their own. ...


The HDMI ports could well be 2.1 hardware-compliant, just waiting for a BIOS/firmware upgrade... I am not in the know, yet it is possible.
 
[Shower Thought]

RTX 20xx series is intentionally gimped/bad so that they can clear the glut of 1080 series stuck in the warehouse?

[/Shower Thought]
 
Odd? Not at all, I'd say it's expected from nVidia. How are they gonna sell their 5k-euro 65" G-Sync TVs if they include HDMI 2.1?
Just like the 780 Ti only had HDMI 1.4 while AMD cards had 2.0.

Absolutely, that's the only reason at all they are making those shit displays. My 2018 Q9F is awesome, but they won't support adaptive sync on it because then they couldn't sell their overpriced shit. Pisses me off. I don't want their shit display; I want a good display and a good card, and I want to be able to pick each. Not my fault their shit isn't being adopted.
 
I'm starting to think more and more that ray tracing could be like Crysis: cutting edge, but the hardware just wasn't ready and wouldn't be for a while.

If you can't use a feature except at 1080p on a $1,200 card, the feature was not ready. The silicon and the cost were not a good idea. Wait until node shrinks or architecture changes allow it at a resolution we haven't been tired of for 20 years. So that's the problem I have with it. If it performs at 4K, great, I'll take two and pay 2k each. Sucks because I know it's going to be a joke.
 
Your line of thinking holds water.
I guessed this would happen when they first tried to pull their 5-year NDA crap.
If Kyle had signed that, or the new one they demanded of him (reported last week), this thread wouldn't be up on [H].

I read reports that the ray tracing is performed at a lower res and upsampled/filtered, so it's not even 1080p.
This explains why the image loses colour detail and peak brightness versus non-ray-traced.

I think you are on point; when you hide things, it's never good news. Look at Vega: they were so fucking cagey with it, they played so many games and tried so many sneaky things, covering up monitors, and it was just so shitty. This is what that feels like.
 
Personally, the ray tracing aspect is cool, but it's really the performance increase outside of ray tracing that will determine whether I buy an RTX or not. There's no way in hell I'm going back to *maybe* 60 FPS at 1080p at this point, and I would think that anyone with a 2K or 4K monitor (or multiple monitors) would be thinking the same thing.

If they released it like that, and used up that much space on the die and added that much cost, then they are out of their minds. They have been pushing higher resolutions for years; 4K cards have been their big deal. Now, because something is "hard" and "amazing," you can do 1080p at 30 or 60 fps. Nothing is a miracle if you get to dial down performance until it "just works"; I can do a ton of things by making you lower your expectations. That's not cutting edge, and it's not a new feature for early adopters; it's not a feature at all, because you cannot use it. Cutting edge, early adopter means you get an awesome feature but it costs you a lot of money.

This sounds like you get nothing and pay a lot of money while they tell you that stick they rammed up your ass is amazing.
 
I think the failure to include HDMI 2.1 comes down to it including variable refresh rate tech that isn't their own. ...

They could support it now, even over DP, and they don't, but yes, that's beside the point anyway. However, supporting adaptive refresh under 2.1 is fucking optional, if you can believe that, and of course they probably lobbied to have it optional. I just want to use the best TV I can buy, but of course, because they can't extract more money from you, that's not going to work. I am starting to really dislike them. Between this and the shit AMD pulled with Vega (the fucking liars), it seems like GPU companies are just cunts. I mean, fuck, here is 10k, at least support my paltry TV. Ngreedia: Fuck you, pay me more.
 
My scaling from that other thread was:

RTX 2070 - 1080p@40-60fps with managed settings... not max.
RTX 2080 - 1080p@60+fps at max settings, or managed 1440p@40-60fps
RTX 2080 Ti - 1440p@60fps at max settings, or close enough.
2x RTX 2080 Ti (assuming NVLink works out) - 4K@60fps high, but not max
...

You're assuming perfect scaling. 4K is 4x the resolution of 1080p, not 2x. So you would need 4x RTX 2080 Ti to reach that kind of resolution, ASSUMING it scaled well, which SLI setups rarely (IF EVER) do.
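For reference, the pixel math behind that (just arithmetic, nothing vendor-specific):

[CODE]
# Raw pixel counts: 4K is exactly 4x 1080p and 2.25x 1440p, so a naive scale-up
# from a 1080p ray-tracing target to a 4K one needs roughly 4x the throughput.
p1080 = 1920 * 1080   # 2,073,600 pixels
p1440 = 2560 * 1440   # 3,686,400 pixels
p2160 = 3840 * 2160   # 8,294,400 pixels
print(p2160 / p1080)  # 4.0
print(p2160 / p1440)  # 2.25
[/CODE]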
 
That is the conspiracy theory, but I thought the HDMI 2.1 spec really isn't ready yet? ...
Last I heard, full certification is just starting now. As far as I know, the Xbox One X is going to be the first HDMI 2.1-certified device when the next console update hits in October.
The HDMI ports could well be 2.1 hardware-compliant, just waiting for a BIOS/firmware upgrade... I am not in the know, yet it is possible.
As far as I know, only VRR, QMS, and eARC can be added through a firmware update. You're not going to get the increased bandwidth without new hardware in both the input and output devices (plus a new Ultra High Speed cable, of course).
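To put a rough number on why the bandwidth side needs new silicon (a simplified estimate that ignores blanking intervals, audio, and compression, so treat it as ballpark only): HDMI 2.0 carries about 18 Gbps on the wire (~14.4 Gbps of usable video data), while HDMI 2.1 moves that to 48 Gbps (~42.6 Gbps usable).

[CODE]
# Ballpark uncompressed video data rate (pixel data only; no blanking or audio).
# HDMI 2.0: 18 Gbps link, ~14.4 Gbps usable. HDMI 2.1: 48 Gbps, ~42.6 Gbps usable.
def video_gbps(width, height, fps, bits_per_channel=10, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(video_gbps(3840, 2160, 60))   # ~14.9 Gbps -> already at HDMI 2.0's limit
print(video_gbps(3840, 2160, 120))  # ~29.9 Gbps -> only possible on 2.1 hardware
[/CODE]

No firmware update changes what the transmitter can physically push, which is why only the low-bandwidth features carry over.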
 
RTX is Nvidia's version of Vega: tons of compute for the high-end market while they try to play it off to consumers as being revolutionary when it's simply mediocre at games.
 
Many are slamming ray tracing for this generation based on its likely subpar performance in FPS-type games, and it will probably be worse than useless there. But what about slower-paced games where high frame rates aren't paramount? I can see a survival horror game such as Silent Hill benefiting immensely from the improved creepy atmosphere ray tracing could provide, without crippling the gameplay.
 
RTX 20xx series is intentionally gimped/bad so that they can clear the glut of 1080 series stuck in the warehouse?

Oh God...

It's not gimped. It's a slightly better GPU for current games. Half of the silicon budget for the RTX series is spent on Tensor and ray-tracing cores. Current games don't use these, and the few implementations will be half-assed and rushed. That is why we have such a divisive launch this time around. AMD did the same thing with Vega, and when the reviews came out it fell flat on its face. Turing was not built for gaming, but rather for enterprise. Think CAD/CAM, special effects, AI-assisted medical visualization and so on. NVIDIA will not engineer and manufacture a separate piece of silicon for gaming, and there weren't enough CUDA cores left in Turing to sell us gamers and enthusiasts a significant performance bump over Pascal, so NVIDIA is trying to push the entire feature set. The only time a company shrouds everything in mystery and creates division and confusion among their customers and their respective target audience and community is when they know that their product will fall short of expectations.

I remember the Pascal launch. While Jensen hyped the card as much as possible, he said that Pascal obsoletes the previous generation. You could order the card right away, and the performance was there. This time around he was careful to point out that Turing beats Pascal in Ray-tracing.

So, in conclusion, with proper developer support this might end up being a great technology a year or two from now, once NVIDIA moves to a smaller process node for more reasonable power consumption and higher clocks. The reasonable thing to do is to wait a few weeks for reviews, especially the [H] review. And don't pre-order; it's stupid.
 
... I can see a survival horror game such as Silent Hill benefiting immensely from the improved creepy atmosphere ray tracing could provide, without crippling the gameplay.

I can see that. But even so, it'll probably be better in the next gen. All that being said, nothing can be known for certain until we see real benchmarks.
 
I don't really understand the volume of posts suggesting early adopters are going to get a bad value. Every other post, article, or video reminds me of a teeter-totter. In one opinion it's gonna be great, with RT as an added benefit that will continue to grow. Then another swings all the way to the other side, where it's going to be the worst thing ever and we should all #occupyNvidia.

I pre-ordered a 2080, but I'm still on the fence about whether to keep it. Here's the thing, though: Newegg doesn't take my money until it ships. So I've got until the 19th to really make up my mind and cancel that ish if I feel like it. But, to go a step further, value is subjective. I'm upgrading from an R9 290. Is $760 a lot of money to me? Yes, of course. But I see the potential for improved performance as worthwhile.

Now if I came from a 1080 or 1080 Ti, I might feel differently, but I still wouldn't be bothered by other people getting a 20xx. The fact that so many folks are quick to make that call on behalf of someone else bothers me more than Nvidia's marketing BS and the lack of benchmarks. I'll worry about the numbers when they get here, but why be concerned about what anyone else is doing with their own money until then, especially if it hasn't even been drawn from the bank yet?

If your aim is just to upgrade your 290, then of course you'll get performance for the money. Then again, if you're hoping for real-time RT... you're going to be sorely disappointed.
 
https://wccftech.com/nvidia-geforce...ghz-and-beats-a-gtx-1080-ti-without-ai-cores/
Wccftech (so take with a grain of salt) saying ~10k on Time Spy with a 2080. Gamers Nexus had the Titan V
Here is another link to the leaked Time Spy benchmark for the RTX 2080:

https://videocardz.com/77763/nvidia-geforce-rtx-2080-3dmark-timespy-result-leaks-out

Disappointing if true. Only 5% faster than the GTX 1080 Ti, and the GTX 1080 Ti is a very good overclocker. We'll see. It's a disappointment if nVidia has really only released a card that is 5% faster at the same price point as a previous-generation card.

Didn't nVidia say that the RTX 2070 would be faster than a Titan XP? That's not looking too promising.

**edit** Those benchmarks might actually be of the RTX 2070. Good news if that is the case. Bad news if it's indeed the RTX 2080.
 
It's what should be expected. The 1080 Ti has 7% more bandwidth and 13% more single-precision TFLOPS.
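Rough math behind those percentages, using the published reference specs (FP32 estimated as 2 x cores x boost clock; actual boost clocks vary card to card, so the exact figures are a bit soft):

[CODE]
# Back-of-envelope spec comparison (reference boost clocks; partner cards differ).
# GTX 1080 Ti: 3584 CUDA cores, ~1582 MHz boost, 484 GB/s (352-bit GDDR5X @ 11 Gbps)
# RTX 2080:    2944 CUDA cores, ~1710 MHz boost, 448 GB/s (256-bit GDDR6  @ 14 Gbps)
def fp32_tflops(cores, boost_ghz):
    return 2 * cores * boost_ghz / 1000  # 2 FLOPs/core/clock (FMA), GFLOPS -> TFLOPS

gtx_1080_ti = fp32_tflops(3584, 1.582)  # ~11.3 TFLOPS
rtx_2080    = fp32_tflops(2944, 1.710)  # ~10.1 TFLOPS
print(f"FP32 advantage:      {gtx_1080_ti / rtx_2080 - 1:.0%}")  # ~13%
print(f"Bandwidth advantage: {484 / 448 - 1:.0%}")               # ~8%
[/CODE]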
 
Didn't nVidia say that the RTX 2070 would be faster than a Titan XP? That's not looking too promising.

Since the whole event was focused on Ray Tracing, I assumed that comparison was purely in relation to Ray Tracing performance.
 
Blah ha ha ha ha ha

God that is so badly concluded it isn't even funny.

I guess you could say the Fury and Fury Nano were underpriced too, because they all sold out.

It's called the early adopter tax. Or, as I like to call it, the more-money-than-brains tax. You just don't know how it will pan out until you get reviews of said units.

I think once miners see you get better value from a 1080 Ti in terms of MH/W, the market at that price will quickly fall. You'll get maybe 30% better performance with a 50% price boost, so it's not worth it. Miners would be paying for circuitry (RTX units) they will never, ever use.
Mining is essentially dead on GPUs for now, with the crypto market shedding 90% of its value, down $600 billion; all the smart money got out at the highs. If and when it ever recovers, and whether there will be another miner/gamer price war, is anyone's guess.

http://www.latimes.com/business/la-fi-tn-nvidia-cryptocurrency-20180817-story.html#
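On the value math in the quoted post, for what it's worth (using the ~30%/50% figures above, not measured hash rates):

[CODE]
# Perf-per-dollar check using the rough figures quoted above (not benchmarks).
perf_gain  = 1.30   # ~30% more performance
price_gain = 1.50   # ~50% higher price
print(perf_gain / price_gain)  # ~0.87 -> about 13% less performance per dollar
[/CODE]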
 
I've only been able to find videos. I asked if they're available for download on Geforce.com.
 
Just to reset the baseline for a lot of arguments: current gaming at 4K is really just 1080p gaming at 4K resolution. For the most part, games are NOT all-out 4K games. They are the bare minimum that can be rendered at 60 fps at a '4K' resolution. We are still designing games (made up of character models, a textured world, the lighting of the scene, and added effects like explosions, smoke, and water ripples) that are well short of a full 4K standard. How often do you see a texture that makes you frown? Even with a 1080 Ti maxed out, we are often stuck with areas of games that just scream that they are horrifically lower quality than what should pass as acceptable.

Once we can all agree that games are really still being developed with a 1080p discipline, it's not that much of a stretch to start reassessing what we think games should be able to run at with ray tracing. There will be a point when true 4K gaming, and its scores of GDDR, can finally be showcased, but until then, don't assume that because a game can run at 4K resolution, it is really a 4K game.

Now I'm not an Nvidia apologist, just browse through any of my recent comments and see for yourself, but I do understand why supposedly next-gen (should we call them half-step next-gen?) games like Metro are going to crush GPUs with or without RTX, and I will always see that as a GOOD thing. We need forward-thinking, forward-looking games designed to move the industry ahead. We've been stagnant, with developers enjoying the long, consistent run of our current console generation. Instead of revamping or completely redesigning games for new hardware, in some instances they have to dumb them down to hit 4K resolution at playable framerates.

Let's just be honest with ourselves here and understand that the games we can play comfortably at 4K resolution are just upscaled 1080p games.
 


Interview with an Nvidia PR guy. They asked him about Ti-to-Ti performance and he said roughly a 40% increase on GPU-limited stuff, generally.

It wasn't completely clear whether using RTX and DLSS at the same time was going to be a big performance hit or not.

He did say it wasn't just turn on one or just turn on the other as some people seem to have decided.

From the piece Digital Foundry put out yesterday or the day before, it sounds like RTX can be tuned to be more or less intensive (it's obviously going to come with a big performance hit, but it can be adjusted). So from what I can piece together so far, DLSS happens after the RTX stuff is finished running, and it's just a matter of how much DLSS adds to the frametime.

He also confirmed you need to train DLSS "profiles"/models on a supercomputer, so it's not necessarily something you'll see used by indie devs.

And NVLink is fast, but it doesn't make two GPUs act as one or have "16/22 GB" of memory like some people were thinking. It's still using AFR.
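One way to picture the RTX-then-DLSS sequencing he's describing is as a simple frame-time budget where the stages add up; every millisecond below is an invented placeholder to show the accounting, not a benchmark:

[CODE]
# Illustrative frame-time accounting for a hybrid raster + RT + DLSS frame.
# All numbers are made-up placeholders, NOT measurements.
stages_ms = {
    "rasterize G-buffer":         6.0,
    "ray trace (tunable amount)": 5.0,  # RT intensity can be dialed up or down
    "denoise RT results":         1.5,
    "DLSS upscale (after RT)":    1.5,  # runs once the RT work has finished
    "post-processing":            1.0,
}
total_ms = sum(stages_ms.values())
print(f"total: {total_ms:.1f} ms -> ~{1000 / total_ms:.0f} fps")  # 15.0 ms -> ~67 fps
[/CODE]

So whether RTX plus DLSS is "a big hit" comes down to how much those middle stages cost relative to the raster baseline, which is exactly what the reviews will have to measure.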
 
Canadian bacon is cured loin; ham is a hind-leg cut. Different. And I hear it's only 20% tastier, while Canadians think it's 40% better, at $1,100 a pound instead of $800 a pound.
 
RTX 20xx series is intentionally gimped/bad so that they can clear the glut of 1080 series stuck in the warehouse?
Lol that'd be the plot twist of the century.

Though really, it would make sense. Keep them gimped... then an amazing driver update when they're ready... 2080s fly off the shelf.
 