AdoredTV Dissects the Marketing to Make Nvidia Turing RTX Performance Predictions

Nope.

There were at least 6 games benchmarked without HDR, and they achieved the same gains in them. Also, Wolf2 can't use FP16 on Turing, because its FP16 path is part of the AMD shader intrinsics, which are available only on AMD hardware. Only an ignorant guy like Adored could make that claim and get away with it.

And NOTHING is wrong with benching a card at 4K if the card is targeted at 4K. If you want to test at 1440p, that's your choice; just subtract a portion of the uplift percentage and there you have it.

Never said there is anything wrong with doing it. It's just like comparing Infiltrator with DLSS on vs a card that can't use DLSS. It's not a bad thing, but it is something people need to keep in mind. It's why you can't take this stuff at face value and must look at it more critically. Nvidia's marketing team is a marketing team. They're going to do their job, make the cards look the best they can, and use whichever marketing tricks work to do that. And yes, that also means they are specifically picking games (and a tech demo) that can give the most wow factor.

Given the price, the 2080 is effectively competing against the 1080Ti. I doubt that would have been as favorable a comparison as going against the 1080.
 
Here's the big hoopla.

There are claims that the 20 series is 50% faster than the 10 series; this is both true and, at the same time, false.

Is a 2070 50% faster than a 1070? From what we know, this is factually correct. And is the 2070 a cut down die from a 2080? Also true, so it's not like the 70 moniker is unwarranted.

However, it's also true that a 2070 costs as much as a 1080, and therefore it should be compared to a 1080.

And basically this ~50% should hold true as you work up the product stack.


In regards to the 4K Infiltrator demo, Nvidia's CEO stated the 1080Ti does "30 something" fps (supposedly some people got about 40, but for this it doesn't really matter), and the 2080Ti could do 78 fps.

This was right after the introduction of DLSS, so the logical flow is that this is with DLSS.
I've already posted the YouTube video in another thread, so if you want to see whether the trickery is as good as native 4K, you can scrounge that up.

Anyway, back to Infiltrator: any way you slice it, that's a 100 to 160% boost in performance (78 fps vs roughly 30-40 fps).

We know the 2080Ti has almost 30% more raw performance than a 1080Ti straight up, so that leaves roughly 70% still to account for.
There is mounting evidence that the 2080Ti's cores are about 20% faster (which is not nearly as fast as Jensen's claimed 50%).

These two factors account for a 56% performance increase (1.30 × 1.20 ≈ 1.56). Then with the new DLSS charts, we see that DLSS can improve performance 30-40% above native.

And just like that, you've got a ~125% boost over a 1080Ti.

The thing is, the more powerful the CUDA cores are, the less impressive the RT cores are.

This is because if you took the straight transistor count/die size of a 2080Ti and made a 1080Ti that big, with a 20% SM improvement, you'd be sitting at a 100% improvement with zero trickery.
Of course you'd then miss out on all that ray tracing magic.

It's a catch-22.
The more improved the base CUDA cores, the less impressive the tensor cores' DLSS capabilities.
Just for reference, if the 20 series had exactly the same Pascal cores, that would mean DLSS was providing a ~90% boost in fps... sadly, we now know this is definitely not the case.
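
For anyone who wants to sanity-check that chain of numbers, here is a back-of-the-envelope sketch. The 30%, 20%, and 30-45% factors are the estimates from this post, and the 32 fps value is my stand-in for the "30 something" quoted on stage; none of it is measured data.

```python
# Rough compounding of the estimates above (all figures are guesses, not benchmarks).
raw_uplift = 1.30                  # ~30% more raw throughput, 2080Ti vs 1080Ti
per_core_uplift = 1.20             # ~20% faster cores (estimated)
dlss_low, dlss_high = 1.30, 1.45   # DLSS uplift over native, roughly 30-45%

base = raw_uplift * per_core_uplift                                    # ~1.56, the "56%"
print(f"No DLSS:   +{(base - 1) * 100:.0f}%")                          # ~56%
print(f"With DLSS: +{(base * dlss_low - 1) * 100:.0f}% "
      f"to +{(base * dlss_high - 1) * 100:.0f}%")                      # ~103% to ~126%

# Hypothetical check on the ~90% remark: with unchanged Pascal cores, DLSS alone
# would have to cover everything beyond the 30% raw uplift in the Infiltrator demo.
fps_1080ti = 32                    # stand-in for the "30 something" fps quoted on stage
fps_2080ti = 78
implied_dlss = (fps_2080ti / fps_1080ti) / raw_uplift
print(f"Implied DLSS gain with unchanged cores: +{(implied_dlss - 1) * 100:.0f}%")  # ~88%
```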
 
I predicted that pricing would increase and also that they would release the Ti in order to save the market from crashing given the 10-series inventory glut. I was one of the only people on here to call it, and I got it right.
Aug 3rd:

DLSS is possibly some dynamic lighting and shadow thing that injects ray-tracing-like effects into supported engines or games. Whatever it is, it's the next HairWorks of performance tanking. It's the only thing that makes those cards look any better than the usual performance increases, and it's why there is no 2080 vs 1080Ti data.

Ignore DLSS and look at their graph... 25-45%, same old shit. This will be the reality for the majority of titles. And no, I don't usually watch hardware videos, they can go fuck themselves. Kyle is all good though.
 
What is that nonsense you are saying about DLSS? o_O
 
My question is: if the next-gen consoles also stick with the "old" rendering pipeline for the next 8+ years, won't 99% of devs/studios decline to create a new engine just for the PC, and for RTX cards only to boot?
 
What is that nonsense you are saying about DLSS? o_O
Well, let's look at how they are going to get such a jump. It's not going to be a like-for-like comparison if it's using their new cores, so what will it be? Likely some ray-tracing-injection type hack... leaving 99% of existing titles down in the DLSS-off graph, which is in line with typical generational increases.
 
Fine, wait for reviews, great idea, I am doing that also. But please for the love of God don’t listen to AdoredTV’s trash biased “opinion” and pretend that it’s “news”. This video is propaganda at best.

But you haven't watched it. So how would you know?

The gist of what he said is:

4K res is hard on the 1080, as it has far, far less bandwidth than the 2080.
HDR is hard on the 1080, as it has some implementation issues the 2080 does not.
These two things pad the 2080 numbers and put the 1080 in the harshest possible light.

So there's your pertinent points. What do you make of that?
 
The thing is, he said not long ago that he thought Turing was 15-20% faster. Now he is saying something else. So which is it? What are we supposed to believe? And why are we trusting his speculation as opposed to nVidia's benchmarks, which tend to be accurate?

The last time I checked, AdoredTV was an extremely questionable source. I refuse to watch their videos or support them in any way.

Only extremely questionable source?
 
Have you even watched the video? Have you any idea what he's saying in it? Have you any idea what you're even arguing against?
 
I believe his estimate for performance will be pretty close to how it turns out; I tend to think 20% or less based on what I have seen so far. Also, the fact that Nvidia wants to talk about features and ray tracing and not talk about performance is telling to me.

That would be awesome; it means I could cancel my pre-orders and save $2,600.
 
What’s acceptable to you? I mean.. we know it has 21% more CUDA cores, 27% faster RAM, rumored 5-10% more OC...

That puts it maybe around 30%, assuming no real per-CUDA-core perf increase and ignoring DLSS.
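
For what it's worth, here is one hedged way to land in that ballpark from those specs alone; the linear-scaling and simple compute-vs-bandwidth bottleneck assumptions are mine, not Nvidia's.

```python
# Hypothetical 2080-vs-1080 estimate from the quoted specs, assuming performance scales
# linearly with whichever resource (shader throughput or memory bandwidth) runs out first.
core_ratio = 1.21     # ~21% more CUDA cores
clock_ratio = 1.07    # rumored ~5-10% extra clock/OC headroom, taken near the midpoint
mem_ratio = 1.27      # ~27% faster memory

compute_bound = core_ratio * clock_ratio   # ~1.29 if shader-limited
bandwidth_bound = mem_ratio                # ~1.27 if memory-limited

estimate = min(compute_bound, bandwidth_bound)
print(f"~+{(estimate - 1) * 100:.0f}%, ignoring per-core IPC gains and DLSS")   # ~+27%
```

Any genuine per-core improvement would push that figure toward or past the 30% mark.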

Btw I still expect SLI to suck. Might want to definitely wait for reviews on that part lol.
 
I give up with this thread. AdoredTV was and is right, and the nVidia shills in this thread need to stop, as they have totally lost their credibility; reading their infantile argumentation is now akin to watching my dog take a shit, then turn round and sniff it.
 
Can't wait to see your face come review day.

I watched the video and find his arguments valid. But come review day, hopefully a review conducted by someone who is not an extension of nVidia's marketing department will tell us what the numbers really are.

I’m sure it will be the usual thing where one game is 10% faster, and another is 110%
 
I predicted that pricing would increase and also that they would release the Ti in order to save the market from crashing given the 10-series inventory glut. I was one of the only people on here to call it, and I got it right.
Aug 3rd:

DLSS is possibly some dynamic lighting and shadow thing that injects ray-tracing-like effects into supported engines or games. Whatever it is, it's the next HairWorks of performance tanking. It's the only thing that makes those cards look any better than the usual performance increases, and it's why there is no 2080 vs 1080Ti data.

Ignore DLSS and look at their graph... 25-45%, same old shit. This will be the reality for the majority of titles. And no, I don't usually watch hardware videos, they can go fuck themselves. Kyle is all good though.

Umm, that's not remotely what DLSS is. You're confusing DLSS with RTX Ray Tracing. DLSS is a form of supersampling. Theoretically, DLSS will increase image quality over traditional forms of AA with much less of an impact.
 
Whoops, you're right, I got mixed up. It looks very interesting and has me curious as to how much is involved in processing games and how many can be expected to be compatible. E.g. I'd love to see if it could handle No Man's Sky.
But it'll still be subject to limitations like mGPU was: it has to be learned to work properly, and I wonder if weird things like artifacting will bleed through occasionally.
 
Given everything I've seen, I still think that the 2070 is the most interesting of the lot.
It has all the ingredients required to beat a 1080 in almost all circumstances.
The 2080 is likely not substantially faster than a 1080Ti, and well the 2080Ti is basically a "Titan".

Umm, that's not remotely what DLSS is. You're confusing DLSS with RTX Ray Tracing. DLSS is a form of supersampling. Theoretically, DLSS will increase image quality over traditional forms of AA with much less of an impact.

DLSS implementation for games is as much AA as DSR is AA. In fact, what DLSS is doing is basically the reverse of DSR.

Quick refresher for those that don't remember. DSR is rendering at a higher resolution, then down sampling to desired resolution.

DLSS is rendering at a lower resolution then upscaling by filling in the blanks with the non-cuda cores.
Whether or not that's improved IQ is totally subjective.
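
A minimal sketch of the pixel-cost contrast being described; the 2x (DSR) and 0.5x (DLSS) scale factors are illustrative picks, not confirmed values.

```python
# Rough pixel-cost comparison of DSR vs DLSS at a 4K target, per the description above.
# The 2x and 0.5x scale factors are illustrative, not confirmed values.
target_w, target_h = 3840, 2160
native_pixels = target_w * target_h

dsr_pixels = (target_w * 2) * (target_h * 2)      # DSR: render high, then downsample
dlss_pixels = (target_w // 2) * (target_h // 2)   # DLSS: render low, reconstruct the rest

print(f"DSR shades  {dsr_pixels / native_pixels:.2f}x the pixels of native 4K")   # 4.00x
print(f"DLSS shades {dlss_pixels / native_pixels:.2f}x and upscales the result")  # 0.25x
```

That is the mirror image the post describes: DSR spends extra shader work to buy image quality, while DLSS saves shader work and leans on the network to claw quality back.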
 
Someone in another thread mentioned using DSR and DLSS. It’s actually smart if possible lol. Basically DSR with no hit.

I am not sure exactly how DLSS works though. Normally SS renders at a high resolution then brings it down. So would DLSS take your native, create a higher res version, then bring it back down?
 
So let me ask you this, do you believe and agree with the video in the OP? Do you feel that not taking into account the new more powerful CUDA cores in Turing is not only stupid, but dishonest also? The guy is talking out his ***, that much is completely clear. I don’t even need to watch the video. And I’m not going to waste my time. I have better things to do. You guys clicking the link and watching the video are simply feeding the troll.

You said that you haven't watched the video, so how do you know "this guy" is talking out of his ass? You claim he's an AMD shill, but in the video he calls the new chips an architectural masterpiece... Doesn't sound like much of a shill to me. Funny how much you have to say about a video you haven't even watched.
 
So would DLSS take your native, create a higher res version, then bring it back down?

Given how Mr Huang describes it, if you ask for 4K in DLSS, it would render at sub-4K and then upscale to 4K.

But don't take my word for it, watch 2-3 minutes:
 
That’s how I kind of took it but he isn’t always super precise in his presentations.

I still like the idea of DLSS and DSR someone else mentioned.
 
Given everything I've seen, I still think that the 2070 is the most interesting of the lot.
It has all the ingredients required to beat a 1080 in almost all circumstances.
The 2080 is likely not substantially faster than a 1080Ti, and well the 2080Ti is basically a "Titan".

From previous gens, it just seems that for GPUs the crippled versions are always an afterthought. They are not designed as such. The full chip gets all the resources, with the intent of franken-monstering the lower parts as things shape up.

Personally, I've vowed to only ever get the full chip cards that the software/architecture stack was actually designed around.
 
Reply to Sickbeast: "Why? Is the credibility of the source not important to you?"

Where did I say that the credibility is not important to me?

You also did not mention why you regard this source as not credible. Kyle specifically finds him very credible, so maybe you know a bit more than he does.

And yet, you sit and judge a source you clearly know very little about, and you don't even watch the video. Yip, this makes people want to listen to you. The only people that dislike AdoredTV are puppets following after their idol-worshipped companies, like Intel.
 
I don't understand what liking or not liking him has to do with anything. The points he brought up in the video make sense and are perfectly valid, even if the predictions turn out to be false.
 
Here's another perspective. I came in fresh, having not followed the nVidia rollout. The video basically outlined all the things I would look out for in any system (settings, bandwidth, FP16 vs FP32, HDR, memory compression, etc.) and did so in a logical, although sometimes tedious, manner.*

If you argue the points, even if his projections have a range of error, the methodology is sound.

If you come here and shame him for his methods, cough cough SB, you only shame the community. Watch the fucking video or GTFO!!!

*When using a calculator to compare percentages, use decimal-point math and the 1/x key. It is often one step and you never have to shift the decimal point. (This is a joke; AdoredTV would get it.)
 
So let me ask you this, do you believe and agree with the video in the OP? Do you feel that not taking into account the new more powerful CUDA cores in Turing is not only stupid, but dishonest also? The guy is talking out his ***, that much is completely clear. I don’t even need to watch the video. And I’m not going to waste my time. I have better things to do. You guys clicking the link and watching the video are simply feeding the troll.

This is the equivalent of walking into a professor's class after it is over, yelling at the top of your lungs "you don't know what you're talking about, I've seen this guy before, this is what you should study," then stripping naked, jumping on the desk and saying "draw me." Well, not the last part, but CUDA cores.
 
From previous gens, it just seems that for GPUs the crippled versions are always an afterthought. They are not designed as such. The full chip gets all the resources, with the intent of franken-monstering the lower parts as things shape up.

Personally, I've vowed to only ever get the full chip cards that the software/architecture stack was actually designed around.

Certainly a valid strategy, although not the most effective from a value perspective.

I'm basically just saying that the 2070 has all the ingredients required to be a great value buy, if it ever hit MSRP pricing.

Also, regarding the current 20 series, none of the three cards being released are fully enabled dies.

In the upper stack of the 10 series, only the 1080 and Titan XP were fully enabled. I would also note that the value buys for the 10 series are the 1070 and 1080Ti, neither of which are fully enabled.
 
Umm, that's not remotely what DLSS is. You're confusing DLSS with RTX Ray Tracing. DLSS is a form of supersampling. Theoretically, DLSS will increase image quality over traditional forms of AA with much less of an impact.

I was thinking it's a gating procedure where nVidia basically watches what's going on, identifies areas where the sampling is more effective, then adjusts resources accordingly.

Kind of a first-order convergence test determining the need for more iterations.
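
To be clear, that is not how DLSS actually works (it's a trained upscaler, per the posts above), but if it were the adaptive scheme guessed at here, the control loop would look something like this toy sketch: keep refining only the regions whose estimate is still moving between passes.

```python
import random

# Toy illustration of a first-order convergence test driving adaptive sampling.
# This is NOT DLSS; it is only a sketch of the guess described above.
random.seed(0)

def sample_tile(noise):
    """Stand-in for shading a tile once: a noisy estimate of its true value (1.0)."""
    return 1.0 + random.gauss(0.0, noise)

tiles = {"sky": 0.01, "foliage": 0.30, "chrome": 0.50}   # per-tile noise levels

for name, noise in tiles.items():
    estimate, extra_passes = sample_tile(noise), 0
    for _ in range(8):                              # cap the number of refinement passes
        refined = 0.5 * (estimate + sample_tile(noise))
        if abs(refined - estimate) < 0.02:          # first-order convergence test
            break
        estimate, extra_passes = refined, extra_passes + 1
    print(f"{name}: {extra_passes} extra passes")   # noisier tiles get more passes
```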
 
No one thinks about price/performance? I guess that is not sexy enough. Nvidia doesn't seem to think so, to me anyway. Even at a 25% perf increase it's great... if you ignore the high prices across the board. GPU prices are just too freaking high, full stop.
This generation seems to be a mix of new features and brute-force upgrades... nothing wrong with that, I suppose... again, it's just the price.
I guess AMD could push out a brute-force Vega with tons of cores and tons of memory, but honestly I am hoping they speed up Navi, and I hope it is a GPU focused on price/performance as per the initial rumors.
 
Still waiting on the ROP/TMU counts and the benchmark comparisons that matter far more than how well it does in cherry-picked RTX ray-traced examples that give us no practical context.
 
No one thinks about price/performance? I guess that is not sexy enough. Nvidia doesn't seem to think so, to me anyway. Even at a 25% perf increase it's great...
The issue here is that the 1070, which replaced the 980Ti, was $200-250 cheaper for the same performance.

And if you wanted more juice, for $50 more than a 980Ti you could get a 1080.

So people were expecting substantially more than a 25% boost for a given buck.
 
Yep, I understand what you mean. I cobbled two things together kind of shittily in my post. I guess what I mean is that from a technical/technology standpoint alone, the 25% or 50% or whatever are big, nice numbers... but yeah, my complaint is just price in general. Everything is just too effin' high. And yeah, Nvidia just raised the middle price points too... it's just those huge dies, I suppose.
 