AdoredTV Dissects the Marketing to Make Nvidia Turing RTX Performance Predictions

Wonder how the Vega 64 and 2080 compare at 1440p and HDR? Maybe not so far apart in that case. Reviews will be very interesting coming up. As for effects, from what I've seen so far of ray tracing, HDR done right would, to me, win out over ray tracing with SDR for IQ. Plus, if ray tracing performance limits it to 1080p, that is a non-starter right off the bat. Ray tracing would only be useful if performance is acceptable at 1440p or above, and with HDR. Looking forward very much to real data, especially looking at the best overall IQ in each game.

LOL, now I know why there are many FreeSync HDR monitors and virtually no G-Sync HDR monitors - because Nvidia performance tanks with HDR!
 
Why do people keep linking to this ignorant shill's bad videos?


You no like, you no watch.

Simple solution.
 
Wonder how the Vega 64 and 2080 compare at 1440p and HDR? Maybe not so far apart in that case. Reviews will be very interesting coming up. As for effects, from what I've seen so far of ray tracing, HDR done right would, to me, win out over ray tracing with SDR for IQ. Plus, if ray tracing performance limits it to 1080p, that is a non-starter right off the bat. Ray tracing would only be useful if performance is acceptable at 1440p or above, and with HDR. Looking forward very much to real data, especially looking at the best overall IQ in each game.

LOL, now I know why there are many FreeSync HDR monitors and virtually no G-Sync HDR monitors - because Nvidia performance tanks with HDR!

I have heard that, but a monitor reviewer on another site says that this isn't true at all and that HDR runs fine on Nvidia cards.
 
It seems like no one is allowed to make a predictive analysis.
Also, the nVidia fanboyism in this thread is sickening:
"How dare someone question their graphs... Heresy, I tell you."

nVidia are the same as any company. They'll cherry-pick results to advertise. No real surprise there.
 
The problem is people spread his FUD, then vanish when he is proven wrong at a later date... just to come back the next time he posts some FUD and spread it again (while forgetting how wrong he was)... a never-ending cycle of shilling and ignorance.

This is anything but that!!! You may call it what you like, but by attacking the messenger and not the message you are in fact the purveyor of FUD, shilling and ignorance.

The AdoredTV dude starts off by admitting his projections were less than stellar, then goes through his process. So you never get anything wrong? You've never worked through a long, drawn-out problem and found the conclusion didn't match your hypothesis? If you haven't, you certainly don't deserve the soapbox you put yourself on. Watch the review or GTFO.

This is straight-up thread crapping, and you have, without looking at it, blindly dismissed the message.
 
It seems like no one is allowed to make a predictive analysis.
Also, the nVidia fanboyism in this thread is sickening:
"How dare someone question their graphs... Heresy, I tell you."

nVidia are the same as any company. They'll cherry-pick results to advertise. No real surprise there.

They give pretty conservative numbers; I fear you have a bad memory:

[NVIDIA launch slides: GeForce GTX 1080 performance and efficiency charts]


But who cares about facts when you've got a shill like Adored to muddy the waters, eh? ;)
 
The thing is, he said not long ago that he thought Turing was 15-20% faster. Now he is saying something else. So which is it? What are we supposed to believe? And why are we trusting his speculation as opposed to nVidia's benchmarks, which tend to be accurate?

The last time I checked, AdoredTV was an extremely questionable source. I refuse to watch their videos or support them in any way.

I'm not making an estimate. AdoredTV is making one, and it's BS. nVidia has released benchmarks. And if AdoredTV isn't taking into consideration the fact that the new CUDA cores in Turing are much more powerful, then they are biased as hell, just as I thought.

Call me crazy, I guess, for believing benchmarks from a company that has never lied about them instead of someone who is flat-out pro-AMD biased with a horrible track record. We might as well link to nVidia viral marketing propaganda at [H] if we are going to link to AdoredTV in the "news" section. This is fake news.

AdoredTV is the Stephen A. Smith of PC hardware, to put it lightly. More entertainment than analysis.

He is downplaying Turing. At this point, it is all AMD has left. Try to rain on nVidia's parade. I'm not buying it.

I can't wait to quote this post once the reviews come out.

Did you watch the Pascal reveal? They said that the GTX 1060 would be as fast as a GTX 980, and that a GTX 1070 would be as fast as a GTX 980 Ti. Both true statements. nVidia does not lie during these reveals. They have no need to.

I agree with this guy. At least on charts like that they have been very truthful.

It's not surprising he still gets dismissed as a source of info considering the crap he spouted that led to his higher profile.

Nope, not watching it, thanks. Not a fanboy. I just know what is viral BS when I see it. AdoredTV is viral marketing trash. Fake news. Allow them to brainwash you at your peril.

Fine, wait for reviews, great idea, I am doing that also. But please for the love of God don’t listen to AdoredTV’s trash biased “opinion” and pretend that it’s “news”. This video is propaganda at best.

If you were in North Korea would you read the communist propaganda and then kiss Kim Jong Un’s ass, or would you ignore it? I would ignore it.

Not really. AdoredTV is a questionable source and that much is well known.

AdoredTV at least can see the writing on the wall. He sucked in all the AMDrones with his earlier videos and since they aren't delivering any competition he's starting to straddle the other fence. It's as simple as that.

So let me ask you this: do you believe and agree with the video in the OP? Do you feel that not taking into account the new, more powerful CUDA cores in Turing is not only stupid but dishonest as well? The guy is talking out his ***, that much is completely clear. I don't even need to watch the video, and I'm not going to waste my time; I have better things to do. You guys clicking the link and watching the video are simply feeding the troll.

The guy is a complete fraud. First he said the 2080 Ti was going to be 50% faster than the 1080 Ti; then he saw the reveal, heard the cries of the AMD fanboys, and adjusted that to 20%; then, after seeing NVIDIA's slides and the cache and core improvements, he said 25-30%. He is clueless, guessing in the dark; that's what he does.

A 50% faster core and a 50% better and larger cache, with increased core count and bandwidth, will only net you 25%? Yeah right; the only people who would believe a guy like that are delusional from the start.
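As an aside on that arithmetic, per-component gains only translate into end-to-end gains for the fraction of frame time they actually touch. Here's a minimal sketch, with entirely hypothetical numbers, of how a 50% faster shader path can still yield a much smaller overall uplift:

# Hypothetical frame-time breakdown (milliseconds) for a single frame.
# The split between shader-bound work and everything else is an assumption for illustration.
shader_ms = 10.0   # portion that scales with the shader/cache improvements
other_ms = 6.0     # geometry, memory stalls, CPU/driver overhead, etc.

component_speedup = 1.5  # a "50% faster" shader core (hypothetical)

old_frame = shader_ms + other_ms
new_frame = shader_ms / component_speedup + other_ms

overall_gain = old_frame / new_frame - 1
print(f"Overall uplift: {overall_gain:.1%}")  # ~26%, well below 50%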

Only extremely questionable source?

I give up with this thread. AdoredTV was and is right, and the nVidia shills in this thread need to stop, as they have totally lost their credibility; reading their infantile argumentation is now akin to watching my dog take a shit, then turn round and sniff it.

Can't wait to see your face comes review day.

Why do people keep linking to this ignorant shill's bad videos?

The problem is people spread his FUD, then vanish when he is proven wrong at a later date... just to come back the next time he posts some FUD and spread it again (while forgetting how wrong he was)... a never-ending cycle of shilling and ignorance.

They give pretty conservative numbers; I fear you have a bad memory:

[NVIDIA launch slides: GeForce GTX 1080 performance and efficiency charts]

But who cares about facts when you've got a shill like Adored to muddy the waters, eh? ;)



Hilarious thread guys, cheers.
 
AdoredTV, the Turing cards are a solid 35% faster, not 15-20% like you said. I stand by my comments. I'm not sure why you are quoting me and gloating about it when your assessment was clearly wrong. Please link us to benchmarks showing a 15-20% improvement. From what I have seen, the RTX 2080 is as fast as a GTX 1080 Ti. That is a 35% improvement over the GTX 1080.

I will say that your more recent video looks to be more accurate. That's the problem with journalism, though: once you sell out and lose your credibility, it's basically gone. It's hard to get it back and recover. What you have done in the past has tainted my own view of your YouTube channel, personally. It will always be in the back of my mind. If AdoredTV hyped AMD and bashed nVidia in the past, there is always the chance that they will do it again.
 


Hilarious thread guys, cheers.


I checked three games and they pretty much matched up with nVidia's chart: Wolfenstein II, Shadow of War and Tomb Raider. SoW was 10% lower, but Wolfenstein II was 10% higher, and Tomb Raider was spot on with the chart.

I actually like your videos btw.
 
AdoredTV, the Turing cards are a solid 35% faster, not 15-20% like you said. I stand by my comments. I'm not sure why you are quoting me and gloating about it when your assessment was clearly wrong. Please link us to benchmarks showing a 15-20% improvement. From what I have seen, the RTX 2080 is as fast as a GTX 1080 Ti. That is a 35% improvement over the GTX 1080.

I will say that your more recent video looks to be more accurate. That's the problem with journalism, though: once you sell out and lose your credibility, it's basically gone. It's hard to get it back and recover. What you have done in the past has tainted my own view of your YouTube channel, personally. It will always be in the back of my mind. If AdoredTV hyped AMD and bashed nVidia in the past, there is always the chance that they will do it again.

As others said in the thread, updating your analysis based on more information is actually a desirable trait. I said that I had overrun my initial numbers of 20-25% (this was before we'd even seen any Nvidia slides) and then decided that 30% was more accurate.

Here -

32.4% at 1440p over maybe 10 different sources.

[Chart: aggregated RTX 2080 vs GTX 1080 results, ~32.4% at 1440p]


I checked three games and they pretty much matched up with nVidia's chart: Wolfenstein II, Shadow of War and Tomb Raider. SoW was 10% lower, but Wolfenstein II was 10% higher, and Tomb Raider was spot on with the chart.

I actually like your videos btw.

Yep, and I said that they wouldn't just outright lie in the benchmarks; they would, however, make the best case they could (and should). But it should also have been obvious that they'd done so, and that the average results would be quite a bit lower.
 
As others said in the thread, updating your analysis based on more information is actually a desirable trait. I said that I had overrun my initial numbers of 20-25% (this was before we'd even seen any Nvidia slides) and then decided that 30% was more accurate.

Here -

32.4% at 1440p over maybe 10 different sources.

[Chart: aggregated RTX 2080 vs GTX 1080 results, ~32.4% at 1440p]



Yep, and I said that they wouldn't just outright lie in the benchmarks; they would, however, make the best case they could (and should). But it should also have been obvious that they'd done so, and that the average results would be quite a bit lower.

Well, technically it's a bit less if you're comparing them equally at the same clock speed, since the 2080 Founders Edition has a slight overclock to it.
 
Well, technically it's a bit less if you're comparing them equally at the same clock speed, since the 2080 Founders Edition has a slight overclock to it.

Turing has a higher max overclock, so you shouldn't do equal clock speeds; max OC vs. max OC works. I think they did that in the GamersNexus review. TechPowerUp also noted that every card they tested was power limited... so I am curious whether the FTW version helps with that.

I get your point though: test apples to apples.
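For anyone who wants to run the apples-to-apples numbers being discussed, here's a minimal sketch of a clock-normalized comparison versus a raw one. It assumes FPS scales roughly linearly with sustained core clock, which is only an approximation (memory bandwidth and power limits get in the way), and every number below is hypothetical:

# Rough clock-normalized comparison; all clocks and FPS figures are hypothetical.
def per_mhz(fps: float, avg_clock_mhz: float) -> float:
    """Performance per MHz of sustained core clock."""
    return fps / avg_clock_mhz

gtx_1080_fps, gtx_1080_clock = 100.0, 1850.0   # hypothetical baseline
rtx_2080_fps, rtx_2080_clock = 135.0, 1900.0   # hypothetical (FE factory OC)

raw_gain = rtx_2080_fps / gtx_1080_fps - 1
iso_clock_gain = per_mhz(rtx_2080_fps, rtx_2080_clock) / per_mhz(gtx_1080_fps, gtx_1080_clock) - 1

print(f"Raw uplift: {raw_gain:.1%}")                      # 35.0%
print(f"Clock-normalized uplift: {iso_clock_gain:.1%}")   # ~31.4%, a bit less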
 
As others said in the thread, updating your analysis based on more information is actually a desirable trait. I said that I overran the initial numbers of 20-25% (this was before we'd even seen any Nvidia slides) and I then decided on 30% being more accurate.
No, you said 25-30%, mostly on the low side of that. It turns out it's 35%, and much higher with DLSS.
You had it right the first time: the 2080 Ti is faster than the Titan V. You should've stuck with that.

Turing has a higher max overclock, so you shouldn't do equal clock speeds; max OC vs. max OC works. I think they did that in the GamersNexus review. TechPowerUp also noted that every card they tested was power limited... so I am curious whether the FTW version helps with that.

I get your point though: test apples to apples.
AnandTech also tested at reference clocks, and the 2080 Ti is still 32% faster.
 
I have heard that, but a monitor reviewer on another site says that this isn't true at all and that HDR runs fine on Nvidia cards.

I have noticed a lot weaker performance with it on using my 1080 Ti, for sure, but I haven't narrowed it down yet. It looks like HDR is quite a bit slower, though. It could be me, so I need to test more.
 
I have noticed a lot weaker performance with it on using my 1080 Ti, for sure, but I haven't narrowed it down yet. It looks like HDR is quite a bit slower, though. It could be me, so I need to test more.

Just listened to the Full Nerd podcast on the way to work. Pascal loses 10.5% to HDR and Turing loses 5%. So not a huge difference, but Turing does handle it better. The outrage over HDR in the benches is a bit exaggerated, IMO.
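To put those podcast figures in concrete terms, here's a quick sketch using the quoted penalties; the 100 fps SDR baseline is purely illustrative:

# HDR performance penalties as quoted from the podcast above; the SDR baseline is made up.
sdr_fps = 100.0
pascal_hdr_loss = 0.105   # Pascal reportedly loses ~10.5% with HDR on
turing_hdr_loss = 0.05    # Turing reportedly loses ~5%

pascal_hdr_fps = sdr_fps * (1 - pascal_hdr_loss)   # 89.5 fps
turing_hdr_fps = sdr_fps * (1 - turing_hdr_loss)   # 95.0 fps

# The extra gap HDR opens up between the two architectures at the same SDR baseline:
print(f"Turing's HDR advantage: {turing_hdr_fps / pascal_hdr_fps - 1:.1%}")  # ~6.1%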
 
The 2080 Ti is ~23% faster on average than the 1080 Ti according to Hardware Unboxed, FWIW (22:50 mark).
The 2080 is about 30% faster than the 1080.
And the 2080 is about equal to the 1080 Ti.
 
The 2080 Ti is ~23% faster on average than the 1080 Ti according to Hardware Unboxed, FWIW (22:50 mark).
The 2080 is about 30% faster than the 1080.
And the 2080 is about equal to the 1080 Ti.


I think TechReport had a 30+ game review: 23% at 1440p and 30% at 4K, 2080 Ti vs. 1080 Ti. But some games were +44% (BF1) and some at 0%... probably 60 Hz locked. But they counted it, lol.

Some of these review sites are weird: OCing without increasing the power limit, using games that are CPU limited or locked to 60 Hz, etc.
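On that methodology point, here's a small sketch of how one might flag results that are probably frame-capped before averaging. The per-game numbers and the 60 fps cap threshold are made up for illustration:

# Hypothetical per-game FPS for two cards.
results = {
    "Battlefield 1": {"1080ti": 90.0,  "2080ti": 130.0},
    "Capped Game X": {"1080ti": 60.0,  "2080ti": 60.0},   # suspiciously pinned at the cap
    "Game Y":        {"1080ti": 100.0, "2080ti": 125.0},
}

FRAME_CAP = 60.0  # assumed v-sync/engine cap

def usable(entry):
    # Treat results where both cards sit exactly at the cap as useless for measuring scaling.
    return not (entry["1080ti"] == FRAME_CAP and entry["2080ti"] == FRAME_CAP)

gains = [e["2080ti"] / e["1080ti"] - 1 for e in results.values()]
filtered = [e["2080ti"] / e["1080ti"] - 1 for e in results.values() if usable(e)]

print(f"Naive average gain: {sum(gains) / len(gains):.1%}")          # dragged down by the capped game
print(f"Filtered average gain: {sum(filtered) / len(filtered):.1%}")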
 
No, you said 25-30%, mostly on the low side of that. It turns out it's 35%, and much higher with DLSS.
You had it right the first time: the 2080 Ti is faster than the Titan V. You should've stuck with that.

I literally linked my video at the exact point where I said "30% at 1440p", and not a bunch of HDR benchmarks, which, FYI, Tech Report and ComputerBase used, hence their much higher ~40%-ish scores; without those, the results would have been nearly exactly 30% (lower had they used a reference 2080).

AnandTech also tested at reference clocks, and the 2080 Ti is still 32% faster.

I never at any point mentioned the 2080 Ti's performance; it was always 2080 vs. 1080. Note, though, that they benched Wolfenstein II again, which was a massive 72% faster even at reference clocks.

It's fine to bench Wolfenstein II, of course; just be aware that this single game can contribute 2-3% more to the performance totals simply by being such a huge outlier. In much the same way that we wouldn't judge Vega's performance on this game, we shouldn't with Turing either.

If you use a geometric mean to help alleviate outliers like this, the 2080 (reference) at AnandTech was 32.4% faster than the 1080; otherwise, just taking the plain average of the results, it's 34%.
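For anyone curious about the difference being described, here's a minimal sketch of averaging per-game speedups both ways; a single outlier like the Wolfenstein II result pulls the arithmetic mean up more than the geometric mean. The speedup values are hypothetical:

import math

# Hypothetical per-game speedups of a 2080 over a 1080 (1.30 = +30%).
speedups = [1.28, 1.30, 1.32, 1.25, 1.31, 1.72]  # 1.72 stands in for a Wolfenstein II-style outlier

arithmetic = sum(speedups) / len(speedups)
geometric = math.prod(speedups) ** (1 / len(speedups))

print(f"Arithmetic mean: +{arithmetic - 1:.1%}")  # pulled up more by the outlier
print(f"Geometric mean:  +{geometric - 1:.1%}")   # less sensitive to the single outlier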
 
Is the 2080 not 35% faster than the 1080? And if so, then why are you quoting me?

Your entire tirade in this thread was about me supposedly being the unreliable source and Nvidia never lying.

Can you show me where Nvidia's marketing showed the 2080 as 35% faster than the 1080? Or did they actually show a bunch of slides with stuff like shader performance ~60% higher, plus HDR benchmarks that appeared to put the 2080 a *much* larger ~50% ahead of the 1080?
 
Is the 2080 not 35% faster than the 1080? And if so, then why are you quoting me?

As for DLSS, it looks like it could be a game changer. We just don't know yet.

Well, I said it would be at 1080 Ti performance, and it's basically a tie. That was so ridiculous to you that you wanted to quote me on release, because I assumed you anticipated the 2x performance slide.
Anyhow, it's about 32% at 4K and under 25% at 1080p; if you want RTX games, that would be minus 100% at 1080p.
This must be the first new-gen release where the same performance gets a price increase, not a decrease.
 
Is the 2080 not 35% faster than the 1080? And if so, then why are you quoting me?

As for DLSS, it looks like it could be a game changer. We just don't know yet.
It's about price points, though... Intel could call their newest 28-core 5 GHz chip a Celeron, but if they are selling it for $6,000, no one would compare it with a $100 chip.
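A tiny sketch of that price-point argument, comparing performance per dollar instead of raw performance; the prices and FPS below are hypothetical placeholders:

# Hypothetical launch prices and average FPS, purely to illustrate the point.
cards = {
    "GTX 1080 Ti": {"price_usd": 700, "avg_fps": 100.0},
    "RTX 2080":    {"price_usd": 800, "avg_fps": 102.0},
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.3f} fps per dollar")
# Roughly the same performance at a higher price means worse fps/$ for the newer card.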
 