AdoredTV Dissects the Marketing to Make Nvidia Turing RTX Performance Predictions

The number one thing I am interested in is NVLink performance. Would using it let us bypass driver-side development restrictions in any way? Will RT performance stack or scale well across cards? Also, since the RT effects are GameWorks, will they use multi-adapter on NVidia cards to help RT games run at higher fps, given the limit of using 3 cards? Will the tensor cores do the PhysX and HairWorks calcs to help make them more viable? ...too many questions from this generation, and everyone is focused on just the % increase compared to what we are getting.
 
Did you watch the Pascal reveal? They said that the GTX 1060 would be as fast as a GTX 980, and that a GTX 1070 would be as fast as a GTX 980 Ti. Both true statements. nVidia does not lie during these reveals. They have no need to.

I agree with this guy. At least on charts like that they have been very truthful.
 
It's not surprising he still gets dismissed as a source of info considering the crap he spouted that led to his higher profile.
 
Until I see an independent BungholioMarks score, I remain skeptical.
 
Call me crazy I guess for believing benchmarks from a company that has never lied about them instead of someone who is flat out pro-AMD biased with a horrible track record. We might as well link to nVidia viral marketing propaganda at [H] if we are going to link to AdoredTV in the "news" section. This is fake news.

Whoa there, fanboy... did you even watch the video? Because it looks like you're talking out of your ass. Before you call him "flat out pro-AMD biased," perhaps you should watch the end of the video at the 25:50 mark, where he says that AMD is in so much trouble, and then calls the architecture a masterpiece.

You do understand that you can criticize a company's marketing and praise a product... which is EXACTLY what he did here.
 
Nope, not watching it, thanks. Not a fanboy. I just know viral BS when I see it. AdoredTV is viral marketing trash. Fake news. Allow them to brainwash you at your peril.
 
Since you both seem to like Nvidia's slides: does it not raise any red flags with you at all that they didn't come out and say anything like that this time around? It was all ray tracing, 6x faster than Pascal this, 2070 as fast as the 1080 Ti that. Then they released performance slides a few days after the event, so no questions could be asked. And the info in that slide is lacking; we have no idea what settings were used.

No, no, you can both keep your belief that Nvidia is totally honest with their charts. Me, I will wait for the reviews. If the slides turn out to be 100% accurate, and not just accurate in certain scenarios that favour the Turing cards, then great. But the next time cards are released, I will still wait for independent reviews, and I will still not believe marketing slides.
 
Fine, wait for reviews, great idea, I am doing that also. But please for the love of God don’t listen to AdoredTV’s trash biased “opinion” and pretend that it’s “news”. This video is propaganda at best.
 
Oh, there is no doubt that the chart is accurate for the games and settings used. Pascal suffers under HDR; this has been shown in several benchmarks. For whatever reason, Pascal does not handle HDR very well, and I wouldn't be surprised if Nvidia corrected that for Turing. Additionally, all of those games were run at 4K against the 1080, and the 1080 is not a great 4K card. The 2080 seemingly is much closer to the 1080 Ti at 4K (if not matching or slightly beating it). It remains to be seen how things go at the 1080's sweet spot, 1440p, or at the more CPU-dependent 1080p.

You haven't even watched the fucking video. Stop acting like you aren't talking out your ass.
 
If you were in North Korea would you read the communist propaganda and then kiss Kim Jong Un’s ass, or would you ignore it? I would ignore it.
 
No company ever intentionally picks things that perform poorly on the last generation to make the new generation look even better :) You pretty much nailed why that chart they put out is meaningless.
 
AdoredTV at least can see the writing on the wall. He sucked in all the AMDrones with his earlier videos and since they aren't delivering any competition he's starting to straddle the other fence. It's as simple as that.
 
Not really. AdoredTV is a questionable source and that much is well known.

Which goes back to my point about not taking information at face value. However, you need to actually WATCH a video before you are capable of forming a valid opinion on it or its contents. For as much as you blather on about him spreading propaganda, I've yet to see you provide a shred of evidence. The only one peddling "fake news" here seems to be you.

Yep. It's good marketing. Nvidia has a pretty good PR team and they know how to hype up new product launches. There is no need for them to lie about performance when they can use scenarios that make the old cards struggle in order to make the new stuff look better.
 
Since you both seem to like Nvidia's slides. Does not raise any red flags with you at all that they didn't come out and say anything like that this time around? It was all Ray Tracing, 6x faster than pascal this, 2070 as fast as the 1080ti that. And then they release performance slides a few days after the event, so no questions could be asked. And, the info in that slide is lacking. We have no idea what settings were used.


No, No, you both can keep your belief that Nvidia is totally honest with their charts. Me, I will wait for the reviews. And if the slides turn out to be 100% accurate, and not just accurate in certain scenarios that favour the Turing cards, then great. But, the next time cards are released. I will still wait for independent reviews, and I will still not believe marketing slides.

They actually had numbers in their keynote presentation. He said a 1080 Ti got 36 fps in a certain demo and the 2080 Ti had 60 locked (vsync).

I think they were just focusing on their exciting new tech.

At the end of the day, they could have used 50% more CUDA cores and had a ~45% boost (2080 Ti vs 1080 Ti).

Everyone is waiting for reviews, but you go on what you know. Hell, I am waiting probably a few weeks after launch. I want a 2080 Ti with a high power limit so I can OC the hell out of it. I think the 2080 Ti will be power limited given it's a massive 754 mm^2, the largest gaming chip ever made.

If it doesn't curb stomp the 1080 Ti, someone is getting fired lol. 50% more die space ffs.

The NDA lifts ~a week before launch. If nVidia were being super shady, they would have lifted it later, not given you a week to read all the reviews.
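For a rough sanity check on the numbers being thrown around, here is a back-of-envelope sketch. The core counts and reference boost clocks are the published specs; the assumption that performance scales with cores x clock x per-core throughput, and the IPC factors, are pure guesses.

Code:
# Back-of-envelope scaling sketch: the IPC factors below are guesses.
cores = {"1080 Ti": 3584, "2080 Ti": 4352}       # published CUDA core counts
boost_mhz = {"1080 Ti": 1582, "2080 Ti": 1545}   # published reference boost clocks

raw = (cores["2080 Ti"] * boost_mhz["2080 Ti"]) / (cores["1080 Ti"] * boost_mhz["1080 Ti"])
print(f"cores x clock ratio: {raw:.2f}")          # ~1.19, i.e. ~19% from brute force alone

# Anything beyond ~19% has to come from per-core gains (caches, concurrent
# INT/FP, etc.), which is exactly what everyone is arguing about:
for ipc in (1.00, 1.10, 1.25):
    print(f"assumed IPC gain x{ipc:.2f} -> ~{(raw * ipc - 1) * 100:.0f}% total uplift")

Depending on how much you credit the per-core improvements, that same arithmetic lands anywhere from ~19% to ~48%, which is why the estimates in this thread are all over the place.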
 
So let me ask you this: do you believe and agree with the video in the OP? Do you feel that not taking into account the new, more powerful CUDA cores in Turing is not only stupid but also dishonest? The guy is talking out his ***, that much is completely clear. I don't even need to watch the video, and I'm not going to waste my time; I have better things to do. You guys clicking the link and watching the video are simply feeding the troll.
 
That was the UE4 Infiltrator demo. As for performance, I think there might have been a little trickery going on. The Nvidia blog seems to be saying that it was running 4K HDR using DLSS.
 
Why wouldn’t they? How is using a feature of the card that takes up nearly 1/3 of the die trickery?

As long as it has the same IQ there’s no trickery.
 
Trickery mostly in how Jensen presented it. He made it sound like they were running apples-to-apples and that made it twice as fast; instead, it was an entirely useless comparison, as the Turing card was using a feature that the 1080 Ti is not capable of.
 
As long as the IQ is the same it’s apples to apples. It’s the same end result.
 
No, it is not. Apples-to-apples means the EXACT same settings. Don't try to change what a term means in order to justify marketing BS.

If DLSS works like Nvidia claims, it could actually mean better IQ, but we'll see. It's awesome tech, but using it means a comparison is not apples-to-apples.
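To put some rough numbers on why it isn't apples-to-apples, here's a quick pixel-count sketch. The DLSS internal render resolution is a guess (Nvidia hasn't said what it is), and real frame times are never purely pixel-bound, so treat it as illustrative only.

Code:
# Pixel-count sketch: the DLSS internal resolution below is an assumption.
native = 3840 * 2160       # 4K output
internal = 2560 * 1440     # assumed internal render resolution before upscaling

print(f"pixels actually shaded: {internal / native:.0%} of native 4K")   # ~44%

# If shading cost scaled purely with pixels (it doesn't, and DLSS itself
# has a cost), the keynote's 36 fps native figure would imply roughly:
print(f"implied fps at the assumed internal res: ~{36 * native / internal:.0f}")  # ~81

Shading well under half the pixels and reconstructing the rest is a very different test from rendering 4K natively, whatever the output looks like.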
 
Let us see how Brent handles it haha.

They were definitely using it. I can see concern since it’s not something you can use in every game.
 
[H] does both "highest playable" and "apples-to-apples". I imagine Brent, or Kyle, will do a big article on DLSS once there are a few games out there that support it.

Yeah. I like how their chart listed normal (likely apples-to-apples) performance and then DLSS (where applicable). It does a good job showing the positive impact of the tech. For as stunning as ray tracing was in those demos, it's DLSS that I am really interested in seeing.
 
Nvidia specifically used settings/features that are the most demanding for the 1080.

If anyone actually believes the 2080 will be 47% faster on average, then they're just the kind of sucker Nvidia is marketing towards.
 
Such a well thought out and useful comment.

About as useful as the info Nvidia has released so far. Heavens to Murgatroyd, we might get some actual numbers instead of 1.5x or 2x the 1080 at 4K 60 Hz HDR. When we did get some RTX numbers, they weren't so great (30-40 fps at 1080p), and the response was all about beta drivers, with various tech press people saying things like "we'll be surprised in a month." Well, yay. Gotta get the month-long hype train going to try to justify the price increases. As per usual, we'll be waiting 6-8 weeks on the 2070 and 2-4 months on the 2060.
 
The guy is a complete fraud. First he says the 2080 Ti is going to be 50% faster than the 1080 Ti; then he saw the reveal, heard the cry of AMD fanboys, and adjusted that to 20%; then, after seeing NVIDIA's slides and the cache and core improvements, he is saying 25-30%. He is clueless, guessing in the dark; that's what he does.

A 50% faster core, and a 50% better and larger cache, with increased core count and bandwidth, will only net you 25%? Yeah right. The only people who would believe a guy like that were delusional from the start.
 
Obviously you didn't bother watching this vid. All in all it was a good analysis, and the only % I heard him guessing at here was that the 2080 would be ~40% better, and not to expect 50% unless you are basically running in the realm where the 1080 struggles. (Note that realm is likely where most of the people paying a grand for a video card play...)
 
I DID watch it. The same crap is said over and over: NVIDIA is evil, their marketing is evil, they are out to get you, they are cherry-picking, and blah blah blah.

If NVIDIA were cherry-picking games, they would have used only GameWorks stuff, or DX11 games, or old games. They didn't. Their selection of games is the best I have seen in a while: they have DX11, DX12, Vulkan, old, new, NVIDIA/AMD-sponsored, and HDR/SDR titles. They covered all the bases, and in all cases there was a considerable increase in performance.

But he failed to see that, as he is just an AMD mouthpiece and will always remain so unless he changes his ways.
 
Apparently you didn't. He didn't imply Nvidia is evil.

Of fucking course they're cherry-picking; it would be stupid if they didn't. Even a quick glance at the limited information they give proves it. Pascal has issues with HDR, and multiple games were run with HDR enabled. Pascal can't do fast FP16, and Wolf 2 uses it. The 1080 is a 1440p card that lacks the bandwidth to be a great 4K GPU, and the games were all run at 4K. They did 2080 vs 1080 instead of 2080 Ti vs 1080 Ti. They intentionally worded the chart in a way that is going to confuse some people. It is a good selection of games (and one tech demo), but they ran them in ways that give the most benefit to the 2080 and the most problems for the 1080.
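As a rough sketch of how much those choices could inflate a headline number, you can back the assumed penalties out of the slide's ~47% average. Every factor below is a placeholder guess, not a measurement.

Code:
# Backing assumed cherry-pick penalties out of the headline uplift.
# Every factor here is a placeholder guess, not a measured number.
headline = 1.47        # the ~47% average uplift read off Nvidia's slide
hdr_penalty = 1.08     # guess: Pascal losing ~8% under HDR where Turing loses nothing
res_penalty = 1.05     # guess: the 1080 running past its bandwidth sweet spot at 4K

adjusted = headline / (hdr_penalty * res_penalty)
print(f"implied same-settings uplift: ~{(adjusted - 1) * 100:.0f}%")   # ~30%

Even modest guessed penalties drag a 47% headline down to ~30%, which is why the settings matter so much more than the average on the slide.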
 
Nope.

There were at least 6 games benched without HDR, and they achieved the same gains in them. Also, Wolf 2 can't use FP16 on Turing, because that FP16 path is part of AMD's shader intrinsics, available only on AMD hardware. Only an ignorant guy like Adored could make that claim and get away with it.

And NOTHING is wrong with benching a card at 4K if the card is targeted at 4K. If you want to use it at 1440p, that's your choice; just subtract a portion of the % of uplift and there you have it.
 
Jim @ AdoredTV's predictions are more in line with what is realistic. A 50% improvement over the previous generation isn't likely, nor, for that matter, even rational on Nvidia's part. They don't have any reason to produce a card with such a performance gain over the last generation. It would make more sense to save a home run for when you have competition, not when you have the performance crown.
 
We will see when the reviews come in... and whoever has to eat crow, well... enjoy it.
 
LOL you think nVidia can "save up" performance for when they need it in a future generation? ROFL
 
I felt that he made some interesting points in the video, and since Nvidia hasn't really put out jack to prove performance, he is about as correct as anyone else currently. We shall see soon enough.

To me, this is pure marketing genius. Look at this thread and all the others on Reddit etc. Shit is blowing up and we have almost a month before release. Tons of publicity, and they just keep trickling out tidbits to fuel the hype train.

We have charts now. Charts that maybe look good, or tell us fuck all, with no indication of settings and so on to go by. Everyone argue about the charts! Hilarious.

All the diehards will throw down and preorder the new hotness anyway, despite the unholy price and lack of real data to boot. All the people that think RTX is too expensive or overhyped will buy 10-series cards, since sadly AMD can't compete. And that's just perfect, because Nvidia happens to be sitting on a shit ton of 10-series inventory.

Genius.
 