New AMD ES GPU beats 2080 Ti by 17% on OpenVR GPU Benchmark leaderboard

Indeed, Nvidia's architecture is very efficient, allowing them to hit that power usage on a larger node. But so what? As a consumer, a 5700 XT and a 2070 Super perform about the same and use about the same power.

The 'so what' is that we're talking about future products, where AMD is already reaping the 'gains' of a node shrink which has allowed them to 'catch up' to their competitor's product that is still made on a larger node.

And then Nvidia has a new architecture coming on that same smaller node.

However, AMD does still have architecture refinements coming, and their roadmaps are pretty aggressive right now. I'm betting that Nvidia retains the crown at the high end with Ampere, but I think AMD will remain not too far behind, as they are now, and their R&D will close the gap sometime next year. I feel like they may even squeeze out an Ampere competitor before spring 2021, even if it means another higher-power part.

That's the thing -- the best indicator of AMD changing the course they've been on since they started making GPUs / bought ATI is Navi, and well, it's more of the same.

And it's still missing hardware ray tracing. So they not only have to make a huge leap in raster performance just to approach competitiveness, they also have to add ray tracing hardware into the mix.


The chances of AMD doing both, given their history?

I'd love to be optimistic, but I'm too rational for that.
 
And it's still missing hardware ray tracing. So they not only have to make a huge leap in raster performance just to approach competitiveness, they also have to add ray tracing hardware into the mix.
Consoles having AMD hardware will mean that major titles will focus on AMD's RT approach. AMD has that advantage.
 
Frankly, I just care which card is best for the money when the time comes. At 1440p the 5700 XT pretty much owns that resolution. Once 120 Hz, FreeSync Premium Pro monitors at 4K and above (preferably 5K ultrawide) are available at reasonable prices, I will definitely want a GPU that can push the FPS at very high, if not maxed-out, quality settings. Not the ridiculous G-Sync 27" 120 Hz/144 Hz HDR ones, which are really limited by DisplayPort 1.4. Since the DisplayPort 2.0 standard is out, with hardware expected this year, I am hoping to get that 5K ultrawide at 120 Hz+ sometime this year or next.
 
About the supposed big Navi: https://www.technopat.net/sosyal/konu/asus-zephyrus-g-ga401iv-dxdiag.804989/
and the VR benchmark sample:
[image: AdlnLJp.png]

lulz

Must be the 2080 Ti Big Navi Edition.
 
The 'so what' is that we're talking about future products, where AMD is already reaping the 'gains' of a node shrink which has allowed them to 'catch up' to their competitor's product that is still made on a larger node.

And then Nvidia has a new architecture coming on that same smaller node.



That's the thing -- the best indicator of AMD changing the course they've been on since they started making GPUs / bought ATI is Navi, and well, it's more of the same.

And it's still missing hardware ray tracing. So they not only have to make a huge leap in raster performance just to approach competitiveness, they also have to add ray tracing hardware into the mix.


The chances of AMD doing both, given their history?

I'd love to be optimistic, but I'm too rational for that.
Next-gen consoles are confirmed RDNA2 with ray tracing. Who knows exactly when the PC part will be out, but I can't imagine they would launch the consoles this year and not have ray tracing in a PC part this year as well.

At this point, in my opinion, Nvidia's RT performance is abysmal at any price point. I have a really tough time giving them a lot of credit for having it now as opposed to later. Most of the RT games so far require a 2080 Ti if you want 60 fps at 1080p, which is nearly insulting. RT on my RTX 2060 is basically a joke, usually requiring an internal rendering resolution near 720p, and even some dialing down of other settings, to get a good 60 fps average gameplay experience with RT effects turned on.

Further opinion: games which take less of a hit from RT (Modern Warfare) barely show a visual reason to have it on anyway, and some of the few key differences could look much better than they do with RTX off. It's like the devs purposefully gimped the way shadows work on certain objects. I get that RT simplifies the implementation of such effects, but the performance hit is still pretty large in Modern Warfare. Cleaning up the shadows in the regular lighting model might take some extra effort, but it wouldn't cause a large performance hit like RT does.
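
For a sense of scale on that 720p point, here's a rough bit of arithmetic (my own sketch, assuming per-pixel shading cost dominates, which is only approximately true with RT in the mix):

[CODE]
// Rough arithmetic behind the "internal resolution near 720p" point above.
// Assumes rendering cost scales with pixel count (an approximation).
#include <cstdio>

int main() {
    const double px1080 = 1920.0 * 1080.0; // 2,073,600 pixels
    const double px720  = 1280.0 *  720.0; //   921,600 pixels
    // 720p is ~44% of the pixels of 1080p, i.e. more than a 2x cut in
    // per-pixel work -- roughly what it takes to hold 60 fps with RT on.
    std::printf("720p / 1080p pixel ratio: %.0f%%\n", 100.0 * px720 / px1080);
    return 0;
}
[/CODE]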
 
Update on this:

The benchmark creator has chimed in and confirmed this is legit.
Other users have connected more dots and came up with the info below:
The processor ID mentioned is "AMD Eng Sample 100-000000098-40_39/27_Y". That CPU is an engineering sample of the Ryzen 7 4800HS used in the Asus Zephyrus G14. There was a DxDiag dump on a Turkish website on 21st December that shows the same ES CPU used in a Zephyrus G14. Source. As you can see, the model listed in that DxDiag dump is the Zephyrus G GA401IV. Now, here is Asus's website listing that model under their G14 lineup.

Why would someone use an R7 4800HS to test a discrete GPU that's almost 20% more powerful than a 2080 Ti?

What seems to have happened is that the OpenVR benchmark picked up the integrated Vega of the R7 4800HS and displayed it, because that's the primary video adapter. The test was actually run on the RTX 2060 that the Zephyrus comes with. As for the scores, either it's a fault in the benchmark (I think it's a new benchmark as well; I haven't seen this one before) or they were just fabricated by someone in their free time.
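
For what it's worth, here's a minimal sketch of how that can happen (an assumption about the benchmark's behavior, not its actual code): on Windows, DXGI lists the adapter attached to the primary display first, so anything that naively reports adapter 0 will name the iGPU even when the discrete GPU did the rendering.

[CODE]
// Hypothetical illustration, not OpenVR Benchmark's actual code: naively
// reporting DXGI adapter 0 on a hybrid laptop names the iGPU ("AMD Eng
// Sample ...") even though the RTX 2060 does the actual rendering.
#include <dxgi.h>
#include <cstdio>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // Adapter 0 drives the primary display -- on a hybrid laptop that
        // is usually the integrated GPU, not the GPU that ran the test.
        wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                i, desc.Description, desc.DedicatedVideoMemory >> 20);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
[/CODE]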

So, this info might also show a test of an external GPU on a laptop, with a Zen 2 mobile CPU. Quite interesting.

old.reddit.com/r/Amd/comments/emhb8f/about_the_amd_gpu_beating_2080ti_by_17_leak/
edit: god damn reddit embeds are a black and white wall of text PITA here..
 
Next-gen consoles are confirmed RDNA2 with ray tracing. Who knows exactly when the PC part will be out, but I can't imagine they would launch the consoles this year and not have ray tracing in a PC part this year as well.

I dunno, man. These APUs are a lot more complex than the Xbox One's (add cache, a second Jaguar module, increase the size of the GCN block) and the PS4's (add unified GDDR5 support, a second Jaguar module, increase the size of the GCN block). Both of the graphics cards featured in those systems had been out for six months prior.

For these two new consoles you do have a more complete starting point with Renoir, but you still have to gut the Vega architecture for something not yet released. You also have to add a unified GDDR6 controller.

Given that these are much larger in die size than their predecessors, the new architecture and larger size could be delaying release (just because they have release samples doesn't mean it's a final release candidate). The consoles have to be ready before AMD can make their own discrete die based on the new architecture (and that includes developer support through the inevitable six months of driver/firmware updates that come with every new AMD architecture, even on a console).
 
Well, look at the scores for the 2080 Ti below that. The 8700K is faster than the 9900K.

Could it be that the test just runs way better on newer AMD CPUs and the 2080 Ti is being held way back by the Intel CPUs?

Seems like the most logical explanation to me.
 
Update on this:

The benchmark creator has chimed in and confirmed this is legit.
Other users have connected more dots and came up with the info below:


So, this info might also show a test of an external GPU on a laptop, with a Zen 2 mobile CPU. Quite interesting.

old.reddit.com/r/Amd/comments/emhb8f/about_the_amd_gpu_beating_2080ti_by_17_leak/
edit: god damn reddit embeds are a black and white wall of text PITA here..

It is possible that CPU is being tested outside of a laptop as well.
 
Well, look at the scores for the 2080 Ti below that. The 8700K is faster than the 9900K.

Could it be that the test just runs way better on newer AMD CPUs and the 2080 Ti is being held way back by the Intel CPUs?

Seems like the most logical explanation to me.
It's a GPU-only test, according to the dev; the CPU does practically nothing in that particular one. Also, you have to compare only scores at the same resolution.

It is possible that CPU is being tested outside of a laptop as well.
Very true; it doesn't mean the CPU is specific to the G14, just that it's the same APU.
 
I think AMD would do much better selling this for 600-700 dollars, targeting old 1080 Ti pricing. Nvidia will overshoot this, of course, but we don't know what size their 7nm Ampere dies will be. With the 2000 series, the 2080 Ti is some massive, brute-forced 700+ mm2 die. If Ampere is only around 600 mm2, they may not need to price their highest-end card up to the sky. So AMD would do much better to get 2080 Ti performance to around 600 dollars.

Btw, if my pricing sounds too cheap, that just shows you how out of whack GPU pricing has become.
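
As a rough sanity check on the die-size point (back-of-the-envelope only; the 700/600 mm2 figures are the guesses above, and defect yield is ignored), the classic dies-per-wafer approximation says the smaller die buys roughly 20% more candidates per 300 mm wafer:

[CODE]
// Back-of-the-envelope: die candidates per 300 mm wafer, using the classic
// dies-per-wafer approximation. Ignores defect yield and wafer cost; the
// die areas are guesses from the post above, not confirmed numbers.
#include <cmath>
#include <cstdio>

int diesPerWafer(double waferDiaMm, double dieAreaMm2) {
    const double kPi = 3.14159265358979;
    const double r = waferDiaMm / 2.0;
    // floor(pi*r^2 / A - pi*d / sqrt(2*A)): usable area minus edge loss.
    return static_cast<int>(kPi * r * r / dieAreaMm2
                          - kPi * waferDiaMm / std::sqrt(2.0 * dieAreaMm2));
}

int main() {
    std::printf("~700 mm^2 (2080 Ti-class): %d candidates/wafer\n",
                diesPerWafer(300.0, 700.0)); // ~75
    std::printf("~600 mm^2 (smaller die):   %d candidates/wafer\n",
                diesPerWafer(300.0, 600.0)); // ~90, about 20% more
    return 0;
}
[/CODE]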

I believe this is wishful thinking. If AMD beats 2080 Ti performance by 15%, they are not going to underprice themselves by 50% compared to Nvidia's offering.

AMD does need to make up market share, but they aren't going to take a loss on these high-end 'Big Navi' cards either. I'm expecting a minimum MSRP of $999; if we get lucky, $799.

It's exciting news, but keep in mind that Nvidia has a 2080 Ti Super ready to go as a placeholder counter until they can roll out Ampere-architecture video cards.
 
There is no 2080Ti Super coming no matter what AMD does.

Nvidia already has cards faster than the 2080 Ti; they're just called Quadro, Titan, or Tesla. It would not be any skin off their back to rebrand one of these as GeForce to maintain brand mindshare.
 
Nvidia already has cards faster than the 2080 Ti; they're just called Quadro, Titan, or Tesla. It would not be any skin off their back to rebrand one of these as GeForce to maintain brand mindshare.

Oh yes, the Titan RTX is out there, and there is virtually no difference in performance between it and the Ti. This is why a Super is never coming: the Titan already exists with RTX. The Super rumor for the 2080 Ti only came about because the other cards got Super versions. It would only be Super stupid for Nvidia to release a Super 2080 Ti and destroy their Titan RTX sales.

[image: MEbcAknTAK8B2PzBAne5Ma-650-80.png]
 
I really hope AMD can hit this level of performance in the first half of this year, and I'd love to see them kick Nvidia in the nuts on pricing. My 1080 Ti has been chugging along for two and a half years, and there is still nothing on the market close to a reasonable upgrade for the money. I'm a gamer and enthusiast with a 1000W power supply and a custom loop with two 360mm radiators; I don't give two shits about efficiency. Give me an AMD card that performs well at a good price, with triple 8-pin power connectors and a full-coverage water block, and I'll buy it.
 
Hopefully AMD's first-gen RT cards won't be as bad or as hindering as Nvidia's first-gen RT cards. Really, I would like to know much more about Ampere as well; then again, when the time comes I won't wait either and will just get whatever offers reasonably good performance/$ with the features that are important. If AMD's RT comes with a large performance handicap like current Nvidia RT, it had better be priced very well for me to consider it; if not, I will most likely wait to see what Nvidia has cooking. It will have to be a combination of gaming monitor and GPU.
 
They still need to fix RT in games where it looks like someone just waxed every surface; they need to find a way to subdue the reflections, otherwise it looks more fake than what they do in a game without RT. I am hoping that once the consoles are out, developers will have had more time to figure out how to implement it better than what I have seen so far.
 
Oh yes, the Titan RTX is out there, and there is virtually no difference in performance between it and the Ti. This is why a Super is never coming: the Titan already exists with RTX. The Super rumor for the 2080 Ti only came about because the other cards got Super versions. It would only be Super stupid for Nvidia to release a Super 2080 Ti and destroy their Titan RTX sales.

This is Nvidia; don't say never. After all, they released three different Titan models for the 700 series to make sure they stayed ahead of AMD's 200 series.
 
I don't think we'll see much, if any, RT on consoles. There simply isn't the horsepower.
Probably, but never discount optimisation and big dev bucks making the hardware efficient.
 
Crystal ball:
Big die. Lots of heat. Low clocks. Water cooling. Sub-$800 with 3rd-party coolers or no dice.

Show me FPS in a top Nvidia-backed game and a top AMD-backed game.

AMD's 2020 drivers are still a mess.
 
This is a hoax!
This is some guy testing an Asus laptop with a 4800HS chip, using a hybrid display, with an RTX 2060 GPU. The 2060 is about 40% the speed of a desktop 2080 Ti. But what if there is an eGPU with a 2080 Ti connected to that laptop? Some bench could use both GPUs and show better results than the desktop 2080 Ti; that would explain the 17% better score. Another explanation could be that the bench has bugs, especially when testing on a hybrid display.
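
To put numbers on that eGPU theory (purely hypothetical: the 40% figure is from the post above, and the scaling efficiency is an assumption picked to show the math can land near +17%):

[CODE]
// Toy calculation for the eGPU theory above. The 0.4x figure is from the
// post; the 0.84 multi-GPU scaling efficiency is an assumed value chosen
// to show that an aggregated score could plausibly land near +17%.
#include <cstdio>

int main() {
    const double ti2080     = 1.0;  // desktop 2080 Ti, normalized
    const double rtx2060    = 0.4;  // ~40% of a desktop 2080 Ti
    const double efficiency = 0.84; // assumed multi-GPU scaling loss
    const double combined = (ti2080 + rtx2060) * efficiency; // 1.176
    std::printf("Combined vs desktop 2080 Ti: +%.0f%%\n",
                (combined - 1.0) * 100.0); // ~ +18%
    return 0;
}
[/CODE]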
 
This is Nvidia; don't say never. After all, they released three different Titan models for the 700 series to make sure they stayed ahead of AMD's 200 series.


That was back when they were going with the "single die that does it all" solution. They took the hit on margins to undercut the 290X. This was also a cutting-edge 28nm process, so improved yields over time made it more feasible to unlock more cores.

Today, Nvidia is still on the now-ancient 12nm process, AND they have two dies that do completely different things. The RTX 2080 Ti has gaming-optimized drivers and nearly identical gaming performance to the $3,000 Titan V (so no, a price cut to make a faster gaming card is not an option). There's less than a 10% performance difference between the current Ti and the top-end Titan RTX (all cores enabled on TU102), so there really isn't any performance left to unlock.

We're all going to have to wait for Ampere to get more performance out of a single card.
 
Lots of nice free publicity for a new benchmark that's conveniently charging a fee to see for yourself whether it's real or not.
 
They would do both at the same time: Super Ti and Super Titan, available 2-2-2020 for, you guessed it, 2222 and 222020...
 
Well, if their Titan sales start to tank, wouldn't they just release a Super Titan RTX?

Unless I am wrong, I believe the Titan RTX is fully enabled, so there is nothing left they could do other than increase memory and GPU clock speeds.
 
This is a hoax!
This is some guy testing an Asus laptop with a 4800HS chip, using a hybrid display, with an RTX 2060 GPU. The 2060 is about 40% the speed of a desktop 2080 Ti. But what if there is an eGPU with a 2080 Ti connected to that laptop? Some bench could use both GPUs and show better results than the desktop 2080 Ti; that would explain the 17% better score. Another explanation could be that the bench has bugs, especially when testing on a hybrid display.

All that wall of text still doesn't explain the scores being higher than a 2080 Ti's. Of course it picked up the iGPU as the default; it didn't detect the external GPU.
 