[AMD] AMD shows more gaming benchmarks on RX 6800, 6800 XT and 6900 XT, at 4K and 1440p.

luisxd
AMD just uploaded more gaming benchmarks for their Navi cards, including results at 4K and 1440p.


[Attached: AMD benchmark charts for the RX 6800, 6800 XT, and 6900 XT at 4K and 1440p]


MORE BENCHMARKS AT: https://www.amd.com/en/gaming/graphics-gaming-benchmarks
 
I got the impression that AMD does well in games made for Xbox, since consoles share the same underlying DirectX architecture with PCs.

Is that right?

Someone else said that AMD did well in DirectX 11 games, which seems strange, because Nvidia would have optimized its drivers for DirectX 11 games too.
 
148.4 FPS average on an RTX 3090 at 1440p? What bong is AMD smoking here?

I get much higher FPS than that, and they didn't even say which maps or game modes were played.
At +120 core / +600 memory and a 114% power target, at 1440p and max settings, I got a 182 FPS average in Warzone (across two matches at 1440p, max settings, DXR enabled, 100% render scale).
It's impossible that a small overclock would go from 148.4 to 182 (the FPS range was 150 to 210, except in some obviously slow spots like the FMV scenes before the round starts).

In Ground War, same thing pretty much, but a 170 FPS average.

Hardpoint was the hardest: 90-211 FPS (162 average) in one match and 106-239 FPS (171 average) in the other.

10900K @ 4.7 GHz and 4400 MHz 16-17-17-37 memory (2x16 GB).

My scores pretty much match AMD's most optimistic 6900 XT scores.
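For what it's worth, a quick sanity check on the size of that gap (the FPS numbers come from the post above; the "typical OC gain" figure is an assumption, not a measurement):

```python
# Rough sanity check on the Warzone numbers quoted above.
amd_avg = 148.4   # AMD's published 3090 average at 1440p
oc_avg = 182.0    # the poster's overclocked 3090 average

gap = oc_avg / amd_avg - 1
print(f"Implied gain over AMD's number: {gap:.1%}")   # ~22.6%

# A manual Ampere OC (power limit + core/memory offsets) typically buys
# somewhere around 5-10% over reference -- an assumption, not a measured
# figure. The remainder would have to come from scene/map/mode choice,
# CPU/RAM differences, or test setup.
typical_oc = 0.08
print(f"Unexplained by a typical OC: {gap - typical_oc:.1%}")
```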
 
148.4 FPS average on an RTX 3090 at 1440p? What bong is AMD smoking here? [...] My scores pretty much match AMD's most optimistic 6900 XT scores.
You have a different processor and memory, so of course your scores are different. If they ran the bench on different hardware, it'd probably score differently too?
 
148.4 FPS average on an RTX 3090 at 1440p? What bong is AMD smoking here? [...] My scores pretty much match AMD's most optimistic 6900 XT scores.
Ok, so save your overclocking profile, then reset the card to Nvidia reference clocks and tell us your numbers. That will help us identify what AMD probably did.

AMD's framerates for the 3090 are undoubtedly based on 3090 reference clocks, not a 3090 with an increased power limit and/or additional manual tweaks to the core and memory clocks.

You also mentioned that they don't state which mode or map they used, and your own numbers show that average framerates vary a lot based on those variables.

However, there can also be a big gap between an overclocked 3090 and reference once you start upping the power limit and manually overclocking. Check out the difference in GPU clocks TechPowerUp reported on these two 3090s:


ASUS GeForce RTX 3090 STRIX OC ("stock" performance is heavily overclocked vs. Nvidia reference clocks):

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/30.html

ZOTAC GeForce RTX 3090 Trinity ("stock" performance is more or less Nvidia reference):

https://www.techpowerup.com/review/zotac-geforce-rtx-3090-trinity/30.html

That's a giant difference on that Asus. And if we apply it to a game that is primarily GPU-limited, you can see a huge difference in 1440p performance.

Asus in Devil May Cry 5

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/13.html

Zotac in Devil May Cry 5

https://www.techpowerup.com/review/zotac-geforce-rtx-3090-trinity/13.html

The point being that your 3090 is significantly overclocked (+14% is on the higher end of power-limit adjustment for 3090s, plus you apply an extra manual tweak on top of that), and I bet that is indeed the reason your numbers are a lot higher than AMD's posted numbers, which are likely based on reference clocks.
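If you want to verify what your card is actually running at, a minimal sketch along these lines can log clocks and power draw while the game runs (this assumes the nvidia-ml-py bindings, import name pynvml, are installed; the sampling interval and duration are arbitrary):

```python
# Minimal sketch: poll GPU core clock and power draw during a benchmark run,
# so stock/OC runs can be compared against Nvidia's reference boost
# (~1.7 GHz on the 3090). Assumes the nvidia-ml-py package (pynvml).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(60):  # ~1 minute at 1 sample/sec
        core_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports mW
        limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0
        print(f"core: {core_mhz} MHz  power: {watts:.0f} W / {limit:.0f} W limit")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```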

 
It's marketing. You can't take it at face value from ANY of the vendors. Wait for independent benchmarks using games.
Agreed. In the past, both sides have gimped their competitors' cards to make themselves look “superior.”

Though a 6900 XT would be a day-one purchase to replace my aging 1080 Ti.
 
Looks like big corporations like Intel and Nvidia can't compete with AMD in terms of $/performance; they're just too greedy for that.
 
148.4 FPS average on an RTX 3090 at 1440p? What bong is AMD smoking here? [...] My scores pretty much match AMD's most optimistic 6900 XT scores.

Their benches seem fine to me; they're in line with a reference RTX 3090. Overclocking will help, depending on which part of the game you're in. But let's stop calling their benches bullshit, because they ran the exact same benchmarks on both cards. A 14% power target is a lot on Ampere, and I'm sure the memory OC helps along with the core overclock.
 
Waiting for independent OC-to-OC reviews before I dump my 3090 and grab a 6900 XT. Which will be available for purchase sometime in 2021 :ROFLMAO:

I think these cards are going to overclock damn well. I'm wondering how much memory OC they'll get. Reports were already hinting that AIBs are clocking them to 2.3-2.4 GHz game clocks, so that's pretty decent.
 
148.4 FPS average on an RTX 3090 at 1440p? What bong is AMD smoking here? [...] My scores pretty much match AMD's most optimistic 6900 XT scores.
I tested Borderlands 3 just yesterday. Their 6900 XT got 72 FPS at 4K with max settings IIRC; I did the same and got 74 on my 3090 FTW3 Ultra in the canned benchmark. Granted, it's not an FE, but this is why waiting for reviews is a good thing (and why I was so pissed that the Nvidia review NDA didn't lift until the day of release).
 
I tested Borderlands 3 just yesterday. Their 6900 XT got 72 FPS at 4K with max settings IIRC; I did the same and got 74 on my 3090 FTW3 Ultra in the canned benchmark. [...]

This is what I'm trying to say.
Someone claimed that AMD ran Call of Duty in "benchmark mode". But how the heck do you run it in benchmark mode? There's no option in the Blizzard client to do that :/
I just now ran the game again. Absolutely no option for a "benchmark mode."
 
Their benches seem fine to me; they're in line with a reference RTX 3090. [...]
14% is like an extra 50 watts?
 
14% is like an extra 50 watts?

Well, it depends on what your card is already configured at. I believe the FTW3 Ultra already has a higher power limit in its BIOS, and 14% on top of that can be a lot if the card is already configured for 400W.
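The arithmetic is straightforward (the 350W and 400W baselines are the figures mentioned in this thread, not measured values):

```python
# Power-target arithmetic for the figures discussed above.
for base_watts in (350, 400):      # FE reference vs. an AIB 400W BIOS
    target = base_watts * 1.14     # +14% power slider
    print(f"{base_watts} W base -> {target:.0f} W at 114%")
# 350 W base -> 399 W at 114%  (so yes, roughly an extra 50 W on an FE)
# 400 W base -> 456 W at 114%
```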
 
So this is the most I can give you.
2560x1440, max settings, Ultra, 100% render scale, DirectX raytracing enabled.

At stock settings and the stock power limit on an RTX 3090 FE (I assume 350W), I averaged 171 FPS in Warzone in the first game and 169 FPS in the second (that game was mostly me around the Storage Area).
i9-10900K @ 4.7 GHz and RAM at 4400 MHz, 2x16 GB, 16-17-17-37 primary timings; AIDA read 68k, write 68k, copy 66.5k, 35.9 ns latency (mind you, that AIDA run was at 5.2 GHz core, 4.8 cache, while my Warzone games were at 4.7 GHz core, 4.4 cache).

With a 400W power limit (114%) and +120 core / +600 memory, I scored a 182 FPS average (first part around the airport; after I finished the Gulag, around the southeastern Farmland).
Either way, that's way above AMD's 148 FPS average, so AMD's test is completely useless until we know what exactly AMD tested. Because if there is some sort of "benchmark mode" that someone claimed AMD used, I don't think regular gamers on battle.net have access to it.
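For apples-to-apples comparisons, it helps to compute averages the same way from logged frame times. A minimal sketch, assuming a PresentMon-style CSV with a MsBetweenPresents column ("warzone_run.csv" is a made-up filename):

```python
# Minimal sketch: average FPS and 1% low from per-frame times.
# Assumes a PresentMon-style CSV with a "MsBetweenPresents" column.
import csv

with open("warzone_run.csv", newline="") as f:
    frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)   # frames / total seconds
slowest = sorted(frame_ms, reverse=True)           # biggest frame times first
one_pct = slowest[: max(1, len(slowest) // 100)]   # worst 1% of frames
low_fps = 1000.0 * len(one_pct) / sum(one_pct)

print(f"average: {avg_fps:.1f} FPS, 1% low: {low_fps:.1f} FPS")
```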
 
So this is the most I can give you. [...] Either way, that's way above AMD's 148 FPS average, so AMD's test is completely useless until we know what exactly AMD tested.


I will wait for Gamers Nexus and Digital Foundry to put out their benchmarks. Then we'll know how honest AMD is...
 
So this is the most I can give you. [...] Either way, that's way above AMD's 148 FPS average, so AMD's test is completely useless until we know what exactly AMD tested.
*Scooby Snack*
 
So this is the most I can give you. [...] Either way, that's way above AMD's 148 FPS average, so AMD's test is completely useless until we know what exactly AMD tested.
Again, we have to look at your boost clocks. The FE cards are not reference designs, actually; they generally boost higher than reference. So even with your FE at "stock", it is still likely performing better than a reference-clocked 3090, such as the Zotac 3090 Trinity I linked above. That card stays pretty tight around 1755 MHz, and the reference boost clock for a 3090 is 1.7 GHz, so as I said, that Zotac is more or less at "reference" clocks.

All that said, indeed, we do not know the map/mode AMD tested in. But I am betting that if you tested with your card staying around 1.7 GHz, you would see a further drop from your stock FE numbers.

Additionally, it's possible Zen 3 isn't as good in that game as Intel's 10th-gen parts.
 
So this is the most I can give you. [...] Either way, that's way above AMD's 148 FPS average, so AMD's test is completely useless...
I'm guessing that AMD doesn't benchmark their games with a 10900K @ 4.7 GHz and RAM at 4400 MHz.

That's the entire discrepancy right there. At those FPS levels you're more influenced by the CPU than you probably realize.

Likewise, there's a good chance there's an actual canned benchmark that Activision made for CoD for the development team or internal testing, one that AMD has access to and you don't.
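One way to picture the CPU influence: each frame costs roughly the slower of the CPU and GPU per-frame times, so once the GPU is fast enough, the CPU sets the ceiling. A toy model (all the millisecond figures below are made-up illustrative numbers, not measurements):

```python
# Toy bottleneck model: delivered FPS is capped by the slower of the
# CPU and GPU per-frame costs. All numbers are illustrative.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 5.5  # hypothetical 3090 render cost per frame (~182 FPS ceiling)
for label, cpu_ms in [("tuned 10900K + 4400 MHz RAM", 5.0),
                      ("stock test-bench CPU/RAM", 6.7)]:
    print(f"{label}: {fps(cpu_ms, gpu_ms):.0f} FPS")
# tuned:  ~182 FPS (GPU-limited)
# stock:  ~149 FPS (CPU-limited)
```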
 
I'm guessing that AMD doesn't benchmark their games with a 10900K @ 4.7 GHz and RAM at 4400 MHz. [...]
Yeah, AMD used 3200 MHz memory for their testing in these benchmarks, and that makes a big difference in FPS: an 8-14 FPS gain with faster memory.
 
Definitely building the hype. Definitely want to see independent benches and IQ comparisons. If that lead in Forza is due to crappy IQ, it's not worth much.
 
These benchmarks can safely be forgotten once independent reviewers get the cards and run their own.
 
I'm guessing that AMD doesn't benchmark their games with a 10900K @ 4.7 GHz and RAM at 4400 MHz. [...]

You guess? We already know what they benched with: their own platform. As for the 10900K, if it doesn't run at 4.7 GHz out of the box, then no, they would not run it at 4.7 GHz. Besides, my understanding is that on Intel, faster RAM does not make much of a difference.
 
Faster RAM is faster, but I think that idea mostly dates from when Ryzen first launched; there was a much bigger performance delta from memory speed on the AMD platform.

As the AM4 platform matured, I'm not sure how true that is anymore. But it was true at one point.
 