Battlefield 1 DX12 Benches: Nvidia still king, AMD gaining

Or it could be getting Kepler optimization status now.

You mean by developers? GCN 1.0 seems to have been dropped too, like Tonga, the ugly child that always got the short end of the stick. And I have a feeling GCN 1.2 gets dropped before 1.1.
 


I was just being sarcastic but you took offense to it. LOL

I couldn't care less what Nvidia and AMD do; I buy whatever. But obviously you really care about Nvidia enough to defend them any chance you get.
 

No, I didn't take offence. I just don't think you're able to separate game optimization from drivers, and your second post confirms it. :)
 
Well now I'm officially perplexed :)

If the AMD gains were due to CPU bottlenecks being lifted, it wouldn't make sense to see a 5% gain on the RX 480 and the same 5% gain on the Fury X at a higher framerate.

If the AMD gains were due to GPU-side improvements, we wouldn't see a regression at 4K.

Well, it makes sense: a memory bandwidth limitation at 4K for the RX 480, and only 4 GB of VRAM on the Fury X. I'm sure that probably has something to do with 4K.
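For what it's worth, the bandwidth arithmetic behind that argument is easy to check (a rough sketch; the clocks and bus widths below are the published reference specs for these cards, and the bottleneck claim itself is still speculation):

```python
# Peak memory bandwidth (GB/s) = effective rate per pin (Gbps) * bus width (bits) / 8
def bandwidth_gbs(rate_gbps, bus_bits):
    return rate_gbps * bus_bits / 8

# RX 480: 8 Gbps GDDR5 on a 256-bit bus (reference spec)
rx480 = bandwidth_gbs(8, 256)    # 256 GB/s
# Fury X: 1 Gbps HBM on a 4096-bit bus, but only 4 GB of capacity
fury_x = bandwidth_gbs(1, 4096)  # 512 GB/s

print(rx480, fury_x)  # 256.0 512.0
```

So on raw bandwidth the Fury X is well ahead of the RX 480; if it regresses at 4K, the 4 GB capacity is the more plausible culprit than bandwidth.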
 
20 GPUs tested, with a video on the page giving more information on the testing.
Battlefield 1 Benchmarks – 20 GPUs tested at 1080p, 1440p, & 4K!



 
I'm going to start ignoring sites that don't list the models for their 300-series samples.
That series doesn't exist in a reference model, which means every benchmark testing those cards is using one of the factory-overclocked models.

I don't trust reviewers to DOWNCLOCK their cards to reference speeds before testing unless they specifically say THEY DID downclock. Compare this to Nvidia, which sends out reference models to reviewers and doesn't rely on partners for it.

This is potentially a FAKE 5-10% performance advantage for all of AMD's non-reference cards, and 10%+ for Nvidia's.

Reviews must be:
1. REF vs REF
2. NON-REF vs NON-REF
3. REF vs NON-REF @ ref spec clock.
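To illustrate the size of the gap being argued about, a quick sketch (the 1266 MHz figure is the RX 480's reference boost clock; the 1342 MHz partner-card clock is a hypothetical example, and linear clock-to-performance scaling is an assumption):

```python
# Upper-bound estimate of what a factory OC can contribute,
# assuming performance scales roughly linearly with core clock.
def oc_uplift_pct(ref_clock_mhz, oc_clock_mhz):
    return (oc_clock_mhz / ref_clock_mhz - 1) * 100

# e.g. RX 480 reference boost (1266 MHz) vs a hypothetical 1342 MHz partner card
print(round(oc_uplift_pct(1266, 1342), 1))  # 6.0
```

A ~6% uplift from clocks alone is enough to reorder cards that benchmark within a few percent of each other, which is why the ref-vs-ref distinction matters.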

I'm still shocked that I'm apparently the only person to have pointed this out over the past year... Yikes.
Hey reviewers, don't treat us like retards. Thanks.

This is why I always go straight to PCGH for benchmarks.
 

Not being contrary, but don't all of the Nvidia cards automatically OC on their own? Thus there is no "reference" speed as it is dynamically changing?
 

AMD reference cards do the same. Every card from the last two generations has some type of boost technology, but all reference cards have defined base clocks and boost ranges, and those stay the same.
 