2500k/980ti 1080p to 1440p actually increased FPS...

Guess my 2500k was a bigger bottleneck than I imagined after all. Running my 2500k @ 4.8GHz with a 980 Ti OC'd to 1450MHz.

Upgraded my monitor from 1080p 144Hz to 1440p 144Hz. To my surprise, my FPS actually went UP in BF1. Haven't played any other games yet, but I was shocked to see this happen. Even if the 2500k was bottlenecking, I was expecting to lose some FPS.
 
You'd be surprised how much games can actually bottleneck the good old 2500K and 2600K. The fact that a CPU can run a game well doesn't mean it isn't bottlenecking, which is what you've found. I found the same thing going from a 4.5GHz 3770K to a 4.5GHz 6700K, and I was even more surprised to see some games gain still more performance at 4.8GHz, but that's mainly because I game at 1080p @ 120Hz/144Hz.
 
Play around with supersampling in BF1. You might still be bottlenecked even then.
 
How big of an increase could I expect from a 6700k? I was probably going to wait for Zen to see if AMD can put out something like the Athlon X2 again.
 
 
i really doubt your framerate went up. it should stay the same. as for the 6700K:

[benchmark chart: 4gj7HKF.png]


the stock 980 Ti they use got 107/130 at 1080p so it would move your bottleneck to the GPU.
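To make that bottleneck logic concrete, here's a toy sketch (mine, not from anyone's chart): the frame rate you actually see is roughly the lower of what the CPU can simulate and what the GPU can render at a given resolution, which is why a CPU-limited game can hold its FPS when you raise the resolution. The numbers in it are illustrative placeholders, not benchmark results.

```python
# Toy bottleneck model: delivered FPS is capped by whichever of the CPU or
# GPU limit is lower. The CPU limit barely moves with resolution; the GPU
# limit drops as resolution rises. All numbers are illustrative placeholders.

def delivered_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """The frame rate you actually see is set by the slower component."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_limit = 90.0  # hypothetical 2500K simulation limit in a busy 64-player scene
gpu_limits = {"1080p": 130.0, "1440p": 95.0}  # hypothetical 980 Ti render limits

for resolution, gpu_fps in gpu_limits.items():
    print(resolution, delivered_fps(cpu_limit, gpu_fps))
# The CPU is the cap at both resolutions, so stepping up to 1440p costs
# (almost) nothing; a faster CPU would move the cap onto the GPU instead.
```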
 
SB to SKL is 30-35% IPC IIRC; AnandTech had the numbers.

30% is a lot for minimum frame rate. Whatever your CPU-bottlenecked minimum is, expect roughly a 30% boost.

Also, take any of these charts people post with a grain of salt. I have called out many members who posted complete bullshit slides.

If it isn't from AnandTech, TechPowerUp, HardForum, or NotebookCheck, I highly question its legitimacy.

Araxie....I am looking at you :p

I find that GameGPU slide questionable....looking at the 2600K vs the 6600K, somehow HT is getting a 30% boost to cover that 30-35% IPC difference.......I doubt that.

So Araxie and Odellus are both posting trash slides....okay :D


If you want a reply, quote me.
 

Can you find and post a review that proves those slides wrong? (the one I posted in the other thread, you know which one, and the one Odellus posted here?) It's not like I normally trust GameGPU (because I don't); the problem is that none of the sites you mention have done a Battlefield 1 or Gears of War 4 (the one I posted) CPU review. Also, I didn't post any pic or numbers here, and in case you don't know, TechSpot and DigitalFoundry (Eurogamer) are among the best game reviewers, with DigitalFoundry being, in my opinion, the best, with deep analysis of FPS and frametimes.

[H]OCP is only good for seeing how the latest hardware performs and nothing else, both for game reviews and other hardware reviews, so on [H] you are never going to find out how well or badly a Sandy Bridge CPU performs against Skylake/KabyFailLake in any modern game or application.
 
Basic logic and common sense call those Gears of War slides total bullshit, and even Kyle Bennett agreed they were total bullshit.

Explain to me how SB with 100MHz more frequency and HT can result in a 3-7.5% higher FPS score given a ~30% IPC difference. Explain the code or science behind that magic. If you can't, then those results are false. Those results don't follow a basic understanding of the two CPUs. If those results are accurate, there needs to be a reason why a 30% IPC difference magically disappeared.
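To spell out the arithmetic behind that objection, here is a minimal sketch; the ~1.30 IPC ratio is the figure quoted earlier in this thread, and the clocks are placeholders of my own, not the clocks GameGPU actually tested at.

```python
# Back-of-envelope model behind the objection above: per-core throughput is
# roughly IPC x clock. The ~1.30 IPC ratio is the figure quoted in this
# thread; the clocks are placeholders (SB given the 100MHz edge the post
# mentions), not the clocks used in the disputed benchmark.

ipc_ratio_skl_vs_sb = 1.30   # claimed SB -> SKL IPC gain
clock_sb_ghz = 3.6           # placeholder Sandy Bridge clock (100MHz higher)
clock_skl_ghz = 3.5          # placeholder Skylake clock

per_core_gap = ipc_ratio_skl_vs_sb * clock_skl_ghz / clock_sb_ghz - 1
print(f"expected per-core advantage for Skylake: ~{per_core_gap:.0%}")
# ~26%, still nowhere near the 3-7.5% gap in the slide, so HT (and how well
# the engine uses the extra threads) would have to make up essentially the
# whole difference, which is exactly the claim being disputed.
```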

Explain to me how HW is equal to SB in that chart as well. Is the game coded in a way that voids all IPC improvements? If so, why? These are basic questions that many people here fail to ask when they look at a slide and judge whether it is reliable. Do you not fact-check anything?

Explain to me how a 6600K smashes a 2500K but does not smash a 2600K..... You are telling me HT is pulling a 30% boost? I highly doubt HT is pulling a 30% boost.

If those slides are accurate, then every IPC improvement Intel has made over the last six years does not exist. Plenty of reliable tests show that is not the case. Either those slides are fake, botched, or built on unreliable testing methodology.

What we know about those three CPUs from other reliable sources tells us that something is wrong with those results.

Maybe HT can net a 30% boost, but I have never seen a reliable source show that happen. I have seen -5% to 15%, but never 30%, and definitely not the 100% boost those Gears of War slides showed. rofl

Sadly, I would trust Linus over GameGPU or that other site you mentioned in the other thread.
 
i've verified their findings multiple times myself when i had a 2500K. it's not bullshit. stop wasting people's time and energy.

it's been known since BF3 that frostbite is extremely CPU heavy and scales to 8 threads. it's no surprise at all that HT improves framerates in games using that engine as much as it does. this misconception that CPUs don't have a huge impact on framerate is long outdated. more and more games are utilizing multi-threaded engines that scale even up to 16 threads now. all of those tests that show little or no difference in framerate are flawed.

i'm so tired of seeing idiots link singleplayer battlefield CPU benchmarks or hilariously GPU bottlenecked benchmarks trying to say that there's no reason to upgrade their CPUs or buy i7s. hyperthreading DOES make that big of a difference as long as the game/engine can make use of it.
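As an aside, if you'd rather verify the CPU-bound claim on your own machine than argue over anyone's slides, logging per-core utilization while you play is a quick check. The sketch below is only an illustration using the third-party psutil package (my assumption, not a tool anyone in this thread used); several logical cores pinned near 100% while the GPU sits below full load points at a CPU limit.

```python
# Rough per-core CPU load logger to run alongside a game session. If several
# cores sit near 100% while GPU usage stays low, the CPU is the likely limiter.
# Requires the third-party 'psutil' package: pip install psutil

import time
import psutil

def log_core_usage(seconds: int = 30, interval: float = 1.0) -> None:
    end = time.time() + seconds
    while time.time() < end:
        # percpu=True returns one utilization figure per logical core (HT threads included)
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        line = " ".join(f"{p:5.1f}" for p in per_core)
        print(f"cores: {line}  max: {max(per_core):.0f}%")

if __name__ == "__main__":
    log_core_usage()
```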
 
I never said CPUs don't have a large impact. My argument and contention is that the claim that HT provides a 30-100% boost is total bullshit. DivX and Maya don't even see that kind of boost, and if anything were going to get a big boost, it would be something that is truly threaded.

Unless you provide a logical reason for how HT in these games provides a 30-100% boost in frame rate, I am calling 100% bullshit.

Plenty of people claim the opposite, BTW: that HT kills frame rate in games.
 
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i5-6600+@+3.30GHz
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-2600K+@+3.40GHz

perfectly multithreaded environment; 6600 is 9% slower than a 2600K. the 2600K is 7% faster than the 6600 in their BF1 bench in averages, 3% in minimums. i have no idea what's going on with their gow4 minimum numbers. i doubt there's a 32% difference in minimums but the average is definitely realistic.

regardless, every time i've checked their average numbers using youtube videos with frame counters visible and my brother's computer with a 3570K, they've been more or less accurate from what i've seen, so i trust their averages at least.
 
http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/9

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/11

7-Zip is the only thing the 6600K lost to the 2600K in, from what they have. Again... I trust AnandTech over PassMark or any of those hack blogs/YouTubers.

passmark.....people use that shit?


EDIT:
http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/17

Here again the 2600K wins in WinRAR and in 3D particles. So does BF1 lean on a part of the CPU where HT shines? In most games I see, HT is a hit or miss of a few percentage points one way or the other, but is BF1 oddly different? Or is that source complete shit?........
 
Your average fps may have gone up because the max fps is hardly being restricted by vsync now.
Minimums won't improve; they will go down.
 

BF multiplayer certainly makes use of HT. Even BF4 did. If you turn HT off on a 64-player server you'll notice a huge difference. The chart Odellus posted doesn't look that odd to me. Even at 1440p I'm incredibly CPU bound in BF1 multiplayer with my 980 Ti & 4790K (on 64-player maps, that is).
 
Whether Skylake will beat Sandy Bridge depends on the workload and so many other factors. Comparing one game or one test bench and drawing conclusions will be rather limited. For example, my FX 9590 beats most of the i5 6600Ks among the >20,000 entries in the Futuremark Time Spy database; in fact I would come in, I think, 13th place, beating virtually all the i5 6600Ks out there except a few. That's i5 6600K vs FX 9590 with SLI 1070s, and I am not pushing the FX 9590 or the 1070s either. So does that mean my FX 9590 is faster than most i5 6600Ks out there? "Maybe in some things" is probably the best answer.

I would not look so much at the numbers as at your own actual gaming experience when deciding whether you need to upgrade a CPU for gaming. There are other reasons to upgrade, for example a newer platform with more modern features, or other types of applications that need more cores, etc.
 
Oh god, it is always the same damn people that can't accept that an i7 can be faster than an i5.

In Mafia 3 I get 25% better averages and 43% better minimums with HT on than with it off. This was tested with FRAPS, just driving a loop with zero action going on so it's repeatable. And even though the fps numbers look OK with HT off, the actual game plays like jittery shit because the CPU is pegged basically the whole time. It is perfectly smooth with HT on, though.



4770K @ 4.3GHz with HT on: Min 70, Max 94, Avg 81.966

4770K @ 4.3GHz with HT off: Min 49, Max 83, Avg 66.119
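For what it's worth, the percentages quoted above check out against those FRAPS numbers; here is the trivial arithmetic as a quick sketch:

```python
# Percentage gains implied by the FRAPS results posted above
# (4770K @ 4.3GHz, Mafia 3, same driving loop).

ht_on  = {"min": 70, "max": 94, "avg": 81.966}
ht_off = {"min": 49, "max": 83, "avg": 66.119}

for key in ("min", "avg"):
    gain = ht_on[key] / ht_off[key] - 1
    print(f"{key}: {gain:.0%} better with HT on")
# min: 43% better, avg: 24% better, roughly matching the ~43% / ~25% figures above.
```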
 
Please forgive me, I'm going to ask stupid questions not related to the CPU.
What monitor did you have, and what monitor did you upgrade to?
Have you changed the way you connect the video card to the monitor - for example, you used to use HDMI and are now using Displayport, or the other way around, or even DVI?
Did you update the Nvidia drivers when you installed the new monitor?
If you did update the Nvidia drivers, is it possible that some settings you had before have changed without you realizing it? For example:
In the Nvidia Control Panel's 3D settings, was the "Texture filtering - Quality" previously set to "High quality" and has now reverted to the default of "Quality"?
In the Nvidia Control Panel's Change Resolution page, is "Output color depth" set to 8bpc and "Output dynamic range" set to Full?
 
So, you didn't change anything, except now you're rendering at a higher resolution, and your frame rate went up? And that somehow translates into even greater gains with Skylake. What?

A lot of replies in this thread make no sense to me.
 
I didn't even see that he upgraded to Skylake either.
 
Went from an Asus VG248QE connected with the supplied DVI-D cable at 144Hz

to an Acer XB271HU connected by DisplayPort @ 144Hz with G-Sync.

Driver settings were at their defaults both times.

I noticed my GPU usage went up at 1440p.
 
You got a higher-res monitor and your FPS went up? Makes no sense; maybe the game tweaked your graphics settings after the upgrade?
 
Everything has always been at Ultra. Come to think of it, I did turn TAA off now that I'm at 1440p; not sure if that could've done it. But like I said, my GPU usage went up quite a bit at 1440p, and CPU usage now fluctuates between 96-100%, whereas before it was constantly pegged at 100% at 1080p.
 