How important do you think the 2600K's extra 2MB cache will be over the 2500K?

BottomsUp

I'm not interested in HT or 8 threads, but I wonder whether the 8MB cache (2MB per core) will be worth ~$100 more over the 2500K for gaming?

Of course benchmarks will tell when they are released, but absent that what do you guys think?
 
I doubt it will make a significant difference in gaming -- for instance, it may be the difference between an average FPS of 60 and an average FPS of 75.
 
I doubt it will make a significant difference in gaming -- for instance, it may be the difference between an average FPS of 60 and an average FPS of 75.

I'd call that a big difference once you start getting to high settings and resolutions, as well as more demanding games down the line.
 
if i had to venture a *guess* i would agree with jbz7890. it won't make a significant difference as far as gaming goes. as of right now i plan on getting the 2500k, but we'll see when NDA lifts.
 
I would call a 25% improvement significant

It depends on where the difference is. If you are averaging 60 FPS with a min FPS of 50, then a 25% performance increase won't really matter.

In a very general sense, CPUs tend to serve as caps on frame rates, which are more noticeable at low resolutions. Once you get to higher resolutions, your system becomes GPU bound rather than CPU bound.

Since anyone buying one of these processors is going to be playing at no lower a resolution than 1920x1200, I don't think the increased performance from the extra cache will be very important. (Though, I would have said the same thing about dual vs quad cores two years ago! I couldn't have been more wrong.)
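A rough way to picture the CPU-bound vs GPU-bound point is that the frame rate you see is simply whichever of the two limits is lower. Here's a tiny Python sketch with completely made-up numbers (not measurements of any real hardware):

# Illustrative only: the frame-rate limits below are invented.
def frame_rate(cpu_limit_fps, gpu_limit_fps):
    # Whichever side takes longer per frame sets the pace.
    return min(cpu_limit_fps, gpu_limit_fps)

# Low resolution: the GPU has headroom, so the CPU cap shows through.
print(frame_rate(cpu_limit_fps=120, gpu_limit_fps=300))  # 120 (CPU bound)

# 1920x1200 with high settings: the GPU cap drops below the CPU's,
# so even a noticeably faster CPU changes nothing.
print(frame_rate(cpu_limit_fps=120, gpu_limit_fps=70))   # 70 (GPU bound)
print(frame_rate(cpu_limit_fps=150, gpu_limit_fps=70))   # still 70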
 
The cache size will likely make a difference in gaming, as we've seen with the move from Nehalem to Gulftown. A 25% increase in minimums doesn't matter? I don't know about you, but anytime I'm playing Battlefield: Bad Company 2 or really any first-person shooter, when the minimum drops below 70 fps I can feel a difference in how responsively my keyboard and mouse inputs show up on screen. Please don't drag up the tired argument that you can't see more than 60 fps; you can experience the difference between 60 fps and 100 fps.

So is it worth $100 more? Probably not but you're the only one who can decide that. If you can afford it why not get it and then you don't have to wonder.
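To put the 60 fps vs 100 fps point in concrete terms, here's the frame-time arithmetic (Python used purely as a calculator):

# Frame time in milliseconds for a given frame rate: 1000 ms / fps.
for fps in (60, 70, 100):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms, 70 fps -> 14.3 ms, 100 fps -> 10.0 ms,
# so 60 vs 100 fps is a difference of about 6-7 ms in how quickly
# each input can appear on screen.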
 
Assuming you game at high res (2 megapixels or so), it will make no appreciable difference. An i5-750 will already max out any CPU benefit you're likely to see.

Dropping another hundred on your GPU, on the other hand, will make a huge difference. Or use those funds to go SSD; while that won't increase FPS, load times will be much better.
 
Yeah, I'd say probably a 2-3% difference. I'm sure at least one big reviewer will do a 2500K to 2600K comparison, so we should find out in a couple of weeks.
 
It's L3 cache, not L2, L1, or their new L0... I'd say maybe 1-2% clock for clock? As always, I'm sure some apps will make bigger use of the cache, but I doubt it'll be huge.
 
Since anyone buying one of these processors is going to be playing at no lower a resolution than 1920x1200, I don't think the increased performance from the extra cache will be very important. (Though, I would have said the same thing about dual vs quad cores two years ago! I couldn't have been more wrong.)

LOL.

I will be buying one, and I game at 1650x1050. But yeah, I will be the only one probably.
 
They are the same architecture.

It's been a while, but wasn't the move from Q6xxx to Q9xxx the move from Conroe-based Kentsfield to Penryn-based Yorkfield? So a shrink to 45nm, along with architecture improvements (and a faster FSB)? So, yes, not a new architecture, but more than just a die shrink.
 
I'm the OP and yeah I'll be using the same resolution.

Haha, same here. I was thinking of springing for a U23H or ZR24, but decided to just wait it out until we get a new display tech (OLED... probably not) or IPS gets even better. That is, if my willpower can hold off my purchasing urges. I have an IPS TV, and the colors it displays once calibrated are fantastic, even with the matte screen.

My holy grail is a glossy 24-inch screen that is 16:10 with good enough response times, good color reproduction, and blacks that aren't terribad.
 
Just a few % difference at most. Not 25%. That's just silly.

There is a similar situation in mobile processors right now: the i7-840QM (8MB L3) vs. the i7-720QM (6MB L3). The clock speed difference between them is even bigger, both in MHz and as a percentage of processor speed, than the difference between the 2500K and 2600K, yet the difference in benchmarks is virtually what you would expect from the clock speed alone: http://www.notebookcheck.net/Mobile-Processors-Benchmarklist.2436.0.html

6MB L3 vs 8MB L3 isn't that critical to performance. The biggest question in the decision between the 2500K and 2600K is if you're willing to pay $100 for hyperthreading.
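For concreteness, here's that clock-speed comparison worked out, assuming the usual listed base clocks (1.60 GHz for the i7-720QM, 1.86 GHz for the i7-840QM, 3.3 GHz for the i5-2500K, 3.4 GHz for the i7-2600K):

# Clock-speed gap for each pair, in MHz and as a percentage.
pairs = {
    "i7-840QM vs i7-720QM": (1.86, 1.60),
    "i7-2600K vs i5-2500K": (3.40, 3.30),
}
for name, (fast, slow) in pairs.items():
    print(f"{name}: {(fast - slow) * 1000:.0f} MHz, "
          f"{(fast / slow - 1) * 100:.0f}% faster")
# -> roughly 260 MHz (~16%) for the mobile pair versus
#    100 MHz (~3%) for the 2500K/2600K.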
 
I bet it will be about as important for gaming as hyperthreading is on the current gen (not at all), but more than a few people will buy it anyway.
 
well let's not get too hung up on arbitrary numbers. i'd be surprised if reviews show a real-world difference of 25% in gaming.

Well, in their defence, you DID agree with these arbitrary numbers until it was pointed out that they amounted to a 25% improvement.
 
Haha, same here. I was thinking of springing for a U23H or ZR24, but decided to just wait it out until we get a new display tech (OLED... probably not) or IPS gets even better. That is, if my willpower can hold off my purchasing urges. I have an IPS TV, and the colors it displays once calibrated are fantastic, even with the matte screen.

My holy grail is a glossy 24-inch screen that is 16:10 with good enough response times, good color reproduction, and blacks that aren't terribad.

Good luck finding a 16:10 in the future.
 
Well, in their defence, you DID agree with these arbitrary numbers until it was pointed out that they amounted to a 25% improvement.

i was agreeing with his statement that it won't make a significant difference as far as gaming goes. i figured it was obvious that his numbers were arbitrary for the sake of example (albeit a poor choice of figures).

if i had to venture a *guess* i would agree with jbz7890. it won't make a significant difference as far as gaming goes.
 
I'm not interested in HT or 8 threads, but I wonder whether the 8MB cache (2MB per core) will be worth ~$100 more over the 2500K for gaming?

Of course benchmarks will tell when they are released, but absent that what do you guys think?
If you don't care about Hyperthreading I'd say the 2500k is right for you. The cache won't make a difference.
 
It took me a long time to find something relevant, and this is the best I could do. I know we're comparing lopsided stuff here across different generations, but bear with me, it took me half an hour to find it.

Tom's Hardware chart.

Q8300
2.5GHz
2 x 2MB L2
1333MHz FSB.

Q9550
2.83GHz
2 x 6MB L2
1333MHz FSB

The Q9550 has about 13% more clock speed and three times the cache (200% more), but in the benchmarks the Q8300 performs only about 10-20% slower.

I don't think 25% less cache is going to make any noticeable difference to your general experience using the 2500K. I think HT will be more significant as a feature that differentiates the performance between the two. Is it worth $100 for gaming alone? IMHO no, but that's a choice that you must make.
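Taking those chart numbers at face value (and using the midpoint of the 10-20% range), here's the rough arithmetic on how much of that gap the clock alone accounts for; it's a sketch, not a benchmark:

# Back-of-the-envelope split of the Q8300 vs Q9550 gap.
clock_ratio = 2.83 / 2.50        # Q9550's clock advantage, ~1.13x
perf_ratio = 1 / (1 - 0.15)      # Q8300 ~15% slower -> Q9550 ~1.18x faster
cache_and_rest = perf_ratio / clock_ratio
print(f"clock alone: {clock_ratio:.2f}x, overall: {perf_ratio:.2f}x, "
      f"left over for 3x the L2: {cache_and_rest:.2f}x")
# -> about 1.13x from clock and ~1.18x overall, leaving only ~4%
#    attributable to the tripled cache (and anything else that differs).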
 
It took me a long time to find something relevant, and this is the best I could do. I know we're comparing lopsided stuff here across different generations, but bear with me, it took me half an hour to find it.

Tom's Hardware chart.

Q8300
2.5GHz
2 x 2MB L2
1333MHz FSB.

Q9550
2.83GHz
2 x 6MB L2
1333MHz FSB

The Q9550 has about 13% more clock speed and three times the cache (200% more), but in the benchmarks the Q8300 performs only about 10-20% slower.

I don't think 25% less cache is going to make any noticeable difference to your general experience using the 2500K. I think HT will be more significant as a feature that differentiates the performance between the two. Is it worth $100 for gaming alone? IMHO no, but that's a choice that you must make.

Yep, and keep in mind that the Core 2 series did not have the benefit of an on-die memory controller, so the L2 cache played a larger role in performance.
 
It took me a long time to find something relevant, and this is the best I could do. I know we're comparing lopsided stuff here across different generations, but bear with me, it took me half an hour to find it.

Tom's Hardware chart.

Q8300
2.5GHz
2 x 2MB L2
1333MHz FSB.

Q9550
2.83GHz
2 x 6MB L2
1333MHz FSB

The Q9550 has about 13% more clock speed and three times the cache (200% more), but in the benchmarks the Q8300 performs only about 10-20% slower.

I don't think 25% less cache is going to make any noticeable difference to your general experience using the 2500K. I think HT will be more significant as a feature that differentiates the performance between the two. Is it worth $100 for gaming alone? IMHO no, but that's a choice that you must make.
That's L2 cache, though; the cache in question is L3, and it's a much different CPU.
 
While we're pointlessly debating processors that haven't been released, why not ask how much of an improvement $100 could make in other components for gaming? Unless the only new things you're buying are a CPU and motherboard, there are probably better value propositions.
 
Isn't this the processor forum? I don't know; he's asking about L3 cache. I've already answered his question; I just don't think a little less L3 cache is going to hurt gaming. I'm guessing he simply doesn't want to spend $100 on extra L3 cache or anything else if he doesn't have to.
 
It seems the conclusion is that for gaming there won't be much of a difference. Can anyone tell me whether video encoding (basically frapsing gameplay footage @1650x1080 or w/e that resolution is, then encoding it with VirtualDub) would be different between the two processors?
 
It seems the conclusion is that for gaming there won't be much of a difference. Can anyone tell me whether video encoding (basically frapsing gameplay footage @1650x1080 or w/e that resolution is, then encoding it with VirtualDub) would be different between the two processors?

I would imagine HT would have a bigger impact there. Don't know how much though.
 