Effects of CPU cache size on total system performance?

Sentoschool was trolling, behaving like a 10-year-old asking mom for $5 so he can buy candy. He had nothing to contribute except trying to get on people's nerves by asking the same thing over and over again. I am civil; just because I use words like "stupid" or "moron" doesn't mean I'm not. Sometimes those words fit the bill. :rolleyes:

Volt mod my pupils? I guess so. If you aren't stupid, I don't know how you're going to volt mod your pupils. Maybe you want to stick 10,000 volts into your pupils. I've seen stupider things happen.

Read the rules: flaming is prohibited. That includes calling people "stupid" or "moron".
 
I'm not going to debate with you over what is or isn't civil behavior, or what the difference is between sarcasm and stupidity. I see the purpose of this thread as having been satisfied, and it has devolved into pettiness. Because of this, I'm getting a mod to close this thread.

I hope this thread will be at least mildly informative to those who come across it in the future.

You can close it yourself.
 
That's right, I took the average of the 4 game benchmarks to condense the data... but if you would prefer, I can break it out individually:

PREY Benchmark:
E6600 @ 1600x1200: 160 fps
E4300 @ 1600x1200: 155 fps

Nobody on God's green earth can tell the difference between 160 and 155 fps. Plus, nobody in their right mind would run Prey like this, unless it's to brag about how their CPU is 3.2% faster than a CPU with 2MB less L2 cache...

FarCry Benchmark:
E6600 @ 1600x1200: 135 fps
E4300 @ 1600x1200: 117 fps

It's completely impossible to tell the difference between 135 fps and 117 fps, and it's an unrealistic scenario to begin with, which is why [H]ard|OCP stopped doing this kind of worthless benchmarking years ago...

Company of Heroes Benchmark:
E6600 @ 1600x1200: 100 fps
E4300 @ 1600x1200: 96 fps

Hey wait, if I look closely while I'm playing I think I can actually see some difference in these framerates... wait, no I can't. Crap.

X3 Reunion Benchmark:
E6600 @ 1600x1200: 86 fps
E4300 @ 1600x1200: 75 fps

Wow, that's an astonishing 14% difference... now if I just overclock my retinas and volt-mod my pupils, I'll be able to see it... aww, fuck it, you get the point already. But yeah, you've totally proved your point, too: in completely and utterly unrealistic cases, you can force a 20% difference in framerates based solely on the size of a CPU's L2 cache. Rock on.
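
For anyone who wants to check the percentages being thrown around, here's a quick sketch (Python, with the fps numbers copied from the posts above) that computes each benchmark's gap relative to the smaller-cache E4300:

```python
# Average fps from the benchmarks quoted above: E6600 (4MB L2) vs E4300 (2MB L2),
# both at 1600x1200. Percentages are relative to the slower, smaller-cache CPU.
benchmarks = {
    "Prey":              (160, 155),
    "FarCry":            (135, 117),
    "Company of Heroes": (100, 96),
    "X3 Reunion":        (86, 75),
}

for name, (e6600, e4300) in benchmarks.items():
    diff = (e6600 - e4300) / e4300 * 100
    print(f"{name}: {e6600} vs {e4300} fps -> {diff:.1f}% faster")
```

Run as-is, that prints gaps of roughly 3.2%, 15.4%, 4.2%, and 14.7%.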

You do realize those are averages, right? Take a look at any video card review here on [H] and you'll find mins and maxes as well... which may be what you meant (IDK)?

As a side note to anyone else reading: there can be a big playable difference between something like 100 and 150 fps, because the fps floor gets raised along with the average, even though you can't really distinguish 100 fps from 150 fps all the time. It's just like horsepower ratings in cars.
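
A minimal sketch of that point, using made-up fps samples (hypothetical numbers, not from the actual benchmarks above): two runs can average out fairly close together while having very different floors, and the floor is what you feel in heavy scenes.

```python
# Hypothetical per-second fps samples for two runs. The average hides the dips;
# the minimum (the fps floor) is where the difference actually shows up.
runs = {
    "smaller cache": [150, 140, 60, 145, 55, 150],
    "larger cache":  [150, 145, 95, 150, 90, 150],
}

for name, samples in runs.items():
    avg = sum(samples) / len(samples)
    print(f"{name}: avg {avg:.0f} fps, min {min(samples)} fps")
```

Here the averages land at about 117 vs 130 fps, but the floors are 55 vs 90 fps, which is the kind of gap you would actually notice while playing.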
 