Current gen games on Wolfdale?

skadebo

[H]ard|Gawd
Joined
Feb 27, 2005
Messages
1,126
I know there are a lot of people still running Wolfdale, I took a break from gaming for a little over a year and as such have an E8400 running at stock and a basic video card, both passive, for a completely silent system. I'm wondering what others' experiences are on this platform with faster graphics cards. Are you able to play with reasonable settings (at least medium) at 1920 x 1200? I'm interested in playing Bioshock Infinite, Crysis 3, Mass Effect 2/3.
 
I was on the generation before that with an Intel® Core™ 2 Duo E6400. It was able to play most of the games I played at the time, but I have since moved up to an Intel Core i7-860 and just replaced that system with an Intel Core i7-3770K for my home system. Each move has given me improvements in performance.

The Intel Core 2 Duo E6400 that I had is, I think, currently being used as a Minecraft server.
 
UE3 games don't really use that much CPU. I can run 4 copies of a UE3 game (BL2) on 4 threads without much issue.

Crysis 3 I don't know about, but the minimum is "2.8 GHz dual core processor, Intel Core 2 Duo"; whether that gives you the performance you want is unclear. :D
 
You might have some trouble with Crysis 3, since even modern systems will reportedly choke on it, but the others should be fine.
 
Crysis 3 I don't know about, but the minimum is "2.8 GHz dual core processor, Intel Core 2 Duo"; whether that gives you the performance you want is unclear. :D
I hate when specs are listed like that, especially when they are unclear or vary with each source. :p

From the official page: http://www.crysis.com/us/crysis-3/buy

Code:
MINIMUM SYSTEM OPERATING REQUIREMENTS FOR PC:
• Windows Vista, Windows 7 or Windows 8
• DirectX 11 graphics card with 1Gb Video RAM
• Dual core CPU
• 2GB Memory (3GB on Vista)
• Example 1 (Nvidia/Intel):
          • Nvidia GTS 450
          • Intel Core2 Duo 2.4 Ghz (E6600)
• Example 2 (AMD):
          • AMD Radeon HD5770
          • AMD Athlon64 X2 2.7 Ghz (5200+)

You're not going to get top performance with an E8400, but it's better than the example minimum.

The resolution isn't that important if your video card is fast enough to run the game at some level of detail. IOW, once the CPU is the limiting factor, you may see around the same framerate at many resolutions with the other graphics settings fixed.
 
Well, to give an idea, I found I was CPU bound with my sig rig when booting up Metro 2033 the other day. No matter how much I OC'd the card (which was pretty much on par with the [H] review OC), I averaged 50FPS in the built-in benchmark. Underclocking by 100MHz (so a 200MHz swing) only netted a drop of ~2.5FPS. Using high settings, 1920x1200.

At least now I found myself a valid reason to upgrade once Haswell comes out! :D
 
Well, to give an idea, I found I was CPU bound with my sig rig when booting up Metro 2033 the other day. No matter how much I OC'd the card (which was pretty much on par with the [H] review OC), I averaged 50FPS in the built-in benchmark. Underclocking by 100MHz (so a 200MHz swing) only netted a drop of ~2.5FPS. Using high settings, 1920x1200.

At least now I found myself a valid reason to upgrade once Haswell comes out! :D

Damn, now that drives it home, lol. Gonna dust off my 8800GT and start playing old games from my Steam backlog, heh. That should tide me over for a long while.
 
Well, my old PC was an E8400 at stock (it could do 3.6GHz when I had it); now it runs at stock for my dad and little brother, paired with a GTX 560 SSC DS edition. On that system I can confirm Mass Effect 2 and 3 running at 60fps (vsync on) at max settings, and Reckoning, Fallout 3, Fallout New Vegas, Resident Evil 6, Diablo III, Dead Space 1 and 2, Battlefield Bad Company 2, etc. all run at maxed settings and 55-60 fps, all at 1680x1050... It keeps performing very well, even without the OC it had when it was mine...
 
I know there are a lot of people still running Wolfdale, I took a break from gaming for a little over a year and as such have an E8400 running at stock and a basic video card, both passive, for a completely silent system. I'm wondering what others' experiences are on this platform with faster graphics cards. Are you able to play with reasonable settings (at least medium) at 1920 x 1200? I'm interested in playing Bioshock Infinite, Crysis 3, Mass Effect 2/3.

I played the majority of games on an E7200 overclocked to 3.2 GHz without problems. Even Shogun 2, which I played on an NV 8600 GT.

What is a "basic" video card? A GTX 660 or GTX 650 Ti is what's required to play at 1920x1200 at reasonable settings.
 
Yes, that's what my research is showing me. I booted up Bioshock 1 on my 8800GT at 1080p and I can max it out. I'm seeing 560 Tis going for around $120, and I'll pick one of those up, or something newer, once I'm finished with Bio 1, 2, and Mass Effect 2. Knowing myself, this will probably take a few months, heh.
 
Is Mass Effect that game with scanning Ur-Anus? If so, it's finishable relatively fast if you skip the scanning minigame.
 
UE3 games don't really use that much CPU. I can run 4 copies of a UE3 game (BL2) on 4 threads without much issue.

Crysis 3 I don't know about, but the minimum is "2.8 GHz dual core processor, Intel Core 2 Duo"; whether that gives you the performance you want is unclear. :D

Wolfdale could handle Crysis 3 (heck, an overclocked Conroe can, and Kentsfield, surprisingly, doesn't even require overclocking, as Crysis 3 is quad-core aware) on the CPU side - the issue with UE3 games and Crysis 3 is resolution and other settings (the GPU, in other words). If you have an Intel quad-core (or an overclocked dual-core) but an HD7750 or less, consider investing in a GPU upgrade before sinking money into a CPU upgrade. Due to discounting, the GTX650 Ti BOOST (NVidia) and the HD7850 2 GB (AMD) are looking like the sweet spots on the GPU side.
 
Wolfdale could handle Crysis 3 (heck, an overclocked Conroe can, and Kentsfield, surprisingly, doesn't even require overclocking, as Crysis 3 is quad-core aware) on the CPU side - the issue with UE3 games and Crysis 3 is resolution and other settings (the GPU, in other words). If you have an Intel quad-core (or an overclocked dual-core) but an HD7750 or less, consider investing in a GPU upgrade before sinking money into a CPU upgrade. Due to discounting, the GTX650 Ti BOOST (NVidia) and the HD7850 2 GB (AMD) are looking like the sweet spots on the GPU side.

I'm thinking more of the 650 Ti Boost, only because the 7850, being the more powerful card, can be a bit bottlenecked, and it's also a bit hotter and noisier... and he said he wants a completely silent system.
 
The time has come to an end for the great dual-cores with amazing OC potential. Go jump on Haswell when it comes out soon.
 
The time has come to an end for the great dual-cores with amazing OC potential. Go jump on Haswell when it comes out soon.

Not necessarily Haswell.

Haswell actually does not offer that great a performance bump over Ivy Bridge - and the price delta will be larger than the performance gain. If anything, Ivy Bridge i5s will be a more obvious bargain once Haswell arrives - even if the pricing goes nowhere. (And it may well NOT go anywhere - we still don't know if desktop Haswell will require a different socket; if it does, i5 Ivy's pricing, if not that of all the 3rd-generation Intel LGA1155 CPUs, could well stay put.)

Quad-core, yes - applications, games, and most importantly, operating systems are actually putting quad-cores to use. However, not necessarily Haswell - look how long Kentsfield has been EOL, and it's actually still mostly relevant.
 
Not necessarily Haswell.

Haswell actually does not offer that great a performance bump over Ivy Bridge - and the price delta will be larger than the performance gain. If anything, Ivy Bridge i5s will be a more obvious bargain once Haswell arrives - even if the pricing goes nowhere. (And it may well NOT go anywhere - we still don't know if desktop Haswell will require a different socket; if it does, i5 Ivy's pricing, if not that of all the 3rd-generation Intel LGA1155 CPUs, could well stay put.)

Quad-core, yes - applications, games, and most importantly, operating systems are actually putting quad-cores to use. However, not necessarily Haswell - look how long Kentsfield has been EOL, and it's actually still mostly relevant.

It has been published for ages that Ivy Bridge is the end of socket 1155; Haswell and Broadwell arrive on the new socket 1150. Also, Haswell vs. Ivy does not offer much of a performance difference at stock settings, but the OC capability on Haswell is far beyond Ivy: the three BCLK straps alone make a huge difference, and the new voltage controls will add even more OC margin. Also, moving to Ivy means moving to a dead socket, while Haswell means a fresh socket ready for the next Broadwell architecture.
 
It has been published for ages that Ivy Bridge is the end of socket 1155; Haswell and Broadwell arrive on the new socket 1150. Also, Haswell vs. Ivy does not offer much of a performance difference at stock settings, but the OC capability on Haswell is far beyond Ivy: the three BCLK straps alone make a huge difference, and the new voltage controls will add even more OC margin. Also, moving to Ivy means moving to a dead socket, while Haswell means a fresh socket ready for the next Broadwell architecture.

That's if Broadwell will actually mean anything compared to Haswell - remember, Ivy Bridge offered little improvement over Sandy - in other words, going from Haswell to Broadwell may be another sideways move.

A saner move may well be from Ivy to Broadwell, skipping Haswell altogether. Now if you have Sandy Bridge or LGA1156, going to Haswell may make sense, but not if you have anything older or newer than Sandy Bridge.

First off, what applications (other than outliers) are going to make use of tall CPU overclocks? Few everyday applications - even games - do so today. It's why even the Ivy heat wall - though lower than that of Sandy Bridge - makes next to no difference unless you're into e-peen waving. Unless there is also going to be a memory (system RAM, that is) change with either Haswell or Broadwell (if anything, it appears that such a change will NOT happen with Haswell), the dead-socket impact could well be minimal or nil (which wasn't the case with LGA775, of course, as the death of that socket also included the obsolescence of DDR2).
 
check out this review

"Is This Even Fair? Budget Ivy Bridge Takes On Core 2 Duo And Quad"

http://www.tomshardware.com/reviews/ivy-bridge-wolfdale-yorkfield-comparison,3487.html

The majority of this difference is because of much better RAM bandwidth and lower latency. I have 85 ns; a normal i3/i5 can have 35-45 ns. This is a world of difference. The real results are in this tab: http://media.bestofmicro.com/F/7/381571/original/Sandra-Cryptography.png Ignore the i5 because it has AES-NI support.
 
The G2020 is pretty impressive. Can't wait to see what Haswell-based Pentiums can do for $60 (next year). :p

The majority of this difference is because of much better RAM bandwidth and lower latency. I have 85 ns; a normal i3/i5 can have 35-45 ns.
wat? Cache hit rates and CPU utilization in most of these benchmarks are quite high, and main memory latency would not be very significant for most of the THG tests.

With the smaller cache sizes in those Pentium and Celeron CPUs, they would be hitting main memory more often than the CPUs they are compared against (the E8400 has 6MB of L2, the Q9550 has 12MB, the G1610 has 512KB of L2 & 2MB of L3, the G2020 has 512KB of L2 & 3MB of L3).

It's faster for a number of reasons, like higher bandwidth caches and uarch improvements over the Conroe and Nehalem cores. It's about a lot more than lowering some kind of synthetic measure of memory bandwidth. :p
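
To put rough numbers on the hit-rate point, here's a back-of-the-envelope AMAT (average memory access time) comparison - the hit rates and cache hit latency are just illustrative guesses, only the 85 ns vs ~40 ns RAM figures come from the posts above:

Code:
# Back-of-the-envelope AMAT comparison (Python).
# Hit rate and cache hit latency are illustrative guesses; only the
# 85 ns vs ~40 ns main-memory latencies come from the posts above.
def amat(hit_time_ns, hit_rate, mem_latency_ns):
    """AMAT = hit_time + miss_rate * miss_penalty"""
    return hit_time_ns + (1.0 - hit_rate) * mem_latency_ns

for hit_rate in (0.90, 0.99):
    old = amat(3.0, hit_rate, 85.0)   # Wolfdale-era setup, ~85 ns to RAM
    new = amat(3.0, hit_rate, 40.0)   # typical i3/i5, ~40 ns to RAM
    print(f"hit rate {hit_rate:.0%}: {old:.1f} ns vs {new:.1f} ns")

# With a 99% hit rate the gap is under 0.5 ns per access, which is why
# RAM latency alone can't explain most of the benchmark difference.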
 
I hate when specs are listed like that, especially when they are unclear or vary with each source. :p

Business software is usually worse.
Word came down from the business that they wanted to stamp "2GHz CPU required" on our product.

I told them that was completely meaningless.

I don't know why using the WEI never caught on for viable spec requirements (for our Windows application).
 
The majority of this difference is because of much better RAM bandwidth and lower latency. I have 85 ns; a normal i3/i5 can have 35-45 ns. This is a world of difference. The real results are in this tab: http://media.bestofmicro.com/F/7/381571/original/Sandra-Cryptography.png Ignore the i5 because it has AES-NI support.

That is why I pointed to the memory change (from DDR2 to DDR3) as being part of the dead-socket impact with LGA775. I'm currently running DDR2-800 (a 200 MHz base clock, quad-pumped, with a Q6600), and this is typical of non-overclocking DDR2. Budget DDR3 is currently a 333 MHz base clock, quad-pumped (DDR3-1333) - 133 MHz quicker before the quad-pumping even kicks in - and therefore faster than typical DDR2 in and of itself. It's also why I wish I had been able to grab an ASUS P5G41-M LX Plus R2 (it's basically the same motherboard as the P5G41-M LX2/GB, but with a different BIOS and DDR3 support), so I could compare the two memory types in identical circumstances.
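
To put the raw numbers on that (a quick sketch - theoretical peak per 64-bit channel only, it ignores timings and latency entirely):

Code:
# Theoretical peak bandwidth for one 64-bit memory channel:
#   transfers/s * 8 bytes per transfer.
# Only the DDR2-800 and DDR3-1333 speeds come from the post above.
def peak_gb_s(transfers_per_sec):
    return transfers_per_sec * 8 / 1e9  # 64-bit bus = 8 bytes per transfer

ddr2_800  = peak_gb_s(800e6)    # 200 MHz base clock -> 800 MT/s
ddr3_1333 = peak_gb_s(1333e6)   # 333 MHz base clock -> 1333 MT/s

print(f"DDR2-800:  {ddr2_800:.1f} GB/s per channel")   # ~6.4 GB/s
print(f"DDR3-1333: {ddr3_1333:.1f} GB/s per channel")  # ~10.7 GB/s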
 
A saner move may well be from Ivy to Broadwell, skipping Haswell altogether. Now if you have Sandy Bridge or LGA1156, going to Haswell may make sense, but not if you have anything older or newer than Sandy Bridge.


Buying a Haswell and using it for 5 years or so sounds a lot better than your suggestion of buying two different systems.

And why would buying Haswell make sense when you are already on Sandy Bridge? You were already told it needs a new motherboard, and the improvement coming from SB/IB is the smallest compared to coming from any other socket. Wake up.
 
Business software is usually worse.
Word came down from the business that they wanted to stamp "2GHz CPU required" on our product.

I told them that was completely meaningless.

I don't know why using the WEI never caught on for viable spec requirements (for our Windows application).

That would be equally meaningless. Also, Windows 8 WEI and Windows 7 WEI are rated on a different scale, so the same system gets different scores under W7 and W8.
 
That would be equally meaningless. Also, Windows 8 WEI and Windows 7 WEI are rated on a different scale, so the same system gets different scores under W7 and W8.

But I'm not interested in comparing WEI. I'm interested in a minimum WEI... which would be far more relevant than a minimum CPU clock rating.

I have a 2GHz Willamette around here somewhere....
 
Thanks for posting that article.

I am still running an E8400 myself, OC'd to 3.6-4.0, and it is being absolutely DESTROYED by modern games. I barely broke 30fps in Bioshock Infinite (high to ultra or whatever settings), I can barely break 60fps in Minecraft with OptiFine, and Far Cry 3 was not terrible but still had some hitches, such that I turned down quite a few settings from ultra/high to high/med.

This is with 8GB of RAM and a 7970, at only 1920x1080.
 
But I'm not interested in comparing WEI. I'm interested in a minimum WEI... which would be far more relevant than a minimum CPU clock rating.

I have a 2GHz Willamette around here somewhere....

As long as you don't specify a minimum combined WEI, that is.

If you do, then the max you can ever get with any non-SSD system is 5.9.

As for graphics, CPU, and RAM, the scale is not linear. It takes a lot more speed to go from 7.0-7.9 than it does to go from 5.0-5.9.

And what exactly is WEI measuring when it does its tests?

A video card could score exactly the same as a different card, but the same game may not run equally well on both cards.
 
As long as you don't specify a minimum combined WEI, that is.

If you do, then the max you can ever get with any non-SSD system is 5.9.

As for graphics, CPU, and RAM, the scale is not linear. It takes a lot more speed to go from 7.0-7.9 than it does to go from 5.0-5.9.

And what exactly is WEI measuring when it does its tests?

A video card could score exactly the same as a different card, but the same game may not run equally well on both cards.


Like I said in the first post: business software without significant requirements.... a minimum combined WEI is ideal for that use.

If I say "minimum WEI 3.0" then it has easily quantifiable meaning across various hardware.
Otherwise you say "1.8GHz CPU" and people get uppity when their Atom or 6-year-old Celeron doesn't run it well.
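
Something like this is all I had in mind - just a rough sketch (Python shelling out to PowerShell; Win32_WinSAT and WinSPRLevel are the standard WMI class/property for the base score, and the 3.0 floor is only my example number from above):

Code:
# Rough sketch of a "minimum WEI" check for a Windows app launcher.
# Reads the machine's WEI base score via WMI's Win32_WinSAT class
# (WinSPRLevel is the overall/base score). MINIMUM_WEI = 3.0 is just
# the example figure from the post above.
import subprocess

MINIMUM_WEI = 3.0

def read_wei_base_score():
    out = subprocess.check_output([
        "powershell", "-NoProfile", "-Command",
        "(Get-WmiObject -Class Win32_WinSAT).WinSPRLevel",
    ])
    return float(out.decode().strip())

if __name__ == "__main__":
    score = read_wei_base_score()
    if score < MINIMUM_WEI:
        print(f"WEI base score {score:.1f} is below the required {MINIMUM_WEI}")
    else:
        print(f"WEI base score {score:.1f} - good to go")

Same caveat as above applies, though - W7 and W8 rate on different scales, so any published floor would need to say which version of the index it means.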
 
Buying a Haswell and using it for 5 years or so sounds a lot better than your suggestion of buying two different systems.

And why would buying Haswell make sense when you are already on Sandy Bridge? You were already told it needs a new motherboard, and the improvement coming from SB/IB is the smallest compared to coming from any other socket. Wake up.

I was referring to moving from dual-core (i3) Sandy to an i5 OR i7 Haswell - if you are keeping the same number of cores, you are right; it WOULDN'T make sense. (The only reason leaving Celeron Wolfdale for Kentsfield and staying in the same socket made any sense was the doubling of the number of cores (E3400 -> Q6600); that Kentsfield has more cache per core was strictly icing. As it is, the extra cache is used by installations, but very few individual applications OR games. And yes - the same logic would have applied if Yorkfield were substituted for Kentsfield.)
 
Well, to give an idea, I found I was CPU bound with my sig rig when booting up Metro 2033 the other day. No matter how much I OC'd the card (which was pretty much on par with the [H] review OC), I averaged 50FPS in the built-in benchmark. Underclocking by 100MHz (so a 200MHz swing) only netted a drop of ~2.5FPS. Using high settings, 1920x1200.

At least now I found myself a valid reason to upgrade once Haswell comes out! :D

Yeah, bringing a pretty cold carcass out, but I figure this is relevant. New rig in the sig, stock clocks on everything, no tweaking. The same settings in the Metro benchmark (default high settings, 4X MSAA, tessellation, no DOF, advanced PhysX on) took the average from the 50fps above (the E8400 setup, same video card, 6GB of DDR2-1066, P35 board) to an 85fps average. That is a pretty decent bottleneck from the platform I was on.
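
Put another way, in frame-time terms (just arithmetic on the 50 and 85 fps averages above):

Code:
# Frame-time view of the platform upgrade: same GPU and settings,
# only the 50 fps and 85 fps averages come from the posts above.
old_fps, new_fps = 50.0, 85.0

old_ms = 1000.0 / old_fps   # 20.0 ms per frame
new_ms = 1000.0 / new_fps   # ~11.8 ms per frame

print(f"{old_fps:.0f} fps = {old_ms:.1f} ms/frame")
print(f"{new_fps:.0f} fps = {new_ms:.1f} ms/frame")
print(f"~{old_ms - new_ms:.1f} ms/frame freed up by the platform change "
      f"({(new_fps - old_fps) / old_fps:.0%} higher average fps)")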
 
I know there are a lot of people still running Wolfdale, I took a break from gaming for a little over a year and as such have an E8400 running at stock and a basic video card, both passive, for a completely silent system. I'm wondering what others' experiences are on this platform with faster graphics cards. Are you able to play with reasonable settings (at least medium) at 1920 x 1200? I'm interested in playing Bioshock Infinite, Crysis 3, Mass Effect 2/3.

Buy the games, see if they work, upgrade if they do not... People on this site seem to be mostly biased towards wasting money on unnecessary upgrades, and listening to advice here will often result in throwing your money away. Always try it first; not much to lose that way. Especially ignore people who are known to upgrade ridiculously often. When this site was relatively new, PCs were slow, and overclocking and the like could give noticeable gains that would improve productivity on your machine. But times are different now. Adapt to the fact that computers now are pretty damn good. Save your money when you can. Only upgrade when your machine starts to bog down.

I've had at least one reply to posts like this saying, basically, "people upgrade cars to go faster when they don't need to, so why not computers?" Well, people don't typically upgrade the engines in their cars to ones with better specifications that feel exactly the same in practice. They generally only do upgrades they can actually feel. If your PC upgrade isn't obvious in use, then you have wasted your money (and worse, given it mostly to companies outside of the US, thus weakening the US economy that much more).
 