Just ordered an E8400...

BrandonB

2[H]4U
Joined
Jan 25, 2007
Messages
2,489
Anyone here have one of these babies? If so, how are you liking it so far? What's it like speed-wise in games and stuff?
 
Those E8400s are some of the worst for gaming and are outdated by now.

/would not read again
 
Just got mine last night. (Xeon 3110, actually.)

(attached screenshot: hurrhurrrhurrrrr537.jpg)


Prime stable @ 4GHz 1.375V; however, 4.5GHz requires 1.525V+ to get stable, a little too high for my tastes. I'm going to try to find some middle ground, hopefully a sweet spot between 4.1 and 4.4 before the chip requires a huge ramp-up in voltage.

I've only been using it for a few hours, and I can already say (even at stock speeds) that this chip is much, much snappier than my E4300 @ 3.2GHz.
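For anyone following along, the clocks in this thread are just FSB × multiplier. A quick back-of-the-envelope sketch (assuming the E8400/Xeon 3110's locked 9x multiplier and 333MHz stock FSB):

```python
# Core 2 clock speed = FSB (MHz) x multiplier. The E8400 / Xeon 3110
# runs a locked 9x multiplier on a 333 MHz stock FSB (1333 MT/s quad-pumped).
MULTIPLIER = 9

def core_clock_mhz(fsb_mhz, multiplier=MULTIPLIER):
    """Resulting core clock for a given front-side bus speed."""
    return fsb_mhz * multiplier

def fsb_for_target(target_mhz, multiplier=MULTIPLIER):
    """FSB needed to reach a target core clock."""
    return target_mhz / multiplier

print(core_clock_mhz(333))    # stock: roughly 3.0 GHz
print(fsb_for_target(4000))   # ~444 MHz FSB for 4.0 GHz
print(fsb_for_target(4500))   # 500 MHz FSB for 4.5 GHz
```

So 4GHz and 4.5GHz on this chip mean running the bus at roughly 444 and 500MHz.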
 
I just got mine a couple of days ago, upgraded from an AMD 3800 X2. Now it's definitely faster, and much, much snappier, like m3tahead said above. My hard drive seems to work even faster now.

Now, I kept the same video card from my last system, an 8800GTS, so I think the 8800GTS is the limiting factor in games now. In Oblivion I expected outdoor frame rates of 80+, but at my resolution of 1680x1050 on high detail it still bogs down to about 40FPS in those outdoor scenes. Still faster than my 3800 X2, though. This must be the video card limiting me.

I'm also only able to get a stable OC of 3.6 at 1.318 volts. I expected to get to 4GHz, but I'd have to go over the recommended vcore of 1.36, which I don't care about doing.
 
I'm quite happy with mine. 3.6GHz at 1.218V. Under load it hits 40C max.

Gets about 1750 PPD in F@H, and that's with a few hours of break time to play some games. Crushes any game I throw at it.
 
I like mine; it runs at 4.05GHz with 1.4V.

After checking around, 1.4V seems to be the max safe limit for these chips, so I can't go any higher lol.
 
E8400 sucks!

:D (waiting for the virtual stoning, and flamethrowers)

Naw, seriously... enjoy it. It's probably one of the smartest buys right now. There hasn't been anything this great for OC'ers since the Celeron 300A, if you're old enough to even know about those. hehehehe....
 
Well, there have been a few chips that have come close to the 300A. I remember having a 1.6A Northwood that did 3.0GHz without blinking.
 
This is true, but the CPU still helps a lot. My current CPU is bottlenecking my 9800GTX, which is why I got the E8400: so I could OC the shit out of it.

QFT, some people really underestimate how much work the CPU needs to do to keep the GPU fed with data. I must admit I was one of those not long ago. I thought my OCed X2 4400+ was more than enough for any game for some time to come as long as I had a decent video card. That was of course until I ran into some major slowdowns that were not cured by reducing graphics settings or getting a new video card. Once I made the upgrade to my current system, everything ran a lot smoother.
 
Curious, what are your VIDs at? What's the lowest and highest on the E8400 chips?
 
I'm going to purchase an E8400 in 5 days; it just depends where from, because I've got to save all the money I can right now. I'm hoping I can get 3.6GHz on stock voltage (to run RAM @ its stock 400MHz).
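The 3.6GHz-on-stock-volts plan works out neatly on paper, assuming the stock 9x multiplier and a 1:1 memory divider (this is just the arithmetic, not a guarantee any given chip gets there):

```python
# 400 MHz FSB x 9 = 3.6 GHz core, and a 1:1 memory divider keeps DDR2
# at its 400 MHz stock base clock (DDR2-800, being double data rate).
MULTIPLIER = 9

fsb_mhz = 400
core_mhz = fsb_mhz * MULTIPLIER    # 3600 MHz
ddr2_rating = fsb_mhz * 2          # DDR2-800

print(f"{core_mhz} MHz core, DDR2-{ddr2_rating}")
```

That's why 3.6GHz is such a popular target on this chip: the RAM stays exactly at spec.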
 
QFT, some people really underestimate how much work the CPU needs to do to keep the GPU fed with data. I must admit I was one of those not long ago. I thought my OCed X2 4400+ was more than enough for any game for some time to come as long as I had a decent video card. That was of course until I ran into some major slowdowns that were not cured by reducing graphics settings or getting a new video card. Once I made the upgrade to my current system, everything ran a lot smoother.

That's right. A faster CPU WILL help out when there are a lot of objects on screen, and I have noticed less slowdown on the OC'd E8400 vs. the OC'd Opteron 170.
 
I'm going to purchase an E8400 in 5 days; it just depends where from, because I've got to save all the money I can right now. I'm hoping I can get 3.6GHz on stock voltage (to run RAM @ its stock 400MHz).

Good luck — I'm running mine at 3.6 at 1.28V.
 
That's right. A faster CPU WILL help out when there are a lot of objects on screen, and I have noticed less slowdown on the OC'd E8400 vs. the OC'd Opteron 170.
As a previous Opteron 170 @ 2.8GHz owner, I agree 100%. When I went to my Q6600 @ 3.4 the difference was massive, especially in heavily CPU-bottlenecked games like CSS. Actually, I'm still bottlenecked in CSS, but I don't know if I should get the E8400 for 4.0GHz or the Q9450 for an extra $100 and shoot for 3.6...
 
As a previous Opteron 170 @ 2.8GHz owner, I agree 100%. When I went to my Q6600 @ 3.4 the difference was massive, especially in heavily CPU-bottlenecked games like CSS. Actually, I'm still bottlenecked in CSS, but I don't know if I should get the E8400 for 4.0GHz or the Q9450 for an extra $100 and shoot for 3.6...

Or neither... Who cares if you dip into the 80s instead of maintaining 100fps all the time? Not really much of a bottleneck.
 
Turn the frame rate counter off and I guarantee that you'll never know the difference, and you'll save yourself $300. ;)
 
Turn the frame rate counter off and I guarantee that you'll never know the difference, and you'll save yourself $300. ;)

I tried, and I can still tell the difference. My monitor and my eyes are very good. And it won't be $300 after I sell the Q6600.
 
I tried, and I can still tell the difference. My monitor and my eyes are very good. And it won't be $300 after I sell the Q6600.

Humans cannot see the difference past 30FPS or so. So that's just impossible, no matter how good your eyes or your monitor is.
 
Humans cannot see the difference past 30FPS or so. So that's just impossible, no matter how good your eyes or your monitor is.

While I do find it hard to believe that someone can tell the difference between 80 and 100 fps, humans can indeed tell the difference above 30 fps... There's a common misconception here. Humans may not be able to see the individual frames once you go past 25fps or so, but that doesn't mean we can't see the difference in fluidity between, say, 30 and 60 fps.
 
Humans cannot see the difference past 30FPS or so. So that's just impossible, no matter how good your eyes or your monitor is.

Whoa, whoa... When it comes to a DVD or some other kind of video content, this may be partially true, although those 120Hz LCD demos are convincing me otherwise. Then again, that's double-refreshing a 30fps source.

The argument of 60 fps vs. 100 fps is definitely debatable, but I can definitely tell the difference between 30 fps and 60 fps.

My comments above about fps drops ARE without an FPS counter. I'm too lazy to figure out how to enable those things, but in gaming I do notice less slowdown when there are more objects on screen.
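For what it's worth, the frame-time arithmetic backs up both sides of this argument — a rough sketch:

```python
# Time between frames in milliseconds is 1000 / fps. Going from 30 to
# 60 fps cuts ~16.7 ms per frame; going from 80 to 100 fps only cuts
# 2.5 ms, which is why the latter is far harder to perceive.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 80, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms/frame")
```

30→60 is a huge per-frame change; 80→100 is a small one, so both camps here can be partly right.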
 
Humans cannot see the difference past 30FPS or so. So that's just impossible, no matter how good your eyes or your monitor is.
LOL! There is one of these naysayers every time this question comes up. I'm not even gonna justify that with an answer, but I will say that if you can't tell the difference between 80 and 100+ fps, you are either not sensitive to it (some people can't even see the flicker at 60Hz, or claim that 30fps is smooth) or you need a good monitor that can actually display that difference. Back on topic now.
 
LOL! There is one of these naysayers every time this question comes up. I'm not even gonna justify that with an answer, but I will say that if you can't tell the difference between 80 and 100+ fps, you are either not sensitive to it (some people can't even see the flicker at 60Hz, or claim that 30fps is smooth) or you need a good monitor that can actually display that difference. Back on topic now.

I'll admit I'm just repeating what I thought to be true from someone else. I was stupid for not doing my own research; if I'm wrong about it, just tell me, and tell me why. :)
 
I got mine yesterday and am running stock on my ASUS P5E-VM HDMI. It's plenty fast at stock.

LOL, this is the first PC I have built that I didn't overclock.
 
You might not SEE a difference between 80 and 100 fps, but you will definitely experience a much smoother game.

I'm aiming at an E8400 too, to pair with a 9600GT. All I play is CSS, and my X2 4200+ @ 2.6GHz is a real bottleneck for my 9600GT... I get the same fps with this 9600GT that I got with my 6800.

So yes, I'm ready to move on, I think...


BTW, for all of you getting nice OCs at around 1.4V, 1.38V, etc.: is this what's set in BIOS, or what you read in Windows?

Please post the voltage you set in BIOS.
 
LOL! There is one of these naysayers every time this question comes up. I'm not even gonna justify that with an answer, but I will say that if you can't tell the difference between 80 and 100+ fps, you are either not sensitive to it (some people can't even see the flicker at 60Hz, or claim that 30fps is smooth) or you need a good monitor that can actually display that difference. Back on topic now.

What kind of monitor are you using? My LCD is basically locked at 60Hz, and since I upgraded my system I can't really tell the difference between a solid 70 and say 110 fps, since my LCD only updates 60 times a second anyway :(
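That 60Hz cap fits in one line — a deliberately simplified model that ignores tearing and frame pacing:

```python
# A fixed-refresh panel can show at most refresh_hz distinct frames per
# second; frames rendered beyond that are dropped (or torn) on the way out.
def visible_fps(rendered_fps, refresh_hz=60):
    return min(rendered_fps, refresh_hz)

print(visible_fps(70))     # 60 on a 60 Hz LCD
print(visible_fps(110))    # still 60
print(visible_fps(45))     # 45
```

Which is exactly why high-refresh CRTs and 120Hz panels matter in this debate: on a 60Hz LCD, 70fps and 110fps look the same.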
 
What kind of monitor are you using? My LCD is basically locked at 60Hz, and since I upgraded my system I can't really tell the difference between a solid 70 and say 110 fps, since my LCD only updates 60 times a second anyway :(

Sony GDM-FW900. I can run CSS at 960x600 @ 160Hz, but most of the time I use 1280x800 @ 120Hz. I wish Tank Guys still had the Xeon version of the E8400 in stock, because I would order right now.
 