Is 1.6V vcore safe if temps are OK? Also, will a RAM upgrade help TF2 FPS?

CPU: E8400
MOBO: Gigabyte GA-P35-DS3L
HEATSINK: Tuniq Tower 120
RAM: G.SKILL 2GB DDR2 800 (F2-6400CL5D-2GBNQ)
PSU: Corsair HX520
OS: XP 32-bit

Core Temp during a 32M SuperPi run: 47°C low, 66°C high, around 59-61°C average near the end.

4.41GHz @ 1.6V BIOS vcore. There's nothing higher available except 1.8V and 2.0V.

Also, I'm thinking about getting the Mushkin 4GB DDR2 800 Ascent RAM to hopefully get TF2 running at 100fps constant, even in 24-player firefights.

I can't seem to keep 100fps even with low settings at 640x480. Using DX8 instead of DX9 doesn't seem to help much either.

I'm turning HDR and motion blur off because I've heard they eat CPU power, whereas the other settings are just GPU? I'm not sure, so if anyone can shed more light on this, thank you.


1) Is it the high vcore itself that causes degradation or the heat?

2) I've seen reviews where the Mushkin Ascent runs at 1100MHz, so would it be possible to push the CPU further? And even if I can't, do you think the upgrade will help TF2 FPS?

edit: 100fps is only useless if you play on an LCD. -dxlevel 81 is a launch command that puts TF2 into DX8 mode for higher FPS.
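For reference, here's roughly how those toggles look in practice. This is just a sketch; the cvar names are from memory for the 2007-era Source engine, so double-check them in your own console:

Launch options (right-click TF2 in Steam > Properties):
-dxlevel 81 -w 640 -h 480

autoexec.cfg (or type in the console):
mat_hdr_level 0 // HDR off
mat_motion_blur_enabled 0 // motion blur off
fps_max 101 // cap just above the 100fps target
net_graph 1 // fps/ping overlay to verify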
 
1) Is it the high vcore itself that causes degradation or the heat?
It's the Vcore. 1.6V is WAY more than your CPU is designed to handle and it's almost a certainty that you are damaging your chip.
 
It's the Vcore. 1.6V is WAY more than your CPU is designed to handle and it's almost a certainty that you are damaging your chip.

Like zero said, 1.6V is WAY too high. You need to get it down to 1.4V or lower; there's no reason you should be at 1.6V with that processor.

The memory isn't going to make that much of a difference in your frame rate unless you're already maxing it out. Not to mention 100fps is pretty much useless. The problem you're running into is that the 8800GT just doesn't have enough video memory to run high settings plus AA in any game, so either run lower settings with AA or turn AA off completely. This is especially true if you don't have that card overclocked. Also, using DX8 isn't going to make any difference to begin with, since it's a DX9 game.


If you're already running 4.4GHz @ 1.6V, then there's no possible way you can push that any higher. In fact, I'll bet anything you'll have to downclock to get back to 1.4V. What the Mushkin RAM will let you do is increase the RAM multiplier: instead of running 1:1 with the FSB, you'll be able to run it a few steps higher, which leaves the CPU at the speed it's at but has the RAM running at DDR2 1100.
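To put rough numbers on that divider math (assuming the E8400's 9x multiplier and typical P35 ratios; the exact dividers available vary by BIOS):

4.41GHz / 9 = 490MHz FSB
1:1 ratio -> 490MHz memory clock -> DDR2-980
5:6 ratio -> 588MHz memory clock -> DDR2-1176

So at that FSB, even one divider step above 1:1 overshoots DDR2-1100; in practice you'd back the FSB off slightly or run the Ascent a bit under its rated speed.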

To sum this up, I think you need to do a little more reading on how AA, HDR, and motion blur work, as well as how memory can affect games.
 
Like zero said, 1.6V is WAY too high. You need to get it down to 1.4V or lower; there's no reason you should be at 1.6V with that processor.

He doesn't need to go all the way down to 1.4V, but I would go back down to 1.5V. I wouldn't want to run 24/7 much above 1.525V. I ran an E5200 at 4.4GHz at 1.525V 24/7 for about six months, have since passed the CPU on to a friend, and it's doing just fine for them.
 
As everyone has noted so far, RAM speed doesn't make a difference when it comes to sheer FPS. Right now, your main barrier to getting higher FPS is the video card.
 
OP mentioned he's running at 640x480. Shouldn't that be mostly CPU-limited at that low a resolution? I know it's TF2 and we're not comparing Quake here, but come on: 640x480 @ 4.4GHz and we can't keep 100fps?
 
OP mentioned he's running at 640x480. Shouldn't that be mostly CPU-limited at that low a resolution? I know it's TF2 and we're not comparing Quake here, but come on: 640x480 @ 4.4GHz and we can't keep 100fps?


It's meaningless anyway. And no, that processor wouldn't hold 100fps, because it's not the processor that's the problem. The problem is that it's a Source game, and while it does support multi-core, it's poorly coded for it. What I still don't understand is the point of running the game at that resolution; especially with a Source game, it's not going to give you any useful information.

Also, it doesn't matter if you have an LCD or a CRT: 100fps doesn't make any difference unless the display has a 100Hz refresh rate. The game could be running at 10 million fps and a 60Hz LCD can still only show 60fps.
 
Don't you remember 1.6 volts killing Northwood P4s, like five generations ago?

My question is what video card are you using?
 
hopefully get TF2 running at 100fps constant, even in 24-player firefights

It was my impression (as a DAoC and Warhammer player) that online games with lots of players are typically network I/O bound, server bound, or both in big fights. Will the game draw a new frame if all the info for the locations etc. of the other players hasn't been received? And if it did draw a frame, what would be the use without that player info? On a fast local LAN, not so much of an issue, but over the net I thought it was a big one. I too have an 8800GT and it's OK for most of my games, but over Teamspeak some of my guys with nicely tweaked-out SLI machines still complain about the "slide show" when things start to get really interesting. So much so that I maybe noticed a little improvement, at least I think so, when I went to a dedicated Intel NIC. Naturally it helps to have as good a video card as you can afford, no argument there. I'm just wondering if the user's goal is achievable given the design of the game and the network constraints.
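You can actually watch this in-game: the Source engine's net_graph overlay shows frame rate alongside network choke and loss, which separates a GPU bottleneck from a server/pipe bottleneck. Console commands from memory, with era-typical values rather than gospel:

net_graph 3 // fps plus ping, choke, loss
cl_updaterate 66 // snapshots per second requested from the server
cl_cmdrate 66 // command packets per second sent to the server
rate 30000 // max bytes/sec; too low shows up as choke

If fps stays high but choke or loss spikes in 24-player fights, the slideshow is the network or the server tick, not the video card.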
 
As others have stated, this is network related, not GPU or CPU related. You should probably back your OC down further, as you're only making that chip run stupid hot, not gaining anything from it.
 
Also, it doesn't matter if you have an LCD or a CRT: 100fps doesn't make any difference unless the display has a 100Hz refresh rate. The game could be running at 10 million fps and a 60Hz LCD can still only show 60fps.

CRTs have been able to run at 100Hz, 120Hz, 150Hz for a long time now. If you're running your monitor at 120Hz, FPS all the way up to 120 will make a difference, although it's less noticeable to some. From there on up, you may see screen tearing.

It was my impression (as a DAoC and Warhammer player) that online games with lots of players are typically network I/O bound, server bound, or both in big fights. Will the game draw a new frame if all the info for the locations etc. of the other players hasn't been received? And if it did draw a frame, what would be the use without that player info? On a fast local LAN, not so much of an issue, but over the net I thought it was a big one. I too have an 8800GT and it's OK for most of my games, but over Teamspeak some of my guys with nicely tweaked-out SLI machines still complain about the "slide show" when things start to get really interesting. So much so that I maybe noticed a little improvement, at least I think so, when I went to a dedicated Intel NIC. Naturally it helps to have as good a video card as you can afford, no argument there. I'm just wondering if the user's goal is achievable given the design of the game and the network constraints.

First of all, +1000 thanks for your Gigabyte overclock guide. I've referenced it for years, and it helped me OC an E2140 from 1.6GHz to 3.2GHz, which painlessly held me over until I could afford a better chip. To be honest, I actually don't notice much of a performance difference moving from that to the E8400.

I messed around a bit and have it running pretty well now. One of the issues was a problem with dxlevel that caused me some confusion: after you mess with -dxlevel in the launch options, I thought using -autoconfig would set everything back to default, but you actually have to manually put in -dxlevel 90 to get back to DX9.

And I think you're right that it's not the hardware holding us back but the network/server, because everything runs at 100fps just fine even with settings maxed, but get more than 12 people onscreen and it goes to hell. Seriously, this is an epiphany for me. I've often wondered how it's possible that hardware has gotten so much better while game performance hasn't scaled accordingly, and this seems like the culprit.

I'll take a look at the dedicated NICs, thanks for the input.
 
I ran 1.53V (in CPU-Z, not BIOS) on my E8400 for six months with zero degradation. I previously ran it for six months before that @ 1.38V. It's still in service @ 1.37V and running strong (the new owner lowered the clock and the 1.53V).

My temps were well in line. People always say voltage kills, but I still don't believe voltage kills as quickly as heat does. I kept my E8400 below 70°C at all times with 1.53V, and I firmly believe that's why it lives today. There have also been a few others who have run @ 1.5V+ without issues (some for over a year).
 
I ran 1.53V (in CPU-Z, not BIOS) on my E8400 for six months with zero degradation. I previously ran it for six months before that @ 1.38V. It's still in service @ 1.37V and running strong (the new owner lowered the clock and the 1.53V).

My temps were well in line. People always say voltage kills, but I still don't believe voltage kills as quickly as heat does. I kept my E8400 below 70°C at all times with 1.53V, and I firmly believe that's why it lives today. There have also been a few others who have run @ 1.5V+ without issues (some for over a year).

Just because it works now doesn't mean there wasn't any degradation... just not enough to kill it yet.
 
I know there was degradation; of course there will be, there always is to some extent. To what extent, no one has proven. All chips are different, but there are many who have run 1.5V+ for 6-12 months without any noticeable degradation (specifically, without needing a bump in vcore).

Heat kills, not voltage. I'd bet money that an E8400 running @ 1.4V @ 100°C would die quicker than one @ 1.6V @ 70°C.

I'd also like to know how things have changed from 5-8 years ago. Heat was always the enemy, not voltage. Everyone is scared of voltage nowadays and not so scared of heat. I'd much rather run my CPU cooler with a bit more voltage than the opposite.
 
I know there was degradation; of course there will be, there always is to some extent. To what extent, no one has proven. All chips are different, but there are many who have run 1.5V+ for 6-12 months without any noticeable degradation (specifically, without needing a bump in vcore).

Heat kills, not voltage. I'd bet money that an E8400 running @ 1.4V @ 100°C would die quicker than one @ 1.6V @ 70°C.

I'd also like to know how things have changed from 5-8 years ago. Heat was always the enemy, not voltage. Everyone is scared of voltage nowadays and not so scared of heat. I'd much rather run my CPU cooler with a bit more voltage than the opposite.

Everything has changed since 5-8 years ago. Most importantly, the process these chips are built on is around a quarter the size. Literally: a 180nm process for a Pentium III Coppermine versus a 45nm process for a Core 2 Duo Wolfdale. It takes a hell of a lot higher voltage to cause electromigration across the gaps in a 180nm chip than across the same gaps in a 45nm chip.
 
Maybe I'm wrong, but I was under the impression electromigration was more dependent on heat? I understand it's EASIER for it to happen with more voltage, but heat will do it QUICKER.
 
Maybe I'm wrong, but I was under the impression electromigration was more dependent on heat? I understand it's EASIER for it to happen with more voltage, but heat will do it QUICKER.

Both in combination, but voltage is the more important factor. Heat without high voltage: less likely. High voltage with only moderate heat (for CPUs, ~60°C): likely. High voltage and high heat (70°C+, especially 80°C+): very likely. I'm no electrical engineer, but if my memory serves, these numbers are in line with what the EEs here have said in the past.
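For what it's worth, the textbook first-order model here is Black's equation for electromigration lifetime, and it captures both sides of this argument. A rough sketch using the numbers from the bet above, assuming current density scales with vcore, an exponent n = 2, and activation energy Ea = 0.7eV (all ballpark values, not Wolfdale-specific):

MTTF = A x J^-n x e^(Ea/kT), with T in kelvin and k = 8.617x10^-5 eV/K

1.4V @ 100°C (373K): (1.4)^-2 x e^(0.7/(k x 373)) ≈ 0.51 x e^21.8
1.6V @ 70°C (343K): (1.6)^-2 x e^(0.7/(k x 343)) ≈ 0.39 x e^23.7

Ratio ≈ (0.39/0.51) x e^1.9 ≈ 5x in favor of the cooler 1.6V chip.

The voltage term is a mild power law while the temperature term is exponential, which is why the hot 1.4V chip loses this particular bet. The catch is that on the same cooler, more vcore also means more heat, so in a real system the two climb together.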
 
Air cooling FTL. 1.6V, that's like a max-kill voltage for an LN2 suicide run.
 
Not hardly. I ran 1.62V on my E8400 on water. Granted, it crept up to around 92°C, but it did OK :D Again, chip dependent though.
 
Not hardly. I ran 1.62V on my E8400 on water. Granted, it crept up to around 92°C, but it did OK :D Again, chip dependent though.
No matter how well you think the chip coped with that much voltage, it's almost a certainty that you caused some damage to it.
 
I understand it damaged it; I'm just stating it's not always going to kill the chip ;/ They can take a bit more voltage than people believe. Someone roaming around on one of the forums had his chip over 1.6V for over six months without an issue ;/
 
I understand it damaged it; I'm just stating it's not always going to kill the chip ;/ They can take a bit more voltage than people believe. Someone roaming around on one of the forums had his chip over 1.6V for over six months without an issue ;/
If it was a 65nm chip, then that's not good, but it's not nearly as bad as on a 45nm chip. And these chips are designed to run for 10+ years. Overvolting shortens the lifespan of the chip: the more, the shorter (and I doubt it's a linear relationship).
 
1.6 seems like too much. However, with great cooling it might be doable.
 
1.6V is not a safe voltage with a 45nm Core 2 CPU no matter what. You are guaranteed to cause damage to your chip by running at that level.
 
I've had my Q6600 at 2.0V. No sweat.
Unless it was under extreme cold (LN2), that's instant fry territory. If it wasn't under extreme cold and it saw the next day, it was probably a reading error in whatever program you were using.
 