A friend just had his 8800GT fried in StarCraft 2.

Because if you can't hold 60 fps, vsync cuts you to 30, and if you can't hold 30, it cuts you to 15, and so on. Since Blizzard games tend to target lower-end hardware that gets poor framerates anyway, they don't want to make things worse. Vsync exists to fix screen tearing when you have too many frames; it's not a good solution for low framerates, so they wouldn't enable it by default.
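Roughly, the math behind that looks like this (a quick C++ sketch, assuming a 60 Hz display and plain double buffering; the helper function is made up purely for illustration):

```cpp
// Rough sketch of why double-buffered vsync quantizes frame rate on a 60 Hz
// display: a frame that misses a refresh has to wait for the next one, so the
// displayed rate snaps to 60/1, 60/2, 60/3, ... fps.
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Hypothetical helper: displayed fps for a given raw render rate.
double doubleBufferedFps(double renderFps, double refreshHz = 60.0) {
    double frameTime   = 1.0 / renderFps;   // seconds to render one frame
    double refreshTime = 1.0 / refreshHz;   // seconds per display refresh
    // The buffer swap can only happen on a refresh boundary, so each frame
    // occupies a whole number of refresh intervals.
    double intervals = std::ceil(frameTime / refreshTime);
    return refreshHz / intervals;
}

int main() {
    for (double fps : {75.0, 59.0, 45.0, 16.0}) {
        std::printf("render %.0f fps -> displayed %.0f fps\n",
                    fps, doubleBufferedFps(fps));
    }
    // Prints: 75 -> 60, 59 -> 30, 45 -> 30, 16 -> 15
}
```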

That, and vsync can introduce awful input lag. It would be flat-out terrible for a game like this.

not true with triple buffering; sc2 uses it by default to sync frames. not sure why people still think this, since pretty much every game made in the last 10 years that has vsync as an option uses triple buffering. there's no input lag, and you're not limited to frame-rate multiples in any way.

maybe it's because people aren't used to benching their games while they play, and someone back in the 90's said "vsync is bad, don't use it if you want to be pro". try it yourself and see: I run the game on ultra with vsync and hover between 50-60 fps all game. same exact framerates with it off, except in low-demand scenes where it shoots up into the hundreds.
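to contrast with the double-buffered math a few posts up, in a simplified model triple buffering just clamps to the refresh rate (a rough c++ sketch, not how any real driver implements it):

```cpp
// simplified contrast: with triple buffering the gpu always has a spare back
// buffer to render into, so in this idealized model the displayed rate is the
// render rate clamped to the refresh rate, with no snapping to 30/20/15.
#include <algorithm>
#include <cstdio>

double tripleBufferedFps(double renderFps, double refreshHz = 60.0) {
    return std::min(renderFps, refreshHz);
}

int main() {
    // e.g. a scene rendering at 52 fps: double buffering would show 30,
    // triple buffering shows ~52.
    std::printf("%.0f fps\n", tripleBufferedFps(52.0));
}
```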
 
I can see how maxing out the GPU at 100% might cause failures. Just run Furmark for a while and watch your temperatures; on many cards they'll rise above safe levels. Many cards just aren't designed for 100% sustained load. My 4870's VDDC temps get up to 120C in Furmark unless the fan is maxed out (which it won't be unless I manually set it to 100%). However, I don't have SC2, so I have no idea what that's like.

The notion that some GPUs would reliably fail in Furmark because they're being "artificially stressed" and exceed their designed TDP is one thing. The idea that they wouldn't throttle back clockspeed based on temperature, as has been done since the Pentium 4, is another.

But an automatic fan throttle that doesn't reach 100% power at 120C? What is the 100% power setting *for*?
 
You expect to see rising temperatures as soon as the game loads, but the concern is the menu temperatures. Temperatures drop very fast when you minimise a game, which is why you need something that graphs the results, so you can see what the temperature was a few seconds ago when the game was still maximised. You just need to check those temperatures aren't higher than what is safe for the card (I don't know what temperatures are safe for a GTX 480; I don't own one and it's out of my price range, so I haven't investigated it ;) Maybe someone else can chime in on that).
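If you'd rather roll your own than use RivaTuner's logging, the idea is just timestamped samples you can graph afterwards (a rough C++ sketch; readGpuTempC() is a made-up stand-in for whatever sensor readout your monitoring tool exposes):

```cpp
// Minimal sketch of the "log it so you can graph it" approach: sample the
// temperature once a second and append timestamped readings to a CSV.
#include <chrono>
#include <fstream>
#include <thread>

double readGpuTempC() { return 71.5; }  // stub sensor reading for the sketch

int main() {
    std::ofstream log("gpu_temps.csv", std::ios::app);
    log << "seconds,temp_c\n";
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < 60; ++i) {      // one minute of samples
        double elapsed = std::chrono::duration<double>(
            std::chrono::steady_clock::now() - start).count();
        log << elapsed << ',' << readGpuTempC() << '\n';
        log.flush();                    // keep the file current if the app dies
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}
```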

Okay, so I set my GTX 480 fan to 65%; it idles at about 43C, and as soon as I fired up the game I saw it rocket to 70C. I didn't want to stay at the menu for long, but I think it would have gone higher. I went back, set the fan speed to 70%, and went straight into the game, played for about an hour, and checked the temp immediately after quitting; it was still around 70C. I guess as long as I crank the fan up to 70%, I'll be fine.
 
The notion that some GPUs would reliably fail in Furmark because they're being "artificially stressed" and exceed their designed TDP is one thing. The idea that they wouldn't throttle back clockspeed based on temperature, as has been done since the Pentium 4, is another.

But an automatic fan throttle that doesn't reach 100% power at 120C? What is the 100% power setting *for*?

The VDDC is the VRMs, and those Volterras run hot as shit. I was once told by Unwinder in a thread somewhere that their upper limit is about 150C; I wouldn't feel comfortable with that personally. Mine go over 90C all the time and I have yet to see any issues.


I would love to know which driver people are using with this "problem".
 
This might be anecdotal, but my 8800GT's fan spins up and whines like crazy when I play SC2.
 
A little off topic, but I found that my 5850 does the same thing on the DoW II "victory" screen, where it's grading your enemies killed/resilience/speed of completion. Maybe not as bad as Starcraft, but my fan ramps up until I exit that screen.
 
uhhhhhhhhhh guys, it's a known issue and becomes a non-issue with updated drivers.
 
I notice a similar oddity with the DoW II victory screen - it manages to make my card run at just the right frequency to produce a high-pitched chirping noise. Fortunately, none of the other games I have do it.

On a side note, my old ATI 9800 Pro (the ancient AGP beast that was a good card for its time) fell victim to the WoW login screen. I went to eat and got disconnected from the game, came back about 30 minutes later, and the card was dead. The thing was, I even had an aftermarket cooler on it (Arctic Cooling or some such, I forget the specifics); maybe it was just its time to die.
 
But an automatic fan throttle that doesn't reach 100% power at 120C? What is the 100% power setting *for*?

Well, I think the problem is that fan speed is based on GPU temperature, which isn't really that high. It's the VRM temperatures that don't appear to affect fan speed but get really hot.
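For illustration, the fix would be something like a fan curve that reacts to whichever sensor is hotter (just a C++ sketch with made-up thresholds, not any vendor's actual curve):

```cpp
// Sketch of the gap described above: if the fan curve only looks at the GPU
// core temperature, the VRMs can cook while the fan stays low. Feeding the
// curve the hotter of the two sensors avoids that.
#include <algorithm>

int fanPercentFor(double coreTempC, double vrmTempC) {
    double t = std::max(coreTempC, vrmTempC);  // react to whichever runs hotter
    if (t < 50.0)  return 40;
    if (t < 70.0)  return 60;
    if (t < 90.0)  return 80;
    return 100;                                // flat out above 90C
}

// e.g. fanPercentFor(65.0, 115.0) == 100, even though the core looks fine
```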

On a side note, my old ATI 9800 Pro (the ancient AGP beast that was a good card for its time) fell victim to the WoW login screen. I went to eat and got disconnected from the game, came back about 30 minutes later, and the card was dead. The thing was, I even had an aftermarket cooler on it (Arctic Cooling or some such, I forget the specifics); maybe it was just its time to die.

I think one problem with many aftermarket cooling solutions is that they keep the GPU cool, but the pathetic little RAM sinks they include are actually worse than the stock heatsink at cooling the rest of the card.
 
so I did some offline testing with the game engine, and took some screens to show how it renders certain scenes, how this relates to vsync/frame caps, and how your card might get overheated when idling in certain places. this is run with the rig in my sig, which is probably one of the most common setups around today: last-gen mid-range everything on a 60hz 22" widescreen lcd at 1680 res. the gtx260 is an evga oc edition with stock clocks/cooling on 257.21 drivers, plus conroe dualies oc'd to 3.2ghz, all settings ultra with no changes to ini configs.

the osd shows gpu/vram metrics on top, framerates next to the D3D9 label, and cpu/ram/pagefile usage on the bottom. the (-1024) offset is for a ramdisk; since I'm running 4gb on xp-x86, I have used this to reclaim some of that lost memory, with 3gb left over for system use. the osd is produced by rivatuner, with the realtemp and perfmon plugins, which you can also set to log over time.

an interesting thing to note is that with vsync on, a frame buffer is only applied to in-game rendering, but not to the 3d scenes around the ship. this is a good example of the difference between buffered and unbuffered vsync. here is a nice shot of the armory, with the huge unit models moving about.

vsync off, so the card is going at full bore, which happens to be 41 fps:



vsync on, notice the gpu is unable to stay at 60 fps, so it gets capped at 30:



vsync off, much less demanding when focused on the viking:



vsync on, flexes the badass mech servos with ease, at 60 fps:



now in game, with vsync on, where a triple buffer is applied to the dx9 rendering. here you can see some variation in effects, with a few units in the background, some particle effects around that bigass laser, terrain, fog, etc. in the shot below. notice frames dropping to the low 50's here, and going as low as the upper 40's in places with many more units around. that is made possible by triple-buffered vsync, which also prevents it from going above 60 fps in less complicated scenes. frame rates are the same without vsync for anything under 60 fps.



this is the one situation I found that's most likely to cause heat problems: the computer consoles of the ship, where players are likely to spend some time and/or idle. here we have only 2d ui overlays, with a few static animations for the unit models. your card can't drop to 2d clocks, but it's not doing much, so frame rates will pointlessly skyrocket with no vsync enabled.

vsync off, where it goes past 300 fps:



vsync on, 60 fps cap:



notice the gpu temps are still at a relatively cool 64 deg in the 300 fps shot, after letting it idle for a minute or so. this is not enough to trip my fan profiles to ramp up the speed, and the fans are going at only about 60%, so driver efficiency and good cooling are probably doing their job here. the other screens that show 3d scenes, where the temps are even lower, are the ones that actually work the card and ramp the fan speed up to 80% or more.

so my theory is that only people with shitty cooling, really dirty fans, poor contact on thermal grease/tape, or maybe one of those bugged driver releases from nv are going to have any problem at all, and only if they happen to spend lots of time on research/upgrades in the 2d console screens.
 
not true with triple buffering; sc2 uses it by default to sync frames. not sure why people still think this, since pretty much every game made in the last 10 years that has vsync as an option uses triple buffering. there's no input lag, and you're not limited to frame-rate multiples in any way.

maybe it's because people aren't used to benching their games while they play, and someone back in the 90's said "vsync is bad, don't use it if you want to be pro". try it yourself and see: I run the game on ultra with vsync and hover between 50-60 fps all game. same exact framerates with it off, except in low-demand scenes where it shoots up into the hundreds.

The problem is that there's still enough input lag for some people to notice it. Vsync is just plain bad in my opinion, regardless of double or triple buffering. I've never played a game with vsync in double or triple buffering where I thought the input lag introduced was acceptable.

There are two issues here. One is the fact that the GPU overheated in the first place, a sign of what you stated above. Two is the fact that Blizzard allows the menus to be drawn as fast as possible, unchecked. I don't think Blizzard or gamers want to Furmark their systems every time they go into SC2's menus. That is why they should fix it.
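Even a simple sleep-based frame cap on the menus would do it (a rough C++ sketch; renderMenuFrame() is a made-up placeholder, not anything from Blizzard's code):

```cpp
// Sketch of a menu frame cap: if a frame finishes early, sleep until the
// target frame time has elapsed instead of immediately rendering again.
#include <chrono>
#include <thread>

void renderMenuFrame() { /* placeholder: draw the menu */ }

void menuLoopCappedAt(double maxFps) {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration<double>(1.0 / maxFps);
    while (true) {                              // real code would check for exit
        auto frameStart = clock::now();
        renderMenuFrame();
        auto elapsed = clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
    }
}

// e.g. menuLoopCappedAt(60.0) keeps the GPU from spinning at 300+ fps in menus
```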

Sure, they can do that, but it definitely isn't getting to the root of the problem. I can write up an OpenGL program to render a triangle at 3000 FPS, and it won't kill my graphics card. The bottom line is that either the drivers that control the fan speeds are bad or the hardware was poorly designed.
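For what it's worth, that uncapped-triangle test looks roughly like this (a minimal GLFW/legacy-OpenGL sketch, assuming you link against glfw and the system GL library; glfwSwapInterval(0) is what turns vsync off):

```cpp
// Render one triangle as fast as the GPU allows and print the frame rate once
// a second. A healthy card with working fan control should shrug this off.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(640, 480, "uncapped triangle", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(0);                       // vsync off: no frame cap

    int frames = 0;
    double last = glfwGetTime();
    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);                 // legacy immediate mode, fine for a sketch
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();
        glfwSwapBuffers(win);
        glfwPollEvents();

        ++frames;
        double now = glfwGetTime();
        if (now - last >= 1.0) {               // report fps once a second
            std::printf("%d fps\n", frames);
            frames = 0;
            last = now;
        }
    }
    glfwTerminate();
}
```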
 
The problem is that there's still enough input lag for some people to notice it. Vsync is just plain bad in my opinion, regardless of double or triple buffering. I've never played a game with vsync in double or triple buffering where I thought the input lag introduced was acceptable.

I would challenge anyone to a blind test between the two, on any system/monitor capable of maintaining a more than reasonable frame/response rate. people will put up with all sorts of tearing just to say that they're pro enough to notice a difference of input delay, but at this point I think it's just a placebo, with the advances we have today it's not even a pepsi challenge tbh. and if your system is struggling, then that's a different problem playing in or out of sync won't fix.
 
I would challenge anyone to a blind test between the two, on any system/monitor capable of maintaining a more than reasonable frame/response rate. people will put up with all sorts of tearing just to say that they're pro enough to notice a difference of input delay, but at this point I think it's just a placebo, with the advances we have today it's not even a pepsi challenge tbh. and if your system is struggling, then that's a different problem playing in or out of sync won't fix.

There's a very distinct difference if you know what to look for. I have found most people simply don't know what to look for, so they conclude that there's no input lag, which is very wrong. Just by the principle of buffering itself there's input lag; there's no denying that. But to say that you won't be able to notice it is just ignorant. Maybe World of Warcraft doesn't support triple buffering, but I've tried it with vsync on and there's a ridiculous amount of input lag that I can't accept, and this is in the old world where I can get 300 FPS uncapped.

The fact of the matter is, vsync or not, frame-rate limiting or what-have-you, there is nothing Blizzard can do that will solve the root of the problem, which is either bad drivers or poor hardware design. The only thing people can do to reduce the chance of this happening (regardless of the game/app) is to make sure their fans and heatsinks are clean and there's adequate airflow. This really is a hardware vendor problem, not Blizzard's.
 
I have an 8800GT 512MB and have been playing the game since beta with no issues at all... y'all fail.
 
I would challenge anyone to a blind test between the two, on any system/monitor capable of maintaining a more than reasonable frame/response rate. people will put up with all sorts of tearing just to say that they're pro enough to notice a difference of input delay, but at this point I think it's just a placebo, with the advances we have today it's not even a pepsi challenge tbh. and if your system is struggling, then that's a different problem playing in or out of sync won't fix.

Vsync isn't the only factor that causes input lag. I never noticed input lag with vsync on my old 1ms TN panel, but on my 8ms PVA, trust me, the difference is noticeable. Although the tearing also isn't as noticeable on the PVA. Online lag can add to this too. So when combined with other factors, vsync can make a noticeable difference.
 
um, I just loaded up starcraft 2, and at the menu my frame rate is locked to 63 fps with vsync off.
 