LOL, Kyle should be putting SC2 into his benchmarks now.
I think you mean the SC2 menu lol
Because with plain double-buffered vsync, if you can't hold 60 fps you get cut to 30, and if you can't hold 30 you drop to 20, 15, and so on (the refresh rate divided by a whole number). Since Blizzard games tend to target lower-end hardware that gets poor framerates anyway, they don't want to make things worse. Vsync exists to fix screen tearing when you have too many frames; it's not a good solution for low framerates, so they wouldn't enable it by default.
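To put rough numbers on it, here's a toy sketch (my own math, nothing from the game) of how double-buffered vsync snaps the framerate to even divisors of a 60 Hz refresh:

```python
# Toy model: with only two buffers, a finished frame can't be swapped until the
# next vblank, and the renderer stalls until then, so the effective frame time
# rounds up to a whole number of refresh intervals.
import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms between vblanks

def double_buffered_vsync_fps(render_time_s):
    intervals = math.ceil(render_time_s / REFRESH_INTERVAL)
    return 1.0 / (intervals * REFRESH_INTERVAL)

for raw_fps in (75, 59, 45, 29):
    capped = double_buffered_vsync_fps(1.0 / raw_fps)
    print(f"{raw_fps} fps uncapped -> {capped:.0f} fps with vsync")
# 75 -> 60, 59 -> 30, 45 -> 30, 29 -> 20
```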
That and vsync can introduce awful input lag. It would be flat out terrible for a game like this.
I can see how maxing out the GPU at 100% might cause failures. Just run Furmark for a while and watch your temperatures; on many cards they'll rise above safe levels. Many cards just aren't designed for 100% sustained load. My 4870's VDDC temps get up to 120C in Furmark unless the fan is maxed out (which it won't be unless I manually set it to 100%). However, I don't have SC2, so I have no idea what it's like.
You expect to see rising temperatures as soon as the game loads, but the concern is the menu temperatures. Temperatures drop very fast when you minimise a game, which is why you need something that graphs or logs the results, so you can see what the temperature was a few seconds ago while the game was still maximised. You just need to check those temperatures aren't higher than what is safe for the card (I don't know what temperatures are safe for a GTX 480; I don't own one and it's out of my price range, so I haven't investigated. Maybe someone else can chime in on that).
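If you don't have a graphing tool handy, something as simple as this works. It's just an example assuming an NVIDIA card with nvidia-smi on the PATH (ATI owners would use GPU-Z's log-to-file instead), and the filename is whatever you like:

```python
# Minimal GPU temperature logger: polls nvidia-smi once a second and appends a
# timestamped reading to a CSV you can chart afterwards. Assumes an NVIDIA card
# and nvidia-smi available on the PATH; stop it with Ctrl+C.
import csv
import subprocess
import time

LOG_FILE = "gpu_temps.csv"   # example filename
POLL_SECONDS = 1.0

with open(LOG_FILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "temp_c"])
    while True:
        temp = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        writer.writerow([time.strftime("%H:%M:%S"), temp])
        f.flush()  # keep the file current so nothing is lost if you kill it
        time.sleep(POLL_SECONDS)
```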
Have you checked the temperatures with the framerate not capped in the menu?
The notion that some GPUs would reliably fail in Furmark because they're being "artificially stressed" past their designed TDP is one thing. The idea that they wouldn't throttle back clock speed based on temperature, as has been done since the Pentium 4 days, is another.
But an automatic fan throttle that doesn't reach 100% power at 120C? What is the 100% power setting *for*?
On a side note, my old ATI 9800 Pro (the ancient AGP beast that was a good card for its time) was a victim of the WoW logon screen. I went to eat and got disconnected from the game, came back about 30 minutes later, and the card was dead. The thing is, I even had an aftermarket cooler on it (Arctic Cooling or some such, I forget the specifics); maybe it was just its time to die.
not true with triple buffering, which sc2 uses by default to sync frames. not sure why people still think this; pretty much every game made within the last 10 years that has vsync as an option will use triple buffering. there's no input lag, and you're not limited to multiples of the refresh rate in any way.
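same toy model as the post above, extended with a third buffer. this is my own averaged sketch, not sc2's actual swap chain, but it shows why the 60/30/20 steps disappear:

```python
# With a spare back buffer the renderer never stalls waiting for the swap: the
# display just shows the newest finished frame at each vblank, so on average you
# get the render rate, capped by the refresh rate, instead of refresh/N.
import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def double_buffered_fps(render_time_s):
    # renderer stalls until the next vblank once both buffers are in use
    return 1.0 / (math.ceil(render_time_s / REFRESH_INTERVAL) * REFRESH_INTERVAL)

def triple_buffered_fps(render_time_s):
    # renderer keeps drawing; displayed rate is simply capped at the refresh rate
    return min(1.0 / render_time_s, REFRESH_HZ)

for raw_fps in (75, 55, 45):
    rt = 1.0 / raw_fps
    print(f"{raw_fps} fps uncapped: double={double_buffered_fps(rt):.0f}, "
          f"triple={triple_buffered_fps(rt):.0f}")
# 75: double=60 triple=60 | 55: double=30 triple=55 | 45: double=30 triple=45
```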
maybe it's because people are not used to benchmarking their games while they play, and someone back in the 90's said "vsync is bad, don't use it if you want to be pro". try it yourself and see: I run the game on ultra with vsync and hover between 50-60 fps all game. same exact framerates with it off, except in low-demand scenes where it shoots up into the hundreds.
There are two issues here. One is the fact that the GPU overheated in the first place, a sign of what you stated above. Two is the fact that Blizzard lets the menus be drawn as fast as possible, unchecked. I don't think Blizzard or gamers wanted to Furmark their system every time they went into SC2's menus. That's why they should fix it.
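All it would take is a trivial cap on the menu loop, something along these lines (a sketch with made-up function names, obviously not Blizzard's actual code):

```python
# Sleep-based frame limiter for a menu loop: render one frame, then sleep away
# the rest of the frame budget instead of redrawing a static screen as fast as
# the GPU allows. render_menu_frame() and menu_is_open() are placeholders.
import time

MENU_FPS_CAP = 60.0
FRAME_BUDGET = 1.0 / MENU_FPS_CAP

def run_menu(render_menu_frame, menu_is_open):
    while menu_is_open():
        start = time.perf_counter()
        render_menu_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # idle the GPU for the remainder
```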
The problem is that there's still enough input lag for some people to notice it. Vsync is just plain bad in my opinion, regardless of double or triple buffering. I've never played a game with vsync in double or triple buffering where I thought the input lag introduced was acceptable.
I would challenge anyone to a blind test between the two, on any system/monitor capable of maintaining a more than reasonable frame/response rate. people will put up with all sorts of tearing just to say they're pro enough to notice a difference in input delay, but at this point I think it's just placebo; with the advances we have today it's not even a pepsi challenge tbh. and if your system is struggling, then that's a different problem that playing in or out of sync won't fix.