After this latest development, I'm almost convinced that this is a software issue. I just had one of those "black-outs", but this time I minimized the game immediately after it happened, and there was a warning message by my taskbar saying "Display Driver Stopped Responding and Has Recovered."...
Thanks, but this happens while playing modern and demanding games.
EDIT: I set "Prefer maximum performance" in NVIDIA Control Panel, but it didn't help: the card is still running at 2D clocks even when playing Crysis and such.
Sometimes, when playing a game (any game, really), the screen goes dark for a couple of seconds. After it comes back, there is a pretty sharp drop in FPS. The reason? The GPU starts running at 2D clock speeds. The only fix I've found is restarting the PC.
I'm running Windows 7...
I always have vsync turned off because it causes considerable input lag for me. I barely ever even notice tearing, and when I do, it doesn't really bother me. The only game I play with vsync on is Braid. I don't know, that game just feels better with it on.
I also don't like sacrificing my fps.
Well, the file apparently isn't working. I extracted the files (there are three), but when I try to open any of them, it says "Blabla has stopped working, Windows will close the program blabla". :(
EDIT: It's a self-extracting archive.
No, I wasn't being sarcastic :P. Thanks for the link. I've downloaded the F4 file and I'm going to try it after I'm finished writing this post.
About the SpeedStep thing: for me, it reduces the multiplier, not the clock speed, and both the clock speed and the multiplier are always visible. Would be...
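For context (and to sanity-check readings like this), the effective core clock is just bus speed times multiplier, so when SpeedStep drops the multiplier, the effective clock drops with it even though the bus speed stays put. A quick sketch; the 333MHz bus and 9x/6x multipliers are made-up example numbers, not actual readings:

```python
# Effective core clock = bus speed (FSB/BCLK) x multiplier.
# SpeedStep leaves the bus alone and lowers the multiplier at idle.

def core_clock_mhz(bus_mhz, multiplier):
    """Return the effective core clock in MHz."""
    return bus_mhz * multiplier

bus = 333.0        # MHz; assumed example value, stays constant
load_mult = 9.0    # assumed multiplier under load
idle_mult = 6.0    # assumed multiplier once SpeedStep kicks in

print(f"Load: {core_clock_mhz(bus, load_mult):.0f} MHz")  # 2997 MHz
print(f"Idle: {core_clock_mhz(bus, idle_mult):.0f} MHz")  # 1998 MHz
```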
This whole nVidia rebranding horseshit began with the success of the 8800GT, right? I mean, nVidia makes great GPUs, but their naming scheme is downright purposefully misleading.
I have my 920 set to 150x20, but after a shutdown it always seems to revert to 142x20. It still says 150x20 in the BIOS, though. Another interesting thing I noticed: Computer Information and dxdiag always seem to say that my CPU runs at ~2.80GHz, no matter what speeds I set.
My Mobo is...
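For what it's worth, here's the plain arithmetic on the numbers above (just BCLK times multiplier, nothing measured):

```python
# BCLK x multiplier for the two readings mentioned above.
settings = {
    "Set in BIOS":    150 * 20,  # 3000 MHz
    "After shutdown": 142 * 20,  # 2840 MHz
}
for label, mhz in settings.items():
    print(f"{label}: {mhz} MHz ({mhz / 1000:.2f} GHz)")

# 2840 MHz rounds to ~2.8 GHz, which matches what dxdiag shows, so those
# tools may simply be reporting the reverted 142x20 speed (a guess).
```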
I read about PAE (Physical Address Extension), but I didn't really get it. It says that every CPU above the Intel Pentium Pro supports it. Somehow it's supposed to make a 32-bit system support up to 64GB of RAM? Maybe I'm just stupid, but could someone explain this to me in layman's terms? I mean, what's...
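The part I did manage to piece together is the raw arithmetic: PAE widens physical addresses from 32 bits to 36 bits, which is where the 64GB figure comes from. Just the textbook 2^n math:

```python
# Bytes addressable with n-bit physical addresses: 2**n.
GB = 1024 ** 3

without_pae = 2 ** 32  # plain 32-bit physical addressing
with_pae    = 2 ** 36  # PAE extends physical addresses to 36 bits

print(f"32-bit: {without_pae // GB} GB")  # 4 GB
print(f"PAE:    {with_pae // GB} GB")     # 64 GB
```

As far as I understand, each individual process still gets a 32-bit virtual address space; PAE only widens what the OS can address physically.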
Here's the deal: I've got 2x2GB of DDR3 RAM and a GTX285. My System Information says that I've got 3.25GB of RAM. Why not 3.00GB? Isn't it 4GB - 1GB (VRAM)? Or am I missing something here?
EDIT: I tried running dxdiag and it says that I have 3326MB of RAM. What the hell's going on?
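Having read around a bit since, my understanding is that a 32-bit system has 4GB of address space total, and device regions (the GPU's memory aperture plus other MMIO) get carved out of it; only what's left shows up as RAM. Working backwards from the dxdiag number (the 770MB figure below is inferred, not looked up):

```python
# 32-bit address space is 4096 MB total; whatever the devices reserve
# (GPU aperture, other MMIO) is subtracted from the RAM Windows reports.
address_space_mb = 4 * 1024   # 4096 MB
reported_ram_mb  = 3326       # the dxdiag figure from the post above

reserved_mb = address_space_mb - reported_ram_mb
print(f"Reserved by devices: {reserved_mb} MB")        # 770 MB
print(f"Usable RAM: {reported_ram_mb / 1024:.2f} GB")  # ~3.25 GB
```

So it isn't 4GB minus the full 1GB of VRAM: the card only maps an aperture (a window into its VRAM) rather than the whole thing, which together with the other device regions apparently comes to about 770MB here.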
Yeah, my XFX GTX285 does that too. I noticed it happens in game menus/company splashes and in ATITool. Another person in this thread opened my eyes by saying that it happens when extremely high framerates are being displayed.
This isn't something to worry about, right?
Hehe, maybe I should consider myself lucky because I barely ever even notice tearing, and when I do, it doesn't bother me.
Well, you don't really *have* to check for artifacts with a tool; games are probably the best test, but I would still recommend running a tool as well to avoid...
I was intimidated by CPU overclocking at first too, but after reading up a little on here, I did just fine.
So yeah, if you want even higher framerates, then overclock your CPU, turn vsync off, and maybe even add more RAM.
Nah, I would get CPU lag in Left 4 Dead and SupCom. It wouldn't be anything too terrible, but you could definitely feel it. I use a G15 keyboard to monitor CPU/RAM usage.
I would have to disagree. My E6600 @ 3.3GHz was bottlenecking my 8800GTS 512. I'd get 100% CPU usage in Left 4 Dead, and don't even get me started on more CPU-intensive games like SupCom. Of course, the bottleneck wasn't very big, but still. Now keep in mind that a GTX280 is at least twice as...