My 8800gt is poopy.

Oh4Sh0

2[H]4U
Joined
Mar 13, 2005
Messages
2,744
:( My computer started freezing during Call of Duty 4 a few times lately.. the screen would change funky colors.. and then freeze a sec later. Figured it was the video card, so I ran ATITool in combination with another GPU stresser and it reproduced my fears. According to ATITool and nVidia Monitor, the core doesn't even reach 70C though. Is it just shitty VRAM?
 

Have you overclocked anything on the machine lately?
Have you tampered with something?
What do the other temperature readings state, for the CPU etc.?
 

1. Nope, the Q6600 and 8800gt are both stock clocked.
2. Nope.. everything's as it was.. I did tidy up the wiring a bit.. but as the progs have shown, temps are good.
3. Both nMonitor and ATITool show 68-69C for the GPU when it's stress testing (just prior to freezing). Running Prime95, the 4 cores of the Q6600 are in the upper 50s. At idle, they're 30-36.
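A rough sketch of that monitoring setup, just the idea of logging temps while the stress test runs instead of eyeballing them. This assumes a box with Python and NVIDIA's nvidia-smi tool on the PATH (a modern stand-in, not what nMonitor or ATITool were actually doing back then):

Code:
import subprocess, time

def gpu_temp_c():
    # Ask the driver for the current GPU core temperature in Celsius
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

start = time.time()
while time.time() - start < 600:              # watch for 10 minutes
    print(f"{time.time() - start:6.1f}s  {gpu_temp_c()} C")
    time.sleep(5)                             # poll every 5 seconds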
 
I would try unplugging any unused HDDs or CD-ROM drives to see if it is a power issue (assuming you don't have a different PSU to test with). If possible, see if you can reproduce the problem in a friend's computer.

If it doesn't work, start the return process ASAP.

edit: having seen the PSU you have now, I would test the card in another system if possible.
 
Exactly the same problem I have (in another topic). What card is it specifically? And what mobo do you have?
 
Well, using a software utility (Everest Ultimate) it says my +12V rail is @ 10.40V at idle.. I don't think I'd even be in Windows if that were the case.. damn you, software monitors. No matter how many times I "refresh" it, it stays at the same value too.. lol. I have another system with a 2900 Pro in it. I'll switch the cards out.. if it can't handle the 8800gt, the 2900 Pro will definitely bring it to its knees.
 
Well, I threw in the 2900 Pro.. Everest is still reporting the same 10.4V on the +12V rail. Booted it up and used RivaTuner (cuz the ATITool sliders won't work.. Vista issue I'm guessing, they worked on my XP rig) to overclock it from 600/800 to a modest 815/890. It should be using enough power now to rival that of an African country. I can tell by the fan :D

Anyhow, I'm stressing it now, and it's running all the live-long day. Must be the 8800gt. :(
edit: I also wanna throw it out there that I put it through 3DMark06 and it successfully beat the 8800gt by about 200 pts. It is OC'd though. -.-
 


I had the same kind of problem with ALL FIVE 8800 GTs I owned. For me, it was a scaling issue with my monitor if I used anything but my LCD screen's native res of 1440 x 900.

When I switched to "Use NVIDIA Scaling", I was able to stop the COD 4 freezes at lower res like 1280 x 720, etc.


Not that an 8800 GT couldn't handle any game just fine at 1440 x 900; I was just irked that such a problem even existed. Of course, I didn't discover that it was a scaling issue until AFTER I plunked down another 80 bucks for a power supply that I didn't need.


I don't know if it was the driver or my monitor...or the fact that it frustrated me for weeks, but I ended up hating the 8800 GT, period, and picked up an HD 2900 512-bit. No driver issues...no scaling issues. No major performance difference from the 8800 GT...for 100 dollars less than an 8800 GT was at that time.

At $160, those HD 2900 Pros were and are the best video cards ever.
 

Are you kidding me? No major performance difference? Try 50-65% of the GT's performance.

That's not really a bargain for a card with 50-65% of the GT's performance at 65% of the price.

If you don't believe me, here's a random review I found which happened to have both cards in it: http://www.tweaktown.com/reviews/1210/1/page_1_introduction/index.html
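Rough numbers on that, just to make the point concrete. The $160 Pro price and the 50-65% performance range are from this thread; the ~$250 8800 GT price is only an assumption for illustration:

Code:
# Back-of-the-envelope price/performance check
gt_price, pro_price = 250.0, 160.0     # assumed GT price; Pro price from the thread
gt_perf = 1.0                          # normalize the 8800 GT to 1.0
for pro_perf in (0.50, 0.65):
    gt_value = gt_perf / gt_price
    pro_value = pro_perf / pro_price
    print(f"Pro at {pro_perf:.0%} of GT perf: "
          f"{pro_value / gt_value:.2f}x the GT's perf-per-dollar")

At 50% of the GT's performance the Pro comes out behind on perf-per-dollar; at 65% it is roughly break-even, which is the "not really a bargain" point.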
 
Sr7, I don't mean to make this a 2900 Pro versus 8800gt thread.. but the 2900 Pro's stock clocks are 600/800. The core can easily OC 225MHz+ over its stock speed, and well over 2900 XT speeds. You don't buy the card and leave it stock. And with the clocks bumped, it royally whoops ass - and it was only $160. :) My 8800gt scored 11,100 in 3DMark06 stock with a stock Q6600 and Vista Ultimate. I dropped in the 2900 Pro, bumped the clocks a bit and got 11,300. Not a big difference, and the 8800gt is obviously stock. However, the 2900 Pro isn't artifacting, and I could probably bump it a bit more, that was just the first place I stopped the slider so I could do stress tests!! :p And it cost a helluva lot less.

And Og, I'm using 1280x1024 on standard-aspect 19-inchers, so unfortunately that's not the problem.
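For what it's worth, the quick math behind the post above, using the 600MHz stock core, the "225MHz+" headroom, and the two 3DMark06 scores quoted there:

Code:
stock_core, oc_core = 600, 825         # 600 stock plus the 225MHz+ headroom
score_8800gt, score_2900pro = 11100, 11300
print(f"Core OC: {(oc_core - stock_core) / stock_core:.1%} over stock")
print(f"3DMark06 gap: {(score_2900pro - score_8800gt) / score_8800gt:.1%}")

That works out to roughly a 37.5% core overclock for about a 1.8% lead in 3DMark06 over the stock 8800gt.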
 



Haha... what you fail to understand is, I did my OWN review. And there is little or no difference in real-world gaming, with the exception being Crysis on High and Very High.


The HD 2900 Pro will pull about 5-10 FPS less in the GPU benchmarks than an 8800 GT at those settings with the rig in my sig.

In the real world, as you play through the game, it doesn't add up to much difference.


The 8800 GT doesn't play through all that great on High anyway. I know people will beg to differ...some are happy with those dips into the 15-20 fps range during heavy fighting.

I played through the game on a mixture of settings that was acceptable to me, and both cards performed the same at those settings.

Actually, if you wanna throw an HD 3870 into the mix, which I also owned and used, it and the HD 2900 Pro I have (which is now flashed to an XT) actually play Crysis a bit better than an 8800 GT on straight Medium settings.


Is the HD 2900 Pro a better card than an 8800 GT? No.


For 160 bucks? It certainly is. Consider that before the 8800 GT or HD 3870 even came out, the best price I could find on a lowly X1950 Pro 512 MB was from Newegg for 160 bucks (about three to four months ago).


At the point I bought it, the HD 2900 Pro 512-bit was the best value I've ever seen in a video card.

These days, the 8800 GT 512s can be found under 200 with rebates...and that's pretty good.

But for my money and my enjoyment without any driver headaches or any other BS, the HD 2900 Pro is still tops.
 