Post your 6800 GT OC!

Status
Not open for further replies.
OK... Just got back from my CompUSA. Turns out that despite what the website said, they had 1 left. Exchanged my bad one for a new one, and I'm happy to say that it works perfectly! Runs at Ultra speeds, 400/1.1...

Greg
 
I'm at 430/1110; I haven't pushed it to the max yet. Scored 13,000 on 3DMark03 with that.
 
JediFonger said:
by the time SM3 is found in virtually all major first person shooter games, we'll all need to upgrade again. ATI made the right move in that respect. besides, in the fall release of the x800 they'll probably put SM3 in there. in the vid card industry, nothing lasts more than a couple of years, especially for hard core gamers. it's all bragging rights.

SM 3.0 is not a matter of just "slapping it in there" when convenient. NVIDIA will have more experience with it and will do it better by the time ATI gets around to it. ATI made the wrong move. They didn't omit SM 3.0 because it "isn't needed now"; they omitted it because they didn't want to invest, and banked on NVIDIA not investing either. The reason ATI says you don't need it is because THEY DON'T HAVE IT. Just like 3dfx saying you don't need 32 bit color. Look where that got them. ATI is *behind* in technology now no matter how you slice it. Look at how much more efficient and faster clock for clock NV40 is. A 370/1000 GT can smoke a 520/1120 XT PE in nearly anything OpenGL. A 400/1100 Ultra can beat the 520/1120 XT PE in over half the D3D benchmarks right now. ATI is *severely lacking* in technology with their overclocked 9800 PRO (X800).

JediFonger said:
i was gonna get a gf anyway but decided not to because i don't game as much as i used to and i want good image quality. i also watch/encoding video stuff and ATI has the edge with real/divx-hardware acceleration. i think that's one of the best things since sliced bread. if gf supported i would jump ship.
You're saying the GF doesn't give good image quality? LOL.
 
Just an update...

Got it to 429 / 1.15 with load at 84c... I think I may have a bit more room, but I'll keep it there for now. I have got to say, these cards ROCK!!! :D
 
420/1.12 right now. How do you guys test load temps? I wanna see how mine is and if I've got more headroom.
 
|MaguS| said:
420/1.12 right now. How do you guys test load temps? I wanna see how mine is and if I've got more headroom.

The only accurate way is to run rthdribl at at least 1024x768, open the temp monitor while it's running, and watch. Exiting a game and then checking will not be accurate, as the chip temp falls drastically the second you exit the game.
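The poll-while-under-load idea above can be sketched in a few lines. This is purely illustrative: the cards in this thread predate today's software monitoring tools, so I'm assuming a modern driver whose `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader` prints one bare temperature per poll; the function names here are hypothetical.

```python
# Sketch: track the peak GPU temperature while a load test (e.g. rthdribl)
# runs, instead of checking after exiting (the temp drops the second you quit).
# Assumes modern `nvidia-smi` CSV output; illustrative only for 6800-era cards.
import re

def parse_temp(line: str) -> int:
    """Pull the integer temperature out of one polled output line."""
    match = re.search(r"\d+", line)
    if match is None:
        raise ValueError(f"no temperature found in {line!r}")
    return int(match.group())

def peak_temp(samples):
    """Return the hottest reading from a series of polled lines."""
    return max(parse_temp(s) for s in samples)

# Canned readings standing in for live polls taken during a 5-minute run:
readings = ["62", "75", "84", "83"]
print(peak_temp(readings))  # 84
```

The point of keeping the max over the whole run is the same one made in the post: only the reading taken *while* the load is active reflects the real load temp.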
 
strossos said:
Right now I'm at 424/1.15 rock steady at 55-60 idle, and 66 max.

BFG 6800GT OC :)

Run rthdribl at 1024x768 and your temp will go to 75-85C if air cooled :p Post a screenshot after it's been running for 5 minutes.
 
Running perfectly fine at 420/1100. It actually detected higher, but I figured it's running hot enough as it is, and with my slow CPU it won't have much effect. Idle temp is 60, and it raised my case temp 6 degrees at idle! Very hot card, but after running a GeForce 4, Far Cry is freaking beautiful. My roommate plans to upgrade now that he's seen this in action!
 
Is it possible that you have to kind of break these cards in to make them run at a higher overclock? I was having trouble two days ago overclocking this card at all; I was only getting 375/1020. But I went home yesterday, played some games, went to overclock again, and now I'm up to 400/1070, much better than when I got it. That makes me happy. If I use it more, do you think it might clock even higher?
 
I can't get past 385 without getting artifacts, but I'm happy with my BFG 6800 GT at stock.
 
I can't seem to get mine past 375 without artifacts :(, 1050 mem. I have a BFG 6800 GT, AGP voltage 1.55 default, an Antec True 380 PSU with 2 optical drives and 1 SATA drive, nothing else. Did I just get a dud? Or is my PSU not powerful enough despite meeting the requirements? My CPU is OC'd from 2.6 to 3.12; I think I'll try default clocks and then OC to see if the CPU is the problem or is sucking too much power.
 
I'm sure for me it's not the power supply; I have an Antec True Power 550 watt. I just think I got a card that doesn't overclock well, but like I said, it runs great at stock, so I'm happy :D .
 
Can you guys please start posting what power supplies you are using?

I have an old Enermax 350watt and I am curious how my BFG 6800 GT OC will, er, OC.
 
Well, you will probably get better results with a stock CPU. I had my CPU OC'd from 2.6 to 3.12 and had to up the vcore to 1.6v to get it there. I lowered it to 3.003 and it runs stable with stock vcore, and now my video card RAM goes to 1.15 instead of 1.05!! My GPU is still not too high, but 384 is better than 370. I may try the default clock speed for my CPU and see how high I can get my vid card. It all depends on how much more OC I can get out of it; if it only OCs a bit more, I'll go back to what I have now.
 
behind in what? list the games that have sm3 (ok, i'll give you far cry). after that, the number of games with sm3 is only a handful. hell, i can't think of another. can you?

sm2 already has widespread support, so going with sm2 is the 'here and now': by the time sm3 becomes as prevalent as sm2, we'll already have the next gen r500/gf50. DUH!

3dfx did have their arse handed to them... but that's another story altogether. you say ATI is behind in technology, but most games today are too! it's like this: DVD-Audio and SACD are already out... are there enough compelling titles to warrant owning a player? not likely. same with sm3, there aren't enough games with that technology to justify owning a vid card that takes advantage of it.

link me those benchmarks where gf beats x800xt.

re: gf image quality. does gf hardware-assist real/divx?



evilchris said:
SM 3.0 is not a matter of just "slapping it in there" when convenient. NVIDIA will have more experience with it and will do it better by the time ATI gets around to it. ATI made the wrong move. They didn't omit SM 3.0 because it "isn't needed now"; they omitted it because they didn't want to invest, and banked on NVIDIA not investing either. The reason ATI says you don't need it is because THEY DON'T HAVE IT. Just like 3dfx saying you don't need 32 bit color. Look where that got them. ATI is *behind* in technology now no matter how you slice it. Look at how much more efficient and faster clock for clock NV40 is. A 370/1000 GT can smoke a 520/1120 XT PE in nearly anything OpenGL. A 400/1100 Ultra can beat the 520/1120 XT PE in over half the D3D benchmarks right now. ATI is *severely lacking* in technology with their overclocked 9800 PRO (X800).


You're saying the GF doesn't give good image quality? LOL.
 
i have a powerright 500 custom from frozencpu.com and it drives the bfg 6800gt to 434 core /1.13 mem. BOOOOYAH!

i don't run it there normally though, that's where it autodetects every time i run it. i keep it at around 420/1.10
 
You came to this post just to bash the GF6? You're an ass then; why did you bother posting?? This is for OVERCLOCKING the GeForce 6 results, not ATI vs. NVIDIA. I've had enough of the who's-better crap; they are so close it's impossible to tell, and if you really want a link, read the Tom's Hardware Far Cry 1.2 & SM3.0 patch article. The Ultra is just as fast as the X800 Pro, and the GT is only a little slower than the Ultra, so it scores pretty close to XT speeds. No, it does not beat the XT, but it can tie with it in several benchmarks because it has its full 16 pipelines, unlike the X800 Pro. Mainly the X800 XT only wins at max res, 4xAA, 16x aniso; otherwise the GF6800 Ultra beats it and the GT ties with the XT. If you want to argue 3 or 4 fps then you are stupid; that is still within the margin of error. There is a whole slew of games that will support SM 3.0; even HL2 is supposed to support it with, I believe, a patch/mod like Far Cry. Remember, you buy these cards to keep for a few years; I plan on keeping mine for 2 years or so, until DX 10 comes out. Why are you getting upset?? Look at the reasons: it's not because ATI has a bad design, it's because the GT is an underclocked Ultra, while the X800 Pro is left behind because it has 4 pipes disabled!!! That's 1/4 of its rendering power, and it shows; the X800 Pro is usually about 25~30% slower than the GT because of it, which makes perfect sense.
Please do not turn this into a flame war!!! And if you don't have a GF card, then don't bother posting or even listening to what other people say about the X800's. Heck, don't even listen to me: if you have that card, then to you it should be the greatest card in the world, no matter what.
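The "1/4 of its rendering power" arithmetic in the post above can be checked with the usual back-of-the-envelope formula: theoretical pixel fill rate = pipelines x core clock. This is a deliberate simplification (it ignores memory bandwidth, drivers, and the X800 Pro's higher core clock, which claws some of that deficit back); clock figures are the reference speeds cited in this thread.

```python
# Theoretical pixel fill rate = pipelines * core clock (MHz) -> Mpixels/s.
# Simplified: real throughput also depends on memory bandwidth and workload.
def fill_rate(pipes: int, core_mhz: int) -> int:
    return pipes * core_mhz

gt = fill_rate(16, 350)     # 6800 GT: 16 pipes at 350 MHz -> 5600
ultra = fill_rate(16, 400)  # 6800 Ultra: 16 pipes at 400 MHz -> 6400

# Disabling 4 of 16 pipes at the same clock removes exactly a quarter
# of the theoretical fill rate:
cut = 1 - fill_rate(12, 350) / fill_rate(16, 350)
print(gt, ultra, cut)  # 5600 6400 0.25
```

So at equal clocks, 4 disabled pipes really is a 25% cut in theoretical fill rate, which is where the post's figure comes from.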
 
Captain_Insano said:
Please do not turn this into a flame war!!!

You may have wanted to take your own advice, and not post all the stuff you did before that statement.
 
I can't get past 380 on the core and 1.15 on the mem, but that is not too shabby, pretty close to Ultra speeds. I suppose I shouldn't complain, because it was free.
 
I said if you don't have a GF card, don't be posting; I have a GF 6800 GT. And I'm not turning anything into a flame war. He asked for benchies, and I posted a well-known site to find them at. I also in part agreed with him: he did not believe that the GT was faster than the XT, and it is not, as I stated above.
 
no, i came here to decide whether or not to ditch my x800xt @ gateway or buy the gf6 gt, to see if i'm ready to use gf again. i have had nv cards since the tnt and gf, gf2 days, so it's not like i'm one-sided. i didn't come to bash anything/anyone. i'm asking valid questions, and i ain't angry; if you think the word DUH insinuates anger, then obviously that's your opinion. but at least i wasn't calling anyone an ass.

tom's hardware stopped being valid a few years ago... i trust he hasn't changed much.

i've been going around on a few reviews:
http://www.bjorn3d.com/read.php?cID=655

http://www.nvnews.net/reviews/bfg_geforce_6800_ultra_oc/index.shtml

and there aren't as many differences. as you said, it's merely 3 or 4 fps in some cases except for high res... but then again i typically play @ 1600x1200.

regarding sm3, i've read that 1.2 for far cry isn't even out yet, and the games listed in the links above don't have an sm3 patch yet... at the time of this post. so if i buy a gf6800 card in the next few months i will not have a game that takes advantage of SM3 (not one) until those patches come out. and you know how an announcement doesn't necessarily guarantee delivery; who knows when those patches will come out. i'm willing to bet that by the time those sm3 patches come out it'll be product-refresh time in september/october. i still stand by my statement that sm3 is useless today because there aren't enough games (quantity-wise) to justify it. it isn't a feature to argue about between gf and ati cards at this moment.

finally, afaik gf doesn't support divx/real hardware-assisted decode; ati does. i watch more movies than i game anyway.

thus, i'm staying with what i ordered, the x800xt. and this will be the final post from me about ati+nv comparison in this thread. from hereon i'll only talk about the oc of gf6800gt.


Captain_Insano said:
You came to this post just to bash the GF6? You're an ass then; why did you bother posting?? This is for OVERCLOCKING the GeForce 6 results, not ATI vs. NVIDIA. I've had enough of the who's-better crap; they are so close it's impossible to tell, and if you really want a link, read the Tom's Hardware Far Cry 1.2 & SM3.0 patch article. The Ultra is just as fast as the X800 Pro, and the GT is only a little slower than the Ultra, so it scores pretty close to XT speeds. No, it does not beat the XT, but it can tie with it in several benchmarks because it has its full 16 pipelines, unlike the X800 Pro. Mainly the X800 XT only wins at max res, 4xAA, 16x aniso; otherwise the GF6800 Ultra beats it and the GT ties with the XT. If you want to argue 3 or 4 fps then you are stupid; that is still within the margin of error. There is a whole slew of games that will support SM 3.0; even HL2 is supposed to support it with, I believe, a patch/mod like Far Cry. Remember, you buy these cards to keep for a few years; I plan on keeping mine for 2 years or so, until DX 10 comes out. Why are you getting upset?? Look at the reasons: it's not because ATI has a bad design, it's because the GT is an underclocked Ultra, while the X800 Pro is left behind because it has 4 pipes disabled!!! That's 1/4 of its rendering power, and it shows; the X800 Pro is usually about 25~30% slower than the GT because of it, which makes perfect sense.
Please do not turn this into a flame war!!! And if you don't have a GF card, then don't bother posting or even listening to what other people say about the X800's. Heck, don't even listen to me: if you have that card, then to you it should be the greatest card in the world, no matter what.
 
Uhh, the auto overclock said 600MHz core and 1.4GHz RAM. I think the damn thing almost killed my card! No way do I trust this.
 
Just traded my week old X800 pro in on a 6800GT, must say I'm much happier with the nVidia card (god I never thought I'd be saying that again :eek: )

PNY Verto
426 core / 1.16 mem

Can't wait for the DD waterblock for these things to come out. The card goes up to 450 / 1.19 with the stock cooler but starts freaking out after about a minute, once it gets good and hot.
 
BFG 6800 GT @ 428/1.14
PSU = Antec 550 watt

No artifacts. Going higher causes problems before the card even reaches the max temperature it hits at a lower stable overclock, which leads me to believe the cooling is not the limiter on these cards.
 
Finally got my eVGA GT in today. It auto-detected at 405/1100 out of the box, but I'm going to let it break in a little bit first before running it there.
 
430/1100 watercooled, 2 120mm fans on the radiator running at 7v, 120mm exhaust fan controlled by ps, never over 1400-1450rpm. quick, cool, and quiet. :)
 
Captain_Insano said:
I said if you don't have a GF card, don't be posting; I have a GF 6800 GT. And I'm not turning anything into a flame war. He asked for benchies, and I posted a well-known site to find them at. I also in part agreed with him: he did not believe that the GT was faster than the XT, and it is not, as I stated above.

Little hint: Don't use Toms for anything. It used to be censored here for a reason. Their benchmarks are not valid for anything. :)

Explanation: He skewed benchmarks to favor one card. Namely, he ran AA/AF on one (ATi) and not on the other, and then claimed that the second was faster because of it. (Well duh, you get more FPS when the other card has 4xAA and 16xAF on and you don't...)
 
So when SLI comes around you can expect a 12C case temp. increase with two of these cards plugged in? Imagine what kind of PSU you would need...watercooling would be a prerequisite.
 
Matrices said:
So when SLI comes around you can expect a 12C case temp. increase with two of these cards plugged in? Imagine what kind of PSU you would need...watercooling would be a prerequisite.
It would be less than a 12C case temp increase; the rise from each card doesn't simply add, because the case sheds heat faster as the difference from ambient grows.
 
Running

410/1160.

Was at 420/1100, but running a 420 core seems to freeze some games, e.g. BF Vietnam and UT2004, while playing. Kinda weird: it just pauses for 10 seconds and then resumes. I lowered it to 410/1160 and it's running great.

The memory on these cards is highly overclockable. I would say it's an Ultra in a 6800 GT OC box.

Blows away my radeon 9800 xt.

Oh, I am selling my 9800 XT for 240 bucks. Anyone interested?
 
I'm a bit confused here... The stock speed for the reference 6800 GT is 350 MHz core and 500 MHz (1.0 GHz "effective") for the memory, right? And the stock speed for the Ultra is a 400 MHz GPU and 600 MHz (1.2 GHz "effective") memory, right? Then why do you guys claim to have reached Ultra speeds when clocking the cards to a 410 MHz GPU and 550 MHz (1.1 GHz "effective") memory? The GPU is indeed running faster, but you're still missing 50 MHz on the memory..? Or am I missing something here..?
By the way, I also never understood why they can't just state the actual frequency of the memory, but absolutely have to use this "effective" frequency comparing how it performs against older SDRAM.. Must be some marketing gimmick, I guess.. Higher numbers sell more, maybe :-P

And also.. this is my first post on this forum :D Hello!

Edit: I got the supposedly 600 MHz on the Ultra's memory from the first page of the HardOCP's 6800 gt review.. I just found out that the real number is of course 550 MHz, meaning you guys were right all the time and that my entire post is pointless. Someone shoot me.. I will leave now.. :(
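For anyone else tripped up by the "effective" numbers in the post above: DDR memory transfers data on both the rising and falling clock edge, so the marketed rate is simply twice the actual clock. A tiny sketch with the reference speeds discussed in this thread:

```python
# DDR SDRAM transfers on both clock edges, so the marketed "effective"
# frequency is twice the actual clock frequency.
def effective_mhz(actual_mhz: int) -> int:
    return actual_mhz * 2

print(effective_mhz(500))  # 6800 GT stock memory: 1000 ("1.0 GHz")
print(effective_mhz(550))  # 6800 Ultra memory: 1100 ("1.1 GHz")
```

Which is why a 550 MHz memory clock and the "1.1 GHz" figures people post in this thread are the same thing.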
 