Post your 3dMark 2005 scores here. (Link to list in 1st post)

ellover009 said:
I've got a P4 2.8 (HP system), 2GB of cheap Kingston RAM, and a 6800GT.
Score: 4767 in the 3DMark05 benchmark.
Is that low for the card?
I was going to get an X850XT, but I knew it would be CPU limited, and it lacks Shader Model 3.0 and full HDR support. So far I'm content with the card, coming from a GeForce4 Ti4200.
So is 4767 good for a stock 6800GT? I'll run more tests once I turn off HT and shut down some programs.

Actually, that score is average for a 2.8GHz Intel P4 CPU and a 6800GT (reference, non-OC).
 
Just got my GTX back from Viperjohn today.
I'm pleased.

It's sitting at 540/1458, 50°C under load.

Top bench so far with the latest official NVIDIA drivers: 9642


Computer specs:

A64 4000+ @ 2858MHz on air, 1.6V
DFI LANParty UT nF4 SLI-DR, 623 BIOS, BH-5 optimized
1GB Mushkin BH-5 PC3500, maximum tightness (2-2-blah-blahblah), 3.4V
74GB Raptor, fresh install of WinXP SP2
Viperjohn 7800GTX @ 540/1458

Not too shabby for a single card.
I'll rest my benches; the card can stretch no further.
 
What mods did he throw on it? Got any pics? I'd imagine a volt mod of some sort :D
 
Volt mods (core and memory) and a nearly silent air kit.
He replaced the capacitors and resistors, the piece where the PCIe molex plugs in, etc.
He did a real number on it and pushed the card to its top-end performance limit.
No camera batteries, so I'm unable to provide pics.
Also, he kind of fixed the GDC on it. It's hard to explain without a visual aid.


I'll just leave the 9650 3DMark05 score I got with one card as my visual aid :p
Works for me.
 
[screenshot: 3dmark.jpg]
 
3DMark: 6425
CPU: 4487

These scores were with an X800 GTO2 with unlocked pipelines, overclocked to 540MHz core / 600MHz memory, on a 3200+ Venice overclocked to 2400MHz (from 2000MHz) with 8-3-3-2.5 RAM timings, on an A8N-SLI Premium motherboard with the 1007 BIOS.
 
I don't know if this list is still being actively maintained, but if it is, my score should be updated. I've upgraded my CPU, so if there was any CPU limitation, it's gone now. I'm using the latest official drivers at this time (video 81.85, chipset 5.11). My current 3DMark05 score is 4251. Not the best, but pretty decent for this configuration, I think. I believe my old score was somewhere in the 3.5K range, but I'm not 100% sure I remember right. Here's my hardware at this time:

Athlon 64 3700+ (San Diego SH-E4) @ 252x10 (4x HTT), 2x512MB TwinMOS SpeedPremium DDR400 UTT BH-5 @ 209MHz 2-2-2-5, EPoX 9NDA3J (nForce3 Ultra), Leadtek A400 TDH (GeForce 6800 NU) @ 375/700 w/ 16 pixel / 6 vertex pipes, Sound Blaster Audigy 2 ZS Platinum, Thermaltake Silent PurePower 420W PSU
 
I wouldn't trust all that high-quality stuff to a 420W Thermaltake :eek:; it only has 18A on ONE 12V rail.
 
My system is perfectly stable. I used to have a CPU that was very sensitive to dipping voltages, and I got it stable. I don't really think the video card is so ultra-sensitive that it even comes up. Maybe if I were volt-modding to 1.4V I'd have to worry about voltage fluctuations damaging my card, but I'm not going to do that. It's not like I'm running Raptor drives or something. I've actually gone down to a slightly less power-hungry system with my most recent upgrade (Barton to San Diego).
 
Just managed a new personal best!!!!!

326 3DMarks :D :D

If people are still being added to the list, I have a GeForce FX 5500. The card is clocked at 310/464 and I'm running the 77.77 drivers. Almost the fastest FX 5500 on the list. Now that would be an accomplishment! :D

Man, I can't wait till my 7800GT gets here!!!
 
Mr. K6 said:
I wouldn't trust all that high-quality stuff to a 420W Thermaltake :eek:; it only has 18A on ONE 12V rail.

You have me worried now, 'cos my rig is running on a 350W supply!
 
Rash said:
You have me worried now, 'cos my rig is running on a 350W supply!
What people don't get is that it's quality, not quantity. If it's running stable, you're OK. Of course, make sure your voltages aren't running amok, but the main threat is if they go up. Down means instability; up means instability plus possible damage. Mine go down what little they change, and it's not enough to make a mobile Barton running at 2.5GHz @ 1.75V unstable. Technically my new setup eats a little less power than the old one did, so there's even more headroom now than there was then. After all, a San Diego running at 2.5GHz @ 1.45V doesn't use nearly as much power as a Barton (even a mobile) at 2.5GHz.

More on the subject of this thread: I'm pretty sure the video card isn't as sensitive as many other components (after all, the molex connector isn't there for looks), and I seriously doubt the PSU could affect the card. The card has held up well over the year or so I've had it, the only exceptions being when I screwed up the memory with a BIOS mod that had lower timings (I didn't know that at the time I tried it) and when I thought I could get away with a higher AGP bus speed (which supposedly helps GeForce cards, but I saw very little change when I raised it, so I guess it applies mainly to older GeForce cards).
 
7251 for the rig in my sig. Card still at stock clocks, with a very light OC on the processor.

 
OK, I'm hitting 8599 with my new rig. Video card at 610/670, CPU at 2890MHz.

How does that compare with an OC'd 7800GT?

OK, that looks better than almost every GT I've seen and a few GTX systems. :p

Damn, we need some approved drivers for the comparison pages.
 
Nazo said:
What people don't get is that it's quality, not quantity. If it's running stable, you're OK. Of course, make sure your voltages aren't running amok, but the main threat is if they go up. Down means instability; up means instability plus possible damage. Mine go down what little they change, and it's not enough to make a mobile Barton running at 2.5GHz @ 1.75V unstable. Technically my new setup eats a little less power than the old one did, so there's even more headroom now than there was then. After all, a San Diego running at 2.5GHz @ 1.45V doesn't use nearly as much power as a Barton (even a mobile) at 2.5GHz.

More on the subject of this thread: I'm pretty sure the video card isn't as sensitive as many other components (after all, the molex connector isn't there for looks), and I seriously doubt the PSU could affect the card. The card has held up well over the year or so I've had it, the only exceptions being when I screwed up the memory with a BIOS mod that had lower timings (I didn't know that at the time I tried it) and when I thought I could get away with a higher AGP bus speed (which supposedly helps GeForce cards, but I saw very little change when I raised it, so I guess it applies mainly to older GeForce cards).

Voltage going down can mean damage as well. Many devices die from a lack of proper voltage.
 
System as in sig:

Stock:
Athlon FX-57 (stock 2.8GHz)
Albatron 7800GTX (stock 430/1200)
3DMark01 - 28151
3DMark03 - 16534
3DMark05 - 7729


OC'ed 7800GTX:
Athlon FX-57 (stock 2.8GHz)
Albatron 7800GTX (oc 516/1370) - using Coolbits "Find Optimal Speed"
3DMark01 - 29147 (+3.5%)
3DMark03 - 18682 (+13%)
3DMark05 - 8899 (+15.1%)

No problems with a 20-pin PSU.
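
For anyone checking the math on those gains: each percentage is just (OC score - stock score) / stock score. A quick Python sketch using the numbers above (the dict names and printout are my own, not from any official tool):

# Relative improvement from the stock vs. OC'd runs listed above.
stock = {"3DMark01": 28151, "3DMark03": 16534, "3DMark05": 7729}
oc = {"3DMark01": 29147, "3DMark03": 18682, "3DMark05": 8899}

for bench, base in stock.items():
    gain = (oc[bench] - base) / base * 100  # percent gain over stock
    print(f"{bench}: {base} -> {oc[bench]} (+{gain:.1f}%)")

That prints +3.5%, +13.0%, and +15.1%, matching the figures above.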
 
R1ckCa1n said:
:eek: Nice score! I think I must swap my X2 3800+ for an X2 4400+ to get the extra cache.

Do you think the cache makes that much difference? At what speed is your 3800 running?
 
END said:
Voltage going down can mean damage as well. Many devices die from a lack of proper voltage.
Just stick to the subject of the thread. Anyway, the only damage that has ever occurred to my video card has been when I tried changing around its BIOS and when I tried stuff like setting a higher AGP bus. Never once have I had a problem outside of those things. As for the CPU, well, I always make sure my CPU is Prime95 stable. You don't get any more stable than that. I've never lost a CPU or damaged one either (my previous Barton that I had running so high isn't dead; it just moved to another computer, where it gets to run cooler at a lower speed in an HTPC).

nunyabiz said:
Do you think the cache makes that much difference? At what speed is your 3800 running?
Doubling your L2 cache makes something like a 3-7% difference in gaming, if I remember the official numbers correctly. Honestly, it's not terribly noticeable, and you'll probably see more difference by putting the extra money into your video card or other such components. Outside of gaming, there are certain applications, such as encoding, where that extra cache is all the difference in the world (I've even seen a case where an Athlon 64-core Sempron fell below even a Barton at things like that thanks to its smaller L2 cache -- and I emphasise that this wasn't the Sempron with the Barton core, which wasn't even on the scale). For gaming, I wouldn't really spend the extra cash on the L2.
 
Intel_Inside said:
Numbers mean nothing to me. In-game performance in every game I play is awesome.
Indeed, it's just a benchmark :p. Odd how it's slanted like that though :confused:

Nazo: If it's stable, then everything is OK; I was just wondering about that. Seeing that you have an AGP 6800, your video card probably isn't sucking a ton of power.
 
Mr. K6 said:
Indeed, it's just a benchmark :p. Odd how it's slanted like that though :confused:
Not so odd when you consider how Futuremark has been found guilty on MANY counts of optimizing their benchmarks more for one manufacturer than another... It wouldn't surprise me in the least if there are more optimizations for AMD than for Intel, for example, and I'll bet that's what they've done here.

Nazo: If it's stable, then everything is OK; I was just wondering about that. Seeing that you have an AGP 6800, your video card probably isn't sucking a ton of power.
The 6800 is probably among the highest power consumers, actually, though the GT and Ultra draw even more. The 7800, by all accounts, is supposed to be more power efficient, from what I hear. I'm rather hoping that's true, because nVidia is starting to burn some serious power with all their cards, and that means a whole host of problems, one of which is that they'll never be very mobile-friendly. As AMD already knows, we're getting to the point where we use too much power even for daily operations. You know, my old PC uses a 230W power supply, and that's too much. My main PC uses a 420W and it's only just enough -- despite the fact that I have a processor designed for minimal power consumption. So you'll excuse me if I like to think that nVidia has finally caught on and is trying not to push things quite so much now. ^_^
 
Here is the latest update. So far it's game stable, but I want to test further before going higher.
[screenshot: X1800XT.jpg]
 
wo0t! X1800XT scores! Cool.

Can anyone link me to the thread that has all the graphs of people's scores and the video cards they have?
 
Bah, it turns out those BIOS mods messed up my video card after all. I thought I had it stable at full capabilities again, but it just won't do it. For one thing, I picked up a wrong BIOS somewhere (it'd probably help if Leadtek hadn't removed their BIOS download page, so I could tell which of these is which...). I realized something was funky when I couldn't underclock (if I lowered the frequency, the display got insanely corrupted and it froze, then rebooted the system after a while). You can discard the earlier number if you like, but it seems to me a valid enough representation of what happens when you run a 6800 at those specs to still be useful for reference.

EDIT: And if you're curious, a 6800 running at 286/500 gets a 2401 in 3DMark05.

BTW, is there any way the list could be posted as text? Or at least a download link for the spreadsheet? It'd be really nice to be able to run a search on it, for one.
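
If it were plain text or CSV, even a short script would do for searching. A hypothetical sketch, assuming someone exported the list as scores.csv with name, card, and score columns (none of that exists yet; the filename and column names are my own invention):

import csv

def search(term, path="scores.csv"):
    # Print every row whose name or card contains the search term.
    # Assumes a hypothetical scores.csv export of the list.
    term = term.lower()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if term in row["name"].lower() or term in row["card"].lower():
                print(row["name"], row["card"], row["score"])

search("6800GT")  # e.g., pull up every 6800GT entry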
 
AMD 3500+ @ 2475MHz, 1000MHz HT, w/ BFG 7800GT x2 @ 425/1050
3dMark01 - Not tested
3dMark03 - 24493
3dMark05 - 10569
 
New high score on my X800 GTO

6381

Connect3D X800 GTO, benched at 575 core / 525 memory.
16 unlocked pipes; the core is an R423. The device ID has been modded, and Windows recognizes it as an X800 XT as well.

 