Is 12607 a decent score for 3DMark06 and my specs?

newls1

Supreme [H]ardness
Joined
Sep 8, 2003
Messages
4,607
Guys, can you help me out and tell me if this is an OK score to achieve on 3DMark06? I got 12607 with absolutely no tweaks on my PC. I have XP Pro 64-bit, an 8800GTS 640MB, 2GB RAM @ DDR1052 speeds, a Q6600 @ 3.75GHz, etc.... I am using driver 163.44, but for some reason it is FORCING a refresh rate of 59Hz in everything, whereas my old ATI card allowed a 75Hz refresh rate :confused: This issue alone is driving me crazy, but I'm sure I'll find the answer to that. Thanks
 
Really? Thanks for the reply. I thought it sounded really low for some reason
 
As a comparison, I got 12,418 out of my signature rig, o/c'ed to 3.6 CPU and 858/900 GPU, etc. but with 32-bit XP. Your score sounds in the ballpark of where it should be.
 
I have a question for all of you: when you test with 3DMark, do you use the same video card settings you game with (AA, AF, etc.)? I have things set to performance levels, not what I game with.

Or do you set everything to performance values?

I know 3DMark isn't the be-all and end-all of testing, but I would have thought my scores would be a little higher. The last test I ran I only got an 8379.

Thanks.
 
For AA/AF, I set the card to "Use Application Settings," then work the core/mem speeds up or down to whatever you think the card can do, until you reach the point where you get no freeze-ups or restarts.
 
I thought for a 3DMark06 score you're supposed to run at all default settings?

and I get about 11k with my sig rig.
 
I thought for a 3DMark06 score you're supposed to run at all default settings?

'Default' for 3DMark's settings, yes, but for the CPU/GPU/RAM etc. the idea is to see how well your rig can do, which means o/c'ing those components.
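For reference, the overclocks quoted in this thread break down as FSB x multiplier. A minimal sketch, assuming the Q6600's stock 9x multiplier and 266MHz FSB (the 417MHz FSB below is my guess at how a 3.75GHz Q6600 is reached, not something the poster stated):

```python
# Sketch: CPU clock = FSB x multiplier. The Q6600's multiplier is
# locked at 9x upward, so overclocking means raising the FSB.
# Stock Q6600: 266 MHz FSB x 9 = ~2.4 GHz.

def cpu_clock_mhz(fsb_mhz: int, multiplier: int) -> int:
    """Effective core clock in MHz."""
    return fsb_mhz * multiplier

def overclock_percent(stock_mhz: int, oc_mhz: int) -> float:
    """How far above stock the chip is running, as a percentage."""
    return (oc_mhz / stock_mhz - 1) * 100

stock = cpu_clock_mhz(266, 9)   # 2394 MHz, i.e. ~2.4 GHz
oc = cpu_clock_mhz(417, 9)      # 3753 MHz, i.e. the ~3.75 GHz in this thread
print(round(overclock_percent(stock, oc), 1))  # -> 56.8
```

So the 3.75GHz figures being thrown around here are in the range of a ~57% overclock over stock.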
 
I am using driver 163.44, but for some reason it is FORCING a refresh rate of 59Hz in everything, whereas my old ATI card allowed a 75Hz refresh rate :confused: This issue alone is driving me crazy, but I'm sure I'll find the answer to that.
The latest nVidia drivers are bunk. I threw a 1950 Pro in my rig to test for my sister's rig I'm building. Once I went back to my 7600GT, using the new drivers my monitor wouldn't even work at native resolution (yes, I cleaned and scrubbed all drivers before and after). See if there are beta drivers for your 8800 to fix your refresh rate issue in the meantime.
 
The latest nVidia drivers are bunk. I threw a 1950 Pro in my rig to test for my sister's rig I'm building. Once I went back to my 7600GT, using the new drivers my monitor wouldn't even work at native resolution (yes, I cleaned and scrubbed all drivers before and after). See if there are beta drivers for your 8800 to fix your refresh rate issue in the meantime.
163.44 IS a beta driver. I have tried everything possible to fix this issue and still can't find a fix! This refresh rate issue is driving me to drink. Can someone answer this: does 3DMark06 utilize all 4 cores on a quad-core?
 
Does 3DMark06 utilize all 4 cores on a quad-core?
Well, not really, but from my observation a single core on a quad-core performs a lot better than a single core on a Core 2 Duo or other dual-core CPUs, by 10% or maybe higher. :p

I can't wait to get mine on the 6th. Hopefully it's G0 stepping, lulz.

Edit: My 3DMark06 score is 11536 and might increase by 1k-1.5k if I get to OC my new quad-core on September 6th. :D
 
OK, so more or less I'm right about where I should be at.....
 
**UPDATE**

Just installed the NEW BETA driver 163.67 and, using the exact same configuration (one 8800GTS XFX XXX @ 615/975, Q6600 @ 3.75GHz, etc.), my 12607 score jumped to 14028!!! This seems like a pretty damn good score for just one 8800GTS card, right?? What is up with this new beta driver?
 
I'd give you a comparison of the new drivers vs. 163.44, but the 163.67s screw up my OC -- in order to do a straight comparison I'd have to flash my card with 'proper' shader speeds, since as of 163.67 the drivers allow separate core and shader speeds to be set... but no current OCing software has a shader slider. Supposedly custom ratios will be added in RivaTuner 2.04 and a slider in 2.05. Until then it's a video BIOS flash, or a 1350MHz shader domain on the GTX regardless of core clocks with 163.67... which is a bit frustrating, to say the least. I may flash, may not, not sure.

I hit 12716 with my sig rig, though my E6420 was running at 3.5GHz, 1.5 vcore at the time. The RAM on the GTX was also running at 999MHz, up from the usual 900MHz, and the core from 576MHz to 676MHz. Shader was... whatever the shader would normally scale up to.
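On pre-163.67 drivers the shader domain scaled with the core, so "whatever shader would normally scale up to" can be roughly estimated from the stock core:shader ratio. A sketch, assuming the stock 8800 GTX clocks of about 575MHz core / 1350MHz shader (real drivers snap the shader domain to discrete steps, so these are estimates only):

```python
# Estimate the linked G80 shader clock for a given core clock,
# assuming the driver keeps roughly the stock core:shader ratio.
# Stock 8800 GTX: ~575 MHz core, ~1350 MHz shader.
STOCK_CORE_MHZ = 575
STOCK_SHADER_MHZ = 1350

def linked_shader_mhz(core_mhz: int) -> int:
    """Rough shader clock the driver would pick for this core clock."""
    return round(core_mhz * STOCK_SHADER_MHZ / STOCK_CORE_MHZ)

print(linked_shader_mhz(575))  # -> 1350 (stock)
print(linked_shader_mhz(676))  # -> 1587 (the 676 MHz core mentioned above)
```

In practice the hardware quantizes the shader clock, so the real value lands on the nearest available step rather than exactly on this ratio.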
 
I get 12227 with my Q6600 at 3.75 and my 8800 640 OC'ed a bit.

I have DDR2 800 RAM though, so yes, your score sounds good.
 
**UPDATE**

Just installed the NEW BETA driver 163.67 and, using the exact same configuration (one 8800GTS XFX XXX @ 615/975, Q6600 @ 3.75GHz, etc.), my 12607 score jumped to 14028!!! This seems like a pretty damn good score for just one 8800GTS card, right?? What is up with this new beta driver?

Whoa, thanks for the heads-up on the new beta drivers. I've read the changelog and this might fix the poor performance with the 8800GTS 320MB cards.

What's New in Version 163.67

New Features
• Improved performance of the graphics memory manager on GeForce 8 series GPUs running DirectX 9 applications in single-GPU configurations. These improvements solve cases of reported performance slowdowns in some 3D applications with high graphics settings and resolutions.

Fixed Issues - Windows Vista 32-bit

Fixed Single-GPU Issues

All GPUs
• Company of Heroes (DirectX 10) - "out-of-memory" errors may occur at high resolutions and high antialiasing levels.

GeForce 8 Series GPUs
• GeForce 8600 GTS: The Microsoft DirectX SDK Blob demo - the screen turns blank when changing the multisample type while in full-screen mode. [322023]

GeForce 7 Series GPUs
• GeForce 7300GT, GeForce 6150/6100: The display becomes corrupted after entering Hibernate mode from Standby mode (linked standby/hibernate). [339269]
• GeForce 7050PV/7025, GeForce 6600: Civilization 4 - the game menu doesn't show the options. [330106]

Fixed SLI Mode Issues
• [SLI]: S.T.A.L.K.E.R.: Shadow of Chernobyl - game performance with SLI mode enabled may be lower than under single-GPU mode in some areas. [321749]
• [SLI], GeForce 8600: Age of Empires 3 - blue-screen crash or blank screen occurs after exiting the game. [322085]
• [SLI], GeForce 8600: 3DMark06 - "GT2-Firefly Forest" shows massive corruption at 1600x1200 resolution with 8xAA enabled. [321355]
• [SLI], GeForce 8500/8400/8300: 3DMark05 - the benchmark crashes or results in an intermittent blank screen when run with Override AA set (from the NVIDIA Control Panel) and at resolutions higher than 1920x1200. [330368]
• [SLI], GeForce 8500/8400/8300: The desktop resolution resets to 800x600 after switching between SLI mode enabled and disabled. [327233]

Fixed Issues - Windows Vista 64-bit

Fixed Single-GPU Issues
• GeForce 7300GT, GeForce 6150/6100: The display becomes corrupted after entering Hibernate mode from Standby mode (linked standby/hibernate). [339269]

Fixed SLI Mode Issues
• [SLI], GeForce 8800 GTX: Screensaver crashes after resuming from preview. [340179]
• [SLI], GeForce 8500/8400/8300: The system automatically restarts when SLI is either enabled or disabled with UAC enabled. [332104]
 
I'd give you a comparison of the new drivers vs. 163.44, but the 163.67s screw up my OC -- in order to do a straight comparison I'd have to flash my card with 'proper' shader speeds, since as of 163.67 the drivers allow separate core and shader speeds to be set... but no current OCing software has a shader slider. Supposedly custom ratios will be added in RivaTuner 2.04 and a slider in 2.05. Until then it's a video BIOS flash, or a 1350MHz shader domain on the GTX regardless of core clocks with 163.67... which is a bit frustrating, to say the least. I may flash, may not, not sure.

I hit 12716 with my sig rig, though my E6420 was running at 3.5GHz, 1.5 vcore at the time. The RAM on the GTX was also running at 999MHz, up from the usual 900MHz, and the core from 576MHz to 676MHz. Shader was... whatever the shader would normally scale up to.

My shader speed is 1520MHz; is that maybe why my score is pretty good?
 
I got 11,806 with an E6300 and a single 8800GTS.

I haven't overclocked in ages.

I have a Q6600 waiting for me, I am getting it in about 10 days.
As well as a GTX.

You are going to make me install those drivers, and re-overclock.

Thank you. :)
 
Don't sweat it. There's a reason Kyle stopped using 3DMark in reviews a long time ago. The time of uber 3DMarks is almost over.....:eek:
 
Score actually went DOWN from 163.44 to 163.67, with the exact same OC settings (I believe... might be a few MHz off on the RAM on the GTX).

From 12716 to 124xx-something. A good bit of that is within the margin of error, but, eh. 200 points, sure, do another run. 300+, the overall performance has probably gone down. Maybe the GTS fixes screwed up the GTX? Who knows.
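The "200 points is noise, 300+ is real" rule above works out to roughly a 2% threshold at these score levels. A minimal sketch of that sanity check (the 2% noise figure is this thread's rule of thumb rather than an official 3DMark number, and 12450 is a hypothetical stand-in for the "124xx" above):

```python
# Classify a benchmark score change as run-to-run noise or a real
# regression, using a simple percentage threshold.

def score_delta_percent(before: int, after: int) -> float:
    """Signed percentage change from the first run to the second."""
    return (after - before) / before * 100

def verdict(before: int, after: int, noise_pct: float = 2.0) -> str:
    """Call anything within +/- noise_pct 'within noise'."""
    delta = score_delta_percent(before, after)
    return "within noise" if abs(delta) <= noise_pct else "real change"

print(round(score_delta_percent(12716, 12450), 2))  # -> -2.09
print(verdict(12716, 12450))                        # -> real change
print(verdict(12716, 12600))                        # -> within noise
```

By that yardstick, a ~266-point drop on a ~12700 baseline sits right at the edge of noise, which matches the "do another run" advice above.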
 
Score actually went DOWN from 163.44 to 163.67, with the exact same OC settings (I believe... might be a few MHz off on the RAM on the GTX).

From 12716 to 124xx-something. A good bit of that is within the margin of error, but, eh. 200 points, sure, do another run. 300+, the overall performance has probably gone down. Maybe the GTS fixes screwed up the GTX? Who knows.

From what I've been hearing, 3DMark scores are going down but gameplay is getting smoother and less jerky.
 
As long as you're within a thousand or two points, you're fine. These synthetic benchmarks are a sham; they're only good for determining whether you've installed a driver wrong or have a massive bottleneck. That being said, my GTS at 620MHz core and 1000MHz memory (sig is the max overclock on the GPU, but that's only run when playing demanding games) gets slightly higher scores with a slightly lower CPU speed.
 
I got 11805 with the CPU @ 2.88 and the GPU @ 658/2132, so that 14k you got now is rather good.
 
Guys, can you help me out and tell me if this is an OK score to achieve on 3DMark06? I got 12607 with absolutely no tweaks on my PC. I have XP Pro 64-bit, an 8800GTS 640MB, 2GB RAM @ DDR1052 speeds, a Q6600 @ 3.75GHz, etc.... I am using driver 163.44, but for some reason it is FORCING a refresh rate of 59Hz in everything, whereas my old ATI card allowed a 75Hz refresh rate :confused: This issue alone is driving me crazy, but I'm sure I'll find the answer to that. Thanks

Seems like a bragging post to me. All that OC'ing and you're unsure how 3DMark works?

And correct, you are in NO WAY tweaking your PC at all :) All Q6600s run at 3.75 :)

What do you have the 8800GTS at?
 
I got 11,806 with an E6300 and a single 8800GTS.

I haven't overclocked in ages.

I have a Q6600 waiting for me, I am getting it in about 10 days.
As well as a GTX.

You are going to make me install those drivers, and re-overclock.

Thank you. :)

You got an 11806 with nothing overclocked on that hardware? That's about 15% better than you should get with nothing OC'ed, IMHO.
 
Seems like a bragging post to me. All that OC'ing and you're unsure how 3DMark works?

And correct, you are in NO WAY tweaking your PC at all :) All Q6600s run at 3.75 :)

What do you have the 8800GTS at?
Bragging? Nah, I have nothing to brag about :). What I meant when I said my system isn't tweaked is that some people "TWEAK" their OS for benches and whatnot; I prefer not to do that stuff. Of course the PC is overclocked, I bought a G0 for that one reason.
 
damn ye vista....

I can only eke out a 13000 in Vista 64, but can almost touch 14k with a 13900 in XP...

Ah well, gaming feels more or less the same in either OS. It actually feels smoother in Vista for the 64-bit-enabled games (Source stuff, Far Cry), and OpenGL stuff plays the same in either OS.

Synthetics don't paint the full picture of how your system is running. Look at them as a sign that your system is healthy if it can run them, rather than as a gauge of its performance.
 