Never expected this!!!

wolf2009

[H]ard|Gawd
Joined
Jan 23, 2008
Messages
1,767
I'm doing these benches for my "Best Old Card Project" on my website http://hardwarebenchmark.googlepages.com/ .

I was surprised by what I saw.

[screenshots: Lost Coast built-in benchmark results]


This is the Lost Coast (HL2 is supposed to be an ATI game) built-in benchmark on the highest settings. Rest of the machine is in my sig.

Yes, that is the 8600GTS 256MB up there with the X1950XT. I never expected the 8600GTS to be this strong and so much better than the 7800GTX.

I used the 8.8 Catalysts for the ATI benchmarks and 177.41 for the Nvidia benchmarks.

ADDED: Lost Planet Snow level DX9 benchmark. AA has not been enabled because the Nvidia 7 series cards are not able to do it.

[screenshots: Lost Planet Snow level DX9 benchmark results]



These are the GPU-Z readouts of those cards, all at default speeds.

[screenshots: GPU-Z readouts of the cards]
 
Nice work - 1920x1200 too, which puts the GPU under full load! I think you will find that the true winner(s) will come out at high res.
 
Not to steal your thunder, just adding to the fun for comparison.

http://www.bit-tech.net/gaming/2005/09/21/lost_coast_benchmark/2

Here is a quick overview of what we found:

Low-end: Less than Radeon 9600, GeForce FX 5900 - not playable, less than 5 FPS
Mid-range: Around Radeon 9800, Radeon X700, GeForce 5950, GeForce 6600 - playable at 800x600 with AA turned off
High-end: Radeon X800, GeForce 6800 - playable at 1024x768 or 1280x1024, with AA and AF depending on the exact model of card
Bleeding-edge: GeForce 7800 - playable at 1600x1200 with full AF

An AMD FX57 processor on an MSI nForce 4 SLI motherboard; this machine has "only" 1GB of Crucial memory.

[bit-tech charts: benchmark results and HDR impact]
 
What's so surprising?

It's a known fact that the 8600GTS matches or exceeds high-end cards from the previous generation (7900GT/X1950XT). The only reason it got a bad rap is that it was initially overpriced, and there really wasn't anything available between the 8600GTS and the 8800GTS.

Just like the 6600GT and 7600GT, the 8600GTS matched the previous generation's high-end.
 
The reason it really got a bad rep was that it was 2x+ slower than the second-tier high-end card. That gap has been widening steadily since the 6600GT's superb price/performance, and at the 8600GTS level people actually noticed it.

Price/performance is supposed to go up as prices go down (excluding budget cards), and the 8600s definitely didn't follow that logic.
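
To put rough numbers on that logic (every price and framerate below is made up purely for illustration, not a benchmark result):

Code:
# Toy FPS-per-dollar check. All prices and framerates here are made-up
# illustrative values, not benchmark results.
cards = [
    ("previous high-end", 400.0, 80.0),
    ("new mid-range",     200.0, 30.0),  # 2x+ slower at half the price
]

for name, price, fps in cards:
    print("%-17s $%3.0f  %4.1f FPS  %.3f FPS/$" % (name, price, fps, fps / price))

# A healthy lineup gives the cheaper card the better FPS/$ ratio;
# here the mid-range card loses even that, which was the complaint.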
 
Well, the 8600GTS doesn't beat the 7900GT/X1950XT except in very few games, and most of those are new ones like Crysis, CoD4, and UT3; in older games the 8600GTS is usually beaten. Yeah, the 8600GT/GTS got a pretty bad rap in the beginning, but it was mainly the release drivers that made the card look like a joke when it first came out. With the next set of drivers, performance went way up, especially in STALKER, CSS, and FEAR; in fact, framerates nearly doubled in those three games. The 8600 cards are still very lacking, though, because most people don't play below 1280x1024, which is pretty much the required minimum.
 
wolf2009,

When I first saw your HL2:LC graph, I was incredulous because your numbers for the 7600GT are very low. I'm trying to replicate your results at 1680x1050, but the FPS simply won't go that low.

I want to make sure I'm doing everything correctly.

Built-in benchmark = Video Stress Test?

FRAPS was used to measure the FPS, which is pretty much the same as what the stress test reports.
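
For anyone sanity-checking the numbers, here's a minimal sketch that averages a FRAPS frametimes log. It assumes FRAPS's frametimes CSV layout (a header row, then one cumulative timestamp in milliseconds per frame), and the filename is only an example:

Code:
# Minimal sketch: average FPS from a FRAPS frametimes log.
# Assumes the CSV is "Frame, Time (ms)" with one cumulative
# millisecond timestamp per frame; the filename is hypothetical.
import csv

def average_fps(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader if row]
    # N timestamps span (last - first) ms, i.e. N - 1 frame intervals
    elapsed_ms = times[-1] - times[0]
    return (len(times) - 1) * 1000.0 / elapsed_ms

print("avg FPS: %.1f" % average_fps("hl2 frametimes.csv"))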

HL2: Lost Coast settings: all "High", Reflect All, 4xMSAA, 16xAF, Vsync Off, HDR Full (if available)

Nvidia Control Panel Settings:
Code:
AF = app controlled
AA gamma correction = off
AA setting = app controlled
AA transparency = multisampling
Conformant texture clamp = hardware
Error report = off
Extension limit = off
Force mipmaps = None
Max pre-rendered frames = 3
Multi-display/mixed-GPU accel = single display performance mode
Texture filtering: Negative LOD bias = allow
Texture filtering: quality = high quality
Threaded optimization = on
Triple buffering = off
Vsync = app controlled

*the following are greyed out:
Texture filtering: anisotropic mip filter optimization (off), anisotropic sample optimization (off), trilinear optimization (on)

It does not seem to make a difference if I force 4xMSAA and 16xAF in the Nvidia Control Panel.

My system:
Q6600 @ 2.4GHz (stock)
4GB RAM
7600GT @ 560/700 (stock)
Vista x64
Forceware 175.19 (older drivers than the ones you used)

Could my drivers really make a significant difference?
 
^^ Those are interesting results. I will rebench on those drivers once I have the 7600GT back in the system and report back.
 