kcthebrewer
Anybody have any XGI scores?
kcthebrewer said:Anybody have any XGI scores?
101998 said:Also, I had to run the timedemo through twice, because the first time through (on either driver, any setting) the demo would pause every couple of seconds. The second time it always played fine.
titanium said:I think my scores are pretty low for what I have. Any ideas what my problem might be? I'm new to the whole overclocking thing.
3dmark score: 38xx (at work, can't remember)
AMD 64 3200+ @ 2.3ghz
Chaintech Nforce3 250
1gb Crucial DDR3200
Gainward GF6800GT (non GS) @ 400/1100 66.7 drivers
Make SURE you put some new thermal paste on these Gainward cards; you'll drop huge temps when you do. I put on some AS3 and dropped 10C off it.
Jbirney said:Batman,
Dave over at B3D gave more info on the ATI bug that, when fixed, netted 1000 pts in 3dmark. It had to do with the way ATI managed vertex data. It seems that on these AGP cards the vertex data was spilling into main memory instead of staying in on-board memory; PCIe cards did not see this as much due to their higher bandwidth. A guy by the name of "dio", who is an ATI D3D software engineer, said it was a memory management issue.
As far as your statement about no other games seeing an increase, that is very true. However, please name me one game out there today that is even close to the same complexity as what 3dmark05 is doing. I will save you the time: there is no game today that even comes close.
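(Editor's aside: a minimal toy sketch of the effect described above, using assumed, era-rough bandwidth figures; this is not ATI's actual driver logic, just an illustration of why vertex buffers spilling out of on-board memory would hurt an AGP card more than a PCIe one.)

```python
# Toy illustration only (assumed bandwidth numbers, not ATI's driver logic):
# vertex data that "spills" out of on-board memory has to come back across
# the bus every frame, and AGP has less bus bandwidth than PCIe x16.

LOCAL_VRAM_GBPS = 35.0   # on-board memory bandwidth of a 256MB high-end card (assumed)
AGP8X_GBPS = 2.1         # AGP 8x peak transfer rate (assumed)
PCIE_X16_GBPS = 4.0      # PCIe x16 per-direction transfer rate (assumed)

def vertex_fetch_ms(vertex_mb, spilled_fraction, bus_gbps):
    """Rough per-frame time to pull vertex data when part of it has
    spilled into main memory and must cross the bus instead of staying in VRAM."""
    local_mb = vertex_mb * (1.0 - spilled_fraction)
    spilled_mb = vertex_mb * spilled_fraction
    seconds = local_mb / (LOCAL_VRAM_GBPS * 1000) + spilled_mb / (bus_gbps * 1000)
    return seconds * 1000.0

# Hypothetical 3DMark05-like load: heavy vertex data, half of it spilled.
for bus_name, bus_gbps in [("AGP 8x", AGP8X_GBPS), ("PCIe x16", PCIE_X16_GBPS)]:
    ms = vertex_fetch_ms(vertex_mb=60, spilled_fraction=0.5, bus_gbps=bus_gbps)
    print(f"{bus_name}: ~{ms:.1f} ms per frame just moving vertex data")
```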
dvanderb said:I was really surprised to see a score of 5497 with my 6800GT..
http://people.ucsc.edu/~dvanderb/3dmark05.jpg
System specs follow:
AMD Athlon64 3200+ @ 2480mhz (225mhz FSB)
Asus K8V SE Deluxe
1gb OCZ Enhanced @ CAS 2.5 and 450mhz
PNY 6800GT w/ NV Silencer 5 @ 430mhz core/1150mhz memory - fastwrites on.. 1.4V bios
Using 66.70 drivers
Jbirney said:Brent,
while it's funny that people argue over this stuff, it's still sad, as you never have answered my question: how do you predict future performance? Sticking to current games is not a good indicator. I don't have a good answer, but having synthetics show some possible weakness in the API that those games will use seems like at least a starting point...
btf said:What was the clock on your GT for the test?
dvanderb said:I listed it in the system specs.. was 430mhz core.. and 1150mhz memory
btf said:I was asking because I get the same score with mine clocked @450/1200 with the 1.4v bios also. Wonder if fast writes turned on would make a difference.
The Batman said:Also, about 'approved' drivers. WHO. GIVES. A. FUCK. Seriously, this is [H]ardOCP, as in Hardcore, not Hardly packing a dick. WHQL, FutureMark Approved, Microsoft Recommended, GITG, TWIMTBP, runs best on Alienware/Intel P4, they can all blow me.
truffle00 said:
Numbers are in the following format: 4.9 score/framerate / 8.07 score/framerate / Percent difference
3dmark01SE 21902 / 21957 / 0.25
3dmark03 11929 / 11976 / 0.39
3dmark05 4473 / 5788 / 22.72
Doom 3 - 1280x1024
Medium 48.6 / 53.3 / 8.82
High 46.0 / 51.7 / 11.03
Ultra 42.8 / 48.3 / 11.39
Doom 3 - 1280x1024 (with AA)
Medium 35.3 / 38.1 / 7.35
High 33.8 / 37.1 / 8.89
Ultra 31.9 / 34.9 / 8.60
Swing away, Batman.
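(Editor's aside: a quick sketch, not part of truffle00's post, checking how the "Percent difference" column above appears to be computed; the posted values match (8.07 result − 4.9 result) / 8.07 result × 100.)

```python
# Spot-check of the "Percent difference" column above: it appears to be
# (8.07 result - 4.9 result) / 8.07 result * 100 for each row.
results = {
    "3dmark05":      (4473, 5788),
    "Doom 3 Medium": (48.6, 53.3),
    "Doom 3 High":   (46.0, 51.7),
    "Doom 3 Ultra":  (42.8, 48.3),
}
for name, (old, new) in results.items():
    pct = (new - old) / new * 100
    print(f"{name}: {pct:.2f}% higher on the 8.07 hotfix drivers")
```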
The Batman said:Thanks a lot, Bro. Must say this is much better than flaming back and forth. Though I think you misunderstood what I wanted to compare [doesn't matter, you gave the scores and that's all I really needed; I'll be sure to explain my reasoning as well].
Alright, let's get cracking on the benchmarks and why I think your results support my "the bug excuse is FUD" theory.
Alright, let me draw your attention to the scores I've bolded in the quote. Those are what we'll be looking at for the comparison. For now let's look at the top non-AA scores. Compare the 48.6 to the 46.0. That's moving from Medium [128MB used] to High [256MB used] under the older drivers. The difference is 2.6 frames; as a ratio, High runs at about 95% of Medium. The ratios are what we'll be looking at, since we can't use a straight frame comparison: the card is doing more work in High than in Medium that is not necessarily related to textures.
Okay, now we're going to compare the same thing but using the hotfixes. 53.3 to 51.7, a difference of 1.6, which works out to a ratio of about 97%. All in all that's only about a two-point difference in the ratios, not really something to write home about. So... what happened to the bug fix? The gap between 95% and 97% is negligible at best, and within the benchmark's margin of error at worst. We should be seeing a LARGER disparity. Do you guys follow me? The hotfix is supposed to fix a 256MB allocation problem [as it only affects 256MB cards]. What I did was compare the difference between using 128MB [Medium] and 256MB [High] under the older drivers to the difference between using 128MB and 256MB under the hotfix drivers. If the hotfix worked as ATI claimed, we should have seen a LARGER disparity between 128MB and 256MB, since 256MB mode wouldn't be crippled as it was under the older drivers. In other words, if this was such a serious issue, truffle should have taken a much larger hit when moving from Medium to High under the older drivers [since 256MB wasn't allocating properly]. Obviously this is not the case.
Okay, now we're going to do the same exact thing but this time we'll use the AA scores.
35.3 to 33.8: a ratio of roughly 95% again under the older drivers. 38.1 to 37.1: roughly 97% again.
Same difference between non-AA and AA. So yes, I call FUD. I think there was NOTHING wrong with using 256MB of VRAM under the older drivers [that in fact there was no bug to begin with], and I think truffle's scores support this in REAL GAMES. I don't want to use synthetics as an example because people with stock X800 Pros are beating people with Ultra Extremes.
If anyone can punch a hole in this comparison, please do so by all means.
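(Editor's aside: The Batman's comparison worked through in a few lines, using truffle00's Doom 3 numbers from above; the driver and AA labels are taken from the thread's description of those runs.)

```python
# The comparison described above, worked with truffle00's Doom 3 numbers:
# how much of the Medium frame rate survives at High, per driver set.
# (Driver and AA labels follow the thread's description of the runs.)
doom3 = {
    "no AA, 4.9 drivers":    (48.6, 46.0),   # (Medium, High)
    "no AA, 8.07 hotfix":    (53.3, 51.7),
    "with AA, 4.9 drivers":  (35.3, 33.8),
    "with AA, 8.07 hotfix":  (38.1, 37.1),
}
for label, (medium, high) in doom3.items():
    print(f"{label}: High runs at {high / medium:.1%} of Medium")
# Output: roughly 95% on the 4.9 drivers and 97% on the hotfix, with or
# without AA -- the ~2% gap The Batman is calling negligible.
```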
The Batman said:Truffle said AI was off [he doesn't have CCC installed].
Brent_Justice said:Unless you are a time traveler how can you know what future games will be like?
Is 3DMark really telling you of what future games will be like?
Gamers buy video cards to play games NOW. They usually don't buy a brand spanking new video card and shelve it for 6 months and then pull it out and use it.
As for predicting future game performance no one can do this with 100% certainty.
What we CAN do though is use the latest games based on the latest game engines that might give us a prediction in how a video card will perform on a game using that game engine.
For example testing DOOM 3, it has a very new engine which WILL be used by a whole lot of games in the future. By testing DOOM 3 we get a glimpse at what kind of performance we MAY see in future titles based on this engine. If nothing else at least we are testing a real game engine that WILL be a popular game engine in future titles.
Same for other games, we want to hit game engines that will be used in the future, HL2, UE3, Cry Engine, etc... These are all state of the art games and engines.
But we are much closer to figuring out future game performance by using real game engines that are going to be used in the future rather than some synthetic benchmark that will never be used as a game engine in any game in the future.
Jbirney said:
The bug was documented as being an issue with vertex data and how it was stored and then leaked into main memory on AGP cards with 256MB of memory. Here is some homework for you.
1) How much vertex data is used in 3dmark2k5 game test 1 (the one that saw the biggest benefit from this driver)?
When you get that answer, compare it to
2) How much vertex data is used in Doom 3?
And see if there is a difference. Of course there is the obvious OpenGL vs DX difference, and Doom being more or less a "DX7 game" vs a "true DX9 benchmark"...
The Batman said:Fine. Since you've done your homework, you can explain to me why the new drivers do very little for Far Cry [except for breaking the SM2.0b path]. It's a DX9-heavy game. It didn't see a massive jump in frames [hell, a lot of users were reporting a few frames lost even].
Jbirney said:The bug is specific to how vertex data is handled, not texture data. Far Cry does not come close to the same amount of vertex data that 3dmark2k5 pushes. No other game stresses your GPU like 3dmark2k5, and that's why you're not seeing it anywhere else. But given these facts there is a very reasonable cause for this drop, and so far there is NO evidence of any foul play going on. The fact that FM approved these drivers also seems to indicate this is a very legit reason. I am not trying to be a butthead, just looking at all of the data before I say they are "cheating".
rancor,
I don't have a 256MB AGP card to test your case. I just got that "new" Sony HP94P LCD (which is very nice), so I have to save up a bit before I get a new video card.
rancor said:Batman, I'm leaning away from it being a cheat, especially after what I just saw with my engine, and it affects nV's cards as well. It's a bug that has to be fixed by both sides; since this many polygons haven't been used in a real-time app before, it seems to be a common flaw in programming technique, or a driver vertex memory access error.
Jbirney said:The fact that FM approved these drivers also seems to indicate this is a very legit reason. I am not trying to be a butthead, just looking at all of the data before I say they are "cheating".
I did not notice any effect on frame rates in Codecult's Code Creatures, so not all programs have this flaw.
The Batman said:I'm not calling it a cheat/hack/what have you. I'm just questioning the validity of this 3dmark05-only performance boost and, MORE IMPORTANTLY, its relevance to real-world gaming.
Like the man said, if Far Cry doesn't stress it, and it's going to be the top dog of DX9 engines [well, maybe not top dog, but I don't imagine any new engine is going to come along and utterly destroy it inside the year] until UE3 debuts... who cares? We'll all have 7800s and XI800s. It didn't do a thing for D3.
Brent_Justice said:The whole idea of a "synthetic" benchmark being affected in performance by drivers completely invalidates it as a "synthetic" benchmark that is supposed to tell you how powerful your video card is.
Elios said:That would also explain the 1000-point jumps of the 66.70 NV drivers, hmmm.....