New Benchmarks would be nice. Questions about normal maps and power supplies

TommyTwix07

I would just like to see a new benchmark between the 6800 Ultra with the latest ForceWare drivers and the X800XT with the Cat 4.7s, or the 4.8s when they come out. That will finally help me decide whether to cancel my order for my X800XT or buy the PNY 6800 Ultra from one of the fellow forum members.

Both cards are amazing, there is no doubt. I like the video encoding features on the NVIDIA card and the dual DVI. But ATI still seems to be outperforming NVIDIA in every game besides Doom 3 (not to say that trend will continue; from what I hear they are now about neck and neck, but I have no reputable source because all comparisons are with old drivers). What I want to know is how the 6800 Ultra handles normal maps compared to the X800XT PE (with its 3Dc technology), and how ATI's latest drivers will help with OpenGL performance. I'd also like to see how NVIDIA's latest drivers help performance in some of the newer titles.

(Maybe the guys who run this site can do something about that, because I know I can't get both pieces of hardware as much as I'd like to, and this is probably going to be the last video card I buy.)

I'd also like to know if you guys think an Antec True 430 will power a 6800 Ultra if I have 3 hard drives, 2 optical drives, 4 case fans, two cold cathodes, an Asus P4C800-E motherboard, and a gig of Corsair ULL memory.
 
I can tell you this much...the 6800 Ultra and the XTPE are seriously neck-and-neck...so either card is a great choice...

The general consensus is that ATI won't be coming out with new OpenGL drivers any time soon...however, in the case of Doom 3, it's not that the XTPE does badly...it just gets beaten soundly by the 6800GT (the $400 card)...

Remember, though, that the Ultra is also a two-slot card...but it does have dual DVI (I have 2 LCDs so I'll definitely be getting one at some point)

As far as power goes, I'm sure your p/s would do just fine...
 
You won't be missing out on anything by getting either card, unless you have 2 displays, then the Ultra is for you. I'd say whichever you can get cheapest.
 
heyheyhey said:
You won't be missing out on anything by getting either card, unless you have 2 displays, then the Ultra is for you. I'd say whichever you can get cheapest.

Oh yeah, I should probably mention that the NVIDIA drivers have much better multiple-monitor support and Linux support...
 
Are there any games out there using 3Dc compression yet? Or even planning on using it?
 
Koho said:
Are there any games out there using 3Dc compression yet? Or even planning on using it?

There are a few games that have announced 3Dc support. Far Cry is about the only game out right now that will use it.

There will be very little performance difference and almost no IQ difference between 3Dc and DXT5 though. The 6800s will use DXT5 compression, which is pretty much the same thing; 3Dc is essentially DXT5 with a few extensions from ATI.
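To make the "pretty much the same thing" point concrete, here's a rough sketch (Python, purely for illustration, not anyone's actual driver or engine code) of the usual DXT5-style normal map trick: only X and Y get stored, with X pushed into the alpha channel and Y into green since those get the most precision in a DXT5 block, and the shader rebuilds Z afterwards. 3Dc does the same two-channel idea, just with dedicated storage for both channels.

Code:
import math

def to_byte(v):
    """Map a [-1, 1] component to an 8-bit value."""
    return int(round((v * 0.5 + 0.5) * 255))

def pack_normal_for_dxt5(nx, ny):
    """Illustrative only: the usual DXT5 normal-map trick keeps X and Y and
    drops Z. X goes in the alpha channel and Y in green, because those are
    the two highest-precision channels in a DXT5 block."""
    r, g, b, a = 0, to_byte(ny), 0, to_byte(nx)
    return r, g, b, a

def reconstruct_normal(g, a):
    """What the pixel shader does with DXT5- or 3Dc-compressed normals:
    read X and Y back, then derive Z from x^2 + y^2 + z^2 = 1."""
    x = (a / 255.0) * 2.0 - 1.0
    y = (g / 255.0) * 2.0 - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return x, y, z

if __name__ == "__main__":
    r, g, b, a = pack_normal_for_dxt5(0.3, 0.4)
    print(reconstruct_normal(g, a))   # roughly (0.3, 0.4, 0.866)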
 
Current nVidia cards support better normal map compression, but both ATi's and nVidia's implementations have limitations.
 
TommyTwix07 said:
What I want to know is how the 6800 Ultra handles normal maps compared to the X800XT PE (with its 3Dc technology), and how ATI's latest drivers will help with OpenGL performance. I'd also like to see how NVIDIA's latest drivers help performance in some of the newer titles.

The X800s don't use 3Dc except in games where it's been implemented, and there currently aren't any. Far Cry will probably be one of the first, if HL2 doesn't release before that with 3Dc support. 3Dc will have little benefit in HL2 though because it rarely uses normal maps. HL2 will also have SM 3.0 support.

You can find more info on 3Dc vs DXT5 along with pics of both here:

http://www.nvnews.net/vbulletin/showthread.php?t=30772&page=4

ATI's latest drivers are the beta 4.8s, and they were used in the Doom 3 benches.

TommyTwix07 said:
I'd also like to know if you guys think an Antec True 430 will power a 6800 Ultra if I have 3 hard drives, 2 optical drives, 4 case fans, two cold cathodes, an Asus P4C800-E motherboard, and a gig of Corsair ULL memory.

An Antec True Power 430W will be plenty. I have quite a bit of shit in mine and I've run both the True Power 430W and the True Blue 480W.
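For a quick sanity check, here's a back-of-the-envelope tally; every per-component wattage is a ballpark guess on my part, not a measurement, so treat the numbers as a rough illustration only.

Code:
# Very rough power-budget estimate for the rig described above.
# Every figure here is a ballpark assumption, not a measured value.
components = {
    "P4 CPU (load)":        100,
    "6800 Ultra (load)":    110,
    "P4C800-E board + RAM":  40,
    "3 hard drives":         3 * 12,
    "2 optical drives":      2 * 20,
    "4 case fans":           4 * 3,
    "2 cold cathodes":       2 * 5,
}

total = sum(components.values())
psu_rating = 430

for name, watts in components.items():
    print(f"{name:24} ~{watts:4d} W")
print(f"{'estimated total':24} ~{total:4d} W of a {psu_rating} W supply")
# Even with generous estimates this lands well under 430 W, though rail
# distribution (especially the 12V line) matters more than the headline number.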
 
I must say the DXT lighting artifacts stand out much more in full motion than in a static screenshot. Having said that, the difference is still minimal.
 
Koho said:
I must say the DXT lighting artifacts stand out much more in full motion than in a static screenshot. Having said that, the difference is still minimal.

The lighting artifacts? What does 3Dc have to do with lighting? It compresses textures on models....:confused:

Quite a few people have run that 3Dc demo on 6800s and haven't noticed any real IQ issues. The 3Dc demo also reflects a "worst case scenario". That doesn't mean there is going to be a noticeable difference in-game in Far Cry and Half-Life 2. And you don't exactly run around with your face on the floor when you're playing games anyway.

http://www.nvnews.net/vbulletin/showthread.php?t=30772&page=1&pp=15

If you look at those first pics from farther back, you really can't tell a difference between the two. And it's not like you're going to have an X800 and a 6800 running side by side trying to find tiny IQ differences, lol. When you're playing the game you'll never know the difference.
 
I always thought compression worked on normal maps too, but if you read the latest Doom 3 update in the other thread, it mentions normal maps, specular maps, and diffuse maps all in various states of compression.

I would assume specular maps would have something to do with compressing lighting. (Dunno for sure.)

http://www.ati.com/products/radeonx800/3DcWhitePaper.pdf is ATI's official paper on 3Dc tech. I'll be the first to admit I don't understand it completely, especially the part about DXTC traditionally compressing 3- and 4-channel maps (I'm assuming RGB and RGBA) while 3Dc instead optimizes for 2. That doesn't make a lot of sense to me. 3Dc does seem to be able to compress values like shininess, roughness, and transparency now though (at least according to the whitepaper).
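If I'm reading the whitepaper right, the "optimizing for 2 channels" part just means each 4x4 tile of a single channel gets its own DXT5-alpha-style block, and a 3Dc block is simply two of those back to back (one for X, one for Y). Here's a rough decode sketch of one such channel block; this is my own reading of the format, not ATI's code, so take it as illustrative.

Code:
def decode_3dc_channel_block(block8):
    """Sketch of how one channel block decodes, per my reading of the 3Dc
    whitepaper and the DXT5 alpha block it was derived from (not ATI code).
    Each 8-byte block covers a 4x4 tile of ONE channel: two 8-bit endpoints,
    then sixteen 3-bit indices picking values interpolated between them.
    A full 3Dc block is just two of these, one for X and one for Y."""
    e0, e1 = block8[0], block8[1]
    if e0 > e1:
        # 8-value mode: endpoints plus six interpolated steps
        palette = [e0, e1] + [((7 - i) * e0 + i * e1) // 7 for i in range(1, 7)]
    else:
        # 6-value mode: endpoints, four steps, plus explicit 0 and 255
        palette = [e0, e1] + [((5 - i) * e0 + i * e1) // 5 for i in range(1, 5)] + [0, 255]
    bits = int.from_bytes(bytes(block8[2:8]), "little")  # 16 x 3-bit indices
    return [palette[(bits >> (3 * i)) & 0x7] for i in range(16)]

if __name__ == "__main__":
    # A made-up block: endpoints 200 and 40, all texels pointing at endpoint 0
    print(decode_3dc_channel_block(bytes([200, 40, 0, 0, 0, 0, 0, 0])))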
 
OK, I just got done running that 3Dc demo by Humus and I have been all over that freaking little room, and I don't even see the slight artifacting issue that was shown in those DXT5 vs 3Dc photos on nV News. I've tried a dozen different angles in well-lit portions of the room and NOTHING. It looks gorgeous. It looks pretty much like the 3Dc pic.

If you have to go to all this trouble and still can't find a difference in IQ using DXT5, then the IQ argument for 3Dc is an absolute joke.

This demo is supposed to be a "worst case scenario" of 3Dc vs DXT5.

Has anyone else with a 6800 run the 3Dc demo? I'm using a 6800GT and the 61.76 WHQL drivers.

I'm going to try to take some screenshots later on. I already took some, but I'm having to download some programs to view and convert the .tga files.
 
Thanks for all the feedback... I was under the impression that HL2 was going to use a lot of normal maps. In addition, I was under the impression that practically any effect in PS 3.0 can be done with PS 2.0+, it just may require a few more instructions. The other thing is, in that screenshot provided below I can definitely pick up on a banding effect around the edges of the bricks/tiles in the NVIDIA 6800 Ultra screenshot (I pick up on that shit a lot; if they were small tiles in an FPS I'd probably never notice it, but I do find it odd with NVIDIA so focused on precision, and in a game where I'd be moving cautiously, like Doom 3, Thief, or Splinter Cell, I might pick up on it).

And has anyone who owns a 6800 Ultra noticed an improvement in video encoding speed as a result of their hardware?
 
TommyTwix07 said:
Thanks for all the feedback... I was under the impression that HL2 was going to use a lot of normal maps. In addition, I was under the impression that practically any effect in PS 3.0 can be done with PS 2.0+, it just may require a few more instructions. The other thing is, in that screenshot provided below I can definitely pick up on a banding effect around the edges of the bricks/tiles in the NVIDIA 6800 Ultra screenshot (I pick up on that shit a lot; if they were small tiles in an FPS I'd probably never notice it, but I do find it odd with NVIDIA so focused on precision, and in a game where I'd be moving cautiously, like Doom 3, Thief, or Splinter Cell, I might pick up on it).

And has anyone who owns a 6800 Ultra noticed an improvement in video encoding speed as a result of their hardware?

Like I said, I just got done running the 3Dc demo on my 6800GT and I didn't notice the banding effect that showed up in those pics, even though I worked very hard at trying to find it. I will continue to look for it, and maybe someone can give me some tips. If it's that hard to find, you're never going to notice it while playing in-game.

And these are the differences between SM 2.0b and SM 3.0.

Dependent texture limit: 2.0b = 4, 3.0 = no limit
Position register: 2.0b = none, 3.0 = yes
Executed instructions: 2.0b = 512, 3.0 = 65536
Interpolated registers: 2.0b = 2 + 8, 3.0 = 10
Instruction predication: 2.0b = none, 3.0 = yes
Indexed input registers: 2.0b = none, 3.0 = yes
Constant registers: 2.0b = 32, 3.0 = 224
Arbitrary swizzling: 2.0b = none, 3.0 = yes
Gradient instructions: 2.0b = none, 3.0 = yes
Loop count register: 2.0b = none, 3.0 = yes
Face register (two-sided lighting): 2.0b = none, 3.0 = yes
Dynamic flow control depth: 2.0b = none, 3.0 = 24
Minimum required full precision: 2.0b = FP24 (96-bit), 3.0 = FP32 (128-bit)
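To put the executed-instruction and flow-control rows above in perspective, here's a rough back-of-the-envelope comparison; the per-light costs are numbers I made up purely for illustration, not real shader costs.

Code:
# Rough, invented numbers to illustrate why the executed-instruction and
# flow-control limits in the list above matter; not real shader costs.
PER_LIGHT_COST = 40      # assumed instructions to shade one light
FIXED_OVERHEAD = 30      # assumed setup instructions per pixel

def max_lights(executed_instruction_limit):
    """How many lights fit in a single pass under a given limit."""
    return (executed_instruction_limit - FIXED_OVERHEAD) // PER_LIGHT_COST

print("SM 2.0b (512 executed):  ", max_lights(512), "lights per pass")
print("SM 3.0  (65536 executed):", max_lights(65536), "lights per pass")
# With SM 2.0b the loop also has to be unrolled at compile time (no loop
# count register), so the shader is rebuilt for each light count; SM 3.0
# can just branch per pixel with dynamic flow control.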

The differences are pretty much night and day.

SM 3.0 has only been implemented in ONE game, Far Cry, and it did very little to take advantage of what SM 3.0 can do over SM 2.0b. There are over a dozen other games this year that will have SM 3.0 support.

We'll just have to wait and see what kind of real benefits come out of it.
 
burningrave101 said:
Like I said, I just got done running the 3Dc demo on my 6800GT and I didn't notice the banding effect that showed up in those pics, even though I worked very hard at trying to find it. I will continue to look for it, and maybe someone can give me some tips. If it's that hard to find, you're never going to notice it while playing in-game.

And these are the differences between SM 2.0b and SM 3.0.



The differences are pretty much night and day.

SM 3.0 has only been implemented in ONE game, Far Cry, and it did very little to take advantage of what SM 3.0 can do over SM 2.0b. There are over a dozen other games this year that will have SM 3.0 support.

We'll just have to wait and see what kind of real benefits come out of it.

One thing that stood out to me between the 6800s and X800s was that in Far Cry the X800s can only do 3 lights per pass instead of 4 like the 6800s, since they don't support enough instructions...I figure if we're already bumping into those kinds of limitations...you'll definitely want the 6800 if you want to keep it for any length of time...
 
Brent_Justice said:

Oh yeah, that explains it, lol.

OK, then what is 3Dc in comparison to DXT5?

ATI's 3Dc normal map compression algorithm utilises a tweaked version of the DXT5 compression mode in DXTC/S3TC.

http://www.beyond3d.com/reviews/ati/r420_x800/index.php?p=11

ChrisRay said:
It can't, but it can do DXT5. 3Dc is merely an extension of DXT5 for the most part.

http://www.nvnews.net/vbulletin/showthread.php?t=30772&page=1&pp=15

DSC said:
SM 3.0 is part of DirectX 9.0; 3Dc isn't. If other 3D companies don't support 3Dc, then it's a dead duck. Remember TruForm? 3Dc wasn't even invented by ATI, it's DXT5 with their modifications. It's S3's invention (S3TC), licensed to MS (DXTC).
 
I'd like to see a non-ATI, non-NVIDIA website's take on 3Dc vs DXT5; everyone keeps linking to stuff from the vendors' own sites, which overplay their benefits. And I'd suggest just sticking with your X800; I don't think you'll be disappointed now or in the future, unless it's going to take like 50 years to actually get it :rolleyes: damn shortages. Don't listen to some of the people on here that are hardcore about a single card. ATI's OpenGL rewrite (if properly done) should close the gap big time in the near future. Also, no one knows if SM 3.0 is actually going to make a big difference, same thing with 3Dc, only time will tell. And am I the only one that thinks HDR is just plain insignificant?
 
xSyzygy666x said:
I'd like to see a non-ATI, non-NVIDIA website's take on 3Dc vs DXT5; everyone keeps linking to stuff from the vendors' own sites, which overplay their benefits. And I'd suggest just sticking with your X800; I don't think you'll be disappointed now or in the future, unless it's going to take like 50 years to actually get it :rolleyes: damn shortages. Don't listen to some of the people on here that are hardcore about a single card. ATI's OpenGL rewrite (if properly done) should close the gap big time in the near future. Also, no one knows if SM 3.0 is actually going to make a big difference, same thing with 3Dc, only time will tell. And am I the only one that thinks HDR is just plain insignificant?

Beyond3D is as fair as you're going to get for a non-ATI/nVidia site. And the FACTS don't change no matter what site you're quoting them from. You might have to look around a little if you want both sides of the story in full detail though.

nVidia is working on new optimizations to add to their 62.xx series drivers. There is just as good a chance of nVidia taking the lead in D3D as there is of ATI getting OpenGL drivers that can compete against the 6800s, lol.

And am I the only one that thinks 3Dc compared to DXT5 is just plain insignificant? Especially since DXT5 has been around for years and was never truly adopted by game developers.

nVidia still has more money and more GPU market share than ATI, so game developers are mostly going to be playing to nVidia's tune.
 
burningrave101 said:
Like I said, I just got done running the 3Dc demo on my 6800GT and I didn't notice the banding effect that showed up in those pics, even though I worked very hard at trying to find it. I will continue to look for it, and maybe someone can give me some tips. If it's that hard to find, you're never going to notice it while playing in-game.

And these are the differences between SM 2.0b and SM 3.0.

The differences are pretty much night and day.

SM 3.0 has only been implemented in ONE game, Far Cry, and it did very little to take advantage of what SM 3.0 can do over SM 2.0b. There are over a dozen other games this year that will have SM 3.0 support.

We'll just have to wait and see what kind of real benefits come out of it.

I just ran the demo on my 5900 Ultra...I can see what he meant about worst-case scenario...you would have to put it in just the right position at just the right angle and snap the screenshot at just the right second to see the banding...I played with it and I can make out the banding, but not as bad as their screenshots showed...I'll check it out again tomorrow when I get my GT and see if there is a difference...
 
burningrave101 said:
Beyond3D is as fair as you're going to get for a non-ATI/nVidia site. And the FACTS don't change no matter what site you're quoting them from. You might have to look around a little if you want both sides of the story in full detail though.

nVidia is working on new optimizations to add to their 62.xx series drivers. There is just as good a chance of nVidia taking the lead in D3D as there is of ATI getting OpenGL drivers that can compete against the 6800s, lol.

And am I the only one that thinks 3Dc compared to DXT5 is just plain insignificant? Especially since DXT5 has been around for years and was never truly adopted by game developers.

I'm waiting to see what 3Dc can do in Far Cry before I even start to form an opinion about it...
 
Where does it say that they used the beta Cat 4.8s in the Doom 3 benchmarks? And if you look at where the blocks taper off... it isn't so much banding as it is a boxiness in the gradient of the shading. I saw it immediately. I'm not trying to say it's that big of a deal, but with other applications it might become more prevalent.
 
burningrave101 said:
Beyond3D is as fair as you're going to get for a non-ATI/nVidia site. And the FACTS don't change no matter what site you're quoting them from. You might have to look around a little if you want both sides of the story in full detail though.

nVidia is working on new optimizations to add to their 62.xx series drivers. There is just as good a chance of nVidia taking the lead in D3D as there is of ATI getting OpenGL drivers that can compete against the 6800s, lol.

And am I the only one that thinks 3Dc compared to DXT5 is just plain insignificant? Especially since DXT5 has been around for years and was never truly adopted by game developers.

nVidia still has more money and more GPU market share than ATI, so game developers are mostly going to be playing to nVidia's tune.


3Dc has at least a little value, but I agree it isn't a very big deal. And facts DO change from site to site, so be careful of your source; they're not always truthful, especially those who are trying to sell you something. Are YOU going to go onto a car lot and believe everything THEY tell you about the car? Of course you're not, because they overstate the truth and exaggerate things. The best source is a neutral third-party source :D And btw, I'm getting a GT, so don't think I'm defending the X800s in any way.
 
TommyTwix07 said:
Where does it say that they used the beta Cat 4.8s in the Doom 3 benchmarks? And if you look at where the blocks taper off... it isn't so much banding as it is a boxiness in the gradient of the shading. I saw it immediately. I'm not trying to say it's that big of a deal, but with other applications it might become more prevalent.

In the HardOCP article they state the version 8.05 driver was used...the current version for Cat 4.7 is 8.03...

Did you run the demo or look at the screenshot?
 
^eMpTy^ said:
In the HardOCP article they state the version 8.05 driver was used...the current version for Cat 4.7 is 8.03...

Did you run the demo or look at the screenshot?

He was looking at the screenshot.

And what card were they using to compare the DXT5 to 3Dc? Was it a 6800? I need to read back over that thread again, I guess.

I've gone through that 3Dc demo several times now and I'm not noticing anything. The IQ is excellent. Could it have been a driver issue that caused that artifacting/banding?
 
burningrave101 said:
He was looking at the screenshot.

And what card were they using to compare the DXT5 to 3Dc? Was it a 6800? I need to read back over that thread again, I guess.

I've gone through that 3Dc demo several times now and I'm not noticing anything. The IQ is excellent. Could it have been a driver issue that caused that artifacting/banding?

I have the latest drivers installed with my 5900 and the banding is definitely there...just about half as noticeable as shown in that screenshot...I'll let you know if it changes when I install the GT...
 
^eMpTy^ said:
I have the latest drivers installed with my 5900 and the banding is definitely there...just about half as noticeable as shown in that screenshot...I'll let you know if it changes when I install the GT...

Well, I'll keep looking for it, lol, but I can tell you guys who are wondering about 3Dc IQ compared to DXT5 on a 6800 right now: if I'm seeing anything, it's not worth mentioning.
 
DeschutesCore said:
Current nVidia cards support better normal map compression, but both ATi's and nVidia's implementations have limitations.


Hmmm, both support DXT5 and 3Dc? (Though at this moment NVIDIA doesn't actually support 3Dc, it could easily enough be added if they desired.)

Anyway, on the image comparison, here is a picture from a discussion on Beyond3D (where someone discovered that the DXT5 compressor used wasn't compressing very well, so the quality was bad, and the DXT5 texture has since been replaced with a better-compressed one that is hard to tell apart). Here is the picture comparing the normal maps in 3Dc and DXT5 mode.

3DcDXT5compare2.jpg


From here http://www.beyond3d.com/forum/viewtopic.php?p=322435&highlight=#321646

Anyway, you would hope they would look fairly similar, considering they are practically the same thing with a slight change of dropping some channels to compress, since you only need 2 channels for a normal map.

Edit: Also, here is the original picture comparing 3Dc, where you can see there are some obvious problems (on page 1 of the thread above; you can look on page 2 where someone points out the issues in the demo).

3DcDXT5compare.jpg
 