nVidia to not run all tests in the next 3dMark?

dderidex

Linkage.

So there are at least a couple of tests that no nVidia card will be able to render?

Nicklas Renqvist said:
Our HDR implementation requires full SM3.0 support and FP16 textures & blending. This means that it won’t be possible to run the HDR tests with SM2.0 cards or SM3.0 cards without support for FP16 textures & blending.
Only the r520 can do FP16 blending, no? That's why it can anti-alias FP16 HDR, while the GeForce 7800s cannot.

Interesting.

Oh, and check out the screenshot on the last page.
:drools:

Update: G70 based boards will be able to run the tests, just without AA. Clarification :) - DougLite
 
I am sure that nVidia will release some drivers that enable HDR with AA, at least to some extent.
 
Read it, and I highly doubt nVidia won't counter! :) It's like pissing in the wind. YOU GET WET.
 
Haha, I knew buying the new 7 series would be a bad idea for some.

Now all the benching people can't play their beloved 3D Mark game anymore...
 
Borgschulze said:
Haha, I knew buying the new 7 series would be a bad idea for some.

Now all the benching people can't play their beloved 3D Mark game anymore...

Yeah, people who sit around jacking off to their 3dmark score. To the rest of us, we'll just keep gaming and not notice.
 
dderidex said:
So there are at least a couple of tests that no nVidia card will be able to render?
Huh? The NV4x and higher support FP16 blending.

You're thinking of FP16 blending + AA.

It's an SM3.0 benchmark, so I doubt the NV4x and higher will have any trouble running the default benchmark tests.
 
Every NV card since the 6 series can do FP16/32 HDR; it just can't do that and AA at the same time.

The R42x will not be able to run it, though.
 
That was one of the best interviews from a developer I've read in a very long while...

Thanks for the link
 
I don't see anything saying nVidia won't support the feature, but I did see one test that ATi might not be able to complete, and they were kind of elusive about it:

Vertex texture fetching seems to be big news at the moment (see ATI's Radeon X1000 series for more details); do you have plans to make use of this feature or test it separately in some way? If so, then would this not require you to use something like ATI's R2VB as this could be supported by all SM3.0 implementations currently on the market, whereas sticking to SM3.0 vertex texture fetching would only be supported by one IHV?

Unfortunately I’m not able to comment on this one as the new feature tests for the next 3DMark are still in development (at the time of writing this) and I can’t say for sure if they will make it to the final product.
 
Oh noes let's all freak out over what 3DMark is going to support, I bet the gameplay will be even better than FEAR!
 
As far as I know, every nVidia card in the GeForce FX, GeForce 6, and GeForce 7 series supports FP16 blending (with the GeForce FX supporting FP16 only, the GeForce 6 supporting FP16 and FP32, and I'm not sure about the GeForce 7, but I'm sure it supports AT LEAST FP16 and FP32).

Also, ATi cards (everything above the 9200, the 9200 itself not included) support FP16 and FP24.

The X1-series cards support FP32. I'm not sure about the X-series cards, but I'm pretty sure they're FP16 and FP24.


Also, I think the nVidia card supports AA with HDR (FP16). VALVe said that their original 2 versions of HDR didn't support AA because of the way they rendered, but the final installment (the one we've all got) DOES support AA with HDR, on ALL cards.
 
noobman said:
Also, I think the nVidia card supports AA with HDR (FP16). VALVe said that their original 2 versions of HDR didn't support AA because of the way they rendered, but the final installment (the one we've all got) DOES support AA with HDR, on ALL cards.

The HDR in use in DoD:S (et al) is integer based, not floating point at all.

nVidia hardware is (currently) unable to anti-alias floating point render targets. That's not something that can be "fixed" in a driver - the card is physically incapable of it.

I had misunderstood, thinking the reason it can't anti-alias floating point render targets was that it can't *blend* them. Seems that's not the case, and the G70 will be able to run all the tests... as long as anti-aliasing can be turned off in them (because it certainly can't FSAA FP16 scenes).
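
For what it's worth, D3D9 reports "can blend into an FP16 surface" and "can multisample an FP16 surface" as separate capability queries, which is exactly why a card can have one without the other. A rough sketch of how a benchmark might probe both (my own illustration, not anything from Futuremark; the card names in the comments are just the ones discussed above):

Code:
// Quick D3D9 capability probe (link against d3d9.lib). Illustrative only:
// FP16 render targets, FP16 blending, and FP16 multisampling are three
// separate questions you can ask the runtime.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    const D3DFORMAT fp16 = D3DFMT_A16B16G16R16F;   // 64-bit FP16 RGBA

    // Can we render into an FP16 surface at all?
    HRESULT rt = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, fp16);

    // Can the hardware alpha-blend into that FP16 render target?
    // (NV4x/G70 and R5x0 say yes; R3x0/R4x0 say no.)
    HRESULT blend = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, fp16);

    // Can it multisample that FP16 surface? This is the separate capability
    // that NV4x/G70 lack and R5x0 exposes.
    DWORD quality = 0;
    HRESULT msaa = d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL, fp16, FALSE, D3DMULTISAMPLE_4_SAMPLES, &quality);

    printf("FP16 render target: %s\n", SUCCEEDED(rt)    ? "yes" : "no");
    printf("FP16 blending:      %s\n", SUCCEEDED(blend) ? "yes" : "no");
    printf("FP16 4x MSAA:       %s\n", SUCCEEDED(msaa)  ? "yes" : "no");

    d3d->Release();
    return 0;
}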
 
dderidex said:
Seems that's not the case, and the G70 will be able to run all the tests...
So will the (most/all) NV4x cards.

If it's multi-pass FP16 blending, the NV44/NV44A (6200 TurboCache and 6200A) doesn't support FP16 blending to the framebuffer, so in that case the NV44/NV44A won't support it. If it's single pass, all NV4x and above handle reading FP textures and writing to an FP framebuffer.
 
The R5x0 series of cards runs FP16 HDR + AA like crap.
All 6x00 and 7x00 series cards from nVidia can do FP16 HDR (with the 6200 exceptions listed above).

The 3dMark series has always been a fun game with great graphics and gameplay worthy of buying. :rolleyes:

:p
 
kcthebrewer said:
The 3dMark series has always been a fun game with great graphics and gameplay worthy of buying. :rolleyes:

That's because it's just a tech demo. The best way to see what your graphics card can do is to run an application that concentrates solely on graphics. That way you won't have to worry about it being CPU limited, since there's very little physics, AI, database, or gameplay to worry about.

When benchmarking in games, the entire system is taken into consideration. Your memory, processor, motherboard chipset, sound (ESPECIALLY SOUND, GRRR!!!) and video card are all factors in real world performance. But when you want to see what your video card can REALLY do, with nothing holding it back, 3DMark or any 3D benchmarking utility is preferred.

Taking the video card's pure 3D performance and comparing it to full system performance is kinda dumb.
 
kre62 said:
Yeah, people who sit around jacking off to their 3dmark score. To the rest of us, we'll just keep gaming and not notice.
And there's something wrong with this? :D J/K. The only thing 3DMark is truly good for, IMHO, is seeing if your hardware is performing anywhere in the ballpark of where it should be; if it's 10,000 points shy of average, you know something's set up wrong somewhere... I use '01 for CPU/memory benchmarking (testing CPU overclocks) for the most part, and '03/'05 for graphics.
 
So hard to be overcast and downtrodden with so many new improvements on the way, with solutions abound so that even those living in the shadows may have light to live by. And even though the density of my own thick skull may fail to see the uniform of transition; two words do indeed cross my mind...

holy shit
 
Our HDR implementation requires full SM3.0 support
"Full" support would require a Vertex Texture Fetch. The only ATi card with that is in the XboX2.

I'm not sure that AA with FP16 HDR is an SM3.0 requirement. Sounds like FUD.

Although I never liked the gameplay in 3Dskidmark, so I stopped playing it years ago :p
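
Back to the vertex texture fetch point: in D3D9 that also shows up as its own usage query, and right now only the nVidia SM3.0 parts expose any floating point vertex-texture formats. A rough sketch of checking for it (my own illustration, nothing official):

Code:
// Illustrative D3D9 check for SM3.0 vertex texture fetch support (link
// against d3d9.lib). NV4x/G70 expose R32F and A32B32G32R32F as
// vertex-texture formats; current ATI parts don't, hence the R2VB idea.
#include <d3d9.h>
#include <cstdio>

static bool SupportsVertexTexture(IDirect3D9* d3d, D3DFORMAT fmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, D3DUSAGE_QUERY_VERTEXTEXTURE, D3DRTYPE_TEXTURE, fmt));
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    printf("VTF R32F:          %s\n",
           SupportsVertexTexture(d3d, D3DFMT_R32F) ? "yes" : "no");
    printf("VTF A32B32G32R32F: %s\n",
           SupportsVertexTexture(d3d, D3DFMT_A32B32G32R32F) ? "yes" : "no");

    d3d->Release();
    return 0;
}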
 
PRIME1 said:
I'm not sure that AA with FP16 HDR is an SM3.0 requirement. Sounds like FUD.
I'm pretty sure it's not. I remember people yelling about how SM3.0 and FP16/32 HDR in Far Cry were NOT the same thing.
 
I believe they are all DirectX 9 features, but running AA with FP16 HDR is not a DX9 requirement. Vertex texture fetch is, however - well, at least it was in the spec before; I guess MS thinks it's okay to have another way about it. They have spoken and it is so. By the way, I don't think nVidia *can't* do FP16 HDR + AA; I just think it runs so poorly that nVidia hasn't bothered enabling it in the driver. It's also possible that it's a hardware limit, but I have never seen concrete proof of that, only people saying it is so without a single link to prove it. From what I've seen it doesn't run well on the ATI X1800 either, and that's not saying the X1800 is bad - it's just that HDR + AA at FP16 precision is hardcore stuff. Either way, synthetic benchmarks may not be that important if games begin to include their own benchmarks like FEAR does. I would like to see more of that, because last I checked I can't play 3DMark. I mean, it may be fun to swim through that HDR reflective water, but too bad, huh :(
 
Sly said:
That's because it's just a tech demo. The best way to see what your graphics card can do is to run an application that concentrates solely on graphics. That way you won't have to worry about it being CPU limited, since there's very little physics, AI, database, or gameplay to worry about.

It is not made to be a tech demo. 3DMark is (and probably always will be) an unoptimised PoS that doesn't reflect anything useful for real gaming performance. However, it is pushed as a gaming benchmark, which it is not and never has been. As has been stated a million times before, unless it is built on (a) 'real' game engine(s) it will never be useful, except for people wanting to show off how much money/time they have (to waste - exception to reviewers).

The only thing that matters with a high-end system is how it performs in the games you play (or whatever you are doing with it).

That said, it is still purty and I like watching it run whenever they are released.
 
Lord_Exodia said:
I believe they are all DirectX 9 features, but running AA with FP16 HDR is not a DX9 requirement. Vertex texture fetch is, however - well, at least it was in the spec before; I guess MS thinks it's okay to have another way about it. They have spoken and it is so. By the way, I don't think nVidia *can't* do FP16 HDR + AA; I just think it runs so poorly that nVidia hasn't bothered enabling it in the driver. It's also possible that it's a hardware limit, but I have never seen concrete proof of that, only people saying it is so without a single link to prove it. From what I've seen it doesn't run well on the ATI X1800 either, and that's not saying the X1800 is bad - it's just that HDR + AA at FP16 precision is hardcore stuff. Either way, synthetic benchmarks may not be that important if games begin to include their own benchmarks like FEAR does. I would like to see more of that, because last I checked I can't play 3DMark. I mean, it may be fun to swim through that HDR reflective water, but too bad, huh :(

http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/3.html

Straight from the Chief scientist at Nvidia.
 
dderidex said:
Only the r520 can do FP16 blending, no? That's why it can anti-alias FP16 HDR, while the GeForce 7800s cannot.

No, they are two different things.

The following currently available cards support FP16 blending:
GeForce 6600
GeForce 6600GT
GeForce 6800LE
GeForce 6800
GeForce 6800GT
GeForce 6800 Ultra
GeForce 7800GT
GeForce 7800GTX
Radeon X1800XL
Radeon X1800XT

If your card isn't on this list, it doesn't support FP16 blending. If it is, it does.
 
Brent_Justice said:
Oh noes let's all freak out over what 3DMark is going to support, I bet the gameplay will be even better than FEAR!
Thank you for the quote that should have ended this thread. My thoughts exactly! OMGs! We is 3DMawk and we's invinted someting new and iffs wore cawd dont suportsit, its feces!!!!!
 
Rabidfox said:
it is fake, they're just pixels mister positivity...

Thanks for stating the obvious :)

Now try and read the post I replied to, and put my answer in a context...

Terra...
 
tranCendenZ said:
No, they are two different things.

The following currently available cards support FP16 blending:
GeForce 6600
GeForce 6600GT
GeForce 6800LE
GeForce 6800
GeForce 6800GT
GeForce 6800 Ultra
GeForce 7800GTX
Radeon X1800XL
Radeon X1800XT

If your card isn't on this list, it doesn't support FP16 blending. If it is, it does.

So ATi just caught up to spec this month when they released their new cards? I thought the others they had were comparable to the GF 6 series, since they have been out forever? Well, regardless, it's good to see both companies come into line - better late than never.
 
tranCendenZ said:
No, they are two different things.

The following currently available cards support FP16 blending:
GeForce 6600
GeForce 6600GT
GeForce 6800LE
GeForce 6800
GeForce 6800GT
GeForce 6800 Ultra
GeForce 7800GTX
Radeon X1800XL
Radeon X1800XT

If your card isn't on this list, it doesn't support FP16 blending. If it is, it does.

Damn, I guess my 7800GT doesn't :/ Ugh, I hate wasting $400 for nothing lol
 
According to this list, the newer generation of ATi cards does support FP16 blending.


FP16 processing throughout the pipeline: arbitrary filtering for integer and floating point FP16 textures (including anisotropy), full support for FP16 output into a frame buffer (including any blending operations and even MSAA). FP16 texture compression, including 3Dc.

The 7800 GT does support FP16 blending.

I'd guess that those new funky looking clouds we've seen in recent FarCry 2 videos are the result of FP16 blending. See a cloud hovering midway up a mountain and seamlessly be able to walk to it, through it, as well as view it from the top.
 
Da Frechman said:
I'd guess that those new funky looking clouds we've seen in recent FarCry 2 videos are the result of FP16 blending. See a cloud hovering midway up a mountain and seamlessly be able to walk to it, through it, as well as view it from the top.

I believe it was an engine demonstration of DX10 capabilities.
 
kcthebrewer said:
That said, it is still purty and I like watching it run whenever they are released.

As noted in other threads, it's also damn handy for stress testing an overclock. You can just let it loop for a few hours while you go off and do something useful.

Warrior said:
Who the hell runs 3DMark with AA anyway?!

I do, actually.

It looks pretty good. Modern cards can generally run 3DMark03 (not really '05 yet) at 1680x1050 with 4xAA and "post processing" turned on in the 3DMark application. It looks almost as good as '05 then, and it certainly stresses the video card when testing it!
 
Sc4freak said:

An interesting read, to be sure.

Chief scientist at Nvidia said:
"But with HDR, you render individual components from a scene and then composite them into a final buffer. It's more like the way films work, where objects on the screen are rendered separately and then composited together. Because they're rendered separately, it's hard to apply FSAA (note the full-screen prefix, not composited-image AA! -Ed) So traditional AA doesn't make sense here."

This tells me that it's very unlikely that FSAA is going to work on current-gen nVidia parts. It does not tell me that it can't be done at all - note the editor's comment. There are interesting developments in the patent arena about ways to get around this limitation; here's a quote from another "Chief Scientist":

patent 6956582 issued Oct. 18th 2005 said:
In an image, it is desirable to limit pixel-to-pixel contrast in order to minimize artifacts. Contrast can be based on ratios and is thus logarithmic in nature, as is the sensitivity of the eye to brightness. In practice, intensity (not the log of intensity) is used throughout the graphics pipeline and in the display devices.

A specific embodiment of the present invention improves antialiasing by employing a filter kernel based on contrast. This filter has a specific coverage area but continuously varying size, geometry, and weighting based upon the linear intensity differences or log of intensity differences between neighboring pixels. Further, the results from the filter weighting are normalized in order to meet the Erdahl criterion. This has the effect of limiting the pixel-to-pixel contrast.
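
My rough reading of what that filter does, in code form (purely illustrative - this is my guess at the idea, not the patent's or nVidia's actual implementation): weight each neighbouring sample by how close it is to the centre pixel in log intensity, then normalize the weights so the filter never boosts pixel-to-pixel contrast.

Code:
// Toy contrast-weighted resolve filter, my interpretation of the patent text.
#include <cmath>

float ContrastWeightedResolve(const float* samples, int count, float center)
{
    float sum = 0.0f, weightSum = 0.0f;
    for (int i = 0; i < count; ++i)
    {
        // Contrast is ratio-like, so compare log intensities.
        float contrast = std::fabs(std::log(samples[i] + 1e-6f) -
                                   std::log(center     + 1e-6f));
        // Bigger difference from the centre pixel -> smaller weight.
        float w = 1.0f / (1.0f + contrast);
        sum       += w * samples[i];
        weightSum += w;
    }
    // Normalizing keeps the result bounded by the inputs,
    // which is what limits the pixel-to-pixel contrast.
    return (weightSum > 0.0f) ? sum / weightSum : center;
}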
 