HDR + AA


DeathCloud
This may have been answered before, but oh well. Why is it that ATI cards can do both HDR and AA, whereas Nvidia can only do one at a time? I have no preference between these two companies, so please don't turn this into a vs. match. Also, why hasn't Nvidia stepped up and tried to deliver both HDR and AA on their newest cards?
 
NV cards do not physically support FP16 blending + AA in the framebuffer; it is a hardware capability they simply don't have. ATI does: they built it into the X1K architecture.
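For anyone who wants to see what that looks like from the software side, here's a rough sketch (purely illustrative, not from any shipping engine; the X8R8G8B8 display format and 4x sample count are just example values) of the D3D9 caps queries an app can use to detect FP16 blending and FP16 multisampling:

Code:
// Sketch: ask D3D9 whether A16B16G16R16F can be a render target with
// post-pixel-shader (framebuffer) blending, and whether that format can
// be multisampled. GeForce 6/7 pass the first check but fail the second;
// Radeon X1K passes both.
#include <d3d9.h>

bool SupportsFp16BlendingWithMsaa(IDirect3D9* d3d)
{
    // FP16 render target with framebuffer blending?
    if (FAILED(d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
            D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F)))
        return false;

    // Can an FP16 surface also be multisampled (e.g. 4x)?
    DWORD quality = 0;
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &quality));
}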
 
Brent_Justice said:
NV cards do not physically support FP16 blending + AA in the framebuffer; it is a hardware capability they simply don't have. ATI does: they built it into the X1K architecture.


Interesting. It's hard to believe that Nvidia would not make this one of their top priorities. Again, I have no bias either way, but with how tight the competition is between these two companies, I would think this would be a big deal. Maybe with Nvidia's DX10 cards they will try to implement this.
 
It is what it is

Making an educated guess, I would say their next gen will be able to do it.

There is actually another feature difference as well concerning HDR: ATI doesn't support FP16 filtering, but NV does.

So it is kind of weird:

ATI
No FP16 Filtering
But FP16 Blending w/AA

NV
FP16 Filtering
FP16 Blending but w/o AA


Hopefully ATI's next gen will support FP16 filtering and hopefully NV's next gen will support FP16 Blending + AA.
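For completeness, the filtering side of that list can be probed the same way; again, just an illustrative sketch:

Code:
// Sketch: D3DUSAGE_QUERY_FILTER asks whether the driver can bilinearly
// (or better) filter a texture format. For D3DFMT_A16B16G16R16F this is
// where GeForce 6/7 say yes and X1K says no.
#include <d3d9.h>

bool SupportsFp16Filtering(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_FILTER, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}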
 
Good info. Like you said, hopefully both companies' next-gen cards will support this feature and more.
 
DeathCloud said:
Interesting. It's hard to believe that Nvidia would not make this one of their top priorities. Again, I have no bias either way, but with how tight the competition is between these two companies, I would think this would be a big deal. Maybe with Nvidia's DX10 cards they will try to implement this.

Not really that odd. Nv beat ATi to the punch with a ton of features in their GF6, while ATi was really dragging their feet with the feature checklist in the X800 era. While they come out with chips every 6 months, these companies don't respin the architecture from the ground up. In some ways, while not as extreme, this is now Nv's X800: they added pipes to the GF6, they upped the clocks. Milking the cash cow, and behind all these refreshes the next BIG thing is being worked on. It won't have been in development for just 6 months; behind the scenes Nv is working on the GF8 and checking their list once, twice, or thrice. Take all the flak they've been taking lately for shimmering: they simply won't fix it in hardware, because they are working on their next thing instead.

I've been labelled a !!!!!! for this lately, but if you're going single card and don't care about the heat or power, ATi is the way to go right now. It is more advanced, undeniably and unquestionably. I think the latest fan war was probably the GF7900 availability thread, and that thread tanked like a rock once the HDR+AA and Oblivion threads came out.

No doubt in my mind, Nv's next cards, being really NEW, will have the kitchen sink and then some inside. And a year, maybe two, later we'll be asking why the refresh doesn't do such and such. Just long enough for the big guns to deliver again.
 
So it is kind of weird:

ATI
No FP16 Filtering
But FP16 Blending w/AA

NV
FP16 Filtering
FP16 Blending but w/o AA

Filtering isn't always necessary with HDR (Farcry doesn't use it AFAIK), and when it is, there is an easy and efficient workaround in a shader with hardly any performance impact (used in 3DMark06, SS2, Oblivion(?)).
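For those curious what that workaround amounts to: you point-sample the four nearest texels yourself and blend them with the standard bilinear weights. Here's the idea sketched as plain C++ doing the same math a pixel shader would (all names, and the clamp addressing, are made up for illustration):

Code:
// Sketch of manual bilinear filtering for an FP16 texture the hardware
// can't filter: point-sample the four neighbouring texels and lerp.
#include <cmath>

struct Color { float r, g, b, a; };

static Color Lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t, a.a + (b.a - a.a) * t };
}

// texels: row-major image already expanded from FP16 to float.
Color BilinearFetch(const Color* texels, int w, int h, float u, float v)
{
    // UVs to texel space, offset by half a texel (D3D9 sampling convention).
    float x = u * w - 0.5f, y = v * h - 0.5f;
    int x0 = (int)std::floor(x), y0 = (int)std::floor(y);
    float fx = x - (float)x0, fy = y - (float)y0;

    auto at = [&](int xi, int yi) {  // clamp-to-edge addressing
        xi = xi < 0 ? 0 : (xi >= w ? w - 1 : xi);
        yi = yi < 0 ? 0 : (yi >= h ? h - 1 : yi);
        return texels[yi * w + xi];
    };

    // Two horizontal lerps, then one vertical: the standard bilinear weights.
    Color top    = Lerp(at(x0, y0),     at(x0 + 1, y0),     fx);
    Color bottom = Lerp(at(x0, y0 + 1), at(x0 + 1, y0 + 1), fx);
    return Lerp(top, bottom, fy);
}

Four point samples plus a few MADs per fetch is why the performance impact is so small in practice.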
 
Necessary or not, it is a feature ATI does not have with HDR; what I posted are facts.

I find it kind of funny that each one has a feature the other lacks.
 
Interesting that Oblivion is an Nvidia stamped game, and it runs better with ATI hardware. Gamebranding sucks! An awful way to promote one's product at the expense of the consumer.
 
Mrwang said:
Interesting that Oblivion is an Nvidia stamped game, and it runs better with ATI hardware. Gamebranding sucks! An awful way to promote one's product at the expense of the consumer.

It's that way with games all the time, but normally ATI has somewhat of an advantage in DX titles and nVidia has the advantage in OpenGL. You should remember how HL2 was, and the fact that it was a highly promoted release for ATI hardware; but once benchmarks started coming out, ATI didn't really have much of an advantage over nVidia at all, even though the game had been built from the ground up on ATI hardware.
 
For clarification, HDR and AA can play on the same playground on nVidia cards, depending on the implementation. One must differentiate the common method (a floating-point buffer) from others that may use different techniques and expose different limitations of different architectures.

Mrwang said:
Interesting that Oblivion is an Nvidia stamped game, and it runs better with ATI hardware. Gamebranding sucks!

A TWIMTBP stamp doesn't ever directly indicate that performance is better with nVidia cards. The Way It's Meant To Be Played is just a poor name for a program that allows developers and nVidia to work together to optimize performance, image quality, and, potentially, stability. The name is misleading, but don't be led astray by it.
 
Brent_Justice said:
Hopefully ATI's next gen will support FP16 filtering and hopefully NV's next gen will support FP16 Blending + AA.

They will. Both of the items on that wishlist are part of the D3D10 spec; G80 has already been announced to be a D3D10 part, and R600 is expected to be one as well.
 
Brent_Justice said:
It is what it is


ATI
No FP16 Filtering
But FP16 Blending w/AA

NV
FP16 Filtering
FP16 Blending but w/o AA

If you had to pick one or the other, which would you pick: FP16 blending w/AA but no FP16 filtering, or FP16 filtering but no FP16 blending w/AA?
 
ATI has always been the leader in AA.
ATI having HDR+AA, while nVidia can only do HDR or AA
is simply the new SM 2.0 vs SM 3.0 of the last generation.
X800 series couldn't do SM 3.0, 6800 could.
Next gen it will be all leveled out.
 
Majin said:
ATI having HDR+AA, while nVidia can only do HDR or AA

If I were a moderator, I'd make it mandatory to clarify which capabilities nVidia cards do not have. It is, as I've already stated, not always a case of "no HDR and multisampling on nVidia cards EVER"; it in fact varies with the implementation.

You are also not taking into account that supersampling can change the game. Remember this.
 
johnnq said:
i can do hdr+ 4xfsaa on my 7800gt just fine...
:p What about the other 3-4 games that use FP16 HDR and support AA? Think about the children!
 
Mrwang said:
Interesting that Oblivion is an Nvidia stamped game, and it runs better with ATI hardware. Gamebranding sucks! An awful way to promote one's product at the expense of the consumer.

Game Branding is for suckers. It's strictly a marketing ploy to sell more Nvidia or ATI cards.

EQ2 is an Nvidia game, and here we are 16 months after it launched and it still runs far better on ATI.
 
EQ2 is an Nvidia game, and here we are 16 months after it launched and it still runs far better on ATI.

Don't forget, there are still a lot of bugs yet to be addressed with that game and ATI cards, one being the hitching while AA is enabled.

lol source hdr = wanna be hdr

Not having nearly the range of FP16, it still has an impressive look for being a "wanna-be", and to a lot of people it is still comparable; it probably just takes a lot more development time to get it to look that good.
 
I hate it when people say things like "EQ2 is an NVIDIA game" or "HL2 is an ATI game." If you check, you'll see EQ2 is a Sony Online game, HL2 is a Valve game, and Oblivion is a Bethesda game. Game developers aren't out to alienate their market; they just want to sell their games to everyone they can. It is all about selling the most copies, and they can't do that if they only target one vendor. Things like Get in the Game and The Way It's Meant to Be Played should not be exaggerated into more than what they are.

/end rant
 
Brent_Justice said:
I hate it when people say things like "EQ2 is an NVIDIA game" or "HL2 is an ATI game." If you check, you'll see EQ2 is a Sony Online game, HL2 is a Valve game, and Oblivion is a Bethesda game. Game developers aren't out to alienate their market; they just want to sell their games to everyone they can. It is all about selling the most copies, and they can't do that if they only target one vendor. Things like Get in the Game and The Way It's Meant to Be Played should not be exaggerated into more than what they are.

/end rant

It's not the game developers that are at fault here. Game branding is perpetuated by the graphics card companies. Nvidia will be paying a lot of money to have their name stamped on UT 2007. There is way too much of that these days. ATI is at fault here as well. I don't think having an Nvidia or ATI trademark on the box sells more games. It's the other way around: card makers sell more cards by making the consumer believe their products run a game better.
 
yea dude you have it wrong. they stamp their names on the games to sell THEIR products, not the game itself.
 
johnnq said:
yea dude you have it wrong. they stamp their names on the games to sell THEIR products, not the game itself.

Read my post again. ;) I am not saying anything differently than you.
 
zzzVideocardzzz said:
lol source hdr = wanna be hdr

I'm sorry, but I just have to say that IMO the HDR in Lost Coast was simply amazing looking. I have Oblivion, Farcry, and SS2, which all have HDR, and I simply think Lost Coast has the best-looking HDR out of all of them. I don't see why it's "wanna be" at all.
 
Valve's integer method is comparable to any floating point method - I find nothing fake about it at all.

There's a strange mentality about precision these days. People have spat at nVidia's mixed precision modes for a long time, but what many never realize is that sometimes there simply isn't a tangible difference between 16 bits of precision and 24 bits. I absolutely believe that we need floating-point precision for something like HDR, but if tradeoffs have to be made in the transitional phase (integer precision instead of floats), I find nothing wrong with that.
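To make the integer-vs-float tradeoff concrete, here's one classic way of packing HDR range into an ordinary 8-bit-per-channel buffer: Greg Ward's RGBE shared-exponent encoding. To be clear, this is not necessarily what Valve does, just an illustration that dynamic range can survive without FP16:

Code:
// Sketch of Greg Ward's RGBE encoding: HDR colour in 4 bytes, with a
// shared exponent in the fourth channel. Illustrative only; NOT a claim
// about Valve's actual implementation.
#include <algorithm>
#include <cmath>
#include <cstdint>

struct Rgbe { uint8_t r, g, b, e; };

Rgbe EncodeRgbe(float r, float g, float b)
{
    float maxc = std::max(r, std::max(g, b));
    if (maxc < 1e-32f) return {0, 0, 0, 0};

    int exp;
    float scale = std::frexp(maxc, &exp) * 256.0f / maxc; // maxc = m * 2^exp
    return { (uint8_t)(r * scale), (uint8_t)(g * scale),
             (uint8_t)(b * scale), (uint8_t)(exp + 128) };
}

void DecodeRgbe(Rgbe c, float& r, float& g, float& b)
{
    if (c.e == 0) { r = g = b = 0.0f; return; }
    float scale = std::ldexp(1.0f, (int)c.e - (128 + 8)); // 2^(e-128) / 256
    r = c.r * scale; g = c.g * scale; b = c.b * scale;
}

The catch is that encoded values like this can't be linearly blended or filtered by the hardware, which is exactly where the FP16 formats earn their keep.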
 
I think FP HDR is able to provide more contrast in the image vs. Valve's HDR. I haven't seen this yet in Oblivion, but Farcry (and 3DMark06) has some good examples of what I mean. Also, Valve's implementation doesn't work with refractions AFAIK.
 
Mrwang said:
It's not the game developers that are at fault here. Game branding is perpetuated by the graphics card companies. Nvidia will be paying a lot of money to have their name stamped on UT 2007. There is way too much of that these days. ATI is at fault here as well. I don't think having an Nvidia or ATI trademark on the box sells more games. It's the other way around: card makers sell more cards by making the consumer believe their products run a game better.

First off, who even cares? But if you think about it, it makes a lot more sense for Nvidia or ATI to work on their hardware's compatibility and stability in a game before it's released; that way, us consumers benefit from it. It doesn't sell more games, but TWIMTBP tells the consumer, "Hey, if I have an Nvidia card, I should have no problems running the game." I'm glad Nvidia does this; it means fewer bugs reach the consumer if Nvidia, or even ATI, steps in with the developer and optimizes the game to run on the world's most popular video cards before release day. Whether branding a logo on a game is right or wrong, well, who really cares? You ATI fanatics (not you specifically) shouldn't care that a game is branded TWIMTBP, and vice versa. At least Nvidia owners know their TWIMTBP games will run flawlessly right out of the box.

On the original topic: maybe Nvidia figured games wouldn't implement HDR+AA just yet, and if they did, there would be far too few to even make a big difference, just like SM3.0 was with the 6800s, and ATI's motive on that one. The performance isn't good enough for it to be a real issue; just like running SM3.0 on the 6800s, HDR+AA on an X1900 is the same scenario. Hopefully ATI pushes (which, knowing them, I doubt) more titles to support HDR+AA, so Nvidia will be ready to launch its G80 with compatibility and even better performance. Sound familiar?
 
zzzVideocardzzz said:
lol source hdr = wanna be hdr

You're out of your mind. Source HDR is by far the most realistic looking HDR in any game today, period. The calc method is irrelevant. Oblivion looks like a cartoon next to Source. Going in and out of the sunshine is breathtaking on Lost Coast.
 
Brent_Justice said:
I hate it when people say things like "EQ2 is an NVIDIA game" or "HL2 is an ATI game." If you check, you'll see EQ2 is a Sony Online game, HL2 is a Valve game, and Oblivion is a Bethesda game. Game developers aren't out to alienate their market; they just want to sell their games to everyone they can. It is all about selling the most copies, and they can't do that if they only target one vendor. Things like Get in the Game and The Way It's Meant to Be Played should not be exaggerated into more than what they are.

/end rant

True, but it does make you wonder why Bethesda, just a short while ago, said that HDR+AA could not be done, when it can. They also suggest you use an NV card for the best experience, when ATi's cards are faster. To me, that's not being honest and fair with their customers.
 
fallguy said:
True, but it does make you wonder why Bethesda, just a short while ago, said that HDR+AA could not be done, when it can. They also suggest you use an NV card for the best experience, when ATi's cards are faster. To me, that's not being honest and fair with their customers.

You're not exactly being forthcoming either Fallguy.

An unsupported patch with render issues, multi card issues, Windows issues, and crashes doesn't exactly make for "HDR+AA on Oblivion FTW"?

You'd never let this pass for a nVidia based product.
 
Rollo said:
You're not exactly being forthcoming either Fallguy.

An unsupported patch with render issues, multi card issues, Windows issues, and crashes doesn't exactly make for "HDR+AA on Oblivion FTW"?

You'd never let this pass for a nVidia based product.


I have no render issues, no crash problems or any other issues with the "Chuck" drivers. Where'd you get this from?
 
Rollo said:
You're not exactly being forthcoming either Fallguy.

An unsupported patch with render issues, multi card issues, Windows issues, and crashes doesn't exactly make for "HDR+AA on Oblivion FTW"?

You'd never let this pass for a nVidia based product.

Damn those ATi guys for publishing a beta driver, I hope that other company doesn't start doing this too!
 
5150Joker said:
I have no render issues, no crash problems or any other issues with the "Chuck" drivers. Where'd you get this from?

Maybe he tried them on his own card. :)
 
5150Joker said:
I have no render issues, no crash problems or any other issues with the "Chuck" drivers. Where'd you get this from?

There have been posts on multiple forums, but IIRC it's with Crossfire configs (intermittent texture flashing and stuff).

I'm running the "Chuck" drivers on a single card with no issues either.
 
CaiNaM said:
There have been posts on multiple forums, but IIRC it's with Crossfire configs (intermittent texture flashing and stuff).

I'm running the "Chuck" drivers on a single card with no issues either.

I have flashing textures, but not in the game, only when I hit ESC to go to the menu. Hitting ESC again and going back to the game, there are none. I'd much rather they be in the menu than in the game. But like you said, it seems to be for Crossfire only. I disabled CF, and I had no issues.

But these are beta for now, and as with all beta drivers and patches, not everything is perfect. I expect all issues to be gone very soon, hopefully next week with the .4's. The simple fact that ATi had to put out a beta driver for HDR+AA to even work is pretty pathetic. They should not have lied to consumers by saying it wouldn't work, and they shouldn't tell people to use an NV card for the best experience. One has to wonder what their motivation was for making such claims.
 
fallguy said:
One has to wonder what their motivation was for making such claims.
I don't believe any wondering is required; partnering in nVidia's TWIMTBP program is designed to help nVidia sell more cards (just as the whole "Get in the Game" partnership between ATi and Valve was meant to sell more ATi cards). It's also possible ATi knows their hardware better than Bethesda :)
 