Feature differences between ATI's X1K and nVidia's 7-series

dderidex

Comparing pure features alone (i.e., performance not relevant), I see:

Things in ATI's favor:
  • Up to 6-sample, fully programmable multisample anti-aliasing. It's gorgeous, it's fast, it's open to improvements. Head and shoulders above nVidia's max of a 4x grid-aligned sample pattern
  • Temporal anti-aliasing can provide an incredible boost in image quality in somewhat older titles, keeping them looking decent
  • Again, with certain older titles (Morrowind, for example), Truform (still supported) can provide poly counts that look as good as any recent title - EDIT: Deleted feature. ATI removed in Catalyst 5.9 and newer drivers.
  • FSAA that works with FP16 render targets (FP16 HDR, for example). This is revolutionary, to the point where nVidia's VP of architecture said a mere few weeks back it wasn't possible, and we wouldn't be seeing it in any manufacturer's cards this gen or next. Oops. Guess he was wrong. But, at least we know nVidia isn't planning it for a while...
  • Anisotropic filtering modes that provide PURE scene-wide aniso filtering...not angle-dependent filtering that causes shimmering on certain texture transitions (fact: I don't care if you can't see it or not, nVidia's aniso filtering IS angle-dependent, so it WILL shimmer when you rotate through the algorithm, PERIOD. Basic mathematics, you can't argue with that.) See the toy numeric sketch right after this list.
  • Integrated dual-link DVI for ultra-high resolution LCD support
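
Just to put some (made-up) numbers on the aniso point above: the snippet below is a toy model of an angle-dependent clamp - nothing like ATI's or nVidia's actual LOD hardware - but it shows why the applied filtering level changes as a surface rotates on screen, and a level that keeps changing as you rotate is exactly what you perceive as shimmer/crawl.

```cpp
// Toy model only: a hypothetical angle-dependent clamp on the maximum
// anisotropy level. The curve is invented for illustration and is NOT
// ATI's or nVidia's real LOD logic; it just has the same general shape
// as the classic "flower" test patterns (full aniso on the axes, much
// less near 45 degrees).
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

double maxAnisoForAngle(double angleDeg)
{
    double t = std::fabs(std::sin(2.0 * angleDeg * kPi / 180.0)); // 0 on the axes, 1 at 45 deg
    return 16.0 - 14.0 * t;                                       // 16x down to 2x
}

int main()
{
    // Rotate a textured surface through 90 degrees: the applied aniso level
    // keeps changing, so texture sharpness visibly pumps as you turn.
    // An angle-independent implementation would report 16x at every angle.
    for (int a = 0; a <= 90; a += 10)
        std::printf("angle %2d deg -> ~%4.1fx aniso applied\n", a, maxAnisoForAngle(a));
    return 0;
}
```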

Features in nVidia's favor:
  • Purevideo works better than AVIVO, so far.
  • Game profiles allow different games to use different levels of the card's features without changing anything. This is TREMENDOUSLY important, and very nice to see!
  • Drivers that take advantage of dual-core CPUs to provide a substantial performance boost
  • Support for stereoscopic 3d
  • Support for hardware display color calibration devices (true, ATI technically supports them - but there is no way to calibrate the 3d or overlay modes, and no way to copy the calibrated desktop mode over to either of the other 2 modes)
  • Limited (but present!) super-sample anti-aliasing modes
  • Support for 16-bit color anti-aliasing (important for some older titles - "Longest Journey" springs right to mind)
  • nView offers superior multi-monitor implementation to Hydravision (all three major monitor 'spanning' modes supported vs ATI only supporting the 2 least useful modes)

Does that seem like a fair representation of the actual FEATURES that differentiate the two product lines?

Anything I'm obviously missing?
 
Kirk isn't the CEO heh, he's VP of architecture at Nvidia.

ATi Tray Tools has a tweak in its advanced tweaks window for multi-threading. I doubt that will be something Nvidia only has, which they currently don't have officially anyway. And ATI has 12 official driver releases a year; Nvidia has about half that on a good year, plus a whole bunch of betas. You brought up drivers even though I thought this was a feature comparison, not me. SMP drivers that Nvidia is working on don't help everyone, only dual-core computers. Really, that's a bad point since it's not something everyone will benefit from.

Also kinda strange that you start out saying performance isn't relevant, then you name SMP drivers giving a performance boost as a point....contradiction.

You can list 12 official driver launches a year for ATI though, that's definitely a plus.
 
Features in nVidia's favor:
Purevideo works better than AVIVO, so far.
Game profiles allow different games to use different levels of the card's features without changing anything. This is TREMENDOUSLY important, and very nice to see!
Drivers that take advantage of dual-core CPUs to provide a substantial performance boost
Support for stereoscopic 3d
Support for hardware display color calibration devices (true, ATI technically supports them - but there is no way to calibrate the 3d or overlay modes, and no way to copy the calibrated desktop mode over to either of the other 2 modes)
Limited (but present!) super-sample anti-aliasing modes
Support for 16-bit color anti-aliasing (important for some older titles - "Longest Journey" springs right to mind)

Well, AVIVO is free and PureVideo costs money, so it's an odd comparison, not fair really.
Other than that, spot on.
I think this generation is like the last. ATi was faster last time but lacked features, and Nvidia is now faster (by a smaller margin though) but ATi has many more features and their card looks to have more longevity.
 
dderidex said:
Anything I'm obviously missing?
How about price, availability, and the ability to run dual gpu cards? :p

In all seriousness, good post!
 
Shifra said:
Kirk isnt the CEO heh, hes VP of architecture at Nvidia.

Fixed, thanks.

Shifra said:
ATi Tray Tools has a tweak in its advanced tweaks window for multi-threading. I doubt that will be something Nvidia only has, which they currently don't have officially anyway. And ATI has 12 official driver releases a year; Nvidia has about half that on a good year, plus a whole bunch of betas.

Ummm: drivers.

* BETA driver to support the availability of the game Black and White 2
* Mixed vendor support for NVIDIA SLI
* Performance enhancements for Dual-core CPUs
* PureVideo high definition de-interlacing support
* Usability enhancements when connecting to an HDTV
* For a full list of known issues please view the Release Notes
* Microsoft® DirectX® 9.0c and OpenGL® 2.0 support

And, AFAIK, the ATT tweak for the ATI drivers...well:
* It's not publicly released yet (only betas of ATT distributed to a small group have it)
* It doesn't work with any internal features of the ATI drivers that support dual cores (because there aren't any yet), but merely sets affinity for different driver processes trying to balance the load. Not even close to the boost nVidia is getting by offloading vertex processing to the CPU.
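
For anyone wondering what "merely sets affinity" amounts to in practice, here's a minimal Win32 sketch (hypothetical, not ATT's actual code). It's a single OS call that pins existing work onto a chosen core - a very different thing from a driver that spawns its own worker thread and genuinely offloads vertex processing to the second core.

```cpp
// Hypothetical illustration of affinity juggling: tell Windows which core a
// process may run on. No extra work gets executed in parallel as a result,
// which is why it isn't comparable to real multithreaded driver support.
#include <windows.h>
#include <cstdio>

int main()
{
    // Pin the current process to CPU 1 (bit 1 of the affinity mask).
    // A tweak tool would do the same to the game process it finds running.
    DWORD_PTR mask = 1 << 1;
    if (!SetProcessAffinityMask(GetCurrentProcess(), mask))
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
    else
        std::printf("Process pinned to core 1.\n");
    return 0;
}
```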
 
Comparing pure features alone (i.e., performance not relevant),
Drivers that take advantage of dual-core CPUs to provide a substantial performance boost

That's performance relevant, and not to everyone that owns an Nvidia card; it shouldn't be there unless you want to change your rules :)
 
Shifra said:
SMP drivers that Nvidia is working on don't help everyone, only dual-core computers. Really, that's a bad point since it's not something everyone will benefit from.

If you aren't on dual-cores yet, you will be soon....if you are cool. :p So...yeah, it's a feature. :D

Shifra said:
Also kinda strange that you start out saying performance isn't relevant, then you name SMP drivers giving a performance boost as a point....contradiction.
I meant, not comparing like "OMG, 7800GTX IS SO MUCH FASTER THAN THE X1300PRO THAT"S TEH SAEM PRIEC!!! OMEG!!ONE!ELEVWEN!!!"

All I'm saying with that point is....something that, from a starting point, provides a boost in performance is a plus. I'm simply not comparing FPS, is all. I.e., not saying card one has xxx FPS and card two has yyy FPS, and so card one is better than card two because of the FPS difference.

That's not a feature.

However, if you have card one in a single-core system, and card two in a single-core system, then upgrade both systems to dual-core.....if card one maintains the exact same performance, and card two gets a performance BOOST by doing that....well, that's a feature in card two's favor.

Shifra said:
You can list 12 official driver launches a year for ATI though, that's definitely a plus.
Number of releases a year has never really done anything for me - or anyone else I can think of.

You don't update drivers just to have newer sets, do you? It's either:
A) To fix bugs (in which case it's not a good thing, just something we PC users deal with)
or
B) To improve performance (and, really, how many "improves performance" driver updates does ATI put out compared to nVidia? About the same, overall...)

So....it's kinda a wash.
 
Alright, since we can have features in drivers then: ATI will allow voltage changes via BIOS. Kinda nice for overclockers.

Nvidia can always try to correct shimmering more and more in their drivers, that's a nice feature.

Footnote: shimmering is objective, so your single opinion matters not; that's to anyone that feels compelled to tell me it doesn't exist. Shimmering will always remain, no matter how hard or easy it is to notice, simply by the way AF is done; it cannot be completely removed. It just sucks worse on Nvidia hardware because their AF quality degraded. ATI R420/R480 has shimmering as well, but it's not as bad.

Nvidia has a nice driver feature to improve only 3Dmark performance, that's a good one.

ATI has 3Dc, you don't have that. Oh, and FP10, and dual 10-bit display pipelines; Nvidia's are 8-bit, I think.

Oh, and one more: the R520/RV530 and RV515 vertex shader functionality is substantially better than Nvidia's with flow control enabled.
 
Shifra said:
Alright, since we can have features in drivers then: ATI will allow voltage changes via BIOS. Kinda nice for overclockers.
That WILL definitely be a feature for them! Ehhh....as soon as it's confirmed.

So far, we just have the word of one of the mad overclockers at Ibiza, who MAY have been supplied with custom drivers never to be seen again. Certainly, no reviewer on the X1K-series launch mentioned such a feature.

Shifra said:
Nvidia can always try to correct shimmering (which is objective, so your single opinion matters not; that's to anyone that feels compelled to tell me it doesn't exist. Shimmering will always remain, no matter how hard or easy it is to notice, simply by the way AF is done; it cannot be completely removed) more and more in their drivers, that's a nice feature.

The last couple of times nVidia had to change the aniso rendering method (GeForce4 Ti -> GeForceFX, then GeForceFX -> GeForce 6-series), it required hardware changes.

Ditto with their FSAA sample patterns (GeForceFX as OGMS to GeForce6 as RGMS).

I tend to think nVidia has a habit of hardcoding things where ATI makes them programmable. At least, that's how it's been to date.

Shifra said:
Nvidia has a nice driver feature to improve only 3Dmark performance, that's a good one.

Referring to DST, I presume? Didn't mention it, for the same reason as the below point...

Shifra said:
ATI has 3Dc, you don't have that.

No, I don't. Does any game actually USE it? No? Didn't think so....Wait, one? Errr....not much of a feature, no?

Shifra said:
Oh, and FP10, and dual 10-bit display pipelines; Nvidia's are 8-bit, I think.

Absolutely - unfortunately, no test has been able to prove it's used yet. Or, if it IS used, if it does anything at all to the picture output.

It's telling that ATI, in all their PR 'fluff', didn't have any "before" and "after" slide showing the FP10 advantage (exaggerated as need be - note nVidia's BS "HDR" slides for comparison)
 
The 2 big IQ feature differences IMO are FP16 HDR + AA and HQ AF

but performance is yet to be seen with HDR and AA
 
Well, I think the most important thing for the X1000 series is the AF technique and how it won't shimmer as badly, or at all, when compared to the 7-series. Other than that, both lines are pretty competitive, but of course there is availability/pricing. And being late to the game never looks good on the résumé. And having availability for running dual graphics cards as of right now is always a bonus.

So biggest standout feature: better AF for less shimmering.

HDR w/ AA is pretty important, but that can be subjective, seeing as how we haven't gotten to the point where we can have HDR + SM3.0 + AA and not see a big performance hit.
 
So it's not a feature on a brand new piece of hardware if it's not immediately used, but SMP drivers that only the small percentage of people who have dual-core CPUs can use is. Alright....if you're going to fight me tooth and nail when I name brand new things ATI has that Nvidia doesn't, and tell me they don't count because they're not used yet (heh, I mean, after all, the cards didn't just get revealed today or anything), then I'll opt out of the rest of this thread and watch it die.


BTW, I think it's funny you'd call SSAA a real feature considering what it does to performance and is thus never going to be a reality for high-res or high-quality gaming. Yet 3Dc 4:1 texture compression, FP10/10-bit pipelines aren't. There's a difference between a real feature and a "fluff" feature, to use your word. SSAA will never see the light of day in widespread gaming, not on this card; it's a "fluff" feature. 3Dc is usable in OpenGL and D3D, not to mention Nvidia did bank on this tech so it will come out more in time, and FP10/10-bit pipelines aren't. You could at least pretend to be a little more objective.
 
Another benefit I saw for ATI is that they made the pixel pipelines multithreaded. So for more complex shader code, they will (hopefully) be more efficient.
 
Shifra said:
BTW, I think it's funny you'd call SSAA a real feature considering what it does to performance and is thus never going to be a reality for high-res or high-quality gaming. Yet 3Dc 4:1 texture compression,
Not sure I follow. 3Dc is a texture compression format. Games have to be specifically coded to use it - if they aren't specifically coded to use it, well...there is nothing there. You can't "turn it on" - the game either uses it, or doesn't. Ditto with Truform.
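
For what it's worth, here's roughly what "the game has to be coded for it" looks like on the D3D9 side - a sketch only, showing the FOURCC probe an engine would have to ship before it could even create a 3Dc texture. There is nothing here a user can flip on from the control panel.

```cpp
// Sketch: probe for 3Dc support in D3D9. 3Dc is exposed as the 'ATI2'
// FOURCC format, so an engine checks for it and falls back to another
// normal-map format when the check fails.
#include <d3d9.h>

const D3DFORMAT kFourCC_ATI2 = (D3DFORMAT)MAKEFOURCC('A', 'T', 'I', '2');

bool Supports3Dc(IDirect3D9* d3d)
{
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,   // current display format
                                        0,                  // no special usage
                                        D3DRTYPE_TEXTURE,
                                        kFourCC_ATI2);
    return SUCCEEDED(hr);
}
```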

SSAA is a feature you, as a user, can turn on in any game you want. It's just another FSAA mode. Heck, with the SMALL exception of ONLY the latest-and-greatest cutting-edge games, SSAA is perfectly feasible on almost all titles. It can certainly WORK in ANY title...it's just performance that's the question. (I mean, heck, if you can still get playable framerates in Half-Life 2 using it....it's a workable feature!)

Shifra said:
FP10/10-bit pipelines aren't.
That IS a great feature - maybe. It'd always be used, if it's used at all.

My point is that, as far as we know, it's just a bullet-point feature not currently exposed in drivers. No testing has proved it exists and is working yet, and nobody has reported any improvement in image quality (subjective or objective) from it.

It's only a non-feature because there is no evidence of its existence other than a bullet point on a Powerpoint slide.

Shifra said:
There's a difference between a real feature and a "fluff" feature, to use your word. SSAA will never see the light of day in widespread gaming, not on this card; it's a "fluff" feature.
Sorry to say, but I was using SSAA all the time all the way back on the GeForceFX. Again, the cards are usually powerful enough that - while true the latest-and-greatest (FEAR, Oblivion, Quake 4, UT2K7, etc) will never be able to use it - last-gen games (Half-Life 2, Morrowind, UT2K4, etc) can use it just fine. And OLDER games (Starfleet Command series, Return to Castle Wolfenstein, etc), it helps a LOT.

Shifra said:
3Dc is usable in OpenGL and D3D, not to mention Nvidia did bank on this tech so it will come out more in time,
Again, nothing inherently wrong with this tech - it sounds great on paper.

Only....it can't be turned on by the user, and REQUIRES game CODE to support....and no (very few, anyway) games use it. AFAIK, *only* XIII uses it, in fact.

Shifra said:
You could at least pretend to be a little more objective.

LOL - you did check out the system in my sig, right? ATI chipset, ATI video card?
 
TheArchitect said:
Another benefit I saw for ATI is that they made the pixel pipelines multithreaded. So for more complex shader code, they will (hopefully) be more efficient.

It's definitely something to keep in mind - but leans more towards raw FPS comparisons than feature comparisons.

And we don't really know that to be true, yet. Time will certainly tell!
 
Shifra said:
You could at least pretend to be a little more objective.
That's pretty funny, coming as it does from the greatest ATI apologist ever...
 
Well, for features, I looked on Nvidia's site and didn't see anything that stated full hardware-accelerated decode of H.264 (maybe I'm just blind), which will be good once Blu-ray comes out (with the PS3). So if that is the case, that would be a major plus for ATI, since hardware acceleration is required for playback because CPUs don't have enough processing power for that codec.
 
So it's not a feature on a brand new piece of hardware if it's not immediately used, but SMP drivers that only the small percentage of people who have dual-core CPUs can use is. Alright....if you're going to fight me tooth and nail when I name brand new things ATI has that Nvidia doesn't, and tell me they don't count because they're not used yet (heh, I mean, after all, the cards didn't just get revealed today or anything), then I'll opt out of the rest of this thread and watch it die.


Exactly - you asked for features, this IS a feature IN the CARD - so it counts - you can't count something out because no game developer has implemented it.

If it is IN the card, it is a feature, bottom line, end of it; you can't be passively selective about what you want to include.
 
Shifra said:
BTW, I think it's funny you'd call SSAA a real feature considering what it does to performance and is thus never going to be a reality for high-res or high-quality gaming. Yet 3Dc 4:1 texture compression, FP10/10-bit pipelines aren't. There's a difference between a real feature and a "fluff" feature, to use your word. SSAA will never see the light of day in widespread gaming, not on this card; it's a "fluff" feature. 3Dc is usable in OpenGL and D3D, not to mention Nvidia did bank on this tech so it will come out more in time, and FP10/10-bit pipelines aren't. You could at least pretend to be a little more objective.


3Dc is 2:1 compression, or 4:1 with a speed loss due to decompression (which ends up the same speed as 2:1). So the benefits of 3Dc are worthless. Which is the fluff feature again? 3Dc is useless compared to DXT9. DXT9 has 2:1 compression and you can renormalize with no problems at all; with 3Dc this can't be done. Objectivity comes from understanding the tech, not just babbling about it.
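
For the curious, the back-of-envelope numbers behind those ratios (standard 3Dc block sizes, nothing vendor-specific): 3Dc stores each 4x4 block of a two-channel normal map in 16 bytes, i.e. 8 bits per texel.

```cpp
// Where 2:1 vs 4:1 comes from, for a 1024x1024 two-component (X/Y) normal map.
#include <cstdio>

int main()
{
    const unsigned w = 1024, h = 1024;

    const unsigned rgba8   = w * h * 4;              // 32 bpp: normals padded into RGBA8
    const unsigned v8u8    = w * h * 2;              // 16 bpp: two signed 8-bit channels
    const unsigned threeDc = (w / 4) * (h / 4) * 16; // 16 bytes per 4x4 block = 8 bpp

    std::printf("RGBA8 : %u KB\n", rgba8   / 1024);  // 4096 KB
    std::printf("V8U8  : %u KB\n", v8u8    / 1024);  // 2048 KB -> 3Dc is 2:1 vs this
    std::printf("3Dc   : %u KB\n", threeDc / 1024);  // 1024 KB -> 4:1 vs RGBA8
    return 0;
}
```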
 
MrGuvernment said:
Exactly - you asked for features, this IS a feature IN the CARD - so it counts - you can't count something out because no game developer has implemented it.

If it is IN the card, it is a feature, bottom line, end of it; you can't be passively selective about what you want to include.
Actually, no, I'm really being selective about USABLE features.

There are a number of 'architecture' features of both cards that just don't matter because they aren't used. nVidia's DST springs right to mind. Sure, it speeds up 3dMark05.

Oh, and 3dMark05. And, hey, 3dMark05 uses it, too! Annnnnnd.....that's it. So it's no feature. A nice idea maybe, but not a usable feature.
 
dderidex said:
Anything I'm obviously missing?

I would add nView and Hydravision to the list, with nVidia having a huge advantage...their multi-monitor/multi-desktop software spanks ATi's...

Other than that I think you hit all the big ones...in my mind, all that matters is HQ AF...that's what makes me want an x1000 series card...but the features won't be the story of this generation...it's all about availability and pricing...
 
Don't forget:
- nVidia GPUs post-NV40 do depth-bounds stencil shadow culling
- G70 does two four-component MADDs per fragment per clock
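
For reference, a back-of-envelope on what that second point works out to, assuming the stock 430MHz 7800 GTX core clock and 24 fragment pipelines (my numbers, counting only the fragment MADD units, so treat it as a ceiling rather than a benchmark):

```cpp
// Peak programmable fragment throughput implied by "two four-component MADDs
// per fragment per clock" on G70. Vertex shaders and special-function units
// are ignored; assumes the stock 430MHz 7800 GTX clock and 24 pipelines.
#include <cstdio>

int main()
{
    const double pipes        = 24;    // fragment pipelines
    const double maddsPerPipe = 2;     // two 4-component MADD units each
    const double components   = 4;     // vec4
    const double flopsPerMadd = 2;     // multiply + add
    const double clockGHz     = 0.430; // stock 7800 GTX core clock

    double gflops = pipes * maddsPerPipe * components * flopsPerMadd * clockGHz;
    std::printf("peak fragment MADD throughput: ~%.0f GFLOPS\n", gflops); // ~165
    return 0;
}
```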

Also keep in mind that G70 converts any calls for 3Dc to V8U8, which ends up being 2:1 compression.
 
^eMpTy^ said:
I would add nview and hydravision to the list, with nvidia having a huge advantage...their multi-monitor/multi-desktop software spanks ATi's...
LOL - I dunno, back when I did multimonitor, I don't seem to recall being able to stand EITHER solution.

Still, that's a valid point. IIRC, nVidia's solution allowed two displays to be treated as a single large output device, which is not something ATI's solution ever did. Basically, there are three ways to output to two displays - 'clone', where each display is a duplicate; 'multi-view', where each display is a separate render device; and 'span', where all the displays just create a single large array. Last I remember, ATI could only do the first two, while nVidia could do all three. Makes a BIG difference: game support for the first...well, doesn't matter; for the second it's nil (like, only MS Flight Sim handles it); and for the third it's widespread.
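
A quick Win32 sketch of why that matters in practice: in span mode, a game that just asks for "the screen" gets the whole array for free, while in dualview it only ever sees the primary display unless it explicitly enumerates monitors (which almost none do).

```cpp
// Sketch: how the output modes look to an application. In 'span' mode Windows
// reports a single big primary display, so GetSystemMetrics(SM_CXSCREEN)
// already covers both monitors. In dualview it reports two monitors, and the
// primary is only half the width - a game has to care about the virtual
// desktop to use the rest.
#include <windows.h>
#include <cstdio>

int main()
{
    std::printf("monitors reported : %d\n", GetSystemMetrics(SM_CMONITORS));
    std::printf("primary display   : %d x %d\n",
                GetSystemMetrics(SM_CXSCREEN), GetSystemMetrics(SM_CYSCREEN));
    std::printf("virtual desktop   : %d x %d\n",
                GetSystemMetrics(SM_CXVIRTUALSCREEN),
                GetSystemMetrics(SM_CYVIRTUALSCREEN));
    return 0;
}
```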

^eMpTy^ said:
Other than that I think you hit all the big ones...in my mind, all that matters is HQ AF...that's what makes me want an x1000 series card...but the features won't be the story of this generation...it's all about availability and pricing...
*shrugs*

Availability, of course, is a factor. If you can't buy it, it don't exist. As to pricing....well, it's features you are BUYING, isn't it?
 
Brent_Justice said:
The 2 big IQ feature differences IMO are FP16 HDR + AA and HQ AF

but performance is yet to be seen with HDR and AA

another thing I also like is the 10-bit DACs and dual-link DVI

could potentially show better IQ on certain displays

so IMO ATI has got 3 good IQ things goin for them
 
dderidex said:
LOL - I dunno, back when I did multimonitor, I don't seem to recall being able to stand EITHER solution.

Still, that's a valid point. IIRC, nVidia's solution allowed two displays to be treated as a single large output device, which is not something ATI's solution ever did. Basically, there are three ways to output to two displays - 'clone', where each display is a duplicate; 'multi-view', where each display is a separate render device; and 'span', where all the displays just create a single large array. Last I remember, ATI could only do the first two, while nVidia could do all three. Makes a BIG difference: game support for the first...well, doesn't matter; for the second it's nil (like, only MS Flight Sim handles it); and for the third it's widespread.

I'm a huge multi-monitor user...I couldn't live without it...I've been having issues with 2d speed on my 6800gt since the day I bought it...so a couple weeks ago I slapped in a 9800pro to see if it was in fact the videocard...it wasn't...the 9800pro was just as slow (and slightly jerkier) in 2d...but I left it in for a couple weeks anyways to try out ATi's control panels and such...during that time I realized how WILDLY inferior Hydravision is to nView...

I know most people don't use it...but seriously...this is one place where nVidia's solution makes ATi look like a bunch of amateurs...

dderidex said:
*shrugs*

Availability, of course, is a factor. If you can't buy it, it don't exist. As to pricing....well, it's features you are BUYING, isn't it?

well this is where we make the switch from theoretical to practical...people actually buying the cards are only concerned with features they are going to use, and performance is king...so in the end, the card with the most bang for the buck wins...

just like last generation...ATi cards still sold because they were competitive in games...even if they didn't have a lot of checkbox features...
 
Brent_Justice said:
another thing I also like is the 10-bit DACs and dual-link DVI

could potentially show better IQ on certain displays

so IMO ATI has got 3 good IQ things goin for them

I don't get the whole 10-bit DAC thing...anybody have a good link on that subject?

As for dual-link DVI...it's nice...but I don't see anyone getting much use out of it for a while yet...all current mainstream LCDs, even the big ones, work just fine on single-link...but it's a switch that will need to take place eventually, might as well be right now...

I'm just so happy to see ATi providing some stiff competition...more competition = more cool stuff for less money...:)
 
Brent_Justice said:
another thing I also like is the 10-bit DACs and dual-link DVI

could potentially show better IQ on certain displays

so IMO ATI has got 3 good IQ things goin for them
I'll give them dual-link DVI....although I've not seen an LCD that needs it, those days ARE coming - and soon - and the feature WILL be a requirement, not just a checkbox, when those days are on us.

I dunno about that 10-bit DAC, though. You specifically mentioned in the review that you didn't notice any difference - and every review I've read on the 'net said the same thing. It just doesn't seem to matter right now - if, indeed, it's currently fully functional at ALL.

No way to know. If you guys had a bunch of monitors lying around there, though....might be something interesting to test!

Personally, if you are picky about accurate color reproduction, the fact that ATI doesn't even support proper hardware color calibration tools seems more of a HUGE strike against it than any number of improvements to the RAMDAC will counter.
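
To make the calibration point concrete: on the desktop, a calibration package's LUT loader ultimately just pushes a 3x256 gamma ramp into the card through GDI, roughly as in the sketch below (identity ramp here; a real tool loads measured values). The complaint above is that there's no equivalent way to get that correction applied to ATI's 3D or overlay paths.

```cpp
// Minimal sketch of a desktop LUT load - the piece a calibration tool does
// after measuring your display. Loads an identity ramp for illustration.
#include <windows.h>

bool LoadIdentityRamp()
{
    WORD ramp[3][256];
    for (int ch = 0; ch < 3; ++ch)
        for (int i = 0; i < 256; ++i)
            ramp[ch][i] = static_cast<WORD>(i * 257); // linear 0..65535

    HDC screen = GetDC(NULL);
    BOOL ok = SetDeviceGammaRamp(screen, ramp);
    ReleaseDC(NULL, screen);
    return ok != FALSE;
}
```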
 
dderidex said:
I'll give them dual-link DVI....although I've not seen an LCD that needs it, those days ARE coming - and soon - and the feature WILL be a requirement, not just a checkbox, when those days are on us.

I dunno about that 10-bit DAC, though. You specifically mentioned in the review that you didn't notice any difference - and every review I've read on the 'net said the same thing. It just doesn't seem to matter right now - if, indeed, it's currently fully functional at ALL.

No way to know. If you guys had a bunch of monitors lying around there, though....might be something interesting to test!

Personally, if you are picky about accurate color reproduction, the fact that ATI doesn't even support proper hardware color calibration tools seems more of a HUGE strike against it than any number of improvements to the RAMDAC will counter.



The 7800GT/GTX do actually have dual-link DVI. EVGA lists it on their site as having it, and I can confirm that both the MSI 7800GTX and EVGA 7800GTX KO do have working dual link (2560x1600) - despite specs saying that isn't the case in some places. I'm assuming it's because Nvidia lists it as such in order to leave that edge to the Quadro lineup... As for a monitor that requires dual link - the Apple 30" Cinema Display is one of them... :)
 
dderidex said:
Actually, no, I'm really being selective about USABLE features.

There are a number of 'architecture' features of both cards that just don't matter because they aren't used. nVidia's DST springs right to mind. Sure, it speeds up 3dMark05.

Oh, and 3dMark05. And, hey, 3dMark05 uses it, too! Annnnnnd.....that's it. So it's no feature. A nice idea maybe, but not a usable feature.

ATI usable features - 0
Nvidia usable features - all of them :D
 
dderidex,

That's a pretty good summary. However, some of those features are more or less driver related and may or may not prove to be exclusive. For example, Terry (the Cat guy) has said they have dual-core drivers planned to be out end of this year, first of next. And compare that to a hardware feature (say HDR + AA) which will not change till we have new hardware from NV. See where I am going??
 
Well, I own an ATI card, but I am tired of the Sony advertising they are doing. DON'T look at Nvidia, that's all they can do; LOOK AT WHAT WE WILL DO...
NOT can or can't do, but will do in the future. So did all the ATI fanboys get ATI to fax you a new video card today???
Yep, a paper launch of something that MIGHT be.
Yep, it does do this or that, but not yet...wait for the next driver...or maybe the driver in March.

At a price that will be out of this world.


That's my opinion of ATI.
OK, Nvidia too :)

sparks
 
Brent,

I've seen speculation but if there's an authoritative comment yet, I haven't found it. Can NVidia implement angle-independent anisotropic filtering in drivers, or is it hardware-dependent?
 
Well, I guess I'll add some of my views to this.

dderidex said:
Again, with certain older titles (Morrowind, for example), Truform (still supported) can provide poly counts that look as good as any recent title

Firstly, I really think you should reword this. Sure, Truform will help quality a bit, but claiming it's going to make them look like recent titles is a bit far-fetched IMO.

dderidex said:
Anisotropic filtering modes that provide PURE scene-wide aniso filtering...not angle-dependent filtering that causes shimmering on certain texture transitions (fact: I don't care if you can't see it or not, nVidia's aniso filtering IS angle-dependent, so it WILL shimmer when you rotate through the algorithm, PERIOD. Basic mathematics, you can't argue with that.)

Well, you can argue it in some ways. I honestly don't believe angle-dependent AF is the real cause of the shimmering. Maybe if the cards optimize the feature too much it might be, but the main effect of angle-dependent AF tends to be blurred textures (where the card does little or no AF); this can cause shimmering, but this isn't really the problem that's seen with the GTX, for example.

For example, the X800/850 all do angle-dependent AF too, yet you don't hear people complaining about shimmering with that 10 times a day. I do agree that the choice of just having full, plain AF is good now that we have very powerful cards, but angle-dependent AF is still a fairly good feature to have if you ever need a bit of extra performance in games.

dderidex said:
Support for 16-bit color anti-aliasing (important for some older titles - "Longest Journey" springs right to mind)

ATI can do 16-bit AA too. When the 9700 was first released it only did AA in 32-bit. Some time after this, ATI released a driver update to make the cards do AA in 16-bit. Now, I'm not sure if ATI's cards do 16-bit AA in OpenGL or not, but I know for a 100% fact they do 16-bit AA in D3D, because I've used AA in Final Fantasy 8 PC (16-bit only) on an R300 and an R420.
 
DarkBahamut said:
Firstly, I really think you should reword this. Sure, Truform will help quality a bit, but claiming it's going to make them look like recent titles is a bit far-fetched IMO.
Have you seen Morrowind with Truform on? Sure, it doesn't have QUITE the advanced level of texture work and lighting of newer titles, but INSANE poly counts go a LONG way to improving the graphics experience.

DarkBahamut said:
Well, you can argue it in some ways. I honestly don't believe angle-dependent AF is the real cause of the shimmering. Maybe if the cards optimize the feature too much it might be, but the main effect of angle-dependent AF tends to be blurred textures (where the card does little or no AF); this can cause shimmering, but this isn't really the problem that's seen with the GTX, for example.

For example, the X800/850 all do angle-dependent AF too, yet you don't hear people complaining about shimmering with that 10 times a day.

That's because it's not in vogue to complain about shimmering on the X800/X850.

They DO shimmer, too. The X800/X850 does, the GeForce6-series does, the GeForce7-series does, etc. It just got all the attention on the 7-series, because nVidia changed the algorithm somewhat and it resulted in a DRASTIC drop in quality - i.e., LOTS more shimmering than any previous-gen card. This served to bring attention to the issue at all (an issue that has plagued all cards since the GeForceFX, where nVidia got burned so badly in performance by NOT offering the feature), and so this has served as the beacon everyone points to in order to show the problem.

In fact, nVidia *has* released a driver that, in HQ mode, "fixes" the shimmering on the 7-series....but, as some have pointed out, it really doesn't make it go away entirely. But, of course it won't, the card uses angle-dependent aniso filtering. The "fix" just knocks the 'shimmering' problem down to the same level as is seen on the X800/X850 and GeForce6-series...which has long been "good enough" (apparently, since nobody complained about it before).

Now that ATI is offering angle-INdependent Aniso, they've kicked the game up to the next level!

DarkBahamut said:
ATI can do 16-bit AA too. When the 9700 was first released it only did AA in 32-bit. Some time after this, ATI released a driver update to make the cards do AA in 16-bit. Now, I'm not sure if ATI's cards do 16-bit AA in OpenGL or not, but I know for a 100% fact they do 16-bit AA in D3D, because I've used AA in Final Fantasy 8 PC (16-bit only) on an R300 and an R420.

Actually, it's not doing 16-bit AA. It's taking the 16-bit image, rendering it in 32-bit, and doing 32-bit AA. ATI doesn't support 16-bit AA natively in any mode.

That said - as you noted, in MANY cases, rendering a 16-bit color game in 32-bit color to do FSAA on it works fine.

In some cases - "Longest Journey" and "Grim Fandango", for example - it just doesn't work.
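
Not era-accurate for those old titles, but in D3D9 terms the capability being argued about boils down to a query like this sketch: if multisampling isn't available for the 16-bit format, the only way to antialias the game is to promote the whole thing to 32-bit, as described above.

```cpp
// Sketch: ask D3D9 whether 4x multisampling exists for a given back buffer
// format. Compare the result for 16-bit R5G6B5 against 32-bit X8R8G8B8.
#include <d3d9.h>

bool CanMsaa4x(IDirect3D9* d3d, D3DFORMAT backBufferFormat)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
                                                 D3DDEVTYPE_HAL,
                                                 backBufferFormat,
                                                 TRUE,                      // windowed
                                                 D3DMULTISAMPLE_4_SAMPLES,
                                                 &qualityLevels);
    return SUCCEEDED(hr);
}

// Usage: compare CanMsaa4x(d3d, D3DFMT_R5G6B5) vs CanMsaa4x(d3d, D3DFMT_X8R8G8B8)
```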
 
Shifra said:
Alright, since we can have features in drivers then: ATI will allow voltage changes via BIOS. Kinda nice for overclockers.

Nvidia can always try to correct shimmering more and more in their drivers, that's a nice feature.

Footnote: shimmering is objective, so your single opinion matters not; that's to anyone that feels compelled to tell me it doesn't exist. Shimmering will always remain, no matter how hard or easy it is to notice, simply by the way AF is done; it cannot be completely removed. It just sucks worse on Nvidia hardware because their AF quality degraded. ATI R420/R480 has shimmering as well, but it's not as bad.

Nvidia has a nice driver feature to improve only 3Dmark performance, that's a good one.

ATI has 3Dc, you don't have that. Oh, and FP10, and dual 10-bit display pipelines; Nvidia's are 8-bit, I think.

Oh, and one more: the R520/RV530 and RV515 vertex shader functionality is substantially better than Nvidia's with flow control enabled.

The .84's finally fixed my shimmering and the dual core support seemed to help out a decent amount too.
 
CMAN said:
ATI usable features - 0
Nvidia usable features - all of them :D
Why do posts always end up having people like this bash the discussion...Shrugs


From a purely hardware perspective, I am betting you will see the ATI X1000 series around a lot longer than the nVidia 7-series. Nvidia already announced that their next-gen chip will be out sometime in the first quarter next year.

ATI also has a future-proof memory controller on board. At least for the next two years.

ATi has also made the hop over to a better process, 90nm, while nVidia still has to make the leap.

Nvidia came out of the gates first, and ATI has been struggling to play catch-up. Just have to see how Nvidia handles their conversion.

Things like support for dual core are purely a software issue. And I'd bet anyone in here that ATI will support it in time. Availability will come too; sooner or later your local Best Buy will have full selections from both ATI and Nvidia.

I like ATI and think they are a good company. I was waiting on their new card and started buying for it. Now that Quake 4 is coming out this month, I had to make my purchase now. I went with an nVidia 7800 GTX BFG. Nice hardware, but I would rather have had that new X1800 XT :)
 
Joe Fristoe said:
The .84's finally fixed my shimmering and the dual core support seemed to help out a decent amount too.

That's great and all, but how many people are struggling to play games with that driver?
 
Shifra said:
You can list 12 official driver launches a year for ATI though, that's definitely a plus.

Produce twelve buckets of shite and then one calls it "sweet" :confused: Granted, not all of the Catalyst drivers have been bad, but several of the 12 you mention caused headaches galore in top games, etc.

Oh, and that awesome SM 3.0. Turns out it's missing vertex texture fetch: http://www.techreport.com/onearticle.x/8872. ATI's David Nalasco does suggest a possible workaround, but developers will have to code for this and it'll have a performance impact. Meh. How can the vertex shader with flow control be better if the texture fetch feature is missing? If a workaround has to be implemented, I suspect the performance will drop down to the SM 3.0 performance of the existing nV products. Again, the real next-gen product (G70 really being an NV40 refresh) from nV will have even more excitement in the SM arena.
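
For context, the capability the TechReport piece is talking about is detected by engines with a probe like this sketch (R32F is the float format typically used for vertex-stage heightmaps); a displacement-mapping path gets gated on this check, and if it fails the developer has to fall back to whatever workaround Nalasco is suggesting.

```cpp
// Sketch: the standard D3D9 check for vertex texture fetch support - can a
// float texture be bound to the vertex shader stage?
#include <d3d9.h>

bool SupportsVertexTextureFetch(IDirect3D9* d3d)
{
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,             // display format
                                        D3DUSAGE_QUERY_VERTEXTEXTURE,
                                        D3DRTYPE_TEXTURE,
                                        D3DFMT_R32F);
    return SUCCEEDED(hr);
}
```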

FP10? Is this for bragging about a lower image quality mode for enhanced performance? I'm missing something here, why would anyone want to use this? Present for backwards performance for X8xx modes and their older SM 2.0 I suspect.

3Dc, so what,
http://www.beyond3d.com/previews/nvidia/g70/index.php?p=02 said:
*Update: NVIDIA have confirmed that 3Dc isn't supported by G70 hardware, however they do support a V8U8 format which can be used for 2:1 compression of two component Normals. When an application calls for 3Dc the NVIDIA’s driver will convert the relevant textures to this format at the load time and use these during the applications operation.
So developers can use 3Dc and we'll still enjoy the games. May be an issue when games actually require a full 512MB framebuffer since the compression would be less, but by then the next gen nV product will be available and rumor has it that 3Dc will be in hardware (remember the G70 is really only a refresh of the older tech).


Shimmering: let's wait for the German hardware tech site to exhaustively test the X1000 series for this. The reason you don't see this on the older tech is that they were not great performers at the resolutions we are now using and seeing the shimmering at (although I don't with the 78.03 drivers, but this must be subjective even if nVIDIA says they had a patch in these drivers :rolleyes: ). It's quite possible that the issue may now be present in the ATI top end as well. Let's wait before we all end up having to say OMFG it's present on ATI as well, but nV is still suxxorz.
 