AMD ATI Radeon HD 4870 X2 Preview

The shader core used on the RV770 is pretty much identical to the one used on the RV670. The main difference is that anti-aliasing is resolved in the ROPs instead of in the shaders, as on the RV670, so the RV770 benefits more from the additional bandwidth than the RV670, which was more ROP bound.

ATi was at a real disadvantage before because, in reality, the RV670 had only 64 stream processors (they count as 320 only when five operations can be executed per cycle, since it is a superscalar architecture) against nVidia's 128 (which perform 128 operations per cycle no matter what, because it is a scalar architecture). Now the difference is smaller: ATi has 160 (which act as 800 when the same five operations per cycle can be packed) while nVidia has 240 (which execute 240 instructions regardless). A superscalar architecture gives you more raw power to process shaders without spending lots of transistors, but since not all game shaders are alike, maximizing shader-core usage on ATi hardware requires optimizations at the game and driver level, and its performance is unpredictable. The nVidia GPU, being a scalar design, has a fat shader core that requires little or no optimization, so its performance is more consistent and predictable, even though its raw power is lower.

And yes, CFAA is usable in many games, especially on the HD 4800 series. (The RV670 actually has a more optimized datapath for performing all anti-aliasing operations in the shaders, which explains why the HD 4800 takes a bigger performance hit than previous generations when CFAA is enabled, especially with Edge Detect, where the HD 4800 series performs almost identically to the HD 3800 series.)
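To put rough numbers on that raw-throughput comparison, here's a minimal C sketch using only the unit counts quoted above (illustrative only; it ignores clocks, scheduling, and everything else that matters in practice):

[code]
#include <stdio.h>

/* Best/worst-case ALU operations issued per clock, using the
 * stream-processor counts quoted above. ATi units are 5-wide
 * superscalar; nVidia units are 1-wide scalar. */
int main(void) {
    const int rv670_sp = 64, rv770_sp = 160, vliw_width = 5;
    const int g80_sp = 128, gt200_sp = 240;

    printf("RV670: worst %d, best %d ops/clock\n", rv670_sp, rv670_sp * vliw_width);
    printf("RV770: worst %d, best %d ops/clock\n", rv770_sp, rv770_sp * vliw_width);
    printf("G80/G92: always %d ops/clock\n", g80_sp);
    printf("GT200:   always %d ops/clock\n", gt200_sp);
    return 0;
}
[/code]

The gap between "worst" and "best" on the ATi side is exactly the part that game- and driver-level optimization has to close.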

Just quoting this because you like to post without reading, ihira.
So read this and be quiet. It will explain how your sarcasm just failed horribly. The way the cards handle the aliasing is different.

The drivers do matter for the GTX 280, but if you read my hypothetical situations, chameleoki, you would see that the extent to which the drivers affect the GTX 280 varies. I only said that because some of the drivers were downclocking the card and thus adversely affecting performance.
 
lol
Who cares if they handle AA differently? ATI's solution is superior in that it does what we need it to: gives higher framerates.

If you had something solid as evidence that NVidia has a superior solution, you would have given it by now.
The basis of your argument is things that might happen.
I don't see older ATI users complaining about lack of support.
If the card has the required feature set for a game, it works.
 
lol
Who cares if they handle AA differently? ATI's solution is superior in that it does what we need it to: gives higher framerates.

If you had something solid as evidence that NVidia has a superior solution, you would have given it by now.
The basis of your argument is things that might happen.
I don't see older ATI users complaining about lack of support.
If the card has the required feature set for a game, it works.
Yeah, but in some reviews the ATI 4800 cards had lower minimum framerates with AA on than the Nvidia cards.
 
Yeah, but in some reviews the ATI 4800 cards had lower minimum framerates with AA on than the Nvidia cards.

Grabbing at any little thing you can? By any chance, is one of your aliases Maroon1? :p :confused:

Avg frame rate is more important... and we all know it..
 
Grabbing at any little thing you can? By any chance, is one of your aliases Maroon1? :p :confused:

Avg frame rate is more important... and we all know it..
I'm not grabbing at anything, just simply pointing out something, much like you did. It's BS that you can say the 4850 does better in AA but I can't say that the minimum frame rate is lower. Also, the minimums are VERY important in a game, and you should know that.
 
Not as important as Avg..

Edit:
How often, and for how long, will an absolute minimum occur?

Not very often and not for very long, and as such I and many, many others hold it less important than avg..
 
Not as important as Avg..
You just can't make a blanket statement like that, because it depends on how bad the minimum is. I would rather average 60fps with a low of 45 than average 70fps with a low of 25 any day. Every game is different, but if both games have playable average framerates, I would pick the card that delivers the higher minimum framerate, if the difference is significant enough.
 
Like I said, absolute mins do not occur often or for very long, for if they did the AVG FPS would be lower.
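To put the arithmetic in view, here's a tiny C sketch over a made-up 10-second trace (purely illustrative numbers):

[code]
#include <stdio.h>

/* Hypothetical one-second FPS samples over a 10 s run:
 * steady 60 FPS with a single 1 s dip to 20 FPS. */
int main(void) {
    int fps[10] = {60, 60, 60, 60, 20, 60, 60, 60, 60, 60};
    int sum = 0, min = fps[0];
    for (int i = 0; i < 10; i++) {
        sum += fps[i];
        if (fps[i] < min) min = fps[i];
    }
    /* avg = 56 FPS, min = 20 FPS: a rare dip barely moves the
     * average, but it is the whole story of the min. */
    printf("avg = %d FPS, min = %d FPS\n", sum / 10, min);
    return 0;
}
[/code]

One brief dip costs the average only 4 FPS, which is why the two numbers measure different things.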
 
Like I said, absolute mins do not occur often or for very long, for if they did the AVG FPS would be lower.
Well, we could argue this all day, but huge variations in framerate are much more aggravating than averaging close-to-steady framerates.
 
What's to argue?

Absolute min is simply the lowest framerate: it's one number, one occurrence when the FPS dipped to its lowest point. Avg is avg. There's nothing to argue..

In order of importance,
Avg
Absolute Min
Absolute Max

Besides, the difference in min FPS between the GTX 280 and the X2 is minimal, so it really should not be that big of a deal... Well, unless... Well, I won't say it..
 
Again, you CAN'T make a blanket statement like that. In most games, really low dips in framerate will kill the experience no matter what the average is. Getting 80fps one second and 30fps the next is much worse than having a slightly lower but more consistent average. If the average were all that mattered, good reviewers wouldn't even mention anything else.
 
Sure, whatever you say...

But at the end of the day we all know which bar people look for first..
 
Sure, whatever you say...

But at the end of the day we all know which bar people look for first..
Well, if that's the only bar they look at, then they are being ignorant. The overall experience is NOT just based on average framerate, and maybe it's time you learned that. The 9800GX2 can average higher fps than a GTX 280 in some games, but guess what: the minimum framerate is usually lower, which makes for a crappy experience compared to the smoother, more consistent rate of a single GTX 280. Most of the issues with dual cards are that they drop the minimum framerate down too low at times, even though they average high framerates, and that gives the perception of bad performance. Please let that sink in.
 
Oooh, my giddy aunt...

I said the first bar, not the only bar...

All I'm saying is that the difference in absolute min FPS between the GTX 280 and the 4870 X2 is not drastic enough for most people to consider the GTX 280 outright better than the X2.. And I sure hope you're not trying to say that, cause I'll go to bed at that point. :p

And I wasn't even talking about dual-card setups, I was just speaking comparatively between the GTX 280 and the 4870 X2..

Edit: And it's too early to be going on about this comparison, as official reviews are not even out...
 
Just quoting this because you like to post without reading, ihira.
So read this and be quiet. It will explain how your sarcasm just failed horribly. The way the cards handle the aliasing is different.

The drivers do matter for the GTX 280, but if you read my hypothetical situations, chameleoki, you would see that the extent to which the drivers affect the GTX 280 varies. I only said that because some of the drivers were downclocking the card and thus adversely affecting performance.

;) :p
 
lol
Who cares if they handle AA differently? ATI's solution is superior in that it does what we need it to: gives higher framerates.

If you had something solid as evidence that NVidia has a superior solution, you would have given it by now.
The basis of your argument is things that might happen.
I don't see older ATI users complaining about lack of support.
If the card has the required feature set for a game, it works.

Are you not listening?
The game has to be programmed for IN THE DRIVERS.
As in for the LATEST game the LATEST drivers need to be used.
If the game is not PROGRAMMED FOR IN THE DRIVERS THEN IT WILL NOT HAVE THIS PERFORMANCE.
READ!
 
Well, since all games use a standard API, they should perform well regardless of the architecture. But in cases like the ATi cards, which are a superscalar architecture (HD series), some games (not all) may require optimizations at the driver level to perform better, especially games with little shader work and lots of textures, which favor the nVidia hardware. Old games should have no problems, since they don't require that much raw power to run; only in very rare cases will a game perform sluggishly. Usually, issues with games are image-quality problems like missing textures, etc., which happen on both nVidia and ATi. But performance improvement is a plus for the sake of competition and appeal.
 
Are you not listening?
The game has to be programmed for IN THE DRIVERS.
As in for the LATEST game the LATEST drivers need to be used.
If the game is not PROGRAMMED FOR IN THE DRIVERS THEN IT WILL NOT HAVE THIS PERFORMANCE.
READ!

No it doesn't, you are way off the mark.
Put up some figures if you want your argument to have merit, and don't use caps, it's childish.
Also, I don't care for your tone; curb it or I'll return the compliment.

I don't see anyone reporting that their older ATI cards run games any worse than expected for their generation.
In fact, the old X1900 series still performs better than NVidia's competing 7900 cards from the same era.
That's the opposite of what you claim is happening.
 
Not an SLI argument.
The architecture of the ATI hardware requires optimizations in the drivers to perform as well as it does.
The GTX2xx does not, because its architecture has its 240 shaders as exactly that: 240 shaders.

The shaders from the 8 series and the GTX 200 series do not differ in the way they compute the mathematics. What nVidia did with the GT200 is put 3 streaming multiprocessors in each Texture Processing Cluster, with 10 TPCs as opposed to 8.
This gives a higher shader count of 240 (10 x 3 x 8), which isn't really more efficient.
So per TPC we have:
24 shader processing units
8 texture fetch units
10 x 24 = 240 shaders
10 x 8 = 80 texture fetch units
plus a 512-bit memory bus
This all adds up to a huge and overpowered single-GPU solution.
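If you want to sanity-check those totals, here's a trivial C snippet that just recomputes the unit counts from the per-TPC figures above (illustrative only):

[code]
#include <stdio.h>

/* GT200 unit counts as laid out above: 10 TPCs, each holding
 * 3 SMs of 8 scalar shader processors plus 8 texture fetch units. */
int main(void) {
    const int tpcs = 10, sms_per_tpc = 3, sps_per_sm = 8, tfus_per_tpc = 8;
    printf("shader processors:   %d\n", tpcs * sms_per_tpc * sps_per_sm); /* 240 */
    printf("texture fetch units: %d\n", tpcs * tfus_per_tpc);             /* 80 */
    return 0;
}
[/code]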

Whereas ATI changed the way their streaming processors work. As the earlier posts by the person who stated this explain, if ATI codes the drivers for a game, it allows them to map their shaders and lets their architecture become more effective.

What I am saying to you is that in a world where we can't depend on driver updates, the ATI card will win when the drivers are kept up to date for the latest games, but will always fall short in older games because of the inability to create profiles for most games. The difference in this argument between a single-GPU nVidia card and a single-GPU ATi 4xxx-series card is that to get the performance the 4870 is capable of, you will NEED profiles for each game, whereas an nVidia card does not.
Oh I see; you're not talking about SLI/Crossfire.

My impression is that you don't really know what you're talking about. What experience do you have writing shader compilers or GPGPU backends that leads you to believe this about their architectures? You keep saying that the AMD cards require significantly more optimization to achieve good performance, but you don't seem to be backing it up with anything quantifiable or with experimental data.

Write me a shader or gpgpu program which demonstrates this supposed optimization requirement; or show me an example of someone else who has done this.

Yes, we understand that AMD uses SIMD processors and nvidia uses SISD, but that doesn't necessitate one requiring more optimization than the other to achieve good performance.

People have been arguing over the respective benefits of each for much longer than video cards have been around. Suffice to say, there is no definitive answer to this as you would suggest. For example, SSE (Streaming SIMD Extensions) and its various incarnations provide significant performance benefits in certain applications with no optimizations other than the standard techniques done by the compiler. I doubt you would suggest that CPUs would be better off without SSE.
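To make the SSE point concrete, here is a minimal sketch (standard SSE intrinsics, nothing application-specific; the function name is just for illustration):

[code]
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics */

/* Adds two float arrays 4 elements at a time using SSE.
 * The SIMD width gets used with no per-application tuning;
 * this is the standard pattern compilers auto-vectorize anyway. */
void add_arrays(const float *a, const float *b, float *out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
    for (; i < n; i++)  /* scalar tail for leftover elements */
        out[i] = a[i] + b[i];
}

int main(void) {
    float a[6] = {1, 2, 3, 4, 5, 6}, b[6] = {6, 5, 4, 3, 2, 1}, c[6];
    add_arrays(a, b, c, 6);
    for (int i = 0; i < 6; i++) printf("%g ", c[i]);  /* prints all 7s */
    printf("\n");
    return 0;
}
[/code]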
 
Look, average is NOT a good measure. On AVERAGE, everyone in America has one testicle and one ovary. Do you still think average is such a good statistic?
 
I'm a Computational Physics major and I feel no need to tell some of these lower life-forms how exactly it goes down.

What is a computational physics major? Is it a lot of comp science with some physics, or is it a mix of electrical engineering with physics? Never heard of such a thing before, well, at least in the USA.
 
Charlie, Nvidia still requires optimizations, you know that right?
Your only 'argument' for the GTX 280 is a hypothetical:
"Oh well, if we didn't have drivers that were optimized, it would be better."
That's plain retarded.
On top of that, the ATI archs from the X1800 and X1900 series were the same way compared to their generational counterparts, and I see the X1800 and X1900 lasting way longer than their Nvidia counterparts.
The 4870X2 beats the GTX280.
 
Of course the GTX280 requires optimizations, but its shader processing is very similar to the rest of nVidia's offerings. nVidia's optimizations aren't to the extent where, upon the release of a new video game, drivers must be rolled out for adequate performance. I'd love to see a 4870x2 use Catalyst drivers from before its release to see how good it is. You won't see it. I can tell you that a GTX280 on previous drivers will not perform as badly as you think.
I'd be happy to run my 280s on a driver from before their release, and I'd love to see them pitted against the 4870x2 so Nenu can go crawl in his hole. This would end the argument, since it would show both cards running drivers that were not created for them.

In fact, Nenu, your new name is just Dealdaddy, because you can't handle that the card is good, but only when optimized for each game. I highly doubt you will stomach these current 4870x2 drivers running Far Cry 2 well; in fact, I expect you to say "wait for new drivers" upon the game's release to defend your purchase.

What Nenu is trying to do is use cognitive dissonance to justify his purchase of the 4870x2. Go ahead, buy it. If you're not going to listen to evolucion8, I think you're pretty ignorant. You think being the DealDaddy is "the win" for you, Nenu. You'll see. I hope you bought a 3870x2 to learn your lesson about launch hype, but something tells me you haven't.
 
No really, I'm happy with what I have and don't feel the need to spout to the world how good my hardware is and try to convince everyone that what I have is better.
You talk utter rubbish and have no way of backing it up, as you have not backed up the comments that we have asked for proof on.
I don't know what has happened in your life, but you come across as very insecure and childlike.

You have made assumptions about people that are wrong, and you base your argument on things that might happen.
You have no proof that NVidia has bettered ATI this generation, and even when talking about older generations, you got it wrong.
You remind me of a spoilt child :)
 
I don't know what people are on about with this min FPS stuff; according to the Hard Forum preview, in the apples-to-apples section for Crysis, the X2 and an OCed GTX 280 get the exact same min FPS of 11.
And in the AoC apples-to-apples, the X2 beats out the GTX 280 in min FPS...

In the PC Perspective preview, in BioShock the X2 has better min FPS than the GTX 280 at all resolutions; in the Call of Duty test the min FPS are separated by 5 FPS or less, so that's nothing to write home about; and again, in the two high-resolution Crysis tests, the min FPS are separated by 3 FPS or less, again nothing to write home about.

Other previews don't list min or max FPS.

So the min FPS differences between the X2 and GTX 280 are tiny. Nothing to worry about, not at all.
 
Of course the GTX280 requires optimizations, but its shader processing is very similar to the rest of nVidia's offerings. nVidia's optimizations aren't to the extent where, upon the release of a new video game, drivers must be rolled out for adequate performance.
You keep saying this, but where's your evidence to back it up? Everything I know about the hardware suggests that what you are saying is false.

What part(s) of the rendering pipeline requires significantly more manual optimization on AMD hardware? Show us an example of this in practice.
 
I have a 3870X2, Charlie, and I don't update my drivers regularly.
It's holding up better than most of my friends' cards from that generation.
You clearly don't know what you're talking about.
 
Yeah, Charlie's right, I would never buy a card that requires me to update its drivers slightly more often than normal. What an inconvenience. :rolleyes:
 
I have a 3870X2, Charlie, and I don't update my drivers regularly.
It's holding up better than most of my friends' cards from that generation.
You clearly don't know what you're talking about.

Right, Fidel, by getting owned by a single 8800 GTX.
Because 3DMark06 said otherwise, and that's what you choose to believe.
If your beloved 3870X2 is so good, how come it's worth about 140 bucks now?
Buy another and run CrossFireX!
 
Yeah, Charlie's right, I would never buy a card that requires me to update its drivers slightly more often than normal. What an inconvenience. :rolleyes:

Right, and you're being sarcastic without knowing what the heck you're talking about. You don't even realize that your games will be dependent upon drivers.
 
You keep saying this, but where's your evidence to back it up? Everything I know about the hardware suggests that what you are saying is false.

What part(s) of the rendering pipeline requires significantly more manual optimization on AMD hardware? Show us an example of this in practice.

How can you read evolucion8's post and ask questions that you have the answers to?
You can't support something a company will not let you look at. Do you think ATI would admit this? Do you think that both kernels aren't locked down? Of course they are. How can I show you this until reviews come out? When they do, I'm sure Kyle and Brent won't mind showing you a 4870x2 on outdated drivers to show you guys what I'm talking about.
 
Right, Fidel, by getting owned by a single 8800 GTX.
Because 3DMark06 said otherwise, and that's what you choose to believe.
If your beloved 3870X2 is so good, how come it's worth about 140 bucks now?
Buy another and run CrossFireX!

Yet again, more assumptions, and then going off topic to try to prove a different point that nobody here is discussing.
You are good at this :rolleyes:

Right, and you're being sarcastic without knowing what the heck you're talking about. You don't even realize that your games will be dependent upon drivers.

Another assumption that is clearly wrong.

How can you read evolucion8's post and ask questions that you have the answers to?
You can't support something a company will not let you look at. Do you think ATI would admit this? Do you think that both kernels aren't locked down? Of course they are. How can I show you this until reviews come out? When they do, I'm sure Kyle and Brent won't mind showing you a 4870x2 on outdated drivers to show you guys what I'm talking about.

Completely off topic, answer the question.
 
Right, and you're being sarcastic without knowing what the heck you're talking about. You don't even realize that your games will be dependent upon drivers.

I know enough that a superscalar architecture (R6xx/R7xx) doesn't need any more "driver optimization" than a scalar architecture (G8x/G9x/GT2xx) does. As for Crossfire profiles, yeah, they need to be updated (although technically AMD has a "compatible AFR profile" that works with every game, just not as well as an optimized AFR profile), but it's not a HUGE deal. Is it so hard to install hotfix drivers or a new driver every month? You make it sound like AMD will make you jump through hoops to get games working with Crossfire.
 
How can you read evolucion8's post and ask questions that you have the answers to?
You can't support something a company will not let you look at. Do you think ATI would admit this?

Everything that was said about SIMD processors requiring optimization on a game-by-game basis in order to achieve decent performance is incorrect. Do you have any empirical evidence to suggest otherwise?

Crossfire/SLI optimizations are one thing, but you said that wasn't what you were talking about.
 
Right, Fidel, by getting owned by a single 8800 GTX.
Because 3DMark06 said otherwise, and that's what you choose to believe.
If your beloved 3870X2 is so good, how come it's worth about 140 bucks now?
Buy another and run CrossFireX!

First of all, I don't use 3DMark...
Second, I got the card around launch time for a massively discounted price, because I have a friend who works at my local electronics store and used his employee discount, on top of the MIR, to get me the card for about 260 dollars. A stellar price at the time, and while slower than an 8800GTX, it sure as heck was cheaper. But this is all beside the point, and you're talking about things I never brought up, lol! :rolleyes:
I suppose that's because you can't prove your point.
You still can't accept that the 4870X2 is destroying the GTX280OC.
That's just pathetic...
It's just a graphics card..

How can you read evolucion8's post and ask questions that you have the answers to?
Because this guy has no credibility to me, given the posts I see at places like Beyond3D, where they have never mentioned this supposed 'Achilles heel' of ATi's arch.
Both cards need drivers.
Both cards use optimizations.
Both companies update their drivers frequently.
But you're wrong for thinking that the card is driven purely by its drivers.
 
Because this guy has no credibility to me, given the posts I see at places like Beyond3D, where they have never mentioned this supposed 'Achilles heel' of ATi's arch.
Both cards need drivers.
Both cards use optimizations.
Both companies update their drivers frequently.
But you're wrong for thinking that the card is driven purely by its drivers.

First of all, this guy has a name. Second, it's true that the card is not driven entirely by its drivers, since games are written against a standard API. Third, try reading the architecture reviews of the HD 2900XT or HD 3800 and you will see that the architecture is more dependent on optimizations to reach its peak performance than competing hardware.

Why? Simple: the HD 3800 GPU has 64 stream processors which are 5-way superscalar, each composed of 4 simple stream processors and 1 fat stream processor that takes charge of complex operations (MUL, ADD, etc.). That means that in the worst-case scenario (which pretty much never happens), the GPU will execute only 1 instruction per stream processor, that is, 64. But when conditions are ideal and the optimizations are done (especially at the game-engine level), 5 instructions can be packed and executed in parallel on each stream processor, for a total of 320 instructions per cycle.

That's why in some games, like BioShock, the HD 3x00's performance is phenomenal, rivaling and outperforming the 8800GTX in many scenarios, while games optimized around short shaders, which excel on more linear hardware like the GeForce 8 series, perform poorly on ATi hardware (Lost Planet, for example). So if you need proof, just let me know and I'll paste the links right here.
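To make the packing idea concrete, here's a minimal sketch in plain C standing in for shader code (the actual bundling is done by ATi's shader compiler; this just illustrates how data dependencies limit it):

[code]
#include <stdio.h>

int main(void) {
    float a = 1, b = 2, c = 3, d = 4, e = 5;

    /* Independent operations: no line depends on another's result,
     * so a 5-way unit could in principle issue all five together
     * in one cycle (best case: 5 ops/cycle per stream processor). */
    float r0 = a * b;
    float r1 = c + d;
    float r2 = a - e;
    float r3 = b * c;
    float r4 = d + e;

    /* Dependent chain: each line needs the previous result, so only
     * one slot per cycle can be filled (worst case: 1 op/cycle per
     * stream processor, i.e. 64 per cycle on the whole RV670). */
    float s = a * b;
    s = s + c;
    s = s * d;
    s = s - e;

    printf("%f %f %f %f %f %f\n", r0, r1, r2, r3, r4, s);
    return 0;
}
[/code]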
 
Even if the 3xxx series did not perform as well as expected, it is still the top series in any bang-for-your-buck discussion, especially at current prices.. And next in line is the current 4xxx..

-lays CH to rest-
 