4870X2 reviews beating GTX 280 SLI

Well, the good news is that ATI seems to be writing their drivers to handle multi-GPU setups better, especially since they seem dedicated to this, and Nvidia likely is as well. Hopefully, with all the swirl around GPGPU and whatnot, devs will code future games to support multi-GPU better too.

And an old game shouldn't have any trouble with the performance a 4870 puts out anyway.

That's exactly it. Newer games which need all the power of a 4870X2 will be supported through the drivers. Older games that aren't supported won't need that much power. Even a single 4870 will most likely allow you to run older games at max settings.
 
That's exactly it. Newer games which need all the power of a 4870X2 will be supported through the drivers. Older games that aren't supported won't need that much power. Even a single 4870 will most likely allow you to run older games at max settings.

Well, that's the way it's supposed to work, but life tells us that this isn't always the case. I'm not knocking multi-GPU solutions. CF looks to scale well these days from what I've read, with few issues, and my personal experience with this round of SLI has changed my view of SLI. It works well.
 
my personal experience with this round of SLI has changed my view of SLI. It works well.
Same. I didn't much like 8800GT SLI with 174/175 series drivers -- which is primarily why I was so quick to get a GTX 280. I later tried the waters again with a second GTX 280. Much better. Still some microstuttering, but nothing unbearable like it was (for me) with G92.

Two factors common to both SLI and CF that are apparently making this generation better for multi-GPU are (a) the underlying individual cores (GT200, RV770) are faster than their previous-gen counterparts, so worst-case framerates will be better; and (b) continual improvement of the drivers.
 
I hate how Nvidia has that "The Way It's Meant to be Played" program. They always get their hands on good games to make them run better on their cards before they're released.
 
I hate how Nvidia has that "The Way It's Meant to be Played" program. They always get their hands on good games to make them run better on their cards before they're released.

I don't think that's really the case. Isn't AoC a TWIMTBP title that runs better on AMD cards? There may be others, but I don't think that this program is as technically oriented as it is marketing oriented.
 
Well he just got the boot. I saw that post and just left it alone. ;)

Yeah, he was kind of annoying, that's for sure.

Either way I might do as you do: wait till those new 55nm GTXs come out and see how they do against the 4870X2.

I'm all for price/performance... but I'm not sure how much more of these new games my 6400+ at 3.5GHz can handle...

Time will tell
 
Just wondering, has anyone actually clocked his/her GTX 280 up to 750MHz to see what kind of performance the new GTX 290 (?) will get? Is it even possible for the GTX 280 to get that high?
 
Just wondering, has anyone actually clocked his/her GTX 280 up to 750MHz to see what kind of performance the new GTX 290 (?) will get? Is it even possible for the GTX 280 to get that high?

I've not seen a GTX 280 clocked that high, though 260s can hit that. The best I get is 660/1420/1150.
 
Just wondering, has anyone actually clocked his/her GTX 280 up to 750MHz to see what kind of performance the new GTX 290 (?) will get? Is it even possible for the GTX 280 to get that high?

I had a single GTX 280 stable at 729/1458/1377. Dropped to 702/1458/1350 for SLI.
 
I don't trust min fps numbers on multi-GPU setups either. Those help you compare stuttering, not microstuttering.

Just because you don't notice microstuttering doesn't mean it isn't affecting quality.

My point is a dual GPU at 90fps can look like a single GPU at 70-80fps. But you won't "see" the microstuttering because even 70fps is pretty fast. My advice would be to turn up the AA on the dual GPU until it has the same FPS as a single GPU and see if they have the same smoothness. If the dual GPU doesn't look as smooth, it is because of microstuttering.
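
If you'd rather put a number on it than eyeball it, here's a rough sketch (my own, not from any review site) that chews through a FRAPS-style frametimes log. I'm assuming a CSV where each row ends with the cumulative frame completion time in ms and the file is named "frametimes.csv"; adjust the parsing for whatever your capture tool actually spits out:

Code:
import csv

def frame_deltas(path):
    # Read cumulative frame times (ms); return the gap between consecutive frames.
    times = []
    with open(path) as f:
        for row in csv.reader(f):
            try:
                times.append(float(row[-1]))  # last column = cumulative ms
            except (ValueError, IndexError):
                continue  # skip headers and blank lines
    return [b - a for a, b in zip(times, times[1:])]

deltas = frame_deltas("frametimes.csv")
avg = sum(deltas) / len(deltas)
print("avg frame time:   %.1f ms (%.0f fps)" % (avg, 1000 / avg))
print("worst frame time: %.1f ms (%.0f fps)" % (max(deltas), 1000 / max(deltas)))
# Microstutter shows up as alternating short/long gaps: a ratio near 1.0 is
# smooth pacing, 2.0 means every other frame arrives twice as late.
ratios = [max(a, b) / min(a, b) for a, b in zip(deltas, deltas[1:]) if min(a, b) > 0]
print("avg consecutive-gap ratio: %.2f" % (sum(ratios) / len(ratios)))

Run it on the single card and on the dual setup at matched average fps; if the dual rig's ratio sits well above the single card's, that unevenness is the microstutter you'd otherwise be squinting for.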

Microstuttering is something I feel you need to consider when you are trying to justify paying the extra cash for an X2, SLI, or CF. Especially in today's world, where the 4850, 4870, 260, and 280 are very good at playing current games and are pretty cheap.

Display lag is also an issue.
 
I thought microstuttering was an issue when you're below 60fps... anywho, I guess I'll find out for myself if this is indeed an issue when my card gets here next week.
 
People who are buying the 4870X2 care. Anywho, the previews seem to match the current reviews. I've linked all the new reviews on page 1, though.
 
Either way I might do as you do: wait till those new 55nm GTXs come out and see how they do against the 4870X2.


Wishful thinking, but it will not happen... They will be lucky if they can sandwich two of these beasts into a GX2 flavor.
 
Wishful thinking, but it will not happen... They will be lucky if they can sandwich two of these beasts into a GX2 flavor.


After looking at some of the recent benchmarks, I think I'll go with a 4870. I noticed sometimes it beat a GTX 280, sometimes it lost. But either way it rocked at 1920x1200.

I'm usually an Nvidia guy, but the price/performance of a 4870 compared to a GTX 280 or even a 4870X2 is just awesome :)
 
I suggest a fresh OS load as I have never had good success removing NV video cards and going to AMD without resorting to a clean OS.
 
I suggest a fresh OS load as I have never had good success removing NV video cards and going to AMD without resorting to a clean OS.

Yeah, I learned my lesson back in the day going from a GF4 to a 9700 Pro :)
 
Mine was fine. I just uninstalled the Nvidia drivers, ran CCleaner and Driver Cleaner, then shut down, pulled the 8800GTS, and slapped in the 4870... haven't had a single issue. =/
 
Pretty impressive; looks like Nvidia is going to be hurting even more. 2 x 1GB of GDDR5. This is the card everyone has been waiting for. :)

4870X2 Reviews


[This Spot reserved for [H] Review :cool:]

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/9225-palit-radeon-hd-4870-x2-2gb-video-card-review.html

http://legitreviews.com/article/766/1/

http://techreport.com/articles.x/15293

http://driverheaven.net/reviews.php?reviewid=607

http://anandtech.com/video/showdoc.aspx?i=3372

http://www.guru3d.com/article/radeon-hd-4870-x2-review-crossfire/

http://www.hexus.net/content/item.php?item=14928

http://www.hothardware.com/articles/ATI-Radeon-HD-4870-X2--AMD-Back-On-Top/

http://www.extremetech.com/article2/0,2845,2327867,00.asp

http://www.pcper.com/article.php?aid=605

http://www.tweaktown.com/reviews/154...ire/index.html

http://www.techpowerup.com/reviews/Sapphire/HD_4870_X2/

4870X2 Previews

http://anandtech.com/video/showdoc.aspx?i=3354

http://enthusiast.hardocp.com/article.html?art=MTUzMSwxLCxoZW50aHVzaWFzdA==

http://www.driverheaven.net/reviews.php?reviewid=588

http://www.legitreviews.com/article/745/1/

http://techreport.com/articles.x/15105/1

http://www.hexus.net/content/item.php?item=14178

http://www.pcper.com/article.php?typ...&aid=590&pid=2

http://www.extremetech.com/article2/0,2845,2325444,00.asp

http://www.guru3d.com/article/radeon-hd-4870-x2-preview/1

Related Info:

Sapphire announces 4870X2 release date: August 12, 2008, at the launch price of $499!
http://xtreview.com/addcomment-id-60...ease-date.html

Alright, updated and added all the reviews... all the ones that I know of, anyway. I still haven't read through all of them myself.
 
After looking at some of the recent benchmarks, I think I'll go with a 4870. I noticed sometimes it beat a GTX 280, sometimes it lost. But either way it rocked at 1920x1200.

I'm usually an Nvidia guy, but the price/performance of a 4870 compared to a GTX 280 or even a 4870X2 is just awesome :)

I think the GTX 260 is pretty good bang for the buck as well at the resolution you're running. I do think the 4870 has a bit of a performance edge, though, and would be the obvious choice, especially if you have an Intel chipset on the motherboard.
 
I don't trust min fps numbers on multi-GPU setups either. Those help you compare stuttering, not microstuttering.

Just because you don't notice microstuttering doesn't mean it isn't affecting quality.

My point is a dual GPU at 90fps can look like a single GPU at 70-80fps. But you won't "see" the microstuttering because even 70fps is pretty fast. My advice would be to turn up the AA on the dual GPU until it has the same FPS as a single GPU and see if they have the same smoothness. If the dual GPU doesn't look as smooth, it is because of microstuttering.

Microstuttering is something I feel you need to consider when you are trying to justify paying the extra cash for an X2, SLI, or CF. Especially in today's world, where the 4850, 4870, 260, and 280 are very good at playing current games and are pretty cheap.

Display lag is also an issue.

I am very curious to see how you can "notice" the difference between 90fps and 70fps on a monitor that refreshes your screen at 60Hz.

Now entering "explain it to me like I was five years old" mode :cool:

The human eye gets the illusion of continuous movement at around 24 images per second; movies are 24fps :p

For gaming purposes, 30fps minimums make for good gameplay, but when a card reaches 60fps minimums one can rest assured that no further performance gains will be noticed, because the LCD monitor simply will not be able to display the images any faster... your eye was defeated at 24fps anyway.

It is complete BS to use numbers like 65fps vs. 130fps to claim a "100%" performance gain.

A true performance gain is when you go from unplayable to playable numbers: 25fps in Crysis is unplayable, but a mere 5fps gain will put the gameplay at the 30fps mark and make it playable.

The worst-case scenario happens when a dual-GPU solution sends two pictures in a row, making 40fps be perceived as 20+ fps (and below 24fps the human eye MAY perceive stuttering). Any claim that a card with 50fps minimums has microstuttering is FALSE from the start.
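
To put example numbers on that worst case: say AFR delivers frames in uneven pairs, 8ms between the two frames of a pair and then 42ms until the next pair. That averages out to 25ms per frame, so the fps counter reports 1000/25 = 40fps, but the motion your eye actually tracks is gated by the long 42ms gaps, which is roughly 1000/42, i.e. 24fps or so.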

So the arguments that card Y has microstuttering at 22fps are pure GERMAN CRAP: the game was already at an unplayable setting, microstuttering notwithstanding :mad:

I insist that minimum fps figures are good grounds for making assumptions about dual-card configurations:

<24fps - the game is stuttering already; microstuttering is just meaningless
24-30fps - the game is microstuttering unless the drivers are perfectly timing the frames.
30-40fps - the game MAY be plagued by microstuttering; driver optimizations are crucial.
41-48fps - only the worst-case scenario will show microstuttering, in very rare situations.
49+fps - sorry, skeptic idiots :rolleyes: it is simply impossible to claim microstuttering with the naked eye here; maybe the Six Million Dollar Man with his bionic eye could see it in scenes with 10k+ rendered objects :p

So if there is a single-card solution that reaches 27fps at a given resolution, and a terribly scaling dual-GPU solution that has "only" a 66% performance gain and reaches 45fps, I am truly sorry, but the dual-GPU solution has really smashed the competition. Now imagine that the dual-GPU solution can do the same thing with FREE 8xAA :eek:
Just to complete the argument crushing, I would like to point out that the response lag and input lag of monitors are in the ms (millisecond) realm. At 30fps a single frame lasts around the 33ms mark; good monitors lose less than a frame, very bad ones lose a notch above 2 frames, nothing that will make a game unplayable or make one decide for VGA X or Y. Response times haven't been an issue for the last 4 years; 2ms vs. 8ms won't make even a single frame of difference in the end :cool:. Input lag is an issue for professional players of first-person shooters, where being a frame behind is a big deal. ;)

The [H] review is a much-awaited opinion, hopefully the final word on the microstuttering debacle.
 
The only time you're going to notice anything is when the frame rate dips below 60FPS. Now if microstuttering does cause it to dip below, then it's possible to notice. But anything above 60FPS won't be noticeable if your monitor only refreshes at 60Hz.
 
Time to pull out the big guns. Let's quote Einstein:

"Only two things are infinite, the universe and human stupidity, and I'm not sure about the former"

Hey, how about forming an argument next time instead of posting an irrelevant quote?

Ad hominem FTL.
 
I am very curious to see how you can "notice" the difference between 90fps and 70fps on a monitor that refreshes your screen at 60Hz.

First and most importantly, that was an example. The same logic applies to the 20-30fps range. My point was that min fps numbers tell you next to nothing about microstuttering.

Second of all, my FW900 CRT refreshes at 80-100Hz; not everyone uses LCDs.

Third, the higher the FPS past the monitor refresh, the less delay between the time the picture is rendered and the time it is displayed. And even the small microstutter on single GPUs will cause an LCD to repeat the same frame often if you are running right at 60fps: at a steady 60fps on a 60Hz panel, a frame that takes 17ms instead of 16.7ms misses its refresh, so the previous frame gets shown twice.

The human eye gets the illusion of continuous movement at around 24 images per second; movies are 24fps :p

It is not a perfect illusion; it is a practical illusion that is meant for viewing, not for gaming. Gaming is 100% computer-rendered video, and so when you are tracking moving objects (or the map moving relative to your movement) the difference between even 100Hz and 120Hz is perceivable because of afterimage. In a blind test, even half-decent gamers will tell the difference between 60fps and 100fps with 100% accuracy on all the monitors I have used. Just because you can't explain why something is happening doesn't mean it is your imagination.

The worst-case scenario happens when a dual-GPU solution sends two pictures in a row, making 40fps be perceived as 20+ fps

Now you are saying what I was saying when you quoted me.

Any claim that a card with 50fps minimums has microstuttering is FALSE from the start
Now you're dead wrong, unless you meant to say visually noticeable microstuttering. But the fact is there can be microstuttering at 150fps but not any at 100fps and below, though obviously there's no reason to care about it in that situation.

<24fps - the game is stuttering already; microstuttering is just meaningless
24-30fps - the game is microstuttering unless the drivers are perfectly timing the frames.
30-40fps - the game MAY be plagued by microstuttering; driver optimizations are crucial.
41-48fps - only the worst-case scenario will show microstuttering, in very rare situations.
49+fps - sorry, skeptic idiots :rolleyes: it is simply impossible to claim microstuttering with the naked eye here; maybe the Six Million Dollar Man with his bionic eye could see it in scenes with 10k+ rendered objects :p
Now your ass is speaking.


Response times haven't been an issue for the last 4 years; 2ms vs. 8ms won't make even a single frame of difference in the end :cool:. Input lag is an issue for professional players of first-person shooters, where being a frame behind is a big deal. ;)
Response time in LCD terms has nothing to do with input lag. And 5-30ms of input lag is not something that only "professional gamers" consider. There are plenty of people who enjoy a high level of play even though they aren't "professional gamers."

The [H] review is a much-awaited opinion, hopefully the final word on the microstuttering debacle.
I never said the 4870X2 was a bad card or has problems. I just said there are known issues with multi-GPU setups, and any such issues this card has need to be evaluated before you rush to compare FPS numbers. And in fact, it appears that the 4870X2 has pretty much defeated microstuttering in all practical situations.
 
First and most importantly, that was an example. The same logic applies to the 20-30fps range. My point was that min fps numbers tell you next to nothing about microstuttering.

I can agree that microstuttering will happen at any frame rate, but I still fail to see how UNPERCEIVABLE microstuttering can impact the gaming experience. Will you defend the 27fps without microST, or the 45fps with a little microST?

MINIMUM FPS TELLS YOU ALL YOU NEED TO KNOW ABOUT PERCEIVABLE MICROST :)

Second of all, my FW900 CRT refreshes at 80-100Hz; not everyone uses LCDs.

Can a CRT refresh at 100Hz at 1920x1200 or 2560x1600, the battlefield of this card?



It is not a perfect illusion; it is a practical illusion that is meant for viewing, not for gaming. Gaming is 100% computer-rendered video, and so when you are tracking moving objects (or the map moving relative to your movement) the difference between even 100Hz and 120Hz is perceivable because of afterimage. In a blind test, even half-decent gamers will tell the difference between 60fps and 100fps with 100% accuracy on all the monitors I have used. Just because you can't explain why something is happening doesn't mean it is your imagination.

This sentence is the very reason why I felt quite right quoting Einstein: a monitor refreshes the screen at 60Hz, and you claim to be able to tell the difference between a scene rendered at 65fps and a scene at 130fps WHEN IT IS DISPLAYED AT 60FPS. Is this difference enough to dump in the trash the card that rendered the game at 65 MINIMUM fps? :rolleyes:

In video forums there has been a long discussion about WHETHER a 120Hz Full HD LCD can beat a Full HD 60Hz plasma; the HDMI link just can't accommodate sound and 1080p video on the same link at 120Hz :rolleyes:


Now you're dead wrong, unless you meant to say visually noticeable microstuttering. But the fact is there can be microstuttering at 150fps but not any at 100fps and below, though obviously there's no reason to care about it in that situation.

Noticeable microST: is there any other kind of microST that matters for buying decisions?

Response time in LCD terms has nothing to do with input lag. And 5-30ms of input lag is not something that only "professional gamers" consider. There are plenty of people who enjoy a high level of play even though they aren't "professional gamers."

OHHH, now all of a sudden we are beginning to enter the reality distortion field. I NEVER said that input lag is the same as response time; I only showed that your idea that response time and input lag would interfere with the VGA choice had no solid ground, much like the 70 vs. 90fps situation.

I never said the 4870X2 was a bad card or has problems. I just said there are known issues with multi-GPU setups, and any such issues this card has need to be evaluated before you rush to compare FPS numbers. And in fact, it appears that the 4870X2 has pretty much defeated microstuttering in all practical situations.

it appears that the 4870X2 has pretty much defeated microstuttering in all practical situations

End of argument then :D

Quoting Einstein is a practical and comical way to give up a discussion: after you enter "explain it to me like I was five years old" mode and there is still resistance, it is better to let those who can perceive the truth learn it and simply let the rest believe that all species were created in 7 days about 6k years ago :(

[H]addicts have upgraditis, and all the time unethical reviewers are trying to push whatever the industry produced (8600GTS, anyone? A card that was slower than midrange offerings of the previous generation and cost MORE :eek:). I am really tired of sites saying that a given card/setup was FAR superior based on 130 vs. 115fps numbers :eek:, and this time we are seeing another strange behavior: claiming that UNPERCEIVABLE microST is a SERIOUS problem with the best VGA in history, or that a card that costs $170 and is faster than a card that cost $300 two months ago DOESN'T deserve the laurels :mad:
http://www.extremetech.com/article2/0,2845,2327347,00.asp
 
Quoting Einstein is a practical and comical way to give up a discussion: after you enter "explain it to me like I was five years old" mode and there is still resistance, it is better to let those who can perceive the truth learn it and simply let the rest believe that all species were created in 7 days about 6k years ago :(

No, it's really an asshat/arrogant way to give up a discussion because you can't explain something or form a worthy argument.
 
it appears that the 4870X2 has pretty much defeated microstuttering in all practical situations

End of argument then :D

Yes, kill this argument. I think people started to believe they were seeing microstuttering when they SLI'd the G92s (8800GTS/9800GTX) and the 9800GX2. In reality they were probably seeing lag/stuttering from the limited 512MB frame buffer... and perhaps some microstuttering, as it was more prevalent in those cards. The 4870X2 has minimized microstuttering to the point where your eyes can't detect it; microstuttering is not an issue with the 4870X2. End of discussion. :cool:
 
My CRT refreshes at 90Hz at 1920x1200.

When I said you can distinguish between 100 and 120, I was talking in terms of pure fps, not fps on a 60Hz LCD. I addressed how 60+fps may help on an LCD elsewhere.

Anyways, you guys should know that Einstein was smokin the reefer when he made half of these quotes.
 
What's funny is that 90% of gamers out there are seeing right through nVidia this round. It is a no-brainer. Big deal. nV lost and ATI is going to make money. That is how the industry is: loss, rebound, $$$, rinse, repeat.

I find it funny that there are gamers who would stick with a company just for the sake of the company. If nV came out with a card that completely sucked ass and was getting trounced by ATI's card by over 200 FPS, there would be nutjobs trying to defend it. Same goes for ATI.

Even if there was a performance gap between the 4870X2 and GTX 280 SLI, our EYES WOULD NOT NOTICE IT. No matter what you want to believe, we as mammals, with the eyes in our heads, can't see it.

Bottom line is I know what I paid for my card and what the difference is, performance-wise, between the flagship cards from nV and ATI.

Buy what you desire and be happy... hehe, even though ATI bitch-slapped nV for now. Everyone knows it; it's no secret. :)

Night guys.
 