FUD: Radeon 4870 X2 Beats GeForce GTX 280

defiant007

2[H]4U
Joined
Feb 27, 2006
Messages
3,497
http://www.vr-zone.com/articles/Radeon_HD_4870_X2_R700_Beats_GeForce_GTX_280/5851.html

While we are away for Computex, CJ let us know that the Radeon HD 4870 X2 (R700) prototype card is out and it beats the NVIDIA GeForce GTX 280 in 3DMark Vantage. The R700 card is basically made up of two RV770 GPUs with a 2x256-bit memory interface to either GDDR3 or GDDR5 memory. We asked our sources about R700 at Computex, and apparently AMD is going to let AIB partners decide the specs themselves. Therefore, the partners will set their own clock speeds, PCB designs, memory types, cooler solutions, etc., and there will be Radeon HD 4850 X2 and 4870 X2 cards differentiated by memory type. The R700 card is apparently doing pretty well at this stage, scoring about X5500 in the 3DMark Vantage Extreme preset, while the GeForce GTX 280 card is scoring X4800. Both sides are still working hard on optimizing their drivers for the new architectures, so we will probably see performance improve over time.

Looks like ATI might still be in the game. I wonder if this will force Nvidia to lower the MSRP of the GTX 280 and 260.
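
For anyone who wants the math, here's a quick back-of-the-envelope check on those rumored numbers (a Python sketch; the X5500 and X4800 figures come straight from the VR-Zone rumor above, not from anything measured):

```python
# Rough comparison of the rumored 3DMark Vantage Extreme preset scores.
# Both figures are from the VR-Zone rumor, not measured results.
r700_score = 5500    # rumored Radeon HD 4870 X2 (R700) score
gtx280_score = 4800  # rumored GeForce GTX 280 score

advantage = (r700_score - gtx280_score) / gtx280_score * 100
print(f"Rumored R700 lead over the GTX 280: {advantage:.1f}%")  # ~14.6%
```

So the rumor works out to roughly a 15% lead in one synthetic preset, with both sides' drivers still immature.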
 
Nothing new here, ATI has been winning 3DMark for some time. The 4870 X2 and the GTX 280 will probably trade blows in the real world. The GTX 280 will have the advantage where games don't do multi-GPU scaling well, and vice versa. As far as price, the 4870 X2 ain't going to be cheap, $500 minimum, and nVidia isn't going to cut the GTX 280 much because it's expensive to make until the 55nm shrink.
 
I don't care about 3DMark Vantage, ATI, or nVidia. I will buy the best-performing, most cost-effective card.
 
I don't care about 3DMark Vantage, ATI, or nVidia. I will buy the best-performing, most cost-effective card.

Neither one of these cards is going to be cost-effective. You buy these cards for performance's sake, pure and simple.
 
I wrote a program called ATImark, and believe it or not, my X1600 XT beat my SLI 8800 GTs every time. Weird, huh?

If it's not a game, it doesn't matter.
 
As heatless said, this means nothing. Look at the 3DMark performance of the R600s compared to the G80, lol.

And the CF issues as well.
 
So ATi takes the lead in a synthetic benchmark that carries Sapphire's (an AMD partner's) logos and has the best support for dual-GPU solutions, and we all know the R600 architecture is pretty good at the predetermined pathways 3DMark uses.

...but yes, there are still people who think the 3DMark programs actually measure gaming performance. They only measure 3DMark performance.

----

If we go down this rumour path, then we should take into consideration the GT200b, which should arrive in September... one month after the HD 4870 X2.
 
3DMark is a great tool, I think. It's a shame that it's been so badly abused. It's still a valid benchmarking and testing tool, but absent anything else it's just another synthetic benchmark. Unfortunately, somewhere along the way it became a GPU gauge, and I have a feeling Futuremark wishes it hadn't.
 
As heatless said, this means nothing. Look at the 3DMark performance of the R600s compared to the G80, lol.

And the CF issues as well.

I don't think there are going to be CF issues here, since this card will be seen as one card by Windows.
 
There are still people who think the 3DMark programs actually measure gaming performance. They only measure 3DMark performance.

I completely agree 110%!

My X800 XL scores 1,800 in 3DMark06, and fortunately this says absolutely nothing about how it compares to Tri-SLI 9800 GTXs in real-life gameplay...

Whew, I am SO glad that argument is over!

:rolleyes:

Unless........ it gives you a general idea of its performance.... but nah... that'd make too much sense...
 
I don't think there are going to be CF issues here, since this card will be seen as one card by Windows.

It's still not one GPU, and unless AMD has made a huge breakthrough, OS drivers are still involved in making use of the two GPUs, and that's still the hard part of building multi-GPU solutions.
 
Vantage and 3DMark06 measure things very differently. In fact, if you look closely, Vantage had favored Nvidia cards until ATI released a driver fix, the rumor being that Nvidia worked more closely with Futuremark this time around, so don't dismiss Vantage as though it's 3DMark06 all over again (it's not; for one, the CPU is not so disproportionately powerful this time).

That being said, it obviously doesn't mean anything for real performance yet, since people are still evaluating what the scores really mean, whereas 3DMark06 has been around for two years and been dissected left and right.

The more important thing is that this puts the 4870's performance at around X3000, which is quite an impressive score and in line with the idea that the 4870 is 25-30%+ more powerful than the 9800 GTX.
 
It's still not one GPU, and unless AMD has made a huge breakthrough, OS drivers are still involved in making use of the two GPUs, and that's still the hard part of building multi-GPU solutions.

Yes, and the 3870 X2 (and 9800 GX2) illustrate that well. The scaling just isn't there. So unless ATi unleashes some new magic pixie dust, any X2 solution will face the same limitations.
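
To put a number on "the scaling just isn't there", this is the usual way people reckon dual-GPU scaling efficiency (a sketch with made-up frame rates, not benchmarks of any of these cards):

```python
# Dual-GPU scaling efficiency: how close a two-GPU card gets to doubling
# a single GPU's frame rate. The values below are hypothetical placeholders.
single_gpu_fps = 40.0  # placeholder: one GPU on its own
dual_gpu_fps = 62.0    # placeholder: the X2 card in the same scene

speedup = dual_gpu_fps / single_gpu_fps      # 1.55x in this made-up case
efficiency = speedup / 2 * 100               # percent of the ideal 2x scaling
print(f"Speedup: {speedup:.2f}x, scaling efficiency: {efficiency:.0f}%")
```

Anything well short of 2x, or a game with no CrossFire profile at all, and the X2 falls back toward single-GPU performance, which is the limitation being pointed out here.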
 
I don't think there are going to be CF issues here, since this card will be seen as one card by Windows.

Well, if the HD 3870 X2 didn't appear as one card to Windows, then the HD 4870 X2 won't either, since it's a CrossFire solution just like the HD 3870 X2.
 
Even if it did, wouldn't Nvidia just make a GTX 280 X2 of their own and grab the crown back? Comparing dual-GPU solutions to single-GPU solutions is kind of misleading.
 
Even if it did, wouldn't Nvidia just make a GTX 280 X2 of their own and grab the crown back? Comparing dual-GPU solutions to single-GPU solutions is kind of misleading.

Sure they would, but not on 65nm. This is a big, power-hungry chip, and an X2 is out of the question right now.
 
Kyle probably can't comment on this anyway with the release of these cards so soon, but back in January he posted this here


R700 will be a multiple GPU architecture. Will not be CrossFire on a card. The architecture is being designed from the ground up to be "multiple GPU." Engineers at ATI have also told me specifically that AA is "fixed."

Hopefully this will clear up some of the rumors. As you know, we bat about 1000 when it comes to this kind of information. If I tell you more now, I will have to kill you.

Since then, what's changed?
 
Don't have to... SLI will work just fine.

But then CrossFire X would trump that. But then maybe tri-SLI would beat that. The GTX 280 is going to be the fastest single-chip GPU ever, by a good bit over existing single-card dual-GPU solutions. There's really no other option for nVidia, as anything less and they might very well have been better off waiting until they had such a part.

This is nVidia's game. AMD is taking the price/performance route. nVidia must astound us on June 17. Anything less would be a massive fail.
 
But then CrossFire X would trump that. But then maybe tri-SLI would beat that. The GTX 280 is going to be the fastest single-chip GPU ever, by a good bit over existing single-card dual-GPU solutions. There's really no other option for nVidia, as anything less and they might very well have been better off waiting until they had such a part.

This is nVidia's game. AMD is taking the price/performance route. nVidia must astound us on June 17. Anything less would be a massive fail.

Doubtful - the company has coffers full of cash, while AMD is broke and has massive debt. It's still AMD on the ropes - Nvidia can suffer one or two poor release cycles.
 
Even if it did, wouldn't Nvidia just make a GTX 280 X2 of their own and grab the crown back? Comparing dual-GPU solutions to single-GPU solutions is kind of misleading.

Too hot and too big. We'll have to wait until at least 45nm before we see a GX2 card. Let's be real: 55nm would still be too big for a GX2 based on the 280 GPU.
 
Yeah, considering the G80 had to go from 90nm to 65nm before we finally got a GX2 card, an optical shrink wouldn't be a big enough change. 45nm would be required for it to happen.
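
For context, the usual rough die-area arithmetic behind that argument (idealized linear-shrink math in Python; real shrinks rarely scale this cleanly, so treat the results as best-case):

```python
# Idealized optical shrink: die area scales with the square of the node ratio.
# Real-world shrinks rarely hit this ideal, so these are upper bounds.
def shrink_factor(old_nm: float, new_nm: float) -> float:
    """Approximate area ratio after a linear shrink from old_nm to new_nm."""
    return (new_nm / old_nm) ** 2

print(f"65nm -> 55nm: ~{shrink_factor(65, 55):.0%} of the original area")  # ~72%
print(f"65nm -> 45nm: ~{shrink_factor(65, 45):.0%} of the original area")  # ~48%
```

Even at the ideal ~72%, a GT200-class die would still be far larger than the G92 that went onto the 9800 GX2, which is the gist of the 45nm argument above.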
 
Doubtful - the company has coffers full of cash, while AMD is broke and has massive debt. It's still AMD on the ropes - Nvidia can suffer one or two poor release cycles.

No company can "afford" a poor release cycle.
 
Doubtful - the company has coffers full of cash, while AMD is broke and has massive debt. It's still AMD on the ropes - Nvidia can suffer one or two poor release cycles.

I'm not saying that nVidia would go broke, I'm just saying that reviews would be lukewarm at best if the GTX doesn't offer a significant performance gain over what we have now. Lukewarm reviews for your brand new, top-dollar card that doesn't bring much performance over your 18-month-old design would be very, very bad.
 
Well, we're talking about a dual-chip card vs. a single-chip card, so it would be no surprise if the 4870 X2 performs better than the GTX 280. Let's see what the GT200b has to offer then.
 
Well, we're talking about a dual-chip card vs. a single-chip card, so it would be no surprise if the 4870 X2 performs better than the GTX 280. Let's see what the GT200b has to offer then.

That's still quite a ways off, no? End-of-the-year timeframe? Nov/Dec :confused:

Don't have to... SLI will work just fine.

SLI is only for the select few brave enough to go with an nVidia chipset LOL :p:p:p That leaves a lot of people leaning towards CrossFire, since lots of Intel boards support it :D
 
They are talking about a September release for the GT200b. The rumour (actually these rumours have been around for a while) is that the 65nm version is just a safe bet in case they don't get 55nm working, and that 55nm has already taped out (i.e. working).
 
They are talking about a September release for the GT200b. The rumour (actually these rumours have been around for a while) is that the 65nm version is just a safe bet in case they don't get 55nm working, and that 55nm has already taped out (i.e. working).

Hmm, I say they will wait at least a month so all the early adopters' step-ups will expire :D
 
I don't think they'll wait for step-ups to expire :p. Or they just won't give GT200b chips that soon to the companies that have these programs :p

Word is that there will be only four brands selling GT200 anyway. So far the confirmed brands are Gigabyte and Gainward [which isn't sold in the States].
 
Anyone remember how the 2900 XT easily beat out the 8800 GTX in 3DMark? I lost faith in it as a benchmark after that...
 
Do you guys think that these cards will finally allow us to play Crysis on ultra-high details at 1080p? I have a feeling they won't, and that these cards will just be slightly faster than the 9800 GX2.

Hope I'm wrong, but who knows.
 
Kyle probably can't comment on this anyway with the release of these cards so soon, but back in January he posted this here

Since then, what's changed?

Nothing has changed. We just know that the HD 4870 X2 will be two HD 4870 chips on one card. So far, that only works through CrossFire, just like the HD 3870 X2. If it's something different, we'll have to wait and see.
 
Do you guys think that these cards will finally allow us to play Crysis on ultra-high details at 1080p? I have a feeling they won't, and that these cards will just be slightly faster than the 9800 GX2.

Hope I'm wrong, but who knows.

Specs and rumored performance numbers put it "at least" 50% faster than the 9800 GX2. This is a single GPU we're talking about here. And it will most likely be able to max Crysis @ 1600x1200 with at least 2xAA and smooth framerates.
 
As for the OP, we had already read these rumors. Synthetic benchmark tools are worthless when it comes to real-world gameplay numbers, which is what we'll actually use these cards for.
3DMark and Vantage scores are just for e-penis bragging rights.
 
"ATi HD4870x2.....Viagra for your e-penis. Consult your physician before using Radeon HD products as swelling may occur, if you experience a benchmark lasting more than 4 hours, seek immediate driver updates and hotfixes."
 