AMD ATI Radeon HD 4870 X2 Preview

In short...

PHENOMENAL news!

And a great "preview" as well.
Well done gents. *Well done*!!

=)
 
There is nothing wrong with the X2 costing more than $450, like the GTX 280. It's anywhere from 40% to 100% faster than a GTX 280, so even $550 is a good deal.
 
I sure as hell hope they don't take the same path with their product naming scheme that they did with the 8800 series and its die shrinks, just leaving it to the market to sort things out. If it's a different core, call it something else, for cryin' out loud. You've also got to think that AMD knew a die shrink of the GT200 from NVidia was on the way and probably has a card or two left to show beyond the 4870X2 if necessary. What will AMD's answer to the GT200b be?

I'm sure investors aren't liking the pricing game NVidia is playing ($11.41 on Google right now), but I sure am.

 
Goddammit, if they can do a die shrink on the RV770, WHY can't they do that for a Phenom?
 
It's funny how no one argues the fairness of GDDR5.
I want Nvidia to release the GTX280 with GDDR5 and a die shrink.
My bet is that it would "beat the pants" off a 4870x2.
While I have nothing against ATi, I find unbelievable the praise it is getting for the 4870x2 when the card is 2 GPUs with GDDR5 and, to boot, has 2x256-bit memory buses that aren't showing up as bottlenecks due to the memory speed.
Do you think a GTX280 with a native 512-bit bus and GDDR5 won't spank a CrossFireX setup? I sure as hell think it would come close; do the math and see.
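For what it's worth, the "do the math" above is straightforward: peak memory bandwidth is bus width (in bytes per transfer) times effective transfer rate. A quick sketch, using the reference memory clocks of the two cards; the 512-bit GDDR5 part at the end is the hypothetical card being argued about, not a real product:

```python
# Peak memory bandwidth = (bus width in bits / 8) bytes per transfer
# * effective transfer rate in MT/s, converted to GB/s.
def bandwidth_gb_s(bus_bits: int, effective_mt_s: int) -> float:
    return bus_bits / 8 * effective_mt_s / 1000

gtx280 = bandwidth_gb_s(512, 2214)  # GDDR3 @ ~1107 MHz, double data rate
hd4870 = bandwidth_gb_s(256, 3600)  # GDDR5 @ ~900 MHz, quad data rate
hypo   = bandwidth_gb_s(512, 3600)  # hypothetical 512-bit GDDR5 GTX280

print(f"GTX 280:        {gtx280:.1f} GB/s")  # ~141.7 GB/s
print(f"HD 4870:        {hd4870:.1f} GB/s")  # ~115.2 GB/s
print(f"512-bit GDDR5:  {hypo:.1f} GB/s")    # ~230.4 GB/s
```

Note the stock GTX280 already out-bandwidths a single 4870, which is relevant to the "would GDDR5 even help?" replies further down.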
 
A 512-bit bus and GDDR5 would be sweet from a technology and performance standpoint, but very expensive from a business standpoint; plus, it adds complexity and PCB cost.

And as we all know, memory bandwidth isn't everything; it's a balance of pixel fillrate, texture rate, and memory throughput. Case in point: the 2900 XT, with gobs of memory bandwidth but horrible real-world gaming performance.
 
No disagreement there, Brent, but even GDDR5 on the current PCB would freaking smoke the 4870.
I just get offended when people talk to me as if the 4870 is better than the 280; it's not. It may not be inferior by the ratio of the price difference, but it still is. People also forget the GTX280 is a SINGLE GPU CARD! It's not TWO, it's ONE GPU! It's still very powerful for ONE GPU. Yes, I understand that dollar/performance ratio is what people want, but we're talking a single GPU versus a multi-GPU card.

Nvidia must get its credit, since one GTX280 can scale much better in MANY games: games that may not work well with multi-GPU won't affect a 280. I just get frustrated when it takes ATI 2GB of GDDR5 on a crap architecture to beat Nvidia. I am a big Nvidia fan, and I admit they were wrong in doing the fixed pricing and I understand people's disdain for that; however, if the 280 were able to get GDDR5, we would see a different cat.
A GTX280 with GDDR5 should cost as much as a GTX280 did at launch. The only problem I have with Nvidia and the pricing is that they made a great card but expected to make too much profit; I bet they're still not losing money selling this card under 400 bucks, which should tell us how much a GDDR5 version would realistically cost them (about 550-650) and consumers (around 700).

I want a fair fight. GDDR5 on a 256-bit bus to achieve bandwidth closer to that of a 280, which uses GDDR3 and a 512-bit bus, isn't fair. To me it's like a Saleen S7 without the twin turbo (non-GDDR5 4870) against a naturally aspirated F430/F599 (512-bit-bus GTX280 with GDDR3); throw a turbo on the Saleen (GDDR5 on the 4870) and it becomes the "fastest" production car. What are we looking at here? The superior card, or a card that wasn't superior to begin with but used higher-speed memory to increase its performance? That's what bothers me. We can all be the best by being lazy, but innovation is what will get us somewhere and achieve much more, both in technology and in life...
 
No disagreement there, Brent, but even GDDR5 on the current PCB would freaking smoke the 4870.
I just get offended when people talk to me as if the 4870 is better than the 280; it's not. It may not be inferior by the ratio of the price difference, but it still is. People also forget the GTX280 is a SINGLE GPU CARD! It's not TWO, it's ONE GPU! It's still very powerful for ONE GPU. Yes, I understand that dollar/performance ratio is what people want, but we're talking a single GPU versus a multi-GPU card...

Who cares?
NVidia don't make a GDDR5 card or have a 200-series X2 card, so there's no point throwing a wobbly about which would be best.
Save it for when NVidia do compete on a similar basis.
There are more positive arguments for the 4870 and 4870x2.
I'm sure if you had either of those cards you would present a strong argument for them.

Be content with what you have; it's a great card.
You are making it look like you have sour grapes.
 
No disagreement there, Brent, but even GDDR5 on the current PCB would freaking smoke the 4870.
I just get offended when people talk to me as if the 4870 is better than the 280; it's not. It may not be inferior by the ratio of the price difference, but it still is. People also forget the GTX280 is a SINGLE GPU CARD! It's not TWO, it's ONE GPU! It's still very powerful for ONE GPU. Yes, I understand that dollar/performance ratio is what people want, but we're talking a single GPU versus a multi-GPU card.

Nvidia must get its credit, since one GTX280 can scale much better in MANY games: games that may not work well with multi-GPU won't affect a 280. I just get frustrated when it takes ATI 2GB of GDDR5 on a crap architecture to beat Nvidia. I am a big Nvidia fan, and I admit they were wrong in doing the fixed pricing and I understand people's disdain for that; however, if the 280 were able to get GDDR5, we would see a different cat.
A GTX280 with GDDR5 should cost as much as a GTX280 did at launch. The only problem I have with Nvidia and the pricing is that they made a great card but expected to make too much profit; I bet they're still not losing money selling this card under 400 bucks, which should tell us how much a GDDR5 version would realistically cost them (about 550-650) and consumers (around 700).

I want a fair fight. GDDR5 on a 256-bit bus to achieve bandwidth closer to that of a 280, which uses GDDR3 and a 512-bit bus, isn't fair. To me it's like a Saleen S7 without the twin turbo (non-GDDR5 4870) against a naturally aspirated F430/F599 (512-bit-bus GTX280 with GDDR3); throw a turbo on the Saleen (GDDR5 on the 4870) and it becomes the "fastest" production car. What are we looking at here? The superior card, or a card that wasn't superior to begin with but used higher-speed memory to increase its performance? That's what bothers me. We can all be the best by being lazy, but innovation is what will get us somewhere and achieve much more, both in technology and in life...
Hate to break this to you, but...

If the 512-bit bus doesn't work for the GTX280, what do you think GDDR5 is going to do? The only reason for Nvidia to go GDDR5 at this point would be to let them reduce their cost by going to a 256-bit bus, not a 512-bit GDDR5 bus. In fact, if I remember correctly, the GTX280 still has slightly higher memory bandwidth than the 4870. I am not sure why you're so upset with the comparison; they took different architecture routes, and AMD's paid off big time, Nvidia's not so much. Wait for the die shrink and see what happens.

BTW, GDDR5 may not be such a great thing anyway; go take a look at this thread: http://www.xtremesystems.org/forums/showthread.php?t=192690&page=4

Only about 8% faster at the same clocks; need more info though.
 
It's getting closer... Do you feel it? It's GETTING CLOSER!!!

Only so many days before full reviews come out...
 
Hate to break this to you, but...

If the 512-bit bus doesn't work for the GTX280, what do you think GDDR5 is going to do? The only reason for Nvidia to go GDDR5 at this point would be to let them reduce their cost by going to a 256-bit bus, not a 512-bit GDDR5 bus. In fact, if I remember correctly, the GTX280 still has slightly higher memory bandwidth than the 4870. I am not sure why you're so upset with the comparison; they took different architecture routes, and AMD's paid off big time, Nvidia's not so much. Wait for the die shrink and see what happens.

BTW, GDDR5 may not be such a great thing anyway; go take a look at this thread: http://www.xtremesystems.org/forums/showthread.php?t=192690&page=4

Only about 8% faster at the same clocks; need more info though.

I understand that. I just have a hard time seeing people unhappy with a $400 (or near that) GTX280 while comparing it to ATI's yet-to-be-released 4870x2 that's going to be $550.
I am just saying that if nVidia used GDDR5 on an already-high-memory-bandwidth card, they would actually have more of an advantage. As it is, I believe there are a few things that nVidia needs to iron out when it comes to this card. I'll be waiting for the next killer driver update :) I feel the GTX280 was rushed, and the drivers we have now aren't fully optimized; UDA is providing decent and feasible results, but not the best.

:)(
 
I understand that. I just have a hard time seeing people unhappy with a $400 (or near that) GTX280 while comparing it to ATI's yet-to-be-released 4870x2 that's going to be $550.
I am just saying that if nVidia used GDDR5 on an already-high-memory-bandwidth card, they would actually have more of an advantage. As it is, I believe there are a few things that nVidia needs to iron out when it comes to this card. I'll be waiting for the next killer driver update :) I feel the GTX280 was rushed, and the drivers we have now aren't fully optimized; UDA is providing decent and feasible results, but not the best.

:)(

Filed under: Nvidia fanboy. Homeboy, it is what it is: a better graphics card.
 
All right, I give up... no comment from Brent Justice regarding my last post...

I mean, the rest of us do not agree that there is a performance hit when enabling 8x or 16x CSAA compared to 4x AA. Well, I guess I should just let it go here... if the cat's got your tongue.

At least your new article comparing the CFAA modes got mentioned in the INQ, and it was great. I love your time graphs, as opposed to the first comment on that INQ article: http://www.theinquirer.net/gb/inquirer/news/2008/07/22/graphics-acronyms-becoming... that poster was an idiot no matter how fluent his vocabulary was (probably a drunk Ph.D. wannabe).
 
I understand that. I just have a hard time seeing people unhappy with a $400 (or near that) GTX280 while comparing it to ATI's yet-to-be-released 4870x2 that's going to be $550.
I am just saying that if nVidia used GDDR5 on an already-high-memory-bandwidth card, they would actually have more of an advantage. As it is, I believe there are a few things that nVidia needs to iron out when it comes to this card. I'll be waiting for the next killer driver update :) I feel the GTX280 was rushed, and the drivers we have now aren't fully optimized; UDA is providing decent and feasible results, but not the best.

:)(

No, they wouldn't. That's what you're not understanding.
 
Why are these NV dudes looking for excuses to justify their purchase of a GTX280? Can't you get it through your fanboy heads that NV lost this round?

These NV guys are trying so hard to FIND reasons to bash ATI lol. It's the whole 9700 VS GeforceFX thing again, I remember that era when all the GeforceFX fans ended up caving in to ATI because they simply could not justify their purchase. It's the same thing here, it started the EXACT same way I kid you not. Go find the archive forum posts on all the big tech sites and you'll know what I'm talking about.

I'll just sit back and enjoy the show with my popcorn.

Oh, and btw:
4870x2 owning GTX280 SLI in more than 50% of games
$500 VS $1000? No contest GGKTHXBAI NO RM.
 
Why are these NV dudes looking for excuses to justify their purchase of a GTX280? Can't you get it through your fanboy heads that NV lost this round?

These NV guys are trying so hard to FIND reasons to bash ATI lol. It's the whole 9700 VS GeforceFX thing again, I remember that era when all the GeforceFX fans ended up caving in to ATI because they simply could not justify their purchase. It's the same thing here, it started the EXACT same way I kid you not. Go find the archive forum posts on all the big tech sites and you'll know what I'm talking about.

I'll just sit back and enjoy the show with my popcorn.

Oh, and btw:
4870x2 owning GTX280 SLI in more than 50% of games
$500 VS $1000? No contest GGKTHXBAI NO RM.

I am an AMD fanboy, but you're sounding like one here. The current price of the 280 is still dropping, so it's not $1000. And at around $360 for the 280, the question is not at all clear. Nvidia is still in the game.
 
Why are these NV dudes looking for excuses to justify their purchase of a GTX280? Can't you get it through your fanboy heads that NV lost this round?

These NV guys are trying so hard to FIND reasons to bash ATI lol. It's the whole 9700 VS GeforceFX thing again, I remember that era when all the GeforceFX fans ended up caving in to ATI because they simply could not justify their purchase. It's the same thing here, it started the EXACT same way I kid you not. Go find the archive forum posts on all the big tech sites and you'll know what I'm talking about.

I'll just sit back and enjoy the show with my popcorn.

Oh, and btw:
4870x2 owning GTX280 SLI in more than 50% of games
$500 VS $1000? No contest GGKTHXBAI NO RM.

Well the 4870 X2 isn't quite here yet, but all the information out there certainly points to NVIDIA having lost this round.

With that said I'm hesitant to purchase ATI again because of certain issues I've had with their cards. Namely their inability to properly read the EDID information of some monitors. I've got one monitor that has never worked with an ATI card via DVI since the X1800XT days. Plus I've run into more than one or two game compatibility problems. These issues are generally with more obscure titles, but it still causes me to hesitate to spend my money on any ATI cards. The EDID issue seems to happen randomly and I'd freak out if I got a newer ATI card that didn't want to work with my 3007WFP. :eek:

I will probably buy a pair of 4870 X2's but I'll wait until they are available at MSRP or lower in retail form at a local store so that I can return them if they give me any issues.
 
It's funny how no one argues the fairness of GDDR5.
I want Nvidia to release the GTX280 with GDDR5 and a die shrink.
My bet is that it would "beat the pants" off a 4870x2.
While I have nothing against ATi, I find unbelievable the praise it is getting for the 4870x2 when the card is 2 GPUs with GDDR5 and, to boot, has 2x256-bit memory buses that aren't showing up as bottlenecks due to the memory speed.
Do you think a GTX280 with a native 512-bit bus and GDDR5 won't spank a CrossFireX setup? I sure as hell think it would come close; do the math and see.

I am trying to stay fairly neutral in this, but I kinda do see the point. Sometimes it seems to me like nVidia's architecture is a bit more efficient. I say this because it took AMD/ATi higher clock speeds, much faster RAM, and two GPUs to beat the 280. Yes, I know the 280 runs hot, but the 4870 isn't without heat issues either. I am still waiting to see how ATi is doing with drivers before I pull the trigger on two 4870's. If they don't make the improvements I am hoping for, then I am gonna get a single 280. I don't care if it's nVidia or ATi; I don't like X2 or GX2 cards, they always seem short-lived.
 
I am an AMD fanboy, but you're sounding like one here. The current price of the 280 is still dropping, so it's not $1000. And at around $360 for the 280, the question is not at all clear. Nvidia is still in the game.
Just curious as to where you saw one for that price. I was looking on Newegg and the cheapest one I saw was the PNY for $415 after MIR.


I am trying to stay fairly neutral in this, but I kinda do see the point. Sometimes it seems to me like nVidia's architecture is a bit more efficient. I say this because it took AMD/ATi higher clock speeds, much faster RAM, and two GPUs to beat the 280. Yes, I know the 280 runs hot, but the 4870 isn't without heat issues either. I am still waiting to see how ATi is doing with drivers before I pull the trigger on two 4870's. If they don't make the improvements I am hoping for, then I am gonna get a single 280. I don't care if it's nVidia or ATi; I don't like X2 or GX2 cards, they always seem short-lived.
It depends on your definition of 'efficient' :) One common metric of efficiency is performance per mm^2. In that case, the GTX280's die is more than twice as large as a single RV770 die (even TWO RV770 dies combined are smaller than a SINGLE GTX280 die). Yet another common metric is performance per watt. The GTX280 has a 57% higher max power draw vs the RV770, and it certainly is not 57% faster.
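The two metrics above can be sketched numerically. The die sizes are the widely reported figures (GT200 roughly 576 mm^2, RV770 roughly 256 mm^2); the power numbers just mirror the "57% higher" claim in the post, and the relative performance of 0.9 is a placeholder assumption, not a benchmark result:

```python
# Efficiency as performance per die area and performance per watt.
# rel_perf is a normalized performance score (GTX 280 = 1.0 here).
def efficiency(rel_perf: float, die_mm2: float, power_w: float) -> dict:
    return {
        "perf_per_mm2": rel_perf / die_mm2,
        "perf_per_watt": rel_perf / power_w,
    }

gtx280 = efficiency(1.0, 576, 236)  # GT200 die, power per the 57% claim
hd4870 = efficiency(0.9, 256, 150)  # assume a 4870 at ~90% of a GTX 280
```

Even granting the GTX 280 a 10% performance lead, the smaller die comes out well ahead on both metrics (about 0.0035 vs 0.0017 perf/mm^2, and 0.0060 vs 0.0042 perf/W), which is the poster's point.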
 
It depends on your definition of 'efficient' :) One common metric of efficiency is performance per mm^2. In that case, the GTX280's die is more than twice as large as a single RV770 die (even TWO RV770 dies combined are smaller than a SINGLE GTX280 die). Yet another common metric is performance per watt. The GTX280 has a 57% higher max power draw vs the RV770, and it certainly is not 57% faster.

I agree; I was just looking at it from the standpoint that ATi had to almost double everything except the bus to keep up with / knock out the 280. Don't get me wrong, I think the 4870 and 4870x2 are awesome cards.
 
Just curious as to where you saw one for that price. I was looking on Newegg and the cheapest one I saw was the PNY for $415 after MIR.



It depends on your definition of 'efficient' :) One common metric of efficiency is performance per mm^2. In that case, the GTX280's die is more than twice as large as a single RV770 die (even TWO RV770 dies combined are smaller than a SINGLE GTX280 die). Yet another common metric is performance per watt. The GTX280 has a 57% higher max power draw vs the RV770, and it certainly is not 57% faster.

buy.com had a special running for $390 AR; someone else posted this (I had the price wrong):
silverphoenix wrote:
It's offering me another $30 off if you sign up with the free Buy.com Visa card

419.99 - 5% = 398.99

398.99 - $30 Visa sign up

368.99 - $30 MIR = 338.99. Wow, very tempting
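The discount stacking in that quote checks out, for what it's worth:

```python
# Stack the discounts: 5% coupon, then the $30 Visa sign-up credit,
# then the $30 mail-in rebate.
price = 419.99
after_coupon = round(price * 0.95, 2)       # 5% off -> 398.99
after_visa   = round(after_coupon - 30, 2)  # Visa credit -> 368.99
final        = round(after_visa - 30, 2)    # MIR -> 338.99
print(after_coupon, after_visa, final)      # 398.99 368.99 338.99
```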
 
I agree; I was just looking at it from the standpoint that ATi had to almost double everything except the bus to keep up with / knock out the 280. Don't get me wrong, I think the 4870 and 4870x2 are awesome cards.

Now let me turn that around by saying maybe ATI cut an original HUGE chip in half making 2 smaller chips. That means the smaller chip is still nearing a GTX280 and the "huge" chip is beating it.

It's not doubling, it's halving because their architecture is just that awesome.

Badabing badaboom.
 
No matter the price of the 260/280, NVIDIA is still in the game, and we all want NVIDIA in the game: prices must go down.

I will say it again: there are very few market niches in which ATI doesn't have a good product: best IGP (and there will be a monster IGP launching next week: beware!); at sub-$100/16x10 gaming the 8800GT and the 3850 trade blows; at sub-$200/19x12 gaming the 4850 has ZERO competition; at 25x16 gaming the CF 4850 has ZERO competition, and the 4870X2 will simply consolidate the market dominance.

Anyone can say and think otherwise, but while NVIDIA got rich and lazy with the 8800GTX/8800GTS 512MB/9800GTX, ATI created the best IGP (and the next IGP product will just make things even better), put a firm hold on the mid-range market, and is now moving full speed to dominate the high end.


I just pray that NVIDIA can give us a good product by the time Nehalem reaches the shelves (it was November, now it's October, maybe September?), otherwise ATI may just refuse to lower prices...
 
Now let me turn that around by saying maybe ATI cut an original HUGE chip in half making 2 smaller chips. That means the smaller chip is still nearing a GTX280 and the "huge" chip is beating it.

It's not doubling, it's halving because their architecture is just that awesome.

Badabing badaboom.


So, using that as an example: if nVidia cut their chip in half, made it the same size as the 4870's, and then added GDDR5 to their 512-bit bus, they should be able to piss on the 4870x2, assuming they released a GX2 version of said card; then you would have two 512-bit buses and a gig of GDDR5. Hey, if we are lucky and such a thing ever came out, it might cost the same as a Quadro. ;)
 
No disagreement there, Brent, but even GDDR5 on the current PCB would freaking smoke the 4870.
I just get offended when people talk to me as if the 4870 is better than the 280; it's not. It may not be inferior by the ratio of the price difference, but it still is. People also forget the GTX280 is a SINGLE GPU CARD! It's not TWO, it's ONE GPU! It's still very powerful for ONE GPU. Yes, I understand that dollar/performance ratio is what people want, but we're talking a single GPU versus a multi-GPU card.

Nvidia must get its credit, since one GTX280 can scale much better in MANY games: games that may not work well with multi-GPU won't affect a 280. I just get frustrated when it takes ATI 2GB of GDDR5 on a crap architecture to beat Nvidia. I am a big Nvidia fan, and I admit they were wrong in doing the fixed pricing and I understand people's disdain for that; however, if the 280 were able to get GDDR5, we would see a different cat.
A GTX280 with GDDR5 should cost as much as a GTX280 did at launch. The only problem I have with Nvidia and the pricing is that they made a great card but expected to make too much profit; I bet they're still not losing money selling this card under 400 bucks, which should tell us how much a GDDR5 version would realistically cost them (about 550-650) and consumers (around 700).

I want a fair fight. GDDR5 on a 256-bit bus to achieve bandwidth closer to that of a 280, which uses GDDR3 and a 512-bit bus, isn't fair. To me it's like a Saleen S7 without the twin turbo (non-GDDR5 4870) against a naturally aspirated F430/F599 (512-bit-bus GTX280 with GDDR3); throw a turbo on the Saleen (GDDR5 on the 4870) and it becomes the "fastest" production car. What are we looking at here? The superior card, or a card that wasn't superior to begin with but used higher-speed memory to increase its performance? That's what bothers me. We can all be the best by being lazy, but innovation is what will get us somewhere and achieve much more, both in technology and in life...
There are professionals out there who can help you with your problems. You are probably one of the saddest fanboys I have ever seen in all my years here and on any other forum. You, sir, need a break from computers.

Edit: I feel bad saying this to anyone, but the honest truth is that we could all use a break from computers...
 
Why is everyone so caught up with GDDR5? It would make little or no difference on a 512-bit bus; hell, it only makes a small difference on a 256-bit bus. This is kind of like people wanting to put 1GB on an 8600GT.
 
Why is everyone so caught up with GDDR5? It would make little or no difference on a 512-bit bus; hell, it only makes a small difference on a 256-bit bus. This is kind of like people wanting to put 1GB on an 8600GT.

I'm definitely not a fanboy or anything, but it's a little different. GDDR5 would theoretically give a slight improvement in all games just by being superior, faster memory, whereas the 1GB would only be utilized by a select few texture-intensive games, such as Supreme Commander, if I'm not mistaken. Most games would never see a boost going from 512MB to 1GB, but all games should see at least a slight improvement going from GDDR3 to GDDR5. The ATI card excels for its fantastic price, probably in part thanks to the GDDR5.

The 4870 is very similar to the 4850, but the GDDR5, along with a couple of other differences, allows the 4870 to blow away the 4850. The argument that the nVidia card would do so much better is kinda invalid at this point because, simply, they didn't. If GDDR5 is so much more expensive, and ATI still manages to pull off a great contender against the GTX260 for $300, much cheaper than the 260 (although that's now dropping), ATI still wins the bang/buck war. That's all that really matters, except for those who disregard price and simply want the fastest, no matter the cost.
 
I'm definitely not a fanboy or anything, but it's a little different. GDDR5 would theoretically give a slight improvement in all games just by being superior, faster memory, whereas the 1GB would only be utilized by a select few texture-intensive games, such as Supreme Commander, if I'm not mistaken. Most games would never see a boost going from 512MB to 1GB, but all games should see at least a slight improvement going from GDDR3 to GDDR5. The ATI card excels for its fantastic price, probably in part thanks to the GDDR5.

The 4870 is very similar to the 4850, but the GDDR5, along with a couple of other differences, allows the 4870 to blow away the 4850. The argument that the nVidia card would do so much better is kinda invalid at this point because, simply, they didn't. If GDDR5 is so much more expensive, and ATI still manages to pull off a great contender against the GTX260 for $300, much cheaper than the 260 (although that's now dropping), ATI still wins the bang/buck war. That's all that really matters, except for those who disregard price and simply want the fastest, no matter the cost.
Not really true at all. A 512-bit bus with GDDR5 would be a total waste, just like the 512-bit bus on the 2900 was. The GTX280 is not really bandwidth-limited, so having much faster memory would do basically nothing except cost more money. The GTX280 would have to be a much stronger card to get to the point where memory bandwidth would even be the limiting factor.
 
I'm definitely not a fanboy or anything, but it's a little different. GDDR5 would theoretically give a slight improvement in all games just by being superior, faster memory, whereas the 1GB would only be utilized by a select few texture-intensive games, such as Supreme Commander, if I'm not mistaken. Most games would never see a boost going from 512MB to 1GB, but all games should see at least a slight improvement going from GDDR3 to GDDR5. The ATI card excels for its fantastic price, probably in part thanks to the GDDR5.

The 4870 is very similar to the 4850, but the GDDR5, along with a couple of other differences, allows the 4870 to blow away the 4850. The argument that the nVidia card would do so much better is kinda invalid at this point because, simply, they didn't. If GDDR5 is so much more expensive, and ATI still manages to pull off a great contender against the GTX260 for $300, much cheaper than the 260 (although that's now dropping), ATI still wins the bang/buck war. That's all that really matters, except for those who disregard price and simply want the fastest, no matter the cost.

Not to mention that it doesn't do that much for the 4870: several people have downclocked the 4870 to 4850 levels and found only a 7 to 10% difference in performance. I am sure the importance of the GDDR5 increases as the core speed does, but even if you scale it up, it would not amount to the difference of a 512-bit vs a 256-bit bus; in other words, the added bandwidth would just be wasted using GDDR5 on a 280.
 
Now let me turn that around by saying maybe ATI cut an original HUGE chip in half making 2 smaller chips. That means the smaller chip is still nearing a GTX280 and the "huge" chip is beating it.

It's not doubling, it's halving because their architecture is just that awesome.

Badabing badaboom.

Don't question Nyte; he's been pro-gaming since we were all in diapers.
 
Wow! It eats the GTX280 for breakfast!

What's absolutely frightening is that you can max the AA in AoC with no real framerate penalty, pretty much regardless of resolution. (Isn't AoC a TWIMTBP title?)

Why it's frightening: the CF/X2 drivers aren't really dialed-in yet.
 
Why are these NV dudes looking for excuses to justify their purchase of a GTX280? Can't you get it through your fanboy heads that NV lost this round?

These NV guys are trying so hard to FIND reasons to bash ATI lol. It's the whole 9700 VS GeforceFX thing again, I remember that era when all the GeforceFX fans ended up caving in to ATI because they simply could not justify their purchase. It's the same thing here, it started the EXACT same way I kid you not. Go find the archive forum posts on all the big tech sites and you'll know what I'm talking about.

I'll just sit back and enjoy the show with my popcorn.

Oh, and btw:
4870x2 owning GTX280 SLI in more than 50% of games
$500 VS $1000? No contest GGKTHXBAI NO RM.

And it only got *worse*.

I was running an AIW 8500DV (and about to purchase my last AIW, the AIW 9700 Pro) during that period.

The original 9700 Pro (and the followup 9800 Pro) were themselves followed by their AIW team-mates: AIW 9700 Pro and AIW 9800 Pro. For once, they weren't underclocked (in either core or memory) compared to their non-AIW counterparts (the AIW 9700 Pro would actually wax the FX5200 in MaximumPC that year, while the AIW 9800 Pro would hammer the later FX5700 the following year, also in MaximumPC; it seemed that nV couldn't buy a break).

"It's deja vu all over again." - Yogi Berra
 
Not really true at all. A 512-bit bus with GDDR5 would be a total waste, just like the 512-bit bus on the 2900 was. The GTX280 is not really bandwidth-limited, so having much faster memory would do basically nothing except cost more money. The GTX280 would have to be a much stronger card to get to the point where memory bandwidth would even be the limiting factor.

So GDDR5 doesn't help the 4870 at all... then why did they use it? It has to be at least part of the reason the 4870 is faster than the 4850.
 