I sure as hell hope they don't take the same path with their product naming scheme that they did with the 8800-series die shrinks and just leave it to the market to sort things out. If it's a different core, call it something else, for crying out loud. You've also got to think that AMD knew a die shrink from Nvidia was on the way for the GT200 and probably has a card or two left to show beyond the 4870 X2 if necessary. What will AMD's answer to the GT200b be?
I'm sure investors aren't liking the pricing game Nvidia is playing ($11.41 on Google right now), but I sure do.
Goddammit, if they can go 40nm on the RV770, WHY can't they do that for a Phenom?
No disagreement there, Brent, but even GDDR5 on the current PCB would freaking smoke the 4870.
I just get offended when people talk to me as if the 4870 is better than the 280; it's not. It may not be inferior by the ratio of the price difference, but it's still inferior. People also forget the GTX 280 is a SINGLE-GPU card! It's not TWO, it's ONE GPU! It's still very powerful for ONE GPU. Yes, I understand that the dollar/performance ratio is what people want, but we're talking a single GPU versus multi-GPU...
Nvidia must get its credit, since one GTX 280 can scale much better in MANY games: games that may not work well with multi-GPU won't affect a 280. I just get frustrated when it takes ATI 2GB of GDDR5 on a crap architecture to beat Nvidia. I am a big Nvidia fan, and I admit they were wrong in doing the fixed pricing and I understand people's disdain for that; however, once the 280 is able to get GDDR5, we will see a different cat.
A GTX 280 with GDDR5 should cost as much as a GTX 280 did at launch. The only problem I have with Nvidia's pricing is that they made a great card but expected to make too much profit; I bet they're still not losing money selling this card under 400 bucks, which should tell us a GDDR5 version would realistically cost them about 550-650 and consumers around 700.
I want a fair fight; using GDDR5 on a 256-bit bus to achieve memory bandwidth close to that of a 280, which uses GDDR3 on a 512-bit bus, isn't fair. I equate it to a non-twin-turbo Saleen S7 (the non-GDDR5 4870) against a naturally aspirated F430/599 (the 512-bit-bus GTX 280 with GDDR3); you throw a turbo in the Saleen (GDDR5 thrown onto the 4870) and it becomes the "fastest" production car. What are we looking at here? The superior card, or the card that wasn't superior to begin with but used higher-speed memory to increase its performance? That's what bothers me. We can all be the best by being lazy, but innovation is what will get us somewhere and achieve much more, both in technology and in life...
the 4870 X2 is BETTER. Both on performance and on price.
Hate to break this to you, but...
If the 512-bit bus doesn't work for the GTX 280, what do you think GDDR5 is going to do? The only reason for Nvidia to go GDDR5 at this point is to reduce cost by moving to a 256-bit bus, not a 512-bit GDDR5 bus. In fact, if I remember correctly, the GTX 280 still has slightly higher memory bandwidth than the 4870. I'm not sure why you're so upset with the comparison: they took different architectural routes, and AMD's paid off big time, Nvidia's not so much. Wait for the die shrink and see what happens.
BTW, GDDR5 may not be such a great thing anyway; go take a look at this thread: http://www.xtremesystems.org/forums/showthread.php?t=192690&page=4
Only about 8% faster at the same clocks, though we need more info.
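On the bandwidth point: peak memory bandwidth is just bus width times effective data rate, so the comparison is easy to check. A minimal sketch (the clock figures are approximate launch specs, so treat them as assumptions):

```python
# Peak memory bandwidth = (bus width in bits / 8 bytes) * effective data rate.
# Clock figures below are approximate launch specs, not official numbers.

def bandwidth_gbps(bus_bits, effective_mhz):
    """Return peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# GTX 280: 512-bit bus, GDDR3 at ~1107 MHz (~2214 MHz effective)
gtx280 = bandwidth_gbps(512, 2214)

# HD 4870: 256-bit bus, GDDR5 at ~900 MHz (~3600 MHz effective)
hd4870 = bandwidth_gbps(256, 3600)

print(f"GTX 280: {gtx280:.1f} GB/s")  # ~141.7 GB/s
print(f"HD 4870: {hd4870:.1f} GB/s")  # ~115.2 GB/s
```

Which is why, even with GDDR5's higher data rate, the 280's wide GDDR3 bus still comes out slightly ahead on raw bandwidth.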
So you're saying if they had designed the card better, they'd have a greater advantage?
I understand that. I just have a hard time seeing people unhappy with a 400-dollar GTX 280 (or near that) while comparing it to ATI's soon-to-be-released 4870 X2 that's going to be 550.
I am just saying that if Nvidia used GDDR5 on an already-high-bandwidth card, they would actually have more of an advantage. As it is, I believe there are a few things Nvidia needs to iron out with this card. I'll be waiting for the next killer driver update; I feel the GTX 280 was rushed, and the drivers we have now aren't fully optimized, in the sense that UDA provides decent, feasible results, but not the best.
Why are these NV dudes looking for excuses to justify their purchase of the GTX 280? Can't you get it through your fanboy heads that NV lost this round?
These NV guys are trying so hard to FIND reasons to bash ATI, lol. It's the whole 9700 vs GeForce FX thing again; I remember that era, when all the GeForce FX fans ended up caving in to ATI because they simply could not justify their purchase. It's the same thing here, and it started the EXACT same way, I kid you not. Go find the archived forum posts on all the big tech sites and you'll know what I'm talking about.
I'll just sit back and enjoy the show with my popcorn.
Oh, and btw:
4870x2 owning GTX280 SLI in more than 50% of games
$500 VS $1000? No contest GGKTHXBAI NO RM.
It's funny how no one argues the fairness of GDDR5.
I want nvidia to release the gtx280 with gddr5 and a die-shrink.
My bet is that it will "beat the pants" off a 4870x2.
While I have nothing against ATi, I find the praise it's getting for the 4870 X2 unbelievable when the card is two GPUs with GDDR5, and to boot it has 2x 256-bit memory buses that aren't showing up as bottlenecks thanks to the memory speed.
Do you think a GTX 280 with a native 512-bit bus and GDDR5 wouldn't spank a CrossFireX setup? I sure as hell think it would come close; do the math and see.
I am an AMD fanboy, but you're sounding like one here. The current price of the 280 is still dropping, so it's not a thousand dollars, and at around $360 for a 280 the question is not at all clear. Nvidia is still in the game.
I am trying to stay fairly neutral in this, but I kind of do see the point. Sometimes it seems to me like Nvidia's architecture is a bit more efficient. I say this because it took AMD/ATi higher clock speeds, much faster RAM, and two GPUs to beat the 280. Yes, I know the 280 runs hot, but the 4870 isn't without heat issues either. I am still waiting to see how ATi is doing with drivers before I pull the trigger on two 4870s. If they don't make the improvements I'm hoping for, then I'm going to get a single 280. I don't care if it's Nvidia or ATi; I don't like X2 or GX2 cards, they always seem short-lived.
It depends on your definition of 'efficient'. One common metric of efficiency is performance per mm². By that measure, the GTX 280's die is more than twice as large as a single RV770 die (so even TWO RV770 dies combined are smaller than a SINGLE GTX 280 die). Another common metric is performance per watt. The GTX 280 has a 57% higher max power draw than the RV770, and it certainly is not 57% faster.
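Those two efficiency metrics can be sketched with rough numbers; the die sizes and board power below are approximate public figures, assumed here purely for illustration:

```python
# Rough efficiency comparison. Die sizes (mm^2) and max board power (W)
# are approximate public figures, treated as assumptions.

gtx280 = {"die_mm2": 576, "power_w": 236}
rv770  = {"die_mm2": 256, "power_w": 150}

# Performance per mm^2: two RV770 dies are still smaller than one GTX 280 die.
print(2 * rv770["die_mm2"] < gtx280["die_mm2"])  # True (512 < 576)

# Performance per watt: the GTX 280 draws roughly 57% more power.
gap = gtx280["power_w"] / rv770["power_w"] - 1
print(f"{gap:.0%}")  # ~57%
```

So the post's argument is that unless the GTX 280 is also ~57% faster, it loses on both metrics.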
Just curious as to where you saw one for that price. I was looking on Newegg and the cheapest one I saw was the PNY for $415 after MIR.
I agree; I was just looking at it from the point that ATi had to almost double everything except the bus to keep up with or knock out the 280. Don't get me wrong, I think the 4870 and 4870 X2 are awesome cards.
Now let me turn that around by saying maybe ATI cut an original HUGE chip in half making 2 smaller chips. That means the smaller chip is still nearing a GTX280 and the "huge" chip is beating it.
It's not doubling, it's halving because their architecture is just that awesome.
Badabing badaboom.
They have professionals out there who can help you with your problems. You are probably one of the saddest fanboys I have ever seen in all my years here and on any other forum. You, sir, need a break from computers.
Why is everyone so caught up with GDDR5? It would make little or no difference on a 512-bit bus; hell, it only makes a small difference on a 256-bit bus. This is kind of like people wanting to put 1GB on an 8600GT.
I'm definitely not a fanboy or anything, but it's a little different. GDDR5 would theoretically give a slight improvement in all games just by being superior, faster memory, whereas the 1GB would only be utilized by a select few texture-intensive games, such as Supreme Commander, if I'm not mistaken. Most games would never see a boost going from 512MB to 1GB, but all games should see at least a slight improvement going from GDDR3 to GDDR5. The ATI card excels for its fantastic price, probably in part thanks to the GDDR5.
The 4870 is very similar to the 4850, but the GDDR5, along with a couple of other differences, allows the 4870 to blow away the 4850. The argument that the Nvidia card would do so much better is kind of invalid at this point because, simply, they didn't do it. If GDDR5 is so much more expensive and ATI still manages to pull off a great contender against the GTX 260 for $300, much cheaper than the 260 (although that's now dropping), ATI still wins the bang/buck war. That's all that really matters, except for those who disregard price and simply want the fastest, no matter the cost.
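The bang/buck comparison is easy to make concrete: divide average frame rate by price. A minimal sketch, with all frame rates and prices purely hypothetical:

```python
# "Bang per buck" as frames per second per dollar.
# The fps and price figures are hypothetical, purely for illustration.

def fps_per_dollar(avg_fps, price_usd):
    """Return a simple price/performance score."""
    return avg_fps / price_usd

hd4870 = fps_per_dollar(55, 300)  # hypothetical: 55 fps average at $300
gtx260 = fps_per_dollar(60, 400)  # hypothetical: 60 fps average at $400

# The slightly slower but much cheaper card wins on this metric.
print(hd4870 > gtx260)  # True
```

Of course, the metric deliberately ignores buyers who want the absolute fastest card regardless of price, which is the caveat the post ends on.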
Wow! It eats the GTX280 for breakfast!
Maybe some of the members.
Don't question Nyte; he's been pro-gaming since we were all in diapers.
Not really true at all. A 512-bit bus with GDDR5 would be a total waste, just like the 512-bit bus on the 2900 was. The GTX 280 is not really bandwidth-limited, so having much faster memory would do basically nothing except cost more money. The GTX 280 would have to be a much stronger card before memory bandwidth even became the limiting factor.