What's the hype?

ati-eyefinity-02.jpg

Crap, there goes my lower-electric-bill argument.
 

No way in hell I could play any game like this. The borders of the monitors that get in the way would drive me crazy in 10 minutes.

Absolutely NO.

I understand that getting multiple monitors and putting them together like this may be cheaper than getting a really large one, but...

Seriously... am I the only one who just can't play like that?
 
Give it some time; thin-bezel screens will be out soon. Even though there will still be seams between the screens, once you focus on the game itself they become much less intrusive.
 

Ya, that doesn't look fun at all. Not once did the thought of a multi-monitor setup in games appeal to me. I can see how some people would dig it though.

IMO, thin bezels or not, I don't think I'll ever be sold on the idea of eyefinity in games. We'll see...

That said, my 5870's on the way:)
 
^^ uhh, are you actually the Tyler Durden from ZeroHedge? ^^
 
Well, I turn off all my computers before I go to bed. I really think this thread is pointless; just be happy with what you have and let others be. This is one reason I buy a graphics card once every generation and don't go for dual-chip cards: there will always be a single-chip next-gen card that beats it. I'm happy upgrading every year and a half or whatever it is and living with it.

Anyway, this thread is going nowhere. I'd rather have it closed because the topic is pointless. Just let it be; whoever wants to buy it can, and whoever doesn't, doesn't. Monday is the day I overclock the hell out of my HD 5870 and let the ASUS Voltage Tweak be the best solution for my overclock.

Personally, I think people are ignoring the bigger advantage of it using less power. The less power it needs, the less heat it produces, meaning less stress on the card's cooling fan. I'm looking forward to the 5850 because of the lower cost, and because the 5870 most likely won't fit in my case...
 
Well, since I'm building a new rig, I'm just going to buy a 4870 or 4890 off this forum soon, wait until Nvidia releases the GT300, and compare it to the 5800s. By January the 5870 should be around $300, and even cheaper in a combo at Newegg. The GTX 300s will be $400 and up, so the comparison will be price vs. performance.
 
Nvidia and decent launch prices don't go well together.

But if ATI is keeping the 5890 on hold to combat Nvidia's release in February, you should have even better options.

I'm failing to see what Nvidia will be offering that would be as good as what ATI has done. Nvidia's comments on DX11 make me wonder whether I should ever support them again.
 
Well, since I'm building a new rig, I'm just going to buy a 4870 or 4890 off this forum soon, wait until Nvidia releases the GT300, and compare it to the 5800s. By January the 5870 should be around $300, and even cheaper in a combo at Newegg. The GTX 300s will be $400 and up, so the comparison will be price vs. performance.
Are 5870 prices expected to drop so dramatically?
 

I doubt it. I think it's wishful thinking that the price will drop by 21% in less than 3 months. Why would it? It's already selling for less than what it could go for, and in 3 months it will still be the fastest single-GPU card out. There's really no reason to think it's going to drop unless you're one of those overly optimistic people who thought the card would debut at $299. Expect to pay $379 in January, $349 if you're lucky.
 
Well, for people like me, 2 of these in a CrossFire setup is a mighty tempting deal; it's a simpler solution than Quadfire 4870s and will have better frame rates and fewer problems, scaling over 2 cards and not 4.

You're right that DX11 probably won't be used much for at least a year or so, but if you're not upgrading for another 2-3 years then it makes sense to get a DX11 card now.

What wouldn't make sense is AMD investing in this technology and then not releasing the 5870 as a very reasonably priced high-end card; simply making a 5870 X2 isn't going to earn them back all that R&D money :)
 
I understand the hesitancy, but I actually think DX11 will likely be adopted almost overnight. There seems to be much greater optimism from developers, going on what I've read.

DX11 = Windows 7
DX10 = Windows Vista
 
To show everyone my point of view on why the power savings on this card are meaningless, I've made this table. I didn't need it to understand the point myself, but maybe it's a new perspective for people who believe this card is really making you energy efficient. This is probably the first time I've ever seen a card discussed so much for its "energy savings"; it seems to be everyone's opinion that that's its strongest feature. DX11 is still a case of counting your chickens before they hatch. My opinion has been that if you've already spent $300-400 on graphics in the past year, getting this card before 2010 would be pointless.

othesavings.png


Roughly every 100 hours you save $1.

After one year you likely won't be using the card anymore. If you do, you probably haven't even put 1,500 hours into it yet anyway. If you think you use it more than 4 hours a day on average, $0.1296 and $0.09024 are the costs per hour. If you want to save money, hang your clothes on a line instead of using the dryer. Install solar panels. Don't use A/C. You'd save thousands per year.
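If anyone wants to sanity-check the "$1 per 100 hours" figure, it falls out of a simple watts-times-rate calculation. A minimal sketch, using an assumed ~90 W idle-power difference and an assumed $0.11/kWh electricity rate (illustrative numbers, not the ones from my table):

```python
def savings_per_hour(watts_saved, dollars_per_kwh):
    # Convert the wattage difference to kilowatts, then multiply by the rate.
    return watts_saved / 1000 * dollars_per_kwh

hourly = savings_per_hour(90, 0.11)   # ~$0.0099 per hour
per_100_hours = hourly * 100          # ~$0.99, i.e. roughly $1 per 100 hours
print(round(per_100_hours, 2))
```

Plug in your own wattage delta and local rate; the "roughly $1 per 100 hours" conclusion only holds for differences on this order.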
 
It isn't meaningless; it isn't all about the money, and it also isn't just about load consumption. If someone is trying to reduce their bill or their carbon footprint, obviously they'll have to make more than one single change. Just like someone trying to lose weight would likely have to do more than switch their soda to diet.

If you don't want to adopt newer, better technology, then don't. You may not need a table to understand power consumption, but you need something to help you understand that forcing your opinion on everyone else isn't welcome.
 
For clarification purposes, it's the fastest single-GPU card. The 4870 X2 and 5870 are both SINGLE cards. There is a difference, but I think that difference gets too much attention.
 
To show everyone my point of view on why the power savings on this card are meaningless, I've made this table. I didn't need it to understand the point myself, but maybe it's a new perspective for people who believe this card is really making you energy efficient. This is probably the first time I've ever seen a card discussed so much for its "energy savings"; it seems to be everyone's opinion that that's its strongest feature. DX11 is still a case of counting your chickens before they hatch. My opinion has been that if you've already spent $300-400 on graphics in the past year, getting this card before 2010 would be pointless.

othesavings.png


Roughly every 100 hours you save $1.

After one year you likely won't be using the card anymore. If you do, you probably haven't even put 1,500 hours into it yet anyway. If you think you use it more than 4 hours a day on average, $0.1296 and $0.09024 are the costs per hour. If you want to save money, hang your clothes on a line instead of using the dryer. Install solar panels. Don't use A/C. You'd save thousands per year.

You're missing the point of the power savings feature. It's all about heat. Less idle wattage translates into much less heat inside your case which then translates into a quieter system.

It also translates into lower ambient temperatures. I generally leave my computers on 24/7 (rebooting once every few days at most). All that uptime means my computer room runs around 10 degrees warmer than the rest of the house on average, which means the central AC has to run that much harder if I want to maintain a stable room temperature.

The 9800GX2 my gaming computer has in it now runs anywhere from 60C to 110C depending on what it's doing. It's like sitting beside a freaking oven sometimes, so I gladly welcome any power-saving feature on new video cards.
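For anyone who doubts the room-heating effect: essentially every watt a PC draws ends up as heat in the room, and watts convert directly into the BTU/h that AC capacity is rated in. A rough sketch using the 980 W full-load figure above:

```python
# 1 watt of continuous draw is roughly 1 watt of heat output; 1 W = 3.412 BTU/h.
WATTS_TO_BTU_PER_HOUR = 3.412

pc_draw_watts = 980                                        # full-load draw quoted above
heat_btu_per_hour = pc_draw_watts * WATTS_TO_BTU_PER_HOUR  # ~3,344 BTU/h
print(round(heat_btu_per_hour))
```

For comparison, a small window AC unit is typically rated around 5,000 BTU/h, so a loaded gaming rig really can fight the AC for control of a room.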
 
Ya, that doesn't look fun at all. Not once did the thought of a multi-monitor setup in games appeal to me. I can see how some people would dig it though.

IMO, thin bezels or not, I don't think I'll ever be sold on the idea of eyefinity in games. We'll see...

That said, my 5870's on the way:)

Actually, the tech behind EyeFinity is old (consider that Burnout PC supported it natively, as did all the Crysis titles); all EyeFinity does is make it something you don't have to write to; instead, it's all done in hardware. EyeFinity isn't even anywhere on my radar (I only have the one 23" H233H, and no room for even one more, let alone five). While the technology is good for a poor-man's Desktop/Video/Gaming Wall, there is still the issue of input lag (does EyeFinity compensate for that, and by how much?).

CF and SLI are each great technology feats; however, the spottiness of game support and the finickiness of compatibility (not to mention the extra fiddling required, in both cases, compared to a single GPU) indeed makes both a non-starter for me personally.

I'm not looking at HD5870 because I don't have enough *display* to take proper advantage of it (1920x1080, even non-interlaced, is simply too low a resolution to take advantage of all the power HD5870 brings to the table; at that resolution, HD5850 is plenty, even for Crysis).

Lastly, with the HD5850, I don't have to upgrade the whole PC (like I would for CF or SLI); I don't even have to change motherboards. Other than the graphics card, I make two (and only two) changes: the CPU (which I was planning on anyway, for reasons other than a GPU swap) and the power supply (the only change that is directly GPU-related, entirely because of the two 6-pin PCIe connector requirement; my current PSU has none. I'd have to make that change even for an HD4850, which uses more power).
 
I'm not looking at HD5870 because I don't have enough *display* to take proper advantage of it (1920x1080, even non-interlaced, is simply too low a resolution to take advantage of all the power HD5870 brings to the table; at that resolution, HD5850 is plenty, even for Crysis).

For gamer settings, yes. Enthusiast settings with 2xAA will use up all the rendering power the 5870 has and make you wish it had just a bit more.

Oh, and my 5870 came with two 6-pin adapters, so if your PSU has enough power, there's no need to upgrade it.
 
CF and SLI are each great technology feats; however, the spottiness of game support and the finickiness of compatibility (not to mention the extra fiddling required, in both cases, compared to a single GPU) indeed makes both a non-starter for me personally.

Wow, another one. Well, let me put it to you this way: I've had multi-GPU systems since the Voodoo 2 SLI days. Since the 6800 Ultra days I've had dual 6800GTs in SLI, 7800GTXs in SLI, 7900GTXs, X1950XTXs, X1950 Pros, 8800GTX 2-way and then 3-way SLI, 9800GX2 Quad-SLI, GeForce GTX 280 2-way SLI, GeForce GTX 280 3-way SLI, and 4870 X2 CrossfireX, and I've worked on, built, or owned other SLI- or Crossfire-based systems using a variety of lower-end cards. If that's your opinion of Crossfire/CrossfireX/SLI/3-way SLI/Quad-SLI etc., then you have little to no experience with it.

Since the late GeForce 6-series days, SLI has become very user friendly and easy to use. There isn't any need to dick around with rendering modes unless the game is brand new and no driver has been released for it. Wait a day or two and that changes, as NVIDIA always releases new drivers when new games come out. I think I waited a week once for proper drivers for a game. That didn't matter, because there are only a handful of rendering modes to try; I found the one that enabled the best performance and that was that. Easy and painless. Crossfire/CrossfireX is now virtually seamless. AMD is slower about driver releases, which is something I wish they'd get better about, so when CrossfireX doesn't work with a game right away all you can really do is wait for the new driver. That's it for compatibility and "fiddling" issues. I've never really seen "spotty game support" with either technology either; all the mainstream and even some very obscure titles are typically well supported. Issues like WoW's inability to perform properly under multi-GPU configurations are likely a coding issue or an issue with the game engine, and that's not AMD's or NVIDIA's fault. Some games benefit more from the technology than others, but as the technology has matured, scaling has improved. In the GeForce 6-series days some games actually performed worse with SLI enabled, but that type of issue has since disappeared; from the late 6-series through today I've ALWAYS seen a performance increase going from single GPU to multi-GPU. Granted, the resolution you run your games at has a large impact on what SLI can do for you, and of course cost is another issue altogether. SLI/Crossfire has never been an economical solution.

Even the issues with micro-stuttering are eliminated with newer dual GPU cards. The problems found with the 7900GX2/7950GX2 and even the 9800GX2 have been eliminated. Newer cards like the 4870 X2 and the Geforce GTX 295 are excellent cards and have seamless SLI or Crossfire support built in. As long as you keep your drivers up to date you shouldn't have any problems with most games. This is one of the reasons why the differentiation between single and dual GPU cards should be pretty much eliminated.
 
To show everyone my point of view on why the power savings on this card are meaningless, I've made this table. I didn't need it to understand the point myself, but maybe it's a new perspective for people who believe this card is really making you energy efficient. This is probably the first time I've ever seen a card discussed so much for its "energy savings"; it seems to be everyone's opinion that that's its strongest feature. DX11 is still a case of counting your chickens before they hatch. My opinion has been that if you've already spent $300-400 on graphics in the past year, getting this card before 2010 would be pointless.

othesavings.png


Roughly every 100 hours you save $1.

After one year you likely won't be using the card anymore. If you do, you probably haven't even put 1,500 hours into it yet anyway. If you think you use it more than 4 hours a day on average, $0.1296 and $0.09024 are the costs per hour. If you want to save money, hang your clothes on a line instead of using the dryer. Install solar panels. Don't use A/C. You'd save thousands per year.

Well said. I've said similar things in other threads concerning computer power usage. It never goes over very well, as you can see. The hippies come out of the woodwork with their "carbon footprint" BS once you shoot down the cost-savings argument. The truth is your big-screen TV, your microwave, and your washer and dryer chew through more power than your computer does.

My machine (including monitor) pulls 980 watts under full load. I've never seen this impact my electric bill. Ever.
 
@ OP: You don't find a reason to upgrade a 4870x2... I wouldn't either if I had a 4870x2.

I, however, find a reason to upgrade my 8800 GT.
 
My machine (including monitor) pulls 980 watts under full load. I've never seen this impact my electric bill. Ever.

With all due respect, you haven't been looking, or your utility co. is mighty generous and giving you free electricity. I'm not saying it's a night-and-day difference, but you're a tech guy, so you know as well as I do that 1,000 watts isn't free. With less power consumption comes less heat, which means the AC is on less, especially during the summer months, which by your own admission is a big source of energy usage.

It's not the "hippies" making it out to be a big deal. The "pro-58xx" crowd simply states all the benefits of the newer card; power consumption happens to be one of them. Then the "anti-58xx" crowd attacks the card, in this case focusing on power consumption, and the debate ends up stuck there. If power consumption is not a big deal, then simply stop talking about it.

I personally couldn't care less about the power consumption IF it were the only benefit, and for someone who owns a 4870 X2 it would be one of the few benefits, which is why I also said previously that if I had a 4870 X2 I would not upgrade to the 5870. But not everyone owns a 4870 X2; some people own 4850s, even more still own 8800GTs and a whole host of other cards with similar performance, and for those people there is really no reason to go with the 4870 X2 over the 5870. THIS is what the thread is about, not power consumption. That's just a tangent this thread went on thanks to the few people crying that their 4870 X2 is now being overshadowed by newer technology.
 

1,000 watts of power draw on my computer while gaming is more than nothing, but given that my power usage isn't anywhere near that high 70% of the time, I don't think it's a big deal. If I played computer games all day every day, the actual power draw would be around 1,000 watts, which of course would add up. That's not the case, though. My actual idle power draw is probably about half that. The electricity isn't free, but it's not really expensive either. Less idle power is a good thing, so I'm not knocking the 5870 in that regard. Not at all.

All I'm saying is that the 5870's lower power consumption isn't the huge deal people are making it out to be. Technologically speaking, it is an impressive feat to get idle power usage down into the 30 W to sub-30 W range. However, the 5870 isn't a worthy upgrade if you've got a GeForce GTX 295 or a Radeon 4870 X2. You'd save a little money on power, but you'd have to keep the card for several years (more than 5, probably) before it would "pay for itself," and you'd be making either a lateral move in performance or a step back in some areas. DX11 and power savings alone are probably not worth the "upgrade." Now, with that said, if you are in need of a new video card and are upgrading from a GeForce GTX 260 or a Radeon 4870, then the 5870 is an attractive option, power savings and all.
 
I agree, my edit pretty much says the exact same thing as your last paragraph.
 
Wow, another one. Well, let me put it to you this way: I've had multi-GPU systems since the Voodoo 2 SLI days. Since the 6800 Ultra days I've had dual 6800GTs in SLI, 7800GTXs in SLI, 7900GTXs, X1950XTXs, X1950 Pros, 8800GTX 2-way and then 3-way SLI, 9800GX2 Quad-SLI, GeForce GTX 280 2-way SLI, GeForce GTX 280 3-way SLI, and 4870 X2 CrossfireX, and I've worked on, built, or owned other SLI- or Crossfire-based systems using a variety of lower-end cards. If that's your opinion of Crossfire/CrossfireX/SLI/3-way SLI/Quad-SLI etc., then you have little to no experience with it.

Even the issues with micro-stuttering are eliminated with newer dual GPU cards. The problems found with the 7900GX2/7950GX2 and even the 9800GX2 have been eliminated. Newer cards like the 4870 X2 and the Geforce GTX 295 are excellent cards and have seamless SLI or Crossfire support built in. As long as you keep your drivers up to date you shouldn't have any problems with most games. This is one of the reasons why the differentiation between single and dual GPU cards should be pretty much eliminated.
This was refreshing to read. Would you mind weighing in on this discussion?
 
Honestly, I think this whole 5870 thing is a bunch of nonsense. Hopefully I can explain why I feel this way:

- The power consumption argument is ridiculous. There isn't a whole lot of savings happening; you could just turn some lights off or line-dry your clothes and save more power. Not to mention that in buying something new, I'm just perpetuating overconsumption. The idea is to buy something and use it for a long period of time, not get something fancy every 2 months because it's 20 or 50% more efficient. That's wasteful, and not worth $400 to me.

- The performance gains over my 4870 X2 aren't impressive, and there are currently no DX11 or DX10 titles that interest me. So while it may offer DX11 functionality, nothing draws me to that, and a performance improvement of less than 5% isn't worth $400 to me.

- The Eyefinity feature does not appeal to me. I have a large single monitor, which I enjoy for gaming, and I don't need more workspace beyond what I can presently get from the 4870 X2's pair of outputs, or by adding a cheap secondary board like a 4350 (or even digging out my old X1650), which would give me four outputs to the 5870's 3. Again, not worth $400.

NOW, I'm not "crying" because my 4870 X2 isn't the biggest and baddest kid on the block. It hasn't been able to lay claim to that title for almost as long as I've had it, and I don't buy components just to show other people up (I also generally refrain from posting system specs, because I couldn't care less what people think of my hardware; it's only relevant if technical support is being asked for anyway).

HOWEVER, if I had, say, kept my GeForce 6800GT for the last 4 years, then yeah, the 5870 would be an ENTIRELY LOGICAL choice for an upgrade because of its performance and availability, but so would many other boards (like the 4890, GTX 275, etc.).

Basically, I agree that there is no need for hype, but I don't understand why there's so much bad blood over the whole mess. For those who already have a good-performing solution, there isn't a legitimate need to upgrade, and there's nothing wrong with buying something just because it pleases you. For those who want or need a performance upgrade, there's nothing wrong with buying something newer. Either way, there is no reason to be so cold to each other over it.
 
I didn't mean to imply all x2 owners are crying, just a select few that are.
 
This is a rather weak argument, don't you think?

It's not an argument; you need not read further than the first post of this thread to see what I'm talking about. I've made my argument several times already, and I'm not the only one to make it. It's really very simple: if you already have a 4870 X2, then the 5870 is probably not worth an upgrade. If you have a weaker card and are looking for an upgrade, it makes no sense to go with the 4870 X2 over the 5870, which is exactly what the OP of this thread is suggesting. That is my argument, which I've made no less than three times now.

Ah wait, you are the OP; then you should know exactly what I'm talking about.
 
I am planning on buying an HD 5970 very soon, but what bothers me most is the PSU. My specs are in my signature. I wonder if it's enough to handle an OC'd HD 5970.

I read on the Anandtech forum that AMD recommends 750 watts or above, with at least 20A, in order to get better performance from an OC'd HD 5970.

Anandtech HD5970 review

For overclocked operation, AMD is recommending a 750W power supply, capable of delivering at least 20A on the rail the 8-pin plug is fed from, and another 15A on the rail the 6-pin plug is fed from. There are a number of power supplies that can do this, but you need to pay very close attention to what your power supply can do. Frankly we're just waiting for a sob story where this card cooks a power supply when overvolted. Overclocking the 5970 will bring the power draw out of spec; it's imperative you make sure you have a power supply that can handle it.

Should I go ahead and buy at least a 750-watt PSU? I'm a fan of Corsair PSUs. :D Also, money is not a concern for me.
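For what it's worth, the rail numbers in that Anandtech quote translate into card power like this (12 V rails assumed; the 75 W slot allowance comes from the PCIe spec, not from the quote):

```python
RAIL_VOLTS = 12

eight_pin_watts = 20 * RAIL_VOLTS  # 240 W available to the 8-pin plug's rail
six_pin_watts = 15 * RAIL_VOLTS    # 180 W available to the 6-pin plug's rail
slot_watts = 75                    # the PCIe slot itself can supply up to 75 W

card_headroom = eight_pin_watts + six_pin_watts + slot_watts  # 495 W
print(card_headroom)
```

So AMD's recommendation leaves up to ~495 W of headroom for the card alone when overvolted; add the CPU, drives, and fans on top of that and the 750 W figure makes sense. When you shop, check the 12 V rail amperage on the Corsair unit's label against those 20A/15A numbers, not just the total wattage.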
 
It's not about hype in the least; it happens to be the best single GPU out.
Yes, it is not quite as fast as Nvidia's top end, but then it is not running 2 GPUs either, which for me is a good thing. I like to keep it simple, and I'm pretty sure a lot of people fall into the same category.
While a lot of people are like "OMG this card is so expensive!", I don't know what the heck they are smoking; for the price point it is a solid product, and will be even more so once the drivers have matured: $370-380 versus Nvidia's top end always going for at least $120 more ($500+) on average.
Some people also hate the stuttering you sometimes get with multi-GPU setups, or certain games not playing nice with CrossFire or SLI, where the 2nd or 3rd card is pretty much a complete waste.
 
What does this have to do with my post? I'm talking about power consumption on the HD 5970, not Nvidia cards or prices or whatever you're talking about...
 
If you are planning to OC, I recommend buying what they suggest, because power usage increases a lot when overvolting; I have seen 5870 numbers go through the roof.
 
What does this have to do with my post? I'm talking about power consumption on the HD 5970, not Nvidia cards or prices or whatever you're talking about...

It was actually you who brought this thread back to life and took it off topic. So the more accurate question would be: what does your post have to do with this thread?
 