ASUS ROG Poseidon GTX 980 Platinum vs. AMD R9 295X2 @ [H]

AMD needs to get their shit together, this is ridiculous

Basically what everyone said about ATI before AMD bought them.

Driver updates/profile updates have been one of the major hindrances when thinking about buying one of their cards. Whatever staffing they have in their driver departments, they need to do one of two things:
1) Fire everyone and get new people in there.
2) Hire more people to speed up releases.

Until they can reliably release updates in a timely manner, their products will always have a negative mark on their report card.
 
Double the power usage!! Yikes! 10 to 25% faster when it WORKS! I'll take the ASUS all day long.
 
Did I read a different review or something? Because what I read showed the 295X2 being not only the better deal (since it already comes with an AIO) but faster in every game except Watch Dogs, which no one plays.

I mean, I agree that the 300 series needs to come out soon... but I don't understand half of what I just read in the replies.

It is a better deal, but it uses twice the wattage and is dependent on Crossfire profiles. Maybe not this generation, but the next will really put performance-per-watt in check. AMD is finally realizing this, and with the big push on the mobile side it will trickle down to all of its other products.
 
Did I read a different review or something? Because what I read showed the 295X2 being not only the better deal (since it already comes with an AIO) but faster in every game except Watch Dogs, which no one plays.

I mean, I agree that the 300 series needs to come out soon... but I don't understand half of what I just read in the replies.

It's comparing a single 980 to a dual-290X card, where the 290X is generally known to be on par with the 980 in general performance.

This review was showing how a properly cooled 980 comes within a hair's breadth of a dual 290X. A 295X2 is indeed better than any single 980 if Crossfire actually works in that game (which requires both driver support AND game engine support), but this review concludes that even then, there is only a 20% difference between the 295X2 and the Platinum 980.

20% is still better, but it lands squarely in 'not worth the SLI/Crossfire profile headaches' territory. Personally, I use 50% scaling as my break-even point.
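The 50% break-even rule above is just arithmetic; here is a minimal sketch of it (helper names and FPS figures are made up for illustration, not numbers from the review):

```python
def scaling_efficiency(single_fps: float, multi_fps: float) -> float:
    # (multi - single) / single: 1.0 means perfect 2x scaling,
    # 0.0 means the second GPU added nothing.
    return (multi_fps - single_fps) / single_fps

def worth_it(single_fps: float, multi_fps: float, break_even: float = 0.5) -> bool:
    # Apply the rule of thumb: only bother with SLI/Crossfire if the
    # second GPU delivers at least `break_even` of its potential.
    return scaling_efficiency(single_fps, multi_fps) >= break_even

# Hypothetical numbers: one 290X at 50 fps, CrossFire at 85 fps.
eff = scaling_efficiency(50, 85)   # 0.70 -> 70% scaling
print(worth_it(50, 85))            # True: above the 50% break-even
```

By this yardstick, a game where CrossFire only adds 20% over a single card falls well short of the 50% threshold.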
 
Nvidia and Intel ought to invest in AMD.

You can't buy better publicity than this. They can also use it to cover up the fact that there have only been 10-15% performance increases from gen to gen. Who gives a shit?! At least it's faster than AMD.

In all seriousness though, Intel and Nvidia need AMD now to cover their own asses. They need their David.
 
What's wrong with the drivers? I have been using AMD cards since the 9800 Pro days and never had issues. Most of the games I play run just fine. In fact, I run into more issues with buggy games than with drivers. Having to download gigs of patches just to get a game running properly is ridiculous.

I am running 2 x R9 290 now; some games support CF, some don't (e.g. Diablo, WoT, STO). I still don't have a problem with it since the frame rate is fine. I play BF4 and Crysis 3 and CF works fine.

The only issue I have is that 4 x R9 290 don't work well except for benchmarks, so I sold off two cards in the end.

As someone who's owned both Nvidia and AMD cards on a pretty much alternating basis, this driver argument is out-of-date crap that people just perpetuate. Nvidia had a huge advantage back in the Detonator days, but to be perfectly honest, I haven't noticed a whole lot of difference since Catalyst was released and I upgraded to a 9700. For a while, CFX was scaling better than SLI, but don't tell that to the fanboys.

That said, AMD really needs to do something impressive with the 300 series, or it's going to be increasingly difficult for anyone to justify going their route for an upgrade, aside from the significant cash savings of a FreeSync monitor vs. its G-Sync equivalent.
 
Zarathustra[H];1041488396 said:
I would pick the Asus Poseidon 10/10 times.

Not because I have anything against AMD products, but because I have been burned by multi-GPU setups in the past.

Based on this, I will ALWAYS try the most powerful single GPU before deciding to do dual or more GPUs.

If you don't absolutely NEED Crossfire or SLI, it's just not worth the headaches, IMHO.

This I agree with. I'd much rather have a powerful single-GPU solution than deal with the consistent issues that seem to plague multi-GPU setups. If I can avoid it, I will.

That, and Hawaii was embarrassingly power hungry. I hope AMD brings that back to earth with the next card launch.
 
Nvidia and Intel ought to invest in AMD.

You can't buy better publicity than this. They can also use it to cover up the fact that there have only been 10-15% performance increases from gen to gen. Who gives a shit?! At least it's faster than AMD.

In all seriousness though, Intel and Nvidia need AMD now to cover their own asses. They need their David.

So are you saying the GTX 980 is only 10-15% faster than the GTX 780, which it directly succeeds? And how does it compare to the GTX 680, first-gen Kepler vs. first-gen Maxwell?

God... :confused:
 
It is a better deal, but it uses twice the wattage and is dependent on Crossfire profiles. Maybe not this generation, but the next will really put performance-per-watt in check. AMD is finally realizing this, and with the big push on the mobile side it will trickle down to all of its other products.

Yes, but let's get real on the whole wattage argument... no one cares how many watts something like this draws. That is akin to worrying about the fuel economy of a drag racer.
 
It's comparing a single 980 to a dual-290X card, where the 290X is generally known to be on par with the 980 in general performance.

This review was showing how a properly cooled 980 comes within a hair's breadth of a dual 290X. A 295X2 is indeed better than any single 980 if Crossfire actually works in that game (which requires both driver support AND game engine support), but this review concludes that even then, there is only a 20% difference between the 295X2 and the Platinum 980.

20% is still better, but it lands squarely in 'not worth the SLI/Crossfire profile headaches' territory. Personally, I use 50% scaling as my break-even point.

Correction: it's showing a highly overclocked and water-cooled GTX 980 vs. the 295X2. The key words here are highly overclocked and water cooled. I am quite sure stock results would be different.

I am using EK water blocks on my pair of 290s (not 290Xs) and I overclock them. They're even faster than the overclocked 295X2 used in the review. However, I only overclock heavily when I want to run benchmarks for fun (you should see how quad 290s perform in 3DMark and Unigine; it's awesome). For 24/7 use, I run at a conservative 1050MHz to protect the lifespan of the cards.

As I have mentioned previously, overclocking reduces the lifespan of the GPU, and that is not mentioned in the article. There is a difference between benching and 24/7 usage. I am quite sure most of those who overclock heavily have killed at least one card.
 
Yes, but let's get real on the whole wattage argument... no one cares how many watts something like this draws. That is akin to worrying about the fuel economy of a drag racer.

Yes, I fully agree. If I want a flagship card, I won't worry about power; if I am concerned about power, that means I cannot afford a flagship card. Simple as that.
 
As someone who's owned both Nvidia and AMD cards on a pretty much alternating basis, this driver argument is out-of-date crap that people just perpetuate. Nvidia had a huge advantage back in the Detonator days, but to be perfectly honest, I haven't noticed a whole lot of difference since Catalyst was released and I upgraded to a 9700. For a while, CFX was scaling better than SLI, but don't tell that to the fanboys.

That said, AMD really needs to do something impressive with the 300 series, or it's going to be increasingly difficult for anyone to justify going their route for an upgrade, aside from the significant cash savings of a FreeSync monitor vs. its G-Sync equivalent.

There's nothing fanboyish about pointing out a lack of service.
I own cards from both companies.
AMD is seriously behind on drivers... nothing new or updated since December?
Quite honestly, dual-or-greater card support has been pretty sketchy since the 4870 X2 came out.
For single cards it probably doesn't matter, and AMD is probably pretty close to NVidia in that regard.
But if you use more than one card, most times AMD's support is pretty far behind.

Yes, I fully agree. If I want a flagship card, I won't worry about power; if I am concerned about power, that means I cannot afford a flagship card. Simple as that.

The only "bother" is whether or not your PSU can handle the GPU; that, and the extra heat generated by the extra power usage.

The 290X cards are real power hogs, not to mention hot and loud. I have a couple of reference cards, and water cooling was the only way to use them properly and still be able to hear myself think.
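The PSU question is simple arithmetic. A rough sanity check, assuming approximate public board-power ratings (the 295X2 is rated around 500W, a stock GTX 980 around 165W) and a made-up helper name:

```python
def psu_headroom(psu_watts: float, gpu_tdp: float,
                 rest_of_system: float = 250, margin: float = 0.8) -> float:
    # Budget the GPU against the PSU, treating `margin` (e.g. 80%)
    # as the maximum sustained load and reserving `rest_of_system`
    # watts for CPU, drives, fans, etc. All figures are rough estimates.
    usable = psu_watts * margin
    return usable - (gpu_tdp + rest_of_system)

# Approximate ratings: R9 295X2 ~500W, GTX 980 ~165W (stock).
print(psu_headroom(850, 500))  # -70.0: an 850W unit is marginal for a 295X2
print(psu_headroom(850, 165))  # 265.0 watts of spare capacity for the 980
```

The exact reserve and margin are guesses; the point is that the 295X2 eats most of a big PSU's budget while a single 980 leaves plenty of headroom.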
 
Interesting review in an academic context; not the best time to be buying GPUs right now at any rate.

This is the conclusion I keep coming away with in every [H] card review. There are so few titles out there actually pushing hardware capability at 1080p that my 3GB GTX 660 Ti is not only workable, but perfectly feasible for nearly every game coming out. Can I max settings? No, but you don't buy a 660 Ti new to max settings. I want the game to look good, with some eye candy, and run well, and I have yet to install a game where that isn't the case on my setup (outside of the buggy release or piss-poor port).

The benefit of all this is that Nvidia, at least, has been working hard on efficiency. When games and engines start really pushing the hardware that is going to pay off in spades.

I'm REALLY hoping UE4 begins to challenge cards more. Maybe Source 3 when it drops (though I suspect not).
 
Did I read a different review or something? Because what I read showed the 295X2 being not only the better deal (since it already comes with an AIO) but faster in every game except Watch Dogs, which no one plays.

I mean, I agree that the 300 series needs to come out soon... but I don't understand half of what I just read in the replies.
I agree. Both cards can be had for the same price, but the Poseidon requires a custom water loop, which can add an extra $200 on top of the price of the card. The 295X2 is still pretty much plug-and-play, coming with everything you need. AND it performs better. The 980 gets close, but also realize that the Hawaii XT chip used in the 295X2 is 18 months old at this point, while Maxwell 2.0 is only about 4 months old.
 
I agree. Both cards can be had for the same price, but the Poseidon requires a custom water loop, which can add an extra $200 on top of the price of the card. The 295X2 is still pretty much plug-and-play, coming with everything you need. AND it performs better. The 980 gets close, but also realize that the Hawaii XT chip used in the 295X2 is 18 months old at this point, while Maxwell 2.0 is only about 4 months old.

You can use the Poseidon as a typical air-cooled card, and even overclock it and still have great temps. If you remember the [H] review of the card, on air, even overclocked and with the fans at 58%, the max temp was 76C. So what's the problem? You know what's good? You can buy any good expandable AIO and cool your CPU and GPU without problems. All those guys with an expandable AIO who use it exclusively to cool the CPU can aim perfectly for this kind of target and enjoy a quieter, cooler system, especially with a high-end AIO like a Swiftech H240-X.

 
So are you saying the GTX 980 is only 10-15% faster than the GTX 780, which it directly succeeds? And how does it compare to the GTX 680, first-gen Kepler vs. first-gen Maxwell?

God... :confused:

Uh, no. I'm comparing it to the top-of-the-line previous gen, the 780 Ti, which it only outperforms by 10-15%.
 
Uh, no. I'm comparing it to the top-of-the-line previous gen, the 780 Ti, which it only outperforms by 10-15%.

IMHO, the card should be compared to its direct predecessor. If Nvidia launches a 980 Ti, what should that be compared to, if you're already comparing the 980 to the 780 Ti?
 
you can use the Poseidon as a typical Air cooled card.. and even overclock it and still have great temps.. if you remember the [H] review of the card. that on air even overclocked and with the fans at 58% the max temp was 76C.. so what's the problem?.. you what its good?. you can buy any good expandable AIO and cool your chip and GPU Without problems.. all of those guys with expandable AIO which are using it exclusively to cool the CPU can aim perfectly for this kind of target and enjoy a quieter and cooler system.. specially if are a high end AIO like a Swiftech H240-X for example.
In the context of this comparison, one cannot hit the same overclocks on air that you can with water. The lower clock speed translates to lower performance, which increases the gap between it and the R9 295X2. So, keeping on topic, the 295X2 still translates to the better value. Of course, that is questionable if power draw is one of your concerns, but I think most people here are not worried about that.
 
In the context of this comparison, one cannot hit the same overclocks on air that you can with water. The lower clock speed translates to lower performance, which increases the gap between it and the R9 295X2. So, keeping on topic, the 295X2 still translates to the better value. Of course, that is questionable if power draw is one of your concerns, but I think most people here are not worried about that.

It really depends on the luck of the draw, as other cards like the Gigabyte G1 and EVGA Classified (or even the FTW) can achieve 1600+MHz on air. Maxwell is really limited by its power limit, not by temperature; you will hit the power limit well before you throttle on temperature (especially with aftermarket coolers). So no: as mentioned in the review, the card's major difference is noise, since the fans remain at idle speed under water. 1580MHz under water is nothing admirable, nothing special; it's just quieter. Water cooling doesn't matter much until you apply a modded BIOS to increase the power limit, and I have yet to see a 980 under water reach higher clocks than a typical G1 or Classified.
 
Nope, it will be more. You can't use an AIO for this; it has to be a custom loop, so it's going to increase the price by quite a bit. You need fans, radiators, and a pump, as well as tubing.

But if you already have a loop, adding the Poseidon to it is likely very simple.
 
IMHO, the card should be compared to its direct predecessor. If Nvidia launches a 980 Ti, what should that be compared to, if you're already comparing the 980 to the 780 Ti?

I dunno if Nvidia is gonna cannibalize the Titan X this round. They took away its workstation potential, so why make a neutered version this time?
 
I dunno if Nvidia is gonna cannibalize the Titan X this round. They took away its workstation potential, so why make a neutered version this time?

IMO the Titan X and 980 Ti will have similar performance, with the Titan X having double the VRAM and less high-resolution punch.
 
I wanna grab a 295x2 and tri-fire so bad :X

Edit: nvm...lol
 
It's comparing a single 980 to a dual-290X card, where the 290X is generally known to be on par with the 980 in general performance.

This review was showing how a properly cooled 980 comes within a hair's breadth of a dual 290X. A 295X2 is indeed better than any single 980 if Crossfire actually works in that game (which requires both driver support AND game engine support), but this review concludes that even then, there is only a 20% difference between the 295X2 and the Platinum 980.

20% is still better, but it lands squarely in 'not worth the SLI/Crossfire profile headaches' territory. Personally, I use 50% scaling as my break-even point.

THANK YOU!

I get very tired of the mob mentality haters just spouting what they think the majority will agree with.

Sir, if I could give you intelligent points, I would.
 
Didn't bother reading the review; with your game lineup being majority GameWorks, the conclusion was set.

AMD GPUs suck in NV games, but NV GPUs work just fine in AMD GE games (FC3, BF4, etc.), case closed. No point in even bringing up the same old tired bashing of poor CF (name a recent AAA non-NV game that didn't have CF working) or bad performance.

I look forward to not reading your reviews in the future when GTA and Witcher 3 (both GameWorks titles) are added to your small list of games. /gg

So what you're saying is they should ignore some of the most important titles on PC, like The Witcher and GTA, because AMD may perform poorly in them? That's AMD's fault for not having the same level of developer resources that NVIDIA does. It's unrealistic to filter out games during testing just because AMD can't cut it. In fact, it's more realistic to include them, because those AAA games are what players want, and players should know which platform runs them best with all features enabled.
 
I don't know why any of this matters; Asus stated they aren't making any more of them, and you can't buy them retail anymore.
 