I'd rather buy 4 290x single GPUs for the same price lol...
You mean 3? You aren't getting 4x 290Xs for $1500.
That said, I'm pretty much forced to stay with NVidia due to their grossly superior performance in World of Warcraft and better thermals.
The most interesting thing to note about this card is that AMD is putting more effort into the build quality and cooling solutions of their GPUs.
I hope this trend continues!
What's the exact source on this?
I would pay significantly more for a video card that came stock with an AIO watercooler, considering the cost/effort of installing one myself. At a premium of hundreds of dollars, of course.
Nice:
Seriously?
I think what surprised me the most is that its highest temp is 70C. I have triple R9s watercooled and they don't go past 60C.
For me, I'm just debating whether I should do a single 4k monitor vs triple 144hz monitors.
$1500?
I'd rather buy 4 290x single GPUs for the same price lol...
Really? And put them under water? I doubt it lol. 500 bucks apiece even second hand, considering the lack of 550-dollar ones.
Two reference R9 290X boards: $550 each, or $1100.

Buying AIO coolers and getting brackets wouldn't cost too much more. I'm just saying you could buy 4 reference 290X GPUs for $1500 without a huge effort (I'm not suggesting that someone should buy 4 of these and stick them into the same case lol)... This is an AIO solution trying to keep two cards cool; dual 290X with full-cover blocks would be far better than an AIO solution and way cheaper.
Are you installing your video cards in a tablet? No? Then who gives a shit?

200W more than 780 Ti SLI at full load? Jesus, AMD is sure going to make you pay for that extra performance, aren't they?
Absolutely unimpressed by their performance with this part. Get those power consumption figures under control and we'll talk. Until they can compete on a performance-per-watt basis they simply aren't competing.
YAWN.
You do realize the monitor they used in testing currently costs around $3,000 US, right? This test should have been performed with a 1440p or 1600p surround setup, as well as with Titans, which have an equal amount of RAM.
I.e. 1600p surround = 4800x2560, or 1440p surround; this is better than a 1200p surround setup and most likely what someone spending this much on a GPU would be running.
Hopefully HardOCP can get their benches in order, since this was a very unrealistic setup for someone who is spending this much on a card.
EDIT: Funny, since here I am complaining that the resolutions were not high enough and people are complaining they are not low enough. Good job on the 4K, Kyle, you guys do great work. Guess I am in the minority, since I don't feel surround 1080p/1200p shows any difference.
Are you installing your video cards in a tablet? No? Then who gives a shit?
SoftOCP is that way >>>>>
In a way, yes, I am.

Are you installing your video cards in a tablet?
Some people will find anything to argue about. Even if these cards did match the TDP of SLI'd GTX 780 Tis, the same people would then be shouting how this card doesn't have PhysX, doesn't have G-Sync... And let's not forget about AMD's shitty drivers!
Please tell me where I can get 290Xs for ~$350.
They were available for ~$375 on the forums and from eBay; here are a couple of examples:
http://www.ebay.com/itm/Used-Radeon-r9-290x-4gb-Great-Condition-Not-Over-Clocked-/400691488447?pt=PCC_Video_TV_Cards&hash=item5d4b12e6bf#ht_82wt_1063
http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-GDDR5-Video-Card-/181376460465?pt=PCC_Video_TV_Cards&hash=item2a3ae126b1#ht_371wt_1063
There were a few on the for sale/trade forum for $380, but it looks like they all sold. I bought one on the forums for $380 last week.
You do realize the monitor they used in testing currently costs around $3,000 US, right?
With the GeForce GTX 780 Ti we found the peak consistent clock speed on both GPUs went up to 1019MHz while gaming. This is higher than the boost clock on a GTX 780 Ti which is 928MHz. As we posted on the previous page, this seems slightly higher than we've tested in the past. Normally we've seen the GPU hit 1006MHz while gaming, but now it is at 1019MHz with this newest driver. We also noticed the temperature of the GPU was higher, at 87c, versus 84c on previous drivers. This higher temperature threshold has allowed the frequency to go higher, hence the 1019MHz.
The shady tactic of...increasing clock speeds?
This is legit.
They can only test in so many scenarios at Nvidia, who, remember, don't make boards on their own besides some reference boards for in-house use. So they do some testing, set a good boost number for launch, and wait.
As the drivers mature and get more efficient, the amount of heat produced by the boards decreases, allowing for additional thermal overhead that they can make use of. The data they gain from the millions of sold GPUs lets them make the kinds of tweaks that would be impossible with the small sample set of in-house boards.
But at the end of the day, what difference is 13MHz (1.3%) going to make in terms of real performance?
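For what it's worth, that percentage checks out against the clock figures from the quoted review (1006MHz on the old drivers vs. 1019MHz on the new ones):

```python
# Relative clock-speed increase, using the figures from the quoted review.
old_clock = 1006  # MHz, peak boost clock seen on previous drivers
new_clock = 1019  # MHz, peak boost clock seen with the newest driver

delta = new_clock - old_clock
increase_pct = delta / old_clock * 100
print(f"{delta} MHz ~= {increase_pct:.1f}% faster")  # 13 MHz ~= 1.3% faster
```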