PowerColor Devil13 HD 7990 Revealed

HardOCP News

TUL Corporation, a leading manufacturer of AMD graphic cards, today reveals the 1st and only dual TAHITI XT GPU solution: the PowerColor Devil 13 HD7990. Designed to tackle the demanding HD titles, the Devil 13 HD7990 has default settings at 925MHz engine clocks and 1375MHz memory clocks. Furthermore, it’s equipped with dual BIOS switch button, boosting up the engine frequencies to 1GHz, breaking out the limitation of extremely gaming performance.
 
I would imagine this would trump the 690 but would that put AMD back on top since it's not an official AMD card?

I am curious to see the power consumption though.
 
Must know release date. Bill Me Later....prepare to be maxed out.
 
Does anyone else look at these cards and wonder, "WTF? Aren't cards supposed to be getting thinner and less power-hungry?" I mean, there was an industry backlash over cards just getting slapped together and making the customer pay 2x as much while only getting a 20-30% increase in performance... and lately it looks like it's happening again.

I would like for cards to NOT take up 2-3 slots on my motherboard, and I would VERY much like to be able to use multiple monitors on my card without playing goddamn Russian roulette with adapters and conversion plugs just to make it work. I had a conversation with Steve about that... it's like the card makers get some kind of perverse pleasure from not providing real change or evolution in cards and the tech they use.

How many others have jumped through hoops collecting build parts to take advantage of announced features and/or abilities, only to have those promises evaporate when they aren't willing to put their money where their mouth is? Yeah... I'm lookin' at YOU, spare card intended as a physics processor!
 
This is a special-edition card doing something nobody's done before: 7970 CrossFire on a single card. Why rant in a thread about an obviously over-the-top card? It's just going to fall on deaf ears and get you this :rolleyes:

As to your rant, the 680/670 brought power consumption down a lot, and the 660Ti is doing the same for much less money. My 670 FTW has 2 DVI ports - that's 2 monitors, no adapters right there. Most people don't need more, or if you're spending enough to go Eyefinity/surround, an adapter for monitor #3 is no big deal.

Also, I don't know how you can expect a top-end GPU to use only one slot these days; you're going to need a much less power-hungry (read: slower) card for that.
 
I'd almost always prefer two single-GPU video cards in CrossFire/SLI over one dual-GPU card.

I just don't get the appeal of these things, except in compact rigs, like those Shuttle all-in-one barebones cases, in which case the PSU would be too wimpy for this board anyway...
 
Awful, awful Flash-centric website.

Don't make me work to hunt and peck through a graphic to find what to do next -- it's supposed to be an informational site, not an asinine PC adventure game from the '80s.
 
Zarathustra[H] said:
I'd almost always prefer two single-GPU video cards in CrossFire/SLI over one dual-GPU card.

I just don't get the appeal of these things, except in compact rigs, like those Shuttle all-in-one barebones cases, in which case the PSU would be too wimpy for this board anyway...

The real appeal is Bitcoin mining. The 5970 has proven extremely power-efficient: compared against two 5870s at the same clocks, the 5970 draws less power. It also lets you populate a single motherboard with 8 GPUs instead of 4, which maximizes mining efficiency. Instead of running two motherboards, two processors, multiple sets of RAM, hard drives, etc. for 8 GPUs, you can do it from one motherboard.
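The consolidation argument above boils down to simple arithmetic: every extra machine adds a fixed base-system overhead on top of the per-GPU draw. A quick sketch, where every wattage figure is an illustrative assumption rather than a measured number:

```python
# Rough sketch of the mining-consolidation argument.
# Both wattages below are illustrative assumptions, not measurements.
GPU_POWER_W = 150    # assumed per-GPU draw under mining load
BASE_SYSTEM_W = 120  # assumed motherboard + CPU + RAM + drive overhead

def total_power(num_gpus: int, num_systems: int) -> int:
    """Total wall draw for num_gpus spread across num_systems machines."""
    return num_gpus * GPU_POWER_W + num_systems * BASE_SYSTEM_W

# 8 GPUs as four dual-GPU cards in one machine...
one_box = total_power(8, 1)
# ...versus 8 single-GPU cards split across two machines.
two_boxes = total_power(8, 2)

print(one_box, two_boxes)
print(two_boxes - one_box)  # savings equal one full system's overhead
```

Whatever the real numbers are, the savings from consolidating is exactly one base system's worth of power (and hardware cost), which is the poster's point.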

From a gaming standpoint, if I wanted to run quad CrossFire, two of these cards would be a lot easier temperature-wise than having 4 cards packed so tightly together they overheat and catch on fire.
 

Then order a 7770 and stop looking at the 7990. You can't reduce the size and power WHILE increasing performance. Pick two...

Manufacturers have been working within the same TDP for a while. Top-tier cards have, for the most part, used two 6-pin connectors plus the PCI-E slot, for a total of 225 watts maximum. My 7970 requires two 8-pin connectors, so the fact that this 7990 only needs three instead of four is pretty impressive.
 
Zarathustra[H] said:
I am doing extremely gaming performance.

Why not you?

Let's do extremely gaming performance together!

For Victory! Hot Space Station!

Hahaha.
 
Anyone else notice there are no CrossFire connectors? How can I have my "extreme gaming performance" without two of these in my computer?
 

Yeah, I saw that too and I rofl'ed.
 
What a terrible marketing campaign. Really, take the time to read the scrolling text on that website. It's engrish genius.
 
With the 7950 now selling for $299 (three of them for $899, or even cheaper used), and three of them topping this new 7990's performance by around 30%, it's nothing I would consider.

Then again, I have an Asus X79 Workstation board that makes adding in 3 cards easy-peasy.
 
Since this is a massively nonstandard design, a la the Ares and MARS cards...are we looking at another $1000 monstrosity here? Because if so, damn...
 
seems like the 690 would likely smoke this thing

No, not really. The 690 will likely lose to this.

The 690 is a very, very stupid card for gamers. It has so little VRAM that multi-monitor, above-1080p gaming (really the only reason you'd need that much power outside of folding) on it is stupid. You're better off with 4GB 670s in SLI. Also, the 690 is like $1k+.

This card is only slightly less dumb. 3GB per GPU is better, but the power requirements will be absurd, and it's PowerColor, which means if anything on it breaks they will probably RMA you an HD 6870 and a can of beans with a cryptic letter that reads:
"New card of??????
- POWERCOLOR :) Customers Support"
 
http://www.hardocp.com/article/2012/03/28/nvidia_kepler_geforce_gtx_680_sli_video_card_review/9
The biggest question in regards to performance and gameplay experience about GeForce GTX 680 SLI was if the 2GB of VRAM per GPU and lesser memory bandwidth compared to Radeon HD 7970 would be a hindrance. Our testing has clearly answered that question. In fact, in every game we tested, GTX 680 SLI offered a better gameplay experience compared to Radeon HD 7970 CrossFireX. We specifically tested at NV Surround and Eyefinity at the maximum resolution of our configuration at 5760x1200 to see if there would be any bottlenecks. We found that the new GeForce GTX 680 SLI has the performance where it counts.

It seems that 2GB is not a bottleneck at 5760x1200.
That may be different at 7680x1600 (three 2560x1600 panels)...
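The pixel math makes the concern concrete, assuming the two triple-monitor setups are three 1920x1200 panels and three 2560x1600 panels respectively. The framebuffer figure is only a lower bound; real games add depth buffers, render targets, and textures on top of it:

```python
# Back-of-the-envelope pixel math for the Eyefinity/Surround resolutions above.
def pixels(w: int, h: int) -> int:
    return w * h

def framebuffer_mib(w: int, h: int, bytes_per_pixel: int = 4) -> float:
    """One 32-bit color buffer at this resolution, in MiB."""
    return w * h * bytes_per_pixel / (1024 * 1024)

print(pixels(5760, 1200))                  # 6,912,000 pixels (3x 1920x1200)
print(pixels(7680, 1600))                  # 12,288,000 pixels (3x 2560x1600)
print(round(framebuffer_mib(7680, 1600)))  # ~47 MiB per color buffer
```

The jump from 5760x1200 to 7680x1600 is roughly 78% more pixels to render and buffer each frame, which is why a card that's fine at the former can run out of VRAM headroom at the latter.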
 