AMD Radeon HD 6990 "Antilles" Sneek Peek @ [H]

FrgMstr · Just Plain Mean · Staff member · Joined May 18, 1997 · Messages: 55,634
AMD Radeon HD 6990 "Antilles" Sneek Peek - It is here in our hands and it screams, "Performance!" The wait is nearly over as AMD's next generation "Antilles" video card sporting two high performance GPUs on a single printed circuit board is in our hands. We comment on our gameplay experiences in Dragon Age II in Triple-Display Eyefinity.
 
I have a question about the screen shots. They seem stretched out from my perspective. Is that me viewing it at 19x12, or are some of the games "stretched" across the 3 screens?
 
Looks to be AWESOME performance with that card!
 
So why isn't this dual-gpu AMD card going to suffer from the microstutter and other issues of all the previous dual-gpu AMD cards?
 
So why isn't this dual-gpu AMD card going to suffer from the microstutter and other issues of all the previous dual-gpu AMD cards?

because all of the other radeon cards do not have microstutter, my 6950 cough (6970s) do not have microstutter.
 
Agh this article is such a cock tease. Basically we know that the card exists ;) Well now I can't wait to see specifics.

It will be very interesting to see how this compares to Crossfired 6950s or 6970s, especially for people with P55 motherboards (8x by 8x). Will the bandwidth advantage of having both GPUs on one PCIe 16x slot make it a better choice than two discrete cards? (Assuming all other things equal)
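For a rough sense of what is at stake bandwidth-wise, here's a back-of-the-envelope sketch only, assuming PCIe 2.0 links (as on P55 boards and this generation of cards):

```python
# Back-of-the-envelope PCIe 2.0 bandwidth, per direction.
# PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, which works out
# to roughly 500 MB/s of usable bandwidth per lane, per direction.
MB_PER_LANE = 500

for lanes in (16, 8):
    print(f"x{lanes}: ~{lanes * MB_PER_LANE / 1000:.0f} GB/s per direction")

# x16: ~8 GB/s per direction
# x8:  ~4 GB/s per direction
```

One caveat the article doesn't cover: dual-GPU boards of this type have split their single x16 link between the two GPUs through an on-board PCIe bridge (the 5970 did, at least), so host bandwidth per GPU likely ends up in the same ballpark as x8/x8; the bridge mainly lets GPU-to-GPU traffic stay on the card.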
 
because all of the other radeon cards do not have microstutter, my 6950 cough (6970s) do not have microstutter.
Yes, they do, and just because you can't see it doesn't mean it's not there. Any AFR-based multi-GPU solution has microstutter, it's a limitation of the technology.

Great article, can't wait for the full review :cool:.
 
...um, that is exactly what it means.
Not in the least. If you turn off the lights so you can't see the table in front of you, but then walk into it, do you sit down confused why you hit something even though you can't see it? Give me a break. Microstutter just makes the actual framerate lower than the reported framerate due to the discrepancy in frame times. Do a FRAPS run or better yet, play with CF/SLI off for a bit and you can easily see the difference.
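To make the frame-time point concrete, here's a minimal sketch with made-up numbers (not measurements from any card) showing how uneven pacing makes the felt framerate lower than the FPS counter suggests:

```python
# Illustrative only: two ways to deliver 10 frames in 100 ms,
# both reported as "100 FPS average" by a simple frame counter.
even     = [10.0] * 10        # smooth pacing: every frame takes 10 ms
stuttery = [5.0, 15.0] * 5    # AFR-style alternating short/long frames

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_gap_fps(frame_times_ms):
    # Perceived smoothness tracks the longest gaps, not the average.
    return 1000.0 / max(frame_times_ms)

for name, ft in (("even", even), ("stuttery", stuttery)):
    print(f"{name}: avg {avg_fps(ft):.0f} FPS, worst gap {worst_gap_fps(ft):.0f} FPS")

# even:     avg 100 FPS, worst gap 100 FPS
# stuttery: avg 100 FPS, worst gap  67 FPS  <- same average, choppier feel
```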
 
Looks promising... wonder if it's going to be retardedly expensive like the 5970s... if it's not... I may just get this over the GTX 580....
 
I know you are doing this as a "quickie".

I'm just wondering how the 6990 would compare to Crossfired 6970s or 6950s, whichever GPU is used on the 6990.

I am very interested to see the total review.

Can I ask what PSU connectors you used for the 6990?

Oh yeah..........if you don't want that briefcase thingy you got the card in, I can take it off your hands.;)

Thanks for the review.
Maybe this means the nvidia equivalent isn't too far behind?:D
 
Only so much you can say before the NDA lift. Good to see the card is physically produced and awaiting an NDA lift; it shouldn't be long, therefore, before it's out.
 
When does it come out? I'm more interested in the "BETA DRIVER" you used, and when that driver will come out...
 
The word is spelled "sneak" guys :)

I did it on purpose, thought it made for a fun title. Sorry. - Kyle
 
I guess there is light at the end of the tunnel. :D From the hints presented in this article, this damn card is just what the doctor ordered. :cool:
 
If I had to guess, I would say a minimum of 700 watts. I know we argue back and forth about how they can't exceed this (300 watt) power envelope, but when you factor in overvolting and overclocking, those figures go up. However, I'm sure in idle mode the numbers will be impressive.
 
Very cool. Was hoping for more details for a while! Heard it is coming out in April, so nV will not be able to adjust the 590 to it.
Hopefully it won't take longer than 1 or 2 weeks for the 6990 to come out. :p
 
Nice to see the hardware is going to make Eyefinity even more playable.

Now if only the game developers would jump on board and make games with PROPER triple-screen support. I played the DA2 demo in triple-screen, and while the spanning is well implemented in terms of aspect ratio/FOV, the HUD is, like the first one, on the edges of the outside screens.

Why oh why will they not just spend the tiny amount of dev time to allow the HUD to be either customizable, or do the calculation so it is restricted to the center monitor?

I honestly find games somewhat unplayable when I have to ping-pong my head back and forth to read the HUD.
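That calculation really is tiny; here's a sketch of what pinning the HUD to the middle screen of a 3x1 group could look like (the 1920x1200 panels are just an example):

```python
# Hypothetical 3x1 landscape Eyefinity group of 1920x1200 panels.
monitors = 3
panel_w  = 1920
total_w  = monitors * panel_w              # 5760-wide spanned render target

# Horizontal pixel range of the centre panel inside the spanned surface.
center_left  = (monitors // 2) * panel_w   # 1920
center_right = center_left + panel_w       # 3840

print(f"Anchor HUD elements to x in [{center_left}, {center_right}) "
      f"instead of [0, {total_w}).")
```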
 
Sounds like a nice card. I wonder if the rumored date of the 15th for the 590 will hold.
 
I have a question about the screen shots. They seem stretched out from my perspective. Is that me viewing it at 19x12, or are some of the games "stretched" across the 3 screens?
You mean how the far left and right sides look stretched? That's a property of projecting a 3D world onto a flat screen. If the three screens are properly aligned along a single plane and you're sitting the correct distance from the monitors, there is no distortion.

The central action on the screens should not be stretched, though, so if that's what you see, that's something else.
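For the curious, the "correct distance" has a simple closed form for a flat, single-plane setup: the image is geometrically undistorted when your eye sits where the game's virtual camera sits. A quick sketch with made-up panel sizes and FOV:

```python
import math

# Made-up numbers: three 52 cm wide panels side by side, with the game
# rendering a 120-degree horizontal field of view across the whole span.
total_width_cm = 3 * 52.0
hfov_deg = 120.0

# For a single flat projection plane, the view is undistorted when the eye
# is at the virtual camera position: d = (W / 2) / tan(hfov / 2).
distance_cm = (total_width_cm / 2) / math.tan(math.radians(hfov_deg) / 2)
print(f"Sit roughly {distance_cm:.0f} cm from the screens.")  # ~45 cm
```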
 
Oh nice.... Can't wait to see how this thing stacks up to similarly specced single core cards in SLI/CF. I'm also always interested to see what the power/heat results have to say.

Damn the GPU development for being so fast moving! lol. Good news though is that should push 5970 prices down even more.

Gah!! Can't wait to see the reviews. May have to flip my 5970 and sport one of these :)

Shane
 
Can't wait for the full review. Expecting it to have full 1536 shader Cayman cores clocked at 6950 speeds to stay within the 300 watt PCI-e limit but tailored to be easily overclocked to full 6970 speeds and beyond.
 
Hmm, the GPUs are set pretty far apart... wonder why that is

Because there is a new chip in between that unites the 2 Frame buffers to make it a 4gb card instead of 2gb x2 :eek:

Did I also mention that these were uber cherry picked Caymans with 1920SP each chip :eek:
 
Microstutter sounds like a pretty big issue with these kinds of sandwich cards. If so, then what's the point? Just to say on paper that this card has higher FPS?
 
Hmm, the GPUs are set pretty far apart... wonder why that is

I'll assume you haven't seen the earlier pictures showing the radial fan in the middle. So here you go:

[image: 1fullimagephp.jpg]

4x Mini DP outputs and 1 DVI-D, leaving the second slot for full exhaust. That radial fan is going to move a LOT of air, hopefully it won't be too loud.
[image: 2fullimagephp.jpg]

Notice the 8 and 6 pin PCI-e power connectors. That means the card draws a max of 300 watts. Also notice the absence of the dual BIOS switch.
[image: fullimagephp.jpg]
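For reference, the 300 watt figure falls straight out of the PCI Express power budget (spec ceilings, not measured draw):

```python
# PCI Express power budget for an 8-pin + 6-pin board (specified maximums).
slot_w      = 75    # through the x16 slot itself
six_pin_w   = 75    # one 6-pin PEG connector
eight_pin_w = 150   # one 8-pin PEG connector

print(f"Spec ceiling: {slot_w + six_pin_w + eight_pin_w} W")  # 300 W
```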


Because there is a new chip in between that unites the 2 Frame buffers to make it a 4gb card instead of 2gb x2 :eek:
Nope.
Did I also mention that these were uber cherry picked Caymans with 1920SP each chip :eek:
Nope, that's just a regurgitation of the same false rumor from back when everyone thought Cayman was going to have 1920 Shaders.
Microstutter sounds like a pretty big issue with these kinds of sandwich cards. If so, then what's the point? Just to say on paper that this card has higher FPS?

They make these cards just so that you could come to the forums and complain about them. :)
 
All I want to know is will this run my 30" Dell 3008wfp @ standard res. with all settings maxed the fuck out?


:D
 
All I want to know is will this run my 30" Dell 3008wfp @ standard res. with all settings maxed the fuck out?


:D

Of course it can... heck any card can do that...:rolleyes:
But this card can probably do so and be smooth... :p

For those whining about micro-stutter... drivers can help... but because they have not come up with a better way than AFR... of course it will have some micro stutter...
but the frame rates should be more than high enough for it to be smooth anyways.
 
LOL, that should be the launch slogan of this thing, "Graphics maxed the fuck out"
 
Microstutter sounds like a pretty big issue with these kinds of sandwich cards. If so, then what's the point? Just to say on paper that this card has higher FPS?

Because it isn't a big issue. It is real, it exists, but reviews like [H]'s will automatically account for it by the method used. You only have to factor it in when looking purely at FPS numbers. Once you take the more subjective gaming experience of the reviewer into account (assuming you trust them, of course), then you don't need to account for microstutter. If you only look at average FPS, then yes, microstutter can be a big issue because it isn't displayed in a single number, but the solution to that is easy - don't look at crappy reviews.
 
The new framebuffer idea is just what I wanted in a double-core card, hell yeah! Looking forward to eyeballing this one when it's time to build a new computer later this year for BF3 :)

On the subject of Dragon Age 2 - Is it just me, or is there something missing from the graphics? They look so raw, so stale, like there's no colour filtering going on. APB used bloom and HDR to make its look; with those disabled the game looked worse than Quake 1 - in a broken way. The last Fallouts used a green filter to get that look; without it, they looked just like this. Looking unfinished, a very peculiar design decision. Or is it just the game engine for DA2? (What engine is it, anyway?)
 