R9 290X CrossFire performance....

Sieg | Limp Gawd | Joined: Aug 2, 2011 | Messages: 213
Wow! :eek:
Almost 2x scaling on some games. Take it with a grain of salt still. :cool:
[Image: AMD Radeon R9 290X CrossFireX benchmark chart]




Read more from VideoCardz and WCCFTech.

I'm seriously considering getting two if the frame pacing issue is resolved for Eyefinity when the card launches. :)
 
If these results are accurate ... WOW

Whatever changes AMD made to CF seem to be working!
 
I wonder why they turned off AF in some games? I thought there was no performance penalty with AF. Someone care to educate me?
 
Nice results. Also, I'm amused by people running 4xMSAA at 4K... I can't imagine jaggies are much of a problem at that resolution. Now someone just needs to do some proper frametime/FCAT analysis.
 
Nice results. Also, I'm amused by people running 4xMSAA at 4K... I can't imagine jaggies are much of a problem at that resolution. Now someone just needs to do some proper frametime/FCAT analysis.

If I were running triple display I would still try to run 4xMSAA. It makes the games look that much better no matter how many pixels are on the screen.

Also, with a 512-bit memory bus, MSAA and even SSAA at 1440p/1080p might be pretty badass now!
 
That is very good performance and all, but it'd be more interesting if there were some games benchmarked that we know AMD doesn't have an application profile for in CF.

I'd like to see how it performs, or doesn't, in those games.

Also with a 512-bit memory bus.
It's more about the bandwidth than the width of the memory bus. Total memory bandwidth didn't go up much vs. the 7970, believe it or not.

Still might respond well to a lil' overclocking though.
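The bandwidth point is easy to sanity-check with simple arithmetic. The figures below (384-bit @ 5.5 Gbps for the 7970, 512-bit @ 5.0 Gbps for the 290X) are the commonly rumored clocks at the time, not official specs, so treat this as a rough sketch:

```python
# Memory bandwidth in GB/s: bytes moved per transfer across the full bus,
# times effective transfers per second per pin (Gbps).
def mem_bandwidth_gbs(bus_width_bits, effective_rate_gbps):
    return bus_width_bits / 8 * effective_rate_gbps

hd7970  = mem_bandwidth_gbs(384, 5.5)  # 264.0 GB/s
r9_290x = mem_bandwidth_gbs(512, 5.0)  # 320.0 GB/s (rumored memory clock)

print(f"7970: {hd7970} GB/s, 290X: {r9_290x} GB/s, "
      f"gain: {r9_290x / hd7970 - 1:.0%}")
```

So the wider bus mostly offsets a lower memory clock, landing at roughly a 20% gain, which matches the "didn't go up much" observation.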
 
That is very good performance and all, but it'd be more interesting if there were some games benchmarked that we know AMD doesn't have an application profile for in CF.

I'd like to see how it performs, or doesn't, in those games.


It's more about the bandwidth than the width of the memory bus. Total memory bandwidth didn't go up much vs. the 7970, believe it or not.

Still might respond well to a lil' overclocking, though.

Guess we will find out soon, but I think the 290/290X will shine at higher resolutions and with MSAA.
 
Depends on how good your eyes are, I guess.

I have a 1440p monitor, and by itself without any FSAA I can hardly see the jaggies most of the time.

Turning FSAA to 2x usually fixes it. To me AF has more of an effect; since high-res textures are everywhere these days, you can't really enjoy them without it, IMO. I try to run it at 8x or higher whenever I can.
 
If I were running triple display I would still try to run 4xMSAA. It makes the games look that much better no matter how many pixels are on the screen.

Also, with a 512-bit memory bus, MSAA and even SSAA at 1440p/1080p might be pretty badass now!

Triple head is a different scenario. In ultra-wide you are still working within the same effective size, only spreading the scene across 3 displays; you are still working at the same vertical resolution, so the aliasing effect is identical. Triple portrait is closer to the real deal, but since you still have the same PPI, edge aliasing is still quite easy to see.

At 4K on a 30" panel the PPI is massively increased, but you're still sitting the same distance from the screen. As such it's pretty hard to notice aliasing beyond extreme cases. You can pretty much replicate this effect by firing up Infinity Blade 3 on an iPhone 5s: there is barely a jaggy in sight unless you really look, simply because the density of the display relative to the viewing distance is so large.
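The pixel-density comparison is easy to quantify. A quick sketch (the monitor diagonals are illustrative sizes; the iPhone 5s result matches its published 326 PPI):

```python
import math

# Pixels per inch from the panel's pixel dimensions and diagonal size.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2560, 1440, 27)))  # ~109 PPI: 27" 1440p monitor
print(round(ppi(3840, 2160, 30)))  # ~147 PPI: 30" 4K monitor
print(round(ppi(1136, 640, 4)))    # ~326 PPI: iPhone 5s
```

At a fixed viewing distance, roughly 35% more pixels per inch on the 4K panel means each jaggy subtends a noticeably smaller angle, which is why edge aliasing gets hard to spot.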
 
My overriding concern is that CF Eyefinity and DX9 CF get fixed. These issues have been outstanding for far too long, and they are a huge disservice to CF users.

I applaud that single-screen resolutions were fixed, but it isn't enough. NVIDIA has "just worked" for some time now. AMD needs to step up and get CF on par with NV's SLI, which has been great for a while, IMHO.

That said, i'm greatly looking forward to the 290X and hope it delivers in the CF department and not just framerates. Eyefinity is important too.
 
Anyone know if you're finally going to be able to run CrossFire in full-screen windowed mode like SLI? If so, I'm totally down for 2 of these!
 
I'd guess not.

I like the scaling, though I'll admit I'm dismayed they still can't maintain a 60 fps minimum at 4K with two 290Xs. I'm curious about [H]'s max playable settings at 4K; hopefully they're not so turned down that it actually looks worse.
 
I like the scaling, though I'll admit I'm dismayed they still can't maintain a 60 fps minimum at 4K with two 290Xs. I'm curious about [H]'s max playable settings at 4K; hopefully they're not so turned down that it actually looks worse.

Considering the R9 290X isn't really next-gen, it really just seems to be what AMD could do given a shot at a super high-end card with their current tech.
4K literally just came out, and on top of that [H]ardOCP's new display is the first I've seen that can do 60 Hz via DisplayPort on one GPU. Given that 4K should become a new standard, the 2014-gen GPUs will be the ones to handle 4K much more easily, maybe even in surround with 3-way SLI and CF.
 
Considering the R9 290X isn't really next-gen, it really just seems to be what AMD could do given a shot at a super high-end card with their current tech.
What qualifies as next-gen, then?

Given that 4k should become a new standard
Do you mean as a commonplace standard? If so, I don't think that'll happen for a long time. 1080p and 1200p still fill that role for most PCs at home or the office. Hell, 1440p only recently became relatively affordable with the $250-300 South Korean monitors that have been popping up. They'll really have to get down to less than $200 from a brand with a US presence (for warranty purposes) before 1440p really takes off, IMO.

Even today's top-end single video cards have a tough time with the latest games at high IQ @ 1440p, often falling below 60 fps, which is the sweet spot IMO.

4K is still years away from becoming anywhere near as affordable most likely. I expect it to stay niche for a long time because of the high monitor costs and high costs of the hardware necessary to play new games at that res. 2x R9 290/x's or 2x Titan/780/Ti's are friggin' expensive.
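The raw pixel counts explain why the hardware costs scale the way they do: 4K pushes 2.25x the pixels of 1440p and 4x those of 1080p, so roughly that much more fill-rate and shading work per frame. A quick check:

```python
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

# Total pixels rendered per frame at each resolution.
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1440p"])  # 2.25
print(pixels["4K"] / pixels["1080p"])  # 4.0
```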
 
4K is still years away from becoming anywhere near as affordable most likely. I expect it to stay niche for a long time because of the high monitor costs and high costs of the hardware necessary to play new games at that res. 2x R9 290/x's or 2x Titan/780/Ti's are friggin' expensive.


I used to own a Yamakasi 1440p, but then I upgraded to three U2412Ms and I've never looked back. You don't need the latest hardware to play the latest games on a high-res setup unless you want to max them out. There are 4K panels that you can buy for cheap, but only at 30 Hz. Watch forums explode when affordable 60 Hz-capable panels hit the market in the coming months. I'm glad that they're showing 4K benches. Why? Because it shows me that the 290X can handle that resolution, which is a big plus for an Eyefinity user like me. :)
 
I don't know why anyone has this stupid idea that maxing a game out is a prerequisite. You can easily lower settings minimally (I'm talking 2-3 settings at most for a lot of games) to make everything workable even at 1440p with one GPU. Furthermore, those 2-3 settings you change don't really change image quality at all. This isn't a hard concept to understand.

Or, if you want to max everything out, good luck with that, because not even a Titan can "max" every game out at 1080p for a constant 60 fps. :rolleyes: It seems that something so blatantly obvious isn't, well, obvious. You don't have to max everything out. If you are, you're spending way more money than you probably should be, unless you just don't give a fuck about dropping more than a grand on multiple GPUs.

1600p is easily manageable with a single high-end GPU using sensible settings that don't lower IQ. 2-3 settings in the most demanding games (and very few games are this demanding) will be enough to make everything playable on a single GPU at 1600p. That said, I still feel CF or SLI will be desirable for 4K, although it's not like all games will be unplayable. You don't need to tunnel-vision and max every fucking game out at 4K, nor at any other resolution.
 
Some of us don't.


That I'm well aware of, Rizen. I've used SLI and CF plenty of times, and have spent that much money myself on such setups.

I'm just saying, there really isn't a need to completely dismiss 4K gaming based on "maxed out" benchmarks, you know? You can easily make it workable with a high-end SLI or CF rig and more sensible settings. I'm not saying a single GPU can do it, but I feel that an SLI setup could do it fine with a few settings lowered here and there. Know what I'm saying? That's the main point I'm getting at: not to merely dismiss 4K because of framerates. All of these benchmarks are maxed out, which is not how the average 4K gamer will play at that res. They could probably lower 2-3 settings in the most demanding games to get acceptable performance.

Does that make sense? I mean, I ran 680 SLI for around a year. I loved it. But I also tested a single card at 1600p and was completely shocked at how well it did, and how lowering 1-2 settings here and there gave dramatic increases in framerate with no IQ decrease. I know a single GPU won't do it at 4K, but I think a few compromises here and there will make it a good gaming experience.
 
3x U2412s are going to run around $900. Add in a couple of R9 290/Xs and you'll be looking to spend another $1,000-1,400 on top of that. All pre-tax, of course.

Even on this site, many if not most people aren't willing to spend that sort of money on a monitor setup like yours. Your situation is a niche of a niche, to put it mildly.

Those "cheap" $700+ 4K "monitors" (TVs, more like it) are fairly crappy right now too. Actual 4K monitors cost around $3,500 right now. They'll come down from that price, but it'll probably take a while for them to get into the generally affordable ~$300 range. It will certainly take longer than a few months.

In the end, high res isn't worth much without the IQ to go along with it. This doesn't mean you need to run 8x MSAA + 16x AF + every other gimmick maxed out, but you will still need top-end hardware to run that res @ 60 fps with some IQ settings at moderate levels.
 
That I'm well aware of, Rizen. I've used SLI and CF plenty of times, and have spent that much money myself on such setups.

I'm just saying, there really isn't a need to completely dismiss 4K gaming based on "maxed out" benchmarks, you know? You can easily make it workable with a high-end SLI or CF rig and more sensible settings. I'm not saying a single GPU can do it, but I feel that an SLI setup could do it fine with a few settings lowered here and there. Know what I'm saying? That's the main point I'm getting at: not to merely dismiss 4K because of framerates. All of these benchmarks are maxed out, which is not how the average 4K gamer will play at that res. They could probably lower 2-3 settings in the most demanding games to get acceptable performance.

Does that make sense? I mean, I ran 680 SLI for around a year. I loved it. But I also tested a single card at 1600p and was completely shocked at how well it did, and how lowering 1-2 settings here and there gave dramatic increases in framerate with no IQ decrease. I know a single GPU won't do it at 4K, but I think a few compromises here and there will make it a good gaming experience.
No, I know exactly what you mean. Turning down a couple of shadow settings and ambient occlusion will usually result in a pretty significant FPS increase without any discernible quality decrease, particularly in fast-paced games where you aren't staring at the scenery.
 
At 4K resolution, there better be some damn fine near-linear scaling.
 
Nice results. Also, I'm amused by people running 4xMSAA at 4K... I can't imagine jaggies are much of a problem at that resolution. Now someone just needs to do some proper frametime/FCAT analysis.

People said the same thing when 1080 came along and then 1440...
 
In my opinion AA comes in handy with motion. Sitting still doesn't net you much in any case, but in motion, especially with fences, the aliasing is very noticeable, no matter the resolution.
 
That is very good performance and all, but it'd be more interesting if there were some games benchmarked that we know AMD doesn't have an application profile for in CF.

I'd like to see how it performs, or doesn't, in those games.


It's more about the bandwidth than the width of the memory bus. Total memory bandwidth didn't go up much vs. the 7970, believe it or not.

Still might respond well to a lil' overclocking, though.

What do you suppose the performance would be like? Are you simply looking for a scenario where crossfire fails?
 
Amazing scaling here.
I wonder how this scaling compares to the 7-series.
Seems as though xfire has been getting better and better in the past year.
 
What do you suppose the performance would be like? Are you simply looking for a scenario where crossfire fails?
In theory CF should be able to work without a driver update or a CAP update. In practice I know that doesn't happen, but it'd be nice to see if that is still true for the R9 290/X, since there appear to have been a lot of changes in how those cards handle CF.
 
My hopes for the R9 290X got shot down since the price SEEMS to be sitting at the $700 level.

I really hope whatever driver AMD releases for the R9 series will have some trickle down positive effects for my 7970 crossfire setup.

I have plenty of horsepower to run my games at 1440p (yay, Korean monitors); I just need the software in order on the driver side to really enjoy it.
 
Any word on whether those numbers make you feel like you have Parkinson's?
 
Many didn't get the news, I guess, but the new 290X doesn't use the bridge-type CrossFire anymore; everything is done over the PCIe 3.0 bus. I would guess this is also a requirement for these numbers. But it does go to show the scaling is pretty nice. This will also very likely remove the frame pacing issues, as AMD has probably been hard at work since this problem gained so much media attention.
 
What do you suppose the performance would be like? Are you simply looking for a scenario where crossfire fails?

Happens to SLI also. It just fails. The game and engine need support to function and scale.

I buy cards based on the games I play, mainly Battlefield and some RPGs, and AMD has just been better there.
 
Your situation is a niche of a niche, to put it mildly.

I don't think of it that way, TBH. There are people with much better setups than mine: people running 4x Titans on a 3240x1920 120 Hz LightBoost setup, people running 7800x2560 on tri 7970s, people that named their case Frank N Stein, etc.

I didn't spend $900 on my U2412Ms; it was around $600, and I spent 3 months lurking for sales, and it's still cheaper than a U3014. For the 2x 290X, I'm selling my old 6970s, a DT 880, a PS3 and games, and other stuff to help pay for it. So it's not that expensive if you're willing to sell your old stuff and wait for sales.

As for 4K panels, Asus has a 39" VA that will be affordable. The reason I said months is that the 4K movement is picking up steam, just like 1080p did back then. The expensive IGZO display that Brent just got, that is the niche of a niche.

I agree that IQ is important and that's why I'm willing to spend money on two 290x.

Back on topic.
[H]opefully we'll see 3- or 4-way CF numbers in the future, because CF scaling is pretty good with 2 cards.

Also, WCCFTech just posted some OC results from Chiphell, and things are heating up! :D
 
What will the difference be between the MSI/Powercolor/Sapphire etc. offerings? They will all be using the same reference cooling, right? So what will the differences, if any, be? Custom PCBs? Different memory modules (Hynix vs Elpida), voltage locked/unlocked?
 
What will the difference be between the MSI/Powercolor/Sapphire etc. offerings? They will all be using the same reference cooling, right? So what will the differences, if any, be? Custom PCBs? Different memory modules (Hynix vs Elpida), voltage locked/unlocked?
How would anyone possibly know this when AMD still hasn't officially released any specs for the 290X, let alone pricing or a release date?

Yes, there will likely be custom PCBs and different offerings, but no one has announced them yet, so anything anyone tells you now is hearsay at best.
 
It's almost like they fixed the age-old issue of being limited to the VRAM of one card to get near-2x scaling.
 
Since when does VRAM limit multi-GPU rendering performance? The cards mirror their assets across both framebuffers.
 