290x creating "Platform Bottleneck"

Dr3amCast

Weaksauce
Joined
Aug 8, 2012
Messages
91
I've been reading various sites' reviews of the R9 290X today, and I've noticed a few of them mention a "platform bottleneck" at more mainstream resolutions, namely 1920x1080, which would explain why the 290X's relative performance was sometimes better at Ultra HD settings. I had never heard of this before and am looking for an explanation of how it actually works.

I've never purchased a really high-end card, but I will be buying a 290X once we see some cards with third-party cooling. I'm going to be playing at 1080p, not 4K or even 2560x1600, but I want to be able to crank up the in-game settings a bit. Because of platform bottlenecking, am I better off going with a slightly less powerful card?
 
Then your best bet is to get a 290 non-X. The 290X is overkill for 1080p.

Hell, even a 280X would fit your needs.

If price isn't an issue you can get a 770, but get the 4GB version.

Either way, a 290/290X is overkill for 1080p IMO.
 
Then your best bet is to get a 290 non-X. The 290X is overkill for 1080p.

Hell, even a 280X would fit your needs.

If price isn't an issue you can get a 770, but get the 4GB version.

Either way, a 290/290X is overkill for 1080p IMO.

agreed here.. +1..
 
I love how people say the R9 290x is overkill for 1080p.

That really depends on the game and how much detail is turned up. Some people also run games in a window, and there are background programs people like to keep open, such as web browsers; Firefox, for one, uses the GPU, which causes a slight performance hit.
 
I love how people say the R9 290x is overkill for 1080p.

That really depends on the game and how much detail is turned up. Some people also run games in a window, and there are background programs people like to keep open, such as web browsers; Firefox, for one, uses the GPU, which causes a slight performance hit.

This.

I'm going non-X 290 for 1080p for this reason.
 
My 7950s in CrossFireX aren't overkill for 1080p. I need a ton more power to maintain 120Hz in games. I still can't turn everything on in Crysis 3 @ 1080p, at least not in multiplayer.
 
I love how people say the R9 290x is overkill for 1080p.

That really depends on the game and how much detail is turned up. Some people also run games in a window, and there are background programs people like to keep open, such as web browsers; Firefox, for one, uses the GPU, which causes a slight performance hit.

I don't see someone buying a 290X for web browsers. I don't think I've ever seen someone ask which high-end $500+ video card is best for surfing the web with Chrome/Firefox/IE. I guess some people just have all the spare money in the world if they're worried about web browsing performance, lol.

Even with Crysis 3 turned up he will be fine. BF4 is one game we can't comment on yet.

But hey different strokes for different folks
 
My 7950s in CrossFireX aren't overkill for 1080p. I need a ton more power to maintain 120Hz in games. I still can't turn everything on in Crysis 3 @ 1080p, at least not in multiplayer.

The OP never mentioned 120Hz; that's a whole different ball game.
 
Overkill for 1080p at 60Hz. Perfect for 1080p at 120Hz, or if you just want a card you can more or less max games out for the next couple years without having to upgrade.
 
I would say it's good if you want to turn all the eye candy up at 1080p 60Hz for a while to come. At 120Hz you'll want two. :)
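Rough math on why 120Hz is so much harder: the per-frame time budget is halved. A minimal sketch (the refresh rates are just the ones discussed in this thread):

```python
# Frame-time budget: the GPU must finish rendering each frame within
# 1000 / refresh_rate milliseconds to sustain that refresh rate.
def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz}Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

So a card that comfortably fits a ~16.7 ms budget at 60Hz has to finish every frame in ~8.3 ms for 120Hz, which is why people talk about needing two cards.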
 
It is not overkill if you play a multitude of games and want high frame rates for a long period of time. The notion of "overkill" being an actual thing is ridiculous.

Now, if you were building a system for a specific purpose, such as maxing out Counter-Strike 1.6, then yes, overkill could come into play. If, however, you play games of today and plan on playing games of tomorrow, then get as much horsepower as you can logically afford every time you upgrade.
 
It's not overkill. For example, some of the ENB mods for Skyrim really put your GPU to the test at any resolution.

With SSAO on, I can barely get 30 fps @ 1080p with my favorite ENB.

Also, a modded GTA V will most likely be coming.
 
DASHlT, you're missing the point, either on purpose or because you misunderstand what I'm trying to say.

The R9 290X is not overkill for 1080p, not by a long shot.

I was speaking to all of the factors: a web browser open behind your game, maybe the game is windowed, maybe you have all the detail turned up. All of these things together really impact the performance of a video card.

People are just dead wrong if they think an R9 290X is overkill for 1080p.
 
Also, you have to think more about the long run. The 290X may be overkill for 1080p now, but what about a year or two from now? Software always catches up.
 
Go look at the TPU benches; the 290X doesn't always get 60 fps at 1080p.

It's not overkill in any sense of the word.

We 1080p people want ALL the eye candy on AND 4x-8x SSAA, HBAO, HD textures, full tessellation, and 16x AF.

My problem is, I don't think it's enough of an upgrade from my 7970 OC to get me all that at even close to 60 fps.
 
This is Hardforum. So CrossFire 290Xs, watercooled and OC'd to the tippy-top clocks, for 1080p, thanks.
 
Why has no one asked what the rest of his specs are? That seems like the real concern here, so it's odd they weren't mentioned in the OP and no one asked.
 
A 290/290X is overkill for 1080p IMO.
Right now, this is true.

A year or two down the road, though, this --is-- going to change. The 290/X might just end up being the bang-for-the-buck champ for quite a while with newer-gen games and console ports.
 
Then your best bet is to get a 290 non-X. The 290X is overkill for 1080p.

Hell, even a 280X would fit your needs.

If price isn't an issue you can get a 770, but get the 4GB version.

Either way, a 290/290X is overkill for 1080p IMO.

Unless you're wanting to future-proof for 1080p with everything maxed out, using a 120Hz/144Hz monitor, and want to use vsync. Then I suppose it's worth it.
 
The idea of a "platform bottleneck" is that the 290X has a TON of architecture in place to help it maintain its performance at huge resolutions. Other cards drop off sharply at higher resolutions because their architectures are designed to render tons of detail and push it out at 1080p. Some improvements to the Hawaii (290X) chip are the memory interface (50% wider than the 7970's) and the doubled ROP count. These are the "big players" when it comes to performance at high resolutions. Basically: modern games don't really take advantage of a huge memory bus at 1080p, and the ROPs are mainly used to write finished pixels to the framebuffer once everything else is rendered, which means lower resolutions leave the ROPs underused much of the time. Other units, like the compute units (GCN cores), are just as important, but their load scales pretty linearly with resolution. Without those "resolution big guns", the GCN cores would sit idle waiting for the ROPs to finish their job.

Basically the 290X is the first "4K card", meaning it was designed from the ground up to render at 4K and beat out the competition. I think that's cool! But I'm more excited about the possibility of 1440p Eyefinity. 1440p monitors are starting to become REALLY affordable (with a warranty), but the graphics power to push these massive resolutions still "isn't there" with last-gen cards.
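To make the scaling concrete, here's a minimal sketch of how the raw pixel write rate grows with resolution. It assumes a single full-screen pass at 4 bytes per pixel (32-bit color) and ignores overdraw, AA, and depth/stencil traffic, so real workloads demand considerably more:

```python
# Pixels per second the ROPs must write for one full-screen pass at 60 fps,
# at the resolutions discussed in this thread.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def pixel_rate(width, height, fps=60):
    """Pixels written per second for one full-screen pass at the given fps."""
    return width * height * fps

for name, (w, h) in RESOLUTIONS.items():
    gpix = pixel_rate(w, h) / 1e9
    gbytes = gpix * 4  # 4 bytes per 32-bit color pixel
    print(f"{name}: {gpix:.2f} Gpix/s, ~{gbytes:.1f} GB/s of color writes at 60 fps")
```

Since 4K is exactly four times the pixels of 1080p, ROP and bandwidth demand quadruples while shader load grows roughly in proportion, which is why extra ROPs and a wider bus pay off mostly at high resolutions.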
 
Overkill is an overused term. If you're using it for web browsing, email, and Flash-based games, then OK, it's overkill at 1080p. For gaming, sure, for many games today it probably is, but not for all games. And how long will you keep it? Next-gen consoles will be here in less than a month, which means next-gen games will follow right along. I personally would have no reservations pairing a 1080p display with a 290X.
 
This is [H]ardforum. Nothing is overkill for anything. I've been running a GTX 590 at 1080p for years and I plan to upgrade soon. Hell, I plan to slide in SLI 780s, CrossFire 290Xs, or a 7990 and crank that shit all the way up. AT 1080P.

Go [H]ard or go home.
 
You know what they call a cow when it's been "overkilled"?

Medium rare.

Games are the same. "Overkill"? That just means "don't have to worry".
 
This is [H]ardforum. Nothing is overkill for anything. I've been running a GTX 590 at 1080p for years and I plan to upgrade soon. Hell, I plan to slide in SLI 780s, CrossFire 290Xs, or a 7990 and crank that shit all the way up. AT 1080P.

Go [H]ard or go home.
You better have some magical CPU from the future.

And still we have no specs listed here, which again I think is the point of whether it's overkill or not.
 
You better have some magical CPU from the future.

And still we have no specs listed here, which again I think is the point of whether it's overkill or not.

Thanks for the input everyone!
My specs are:

FX-8350 @ 4.53GHz
1TB WD Caviar Black
16GB (2x8GB) Ballistix Sport 1600MHz
1100W ABS Majesty PSU

I'm currently gaming on a 60Hz TV, but will probably be getting a new monitor or possibly a 120Hz TV on Black Friday or around Christmas.
 
The idea of a "platform bottleneck" is that the 290X has a TON of architecture in place to help it maintain its performance at huge resolutions. Other cards drop off sharply at higher resolutions because their architectures are designed to render tons of detail and push it out at 1080p. Some improvements to the Hawaii (290X) chip are the memory interface (50% wider than the 7970's) and the doubled ROP count. These are the "big players" when it comes to performance at high resolutions. Basically: modern games don't really take advantage of a huge memory bus at 1080p, and the ROPs are mainly used to write finished pixels to the framebuffer once everything else is rendered, which means lower resolutions leave the ROPs underused much of the time. Other units, like the compute units (GCN cores), are just as important, but their load scales pretty linearly with resolution. Without those "resolution big guns", the GCN cores would sit idle waiting for the ROPs to finish their job.

Basically the 290X is the first "4K card", meaning it was designed from the ground up to render at 4K and beat out the competition. I think that's cool! But I'm more excited about the possibility of 1440p Eyefinity. 1440p monitors are starting to become REALLY affordable (with a warranty), but the graphics power to push these massive resolutions still "isn't there" with last-gen cards.


Thanks for the explanation. Yeah, I hadn't heard of that until now, but it makes sense.
 
Then your best bet is to get a 290 non-X. The 290X is overkill for 1080p.

Hell, even a 280X would fit your needs.

If price isn't an issue you can get a 770, but get the 4GB version.

Either way, a 290/290X is overkill for 1080p IMO.

4GB on any 680/770 is useless. The card's bandwidth is the limit, not its memory.

In my book the 280X is a much better buy, but that's just me.
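For reference, the bandwidth point can be checked with simple arithmetic: peak memory bandwidth is just bus width times effective data rate. A quick sketch (the bus widths and GDDR5 data rates below are the commonly published reference-card figures, rounded; treat them as assumptions):

```python
# Peak memory bandwidth in GB/s:
# bus width (bits) / 8 bits-per-byte * effective data rate (GT/s)
def bandwidth_gbs(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

cards = {
    "GTX 770 (256-bit, 7.0 GT/s)": (256, 7.0),
    "R9 280X (384-bit, 6.0 GT/s)": (384, 6.0),
    "R9 290X (512-bit, 5.0 GT/s)": (512, 5.0),
}
for name, (bits, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bits, rate):.0f} GB/s")
```

Note how the 290X gets the highest bandwidth from the widest bus despite the slowest memory clock; that's the kind of design choice that pays off at high resolutions.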
 
Some improvements to the Hawaii (290X) chip are the memory interface (50% wider than the 7970's) and the doubled ROP count. These are the "big players" when it comes to performance at high resolutions. Basically: modern games don't really take advantage of a huge memory bus at 1080p, and the ROPs are mainly used to write finished pixels to the framebuffer once everything else is rendered, which means lower resolutions leave the ROPs underused much of the time. Other units, like the compute units (GCN cores), are just as important, but their load scales pretty linearly with resolution. Without those "resolution big guns", the GCN cores would sit idle waiting for the ROPs to finish their job.

You are wrong.

(benchmark charts attached)
 
4GB on any 680/770 is useless. The card's bandwidth is the limit, not its memory.

In my book the 280X is a much better buy, but that's just me.

The only reason I said to get a 4GB model is that future games are going to start using more than 2GB.

Look at the BF4 benchmarks. Also, the recommended specs for BF4 call for 3GB.

It's just going to be the norm in the future.
 
The only reason I said to get a 4GB model is that future games are going to start using more than 2GB.

Look at the BF4 benchmarks. Also, the recommended specs for BF4 call for 3GB.

It's just going to be the norm in the future.

+1 on this.

I'm sick of people talking about how a 770 isn't powerful enough to utilize >2GB of VRAM.

Check out the BF4 beta benchmarks online. 1080p 60Hz on Ultra was utilizing >2GB of VRAM nearly 100% of the time. Remember, the BF4 beta did not have the full eye candy enabled either.

I'm going for either a 280X or a 770 4GB if they drop $100 in price by EOY (need dat PhysX).
 
Thanks for the input everyone!
My specs are:

FX-8350 @ 4.53GHz
1TB WD Caviar Black
16GB (2x8GB) Ballistix Sport 1600MHz
1100W ABS Majesty PSU

I'm currently gaming on a 60Hz TV, but will probably be getting a new monitor or possibly a 120Hz TV on Black Friday or around Christmas.
I would just get a 280X. A 120Hz TV isn't going to let you really see more than 60 FPS anyway, since its input will only accept 60Hz. The 280X just makes more sense for you.
 