Will my 980 do OK at 2560x1600?

Snowy2012

Hi. My rig is a 5820K OC'd to 4.2GHz, with 16GB RAM, a 1TB SSD and a GTX 980 (EVGA SC).

At the moment I game at 1920x1200 at ultra/max settings with great FPS, but I've been offered a new monitor that runs at 2560x1600 (16:10). I just wondered what will happen to my FPS if I move to this resolution (or even go higher, to 4K on a different monitor).

Any help much appreciated.

Current games are Project CARS, Alien: Isolation, Tomb Raider and Dragon Age: Inquisition.
 
I would say half the FPS, in my limited experience. But you can lower AA.
Try DSR through the NVIDIA software to see for yourself what kind of performance you will get.
 
I think 2560x1440/1600 is the sweet spot for a GTX 980 in 90% of games, but the fact that it only has 4GB of VRAM will sadly make it obsolete soon, even for 1920x1080. As mentioned, though, the best way to test without having the monitor is with DSR.
 
I couldn't really see a huge difference in FPS between 1920x1080 and 3???x1800 using VSR on my 290. I guess it depends on horsepower, although the higher resolution needed no AA to match the visual quality of the lower one.
 
It's basically the same: 1920x1080 + medium/high AA vs. a high VSR/DSR resolution without another AA method. VSR/DSR are derivatives of supersampling/downsampling, so if you add high levels of MSAA/SSAA to a game at 1920x1080, you can in fact be running something heavier than the same game at a high resolution, and with worse image quality: 4x SSAA is basically the same as running the game at 4K, but heavier and worse-looking. That's why you don't see a huge difference between 1080p + AA and VSR. The nature of the 290/290X also helps them work better at higher resolutions; with the newer NVIDIA cards, no matter how much UberUltraSuperMegaGrande texture/color compression is involved, the narrower memory bus will always have a negative performance impact as the resolution goes up.
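To put rough numbers on that 4x SSAA claim, here's a minimal sample-count comparison (purely illustrative; it deliberately ignores bandwidth, compression and resolve cost, which is exactly where the cards differ):

```python
# Shaded samples per frame: 1080p with 4x SSAA vs. rendering natively at UHD.
def shaded_samples(width, height, ssaa=1):
    return width * height * ssaa

print(f"1080p + 4x SSAA: {shaded_samples(1920, 1080, ssaa=4):,}")  # 8,294,400
print(f"native UHD:      {shaded_samples(3840, 2160):,}")          # 8,294,400
```

Both paths shade exactly 8,294,400 samples per frame, which is the sense in which 4x SSAA at 1080p "is" 4K; the SSAA path then averages that extra detail away in the resolve.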
 
So are you saying, Araxie, that I will notice a real-world FPS difference if I move to the higher-resolution monitor, or that I won't?
 
Yes, of course you will notice an FPS difference (a drop) from 1200p to 1600p. That's roughly a 75% increase in pixel count, from 2.3MP to 4.1MP, so you will probably get more or less half the FPS you have now at 1920x1200, depending on the settings.
 
Assuming the cost per pixel drawn stays exactly the same, 1600p has 78% more pixels than 1200p.

If you get 100 FPS at 1200p, you would get around 56 FPS at 1600p, assuming no other bottlenecks.
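A minimal sketch of that estimate, assuming performance scales purely with pixel count (real games rarely do exactly):

```python
# Naive pixel-bound FPS estimate: scale current FPS by the pixel-count ratio.
def estimated_fps(fps_now, old_res, new_res):
    old_px = old_res[0] * old_res[1]
    new_px = new_res[0] * new_res[1]
    return fps_now * old_px / new_px

print(round(estimated_fps(100, (1920, 1200), (2560, 1600)), 1))  # 56.2
print(round(estimated_fps(100, (1920, 1200), (3840, 2160)), 1))  # 27.8 (UHD "4K")
```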

However, this is just a very basic calculation; you will need to find the settings you can accept at 1200p, then do the same at 1600p, and see whether you can live with the performance or IQ loss.

The only thing DSR/VSR can't tell you about is the effect of pixel density. Because going from 1200p to 1600p is a non-whole-number scale, you may find that you need to smooth out some blur to get acceptable IQ, and it may also affect how AA actually looks on a real screen.

My own personal experience is with 970 SLI (I have used both a single card and both cards) at 1440p; a 980 at 1600p should perform roughly like a 970 at 1440p. I find that more recent games, such as SoM, AC:U, etc., may require a second card to run smoothly at native resolution. SoM was an exception in that, despite the game handily capping the VRAM on my 970s, it still runs relatively well at the highest details; AC:U, however, definitely struggles even with two 970s. I have not tried GTA V, as my current interest has been more retro gaming than recent AAA titles, and a 970 is perfectly viable at 1440p for such use (ME3, for example, I can downsample from 5K and it's still playable much of the time, though there are definitely some hiccups).

However, I will say that the existence of DSR/VSR is the primary reason I personally prefer a lower-resolution screen with a high refresh rate over a high-resolution screen with a low refresh rate. DSR'ing 4K onto a 1080p screen has zero blur, and you can eliminate much of the blur for any resolution in between with DSR's built-in smoothing (no idea whether VSR has a similar feature). When upscaling a 1080p image to a 4K screen you are at the mercy of the monitor's internal scaler (AFAIK, most monitors internally scale even whole-number upscales, making 1080p blurry on 4K even when it shouldn't be), and you are SOL for pretty much anything in between.
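The blur point boils down to whether the linear scale factor between render and screen resolution is a whole number; a quick check (hypothetical helper, not part of any driver API):

```python
# A downsample is "clean" when each screen pixel maps to an exact NxN block
# of rendered pixels, i.e. the linear scale factor is an integer.
def scale_factor(render, screen):
    fx, fy = render[0] / screen[0], render[1] / screen[1]
    return fx, fy, fx == fy and fx.is_integer()

print(scale_factor((3840, 2160), (1920, 1080)))  # (2.0, 2.0, True)   -> no blur
print(scale_factor((2560, 1600), (1920, 1200)))  # (~1.33, ~1.33, False) -> needs smoothing
```

That integer 2.0x case is exactly the DSR-4K-on-1080p scenario above; everything in between needs the smoothing filter.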
 
Once you go 1600p you'll wonder how you ever lived without it. Go for the new monitor. 2560x1600 is a world of fun. :)
 
Gimme 24" 144hz G-Sync 4k monitor with single GPU powerful enough to drive 4k at 60fps, and I'll bite :p
 
How many FPS do you want? If that is a commercial 60Hz LCD, you would be fine. If it's a FW900 at 72Hz, it would be fine in most things. But if it's a custom-made 120Hz LCD, you will need another 980.
 
I was looking at either an HP ZR30w or a Dell U3014, both second hand but in perfect condition.

The other one I was looking at (brand new) is a BenQ BL3200PT.

I currently have a Dell 2709W.
 
Why not get a FW900? The FW900 supports 3840x2400@96Hz interlaced. It also supports 2560x1600 at 144Hz interlaced and 1920x1200@96Hz progressive. And because it's a CRT, you can always switch resolutions without having to deal with scaling issues: if your GPU can't handle 3840x2400 at 48 FPS and you have to drop to 2560x1600 or 1920x1200, the resolution will still be native.
 
I am running a similar rig with a 290X at 1170/5500 (my 5820K is at 4.5) and had no trouble at 2560x1600 (ZR30w). It runs pretty solid at 3440x1440 as well. I am going to try to OC a GTX 980 I just ordered to see if it is a noticeable improvement over the 290X. I have turned up most settings on my 3440x1440 display and DA:I looks and runs really well. GTA V is really solid most of the time, but stutters a little bit sometimes. I still have the 2560x1600 display sitting on the floor, so I may hook it up to test the 980; if I do, I will post the results. I may not keep the 980, but I got a good price on it, so if the OC makes a difference vs. the 290X I will probably skip the next generation. If not, 980 Ti or 390X here I come. I'd bet you will be quite happy with the 980 on the 30", since it usually tops the 290X, and I was really happy with mine on my 30", but what I enjoy may not work for you. Best of luck.
 
I decided to go with the U3014, and ordered a second EVGA GTX 980 SC anyway just to make sure :) Hopefully the two 980s in SLI will keep me going for a while!
 
Nah, 3840x2400 is one of the true 4Ks and is one of the only acceptable upgrades from 2560x1600.

While it is a good upgrade (provided a small enough screen size, so that you're not just getting size bloat instead of an appropriately scaled resolution increase), 3840x2400 is not "true 4K". It's not even an official resolution, and the FW900 he mentioned is 2304x1440 maximum; he was suggesting an interlaced resolution. The closest official resolution is UHD at 3840x2160, but the whole point of the "4K" moniker was being over 4000 pixels wide: the native DCI 4K resolution is 4096x2160.
 
I agree with you for the most part, but 3840x2400 had a monitor before 3840x2160 was a thing. It's an LCD to boot, although it requires 2 to 4 connections to run depending on how it's configured/modded: the IBM T221, I believe, and its derivatives.

@OP, you're going to be fine. That second card may be unnecessary, but now you'll have more power to play with, especially if you have to have max settings.
 
The U3014 is a good choice; not as good as the FW900, but still very nice! It's one of the few displays that are still in a correct aspect ratio.
 
Thanks for all the advice and tips guys :)

Everything is installed and running super smooth.

The U3014 has one dead pixel, unfortunately. It was second hand, and the seller has kindly given me the option of returning it.

If I do, then my only options really are to purchase a new U3014, which will be about double the price (I paid £500 for this one), or something like the BenQ BL3200PT for about £500.

Thing is, I could probably just manage with a 16:9 aspect ratio instead of 16:10, but can anyone tell me if a 32" 16:9 screen is physically bigger (viewable area) than a 30" 16:10 screen? Just wondering.
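For reference, viewable area follows directly from the diagonal and aspect ratio; a quick back-of-the-envelope sketch (illustrative only):

```python
import math

# Area of a panel, in square inches, from its diagonal (inches) and aspect ratio.
def panel_area(diagonal, aspect_w, aspect_h):
    a = aspect_w / aspect_h
    height = diagonal / math.sqrt(a * a + 1)  # d^2 = (a*h)^2 + h^2
    return a * height * height               # width * height

print(round(panel_area(32, 16, 9)))   # ~438 sq in for the 32" 16:9
print(round(panel_area(30, 16, 10)))  # ~404 sq in for the 30" 16:10
```

By that math the 32" 16:9 panel is physically larger overall, though it's shorter relative to its width than the 16:10.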
 
Why not get a FW900?

Yup, just get it.. oh, where..?? Nowhere, that is... or in 1996.. now where did I put my quantum physics books again..
 
Maybe I missed something here... but to my knowledge the GDM-FW900 will not support 3840x2400@96Hz interlaced. We torture-tested one unit in the lab, and it achieved 2560x1600, but only at 60Hz interlaced. Anything above that timing, it bombed...

UV!
 
How dead are we talking here? Jet black on a white background? That's how mine were; really bothersome, and worse than the stuck pixels IMO, depending on where they are on the display.

My Catleap has a couple. Those just look off-white, and aren't noticeable unless you're really looking for them on a white background.
 
I just made the jump from CrossFire 290Xs to a single Titan X for my 2560x1600 monitor myself. It's been pretty smooth with pretty much all the games I'm playing, with everything at maximum settings. If you can wait a few months, go for the 980 Ti over the 980. But I also hear people who buy custom/non-reference 980s have a good experience at 2560x1600.

I am not going back to multi-GPU gaming anymore. 2014 was by far the worst year for multi-GPU gaming for me: it either took forever for a game to get patched to make multi-GPU playable, or it just never worked on day one. I am staying with a single-GPU setup for the time being. Just throwing my two cents in to give you an assessment of what to expect for your future upgrade. Take care and good luck!
 
There are a few off-white pixels and they aren't noticeable, like you say, banabooey. But there is one jet-black one right in the middle horizontally and a few centimetres from the top. Really annoying.

The cards I ended up with, Mr mean, are EVGA 980 Superclockeds, and they seem great.
 
I have been very happy at 1600p with a single 980 overclocked to 1.5GHz.
This is coming from a vanilla Titan SLI setup.
 
We torture-tested one unit in the lab, and it achieved 2560x1600, but only at 60Hz interlaced.

Are you sure interlacing was on? This really sounds like interlacing wasn't on: a FW900 should handle 2560x1600 at 144Hz interlaced or 72Hz progressive.

I don't own a FW900, so I can't test extreme resolutions, but I'm currently running my CRT at 2304x1728i@120Hz, and when I get a GPU with a stronger DAC it should be able to go up to 2560x1920i@120Hz or 2304x1728i@140Hz. And my CRT only has a 4kHz higher scanrate than the FW900. What happened when you tried higher resolutions?

I've been considering a FW900, but if it really can only handle those low resolutions and refresh rates, I'll just stick with my current monitor. I don't want to halve my resolution in exchange for a wider aspect ratio and a larger screen.

EDIT: Just tried 2560x1920i@120Hz with tighter blanks to reduce the required bandwidth. It worked fine.
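For context, whether a mode like that fits is a pixel-clock question; here's a rough estimator (the blanking overheads are my assumptions, and real GTF/CVT timings differ):

```python
# Rough pixel-clock estimate for a CRT mode. Interlaced modes scan half the
# lines per field, halving the required clock. Blanking overheads are guesses.
def pixel_clock_mhz(width, height, refresh_hz, interlaced=False):
    h_total = width * 1.25            # assume ~25% horizontal blanking
    v_total = height * 1.05           # assume ~5% vertical blanking
    lines = v_total / 2 if interlaced else v_total
    return h_total * lines * refresh_hz / 1e6

print(round(pixel_clock_mhz(2560, 1920, 120, interlaced=True)))  # ~387 MHz
print(round(pixel_clock_mhz(2560, 1600, 144, interlaced=True)))  # ~387 MHz
```

Both modes land just under the ~400MHz RAMDAC ceiling typical of cards from that era, which is why tightening the blanking intervals helps.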
 
I run my Dell 3014 on a GTX 670, so yeah, I think a 980 would be swell.

I'm thinking of upgrading to an AMD 3xx-series card when it comes out, so I can do PLP with my couple of Dell 20" 1600x1200 monitors flanking it in portrait on either side.

The AMD 285X is the only option for new titles in PLP at this time, but it isn't fast enough as a single card to drive 4960x1600.
 