Still Impossible to Play 4K

Trackr

[H]ard|Gawd
Joined
Feb 10, 2011
Messages
1,786
I have GTX 660 SLi, which is about equivalent to a GTX 970, and I can barely play Dragon Age Inquisition, Crysis 3, etc. on High settings at my 2560x1600.

If I were to get GTX 970 SLi, I could possibly play on High and maybe a few settings on Ultra, but that's it.

So, I'd have to get GTX 970 Triple SLi in order to play on Ultra smoothly (at 60 fps) on my 2560x1600.

Meaning that if I had a 4k monitor, I'd literally be screwed.

Which has me asking - is there anything upcoming that could solve this? I heard that the 16nm GTX 1000 series won't be ready until 2016...

So, how is anyone buying a 4k monitor these days?
 

pandora's box

Supreme [H]ardness
Joined
Sep 7, 2004
Messages
4,806
Those of us with ultra-high-resolution displays are turning down image quality to maintain acceptable fps. I'm doing just fine on a single 970 (at a 1500MHz core, mind you) at 3440x1440 in just about all games. Running most games at the "High" preset with either no AA or SMAA.
 

gungravevn

n00b
Joined
Nov 15, 2014
Messages
43
The problem is the memory needed to run those high-resolution screens. Your GTX 660 SLI only has 2GB, so running it at 4K wouldn't be optimal at all. GTX 970 SLI doesn't do 4K justice either. I'm currently running 2x R9 290X and a 4K TV but can barely play any game at ultra-high settings at all. So I'm returning the 4K next week and sticking with dual 1440p monitors for now.
 

xorbe

Supreme [H]ardness
Joined
Sep 26, 2008
Messages
6,029
Which games are actually more enjoyable with 3840x2160? Flight sims, and ...
 

chenw

2[H]4U
Joined
Oct 26, 2014
Messages
3,977
I tend to prefer 'faking' 4K (DSR'd 4K) on a 144Hz panel over a real 4K at 60Hz. Then again, I have never seen a 4K monitor, so I can't comment on how its pixel density would affect that opinion.
 

rennyf77

2[H]4U
Joined
Feb 12, 2007
Messages
2,865
Your GTX 660s in SLI mean you're still dealing with 2GB of RAM, while a single GTX 9xx or R9 29-series card will have 4GB at its disposal. Heck, even a well-overclocked single GTX 78x card with 3GB can run max quality Crysis 3 at 25x16 with no AA and average 50 fps. When you start climbing up in resolution with actual games, you begin to realize that Firestrike or Heaven numbers mean squat.

It's actually flight sims, and... everything. And I don't mean on some piddly 20-something-inch 4K monitor; I mean on a 30" or larger. 'Everything' just pops. I still have my 25x16 monitor and I simply don't enjoy gaming on it over my 4K.
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,060
Your GTX 660s in SLI mean you're still dealing with 2GB of RAM, while a single GTX 9xx or R9 29-series card will have 4GB at its disposal. Heck, even a well-overclocked single GTX 78x card with 3GB can run max quality Crysis 3 at 25x16 with no AA and average 50 fps. When you start climbing up in resolution with actual games, you begin to realize that Firestrike or Heaven numbers mean squat.

It's actually flight sims, and... everything. And I don't mean on some piddly 20-something-inch 4K monitor; I mean on a 30" or larger. 'Everything' just pops. I still have my 25x16 monitor and I simply don't enjoy gaming on it over my 4K.

Bingo. 2GB is just not enough to run 4K textures in this game. Try turning texture settings down to medium and see what happens.
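
To put rough, back-of-the-envelope numbers on that: the sketch below only counts the resolution-sized render targets, with an assumed buffer count and format (not any particular engine); high-res textures eat far more VRAM on top of this, which is why 2GB disappears so fast.

```python
# Rough, illustrative math only: how much VRAM the resolution-dependent
# render targets alone want. The buffer count and format are assumptions
# (4 RGBA8-sized targets), not taken from any real engine.

def render_target_mb(width, height, bytes_per_pixel=4, num_targets=4):
    """Approximate memory for color/depth/G-buffer style render targets."""
    return width * height * bytes_per_pixel * num_targets / (1024 ** 2)

for name, (w, h) in {"1920x1080": (1920, 1080),
                     "2560x1600": (2560, 1600),
                     "3840x2160": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB in render targets")

# 3840x2160 has about 2x the pixels of 2560x1600, so every resolution-sized
# buffer roughly doubles; a 2GB card that was already tight at 25x16 with
# high-res textures simply has nowhere left to go at 4K.
```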
 

atrance5

2[H]4U
Joined
Oct 3, 2008
Messages
2,919
Those of us with ultra-high-resolution displays are turning down image quality to maintain acceptable fps. I'm doing just fine on a single 970 (at a 1500MHz core, mind you) at 3440x1440 in just about all games. Running most games at the "High" preset with either no AA or SMAA.

Pretty big trade-off. Higher res, lower image quality. You'd get a better experience on a 60-80" TV or 1600p monitor.
 

samuelmorris

Supreme [H]ardness
Joined
Dec 20, 2010
Messages
5,506
I would just like to point out, as a 4K gamer, the main reason I bought my monitor was for the desktop work experience at 4K - you will probably find a lot of people did the same - there's no 4K content in the TV/film industry yet so that's irrelevant, which leaves gaming and general desktop use.
Nonetheless, I do play games successfully on mine. Battlefield 4 runs fine on the single GTX970 at 4K, obviously only with medium detail but it works. I've gone from two HD6970s to this, which in real terms isn't much of an increase, but the video memory increase really does show - plenty of titles that didn't really work at all on the 6970s now work fine.

You can also get away without AA a bit easier at 4K - not because the resolution is higher (as people used to continually tell me when I ran 2560x1600) but because now, finally, the DPI is higher as well, which is what makes the difference. Jaggies are still jaggies and I'd rather not have them, but they're far less an issue on a 31.5" 4K screen, let alone something smaller. You'd be surprised just how good a well-polished title will look at this resolution, even on high a lot of the time. It really has to be seen to be believed.
Whether 2560x1600 Ultra is preferable to 3840x2160 'whatever you can get away with' detail depends on the title - if you play lots of ultra-demanding games and aren't willing to put up with the flaws of a top-end multi-GPU system or just can't justify the cost (a valid argument I think now that there are some cheap 4K screens out there) then perhaps you will be forced to use fairly low detail levels.
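
For reference, here's the quick pixels-per-inch math behind that DPI point; the panel diagonals are illustrative assumptions (a 30" 2560x1600 screen versus a 31.5" 4K screen like the one mentioned above).

```python
# Quick PPI comparison; the 30" and 31.5" diagonals are assumed sizes for
# a 2560x1600 panel vs a 4K panel.
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from pixel dimensions and the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'30"   2560x1600: {ppi(2560, 1600, 30.0):.0f} PPI')
print(f'31.5" 3840x2160: {ppi(3840, 2160, 31.5):.0f} PPI')

# ~101 PPI vs ~140 PPI: each pixel (and therefore each jaggy) is physically
# much smaller at 4K even on a slightly larger screen, which is why missing
# AA matters less there.
```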
 

Runt.

Gawd
Joined
Aug 18, 2011
Messages
618
I have a 4K monitor but with G-Sync. I am only using a GTX 760 2GB card as well. It plays smoothly for the games I play lately (Batman: Arkham Asylum, Hearthstone, Heroes of the Storm, Sniper Elite V2) on High to Ultra settings with no AA and V-sync off. I get around ~45fps in these games. I even tried Titanfall and it ran smoothly as well. If you get a 4K monitor with a low- to mid-end GPU without G-Sync it's going to suck, but with G-Sync it helps a lot.
 

GoldenTiger

Fully [H]
Joined
Dec 2, 2004
Messages
26,073
Not at all impossible to play at 4k. And as far as games looking better, EVERYTHING does except perhaps 30-year-old DOS games ;).

I'll quote myself from a 1920x1200 vs. 2560x1600 thread way back when... two major quotes:

GoldenTiger years ago said:
Yep, higher resolution gives you more pixels for a given screen area representing in-game objects/textures/etc. Say you're at a door that takes up the center bottom sixth of your screen. At 1920x1080, you might have roughly 800 pixels across and 500 high representing that door. The texture on the door has to be scaled by the game engine to fit in those dimensions. If the source texture is larger, it gets scaled down, thus reducing its visible quality. If you were on a monitor with twice the total pixel count and the same aspect ratio, you'd have a lot more of the texture visible in the same overall viewport area. This happens dynamically and scales as you move around and the door takes a different amount of screen space.

The same concept applies to geometry and the effect it has, especially on object edges in regards to antialiasing. If you have an edge that is horizontal at a slight angle downward left to right, think of looking through a grid (graph paper) where each square is representing a pixel. Think about how you could try to draw the line between its start and endpoints. You'll probably realize the problem here already: you don't have enough squares vertically to make it not come out looking very jagged and rough, and it doesn't look like a straight, smooth line like you see in real life on an edge. Screen resolution acts like this, a good analogy being a "screen door" that you are looking through to the virtual world, where each square of the screen was one pixel. The more resolution you have, the more pixels you have to represent a given object taking the same amount of viewport space, and thus smoother-looking less-jaggy edges (not perfect ever though even at super-high resolution) and much more of a texture represented in comparison to its source.

As well as...

GoldenTiger even more years ago said:
If your monitor does not have enough resolution to render the texture at a pixel-perfect mapping, you will benefit by having a higher resolution, especially when it's the same dot-pitch (space between pixels). If you have a 512x512 texture but can only see 256x256 of it due to the lower resolution of a hypothetical screen, then move to a screen where you can see all 512x512 pixels of it, yet it's still the same dot pitch, you are getting an immense boost in actual detail.

And add in that these displays (4K, 3840x2160) are higher pixel density, so you're getting tighter detail that's outright sharper as well!
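
To make the door example concrete with some toy numbers (the viewport fractions below are illustrative assumptions, not measured from any game):

```python
# Toy numbers for the door example above: how many pixels draw an object that
# occupies the same fraction of the viewport at each resolution. The 1/3-wide
# by 0.46-high fractions are illustrative assumptions.

RESOLUTIONS = {"1920x1080": (1920, 1080),
               "2560x1440": (2560, 1440),
               "3840x2160": (3840, 2160)}

frac_w, frac_h = 1 / 3, 0.46   # the door's share of the viewport

for name, (w, h) in RESOLUTIONS.items():
    px_w, px_h = int(w * frac_w), int(h * frac_h)
    print(f"{name}: door drawn with {px_w} x {px_h} = {px_w * px_h:,} pixels")

# 1920x1080 gives ~640 x 496 (~0.3M pixels); 3840x2160 gives ~1280 x 993
# (~1.3M pixels). Four times as many pixels for the same object means a large
# source texture gets downscaled far less, and edges across it look smoother.
```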

Regarding performance, my two GTX 970s in SLI at quiet (small air noise only) fan speeds for 24/7 gaming @ 1506MHz core / 7806MHz memory (1.231V vGPU, so they run cool too!) have handled every game I've thrown at them in 4K, just dropping down AA settings (in some cases no MSAA). It looks worlds better than even my 2560x1440 display did with MSAA, let alone an older 1080p (1920x1080) display. Occasionally I need to *GASP!!! BLASPHEMY!!!!* turn down one setting, to maintain 50-60fps minimum framerates, that you can only spot in still screenshots zoomed to double magnification or more and would never find in-game even if you looked for it.

Long story short: 4K is completely, 100% able to be run today with high-end video card setups, and it is a whole 'nother level of graphical fidelity beyond even 2560-based resolutions. Compared to 1920x1080 resolution it is beyond night and day, just absolutely jawdropping. Running my Acer B326HK 32" IPS 4K 60hz monitor is a beautiful dream, honestly, in image quality in all regards for gaming and productivity. And the desktop workspace is awesome, too.

Considering that a decade ago a $2,500 total system build, without any accessories such as monitor, speakers, or keyboard/mouse, wasn't really that out there for a high-end setup at all, being able to get a 4K IPS 32" monitor for $675-700 plus a pair of high-end GTX 970-class video cards (or R9 290X CrossFire if you prefer) and build a system well under that total figure, everything included, really puts in perspective how comparatively inexpensive it is.
 

wabbitseason

[H]ard|Gawd
Joined
Jun 16, 2010
Messages
1,511
I tend to prefer 'faking' 4K (DSR'd 4K) on a 144Hz panel over a real 4K at 60Hz. Then again, I have never seen a 4K monitor, so I can't comment on how its pixel density would affect that opinion.

DSR doesn't improve sharpness. All it does is remove aliasing, through the horrendously inefficient process known as SSAA. DSR is SSAA at the driver level, not some mystical force that enables 4K sharpness on 1080p screens. It's marketing speak for giving you yet more of the same, and you're eating it up talking about "4K quality on your 144Hz 1080p display".
 

xorbe

Supreme [H]ardness
Joined
Sep 26, 2008
Messages
6,029
Don't be naive. It's incredible how much difference it makes for mid-far objects. Fine detail is preserved in distant foliage etc. Any game that properly supports 4K is more enjoyable at 4k, if you have the power to render it.

It's from experience; I guess it just didn't click with me. I downgraded from 2560x1600 to 1920x1200 because of the lower fps and how little was gained (in my opinion, obviously others disagree). Though with a faster card now, I am enjoying 2560 DSR'd to 1920, but I like it for the true full-screen AA, not the draw distance.
 

Summoner

[H]ard|Gawd
Joined
Sep 6, 2003
Messages
1,456
I can't say I have an issue playing games at 4K with my GTX 970 *shrug*. Really enjoying DA:I at the moment :)
 

Neon01

[H]ard|Gawd
Joined
Jan 22, 2008
Messages
1,041
I have GTX 660 SLi, which is about equivalent to a GTX 970, and I can barely play Dragon Age Inquisition, Crysis 3, etc. on High settings at my 2560x1600.

If I were to get GTX 970 SLi, I could possibly play on High and maybe a few settings on Ultra, but that's it.

This is incorrect. I have a SLI 780 setup and I can play DA:I with all settings maxed except no MSAA. At those settings using my 21:9 monitor (3440x1440 - about 20% more pixels than standard 1600p) I get a near-constant 60fps. And my 780s are slightly slower than the SLI 970 setup would be.

I experimented with 4k in DA:I using my setup and it was indeed too much for my cards to handle at the (mostly maxed) settings I like to maintain. However, with a SLI 980 setup and very modest sacrifices in quality (mainly AA), even modern challenging titles like DA:I are possible at 60fps and UHD res.
 

wonderfield

Supreme [H]ardness
Joined
Dec 11, 2011
Messages
7,396
DSR doesn't improve sharpness. All it does is remove aliasing, through the horrendously inefficient process known as SSAA.
Not quite. SSAA uses multi-point sampling, while DSR internally renders at the higher resolution and downscales (which is what some believe SSAA does). This is why DSR sometimes plays merry hob with UI elements and their scaling.
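
For anyone curious, the "render high, then filter down" idea looks roughly like the sketch below; it uses a plain box filter for clarity, and NVIDIA's actual filtering (the adjustable "smoothness" setting) and UI handling are not modeled.

```python
# A minimal sketch of the "render at a higher resolution, then filter down"
# idea behind DSR, using a simple box filter. Not NVIDIA's real filter.
import numpy as np

def downscale_box(image, factor):
    """Average each factor x factor block of the high-res frame into one
    output pixel (an ordered-grid supersampling resolve)."""
    h, w, c = image.shape
    assert h % factor == 0 and w % factor == 0
    return image.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

hi_res = np.random.rand(2160, 3840, 3)   # stand-in for a 3840x2160 frame
lo_res = downscale_box(hi_res, 2)        # resolves to a 1080x1920x3 output
print(lo_res.shape)
```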
 

Starrbuck

2[H]4U
Joined
Jun 12, 2005
Messages
2,869
No issues with 4K on my SLI GTX 970 setup. I can't quite run it at full Ultra, but in games like Battlefield 4, enemies at a far distance, for example, are so much sharper it's amazing.
 

sonsonate

Guest
This is incorrect. I have a SLI 780 setup and I can play DA:I with all settings maxed except no MSAA. At those settings using my 21:9 monitor (3440x1440 - about 20% more pixels than standard 1600p) I get a near-constant 60fps. And my 780s are slightly slower than the SLI 970 setup would be.

I experimented with 4k in DA:I using my setup and it was indeed too much for my cards to handle at the (mostly maxed) settings I like to maintain. However, with a SLI 980 setup and very modest sacrifices in quality (mainly AA), even modern challenging titles like DA:I are possible at 60fps and UHD res.

I will second that. Without MSAA, DA ran really well -- 50-60FPS on Ultra -- with SLI'd 780s. My only issue was flickering textures, like I had in BF4...
 

wabbitseason

[H]ard|Gawd
Joined
Jun 16, 2010
Messages
1,511
Not quite. SSAA uses multi-point sampling, while DSR internally renders at the higher resolution and downscales (which is what some believe SSAA does). This is why DSR sometimes plays merry hob with UI elements and their scaling.

True and a welcome clarification. The point still stands: DSR does absolutely nothing to improve image sharpness. It does not create DPI. It removes aliasing.
 

wonderfield

Supreme [H]ardness
Joined
Dec 11, 2011
Messages
7,396
It doesn't increase the monitor's pixel density, no, obviously, but I'm not so sure it does nothing to increase image sharpness. I haven't really looked very closely at what they're doing with filtering during downsampling. At low levels, the image could very well appear sharper than it would at native resolution due to 'insufficient' filtering.
 

GoldenTiger

Fully [H]
Joined
Dec 2, 2004
Messages
26,073
I tend to prefer 'faking' 4K (DSR'd 4K) on a 144Hz panel over a real 4K at 60Hz. Then again, I have never seen a 4K monitor, so I can't comment on how its pixel density would affect that opinion.

If you've never seen a 4K display, how can you say you prefer 1080p to one? :confused: Also, while DSR does provide a nice filtering and antialiasing boost, it is hardly letting you "fake 4K"; it just looks better than raw 1080p.
 

chenw

2[H]4U
Joined
Oct 26, 2014
Messages
3,977
Because 4K monitors are still stuck at 60Hz, while lower-resolution monitors can do up to 144Hz. Sure, it doesn't help much when hardly any GPU can play games at 4K at 60fps, but down the line, when GPUs are powerful enough to do 4K at higher than 60fps, I can at least use DSR to bring up the texture detail while enjoying greater than 60fps, without having to buy a whole new monitor.

Assuming that 1) GPUs still support DP then, and 2) my monitor survives until then. If either doesn't happen, that plan of mine gets several spanners thrown in the works.
 

Rurki

Limp Gawd
Joined
Dec 18, 2006
Messages
150
I am currently running an Acer 4K G-Sync monitor with 970 SLI. I run Call of Duty: Advanced Warfare at a constant 58fps without issue.

Settings are at
High (haven't tried ultra yet)
3840x2160 16:9
SMAA T2x
Refresh 60Hz

I found that running at a constant 60fps caused micro stutters with G-Sync enabled. Then I came across a forum post that referenced RivaTuner Statistics Server, and used it to lock the application (COD) to 58fps so it never pushes G-Sync to the 60Hz ceiling; all the stuttering went away.
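
For what it's worth, the idea behind that 58fps cap can be sketched like this; it's a generic sleep-based limiter for illustration only, not how RivaTuner Statistics Server actually hooks the game.

```python
# Toy illustration of the frame-cap idea: hold frame time to ~1/58 s so the
# GPU never runs right up against the 60Hz G-Sync ceiling.
import time

FPS_CAP = 58
FRAME_BUDGET = 1.0 / FPS_CAP          # ~17.2 ms per frame

def run_capped(render_frame, num_frames=300):
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                 # stand-in for the game's render/present
        next_deadline += FRAME_BUDGET
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)      # wait out the rest of the frame budget
        else:
            next_deadline = time.perf_counter()  # GPU-bound: don't bank debt

# run_capped(lambda: None)
```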
 