In the market for a new monitor! GSYNC vs 21:9?

PanzeR-
Howdy! I currently own a single Korean Shimian 1440p monitor. The colors are apparently very good (I'm partly colorblind), but the darks aren't super inky; nothing that bugs me during gameplay, though. I mainly use my PC for gaming and browsing; I watch all my movies on my TV.

I am buying my new monitor today or this week.
I'm currently considering two different routes:
a 29-34 inch 21:9 monitor with decently low input lag for games
a 27 inch 1440p G-Sync monitor (like the Dell S2716DG currently on sale)

I don't care much between TN and IPS.
Current budget is $700 CAD max.
Thanks for the suggestions!
 
About 5 inches. ;)

The UM58 is 29"; the UM68 is 34".

(The UM68 is $330 here in the US; I paid almost $600 for my UM65 three years ago.)
 
I was going to say that, for me, adaptive sync (in my case, G-Sync) is a requirement for a 34" 21:9 display, because a resolution like that is exactly what makes adaptive sync a necessity.

That being said, I have challenged my own thinking to a degree. I have been OK all these years without adaptive sync. Perhaps if I simply apply enough video card horsepower, the greater resolution will be fine and I can save some cash. I already have one 1070; a second 1070 and a non-adaptive-sync widescreen can still be cheaper than a 34" G-Sync display.

Anyway, those are my thoughts on it; maybe they will help you decide.
 
Silly Nvidia and their G-Sync tax.

Buy an HP Omen 32" ($300 over BF) with FreeSync, then buy a Fury card ($200 over BF). Sell your 1070 for $350.

Total cost to upgrade to an adaptive-sync premium display experience is $150-$200 (roughly the $300 monitor plus the $200 card, minus the ~$350 back for the 1070) instead of a $500 minimum.

Or buy three for a truly ultrawide experience.

(Photos of the triple-monitor setup attached.)
 
Oh hell NO.

There was a time when GeForce cards were the shit but ran hot and noisy, and ATI came out with the very first Radeon cards and blew Nvidia away. It was obvious at that point that Radeon was the better-engineered product.

The same is true today of the Nvidia 10-series cards. They are the better-engineered solution, and if it costs more for better equipment, I am comfortable with that. I wouldn't touch the current AMD cards at any price, because I no longer spend new money on outdated, over-stretched technology.

I spend my new money on the better solution and am happy for it.

The 1070 I bought was the best solution available for what would meet my projected needs. I do not mind paying more for better.
 

Umm... You're advocating running three 2560x1440 monitors (7680x1440) on cards with 4GB of VRAM. What? This is terrible advice. There are already many titles that won't run well at 4K with less than 6GB due to texture swapping... and you want to go bigger with less? I'm not saying the Fury X is a bad card, but it's a very bad choice above 2560x1440.
 

Disagreed

http://www.tweaktown.com/tweakipedi...re-triple-4k-eyefinity-11-520x2160/index.html

I am running half the pixels of that review with three 1440p displays on a pair of Fury X cards.
 

They're running "Medium" and "Normal" presets. Bump that up to High or Ultra in Tomb Raider, or try the same with The Witcher 3 or GTA V, and it will be unplayable due to texture swapping. EDIT: My fault here; texture swapping will not be an issue per the benches there, but performance will still be poor. See my discussion of The Witcher 3 performance in more detail below for an example.

The same will be true with many racing and simulation titles. Yes, you can get the Fury X to work at that resolution... but not with high-quality textures. Clearly you're willing to accept that trade-off, but I have a very hard time suggesting someone spend $400 on GPUs (which are no longer that cheap) and $900 on displays (also not that cheap now) to run games on medium settings, even if it is over a large area at a high resolution. As an enthusiast I am /never/ aiming for Medium. Hell, there are some things my heavily OCed 980 Ti SLI setup can barely hold a 60 FPS minimum in, even on a single 3440x1440 display with maxed settings.


To illustrate the difference in load between ultra settings and medium in The Witcher 3, I just did a quick test.

With everything (literally every slider) maxed at 3440x1440, vsync off, no FPS cap, I was averaging 60 FPS, sometimes jumping up to at or near 70, but often dipping into the high 50s -- 57-59.

Turning settings down to Medium via the game's built-in presets, still with vsync off and no FPS cap, I was sitting at 89-90 and am fairly sure I was CPU-limited.

This isn't a very thorough test, just running around in a small area near water and a few buildings near the start of the game, but I think it makes my point fairly well. If my system can't even hold 60 FPS with settings maxed at 3440x1440, the Fury X setup, which is slightly slower due largely to lower OC headroom, is going to really suffer at 7680x1440. The same will be true in most other triple-A titles with similarly high settings.

This isn't even taking the potential for running out of VRAM into consideration; I realize most titles don't go over 4GB even at 4K, but then again, you're advocating substantially more pixels: 4K is 8.2944 million pixels, while 3x1440p is 11.0592 million, a third more. Some games like Far Cry 4, Shadow of Mordor, and GTA V will exceed 4GB with maxed settings at 4K, so they'll be well over at triple 1440p. Yes, AMD's memory management on the Fury series combined with its extremely fast HBM will help relative to other cards with only 4GB in the same situation, but it will still really hurt performance.

EDIT2: Retrying this test with 2.25x DSR (the closest I could get to 11 million pixels, at 11.1456 million), I was averaging 37 FPS in that area and never saw above ~40 with all settings maxed. The enhanced visual quality wasn't at all worth dropping 23 FPS on average -- nearly 40%. It softened the image somewhat, and that was about it.
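
If anyone wants to sanity-check the pixel math above, here's a quick throwaway Python snippet (purely illustrative; the resolutions and frame rates are just the ones discussed in this post):

# Pixel-count comparison for the resolutions discussed above.
resolutions = {
    "4K UHD (3840x2160)": 3840 * 2160,                   # 8,294,400  (~8.29 MP)
    "Triple 1440p (7680x1440)": 7680 * 1440,              # 11,059,200 (~11.06 MP)
    "2.25x DSR of 3440x1440 (5160x2160)": 5160 * 2160,    # 11,145,600 (~11.15 MP)
}
baseline = resolutions["4K UHD (3840x2160)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.4f} MP ({pixels / baseline:.2f}x of 4K)")

# FPS cost of the 2.25x DSR test above: a 60 FPS average dropped to 37.
print(f"Average FPS drop: {(60 - 37) / 60:.0%}")   # ~38%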
 
I can play Battlefront at Ultra settings with the internal resolution slider at 75% on a single Fury X at 60-75 FPS at 7680x1440. I know because I've been doing it.

My second Fury X card is being shipped back because it came defective.

I haven't found a single game in my library that I can't get at least 60 FPS in at max settings at 2560x1440 on my Fury X. Not one.

If I have to turn off AA and turn settings from Ultra to High to have three 1440p screens at 7680x1440, I'll still be having a fantastic, screen-tear-free, and relatively inexpensive experience. Just throwing out another option. 4GB of HBM has not revealed itself to be a real-world limitation for me so far.
 

If you want to max everything at that res with MSAA on top, i.e. the point where memory might start becoming an issue, I'd think you'd need more than a single 1070 anyway to keep the frame rates high.

With post-processing "fake" AA becoming more and more prominent, this doesn't feel like as relevant a use case anymore either.
 

I just spent the last two hours playing Shadow of Mordor at High settings at 7772x1440 (bezel correction applied) on a SINGLE Fury X card. I didn't try Ultra. This is the first time I've ever played the game. I turned off AA to start, because at 1440p I don't think it's really needed, and why take a performance hit for nothing? Unless I'm running a low resolution I typically don't even run AA; I think it makes the screen look slightly blurred, like a non-native resolution, on both AMD and Nvidia cards.

Anyway -- no AA, and High selected on everything else.

Max FPS in the benchmark was 52, the average was 43, and the minimum was 35. I didn't see any frame tearing with FreeSync on in two hours of playtime. The experience was phenomenal. It didn't even seem too slow or clunky -- EVER. FreeSync really works wonders (as G-Sync does, I'd assume). A lot of the time my framerate was bumped up against the game's 60Hz/60 FPS cap with a single card. It felt completely playable and enjoyable, and that's on one single Fury X card (my other is still out for RMA).

The things you are warning me that I'll experience, and that my card can't handle, I'm telling you I haven't experienced. I have the three displays and the card and am using and enjoying them firsthand. You are theorycrafting and bench racing, but you've missed the mark with your misgivings, and you're condemning a fantastic experience.

Before this, I was thinking this new 81"-wide Eyefinity display was sort of ridiculous and I wasn't sure I would keep it, because you honestly can't see the periphery on a display this wide -- it just isn't possible -- but after playing Shadow of Mordor and discovering a program called Flawless Widescreen, which lets you keep GUI elements on the center screen and choose your resolution, these three monitors are staying. Even if only a percentage of games work this well in Eyefinity, it's just flat-out incredible when they do. I'm about a week into the three displays and this is the best experience yet.

I guarantee nobody is going to complain about High vs. Ultra settings when weighed against three monitors vs. one -- if Ultra is even a problem (mind you, this is on a single Fury X card).


(Photos of the Eyefinity setup attached.)
 
I'm not theorycrafting; I actually ran similar resolutions (via DSR) and showed the difference in load between Medium and Ultra settings at those resolutions in that title. The fact that you come back with "contradictory" numbers for another title is irrelevant.

If you're fine with 34 FPS, we're clearly looking for very different things. I can hardly stand anything below 60 FPS, not because of tearing (which is all FreeSync/G-Sync eliminate) but because of the increased input lag and the lack of fluidity. It's a huge part of why I only buy exclusives like Bloodborne on console -- 30 FPS feels like slogging through thick mud to me.
 
Well, I just put everything on Ultra (no AA) in Mordor, and yes, it's too slow there with a single Fury X at 7772x1440. It plays a bit clunky at Ultra.

Benchmark results:
32 FPS average
38 FPS max
22 FPS min

No way the slight uptick in quality is worth that frame rate hit with a single card. I'd agree that didn't feel smooth, but the first run at High felt very smooth.
 

I don't have Witcher III yet. I plan to pick it up in the next month or so. I'll give you some real world feedback then.
 
… the Dell S2716DG is on sale? Where?!
 
Go with the Dell S2716DG. Don't bother with FreeSync, the HP Omen, or other exotic resolutions. I'm not sure whether I prefer my Dell or my Asus. Dell has an excellent warranty and RMA policy. I replaced mine since I didn't like the AG coating - A00 revision.
 
Finally received my Dell S2716DG. It took about an hour to get working, since my GTX 1080 was acting strangely over DisplayPort. I had to reset my motherboard's BIOS to defaults, and then the monitor worked right off the bat. I set everything back to what it was in the BIOS afterwards and it still works.
I'm partly colorblind, but sitting right next to my Shimian 27" IPS, I find this panel comparable, to the point that the difference is noticeable but not bothersome. The default settings of this monitor are way too bright, so I lowered them using a guide on YouTube.
Coming from a 60Hz IPS panel, G-Sync is incredible. In the few games I tried, it felt nearly surreal. World of Warcraft and The Witcher 3 feel like totally new games. While I never noticed my 40ms ping in WoW, I do now, lol.
Unless a better must-have feature comes out, I can't see myself using a non-G-Sync/FreeSync high-refresh-rate monitor for gaming.
 