NVIDIA Big Format Gaming Display

It's lock-in to Nvidia's platform.

Nvidia's market share is at all-time highs.

G-Sync keeps it that way.

Yeah, it's a two-way street.

People buy G-Sync over FreeSync (or, in the future, HDMI VRR) because Nvidia has the only top-end GPUs right now.

AMD may catch up GPU-wise at some point in the future, but by the time they do, people will already be locked into Nvidia because they own G-Sync monitors.

People may upgrade GPUs as often as every six months to a year in some cases, but they tend to keep their screens much longer, and once you have a G-Sync screen you are unlikely to buy anything but Nvidia.

It's really shitty of them, and it's why I've always liked AMD better. They always seem willing to do the right thing and pursue and use open standards. It's too bad they can't bring home the bacon these days when it comes to pure pixel-pushing power.
 
Even if the open standards are better on paper, the thing is Nvidia forced a quality variable refresh rate technology to market with G-SYNC (in 2014-2015) while everyone else sat on mostly stagnant monitor tech (other than higher refresh rates) for years. Yes, you had/have to pay a premium for that, but FreeSync was slow to respond and was always playing catch-up, with arguably lower performance and quality in the real world. And by the time you could finally buy a FreeSync monitor truly comparable, apples to apples, to an Nvidia-vetted G-Sync monitor, the price of those highest-quality FreeSync displays was probably as high, or nearly so.

Let's talk about V-Sync, Free-Sync, G-Sync, Adaptive-Sync and Fast-Sync. self.buildapc REDDIT

Dissecting G-Sync and FreeSync - How the Technologies Differ | PC Perspective

But what happens with this FreeSync monitor and a theoretical G-Sync monitor below the window? AMD's implementation means that you get the option of disabling or enabling VSync. For the 34UM67, as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.
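A minimal sketch of that "below the window" redraw logic, for anyone who wants to picture it. This is not Nvidia's actual firmware; the integer-multiplier rule and the ~48 Hz target are assumptions chosen only to reproduce the three measurements quoted above.

```python
def gsync_low_fps_refresh(fps, panel_min_hz=30, panel_max_hz=144, target_hz=48):
    """Sketch of the 'below the window' behaviour described above -- NOT the
    actual G-Sync firmware.  Each frame is redrawn an integer number of times
    so the physical refresh stays above the panel's flicker threshold.
    target_hz=48 is an assumption picked to reproduce the quoted measurements
    (29 FPS -> 58 Hz, 25 FPS -> 50 Hz, 14 FPS -> 56 Hz)."""
    if fps >= panel_min_hz:
        return min(fps, panel_max_hz)   # inside the VRR window: refresh tracks frame rate
    m = 1
    while fps * m < target_hz and fps * (m + 1) <= panel_max_hz:
        m += 1                          # draw each frame 2x, 3x, 4x ... as needed
    return fps * m

for fps in (29, 25, 14):
    print(f"{fps} FPS -> panel refreshes at {gsync_low_fps_refresh(fps)} Hz")
```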

Nvidia's CUDA was also supported in more of the standard graphics apps than AMD's OpenCL-based acceleration, AFAIK, like the Adobe apps... at least sooner, and with those developers prioritizing CUDA.

The Shield adds some cost, but it's a nice device, and this is a double-duty FALD HDR multimedia device as well as a gaming device. The Shield has an integrated Plex server and/or can connect to other Plex servers (and runs a lot of Android apps and games... Kodi, HBO, Netflix 4K, Amazon Prime Video, etc.). It also has one of the faster processors for a streaming device, so it runs YouTube beautifully. Even for gaming, the Shield can stream games over Ethernet (whether that is good enough is debatable) and run simpler Android games locally. Personally I prefer the modular approach, and I already own a Shield, however.
 
It all hinges on AU Optronics and their panels.

If people end up playing panel roulette on what will surely be expensive beasts like these, good luck. We need to get past that, especially for these types of products.
 
Here is a breakdown from the vid...

Pros:
- no burn-in concerns or limitations
- 1000 nit color brightness in HDR movies and games looked amazing to him
- no glow around letterboxing, even in a completely dark room at night
- no glow around small white game bitmap objects floating around (snowflakes on a black background in an indie game, tested around 16:26)
- lower input lag than TVs
- no pixel response time / overdrive issues
- G-Sync: has G-Sync, and G-Sync still has some advantages over FreeSync
- high refresh rate

Cons:
- Some vertical banding/lines, possibly from the local dimming zone grid, when viewed up very close in certain games. (He had his nose touching the screen.)
- Glow around the mute icon during HDR movie playback, which means there is likely glow around closed captions, at least in HDR. That's what you'd expect on a FALD, but:
.... odd, since the small bitmap snowflakes on a black background he showed didn't have glow haloing around them in games (though that was SDR);
..... no mention of glow haloing in Shadow of the Tomb Raider in HDR, or in any of the other bitmap games or HDR movies he ran.

Misc:
- 4K 120 Hz DisplayPort limitations (chroma subsampling, 8-bit, or 98 Hz); see the bandwidth sketch after this list
- size/viewing distance outside of living-room setups
- living-room setups require a Couchmaster or other lapdesk-type setup, preferably with long wired peripherals vs. input lag
- requires a powerful gaming PC and GPU for 4K in the living room
- doesn't like the stand aesthetic compared to his other TV
- whining that his Logitech remote doesn't already have support for the BFGD
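On that first item, here is a rough link-budget sketch of why 4K 120 Hz over DisplayPort 1.4 forces chroma subsampling, 8-bit, or a drop to roughly 98 Hz. The ~5% blanking overhead is an assumed figure; real CVT-RB timings will shift the numbers slightly.

```python
# Rough DisplayPort 1.4 (HBR3) link budget for 4K high refresh -- back of the
# envelope only; the 5% reduced-blanking overhead is an assumed figure.
DP14_PAYLOAD_GBPS = 25.92        # 4 lanes x 8.1 Gbit/s minus 8b/10b overhead
BLANKING = 1.05                  # assumed CVT-RB style blanking overhead

def needed_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * BLANKING * bits_per_channel * channels / 1e9

for label, bpc, hz in (("10-bit RGB @ 120 Hz", 10, 120),
                       (" 8-bit RGB @ 120 Hz",  8, 120),
                       ("10-bit RGB @  98 Hz", 10,  98)):
    need = needed_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"{label}: ~{need:.1f} Gbit/s -> {verdict} in {DP14_PAYLOAD_GBPS} Gbit/s")
```

HDMI 2.1's FRL link carries roughly 42 Gbit/s of payload out of the 48 Gbit/s line rate, so full 4K 120 Hz 10-bit RGB fits with room to spare, which is why the DisplayPort workarounds on these displays look like stopgaps.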



 


Interesting OLED gaming display impressions video. His distance evaluation is pretty much what I've been saying... 3.8' to 4' away on a 55" for things like Anno/RTS and first-person games, and you can sit closer for things like driving games. That's where a huge monitor arm would come in handy (or a second desk space like I have, which he actually mentions as an option to make it much more usable).
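For what it's worth, here is a quick geometry sketch of what those distances work out to on a 55" 16:9 4K panel. Pure trigonometry; the distances themselves are just his suggestion.

```python
import math

# Quick geometry check for a 55" 16:9 4K panel at the distances mentioned above.
def screen_width_in(diagonal_in, aspect_w=16, aspect_h=9):
    return diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)

def horizontal_fov_deg(width_in, distance_in):
    return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

width = screen_width_in(55)                  # ~48" wide panel
for feet in (3.8, 4.0, 8.0):                 # desk distances vs. a typical couch
    fov = horizontal_fov_deg(width, feet * 12)
    ppd = 3840 / fov                         # average pixels per degree of view
    print(f"{feet} ft: ~{fov:.0f} deg horizontal FOV, ~{ppd:.0f} px/deg")
```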

He did a lot of evaluation at first just on how the size/resolution affects gaming, and was not directly comparing the OLED to the FALD. Then he hit the major drawback: no HDR. For that kind of money, heading toward Christmas 2019, not having HDR on such an expensive monitor is a huge mistake, and he relegated it to something of a second monitor in a spare space, which this flagship pricing is not really for. I think he was just being kind at that point.


Once Nvidia has GPUs with HDMI 2.1 outputs, all of these monitors cashing in on 98 Hz - 120 Hz 4K over DisplayPort at highly inflated prices are going to be obsolete. If you are strategically planning how to spend $2k to $5k on a monitor or TV with any forward thinking, you really shouldn't buy anything without HDMI 2.1 (48 Gbps), full eARC uncompressed audio capability, and a good HDR implementation (preferably a 1000 nit color brightness ceiling or higher). If you want to blow your money for a year, I guess these are for you, but it's a lot of money for hobbled tech.
 
I watched his video as well. To be honest I wasn't that impressed. For the price they are asking, I expected better. I'm glad manufacturers are starting to think about the gaming community, but apparently they intend to give us more "lesser quality" features before giving us the whole nine yards.
 
I watched his video as well. To be honest I wasn't that impressed. For the price they are asking, I expected better. I'm glad manufacturers are starting to think about the gaming community, but apparently they intend to give us more "lesser quality" features before giving us the whole nine yards.

They aren't holding back to get you to rebuy. They're just rushing products with whatever significant improvements they have out the door to be first to market. It's wayyyyy more difficult to develop and ship something with those features than people think.
 
They are slapping DisplayPort onto TVs for 98 Hz - 120 Hz 4K and charging extreme prices because Nvidia isn't releasing any GPUs with HDMI 2.1 on them. As for the Dell OLED, they aren't willing to use 800 nit peaks with ABL kicking it down to 600 nit on a display marketed as a monitor, because the burn-in risk (and the ABL fluctuation) is too great, so they stuck with a hard 400 nit SDR color ceiling to keep the OLEDs on "simmer".
 
Big gaming monitors like these should be considered DOA at this point, since it is highly likely that 2020 will bring HDMI 2.1 TVs across every brand as well as HDMI 2.1 video cards to drive them. Had these things come out a year earlier, they might have had a chance.
 
They are slapping DisplayPort onto TVs for 98 Hz - 120 Hz 4K and charging extreme prices because Nvidia isn't releasing any GPUs with HDMI 2.1 on them. As for the Dell OLED, they aren't willing to use 800 nit peaks with ABL kicking it down to 600 nit on a display marketed as a monitor, because the burn-in risk (and the ABL fluctuation) is too great, so they stuck with a hard 400 nit SDR color ceiling to keep the OLEDs on "simmer".

I believe that's EXACTLY why they did it... there's no doubt we'd be seeing a lot of returns on these in the long term otherwise, and they're covering themselves by limiting the brightness. There's no other reason to do it that I can think of. It wouldn't actually be so bad if the price reflected this, though... but it obviously doesn't.

The crazy thing is, if they HAD brought the price down and been able to undercut the LG C9, these would literally fly off the shelves, but I'm guessing they aren't actually making that many in the first place, hence the price. There isn't much of a market for 55" desktop monitors, and I'm sure they know this. It's really just a showboating halo product rather than something 99.9% of consumers actually want.
 
I'm guessing they aren't actually making that many in the first place, hence the price. There isn't much of a market for 55" desktop monitors, and I'm sure they know this. It's really just a showboating halo product rather than something 99.9% of consumers actually want.
If only they'd be willing to do this with 43" versions too. :whistle:
 
They are slapping DisplayPort onto TVs for 98 Hz - 120 Hz 4K and charging extreme prices because Nvidia isn't releasing any GPUs with HDMI 2.1 on them. As for the Dell OLED, they aren't willing to use 800 nit peaks with ABL kicking it down to 600 nit on a display marketed as a monitor, because the burn-in risk (and the ABL fluctuation) is too great, so they stuck with a hard 400 nit SDR color ceiling to keep the OLEDs on "simmer".
Neither have AMD or Intel, yet. The certification program for HDMI 2.1 didn't even get into full swing until earlier this year.
 
Neither have AMD or Intel, yet. The certification program for HDMI 2.1 didn't even get into full swing until earlier this year.
Definitely NOT enough time to certify Navi or Super, right? :rolleyes:

Or am I wrong… did they actually not have enough time to do it?
 