NVIDIA Big Format Gaming Display

Sancus

[H]ard|Gawd
Joined
Jun 1, 2013
Messages
1,043
I think DisplayPort 1.4 on current NVIDIA GPUs can do 4K at 120 Hz without any problems, so why are we waiting for HDMI 2.1 to do the same job DisplayPort 1.4 already does?

Because TV manufacturers have no interest in putting DisplayPort on their products.
 

criccio

Fully Equipped
Joined
Mar 26, 2008
Messages
13,724
I think DisplayPort 1.4 on current NVIDIA GPUs can do 4K at 120 Hz without any problems, so why are we waiting for HDMI 2.1 to do the same job DisplayPort 1.4 already does?

In Linus' coverage, he shows that the new displays in question are indeed using DisplayPort.
 

Sancus

[H]ard|Gawd
Joined
Jun 1, 2013
Messages
1,043
Uh, how about the TVs this thread is about?

Note previous discussion about how this isn't a TV, it's a 65" monitor. It won't include a tuner (the thing that makes a TV a TV, legally) and is made by Acer/Asus/HP using an AUO panel, not by any of the TV manufacturers.

So far no TV manufacturer has displayed interest in doing something like this nor in adding DP to any of their TVs.

Also, we're making the wild assumption this will actually come out on schedule. If it's anything like the 27/34" HDR 4K 120 Hz products, it will be delayed indefinitely and AUO will have all kinds of problems actually making usable panels.
 

mms

n00b
Joined
Apr 7, 2017
Messages
52
Because TV manufacturers have no interest in putting DisplayPort on their products.
But this isn't a TV, it's a giant PC monitor, so I think DisplayPort 1.4 will be enough to run 4K at 120 Hz, at least for me, without needing HDMI 2.1.
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,563
But this isn't a TV, it's a giant PC monitor, so I think DisplayPort 1.4 will be enough to run 4K at 120 Hz, at least for me, without needing HDMI 2.1.

Yes, of course. G-Sync only works over DisplayPort.
 
Joined
Nov 18, 2011
Messages
621
Considering that many people get their media via the Internet and not tuners nowadays, I don't think it's unfair to call it a TV.

I like the idea on the surface. I really like the idea of a low-latency gaming display. The built-in Shield is great!

But I have an issue:

Something this big will likely be used in home theaters. Receivers don't support DisplayPort. G-Sync does not work if you connect a second HDMI "display" to a receiver for audio. Any other method of connection loses quality, or isn't available on most receivers.

We need NVIDIA to put an audio-only HDMI output port of some kind on their video cards, or find a solution that gives us the same quality without losing G-Sync.

Maybe the display could take audio in via DisplayPort and output it via HDMI to the receiver without a quality issue or losing G-Sync.

Truth be told though, with HDMI 2.1 on the horizon, and 120 Hz OLED HDR panels coming soon, I'll probably take a wait-and-see approach.

Oh, and to those who said 120 Hz isn't needed: if you're not using G-Sync or something similar, 120 is the only refresh rate that divides evenly into 24, 30, and 60. Only 48 fps content would be an issue.
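A quick way to sanity-check that divisibility claim (my own illustration, not from the post):

```python
# 24, 30, and 60 fps all divide evenly into 120 Hz, so each source frame
# is held for a whole number of refreshes; 48 fps is the odd one out.
for fps in (24, 30, 48, 60):
    repeats = 120 / fps
    marker = "(even)" if repeats.is_integer() else "(judder!)"
    print(f"{fps} fps -> {repeats} repeats per frame {marker}")
```

Running it shows 5.0, 4.0, and 2.0 repeats for 24/30/60 fps content, but 2.5 for 48 fps, which is exactly where the uneven cadence would come from.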
 

AdamK47

Limp Gawd
Joined
Feb 7, 2004
Messages
310
You could use the optical out on the motherboard for audio input on the receiver using TOSLINK. Then plug the displayport cable directly into the TV.
 

sharknice

2[H]4U
Joined
Nov 12, 2012
Messages
2,381
Considering that many people get their media via the Internet and not tuners nowadays, I don't think it's unfair to call it a TV.

I like the idea on the surface. I really like the idea of a low-latency gaming display. The built-in Shield is great!

But I have an issue:

Something this big will likely be used in home theaters. Receivers don't support DisplayPort. G-Sync does not work if you connect a second HDMI "display" to a receiver for audio. Any other method of connection loses quality, or isn't available on most receivers.

We need NVIDIA to put an audio-only HDMI output port of some kind on their video cards, or find a solution that gives us the same quality without losing G-Sync.

Maybe the display could take audio in via DisplayPort and output it via HDMI to the receiver without a quality issue or losing G-Sync.

Truth be told though, with HDMI 2.1 on the horizon, and 120 Hz OLED HDR panels coming soon, I'll probably take a wait-and-see approach.

Oh, and to those who said 120 Hz isn't needed: if you're not using G-Sync or something similar, 120 is the only refresh rate that divides evenly into 24, 30, and 60. Only 48 fps content would be an issue.


They just need to support ARC (audio return channel). It's basically two-way HDMI communication, and pretty much every new receiver and TV has it.

I can watch Netflix, Amazon Prime, etc. on my LG smart TV and it outputs surround sound to my receiver, and it can even turn the receiver on and off and control the volume. I have my receiver hidden away and it doesn't matter, because the remote doesn't need to communicate directly with it.

ARC seems like something they could add to Windows in software, and it would allow an audio-only signal so you don't have to do the stupid two-screen trick to get surround sound.
 

Sancus

[H]ard|Gawd
Joined
Jun 1, 2013
Messages
1,043
Something this big will likely be used in home theaters. Receivers don't support DisplayPort. G-Sync does not work if you connect a second HDMI "display" to a receiver for audio. Any other method of connection loses quality, or isn't available on most receivers.

We need NVIDIA to put an audio-only HDMI output port of some kind on their video cards, or find a solution that gives us the same quality without losing G-Sync.

G-Sync works fine with multiple displays connected to your computer. The display doesn't need the audio if you have a receiver, so you can do PC -> display over DP, and then HDMI -> receiver.

In fact, this is how my home setup works: I use HDMI to the receiver for audio and have a 27" 144 Hz G-Sync desktop display via DP. The annoying part is that you end up with a '2nd monitor' you don't want to use, because there's no way to turn off the HDMI video and just use it for audio without external hardware, unfortunately. Sure would be nice if Nvidia provided a software solution for this.
 
Joined
Nov 18, 2011
Messages
621
G-Sync works fine with multiple displays connected to your computer. The display doesn't need the audio if you have a receiver, so you can do PC -> display over DP, and then HDMI -> receiver.

In fact, this is how my home setup works: I use HDMI to the receiver for audio and have a 27" 144 Hz G-Sync desktop display via DP. The annoying part is that you end up with a '2nd monitor' you don't want to use, because there's no way to turn off the HDMI video and just use it for audio without external hardware, unfortunately. Sure would be nice if Nvidia provided a software solution for this.


Really? When I contacted NVIDIA and EVGA they both said GSYNC would be disabled.

I'm going to have to try this.
 
Joined
Nov 18, 2011
Messages
621
You could use the optical out on the motherboard for audio input on the receiver using TOSLINK. Then plug the displayport cable directly into the TV.


Only if you cap yourself to 5.1 at a limited bitrate. It's still better than nothing though.
 

Armenius

Fully [H]
Joined
Jan 28, 2014
Messages
24,754
Some 4K TVs can do 120 Hz native at 1080p though, in case anyone wasn't aware. I'll say again that unless you are getting at least a 100 fps average or so, high Hz is pretty meaningless. So it would depend on how demanding a game is and what settings you dial in versus your GPU power, if you hope to get a 100 fps/Hz average or more at 4K in some games.

About living room PC comfort: I highly recommend the Couchmaster that my gf got me for Xmas this year. With the right couch/cushion setup to support your head and neck, and even feet, it's like you are practically floating.




Doesn't matter with G-Sync.
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,563
Basically, in 2019 you will have to pit a 4K OLED with 120 Hz and amazing picture quality (and a more reasonable 55" size) but some input lag and no variable refresh [NVIDIA] against a FALD VA with far inferior picture quality but G-Sync and virtually no input lag. I guess it depends on your priorities.
 

gan7114

Limp Gawd
Joined
Dec 14, 2012
Messages
275
Basically, in 2019 you will have to pit a 4K OLED with 120 Hz and amazing picture quality (and a more reasonable 55" size) but some input lag and no variable refresh [NVIDIA] against a FALD VA with far inferior picture quality but G-Sync and virtually no input lag. I guess it depends on your priorities.

Would easily take the former over G-Sync. Personally speaking, anyway.

I hope great strides are made during 2018 on solutions for OLED image retention and burn-in on the desktop. Not sure how that would be done, but there are a lot of engineers at LG et al. who get paid a lot more than I do to figure it out. :)
 
Joined
Nov 18, 2011
Messages
621
Basically, in 2019 you will have to pit a 4K OLED with 120 Hz and amazing picture quality (and a more reasonable 55" size) but some input lag and no variable refresh [NVIDIA] against a FALD VA with far inferior picture quality but G-Sync and virtually no input lag. I guess it depends on your priorities.


Those TVs should have HDMI 2.1, which supports variable refresh rate.

Interestingly, the Xbox One X has HDMI 2.1 support. The rest of the industry can't be far behind.
 

JRUHg

Limp Gawd
Joined
Jan 5, 2016
Messages
386
Basically, in 2019 you will have to pit a 4K OLED with 120 Hz and amazing picture quality (and a more reasonable 55" size) but some input lag and no variable refresh [NVIDIA] against a FALD VA with far inferior picture quality but G-Sync and virtually no input lag. I guess it depends on your priorities.

scanning backlight tho :unsure:
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,563
Those TVs should have HDMI 2.1, which supports variable refresh rate.

Interestingly, the Xbox One X has HDMI 2.1 support. The rest of the industry can't be far behind.

NVIDIA isn't going to support HDMI 2.1's VRR, hence why I put NVIDIA in brackets. They are sticking with G-Sync.

And the Xbox One X does not have a 48 Gbps HDMI 2.1 chip in it. Nothing does. It cannot do 4K 120 Hz. The Xbox One X supports some HDMI 2.1 "features", but not the speed. It would be a moot point anyway, as the Xbox One X doesn't have the power to run 4K past 30 FPS, let alone 120.
 
Joined
Nov 18, 2011
Messages
621
NVIDIA isn't going to support HDMI 2.1's VRR, hence why I put NVIDIA in brackets. They are sticking with G-Sync.

And the Xbox One X does not have a 48 Gbps HDMI 2.1 chip in it. Nothing does. It cannot do 4K 120 Hz. The Xbox One X supports some HDMI 2.1 "features", but not the speed. It would be a moot point anyway, as the Xbox One X doesn't have the power to run 4K past 30 FPS, let alone 120.

I was referring to the 2019 TVs you mentioned, not the G-Sync monitor. 2019 TVs will have VRR through HDMI 2.1.

You are right that the Xbox One X does not have a 2.1 port. However, it DOES feature HDMI VRR and FreeSync. This is actually more important for lower frame rates than it is for higher frame rates. When the game drops below the native panel rate (say 120 Hz), VRR kicks in and reduces stutter on supported TVs/monitors. That's why 45 FPS via VRR can feel smoother than 55 FPS on a non-VRR screen. Additionally, as a 4K Blu-ray player, VRR can be used to run the screen at the native refresh rate of the source material.
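A rough sketch of why a sub-native frame rate judders on a fixed-rate panel but not with VRR (the function and numbers here are my own illustration, not from the post): on a fixed 60 Hz panel a 45 fps frame must wait for the next vblank, so on-screen hold times alternate, while VRR refreshes whenever the frame is ready.

```python
from fractions import Fraction

def hold_times_ms(fps, panel_hz, frames=6):
    """Per-frame on-screen hold times for an fps stream on a fixed-rate panel.

    Exact arithmetic (Fraction) avoids float ties when frame times land
    exactly on a vblank.
    """
    refresh = Fraction(1000, panel_hz)   # ms between panel refreshes
    frame = Fraction(1000, fps)          # ms between rendered frames
    times, next_vblank, t = [], Fraction(0), Fraction(0)
    for _ in range(frames):
        t += frame                       # frame becomes ready at time t
        while next_vblank < t:           # it must wait for the next vblank
            next_vblank += refresh
        times.append(next_vblank)
    prev = [Fraction(0)] + times[:-1]
    return [round(float(b - a), 1) for a, b in zip(prev, times)]

# Fixed 60 Hz panel: 45 fps holds alternate between ~33.3 ms and ~16.7 ms.
print(hold_times_ms(45, 60))
# VRR effectively refreshes at the frame rate: a steady ~22.2 ms per frame.
print(hold_times_ms(45, 45))
```

The uneven 33.3/16.7 ms pattern is the stutter VRR removes, which is why 45 FPS with VRR can feel smoother than 55 FPS without it.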
 

elvn

Supreme [H]ardness
Joined
May 5, 2006
Messages
4,096
A native 120 Hz panel can show 24 fps movies as 24 x 5 repeated frames = 120 frames at 120 Hz, which would look a lot cleaner than running a lower-peak-Hz yet variable-Hz monitor at 2 x 24 = 48 Hz.

There is a reason the higher-end VR kits are 90 Hz, and they would be higher if they could be.

You don't get appreciable blur reduction until around 100 fps. Compared to a 60 Hz at 60 fps (solid, not average) baseline:

100 Hz at 100 fps = 40% blur reduction (1.67:1 motion definition increase)
120 Hz at 120 fps = 50% blur reduction (2:1 motion definition increase)
144 Hz at 144 fps = ~58% blur reduction (2.4:1 motion definition increase)

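The persistence math behind those percentages can be sketched as follows (a sketch under the sample-and-hold assumption; the function names are mine):

```python
# Sample-and-hold blur is proportional to how long each frame persists
# (1/fps when fps matches the refresh rate), so relative to a 60 fps
# baseline the blur reduction is 1 - 60/fps.

def cadence_multiple(panel_hz: int, content_fps: int) -> int:
    """How many times each source frame repeats on a fixed-rate panel."""
    assert panel_hz % content_fps == 0, "rate must divide evenly"
    return panel_hz // content_fps

def blur_reduction(fps: int, baseline: int = 60) -> float:
    """Fractional reduction in sample-and-hold blur vs. the baseline."""
    return 1 - baseline / fps

print(cadence_multiple(120, 24))          # 5 repeats per 24 fps film frame
print(round(blur_reduction(100) * 100))   # 40 (% less blur than 60 fps)
print(round(blur_reduction(120) * 100))   # 50
print(round(blur_reduction(144) * 100))   # 58
```

Note that 1 - 60/144 is about 58%, so the 144 Hz figure is closer to 58% than the often-quoted 60%.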
 
Last edited:

kasakka

2[H]4U
Joined
Aug 25, 2008
Messages
2,199
Something this big will likely be used in home theaters. Receivers don't support DisplayPort. G-Sync does not work if you connect a second HDMI "display" to a receiver for audio. Any other method of connection loses quality, or isn't available on most receivers.

I haven't tried that exact setup, but my G-Sync display works just fine with my HDMI TV connected to the same PC, and I don't remember G-Sync breaking the few times the TV was set as the computer's default audio device.
 

DoubleTap

2[H]4U
Joined
Dec 16, 2010
Messages
2,680
Only if you cap yourself to 5.1 at limited bit rate. It's still better than nothing though.

Most PCs no longer have a DDL (Dolby Digital Live) or DTS Connect encoder; motherboard makers don't usually pay for the licensing, and the latency is horrible for gaming anyway.

The original point is valid though: since these will likely end up in a home theater environment, HDMI audio should be expected, and it isn't really available without using a second virtual monitor (which will give non-technical users all sorts of issues when they lose icons, etc.).
 

mms

n00b
Joined
Apr 7, 2017
Messages
52
G-Sync works fine with multiple displays connected to your computer. The display doesn't need the audio if you have a receiver, so you can do PC -> display over DP, and then HDMI -> receiver.

In fact, this is how my home setup works: I use HDMI to the receiver for audio and have a 27" 144 Hz G-Sync desktop display via DP. The annoying part is that you end up with a '2nd monitor' you don't want to use, because there's no way to turn off the HDMI video and just use it for audio without external hardware, unfortunately. Sure would be nice if Nvidia provided a software solution for this.

Sorry, I'm an Arabic speaker and I don't know some of these concepts. What is the receiver you mean? Can you show me with pictures how I can use HDMI to a receiver for audio and the monitor via DP 1.4?
All I know is running speakers from the sound output on the motherboard.
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,563
I was referring to the 2019 TVs you mentioned, and not the GSYNC monitor. 2019 TVs will have VRR through HDMI 2.1

You are completely missing the point. 2019 4K 120 Hz TVs will most likely have HDMI 2.1, but there will be no video card to drive them in VRR mode. The only company that makes video cards fast enough to make 4K 120 Hz really viable is NVIDIA, who most likely won't support VRR over HDMI 2.1. The only VRR "solution" would be to go with an AMD card if and when they support HDMI 2.1, but I seriously doubt those would be fast enough to be all that useful anyway, considering how far they are behind NVIDIA. I don't buy a 4K 120 Hz OLED to run 50 FPS/Hz.

The only solution for NVIDIA users on a non-G-Sync display is to try to overpower the requirement and keep the minimum FPS equal to the refresh rate. A 4K 120 FPS minimum will be extremely difficult to achieve. We wouldn't have this problem if AMD cards didn't suck.
 

Sancus

[H]ard|Gawd
Joined
Jun 1, 2013
Messages
1,043
Sorry, I'm an Arabic speaker and I don't know some of these concepts. What is the receiver you mean? Can you show me with pictures how I can use HDMI to a receiver for audio and the monitor via DP 1.4?

Google is your friend. An AV receiver is a hardware device that accepts HDMI from many sources and has amplifiers to drive speakers, as well as connecting to your TV to display the input from the source of your choice. I'm way too lazy to draw a flow chart for this specific setup.

You are completely missing the point. 2019 4K 120 Hz TVs will most likely have HDMI 2.1. But no video card to drive it with in VRR mode.

Yup. But the fact this is all delayed until 2019 anyway leaves me hoping that we'll have a proper OLED gaming monitor by then, or at least an announcement. We probably won't, but I can still hope... In all honesty I would prefer a 30-32" 4K OLED 120 Hz G-Sync monitor to any 50+ inch display, and I'd be more than happy to pay $5000 for it. More than likely I will just be dumping G-Sync, because honestly I'd rather have 120 Hz with OLED true blacks than a G-Sync that comes with shitty dark-trailing VA, FALD halos, high-latency backlighting, and still wants me to pay thousands of dollars. The hell with that. This is even more true if the rumored 40-49" LG OLEDs happen in 2019 or 2020.

Sorry, Nvidia. If you're going to try to force us to use inferior panels to benefit from G-Sync, I'll live without it. I refuse to be locked into inferior products at premium prices. Premium products at premium prices are fine; once you lose the premium part you're done, and FALD VA is not premium, it is garbage.
 

gan7114

Limp Gawd
Joined
Dec 14, 2012
Messages
275
But the fact this is all delayed until 2019 anyway leaves me hoping that we'll have a proper OLED gaming monitor by then, or at least an announcement. We probably won't, but I can still hope...

Although its size leaves something to be desired for 4K purposes, all eyes should be on the ASUS ProArt PQ22UC this spring. It'll be very interesting to see how their OLED implementation plays out. Dell tried last year, to no success. I have a feeling these monitor manufacturers are testing the waters with their workarounds for OLED retention and burn-in. Trial and error, if you will.

It's worth noting that the panel will be from JOLED, a joint venture between Japan Display, Sony, and Panasonic.

From TFT Central:
http://www.tftcentral.co.uk/news_archive/39.htm#asus_pq22uc
 
Last edited:

elvn

Supreme [H]ardness
Joined
May 5, 2006
Messages
4,096
OLED fades, even with LG's proprietary OLED tech, which as I understand it uses all-white OLEDs behind a color filter, something like per-pixel FALD, to eliminate uneven color fading. That does not eliminate the OLEDs fading over time, though, and fading unevenly due to different-brightness areas and static images, which may be one of the reasons OLED has a lower peak brightness. This comes into play with HDR especially. The newer LGs are supposed to have increased brightness to over 700 nits, but I've read that they do so by adding a white subpixel, which can impact image clarity. There are still questions as to whether OLED can keep its color calibration/accuracy over time, as well as whether it will get burn-in/image retention in places.

The HDR Premium standard and HDR10 both have 1000 nit peak brightness as a minimum, as well as at most 0.05 nit black depth. While the HDR Premium standard makes an exception for OLED at 500/0.0005, it's worth noting that HDR movies are mastered at up to 4000 nits.

Personally I'm intrigued by both the subject of this thread and the upcoming 4K 120 Hz OLED TVs. I find the lack of true OLED desktop monitors, the fact that Dell pulled theirs, and the fact that there isn't much mention of any other than that 22" one, suspect though.
 
Last edited:

Sancus

[H]ard|Gawd
Joined
Jun 1, 2013
Messages
1,043
Although its size leaves something to be desired for 4K purposes, all eyes should be on the ASUS ProArt PQ22UC this spring.
It's worth noting that the panel will be from JOLED, a joint venture between Japan Display, Sony, and Panasonic.

Interesting about the panel source. And yeah, I'd read about this one too. I'll absolutely be buying it as long as the price isn't totally insane. <$2K is fine, I think. If it's $3-5K I probably wouldn't be able to do that for a 21.6" 60 Hz screen.

If it's <$2K though it'll make a great side monitor/movie watching monitor for me, and even if I get a bigger OLED monitor later it will still be useful to keep around.
 
Joined
Nov 18, 2011
Messages
621
You are completely missing the point. 2019 4K 120 Hz TVs will most likely have HDMI 2.1, but there will be no video card to drive them in VRR mode. The only company that makes video cards fast enough to make 4K 120 Hz really viable is NVIDIA, who most likely won't support VRR over HDMI 2.1. The only VRR "solution" would be to go with an AMD card if and when they support HDMI 2.1, but I seriously doubt those would be fast enough to be all that useful anyway, considering how far they are behind NVIDIA. I don't buy a 4K 120 Hz OLED to run 50 FPS/Hz.

The only solution for NVIDIA users on a non-G-Sync display is to try to overpower the requirement and keep the minimum FPS equal to the refresh rate. A 4K 120 FPS minimum will be extremely difficult to achieve. We wouldn't have this problem if AMD cards didn't suck.


Depends. We don't know what standard consumer Volta cards will support. NVIDIA may well support HDMI 2.1 if there is consumer demand.

I will happily settle for 60-90 FPS VRR gameplay over 120 Hz with V-Sync.
 

MistaSparkul

[H]ard|Gawd
Joined
Jul 5, 2012
Messages
1,667
Depends. We don't know what standard consumer Volta cards will support. NVIDIA may well support HDMI 2.1 if there is consumer demand.

I will happily settle for 60-90 FPS VRR gameplay over 120 Hz with V-Sync.

OK, but here's the question: even if NVIDIA adds HDMI 2.1 for Volta, does that mean it will automatically support VRR through HDMI? They might just add HDMI 2.1 support and leave VRR out of it.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,710
OK, but here's the question: even if NVIDIA adds HDMI 2.1 for Volta, does that mean it will automatically support VRR through HDMI? They might just add HDMI 2.1 support and leave VRR out of it.

Remember that, like FreeSync, HDMI VRR is an optional feature. While implementation should be trivial, HDMI VRR is something beyond base HDMI 2.1 that they are not obligated to support.
 

MistaSparkul

[H]ard|Gawd
Joined
Jul 5, 2012
Messages
1,667
Remember that, like FreeSync, HDMI VRR is an optional feature. While implementation should be trivial, HDMI VRR is something beyond base HDMI 2.1 that they are not obligated to support.

As I thought. So really, 4K 120 Hz WITH VRR on TVs is only going to be possible through AMD.
 

sharknice

2[H]4U
Joined
Nov 12, 2012
Messages
2,381
OK, but here's the question: even if NVIDIA adds HDMI 2.1 for Volta, does that mean it will automatically support VRR through HDMI? They might just add HDMI 2.1 support and leave VRR out of it.

Nope, they would have to implement VRR in their drivers. It isn't automatic just because the hardware includes it.
 

elvn

Supreme [H]ardness
Joined
May 5, 2006
Messages
4,096
G-Sync requirements are much stricter. The technology requires display makers to use a proprietary hardware module and Nvidia keeps a firm grip on quality control, working with manufacturers on everything from initial panel selection to display development to final certification.

That’s a decent amount of added cost, and G-Sync monitors tend to start at higher prices as it’s considered a premium add-on for premium gaming displays. You won’t often find G-Sync monitors paired with budget or mainstream gaming PCs as a result—though you’ll always know what you’re getting with G-Sync.


I realize people are talking about the VRR in the new HDMI standard, but technically FreeSync and G-Sync are both Variable Refresh Rate technologies already. G-Sync just requires DisplayPort currently.
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,563
Google is your friend. An AV receiver is a hardware device that accepts HDMI from many sources and has amplifiers to drive speakers, as well as connecting to your TV to display the input from the source of your choice. I'm way too lazy to draw a flow chart for this specific setup.



Yup. But the fact this is all delayed until 2019 anyway leaves me hoping that we'll have a proper OLED gaming monitor by then, or at least an announcement. We probably won't, but I can still hope... In all honesty I would prefer a 30-32" 4K OLED 120 Hz G-Sync monitor to any 50+ inch display, and I'd be more than happy to pay $5000 for it. More than likely I will just be dumping G-Sync, because honestly I'd rather have 120 Hz with OLED true blacks than a G-Sync that comes with shitty dark-trailing VA, FALD halos, high-latency backlighting, and still wants me to pay thousands of dollars. The hell with that. This is even more true if the rumored 40-49" LG OLEDs happen in 2019 or 2020.

Sorry, Nvidia. If you're going to try to force us to use inferior panels to benefit from G-Sync, I'll live without it. I refuse to be locked into inferior products at premium prices. Premium products at premium prices are fine; once you lose the premium part you're done, and FALD VA is not premium, it is garbage.

What sucks is that VRR is the real deal, so it is hard to brush off G-Sync so easily. In a very demanding 4K scenario at high refresh rates especially, VRR is needed more than ever.

Although its size leaves something to be desired for 4K purposes, all eyes should be on the ASUS ProArt PQ22UC this spring. It'll be very interesting to see how their OLED implementation plays out. Dell tried last year, to no success. I have a feeling these monitor manufacturers are testing the waters with their workarounds for OLED retention and burn-in. Trial and error, if you will.

It's worth noting that the panel will be from JOLED, a joint venture between Japan Display, Sony, and Panasonic.

From TFT Central:
http://www.tftcentral.co.uk/news_archive/39.htm#asus_pq22uc

I'll definitely be getting one to try out on release day.

Depends. We don't know what standard consumer Volta cards will support. NVIDIA may well support HDMI 2.1 if there is consumer demand.

We can only hope...

Ok but here's the question. Even if nvidia adds hdmi 2.1 for volta, does that mean it will automatically support VRR through hdmi? They might just add hdmi 2.1 support and remove VRR from it.

That is what I fear will happen. If NVIDIA has refused FreeSync this entire time, why would they change?
 

Sancus

[H]ard|Gawd
Joined
Jun 1, 2013
Messages
1,043
What sucks is VRR is the real deal. So it is hard to brush off G-Sync so easily. Especially in a very demanding 4K scenario at high refresh rates, VRR is needed more than ever.

You're not wrong. Most likely I will punt the whole dumpster fire to 2019 and buy a PG27UQ or PG35VQ, now that it seems we have confirmation those are still slated to come out this year. Kinda leaning towards the PG35VQ. You're gonna have smear on these BFGDs anyway since they're VA-panel based; might as well get 21:9 and a usable desktop size.
 
Top