WOW! 55" Display with 120Hz / 1440p with excellent latency. nVidia and FreeSync

100ms, 10ms, 1ms.... maybe I'm blind, but I can't see the difference.
I can also barely, just barely see the difference between 60 and 120hz.

My JS9000 had about 120ms or so input lag because I turned all post processing on and refused to play in game mode.
This new tv has 10ms. I can see some difference between 120hz and 60hz, but not between 120ms and 10ms.

I would posit that you are blind ;)

Though, if that makes sense, you don't so much 'see' input latency as 'feel' it. As for the high-framerate stuff, that's night and day to me.
 
I'd bet that Nvidia will support HDMI VRR. Well, I'd call it 51/49, because I prefer to remain optimistic :).

Hell, they could introduce G-Sync over HDMI too though...
...Given a long enough timespan, the chances of anything become 100%...
 
The 120 Hz refresh means nothing when the response time is 9 ms. This thing would give me a headache 15 minutes into an FPS, not to mention you probably would not be able to hit the broad side of a barn with this kind of input lag.

This kid knows nothing about what he thinks he is talking about ^.
 
I would posit that you are blind ;)

Though, if that makes sense, you don't so much 'see' input latency as 'feel' it. As for the high-framerate stuff, that's night and day to me.

Everyone is different in this regard.

Personally I can't "see" a difference in framerate above ~30fps. I can be looking at someone else playing a game and as long as it's at least 30fps, it all looks the same to me.

Now, if I grab the mouse and start playing, I can tell the difference immediately. In a first person shooter 30fps feels unplayable. 60fps usually feels good enough. There is a slight improvement above 60fps, up until about 90fps, but it is minor to me and not worth worrying about. Above 90fps I can't tell the difference at all.

Now, as far as screen input lag goes, on my system my Samsung JS9000 feels perfectly responsive in game mode, which has been measured at 23.6 ms (according to Rtings). It both looks and feels instantaneous to me. In PC mode the input lag goes up to 55.8 ms, which both looks and feels noticeably bad to me. I can still play, but I perform considerably worse.

Worth keeping in mind is that the impact of input lag is additive. What we perceive as lag is the total system lag, starting with the mouse, the USB subsystem, the CPU/RAM/game engine itself, travel across the PCIe bus to the GPU, and then finally the monitor. Some systems have higher input lag than others; notably, SLI/CrossFire increases input lag by quite a lot.
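
As a back-of-the-envelope illustration of that additive chain — every stage figure below is a hypothetical ballpark except the display value, which is the Rtings game-mode measurement quoted above:

```python
# Back-of-the-envelope total system lag. All stage figures are
# hypothetical ballparks except the display value (Rtings' 23.6 ms
# game-mode measurement for the JS9000, quoted above).
pipeline_ms = {
    "mouse + USB polling":     2.0,
    "CPU / RAM / game engine": 10.0,
    "PCIe + GPU render queue": 8.0,
    "display (game mode)":     23.6,
}

for stage, ms in pipeline_ms.items():
    print(f"{stage:<24} {ms:5.1f} ms")
print(f"{'total felt lag':<24} {sum(pipeline_ms.values()):5.1f} ms")

# Swap in the 55.8 ms PC-mode figure and the total rises by the same
# 32.2 ms -- the rest of the chain didn't change, only the monitor did.
```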

In the general population, anything under 100 ms is perceived as instantaneous by the average person. Those of us who play fast-paced games are probably a little more sensitive to it, though.

It is possible that a monitor with a certain input lag may feel good on a system with a single GPU, not going above the input lag threshold that is noticeable, but move that same monitor to a system with SLI, and it could all of a sudden be noticeable.

Input lag: it's more than just monitors.
 
Honestly, I'll probably switch to AMD GPUs if they make something that can rival Nvidia (performance per watt). I'm really getting sick of being tied to G-Sync displays to get VRR. Also, TV tech is rapidly outpacing gaming displays. IMO, if you can deal with the 49+" size of these "sweet spot" TVs, they blow away your average gaming monitor. Input lag was a no-go issue for years, but with some of the 120Hz-capable TVs getting into the sub-15ms range, I can consider them an acceptable alternative.
 
Input lag of the NU8000 at 120 Hz is 9.7 ms. A 120 Hz frame lasts about 8.3 ms, so that is only about 1.4 ms (roughly a sixth of a frame) past what you see on the screen. 1.4 ms isn't going to make or break your gameplay. These televisions are not the ones from even 2-3 years ago, when 40 ms input lag was common.
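
To sanity-check that arithmetic (the 9.7 ms figure is the Rtings measurement cited above):

```python
# Frame-time arithmetic for the quoted NU8000 measurement.
refresh_hz = 120
input_lag_ms = 9.7                  # NU8000 @ 120 Hz, per Rtings

frame_ms = 1000 / refresh_hz        # ~8.33 ms per frame
extra_ms = input_lag_ms - frame_ms  # ~1.37 ms beyond one frame
print(f"{frame_ms:.2f} ms/frame; lag is {extra_ms:.2f} ms "
      f"(~{extra_ms / frame_ms:.2f} of a frame) past one frame")
```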

The NU8000:
  1. Can't drive 4K at 120 Hz
  2. Does not have FALD (full-array local dimming)
  3. Has FreeSync support, but no G-Sync
  4. Is still a TV
I also don't understand your use of the word "kids" as a pejorative. Kids are not buying these displays, and neither are their parents.

If you want a big screen with 120 Hz support for under $1,000, then this is a good choice. However, it is simply not an alternative to anything outside of its price range like the BFGD.


First of all, the BFGD is going to be very expensive; cost-prohibitive to most gamers, in fact almost all of them. The demographic for this has to be very tiny, whereas Samsung's 2018 set does native 1440p at 120Hz and can be enjoyed by both AMD and nVidia card owners en masse @ a very affordable $750 to $850.

1 - there is nothing on the market that can push 4K at 120hz ...

2 thru 4 mean absolutely nothing.
 
I agree. There are lots of nice options to get today.


Certainly a lot of options.

Personally, I'm keeping my 48" Samsung JS9000, though, until there is something in the 42-43" size range that can actually display 4K 120Hz and has a variable refresh rate technology that either works with Nvidia GPUs, or until AMD has a competitive high-end GPU again.

At $2k, this thing cost me a pretty penny, but 3.5 years later I still think it was worth it.
 
Linus Tech Tips says we might see VRR on Samsung sets in 2019.

is VRR an open standard?

update:

I did a quick search; yes, it's an open standard.

VESA has Adaptive-Sync.

The HDMI 2.1 specification also includes VRR.

As long as Samsung supports these, we are good. I'm really looking forward to the new 2019 Samsung sets.
 
I have FreeSync on my 2018 Samsung.

For 4K the range is too limited, but it's very nice for 1440p (it would be even better if the image weren't so blurry, but you can't win them all).
 
100ms, 10ms, 1ms.... maybe I'm blind, but I can't see the difference.
I can also barely, just barely see the difference between 60hz and 120hz (a bit smoother gameplay and mouse movement, but nothing drastic).

My JS9000 had about 120ms or so input lag because I turned all post processing on and refused to play in game mode.
This new tv has 10ms. I can see some difference between 120hz and 60hz, but not between 120ms and 10ms.

Input lag is felt more than seen. It feels like things on screen don't respond to your button presses or mouse/joystick movements immediately, but with a slight delay. Depending on the game, this can manifest as more missed shots, etc. I have some input lag caused by my Denon audio receiver, and it took me a long time to realize something was off. Routing video from the TV to the receiver instead of the other way around removed this input lag and made a big difference in a game like Bloodborne, for example, where precise input is crucial for counterattacks.
 
I'm curious, as I was just trying to decide on another monitor, lol. I have a 27" 1440p 144Hz ROG Swift and a Dell equivalent. I like it for PUBG, Battlefield, and similar games... though I feel as if I am missing a lot of colors, lol; it feels very bland. I was considering a 4K monitor, but damn... $2k for a 144Hz 4K is a bit much, and I'll not be hitting 144fps in 4K even with my system (need to update my sig). But I have the capability to hit 60fps easily... so I thought about keeping my monitors and then getting a 43-49" to put above them that could do faster refreshes and looked good, so when I play more console-ish / less intense games I could do 4K60. The NU8000 in a 49" is available for a nice sub-$600.
 
I'm curious, as I was just trying to decide on another monitor, lol. I have a 27" 1440p 144Hz ROG Swift and a Dell equivalent. I like it for PUBG, Battlefield, and similar games... though I feel as if I am missing a lot of colors, lol; it feels very bland. I was considering a 4K monitor, but damn... $2k for a 144Hz 4K is a bit much, and I'll not be hitting 144fps in 4K even with my system (need to update my sig). But I have the capability to hit 60fps easily... so I thought about keeping my monitors and then getting a 43-49" to put above them that could do faster refreshes and looked good, so when I play more console-ish / less intense games I could do 4K60. The NU8000 in a 49" is available for a nice sub-$600.

Just FYI, the 49" NU8000 isn't 120Hz-capable at 1440p and lower, if that matters to you at all.
 
I'm curious, as I was just trying to decide on another monitor, lol. I have a 27" 1440p 144Hz ROG Swift and a Dell equivalent. I like it for PUBG, Battlefield, and similar games... though I feel as if I am missing a lot of colors, lol; it feels very bland. I was considering a 4K monitor, but damn... $2k for a 144Hz 4K is a bit much, and I'll not be hitting 144fps in 4K even with my system (need to update my sig). But I have the capability to hit 60fps easily... so I thought about keeping my monitors and then getting a 43-49" to put above them that could do faster refreshes and looked good, so when I play more console-ish / less intense games I could do 4K60. The NU8000 in a 49" is available for a nice sub-$600.
I know this much: when I turn motion interpolation off, the screen is flickery watching TV because the panel's response time is so fast.
OLED panels are even worse because OLED response is faster.
It is a true 120Hz panel, as you can see in the Rtings report, and I haven't noticed trails while gaming at 120Hz.
 
ahh crap :( I feel 55" would be too large.... I'll have to get a feel for them and see
 
Pretty much all TVs look quite bad at non-native resolutions though, wtb HDMI 2.1 for real 4K120.

Indeed. Despite what anyone says, if you put a 49" or larger 4K TV at 2560x1440, it will look like shit. You will get a slightly fuzzy image doing that. If you set the display to 1920x1080, it should be a 4:1 mapping of pixels on the Samsungs, leaving you with an "almost native" resolution. The problem is that you're essentially combining 4 pixels into 1, so your display will look grainy. I've tried all this with my Samsung KS8500, and anything other than 4K looks like shit. If you can't see it, more power to you, but it's not as good as running at native resolution. It never is.
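
The pixel math behind that claim: 1080p divides evenly into a 4K grid (2x per axis, which is the 4:1 mapping above), while 1440p is a fractional 1.5x factor, so the scaler has to interpolate. A minimal check:

```python
# Scale factors when showing lower resolutions on a 3840x2160 panel.
# 1080p gives an integer 2x per axis (each source pixel becomes a
# clean 2x2 block); 1440p gives 1.5x, forcing interpolation (blur).
panel_w, panel_h = 3840, 2160

for w, h in [(1920, 1080), (2560, 1440)]:
    sx, sy = panel_w / w, panel_h / h
    if sx.is_integer() and sy.is_integer():
        verdict = "integer -> clean pixel blocks"
    else:
        verdict = "fractional -> scaler must interpolate (fuzzy)"
    print(f"{w}x{h}: {sx}x per axis, {verdict}")
```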

No, that is absolutely not true. My KS8000 looks excellent at 1080p and 1440p. It looks native, in fact, and that's 2015 technology.

We are talking Samsung's High-End 2018 line-up.

Did you mean with cheap POS-branded TVs? Maybe..... I dunno. But I don't buy Insignia and Element like you guys.

Also, what the hell could you use to possibly push 120Hz @ 4K? Maybe if you drop all the settings down, and then.... maybe.

My RTX 2080 Ti does great at very-high settings @ 1440p pushing 120FPS.

No, it doesn't look as good. I've got a Samsung KS8500 that I spend more than 10 hours a day on. Sometimes more, rarely less. I've tried 2560x1440 and 1920x1080 and it doesn't look as good. I don't have great vision and I can see the difference clear as day.

So what's the trick for getting VRR to work with an Nvidia card? Honestly I'd start saving up for a new display in the form of a 55" TV right now if it was possible.

There isn't. Variable refresh rates are not supported using NVIDIA GPUs on these TVs. G-Sync needs to be supported in hardware. Period.

There is no trick. It just works.

People are talking about variable refresh rates, not about NVIDIA GPUs being incompatible with the display.

There is some very, very bad information out there about all these FreeSync monitors... they all support nVidia cards. I've owned 3 "FreeSync" monitors, and they all worked with nVidia... not under FreeSync or G-Sync, but just at the native resolution and the same 100, 120, or 144Hz refresh rate.

I even have an older 32" HP that has 75Hz FreeSync, and my nVidia cards gave me the same 75Hz.

No one said that you couldn't use an NVIDIA GPU with these displays. Unfortunately, variable refresh rates aren't supported. That isn't the display supporting NVIDIA GPUs; that's just display standards which both the NVIDIA GPU and the TV are compatible with.

The difference between FreeSync, G-Sync, and non-branded 144Hz is that G-Sync lines up the frames without skipping or duplicating (as far as I can tell)... otherwise, without FreeSync or G-Sync, you might skip a few frames, but this doesn't really matter when you have 144FPS raining down on you like stripper dollars every damn second. BTW, it just took you a second to read the last few words. Who is going to notice a dropped/duped frame? And from what I read, V-Sync is pretty damn good on its own.

G-Sync and FreeSync work the same way: they synchronize the display's refresh rate with the video card's frame output. It's that simple. The difference is that one is a hardware solution and the other is a software solution. They both have advantages and disadvantages. The disadvantage of G-Sync is that it costs money and locks AMD GPUs out, as they don't support the hardware. FreeSync is free in terms of licensing, with no royalties required, and is naturally cheaper to implement. However, FreeSync displays support a specific range of refresh rates, and you have to make sure a given display supports the ranges you are likely to use. I am not sure if all FreeSync displays support LFC (Low Framerate Compensation) either, as early units had problems under 30Hz, which was outside FreeSync's range. G-Sync has always had anti-ghosting built into it, and FreeSync hasn't. I'd bet that most of the newer TVs that support FreeSync support these newer features, but I don't know that for certain. You mentioned lining up frames, which isn't accurate. G-Sync has a "collision avoidance" feature which prevents it from displaying duplicate frames. I don't know how FreeSync handles this.
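
For anyone curious how LFC works in principle: when the game's frame rate drops below the display's minimum VRR rate, the driver repeats each frame enough times to land the effective refresh back inside the supported window. A conceptual sketch, using a made-up 48-120 Hz range rather than any particular display's spec:

```python
# Conceptual LFC (Low Framerate Compensation) sketch. The 48-120 Hz
# VRR window is hypothetical, not the spec of any specific display.
VRR_MIN, VRR_MAX = 48.0, 120.0

def lfc(fps: float) -> tuple[int, float]:
    """Return (times each frame is shown, resulting panel refresh)."""
    repeats = 1
    while fps * repeats < VRR_MIN:   # double/triple frames as needed
        repeats += 1
    return repeats, fps * repeats

for fps in (100, 47, 25):
    n, hz = lfc(fps)
    print(f"{fps} fps -> show each frame {n}x, panel refreshes at {hz:.0f} Hz")
# 100 fps -> 1x @ 100 Hz; 47 fps -> 2x @ 94 Hz; 25 fps -> 2x @ 50 Hz
```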

V-Sync isn't "pretty damn good on its own." V-Sync sucks ass because it reduces frame rates by half whenever you drop below one of its thresholds: if you drop below 60FPS on a 60Hz display, the frame rate gets cut in half to 30FPS. V-Sync also creates noticeable input latency. You can enable triple buffering to help avoid this, but that doesn't always work. V-Sync isn't, and never was, a really good solution to screen tearing. It's all we had for many years, but that never made it ideal. Essentially, V-Sync forces the video card's frame rate down to maintain synchronization with the display, rather than the other way around. G-Sync and FreeSync constantly adapt the display output to match the GPU's FPS.
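
The halving comes from double-buffered V-Sync having to wait for the next vertical refresh, so any frame that takes even slightly longer than one refresh interval is held for two. A sketch of that quantization, assuming plain double buffering:

```python
import math

# Effective frame rate under double-buffered V-Sync: render times
# snap up to a whole number of refresh intervals.
def vsync_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    intervals = math.ceil((1.0 / render_fps) / (1.0 / refresh_hz))
    return refresh_hz / intervals

for fps in (75, 60, 59, 35, 29):
    print(f"GPU renders {fps} fps -> display shows {vsync_fps(fps):.1f} fps")
# 75 -> 60.0, 60 -> 60.0, 59 -> 30.0, 35 -> 30.0, 29 -> 20.0
```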
 
ahh crap :( I feel 55" would be too large.... I'll have to get a feel for them and see
lol.
Just sit a little further away; it's better for the eyes anyway.

The frame skip test with a 1/10s shutter @ 120Hz 1440p shows 12 consecutive squares filled (120Hz × 0.1s = 12 frames, so no frames are being skipped).
 

ahh crap :( I feel 55" would be too large.... I'll have to get a feel for them and see


Yeah, sadly the trend is that premium features are more and more only being put in TVs with larger screens, which effectively kills the "I use my TV as a great monitor" application, because 55+" is just too damned large for desktop use. Even my 48" is a bit large. Ideally, I'd have the top-of-the-line 4K Samsung QLED TV in a 43" package.
 
No, it doesn't look as good. I've got a Samsung KS8500 that I spend more than 10 hours a day on. Sometimes more, rarely less. I've tried 2560x1440 and 1920x1080 and it doesn't look as good. I don't have great vision and I can see the difference clear as day.
I won't disagree with you; it's not perfect, but my Q9FN is very good at 1440p. I was surprised.
I use this res often for Windows, and all the time for 120Hz gaming, because it looks native.

This is text at 100%, no zoom. The blur is my crappy camera work.
(I use 130% zoom to browse and it looks native.)
 

I won't disagree with you; it's not perfect, but my Q9FN is very good at 1440p. I was surprised.
I use this res often for Windows, and all the time for 120Hz gaming, because it looks native.

This is text at 100%, no zoom. The blur is my crappy camera work.
(I use 130% zoom to browse and it looks native.)

I never said the Samsungs weren't better at it than most displays. At 2560x1440, it's serviceable for gaming. However, I don't think it is good enough for web browsing or any substantial productivity work. At 130% zoom, I don't doubt it looks more or less native. As large as the text would be like that, the blur effect wouldn't be as pronounced.
 
Yes it does. The higher the refresh rate, the lower the input lag. It correlates.
You must consider how long image processing takes.
Take a look at early OLEDs for example.
Great pixel response, crap TV lag.
 
You must consider how long image processing takes.
Take a look at early OLEDs for example.
Great pixel response, crap TV lag.

Refresh rate is one of the contributing factors to input lag, but it is not the only one. Screens do various types of processing to the image as well.

On the same screen, with all else being equal, yes, a higher refresh rate will typically reduce input lag; but when comparing different screens, or even the same screen with different settings, you could easily have an increase in processing time offset the improvement.
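
To put toy numbers on that: model display lag as a fixed processing delay plus an average scanout wait of half a refresh interval (all the figures below are made up for illustration, not measurements):

```python
# Toy model: display lag = processing delay + average scanout wait
# (half a refresh interval). All numbers are illustrative, not specs.
def display_lag_ms(processing_ms: float, refresh_hz: float) -> float:
    return processing_ms + (1000.0 / refresh_hz) / 2.0

print(display_lag_ms(5.0, 60))    # 13.3 ms
print(display_lag_ms(5.0, 120))   #  9.2 ms -- same screen, faster refresh wins
print(display_lag_ms(15.0, 120))  # 19.2 ms -- heavier processing erases the gain
```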
 
Certainly a lot of options.

Personally, I'm keeping my 48" Samsung JS9000, though, until there is something in the 42-43" size range that can actually display 4K 120Hz and has a variable refresh rate technology that either works with Nvidia GPUs, or until AMD has a competitive high-end GPU again.

At $2k, this thing cost me a pretty penny, but 3.5 years later I still think it was worth it.

Hell yes it was. I would've kept it but I really wanted to try that 1440p/120hz.
I find 1440p acceptable but still think that 1440p and 4k have a huge difference in picture quality.
Besides, my screen had bad side bleed because of high brightness and I wanted to give my dad a bigger tv (he's not picky about the picture, probably doesn't even notice the bleed, lol).
Also, going down to 1440p resolution is fine for my 1080Ti and makes skipping current gen Nvidia cards mentally more tolerable :p
 
I'm halfway tempted to buy an NU8000 anyway and do without Gsync... but I'm afraid I'd miss it too much. :/
 
Refresh rate is one of the contributing factors to input lag, but it is not the only one. Screens do various types of processing to the image as well.

On the same screen, with all else being equal, yes, a higher refresh rate will typically reduce input lag; but when comparing different screens, or even the same screen with different settings, you could easily have an increase in processing time offset the improvement.
You quoted the wrong person; I said something similar to you.
 
Yeah, that's the thing. I'm pretty sure I will win this argument.

First of all, bust out the $3,000 to $4,000 you're gonna need to buy the new nVidia 65" gaming display that apparently HP, Alienware, and nVidia have committed to. Possibly others. So now you have your 65" 4K HDR set with G-Sync. Great, how are you supposed to push that? Ok, we need 2 x 2080 Tis. Ok, cool, that's another $2,600 minimum. So out of pocket we are at $5,600 to $6,600. Ok, wow... that stings, and if you're the guy who bought all of that, you're like 1 in 5,000 kids. How special you must be.

While we're at it, let's go ahead and add another $1,500 for all the additional parts you will need. We are now at $7,000 to $8,000 fucking dollars. Maybe you have a newer PC, maybe you don't. Either way, you've spent the money before or will have to spend it now.

I don't think you kids are really getting this. That's not a realistic solution.

But hey, that's all on you. Get the bank loan, get your grandmother to pay for it all after you tell her you're gonna be some new Twitch streamer and will make all this money, etc. Whatever lie you have to tell yourself and others.

You guys are absolutely delusional if you don't think a 55" @ 120hz @ 1440p will take you to the next level.

Son, I am running a Predator X27 and a 43" Mango 4K120. The Mango IS the closest you are going to get to the BIGGEM native 4K BFGD experience right now @ sub-$1,000.

Son, SLI is a waste of time, so there is $1,300 saved.

Son, my 2080 Ti pushes 4K 120Hz just fine on BF4 with all settings maxed, and PUB-FUK-GEE with some settings on medium and low, etc., etc.... Most people who play competitive shootey games at 1080/1440pee lower the eye candy so they can have better frame rates anyway, so it's not a big deal for me to lower settings in an FPS so I can play 4K@120.

Son, if you want to play games @ 4K with settings maxed, then chances are you're playing a single-player game or another non-competitive game, and the 2080 Ti does a swell enough job of hitting 60fps.

Son, you should have already had that other $1,500 in additional parts purchased. What are you, a console kiddy, son?
 
100ms, 10ms, 1ms.... maybe I'm blind, but I can't see the difference.
I can also barely, just barely see the difference between 60hz and 120hz (a bit smoother gameplay and mouse movement, but nothing drastic).

My JS9000 had about 120ms or so input lag because I turned all post processing on and refused to play in game mode.
This new tv has 10ms. I can see some difference between 120hz and 60hz, but not between 120ms and 10ms.

(posts a "100" reaction GIF)
 
I find some games still do look nice at 1440p on a 4K TV (mine is 55") but most games look bad.

For example, Left4Dead on max settings (importantly, 8x MSAA) looks almost like native 4K. It's really nice.

But any newer games with post-process AA don't look as hot. I tried Prey and it looked like ass.

So you can make it work sometimes, but, in general, non-native res is a no-go.
 
I find some games still do look nice at 1440p on a 4K TV (mine is 55") but most games look bad.

For example, Left4Dead on max settings (importantly, 8x MSAA) looks almost like native 4K. It's really nice.

But any newer games with post-process AA don't look as hot. I tried Prey and it looked like ass.

So you can make it work sometimes, but, in general, non-native res is a no-go.

Battlefront 2 looks like hammered dogshit at 2560x1440.
 
I'm curious, as I was just trying to decide on another monitor, lol. I have a 27" 1440p 144Hz ROG Swift and a Dell equivalent. I like it for PUBG, Battlefield, and similar games... though I feel as if I am missing a lot of colors, lol; it feels very bland. I was considering a 4K monitor, but damn... $2k for a 144Hz 4K is a bit much, and I'll not be hitting 144fps in 4K even with my system (need to update my sig). But I have the capability to hit 60fps easily... so I thought about keeping my monitors and then getting a 43-49" to put above them that could do faster refreshes and looked good, so when I play more console-ish / less intense games I could do 4K60. The NU8000 in a 49" is available for a nice sub-$600.

You are not "missing" any colors; those monitors should be accurate for sRGB, which is what is used by all games. What you are saying is that you prefer the higher-gamut, exaggerated-contrast colors you would see on monitors that support that.
 
You are not "missing" any colors; those monitors should be accurate for sRGB, which is what is used by all games. What you are saying is that you prefer the higher-gamut, exaggerated-contrast colors you would see on monitors that support that.
Why so dismissive?
There are differences in how much of the intended colour palette can be displayed.
Some colours are not the correct shade because the display cannot properly show them, even when calibrated.
Wider gamut does 'not' mean exaggerated.
 
Since this TV has FreeSync, would it be smart to dump my 1080Ti and buy a Vega 64? I could probably end up at ±0 once I sell my 1080Ti and buy a new Vega 64.
 