WOW! 55" Display with 120Hz / 1440p with excellent latency. nVidia and FreeSync

Since this TV has Freesync, would it be smart to dump my 1080Ti and buy Vega 64? I could probably end up with +-0 once I sell my 1080Ti and buy a new Vega 64.
I recommend you try it with your 1080Ti first; that's what I use mine with.
Wouldn't trade it for anything.
 
Since this TV has Freesync, would it be smart to dump my 1080Ti and buy Vega 64? I could probably end up with +-0 once I sell my 1080Ti and buy a new Vega 64.

The problem with FreeSync is that you end up having to use AMD GPUs with it. Vega 64 would be a downgrade over your 1080Ti. AMD's best just isn't that great right now.
 
Vega 64 will struggle at 4K. If you plan to use 1440p that could work, but for 4K stick with the 1080 Ti.
 
Since this TV has Freesync, would it be smart to dump my 1080Ti and buy Vega 64? I could probably end up with +-0 once I sell my 1080Ti and buy a new Vega 64.

The problem with FreeSync is that you end up having to use AMD GPUs with it. Vega 64 would be a downgrade over your 1080Ti. AMD's best just isn't that great right now.

Vega 64 will struggle at 4K. If you plan to use 1440p that could work, but for 4K stick with the 1080 Ti.


Yeah, that's the current problem

Want a large good 4k screen with adaptive refresh? It's going to be VRR or Freesync.

Want a GPU to drive it? It's going to be Nvidia. AMD isn't even worth thinking about for 4k at this point, unless you really like low framerates or low settings.

Nvidia doesn't support Freesync/VRR.
 
AMD isn't even worth thinking about for 4k at this point, unless you really like low framerates or low settings.

I'll jump off on this point: the settings on Vega 64 would be lower, but I'd speculate that they'd still be higher than your average console would pull off.

Perhaps not as good performance as you'll get for the money but certainly functional.
 
There is another option: Crossfire. With 2x Vega 64 you can reach 60 fps at 4K (high to very high settings) in many games. Granted, it doesn't work in all games, but the cost is not so crazy when Nvidia is selling $1,200 GPUs.
 
The one feature TVs don't offer that I absolutely must have for PC use is PBP (picture-by-picture). That right there kills the idea of using a TV as a PC monitor for my needs.
 
I'll jump off on this point: the settings on Vega 64 would be lower, but I'd speculate that they'd still be higher than your average console would pull off.

Perhaps not as good performance as you'll get for the money but certainly functional.

Even a Geo Metro is better than a bicycle.

I don't understand why you would even make this comparison :p
 
There is another option: Crossfire. With 2x Vega 64 you can reach 60 fps at 4K (high to very high settings) in many games. Granted, it doesn't work in all games, but the cost is not so crazy when Nvidia is selling $1,200 GPUs.

Crossfire and SLI are unusable garbage that just result in high input lag, and poor compatibility and scaling. They are Dyno queens, high average fps, but frequent dips and terrible minimum framerates.

They are not a viable alternative at all.
 
Crossfire and SLI are unusable garbage that just result in high input lag, and poor compatibility and scaling. They are Dyno queens, high average fps, but frequent dips and terrible minimum framerates.

They are not a viable alternative at all.

I've been using multi-GPU setups since their release on the market. What you are saying hasn't always been true. I wouldn't even say it's entirely true now. You can also disable the feature any time it becomes a detriment to performance.
 
I've been using multi-GPU setups since their release on the market. What you are saying hasn't always been true. I wouldn't even say it's entirely true now. You can also disable the feature any time it becomes a detriment to performance.


Fair, I only have two data-points.

I used it in 2011 with dual Radeon HD 6970's on a Phenom II 1090T (overclocked to 4.2Ghz if memory serves). I had recently upgraded to my 30" Dell U3011 2560x1600 screen, and was struggling to get decent framerates at that resolution.

I found that the titles I played had terrible bugs when Crossfire was enabled, with frequent crashes or graphical distortions. Some were fixed several months after launch, others were never fixed. Even when they were fixed, the scaling was never good, and while it did give me a bump in average FPS, the overall experience was almost as bad as with one GPU, because what really matters, the minimum FPS, was mostly unchanged going from one to two GPUs. It would run away in simple scenes, but more complex ones had almost the same minimum FPS.

I never had any of the microstutter problems people were talking about at the time, just a combination of bugs making many titles unusable, poor minimum framerates, and added input lag.

At first I thought the problem may have just been that the CPU couldn't keep up, so I swapped my Phenom II 1090T (my wait-for-Bulldozer build, which I kept when Bulldozer turned out to be crap) for my current 3930K X79 system, overclocked it to 4.8GHz, but the problems persisted. At that point I knew it was not a CPU issue. I found having two GPUs a complete waste, and dumped both my 6970's in favor of a single 7970. Despite the single 7970 only being a small upgrade over a single 6970, the experience was MUCH better than 6970's in Crossfire. (I later killed that 7970 in a slipped-screwdriver incident when I tried to ghetto-mount a Corsair AIO to it, in a desperate attempt to get better overclocks so the minimum framerate at 2560x1600 would not drop below 60fps.)

The only part I liked about that setup was that it was pretty awesome to see the massive dual Asus three slot DirectCUII cards in the same case :p

[photo of the build with the dual DirectCU II cards]


Those GPU's were massive, they just don't look it in this case, as those fans are 180mm fans, and that is the ultra rare Maingear 180mm single slot AIO. I had to saw into the motherboard tray to make the space for it :p

I replaced the dead 7970 with a 2GB GeForce GTX 680 at launch, but that still wasn't fast enough for 2560x1600. It wasn't until I got my first-gen Titan at launch in 2013 that I was happy. I kept that setup until...

Summer of 2015 I made the same mistake again. I got my 4K Samsung JS9000, and nothing on the market was fast enough to drive it. I wound up getting two 980ti's for use in SLI. I still remembered my bad experience with crossfire, but people had always said that SLI was the better solution, so I had hope, but nope. To be fair, there were fewer game breaking bugs, but apart from that, I had the exact same performance problems with dual 980ti's in SLI at 4K as I had with dual 6970's in Crossfire at 2560x1600 4 years earlier.

Still was a pretty sweet build with one Corsair H90 for each 980ti, using Corsair's GPU bracket, and some hollowed-out fans as spacers. Apparently I never took a picture of the finished product though. It wasn't pretty, but it was very effective at keeping those 980ti's cool.

For good measure, here it was before I installed the AIO's:

[build photo]



It only lasted until I got fed up and jumped on the Pascal Titan X on launch and built my first custom water loop.
 
Crossfire and SLI are unusable garbage that just result in high input lag, and poor compatibility and scaling. They are Dyno queens, high average fps, but frequent dips and terrible minimum framerates.

They are not a viable alternative at all.
While you are correct that there definitely are games with bugs and issues, I would disagree that it's across the board.

I would say most of the games I've tried do work fine in Crossfire and make it worth it. Granted, not all games scale well, but usually you are looking at around a 50% boost over one card, or close to 100% in rare cases (in big games like GTA V and Tomb Raider you can get almost 100%).

There have also been improvements with frame pacing that have largely mitigated the dips, though it will depend on the game. I find that if you can reach around 90 fps average, you will survive the dips on a 60Hz screen (though if you are only getting 60 fps it may be more choppy than one card).
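
To put rough numbers on that 90 fps rule of thumb, here's a back-of-the-envelope sketch (illustrative only; real frame-time spikes are messier):

```python
# Back-of-the-envelope headroom math for a 60 Hz screen (illustrative only).
refresh_budget_ms = 1000.0 / 60          # each refresh gives a frame ~16.7 ms

for avg_fps in (60, 75, 90):
    avg_frame_ms = 1000.0 / avg_fps
    # How much longer than average a single frame can take
    # before it misses a 60 Hz refresh window.
    headroom = refresh_budget_ms / avg_frame_ms - 1
    print(f"{avg_fps} fps average -> {avg_frame_ms:.1f} ms/frame, "
          f"{headroom:.0%} headroom before a dropped refresh")

# 60 fps average leaves 0% headroom, 75 fps ~25%, 90 fps ~50%,
# which is roughly why ~90 fps average tends to ride out the dips.
```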

Seeing as you can get a Vega 64 for just above $400, an $800 Crossfire setup is actually competitive with, say, an RTX 2080. Of course, you may find issues in specific games, but on average I think it would be a win when you factor in a FreeSync monitor or TV.
 
I have FreeSync, I have Gsync, I have VRR, and even an old nVidia 3D monitor with the added-in 144Hz module, for those of you that have any kind of history with the PC. Guess what, I don't see, feel, or detect any difference between any of them. But then again, I run the latest hardware. It's not only me, but also the demonstrations I will sometimes do when my friends come over to prove to them there is not much difference between all of them. I just did a Gsync test with a Dell Gaming S2716DGR monitor I scored off eBay for $150 a few months ago against my new NU8000 ... No one (6 people, 2 of them knowing what Gsync was) could see any duped frames, dropped frames, half frames, any type of BS frame you can think of; we could not see any difference between the two. Everything was nice and buttery smooth at 120Hz. Period. Let's get crystal clear about that right now.

So yes, I do get tired of hearing all this crap about one monitor tech vs another monitor tech. It gets old; no one ever has any video proof. All I get is numbers this and explanations that. At the end of the day, it's all jibber jabber in my ear and it's confusing to people that are trying to buy the best they can with the limited amount of money they can afford. No doubt there are technical differences between them. We get it. Enough. Can people see and feel the difference? Maybe, but I can't. My friends can't. Lots of people on Amazon can't. Reviewers can't. Can someone link me video reviews showing vsync sucking ass at 120Hz vs Gsync at 120/144Hz with the latest hardware pushing all of these tests?

I am totally convinced that many of you here would suggest that some kid go out and buy the nVidia 65" at $3K to $4K instead of a Samsung that can basically do the same thing. Again, please do not throw meaningless numbers at me. Show me video proof and expert opinion on 120Hz vs 144Hz, Gsync, FreeSync, VRR, vsync, etc etc etc. And these expert(s) better be pretty damn convincing as I have a god mode BS filter. In fact, it's so good, I know ahead of time before anyone even starts their line of BS.

"Oh, you're gonna drop frames!" ... really, ok, I dropped 1 frame, skipped 2 frames, duped 3 frames, who f'ing cares? I have 114 more where those came from and I have another 120 frames the following second and the second after that. So, your point?

I made this post to highlight that you don't need to wait and you don't need to spend all that money. A real solution is here right now, and for cheap, if you want a large screen gaming solution. Microcenter just marked their 55" NU8000 down to $749. Also, Linus Tech Tips says 120Hz VRR could make it into 2019 Samsungs, which would be pretty damn sweet.
 
SixFootDuo, I agree, but I also disagree. G-Sync/FreeSync is one of the biggest developments of our time in terms of PC gaming (HDR is also a game changer, but it's very early right now).

Assuming you are hitting your refresh rate, V-Sync is more or less fine visually. There is input lag, sure, but for single-player games, or even online non-twitch shooters, I would posit this doesn't matter and is acceptable.

However, in many cases you will not be hitting your refresh rate, particularly with high refresh displays, and that is where G-Sync/FreeSync is a huge improvement.
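
To put a rough picture on it, here's a tiny sketch of my own (a simplified model: a steady 20 ms render time on a 60 Hz panel, with frames allowed to queue as in triple-buffered V-Sync; strict double buffering would simply lock to 30 fps, which is worse):

```python
import math

REFRESH_HZ = 60.0
INTERVAL_MS = 1000.0 / REFRESH_HZ        # 16.7 ms between refreshes at 60 Hz

def on_screen_times(render_times_ms, fixed_refresh=True):
    """Times (ms) at which each finished frame actually appears on screen."""
    finish, shown = 0.0, []
    for rt in render_times_ms:
        finish += rt
        if fixed_refresh:
            # V-Sync: the frame waits for the next fixed refresh boundary.
            shown.append(math.ceil(finish / INTERVAL_MS) * INTERVAL_MS)
        else:
            # VRR / FreeSync / G-Sync: the panel refreshes when the frame is ready.
            shown.append(finish)
    return shown

frames = [20.0] * 10                     # a steady 20 ms per frame, i.e. 50 fps

for label, fixed in (("v-sync @ 60 Hz", True), ("VRR", False)):
    t = on_screen_times(frames, fixed)
    print(label, [round(b - a, 1) for a, b in zip(t, t[1:])])

# V-Sync shows mostly 16.7 ms steps with a periodic 33.3 ms hitch (a repeated
# refresh), even though the GPU is perfectly steady; VRR shows an even 20 ms
# cadence, i.e. a smooth 50 Hz with no judder and no tearing.
```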

The average person might not notice, just as average gamers can play on consoles at 30 fps and not notice anything is wrong. But to a connoisseur of PC gaming, it's easy to tell the difference.

Everyone is different, and some people notice some things over others. For me, I cannot stand screen tearing. It completely ruins the game for me. There are other people who can't stand lag and have no issue with V-Sync off. Still others think 144Hz is bogus and are fine at 60Hz.

So it's very difficult for me to say there is not a difference because some people don't notice a difference. Those are two very different things.
 
Why so dismissive?
There are differences with how much of the intended colour palette can be displayed.
Some colours are not the correct shade because the display cannot properly show them, even when calibrated.
Wider gamut does 'not' mean exaggerated.

Wider gamut often means exaggerated when it's used to display content that is made for sRGB. That saturation is what makes people think sRGB looks "bland" or "missing colors" when in reality it is actually a more accurate representation of the source material. You are right, wide gamut does not have to mean exaggerated colors, but in the original post's context that was exactly what the poster was after. TVs typically come with this kind of setting right out of the box so that it looks impressive in a store. I think people should just be more educated about color accuracy. Lots of folks use default settings which have excessive sharpening and color saturation.
 
Wider gamut often means exaggerated when it's used to display content that is made for sRGB. That saturation is what makes people think sRGB looks "bland" or "missing colors" when in reality it is actually a more accurate representation of the source material. You are right, wide gamut does not have to mean exaggerated colors, but in the original post's context that was exactly what the poster was after. TVs typically come with this kind of setting right out of the box so that it looks impressive in a store. I think people should just be more educated about color accuracy. Lots of folks use default settings which have excessive sharpening and color saturation.
Wider gamut does not "often mean exaggerated"; it means more of the colour spectrum.
Rather than calling him out for something you have no idea is correct, I would have questioned him to be sure, if it's really needed.
My take: assume he can see the difference between bland colour displays and those with a wider gamut, and prefers the latter. I do, as would most people.
No need to bury someone for enjoying what today's displays bring in quality improvements.
These new QLED displays have very wide gamut and colour volume (wide gamut throughout the brightness range).
They are exceptionally good.
 
Wider gamut does not "often mean exaggerated"; it means more of the colour spectrum.
Rather than calling him out for something you have no idea is correct, I would have questioned him to be sure, if it's really needed.
My take: assume he can see the difference between bland colour displays and those with a wider gamut, and prefers the latter. I do, as would most people.
No need to bury someone for enjoying what today's displays bring in quality improvements.
These new QLED displays have very wide gamut and colour volume (wide gamut throughout the brightness range).
They are exceptionally good.

The phrase "wide gamut" has become extremely diluted in its use. True 10-bit monitors calibrated for the Adobe RGB or DCI-P3 colorspace will often look oversaturated when the computer tries to display sRGB content, since working with a true wide gamut monitor requires a native end-to-end workflow. Most non-professional desktop software, games, web-browsers and the like aren't color managed for wide gamut and will assume that the display is sRGB by default, which causes the oversaturation. Some wide gamut monitors will let you limit the colorspace to sRGB through their settings, but there are also many that don't have this function; others have sRGB as an option but don't offer separate calibration for that colorspace.
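
To make the oversaturation concrete, here's a quick sketch using the published sRGB and Display P3 primary chromaticities (just illustrative numbers, not a colour-managed pipeline):

```python
import math

# Published CIE xy chromaticities (standard values, not measurements
# of any particular panel).
D65_WHITE = (0.3127, 0.3290)
PRIMARIES = {
    "sRGB red":         (0.640, 0.330),
    "Display P3 red":   (0.680, 0.320),
    "sRGB green":       (0.300, 0.600),
    "Display P3 green": (0.265, 0.690),
}

def saturation_proxy(xy):
    """Crude proxy for saturation: distance from the D65 white point in xy."""
    return math.dist(xy, D65_WHITE)

for name, xy in PRIMARIES.items():
    print(f"{name:>17}: {saturation_proxy(xy):.3f} away from white")

# An unmanaged "100% green" (0,255,0) authored for sRGB gets drawn at the
# panel's native green primary, which sits noticeably further from white,
# so the colour comes out more saturated than the content intended.
```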
 
The phrase "wide gamut" has become extremely diluted in its use. True 10-bit monitors calibrated for the Adobe RGB or DCI-P3 colorspace will often look oversaturated when the computer tries to display sRGB content, since working with a true wide gamut monitor requires a native end-to-end workflow.
I don't have this issue; which displays does it often occur on?
Most non-professional desktop software, games, web-browsers and the like aren't color managed for wide gamut and will assume that the display is sRGB by default, which causes the oversaturation.
Look at the TVs on offer these days, no such issue.
Some wide gamut monitors will let you limit the colorspace to sRGB through their settings, but there are also many that don't have this function; others have sRGB as an option but don't offer separate calibration for that colorspace.
Not an issue on my TV.
 
Interesting.
I don't have this issue; which displays are those?

Look at the TVs on offer these days, no such issue.

Not an issue on mine.

Any wide gamut panel that displays sRGB content without color management; it's simply a by-product of displaying sRGB content on an A-RGB/P3 display. It's not as though this is invented; it's very well documented. (Here's a good article)

Your television may be automatically switching to sRGB mode for relevant content.
 
Any wide gamut panel that displays sRGB content without color management; it's simply a by-product of displaying sRGB content on an A-RGB/P3 display. It's not as though this is invented; it's very well documented. (Here's a good article)
No doubt but what relevance does that have here?
Did you do due diligence before posting?
 
No doubt but what relevance does that have here?
Did you do due diligence before posting?

As I understood it this was the problem that kasakka was referring to and I was clarifying what I believed to be his intended point about oversaturation.
 
As I understood it this was the problem that kasakka was referring to and I was clarifying what I believed to be his intended point about oversaturation.
The only time I encountered oversaturation with this TV is when Square Enix allowed SotTR to display HDR colour in DX11 without BT2020 enabled.
This was a bug.
 
The only time I encountered oversaturation with this TV is when Square Enix allowed SotTR to display HDR colour in DX11 without BT2020 enabled.
This was a bug.

Interesting, what tool did you use for calibration?
 
Bumping this.
Since Nvidia will support Freesync, will these TVs now become better for 4k/60hz than 1440p/120hz?
 
Bumping this.
Since Nvidia will support Freesync, will these TVs now become better for 4k/60hz than 1440p/120hz?
Whether it's good enough for you with your display, or good enough at all, cannot be determined until it is tried, unless Nvidia has already certified it.
User results should start coming in after the 15th.
 
I have FreeSync, I have Gsync, I have VRR, and even an old nVidia 3D monitor with the added-in 144Hz module, for those of you that have any kind of history with the PC. Guess what, I don't see, feel, or detect any difference between any of them. But then again, I run the latest hardware. It's not only me, but also the demonstrations I will sometimes do when my friends come over to prove to them there is not much difference between all of them. I just did a Gsync test with a Dell Gaming S2716DGR monitor I scored off eBay for $150 a few months ago against my new NU8000 ... No one (6 people, 2 of them knowing what Gsync was) could see any duped frames, dropped frames, half frames, any type of BS frame you can think of; we could not see any difference between the two. Everything was nice and buttery smooth at 120Hz. Period. Let's get crystal clear about that right now.

So yes, I do get tired of hearing all this crap about one monitor tech vs another monitor tech. It gets old; no one ever has any video proof. All I get is numbers this and explanations that. At the end of the day, it's all jibber jabber in my ear and it's confusing to people that are trying to buy the best they can with the limited amount of money they can afford. No doubt there are technical differences between them. We get it. Enough. Can people see and feel the difference? Maybe, but I can't. My friends can't. Lots of people on Amazon can't. Reviewers can't. Can someone link me video reviews showing vsync sucking ass at 120Hz vs Gsync at 120/144Hz with the latest hardware pushing all of these tests?

I am totally convinced that many of you here would suggest that some kid go out and buy the nVidia 65" at $3K to $4K instead of a Samsung that can basically do the same thing. Again, please do not throw meaningless numbers at me. Show me video proof and expert opinion on 120Hz vs 144Hz, Gsync, FreeSync, VRR, vsync, etc etc etc. And these expert(s) better be pretty damn convincing as I have a god mode BS filter. In fact, it's so good, I know ahead of time before anyone even starts their line of BS.

"Oh, you're gonna drop frames!" ... really, ok, I dropped 1 frame, skipped 2 frames, duped 3 frames, who f'ing cares? I have 114 more where those came from and I have another 120 frames the following second and the second after that. So, your point?

I made this post to highlight that you don't need to wait and you don't need to spend all that money. A real solution is here right now, and for cheap, if you want a large screen gaming solution. Microcenter just marked their 55" NU8000 down to $749. Also, Linus Tech Tips says 120Hz VRR could make it into 2019 Samsungs, which would be pretty damn sweet.

Every frame actually matters with emulators, so, *nuke*.
 
On the 15th, nVidia will be releasing their "Freesync" drivers. Should be cool. Don't really think I need them as I am already over-the-top buttery smooth, but I will def be checking them out.

Loving this Samsung BFGD ... simply amazing. And at 1/5th the cost of the nVidia BFGD ... mind-blowingly delicious.

I've literally had 2 other friends buy this display after they saw mine.

Insane.
 
On the 15th, nVidia will be releasing their "Freesync" drivers. Should be cool. Don't really think I need them as I am already over-the-top buttery smooth, but I will def be checking them out.

Loving this Samsung BFGD ... simply amazing. And at 1/5th the cost of the nVidia BFGD ... mind-blowingly delicious.

I've literally had 2 other friends buy this display after they saw mine.

Insane.
Still think 55” is too big for a desktop. If there was a 43” model or even a 48” I’d consider it.
 
Still think 55” is too big for a desktop. If there was a 43” model or even a 48” I’d consider it.
Yeah. I bought a Samsung 55" and it's too big. I like it, but really 40" or 43" would be perfect.
 
Yeah. I bought a Samsung 55" and it's too big. I like it, but really 40" or 43" would be perfect.
Asus is showing their XG438Q 4k 120Hz display at CES and it will come with Freesync 2. It looks to be exactly the same panel as the Wasabi Mango, so hopefully Wasabi will enable Freesync on theirs and it will be cheaper. I would bet the XG438Q will come in >$1500US.
 
Asus is showing their XG438Q 4k 120Hz display at CES and it will come with Freesync 2. It looks to be exactly the same panel as the Wasabi Mango, so hopefully Wasabi will enable Freesync on theirs and it will be cheaper. I would bet the XG438Q will come in >$1500US.

Gotta wait for next gen for 4K/120Hz, maybe not even then.
I love 4K, but currently 60Hz is where it's at, even with a 2080 Ti.
I simply refuse to sacrifice image quality for refresh rate; this is why everyone goes 4K instead of 1080p/1440p.
 
So, now that Nvidia is supporting VRR, I'm really looking at upgrading my screen again.

It's really tough to tell which models have VRR though, and what frequency ranges they support.

Don't get me wrong. My JS9000 still looks amazing, but I would LOVE to have VRR.
 
I simply refuse to sacrifice image quality for refresh rate; this is why everyone goes 4K instead of 1080p/1440p.

Here maybe, but I hang out on a few other groups as well, and there everyone is all about maxing out refresh rate, and they complain when they can't hit 144Hz on their 144Hz screens. They disable shadows, turn down all of their effects, run at lower resolutions, etc. etc., because everyone wants to be the next competitive streamer, and there is a perceived advantage to the fraction-of-a-ms improvement in reaction time this provides.

It's really kind of nuts. :(
 
I went from 1440p to 4K and I was liking the picture quality but performance was a constant struggle. I think I spent more time messing with settings trying to get 60fps than playing the games.

Now I'm on 1080p ultrawide, I can safely max out pretty much any game and still get in the 100fps+ range, and many games can get the full 166fps.
 
Here maybe, but I hang out on a few other groups as well, and there everyone is all about maxing out refresh rate, and they complain when they can't hit 144Hz on their 144Hz screens. They disable shadows, turn down all of their effects, run at lower resolutions, etc. etc., because everyone wants to be the next competitive streamer, and there is a perceived advantage to the fraction-of-a-ms improvement in reaction time this provides.

It's really kind of nuts. :(

I haven't noticed much difference in fluidity going from 4K/60Hz to 1440p/120Hz, but I did notice quite a big image quality downgrade.
I guess those competitive gamers want that 0.5% edge over other players as it can make a (tiny) difference, but when people go 4k and then start lowering graphical details to increase refresh rate, that's what triggers me :)
There is, however, some refresh rate craze going on. I don't understand it either. Maybe I'm too old and can't see it, you know, like when all those kids can hear a certain frequency that people over 30 can't, so they use that tone for their phone ringtone in class XD
 
Night and day difference going from 60hz to 120hz .... amazing amazing .... amazing.
 
Same here.
I had been struggling to qualify well in a race in Project Cars 2.
The increase in refresh rate seems to affect how quickly the car responds as well as how it looks on screen, because making micro-adjustments to the steering has a much better effect.
I qualified easily once I changed to 120Hz. I don't think it's down to reduced lag, because I tried for months and was used to the lag.
Trying 60Hz again, the same difficulty returned.
Other games are smoother too.
1440p 120Hz on these screens is perfect for a 1080ti.
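
For what it's worth, the raw display-side numbers look like this (a rough sketch of my own that only counts the refresh interval, not the game engine or input pipeline):

```python
# Rough display-side timing only; game engine, input polling and render time
# are unchanged by the panel, so this is just the refresh contribution.
for hz in (60, 120):
    interval_ms = 1000.0 / hz       # gap between refreshes
    avg_wait_ms = interval_ms / 2   # average wait before a finished frame is scanned out
    print(f"{hz:>3} Hz: {interval_ms:.1f} ms per refresh, "
          f"~{avg_wait_ms:.1f} ms average wait for the next refresh")

# 60 Hz: 16.7 ms per refresh (~8.3 ms average wait)
# 120 Hz: 8.3 ms per refresh (~4.2 ms average wait), plus twice as many
# on-screen updates for the same steering input.
```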
 
I haven't noticed much difference in fluidity going from 4K/60Hz to 1440p/120Hz, but I did notice quite a big image quality downgrade.
I guess those competitive gamers want that 0.5% edge over other players as it can make a (tiny) difference, but when people go 4k and then start lowering graphical details to increase refresh rate, that's what triggers me :)
There is, however, some refresh rate craze going on. I don't understand it either. Maybe I'm too old and can't see it, you know, like when all those kids can hear a certain frequency that people over 30 can't, so they use that tone for their phone ringtone in class XD


I'm with you. I feel very similar. I feel there is a small improvement above 60fps, but you hit diminishing returns rather quickly.

Most of the difference is placebo.
 
I've been loving my 75 inch Samsung NU8000 I picked up on Black Friday. Really excited to see what VRR looks like in action. Personally, the biggest wow factor of my TV came a week later when I finally decided to try 1440p/120. I much prefer that right now to 4K gaming.

I've been playing Grid 2 again because it's easy to run at 4K and it was a good comparison to see 4K/60 vs 1440p/120. It's a night and day difference in smoothness. Easily my favorite upgrade since my first 144Hz monitor. Monster Hunter at 1440p/120 is pretty wonderful looking as well. It's been fun showing my PC friends and seeing them have the same holy shit reaction.

Oh, I just wanted to add this since I've been seeing people group 1080p and 1440p together. At 75 inches, about 7-8 feet away, 1440p looked much cleaner than 1080p. It made me a believer in this resolution. I never did the math before and didn't realize how many more pixels it has over 1080p. To my eyes, in motion it seems closer to 4K than it does to 1080p.
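
For anyone else who never did the math, a quick sketch:

```python
# Quick pixel math for the resolutions being compared.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixel_counts = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixel_counts.items():
    print(f"{name}: {count:,} pixels ({count / pixel_counts['1080p']:.2f}x 1080p)")

# 1080p: 2,073,600 (1.00x), 1440p: 3,686,400 (1.78x), 4K: 8,294,400 (4.00x),
# so 1440p has ~78% more pixels than 1080p, while 4K is another 2.25x on top.
```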
 