Discussion in 'Displays' started by Vega, Apr 26, 2019.
Okay, so I've decided that I'm entirely done with waiting for this to come out. On top of that, I don't even run an Nvidia card.
Do any other high refresh 38" options exist? All the ones I've found are low refresh, productivity monitors.
No. We'll probably see something next year though; even if not from LG, some other company will likely use the panel to make a FreeSync version.
Bummer. I can’t help but feel like I’m stuck in a state of perpetually waiting for mediocrity (when it comes to monitors, at least).
That's a great way to put it.
There isn't going to be a truly exceptional computer monitor until we finally get to MicroLED. Backlights are always going to be a problem and image retention is too much of an issue with OLED.
Maybe in 2030 we'll have the perfect monitor; until then, every monitor has drawbacks along with its good features. For example, there is no 4K ultrawide gaming monitor. Nano IPS is it for now; OLED is a 2030 thing, unless you want it now for a couple grand at 20" lol...
Yes, as of now the only 38" gaming monitor with 144 Hz (175 Hz overclocked), Nano IPS, 1 ms, 450 nits, and 3840x1600 is the LG... Later on, other makers will get their hands on panels as large and ultrawide as the LG's. Those will cost less than the LG, and by that time the LG 38" gaming monitor will be a couple hundred less too, if you can wait... Some can't wait lol.
When I originally purchased my Asus PG278Q's on release in 2014 I only expected them to last a year, maybe 2 until something vastly superior replaced them. I was very disappointed to step down to a 27" 1440p monitor from the 30" 1600p monitor I had been using for many years prior, but gsync and high refresh rate made it worthwhile.
This LG monitor, should it ever come out, is the first monitor that would be a worthwhile upgrade. I'm sick of waiting endlessly to upgrade; every year I get older and blinder. By the time the magical MicroLED super monitor exists, I will be too old and blind to notice or care.
My ZR30w 30" 2560 x 1600 monitor is still in use!
I went the same route and actually preferred the slightly smaller size and sharper PPI of the PG278Q. I felt the 30" 16:10 2560x1600 monitors were just a bit too large for their resolution. I got tired of waiting so I went with the Samsung CRG9 and that should tide me over until something truly better comes along. At this rate it needs to be a 38-43" 5120x2160 120+ Hz monitor with FALD/Mini-LED/dual layer LCD backlight which I really don't expect to see for years. If we ever get some decent 40"-ish 4K 16:9 displays those are going to be decent alternatives too.
The LG is just barely larger vertically with about the same PPI as the CRG9. Overall it's just not a good year for buying a large high refresh monitor due to the lack of truly good options. We can only hope next year turns out better.
I feel ya guys. Also tired of 'wait until next year' syndrome.
I just bought 3x PG279QZs (the IPS version) and as much as I like them, I've been running NV Surround for over 8 years but Guild Wars 2 was the last truly viable NV Surround game for me and I just don't play it any more.
I'd play other games in surround if I could push them (I can get 60fps on low settings in Modern Warfare, but it's just not enough).
I also got them to do work, but I have endless trouble getting my work laptop to run on all 3 screens (even though the dock supports it) and at the end of the day, 3x 27 is just too wide and it's a lot of inputs to manage and a lot of head turning.
The CRG9 seemed like the solution, but I didn't want VA and I do want G-Sync. Plus, most games put your HUD/instruments against the left and right edges of the screen, which would be terrible on a 32:9.
The 34" Alienware ran great, but the backlight bleed was just unacceptable and they wouldn't fix it. Plus, 120 Hz just doesn't feel like enough.
This LG checks all the boxes for me: not too wide, has G-Sync, and a good aspect ratio and refresh rate. While it will probably run decently on a 1080 Ti, I'm sure it will scream on a 3080 Ti.
I'll probably lose a few hundred bucks selling off the Asus panels, but I think it will be worth it, as long as the LG isn't a lemon. Some of these hyped wonder screens end up having major limitations or issues.
This monitor is now available for preorder from B&H. They don't have an "expected by" date anymore, but the fact that preorders are available seems to imply that they're shipping soon.
Preorders were available back in early September too. I don't think it implies anything.
Yeah I saw on a foreign site it listed ETA as 1st quarter of 2020. Manufacturing delays or defects lead me to think this product may never come to market...
What a bummer... They announced the monitor in June, it's December, and it's still not out... sigh. Also, B&H had the price at $2,000, then changed it to $1,796, then a couple days ago it went to $1,999, and now it's back to $1,796. WTF is going on? I ain't paying $2k for a monitor. $1,800 also seems a bit expensive for this panel. It should be $1,500, and it will be next year, since monitor prices usually fall a couple hundred within 3 to 8 months of release.
Uh? Why on earth would you think it wouldn't come to market? This is just what always happens. You may not have been aware, but virtually every single flagship monitor release for the past few years has been through an almost identical process... big hype announcement, release date withheld, later announced, then delay after delay after delay. I'd be amazed if this DIDN'T happen!
$1800 seems reasonable given the current market.
It has no real competition but there are several top tier monitors in the 2000+ range
We are almost certain to see a new crop of gaming monitors and updates on HDMI 2.1 gaming friendly TVs, monitors (and hopefully GPUs) at CES in January. That’s where many of the current crop of monitors were first announced, several of which have not yet been released.
And then the circle of lif.. err I mean waiting begins anew.
The question is if we will see any 4K 120 Hz models that are smaller. 40" 16:9 would be about the same horizontal width but taller, that's unlikely to happen but maybe a 43" screen. Then it comes down to whether they will skimp on specs on these smaller displays or not.
I currently have the Acer 38" ultrawide 4K display and it's the perfect size (I tried pretty much all the other sizes and this is the only one I truly love). The only problem is that it's FreeSync at 75 Hz.
I've been running Nvidia cards for 15+ years, and this new display with G-Sync, 1 ms, and 144 Hz sounds like it's all I'll ever need. Already put in a preorder at B&H and can't wait for it.
LG reps mentioned back in May that the release date for this monitor was tracking for a July/August 2019 time frame. It's now November 2019 and there's no product in sight. I have a feeling that LG is having some type of problem manufacturing the monitor.
Why do you think that? EVERY top tier monitor release gets delayed. It may very well be production delay due to unforeseen technical issues, but this is nothing new. I can't think of a single high end monitor the past few years that's actually released when the manufacturer said it would, often wayyyyy later.
Yup for any new high end monitor just take the promised release date and add a year to be safe.
Yeah, add at least 1 year. At least. Some might not show up for 2 years. I guess they do this to build up hype while they develop and build the unit. It's not nice to the consumer, really, IMO, but it is what it is. I don't see the manufacturers changing how and why they do this; nothing prevents or forces them to change this pattern, so it will continue.
People tend to keep monitors a long time and don’t buy them that often. So it’s in the manufacturer’s best interest to generate hype and hope folks wait instead of buying a competing product.
There has yet to be a 4K ultrawide at 144 Hz. They are all 60, 70, or 75 Hz, which makes them productivity monitors for desktop work and whatnot, not gaming monitors.
I assure you that a 60Hz monitor can absolutely be used for gaming.
I'm very over waiting now. Been wanting a good monitor for years now. I bought an xg2703-gs to hold me over until the high refresh 4k monitors came out because they were taking so long. They had so many issues at launch I just kept it and have been waiting for this for the best part of a year. My desk is an absolute shambles and I hate having dual 27's because I need one screen directly in front of me for ergonomics and turning my head so much for the second screen is just a pain.
I need a high refresh ultrawide so I can play games, plus a gently curved (large-radius) 49" screen to go below it, as I use the second screen for notes when working. I could have gotten the 49" as a high refresh screen to hold me over, but the Samsung 49" is too curved, and the Dell is the perfect size but 60 Hz.
I'm increasingly tempted by the X35, but I just wanted that extra bit of resolution. Releasing a monitor right around CES is fooking dumb, as no doubt there will be an HDR 1000 screen announced that meets my needs and I'll end up waiting another 12+ months.
Waahhhh waaah wahhh
Can. Can I ever go back after being at 144Hz? Don't think so.
Going from 10 years at 1920x1200@60Hz to 1920x1080@144Hz felt like a better upgrade than going from an i5 2300 to a Ryzen 5 2600 (all on a GTX 1070).
But yeah, my youngest is still fragging on one of my old HP 24" ZRW whatevers. He doesn't know any better.
Is it all refresh rate though...and not more the inclusion of adaptive sync making the difference? I haven't thought to try turning off Freesync features but keep a high refresh rate to see what the eyes think. Anyone tried that?
I do it daily
I think it depends. All of this "can't go back" stuff is largely hyperbole. I owned one of the early 120 Hz monitors for a while. It was pretty awesome at the time and I enjoyed the smoother motion in games and on the desktop. But once I went 40"+ 4K, I was hooked on the immersion, and really, 60 Hz doesn't feel as awful as everyone makes it out to be. It wasn't long ago that consoles were locked at 30 fps and 60 fps was perfectly acceptable (it still is).
Having said that, I totally get that you don't want to go back, just as I don't want to go back to using a hard drive after experiencing an SSD, and don't want to use a 27"/32"/34" monitor after having a large one that sucks me into games, and don't want to use LCDs with all of their flaws after having an OLED. So, I get it...but yes, we could go back. It just wouldn't be our ideal experience.
When I come from playing Bloodborne on PS4 with its slow, page-y 30 fps animation to a solid 60 fps in Dark Souls 3 on PC, it's a huge and welcome difference, so there is that. Of course, games capable of a 100+ fps average on a 120 Hz or higher monitor are better, as long as you're willing to dial the graphics settings down as necessary (to something like very high, very high+, or ultra-minus) in order to "turn up" the motion benefits: appreciable gains in motion clarity (blur reduction) and motion definition (more animation cells in the flip book flipping faster, smoother movement, more dots per dotted line of motion pathing, etc.) even on the most demanding games at native resolution.
100 fps solid (not average) is the point where you get 40% blur reduction and 5:3 the frames of motion definition compared to 60 fps solid, at 60 Hz or higher. 120 fps solid (not average) is the point where you get ~50% blur reduction and 2:1 (double) the motion definition compared to 60 fps. So: 100 fps solid gives 40% and 5:3; 120 fps solid gives 50% and 2:1. But people are lucky to get a 100 fps average after tweaking their settings on some games at higher resolutions, so the graph looks more like 70 - 100 - 130 fps even then. There are of course other games that are easier to render, where high frame rates come easily. Just remember that without supplying frame rates high enough to fill those Hz, the higher refresh rate is pretty meaningless.
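The ratios quoted above fall straight out of sample-and-hold persistence arithmetic: each frame is held on screen for 1/fps seconds, so blur shrinks in proportion to frame rate. A quick Python sketch (the helper names are mine, just restating the post's math):

```python
# Sample-and-hold persistence: each frame is displayed for 1/fps seconds,
# so perceived blur relative to a 60 fps baseline shrinks by 1 - 60/fps.
# These helpers are my own naming, not from any library.

def blur_reduction_vs_60fps(fps: float) -> float:
    """Fractional blur reduction versus a solid 60 fps baseline."""
    return 1 - 60 / fps

def motion_definition_ratio(fps: float) -> float:
    """Unique frames shown per 60 fps frame (flip-book density)."""
    return fps / 60

print(blur_reduction_vs_60fps(100))  # 0.4  -> the ~40% figure
print(blur_reduction_vs_60fps(120))  # 0.5  -> the ~50% figure
print(motion_definition_ratio(100))  # ~1.67 -> the 5:3 ratio
print(motion_definition_ratio(120))  # 2.0  -> the 2:1 ratio
```

This only models steady frame rates, which is exactly why the post stresses "solid, not average."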
Variable refresh rate technologies let you smoothly traverse frame rate potholes and jarring frame rate changes, and can even help avoid tearing in some badly behaved game engines that would otherwise tear. Normally, though, tearing happens when your frame rate goes above the monitor's max Hz, which can happen even in demanding games: with a 100 fps average dialed in on a 120 Hz monitor, the graph may run something like 70 - 100 - 130, with a few potholes and spikes outside that range. VRR tech typically reverts to v-sync above the monitor's max Hz to prevent tearing, so unless you cap your frame rate you'll end up with either v-sync's input lag or some tearing again. That's why it's recommended to cap your frame rate, either with an in-game limiter if one is available, or with RivaTuner, which adds essentially no lag (the Nvidia driver's frame rate cap adds a little input lag). The usual recommendation with G-Sync is to cap 3 fps under the monitor's max Hz for some leeway. I've read that "G-Sync" VRR from Nvidia cards on FreeSync or HDMI VRR monitors requires a larger margin than that -3 to achieve the same results.
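The capping rule of thumb above is mechanical enough to sketch (the 3 fps margin is the commonly cited community recommendation the post mentions, not an NVIDIA-documented spec, and the helper name is mine):

```python
# Frame cap rule of thumb for keeping VRR engaged below the ceiling.
# Default margin of 3 fps is the common G-Sync recommendation; the post
# notes FreeSync/HDMI-VRR on Nvidia cards may want a bigger margin.

def recommended_cap(max_hz: int, margin: int = 3) -> int:
    """Frame rate cap that avoids hitting v-sync (or tearing) at max Hz."""
    return max_hz - margin

for hz in (120, 144, 175):
    print(hz, "->", recommended_cap(hz))  # 120 -> 117, 144 -> 141, 175 -> 172
```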
I love cutting the viewport movement from a smearing, eye-washing blur at speed down to more of a soft blur "within the lines" and "shadow masks" of objects at a 100+ fps average, and I love the glassy smoothness of motion and pathing in those 100+ fps ranges whenever I can get them. It's going to be a lot harder to do at 4K on demanding games for now, though. I'm looking forward to 7nm HDMI 2.1 GPUs, but with every GPU generation the graphics ceiling can get blown out again, since screenshots and 30 - 60 fps YouTube videos and Twitch streams sell. The challenge for devs is to whittle a game's complexity down to real time, not the other way around, so they could easily make the "ultra slider" 2x, 3x, 10x, 1000x higher if they really wanted to.
Hopefully in the future someone will develop a really advanced interpolation/duplication that doesn't have bad input lag or artifacting so you could run 80fps x 4 or 100fps x 3 (or higher multiples) for some really high solid frame rates while still getting good motion definition.
This. I think when I say I can't go back to 60Hz, I mean, I can't go back to 60Hz WITHOUT Adaptive Sync. I don't have a 60Hz monitor WITH adaptive sync to make that judgement though.
I just know when I went to a 144Hz monitor, it felt like there was a serious hardware upgrade in my PC, when there wasn't.
So now, like a lot of guys in this thread are saying, they can't see themselves buying one of the 4K ultrawides because they're only 60/75 Hz or whatever. I think IdiotInCharge is onto something though: you CAN buy one and SHOULDN'T wait if that's the resolution you want, because it's really the adaptive sync tech making the difference when it comes to gaming and screen tearing (especially if you're coming off an older pre-FreeSync/G-Sync monitor).
Most 60 Hz VRR is something like a 48 - 60 Hz range, which isn't doing much, though I guess low framerate compensation below 48 fps would help, even if at molasses frame rates with duplicated placeholder frames. If you're playing at 60 Hz, you should really aim for a 60 fps minimum, not average, or you'll be down in the 30 - 40 fps mud part of the time, perhaps with a few potholes that bottom out. If you run a capped 57 fps minimum on 60 Hz, then VRR isn't even really needed, since you're running a solid rate, but it could still be useful for low fps compensation on the odd pothole to prevent judder/stutter. 75 Hz FreeSync is typically 48 - 75 Hz too, so yeah, it's something if you hold a 60 fps minimum or better for the most part and cap it at 72.
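The range numbers in this post can be turned into a rough classifier. A hedged sketch: the 48 - 75 window comes from the post, while the 2x min-to-max ratio gating LFC is the commonly cited rule of thumb (AMD's actual requirement is closer to 2.5x), so treat the thresholds as illustrative:

```python
# Rough VRR-window classifier using the ranges discussed above (48-75 Hz).
# The 2x ratio gate for LFC is a rule-of-thumb assumption, not vendor spec.

def vrr_state(fps: float, vrr_min: int = 48, max_hz: int = 75) -> str:
    """Classify an instantaneous frame rate against a VRR range."""
    if fps > max_hz:
        return "above range"    # v-sync lag or tearing unless capped below max Hz
    if fps >= vrr_min:
        return "in VRR range"   # refresh rate tracks frame rate
    if max_hz >= 2 * vrr_min:
        return "LFC frame doubling"  # driver repeats frames to stay in range
    return "below range"        # judder/stutter; 48-75 is too narrow for LFC

print(vrr_state(100))  # above range
print(vrr_state(60))   # in VRR range
print(vrr_state(40))   # below range
```

Note that by this rule of thumb a 48 - 60 or 48 - 75 window can't do LFC at all, which is part of why narrow 60/75 Hz VRR ranges "aren't doing much."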
I suppose you could argue that VRR does allow somewhat higher-Hz benefits, since it lets you "straddle" a range of slightly higher frame rates without hiccups from the variances, balancing slightly higher graphics settings (or a bit less dialing down) against a reasonably high frame rate range rather than a frame rate minimum, e.g. a 100 fps average (70 - 100 - 130 fps) rather than a 100 fps minimum. Using VRR to play lower frame rate ranges without hiccups is beneficial, but it isn't going to lower sample-and-hold blur or add motion definition, smoothness, or pathing articulation, whether for individual objects and characters or for the entire viewport moving the game world around you in 1st and 3rd person games. High Hz needs to be supplied with much higher frame rate ranges to actually fill those Hz; otherwise the higher ceiling is pretty meaningless.
But sure, 60 Hz is playable, especially at a 60 fps minimum rather than a 30 - 60 - 90 graph; it's just more sluggish movement-wise, without the glassy "high definition" motion pathing. 60 fps solid at 60 Hz is definitely a lot less chuggy and page-y than 30 fps console gaming's movement and animation cycles, but it still has some stagger compared to the glassy smoothness at higher rates and Hz. Sample-and-hold blur is almost like having a picture on a vibrating drill or saw table: at 60 fps/Hz the picture buzzes at high speed, smeared out and unreadable, and at 100 - 120+ fps/Hz it's like cutting that vibration in half, down to merely fuzzy. Every time you move the game world in your viewport at speed, the "vibration" turns up to high at a 60 Hz cap. It happens most when you move the viewport at speed, so it's not at max smear all the time, and overdrive helps some at lower speeds.
I play on a PS4 on a 4K TV at times, and some isometric games on a 15" 60 Hz laptop, and it's "fine"... but when I get back to my gaming rig on a high fps+Hz capable game, the movement is super glassy and the FoV movement blur is much less aggressive, down to a soften. I wouldn't plan on gaming on a 60 Hz monitor at my PC as a long-term goal, but since better isn't available yet (or isn't worth paying for yet, to me) on my other devices, I'm willing to use 60 Hz on console and laptop (and even phone) for now. The PS5 in 2020 is supposed to offer 120 Hz quasi-4K on performance-mode-enabled titles, though. I'll probably end up with a huge 55" C9 OLED, drop my half-circle desk back 4' from it, then wait on 7nm HDMI 2.1 Nvidia GPUs to get 120 Hz 4K someday. I haven't found any reliable info about running 3840x1440 8-bit 100 Hz letterboxed on one, but that would be great in the meantime if it's possible somehow. 3440x1440 8-bit 100 Hz is under the 18 Gbps HDMI 2.0b bandwidth limit. If I found out that was possible to do reliably, I'd consider buying a 55" C9 for Xmas, since they've been on sale for $1300 from authorized LG resellers lately.
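The bandwidth claim at the end checks out as back-of-envelope math. A sketch, with assumptions flagged: 14.4 Gbps is HDMI 2.0's effective payload after TMDS 8b/10b encoding, and the ~8% blanking overhead is my loose CVT reduced-blanking guess, not a computed timing:

```python
# Back-of-envelope uncompressed video bandwidth vs the HDMI 2.0b limit.
# 18 Gbps raw TMDS carries ~14.4 Gbps of pixel data after 8b/10b encoding.
HDMI_20_EFFECTIVE_GBPS = 14.4

def video_gbps(w: int, h: int, hz: int, bpc: int = 8,
               blanking: float = 1.08) -> float:
    """Rough bandwidth for RGB/4:4:4; `blanking` is a loose CVT-RB estimate."""
    return w * h * hz * blanking * (3 * bpc) / 1e9

print(round(video_gbps(3440, 1440, 100), 1))  # 12.8 -> fits HDMI 2.0b
print(round(video_gbps(3840, 1440, 100), 1))  # 14.3 -> right at the edge
```

By this estimate, 3440x1440 8-bit 100 Hz fits comfortably, while the 3840x1440 letterbox the poster asks about lands right at the effective limit, which may be why reliable reports are hard to find.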
Amazon finally has a page up for this thing.
I had the Acer X34 before moving to a 4K TV, giving up G-Sync for superior image quality. Oh yeah, and this was a few years back, so there were still a lot of games that lacked adequate support for ultrawide resolutions; a fair few were stupid black-bar nightmares, and I had to get a lot of patches etc. to get the resolutions running full screen. Has this gotten progressively better since? This is a good looking monitor, tbh, albeit expensive. It's between this and the Acer/Asus 43" 4K monitors coming out. Those aren't curved though; why the F did they give up on curved 4K screens? Samsung had it right. They're awesome for desktop use; as living room TVs they were pretty worthless.
In my experience most games nowadays support at least 21:9 res just fine. Most work fine even with 32:9 but for example Control needed to be patched for this resolution.
Where Samsung went wrong was trying to sell TVs that were curved. That makes no sense for living room use which is why nobody makes them anymore. I'm glad that the curve lives on in ultrawide desktop displays because I would not want to use my Samsung CRG9 if it was flat, looking at the left and right edge would be very awkward.
I say bring on the 5120x2160 38+" ultrawides!
I imagine that it's gotten way better on newer games, considering the rise in popularity of 21:9 over the past several years. Older games are less likely to be updated and will likely remain hit or miss, requiring patches or manual edits to config files, but I'd have to think that recent/current/future games do a better job of supporting ultrawides than they used to.
I think you just answered your own question, lol. Monitor users (where those curved TVs shined) are but a small subset of the market.
edit: ahh kasakka posted while I was trying to get my post typed while making pour over, heh