34" 21:9 UltraWide Displays (3440x1440) - LG UM95/UM65 & Dell U3415W

Hi everyone!
I've been following this thread for a while, and was wondering if any of the lucky ones who have grabbed a UM95 have tried gaming at a non-native resolution, e.g. 2560x1080. If so, is the upscaling OK? I mean, does the image look blurry?
Thanks for the information ;)

I tried several games and the interpolation doesn't look good. If you buy the UM95, be sure to have it powered by a Titan, 780 Ti, R9 290X or similar; if you have another card, buy the UM65. It doesn't make much sense to play on a UM95 with everything on low. I have the GTX 690 and ran into problems because of the 2GB of VRAM. The ideal card would be a Titan Black. If you can afford a monitor that costs 1000 euros, I think you can also get a good graphics card; if not, the UM65 :)
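
For what it's worth, the blur has a simple arithmetic explanation: 2560x1080 doesn't divide evenly into the panel's 3440x1440, so the scaler has to interpolate every pixel. A tiny sketch (plain Python, nothing monitor-specific assumed):

[CODE]
# Scaling a 2560x1080 frame onto a 3440x1440 panel.
native = (3440, 1440)
source = (2560, 1080)

sx = native[0] / float(source[0])  # 1.34375
sy = native[1] / float(source[1])  # 1.3333...
print("horizontal scale: %.5f, vertical scale: %.5f" % (sx, sy))

# Neither factor is an integer, so each source pixel has to cover
# about 1.34 panel pixels and edges land between physical pixels --
# the scaler blends neighbouring pixels, which is the blur people see.
[/CODE]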
 
Ideally everyone would have Titan Blacks :) But seriously, if you are at all budget-conscious I wouldn't spend a lot of money right now on a Titan Black or 780 Ti.

Get a 290X if you don't have a brand preference. I just moved from a 3GB 780 because I ran into a VRAM wall with Watch Dogs at 3440x1440. They are cheap right now and a good stopgap for the 20nm cards that will (presumably) come out in early 2015. Or just get a regular 290 and OC it if you want to save even more money. You really can't beat the value of AMD cards at the moment...
 

Watch Dogs is really not a good example of VRAM usage; it's a console port that is not very well optimized. Any card with 3GB should be fine, and there's no need to run crazy AA at such a high resolution. There are points of diminishing returns. I mean, if money is no object then sure, by all means get a Titan Black.

But I do agree that 290Xs are a good buy on eBay right now, even if they were used for mining. I would be all over the 290X, but I really dislike AMD's software/drivers and prefer SLI for multi-GPU setups.
 
Really? I have two GTX 690s but am currently only running one in my system, and I game at 5760x1080 without running into too many problems with the VRAM. Of course I couldn't always max things out, but I could at least run high settings with FXAA turned on and get 80-100 fps (except in Crysis). When I was running quad 690s I was getting about 170 fps with everything cranked up in Battlefield 3, but that game scaled unusually well. I guess as long as you're OK with turning shadows and post-processing down to high or medium and running FXAA, VRAM should not be an issue at this resolution.
Most cards run out of juice before they hit their VRAM limit. That's why, if you look at benchmarks between a Titan, a GTX 690, and a GTX 780, the GTX 690 wins: it is faster even though it has less VRAM.
 
Yeah, a dual GPU in the higher echelon usually wins vs. a single.

Some games use the maximum RAM available no matter how much you have. Watch Dogs, whether through poor optimization or not, is reported to start hitting a 3GB VRAM ceiling. I don't intend to buy it until it drops considerably in a Steam sale some months from now, and by that time I'll be looking toward running 800-series cards in SLI, so I can see what VRAM amounts are on cards then. And no, I'm not going to pirate Watch Dogs in order to play it now; I don't mind waiting. I'm already waiting on the PG278Q and the 800 series in the months ahead, so all in good time, I guess. I'm still eyeing the 21:9 LG as a desktop/app monitor though, especially if the PG278Q ends up being problematic for early adopters. Maybe the 21:9 LG will drop in price eventually too.

I really hate dropping shadows and dynamic shadows in a game, as well as view distance and the number of animated objects/creatures visible in the distance, since I prefer to play game worlds as opposed to compressed arenas. I think the immersive, "3D world in a bottle" effect of shadows on a game world is underappreciated. Even though it isn't a level/amount setting, animation quality in games is also unappreciated, since it doesn't show up in screenshots. I'd rather drop AA down personally if I have to choose on a high-res monitor.

Finally, I often like to remind people that the graphics ceiling, i.e. "ultra" settings, is completely arbitrary on the devs' part and could be drastically more demanding: easily orders of magnitude more detailed geometry and textures, FX, etc. The challenge for game devs is to whittle a game down to "fit" real-time gaming limits on the current generation's hardware, not the other way around.
 
For "next gen" games like Watch Dogs, your 690s loses by a large margin compare to a single 780/titan. Even at 1080p, the game requires 3GB of VRAM for Ultra textures and that's with no AA. Many users on 3GB cards are still running out of VRAM, and there are some 690 users reporting that they can't even run the game on medium.

Now that the new consoles have way more accessible VRAM, even 3GB cards won't be able to max some games at 1080p because they will be coded to utilize it much more than they have in the past. Now on 3440x1440, you ideally are going to want around 6GB of VRAM if you want to max some of the upcoming games like Watch Dogs... Sorry to say, your 2GB is going to be no where near enough in the near future. :(

Good news is that Maxwell should hopefully have 6-8GB of VRAM, and i'd imagine a single GTX 880 would not be much different in sheer performance than 2x 690s due to quad SLI scaling limitations.
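
As a rough sanity check on these numbers, the render targets themselves are only a small slice of VRAM; it's the texture assets that swallow multiple gigabytes. A back-of-the-envelope sketch (the five-buffer count is an illustrative assumption, not a measurement from any particular game):

[CODE]
# Rough framebuffer math: width x height x 4 bytes (RGBA) per buffer.
def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2.0 ** 20

for w, h in [(1920, 1080), (3440, 1440)]:
    one = buffer_mib(w, h)
    # Assume ~5 full-size targets (G-buffer layers, depth, back buffer).
    print("%dx%d: one buffer ~%.0f MiB, five buffers ~%.0f MiB"
          % (w, h, one, 5 * one))

# 3440x1440 has ~2.4x the pixels of 1080p, yet even five full-size
# buffers total under 100 MiB. The multi-GB figures quoted for Watch
# Dogs are dominated by texture quality, which is why Ultra textures
# blow past 2-3GB cards even at 1080p.
[/CODE]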
 
Hey guys... hoping you can help me out here. I am trying to calibrate my monitor with True Color Finder. When I first got the monitor it detected that it was hooked up via USB to my computer and the True Color Finder software was working, but I didn't have a calibrator yet.

Well, I just got my calibrator today, and now no matter what I do, I can't get True Color Finder to detect the monitor via USB. It just keeps saying it can't detect it and to try a different port. I have tried all the ports on my computer, USB 2.0 and 3.0, with no luck. I have my keyboard hooked up to the monitor and it does pass through to my computer, so I know the USB is working. The True Color Finder software just says it failed to start. When I unplug the USB from the monitor to the computer, the True Color Finder software refreshes but then comes back with the same failed-to-start message. The bottom of the software says "DDCCI/USB communication may not be supported in this monitor".

I even tried hooking the USB up to a laptop and installing the True Color Finder software there, and it says the same thing on that laptop.

I have tried resetting the monitor settings, re-installing the software, and re-installing the drivers, and no matter what I do, it won't work. Anyone have any ideas of how to fix this?

EDIT: Got it to work finally. I had to remove the USB ASM and TUSB3410 drivers as well as True Color Finder and reboot. Then I installed the True Color Finder software from scratch, which re-installed the ASM and TUSB drivers.
 
I know a GTX 690 won't be able to max out games in the near future, but whether you want to admit it or not, it does perform quite well. Yes, the 2GB of VRAM does pose a problem, but for me, I'm OK with FXAA and only high settings at 3440x1440. If you want to max out EVERY game like Watch Dogs, you'll have to upgrade every year. Have fun doing that, as it will cost you $1000 or so a year. Ultra vs. High doesn't make or break a game; you just gotta know which settings to turn down. Like I said, shadows and AA take a big hit on the GPU, and for me shadows aren't the selling point. Everyone is different. I'm OK with 3440x1440 on high settings with FXAA... if you're not, then make sure you have the graphics cards to drive it. Simple as that :)
 
Diablo 3 does not officially support 21:9. However, you can run it just by selecting "windowed fullscreen" in the graphics options. I've been running D3 at 2560x1080 for a year now and don't even need Flawless Widescreen for that.

This seemed to work for me fairly well, thank you!

Let me know if you get this working and what your thoughts are on D3 at this resolution. It's the primary reason I'm even considering this monitor, as it sounds like it would be a blast.

I played D3 for about two hours once I got skuko's method working. It may just be "new toy syndrome," but the experience was awesome! Almost overwhelming at first; I actually forgot to blink for a while, lol.

It works with Flawless Widescreen. Start FWS, load the plugin, and minimize it. Make sure D3 is set to fullscreen at the highest res you can pick, then Alt-Tab out of D3 once or twice... when you go back in, it will be set for 21:9 and flawless...

Finally got this working as well, thank you!
 
Thanks for your reply ;)
I have 2x GTX 780, and I thought they would provide enough horsepower for a while, but I'm the kind of person who cranks every graphics setting up before even playing the game, and if future games as demanding as Watch Dogs are released, the combo of UM95 and SLI GTX 780 will not prove future-proof if I want to enjoy all the eye candy the games provide...
I've been able to compare full HD with AA to WQHD without AA, and I enjoyed the former more. I guess the same logic goes for the UM65 and UM95.
However, I can't find much feedback regarding the UM65...
 

I think Watch Dogs is an exception. Think about it: not every game that comes out is going to be a massive open world in a city environment. The reason for the high VRAM usage, besides it being a console port, is that the entire city of Chicago (or at least a good chunk of it) is being rendered and stuffed into VRAM. On top of that, you have all of the cars and NPCs moving about. I would say in 90% of cases your GPU horsepower will be at its limits before VRAM becomes an issue.

I will also test Watch Dogs when I have a chance; I received a copy with my 780 Tis.
 
You're right, but since I don't particularly plan to change my graphics cards for ~2 years, based on what has been observed before, I'm pretty sure my SLI won't be able to handle this screen's native resolution with AA in one year's time. That's mainly why I'm interested in this monitor's ability to nicely display other resolutions.
Add to this that replacing my graphics cards may be a hassle due to their "nice looking but not so easy to replace" setup... :p

 
I highly doubt you will have trouble running new games with AA in one year; even two years is a stretch. And I understand about switching video cards, because I also have a water-cooling system.
 
I can't be sure about that. With the PS4 and XB1 platforms being close to desktop PCs, this might "help" bring better-looking (and hence more demanding) games to PCs, since developers will only have to code one version of the game rather than spend time on a specific version for us PC gamers.
That's a $1000 bet I'm not sure I'm willing to take, unless I'm sure the screen handles lower resolutions properly.
 
I'm not sure it would be worth stepping down to the UM65. The loss of the hardware calibration function (I believe only the UM95 gets this) would be a major downside. Further, 2560 x 1080 might be OK for games, but would likely be poor for any other use. Having used a 27" 1920 x 1080 monitor, there's no way I would ever want to buy one again.

I'm in the same boat with regards to graphics cards, but I'd gladly lose some AA (or be forced to make a hardware upgrade) in exchange for a higher resolution.
 

They may be closer, but they are still completely different platforms than a PC. The game engines will still need to be developed around either the consoles or the PC. Also, VRAM usage is not specific enough to measure how much a game REQUIRES versus merely ALLOCATES; there is a big difference. As it stands now, SLI 780s/Tis with 3GB of VRAM seem to have no issue with most games maxed out at 4K, at least at playable FPS. 3440x1440 is much less demanding than 4K, so I do not see why it would be an issue.

There should be no reason to need very high AA at such resolutions anyway.
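
On the REQUIRES-versus-ALLOCATES point, it is at least easy to watch what a game has allocated while it runs. A minimal sketch, assuming an NVIDIA card and the nvidia-ml-py (pynvml) bindings; note that NVML only reports allocation, which is exactly why the number can't tell you what a game strictly needs:

[CODE]
# Poll VRAM allocation on GPU 0 once a second (Ctrl+C to stop).
# Requires the nvidia-ml-py package: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print("VRAM used: %.0f / %.0f MiB"
              % (info.used / 2.0 ** 20, info.total / 2.0 ** 20))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
[/CODE]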
 
I don't really get why hardware calibration is better, apart from "memorizing" the calibrated settings, which allows the monitor to be used with different sources. Given my needs and use of a monitor, I'm not taking this function into account in my choice, since the monitor is going to be used only with my main PC.
I'm currently using a 32" 1080p TV as a PC monitor, and moving to the UM65 would basically add some horizontal real estate (and pixels), albeit losing a few cm vertically. I would not consider this a step backwards; that's why I'm seriously considering the UM65. But the higher screen resolution is sooooooooooo appealing :D
I'm not sure how they really develop games for console/PC, but I was imagining things to be simpler, i.e. coding the same game and adjusting settings for consoles like you would via the PC display options :p
 
Yes, running it in windowed (fullscreen) may work. HOWEVER... if you have CrossFire or SLI, then running in a windowed mode typically turns off one of your video cards. Most games out there REQUIRE true "fullscreen" mode in order to use both video cards. So it may be that flawlesswidescreen.org allows you to stay in fullscreen and use both cards.

That said, the last time I tested D3, I put it into fullscreen (windowed) and didn't notice a performance difference, on SLI GTX 580 3GB cards. Possibly the game doesn't tax one card that much. I will test that: I'll get Fraps, turn off adaptive V-sync, and see if there's a difference in fps between windowed mode and not.

It would be nice if D3 isn't one of those games that requires fullscreen to turn on both video cards. The fullscreen requirement is an NVIDIA/AMD driver thing; it's been like that for years. I can't believe they haven't fixed it yet.
 

I believe the advantage of calibrating the monitor internally versus with software has more to do with getting accurate colors in games and programs that are not capable of using a color profile. From (the little) that I know, software calibration requires additional programs to "force" a color profile. It all boils down to simplicity. Of course, you could just go with the settings out of the box, which is perfectly acceptable.

I would try to look at the increased resolution as another graphics option for games, probably more important than any individual "high" versus "very high" graphics option in a game. By sticking to the #### x 1080 format, you're definitely missing out on something. For me, higher resolutions are worth the trade-off every time.

By the way, I looked at the build in your signature and it is beautiful! Amazing job.
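
Going back to the calibration point, one concrete detail worth adding: a software profile is applied through the video card's lookup table, typically at 8 bits per channel, so bending the gamma curve collapses some of the 256 video levels together (banding), while a monitor that calibrates internally can apply the same curve in a higher-precision LUT and keep every level distinct. A toy illustration, with the bit depths and correction exponent as assumptions:

[CODE]
# How many of the 256 video levels stay distinct after a gamma
# correction is applied through a LUT of a given bit depth?
def distinct_levels(lut_bits, exponent=2.2 / 2.4):
    top = 2 ** lut_bits - 1
    out = set()
    for i in range(256):
        # Push the level through the curve, quantize to LUT precision.
        out.add(int(round(((i / 255.0) ** exponent) * top)))
    return len(out)

for bits in (8, 14):
    print("%2d-bit LUT: %d of 256 levels stay distinct"
          % (bits, distinct_levels(bits)))

# The 8-bit (GPU) LUT merges some neighbouring levels, which shows up
# as banding; a high-precision internal LUT applies the same
# correction without losing any of the 256 steps.
[/CODE]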
 
I actually came from the good old 37-inch Westinghouse 37w3, and I was very pleased with the size of the LG. I would highly recommend getting the UM95 instead of the UM65; the main reason I purchased this monitor was the combination of size and resolution.
 
I've been doing some reading about AA modes, and SMAA seems to be the AA to go for; it slightly blurs the edges it smooths, but as far as I understand how it works, the blurry effect is reduced at higher resolutions. It might be THE solution if I want to use AA on the UM95 without taxing the graphics cards too much.
However, whichever monitor I choose, I'll have to find an "acceptable excuse" for replacing the one I'm using, since my wife wouldn't understand all the resolution and AA stuff we're talking about :D
Regarding colour profiles, I didn't know they couldn't always be handled by games :confused: Good to know!

PS: thanks for the comment on my build ;)
 
overclock.net: anti-aliasing-the-basics

http://images.anandtech.com/doci/4061/EQAA.png

Since AA works per pixel (on the boundary or the pixel center), a higher pixel density should help somewhat at lower AA levels (somewhat like shrinking a picture makes a low-resolution picture look more detailed/smoother).
Some people are super aggressive about AA, though, regardless of the fps hit vs. GPU budget. Though that article covers older GPUs, it still shows the trend of more aggressive AA settings causing larger and larger performance hits. Combine this with a very demanding resolution and you might have to make some trade-offs settings-wise.
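
To put rough numbers on that trade-off: the work scales with pixels times the AA factor, so resolution and AA multiply together. A quick sketch (treating MSAA as a flat per-sample cost, which makes this a ceiling rather than an exact figure, since MSAA only supersamples edges):

[CODE]
# Relative sample counts: resolution x MSAA factor vs plain 1080p.
resolutions = [("1920x1080", 1920 * 1080),
               ("2560x1080", 2560 * 1080),
               ("3440x1440", 3440 * 1440)]
baseline = 1920 * 1080

for name, pixels in resolutions:
    for msaa in (1, 2, 4):
        rel = pixels * msaa / float(baseline)
        print("%s @ %dx MSAA: %4.1fx the samples of 1080p, no AA"
              % (name, msaa, rel))

# 3440x1440 with 4x MSAA is ~9.6x the raw samples of plain 1080p,
# which is why cheap post-process AA (FXAA/SMAA) is such a common
# compromise at high resolutions.
[/CODE]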
 
Ouch. I have a pre-order for this with B&H Photo, from back when it was $899 with them, but they don't expect availability until June 30th now ;)

I wonder if the delays are LG fixing the excessive backlight bleed some folks are getting on some panels.

Amazon has five people who have given it 5 stars. I saw it in person at Fry's. It was a tiny bit shorter than a standard 27" but damn was it wide. Looked good.
 
Nice article, thanks :)

But what the hell is with the prices in France?! Only one vendor, Amazon, at a €1200 price tag: http://www.amazon.fr/LG-34UM95-Ecran-3440-1440/dp/B00HG7EB64
:eek: Are they crazy or what?!
 
I managed to pick up a used one for $750 a couple of days ago. Awesome monitor! I had to switch to my 27" 1440p to adjust some settings, and it was hard going back to 16:9. I still may bring it back, only because I had to buy a second R9 290 to run games and am having a little bit of buyer's remorse for dropping $1200+, lol. I have the weekend to play with it before I decide.

Can someone please post their calibration settings? I won't be able to order one until next week and would like a baseline to adjust to.
 

You had to buy a 2nd card for it? I would think one 290 would be fine. I have SLI GTX 580 Lightning Extremes, which were factory-overclocked 580s with 3GB of VRAM. I ran triple 23" screens and had to run BF3 on low settings to make it playable, but that was at roughly 5760x1080. This is going to be 3440x1440, so I might need to run BF4 on medium to get 60 fps.

I would figure a 290 is better than my 580s. My 580s are about the performance of a single 780, and people have been running this monitor fine on 780s.
 
Single 290X here, and I'm pretty content with the performance. I had a 3GB 780 before, and it ran into VRAM limitations in Wolfenstein and Watch Dogs. I don't know if that is a sign of things to come or just poor optimization. Anyway... I came out ahead after making the swap, considering how cheap used 290Xs are now, and it is slightly faster than my 780 in all the games I've tried, too. Great value. I'm going to upgrade again once 20nm GPUs hit; hopefully NVIDIA doesn't skimp on VRAM next time, otherwise I am sticking with AMD.
 
I meant for High/Ultra settings @ 60fps.
 
Welp, I think I am unfortunately in the group that has bad backlight bleed.

Here is the display at 10% brightness on cinema mode displaying full black, in a completely dark room:

[image: bleeeed_zpsf666726a.jpg]

The orange tint on the left side is not a camera artifact, it actually looks that color.

I'm not a picky person at all, but I can't help but notice that bleed in games already:

[image: diablo3_zps57fe95c5.jpg]

Think that is bad enough bleed to get an RMA or to return it? I should mention that I have NOT calibrated the display, but I don't think calibration will help with that... Has anyone tried RMA'ing one of these through LG yet? I'd ideally like to do an advance RMA until the next one comes.
 
Return the monitor. LG is using us as paying beta testers and I for one am sick of it.
 
That looks about like the one I returned last week. I am going to wait it out for improvements. I would just return it while you can.
 
Here is mine at 10% brightness in a completely dark room.

[image: DSC_4048.jpg]

Edit: Mine is also not calibrated yet. I plan to purchase a calibrator at some point.
 
This is not backlight bleed, it's IPS glow. All the LG 34UM95 monitors will have this more or less, so replacing it will not really remove it. If you don't want this, you'll have to buy an IPS monitor with a polarizer (e.g. Eizo EV2736W) or alternatively a monitor with a VA panel, like the BenQ BL3200PT.

https://www.youtube.com/results?search_query=ips+glow
 
Yes, but his is significantly worse than mine; that color shifting is not normal for an IPS.
 
Some of these monitors definitely seem to have worse edge uniformity than others, going by the photos.

However, VA panels have a "cone" of shift too. No monitor is perfect and all have trade-offs. VA gains black levels/detail in blacks, but typically suffers some shift effects (visible at the edges of the monitor even head-on), and most VA panels have some ghosting/trailing effects.

A very large or very wide monitor or TV will show these edge effects more, especially if you're not looking at it head-on. That's why people sometimes angle triple-monitor setups so that the side monitors are "curved" towards them. My VA Samsung TV is on a rotating pillar stand, and it makes a huge difference in contrast/brightness whether you are viewing it head-on or not (or standing up in the kitchen watching it vs. sitting down in the direct-view "sweet spot").

To recap the trade-offs: VA has its shift cone and ghosting; IPS has glow and weaker black levels; TN has shift/"shadow", color uniformity problems (and a lack of color saturation in the worst TNs), and poor black levels. Other obvious trade-offs are max Hz, response time, blur reduction/elimination, G-Sync, input lag, and, until recently on only a few monitors, resolution.
 
You are correct, there are always trade-offs depending on the panel type. But the pictures posted by dave are much worse than I would expect a good IPS panel to exhibit, also going by the one I have.
 
For those of you who have tested a bunch of games with this monitor... what is the outcome?

I'm interested in picking up this monitor, but I do fear compatibility issues. Do a lot of games suffer from black bars? Or messed-up HUDs? If so, is it a simple or hard fix? My main focus for this monitor is gaming, with productivity/work being secondary. I play all types of games, FPS/RPG/adventure. I'm a HUGE Assassin's Creed fan... I'm afraid this monitor won't be compatible?

Also, any word on the Dell version of this monitor? I really want to pick this up... I'm not fearful of the IPS lag; I've gamed on IPS screens before and they don't bother me. I just don't want to drop $1000 and have tons of issues playing games... paying $1000 just to have a third of your screen taken away = :(

Thanks in advance!!
 
Most newer games are fine, and I would imagine that with Flawless Widescreen you will be fine. If you play a lot of older games, you may have issues.
 