Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

This basically reads like:
"I actually wouldn't be that concerned about desktop use as long as you're concerned about it."

No. It's not acceptable. Thoughts and considerations like these shouldn't enter the mind of people dropping $2,000+ on a monitor.

Sometimes you have to take precautions with high-end hardware. I don't drive my Lamborghini in the snow and I don't drive my convertible in the rain. But that doesn't stop me from enjoying them on a nice summer day.

Maybe you can't afford it. But I think most people looking to buy this aren't going to mind that tiny extra precaution.
 
This basically reads like:
"I actually wouldn't be that concerned about desktop use as long as you're concerned about it."
Well, think about it like this: have you ever done any of the following?
  • Driven a car
  • Ridden a bike
  • Participated in sports
  • Lifted weights
  • Built a PC
  • Had a romantic relationship (and what that entails)
  • Worked at a hazardous job
  • ...and so on
And if so, did you take any precautions to mitigate risk while engaging in these activities? Did you wear a seatbelt while driving a car? A helmet on a bike? You probably did, at least some of the time. I'd bet you probably do one or more of these things on a regular basis.

We take precautions to manage risk all the time, with almost everything we do. It's really the exception for some activity to have no risk at all.

No. It's not acceptable. Thoughts and considerations like these shouldn't enter the mind of people dropping $2,000+ on a monitor.
That's the trade-off inherent to the technology right now though, isn't it? OLED will give you the best picture, but the downside is that you have to be aware of the risks and exercise a bit of caution while using it if you want to avoid problems down the line. If that trade-off isn't worth it to you, there are plenty of other choices that will get you a picture nearly as good (or even better in some respects).
 
Looks pretty sweet, but they really need to incorporate HDR and the new HDMI and DisplayPort protocols on stuff at this cost, imo.

It will have HDMI 2.1 and DP 1.4, and I expect it to support HDR10 at the very least. As for Dolby Vision and other HDR formats, that's a maybe.
 
Well, think about it like this: have you ever done any of the following? ...

That's the trade-off inherent to the technology right now though, isn't it? OLED will give you the best picture, but the downside is that you have to be aware of the risks and exercise a bit of caution while using it if you want to avoid problems down the line. If that trade-off isn't worth it to you, there are plenty of other choices that will get you a picture nearly as good (or even better in some respects).

Wellllllllll I think it's debatable. OLED still isn't as bright as LCDs can get, is it?
 
Regarding HDR10 (HDR 1000nit COLOR) "support" and comments on LCD brightness...

"OLED still can't do high HDR color, and part of the reason they can't and why that gap will grow even worse is for the exact same reason they fared well on those tests - that reason is avoiding the burn in risks. So it is sort of like reducing your color gamut out of fear that flying too high into it will wreck your display."

The same thing that reduces OLED's burn-in risk is what makes its color volume range so limited: the color volume is capped, rolled off, and diluted with white in its already short extension past the SDR range, precisely to prevent the increased burn-in risk that comes with higher color brightness levels.

What we are talking about with HDR is COLOR brightness throughout a much higher color volume... not "brightness" in the traditional SDR sense, which just raises the whole screen within its narrow SDR range until it clips to white.

OLED, as it is now, can only do about HDR 400 and fakes HDR 600 at best. The white subpixel added to every single pixel on screen pollutes the color space just to push overall brightness from 400 - 500 nits of color to a briefly white-washed 700 - 800+, and then the ABL (automatic brightness limiter) reflexes kick in and dim the whole screen back down to a pseudo-color ~600 nits. For those reasons LG OLED color can't even be calibrated in HDR ranges.

OLED has amazing deep blacks and side-by-side, PER PIXEL color and black-depth contrast within its range. Other than the off chance of losing the static-content burn-in lottery, despite the brightness limiters and safety restraints that avoid most of that, OLEDs look incredible for SDR or "SDR+", but I wouldn't buy one expecting to wade into true HDR color brightness ranges.

As it stands, with the organics requiring such harsh limitations, OLED will never be able to reach real HDR color volume ranges.
 
If people are going to slap down multiple thousands of dollars for hardware, they pay for what they want, not for what you want.

There's nothing wrong with owning a display that is capable of better performance than your graphics card can output. It just means that Nvidia/AMD need to get off their asses and produce better technology to support what the consumer demands.

This 55" OLED 4k 120 hz display looks damn sweet, the question as always is what is the price point.

Right now, today I can buy a 55" 4k OLED on Amazon for $1500 or $1600. But it's NOT a PC monitor. Doesn't have the hardware for that, so I'd expect some kind of price increase for the additional hardware to make it work with G-Sync/Freesync and to make it a good, low input lag, PC monitor. If it drops on the market at let's say $1999 or less, that makes a serious statement.

In fact I would consider this monitor over the HP OMEN X Emperium, even though it's 10" bigger and offers 144hz instead of 120hz because of that sweet, sweet OLED technology. Not to mention the X Emperium costs frigging 5 grand.

All this hoping and praying going on ... day dreaming ... wish you guys would focus on what you can actually use TODAY.
 
If Nvidia would implement full integer scaling of 1080p to 4k in the meantime, it would give you the option, in some games, to crank up settings at 1080p and still get appreciable gains out of higher Hz with high fps, while keeping native 4k gaming as an option for when it's more desirable.
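To make that concrete, here is a minimal sketch (my own illustration, not any actual driver feature) of what 2x integer, nearest-neighbor scaling does: every 1080p pixel simply becomes a 2x2 block of identical 4k pixels, with nothing blended or softened.

```python
# Minimal illustration of 2x integer (nearest-neighbor) upscaling:
# each source pixel is duplicated into a 2x2 block, so a 1920x1080 frame
# fills a 3840x2160 panel without any blending or softening.

def integer_upscale_2x(frame):
    """frame: list of rows, each row a list of pixel values (any type)."""
    out = []
    for row in frame:
        doubled_row = [px for px in row for _ in range(2)]  # duplicate horizontally
        out.append(doubled_row)
        out.append(list(doubled_row))                       # duplicate vertically
    return out

# Tiny 2x2 "frame" just to show the mapping:
for row in integer_upscale_2x([["A", "B"], ["C", "D"]]):
    print(row)
# ['A', 'A', 'B', 'B']
# ['A', 'A', 'B', 'B']
# ['C', 'C', 'D', 'D']
# ['C', 'C', 'D', 'D']
```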

Some games support NVLink or SLI pretty well for higher frame rates at 4k, but that is big money on GPUs plus a state-of-the-art 4k 120Hz gaming monitor. Dual-GPU support outside of a few AAA titles is usually better once a few months of patches and Nvidia driver updates are under their belt, so I usually wait that out a bit on single-player games post-release if necessary (and by that time I might catch them on sale too).

https://www.gamersnexus.net/images/media/2018/gpus/2080-ti/nvlink/nvlink-2080ti-farcry5-4k.png
 
If Nvidia would implement full integer scaling of 1080p to 4k in the meantime...

It's not an either-or situation. 4k dot pitch is so fine that you can run any sub-resolution really: 2560x1440, 1600x900, 3200x1800, whatever. Looks especially good with in-game scalers.

Check this out:
 
I'd prefer 1:1 (or perfect integer scaling) whenever possible, and though I've seen that vid before, thanks for posting it. Dishonored 2, which I own, has a dynamic resolution option on PC, but I don't use it on my 1440p native display so I can't say how it looks right now.

I have seen 3rd-person games on a PS4 though, and they look decent other than the motion definition being very page-y/paper-y/choppy and the viewport movement at speed being a smearfest. A feature-rich high-end TV like a Samsung Q9 could probably use any single option or combination of flicker, interpolation, or even black frame insertion to help with the blur, but it still wouldn't add frames of motion definition, so it might be more like a floaty 30fps. It would still help a lot compared to having no sample-and-hold blur reduction on low-fps games.

On a PC, as long as the lower render resolution let you approach a ~100fps average target, you'd get appreciable gains in both motion definition (more pages in the animation flip book, flipping faster, plus more dots per dotted-line path) and motion clarity (blur reduction) out of the higher 120Hz refresh rate. I won't comment on the resolution and scaling quality on PC yet since I haven't tried it out.
 
4k dot pitch is so fine that you can run any sub-resolution really: 2560x1440, 1600x900, 3200x1800, whatever. Looks especially good with in-game scalers.

Perhaps with moving content from TV distances, but I'll be damned if I'll run my 31.5" 4k monitor with scaling. Talk about fugly!

However, I do believe that scaling will be much more feasible with '8k' generation panels.
 
...

It looks pretty muddy in Monster Hunter World, which has really good textures. I can understand having it as an option for some people, but that softened, clay-like blur rather than crisp, clear pixels isn't my preference. It seems like the opposite of what 4k resolution and high Hz are all about: a clearer, crisper image and clearer, crisper, more defined motion.




-------------------------


Another perspective: a more "pro" dynamic-resolution vid here, arguing that at speed it's not as big of a deal. I support it as an option; I just wouldn't want it forced.

 
...If it drops on the market at, let's say, $1999 or less, that makes a serious statement.

$2k would be nice, but since it's going to have the whole "Alienware Gamer" brand on it plus a few other things, I would probably expect it to be more like $3k, which is still acceptable to me, as I believe the new LG OLEDs start at around $2500.
 
$2k would be nice, but since it's going to have the whole "Alienware Gamer" brand on it plus a few other things, I would probably expect it to be more like $3k, which is still acceptable to me, as I believe the new LG OLEDs start at around $2500.

Yeah. I'm interested to see how these will perform, but I'm not really interested in paying the Alienware tax unless they do something earth-shatteringly different than the LGs.

The BFGD is a dead joke to me. So, that's automatically ruled out.

I'm going to try to be patient and see if LG does indeed announce one later this year or early next year that's between 40"-49". If they do, that'll be my jam (with the bonus that it'll likely be the least expensive of the bunch). There's really no going back to LCD at this point, and MicroLED isn't "here" despite the cries of the few OLED haters that show up in every Front Page News thread about them, so I'll happily stick with OLED until MLED is an option but I wouldn't mind going down a notch in size from my current 55" panel for desktop use. And by then, hopefully we'll have GPUs that can take better advantage of the 4K/120 potential.
 
Regarding HDR10 (HDR 1000nit COLOR) "support" and comments on LCD brightness...

...As it stands, with the organics requiring such harsh limitations, OLED will never be able to reach real HDR color volume ranges.

Last weekend I went over to my buddy's house, and while I was there I decided to check out the Vizio P-Series Quantum that he bought late last year. That TV is one of the brightest TVs around according to RTINGS, with peak brightness on a 10% window measuring over 2000 nits. After viewing some HDR content on it, I'm not convinced about this whole "brightness and color volume" argument. I'll stick to OLED for now.
 
Different generations of Vizios have different numbers of FALD backlight zones / densities. The Samsung Q9FN is arguably the benchmark for higher HDR color right now, with ~1800+ nits, a 480-zone FALD backlight, and quantum dots.

I respect your opinion, but also make sure you are watching an HDR-4000-mastered UHD 4k HDR disc (or an uncompressed rip of one) and that the whole AV pipeline is set up properly. The content also matters. The average screen brightness of scenes can generally remain the same; HDR takes the highlights and direct light sources that would normally crush to white and continues their color volume through color brightness, instead of clipping to white at a very narrow, early ceiling. Even "HDR 2000" out of HDR 4000/10,000 content is a fraction of full HDR.
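As a toy illustration of that clipping point (my own sketch, not any vendor's actual tone-mapping curve), here is roughly what happens to highlights mastered above a display's peak: they either flatten to the panel's ceiling or get compressed into whatever headroom is left. The 4000-nit mastering level and the 0.75 knee are assumptions for the example, not measured values.

```python
# Toy illustration only -- not any real TV's tone-mapping curve.
# Shows how highlight detail mastered above a display's peak either
# clips to a flat ceiling or gets compressed by a simple roll-off.

def hard_clip(nits, display_peak):
    """Everything above the panel's peak flattens to the same value."""
    return min(nits, display_peak)

def simple_rolloff(nits, display_peak, knee=0.75, mastered_peak=4000.0):
    """Keep values below the knee 1:1, then compress the rest of the
    mastered range into the panel's remaining headroom."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    span_in = mastered_peak - knee_nits   # highlight range in the content
    span_out = display_peak - knee_nits   # headroom left on the panel
    return knee_nits + (nits - knee_nits) / span_in * span_out

for highlight in (800, 1500, 2500, 4000):   # example mastered highlight levels
    for peak in (600, 1000, 1800):          # rough panel peaks discussed above
        print(f"{highlight:>5} nit highlight on {peak:>4} nit panel: "
              f"clip={hard_clip(highlight, peak):6.0f}  "
              f"rolloff={simple_rolloff(highlight, peak):6.0f}")
```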

OLED's side-by-side, per-pixel black depth and color contrast is amazing though. For "SDR+" it looks really phenomenal, if you can live with the bad-luck chance of burn-in (with no warranty coverage for it, which shows a lack of confidence). OLED just can't really do HDR 1000 color, and definitely not HDR 2000 out of 4000/10,000-capable content. That gap will grow as higher-HDR-color TVs come out with HDR 4000 color and eventually HDR 10,000. I have nothing against OLED, but for the people mentioning HDR10 and such: OLED will never be capable of high HDR color, considering its organics are already shackled to very low limits and roll-down safety features and forced to use white subpixels instead of true higher color brightness volumes (so much so that HDR color can't even be calibrated on an LG OLED).
 
This is kind of what I meant when I said that HDR isn't quite there just yet. You have to be viewing a certain scene, on a certain AV pipeline, on a certain TV to get the full benefit, and even then it's STILL only a fraction of full HDR in the end. I don't know anyone who owns a Q9FN so I can't make any comparisons to that.
 
The modular MicroLED proof-of-concept panels would allow people to make their own aspect ratios, so you could build 3x2, 4x4, 6x2, 8x3, whatever. If you were rich enough to build a whole wall or wall section of them, the representatives talk about carving your own screen sizes on the fly out of the larger one, just like you can pinch-zoom or resize windows with a mouse.

January 2018 https://www.cnet.com/news/samsung-microled-makes-massive-modular-tv-a-reality/
"The next step, says Samsung, is to produce a 75-inch size at 4K resolution. That's roughly half the size of the current 146-inch iteration, and will require both smaller LEDs and, more difficult, even narrower pitch between them. I asked how long it might take before a 75-inch MicroLED TV hit the market, and a representative said somewhere between two and five years."

January 2019 https://www.cnet.com/news/samsung-shrinks-the-wall-microled-modular-tv-down-to-75-inches-ces-2019/
"Samsung previously announced that it intends to sell MicroLED TVs to consumers in the US in 2019, taking advantage of the new-generation displays' thinner panels. The first generation of The Wall was 80mm thick and designed for industrial use, while the newer consumer-focused version slims down to 30mm.

Samsung has yet to provide pricing or availability information about the new MicroLED displays."
https://www.samsung.com/us/business/products/displays/direct-view-led/the-wall/


I think that would be the ultimate, other than perhaps a hemisphere or dome of them... at least until extremely high-resolution, high-Hz, good-interpolation (if we are dreaming, also hitting the "zero blur" goal of 100fps x 10 interpolated = 1000fps at 1000Hz), clear AR glasses and AR tech take off years down the road, with VR overlapping it. At that point we'd have virtual "holographic" objects placed and animated all over, and we could slap virtual panels wherever we wanted in 3d space. By then, games will be breaking out of the flat looking-glass box, so to speak, and into 3d space, looking like holograms on tables or free-floating on virtual planes, floors, etc. (like the MS HoloLens is attempting in very primitive fashion), with the ability to walk right into them, VR holodeck-style, either in a spectator-like overview mastering mode or boots-on-the-ground as the main character (like the very early iterations of modern VR are trying to do).
 
Well, that's too bad that this monitor no longer looks like an option, especially since it was supposed to have HDMI 2.1 for future-proofing.

When I saw the 43" 4k 120hz (144hz?) VA monitors with VRR in the roadmap I pretty much switched away from this OLED personally. I like my 43" monitors but 55" is just too big for my current setup and wouldn't match what I already bought into with my other monitors and arms. I think 43" is about my limit unless I was going for a couch lap-desk/couchmaster kind of setup in my living room and in that case I'd probably want 70" not 55". I wish the 43" monitors had a FALD version on the roadmap for the huge increase in contrast ratio and black depth, and the HDR capabilities though.

It would be nice to know why they are pulling the gaming OLED: whether it's the burn-in concerns as a PC monitor, that it can't be price-competitive with the 43" and similar monitors due out, the small number of people who would buy it at that size, or a combination of issues.



FW900 CRTs and me
---------------------------------

I had a few FW900 CRTs over the years. They were good for a time, but I wouldn't go back to them personally. 22.5" diagonal is short. It was even then, in 2008, when I had a 27.5" 1920x1200 next to it.


I don't think all GPU outputs even provide CRT-capable signals anymore, at least not without adapters/converters. And what about HDR color brightness going forward on a CRT? I think a new FW900 is something like 125 nits, and they dim with age. You also have to get under the hood and screw around with tweaking the monitor: the geometry and possible jitter issues, lines, focus, convergence. They all eventually fade and/or bloom, pick up screen anomalies, and otherwise break with age. I liked them when I had them, but once high-Hz LCDs came into play at larger sizes I dropped the FW900, and once TVs got high enough contrast ratios and better motion handling I dropped my Sony XBR960 34" widescreen, HDMI-input, 60Hz CRT TV too.

I'm fairly happy with motion clarity in modern gaming at a 100fps to 120fps average on a high-Hz monitor (along with a quality overdrive implementation), which cuts the sample-and-hold blur during fast viewport movement down to a "soften" blur within the outlines of on-screen objects, as long as you maintain higher frame rates. That's typically much less obnoxious than 60fps-Hz's smearing blur: a soften blur during fast viewport movement, and a clear view with overdrive at medium and lower viewport speeds, as well as when looking head-on and in more static scenes. I'd still love to have something like 100fps x 10 interpolated frames to 1000fps on some future 1000Hz display someday, for CRT-level "zero" blur though.






For reference, ordered by height (roughly, based on raw sizes):

----------------------------------------------------------------

22.5" diagonal 16:10 .. 19.1" w x 11.9" h (1920x1200 ~ 100.6 ppi) FW900 crt

27.0" diagonal 16:9 .... 23.5" w x 13.2" h (2560x1440 ~ 108.79 ppi)
34.0" diagonal 21:9 .... 31.4" w x 13.1" h (3440x1440 ~ 109.68 ppi)

37.5" diagonal 21:10 .. 34.6" w x 14.4" h (3840x1600 ~ 110.93 ppi)

31.5" diagonal 16:9 .... 27.5" w x 15.4" h (2560x1440 ~ 93.24 ppi) .. (3840x2160 ~ 137.68ppi)

40.0" diagonal 16:9 .... 34.9"w x 19.6" h (3840x2160 ~ 110.15ppi)

43.0" diagonal 16:9 .... 37.5" w x 21.1" h (3840x2160 ~ 102.46 ppi)

48.0" diagonal 16:9 .... 41.8"w x 23.5" h (3840x2160 ~ 91.79 ppi)

55.0" diagonal 16:9 .... 47.9"w x 27.0"h (3840x2160 ~ 80.11 ppi)

----------------------------------------------------------------
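In case anyone wants to sanity-check or extend that list, here's a small helper (my own sketch, not from any post above) showing how the width/height/ppi figures fall out of just the diagonal, aspect ratio, and resolution.

```python
import math

def panel_dims(diag_in, aspect_w, aspect_h, res_w, res_h):
    """Width/height (inches) and ppi from diagonal, aspect ratio, and resolution."""
    # diagonal^2 = width^2 + height^2, with width/height = aspect_w/aspect_h
    unit = diag_in / math.hypot(aspect_w, aspect_h)
    width, height = unit * aspect_w, unit * aspect_h
    ppi = res_w / width            # equivalently res_h / height
    return width, height, ppi

# e.g. the 55" 16:9 4k entry above:
w, h, ppi = panel_dims(55.0, 16, 9, 3840, 2160)
print(f'{w:.1f}" x {h:.1f}" at {ppi:.2f} ppi')   # ~47.9" x 27.0" at ~80.1 ppi
```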
 
Really, given that ULMB + G-Sync monitors are about to be a thing, this seemed too expensive for more of the same old shit.

Isn't flickering a perfect solution for OLED monitors? I'd think that a constantly flickering monitor wouldn't have burn-in since the pixels aren't always lit up.

The whole promise of OLED was ultra low input lag and lower motion blur. When is it ever going to deliver?
 
I don't think what you are proposing would work out with current OLED tech as it is. In fact, it seems like OLED is a stopgap and displays will move away from it in the following years, especially computer displays, where it barely made an entry in the first place.

You'd need a much brighter screen, especially for HDR. ULMB reportedly dims LCD screens, as perceived by your eyes and brain, by about 2/3, for example. OLEDs are already constrained by ABL to a ~600 nit HDR color brightness ceiling to avoid burn-in, and that is for a movie's varying scenes and isolated highlights, not full-screen 600 nit monitor brightness counteracting the dimming effect of strobing.

You'd probably also need some kind of high-end interpolation to keep frame rates high along with the strobe rate, so that the strobing wouldn't fatigue your eyes; 120Hz strobing or higher, imo. Perhaps "interpolation" something like what some of the VR solutions use.
The only way it would get motion clarity benefits at low fps would be some kind of motion interpolation built into the monitor, and even that would just be duplicating the low number of frames, so it would still appear muddy and clunky movement-wise, or have some kind of floating cut-out look. (VR headsets do use "Time Warp", "Space Warp", and "motion smoothing" types of interpolation, which supposedly work pretty well, but no monitors that I'm aware of have that tech.)


---------------------------------------------------------

From the thread:
ASUS TUF Gaming VG32VQ: World’s 1st display with concurrent motion blur reduction & Adaptive-Sync



I've heard people's reports of strobing cutting their peak brightness by about 2/3. If that's the case, for this to do HDR 1000 + strobing it would have to do 3000 nit peak color brightness with strobing off. For 350 nit SDR it would have to do 1050 nit peak brightness with strobing off. As it is, according to that site these are 350 - 400 nit with strobing off depending on the model, so the HDR label is essentially fake.
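A back-of-the-envelope sketch of that arithmetic, assuming the reported ~2/3 loss simply reflects the strobe duty cycle (an assumption based on those reports, not a measured spec):

```python
# Back-of-the-envelope only: treats perceived brightness as panel peak
# multiplied by the strobe duty cycle. The ~1/3 duty cycle is the rough
# estimate implied by "strobing cuts peak brightness by about 2/3".

def required_panel_nits(target_perceived_nits, duty_cycle=1/3):
    """Panel brightness needed (strobe on) to still look like the target."""
    return target_perceived_nits / duty_cycle

for target in (350, 600, 1000):
    print(f"{target:>4} nit perceived -> ~{required_panel_nits(target):4.0f} nit panel peak")
# 350 -> ~1050, 1000 -> ~3000, matching the estimates above
```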

Most people avoid PWM like the plague.
With variable refresh rate, you'd have to have a typical LOW of 100fps to maintain 100fps-Hz strobing, not an average of 100fps.
Since typical frame rate graphs tend to swing roughly +/- 30fps around the average, that could mean a 130fps average for a mostly 100 - 130 - 160 fps graph. Even then, 100Hz strobing and variable-rate strobing over hours of game time could fatigue people's eyes, and some people are more sensitive to flicker than others.

People use G-Sync/VRR in order to push graphics settings higher while riding a frame rate range without hiccups. Between the desire for very-high-to-ultra game settings relying on VRR to smooth things out, which seems directly opposed to the very high frame rates required for high-Hz strobing, and the very high peak color brightness required for even fractional HDR, this kind of strobing seems very situational usage-wise. 1440p at least makes higher frame rate lows more within reach, though.

I could see this tech being much more useful if someone developed very high quality interpolation to go along with it, doubling or tripling the typical frame rates, and combined it with a very high peak brightness to start with, like a Q9FN's 1800 - 2000 nit peak color brightness or higher. (Those Q9FNs also have interpolation and black frame insertion, I believe, but I'm not sure of the quality vs. artifacts. Those flagship TVs are still only a 60Hz 4k line without HDMI 2.1 for now, but they can do non-native 1440p at 120Hz with VRR/FreeSync from AMD GPU PCs, and VRR from the Xbox One.)

With a theoretical very high quality interpolation that avoided input lag and artifacts, you could run a 70fps graph of something like 40 - 70 - 100 fps interpolated x3 to 120 - 210 - 300 fps, or multiplied even more. That way the strobing could be very fast. Incidentally, if we ever get 1000Hz, super-high-response-time displays with interpolation of something like 100fps x 10 for 1000fps at 1000Hz, we wouldn't even need strobing, since there would be only about 1 pixel of sample-and-hold blur, just like a CRT... essentially "zero" blur.
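For what it's worth, here is the rough sample-and-hold arithmetic behind that "1 pixel of blur" point, assuming full persistence (no strobing) and an eye-tracked motion speed of 1000 pixels/second (an example figure, not taken from the posts above):

```python
# Rough sample-and-hold estimate: perceived blur (in pixels) is roughly
# eye-tracking speed multiplied by how long each frame stays on screen.
# 1000 px/s tracking speed is just an illustrative number.

def sample_and_hold_blur_px(track_speed_px_per_s, refresh_hz):
    frame_persistence_s = 1.0 / refresh_hz   # full persistence, no strobing
    return track_speed_px_per_s * frame_persistence_s

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4} Hz -> ~{sample_and_hold_blur_px(1000, hz):4.1f} px of blur")
# 60 Hz ~16.7 px, 120 Hz ~8.3 px, 1000 Hz ~1 px ("zero" blur, CRT-like)
```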



The info below predates combined VRR + strobing, but most of the trade-offs still apply.


==================================
Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher resolution with any kind of high-to-ultra settings in the most demanding games won't let you sustain high fps, only average it.

As per blurbusters.com's Q&A:

-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.

This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.

G-SYNC monitors allows you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.

Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate during VSYNC OFF operation. Also, there can be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).

  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.

--------------------------------------------------------------

Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutter more with VSYNC ON, while others stutter more with VSYNC OFF. Test the opposite setting.
-----------------------------------end-of-blurbuster's-quotes--------------------------
 
This thing seems like it was doomed from the start.

The number of people who are willing to pay the premium for OLED is already low. That's changing as prices come down and people realize how stellar the image quality is, but considering the technology this display promised combined with the Alienware branding... you know it was going to be priced out of reach of the vast majority. I can't imagine it would have sold in large numbers.

I'll keep happily using my B7 and keep my fingers crossed that LG releases their smaller-than-55" version sooner than later. Having an OLED + one of those high refresh 43" monitors that elvn mentioned above will cover every possible need for me until the next major advancement in display tech.
 
If Samsung or Sony cranked out a 43" OLED TV/monitor, it'd be the top selling display in history.
 
My first reaction - yeah, not surprised. My dream monitors always get crushed.

My second - it's wccftech, they will publish anything for a click, so maybe...

so conflicted....

Well, as of yet no one has released my dream monitor, so I can understand where you are coming from.
 
Regarding HDR10 (HDR 1000nit COLOR) "support" and comments on LCD brightness...

...As it stands, with the organics requiring such harsh limitations, OLED will never be able to reach real HDR color volume ranges.

This was one of the major reasons I bought a Samsung Q9FN TV: I want really good HDR, and it delivers.
It is fantastic in many other ways too, with a few small flaws along the way, but overall a superb experience for movies, TV, and gaming.
I game on it at 1440p 120Hz (racing) or UHD60.
Once MicroLED video walls can do 110" UHD 120Hz at 4000 nits or higher, I'm in.
 
...I had a few fw900 crts over the years. They were good for a time but I wouldn't go back to them personally. 20.5" diagonal is short...

Hey...that's 22.5" man. :)

A great shame we only saw one generation of Sony's flat-faced widescreen CRT computer monitor. What might have been...

Oh well, mine still works great and has a gorgeous picture. And I still think it looks kind of cool in a retro way without the case... all that glass and metal.

Anyway...probably my last build around a CRT I imagine. Though I keep saying that....
 
Fixed... thanks.


 
I remember when 22" was HUGE after the 12", 15" and 17" screens.
I got an Iiyama Vision Master Pro 510 with the Mitsubishi Diamondtron tube (Trinitron in disguise); it was the holy grail.
2048x1536 at 80Hz (max refresh 160Hz) around the year 2000, WOW! Not that you could read it at max res, lol.
It was fantastic at 768p to 1200p, all well above 75Hz.
75Hz was the minimum I could handle well; below that was too flickery.
There was a trade-off between Hz and sharpness (the higher the Hz, the blurrier), so I didn't use it above 100Hz.

I still have a 14" Trinitron TV that I keep for some reason.
Must check it still works; it's been around 5 years since I last did.
Next question, where the heck is it?
 
I remember when 22" was HUGE after the 12", 15" and 17" screens.
I got an Iiyama Vision Master Pro 510 with the Mitsubishi Diamondtron, Trinitron in disguise, it was the holy grail.
2048x1536 res at 80Hz (max refresh 160Hz) around year 2000, WOW! Not that you could read it at max res lol.
It was fantastic at 768p to 1200p, all well above 75Hz.
75Hz was the minimum I could handle well, below that was too flickery.
There was a tradeoff between Hz and sharpness, the higher the Hz the more blurry, so I didnt use it above 100Hz.
I owned a 510 as well, man. It was glorious for gaming in its day. I pretty much used 1024x768 @ 100Hz exclusively, occasionally dipping into 1280x1024 @ 85Hz when the CPU/GPU could keep up. It held a very sharp and consistent image once it warmed up (i.e., very little geometry/color tweaking). Sadly, mine did not live as long as I had hoped (maybe 5 years?).

Back then, the best LCD tech was TN-based. I remember buying my first LCD, a Hitachi 17" (the first 8ms g2g), and putting it side-by-side against my 510. It was no contest in pretty much every category. The downgrade in IQ wasn't even funny: hello 1280x1024 @ 60Hz, blacks that were greys, backlight bleed, and shit viewing angles. Bleh. The only good things about early LCD tech were that text was noticeably sharper and it didn't heat up the room during the summer. Anyway, things have come a long way since then.
 
I remember when 22" was HUGE after the 12", 15" and 17" screens.
I got an Iiyama Vision Master Pro 510 with the Mitsubishi Diamondtron, Trinitron in disguise, it was the holy grail.
2048x1536 res at 80Hz (max refresh 160Hz) around year 2000, WOW! Not that you could read it at max res lol.
It was fantastic at 768p to 1200p, all well above 75Hz.
75Hz was the minimum I could handle well, below that was too flickery.
There was a tradeoff between Hz and sharpness, the higher the Hz the more blurry, so I didnt use it above 100Hz.

I still have a 14" Trinitron TV that I keep for some reason.
Must check it still works, its been around 5 years since the last.
Next question, where the heck is it?

Back when I was young and poor but still had eagle eyes, I ran a 17" (~15" visible) CRT at 1600x1200 for 130 DPI, which is roughly the same as what your 510 would do at 2048x1536. I was able to get away with it through my late teens and 20s. Now I can only comfortably go above ~100/110 nominal DPI if it's being scaled on a high-DPI panel (e.g. my laptop's 280 DPI screen at 2:1); without the extra nudge from the increased sharpness it's just not where I want it anymore. Not sure how much is my eyes getting older vs. just my getting pickier as I get old, though. :sigh:
 
I owned a 510 as well, man. It was glorious for gaming in its day...

Back then, the best LCD tech was TN-based. I remember buying my first LCD, a Hitachi 17" (the first 8ms g2g), and putting it side-by-side against my 510. It was no contest in pretty much every category...
Yeah LCDs were poor in comparison.
I made the best of a bad thing by converting my first LCD into a projector; that was really good fun and worked very well from 2004 to 2007.
I started off with an OHP and its standard yellowish (4000K) 400W lamp (50 hrs per lamp, lol) and converted it to a 250W metal halide (6000K), which lasted around 7000 hrs, was much brighter and cooler, and had much better colour balance. It got me a 100" gaming/movie display incredibly cheap, with better resolution than most projectors at the time. Even replacement lamps were a fraction of the cost.
I used it as my main display, with a TV tuner in my PC sitting in the top-left corner for casual viewing and going full screen if I liked the program.
I accidentally killed the display panel and, having found better ways to reduce light loss, bought a newer, better monitor for it and converted to a slightly more efficient 150W metal halide. It was fantastic, glorious days :D

Then I bought a Dell S2209Wb 22" (not the A model that was very highly praised) for normal use (not as a PJ), but this display is so good that I used it to help calibrate the plasma TV I bought later. It is still in use with my security system and has A+ image quality.
But yep, early LCDs were pants.

I lost most of the pictures; these are the best of what's left.
The projector with the lighting not yet adjusted, then Oblivion with the better lamp and the corner lighting improved.
The PJ in action: the first Flatout (I think), with my guitar to show the size.
The final one is the TV app playing The Simpsons with My Name Is Earl playing behind it. I'm holding a 12" ruler to show the size.
The camera wasn't that good; the colour quality in real life was a bit better.

(attached photos)
 
Back when I was young and poor but still had eagle eyes, I ran a 17" (~15" visible) CRT at 1600x1200 for 130 DPI... Not sure how much is my eyes getting older vs. just my getting pickier as I get old, though. :sigh:
Getting old mate ;)
I like my screens big and far away.

ps does anyone see the wink smiley at the end of the first line above?
They are missing for me and just show a blank space.
 
I agree on the 43" OLED if it were a thing. However I'd be happy with something like a 43" version of a samsung Q9FN tier 4k tv if it had hdmi 2.1 , VRR, 512 zone or more FALD backlight, and optional BFI (black frame insertion), Interpolation for consoles, 1000nit+ HDR color brightness ceiling, Quantum Dot Color, filter, etc.

----------------------------------------------------------------------------


Looks like this turned into a nostalgia thread for now. I'm fine with that :b

Perhaps if I pored through my mother's actual physical photo albums I could find a glimpse of a few of the older ones, but unfortunately I don't have any pics of my:

- Atari 2600 running on an old wood-cabinet color TV that needed pliers to change the channels and had tin foil on rabbit-ear antennas.

- Commodore 64 running on our old, tiny black-and-white TV as a monitor, until I could afford a similarly small color TV to play Ultima IV and other C64 games on (there were others?).

- 386 running ancient Windows and DOS with Maniac Mansion, Indiana Jones and the Fate of Atlantis, Loom, etc.

- Or when I borrowed money from my dad to buy the first Pentium 90 when they hit the market, in order to play TIE Fighter, along with upgrades from a 15" CRT to a 17" (curved "bubble"-front) CRT, a 4x CD-ROM from 2x, I think 8MB of RAM from 4MB, and a 500MB HDD.
... and my first 3dfx GPU, making Quake and Tomb Raider go from wet-sugar pixels to smooth polygons (along with the included Virtua Fighter, Panzer Dragoon, and Magic Carpet).

- Several other CPUs with a 19" ViewSonic flat CRT.

This gallery of my more modern PC multi-monitor setups starts around 2006:
https://imgur.com/a/8EAmu



-------------------------------------------------------------------------------------------------------


I'm also getting older. I still use my 8.5" tablet in my hand a lot on the road, my phone when I have to, and a 15" 4k laptop at 150% text scaling, but when my eyes are tired after a long day or hours of reading, nearer text is harder to read unless I sort of relax my eyes and droop my eyelids a bit, like the Futurama "suspicious"/"not sure" meme, lol. So I'll probably end up farsighted as I get older. It's probably not a coincidence, considering that, that I'm now sitting 3' away from my 3-monitor array, but any closer and I wouldn't be able to see much of the side monitors without rotating my chair a lot. When I get another 43" to match in the center I'll probably have to move back even more to see more of the array at once.

 
Getting old mate ;)
I like my screens big and far away.

ps does anyone see the wink smiley at the end of the first line above?
They are missing for me and just show a blank space.

I see the smiley just fine. Script blockers can cause that unless you selectively allow the right script, I think. I'm using uMatrix, but I have enough allowed now to see emojis. I might have had a problem with that with uBlock Origin at one point, or with the NoScript addon's defaults.
 
https://www.pcgamer.com/alienwares-55-inch-4k-oled-gaming-monitor-is-definitely-getting-released/

I reached out to Dell for clarification and was told something very different.

"While this gaming monitor did start out as a 'concept product' back at CES this year, we can confirm that it is definitely going to be available for sale in selected markets sometime in Q4 this year. So, it is definitely going to hit the market," Dell told me.

Dell underlined that last bit for emphasis.

It's not clear when exactly the monitor will launch, though Dell told me it will again be showcasing the display at Gamescom in August. Pricing is also a mystery for now. I'm not expecting it to be cheap, though. Its primary competition (at least initially) will likely be LCD monitors that are part of Nvidia's BFGD (big format gaming display) initiative, such as HP's 65-inch Omen X Emperium, which sells for $4,999.99.
 