ASUS PG35VQ 3440x1440 200Hz Gsync

Neither brightness nor black levels are relative or subjective; both are measured in nits, a non-SI name for cd/m². 300 nits will hurt the eye if you sit close to the screen, like, for example, when using a PC monitor. Going HDR will not change this [H]ard fact about SI definitions.

Luminance dynamic ranges are relative; the so-called HDR standards just add more steps to divide the available contrast. Going 10,000, 4,000, 2,000, or 1,000 nits does not help when one sits less than 2 feet from the screen. Going from 0.05 nits to less than 0.005 does.
 
You are still ignorant of how HDR works, especially how brightness works in HDR, with color saturation avoiding cutoff/crushing to white at higher-brightness highlights, and how it uses absolute levels instead of ratios. Overall it provides a much more saturated image than SDR can ever provide. I highly recommend that you watch the video I posted in its entirety, as it addresses many questions about HDR and goes into great detail about the features and superior quality it provides.
 
Will any of these great monitors be glossy? Because I am afraid that my glossy monitor will have better colors than these if they are matte...
 

I suppose such a video will convince me that nits (cd/m²) are not an objective physical property of a light source :rolleyes:

https://en.wikipedia.org/wiki/Candela

The candela is the luminous intensity, in the perpendicular direction, of a surface of 1 / 600 000 square metre of a black body at the temperature of freezing platinum under a pressure of 101 325 newtons per square metre.
 

Most LCDs have had full scene brightness at 300 nits for many years now. I dunno where you get the idea it hurts the eye; do you think most people turn the brightness of their LCDs down? Many just run them at 100%.
 

Ad populum?! The whole concept of nits involves light per area.
340 nits on a 30" screen feels like an assault on the eyes at 2 ft; sit further away, or use a smaller screen, and it can be tolerated.

I am sure that there are people out there willing to run 30+" PC monitors above 350 nits.
Too bad I cannot challenge them to use a 1000-nit large PC monitor 24x7 without being charged with inflicting physical injury.

Out of curiosity, I checked how many monitors TFTCentral reviewed over the last 12 months that output more than 350 nits:
1 monitor had a reading of 487 nits (the Asus ROG Swift PG258Q, a $569 TN),
3 had ~350 nits (2 of them smaller than 28"),
and the rest stayed below 300 nits.
 
Yeah, but I'm talking about full scene brightness. HDR doesn't require a full scene brightness of 1000 nits, and in fact no HDR display is capable of that as far as I know; it's the peak highlight brightness that has to go above 1000 nits for HDR, and that typically covers only a tiny fraction of the screen.
 

Come again?? If I set luminance to 4090 (or whatever max value your HDR encoding uses instead of the old-school 250) and put a Spyder5 Pro in front of the screen, I will not get a reading of 1000 nits??

What part of "light per area" did you miss? A 1000-nit screen should output the same luminance over 100 cm², 1 cm², or 1 mm². Or else measure the rated brightness in some obscure unit, not the SI cd/m².

Now if you tell us that no HDR screen complies with any HDR standard, I am 200% behind you.
 
I think 1000 nits might be fine in peripheral parts of the screen for small objects, but not in the center. I guess it's up to content creators to make sure the HDR content is proper. But the claimed 1000 nits must be when running at maximum brightness; surely most people would set a lower maximum for a dim room.
We are, after all, fully capable of having a peripheral light source roughly 1000x stronger still, capable of blinding us (direct sunlight), and we function in and even enjoy such an insane environment.
 
SDR is a limited (narrow) band; having a 1000-nit HDR display doesn't push that same rendered band 2x to 3x higher like ramping up the brightness on an SDR screen would. SDR brightness is relative; HDR uses a different system for brightness which uses absolute values. It opens up the spectrum so content's highlights, shadows, etc. can go into a much broader band (1000-nit peak to .05 black depth). When SDR goes outside of its narrow band, it crushes colors to white and muddies dark detail to black. HDR will show the actual colors at higher-brightness highlights (gleaming reflections, edges) without crushing to white. HDR shows the same content at the same brightness when that content falls within a calibrated SDR range; it does not scale up the brightness of the whole scene like turning the brightness of an SDR screen up would.

HDR video increases the range of brightness in an image to boost the contrast between the lightest lights and the darkest darks. If you’re having difficulty grasping how that translates into a more realistic image on your screen, think of the subtle tonal gradations a fine artist creates in a charcoal drawing to build the illusion of volume, mass, and texture, and you should begin to get the picture. But HDR doesn’t just improve grayscale; its greater luminance range opens up a video’s color palette as well. “Basically, it’s blacker blacks, whiter whites, and higher brightness and contrast levels for colors across the spectrum,” says Glenn Hower, a research analyst at Parks Associates.

The result is richer, more lifelike video images. Rather than washing out to white, as it would in conventional video, a ray of sunlight reflecting off a lake in HDR will gleam, and a bright cloud will appear soft and cottony. Basically any image your current TV would render shadowed, dull, muddy, or bleached out will look nuanced, vibrant, and strikingly realistic in HDR."

--------------------------------------------------------

"Realistically, most content will be mastered at 1,200-2,000 nits.
But that is the absolute peak brightness that highlights will reach. Maximum scene brightness is to remain below 400 nits."

--- This is what the uninformed think happens:
"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.
If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."

---This is what really happens:

If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the sdr ceiling.

HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays."

------------------------------------------------------

http://www.creativeplanetnetwork.com/news/shoot/understanding-gamma-and-high-dynamic-range/615702

"To achieve HDR, it is necessary to use bit-depth more efficiently while keeping average brightness about the same as an SDR image. This means more bit-levels for low light levels where the eye is more sensitive and fewer bit-levels for the high brightness areas where the eye cannot see the contouring. In others words, we need a Perceptual Quantizer or PQ that does a better job than the PQ of the current BT.709 gamma."

"Modern cameras are capable of capturing a wide dynamic range. But unfortunately SDR displays will either clip or blow out highlights in images."

All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it’s the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content."
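Since the quoted material keeps coming back to "absolute levels instead of ratios", here is a minimal sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10-style signals use: a code value decodes to an absolute luminance in nits, and a panel that can't reach the encoded value simply clips at its own peak. The clipping model below is a deliberately naive illustration, not how real tone mapping behaves.

Code:
# Minimal sketch (not a reference implementation): the SMPTE ST 2084 "PQ" EOTF
# maps a normalized code value to an ABSOLUTE luminance in cd/m2 (nits),
# unlike SDR gamma where output scales with whatever the display's peak is.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code, bits=10):
    """Absolute luminance (nits) encoded by a PQ code value."""
    n = code / (2 ** bits - 1)                  # normalize to 0..1
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def displayed_nits(encoded_nits, panel_peak):
    """Naive display behaviour: reproduce the absolute value, clip at the panel peak."""
    return min(encoded_nits, panel_peak)

# A highlight encoded below both peaks looks identical on a 1000-nit and a
# 500-nit panel; brighter highlights clip first on the dimmer panel.
for code in (400, 600, 700, 800):
    nits = pq_to_nits(code)
    print(f"code {code}: {nits:7.1f} nits -> 1000-nit panel {displayed_nits(nits, 1000):7.1f}, "
          f"500-nit panel {displayed_nits(nits, 500):7.1f}")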
 
Thanks elvn, this last post actually clears things up really nicely.

Am I wrong to think that SDR content viewed on an HDR display (such as those G-Sync displays we're talking about here) will be improved quite a bit vs an SDR display? (Not talking about SDR-to-HDR "upscaling" here, though that could prove interesting too.)
I mean, there have been SDR FALD displays before, right (even IPS ones)? And while they had some issues (low number of zones, haloing...), IIRC they performed quite well, not that far from OLED/Plasma, especially the latest models (which can now do HDR as well, of course).

Even if HDR content is coming, most of the time the monitor will be used to display plain-old SDR content after all.
 
No, you aren't wrong in regard to this display.

Like you said, a well implemented FALD array is superior to edge lit (which can suffer clouding, flaring, "flashlighting", uniformity issues). So that is an improvement in itself.

Being a VA panel in particular with such a high density FALD array should result in much better black depth and contrast compared to most gaming monitors.

It won't touch the .005 black depth of an OLED, but going from the pale 860:1 to 1000:1 contrast ratio and .12 to .14 black depth of IPS and TN gaming monitors to even a .04 or .05 black depth and perhaps a 3500:1 or higher contrast ratio in SDR (just guessing at the better numbers) would still be a huge difference. Non-FALD modern VA gaming screens shoot for a 3000:1 contrast ratio but end up around 2800:1 calibrated, so I'm just guessing at what the FALD on this monitor can do to improve that. On my low-density FALD VA TV, the local dimming increases the contrast ratio and black depth by a lot (around a 25% improvement). A very high density FALD like this might see an even larger increase.
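For reference, the black-depth and contrast-ratio figures above are two views of the same relation: black level = calibrated white / contrast ratio. A minimal sketch, assuming a ~120-nit calibrated white point (an assumption for illustration, not a measurement of these monitors):

Code:
# Illustrative only: black level follows directly from calibrated white luminance
# and contrast ratio (black = white / contrast). 120 nits calibrated white is an
# assumption, roughly what desktop monitors are often calibrated to.
white_nits = 120.0

for label, contrast in [("typical IPS/TN gaming panel", 1000),
                        ("non-FALD VA gaming panel",    2800),
                        ("hoped-for FALD VA (guess)",   3500)]:
    black = white_nits / contrast
    print(f"{label:28s} {contrast:5d}:1 -> black level ~{black:.3f} nits")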

HDR displays also aim for P3 color, which is a few % larger than Adobe RGB.
http://i.imgur.com/VR6gxX2.png
This display is quantum dot P3 color, so it should look good in that facet, even considering that SDR content's narrow range is clipped to white at the top of its range and muddied at the lows (like all SDR content). "Quantum-dot technology provides the cinema-standard DCI-P3 color gamut for realistic colors and smoother gradation".
 
The static contrast on a FALD display varies a lot depending on what is displayed.
The ~25% increase is about what you get when measuring a checkerboard pattern, like most reviewers do before calling it a day.
But a dense, starry sky, for example, will barely have more than native contrast, since such a pattern is very hard on any FALD.
On the other hand, a small logo in the center surrounded by black space might very well produce a static contrast over 500,000:1.
Almost none of the monitor review sites measure black level with high enough accuracy though (at least to the 3rd, better to the 4th decimal). The way it is now there are already noticeable rounding errors above 3000:1, and it gets totally useless above 10000:1.
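To illustrate the measurement-precision point, here is a quick sketch of what rounding the black-level reading to two decimals does to the reported contrast ratio (the 120-nit white level is an assumed calibration target):

Code:
# Rough check: round a true black level to 2 decimal places (as many reviews
# effectively do) and see what happens to the reported contrast ratio.
white = 120.0   # assumed calibrated white luminance in nits

for true_black in (0.040, 0.034, 0.012, 0.008):
    true_cr     = white / true_black
    rounded     = round(true_black, 2)            # 2-decimal meter/report precision
    reported_cr = white / rounded if rounded else float("inf")
    print(f"true black {true_black:.3f} -> true {true_cr:7.0f}:1, "
          f"reported {reported_cr:7.0f}:1 (black rounded to {rounded:.2f})")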
 
The higher the density of the FALD the better. The 27" versions of this monitor tech are 384-zone, which carries over to 512 on the ultrawide. That is quite a dense array, but nothing compared to OLED. As I understand it, even LG's OLED implementation uses all-white emitters (to get around color fade/burn-out issues), so in a way it is sort of like a per-pixel FALD array (8.29 million pixels at 4k), which sounds like the densest possible at 4k or below, unless someone invents sub-pixel dimming for some reason someday.

So, these monitors have 384 zones at 27", which increases to 512 zones with the added ultrawide screen width. Many FALD TVs only have 32, 64, or 150 zones, so these monitors have more than 2.5 times the zone count (256% of a 150-zone array, comparing 16:9 to 16:9) of what is common even in high-end FALD LCD TVs.
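The comparison above is by zone count; per unit of screen area the monitor's advantage is even larger, since TVs are physically much bigger. A back-of-the-envelope sketch, where the screen sizes (27" 16:9, 35" 21:9, 65" 16:9 TV) are assumptions for illustration:

Code:
# Back-of-the-envelope zone density (zones per square inch).
def area_sq_in(diagonal, aspect_w, aspect_h):
    """Viewable area of a flat panel from its diagonal and aspect ratio."""
    d = (aspect_w ** 2 + aspect_h ** 2) ** 0.5
    w, h = diagonal * aspect_w / d, diagonal * aspect_h / d
    return w * h

displays = [("27in 384-zone monitor",   384, area_sq_in(27, 16, 9)),
            ("35in 512-zone ultrawide", 512, area_sq_in(35, 21, 9)),
            ("65in 150-zone FALD TV",   150, area_sq_in(65, 16, 9))]

for name, zones, area in displays:
    print(f"{name:26s} {zones:4d} zones over {area:6.1f} sq in "
          f"-> {zones / area:5.2f} zones/sq in")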

http://televisions.reviewed.com/content/samsung-ks9800-series-hdr-led-tv-review
Local dimming on Direct LED (full-array) backlit TVs is a tricky thing to get right. If you have too few zones, you run into issues where you're dimming/brightening the wrong content during complex scenes. If your range of dimming isn't granular enough, you end up with stair-stepped shadow levels that dramatically change depending on their proximity to a bright object on screen.

The KS9800 has a bit of a problem with the latter issue. It has plenty of zones and zone control, but the difference between black levels when the screen is mostly shadow versus only partially shadow is pretty noticeable.

For example, if around 10% of the screen is dark, you can expect black levels of about 0.05 nits. If a large majority of the screen is dark, those levels drop dramatically, to about 0.005 nits.

While both measurements are great for an LED TV, the difference between the two is substantial, and occasionally the KS9800's presentation of shadows shows off this discrepancy in an aberrant way. It's not a huge problem for most content (especially brighter HDR content, or things like sports, news, weather, nature documentaries), but more complex, filmic content mastered for standard dynamic range can occasionally be given the staircase treatment.
 
A lot of people love the Samsung KS9800 with ~150 zones, the Sony X900E with ~30 zones, the Sony X940E with ~250 zones, the Sony Z9D with ~646 zones, the Panasonic DX902B with ~512 zones, and Vizio TVs with ~128 zones. All these FALD TVs have excellent black levels with great colors and details. I love the true black levels of OLED, but sometimes I feel the picture is very dim and the black color overshadows some details I want to see.
So I think 512 FALD zones for a 35" ultrawide will be enough to give us great contrast and black levels, especially with a VA panel + HDR 1000 nits.
 
Agreed. These monitors will have a lot of zones compared to most FALD TVs, so their potential is great. They are also going through a quantum dot filter and are of course VA to start with.

A high-density FALD VA HDR monitor is where it's at, in my opinion, until a true full-featured OLED gaming monitor (or similar more advanced tech) comes out some years ahead. It will be interesting to see what kind of black depth ranges these get especially, but also the FALD-driven quantum dot color reproduction and overall picture quality. Looking forward to it.
 
Definitely very exciting, I hope they really do come out by the end of this year.
 
Unless you typoed your resolution, 2560*1920 is essentially the same number of pixels as 3440*1440 (about 4.9 million). And anyway, sure, but in what games at what settings? 3-way SLI scaling is very bad and offers little to no improvement over 2-way SLI in many games, and at framerates in the 200 range games are often CPU-bound anyway. CSGO and Overwatch probably cover the vast majority of FPS gaming people do in 2017, for example, and it's not a problem to get Overwatch to average 144fps at 3440x1440 with one GTX 1080 Ti at Ultra settings; with settings minorly tweaked/bumped down, or with overclocking, an average of 200fps is probably achievable. With CSGO, framerates obviously aren't an issue at any resolution.

No, you're not gonna play Witcher 3 at 200fps, but that's what G-Sync is for; you don't need 200fps to enjoy the great black levels and insane contrast provided by HDR/FALD. And in the games where high refresh rate really matters, it's usually worth it to tune settings down a little for the improved motion resolution.

Yeah people don't know what G-Sync is. They hear 200Hz and think you have to run everything at that fps. G-Sync works from 40fps (or Hz) to the monitor's max Hz rating so in this case it's 200Hz.
 
What about this case?
Asus and Acer (the PG35VQ and X35) have 4:2:0 chroma subsampling while all the high-end monitors have 4:2:2 chroma subsampling, and we don't know if the HDR thing is gonna work on HDR movies, because the HDR only works with G-Sync and G-Sync only works while you are gaming.

W T F? Umm sounds like you're just throwing out specs and have no idea what you are talking about.
 
I've read this in a post on reddit, and the guy said we can set a lower refresh rate and get around that issue (to get 4:4:4).

That is true, because the HDMI 1.4 spec did not have enough bandwidth to carry the 4:4:4 data. Don't know how that got into this conversation. HDMI 2.0 has enough capacity.
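As a rough sketch of the bandwidth arithmetic behind "lower the refresh rate to get 4:4:4": required data rate is roughly pixels x refresh x bits per sample x samples per pixel, plus blanking overhead, compared against the link's usable payload. The 6% blanking overhead is an assumption; the payload figures are the usual post-encoding numbers for DP 1.4 HBR3 and HDMI 2.0.

Code:
# Rough link-bandwidth sketch; all numbers approximate.
LINKS = {"DP 1.4": 25.92, "HDMI 2.0": 14.4}   # usable payload, Gbit/s
BLANKING = 1.06                               # assumed blanking/timing overhead

def gbps_needed(width, height, hz, bits_per_channel, chroma):
    # 4:4:4 carries 3 full samples per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bits_per_channel * samples * BLANKING / 1e9

for hz in (200, 144, 120):
    for chroma in ("4:4:4", "4:2:2"):
        need = gbps_needed(3440, 1440, hz, 10, chroma)
        fits = ", ".join(f"{name}: {'ok' if need <= cap else 'no'}"
                         for name, cap in LINKS.items())
        print(f"3440x1440 @ {hz} Hz 10-bit {chroma}: {need:5.1f} Gbit/s ({fits})")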
 
"Comming soon" statement from ASUS means "maybe this year".
 
I'm more interested in this than the PG27UQ because it's a VA panel, which means that the haloing in FALD mode should be much less obvious due to the panel being able to block more of the light. Yeah, it's going to have response time/dark trailing issues like every VA panel but that's probably OK as long as it looks decent at 120hz. I don't expect it to look correct at 200hz.

It is really too bad it isn't 3840x1600 though.
 
Well, they said Q2, so that is April-June, assuming they hold to their last update, which was a while ago. I didn't hear them give any time frame during CES, which was disappointing.
 

Yes, and it's a much higher density FALD than most. 384 zones for 16:9 and 512 for 21:9 is a lot of direct LEDs.

A gaming VA would start out with a worse black level than a VA TV (around a 3000:1 contrast ratio rather than nearly 6000:1), but the trade-off is that the gaming panels have much faster response times (and higher Hz). That's still triple the black depth and contrast ratio of IPS and TN to start with. The FALD further improves this contrast ratio and black depth by a lot, and with that many zones probably quite well.

My FALD VA TV gets a 7625:1 contrast ratio uncalibrated, 8157:1 calibrated/tweaked. By comparison, with FALD off it is 5882:1 uncalibrated, so it loses a lot of contrast ratio (~23% lower; it's 77% of 7625:1). The direct backlight on my TV is 32 zones, so dynamic contrast actually works pretty cleanly; you don't notice the whole screen changing or popping other than very rarely if you are looking for it, and it doesn't lose contrast overall by blanching in zones like edge lighting tends to do. 32 zones means 4 quadrants of 8 direct LED backlights, I would think. I think the P series has double that at 64 (likely 16 per quadrant). The FALD makes a huge difference even at that, with VA black levels to start with. I just watched the dark car scene in Blade Runner 2049 (no spoiler really, but anyone who saw it will know what I'm talking about) and it looked great. I can't imagine how good a 384-zone (or 512-zone ultrawide) 1000-nit HDR one would look. That's like 96 direct LED backlights per quadrant at 16:9, 12 times more dense than what mine has and 6x more dense than the P series.
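A two-line check of the percentages quoted above, using the measured values from this post:

Code:
# The arithmetic is just contrast-off / contrast-on.
fald_on  = 7625   # contrast ratio with local dimming enabled, uncalibrated
fald_off = 5882   # contrast ratio with local dimming disabled, uncalibrated

ratio = fald_off / fald_on
print(f"FALD off is {ratio:.0%} of FALD on, i.e. ~{1 - ratio:.0%} lower")   # ~77% / ~23%
print(f"FALD gives roughly a {fald_on / fald_off - 1:.0%} uplift")          # ~30% uplift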

The fact that this isn't 4k might be an issue for movies, but for games most people aren't going to get a 100fps+ average or better at 4k in demanding games with their GPU power, even if they turn the graphics down a bit. So this resolution might actually be a good thing for filling the high Hz, in order to get the benefit of at least the 70-100-130 fps range. Personally I'd rather have a bigger 4k display that I could run at a 21:9 resolution (or even a smaller 16:9 resolution if necessary) for gaming, letter-boxed, which would still be a decent size, and I'd gain frame rate. Then I'd swap back to 4k for desktop/apps/movies and less demanding games.
 
That's just another LCD FALD backlight. Totally different from micro-LED, where the LEDs are the actual pixels.
 
While it's not microLED, I'm cautiously optimistic that mini LED backlighting could address some of the biggest shortcomings of LCD and be "good enough" as an interim solution for the next few years. The FALD backlighting implemented in the current and announced 27" displays is only using 384 zones, which makes each zone around an inch across and is what gives you the noticeable haloing we've seen on e.g. the Dell UP2718Q.

With mini LED, we could see the number of zones increased to 7,000-9,000 (112*63 and 128*72 are 16:9 grids of 7,056 and 9,216 zones, respectively), which means the zones would be less than a quarter of an inch across. At 256*144 you get to ~37,000 zones, each less than 0.1", or about 2mm. At that size, haloing should be all but unnoticeable 99% of the time.
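A quick sanity check of those zone sizes, assuming a 27" 16:9 panel (roughly 23.5" by 13.2" viewable):

Code:
# Zone width for a few hypothetical mini LED grids on an assumed 27" 16:9 panel.
panel_w_in, panel_h_in = 23.5, 13.2

for cols, rows in [(112, 63), (128, 72), (256, 144)]:
    zones = cols * rows
    zone_w = panel_w_in / cols
    print(f"{cols}x{rows} grid: {zones:6d} zones, each ~{zone_w:.2f} in "
          f"(~{zone_w * 25.4:.1f} mm) across")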

Samsung is moving in this direction on their TVs, with micro dimming set to debut on at least one model later this year. Combined with improvements to light rejection in the panel and QDCF (photo emissive quantum dot), the next couple of years look like they're shaping up to bring some of the most significant upgrades to raw image quality LCD has seen in well over a decade.

As for OLED, while it might still be better in some respects, so far there's still no indication that we'll see panel sizes in the optimal desktop PC range of 27-35". Burn in is also still a concern, and while the 2018 LG sets may have made some improvements to combat it, there's no reason to believe it's a solved problem yet. Perhaps if/when they move to blue OLED+QDCF (thus increasing efficiency and lifespan) burn in will be reduced to the point where it's no longer a significant concern. That doesn't seem likely to happen in the next year or two, however.
 
I haven't read anything that states how many zones "mini LED" is supposed to have. Until we have that number, we are just spitballing. Also, for FALD it's not just how many zones but also the delay that can be very annoying. The 27" Dell and the 32" ASUS FALD have both been shown to have very slow FALD backlights, making for halos trailing moving objects, which is very annoying.

I found the 27" Dell FALD to be completely unusable.

OLED better in some respects... more like virtually all respects. With 2018 sets, even LCD's saving grace of being cheaper isn't true anymore in the premium TV segment. The "burn in" issue is completely overblown. I have gamed on OLED for years with zero issues. I have no sympathy for people who buy an OLED to watch CNN for 8 hours a day at max brightness. What a waste. I will concede that if you want to do nonsense like that, get an LCD.
 
I haven't read anything that states how many zones "mini LED" is supposed to have. Until we have that number, we are just spitballing.
Presumably it depends on the product category and price target, but ~10,000 LEDs is a number that comes up from a few different sources:

ledinside.com said:
Take a TV panel as an example, before using mini LED, edge-type backlight only needs dozens of high luminance LED. Smartphone backlight merely requires 25 LED units. After applying mini LED, TV panel backlight set contains as many as tens of thousands of mini LED units. A 5-inch smartphone screen would employ an amount of mini LED backlights up to 9,000 to 10,000 units.
Source

digitimes.com said:
For direct-type backlighting of automotive displays, a single panel will use several thousand to over 10,000 Mini LED chips
Source

news.cnyes.com said:
It is reported that a 5-inch smart phone panel backlight requires 20-25 LED chips or about 9000 miniLEDs
Source



It was also previously reported that AUO would be adopting mini LED backlighting:
oled-a.org said:
Paul Peng, CEO of AU Optronics (AUO) said the company would adopt mini LED chips for backlighting in 2018 [...]
Mini LED reduce the size of chips used in direct-type backlighting, and a backlight unit (BLU) for a notebook panel may need 6,000-7,000 mini LEDs, with better brightness, color and contrast than current LED chips, Peng noted. However, mini LED BLUs incur higher costs and therefore are unlikely to be adopted for inexpensive products such as notebooks in the US$500 range, Peng said. But they can be used in niche-market applications, such as those for gaming notebooks, PCs and professional displays, priced over US$1,000, Peng noted, adding AUO is in talks with clients about possible mini LED backlighting applications.

And AUO just showed a 27" panel with mini LED backlighting on March 8th, scheduled to ship this year:
ledinside.com said:
AU Optronics (AUO) on March 8 released the world's first Mini LED-based product line, including a gaming monitor, a gaming laptop, and a small-sized display for VR wearable devices. The company will begin to ship its Mini LED products in 2H18 and the large-scale gaming monitor will be the first to be on the market [...]
Products AUO showcased on March 8 were a 27" gaming monitor, a 15.6" gaming laptop, and a 2" VR head-mounted display.

Then again, their previously announced FALD displays have been repeatedly delayed, so we'll see.



On the TV side of things, the 8k TV that Samsung showed at CES reportedly had over 10,000 dimming zones:

forbes.com said:
While Samsung wouldn’t reveal the exact number of local dimming zones in operation with the direct LED lighting system, it did say that it’s more than 10,000.


OLED better in some respects... more like virtually all respects. With 2018 sets, even LCD's saving grace of being cheaper isn't true anymore in the premium TV segment. The "burn in" issue is completely overblown. I have gamed on OLED for years with zero issues. I have no sympathy for people who buy an OLED to watch CNN for 8 hours a day at max brightness. What a waste. I will concede that if you want to do nonsense like that, get an LCD.
For media without static logos, OLED is probably fine. But it's not suitable for desktop PC use at this point, where typical usage might include leaving a static desktop with a browser window or other applications open for 8-12 hours a day. Not to mention the lack of suitably-sized displays for a typical desk.
 
Everything sourced above is speculative. I see a lot of "may" and "would". I've not seen anything that says a shipping product X has Y number of mini-LEDs. Marketing can be quite different from what you actually buy in the store; in marketing you can promise the moon.

I would agree that OLED does not fit into the "content creation" role all that well. Having a spreadsheet open for 12 hours per day doesn't utilize OLED's strengths anyway. The cheapest LCD you could buy could do that just fine, so not sure why you would even consider an expensive OLED. OLED is more for the role "content consumption". Games, movies...
 
I heard something about Xbox and FreeSync support, so maybe consoles could make VRR tech more universal someday, since a lot of people play consoles on TVs (hopefully eventually including OLED TVs). Not sure if that would ever hash out for Nvidia GPUs like mine, though, unless Nvidia started supporting FreeSync or HDMI VRR. A 120Hz native HDR 4k OLED with VRR that worked on an Nvidia GPU would indeed be tempting.

https://www.anandtech.com/show/1252...port-to-xbox-one-s-and-xbox-one-x-this-spring

https://www.pcgamer.com/xbox-one-s-and-one-x-are-getting-amd-freesync-2-support/

https://www.extremetech.com/gaming/...dd-amd-freesync-support-xbox-one-s-xbox-one-x
 
Everything sourced above is speculative. I see a lot of "may" and "would". I've not seen anything that says a shipping product X has Y number of mini-LEDs. Marketing can be quite different from what you actually buy in the store; in marketing you can promise the moon.
The evidence points to it being the direction the industry is moving in, however. But the proof of the pudding is in the eating, as they say, which is why I said I'm only cautiously optimistic.

I would agree that OLED does not fit into the "content creation" role all that well. Having a spreadsheet open for 12 hours per day doesn't utilize OLED's strengths anyway. The cheapest LCD you could buy could do that just fine, so not sure why you would even consider an expensive OLED.
Because there are lots of people that use their desktop PC for everything - work, games, movies, general web browsing - and want a display that does all of those well. Currently, every available display technology involves some trade off, and falls down in one or more of those categories.
 
If we can use Micro LED as a solution, why are we using Mini LED now in 2018?
 
Because Micro LED is a pipe dream as of now. It's only a prototype, and in use on a ginormous display. Physical LEDs are extremely hard to scale down in size. May never happen at desktop monitor sizes. Only a few technologies can handle 4K resolutions at desktop display sizes.
 
If we can use Micro LED as a solution, why are we using Mini LED now in 2018?
There are manufacturing challenges that need to be overcome for microLED to be viable for consumer products. See this article.

Basically, LED arrays are traditionally manufactured using "pick & place", where each individual LED is placed one at a time. That's okay when you only have a few hundred to a few thousand relatively large LEDs, as in a full array backlight for an LCD, but isn't economical when you need 24 million tiny LEDs for a 4k display. So they need to develop new manufacturing processes to do it, and that takes some time.
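To put the pick-and-place problem in perspective, here is a rough throughput sketch; the placement rate is purely an assumed figure for illustration:

Code:
# Why serial pick & place doesn't scale to per-pixel LEDs (illustrative math only).
subpixels_4k = 3840 * 2160 * 3          # ~24.9 million individual LEDs
placements_per_second = 10              # assumed serial pick & place rate

hours = subpixels_4k / placements_per_second / 3600
print(f"{subpixels_4k:,} LEDs at {placements_per_second}/s "
      f"-> ~{hours:,.0f} hours (~{hours / 24:.0f} days) per display")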

Expect microLED to show up in smaller applications like smartphones and smartwatches before it makes it into bigger displays. TVs/monitors probably won't happen for at least 4-5 years.

 
If we can use Micro LED as a solution, why are we using Mini LED now in 2018?

Micro LEDs are never coming to monitors. Note that costs of micro LED displays scale with resolution, whereas costs of OLED displays and LCDs scale with panel area. Micro LEDs are grown on expensive compound semiconductor wafers, and even if the chips can be made tiny, it's easy to calculate that an 8k display would more or less require the whole output from a single wafer.
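Sketching that wafer calculation out with assumed numbers (the die pitch and wafer size below are illustrative guesses, not published figures), you land in the region of one to a few wafers per 8k display:

Code:
# Rough micro LED wafer-consumption sketch; die pitch and wafer size are assumptions.
import math

subpixels_8k = 7680 * 4320 * 3              # ~99.5 million micro LEDs
die_pitch_um = 20                           # assumed die size incl. dicing street
wafer_diam_mm = 150                         # assumed 6" epitaxial wafer

wafer_area_um2 = math.pi * (wafer_diam_mm * 1000 / 2) ** 2
dies_per_wafer = wafer_area_um2 / die_pitch_um ** 2
print(f"~{dies_per_wafer / 1e6:.0f} M dies per wafer vs {subpixels_8k / 1e6:.1f} M needed "
      f"-> ~{subpixels_8k / dies_per_wafer:.1f} wafers per 8k display")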

As for mini LEDs, they would seem to offer the biggest improvement LCDs have received since, well, ever. That said, given the way the OLED juggernaut is advancing, I have a feeling even that will be too little, too late:

http://english.etnews.com/20180309200003

Because Micro LED is a pipe dream as of now. It's only a prototype, and in use on a ginormous display. Physical LEDs are extremely hard to scale down in size. May never happen at desktop monitor sizes. Only a few technologies can handle 4K resolutions at desktop display sizes.

Actually the problem is not scaling down the size of the chips but their mass transfer from a wafer to a display substrate. Micro LED chips can be made very small (by definition they are measured in microns) and could allow a pixel density of over 1,000 ppi. However, the smaller the chips are, the more difficult they are to handle. Samsung showcased a 150" display because at that size micro LED displays can actually be competitive.
 
Actually the problem is not scaling down the size of the chips but their mass transfer from a wafer to a display substrate. Micro LED chips can be made very small

Semantics... they can't get them, at a small enough size, onto a display substrate.

That TCL OLED printing news, and the hope for consoles driving VRR adoption toward being more standard, is very hopeful information for the nearer future. If they can make 3 to 5 times more FALD backlight zones, that would be welcome of course, but the cost and timeline are out there. It does sound nearer than true emissive quantum dot displays, at least.
 
I'd been looking forward to 200Hz 3440x1440 or 144Hz "big" (i.e. 34"+) panels for a while until...

white, electrical-fire-smelling smoke came off my bottom Titan Xp! POOF. It's toast; I'm down to just one card. Not gonna max out 3440x1440 at 200 FPS with any current single card out there, and not paying 1800 USD (thanks, crypto) for a new Xp when hopefully next-gen stuff is due by the end of summer.

Next gen is due this summer, r-r-right guys?
 