It's OLED, so there will be image burn-in.

Hard, hard pass. I'll wait for microLED.

The OLED laptop I've had for a year and a half has zero burn-in. Enjoy crappy LCDs for the next decade waiting for some pipe-dream microLED in computer displays because of the bogeyman "burn-in", LOL.
 
With the push for thin HDR displays, this makes sense. You are never going to get an LCD backlight good enough for reasonable HDR that's also thin enough for a laptop. Not to mention that most people who say they have burn-in actually have image retention, and they just need to watch something else for a bit.
This is the fallacy: burn-in is unlikely unless the panel is abused. Image retention, on the other hand, is more common (1.5 years with my C7 and none to this point), but it can happen. The extreme test RTINGS did, which people like to quote, is unrealistic for any consumer use: 24/7 for a year apart from refresh cycles... c'mon, even the taskbar isn't going to be on screen anywhere near that long.
 
Are there any mobile GPUs that can handle that screen while gaming?

The 2080. Also, let me add that at that PPI you would have to be a complete moron to run any kind of full-screen AA; in fact, the 2070 can probably do just fine as long as you don't mind the 20-60 fps range. The desktop 2070 averages roughly a 14 fps minimum and a 46 fps average on ultra settings with AA cranked, so disabling AA or dropping down to high settings should really hold up well. Realistically, until we have more varieties of ray-tracing testing there is no way of clearly seeing what to expect. Quake 2 apparently runs well at 1440p, but while that shows what can be done with lighting and path tracing, it may not translate to decent numbers for newer rasterization games that are more taxing.
 
The only OLED screen I've had is the one on my original Samsung Galaxy. I used the phone normally, and it still burned in noticeably. Hopefully OLED tech has advanced enough that this isn't a problem anymore.
 
I've had burn-in on the OLED screen of absolutely every phone I've owned that had one, and that's about half a dozen, all under 2 years old.

I'll trust random forum members about magic computer monitors that don't suffer the burn-in every other use of OLED screens suffers from, right after they buy me one that doesn't.
 
Windows 10 can do per-display scaling.

Yes, that's why I said "up until I got Windows 10".

I'd still argue that if you have to scale it, it's a fail. The target DPI for any computer should be in the 95-110 range. Anything more than that requires scaling, and scaling is a waste.
 
With the push for thin HDR displays, this makes sense. You are never going to get an LCD backlight good enough for reasonable HDR that's also thin enough for a laptop. Not to mention that most people who say they have burn-in actually have image retention, and they just need to watch something else for a bit.

Agreed... IR != burn-in.

Image retention isn't as bad as burn-in, but they are related and both still bad.

I occasionally get some noticeable IR on my Panasonic plasma, and I run the white scrolling bars to get rid of it, but every time you do, there is wear on the panel and contrast drops.

For my desktop I won't get any technology that isn't completely IR- and burn-in-free. There are just too many GUI elements that sit in the same place on the screen all the time.

Hopefully MicroLED winds up being this technology, and can be shrunk down far enough.
 
Just hit me up with like a 27-inch, 1440p OLED panel at 144+ Hz.


3.4 million colors? 24 bit is 16.8 million, 30 bit is 1 billion, right?

2^depth

So yes, 2^30 = 1073741824 colors
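
If anyone wants to sanity-check the arithmetic, a throwaway Python snippet will do it (it's just 2^depth, nothing clever):

```python
# Number of displayable colors for a given total bit depth (bits summed across the R, G and B channels).
def color_count(bits_per_pixel: int) -> int:
    return 2 ** bits_per_pixel

print(color_count(24))  # 16777216   -> the "16.8 million colors" of 24-bit (8 bits per channel)
print(color_count(30))  # 1073741824 -> the "1.07 billion colors" of 30-bit (10 bits per channel)
```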
 
I'd still argue that if you have to scale it, it's a fail. The target DPI for any computer should be in the 95-110 range. Anything more than that requires scaling, and scaling is a waste.

While I generally don't disagree given the quality of desktop scaling available, at some point we're going to have to move on.

The nice thing is that proper scaling technology is being developed alongside the hardware, so we should see 'resolutionless' displays eventually.

Essentially, input resolution and scaling won't matter; the input side will be fast enough to output enough visual fidelity, and the displays will just spit it out accordingly; at the same time, we'll see the software (OS and apps together) respond to higher resolution where possible with more detail, say for graphic work or font smoothing (through super-sampling) and so on.
 
Image retention isn't as bad as burn-in, but they are related and both still bad.

I occasionally get some noticeable IR on my Panasonic plasma, and I run the white scrolling bars to get rid of it, but every time you do, there is wear on the panel and contrast drops.

For my desktop I won't get any technology that isn't completely IR- and burn-in-free. There are just too many GUI elements that sit in the same place on the screen all the time.

Hopefully MicroLED winds up being this technology, and can be shrunk down far enough.

The fact is that the IR I get on my LG OLED only shows on a black screen from the previously displayed content, and is then IMMEDIATELY wiped out by the next content. It really does not bother me in the slightest. It does not compare, for example, to plasma - I had several plasma TVs and there the image retention would last quite some time. If IR is the reason you wouldn't go for OLED based on experiences with plasma, I can confidently state that you can fire away and it will not be an issue.

Disclaimer: This is of course assuming you don't leave static images on it for aaaages. But then there's the built-in pixel refresher as well.
 
Yes, that's why I said "up until I got Windows 10".

I'd still argue that if you have to scale it, it's a fail. The target DPI for any computer should be in the 95-110 range. Anything more than that requires scaling, and scaling is a waste.

Apologies, misread about the Win10 thing :)

No - scaling is absolutely sensible, as long as the factor is a clean multiple like 2x. This is what macOS generally does, for example. If you have 4K and use 200% scaling, the screen looks exactly the same as 1080p in terms of content size etc. but with greater detail; that's a win, not a loss, and makes absolute sense.

I DO agree that scaling where content does not look the same and you're having to compromise or find a least-worst option is stupid.
 
Disclaimer: This is of course assuming you don't leave static images on it for aaaages. But then there's the built-in pixel refresher as well.

I'm sure OLEDs are fantastic for TV, but this is why I'd be concerned on the desktop. The Start menu would seem like a prime candidate to always be on screen, and thus a great risk for burn-in.
 
Apologies, misread about the Win10 thing :)

No - scaling is absolutely sensible, as long as the factor is a clean multiple like 2x. This is what macOS generally does, for example. If you have 4K and use 200% scaling, the screen looks exactly the same as 1080p in terms of content size etc. but with greater detail; that's a win, not a loss, and makes absolute sense.

I DO agree that scaling where content does not look the same and you're having to compromise or find a least-worst option is stupid.

Well, I feel like 110 DPI is plenty of detail at 100% scaling.

Increase the pixel density and all you are doing is increasing load on your GPU, increasing power use, reducing battery life and wasting money on pixels you don't need.

Don't get me wrong, I love high resolution displays, but I do it for the extra screen real estate, not for higher DPI.
 
increasing power use

This is a big one for LCDs, since increasing resolution shrinks the pixel apertures and thus increases the need for backlighting. I'm not sure it matters as much for OLEDs, though: since they are emissive, each pixel only has to output roughly 1/n of the lumens needed to hit a total brightness figure for the screen.

It's also a particular gripe I have with Dell - they don't like putting 1080p panels and 16 GB of RAM on the same ultrabook. You get to pick memory or battery life, but not both. Perhaps an OLED panel would change that?
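
To put a rough number on the per-pixel brightness point above, here's a back-of-the-envelope sketch. It assumes Lambertian emission and uses made-up panel dimensions and a 400-nit target, so treat the output as illustrative only:

```python
import math

# Rough sketch of the "each pixel carries 1/n of the light" idea for an emissive panel.
# Assumes Lambertian emission (flux ~ pi * luminance * area); all numbers are illustrative.
def per_pixel_lumens(nits: float, width_m: float, height_m: float, pixel_count: int) -> float:
    total_lumens = math.pi * nits * (width_m * height_m)  # approximate whole-panel luminous flux
    return total_lumens / pixel_count                     # each pixel's average 1/n share

# Hypothetical 15.6" 16:9 panel (~0.345 m x 0.194 m) driven to 400 nits full-screen white, 4K grid:
print(per_pixel_lumens(400, 0.345, 0.194, 3840 * 2160))   # ~1e-5 lumens per pixel
```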
 
Well, I feel like 110 DPI is plenty of detail at 100% scaling.

Increase the pixel density and all you are doing is increasing load on your GPU, increasing power use, reducing battery life and wasting money on pixels you don't need.

Don't get me wrong, I love high resolution displays, but I do it for the extra screen real estate, not for higher DPI.
On a 27-inch and larger screen, 1080p looks like arse. 4K looks amazing. 200% scaling means no compromising on displayed content size.

With 1440p you're stuck either at 1:1 or with a fractional factor, since 200% isn't workable there, so 4K is far better here.
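
To make the scaling arithmetic concrete, a quick sketch (the resolutions and scale factors are just my own examples; real OS scaling pipelines do a lot more than this):

```python
# Logical desktop size for a given physical resolution and scale factor; the factor is also
# how many physical pixels each logical pixel spans per axis. This only shows the arithmetic
# behind the argument above.
def logical_size(width_px: int, height_px: int, scale: float) -> tuple:
    return width_px / scale, height_px / scale

print(logical_size(3840, 2160, 2.0))  # (1920.0, 1080.0) -> clean 2x2 mapping, 1080p-sized content
print(logical_size(2560, 1440, 2.0))  # (1280.0, 720.0)  -> content far too large for a 27" desktop
print(logical_size(2560, 1440, 1.5))  # (~1706.7, 960.0) -> fractional mapping, hence the compromises
```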
 
Lordy, do folks get going. It's an option! Frankly, supply is going to be limited, so those who don't want them, don't worry.
If you can't run 4K in a game, run it at 1920x1080 - that's the res you'd get from a lower-res screen anyway. I have a 4K 15.6" notebook and it's great. Frankly, the resolution is a real joy for any kind of text, and watching videos in full 4K is quite a sight even on a small screen. You wouldn't want a phone with a 300x500 resolution, yet you can't stand a notebook with high DPI? OK, well, some of us don't mind it at all.

Many points to cover:
  • Burn-in is an issue, but it is becoming less common as manufacturers get more skilled at making screens
  • Phones get the least love because vendors are all about the lowest cost, despite the pretty screen
  • Samsung made an announcement at CES that 13.3", 14" and 15.6" sizes would be available
  • The downturn in the phone and tablet industries quite probably leaves them with excess manufacturing capacity (that is an educated guess)
  • Remember that this also means vastly better response times. No, you won't get 144 Hz yet, but pixel transitions will be vastly better than LCD
  • Better daylight visibility too
  • High-end 20xx GPUs will handle 4K gaming - remember you are not likely to need any AA on a 15.6" 4K screen (see the quick PPI math after this list)
  • MicroLED is going to be very hot (the LEDs have to be driven harder to produce the same light as current panels)
  • MicroLED is going to take extremely precise manufacturing of every LED to get a precision screen, which will be expensive
  • MicroLED can burn out the same as LEDs do, and an expensive screen with burned-out pixels isn't going to make warranties cheap
  • Meanwhile OLED keeps improving in durability, gets even cheaper to produce at scale, and is already relatively cheap
  • OLED is quite probably the most cost-effective way to get HDR on laptop-size displays - high-nit screens take too much power and produce too much heat otherwise
  • Every single device category the world over is moving towards high-DPI displays - the market demands that over nearly all other features
  • High contrast, high DPI, superior color, equal or lower power consumption, thinner form factor, easy manufacturing
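
As promised above, the quick PPI math behind the "no AA needed" point (16:9 panels assumed; these are my own numbers, not anything from Samsung's announcement):

```python
import math

# Pixels per inch from resolution and diagonal size (16:9 assumed for the examples below).
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 15.6)))  # ~282 PPI for a 15.6" 4K panel
print(round(ppi(1920, 1080, 15.6)))  # ~141 PPI for the same size at 1080p
```

At roughly 282 PPI, individual pixels are hard to resolve at typical laptop viewing distances, which is why aliasing is far less visible even with AA off.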

I'll grant you that you could be skeptical. But nearly every phone maker has adopted the technology, because the visual impact on the buyer sells devices. OLED in TVs is putting other display types out of business. LG can literally sell as many OLED panels as it can manufacture. Yes, there are risks and issues. For myself, if the OLED panel burns out, replacing it with an LCD is cheap these days. And nearly everyone who has used an OLED TV will tell you there is very little going back. Older laptops stopped using them in part due to supply constraints, and that may still be the case if Samsung is just selling what it has excess factory space to produce. For that reason, if you are in doubt, hold back. It's fine. But many of us have long been awaiting the coming of OLED to laptops, in particular ones with GPUs powerful enough to actually game on. AW is even making a 55" OLED monitor. This tech has been limited by supply and durability. Durability has greatly increased. Supply is finally making it possible to have them in notebooks. Be happy :)
 