sharknice
Are there any <48" models of these planned?
55" is the smallest
Are there any <48" models of these planned?
No one has produced even a prototype of a display that can compete with Gsync HDR yet so this seems pretty unlikely.
55" is the smallest
Figures. Hopefully Wasabi adds Freesync support to their 120Hz 4K models so I can pick up the 43" for my 1080 Ti.
Updated burn-in test from RTINGS. Only the sets that were running CNN for 20 hours a day showed burn-in. The sets used only for gaming were fine even after thousands of hours of use. Sorry, but while burn-in is a very real possibility, this whole scare can also be very overblown for many users.
Go figure.
But it won't stop the Burn-In Brigade around here from hating on OLED despite never having owned one. They see white-paper specs and torture tests and are paranoid they'll have a burned-in screen after 5 minutes of use.
Very interesting video, thanks. It would be nice if they'd back it up with some kind of warranty. Samsung's LED FALD LCDs have a 10-year burn-in guarantee. With an OLED, if it happens (and there are many claims of it in product reviews on major sites), you are screwed.
6:15 "This also indicates that there may be a 'luck component' that in some cases may be more significant than the theoretical lifespan"
It is encouraging especially for OLED enthusiasts considering HDMI 2.1 sets are coming out.
Regardless, OLED still can't do high HDR color, and part of the reason it can't (and why that gap will only grow) is the exact same reason those sets fared well in the tests: avoiding burn-in risk. It's sort of like shrinking your color gamut out of fear that flying too high into it will wreck your display.
They do around 400-nit color, and the added white subpixel can push brighter whites, but not brighter HDR color volumes. So they show much less color past 400 nits or so, even if their peak brightness reads 800+ nits for brief periods. They also roll brightness down, dimming the whole screen on scenes with high overall *color* brightness as a safety trigger. In both of these dynamic scenarios their colors are held back to save the screen from burning in.
If showing true HDR color past SDR-ish/quasi-HDR volumes (into the 1000-2000-nit range, and eventually toward the 4000 and 10,000 nits that HDR is mastered for, on future TVs) doesn't matter to you, and you're willing to take the burn-in risk, then the black depths and per-pixel side-by-side contrast on OLED are amazing.
If you want real HDR color volumes of 1000-2000-nit color out of HDR1000 and HDR4000/10,000 content, without being held back by safety mechanisms and caps, and without playing the burn-in luck factor, OLED might not be for you.
Of course, the current state of FALD and small dynamic-backlight arrays has its own tradeoffs: dimming- or blooming-favoring algorithms across screen zones, since they aren't per-pixel. So they aren't a perfect solution either, but personally I'm more interested in achieving real HDR color volume going forward.
https://www.engadget.com/2019/01/08/alienware-4k-oled-55-inch-gaming-monitor/
It seems like this may be an alternative for those interested in jumping in early. It has DP 1.4 support, so it avoids the need for HDMI 2.1 on video cards to get 4K 120Hz.
Yes, true. But look at Micro LED: in just one year Samsung has managed to shrink The Wall from 146 inches down to 75. In another 2-3 years they may get it down to 55 inches, and HDR content will be more prevalent by then. If you absolutely need real HDR performance without any burn-in risk, MicroLED is the way to go, not these crappy FALD LCDs.
Micro LED has been at about that size for about a year now.
Hmm.
Looking at all the announced 2019 models, they are all 65+ inches, which means they are useless as monitors.
I'm tired of being paranoid with the Panasonic plasma I use in my home theater.
Nope. The C9, B9, and E9 are available in 55", as usual. We already knew there wouldn't be anything in the 40-50-inch range until next year at the earliest, if ever. It's not a popular size anymore, and LG panel production is focused on grabbing as much TV market share as possible; it doesn't make sense for them to spend capacity on a specialized monitor size, for lots of reasons. The market for PC monitors that cost more than $500 is very small.
If it was one of the last gen ones, I doubt there's any reason to. I've had a Samsung PN60F8500 since 2014 and I've forgotten about it with the Windows desktop up for 12+ hours and it didn't even have image retention. I don't think the last gen plasmas even had burn-in at all.
You have to pour thousands of hours of identical static content into an OLED to burn anything in.
The market may be what it is, but 43" is my max size. Above that, no sale.
It's a shame this size has been relegated to the realm of budget TVs.
I paid $2k for my 2015 48" Samsung JS9000 and would do it again for the right TV. I want it a little smaller, and with VRR this time, though.
It has to be top-end picture quality though. I don't want some cost-reduced model just because it's smaller.
Throw the best picture quality you can muster into a nice little 43" package, omit any and all smart functions, and we're in business.
Then your best bet is the Asus XG438. 43 inch VA panel with 120Hz VRR.
I can understand not connecting a TV, given their snoop history, but then again I keep an Amazon Echo connected; it's just too damn convenient. I do agree about at least connecting a TV once in a while for firmware updates. That's a good point.
Besides the TV snoop angle, smart-TV apps are usually slower and clunkier. You can add an Nvidia Shield TV or an HTPC instead. Shield and consoles also have Netflix, Prime Video, Plex, YouTube, etc., plus Twitch, which Roku doesn't, and Shield can run Kodi if you're into that as well. YouTube especially seems resource-hungry/un-optimized, so it performs worse and crashes more often on smart TVs and weaker Rokus, but is fast on Shield. The whole Shield interface is snappy and has the entire Google Play (Android) store plus Nvidia's game store available.
It's crazy that Nvidia didn't put HDMI 2.1 in the RTX cards. The 2080 Ti is already $1,200, lol, fn Nvidia.
Even if HDMI 2.1 is 6 months out, it's not like Nvidia has any competition in the high-end market.
There wasn't enough time between the HDMI 2.1 spec release and designing the Turing architecture to include HDMI 2.1.
I wonder if NVIDIA will find itself in a similar situation to 2018, where RTX inventory could stagnate because gamers hold off for HDMI 2.1. As a result, we could see big sales as we head into spring and summer.
I have no horse in the race when it comes to HDMI 2.1 on the GPU end, though. I picked up a 1080 Ti when there were fire sales going on this fall, so I won't be back in the market until 2021 at the earliest. At least by that point, HDMI 2.1 will be firmly established.
I really don't even get why companies like Samsung are at the forefront of stuff like MicroLED. You'd think that AMD and Intel would want to get in on that action since making stuff small is basically their area of expertise.
Nvidia chose to release Turing when they did because they had already spent the R&D money on ray tracing cores and an architecture for them for self-driving cars, and then that deal fell through and they had to make money off their spent R&D.
Perhaps this is why they chose to release a low-inventory, high-price card when they did.
They knew there would be excess inventory leading up to HDMI 2.1.
The high price also helped shift excess 10xx series stock.
This way they make a ton more profit on the new cards and have fewer issues moving forward.
Yeah no, it's big stuff.
Plus there's a ton of patents and know-how involved.
It's far less complicated than what Intel already does.
Oh, and I just realized: if you run everything through a stereo receiver, that will have to have HDMI 2.1 as well.
More waiting on updates I guess.
Both TFTcentral and digitaltrends articles say the Dell Alienware 55" OLED gaming monitor has both DP 1.4 and HDMI 2.1.
The Samsung Q9 series ~1800-nit LED FALD HDR VA LCD TVs (phew, that's a lot of acronyms) have 1440p 120Hz at 10ms input lag and VRR on HDMI 2.0b. The LG C8 OLED is still at around 21ms currently, I think, but can't do 1440p 120Hz, so that could be the difference there.
The HDMI 2.1 spec has QFT (Quick Frame Transport) for low-input-lag gaming, which will hopefully help all TVs achieve that kind of input lag or better at 4K 120Hz 4:4:4 on HDMI 2.1.
The Dell Alienware, being marketed as a monitor and having G-Sync tech, could have even less input lag, and it has OLED response times as well, though response time doesn't come into play as much at 120Hz as it would at higher refresh rates compared to LCD. You also still need to feed high frame rates to a high-Hz display (120Hz in this case) to combat sample-and-hold blur, even on OLED.
100FpsHz on a 120Hz monitor = 40% blur reduction vs 60Hz smearing, and a 5:3 increase in motion definition (10 unique frames of world action shown for every 6 at 60FpsHz).
120FpsHz on a 120Hz monitor = 50% blur reduction (~a soft blur-filter look) vs 60Hz, and 2:1 (double) the motion states shown (12 frames for every 6 at 60FpsHz).
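Those percentages and ratios come out of a simple sample-and-hold model: each frame is held on screen for 1/fps seconds, so persistence blur shrinks in proportion to the hold time, and blur reduction versus a 60Hz baseline is 1 - 60/fps. Here's a minimal Python sketch of that arithmetic (a simplified model that ignores pixel response time, BFI, and strobing):

```python
from fractions import Fraction

def sample_and_hold_gains(fps: int, baseline_fps: int = 60):
    """Rough sample-and-hold model: a frame is held for 1/fps seconds,
    so persistence blur shrinks in proportion to the hold time.
    Ignores pixel response time, BFI, and backlight strobing."""
    blur_reduction = 1 - baseline_fps / fps      # vs. baseline smearing
    motion_ratio = Fraction(fps, baseline_fps)   # unique frames shown
    return blur_reduction, motion_ratio

for fps in (100, 120):
    blur, ratio = sample_and_hold_gains(fps)
    print(f"{fps}FpsHz: {blur:.0%} less blur, "
          f"{ratio.numerator}:{ratio.denominator} motion definition")
```

Running it reproduces the 40%/5:3 and 50%/2:1 figures above.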
I'm pretty happy if I can dial in 100fpsHz average to get around 70 - 100 - 130+ fps range in demanding games.
The higher-end TVs have other technologies like 480Hz flicker, optional black frame insertion (BFI), and optional interpolation. It would be nice if modern monitors and TVs shared those capabilities across the board so you could use them for consoles, emulators and indie games, Hz-limited games, etc.