Shintai
If it just wasn't $5,000...
Otherwise I can't find much to complain about.
All the OLED TVs I've seen or heard of have bad input lag, though granted I don't claim to have seen all of them. Odds are their "0.1ms" time is the rosiest, most distorted number they could cook up.
Thinking about it, there are bandwidth problems with this beastie: it only has DisplayPort 1.2, HDMI 2.0, and USB-C according to TweakTown. I suspect they get 4K 120Hz by dropping the color rendering side, because none of those connections has enough bandwidth for 4K at 120Hz with 10-bit color. That takes roughly 30 Gbit/s of pixel data; HDMI 2.0 (not a/b) maxes out around 14.4 Gbit/s of effective bandwidth, and DP 1.2 around 17.3 Gbit/s.
TweakTown really should have caught that problem in their posting rather than just running a canned press release.
Let's say it does 10-bit at 90Hz and 8-bit at 120Hz.
Assuming you actually need 10-bit, is toggling between the two or just living with 90Hz that big of a loss, outside of being a pain in the ass?
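The arithmetic behind this can be sanity-checked with a quick back-of-envelope script. This is a rough sketch: it counts raw pixel data only, ignoring blanking overhead, and uses the nominal post-8b/10b effective rates for DP 1.2 and HDMI 2.0. Note that even the hypothetical 8-bit/120Hz and 10-bit/90Hz modes exceed DP 1.2's effective rate uncompressed, which is why 4:2:0 chroma subsampling ("dropping the color rendering side") is the plausible way such a panel hits 4K 120Hz over these links.

```python
# Back-of-envelope: uncompressed video data rate vs. link capacity.
# Ignores blanking overhead (CVT-RB adds a few percent on top).

def gbit_per_s(width, height, hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s (no blanking, no audio)."""
    return width * height * hz * bits_per_pixel / 1e9

# bits per pixel: full RGB = 3 * bit depth; 4:2:0 subsampling = 1.5 * bit depth
modes = {
    "4K 120Hz 10-bit RGB":   gbit_per_s(3840, 2160, 120, 30),
    "4K 120Hz  8-bit RGB":   gbit_per_s(3840, 2160, 120, 24),
    "4K  90Hz 10-bit RGB":   gbit_per_s(3840, 2160,  90, 30),
    "4K 120Hz 10-bit 4:2:0": gbit_per_s(3840, 2160, 120, 15),
}

DP12_EFFECTIVE = 17.28    # Gbit/s, DP 1.2 HBR2 after 8b/10b line coding
HDMI20_EFFECTIVE = 14.4   # Gbit/s, HDMI 2.0 after 8b/10b line coding

for name, need in modes.items():
    verdict = "fits" if need <= DP12_EFFECTIVE else "exceeds"
    print(f"{name}: {need:5.2f} Gbit/s -> {verdict} DP 1.2")
```

Only the 4:2:0 mode squeezes under DP 1.2's effective bandwidth; everything else, including 8-bit 120Hz, is over the line once you do the math.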
Who cares if it lags? This monitor isn't for Doom (no matter what Megalith says) it's for content creators and color accuracy and great black levels trump response rates for that. That said, 12-18 months from now, it'll probably be half this price.
Like Plasma before it, it's an inherent flaw of the technology. While there will be tricks and optimizations to mitigate it, the issue will never truly go away.
I would imagine that calibrating the device to hit around 120 cd/m2 for the high-end should minimize burn-in and extend life expectancy.
Minimize? Sure. Completely prevent? Nope. And that's exactly what I said - "While there will be tricks and optimizations to mitigate it, the issue will never truly go away."
Relax bro. Not trying to steal your thunder. I was just citing a specific optimization / mitigation technique. We're all here to help each other out and discuss tech that we're passionate about, right?
You don't get voice inflection with typed text. I apologize if it came across that way, but trust me. Read my post in this guy's voice and you'll understand where I'm coming from
(it's Ben Stein)
In the year 2000, the Sony GDM-FW900 was the king of the hill. Crème de la crème. $2500 bought you a monitor that could reach up to 2304x1440 (80Hz), had stupidly high accuracy (my old FW-900, without any calibration profile, could hit an average delta E of around 0.5 for the grayscale at 6500K), had a fantastic contrast ratio (again, even my 13-year-old monitor could hit 10,000:1), had no motion blur, and had no input lag.
It. Did. It. All. And it excelled at EVERYTHING it did.
For $5000, I expect a monitor to do EVERYTHING well.
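For anyone unfamiliar with the "average delta E of around 0.5" figure above: delta E is a measure of color difference, and an average under 1 means errors are below the threshold most people can perceive. A minimal sketch of how such a grayscale average is computed, using the simple CIE76 formula (Euclidean distance in L*a*b* space) and made-up patch measurements, since the original readings aren't given:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measured vs. target L*a*b* values for three grayscale patches
# (targets sit on the neutral axis: a* = b* = 0).
measured = [(20.3, 0.2, -0.4), (50.1, -0.3, 0.5), (80.0, 0.1, 0.2)]
target   = [(20.0, 0.0,  0.0), (50.0,  0.0, 0.0), (80.0, 0.0, 0.0)]

avg = sum(delta_e76(m, t) for m, t in zip(measured, target)) / len(measured)
print(f"average dE76: {avg:.2f}")
```

Modern calibration tools often report dE2000 instead, which weights the terms perceptually, but the averaging idea is the same.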
Good to see this sort of thing being announced, hopefully it's the first of many OLED 120Hz 4K. If this thing had good input lag numbers, dp1.4 and drops to about half this price it would be an instant purchase for me. As it stands I'll be waiting for more competition or a major sale.
I expect it to be 3-5 years before 4k, 100+ hz OLED with HDR and proper inputs hits the sub-$500 mainstream range. I just made a monitor purchase, so 3-5 years is fine by me.
Doubtful.
Even now, 4K panels of any quality only just get below $500USD. Factor in inflation and in 5 years they are not going to be less than $500USD. As for 60Hz+ OLED, no way. Only if/when 4K is superseded as a standard by 8K or what have you.