Why can't panel manufacturers start making high resolution displays?

XiP

Why don't panel manufacturers just start making panels with enough pixel density that we no longer have to care about resolution and can focus on other things?

Skip 4K... start making panels that go up to 64K or something. It almost seems like they're trying to slow down technological progress so they can keep selling 1920x1080 screens. 1920x1xxx should have been the standard in the early 2000s, when IBM was already producing a 3840x2400 monitor...
 
just start making panels

I don't disagree. However, fabrication of high DPI displays is probably not profitable most of the time.

Reality check: Most people have no idea what the word "resolution" means. They see "High Definition" in big letters and that means "Dat's some High End shit up in dis Walmart", and that's all they care about.

If making 1366x768 and 1080p panels is easy money for vendors and fabricators, and high-DPI panels are still considered boutique items with much lower sales, then it's obvious what will continue to be manufactured.
 
Just make panels...when the cables, buses, and GPUs in no way support it. Brilliant.
 
What would a 64K display be like?

4K is 3840x2160; they chose 3840 instead of 4000 or 4096 because 1920 x 2 = 3840 and nobody (industry or consumers) wants to go through the SD to HD black bar issue again. So, the next step up would be 7680x4320 and would be called 8K; therefore:
16K = 15360x8640
32K = 30720x17280
64K = 61440x34560

HDMI 2.0 and DisplayPort are barely doing 4K at 60Hz, which is pushing 8.3 million pixels for a video bitrate of 11.94 gigabits per second. 64K at 60Hz would be pushing 2.1 billion pixels for a video bitrate of 3057 gigabits per second, or 3 terabits per second, or almost 400 gigabytes per second if you want to think of it that way.

The fastest SSDs these days can push 0.5 gigabytes per second, so even if you could assemble a 2-billion-pixel display, what would feed it?

Calculations of video bitrates courtesy of the WSGF FOV calculator.
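
For anyone who wants to check the arithmetic, here is a quick sketch of the uncompressed-bandwidth math, assuming 24-bit color and ignoring blanking/protocol overhead (so real link rates run a bit higher):

Code:
# Uncompressed video bandwidth at 24-bit color (8 bits per RGB channel).
def bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * bits_per_pixel * refresh_hz / 1e9

for name, (w, h) in [("4K", (3840, 2160)), ("8K", (7680, 4320)),
                     ("64K", (61440, 34560))]:
    gbps = bitrate_gbps(w, h, 60)
    print(f"{name}: {w * h / 1e6:,.1f} Mpx, {gbps:,.0f} Gbit/s"
          f" (~{gbps / 8:,.0f} GB/s) at 60 Hz")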
 
One cannot make a high-density display simply by snapping one's fingers. It takes time and money to develop the manufacturing process. That is the near-term practical answer.

The longer-term, more sinister answer is that display companies have always tried to advance slowly but surely, so that people who already own displays are still enticed to buy new ones. Let's say they produce a nearly perfect display: it has OLED- or plasma-like contrast, long-term reliability with little wear like LED backlighting, 180-degree viewing angles, and perfect colors. Then what? People buy one that fits their room and never buy another display again unless it has a catastrophic malfunction.

You also have the raw horsepower of graphics cards to consider... While high-end cards can do a lot, they cannot do 64K.

Now, in support of your point, I definitely think the display industry has gone way too far on the cautious side. My biggest complaint is with the DP and HDMI specs, which seem to just barely catch up to what is coming out right now, with no future-proofing built in. I think DP should have supported 8K at 60Hz in the last upgrade. That would have let us at least do things like 60 fps 3D or 120Hz 4K while still allowing another upgrade cycle without forcing people to upgrade every component.
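
To put a number on that complaint, here's a rough sketch. The 17.28 Gbit/s figure is DP 1.2's effective HBR2 payload rate; the 8K requirement assumes uncompressed 24-bit color.

Code:
# Uncompressed 24-bit 8K at 60 Hz versus DP 1.2's effective payload rate.
needed = 7680 * 4320 * 24 * 60 / 1e9  # ~47.8 Gbit/s
dp12_effective = 17.28                # Gbit/s, DP 1.2 HBR2 after 8b/10b coding
print(f"8K60 needs ~{needed:.1f} Gbit/s; DP 1.2 carries ~{dp12_effective} Gbit/s,"
      f" roughly {needed / dp12_effective:.1f}x short")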
 
I think they need to learn how to make GOOD displays at the resolutions we currently have first. Maybe they should master black levels, viewing angles, response times, etc. You think you could spot the difference between 4K and 64K at a normal sitting distance? I doubt it. Not to mention the obscene graphics card power that would be necessary. I'd be perfectly fine with current resolutions if displays could even get those right.
 
Let's say they produce a nearly perfect display: it has OLED- or plasma-like contrast, long-term reliability with little wear like LED backlighting, 180-degree viewing angles, and perfect colors. Then what? People buy one that fits their room and never buy another display again unless it has a catastrophic malfunction.

Don't forget 0 input lag, no motion blur, and 480Hz true refresh rate with Gsync.

I have money in the bank for the 42" 21:9 version of that.
 
None of these tech industries intend to ever give you the "optimal" product, no matter if it's a monitor, or any other tech gadget.

Tech companies need to be able to introduce an exciting new feature every year so they can string the consumer out like crack dealers. If they sold the "perfect" monitor, you wouldn't have any reason to buy their next gen.
 
What would a 64K display be like? ... 64K at 60Hz would be pushing 2.1 billion pixels for a video bitrate of 3057 gigabits per second. The fastest SSDs these days can push 0.5 gigabytes per second, so even if you could assemble a 2-billion-pixel display, what would feed it?

All that tells me is that other aspects of the computer industry are lazy also. :)
J/K
 
Yes, I'm just joking. I do feel like the whole industry needs to progress a little faster... Sure, we've come a long way in the past couple of decades, but I feel like technological progress is being hindered by average consumers.
 
Why didn't they make all CRTs 1080 15 years ago... they were just holding back tech! :rolleyes:
 
Why didn't they make all CRTs 1080 15 years ago... they were just holding back tech! :rolleyes:
Almost all CRT monitors that did 1280x1024@60Hz could also do 1920x1080@60Hz,
and that was the vast majority of 15" models and all 17" and larger.

@OP
Ridiculous thread. Why not make a 128K display?
Why stop there, let's make 1024K :D
 
technological progress is being hindered by average consumers.

I think it's just the opposite. New tech gets adopted and made better and cheaper because of average consumers. If average consumers weren't buying a hojillion tablets, we definitely wouldn't have more and better hand-held tech coming down the pipe, for instance.
 
Sort of a silly thread, to be honest. 4K is trouble enough with regard to hardware that can keep up. And as for larger displays (i.e., televisions), it's debatable whether our eyeballs can even discern the difference, depending on viewing distance and the size of the TV.
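
As a rough sketch of that eyeball point: a ~20/20 eye resolves about 1 arcminute, i.e. roughly 60 pixels per degree; the 65" screen at 9 feet below is an illustrative assumption.

Code:
import math

# Horizontal pixels a ~20/20 eye can discern (about 60 pixels per degree).
diag_in, dist_in = 65.0, 9 * 12               # 65" 16:9 TV viewed at 9 feet
width_in = diag_in * 16 / math.hypot(16, 9)   # horizontal screen size
h_angle = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
print(f"~{h_angle * 60:.0f} discernible horizontal pixels")  # ~1760: past 1080p is already wasted here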

The CRT comment also doesn't make much sense, as yeah, there were plenty of CRT monitors that could do higher than 1080p.

For most people who care about image quality, it's not more resolution they want; it's better contrast/black levels. It could be argued that the industry is focusing on milking what it can from LCD (4K) before going more into OLED production. Or that it wasted years on 3D TV nonsense, again milking a marketing angle over true technological advancement. I wouldn't say the biggest thing holding back panel tech is a lack of resolution, though.
 
On every forum that covers any form of technology (cars, cameras, TVs, etc.), I always see questions like the OP's, which are best translated as:

"Why can't I have magic/science fiction (or occasionally plausible tech from decades in the future)?"

More study of physics and economics will reveal the answer to those types of questions.

In reality, there won't be a "64K" display anyone here can buy in our lifetime. 8K is overkill for 100-foot movie screens, so what use would there be in going beyond that? Space projectors filling the sky?
 
XiP: Tell me how that 64K screen at 7Hz works out for you. If you wait a little while for HDMI 2.0 it might even do 14Hz refresh!
 
I've been running much higher than 1920x1080 on CRT for a lot of years.

I think you missed his point. Sure, CRTs could do higher than that for years (I have one such CRT), but at the time, an FW-900 that cost $2,500 wasn't exactly consumer-friendly.
 
Based on related comments, I believe Sony would say that their CRT technology advanced nicely and continued to do so in the lab beyond what could be released at consumer price points, before the market became untenable...

Still a very sad story....
 
I think you missed his point. Sure, CRTs could do higher than that for years (I have one such CRT), but at the time, an FW-900 that cost $2,500 wasn't exactly consumer-friendly.

Well, it took a very long time for affordable LCDs to even reach 1920x1080, and I didn't upgrade to an LCD until 2005 because almost all LCDs were simply inferior in resolution. Back in 1998 or so I was using a 17-inch CRT that I ran at 1600x1200 (Viewsonic 17 PS). A few years later (probably 2000) I upgraded to a Viewsonic PS790 (19-inch, ran at 2048x1536). Around 2002 I upgraded to a Viewsonic P225F, which was 22.5 inches and ran at 2560x1920. It took a long time for LCDs to catch up in the resolution game.
 
It took a long time for LCDs to catch up in the resolution game.

LCD was making good progress for a while. Heck, I still use the first-gen Dell 24" 2407WFP at 1920x1200, from 2006!

You can thank Best Buy shoppers wanting the absolute cheapest laptops possible for our foray into "HD" 720p laptop screens, which stalled development for a good 5+ years.
 
Well, it took a very long time for affordable LCDs to even reach 1920x1080 ... Around 2002 I upgraded to a Viewsonic P225F, which was 22.5 inches and ran at 2560x1920. It took a long time for LCDs to catch up in the resolution game.

Also, just because your monitors could run those resolutions doesn't mean they could actually resolve all of those pixels. The P225F apparently has a 0.24mm dot pitch, and at 22.5 inches I'm skeptical of it being able to resolve 2560 across. Someone else could do the math, and if the grille could do it, then I apologize. But I'm very skeptical.
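
Since the invitation is open, here's a back-of-envelope sketch. The ~20" viewable diagonal for a 22.5" tube is an assumption on my part; the 0.24mm pitch is the stated spec.

Code:
# Horizontal grille slots = viewable width / dot pitch. Assumes a 4:3 tube
# with ~20" viewable diagonal (an assumption for a 22.5" CRT).
viewable_width_mm = (20.0 * 4 / 5) * 25.4   # 4:3 => width = 0.8 * diagonal
slots = viewable_width_mm / 0.24            # stated 0.24 mm dot pitch
print(f"~{slots:.0f} slots across")         # ~1693, well short of 2560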
 
Almost all CRT monitors that did 1280x1024@60Hz could also do 1920x1080@60Hz, and that was the vast majority of 15" models and all 17" and larger.

Not that I recall; 4:3 was the norm for the majority of CRTs sold back then, not 16:9/16:10...

1280x1024 and 1600x1200.

As said, sure, if you wanted to pay top dollar for high-end CRTs you could get higher resolutions, but the average market CRTs were all 4:3, and I recall my 1600x1200 Viewsonic cost me $480 CAD when I got it, with the Pentium III 533 computer I built.
 
It took a long time for LCDs to catch up in the resolution game.

LOL, O RLY? Wiki
 
Still using my LG 246 from 2008 or so. The biggest difference I've seen since then is higher refresh rates, and we're just starting to see higher pixel density in that panel size. Sadly, we don't have IPS image quality with TN responsiveness and "retina" DPI. Surely they could have worked that out in 5 years.
 
They already produce panels for hand-held devices with high enough PPI to make 20" 8k.
There's just no market for it, so no company makes monitors out of those panels.
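
A quick sanity check on that claim, using nothing beyond the stated 20" size and straight Pythagoras:

Code:
import math

# Pixel density a 20" panel would need to show 7680x4320 (8K).
ppi = math.hypot(7680, 4320) / 20
print(f"{ppi:.0f} PPI")  # ~441 -- comparable to current high-end phone panels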
 
They already produce panels for hand-held devices with high enough PPI to make 20" 8k.
There's just no market for it, so no company makes monitors out of those panels.

Producing a panel for a phone does not mean you are capable of scaling it up. The reason they make high-PPI displays for phones first is that panels have dead pixels, so cutting the glass into small phone displays lets them separate the good panels from the bad. If you try to scale that up to a larger display, the yield is too low. So they don't actually make viable panels at that high a PPI in larger sizes yet. Some makers do try to push the edge, but they often push very overpriced products into professional markets. Many times these are the result of trying to get a process up and running, and they sell the small number of good panels at high prices.
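
The yield argument is easy to see with a toy Poisson defect model; the defect density below is an illustrative assumption, not industry data.

Code:
import math

# Toy Poisson yield model: with random defects at density D per cm^2, the
# chance a panel of area A cm^2 has zero defects is exp(-D * A).
D = 0.002  # defects per cm^2 -- illustrative assumption, not industry data

def zero_defect_yield(width_cm, height_cm):
    return math.exp(-D * width_cm * height_cm)

print(f'~5.5" phone panel: {zero_defect_yield(7, 12):.0%}')   # ~85%
print(f'24" monitor panel: {zero_defect_yield(52, 32):.0%}')  # ~4%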
 
Resolutions have definitely stagnated; it would be nice to see more pixel density, and for that to become the standard. You CAN get some 4K displays, but they aren't considered standard or mainstream, thus are ridiculously expensive and aren't easily obtainable.

As for video cards being able to drive it, I'm sure it would not be an issue. Back when 1024x768 or whatever was standard, the average video card was like what, 16MB? Maybe a tiny fan at most. Now video cards are so huge they take 2-3 slots, have GBs worth of memory, and practically need their own power supply. HD is only a slight bump from 1024x768. I'm sure a modern video card could easily do 4K or 8K or higher. Gaming at those resolutions, maybe not, but simply displaying a desktop program? I'm sure it would be fine.
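
For scale, here's the framebuffer math at 32-bit color, a sketch that ignores double/triple buffering and compositing copies (which multiply these figures):

Code:
# Single-frame framebuffer size at 32 bits (4 bytes) per pixel.
for name, (w, h) in [("1024x768", (1024, 768)), ("4K", (3840, 2160)),
                     ("8K", (7680, 4320))]:
    print(f"{name}: {w * h * 4 / 2**20:.0f} MB per frame")
# 1024x768 is 3 MB -- tight on a 16 MB card once you double-buffer and add
# textures; 8K is ~127 MB -- trivial against multiple GB of modern VRAM.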

But I guess we have to remember these companies aren't in it for advancement, they're in it for the money. There is no financial advantage for them in making a better product, at least not at this point, due to cost.
 
As for video cards being able to drive it, I'm sure it would not be an issue. Back when 1024x768 or whatever was standard, the average video card was like what, 16MB? ... I'm sure a modern video card could easily do 4K or 8K or higher. Gaming at those resolutions, maybe not, but simply displaying a desktop program? I'm sure it would be fine.

Do you mean 1024x768 for desktop resolutions or game resolutions? I'm assuming you're talking normal desktop resolutions, but for the sake of my inability to put 2 and 2 together, could you clarify? :D
 
Do you mean 1024x768 for desktop resolutions or game resolutions? I'm assuming you're talking normal desktop resolutions, but for the sake of my inability to put 2 and 2 together, could you clarify? :D

Talking about just general usage, aka desktop apps. Back then I don't know if it was typical to actually game at that res or if it got brought down to 800x600.
 
Oh man a 64K display. I'd like to see the GPU that powers that monster.
 
LOL, O RLY? Wiki

FYI, I have two of those displays (VP2290b and IBM T221 9503-DGP). They weren't even remotely affordable until 2005-2006 or so. It's pretty much the only LCD (other than a very few select 2048x1536 LCDs) that could do more than 1920x1200 until the Dell 30-inch/Apple 30-inch Cinema Displays came out.
 
Talking about just general usage, aka desktop apps. Back then I don't know if it was typical to actually game at that res or if it got brought down to 800x600.

Well, I do know that as late as Doom 3, FEAR, and Oblivion, you were doing well if you could run 1024x768 at good image quality settings and a good framerate to boot. 1600x1200 was considered the very high-end.
 
Yes, I'm just joking. I do feel like the whole industry needs to progress a little faster... Sure, we've come a long way in the past couple of decades, but I feel like technological progress is being hindered by average consumers.
I feel just the opposite, to be honest.

LCDs had stagnated for a long time. Then retina displays hit the phone market, followed by very high resolution LCDs for laptops, and that's now about to spill over into the desktop market too.

I believe we'll see a truckload of 4K desktop monitors in the coming days at CES.
 