Monitor Vendors Eyeing 5K Units In 2015

HardOCP News

[H] News
5K monitors coming in 2015? Well, hopefully that drives down the price of 4K monitors so we can all get our 4K gaming fix (here, here and here) without going bankrupt in the process.

As global shipments of LCD monitors are expected to reach 136 million in 2014 followed by 132 million in 2015 and 128 million in 2017, vendors are looking to release high-end units in niche areas in order to remain profitable. Supply chain sources noted that Philips in particular is eyeing 5K monitor opportunities, and that most 5K panel supply in the market will come from LG Display.
 
5K is a lot of pixels to push, almost 2x a single 4K monitor. And unless you're going for the integrated all-in-one iMac approach or the hacky dual DisplayPort 1.2 approach, it's probably best for everyone to wait for fully standardized DisplayPort 1.3 monitors and GPU hardware.
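For what it's worth, a quick back-of-the-envelope comparison (just a sketch, assuming 5120x2880 for 5K and 3840x2160 for UHD "4K") backs up the "almost 2x" figure:

Code:
# Rough pixel-count comparison for common desktop resolutions.
resolutions = {
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),
    "UHD 4K": (3840, 2160),
    "5K":     (5120, 2880),
}

uhd_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:7s} {w}x{h} = {pixels / 1e6:5.2f} MP ({pixels / uhd_pixels:.2f}x UHD)")

# 5K works out to ~14.7 MP vs ~8.3 MP for UHD, i.e. roughly 1.78x the pixels.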
 
Honestly, I don't get all the fuss...

I could see the difference moving from SD to HD. I could see the difference moving from 720p to 1080p. On TVs larger than 60" I can see the difference between 1080p and 4K. On my 24" computer monitor I absolutely cannot see a shred of difference, other than that it takes a shitload more power to run those resolutions. At present I have zero incentive to game at any resolution other than 1080p.
 
Honestly, I don't get all the fuss...

I could see the difference moving from SD to HD. I could see the difference moving from 720p to 1080p. On TVs larger than 60" I can see the difference between 1080p and 4K. On my 24" computer monitor I absolutely cannot see a shred of difference, other than that it takes a shitload more power to run those resolutions. At present I have zero incentive to game at any resolution other than 1080p.

Yeah, I agree with you. I have tried one of the 32" 4K panels and honestly, it didn't look much better than my old 2560x1600 monitor. Slightly sharper, and less need for AA, but that's it. Not much of an improvement considering that's DOUBLE the pixels and takes a lot more GPU power to drive. In a monitor I would say at least 40" is required to really get the benefits of 4K; anything less and you are wasting your money IMO.

4K on a big TV, however, is a huge improvement, especially when sitting close.
 
5K is a lot of pixels to push, almost 2x a single 4K monitor. And unless you're going for the integrated all-in-one iMac approach or the hacky dual DisplayPort 1.2 approach, it's probably best for everyone to wait for fully standardized DisplayPort 1.3 monitors and GPU hardware.

Not everyone buys 4K/5K for gaming; there are plenty of other uses that a low- to mid-range GPU can handle fine, I would think...
 
5K is going to be to 4K what 2560x1440/1600 is to 1080p today. When 4K becomes the norm and prices are at the 1080p levels of today, it'll be the saving grace that keeps monitors at a premium.

If 4K gaming is still years off on a single GPU, 5K+ is a cruel joke.
 
Excited to find out which manufacturer is dumb enough to equip one of these with a matte coating (many people report that the 32" 4K panels are obviously grainy even though they use the semi-glossy, nearly grain-free matte coatings offered by LG and Samsung).
 
They're getting way ahead of themselves. The LCD manufacturers are in a panic because nobody is buying TVs every year, because there is no compelling reason to, and now they're just jumping the shark. Introducing yet another new format before consumers have even begun to adopt the newest gimmick is just shooting yourself in the foot. Now people aren't going to know whether to buy 4K or wait for 5K. Brilliant marketing, dumbasses.
 
I wouldn't mind seeing some 21:9 widescreen 5120x2160 models instead of 5120x2880. They would be somewhat easier to drive and might get single-stream DP support sooner, and UHD 3840x2160 and DCI 4096x2160 content could be shown at native resolution with just pillarboxing, instead of the letterboxing plus pillarboxing you'd get on 5120x2880. I'm also thinking scalers could take 1080p content to UHD with pillars better and more cheaply than they could scale it to 5120x2880.

I understand that monitor makers are trying to sell "5K" as some sort of premium over UHD, but if you need that kind of resolution on the desktop it seems better to have a wider single monitor, or to spread it across 2-3 screens, and there aren't a lot of options for high-quality 21:9, let alone ones that can do native UHD. It just feels like they have a surplus of scalers that can do 4x 2560x1440 and are going to push it out regardless of video card support.
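To put rough numbers on the pillarbox/letterbox point, here's a small sketch; it assumes UHD content is shown 1:1 (no scaling) and centered on each panel:

Code:
def borders(panel, content):
    """Bars (side, top/bottom) when content is shown 1:1, centered on the panel."""
    pw, ph = panel
    cw, ch = content
    return (pw - cw) // 2, (ph - ch) // 2

uhd = (3840, 2160)
for name, panel in [("5120x2160 (21:9)", (5120, 2160)),
                    ("5120x2880 (16:9)", (5120, 2880))]:
    side, top = borders(panel, uhd)
    print(f"UHD shown 1:1 on {name}: {side}px pillars, {top}px letterbox")

# 5120x2880 is also exactly a 2x2 grid of 2560x1440 -- the "4x 1440p scaler" case.
assert (5120, 2880) == (2560 * 2, 1440 * 2)

So a 21:9 5K panel shows UHD 1:1 with 640-pixel side pillars only, while 5120x2880 adds a 360-pixel letterbox on top and bottom as well.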
 
I'm waiting until the K is over 9000 before I buy.

Seriously though -- 4K gaming might really break out in a year or two, and they are already talking about 5K. Nuts to that; it's starting to sound like a pissing contest with no super great benefits.
 
Honestly, I don't get all the fuss...

I could see the difference moving from SD to HD. I could see the difference moving from 720p to 1080p. On TVs larger than 60" I can see the difference between 1080p and 4K. On my 24" computer monitor I absolutely cannot see a shred of difference, other than that it takes a shitload more power to run those resolutions. At present I have zero incentive to game at any resolution other than 1080p.

Yeah, this seems like a bad idea. We're only barely getting monitors with good gaming refresh rates at 1920x1080, and they're already a bit big for my tastes at 24", which is about the max I would consider... and now 4K and 5K monitors. How about we solidify the 120-144Hz, no-ghosting, crisp-picture standard before we jump to something else that really won't benefit a lot of people at all. 5K?! Sounds like a mess of interpolation and latency at anything playable, and possibly for playback too. There isn't even really a market for 4K/5K video. It seems like a phantom thing a vendor can push regardless of whether it works well, is responsive, etc.
 
Yeah, this seems like a bad idea. We're only barely getting monitors with good gaming refresh rates at 1920x1080, and they're already a bit big for my tastes at 24", which is about the max I would consider... and now 4K and 5K monitors. How about we solidify the 120-144Hz, no-ghosting, crisp-picture standard before we jump to something else that really won't benefit a lot of people at all. 5K?! Sounds like a mess of interpolation and latency at anything playable, and possibly for playback too. There isn't even really a market for 4K/5K video. It seems like a phantom thing a vendor can push regardless of whether it works well, is responsive, etc.

1) If they "perfect" 24" with what you said, how are they gonna sell 4K/5K displays? :D

2) Novelty = marketing = scamming the uninformed, which is many :rolleyes:
 
1) If they "perfect" 24" with what you said, how are they gonna sell 4K/5K displays? :D

2) Novelty = marketing = scamming the uninformed, which is many :rolleyes:

Well, yes... that's what I'm saying, but one would hope they would figure out and complete Step 1 before moving on to a Step 2 that isn't even really relevant. Heh, yes, it's a way to sell garbage to people, because apparently it is affordable to create 24" 4K monitors rather than crisp, low-latency 24" monitors. I dunno who wants the former in their 24" monitor, but everyone probably expects the latter at this point. How many years have we been working out 120-144Hz?

This might be a cash grab before G-Sync/FreeSync hits, which will really matter and will probably obsolete these absurdly early 4K monitors.

How about we get LightBoost working well and get 120-144Hz with good contrast, color reproduction, and low ghosting? That's where monitor tech should be focused right now. 4K/5K might be a thing for super high-end, very large TVs... though there's no media available to take advantage of such a format...
 
Well, yes... that's what I'm saying, but one would hope they would figure out and complete Step 1 before moving on to a Step 2 that isn't even really relevant. Heh, yes, it's a way to sell garbage to people, because apparently it is affordable to create 24" 4K monitors rather than crisp, low-latency 24" monitors. I dunno who wants the former in their 24" monitor, but everyone probably expects the latter at this point. How many years have we been working out 120-144Hz?

This might be a cash grab before G-Sync/FreeSync hits, which will really matter and will probably obsolete these absurdly early 4K monitors.

How about we get LightBoost working well and get 120-144Hz with good contrast, color reproduction, and low ghosting? That's where monitor tech should be focused right now. 4K/5K might be a thing for super high-end, very large TVs... though there's no media available to take advantage of such a format...

Hoo boy, so much bad information and opinions based on imagination in one post! 4K is here today with huge benefits for gaming and production. Even at 24", going back to 1080p from 4K is like taking a pair of glasses off. It's just that big a deal. Latency is actually more than passable for 99.9 percent of users, and the color reproduction makes your poor 144Hz TN panel look like a facsimile of the images it's displaying. I have tried even a new-generation TN 4K, and its picture quality doesn't come close to the IPS 4K I had used for months prior. Next up is a 32" 4K IPS as an exchange ;). I've been doing A/B comparisons with a 115Hz PLS monitor at 2560x1440 the whole time, too.


1080p-range resolutions belong back in 2004 or earlier. Heck, even I bought a 1680x1050 IPS monitor back then. The desktop space alone is worth it a dozen times over.
 
1080p-range resolutions belong back in 2004 or earlier. Heck, even I bought a 1680x1050 IPS monitor back then.

Yep, yet the "next gen" consoles still struggle with 1080p. I played Quake 2 at 1600x1200 back then. While I can understand that 4K is a good upgrade, I think anything above that is just a desperate attempt by the industry to sell consumers something new to show progress in developing new tech. It's like that fancy 3D TV or 73.5 audio setup that no one needs, but people love to throw away their working tech to get new shit.
 
Hoo boy, so much bad information and opinions based on imagination in one post! 4K is here today with huge benefits for gaming and production. Even at 24", going back to 1080p from 4K is like taking a pair of glasses off. It's just that big a deal. Latency is actually more than passable for 99.9 percent of users, and the color reproduction makes your poor 144Hz TN panel look like a facsimile of the images it's displaying. I have tried even a new-generation TN 4K, and its picture quality doesn't come close to the IPS 4K I had used for months prior. Next up is a 32" 4K IPS as an exchange ;). I've been doing A/B comparisons with a 115Hz PLS monitor at 2560x1440 the whole time, too.


1080p-range resolutions belong back in 2004 or earlier. Heck, even I bought a 1680x1050 IPS monitor back then. The desktop space alone is worth it a dozen times over.

Hoo boy, so much bad information and hypocrisy. 120-144Hz 1080p is still immature; it cannot deliver a crisp image, low latency, or accurate colors TODAY. Now you are claiming 4K monitors are better and faster, despite all the very legitimate problems with current monitors? Hoo boy?! Let's get 1080p working right before you start muddling your IPS/TN/LED/OLED on your 4K/5K march.

4K/5K on a 24" panel!!!

Howdy my gosh, 'Hoo boy' is probably the least likely way to add weight to your argument, btw. You are definitely stretching there, in the goofy hyperbole department...
 
Hoo boy, so much bad information and hypocrisy. 120-144Hz 1080p is still immature; it cannot deliver a crisp image, low latency, or accurate colors TODAY. Now you are claiming 4K monitors are better and faster, despite all the very legitimate problems with current monitors? Hoo boy?! Let's get 1080p working right before you start muddling your IPS/TN/LED/OLED on your 4K/5K march.

4K/5K on a 24" panel!!!

Howdy my gosh, 'Hoo boy' is probably the least likely way to add weight to your argument, btw. You are definitely stretching there, in the goofy hyperbole department...

lol:D
 
Honestly, I don't get all the fuss...

I could see the difference moving from SD to HD. I could see the difference moving from 720p to 1080p. On TVs larger than 60" I can see the difference between 1080p and 4K. On my 24" computer monitor I absolutely cannot see a shred of difference, other than that it takes a shitload more power to run those resolutions. At present I have zero incentive to game at any resolution other than 1080p.

Agreed, this addiction to pixel count seems bizarre.

Of course the hardware vendors love it, as they get something new to sell, but I'm not sure why people want it; extra pixels mean you need a better GPU to pump them out, and probably a very big screen to notice the difference.

I'd rather see a G-Sync 1050p 16:10 IPS monitor, or a 1080p IPS G-Sync monitor. 60Hz, not more. I am not interested in 4K or refresh rates above 60Hz, thank you.
 
I don't care if they come out with 10K panels; anything at 60Hz or less sucks for use as a PC monitor IMO. I'll be interested when they start making 120Hz 4K panels and connections with the bandwidth to support them!
 
5K is going to be to 4K what 2560x1440/1600 is to 1080p today. When 4K becomes the norm and prices are at the 1080p levels of today, it'll be the saving grace that keeps monitors at a premium.

If 4K gaming is still years off on a single GPU, 5K+ is a cruel joke.

If it's the 2.5K of 4K, then I hope we can expect 16:10 (5120x3200) screens for it, and 21:9 versions of it (7144x3200)!!! :D :eek:
 
4K/5K can't really take off for gaming because the consoles that just released won't support them. From what I have read, a fair number of games don't even run at 1080p due to performance limitations.

Now, can there be a PC gaming niche? Yeah, and I really hope so. Consoles won't take the plunge unless there is SOME adoption, most likely in the media arena, though PC adoption would help a little. Maybe the NEXT gen will push the hardware limits more, like the PS3/360 did, but maybe the plan is to do releases more often (they'd better have back-catalog support then).

Options are good and I thank those niche PC gamers, 'cause they are funding our future tech.
 
While I think 5K will be great for desktop use, I think the real revolution will come when DP 1.3 hits mainstream and we can get 120Hz 4K. Combine that with a strobing backlight and you basically have the perfect display.
 
While I think 5K will be great for desktop use, I think the real revolution will come when DP 1.3 hits mainstream and we can get 120Hz 4K. Combine that with a strobing backlight and you basically have the perfect display.


Does DP 1.3 even have the bandwidth to support 5K @ 60Hz? I've been hearing they're running out of room on copper for pushing more uncompressed bandwidth between GPUs, monitors, and TVs, and that they may ultimately be forced to compress the data. Distance is already a major factor, and while I see DP 1.4 coming 10x faster than HDMI 2.1/3.0, I think these higher resolutions are going to start becoming a huge problem. That's a lot of data lol.
 
I think we can all agree that the panel manufacturers need to drive the ecosystem forward. HDMI and DP are not evolving fast enough. In fact, just as SSDs revealed the snail's pace at which the SATA group evolved their standards, high-density displays are revealing the same thing about HDMI and DP. Very frustrating - maybe this is why Apple was betting so big on Thunderbolt and, more generally, moving everything directly to the PCI Express bus. Suddenly the closed ecosystem is putting them in a position to deliver things (PCI-based boot storage, 5K at 60Hz) that everyone else is struggling with.
 
5K has big potential if the manufacturers use it creatively, which means not obsessing over increasing pixel density in the old format (like 27"@5K) but expanding the field of view. This can be achieved with widescreen and curved monitors. Not only 21:9 but more: 22:9, 23:9, moving toward replacing multi-monitor setups. There are flat 4K 40" monitors coming, and curved 21:9 34" 3440x1440 monitors are already available. The field is open for a curved 5K@~6000x~2400.
 
Thing is, many people have laptops these days, and if you run an integrated GPU then even DisplayPort 1.2 is a Haswell-only thing, and that's barely a year old... DisplayPort 1.3 was released this September; my educated guess is that it'll take Intel at least two years, if not three, to have a mobile IGP on the market that supports it.
 
Yeah, but only at 8 bits.
It is known that graphics card manufacturers are working on driving 8K monitors. Full 4:4:4 8K is possible with dual DP 1.3 connectors, which would be similar to the familiar dual-DVI setup.


Thanks. That sucks. I wish DisplayPort would publish the in-depth details the way you can find them for HDMI on Wikipedia. Looks like they may have to do a quick refresh to make room for 5K, unless monitors have no plans for 10/12-bit support anytime soon.

I have a feeling the DP group will be too focused on 8K, scheduled around 2017, to even consider it relevant though.
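If anyone wants to sanity-check the "only at 8 bits" point, here's a rough sketch. It assumes DP 1.3 carries about 25.9 Gbit/s of usable payload (32.4 Gbit/s raw minus 8b/10b coding) and roughly 7% blanking overhead for reduced-blanking timings, so treat the results as ballpark numbers rather than spec quotes:

Code:
# Does 5120x2880 @ 60 Hz fit in a single DP 1.3 link?
DP13_PAYLOAD_GBPS = 32.4 * 8 / 10   # 4 lanes x 8.1 Gbit/s, minus 8b/10b coding overhead
BLANKING = 1.07                     # rough reduced-blanking (CVT-R2 style) overhead

def required_gbps(width, height, hz, bits_per_component):
    bits_per_pixel = 3 * bits_per_component          # RGB / 4:4:4
    return width * height * hz * BLANKING * bits_per_pixel / 1e9

for bpc in (8, 10):
    need = required_gbps(5120, 2880, 60, bpc)
    verdict = "fits" if need <= DP13_PAYLOAD_GBPS else "does NOT fit"
    print(f"5K@60 at {bpc} bpc needs ~{need:.1f} Gbit/s -> {verdict} in ~{DP13_PAYLOAD_GBPS:.1f} Gbit/s")

Roughly 22.7 Gbit/s at 8 bpc (fits) versus about 28.4 Gbit/s at 10 bpc (doesn't), which lines up with the "8-bit only" answer above.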
 
5K has big potential if the manufacturers use it creatively, which means not obsessing over increasing pixel density in the old format (like 27"@5K) but expanding the field of view. This can be achieved with widescreen and curved monitors...

Thank you! 4K/5K is a way to get PPP Eyefinity without bezels. Monitor makers aren't interested in developing new tech for a couple of well-funded PC gamers.

I'd take a well-developed 4K, 32", 144Hz OLED (with good pixel life).
 
Does DP 1.3 even have the bandwidth to support 5K @ 60Hz? I've been hearing they're running out of room on copper for pushing more uncompressed bandwidth between GPUs, monitors, and TVs, and that they may ultimately be forced to compress the data. Distance is already a major factor, and while I see DP 1.4 coming 10x faster than HDMI 2.1/3.0, I think these higher resolutions are going to start becoming a huge problem. That's a lot of data lol.

Or maybe the industry will have to move to a fiber-optic standard for cables. I mean, you can get a fiber-optic DisplayPort cable now... they just cost like $800 for a 50-foot run O_O
 
Or maybe the industry will have to move to a fiber-optic standard for cables. I mean, you can get a fiber-optic DisplayPort cable now... they just cost like $800 for a 50-foot run O_O

Those fiber-optic DP cables are worthless. They don't even run at full DP 1.2 speed; they are limited by the slow TCON chips on either end. I've purchased and tested many of them. If they say they are running at full 4-lane DP 1.2 bandwidth, they are lying.

Unfortunately for high-end screens, even with DP 1.3 coming out we will see monitors that require multiple display cables.

To do 120Hz 4K, 60Hz 5K, or 8K, basically anything with full color depth and no reduced-chroma BS will require multiple DP inputs...

Unfortunately HDMI 2.0 has only caught up to "old" DP 1.2 bandwidth, and DP 1.3 is only 50% faster than DP 1.2...
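Extending the same rough math (identical assumptions: ~7% blanking, 8b/10b coding, full 4:4:4) across the modes mentioned here, against single-cable DP 1.2 and DP 1.3 payloads; anything that lands close to a limit is sensitive to the exact timings, so again these are ballpark figures:

Code:
# Ballpark bandwidth needs vs. single-cable DP payloads (Gbit/s).
DP12 = 21.6 * 8 / 10    # ~17.3 usable
DP13 = 32.4 * 8 / 10    # ~25.9 usable (50% more than DP 1.2)
BLANKING = 1.07

modes = [("4K @ 60 Hz ", 3840, 2160,  60),
         ("4K @ 120 Hz", 3840, 2160, 120),
         ("5K @ 60 Hz ", 5120, 2880,  60),
         ("8K @ 60 Hz ", 7680, 4320,  60)]

for name, w, h, hz in modes:
    for bpc in (8, 10):
        need = w * h * hz * BLANKING * 3 * bpc / 1e9
        print(f"{name} {bpc:2d} bpc: ~{need:5.1f} Gbit/s  "
              f"(DP 1.2: {'ok' if need <= DP12 else 'no'}, DP 1.3: {'ok' if need <= DP13 else 'no'})")

At 10 bpc, 4K@120 and 5K@60 both blow past a single DP 1.3 link, and 8K needs two links even at 8 bpc, which is the "multiple DP inputs" situation described above.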
 
Anyone else find it annoying that 5K has been hijacked by Apple?

I thought 5K was for the 21:9 (4K) variants?

I hope we see 21:9 5K (not Apple's 5K in a 21:9 aspect, either!) so we can watch 4K Blu-ray without bars and without stretching, natively.

Oh, and with 120Hz support over DP 1.3, or dual DP 1.3 + 3D.
 
Anyone else find it annoying that 5K has been hijacked by Apple?

No. :confused:

It was hardly "hijacked" by Apple. Quite the opposite, in fact, as Apple's push to introduce a 5K panel on a high-volume product likely spurred development of 5K technology in general.

The 5K iMac will only benefit everyone looking for wider adoption of high-PPI monitors, whether or not you ever buy a 5K iMac.

In fact, Dell already had to drop the price for their 5K monitor before it was released because the 5K iMac is so surprisingly cheap. Ref: http://www.pcworld.com/article/2844...itor-price-after-apple-launches-new-imac.html

I thought 5K was for the 21:9 (4K) variants?

There's nothing preventing ultra-wide 5K monitors from also being called 5K. Put 5K horizontal pixels on a monitor, and it's 5K. That's what 5K means.

Resolution naming is a mess. In fact, most 4K proponents will quickly point out that all of these "4K" monitors on the market aren't actually 4K. They're 3840 pixels wide. Real DCI 4K monitors are 4096 × 2160, which only applies to a small number of monitors at the moment.

I hope we see 21:9 5K (not Apple's 5K in a 21:9 aspect, either!) so we can watch 4K Blu-ray without bars and without stretching, natively.

You've got your standards confused here.

4K Blu-Ray will have a resolution of 3840 x 2160. The only way to watch them without stretching natively is on a 16:9 "4K" monitor with a 3840 x 2160 resolution.

But that's missing the point, because one of the biggest advantages of high-PPI monitors is that stretching isn't a big deal any more. Once the pixels are sufficiently small, you can scale things all over the place and you will never notice. Just take a look at the Retina MacBook Pros, which most users run at a non-integer multiple of the native screen resolution which still looks far better than a low-PPI display. Stretching and scaling just aren't an issue any more.
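To illustrate the "non-integer multiple" point, here's a small sketch using the 15" Retina MacBook Pro's 2880x1800 panel as an assumed example; macOS renders the UI at 2x the selected logical resolution and then scales that image to fit the panel:

Code:
# "Looks like" logical sizes on a 2880x1800 panel: render at 2x, then scale to the panel.
panel_width = 2880
for logical_w, logical_h in [(1280, 800), (1440, 900), (1680, 1050), (1920, 1200)]:
    backing_w, backing_h = logical_w * 2, logical_h * 2
    scale = panel_width / backing_w        # panel pixels per rendered pixel
    kind = "integer" if scale == int(scale) else "non-integer"
    print(f"looks like {logical_w}x{logical_h}: render {backing_w}x{backing_h}, "
          f"scale by {scale:.3f} ({kind})")

Only the 1440x900 setting maps cleanly (an exact 2x); the popular 1680x1050 and 1920x1200 settings end up at roughly 0.86x and 0.75x, yet still look sharp because the pixels are so small.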

Finally, watching movies without bars depends on the movie's aspect ratio, which is independent of the recording medium's aspect ratio. The only way to watch a movie without bars and without cropping is if the aspect ratio of the film perfectly matches that of your display. Many films fall into certain standard aspect ratios, but even among ultra-wide films you end up with some at 2.40:1 and others at 2.35:1, while a handful of films are actually 2.55:1. Meanwhile a lot of films are still 1.78:1 while a bunch of others are 1.85:1.

It's best to give up on your idea of watching movies without bars, because there is no one-size-fits-all monitor.

On the plus side, ever improving black levels should make the bars a non-issue. :D
 
No. :confused:

It was hardly "hijacked" by Apple. Quite the opposite, in fact, as Apple's push to introduce a 5K panel on a high-volume product likely spurred development of 5K technology in general.

Kind of has been; 5K 21:9 TVs were shown and announced before Apple came along with their higher-resolution, non-ultra-wide panel and called it 5K.


There's nothing preventing ultra-wide 5K monitors from also being called 5K. Put 5K horizontal pixels on a monitor, and it's 5K. That's what 5K means.

It's just confusing to the average consumer.

You've got your standards confused here.

4K Blu-Ray will have a resolution of 3840 x 2160. The only way to watch them without stretching natively is on a 16:9 "4K" monitor with a 3840 x 2160 resolution.

I read that some movies would come with an alternate layer at the correct 5K resolution.


But that's missing the point, because one of the biggest advantages of high-PPI monitors is that stretching isn't a big deal any more. Once the pixels are sufficiently small, you can scale things all over the place and you will never notice. Just take a look at the Retina MacBook Pros, which most users run at a non-integer multiple of the native screen resolution which still looks far better than a low-PPI display. Stretching and scaling just aren't an issue any more.

But that's because Apple actually cared. Not all monitors, or Windows/Linux for that matter, are going to handle scaling as well.

Thanks for your reply. I'll go vent my lack of 5K @ 21:9 elsewhere :p
 

LOL, this thread has me all turned around at this point. I don't know if you are laughing with me or at me.

If you want a crisp image with good color reproduction... 120-144Hz isn't your bag. It has advantages, but those are definitely its faults. You can find much better monitors than the 120-144Hz monitors available currently, ones that are just leagues ahead of them in those regards.

So the question is, 'what do we need' now? Since no 4K/5K media exists, and nobody is gaming at 4K/5K, let's figure out this 120-144Hz elephant in the room. It will naturally yield benefits for 4K/5K when that becomes relevant.

Right now, this is way backwards. Putting the cart before the horse.
 