Skip 4K, go directly to 5K?

.. or not, since with current input options it will come with MST, unless it's custom non-standard hardware like in the new iMacs.
 
MST is a no-go for me; I hated it on my old 24" Dell 4K and always had issues.

My 3k laptop has about the same DPI as the 27" 5k screen and I must say the clarity is amazing.

5K is the way to go, but it's not the future mainstream. 4K and 8K will be mainstream and 5K will be the red-headed stepchild.
 
There are always better technologies around the corner: 4K, curved 4K, curved 4K ultrawide, 4K/5K with G-Sync/FreeSync, 8K, and so on.

I would say: live in the now.
 
Resolution is great, but what size is the panel?

The iMac 5K picture looks like glass. No pixels.
But PC users would rather have a bigger screen as the pixel count increases.

I want a 40", and the first with a near-glossy panel at 4K or 5K 60Hz will get my money.
AMD has updated its drivers to support 5K via dual DP 1.2. If Nvidia or AMD releases a DP 1.3 card, that's the one I'm buying.
 
It's safer to stick to more standard resolutions and aspect ratios for the sake of compatibility. For the next few years most mid- to high-end displays will be switching to 4K (3840x2160). This will give you the best support in software, since high DPI is giving applications some growing pains. If you game a lot then you know that even 4K (4096x2160) is a source of some compatibility headaches. Just stick to 4K in 16:9 and you should be set for the next few years at least.
 
It's hard to get excited, at least for gaming. GPUs can barely handle 4K, and 5K is nearly double the pixels.

I just want an affordable 4K OLED monitor, don't really care about resolution beyond that right now.
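For scale, a quick back-of-the-envelope check of the pixel counts involved (plain arithmetic, nothing monitor-specific assumed):

```python
# Pixel counts: 4K UHD vs. 5K.
res_4k = 3840 * 2160   # 8,294,400 pixels
res_5k = 5120 * 2880   # 14,745,600 pixels

# 5K is (4/3)^2 = 16/9 ~= 1.78x the pixels of 4K -- not quite double,
# but close enough that the GPU-load concern stands.
print(res_4k, res_5k, round(res_5k / res_4k, 2))  # 8294400 14745600 1.78
```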
 
5K would be amazing for productivity, but not so much for gaming. 4K + antialiasing is barely realistic with any currently available GPU hardware.
 
It's hard to get excited, at least for gaming. GPUs can barely handle 4K, and 5K is nearly double the pixels.
5K would be amazing for productivity, but not so much for gaming. 4K + antialiasing is barely realistic with any currently available GPU hardware.

Don't forget: perhaps in Q4 '15, but surely in 2016, there will be at least double the GPU power compared to now.
 
5K would be amazing for productivity, but not so much for gaming. 4K + antialiasing is barely realistic with any currently available GPU hardware.

While it certainly wouldn't hurt to enable it if you have the headroom, with 4K+ computer monitors (under 30") the DPI is so high you likely wouldn't really notice a difference in image quality versus the enormous performance hit it'd take.
 
Nah, with LG already demoing 8K 55" screens and DP 1.3 coming with support for 8K@60Hz, I have the feeling we might skip directly to 8K, especially because Oculus Rift requires extremely high-res displays, and that will push the tech further.
 
Nah, with LG already demoing 8K 55" screens and DP 1.3 coming with support for 8K@60Hz, I have the feeling we might skip directly to 8K, especially because Oculus Rift requires extremely high-res displays, and that will push the tech further.

Oculus Rift is absolutely not a driving force in display innovation. They may want better displays, and they will use better tech when available, but they have close to zero clout in pushing display manufacturing forward.
 
Don't forget: perhaps in Q4 '15, but surely in 2016, there will be at least double the GPU power compared to now.

Running multi-GPU setups has always been a necessity for maxing out games at 2560x1600, since the days of Half-Life 2. As top-end GPU power increases, so do game graphics requirements. Game developers have been targeting 1920x1080 for their "ultra" settings on a single top-end video card, and with game consoles fixed at 1080p, that isn't going to change anytime soon. So unless game graphics get worse in 2016, twice the GPU power we have today wouldn't be sufficient for 5K gaming.

While it certainly wouldn't hurt to enable it if you have the headroom, with 4K+ computer monitors (under 30") the DPI is so high you likely wouldn't really notice a difference in image quality versus the enormous performance hit it'd take.

IMO antialiasing is still required for the best experience at 4K. I have a 14" Alienware laptop with a 1080p display. That's exactly one quarter of the size and resolution of a 28" 4K display, and pixel aliasing is definitely still noticeable.

The current generation video hardware can barely keep up with 4K, much less maxed details at 1440p. Even if next generation hardware can handle 4K, expecting double performance from the next gen after that is just unrealistic. 5K gaming isn't going to be feasible anytime soon.
 
Mainstream by what measure? If by price, no way; 4K is cheaper than 1440p in many cases now. All the major manufacturers are tooled for UHD resolution, so the costs are much lower for a manufacturer than for a side resolution like 5K, which isn't supported by any media format.
 
My 3k laptop has about the same DPI as the 27" 5k screen and I must say the clarity is amazing.

5K is the way to go, but it's not the future mainstream. 4K and 8K will be mainstream and 5K will be the red-headed stepchild.

Now imagine that clarity on a screen that is 27" instead of a tiny laptop size. :eek:

8K is a full 5+ years away from being a reality for a consumer priced desktop monitor IMO.

It's safer to stick to more standard resolutions and aspect ratios for the sake of compatibility. For the next few years most mid- to high-end displays will be switching to 4K (3840x2160). This will give you the best support in software, since high DPI is giving applications some growing pains. If you game a lot then you know that even 4K (4096x2160) is a source of some compatibility headaches. Just stick to 4K in 16:9 and you should be set for the next few years at least.

Games don't care what resolution they run at; they care about aspect ratio. "Small" 4K, 5K, and 8K are all 16:9 formats and will work fine.

Don't forget: perhaps in Q4 '15, but surely in 2016, there will be at least double the GPU power compared to now.

If you think there will be GPUs twice as powerful as the GTX 980 in that short of a time frame, I've got some magic beans to sell you. ;)

And DP 1.3 coming with support for 8K@60Hz, I have the feeling we might skip directly to 8K

Except that it doesn't. DP 1.3 doesn't even come close to having enough bandwidth for proper 8K@60Hz. Sub-par connectivity standards will keep 8K technology at bay in consumer-level electronics for years.

Just to put 8K@60Hz into perspective for people, 4K@120Hz requires half the bandwidth...
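To put numbers on that, a rough uncompressed-bandwidth sketch (24-bit color, ignoring blanking overhead, so real link requirements are even higher):

```python
# Raw video bandwidth = width * height * refresh rate * bits per pixel.
def raw_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9  # gigabits per second

print(raw_gbps(7680, 4320, 60))   # 8K@60Hz  -> ~47.8 Gbps
print(raw_gbps(3840, 2160, 120))  # 4K@120Hz -> ~23.9 Gbps, exactly half
# DP 1.3 (HBR3) carries about 25.92 Gbps of payload after 8b/10b encoding,
# so a single link falls well short of 8K@60Hz.
```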
 
Aye, half the bandwidth... and 8K is four times the pixels of 4K, so the only way to get 8K@60Hz is two DP 1.3 links or four DP 1.2/HDMI 2.0 links.
P.S.
I really hope that VR headsets/glasses will take off. It's MUCH MUCH cheaper to manufacture screens of the sizes used in these than the big high-res screens in common monitors/TVs. Heck, simple example: what a small fraction of the cost two screens reused from smartphones (as in the current Oculus samples) are, versus the price of a 120"+ (or even bigger?) TV for somewhat comparable apparent screen size given the greater distance to it.
 
This is actually really sad. Instead of making advancements in panel technology, manufacturers are pushing for higher resolutions with the same crappy LCD panels. :confused:

I don't think we will ever see consumer-oriented OLED monitors this decade at this rate.
It's easier to increase the pixel count than to develop and perfect a new display technology, and consumers only care about numbers and gimmicks anyway; seeing how plasma TVs are dead and gone now only confirms this.
 
5K is the way to go, but it's not the future mainstream. 4K and 8K will be mainstream and 5K will be the red-headed stepchild.

Not necessarily. 4K and 8K are TV standards with pictures filling the whole screen. 5K can be made in the 21:9 format or even wider and still have windows of 4K size. 8K is something new; I see this kind of resolution as better suited to widely stretched curved monitors than to just standard 16:9.

While it certainly wouldn't hurt to enable it if you have the headroom, with 4K+ computer monitors (under 30") the DPI is so high you likely wouldn't really notice a difference in image quality versus the enormous performance hit it'd take.

Indeed, but monitors can get much bigger than 30". A 40" 4K monitor has the DPI of a 27" 1440p monitor, which is nothing special. But the wild card for monitors is wide curved panels going beyond 21:9.
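The DPI comparison checks out with simple geometry; PPI is just the diagonal pixel count divided by the diagonal size in inches:

```python
import math

# Pixels per inch from resolution and diagonal size.
def ppi(width_px, height_px, diag_inches):
    return math.hypot(width_px, height_px) / diag_inches

print(round(ppi(3840, 2160, 40)))  # 40" 4K    -> 110 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p -> 109 PPI, essentially the same
```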

Running multi-GPU setups has always been a necessity for maxing out games at 2560x1600, since the days of Half-Life 2. As top-end GPU power increases, so do game graphics requirements. Game developers have been targeting 1920x1080 for their "ultra" settings on a single top-end video card, and with game consoles fixed at 1080p, that isn't going to change anytime soon. So unless game graphics get worse in 2016, twice the GPU power we have today wouldn't be sufficient for 5K gaming.

The performance hit is a temporary thing: just as high-end 4K gaming now calls for dual-SLI GTX 980-caliber cards, 5K gaming will be for dual SLI with the 16 nm generation.

5K gaming isn't going to be feasible anytime soon.

You will have to beg pardon for this statement in about 18 months from now, max :D. Though we are talking about dual SLI.

8K is a full 5+ years away from being a reality for a consumer priced desktop monitor IMO.

Heh, yeah, consumer-priced. But there are plenty of non-Joe-Sixpack people who are buying expensive 5K monitors, curved 34" models and so on. 8K monitors for the upper class may thus arrive in 2016.

I really hope that VR headsets/glasses will take off. It's MUCH MUCH cheaper to manufacture screens of the sizes used in these than the big high-res screens in common monitors/TVs. Heck, simple example: what a small fraction of the cost two screens reused from smartphones (as in the current Oculus samples) are, versus the price of a 120"+ (or even bigger?) TV for somewhat comparable apparent screen size given the greater distance to it.

VR is a completely different segment from displays, and they cross over only to a certain extent. Seeing VR as a substitute for displays is sci-fi at this time. Even the issue of possible health problems with VR is not clarified yet, but motion sickness, eye strain, and even display artifacts retained on the retina are commonly reported.
 
wirk, when it comes to monitors, you had better just defer to Vega.

If you really believe that, within 18 months, consumer-grade gaming hardware will exist that can keep up with "no-Joe-Sixpack" 8K monitors in an ultimate extreme mega high-end market segment that completely doesn't exist today, you are naive about how computer hardware manufacturing works, much less what is technologically possible.
 
If you think there will be GPUs twice as powerful as the GTX 980 in that short of a time frame, I've got some magic beans to sell you. ;)

Don't smoke, and get real facts: the GTX 980 chip is relatively small and scaled down; its full version, a la Titan, would be 50% up. This is in 28 nm tech; the 16 nm technology, coming probably in 2015 but surely in 2016, can add another 50% on top of that. So the conclusion is twice the GTX 980 within the next 18 months.
wirk, when it comes to monitors, you had better just defer to Vega. If you really believe that, within 18 months, consumer-grade gaming hardware will exist that can keep up with "no-Joe-Sixpack" 8K monitors in an ultimate extreme mega high-end market segment that completely doesn't exist today, you are naive about how computer hardware manufacturing works, much less what is technologically possible.

The talk in this case was NOT about 8K but about 5K monitors, about which I said dual-SLI cards will be OK for gaming on them; arguments are as above.

This is actually really sad. Instead of making advancements in panel technology, manufacturers are pushing for higher resolutions with the same crappy LCD panels. :confused: I don't think we will ever see consumer-oriented OLED monitors this decade at this rate. It's easier to increase the pixel count than to develop and perfect a new display technology, and consumers only care about numbers and gimmicks anyway; seeing how plasma TVs are dead and gone now only confirms this.

Heh, to develop perfect technology in the future one has to have money now. They sell what is possible now and finance R&D from it. Big-size OLED is only just now entering large-scale production; OLED monitors may yet come. It will be interesting to see how wide the LG OLED lineup is at CES '15 in 3 weeks' time.
 
I find it really bizarre that OLED monitors aren't starting with computers like LCDs did.

The computer industry is the perfect testbed for new technology. That's where all the technology enthusiasts are, and you don't have to produce screens as big (anything over 30" is pushing it for a computer monitor).

Seems dumb to me to start with huge OLED TVs instead of starting with 21" - 30" computer monitors.
 
I find it really bizarre that OLED monitors aren't starting with computers like LCDs did. The computer industry is the perfect testbed for new technology. That's where all the technology enthusiasts are, and you don't have to produce screens as big (anything over 30" is pushing it for a computer monitor). Seems dumb to me to start with huge OLED TVs instead of starting with 21" - 30" computer monitors.

One could think like that but:

OLED was produced on manufacturing lines with small glass sheet sizes, which is OK for making only small displays in large quantities; note that for a number of years Samsung has been making OLED displays for smartphones. Now the first LG factory with a glass size comparable to LCD is just starting production. Apart from low production yields, the problem with OLED is/was image retention; earlier there were also reports about pixel burnout. Monitors would be particularly sensitive to this, since static pictures may be kept up for a long time. There is yet another problem with OLED, which is a white-block output limitation: OLED cannot display very bright big areas for long due to overheating. The light intensity of a big bright area has to be reduced, a noticeable effect of the picture, or part of it, getting darker.

But now that LG is just starting a big OLED factory, perhaps OLED monitors will show up too.
 
One could think like that but:

OLED was produced on manufacturing lines with small glass sheet sizes, which is OK for making only small displays in large quantities; note that for a number of years Samsung has been making OLED displays for smartphones. Now the first LG factory with a glass size comparable to LCD is just starting production. Apart from low production yields, the problem with OLED is/was image retention; earlier there were also reports about pixel burnout. Monitors would be particularly sensitive to this, since static pictures may be kept up for a long time. There is yet another problem with OLED, which is a white-block output limitation: OLED cannot display very bright big areas for long due to overheating. The light intensity of a big bright area has to be reduced, a noticeable effect of the picture, or part of it, getting darker.

But now that LG is just starting a big OLED factory, perhaps OLED monitors will show up too.

You'd think that smart phones would be just as bad as computers as far as image retention goes. They're pretty much always displaying the same stuff.
 
IMHO 5120x2880 is a bad resolution for PC use or gaming.

The lack of updated DisplayPort standards in shipping products to allow for SST connections, the lack of good DPI scaling in Windows (so, by extension, where people typically game), the lack of a UHD playback and transmission standard (Rec. 2020 is still only a proposed standard), and it not being a multiple of 4K all mean that current PC hardware will not really allow you to be productive, game, or enjoy UHD content better than on a 3840x2160 60Hz screen or a 2560x1440 144Hz screen.

If you really wanted "better" than "4K," I think a 5120x2160 60Hz screen would be the better move before 8K becomes a viable thing.

You should be able to get a 60Hz SST DP 1.3 connection more easily once some monitors hit the market, and it will handle 1080p and 3840x2160 content fine with simple pillarboxing, since it has the same vertical pixel count as UHD and an even multiple of 1080p.

Being wider than 16:9 also means 4096x2160 content could be played back without conversion, and having scalers and HDMI inputs is more reasonable than converting to 5120x2880.

I know the extra width would be better for some people wanting multiple screens for gaming or productivity than a 5120x2880 screen would be, and it would be easier to drive GPU-wise.

I've read that there are also 5120x2160 monitors planned that are 28"-32" and curved, and to me that sounds like a really nice immersive setup for single-monitor gaming or movie watching, versus a 28" 5120x2880 monitor that gives basically the same immersion as my 2560x1440 monitor already does but would need a lot more GPU and would really need good DPI scaling in Windows.

5120x2160@60Hz at 32" seems like it would be right in the sweet spot; you would still need more GPU horsepower to game on it, but it would be easier to implement, and if I'm moving to a resolution beyond UHD I'd rather it was wider too, instead of the same aspect ratio I already have.
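To illustrate the pillarboxing point: with the same 2160-pixel height as UHD, 16:9 content maps onto a 5120x2160 panel with clean integer scaling and black bars on the sides. A small sketch, assuming simple integer scaling:

```python
# A 5120x2160 panel showing 16:9 content with integer scaling + pillarboxing.
PANEL_W, PANEL_H = 5120, 2160

def pillarbox(content_w, content_h):
    scale = PANEL_H // content_h               # integer vertical scale factor
    shown_w, shown_h = content_w * scale, content_h * scale
    bar = (PANEL_W - shown_w) // 2             # black pillar width per side
    return shown_w, shown_h, bar

print(pillarbox(3840, 2160))  # (3840, 2160, 640): UHD shown 1:1, 640 px bars
print(pillarbox(1920, 1080))  # (3840, 2160, 640): 1080p scales 2x evenly
```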
 
If you have the right GPU to push all those pixels, go for it!

Been running 4K since May 2014; hated MST and was glad to upgrade to a larger 32" SST 4K display recently. No way would I want to go 5K until SST with DisplayPort 1.3 is common enough, and probably more video card horsepower would be desirable than my highly OC'd SLI GTX 970 pair. I love my 4K panel and don't foresee upgrading for at least 3 years from now, though I'm aiming to stick with it for 4+.

Don't smoke, and get real facts: the GTX 980 chip is relatively small and scaled down; its full version, a la Titan, would be 50% up. This is in 28 nm tech; the 16 nm technology, coming probably in 2015 but surely in 2016, can add another 50% on top of that. So the conclusion is twice the GTX 980 within the next 18 months.

The talk in this case was NOT about 8K but about 5K monitors, about which I said dual-SLI cards will be OK for gaming on them; arguments are as above.

Heh, to develop perfect technology in the future one has to have money now. They sell what is possible now and finance R&D from it. Big-size OLED is only just now entering large-scale production; OLED monitors may yet come. It will be interesting to see how wide the LG OLED lineup is at CES '15 in 3 weeks' time.

Agreed on all points.
 
Can't wait for 4K IPS, but yeah, most graphics cards - even the top-of-the-line ones - have a hard time handling 1440p with most things set on ultra/high while keeping a smooth FPS.
 
IMHO 5120x2880 is a bad resolution for PC use or gaming. The lack of updated DisplayPort standards in shipping products to allow for SST connections, the lack of good DPI scaling in Windows (so, by extension, where people typically game), the lack of a UHD playback and transmission standard (Rec. 2020 is still only a proposed standard), and it not being a multiple of 4K all mean that current PC hardware will not really allow you to be productive, game, or enjoy UHD content better than on a 3840x2160 60Hz screen or a 2560x1440 144Hz screen. If you really wanted "better" than "4K," I think a 5120x2160 60Hz screen would be the better move before 8K becomes a viable thing. You should be able to get a 60Hz SST DP 1.3 connection more easily once some monitors hit the market, and it will handle 1080p and 3840x2160 content fine with simple pillarboxing, since it has the same vertical pixel count as UHD and an even multiple of 1080p.
Being wider than 16:9 also means 4096x2160 content could be played back without conversion, and having scalers and HDMI inputs is more reasonable than converting to 5120x2880. I know the extra width would be better for some people wanting multiple screens for gaming or productivity than a 5120x2880 screen would be, and it would be easier to drive GPU-wise. I've read that there are also 5120x2160 monitors planned that are 28"-32" and curved, and to me that sounds like a really nice immersive setup for single-monitor gaming or movie watching, versus a 28" 5120x2880 monitor that gives basically the same immersion as my 2560x1440 monitor already does but would need a lot more GPU and would really need good DPI scaling in Windows. 5120x2160@60Hz at 32" seems like it would be right in the sweet spot; you would still need more GPU horsepower to game on it, but it would be easier to implement, and if I'm moving to a resolution beyond UHD I'd rather it was wider too, instead of the same aspect ratio I already have.

You are right on many points. But I would rather treat 5K as a generic number not strictly limited to 5120x2880 pixels; 5K then provides a sufficient pixel budget for entirely new solutions. One can think about a 21:9 curved display with a resolution of about 6000x2400, or even more stretched formats like 22:9 and 23:9. This would be a definitive break with the convention of the traditional monitor looking like a small TV. Regarding the lack of DP 1.3 and GPU horsepower, these are temporary problems. Of course it would be best if all components of a new technology arrived in a coordinated way at the same time, but this is the way the economy works; one can be glad they managed to agree on single standards :).
 
Don't smoke, and get real facts: the GTX 980 chip is relatively small and scaled down; its full version, a la Titan, would be 50% up. This is in 28 nm tech; the 16 nm technology, coming probably in 2015 but surely in 2016, can add another 50% on top of that. So the conclusion is twice the GTX 980 within the next 18 months.

It's good to know that supposition, speculation, and hyperbole are "real facts". :rolleyes: Someone's been reading too much wccftech.com...

Been running 4K since May 2014; hated MST and was glad to upgrade to a larger 32" SST 4K display recently.

It's funny; although SST is superior, I've never run into many issues at all having owned multiple MST 4K displays.
 
4K, 144Hz, OLED, curved, G-Sync/FreeSync. That's all I want/NEED. :D

But who am I kidding. I'll probably be using some form of VR by the time something like that becomes available, if it ever does.
 
Have fun waiting a good two years and spending over $1,000 for a 5K monitor.
 
You'd think that smart phones would be just as bad as computers as far as image retention goes. They're pretty much always displaying the same stuff.

My (old) Samsung S3 has image retention. Entirely my fault - I should know better than to use it as a GPS with full brightness in a car in bright sunshine for months and months. :)

Other (newer) OLED phones/tablets actually orbit the pixels to try to prevent permanent image retention. Who knows if they will be successful.

Exciting times to own an OLED display. :D
 
Other (newer) OLED phones/tablets actually orbit the pixels to try to prevent permanent image retention. Who knows if they will be successful.

(Some) plasmas did the same thing.

It kinda works, but not really. Instead of a finely detailed burned-in image you get a slightly blurry burned-in image.
 
Maybe they should make a screen with a motor that periodically rotates it 180 degrees. (Or rotate a square screen 90 degrees.) I wonder how often it would have to do that to avoid burn-in.
 
This is actually a good question now that 5K monitors are getting released; if you have not used 4K before, I would definitely recommend jumping to 5K directly. Of course, 4K monitors now are more ironed out compared to the beginning of this year, when the 4Ks were still using MST.

However, having 3 of the Dell UP2414Q 4K IPS monitors, I have zero regrets and can say with confidence that they work beautifully in 4K Surround (25MP), which is about 10 million more pixels than 5K.

Check here to see the 4K Surround beast in action: http://youtu.be/DnBVPNRV5gU

4K Surround is definitely playable, although AA was not an option, since even the Titan Black SCs were using all 6GB in almost all games with no AA.

The thing that really rankles me about the new 5K monitor - the only one on the market - is the DP 1.2 connection and needing TWO of them to run 5K @ 60Hz. If Dell were able to install a high-end timing controller along with DP 1.3, I would get 3 of them immediately for some 44MP madness.

Anyway, I wonder if we will see a DP 1.3, 5K @ 60Hz, IPS (type?) monitor with G-Sync. That would be an incredible display.

And whoever keeps saying current GPUs struggle at 4K has no clue what they are talking about. I've been running 4K+ resolution for almost FOUR years (3 30" monitors), since the days of the GTX 580 Classified. It ran great, and now with 4K Surround, even that runs well for the most part.

Look at this video of BF4 - 4K @ 60FPS w/ everything on Ultra (No MSAA): BF4 4K @ 60FPS.

That's w/ 2 GTX-980 Classified in SLI. Yea, keep stating that 4K is "unplayable." :rolleyes:
 
That's w/ 2 GTX-980 Classified in SLI. Yea, keep stating that 4K is "unplayable." :rolleyes:


So, you need $1,000+ in overclocked GPUs to max out a year-old console port? My GTX 580 struggled with BF4 even on high settings at 1080p.

I'd rather have a 1080p screen with better contrast, blacks, and factory calibration than those 4K IPS Dells. :D
 
^ exactly. BF4 performance is not a good baseline for games starting to come out now.

A good baseline is Crysis 3, which gets nowhere near 60 FPS at 4K with 980 SLI: 40 FPS average, sometimes dipping into the 30s.
 
If Dell were able to install a high-end timing controller along with DP 1.3, I would get 3 of them immediately for some 44MP madness.

Means there is an absolutely insatiable appetite for pixels out there :eek:. Yours should be satisfiable in 2015, though :D.
 