Now that 4K displays are in the pipeline, will there be a steep decline in 30-inch displays?

maverick786us

2[H]4U
Joined
Aug 24, 2006
Messages
2,118
With 4K TVs on the horizon and 4K monitors in the pipeline, can we expect a steep decline in the prices of 30-inch displays, which are much anticipated by all the hardcore enthusiasts on this forum?
 
No. You'll see 30" models slowly vanish from the market as supplies run dry. But I don't think it'll happen any time soon; 4K monitors are still much more expensive.
 
Depends on how manufacturers go. It may go the way of the Titan, where instead of replacing the high end, it creates a new ultra-high end and prices stay the same. I hope prices get pushed down, but we need a display maker like Seiki to drive down prices. It would be even better if Dell and HP dropped prices, but I doubt it. Then again, Dell was one of the first to make affordable IPS screens, and then one of the first to make affordable monitors with DisplayPort, so it could happen.
 
I'd say 27" 1440p displays are filling this niche nicely.

Eh... They are too small for me. 27" 1440p displays are only slightly taller than a 24" 1200p display.

compare.png
 
I think once the cheapie 39" TVs get 60Hz 4K sussed out they will start putting some serious hurt on the "monitor" segment...

$3000 31" 4K "Monitor" or $600 39" "TV" Hmmmm
 
No. You'll see 30" models slowly vanish from the market as supplies run dry. But I don't think it'll happen any time soon; 4K monitors are still much more expensive.

What will take the place of 30 inch displays?

I think once the cheapie 39" TVs get 60Hz 4K sussed out they will start putting some serious hurt on the "monitor" segment...

$3000 31" 4K "Monitor" or $600 39" "TV" Hmmmm

TVs cannot deliver the clarity monitors can. If you are using your PC just for gaming, a TV could be fine, but not for everyday purposes. Look at Google Maps at 4K resolution. You will be amazed. You won't find such clarity on a 4K TV.
 
TVs cannot deliver the clarity monitors can. If you are using your PC just for gaming, a TV could be fine, but not for everyday purposes. Look at Google Maps at 4K resolution. You will be amazed. You won't find such clarity on a 4K TV.

I can see mentioning a ton of things like features, inputs, warranty, or color calibration as differentiating between a "TV" and a "Monitor" but you're seriously going to say clarity?

The glass/panels they use are exactly the same!
 
What will take the place of 30 inch displays?



TVs cannot deliver the clarity monitors can. If you are using your PC just for gaming, a TV could be fine, but not for everyday purposes. Look at Google Maps at 4K resolution. You will be amazed. You won't find such clarity on a 4K TV.

I lol'd IRL. Thanks!
 
Eh... They are too small for me. 27" 1440p displays are only slightly taller than a 24" 1200p display.

compare.png

++ This!!

If I only played games on the PC then I could get by with a 1440p 16:9 display, but I don't like them as PC monitors. I was considering the 27" 1440p monitors before I bought a 30", but every time I looked at one in a store, it didn't feel like an upgrade over my 24" 1920x1200 monitor; it felt the same as my current equipment.
 
Eh... They are too small for me. 27" 1440p displays are only slightly taller than a 24" 1200p display.

compare.png

27" vs 30" at the same distance, centered:
27inTV_vs_30inTV_sizes.jpg


To quote myself from elsewhere:

PPI and monitor size are, from your perspective, relative to viewing distance. A 27" display at 108.8 PPI at desk distance is a lot better than a 1920x1200 one, and fills a lot of your viewpoint. Personally I think anything larger than 27" 16:9 / 30" 16:10 at normal desk distances gets too big, especially for games. That is, unless games were to treat the larger screen real estate as additional FoV rather than just rendering the same scene jumbo-sized in front of your face, pushing the outer screen into your periphery.

Note that while the 30" has +80 px top and +80 px bottom vs the 27", the 30" does not have any more pixels across its width than the 27"; its pixels are just that much larger.

The 80 px top-and-bottom 'gap' would be much smaller size-wise at the same pixel size: .75 inches top and .75 inches bottom (3/4 inch each), about 1.47" total, if both were 108.8 PPI.

Another way to look at it on more equal terms is that if you moved the 30" panel back enough until its width (and ppi) looked equal to the 27" to your viewing perspective, there would be .75" peeking out top and bottom (about the diameter of a dime coin each) in relation to the 27" screen, 80px tall each.
Conversely, if you move a 27" monitor's viewing distance closer to you until it has the same ppi (and screen width) to your perspective as a 30", it will have the "same ppi" and look a lot larger to you.
A 27" 2560x1440 panel sized up to 30" 2560x1600 sized ppi would be 29.19" diagonal.
..
A 30" 2560x1600 panel sized down to 27" 2560x1440 sized ppi would be 27.75" diagonal.



Games use HOR+ scaling, which means even if you are using a 16:10 monitor, you are better off running the game in 16:9 mode, imo.

HOR-plus_scenes-compared_1-sm.jpg
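HOR+ means the vertical FoV is held fixed and the horizontal FoV grows with the aspect ratio, which is why a 16:9 mode shows more of the scene than 16:10. A rough sketch of the math (the ~59° base vertical FoV is my example figure, roughly what a 90° horizontal FoV at 16:9 implies):

```python
import math

def horizontal_fov(vertical_fov_deg, aspect):
    """HOR+ scaling: horizontal FoV derived from a fixed vertical FoV."""
    v = math.radians(vertical_fov_deg)
    h = 2 * math.atan(math.tan(v / 2) * aspect)
    return math.degrees(h)

# With a ~59 deg vertical FoV held constant:
print(round(horizontal_fov(59, 4 / 3), 1))    # 74.1 deg at 4:3
print(round(horizontal_fov(59, 16 / 10), 1))  # 84.3 deg at 16:10
print(round(horizontal_fov(59, 16 / 9), 1))   # 90.3 deg at 16:9 (widest view)
```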


When using larger screens to show the same scene (going beyond 27"-30" at their suitable desk distances), for example larger TVs, you are just making the entire scene jumbo-sized.

Having one giant, tall screen at desk viewing distances is a recipe for eye-bending to the periphery, imo, at anything much over a 30" diagonal of primary viewing space, even if I managed to center it to my viewpoint somehow. I used a 37" Westinghouse at one point, and regardless of the PPI, it was physically way out of bounds at the perimeter at normal desk viewing distances. I moved it back 4' on a pillar until I could sell it. Unless there were some way to isolate a virtual primary-monitor "window" for the main game screen, constraining model sizes and keeping all of the main action, HUDs, notifications, pointers, and chat within it (a "window"/virtual monitor space with invisible borders, so that everything in the game's FoV beyond it was intentionally peripheral), a giant, tall wall of monitor in front of my face would not be something I would want for gaming.

As it is now, using multiple monitors is one of the only ways to increase your FoV rather than just making the same scene JUMBO in front of your face.
eyefinity_config-aspects-visualized_sm.jpg
 
With regard to this thread's title, "Now that 4K displays are in the pipeline, will there be a steep decline in 30-inch displays": I agree with some others that a 27" 2560x1440 at a $350+ price point will be a lot more common. You could always mount more than one 27" 2560x IPS with a unified/spanned desktop, and it would have much better color than an "affordable" 4K at similar PPI. From what I understand, the IPS panels have better color than the currently "affordable" 4K displays. I can see using a smaller, affordable 4K TV at $1k or less for desktop use in order to get more real estate at a similar PPI, but I wouldn't want to sacrifice color quality or the quality of the displayed content in general, unless I were doing primarily text/coding.
I'll have to keep an eye on the color- and display-quality tests/benchmarks/reports of the $1k-and-under models.
If the only 4K monitors with color and overall display quality comparable to the current 27" and 30" 2560x panels continue to be very expensive, I don't feel the decline will be that steep, at least for people to whom color matters.
.
For gaming, imo, 1080p at 120-144Hz is still the sweet spot for GPU power on an enthusiast budget ($750-$1k in GPUs) without going to the extreme budgets of $1500-$2k+ in GPUs alone; that is, in order to get 100-120+ fps on a 120Hz monitor at high, high+, or ultra settings (depending on how demanding the game is). Going even to 2560x on a 120Hz "overclocked" Korean monitor crushes fps in more demanding games. I think it is best to keep a separate monitor just for gaming, and use better-PPI, better-color monitors for desktop use.

Personally I want 100fps+ on a 120Hz monitor for my games (over 120fps optimally). My GPU budget vs how demanding a game is limits what video quality settings I will use. For me, sub-100fps without a 120Hz monitor is not an "ultra" graphics/display experience at all. It is not max settings to me, or perhaps not "max configuration" and maximum presentation of the game world.
60Hz monitors/TVs blur the entire viewport during FoV movement; 120Hz at high fps cuts that blur by 50%, and 144Hz by roughly 60%. Low fps and/or low Hz also show half or fewer of the most recent action slices per second, which means lower accuracy, worse motion tracking, and less aesthetic smoothness/fluidity of motion.
Maximum blur and the worst aesthetic motion smoothness (and reduced accuracy) is not the best/max visual presentation of a game.

60Hz smears horribly; 120Hz at high fps does what looks to me like a full soften-blur effect on the entire viewport during FoV movement, or on high-speed objects from a static viewpoint.
Photos: 60Hz vs 120Hz vs LightBoost
In actual gaming it is not just a single cartoonish ufo object blurring.
Blurring of LCDs during FoV movement blurs out all high-detail object/architecture/landscape/geographic detail, high-detail textures, depth via bump mapping, shader effects, in-game text/signs, nameplates, etc.

While I am very interested in blur reduction and optimally blur elimination, there are additional benefits to running high fps and high hz.
.
When I say "smoothness" I mean something separate from blur reduction. If I were using a general term for blur reduction I would use something like "clarity" or "clearness".
.
Smoothness to me means more unique action slices, more recent action going on in the game world shown - more dotted lines per dotted line length, more slices between two points of travel per se, more unique and newer pages flipping in an animation booklet, pick your analogy. It means less "stops" in the action per second and more defined ("higher definition") animation/action flow, which provides greater aesthetic motion and can increase accuracy, timing, and reaction time.
.
Disregarding backlight strobing for a moment: as I understand it, where a strobe light in a room someone runs across would show blackouts, a typical LCD, rather than blacking out, just continues displaying the last "frozen" frame of action until it is updated. At 60Hz that is every 16.6ms, of course; at 120Hz and high fps it would have shown a new state of the room and run cycle 8.3ms sooner, instead of freeze-frame skipping (over what would have been a new state at +8.3ms) to the next state a full 16.6ms later. What is displayed of the entire animated game world is updated twice as often (and twice as soon), which can increase accuracy, and by providing more "dots per dotted line", makes movement transitions cleaner and aesthetically smoother, providing higher-definition movement and animation divided into 8.3ms updates. This goes hand in hand with blur reduction/elimination to make the entire experience a drastic improvement over 60Hz/60fps.
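The persistence arithmetic behind those frame times and percentages can be sketched as follows (function names are mine; this models ideal sample-and-hold with fps matching the refresh rate):

```python
def persistence_ms(hz):
    """Sample-and-hold persistence: how long each frame stays on screen."""
    return 1000.0 / hz

def blur_reduction_vs_60(hz):
    """Fraction of 60Hz hold-time blur eliminated at a higher refresh rate."""
    return 1.0 - persistence_ms(hz) / persistence_ms(60)

print(round(persistence_ms(60), 1))         # 16.7 ms per frame
print(round(persistence_ms(120), 1))        # 8.3 ms -- a new state 8.3ms sooner
print(round(persistence_ms(144), 1))        # 6.9 ms
print(round(blur_reduction_vs_60(120), 2))  # 0.5  -> the "50%" figure
print(round(blur_reduction_vs_60(144), 2))  # 0.58 -> roughly the "60%" figure
```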
 
What will take the place of 30 inch displays?



TVs cannot deliver the clarity monitors can. If you are using your PC just for gaming, a TV could be fine, but not for everyday purposes. Look at Google Maps at 4K resolution. You will be amazed. You won't find such clarity on a 4K TV.

And you're saying this from experience?

I am sorry, but TVs are monitors; they just have tuners too. The only problem with TVs is that they often do post-processing, which can be adverse for monitor usage, but this can usually be disabled. The image on my 39-inch Seiki 4K display is better than my 22-inch IPS 4K display (IBM T221). It's also better than my 30-inch Dell and 27-inch Yamakasi. Which 4K displays/TVs do you have?
 
As it is now, using multiple monitors is one of the only ways to increase your FoV rather than just making the same scene JUMBO in front of your face.

There are many games that do not maintain the same FOV at a given aspect ratio regardless of resolution. The ones that strictly adhere to this tend to be first/third-person single-character games, but a number of RTSes, top-down RPGs, etc., do things quite differently. For example, I see a lot more at once at 3840x2160 in Civ5 than at 1920x1200.

That said, multiple monitors is certainly a more reliable way to increase your FOV if that's a priority and you don't want to mess with FOV settings.
 
True, mostly for tiled games like RTSes. I should have been more specific: 1st/3rd-person CGI-world perspective games.
Even portrait-mode 3x1 monitor setups or huge wall TVs in most 3rd/1st-person games just jumbo-size the same scene at ordinary desk distances (though 3x1 obviously adds pixel count).
Being able to isolate your primary FoV and turn the extra resolution and screen size outside of your focal viewpoint into additional FoV (in 1st/3rd-person CGI-world games) would make very large screens at a desk reasonable for immersion. Eye-bending (and perhaps even micro neck-bending) to the periphery to see the primary FoV space is unsuitable imo, and could even be a detriment to your gaming ability.

That said, while there are smaller 4K TVs and monitors, I don't think they are the best choice for gaming vs the huge gains of 120Hz and the 1080p sweet spot for enough fps to feed the 120Hz on an enthusiast-budget GPU (without going to extreme budgets), for the reasons I outlined a few posts ago.
 
And you're saying this from experience?

I am sorry, but TVs are monitors; they just have tuners too. The only problem with TVs is that they often do post-processing, which can be adverse for monitor usage, but this can usually be disabled. The image on my 39-inch Seiki 4K display is better than my 22-inch IPS 4K display (IBM T221). It's also better than my 30-inch Dell and 27-inch Yamakasi. Which 4K displays/TVs do you have?

I would be interested to see color, uniformity, and display-clarity (incl. AG coatings, if any) testing results for the more affordable 4K panels, to compare against some of the more common 2560x IPS panels (Dell, Apple, properly calibrated Koreans). The display subjectively being "better" can mean different things.
 
I made this in reply to the 30" vs 27" 2560x horizontal-resolution comparison argument a few posts back.
27in_vs_30in_2560x-width.jpg


More panel-sizing fun. I'll double-check the 4K field later; I think it's pretty accurate.

4k_vs_27in_vs_30in_2560_same-ppi.jpg
 
And you're saying this from experience?

I am sorry, but TVs are monitors; they just have tuners too. The only problem with TVs is that they often do post-processing, which can be adverse for monitor usage, but this can usually be disabled. The image on my 39-inch Seiki 4K display is better than my 22-inch IPS 4K display (IBM T221). It's also better than my 30-inch Dell and 27-inch Yamakasi. Which 4K displays/TVs do you have?

I can't afford a 4K TV ATM, but I am speaking from experience. How many times have you seen graphic designers or high-end AutoCAD engineers who do 3D design use TVs instead of monitors? They all use high-end 30-inch displays. And TVs do have refresh rates of more than 60Hz, but I've read in an older thread that TVs cannot take more than a 30Hz input, so how good will that be for gaming?
 
I can't afford a 4K TV ATM, but I am speaking from experience. How many times have you seen graphic designers or high-end AutoCAD engineers who do 3D design use TVs instead of monitors? They all use high-end 30-inch displays. And TVs do have refresh rates of more than 60Hz, but I've read in an older thread that TVs cannot take more than a 30Hz input, so how good will that be for gaming?

Ok so a few things here...

Yes, TVs do have signal processing, input lag, a lack of power management, and other things that make them not the best monitors (especially the input lag for gaming), but when you're talking simply about quality and clarity of the screen, HDTVs are usually pretty good; it depends more on the *type* of LCD panel than anything else.

I would say it's mostly for the above reasons that people don't use TVs unless they need big displays.

98% of TVs in general are going to be at least 60Hz. 4K sets are the only ones with a 30Hz refresh rate limit, which is imposed by HDMI 1.4.

Now, in the case of the Seiki 4K display, the 39-inch model sold for as cheap as $500 but usually goes for around $600-700. It has an S-MVA panel, which means almost as good viewing angles as IPS but much better contrast. It has excellent colors and viewing angles in my opinion, and I came from a 30-inch Dell 3007WFP and a Yamakasi Catleap Q270. Even not counting the resolution increase, I would say the display is quite a bit superior in color and response, and hugely superior in contrast, compared to the "professional" monitors I used to use. It's not even a competition when you bring resolution into the picture.

Once it was tuned, it is just as sharp as any other 39-inch display would be. At lower resolutions it can do 60Hz, or supposedly even 120Hz with a firmware update (works on the 50-inch but not the 39), and unlike most TVs it has almost zero input lag, so it's even good for gaming.

Honestly, the only possible annoyance is that the power management is not like a normal monitor's. This doesn't matter to me, as I run Linux, so unplugging the display or going to sleep does not affect my desktop arrangement or cause the annoying issues it can for some Windows users.

The 39-inch Seiki is basically the best-looking display I have ever bought for under $1000 (I paid $499 for it). Used as a monitor, it is far nicer than my old premium monitors.

I would say, especially for 3D design, AutoCAD, etc., that the biggest reason people use 30-inch monitors is that it was the highest resolution you could get, and it was IPS. Not all TVs have panels with good color/contrast/viewing angles, but professional monitors will.

My dad is a software developer who upgraded to the 39-inch Seiki from a 22-inch ViewSonic VP2290b 4K display, and he is extremely happy with it. It was a big upgrade for him even with the 30Hz limit (his ViewSonic did 41Hz at 4K).
 
@elvn: I prefer bigger screens especially if I'm trying to use an anamorphic resolution while gaming.

Also, I don't think many users care too much if they have to move their eyes/neck around to see different parts of the screen. At some point you have so much resolution that multi-tasking becomes easy on one display. I also think that at some point, people start moving their displays back from their viewing position to account for the larger size. I know I'm not the usual user, but this is [H]ardforum after all, so having massive displays with equally massive resolutions should be somewhat of a standard around here. I've done every display setup Eyefinity allows except 3x2; the horizontal bezel makes that pointless. I've done 2x2 only for testing performance at 4K and getting screenshots.

EDIT: Also, if you want the absolute best example of having to move backward to use all of your displays and avoid some of this bending, here are 5 x 30"s in portrait playing a first-person shooter.



<-This is a Youtube Video, Click it for obnoxiousness.

In hindsight, I maybe should have put the right speaker behind the displays.
 
I get all that. That is why I specified enthusiast budget vs extreme budget. To me, an enthusiast is willing to drop decent money (e.g. $800 - $1500 on a rig, $600 - $1k on a gpu) just not extreme money ($1500 - $2k or more on gpus alone).
.
In an extreme setup similar to yours made of 16:9 120Hz monitors (or Vega's setup, if you are familiar with it), 3x1 is similar to a 16:9 viewport but with a much higher pixel count and demands. A 5x1 portrait setup's monitors on each end would indeed add additional FoV, like a PLP setup can
eyefinity_config-aspects_5x1P-vs-16x9PLP_sm.jpg

- so the entire array would not have to be in your focal viewpoint if you wanted to use the ends to add immersion. Furthermore, if you set the whole array back a little farther so that only/mostly the end monitors were outside of your focal viewpoint perspective wise, the central three monitors would again be filling your focal viewpoint more or less and you would be shrinking both the pixels (making the ppi seem even higher), and shrinking the bezels to your perspective.
.
I still strongly feel this way (quote below), no matter the rig/GPU budget, so the "ultra/extreme display" is a matter of tradeoffs. In my opinion Vega's monitor setup is probably the best for gaming: three 120Hz monitors in LightBoost 2D mode for zero blur. That is, if you have the extreme budget to drive three 1080p monitors at 120fps+ and you don't mind bezels (even "debezeled" thin ones). On an enthusiast budget, I feel a single 1080p 120-144Hz display (incl. LightBoost if you prefer it) is the sweet spot for getting 100 to (optimally) 120+ fps in modern games at high, high+, or ultra settings.
Personally I want 100fps+ on a 120Hz monitor for my games (over 120fps optimally). My GPU budget vs how demanding a game is limits what video quality settings I will use. For me, sub-100fps without a 120Hz monitor is not an "ultra" graphics/display experience at all. It is not max settings to me, or perhaps not "max configuration" and maximum presentation of the game world.
60Hz monitors/TVs blur the entire viewport during FoV movement; 120Hz at high fps cuts that blur by 50%, and 144Hz by roughly 60%. Low fps and/or low Hz also show half or fewer of the most recent action slices per second, which means lower accuracy, worse motion tracking, and less aesthetic smoothness/fluidity of motion.
.
Maximum blur and the worst aesthetic motion smoothness (and reduced accuracy) is not the best/max visual presentation of a game.

---------------------------

Going back to 4K screens: the real estate for desktop use would be great. I was commenting more about gaming with a wall of monitor in front of your face in typical 1st/3rd-person CGI-world perspectives, where the in-game FoV doesn't actually increase between different screen sizes at the same aspect ratio. I also commented on the huge benefits of 120Hz gaming at high fps (including important ones beyond blur reduction) and the fact that extreme resolutions require extreme GPU expenditure to get 100 to (optimally) 120+ fps at high, high+, or "ultra" settings when going beyond a single 1080p display to 2560x or higher. It's probably not possible on a 4K panel at those settings with quad GPUs currently, even if there were a 4K display capable of 120Hz input and output.
.
I use separate monitors for gaming and desktop use because the tradeoffs for each usage are so drastic. 4K would be interesting to consider as a replacement for my 2560x1440 Cinema Display someday. I have concerns about the color of the more affordable 4K TVs vs the more common 2560x IPS screens, even with the feedback recently given; I'd like to see some color testing results/reviews. I'm also not a fan of AG coatings, where present.
 
Now that 4K displays are in the pipeline, will there be a steep decline in 30-inch displays?
No.

The current 30/27'' monitors use resolutions common only to PC displays. There will be no further price drops; the direct offers from South Korea are the lowest prices we can expect for the moment.

1280x720, 1366x768, 1920x1080, and in the future 4K can be considered standardized TV resolutions planned for production in large volumes, with decreasing prices as a consequence.
The less common pure-PC display resolutions, especially those with non-16:9 aspect ratios like 1280x1024, 1600x1200, 1920x1200, and at present 2560x1600, will be phased out without a noticeable price drop beyond what has already happened.
2560x1440, like 1366x768, is a bit of an oddball that might still drop in price if it gets used as a cheap interim "4K-ready" alternative to full 4K, the way 1366x768 was deemed "HD-ready" in the past.

30'' (2560x1600) will not drop further in price and will be phased out, just like 1920x1200.

27'' might have some potential as not-quite-4K, like 1366x768 was not-quite-Full-HD, but cheap monitors will depend more on panels made with cheaper production methods (TN, e-IPS, etc.) than on a price drop forced by pressure from higher-resolution alternatives.
 
2560x1600 has not seen price changes over the last six years. Even Korean 30" models fail to cut prices in an impressive way: a non-Korean 27" costs USD 650, a Korean one USD 300; a non-Korean 30" costs USD 1100, a Korean one USD 700. For that kind of money, most users are better off with a 39" 4K TV.

There is one thing that can stop 1600p from going the way of the dodo like 1200p did: 1600p IPS panels have professional users who cannot use VA panels like those fabled cheap 39" 4K TV panels Innolux is producing.
 
2560x1600 has not seen price changes over the last six years. Even Korean 30" models fail to cut prices in an impressive way: a non-Korean 27" costs USD 650, a Korean one USD 300; a non-Korean 30" costs USD 1100, a Korean one USD 700. For that kind of money, most users are better off with a 39" 4K TV.

There is one thing that can stop 1600p from going the way of the dodo like 1200p did: 1600p IPS panels have professional users who cannot use VA panels like those fabled cheap 39" 4K TV panels Innolux is producing.

I think 2560x1600 will eventually be replaced by 4K when it gets cheaper.
 
For that kind of money most users are better off with a 39" 4K TV.

Very few people are interested in using 39" screens as desktop monitors to begin with, regardless of resolution or price. They're just too big for common seating positions and desk depths. Even 32 inches is too big for many users.
 
Once mass manufacture of 4K begins at all size levels, I think 4K will be cheaper than designer resolutions (laptop size, monitor size, etc.), much like 1080p is now cheap. Apple will probably release a "Retina Thunderbolt Display" within a year, since it'll soon be one of the last Retina-less Apple products, and around that time cheaper 4K monitors from other vendors will begin taking off.
 
Engineering companies and businesses around the world, as well as graphics and arts studios, specifically order 30" monitors because they are better than any other monitor for productivity. They are not great for watching movies and playing video games, but businesses don't need that anyway. Just because you don't order one to play League of Legends or COD on does not mean 30" monitors are going to vanish. You also don't purchase a Precision workstation laptop with 32 GB of RAM, dual 512GB SSDs, a 3940XM CPU, and a Quadro K5000 video card for $6000, but it is common equipment at my work, and Dell and HP are not going to stop producing them anytime soon. And yes, the package comes with two U3014s or Z30s as well. The only time a 27-inch monitor is ordered around my workplace is when someone orders a MBPr and wants the Thunderbolt Display.

What do you think? We are going to design the new space friggin' shuttle on LED-riddled Alienware laptops and 120Hz overdriven Asus monitors, hacking the registry for LightBoost, etc.?
Put an Asus Republic of Gamers sticker on the newest DirecTV satellite; after all, it did all the orbital mechanics simulations for it hahahahaha :D
 
hacking the registry for LightBoost, etc.?
Correction: LightBoost doesn't use registry hacks anymore.
It's now an easy keypress to turn ON/OFF LightBoost with modern LightBoost utilities such as ToastyX Strobelight.

Also, strobed backlights are becoming more of an advertised motion-blur-eliminating feature of displays, so it's easy to turn on on some displays. For example, Eizo's new high-end commercial display, the FDF2405W, uses a strobed backlight (like LightBoost) that can easily be turned on/off via the on-screen menus.
 
There are premier CGI movie studios, special-FX studios, game studios, and film studios using non-gaming design hardware that could still benefit from displays that provide motion excellence, were it addressed by the display manufacturers directly. I've seen a few CGI movie companies that still have FW900s in their mix, btw. CAD architecture/engineering static design doesn't require non-blurred motion, sure, but that isn't the only kind of design these render-heavy workstations are used for. Many are used for compositing CGI with "filmed" material, all-out CGI video, or editing real-world video digitally.
.
A 4K panel would provide a lot more real estate for 3D design and "film" editing, so it could potentially show up on some designers' desks.
Any design suite that uses multiple viewports, preview window(s), and multiple toolboxes and readouts (in some cases more than one app used in conjunction) can benefit from large gains in real estate, and the resolution is beneficial when working with higher- and higher-resolution material (video, photos, textures).
They would need competitive color compared to the existing 27-30" IPS panels, though, and they would still have inferior motion presentation without addressing LCD blur.

Obviously you could have a smaller 4K panel with a much higher PPI, but this graphic was made to show a more direct comparison of the real-estate differences between the three panels' resolutions.
4k_vs_27in_vs_30in_2560_same-ppi.jpg
 
What do you think? We are going to design the new space friggin' shuttle on LED-riddled Alienware laptops and 120Hz overdriven Asus monitors, hacking the registry for LightBoost, etc.?
Put an Asus Republic of Gamers sticker on the newest DirecTV satellite; after all, it did all the orbital mechanics simulations for it hahahahaha :D

No, we're going to do it on friggin' Dell 1905FPs, because the corporate beancounters don't want to pay for pricey displays. Maybe you can find a 22" 1680x1050 if you're lucky, but if you want dual monitors, better wait for the guy next to you to retire or go on vacation and swipe his. Oh, and don't forget that many are driven by Core 2 Duo machines with Windows XP running on mechanical hard drives. And if you want a docking station for your laptop, be prepared to spend three hours filling out paperwork and getting approvals. You must be working at a start-up, I guess :p
 
What do you think? We are going to design the new space friggin' shuttle on LED-riddled Alienware laptops and 120Hz overdriven Asus monitors, hacking the registry for LightBoost, etc.?
Put an Asus Republic of Gamers sticker on the newest DirecTV satellite; after all, it did all the orbital mechanics simulations for it hahahahaha :D

The thing is, the tech is no longer stratified like you think it is. Suddenly we have companies that seem to understand a consumer might want a 4K screen to use as both a TV and a monitor. They don't need production color calibration. It's going to eat away at the 30" market. Some 30" users need pro features; the others are going to see a 39" that is hundreds less than the 30", and it's a slam-dunk decision. There is no way you can spin the existence of good-enough 4K glass packaged cheap as not eroding sales of 30" gear.
 
No, we're going to do it on friggin' Dell 1905FPs, because the corporate beancounters don't want to pay for pricey displays. Maybe you can find a 22" 1680x1050 if you're lucky, but if you want dual monitors, better wait for the guy next to you to retire or go on vacation and swipe his. Oh, and don't forget that many are driven by Core 2 Duo machines with Windows XP running on mechanical hard drives. And if you want a docking station for your laptop, be prepared to spend three hours filling out paperwork and getting approvals. You must be working at a start-up, I guess :p

I work for NASA.

The thing is, the tech is no longer stratified like you think it is. Suddenly we have companies that seem to understand a consumer might want a 4K screen to use as both a TV and a monitor. They don't need production color calibration. It's going to eat away at the 30" market. Some 30" users need pro features; the others are going to see a 39" that is hundreds less than the 30", and it's a slam-dunk decision. There is no way you can spin the existence of good-enough 4K glass packaged cheap as not eroding sales of 30" gear.


I am pretty sure I have an idea of how tech is stratified these days.
 
Not the design side, but these are mission control room photos:

http://upload.wikimedia.org/wikipedia/commons/4/45/ISS_Flight_Control_Room_2006.jpg

http://baen.com/images/Burlison_images/Figure_5_FCR.jpg
.
Control rooms like that NASA one and stockbroker desks always have a huge array of monitors because they need more real estate for readouts.
.
Design suites using multiple design viewports, preview window(s), toolboxes, and readouts, often with more than one app in conjunction, can always use more real estate, so they typically resort to multiple monitors. A 4K display adds an additional 1280 x 560 of desktop real estate relative to a 30" 2560x1600, or 1280 x 720 relative to a 27" 2560x1440.
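Those real-estate figures are just resolution differences, easy to sanity-check (the helper name is mine):

```python
def extra_real_estate(new_res, old_res):
    """Extra desktop width and height, in pixels, of one resolution over another."""
    return (new_res[0] - old_res[0], new_res[1] - old_res[1])

print(extra_real_estate((3840, 2160), (2560, 1600)))  # (1280, 560) vs a 30" panel
print(extra_real_estate((3840, 2160), (2560, 1440)))  # (1280, 720) vs a 27" panel

# Total pixel-count ratios:
print(3840 * 2160 / (2560 * 1440))            # 2.25x a 27" 1440p desktop
print(round(3840 * 2160 / (2560 * 1600), 2))  # 2.02x a 30" 1600p desktop
```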
.
Entertainment is big business, and CGI and special-FX productions mean big budgets and returns. What do you think that CAD-designed Dish Network satellite is broadcasting? A lot of it is CGI-based or CGI-enhanced video material for entertainment purposes.
.
"Space may be the final frontier, but it's made in a Hollywood basement."
.
Video editing, CGI compositing over real-world footage, full CGI movie design, and video work in general could benefit from displays that don't suffer sample-and-hold blur, as could the target audiences for that material. LightBoost, as it is now, affects color and brightness too much for anything outside of gaming, but backlight strobing/scanning is something that could benefit even design displays if done properly and advanced by display manufacturers from the ground up. The fact is that LCD technology is flawed as it stands, mostly in regard to motion clarity, but also in regard to black levels and detail in blacks. 4K does nothing to address these shortfalls.
.
Will 30" 2560x or 27" 2560x go away in the next 5+ years? not likely, but it will probably slowly get supplanted/phased-out by 4k as the primary desktop/app display by some people(including quality 4k monitors adopted by some designers) just like 1920x was, especially as it becomes "relatively affordable" to.
.
For gaming, personally I think GPU power vs cost is way behind for such extreme resolutions, and 60Hz input for gaming is inferior on multiple levels... but that won't stop some people.
.
 