Why don't they make higher resolution desktop monitors?

natty

Limp Gawd
Joined
Apr 7, 2008
Messages
163
We recently bought a laptop with a 15-inch widescreen at 1920x1200 native resolution. Gaming on this display looks beautiful: you don't see any jagged edges, and you don't have to use AA.

Why can't someone make a 22-inch widescreen for desktops that does 2560x1600? I would certainly buy one.
 
I expect the density of that screen would make it fairly difficult to produce, raising the cost for the consumer. Also, I suspect there isn't much of a market for one, as most consumers would find user-interface elements too small to use.

Joe
 
1920x1200 native resolution

I've never seen a 15" monitor with that high a native res. But I'm sure for everything else, apart from gaming, it isn't so practical.
 
I expect the density of that screen would make it fairly difficult to produce, raising the cost for the consumer. Also, I suspect there isn't much of a market for one, as most consumers would find user-interface elements too small to use.

Joe

The market would be gamers. Like I said above, games look beautiful on the display. No jagged edges and no AA needed.

All you have to do is make the Windows interface larger. Seems like that would be easy to do.
 
I've never seen a 15" monitor with that high a native res. But I'm sure for everything else, apart from gaming, it isn't so practical.

Go to dell.com and look at their laptops. Many of them come with that resolution. They call it a 'Hi-Def' screen.
 
Yeah, I've often wondered about that too. I used to have a Dell laptop with a 1920x1200 screen and it was really nice. But I soon discovered that the only way to achieve that res on my desktop monitor was to buy a 24" screen. Now I want 2560x1600, but the only way to get that is with a 30 incher. :(
 
If I had the option between a 22" and a 24" 1920x1200, I would go with the 24. I think a 2560x1600 24" would be almost as expensive as a 30" 2560x1600, and guess what most people would rather have? A 30.
 
If Windows rendered the screen in a way that was fully scalable, it would be great. I'm all for shrinking pixels to the point where you can't even see them anymore... but the way Windows works now, everything just gets too small to see.

I know you can enlarge fonts, but other aspects of the Windows UI don't change size AFAIK.
 
My 24" is 1920x1200 and the DPI [I think that's what I mean] is just about right.

I couldn't imagine trying to read or do anything at this res on a smaller screen. Let alone one that tiny.

I would imagine that would be a nightmare. And a waste as I wouldn't be able to tell much of a difference in games over a smaller res and it would just mean you'd have to turn settings down to get the same FPS anyway.
 
If Windows rendered the screen in a way that was fully scalable, it would be great.

If only MS would put some meaningful changes into their OSes instead of just giving us bloated pretty pictures that only work nicely if you have the exact same size/resolution as their developers.

I just installed Ubuntu on a new (from misc parts) computer and started it on a 1024x768 CRT. Fooled around and made sure it was ok, then moved it onto the KVM with my normal 1600x1200 CRT. I clicked "detect monitor settings" on the screen admin widget and the screen resized to full 1600x1200 without shrinking anything!

Apple uses some variation of Adobe software (PDF-compatible something or other - ask a Mac fanboi to explain) and they don't have problems with scaling, AFAIK. Meanwhile, Windows still uses pixel counts for everything instead of sizes in human scales.
 
Westinghouse is working on a 2160p display, 3,840 x 2,160 (they call it Quad HD)

http://crave.cnet.com/8301-1_105-9671238-1.html

The tech is out there, but there are several barriers:

1) Cost - The yield for a panel of that res is VERY low, making it VERY expensive
2) Demand - There is virtually no demand for anything that high-res yet.
3) Support - No video sources on the mass-consumer market go over 1080p for TV, and computer video cards are just now pushing 3D apps at 2560 x 1600
 
I enjoy the tiny fonts and such. Since I've been using primarily laptops for so long, I was really put off when I started LCD shopping. I'm very adjusted to these, and it will take me a while to adjust to the poorer res of desktop LCDs.

I made a post on another forum asking where I could find a monitor with equal resolution. My 17" widescreen is 1680x1050, the typical resolution of a 22". Such a thing does not exist as far as I'm aware.
 
3) Support - No video sources on the mass-consumer market go over 1080p for TV, and computer video cards are just now pushing 3D apps at 2560 x 1600

Linux already supports high resolution displays for the most part. All you need to do is verify that the dpi setting is right and make sure text-antialiasing is turned on. On my 17" laptop display running at 1920x1200, the text is just beautiful.

If you had a 200dpi display, running 3d apps would not be an issue because you would effectively have a second "native" resolution of 100dpi (comparable to current displays e.g. 20", 30"). Even a 150-166dpi panel would give you a reasonable second "native" resolution.
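To put rough numbers on that (a back-of-the-envelope Python sketch; the panel sizes are just examples):

```python
# Pixel-density math for the "second native resolution" point above.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, measured along the panel diagonal."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1600, 30)))  # 30" 2560x1600 desktop panel -> 101
print(round(ppi(1920, 1200, 17)))  # 17" 1920x1200 laptop panel  -> 133
# A 200 ppi panel driven at half the pixels in each dimension
# behaves like a 100 ppi display, i.e. comparable to today's monitors.
```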

There were some grayscale medical-imaging displays released some time ago offering high DPI, but I haven't heard anything since. Then again, that's nothing spectacular if you calculate the resolution of your typical color monitor "digital camera"-style, which gives you 300dpi -- just remove the color from the pixels and tweak the panel driver.

But yeah, it's a big shame that no one has yet stepped up to fill this need. Anyone up for a 30" 200dpi LED-backlit monitor? I know I sure would be :D
 
If Windows rendered the screen in a way that was fully scalable, it would be great. I'm all for shrinking pixels to the point where you can't even see them anymore... but the way Windows works now, everything just gets too small to see.

I know you can enlarge fonts, but other aspects of the Windows UI don't change size AFAIK.

Yeah, that would be a problem. I was assuming that Windows already had a way to increase the size of the interface. Wake up, Microsoft.
 
There were some grayscale medical-imaging displays released some time ago offering high DPI, but I haven't heard anything since. Then again, that's nothing spectacular if you calculate the resolution of your typical color monitor "digital camera"-style, which gives you 300dpi -- just remove the color from the pixels and tweak the panel driver.

They weren't just for medical purposes and they weren't grayscale. Are you thinking of this monitor as well?
 
I too would be ALL OVER a 20" 1920x1200 or 22-24" 2560x1600... I love high-res laptop screens; since the dot pitch is so low, things look very detailed. However, to get 2560x1600 I would have to buy a 30" screen, and for 1920x1200 a 24" screen right now, which is just not enough resolution to justify an upgrade compared to my equal-dot-pitch 20" Dell 2005FPW S-IPS widescreen LCD. 1920x1200 @ 24" would just mean a larger screen, not a more finely detailed one; same deal with a 30" @ 2560x1600. :(
 
I too would be ALL OVER a 20" 1920x1200 or 22-24" 2560x1600...:(

http://www.engadget.com/2007/10/24/lg-philips-unveils-20-8-inch-qxga-lcd-for-the-medical-realm/

or 22-24" 2560x1600...:(

As a previous poster mentioned there is always the IBM T221, or the Viewsonic VP2290b running at a resolution way higher than 2560x1600 http://www.trustedreviews.com/displays/review/2004/06/30/ViewSonic-VP2290b-High-Resolution-TFT/p1

Well, let us know how it goes. Expect to empty out your savings account and go into debt :p

But yeah, very small market, and all the good sizes are grayscale.
 
A smaller monitor with a higher resolution has smaller pixels; if that ratio gets extreme, it may result in reduced detail. That's why you can't buy a 22" with a 2k+ res.

But can you imagine how powerful a graphics card you'd need to play games at that big a resolution?
 
A smaller monitor with a higher resolution has smaller pixels; if that ratio gets extreme, it may result in reduced detail. That's why you can't buy a 22" with a 2k+ res.

But can you imagine how powerful a graphics card you'd need to play games at that big a resolution?
An IBM T221/Viewsonic VP2290b will do 3840x2400. Specialized video cards are required to drive the monitor at full resolution/refresh rate (Parhelia HR256 comes to mind, but a few of the FireGL cards will do it too). A monitor like this is definitely not for gaming as the video cards that support the monitor are not designed for it and neither is the monitor (50ms response time for example). If you really want one, I almost bought a brand new one that went for ~$2000 on eBay (which is very rare since they are out of production now), so you will occasionally find one. Was tempting, but I couldn't find the Matrox Parhelia HR256 anywhere at a reasonable price (as in under $1000).
 
Careful, those 3840x2400 screens have some very odd resolution and refresh rate specs.

The officially supported maximum refresh rate at native resolution depends on how many TMDS links are used: single-, double-, and quad-link support 13, 25, and 41 Hz respectively. With reduced blanking periods, single, double, and quad TMDS links can reach 17.0, 33.72, and 41 Hz respectively. The monitor's native refresh rate is 41 Hz.
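Those figures line up with simple link-bandwidth math. A rough Python sketch, assuming a 165 MHz pixel clock per TMDS link and CVT-style reduced blanking of about 160 extra pixels per line and 18 extra lines per frame (approximations for illustration, not the monitor's exact timings):

```python
# Why refresh rate scales with TMDS link count on a 3840x2400 panel.
ACTIVE_W, ACTIVE_H = 3840, 2400
TOTAL_W, TOTAL_H = ACTIVE_W + 160, ACTIVE_H + 18  # assumed reduced blanking
LINK_CLOCK_HZ = 165e6                             # per single TMDS link
PANEL_MAX_HZ = 41                                 # panel's native limit

for links in (1, 2, 4):
    refresh = links * LINK_CLOCK_HZ / (TOTAL_W * TOTAL_H)
    print("%d link(s): ~%.1f Hz" % (links, min(refresh, PANEL_MAX_HZ)))
# Roughly matches the 17.0 / 33.72 / 41 Hz figures quoted above.
```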
 
Careful, those 3840x2400 screens have some very odd resolution and refresh rate specs.
Not a problem for me. This would by no means be for anything requiring a high refresh rate (games, movies, etc.). I know what this monitor is for, so I know exactly what to expect out of it. I thought the newer revision did 48 Hz. 41 Hz is plenty for Photoshop, however. :p
 
A smaller monitor with a higher resolution has smaller pixels; if that ratio gets extreme, it may result in reduced detail. That's why you can't buy a 22" with a 2k+ res.

But can you imagine how powerful a graphics card you'd need to play games at that big a resolution?

Eh, no. More pixels == more resolved detail on screen. So what you get is a much sharper image, and at those DPIs you are approaching print resolution, so it would be more like looking at a photograph than anything.

Many modern video cards wouldn't have a problem driving the later revisions of the T221 (dual-link DVI).

As for games, with such a high DPI you get an effective second native resolution which will look just as if you were on a current-generation monitor, i.e. on the T221 you can use standard 1920x1200 and it should look just as sharp as if it were a native HD panel.
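The "second native" trick works because the scaling factor is an exact integer, so each logical pixel maps to a crisp NxN block with no interpolation blur. A minimal sketch:

```python
# Check whether a lower mode is a "second native" resolution for a panel:
# both dimensions must divide evenly, by the same integer factor.
def is_second_native(panel, mode):
    (pw, ph), (mw, mh) = panel, mode
    return pw % mw == 0 and ph % mh == 0 and pw // mw == ph // mh

print(is_second_native((3840, 2400), (1920, 1200)))  # True: exact 2x2 blocks
print(is_second_native((3840, 2400), (2560, 1600)))  # False: needs interpolation
```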
 
Not a problem for me. This would by no means be for anything requiring a high refresh rate (games, movies, etc.). I know what this monitor is for, so I know exactly what to expect out of it. I thought the newer revision did 48 Hz. 41 Hz is plenty for Photoshop, however. :p

Yeah, I thought they upped the rate. In any case, most movies are what, 23, 25, or 29 fps (yeah, no nice ratio to 41 or 48 Hz, but you could always run at a slower refresh rate - it's not as if you get flicker on an LCD panel, unlike CRTs). Only FPS games will be somewhat affected.
 
Yeah, I thought they upped the rate. In any case, most movies are what, 23, 25, or 29 fps (yeah, no nice ratio to 41 or 48 Hz, but you could always run at a slower refresh rate - it's not as if you get flicker on an LCD panel, unlike CRTs). Only FPS games will be somewhat affected.
It still has a 50ms response time, so movies wouldn't look very good. I have my FW900 for movies though, so I'll be fine (and it is 0.3" bigger :D).
 
I considered an FW900 a few years ago, but with its weight and other "old technology" drawbacks, I decided to go with an LCD instead. Lower res, sure, but the 19" Acer POS was a hold-me-over for the Samsung 226BW, which is now a Samsung LN32A550 1080p.

The FW900 was too involved to make work properly for my purely gaming needs.
 
Eh, no. More pixels == more resolved detail on screen. So what you get is a much sharper image, and at those DPIs you are approaching print resolution, so it would be more like looking at a photograph than anything.

Many modern video cards wouldn't have a problem driving the later revisions of the T221 (dual-link DVI).

As for games, with such a high DPI you get an effective second native resolution which will look just as if you were on a current-generation monitor, i.e. on the T221 you can use standard 1920x1200 and it should look just as sharp as if it were a native HD panel.
That's right, but only if we're talking about video/photo content. The OS's GUI will be very small, with harder-to-read fonts, icons, symbols, websites, etc. In games the graphics can look cool, but the game's HUD will be small and hard to use too, and you need a very good graphics card that can render at the monitor's very high native resolution.

For example, this was available on CRT monitors: some 15" or 17" models were able to display 1600x1200 or higher, resolutions recommended for 19" or larger CRTs, but it was very hard to work at them.
 
The T221 is definitely not a general usage monitor. To give you an idea, here is one next to a 30" ACD. If you want high resolution in a smaller size (and lesser wallet hit), SGI made a 17.3" widescreen LCD that did 1600x1024.
 
That's right, but only if we're talking about video/photo content. The OS's GUI will be very small, with harder-to-read fonts, icons, symbols, websites, etc. In games the graphics can look cool, but the game's HUD will be small and hard to use too, and you need a very good graphics card that can render at the monitor's very high native resolution.

Only if you use Windows. Linux already has good support for resolution independence. In GNOME, all I need to do is set the DPI to match the monitor, and everything looks just fine. I believe OS X has some good scaling as well.

For example, this was available on CRT monitors: some 15" or 17" models were able to display 1600x1200 or higher, resolutions recommended for 19" or larger CRTs, but it was very hard to work at them.

The problem with CRTs running at insane resolutions is that, first, you lose sharpness: the pixels become more blurry, as opposed to the razor sharpness of an LCD. Secondly, you need to run at a low refresh rate (which would be fine for an LCD), but on a CRT there is a significant amount of flicker, and those of us who are flicker-sensitive can't stand it (migraines, eye strain, etc.).
 
+1 for the original poster. Toshiba used to make a laptop with a 15" screen @ 1600x1200 (UXGA), the 5105-S607. I liked the fine dot pitch. Today, even as my eyes get older, I still like the idea of higher pixel densities, even if just to smooth edges by scaling fonts.
 
Only if you use Windows. Linux already has good support for resolution independence. In GNOME, all I need to do is set the DPI to match the monitor, and everything looks just fine. I believe OS X has some good scaling as well.

Resolution independence and a vector-based UI were dropped from Leopard, AFAIK. It was the one thing I was looking forward to. :(
 
Resolution independence and a vector-based UI were dropped from Leopard, AFAIK. It was the one thing I was looking forward to. :(

Yes, true resolution independence was dropped, but some components still remain. I believe there is a tweak that enables scaling (think Windows' DPI setting) that enlarges the widgets and all, but a lot of the stuff isn't vector-based yet, so it looks pixelated.
 