Does a 1920x1200@120Hz monitor even exist?

It doesn't exist. 1920x1080 at 120Hz for LCD monitors is all you'll find.
 
The way things have gone with previous 120Hz monitor releases, there's talk about them happening well over a year in advance of their release. To date there has not even been a rumor that any of the monitor manufacturers are thinking about making a 120Hz 1920x1200. It's not totally impossible that one might be made someday, but plan on it never happening. Same goes for 120Hz 2560x1600 and 120Hz IPS. Not a hint, whisper or word.
 

That's pretty fucking pathetic.

Something is wrong when your video card(s) cost more than your monitor.

Oh well, thanks for the answer. Case Closed.
 
16:9 is a better ratio for games, and 120Hz is useless for anything other than games.

The best gaming monitor is the FW900. It doesn't do 120Hz at 1920x1200 (it maxes out at about 96Hz in that res), but it doesn't matter, as there is no single GPU that can do 120Hz in new games at that res...
 

How is 16:9 better for games? And the single-GPU argument is just pointless. Tons of people have dual-GPU setups...
 

Because modern games are HORIZ+ and not VERT-. Wider aspect ratio = wider field of view. Except for the games with fixed fields of view, but an overwhelming majority of games are designed for a target resolution of 16:9 because they are console ports. So filling to 16:10 gives you the same FOV but the game looks stretched, or gets letterboxed. Either way, 16:9 is the current ideal resolution to play games at for almost any major title. Ergo, "gaming" 120hz monitors are targeted for that.
 
filling to 16:10 gives you the same FOV but the game looks stretched, or gets letterboxed.

What game of the last 5 years has produced a stretched look in 16:10? As for the letterbox games, yeah they're console ports.

Either way, 16:9 is the current ideal resolution to play games at for almost any major title.

How can that possibly amount to anything more than your own opinion?

Anyway there's a thread specifically for the 16:9 vs 16:10 debate. This isn't it.
 
I don't think you'll ever see 1920x1200 120Hz.

Reason being, 120Hz is an entertainment/gaming feature. All HD movies are 16:9, and most games are built with a 1920x1080 resolution and aspect ratio in mind. Yes, they support 1920x1200, but the majority of users have 1080p monitors anyway, so companies might as well cater to the majority and not the minority.

1920x1200, or 16:10, is more or less a workstation aspect ratio. With 16:10, you can have two pages of a document side by side on your monitor and not have to scroll to see the entirety of each page.
 
To my knowledge and understanding, for now we will not see any resolution higher than 1080p for 120hz monitors. This is a technical limitation due to available bandwidth via screen cables.
 
How is 16:9 better for games?
16:9 has a larger FOV or viewing area. Also, more and more games don't support 16:10, adding just black bars...

And the single-GPU argument is just pointless. Tons of people have dual-GPU setups...
Dual-GPU doesn't decrease input lag. Worse, it often increases it. More input lag defeats the purpose of having 120Hz in the first place...

ps. people who think 100fps on dual-gpu = 100fps on single gpu are IDIOTS :rolleyes:
 
Because modern games are HORIZ+ and not VERT-. Wider aspect ratio = wider field of view. [...] Ergo, "gaming" 120hz monitors are targeted for that.

Nonsense rumor that is disgusting to see repeated here at [H]. A whole armada of ignorance created by that SC2 .gif (one of the few games designed so foolishly as to actually reduce the horizontal field of view for 16:10 monitors with the exact same 1920 horizontal pixels).
 
16:9 has a larger FOV or viewing area. Also, more and more games don't support 16:10, adding just black bars...

Dual-GPU doesn't decrease input lag. Worse, it often increases it. More input lag defeats the purpose of having 120Hz in the first place...

ps. people who think 100fps on dual-gpu = 100fps on single gpu are IDIOTS :rolleyes:

I've tried to restrain myself in the past (you've only been here four months) but can do so no longer. You're a complete dumbass and I'd be surprised if you aren't banned for idiocy very soon.

Edit for the OP: Don't hold your breath. Idiots are driving the development of monitor "technology". My advice is to grab one of the older, quality 16:10 LCDs (or better yet an FW900) and hold out for OLED. The display technology industry is in the toilet and the future outlook isn't promising.
 
Yeah, it won't exist any time soon. 120Hz is for games and movies, like people have said, and both of those are targeting 1080p.

One reason I'm considering upgrading from my 16:10 24" is that BF3 is HORIZ+ (http://widescreengamingforum.com/dr/battlefield-3), so you get a better FOV with 16:9. It's not that 16:10 isn't supported, it's just a zoomed view rather than a vertically extended view. Also, with regard to frames/s in modern FPS games: with settings on LOW, I'm certain you could easily reach 120fps in BF3 with a single GTX 580. Most pro gamers play with settings on low for max FPS.
 
Nonsense rumor that is disgusting to see repeated here at [H]. A whole armada of ignorance created by that SC2 .gif (one of the few games designed so foolishly as to actually reduce the horizontal field of view for 16:10 monitors with the exact same 1920 horizontal pixels).

It's not nonsense. I've never even seen whatever .gif you're talking about. I work at a professional game development studio and I can tell you that 90% of our testbeds are using 16:9 displays, with a few minor 16:10 and 4:3 and I think we might have one 5:4 left over for unit testing.

If you design your game to be HORIZ+, which I can assure you any intelligent game designer in the last 10+ years has been doing, that means as the aspect ratio gets wider, you see more of the game. On narrower ratios, either the FOV shrinks or the viewport simply cuts pixels from the left/right; one of these happens.

If you have a game that will display more detail going from 4:3 to 16:10, it will display still more information moving to 16:9, and more so if you go Eyefinity. That's how HORIZ+ works. The only exceptions are fixed FOV/viewport games, which are locked to one specific one (usually 16:9) and letterbox anything else. If it doesn't lock the camera's FOV or viewport, then you're either increasing the FOV or displaying more visual data.

There isn't really an argument to this; it's just how it works. You either allow more detail to be shown as the aspect ratio gets larger, meaning 16:9 will see (slightly) more than 16:10, or you lock it to one FOV for fairness (more common in competitive games) and let it distort/letterbox on other ones (which, due to current gaming trends, is almost always going to target 16:9 foremost).

EDIT: Also, 1920 horizontal pixels means literally nothing in comparing two viewable areas. A screen that was for some reason 1920x400 would display triple the amount of information that a 1920x1200 would. The 1920x400 has an aspect ratio of 4.8:1, while the 1920x1200 is 1.6:1. Despite the fact that one has three times the vertical pixel count, what matters is the ratio used for calculating the viewport to display. While trying to play with 4.8:1 would look awkward in any game that doesn't expand the FOV as well (most non-Eyefinity-supported games), it would still be more game data despite occupying a third of the display space.
 
There is a 0% chance of that happening. 1920x1200 screens are out in favor of 1920x1080 displays. :(

Personally, I like the 16:9 ratio over 16:10, but not at the cost of real estate. If 16:9 is going to be the dominant standard, then I'm holding out for cheaper 2560×1440 screens for instance.
 
Ramza, you're telling me that increasing vertical resolution doesn't increase vertical field of view. Yet, a squished display with the exact same number of horizontal pixels, simply because of its wider aspect ratio, displays more information?

A 1920x400 screen displays three times the information of a 1920x1200 screen? Are you listening to yourself? If you really are a game designer (actually I believe it) I weep for the future of computing.

You either allow more detail to be shown as the aspect ratio gets larger, meaning 16:9 will see (slightly) more than 16:10, or you lock it to one FOV for fairness (more common in competitive games) and let it distort/letterbox on other ones (which, due to current gaming trends, is almost always going to target 16:9 foremost).

Right...or you could allow more detail to be shown as the resolution gets larger. But that doesn't make sense :rolleyes:.

Even if what you're saying is true, do you realize how incredibly sad that is? A wider aspect ratio always means more information, regardless of resolution? In what universe does that make the slightest bit of sense? What kind of game designer accepts that and programs around it?
 
I think when people say 16:9 gives you more FOV than 16:10 they simply mean the aspect ratio and don't have a specific resolution in mind, which is correct. However, if you take 1920x1080 vs 1920x1200, the FOV on 1920x1200 is greater on the vertical side and the same on the horizontal side.

But if you took the vertical resolution of 1200 and applied the 16:9 aspect ratio to that, you'd end up with something like 2133x1200, which would have an all-around greater FOV than 1920x1200.

But they don't make a 2133x1200 monitor, so comparing 1920x1200 to 1920x1080 is kind of apples and oranges, because they don't make a 16:9 monitor with 1200 vertical resolution.

The same logic applies to 2560x1600 and 2560x1440. At 30" vs 27" you essentially get the same FOV, just in a smaller monitor.

ya dig
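
(To check the arithmetic above, here's a minimal Python sketch; width_for is just a hypothetical helper. The 2133 figure comes from scaling the 1200-pixel height by 16:9.)

    # Width of a 16:9 frame at a given vertical resolution.
    def width_for(height, aspect_w, aspect_h):
        return round(height * aspect_w / aspect_h)

    print(width_for(1200, 16, 9))   # 2133 - the hypothetical 2133x1200
    print(width_for(1440, 16, 9))   # 2560
    print(width_for(1600, 16, 10))  # 2560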
 
Stay on topic

XoR, you're very confused; do more research. FoV, aspect ratios, and resolutions are three completely separate things. The way a game handles them, by defaulting to a higher FoV because of a wider aspect ratio, doesn't mean that either resolution has an advantage over another.

The focus here is screens.

I am already running a Sony FW900, but the uphill battle of driver updates breaking any and all custom resolution and refresh rate support is getting very time-consuming and annoying. The bezels are also huge (not ideal for Eyefinity), a larger area would be nice, and while the picture is much better without the anti-glare coating, daytime use is sometimes slightly annoying.

I've been waiting for SED, FED, and good quality screens for over 5 years. Instead, what I got was gimmicky eye-tearing 3D using 20-year-old shutter glasses technology.
 
@stevedave
FOV is resolution independent. It's also ratio independent. It's just that a large FOV needs a wide screen or it will look ridiculous, and because of that game developers limit FOV. The same applies to a wide screen with a small FOV - it looks ridiculous => it needs a larger FOV to look good.

It's best to find a game with adjustable FOV and experiment with various settings and aspect ratios to understand how it works and why game developers use HOR+.

edit://
kohan69
??????????????
 
To my knowledge and understanding, for now we will not see any resolution higher than 1080p for 120hz monitors. This is a technical limitation due to available bandwidth via screen cables.
Bingo...

From Wikipedia:

"WUXGA (1,920 × 1,200) @ 120 Hz with CVT-RB blanking (2 x 154 MHz)"

I wonder what role CVT-RB blanking plays.
 
There's horizontal FoV and vertical FoV. The question is: if you go from 16:9 to 16:10, do you increase V FoV or decrease H FoV...
 
What's HORIZ+? Google doesn't help me out.
Hor+ (Horizontal Plus)
A Hor+ game is a game that, when played on a widescreen monitor with a widescreen resolution, expands the horizontal component of the FOV while keeping the vertical component roughly or exactly the same. This is often considered the ideal solution for widescreen games, as it grants widescreen users a wider picture.

http://widescreengamingforum.com/article/screen-change

This post has a good screen cap comparison of how BF3 handles 16:9 vs 16:10
http://hardforum.com/showpost.php?p=1037828119&postcount=59
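
To make Hor+ concrete, here is a minimal Python sketch of the usual relationship (a common formulation, not taken from any particular engine; hfov_degrees is a hypothetical helper): the vertical FOV is held fixed and the horizontal FOV is derived from the aspect ratio.

    import math

    # Hor+: vertical FOV stays fixed; horizontal FOV expands with aspect ratio.
    def hfov_degrees(vfov_degrees, aspect):
        v = math.radians(vfov_degrees)
        return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

    print(hfov_degrees(60, 4 / 3))    # ~75.2 degrees
    print(hfov_degrees(60, 16 / 10))  # ~85.5 degrees
    print(hfov_degrees(60, 16 / 9))   # ~91.5 degrees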
 
I am currently running a Sony FW900, but the uphill battle of driver updates breaking any and all custom resolution and refresh rate support is getting very time-consuming and annoying.

If you are currently running a FW900 there is absolutely nothing on the market remotely comparable. There likely won't be for at least 3-5 years. That's your answer.
 
To my knowledge and understanding, for now we will not see any resolution higher than 1080p for 120hz monitors. This is a technical limitation due to available bandwidth via screen cables.

Bingo...

From Wikipedia:

"WUXGA (1,920 × 1,200) @ 120 Hz with CVT-RB blanking (2 x 154 MHz)"

I wonder what role CVT-RB blanking plays.

I don't get it - is that a DVI limitation? Why not offer it through VGA, HDMI or DisplayPort?

How do 21:9 2560x1080 displays work then?

edit: Wow, it seems you're right. We're stuck at this refresh rate for bandwidth reasons!!
DisplayPort version 1.2 was approved on December 22, 2009. The most significant improvement of the new version is the doubling of the effective bandwidth to 17.28 Gbit/s.

How is the bandwidth measured? 1920x1080@120Hz is 2074MB 120 times a second? Wouldn't that be 250 gigabytes per second? Or 2 terabits of bandwidth?
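
(For reference, the per-frame arithmetic - worked out in full a few posts down - goes like this; 1920x1080 is about 2.07 million pixels per frame, not 2074 MB:)

    # Uncompressed data rate for 1920x1080 @ 120Hz, 24-bit color.
    pixels_per_frame = 1920 * 1080          # 2,073,600 pixels (~2.07 MP)
    bytes_per_frame = pixels_per_frame * 3  # 3 bytes/pixel -> ~6.2 MB per frame
    bytes_per_second = bytes_per_frame * 120
    print(bytes_per_second / 1e6)           # ~746.5 MB/s
    print(bytes_per_second * 8 / 1e9)       # ~5.97 Gbit/s, not terabits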


If you are currently running a FW900 there is absolutely nothing on the market remotely comparable. There likely won't be for at least 3-5 years. That's your answer.

I sadly acknowledge that, but there's something terribly wrong with the market when a 12-year-old product outperforms any current one.

Sure, CRTs have nearly a century's lead in R&D on LCDs, but it's still discomforting.
 
There's horizontal FoV and vertical FoV. The question is: if you go from 16:9 to 16:10, do you increase V FoV or decrease H FoV...

Both, depending on how you look at it. But at some point you're going to get more out of the 16:9 because of the way our eyes work - we see more horizontally than we do vertically. I think our eyes are actually closer to 2.35:1, so we could stretch the screen even more to utilize more of our vision.
 
Copying, pasting, and editing from posts I have made previously:

1680x1050 = 1,764,000 pixels
1,764,000 x 60Hz = 105,840,000 pixels per second
105,840,000 x 24 color bits per pixel = 2,540,160,000 bits per second, or 2.54 Gbits/sec
To get values for 120Hz, merely double the 60Hz values. So:

1680x1050@60Hz = 2.54 Gbits/sec
1920x1080@60Hz = 2.99 Gbits/sec
1920x1200@60Hz = 3.32 Gbits/sec
Single-link DVI effective data rate: 3.96 Gbits/sec
1680x1050@120Hz = 5.08 Gbits/sec
2560x1440@60Hz = 5.31 Gbits/sec
2560x1600@60Hz = 5.90 Gbits/sec
1920x1080@120Hz = 5.97 Gbits/sec
1920x1200@120Hz = 6.64 Gbits/sec
Dual-link DVI effective data rate = 7.92 Gbits/sec
HDMI 1.3/1.4 effective data rate = 8.16 Gbits/sec
DisplayPort 1.0/1.1 effective data rate = 8.64 Gbits/sec
2560x1440@120Hz = 10.62 Gbits/sec
2560x1600@120Hz = 11.80 Gbits/sec
Displayport 1.2 effective data rate = 17.28 Gbits/sec
HDMI Type B effective data rate = 20.40 Gbits/sec

If I have made any errors, let me know. I am not 100% certain about some of the maximum speeds for the cables - I have read that effective data rates are 80% of symbol rates, and I only want to list effective data rates.

I have read that AMD video cards started using Displayport 1.2 with the 6000 series, and that Nvidia cards generally don't use Displayport, and those few that do are still at 1.1. HDMI Type B has not been seen on consumer devices.

Going by my list, 1920x1080 and 1920x1200 should be possible on Dual-Link DVI, HDMI 1.3 and Displayport 1.0 and up.
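
As a sanity check, here is the same arithmetic as a minimal Python sketch (active pixels only, 24-bit color, no blanking - see the correction below; gbits_per_sec is a hypothetical helper):

    # Uncompressed video bandwidth: active pixels x refresh rate x bits per pixel.
    def gbits_per_sec(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    # Effective link rates from the list above, in Gbit/s.
    links = {
        "single-link DVI": 3.96,
        "dual-link DVI": 7.92,
        "HDMI 1.3/1.4": 8.16,
        "DisplayPort 1.0/1.1": 8.64,
        "DisplayPort 1.2": 17.28,
    }

    needed = gbits_per_sec(1920, 1200, 120)  # ~6.64 Gbit/s
    for name, rate in links.items():
        print(name, "- enough" if rate >= needed else "- too slow")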

Edit:
ToastyX said:
Those values don't include blanking. [...] That should still fit within dual-link DVI, which is 2 x 165 MHz links = 330 MHz.

Thanks ToastyX, blanking and pixel clocks are totally beyond me. As long as we know dual-link DVI could do 1920x1200@120Hz, and HDMI 1.3/1.4 and Displayport 1.0/1.1 are a bit faster, that's what's important here.
 
Both, depending on how you look at it. But at some point you're going to get more out of the 16:9 because of the way our eyes work - we see more horizontally than we do vertically. I think our eyes are actually closer to 2.35:1, so we could stretch the screen even more to utilize more of our vision.

Insanity. If an application shows less information with more pixels, the only thing to blame is the programmer.
 
evilsofa said:
Going by my list, 1920x1080 and 1920x1200 should be possible on Dual-Link DVI, HDMI 1.3 and Displayport 1.0 and up.
Those values don't include blanking.

1920x1080 @ 60 Hz is typically 2200x1125 @ 60 Hz = 3.564 Gbps or 148.5 MHz pixel clock
1920x1200 @ 60 Hz CVT-RB would be 2080x1235 @ 59.95 Hz = 3.696 Gbps or 154 MHz pixel clock
1920x1200 @ 120 Hz would need to be 2080x1271 @ 120 Hz = 7.614 Gbps or 317.25 MHz pixel clock

That should still fit within dual-link DVI, which is 2 x 165 MHz links = 330 MHz.



xorbe said:
Insanity. If an application shows less information with more pixels, the only thing to blame is the programmer.
It's not that simple. You say more pixels. More pixels compared to what? It's completely arbitrary, because 16:9 does not always mean fewer pixels than 16:10.

23" 2048x1152 (16:9) = 2359296 pixels
24" 1920x1200 (16:10) = 2304000 pixels

Not only that, 3D games aren't based on pixels, so which monitor should get the advantage?
 
(Cool V is right, went off-topic.)
 
16:9 is a better ratio for games, and 120Hz is useless for anything other than games.

The best gaming monitor is the FW900. It doesn't do 120Hz at 1920x1200 (it maxes out at about 96Hz in that res), but it doesn't matter, as there is no single GPU that can do 120Hz in new games at that res...

You say 16:9 is better for games, then say that the best gaming monitor is the FW900... but its native and recommended resolution is 16:10, as is its highest supported (2304x1440) :confused:

I love how this thread turned into an aspect ratio war again.
 
xorbe said:
Cripes, you KNOW that we're referring to 1920x1200 versus 1920x1080. Total argument fail. :rolleyes:
Games have to work with more than just 1920x1200 and 1920x1080, so how can you blame the programmer when you can't even think beyond two possible scenarios?

You fail to understand that your statement doesn't make any sense unless you compare two specific resolutions in relation to each other, but then the comparison is completely arbitrary because 3D games are not based on pixels, so how do you decide which resolutions should show more information and which resolutions should show less?
 
1920x1200 @ 120 Hz would need to be 2080x1271 @ 120 Hz = 7.614 Gbps or 317.25 MHz pixel clock

That should still fit within dual-link DVI, which is 2 x 165 MHz links = 330 MHz.
I don't understand. 1920x1200 @ 120 Hz = 2080x1271 @ 120 Hz? Different pixel count with the same frequency makes them equivalent? :confused:
 
=========================
Several threads deal with 16:9 FoV in games and cover it much more informatively than this thread.
============================

16:10 vs 16:9 - what you prefer?



The 1920 x 1080 vs 1920 x 1200 thread


=============================

Aspect ratio in itself is more like a lens - it does not mean pixels or resolution. A wider virtual lens (not a wider pixel count, mind you) will show a wider field of view at the same virtual viewing distance. This is virtual cinematography using virtual cameras.

Wiki http://en.wikipedia.org/wiki/Field_of_view_in_video_games

HOR+ (Horizontal Plus) scaling is the most common scaling method for computer games released after 2005. The FOV in height is fixed while the FOV in width is expandable depending on the aspect ratio of the monitor resolution; a wider aspect ratio gives a wider FOV. The FOV is independent of how high the monitor resolution is. For instance, the FOV will be the same for 1366x768 and 1920x1080 because both resolutions are 16:9. Any 16:9 resolution will always have a wider and bigger field of view than any 16:10 or 4:3 resolution.
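
A minimal sketch of that last point (the same Hor+ formula as sketched earlier; hfov_for is a hypothetical helper) - the FOV depends only on the ratio width/height, not on the pixel counts:

    import math

    # Under Hor+, FOV follows the aspect ratio of the resolution only.
    def hfov_for(vfov_degrees, width, height):
        v = math.radians(vfov_degrees)
        return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))

    print(hfov_for(60, 1366, 768))   # ~91.5 degrees (16:9-ish)
    print(hfov_for(60, 1920, 1080))  # ~91.5 degrees (16:9)
    print(hfov_for(60, 1920, 1200))  # ~85.5 degrees (16:10)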

==========================

[Image: eyefinity_config-aspects-visualized_sm.jpg - Eyefinity aspect ratios visualized]
 
To my knowledge and understanding, for now we will not see any resolution higher than 1080p for 120hz monitors. This is a technical limitation due to available bandwidth via screen cables.

No. DisplayPort v1.2 doubles the maximum raw link rate from 10.8 Gbps (gigabits per second) to 21.6 Gbps (17.28 Gbps effective).

The HDMI 1.4b spec was also just announced last month, though it still tops out at 10.2 Gbps raw (8.16 Gbps effective).

1920x1080@120Hz, 24-bit = 5.97 Gbps
1920x1200@120Hz, 24-bit = 6.64 Gbps (before blanking)

Also, everyone should just move their debate here: http://hardforum.com/showthread.php?t=1635939
Edit: Looks like elvn beat me to the punch.
 
xDezor said:
I don't understand. 1920x1200 @ 120 Hz = 2080x1271 @ 120 Hz? Different pixel count with the same frequency makes them equivalent? :confused:
There are gaps between each line and each frame that aren't shown by the monitor.

1920x1200 is the active (visible) pixel count, 2080x1271 is the total pixel count including blanking.

1920 pixels active + 48 pixels front porch + 32 pixels sync width + 80 pixels back porch = 2080 pixels total

1200 lines active + 3 lines front porch + 4 lines sync width + 64 lines back porch = 1271 lines total

The total needs to be considered when calculating bandwidth.
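
In code form, the same arithmetic (a minimal sketch using the CVT-style blanking figures given above):

    # Bandwidth including blanking: total pixels, not just active ones, count.
    active_w, active_h = 1920, 1200
    total_w = active_w + 48 + 32 + 80   # front porch + sync + back porch = 2080
    total_h = active_h + 3 + 4 + 64     # = 1271 lines
    refresh_hz = 120

    pixel_clock = total_w * total_h * refresh_hz
    print(pixel_clock / 1e6)        # ~317.2 MHz pixel clock
    print(pixel_clock * 24 / 1e9)   # ~7.61 Gbit/s at 24 bits per pixel
    # Dual-link DVI allows 2 x 165 MHz = 330 MHz, so this just fits.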
 