GeForce GTX 580 vs. Radeon HD 5970 2GB Performance @ [H]

I don't understand how you have more desk space with 16:9 versus 16:10. One is 1920x1080 and the other is 1920x1200, so 16:10 has the same horizontal pixels and more vertical pixels, which makes more pixels total. How is that less space?

I don't see how anyone would argue for less vertical height. If you want more width, okay, but why take it at the cost of height when you can have both with 16:10?

16:10 and 16:9 are the same width when we are talking about the desktop; 16:10 is just taller. The reason SirGCal thinks he is getting more space is that his eye has been fooled into thinking 16:9 is wider because it isn't as tall. It's just an illusion.

Most people think 16:10 is better for desktop.

Edited this based on SirGCal's post above. The game situation is different: you do see more on a 16:9 display than on a 16:10 one. But that's mainly because games are mostly console ports now, and the FOV isn't done right for 16:10, which is why it looks squished. I think the scaling options in the graphics card control panel can solve this problem, but I don't know for sure, as I only have a 16:9 display myself.
 
16:10 and 16:9 are the same width when we are talking about the desktop; 16:10 is just taller. The reason SirGCal thinks he is getting more space is that his eye has been fooled into thinking 16:9 is wider because it isn't as tall. It's just an illusion.

Yeah, that's what I think too - pixels don't lie.
 
You're looking at pixels, not actual use. If the image is the same height on both monitors, the 16:9 model has more to the sides. It's all about the aspect ratio and the point (or field) of view, and I want the better field of view. If you use the desktop the same way, you have more usable area. If you're an 'everything is fullscreen' person, then you have no idea about utilizing desktop space in a development environment anyhow, so it doesn't matter what resolution you're running.

I run dual 22-inch 16:10 monitors.

I do a lot of work in Access and SQL with databases, and I prefer 16:10 for the work I do.
 
Sure, you'd sum the squared deviations from the mean and divide by the number of FPS measurements in the sample, which gives you the variance (the standard deviation is its square root). But then all you get is a number, and that number means absolutely squat to 99% of people, as it takes a pretty decent quantitative background to interpret it.

Not to mention, you'd need to present it as a probability distribution so you could see graphically whatever skewness or kurtosis is present.

Then again, me, and about one other person might find that cool :p
The rest would say WTF.

The number would be meaningful: it's the "low framerate" value. People understand that. I call it the normal low.
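
If anyone actually wants to compute it, here's a minimal Python sketch of what's being described. The FPS samples are made up, and "normal low" is read here as mean minus one standard deviation, which is just one plausible interpretation of the term:

[code]
import statistics

# Hypothetical per-second FPS samples from a benchmark run (made-up numbers).
fps_samples = [62, 58, 61, 47, 55, 60, 59, 33, 57, 61, 56, 54]

mean_fps = statistics.mean(fps_samples)

# Variance: sum of squared deviations from the mean, divided by the
# number of measurements; the standard deviation is its square root.
variance = sum((x - mean_fps) ** 2 for x in fps_samples) / len(fps_samples)
std_dev = variance ** 0.5

# One plausible "normal low": one standard deviation below the mean.
normal_low = mean_fps - std_dev

print(f"mean: {mean_fps:.1f} fps  std dev: {std_dev:.1f}  normal low: {normal_low:.1f} fps")
[/code]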
 
The move by manufacturers to make *primarily* 16:9 instead of 16:10 is a rip-off to consumers. It probably has something to do with manufacturing costs, particularly being able to run a single fabrication process for both HDTVs and monitors, but it still sucks. This recent move to 16:9 as the common ratio instead of 16:10 is a step BACKWARD from monitors of only a few years ago.

yes
 
Here's an interesting example: http://www.nvnews.net/vbulletin/showthread.php?t=152594&page=4. With most games locking the vertical FOV, 16:9 shows you more actual information, properly formatted, because it gets a wider horizontal FOV. 16:10 can show the same by forcing the FOV wider, but that squishes the image and, to me, looks like crap.

A broken FOV is the fault of a broken game. It can usually be fixed, because PC games can be modified.

1920x1080 is smaller than 1920x1200, period.
 
A broken FOV is the fault of a broken game. It can usually be fixed, because PC games can be modified.

1920x1080 is smaller than 1920x1200, period.
Not for games. Nearly all modern games are Hor+, so 16:9 simply adds a little more on the sides.
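
To put numbers on the Hor+ point: with the vertical FOV held fixed, the horizontal FOV follows from the aspect ratio. Here's a quick Python sketch of that relationship (60 degrees vertical is just an example value; actual games vary):

[code]
import math

def horizontal_fov(vertical_fov_deg, aspect):
    """Horizontal FOV implied by a locked vertical FOV (the Hor+ model)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

print(horizontal_fov(60, 16 / 9))   # ~91.5 degrees on 16:9
print(horizontal_fov(60, 16 / 10))  # ~85.5 degrees on 16:10
[/code]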
 
I think I can solve all of this pretty easily...

When we're comparing 1920x1200 versus 1920x1080, 1920x1200 inherently contains 1920x1080. In a desktop workspace environment, it's impossible to make any successful argument that 1920x1080 offers more space because you can accomplish exactly the same thing by ignoring 120 vertical pixel rows on a 1920x1200 monitor (which, interestingly, leaves us with... I dunno, 120 rows of pixels to allocate for some productive purpose?). Even for gaming, if pure 16:9 is better for a certain game, then you can run at 1920x1080 without losing anything.

The only valid argument for 1920x1080 over 1920x1200 is that of price. A monitor of 1920x1080 resolution with the same width as a monitor of 1920x1200 resolution will have both fewer pixels and less screen space. Those two factors make the 1920x1080 monitor cheaper to produce, and in turn should mean cheaper to sell. If you aren't interested in the extra 120 rows of pixels, then you're going to save money on a 16:9 monitor so... go with that. Personally, I'll stick with my 16:10.
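
For the record, the raw arithmetic behind that argument, nothing fancy:

[code]
full = 1920 * 1200     # 2,304,000 pixels on the 16:10 panel
cropped = 1920 * 1080  # 2,073,600 pixels when emulating 16:9 on it
print(full - cropped)                    # 230,400 extra pixels (the 120 spare rows)
print(100 * (full - cropped) / cropped)  # ~11.1% more screen real estate
[/code]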
 
Exactly. SirGCal seems to be confusing games (which are Hor+; that's less bad than Vert-, but still bad, since games should scale per pixel in all directions) with the desktop. 16:10 offers more desktop space, and you can always display 1920x1080 on it and get the same FOV in Hor+ games. Best of both worlds, and much better than 16:9.
 
I think NVIDIA screwed everyone with the 4xx release instead of adjusting the chip to the 5xx specs first. But AMD was running wild with the 5xxx cards, claiming tessellation blah blah and capturing market share, so they had to release the 4xx. As an AMD user, though, I must say the NVIDIA 5xx chips outclass the AMD 6xxx, including the 69xx that's coming out as well. I'll wait to see the results of the 69xx, but I might be turning green with my next video card.
 