Understanding Resolutions and Displays

I am learning about the technical side of resolutions and displays, and related topics like the sampling theorem and signal processing. I'm doing this just for fun, because I find it a cool subject.

What's the difference between counting pixels in a resolution by A) how many pixels there are, B) how many rows/columns of pixels there are, and C) the length of pixels? I am only familiar with counting the number of pixels, i.e. 1024x768 means there are 1,024 pixels horizontally and 768 pixels vertically. But an article I just read said that a number like 1024x768 could also be the number of rows of pixels, which I've never heard of.

And I believe counting by the length of pixels refers to PPI (pixels per inch). Am I correct?

EDIT - OK, I think I get the difference now. When it's an image resolution, it's counted as pixels per column/row, and when it's a display resolution, it's counted as the total number of pixels horizontally and vertically. Am I correct?
 
I don't know what you're talking about. 1024x768 only has one meaning: 1024 horizontal pixels (columns), 768 vertical pixels (rows).
 
But an article I just read said that a number like 1024x768 could also be the number of rows of pixels, which I've never heard of.
Do you mean like 720p and 1080p? Both of those refer only to the number of rows, and then the columns are implied by the 16:9 aspect ratio.
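For instance, here's a quick Python sketch of how the column count falls out of the row count (the function name is just illustrative):

# Columns implied by a row count and a 16:9 aspect ratio.
def columns_for_rows(rows, aspect_w=16, aspect_h=9):
    return rows * aspect_w // aspect_h

print(columns_for_rows(720))   # 1280 -> 720p is 1280x720
print(columns_for_rows(1080))  # 1920 -> 1080p is 1920x1080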

And I believe counting by the length of pixels refers to PPI (pixels per inch). Am I correct?
If you compare two monitors and measure the length of x pixels in a line on each, the monitor with the smaller measurement for x pixels has a higher pixel density and therefore more pixels per inch.

The other way of looking at it: if you measure a length y on both monitors, the one with more pixels inside your measurement of y has a higher pixel density and therefore a higher pixels-per-inch count.
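To put the same idea in code, here's a tiny Python sketch (the names are made up for illustration); both measurements reduce to the same pixels-per-inch ratio:

# Pixel density from a counted run of pixels and a measured length.
def ppi_from_pixel_run(pixel_count, measured_length_in):
    return pixel_count / measured_length_in

print(ppi_from_pixel_run(100, 1.06))  # ~94 PPI: 100 pixels span 1.06 inches
print(ppi_from_pixel_run(133, 1.0))   # 133 PPI: 133 pixels fit in 1 inch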
 
I don't know what you're talking about. 1024x768 only has one meaning: 1024 horizontal pixels (columns), 768 vertical pixels (rows).

Not according to everything I just read, and I read a lot of articles and Wikipedia entries.
 
Do you mean like 720p and 1080p? Both of those refer only to the number of rows, and then the columns are implied by the 16:9 aspect ratio.


If you compare two monitors and measure the length of x pixels in a line on each, the monitor with the smaller measurement for x pixels has a higher pixel density and therefore more pixels per inch.

The other way of looking at it: if you measure a length y on both monitors, the one with more pixels inside your measurement of y has a higher pixel density and therefore a higher pixels-per-inch count.

Yeah, I learned that after I posted. There is so much to this stuff that after only reading a little bit, like I had when I posted this, I wasn't sure about a lot of things, but after more reading I've figured out a lot more.
 
Pixels have the same meaning for monitors, for pictures, and for videos. Each pixel is a colored dot -- on a modern LCD screen, usually made up of three subpixels: one red, one green, and one blue. The three subpixels light up in combination to produce a wider spectrum of colors.

An image will have color data for each pixel. So an image taken with an old camera might have a resolution of 800 x 600. The camera would record data for 480,000 pixels (800 columns, 600 rows -- multiply them to get the total number of pixels). The data is the color value.
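If it helps, here's a minimal Python sketch of that layout (the names are illustrative, not any particular library's API):

# An 800x600 image as a flat, row-major list of RGB color values,
# one triple per pixel.
width, height = 800, 600
total_pixels = width * height        # 480,000
image = [(0, 0, 0)] * total_pixels   # all black to start

def pixel_at(x, y):
    return image[y * width + x]      # row y, column x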

When you look at the image on a monitor, if the monitor's resolution is greater than the resolution of the image, you will be able to view the full image at 100% resolution (unscaled). Each pixel would show the color value recorded in the image data.

The only complication is with "scaling." You can view a 24,000,000-pixel image on a monitor that only shows 480,000 pixels, but either the monitor or the video card will analyze the image data and reduce its complexity. A single pixel on the monitor would then show an aggregate of about fifty pixels of data from the image (24,000,000 / 480,000 = 50).

Scaling is a complicated process, and it can produce mixed results. You get a clearer image when one pixel from your input (video or image) maps cleanly to one pixel on your monitor.
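Here's a rough Python sketch of the simplest kind of scaling, box averaging, where each output pixel aggregates a block of input pixels. Real scalers (bilinear, bicubic, Lanczos) are more sophisticated, but the basic idea is the same:

# Downscale a row-major list of (R, G, B) pixels by an integer factor s:
# each output pixel is the average of an s x s block of input pixels.
def box_downscale(img, w, h, s):
    out_w, out_h = w // s, h // s
    out = []
    for oy in range(out_h):
        for ox in range(out_w):
            acc = [0, 0, 0]
            for dy in range(s):
                for dx in range(s):
                    r, g, b = img[(oy * s + dy) * w + (ox * s + dx)]
                    acc[0] += r
                    acc[1] += g
                    acc[2] += b
            n = s * s
            out.append((acc[0] // n, acc[1] // n, acc[2] // n))
    return out, out_w, out_h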
 
Resolution in its true meaning is the number of dots per unit of length, not per unit of area. Total pixel count means nothing. A 100x100 display 1" wide and 1" high can be up to twice as sharp as a 50x50 display 1" wide and 1" high, even though its total number of pixels is four times as high.

In relation to human vision, it is the center-to-center pixel spacing expressed in arcseconds, arcminutes, or degrees.
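In Python terms, assuming a 1" display and a 24" viewing distance just as example values:

import math

# Linear density: the 100x100 inch-square display has twice the px/inch
# of the 50x50 one, even though it has four times the total pixels.
for side in (50, 100):
    print(side * side, "pixels total,", side, "px/inch")

# Angular spacing: center-to-center pixel pitch as an angle at the eye.
def pixel_pitch_arcmin(ppi, viewing_distance_in):
    pitch_in = 1.0 / ppi
    theta = 2 * math.atan(pitch_in / (2 * viewing_distance_in))
    return math.degrees(theta) * 60

print(round(pixel_pitch_arcmin(100, 24), 2))  # ~1.43 arcminutes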
 
Resolution in its true meaning is the number of dots per unit of length, not per unit of area. Total pixel count means nothing.

I'm not sure where you got that definition from. I think you're thinking of resolution density or pixel density or something like that.

Resolution is literally the total pixel count, and I think many will argue it is the ratio of horizontal to vertical pixels as well.

A 24" monitor at 1920 x 1200 and a 17" monitor at 1920 x 1200 both have the same resolution. The 17" monitor has a higher PPI (pixels per inch), though.
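You can check those PPI numbers yourself with a quick Python sketch:

import math

# PPI from pixel dimensions and the diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 24)))  # ~94 PPI on the 24" monitor
print(round(ppi(1920, 1200, 17)))  # ~133 PPI on the 17" monitor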
 
I'm not sure where you got that definition from. I think you're thinking of resolution density or pixel density or something like that.

Resolution is literally the total pixel count, and I think many will argue it is the ratio of horizontal to vertical pixels as well.

A 24" monitor at 1920 x 1200 and a 17" monitor at 1920 x 1200 both have the same resolution. The 17" monitor has a higher PPI (pixels per inch), though.
Read up on resolution, please. Not the bastardized term referring to the total number of elements.
 