So why did they skip 1440p and 2K to go right to 4K?

Subzerok11
Was wondering why we went straight from 1080p to 4K. Why not 1440p for TVs and monitors? 1440p has been around for a while, so why not use it?
 
Is it because, if you watch a 1080p video on a 1440p TV/monitor, it has to stretch the image and make it slightly blurry? On a 4K (3840 × 2160) monitor, since it's double the pixels in each direction, a 1080p video would not have to go through a blurring filter to match the monitor's native resolution; instead you can just double the pixels and retain the video's sharpness.

I've heard of some people having scaling issues and whatnot with 1440p; basically, icons and text were too small and didn't scale correctly for some. Would this also be a non-issue with a 4K monitor?
 
No one "skipped" anything. There are plenty of monitors with resolutions between that of 1080P and 4K.

In terms of TVs specifically, it's different. You have a bit of a chicken-and-egg issue. Resolutions higher than 1080p are a hard sell for TVs because there isn't much content above 1080p. 1080p is already more than good enough for most people (most can't even tell the difference between 1080p and 720p). 4K represents enough of a boost over 1080p (both in terms of real benefits and BS marketing fluff) that people are buying into it. Enough people are buying 4K TVs that 4K content should become increasingly common, which will further drive sales. That itself wouldn't work if there weren't one single new resolution the industry was standardizing on.

Edit: In reference to your 2nd post, scaling was supposed to be one of the big benefits of 4k, yes. It is supposed to be able to display 1080p content by simply doubling up pixels vertically and horizontally. Unfortunately, in practice, few 4k TVs actually do this, and for whatever reason 1080p images end up going through the shit scaler and coming out blurry :mad:
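For what it's worth, the "doubling up pixels" being described is just nearest-neighbour scaling by an integer factor. A minimal sketch in Python (assuming numpy is available; the all-black frame is only a placeholder) of what a lossless 1080p-to-UHD mapping looks like:

```python
import numpy as np

# Placeholder 1080p frame: height x width x RGB channels.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)

# Nearest-neighbour 2x upscale: repeat each pixel twice vertically and
# twice horizontally, so every source pixel becomes an exact 2x2 block
# on the UHD panel. No interpolation, no blurring.
frame_uhd = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)

print(frame_uhd.shape)  # (2160, 3840, 3)
```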
 
Do not put TVs and monitors in the same bag. There are tons of 1440p monitors.

For TVs, as said above, the simplest approach is doubling the HD resolution.
 
UHD 4k (3840x2160) is an even 2x multiple of 1920x1080 and a 3x multiple of 1280x720, while a 2560x1440 monitor can't evenly scale 1080p.
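To put numbers on "evenly scale", here's a rough sketch (plain Python, using the resolutions from this thread) that checks whether a source resolution fits a panel by an exact integer factor:

```python
# Return the integer scale factor if src fits dst exactly, else None.
def integer_scale(src, dst):
    src_w, src_h = src
    dst_w, dst_h = dst
    if dst_w % src_w == 0 and dst_h % src_h == 0 and dst_w // src_w == dst_h // src_h:
        return dst_w // src_w
    return None

uhd = (3840, 2160)
qhd = (2560, 1440)

print(integer_scale((1920, 1080), uhd))  # 2    -> clean pixel doubling
print(integer_scale((1280, 720), uhd))   # 3    -> clean pixel tripling
print(integer_scale((1920, 1080), qhd))  # None -> needs interpolation (blur)
```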

Also, "2k" isn't really a resolution and if it were would probably be used to refer to 1080p, since the new 4k/5k/8k resolutions refer to the horizontal instead of the vertical resolutions as 480/720/1080/1440/1600(p/i) did.
 
What will become the next standard resolution for most PC gamers, 1440p or 4K? From what I've seen, 1440p has only a very small share of the market according to some statistics, and it's been out for years. So is 4K going to become the PC standard once consoles go 4K capable in the next generation, in the next few years?
 
Guys.. 1080p (1920x1080) → 4K (3840x2160) is not double the resolution. It's 4x!
 
@ SolidBladez: Depends on how you look at it. Since the shorthand version only uses the vertical resolution, saying 4k (2160) is double 1080 is correct. From a pixel standpoint it absolutely is 4x, which is why whenever I talk about 4k TVs in real life I always say that they should have called it 4x.
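If it helps, the two ways of counting are easy to lay side by side (a quick Python sketch of the arithmetic, nothing more):

```python
# 1080p vs UHD: linear (per-axis) ratio vs total pixel ratio.
w1, h1 = 1920, 1080
w2, h2 = 3840, 2160

linear_ratio = h2 / h1               # 2.0 -> "double the resolution"
pixel_ratio = (w2 * h2) / (w1 * h1)  # 4.0 -> "four times the pixels"

print(linear_ratio, pixel_ratio)
```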
 
1080 + 1080 is 2160, so why do they call it 4K? Shouldn't it be called 2K? Sorry if this sounds like a dumb question; maybe I just don't understand something.
 
Lots of corrections for the above:

4K is 4096x2160.
UHD is 3840x2160.
2K is 2048x1080.
HD is 1920x1080.

MadVR scales 1080 to 1440 lines well enough that it's less of an issue for IQ than the limitations of LCD itself.
 
I'm looking at an ad in the paper for HH Greg. In the ad it says 4K has four times the resolution of full HD, which basically means 1080p. But why do they call it four times the resolution? 1080 + 1080 is 2160, which is only twice, not four times.
 
I'm looking at an ad in the paper for HH Greg. In the ad it says 4K has four times the resolution of full HD, which basically means 1080p. But why do they call it four times the resolution? 1080 + 1080 is 2160, which is only twice, not four times.

Marketing... bigger numbers sound better. They are also referring to pixel count, in which case 4 times is correct.
 
Marketing... bigger numbers sound better. They are also referring to pixel count, in which case 4 times is correct.

I'm not too sure why you say it's a marketing thing but then point out that it is legitimately correct; you'll just confuse the poor guy! :p

To answer SubZero's question, you're forgetting the horizontal component of the resolution. Yeah, 1080 + 1080 = 2160 is only twice the resolution vertically, but then you also have twice the resolution horizontally, to equal 4 times the resolution.
 
I say call it according to megapixel count. That would actually make the numbers mean something... But oh well. The damage is done. :)
 
To answer SubZero's question, you're forgetting the horizontal component of the resolution. Yeah, 1080 + 1080 = 2160 is only twice the resolution vertically, but then you also have twice the resolution horizontally, to equal 4 times the resolution.

That's exactly where the problem is with multiplying resolutions. Resolution, in precise scientific terms, is a quantity measured along a single dimension, since in general the resolution in different directions (e.g. vertical, horizontal, diagonal) might not be the same (this is the case with the resolution of the human eye, where it is measured in lines/mm). Multiplying then makes no real sense. In the case of equal vertical and horizontal resolutions, one can take a shortcut and say that 4K is twice the resolution of 2K.
 
Back then I thought it was strange that there were no 1440p/1600p TVs, and I wanted one because I always wanted a TV screen bigger than 40" that supported 1440p or 1600p. But once 2160p arrived, it pretty much changed my mind; I have one now and I'm quite happy with it.

Though I wonder why the industry is labeling 3840x2160 as "4K" when it really isn't; it should have been called 3.8K or something. Can somebody explain why it's called 4K, just like 7680x4320 is called "8K" when it really isn't?
 
3840 × 2160 = 8,294,400
1920 × 1080 = 2,073,600

8,294,400 ÷ 2,073,600 = 4

Yep, UHD is 4 times the resolution/pixels of 1080.
 
Back then I thought it was strange that there were no 1440p/1600p TVs, and I wanted one because I always wanted a TV screen bigger than 40" that supported 1440p or 1600p. But once 2160p arrived, it pretty much changed my mind; I have one now and I'm quite happy with it.

Though I wonder why the industry is labeling 3840x2160 as "4K" when it really isn't; it should have been called 3.8K or something. Can somebody explain why it's called 4K, just like 7680x4320 is called "8K" when it really isn't?

4K refers to the horizontal resolution of 4096. That's actually a real standard. However, televisions are usually 3840, which is actually UHD. 3840x2160 is an even multiple of 1080p, which makes upscaling 1080p much easier. The marketers tend to round up or down whenever it suits them. Hard drive capacities are rounded down to 1000 instead of 1024. Whatever the case, the name 4K stuck, as it just rolls off the tongue better than UHD. For what it's worth, most if not all 4K UHD TVs support 4096x2160 scaled. So I guess technically, they support true 4K resolution, just not natively.
 
I thought 2K was 2048×1152; there were a couple of monitors made with it.

Dell SP2309W
Samsung 2343BWX
 
Man up and buy a 4k display with appropriate GPU power.

Then you won't care "why". ;)
 
Back then I thought it was strange that there were no 1440p/1600p TVs, and I wanted one because I always wanted a TV screen bigger than 40" that supported 1440p or 1600p. But once 2160p arrived, it pretty much changed my mind; I have one now and I'm quite happy with it. Though I wonder why the industry is labeling 3840x2160 as "4K" when it really isn't; it should have been called 3.8K or something. Can somebody explain why it's called 4K, just like 7680x4320 is called "8K" when it really isn't?

4K refers to the horizontal resolution of 4096. That's actually a real standard. However, televisions are usually 3840, which is actually UHD. 3840x2160 is an even multiple of 1080p, which makes upscaling 1080p much easier. The marketers tend to round up or down whenever it suits them. Hard drive capacities are rounded down to 1000 instead of 1024. Whatever the case, the name 4K stuck, as it just rolls off the tongue better than UHD. For what it's worth, most if not all 4K UHD TVs support 4096x2160 scaled. So I guess technically, they support true 4K resolution, just not natively.

Indeed, strictly speaking K = 1024, so 2K, 4K, 8K refer to multiples of 1024, i.e. 2048 and 4096. But in the television area, the base of HD was selected as 1920. This has its origin in ancient times, when the TV standard was 640x480, computer terminals of 320x240 existed, and the number 120 played an important role. When the 16:9 format was selected for TV as giving a better experience, a reasonable base number was 120, and this is how they came to the 1920x1080 of HD. But in pure computer terms, powers of two are better, and the closest is 2048, so the term "2K" started life describing the HD format. Further life was given to 2048 by the DCI standard of digital cinema, where 2048x1080 was standardized as the format for movies shown in digital cinemas. Why did they do this? Just to differentiate from the consumer format, making an exclusive format that cannot be directly shown on consumer devices. When 4K came, we ended up with a doubling of both formats: 3840x2160 and 4096x2160. This became a bit messy, so there is now a trend in the TV area to use the UHD term instead of 4K.
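For anyone following along, the arithmetic in the post above works out like this (a quick sketch; the base-120 derivation is the claim made above, not an official spec):

```python
# The relationships described above, spelled out.
base = 120
hd = (base * 16, base * 9)               # (1920, 1080): 16:9 "Full HD"
dci_2k = (2**11, 1080)                   # (2048, 1080): DCI 2K cinema format
uhd = (hd[0] * 2, hd[1] * 2)             # (3840, 2160): consumer UHD / "4K" TV
dci_4k = (dci_2k[0] * 2, dci_2k[1] * 2)  # (4096, 2160): DCI 4K cinema format

print(hd, dci_2k, uhd, dci_4k)
```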

4K refers to the horizontal resolution of 4096. That's actually a real standard. However, televisions are usually 3840, which is actually UHD. 3840x2160 is an even multiple of 1080p, which makes upscaling 1080p much easier. The marketers tend to round up or down whenever it suits them. Hard drive capacities are rounded down to 1000 instead of 1024. Whatever the case, the name 4K stuck, as it just rolls off the tongue better than UHD. For what it's worth, most if not all 4K UHD TVs support 4096x2160 scaled. So I guess technically, they support true 4K resolution, just not natively.

As explained above, the use of 4K became quite messy. Strictly speaking you are right, and UHD should now be used instead when dealing with 3840, but marketing does not change habits easily. Regarding support for true 4K resolution by UHD TVs, this is a non-issue in practice, since there is no 4K content, and even if there were, it would not be the best fit for a UHD panel; the best is native UHD material, which allows pixel-to-pixel display without scaling.

I thought 2K was 2048×1152; there were a couple of monitors made with it. Dell SP2309W, Samsung 2343BWX

Yes. There were also 1920x1200 16:10 monitors, there is a 4096x2160 monitor from LG, and of course 5K (5120x2880) monitors. But such nonstandard resolutions are a deep niche, since the standard resolutions of 1920x1080 and 3840x2160 rule.

3840 × 2160 = 8,294,400
1920 × 1080 = 2,073,600

8,294,400 ÷ 2,073,600 = 4

Yep, UHD is 4 times the resolution/pixels of 1080.

No, resolution is a term from physics and refers to a single-dimensional quantity. But then, you can equate a cart to a car, since they differ by only one letter.
 
To further muddy the 4k resolution 'standard'

[image: 4K digital film standards comparison]


You'll notice none of them perfectly matches the PC monitor/UHD 3840x2160, which used to be nicknamed "quad HD" since it is like 4 quadrants of 1920x1080 and, as others have stated already, can cleanly use pixel doubling from 1080p as an even multiple.

In my opinion there was a huge marketing push for 4K (in TVs especially, of course), but also on gaming review sites, where many would not even benchmark 1440p anymore and would only show 1080p and 4K results, and usually only 4K results for SLI benchmarks. This is very annoying, since 4K is limited to 60 Hz at this point (as well as 60 fps or less against GPU power in most cases), which is greatly inferior in motion clarity and motion definition to modern 120-144 Hz (plus variable refresh and modern gaming overdrive) monitors. It makes for pretty screenshots, though.
 
Lots of corrections for the above:

4K is 4096x2160.
UHD is 3840x2160.
2K is 2048x1080.
HD is 1920x1080.

MadVR scales 1080 to 1440 lines well enough that it's less of an issue for IQ than the limitations of LCD itself.

Isn't 720p HD, and 1080p Full HD?
 
4x the pixels is 2x the resolution. What's difficult to understand? Both "4x" or "2x" are valid descriptors for the difference as long as they are qualified with the proper term.
 
4x the pixels is 2x the resolution. What's difficult to understand? Both "4x" or "2x" are valid descriptors for the difference as long as they are qualified with the proper term.

Don't be a smart ass; it seems like a lot of people are getting this wrong too. Sorry, I don't live on these boards.
 
Don't be a smart ass; it seems like a lot of people are getting this wrong too. Sorry, I don't live on these boards.

Wasn't directed at you (or anyone else) in particular, I was just trying to address general "2x vs 4x" bickering. Both are correct as long as they are qualified.
 