
1366x768 native.. Why?

mange_

I'm having a hard time finding the logic in why LCD TV panels nowadays have a native resolution of 1366x768. If I remember correctly, the first LCD TVs were 1280x720 native, which makes a lot of sense seeing that 720p is 1280x720, so there's no need for upscaling there. But then all the manufacturers switched to 1366...?

Whenever I'm at a TV store I ask this question and the answers I get are borderline idiotic. Is there someone here who knows exactly why the industry chose this native res for LCD TV panels?

Thanks,
Mange
 
I have nary a clue, but the Xbox 360 can output at that slightly-higher-than-720p resolution, so that would be a good match-up if that were your goal.
 
Yeah, I know a lot of stuff supports that res, but I'm dead curious why the industry went for it.. And no one can give me an answer.. :( Logically it must be something to do with the manufacturing process, like it being easier to produce 1366x768 panels instead of 1280x720, but why? :)
 
If I remember right, it has something to do with the 5% overscan that most programs give you. If you have a 1280x720 panel, the overscan forces the TV to "zoom in" to hide that overscan area, and thus the display is no longer as crisp.
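Back-of-the-envelope, assuming 5% of the picture is cropped per axis (the exact amount varies by set):

Code:
# rough sketch of what ~5% overscan does to a 1280x720 panel (illustrative only)
src_w, src_h = 1280, 720             # incoming 720p frame
overscan = 0.05                      # assume 5% of the picture is cropped per axis

visible_w = src_w * (1 - overscan)   # 1216.0 -> only this much of the source survives
visible_h = src_h * (1 - overscan)   # 684.0

# the surviving area gets stretched back out to fill the full panel, so every
# pixel is resampled by a non-integer factor and crispness suffers
zoom = src_w / visible_w             # ~1.053
print(visible_w, visible_h, round(zoom, 3))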
 
If I remember right, it has something to do with the 5% overscan that most programs give you. If you have a 1280x720 panel, the overscan forces the TV to "zoom in" to hide that overscan area, and thus the display is no longer as crisp.

That is what I have heard.
 
Nah, I think it has to do with fabrication being easier when sticking with XGA-type resolutions or something. The overscanning idea doesn't make any sense because that's entirely dependent on the video circuitry (especially when fed a digital signal).

There's not really a whole lot to worry about though, since scaling of progressive video doesn't usually degrade image quality. 1:1 pixel mapping will always look sharper for text and such, but with the Xbox 360 it doesn't make any difference because the source is simply scaled up from 720p. Either way you shake it, you're not gonna get perfect mapping; however, sometimes TVs have really shoddy scalers, and letting the 360 do the scaling may produce a noticeable image quality increase. What you mainly have to watch out for is 1080i sources, since most 720p televisions convert from 1920x1080i down to 1920x540p, throwing away half the resolution in the process (but it'll still look good at smaller sizes).
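To put some numbers on it (just the arithmetic, not how any particular TV's scaler works):

Code:
# scale factors a 1366x768 panel has to apply to common sources (illustrative)
panel_w, panel_h = 1366, 768

sources = {"720p": (1280, 720), "1080i field (after 540p deinterlace)": (1920, 540)}
for name, (w, h) in sources.items():
    print(name, round(panel_w / w, 3), "x", round(panel_h / h, 3))
# 720p        -> ~1.067 x ~1.067 : non-integer, so every pixel gets resampled
# 1080i field -> the vertical detail was already halved before scaling even starts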
 
Please remember that the HD standards were set AFTER the VGA standards. This is one part of the problem. The VGA standard was around long before we started going HD. This is similar to people saying that Lord of the Rings was such a rip-off of Star Wars. Know thy roots.

Also, quite a few components out there, along with monitors, are optimized to sync with the standard heights of 600, 768, 1024, 1200. Therefore, to get the widescreen format with minimal change to the standards, it was "easier" to just multiply the 768 by 16/9. In short...cheap managers.
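The arithmetic behind that, for what it's worth:

Code:
# keep XGA's 768 lines and widen to 16:9
height = 768
width = height * 16 / 9      # 1365.33...
print(width)                 # rounds up to an even 1366
# (some panels/modes use 1360x768 instead so the width divides cleanly by 8)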

And lastly...1366x768 is NOT a recognized standard. This is why so many people have to use 3rd-party tools (PowerStrip, for example) to tweak their video card's settings to work with the device. It is quite sucky actually, and why I will NEVER buy a screen that isn't a supported standard.

The resolution standards in general sorta suck, though. In some cases there was some effort by IBM to come up with a reasonable upgrade path..but technology and desires now outweigh common sense. The wiki is pretty close to the truth in this case.
 
And lastly...1366x768 is NOT a recognized standard. This is why so many people have to use 3rd-party tools (PowerStrip, for example) to tweak their video card's settings to work with the device. It is quite sucky actually, and why I will NEVER buy a screen that isn't a supported standard.


It's not a standard TV resolution, but it's a standard monitor res. Either way, you don't need PowerStrip if you're using an Nvidia card. You'll likely need to tweak resolutions anyway to compensate for overscan, which is standard on all TVs, except for a few of the newest models. My resolution is actually closer to 700 vertical lines, as it is an edited 720p variant, adjusted for the overscan.
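Roughly the kind of adjustment I mean (the 3% per edge is just an example; you'd measure your own set's overscan):

Code:
# hypothetical sketch: derive an "underscanned" custom resolution so the
# desktop edges stay visible on a TV that crops roughly 3% per edge
def underscan(width, height, per_edge=0.03, step=8):
    w = int(width * (1 - 2 * per_edge)) // step * step   # snap to a multiple of 8
    h = int(height * (1 - 2 * per_edge)) // step * step
    return w, h

print(underscan(1280, 720))   # -> (1200, 672) with these example numbers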

zv
 
It's not a standard TV resolution, but it's a standard monitor res. Either way, you don't need PowerStrip if you're using an Nvidia card. You'll likely need to tweak resolutions anyway to compensate for overscan, which is standard on all TVs, except for a few of the newest models. My resolution is actually closer to 700 vertical lines, as it is an edited 720p variant, adjusted for the overscan.

zv

I cannot stress enough that it is not a standard monitor res. Just because some vendors have chosen to support it doesn't make it a standard. This is part of the difficulty. nVidia has always been very good about supporting the non-standard, while quite a few other manufacturers aren't.

I also can't quite agree with overscan being standard on all TVs anymore. It all depends on the source and the destination whether or not overscan is applied. When I play a DVD on a digital display, part of the picture isn't "cut off" anymore by the overscan..I get the whole thing. On a digital display, if set up correctly...overscan = ZERO. The same goes for streamed media. However, a TV tuner or an external TV-type digital box may "apply" overscan for you, and some TVs may intentionally hide part of the screen, thus forcing you to deal with overscan. This is actually quite a problem for me when dealing with fan-subs from Japan. Some of my friends have overscan at 10%+..which causes text to get clipped off if I make a DVD of it...but I have NO problem on my PC or my analog TV (~4% overscan).
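A rough way to see why the ~4% set is fine and the 10%+ sets clip (purely illustrative numbers):

Code:
# does text placed a given distance from the frame edge survive the overscan?
def is_clipped(text_margin, overscan_per_edge):
    return text_margin < overscan_per_edge

for ovr in (0.04, 0.10):
    status = "clipped" if is_clipped(0.05, ovr) else "visible"
    print(f"text 5% from the edge, {ovr:.0%} overscan -> {status}")
# ~4% overscan -> visible (my analog TV); 10%+ -> clipped (my friends' sets)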
 
Overscan is a function of the TV anyway. It's a relic of the old original tube TV days, when the signal was warped and crappy due to poor transmission technology. By overscanning, they could cut off the edges of the frames and make the picture look less distorted. I'm not sure how you define "standard res" for monitors, but 1366x768 is most certainly a "standard." No one uses PowerStrip for a monitor. You're thinking of TVs, dude. :confused:

Only recently, with the convergence of monitors and TVs, has overscan begun to be phased out, and often only on non-TV formats. HOWEVER, do a quick scan through the AVS forums and you'll find plenty of unhappy people with part of the HUD cut off in GoW.

You've made a lot of leaps and generalizations in that post. Be careful. Saying things like overscan isn't standard for TVs isn't right. There are currently only a handful of new sets that have zero overscan.

You CANNOT tweak your own overscan settings without utilizing the service menu of your TV, which is for professionals ONLY. You can totally screw up your set in there, so it should NOT be recommended. Most TVs don't even allow overscan adjustment in the service menu, only via a hardware flash or a new mainboard.

TV tuners and TV boxes have nothing to do with overscan. It's from the TV, not the source. There is absolutely no such thing as a source "applying" overscan.

You mentioned PowerStrip being used b/c of the 1366x768 res. That's not true. It's used more often to adjust for overscan, refresh rates, etc. when combining PCs and TVs. As the two universes collide, things like PowerStrip will become a relic.


The poor OP's head must be spinning. lol
 
TV tuners and TV boxes have nothing to do with overscan. It's from the TV, not the source. There is absolutely no such thing as a source "applying" overscan.

I'm sorry you don't understand my words...but that is okay.

You are correct that it is the destination that "applying" overscan at the source as it is applied at the receiver. However, I do sometimes recode, or adjust on the fly, my "source" to have a 5%-10% black band around it in order to tolerate TVs with higher overscan. This is an absolute must for my wife's workout videos, because the TV she watches them on has almost 20% overscan and she hates that important parts of the workout get cut off at the edges.
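In numbers, roughly what I do (the ~20% total crop matches her set; the DVD frame size is just for the example):

Code:
# sketch: shrink the picture and pad it with black so a set that crops
# ~10% per edge (~20% total) still shows everything important
def pad_for_overscan(width, height, total_overscan=0.20):
    inner_w = int(width * (1 - total_overscan))   # picture area that must survive
    inner_h = int(height * (1 - total_overscan))
    pad_x = (width - inner_w) // 2                # black band on left/right
    pad_y = (height - inner_h) // 2               # black band on top/bottom
    return inner_w, inner_h, pad_x, pad_y

print(pad_for_overscan(720, 480))   # NTSC DVD frame -> (576, 384, 72, 48)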
 
I'm sorry you don't understand my words...but that is okay. You are correct that it is the destination that "applying" overscan at the source as it is applied at the receiver.

LOL. what? It's no wonder I'm having trouble understanding you.

However, I do sometimes recode, or adjust on the fly, my "source" to have a 5%-10% black band around it in order to tolerate TVs with higher overscan. This is an absolute must for my wife's workout videos, because the TV she watches them on has almost 20% overscan and she hates that important parts of the workout get cut off at the edges.

20%!!?? I doubt any TV in history cut off one fifth of its image for overscan. Are you sure you're not referring to aspect ratio bands or something? Again, the quoted text above is escaping me. Are you saying you underscan the image at the source to compensate for the TV's inherent overscan? Why would black bars appear if you corrected for it?
 