2012 HDTVs?

Are you kidding me? What does that have to do with a display accepting a certain input? Why would I want to buy another device that isn't even going to fix a fundamental flaw of the display device?
 
A 120 Hz monitor is 120 Hz. A 120 Hz TV just makes up fake frames from a lower source frame rate.

That's rather weak :(

I thought I could hook up my HTPC video card to the TV and enjoy the benefits of gaming at 120 Hz and possibly less nausea when playing.
 
It is very weak. You're limited to an input of 60 Hz. The 120 Hz processing introduces lag and artifacts. I'd imagine it's very apparent in movies, seeing that it makes up 4 new frames for every frame.
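For concreteness, here is the arithmetic behind that "4 new frames" figure, as a quick sketch; it assumes a clean 24 fps source and ideal cadence detection by the set:

```python
# Frames a 120 Hz interpolating TV must invent per real source frame.
# Sketch only: assumes the set detects the source cadence perfectly.
def synthesized_per_source(source_fps, panel_hz):
    slots = panel_hz / source_fps      # refresh slots each source frame covers
    return slots - 1                   # all but one slot holds an invented frame

for fps in (24, 30, 60):
    n = synthesized_per_source(fps, 120)
    print(f"{fps} fps source on a 120 Hz panel: {n:.0f} interpolated frames per real frame")
# 24 fps -> 4, 30 fps -> 3, 60 fps -> 1
```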
 
It is very weak. You're limited to an input of 60 Hz. The 120 Hz processing introduces lag and artifacts. I'd imagine it's very apparent in movies, seeing that it makes up 4 new frames for every frame.

So are there any TVs that DON'T make up new frames for this? I'm trying to find a solution for the nausea problem, be it a monitor or a TV (was hoping for a TV).
 
You can turn off the processing, but you are limited to 60 Hz. I think it's too high a rate for you. If you are experiencing nausea, it is because your brain finds it too realistic and you're getting motion sickness. Higher framerates will make it worse.
 
You can turn off the processing, but you are limited to 60 Hz. I think it's too high a rate for you. If you are experiencing nausea, it is because your brain finds it too realistic and you're getting motion sickness. Higher framerates will make it worse.

Technically you are wrong that turning off the processing puts the display into a 60 Hz mode.

All the 120 Hz LCD panels do either interpolation OR frame repeat. When you disable the processing the screen still runs at 120 Hz; it just repeats each frame with a smaller blank in between, or inserts black frames. If it is a 240 Hz or 480 Hz screen, it is also doing black frame insertion combined with frame repeat.
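A rough sketch of the timing being described, assuming a 24 fps source and an idealized panel (real sets differ in how they shorten blanks or swap repeats for black frames):

```python
# Frame-repeat timing for a 24 fps source at several panel refresh rates.
# Sketch only: black-frame insertion replaces some repeats with black slots.
SOURCE_FPS = 24

for panel_hz in (120, 240, 480):
    repeats = panel_hz // SOURCE_FPS   # times each film frame is shown
    slot_ms = 1000 / panel_hz          # duration of one refresh slot
    print(f"{panel_hz} Hz: each film frame shown {repeats}x, {slot_ms:.2f} ms per slot")
# 120 Hz: 5x at 8.33 ms; 240 Hz: 10x at 4.17 ms; 480 Hz: 20x at 2.08 ms
```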

As far as what is coming in 2012, I'd say we'll see a push for QuadHD (3840x2160) for glasses-free 3D, more 4K news in general, and apps, tons of apps.

A big determining factor will be whether the iPad 3 is 2560x1536, because if a 9.7" screen is that dense, we'll see 4K on the desktop in the 27"-30" range. I could also see Apple waiting to do a TV display at 3840x2160, which would justify the "Apple tax" markup a lot more easily than another 1080p screen would, unless it was OLED and took 120 Hz input over Thunderbolt (which is DP 1.2 for video).
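As a sanity check on the Thunderbolt point: DP 1.2's ~17.28 Gbps payload rate is the published spec figure, and a rough back-of-the-envelope estimate (24-bit color, blanking folded into an assumed 20% overhead rather than exact CVT timing) suggests the link itself has headroom for both modes:

```python
# Back-of-the-envelope: does DisplayPort 1.2 (Thunderbolt's video layer)
# have room for 2160p60 or 1080p120? Assumes 24-bit color and a rough
# 20% blanking overhead -- an estimate, not exact CVT timing.
DP12_GBPS = 17.28                      # effective payload after 8b/10b coding

def needed_gbps(w, h, hz, bpp=24, overhead=1.20):
    return w * h * hz * bpp * overhead / 1e9

for name, (w, h, hz) in [("2160p60", (3840, 2160, 60)),
                         ("1080p120", (1920, 1080, 120))]:
    g = needed_gbps(w, h, hz)
    print(f"{name}: ~{g:.1f} Gbps needed, fits: {g <= DP12_GBPS}")
# Both come in well under 17.28 Gbps, so the link isn't the obstacle.
```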

A CE maker that big pushing for higher-DPI screens would be needed, unless glasses-free demand can push other companies too, because new resolutions are all about demand and profits.

In desktop displays we have 1080p because we want cheap screens, but if a 9.7-inch screen in an iPad is rocking 2560x1536, not many consumers are going to think 1080p is good enough.
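The density gap is easy to put numbers on. A quick sketch, using the rumored iPad resolution quoted above and a couple of assumed desktop sizes:

```python
# Pixel density (pixels per inch) for the resolutions discussed above.
from math import hypot

def ppi(w, h, diagonal_in):
    return hypot(w, h) / diagonal_in   # diagonal pixels / diagonal inches

print(f'Rumored iPad 3, 9.7": {ppi(2560, 1536, 9.7):.0f} ppi')   # ~308 ppi
print(f'1080p at 24":         {ppi(1920, 1080, 24):.0f} ppi')    # ~92 ppi
print(f'QuadHD at 27":        {ppi(3840, 2160, 27):.0f} ppi')    # ~163 ppi
```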

The thing is that Apple also has a history of high-resolution premium screens with their 30" Cinema Displays, and they can cater to the movie-making crowd by claiming they do 4K workflows best with a QuadHD screen.

Then, if the panels are around, Dell and HP will probably make use of them too, and both of them already offer DisplayPort. So it has to be 4K screens or bust in 2012.
 
I can't really make much sense of your post. Never did I claim that disabling puts the display in any sort of mode. Oh my, I can't wait till 4K! Just what I need: 4K of resolution for 24 FPS motion-blurred garbage. If movies went to 60 FPS, people would like that detail better than 4K 24 FPS CRAP.
 
If that's your budget, get a 46'' or 55'' Samsung LED, specifically the UN46D6000 or the UN55D6000. Personally, after having gone to PC Richards, Best Buy, and other stores, comparing different brands (Samsung, Sharp, Sony, Panasonic, etc.) and technologies (LED, LCD, Plasma), I find that the Samsung LEDs have the brightest, sharpest, crispest, and most vibrant displays. I'm sure Panasonic fans will tell you that the VTs have much better image quality, but from what I saw in person, in store, side by side, Samsung won for me. You should go to a Best Buy or wherever and compare for yourself; everyone has different preferences.

Personally, I have a 55'' Samsung UN55D6000, which I got from Best Buy during Black Friday for $999, and it's amazing. The ToC glass bezel and glass stand make it look extremely sleek and modern, and the ultra clearview screen at 120 Hz plays HD graphics-heavy movies such as Avatar, Tron, and Transformers 3 beautifully. In all seriousness, I have ditched any plans to go to the movie theater for anything less than the very best/most anticipated movies. The only downside of the Samsung is that the sound quality isn't really HiFi (certainly not bad by any means, but not theater quality, obviously), but if you have a 55'' HDTV then you really should have at least a 5.1 sound system.

I personally don't use 3DTV. I tried it in store and couldn't keep the glasses on for more than maybe 10 minutes. I don't think 3D technology has advanced to the point that it really provides an enjoyable home theater experience. However, if it's something you want in your TV, Samsung also offers 3D models of their TVs, starting with the 6300 series, all the way up to their 240 Hz 8000 series.

Don't listen to this guy. He's looking at these television sets in horrible Best Buy/store environments with the TVs in torch mode and uncalibrated. Any decent plasma will put the Samsung LCDs to shame. My parents have the Samsung UN60D8000, and my Panny ST30 absolutely puts it to shame for 1/3 the price.

Also, I find plasmas are PLENTY bright for most environments.
 
+1. No LCD can touch a plasma in picture quality and black levels.
 
Plasmas are great, but keep in mind the heat generated from these sets triggers my thermostat all the time, especially during the summer months. I've seen some pretty amazing LED sets, but I have yet to see an LED TV that is on par with a Pioneer or Panasonic plasma set.
 
I can't really make much sense of your post. Never did I claim that disabling puts the display in any sort of mode. Oh my, I can't wait till 4K! Just what I need: 4K of resolution for 24 FPS motion-blurred garbage. If movies went to 60 FPS, people would like that detail better than 4K 24 FPS CRAP.

Read a book?
 
Read a post?

I think Verge's post, although snarky, was also succinct. If you hate all movies because they are shot in 24 fps, then there is nothing we can do for you. The whole movie industry operates at that frame rate, and obviously people still like movies.
Apparently, since 24 fps is "motion-blurred garbage" to you, no movie is worth watching from a visuals perspective. It seems to me that Verge is just offering an alternative to something you can't stand, because there is no current video technology that can magically make new frames in any way that would really make a difference (nor will there ever be, because data can't just be created, like blowing up a JPEG that's too small to a higher resolution; generating data to interpolate frames is a band-aid at best).

I don't foresee the movie industry moving away from this standard anytime soon. Canon and Red have both recently released press on upcoming cameras, and for at least another 3-5 years the future will still be "24 FPS motion-blurred garbage", at true 4k and 5k resolution no less.
 
I just don't see the value in increasing resolution over something that is already good enough, especially when everything is terribly blurred with any kind of motion, and especially in making displays for a resolution that has no format even on the horizon, whatsoever. We already have to compress the crap out of 1080p material and give less resolution to color information. Are we going to stuff a 4k movie onto a Blu-ray disc? Is there any change cinema would even have to make to adopt 60 FPS? Does film even get shipped to theaters anymore? And then there's the desire to go to 48 FPS; for what reason, I don't know. If we just stuck with 60 FPS for everything, life would be easier. There is no technical reason why a 60 FPS movie would look like a soap opera, as many of those 24 FPS advocates blabber. If they were really technically versed at all, and not just yuppies with cameras and no clue, they would know that movies would not suffer from looking like a "home video".
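The disc question at least can be ballparked. This sketch assumes bitrate scales linearly with pixel count from a typical ~30 Mbps 1080p Blu-ray encode, which real codecs don't quite do, so treat it as a rough upper bound:

```python
# Ballpark: would a 4K movie fit on a 50 GB Blu-ray at Blu-ray-like quality?
# Assumes bitrate scales with pixel count from a ~30 Mbps 1080p encode;
# real codecs scale better than linearly, so this is an upper-bound estimate.
BD50_GB = 50
MOVIE_HOURS = 2

px_1080 = 1920 * 1080
px_4k = 3840 * 2160
mbps_4k = 30 * px_4k / px_1080                  # ~120 Mbps by naive scaling

size_gb = mbps_4k * 1e6 / 8 * MOVIE_HOURS * 3600 / 1e9
print(f"~{mbps_4k:.0f} Mbps -> ~{size_gb:.0f} GB for {MOVIE_HOURS} h "
      f"(a BD-50 holds {BD50_GB} GB)")
# ~108 GB: it doesn't fit without a better codec or a new disc format.
```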
 
I have to admit...I don't see the need to go to 4k resolution. It seems that this is strictly to try to sell new hardware and also to usher in a new storage format. I'm 53...I just don't see myself moving to a new format at this point...in my mind, BD is great and I'm happy to die there. I would like to see the motion blur fixed, though.
 
Well, 4k and 5k have their place in movie theaters. They've existed for some time because people want to see clarity on 100' screens. The only difference now is that 4k and 5k are relatively less expensive, and the cameras that shoot them are now smaller and lighter, which allows for more flexibility in shooting.

Whether or not that technology will trickle down to consumers anytime soon is a different story altogether. Personally, I don't see that happening. It has taken the better part of a decade to get people to move to 1080p. Heck, my parents don't even have an "HD TV". Most consumers don't want to invest in a new TV set that quickly, Blu-Ray doesn't have great penetration, there isn't a way to move HD movies quickly over broadband for a majority of users, not to mention I'm sure that broadcast television isn't looking for a new format to send over the airwaves or through cable/satellite. My prediction is that the consumer level will stay at 1080p for some years yet.

4k and 5k will start showing up in computer monitors soon enough, though. That much I think is true. That would benefit not only the film industry but also tech nerds like ourselves. 27-32" at 4k will be a sight to see.
 
LOL at everybody satisfied with their 1080. I have been watching 1080 for over 10 years and am long ready for something better. I had a 55" 1080 set in 2001, then 2 more since that, and currently have a 61"; I am moving to buy a 65" Panny VT or a 75" LaserVue DLP. There absolutely is a reason for 4K with screens 55" and larger - if you don't think so, you're blind as a bat. I am ready and have been for years.
I also game at 2560x1600. Once you've seen dense displays, there's no going back.
 
LOL at everybody satisfied with their 1080. I have been watching 1080 for over 10 years and am long ready for something better. I had a 55" 1080 set in 2001, then 2 more since that, and currently have a 61"; I am moving to buy a 65" Panny VT or a 75" LaserVue DLP. There absolutely is a reason for 4K with screens 55" and larger - if you don't think so, you're blind as a bat. I am ready and have been for years.
I also game at 2560x1600. Once you've seen dense displays, there's no going back.

Then get 4 CRT projectors and calibrate them to form a 2x2 array. Congrats...you can easily get 4k resolution. :D

And the blind-as-a-bat comment is very insulting. What you are saying is that everyone is blind as a bat. Humans, even the young ones, can only resolve so many pixels per inch at a set distance. Sorry, but you are far and away the exception to every reasonable and rational rule out there.
 
LOL at everybody satisfied with their 1080. I have been watching 1080 for over 10 years and am long ready for something better. I had a 55" 1080 set in 2001, then 2 more since that, and currently have a 61"; I am moving to buy a 65" Panny VT or a 75" LaserVue DLP. There absolutely is a reason for 4K with screens 55" and larger - if you don't think so, you're blind as a bat. I am ready and have been for years.
I also game at 2560x1600. Once you've seen dense displays, there's no going back.

See...folks can always make totally unqualified statements like this...I have a 67-inch TV and I sit 12 feet away...I find no need for more resolution...and I always wear up-to-date prescription glasses, so I'm not "blind as a bat". You're just daydreaming and thinking you're seeing something you're not. Specs are the only thing you "see".
 
It's pointless to go over 1080p for 35mm film. If it can display grain, going higher isn't going to do a damn thing. Scanning at 4k is what is proper, and then you resize down to 2k or 1080p. Even with all these crazy Red cameras, there's not much point to anything over 4k; you just capture the flaws of the optics the higher you go.

There's also the issue of proper viewing distance. I have to be within 5-6 feet of my 42" before getting closer reveals anything more. That's too close. For a normal 8 feet away, even 720p would be plenty, and I'm not blind as a bat.

And then there's the fact that there are barely any 4k sources; they're all just demos. There is no standard/format for this video type and nothing to store it. They need to worry about those of us who want to get rid of motion blur first. The panels can do it; all it would require is an input, and a standard change in Blu-ray if they wanted to actually go through with it. Blu-ray doesn't even support 1080p30, much less 1080p60. Monitors have 1080p120 DVI circuitry. The TV panels can already get a 120 Hz signal from whatever is controlling them. This would not be too difficult! It's really disgusting that even the "professional" TVs/displays aren't any better than the consumer lines.
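Those viewing-distance figures line up with the standard one-arcminute (20/20) acuity rule of thumb. A quick check, with the one-arcminute figure being the assumption:

```python
# Rule of thumb: 20/20 vision resolves about one arcminute, so a pixel
# that subtends less than that buys nothing. One arcminute is the assumption.
from math import radians, sqrt

def max_useful_distance_ft(diag_in, v_pixels):
    height_in = diag_in * 9 / sqrt(16**2 + 9**2)   # height of a 16:9 screen
    pixel_in = height_in / v_pixels                # size of one pixel
    return pixel_in / radians(1 / 60) / 12         # farthest resolvable, in feet

print(f'42" 1080p: ~{max_useful_distance_ft(42, 1080):.1f} ft')   # ~5.5 ft
print(f'42" 720p:  ~{max_useful_distance_ft(42, 720):.1f} ft')    # ~8.2 ft
# Consistent with the 5-6 ft and 8 ft figures above: beyond those
# distances the extra pixels are no longer resolvable.
```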
 
Blu-Ray doesn't have great penetration, there isn't a way to move HD movies quickly over broadband for a majority of users, not to mention I'm sure that broadcast television isn't looking for a new format to send over the airwaves or through cable/satellite.

How does Blu-ray not have any market penetration? There are old-ass movies that skipped right over DVD and are coming out on Blu-ray. Even old stuff can take advantage of HD, especially black-and-white film. Most yuppies are plenty satisfied with their Netflix/other streaming. The studios are even devoting resources to re-scanning shows shot on film to slap on Blu-ray. If you see a show pop up on iTunes in HD, or see it suddenly being broadcast in HD, you can be sure they are getting ready to release it to consumers on Blu-ray in the future. I will definitely be buying these shows. I don't care if a show was 4:3 and the new scans are 16:9; 16:9 is just more appealing to look at, and NOW it fills screens.
 
How does Blu-ray not have any market penetration? There are old-ass movies that skipped right over DVD and are coming out on Blu-ray. Even old stuff can take advantage of HD, especially black-and-white film. Most yuppies are plenty satisfied with their Netflix/other streaming. The studios are even devoting resources to re-scanning shows shot on film to slap on Blu-ray. If you see a show pop up on iTunes in HD, or see it suddenly being broadcast in HD, you can be sure they are getting ready to release it to consumers on Blu-ray in the future. I will definitely be buying these shows. I don't care if a show was 4:3 and the new scans are 16:9; 16:9 is just more appealing to look at, and NOW it fills screens.

Well, what I stated was anecdotal, but a quick trip to Google has already shown me that Blu-Ray market penetration is nothing in comparison with how many people have a simple DVD player. It's great that you're a big fan of HD content and purchasing it, but you're missing my point.

My point was that, with things as they are, film studios, broadcast TV, and providers of digital streaming have no desire to move to a higher-resolution format. My point about Blu-Ray is that it already illustrates the problem. If they can't get everyone to buy a Blu-Ray player, what makes you think they can get someone to buy a player capable of 4K? Or a TV capable of 4K? Or enough bandwidth on your net access to stream 4K?

That was my point. 1080p is here to stay for the foreseeable future. We could stand around and argue about how many people have a Blu-Ray player, but honestly that topic is something I care little about.
 
I just don't see the value in increasing resolution over something that is already good enough, especially when everything is terribly blurred with any kind of motion

You watching an LCD?


Maybe you should go pick up a good CRT set, or a CRT projector; motion resolution on LCDs is downright pitiful.
 
LOL at everybody satisfied with their 1080. I have been watching 1080 for over 10 years and am long ready for something better. I had a 55" 1080 set in 2001, then 2 more since that, and currently have a 61"; I am moving to buy a 65" Panny VT or a 75" LaserVue DLP. There absolutely is a reason for 4K with screens 55" and larger - if you don't think so, you're blind as a bat. I am ready and have been for years.
I also game at 2560x1600. Once you've seen dense displays, there's no going back.


You pre-order the Sony then?


There are CRTs that can do >1080p. You can blend 2 together and get even higher, probably close to 4k.
 
Plasma. Film is heavily motion-blurred so that it doesn't look strobed; it doesn't matter what display you use when you can look at a single frame and see the motion blur. CRT is garbage. I am already going to get rid of my plasma due to many things I just don't like; needless to say, I won't be replacing it any time soon. LCDs are still good enough to render 24 FPS. It's when you get up to 60 Hz and beyond that they start blurring content.

Projectors are for people who want a dedicated theater room. I don't, and most people don't sit in the dark just to watch TV or movies. A CRT set would be retarded; my plasma is heavy enough. Could you imagine how deep and heavy a 42" CRT would be? They don't even exist because it would be so retarded. I also like my images to be drawn all at once instead of scanned across and down. A persistent image looks different from a flickering one, which is why people think an LCD is inferior.
 
Unknown is absolutely correct. Blu-ray penetration is weak, at best. Think about it...all of these people are happy streaming crap from the web...and many are totally happy with DVD resolution. Most couldn't care less about 1080p or high bit-rate transfers, and don't even mention lossless audio...most people use TV speakers. And you can't go by what you see and read here, as we are less than the 1%.
 
LCDs are still good enough to render 24 FPS.


Not really; there are countless tests you can do to prove this. However, you probably wouldn't understand them. You are starting to sound like a 12-year-old who knows a little too much for his own good, but is generally clueless.




There is a crucial biological reason they put motion blur in movies and don't show everything in crystal-clear 120 fps. You should probably do some more reading on the subject or talk to an optometrist/ophthalmologist.
 
Unknown is absolutely correct. Blu-ray penetration is weak, at best. Think about it...all of these people are happy streaming crap from the web...and many are totally happy with DVD resolution. Most couldn't care less about 1080p or high bit-rate transfers, and don't even mention lossless audio...most people use TV speakers. And you can't go by what you see and read here, as we are less than the 1%.

So...what you are saying is people still fundamentally care about what they are watching and not how pretty it is. Say it ain't so. Next thing you will be telling me is that people like fun little games and not massive multi-hundred-hour games with awesome graphics.
 
They motion-blur 24 FPS because it would be jumpy otherwise. The blur isn't even added; it's a function of shutter speed. There are movies with high shutter speeds that are absolutely disgusting to watch...Crank 2: High Voltage would be one good example. Saving Private Ryan, another. There is no biological reason movies aren't greater than 24 FPS. Film is ancient, and that's simply what they settled on. If film were created today, they wouldn't settle for a measly 24 FPS. The quacks for 24 FPS are always quoting some kind of "artistic look" bullshit. Each frame is 42 ms; LCDs have a plenty quick response time for that. Not everyone is some motion-sick retard; I'd say most people aren't. Even so, it's nothing that can't be cured with good camera work. There's no reason for a camera to be bouncing around in a movie like it's mounted to a human's head.
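The shutter-speed point can be made concrete: exposure per frame is (shutter angle / 360°) / fps, with 180° being the conventional film look and narrower angles giving the strobed, low-blur effect described above:

```python
# Exposure time per frame from shutter angle: exposure = (angle / 360) / fps.
# 180 degrees is the conventional film look; narrower angles cut the blur
# and make motion strobe instead.
def exposure_ms(shutter_deg, fps):
    return (shutter_deg / 360) / fps * 1000

for deg in (180, 90, 45):
    print(f"{deg} deg at 24 fps: {exposure_ms(deg, 24):.1f} ms of exposure "
          f"in a {1000 / 24:.1f} ms frame")
# 180 deg -> 20.8 ms of blur per 41.7 ms frame; 45 deg -> only 5.2 ms.
```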
 
Saving Private Ryan and BoB were shot that way to intentionally look jerky. The human eye naturally sees fast movement as blurry; if you had a 120 Hz film, it would look very unnatural to the human eye. Maybe you are the rare exception who would like how horrible that looks, but you are 1 out of 6 billion, so life is just going to suck for you.


There are 24p videos to test motion resolution on LCDs; perhaps you should run a few before talking about how they can seamlessly render the motion, because they can't. If you want decent motion resolution you get a plasma, and if you want it flawless you get a CRT.
 
No, it would look more realistic. So realistic that stuff like that usually causes motion sickness in the weak type of people (those with psychological allergies, sea sickness, and those who puke on any kind of amusement park ride or from general spinning), because the camera work is so utterly shitty and shaky that the brain thinks what they are seeing is physically happening through their eyes. Just read about the UHDTV demonstrations making people sick. Saving Private Ryan was not shot that way to look jerky; it was done so as not to be a blurry abortion of a mess.

How does a plasma not render motion as well as a CRT? They rely on phosphors, just like CRT. They are superior in that they render the whole image at once rather than scanning. CRT is garbage, sorry...it's dead, and never coming back.
 
No, it would look more realistic. So realistic that stuff like that usually causes motion sickness in the weak type of people (those with psychological allergies, sea sickness, and those who puke on any kind of amusement park ride or from general spinning), because the camera work is so utterly shitty and shaky that the brain thinks what they are seeing is physically happening through their eyes. Just read about the UHDTV demonstrations making people sick. Saving Private Ryan was not shot that way to look jerky; it was done so as not to be a blurry abortion of a mess.

How does a plasma not render motion as well as a CRT? They rely on phosphors, just like CRT. They are superior in that they render the whole image at once rather than scanning. CRT is garbage, sorry...it's dead, and never coming back.

You must spend all your free time in the Ultra Hi-End HT Gear forum at AVS.
 
You must spend all your free time in the Ultra Hi-End HT Gear forum at AVS.

He doesn't know anything about AV, video, calibration, or really the technologies behind them. I don't even think he got your insult.


I'm still waiting for a projector to match a CRT. Of course, projectors are for idiots who sit in dark rooms to watch The Biggest Loser. REAL videophiles HATE theater rooms and watch 40-inch plasmas in their parents' living room.
 
There is no point in interpolating something with motion blur; that is a fundamental flaw in this motion-interpolation crap for TVs. I really don't care what you morons think I do or don't know, or where I live. Resorting to insults about whether a person lives with their parents, and throwing around nerdy terms or memes, is just pathetic.
 
Cinema with even frame duration without any processing or detection to interpolate even the same frames five times over (5:5). What is presented to the TV is what is displayed... Plus to do the 5:5 the display mode would need to be changed, a…

You aren't making any sense here. Movies are 24 fps. Doing the 5:5 pulldown in your player and sending frames to your TV at 120 fps is just a waste of bandwidth when you can send them to your TV at 24 fps and do the 5:5 pulldown at the set. Same result, for 1/5th the bandwidth.

Also, 120 Hz isn't very future-proof. The next stop for movies will most likely be 48 fps (The Hobbit is being shot at 48 fps, as are some IMAX features), and that doesn't play back without judder on a 120 Hz set regardless of the input. A 96 Hz set that accepted 24/48 Hz input would actually be better for movies.
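The judder point is just divisibility: a panel shows a source with even cadence only when its refresh rate is an integer multiple of the source frame rate. A quick sketch:

```python
# Cadence check: playback is judder-free only if the panel refresh rate
# is an integer multiple of the source frame rate.
def cadence(source_fps, panel_hz):
    ratio = panel_hz / source_fps
    even = ratio == int(ratio)
    return f"{ratio:g}:1 " + ("even" if even else "UNEVEN (judder)")

for fps in (24, 48):
    for hz in (96, 120):
        print(f"{fps} fps on {hz} Hz: {cadence(fps, hz)}")
# 24 fps: even on both. 48 fps: 2:1 on 96 Hz, but 2.5:1 on 120 Hz -> judder.
```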

if you had a 120 Hz film, it would look very unnatural to the human eye.

This must be the nonsense thread. Real 120 fps movies would look much more natural than 24 fps ones. There is no sane reason to think the reverse.
 