8K TVs in 2018 but No Content till 2025

FrgMstr (Just Plain Mean, Staff member)
If you consider yourself an early adopter when it comes to the latest tech, this may be that one time you want to hold off for a while. Pocket-lint talks about new 8K TVs (4320p) being a thing at CES, which we possibly missed since we were paying attention to PC tech, with a whole range of those 8K TVs set to go on sale in 2018. The kicker? There is no native 8K content "available." The article also spells out that 8K will likely be restricted to screen sizes of 80 inches or larger. Holy hell.


During a Philips TV launch in Amsterdam, Pocket-lint was told that European 8K TV broadcasts are unlikely to be available until 2025 at the least. What's more, industry renowned research director Paul Gray, of analytics company IHS Markit, explained that 8K resolutions will be restricted to TVs with screen sizes of 80-inches and above for the immediate future.
 
I'm excited for this new tech since it'll just drive the prices of old tech down even faster. Waiting to pick up that 4K 65" OLED for under $2K.
 
I'm excited for this new tech since it'll just drive the prices of old tech down even faster. Waiting to pick up that 4K 65" OLED for under $2K.
The C7P 65" OLED was $2,050 this week at BuyDig, I believe (saw it on Slickdeals), so it's getting there...
 
My daughter is working (talent) on a feature that is being shot on 8K. It's pretty rare though at this time to shoot above 4K.
 
As others have rightly pointed out, there's barely any 4K content out there.

I didn't think TV makers could find something more useless than 3D to try to force people to upgrade, but here we are!

Personally, I see a much greater difference with HDR content than from HD to 4K. Most of the content I look at on a screen is pictures of things, not rows and rows of text. Color gamut is often far more important than resolution!
 
Pff, a lot of cable and sat providers don't even offer 1080p yet. 4K streaming is still shit with its low bitrate. The only thing that offers true 4K is UHD Blu-ray.
 
My daughter is working (talent) on a feature that is being shot on 8K. It's pretty rare though at this time to shoot above 4K.

Personally, I really don't want to know what I would look like in 8k. Those really bad hotel bathroom lights come to mind.. :eek:

lol
 
By the time there is 8K content, I'm sure there will be a new HDMI standard, a new HDR standard, and maybe even a whole new resolution standard.
Buying one of those today is just buying a quirky high-resolution PC monitor.
 
Personally I think I'll stop at 4k.

I've been a relatively early adopter of new resolutions for years now, and I might just have finally reached the point where I have had my fill.


2002: Iiyama Vision Master Pro 22" CRT 2048x1536

In 2002 I bought an Iiyama Vision Master Pro 510, a 22" CRT, the biggest I had ever seen, able to render 2048x1536 at 80 Hz. During the period I owned it, I was never able to own a GPU that could render fast enough at 2048x1536, so it spent most of its life at 1600x1200, vsynced to 100 Hz in Counter-Strike. Not a big deal for a CRT, as there is next to no penalty for running one at non-native resolutions.


2005: Dell 2405fpw 24" 1920x1200

In 2005 my Iiyama screen died, so I bought a Dell 2405fpw, 24" 1920x1200 (my first flat panel). I quickly found that most modern titles couldn't run fast enough at 1920x1200 on any existing GPU in early 2005, and it actually caused me to lose interest in games altogether until about 2009. I essentially took a four-year break from games because of a monitor.
When I came back in the summer of 2009, things had changed and I was actually able to get decent framerates at 1920x1200 on my highly overclocked GTX 470, so I enjoyed that for a while.


2010: Dell U3011 30" 2560x1600

In late 2010 I got bitten by the bug again and bought a 30" Dell U3011, 2560x1600. I spent the next three years rapidly going through GPU after GPU to find a solution, any solution, that could give me a good steady 60 Hz. First I upgraded to a GTX 580. It wasn't enough, and the power supply in my SFF case couldn't keep up with the power draw. Switched to a mid tower, got dual Radeon HD 6970's. Found that Crossfire sucked balls. Got a 7970 on launch, found that a single 7970 got me better minimum framerates in my titles than dual 6970's, but still not enough. Got greedy and wanted to custom mod a Corsair AIO to cool it. Slipped with a screwdriver and killed the GPU. Put my tail between my legs, ran on an old backup GPU for a few months, and then got a GTX 680 on launch. Still not fast enough. I wasn't happy until I picked up a Kepler Titan on launch in 2013. That puppy lasted me until the summer of 2015, when I got bitten by the screen bug again.


2015: Samsung JS9000 48" 4K (3840x2160)

Summer of 2015 I bought a 48" 4K Samsung JS9000. Great screen, but it immediately launched me back into GPU inadequacy land. First a single 980 Ti (not enough), then dual 980 Ti's overclocked and cooled with Corsair AIOs. I found SLI sucked just as much as Crossfire had back in 2011, so I went looking for an alternate solution. I bought a GTX 1080 at launch, immediately found it was grossly inadequate, sold it, and bought a Pascal Titan X on launch, cooling it with a custom water loop to get the best possible overclock, which I am still using. I was hoping the overclock on custom water would finally get me to the point where my framerate would never drop below 60 fps minimums at 4K Ultra settings, but sadly that isn't the case.

So now I'm biding my time. Older titles run fine at 4K (I'm thoroughly enjoying the remastered BioShock collection right now, a series I originally missed during my four-year break from gaming), but for newer, more demanding titles I have to get creative. For Deus Ex: Mankind Divided and Fallout 4 I created a custom ultrawide resolution of 3840x1646, so I could run unscaled pixels, letterboxed top and bottom. In both titles I would still drop below 60, which was frustrating, so to partially help I lowered the refresh rate to 50 Hz and vsynced it there. This worked most of the time. When I tried PUBG last summer I did the same, but kept the refresh rate at 60 Hz, as it was slightly less demanding than those titles, and every bit of framerate helps when playing multiplayer.
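
For a rough sense of why the letterbox trick buys framerate, here's a tiny back-of-the-envelope calculation (just an illustration in Python, using the resolution above):

full_4k = 3840 * 2160
letterboxed = 3840 * 1646
print(f"{letterboxed / full_4k:.0%} of the pixels to shade per frame")   # roughly 76%, so about a quarter fewer pixels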

Getting a Titan V would certainly help, but there is no way in hell I am spending $3K on a GPU, especially since it is probably not fast enough to solve the 4K problem once and for all. I have high hopes for the next-gen Nvidia 1080 Ti replacement, or whatever they call the next consumer/gaming Titan now that the Titan brand has gone pro compute. One thing I know for sure: I'm never going Crossfire or SLI again, regardless of whether both GPUs are on a single board or separate cards.



The TL;DR version is this: after 16 years of being an early adopter of high resolutions, I'm tired of fighting the GPU/resolution arms race. I may upgrade my desktop screen once more, when something like a 42-44" 4K G-Sync screen comes around, but then I'm just going to sit on that for a while. I don't like using scaling on my screens, and since somewhere around 42-44" gives me the perfect PPI for desktop use, and I have absolutely no need for a desktop screen larger than 44", this is where I'll stay.

For movies and TV I still can't tell the difference at all between 4K content and 1080p content at typical viewing distances, even with rather large screens. It's barely noticeable in most cases, and certainly nothing like the huge leap from 480p DVD to 1080p Blu-ray. I feel like 8K, unless you are sitting VERY close to a VERY large screen, is a complete waste.

I'm reminded of the resolution distance chart:

tv-size-distance-chart.png



So, if you sit within 4' of a 65" screen you might be able to tell the difference between 4K and 8K, but who does that? Even if you have a 100" screen, you'd need to sit closer than 6' from it to tell the difference. That's like sitting on the front row in the movie theater. There is a reason those seats are typically the last to go.

I have a 65" screen in my home theater setup, and no room for anything larger. My viewing distance is about 12', so I technically fall in the 4K area on this chart, but when I have test-viewed 4K content on 65" screens at that distance, I really haven't been able to tell the difference, and my vision is corrected to 20/20 or better.

So for now, I'm keeping my home theater and TV setups at 1080p, and my desktop at 4K, for the foreseeable future. 8K sounds great on paper and all, but I see absolutely no use for it whatsoever in reality, when there is barely any use for 4K as it stands.

Maybe in 2025 when the 8k content is supposed to come around, I'll feel differently about it, but that's 7 years from now. Plenty of time to change my mind if I need to.
 
My daughter is working (talent) on a feature that is being shot on 8K. It's pretty rare though at this time to shoot above 4K.

True, and isn't it pretty typical to want to shoot at a higher resolution than you distribute in, so that you can edit and do effects at a higher resolution, then downsample and distribute?

At least that's what they typically do in audio: edit at a higher bit depth and higher sample rate PCM, then downsample before shipping.
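
As a toy sketch of that work-high, downsample-to-ship idea on the audio side (purely illustrative; a real mastering chain adds proper filtering and dither), SciPy's polyphase resampler can take a 96 kHz signal down to 48 kHz:

import numpy as np
from scipy.signal import resample_poly

src_rate, dst_rate = 96_000, 48_000
t = np.arange(src_rate) / src_rate                 # one second of samples at 96 kHz
master = np.sin(2 * np.pi * 440.0 * t)             # stand-in for the high-rate edited master
delivery = resample_poly(master, up=1, down=2)     # 96 kHz -> 48 kHz is a 1:2 ratio
print(len(master), "->", len(delivery))            # 96000 -> 48000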
 
As a follow-up, this version of the chart seems to match my experiences better:

2D0EQAg.png
 

Don't think I'll ever bother with 8K

Currently we only have a single 40" TV in the house, and the seating is 8 feet away (10 feet if I'm reclined).
At that distance, a good-quality 720p video looks almost as good as a 1080p video.

With my old eyes getting worse, it might be time to try to convince the wife that the old TV cabinet has to go, so I can put in a 65" 2K TV.
 
The story this year (and for years to come, apparently) is going to be AI-driven upscaling of 4K content. Personally, I am far more interested in the high frame rate capability than in 8K.
 
I wouldn't mind an 8K 43" for stocks. Not for gaming though.


Well, the counter-argument is that super high resolutions will be great for gaming, as they will totally solve the aliasing problem once and for all, better than current anti-aliasing techniques can. The problem with that line of thought is that it will require such powerful GPUs that it may just not be feasible.

That, and resolutions would have to be much higher than 8K. Even on a relatively small screen by modern [H] standards of 24", 8K is nowhere near high enough to out-resolve the eye and hide aliasing. And when people want 120 Hz plus G-Sync on top of that, we are several generations away from GPUs capable of it, even if games didn't keep raising polygon counts and effect quality, like they do, and that's if we ever get there at all given the whole node-shrinking problem. A good chunk of any GPU power increase will be eaten up by ever-increasing polygon counts and effects, leaving very little to go after super-high resolutions.
 
I'm behind the times; I think the main TV I watch the most is just 720p. We have one in our bedroom that is 1080p, but it's only a 32" (and I think I got it pretty cheap when I bought it). I won't be signing up for 8K. I am tempted to go to 4K as prices keep dropping, although I'm probably not going to pull the trigger until my 720p set dies.
 
Good to see the advancement, but as everyone has said, you need a large screen to notice it. The place it makes more sense is in projectors, but those are still twice as much as TVs. It's great that TV prices are falling for 4K, but I don't need any more TVs; we have a lot, too many probably. I do need true 4K or higher in my theater, but I'm not spending $6,000 or more to get it (I need powered zoom and focus with lens memory).
 
Markets. Forex, Commodities,... candlestick charts.

[candlestick chart image]

That. With fewer colors. A lot fewer.

Well, the counter-argument is that super high resolutions will be great for gaming, as they will totally solve the aliasing problem once and for all, better than current anti-aliasing techniques can. The problem with that line of thought is that it will require such powerful GPUs that it may just not be feasible.
They would be great for gaming. I watched a review of someone playing Dirt Rally at 8K, and the reviewer said it was simply gorgeous.

 
Markets. Forex, Commodities,... candlestick charts.

[candlestick chart image]

That. With fewer colors. A lot fewer.

Ahh.

Does really high PPI actually help with these charts?

Because typical desktop screens have between 90 and 110 PPI, which at 4K translates to about a 43" screen. At 8K and that size you'd be looking at about double that.
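
If you want to check the numbers for your own panel, here's a quick sketch of the PPI arithmetic (the 43" size is just the example from above):

import math

def ppi(diag_inches, horiz_px, vert_px):
    # Pixels per inch of a flat panel, from its diagonal size and native resolution.
    return math.hypot(horiz_px, vert_px) / diag_inches

print(round(ppi(43, 3840, 2160)))   # 4K at 43": ~102 PPI, right in the usual desktop range
print(round(ppi(43, 7680, 4320)))   # 8K at 43": ~205 PPI, roughly double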
 
What about the Olympics? I'm sure I read about the Japanese broadcasting the whole thing in 8K a while ago.
 
I'd be surprised if some studios didn't release a handful of super uber 8K Blu-ray movies early so they can charge an early adopter markup.
 
Shooting in 8K makes sense for post-production editing, as you can crop/zoom without losing fidelity, amongst other tricks.
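
As a minimal illustration of that crop/zoom headroom (array shapes only; the real work obviously happens in an NLE, not NumPy), an 8K frame lets you cut a native 4K window out of any part of the shot:

import numpy as np

frame_8k = np.zeros((4320, 7680, 3), dtype=np.uint8)   # stand-in for one decoded 8K frame
y, x = 1000, 2000                                      # top-left corner of the reframed shot
crop_4k = frame_8k[y:y + 2160, x:x + 3840]             # a punched-in reframe that is still native 4K
print(crop_4k.shape)                                   # (2160, 3840, 3)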

My daughter is working (talent) on a feature that is being shot on 8K. It's pretty rare though at this time to shoot above 4K.
 
I'd be surprised if some studios didn't release a handful of super uber 8K Blu-ray movies early so they can charge an early adopter markup.

I was actually surprised when 4K Blu-rays started to come out, as I figured they would need a new and larger-capacity format.

Is there really enough space on a Blu-ray disc to fit high-quality 8K content? Or would they have to drop the bitrate so far that the artifacting would make the experience worse than just watching it at 4K or 1080p to begin with?
 
I was actually surprised when 4K Blu-rays started to come out, as I figured they would need a new and larger-capacity format.

Is there really enough space on a Blu-ray disc to fit high-quality 8K content? Or would they have to drop the bitrate so far that the artifacting would make the experience worse than just watching it at 4K or 1080p to begin with?


Quad layer is 128GB. Maybe. *shrug*
 
Higher color gamut >>> 8K.

ABC TV still broadcasts in 720p max.

4K at TV sizes is only good for external-monitor usage.
 
As a follow-up, this version of the chart seems to match my experiences better:

2D0EQAg.png
I hate sitting 14' from a TV. 6-7' from an 80" (even at 1080p) is optimal.

Someone mentioned new HDMI cables, but that's going to happen this year, and it will take us to 8K 60 fps, which is likely plenty for TV/movies in 2025. Not that it matters, because nobody who buys an 8K TV this year or next is going to use it for 7 or 8 years; they'll probably sell it within 2-3 years.

The real key is how well it upscales and when there will be 4K content (I couldn't care less about broadcast content, since they're forever behind the curve).
 
I was actually surprised when 4K Blu-rays started to come out, as I figured they would need a new and larger-capacity format.

Is there really enough space on a Blu-ray disc to fit high-quality 8K content? Or would they have to drop the bitrate so far that the artifacting would make the experience worse than just watching it at 4K or 1080p to begin with?
Blu-ray starts at 25GB and goes to 50GB. UHD starts at 50GB and goes up to 100GB (though I don't know if there are any 100GB discs or not). I also believe that UHD uses H.265, so they get a lot more bang for the byte.
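
As a rough sanity check on the disc-capacity question, here's the back-of-the-envelope bitrate math (the two-hour runtime is just an assumption for illustration; the disc sizes are the ones mentioned above and in the quad-layer post earlier):

def avg_mbps(disc_gb, runtime_hours=2.0):
    # Average video bitrate (Mbit/s) that would exactly fill a disc of the given size.
    total_bits = disc_gb * 8e9            # decimal GB -> bits
    return total_bits / (runtime_hours * 3600) / 1e6

for gb in (50, 100, 128):
    print(f"{gb} GB over 2 hours ≈ {avg_mbps(gb):.0f} Mbit/s average")
# 100 GB works out to about 111 Mbit/s average. 8K has four times the pixels of 4K, so holding
# similar per-pixel quality would want several times the bitrate of today's UHD discs.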
 
8K TVs in 2018 but no content till 2025? No GPUs that can do 144 fps at 8K till 2077.
 
I'm still using a 55" 1080p for my living room TV and it looks just fine. No need to upgrade.
Let me know when the price drops to $999.
 
I'm reminded of the resolution distance chart:

tv-size-distance-chart.png

That chart has shifted over time. It was released by the TV industry after they decided 4K was a way to boost sales.

To properly figure it out, you need to know the angular resolution of a perfect human eye on a grating test (strongest contrast, dead center), which most of us don't have, and then work out the distance at which a single pixel's pitch subtends that angle.

The resolution of the human eye is ~0.3 arc minutes.
1 arc minute = 1/60 degree, or 0.0166 degrees
0.0166 * 0.3 = 0.00498 degrees

So let's take that 8K TV. That's 7680 pixels wide.
On a 60" TV, 52.3" wide in 16:9 format, that leaves you 146.9 pixels per inch,
or a pixel every 0.0068" (1/146.9 PPI).

Hyp = 0.0068 / sin(0.0166 * 0.3)

You get a hypotenuse of 3.17". Their chart says 4', so they are fudging it... by a LOT, and that's assuming perfect eyes with perfect viewing conditions (a black-and-white grating).
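
For anyone who wants to rerun the trig with their own screen size, resolution, or acuity number, here's a small sketch of the same calculation (the 0.3 arc-minute figure and the sine setup are carried over from the post above; swap in 1 arc minute for the usual 20/20 number):

import math

def resolving_distance_in(diag_in, horiz_px, vert_px, acuity_arcmin=0.3):
    # Distance (inches) at which a single pixel subtends the given acuity angle.
    width_in = diag_in * horiz_px / math.hypot(horiz_px, vert_px)   # panel width from the diagonal
    pixel_pitch = width_in / horiz_px                               # inches per pixel
    return pixel_pitch / math.sin(math.radians(acuity_arcmin / 60.0))

print(resolving_distance_in(60, 7680, 4320, acuity_arcmin=0.3))   # 60" 8K panel, grating-test acuity
print(resolving_distance_in(60, 7680, 4320, acuity_arcmin=1.0))   # same panel, standard 20/20 acuity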
 
Well, the counter-argument is that super high resolutions will be great for gaming, as they will totally solve the aliasing problem once and for all, better than current anti-aliasing techniques can. The problem with that line of thought is that it will require such powerful GPUs that it may just not be feasible.
You don't need an 8K display for that; just render at 8K and downscale to the display and you're set. Not feasible yet, but it will be sooner rather than later. Did you think 4K gaming would become feasible 10 years ago? We blinked twice and in 2014 I was already playing at 4K.
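
That render-high-then-downscale approach is just supersampling; a minimal sketch of the averaging step might look like this (a tiny stand-in frame instead of a real 8K buffer, and a plain box filter rather than whatever a driver actually uses):

import numpy as np

def supersample_downscale(frame, factor=2):
    # Box-filter downscale: average each factor x factor block of rendered pixels.
    h, w, c = frame.shape
    h2, w2 = h // factor, w // factor
    blocks = frame[:h2 * factor, :w2 * factor].astype(np.float32).reshape(h2, factor, w2, factor, c)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)

tile_hi = np.random.randint(0, 256, (432, 768, 3), dtype=np.uint8)   # small stand-in for an 8K render
tile_lo = supersample_downscale(tile_hi, factor=2)
print(tile_hi.shape, "->", tile_lo.shape)                            # (432, 768, 3) -> (216, 384, 3)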
 
TVs are finally getting to the point where you can actually start looking at some of those NASA pics without scaling!
 