So... I got a 4K TV but...

MrGuvernment

Fully [H]
Joined
Aug 3, 2004
Messages
21,812
So, prices are low because the 2018 models are coming out, so I grabbed a 4K TV (Samsung MU8000 55"). I wanted the Sony X900E but couldn't justify the extra $300 for it.

With that, I have been using an older Dell Optiplex 990 with an i5-2400 CPU and integrated video, as it was fine for 1080p content on my crap-tastic LG 43" TV.

So now, with some 4K content, the CPU is pegging at 80-100% and dropping audio.

The Optiplex has room for a half-height GPU.

I know for Netflix you need Kaby Lake for the HDCP stuff to stream 4K; I presume that is still the case even if you're using a dedicated GPU? Or even if you use the Win 10 Netflix app?

Also, I do want to be able to play streams that aren't from Netflix, so offloading to the GPU would be ideal.

Can I get around the Kaby Lake stuff with something like an Nvidia GT 1030?
 
No, you absolutely do not need Kaby Lake to run 4K HDR content (only if you want to use the iGPU). You do, however, need Windows 10. You have two options: viewing the content through the Microsoft Edge browser or using the Windows 10 Netflix app.

The part below is copied and pasted directly from Netflix's help page for the Windows 10 app: https://help.netflix.com/en/node/23931

You can stream Netflix in HDR on your Windows computer in the Microsoft Edge browser or the Netflix app available from the Windows store.

To stream HDR content on your Windows PC, your system must meet or exceed the following requirements:

If using integrated Intel GPU:

  • Windows 10 Creators Update is required (version 1703).

  • CPU: 7th Generation Core (i3, i5, or i7 7xxx or 7Yxx) or higher processor.

  • Graphics Driver: 22.XX.XX series, version 4708 or higher.
Note:
Devices must be enabled for HDR by the manufacturer.
If using discrete NVIDIA GPU:

  • Windows 10 Fall Creators Update is required (version 1709).

  • GPU: 1050, 1060, 1070, and 1080 cards with at least 3GB of video RAM.

  • Graphics Driver: 387.68 (23.21.13.8768) or higher.
Note:
Devices must support Microsoft PlayReady 3.0 or higher.
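
(For anyone unsure which driver they're on, here's a quick way to check from Python. This is just a minimal sketch, assuming the NVIDIA driver's nvidia-smi tool is installed and on your PATH; the 387.68 threshold is the figure from the list above.)

import subprocess

# Ask nvidia-smi (ships with the NVIDIA driver) for the installed
# driver version, e.g. "388.13".
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
driver = out.stdout.strip().splitlines()[0]
print("Driver version:", driver)

# Netflix's list above asks for 387.68 or higher.
major, minor = (int(x) for x in driver.split(".")[:2])
print("OK for Netflix HDR" if (major, minor) >= (387, 68) else "driver too old")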
 
Good to know, thank you (I got lazy and forgot to check Netflix's own page).

Got Win 10, so covered there!

So, noting that, what is the best app these days for playing back 4K content? I always used Media Player Classic, but I don't think it did GPU offloading?
 

MPC-HC: while the GUI side is no longer being worked on, all the codecs and filters that were updated periodically are still being updated and can be added manually (and see below for a quick way to sanity-check that GPU decode works at all).

VLC has improved a lot over the years and doesn't really require any fiddling around with settings to get things working.

Besides those two, I think most people use Plex for viewing their content on TVs, but I've never gotten around to checking it out since I haven't had time to finish my theater room setup (waiting until May 1st for the TCL 65" 6 series).
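
(If you want to sanity-check whether a file will hardware-decode at all before fighting with player settings, ffmpeg can do a throwaway decode through DXVA2. A rough sketch, assuming ffmpeg is installed and on PATH; "movie.mkv" is a placeholder for one of your own files:)

import subprocess

# Decode the file through DXVA2 (Windows GPU decode) and throw the
# frames away; a zero exit code means the decode completed.
# Note: some ffmpeg builds fall back to software decode with a warning
# instead of failing, so skim the log as well as the exit code.
result = subprocess.run(
    ["ffmpeg", "-hwaccel", "dxva2", "-i", "movie.mkv", "-f", "null", "-"],
    capture_output=True, text=True,
)
print("decode finished OK" if result.returncode == 0 else result.stderr[-400:])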
 
The info above is incorrect; that is the HDR part. These are the 4K requirements, and they call for a newer Intel CPU:

Netflix is available in Ultra HD with the Windows 10 app. To stream in Ultra HD, you will need:

  • An HDCP 2.2 compliant connection to a 4K capable display, Intel's 7th generation Core CPU, and the latest Windows updates

  • A plan that supports streaming in Ultra HD. You can check which plan you're currently on at netflix.com/ChangePlan.

  • A steady internet connection speed of 25 megabits per second or higher.

  • Streaming quality set to High or Automatic. More information about video quality settings can be found in our Playback Settings article.
 


Nein, nein, nein. You do not need a Kaby Lake or newer CPU to watch 4K content. All you need is HDCP 2.2 and HEVC decoding, along with certification for the Microsoft PlayReady 3.0 DRM, which Pascal has. Netflix just hasn't bothered to update that section; instead they put the new information under the HDR portion. The Nvidia 10 series has supported Netflix 4K since May of last year.

Google is your friend...

https://www.kitguru.net/channel/gen...erything-you-need-to-stream-4k-netflix-on-pc/

https://www.extremetech.com/computi...a-driver-will-allow-netflix-4k-streaming-gpus

https://www.pcworld.com/article/319...-on-geforce-gtx-10-series-graphics-cards.html

Also, while digging up the evidence, I ran across more information that says not to update to the W10 Fall Creators Update, because for some baffling reason Microsoft removed the native HEVC codec support from Windows 10 (there's a quick check below for whether the Store's HEVC extension is installed). o_O

For Intel, only Kaby Lake iGPUs or later are supported. For NVIDIA, only GeForce GTX 1050 or higher with at least 3GB VRAM are supported, with driver version 387.96 or higher required. As NVIDIA notes, streaming 4K on SLI/LDA is not supported, and multi-monitor configurations require all active monitors to be HDCP 2.2 capable or content will be downgraded to 1080p; the latter stipulation is presumably true for future AMD configurations as well. For Intel’s part, this capability became enabled in November 2016. And while NVIDIA announced 4K Netflix support for Pascal at launch, it wasn’t until April 2017 and 381.74 that they previewed 4K Netflix support for Windows 10 Insider builds, before production support with 387.96. As of the time of writing, Netflix help documentation still does not list NVIDIA graphics as supported for 4K streaming on PC.

Recent months have seen further wrinkles, such as the removal of the built-in HEVC decoder for Windows 10 Fall Creators Update; it is unclear if Microsoft’s stated “Codec Pack” is incoming, though for the time it appears users have been struggling with a supposedly insider-only KB4041994, an automated Windows Store installation of HEVC Media Extension that fails if it is already present.

The above is from an article posted on AnandTech in Feb 2018, when I was looking to see whether Vega was also supported by PlayReady 3.0, which is supposedly coming Soon(tm)... https://www.anandtech.com/show/12442/amd-plans-playready-3-support-for-polaris-and-vega-gpus-in-2018
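
(Side note on the missing HEVC codec: here's one way to check whether the Store's HEVC Video Extension is installed. A sketch assuming Windows 10 with PowerShell available; the package name is a best guess, not gospel:)

import subprocess

# List installed Store packages whose name mentions HEVC. Microsoft's
# package is named Microsoft.HEVCVideoExtension, but treat the exact
# name as an assumption -- it can vary by Windows build.
out = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-AppxPackage -Name *HEVC* | Select-Object Name, Version"],
    capture_output=True, text=True,
)
print(out.stdout.strip() or "no HEVC extension package found")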
 
So...

My options would basically be either:

1. Get a 1050 Ti 4GB low-profile GPU for my Optiplex (i5-2400, Win 10 Pro 1709). This would give me 4K on Netflix, and if I used MPC-HC with madVR it would offload playing H.264 files to the GPU so I would not be CPU-bound.
2. Get a new 8th-gen Intel CPU system, an i3-8100 for example, building out an entire new system and using the iGPU.
 

Personally I'd just go with option 1; then, if you ever wanted to, you'd actually be able to play some games at 1080p on the TV.
 

Thinking that route, but I can't find a 1050 Ti low-profile. It is my cheapest route: ~$180 for a GPU vs. building an entire rig for close to $500...
 
Or you could just run an Nvidia Shield for far less $$?

Can the Shield connect to a NAS and stream content from it?

I tried today, and the one issue is that my TV and HTPC are in my living room where there is no RJ45 port, so I am running an Ethernet-over-power adapter, and the most I can get is about 15-20MB/sec. I tried a Blu-ray rip 56GB in size and it stuttered, but I assumed my old i5-2400 was the issue.

I guess the other part is that we do use a web browser to view streams from channels in other countries...
 
The only thing a GT 1030 can't do is Netflix 4K, because they didn't play the certification game. Nvidia had to pay to get certified, so they're holding the customer hostage for the first generation.

But it will do hardware-accelerated 4K HEVC 10-bit with ease. Just like the GTX 950 before it.

If you can live without Netflix FOR NOW, a passively-cooled GT 1030 would treat you very well! It would also let you play casual games at 1080p, something your integrated graphics couldn't dream of :D

Nvidia will eventually allow all their new cards to handle Netflix 4K. And when they do, you can buy that one.

And if you must have 4K Netflix, you can always do a GT 1030 for the HTPC ($100) plus a 4K Roku stick for Netflix 4K ($50). That's STILL CHEAPER than a GTX 1050 Ti, and it can actually fit in your HTPC!!

EDIT: or you can use the Netflix app built into your TV.
 

Which powerline adapters are you using? I had stuttering issues when transferring or viewing large video files on my setup using my Shield (Zyxel PLA5456). It worked fine on the Shield with a direct connection. Can you try connecting directly and retrying?
 
Yeah, my easy method to test whether it is the network or the processor is to copy the file locally, then play it back.

It could be either: you get 10-20MB/s, and even with optimized decode an i5-2400 is questionable. A 50+GB 2-hour movie averages around 7MB/s, and peaks can run well above that, occasionally past what your connection can handle (rough math below).
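
(The rough math, for anyone who wants to plug in their own numbers; plain arithmetic, nothing assumed beyond file size and runtime:)

# Average bitrate of a big Blu-ray remux vs. a powerline link.
size_gb = 50                      # movie size in GB
runtime_s = 2 * 60 * 60           # 2-hour runtime in seconds
avg = size_gb * 1000 / runtime_s  # MB/s, using 1 GB = 1000 MB
print(f"average: {avg:.1f} MB/s")        # ~6.9 MB/s
# High-bitrate scenes can spike to a multiple of the average, which is
# exactly where a 10-20 MB/s powerline link starts to choke.
print(f"2x peak: {avg * 2:.1f} MB/s")    # ~13.9 MB/s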
 

It can do that via Kodi, no worries. I use Kodi for everything; one of the best software packages developed in the last 10 years, IMO.
 
It is probably the network connection. Just playing a 20GB 4K movie from Plex shows a constant 16 to 24 Mbps, and the larger 50 to 70GB files are 50+ Mbps. I run a Plex server from the rig in my sig and use an Nvidia Shield for the extenders. I don't share the 4K movies outside of my network due to the bandwidth required, so I keep separate libraries: 1080p and lower for sharing, and the 4K stuff internal only.
 
Even on a large TV, I bet the vast majority of people can't tell the difference between Netflix 4K and 1080p. Try it out if you ever get the chance. It's really only obvious when the content has no film grain and isn't letterboxed. Even then it's not the night-and-day difference you get going from 720p to 1080p.
 
I did copy my 4K MKV to the local HTPC, which runs an 80GB SSD, and it still stuttered and dropped, so it's definitely the old i5-2400 struggling with the iGPU.

I would agree for sure on 1080p vs 4K. Netflix I can use with the app in the TV and it is fine, and I don't feel like paying $3 a month more just for UHD 4K content that is compressed to crap anyway.

It is more about the local collection of 4K content I have on my QNAP TS-431P, which won't be able to handle transcoding. I could run it off my main rig (dual E5 Xeon), but I keep that turned off when not in use.

Does software like Kodi auto-magically use the GPU for decoding if it is available? The reason I would prefer to keep the HTPC on Win 10 is that we use it for streaming channels from back in Costa Rica, which, depending on the day, can be browser-picky.
 

The Samsung MU8000 is an Ultra HD certified TV: not 4K, but near-4K.

Any TV/monitor at 3840 x 2160 is Ultra HD certified.

Any TV/monitor at 4096 x 2160 (DCI) is true cinema 4K.

[Chart image: should-you-buy-a-4k-resolution-chart.png]
 
Ya, it is sneaky marketing; "UHD" gets slapped on anything higher than 1080p. Wish these industries would all just get on the same page!
 
It's a fair point to make, but it needs context.

The difference between 4K and UHD will only matter to consumers if movies are released in 4K.
Since they are released in the UHD format, it makes sense to purchase a UHD display.

Framerate will suffer when gaming at full 4K native res because it pushes about 7% more pixels.
It's not really the gamer's choice when even the fastest cards now are only just OK with UHD; many games struggle unless quality is reduced a lot.

Because UHD is the dominant format it is generally termed 4K, and it is well understood what is meant.
(The full term for it is 4K UHD.)
If there is a need for the general consumer or gamer to differentiate, be sure it will happen.
Although I doubt full 4K res will take hold before 8K UHD becomes viable, and then nobody will care.
 
Your TV is a smart TV? Can't you just use the built-in Netflix app to watch 4K movies? I know I can, and have plenty of times with my Samsung 40" 4K TV.

Why try through the PC?
 

I think he has the movies compressed with HEVC or something, which requires a transcode to display properly. The Shield is the only extender I know of that will direct-play nearly every type of compression/encoding.
 
I can use the Netflix app; it is more for my local movies on my NAS. I ended up getting an Nvidia Shield in the end; it was overall the cheapest option vs. buying a $300 1050 Ti video card for an old i5 system. So far so good; now I just need to run some Cat cable to my TV area, since my Ethernet-over-power is struggling!
 
3840 x 2160 can pretty much be classed as "consumer grade 4K". ;)

Consumer grade 4K? You're joking, right! ;)



Well, if you look at the GTX 1080 Ti spec, it supports 7680x4320@60Hz (8K), so it can handle native 4K easily, but it has to drop to UHD 3840x2160 because TV/monitor manufacturers use "4K" as a marketing name. In fact they use the two names together, so it reads "4K UHD": the UHD part is right, but the first part is misleading, because UHD is 256 pixels narrower than native 4K. Movies are shot in true cinema 4K but have to be cropped to the UHD format, because consumers only have UHD TVs/monitors, driven by the lower-cost end of the market. Those with an unlimited budget have a choice, but the content will be the same.

The real difference (pixel math below):
1. 4K is a professional production and cinema standard: 4,096 x 2,160 (DCI), aspect ratio 1.90:1.
2. UHD is a consumer display and broadcast standard: 3,840 x 2,160, aspect ratio 1.78:1 (256 pixels narrower than cinema 4K).
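
(The pixel math, spelled out; plain arithmetic from the two resolutions above:)

# DCI cinema 4K vs. consumer UHD, pixel for pixel.
dci_4k = 4096 * 2160   # 8,847,360 pixels
uhd = 3840 * 2160      # 8,294,400 pixels
print(f"DCI 4K is {4096 - 3840} pixels wider")        # 256
print(f"and has {dci_4k / uhd - 1:.1%} more pixels")  # 6.7%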

Now, to confuse consumers even more, we have Super UHD, Ultra HD, and UltraSharp names on the market.

Dell uses the UltraSharp name for the UP3218K, a 32-inch at $3,899.99, but again there is limited 8K content to use on it unless you do photo, video, or design work.
Apple is to release 5K and 8K monitors at only 27 inches for their iMac Pro range.
ViewSonic has the VP3278-8K 32-inch monitor.

Then there is the Nvidia G-Sync 120Hz 65-inch monitor. This will use a UHD panel, not native 4K, for gaming, but again it's misleading: at 120Hz you would need two GTX cards to run it, because a single GTX tops out at 60Hz. Maybe that is what Nvidia means; IMHO, we'll have to wait and see what they pull out of the hat.

Something to google and read:
"The Truth and Hype on 4K, 8K, UHD and HDR"
"Human eyes cannot see things beyond 60Hz."

GTX 1080 Ti spec:

Standard display connectors are DP 1.4 and HDMI 2.0b, but there are no DP 1.4 displays on the market.

HDCP 2.2 is new and it's not backwards compatible; many older UHD devices don't even support it. So if it's not backwards compatible, why do most UHD monitors sold between 2015 and 2017 ship with only HDCP 1.4/1.2? As Nvidia puts it: "some things cannot be explained."
 
Last edited:
I told you you can use the GT 1030. It's just a hundred bucks.

It's not Netflix-approved, but it works fine for stuff you download off torrent sites. 4K HEVC and HDR work fine.

The Shield is way overkill for your needs, and means you have to learn a new platform.

I know Netflix certification is annoying, but the lines drawn are complete bullshit: an Intel "Core" CPU REQUIRED, when the Pentium, Celeron, and Atom all support the same 4K playback.

The GT 1030 has the exact same hardware decode block as the GTX 1050.
 
Well, if you look at the GTX 1080 Ti spec, it supports 7680x4320@60Hz (8K), so it can handle native 4K easily, but it has to drop to UHD 3840x2160 because TV/monitor manufacturers use "4K" as a marketing name. In fact they use the two names together, so it reads "4K UHD": the UHD part is right, but the first part is misleading, because UHD is 256 pixels narrower than native 4K. Movies are shot in true cinema 4K but have to be cropped to the UHD format, because consumers only have UHD TVs/monitors, driven by the lower-cost end of the market. Those with an unlimited budget have a choice, but the content will be the same.
A name that is easy to remember, quick to type, and suitably descriptive has won.
"4K" is shorthand for "4K UHD" and carries enough information for the general public to know what they are buying.
All 4K TVs will be UHD; not much to go wrong.
Those needing 4K cinema res will be aware of the difference, and even if they are not, we have the internet to find out.
If a computer user accidentally buys a full 4K screen, it will still display 4K UHD. They have the option to return it if they are not happy.

The real difference is:
1. 4K is a professional production and cinema standard: 4,096 x 2,160 (DCI), aspect ratio 1.90:1.
2. UHD is a consumer display and broadcast standard: 3,840 x 2,160, aspect ratio 1.78:1 (256 pixels narrower than cinema 4K).

Now, to confuse consumers even more, we have Super UHD, Ultra HD, and UltraSharp names on the market.
Standards need a name, and there will always be lots of them unless progress is halted. This isn't going to improve!
When you need to know the difference, look it up.
It's only confusing if general standards aren't chosen for mainstream products.

Dell uses the UltraSharp name for the UP3218K, a 32-inch at $3,899.99, but again there is limited 8K content to use on it unless you do photo, video, or design work.
Apple is to release 5K and 8K monitors at only 27 inches for their iMac Pro range.
ViewSonic has the VP3278-8K 32-inch monitor.
There is also limited 16K content; you either have a need for the resolution or not.
There will always be a higher resolution with less content available.
That's progress.

Then there is the Nvidia G-Sync 120Hz 65-inch monitor. This will use a UHD panel, not native 4K, for gaming, but again it's misleading: at 120Hz you would need two GTX cards to run it, because a single GTX tops out at 60Hz. Maybe that is what Nvidia means; IMHO, we'll have to wait and see what they pull out of the hat.

Something to google and read:
"The Truth and Hype on 4K, 8K, UHD and HDR"
"Human eyes cannot see things beyond 60Hz." Then why are 120Hz/144Hz monitors better?
Off topic.
As is most of what you have posted. It's best left here.
 

4K is not 4K UHD at 3,840 x 2,160; stop confusing other people. There is not enough information given to the consumer when buying; most of the marketing information online today is just mainstream media BS. The same can be said about HDCP 2.2: again, all hype, forcing people to upgrade to UHD TVs called "4K" and the content to go with them.

4K is 4,096 x 2,160 (DCI), but UHD is 3,840 x 2,160; not the same resolution.

Here is a question:

The catch is that the so-called misleading "4K" UHD Blu-ray/OLED TVs sold to consumers today can only play a max resolution of 3840x2160, and streaming content on Netflix and Amazon is 3,840 x 2,160 UHD. So how do you force them to read 4K, and add 256 more pixels to a maxed-out unit?
 
The 4K topic has already been covered.
You are again going off topic.
 
Play nice or don't play at all.

Again, you didn't answer my question. WOW!

"The 4K topic has already been covered." That's one way to get out of it. ;)

I'm not really sure how to take this comment. But no, I'm not joking.

You'll work it out one day. :)
 