Just upgrade Intel CPU or get AMD CPU/mobo combo for Plex server?

Hey guys, I was searching a little bit but can't really find an answer for my specific question. So here's the backstory: I built a low-end Intel-based server a few years ago with an i3-6100/MSI B150M Pro-VD/8GB DDR4, back when I was just storing stuff on it and streaming to my HTPC through VLC. Then I moved to Plex and it still worked fine since I only had or streamed 1080p content. We just recently moved and finally bought a 4K TV, and now with a few movies in 4K, Plex will buffer for 25 seconds every 10 seconds of play. Checked out the server and the little i3 is pegged at 100%; it just can't handle the load of 4K. All clients use Plex along with Plex on the server itself, so the server is doing all of the work.

So now my situation is: my current mobo can support up to 6th-gen Intel CPUs, and I've seen some i7-6700K chips on here for around $225 shipped (but I'd possibly need a cooler), or I found an AMD combo deal at Microcenter with a Ryzen 5 2600X/ASRock AB350 Pro4 for $240 after rebate that comes with a cooler, and I could reuse my RAM.

Which do you think is the better option? 98% of the time the server is streaming to 1 client, the HTPC connected to the 4K TV, but it could possibly be streaming to up to 4 clients, with only one 4K TV (as of right now anyway, lol).
 

A newer-gen i7 would be better, one that would allow hardware encoding using Intel Quick Sync. I am not sure if Plex supports AMD hardware encoding yet; you should check the Plex forums. But first, the issue may be with your HTPC: can it direct play 4K content? It seems like your Plex server is transcoding the 4K movies into something your HTPC can handle. 4K through Plex or Kodi is a pain right now, especially if it is HDR.
 
Ok, I did a little more testing. The HTPC cannot direct play 4K; it has an even older i3-3225 using only the onboard graphics, which doesn't support 4K. I don't know why I forgot about that. So I downloaded the Plex app on my TV and that played 2 4K movies just fine, and I checked the CPU usage on the server while they were playing: there was barely a jump from idle, maybe 5-7% total usage. Maybe Plex trying to play 4K to something that doesn't support it is putting a big load on the server CPU. I tried to play the same 4K movies on a PS4 with the Plex app connected to a non-4K HDTV, and that buffered a lot as well and also pegged the server CPU at 99%.

So it seems like the server can deliver the 4K content without issue; the problems lie on the client side. I think I'll try to add a GT 1030 to the HTPC and see what happens.
 

I have a GT 1030 in my HTPC and it does not handle 4K content. From what I have read, a 1050 or 1060 is needed, and you must use Windows for HDR.
 
Damn, when I searched for which card to use for 4K, that was the one recommended. I'll check out some 1050 or 1060 cards, and I do use Windows on the HTPC.
 
Easiest thing for good HDR playback is to just buy an Nvidia Shield. I did that and never looked back.
 

I agree, I have 2, but HDR on the Shield is still not perfect. You must switch the color profile each time you switch from SDR to HDR.
 

From what I have read and have experienced with my Shield TVs and HDR content, you must change the color-space setting to enable HDR, but if you leave it set for HDR, SDR content is dim and washed out, so you need to manually change it back. Supposedly Nvidia is working on auto color-space switching in a future build.
 
I use a 1050 and it works great. My next HTPC will be a Shield though; I'm done upgrading my ancient PC, which mostly gets used to play 1080p content and Steam games.
 
Well, I've been messing with this a little more, and now the same movie I watched through the Plex app on the TV is not playing, while the CPU load on the server is still minimal. The movie stutters a lot and then the Plex app becomes very laggy. I'm wondering if it's the Plex app that is having issues.
 
If you're playing an original-size video on a local network, Plex doesn't transcode it at all (likely <5% CPU usage); the viewing side does all the processing, so if your viewer isn't beefy, that might be what's bogging down.
If the viewer is good, your big bottleneck at that point is either your network (if wireless) or how fast your storage can serve the media (since the transcoding isn't server-side).

Personally, using the app as my primary viewer has fixed more issues for me than it's caused.
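To put the network-vs-storage point above into rough numbers, here's a back-of-the-envelope sketch. All the bitrates are illustrative assumptions, not measurements from this setup; check your own files with your player's stats overlay.

```python
# Rough check: can a given link sustain a direct-played stream?
# Bitrates below are illustrative; real 4K HDR remuxes often run 50-80 Mbps,
# 1080p closer to 10-20 Mbps.

def can_sustain(link_mbps: float, stream_mbps: float, overhead: float = 0.7) -> bool:
    """Assume only ~70% of the nominal link speed is usable in practice."""
    return link_mbps * overhead >= stream_mbps

print(can_sustain(link_mbps=1000, stream_mbps=70))  # wired gigabit: True
print(can_sustain(link_mbps=72, stream_mbps=70))    # marginal Wi-Fi link: False
```

A marginal Wi-Fi link is exactly the kind that plays fine one day and buffers the next, since the usable rate moves with interference.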
 
Everything in my home network that was originally supposed to be streaming is connected wired, but I hooked up the TV wirelessly since the router is right behind the TV and I didn't expect to be using it to stream. That does explain why the same movie worked fine one time and not later on; the signal must have been good the first time and then bad later for some reason. I hooked up the TV with an ethernet cable and every 4K movie is playing fine now, even ones that I couldn't get to play at all. Thanks for all the help and suggestions, guys.

I'm still not sure if I want to upgrade my HTPC with a new video card that does 4K, grab a low-end CPU/mobo combo that will do 4K, or just use the Plex TV app for 4K viewing and keep the HTPC the way it is for 1080p viewing.
 
I put a 1050 Ti into my Plex server and it helps out with a few of the streams. I also have a Ryzen 1700X in it, and I can tell you certain 4K transcoding takes that CPU to 80%+ usage across 16 threads. Best bet is still direct play; HW transcoding can sometimes have compatibility issues on certain devices.
 

To my knowledge they still haven't fixed the transcoding tone-mapping problem, where 4K looks like crap transcoded due to the color reduction of the HDR.
 
Oh okay, I don't have anything for HDR, so it's just nice to offload some of the work.
 
2600X vs 6700K is a no-brainer. There are some things to hate on Ryzen for, but transcoding and Plex is one of the areas it can shine. Most of the time you don't actually have to transcode anything. My Plex server is dual X5660s in an R710 with a lot of RAM. I wish I'd gone C2100 though, for the added drive bays. I have not popped a 1050 Ti in, but I have plenty of them sitting around right now. I use the system for a bit more than just Plex and have a household of 7, so the few random stutters I have are hard to pinpoint. Most of the time it's when I'm using the CPU to process commercial skipping from my Plex DVR.
 
My 2 cents: I believe the general rule for Plex 1080p playback is to have a server with a CPU PassMark score of over 2,000 per 1080p stream. For 4K, the general rule is a 4,000 PassMark score per 4K stream. Not sure about HDR. I have an Intel Xeon E5-2650 in a napp-it AIO config with Plex running on a Ubuntu Server VM. This setup gets a 10,000 PassMark for the full CPU, so I only allocate 4 virtual CPUs to the VM, giving me roughly a 5,000 PassMark for the 1-2 Plex streams in the household.
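For anyone sizing a box against that rule of thumb, the arithmetic is simple enough to sketch. This assumes the ~2k-per-1080p / ~4k-per-4K PassMark figures quoted above, and it only matters when the server is actually transcoding (direct play barely touches the CPU):

```python
# PassMark rule-of-thumb for simultaneous Plex *transcodes*.
# Figures are the community rule of thumb quoted above, not official specs.

PASSMARK_PER_1080P = 2000  # per simultaneous 1080p transcode
PASSMARK_PER_4K = 4000     # per simultaneous 4K transcode

def required_passmark(streams_1080p: int, streams_4k: int) -> int:
    return streams_1080p * PASSMARK_PER_1080P + streams_4k * PASSMARK_PER_4K

# The OP's worst case: one 4K stream plus three 1080p streams.
print(required_passmark(streams_1080p=3, streams_4k=1))  # 10000
```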
 
My personal experience is that the Plex app on my Vizio TV sometimes bogs down, but the HTPC with an RX 560 plays the media fine; I also have a few Roku 3s that do better than the TV app. One thing I did notice over the years is that the Plex DB gets jacked sometimes and needs some cleanup-and-optimize love, but that is mainly for slow title loading when scrolling through movies/TV shows. Take a look at the logs when you get the weird behavior to see if you can pinpoint the issues. Also, having shared my server with a lot of friends, sometimes looking at the server when it's acting weird will show a bunch of external streams happening. To the original question: I would think a cheap 4-core LGA1151 CPU might be a better value prop than the full platform upgrade.
 

I believe that is only if it has to transcode. If the playback devices can play without transcoding, then the only limit on concurrent streams is the drive(s) and network.
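Following on from that, the direct-play ceiling can be estimated by taking the slower of disk and network and dividing by per-stream bitrate. The numbers below are hypothetical (a single HDD and gigabit LAN assumed):

```python
# For direct play the limits are storage throughput and network bandwidth,
# not CPU: take the tighter of the two and see how many streams fit.

def max_direct_play_streams(disk_mbps: float, link_mbps: float,
                            stream_mbps: float) -> int:
    """All values in megabits per second; returns whole streams."""
    return int(min(disk_mbps, link_mbps) // stream_mbps)

# A single HDD (~1 Gbps sequential) on a gigabit LAN, with 70 Mbps 4K streams:
print(max_direct_play_streams(disk_mbps=1000, link_mbps=1000, stream_mbps=70))  # 14
```

In practice seek contention from several simultaneous readers will drag a single HDD well below its sequential number, so treat this as an upper bound.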

I recently tried to play a 4K movie on my setup and it wouldn't play; I just had a black screen and no audio, but the Plex server showed that the media was playing.
Converted the 4K video to Apple TV 4K spec and it still didn't play.

[Attachment: plex-forrest-gump-4k-apple-tv.jpg]
 
I have the same problem with 10-bit 4K; mine will only play if it's 8-bit. Haven't had the time to figure out why.
 
Ever figure out the fix? I have a 4K TV on the way and use the same setup. I experienced the same black-screen issue in Plex on my ATV 4K when I tested some 10-bit 4K HDR media on my 1080p screen. Chalked it up to just needing a 4K TV to see it.
 

Haven't figured it out or messed with it since that day. I'm not really concerned, as I have no desire at the moment to store 4K movies: they are just too large, I am running out of space, and my TVs are small, so 720/1080p looks just fine for me.
I was just curious to try a 4K rip.
 
buy a shield

game over

I have had a lot of issues getting my ATV 4K to work with my Plex server/LG 4K TV. I am starting to lean towards Verge on this one re: the Shield. I have heard good things.
 
Are you having issues with 4K media, or just Plex and the ATV 4K in general?
I don't have any issues with the Apple TVs and Plex here for 1080p or lower, just the issue with playing that one 4K movie I downloaded.

Zoom Player was pegging the CPU on my gaming machine while playing that file.

[Attachment: forrest-gump-stressing-system.jpg]


MPC-HC plays it with only 20% CPU usage, so whatever codecs are configured in Zoom Player aren't ideal for 4K HEVC.

[Attachment: forrest-gump-stressing-system-2.jpg]
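A quick way to tell whether a file is the 10-bit HEVC flavor that trips up older decoders is to look at its stream metadata (e.g. from ffprobe's JSON output). A sketch of that check; the sample dict below is hand-written, not real output from this file:

```python
# Flag 10-bit HEVC, the profile that commonly breaks hardware/software decoders.
# The dict mimics the shape of ffprobe's per-stream JSON fields, but the values
# here are made-up examples.

def is_10bit_hevc(stream: dict) -> bool:
    return (stream.get("codec_name") == "hevc"
            and stream.get("pix_fmt", "").endswith("10le"))

sample = {"codec_name": "hevc", "pix_fmt": "yuv420p10le", "width": 3840}
print(is_10bit_hevc(sample))  # True
```

If a player chokes on files this flags but handles 8-bit HEVC fine, it's the Main 10 profile support (or lack of it) that's the problem, not 4K resolution itself.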
 

Just had issues with the Plex server and the ATV 4K in general. It was mostly due to the Plex server being a VM on my ESXi host. It's still flaky, but I believe that has to do with the VM using a Plex library that is on NFS, as well as not being able to use HW acceleration in Plex. Either way, I chalked it up to that in my mind; it probably isn't even the real reason, but you win some, you lose some.
 

Got my LG OLED this week, and Plex direct streams 4K 10-bit HDR stuff to the Apple TV 4K without any issue. Very happy with this setup! Wondering if having the i3-8100 in my NAS helps? (Intel Quick Sync)
 
It's all about the client. Not a good idea IMO to recommend putting money into the server if it's one or the other. Who wants to transcode anyway?

I used HTPCs throughout my house for over a decade, recently switched to a Shield, and haven't looked back.

I do find it interesting that, for all the money you need to spend right now to get an HTPC to play 4K, my TV can do it great off a flash drive with whatever processor it has in it.
 

Not sure how Plex works, but I'd assume if the device can natively handle the media, it wouldn't transcode anything, which might be why it works better. Regardless, you don't want to transcode UHD stuff because it takes it down to 8-bit, with no wide color gamut and no HDR :(
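The "down to 8-bit" part is easy to see in miniature: HDR video stores 10-bit samples (0-1023), and quantizing to 8-bit (0-255) collapses every four neighbouring code values into one, which is where banding in smooth gradients comes from. A toy illustration (real tone mapping involves much more than this precision loss):

```python
# Toy illustration of 10-bit -> 8-bit quantization loss.
# Real HDR-to-SDR tone mapping also remaps transfer curve and gamut;
# this only shows the bit-depth side.

def to_8bit(value_10bit: int) -> int:
    return value_10bit >> 2  # drop the two least-significant bits

# Four distinct 10-bit shades collapse into a single 8-bit value:
shades = [512, 513, 514, 515]
print([to_8bit(v) for v in shades])  # [128, 128, 128, 128]
```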
 