So what's the minimum CPU for Hi10p playback?

Apologies if this has been covered before, but my searches are drawing a blank.

What would be the lowest-end Intel CPU one could go with in order to decode Hi10p content? I'm looking at something in the Sandy Bridge family, say a G630(T). My intention is to build a tiny, simple box for browsing, e-mail, and maybe basic word processing, all of which is easy. But I will also likely use it for occasional basic HTPC duty. I hear Atoms can't pull it off, and I'm not looking to get a quad-core behemoth, as it'd be pointless overkill.

Does the software used make a difference? I tend to stick to the CCCP (which is MPC and FFDShow) and Zoomplayer.
 
With the rig in my sig, I never noticed anything major with Hi10p @ 1080p. The CPU always seems to sit near idle, with the GPU at idle too. However, I do use MPC-HC with no special packs.
 
I'm running the CPU in my sig, which is lower-end than the G630, and it works fine for high-def. I do have an old GPU; I think my CPU would handle it without one, but I haven't tried.
 
Hi10p isn't hardware accelerated yet. So it's all CPU for now. :(

But most CPUs are more than capable of handling it. Basically, if your rig can play 1080p content without H/W assist, then it should be able to play Hi10p.
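If you want to sanity-check a specific machine before buying, here's a rough Python sketch of how I'd test it (assuming ffprobe and ffmpeg from FFmpeg are on the PATH; the file name is just a placeholder for whatever Hi10p sample you have). It confirms the stream really is High 10 profile and then times a pure software decode:

import subprocess

SAMPLE = "sample_hi10p.mkv"  # placeholder: any Hi10p file to test with

# Ask ffprobe for the video stream's profile and pixel format.
# A Hi10p encode shows profile "High 10" and a pix_fmt like yuv420p10le.
probe = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=profile,pix_fmt",
     "-of", "default=noprint_wrappers=1", SAMPLE],
    capture_output=True, text=True, check=True)
print(probe.stdout.strip())

# Decode the whole file in software as fast as possible and throw the
# frames away; -benchmark makes ffmpeg print CPU time vs. real time.
# If it decodes comfortably faster than real time, playback should be OK.
subprocess.run(["ffmpeg", "-benchmark", "-i", SAMPLE, "-f", "null", "-"],
               check=True)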
 
So, it's thread hijacking time. :)

Where do you folks find Hi10p-encoded media? Are we talking about things you've encoded yourself, or has some major release group switched to this?

Just curious.
 
Actually I have no idea. I heard it's gaining popularity in some areas and I wanted to make sure I don't just fall a bit below what's needed - that would seriously suck. It's also the reason why I haven't been able to test if my C2D can indeed handle Hi10p. I'm migrating away from this box, but it might have been a useful guideline.

Yeah, heard there's no GPU acceleration, though in a way that simplifies things. At any rate I'm going with Intel's embedded video for this one.

But I was under the impression that Hi10p needs more power than "regular" 1080p content. Unless that's coming from people who were dependent on their GPU and never really had a CPU powerful enough to handle 1080p to begin with....
 
From what I hear it's mostly popular with anime rips. People claim the 10-bit rips are more accurate and higher quality than regular 8-bit encodes at the same or slightly smaller file sizes. From my own experience converting a non-anime 10-bit rip to 8-bit (since my Zacate, and XBMC at the time, didn't support 10-bit), I noticed no difference, or at least not enough of one in motion, between the 8-bit and 10-bit versions at roughly the same file sizes.
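For anyone else stuck on hardware that can't handle 10-bit, the conversion itself is simple enough to script. A rough sketch of how I'd do it (assuming an ffmpeg build with libx264; the file names and CRF value are just placeholders to tune to taste):

import subprocess

SRC = "episode_hi10p.mkv"  # placeholder: 10-bit source
DST = "episode_8bit.mkv"   # placeholder: 8-bit output

# Decode the 10-bit source and re-encode to ordinary 8-bit 4:2:0 H.264.
# -pix_fmt yuv420p forces the 8-bit pixel format; audio is copied as-is.
subprocess.run(
    ["ffmpeg", "-i", SRC,
     "-c:v", "libx264", "-pix_fmt", "yuv420p",
     "-crf", "18", "-preset", "slow",
     "-c:a", "copy", DST],
    check=True)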
 
It is very popular with the anime scene for two reasons:

  • In anime there tend to be a lot of solid areas and shaded areas of nearly one color. The biggest artifact you see there is banding, which really sucks with 8-bit encodes (a quick sketch of the arithmetic follows the list).
  • Because of that shading/coloring, encoding to 10-bit gives a much "clearer" presentation; it makes a huge difference in shadow scenes.
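The banding point is mostly arithmetic: a slow, dark gradient only gets a handful of 8-bit code values to work with, so each value covers a wide band of pixels, while 10-bit has four times as many steps across the same gradient. A rough illustration in Python (pure back-of-the-envelope, not tied to any particular encoder):

# A dark gradient covering 3% of full range across a 1920-pixel-wide frame,
# quantized to 8-bit vs. 10-bit code values.
WIDTH = 1920
START, END = 0.10, 0.13  # normalized luma range of the gradient

def distinct_levels(bit_depth):
    max_code = (1 << bit_depth) - 1
    levels = set()
    for x in range(WIDTH):
        value = START + (END - START) * x / (WIDTH - 1)
        levels.add(round(value * max_code))
    return len(levels)

for depth in (8, 10):
    n = distinct_levels(depth)
    print(f"{depth}-bit: {n} distinct values -> bands ~{WIDTH // n} px wide")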

As for the anime scene, right now the 10-bit encodes are still "large" because they are starting from 10-bit raws! At least Japan decided not to suck. However, when you take an 8-bit encode and re-encode it to 10-bit, the size savings are massive (roughly 1/5 to 1/10 the size) with no discernible loss of quality. The decoders people use are starting to become "standardized", and thus the encode parameters are as well.

If they started putting 10-bit HW decoders in GPUs, I think you would still see a pretty massive shift fairly quickly. For people who stream video, imagine your bandwidth needs dropping 50%, or even 75%. It would be effing huge. This will be a reasonable test for me when Haswell comes out; I want to see how much more battery life a 10-bit CPU decode consumes versus an 8-bit GPU decode. Right now the gap is big, but with Haswell being smarter about CPU usage, it might work out to be a much smaller issue. If that holds, we really could see a worldwide shift to 10-bit, with HW decoders following shortly.
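For reference, the 8-bit to 10-bit re-encode mentioned above looks something like the sketch below (assuming an ffmpeg build linked against 10-bit x264, which right now means a separate build since the bit depth is fixed at compile time; file names and CRF are placeholders, and the exact size savings will vary by source):

import subprocess

SRC = "episode_8bit.mkv"    # placeholder: 8-bit source
DST = "episode_hi10p.mkv"   # placeholder: 10-bit output

# Re-encode to High 10 profile H.264. yuv420p10le is the 10-bit 4:2:0
# pixel format; this only works if libx264 was compiled for 10-bit output.
subprocess.run(
    ["ffmpeg", "-i", SRC,
     "-c:v", "libx264", "-pix_fmt", "yuv420p10le", "-profile:v", "high10",
     "-crf", "18", "-preset", "slow",
     "-c:a", "copy", DST],
    check=True)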
 
Ah, OK. I've noticed banding on non-anime cartoons before and always kinda wondered about it. The video I converted was live action, so not a lot of solid areas.
 
Here's a side by side. 10-bit anime colors look a little more vibrant where 8-bit looks a little saturated, but I don't honestly think I would notice that unless I did a side by side.

[Screenshots: 10bit8bitdifferencemovi.jpg and 10bit8bitdifferencestay.jpg (10-bit vs. 8-bit comparison frames)]
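If anyone wants something more objective than eyeballing two screenshots, here's a rough sketch of a per-pixel comparison (assuming Pillow is installed; the file names are hypothetical stand-ins for same-sized frame grabs from the two encodes):

from PIL import Image, ImageChops

# Load two same-sized frame grabs (placeholders for 10-bit and 8-bit captures).
a = Image.open("frame_10bit.png").convert("RGB")
b = Image.open("frame_8bit.png").convert("RGB")

# Per-pixel absolute difference: a mostly-black result means the encodes are
# close; brighter regions show where they diverge (shadows, gradients).
diff = ImageChops.difference(a, b)
diff.save("difference.png")

# Crude single-number summary: mean per-channel difference on a 0-255 scale.
pixels = list(diff.getdata())
mean = sum(sum(px) for px in pixels) / (len(pixels) * 3)
print(f"mean absolute difference: {mean:.2f} / 255")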
 
10-bit is supported in H.265, so it's all good. It served its purpose as an experimental feature in H.264, at least getting people used to the concept. Hell, by the time H.265 gets popular, people will be experimenting with 12-bit anyway, which is what's being used in cinemas currently.
 
Hi10p will never be hardware accelerated. It missed the boat when it comes to industry support. It is too niche and will soon be replaced by H.265 (HEVC, High Efficiency Video Coding) which DOES have industry support.

http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding

Thanks...didn't know that 265 was that far along.

That side-by-side is a bad example. Not sure why you used it as your only reference to determine the virtues of 10-bit.
 
Yeah, H.265 is supposed to be finalized sometime early next year, and 10-bit and higher color depths are a valid improvement that will come along over the years as processors improve. But it's also important to remember that H.265 will take several years to become mainstream, especially if there's nothing like Blu-ray and ISDB-T to push it. We may still have a couple of years before it's hardware accelerated because of that.
 
Well, the good news is that the people who distribute Linux ISOs and anime will be all over this like white on rice. I could see a full shift in that arena by this time next year.
 
ARMv6 850 MHz system-on-a-chip, $35.
 