Minimum CPU requirement for x264 1080p playback w/o GPU acceleration?

mkygod

Wondering if you can give me some baseline CPUs that would be good at running 1080p x264 material without the aid of the GPU.

My guess would be a low-end Core 2 Duo at 2.0GHz or higher.

I currently use the CoreAVC codec, but if there are better codecs for x264, please let me know.
 
1080p video at 24-30 frames per second

CPU - 2.8 GHz or faster Intel Pentium 4 or equivalent AMD processor

RAM - At least 1GB of RAM

GPU - 256MB or greater video card

OS - Windows 98, 2000, XP, Vista, 7

From the CoreAVC website.

Be aware tho that certain situations will tax those requirements greatly. The bird video is supposedly a good way to test this.
 
lol, that thing has some serious bitrate spikes for a short vid.

OP: the bird vid Zachstar is referencing is the Planet Earth Birds clip, the one that zooms out to show a massive wing of birds in flight.
 
Speaking of which, is there a legal way to get that video?
 
Total Media Theatre 3 Platinum seems to be the most recommended Blu-ray playback software here on the forums. So going off their requirements for 1080p Blu-ray playback:

CPU
Intel
Minimum -
Intel® Core™ 2 Duo CPU E4400 2.00GHz
Intel® Core™ Duo T2600 2.16GHz
Recommended -
Intel® Core™ 2 Duo E6400 2.13GHz
Intel® Core™ 2 Duo T7400 2.16GHz
AMD
Minimum - AMD Athlon™ 64 X2 4600+ 2.4GHz
Recommended - AMD Athlon™ 64 X2 5600+ 2.8GHz

Though it is a tad odd that GPU decoding is not listed as an option.
 
I would say most midrange Core 2 Duos would be fine, paired with an efficient decoder like CoreAVC.
 
Well, TMT is an all-purpose player, so the minimum requirement may not be for Blu-ray. But still, there are almost no video cards you can buy today that won't play Blu-ray.
 
The reason I am not including GPU acceleration is that I want to build a system that just uses the built-in graphics on the motherboard, and I am assuming that the onboard graphics won't have much in the way of GPU acceleration, particularly with the CoreAVC decoder for x264, which is mostly CPU powered except for Nvidia CUDA support.
 
Well, anything Intel from the later variants of the X4500HD series (and newer) will have h264 acceleration, and anything Nvidia since the 8000-series IGPs will have h264 decoding, at least.

If you are using an older Intel IGP, however, then it's only partial MPEG2 decoding for you.

Speaking of which, is there a legal way to get that video?

Not that I know of... I have the Blu-ray discs, so I just watch that opening section again.
 
As for something you can still find today, I use a Celeron E3200, which can be found for around $40.

As far as a test goes, I would use Big Buck Bunny instead of the bird video (also known as killasample). It's free and legal, and killasample is more like a stress test than a benchmark. What throws a lot of systems off is how the clip's bitrate spikes, sometimes higher than what's possible on a Blu-ray disc.
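If you want to see those spikes for yourself, here's a rough sketch of a profiler (assumptions mine: ffprobe is on your PATH, and the filename is just a placeholder for whatever clip you're testing):

```python
#!/usr/bin/env python3
# Rough per-second bitrate profile of a clip's video stream.
# Assumes ffprobe is installed; CLIP is a placeholder filename.
import subprocess
from collections import defaultdict

CLIP = "test_clip.mkv"  # placeholder: point this at your sample

# Ask ffprobe for the timestamp and byte size of every video packet.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "packet=pts_time,size", "-of", "csv=p=0", CLIP],
    capture_output=True, text=True, check=True,
).stdout

bytes_per_sec = defaultdict(int)  # whole second -> total bytes
for line in out.splitlines():
    pts_time, size = line.split(",")[:2]
    if pts_time == "N/A":
        continue  # some packets carry no timestamp
    bytes_per_sec[int(float(pts_time))] += int(size)

for sec in sorted(bytes_per_sec):
    print(f"{sec:4d}s  {bytes_per_sec[sec] * 8 / 1e6:6.1f} Mbit/s")
print(f"peak: {max(bytes_per_sec.values()) * 8 / 1e6:.1f} Mbit/s")
```

Run it on Big Buck Bunny and then on killasample and the difference in the peaks should be obvious.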
 
...?

Birds is from a Blu-ray disc. It shows what the worst case can be, just so you know the limits.
 
There are various versions of it. The Killa Sample one I'm referring to, which a lot of people like to use as a benchmark, is actually a piece of a badly encoded file that does not follow any established profile and actually spikes at some points to bitrates higher than what a Blu-ray disc can produce.

If you know it's ripped correctly from a Blu-ray disc, then it's fine. If it's Killa Sample, it's overkill and will cause you to waste money.
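For context, a quick sketch of the level limits in question (the per-level numbers are the Main Profile maximums from the H.264 spec's level table, quoted from memory; High Profile streams are allowed 1.25x these):

```python
# Max video bitrate per H.264 level, in Mbit/s (Main Profile VBV limits;
# High Profile is allowed 1.25x these numbers).
H264_MAX_MBPS = {"3.1": 14, "3.2": 20, "4.0": 20, "4.1": 50, "5.0": 135, "5.1": 240}
HIGH_PROFILE_FACTOR = 1.25
BLURAY_VIDEO_CAP_MBPS = 40  # Blu-ray caps video bitrate regardless of level

for level, mbps in H264_MAX_MBPS.items():
    print(f"Level {level}: {mbps:5.1f} Mbit/s Main, "
          f"{mbps * HIGH_PROFILE_FACTOR:6.2f} Mbit/s High")
print(f"Blu-ray video cap: {BLURAY_VIDEO_CAP_MBPS} Mbit/s")
```

So a clip that spikes past the Level 5.1 numbers is asking far more of your CPU than any real disc ever will.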
 
haha, I was wondering why I got so much choppy video on that clip.
 
The bitrate needed to produce a visually identical h264 re-encode from BD source files is going to, at times, peak much higher than the source files themselves. However, you will still have a smaller file size at the end. x264 is an art as far as I'm concerned.
 
It's an art, yes, but a bitrate that exceeds Level 5.1? Even Blu-ray does not exceed 4.1, and I believe ATI didn't even go past 4.1 until recently. I've even found a thread of a guy thinking something's wrong because his i7 dropped frames trying to play it: http://forum.xbmc.org/showthread.php?t=60625

http://samples.mplayerhq.hu/V-codecs/h264/

The above link has the test files that MPlayer uses for testing, and pretty much all of them are things you'll run into in the real world. Do what you want, but personally I wouldn't sweat it if the birds clip dropped a frame or two on whatever you use.
 
I had an E4400 and it couldn't handle 1080p content with CoreAVC worth a crap at stock speeds.

Integrated video can handle x264 just fine, if it is ATI or Nvidia.
 
What about a low-end i3 mini-ITX based system where the CPU handles all the video processing and there is no onboard video on the motherboard? Do you think it can handle all 1080p content by itself?
 
The "GPU goodies" are in the i3 itself, and your mini-ITX idea is likely an H55 mobo, which does allow HDMI, DVI, and VGA out from the i3.

Be aware tho that for HTPC purposes a cheaper AMD setup will give you more and doesn't seem to have the frame-drop issue.

Any chance for http://www.newegg.com/Product/Product.aspx?Item=N82E16813131659

And?

I would personally use it with this 45W chip: http://www.newegg.com/Product/Product.aspx?Item=N82E16819103897

Feature chart http://images.bit-tech.net/content_...hics-performance-review/igp-compare-table.png

You can note that the 880G does 7.1 LPCM out over HDMI (no bitstreaming, but you really don't need that unless you have a $600+ receiver system).

Oh, BTW, if you are willing to go MicroATX instead, you get a lot better options and cheaper prices: http://www.newegg.com/Product/Product.aspx?Item=N82E16813130295
 
I'm gonna go ahead and agree with most of the people out there in saying that a cheapo GPU is the ideal way to go. Even if it's a 4350 or something super cheap ($25 after rebate is the normal price for the lowest end), it'll decode reasonably well. That's assuming it's an add-in card. If it's anything somewhat recent, then onboard GPUs will decode reasonably well too.
 
Is the 240e really worth $21 more than the 240?

The only review I've found which compares energy use is <http://www.anandtech.com/show/2861/9>, and 100W+ at idle seems really high, so I'm not sure what other components they have in their system or what they are doing, but it doesn't make me trust their figures for my application.
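A back-of-the-envelope sketch for whether the premium pays for itself (every number here is an assumption: the 20W TDP gap standing in for real draw, 8 hours a day of use, $0.11/kWh electricity):

```python
# Payback estimate for the 240e's $21 premium over the 240.
# Assumptions: the 65W-vs-45W TDP gap stands in for actual draw
# (real-world idle gaps will be smaller), 8 h/day use, $0.11/kWh.
watt_gap = 65 - 45
hours_per_day = 8
dollars_per_kwh = 0.11
premium = 21.0

kwh_per_year = watt_gap * hours_per_day * 365 / 1000
savings_per_year = kwh_per_year * dollars_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr saved -> ${savings_per_year:.2f}/yr, "
      f"payback in {premium / savings_per_year:.1f} years")
# -> ~58 kWh/yr, ~$6.42/yr, roughly 3.3 years to break even
```

Under friendlier assumptions (lighter use, a smaller real-world power gap) the payback stretches out even further, so it's a fair question.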
 
100W+ is not high when you consider the fact that the AnandTech system had a GeForce GTX 280, which has a ridiculous 236W TDP.

Anyhow, I was going to go with the i3, but I've been reading that it can't output perfect 23.976fps. It does 24fps instead. Since I'm going to be watching a lot of movies with it, this worries me. Also, Intel's video drivers aren't very robust, and their driver support isn't up to par with the video card makers'.

I'm thinking about getting something like a Celeron E3300 (based on the Core 2 Duo Wolfdale core) instead. At idle it consumes only 20 watts, and the 9300/9400 IGP boards for it have more attractive video processing abilities, with CUDA support that helps with x264/h264 acceleration through CoreAVC. I will be using CoreAVC a lot since almost all my movies are in x264 format.
 
If something is 0.1% faster, you won't be able to tell the difference. Go with the i3.
 
Bare minimum??? My socket 939 A64 X2 3800+ will do it. It struggles at times, and on some videos you basically have to shut down everything but the video player, but it is possible to get smooth playback without dropped frames or A/V sync problems. When I had it overclocked to 2.4GHz it was fine with any 1080p vid I tested and wasn't real particular about background applications. GPU acceleration, when it works, takes care of all of that with better power efficiency, though it's always good to have enough brute-force CPU power to fall back on in case GPU acceleration doesn't want to work with certain files.

edit: Just tried that 42Mbit birds clip from Planet Earth. It's a bit much for my system, but it is fairly smooth, just a couple of hitches and low framerate during playback. I'd be very surprised if a Core 2 or Phenom II X2 over 2GHz couldn't do it. This is using MPC-HC with the EVR Custom renderer. I tried madVR (my favorite), but the birds clip is just too much work with that renderer for my CPU to play back smoothly.

edit2: Wow, those birds clips are harsh. I bumped my CPU back up to 2.4GHz and tested the 20Mbit and 42Mbit clips: the 20Mbit was perfect, the 42Mbit still had playback issues with pausing and low framerates. Thank god I'm upgrading soon, lol.
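If you'd rather have numbers than eyeball the hitches, a crude sketch like this (assuming Python with the third-party psutil package installed) logs total and per-core CPU load once a second while the clip plays in another window:

```python
# Log CPU load once a second while a clip plays; Ctrl+C to stop.
# Assumes the third-party psutil package is installed.
import psutil

print("Start playback, then watch the numbers (Ctrl+C to stop).")
try:
    while True:
        per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks ~1s
        total = sum(per_core) / len(per_core)
        print(f"total {total:5.1f}%   " +
              "  ".join(f"core{i} {c:5.1f}%" for i, c in enumerate(per_core)))
except KeyboardInterrupt:
    pass
```

If a single core pins at 100% right where the bitrate spikes, that's where the pausing and dropped frames come from.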
 
If something is 0.1% faster, you won't be able to tell the difference. Go with the i3.

It's an issue because it can cause judder. It's the same reason why AMD and Nvidia support 59.94Hz along with 60Hz for HTPC use. It may be fine if you don't notice it, but if you do, it's annoying as hell.
 
100W+ is not high when you consider the fact that the Anandtech system had a Geforce GTX280, which has a ridiculous 250w tdp.

Anyhow, I was going to go with the i3, but ive been reading that it can't output perfect 23.976fps. It does 24fps instead. Since i'm going to be watching a lot of movies with it, this worries me. Also, Intel's video drivers aren't very robust and their driver support aren't up to par with video card makers.

I'm thinking about getting something like a Celeron e3300 (based on the core2duo wolfdale) instead. At idle, it consumes only 20 watts, and the 9300/9400IGP boards for it have a more attractive video processing abilities with CUDA support which helps with x264/h264 acceleration through CoreAVC. I will be using CoreAVC a lot since almost all my movies are in x264 format.

Don't depend on CUDA. If your CPU cant handle the task let the normal purevideo take over.
 
It's an issue because it can cause judder. It's the same reason why AMD and Nvidia support 59.94Hz along with 60Hz for HTPC use. It may be fine if you don't notice it, but if you do, it's annoying as hell.
QFT. Most movies are filmed at 24/1.001 (23.976Hz). Played at a flat 24Hz, that works out to one extra frame roughly every 1000 frames, about once every 42 seconds, which is what introduces the judder. If you want the most accurate reproduction then the difference is important. Of course some people won't notice, but if you're a perfectionist or know what you're looking for, it is noticeable.
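The arithmetic behind that, as a quick sketch:

```python
# How often a frame "slips" when 23.976fps film plays at a flat 24Hz.
film_fps = 24 / 1.001      # NTSC film rate, ~23.976 fps
display_fps = 24.0         # what the i3 reportedly outputs

extra_frames_per_sec = display_fps - film_fps   # ~0.024 frames/s of drift
seconds_per_slip = 1 / extra_frames_per_sec     # ~41.7 s between hitches
frames_per_slip = seconds_per_slip * film_fps   # ~1000 frames
print(f"one repeated frame every {seconds_per_slip:.1f} s "
      f"(~every {frames_per_slip:.0f} frames)")
```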
 
Frankly it is stunning how Intel let the issue ship considering how seriously they took the rest of the setup. Full decode + bitstreaming... Why the hell did they let such a showstopper through?
 