New Video Compression Standard Doubles Efficiency

Firefox added H.264 support earlier this year. Also, the 8K UHD footage NHK was using for the Olympics wasn't compressed at all, so you were looking at around 4 terabytes of data per hour. Maybe this will be the standard that brings 4K to BDXL discs and Blu-ray players. With a 100GB disc and double the compression, you could likely fit a 4K movie on there, possibly even with 3D. Plus the timetable would line up with 4K TVs trickling out in the next year or two on the high-end stuff. Those 70- and 80-inch LCD sets could really use higher than 1080p, as you can see the pixels and compression artifacts pretty clearly at any kind of closer seating distance.
 
He said GPU-assisted decoding, not encoding. GPU decoders are lossless. And GPU decoding is important: HEVC is substantially more complex and requires a lot more power to decode, so it will tax a CPU quite a bit otherwise.


Also, closed standards suck; that's why we don't have H.264 support in Firefox or Opera. They cannot afford it.

Doh, thanks for that :) Have encoding on the brain from ripping crap all week :) I've also been changing a lot of stuff from MKV to MPEG-4, as it makes things play a little better on my HTC One X.
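
(Side note: if the One X already supports the codec inside the MKV -- usually H.264/AAC -- a straight remux into MP4 is often enough, no re-encode and no quality loss. A minimal sketch, hypothetical filenames, assuming the streams are MP4-compatible:

ffmpeg -i input.mkv -c copy output.mp4

If the phone still chokes after that, the problem is the codec or profile level, not the container, and a real re-encode is needed.)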
 
But "efficiency" is also related to throughput and overhead. If the decode takes less time, it doesn't matter if the file size is bigger. If it's a smaller file (thus more size-efficient) but takes 3x the CPU overhead to decode (thus less efficient), it's not really an improvement. (I broke the news rules and read the article, and it didn't say :eek:)


Yes! Someone that understands :p.


While having highly compressed video at better quality is great, the hidden cost is processing power. It took quite a few years for HD to become the norm in its compressed format because of the raw CPU power required to decode the video. This new codec might not have as big an impact as H.264 did a few years back because quad cores are almost the norm. My 2006 dual-core laptop struggled with 720p encodes and still does.

The hidden cost of compression of any kind is decompressing it when it needs to be used.
 
Yes! Someone that understands :p.


While having highly compressed video at better quality is great, the hidden cost is processing power. It took quite a few years for HD to become the norm in its compressed format because of the raw CPU power required to decode the video. This new codec might not have as big an impact as H.264 did a few years back because quad cores are almost the norm. My 2006 dual-core laptop struggled with 720p encodes and still does.

The hidden cost of compression of any kind is decompressing it when it needs to be used.

Yes... a few years ago I made the HUGE mistake of buying a laptop (Acer 5810TZ) that my wife wanted, with a single-core Intel CULV CPU; she wanted the very long battery life. That thing is seriously the biggest fuggin turd ever. It struggles to play Flash. I put an SSD (m4) in it, but since it was also only SATA 1 spec (lol), it was like trying to polish a turd. It sped it up, but when the CPU is constantly choking there is only so much it can do, lol.

Moral of story? Do not drink a lot of delicious craft beers before your wife wants to buy something.
 
He said GPU-assisted decoding, not encoding. GPU decoders are lossless. And GPU decoding is important: HEVC is substantially more complex and requires a lot more power to decode, so it will tax a CPU quite a bit otherwise.


Also, closed standards suck; that's why we don't have H.264 support in Firefox or Opera. They cannot afford it.

No, it's not that they can't afford it; in fact, it's free right now. The problem is that in the future it may not be free. Once 2016 comes around, it won't be royalty-free for web usage, so rather than support it only until 2016, they have decided not to support it at all (though this might be changing) and avoid running into legal issues over it.
 
No, it's not that they can't afford it; in fact, it's free right now. The problem is that in the future it may not be free. Once 2016 comes around, it won't be royalty-free for web usage, so rather than support it only until 2016, they have decided not to support it at all (though this might be changing) and avoid running into legal issues over it.

And this is why patent-encumbered formats are so dangerous.

H.264 is a cancer attempting to infiltrate the web. The MPEG-LA hopes that websites use H.264. Once H.264 has spread far enough, they come in and start charging royalties. The W3C's biggest mistake was not specifying a mandatory codec for the <video> tag; they tried to mandate Theora but in the end, rolled over to their corporate masters.
 
Yes! Someone that understands :p.


While having highly compressed video at better quality is great, the hidden cost is processing power. It took quite a few years for HD to become the norm in its compressed format because of the raw CPU power required to decode the video. This new codec might not have as big an impact as H.264 did a few years back because quad cores are almost the norm. My 2006 dual-core laptop struggled with 720p encodes and still does.

The hidden cost of compression of any kind is decompressing it when it needs to be used.

There's no magical solution that can increase the compression ratio without additional workload. That data cannot magically disappear and reappear.

It's about maximizing the resources we have. Processing power continues to increase while internet bandwidth is hardly moving (and monthly bandwidth caps don't help either). The logical progression is to find more powerful means of compressing the data to make it practical to stream higher-quality content.
 
My predictions on video compression:
near future: reconvert the entire video into approximations with vector graphics and vector animation (basically all movies will be converted to look like flash animation). Audio will be approximated via midi (it's gonna be hard to hear the words!)
far away future: compress entire 2 hour movies into a single byte, basically giving the movie only 256 combinations to work with (there will only be 256 movies in the future, so it will be ok)
 
.mkv is a container/wrapper =p It has nothing to do with the video stream, which in MKV's case can be nearly anything: the old DVD-era H.262 (MPEG-2), Microsoft's VC-1, and of course H.264/AVC, which is probably what you mean. The container has a little bit to do with CPU overhead and file size, but that's about it. This is progress; no reason not to say it would be better.
I told this to someone at a church conference. This someone was operating the audio mixer, and his friend was up there as well. The building had over 50ft of unamplified VGA cabling which split to two projectors. My old consumer-grade Lenovo laptop, with its low-end Intel shared graphics chip, couldn't handle pushing video out over this cabling to the projectors, so it resulted in very choppy, low-framerate playback of the 30fps videos I was trying to play, which that someone had asked me to put together for them. Earlier that morning I had successfully tested the videos before I left the apartment we were staying at, to make sure the video worked.

But anyhow, the audio operator kid was telling me to rename the extension of my video to .AVI "because the AVI format ALWAYS works no matter what". I tried to explain to them that AVI is a container/wrapper (kind of like a ZIP archive) and renaming wouldn't make a squat of difference. I had tried Windows Media Player, MPlayer, VLC, QuickTime, Nero ShowTime, Media Player Classic -- and all of them presented the same symptoms. As I was trying different things and monitoring CPU, RAM, and shared GPU usage -- doing my diagnostics -- I narrowed it down: the problem wasn't the CPU, the RAM, the HDD, or the integrated audio; everything was pointing to the GPU, based on the rendering behavior. Everything else on the system was working fine; it's just that my laptop's Intel graphics chip was way too weak to push 480p+ out over 50ft of unamplified cabling that split to two projectors.

So when I explained that AVI is merely a container (like WMV, MKV, MP4, etc.) and also explained why the things I was doing indicated that the problem was with my laptop's hardware, the next line came out of the guy's friend: "DUDE, HE JUST WENT THROUGH 4 YEARS OF COLLEGE IN VIDEO EDITING FOR A BACHELOR'S. ..." He was leaning on the fallacy of establishing credibility, authority, and superiority from a credential. FYI, I am one of those kids that got a computer at age 5, and from there it took off by itself. I am a user of Avisynth and a variety of other tools, and I know what is going on, and why, when I do things. These two kids were around my age +/- 2 years. We're talking early 20s.

I couldn't believe what I had just heard and what they were trying to do. They obviously weren't computer-literate, and had quite a bit of arrogance, ignorance, lack of openness, lack of humility / too much pride, naivety, stuff like that. :eek:
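
Back on the container point: a quick stream dump makes the wrapper-vs-codec distinction obvious. A minimal sketch, hypothetical filename (ffprobe ships alongside FFmpeg):

ffprobe -v error -show_entries stream=index,codec_type,codec_name input.mkv

The extension (.mkv, .avi, .mp4) only says how the streams are boxed up; the codec_name of the video stream is what actually determines whether a given machine can decode it comfortably.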
 
I told this to someone at a church conference. This someone was operating the audio mixer, and his friend was up there as well. The building had over 50ft of unamplified VGA cabling which split to two projectors. My old consumer-grade Lenovo laptop, with its low-end Intel shared graphics chip, couldn't handle pushing video out over this cabling to the projectors, so it resulted in very choppy, low-framerate playback of the 30fps videos I was trying to play, which that someone had asked me to put together for them. Earlier that morning I had successfully tested the videos before I left the apartment we were staying at, to make sure the video worked.

But anyhow, the audio operator kid was telling me to rename the extension of my video to .AVI "because the AVI format ALWAYS works no matter what". I tried to explain to them that AVI is a container/wrapper (kind of like a ZIP archive) and renaming wouldn't make a squat of difference. I had tried Windows Media Player, MPlayer, VLC, QuickTime, Nero ShowTime, Media Player Classic -- and all of them presented the same symptoms. As I was trying different things and monitoring CPU, RAM, and shared GPU usage -- doing my diagnostics -- I narrowed it down: the problem wasn't the CPU, the RAM, the HDD, or the integrated audio; everything was pointing to the GPU, based on the rendering behavior. Everything else on the system was working fine; it's just that my laptop's Intel graphics chip was way too weak to push 480p+ out over 50ft of unamplified cabling that split to two projectors.

So when I explained that AVI is merely a container (like WMV, MKV, MP4, etc.) and also explained why the things I was doing indicated that the problem was with my laptop's hardware, the next line came out of the guy's friend: "DUDE, HE JUST WENT THROUGH 4 YEARS OF COLLEGE IN VIDEO EDITING FOR A BACHELOR'S. ..." He was leaning on the fallacy of establishing credibility, authority, and superiority from a credential. FYI, I am one of those kids that got a computer at age 5, and from there it took off by itself. I am a user of Avisynth and a variety of other tools, and I know what is going on, and why, when I do things. These two kids were around my age +/- 2 years. We're talking early 20s.

I couldn't believe what I had just heard and what they were trying to do. They obviously weren't computer-literate, and had quite a bit of arrogance, ignorance, lack of openness, lack of humility / too much pride, naivety, stuff like that. :eek:
EDIT: Oh, yeah, I remember what he said immediately after that first line. It went like this:

"DUDE, HE JUST WENT THROUGH 4 YEARS OF COLLEGE IN VIDEO EDITING FOR A BACHELOR'S. HE IS AN EXPERT/PROFESSIONAL AND KNOWS WHAT HE'S TALKING ABOUT..."

I know that the VGA was unamplified because I traced the cabling. I can't remember the details of how it split to the two projectors, but it was an extremely simple and small solution -- it very clearly didn't amplify the signal. It was probably around 50-75ft in length from where my laptop plugged in to one of the projectors. Worst case, it was no more than 100ft.
 
I don't know if you guys have seen it yet... but one guy has figured this out already using x264.

Look up the near-perfect 720p Blu-ray rips by YIFY.

The movies average 700-800MB, compared to everyone else's 2+ gigs... with little to no quality loss.

I am still trying to duplicate his method.

They definitely look good for the size, but I definitely can't call them near-perfect Blu-ray rips. Artifacts can easily be seen all over individual video frames.
 
I don't know if you guys have seen it yet... but one guy has figured this out already using x264.

Look up the near-perfect 720p Blu-ray rips by YIFY.

The movies average 700-800MB, compared to everyone else's 2+ gigs... with little to no quality loss.

I am still trying to duplicate his method.

Dude, are you a pirate? 'Cus if so, I do not approve...
 
Yes! Someone that understands :p.


While having highly compressed video at better quality is great, the hidden cost is processing power. It took quite a few years for HD to become the norm in its compressed format because of the raw CPU power required to decode the video. This new codec might not have as big an impact as H.264 did a few years back because quad cores are almost the norm. My 2006 dual-core laptop struggled with 720p encodes and still does.

The hidden cost of compression of any kind is decompressing it when it needs to be used.
Well, there are other costs. Lossy compression works by allowing the loss of detail. If it's not excessively compressed, the hope is that the loss isn't perceivable.

This new algorithm may have more clever ways of throwing out detail that are better than other compression methods. But it could also have clever ways of throwing out detail without the immediately detectable artifacts or blur, which is what most people use to judge whether something is over-compressed or not.

I suspect that as long as you keep edges sharp and don't introduce artifacts, you could drain the living hell out of an image's detail and most people wouldn't notice.
 
What is new about this? They started tinkering with new mathematics to create the successor to H.264/MPEG-4 AVC way back in 2005-2007, then in 2009 started real work on defining HEVC. The standard isn't set to be ratified until January 2013, and even then we are still far from a working implementation and device support.

It's still more or less on schedule, but nothing significant has happened recently, so I'm not sure why this is being posted now. This surely isn't news.

It is to me.
 
I told this to someone at a church conference. This someone was operating the audio mixer, and his friend was up there as well. The building had over 50ft of unamplified VGA cabling which split to two projectors. My old consumer-grade Lenovo laptop, with its low-end Intel shared graphics chip, couldn't handle pushing video out over this cabling to the projectors, so it resulted in very choppy, low-framerate playback of the 30fps videos I was trying to play, which that someone had asked me to put together for them. Earlier that morning I had successfully tested the videos before I left the apartment we were staying at, to make sure the video worked.

But anyhow, the audio operator kid was telling me to rename the extension of my video to .AVI "because the AVI format ALWAYS works no matter what". I tried to explain to them that AVI is a container/wrapper (kind of like a ZIP archive) and renaming wouldn't make a squat of difference. I had tried Windows Media Player, MPlayer, VLC, QuickTime, Nero ShowTime, Media Player Classic -- and all of them presented the same symptoms. As I was trying different things and monitoring CPU, RAM, and shared GPU usage -- doing my diagnostics -- I narrowed it down: the problem wasn't the CPU, the RAM, the HDD, or the integrated audio; everything was pointing to the GPU, based on the rendering behavior. Everything else on the system was working fine; it's just that my laptop's Intel graphics chip was way too weak to push 480p+ out over 50ft of unamplified cabling that split to two projectors.

So when I explained that AVI is merely a container (like WMV, MKV, MP4, etc.) and also explained why the things I was doing indicated that the problem was with my laptop's hardware, the next line came out of the guy's friend: "DUDE, HE JUST WENT THROUGH 4 YEARS OF COLLEGE IN VIDEO EDITING FOR A BACHELOR'S. ..." He was leaning on the fallacy of establishing credibility, authority, and superiority from a credential. FYI, I am one of those kids that got a computer at age 5, and from there it took off by itself. I am a user of Avisynth and a variety of other tools, and I know what is going on, and why, when I do things. These two kids were around my age +/- 2 years. We're talking early 20s.

I couldn't believe what I had just heard and what they were trying to do. They obviously weren't computer-literate, and had quite a bit of arrogance, ignorance, lack of openness, lack of humility / too much pride, naivety, stuff like that. :eek:

VGA is an analog standard. Degradation of the signal would not reduce framerate as there are no frames, in the digital sense, with VGA. A degraded VGA signal would result in noise and a degraded picture quality but the speed would be the same.
 
HEVC aims to substantially improve coding efficiency compared to AVC High Profile, i.e. to reduce bitrate requirements by half with comparable image quality, at the expense of increased computational complexity. Depending on the application requirements HEVC should be able to trade off computational complexity, compression rate, robustness to errors, and processing delay time.

HEVC is targeted at next-generation HDTV displays and content capture systems which feature progressive scanned frame rates and display resolutions from QVGA (320×240) up through 1080p (1920×1080) to 4320p (7680×4320), as well as improved picture quality in terms of noise level, color gamut, and dynamic range.

It's about time a new standard got ratified. H.264 is trivial to decode, even in software, on modern machines.
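
To put that "half the bitrate" target in rough numbers (illustrative figures, assuming the 50% claim holds): a two-hour 1080p movie at 8 Mbps in AVC is about 8,000,000 bits/s × 7,200 s ÷ 8 ≈ 7.2 GB of video, so at comparable quality HEVC would aim for roughly 4 Mbps, or about 3.6 GB, before audio.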
 
I don't know if you guys have seen it yet... but one guy has figured this out already using x264.

Look up the near-perfect 720p Blu-ray rips by YIFY.

The movies average 700-800MB, compared to everyone else's 2+ gigs... with little to no quality loss.

I am still trying to duplicate his method.

Trying to duplicate his method? If you want the best quality for the smallest size currently possible, just use the placebo preset on x264. There is no way to get better compression.

Though 1 or even 2 GB is far from a "near perfect" Blu-ray rip. Maybe it looks good on a small computer monitor, but not on my 136" projector.
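
If the goal is simply the smallest file at a given quality, rather than hitting an exact size, a constant-quality encode is easier than reverse-engineering someone's settings. A rough sketch, hypothetical filenames, CRF value to taste (lower = bigger file, better quality):

ffmpeg -i input.mkv -c:v libx264 -preset placebo -crf 20 -c:a copy output.mkv

Worth noting that veryslow typically gets within a hair of placebo's compression in a fraction of the encode time.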
 
Will probably also require more processing power to accomplish, obsoleting many people's HTPCs. :p
 
Zarathustra[H] said:
Will probably also require more processing power to accomplish, obsoleting many people's HTPCs. :p

I run an HTPC with an Intel E6300 (1.86GHz) and so far I've been very pleased with how well it plays everything. I have a quad core waiting for it, but so far the low-spec CPU and passive video card are working wonderfully. I can see how a highly compressed format could, as you say, obsolete my setup real quick, but since it's 2006 tech, I have no problem with that.
 
I'm all for it. I would not mind my computer doing the work if that means the file size is either the same or smaller. The releases I get are already extremely good quality so I can only imagine how they will look with this new standard.
 
VGA is an analog standard. Degradation of the signal would not reduce framerate as there are no frames, in the digital sense, with VGA. A degraded VGA signal would result in noise and a degraded picture quality but the speed would be the same.
Yes, you are right. My unthoughtful mistakes. ;\ I still blame the Intel shared GPU, because everything worked fine without being plugged into their 50ft+ VGA cabling. One of the things I had done was go into msconfig --> Services + Startup, disable all non-Microsoft services, and disable all Startup items. I had even tried a minimal boot. Symptoms were the same across the board. My laptop had 3GB RAM, a 'Pentium Dual-Core' at 2.2GHz, and an Intel graphics chip with 256MB VRAM (it couldn't play anything 3D except games from the '90s and early '00s, and it definitely had problems handling 720p video), and ran Windows XP Professional (a very cleaned-up and carefully fine-tuned installation that outperformed most Windows XP installations; I never had to reinstall the OS in 4-5 years at all -- then I sold my laptop to buy a T61 :D).

It didn't even have anti-virus installed (and no, the problem wasn't caused by infection, nor was the machine infected, guaranteed / you have my word).
 
EDIT: The reason I was inclined to say it had something to do with the cabling is that my educated guess was that, because the cabling was very long and unamplified, my shared Intel GPU was attempting to compensate by either (1) increasing output power over VGA (probably unlikely, as I imagine that would have produced visible 'weak signal' artifacts rather than a very low framerate), or (2) decreasing the refresh rate of the output to provide an artifact-free image.

Another noticeable symptom was that the audio went out of sync with the video. The video would be chugging (low framerate) / skipping frames. Again, I had tested earlier that morning (without being plugged into the 50ft+ unamplified VGA cabling) and the video worked perfectly fine on my laptop. It also worked fine after the event (after I had disconnected from the VGA cabling). The audio operator had his laptop there, and he said he didn't have any of these problems with it (I leaned over and saw an NVIDIA 8800M sticker below his laptop's keyboard -- I said to myself, "go figure, that's why it works for you").



Different topic, but still interesting: with my T61, if I do not have it plugged into wall power with its AC adapter -- i.e. running on battery power only -- I am unable to watch 480p video comfortably most of the time. This happens both at the stock GPU clock and at a 150% GPU core overclock (which, by the way, gets me about a 200% FPS boost in Oblivion -- it's amazing -- and my GPU's stress temperatures are the same as at stock clock). What happens is that every few seconds (4-7 seconds) there is a brief pause of the video entirely (audio included), followed by several skipped frames --> video plays normally --> cycle repeats every 4-7 seconds.

When I plug my T61 into the wall so that it doesn't rely on battery power, no problems whatsoever. This is also more evident with higher-resolution videos (1080p being the worst -- I *HAVE* to plug into wall power, that's how bad it is).

My T61 has an NVIDIA Quadro NVS 140M.
 
Yes, you are right. My unthoughtful mistakes. ;\ I still blame the Intel shared GPU, because everything worked fine without being plugged into their 50ft+ VGA cabling. One of the things I had done was go into msconfig --> Services + Startup, disable all non-Microsoft services, and disable all Startup items. I had even tried a minimal boot. Symptoms were the same across the board. My laptop had 3GB RAM, a 'Pentium Dual-Core' at 2.2GHz, and an Intel graphics chip with 256MB VRAM (it couldn't play anything 3D except games from the '90s and early '00s, and it definitely had problems handling 720p video), and ran Windows XP Professional (a very cleaned-up and carefully fine-tuned installation that outperformed most Windows XP installations; I never had to reinstall the OS in 4-5 years at all -- then I sold my laptop to buy a T61 :D).

It didn't even have anti-virus installed (and no, the problem wasn't caused by infection, nor was the machine infected, guaranteed / you have my word).

You need a bigger ferrite core at each end of the cable to keep the signal from bleeding out. Also could have been that the cable was nicked somewhere and analog frames were seeping out. :p

(kidding aside - I know how frustrating it is when at home/office everything works beautifully then at presentation site not so)
 
I'm not sure why you would drop frames; that does seem weird. Did you test it at home with an external monitor? Maybe it had something to do with outputting to the screen and VGA at the same time. As for your T61, check to see if you have both an Intel and an NVIDIA card installed with Optimus, which means it should use the Intel video for regular tasks and kick on the NVIDIA for most 3D tasks; however, if you are on battery it's not always going to use the NVIDIA, whereas if you are plugged in it should. You can also search Google for "Optimus Test Tools" so you can see when the NVIDIA is on and when it is off. If it doesn't have Optimus, I'm going to guess it has to do with power settings somewhere in Windows.
 
You don't seem to understand the difference between xvid and mkv. They are not comparable.

I never said they were comparable -- they obviously aren't -- just that better compression methods / containers are possible, since it seemed some people didn't believe they were.
 
I'm not sure why you would drop frames; that does seem weird. Did you test it at home with an external monitor? Maybe it had something to do with outputting to the screen and VGA at the same time. As for your T61, check to see if you have both an Intel and an NVIDIA card installed with Optimus, which means it should use the Intel video for regular tasks and kick on the NVIDIA for most 3D tasks; however, if you are on battery it's not always going to use the NVIDIA, whereas if you are plugged in it should. You can also search Google for "Optimus Test Tools" so you can see when the NVIDIA is on and when it is off. If it doesn't have Optimus, I'm going to guess it has to do with power settings somewhere in Windows.
Heh, this was like 3 years ago at least. :D My T61 has a very decent and wonderful GPU compared to the Lenovo G530 consumer-grade laptop I had. Either way, Intel sucks for consumer-grade graphics. Until they go dedicated and can actually compete with AMD and NVIDIA, they're only good for very low-end things. I know that my G530's Intel could handle Half-Life: Source on DX7 better than it could Half-Life on D3D or OGL (OGL was almost a nightmare on that Intel chip for HL1).

Intel is excellent though for when you don't have a dedicated GPU available (whether external or PCI/AGP/PCI-E) or your dedicated GPU hits the can and you need something temporary to get you by. Just think of the old Windows XP days in the early 00s when everything was CPU-rendered until you got GPU drivers installed. My, what a difference. :D
 
Encoding HEVC is going to be a major pain point unless hardware encoding and decoding end up on the CPU (Intel's QuickSync or AMD's Quick Stream) or the graphics card. My bet is on Intel having initial hardware decoding, as they purchased a ton of intellectual property from RealNetworks, which appeared to be very deep into a new codec, at least according to this news article:

http://news.softpedia.com/news/Intel-Buys-Next-Generation-Video-Codec-Patents-from-RealNetworks-248955.shtml

And from their corporate website:
http://www.realnetworks.com/press/releases/2012/intel-to-buy-patents-and-next-generation-video-codec-software-from-realnetworks.aspx

Note that QuickSync can decode the video codecs that are valid in an MPEG Transport Stream. All of those codecs can be found in the Blu-ray format but rarely all on the same disc. My guess is that Intel is going to add QuickSync support for HEVC when it gets close to being finalized. It is expected to be formally ratified in January 2013.

I do not foresee HEVC being added to the MPEG-TS standard as doing so would make newer Blu-ray discs incompatible with current players which already have onboard hardware decoding for MPEG-2, H.264 and VC-1. It may be released and tacked onto HTML5 which will be a real joy to implement, especially as I have yet to find any standard for delivering live video using HTML5. Every implementation that I have seen so far is proprietary.

The most likely scenario is that HEVC becomes part of the MPEG-DASH standard (which has some overlap with HTML5), which is still so far up in the air, at least from my perspective, as to not be a viable solution for at least two years minimum, what with the need to develop hardware decoding, deploy the decoding licenses, have device manufacturers integrate that into new phones, do testing, refine the hardware, and then have it end up on a CPU near you.

Then we get to the encoders for public use about (I'm really guessing here) six months before dedicated hardware appears. FFmpeg will at least enable decoding via libavcodec and if they do not include encoding then someone else will. If it is separate it will be able to be integrated into FFmpeg in a similar fashion to libx264.

The upside is that I know FFmpeg fairly well. The downside is it is one more codec to learn.
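
If and when an HEVC decoder does land in libavcodec, checking whether your FFmpeg build has it should be as simple as something like the following (hypothetical sample filename, and assuming the codec ends up registered under a name such as "hevc", which is a guess at this point):

ffmpeg -codecs | grep -i hevc
ffmpeg -i sample_hevc.mp4 -f null -

The second line is a decode-only run that throws the output away, which is a handy way to benchmark how hard the new codec hits the CPU.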
 
I don't know if you guys have seen it yet... but one guy has figured this out already using x264.

Look up the near-perfect 720p Blu-ray rips by YIFY.

The movies average 700-800MB, compared to everyone else's 2+ gigs... with little to no quality loss.

I am still trying to duplicate his method.

Try something like this:
ffmpeg.exe -i inputfile.ext -vcodec libx264 -threads 0 -b:v averagevideobitrate -bufsize averagevideobitrateplusaudiobitrate -maxrate sixteentimesaveragebitrate -minrate halfaveragevideobitrate -acodec libfaac -ab 192k -ac 2 -preset placebo -profile:v high444 -qmin 0 -qmax 51 -keyint_min 1 -g fpstimesten -y outputfile.mp4

I would also suggest two pass encoding and a long vacation.

Note that hardware decoders like a Roku running Roksbox, and phones, will most likely be unable to play that, as the reference frames will be 16. Never go over 8 when trying to deliver to a device. The "slow" preset is the fastest you should go, which rules out using placebo for encoding unless you hand-edit that command to include the appropriate switches. Know your presets and translate accordingly.

What I have provided should make sense if you know FFmpeg. You should already know how to calculate the proper bitrate or bits-per-pixel density of the files you encode. Normal streaming maxes out at 2x the average bitrate, and the lowest is 1/2 the average bitrate.
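
For anyone who hasn't done it: bits-per-pixel is just bitrate ÷ (width × height × framerate). As an illustrative example with made-up numbers, a 1280×720 clip at 23.976 fps encoded at 2,500 kbps works out to 2,500,000 ÷ (1280 × 720 × 23.976) ≈ 0.11 bpp; push that number too low and the artifacts show up fast.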

I found one guy online that had claimed that he could stream 1080p content at 1Mbps. I found his file, downloaded it and ran MediaInfo against it.

http://www.bwin.nu/

MediaInfo dumped out his encoding settings as he used libx264 to do his encode. I improved on what he did at the time and then brought it back to a sane bitrate.

True, while the video did average about 1Mbps, if you looked at the network it would spike up to around 16Mbps. Hardly ideal for streaming. The file that comes out of the command line above will not work via RTSP to the QuickTime player, if it can even play it at all. RTMP is suboptimal for delivery as well. HTTP in a Flash player (or HTML5) is the best you can hope for.
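
For the two-pass route, the general shape is below -- hypothetical filenames and placeholder bitrate in the same style as the command above; the first pass only writes its stats file while the null muxer discards the actual output (NUL on Windows, /dev/null elsewhere):

ffmpeg.exe -y -i inputfile.ext -vcodec libx264 -preset slow -refs 8 -b:v averagevideobitrate -pass 1 -an -f null NUL
ffmpeg.exe -i inputfile.ext -vcodec libx264 -preset slow -refs 8 -b:v averagevideobitrate -pass 2 -acodec libfaac -ab 192k -ac 2 -y outputfile.mp4

The -refs 8 keeps the output inside what most hardware decoders will accept, per the note above.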
 
Try something like this:
ffmpeg.exe -i inputfile.ext -vcodec libx264 -threads 0 -b:v averagevideobitrate -bufsize averagevideobitrateplusaudiobitrate -maxrate sixteentimesaveragebitrate -minrate halfaveragevideobitrate -acodec libfaac -ab 192k -ac 2 -preset placebo -profile:v high444 -qmin 0 -qmax 51 -keyint_min 1 -g fpstimesten -y outputfile.mp4

I would also suggest two pass encoding and a long vacation.

Note that hardware decoders like a Roku running Roksbox, and phones, will most likely be unable to play that, as the reference frames will be 16. Never go over 8 when trying to deliver to a device. The "slow" preset is the fastest you should go, which rules out using placebo for encoding unless you hand-edit that command to include the appropriate switches. Know your presets and translate accordingly.

What I have provided should make sense if you know FFmpeg. You should already know how to calculate the proper bitrate or bits-per-pixel density of the files you encode. Normal streaming maxes out at 2x the average bitrate, and the lowest is 1/2 the average bitrate.

I found one guy online that had claimed that he could stream 1080p content at 1Mbps. I found his file, downloaded it and ran MediaInfo against it.

http://www.bwin.nu/

MediaInfo dumped out his encoding settings as he used libx264 to do his encode. I improved on what he did at the time and then brought it back to a sane bitrate.

True, while the video did average about 1Mbps, if you looked at the network it would spike up to around 16Mbps. Hardly ideal for streaming. The file that comes out of the command line above will not work via RTSP to the QuickTime player, if it can even play it at all. RTMP is suboptimal for delivery as well. HTTP in a Flash player (or HTML5) is the best you can hope for.
Wow. Nice info.

I have mainly been using freeware converters like Handbrake and got it down to a smaller size with little to no loss in quality... but the file size was still above 1GB.

I will try out your parameters. Thanks.
 
Wow. Nice info.

I have mainly been using freeware converters like Handbrake and got it down to a smaller size with little to no loss in quality... but the file size was still above 1GB.

I will try out your parameters. Thanks.

Zero guarantee that it is going to work. I would test with the fast setting to see if the file size is sane before trying placebo. Quality degradation will be noticeable.

And while you are pounding your head against FFmpeg like I did, wrap a script around it. My bash script is around 860 lines, running in Cygwin with a custom build of FFmpeg. You can check out what Zeranoe has to offer, but you will need to use the experimental AAC codec, as libfaac is not in his build due to licensing issues.

This should help.
-acodec aac -strict experimental
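
A stripped-down version of that wrap-a-script-around-it idea, for anyone starting from zero (hypothetical paths and settings, Cygwin-style, using the experimental AAC encoder mentioned above):

#!/bin/bash
# Batch-encode every MKV in a folder to MP4: x264 video, experimental AAC audio.
for f in /cygdrive/d/rips/*.mkv; do
    ffmpeg -i "$f" -vcodec libx264 -preset slow -crf 20 \
           -acodec aac -strict experimental -ab 192k -ac 2 \
           "${f%.mkv}.mp4"
done

The real thing obviously grows once you add bitrate calculation, logging, and error handling, which is how you end up at 860 lines.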
 