A Video Converter that uses all Cores/GPUs?

Trackr

[H]ard|Gawd
Joined: Feb 10, 2011
Messages: 1,786
I need to convert a butt-load of videos for a project.

The best video converter I know is AVS Video Converter.

Unfortunately, it only shows 65-70% CPU usage and I can't get it to use my GTX 480s.

Anyone know a better way?

Thanks
 
I really like Any Video Converter. In the options you can increase 'Threads' to as many cores as you have. It also has a nice batch operation, so you can drop in 50 videos and then walk away.

Handbrake is another popular one but it's kinda buggy.
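
If you'd rather script it than babysit a GUI queue, here's a rough sketch of the same batch-and-walk-away idea using HandBrakeCLI from Python. Just a sketch - it assumes HandBrakeCLI is installed and on your PATH, and the folder names are placeholders:

```python
# Batch-convert everything in a folder with HandBrakeCLI, then walk away.
# Assumes HandBrakeCLI is on PATH; "to_convert"/"converted" are placeholders.
import subprocess
from pathlib import Path

src = Path("to_convert")
dst = Path("converted")
dst.mkdir(exist_ok=True)

for video in sorted(src.iterdir()):
    out = dst / (video.stem + ".mp4")
    # x264 uses every available CPU core by default; -q 20 is a common
    # constant-quality starting point, not a one-size-fits-all value.
    subprocess.run(
        ["HandBrakeCLI", "-i", str(video), "-o", str(out), "-e", "x264", "-q", "20"],
        check=True,
    )
```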
 
Freemake Video Converter says it supports CUDA and ATI Stream, though I've never tried to get the GPU acceleration working.
 
I really like Any Video Converter. In the options you can increase 'Threads' to as many cores as you have. It also has a nice batch operation, so you can drop in 50 videos and then walk away.

Handbrake is another popular one but it's kinda buggy.

I've tried Any Video Converter in the past.

I guess I'll give it another shot.

Anyone else have a suggestion? I'm open to non-free programs, as long as they get the job done.
 
Don't bother using GPUs to convert video - CPUs will do a better job in less time.
 
DVDFab supports DXVA, AMD APP, CUDA, and Intel Quick Sync. Probably the best overall ripper and media converter I've used.
 
CPU faster than GPU?!?!?!?

Vegas Pro 11 using the GPU encodes 1080p HD in less than half the time the CPU takes.
 
CPU faster than GPU?!?!?!?

Vegas Pro 11 using the GPU encodes 1080p HD in less than half the time the CPU takes.

With about half the quality of a render using the same bitrate settings but restricted to using the CPU only, last I checked.
 
GPUs render better than CPUs. It's not a leap to assume they encode better too.

Unfortunately, they don't. The picture quality isn't as good, which is why GPU acceleration isn't included in the x264 encoding library. GPUs don't render / decode with better quality than CPUs either.

But you can get Badaboom if you'd like to give CUDA a try. It's not free and it's EOL'd, but it'll probably do what you want.

Otherwise HandBrake can use all available CPU threads and is cross-platform.
 
Do people not understand the difference between general purpose computing and specific-task computing?
 
Handbrake is the way to go. From personal experience, GPU encoding is inferior to CPU encoding in video quality, though faster at rendering, and the pros say the same thing.

http://handbrake.fr/
 
But you can get Badaboom if you'd like to give CUDA a try. It's not free and it's EOL'd, but it'll probably do what you want.

I'm wondering - does that make CUDA useless, as in a gimmick, or is there something it does right?
 
I'm wondering - does that make CUDA useless, as in a gimmick, or is there something it does right?

CUDA is great for some applications, but it doesn't seem like video encoding is one of those. I remember reading a discussion about this on the HandBrake or x264 mailing list or something like that.

It could be that nobody has yet taken the necessary effort to faithfully port x264 to CUDA, or the current implementations are crap, or the GPU doesn't have the necessary instruction set, or video encoding can't be sufficiently parallelized to yield a performance improvement.
 
vReveal is great software, and free... It uses the GPU and can do neat cleaning and anti-shake stuff on the fly.

They are getting rid of it. It's made by a company that makes high-end drone and military video-cleaning software, so it's no toy! If you can't download it from their site anymore, download it from you-know-where...

It's the dog's danglies!
 
Update: I just had to reinstall my system and noticed vReveal wasn't importing MTS files (which are AVCHD), even though it always had before. The culprit was the latest QuickTime, 7.7.4... it doesn't work with vReveal... so use v7.1... that's got it working perfectly again.
 
I know this is a very old topic, but I just wanted to say that you guys are completely nuts and in denial about what CUDA support does. I have an i7-6700K and a GTX 1080; the 1080 is about 30 times faster than the 6700K, actually more.
Those clowns saying the CPU is faster... facepalm 1000x, so funny. Clearly those people have no experience with decent GPUs. Stop using low-end toys and get a decent GPU lol.
I have tried cracking WPA2 handshakes with both the CPU and the GPU: the CPU pulls about 15,000 p/s while the GPU pulls 420,000 p/s.

I only came here because I have The X-Files seasons 1-9, and they're in some stupid MKV format that doesn't play in any of the players I have. I mean, they play, but they stutter and the audio keeps drifting on some episodes. I don't know why kids nowadays use MKV; it's the worst format I've ever come across. All my machines and players have issues with MKV (tried Win XP, Win 7, and Win 10, with players such as BSPlayer, Windows Media Player, Kodi, VLC, and Winamp).
So I have about 30 GB of videos to convert, and doing that on the CPU is just a waste of time, plus it will fry my CPU if I leave it for like 24 hours under max stress. It probably won't actually fry the CPU, but it's a good way of greatly reducing its lifespan. The 6700K goes up to 75°C with liquid cooling, while the GTX 1080 sits at a cool 62°C max under 100% use. So I'd rather use the GPU than the CPU.

Also, those people saying the CPU renders better quality are fools. Do you even know what you're saying, LOL? It's as good as saying that typing this text on a Bluetooth keyboard makes the text quality worse, haha. Kids nowadays lol. Please go learn how computers work. There is NO difference in quality when rendering with a CPU or a GPU; it's technically impossible. Convert it into binary and the data is identical. Kids nowadays just have no idea what they're blabbering about, and it's so funny how people keep going off topic. The question was "what is the best converter that supports the GPU," so stop mumbling about nonsense; either answer or don't post.
 
Holy thread necro, Batman.

Keep in mind that back in 2012, CUDA support in anything was a rare thing. In 2012-2013, CUDA 5.x was the latest and greatest (we're on CUDA 7.x today). We didn't get Unified Memory until CUDA 6.0, which was just coming out at the very end of 2013. Getting good performance out of a ton of double-precision float math operations was still tricky (hello, memory banking tricks). Basically, as a developer, if you didn't fully understand the Nvidia GPU architecture, what a warp is, etc., you were setting yourself up to fail. The common fallacy is that you can solve every CPU bottleneck by brute-forcing your way onto a GPU. If you don't understand the problem or cannot figure out how to "divide and conquer", you'll most likely end up wasting development time better spent elsewhere.

I'd say the primary issue back in 2012-2013 was a lack of understanding and experience on the developers' part. I recall a lot of "it's too hard" or "it's not worth the effort" conversations back then. Speaking as a developer from back then, CUDA was hard to understand. What three years of experience and advancements can bring...

The irony here is that CUDA encoders (and decoders) are fairly obsolete now. Hardware encoding/decoding became such a big deal that every Nvidia card since Maxwell carries dedicated H.264/HEVC hardware for encoding and decoding. If you have to fall back on CPU methods, it's not the end of the world, although realtime encoding still favors the GPU. IMO, it's nice to have choices.
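
For the curious, here's roughly what leaning on that dedicated hardware looks like today through ffmpeg's NVENC encoder. A sketch only - it assumes an ffmpeg build compiled with NVENC support, and the filenames and bitrate are placeholders:

```python
# Hardware H.264 encode on the GPU's dedicated NVENC block via ffmpeg.
# Assumes ffmpeg was built with NVENC support; names/values are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mkv",       # placeholder source file
        "-c:v", "h264_nvenc",    # dedicated NVENC hardware encoder
        "-preset", "slow",       # NVENC preset: more quality, less speed
        "-b:v", "8M",            # NVENC is typically bitrate-driven
        "-c:a", "copy",          # pass the audio through untouched
        "output.mp4",
    ],
    check=True,
)
```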
 
Since it's alive again, I'll just say this: there is no GPU encoder in use so far that matches the quality of the software encoders, IMO. No matter what settings you use with a GPU encoder, even maxing out the possible quality, it's still going to look inferior to what a software-based encoder produces. Sure, you can encode stuff fast as fuck with the GPU - I have a Sandy Bridge i7 and can use Intel's QuickSync to encode something at 400+ fps - but the overall quality of the resulting encode (even with the best settings) is absolute shit compared to just letting x264 crunch away at it at 20 fps.

I've seen hundreds of test encodes done with the latest GPU encoders as well as the CPU-based hardware encoders (like Intel's QuickSync; I'm now waiting to see what Kaby Lake can do quality-wise), and so far, after the past decade of waiting for some really decent hardware-based encoders, I've yet to see any of them produce outstanding visual quality compared to traditional software-based encoders like x264 and now x265.

I'll take the quality over the speed most every time. :)
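
For reference, the software path I'm describing is roughly this - a sketch using ffmpeg's libx264 encoder, where the filenames, preset, and CRF value are just common starting points, not gospel. x264 will happily use every core you have while it grinds:

```python
# Software x264 encode tuned for quality rather than speed.
# Filenames and the preset/CRF values are illustrative placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mkv",    # placeholder source file
        "-c:v", "libx264",    # software x264 encoder
        "-preset", "slow",    # slower presets spend more CPU time per frame
        "-crf", "18",         # constant quality; lower = better and bigger
        "-c:a", "copy",
        "output.mkv",
    ],
    check=True,
)
```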
 
Since it's alive again, I'll just say this: there is no GPU encoder in use so far that matches the quality of the software encoders, IMO. No matter what settings you use with a GPU encoder, even maxing out the possible quality, it's still going to look inferior to what a software-based encoder produces. Sure, you can encode stuff fast as fuck with the GPU - I have a Sandy Bridge i7 and can use Intel's QuickSync to encode something at 400+ fps - but the overall quality of the resulting encode (even with the best settings) is absolute shit compared to just letting x264 crunch away at it at 20 fps.

I've seen hundreds of test encodes done with the latest GPU encoders as well as the CPU-based hardware encoders (like Intel's QuickSync; I'm now waiting to see what Kaby Lake can do quality-wise), and so far, after the past decade of waiting for some really decent hardware-based encoders, I've yet to see any of them produce outstanding visual quality compared to traditional software-based encoders like x264 and now x265.

I'll take the quality over the speed most every time. :)

This, a thousand times this. QuickSync is the closest one can get to good accelerated encoding, and even then it takes a lot of tweaking and produces a file roughly 20% larger than a similar-quality pure software encode (one done with StaxRip, one with Handbrake). The Nvidia-based ones are bad at best, as they're h.264 fast-profile encodes intended for streaming.
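
For anyone wanting to try QuickSync themselves, the rough shape of it via ffmpeg is below - assuming an ffmpeg build with Quick Sync (libmfx) support and an enabled Intel iGPU; the filenames and quality value are placeholders:

```python
# Intel Quick Sync hardware encode via ffmpeg's QSV encoder.
# Assumes an ffmpeg build with QSV support; names/values are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mkv",          # placeholder source file
        "-c:v", "h264_qsv",         # Intel Quick Sync hardware encoder
        "-global_quality", "22",    # ICQ mode: roughly analogous to CRF
        "-c:a", "copy",
        "output.mp4",
    ],
    check=True,
)
```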
 
I know this is a very old topic, but I just wanted to say that you guys are completely nuts and in denial about what CUDA support does. I have an i7-6700K and a GTX 1080; the 1080 is about 30 times faster than the 6700K, actually more.
Those clowns saying the CPU is faster... facepalm 1000x, so funny. Clearly those people have no experience with decent GPUs. Stop using low-end toys and get a decent GPU lol.
I have tried cracking WPA2 handshakes with both the CPU and the GPU: the CPU pulls about 15,000 p/s while the GPU pulls 420,000 p/s.

I only came here because I have The X-Files seasons 1-9, and they're in some stupid MKV format that doesn't play in any of the players I have. I mean, they play, but they stutter and the audio keeps drifting on some episodes. I don't know why kids nowadays use MKV; it's the worst format I've ever come across. All my machines and players have issues with MKV (tried Win XP, Win 7, and Win 10, with players such as BSPlayer, Windows Media Player, Kodi, VLC, and Winamp).
So I have about 30 GB of videos to convert, and doing that on the CPU is just a waste of time, plus it will fry my CPU if I leave it for like 24 hours under max stress. It probably won't actually fry the CPU, but it's a good way of greatly reducing its lifespan. The 6700K goes up to 75°C with liquid cooling, while the GTX 1080 sits at a cool 62°C max under 100% use. So I'd rather use the GPU than the CPU.

Also, those people saying the CPU renders better quality are fools. Do you even know what you're saying, LOL? It's as good as saying that typing this text on a Bluetooth keyboard makes the text quality worse, haha. Kids nowadays lol. Please go learn how computers work. There is NO difference in quality when rendering with a CPU or a GPU; it's technically impossible. Convert it into binary and the data is identical. Kids nowadays just have no idea what they're blabbering about, and it's so funny how people keep going off topic. The question was "what is the best converter that supports the GPU," so stop mumbling about nonsense; either answer or don't post.
Speaking of people who have no clue what they're talking about... in addition to the reply to you above: MKV is not responsible for your players behaving poorly. MKV is just a container, and a damn good one at that, easily wrapping multiple video, audio, and subtitle streams into a single file, with support for chapters and many other things. The only way you'd have an issue is if your hardware were not up to snuff for whatever codec the video stream inside was encoded with. VLC or Media Player Classic Home Cinema will play basically anything and have the codecs built in, and the GTX 1080 has hardware acceleration for basically any video that can be hardware accelerated, including H.265.

As for the quality difference - yes, indeed, it is possible, and it is a reality. The encoders have to be coded differently, and there isn't a GPU-accelerated encoder that I know of that will encode H.264 with any kind of high-quality output option; they are all made to simply work as quickly as possible, which can produce very poor results. A software encode takes much longer, but the encoders are built to produce the highest-quality output your settings allow (x264, for instance, provides dozens and dozens of quality settings for making sure different videos can all look nearly the same as the source).
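
Worth adding for the poster above: since MKV is just a container, the playback problem probably calls for a remux rather than a 24-hour re-encode. A sketch using ffmpeg's stream copy - this assumes the streams inside are MP4-compatible (e.g. H.264 video with AAC/AC3 audio), and the filenames are placeholders:

```python
# Remux: rewrap the existing video/audio streams into a new container
# without re-encoding. Takes seconds per file and loses zero quality.
# Only works if the target container supports the codecs inside.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "episode.mkv", "-c", "copy", "episode.mp4"],
    check=True,
)
```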
 
It looks like very old topic, but I just wanted to say that you guys are in complete nutz and in denial about what CUDA support does. I have i7-6700k and GTX 1080, 1080 is about 30 times faster than 6700k, actually more.
those clowns saying that CPU is faster FACEPALM 1000x so funny. clearly those people have no experience with decent GPUs, stop using some low-end toys and get a decent GPU lol
i have tried cracking wpa2 handshakes with cpu and gpu. cpu pulls about 15000p/s while gpu pulls 420000p/s.

i only came here because i have x-files season 1 - 9 and they are in some stupid mkv format that don't play in any of the players I have. i mean they play, but they stutter and audio keeps drifting on some episodes. i don't know why kids nowdays use mkv, it's worst format i've ever came across with. all my machines and players have issues with mkv (tried win xp, win 7, win 10 and with players such as BSPlayer, win media player, Kodi, VLC, winamp).
so I have about 30gb of videos to convert and doing that with CPU is just waste of time and it will fry my cpu when i leave it for like 24h under max stress. prolly won't fry the cpu, but it's a good way of reducing your cpu life span greatly. 6700k goes up to 75c with liquid cooling while gtx 1080 sits in cool 62c max when under 100% use. so i'd rather use gpu than cpu.

also those people are fools who say that cpu renders better quality, do you even know what you saying LOL? it's as good as saying that when typing this text with bluetooth keyboard then text quality is worse haha. kids nowdays lol. please go learn how computers work. there is NO difference in quality when rendering with cpu or gpu. it's technically impossible. convert it into binary and data is identical. kids nowdays just have no idea what they are blabbering about and it's so funny how people keep going offtopic. question was "what is the best converter that supports GPU" so stop bumbling about some nonsense, either answer or don't post.

What the hell did I just read
 
Don't bother using GPUs to convert video - CPUs will do a better job in less time.
Depends on the scenario, but the last time I tinkered with it, the GPU did things that degraded quality but were stupid fast. Which means if you don't really care a ton about quality, yeah, the GPU can be faster. The CPU gives you more adjustment options and better quality. I have an 8-core Xeon, so as long as the CPU encoding uses all of my cores it doesn't take too long. I last used GPU encoding about two years ago on a 290X. From what I've heard, it's just not worth going back to. If you plan on doing a ton of encoding, you may want to look at those dirt-cheap dual 8-core Xeon setups.
 