Sucks is a relative term. The quality is better than it was way back when, but you still need higher bitrates on GPU to match CPU quality. The gap has closed a lot, though, so the speed benefit is nice in situations where you don't mind slightly larger files or slightly lower image quality. For real-time video editing, GPU is best. For a final encode, CPU is probably still the better choice, depending on whether you value time, disk space, or video quality most. It's not as black and white as it used to be. I'm looking to switch my Plex server over to real-time GPU transcoding for serving video, but my original video source will still be from a CPU encode. Right now a real-time transcode takes a fair bit of oomph from my CPU that would be nice to offload (not that I transcode often or even really notice any difference).

Are you guys sure GPU encoding still sucks? I find that as long as I keep the bitrate close to the original file's, the result looks nearly identical. It's only when I use a pre-configured profile that the video quality degrades. YouTube degrades video quality more than anything, and Facebook drops 4K down to 720p! But my locally rendered files look pretty much as good as the originals when I use GPU encoding and stick to a similar bitrate. Maybe that was a problem on older GPUs and software? Anyway, real-time, smooth, high-quality 4K/x265 previews in a high-quality video editor under $100 (I pay around $50 to upgrade from one version to the next) without having to make proxies is a game changer for me!
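For anyone who wants to try the bitrate-matching approach, here's a minimal sketch of how it could be scripted, assuming an ffmpeg build with NVENC support on the PATH. The filenames and the choice of NVIDIA's hevc_nvenc encoder are just illustrative assumptions, not anyone's exact workflow from this thread: probe the source's overall bitrate with ffprobe, then hand roughly the same target to the hardware encoder.

```python
# Rough sketch: GPU (NVENC) re-encode targeting roughly the source bitrate.
# Assumes ffmpeg/ffprobe are on PATH and the ffmpeg build includes hevc_nvenc.
# "input.mp4" / "output.mp4" are placeholder filenames.
import subprocess


def source_bitrate(path: str) -> int:
    """Return the container's overall bitrate in bits per second."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1",
         path],
        capture_output=True, text=True, check=True,
    )
    # Note: some containers report "N/A" here; this sketch doesn't handle that case.
    return int(out.stdout.strip())


def gpu_encode(src: str, dst: str) -> None:
    """Re-encode video with NVENC HEVC at roughly the source bitrate, copying audio."""
    bitrate = source_bitrate(src)
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "hevc_nvenc",      # NVIDIA hardware HEVC encoder
         "-b:v", str(bitrate),      # keep the target close to the original
         "-c:a", "copy",            # leave the audio stream untouched
         dst],
        check=True,
    )


if __name__ == "__main__":
    gpu_encode("input.mp4", "output.mp4")
```

A flat bitrate target like this mirrors the "keep it close to the original" idea; for a final archival encode on CPU, quality-based rate control (CRF-style) is usually the more common choice.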