cageymaru

Fully [H]
Joined: Apr 10, 2003
Messages: 22,060
Open Broadcaster Software (OBS) is a multifaceted streaming and capture software solution that has become popular with broadcasters on YouTube, Twitch, and other live-stream platforms. To get the highest quality broadcasts, an expensive dual-PC setup is typically used: the game is played on one PC while a second PC captures the footage and uploads it to the streaming service. NVIDIA announced at CES 2019 that they were collaborating with the OBS development team to lower NVIDIA GPU usage by up to 66% and raise the overall quality of NVIDIA GPU encoding. This work resulted in the NVIDIA NVENC encoder in NVIDIA RTX products achieving parity with the x264 medium quality preset in OBS. This means broadcasters who previously used dual-PC setups can create high quality content with just one PC using an NVIDIA RTX GPU! The quality improvements are for the new NVIDIA RTX series of products, but older NVIDIA GPUs will also benefit from the speed increase.

OBS forum administrator "dodgepong" has announced that a beta version of OBS that utilizes the new NVIDIA NVENC encoder and NVIDIA SDK is available for download. He quickly tempers OBS users' enthusiasm by stating, "The quality improvements you may have been hearing about will largely only be seen on Turing GPUs (RTX 20XX), but the performance improvements should be measurable on all GPUs that have NVENC (GTX 6XX and higher)." These improvements come from OBS no longer copying frames to system RAM before sending them to the NVENC encoder. "Instead, the frames are sent directly from VRAM, which should noticeably reduce resource usage."
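Why skipping the system-RAM round trip matters is easy to picture with a command-line analogy. The sketch below uses ffmpeg rather than OBS internals, and assumes an ffmpeg build with CUDA/NVENC support; the filenames are placeholders:

```python
# Illustration of the same principle with ffmpeg (not OBS's actual code):
# "-hwaccel_output_format cuda" keeps decoded frames resident in VRAM and
# hands them straight to NVENC, instead of downloading every frame to
# system RAM first.
import subprocess

subprocess.run(
    ["ffmpeg", "-y",
     "-hwaccel", "cuda",                # decode on the GPU
     "-hwaccel_output_format", "cuda",  # ...and keep frames in VRAM
     "-i", "input.mp4",                 # placeholder input file
     "-c:v", "h264_nvenc",              # encode directly from VRAM
     "output.mp4"],
    check=True)
```

Dropping the `-hwaccel_output_format cuda` flag forces each decoded frame through system memory, which is roughly the copy the new OBS beta eliminates.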

GeForce RTX GPUs are up to 15% more efficient (i.e. require 15% less bitrate at the same quality level) than previous-generation Pascal GPUs when streaming in H.264, thanks to architectural improvements to the dedicated hardware encoder, NVENC. For you as a streamer, this means that GeForce RTX GPUs can stream with superior image quality compared to x264 Fast, and on par with x264 Medium. We have been collaborating with OBS, the industry-leading streaming application, to help them release a new version with improved support for NVIDIA GPUs. Scheduled to debut at the end of January, the new OBS will reduce the FPS impact of streaming by up to 66% compared to the currently shipping version.
 
Worth noting that NVENC is part of the 10xx series as well and can also be used to speed up HEVC/H.265 encoding/decoding by a large amount.

The optimizations should improve the 10-series with OBS unless Nvidia locks them out somehow.
 
No complaints streaming with nvenc/recording with x264. 1050ti here.
 
This is a good thing, really, as long as people take into consideration that this will lower your FPS in games. If you have the FPS to spare, then NVENC is actually rather good tech. Glad to see tuning like this, but I don't know why they didn't do it sooner, since NVENC is used rather heavily in video production.
 
I'd still just use a second machine - not sure where they are getting the "expensive" from...

Still nice to see the enhancements though!
 
This is a good thing, really, as long as people take into consideration that this will lower your FPS in games. If you have the FPS to spare, then NVENC is actually rather good tech. Glad to see tuning like this, but I don't know why they didn't do it sooner, since NVENC is used rather heavily in video production.
It uses the video encoder and doesn't lower your FPS.
 
It uses the video encoder and doesn't lower your FPS.
[Attached image: geforce-rtx-streaming-new-obs-002.png]
 
It uses the video encoder and doesn't lower your FPS.

Nonsense, it definitely lowers FPS, just by less than x264 and by less than the old version of OBS.
It also depends on what you are doing. If you are just streaming/recording gameplay, then the hit is minimal. If you are using overlays, a webcam, a green screen, audio compression software, and so on, you will take a pretty big hit.
 
Worth noting that NVENC is part of the 10xx series as well and can also be used to speed up HEVC/H.265 encoding/decoding by a large amount.

The optimizations should improve the 10-series with OBS unless Nvidia locks them out somehow.

Every GPU generation has different encoding silicon with different capabilities, so the quality improvement is not necessarily something that's locked out by a driver.

Anyway, these claims by Nvidia are something that can be objectively verified. Run NVENC and x264 on a file at equivalent quality settings (CRF for x264, CQ for NVENC) and look at the size of the resulting files. And if you really want to be thorough, you can use the same PSNR/SSIM measurements that the encoder developers use for objective quality benchmarks.
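For anyone who wants to try this, here is a minimal sketch of such a comparison, assuming an ffmpeg build with both libx264 and h264_nvenc, and a placeholder test clip named source.mp4. Note that x264's CRF and NVENC's CQ are similar scales but not identical, so treat the result as a rough comparison:

```python
# Encode the same clip with x264 (CRF) and NVENC (CQ), compare output sizes,
# then score each encode against the source with SSIM and PSNR.
import subprocess
from pathlib import Path

SRC = "source.mp4"  # placeholder test clip

def encode(flags, out):
    subprocess.run(["ffmpeg", "-y", "-i", SRC, *flags, "-an", out], check=True)

def score(out, metric):
    # The ssim/psnr filters print a summary line to ffmpeg's log (stderr).
    r = subprocess.run(["ffmpeg", "-i", out, "-i", SRC,
                        "-lavfi", metric, "-f", "null", "-"],
                       capture_output=True, text=True)
    return [l for l in r.stderr.splitlines() if metric.upper() in l][-1]

encode(["-c:v", "libx264", "-crf", "23"], "x264.mp4")                  # CPU
encode(["-c:v", "h264_nvenc", "-rc", "vbr", "-cq", "23", "-b:v", "0"],
       "nvenc.mp4")                                                    # GPU

for out in ("x264.mp4", "nvenc.mp4"):
    print(out, Path(out).stat().st_size, "bytes")
    for metric in ("ssim", "psnr"):
        print(" ", score(out, metric))
```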
 
NVENC runs on your GPU's CUDA cores, so it takes resources away from the same GPU that is rendering the game. This most certainly will impact your FPS.

It uses the video encoder and doesn't lower your FPS.
 
Every GPU generation has different encoding silicon with different capabilities, so the quality improvement is not necessarily something that's locked out by a driver.

Anyway, these claims by Nvidia are something that can be objectively verified. Run NVENC and x264 on a file at equivalent quality settings (CRF for x264, CQ for NVENC) and look at the size of the resulting files. And if you really want to be thorough, you can use the same PSNR/SSIM measurements that the encoder developers use for objective quality benchmarks.

This is correct. Turing NVENC silicon > Pascal NVENC silicon. Obviously CPU encoding is still quality king, but based on the comparisons I've seen, Turing NVENC beats CPU-based x264 Fast in quality at a fraction of the encode time. Turing will also see further improvements to HEVC over time compared to Pascal.

Turing NVENC > x264 Fast > Pascal NVENC <=> x264 Fast << Pascal HEVC < Turing HEVC encoding time.

Turing's NVENC hardware bump is the only reason I'm even considering a 2000-series now, since the 1080 Ti is still king for gaming at its price point. Might get a 2060 or a used 2070.
 
NVENC runs on your GPU's CUDA cores, so it takes resources away from the same GPU that is rendering the game. This most certainly will impact your FPS.

On the 7970/GTX 680 or anything newer, there's a separate area on the die dedicated to encoding video. If you open up HWiNFO, you can see stats for that specific area as "GPU Video Engine Load" or something similar.

It can bog down the GPU a little, but it doesn't use the CUDA cores.
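This is easy to confirm in a few lines of Python via NVML (NVIDIA's management library). A rough sketch, assuming the pynvml package is installed, to be run while a stream or recording is active:

```python
# Sample both utilization counters while OBS is streaming: NVENC load shows
# up on the dedicated video engine, not the 3D/compute (CUDA) counter.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(10):
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu)            # 3D + memory
    enc, _period = pynvml.nvmlDeviceGetEncoderUtilization(gpu)  # video engine
    print(f"GPU (3D/compute): {util.gpu:3d}%  "
          f"Memory controller: {util.memory:3d}%  "
          f"Encoder (NVENC): {enc:3d}%")
    time.sleep(1)

pynvml.nvmlShutdown()
```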

EDIT:

This is correct. Turing NVENC silicon > Pascal NVENC silicon. Obviously CPU encoding is still quality king, but based on the comparisons I've seen, Turing NVENC beats CPU-based x264 Fast in quality at a fraction of the encode time. Turing will also see further improvements to HEVC over time compared to Pascal.

Turing NVENC > x264 Fast > Pascal NVENC <=> x264 Fast << Pascal HEVC < Turing HEVC encoding time.

Turing's NVENC hardware bump is the only reason I'm even considering a 2000-series now, since the 1080 Ti is still king for gaming at its price point. Might get a 2060 or a used 2070.

HEVC is really where it blows CPU encoding out of the water now, as 10-bit x265 is insanely slow. It'd be nice if streaming services could use it... I guess they're waiting for AV1, so they don't have to write MPEG a fat check?

And yeah, Turing is tempting me now too. Not only is the encoding block way better than my Maxwell one, but the tensor cores could make neural network upscaling while encoding kind of practical.
 
The quality improvements will come from RTX; that's just hardware-based. The performance/resource improvements are for the rest of the GTX/RTX series.

So before, x264 Fast/Medium would get you really good quality at a lower bitrate. With NVENC H.264 you got good performance, but quality was only about equivalent to x264 Very Fast, so the only way to get "better" quality was to raise the bitrate to 6000+, while x264 had better quality at 3500-4000 bitrate, but with a big FPS drop in FPS/CPU-intensive games.

So now, with the new OBS improvements, you'll have fewer resources consumed and higher performance: instead of, say, 80 FPS (stock) and 60 FPS (streaming), it will be more like 70-ish FPS, a clear performance improvement.

So with all that, the actual image quality improvements will come from RTX.

That being said, streaming at 1080p/60fps is easily doable now with NVENC, which is good for streamers who don't have a dedicated PC or a 6+ core CPU.
 
Nonsense, it definitely lowers FPS, just by less than x264 and by less than the old version of OBS.
It also depends on what you are doing. If you are just streaming/recording gameplay, then the hit is minimal. If you are using overlays, a webcam, a green screen, audio compression software, and so on, you will take a pretty big hit.
I just tested it again. I have a 1% GPU load difference between streaming and not. Video Engine Load goes to 30% from ~20% (that's the encoder), and Memory Controller Load is a 4% difference.

Hardly enough to say that there is an impact from using the encoder. I'll say it again from above: GPU encoding does not lower my FPS. I lock VSYNC at 60 or 75 (depending on the game). You're looking at a 1 FPS difference at this point.

I have a bigger hit to FPS from leaving a Facebook tab open in a browser in the background chewing CPU cycles.

I stream with a webcam (or FaceRig), Streamlabs overlays for followers and subs, a noise gate set up for the mic, and a few other effects across multiple scenes. OBS uses just 5% CPU now compared to the 10-15% it did before. Bypassing system RAM makes a HUGE difference on a DDR3 setup.

I've been streaming since 2010, so I might actually know a bit about what I'm talking about. I'm having much better results than their charts and tests would imply.
 
I just tested it again. I have a 1% GPU load difference between streaming and not. Video Engine Load goes to 30% from ~20% (that's the encoder), and Memory Controller Load is a 4% difference.

Hardly enough to say that there is an impact from using the encoder. I'll say it again from above: GPU encoding does not lower my FPS. I lock VSYNC at 60 or 75 (depending on the game). You're looking at a 1 FPS difference at this point.

I have a bigger hit to FPS from leaving a Facebook tab open in a browser in the background chewing CPU cycles.

I stream with a webcam (or FaceRig), Streamlabs overlays for followers and subs, a noise gate set up for the mic, and a few other effects across multiple scenes. OBS uses just 5% CPU now compared to the 10-15% it did before. Bypassing system RAM makes a HUGE difference on a DDR3 setup.

I've been streaming since 2010, so I might actually know a bit about what I'm talking about. I'm having much better results than their charts and tests would imply.

4% is more than likely not noticeable to be honest. :) I would use it and not care about performance losses. :)
 
I just tested it again. I have a 1% GPU load difference between streaming and not. Video Engine Load goes to 30% from ~20% (that's the encoder), and Memory Controller Load is a 4% difference.

Hardly enough to say that there is an impact from using the encoder. I'll say it again from above: GPU encoding does not lower my FPS. I lock VSYNC at 60 or 75 (depending on the game). You're looking at a 1 FPS difference at this point.

I have a bigger hit to FPS from leaving a Facebook tab open in a browser in the background chewing CPU cycles.

I stream with a webcam (or FaceRig), Streamlabs overlays for followers and subs, a noise gate set up for the mic, and a few other effects across multiple scenes. OBS uses just 5% CPU now compared to the 10-15% it did before. Bypassing system RAM makes a HUGE difference on a DDR3 setup.

I've been streaming since 2010, so I might actually know a bit about what I'm talking about. I'm having much better results than their charts and tests would imply.

Well, if you lock your FPS at 60 with VSYNC, you are not going to see any difference in framerates, as 60fps is extremely low nowadays. If you used the higher refresh rates most people have moved to, you'd see a difference. It's minimal, but there is a clear hit when you enable NVENC encoding, anywhere from 5-15%, and in some games, especially FPS games, that is a big deal.

Also, I hope you are aware of the negatives of using VSYNC, such as input lag and frame latency. Time to upgrade monitors if you are still using VSYNC at 60Hz. And today is a really good time, since Nvidia just unlocked VRR and you can get a $200 1920x1080 144Hz variable refresh rate monitor from Acer, as well as plenty of other good options.
 
Well, if you lock your FPS at 60 with VSYNC, you are not going to see any difference in framerates, as 60fps is extremely low nowadays. If you used the higher refresh rates most people have moved to, you'd see a difference. It's minimal, but there is a clear hit when you enable NVENC encoding, anywhere from 5-15%, and in some games, especially FPS games, that is a big deal.

Also, I hope you are aware of the negatives of using VSYNC, such as input lag and frame latency. Time to upgrade monitors if you are still using VSYNC at 60Hz. And today is a really good time, since Nvidia just unlocked VRR and you can get a $200 1920x1080 144Hz variable refresh rate monitor from Acer, as well as plenty of other good options.

If you're streaming, it's more important to have a consistent FPS for your viewers. I'm well aware of the latency introduced by VSYNC. If you run uncapped, they'll see a lot of jitter and inconsistent frames, in addition to horrible tearing. For me, tearing is worse than the latency I've learned to deal with over the decades.

In certain games I can get away with locking to 75Hz on my end while streaming 60Hz to them, but while watching the stream you can see a lot of mistimed frames. It's quite distracting.

I'm looking forward to trying out VRR on my FreeSync monitor though. I probably won't run it all the time, but it could help with the latency if it works on this monitor.

I have a plan to replace this whole rig by the end of the year, including the monitor.
 
If I already have a movie that I want to encode to H.264, can anyone recommend a program that will use my GPU to encode? I tried some software like 10 years ago, but the quality was garbage, so I went back to CPU encoding. I wouldn't mind trying again though; a decade ago is about a century in computer terms.
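Not a definitive recommendation, but ffmpeg exposes the NVENC encoders directly and is probably the simplest thing to try. A minimal sketch, assuming an ffmpeg build with NVENC enabled; the filenames are placeholders:

```python
# Re-encode an existing movie on the GPU with ffmpeg's NVENC encoder.
# "-rc vbr -cq 21 -b:v 0" requests constant-quality mode, loosely analogous
# to x264's CRF. Swap in "hevc_nvenc" for HEVC/H.265 output.
import subprocess

subprocess.run(
    ["ffmpeg", "-y", "-i", "movie.mkv",
     "-c:v", "h264_nvenc",                     # GPU encoder (Kepler or newer)
     "-preset", "slow",                        # trade encode speed for quality
     "-rc", "vbr", "-cq", "21", "-b:v", "0",   # constant-quality rate control
     "-c:a", "copy",                           # pass the audio through untouched
     "movie_nvenc.mp4"],
    check=True)
```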
 