AMD ReLive versus NVIDIA ShadowPlay Performance


We take AMD ReLive in the AMD Radeon Software Adrenalin Edition and NVIDIA ShadowPlay in GeForce Experience and find out which one is more efficient in terms of FPS and CPU usage while recording gameplay. We compare features and specifications, and find out which overall better suits content creators recording gameplay on AMD and NVIDIA GPUs.
 
Great review! Also nice seeing RTG taking the software side very seriously. While I don't use these features much, I did like ReLive better than ShadowPlay, and I definitely do not like GeForce Spamming Experience. UI-wise I just find RTG's drivers superior to Nvidia's.
 
No mention of the data logging that runs in the background for GFE? Telemetry of gameplay data & habits, sent straight to NVIDIA.

Under their privacy EULA clause, they are allowed to share all of the collected data with their partners.
 
Nvidia's hardware encoder is NVENC:
[Image: NVENC GPU support matrix]

https://developer.nvidia.com/nvidia-video-codec-sdk
Still find it baffling that Nvidia doesn't support HEVC for ShadowPlay; it's built into their chips. Even though no streaming service takes HEVC, ShadowPlay is more than just a streaming program.


AMD uses VCE 1.0, 2.0, 3.0, 3.1, 3.4, and 4.0. Similar to Nvidia, different GPU generations have different levels of feature support, but AMD doesn't have a nice little infographic; in fact, it's quite difficult to find any hardware-specific documentation to copy-paste. HEVC support was added starting with VCE 3.4 (see the quick lookup sketch below the list):
VCE 1.0: ARUBA (Trinity/Richland), CAPE VERDE, PITCAIRN, TAHITI, OLAND
VCE 2.0: KAVERI, KABINI, MULLINS, BONAIRE, HAWAII
VCE 3.0: TONGA, FIJI
VCE 3.1: CARRIZO
VCE 3.4: STONEY, POLARIS10, POLARIS11, POLARIS12
VCE 4.0: VEGA10
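
For reference, here's a quick lookup you could throw together from that list (Python, just a rough sketch; chip names as written above):

```python
# Rough lookup of AMD VCE versions per chip, taken from the list above.
VCE_VERSION = {
    "ARUBA": 1.0, "CAPE VERDE": 1.0, "PITCAIRN": 1.0, "TAHITI": 1.0, "OLAND": 1.0,
    "KAVERI": 2.0, "KABINI": 2.0, "MULLINS": 2.0, "BONAIRE": 2.0, "HAWAII": 2.0,
    "TONGA": 3.0, "FIJI": 3.0,
    "CARRIZO": 3.1,
    "STONEY": 3.4, "POLARIS10": 3.4, "POLARIS11": 3.4, "POLARIS12": 3.4,
    "VEGA10": 4.0,
}

def supports_hevc_encode(chip: str) -> bool:
    """HEVC encode arrived with VCE 3.4, per the list above."""
    return VCE_VERSION.get(chip.upper(), 0.0) >= 3.4

print(supports_hevc_encode("Polaris10"))  # True
print(supports_hevc_encode("Fiji"))       # False
```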

You can get similar performance with whatever features you want out of third-party software, as long as it's written to work with NVENC and/or AMD AMF (VCE). Each generation, by the way, is known to produce different image quality at the same bitrate.
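
For example, with an ffmpeg build that has the hardware encoders available, using NVENC or AMF is just a matter of picking the right encoder name. A rough sketch (file names and bitrate are placeholders):

```python
import subprocess

def gpu_encode(src: str, dst: str, vendor: str, hevc: bool = False, bitrate: str = "50M") -> None:
    """Transcode src to dst using the GPU's hardware encoder via ffmpeg.

    Assumes ffmpeg is on PATH and was built with the nvenc/amf encoders enabled.
    """
    encoders = {
        ("nvidia", False): "h264_nvenc",
        ("nvidia", True):  "hevc_nvenc",
        ("amd",    False): "h264_amf",
        ("amd",    True):  "hevc_amf",
    }
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", encoders[(vendor, hevc)],
        "-b:v", bitrate,
        "-c:a", "copy",   # leave the audio track untouched
        dst,
    ], check=True)

# gpu_encode("gameplay.mkv", "gameplay_hevc.mp4", vendor="amd", hevc=True)
```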
 
Can you guys test low-end cards next, like the RX 560 (1024) / GTX 1050? Perhaps using Dota 2, as it's lighter on the GPU.

I think it'll be especially relevant when the desktop APUs come out. See if you can do good esports streaming on a lower budget and whatnot.

Would also like to see a quick revisit in an easy 4K title like Dota 2 for the higher-end cards, so they're not just hammered down to 24 FPS.


I also think it's highly important to upload the output video files for each setting to a Mega link or something, so we can see what the end product looks like.
 
Could you comment on the quality of the output files too? I recently moved from an AMD card to a 1080 Ti, and I find the output files from ShadowPlay very difficult to work with (audio sync issues) and often corrupt, to the point where I don't even bother using it anymore.
 
I challenge Brent to perform double-blind tests on audio bitrate.
320 kbps is way above what humans can perceive in audio quality.
192 kbps is about as high as people can distinguish in double-blind tests.

This may or may not be true; it depends on the music, the compression, the compression method, and so on.
320 kbps is where I start to be really happy and confident, as I often hear differences below that. I don't have much insight into how the files were produced, but the same source was used.
I can usually live with 192 kbps, but if I actively listen there's a difference.

"The human eye can't see above 60 FPS...."

Same argument, which I do not subscribe to.

In games, 60 FPS and 120 FPS and so on may feel the same depending on the engine. Many people can feel their input being off as FPS decreases, because some game engines poll input and network traffic every frame while others do it outside that loop, creating different behavior: some games feel sluggish at 60 FPS and some feel perfectly fluid.
In games that poll outside the loop (or use some other method), people struggle to tell the difference, but in games where it's in the loop they need a mere second or two to identify it.
 
"The human eye can't see above 60 FPS...."

Same argument, which I do not subscribe to.

That's a whole new can of worms waiting to be opened, but I won't go there.

What I can say is that there is a point in AAC audio streams where increased bitrate won't give you perceivably better audio quality, and it's generally accepted that 192 kbps is about the best audio quality you'll get out of a stereo stream. Anything more and you are just wasting space.
 
How do these compare with OBS?

OBS > ReLive/ShadowPlay if you're a daily streamer. ReLive and ShadowPlay are good for casual streamers or people who just want to quickly stream for friends, but OBS does everything ReLive does plus way more.

That's a whole new can of worms waiting to be opened, but I won't go there.

What I can say is that there is a point in AAC audio streams where increased bitrate won't give you perceivably better audio quality, and it's generally accepted that 192 kbps is about the best audio quality you'll get out of a stereo stream. Anything more and you are just wasting space.

Either way, I'd rather have the option to set it to what I want vs. the software deciding what it should be on its own.
 
Considering how much longer NVIDIA has been working on ShadowPlay, I'm impressed that AMD's already not only reached feature parity, but surpassed NVIDIA in many regards. The performance difference is pretty negligible.

To echo the point another poster made above, I would be interested to see these same comparisons with something like a GTX 1050 or 1060 vs the AMD equivalent - RX 580 or 570? A lot of the streaming community are gamers without big powerful rigs, so that would probably hit a wider audience :)

Great article. I like this type of content.
 
Either way, I'd rather have the option to set it to what I want vs. the software deciding what it should be on its own.

Well, I agree; but I was just pointing out that there's a point of diminishing returns once you go past a certain bit rate.
 
Considering how much longer NVIDIA has been working on ShadowPlay, I'm impressed that AMD's already not only reached feature parity, but surpassed NVIDIA in many regards. The performance difference is pretty negligible.

To echo the point another poster made above, I would be interested to see these same comparisons with something like a GTX 1050 or 1060 vs the AMD equivalent - RX 580 or 570? A lot of the streaming community are gamers without big powerful rigs, so that would probably hit a wider audience :)

Great article. I like this type of content.

That's what happens when there's no competition. I mean, Nvidia had no incentive to improve ShadowPlay, but now I'm sure they will introduce new features and match/surpass ReLive.
 
Re: audio bitrate options. The point is, content creators want the highest quality possible, the highest data quality, and then edit down what they need. You can always compress audio downward, but you can't go the other way post-recording.

I know for a fact microphone quality is of paramount importance.

There are people who claim they can tell the difference between compressed and lossless audio.

I'd rather have the option of a custom, optionally higher bitrate than not have the option to change it at all. That puts ReLive ahead of ShadowPlay in that department.
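
(Compressing the audio down after the fact really is trivial; roughly, with something like ffmpeg, and the file names are just placeholders:)

```python
import subprocess

# Re-encode only the audio track of a finished recording down to 192 kbps AAC,
# leaving the video stream untouched. Assumes ffmpeg is on PATH.
subprocess.run([
    "ffmpeg", "-i", "recording.mp4",
    "-c:v", "copy",               # keep the video as-is
    "-c:a", "aac", "-b:a", "192k",
    "recording_192k.mp4",
], check=True)
```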
 
Re: audio bitrate options. The point is, content creators want the highest quality possible, the highest data quality, and then edit down what they need. You can always compress audio downward, but you can't go the other way post-recording.

I know for a fact microphone quality is of paramount importance.

There are people who claim they can tell the difference between compressed and lossless audio.

I'd rather have the option of a custom, optionally higher bitrate than not have the option to change it at all. That puts ReLive ahead of ShadowPlay in that department.

Are those the same people who claim that vinyl sounds better than lossless digital? :D:D
 
Frankly, most major 'improvements' in the GF UI were a turn-off.
I still use GF 2.0 - anything below 3.0 is best.
3.0 and above is when Nvidia 'jumped the shark' on me.

I like the simpler features from earlier versions: items laid out and easy to find, with a more or less cohesive design. Nothing fancy, but then why did we 'need' the new one?

I liked your article calling out Nvidia's GUI design.
I gave Nvidia the same feedback a while back, but they're not going to listen to one person. Hopefully some competition and HardForums will kick them back into gear.

For the first time in forever I'd actually choose an AMD GPU over an Nvidia.
 
I challenge Brent to perform double-blind tests on audio bitrate.
320 kbps is way above what humans can perceive in audio quality.
192 kbps is about as high as people can distinguish in double-blind tests.
You always start as high as you can when creating content, preferably lossless, but that's crazy for video. You can always compress it down later if needed.


OBS > ReLive/ShadowPlay if you're a daily streamer. ReLive and ShadowPlay are good for casual streamers or people who just want to quickly stream for friends, but OBS does everything ReLive does plus way more.



Either way, I'd rather have the option to set it to what I want vs. the software deciding what it should be on its own.

That's another point for Brent.

You should do some OBS testing too: Nvidia vs. AMD vs. something like a 1950X, to see how GPU encoding in OBS compares between vendors and to CPUs. I don't think anyone has done testing like that, especially not for 4K.

As soon as Threadripper Gen 2 is announced, I'm probably picking up a system for 4K local recording.
 
I enjoyed the comparison and I am not surprised that the performance difference is more or less negligible as they both use hardware encoding. I am very happy to see that ReLive is really coming along to give Nvidia a helpful shove towards more progress.

I am surprised that you used PlayerUnknown's Battlegrounds as your test title and didn't mention NVIDIA ShadowPlay Highlights in the feature comparison. My friends and I have captured way more content using that than anything else in the past. It makes it easy to get a highlight reel without having to do any editing.

When ShadowPlay first came out, it felt revolutionary to be able to record your last x minutes of gameplay with minimal performance impact. Highlights had that same revolutionary feel when I witnessed it working in PUBG. They recently brought this tech to Fortnite which hopefully means we will see this integrated into a lot of big titles going forward.
 
I enjoyed the comparison and I am not surprised that the performance difference is more or less negligible as they both use hardware encoding. I am very happy to see that ReLive is really coming along to give Nvidia a helpful shove towards more progress.

I am surprised that you used PlayerUnknown's Battlegrounds as your test title and didn't mention NVIDIA ShadowPlay Highlights in the feature comparison. My friends and I have captured way more content using that than anything else in the past. It makes it easy to get a highlight reel without having to do any editing.

When ShadowPlay first came out, it felt revolutionary to be able to record your last x minutes of gameplay with minimal performance impact. Highlights had that same revolutionary feel when I witnessed it working in PUBG. They recently brought this tech to Fortnite which hopefully means we will see this integrated into a lot of big titles going forward.

Another thing to add to the list of testing

Would using the highlight features of ShadowPlay/ReLive have an impact if you're using an SSD for it? Like, would it be constantly writing to the SSD? Or does it allocate some RAM based on how far back you want to go?
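
Back-of-the-envelope, the replay buffer is just bitrate times how far back you go. A quick sketch (numbers are only examples):

```python
def replay_buffer_mb(bitrate_mbps: float, minutes_back: float) -> float:
    """Approximate size of an instant-replay buffer in megabytes."""
    megabits = bitrate_mbps * minutes_back * 60
    return megabits / 8  # 8 bits per byte

# e.g. 5 minutes of 1080p60 at 50 Mbps:
print(replay_buffer_mb(50, 5))  # ~1875 MB, so holding it all in RAM gets expensive fast
```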
 
I have to quibble with this assertion from your writeup:

"The two most popular recording resolutions, 1080p and 1440p, and 50Mbps is as high as it goes, that’s just not good enough."

That's Bullshit, at least for 1080p. At 1080p60, no one will be able to tell the difference between video captured at 50Mbps and half of that. To suggest that you need even more bitrate to capture 1080p video, more than 50Mbps, is simply untrue. 50Mbps for 1080p60 is already overkill. It would probably be useful for 1440p, but it isn't for 1080p.
 
I have to quibble with this assertion from your writeup:

"The two most popular recording resolutions, 1080p and 1440p, and 50Mbps is as high as it goes, that’s just not good enough."

That's Bullshit, at least for 1080p. At 1080p60, no one will be able to tell the difference between video captured at 50Mbps and half of that. To suggest that you need even more bitrate to capture 1080p video, more than 50Mbps, is simply untrue. 50Mbps for 1080p60 is already overkill. It would probably be useful for 1440p, but it isn't for 1080p.

Depending on the encoding settings, 50K may not be enough.

Here's CoD WW2 at Ultrafast in OBS, 10K vs. 100K; 100K still isn't enough for native-like quality with those settings. Higher-CPU-usage settings can get you more quality at a lower bitrate.


 
Depending on the encoding settings, 50K may not be enough.

Here's CoD WW2 at Ultrafast in OBS, 10K vs. 100K; 100K still isn't enough for native-like quality with those settings. Higher-CPU-usage settings can get you more quality at a lower bitrate.

What you wrote makes no sense. First of all, how can you say that 100 Mbps (I'm going to assume you meant M instead of K throughout your post) might not be enough when your comparison is against 10 Mbps instead of against a transparent copy of the original? It's apples to oranges, at best lemons to limes. Not a correct comparison. You can't prove your point by comparing against 10 Mbps, which is far too low and which no one is talking about anyway. Second of all, the CPU isn't being used; ShadowPlay encodes on the GPU. Thirdly, in video compression, CPU or GPU usage is for all practical purposes a constant between test cases where bitrate is the only variable. The only reasonable conclusion given all these mistakes is that you don't know what the fuck you're talking about.

My assertion is that at 1080p60, 50 Mbps already provides a transparent copy of the original using the ShadowPlay encoder. Moving it to 100 Mbps won't improve it. Transparency is already achieved somewhere a bit over 30 Mbps for fast-action video. The 50 Mbps ShadowPlay uses is already a bit of a waste.
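
If anyone wants to actually settle the "transparent or not" question, ffmpeg can score an encode against the source instead of eyeballing it. A rough sketch (file names are placeholders; both files need matching resolution and frame rate):

```python
import subprocess

# Compare an encoded capture against the original source using SSIM.
# ffmpeg prints an average SSIM ("All:") at the end of the run; as a rough
# rule of thumb, scores around 0.99 and up are hard to tell apart visually.
subprocess.run([
    "ffmpeg",
    "-i", "capture_50mbps.mp4",      # distorted (the capture being judged)
    "-i", "original_lossless.mkv",   # reference
    "-lavfi", "ssim",
    "-f", "null", "-",
], check=True)
```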
 
What you wrote makes no sense. First of all, how can you say that 100 Mbps (I'm going to assume you meant M instead of K throughout your post) might not be enough when your comparison is against 10 Mbps instead of against a transparent copy of the original? It's apples to oranges, at best lemons to limes. Not a correct comparison. You can't prove your point by comparing against 10 Mbps, which is far too low and which no one is talking about anyway. Second of all, the CPU isn't being used; ShadowPlay encodes on the GPU. Thirdly, in video compression, CPU or GPU usage is for all practical purposes a constant between test cases where bitrate is the only variable. The only reasonable conclusion given all these mistakes is that you don't know what the fuck you're talking about.

My assertion is that at 1080p60, 50 Mbps already provides a transparent copy of the original using the ShadowPlay encoder. Moving it to 100 Mbps won't improve it. Transparency is already achieved somewhere a bit over 30 Mbps for fast-action video. The 50 Mbps ShadowPlay uses is already a bit of a waste.

Because it's comparing two different settings, neither of which is native-like quality.

Depending on how much processing ShadowPlay/ReLive are doing, 50K might be enough or it might not be, as I showed there by using really light processing with a 100K Mbps bitrate; the K is just thousands.
 
Depending on how much processing ShadowPlay/ReLive are doing, 50K might be enough or it might not be, as I showed there by using really light processing with a 100K Mbps bitrate; the K is just thousands.

100,000,000,000 bits per second...

12.5 gigabytes per second. Man, I wish I had a storage drive that could write that fast.
 
Because it's comparing two different settings, neither of which is native-like quality.

Depending on how much processing ShadowPlay/ReLive are doing, 50K might be enough or it might not be, as I showed there by using really light processing with a 100K Mbps bitrate; the K is just thousands.

So, 100,000 Mbps... that is quite high... ;)

Sound is most often denoted in kilobits per second (1,000 bits per second).
Video requires much higher bandwidth, so it's denoted in megabits per second (1,000,000 bits per second).

https://en.wikipedia.org/wiki/Data_rate_units
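
To put numbers on it, a quick sketch:

```python
def gigabytes_per_minute(bitrate_bits_per_sec: float) -> float:
    """How much storage one minute of video eats at a given bitrate."""
    return bitrate_bits_per_sec * 60 / 8 / 1e9

print(gigabytes_per_minute(50e6))   # 50 Mbps -> ~0.375 GB per minute of recording
print(gigabytes_per_minute(100e9))  # "100K Mbps" taken literally = 100 Gbps -> 750 GB per minute
```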
 
I take it OBS's bitrate field is in kbps and not Mbps, so yeah, that 100K works out to roughly 100 Mbps.

In any event, how much processing are the GPU encoders doing on the footage? Is it equivalent to the Ultra Fast preset for CPUs in OBS, or is it Very Fast or better?
 

I take it OBS's bitrate field is in kbps and not Mbps, so yeah, that 100K works out to roughly 100 Mbps.

In any event, how much processing are the GPU encoders doing on the footage? Is it equivalent to the Ultra Fast preset for CPUs in OBS, or is it Very Fast or better?
Less CPU hit than UltraFast. The feature set is a mish-mash of stuff found between SuperFast and Faster in x264, but the quality per bitrate is around SuperFast or so. Not that it matters at those bitrates. The easiest way to improve quality with video compression isn't spending more processing time, it's throwing more bits at it. The preset has nearly no significance when the bitrate is this high.
 
You could add an OBS column as well if you want to do the full gamut. A lot of people are using this (myself included) for their streaming needs. Would be interesting to see the numbers behind the performance costs vs the vanilla vendor app.
 
Less CPU hit than UltraFast. The feature set is a mish-mash of stuff found between SuperFast and Faster in x264, but the quality per bitrate is around SuperFast or so. Not that it matters at those bitrates. The easiest way to improve quality with video compression isn't spending more processing time, it's throwing more bits at it. The preset has nearly no significance when the bitrate is this high.
Having more processing time beforehand can get you a nicer image, though, and it will let you get more out of your bitrate. If you had a limited upload speed for streaming but still wanted high quality, you'd have to push your processing as high as you could, since your bitrate is limited.

Isn't Twitch limited to a 6k bitrate?
 
Having more processing time beforehand can get you a nicer image, though, and it will let you get more out of your bitrate. If you had a limited upload speed for streaming but still wanted high quality, you'd have to push your processing as high as you could, since your bitrate is limited.

Isn't Twitch limited to a 6k bitrate?

Twitch is limited to 3 Mbit for non-partnered and 6 Mbit for partnered streamers, while high-view-count streams have a higher limit I can't remember; it might be 10 Mbit. The vast majority of streamers stream in the 3-5 Mbit range, though. I have no clue what YouTube's limits are, since I don't watch streams there other than SpaceX.
 
Are these specifically limited to 16:9 resolutions? Just asking because that is all that is mentioned and very specifically mentioned at that.
 
Just remember folks, watching your recorded gaming moments isn't that exciting.

So don't go to too much effort.
 
Just remember folks, watching your recorded gaming moments isn't that exciting.

So don't go to too much effort.
Lol!! Yeah, I didn't install ReLive because I don't need proof of how bad I suck at gaming, even if that is all I do.
 