nVidia straight Crushes AMD in DaVinci Resolve - Puget

idiomatic

Didn't see this posted here and it basically decides my upgrades for me. Long story short: the 6800 XT is beaten out by the 3070 in DaVinci, which is a full turnaround from the Radeon VII that I did not expect.

Outside of a few cases like Fusion, the new AMD Radeon 6800 and 6800 XT are a good amount faster than the older Radeon 5700 XT and Vega 64. In GPU bound tasks like noise reduction and OpenFX, these new cards are as much as 83%(!) faster than the 5700 XT. Unfortunately, that isn't enough for them to catch up to the NVIDIA 3000 series cards. The RTX 3070 is less expensive than the 6800 and 6800 XT, yet outperforms them in our GPU Effects tests by a solid 14%. And if you can find an extra $50 to upgrade from the Radeon 6800 XT to the NVIDIA RTX 3080, you will see up to a 70% performance gain by going with NVIDIA.

https://www.pugetsystems.com/labs/a...dio---AMD-Radeon-RX-6800-XT-Performance-1990/
 
Keep in mind these are results for 4K - as the review mentions at the end, the 8GB cards cannot be used to process 6K footage. Also, the 30-series cards will likely not support Macs for some years to come, making them a no-go for a lot of video editing professionals.
 
Keep in mind these are results for 4K - as the review mentions at the end, the 8GB cards cannot be used to process 6K footage. Also, the 30-series cards will likely not support Macs for some years to come, making them a no-go for a lot of video editing professionals.
Macs won't matter in the future. They will be using their own chips and nothing from Nvidia, AMD, or Intel.
 
Wonder how much Nvidia paid for that "optimization"

And that’s the issue now isn’t it. After that email, we don’t know.

That said, my workflow benefits from Nvidia, but I am hoping that AMD can catch up, if not surpass Nvidia’s performance
 
Didn't see this posted here and it basically decides my upgrades for me. Long story short: the 6800 XT is beaten out by the 3070 in DaVinci, which is a full turnaround from the Radeon VII that I did not expect.



https://www.pugetsystems.com/labs/a...dio---AMD-Radeon-RX-6800-XT-Performance-1990/
GCN is all-purpose. RDNA was made for gaming rigs. CDNA, in the MI100, is a pro compute monster. I run a Radeon VII in my TR build. The GCN upgrade is the Radeon Pro VII, a new Gen 4 PCIe card. It's the same 60CU/16GB HBM2, but it runs on a PCIe Gen 4 x8 socket. Or just run the Radeon VII's pro drivers and AMD software for video and 3D. Everything made for the MI50 card runs on the VII.
 
As others have pointed out.... professional video editors are in general working with 6K video, which means the 3080 is a no-go.... so your options are 3090 or bust in terms of Nvidia "consumer" hardware. AMD wins this round because they have a product that can actually process 6K.
 
As others have pointed out.... professional video editors are in general working with 6K video, which means the 3080 is a no-go.... so your options are 3090 or bust in terms of Nvidia "consumer" hardware. AMD wins this round because they have a product that can actually process 6K.
If you are a professional video editor you shouldn't be using a consumer card. The driver advantages alone are worth the upgrade to a professional series, let alone the massive performance differences - and yes, there are big differences more often than not. And if you are a professional on a budget, then the 3090 is an excellent option if you can't swing the A6000.
But I would by no means call AMD's performance here bad if your goal is a system that plays games and does DaVinci. Those tests were also crunching on a Threadripper 3970X, not exactly a budget-friendly CPU choice.
 
Keep in mind these are results for 4K - as the review mentions at the end, the 8GB cards cannot be used to process 6K footage. Also, the 30-series cards will likely not support Macs for some years to come, making them a no-go for a lot of video editing professionals.
Sucks to be Mac users. Interestingly, DaVinci Resolve Studio is also available for Linux. In fact, the software was made for Linux first, and the Windows and Mac versions were released later. Methinks it's time for those professionals to jump ship from Mac.
 
IIRC, AMD dropped some of the compute performance in order to gain gaming performance. This is no surprise.
came here to say this.

I fully expected this result from the get-go from what AMD has said about RDNA being very gaming-focused, and from what Nvidia has demonstrated about Ampere being more compute optimized.

Remember when Vega was slower for gaming than a 1080 Ti but faster for productivity and compute tasks? Kate remembers. Same thing here: NV went w i d e this generation, which means more general-purpose perf (and explains why they pull ahead at higher resolutions in gaming), while AMD went narrow and more optimized, which means more gaming perf for the die size but less compute.
 
As others have pointed out.... professional video editors are in general working with 6K video, which means the 3080 is a no-go.... so your options are 3090 or bust in terms of Nvidia "consumer" hardware. AMD wins this round because they have a product that can actually process 6K.
6K, and oftentimes above. Blackmagic Design themselves (the same creators of DaVinci Resolve) have a camera that captures 12K (the URSA Mini 12K). The RED Monstro is also 8K.

NHK is trying to move all of its broadcasting to 8K, so it's pushing a lot of the Japanese camera manufacturers to move to 8K acquisition. Canon is supposed to have broadcast 8K cameras soon, as is, I believe, Panasonic.

Either way it doesn't matter to me. I'll continue to use my Radeon VII until it becomes obsolete by Apple's own hardware.
 
6k and often times above.
move to 8k acquisition
This, and at those resolutions the 8/10/16 GB distinction is not that relevant according to the article; you need either the 3090 or a professional card.

I am not sure the 3070 vs 6800 XT comparison is that relevant for the professional editor market, which would be wondering whether these cards are OK for their 6-8K professional footage; that feels more like the amateur/semi-pro YouTuber segment.
 
Sucks to be Mac users. Interestingly, DaVinci Resolve Studio is also available for Linux. In fact, the software was made for Linux first, and the Windows and Mac versions were released later. Methinks it's time for those professionals to jump ship from Mac.
Lots of Mac users that just want to get work done have jumped ship. Only diehards still use Macs. The professionals had to wait so long to upgrade from the trash cans, and the new Mac Pros are just insanely overpriced.
 
This, and at those resolutions the 8/10/16 GB distinction is not that relevant according to the article; you need either the 3090 or a professional card.

I am not sure the 3070 vs 6800 XT comparison is that relevant for the professional editor market, which would be wondering whether these cards are OK for their 6-8K professional footage; that feels more like the amateur/semi-pro YouTuber segment.
The Radeon VII is capable of doing all of this now.
Lots of Mac users that just want to get work done have jumped ship. Only diehards still use Macs. The professionals had to wait so long to upgrade from the trash cans, and the new Mac Pros are just insanely overpriced.
Except it isn't overpriced. It's actually right in line with workstation computers from HP and Dell. And it can take cards that those machines don't have: namely the Afterburner card, which takes all the overhead off of the video cards in the first place. Editing 12K with the Afterburner card is trivial. It also has access to dual Vega IIs, which again aren't available on the PC side.

I would say you don't know the users and the marketplace if you think pros have jumped ship. The iMac Pro and Mac Pro have basically had the top end covered since 2017 - and now the regular 2020 iMac is basically overkill for 4K workflows if you get the upgraded graphics card.
 
The Radeon VII is capable of doing all of this now.
That article is quite misleading, then:
Unfortunately, even 16GB is not quite enough for 8K timelines. For that, we typically find that you want a GPU with at least 20GB of VRAM which largely limits you to the RTX 3090 24GB or workstation-class cards like Quadro and Radeon Pro. Still, the extra VRAM will be useful for those that tend to use multiple noise reduction nodes, a lot of OpenFX, and complicated grading node structures on 4K/6K timelines.

I guess we should not read too much into material from people that are trying to sell you hardware ;) They add the caveat that without the UI you can do all of this; it is with the UI on that it becomes an issue.

Needless to say, I doubt someone working in television/movie studios is wondering whether to go for the 3070 instead of an Avid machine..... are we not talking more about the step below that?
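As a rough illustration of why high-resolution timelines chew through VRAM so quickly (my own back-of-the-envelope numbers, not from the Puget article - Resolve's actual working set depends on codec, node graph, and caching), here is a sketch of uncompressed frame sizes at 32-bit float per channel:

```python
# Illustrative only: one uncompressed RGBA frame at 32-bit float per channel.
# Real-world VRAM use multiplies this by the frames in flight plus node buffers.

def frame_bytes(width: int, height: int,
                channels: int = 4, bytes_per_channel: int = 4) -> int:
    """Size in bytes of one uncompressed frame."""
    return width * height * channels * bytes_per_channel

resolutions = {
    "4K UHD": (3840, 2160),
    "6K":     (6144, 3160),   # assumed 6K sensor dimensions, for illustration
    "8K UHD": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    gib = frame_bytes(w, h) / 2**30
    print(f"{name}: {gib:.2f} GiB per frame")
```

An 8K frame is exactly four times a 4K UHD frame, so a pipeline that is comfortable in 8GB at 4K can blow well past 16GB at 8K once multiple frames and effect buffers are resident at once - consistent with the article's "at least 20GB for 8K timelines" guidance.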
 
I don't even know what Davinci Resolve is, so I think I'm good.

Edit:
Google tells me it is a video editing suite.

This could be really important for the few who work with video, but for everyone else it is just noise for a niche application.
 
I don't even know what Davinci Resolve is, so I think I'm good.

Edit:
Google tells me it is a video editing suite.

This could be really important for the few who work with video, but for everyone else it is just noise for a niche application.

It's not niche if you're into movie/TV production. Blackmagic software is used at some point in the pipeline for almost every major Hollywood film. It's used even more in commercial and TV/streaming work. Fusion has been BM's compositing software for, I think, almost 30 years at this point. For the last 4 or 5 years it's been integrated into BM's DaVinci Resolve. It's one of (perhaps the most) popular pieces of software of its type in pro circles. It's also used by a lot of game studios. I think most people consider Resolve to be the go-to for colour correction; a lot of productions that use Adobe still end up using Resolve for colour correction. Being best around colour correction, it also gets lots of use in restoration work.
 
It's not niche if you're into movie/TV production. Blackmagic software is used at some point in the pipeline for almost every major Hollywood film. It's used even more in commercial and TV/streaming work. Fusion has been BM's compositing software for, I think, almost 30 years at this point. For the last 4 or 5 years it's been integrated into BM's DaVinci Resolve. It's one of (perhaps the most) popular pieces of software of its type in pro circles. It's also used by a lot of game studios. I think most people consider Resolve to be the go-to for colour correction; a lot of productions that use Adobe still end up using Resolve for colour correction. Being best around colour correction, it also gets lots of use in restoration work.

Yeah, but film and TV production is what, 0.00001% of computer users in the country? :p
 
This could be really important for the few who work with video, but for everyone else it is just noise for a niche application.
, but film and TV production is what, 0.00001% of computer users in the country?

That's a bit of a strange comment; it is like opening a thread about a video game benchmark and saying:

This could be really important for the many who play video games, but for everyone else it is just noise for a popular sector

I feel what you are saying is all true (or somewhat in the right ballpark - it could be a low estimate; they apparently have around 2 million licenses in the field if you add in the free one), but just so trivial.

https://www.smh.com.au/business/sma...ehind-the-oscar-nominees-20190124-p50tg1.html

Comparable to Apple's Final Cut Pro X.
 
The whole SFX industry pretty much runs Linux, from workstations through to servers and clusters. Apple lost their niche; they're trying to lock such users into an ecosystem, but without the sustained file system performance/networking of Linux and without Nvidia support, I don't see it happening.

M2's good...It's not Nvidia beating good.
 
Didn't see this posted here and it basically decides my upgrades for me. Long story short: the 6800 XT is beaten out by the 3070 in DaVinci, which is a full turnaround from the Radeon VII that I did not expect.



https://www.pugetsystems.com/labs/a...dio---AMD-Radeon-RX-6800-XT-Performance-1990/
With Nvidia you must pay for the PRO drivers and apps. With AMD the software is free. Did you see an Nvidia card running an Nvidia app vs a Radeon running the free AMD Pro drivers and app? YouTube will not show you that. YouTube won't mention the price of pro drivers from Nvidia either. The Radeon VII and the Gen 4 Radeon Pro VII are both all-purpose GCN. The 6800 XT is a gamer card, RDNA. It wasn't made for a workstation. DaVinci run off the CPU: free. DaVinci run off the GPU isn't free. YouTube won't mention that either.
 
After the nVidia/Hardware Unboxed fiasco, why would anyone trust a review from a business that actually sells their products? Maybe their results are accurate, but seeing as they have a business relationship with nVidia I would take their results with a grain of salt.
 
Outside of Resolve being heavily CUDA dependent (and without anything else to go on), I'm guessing that a significant amount of this may be optimization related. We saw recently that even ostensibly platform-agnostic tech, like Vulkan raytracing, is currently optimized entirely for and by Nvidia. This isn't necessarily nefarious in this case, as lots of compute-specific / tensor-style supporting stuff had an entire GPU generation in which only Nvidia had capable cards (the RTX 2000 series), so everything was designed for it - which of course benefits the 3000 series as well.

Now that RDNA2 cards (including the console GPUs) just barely arrived at the end of November (not to mention their low stock and high overall performance/desirability), it's likely that optimization and development will take place. I'm not saying that this necessarily means that, once optimized, AMD will wipe the floor with Nvidia on every test, but it's highly likely that these kinds of gaps that we see in specialized usage - be it raytracing, DLSS, or things like this - will close significantly.

I really want to support AMD for finally having a competitive card even at the highest end, and in some cases (i.e. rasterized game performance and general GPU use, not to mention massively leading RAM-wise vs everything up to the 3090) AMD is the best choice, especially for the money. But Nvidia has been trying to direct the narrative to special cases in which they excel - likely knowing that as optimization comes, they'll have even less of a lead in those cases. I really hope that AMD is working on optimizing and developing as fast as possible so that these gaps close. It may be that in certain cases Nvidia still does better for one reason or another, but minimizing the gap is important in order for AMD (and perhaps even more importantly, the open standards and methodology that AMD usually favors) to get adoption in the market as a first choice overall.
 