You're not wrong.

The video shouldn't be showing a crash on the NV side... it should instead be showing a 40-100% increase in completion time as Adobe tries to cache the video file. There may be a few methods to speed that up, but the thing with video is that it's really hard to guess before the render starts exactly how large the file will be. A big solid-colour animation is going to compress a lot better than HD video of, say, a rainforest. So a solution would need to start unloading video out of (or at least marking it in) the much faster VRAM early. I'm guessing somewhat, but I believe most methods that allow the render to spill over to system RAM would greatly slow the process.

So my thinking is this is no Adobe bug. They have simply chosen to use the fastest method instead of greatly reducing their software's speed all the time to handle the out-of-VRAM case (although I imagine a user option tick box of "allow render to use system RAM" would be a good alternative to crashing). Having said that, perhaps someone that uses Adobe more than me can answer this one... doesn't Adobe have that option, to render to system RAM? I imagine it would be much slower, but it has the option, doesn't it?

Does Adobe play nicely with the HBCC on AMD cards? If so, that'd give even the V56/64 a leg up on Nvidia cards, as it essentially shows apps whatever you set the HBCC to - I think my V56 is set to 12GB HBCC due to VR (or specifically Windows MR) hammering the RAM, and it shows up in games and Task Manager as 12GB available...
 
Translation: Adobe Premiere's memory allocation on the GPU is so poor that it can lead to crashes if not enough VRAM is available, and AMD added just enough VRAM on their cards to avoid this one software bug. The real fix isn't adding more VRAM, it's fixing the software so it can manage the VRAM properly.

I don't know if it's related, but a bug that started in CC 2015.3 on PC and Mac broke CUDA Mercury engine support on many GPU models and is still broken in CC 2018. Some people changed to software rendering, which causes freezing during After Effects playback, even on an i7-8700. I ended up using OpenGL on the onboard Intel graphics just to get Premiere usable for my client. Adobe went to the crapper with Creative Cloud.
 
Too bad this doesn't help the card game any better...

True, if you want a gaming-only card you can spend more money on a 2080 Ti and get more FPS in your games. But if you are going to play games with high-res textures at 4K, or stereoscopic 4K, then you will want more and faster video RAM. This card has the potential to be armored against the future.

And it's 700 bucks compared to 1200 for a 2080 Ti.

When manufacturers get some of the new cooler designs slapped on this bad boy it will be even quieter and nicer to own I hope. ;)
 
This kind of reminds me of the first Titan video cards - they had uncrippled FP64 performance that would crush other consumer cards but at a lower price than the high-end Quadro workstation cards.

Nvidia realized they were cannibalizing their own workstation GPU sales, so successor Titans chopped down the FP64. I am expecting the Radeon VII to be a similar one-off from AMD unless they have something up their sleeve to make a truly monstrous FireGL.
I don't know what AMD's plans are in the future, but this is not the first AMD card to have FP64 performance better than 1/8. I don't think Nvidia generally had FP64 performance that good on their regular consumer cards, except for the Titans.
Vega VII: 1/4 FP64
AMD 7990: 1/4 FP64
AMD 7970: 1/4 FP64
AMD 7950: 1/4 FP64
Radeon R9 290: 1/8 FP64
Radeon R9 280: 1/4 FP64
Radeon HD 5870: 1/5 FP64
Radeon 6970: 1/4 FP64
Radeon 6950: 1/4 FP64
Radeon 4870: 1/5 FP64


Check this list out:
https://www.geeks3d.com/20140305/amd-radeon-and-nvidia-geforce-fp32-fp64-gflops-table-computing/
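If you want to see what those ratios mean in absolute terms, the arithmetic is simple. Here's a tiny sketch that turns an FP64 ratio into rough peak numbers, using approximate Radeon VII figures; the shader count and clock are assumptions for illustration, not official specs.

```python
# Rough peak-throughput arithmetic behind the ratios above. The shader count
# (3840) and ~1.75 GHz boost clock are illustrative approximations.

def peak_gflops(shaders, clock_ghz, fp64_ratio):
    """Return (FP32, FP64) peak GFLOPS, counting an FMA as 2 ops per clock."""
    fp32 = 2 * shaders * clock_ghz  # GFLOPS
    return fp32, fp32 * fp64_ratio

fp32, fp64 = peak_gflops(3840, 1.75, 1 / 4)
print(f"FP32 ~{fp32 / 1000:.1f} TFLOPS, FP64 ~{fp64 / 1000:.1f} TFLOPS")
# prints roughly: FP32 ~13.4 TFLOPS, FP64 ~3.4 TFLOPS
```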
 
I guess by the time games use 16GB of VRAM the GPU won't be able to drive them anyway.
 
But AMD graphics cards perform horribly in Adobe media products compared to Nvidia ...
 
Been so long since I have played with video editing that I didn't even realize this was done on GPUs these days.

It's surprising that they crash instead of swapping data to main system RAM. Sure, it would slow down significantly, but it's better than crashing...
 
Been so long since I have played with video editing that I didn't even realize this was done on GPUs these days.

It's surprising that they crash instead of swapping data to main system RAM. Sure, it would slow down significantly, but it's better than crashing...
Certain effects are GPU-accelerated. The CPU still plays a large role.
 
No, they probably use expensive smartphones to capture 4K video...
 
No, they probably use expensive smartphones to capture 4K video...


Because the processing of the video and the renders of transitions and such will be of lesser quality on a lower-end video card? It's all about the available codecs. They may film at 8K with a RED camera, down-convert it to 4K to work on the renders, and then upload it for distribution to YouTube and such once processed at 1080p/60. In all of that, the only thing a lower-end video card that CAN handle the render is going to impact is the time it takes to do the render. If you are a prosumer looking to do this to see if you can make money at it (as an example), or as a hobby/side hustle, then this would be the card you want because the cost of entry is much lower. (And you're likely not dropping 30k on a RED camera.)

Now let's say that side hustle starts bringing in more than your job and you quit to focus on that 24/7 and see if it can take off. Great. NOW is the time to invest in the higher-end pro-grade video cards and such. (If you don't just outsource that processing, but most probably wouldn't do that.)

Being dismissive of the place a card like this can have is a bit obtuse.
 
If you have a problem with bandwidth whilst editing video you proxy edit. Swap in your primary video after editing and then render out. Yes, you need better hardware to increase the speed of rendering and, if you're making money off of your work, better hardware means less time spent which means greater income potential. But, lack of hardware has not been a good excuse to be unable to edit high quality files for the better part of two decades now.
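For anyone who hasn't done it, proxy editing just means cutting against small stand-in files and relinking the full-res sources for the final render. A minimal sketch of batch-generating proxies with ffmpeg from Python might look like this; the paths, resolution, and encoder settings are placeholder assumptions, not what any particular NLE uses internally.

```python
# Minimal proxy-generation sketch: transcode full-res camera originals into
# small 720p H.264 proxies to edit with, then relink the originals to render.
# Requires ffmpeg on the PATH; paths and settings are placeholders.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("footage/4k_masters")   # hypothetical camera originals
PROXY_DIR = Path("footage/proxies")
PROXY_DIR.mkdir(parents=True, exist_ok=True)

for clip in SOURCE_DIR.glob("*.mov"):
    proxy = PROXY_DIR / (clip.stem + "_proxy.mp4")
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-vf", "scale=-2:720",                      # 720p, keep aspect ratio
        "-c:v", "libx264", "-crf", "23", "-preset", "fast",
        "-c:a", "aac", "-b:a", "128k",
        str(proxy),
    ], check=True)
```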
 
Feels a lot like the Titan V but to a lesser extent. I am personally waiting for Navi...
 
I can't imagine there are that many people who need this capability.

lol - if Linus Tech Tips owns multiple Red cameras and shoots all their content in 4K then there are a shitload of people out there who could benefit from these cards.

It's easy to dismiss things you are ignorant of. It's why teenagers are smarter than their parents ;)
 
If you are serious about video production, do yourself a favor and use professional cards. I bet they don't use cheap smartphones to capture 4k video, right?

The hate is real here. Do you really hate a good deal that much? It's not for you; go buy a damn $5k video card and leave other people to it. Jeez. lol!
 
So I guess then take an RX 550, slap 16GB of HBM2 on it, and you'd have a great 4K video editing card?
 
True, if you want a gaming-only card you can spend more money on a 2080 Ti and get more FPS in your games. But if you are going to play games with high-res textures at 4K, or stereoscopic 4K, then you will want more and faster video RAM. This card has the potential to be armored against the future.

And it's 700 bucks compared to 1200 for a 2080 Ti.

When manufacturers get some of the new cooler designs slapped on this bad boy it will be even quieter and nicer to own I hope. ;)

Eh... future proofing in general is a bad idea unless it happens to coincide with what you need currently.

Also, its direct competition (in gaming) is the RTX 2080 at the same price, and Vega VII uses more power. Unless you actually do video editing that takes full advantage of Vega VII, it's not a competitive buy at all. Far better to buy nVidia now or wait for Navi if gaming is the primary focus.

Have a 2080 here. But why is it so bad to have a GPU that is good for dual purpose and does both? I thought being able to do two things was better than one. It's not like this card can't game.

When it's the same price as the 2080, performs worse than the 2080 in games, and sucks up more power and generates more heat than the 2080, there's no compelling reason to buy the Vega VII over the 2080 if you're not doing some serious video editing.
 
Eh... future proofing in general is a bad idea unless it happens to coincide with what you need currently.

Also, its direct competition (in gaming) is the RTX 2080 at the same price, and Vega VII uses more power. Unless you actually do video editing that takes full advantage of Vega VII, it's not a competitive buy at all. Far better to buy nVidia now or wait for Navi if gaming is the primary focus.



When it's the same price as the 2080, performs worse than the 2080 in games, and sucks up more power and generates more heat than the 2080, there's no compelling reason to buy the Vega VII over the 2080 if you're not doing some serious video editing.


Again, that wasn't what I was saying lol. Had nothing to do with what I was commenting on. All I said was why is it so bad to have a card with dual purpose? As far as power is concerned, 50 extra watts don't burn a hole in the planet. Especially if it gets the job done in content creation and pays you back for your work etc. Is it slower in games? Yeah, slightly overall, but for those who can enjoy it for dual purpose it's absolutely worth it. I have a 2080, but if I was doing streaming and content creation I would be all over this card. No doubt. Your point is noted but had nothing to do with my comment. Mine wasn't solely on gaming. Cheers.
 
Eh... future proofing in general is a bad idea unless it happens to coincide with what you need currently.

Also, its direct competition (in gaming) is the RTX 2080 at the same price, and Vega VII uses more power. Unless you actually do video editing that takes full advantage of Vega VII, it's not a competitive buy at all. Far better to buy nVidia now or wait for Navi if gaming is the primary focus.



When it's the same price as the 2080, performs worse than the 2080 in games, and sucks up more power and generates more heat than the 2080, there's no compelling reason to buy the Vega VII over the 2080 if you're not doing some serious video editing.

Well, I don't care about power usage in general as I water cool everything. So the fancy shroud on whatever NVIDIA or AMD card I buy is going into a storage box within 30 minutes of the card being delivered to my doorstep. The cookie-cutter reviews that I have seen so far tell me that the 2080 and Radeon VII are pretty close in performance. So I don't care if one does 3 more fps in this game or an anomaly like Fortnite. For me that's insignificant. It's still a toss-up.


What would sway me one way or the other is the ecosystem. I own a 55" Samsung 4K HDR curved TV with FreeSync support. It does 4K @60Hz. 1440p @120Hz. 1080p @120 Hz. It uses HDMI. I love large monitors as I can read the text from bed at night from across the room. Absolutely love this purchase as it is practical and makes sense. Next upgrade will be 65" or bigger.

I'm not going to spend 5K on a TV to use it strictly as a monitor. I don't like movies. I don't like watching TV shows. I love news and racing related live-streams; well live-streams in general. I love reading about technology and learning new things. I like anime streams from Amazon Prime sometimes.


What is the most powerful solution that best fits my brand new TV?


Do I want to spend $700 on a video card to play Diablo 3? Apex Legends? Monster Hunter World? Grim Dawn? Civilization? Battlefield 5? Puyo Puyo Tetris? I game about 3 hours every other day.

I have no issues turning off graphics features that are taxing to maintain a certain level of smoothness at all times. So if something causes frame dips it has to go. I would turn it on to showcase my system to one of the neighborhood kids and then turn that crap off as soon as they leave. I game at 1440p @120Hz as it is a better experience for my current Vega 64.


Should I just wait for possible new cards from NVIDIA and AMD to come out in the Fall? Even if they are just a refresh? As long as they are usable with the FreeSync on my TV I'm not picky.

That's how my brain works. My PC idles 24/7. My current video card uses very little power when idling. My CPU uses very little power while idling. I even let my PC go to sleep after 5 hours, but it rarely does because it's playing a live-stream most of the time.

That would be a good metric for power. How much power does a video card use during playback of a Twitch or YouTube video? That would be my power consumption metric. :)
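If someone actually wanted that metric, it's easy enough to log on an Nvidia card by polling nvidia-smi while a stream plays. A rough sketch follows; the sample count and interval are arbitrary assumptions, and an AMD card would need a different tool.

```python
# Rough sketch: sample GPU power draw once a second during video playback and
# report the average. Assumes an Nvidia card with nvidia-smi on the PATH.
import subprocess
import time

samples = []
for _ in range(60):  # one minute of samples; adjust as needed
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    samples.append(float(out.stdout.strip().splitlines()[0]))  # first GPU only
    time.sleep(1)

print(f"average draw over {len(samples)} samples: {sum(samples) / len(samples):.1f} W")
```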


I bet the stack of audio amps in my room pull more power than the PC while idling. One of those is a pro amp and it has those awful tiny fans in it that hum 24/7. I was thinking about water cooling it too. :) Or replacing it with a 550w per channel Crown.
 
Have a 2080 here. But why is it so bad to have a GPU that is good for dual purpose and does both? I thought being able to do two things was better than one. It's not like this card can't game.

All GPUs are dual purpose. We know V2 is just a workstation card with Radeon drivers. AMD didn't allocate the resources to develop separate gaming and data cards this generation. I understand that they want to sell this card, and since it's not a clear winner for gaming, they have to advertise it as a content creation card.
 
The worst Radeon VII review that I've seen so far. It's like this guy didn't
All GPUs are dual purpose. We know V2 is just a workstation card with Radeon drivers. AMD didn't allocate the resources to develop separate gaming and data cards this generation. I understand that they want to sell this card, and since it's not a clear winner for gaming, they have to advertise it as a content creation card.

Neither did NVIDIA. In fact, RTX was developed as a workstation Quadro GPU. They needed to find a purpose for those Tensor cores in a gaming GPU, so they came up with DLSS. Real-time raytracing was also a BS promise for gaming and in actuality a workstation feature (movie studios pay fat stacks for this feature set) that barely works when DLSS is enabled. So yes, neither company is crazy enough to invest twice to develop custom silicon for each market.

It's great when DLSS + RT "just works", but my friends and I don't play Port Royal as much as we used to. /s
 
All GPUs are dual purpose. We know V2 is just a workstation card with Radeon drivers. AMD didn't allocate the resources to develop separate gaming and data cards this generation. I understand that they want to sell this card, and since it's not a clear winner for gaming, they have to advertise it as a content creation card.

You are the second one to miss the point. Heck, it's in the title of this thread. That's what the comment was in relation to.
 
Again, that wasn't what I was saying lol. Had nothing to do with what I was commenting on. All I said was why is it so bad to have a card with dual purpose? As far as power is concerned, 50 extra watts don't burn a hole in the planet. Especially if it gets the job done in content creation and pays you back for your work etc. Is it slower in games? Yeah, slightly overall, but for those who can enjoy it for dual purpose it's absolutely worth it. I have a 2080, but if I was doing streaming and content creation I would be all over this card. No doubt. Your point is noted but had nothing to do with my comment. Mine wasn't solely on gaming. Cheers.

Not saying it's bad, just saying there's no reason to buy it unless you're in the niche scenario of 4k content creation and don't need/can't afford the true workstation stuff. Which, IMO, is a much smaller market than you think it is.

Edit: Also, at 1080p for YouTube (which is what most casual streamers/content creators will be doing), an RTX 2080 and even a GTX 1080 Ti will be enough for the task, depending on the programs used.
 
My 2080 Ti exports 4K H.265 faster than 30fps with NVENC; quality is much better than previous versions of NVENC, too. I'm quite happy with the card for that as well as gaming.
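For reference, that kind of GPU export corresponds roughly to an ffmpeg hevc_nvenc encode like the sketch below; the source filename, bitrate, and preset are placeholder assumptions, not the settings the poster used.

```python
# Sketch of a 4K HEVC export through Nvidia's NVENC encoder via ffmpeg.
# Needs an Nvidia GPU and an ffmpeg build with NVENC support; the bitrate and
# preset here are illustrative, not recommendations.
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "timeline_export_4k.mov",  # hypothetical source file
    "-c:v", "hevc_nvenc", "-preset", "slow",
    "-b:v", "40M", "-maxrate", "60M",
    "-c:a", "copy",
    "output_4k_h265.mp4",
], check=True)
```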
 
Almost every VRAM-dependent app I use will simply hang or freeze when running out of VRAM; this is not unique to Adobe.

That doesn't excuse Adobe (or anyone else for that matter).

*Is a SW Engineer.
 
That doesn't excuse Adobe (or anyone else for that matter).

*Is a SW Engineer.
Then you should know that you cannot know the exact memory requirement of every task in advance. I'd rather the app try to run in the available space and crash than not even be able to run the task because of a check based on a worst-case scenario.
 
Then you should know that you cannot know the exact memory requirement of every task in advance. I'd rather the app try to run in the available space and crash than not even be able to run the task because of a check based on a worst-case scenario.

You can shuffle memory in VRAM around into main memory if needed; it's NEVER acceptable to have your application crash or hang.
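As a toy illustration of that pattern (and emphatically not how Premiere actually manages memory), here's a sketch using CuPy/NumPy: try to place a buffer in VRAM and, if the allocation fails, keep it in system RAM instead of dying.

```python
# Toy fallback pattern: prefer VRAM, fall back to system RAM on out-of-memory
# rather than crashing. Assumes CuPy on a CUDA-capable machine; a real app
# would do this per buffer with smarter eviction policies.
import numpy as np
import cupy as cp

def allocate_buffer(shape, dtype=np.float32):
    """Return (array, on_gpu): a GPU array if it fits, otherwise a host array."""
    try:
        return cp.zeros(shape, dtype=dtype), True
    except cp.cuda.memory.OutOfMemoryError:
        cp.get_default_memory_pool().free_all_blocks()  # drop cached GPU blocks
        return np.zeros(shape, dtype=dtype), False

# ~8.5 GB of 4K RGBA float frames -- may or may not fit in VRAM.
buf, on_gpu = allocate_buffer((64, 2160, 3840, 4))
print("buffer lives in", "VRAM" if on_gpu else "system RAM")
```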
 
Then you should know that you cannot know the exact memory requirement of every task in advance. I'd rather the app try to run in the available space and crash than not even be able to run the task because of a check based on a worst-case scenario.

Maybe not the exact memory requirement, but a good programmer can make a good "guesstimate", so really there's no excuse to run out of memory.

At the very least the app should be capable of warning that you don't have enough memory.
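A "guesstimate" like that is mostly just arithmetic on the uncompressed frame size. A trivial sketch follows; the bytes-per-pixel (16-bit RGBA) and frames-in-flight figures are assumptions for illustration.

```python
# Back-of-the-envelope VRAM estimate: decoded frames are uncompressed, so the
# unpredictable compressed file size doesn't matter for the working set.
def working_set_gb(width, height, bytes_per_pixel=8, frames_in_flight=16):
    """Rough size of a set of decoded frames held on the GPU, in GB."""
    return width * height * bytes_per_pixel * frames_in_flight / 1e9

print(f"4K UHD: ~{working_set_gb(3840, 2160):.1f} GB")  # ~1.1 GB
print(f"8K UHD: ~{working_set_gb(7680, 4320):.1f} GB")  # ~4.2 GB
```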
 
Maybe not the exact memory requirement, but a good programmer can make a good "guesstimate", so really there's no excuse to run out of memory.

At the very least the app should be capable of warning that you don't have enough memory.
I'd be very mad if the app prevented me from doing something because it (falsely) thinks there isn't enough RAM. I'd rather it try and fail than not try at all.
So far I've seen two kinds of apps: ones that don't say anything but will fail when running out of RAM, and ones that refuse to work on large datasets completely (even when trying half the size that I can load into other apps on the same machine).
A warning would be fine, but I haven't seen that anywhere.
 
You can shuffle memory in VRAM around into main memory if needed; it's NEVER acceptable to have your application crash or hang.
I'd rather it be faster and more optimized than doing overzealous RAM checks every second. Especially if they're upfront about it and tell you that you need this amount of VRAM for this size of data.
 
I'd rather it be faster and more optimized than doing overzealous RAM checks every second. Especially if they're upfront about it and tell you that you need this amount of VRAM for this size of data.

Which you don't; if an allocation fails, simply transfer a chunk of it back into normal RAM/virtual memory to free up space. Imagine if the host OS worked like you describe and BSOD'd every time a memory access failed.
 
You can shuffle memory in VRAM around into main memory if needed; it's NEVER acceptable to have your application crash or hang.
That's a somewhat simple way of looking at it. If you have a chunk of data that is 8GB on its own, and only have 6GB of VRAM, you will have to split the data into smaller pieces before working on it. Even if you had 12GB of VRAM, you may have to split it if the original and modified data don't all fit into memory. In some cases that can have an extremely detrimental effect on the time of the operation, to the point where it might not be worth even trying.
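For what it's worth, that splitting is itself the standard answer. A minimal CuPy sketch of streaming an oversized buffer through VRAM in tiles might look like this; the chunk size and the "effect" applied are placeholders, not anything a real editor does.

```python
# Toy chunked-processing pattern: stream a host-RAM buffer through VRAM in
# pieces that fit, instead of requiring the whole thing to be resident at once.
# Assumes CuPy on a CUDA machine; chunk size and the "effect" are placeholders.
import numpy as np
import cupy as cp

frames = np.zeros((240, 1080, 1920), dtype=np.float32)  # ~2 GB of dummy frames
out = np.empty_like(frames)

CHUNK = 32  # frames per transfer; tune to the VRAM you actually have
for start in range(0, frames.shape[0], CHUNK):
    gpu_chunk = cp.asarray(frames[start:start + CHUNK])  # host -> VRAM
    gpu_chunk *= 0.5                                      # stand-in for a real effect
    out[start:start + CHUNK] = cp.asnumpy(gpu_chunk)      # VRAM -> host
```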
 