Nice sale on the VII

Looks like a good deal for what you get.
Poll Request: 2070S at MSRP vs 5700xt at MSRP vs R7 at $500.

AMD's fine wine seems to have had a sour beginning. This card should have been better supported at launch. I would love to see how it stacks up in 4-5 years.
 
I would probably bite at $399, but that is about it; it would make a very good rendering card. I have two Vega FEs that can do that, but the VII is better. It does slightly beat the 5700 XT in gaming, but needing 1 TB/s of memory bandwidth to do it shows, I think, the progress Navi made for gaming.
 
I would probably bite at $399, but that is about it; it would make a very good rendering card. I have two Vega FEs that can do that, but the VII is better. It does slightly beat the 5700 XT in gaming, but needing 1 TB/s of memory bandwidth to do it shows, I think, the progress Navi made for gaming.

It does amazing FP64 work. Ask the DC team here lol
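If anyone wants to see the FP64 gap for themselves, a quick-and-dirty PyOpenCL sketch along these lines is one way to do it (the kernel names and loop counts are just made-up illustration, not a proper benchmark); it times a serial FMA loop in FP32 and then FP64 and prints rough GFLOP/s:

# Rough sketch: compare FP32 vs FP64 FMA throughput with PyOpenCL.
# Purely illustrative; a real benchmark needs warm-up, unrolling, etc.
import numpy as np
import pyopencl as cl

SRC = """
#pragma OPENCL EXTENSION cl_khr_fp64 : enable
__kernel void fma_f32(__global float *out) {
    float a = 1.0000001f, c = 0.0f;
    for (int i = 0; i < 100000; i++) c = fma(a, 0.5f, c);
    out[get_global_id(0)] = c;
}
__kernel void fma_f64(__global double *out) {
    double a = 1.0000001, c = 0.0;
    for (int i = 0; i < 100000; i++) c = fma(a, 0.5, c);
    out[get_global_id(0)] = c;
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)
prg = cl.Program(ctx, SRC).build()
n = 1 << 20
for name, dtype in (("fma_f32", np.float32), ("fma_f64", np.float64)):
    buf = cl.Buffer(ctx, cl.mem_flags.WRITE_ONLY, size=n * np.dtype(dtype).itemsize)
    evt = getattr(prg, name)(queue, (n,), None, buf)  # launch and keep the event
    evt.wait()
    secs = (evt.profile.end - evt.profile.start) * 1e-9  # profiling times are in ns
    gflops = n * 100000 * 2 / secs / 1e9  # one fma counts as 2 flops
    print(f"{name}: {gflops:.0f} GFLOP/s")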
 
I sold one of my Vega FEs a few months back for $650 on eBay. I kept the other Vega FE in my secondary computer for certain tasks. I gamed last night on my Vega FE with the new 2020 drivers... I did notice much more smoothness. My secondary monitor is a 25-inch 1080p with FreeSync. If it comes down another $100 in the next few days... I will scoop one up for my secondary computer. My 2080 Ti also does really well with Adobe stuff now with the new NVIDIA Studio driver.

At the end of the day, I love a good value.
 
Looks like a good deal for what you get.
Poll Request: 2070S at MSRP vs 5700xt at MSRP vs R7 at $500.

AMD's fine wine seems to have had a sour beginning. This card should have been better supported at launch. I would love to see how it stacks up in 4-5 years.

I was playing BFV Wake Island last night on my Core i7/Vega FE system with the new AMD 2020 drivers. I was getting 90+ fps with all the eye candy turned on. I then played BFV on my Threadripper 1950X/2080 Ti system, and it seems like under DX12 it still lags a bit, where the Vega FE was smooth as silk.
 
I would probably bite at $399, but that is about it; it would make a very good rendering card. I have two Vega FEs that can do that, but the VII is better. It does slightly beat the 5700 XT in gaming, but needing 1 TB/s of memory bandwidth to do it shows, I think, the progress Navi made for gaming.

When AMD killed CrossFire in their drivers over the summer of 2019, that ended the case for keeping two Vega FE graphics cards for gaming, since I no longer crunch video as much as I used to.

What gets me is that developers still get the use of two graphics cards to double their workload... Blender being an obvious example and AutoCAD being another that can use SLI/CrossFire/mGPU, while us gamers are left holding our Johnsons as far as games go.

Yes, there are games that still support dual cards, but they are so few and far between anymore that, IMHO, it is not worth owning two graphics cards. When the VIIs came out, I wanted two of them so badly but couldn't justify it with CrossFire no longer supported.

[H]ardOCP was the reason I had a 295X2 and an R9 290X running in Tri-Fire (CrossFireX) to play Battlefield 4!!!! Fun times!!! What happened? I have my damn money, take it so I can still run a dual-GPU rig!!!
 
Yes, when FC5 was reviewed for the VII, it was sucking down close to 12GB of VRAM with all the eye candy.
Just because it's using 12GB of memory doesn't mean it needs it all. A lot of games load as much as possible into VRAM. I doubt it needs much more than 4GB in reality; it could be very poorly optimized. Gears of War 5 at 3440x1440 at ultra settings didn't even break 7GB, and that game looks way better.
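If anyone wants to watch actual VRAM commitment instead of guessing from in-game counters, here's a rough Python sketch of how I'd log it on Linux with the amdgpu driver (the card0 path is an assumption; check /sys/class/drm/ for your GPU, and keep in mind this still shows what's allocated, not what the game strictly needs):

# Rough sketch: poll amdgpu VRAM usage once a second (Linux, amdgpu driver).
# Assumes the GPU is card0; adjust the path for your system.
import time

VRAM_USED = "/sys/class/drm/card0/device/mem_info_vram_used"
VRAM_TOTAL = "/sys/class/drm/card0/device/mem_info_vram_total"

def read_bytes(path):
    # Both sysfs files report a plain byte count.
    with open(path) as f:
        return int(f.read().strip())

total_gib = read_bytes(VRAM_TOTAL) / 2**30
while True:
    used_gib = read_bytes(VRAM_USED) / 2**30
    print(f"VRAM in use: {used_gib:.2f} / {total_gib:.2f} GiB")
    time.sleep(1)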
 
When AMD killed CrossFire in their drivers over the summer of 2019, that ended the case for keeping two Vega FE graphics cards for gaming, since I no longer crunch video as much as I used to.

What gets me is that developers still get the use of two graphics cards to double their workload... Blender being an obvious example and AutoCAD being another that can use SLI/CrossFire/mGPU, while us gamers are left holding our Johnsons as far as games go.

Yes, there are games that still support dual cards, but they are so few and far between anymore that, IMHO, it is not worth owning two graphics cards. When the VIIs came out, I wanted two of them so badly but couldn't justify it with CrossFire no longer supported.

[H]ardOCP was the reason I had a 295X2 and an R9 290X running in Tri-Fire (CrossFireX) to play Battlefield 4!!!! Fun times!!! What happened? I have my damn money, take it so I can still run a dual-GPU rig!!!
I gave up on CrossFire with the Vega FEs, but for other applications that use OpenCL or even multi-GPU they used to work great. Now they are in separate machines, though I might put them back together if I build a Threadripper rig, three cards total, all at PCIe 16x. I would have to get back into serious 3D work again to justify that. I played FC5 in HDR and completed the game at 144Hz FreeSync with these cards, an awesome experience! Later (not sure exactly when) CrossFire stopped working in that game, and I got the 5700 XT anyway. The Vega FEs also did not work in multi-GPU in Shadow of the Tomb Raider, which was pretty sucky; the 1080 Tis did, and did well in SLI HDR with some manual driver configuration. Rise of the Tomb Raider multi-GPU did work well with the FEs, as a note. Anyway, for gaming CrossFire is just not worth it; SLI in older and a few newer games is.
 
I was playing BFV Wake Island last night on my Core i7/Vega FE system with the new AMD 2020 drivers. I was getting 90+ fps with all the eye candy turned on. I then played BFV on my Threadripper 1950X/2080 Ti system, and it seems like under DX12 it still lags a bit, where the Vega FE was smooth as silk.

That was because of the Threadripper, not because of the 2080 Ti lol
 
No. Battlefield 1 and BF4 play perfectly on both systems.

Might be due to certain updates then. Last time I tried BFV on an R7 3800X and a TR 1920X, the 1920X was noticeably jankier when it came to frametime consistency in certain areas, whereas the R7 had zero issues.
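If anyone wants to put a number on that jank instead of eyeballing it, a small Python sketch along these lines works on a PresentMon/OCAT frametime capture (MsBetweenPresents is the column those tools log; the file name is just a placeholder):

# Rough sketch: average FPS, 1% low, and frametime spread from a PresentMon/OCAT CSV.
import csv, statistics

frametimes = []  # milliseconds between presents
with open("capture.csv", newline="") as f:
    for row in csv.DictReader(f):
        frametimes.append(float(row["MsBetweenPresents"]))

frametimes.sort()
avg_fps = 1000 / statistics.mean(frametimes)
slowest_1pct = frametimes[int(len(frametimes) * 0.99):]  # worst 1% of frames
print(f"avg {avg_fps:.1f} fps, "
      f"1% low {1000 / statistics.mean(slowest_1pct):.1f} fps, "
      f"stdev {statistics.stdev(frametimes):.2f} ms")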
 
This is 1080p Ultra with system memory paging making up for my lack of VRAM, because 8GB is all that an RX 5700 has. This system has 32GB running at 3600MHz, which may be what that 32MB of L3 was really made for, if it's acting like it's feeding a Vega 10 on a 2200G.




 
I have a Radeon VII, and many games will preallocate the full 16 GB stack. Good to have.
 
In my experience, Radeon VII is a superb card for both gaming and content creation. I swapped mine out recently for a 2080 Super (only because I got a stonking good deal on a G-sync monitor). At 4k where the card really shines due to the extra cores and mem bandwidth, there is no discernible difference for me between the VII and 2080 Super, and certainly no difference that merits $200 extra for the nV card.

I don't, however, think 16GB is going to make much difference even at 4k now. Maybe >8GB will be important next year after the new consoles are released and games are developed that need it, but being real, you'll probably want a new card by then anyways. Of course games will preallocate 16GB, but there are very few circumstances when they actually use anything even close to 8GB.

$499 is a great deal for this card, IMO, especially if you're going to play at high resolution and IQ.
 
Looks like a good deal for what you get.
Poll Request: 2070S at MSRP vs 5700xt at MSRP vs R7 at $500.

AMD's fine wine seems to have had a sour beginning. This card should have been better supported at launch. I would love to see how it stacks up in 4-5 years.

If developers were able to work as closely with AMD as they do with Nvidia for virtually every game, I believe the card would have been a stellar performer. From what AMD has been able to achieve with Vulkan, I feel like they have the brainpower to pull it off, but it just comes down to how much money AMD is willing to invest in the GPU business. Now that their CPU cash cow has finally paid off, I honestly believe they will be able to start putting enough money into R&D to get their hardware and supporting software into a position where they can compete with Nvidia where price/performance matters most. Remember, Nvidia is roughly twice as good on efficiency, and that will only improve as they release new revisions and architectures.

I remember when I had a 5970 and a 5870 in Tri-Fire. I would have to change my driver depending on which game I was playing because the performance difference was night and day. They've done a good job specializing for specific titles, but RTG has really struggled to produce a good general driver that caters to all needs. Hopefully they will keep working on game-specific code and get the most out of every title.

It's going to be an exciting next five years. We've finally made it into the realm of more video card RAM and faster streaming storage devices. This is going to go a long way toward scaling up the immersion of our game worlds. If I could get rid of one thing forever in games, it might just be fade-in, especially of grass and object textures. If you want to throw away all the joy of immersion, throw a bush 5 feet in front of me on an open road. I never liked how they scale up the resolution of those textures either: you get these single-color textures, then some lines, then poof, out of nowhere it's actual rock and grass 5 feet away from you, while the rest of the game world is a sea of faded-out textures. Very disappointing. From a performance standpoint I can understand why they implement it like this, but I really feel like it should be all or nothing. Give me grass that doesn't disappear, or just throw a flat high-res grass texture on the ground instead. Honestly, I'd rather lean toward either extreme than the compromise of mediocrity.

Also, I feel like gaming has been at a plateau for quite a while. While consoles are finally going to have legitimate 4K gaming with proper hardware to support it, you really need 4K to make a game look good on anything over about 42". I still feel like 1440p on a 27-inch monitor delivers a much crisper, tighter image than a large 4K TV. Now that the PC is starting to make 4K gaming a 'thing' at standard monitor sizes, I'm not sure how much I really care for it over 1440p. Most of the time I simply use it as an AA method when games don't have sufficient support. If we could just get photorealistic textures and fewer box/hexagon-shaped game worlds, I'd be a pretty happy camper. Every time I see a circle in a game, the first thing I do is check how many sides it has. It's a fun game to play, and you can see where devs had to make sacrifices to hit the spec they needed for their target delivery.

Back on topic, however: I really hope the 5700 series of cards doesn't get shafted when the next refresh comes out. Since it will probably be identical hardware to what the next-gen consoles ship with, AMD will have a much greater impetus to optimize their software for it. It took six months for them to release a stable driver for the 5700 XT (the 20.1.2 optional driver that came out 1/15/20). I had more crashes in three months with the box running the 5700 XT than in the entire lifetime of any other video card I've ever owned. If they can magically fix it with one update, six months late, then RTG's driver/software division is still starved for operating budget.
 
I wouldn't buy a Radeon VII today, even for $500. I mean, it was a great card, I used to have one, but I feel like the 5700 XT (or even plain 5700) is better bang for buck.

I have a 5700 XT now and performance seems pretty close to what the VII had, but for much cheaper. I guess if you are doing 4K, then the VII pulls ahead, but anywhere lower than that you are probably better off with the XT.

Just my opinion.
 
I wouldn't buy a Radeon VII today, even for $500. I mean, it was a great card, I used to have one, but I feel like the 5700 XT (or even plain 5700) is better bang for buck.

I have a 5700 XT now and performance seems pretty close to what the VII had, but for much cheaper. I guess if you are doing 4K, then the VII pulls ahead, but anywhere lower than that you are probably better off with the XT.

Just my opinion.
Well, 8GB vs 16GB, and less compute than the RVII. The RVII is only better for certain purposes; for games an RX 5700 XT is OK, but a plain 5700 is not comparable unless it is modded with the 5700 XT BIOS, and even then it is still slower than an RVII.
 
I mostly agree. If all you do is game, then a 5700 XT makes way more sense. At this point the VII is just for specific use cases and absolute performance on an AMD card, for those willing to pay a premium (even at $500). Most need not apply.
But if you do heavy compute and game on the side, then the VII starts to make a lot of sense. If you don't need CUDA, it's actually the top-performing consumer card for compute, even beating a 2080 Ti. But, once again, it comes down to use case.

Otherwise, for $500 it makes more sense to buy a 2070 Super or simply spend less on a 5700XT.

Still, I picked up my VII in November for $500 and I've been super happy with it. It has significantly improved all my render times and timeline render times. It has definitely allowed me to play with more layers and more intensive grading that I would often skip before, just because of how taxing it was on my system; things like heavy noise reduction and a larger, more sophisticated color grade layer stack, which have made noticeable, tangible improvements to how good the finished product actually looks, and, as mentioned earlier, doing it all faster lets me get more work done.
It has also more or less allowed me to max out, or get close to maxing out, every modern game in 4K (in The Outer Worlds I used 80% resolution scaling and High shadows as a couple of compromises with everything else maxed, and in DX:MD I ran 3200x1800 with all other settings maxed, as another example).

I've been really happy with the performance and fidelity throughout. But it hasn't been a perfect experience. I did have to do the washer mod and repaste, as well as find ways to keep the card cool, or else deal with overheating and thermal runaway. I've now fixed those problems and it runs without a hitch, but I'll admit I wish I hadn't had to fix them in the first place.
 
It was a card that was really designed/intended for workstation use and gaming is/was an afterthought.
Love that HBM2 memory - hope it appears with Big Navi
 
It was a card that was really designed/intended for workstation use and gaming is/was an afterthought.
Love that HBM2 memory - hope it appears with Big Navi

There is a lot of speculation about the whys of the VII, but the short answer is that it's basically a cut-down Instinct MI50 card.
It's true that it perhaps wasn't "designed" to be a gaming card, and the likely reason for its existence is that Nvidia decided to price its top card at $1200 and its second card at $800. It then became viable for AMD to release this cut-down workstation card, built on very expensive silicon, still make some level of profit, and stay at least somewhat relevant in the consumer space. So in that sense it wasn't directly designed to be a gaming card, but workstation parts can at least fill the role adequately.
In the months before launch there was plenty of speculation from third parties as well as from inside AMD about how much to cut things like double precision in order to not cannibalize enterprise and workstation sales.

Really though, if Nvidia hadn't priced their cards so high, we likely never would've seen the VII. And if the VII had actually sold in any real quantity and actually had games optimized for it, it would likely have gained at least 25% in games. That's a bold claim, but the VII sits largely idle during much of a typical frame and suffers from, generally speaking, never being fully utilized.

This video really fills in the gaps and does an incredible job at explaining the architecture (with examples) as well as showing Radeon VII's shortcomings:
 
Excellent summary ^ I'd also put forward that the VII exists(ed) in part because AMD knew they didn't have anything in the high end (until Navi came a few months later) and wanted to ensure they were filling a segment, cost and efficiency be damned.

I still hold that it's a solid card for high-res/high-IQ gaming, and a bargain for compute. Mine did a very respectable job for a good while.
 
Would love to drop a pair of VIIs into my X58 3-way SLI board. I run a pair of RX 570s (8GB and 4GB, with the 8GB always the master card); they rip about 26,000+ GPU score on that old-ass Xeon X5660 in Fire Strike.
 
Would love to drop a pair of VIIs into my X58 3-way SLI board. I run a pair of RX 570s (8GB and 4GB, with the 8GB always the master card); they rip about 26,000+ GPU score on that old-ass Xeon X5660 in Fire Strike.
Bad idea pairing an AMD GPU with a low-IPC CPU, especially for DirectX 11.
 