Nvidia RTX 2060 for $350 or AMD Vega 64 for $370 on a mid-tier gaming machine?

Not if you are limiting power usage... GPUs naturally clock down when not under load.

If you're not loading it, then you don't need to worry about power usage (these days). What you're doing is putting in an artificial block to prevent the power usage from going too high regardless of the load.
 
I do gaming on it... and that caused a reboot. It pulled too much power at peak. If I limit max power down 20%, it works perfectly.
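For anyone curious how a cap like that can be applied outside of Afterburner or WattMan, here is a minimal sketch using the NVML Python bindings (pynvml) to drop an Nvidia card's power limit by 20%. This is purely an illustration of the idea, not what the poster above used, and it assumes you have the pynvml package installed plus permission to change the limit:

```python
# Sketch: cap an Nvidia GPU's power limit at 80% of its default value.
# Assumes the nvidia-ml-py (pynvml) package and sufficient privileges.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

target_mw = max(min_mw, int(default_mw * 0.80))   # "limit it down 20%"
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(default {default_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```

AMD cards expose the same kind of cap through WattMan on Windows, or through the amdgpu hwmon power1_cap file on Linux.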
 
Was referencing the 1080. Vega is the undisputed compute champ among the choices, but suffers in some popular titles. I would pick Vega in general. However, it does suffer in some games I play compared to Nvidia: AC, FF XV, etc.

It's slower than a 1080, and still uses more power for gaming. I recognize Vega's raw compute capacity, but it should also be stated that that is not a decision point in and of itself. Given AMD's less advanced software platform, applications need to be evaluated on a case-by-case basis; in particular, anything that runs well in CUDA is worth taking a close look at.

Vega 64. I plan to sell my 1080 Ti and get a Radeon VII when it is released.

So you like spending cash and getting no return...?
 
This is with level 3 textures; playing with the settings, it will go all the way to needing a Radeon VII and its 16GB to max this game out in DX12 at 1080p.


Many games load more VRAM than needed simply because the card has more VRAM. It's called caching. For a proper test, you need the SAME exact card in both 4GB and 8GB versions and then compare the performance. There are a few examples floating around the web, and yes, you will see minor differences, but then again, 4GB vs. 8GB, with 6GB in between, kind of narrows things down.
 
I looked at that review. The AMD cards were using Catalyst drivers, meaning old ones.

They left a lot of performance on the table by not updating their benchmark set for Vega after the Adrenalin driver update came out.

It's not a fair comparison at all. I get not wanting to retest everything, but it's a bit of a disservice to compare that way. When you look at newer benchmarks using the newest drivers for the AMD Vega 64, the Vega pretty much beats the 2060 in most everything.

Why do you keep repeating stuff that can be easily verified to be false?
https://www.techspot.com/review/1781-geforce-rtx-2060-mega-benchmark/
"Our GPU test system consisted of a Core i9-9900K clocked at 5 GHz with 32GB of DDR4-3200 memory. We used AMD's Adrenalin 2019 Edition 19.1.1 drivers for the Radeon GPUs and Game Ready 417.35 WHQL for the GeForce GPUs."

If anything, the guys at TPU/HWUB are AMD fans. Not only that, but they recently did a video complaining about how Nvidia shut them out of early 2060 coverage. These guys really don't like Nvidia and are not getting along with them right now. They weren't going to leave AMD on old drivers, and clearly they didn't.

As much as people like to shit on Nvidia, the RTX 2060 is actually a good card and decent value in today's market.
 

Hey, YOU, stop making stuff up. :rolleyes:o_O

The article we were talking about was not the one you just linked. Nice bait and switch there.

The article in that quote, where I said TechPowerUp was using old Catalyst drivers, was in fact using old Catalyst drivers on an i7-8700, as described on page 6 of the article.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/6.html

Post 13 and Post 20 (attached images).
 
Read post 24 in this thread.

If you can handle the noise and heat, the Vega 64 is the superior card. It has faster clocks that most Vega 56 cards can't quite hit, and it has extra compute units over the Vega 56.

If you are buying a new GPU today, I would choose an 8GB card. There are many versions of the Vega 56 and Vega 64; different companies make them (ASUS, Sapphire, PowerColor...), and there are differences in cooling, temperatures, and capacitors. A friend has an ASUS ROG Strix Radeon RX Vega 56 OC Gaming. He is satisfied; he did undervolt the card. AMD supports FreeSync over both HDMI and DisplayPort; the RTX 2060 and GTX 1060 support it over DisplayPort only.

Your FreeSync monitor has to have DisplayPort 1.2a or 1.4. Any HDMI implementation of FreeSync will not work at this time, even if the monitor does technically support FreeSync. DisplayPort 1.2 or older on either the panel or the GPU will be incompatible.
 
Hey, YOU, stop making stuff up. :rolleyes:o_O

The article we were talking about was not the one you just linked. Nice bait and switch there.

Oops. My mistake. Too many tech sites and I was just looking at the review on Techspot.

So I don't think drivers had that much effect.

Techspot tested across 30+ games and they used the latest Adrenalin drivers and reached essentially the same conclusion.

2060 performs nearly as well as the Vega 64.

And it tends to mainly be the crappy blower-design Vegas that go on sale, while there are many dual-fan 2060s at the base price.
 

I think the extra money you pay for the Vega 64's extra 2GB of RAM over the 2060 is money well spent, blower fan or not. If you change video cards yearly, I suppose it doesn't matter. If you don't mind using lower-res textures in some cases, it doesn't matter either.

In any case, I'd pick up a used 1070 for $200-$225 on eBay before I'd even think of picking up a 2060, due to the RAM bonus alone.
 

Is it a lot more for a non-blower? I would never consider an AMD blower... especially at Vega 64 wattages. The only way I see it making sense is if you live alone and wear a headset all the time.

IIRC the OP went for a used 1080 in the same price bracket.
 
I'd take a 1080 over a 2060 every day of the week, and twice on Sunday :)
 
Many games load more VRAM than needed simply because the card has more VRAM. It's called caching. For a proper test, you need the SAME exact card in both 4GB and 8GB versions and then compare the performance. There are a few examples floating around the web, and yes, you will see minor differences, but then again, 4GB vs. 8GB, with 6GB in between, kind of narrows things down.


I just so happen to also have an XFX RX 570 RS 4GB card if you would like me to load it up and see; the clocks are a little higher, but the VRAM is the question, not the clocks.
 
When you hit memory limits, performance absolutely plummets. You'll know it instantly and it'll be unplayable. That's what happened on my Fury X at 7680x1440 (3 x 32" 1440p).

At medium or high settings it would be fine. At ultra settings it would slow to a crawl in a game like EA's Star Wars Battlefront. You could very obviously tell it was hitting memory limits, because you'd toggle a setting or two from ultra to high, settings that are tied to high VRAM usage like AA or texture detail, and it'd go from a few frames per second to around 40 FPS with one setting toggle.
 
I just so happen to also have an XFX RX 570 RS 4GB card if you would like me to load it up and see; the clocks are a little higher, but the VRAM is the question, not the clocks.

Agreed, and doesn't Nvidia make slightly more efficient use of memory as well? Until I see some evidence, I don't see 6GB NV vs. 8GB AMD as a big issue.
 


OK, I pulled the 8GB card, dropped in the 4GB card, and did a fresh install of driver 19.1.2. The only difference is a 36MHz lower GPU clock (1286MHz). Since the driver pairs with Ryzen, it's also back to boosting 4 cores + 4 SMT threads to 3750MHz, if Deep Blue can keep it cool. Fire Strike result: http://www.3dmark.com/fs/18064616

EDIT:

I will be back in about an hour with some kind of video for you to see, but the driver wants to make changes to the system and I will record that for you first, as AMD now has AI as a platform, or that's how I view it.

Well, I cannot get ReLive to start with driver 19.1.2 and the 4GB card, but under these settings the texture GB levels are off limits, as there is not enough VRAM for any level; that is what I found running the card. Everything runs fine otherwise, but you can tell.

This was 2GB textures on the 8GB card, as posted on the front page, but I will move it here; it is something I need to go back and remake, and I also need to drop the 1600 in.
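Side note for anyone who wants to log what the card itself reports during a run like this: on Linux the amdgpu driver exposes VRAM counters through sysfs, and a few lines of Python can poll them while a game runs. This is only a sketch of the idea (the testing above is clearly on Windows, where GPU-Z or the Adrenalin overlay does the same job), and keep in mind it shows allocated VRAM, which, per the caching point earlier in the thread, is not the same as what the game strictly needs:

```python
# Sketch: poll the amdgpu driver's VRAM counters once a second.
# Assumes Linux with amdgpu; the card index and paths may differ per system.
import time

CARD = "/sys/class/drm/card0/device"

def read_mib(name):
    # amdgpu reports these counters in bytes
    with open(f"{CARD}/{name}") as f:
        return int(f.read()) / 2**20

total_mib = read_mib("mem_info_vram_total")
try:
    while True:
        used_mib = read_mib("mem_info_vram_used")
        print(f"VRAM allocated: {used_mib:6.0f} / {total_mib:.0f} MiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
```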



 
It's slower than a 1080, and still uses more power for gaming. I recognize Vega's raw compute capacity, but it should also be stated that that is not a decision point in and of itself. Given AMD's less advanced software platform, applications need to be evaluated on a case-by-case basis; in particular, anything that runs well in CUDA is worth taking a close look at.



So you like spending cash and getting no return...?
Pretty much :) I am not a huge fan of Nvidia either, but that was basically the only choice I had these days. I am going to wait and see what kind of benchmarks we get.
 
Building this machine for a friend:
https://hardforum.com/threads/1600-all-in-gaming-build-quick-gut-check.1976067/

His primary game right now is PUBG, and he spends a lot of time in that title. He plays almost exclusively FPS titles.

What video card would you recommend?

You should just return everything and get this system for $1400

https://www.amazon.com/CYBERPOWERPC...07H7Q34X5/ref=cm_cr_arp_d_product_top?ie=UTF8
 
Here's a vid that actually uses the current Adrenalin drivers, and the Vega 64 steamrolls the 2060 in almost everything, including Assassin's Creed and PUBG (where older AMD Vega drivers lose hard to the Nvidia 2060). This vid paints a much different picture. So much so that I deleted the previous video post, because it is pretty obvious they were using old recycled benchmark numbers from previous tests with very old drivers (like TechPowerUp).



Something is really fishy about those results.

In particular, the Fortnite results.

https://www.techspot.com/review/1781-geforce-rtx-2060-mega-benchmark/page4.html

https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/page2.html



https://www.pcgamer.com/nvidia-geforce-rtx-2060-review/



[Fortnite benchmark results chart]


Those Fortnite results look heavily suspect, as Fortnite is known to be an Nvidia-biased game, and compared to other websites the results are really, really off. While the RTX lineup is where it should be, the Vega 64 jumps from 81-85 fps in other reviewers' results to 150 fps in this highly fishy result. That is a massive 76%-85% difference.

Look back at the older results and the Adrenalin drivers really didn't do much for Fortnite performance (even when using the same For Gamers results at 3min 48sec).



Taking this into account, and even allowing for the overclock, the discrepancy is massive. The Assassin's Creed Odyssey results (it's performing like an RTX 2080) and the Overwatch 1440p results look immediately off as well.

[RTX 2070 review benchmark chart]


[Overwatch 1440p benchmark chart]


A liquid-cooled Vega performs very close to the level of a 1700MHz Vega 64, and we see it is faster than an RTX 2060, but not by nearly as much as the deltas in this For Gamers review.

This review looks incredibly cooked.
 

Pretty sure the 2060 will have some voltage limitations like the others, but it'd be quite funny to see. I'm going to see how much power someone managed to get a Vega to suck down too, lol... goddamn, they have bigger power circuits than the 2080 Ti, lmao!

GamersNexus did a video on this. It's pretty crazy what a no-limit card can pull. Red is stock (roughly 340W system draw), blue is modded (spikes around 630W).





 
I can't run AC Origins and Odyssey at 100% GPU utilization at 1080p using an R7 1700X at 4GHz. Something is still not right with the AMD 2019 driver.
 

Sure those games aren’t garbage for both vendors?
 
Yep. That's what happens when someone scours the internet for a result to fit their agenda, instead of sticking to reasonably reliable sources.

And this is why AMD's guerrilla marketing team has targeted YouTubers. A small YouTube channel is a lot cheaper to pay off than any large mainstream website, or even a non-video reviewer, which has more credibility to lose because of its history.

Looking at that review by For Gamers, they basically took any popular game that Nvidia is known to win at, like PUBG, Fortnite, and Overwatch, and strategically turned it into an AMD win somehow. It is easy to see why they would do it: these games are the ones most likely to be played or viewed by YouTube audiences, and that is why these results were inverted. Eventually it makes its way to forums because the AMD guerrilla marketing team, or team-red members (their free guerrilla marketing), pick it up and the results get passed along and spread.

Look at YouTubers like this one and Joker Productions and you will see AMD hardware doing a lot better than what other websites are showing, specifically in Vega 64 vs. RTX 2070 videos where the Vega 64 comes out ahead. But look at Techspot's (Hardware Unboxed) review and HardOCP, two websites with absolutely no bias towards Nvidia, and you will see that this is not the case (and this is reflected by pretty much every website out there).

Add in how forums work, where people typically have confirmation bias and don't use logic to check whether something funny is going on, and this is why guerrilla marketing is so effective. It is simply too easy to trick targets with a predisposition for AMD. Look at the people who liked the post with that review and you will see that specifically. But sneaky crap that harms the consumer needs to be called out, whether it is for Nvidia or AMD.
 
You could always look into a cheaper Vega 56 with Samsung memory and flash it to the Vega 64 BIOS. I'm running an ASRock Vega 56 with Samsung memory on a Vega 64 BIOS, undervolted with settings of:

P6 1477MHz @ 950mv
P7 1502MHz @ 1000mv

HBM @ 1100MHz @ 1030mv

With these settings on the stock blower cooler at 50% fan speed, I see GPU temps around 66-71C, HBM temps of 70-78C, and hot spot temps from 75-82C. I average around 90-100fps in FF XV at 1080p High settings and a 6907 graphics score in Time Spy. In FF XV I see power draw of 165-185W, Destiny 2 145-170W, 3DMark 170-190W, Furmark 250W.


This assumes you are comfortable with flashing the BIOS and spending the time to tweak out the settings.
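For reference, on Linux the same sort of per-P-state undervolt can be scripted through amdgpu's pp_od_clk_voltage interface (it needs the overdrive bit set in amdgpu.ppfeaturemask and root). The sketch below just mirrors the P6/P7/HBM numbers from the post above; it is an illustration rather than a recommendation, since every card bins differently:

```python
# Sketch: apply a Vega 56/64-style P6/P7 undervolt via pp_od_clk_voltage.
# Assumes Linux, amdgpu with overdrive enabled (amdgpu.ppfeaturemask), and root.
PP_TABLE = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def od_write(cmd):
    # Each write is one command line understood by the driver's OD parser.
    with open(PP_TABLE, "w") as f:
        f.write(cmd + "\n")

od_write("s 6 1477 950")     # P6: 1477 MHz @ 950 mV
od_write("s 7 1502 1000")    # P7: 1502 MHz @ 1000 mV
od_write("m 3 1100 1030")    # HBM2 P3: 1100 MHz @ 1030 mV (format varies by kernel)
od_write("c")                # commit the modified table
```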
 