RX Vega Owners Thread

Vega multi-monitor bezel correction is still broken two months after release.

FreeSync is finally working for me after a Windows reformat or a driver update; I'm not sure which fixed it. (It was working before, but driver updates broke it on two separate occasions. Currently it's working.)

I've seriously just about lost my patience with this card and its buggy software.

I've really experienced a LOT of hassle with my AMD Vega cards. I'd not recommend them to anyone at this point in time, quite honestly.

Someday they’ll get it squared away—that day has not yet come.

9DF59676-3FED-4C49-A278-6AAF2FDE4762.jpeg


Right after this pic, the computer bluescreened and rebooted to some crazy upside-down image at a low resolution. Then it was in a funk for a while, blinking on and off and generally messing up until I hit reset again.

So far, after loading my OS fresh tonight, I've had to hard-reboot four times with these stupid Vega cards and my three-monitor setup. If you just want a single monitor for gaming, it seems to be working OK... but that's not why I bought Vega.
 
It hasn't been too bad for me. I mean, the drivers are kind of buggy. I've had to DDU a few times just to get Crossfire working. And OpenGL is still broken.

However, I just played 1 hour of Prey 4K CF maxed out (except SMAA on 1x) and I am getting around 80+ fps, which is nice for a new title. That said, a single 1080 Ti would probably have similar numbers, but I wanted to go AMD on this machine so I'm happy with what I got.
 
I'm selling my three Vegas (one Liquid Cooled Limited Edition, two Vega 56s).

I'm dropping this frustration...

Anyone interested in the cards PM me.

I thought FreeSync was working after my reformat and driver update. Excitedly, I loaded up Star Wars Battlefront, only to be disappointed once again: FreeSync is definitely not working with all three displays. I think it only works with a single display in the current driver (after a fresh OS reformat and the current x.10.1 driver).

Ironically, AMD, the reason you captured me as a customer is the reason you are going to lose me as a customer (excellent Eyefinity and FreeSync support on the Fury X cards in Crossfire).

I guess I'll sell off these three 32" Omen Freesync monitors and get a GSync display of some sort.

Attached is a quick camera clip showing the extreme tearing with three monitors and FreeSync. What a joke.


Every little buggy annoyance is that much more of a frustration since I have eight 1080 Tis in the next room over, mining away. I used a couple of those 1080 Tis for a few weeks between when I sold my Fury Xs and when I got the Vegas, and I'm a fool for ever buying Vega. The 1080 Ti cards had zero issues; I just liked FreeSync and 75 Hz too much to give them up. It truly is a letdown to give up adaptive-sync technology, and as I've mentioned before, I liked the pair of Fury Xs in Crossfire with FreeSync significantly more than the pair of 1080 Tis in SLI without FreeSync. I guess I'm ready to trade away all the bugs I'm experiencing, even if it means losing FreeSync and/or paying the upcharge for G-Sync displays.

This isn't about AMD's slightly lower performance compared to NVIDIA; it's about the crappy beta-level driver software that doesn't even support the bullet points AMD keeps throwing out there to its loyal customers.

AMD, get your crap together!
 

Attachments

  • IMG_0783.zip (577.9 KB)
Fair enough. I can understand your frustration. If I were only using one machine, I think I would definitely go with the 1080 Ti.

If I was starting fresh, I'd probably get one of the curved ultrawides. Like this:
https://www.newegg.com/Product/Product.aspx?Item=9SIA24G6659684

Probably about the sweet spot of what you can realistically render with G-Sync and 100Hz. I have a triple 1440p setup as well, but I never use it because barely any games can run at that res (even with GTX 1080 SLI). However, they are really useful for work, so I probably won't replace them.
 
I can get 60 FPS locked on Borderlands 1 with Vega 64, 4K max settings (framerate limit in game, I haven't tried disabling it).

In Dishonored 1 I can get around 80 - 90 FPS 4K max settings with 1 Vega 64, or 150+ FPS in Crossfire.

Left4Dead I'm getting around 150 - 200 FPS with Crossfire 4K maxed (I can't remember exactly how it was with one card, but I believe above 100 FPS).

Thank you for the info (y)
 

Are you going to pull one of your 1080 Tis from your mining rig? So far I have not had any real issues but, then again, I am only running one Vega 56 and do not have three monitors going either, so I am not pushing things like you are. Hope the sale goes well; I would probably have bought the Sapphire one from you if I had not already bought the one I have. (Seems the PowerColor one may have been part of your problems, but I would not want to have to deal with that either.)
 
(Seems the PowerColor one may have been part of your problems, but I would not want to have to deal with that either.)
I think you're right. I had intermittent problems along the way until it just outright failed, and that failing card could have been a big part of the issue.

But the milk has been spilt.

FreeSync doesn't work on my three monitors regardless of card. Same for bezel correction. PUBG will crash about five times in about eight games on any of the Vegas. I know people say PUBG just doesn't work well with AMD in general. Well, sigh, that's what my friends and I are actively playing right now.
 
Do any of you Vega owners own Wolfenstein: The Old Blood? I purchased it a while back but never played it. The new game is coming out, so I figured I'd better make my way through this one.

I played it for about 1.5 hours tonight on the High preset at 1440p (no changes other than choosing High) with the liquid-cooled Vega 64.

It's getting between roughly the mid-30s FPS and the 60 FPS cap. Mostly it sits at 60 FPS, but there is quite a bit of variance.

Assuming this is the same engine as the new Wolfenstein game (the one that had a bundle option with these Vega cards), is it weird that I'd be anywhere close to 30 FPS at times (and not too rarely, either)? I would have thought this card would be able to hold this game at a locked 60 FPS on high settings. The game defaulted to medium settings; I switched to the High preset and started playing.

It's not terrible to hit 30 FPS on occasion when it is mostly at 60 FPS, but it's not good either. Not on an $800 card and a game that's a year or two old.

My system, as mentioned a few posts above, is a brand-new fresh reinstall of Windows. I have an X99 system with 16 GB of 2400 MHz DDR4 RAM and an Intel 6850 overclocked to 4.2 GHz on water cooling. I'm using the newest AMD x.10.1 drivers.

If you have this game, would you mind loading it up and seeing if your frame rates are consistent or pegged at 60 FPS over about five minutes of play time?
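For anyone willing to test, here's a rough sketch of how a run could be boiled down to one comparable number. It's plain Python over a list of per-second FPS samples from whatever overlay or capture tool you use; the sample data and the helper name are made up for illustration:

```python
# Sketch: summarize how "pegged at the 60 FPS cap" a run was.
# Assumes a plain list of per-second FPS samples exported from any
# overlay/capture tool; numbers below are invented for illustration.

def cap_adherence(fps_samples, cap=60.0, tolerance=1.0):
    """Return (fraction of samples at/near the cap, worst sample)."""
    at_cap = sum(1 for f in fps_samples if f >= cap - tolerance)
    return at_cap / len(fps_samples), min(fps_samples)

samples = [60, 60, 58, 36, 60, 60, 41, 60, 60, 60]  # made-up 10 s run
pct, worst = cap_adherence(samples)
print(f"{pct:.0%} of samples at the cap; worst dip: {worst} FPS")
```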
 
Do any of you Vega owners own Wolfenstein: The Old Blood? ... If you have this game, would you mind loading it up and seeing if your frame rates are consistent or pegged at 60 FPS over about five minutes of play time?


Well, I guess I'm not alone:
http://www.neogaf.com/forum//showthread.php?page=5&t=1040830
 
The last few AMD drivers have tanked OpenGL performance. I've seen this in RAGE and DOOM, where performance drops to 30 fps when it should be around 80. Wolfenstein uses the same engine, I believe, so it's probably the same issue I've found.
 
I have had zero problems with my 56 from PowerColor. The initial drivers were buggy as hell, but it's smooth sailing now.
 
Did ya do any voltage changes?

Yup, that did it. I've been trying to undervolt more; -150 mV was stable for a while, and then I was artifacting pretty badly. Currently set to 1540/1075 with -125 mV and a +15% power limit. Hit 10700 in Superposition with a max draw of 225 W per Afterburner. I'm happy with this. Stock at Turbo scored less and would break 300 watts.
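As a sanity check on why the draw falls that much: dynamic power scales roughly with V² × f. This is only a back-of-the-envelope sketch; the ~1200 mV stock voltage is an assumption (1075 mV is consistent with a -125 mV offset from it), and static leakage is ignored:

```python
# Rough estimate of the power saved by an undervolt at the same clock.
# Dynamic power ~ V^2 * f; leakage ignored, so treat this as ballpark only.

def dynamic_power_scale(v_old_mv, v_new_mv, f_old_mhz, f_new_mhz):
    """Ratio of new dynamic power to old, assuming P ~ V^2 * f."""
    return (v_new_mv / v_old_mv) ** 2 * (f_new_mhz / f_old_mhz)

# Assumed ~1200 mV stock vs. 1075 mV (-125 mV) at the same 1540 MHz.
scale = dynamic_power_scale(1200, 1075, 1540, 1540)
print(f"Estimated power ratio: {scale:.2f}")              # ~0.80
print(f"From ~300 W that's roughly {300 * scale:.0f} W")  # ~240 W
```

That lands in the right ballpark of the measured 225 W without any tuning.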
 
The last few AMD drivers have tanked OpenGL performance. I've seen this in RAGE and DOOM, where performance drops to 30 fps when it should be around 80. Wolfenstein uses the same engine, I believe, so it's probably the same issue I've found.

What's the GPU utilization at the moments when the FPS is low?
 
Yup, that did it. I've been trying to undervolt more; -150 mV was stable for a while, and then I was artifacting pretty badly. Currently set to 1540/1075 with -125 mV and a +15% power limit. Hit 10700 in Superposition with a max draw of 225 W per Afterburner. I'm happy with this. Stock at Turbo scored less and would break 300 watts.


Are those voltage changes also there at 2D clocks? If they are, then that could be the cause. Different parts of the chip are going to behave differently when you change voltages, and voltage tolerances differ across the different units of the chip, and across different chips as well.

I'm not sure how Vega's voltage is set up in Afterburner. But with NV cards, using the curve, I have had no problems in some games, while other games just won't run at lower voltages; it's just different tolerances within the chip itself.

But I'm assuming it's like Polaris, so yeah, it's just a global setting for Vega's voltage, and that voltage change will be there at 2D clocks.
 
Are those voltage changes also there at 2D clocks? If they are, then that could be the cause. ...
Depends on whether he used the % adjustment or set the individual states; only the 6th and 7th states can be changed in Wattman. Of course, this is an assumption based on the graph for each state, with only the 6th and 7th showing voltages while the others just show N/A. Like this:
Wattman.jpg


Just checked the % slider and the graph does not move for the lower states.
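To make that behavior concrete, here's a toy model of the table in Python. The per-state clocks and voltages are placeholders read off a graph like the one above, not authoritative defaults, and `apply_voltage_offset` is a made-up helper, not a real driver API:

```python
# Toy model of Vega's DPM table as Wattman presents it: eight P-states,
# but only states 6 and 7 expose an editable voltage (the rest show N/A).
# Values are illustrative placeholders, not real defaults.

P_STATES = {  # state: (clock_mhz, voltage_mv or None if locked)
    0: (852, None), 1: (991, None), 2: (1138, None), 3: (1269, None),
    4: (1312, None), 5: (1474, None), 6: (1561, 1150), 7: (1630, 1200),
}

def apply_voltage_offset(states, offset_mv):
    """Apply a global mV offset, but only where a voltage is exposed."""
    return {s: (clk, mv + offset_mv if mv is not None else None)
            for s, (clk, mv) in states.items()}

for s, (clk, mv) in apply_voltage_offset(P_STATES, -125).items():
    print(f"P{s}: {clk} MHz @ {mv if mv is not None else 'N/A'} mV")
```

Which matches what the % slider shows: the lower states never move.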
 
Depends on whether he used the % adjustment or set the individual states; only the 6th and 7th states can be changed in Wattman. ... Just checked the % slider and the graph does not move for the lower states.


Good point. I've tried Wattman twice, and I found it flaky.
 
Next topic... OK, so I tried Superposition with HBCC on and off. Scored 200 more points on 1080p High (I wanted quick, no hassle). The only issue I have with running it is the amount of memory you can select: the minimum is 11.5 GB of my 16 GB. Of course, I can't say whether it matters or not, but I would prefer 8 GB over 12 GB. Anyway, I've been seeing a lot of posts around the web about higher performance in games, mainly in minimums, so I was curious to see how it differed.
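For reference, the segment arithmetic, assuming (as I understand it, so treat this as an assumption) that the slider's segment size counts the 8 GB of HBM2 plus pinned system RAM:

```python
# Rough arithmetic for the HBCC segment sizes mentioned above.
# Assumption: segment total = HBM2 VRAM + pinned system RAM, so the
# system-RAM cost of a segment is (segment - VRAM).

HBM2_GB = 8.0         # Vega 56/64 VRAM
SYSTEM_RAM_GB = 16.0  # this system

def pinned_system_ram(segment_gb, vram_gb=HBM2_GB):
    """System RAM reserved by an HBCC segment of the given total size."""
    return max(0.0, segment_gb - vram_gb)

for segment in (11.5, 12.0, 16.0):
    pinned = pinned_system_ram(segment)
    print(f"{segment:>4.1f} GB segment -> {pinned:.1f} GB RAM pinned, "
          f"{SYSTEM_RAM_GB - pinned:.1f} GB left for Windows and games")
```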
 
Usage was in the 85% range, so relatively normal aside from the abysmal framerates.

85% in Xfire? Have you tested with just a single card to see how it behaves?

For Xfire, 85% utilization on both GPUs is OK; for a single card it's not, unless there's a CPU bottleneck.
 
85% in Xfire? Have you tested with just a single card to see how it behaves?

For Xfire, 85% utilization on both GPUs is OK; for a single card it's not, unless there's a CPU bottleneck.
Yes, for RAGE it shows around 85% on both GPUs (even though I don't think the game supports CF and CF was disabled).

For DOOM, usage on GPU1 is pegged at 99% (GPU2 at 0%), but FPS is like 28. Horrible.
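If anyone wants to check this pattern on their own setup, log utilization and FPS with your monitoring tool of choice and look at utilization during the slow moments. A sketch, assuming a CSV export with hypothetical column names `gpu_util_pct` and `fps` (rename to match whatever your tool writes):

```python
# Sketch: average GPU utilization during low-FPS samples from a CSV log.
# Column names are hypothetical; adjust to your tool's export format.

import csv

def load_samples(path):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield float(row["gpu_util_pct"]), float(row["fps"])

def summarize(path, fps_floor=40.0):
    slow = [(u, f) for u, f in load_samples(path) if f < fps_floor]
    if not slow:
        print("No samples below the FPS floor.")
        return
    avg_util = sum(u for u, _ in slow) / len(slow)
    print(f"{len(slow)} slow samples; avg GPU utilization while slow: "
          f"{avg_util:.0f}%")
    # ~99% while slow points at the GPU (or its OpenGL driver path);
    # low utilization would point at the CPU or a sync problem instead.

summarize("gpu_log.csv")
```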
 
Yeah, Turbo seems to be the best setting for now through Wattman, at least until I get a better handle on how to overclock this beast. :D The old 2900 Pro-to-XT clocks overclock was a far sight easier, and I was able to flash those clocks right to the BIOS on my Pro cards.
 
Good point. I've tried Wattman twice, and I found it flaky.

I'm using Afterburner, which doesn't measure voltage and doesn't adjust for power states. I can tell you that adjusting the voltage made the artifacts disappear in 2D.

I lied. It appears that it shows voltage just in the main GUI, not the hardware monitor.

It seems like it's idling at 756 mV and under load hits 1000-1012 mV at -143. Adjusting the mV changes both idle and load.

Also, JustReason, I got the same 200-point boost.
 
Any way to determine what clock speeds, voltages, and memory speeds Turbo is enabling? I have already tried tweaking it manually, but it ends up slower than what Turbo ends up with, at least so far.
 
Any way to determine what clock speeds, voltages, and memory speeds Turbo is enabling? I have already tried tweaking it manually, but it ends up slower than what Turbo ends up with, at least so far.


Download the latest Afterburner; it's more reliable than Wattman. I thought Turbo was trash, personally: it pulls over 300 watts and scored less than undervolting and pushing the power limit.
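If you log a Turbo run with Afterburner's hardware monitor (or HWiNFO) and export it, the peaks tell you what Turbo actually ran at. A quick sketch, again assuming a generic CSV export with hypothetical column names:

```python
# Sketch: pull the peak core clock, memory clock, and voltage out of a
# logged Turbo run. Column names are hypothetical; match your export.

import csv

def peak_values(path, columns=("core_mhz", "mem_mhz", "core_mv")):
    peaks = {c: 0.0 for c in columns}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for c in columns:
                peaks[c] = max(peaks[c], float(row[c]))
    return peaks

print(peak_values("turbo_run.csv"))
# e.g. {'core_mhz': 1630.0, 'mem_mhz': 945.0, 'core_mv': 1200.0}
```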
 
Looking forward to trying out the Windows 10 FCU Beta drivers from AMD tonight along with enabling HBCC! :)
 
So I reapplied thermal paste in an attempt to get rid of random split-second HBM temperature spikes to 99C, and they're gone!

But now I have weird dips to 0C in HBM temp.

#wtfvega
hbmdips.jpg
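Those split-second 0C dips (like the 99C spikes before the repaste) look more like sensor glitches than real thermals. If you post-process a temperature log, a crude filter like this sketch reads past them; the thresholds are guesses, so tune to taste:

```python
# Sketch: drop physically implausible single samples from a temp log.
# Thresholds are guesses; this is generic list-filtering, not tied to
# any particular monitoring tool.

def filter_glitches(temps, lo=10.0, hi=95.0, max_jump=15.0):
    """Keep readings inside [lo, hi] that don't leap more than
    max_jump from the previous kept sample."""
    kept = []
    for t in temps:
        if not (lo <= t <= hi):
            continue                       # 0 C dips, 99 C spikes
        if kept and abs(t - kept[-1]) > max_jump:
            continue                       # lone outliers inside range
        kept.append(t)
    return kept

readings = [71, 72, 99, 73, 72, 0, 74, 73]  # spikes/dips like the graph
print(filter_glitches(readings))            # [71, 72, 73, 72, 74, 73]
```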
 
Hmmmm, very interesting, have to see what results I receive on my setup.
 
Hmmmm, very interesting, have to see what results I receive on my setup.


I think it's more related to a clean system installation than to the upgrade itself. I've tested extensively and found little in the way of performance gains, nothing bigger than ~5% across all my AMD and Nvidia cards. The big difference, however, is huge and is related to "smoothness": games feel definitively smoother and more fluid than before. Frametimes are tighter, but minimums, averages, and maximums are basically the same, well within the margin of error on my machines.
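For anyone who wants to put a number on "smoothness": capture frametimes (OCAT/PresentMon or similar) and compare percentiles rather than averages. A sketch with made-up sample data:

```python
# Sketch: averages hide stutter; percentile frametimes and 1% lows show it.

def frametime_stats(frametimes_ms):
    ft = sorted(frametimes_ms)
    n = len(ft)
    avg = sum(ft) / n
    p99 = ft[min(n - 1, int(n * 0.99))]          # 99th percentile frametime
    worst_1pct = ft[int(n * 0.99):] or [ft[-1]]  # slowest 1% of frames
    return {"avg_fps": round(1000.0 / avg, 1),
            "p99_frametime_ms": p99,
            "1pct_low_fps": round(1000.0 / (sum(worst_1pct) / len(worst_1pct)), 1)}

# Two runs with a similar ~60 FPS average; only the second one stutters.
smooth = [16.7] * 99 + [18.0]
stutter = [15.0] * 99 + [120.0]
print(frametime_stats(smooth))   # tight p99, healthy 1% lows
print(frametime_stats(stutter))  # similar average, ugly 1% lows
```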
 
I think it's more related to a clean system installation than to the upgrade itself. I've tested extensively and found little in the way of performance gains, nothing bigger than ~5% across all my AMD and Nvidia cards. The big difference, however, is huge and is related to "smoothness": games feel definitively smoother and more fluid than before. Frametimes are tighter, but minimums, averages, and maximums are basically the same, well within the margin of error on my machines.

Well, I guess it depends on the system, because even Nvidia hardware has seen a boost, at least according to some of the commenters on this video. :) From what I understand, Game Mode is a real help now, which is not as surprising as some would think, but I have not tried any games yet; I am still just doing the tweaking and overclocking for now. (I have tried games, but I mean I have done no testing yet with Game Mode on or off.) I have to say I seriously doubt a clean install alone is going to give a 20% boost, and I cannot see Joker having made any false claims. It probably varies from system to system, though, and I am looking forward to seeing the results on my all-AMD system.

On a different note: I simply set my fan to 4000 RPM max, 80C max temp, and the +50 power limit, and it boosted my 3DMark 11 graphics score by about 1500 points over the Turbo setting. I have not yet overclocked the memory, changed the GPU clock speeds, or undervolted, but it already gives me 50% greater performance over the single R9 Fury I was using before.
 
So... nice surprise today. The card showed up with some other parts. Naturally, the entire afternoon was lost to driver uninstalls, restoring clean Windows images, etc. After tinkering with the card for the better part of 7 hours... it's going back tomorrow. My card is *ludicrously* sensitive to temp. Crossing 65C causes the core to throttle down into the low 1400s/high 1300s, and some weird HBM bug still seems to be present where, if you OC the HBM and it thermally throttles, it won't ever recover to full speed; it just sets itself to 800 MHz and stays there. A cold boot is required to "unstick" it. Tried the 17.9.3 and 17.10.1 drivers.

Yes, I can set the fan to 4100 RPM and a +50% power envelope and let it run fairly well (the HBM at least doesn't tank, and clocks generally stay around 1500+), but I can hear it from the other end of the house, and I'm afraid that if I actually close the case, the whole damn PC will melt into a pile of slag. Even with good noise-cancelling, over-the-ear headphones, it's not viable due to the noise.

Maybe I've become spoiled by my Fury X and an all-liquid, solid-state PC, but this level of noise is so... no. Just, no.
 
Try setting a fan curve in Afterburner. The fan is definitely loud, but with headphones I don't notice it, and the extra performance is worth it.
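A fan curve is just a temperature-to-RPM mapping that Afterburner lets you draw graphically. Purely for illustration, here's the same idea as code, with made-up breakpoints:

```python
# Sketch of a custom fan curve: linear interpolation between breakpoints,
# clamped at both ends. Breakpoints are invented; tune for noise vs. temps.

CURVE = [(40, 1500), (55, 2200), (65, 3000), (75, 4000)]  # (degC, RPM)

def fan_rpm(temp_c):
    """Interpolate RPM between curve points, clamped at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, r0), (t1, r1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return int(r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0))

for t in (35, 60, 70, 80):
    print(f"{t} C -> {fan_rpm(t)} RPM")
```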
 
So... nice surprise today. The card showed up with some other parts. ... My card is *ludicrously* sensitive to temp. ... Even with good noise-cancelling, over-the-ear headphones, it's not viable due to the noise.

Sorry about the poor experience you're having. It must be card-dependent, though, because I set my card to max out at 4000 RPM, set the power limit to +50, and ran some benchmarks and a game. I did not notice the noise at all, and the computer is only a couple of feet away from me, to my left on the floor. However, I tend to notice things like clicking noises more than any other kind of noise.
 
He needs to learn how to use Wattman; the Custom setting is there for a reason. My XFX Vega 56 does 950 HBM all day long, playing or mining. I don't think the difference between brands is this big. How big is the case, and how is your airflow? These cards will run hot, but one can manage the temps with airflow and case size.
I have yet to BIOS-flash this thing to a 64. This card's main job is mining. The 64 that is also mining is not much hotter. Both cards are set to about 3000 RPM, and the noise is not something I hear over the great number of other cards in the same room. Both the 56 and the 64 are running in the low 60s on the core and the low 70s on the HBM. I also own two Fury Xs, and those are silent to the point where I can't hear them at all. I have two more 64 Liquids on the way; I am sure they are just as quiet as the Furies. The reference cards have always been a bitch in terms of noise once you cross a certain fan RPM.
 
Anyone have issues with some games and clock speeds? In BF1 and Titanfall I notice the card constantly downclocking to like 700-800 MHz core and 167 MHz memory. Chill and targeted framerates are disabled. In DOOM it maintains ~1530/1070 without issue.
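One way to tell brief power-management dips from constant downclocking like this is to scan a logged clock trace for sustained stretches below a floor. A sketch with made-up sample data:

```python
# Sketch: find sustained sub-floor runs in a logged series of core clocks.
# Sample data and thresholds are invented for illustration.

def downclock_stretches(clocks_mhz, floor=1000, min_len=5):
    """Yield (start_index, length) of runs staying below `floor` for at
    least `min_len` consecutive samples."""
    start = None
    for i, mhz in enumerate(clocks_mhz + [float("inf")]):  # sentinel ends run
        if mhz < floor and start is None:
            start = i
        elif mhz >= floor and start is not None:
            if i - start >= min_len:
                yield start, i - start
            start = None

log = [1530] * 10 + [750] * 20 + [1530] * 5 + [800] * 3 + [1530] * 10
for start, length in downclock_stretches(log):
    print(f"Sustained downclock at sample {start}, {length} samples long")
```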
 