RX Vega Owners Thread

I finally installed the card tonight after benchmarking my old 290X's in Crossfire to see what I was getting myself into. Plus, I wanted to share my results with an ultrawide, as there aren't many posted results for those of us who own one. This should help anyone who's curious what Vega 64 can do with a few titles at the moment. Sorry for the crude chart, but it should give an idea of the frame rates and power draw I got from the wall.

Benchmark_zpslmw5xto9.jpg


What have I learned from this? Well, it looks like it's true that balanced mode is the best setting to use for the most part. And for the record, I did not fuss with power limits or anything like that. I kept everything at the default settings to show the performance without any tweaks. These aren't the best drivers; they're not called beta for nothing. I can see they concentrated on a few games, while others really need work, such as Tomb Raider.

This card is in dire need of a water block. Heat wasn't that bad really (it reached a max temp of 82C), but I'd rather keep the clock speed at the ceiling for the setting I chose, not bouncing all over the place. As expected, the power draw is not spectacular. AMD has never been great in this regard since power became part of the GPU discussion some time ago. It was nice when it was all about brute force and who gives a shit how much power it sucks. Since the 290X was another power-hungry GPU, it's at least good to see Vega 64 not exceeding the wall draw of the 290X's in Crossfire. Speaking of Crossfire, damn, I wish recent games would support the option. Looking at the graph, I could have kept going with my almost-four-year-old cards.

Now, I did come across coil whine. I can happily say it only happened during Doom's loading intro screen, as it was running at 5400 fps. Besides that, I never experienced coil whine at any other time, and it was not even close to what the 290X's had. Fan noise is a big improvement for me compared to the 290X OEM fan; it wasn't a distraction at all. Thank god I watercooled the 290X's soon after installing them. Even though Vega's fan doesn't bother me, I still want it watercooled nonetheless.

So am I happy with the purchase? Yes and no. The plus is that I can game with one card whose average and minimum frame rates stay in the LG's FreeSync range, with everything cranked to boot. PUBG looks so good on high, and it's now thankfully playable instead of all low settings. That's why I really wanted to upgrade: to stay within the monitor's FreeSync range, even after modding it to a range of 35-75 instead of 55-75 without flicker. Tomb Raider has a deceivingly low minimum on record, as there is a one-second glitch that freezes during benchmarking, which of course gets recorded. If I had recorded a different score from the three benchmark scenes it would have been better, but it seems like that scene is the norm for testing. With that said, oh boy did I miss FreeSync during the testing. Watching the benchmarks in between watching the Kill A Watt meter, I couldn't stand the tearing and perceived stuttering. I can't live without FreeSync. I've even debated upgrading from the 75Hz LG to one of those Korean 100Hz panels. It looks like Vega can use some of the frames above 75Hz, but not by much in some games. Hopefully that improves soon.
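On the 35-75 range mod: worth noting that even the widened range is too narrow for AMD's Low Framerate Compensation, which (per AMD's guidance) needs the max refresh to be at least roughly 2.5x the min. A quick check, using the ranges mentioned above (the 2.5 ratio is AMD's published rule of thumb, not something I measured):

```python
# Rough check of whether a FreeSync range supports LFC (Low Framerate
# Compensation), which needs max_hz / min_hz >= ~2.5 per AMD's guidance.
def lfc_capable(min_hz, max_hz, ratio=2.5):
    return max_hz / min_hz >= ratio

for rng in [(55, 75), (35, 75)]:
    print(rng, "LFC:", lfc_capable(*rng))
# Neither range qualifies, so below the floor the panel falls out of sync.
```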

The no part of the equation is that I wish Crossfire were properly supported. With FreeSync and the benchmark results, I can clearly see two 290X's can beat a single Vega 64. That saddens me. It's great that I get better performance with a single Vega, but come on! I really thought Vega would beat a crossfired 290X setup, especially when I paid the same for a pair of 290X's as for just one Vega. I guess that's a no for the moment with an ultrawide. The other downside is no DVI for my Korean 1440p secondary monitor. Sure, I don't overclock it to 120Hz anymore, but now it sits dark on the wall above my ultrawide monitor. Guess I'll have to hunt down a dual-link DVI to HDMI cable.

It's a damn shame that more people can't own one, with all the BS going on with price gouging and miners taking most of the limited stock again. Sure, Vega 64 gets mocked for not beating a 1080 Ti and for trading blows with a 1080 at higher power usage. But we should all at least have the chance to buy one and discuss what we bought, instead of complaining about availability and rising prices.

Now just waiting for a waterblock and better drivers.
 
Thanks for posting - I've also got the 34" LG ultrawide and was interested in upgrading from my GTX 780. I've never had the pleasure of using FreeSync/G-Sync. However, I don't play shooters, just games like Civilization, X-Com, StarCraft 2, and Heroes of the Storm, although I do occasionally get sucked into an FPS.

Just a question - why did you have FreeSync disabled for the benchmarks?
 
I haven't done any games for benchmarking yet, just Heaven, Superposition, and various 3DMarks. But with my WC Vega 64, I am definitely getting 2x the frame rate of my 290. The 290 @ 1100/1300 (it could do higher, but I prefer not messing with voltage) ran everything great, but this Vega @ 1660-1750 is definitely an upgrade. Once I get it back in my air-conditioned case and crank up some fans, I will seriously look at how it games and at the clocks. Fallout 4 is first.
 
Great work. Just two tips for the benchmark graph/image: always start graphs at zero, and always save text-heavy images as PNG :)

I'm really looking forward to my move from an RX 470 to Vega on my C32HG70 at 1440p. NCIX has my money, so I'll just twiddle my thumbs until they get hold of a card to send me.



If I've already paid, is that close enough, Kyle? I can hold off on any more posts if not...
That is fine. I just need to put an end to the folks coming in here trolling for trolling's sake.
 
Thanks for posting - I've also got the 34" LG ultrawide and was interested in upgrading from my GTX 780. I've never had the pleasure of using FreeSync/G-Sync. However, I don't play shooters, just games like Civilization, X-Com, StarCraft 2, and Heroes of the Storm, although I do occasionally get sucked into an FPS.

Just a question - why did you have FreeSync disabled for the benchmarks?

I was going to, but then I observed that benchmarking with FreeSync on skewed the numbers and made everything stutter for some reason. Gaming was fine; benchmarks, no, I guess. There could have been more to it. More than likely, because I didn't use Vsync or a frame rate cap to stay within the FreeSync range, it just wasn't working right. I also wanted FreeSync disabled so I could see the raw performance of the card, to decide whether to upgrade to a Korean 100Hz FreeSync monitor. I would love the Samsung CF791, but the flicker issue keeps me away. And yes, I do realise some have issues with flicker on the Korean ones too.

Great work. Just two tips for the benchmark graph/image: always start graphs at zero, and always save text-heavy images as PNG :)

I'm really looking forward to my move from an RX 470 to Vega on my C32HG70 at 1440p. NCIX has my money, so I'll just twiddle my thumbs until they get hold of a card to send me.

Thanks for the tips! I was just playing around, and I wanted something different rather than just putting the numbers into text. I'm certainly no reviewer like Kyle with his fancy graphs :)
I too had an order with NCIX for the liquid, but as luck would have it, my local store had some in stock and I wasn't waiting any longer. Now I have to wait for a waterblock.


I'm picking mine up tomorrow! Hopefully my poor little 600W psu can handle it:rolleyes:

Congrats on getting one. Which one did you get? To be honest, it will be close with 600W, as you can see from the power-from-the-wall numbers on my graph. As long as your PSU is highly efficient, you'll be OK. I just wouldn't overclock it, though.


Oh and thanks Kyle for keeping the trolls out. They can hang out at Videocardz or WCCF for that crap.
 
Congrats on getting one. Which one did you get? To be honest, it will be close with 600W, as you can see from the power-from-the-wall numbers on my graph. As long as your PSU is highly efficient, you'll be OK. I just wouldn't overclock it, though.
It's a Sapphire air-cooled card. I'm only pairing it with a puny 65W i7.:oops: I guess I'll be alright.:p
 
I've got a Black Gigabyte Vega 64 I'm willing to let go of here in Canada if anyone is looking, Brand New in the box
 
Mine should be delivered today. It was nice to see it hitting 43MH/s mining as well; I can offset the cost of the card, whoop whoop. Hopefully some new driver updates come, but at least I can drive Doom at 4K on my 43" 4K monitor, lol. Dying to actually play it now that I'm done with hockey for the summer.

Cost wise:
Assuming Memory Express includes my two free games, which I'm going to re-gift as Christmas presents for the kiddo, the card ends up costing me about $710 CAD (was $839). That isn't far off the $580 for the 1070s I have, for about 12MH/s more (the 1070 I estimate at 150W @ 30MH/s = $57, and Vega, after the in-depth test that one guy did, pulls 260W for about 42MH/s and can go higher).


Mining:
It can make an extra $20 USD per month, which is not too bad considering how much it whooped the 1070, and it performs similar to a 1080. So really it's the best of both worlds for me, since I tend to have it mining more often than I game, and electricity is cheap here at 3-5c, along with being a tax write-off if I'm making capital gains :D
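For anyone wanting to rerun the payback math with their own numbers, here's a rough sketch. The hash rates, wattages, and 3-5c power are the estimates quoted in this post, and the payout-per-MH figure is a placeholder you'd fill in from whatever the network is paying at the time:

```python
# Rough monthly mining profit using the figures quoted above.
def monthly_profit(hashrate_mh, power_w, usd_per_mh_day, cents_per_kwh):
    revenue = hashrate_mh * usd_per_mh_day * 30          # 30-day revenue
    power_cost = power_w / 1000 * 24 * 30 * cents_per_kwh / 100
    return revenue - power_cost

# Vega 64 vs. GTX 1070 per the estimates in the post; $0.02/MH/day is a
# made-up placeholder rate, not a real quote.
print("Vega 64:", round(monthly_profit(42, 260, 0.02, 4), 2))
print("GTX 1070:", round(monthly_profit(30, 150, 0.02, 4), 2))
```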
 
I just ordered 2 plain Vega 64s.

Reasons:
  1. itch
  2. PLP support
  3. finally with HDMI for 4K TV
  4. gaming
  5. mining when not gaming.
Actually thinking of water cooling them. I've never done WC, so this might give me a reason to try it out.
 
Mine should be delivered today. It was nice to see it hitting 43MH/s mining as well; I can offset the cost of the card, whoop whoop. Hopefully some new driver updates come, but at least I can drive Doom at 4K on my 43" 4K monitor, lol. Dying to actually play it now that I'm done with hockey for the summer.

Cost wise:
Assuming Memory Express includes my two free games, which I'm going to re-gift as Christmas presents for the kiddo, the card ends up costing me about $710 CAD (was $839). That isn't far off the $580 for the 1070s I have, for about 12MH/s more (the 1070 I estimate at 150W @ 30MH/s = $57, and Vega, after the in-depth test that one guy did, pulls 260W for about 42MH/s and can go higher).


Mining:
It can make an extra $20 USD per month, which is not too bad considering how much it whooped the 1070, and it performs similar to a 1080. So really it's the best of both worlds for me, since I tend to have it mining more often than I game, and electricity is cheap here at 3-5c, along with being a tax write-off if I'm making capital gains :D
Where did you see that article? I've seen more like 34MH/s. I haven't opened my standard 64 because I'm holding out hope for a watercooled edition (dwindling hope). But if the mining numbers are up at 43MH/s, I can more clearly see opening the one I have. My cousin tried the beta mining drivers on his with a nice hash rate but experienced crashes and such pretty frequently. Hoping those improve shortly.
 
Where did you see that article? I've seen more like 34MH/s. I haven't opened my standard 64 because I'm holding out hope for a watercooled edition (dwindling hope). But if the mining numbers are up at 43MH/s, I can more clearly see opening the one I have. My cousin tried the beta mining drivers on his with a nice hash rate but experienced crashes and such pretty frequently. Hoping those improve shortly.

https://steemit.com/ethereum/@bitsb...ltiple-cryptocurrencies-including-dual-mining

My power figures were slightly off: 290W-ish at 40MH/s. But I have a feeling that, in time, that number will either gain a good bit or the power for that performance will drop once we can change the BIOS. He wasn't able to undervolt, and I don't think he could change the GPU core either. Curious if the standard 1200MHz on Polaris would give the same hash rate with the 1100 memory clock.
 
nowinstock.net helped, haha. The watercooled one is up now for $800 ... https://www.newegg.com/Product/Product.aspx?Item=N82E16814202299&ignorebbr=1

They sold out nearly immediately. I got the text notification while I was driving, immediately pulled over, and bought it on my phone.

I got absolutely royally screwed at $800, but I wanted this card. I'd had my heart set on it ever since the silver limited edition was announced. DARN YOU AMD, you got me. I hate you!

I was voluntarily locked into the FreeSync realm, though, with my three 32" HP Omen monitors, and since I really like the displays and love FreeSync, going another route on the monitors would have cost me a lot more in the end. Having had Fury X's in Crossfire and been happy with the performance, I felt this card should suffice performance-wise combined with FreeSync.

DARN YOU AMD for raising the price and making this card nigh impossible to get!

One bittersweet score, to be sure.
 

Attachment: darn AMD.JPG
Does anyone here have a Vega FE and a 10-bit monitor? Can someone please test whether you have access to 10-bit OpenGL color buffers in the Adobe suite?

I thought it did so I pre-ordered one from B&H but I'm hearing different feedback.

Best case scenario would have been the Titan Xp since I also game on the side and miscellaneous CUDA benefits but Nvidia deliberately disabled this feature on their consumer cards (surprise surprise, Nvidia fuckery strikes again!).

Thinking of waiting a few more weeks since the WX 9100 / SSG release is right around the corner... getting impatient, but I'm okay with staying on my 7970s a bit longer until Navi/Volta.
 
Checking in. I wasn't planning on buying one of these, but I snagged a liquid-cooled version. I've had a FreeSync monitor for the better part of a couple of years that I used to pair with two Fury Nanos, and I really wanted to see what a single GPU would be like. Crossfire sucks. So does SLI. Let's get it right. Single-GPU FreeSync is actually pretty nice if you prefer games like Witcher or Tomb Raider or insert your favorite third-person RPG/action title. I'm using a Wasabi Mango UHD400, and at the frame limit of 59 FPS in Crimson settings without Vsync on, it's a joy. Very smooth. In something like Counter-Strike? It's decent, but let's face it: 4K60 is never going to be the best for twitch games. You really do need 120-144Hz for that.

I've been testing the overclock for a while here. Memory clocks seem to matter much more. I got a +600 point difference from a 100MHz memory overclock in Superposition, from 6100 to 6700 at 4K Optimized. At MSRP, that wouldn't be half bad. I think this card should have been released like the Fury X: all liquid and no packs.
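Out of curiosity, that memory overclock scales almost linearly. A quick sketch, where the score numbers are the ones above but the 945MHz stock HBM2 clock is my assumption, not from the post:

```python
# Scaling efficiency of a memory overclock: relative score gain divided by
# relative clock gain (1.0 would be perfectly linear scaling).
def scaling(base_score, oc_score, base_clk, oc_clk):
    score_gain = oc_score / base_score - 1
    clk_gain = oc_clk / base_clk - 1
    return score_gain / clk_gain

# 6100 -> 6700 in Superposition 4K Optimized from an assumed 945 -> 1045 MHz bump.
print(f"{scaling(6100, 6700, 945, 1045):.2f}x scaling")
```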
 
They sold out nearly immediately. I got the text notification while I was driving, immediately pulled over, and bought it on my phone.

I got absolutely royally screwed at $800, but I wanted this card. I'd had my heart set on it ever since the silver limited edition was announced. DARN YOU AMD, you got me. I hate you!

I was voluntarily locked into the FreeSync realm, though, with my three 32" HP Omen monitors, and since I really like the displays and love FreeSync, going another route on the monitors would have cost me a lot more in the end. Having had Fury X's in Crossfire and been happy with the performance, I felt this card should suffice performance-wise combined with FreeSync.

DARN YOU AMD for raising the price and making this card nigh impossible to get!

One bittersweet score, to be sure.

I was cleaning my kitchen and got the email a few minutes too late, lol. I saw that price and was still fine with my NCIX pre-order hanging out in the void.
 
They sold out nearly immediately. I got the text notification while I was driving, immediately pulled over, and bought it on my phone.

I got absolutely royally screwed at $800, but I wanted this card. I'd had my heart set on it ever since the silver limited edition was announced. DARN YOU AMD, you got me. I hate you!
Congrats on your purchase!
Let's think of it this way if it makes you feel better: you're just paying as much as the rest of the world outside the US has been paying all along. Heck, it's still cheaper than in many regions (my local price is around USD 875).
It's not a jack-up, just a return to full price from the usual US discount :p
 
For everyone that owns one: I just got mine today. Contemplating whether to keep it or send it back.

Are you guys happy with your purchase? I have an HP Omen 32". Currently running an HD 7870.

I ended up with the XFX black pack ($599 with two games).
 
For everyone that owns one: I just got mine today. Contemplating whether to keep it or send it back.

Are you guys happy with your purchase? I have an HP Omen 32". Currently running an HD 7870.

I ended up with the XFX black pack ($599 with two games).
If you haven't tried FreeSync, IMO it's a game changer, and the Omen has an excellent implementation of it (my experience with three Omens and Fury X cards). I vote keep it. Also, the price just went up $100 across the board, so compared to some you got a "deal". The air-cooled is now $700.

Personally speaking, I very much preferred the Fury X and FreeSync at 75Hz to my current 1080 Ti and 60Hz with Vsync on my Omen. As in VERY much! As in enough to overpay for an inferior card to get FreeSync back.

So my vote is definitely keep. I'm replacing my current 1080 Ti SLI with Vega to get my FreeSync back on the Omen.

Make sure to set frame rate target control to 72 or 73 FPS, turn off Vsync, and turn on FreeSync. A fantastic experience!
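On capping at 72-73 instead of 75: the point is to leave frame-time headroom so delivery never spills past the panel's max refresh, where Vsync or tearing would take over. A quick illustration for a 75Hz panel (the numbers are just 1000/fps, nothing measured):

```python
# Frame time at a given FPS cap, and the headroom it leaves versus the
# panel's max-refresh frame time (75 Hz -> ~13.33 ms).
def frame_time_ms(fps):
    return 1000.0 / fps

for cap in (75, 73, 72):
    headroom = frame_time_ms(cap) - frame_time_ms(75)
    print(f"{cap} fps -> {frame_time_ms(cap):.2f} ms "
          f"(headroom vs 75Hz: {headroom:.2f} ms)")
```

Capping at exactly 75 leaves zero margin, so any frame-time jitter pushes you out of the VRR window; 72-73 buys roughly half a millisecond of slack.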
 
Are you guys happy with your purchase? I have an HP Omen 32". Currently running an HD 7870.
Coming from a 7870, that will be an awesome upgrade. That HP monitor is just about perfect for Vega (1440p FreeSync). I think you'll be happy.

For my situation in particular, I'm running 4K on this machine, so Vega falls a little short of max settings at 60 FPS. It's close though, around 45-50 in many games (aside from DOOM, of course, the best-optimized game of all time). At 1440p you should have no problem staying in the FreeSync range.
 
ontariotl, I'm in a similar boat. I had RX 480 CF in this machine before, and with proper CF support the performance was better than one Vega. For example, 60 FPS in GTA V at 4K maxed vs. 50 FPS on Vega.

However, in the many games that don't support CF, it was a big win (like in Dishonored and Borderlands).

I am hoping AMD can release Crossfire support soon, but I have a few older games to finish that I can play while I wait.
 
I'm picking mine up tomorrow! Hopefully my poor little 600W psu can handle it:rolleyes:
I think you will be cutting it really close with 600W. All the specs I see recommend 750W. Adored's video showed usage around 470W, so I guess it would depend on the quality of the PSU whether it could handle it.
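A back-of-the-envelope way to sanity-check a PSU against wall readings: the meter measures AC draw including conversion loss, so the DC load the PSU actually delivers is lower. A sketch, where the 0.9 efficiency is just my guess for a decent Gold-rated unit, and 470W is the wall figure from Adored's video mentioned above:

```python
# Convert a wall-meter (AC) reading to an approximate DC-side load, since
# the PSU's label rating refers to DC output, not wall draw.
def dc_load(wall_watts, efficiency=0.9):
    return wall_watts * efficiency

psu_rating = 600
wall_draw = 470  # peak wall draw reported in Adored's video, per the post
load = dc_load(wall_draw)
print(f"~{load:.0f} W DC load on a {psu_rating} W PSU "
      f"({load / psu_rating:.0%} of rating)")
```

Around 70% of rating at stock is workable on a quality unit, but transient spikes and any overclock eat that margin fast, which matches the "cutting it close" advice.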
 
That may be. But it would be out of character for Newegg to do so. And if you look at Newegg, all the RX Vegas are now significantly higher priced.

Newegg for example, still sells the Biostar TB250BTC pro for $150 MSRP when everyone else is selling it for $180-$250. (Popular 12 GPU mining board)
I am sure you know much more about channel sales I do, please educate me further.
 
That may be. But it would be out of character for Newegg to do so. And if you look at Newegg, all the RX Vegas are now significantly higher priced. Nearly $700 for any RX Vega.

The exact same thing happened back at the end of 2013 when the 290X came out, shortly after the big Bitcoin gold rush, when R9 290s and 290X's were being swallowed up. Luckily I bought my two 290s for $449 (CDN) each, but soon after, when the cards were sought after, prices jumped to nearly a grand. That includes Newegg. History repeats itself with crypto mining again, and Newegg isn't innocent; it's certainly not out of character for them to pull what every other retailer is pulling.
 
I think you will be cutting it really close with 600W. All the specs I see recommend 750W. Adored's video showed usage around 470W, so I guess it would depend on the quality of the PSU whether it could handle it.
ontariotl above was using a 140W i7-5960X with a Vega, and they drew around 600W in turbo mode.
I am using a 65W i7-6700 with mine... I guess it will be fine? o_O 32GB RAM, 2 drives, nothing fancy.

Holy shit, I was just testing mine and suddenly my system shut down by itself. The motherboard reported there was a power surge. Could my PSU be too weak to handle my system? It's a Silverstone SX600-G. I wish I had a Kill A Watt around.
 
ontariotl above was using a 140W i7-5960X with a Vega, and they drew around 600W in turbo mode.
I am using a 65W i7-6700 with mine... I guess it will be fine? o_O 32GB RAM, 2 drives, nothing fancy.

Holy shit, I was just testing mine and suddenly my system shut down by itself. The motherboard reported there was a power surge. Could my PSU be too weak to handle my system? It's a Silverstone SX600-G. I wish I had a Kill A Watt around.

I wouldn't use turbo or balanced on certain games. I'm finding a few that spike more watts than others. I'm just updating my chart, as I'm finding some interesting results with some overclocking tests.

Which I have to ask: what were you testing when it shut down?
 
I wouldn't use turbo or balanced on certain games. I'm finding a few that spike more watts than others. I'm just updating my chart, as I'm finding some interesting results with some overclocking tests.

Which I have to ask: what were you testing when it shut down?
I was loading a map... in bloody CS:GO :eek:
I hadn't even gotten into the game yet. But after the reboot, I started CS:GO and it seemed fine.
The first game I launched was Dishonored 2. It was OK, but I only test-launched it for a few minutes. I think I'm going to put it in turbo and complete a whole mission to see whether it's stable.

edit: It happened again, in Euro Truck Simulator. CS:GO and ETS2 are not exactly games that will push the card to its limit, but that's what happened... Looks like I may have to get a new PSU :cry::confused:

edit2: On further reading, it could be the Asus anti-surge mechanism being too sensitive. I'll try disabling it first.

edit3: Nope, it's still happening. :cry:
 
btw, no one seems to have uploaded any photos of the Sapphire box, so here it is.
I thought the card had a plastic shroud, so I'm surprised it's all metal. It looks so much better than the limited edition! (personal opinion)

IMG_20170819_134241703.jpg
 
Holy shit, I was just testing mine and suddenly my system shut down by itself.
This happened to my machine the other night. I did install a second Vega card, but Crossfire doesn't work and I was just watching a YouTube video, so it was a fairly light load. This is with a 1000W PSU, so it should be enough.
 