AMD has really put me in a bind with Vega...

I would wait until performance numbers are out to make a final decision. The Vega 56 would be your best bet if your power budget is 400 W total (500 W x 80% sustained load).
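A rough back-of-the-envelope sketch of that budget math (the component wattages below are assumed placeholders, not measurements; 210 W is Vega 56's stated typical board power):

# Rough PSU headroom check -- every figure here is an assumed estimate, not a measurement.
PSU_WATTS = 500            # rated output of the PSU
SUSTAINED_LOAD = 0.80      # don't plan to run the PSU past ~80% of its rating
REST_OF_SYSTEM = 150       # CPU, board, drives, fans (rough guess)
GPU_BOARD_POWER = 210      # e.g. Vega 56's stated typical board power

budget = PSU_WATTS * SUSTAINED_LOAD                      # 400 W usable
headroom = budget - (REST_OF_SYSTEM + GPU_BOARD_POWER)   # 40 W left in this example
print(f"usable budget: {budget:.0f} W, headroom left: {headroom:.0f} W")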
 
This is exactly the case: in "Ultimate mode" I noticed annoying flickering in almost all 3D games, at least the ones I consistently play. It's not so much that it "may" flicker; it's more like it "probably" will.

I just didn't see it as a big enough reason to return it, as the monitor itself is gorgeous and 100 Hz is still great for this type of monitor.

You can see it in this video:


We won't be able to see what you are seeing because your camera isn't capturing at the same rate the screen is displaying.

If there was flickering, I would attribute it to the LED backlight strobe effect, which is designed to reduce blur. See if you can turn it off.
 
We won't be able to see what you are seeing because your camera isn't capturing at the same rate the screen is displaying.

If there was flickering, I would attribute it to the LED backlight strobe effect, which is designed to reduce blur. See if you can turn it off.

It's not my video, just one from someone else I found on YT who has the same issue. If you actually watch it, you can definitely see the exact type of flickering that's happening, though. It's only really noticeable in the shaded/darker regions of the screen, not the entire screen...
 
This is exactly the case: in "Ultimate mode" I noticed annoying flickering in almost all 3D games, at least the ones I consistently play. It's not so much that it "may" flicker; it's more like it "probably" will.

I just didn't see it as a big enough reason to return it, as the monitor itself is gorgeous and 100 Hz is still great for this type of monitor.

You can see it in this video:


That's actually really f-ing noticeable on the buildings.
 
That's actually really f-ing noticeable on the buildings.

Again, did you guys turn off the motion blur? When you capture this way, it will show up like this.

Have you ever filmed an NTSC signal on an old progressive-scan CRT and seen a moving scan line dividing the screen? It's because the camera and the display are not synced. Hence this video is not credible.
 
At 500 watts, I would be wary of using Vega with that power supply. I'm using a 620 W Platinum EVGA with a 1080 Ti, pretty much the same setup as you, just with 64 GB of RAM.

You might wanna be wary too; 1080 Ti OCs are over 300 W and peak up to 365 W, and at stock they are close to 300 W.
 
Again, did you guys turn off the motion blur? When you capture this way, it will show up like this.

Have you ever filmed an NTSC signal on an old progressive-scan CRT and seen a moving scan line dividing the screen? It's because the camera and the display are not synced. Hence this video is not credible.

I may be a noob here, but come on, guy. Are you even reading my posts, and have you watched the video?? Please do both, and if you still have a problem with it, let me know.

However, I will say the issue in the video is shared across all games (that I have tested) and the AMD FreeSync demo app. It's crazy obvious, and it has nothing to do with the way it's being captured with a video camera in the posted video, which again is not mine, nor did I dictate how it was made. The flickering shown is literally exactly what it looks like with my own eyes on my monitor.

In conclusion, I am still stuck at 80-100 Hz.
 
You might wanna be wary too; 1080 Ti OCs are over 300 W and peak up to 365 W, and at stock they are close to 300 W.


I don't overclock my comps, only my mining rigs, man (overclock and undervolt). And if I wanted to overclock my 1080 Ti, I have around 100 watts free for that; my comp only uses 400 watts full tilt right now.

When I build systems I always try to keep the wattage around that 85% range of the PSU, but the lowest-wattage Platinum power supply from a reputable company was the 620 W EVGA, so I got that.
 
You might wanna be wary too; 1080 Ti OCs are over 300 W and peak up to 365 W, and at stock they are close to 300 W.

Not really.

power_average.png

power_peak.png

 
So to you, factory overclocked over reference is not overclocked? So what do you consider overclocked?
He's referring to a balls-to-the-wall OC from a user. Factory OCs tend to extend the boost range with a slight bump to the base clock, and definitely don't raise voltage as much as a user would. Most users make the boost clock their new base, hence the far higher power usage compared to just factory-OCed cards.
 
He's referring to a balls-to-the-wall OC from a user. Factory OCs tend to extend the boost range with a slight bump to the base clock, and definitely don't raise voltage as much as a user would. Most users make the boost clock their new base, hence the far higher power usage compared to just factory-OCed cards.

So it's still an apples-to-oranges comparison? Not to mention worlds apart in performance. How much would an OCed Vega use? Assuming it can even OC any useful amount to begin with on a custom card. 400, 500 W?
 
So it's still an apples-to-oranges comparison? Not to mention worlds apart in performance. How much would an OCed Vega use? Assuming it can even OC any useful amount to begin with on a custom card. 400, 500 W?
That wasn't the point. People crapping their pants over 290-350 W power usage with an OCed 1080 Ti are just being obtuse and flaming for the fun of it, case in point.
 
That wasn't the point. People crapping their pants over 290-350 W power usage with an OCed 1080 Ti are just being obtuse and flaming for the fun of it, case in point.

The difference is that one part delivers performance and the other doesn't. A 350 W GP102 is what, close to twice as fast as Vega?

A 350 W Vega is about equal to a 180 W GP104 and soon a 120 W GV106.
 
The difference is that one part delivers performance and the other doesn't. A 350 W GP102 is what, close to twice as fast as Vega?

A 350 W Vega is about equal to a 180 W GP104 and soon a 120 W GV106.
Again, NO. They weren't bitching about performance per watt, just the wattage and how much heat it puts out. So again you are arguing the wrong point and failing at comprehension. For someone to argue that 300 W uses too much power and would heat their tiny room up too much, and then own a 1080 Ti they OC to the max, is disingenuous, deliberately misleading, and trolling.
 
Again, NO. They weren't bitching about performance per watt, just the wattage and how much heat it puts out. So again you are arguing the wrong point and failing at comprehension. For someone to argue that 300 W uses too much power and would heat their tiny room up too much, and then own a 1080 Ti they OC to the max, is disingenuous, deliberately misleading, and trolling.

razor1 didn't OC his 1080 Ti, if that's what you think.
 
MSI: 1890 MHz
ASUS: 1888 MHz
Founders: 1777 MHz

Average clocks during gaming, according to TechPowerUp.

I wouldn't really call that much of an OC for Pascal. It's going over 300 W if you push it, for sure.

Only one of those three goes over 300 W at peak, hitting 305 W while averaging 282 W, so its average is still below 300 W. For the FE, the peak is 267 W with an average of 231 W.

At 350 W like Vega, you would hit what, 2 GHz on GP102?
 
The difference is that one part delivers performance and the other doesn't. A 350 W GP102 is what, close to twice as fast as Vega?

A 350 W Vega is about equal to a 180 W GP104 and soon a 120 W GV106.

And those arguing with you also ignore that the data so far puts Vega more in line with the GTX 1080, not the 1080 Ti; the point being they are taking a GPU that is probably notably faster than Vega while ignoring its actual closest competitor, the 1080 (especially custom AIB models rather than the reference FE).
The point is that the power demand of the GTX 1080 Ti should be ignored in this context and not used by those raising it.
A maxed-out OC for the 1080 puts demand around 210 W (per those who can measure and isolate accurately), which would require Vega to also be pushed, meaning beyond its official and already highish TDP.
Cheers
 
There is a Silverstone SFX 600 W PSU that the OP could use if he doesn't want to cut it close. It's the one in my system. I'm running a GTX 1080 and it seems to be getting plenty of power. It does pump out a significant amount of heat, though.
 
There is a Silverstone SFX 600 W PSU that the OP could use if he doesn't want to cut it close. It's the one in my system. I'm running a GTX 1080 and it seems to be getting plenty of power. It does pump out a significant amount of heat, though.

My current sys under load (sans video card, obv) should be around ~160 W, so...

Honestly, I don't think I would even need to upgrade my PSU for a GTX 1080, whereas I would for Vega.
 
A Nano might be perfect for you. We will have to see where all this ends up on perf, price, power, and availability too, because I suspect things might require some waiting.

He can just as well keep his 480 then.
 
You make a lot of assumptions with no information, you know that? Where will the Nano fall? What are its clocks? Voltages? How high can it OC while staying under Vega 56's consumption? Temperatures?

Now OC too? :D

The 1080 is what the OP should have gotten in the first place. Top-end Vega performance at half the power draw, cheaper by now, and it could have been had 15 months ago.

Maybe #waitfornavi? ;)
 
Those videos are useless; it never ceases to amaze me that these side-by-side comparison videos even rack up this many views. You cannot judge a damned thing this way; you need actual performance data on a graph.

Not to mention the video compression that makes it a total wash...
 
Now OC too? :D

The 1080 is what the OP should have gotten in the first place. Top-end Vega performance at half the power draw, cheaper by now, and it could have been had 15 months ago.

Maybe #waitfornavi? ;)

Sure, kick me while I'm down, lol. I will wait for confirmed Vega data at this point; I owe it to myself after waiting this long anyway.
 


It doesn't matter, because even the 620-watt power supply I have can handle that type of overclock; as I stated, my system only uses around 400 watts right now at stock, full tilt. The power supply still has the extra wattage it needs to run an overclocked 1080 Ti. The OP has a 500-watt power supply. I could put that into my system and it would run fine, but put a Vega in there and it's going to be very difficult: 50 more watts at stock puts it at nearly 90% load on a 500-watt power supply. Hence why I stated I would be very wary of that power supply with Vega in his current configuration.
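As a quick sketch of that load math (the 400 W and roughly +50 W figures are this post's estimates, not measurements):

# PSU load for the two GPU options on a 500 W unit -- figures are the estimates from this post.
def load_pct(system_watts, psu_watts):
    return 100.0 * system_watts / psu_watts

with_1080ti = 400        # whole system at stock, full tilt (estimate)
with_vega = 400 + 50     # roughly 50 W more at stock

for label, draw in (("1080 Ti", with_1080ti), ("Vega", with_vega)):
    print(f"{label}: {load_pct(draw, 500):.0f}% load on a 500 W PSU")
# -> 1080 Ti: 80%, Vega: 90%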
 
We don't know the power consumption situation with RX Vega; we only have the TDP values. Fury X has a TDP of 275 W and average consumption lower than that. It needs testing. Stop claiming 350 W if you have any integrity.

I have never been a big fan of using TDP as a measure of power consumption, since it really describes the required effectiveness of the heatsink. I do agree with you that the Fury X has a power consumption of around 275 watts during normal usage, but its absolute power consumption is 400 watts in power-virus testing, though that is a very extreme case.

As regards the 350 watts for RX Vega: while it is true we don't know the typical power consumption of RX Vega, we can infer from Vega FE Air and Liquid Cooled; testing done by PCPer shows the Air version drawing near 300 watts and the Liquid version drawing near 300-350 watts depending on which mode is used.
 
He's referring to a balls-to-the-wall OC from a user. Factory OCs tend to extend the boost range with a slight bump to the base clock, and definitely don't raise voltage as much as a user would. Most users make the boost clock their new base, hence the far higher power usage compared to just factory-OCed cards.

But a factory-OC card's base clock is typically near the reference boost clock. The 1080 Ti reference base clock is 1480 MHz and boost is 1582 MHz, while the MSI 1080 Ti Gaming X runs 1569 MHz in OC mode with a 1683 MHz boost. I do agree that the 1080 Ti can be OC'd much higher, up to 1900+ MHz, with much higher power consumption, typically 300+ watts on the card alone at those speeds. A factory OC may be a mild OC, but it is still an OC over reference.
 
Not to mention the video compression that makes it a total wash...

If you watch the video, he toggles FreeSync on and off multiple times. The flashing is very apparent when FreeSync is on.
 
If you watch the video, he toggles FreeSync on and off multiple times. The flashing is very apparent when FreeSync is on.

This thread is forking quite a bit at this point, but I believe he might have been referring to the video comparing the three different video cards at the same time... not the FreeSync one. Could be wrong, though.
 
Not to beat a dead horse, but did you disable Magic Bright (dynamic contrast) and game mode when you engaged Ultimate mode?

I'm not trying to pick on you, just possibly help you find a solution.
 
Well, there is no "disable" option for Magic Bright, but it's on "custom". There is a "dynamic contrast" option, but they are mutually exclusive. Game mode is off. Neither setting has been changed since I got the monitor.

No sweat, I appreciate the input.
 
This thread is forking quite a bit at this point, but I believe he might have been referring to the video comparing the three different video cards at the same time... not the FreeSync one. Could be wrong, though.
Correct.
 
It was never about comparing graphics cards. I was simply replying to power consumption claims. The 1080 Ti still uses a ton, and 283 W on a custom 1080 Ti is not minor if you are talking about hard limits like system cooling, PSU, etc. That is all. People pretending that much power is oh so bad doesn't make sense regardless of performance. A couple of years from now a card will use this much power and outperform everything here, and we still won't think the 1080 Ti was using "too much." There is a difference between being more efficient and using "too much power".



We don't know the power consumption situation with RX Vega; we only have the TDP values. Fury X has a TDP of 275 W and average consumption lower than that. It needs testing. Stop claiming 350 W if you have any integrity.

Then why not the 1080, where Vega is actually competing?
BTW, you cannot really use that figure, as it is not the absolute figure nor does it have any correlation to TDP/TBP, which is what is important in this discussion.
Anyway, Nvidia's TDP/TBP claims for the 1080 Ti are actually correct; the crux is using the FE card to get those numbers and using sites, like I mentioned, that can actually isolate and measure accurately (only two sites can do this, with PCPer being one, but you did not link the TDP, and a third site is not bad).
The official FP32/etc. performance from Nvidia is based on the official boost clock rather than actual clocks, which usually go higher, so let's see what the TDP/TBP is for the 1080 Ti.

The official TDP is 250W.
Here is the 1080 Ti FE measured accurately by Tom's.

[Image: 00-Power-Consumption.png (Tom's Hardware)]


Ignore the peak, because that shows instantaneous power draw and does not reflect TDP/TBP; it is more of interest for engineering and power management/behaviour/regulation, and will be higher for all GPUs and manufacturers.

[Image: 01-Clock-Rate-vs-Temperatures-Gaming-Loop.png (Tom's Hardware)]



And that real-world TDP comes with average clocks of 1650 MHz in the real world as well, while the official boost clock is only up to 1582 MHz, so we have a GPU that is within TDP/TBP yet ironically has greater FP32 performance than the official spec due to higher real-world clocks (the Nvidia figure given usually correlates with the base clock, and sometimes the boost clock, but the official rating is conservative relative to real-world behaviour; saying AMD does it differently is being polite).
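To put rough numbers on that clock difference, here is a sketch (assuming GP102's 3584 shaders and 2 FLOPs per shader per clock from FMA):

# FP32 throughput scales linearly with clock speed (2 FLOPs per shader per clock via FMA).
SHADERS = 3584               # GP102 (GTX 1080 Ti)
FLOPS_PER_SHADER_CLK = 2

def tflops(mhz):
    return SHADERS * FLOPS_PER_SHADER_CLK * mhz * 1e6 / 1e12

print(f"official boost, 1582 MHz: {tflops(1582):.2f} TFLOPS")    # ~11.3
print(f"measured average, 1650 MHz: {tflops(1650):.2f} TFLOPS")  # ~11.8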

So there is nothing wrong with Nvidia's claims.
If you want to bring AIBs into this, well, then you need to rip up the official TDP/TBP of Vega as well, because OCing it will have a pretty hefty impact on TDP, and they have yet to show they can accurately state TDP/TBP with the implementation of their own dynamic boost (look at Polaris to see this; historically AMD was more accurate with the older boost mechanism).
Anyway, Vega should be compared to the 1080 even in this regard rather than to the 1080 Ti; I am just showing that Nvidia's claims are actually accurate.

Edit:
I should also say the Fury X was not a bad GPU in terms of TDP when left at reference, but every other Fury model did not reflect this (the Nano had a good TBP, but achieving it required much lower reference clocks, and most people just OC'd it and so ripped that up).
Cheers
 
Then why not the 1080, where Vega is actually competing?
BTW, you cannot really use that figure, as it is not the absolute figure nor does it have any correlation to TDP/TBP, which is what is important in this discussion.
Anyway, Nvidia's TDP/TBP claims for the 1080 Ti are actually correct; the crux is using the FE card to get those numbers and using sites, like I mentioned, that can actually isolate and measure accurately (only two sites can do this, with PCPer being one, but you did not link the TDP, and a third site is not bad).
The official FP32/etc. performance from Nvidia is based on the official boost clock rather than actual clocks, which usually go higher, so let's see what the TDP/TBP is for the 1080 Ti.

The official TDP is 250W.
Here is the 1080 Ti FE measured accurately by Tom's.

[Image: 00-Power-Consumption.png (Tom's Hardware)]


Ignore the peak, because that shows instantaneous power draw and does not reflect TDP/TBP; it is more of interest for engineering and power management/behaviour/regulation, and will be higher for all GPUs and manufacturers.

[Image: 01-Clock-Rate-vs-Temperatures-Gaming-Loop.png (Tom's Hardware)]



And that real-world TDP comes with average clocks of 1650 MHz in the real world as well, while the official boost clock is only up to 1582 MHz, so we have a GPU that is within TDP/TBP yet ironically has greater FP32 performance than the official spec due to higher real-world clocks (the Nvidia figure given usually correlates with the base clock, and sometimes the boost clock, but the official rating is conservative relative to real-world behaviour; saying AMD does it differently is being polite).

So there is nothing wrong with Nvidia's claims.
If you want to bring AIBs into this, well, then you need to rip up the official TDP/TBP of Vega as well, because OCing it will have a pretty hefty impact on TDP, and they have yet to show they can accurately state TDP/TBP with the implementation of their own dynamic boost (look at Polaris to see this; historically AMD was more accurate with the older boost mechanism).
Anyway, Vega should be compared to the 1080 even in this regard rather than to the 1080 Ti; I am just showing that Nvidia's claims are actually accurate.
Cheers
Did you not read his post at all? It isn't about stock clocks or what clocks it attains or some other random CRAP. It is in reference to people complaining that 300 W is too much in their tiny room and hence Vega would be unbearable, with no mention of performance. So he pointed out that anyone who OCs their 1080 Ti, not AIB-OCed models but manually OCed by users, CAN easily reach into the 300 W range and above. Hell, your chart of the FE proves this as well: going from 1650 MHz to the 2000 MHz most are aiming for would greatly surpass that peak as everyday wattage.
 
Did you not read his post at all? It isn't about stock clocks or what clocks it attains or some other random CRAP. It is in reference to people complaining that 300 W is too much in their tiny room and hence Vega would be unbearable, with no mention of performance. So he pointed out that anyone who OCs their 1080 Ti, not AIB-OCed models but manually OCed by users, CAN easily reach into the 300 W range and above. Hell, your chart of the FE proves this as well: going from 1650 MHz to the 2000 MHz most are aiming for would greatly surpass that peak as everyday wattage.

Facepalm.
Yes, but the OFFICIAL TDP for Vega IS 300 W, and more for the models that perform!
This does not reflect Nvidia, who do not operate anywhere near this officially; also, he made comments about 1080 Ti power demand and Nvidia's claims where I showed he was not fully correct.

Like I said, if you start talking about an OC'd 1080 Ti to get closer to this figure, while ignoring that 1080 Ti performance is meant to be notably higher than Vega even without an OC (Vega is meant to be competing with the 1080), then go wonder where the hell Vega is going to come in considering its official TDP/TBP.
 
Right, here are some accurate figures on Vega FE.
In summary, without the watercooling it looks like it has a clock range of 1440 MHz at best, dropping to 1269 MHz when hot.
At 1440 MHz its TBP is 293 W; at 1269 MHz its TBP is 266 W.
These indicators definitely put Vega more around 1070/1080 performance generally, excluding good custom Nvidia AIBs, IMO; the watercooled model looks like it will be the most optimal Vega, but it depends how much above 300 W it needs.

[Image: 00-Power-Draw-Overview.png (Tom's Hardware)]



[Image: 08-Temperature-vs-Clock-Rate-vs-Performance.png (Tom's Hardware)]



My context is not about fps performance per se, but more about the recent discussion around TDP/TBP and real-world behaviour, that is, clocks/TDP/performance (it looks like the best will be the watercooled model for optimal efficiency without OCing it out of that range, just like before).
http://www.tomshardware.com/reviews/amd-radeon-vega-frontier-edition-16gb,5128-10.html

Cheers
 