AMD Vega and Zen Gaming System Rocks Doom At 4K Ultra Settings

It'd only be a flop if AMD and the fans overhype the shit out of it. Whhhiicchh seems to be happening.

While overhyping it would be a huge wound for Vega, in truth I see a HUGE trend in AMD's market share: it follows the competitive status of AMD's top end.

In other words, when AMD had a GPU that could compete with (or even DEFEAT) Nvidia's actual top-end cards, AMD's market share was highly competitive. When AMD shat the bed with "good enough" GPUs and shitty rebrands, their market share tanked.

The 290X was "almost as fast" as a 780 Ti at launch, and priced to compete. And nobody cared.

The Fury X was faster than the 980, but because it wasn't as fast as the 980 Ti, nobody cared.

Meanwhile, when the 7970 GHz Edition was factually and consistently faster than the top-end Nvidia GPU, AMD's market share was going strong.

I see a trend.
 
I see more people hyping its failure than people saying it's the next big thing.
Of course, that seems to get ignored all the time.

That's because I personally believe that AMD tends to over-hype everything:
1. There was the time when they implied (never stated directly) that the GTX 970 had less than 4 GB of RAM, which was later proved to be a lie.
2. Many months before the Fury X's release they were "bombing" us with the tremendous capabilities of HBM memory, letting us believe their GPU would be something between a rocket and a spaceship!! :pompous: Until NV released the 980 Ti and the spaceship landed. :ROFLMAO:
3. They claimed the RX 480 would provide us with a "premium VR experience", and so far it's far behind all of NV's competing GPUs.
So, personally, I don't take them seriously anymore until I see reviews first.
 
Nearly all of us (I'll accept that there are some hardcore Nvidiots out there) would prefer to see a highly competitive AMD, with products that challenge the price, features, and performance of the best that competitors have to offer in every market where they compete.

The reality is that this doesn't happen very often.
 
So, okay. I may have been a bit harsh on Vega so far. I just loaded up DEWM and ran the exact level from the demonstration that showed the game running at 60-75 FPS on Vega. I run at 4K, absolute maximum settings, on Skynet, the system in my sig: two OC'd Maxwell Titan Xs in SLI. I got about the same FPS as the Vega sample (I was about 7% ahead). Raw screenshot from Steam below. I'm pretty sure my two 1400 MHz+ Maxwell Titan Xs are faster than a single Pascal Titan X.

[Screenshot: 20170108235927_1.jpg]
 
I wouldn't be so sure.
Check the graph posted in #120. You will see that SLI doesn't scale properly in DOOM (although that graph shows measurements with OpenGL, not Vulkan).
 
So, okay. I may have been a bit harsh on Vega so far. I just loaded up DEWM and ran the exact level from the demonstration that showed the game running at 60-75 FPS on Vega. I run at 4K, absolute maximum settings, on Skynet, the system in my sig: two OC'd Maxwell Titan Xs in SLI. I got about the same FPS as the Vega sample (I was about 7% ahead). Raw screenshot from Steam below. I'm pretty sure my two 1400 MHz+ Maxwell Titan Xs are faster than a single Pascal Titan X.
When you say absolute maximum, are you also using Nightmare shadows?
If so, you need to lower that to Ultra like the AMD demo.
Cheers
 
Yep. Another example as well, higher settings tho:
[Attachment 14291: Doom SLI scaling graph]

Both the 980 and the 1080 get around 30% scaling.
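(For reference, "scaling" here means the extra performance the second card adds over a single card. A minimal sketch of that arithmetic, using illustrative FPS numbers rather than values read off the graph:)

```python
def sli_scaling(fps_single: float, fps_dual: float) -> float:
    """Extra performance the second GPU adds, in percent."""
    return (fps_dual / fps_single - 1.0) * 100.0

# Illustrative numbers only (not taken from the attached graph):
# a single card at 60 FPS vs. an SLI pair at 78 FPS.
print(f"{sli_scaling(60.0, 78.0):.0f}% scaling")  # -> 30% scaling
```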

Interesting indeed. I was getting minimums of 70 FPS on the map, and I was running in OGL. I tried out 'just' Ultra settings and didn't see any difference in FPS.

That said, according to that graph, my system beats out a 1080 SLI setup, which I find doubtful. The level in the AMD demo (Argent D'Nur) is not the most demanding of Doom's missions, graphically speaking.
 
If this is confirmed in the end, I'm going to have a field day teasing some forum users here about how they feel about AMD sacrificing IPC for higher clocks, just like Paxwell. Ah, I'm such an asshole lol.

Does premeditated trolling carry a greater charge? Is that like first-degree trolling?
 
When you say absolute maximum, are you also using Nightmare shadows?
If so, you need to lower that to Ultra like the AMD demo.
Cheers

Reproducing the same test environment whenever possible is best. I also wonder how Ryzen impacts the results of the Vega machine. Too many variables.
 
That's because I personally believe that AMD tends to over-hype everything:
1. There was the time when they implied (never stated directly) that the GTX 970 had less than 4 GB of RAM, which was later proved to be a lie.
2. Many months before the Fury X's release they were "bombing" us with the tremendous capabilities of HBM memory, letting us believe their GPU would be something between a rocket and a spaceship!! :pompous: Until NV released the 980 Ti and the spaceship landed. :ROFLMAO:
3. They claimed the RX 480 would provide us with a "premium VR experience", and so far it's far behind all of NV's competing GPUs.
So, personally, I don't take them seriously anymore until I see reviews first.
You just said nothing about Vega, though. If your opinion (good or bad) is based on other products, then you're hyping failure based on things that aren't Vega. That's just as biased, and it's still hype.
On this forum I see more anti-Vega naysaying than any pro-Vega hype. So I dunno why anyone would lie about Vega hype unless they were playing an angle for reasons they wouldn't admit.
 
That's because I personally believe that AMD tends to over-hype everything:
1. There was the time when they implied (never stated directly) that the GTX 970 had less than 4 GB of RAM, which was later proved to be a lie.
2. Many months before the Fury X's release they were "bombing" us with the tremendous capabilities of HBM memory, letting us believe their GPU would be something between a rocket and a spaceship!! :pompous: Until NV released the 980 Ti and the spaceship landed. :ROFLMAO:
3. They claimed the RX 480 would provide us with a "premium VR experience", and so far it's far behind all of NV's competing GPUs.
So, personally, I don't take them seriously anymore until I see reviews first.

You forgot how the Fury X was supposed to be an "overclocker's dream".
 
Interesting indeed. I was getting minimums of 70 FPS on the map, and I was running in OGL. I tried out 'just' Ultra settings and didn't see any difference in FPS.

That said, according to that graph, my system beats out a 1080 SLI setup, which I find doubtful. The level in the AMD demo (Argent D'Nur) is not the most demanding of Doom's missions, graphically speaking.

Are you playing with Nightmare settings? Vega wasn't, for example.
 
I told you the reason behind my comments. It's AMD's history that made me completely ignore their hype and their advertising and wait for reviews.
If they acted like a serious company, I would also take them seriously.

(Now with Vega I'm having déjà vu from the hype prior to the Fury X's release. So, based on the outcome back then, I remain very skeptical, or biased, or whatever word you wish to use ;) )

EDIT: Oops! I just noticed that I forgot to mention whom I was replying to!!! My post was an answer to {NG}Fidel's comment above. :oops:
 
I believe LTT is the biggest tech channel on YouTube right now, so it makes sense for AMD to give Linus a lot of access. Considering how incredibly excited he is prone to get over new tech, it's also a smart move to do so.

Fuel the hype train ;)
 
At least he scrolled through the Doom settings in case anyone wants to match them.

I am not really in the market for a GPU. I just like to follow along and see which predictions are correct. :)

One thing that is certain is that it has the horsepower to be high-end. It comes down to price...
 
So that was an engineering sample, meaning to me it is not ready for prime time yet.

Performance for an engineering sample looks really good!

The tape everywhere in that case had to be heating things up for both the GPU and the CPU.

This does not look like a first-quarter launch to me.
 
So, with regard to the Radeon 570 Vega/Polaris that appeared recently in the laptop, there are a few possibilities here. One is that the die-thinning procedure that was used for the chip that went to Apple, which dramatically lowers power draw, is used for some of AMD's future chips.
 
So, with regard to the Radeon 570 Vega/Polaris that appeared recently in the laptop, there are a few possibilities here. One is that the die-thinning procedure that was used for the chip that went to Apple, which dramatically lowers power draw, is used for some of AMD's future chips.

Die thinning is a standard method that has been used for ages, and it doesn't change the electricals. The rebranded 470 has been used in one(?) other laptop. So besides Apple and two(?) laptops, Polaris hasn't gone anywhere.

I don't even think non-thinned dies exist today in any form. Smartphones, USB sticks, smartcards, etc. couldn't exist without it. It's been done for over 20 years.
 
So, with regard to the Radeon 570 Vega/Polaris that appeared recently in the laptop, there are a few possibilities here. One is that the die-thinning procedure that was used for the chip that went to Apple, which dramatically lowers power draw, is used for some of AMD's future chips.


LOL, yeah, and unicorns are real.

 
Die thinning is a standard method that has been used for ages, and it doesn't change the electricals. The rebranded 470 has been used in one(?) other laptop. So besides Apple and two(?) laptops, Polaris hasn't gone anywhere.

I don't even think non-thinned dies exist today in any form. Smartphones, USB sticks, smartcards, etc. couldn't exist without it. It's been done for over 20 years.

Calm down, I never said they weren't used anywhere else, but what does that have to do with the fact that the die-thinned Polaris die dramatically lowered power draw? Correct, nothing.
 
Calm down, I never said they weren't used anywhere else, but what does that have to do with the fact that the die-thinned Polaris die dramatically lowered power draw? Correct, nothing.


Die thinning has nothing to do with power or circuitry in a chip, so how do you propose it drops power?

Hence the unicorn videos :)

Apple's Polaris chips are downclocked and undervolted significantly; that's why the TDP is so low. And you can't tell the power draw of the chip by looking at the TDP: they're two different things! So even though it's underclocked, you don't even know how much it's boosting.
 
Calm down, I never said they weren't used anywhere else, but what does that have to do with the fact that the die-thinned Polaris die dramatically lowered power draw? Correct, nothing.

You said the wafer backgrinding lowered power draw, and that is wrong. They are simply highly binned, and it's only for the 1024-SP version. You may notice that the two other versions are 768 and 640 SPs and downclocked.

Desktop RX 460: 1090-1200 MHz, 896 SPs

Mobile Pro 450: 725-775 MHz, 640 SPs
Mobile Pro 455: 850 MHz, 768 SPs
Mobile Pro 460: 850-910 MHz, 1024 SPs

It's no different than the 1 GHz full GP104 with 6 GHz GDDR5 at 50 W.
 
Die thinning has nothing to do with power or circuitry in a chip, so how do you propose it drops power?

Hence the unicorn videos :)

Apple's Polaris chips are downclocked and undervolted significantly; that's why the TDP is so low. And you can't tell the power draw of the chip by looking at the TDP: they're two different things! So even though it's underclocked, you don't even know how much it's boosting.

How much undervolted?

I think die-thinned Polaris chips in laptops with dramatically lowered power draw are in the works.
 
How much undervolted?

I think die-thinned Polaris chips in laptops with dramatically lowered power draw are in the works.

There are no Polaris chips that aren't "die thinned". The correct term is wafer thinning or wafer backgrinding.

I gave you an example of the clocks and configs of those Polaris chips. Also, let me give you a hint: what do you see everywhere in laptops, and what don't you see? That's right, Pascal everywhere, Polaris nowhere besides Apple. And Vega certainly won't be in any.
 
How much undervolted?

I think die-thinned Polaris chips in laptops with dramatically lowered power draw are in the works.


A shit ton. Shintai just showed you the clocks: it's downclocked by 30% or more depending on the version, and has cut-down units too.

It's easy to see Polaris was pushed to the high end of its ideal power curve (in the desktop version); that's why, when overclocking a desktop version, you get crazy voltage and power draw for small overclocks. So downclocking it, undervolting it, and using cut-down CUs is going to save Polaris a buttload of power.
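(To put rough numbers on that: dynamic power scales roughly as frequency times voltage squared, P ∝ f·V², so a 30% downclock plus an undervolt compounds quickly. A minimal sketch of the arithmetic, using the clocks quoted above and purely assumed voltages, since the actual voltages of Apple's bins aren't public:)

```python
def relative_dynamic_power(f_ratio: float, v_ratio: float) -> float:
    """Dynamic CMOS power scales roughly as P ∝ f * V^2.

    Returns the new power as a fraction of the original.
    """
    return f_ratio * v_ratio ** 2

# f_ratio from the clocks quoted above: ~850 MHz mobile vs ~1200 MHz desktop.
# v_ratio is a hypothetical undervolt (1.15 V -> 0.95 V, assumed values).
f_ratio = 850 / 1200   # ~0.71
v_ratio = 0.95 / 1.15  # ~0.83
print(f"~{relative_dynamic_power(f_ratio, v_ratio):.0%} of original dynamic power")
# -> roughly 48%, before even counting the cut-down CUs
```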
 
Possibly, but it IS NOT an issue, so IDK WTF he's bringing it up for. It's nothing new...

Oh, I agree. But you have to admit it's a little funny, though, considering FreeSync, FreeSync 2, etc., and then missing such a thing on your new flagship demo.
 