This is what AMD should do around the Vega launch to swing the high end away from Nvidia

First, Vega needs to offer decent performance. If it can't outright BEAT the 1080ti, then it has to get DAMN CLOSE with competitive pricing overall.


But the other thing is related to partnerships AMD can try to leverage: FreeSync. They should do everything in their power to get the Xbox Scorpio and PS4K to support FreeSync through HDMI 2.0.

More than that, of course, they should get the big TV makers - Samsung/LG/Sony/Vizio/Panasonic - to commit to adding adaptive sync support to some of their 2017 TV lineup through HDMI 2.0 and, finally, DisplayPort 1.4 in some cases. The latter can be reserved for higher end TVs that support 120Hz refresh rates natively at 4K.

And here will come the mic drop moment.

Unveil TWO highest-end systems: one with 1080 Ti or Titan GPUs running on the highest-end G-Sync display, a teensy weensy wittle 34" widescreen, not even full 4K, and right next to it... a much larger full-4K Vega rig (or Zega rig, if Zen is ready) attached to a 2017 LG OLED with 120Hz FreeSync, with an optional toggle to remove sample-and-hold and allow strobing for smoother motion in games.

And then Raja and Lisa drop their mics at once.


Game over. The "most premium experience," even with a few extra potential fps, turns out to be a fucking disaster ditch of an inferior total setup.


[H]ard's own Vega might go VEGA!, ditch his own Nvidia rig, sell his current OLED, and get one of these adaptive sync TVs that are the holy grail of what we've all wanted. This is something AMD should try to do, because if they can get their partners to implement this, they will have a MASSIVE advantage on the ultra high end once Vega drops. It would probably be enough to force Nvidia to start supporting adaptive sync.
 
AMD is so gung-ho about DX12 that even if the DX11 titles fall behind by an average of 20%, they will price it right in line with Nvidia. They continue to follow the same wrong path they have for years. Management and marketing need a clean sweep, plain and simple. They are just out of touch IMO.
 

Yeah, they are clearly out of touch with reality. The 480 launch was a facepalm with the PCIe power issue. How do you go into a launch with a glaring oversight like that? It's supposed to be them at their best, but clearly the right hand didn't know what the left was doing. They've got incredibly sharp minds, but their execution is failtastic.
 
I'm not concerned about their DX11 performance; that concern will slowly diminish over time as DX12 titles take over the high end and the DX11 titles all run well enough not to matter.

I am very concerned about their VR performance, which is where the DX11 performance comes into play with UE4. I read a dev talking about how DX12 support in that engine is pretty lacking at the moment, with frequent crashes. It's a worry that needs to be addressed. But the way forward is not to double down on DX11, in my view; it's to help move devs out of the gutter of DX11 and onto DX12/Vulkan.


VR is a kind of reset, and I don't want to see AMD trailing out of the gate. Right now some games are in alpha, but come launch time, AMD has to be competitive, and if the games are designed from the ground up on DX11 because of UE4's crippled DX12 support, AMD is going to have problems initially.
 


I am concerned about DX12 and VR. DX12 has been out for a year now?? How many titles do we have? I surrendered to Sandy Bridge in 2011 and to Pascal this year, and I am not going back for a while on my main rig. I am not giving AMD a dime until they get it. They had some good products, but the execution, pricing, and support make me scratch my head.

You should be concerned about DX11 too; it's not going away, and AMD has had plenty of years to optimize GCN at the architectural level and driver level to run it on par with Nvidia.
 
I want six Mini DisplayPort 1.4 outputs and a powerful single GPU. I want it to be like when the 5870 launched and it was ridiculous versus the previous generation.
 
Beat a 1080 Ti? If it hits 80% of what a 1080 Ti will be able to do, I'll shit myself.


Agreed. There's nothing wrong with wanting AMD to do well, but I think he, like AMD leadership, is, well...

AMD is the same company that told us a few months ago that they were starting a VR revolution, and now we're seeing real-world results of them getting pummeled by NVIDIA. And the OP seems to be in some LSD-induced fantasy that this same company will pull off some miracle and take down NVIDIA's top end card? LOL! They can't even compete with a 1060, let alone a 1080 Ti or Titan X.
 
Beat a 1080 Ti? If it hits 80% of what a 1080 Ti will be able to do, I'll shit myself.
So in other words, a 1080.

The problem with what the OP is suggesting is that if AMD does pull off something like that, it'll quickly be discovered that AMD's high end GPU can't actually run that 4K 120Hz FreeSync monitor at even 60fps (a Pitan X barely does that), and it will then create a HUGE backlash against them, whether justified or not.

Besides, if they DO show an OLED 4K 120Hz monitor, then you pretty much throw the entire "AMD is better bang for the buck" perception out the window, as that monitor is going to be as expensive as (if not more expensive than) the 1080 Ti/Pitan X and the 34" ultrawide put together. And I am not exaggerating: that 30" Dell OLED monitor is slated for $5000, which is enough to buy an entire SYSTEM with an Acer X34 and 2 Pitan X's, let alone putting the actual computer together.

If they want to take the market back, they need to release a GPU that is better than anything nVidia currently has, because that'll play into the layman's perception of "if nVidia has the best, then they are better," even though that perception is generally false when you are in the sub-$200 range.

Also, the highest-resolution G-Sync display is not a 34" ultrawide, but a 32" 4K (XB321HK).
 
It needs to be the best. Not the best in DX12, not the best per dollar, not the best in the category, simply the best. No argument.
 
here goes nothing!

As long as it has enough performance for the price, it will sell. Nvidia will sell. AMD will sell. Just look at it: RX 480s and RX 470s are well in stock according to my nowinstock.org notifications, which were going off the chain. I had to turn them off.

All rejoice: if not high end, AMD is at least giving something to the people who want cheap but good performance. The RX 470 is doing just that. A shitload seem to be available, they sell, and they get restocked. So I don't even know why AMD released the RX 480 the way they did; that was a fail. The RX 470 is so close in performance they should have just gone with two versions of it, 4GB and 8GB for $179.99 and $219.99, and sold them all day. Then release the RX 480 down the road without the stupid stock cooler and let the AIBs roll with it.

I don't have my hopes up for Vega; that's just what AMD has earned over time. But I think they will end up matching at least the 1080 with it. Idk about Titan X and such. If they get anything higher than a 1080, I'll drink the Kool-Aid.
 
So in other words, a 1080.

The problem with what the OP is suggesting is that if AMD does pull off something like that, it'll quickly be discovered that AMD's high end GPU can't actually run that 4K 120Hz FreeSync monitor at even 60fps (a Pitan X barely does that), and it will then create a HUGE backlash against them, whether justified or not.

Besides, if they DO show an OLED 4K 120Hz monitor, then you pretty much throw the entire "AMD is better bang for the buck" perception out the window, as that monitor is going to be as expensive as (if not more expensive than) the 1080 Ti/Pitan X and the 34" ultrawide put together. And I am not exaggerating: that 30" Dell OLED monitor is slated for $5000, which is enough to buy an entire SYSTEM with an Acer X34 and 2 Pitan X's, let alone putting the actual computer together.

If they want to take the market back, they need to release a GPU that is better than anything nVidia currently has, because that'll play into the layman's perception of "if nVidia has the best, then they are better," even though that perception is generally false when you are in the sub-$200 range.

Also, the highest-resolution G-Sync display is not a 34" ultrawide, but a 32" 4K (XB321HK).


32" 4k gsync works for me, you can set that against a 43-50+ inch 4k freesync tv, especially an oled comparison would make the gsync display look like a cheap knockoff fisherprice toy which is basically what almost all the pc monitors are when it comes to display quality compared to high end tvs.

There is less work needed to get TVs to have more PC-monitor-like features than there would be to get monitors to have more TV-like features, like local dimming (ZERO monitors, so no solid HDR capability) or OLED (nothing aside from some freakishly expensive Dell model). It's a total shit show, and 32" 4K = wtf is the fracking point. Go bigger or go home; 40-43 inches is the sweet spot for desktop 4K, and if you are farther back in a living room setup, then a larger 4K size is ideal for a more immersive experience.


There are already rumblings of TV makers adding FreeSync support, so this is not some pie-in-the-sky hope either.
 
I am concerned about DX12 and VR. DX12 has been out for a year now?? How many titles do we have? I surrendered to Sandy Bridge in 2011 and to Pascal this year, and I am not going back for a while on my main rig. I am not giving AMD a dime until they get it. They had some good products, but the execution, pricing, and support make me scratch my head.

You should be concerned about DX11 too; it's not going away, and AMD has had plenty of years to optimize GCN at the architectural level and driver level to run it on par with Nvidia.


I am, but I've burned off my hopes for DX11 and put them into the DX12 basket. The hope is more that triple-A games tend to include DX12 or Vulkan. But who knows what game devs will actually do; if they keep using that broken mess they are all in love with, UE4, DX12 will continue to be a crippled mess until real support is added. Even CryEngine, which no one other than Star Citizen seems to use, seems poised for better DX12 support since Star Citizen is confirmed to have it... no timetable though.




CryEngine rewrote the renderer to properly handle DX12... UE4... proper DX12 support? Does this help Nvidia? Fuck it.

Ugh, such a shit engine, and the fact that so many devs lick its boots is a tragedy beyond words; it holds the world back.
 
DX11 is on its last legs. I mean, almost every new game coming out is jumping on the DX12 bandwagon. Even the next-gen high end should play any DX11 game at more than enough fps.
 
32" 4k gsync works for me, you can set that against a 43-50+ inch 4k freesync tv, especially an oled comparison would make the gsync display look like a cheap knockoff fisherprice toy which is basically what almost all the pc monitors are when it comes to display quality compared to high end tvs.

There is less work needed to get TVs to have more PC-monitor-like features than there would be to get monitors to have more TV-like features, like local dimming (ZERO monitors, so no solid HDR capability) or OLED (nothing aside from some freakishly expensive Dell model). It's a total shit show, and 32" 4K = wtf is the fracking point. Go bigger or go home; 40-43 inches is the sweet spot for desktop 4K, and if you are farther back in a living room setup, then a larger 4K size is ideal for a more immersive experience.


There are already rumblings of TV makers adding FreeSync support, so this is not some pie-in-the-sky hope either.

I went 32" 4k because all of the 4k TV's here either have no reviews (BenQ being the biggest name, among other companies like InFocus), don't have 40" ones (Samsung JS series do not have smaller than 55" versions here) or have PWM (The phillips one), eliminating all possible choices barring that particular monitor for me, since my desk literally cannot fit anything bigger than a 40". Even LG's OLED TV has minimum size of 55".

My main problem with 4K monitor choice is the complete lack of reviews for the models we have, and the total shit consumer laws we have (we can only, by and large, exchange defective units; refunds are generally not possible).

I used to disdain large screens as well, because I do NOT want to move my head just to see a number in one corner of the screen, and come to think of it, it definitely wasn't a poor choice for me to go 32". I cannot imagine the horror of looking at 1440p on a 40" screen because my hardware (or any hardware that was available) wasn't up for it.

4K wasn't ready for prime time, and thus a big screen is a liability when downscaling; the bigger the screen, the more you are forced to run at its native resolution.

That was my opinion. I have no doubt that 40" 4K TVs would look mighty impressive, but from a practical point of view, the hardware isn't there to support them.

By hardware, I mean 60fps minimum at max settings on a single GPU: not 60fps average, not "high" graphical settings, not SLI/CrossFire; 60 minimum, at max, on a single GPU, at 4K. Right now, the hardware only barely manages a 60fps minimum in Witcher 3 at 1440p, let alone 4K.

I do not believe, however, there is any use in discussing this further, since there is absolutely nothing stopping anyone from connecting a 4K 120Hz FreeSync monitor to an nVidia GPU and having it work. The main point here is whether AMD can come up with a GPU that would forcefully take the performance crown back from nVidia (and not just in DX12 titles either). Given the performance gap between the Fury X and the 1080 (let alone the Pitan), I am not optimistic.
 
I am, but I've burned off my hopes for DX11 and put them into the DX12 basket. The hope is more that triple-A games tend to include DX12 or Vulkan. But who knows what game devs will actually do; if they keep using that broken mess they are all in love with, UE4, DX12 will continue to be a crippled mess until real support is added. Even CryEngine, which no one other than Star Citizen seems to use, seems poised for better DX12 support since Star Citizen is confirmed to have it... no timetable though.

CryEngine rewrote the renderer to properly handle DX12... UE4... proper DX12 support? Does this help Nvidia? Fuck it.

Ugh, such a shit engine, and the fact that so many devs lick its boots is a tragedy beyond words; it holds the world back.

UE4 is a great engine since it allows middle-tier developers or bigger indie studios to produce good-looking games.

It's not the developers' fault that AMD made a shitty architecture that can only operate at 80% of its full potential under DX11.
 
Yep, AMD can do whatever they want with the UE4 engine; it's not that hard to get the source lol and make their own libraries... They just need to spend the resources to do it.
 

They're too ethical to do that.
 
I don't have much confidence in AMD outright beating Nvidia in performance, but I do have faith they can come close at a lower price.
 
I don't think AMD will overtake Nvidia anytime soon; they just need more cash flow. Going by past history, AMD just doesn't have the resources that Nvidia does, especially when it comes to drivers; they tend to take 6-12 months to keep squeezing out more performance. The RX 480 will probably be the same way: in 6 months or so it will probably gain another 10-15% performance. They will come close to Nvidia, but Nvidia just went all out when it comes to getting every MHz from their chips. I never thought we would have GPUs at 2GHz already. They didn't make many changes to the architecture and went for raw speed. AMD went on to upgrade their front end, and given their history they will take 6 months to have drivers optimized for their new hardware. That's just the reality.
 
AMD needs to actually offer a competitively performing high end video card before they go focusing on niche products.
 
When your midrange card uses more power than the competitor's high end card, you've got problems. Honestly, unless they pull off something crazy, I don't see the high end AMD cards being all that competitive other than maybe on price.

It just seems weird that AMD went from being power efficient and affordable to power hungry and affordable, all while still performing worse.
 

You are partially right. AMD was always behind Nvidia in power efficiency. You have to compare the RX 480 to their previous-gen 390/390X; it is still a hell of a lot more power efficient than their previous gen. They can't magically beat Nvidia on that front, since Maxwell was already power efficient and Nvidia shrank it down, clocked it up, and got even more performance.

The RX 480 is not power hungry; it uses about 160-165W at stock. Yes, it doesn't deliver Nvidia-class performance, but it does do a hell of a lot better than the 390X at 300W. Now, AMD did get close to Maxwell with the Fury X cards; they made up a big chunk of the efficiency gap there, partly because of water cooling: keeping the card at 65C saved them power. So they will be able to make up some power efficiency with Vega; we just don't know how much.

I bet the RX 480 will have a few more revisions where it actually gains power efficiency. It's just AMD being AMD, and GlobalFoundries is not really the best at new nodes; we all know they have had a hell of a time with this one.
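To put that efficiency comparison in rough numbers, here is a minimal perf-per-watt sketch. The board-power figures (RX 480 at roughly 160-165W stock, 390X at roughly 300W) come from the post above; the assumption that the RX 480 lands at roughly 390X-level performance is mine, purely for illustration.

```python
# Rough perf-per-watt comparison for the numbers quoted above.
# Assumption (not from the post): RX 480 performance normalized to ~390X level (1.0).

def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    """Simple performance-per-watt score: relative performance divided by board power."""
    return relative_perf / board_power_w

r9_390x = perf_per_watt(1.0, 300.0)   # ~300 W board power, per the post
rx_480 = perf_per_watt(1.0, 162.5)    # midpoint of the quoted 160-165 W range

print(f"RX 480 perf/W vs 390X: {rx_480 / r9_390x:.2f}x")  # ~1.85x under these assumptions
```

Under those assumptions the RX 480 comes out at roughly 1.8x the perf/W of the 390X, which is the "hell of a lot better than their previous gen" being described here, even if it still trails Pascal.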
 
Slightly off topic, but I'm wondering if NV has picked up tick-tock and whether or not we'll see Volta bring more substantial changes to the SM structure. I'm guessing that's how it will be, because I don't see any other way for them to double perf/W without a node change.
 
Yeah, it will have to be a very big change to the SMs and possibly other things too. In any case, it's more of a response to Intel, as Intel has picked up ground on the compute side of things, and nV needs to get their next gen out in that market sooner rather than later; Pascal came out later than Knights Landing.
 

Did GP100 end up having a front end or not, by the way? It would be interesting to see pure compute GPUs being produced.
 
You are partially right. AMD was always behind Nvidia in power efficiency. You have to compare the RX 480 to their previous-gen 390/390X; it is still a hell of a lot more power efficient than their previous gen. They can't magically beat Nvidia on that front, since Maxwell was already power efficient and Nvidia shrank it down, clocked it up, and got even more performance.

The RX 480 is not power hungry; it uses about 160-165W at stock. Yes, it doesn't deliver Nvidia-class performance, but it does do a hell of a lot better than the 390X at 300W. Now, AMD did get close to Maxwell with the Fury X cards; they made up a big chunk of the efficiency gap there, partly because of water cooling: keeping the card at 65C saved them power. So they will be able to make up some power efficiency with Vega; we just don't know how much.

I bet the RX 480 will have a few more revisions where it actually gains power efficiency. It's just AMD being AMD, and GlobalFoundries is not really the best at new nodes; we all know they have had a hell of a time with this one.

I was just reading about NV tapping Samsung for future GPUs, so I don't think the GloFo/Samsung process is actually the problem. It's a big decision mainly because it carries significant development overhead, as they would need to reoptimize and rewrite their libraries for the new process.
 
When your midrange card uses more power than the competitor's high end card, you've got problems. Honestly, unless they pull off something crazy, I don't see the high end AMD cards being all that competitive other than maybe on price.

It just seems weird that AMD went from being power efficient and affordable to power hungry and affordable, all while still performing worse.

I think GPUs are out of "pulling off something crazy" land for a while. Especially on a new node, it relies more on strong management and sound engineering.

I've lost faith in AMD since the Fury X rumors. I don't expect any miracles. I expect the gap between them and nVidia to only widen.
 
Did GP100 end up having a front end or not, by the way? It would be interesting to see pure compute GPUs being produced.


There were some front end changes; I don't know how effective they are compared to the clock speed increases for most things, but dynamic load balancing has been very effective, though you can't really compare that to older gens since the feature wasn't there before lol. I would say at most 5-10% performance gains came from the front end changes. It's not much, but Maxwell's units are already well utilized in most instances; I don't think nV can squeeze any more out of the current SM structure at this point. They have to redesign the shader array to get any more utilization.
 


I was asking if GP100 even carries a front end; it wasn't on the diagrams they released at GTC, so there were some doubts, but I'm guessing they had just omitted it.

Yeah, I'm quite looking forward to a shakeup; always fun to see something new, then bash it to hell if/when it doesn't work. ;)
 
Oh lol, yeah, it's pretty much the same chip as GP102, just with different memory buses; they didn't show it in the diagrams since it's not the focal point of that chip.
 
I was asking if GP100 even carries a front end; it wasn't on the diagrams they released at GTC, so there were some doubts, but I'm guessing they had just omitted it.

Yeah, I'm quite looking forward to a shakeup; always fun to see something new, then bash it to hell if/when it doesn't work. ;)
And that is what I was pointing out to you when they released it, and you just argued with me about it.
 
As a long-time sympathizer and AMD fan, I have to say I too am skeptical. Been burned too many times.

PS, as a side thought: remember the old Socket A T-Bird 1200MHz? The first rev sucked. The blue B revisions OC'd decently; I believe they added some layer to help mitigate voltage leakage. Do GPU dies go through similar revisions as CPU dies? Because if the 480/Polaris is what we can expect long term regarding power efficiency, then how in the world is AMD going to make a 250W part that even beats a GTX 1080 :(
 


Rumor is that Vega is a different architecture and could very well be the biggest change from GCN so far. It needs to be, because GCN doesn't clock high, nor is it power efficient at higher clocks.
 

Yeah, it better be, but I'll believe it when I see it. I sure hope they do, though, as my past couple of nicer cards have been Nvidia, and I'd really like to support AMD if they give me something worthwhile to buy.
 
As a long-time sympathizer and AMD fan, I have to say I too am skeptical. Been burned too many times.

PS, as a side thought: remember the old Socket A T-Bird 1200MHz? The first rev sucked. The blue B revisions OC'd decently; I believe they added some layer to help mitigate voltage leakage. Do GPU dies go through similar revisions as CPU dies? Because if the 480/Polaris is what we can expect long term regarding power efficiency, then how in the world is AMD going to make a 250W part that even beats a GTX 1080 :(


Even I am skeptical about AMD being able to pull off beating the Titan/1080 Ti, but they can still do well if they blow past the 1080 and seriously undercut the 1080 Ti. They won't get the max-performance-or-die crowd, but again, if we start seeing TVs ship with DisplayPort 1.4, FreeSync, and 120Hz, along with their better contrast and color, that could be enough to sway a lot of people away from Nvidia and their G-Sync cartel of tiny displays.
 