Vega Rumors

Not much testing has been done with Chill on its performance impact or real gaming impact. Does it affect the gaming experience? How does it work with VR? I am not even sure why to bother with it in the first place; it seems more applicable to cell phone and mobile gaming when on battery. Plus, Chill should have settings so you could adjust the minimum frame rate value and the sensitivity to the amount of motion. Meaning I just don't see myself ever using it at this time.

Radeon Vega 64 LC, Power Saver mode, 4K, Alien: Isolation at max settings with tweaks to increase shadow map size and something else I don't remember, frames limited to 60 fps (to stay in the FreeSync range). The benchmark never went below 59 fps, pulling around 235 W from the wall. I was very surprised by that result, and that was without Chill on, using the 17.9.1 drivers. Rendering frames beyond your monitor's refresh rate or FreeSync range not only degrades the gaming experience with tearing, judder, etc., but also just wastes energy. The only time I can see rendering faster than the max sync rate of your monitor is if you use Fast Sync with Nvidia hardware and you are rendering at 2x+ your refresh rate to reduce game latency, which for me would probably not be a significant enough improvement to bother with.

I am just totally rethinking what makes for better gaming for me, and it is not the average FPS, which could be higher yet still come with stuttering, missed frames, etc. I am finding the 1% low FPS (the value below which only 1% of frames fall) to be a more accurate indication, from a numbers standpoint, of the gaming performance of a GPU. The 0.1% data I've seen recorded on various sites looks very inaccurate due to low sample counts; I would say you would need 15+ minutes of gaming to get a good 0.1% data point.
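For anyone curious how those figures are usually derived, here is a minimal sketch. It assumes a hypothetical frametimes.csv log with one frametime in milliseconds per line; conventions vary by site (some report the average FPS of the slowest 1% of frames, others the 99th-percentile frametime converted to FPS), and this shows the first convention:

```python
# Minimal sketch: average, 1% low, and 0.1% low FPS from a frametime log.
# "frametimes.csv" is a hypothetical file, one frametime in milliseconds per line.

def low_fps(frametimes_ms, percent):
    """Average FPS over the slowest `percent` of frames (one common convention)."""
    slowest_first = sorted(frametimes_ms, reverse=True)   # longest frametimes first
    count = max(1, int(len(slowest_first) * percent / 100.0))
    tail = slowest_first[:count]
    return 1000.0 / (sum(tail) / len(tail))

with open("frametimes.csv") as f:
    times = [float(line) for line in f if line.strip()]

print(f"average FPS: {1000.0 * len(times) / sum(times):.1f}")
print(f"1% low FPS:  {low_fps(times, 1.0):.1f}")
print(f"0.1% low:    {low_fps(times, 0.1):.1f}")   # needs a long run to be meaningful
```

It also shows why the 0.1% number needs long runs: a 2-minute run at ~60 fps only has about 7 frames in its slowest 0.1%, so a single hitch dominates the result.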
 
Project Cars @1440p seems to like these cards. Unfortunately no Crossfire yet it seems.
http://gamegpu.com/racing-simulators-/-гонки/project-cars-2-test-gpu-cpu
Their numbers are contradicted by three other publications; Vega cards are considerably slower than the GTX 1080 in this game. They had a set of old numbers which showed the same thing, but they then scrapped those and tested again.



http://www.pcgameshardware.de/Proje...Specials/Benchmark-Test-Grafikkarten-1238952/
http://www.benchmark.pl/testy_i_recenzje/project-cars2-test/strona/28643.html

 

The two reviews linked used older drivers; one even predated 17.9.1 (and those older drivers were junk).
The video does not even specify which driver was used, so who knows?
The data was scrapped because it was gathered on old drivers, and 17.9.2 made significant improvements, I take it.
  • If that is the case, then the 64 was behind the 1080 and is now ~18% faster in this one game
  • Curious if other games saw any kind of improvements

So no conspiracy here, just a new card going through driver enhancements is all.
 


Radeon Software Crimson ReLive Edition 17.7.2 features optimizations for improved gaming responsiveness in DirectX®9 and select DirectX®11 gaming titles.

Finally working on DX11 drivers ;) Two years too late, but at least they are getting around to it.

Now to 17.9.2
  • Radeon RX Vega Series
    • Up to 2x Multi GPU support
  • Project CARS 2™
    • Multi GPU profile support added
No mention of any Project Cars 2 performance enhancements in this driver; I would think they would mention it if something like that had happened. You have to be suspicious of results like that when they aren't mentioned in the driver release notes.

Oddly enough, why does this review, testing Project Cars 2 with these exact drivers, say multi-GPU isn't working, when these drivers specifically added multi-GPU support for Project Cars 2? Something in this review isn't right, or AMD's driver is behaving badly.

I would double-check whether all settings are functioning properly.
 
The video does not even specify which driver was used, so who knows?
The data was scrapped because it was gathered on old drivers, and 17.9.2 made significant improvements, I take it.
The video was done using the latest drivers, and AMD made no mention of any specific FPS improvements for Project Cars 2 or any other game, unlike PUBG, where they announced it loud and clear.

Here is another game benchmark, where Vega is considerably behind 1080.
[Image: 2560x1440 Ultra benchmark chart]


http://pclab.pl/art75527-5.html


And here are a bunch of newly released titles where Vega is considerably behind as well (courtesy of B3D):

Dishonored Death of the Outsider
http://gamegpu.com/action-/-fps-/-tps/dishonored-death-of-the-outsider-test-gpu-cpu
https://www.purepc.pl/procesory/tes..._death_of_the_outsider_bywalo_gorzej?page=0,8

F1 2017
https://www.computerbase.de/2017-08/f1-2017-pc-benchmark/2/#diagramm-f1-2017-1920-1080

Ark Survival
http://gamegpu.com/action-/-fps-/-tps/ark-survival-evolved-test-gpu-cpu

Recore: Definitive Edition
http://gamegpu.com/action-/-fps-/-tps/recore-definitive-edition-test-gpu-cpu

Divinity 2
http://gamegpu.com/rpg/ролевые/divinity-original-sin-2-test-gpu-cpu
 
The AMD driver notes mentioned Project Cars after the first paragraph.
https://www.hardocp.com/news/2017/09/22/rx_vega_multigpu_support_here

I'd assume that means that it is a driver that might address issues in Project Cars? I could very well be wrong though as I tend to overthink things at times.

In F1 2017 the 1080 got 88 fps and the Vega 64 got 82 @1440p. Big whoop?

ARK never ran well on Nvidia's hardware and is sponsored by them. I got it in a typical large game bundle from Humble Bundle for $12 and feel cheated to this very day. The developers were supposed to release a DX12 patch a couple of years ago. Did it ever materialize?

ReCore ran fine on my RX 480 @1080p. I could fire it up on my Vega 64 @1440p to see if it is just as smooth.

Divinity 2 is a turn-based game where the attraction is reading the flavor text and listening to the impeccable voice acting. Getting 86 fps on a Vega 64 is acceptable to me for a turn-based text/speech adventure. I have never uttered these words before, but you could play this game at 30 fps and still enjoy it, like Civilization V. Damn, I need to take a bath... 30 fps... Eww!

Dishonored 2 was riddled with performance issues at launch. It was so bad that I skipped the game. Looks like AMD forgot about it too, as performance across their product stack is bunched too close together in the @1440p test.
 
They talk about mGPU support, and yet that review doesn't seem to have mGPU working in Project Cars 2 for Vega; I mentioned that. If it had a 30% increase in performance, you know they would mention it, since that is a huge increase.

That performance increase, added to the fact that mGPU failed for them in Project Cars 2, suggests either they did something wrong or something isn't working right in those drivers...

I would ask them to verify whether SMAA was active when running those tests. That performance difference looks like the 4x MSAA hit on AMD cards (which uses around the same bandwidth as SMAA) when the bandwidth bottleneck is hit.
 
http://www.guru3d.com/news-story/amd-might-replace-rx-500-cards-with-rx-vega-32-and-28.html

While Vega 64 is, well, garbage, the 56 is OK (at MSRP), so I guess Vega 32 and 28 could compete about as well as Polaris did. Not looking forward to the pricing if they use HBM.

HBM is only as expensive as the value that the consumer places upon it. If an iPhone didn't sell well then the price would be dropped to meet consumer demand. If miners are waiting in the wings for those cards then they will cost more at retail even though they have a different MSRP. I bet those cut down cards will cost the same MSRP as the ones they replace. :) The GPU industry knows their pricing brackets.
 
HBM is only as expensive as the value that the consumer places upon it. If an iPhone didn't sell well then the price would be dropped to meet consumer demand. If miners are waiting in the wings for those cards then they will cost more at retail even though they have a different MSRP. I bet those cut down cards will cost the same MSRP as the ones they replace. :) The GPU industry knows their pricing brackets.


RAM is one of the components that is priced strictly on supply and demand. But right now supply is constrained, which is why DRAM prices are going up; since last year DRAM prices have gone up over 100%, and predictions are they will go up another 40%... So yeah, right now the GPU industry knows cards are going to go up in price due to this, or they will have to eat their margins.
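Taking those cited figures at face value as a quick sanity check: memory that cost $100 a year ago would be over $200 today, and another 40% on top of that puts it around $280, nearly triple in two years. That kind of swing doesn't leave much room inside a fixed MSRP.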
 
The video was done using the latest drivers, and AMD made no mention of any specific FPS improvements for Project Cars 2 or any other game, unlike PUBG, where they announced it loud and clear.

How do you know that video was on the latest drivers? Just because you said so? I think I will see what Brent comes up with; there is more to it than average numbers in the end. I think folks will figure things out and get something that will work for them.

As for what I would recommend if one is shopping in the $600+ range and only looking at a single card or whole-system scenario: hands down, the 1080 Ti. With that said, when your system has a FreeSync monitor, things start to become blurrier, just like if you have a G-Sync monitor, why would you buy AMD? If you have a great FreeSync monitor, the question then becomes why would you buy Nvidia? Syncing monitors make a huge difference in the gaming experience; I would say a bigger positive difference than the disparity in average frame rates between a 1080 and a 1080 Ti. Folks just need to figure out what they want in the end.

Nvidia could very easily support Adaptive-Sync monitors, but they choose not to, and if you want a syncing monitor with your Nvidia card you will have to pay Nvidia extra for it. AMD cannot make their cards G-Sync capable.
 
How do you know that video was on the latest drivers? Just because you said so? I think I will see what Brent comes up with; there is more to it than average numbers in the end. I think folks will figure things out and get something that will work for them.
Because this channel always does tests using latest drivers.

Here is another test from ComputerBase using the so-called latest drivers; again the Vega 64 is considerably behind the 1080, in fact it's barely faster than a 1070 @1080p.
https://www.computerbase.de/2017-09/project-cars-2-benchmark/2/#diagramm-project-cars-2-1920-1080
 

So rather than contend that the Vega is a better product, you instead contend that it doesn't matter that it's worse because you, subjectively, can't really tell the difference...?? Even giving a game-by-game explanation of why it being a worse product doesn't subjectively matter to you.......

In that case why would you be buying a Vega at all? Buy something two generations old that plays whatever you play well enough for you. At least then you could make the valid argument that you're saving money.

:cautious:

Using "it's worse but that's okay to me" to justify paying full-dollar for a current generation product seems intellectually questionable.
 
No, it's called brand loyalty no matter what.
 
Eh, it's their money; they can spend it how they want. What is intellectually dishonest is recommending solutions based on brand that are not in the best interest of the person who's asking for opinions. Thankfully, very few people around here tend to push a dishonest opinion that is solely based on brand loyalty.
 
Here's a compilation of slides and info from Videocardz including the above.

https://videocardz.com/72934/rumor-amd-matisse-picasso-vega-20-and-ryzen-5-pro-mobile
I looked at the 4K results: Cloudy, the 1080 won; Thunderstorm, the 64 won. Still, the differences are so small it would not really make a difference either way. Also remember that in the video, ReLive was being used to record, which is less efficient (takes more performance away) than ShadowPlay on Nvidia cards. It would be nice to know that difference as well. I will wait until Brent gets a feel for this game with video cards, if HardOCP does a review. His process is so superior to what I would call the half-assed ones that permeate the internet.

As far as I am concerned, the Vega 64 should have been clipping at the 1080 Ti's toes and not sitting at 1080 levels, depending upon the game and settings.
 
As far as I am concerned, the Vega 64 should have been clipping at the 1080 Ti's toes and not sitting at 1080 levels, depending upon the game and settings.
Which never happens though, except for a game or two.
I looked at the 4K results: Cloudy, the 1080 won; Thunderstorm, the 64 won
Yep, Fury X behavior; Vega lost at all other resolutions though, by a big margin.
 
I almost got Project Cars 2 but decided to wait, mostly for VR use, and my 1080 Ti system is lacking a CPU cooler at the moment. I could see how it runs on Vega 64. As a side note, Vega 64 runs Serious Sam VR: The Last Hope flawlessly on Ultra settings; I never saw it go into reprojection with stock driver settings. I may do a video of the gameplay with the SteamVR frame-time graph running. The game is a blast and really came into its own with all the newest and final additions.
 
Here is another game benchmark using latest drivers, and per usual, 1080 is dominating Vega 64 here
https://www.purepc.pl/karty_graficz...jnosci_kart_graficznych_i_procesorow?page=0,7

Here are the newest Forza Motorsport 7 benchmarks. noko, this might make you happy. Personally, I hope that Nvidia releases a new driver or works with the developer to get more performance out of Pascal. The 1080 Ti shouldn't be that far behind in performance running on "Game Ready" drivers released specifically for the game.

As far as I am concerned, the Vega 64 should have been clipping at the 1080 Ti's toes and not sitting at 1080 levels, depending upon the game and settings.

Forza 7 Benchmark: Vega has more gasoline in the blood than Pascal.
https://www.computerbase.de/2017-09/forza-7-benchmark/2/#diagramm-forza-7-1920-1080

[Attached: Forza 7 benchmark screenshots, 2017-09-29]
 
Nice results for Vega in DX12, but I agree Pascal should be performing better. Too bad there is no DX11 version of this game.
 
What in the world is up with Fury X at 4k? lol
 

Interesting, frame times are out of whack on nV cards; CPU overhead seems to be the culprit whenever that happens.
 
that has to be a first... NVidia moving up as resolution increases.

I love the dynamic settings they use to maintain framerate. I wish an RPG or shooter did this, just to see if it's a good idea outside of racing games.
 
This is unrelated to Vega. As the 580, 390 and 380 show the same big lead, this game could be favoring AMD hardware just like Dirt 4.

I think it is more Nvidia DX12 growing pains personally. I don't write drivers so it is pointless for me to speculate other than to say that I hope it gets fixed so that everyone can enjoy the game. ;)
 
that has to be a first... NVidia moving up as resolution increases.

I love the dynamic settings they use to maintain framerate. I wish an RPG or shooter did this, just to see if it's a good idea outside of racing games.


If what is happening is a CPU limitation, that will happen. Also, comparing games between Pascal and Vega, this is happening pretty much all the time now.
 
that has to be a first... NVidia moving up as resolution increases.
That's probably just fillrate becoming more of an issue at higher resolutions. I'd say there's a fair chance this is the rapid packed math starting to show. Forza 7 being a console port, they would have used FP16 to reduce register pressure. That benefits Polaris and Pascal as well. In addition to async with DX12 as that's typical of consoles. Moved over to PC, the compiler automatically starts packing in the case of Vega. Result being the cores idling as they slam into another bottleneck and use less power in the process. One of the videos I saw had Vega running consistently around 1590Mhz. No idea on the cooling or voltages, but RPM will have the effect of reducing power consumption when used. Resulting in clocks for bottlenecked parts increasing as well as overall performance. These results would seem typical for Vega in modern games that used the latest shader compilers. Might work for older games, but a game patch or shader replacement would be required. DSBR and primitive shaders likely aren't enabled either, but compute based culling can marginalize the primitive shaders.
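Purely as a conceptual sketch of what packed FP16 ("rapid packed math") means, not a claim about how this particular game or driver uses it: two half-precision values share one 32-bit register slot, so a single packed instruction (e.g. Vega's v_pk_add_f16) performs two FP16 operations at once. The numpy snippet below is only a hypothetical emulation of the lane-wise behavior:

```python
import numpy as np

# Two FP16 operands share one 32-bit register slot; a packed add handles both lanes at once.
a = np.array([1.5, 2.25], dtype=np.float16)
b = np.array([0.5, 4.0],  dtype=np.float16)

packed_a = a.view(np.uint32)[0]   # both halves occupy a single 32-bit word
packed_b = b.view(np.uint32)[0]

print(hex(packed_a), hex(packed_b))  # the raw packed bit patterns
print(a + b)                         # [2.0, 6.25] -- the lane-wise result a v_pk_add_f16-style op produces
```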
 
That's probably just fillrate becoming more of an issue at higher resolutions. I'd say there's a fair chance this is the rapid packed math starting to show. Forza 7 being a console port, they would have used FP16 to reduce register pressure. That benefits Polaris and Pascal as well. In addition to async with DX12 as that's typical of consoles. Moved over to PC, the compiler automatically starts packing in the case of Vega. Result being the cores idling as they slam into another bottleneck and use less power in the process. One of the videos I saw had Vega running consistently around 1590Mhz. No idea on the cooling or voltages, but RPM will have the effect of reducing power consumption when used. Resulting in clocks for bottlenecked parts increasing as well as overall performance. These results would seem typical for Vega in modern games that used the latest shader compilers. Might work for older games, but a game patch or shader replacement would be required. DSBR and primitive shaders likely aren't enabled either, but compute based culling can marginalize the primitive shaders.


Pretty sure it doesn't use RPM; I was looking at the cfg in the demo and haven't seen anything about RPM. Not only that, the scaling from low-end Polaris all the way up to Vega doesn't show what you are saying.

Also, using AMD's shader intrinsics has a direct correlation to draw calls and CPU usage at a per-core level. I am pretty certain that one or two cores are getting pushed to their limits when using nV hardware, instead of the load being spread evenly across more cores, which was fairly common in early DX12 games too.

And if you look at performance between the Xbox One S and Xbox One X, this game shows deltas similar to other games, so RPM is not being used.
 
If what is happening is a CPU limitation, that will happen. Also, comparing games between Pascal and Vega, this is happening pretty much all the time now.
Actually, on the way home I realized it likely wouldn't help with CPU restrictions, a la Skyrim and such. Although Fallout 4 has an adaptive shadow-distance feature meant to maintain frame rate in Concord and Boston, where shadows hit very hard. Seems to work.
 
Actually, on the way home I realized it likely wouldn't help with CPU restrictions, a la Skyrim and such. Although Fallout 4 has an adaptive shadow-distance feature meant to maintain frame rate in Concord and Boston, where shadows hit very hard. Seems to work.


Sorry, I didn't mean CPU limitation for all games that show a bigger performance difference between Pascal and Vega at higher resolutions; bad grammar on my part, I should have separated the two into paragraphs.

Forza seems to be the only game that is CPU-limited on nV hardware, which older DX12 games also showed, specifically AMD-sponsored titles (Hitman was notorious for this, as was AOTS; both have now been fixed through driver updates). Now, I don't think it's something on AMD's side telling devs to do things a certain way; it's just the way intrinsic shaders are set up. If you write intrinsic shaders for AMD hardware and port them over to nV hardware, it's not straightforward: the same or similar shader will abuse draw calls on nV hardware by not using all cores, and this comes down to drivers. The hardware is inherently different, and this is why code has to be tailored per hardware.

Having said that, I still expect Vega to make some gains anyhow, but in this specific game it's not Vega looking good; it's more likely Pascal just looking bad by being held back.

Back to resolution changes and performance differences: this has more to do with fillrates (pixel and texture). Vega's fillrates didn't increase from Fiji as much as Pascal's did from Maxwell, and raw shader capability also didn't increase as much for Vega as it did from Maxwell to Pascal, percentage-wise. Where Pascal was able to shift the pixel shader bottleneck, Vega's bottlenecks shifted toward parts that traditionally scale up with a node shrink, which didn't happen with Vega. This is probably due to the die size being so large; after decoupling all the units, they could only do so much.
 