Async compute gets a 30% increase in performance. Maxwell doesn't support async.
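
For context on where a number like that can come from, here's a toy model (every timing below is invented for illustration, not measured on any card) of how overlapping a compute queue with the graphics queue can shave a chunk like that off a frame:

```python
# Toy model of async compute: run per-frame graphics and compute workloads
# back-to-back (no async) vs. with most of the compute hidden under idle
# gaps in the graphics work. All numbers are illustrative only.

graphics_ms = 20.0  # hypothetical graphics work per frame
compute_ms = 6.0    # hypothetical compute work per frame
overlap = 0.9       # assumed fraction of compute that can hide under graphics

serial = graphics_ms + compute_ms
overlapped = graphics_ms + compute_ms * (1.0 - overlap)

print(f"serial:     {serial:.1f} ms/frame ({1000 / serial:.1f} fps)")
print(f"overlapped: {overlapped:.1f} ms/frame ({1000 / overlapped:.1f} fps)")
print(f"speedup:    {serial / overlapped - 1:.0%}")
```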

This is why I stick with AMD... not the async thing, but the Nvidia fan base jumping to their defense and talking about market share and all other irrelevant nonsense.

Nvidia have proven time and time again they will do whatever it takes, and Nvidia users just lap it up, lol, it's funny. I'll just stick with my trusty R9 290 for now, which just gets better and better with age (how about them dodgy AMD drivers, lol). The driver thing is hilarious as well: if a game runs badly on an Nvidia card it's automatically "crap devs, needs optimisation, bad console port", and if it runs badly on AMD the same people say "blame the AMD drivers". Meanwhile, I've had literally no issues in my two years with my R9 290, while I have a mate with a Titan and a mate with a 770 who have been unable to play numerous games unless they have "GeForce driver 3xx.xx" installed.

Oh, and has Nvidia fixed full RGB over HDMI for a properly calibrated display yet? It's only been a problem for, like, a decade. Anyway... this is why I stick with AMD.
 
That's a strange reason to buy computer hardware.
 
Not really; I held off buying a PS4 for the same reason, lol. Can't stand the users, but I eventually succumbed. In this day and age of YouTube videos and forums, I'd rather comment and talk to people who aren't just going "PS4 rules, GDDR5 yo! Xbone's res sucks, our hardware is so much better". Same principle here.
 
Why don't you just look at the GPU/CPU capabilities? It's pretty easy to see the PS4 is more capable from a GPU standpoint.

Whether it will be fully utilized is another matter; if you want to wait and see on that, I can understand.
 

Well, at the risk of going off topic: I own both, and yeah, the PS4 looks better, but in terms of actual hardware it's hardly the night-and-day difference PS4 users would have everyone believe. I also find the PS4 prioritises hitting 1080p even if that means sacrificing smooth gameplay, whereas the Xbone generally drops the resolution to maintain a truer 60 fps.
 
Interesting, didn't know that. I've never been much of a console gamer, but what I do like about consoles is that the later in their lifecycle they get, the more performance devs are able to extract from them; it's pretty cool.
 
My wife has been super supportive the entire time. Making me sandwiches and bringing me coffee as I worked tirelessly to obtain a response from nVIDIA.

OMFG please tell me this is true lol
 
From what I understand, this is something that's going to take quite a while to figure out. It's not ideal press for Nvidia, and it's ideal for AMD, as they really need something to make up for the perceived performance gap after all the unkind reviews surrounding the latest AMD video cards.
 
I wonder if the async compute issues were related to the delay of the Ark: Survival Evolved DX12 patch. Did Nvidia ask for a delay to tone down async compute? Did AMD ask for a delay to make sure they were accounted for in the way the game was rendered?


Who knows; if only I could be a fly on the wall in those closed-door meetings.
 
Well, we can throw preemption and context switching out the window; the two are used together, and they actually increase latency rather than reduce it. I've been reading up a little more on the topic. What AMD's Hallock alluded to when he mentioned context switching is that nVidia is using this method, but that isn't right. We can see as much with GPUView and the data from the small test program; it would be easy to spot, and the latency doesn't have enough spikes for that. With context switching it wouldn't be a step-like plot; it would be more erratic, almost like an EKG.
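
To put that reasoning in concrete terms, here's a rough sketch (my own, with arbitrary thresholds; not the actual test program) of how you'd tell the two patterns apart from a per-kernel latency trace:

```python
# Rough sketch: classify a per-kernel latency trace. Context switching
# between queues would inject frequent spikes (erratic, EKG-like), while
# the published data shows long flat plateaus with rare jumps (step-like).
# The 1 ms jump size and 5% ratio cutoff are arbitrary illustrations.

def classify(latencies_ms, jump_threshold_ms=1.0, max_jump_ratio=0.05):
    jumps = sum(
        1 for a, b in zip(latencies_ms, latencies_ms[1:])
        if abs(b - a) > jump_threshold_ms
    )
    ratio = jumps / max(len(latencies_ms) - 1, 1)
    return "step-like" if ratio < max_jump_ratio else "erratic (EKG-like)"

step_like = [5.0] * 30 + [10.0] * 30 + [15.0] * 30  # plateaus, only 2 jumps
ekg_like = [5.0 + 4.0 * (i % 2) for i in range(90)]  # constant up/down swings

print(classify(step_like))  # -> step-like
print(classify(ekg_like))   # -> erratic (EKG-like)
```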
 
http://www.guru3d.com/articles_pages/powercolor_devil_radeon_r9_390x_review,23.html
AotS, 980 Ti % over 390X (@ 1440p):
Heavy: 32%
Medium: 22%
Normal: 21%
Average: 24%

https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/30.html
DX11, 980 Ti % over 390X (@ 1440p):
Average: 27%

Even going back through Guru3D's own benchmarks, the 980 Ti is around 30% faster in GameWorks games and down to 25% in everything else, and lower still in Gaming Evolved titles: 9% in Hitman, 26% in Tomb Raider, 20% in Thief, 15% in Hardline. Keep in mind both of these benchmarks compare a 390X @ 1100/6100 to a stock reference 980 Ti.
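
For anyone checking the math, the "% over" figures are just the ratio of the two cards' average fps minus one; a quick sketch with made-up fps values (not numbers from either review):

```python
# How the "% over" figures are derived: card A's average fps over card B's,
# minus one. The fps values below are hypothetical stand-ins, not review data.

def pct_over(fps_a: float, fps_b: float) -> float:
    """Percentage by which card A outperforms card B."""
    return (fps_a / fps_b - 1.0) * 100.0

print(f"{pct_over(62.0, 50.0):.0f}%")  # e.g. 62 fps vs 50 fps -> 24%
```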

The 390X should not even be close to this card, yet it is.

If I roll my eyes any harder they will pop out of my skull.
 
Mahigan:
Any validity to the claim that you are a former ATi employee?
 
If this is about conflict of interest, who cares? "ATi" would be 10+ years ago.
Even if he's an AMD employee right now, it still wouldn't matter. Info is info.

If nothing else it boosts his credibility. I'd trust someone who has worked in the industry for many years rather than a random forum user.

Edit: edited for clarity.
 
I wonder if the async compute issues were related to the delay of the Ark: Survival Evolved DX12 patch. Did Nvidia ask for a delay to tone down async compute? Did AMD ask for a delay to make sure they were accounted for in the way the game was rendered?


Who knows; if only I could be a fly on the wall in those closed-door meetings.

Since Nvidia bought Ark as a GameWorks title, it's probably safe to say any delays are caused by Nvidia and have nothing to do with AMD.


Edit: also, given that we're not hearing anything about this, the devs are probably WELL muzzled by Nvidia and can't give any updates to the game's player base.
 
Err, there was a post on Steam about this today, a few minutes ago.

http://steamcommunity.com/app/346110/discussions/0/594820656447032287/

http://raptr.com/amd/games

Ark: Survival Evolved is an AMD Gaming Evolved game too, by the way.

Wow, isn't that something... I did not know a game could be both "GameWorks" and "Gaming Evolved" at the same time. Of course, considering they were selling the game while still in alpha, I believe they needed all the money they could get. :D It's one of the few games my kid was trying to get me to buy for him... lol. I told him he had to wait till the game was finished, because I'm not paying for pre-release BS.
 
AMD keeps a list on their website:
http://www.amd.com/en-us/markets/game/featured

Ark DX12 is clearly delayed because of Nvidia.
If it were another, more established studio... or one not utilizing UE4... I might agree with you. But right now it seems much more likely it's the devs' own incompetence. ARK is pretty infamous for its shit optimization, so now imagine the same team of programmers trying to adopt a brand-new API. If anything, they're probably relying on Nvidia to do most of the work.
 
Err, there was a post on Steam about this today, a few minutes ago.

http://steamcommunity.com/app/346110/discussions/0/594820656447032287/

http://raptr.com/amd/games

Ark: Survival Evolved is an AMD Gaming Evolved game too, by the way.

Maybe I'm just not seeing the notice, but I didn't find anything indicating ARK is a Gaming Evolved marketed title.

Raptr has game settings for all kinds of popular games; it is completely separate from the specific AMD Gaming Evolved marketed games.
 
Going off-topic slightly.

"The vast majority of DX12 titles in 2015/2016 are partnering with AMD"
http://www.overclock3d.net/articles...titles_in_2015_2016_are_partnering_with_amd/1

You can skip the article because someone made a convenient list:
https://www.reddit.com/r/pcgaming/c...t_majority_of_dx12_titles_in_20152016/cutgmrb

Condensed:
AMD (Hardware Partner):
Deus Ex
Hitman
AotS
Tomb Raider 2016
Battlefront

AMD (Known Affiliation):
Mirror's Edge
Fable: Legends

AMD (assumed based on historical hardware partnerships):

Nvidia (Hardware Partner):
Ark
King of Wushu

Nvidia (Known Affiliation):
Unreal Tournament 4

Nvidia (assumed based on historical hardware partnerships):
Gears of War (Microsoft)

Neutral:
Arma 3 (coming with map pack iirc)
Dayz Standalone
Killer Instinct (Microsoft)
Halo Wars 2 (Microsoft)
Star Citizen

So, totals -----

AMD: 5 hardware partners, 2 known affiliation
Nvidia: 2 hardware partners, 1 known affiliation, 1 historical partnership
Neutral: 5
 
Typically, AMD-affiliated games run better on Nvidia hardware after a month or so of release, so it's not a big deal really.
 
Maybe I'm just not seeing the notice, but I didn't find anything indicating ARK is a Gaming Evolved marketed title.

Raptr has game settings for all kinds of popular games; it is completely separate from the specific AMD Gaming Evolved marketed games.

Yeah, you are right; I was too lazy to read :)
 
Typically, AMD-affiliated games run better on Nvidia hardware after a month or so of release, so it's not a big deal really.
AMD games run well on everything because the developers aren't in a position to cripple the majority of their players, who own Nvidia GPUs. AMD does usually get a slight advantage in their games... TressFX was a mess when Tomb Raider first launched. If nothing else, it means Nvidia can't throw money at the problem to fuck AMD over.

If Pascal does beat out Greenland on that crop of DX12 games then it's really bad news for AMD. If AMD can't win in their own suite of Gaming Evolved games, then they have no hope for the future. So I guess what we're seeing here is AMD's golden opportunity. Let's see how they fuck it up this time.
 
AMD keeps a list on their website:
http://www.amd.com/en-us/markets/game/featured


If it were another, more established studio... or one not utilizing UE4... I might agree with you. But right now it seems much more likely it's the devs' own incompetence. ARK is pretty infamous for its shit optimization, so now imagine the same team of programmers trying to adopt a brand-new API. If anything, they're probably relying on Nvidia to do most of the work.

Or it could just be because DX12 in UE4 has a gigantic EXPERIMENTAL warning plastered all over it, and may or may not blow up if you look at it funny, because it's still early as fuck in development.

I have an AMD card and just launching the editor in DX12 is enough to take down the graphics driver in flames... though just launching straight into a game does work.
 
AMD games run well on everything because the developers aren't in a position to cripple the majority of their players, who own Nvidia GPUs. AMD does usually get a slight advantage in their games... TressFX was a mess when Tomb Raider first launched. If nothing else, it means Nvidia can't throw money at the problem to fuck AMD over.

If Pascal does beat out Greenland on that crop of DX12 games then it's really bad news for AMD. If AMD can't win in their own suite of Gaming Evolved games, then they have no hope for the future. So I guess what we're seeing here is AMD's golden opportunity. Let's see how they fuck it up this time.

That last statement... so true LOL
 
Keep it on topic and lay off the personal attacks... everyone who had a post deleted caught a break... next time, bans and infractions will be handed out. Enough already.
 

But no one wins when developers cripple their product; it's just very short-sighted. In the long run, people paying $50-$60 for a game are not going to spend $1,000 to play it just because their $500 video card doesn't work well with some trivial eye candy. Most of them will turn features off or down, and probably won't feel great about the money they spent on the product.

Greenland vs Pascal does not have to mean anything. In general, most engines using DX12 probably won't push it so hard that gamers need an 8-core CPU to play at medium settings... There is also something we've already noticed: when new video cards come out, older code has to be revised by the game developer if the hardware differs from the previous generation.
 

It's bad business practice from Nvidia, and people defend it?
It never stops amazing me how people can do that.
 
Tell AMD to sell more video cards, and developers will care about optimizing for their hardware. Nvidia doesn't have to do anything but wave their market share figures around and devs will fall in line. It's not bad practice from Nvidia; it's just common sense from developers.

The rest of their middleware bullshit just comes from availability and ease of use. If you were a game dev, would you rather spend time and resources implementing TressFX on your own, or go to Nvidia and have them provide the hardware, the HairWorks black box, and the man-hours free of charge? The only reason Nvidia gets away with their sabotage bullshit is because GameWorks is so appealing. It's like a Trojan horse designed to fuck with AMD hardware.

Devs don't care about open source, they just want the graphical features as easy and cheap as possible. As soon as AMD creates a Gaming Evolved task force that goes around implementing TressFX for devs, they'll start using it. And they need to actually expand their middleware. Did TressFX 2.0 or 3.0 ever reach the market? It's been like 2 years. What's going on with that?

Blame AMD for not being more aggressive with their proprietary tech.
Blame devs for being lazy.
Blame Nvidia for being shitheads with their implementations (tessellation).

Nobody is defending Nvidia (well some people are), we're just dealing with the reality of the situation. Don't cry and tell us how mean the world is. :rolleyes:
 
I don't think all devs are lazy; some, possibly, but it comes down to money and time.

[Image: suggested project management "star model" diagram]


These are the factors you have to weigh when making a product, and they are all equally important in relation to one another.
 
This is why I stick with AMD... not the async thing, but the Nvidia fan base jumping to their defense and talking about market share and all other irrelevant nonsense.

Nvidia have proven time and time again they will do whatever it takes, and Nvidia users just lap it up, lol, it's funny. I'll just stick with my trusty R9 290 for now, which just gets better and better with age (how about them dodgy AMD drivers, lol). The driver thing is hilarious as well: if a game runs badly on an Nvidia card it's automatically "crap devs, needs optimisation, bad console port", and if it runs badly on AMD the same people say "blame the AMD drivers". Meanwhile, I've had literally no issues in my two years with my R9 290, while I have a mate with a Titan and a mate with a 770 who have been unable to play numerous games unless they have "GeForce driver 3xx.xx" installed.

Oh, and has Nvidia fixed full RGB over HDMI for a properly calibrated display yet? It's only been a problem for, like, a decade. Anyway... this is why I stick with AMD.

I decided to save this post in my email because it best describes why I stick with AMD as well. My trusty reference XFX R9 290 plays great at 4K, and when future games need more than it can handle, I will upgrade then. I already have a 4K monitor, and at home it is all I want and need.

I have to admit I miss the days when a game like Crysis would push the available hardware beyond what it could handle. That is why I went with 4K: now I am future-proofed at a fantastic resolution that will be the target as hardware improves. The only reason I would go with Nvidia now would be if AMD were no longer available. Otherwise, it is AMD all the way.
 
Market share actually doesn't matter as much as you think; it's all about the installed user base.

AMD is not as far behind as it may seem, though they are certainly slipping badly.

From the Steam hardware survey, it would appear they have about 25-30% of all graphics cards in current use. That's a very hefty percentage.

However, quite critically, they have the majority of DirectX 12 cards out in the wild. In fact, MOST people who will be using DirectX 12 will be using AMD graphics cards; everything since the 7000 series has had DirectX 12 support, after all. These cards will potentially be significantly faster in new games than their vintage nVidia counterparts, which means AMD's attrition rate will decline for all 7000-series-and-up cards (though this effect is a year+ away...).

If AMD does well with their Greenland GPUs, or if nVidia slips up, AMD will be poised to regain market share, which will double down on the DirectX 12 platform.

And, of course, I'm completely excluding all the consoles out there. All current-gen consoles run AMD SoCs with GCN graphics, and the XBone will be getting DirectX 12 to boot, which should finally make performance translate more properly.
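
How the DX12 installed-base math shakes out depends entirely on which cards you count as DX12-capable. A back-of-envelope sketch (every figure below is a placeholder assumption, not Steam survey data) shows the majority flipping with the definition:

```python
# Illustrative only: share of DX12-capable cards in use = installed share of
# each vendor times the fraction of that vendor's installed cards assumed to
# be DX12-capable. All numbers are placeholder assumptions, not survey data.

installed_share = {"AMD": 0.28, "NVIDIA": 0.57, "Intel/other": 0.15}

scenarios = {
    # If only GCN counts on the AMD side and only recent NVIDIA cards count:
    "narrow DX12 definition": {"AMD": 0.75, "NVIDIA": 0.30, "Intel/other": 0.1},
    # If Fermi/Kepler/Maxwell all count as DX12-capable:
    "broad DX12 definition": {"AMD": 0.75, "NVIDIA": 0.85, "Intel/other": 0.2},
}

for name, dx12_fraction in scenarios.items():
    base = {v: installed_share[v] * dx12_fraction[v] for v in installed_share}
    total = sum(base.values())
    split = ", ".join(f"{v}: {s / total:.0%}" for v, s in base.items())
    print(f"{name}: {split}")
```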
 

Their design went with a flexible DX12 approach, which has been their plan for a long time:
console wins,
Mantle (err, now Windows 10 DX12),
async shader hardware support.
You will find AMD the better option for DX12, and after all, don't we buy stuff for the future?

It's a marketing issue though; AMD can do much, much better there.
 

Only if those features are usable, and only if DX12 games come out that push those features hard enough for them to become bottlenecks.

It's not all marketing that is killing AMD; thinking so is NAIVE. Marketing and advertising in a saturated marketplace only work with viable products.
 

However, quite critically, they have the majority of DirectX 12 cards out in the wild.

Actually, they have the fewest by far, even if you ignore their current 18% market share (not sure why you would). On the AMD side only GCN supports DX12, while NVIDIA supports DX12 on Fermi, Kepler, and Maxwell.

As usual, NVIDIA is better at supporting its older cards.
 