Async compute gets a 30% increase in performance. Maxwell doesn't support async.

Discussion in 'Video Cards' started by StormClaw, Aug 30, 2015.

  1. Mboner1

    Mboner1 n00b

    Messages:
    18
    Joined:
    May 10, 2015
    This is why I stick with AMD... not the async thing, but the Nvidia fan base jumping to their defense and talking about market share and all other irrelevant types of nonsense.

    Nvidia have proven time and time again they will do whatever it takes, and Nvidia users just lap it up, lol, it's funny. I will just stick with my trusty R9 290 for now, which just gets better and better with age (how about them dodgy AMD drivers, lol). The driver thing is hilarious as well: if a game runs badly on an Nvidia card it's automatically "crap devs, needs optimizations, bad console port, ra ra ra", lol, and if it runs badly on AMD the same people say "blame the AMD drivers", lol. Meanwhile I have literally had no issues in my two years with my R9 290, while I have a mate with a Titan and a mate with a 770 who have been unable to play numerous games unless they have "GeForce driver 3xx.xx" installed, lol.

    Oh, and have Nvidia fixed full RGB over HDMI for a properly calibrated display yet? It's only been a problem for like a decade, lol. Anyway... this is why I stick with AMD.
     
  2. PRIME1

    PRIME1 2[H]4U

    Messages:
    3,942
    Joined:
    Feb 4, 2004
    That's a strange reason to buy computer hardware.
     
  3. Mboner1

    Mboner1 n00b

    Messages:
    18
    Joined:
    May 10, 2015
    Not really; I held off buying a PS4 for the same reason, lol. Can't stand the users, but I eventually succumbed. In this day and age of YouTube videos and forums, I would rather comment and talk to people who aren't just like "ps4 ruuulez ddr5 yo! Xbones rez suucks our hardware is so much better", lol. Same principle here, lol.
     
  4. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

    Why don't you just look at the GPU/CPU capabilities? It's pretty easy to see the PS4 is more capable from a GPU standpoint...

    Now, whether it will be fully utilized is another matter; if you want to wait and see that, I can understand.
     
  5. Mboner1

    Mboner1 n00b

    Messages:
    18
    Joined:
    May 10, 2015
    Well, at the risk of going off topic: I own both, and yeah, the PS4 looks better, but in terms of actual hardware it's hardly night and day like PS4 users would have everyone believe. I also find the PS4 prioritizes getting to 1080p even if that means sacrificing smooth gameplay, whereas the Xbone generally drops the resolution gladly to maintain a truer 60 fps.
     
  6. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    Interesting, I didn't know that. I've never been much of a console gamer, but what I do like about consoles is that the later in life they get, the more performance devs are able to extract out of them. Pretty cool.
     
  7. staknhalo

    staknhalo [H]ard|Gawd

    Messages:
    1,205
    Joined:
    Jun 11, 2007
    OMFG please tell me this is true lol
     
  8. trick0502

    trick0502 [H]ardness Supreme

    Messages:
    5,134
    Joined:
    Apr 17, 2006
    genmay rules, I want a pic
     
  9. polydiol

    polydiol [H]ard|Gawd

    Messages:
    1,327
    Joined:
    Feb 3, 2004
    From what I understand, this is something that's going to take quite a while to figure out. Not ideal press for Nvidia, and ideal for AMD, as they really need something to make up for the perceived performance gap after all the unkind reviews surrounding their latest video cards.
     
  10. tybert7

    tybert7 2[H]4U

    Messages:
    2,633
    Joined:
    Aug 23, 2007
    I wonder if the async compute issues were related to the delay of the ARK: Survival Evolved DX12 patch. Did Nvidia ask for a delay to tone down async compute? Did AMD ask for a delay to make sure they were included in the way the game was rendered?


    Who knows; if only I could transform into a fly on the wall at those hidden meetings.
     
  11. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    Well, we can throw preemption and context switching out the window; those two are used together and they will actually increase latency, not reduce it. I have been reading up a little more on the topic. So what AMD's Hallock alluded to when he mentioned context switching is that nV is using this method, which isn't right. We can see that with GPUView and the data from the small test program; it would be easy to spot. The latency doesn't have enough spikes for that, and it wouldn't be a step-like plot; it would be more erratic, almost like an EKG.
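    If anyone wants to see the shape of that argument, here's a rough CUDA sketch of the same timing idea. To be clear, this is my own analogue, not the actual D3D12 test program; the kernel and its busy-loop workload are made up for illustration. The point is just that two kernels issued to two streams finish in roughly the longer of the two times if the GPU runs them concurrently, and in roughly the sum if it has to serialize or context-switch:

    // Minimal sketch: does the GPU overlap work from two streams, or serialize it?
    // Build with nvcc. The kernel is a stand-in workload, not a real graphics job.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void spin(float *out, int iters) {
        float v = (float)threadIdx.x;
        for (int i = 0; i < iters; ++i)
            v = v * 1.000001f + 0.5f;          // pointless math to burn time
        out[threadIdx.x] = v;                  // write so the loop isn't optimized away
    }

    // Launch one kernel on each of the two given streams; return wall time in ms.
    static float timed_run(cudaStream_t sA, cudaStream_t sB,
                           float *a, float *b, int iters) {
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaEventRecord(start);                 // legacy default stream: acts as a fence
        spin<<<1, 256, 0, sA>>>(a, iters);      // "graphics" stand-in
        spin<<<1, 256, 0, sB>>>(b, iters);      // "async compute" stand-in
        cudaEventRecord(stop);                  // waits for both streams to drain
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        return ms;
    }

    int main() {
        float *a, *b;
        cudaMalloc(&a, 256 * sizeof(float));
        cudaMalloc(&b, 256 * sizeof(float));
        cudaStream_t s1, s2;
        cudaStreamCreate(&s1);
        cudaStreamCreate(&s2);
        int iters = 1 << 22;
        float serial  = timed_run(s1, s1, a, b, iters);  // same stream: forced serial
        float overlap = timed_run(s1, s2, a, b, iters);  // two streams: may overlap
        printf("same stream: %.2f ms, two streams: %.2f ms\n", serial, overlap);
        // two streams ~= same stream     -> serialized submissions
        // two streams ~= same stream / 2 -> truly concurrent execution
        cudaFree(a);
        cudaFree(b);
        cudaStreamDestroy(s1);
        cudaStreamDestroy(s2);
        return 0;
    }

    The actual D3D12 test is fancier (as I understand it, it sweeps the number of queued compute kernels and plots the latency of each run, which is where the step-like vs. erratic shapes come from), but the concurrency question it answers is the same.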
     
  12. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    http://www.guru3d.com/articles_pages/powercolor_devil_radeon_r9_390x_review,23.html
    AotS, 980 Ti % over 390X (@ 1440p):
    Heavy: 32%
    Medium: 22%
    Normal: 21%
    Average: 24%

    https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/30.html
    DX11, 980 Ti % over 390X (@ 1440p):
    Average: 27%

    Even going back through Guru3D's own benchmarks, the 980 Ti is around 30% faster in GameWorks games, down to 25% in everything else, and lower in GE games: 9% in Hitman, 26% in Tomb Raider, 20% in Thief, 15% in Hardline. Keep in mind both of these benchmarks compare a 390X @ 1100/6100 to a stock reference 980 Ti.
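    (For anyone checking the math, "% over" is just the relative average: percent over = (FPS_980Ti / FPS_390X - 1) x 100. With made-up numbers, 90 fps on the 980 Ti against 72 fps on the 390X works out to 90/72 - 1 = 25% over.)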

    If I roll my eyes any harder they will pop out of my skull.
     
    Last edited: Sep 7, 2015
  13. Factum

    Factum [H]ard|Gawd

    Messages:
    1,863
    Joined:
    Dec 24, 2014
    Mahigan:
    Any validity to the claim that you are a former ATi employee?
     
  14. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    If this is about conflict of interest, who cares? "ATi" would be 10+ years ago.
    Even if he's an AMD employee right now, it still wouldn't matter. Info is info.

    If nothing else it boosts his credibility. I'd trust someone who has worked in the industry for many years rather than a random forum user.

    Edit: edited for clarity.
     
  15. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,811
    Joined:
    Nov 5, 2010
    Since Nvidia bought ARK as a GameWorks title, it's probably safe to say any delays are caused by Nvidia and have nothing to do with AMD.


    Edit: also, since we're not hearing anything about this, the devs are probably WELL muzzled by Nvidia and can't give any updates to the game's player base.
     
    Last edited: Sep 7, 2015
  16. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    Last edited: Sep 7, 2015
  17. primetime

    primetime [H]ardness Supreme

    Messages:
    5,738
    Joined:
    Aug 17, 2005
    Wow, isn't that something... I did not know a game could be both "GameWorks" and "Gaming Evolved" at the same time. Of course, considering they were selling the game while still in alpha status, I believe they needed all the money they could get. :D It's one of the few games my kid was trying to get me to buy for him... lol. I told him he had to wait till the game was finished because I'm not paying for pre-release BS.
     
  18. Remon

    Remon Limp Gawd

    Messages:
    352
    Joined:
    Jan 8, 2014
    Where did you see that it's a "Gaming Evolved" game? It's an Nvidia game. That Raptr page you linked is the list of games Raptr can optimize. Ffs, there's even Borderlands and Assassin's Creed in there.

    The ARK DX12 patch is clearly delayed because of Nvidia.
     
  19. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    AMD keeps a list on their website:
    http://www.amd.com/en-us/markets/game/featured

    If it were another, more established studio... or not utilizing UE4... I might agree with you. But right now it seems much more likely that it's the devs' own incompetence. ARK is pretty infamous for its shit optimization, so now imagine the same team of programmers trying to adopt a brand new API. If anything, they're probably relying on Nvidia to do most of the work.
     
    Last edited: Sep 7, 2015
  20. Remon

    Remon Limp Gawd

    Messages:
    352
    Joined:
    Jan 8, 2014
    And Ark isn't there.
     
  21. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    ARK is not a GE game. AMD doesn't have anything to do with it.
     
  22. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,811
    Joined:
    Nov 5, 2010
    Maybe I'm just not seeing the notice, but I didn't find anything indicating that ARK is a Gaming Evolved marketed title.

    Raptr has game settings for all kinds of popular games; it is completely separate from the specific AMD Gaming Evolved marketed games.
     
  23. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    Going off-topic slightly.

    "The vast majority of DX12 titles in 2015/2016 are partnering with AMD"
    http://www.overclock3d.net/articles...titles_in_2015_2016_are_partnering_with_amd/1

    You can skip the article because someone made a convenient list:
    https://www.reddit.com/r/pcgaming/c...t_majority_of_dx12_titles_in_20152016/cutgmrb

    Condensed:
     
  24. KickAssCop

    KickAssCop [H]ardness Supreme

    Messages:
    6,561
    Joined:
    Mar 19, 2003
    Typically, AMD-affiliated games run better on Nvidia hardware after a month or so of release, so it's not a big deal really.
     
  25. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    Yeah, you are right, I was too lazy to read :)
     
  26. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    AMD games run well on everything because the developers aren't in a position to cripple the majority of their players, who own Nvidia GPUs. AMD does usually get a slight advantage in their own games... TressFX was a mess when Tomb Raider first launched. If nothing else, it means Nvidia can't throw money at the problem to fuck AMD over.

    If Pascal does beat out Greenland on that crop of DX12 games, then it's really bad news for AMD. If AMD can't win in their own suite of Gaming Evolved games, then they have no hope for the future. So I guess what we're seeing here is AMD's golden opportunity. Let's see how they fuck it up this time.
     
  27. §kynet

    §kynet Limp Gawd

    Messages:
    396
    Joined:
    Jun 11, 2003
    Someone tell that guy it's not na-vid-e-a but N-vid-e-a.
     
  28. socK

    socK 2[H]4U

    Messages:
    3,658
    Joined:
    Jan 25, 2004
    Or it could just be because DX12 in UE4 has a gigantic EXPERIMENTAL warning plastered all over it and may or may not blow up if you look at it funny, because it's still early as fuck in development.

    I have an AMD card, and just launching the editor in DX12 is enough to take down the graphics driver in flames... though launching straight into a game does work.
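    (If anyone wants to poke at it themselves: as far as I know, the experimental DX12 RHI in UE 4.9 is opt-in from the command line, something like "UE4Editor.exe YourProject.uproject -dx12", where YourProject.uproject is whatever project you're testing.)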
     
  29. Mahigan

    Mahigan Limp Gawd

    Messages:
    142
    Joined:
    Aug 25, 2015
    That last statement... so true LOL
     
  30. Crosshairs

    Crosshairs Administrator Staff Member

    Messages:
    23,599
    Joined:
    Feb 3, 2004
    Keep it on topic and lay off the personal attacks... everyone who had a post deleted caught a break... next time, bans and infractions will be handed out... enough already.
     
  31. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,790
    Joined:
    Jul 29, 2009
    But no one wins when developers cripple their product; it is just very short-sighted. In the long run, people who pay $50-$60 for a game are not going to spend $1,000 to play it because their $500 video card doesn't work well with some trivial eye candy. Most of them will turn features off or down, and probably not with the best feelings about the money they spent on the product.

    Greenland vs. Pascal does not have to mean anything. In general, most engines using DX12 probably won't push it so hard that gamers need an 8-core CPU to play at medium settings... There is also something we have already noticed: when new video cards come out, older code has to be revised by the game developer if the hardware is different from the previous generation.
     
  32. Flopper

    Flopper [H]ard|Gawd

    Messages:
    1,642
    Joined:
    Nov 15, 2007
    It's bad business practice from Nvidia, and people defend it?
    It never stops amazing me how people can do that.
     
  33. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,415
    Joined:
    Aug 5, 2013
    Tell AMD to sell more video cards and developers will care about optimizing for their hardware. Nvidia doesn't have to do anything but wave their market share figures around and devs will fall in line. It's not bad practice from Nvidia, it's just common sense from developers.

    The rest of their middleware bullshit just comes from availability and ease of use. If you were a game dev would you rather spend time and resources implementing TressFX on your own, or go to Nvidia and have them provide the hardware, the HairWorks black box, and the man hours free of charge? The only reason Nvidia gets away with their sabotage bullshit is because GameWorks is so appealing. It's like a Trojan Horse designed to fuck with AMD hardware.

    Devs don't care about open source; they just want the graphical features as easily and cheaply as possible. As soon as AMD creates a Gaming Evolved task force that goes around implementing TressFX for devs, they'll start using it. And they need to actually expand their middleware. Did TressFX 2.0 or 3.0 ever reach the market? It's been like 2 years. What's going on with that?

    Blame AMD for not being more aggressive with their proprietary tech.
    Blame devs for being lazy.
    Blame Nvidia for being shitheads with their implementations (tessellation).

    Nobody is defending Nvidia (well some people are), we're just dealing with the reality of the situation. Don't cry and tell us how mean the world is. :rolleyes:
     
  34. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    I don't think all devs are lazy, some possibly, but it comes down to money and time.

    [image: diagram of the competing factors in product development]

    These are what you have to gauge when making a product. All of these factors are equally important in relation to one another.
     
  35. trandoanhung1991

    trandoanhung1991 [H]ard|Gawd

    Messages:
    1,096
    Joined:
    Aug 26, 2011
  36. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,662
    Joined:
    Oct 4, 2007
    I decided to save this post in my email because it describes best why I stick with AMD as well. My trusty reference XFX R9 290 plays great at 4K, but when future games need more than it can handle, I will upgrade. I already have a 4K monitor, and at home it is all I want and need.

    I have to admit I miss the days when a game like Crysis would push the available hardware beyond what it could handle. That is why I went with 4K: now I am future-proofed at a fantastic resolution that will remain the goal as hardware improves. The only reason I would go with Nvidia now would be if AMD were no longer available. Otherwise, it is AMD all the way.
     
  37. looncraz

    looncraz n00b

    Messages:
    28
    Joined:
    Nov 19, 2009
    Market share actually doesn't matter as much as you think; it's all about the installed user base.

    AMD is not as far behind as it may seem, though they are certainly slipping badly.

    From the Steam hardware survey, it would appear they have about 25 to 30% of all graphics cards in current use. That's a very hefty percentage.

    However, quite critically, they have the majority of DirectX 12-capable cards out in the wild. In fact, MOST people who will be using DirectX 12 will be using AMD graphics cards; everything since the 7000 series has had DirectX 12 support, after all. These cards will potentially be significantly faster in new games than their vintage nVidia counterparts, which means AMD's attrition rate will decline for all 7000-series+ cards (though this effect is a year+ away...).

    If AMD does well with their Greenland GPUs, or if nVidia slips up, AMD will be poised to regain market share, which will double down on the DirectX 12 platform.

    And of course, I'm completely excluding all the consoles out there. All current-gen consoles are running AMD SoCs with GCN graphics, and the Xbone will be getting DirectX 12 to boot, which should finally make performance translate more properly.
     
  38. Flopper

    Flopper [H]ard|Gawd

    Messages:
    1,642
    Joined:
    Nov 15, 2007
    Their design went with a flexible DX12 approach, which has been their plan for a long time:
    console wins.
    Mantle, err, now Windows 10 DX12.
    async shader hardware support.
    You will find AMD the better option for DX12, and after all, don't we buy stuff for the future?

    It's an issue with their marketing, though; AMD can do better, much better.
     
  39. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
    Only if those features are usable, and if DX12 games come out that push those features enough to become bottlenecks.

    It's not all marketing that is killing AMD; thinking that is being naive. Marketing and advertising in a saturated marketplace only work with viable products.
     
  40. PRIME1

    PRIME1 2[H]4U

    Messages:
    3,942
    Joined:
    Feb 4, 2004
    Actually, they have the fewest by far, even if you ignore their current 18% market share (not sure why you would). On AMD's side only GCN supports DX12, while NVIDIA supports DX12 on Fermi, Kepler, and Maxwell.

    As usual NVIDIA is better at supporting their older cards.