Fable Legends DX12 benchmark

Discussion in 'Video Cards' started by Quartz-1, Sep 24, 2015.

  1. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,810
    Joined:
    Nov 5, 2010
    All different techniques.

    Don't know why you bring up Christmas, but don't care.

    Who knows what kinds of optimizations Johan Andersson and his team are using in their Frostbite engine, but they must be significant. We've just seen the first version of SFR in Civ: BE; it works great and should only get better going forward. As for async and fluidity, I obviously haven't seen the LiquidVR SDK yet, but apparently AMD is using it to reduce movement lag and keep motion-to-photon latency under the 16-20 ms threshold that helps reduce nausea.
     
  2. tybert7

    tybert7 2[H]4U

    Messages:
    2,634
    Joined:
    Aug 23, 2007
    The irony is that you probably switched at the worst possible time, as we're finally nearing the inflection point where the DX12 API will favor AMD, or at least make them much more competitive across the line.
     
  3. PRIME1

    PRIME1 2[H]4U

    Messages:
    3,942
    Joined:
    Feb 4, 2004
    Saying that a card sucks in games now but may or may not suck in the future is just about the worst sales pitch ever.
     
  4. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,211
    Joined:
    Feb 22, 2012
    It's hard not to agree with Prime sometimes.
     
  5. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,146
    Joined:
    Aug 1, 2005
    Dude, that's AMD's and/or their shills' main selling point, and their fans eat it up:

    "Fury X will demolish 980Ti/Titan X, just wait until it's released"
    "Fury X will be an overclockers dream...just wait till we can actually overclock it!"
    "Oops ok so Fury X doesn't overclock like we thought..wait until voltage is properly unlocked! Just wait."
    "AMD will have HBM 2 priority, too bad NVIDIA! Pascal will be late and AMD will be ahead of the game. Just wait"
    "AMD won all 3 console contracts, RIP NVIDIA! All games will be optimized for GCN. Just wait."
    "FresSync will take over the world and GSync will be dead. Just wait"
    "Async Compute, just wait till late 2016, it will smoke 780 Ti! Just wait."
    "AMD APUs will put NVIDIA and Intel out of business someday, just wait."
    "Lisa Su will save AMD, just wait"
    "Zen"
    and of course
    "Bulldozer"
     
    Last edited: Sep 25, 2015
  6. Gideon

    Gideon 2[H]4U

    Messages:
    2,303
    Joined:
    Apr 13, 2006
    Between you and Prime1, I can't tell who is the bigger troll these days.

    No one from AMD said it would destroy the Titan X.

    They did say it would be an overclocker's dream; that was a blunder on their part.

    AMD has never promised voltage control for any of their cards; you're just parroting fanboys there.

    With Intel adopting FreeSync, G-Sync is in trouble; I doubt it will live much longer.

    AMD's async compute is light-years ahead of what NVIDIA can do right now on their cards; perhaps NVIDIA will close that gap with next-gen hardware.

    AMD makes a better APU, but I don't think anyone thought it would bankrupt NVIDIA or Intel. In fact, with every new GPU or CPU, people usually say it will be the death of AMD, not the other way around.

    Zen is coming; we'll see what it can do when it's here, though it does sound good so far.

    Bulldozer is a great chip at multi-threaded tasks, but it sucked at single-threaded tasks. Not surprising, since it was built as a server chip first. Sadly, that chip got overhyped and expectations were way too high.

    I could go over all the BS you post that's just as bad, FUD and pro-NVIDIA, but I don't have all day. It's hardware; learn to enjoy it and don't try to justify your purchase to everyone else. After all, you bought it for your use, not mine.
     
  7. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,211
    Joined:
    Feb 22, 2012
    I stopped reading here. AMD posted benchmarks showing it would destroy NVIDIA and called it an "overclocker's dream." No one got close to their results, and we know it's tapped out.

    I own a Fury X and think it's a good SFF card, but let's not rewrite history.


    [image: AMD's Fury X marketing benchmark graph]
     
    Last edited: Sep 25, 2015
  8. jwcalla

    jwcalla 2[H]4U

    Messages:
    3,629
    Joined:
    Jan 19, 2011
    It seems like AMD always has a product that will shine in the future.
     
  9. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,428
    Joined:
    Aug 5, 2013
    If it makes you feel any better, if I had gone with a mid-range part instead, I probably would have bought a 390 despite how much I dislike Hawaii.
     
  10. Gideon

    Gideon 2[H]4U

    Messages:
    2,303
    Joined:
    Apr 13, 2006
    Still waiting to see that quote from AMD about destroying a Titan X; cool graph, though.

    "Destroying," "failure," etc. are all in the eye of the beholder; some would look at that graph and say it's barely faster. Of course, marketing never shows their product in its best light... oh wait, that's exactly what they get paid to do.
     
  11. gigatexal

    gigatexal [H]ardness Supreme

    Messages:
    7,252
    Joined:
    Jun 22, 2004
    lol isn't that graph from AMD's marketing?
     
  12. KickAssCop

    KickAssCop [H]ardness Supreme

    Messages:
    6,576
    Joined:
    Mar 19, 2003
    I don't always give a shit about benchmarks but whenever I visit this forum, AMD is losing them.
     
  13. tybert7

    tybert7 2[H]4U

    Messages:
    2,634
    Joined:
    Aug 23, 2007
    NVIDIA has the high-end crown, and nothing else, in DX12. This is not to be contested; we've all seen the charts.

    390 ~ 980 > 970
    390x > 980
    280 > 960

    980ti >= Fury depending on the degree of async compute going on (not much in the fable benchmarks shown)


    The response to this is essentially: yeah, but so what, NVIDIA still performs better in DX11, and I care about games now.

    Not a bad retort, and it would make a lot more sense if AMD cards were in the gutter in DX11, but they are not (...unless we're talking about AotS, or any game like it, which is unusable on AMD if only DX11 is available, or games like Project CARS, where they forced PhysX into a DX11 code path). Given the choice between a card that performs ~10% worse on average in today's DX11 games but 10-15% better in the DX12 games that are coming up fast, I'd choose the decent-performance-today, better-performance-tomorrow card.

    But perhaps that is because I don't treat graphics cards like a cheap stable of slatterns to be rotated through like I'm Charlie Sheen.

    I've had my 290 since around late 2013 / early 2014. I'm typically on a two-year GPU cadence, but I felt no pressing need to upgrade and wanted to wait for the die shrink, which will bring the biggest generation-over-generation performance gains we've seen in a long while. For me, and people like me, it is better to err toward the cards with longer legs.

    If you are the kind of person who rotates high-end cards/SLI/CrossFire with each new infinitesimal iteration, then longevity and performance over time are kind of meaningless.

    It's like a guy who buys a BMW M3 this year and buys a new one next year because they upped the horsepower by 20.

    You could have made a better argument if we were still well within the DX11 era, because getting a 980 over a 290 would have given tangible performance gains. But we are NOT well within the DX11 gaming era. The games that push performance will face enormous pressure to shift to the latest APIs like DX12 and Vulkan, and in that world, the era when it was easy to brush off AMD's offerings looks to be closing.

    The 390 is based on a card that was released in EFFING 2013, for God's sake, and it is running neck and neck in DX12 with a 980 NVIDIA released a year later?!

    Those "rebrands," when unshackled, had far more staying power than we ever knew. The dark reign of shackled AMD performance in DX11 is ending, and in the words of Saruman:

    A New Power is Rising.

    https://www.youtube.com/watch?v=TQq4LjSF2rc#t=42s
     
  14. atom

    atom Gawd

    Messages:
    851
    Joined:
    May 3, 2005
    Do we know what this really means? It's kind of unclear what is implied. If it simply disables async for the NVIDIA cards, that's clearly good news for everyone. I am all for a game taking advantage of a GPU's features as long as it doesn't cripple other GPUs.
     
  15. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,810
    Joined:
    Nov 5, 2010
    Except async code was never supposed to cripple anything; it's a legitimate feature of DX12. This is a problem NVIDIA created by themselves, for themselves, by lying.

    It would be nice to confirm whether async code is only active in the Xbox version and artificially disabled in the PC version.
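
    (For context, a minimal sketch of what "async compute" means at the D3D12 API level, assuming a valid ID3D12Device*: the feature is simply a second command queue of type COMPUTE submitted alongside the usual direct queue. Command-list recording and fence synchronization are omitted.)

    [code]
    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // "Async compute" in D3D12: a compute queue created next to the
    // direct (graphics) queue so compute work can overlap graphics work.
    void CreateQueues(ID3D12Device* device,
                      ComPtr<ID3D12CommandQueue>& gfxQueue,
                      ComPtr<ID3D12CommandQueue>& computeQueue)
    {
        // Direct queue: accepts draw, compute, and copy commands.
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

        // Compute queue: compute and copy only. Whether work submitted
        // here actually runs concurrently with the direct queue depends
        // on the hardware and driver; the API permits the driver to
        // serialize it, which is the crux of the NVIDIA async debate.
        D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
        cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));
    }
    [/code]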
     
  16. socK

    socK 2[H]4U

    Messages:
    3,672
    Joined:
    Jan 25, 2004
    Glancing at the engine source, it looks like it's not implemented yet on PC.
     
  17. atom

    atom Gawd

    Messages:
    851
    Joined:
    May 3, 2005
    I am not part of the rest of the thread's fanboyism. I am not suggesting that AMD or Microsoft are scheming against NVIDIA. I am simply pointing out that there was a simple solution: support the feature on hardware that supports it, and disable it otherwise. Kind of like Tomb Raider and TressFX: if you didn't have the option to turn it off, there wouldn't be much point in NVIDIA owners buying the game. Some developers want to put features in their engine and release benchmarks showing how a feature cripples a certain card. That is interesting and useful information, but what happens when that feature is bypassed? I wonder what amazing graphics features the latest versions of WindowBlinds and Start10 will support?
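
    (A minimal sketch of that "support it where it works, disable it otherwise" idea, in the same D3D12/DXGI terms as above. The names QueryGpuSupportsAsync and userEnabledAsync are illustrative, not from any shipping engine; a real engine would combine vendor IDs with empirical timing, since D3D12 exposes no single "async compute works well" capability bit.)

    [code]
    #include <dxgi1_4.h>

    struct RenderSettings {
        // Exposed as an options-menu toggle, like TressFX in Tomb Raider.
        bool userEnabledAsync = true;
    };

    // Illustrative capability check: a vendor-ID test stands in for a
    // real engine's more careful detection/benchmarking logic.
    bool QueryGpuSupportsAsync(IDXGIAdapter1* adapter)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        const UINT kAmdVendorId = 0x1002;
        return desc.VendorId == kAmdVendorId;
    }

    // The optional path runs only when the hardware is known-good AND
    // the user hasn't turned it off; everyone else gets the fallback.
    bool ShouldUseAsyncCompute(IDXGIAdapter1* adapter,
                               const RenderSettings& s)
    {
        return s.userEnabledAsync && QueryGpuSupportsAsync(adapter);
    }
    [/code]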
     
  18. heflys20

    heflys20 [H]ard|Gawd

    Messages:
    1,492
    Joined:
    Aug 27, 2010
    You mean AMD? Doubtful. I predict Nvidia will continue to outsell them. Irregardless of these benches. I honestly don't see this changing much of anything.
     
    Last edited: Sep 26, 2015
  19. tybert7

    tybert7 2[H]4U

    Messages:
    2,634
    Joined:
    Aug 23, 2007
    Well, Saruman did fail as well. But hope springs eternal. If there is any justice in this world, the 390 should start to cut into 970 sales; the 970 is STILL the number-one-selling video card on Amazon. I want cards to win on the merits, and the 970 no longer deserves that perch.
     
  20. heflys20

    heflys20 [H]ard|Gawd

    Messages:
    1,492
    Joined:
    Aug 27, 2010
    Name brand sells, unfortunately. The closest equivalent AMD card on Amazon's sales ranking (aside from adapters) is the MSI 390, and it sits at spot 20. It also appears NVIDIA is gearing up to roll out Pascal.

    Most consumers aren't even aware of these benches. It's mostly fodder for diehard fans to argue among themselves.
     
    Last edited: Sep 26, 2015
  21. limitedaccess

    limitedaccess [H]ardness Supreme

    Messages:
    7,493
    Joined:
    May 10, 2010
    The problem is that some enthusiasts place way too much importance on what are really rather marginal raw performance differences, even more so in terms of how this applies to the broader, less discerning market.

    From a practical standpoint, if you were to blind-test graphics cards within 1-2 segments of each other, the vast majority of people would not be able to tell the difference. Compounding the issue, products that close together will effectively trade performance wins as well (some games run better on one card, some on the other).
     
  22. Revdarian

    Revdarian 2[H]4U

    Messages:
    2,429
    Joined:
    Aug 16, 2010
    That isn't a word.
     
  23. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,211
    Joined:
    Feb 22, 2012
    You're right.
     
  24. Michaelius

    Michaelius [H]ardness Supreme

    Messages:
    4,684
    Joined:
    Sep 8, 2003
    How is he right? If we are talking about solutions at an equal price, why would you want to settle for less?
     
  25. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,211
    Joined:
    Feb 22, 2012
    Well, that's where [H]-type performance reviews come in. Do you actually get a better experience? If we're talking about +/- 10%, probably not. That's what he's talking about.

    Then you can consider things like past history, location of production, etc.

    The OP you responded to said "AMD and/or their shills." At some point this was said. I'm sure I could get a quote from AMD Roy if someone hadn't deleted his entire account.
     
    Last edited: Sep 26, 2015
  26. PRIME1

    PRIME1 2[H]4U

    Messages:
    3,942
    Joined:
    Feb 4, 2004
    I would feel bad for anyone who would buy a card based on benchmarks for a game that's not even out. A lot can change by the time the game is released and the drivers are finished. DX12 games will probably not be the standard until 2020.
     
  27. Zorlag

    Zorlag Limp Gawd

    Messages:
    272
    Joined:
    Sep 14, 2003
    A good continuation of the funny tech predictions of the past. Can we repost this endlessly as a reply to all your posts from now on?
     
  28. Remon

    Remon Limp Gawd

    Messages:
    352
    Joined:
    Jan 8, 2014
    I'm sure a few "temperature/power draw/efficiency doesn't matter, this is [H]" lines were thrown around a few years ago.
     
  29. heflys20

    heflys20 [H]ard|Gawd

    Messages:
    1,492
    Joined:
    Aug 27, 2010

    No, it is a word, just one built on a double negative. When used (mostly incorrectly), it irks the obsessively grammar-conscious. I have a bad habit of using it sometimes; I think I like the way it sounds for some reason.

    There you go.
     
    Last edited: Sep 26, 2015
  30. tybert7

    tybert7 2[H]4U

    Messages:
    2,634
    Joined:
    Aug 23, 2007
    If anyone knows Prime IRL, you're going to need to console him hard as the DX12 titles start raining from the sky next year.
     
  31. jwcalla

    jwcalla 2[H]4U

    Messages:
    3,629
    Joined:
    Jan 19, 2011
    NVIDIA will do fine on DX12. The idea that their hardware is just a nicely polished turd and we're only now seeing it, and that all NVIDIA has is drooling morons for engineers who don't know how to design GPUs, might make for great forum talk, but it's just not reality.
     
  32. heflys20

    heflys20 [H]ard|Gawd

    Messages:
    1,492
    Joined:
    Aug 27, 2010
    By the time dx12 games become prevalent, I imagine nvidia will have new tech on the market.
     
  33. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,146
    Joined:
    Aug 1, 2005
    Read what I wrote: I said AMD and/or their shills. Lest we forget this amusing pic their shills/fanboys made:

    [image: fan-made AMD hype pic]

    Ah here's something else:

    Fastest GPU in the world indeed...just not yet.
     
    Last edited: Sep 26, 2015
  34. trudude

    trudude [H]ard|Gawd

    Messages:
    1,647
    Joined:
    Jul 17, 2003
  35. sblantipodi

    sblantipodi 2[H]4U

    Messages:
    3,441
    Joined:
    Aug 29, 2010
  36. Revdarian

    Revdarian 2[H]4U

    Messages:
    2,429
    Joined:
    Aug 16, 2010

    English has pretty flexible grammatical rules, but even so, stacking redundant prefixes doesn't make an acceptable word. If this were German, where you can string a couple of things together to make a new compound word, then sure.

    Thereyougothisisn'tawordeitherbutaccordingtoyouitcouldbeawordeventhoughweallknowthatitisn'tthisisjustareductioadabsurdumofyouritisaworddefense.
     
  37. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,428
    Joined:
    Aug 5, 2013
    I could care less about the word "irregardless". :cool:
     
  38. DPI

    DPI Nitpick Police

    Messages:
    10,960
    Joined:
    Apr 20, 2013
    Shall we take bets now on which team will top the DeusEx:MD DX12 charts? Any takers? All in the spirit of friendly, heightened civility and basic, common respect and such in this new DX12 era to prove things are different.
     
    Last edited: Sep 26, 2015
  39. heflys20

    heflys20 [H]ard|Gawd

    Messages:
    1,492
    Joined:
    Aug 27, 2010
    But it is a word, "regardless" of how ridiculous it may sound. Here you go:

    http://i.word.com/idictionary/irregardless

    Who cares, though? Perhaps you'll send some death threats Merriam's way; I'm sure they'll be intimidated. Perhaps they'll be amazed by your grammar skills, which are beyond reproach, yes? How much effort did you spend making that erroneous, rude response?

    Stop! Grammar time! If you don't rectify this error, I'll be most perturbed.
     
    Last edited: Sep 26, 2015
  40. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    24,475
    Joined:
    Feb 1, 2005
    Every time you see me that Grammar's just so hype
    I'm dope on the floor and I'm magic on the mic