AMD RX Vega FreeSync vs NVIDIA GTX 1080 TI G-Sync Coming Soon

Discussion in 'HardForum Tech News' started by FrgMstr, Jul 25, 2017.

  1. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
    Not in a direct way at least.

    The reason I said what I said is that, if I were to find a perfect monitor to fit my gaming and everyday needs, right now that monitor only exists with FreeSync, not G-Sync, meaning that if I wanted that technology, I'd have to choose AMD. But their most powerful GPU has been EoL'd for a while, the rest of their current GPUs are hard to come by due to the mining craze, and their next top dog only matches what USED to be nV's top dog.

    I am using [H]'s TW3 benchmark as a basis for the kind of performance I am looking for: 60fps minimum at ultra settings. The 1080 manages 60fps average, so the 1080 Ti is the closest GPU to fit the bill, but AMD has had nothing close to it, and Vega doesn't look like it's going to cut it any time soon.

    Hence the extra disappointment in Vega. My ideal monitor is in their camp.

    It would be nice to see nVidia support FreeSync, but in the current state of affairs it could very well be the last nail in RTG's coffin, because the abundance of cheaper FreeSync monitors is the only remaining advantage they have. Once that's gone, there is nothing going for them.

    And it'd probably be in nV's best interest NOT to support FreeSync.

    I'll admit, I'd be much more surprised by nV supporting FreeSync than by RX Vega beating a 1080 Ti.
     
  2. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,438
    Joined:
    Oct 19, 2004
    My Fury X CrossFire experience on three 1440p FreeSync HP Omen 32" monitors has been nothing short of excellent over the last nine months or so (since I bought my second card).

    I think it's fair to expect the same crossfire experience from Vega. That's an option you might consider.


    Also consider that the whole point of FreeSync (and G-Sync) is that you don't need 60FPS or more to "feel" smooth. In my experience with my Omen monitors, everything down to the FreeSync minimum of 48Hz feels buttery smooth.
     
  3. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
    AIO on Fury X was one of the primary reasons why I didn't go near it. I don't trust water.

    AIO on Vega will probably be the same, but after using SLI I am now also wary of using mGPU again, so I'd like to avoid that as much as possible...

    If I trusted water more, and if I had more free time to tinker with mGPU setups, I'd probably have made the switch. I was close to CrossFiring Furys instead of going single 1080, but I decided on the 1080 because it's a single GPU.
     
  4. mesyn191

    mesyn191 2[H]4U

    Messages:
    2,983
    Joined:
    Jun 28, 2004
    Yeah, that too could happen... but probably never will.

    NV has a lot of mindshare tied up in G-Sync and they won't let something like that go easily. Monitor manufacturers will have to start abandoning G-Sync before they do, I believe.

    Nah. Even if Vega turns into a total bust, they're going to try to stick it out with GCN and ho-hum marketshare until Navi comes out. If Navi turns into a bust... yeah, RTG might be finished in the dGPU market. Maybe they'll only do APUs.

    Reliability seems to be OK for AIOs these days. The bigger issue is making them fit in some cases, IMO. Air is still simplest, though, so it's understandable if you'd still want air only.
     
  5. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    I ran AMD in my system too, till it stopped making high end video cards.
     
  6. Mugato

    Mugato Muh Feelz!

    Messages:
    933
    Joined:
    Feb 25, 2014
    All sound and no Fury (hhahhhaha!!)

    I do like me some RX-7s though, to be honest; built 3 SX13s from the ground up: one CA18, one NA, and one KA24DET. Damn I love Japan!
     
  7. Mugato

    Mugato Muh Feelz!

    Messages:
    933
    Joined:
    Feb 25, 2014
    They did it with Fury, they'll do it again. Ryzen has nothing to do with Vega, people; the margins are not close.
     
  8. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    Video is done. Article will be up tomorrow for sure.
     
    ecktt, Armenius, razor1 and 8 others like this.
  9. jologskyblues

    jologskyblues [H]Lite

    Messages:
    92
    Joined:
    Mar 20, 2017
    For transparency, is this an independent test or is this sponsored in any way, shape or form by AMD?
     
  10. workshop35

    workshop35 Gawd

    Messages:
    575
    Joined:
    Nov 24, 2013
    Even if AMD sponsored it, Kyle wouldn't pull any punches or skew anything in their favor.
     
    jfreund and Armenius like this.
  11. Hakaba

    Hakaba Gawd

    Messages:
    639
    Joined:
    Jul 22, 2013
    I know FE is not RX, but someone posted the 1k+ FE hashing at 30-32 MH/s at 300W+. It would be an inefficient mining card; hell, the 1070 beats it hands down in efficiency. But maybe RX will mine well... I guess we shall wait and see.
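    To put a rough number on the efficiency claim above (a back-of-the-envelope sketch using only the forum-reported figures, 30-32 MH/s at 300W+, not measured values):

```python
# Reported Vega FE Ethereum hashrate and board power (forum numbers, not measured).
hashrate_mhs = 31.0      # midpoint of the reported 30-32 MH/s
power_w = 300.0          # reported draw, "300W+"

efficiency = hashrate_mhs / power_w   # MH/s per watt
print(round(efficiency, 3))           # ~0.103 MH/s per watt
```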
     
  12. Hakaba

    Hakaba Gawd

    Messages:
    639
    Joined:
    Jul 22, 2013
    Or people that flat out hate team Green and want Red to win at any cost.
     
  13. jologskyblues

    jologskyblues [H]Lite

    Messages:
    92
    Joined:
    Mar 20, 2017
    Honestly, it was the first thing that crossed my mind after the backlash TPU got for their AMD-sponsored review. Anyway, the results should speak for themselves if the testing methodology is fair and sound. Actually, I'm more interested in whether people can tell the difference between G-Sync and FreeSync in its current state.
     
  14. Sith'ari

    Sith'ari Gawd

    Messages:
    573
    Joined:
    Oct 13, 2013
    The most obvious logical explanation for AMD promoting the comparison of (Vega + FreeSync total cost) vs. (1080 + G-Sync total cost) is that their GPU as a standalone will be similarly priced to the competition (perhaps even higher, due to the HBM2).
    So apparently it's not beneficial for them, from a marketing point of view, to compare only GPU vs. GPU.
     
  15. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
    I would have also compared the list of 4K FreeSync-capable monitors to the puny-in-comparison list of G-Sync monitors, just to show that there is a much greater variety of FreeSync monitors than G-Sync ones.

    (Of course, some of those FreeSync ranges are horrible, like 56-61Hz.)
     
  16. NIVO

    NIVO [H]ard|Gawd

    Messages:
    1,254
    Joined:
    Jul 19, 2004
    Hey, this is awesome, Kyle. I always look forward to seeing the two camps go at it and duke it out. Looking forward to what you have to show us all. I gotta say the last 3 or 4 cards have all been Nvidia for me, but I hold no allegiance. I'm always gonna go with the best bang for the buck when it comes to hardware. Now if only we could have some monitors that do both FreeSync and G-Sync, so we don't have to choose. Are there any?
     
  17. TimberVD

    TimberVD n00b

    Messages:
    20
    Joined:
    Jul 25, 2017
    Hi Kyle, long-time reader here; I check in every day and love the site. I thought I had a forum account from some years ago, but for the life of me I couldn't figure out the username/email, so I just made a new account.

    Anyway, I can appreciate the desire to play with unreleased hardware even before a product launch, but I can't help but feel (no disrespect intended) that AMD is trying to pull the wool over people's eyes with these blind tests. I don't think it's right for review sites to go along with this charade. It's awesome that in some way you are still sticking it to them by throwing a 1080 Ti in the mix, but I can't help but feel that this blind test is not in the tradition of [H] when it comes to the hard facts we have come to love, especially when it comes to a level playing field or full disclosure of the hardware used by a competitor. Again, I understand that it might be a fine line with AMD, perhaps especially so given previous dealings with them, but I personally don't like the very dubious route AMD has taken with these tests.

    Perhaps I've got this completely wrong, and I look forward to the video to perhaps alleviate some concerns. Again, no disrespect meant; it's just something I wanted to put to paper, so to speak.
     
    jologskyblues likes this.
  18. Quartz-1

    Quartz-1 [H]ardness Supreme

    Messages:
    4,257
    Joined:
    May 20, 2011
    I must confess to being disappointed that the tests were not done at 4K, but maybe next time.
     
  19. Omegas

    Omegas [H]ardForum Junkie

    Messages:
    9,759
    Joined:
    Jan 19, 2007
    Wake up, damnit, I need information! You can sleep when you're dead!!

    Seriously though, bated breath and all that.
     
    GDI Lord likes this.
  20. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    Currently, AMD is not an advertiser on HardOCP. It did not pay for an article. It did buy the equipment. Equipment is going back. AMD is a sponsor of Inside VR, which gets almost zero views through HardOCP. So I will let you make your mind up on what you want to label things. I found out a long time ago that it really makes no difference what I say.

    Truth hurts.
     
    staknhalo, Mav451, Digester and 2 others like this.
  21. Unoid

    Unoid [H]ard|Gawd

    Messages:
    1,049
    Joined:
    Feb 4, 2003
    I really hope you have the newest drivers from AMD that actually treat it as Vega and not just a high-clocked Fury. Tile rasterization, etc.
     
  22. SeymourGore

    SeymourGore 2[H]4U

    Messages:
    2,758
    Joined:
    Dec 12, 2008
    Where that video at?

    You're awake, you're posting. Who cares about your first cup of coffee, let's put that video up!
     
    ElPutoJefe likes this.
  23. Old, OLD article. The problem is that when you drop below a certain refresh rate, you have to frame-double on the LCD. G-Sync has a frame buffer to do the doubling; FreeSync just starts tearing. But at those low rates, your gameplay is going to suffer anyway.

    If your monitor complies with the 2.5x rule for FreeSync 2, your experience will be much more fluid at higher rates. And those monitors are now available.
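    The frame-doubling behavior described above (what AMD calls Low Framerate Compensation) can be sketched roughly like this. The function names are illustrative, not any actual driver API; the 2.5x ratio check follows the rule mentioned in the post:

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """Rule of thumb from FreeSync 2: LFC needs max refresh >= 2.5x min refresh."""
    return max_hz >= 2.5 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Pick a panel refresh rate inside the VRR window for a given frame rate.

    Below the window, frames get repeated (doubled, tripled, ...) so the
    panel still refreshes within its supported range; without this, the
    driver falls back to tearing or stuttering instead.
    """
    if fps > max_hz:
        return max_hz          # capped at the panel's maximum
    if fps >= min_hz:
        return fps             # true variable refresh, one refresh per frame
    multiplier = 2
    while fps * multiplier < min_hz:
        multiplier += 1
    rate = fps * multiplier
    return rate if rate <= max_hz else max_hz

# A 48-144Hz panel passes the 2.5x rule; a narrow 56-61Hz range does not.
print(supports_lfc(48, 144))            # True
print(supports_lfc(56, 61))             # False
print(effective_refresh(30, 48, 144))   # 30fps shown at 60Hz (each frame twice)
```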
     
    Sith'ari likes this.
  24. There's some debate as to whether or not these are hacked Fury drivers. Looking at how the GPU is laid out, you could assume a high degree of similarity between the two. And from an engineering perspective, trying to save both time and money, a refresh is a lot cheaper than a ground-up implementation.

    But claiming it's just a refresh doesn't sound nearly as exciting. So until someone runs a decompiler on the driver and examines the code, it's hard to say.
     
    Sith'ari likes this.
  25. Johnwayne117

    Johnwayne117 n00b

    Messages:
    16
    Joined:
    Apr 8, 2016
    So there's no NDA date or something?
     
  26. Unoid

    Unoid [H]ard|Gawd

    Messages:
    1,049
    Joined:
    Feb 4, 2003
    AMD made a big deal that the next-gen compute unit is different from, and better than, GCN:

    Wiki:

    AMD began releasing details of their next generation of GCN architecture, termed the 'Next-Generation Compute Unit', in January 2017.[34][37][38] The new design is expected to increase instructions per clock, deliver higher clock speeds, and add support for HBM2, a larger memory address space, and the High Bandwidth Cache Controller. Additionally, the new chips are expected to include improvements in the rasterisation and render output units. The stream processors are heavily modified from previous generations to support packed-math Rapid Packed Math technology for 8-bit, 16-bit, and 32-bit numbers. This gives a significant performance advantage when lower precision is acceptable (for example, processing two half-precision numbers at the same rate as a single single-precision number).

    Nvidia introduced tile-based rasterization and binning with Maxwell,[39] and this was a big reason for Maxwell's efficiency increase. In January, AnandTech assumed that Vega would finally catch up with Nvidia regarding energy efficiency optimizations due to the new "Draw Stream Binning Rasterizer" to be introduced with Vega.[40]

    It also added support for a new shader stage - primitive shaders.[41][42]
     
  27. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    We are not sharing embargoed information.
     
  28. ole-m

    ole-m Limp Gawd

    Messages:
    451
    Joined:
    Oct 5, 2015
    I'm excited for sure, and I really do hope something is hidden in the AMD drivers.

    Regarding G-Sync:
    It's a horrible hack to create sync.
    Using general-purpose hardware to do it is just plain wrong, and it's the reason why it's pricey...
    If they had used a more optimized ASIC, it might have been better.

    The end result would have been the same experience for us, but without the $300 premium :)

    But maybe I'm just interested in the how and not the result at times :)
     
  29. ElPutoJefe

    ElPutoJefe n00b

    Messages:
    1
    Joined:
    Jul 26, 2017
    And the video?? Or is it fake? One whole day to render the video???
     
  30. Okay let me explain it to you this way.

    The last HUGE architecture change from Intel was from the NetBurst P4 to the Core 2.

    Everything past Sandy Bridge has been minor improvements, i.e. better scheduling, out-of-order look-ahead steps, better branch prediction, better loop unrolling, better cache storage, bigger caches, better energy usage and states, better iGPU, AVX extensions, etc.

    Together these give you a 20% IPC improvement. But make no mistake, this is not ground-up. This is a revision, an evolution.

    Now, any software written specifically for Sandy Bridge or later will work on Skylake. But the timings and features may not deliver the best speed until the software is rewritten to take advantage of the new instructions and timings, like AVX-512.

    So even though there are INTERNAL improvements, the layout and resource allocation are pretty much the same. Same ROPs, etc. Just like a Sandy Bridge i3 has 2 + 2 cores, a Skylake i3 has 2 + 2 cores.

    Until we analyze the driver code from Fury to Vega, we won't know for sure how similar they are. But getting an initial driver out the door is relatively easy if the new chip really is just an upgrade.
     
  31. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    I just like making putos wait. Puto.
     
    ecktt and GDI Lord like this.
  32. Sarreq Teryx

    Sarreq Teryx n00b

    Messages:
    4
    Joined:
    Apr 5, 2014
    WHY are these still competing standards? This is something that should be a single ISO or RFC or whatever applies, with a committee behind it. When are AMD and nVidia going to stop this type of bullshit?
     
  33. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    Hell, we don't make money off advertising any more, at least let me get some enjoyment out of it. ;)
     
    GDI Lord likes this.
  34. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,777
    Joined:
    Jan 31, 2008
    Blame Nvidia. AMD is using the open standard defined by VESA. Nvidia decided to go their own, closed ecosystem, route.
     
    GDI Lord likes this.
  35. Adaptive-Sync is actually an optional VESA standard.
     
    GDI Lord likes this.
  36. Well, that's Swedish pricing. You poor Europeans and Aussies get screwed on pricing for consumer electronics.

    But I will say, if it really is that far off the mark relative to its performance, then AMD's Radeon division should:



    They would have been better off not releasing anything at all and sticking to the professional market. Recovering from looking like a fool is much harder than not saying anything at all.
     
  37. ecmaster76

    ecmaster76 [H]ard|Gawd

    Messages:
    1,150
    Joined:
    Feb 6, 2007
  38. Sith'ari

    Sith'ari Gawd

    Messages:
    573
    Joined:
    Oct 13, 2013
    I didn't post this for RX Vega's price itself. What I care about in this thread is the price difference (still rumors, of course) between RX Vega and the GTX 1080.
    Even if the price is in Swedish currency, the difference between these two is large (5,600 SEK for the GTX 1080 vs. the rumored 9,000 SEK for RX Vega!!!)

    EDIT: What I mean is that there is a strong possibility that RX Vega's US pricing will be more expensive than the competition, just as I said in my previous post #114.
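    For what it's worth, the rumored gap works out like this (both figures are the rumored Swedish prices from the post above, nothing official):

```python
# Rumored Swedish prices from the thread, in SEK.
gtx1080_sek = 5600
rx_vega_sek = 9000

diff = rx_vega_sek - gtx1080_sek        # absolute gap
premium_pct = 100 * diff / gtx1080_sek  # Vega premium over the 1080
print(diff, round(premium_pct, 1))      # 3400 SEK, roughly 60% more
```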
     
    Last edited: Jul 26, 2017
  39. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,356
    Joined:
    May 18, 1997
    Too much for the article icon?

    upload_2017-7-26_13-7-27.png
     
  40. thesmokingman

    thesmokingman [H]ardness Supreme

    Messages:
    4,950
    Joined:
    Nov 22, 2008
    I like the icon, yes! If only my actual life was as hmm as that icon.
     
    FrgMstr likes this.