AMD Vega to cost significantly more than expected

Discussion in 'Video Cards' started by SixFootDuo, Aug 10, 2017.

  1. stealthballer123

    stealthballer123 Limp Gawd

    Messages:
    299
    Joined:
    Mar 2, 2017
I have been around since '01 and well remember your "not trolling" threads bashing AMD.
I'll just never understand rooting against the underdog, but people cheer for winners, I guess.
     
  2. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
There is a difference between rooting for the underdog based on merit and rooting for the underdog as a way of fighting against the power that is Nvidia/Intel.

I don't think any of us can deny that Vega is far too late to the party compared to its competition. Fury was preempted by the 980 Ti by two weeks and the two fought neck and neck, so for a while AMD had a matching product against Nvidia at every price point, barring everything above the 980 Ti.

But this Vega, whose performance competes with the 1070 and 1080, only arrived on the scene when the 1080 Ti had already been out for several months and its direct competition for a full year.

As much as I want to support AMD, Vega is 6 months too late. It also really didn't help when JHH gifted the first 15 V100 cards to AI researchers.

If we took Vega on its own and literally nothing else, then yes, the 1080 finally has competition, but the fact remains that the 1080 Ti doesn't look like it will have competition for a while. That is especially gloomy if Volta yields the same performance increase over Pascal: Vega would be relegated to a 2070/2060 competitor, and the 2080 situation would repeat itself.

The only thing really floating AMD's boat right now is FreeSync, which makes all this even more disappointing, as one of the monitors I have my eye on is a FreeSync monitor. I probably would have seriously considered jumping if Vega matched the 1080 Ti; merely matching the 1080 makes it a much less appealing proposition.
     
    trandoanhung1991, Armenius and Algrim like this.
  3. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,437
    Joined:
    Oct 19, 2004
    ASUS Vega X2
     
    cybereality likes this.
  4. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
That's a dual-GPU card, not in the same league as the 1080 Ti.

When I talk GPUs, I always mean single-GPU cards. Dual-GPU cards are just two single GPUs in CrossFire on the same PCB; otherwise they are exactly the same as running two separate Vega cards.
     
    Armenius likes this.
  5. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,437
    Joined:
    Oct 19, 2004
DX12 and Vulkan changed those rules.
     
  6. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
I have yet to see any results of that labour.

Besides, those rules apply to DX12 and Vulkan games only, so far an extreme minority of games.

Once more games get released under DX12 and Vulkan, I'll see whether my statement still holds true. For now, for all intents and purposes, I still consider dual-GPU cards to be nothing more than two GPUs sharing the same PCB.
     
    Armenius and Shintai like this.
  7. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
The only rule they changed was that using 2 or more GPUs became 10x more expensive and complex. Hence support, in reality, dying for good.

And who wants 2 cards/a dual card that only works in a tiny number of overpriced, poor-selling games, while everything else, especially the much better-selling indie-type games, doesn't support it and acts like a single GPU?
     
    Armenius likes this.
  8. -Strelok-

    -Strelok- [H]ardForum Junkie

    Messages:
    10,002
    Joined:
    Dec 2, 2010
Everyone is moving away from multi-GPU, so I'm not sure they changed anything. Also, who wants to play 3-5 games? 90%+ of games are still DX11 and/or run better in DX11. By the time it becomes relevant, I can just go ahead and buy a new GPU that is much faster than the current gen.
     
    Armenius likes this.
  9. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,437
    Joined:
    Oct 19, 2004
Or you can leave your GPU in and buy any other GPU to complement the first in performance. No more generational or model lock-ins. Full RAM accessible on all cards. No more red team only or green team only; why not one of each? Oh, and throw onboard video processing power into the mix too for a couple more FPS. The future of mGPU is very different indeed.
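For anyone curious, this is roughly what that looks like from the DX12 side: explicit multi-adapter just enumerates every DXGI adapter, whatever the vendor or model, and lets the app create an independent device on each. A minimal C++ sketch of the enumeration (illustrative only, not production code):

```cpp
// Minimal sketch: enumerate every DX12-capable adapter (red, green, or the
// onboard iGPU) and create an independent device on each one.
// Windows only; link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // Any DX12-capable adapter qualifies, regardless of vendor or model;
        // how work is split across the devices is up to the application.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n", i,
                    desc.Description,
                    desc.DedicatedVideoMemory / (1024 * 1024));
            devices.push_back(device);
        }
    }
    // Note: each device still owns its own memory; sharing resources across
    // adapters requires explicit cross-adapter heaps and fences.
    return devices.empty() ? 1 : 0;
}
```

The catch, as the rest of the thread notes, is that splitting work and memory across those devices becomes the engine developer's problem.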
     
    Last edited: Aug 12, 2017
    CSI_PC likes this.
  10. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,437
    Joined:
    Oct 19, 2004
For now.

BUT the future of mGPU a few years down the road, when the common game engines support it as a base function, is much brighter than that.
     
  11. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,200
    Joined:
    Apr 3, 2016
That raises an interesting point about whether this affects FreeSync/G-Sync, or, more importantly and from a positive perspective, whether it means one is no longer tied to one over the other and can finally choose the VRR monitor they always wanted.
Of course, that would require mGPU support to be universal across all modern games for it even to be consumer-viable (assuming mixing GPUs and VRR techs works well at all).

Cheers
     
    Maddness and Archaea like this.
  12. -Strelok-

    -Strelok- [H]ardForum Junkie

    Messages:
    10,002
    Joined:
    Dec 2, 2010
    Ok. In 2 years we can come back to this post and you can let me know all about how great mGPU tech is now. I'll bookmark it.
     
    Armenius likes this.
  13. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,437
    Joined:
    Oct 19, 2004
    Lol. I won't need to. In two years you'll know yourself.
     
  14. -Strelok-

    -Strelok- [H]ardForum Junkie

    Messages:
    10,002
    Joined:
    Dec 2, 2010
My point is we will see who is right. I think mGPU is going away, since Nvidia doesn't seem all that interested in developing for it, and neither does AMD. Didn't we have something like this before, Lucid Hydra or something? That failed as well.
     
  15. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,200
    Joined:
    Apr 3, 2016
Well, mGPU is meant to be moving toward the API rather than the driver, so it makes sense that interest from AMD and Nvidia will wane; it should then be picked up by the API developers, Microsoft (DX12) and Khronos (Vulkan).
It comes down to how much the API developers commit to the concept of mGPU, but unfortunately it may still be some time before they really go all-in.
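To give a concrete flavour of that API-level support: Vulkan 1.1, for example, exposes device groups that one logical device can span. A minimal C++ sketch of the enumeration (illustrative only; assumes a Vulkan 1.1 loader is installed):

```cpp
// Minimal sketch: ask the Vulkan loader which physical GPUs are linked into
// device groups, the unit that API-level mGPU works with in Vulkan 1.1.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1; // device groups are core in 1.1

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        count, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

    for (uint32_t i = 0; i < count; ++i)
        printf("Device group %u: %u physical GPU(s)\n", i,
               groups[i].physicalDeviceCount);

    // A VkDevice created with VkDeviceGroupDeviceCreateInfo can then drive
    // every GPU in one group; distributing the rendering is the engine's job.
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```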

    Cheers
     
  16. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
So were SLI and CF. Always just down the road.

When a developer has to spend a lot of money and time on an ultra niche, it's not happening.
     
    Armenius likes this.
  17. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,354
    Joined:
    May 18, 1997
  18. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,354
    Joined:
    May 18, 1997
That it is, for sure. I liked the way the FragHarder lights reflected off the shroud.
     
  19. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,697
    Joined:
    Mar 22, 2008
  20. Gulvan

    Gulvan Limp Gawd

    Messages:
    129
    Joined:
    Aug 7, 2016
It's like a gold rush for graphics cards due to the mining craze. It sucks a lot when you have to build computers for family/friends and they sometimes pay 100 bucks extra. Ugh.
     
    N4CR likes this.
  21. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
  22. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,521
    Joined:
    Sep 13, 2008
  23. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
Not just Civ6, also Metro. Are you counting 4K only? And it's a $699 liquid-cooled version that at best roughly matches a stock, old $499 1080 FE on average, at twice the power? Get a 1080 Ti for the same price and still save power while getting much more performance? ;)

I don't think I've seen such a bad GPU release in the last 20 years.

A broader 20-30 game test would look even worse.
     
    Last edited: Aug 14, 2017
  24. harmattan

    harmattan [H]ardness Supreme

    Messages:
    4,241
    Joined:
    Feb 11, 2008
After seeing the middling performance and crazy power draw, I was really hoping AMD was going to bring back a bit more balance from the price angle.

The FreeSync rationale was really the last meager case I could possibly see being made for anyone getting Vega, but there's no way that washes at these prices. Even miners are going to steer clear at this point, given the power draw and less-than-stellar hash rate.

Very, very happy I stopped waiting and picked up a nice 1080 Ti (from someone who was holding out to "upgrade" to Vega) last month.
     
    Armenius likes this.
  25. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
  26. SilverSliver

    SilverSliver Beat It To Deformation

    Messages:
    11,165
    Joined:
    Feb 23, 2004
    Basically GTX 1080 performance using a lot more power and likely no OC headroom? Also, a lot more expensive? Am I missing something?
     
    Armenius likes this.
  27. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
    Nope, you are spot on.
     
    Armenius likes this.
  28. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,925
    Joined:
    Oct 4, 2007
Basically, they have 1080 performance with 10-bit hardware support, versus Nvidia's 8-bit hardware support. That is one thing many folks seem to miss, and it's one reason I stick with AMD. I am not going to buy a multi-hundred-dollar piece of hardware that looks completely washed out on my monitor and have to mess with settings to maybe get something that still looks worse than AMD, at least on my monitors.
     
  29. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

Pascal should have it set up for 10-bit color, unless I'm mistaken. I don't have an HDR monitor, so that setting doesn't show up for me. But Nvidia has been pushing HDR monitors with Pascal cards, so... They also talked about it at launch.
     
    Armenius likes this.
  30. Iching

    Iching [H]ard|Gawd

    Messages:
    1,805
    Joined:
    Nov 29, 2008
Blah, potato comment. Do you go to a car dealership and cry that you can't afford a fancy Euro car?
     
    jologskyblues likes this.
  31. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,925
    Joined:
    Oct 4, 2007
Could be, but I had a 980 Ti and found it was a waste of money at the $650 I paid for it. An R9 380, R9 290X, and R9 Fury all looked better than the 980 Ti did on my setup. For those who would want to claim placebo: you are wrong. On a different note, this was on the Samsung 4K monitor I am using now and have been for the past 2 years; maybe Pascal has true 10-bit support, I have no idea.
     
  32. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    10,925
    Joined:
    Oct 4, 2007
  33. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
$599/$699 for sub-1080 or merely-1080 performance at crazy power draw, 15 months later. :D

All those 4x perf/watt, NCU, tile-based and whatever claims really show. The person saying it was a glorified Fiji was so wrong... oh wait!

The ETH hash rate was barely even a third of what was claimed as well. And you'd better like noisy cards if you buy reference Vega: 5-8 dB more than a 1080 FE.

#waitfornavi (Yes, it will be 4096 SP too.)

There is always the "cinematic" effect in 4K GTA V.
     
    Last edited: Aug 14, 2017
    trandoanhung1991 and Armenius like this.
  34. ir0nw0lf

    ir0nw0lf [H]ardness Supreme

    Messages:
    6,319
    Joined:
    Feb 7, 2003
https://www.newegg.com/Product/Product.aspx?Item=N82E16814137226

Vega 64 card at $499.99 (yes, OOS), but the rest of the air-cooled cards listed there are $599.99.
     
  35. odditory

    odditory [H]ardness Supreme

    Messages:
    5,580
    Joined:
    Dec 23, 2007
You sound confused, or you're just grasping at straws. The 1080 most definitely outputs 10-bit.
     
    trandoanhung1991 and Armenius like this.
  36. clbri

    clbri n00b

    Messages:
    13
    Joined:
    Jul 16, 2015
Coincidentally, this came up recently in another thread at https://hardforum.com/threads/consi...d-quadro-for-10-bits.1941528/#post-1043160681. The conclusion is that Nvidia does indeed support 10-bit on the 1080 (and probably also on several older generations), but only in Direct3D in exclusive fullscreen mode (not in windowed mode or in OpenGL). That is what most Windows games use, though, so one could say Nvidia supports 10-bit on Pascal for games, but not for business and content-creation applications. Curious to see whether that will change later with HDR going mainstream.
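For illustration, the 10-bit path those games use boils down to requesting an R10G10B10A2 back buffer and taking exclusive fullscreen. A minimal C++/D3D11 sketch under those assumptions (the window handle and resolution are placeholders):

```cpp
// Minimal sketch: request a 10-bits-per-channel back buffer in exclusive
// fullscreen, the one mode where, per the above, GeForce honors 10-bit.
// Windows only; link against d3d11.lib and dxgi.lib.
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

HRESULT Create10BitSwapChain(HWND hwnd, ComPtr<ID3D11Device>& device,
                             ComPtr<IDXGISwapChain>& swapChain) {
    DXGI_SWAP_CHAIN_DESC sd = {};
    sd.BufferCount = 2;
    sd.BufferDesc.Width = 3840;   // placeholder resolution
    sd.BufferDesc.Height = 2160;
    // 10 bits per color channel instead of the usual 8-bit R8G8B8A8.
    sd.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;
    sd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow = hwnd;
    sd.SampleDesc.Count = 1;
    sd.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;
    sd.Windowed = FALSE;          // exclusive fullscreen, per the above

    ComPtr<ID3D11DeviceContext> context;
    return D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0, nullptr, 0,
        D3D11_SDK_VERSION, &sd, &swapChain, &device, nullptr, &context);
}
```

Pro applications wanting 10-bit in windowed mode or OpenGL are the case that stays Quadro-only, which matches the conclusion in that thread.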
     
  37. penn919

    penn919 n00b

    Messages:
    55
    Joined:
    Feb 10, 2014
With regard to pricing, is there any particular reason the GPU manufacturers can't accept that the GPU market is simply larger now? Why couldn't they simply ramp up production to accommodate the miners? The way I see it, for every cryptocurrency that becomes unprofitable, a couple of new ones emerge to take its place. It even seems like they are being created so that consumer-grade GPUs can mine them. There may not be a way to beat them.
     
  38. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
It's all about logistics, and mining being very volatile. It takes ~12 weeks just to make the chips, so let's say 15-18 weeks from the moment you decide to increase production unexpectedly until you actually have a product in stores. By then mining may be gone again and you're sitting on increased inventory.

This is also why AMD and Nvidia have tried to sell otherwise-faulty, trashcan-bin chips as mining cards. It's money out of trash, and it doesn't affect the logistics chain.

No sane miner, however, will buy Vega.
     
  39. N4CR

    N4CR 2[H]4U

    Messages:
    3,861
    Joined:
    Oct 17, 2011
I have followed this quite in-depth since the 780 Ti (as 10-bit can benefit some of my work) and you are correct. And I hate banding with a passion, although newer monitors are getting better at this across the board with 10-bit LUTs, etc.

Regarding the topic at hand, I was right: I said they'd mark it up at least in my country for launch (they did), but I also expected the mining performance to be better. Seems everyone else is having fun marking it up globally...
I'm going to wait another month before deciding, to see what AA improvements they can bring. Not being invested in either adaptive sync ecosystem gives me more options, and I might be able to save some $ buying in the EU or DXB.
And fuck Gibbo. What a slimy motherfucker; I hope his audience is pissed at him.
     
  40. Communism

    Communism Limp Gawd

    Messages:
    165
    Joined:
    Feb 24, 2013
AMD boosted their sales of the R9 290(X) and R9 390(X) by specifically having "TheStilt", aka an AMD engineer on community liaison service, make officially AMD-signed ("Golden Signature") BIOSes optimizing them for memory-latency-sensitive hashing algorithms.

AMD specifically touted "cryptocurrency architecture enhancements" for Vega in their slides.

I would bet that AMD will ninja-release a mining BIOS and release their own forks of the mining software with optimizations explicitly for Vega if/when they have massive inventory of the card and no gamers buy it.

They likely won't release it before they have large inventories of Vega that people aren't buying.

^^^^^^^^^^^^^^^^^^^^^^^^^^

This is of course all under the supposition that AMD isn't simply lying out of their ass like they always do, and that they are competent, so take this analysis with a grain of salt.