Radeon 7 (Vega 2, 7nm, 16GB) - $699 available Feb 7th with 3 games

Discussion in 'AMD Flavor' started by Digital Viper-X-, Jan 9, 2019.

  1. vjhawk

    vjhawk Limp Gawd

    Messages:
    157
    Joined:
    Sep 2, 2016
    Well, now that you can use a Freesync monitor with Nvidia GPUs pretty much just fine, there's no longer a 'G-Sync tax' on monitors.

    At $699, the Radeon 7 and the 2080 cost the same and perform about the same. The question now is 16GB of memory vs. 8GB + ray tracing + DLSS.

    Seems to me that AMD was forced to go with HBM2 memory again because of R&D costs already sunk into Vega. It was easier for them to release a Vega 2.0 than to redo the memory interface to support cheaper GDDR6 memory.

    I still question the amount of memory, though; 16GB is only really useful for professional applications, AI, data center usage, etc.

    I feel a cut-down 12GB version with roughly three-quarters of the memory bandwidth (~768 GB/s instead of 1 TB/s) for, say, $150 or $200 less would sell very well and see pretty much no performance drop-off compared to the 16GB version.
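
    Back-of-the-envelope on that (assuming the commonly cited ~2.0 Gbps pins on a 1024-bit bus per HBM2 stack, so ~256 GB/s per stack; rough numbers, not AMD's spec sheet):

    Code:
    PIN_SPEED_GBPS = 2.0        # assumed effective data rate per pin (Gbps)
    BUS_WIDTH_PER_STACK = 1024  # bits per HBM2 stack

    def bandwidth_gbs(stacks):
        """Aggregate memory bandwidth in GB/s for a given number of HBM2 stacks."""
        return stacks * PIN_SPEED_GBPS * BUS_WIDTH_PER_STACK / 8

    print(bandwidth_gbs(4))  # 1024.0 -> the 16GB "1 TB/s" config
    print(bandwidth_gbs(3))  # 768.0  -> a hypothetical 12GB, 3-stack cut

    Whether dropping one stack actually saves enough on the BOM to fund a $150 price cut is a separate question.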
     
    Last edited: Jan 17, 2019
    Maddness and bwang like this.
  2. Unoid

    Unoid [H]ard|Gawd

    Messages:
    1,040
    Joined:
    Feb 4, 2003

    Vega 7 also has higher compute, if that factors into anyone's decision.
     
  3. Snowdog

    Snowdog Pasty Nerd with Poor Cardio

    Messages:
    8,002
    Joined:
    Apr 22, 2006
    Shaving one stack of HBM might save AMD what, $25? Even less, because judging by past AMD products, they would probably still populate all 4 stacks and just disable one.

    So cutting $150+ would probably make the card a money loser.
     
    Nightfire, ManofGod and tungt88 like this.
  4. Unoid

    Unoid [H]ard|Gawd

    Messages:
    1,040
    Joined:
    Feb 4, 2003
    *Should have.* AMD makes so much less money than Nvidia and Intel, yet AMD is just now releasing Zen 2, which will take the crown in the server space and desktop too. AMD is also keeping up with Nvidia in discrete GPUs. Not winning, but competitive. It's a damned miracle AMD is competing as well as they are.
     
  5. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,197
    Joined:
    Jun 13, 2003
    ...and there's the expected 'poor AMD' defense :D
     
    stashix and Legendary Gamer like this.
  6. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,252
    Joined:
    Jul 29, 2009
    Yeah, because when you don't have money and carry a good deal of debt, you can just produce GPUs out of thin air. Never mind the 3-year development cycle for a GPU. So unless you have time travel available and are willing to lend AMD a hand by bringing them some money from the future, the development pace or status is not going to change.
     
    ManofGod and jadesaber2 like this.
  7. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,197
    Joined:
    Jun 13, 2003
    Their mistakes to make, I agree ;)
     
  8. Derangel

    Derangel [H]ard as it Gets

    Messages:
    16,629
    Joined:
    Jan 31, 2008
    It is also a fact. Between Intel illegally fucking them over during the P4 era and a series of spectacularly stupid decisions under previous management, AMD really doesn’t have the kind of money to throw around that their competition does. They do some pretty outstanding stuff with their limited budget, but it is what it is.
     
    N4CR, nEo717, RamboZombie and 3 others like this.
  9. Gideon

    Gideon [H]ard|Gawd

    Messages:
    1,882
    Joined:
    Apr 13, 2006
    You guys have got to stop feeding the off-topic troll. I think most are glad to see an AMD card that gives people options in the 4K gaming world.
     
    N4CR, gtrguy, ZeqOBpf6 and 2 others like this.
  10. c3k

    c3k [H]ard|Gawd

    Messages:
    2,024
    Joined:
    Sep 8, 2007
    Reference the part of your post I bolded and underlined. See https://www.guru3d.com/news-story/a...tive-with-radeon-vii-though-directml-api.html

    I'm not sure if that means anything...yet.
     
  11. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,466
    Joined:
    Feb 22, 2012
    I don’t see how it would be useful, since it would take TFLOPS away from normal rendering, and I don’t think AMD has the expertise, manpower, or money to pull it off. It’s not like Vega has idle tensor cores sitting around like the RTX series.

    For example, the 2080 Ti has ~110 TFLOPS of tensor throughput just sitting around doing nothing except RT and DLSS. Vega has ~60 TOPS (?) of INT8 if the card commits 100% of itself to INT8.
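
    Rough peak-rate math behind those numbers (assuming 544 tensor cores at 64 FP16 FMAs per clock on the 2080 Ti, and 4-wide INT8 dot products per ALU on Vega 20; marketing peaks, not sustained throughput):

    Code:
    def tera_ops(units, ops_per_unit_per_clock, clock_ghz):
        """Peak throughput in tera-ops/s: units x ops-per-clock x clock."""
        return units * ops_per_unit_per_clock * clock_ghz / 1000

    # RTX 2080 Ti tensor cores: 64 FP16 FMAs = 128 ops per core per clock (assumed specs).
    print(tera_ops(544, 128, 1.545))  # ~107.6 TFLOPS FP16 -> the "~110" figure; INT8 is 2x that
    # Vega 20: 4 INT8 MACs = 8 ops per ALU per clock (assumed 3840 ALUs, ~1.75 GHz boost).
    print(tera_ops(3840, 8, 1.75))    # ~53.8 INT8 TOPS -> in the ballpark of "~60"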
     
    c3k likes this.
  12. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,169
    Joined:
    Aug 28, 2008
    Exactly. If it wasn't for Nvidia's ridiculous pricing and stupidly limited RAM for such expensive cards, AMD would have been forced to stand down until large Navi.
    AMD looks to have a very competitive card for the price, especially for those who do more than game.
     
    noko likes this.
  13. noko

    noko 2[H]4U

    Messages:
    4,083
    Joined:
    Apr 14, 2010
    I see this as very useful if it is supported with multiple cards, and not necessarily the same model of card. Not SLI/CFX or traditional multi-GPU, but using the second card for processing, ML, and maybe even RT. For example, the primary card renders the game at whatever resolution, the second card does the ML processing, and then off to your monitor it goes. Lag would be my only concern with this method. You would not have to render at a lower resolution like Nvidia does, but could render at full resolution and reap the benefit of the processing power of your second card.

    The Microsoft demo shows some spectacular results going from 1080p to 4K. This one example looks much better than DLSS, but it's too soon to tell. What is awesome is that anyone with a DX12 card can use it. 4K gaming may come to a large number of folks now without needing to upgrade.

    https://www.overclock3d.net/news/gp..._supports_directml_-_an_alternative_to_dlss/1
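
    On the lag concern, a quick back-of-the-envelope (assuming a PCIe 3.0 x16 link at a practical ~13 GB/s and an RGBA16F frame; purely illustrative, nothing measured):

    Code:
    PCIE_GBS = 13.0      # assumed usable PCIe 3.0 x16 bandwidth, GB/s (theoretical ~15.75)
    BYTES_PER_PIXEL = 8  # RGBA16F render target

    def copy_ms(width, height):
        """Milliseconds to move one frame of the given resolution across the link."""
        return width * height * BYTES_PER_PIXEL / (PCIE_GBS * 1e9) * 1000

    print(copy_ms(1920, 1080))  # ~1.3 ms to hand a 1080p frame to the second card
    print(copy_ms(3840, 2160))  # ~5.1 ms if a 4K result had to come back over the bus
    # Against a 16.7 ms frame budget at 60 Hz that's workable; at 144 Hz (6.9 ms) it isn't trivial.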
     
    Last edited: Jan 18, 2019
    cybereality likes this.
  14. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,466
    Joined:
    Feb 22, 2012
    I agree it’d be great if a second card could be repurposed just for RT and such (but also do xfire for older games). I think a lot of people think the same, and it’s definitely a way for AMD to catch up without going down the massive-die route.

    I once read someone saying RT would be easy to split off, but I’ve never gotten that deep into it.
     
  15. noko

    noko 2[H]4U

    Messages:
    4,083
    Joined:
    Apr 14, 2010
    The other aspect is that you don't need to pay the Nvidia tax to get it; any DX12 card will do. I just updated my last post, the one right before yours, with a snippet about the Microsoft demonstration.
     
    Dayaks likes this.
  16. deruberhanyok

    deruberhanyok [H]ard|Gawd

    Messages:
    1,305
    Joined:
    Aug 22, 2004
    Hadn't seen this yet, thank you for posting the link.

    All of these (non-xfire, non-SLI) multi-GPU capabilities that have been / are being introduced in D3D12 are pretty neat. I like the way the multi-GPU support works, and this is another good use for secondary cards in a system.

    Wondering if gaming systems in a year or two are going to look like the early PhysX days, with everyone buying / keeping older cards for dedicated ML.
     
  17. Maye88

    Maye88 n00b

    Messages:
    7
    Joined:
    Jan 14, 2019
    cybereality likes this.
  18. Snowdog

    Snowdog Pasty Nerd with Poor Cardio

    Messages:
    8,002
    Joined:
    Apr 22, 2006
    I am sure that is a major disappointment to dozens of people in the world.
     
    extide, N4CR, c3k and 1 other person like this.
  19. cybereality

    cybereality 2[H]4U

    Messages:
    3,747
    Joined:
    Mar 22, 2008
    That's sad, but not entirely unexpected. I wonder if DX12/Vulkan mGPU would still work.
     
  20. Derangel

    Derangel [H]ard as it Gets

    Messages:
    16,629
    Joined:
    Jan 31, 2008
    Probably, but it would require developers to actually support it.
     
    Algrim and Maddness like this.
  21. ManofGod

    ManofGod [H]ardForum Junkie

    Messages:
    9,893
    Joined:
    Oct 4, 2007
  22. Hakaba

    Hakaba Gawd

    Messages:
    526
    Joined:
    Jul 22, 2013
    So if one was going to buy a card (not necessarily a Vega 7) strictly for water cooling, who would you go with?
     
  23. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,169
    Joined:
    Aug 28, 2008
    I wonder what the limitations would be compared to Vega 56/64. Is the giant memory bandwidth of the V2 just too much when you're trying to sync two cards?
     
  24. Factum

    Factum [H]ard|Gawd

    Messages:
    1,440
    Joined:
    Dec 24, 2014
    NO, NO, NO!!!

    Bandwidth compression is LOSSLESS!
    What you are describing is akin to "loudness" in music... aka "vibrance".

    Again, it has nothing to do with bandwidth or compression!!!

    And the "AMD has better colors" or "AMD looks sharper" claim was utterly debunked a looooooong time ago:
    https://hardforum.com/threads/a-real-test-of-nvidia-vs-amd-2d-image-quality.1694755/
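
    That's the whole point of lossless compression: decode(encode(x)) gives back exactly x, so it cannot change what lands on screen. Toy sketch of delta encoding (illustrative only, not AMD's or Nvidia's actual DCC scheme):

    Code:
    def encode(pixels):
        """Keep the first value, then store per-pixel deltas (smaller when neighbours are similar)."""
        return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

    def decode(deltas):
        out = [deltas[0]]
        for d in deltas[1:]:
            out.append(out[-1] + d)
        return out

    row = [118, 118, 119, 121, 121, 120, 118, 117]
    assert decode(encode(row)) == row  # lossless: the exact same data comes back out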
     
  25. ssj925

    ssj925 n00b

    Messages:
    18
    Joined:
    Feb 6, 2016
    I'm actually curious about this. Some places say it has kept the same FP64 performance as the MI50, but TechPowerUp has updated their page showing otherwise.
     
  26. Eymar

    Eymar Limp Gawd

    Messages:
    228
    Joined:
    Sep 15, 2005
    TechPowerUp is right; the FP64 rate is way less than the Instinct MI50's, otherwise people would buy the VII rather than the much more expensive MI50.
     
    ssj925 likes this.
  27. noko

    noko 2[H]4U

    Messages:
    4,083
    Joined:
    Apr 14, 2010
    but but but my eyes don't lie :)

    With increased bit depth, HDR10 means way more colors -> less chance of duplicate colors to compress -> the compression ratio goes down -> the memory bandwidth needed goes up. Pascal lost noticeably more performance under HDR than AMD did, and I tend to think this is why.

    The thread linked deals with non-gaming graphics, but it was a good read. Pretty sure Microsoft WHQL mandates the rendering quality of 2D text, so that should be very close if not identical. With gaming graphics, Nvidia does some things differently. For example, on my 144Hz HDR monitor, AMD maintains 10-bit HDR10 at all refresh rates, while Nvidia only offers 10-bit at 144Hz; at anything else it uses 8-bit color plus dithering. Unless what Windows is reporting is wrong, AMD's output is a much better quality HDR10 image at refresh rates below 144Hz.
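
    Rough numbers on that chain of reasoning (assumed formats, one full-screen write per frame, overdraw and compression ignored; just to show the scale):

    Code:
    levels_8bit = 2 ** 8    # 256 shades per channel
    levels_10bit = 2 ** 10  # 1024 shades per channel
    print(levels_8bit ** 3, levels_10bit ** 3)  # ~16.7M vs ~1.07B representable colours

    W, H, FPS = 3840, 2160, 144
    for name, bytes_per_pixel in (("32-bit target (RGBA8 / RGB10A2)", 4),
                                  ("64-bit FP16 HDR target", 8)):
        gbs = W * H * bytes_per_pixel * FPS / 1e9
        print(f"{name}: ~{gbs:.1f} GB/s of raw colour writes at 4K {FPS} Hz")
    # Real traffic is several times this with overdraw and blending, which is why the
    # achievable compression ratio on wider formats actually shows up in benchmarks.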

    It has been a while since I went looking at mipmaps; I will have to get back to that. If Nvidia and AMD have kept their relative LOD settings, then AMD will have a sharper image in general.
     
  28. Zam15

    Zam15 n00b

    Messages:
    50
    Joined:
    Jul 27, 2015
    deruberhanyok and Boil like this.
  29. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    5,761
    Joined:
    Sep 24, 2001
  30. Boil

    Boil [H]ard|Gawd

    Messages:
    1,220
    Joined:
    Sep 19, 2015
    But where is the Nano version...?!?
     
  31. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,135
    Joined:
    Oct 24, 2014
    I have a sneaky feeling there won't be one this generation.
     
  32. Boil

    Boil [H]ard|Gawd

    Messages:
    1,220
    Joined:
    Sep 19, 2015
    Maybe AMD could be convinced to do a limited run (5 to 10k) of Nano cards with pre-installed full-cover water blocks...!

    (...super doubtful...)
     
    Last edited: Jan 23, 2019
    Maddness likes this.
  33. Zam15

    Zam15 n00b

    Messages:
    50
    Joined:
    Jul 27, 2015
    Well, looks like Reddit, Tom's Hardware, VideoCardz, and Guru3D are all now running with the ASRock Radeon VII news.

    You saw it here first, folks ;) Guess they don't give credit, lol.
     
  34. pfc_m_drake

    pfc_m_drake [H]ard|Gawd

    Messages:
    1,223
    Joined:
    Jan 7, 2004
    I've become sort of an amateur/hobbyist vlogger.
    So I'll be buying one for the 'work productivity' aspect.
    Plus I have a 4K Freesync monitor, so there's that...
     
  35. Neapolitan6th

    Neapolitan6th Gawd

    Messages:
    975
    Joined:
    Nov 18, 2016
    Vega VII:
    1400MHz base. 1750MHz boost.

    Vega 64 Sapphire Nitro:
    1373MHz base. 1580MHz boost. (They offered a 1673MHz boost on their best liquid cooled card)
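
    For context, rough peak FP32 at those boost clocks (assuming the usual shaders x 2 ops x clock formula, with 3840 shaders for the VII and 4096 for Vega 64; paper numbers only):

    Code:
    def tflops(shaders, clock_mhz):
        """Peak FP32: shaders x 2 FLOPs per clock x clock."""
        return shaders * 2 * clock_mhz / 1e6

    print(tflops(3840, 1750))  # ~13.4 TFLOPS - Radeon VII at its 1750 MHz boost
    print(tflops(4096, 1580))  # ~12.9 TFLOPS - Vega 64 Nitro at 1580 MHz
    print(tflops(4096, 1673))  # ~13.7 TFLOPS - a Vega 64 at the quoted 1673 MHz boost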
     
  36. Bawjaws

    Bawjaws Limp Gawd

    Messages:
    319
    Joined:
    Feb 20, 2017
    So these AIB VIIs are just the AMD reference design, then?
     
  37. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,135
    Joined:
    Oct 24, 2014
    So far, yes. I don't know if it has been confirmed that there will be non-reference versions, only that AIBs will sell the cards.
     
  38. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    5,761
    Joined:
    Sep 24, 2001
    Even if they're limited to the reference PCB, there also haven't been any alterations to the cooling (at least there don't appear to be in either of these linked cards), which is disappointing. Perhaps there wasn't enough time between R&D and production for AIBs to do it. Hopefully we'll see some of that in the future.
     
  39. Neapolitan6th

    Neapolitan6th Gawd

    Messages:
    975
    Joined:
    Nov 18, 2016
    At least it's not a blower. It'll be interesting to see if 7nm Vega will be more or less power hungry than original Vega.
     
    Last edited: Jan 25, 2019
    Maddness likes this.