AMD announces Radeon Pro Vega II Duo, a dual Vega 20 graphics card

Discussion in 'HardForum Tech News' started by Pieter3dnow, Jun 4, 2019.

  1. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,790
    Joined:
    Jul 29, 2009
    Darth Kyrie, ChadD and Neapolitan6th like this.
  2. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,257
    Joined:
    Sep 13, 2008
  3. cyklondx

    cyklondx Limp Gawd

    Messages:
    135
    Joined:
    Mar 19, 2018
    Very interesting solution, though it seems it was made specifically so it can't be used in anything other than what Apple had in mind.
     
    lRaphl likes this.
  4. ND40oz

    ND40oz [H]ardForum Junkie

    Messages:
    11,342
    Joined:
    Jul 31, 2005
    It's just like Apple to finally release another upgradeable Mac Pro after 7 years and then introduce a new and improved PCIe connector with it. It'll be interesting to see what other GPUs it supports out of the box and what other MPX modules they offer as upgrades.
     
  5. auntjemima

    auntjemima [H]ardness Supreme

    Messages:
    4,619
    Joined:
    Mar 1, 2014
    Pretty cool to see a new dual GPU!
     
    lostinseganet and Darth Kyrie like this.
  6. Zepher

    Zepher [H]ipster Replacement

    Messages:
    16,763
    Joined:
    Sep 29, 2001
    If the new monitor stand costs $1000, I wonder what the heck this card will cost.
     
  7. YeuEmMaiMai

    YeuEmMaiMai [H]ardForum Junkie

    Messages:
    14,394
    Joined:
    Jun 11, 2004
    Price tag: in orbit. But that card sounds like a beast...
     
  8. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,361
    Joined:
    Feb 11, 2013
    This was a huge rofl... a $4,999 monitor that doesn't include a stand...
     
  9. Zepher

    Zepher [H]ipster Replacement

    Messages:
    16,763
    Joined:
    Sep 29, 2001
    I am guessing a fully configured Mac Pro is going to run around $40k
     
  10. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    191
    Joined:
    Oct 23, 2018
    So it's just two Vega IIs in Crossfire? What's so magical about this? NVLink is 20% faster than Infinity Fabric Link, yet people say even that speed makes it too slow to be useful. This would be like AT&T bragging about how their dialup technology is so much faster than 2G: it's both false, and even the better of the two isn't useful.

    As far as I can tell, this is just two GPUs sharing two PCI slots. WGAF?

    Actually, let me amend that: this could be useful in data centers because of the much higher packing density. The problem, of course, is that it's also much higher thermal load density, so it might not net out any better even in a data center.
     
  11. GoodBoy

    GoodBoy [H]ard|Gawd

    Messages:
    1,341
    Joined:
    Nov 29, 2004
    2 PCIe connectors??

    Is this an old April Fools joke or something? It's like a pared-down Bitchin'Fast3D2000...
     
  12. [21CW]killerofall

    [21CW]killerofall Aliens...

    Messages:
    2,910
    Joined:
    Mar 16, 2006
    My money is on the only reason there's a second, specialized PCIe slot being simply so that you have to use their cards and can't use the card in a different system. I bet there is no performance gain over normally Crossfired cards. Typical Apple and their specialized hardware, for no reason other than to make repairs, upgrades, and hackintoshes harder.
     
  13. idiomatic

    idiomatic [H]Lite

    Messages:
    67
    Joined:
    Jan 12, 2018
    It's a 4-slot card, so it gains... nothing. You can put two of 'em in and get 4 Vegas, tho. With 32GB each...

    Then Apple dumps x86 in a year or two and yer fucked.
     
  14. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,849
    Joined:
    Apr 22, 2006
    It looks like the second connector is really only there for power. It handles something like 475 watts without power cables. Can't have ugly cables connecting to Apple cards...

    Here is the actual AMD announcement:
    https://www.amd.com/en/press-releas...ance-amd-radeon-gpus-to-power-all-new-mac-pro

    I wonder if they will bother with a Duo card for non-Macs.
     
    Last edited: Jun 4, 2019
    BlueFireIce, AceGoober and Armenius like this.
  15. cyklondx

    cyklondx Limp Gawd

    Messages:
    135
    Joined:
    Mar 19, 2018
    I doubt it; the price alone would make grown men cry. It would likely be around $1500-1600, likely more, since they'd need to put an AIO on it.
     
    Armenius likes this.
  16. power666

    power666 [H]Lite

    Messages:
    65
    Joined:
    Jun 23, 2018
    The key thing here versus normal PC cards is that this has the Infinity Fabric bridge to link up to 4 GPUs together. NVLink does scale up to this number, but only in Nvidia's own DGX Station for graphics workloads (for pure compute, it can scale up to 32 Teslas). So outside of that one Nvidia system, this may be the fastest GPU setup that could be used for gaming. Of course, no sane person would purchase this system for gaming, but I am very curious what frame rates this thing could produce. Perhaps enough to actually game on that new 6K display?

    The other neat thing about this card, which I do eventually see being adopted on the PC side, is the onboard TB3 controllers. With USB4 incorporating TB3, I see this becoming a trend in the long run.
     
  17. alxlwson

    alxlwson You Know Where I Live

    Messages:
    5,875
    Joined:
    Aug 25, 2013
    Looks like a beast of an A/V workstation for content creation. That 475W slot was 1000% a move to keep it proprietary. Give it time, though; someone will make a cable, unless there is more than just power in that second finger.
     
  18. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,257
    Joined:
    Sep 13, 2008
    Wouldn't be surprised if someone creates a PCIe adapter to connect it to standard PCIe with two 8-pin connectors on the adapter, or if that's actually what it's designed for in the first place.
     
    alxlwson likes this.
  19. GoodBoy

    GoodBoy [H]ard|Gawd

    Messages:
    1,341
    Joined:
    Nov 29, 2004
    Ahh, so it's a special card for Apple workstations only. That seems like pretty relevant information missing from the thread title.
     
  20. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,481
    Joined:
    Jun 13, 2003
    Well, it's a fast GPU setup, and you can probably use it for gaming- but it won't ever be the fastest GPU setup for gaming ;)

    I'd be far less worried about framerates than about frametimes. This is the Achilles' heel of multi-GPU gaming. Trying to sync up four GPUs would generally result in enough frametime variance for the system to feel slower than just one card, and then there's the input lag.

    [and good luck getting gaming software support for a niche within a niche within a niche within a niche...]
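    To make the framerate-vs-frametime point concrete, here is a toy sketch with made-up numbers (not real benchmark data): two setups can have identical average FPS while one of them micro-stutters badly.

    ```python
    # Toy illustration (synthetic numbers, not real GPU benchmarks): an FPS
    # graph averages away frametime variance, which is what you actually feel.
    from statistics import mean, pstdev

    # Frametimes in milliseconds over 8 frames; both average 16.0 ms.
    single_gpu = [16, 16, 16, 16, 16, 16, 16, 16]
    quad_gpu   = [8, 24, 8, 24, 8, 24, 8, 24]   # classic micro-stutter pattern

    def avg_fps(frametimes_ms):
        """Average framerate implied by a list of frametimes."""
        return 1000.0 / mean(frametimes_ms)

    print(avg_fps(single_gpu))   # 62.5
    print(avg_fps(quad_gpu))     # 62.5 -- identical on an FPS graph
    print(pstdev(single_gpu))    # 0.0 ms of frametime deviation
    print(pstdev(quad_gpu))      # 8.0 ms -- felt as stutter, invisible in FPS
    ```

    Both traces render 62.5 frames per second on average, but only the second one alternates between 8 ms and 24 ms frames, which is exactly the kind of variance that makes multi-GPU feel worse than a single card.
    
    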
     
    Armenius, dvsman and BlueFireIce like this.
  21. geok1ng

    geok1ng 2[H]4U

    Messages:
    2,135
    Joined:
    Oct 28, 2007
    Give us machine learning GPUs and then we'll call it real competition.
    There was a time when I purchased GPUs looking at an FPS graph.
    Now I only look at how fast it runs Leela Chess Zero.
     
  22. vegeta535

    vegeta535 2[H]4U

    Messages:
    2,966
    Joined:
    Jul 19, 2013
    You're cute, thinking that monitor from Apple is going to cost only $1k. The stand alone is supposedly $1k.
     
    Last edited: Jun 4, 2019
  23. Nobu

    Nobu 2[H]4U

    Messages:
    3,036
    Joined:
    Jun 7, 2007
    'tis what he said. ;)
     
  24. cyklondx

    cyklondx Limp Gawd

    Messages:
    135
    Joined:
    Mar 19, 2018

    All in all, those links don't matter in compute scenarios or render farms. They only matter to the "I have so much $$$, take my money" home user.
    The limit of 4 links is likely a limit of the motherboard rather than of what they could've done; it was designed in such a way that 4 would be the limit for the home system it's built for.

    compute/render farms
    The preference is to treat each GPU as a separate processor. This way you can squeeze more power out of them while crunching data, instead of going through any kind of linking system, which adds latency and unneeded transfer of data.

    *It's bad when your CPUs talk to each other, because it means there's data one needs that sits in the other CPU's memory. That adds a lot of latency. The same goes for any GPU-to-GPU talk.

    That's why I think this is mainly aimed at some Maya artists or the like who will generate scenes that eat over 128GB of texture memory per rendered scene. (And Apple only wanted to spend more money, since AMD has a solution to add even a couple of TBs as expandable VRAM via SSD; it came with the Radeon Pro SSG, where you could connect NVMe storage to the GPU.)

    // Big render farms will run on CPUs anyway, not GPUs, mainly due to the limiting factor of VRAM. (Though AMD got around that with the Pro SSG card, as mentioned above.)
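    The "each GPU as a separate processor" idea above can be sketched in a few lines: independent frames are statically split across devices so no device ever needs another device's data. This is a hypothetical illustration; `partition_round_robin` and the device count are stand-ins, not any real render-farm API.

    ```python
    # Minimal sketch of link-free multi-GPU work distribution: partition
    # independent jobs (frames) across devices so there is no GPU-to-GPU
    # traffic and therefore no link latency to pay.
    def partition_round_robin(frames, num_devices):
        """Assign frame i to device i % num_devices. Each bucket is fully
        independent, so devices never exchange data mid-job."""
        buckets = [[] for _ in range(num_devices)]
        for i, frame in enumerate(frames):
            buckets[i % num_devices].append(frame)
        return buckets

    frames = list(range(10))                     # ten independent frames
    buckets = partition_round_robin(frames, 4)   # e.g. four Vega GPUs
    print(buckets)   # [[0, 4, 8], [1, 5, 9], [2, 6], [3, 7]]
    ```

    The trade-off is the one the post describes: this scales cleanly only while each job fits in one device's VRAM, which is why farms fall back to CPUs (or tricks like the Pro SSG) for huge scenes.
    
    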
     
    IdiotInCharge likes this.
  25. alxlwson

    alxlwson You Know Where I Live

    Messages:
    5,875
    Joined:
    Aug 25, 2013

    Any modern GPU can be used for machine learning. It's all about the software written for whatever uarch. Or do you simply mean a GPU that doesn't have outputs on the back?
     
    Armenius likes this.
  26. ChadD

    ChadD 2[H]4U

    Messages:
    3,963
    Joined:
    Feb 8, 2016
    It's actually a steal.

    That monitor isn't intended for you unless you're doing Hollywood-style final edits and colour correction.

    It will compete very well against pro-level reference displays costing $30k... reference displays go as high as $50k and $60k. On the high end, I'm sure those displays are still better than what Apple is cooking. Having said that, there are tons of video pros using lesser displays, with only the final mastering pro having a crazy reference display. Reference displays almost never include a stand, as they are often mounted into custom editing racks with editing boards etc.

    Apple is putting reference-quality colour and contrast in the hands of the VFX artists and colourists earlier in the chain (the people currently using non-reference-quality $2-3k monitors).

    I expect Apple will sell these as fast as they can make them.
     
  27. ChadD

    ChadD 2[H]4U

    Messages:
    3,963
    Joined:
    Feb 8, 2016
    It's really too bad AMD and Apple didn't add SSG memory to these cards. That's what they're missing: a TB of SSG. Seeing as these machines and their new monitor are clearly aimed at video / VFX workstation use, it would have been a great option.
     
    Darth Kyrie and wolfofone like this.
  28. Lakados

    Lakados [H]ard|Gawd

    Messages:
    1,491
    Joined:
    Feb 3, 2014
    Agreed. I really want to know who they are sourcing the panels from, and I am very interested in what they are using as a control board. That level of signal processing uses a lot of juice and tends to run hot; being able to run a passive cooler on it is a tad impressive at this stage.
     
  29. ChadD

    ChadD 2[H]4U

    Messages:
    3,963
    Joined:
    Feb 8, 2016
    I think I read somewhere they were using cooler-running blue LEDs or something... but I'm sure there is more to it. I'm looking forward to some good pro-level reviews of these. I don't need them, but I deal with a few clients who would for sure be in the market. For people looking into monitors like Sony's $30k+ BVM reference monitors, $5k seems insane.

    It's funny, I was reading an article where someone was commenting on the gasps in the crowd when Apple announced the specs and pricing on the Pro Display. I found it funny because I think they thought those people were shocked by the price in a bad way. This thing is TWENTY SEVEN THOUSAND dollars cheaper than a Sony BVM-X300, with better specs. The Sony is OLED and looks great, but it can't handle hours of work either... the OLEDs get stupid hot (like you-will-burn-yourself-if-you-touch-the-screen hot), they need to be rested when operated at high colour temps, and burn-in is a real issue. Thirty-thousand-dollar monitors you can only use for an hour or two at a time.

    Lots of people can't wait to see how well Apple's monitor really competes in colour reproduction etc. I imagine every major VFX / video studio around is going to end up with plenty of these. The only question I have is whether they will be good enough to basically replace those insanely expensive OLEDs that make zero sense for the 8-12-hour-a-day artists and are only used for final mastering. If they are, Apple just upset the industry. (I suspect they will be seen as just a hair off from really replacing those mastering screens.)
     
  30. BlueFireIce

    BlueFireIce [H]ardness Supreme

    Messages:
    6,133
    Joined:
    May 29, 2008
    All of that, only to end up being played on a TV with default settings or the most blown-out settings it can manage.

    I am no screen snob, but with some of my friends, I can't watch anything on their TVs without my eyes hurting.

    While it's old, it's a high-end monitor from back in 2001, and it's a 4K screen. He does an overview of it and then a full teardown. It's interesting to see what goes into these monitors, as I'm not sure I have seen anyone do a real teardown of any new $18k+ monitor.



     
    Nolan7689 and ChadD like this.
  31. DeeFrag

    DeeFrag [H]ardness Supreme

    Messages:
    5,501
    Joined:
    Jan 14, 2005
    But will it run Crysis?
     
    lostinseganet likes this.
  32. oMek

    oMek Gawd

    Messages:
    607
    Joined:
    Jan 3, 2007
    Reminds me of the 4870 X2 with the "sideport" chip to boost speeds. It was sold as something that would be implemented "soon" via drivers, and then they never got it working.
     
    Dayaks likes this.
  33. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,481
    Joined:
    Jun 13, 2003
    We should expect Apple to get their new GPU 'working', at least for its intended uses, which are largely not real-time graphics.


    [And I do mean 'Apple' and not so much 'AMD'; this is a custom solution that will require direct OS and application development to support, something Apple alone is particularly adept at pulling off in a way that approaches 'accessible' to consumers.]
     
  34. pillagenburn

    pillagenburn Gawd

    Messages:
    946
    Joined:
    Oct 3, 2006
    This card, when running, can also be used on your snow covered driveway in place of a snow blower, plow or shovel.
     
  35. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,257
    Joined:
    Sep 13, 2008
    LTT just posted a video talking about the Mac Pro and the Radeon Vega II cards and how they're supposed to work. It kinda explains why they went with the design they did, even if it's a stupid idea. I recommend watching the whole thing, btw; this is probably the worst case of brand milking by Apple I've seen in a long-ass time.

    Oh, and there are no fans on the cards themselves; everything in the system is cooled by three front-mounted chassis fans.
     
    Jim Kim likes this.
  36. geok1ng

    geok1ng 2[H]4U

    Messages:
    2,135
    Joined:
    Oct 28, 2007
    I am sorry, but real-life projects like Leela Chess Zero experienced exponential performance increases in machine learning results after tensor cores were introduced.

    Lc0 is a perfect project for comparing machine learning performance among GPUs because:

    -Fully open source
    -Runs on Linux and Windows
    -Had code paths for OpenCL, BLAS and CUDA before cuDNN was launched

    The numbers speak volumes.
    https://www.phoronix.com/scan.php?page=news_item&px=LCZero-NVIDIA-Benchmarks

    There are no magic programmer skills that can make up for the tensor cores' ability to run a 4x4 matrix multiply-add in a single GPU clock.
    A 2060 running cuDNN is more than twice as fast as a 1080 Ti running CUDA.
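    For readers unfamiliar with what a tensor core actually computes: it performs a fused matrix multiply-add, D = A @ B + C, on small (e.g. 4x4) tiles, typically with FP16 inputs and FP32 accumulation. A minimal NumPy sketch of that math (just an illustration; the hardware does the whole tile per clock rather than the 64 scalar FMAs a classic shader-core path would issue):

    ```python
    # Sketch of the tensor-core primitive: D = A @ B + C on a 4x4 tile,
    # half-precision inputs, single-precision accumulate.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)).astype(np.float16)   # FP16 input
    B = rng.standard_normal((4, 4)).astype(np.float16)   # FP16 input
    C = np.zeros((4, 4), dtype=np.float32)               # FP32 accumulator

    # Multiply in (upconverted) single precision and accumulate into C.
    D = A.astype(np.float32) @ B.astype(np.float32) + C

    # A 4x4 matrix multiply alone is 4*4*4 = 64 scalar multiply-adds,
    # which is why doing the tile in one clock is such a large speedup.
    scalar_fmas = 4 * 4 * 4
    print(D.shape, scalar_fmas)   # (4, 4) 64
    ```

    That per-clock throughput gap, not any driver cleverness, is what the Lc0 benchmarks above are measuring.
    
    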
     
    Last edited: Jun 5, 2019
    IdiotInCharge and Armenius like this.
  37. alxlwson

    alxlwson You Know Where I Live

    Messages:
    5,875
    Joined:
    Aug 25, 2013

    Tensor cores don't make a GPU a "machine learning card".
     
  38. Oldmodder

    Oldmodder Gawd

    Messages:
    680
    Joined:
    Aug 24, 2018
    I have to admit, routing power to graphics cards in a PC is somewhat of a nightmare to me, really, when you are as anal about something like that as I am, at least when I'm modding.
    And I am not looking forward to moving my current system over into its destination case, because I will of course have to put my little twist on things, even if, as usual, I don't go pimp-daddy overboard with my modding.

    I really wish they would make graphics cards so long they hang out over the edge of the mobo, and then put those damn PCIe connectors on the bottom side of the card so I don't have to know they are there.
    Will probably have to make my own cables... again.
     
  39. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,790
    Joined:
    Jul 29, 2009
    I thought Apple users don't have to know how things work; that's the premise they buy Apple products on...
     
  40. Auer

    Auer Gawd

    Messages:
    639
    Joined:
    Nov 2, 2018
    When you make enough money, you just need it to work. High-salary creatives don't have to be tech nerds. I don't think the new Mac Pro is marketed towards the iPhone baristas.