AMD announces Radeon Pro Vega II Duo, a dual Vega 20 graphics card

https://videocardz.com/80956/amd-announces-radeon-pro-vega-ii-duo-a-dual-vega-20-graphics-card

AMD and Apple developed a new PCI-Express connector, which supports up to 475W of power. Alongside the new connector, the Radeon Pro Vega II Duo features an Infinity Fabric Link capable of carrying 84 GB/s of data between the GPUs. More importantly, the Radeon Pro Vega II Duo features an onboard Thunderbolt 3 connector.


[Image: AMD Radeon Pro Vega II Duo]
 
Very interesting solution, though it seems it was made specifically so it cannot be used in anything other than what Apple had in mind.
 
It's just like Apple to finally release another upgradeable Mac Pro after 7 years and then introduce a new and improved PCIe connector with it. It'll be interesting to see what other GPUs it supports out of the box and what other MPX modules they offer as upgrades.
 
I am guessing a fully configured Mac Pro is going to run around $40k
 
So it's just two Vega IIs in CrossFire? What's so magical about this? NVLink is 20% faster than Infinity Fabric Link, yet people say even that speed makes it too slow to be useful. This would be like AT&T bragging about how their dialup technology is so much faster than 2G: it's both false, and even the better of the two isn't useful.

As far as I can tell, this is just two GPUs sharing two PCI slots. WGAF?

Actually, let me amend that: this could be useful in data centers because of the much higher packing density. The problem, of course, is that it's also much higher thermal load density, so it might not net out any better even in a data center.
 
My money is on the only reason for the 2nd, specialized PCIe slot being that you have to use their cards and can't use the card in a different system. I bet there is no performance gain over normal CrossFired cards. Typical Apple and their specialized hardware, for no reason other than to make repairs, upgrades, and hackintoshes harder.
 

It's a 4-slot card, so it gains... nothing. You can put two of 'em in and get 4 Vegas though, with 32 GB each...

Then Apple dumps x86 in a year or two and yer fucked.
 
2 PCIe connectors??

Is this an old April Fools' joke or something? It's like a pared-down Bitchin'Fast3D2000...

It looks like the second connector is really only there for power. It handles something like 475 watts without power cables. Can't have ugly cables connecting to Apple cards...

Here is the actual AMD announcement:
https://www.amd.com/en/press-releas...ance-amd-radeon-gpus-to-power-all-new-mac-pro

I wonder if they will bother with a Duo card for non-Macs.
 
I doubt it; the price alone would make grown men cry. It would likely be around $1,500-1,600, likely more since they'd need to put an AIO on it.
 
The key thing here versus normal PC cards is that this has the Infinity Fabric bridge to link up to 4 GPUs together. NVLink does scale up to this number, but that is only permitted in Nvidia's own DGX Station for graphics workloads (for pure compute, it can scale up to 32 Teslas). So outside of that one Nvidia system, this may be the fastest GPU setup that could be used for gaming. Of course, no sane person would purchase this system for gaming, but I am very curious what frame rates this thing could produce. Perhaps enough to actually game on that new 6K display?

The other neat thing about this card, which I do eventually see being adopted on the PC side, is the onboard TB3 controllers. With USB4 incorporating TB3, I see this becoming a trend in the long run.
 
Looks like a beast of an A/V workstation for content creation. That 475W slot was 1000% a move to keep it proprietary. Give it time though; someone will make a cable, unless there is more than just power in that second finger.
 

Wouldn't be surprised if someone creates a PCIe adapter to connect it to a standard PCIe slot with two 8-pin connectors on the adapter, or if that's actually what it's designed for in the first place.
 

Ahh, so it's a special card for Apple workstations only. That seems like pretty relevant information missing from the thread title.
 
this may be the fastest GPU setup that could be used for gaming

Well, it's a fast GPU setup, and you can probably use it for gaming- but it won't ever be the fastest GPU setup for gaming ;)

but I am very curious what frame rates this thing could produce

I'd be far less worried about framerates than I would be about frametimes. This is the Achilles' heel of multi-GPU gaming. Trying to sync up four GPUs would generally result in enough frametime variance for the system to feel slower than just one card, and then there's the input lag (rough numbers below).

[and good luck getting gaming software support for a niche within a niche within a niche within a niche...]
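Just to put numbers on the frametime point, here's a rough sketch in plain Python with made-up frametimes (nothing measured on this card, obviously) showing how a multi-GPU run can post a higher average FPS and still feel worse once you look at the spikes:

Code:
import statistics

# Hypothetical frametimes in milliseconds; the averages look similar,
# but the pacing is very different.
single_gpu = [16.6, 16.8, 16.5, 16.7, 16.6, 16.9, 16.5, 16.6]   # steady
quad_gpu   = [8.0, 28.0, 8.5, 27.0, 8.0, 26.5, 8.0, 10.0]       # alternating fast/slow

def summarize(name, frametimes_ms):
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    worst_fps = 1000.0 / max(frametimes_ms)
    spread = statistics.stdev(frametimes_ms)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps, "
          f"frametime stdev {spread:.1f} ms")

summarize("single GPU", single_gpu)
summarize("quad GPU", quad_gpu)
# The quad-GPU run wins on average FPS, but the ~27 ms spikes are
# what you actually feel as stutter.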
 
content creation

Give us machine learning GPUs and then we can call it a real competition.
There was a time when I purchased GPUs looking at an FPS graph.
Now I only look at how fast they run Leela Chess Zero.
 
If the new Monitor stand costs $1000, I wonder what the heck this card will cost.
You're cute, thinking that monitor from Apple is going to cost only $1k. The stand alone is supposedly $1k.
 
The key thing here versus normal PC cards is that this has the Infinity Fabric bridge to link up to 4 GPUs together.


All in all, those links don't matter in compute scenarios or render farms. They only matter to the "I have so much $$$, take my money" home user.
The limit of 4 links is likely a limit of the motherboard rather than of what they could have done - it was designed so that 4 would be the limit for the home system it's meant for.

Compute/render farms prefer to treat each GPU as a separate processor. This way you can squeeze more performance out of them while crunching data, instead of going through any kind of linking system that adds latency and unneeded data transfers. (See the rough sketch below.)

*It's bad when your CPUs talk to each other, because it means the data one needs is sitting in the other CPU's memory; that adds a lot of latency. The same goes for any GPU-to-GPU talk.

That's why I think this is mainly aimed at Maya artists or similar, who will generate scenes that eat over 128 GB of texture memory per render. (And Apple only wanted to spend more money - AMD already has a solution to add a couple of TB of expandable VRAM via SSD; it came with the Radeon Pro SSG, where you could connect NVMe storage directly to the GPU.)

// Big render farms will run on CPUs anyway, not GPUs, mainly due to the limiting factor of VRAM (though AMD got around that with the Pro SSG card, as mentioned above).
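Rough sketch of what I mean by treating each GPU as a separate processor - assuming PyTorch here just because it's convenient (nothing to do with Apple's or AMD's actual software stack): each GPU gets its own independent chunk of work, so link bandwidth between the cards never enters the picture.

Code:
import torch

def crunch_chunk(chunk, device):
    # Stand-in for a real per-tile / per-frame workload.
    return (chunk.to(device) ** 2).sum().item()

def run_independent(jobs):
    # One chunk per GPU, round-robin assignment, zero GPU-to-GPU traffic.
    devices = [f"cuda:{i}" for i in range(torch.cuda.device_count())] or ["cpu"]
    return [crunch_chunk(job, devices[i % len(devices)]) for i, job in enumerate(jobs)]

if __name__ == "__main__":
    jobs = [torch.randn(1024, 1024) for _ in range(8)]
    print(run_independent(jobs))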
 
Give us machine learning GPUs and then we can call it a real competition. Now I only look at how fast they run Leela Chess Zero.


Any modern GPU can be used for machine learning. It's all about the software that is written for whatever uarch. Or do you simply mean a GPU that doesn't have outputs on the back?
 
This was a huge rofl... a $4,999 monitor that doesn't even include the stand...

It's actually a steal.

That monitor isn't intended for you unless you're doing Hollywood-style final edits and colour correction.

It will compete very well against pro-level reference displays costing $30k... reference displays go up as high as $50k and $60k. On the high end I'm sure those displays are still better than what Apple is cooking. Having said that, there are tons of video pros using lesser displays... with only the final mastering pro having a crazy reference display. Reference displays almost never include a stand, as they are often mounted into custom editing racks with editing boards etc.

Apple is putting reference-quality colour and contrast in the hands of the VFX artists and colourists earlier in the chain (the people currently using non-reference-quality $2-3k monitors).

I expect Apple will sell these as fast as they can make them.
 
AMD already has a solution to add a couple of TB of expandable VRAM via SSD; it came with the Radeon Pro SSG, where you could connect NVMe storage directly to the GPU.

It's really too bad AMD and Apple didn't add SSG memory to these cards; a TB of SSG is exactly what they're missing. Seeing as these machines and their new monitor are clearly aimed at video/VFX workstation use, it would have been a great option.
 
Apple is putting reference-quality colour and contrast in the hands of the VFX artists and colourists earlier in the chain. I expect Apple will sell these as fast as they can make them.
Agreed. I really want to know who they are sourcing the panels from, and I am very interested in what they are using as a control board. That level of signal processing uses a lot of juice and tends to run hot; being able to run a passive cooler on it is a tad impressive at this stage.
 

I think I read somewhere they were using cooler-running blue LEDs or something... but I'm sure there is more to it. I'm looking forward to some good pro-level reviews of these. I don't need them, but I deal with a few clients that would for sure be in the market. For people looking into monitors like Sony's $30k+ BVM reference monitors, $5k seems insanely cheap.

It's funny; I was reading an article where someone was commenting on the gasps in the crowd when Apple announced the specs and pricing on the Pro Display. I found it funny because I think they assumed those people were shocked by the price in a bad way. This thing is TWENTY SEVEN THOUSAND dollars cheaper than a Sony BVM-X300, with better specs. The Sony is OLED and looks great, but it can't handle hours of work either... the OLEDs get stupid hot (like you-will-burn-yourself-if-you-touch-the-screen hot), they need to be rested when they are operated at high colour temps, and burn-in is a real issue. Thirty-thousand-dollar monitors you can only use for an hour or two at a time.

Lots of people can't wait to see how well Apple's monitor really competes in colour reproduction etc. I imagine every major VFX/video studio around is going to end up with plenty of these. The only question I have is whether they will be good enough to basically replace those insanely expensive OLEDs that make zero sense for the 8-12-hour-a-day artists and are only used for final mastering. If they are, Apple just upset the industry. (I suspect they will be seen as just a hair off from really replacing those mastering screens.)
 

All that reference-quality colour, only to end up being played on a TV with default or the most blown-out settings it can manage.

I am no screen snob, but with some of my friends, I can't watch anything on their TVs without my eyes hurting.


While it's old, it's a high-end monitor from back in 2001, and it's a 4K screen. He does an overview of it and then a full teardown. It's interesting to see what goes into these monitors, as I am not sure I have seen anyone do a real teardown of any new $18k+ monitor.



 
Reminds me of the 4870 X2 with the "sideport" chip to boost speeds. It was sold on the promise that it would be enabled "soon" via drivers, and then they never got it working.
 

We should expect Apple to get their new GPU 'working', at least for its intended uses, which are largely not real-time graphics.


[and I do mean to say 'Apple' and not so much 'AMD'; this is a custom solution that will require direct OS and application development to support, something Apple alone is particularly adept at pulling off at anything that approaches 'accessible' to consumers]
 
This card, when running, can also be used on your snow covered driveway in place of a snow blower, plow or shovel.
 
LTT just posted a video talking about the Mac Pro and the Radeon Vega II cards and how they're supposed to work. Kinda explains why they went with the design they did, even if it's a stupid idea. Recommend watching the whole thing btw; this is probably the worst case of brand milking by Apple I've seen in a long-ass time.





Oh, and there are no fans on the cards themselves; everything in the system is cooled by three front-mounted chassis fans.
 
It's all about the software that is written for whatever uarch.

I am sorry, but real-life projects like Leela Chess Zero experienced dramatic performance increases on machine learning workloads after tensor cores were introduced.

Lc0 is a perfect project for comparing machine learning performance among GPUs because it is:

-Fully open source
-Runs on Linux and Windows
-Had code paths for OpenCL, BLAS and CUDA before its cuDNN backend was added

The numbers speak for themselves.
https://www.phoronix.com/scan.php?page=news_item&px=LCZero-NVIDIA-Benchmarks

There are no magic programming skills that can make up for the tensor cores' ability to run a 4x4 matrix multiply-accumulate in a single GPU clock.
A 2060 running the cuDNN backend is more than twice as fast as a 1080 Ti running plain CUDA.
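If anyone wants to see the gap on their own card, here's a rough PyTorch timing sketch (my own illustration, not anything from the Lc0 code): FP16 matmuls of this shape are routed to the tensor cores on Volta/Turing, while FP32 stays on the ordinary CUDA cores.

Code:
import time
import torch

def time_matmul(dtype, size=4096, iters=20):
    # FP16 matmuls hit the tensor cores on Volta/Turing; FP32 does not.
    a = torch.randn(size, size, device="cuda", dtype=dtype)
    b = torch.randn(size, size, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return (time.time() - start) / iters

if torch.cuda.is_available():
    print(f"FP32: {time_matmul(torch.float32) * 1000:.1f} ms per matmul")
    print(f"FP16: {time_matmul(torch.float16) * 1000:.1f} ms per matmul")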
 


Tensor cores alone don't make a GPU a "machine learning card", though.
 
Can't have ugly cables connecting to Apple cards

I have to admit, routing power to GFX cards in a PC is somewhat of a nightmare for me - especially when you are as anal about that sort of thing as I am, at least when I'm modding.
And I am not looking forward to moving my current system over into its destination case, because I will of course have to put my little twist on things, even if, as usual, I don't go completely overboard with my modding.

I really wish they would make GFX cards so long they hang out over the edge of the mobo, and then put those damn PCIe connectors on the bottom side of the card so I don't have to know they are there.
Will probably have to make my own cables... again.
 
LTT just posted a video talking about the Mac Pro and the Radeon Vega II cards and how they're supposed to work. Kinda explains why they went with the design they did, even if it's a stupid idea.

I thought Apple users don't have to know how things work; that's the premise they buy Apple products on...
 

When you make enough money, you just need it to work. High-salary creatives don't have to be tech nerds. I don't think the new Mac Pro is marketed towards the iPhone baristas.
 