AMD x Apple unveil the Radeon Pro W6800X Duo

NattyKathy

[H]ard|Gawd
Exclusively for Mac Pro

[Image: AMD Radeon Pro W6800X Duo]


now that's a graphics card!

- 7680 shader cores (3840 × 2)
- "up to" 30.2 TFLOPS FP32, which suggests a max boost clock of roughly 1970 MHz (quick check below)
- 64GB GDDR6 (2 × 32GB)
- 400W TGP
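A quick sanity check on that clock estimate (a back-of-envelope sketch; it assumes the usual 2 FLOPs per shader per clock from FMA):

```swift
import Foundation

// Back-of-envelope: FP32 FLOPS = shaders × 2 FLOPs/clock (FMA) × clock.
// Solve for the clock that yields the advertised 30.2 TFLOPS.
let shaders = 3840 * 2                            // two cut-down Navi 21 dies
let tflops = 30.2
let clockHz = tflops * 1e12 / (Double(shaders) * 2.0)
print(String(format: "%.0f MHz", clockHz / 1e6))  // ≈1966 MHz
```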

Raw computational performance isn't much higher than the previous Vega II Duo since the CU count hasn't risen, but the newer architecture should provide a decent boost for 3D work. It's worth noting that memory-bound applications will likely still be better off on the HBM-equipped Vega. I wonder if/when AMD will offer CDNA-based workstation cards with HBM?

Any Mac Pro users here going to pick up one of these monsters for their content creation boxen?
 
These companies keep rubbing salt in the wound...
rest assured that the Navi21 dies going into these will be a drop in the bucket compared to total supply/demand
but yes, I am jealous too lol. We can barely buy regular 6800 cards and here they are teasing us with Apple-exclusive tech.
 
rest assured that the Navi21 dies going into these will be a drop in the bucket compared to total supply/demand
but yes, I am jealous too lol. We can barely buy regular 6800 cards and here they are teasing us with Apple-exclusive tech.
I'm not jealous, just extremely jaded with the industry. After hobby-building computers for the last 25+ years, I can't remember a time when I couldn't go to the local store and buy what I wanted off the shelf at reasonable prices, minus the brief DDR inflation back in 2000-ish.
 
I'm not jealous, just extremely jaded with the industry. After hobby-building computers for the last 25+ years, I can't remember a time when I couldn't go to the local store and buy what I wanted off the shelf at reasonable prices, minus the brief DDR inflation back in 2000-ish.
Same. I've "only" been in it 20 years but yeah, it's really unsettling to see the empty GPU shelves and "out of stock" everywhere online. I always took it for granted that new components would be available on demand and at MSRP.
 
Exclusively for Mac Pro ... now that's a graphics card!


This to me just looks like two separate video cards stuck together in a line, with some sort of custom PCIe connector.

Essentially two cards in Crossfire on one board.
 
I'm not jealous, just extremely jaded with the industry.

Better to be jaded and know that AMD is sidelining their best glass to make exclusive hardware for a major business partner than to believe that Nvidia is boosting production of hash-limited cards for gamers and sidelining broken glass to make mining hardware.
 
Better to be jaded and know that AMD is sidelining their best glass to make exclusive hardware for a major business partner than to believe that Nvidia is boosting production of hash-limited cards for gamers and sidelining broken glass to make mining hardware.
This isn't even the "best glass" tho... it's a pair of cut Navi 21s at low clock speeds. If it were a 2 × 80 CU part at 2.5GHz that would be one thing, but these are basically cut-down, laptop-binned chips running at sub-2GHz.
Not saying don't be jaded... be jaded... but these are unlikely to actually affect desktop Radeon supply.
 
Not saying don't be jaded... be jaded... but these are unlikely to actually affect desktop Radeon supply.

I'm sure these are still top-tier binned chips to hit the TDP, even at lower clocks. And AMD will have had to produce a large number at launch; in the grand scheme, yeah, it's probably just a blip because of the price, but if you were shopping for anything based on these parts, this is where they were going, and this is why the prices for what's available are what they are.

Especially since AMD can sell them for even more to Mac users.
 
Better to be jaded and know that AMD is sidelining their best glass to make exclusive hardware for a major business partner than to believe that Nvidia is boosting production of hash-limited cards for gamers and sidelining broken glass to make mining hardware.
Good point.
 
I'm sure these are still top-tier binned chips to hit the TDP, even at lower clocks. And AMD will have had to produce a large number at launch; in the grand scheme, yeah, it's probably just a blip because of the price, but if you were shopping for anything based on these parts, this is where they were going, and this is why the prices for what's available are what they are.

Especially since AMD can sell them for even more to Mac users.
Yeah, that's fair. The optics of launching another card when the existing ones can't be bought consistently yet aren't great.
 
Better to be jaded and know that AMD is sidelining their best glass to make exclusive hardware for a major business partner than to believe that Nvidia is boosting production of hash-limited cards for gamers and sidelining broken glass to make mining hardware.
It's a FirePro card. They'll sell a handful at a massive price (handful being relative of course) for specific use cases.
 
These companies keep rubbing salt in the wound...
How? You can buy a 6800 on Newegg right now. The prices are not good but there is availability if you want one. No reason for AMD not to make another version that will sell.
 
I was wondering if/when such a card was going to come out.
There have also been rumblings of second-gen "new" Mac Pro hardware using the latest Intel workstation chips.
The Mac Pro seems like it won't be migrated over to ARM until the VERY end of the transition window, and Intel will likely still be supported for a very long time.
I still suspect they will continue to use AMD graphics cards in their Mac Pro workstations. I honestly don't see a way for them to hit that level of graphics performance themselves, certainly not within a year.
 
I was wondering if/when such a card was going to come out.
There have also been rumblings of second-gen "new" Mac Pro hardware using the latest Intel workstation chips.
The Mac Pro seems like it won't be migrated over to ARM until the VERY end of the transition window, and Intel will likely still be supported for a very long time.
I still suspect they will continue to use AMD graphics cards in their Mac Pro workstations. I honestly don't see a way for them to hit that level of graphics performance themselves, certainly not within a year.
Same on the first part; it seemed inevitable, really. Now that Apple has a solid high-end workstation platform, I imagine they're not keen to ignore it like they did the trash can, and with AMD having a new high-end GPU chip...

I agree about Apple sticking with AMD for the high end / ultra high end. From what I've read, the GPU in the next iMac could offer >=xx50-level performance, but I'm not sure how ready/willing Apple is to scale their GPU all the way up yet. Plus, with Navi 31 set to be an MCM monster with an actual major core-count jump, I imagine Apple engineers are already working out how to stuff two of those on a card. I appreciate their willingness not only to make monstrosities like this, but to design the platform to take two :eek:
 
It is, as most/all dual-GPU cards have been.
To expand on this... the extra connector on MPX cards is physically a PCIe x16 slot, but electrically it carries a combination of Thunderbolt, DisplayPort, and power (MPX modules don't require extra power connectors). The individual GPUs are still multiplexed through a PLX switch like past AMD dual-GPU cards. I realize that could be confusing since Apple just reused the connector; the single-GPU cards have it too.
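For the curious, here's one way to see that in practice (a hedged sketch; it just shells out to macOS's system_profiler tool, whose output format is informal): each GPU behind the PLX enumerates as its own display adapter with its own "Chipset Model" entry.

```swift
import Foundation

// Count the GPUs macOS enumerates; a W6800X Duo should contribute two
// entries, since the PLX switch exposes each die as its own PCIe device.
let task = Process()
task.executableURL = URL(fileURLWithPath: "/usr/sbin/system_profiler")
task.arguments = ["SPDisplaysDataType"]
let pipe = Pipe()
task.standardOutput = pipe

do {
    try task.run()
    task.waitUntilExit()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    let output = String(data: data, encoding: .utf8) ?? ""
    // Each enumerated GPU prints one "Chipset Model" line.
    let gpuCount = output.components(separatedBy: "Chipset Model").count - 1
    print("GPUs visible to macOS: \(gpuCount)")
} catch {
    print("Could not run system_profiler: \(error)")
}
```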
 
I still suspect they will continue to use AMD graphics cards in their Mac Pro workstations. I honestly don't see a way for them to hit that level of graphics performance themselves, certainly not within a year.
I said that Apple is not a CPU/GPU manufacturer and that they'd eventually have to use other hardware to stay competitive, and here we are. That didn't take long. They took their technology from Imagination, which hasn't made a desktop GPU in nearly 20 years, so of course Apple would have to go with AMD for a high-performance GPU.
 
I said that Apple is not a CPU/GPU manufacturer and that they'd eventually have to use other hardware to stay competitive, and here we are. That didn't take long. They took their technology from Imagination, which hasn't made a desktop GPU in nearly 20 years, so of course Apple would have to go with AMD for a high-performance GPU.

This is for an Intel-based Mac Pro. Apple wasn't about to develop an in-house GPU for an x86 system it'll likely retire within about a year. You'll have a case to make if Apple is using an AMD GPU in an ARM-powered Mac Pro.
 
This is for an Intel-based Mac Pro. Apple wasn't about to develop an in-house GPU for an x86 system it'll likely retire within about a year. You'll have a case to make if Apple is using an AMD GPU in an ARM-powered Mac Pro.

PCIe is not ISA; it's not reliant on x86 at all. They could make one card for multiple architectures just by putting two BIOSes on the card with a simple switch, like overclocking cards have for bad-flash recovery or different overclocking profiles, except in this case it'd be for different architectures.

It was done back in the PowerPC days: companies would make one card and offer two different BIOS versions, one for x86 and one for PowerPC. It wasn't a simple switch, but you could flash the cards to either architecture and back again without too much trouble. Sometimes that necessitated swapping the physical ROM chips because of size differences, but later on, cut-down ROMs became available that negated the need for that most of the time.

But this would never happen, Apple hates their customers too much and uses spite to force obsolescence.
 
Whatever happened to SLI/CF being dead?
Eh. Certain use cases probably scale better than others. I'm content to leave it in the rear-view mirror: no support, terrible scaling, and now the usual double investment is itself doubled.

I'm getting soft in my old age.
 
Whatever happened to SLI/CF being dead?
They are dead. The dual-GPU MPX cards for Mac Pro aren't CrossFire, just two GPUs on a single board; they're pretty much exclusively intended for content creation, where applications can pool multiple GPUs' resources without worrying about low-latency frame syncing.
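To illustrate (a minimal sketch using Metal's public device-enumeration API; the pooling strategy itself is up to each app and is only described here in comments): each half of a Duo shows up as its own MTLDevice, and a pro app simply queues independent work on each rather than pairing them CrossFire-style.

```swift
import Metal

// Every GPU in the system, including both dies on a W6800X Duo,
// enumerates as a separate MTLDevice on macOS.
let devices = MTLCopyAllDevices()
for device in devices {
    let gib: UInt64 = 1 << 30
    let budgetGB = device.recommendedMaxWorkingSetSize / gib
    print("\(device.name): ~\(budgetGB) GB working-set budget")
}

// Pooling is then the app's job: one command queue per device, with
// independent chunks of work (tiles, frames, passes) split across them.
let queues = devices.compactMap { $0.makeCommandQueue() }
print("Command queues created: \(queues.count)")
```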
 
They are dead. The dual-GPU MPX cards for Mac Pro aren't CrossFire, just two GPUs on a single board; they're pretty much exclusively intended for content creation, where applications can pool multiple GPUs' resources without worrying about low-latency frame syncing.
This.
 
I have a 2019 Mac Pro, wonder if I should pick this up.
If you have anything other than the Vega II, it could be a worthwhile upgrade. Beyond that, it'd depend heavily on the kind of workloads you're handling.
 
would it not be far less complicated and simpler to just put two separate GPUs in the system then?
You can do that. That's why Apple lets you configure a Mac Pro with two discrete cards. The W6800X Duo is for creators who have a massive amount of GPU-based work and either want to stuff as many GPUs into the system as possible or save a PCIe slot for other purposes.
 
would it not be far less complicated and simpler to just put two separate GPUs in the system then?
Maybe, but having them on the same board simplifies cooling and power delivery and reduces the space required in the system. It also allows shared cache and lower-latency communication, if they chose to take advantage of those technologies.
 
This is for an Intel-based Mac Pro. Apple wasn't about to develop an in-house GPU for an x86 system it'll likely retire within about a year. You'll have a case to make if Apple is using an AMD GPU in an ARM-powered Mac Pro.

Given the impressive performance of the minuscule GPU on the M1 SoC, I wouldn't be surprised if Apple just goes with their own in-house design. It's not like they have a gaming pedigree to worry about, and the productivity performance of the M1 is second to none; even machines that draw triple or quadruple the power can't keep up.

There's no guarantee of linear scaling with M1, of course, but if it does scale well, Apple has nothing to worry about if they go entirely vertical in their future production.
 
Given the impressive performance of the minuscule GPU on the M1 SoC, I wouldn't be surprised if Apple just goes with their own in-house design. It's not like they have a gaming pedigree to worry about, and the productivity performance of the M1 is second to none; even machines that draw triple or quadruple the power can't keep up.

There's no guarantee of linear scaling with M1, of course, but if it does scale well, Apple has nothing to worry about if they go entirely vertical in their future production.
It's a big "maybe" at this point. There is talk of Apple launching pro machines with tons of GPU cores, but it would be a challenge to scale from the M1 (basically only a bit faster than Intel's Xe) to something on par with Radeon Pro boards.
 
You do realize that Apple would still have to pour development effort into two separate BIOSes and otherwise add significant support headaches, right?
Must be tough hiring a second team of engineers when your market cap's only 2 trillion dollars.

Edit: I mean, your point is taken, but it's not an argument that engenders a lot of sympathy.
 
It's a big "maybe" at this point. There is talk of Apple launching pro machines with tons of GPU cores, but it would be a challenge to scale from the M1 (basically only a bit faster than Intel's Xe) to something on par with Radeon Pro boards.
I think that's probably the right way to look at it. It's better than Xe (yay), but I can't see Apple conquering the world with their second-generation GPU. They've proven they can engineer a really good processor, but time will tell when it comes to the rest of their lineup.

I'm just generally excited about a brighter future for RISC.

Maybe more deluded than excited?

Who knows.
 
You do realize that Apple would still have to pour development effort into two separate BIOSes and otherwise add significant support headaches, right? That this would be somewhat wasted energy given the transition?
You make it sound like this is a problem Apple has to deal with and not AMD.
Please, drop the image of Apple as a cartoonish villain. You're not 10 anymore. You know there's a lot more to it than that, and that Apple could easily have practical reasons for not supporting the same GPU on two different processor architectures.
My dear Aurelius, Apple does not cloak themselves in good deeds to hide their villainy. They openly twirl their mustache. You just don't like to admit Apple's faults and limitations.
Given the impressive performance of the minuscule GPU on the M1 SoC, I wouldn't be surprised if Apple just goes with their own in-house design.
It's impressive when compared to Intel's current GPUs. :rolleyes: Barely beating AMD's five-year-old Vega APUs.
It's not like they have a gaming pedigree to worry about, and the productivity performance of the M1 is second to none; even machines that draw triple or quadruple the power can't keep up.
It isn't the GPU that makes the M1 good at productivity; it's the video encoder and decoder, which are really good but not related to GPU performance.
There's no guarantee of linear scaling with M1, of course, but if it does scale well, Apple has nothing to worry about if they go entirely vertical in their future production.
Because Apple has so much experience making large and powerful GPUs. By Apple I mean Imagination, because it's technically their GPU technology.
It's a big "maybe" at this point. There is talk of Apple launching pro machines with tons of GPU cores, but it would be a challenge to scale from the M1 (basically only a bit faster than Intel's Xe) to something on par with Radeon Pro boards.
I'd be shocked if Apple actually does this. They haven't even shown they can match AMD and Nvidia on laptops yet, let alone workstations.
 
Must be tough hiring a second team of engineers when your market cap's only 2 trillion dollars.

Edit: I mean, your point is taken, but it's not an argument that engenders a lot of sympathy.
Oh, I'm not asking folks to cry a river for Apple, just noting that there are likely practical concerns. Some people act as if Apple is Snidely Whiplash tying people to railroad tracks.
 
My dear Aurelius, Apple does not cloak themselves in good deeds to hide their villainy. They openly twirl their mustache. You just don't like to admit Apple's faults and limitations.

Apple has plenty of faults and limitations. I just wish people would take a more realistic approach to them instead of portraying Apple as either a saint or a demon. Apple does some cynical things, but the boring answer is that it either usually makes business sense or is a byproduct of Apple's sometimes very rigid philosophy. I'm more concerned about the companies that are willing to compromise too much, the Dells/HPs/Lenovos of the world, because those are the ones that try to make pretty horrible things seem normal, like bad build quality (batteries that wear out prematurely, flimsy plastic, that sort of thing) or lousy tech support.
 
Apple has plenty of faults and limitations. I just wish people would take a more realistic approach to them instead of portraying Apple as either a saint or a demon. Apple does some cynical things, but the boring answer is that it either usually makes business sense or is a byproduct of Apple's sometimes very rigid philosophy. I'm more concerned about the companies that are willing to compromise too much, the Dells/HPs/Lenovos of the world, because those are the ones that try to make pretty horrible things seem normal, like bad build quality (batteries that wear out prematurely, flimsy plastic, that sort of thing) or lousy tech support.

I own a 2019 Mac Pro and have had only MacBooks for the past 12 years, so I may be biased. But Apple being one of the richest companies in the world means they're doing something right. Perhaps this is due to the absolutely amazing customer service/tech support they have. They absolutely do not need enthusiasts like the people on this board; they sell plenty of products to mainstream consumers that simply just work, and quite frankly, that's who they really care about.

Just last week, I got a replacement iPhone from Apple. I texted Apple that my Bluetooth was acting up, and they sent a replacement within 2 days of texting them. Yes, Apple does support via text now, which is really great. AppleCare is simply hands-down the best warranty service in the industry. Also, their products retain their value much, much better than PC counterparts.

Apple may not have the fastest bleeding-edge technology, and their machines may be a few seconds slower than PC offerings, but they sure do beat the competition where it matters most.
 
I own a 2019 Mac Pro and have had only MacBooks for the past 12 years, so I may be biased. But Apple being one of the richest companies in the world means they're doing something right. Perhaps this is due to the absolutely amazing customer service/tech support they have. They absolutely do not need enthusiasts like the people on this board; they sell plenty of products to mainstream consumers that simply just work, and quite frankly, that's who they really care about.

Just last week, I got a replacement iPhone from Apple. I texted Apple that my Bluetooth was acting up, and they sent a replacement within 2 days of texting them. Yes, Apple does support via text now, which is really great. AppleCare is simply hands-down the best warranty service in the industry. Also, their products retain their value much, much better than PC counterparts.

Apple may not have the fastest bleeding-edge technology, and their machines may be a few seconds slower than PC offerings, but they sure do beat the competition where it matters most.
I'd say a lot of it stems from insisting on catering to the premium market. That shuts out many opportunities, but it also means you don't end up with flimsy $400 laptops or poorly-supported $200 smartphones. I'd say Chromebooks are thriving in part because they either offer more appropriate prices for that quality than Windows equivalents, or offer better quality at similar prices.
 
I'm genuinely surprised that Apple has stuck with Intel and not switched over to AMD Threadripper, Threadripper Pro, or even Epyc chips. The improvement to the 2019 Mac Pro's I/O would be massive, and it would be significantly cheaper, as the Threadripper Pro and Epyc chips sport a full 128 PCIe lanes, which would remove the massive PCIe bridge chip on the current Mac Pro motherboard. In addition, AMD and Apple could have enabled Infinity Fabric links between the CPU and these GPUs. That would permit a flat, fully coherent memory space spanning the CPU and GPU memory pools, hitting the trifecta of more bandwidth, lower latency, and improved efficiency. Bonus points if AMD/Xilinx also released an FPGA accelerator card that connected via Infinity Fabric.

While professional applications on the Mac side of things are generally written nowadays to expect multiple GPUs (the trash-can model standardized that, even though it was, well... trash), I'm always curious how these cards are exposed in Windows. CrossFire is long dead, but this might be the weird exception due to the Infinity Fabric links; I'm curious how four Navi 21 chips working together would handle gaming.
 