Looks like Apple notebooks will be using Polaris from AMD

tybert7

Apple 2016 laptops will have AMD GPUs

For a while now it has been suggested that Nvidia would have a role in the next generation of Apple notebooks, desktops, and workstations, but it seems that the outfit has not even got a walk-on part in the marketing show. Fudzilla's deep throats have confirmed that AMD, or rather its Radeon Technology Group, won another round of Apple refreshes.

A MacBook Pro with Polaris is in the works. This is probably because Polaris 11, the smaller of the two chips, is a really small, low-power part. This helps Apple create thinner and lighter notebooks. There might be some more MacBook-based designs where we will get to see the Polaris architecture in action.


Makes sense. We have heard about a full Nvidia 1080 GPU coming to notebooks, but that class of notebook is extremely rare. The most common notebook GPU on the Nvidia side is the 960M (among non-budget gaming notebooks, that is where tolerable-performance parts start), and starting with smaller-die GPUs gets AMD ahead of the pack when it comes to offering lower-TDP GPU options.

I wonder what kind of performance the mobile variants will have? They can use both Polaris 10 and 11 in notebooks depending on dimensions and thermals.
 
Because it's cheap. And Apple loves skimping on graphics to squeeze as much money out of its users as possible. You can't even get a dedicated GPU on a Mac laptop until you spend $2500. And even then it's a mid-range GPU with no option to upgrade. Apple's a joke. At least with Polaris, customers might actually get decent performance now.

Also, Apple's obsession with thinness at the expense of ... well ... everything else ... is bordering on psychotic.
 
Most likely only on their high-end laptops; the others will still be Intel iGPU.
 
Most likely only on their high-end laptops; the others will still be Intel iGPU.
Even their base 15" that's $2000 doesn't come with a dedicated GPU. A dedicated GPU literally starts at their highest configuration possible. It's hilariously bad.

 
The lower TDP of Polaris parts should allow Apple to add a dGPU to lines that could never have had one in the past. A refreshed MacBook Air with TB3, Polaris 10 and i7 option would be pretty slick.
 
Even their base 15" that's $2000 doesn't come with a dedicated GPU. A dedicated GPU literally starts at their highest configuration possible. It's hilariously bad.


That is true, but I do think Polaris 10 and 11 will change that, though I still believe the low and mid end will stay on Intel iGPUs. The lack of Polaris 11 news is a bit disturbing, especially since we didn't hear much about it at Computex. While that's not indicative of anything being wrong with Polaris 11, I just find it weird that AMD was debuting it at the beginning of the year only to go quiet now.
 
I honestly don't see why it would change. It's not a size constraint that's keeping them from putting a dedicated GPU into anything but the highest-end configuration. The base 15" has literally the exact same space inside it as the top configuration, and yet it still doesn't have a dGPU for 2 grand. This is typical Apple pinching pennies while borking the consumer. I really hope I'm wrong, but Apple hasn't given me any reason to believe they'll change.


The lower TDP of Polaris parts should allow Apple to add a dGPU to lines that could never have had one in the past. A refreshed MacBook Air with TB3, Polaris 10 and i7 option would be pretty slick.
Again, size constraints are not why Apple is leaving out dGPUs on anything but their highest-end MBP; otherwise they would have one in their base 15" MBP, which is literally the same size inside. I mean yes, logic would dictate what you just said, but looking at their configurations, Apple doesn't work by logic. They work by whatever makes them the most money possible.
 
It's not size, it's heat. You'd already know this if you use an MBP that has a dGPU in it. The thing gets hotter than a pistol when the M370X is being worked hard by Final Cut.

Apple is the whole reason Intel's GPUs are not garbage anymore. They care deeply about GPU hardware, they're just not willing to sacrifice product thinness/weight or battery life for it. This makes a lot of sense for their mobile lines, but explains why the iMac is just a laptop glued to a fancy display and why the Mac Pro is a trash can.
 
Yeah, their obsession with thinness is legendary, but it's making their hardware more of a joke all the time. Size or heat, there's no excuse not to have a dGPU in their base $2000 15" because there's literally no size difference inside the thing. Also, the fact that their 4K iMac doesn't even have an option for a dGPU is so laughably bad I don't even know where to begin. They're becoming a joke to anyone who takes computers seriously. They've become more of a fashion/trend statement than anything else.
 
Apple's mistake on the PC side was not becoming an OS company like Microsoft, just with a more proprietary bent. Keep making the Mac OS and SELL it, just keep a fixed list of hardware that's supported on each version. Let users build the system, but the specs of those parts have to include only components Apple tested against each other for maximum compatibility.

But they were never interested in bringing the best product to the consumer. There's a niche for closed ecosystem computers that are highly tested. Instead of really filling all the needs of that role, they keep on losing ground by insisting on limiting it only to what they can build themselves.

Apple has a unique position to enforce allowed hardware devices through their OS, and people would accept it. Unlike Windows and general PCs, which can and should be capable of mixing and matching any combination of hardware and software you can imagine.

There's a place for both.

But Apple needs to get their heads out regarding their uptight branding issues.
 
And that's why I will continue to build Hackintoshes until Apple no longer allows it. There's no real desktop Mac ... and that's a product that desperately needs to happen. The Mac Mini is a joke (they removed the dGPU from the Mini as well), the Mac Pro is immensely proprietary (and expensive) and isn't updated for several years at a time, and you can't even get a dedicated GPU on an iMac until you go with the expensive 5K ... and even then you get a mid-range GPU. I really don't understand where Apple's heads are at. I know they are not technically inept, so why the hell would they drive a 5K display with a mid-range card? It boggles the mind. Either they are tech-retarded or they're genius thieves. Their target market is clearly folks who don't know any better.
 
Apple's mistake on the PC side was not becoming an OS company like Microsoft, just with a more proprietary bent. Keep making the Mac OS and SELL it, just keep a fixed list of hardware that's supported on each version. Let users build the system, but the specs of those parts have to include only components Apple tested against each other for maximum compatibility.

But they were never interested in bringing the best product to the consumer. There's a niche for closed ecosystem computers that are highly tested. Instead of really filling all the needs of that role, they keep on losing ground by insisting on limiting it only to what they can build themselves.

Apple has a unique position to enforce allowed hardware devices through their OS, and people would accept it. Unlike Windows and general PCs, which can and should be capable of mixing and matching any combination of hardware and software you can imagine.

There's a place for both.

But Apple needs to get their heads out regarding their uptight branding issues.

At this point I don't think there's enough of a consumer OS sales market to support OS X running on white box machines. The biggest reason Windows folks do it is for games which wouldn't be a draw for OS X. And even in the Windows world, direct consumer sales of Windows and white box devices is just a fraction of the market. A pretty important fraction for top end hardware but then I don't see why Apple would give up on selling that hardware at a nice profit to only sell a $100 OS.
 
And that's why I will continue to build Hackintoshes until Apple no longer allows it. There's no real desktop Mac ... and that's a product that desperately needs to happen. The Mac Mini is a joke (they removed the dGPU from the Mini as well), the Mac Pro is immensely proprietary (and expensive) and isn't updated for several years at a time, and you can't even get a dedicated GPU on an iMac until you go with the expensive 5K ... and even then you get a mid-range GPU. I really don't understand where Apple's heads are at. I know they are not technically inept, so why the hell would they drive a 5K display with a mid-range card? It boggles the mind. Either they are tech-retarded or they're genius thieves. Their target market is clearly folks who don't know any better.
that!

They don't allow Hackintoshes, but how are they supposed to stop it? They'd have to go back to proprietary, non-PC-based hardware. Also, for what people generally do on them, a mid-tier card is plenty.
 
They don't allow Hackintoshes, but how are they supposed to stop it? They'd have to go back to proprietary, non-PC-based hardware.
I'm not sure. They could probably code something into the OS to prevent it. I don't see them reverting to non-PC hardware again. If they did, it would probably be the processors they use in their mobile devices. That would not only destroy the Hackintosh community, but it would make me stop using Apple products in general ever again. But honestly, at this point I feel like Apple could do anything stupid.

Also, for what people generally do on them, a mid-tier card is plenty.
That wasn't entirely my point, although you are definitely right. They have a 5K display with a mid-range card and, even worse, a 4K display with only onboard graphics. It makes no sense from any standpoint. They're literally crippling their hardware's potential for no reason whatsoever. It just boggles the mind.
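For some rough back-of-envelope context on those panels: just scanning out the framebuffer at 60 Hz already takes a few GB/s of memory bandwidth, before any compositing or effects work. This is a sketch, assuming the commonly cited iMac panel resolutions (4096x2304 for the 4K model, 5120x2880 for the 5K) and 4 bytes per pixel:

```python
# Back-of-envelope scanout bandwidth estimate (a sketch, not a
# measurement): bytes pushed to the display per second, assuming
# 4 bytes/pixel and a 60 Hz refresh rate.

def scanout_gbps(width, height, hz=60, bytes_per_pixel=4):
    """Raw framebuffer scanout bandwidth in GB/s (decimal)."""
    return width * height * bytes_per_pixel * hz / 1e9

# Commonly cited iMac panel resolutions (assumed, not from the thread)
four_k = scanout_gbps(4096, 2304)   # ~2.3 GB/s
five_k = scanout_gbps(5120, 2880)   # ~3.5 GB/s
print(f"4K iMac scanout: {four_k:.2f} GB/s")
print(f"5K iMac scanout: {five_k:.2f} GB/s")
```

That's only the floor; windowing, UI effects, and video decode multiply it, which is why an iGPU sharing system memory bandwidth gets stretched on panels that size.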
 
I have the macbook retina with the m370x. I've never played games on it nor do I intend to. Then again, I got it for the same price as the one with the intel iris. So eh. But I do video editing on my macbook and it helps.
 
Thing is, Final Cut is GPU-dependent. So that's why it doesn't make any sense. They don't even get the most out of their own software by crippling their hardware. Final Cut is optimized the most for the Mac Pro, but they haven't updated that thing in 3 years. I wish they would scrap that stupid garbage can design and go back to the upgradeable desktop they had before.
 
Thing is, Final Cut is GPU-dependent. So that's why it doesn't make any sense. They don't even get the most out of their own software by crippling their hardware. Final Cut is optimized the most for the Mac Pro, but they haven't updated that thing in 3 years. I wish they would scrap that stupid garbage can design and go back to the upgradeable desktop they had before.

The thing is, for video editing, I don't think you need a stronger GPU. I saw a video a couple months back that ran some tests with different GPUs in Adobe Premiere.




Having a GPU boosts the speed, but higher-end GPUs don't add much; it seems the main tweaks that cut rendering/encoding times don't tap the bulk of even middling GPUs. It's games where you really need more of the GPU, and Macs are not trying to fill that niche. Not sure how that translates to Final Cut Pro, but I assume it's a similar thing.
 
Yeah, FCP is partially GPU-dependent. It only uses the GPU for certain things like motion/speed adjustments, titles, and effects, but not to the extent that gaming does. So yeah, a better GPU would help in places, but not by much.
 
The thing is, for video editing, I don't think you need a stronger GPU. I saw a video a couple months back that ran some tests with different GPUs in Adobe Premiere.




Having a GPU boosts the speed, but higher-end GPUs don't add much; it seems the main tweaks that cut rendering/encoding times don't tap the bulk of even middling GPUs. It's games where you really need more of the GPU, and Macs are not trying to fill that niche. Not sure how that translates to Final Cut Pro, but I assume it's a similar thing.

Gaming GPUs are not going to be very significant for rendering video. That's why workstation GPUs, such as the ones in the Mac Pro, are specifically designed for that kind of work. Final Cut Pro X is coded to take advantage of the dual-GPU setup in the Mac Pro, which makes it significantly faster for video editing than any of their other product lines. ECC (Error-Correcting Code) RAM (also in the Mac Pro) is important too. If video editing is your thing, you'll be getting a workstation GPU, and those can range anywhere from $300 to as high as $8000. My only point was that the GPUs Apple chooses to put (or not put at all) into every product line besides the Mac Pro seriously cripple their potential. And they're doing this for no other reason than profit. It wouldn't make sense any other way.


They literally designed FCPX to take full advantage of the Mac Pro: Final Cut Pro X - Mac Pro
 
My only point was that the GPUs Apple chooses to put (or not put at all) into every other product line besides the Mac Pro are seriously crippling their potential.

Not really. You only buy the dGPU model if you know you actually need it. If you do, the software you're running on it probably costs as much as the hardware anyway. Only our Final Cut users have Macs with dGPUs, and we have hundreds of them deployed.

Don't get me wrong, a large part of this is due to a negative feedback loop caused by the barrier to entry. Why support a dGPU in your software if only power users can even take advantage of it? Lower-TDP parts that get dGPUs into a wider range of machines can help remedy this sort of issue going forward. I'm expecting a dGPU option in the 13" MBP for sure with the major refresh.
 
Only our Final Cut users have Macs with dGPUs
This is completely false. Also I can't fathom how you don't agree that putting mid-range or even no dGPU at all is crippling their hardware. Their 4K iMac with absolutely no dGPU is crippling an otherwise beautiful machine. Having to drive that 4K display with only onboard graphics really takes away the potential of a great machine. They have the craftsmanship down, but their implementation of hardware (or lack thereof) is ridiculous.
 
This is completely false. Also I can't fathom how you don't agree that putting mid-range or even no dGPU at all is crippling their hardware. Their 4K iMac with absolutely no dGPU is crippling an otherwise beautiful machine. Having to drive that 4K display with only onboard graphics really takes away the potential of a great machine. They have the craftsmanship down, but their implementation of hardware (or lack thereof) is ridiculous.
I believe they are talking about the company they work for not apple. you missed a piece, see:
Only our Final Cut users have Macs with dGPUs, and we have hundreds of them deployed.
If you want to use FCP on hardware it's not designed for, then yeah, it will not work as well as on a Mac Pro that is built for it. Taking a quote from the page you provided: "It's like Mac Pro and Final Cut Pro X were made for each other — because they are made for each other."

So yeah, they are "crippling" their systems by not using a dGPU, but the user base for those products isn't using it. You're comparing CONSUMER to PRO products. If you want pro, you have to pay the price.

Having to drive that 4K display with only onboard graphics really takes away the potential of a great machine.
For what? Watching YouTube, checking Facebook, and normal desktop tasks? Adding a dGPU to regular systems would require redesigning all their motherboards and would jack the price up by at least $250-300, maybe even $500 for mid-range PC-level performance. I've seen the guts of MANY Apple systems, and adding dGPUs would be a nightmare.
Maybe an eGPU would be a better solution...

Edit: so, back on topic...
It will be good if Apple moves to AMD chips in their products. It will keep prices down but give you more of the graphics performance you're wanting.
 
This is completely false. Also I can't fathom how you don't agree that putting mid-range or even no dGPU at all is crippling their hardware.

It's not false. I'll say it again: the only people here who have Macs with dGPUs are those who need Final Cut. No other piece of productivity software we use gets enough out of the dGPU for it to be worth buying. It would really help your argument if you actually explained what apps you run or expect to run that you need the GPU for.
 
Most video editors jumped to Adobe Premiere quite some time ago whilst Apple was busy messing up Final Cut Pro with the abortion they call FCPX.
 
I didn't buy a MacBook Pro for gaming so the lack of a dGPU really isn't an issue. I'm a web dev and that's the platform many web devs use. A Windows system would be a joke for that application (I can see MS trying to rectify this by adding Bash support in Win10, so kudos to them).

I'm excited to see the refresh, as the MBP needs an upgrade.

Also, on the topic of "Macs are crippled by their crappy GPUs!":
 
It's not false. I'll say it again: the only people here who have Macs with dGPUs are those who need Final Cut. No other piece of productivity software we use gets enough out of the dGPU for it to be worth buying. It would really help your argument if you actually explained what apps you run or expect to run that you need the GPU for.
My argument literally had nothing to do with Final Cut Pro in the first place. Somehow the entire discussion got steered in that direction. The only point I was making, which is completely valid unless you're an Apple apologist, is that Apple charges consumers a premium price for mid-range hardware with no option to upgrade. People keep trying to counter this with the fact that the average consumer just uses Facebook and watches cat videos. The average consumer isn't going to spend 1500 dollars on a 4K iMac just to use Facebook and look at cat videos on YouTube. The fact that there isn't even an option to UPGRADE to a dGPU in the 4K iMac is not logical.

"But heat!" Yes, what about heat? The most valuable computer company on the planet can't figure out proper heat dissipation? I cannot believe all the excuses that are being made for Apple in this thread.

Look, I don't hate Apple. I love them. I've been using them for two decades. But if you haven't noticed the decline in their hardware implementation, not to mention that everything is becoming completely soldered on with more and more proprietary parts, I don't know what to tell you. The Mac Pro has gone downhill too, with proprietary GPUs that can only be replaced with other proprietary GPUs. It used to be a workstation machine with easily swappable components.

Anyway I'm done. Keep apologizing for Apple, I guess.


I didn't buy a MacBook Pro for gaming so the lack of a dGPU really isn't an issue. I'm a web dev and that's the platform many web devs use. A Windows system would be a joke for that application (I can see MS trying to rectify this by adding Bash support in Win10, so kudos to them).

I'm excited to see the refresh, as the MBP needs an upgrade.

Also, on the topic of "Macs are crippled by their crappy GPUs!":

And this is probably the most annoying argument of all: "I don't use it, so it doesn't bother me." Just because you don't use it doesn't mean they shouldn't include it. Every other computer manufacturer making laptops in this price range puts a dGPU in them. Apple doesn't even have one in a $2000 machine. The argument that it would be too expensive is laughable; Apple already charges a premium for their products. You can't get a dGPU in a Mac laptop until you spend $2500, and it has the exact same space inside it as the $2000 laptop. It's not about space, it's not about price, it's not about any of that. Apple simply chooses not to include it. That is all. And the dGPU you CAN get is mid-range, with no option to upgrade to anything better. You are literally paying nearly 3 grand after taxes for a computer with a subpar GPU. Please think about that. I honestly don't know how anyone can defend that absurdity, unless you are so blinded by Apple branding that you can no longer see the nonsense before you.
 
If you don't like the options Apple has for the price just don't buy them. Pretty simple, really...

At work I have a lot of Macs at my disposal (three 2013 Mac Pros, one 2010 Mac Pro, two MacBook Pros) and I really like OS X, but at home I use the Windows PC I built. I like Macs, but I don't always like the options and I don't like the prices. So I use PCs at home. Free market at play.
 
And this is probably the most annoying argument of all. "I don't use it, so it doesn't bother me." Just because you don't use it doesn't mean it shouldn't include it. Every other computer manufacturer that makes laptops in this price range has a dGPU in it.

Then go buy a laptop from one of those other manufacturers. Apple clearly isn't targeting your market.

I'd like to see them cram dGPUs in lower tier products because I'm always a fan of getting more for my money, but for me (and I reckon a LARGE portion of Mac users), the lack of a dGPU has not impacted my use of the device in the slightest. I didn't buy a MacBook for a gaming machine. People don't buy MacBooks for gaming machines. If you want a gaming machine, go buy a gaming machine. Apple doesn't make those, I'm sorry.

A Wintel gaming machine is not the solution to every problem, and there's nothing inherently bad about not being a Wintel gaming machine. It's just a different tool for a different job.
 
Again, "I don't use it so it doesn't affect me" is a stupid argument. I have not, in my entire conversation here, mentioned gaming. Why are you bringing up gaming? Gaming isn't even part of this discussion and it never was. You can't bring up something I never said and use it as the basis for an argument. If you think GPUs are only useful for gaming, then you have no idea what you're talking about. dGPUs are used for graphics work. They improve just about everything you do on the computer: the UI effects of the OS (and taking graphical load off the CPU in general), viewing and editing high-res video, Adobe software taking advantage of a dGPU to improve workflow, and any other software designed to use the GPU. Just because you are not a full-on professional using a $3000 workstation does not mean you will not benefit from a dGPU in many ways. Even just watching high-res movies can be improved by a dGPU. Even if you're only doing video editing for personal use, a dGPU will make the experience much better and cut your rendering times down. It saves time and improves the performance of the computer in many areas.

Ignorance is not a reason to leave something out. And yes, most Mac users' knowledge of what they just bought goes as far as "it's a Mac," but that doesn't justify the price-to-hardware ratio and the omission of pertinent components that are in every other laptop in this price range except Apple's. I honestly can't fathom how gaming always gets brought up when anyone mentions a GPU in Macs, especially when it was not mentioned once in this entire thread. Macs have never been for gaming. That is not why a GPU would benefit a Mac. Now you're not only apologizing for Apple, but apologizing for and catering to the ignorant masses. "They have no idea what's in it, so that makes it okay."
 
Ok, so all we know is you aren't gaming and aren't using FCP. You still have yet to explain what you are using your Mac for that you require a dGPU for. You mention Adobe, but we've run benchmarks on every gen of the Retina MBP with and without the dGPU and there's no benefit over the Iris Pro worth buying the GPU for there. The value of the dGPU just isn't there for most users when Apple is already using Intel's premium options for video. I understand that it's something you feel you need, but you and your needs (which still have yet to be articulated...) are probably not representative of the bulk of Apple's users.

As I've said though, I'm excited for Polaris and its potential to help reverse this trend by bringing more value from the GPU for all users and improve Apple's offerings going forward.
 
ECC (Error Correcting Code) RAM (also in the Mac Pro) is also important.

Yeah, it's so important, Apple decided they weren't paying for it.

Despite the FirePro brand, these GPUs have at least some features in common with their desktop Radeon counterparts. FirePro GPUs ship with ECC memory, however in the case of the FirePro D300/D500/D700, ECC isn’t enabled on the GPU memories. Similarly, CrossFire X isn’t supported by FirePro (instead you get CrossFire Pro) but in the case of the Dx00 cards you do get CrossFire X support under Windows.

The Mac Pro Review (Late 2013)

So yeah, these are just dual underclocked 290X cards that don't support CrossFire, so they can ONLY be utilized simultaneously by software specifically coded for them (i.e. Final Cut Pro X).

For a thousand dollars. LOVE THAT UP-SELL BULLSHIT! Some people will come up with any reason to justify such monetary waste. You're Apple's kind of people :D
 
The Mac Pro Review (Late 2013)

So yeah, these are just dual underclocked 290X cards that don't support CrossFire, so they can ONLY be utilized simultaneously by software specifically coded for them (i.e. Final Cut Pro X).

For a thousand dollars. LOVE THAT UP-SELL BULLSHIT! Some people will come up with any reason to justify such monetary waste.
That's actually really ridiculous. That illustrates my point even further of Apple's absurd pricing for what you get.


You're Apple's kind of people :D
I wasn't even aware they didn't have CrossFire support for the dual GPUs in the Mac Pro. Wouldn't that be more AMD's fault than Apple's, though? Either way, that's incredibly stupid.
 
Ok, so all we know is you aren't gaming and aren't using FCP. You still have yet to explain what you are using your Mac for that you require a dGPU for. You mention Adobe, but we've run benchmarks on every gen of the Retina MBP with and without the dGPU and there's no benefit over the Iris Pro worth buying the GPU for there. The value of the dGPU just isn't there for most users when Apple is already using Intel's premium options for video. I understand that it's something you feel you need, but you and your needs (which still have yet to be articulated...) are probably not representative of the bulk of Apple's users.

As I've said though, I'm excited for Polaris and its potential to help reverse this trend by bringing more value from the GPU for all users and improve Apple's offerings going forward.
What I need has never been the point. It's not about what anyone needs. It's about what Apple should have in the first place. That is literally my only point. How is it okay to have a mid-range dGPU in a laptop that's nearly 3 grand after taxes? If you can justify that without sidetracking my point, I'd be happy to listen.


As for your Adobe comment, Adobe runs best with NVIDIA GPUs because of CUDA cores, so unless they start using NVIDIA GPUs again it doesn't really matter.
 
Y'know, for kicks, I went to NewEgg and ticked off the features my MBP has, to see how PC laptops compared. And y'know, there were similar options that had graphic cards. But they were all Quadros, and they were all more expensive than my MBP. Huh.



That cheaper HP is pretty close... But it has a 1080p display. Doesn't compare to Retina ¯\_(ツ)_/¯

I'm still lost on why Apple has some God-given directive that they MUST include a dGPU in devices whose users don't care about having one.
 
Adobe Premiere Pro has supported AMD cards for quite some years too now.
 
Y'know, for kicks, I went to NewEgg and ticked off the features my MBP has, to see how PC laptops compared. And y'know, there were similar options that had graphic cards. But they were all Quadros, and they were all more expensive than my MBP. Huh.



That cheaper HP is pretty close... But it has a 1080p display. Doesn't compare to Retina ¯\_(ツ)_/¯

I'm still lost on why Apple has some God-given directive that they MUST include a dGPU in devices whose users don't care about having one.
Wow, you are really bad at searching.
 
Basically,
Wow, you are really bad at searching.

Nahh, he's just good at finding self-satisfaction in his $2500 screen purchase :D

Yeah, Newegg is the last place I'd buy a high-end notebook from a major maker. You'll have limited selection compared to their own website. You Apple nuts should be VERY familiar with that game.

I'd only use Newegg's search to buy desktop replacement notebooks from Asus/MSI/Gigabyte. For everything else, it's pretty lacking.

Hey, look how hard that was to find: an Asus model with a 512GB SSD, 16GB RAM, quad-core processor, 4K 15" display, under 5 pounds, Thunderbolt, and a GTX 960M included, crapware-free, for $1500!

Amazon.com: ASUS ZenBook Pro UX501VW 15" 4K Touchscreen Laptop (Core i7-6700HQ CPU, 16 GB DDR4, 512 GB NVMe SSD, GTX960M GPU, Thunderbolt III, Win 10 Cortana Premium): Computers & Accessories

And now the Dell XPS 15, for $2000

Buy Dell XPS 15 9550 Signature Edition Laptop Review - Microsoft Store

You get the same GTX 960m as the Asus, but you get the much smaller screen bezel. You also get Dell support, if you're afraid of Asus support. Still $500 less than the dGPU MacBook Pro, and crapware free.
 