Radeon Pro 400 Series Graphics: AMD’s Most Powerful Ultrathin Graphics Processors

HardOCP News

Today AMD unveiled a new family of power-efficient graphics processors, Radeon™ Pro 400 Series Graphics. Available first in the all-new 15-inch MacBook Pro, select Radeon Pro 400 Series graphics deliver extraordinary performance and efficiency gains over the prior generation to fuel modern creative efforts from anywhere inspiration strikes. Radeon Pro 400 Series Graphics are designed specifically for today’s makers – the artists, designers, photographers, filmmakers, visualizers and engineers that shape the modern content creation era. Harnessing AMD’s acclaimed Polaris architecture, Radeon Pro 400 Series Graphics are built on the industry’s most advanced process technology for graphics processors in production today, 14nm FinFET, resulting in incredibly small transistors. To enable the thinnest graphics processor possible, AMD also employs a complex process known as ‘die thinning’ to reduce the thickness of each wafer of silicon used in the processor from 780 microns to just 380 microns, or slightly less than the thickness of four pieces of paper. Operating in a power envelope under 35W, the Radeon Pro 450, 455, and 460 Series graphics processors deliver spectacular energy efficiency and cool, quiet operation to speed through the most demanding tasks in popular creative applications.
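Quick sanity check on those press-release numbers in Python (the ~0.1 mm per sheet of office paper is an assumption, not an AMD figure):

# Die-thinning figures from the press release above
standard_um = 780   # wafer thickness before thinning, in microns
thinned_um = 380    # wafer thickness after thinning, in microns
paper_um = 100      # ~0.1 mm per sheet of office paper (assumed)

print(f"reduction: {(1 - thinned_um / standard_um) * 100:.0f}%")  # ~51%
print(f"sheets of paper: {thinned_um / paper_um:.2f}")            # 3.80, i.e. slightly less than four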
 
Considering that Apple replaced the logic board in my work laptop for free thanks to the settlement, I'm not sure what your point is.

That's nice. Too bad Nvidia did not really acknowledge the problem until it was no longer important for most folks. (That is the point, but hey, you got yours, right?) :D Now, the other point is that AMD is the one in Apple's machines now, and they did need the win.
 
At least it's a 2.5x upgrade TFLOP-wise over the card in my current work laptop, but as none of the apps I use are designed to use OpenCL, that TFLOP upgrade would never be used.
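For reference, theoretical FP32 throughput is just shaders x 2 ops per clock x clock speed. A minimal Python sketch, assuming the widely reported 1024-shader, ~907 MHz configuration for the top Radeon Pro 460 (these are not official per-SKU figures):

def fp32_tflops(shaders, clock_mhz):
    # 2 FLOPs per shader per clock (one fused multiply-add)
    return shaders * 2 * clock_mhz / 1e6

print(fp32_tflops(1024, 907))  # ~1.86 TFLOPs for the assumed Pro 460 config

Whether that raw number buys you anything depends on the app actually using OpenCL, as noted above.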
 
Better than the Nvidia bumpgate stuff as well. ;)

AMD had its own bumpgate with Apple. Not to mention the defective FirePros in the Mac Pro.
Apple Extends MacBook Pro Repair Extension Program for Video Issues Until December 31, 2016
Apple Recalls 2013 Mac Pros With AMD FirePro GPUs | HotHardware
Apple Initiates Graphic Card Replacement Program for Mid-2011 27-inch iMac

In terms of the topic, this is pretty much what Maxwell delivered on 28nm :(

640-, 768-, 896- and 1024-shader SKUs out of a small 135 mm² chip.
 
Since we all know how these perform in an Apple environment right now, we can make that determination.
 
Full 16-CU Polaris 11 at 35W, pretty good! They had to downclock it, but it's still impressive. Wish there was a way to unlock the current 460s.
 
AMD had its own bumpgate with Apple. Not to mention the defective FirePros in the Mac Pro.
Apple Extends MacBook Pro Repair Extension Program for Video Issues Until December 31, 2016
Apple Recalls 2013 Mac Pros With AMD FirePro GPUs | HotHardware
Apple Initiates Graphic Card Replacement Program for Mid-2011 27-inch iMac

In terms of the topic, this is pretty much what Maxwell delivered on 28nm :(

640-, 768-, 896- and 1024-shader SKUs out of a small 135 mm² chip.

LOL, I respect your determination to knock anything AMD. Too bad you aren't determined enough to admit you simply hate AMD. You find a way to complain and bitch about a 35W chip. Get real, seriously.
 
NKD, you do realize that because the RX 460 here is downclocked 25%, its performance is like that of a 750 Ti, lol, and a mobile 750 Ti is at 45 watts.......

And gaming performance doesn't really matter for these laptops anyway, because they aren't made for that; the drivers won't be optimized for games, only professional software. And sorry, but with 750 Ti-level performance, anything 3D is out the window; movie editing, like After Effects, not so good either.

Photoshop, yeah, that could possibly be OK, as long as you aren't using any complex custom-made filtering algorithms or extreme picture sizes.
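A rough way to see the downclock's effect in Python; the clocks are assumptions based on reported specs, and raw shaders-times-clock ignores architectural and memory differences:

# Theoretical shader throughput (shader count * clock in MHz), assumed specs:
desktop_rx460 = 896 * 1200   # 14 CU at ~1200 MHz (desktop RX 460)
macbook_p11 = 1024 * 907     # 16 CU at ~907 MHz (full P11, downclocked)
print(macbook_p11 / desktop_rx460)  # ~0.86: the extra CUs don't fully offset the clock loss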
 
NKD, you do realize that because the RX 460 here is downclocked 25%, its performance is like that of a 750 Ti, lol, and a mobile 750 Ti is at 45 watts.......

And gaming performance doesn't really matter for these laptops anyway, because they aren't made for that; the drivers won't be optimized for games, only professional software. And sorry, but with 750 Ti-level performance, anything 3D is out the window; movie editing, like After Effects, not so good either.

Photoshop, yeah, that could possibly be OK, as long as you aren't using any complex custom-made filtering algorithms or extreme picture sizes.

Well, my comment was not even about gaming performance or anything. Just sort of tired of every thread turning into Nvidia vs. AMD; it's a 35W card. How much more can it possibly get you, you know what I mean? lol
 
Blame the guy that brought up Nvidia bumpgate for that; you should expect someone to follow up with something about nV vs. AMD after that comment (since AMD had something similar).

In any case, same old crap from Apple: low-end, overpriced web-designer systems for on the go, lol. And nothing to get excited about for AMD; it just replaces their previous Apple contract, so no change to their bottom line.

And a Lisa Su interview on Wall Street because of this? Yeah, nothing much to look at; same old story, different day.
 
Yes, on performance per watt Nvidia is still champ, but they burned that Mac bridge long ago. I'd still say this is a huge step forward from the power usage of a couple of years back; they have realized the error of their ways on that front. 45W to 35W is a 22% reduction in TDP on this chip, with a large jump in performance over a mobile variant of the 750 Ti. Yes, it's downclocked 25%, but it's also using the full P11 chip, so performance doesn't suffer like you're implying... But we get it, Razor1 and Shintai are strong in the AMD hate. Jen-Hsun would have done the same interviews... come on now... I know they say love makes you blind, but does hate do the same?
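Quick check on those numbers in Python (the 907/1200 MHz clocks are assumptions based on reported specs):

print((45 - 35) / 45)  # ~0.22, i.e. a 22% TDP cut from the mobile 750 Ti's 45 W
print(1 - 907 / 1200)  # ~0.24, the ~25% downclock (assumed clocks vs. desktop P11)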
 
Oh, you want me to quote you? OK then, here ya go.

No Ledra, I work and was on lunch; I don't have all day to defend Nvidia or AMD as some do. What would be the purpose of me lying about what the card does? You assume everyone is lying to you? Terrible way to look at people... ah well.



As I said, my GB Windforce RX 460 4GB with no PCIe plug does 1224MHz @ 1010mV. So is the 55-59W off the mark? It was read through GPU-Z; I'm not sure as to its level of accuracy. I'm not about to pull out my oscilloscope, but I could throw a Kill A Watt on it, I suppose. I was stating that both Fury and Polaris were overvolted by a fair margin by AMD; this, as someone else stated, can be backed up by all the Ethereum miners undervolting their cards.

I stopped coming here; too many kids with nothing better to do than fight with each other and spout off personal insults rather than discuss things as adults... Seems to still be more of the same.

You didn't say you had one?

What, you can't even remember what you posted the night before?

Yeah, that is you, right? I didn't need a minute to reread anything; I have a damn good memory, lol.

Why would you post that you had one and didn't want to pull it out, then, when questioned about it, say you never stated such a thing? WTF, you can't even remember what you posted around 24 hours ago, but you expect others to value your posts like they mean something.

Sorry, if you have been around as long as I have or longer, you might be getting senile. Might want to go to a neurologist and get yourself checked out. Till then, don't come back. (This is a joke, btw, don't take it seriously.)

You can forget the math I asked you about; you need much more help than that, lol.

I don't care if someone disagrees with me, but they'd better fuckin' have a good reason behind it, because everything I talk about I can and will back up if needed (most of it here is common sense and logic; even if I don't know something right off the bat, I can deduce it based on things I do know), or at least have something noteworthy to say, not flame-bait crap like you and several others have been posting in the past week or so; no one likes it and it doesn't do anyone any good here.

Back to topic: was Apple getting Polaris a surprise to anyone? I didn't expect nV to get Apple contracts anytime soon; it just doesn't make sense for them when they have the rest of the market to themselves. I think the only notebook OEM contract I have seen for AMD's Polaris has been the RX 470, and that was dropped for the GTX 1060 in the past week or so.

So AMD loses Dell but replaces an existing OEM deal with Apple.

Edit: Looks like they removed it from the corporate buying options; they have it only on the highest-end Alienware 15 now for home purchases, so you gotta buy their best (highest-cost) 15-inch laptop to get the RX 470, meaning more low-volume sales, plus a mid-range 17-inch laptop which is $1.8k. Anyone want to pony up $1.8k to $2k for an RX 470 laptop when you can get a GTX 1070 version for $1.8k to $2.1k? Yeah, doesn't seem like a good buy. Damn, you can get better-specced systems from Asus and MSI with a 1060 for $1.5k, and they are much thinner too.

I didn't expect AMD to underclock P11 this much, though. I guess Apple forced the issue with their form factor.
 
Oh, you want me to quote you? OK then, here ya go.



You didn't say you had one?

What, you can't even remember what you posted the night before?

Yeah, that is you, right? I didn't need a minute to reread anything; I have a damn good memory, lol.

Why would you post that you had one and didn't want to pull it out, then, when questioned about it, say you never stated such a thing? WTF, you can't even remember what you posted around 24 hours ago, but you expect others to value your posts like they mean something.

Sorry, if you have been around as long as I have or longer, you might be getting senile. Might want to go to a neurologist and get yourself checked out. Till then, don't come back. (This is a joke, btw, don't take it seriously.)

You can forget the math I asked you about; you need much more help than that, lol.

I don't care if someone disagrees with me, but they'd better fuckin' have a good reason behind it, because everything I talk about I can and will back up if needed (most of it here is common sense and logic; even if I don't know something right off the bat, I can deduce it based on things I do know), or at least have something noteworthy to say, not flame-bait crap like you and several others have been posting in the past week or so; no one likes it and it doesn't do anyone any good here.

Back to topic: was Apple getting Polaris a surprise to anyone? I didn't expect nV to get Apple contracts anytime soon; it just doesn't make sense for them when they have the rest of the market to themselves. I think the only notebook OEM contract I have seen for AMD's Polaris has been the RX 470, and that was dropped for the GTX 1060 in the past week or so.

So AMD loses Dell but replaces an existing OEM deal with Apple.

I didn't expect AMD to underclock P11 this much, though. I guess Apple forced the issue with their form factor.

He did say he is NOT about to go use his oscilloscope on it, but that he would throw a Kill A Watt on it. lol, you two are funny.
 
He did say he is NOT about to go use his oscilloscope on it, but that he would throw a Kill A Watt on it. lol, you two are funny.

That wasn't the point I was trying to make. He said something, and then, when in his mind he was "challenged" by others because it didn't sound believable, instead of standing behind what he thought he had, he ended up flaming others. Well, that isn't a good way to back up what he stated; it actually points to the fact that he was talking crap.

If he doesn't care to prove that what he has is correct, why even say it in the first place?

It's like a guy saying, "I just won 4 gold medals in the Olympics," then when people start asking where the medals are, he talks some shit, says they are getting polished, and walks away. (Hyperbole, I know, but easy to understand.) LOL.

I mean, when you see voltage-locked 1060s hitting 65 watts max and 1070s hitting 75 watts (this is all without downclocking or binning), they are getting 50% power savings; it's easy to see that with downclocking AMD could reach 35 watts for P11.

I even hinted at what could drive the low wattage: voltage binning follows a bell curve relative to yields, and what you are getting here are chips that hit the low end of the voltage spectrum, which are few in number per wafer. But since Apple doesn't need much volume, it's perfect for them; you are not going to see these chips anywhere else. Given that Apple also has the only full P11 chips, and the recent price drops on the RX 470, that kind of tells us yields of the full P11 are not so good. For a chip this size, yields not being good... I'm not sure what is going on here, but something is up. A full P11 at the same clocks as the current RX 460 would have made it a decent competitor to the GTX 1050 Ti; it might not have beaten the 1050 Ti, but it sure would have come close, and since the 1050 Ti kind of needs a 6-pin power connector (I know it's optional, but better to use it if you want to make sure it runs on all cylinders), the power usage (in terms of what power supply you need) of a full P11 wouldn't be much of a factor, as it too would need that 6-pin.

And this would have also kept the price of the RX 470 up. Instead, AMD is taking lower margins on the RX 470 now.

While everyone may not want to hear all this, it is what it is, lol. There is no other explanation for what AMD has been doing with Polaris.
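The first-order physics behind the undervolting point: dynamic power scales roughly as C * V^2 * f, so a low-voltage bin plus a downclock compounds. A minimal illustrative Python sketch; the voltages and clocks below are assumptions for illustration, not AMD specs:

def relative_dynamic_power(v, f, v0=1.15, f0=1200):
    # P_dyn ~ C * V^2 * f; power relative to a (v0 volts, f0 MHz) baseline
    return (v / v0) ** 2 * (f / f0)

# e.g. a desktop-like 1.15 V / 1200 MHz part vs. an undervolted,
# downclocked bin at 0.95 V / 907 MHz:
print(relative_dynamic_power(0.95, 907))  # ~0.52, roughly half the dynamic power

That is how a modest voltage drop plus the ~25% downclock can plausibly land a full P11 in a 35W envelope, leakage and board power aside.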
 
WOW, does this mean the RX 460 is not fully enabled? R9 285 all over again. That took a YEAR to sort out!

Well, at least we know where the parts that could actually perform against the 1050 Ti went: to reach ancient GTX 960M performance levels on a laptop that Apple's charging people $2500 for. A chip that's about to be replaced in notebooks everywhere by the GTX 1050 Ti, which should be about 10% slower than the desktop part.
 
Yeah, the RX 460 for the desktop is not fully enabled, and I don't think we will see it fully enabled until whatever issues AMD is having with yields of a fully functional chip are resolved.

It really raises the question of what the hell is happening at GF or in AMD's design, because this isn't the first time this has happened with smaller chips. Tonga had the same issue: Apple got the first of many batches of fully enabled Tongas for their computers, and as we know, they don't do enough volume in GPUs for that to be the reason.

I find it hard to believe this is a node-maturity issue either, because with defects that get resolved as a node matures, you should still be able to get quite a few good chips out of P11 wafers if you can get enough good chips out of a bigger full chip on the same node, i.e., P10, which AMD has shown they can do.
 
It's not going to be the double performance lead Nvidia had with the 750 Ti over the downclocked 7770 they used in the MacBook Pro, but there's still a 40% difference between the RX 460 and the 1050 Ti (that's comparing one 6-pin-powered card to another):

MSI GTX 1050 Ti Gaming X 4 GB Review

That performance gap could increase to 50%, depending on how closely the 1050 Ti notebook parts can maintain desktop performance.
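One hedged way to get from 40% to roughly 50%, in Python; the notebook scaling factors here are assumptions, not benchmark results:

desktop_gap = 1.40       # 1050 Ti over RX 460 at desktop clocks, per the review above
notebook_1050ti = 0.90   # if the notebook 1050 Ti keeps ~90% of desktop performance
macbook_p11 = 0.86       # full-but-downclocked P11 vs. desktop RX 460 (rough, assumed)
print(desktop_gap * notebook_1050ti / macbook_p11)  # ~1.47, i.e. close to a 50% gap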
 
Yeah, the RX 460 for the desktop is not fully enabled, and I don't think we will see it fully enabled until whatever issues AMD is having with yields of a fully functional chip are resolved.

It really raises the question of what the hell is happening at GF or in AMD's design, because this isn't the first time this has happened with smaller chips. Tonga had the same issue: Apple got the first of many batches of fully enabled Tongas for their computers, and as we know, they don't do enough volume in GPUs for that to be the reason.

I find it hard to believe this is a node-maturity issue either, because with defects that get resolved as a node matures, you should still be able to get quite a few good chips out of P11 wafers if you can get enough good chips out of a bigger full chip on the same node, i.e., P10, which AMD has shown they can do.

I am pretty sure it's lack of R&D here. They just don't have the resources, hence it takes them longer to refine the chip. I think this is why we see refreshed Polaris chips coming along next year. I am sure they are still trying hard to tweak the design, and a lack of R&D budget turns into a longer period to refine a chip. I don't really think GF is at fault here. We will probably see Zen being more power efficient only because it's been in design for quite some time and most of the budget probably went there, and even that is a drop in the bucket compared to Nvidia's spending on their GPUs.
 
Also, I am starting to think that the next-gen iMacs will probably have binned RX 470s and RX 480s with higher shader counts. But by that time, AMD will probably have refreshed their lineup on the consumer side too.
 
Sometimes you get told to use it even if it is not ideal, instead of waiting for the respin. Money has to be made at some point, and throwing out a batch of chips is not making them cash.
 
Also, I am starting to think that the next-gen iMacs will probably have binned RX 470s and RX 480s with higher shader counts. But by that time, AMD will probably have refreshed their lineup on the consumer side too.

Unlike Polaris 11, Polaris 10 is already fully enabled in the consumer space.

Blame the guy that brought up Nvidia bumpgate for that; you should expect someone to follow up with something about nV vs. AMD after that comment (since AMD had something similar).

Exactly, but let's not bring facts into this ;)
 
You didn't say you had one?

What, you can't even remember what you posted the night before?

Yeah, that is you, right? I didn't need a minute to reread anything; I have a damn good memory, lol.

Why would you post that you had one and didn't want to pull it out, then, when questioned about it, say you never stated such a thing? WTF, you can't even remember what you posted around 24 hours ago, but you expect others to value your posts like they mean something.

Sorry, if you have been around as long as I have or longer, you might be getting senile. Might want to go to a neurologist and get yourself checked out. Till then, don't come back. (This is a joke, btw, don't take it seriously.)

Yeah, you have a great memory but piss-poor reading skills. Let's put it in plain language for ya... It's not worth my damn time to satisfy some self-proclaimed forum God about my card's capabilities. I've got a life, a job, and plenty of better things to do with my time. The oscilloscope was hyperbole, as in "I'll just whip out my trusty oscilloscope," get it? Not something most folks have kicking around. I'm sorry I didn't satiate your thirst for proof. I was not posting unsubstantiated claims and fully admitted where the readings came from, so why so mad? Will you forgive me, self-proclaimed [H] God?

Ahh, I get it: you attempt to bully folks on this forum, cut them down, and make them seem not worthy of the [H]. You're hilarious; get off your high horse, bud.

Anyways, back on topic: as I stated prior, the P11 in the new MBP is the fully enabled chip, and I'm wondering if it's already seen a slight rework. They made mention of thinning the insulating layers in order to fit it in the 15.5mm chassis and maintain effective cooling. It's not going to be a gaming monster, but it will hold its own just fine in lighter titles; the target for this chip is content creators anyway. The P11 supports all of the latest standards: WCG, HDMI 2.0b, and DP 1.4. I agree the 1050 would have been a better fit for performance per watt, but Nvidia wanted to focus on the high end first and as a result didn't have it ready in time; they also like to charge a pretty penny for them. This, coupled with the soured relationship from years back, effectively slammed the door on any deal that could have been made between Nvidia and Apple.

It will be interesting to see what comes of the 1050 when it hits the 35W envelope and how the performance lines up; love these two fighting it out right now. Hopefully AMD can bring the fight to the high end soon enough. I'm guessing they will be the DX12 leaders with Vega until Volta drops, and Zen should help them back into i5 and i7 territory. I wanna build an Opteron-based Zen machine, bringing it back to the old days!
 
Yeah, you have a great memory but piss-poor reading skills. Let's put it in plain language for ya... It's not worth my damn time to satisfy some self-proclaimed forum God about my card's capabilities. I've got a life, a job, and plenty of better things to do with my time. The oscilloscope was hyperbole, as in "I'll just whip out my trusty oscilloscope," get it? Not something most folks have kicking around. I'm sorry I didn't satiate your thirst for proof. I was not posting unsubstantiated claims and fully admitted where the readings came from, so why so mad? Will you forgive me, self-proclaimed [H] God?

Ahh, I get it: you attempt to bully folks on this forum, cut them down, and make them seem not worthy of the [H]. You're hilarious; get off your high horse, bud.


I don't bully folks; it's quid pro quo. I respond the same way they post: they post like a moron, I respond like a moron; they post flame bait, I post flame bait; they post like a dick, I post like a dick. They post with subterfuge, I post with reality and then make them look like a moron.

See the difference? Just like I stated to you: why have the oscilloscope if you aren't going to use it to back up your crazy story of how you downclocked your card, even after being told, in a nice way, that you are talking BS, by being shown it's probably not possible? What, the test doesn't take very long to do. If you have one, I'd expect you to have some engineering know-how. Yeah, so far all I can think is that you don't know what you're doing, and made up the oscilloscope and the Kill A Watt.

Anyways, back on topic: as I stated prior, the P11 in the new MBP is the fully enabled chip, and I'm wondering if it's already seen a slight rework. They made mention of thinning the insulating layers in order to fit it in the 15.5mm chassis and maintain effective cooling. It's not going to be a gaming monster, but it will hold its own just fine in lighter titles; the target for this chip is content creators anyway. The P11 supports all of the latest standards: WCG, HDMI 2.0b, and DP 1.4. I agree the 1050 would have been a better fit for performance per watt, but Nvidia wanted to focus on the high end first and as a result didn't have it ready in time; they also like to charge a pretty penny for them. This, coupled with the soured relationship from years back, effectively slammed the door on any deal that could have been made between Nvidia and Apple.

It will be interesting to see what comes of the 1050 when it hits the 35W envelope and how the performance lines up; love these two fighting it out right now. Hopefully AMD can bring the fight to the high end soon enough. I'm guessing they will be the DX12 leaders with Vega until Volta drops, and Zen should help them back into i5 and i7 territory. I wanna build an Opteron-based Zen machine, bringing it back to the old days!


Yes, wafer thinning is normal for these types of products; Intel has been doing it for a few years now. There is nothing new about this other than AMD hype. And you don't get it: wafer thinning is expensive, it increases the cost of the chip, and since this is Apple we are talking about, ya know, they are cutting down on prices too.

Content creators? Damn, it's for low-end web designers, not all content creators. It doesn't have the polygon horsepower for game models, let alone movie models. It doesn't have the shader horsepower for custom filters on larger picture or texture sizes. Forget After Effects, Flame, Smoke, etc.; you just can't do it on this thing, GPU or CPU, there's just not enough performance there.

Do you know WTF you are talking about? Not really, man, because I have a lot of experience doing textures and 3D work, lol. I work in special effects for my day job: movies, TV, advertisements; and I work on games too.

The rest of it doesn't matter; this is an Apple-only chip and will never see the light of day outside of Apple.
 
Yeah, wafer thinning has been used since 2003 or so......

Yeah, let's hype it in 2016 so Apple and AMD fanboys have something to talk about! Worse yet, the way AMD worded it, if someone doesn't know what wafer thinning (die thinning) is, it sounds like it caused the power reduction, lol. I'm just waiting for the next EE wannabe to come here and say that. I can pretty much guarantee it's already crossed some of the AMD loyalists' minds here; it's already happened at B3D, lol, and I would expect posts like that here too.
 
Apple iMacs are about to become irrelevant real fast with the incoming Surface Studio, so I'm not sure how long these products will last.

IMO people buying MBPs don't really give a crap about the GPU. Intel Iris Pro is sufficient for the majority of the "workload" used on those.
 
Actually, the Surface Studio, man: if they had Pascal in them, they would have been great. I've been looking for something to replace my Cintiq with, and I would rather have a touch-screen all-in-one, something bigger than a tablet, lol. But when I saw the specs, and the price for those specs, it just ain't worth it. $4k for a yester-year computer.

But I can say this: all those people that love their iMacs, and businesses that buy iMacs (*cough* web-design firms and anything to do with the Adobe suite), now have an alternative, in the same price range with similar specs, and it works better in 3D applications too. Although for that price, I would get a full workstation with dual Xeons and a high-end Pascal GPU.
 
Yes, the absence of a Pascal GPU in a $4000 PC is frankly just fucking ridiculous. A 980M? Please. Stick it up your ass, M$.
 
I don't bully folks; it's quid pro quo. I respond the same way they post: they post like a moron, I respond like a moron; they post flame bait, I post flame bait; they post like a dick, I post like a dick. They post with subterfuge, I post with reality and then make them look like a moron.

See the difference? Just like I stated to you: why have the oscilloscope if you aren't going to use it to back up your crazy story of how you downclocked your card, even after being told, in a nice way, that you are talking BS, by being shown it's probably not possible? What, the test doesn't take very long to do. If you have one, I'd expect you to have some engineering know-how. Yeah, so far all I can think is that you don't know what you're doing, and made up the oscilloscope and the Kill A Watt.

Yes, wafer thinning is normal for these types of products; Intel has been doing it for a few years now. There is nothing new about this other than AMD hype. And you don't get it: wafer thinning is expensive, it increases the cost of the chip, and since this is Apple we are talking about, ya know, they are cutting down on prices too.

Content creators? Damn, it's for low-end web designers, not all content creators. It doesn't have the polygon horsepower for game models, let alone movie models. It doesn't have the shader horsepower for custom filters on larger picture or texture sizes. Forget After Effects, Flame, Smoke, etc.; you just can't do it on this thing, GPU or CPU, there's just not enough performance there.

Do you know WTF you are talking about? Not really, man, because I have a lot of experience doing textures and 3D work, lol. I work in special effects for my day job: movies, TV, advertisements; and I work on games too.

The rest of it doesn't matter; this is an Apple-only chip and will never see the light of day outside of Apple.


Ahhh, so you're in charge of keeping people in check here? And by "in check" I mean bashing anything and everything they say if it's in contrast to your opinion. I get it, an [H] vigilante of sorts, fighting for Nvidia justice, haha, that's cool. Do you have a cape?

How is it a crazy story, undervolting my card and it drawing less power? Hmmm, not quite sure. You can cut me down all you like to make yourself look good, but really you just look spiteful and small. I'd say after the past 15 years of following this site and many others, I have a firm grasp on the hardware side of things at the hobbyist level, and I have built and overclocked many a machine. Is this site only for elite engineers with chip-design backgrounds now? I didn't see the sign on the door; I do apologize... Anyways, I'm out; you're a waste of time and unwilling to discuss without the personal attacks. Nice to have met ya, caped crusader.
 
I can't believe how much hype AMD puts out for every single one of their product launches; it really shows how crappy their current lineup is.

Let's go down the list:

Polaris 10, "VR for the masses": did it happen because of them? NOPE! Apparently they can't compete in VR against nV's products, even last generation's products.

Polaris 11, a low-budget card with nice performance over the competitors' last-generation cards: did that happen? Nope, they tied the GTX 950 in perf/watt, lol. Utter failure.

Now we have the Apple release of Polaris: "die thinning is new, it's great, it gets us our 35-watt TDP outright." BS; die thinning has been around since early this century and doesn't do anything for TDP.

What are people smoking when they listen to this crap?
 
Ahhh, so you're in charge of keeping people in check here? And by "in check" I mean bashing anything and everything they say if it's in contrast to your opinion. I get it, an [H] vigilante of sorts, fighting for Nvidia justice, haha, that's cool. Do you have a cape?

How is it a crazy story, undervolting my card and it drawing less power? Hmmm, not quite sure. You can cut me down all you like to make yourself look good, but really you just look spiteful and small. I'd say after the past 15 years of following this site and many others, I have a firm grasp on the hardware side of things at the hobbyist level, and I have built and overclocked many a machine. Is this site only for elite engineers with chip-design backgrounds now? I didn't see the sign on the door; I do apologize... Nice to have met ya.


I'm not saying that either; nice try at deflecting what I stated. No, you are like every other person here that comes in with an agenda; it was obvious by your third post about your downclock, when you started calling people names. Now I'm going to show you what you really are. Yeah, you don't see the sign because you can't see, period. You are like a person who thinks they can do something, but in reality never did it, just talks about it, and when asked to prove it, shuts up and walks away.

Come on, where is your card? Show me you have done what you said. Show me the downclocks, show me the pictures, show us you have an oscilloscope. At this point I don't even think you've got one, because all you do is talk shit without giving any backing....
What, you don't have a camera, so you can't take a picture of what you are talking about? Maybe that is the problem.....

Do I have to make excuses for you? You tell us something, can't show it, and then when we express some disbelief you call people names and run away, but have the balls to come back in another thread and try to talk shit again.

NO, I want to see your shit. Come on. I would like pictures of your card, your oscilloscope, you measuring your card with its downclocks; how freakin hard is that to do? Instead, you can't prove what you state, but you tell the doubters that they are fanboys. Yeah, get the fuck outta here, man! LOL.
 
Again, did you miss where I said the oscilloscope was hyperbole? Time to go back and reread... so quick to jump on folks... take a little time and think: does it add anything to the conversation, or am I just being a dick? ...then post! Probably a good way to approach many situations. See ya around, man! Have a good day.
 
Again, did you miss where I said the oscilloscope was hyperbole? Time to go back and reread... so quick to jump on folks... take a little time and think: does it add anything to the conversation, or am I just being a dick? ...then post! Probably a good way to approach many situations. See ya around, man! Have a good day.


Good, get out of here; you talk all this BS and can't even show anything.

You weren't really being a dick; what you were doing was bragging about something and then, when called out on it, having nothing to show for it. That isn't hyperbole either. If I were to call it what it is, you'd find it in a psychiatry book, so go look it up.
 
Apple iMacs are about to become irrelevant real fast with the incoming Surface Studio, so I'm not sure how long these products will last.

IMO people buying MBPs don't really give a crap about the GPU. Intel Iris Pro is sufficient for the majority of the "workload" used on those.

True.

And people who are addicted to OS X but actually do real 3D work get around Apple's lack of GPU power by remoting into Windows or Linux servers. You know, pay twice the price to do what one machine could do on its own, because Apple madness.
 
Again, did you miss where I said the oscilloscope was hyperbole? Time to go back and reread... so quick to jump on folks... take a little time and think: does it add anything to the conversation, or am I just being a dick? ...then post! Probably a good way to approach many situations. See ya around, man! Have a good day.

Saying you own an oscilloscope is not hyperbole. It's a lie.

I own a private jet; also a lie.

I own 500000000 pairs of socks; that is hyperbole.

True.

And people who are addicted to OS X but actually do real 3D work get around Apple's lack of GPU power by remoting into Windows or Linux servers. You know, pay twice the price to do what one machine could do on its own, because Apple madness.


Microsoft is emulating Apple in this respect; the Surface AIO is overpriced shit.
 
Microsoft is emulating Apple in this respect; the Surface AIO is overpriced shit.

It's got one important difference: that beautiful screen is also a tablet.

For every other use case, it's overpriced. But I wouldn't worry about the GPU. Revision 2.0 should have more of the kinks worked out of it, and a more recent GPU.
 