Apple’s New Mac Pro Sold Out

I don't understand why people who are willing to drop $1k on a video card seem to think workstations should cost $400 tops? :confused:
Because Apple is for hipsters with lesser intellect alone, and the concept that someone with equal or greater intelligence might actually use any Apple product regardless of price or target audience is incomprehensible or some shit :p
 
Do you happen to know what sort of discount?

The discounts HP offers largely depend on the number of systems bought. One thing that drops the price drastically is buying into HP's system support, which they are widely known for; it's probably their biggest money maker.

HP doesn't want to sell systems in small numbers; when they do, it's at a markup, which is what makes the Mac Pro interesting. It isn't marked up and is essentially an all-in-one package. Apple isn't trying to reinvent the wheel here; they are keeping themselves meaningful in an area where they can compete.
 
huh? I was trying to show that the Mac Pro isn't the rip-off people seem to think it is; it's actually not that bad at all... and nobody complains that HP is waayy overpriced (to the point of making the Mac look cheap!)

I don't understand why people who are willing to drop $1k on a video card seem to think workstations should cost $400 tops? :confused:

Ah, gotcha, I was wondering if I misinterpreted you. :p
No, that makes sense now.
 
I don't do any video editing. Is Apple still the standard in that industry? I would think that now that the hardware is (somewhat) standardized Apple would lose a lot of business to those who would rather save money and build their own.
They are but for no other reason than the people operating them. Both that and the prepress industry are stuck on Apple because the bulk of both industries are comprised of people who are computer illiterate and stuck in a 15 year old mindset about technology. Apple will always be the industry standard simply because the dullards working in those industries will always hold them back.
 
They are but for no other reason than the people operating them. Both that and the prepress industry are stuck on Apple because the bulk of both industries are comprised of people who are computer illiterate and stuck in a 15 year old mindset about technology. Apple will always be the industry standard simply because the dullards working in those industries will always hold them back.

Apparently, they aren't the only ones stuck in the past... (stares at your sig)
 
They are but for no other reason than the people operating them.
It's not just the people, it's also the software. Final Cut is just plain amazing. Avid is... not. It just feels like GIMP compared to Photoshop. You can do 99% of the same things, but the UI is more clunky, and overall the software just doesn't feel polished.
 
It's not just the people, it's also the software. Final Cut is just plain amazing. Avid is... not. It just feels like GIMP compared to Photoshop. You can do 99% of the same things, but the UI is more clunky, and overall the software just doesn't feel polished.
If you're familiar with one system because that's all you've ever used of course the other won't feel right. When I owned a video production company I was well versed in Avid and in knowing both systems that was my opinion of Final Cut.
 
The single GPU will do just fine for most CAD models. If someone chooses to run CAD on one of these, they shouldn't have any issues unless it's ridiculously large models.
 
It's not just the people, it's also the software. Final Cut is just plain amazing. Avid is... not. It just feels like GIMP compared to Photoshop. You can do 99% of the same things, but the UI is more clunky, and overall the software just doesn't feel polished.

Is that before or after they dumbed down Final Cut Pro?
 
Because Apple is for hipsters with lesser intellect alone, and the concept that someone with equal or greater intelligence might actually use any Apple product regardless of price or target audience is incomprehensible or some shit :p

LOL

ya something like that...

They are but for no other reason than the people operating them. Both that and the prepress industry are stuck on Apple because the bulk of both industries are comprised of people who are computer illiterate and stuck in a 15 year old mindset about technology. Apple will always be the industry standard simply because the dullards working in those industries will always hold them back.

epic troll is epic :D

apple was not always the industry standard... and for a lot of things (even in the "media" market) it still isn't... soooo... ???:confused:
 
Apparently, they aren't the only ones stuck in the past... (stares at your sig)

(attached image: BoxCropped1.jpg)
 
If you're familiar with one system because that's all you've ever used of course the other won't feel right. When I owned a video production company I was well versed in Avid and in knowing both systems that was my opinion of Final Cut.
A fair statement. I was "raised" on FCP for a semester before I took the Avid class. I hated the grey, the huge buttons, and the general lack of intuitiveness. That very well could have been because I learned Final Cut first, though!

Is that before or after they dumbed down Final Cut Pro?
I've not kept up with FCP in the last few years. What'd they dumb down?
 
Apparently, they aren't the only ones stuck in the past... (stares at your sig)
I grew up and realized investing money in my home computer was a bottomless pit when I could invest it in computers for my company that make me money. That computer still edits video, photos, and customer projects at a level most members of this forum can only wish they had the talent for. Playing the latest video games is of no concern to me.
 
huh? I was trying to show that the Mac Pro isn't the rip-off people seem to think it is; it's actually not that bad at all... and nobody complains that HP is waayy overpriced (to the point of making the Mac look cheap!)

I don't understand why people who are willing to drop $1k on a video card seem to think workstations should cost $400 tops? :confused:

Not $400 tops... it just shouldn't start at $3k for a quad core.

Forcing dual GPUs with a quad core is kinda weak.

Ram options are limited...
Internal expansion is non-existent forcing expensive thunderbolt accessories just to use the workstation.

Its base price is deceptively low. Its performance expansion is limited and its forced accessories are pricey.

It is as bad as it looks.



The compared HP workstation is Quadro focused... And no one pays retail on those anyways. It also has greater built in functionality and internal expansion items...

There really is not a good workstation on the market to compare the Mac Pro to, as nothing else is as limited as it is.
 
You need OSX? You pay for Apple hardware. End of story.

Dell & HP run Windows or Linux. OSX is not officially supported (hackintosh is a different story and not legally supported). Sure, you can build a better machine for less - but it does not run OSX. Many places require OSX for some piece of software they run or they prefer OSX.

There are other things that are similar - overpriced hardware but the only one to run the software needed.
 
I grew up and realized investing money in my home computer was a bottomless pit when I could invest it in computers for my company that make me money. That computer still edits video, photos, and customer projects at a level most members of this forum can only wish they had the talent for. Playing the latest video games is of no concern to me.

I think you are underestimating us, our abilities, and the industry as a whole.
15 years in the past is 1998, and I haven't seen too many places that are that far back; honestly, most of the companies that use Apple are up to date with brand-new hardware.

So it is quite the opposite of what you claim.
 
I think you are underestimating us, our abilities, and the industry as a whole.
15 years in the past is 1998, and I haven't seen too many places that are that far back; honestly, most of the companies that use Apple are up to date with brand-new hardware.

So it is quite the opposite of what you claim.
The hardware I have at home was current in 2006, hardly 15 years old. It handles all of my professional Illustrator, Photoshop, etc tasks just fine. The only underestimating I see here is from people judging me based on my home computer rather than my professional stations and portfolio. The next time I'm bidding out for a global contract for one of my companies I'll make sure to ask if they care what computer is sitting on my desk at home :rolleyes:
 
I grew up and realized investing money in my home computer was a bottomless pit when I could invest it in computers for my company that make me money. That computer still edits video, photos, and customer projects at a level most members of this forum can only wish they had the talent for. Playing the latest video games is of no concern to me.

A lot of people here, like myself, work in IT and have acquired learning and skill from our spending on computers and technology. Furthermore, many of us enjoy it, so it's not necessarily a bottomless pit.
 
I grew up and realized investing money in my home computer was a bottomless pit when I could invest it in computers for my company that make me money. That computer still edits video, photos, and customer projects at a level most members of this forum can only wish they had the talent for. Playing the latest video games is of no concern to me.

Well, speak for yourself. I was on the team that produced this app: https://itunes.apple.com/us/app/jellytelly-best-in-kids-christian/id705286113?mt=8

Built the core API and did some of the JS to power the core of the app. On my MacBook Pro. Not bragging, I'm just saying - I wish I had the talent to make that website in your sig - some quality work there.
 
The hardware I have at home was current in 2006, hardly 15 years old. It handles all of my professional Illustrator, Photoshop, etc tasks just fine. The only underestimating I see here is from people judging me based on my home computer rather than my professional stations and portfolio. The next time I'm bidding out for a global contract for one of my companies I'll make sure to ask if they care what computer is sitting on my desk at home :rolleyes:

Well yeah, if you aren't running an i7 OCed to 20GHz, your development skillz are noob level. :p
Man, relax, I never said your system was 15 years old, but my Gamecube on the other hand... has it been so long!? :eek:
 
Well yeah, if you aren't running an i7 OCed to 20GHz, your development skillz are noob level. :p
And that's my point. Not all of us give a damn about our rigs at home. In the professional world computers aren't a hobby, they're a tool. No client cares about your hardware, much less what you have at home.

Look at the prepress industry for instance. You'd think these guys doing amazing prepress editing are all on top notch hardware and are computer geniuses but some of the best in the industry barely know how to turn a computer on. They've been retouching and color correcting since they were teenagers working the presses. They work on macs because that's all they know, that's all their superiors know, and that's all their clients know. The video community operates the same way. I tried switching my old editing company over and my rockstar editors looked at me like I threw the space station at them. The entire industry is like that.

Some of the jack-asses in this thread can poke fun at my home computer or one of my company's websites that has been very profitable for me but it doesn't change the fact that standards don't always form because of superior hardware and software. A lot of times they form because it's the path the lowest common denominator understands. You'd think as tech enthusiasts we'd all understand that but the inability to see things from an average user's perspective has proven to be the tech community's Achilles heel for ages now. If the lowest common denominators could perform those simple tasks and migrate easily you IT guys wouldn't have jobs.
 
standards don't always form because of superior hardware and software. A lot of times they form because it's the path the lowest common denominator understands.

I'm going to call BS on this. The precise reason that standards form is because of superior hardware and software. Nobody is going to use software that is garbage unless they're stuck in the past.

I know some shops that still use Quark, and it works for them because some of the old geezers that work there never bothered to learn anything better. They have old clients, and they churn out the same quality work they've been doing for 15+ years.

On the other hand, I've worked with guys who can lay out and prepress just as well (if not better) in a third of the time. Hint: they aren't using Quark.

All of that nonsense aside, I have a portfolio that would make most people's heads spin - and if you want me to throw out some more examples I'll be happy to. I use modern development tools and practices to get my work done - you can put me in front of an OSX, Windows, or Linux machine and I'll get it done. If you put me in front of a G5 or a Core Duo I'll chuckle while I ask if you really want to spend $150/hr on me in front of a slow-ass machine.

I don't need to bash a certain platform to validate the work I do - the work speaks for itself.
 
I do a reasonable amount of calculations for my work; at least in my field, I think it's still a few years off from being worth buying GPU-heavy machines. Even though the calculations I do could be done better on a GPU, the programs that do it aren't nearly as advanced as the ones still just using the CPU. A couple of years ago my work bought a big tri-SLI system with whatever GPU was best at the time. Complete waste of money: at best it's been used for benchmarking some basic situations, but no practical simulations have been done that match the CPU work we do, simply because no commercial software exists to do it and no one wants to try to hand-write the GPU code themselves.

Yep, HP and Dell have huge markups that they use to negotiate with... still not sure it would bring the price down to less than the Mac Pro's though.

Yeah, that's the thing. Most of the algorithms people use need to be rewritten from the ground up to work on a GPU, and doing so is not easy.

Some people in the lab I worked in spent about two months trying to rewrite our Monte Carlo Tree Search-based algorithm using CUDA (easier to learn), and it took them a very long time to even begin to figure any of it out; there are just a lot of quirks. However, the gains seemed extremely promising: our algorithm went from running tens of thousands of playouts to millions of playouts in the same amount of time. Even though MCTS converges logarithmically, that should still significantly improve our performance: each playout may be weaker, but there are so many more of them.
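To make the batching idea concrete: this is a toy CPU-side sketch in Python/NumPy, not anything from the actual CUDA project. The game (fair coin flips scored +1/-1) and all the numbers are made up for illustration; the point is just that thousands of random playouts can be evaluated as one array operation instead of a per-playout loop, which is the same shape of computation a GPU kernel exploits by giving each playout its own thread.

```python
import numpy as np

def playouts_batched(n_playouts: int, n_moves: int = 20, seed: int = 0) -> float:
    """Run n_playouts random playouts of a toy game in one vectorized batch.

    Toy game (an assumption for illustration): each move is a fair coin
    flip worth +1 or -1, and a playout "wins" if its final score is
    positive. All playouts' moves are generated and scored at once.
    """
    rng = np.random.default_rng(seed)
    # Shape (n_playouts, n_moves): every playout's moves in one array.
    moves = rng.choice([-1, 1], size=(n_playouts, n_moves))
    # Sum each playout's moves, then take the fraction of winners.
    scores = moves.sum(axis=1)
    return float((scores > 0).mean())

win_rate = playouts_batched(100_000)
```

The estimate converges as `n_playouts` grows, which is the trade the post describes: each individual playout stays cheap and dumb, but running vastly more of them tightens the statistics.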

I am sure anyone doing other stuff (Neural Networks, etc) will run into their own individual problems. It is going to take years and years before stuff is ported to GPUs, but many applications could see a lot of improvement, similarly to how Bitcoin mining is massively faster on a GPU than a CPU.

Nevertheless, I do wish the new Mac Pro had more CPU power. Some tasks simply cannot be done in parallel.

My own opinion is it's better to start now rather than later, especially for folks like myself that plan on spending years in academia and where longer term projects make more sense.
 
I'm going to call BS on this. The precise reason that standards form is because of superior hardware and software. Nobody is going to use software that is garbage unless they're stuck in the past.

I know some shops that still use Quark, and it works for them because some of the old geezers that work there never bothered to learn anything better. They have old clients, and they churn out the same quality work they've been doing for 15+ years.

On the other hand, I've worked with guys who can lay out and prepress just as well (if not better) in a third of the time. Hint: they aren't using Quark.

All of that nonsense aside, I have a portfolio that would make most people's heads spin - and if you want me to throw out some more examples I'll be happy to. I use modern development tools and practices to get my work done - you can put me in front of an OSX, Windows, or Linux machine and I'll get it done. If you put me in front of a G5 or a Core Duo I'll chuckle while I ask if you really want to spend $150/hr on me in front of a slow-ass machine.

I don't need to bash a certain platform to validate the work I do - the work speaks for itself.

Well put!

now to go clean up all the coffee that I spit out of my nose at the $150/hr in front of a slow-ass machine.
 