Pixar's render farm

sirsnits

Pics some guy took while on a tour at Pixar.

[Image: pixarrenderfarm1.jpg]

[Image: pixarrenderfarm2.jpg]


For the movie "Cars," rendering took 1 hour a frame, and apparently it would have taken 9 hours a frame if they were not running on Linux (I read it somewhere).
 
But can it run Crysis?!

LOL no but really, that's pretty sweet. Could you find a source for the "9 hours a frame if they were not running on Linux" part? Sounds interesting, I'd love to read more...
 

Hah, was it "welovelinux.com" or something? But seriously, I don't buy that at all. It all comes down to CPUs and their cores; nothing is going to magically run 9x faster on Linux than on Windows. Plenty of companies don't use Linux. If it really were that ridiculously much faster, EVERY single company on the planet would use it. The need for speed and efficiency in CG and 3D work is that great.
 

I have my doubts also.

Possibly just user preference.
 

It's worth mentioning that the 1-hour figure is probably 1 frame per hour per 1U. If that entire farm were only capable of one frame per hour, Cars would have taken a bit over 19 years to render (assuming 24 fps).
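A quick sanity check on that figure (the ~117-minute runtime and 24 fps are my assumptions, not numbers from the thread):

# Rough check of the "19 years" claim: ~117-minute film at 24 fps,
# 1 hour per frame for the entire farm.
runtime_min = 117
fps = 24
frames = runtime_min * 60 * fps      # ~168,480 frames
years = frames / (24 * 365)          # total render-hours / hours per year
print(f"{frames} frames -> about {years:.1f} years")

That comes out to about 19.2 years, so the estimate holds up under those assumptions.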
 


Good point, each unit must be rendering at an hour per frame; that seems like a more accurate statement.
 
So in essence, in order to render the movie in enough time to review and fix problems, they'd have to have over 800 1Us' worth of computing power. JEEZZZZUS. Imagine what they had to do for Toy Story 1.
 
I wish I had the time to do the math on how long that would take :)
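For reference, it's a quick calculation; here's a minimal sketch, assuming a ~117-minute film at 24 fps, 1 render-hour per frame per 1U node, and perfect parallelism (all my assumptions, not thread figures):

# Wall-clock render time as a function of farm size.
frames = 117 * 60 * 24               # ~168,480 frames at 24 fps
for nodes in (1, 100, 800, 3000):
    wall_days = frames / nodes / 24  # 1 render-hour per frame, split across nodes
    print(f"{nodes:>5} 1U nodes: {wall_days:8.1f} days")

Under those assumptions, 800 nodes works out to roughly nine days for one full pass of the film.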
 
Hrm, wonder what the specs are. I bet it'd take 10 mins/1U with GPGPU rendering...

Unless they code it themselves, which I'm sure they can (if I recall correctly, RenderMan was developed by Pixar anyway... I think), there's not really software-side support for GPGPU in publicly available render engines yet. I'm eagerly awaiting it myself, though, and if it's faster you can be certain they'll be doing it.
 
Very cool. I would hope it's 1 frame per 1U per hour, or something close to that, as well. If it were 1 frame per hour for the whole farm, that would take an insane amount of time to make any movie, and I can't really imagine what would take that long to render with that much power.

Looking at the comments on that Digg article, it looks like it might have been CPU time, which would make much more sense.
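If the figure was per-frame CPU time rather than wall-clock time, spreading it across cores shrinks the wait; a toy illustration (the core counts and perfect scaling are assumptions on my part):

# If "1 hour per frame" means CPU time, wall-clock time per frame
# drops with core count, assuming the renderer parallelizes well.
cpu_hours_per_frame = 1.0
for cores in (1, 2, 4, 8):
    wall_min = cpu_hours_per_frame / cores * 60
    print(f"{cores} cores: {wall_min:.0f} min/frame wall-clock")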


Yes, RenderMan is developed by Pixar... it really is what makes Pixar ;)
 