CPU vs. GPU in movie production

WSJ’s Don Clark has written an interesting article about how GPUs (graphics processing units) and CPUs (central processing units) are competing to be the top processor in Hollywood’s computer graphics (CG) business. In “Harry Potter and the Half-Blood Prince”, ILM used GPUs to compute a fire effect*. This is quite a milestone for an industry typically dominated by CPU farms.

While there is indeed a nascent competition between the two types of chips, the reality is that, in the short term, they will be used for different things. Sorry, look elsewhere for drama! CPUs are largely ahead in terms of utilization, as they benefit from a huge software legacy. On the other hand, GPUs can process certain datasets, like particles and other parallel-friendly problems, orders of magnitude faster… but only if the data fits in GPU-accessible memory.
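
To make the memory caveat concrete, here is a minimal sketch (my own illustration, not ILM’s pipeline) of the kind of embarrassingly parallel particle update a GPU eats for breakfast. The particle count and time step are made-up, and note the catch from above: the whole buffer has to fit in device memory before any of this pays off.

```cuda
// Toy particle advection on the GPU (illustrative only, not a studio tool).
#include <cuda_runtime.h>
#include <cstdio>

struct Particle { float x, y, z, vx, vy, vz; };

__global__ void advect(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Each thread moves one particle independently: a parallel-friendly workload.
    p[i].x += p[i].vx * dt;
    p[i].y += p[i].vy * dt;
    p[i].z += p[i].vz * dt;
}

int main() {
    const int n = 1 << 20;                      // ~1 million particles (made-up count)
    const size_t bytes = n * sizeof(Particle);

    Particle* d_particles = nullptr;
    if (cudaMalloc(&d_particles, bytes) != cudaSuccess) {
        // If the dataset doesn't fit in GPU memory, the speed advantage evaporates.
        std::fprintf(stderr, "Particle buffer does not fit in device memory\n");
        return 1;
    }
    cudaMemset(d_particles, 0, bytes);

    advect<<<(n + 255) / 256, 256>>>(d_particles, n, 1.0f / 24.0f);  // one film frame
    cudaDeviceSynchronize();

    cudaFree(d_particles);
    return 0;
}
```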

In the longer term, GPUs will continue to expand their sphere of influence (to a point) and keep getting exponentially faster, thanks to their inherent parallelism. As they get more flexible, GPUs will take on more and more tasks done today by CPUs. Today it’s video compression; what will it be tomorrow? Although they were never designed to replace CPUs, GPUs have the potential to make CPUs less “valuable” if they become more potent number-crunchers. That’s why Intel is coming up with Larrabee, a GPU that should also be an interesting general-purpose computing chip.

Also, note that in the world of movie special effects, there are a lot of sponsorships aimed at buying “bragging rights”. Within the industry, it is well known that many server farms are donated by big hardware companies so they can say: “they used our hardware to make that great movie”. So keep that in mind when you hear this kind of claim: “technological value” might have nothing to do with it.

*Among many techniques, fire can be created using fluid dynamics, which can be very complex (and slow) to compute. Another way of doing it is to use “real life” flame footage plus particles (little or no computation required), but there are serious limitations as to what you can then do with the flames and from which angles they look right (trust me on this one). The third solution is to set the actor and the set on fire. You don’t need any chip and it can’t be more realistic, but insurers don’t like it, and second takes might be problematic if something goes wrong.
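
For the curious, here is a toy sketch (again my own illustration, not any studio’s solver) of the semi-Lagrangian advection step at the heart of grid-based fluid simulation, the physically based route the footnote calls complex and slow. The grid size, velocity field, and time step are invented for illustration.

```cuda
// One advection step of a scalar field (e.g. temperature) for a toy fluid sim.
#include <cuda_runtime.h>
#include <vector>

__global__ void advect_scalar(const float* src, float* dst,
                              const float* velx, const float* vely,
                              int w, int h, float dt) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int i = y * w + x;

    // Trace this cell backwards along the velocity field and sample where its
    // contents came from (nearest-neighbour lookup for brevity).
    float px = fminf(fmaxf(x - dt * velx[i], 0.0f), (float)(w - 1));
    float py = fminf(fmaxf(y - dt * vely[i], 0.0f), (float)(h - 1));
    dst[i] = src[(int)(py + 0.5f) * w + (int)(px + 0.5f)];
}

int main() {
    const int w = 256, h = 256;                 // tiny grid; real sims are far larger
    const size_t bytes = w * h * sizeof(float);
    std::vector<float> host(w * h, 0.0f);
    host[(h / 2) * w + w / 2] = 1.0f;           // a single hot cell as a "flame" seed

    float *src, *dst, *vx, *vy;
    cudaMalloc(&src, bytes); cudaMalloc(&dst, bytes);
    cudaMalloc(&vx, bytes);  cudaMalloc(&vy, bytes);
    cudaMemcpy(src, host.data(), bytes, cudaMemcpyHostToDevice);
    cudaMemset(vx, 0, bytes);                   // zero velocity: the field stays put
    cudaMemset(vy, 0, bytes);

    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    advect_scalar<<<grid, block>>>(src, dst, vx, vy, w, h, 1.0f / 24.0f);
    cudaDeviceSynchronize();

    cudaFree(src); cudaFree(dst); cudaFree(vx); cudaFree(vy);
    return 0;
}
```

A full solver repeats steps like this for velocity, pressure, and combustion quantities every frame, which is why this route is so much hungrier than compositing real flame footage.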

Links

Intel Larrabee
Nvidia CUDA
AMD Stream
