WSJ’s Don Clark has written an interesting article about how GPUs (graphics processing units) and CPUs (central processing units) are competing to be the top processor in Hollywood’s computer graphics (CG) business. In “Harry Potter and the Half-Blood Prince”, ILM used GPUs to compute a fire effect*. This is quite a milestone for an industry typically dominated by CPU farms.
While there is indeed a nascent competition between the two types of chips, the reality is that in the short term, each will be used for different things. Sorry, look elsewhere for drama! CPUs are far ahead in terms of utilization, as they benefit from a huge software legacy. On the other hand, GPUs can process particular datasets, like particles and other parallel-friendly problems, orders of magnitude faster… but only if the data can fit in GPU-accessible memory.
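What makes a workload “parallel-friendly”? A minimal sketch in plain Python (hypothetical data, not any studio’s actual code): each particle’s new state depends only on its own previous state, so thousands of GPU threads could each handle one particle with no coordination.

```python
# Illustrative sketch: per-particle updates have no cross-particle
# dependencies, which is exactly the shape of problem GPUs excel at.

def step_particle(p, dt=0.1, gravity=-9.8):
    """Advance one particle (position, velocity) independently of all others."""
    x, v = p
    return (x + v * dt, v + gravity * dt)

particles = [(0.0, 1.0), (2.0, -0.5), (5.0, 3.0)]

# On a CPU this map runs serially; on a GPU every element could be
# handled by its own thread at the same time.
particles = [step_particle(p) for p in particles]
```

The catch mentioned above still applies: all those particles have to live in GPU-accessible memory for the speedup to materialize.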
In the longer term, GPUs will continue to expand their sphere of influence (to a point) and continue to become exponentially faster, thanks to their inherent parallelism. As they get more flexible, GPUs will take on more and more tasks done today by CPUs. Today it’s video compression; what will it be tomorrow? Although they are not designed to ever replace CPUs, GPUs have the potential to make CPUs less “valuable” if they become more potent number-crunchers. That’s why Intel is coming up with Larrabee, a GPU that should also be an interesting general-purpose computing chip.
Also, note that in the world of movie special effects, there are a lot of sponsorships aimed at buying “bragging rights”. Within the industry, it is well known that many server farms are donated by big hardware companies that want to say: “they used our hardware to make that great movie”. So keep that in mind when you hear claims like that; “technological value” might have nothing to do with it.
*Among many techniques, fire can be created using fluid dynamics, which can be very complex (and slow) to compute. Another way of doing it is to use footage of real flames plus particles (little or no computation required), but there are serious limitations as to what you can then do with the flames and from which angles they look right (trust me on this one). The third solution is to set the actor and the set on fire. You don’t need any chip and it can’t get more realistic, but insurers don’t like it and second takes might be problematic if something goes wrong.
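To give a sense of why fluid simulation is slow: solvers repeat small grid updates like the toy diffusion step below thousands of times per frame over millions of 3-D cells. This is a minimal illustrative sketch (1-D, plain Python, not ILM’s actual method).

```python
# Toy sketch: one explicit diffusion step on a 1-D grid with fixed
# boundary cells. Real fire/smoke solvers do many such passes
# (advection, diffusion, pressure projection) per frame in 3-D.

def diffuse_step(field, k=0.1):
    """One explicit diffusion update; each interior cell moves toward
    the average of its neighbors."""
    return [field[0]] + [
        field[i] + k * (field[i - 1] - 2 * field[i] + field[i + 1])
        for i in range(1, len(field) - 1)
    ] + [field[-1]]

temps = [0.0, 0.0, 100.0, 0.0, 0.0]
for _ in range(10):  # real solvers iterate far more, on far larger grids
    temps = diffuse_step(temps)
```

Each cell’s update reads only its immediate neighbors, so this kind of stencil computation is also a natural fit for GPUs, which is part of why the ILM result above is plausible.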