The U.S. Department of Energy has awarded NVIDIA $12M to conduct research on exascale supercomputing. Exascale computing describes a computer system capable of reaching one exaflop, or one quintillion floating point operations per second. For comparison, current supercomputers are still measured in petaflops (1 petaflop = one quadrillion floating point operations per second). Today's fastest supercomputer clocks in at 16 petaflops, and exascale is 1,000 times larger than petascale.
The $12M essentially pays NVIDIA for two years of research in critical areas that would lead to building an exascale compute architecture more power-efficient than anything we have today. Bill Dally, NVIDIA’s Chief Scientist, suggests that if one were to build an exascale computer based on Intel’s x86 architecture today, the energy required would reach 2 gigawatts, or “the entire output of the Hoover Dam,” as he puts it.
NVIDIA believes that an exascale system built with its latest Kepler GPU would consume 150 megawatts. However, the goal of this research is to build such a system using only 20 megawatts by 2020. Obviously, this is quite difficult, and NVIDIA plans to use different types of processors working together to achieve it.
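The figures above can be sanity-checked with some quick arithmetic. The sketch below uses only numbers stated in the article; the efficiency figure in GFLOPS per watt is a derived value, not one the article quotes.

```python
# Back-of-the-envelope check of the article's exascale figures.

EXAFLOP = 1e18   # floating point operations per second (one quintillion)
PETAFLOP = 1e15  # one quadrillion flops

# One exaflop expressed in petaflops (the "1,000x larger" claim):
print(EXAFLOP / PETAFLOP)  # 1000.0

# Power reduction implied by going from the 150 MW Kepler-based
# estimate down to the 20 MW research target:
kepler_power_mw = 150
target_power_mw = 20
print(kepler_power_mw / target_power_mw)  # 7.5

# Efficiency needed to deliver 1 exaflop within 20 MW,
# in GFLOPS per watt (derived, not stated in the article):
gflops_per_watt = EXAFLOP / (target_power_mw * 1e6) / 1e9
print(gflops_per_watt)  # 50.0
```

In other words, hitting the 2020 target means sustaining roughly 50 GFLOPS for every watt consumed, which is why NVIDIA frames this as a power-efficiency problem rather than a raw-performance one.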
That said, while graphics processors (GPUs) and supercomputing were completely dissociated less than a decade ago, many supercomputers now use GPUs or derivatives. So far, these chips have been commercially successful because they serve a dual purpose: they power the computer graphics that most of us use in PCs and mobile devices, but their architecture can also be used for non-graphics computing.
Computer graphics has justified and funded their continued development; the non-graphics computing market became significant only after a decade. That’s why it would be difficult for anyone else, even Intel, to invest in and build a massively parallel chip that is not also a graphics processor. Still, the end of the decade is not so far away, and we are talking about a 7.5x reduction in power consumption – do you think that NVIDIA can achieve this ambitious goal? [NVIDIA Blog post]