As NVIDIA launched its new GeForce GTX 970 and GeForce GTX 980 graphics processors (GPUs), its “Demo team” took a stab at settling one of the big controversial stories of the century: was the lunar landing real or fake? There is ample literature on the subject, and there is no shortage of reasons why some people think that the lunar landing and mission were shot in a studio.

One of the main arguments of conspiracy theorists was that the lighting seemed wrong in the scene where Buzz Aldrin comes out of the lunar module. It was suggested that there was a light next to the camera, which would mean that it was shot in a studio, since no such equipment was brought to the moon.

Because the sun’s light comes from the opposite side, the argument was that no other light source could explain the illumination of Aldrin’s right side. David Groves was one of the people making this argument, based on “science”: he used ray tracing (a well-known 3D rendering technique) to build his theory and approximate a light source’s position somewhere close to the camera.

To dig into this, NVIDIA engineers reconstructed the scene as precisely as they could, gathering as much data as possible on the reflectivity and other light-related properties of the objects in the scene (on-screen or off), and used a physically-based lighting model along with their own global illumination algorithm to see what a more accurate “light simulation” would yield. Watch their whole explanation here:

For some context, physically-based lighting is the notion of measuring how light reacts to different types of materials and using that data for real-time rendering. In the past, the industry used arbitrary values that were more “art-based” than “physics-based”.
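To make that concrete, here is a minimal, illustrative sketch in Python (with made-up values, not NVIDIA’s actual measurements) of the core idea: the shading math is fed measured material data, such as albedo, instead of hand-tuned artistic constants.

```python
# Minimal sketch of "physically-based" shading: the material response is driven
# by measured data (here, albedo) instead of arbitrary hand-tuned constants.
# All numbers below are illustrative placeholders, not NVIDIA's measurements.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert_diffuse(albedo, normal, light_dir, light_intensity):
    """Energy-conserving Lambertian diffuse term for a measured albedo (0..1)."""
    n_dot_l = max(0.0, dot(normal, light_dir))
    return [a / math.pi * light_intensity * n_dot_l for a in albedo]

# Hypothetical measured albedos: lunar regolith is dark, a space suit is very bright.
regolith_albedo = [0.12, 0.11, 0.10]   # RGB reflectance
suit_albedo     = [0.85, 0.85, 0.85]

sun_dir       = [0.0, 0.707, 0.707]    # unit vector toward the sun
surface_up    = [0.0, 1.0, 0.0]
sun_intensity = 1.0                    # arbitrary units

print(lambert_diffuse(regolith_albedo, surface_up, sun_dir, sun_intensity))
print(lambert_diffuse(suit_albedo, surface_up, sun_dir, sun_intensity))
```

Real physically-based pipelines add specular and microfacet terms on top of this, but the principle is the same: the numbers come from measurements rather than artistic guesswork.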

Global illumination means that light bouncing from one surface onto another is taken into account. This is a recent evolution in real-time rendering, and it is critical because large areas can reflect (or emit) a significant amount of light, which is noticeable to the viewer.
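As a rough illustration (again a toy sketch, not NVIDIA’s actual global illumination algorithm), the snippet below adds a single indirect bounce: light reflected by a bright surface A is treated as a new light source for a nearby surface B that receives no direct sunlight. Distance falloff and visibility are ignored to keep it short.

```python
# Toy one-bounce global illumination: sunlight hits surface A, and the light
# A reflects is then added to what surface B receives. Geometry terms such as
# distance falloff are omitted for brevity; all values are hypothetical.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse(albedo, normal, light_dir, intensity):
    """Diffuse response of a surface to light arriving from light_dir."""
    return albedo / math.pi * intensity * max(0.0, dot(normal, light_dir))

sun_dir, sun_intensity = [0.0, 0.0, 1.0], 1.0

# Surface A: a bright suit facing the sun. Surface B: a patch facing away
# from the sun (fully shadowed for direct light) but facing toward A.
a_albedo, a_normal = 0.85, [0.0, 0.0, 1.0]
b_albedo, b_normal = 0.30, [0.0, 0.0, -1.0]

b_direct = diffuse(b_albedo, b_normal, sun_dir, sun_intensity)    # 0.0, pitch black

a_outgoing = diffuse(a_albedo, a_normal, sun_dir, sun_intensity)  # light leaving A
b_indirect = diffuse(b_albedo, b_normal, [0.0, 0.0, -1.0], a_outgoing)

print("direct only:    ", b_direct)                # 0.0
print("with one bounce:", b_direct + b_indirect)   # small, but clearly not zero
```

Without that second step, B would render completely black; with it, B picks up a visible amount of light purely from A’s reflection, which is exactly the effect global illumination captures.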

When the NVIDIA team started to run the simulation, they realized that there was a significant amount of light being reflected from the suit of the second astronaut (who was filming) onto Aldrin’s suit. This explained precisely where the extra light was coming from, and debunked the notion that spotlights were used during the filming.
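As a rough sanity check of why that is plausible (with numbers I am making up for illustration, not NVIDIA’s data): a sunlit white suit reflects most of the light hitting it, and if it occupies even a small fraction of the view from the shadowed side of a nearby subject, the bounced light amounts to a few percent of direct sunlight, which is easily visible on film against an otherwise black shadow.

```python
# Back-of-envelope estimate with hypothetical numbers (not NVIDIA's data):
# how much light could a sunlit white suit bounce onto a nearby shadowed subject?
sun_irradiance = 1.0    # direct sunlight hitting the suit, arbitrary units
suit_albedo    = 0.85   # white suit fabric reflects most incoming light
view_fraction  = 0.05   # suit covers ~5% of the (cosine-weighted) hemisphere
                        # as seen from the subject's shadowed side -- a guess

# For a Lambertian reflector, bounced irradiance ~= albedo * E_sun * view_fraction.
bounce = suit_albedo * sun_irradiance * view_fraction
print(f"bounced light ~= {bounce:.1%} of direct sunlight")   # ~= 4.2%
```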

"THE EXTRA LIGHT WAS CONTRIBUTED AS A SECONDARY BOUNCE FROM A SPACE SUIT"David Groves was right in saying that there was a light source near the camera, but he was wrong in interpreting that as being a spotlight. Instead, the extra light was contributed as a secondary bounce from a space suit. The physically-based rendering scientifically proves that the lighting is possible and that it is consistent with the original photo. NVIDIA rests its case, and I think that this convincing enough to debunk that conspiracy theory.

I’m amazed that no one has done this with pre-computed rendering before (at least, not that I know of). NVIDIA should get a lot of credit for coming up with the idea, which is one of the best demo ideas the company has had; I can certainly appreciate the work.

Being able to do this in real time is a great way to demonstrate how far lighting has come in the past five years. It also shows that every detail counts, and that things that don’t seem to contribute to the scene can actually make a noticeable difference. Even the NVIDIA folks were not 100% sure that the suit could cast that much light on Aldrin, but it does. Sometimes you just have to see it to believe it.

NVIDIA has promised to release a real-time demo, and I’m sure they are currently packaging the assets and making a clean version for a public release. This will take some “weeks”, but you should be able to run it on your own PC soon.

