I saw the announcement from Imagination Technologies earlier in the week, but I wanted to take a look at their Ray-Tracing demos and presentations to get a better idea. I finally got an opportunity to do so, and I really liked what I saw. The latest PowerVR GPU architecture has a new Ray Tracing Unit (or RTU) that can trace the path of light rays from one surface to another in order to accelerate a number of graphics techniques, ranging from shadows and reflections to visibility algorithms, and more.

Hybrid Rendering Is The Smart Way

What I liked the most about Imagination Technologies’ new feature is the fact that instead of trying to present Ray-Tracing as a stand-alone real-time rendering solution (like Intel did some time ago), they are being completely pragmatic and have dug deep to find some real use cases that can be integrated as part of a hybrid (raster/ray-trace) rendering pipeline that gets the best of both worlds.

Below is an example of a graphics pipeline that may be used in a modern 3D game. Pay attention to the “Compute Lighting” stage, which can be replaced by ray-tracing in the second slide.

[Slides: “Ray-tracing in games” pipeline, regular vs. hybrid]
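As an illustration only (the stage names below are made up and do not reflect PowerVR’s actual API), the two slides boil down to swapping a single stage: the geometry/G-buffer pass and everything downstream stay the same, and only the lighting computation changes between the regular and hybrid pipelines. A minimal sketch:

```python
def rasterize_geometry(scene):
    # Stand-in for the G-buffer pass (per-pixel normals, depth, albedo);
    # here the "G-buffer" is just the scene's pixel list.
    return {"pixels": scene["pixels"]}

def light_raster(gbuffer):
    # Classic raster lighting: shadow maps, light probes, etc.
    return [p * 0.5 for p in gbuffer["pixels"]]

def light_raytraced(gbuffer):
    # Hybrid lighting: rays cast from G-buffer positions instead.
    return [p * 0.5 for p in gbuffer["pixels"]]

def render_frame(scene, hybrid=False):
    # Only the lighting stage differs between the two pipelines.
    gbuffer = rasterize_geometry(scene)
    lighting = light_raytraced if hybrid else light_raster
    return lighting(gbuffer)
```

The point of the hybrid approach is exactly this substitutability: everything upstream and downstream of the lighting stage is untouched, so a developer can flip between the two paths per platform.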

To step back and provide some context, Ray Tracing (following individual rays of light for each pixel) is a rendering method that has been around for decades, so this is not a new discovery by any means. However, historically, Rasterization (drawing triangles), another rendering technique, has been much more successful at being hardware-accelerated. Ray Tracing tends to require more code branching, and this is something that most GPUs have been uncomfortable with, even if they have gotten much better at it over the years.
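To see why ray tracing produces the branching that GPUs dislike, consider a toy tracer (a 1D “scene” invented purely for illustration, unrelated to PowerVR’s hardware): each ray takes a different code path depending on whether it misses, hits a diffuse surface, or recurses off a mirror, so neighbouring rays in a SIMD batch diverge, whereas rasterization does the same per-triangle work for every pixel.

```python
SCENE = [(2.0, "diffuse"), (5.0, "mirror")]  # toy 1D "surfaces"

def trace(x, depth=0):
    # Data-dependent control flow: miss, diffuse hit, or recursive
    # mirror bounce. Adjacent rays can take entirely different paths.
    if depth > 4:
        return 0.0                              # recursion cutoff
    for pos, material in SCENE:
        if abs(x - pos) < 0.5:                  # "intersection" test
            if material == "mirror":
                # Bounce: spawn a new ray and recurse.
                return 0.8 * trace(pos + 3.0, depth + 1)
            return 0.6                          # diffuse: shade and stop
    return 1.0                                  # miss: "sky" color
```

For example, `trace(0.0)` misses everything, `trace(2.0)` stops at the diffuse surface, and `trace(5.0)` recurses off the mirror; three rays, three different control-flow paths.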

Ray-Tracing operations can greatly simplify and improve the creation of reflections, shadows, soft shadows and complex lighting, but the ratio between performance and believability has always prevented Ray-Tracing from taking off in games. I think that ray-tracing every surface in today’s games is just NOT an option, performance-wise. But even with limited hardware Ray-Tracing performance, it may be possible to greatly improve or simplify specific effects/tasks. I’m thinking about lighting/shadowing and global illumination in particular. Even if performance ended up being the same, Ray-Tracing could simplify things appreciably.

PowerVR Says That It Is Fast Enough Today

Imagination Technologies has not presented the complete details of how they handle the Ray-Tracing scene data internally, and as of today, we’re not sure how long it takes to build the acceleration structures required to perform Ray-Tracing (RT). We also don’t know how the API submits and selects which parts of the current scene will be used during RT operations.

Ray-Traced reflections are most useful when everything is dynamic (if not, devs often approximate reflections with a simple cubemap), but on the other hand, many acceleration data structures also need to be re-created when things move around. How fast is that? I don’t know. This could have a major impact on the destructibility of the scene, for example. The PowerVR team tells me that casting fewer than 10 rays per pixel at 720p/30FPS is a realistic scenario for an actual app. For a mobile chip, these are very impressive numbers.
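That claim is easy to sanity-check with back-of-the-envelope arithmetic: at 720p, 10 rays per pixel and 30 frames per second works out to roughly 276 million rays per second of sustained throughput.

```python
width, height = 1280, 720        # 720p
rays_per_pixel = 10
fps = 30

rays_per_frame = width * height * rays_per_pixel   # 9,216,000 rays
rays_per_second = rays_per_frame * fps             # 276,480,000 rays
print(f"{rays_per_second / 1e6:.1f} Mrays/s")      # → 276.5 Mrays/s
```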

It becomes even more interesting because the number of rays being cast can vary dynamically per pixel. Since lighting can be low-frequency (blurry) in many places, it should be possible to cast fewer rays and use a blur function to approximate low-frequency lighting. Interestingly, Global Illumination is mostly low-frequency, so this could be done at low resolution. In places where the frequency is high, it’s possible to cast more rays – at least, in theory.
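One hypothetical way to drive that per-pixel budget (entirely my own sketch, not something Imagination has described, with arbitrary thresholds): use local contrast in a small luminance neighbourhood as a cheap “frequency” proxy and scale the ray count accordingly, blurring or upsampling the sparse results afterwards.

```python
def ray_budget(window, max_rays=8, min_rays=1):
    # "Frequency" proxy: luminance contrast within a pixel's
    # neighbourhood. Thresholds are arbitrary illustration values.
    contrast = max(window) - min(window)
    if contrast < 0.05:
        return min_rays          # smooth region: few rays, blur later
    if contrast < 0.3:
        return max_rays // 2     # moderate detail
    return max_rays              # hard edge / high-frequency lighting
```

A flat wall would get one ray per pixel, while a shadow boundary would get the full budget.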

Helping Computer Graphics Be Less “Hacky”

For most of its history, Computer Graphics has been considered by its most brilliant minds a “giant hack”, because what we’ve had was such a crude approximation of light’s interaction with matter. Today’s real-time graphics are leaps and bounds better than they used to be, but they are still quite hacky at times. Developers have used sheer ingenuity to amaze us with their latest rendering techniques, but anything that would make rendering a little more orthogonal would be very welcome.

It’s not like game developers cannot do reflections, soft shadows or global illumination today — they can, and I think that any rasterizer vs. ray-tracer comparison should be realistic about that. Many real-time ray-tracing presentations are utterly unrealistic (if not downright misleading) when doing these comparisons, and this is quite annoying. Games like Killzone: Shadow Fall or Battlefield have excellent lighting, and I have never seen a real-time Ray-Tracing demo that looked half as good.

For credibility’s sake, I think that there is a need for better RT demos, and believe me, creating cool tech is a very different profession from building demos to show it off.

Yet, it is precisely for that reason that Ray-Tracing is still unproven in a gaming context. I don’t know of a single major title that uses what I would call ray-tracing (SSAO is a very limited form of ray-tracing, and one could argue that rasterization could be called “primary ray-tracing” too). Still, the Hybrid Rendering approach is what I know will work, and the beauty of it is that developers don’t need to “commit” to ray-tracing. They can have both techniques and switch depending on the target platform.

Very Promising, Yet Unproven

Imagination Technologies needs real games, or much better demos.

Without actually using Imagination’s new feature first-hand, and without a complete picture of the actual performance, it’s difficult to judge how truly useful it is going to be (Geometry Shaders were lame when first introduced). But I think that developers will find great ways to utilize it, provided they are confident that it’s worth spending the time to write new code when only one vendor supports it at the moment. If support for Ray-Tracing instructions were to spread to other GPU vendors, this could turn into something huge. Microsoft and the OpenGL board would be wise to support this. That said, it’s probably too late for DX12.

Without a doubt, other GPU vendors must be frantically looking at this right now, because for the first time, this is a real, practical use of Ray-Tracing that can actually work in today’s games. I wouldn’t be surprised if many competitors were caught off guard. Competitors’ management and engineering staff must be having a healthy exchange of emails just about now…

By the looks of it, Lighting and Reflections in the context of a deferred renderer would be the most obvious uses, but you can think of many other ways to use Ray-Tracing hardware, including collision detection that could be fed into an AI engine (keep an eye on the latency, though…). I don’t really care much about reflections because the current screen-space hacks with a cubemap fallback work well enough.

But even limited Ray-Tracing could improve Lighting by a ton (shadows, Ambient Occlusion, Light Gathering). Most of today’s raster-based techniques for light transport are extremely complex to implement and require many hacks to fix issues introduced by low-resolution render targets.

Uphill Battle, For Now

Despite everything that they have done well, this is going to be somewhat of an uphill battle for Imagination. Their hardware may be widely used, but it is mobile-centric, and at the moment, computer graphics leaps tend to happen on PC or game consoles. It is difficult to find developers with enough control over their own engines for a quick integration. I’m sure that folks at DICE and other high-end graphics studios will gladly look at this for the kick of it, but who knows when it will be integrated into a product.

Their best chance is to convince the likes of Unity or Epic Games to investigate this, and possibly add it to their engines so that most developers can benefit. Fortunately, this is very much doable, and if this is as good as Imagination claims, we could see it in engines within 8-12 months. If that happens, and if it works well, I suspect that competitors will be forced to contemplate pursuing the same path.

Conclusion: Bold, Innovative

I didn’t expect this from Imagination Technologies, just like I was surprised when Intel brought a great solution for order-independent transparency. It is really nice to see innovation come from many companies and shake things up. The idea of having a practical approach to ray-tracing is bad-ass. Next, Imagination will have to convince game developers to put it to use, and only then will we know how good it really is. In the meantime, I’m impressed by the concept.
