Real-Time Ray Tracing is here and we are excited as heck!
Bwahahahahahaha... What's that sound? That's the sound of me maniacally laughing in front of my monitor while drawing questioning glances from the other person in the room.
The Holy Grail of Computer Graphics seems to have arrived, and it's taken me by surprise! Real-time ray tracing has always been at the top of my (and many others', I assume) wish list as a CG artist, but I honestly thought it wouldn't realistically arrive for at least another five years.
While offline renderers have been doing ray tracing since forever, the downside is that it takes a lot of time to make all those calculations of direct and indirect light bouncing off different surfaces in different ways, as it does in real life.
Real-time engines (game engines, for example) try to approximate this with all sorts of fudging and clever trickery like screen-space ambient occlusion, light/shadow maps, etc., because they need to run in real time, i.e. at least 30-ish frames per second. The problem is that these shortcuts tend to fall apart under closer scrutiny, or when you need them to do certain things they weren't designed for, which can be frustrating.
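To get a feel for why ray tracing is so expensive, here's a toy sketch of the core operation every ray tracer repeats millions of times per frame: testing a ray against a piece of geometry (a sphere, in this hypothetical example). This is nothing like how a production renderer or NVIDIA's hardware actually does it; it's just a back-of-the-envelope illustration of the raw numbers involved.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.

    Solves the quadratic |origin + t*direction - center|^2 = radius^2
    for t, assuming `direction` is a unit vector (so a == 1).
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # ignore hits behind the ray origin

# Even a modest budget of one primary ray, one shadow ray and two
# bounce rays per pixel at 1080p means ~8.3 million rays per frame,
# each needing intersection tests against the scene's geometry.
rays_per_frame = 1920 * 1080 * (1 + 1 + 2)
rays_per_second = rays_per_frame * 30  # at a 30 fps target
```

At 30 fps that back-of-the-envelope budget works out to roughly 250 million rays per second, before you add more lights or bounces, which is exactly why offline renderers churn for minutes or hours and why real-time engines have historically cheated instead.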
But sometime last week at the SIGGRAPH conference in Vancouver, NVIDIA announced and demonstrated its new tech: real-time ray tracing, running on the Unreal Engine, that actually looked convincing and not hopelessly fudged.
I hadn't even had time to pick my jaw up off the floor when they announced their new range of cards to support this tech: the RTX 2000 series. Their website states:
"NVIDIA® GeForce RTX™ delivers the ultimate PC gaming experience. Powered by the new NVIDIA Turing™ GPU architecture and the revolutionary RTX platform, RTX graphics cards bring together real-time ray tracing, artificial intelligence, and programmable shading. This is a whole new way to experience games."
So what does this mean for ordinary plebs/render junkies like us, without going into anything technical? First of all, in my opinion, this goes beyond gaming. The implications for VR, for instance, are delicious: virtual reality scenes can now get as close to photo-realistic as possible while still maintaining an acceptable frame rate, which will enhance immersion.
Also, every other type of 3D renderer could benefit from this tech, whether it be offline renderers like V-Ray and Corona or hybrids like Lumion. Waiting minutes, hours or days to render a single frame could be a thing of the past.
Lastly, it could totally change the way the VFX industry produces its content as studios adopt real-time engines in their production pipelines. I know I would.
Granted, there will always be a place for good old-fashioned offline, brute-force rendering, calculating and simulating every single cast ray bouncing off every single surface for that perfect shot, but this tech can make some of those processes much more efficient.
A good glimpse of what this tech can do is the Shadow of the Tomb Raider ray-tracing demo video. Here we see the player-controlled character, Lara Croft, in a sort of plaza at night during a Day of the Dead celebration, and it's glorious. Soft shadows with proper falloff, thrown by the firecrackers the children are playing with, dance and flicker convincingly across the cobbled floor as they move about. When she approaches the stage, the colored spotlights throw volumetric light on the crowd, with convincing shadows that interact with the scenery. All in real time!
"Soft shadows with proper falloff thrown by firecrackers... dance and flicker convincingly across the cobbled floor as they move about... all in real time!"
It's a good time to be alive! I just feel a bit sorry for the Red team at this point; unless they can pull a rabbit out of their butts soon, it's going to be a massacre in terms of market share again this generation.
Do you think this new development is something to be excited about, or am I just caught up in the hype of another flash-in-the-pan gimmick destined to go the way of the dodo? Let us know in the comments below.