Ray tracing may be the new buzzword in the gaming world, but it's lived in the realm of movies and TV for quite a while.
Ever wonder how computer-generated imagery (CGI) looks so realistic? Everything from explosions to mythical creatures looks as if it's been captured in real life.
In reality, they’re nothing more than data come to life, with powerful algorithms ensuring light behaves on the screen the way it does in the real world.
But what exactly does that mean?
How does ray tracing differ from other commonly used techniques like rasterization?
And does the advent of real-time ray tracing present possibilities for industries other than gaming, such as automaking, AEC, training, and more?
Let’s break it down.
Ray tracing is inspired by one of the most ingenious machines: our eyes
Light is a unique type of energy. It behaves differently depending on what kind of matter it interacts with. For instance, if light hits water (a transparent object), it travels through it.
If light hits a wooden table, a couch, or a person sitting on the couch (opaque objects), it changes its path.
And then there are those objects that sit somewhere in the middle (translucent objects like a plastic bag) that let light partially pass through them.
It may be grade school physics, but it’s worth revisiting.
The way light interacts with objects determines how we see things in the real world. It’s a phenomenon we take for granted, but an area of physics that computer scientists and engineers have worked hard to replicate in the graphics realm.
When this behavior is accurately reflected in computer graphics, it creates ultra-realistic images that are indistinguishable from photographs. The computer-generated image accurately mimics the behavior of light in the real world including:
- Refraction: the bending of light as it passes from one medium to another
- Shadows: the blocking of light by an opaque object
- Reflection: light bouncing off a surface and changing direction
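To make the idea concrete, here is a minimal, illustrative ray tracing sketch. It is not any production renderer: it casts one ray per pixel from a pinhole camera, tests it against a single hard-coded sphere, and fires a shadow ray toward a made-up light position to decide whether each hit point is lit or in shadow. All scene values (sphere position, light position, image size) are arbitrary assumptions for the demo.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along a unit-length ray to the sphere, or None if missed."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

WIDTH, HEIGHT = 20, 10                       # tiny ASCII "screen"
sphere_center, sphere_radius = [0.0, 0.0, -3.0], 1.0
light_pos = [2.0, 2.0, 0.0]                  # arbitrary light position

rows = []
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # Map each pixel to a ray through a simple pinhole camera at the origin.
        x = (i / WIDTH) * 2 - 1
        y = 1 - (j / HEIGHT) * 2
        ray_dir = normalize([x, y, -1.0])
        t = intersect_sphere([0, 0, 0], ray_dir, sphere_center, sphere_radius)
        if t is None:
            row += "."                        # ray escaped: background
        else:
            hit = [ray_dir[k] * t for k in range(3)]
            to_light = normalize([light_pos[k] - hit[k] for k in range(3)])
            # Shadow ray: nudge off the surface, then test toward the light.
            shadow_origin = [hit[k] + to_light[k] * 1e-3 for k in range(3)]
            blocked = intersect_sphere(shadow_origin, to_light,
                                       sphere_center, sphere_radius)
            row += "#" if blocked else "o"    # "#" shadowed, "o" lit
    rows.append(row)

print("\n".join(rows))
```

Even at this toy scale, the structure mirrors how a real ray tracer produces reflections, refraction, and shadows: each effect is just another ray spawned at the hit point.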
This is the best way to achieve photorealistic 3D images, like the ones in movies. The difference between filmmakers and gamers is that filmmakers have the luxury of time. They can run ray tracing on large server farms for weeks or even months to create their ultra-realistic special effects.
On the other hand, interactive media like games rely on real-time rendering, which means they don't have the same flexibility. Real-time ray tracing in video games has long been considered too computationally demanding to reach the mainstream consumer market.
That is, until the release of Nvidia's Turing architecture, designed to bring real-time ray tracing to the gaming world.
Read Next: Your top online resources for creating real-time 3D content with Unity and Unreal
What real-time 3D rendering technique have video games traditionally relied on?
At present, most video games rely on rasterization for their real-time 3D rendering needs.
A 3D scene is made up of countless triangles of varying shapes and sizes, and a rasterizer determines which of those triangles will be visible on a 2D screen from a specific perspective.
Information about shading and color is then added to each pixel in each triangle.
This method is popular because it’s manageable in real time and computationally inexpensive compared to ray tracing.
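The steps above can be sketched in a few lines of code. This toy rasterizer (illustrative only, with made-up triangle coordinates) walks the pixels of a small screen and uses the standard edge-function (half-space) test to decide whether each pixel center falls inside one screen-space triangle; a real pipeline would then shade each covered pixel.

```python
def edge(ax, ay, bx, by, px, py):
    """Signed-area test: the sign tells which side of edge a->b the point p is on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return rows of '#' (covered) and '.' (empty) for one screen-space triangle."""
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            px, py = x + 0.5, y + 0.5      # sample at the pixel center
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside when the point is on the same side of all three edges
            # (both orderings accepted, so winding direction doesn't matter).
            inside = (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
                     (w0 <= 0 and w1 <= 0 and w2 <= 0)
            row += "#" if inside else "."
        rows.append(row)
    return rows

# Arbitrary example triangle on a 20x10 "screen".
image = rasterize_triangle((2, 1), (17, 4), (6, 9), 20, 10)
print("\n".join(image))
```

Note what's missing compared to ray tracing: no rays, no light, no scene-wide visibility. Coverage and shading are computed per triangle, which is why rasterized lighting effects must be prepared in advance.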
That said, rasterization isn’t as dynamic as ray tracing. In a ray tracing environment, “light” automatically responds to new inputs, the way it does in the real world.
But isn’t there already shading and lighting in current video games? This is actually the result of clever work from game developers who envision a number of scenarios and pre-build rasterized shadows or reflections based on possible gamer interactions.
But since it’s impossible for them to predict every action a gamer might take, there’ll be instances where the lighting is slightly off.
As Wired explains:
“If the player alters the environment by—for example—blasting a hole through a wall, the light in the scene won’t change to stream through that hole unless the developers have specifically planned for that possibility. With real-time ray tracing, the light would adjust automatically.”
At present, some games use ray tracing for specific sections where it offers the most value, but it has yet to be used for entire games.
Rasterization depends on those pre-built scenarios; ray tracing, on the other hand, uses algorithms that know how to respond to whatever context arises.
Real-time ray tracing has exciting applications outside the gaming world
There’s an eager audience for real-time ray tracing in the gaming world, but there’s also an appetite among consumers in other sectors as well.
Consider the online car shopping experience. A vehicle, whether it's a minivan, a sports car, or a motorcycle, plays an important role in the driver's life. Imagining that car in context is part of the buying experience.
Related Read: The future of car buying lies in photorealistic 3D configurators
With real-time ray tracing, customers can build an ultra-realistic model of their car using an online configurator and then place that car against various backdrops including urban landscapes, vacation destinations, and more.
In addition, automotive companies save on expensive professional photoshoots of their cars. Instead of flying to different destinations and paying for an entire production each time new shots are needed, they can use ray tracing technology to produce realistic renderings that are indistinguishable from photos.
In fact, Unity Technologies and Nvidia announced a partnership earlier this year to create an enhanced software platform for this exact use case.
The same idea applies to selling pre-built homes, condos, or commercial spaces. Selling an empty house or office is rather difficult. A great sales representative can try to paint a picture through words, but it doesn't compare to a real picture. Plus, buyers can't easily replicate the sales rep's charisma when gathering feedback from trusted individuals like friends, family, or board members.
This past spring, Unreal Engine demonstrated an interactive application for visualizing pre-built apartments using NVIDIA RTX™ technology. By accurately capturing how light behaves, including reflections, transparency, and soft shadows, this method shows home buyers the elements that matter most to them.
Industrial training simulations offer another valuable application of ray tracing. Preparing workers for dangerous or challenging work environments is an important, and therefore expensive, step in the onboarding process.
Companies spend thousands of dollars in travel costs to send employees and trainers to dedicated training sites. They also have to pay for accommodations while they’re there.
It's also time-consuming and inflexible. If these sessions happen only at specific points in the year, it's hard to get new hires up to speed efficiently without paying to ship one or two employees off for training on their own.
With ray tracing, employers can offer more realistic immersive, real-time 3D simulations of their products and environments. And with powerful publishing platforms like PureWeb, which support streaming of ray-traced renderings, these simulations can be delivered to any device, regardless of the hardware's GPU capabilities.
The U.S. military is currently exploring the use of Nvidia’s technology to create more realistic training simulations for activities like tactical vehicle training, air combat, and refueling. In this way, military personnel are as prepared as possible before deployment.
How to stream and interact with your real-time ray traced 3D applications
Once you’ve used ray tracing to create your projects, how do you unleash those projects?
You want to share your ray traced projects with everyone who will benefit from them, not just the individuals with the computing power to view them. You want:
- Any prospective customer to use your real-time ray tracing-enabled 3D configurator to build their dream vehicle
- Any potential home buyer or commercial real estate buyer to open a meeting room and let a sales rep conduct a walkthrough of their future property
- Any employee to receive training simulations to any device while trainers interact with them in real time from anywhere in the world
The best way to deliver on these use cases is through a fully managed interactive streaming service for 3D applications.
Our PureWeb interactive streaming platform uses proprietary technology to efficiently manage your cloud GPU usage while protecting the quality of your 3D renderings.
What’s more, security is built into the system. Source files remain secure at the server level while only renderings are transferred to the user device.
Plus, everything, including configuration and collaboration with AWS, is managed by the PureWeb team. You get to focus on creating immersive and engaging 3D experiences while we worry about the back end.
Learn more about how to unleash your real-time ray tracing-enabled 3D applications to any device. Book a demo with a member of the PureWeb team today.