Ray tracing is a rendering technique that simulates how light travels in the real world to produce highly realistic images in games, movies, and 3D applications. Instead of using shortcuts to approximate lighting, it traces the path of light rays as they bounce off surfaces, pass through glass, and cast shadows, calculating what each pixel on your screen should look like based on the physics of light.
How Ray Tracing Works
Traditional 3D graphics (called rasterization) work by taking objects in a scene and projecting them onto your screen, then applying color and shading after the fact. Ray tracing flips this approach entirely. It starts from your virtual camera, fires a ray of light through each pixel on the screen, and asks: what does this ray hit first? If it hits an object, the system calculates the color based on lighting, material, and surface angle. If it hits nothing, it fills in the background.
Each ray is defined by a simple equation: an origin point (the camera) and a direction (determined by the pixel it passes through). The system plugs that equation into the math describing every object in the scene and checks for intersections. The object with the closest hit point is what “wins” that pixel. From there, the renderer calculates how light interacts with the surface at that exact point, including the angle of incoming light, the surface normal, and the material’s properties.
This process repeats for every single pixel. A 1920×1080 image means over two million rays just for the first pass, and most scenes require many more rays per pixel to capture reflections, shadows, and indirect light accurately.
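The closest-hit logic described above can be sketched in a few lines. This is a minimal, hypothetical example (not any engine's actual code): a sphere-only scene, the camera at the origin, and ray directions assumed to be unit vectors, so the intersection test reduces to solving a quadratic for the ray parameter t.

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Nearest positive t where origin + t*direction hits the sphere, or None.
    Assumes direction is normalized, so the quadratic's leading coefficient is 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(v * d for v, d in zip(oc, direction))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def closest_hit(origin, direction, spheres):
    """The object with the smallest positive t 'wins' the pixel."""
    best = None
    for center, radius in spheres:
        t = ray_sphere_t(origin, direction, center, radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, (center, radius))
    return best
```

A real renderer would run `closest_hit` once per pixel (2,073,600 times for a single 1920×1080 pass) against every object type in the scene, not just spheres, which is why hardware acceleration of exactly this intersection test matters so much.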
What Makes It Look Better Than Traditional Graphics
The reason ray tracing produces noticeably better visuals is that it naturally handles lighting effects that rasterization has to fake with workarounds. These include:
- Reflections that accurately show the surrounding environment, including objects not directly on screen
- Soft shadows that blur realistically at the edges depending on light size and distance
- Refraction through transparent materials like water or glass, following Snell’s law of optics
- Global illumination, where light bouncing off one surface subtly colors nearby surfaces
- Ambient occlusion, where crevices and corners appear naturally darker because less light reaches them
Rasterized games approximate all of these effects with screen-space techniques, which only use information visible on your current screen. This leads to artifacts: reflections that disappear when you turn the camera, shadows with hard edges, and flat-looking indirect lighting. Ray tracing solves these by actually simulating the light’s behavior rather than guessing at it.
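Two of the effects listed above come down to short vector formulas that ray tracers evaluate at every hit point: a mirror reflection is d − 2(d·n)n, and refraction follows Snell's law. A hedged sketch in Python, assuming unit-length vectors and not tied to any particular renderer:

```python
import math

def reflect(d, n):
    """Mirror direction d about unit surface normal n: r = d - 2*(d.n)*n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

def refract(d, n, eta):
    """Bend unit direction d through a surface with unit normal n,
    where eta = n1/n2 (e.g. ~0.75 entering water from air).
    Returns None on total internal reflection."""
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    k = 1 - eta * eta * (1 - cos_i * cos_i)
    if k < 0:
        return None  # total internal reflection: no transmitted ray
    return tuple(eta * di + (eta * cos_i - math.sqrt(k)) * ni
                 for di, ni in zip(d, n))
```

Screen-space techniques cannot evaluate these formulas against off-screen geometry, which is exactly why rasterized reflections vanish when their source leaves the frame.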
Ray Tracing vs. Path Tracing
You’ll often see “path tracing” mentioned alongside ray tracing, especially in newer games. The distinction matters. Standard ray tracing in games typically traces rays from the camera to the first surface they hit and calculates direct lighting at that point. Path tracing takes it further by continuing to trace those rays as they bounce from surface to surface multiple times.
Think of a room with two mirrors facing each other. Standard ray tracing would give you a good-looking mirror, but the reflection of the second mirror might appear flat or gray. Path tracing lets the rays keep bouncing between both mirrors, creating the infinite reflection tunnel you’d see in real life. The same principle applies to indirect lighting: light hitting a red wall and casting a subtle red glow onto a nearby white floor requires multiple bounces to simulate correctly.
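The red-wall example can be made concrete. Along a path, each bounce multiplies the carried color (the "throughput") by the surface's albedo, so light that touched a red wall arrives at the camera tinted red. This is a deliberately toy sketch: a fixed list of albedos stands in for the actual ray intersections a path tracer would perform.

```python
def trace_path(surface_albedos, light_emission, max_bounces=4):
    """Follow one path through a given sequence of surfaces and return
    the color it carries back to the camera. Each bounce filters the
    light by that surface's RGB albedo."""
    throughput = (1.0, 1.0, 1.0)
    for albedo in surface_albedos[:max_bounces]:
        throughput = tuple(t * a for t, a in zip(throughput, albedo))
    # Light reaching the eye = emission filtered by every surface on the path.
    return tuple(t * e for t, e in zip(throughput, light_emission))
```

With a red wall (0.9, 0.1, 0.1) followed by a white floor (0.9, 0.9, 0.9) under white light, the returned color is dominated by red: the wall's tint survives the second bounce, which is the indirect-lighting effect single-bounce ray tracing misses.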
Path tracing is essentially the most complete form of ray tracing. Cyberpunk 2077 (in its “Overdrive” mode) and Portal with RTX use full path tracing, where all lighting in the scene is ray traced rather than just select effects like shadows or reflections.
The Performance Cost
Ray tracing is computationally expensive, and the performance hit in games is significant. A TechSpot investigation across 36 games found that enabling ray tracing causes an average 29% reduction in frame rate on an RTX 4090, one of the fastest consumer graphics cards available. At maximum ray tracing settings, the average hit climbs to 32%.
The range varies enormously depending on the game. Resident Evil 4 loses only about 2% of its frame rate with ray tracing enabled, while Hitman 3 at max settings drops by 65%. Full path tracing is the most demanding: Cyberpunk 2077’s Overdrive mode roughly halves performance even on top-end hardware. Spider-Man: Miles Morales at maximum settings similarly cuts frame rates in half.
This cost is the main reason ray tracing hasn’t fully replaced rasterization. Most games use a hybrid approach, rasterizing the base image and layering ray-traced effects on top for specific elements like reflections or shadows.
How AI Helps Close the Gap
One of the biggest challenges with real-time ray tracing is noise. When a game can only afford to trace a limited number of rays per pixel (to keep frame rates playable), the resulting image looks grainy, like a photo taken in low light. Denoising algorithms smooth this out, but traditional denoisers rely on hand-tuned rules that can introduce shimmering, ghosting, or blurry reflections.
AI-based tools tackle this from two angles. First, AI denoisers trained on massive datasets can reconstruct clean images from noisy ray-traced input more accurately than static algorithms. They capture relationships across multiple frames, reducing the artifacts that plague conventional denoisers. Second, AI upscaling renders the ray-traced image at a lower resolution (where the performance cost is smaller) and then intelligently reconstructs it to your display’s full resolution. NVIDIA’s DLSS is the most prominent example, using transformer-based neural networks for both denoising and upscaling in a single pipeline. AMD and Intel offer competing technologies with similar goals.
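The upscaling half of that pipeline is easy to quantify: rendering internally at 1920×1080 and reconstructing to a 3840×2160 display traces only a quarter of the primary rays. The arithmetic below is purely illustrative; the samples-per-pixel parameter is a stand-in, not a vendor figure.

```python
def primary_rays(width, height, samples_per_pixel=1):
    """Primary rays needed for one frame at a given resolution."""
    return width * height * samples_per_pixel

native = primary_rays(3840, 2160)    # rays at the 4K display resolution
internal = primary_rays(1920, 1080)  # rays at a lower internal resolution
savings = 1 - internal / native      # fraction of primary rays avoided
```

Three-quarters of the ray budget saved before the upscaler runs is why lower internal resolutions are the standard way to make ray-traced frame rates playable.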
The combination of fewer rays, AI denoising, and AI upscaling is what makes real-time ray tracing practical on current hardware. Without these tools, the frame rates would be far too low for smooth gameplay.
Where Ray Tracing Is Used
In film and visual effects, ray tracing has been standard for decades. Movie studios can afford to spend hours rendering a single frame, so the computational cost is less of a barrier. Pixar, Industrial Light & Magic, and other studios use path tracing to produce the photorealistic lighting in modern animated and effects-heavy films.
In gaming, ray tracing became a mainstream feature with NVIDIA’s RTX 20-series GPUs in 2018, which introduced dedicated hardware cores for ray-object intersection calculations. Since then, AMD and Intel have added ray tracing support to their GPUs as well. On the software side, two major graphics programming interfaces support it: Microsoft’s DirectX Raytracing (DXR) for Windows and Vulkan’s ray tracing extensions for cross-platform development. Console support arrived with the PlayStation 5 and Xbox Series X, both of which include ray tracing hardware.
Beyond entertainment, ray tracing is used in architectural visualization, product design, automotive prototyping, and scientific simulation: anywhere accurate light behavior matters for the final image.

