
How Intel could use AI to tackle a massive issue in PC gaming


Intel is making a big push into the future of graphics. The company is presenting seven new research papers at Siggraph 2023, an annual graphics conference, one of which tries to address VRAM limitations in modern GPUs with neural rendering.

The paper aims to make real-time path tracing possible with neural rendering. No, Intel isn’t introducing a DLSS 3 rival, but it is looking to leverage AI to render complex scenes. Intel says the “limited amount of onboard memory [on GPUs] can limit practical rendering of complex scenes.” Intel is introducing a neural level of detail representation of objects, and it says it can achieve compression rates of 70% to 95% compared to “classic source representations, while also improving quality over previous work.”
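To put those compression rates in perspective, a 70% to 95% reduction means the neural representation needs only 5% to 30% of the original memory. A quick back-of-the-envelope sketch (the 2GB scene size here is a made-up example, not a figure from Intel's paper):

```python
def compressed_size(original_mb, compression_rate):
    """Size remaining after compression at the quoted rate (0.95 = 95%)."""
    return original_mb * (1.0 - compression_rate)

# At Intel's quoted 70%-95% rates, a hypothetical 2GB (2,048MB) of
# complex scene geometry would shrink to roughly 100MB-600MB.
low_end = compressed_size(2048, 0.70)    # ~614MB left at 70% compression
high_end = compressed_size(2048, 0.95)   # ~102MB left at 95% compression
```

That difference is what would let a scene that currently overflows a mid-range card's VRAM fit comfortably.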

Ellie holds a gun in The Last of Us Part I.

The approach sounds similar to Nvidia’s Neural Texture Compression, which was also introduced through a paper submitted to Siggraph. Intel’s paper, however, tackles complex 3D objects, such as vegetation and hair. It’s applied as a level of detail (LoD) technique, allowing objects to look more realistic from farther away. As we’ve seen recently in games like Redfall, VRAM limitations can leave even nearby objects with muddy textures and little detail as you pass them.


In addition to this technique, Intel is introducing a more efficient path-tracing algorithm that it says will, in the future, make complex path tracing possible on mid-range GPUs and even integrated graphics.

Path tracing is essentially the hard way of doing ray tracing, and we’ve already seen it used to great effect in games like Cyberpunk 2077 and Portal RTX. As impressive as path tracing is, though, it’s extremely demanding. You’d need a flagship GPU like the RTX 4080 or RTX 4090 to even run these games at higher resolutions, and that’s with Nvidia’s tricky DLSS Frame Generation enabled.
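The brute-force nature of path tracing is what makes it so expensive: each pixel’s color is estimated by averaging many randomly bounced light paths. This toy Monte Carlo sketch (the scene, albedo, and light values are invented stand-ins, not any real renderer’s code) shows why cost balloons with samples and bounces:

```python
import random

MAX_BOUNCES = 4
SAMPLES_PER_PIXEL = 64

def trace_path(depth=0):
    """Follow one random light path and return its contribution.

    Stand-in scene: each bounce hits a gray surface (albedo 0.5), and
    a ray has a 25% chance of escaping to a light of intensity 1.0.
    """
    if depth >= MAX_BOUNCES:
        return 0.0
    if random.random() < 0.25:          # ray escapes and hits the light
        return 1.0
    return 0.5 * trace_path(depth + 1)  # bounce: attenuate and recurse

def render_pixel():
    """Average many random paths -- the Monte Carlo estimate."""
    total = sum(trace_path() for _ in range(SAMPLES_PER_PIXEL))
    return total / SAMPLES_PER_PIXEL
```

A real renderer does this for millions of pixels per frame, with far more expensive intersection and shading math at every bounce, which is why algorithmic shortcuts like Intel’s matter so much.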

Intel’s paper is introducing a way to make that process more efficient. It’s doing so by introducing a new algorithm that is “simpler than the state-of-the-art and leads to faster performance,” according to Intel. The company is building upon the GGX mathematical function, which Intel says is “used in every CGI movie and video game.” The algorithm reduces this mathematical distribution to a hemispherical mirror that is “extremely simple to simulate on a computer.”

Screenshot of full ray tracing in Cyberpunk 2077. Nvidia

The idea behind GGX is that surfaces are made up of microfacets that reflect and transmit light in different directions. This is expensive to calculate, so Intel’s algorithm essentially reduces the GGX distribution to a simple-to-calculate slope based on the angle of the camera, making real-time rendering possible.
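For context, the standard GGX (Trowbridge-Reitz) normal distribution function that Intel’s algorithm builds on is itself compact; the expense comes from evaluating and sampling it across countless microfacet interactions. A minimal sketch of the standard GGX distribution D(h), to show what is being simplified (this is the well-known formula, not Intel’s new hemispherical-mirror reduction):

```python
import math

def ggx_ndf(cos_theta_h, alpha):
    """GGX/Trowbridge-Reitz normal distribution function D(h).

    cos_theta_h: cosine of the angle between the surface normal
                 and the half-vector h.
    alpha: surface roughness (near 0 = mirror-smooth, 1 = very rough).
    """
    a2 = alpha * alpha
    denom = cos_theta_h * cos_theta_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A rougher surface spreads reflections out: the distribution's peak
# at the normal direction (cos_theta_h = 1.0) is lower at higher alpha.
smooth_peak = ggx_ndf(1.0, 0.1)  # tight, mirror-like highlight
rough_peak = ggx_ndf(1.0, 0.8)   # broad, matte-looking highlight
```

Intel’s contribution, per the paper, is replacing this distribution with a geometry that is far cheaper to simulate, rather than changing what GGX models.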

Based on Intel’s internal benchmarks, the algorithm leads to upwards of a 7.5% speedup in rendering path-traced scenes. That may seem like a minor bump, but Intel seems confident that more efficient algorithms could make all the difference. In a blog post, the company says it will demonstrate at Siggraph how real-time path tracing can be “practical even on mid-range and integrated GPUs in the future.”

As for when that future arrives, it’s tough to say. Keep in mind this is a research paper right now, so it might be some time before we see this algorithm widely deployed in games. It would certainly do Intel some favors. Although the company’s Arc graphics cards have become excellent over the past several months, Intel is still focused on mid-range GPUs and integrated graphics, where path tracing isn’t currently possible.

We don’t expect you’ll see these techniques in action any time soon, though. The good news is that we’re seeing new techniques to push visual quality and performance in real-time rendering, which means these techniques should, eventually, show up in games.

Jacob Roach
Former Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…