I Love Egypt!

Tuesday, January 15, 2019

What is NVIDIA DLSS, and How Will It Make Ray-Tracing Faster?

At NVIDIA’s CES 2019 presentation, the company showed off a new technology called DLSS. In demonstrations, it all but eliminates the performance hit taken in games that enable fancy new ray-tracing graphics on RTX cards. But how does it work?

What Is DLSS?

DLSS stands for “deep learning super-sampling.” There are two parts to this idea, but let’s focus on the second one first: super-sampling.

Super-sampling is something you can do on your machine right now in a lot of games. It renders the game at a resolution higher than your monitor can display, then scales the image back down to fit the screen. That sounds wasteful, but averaging several rendered pixels into each displayed pixel helps smooth out the harsh, jagged edges in polygonal graphics. NVIDIA and AMD cards already support this technique, as do some PC games all on their own.
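To make the idea concrete, here is a minimal sketch of the downscaling step in super-sampling: a scene rendered at twice the target resolution is reduced by averaging each 2x2 block of pixels into one output pixel. The `supersample` function and the toy diagonal-edge "render" are illustrative assumptions, not NVIDIA's actual implementation.

```python
import numpy as np

def supersample(render, factor=2):
    """Downsample a high-resolution render by averaging factor x factor
    pixel blocks -- the core idea behind super-sampling anti-aliasing."""
    h, w = render.shape[0] // factor, render.shape[1] // factor
    cropped = render[:h * factor, :w * factor]
    return cropped.reshape(h, factor, w, factor).mean(axis=(1, 3))

# A hard diagonal edge "rendered" at 2x the target resolution:
hires = np.tril(np.ones((8, 8)))      # lower triangle is lit, upper is dark
lores = supersample(hires, factor=2)  # 4x4 output for the display

print(lores)
```

Pixels along the diagonal come out as intermediate values (0.75 here) instead of a hard 0-or-1 step, which is exactly the edge-softening effect described above.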

Now, on to the “deep learning” part. Deep learning is something of a nebulous term: it basically means tons and tons of computations run on high-powered hardware in a process that improves over time. Some applications call this “artificial intelligence” (AI), but that’s a misnomer: the system isn’t “learning” in any human sense; it’s just getting better at a repetitive process.

NVIDIA’s DLSS system runs super-sampling on one specific game, over and over again, on the graphics cards in its massive data centers. It computes the best ways to apply the super-sampling technique to a game with repetitive processing on that game’s visuals—the polygons and textures that make up what you see on your screen. The “deep learning” part of the process comes into play here; the system learns as much as it possibly can about the way that the game looks, and how to make it look better.

Combine super-sampling for smoother polygon lines and textures with deep learning for applying general improvements to a game, and you get DLSS. Picture-improving techniques, already calculated at NVIDIA’s data centers, are applied on the fly via the Tensor Cores on the RTX card.




from How-To Geek http://bit.ly/2RMkIq6
via IFTTT
