DLSS 2.0 in Death Stranding - Nvidia 4K Performance Trump Card
DLSS 2.0 Image Quality Compared
Published: 15th July 2020 | Source: OC3D Internal Testing
Like many graphical effects, DLSS' impact varies wildly on a scene-by-scene basis. As with other DLSS 2.0-enabled titles, Death Stranding offers DLSS in two flavours: "Quality Mode" and "Performance Mode".
DLSS works by upscaling lower-resolution frames to a higher pixel count, adding detail along the way to reduce aliasing and produce something that looks similar to a native-resolution rendering. Note that these upscaled images will never exactly match a native-resolution image, as Nvidia's AI infers details that aren't present in the lower-resolution frame it upscales.
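To put rough numbers on the two modes discussed below, here is a minimal sketch of the internal render resolutions involved. It assumes Nvidia's commonly cited DLSS 2.0 scale factors of roughly two-thirds per axis for Quality mode and one-half per axis for Performance mode; exact internal resolutions can vary per title, so treat this as illustrative rather than definitive.

```python
# Illustrative only: approximate internal render resolutions for DLSS 2.0
# presets, assuming the commonly cited scale factors (Quality ~2/3 per axis,
# Performance ~1/2 per axis). Actual behaviour may differ per game.

def dlss_internal_resolution(output_w: int, output_h: int, mode: str):
    """Return the approximate internal render resolution for a DLSS mode."""
    scale = {"quality": 2 / 3, "performance": 1 / 2}[mode]
    return round(output_w * scale), round(output_h * scale)

# For a 4K (3840x2160) output target:
for mode in ("quality", "performance"):
    w, h = dlss_internal_resolution(3840, 2160, mode)
    saving = 1 - (w * h) / (3840 * 2160)
    print(f"{mode:>11}: renders ~{w}x{h}, ~{saving:.0%} fewer pixels shaded")
```

Under these assumed factors, Quality mode shades roughly 56% fewer pixels than native 4K and Performance mode roughly 75% fewer, which is where the framerate headroom in the benchmarks comes from.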
In the image below, we can see that DLSS' quality mode looks very similar to a native resolution rendering. The screenshots below are 800x450 sections of a full 4K image. Here, we can see that some details are sharper/clearer when DLSS is enabled, such as the grass/moss outlines. That said, both images are almost identical for the most part.
With DLSS enabling higher performance levels and practically identical levels of graphical quality here, it's clear that DLSS 2.0 is a win for Team Nvidia.
When DLSS is set to Performance Mode, framerates are boosted further by sacrificing rendering quality. Despite this, Death Stranding still presents a remarkably clear pseudo-4K image. While some areas of the image look worse when DLSS is set to performance mode, other areas look better, making this performance/quality tradeoff more than worthwhile if you need the framerate boost.
In Death Stranding, DLSS 2.0's quality impact is at its greatest during cutscenes, where fine details such as hair and other intricate game assets can be seen up close.
With DLSS 2.0, Nvidia's AI algorithm is able to do a better job at anti-aliasing than the game's default TAA solution and presents a sharper, fuller image. Looking at the hair and eyelashes of the character below, we can see that there is no TAA ghosting and that each strand of hair is seamless and free of granular artefacts.
Despite using a lower resolution image as a baseline, DLSS 2.0 is able to deliver a significant boost in image quality within Death Stranding. With DLSS 2.0, Death Stranding both looks better and runs faster than the game's native Anti-Aliasing solution.
Setting DLSS to Performance Mode sacrifices some of the clarity seen above, but even with a lower resolution reference frame, DLSS 2.0 is still able to deliver a better looking 4K presentation than what Death Stranding can offer by default. After considering the performance upgrade that DLSS provides, enabling this technique should be seen as a no-brainer for RTX GPU users, especially if you can achieve 60+ FPS framerates with DLSS set to "Quality Mode".
Most Recent Comments
I didn't think much of it, until I saw that chick's face. Her eyebrows look so much better, just so much more detail there.
I'm glad this is becoming a used thing now.
TBH, while I love DLSS 2.0, I really want to see a 3rd party alternative that can work on both AMD and Nvidia cards. That's surely coming, as console makers will want this kind of upsampling, and I want to see this kind of thing widely adopted on both the software and hardware side.
Yeah, in many scenes both look near-identical, but when TAA artefacts come into play, DLSS 2.0 wins hard. I didn't believe how good DLSS looked initially.
Consoles would kill for this feature, but we need to see what AMD will do. They don't have hundreds of thousands of Tesla GPUs in the basement doing all the AI training, nor do they have the years of AI development that Nvidia has.
The console market is absurdly large, so there may well be an AMD alternative. Whether it will be this good remains to be seen.
It will be hard to make 3rd party software that does this on both AMD and Nvidia cards; it will be either-or. This is a hardware-specific feature: Tensor cores do all the AI work locally on the GPU die, AMD cards won't have those on their dies, and I don't think stream processors can do all that maths efficiently.