Nvidia showcases huge performance gains in The Lord of the Rings: Gollum with DLSS
This is DLSS at its best
Published: 25th May 2023 | Source: Nvidia |
The Lord of the Rings: Gollum is due to release later today with support for Nvidia's DLSS 3 technology and ray tracing in the form of ray-traced shadows and ray-traced reflections.
With support for DLSS 3, The Lord of the Rings: Gollum includes Nvidia's DLSS Super Resolution, DLSS Frame Generation, and Reflex technologies, enabling higher-framerate, lower-latency gameplay on compatible GeForce graphics cards. In The Lord of the Rings: Gollum, Nvidia has claimed performance gains of up to 3.9x with its RTX 4070 Ti graphics card when playing the game at 4K max settings with DLSS enabled and set to performance mode.
The Performance Magic of DLSS 3
By enabling DLSS 3, users of Nvidia's RTX 40 series graphics cards have two avenues to huge performance gains in The Lord of the Rings: Gollum. DLSS Super Resolution uses the AI hardware of RTX GPUs to upscale lower-resolution images to higher resolutions (1080p to 4K when DLSS is set to performance mode at 4K), boosting performance by reducing the number of traditionally rendered pixels. On top of that, DLSS Frame Generation uses a separate AI model to generate intermediate frames between traditionally rendered frames, effectively doubling a game's framerate in ideal scenarios.
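The rendering savings from performance-mode upscaling come down to simple pixel arithmetic. The sketch below illustrates this using the standard 4K and 1080p resolutions; it is an illustration of the pixel-count maths only, not of Nvidia's actual upscaling implementation.

```python
# Rough pixel-count arithmetic behind DLSS performance mode at 4K.
# The resolutions are standard 4K/1080p figures; the resulting "4x"
# ratio is simple arithmetic, not a measured performance number.

def pixels(width: int, height: int) -> int:
    """Total pixels in a frame at the given resolution."""
    return width * height

native_4k = pixels(3840, 2160)  # output resolution (4K)
internal = pixels(1920, 1080)   # internal render resolution in performance mode

ratio = native_4k / internal
print(f"Performance mode at 4K renders {ratio:.0f}x fewer pixels internally")
```

This is why DLSS Super Resolution alone can deliver large framerate gains before Frame Generation is even considered: the GPU traditionally renders only a quarter of the output pixels.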
With DLSS 3 (DLSS Super Resolution plus DLSS Frame Generation), Nvidia can deliver incredible performance gains to PC gamers in The Lord of the Rings: Gollum, as can be seen in the video below.
Below is the data that Nvidia has provided to us to showcase The Lord of the Rings: Gollum's performance gains when using DLSS Super Resolution (in performance mode) and DLSS Frame Generation. With this combination of technologies, Nvidia upscales a 1080p render to 4K using AI temporal upscaling and then generates intermediate AI frames to deliver further framerate gains.
In The Lord of the Rings: Gollum™, GeForce RTX 40 Series gamers can multiply performance with DLSS 3. At 4K, performance on the GeForce RTX 4090 increases by 3.5X, enabling 169 frames per second with every setting maxed out, and all ray tracing options enabled. The GeForce RTX 4080 can hit 123 FPS, a 3.8X increase; the GeForce RTX 4070 Ti sees a 3.9X speedup, for 100 FPS gameplay; the GeForce RTX 4070 receives a 3.8X boost, giving it a 79 FPS result in our benchmark.
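Working backwards from Nvidia's quoted figures gives a sense of the native (DLSS-off) framerates implied by those multipliers. The sketch below uses only the numbers from the press text above; the native values are derived by simple division and are not figures Nvidia published directly.

```python
# Derive the implied native (DLSS-off) framerates from Nvidia's quoted
# DLSS 3 numbers. DLSS FPS and multipliers come from the article text;
# the native values are inferred, not published benchmarks.

quoted = {
    "RTX 4090": (169, 3.5),
    "RTX 4080": (123, 3.8),
    "RTX 4070 Ti": (100, 3.9),
    "RTX 4070": (79, 3.8),
}

for gpu, (dlss_fps, multiplier) in quoted.items():
    native = dlss_fps / multiplier
    print(f"{gpu}: ~{native:.0f} FPS native -> {dlss_fps} FPS with DLSS 3")
```

Note that the implied native figures sit in the roughly 20-50 FPS range at maxed 4K settings with ray tracing, which is consistent with the reader comments below about the game's heavy native performance demands.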
The Lord of the Rings: Gollum showcases Nvidia's DLSS technology at its best, at least in performance terms. Greater-than-3x performance gains are an impressive feat, and they highlight why DLSS has become such an attractive technology for both gamers and game developers.
You can join the discussion on Nvidia's DLSS delivering huge performance gains in The Lord of the Rings Gollum on the OC3D Forums.
Most Recent Comments
Watched a video with a bit of gameplay and the graphics are mediocre at best, yet to get anything over 40 FPS at 4K you need a 4090. Granted, that's before DLSS 2 and 3, but graphically it looks pants yet needs a massive amount of horsepower.
I'm beginning to sound like a conspiracy theorist, but I wouldn't be surprised if Nvidia was paying devs and publishers not to optimize their games in order to make DLSS look more appealing.
Why do any optimisation at all when you can just get Nvidia to run it through their AI machine? Then anyone on a non-Nvidia card can just suffer and make them look better.
Since physical media died, storage space became your problem. Since GPU makers started putting more VRAM on their cards than some people have RAM, having the resources to run a game (if you're on PC) became your problem.
Gaming is no longer being overseen by gamers and artists; it's being run by accountants and lawyers, whose only concern is generating market cap. Therefore those that can't afford to keep up, never mind, get a mobile game with a billion in-app purchases and leave the real gaming to those that can afford it.
I'm beginning to sound like a conspiracy theorist, but I wouldn't be surprised if Nvidia was paying devs and publishers not to optimize their games in order to make DLSS look more appealing.
You can't have it both ways. Either games get to look better, or you stay on the older games and keep the performance. The fact is, Nvidia and AMD have not been pushing very hard at all on raster performance. I repeat, neither of them is stupid. The days of buying, for example, a 1080 Ti and it lasting you years are over. Why do you think Nvidia wanted to do ray tracing so badly? Because they knew it would be over a decade before any of their GPUs could run stuff like that properly. It was reinventing the wheel, and putting us back years and years to create more sales for them.
These games are not poorly optimised. They are made using the tools devs have, and nothing more. Let's take the recent "The Last of Us" debacle, for example. They lovingly recreate a game, make it look absolutely incredible, then the little p1ss ants all whine that their 3060 doesn't have enough VRAM and they can't run it at max settings. So what do they do to "optimise and fix" it? They compress all of the textures, which makes it look like anus, and nowhere near as good as it looked before.
Don't blame people for pushing the envelope with games. Blame the real culprits. Nvidia could have made the 4090 way bigger and far more powerful; they just didn't want you to have it. Same with everything lower down the stack, too. They are completely lame for the technology they are on. All of them should have been bigger and badder, with more VRAM and much bigger muscles. Yet Nvidia is just too greedy and squeaky to give you any of them.