AMD's next-gen RDNA 3 GPUs will be more power hungry, but they expect Nvidia's to be worse

AMD believes that Nvidia will push next-gen GPU power consumption "a lot higher than we will"

In a recent interview with Tom's Hardware, AMD's Sam Naffziger, a Senior Vice President, Corporate Fellow, and Product Technology Architect, suggested that their next-generation RDNA 3 graphics cards will be more efficient than Nvidia's RTX 40 series products.

With RDNA 3, power efficiency is a major focus for AMD, who expect greater than 50% performance-per-watt improvements from the architecture. This efficiency comes from AMD's use of 5nm lithography, their re-architected compute units, an optimised graphics pipeline, and next-generation Infinity Cache.

Like Nvidia with their RTX 40 series, AMD appears to be planning to increase the power limits of their high-end RDNA 3 products. Naffziger said, "but even if our designs are more power-efficient, that doesn't mean you don't push power levels up if the competition is doing the same thing. It's just that they'll have to push them a lot higher than we will."

With greater power efficiency comes more performance per watt of power consumed. That alone increases AMD's GPU performance. Adding more power into the mix can increase GPU performance even further, giving AMD's RDNA 3 series GPUs the potential to deliver a significant performance leap over their current-generation RDNA 2 counterparts.
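
To put rough numbers on that reasoning, here is a minimal back-of-the-envelope sketch: the +50% figure is AMD's stated performance-per-watt target, while the 300W and 375W board powers are purely illustrative assumptions, not leaked specifications.

```python
# A back-of-the-envelope model: performance ~ (performance per watt) x (board power).
# The +50% efficiency gain is AMD's stated ">50% perf-per-watt" target for RDNA 3;
# the 300W baseline and 375W figures below are illustrative assumptions only.

def relative_performance(efficiency_gain: float, power_watts: float,
                         baseline_power_watts: float = 300.0) -> float:
    """Performance relative to a previous-generation card running at the baseline power."""
    return (1.0 + efficiency_gain) * (power_watts / baseline_power_watts)

print(relative_performance(0.50, 300.0))  # same 300W board power -> 1.5x performance
print(relative_performance(0.50, 375.0))  # push the power limit up too -> 1.875x performance
```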

"they'll have to push them a lot higher than we will"

I don't know about you, but I believe that this is the most important quote from Sam Naffziger's entire interview. It shows that AMD believes they can deliver better performance per watt than Nvidia's RTX 40 series, and that they think they can match or beat Nvidia's next-generation flagship while consuming less power.

If this is true, AMD's RDNA 3 series GPUs will be revolutionary for Radeon. Lower power consumption allows AMD's GPUs to use smaller heatsinks (and potentially run quieter) and less expensive power management systems/components than their Nvidia counterparts. This helps to increase AMD's margins and allows AMD to deliver more competitive pricing. If you can increase performance per watt, you can deliver higher performance while also lowering the material costs of many aspects of a graphics card. Increasing performance per watt is a win-win for AMD.

Assuming that AMD is correct, Nvidia's GPUs will need more complex power management systems and larger heatsinks, and may require users to upgrade their power supplies. If Nvidia need more power to match or exceed RDNA 3's performance, it will place Nvidia at a huge disadvantage.

If AMD's assumptions about RDNA 3 are correct, and they have the performance needed to go toe-to-toe with Nvidia in the high-end PC market, RDNA 3 could be a turning point for team Radeon. While both RDNA and RDNA 2 closed the gap between AMD and Nvidia, Nvidia remains the GPU market's top dog, and AMD remains behind on several technological fronts, with ray tracing performance and DLSS being huge advantages for Nvidia.

While AMD's FSR 2.0 technology is a strong counter to DLSS, it remains to be seen whether RDNA 3 has what it takes to bring AMD to the top of the PC market when it comes to ray tracing performance. If AMD has significantly increased their ray tracing performance with RDNA 3, their RX 7000 series GPUs should win them a lot of market share.

You can join the discussion on AMD's expectations for RDNA 3 on the OC3D Forums.


Most Recent Comments

27-06-2022, 11:23:21

Piskeante
This goes to show two important things:

1st: Nvidia and AMD built very efficient GPUs in the past, so miners saw these GPUs as really profitable in terms of mining calculations per watt. This took all those RTX 3060/Ti and 3070/Ti cards to the top of mining profitability, along with AMD's 6600/XT and even 6700/XT.

2nd: There is a point in the technology where you cannot increase performance while also reducing power consumption. Efficiency was pursued in the past to favour miners, but now that mining with GPUs is almost dead, there is no incentive for efficiency, just for raw performance, which is, in the end, what most gamers want.

This is why all the reports about power consumption point to less efficiency and more performance for both AMD and NVIDIA. They just don't care anymore about miners and have decided to push hardware to the limits at the expense of more power consumption.

27-06-2022, 16:38:51

meuvoy
Quote:
Originally Posted by Piskeante View Post
This goes to show two important things:
2nd: There is a point in the technology where you cannot increase performance while also reducing power consumption. Efficiency was pursued in the past to favour miners, but now that mining with GPUs is almost dead, there is no incentive for efficiency, just for raw performance, which is, in the end, what most gamers want.
Fancy ourselves conspiracy theorists, aren't we?

1st: Unless you're an Alaskan gamer, I'm pretty sure you'd rather your PC components not heat up your room above equatorial summer heat every time you want to play a game. Not to mention that, even with big heatsinks, localised heat still affects solder joints and causes premature cold-solder-related component deaths. Also, remember exploding 3090s? Yeah, it was a sight to behold, and expect it again with the RTX 40 series, but probably affecting more SKUs. Oh, and the new power supply, and that electricity bill?

2nd: This is not new, and it was not caused by mining... Mining with GPUs was not a big thing before the current generation; they couldn't just predict it would boom because they made efficient GPUs. They could hope so, but not predict... Efficient GPUs have been a thing for ages, and sometimes warrant exclusive products like the GTX 750 Ti, which NVIDIA literally launched a year before the architecture was officially released (it used a more modern architecture than any of the other GPUs in the 700 lineup). Why? Well...

If you do a quick Google search for the GTX 480, you'll see many complaints about heat and high power usage. It was just another moment where NVIDIA fell behind AMD and couldn't do anything aside from increasing power usage and bumping up everything they could. Look at the GTX 200 series: some of those GPUs had a 512-bit memory bus and still couldn't beat AMD's offerings. The GTX 400 series was NVIDIA's answer, launched early and incredibly power hungry for the standards of the time.

After mitigating their power issues with the GTX 500 series, and finally fixing them with the 600 series, NVIDIA said their top-of-the-line card would never exceed 250W... Well. See how that's working out for them?

Point is, they've been working on power efficiency since the GTX 400 fiasco (actually since before, but way harder after the GTX 400 series) because those cards died, and oh how they died; look on eBay and see how many GTX 480s you can find. Expect the same with the RTX 40 series, and also the 3090. This is not a stable solution, nor a long-term one. NVIDIA simply could not finalise their chiplet architecture in time to compete against AMD; they felt the pressure in the 30 series and had already raised the power draw to combat it. The 40 series will be worse because it's pretty much the same story, but do expect GPUs in the RTX 50 range to come with way lower than expected power draw, and probably by the RTX 60 series they will be remarkably efficient.

But yes, to continue pushing the envelope and developing ever more realistic games, we have already resorted to all sorts of software trickery, like temporal AA, GI, reflections, etc. And no, I don't expect to see an RTX 6090 with a 250W TDP. The way GPUs need to be designed to offer the performance we expect, including 4K and 144Hz+ refresh rates, all while increasing graphical fidelity to ever greater heights, is taking its toll and causing the power draw and cost of PC components to rise. That's a big part of why so many companies are investing in cloud gaming; it's getting to the point where buying hardware, including consoles, is ever more prohibitive, so if a big company takes on the upfront investment and we just pay for the right to use it, that's a solution. But do expect the unbelievable power draw of current and next-generation GPUs to slow down and drop quite a lot over the next few years.

So basically this is a cycle. NVIDIA would much rather keep pushing smaller but still meaningful improvements every year; it's less costly for them and for their image, and, of course, it avoids exploding and early-dying products. But AMD was losing the game and had to do something about it, and they disrupted the market so much that consoles basically caught up with PCs. NVIDIA couldn't have that: they didn't supply the chips in the consoles, so they would lose so many customers, and they had to do something about it. This is it, this is a desperate attempt to remain relevant, one that I hope will come to pass, and who knows, maybe it takes a few years and AMD comes out on top of NVIDIA for a while, but they'll eventually figure it out and we will ultimately benefit from their clash.
