Expect next-generation graphics cards to consume more power

Even if GPUs get more efficient, next-generation cards may still consume over 400 watts

Current-generation graphics cards already consume a lot of power. Nvidia's RTX 3090 carries an official power rating of 350W, AMD's reference RX 6900 XT models consume around 300W, and we expect GPU energy requirements to increase with each company's next-generation offerings. The technology behind graphics cards is changing rapidly, and the pursuit of higher performance will push energy use and graphics card cooling to their limits.

While power efficiency is a primary design target for AMD and Nvidia, their next-generation GPU offerings are rumoured to be more power-hungry than their predecessors. PC builders should find this prospect both exciting and terrifying: exciting because higher efficiency combined with increased power consumption will significantly boost performance, and terrifying because higher energy use means more heat and steeper cooling requirements.

400W is not enough? 

Nvidia's RTX 3090 Founders Edition already consumes 350 watts of power, and overclocked models of AMD's RX 6900 XT and Nvidia's RTX 3080, RTX 3080 Ti, and RTX 3090 can reach for even more. If AMD and Nvidia fail to deliver enormous efficiency gains with their next-generation graphics cards, maximum TDPs are set to rise again.

Competition between AMD and Nvidia is now at its tightest in more than half a decade, giving both vendors a reason to push performance as high as they can. When every benchmark result matters, there is little reason not to push TDPs as high as possible, especially if higher performance levels allow manufacturers to target more premium price tags.

In a recent Tweet, the leaker @kopite7kimi stated that "400 is not enough" when referring to Nvidia's next-generation RTX 40 series. We are of the same opinion. 

MCM GPUs - Multi-Chip GPUs are coming!

Rumour has it that both AMD and Nvidia plan to create MCM (Multi-Chip-Module) graphics cards with their future GPU architectures. AMD is reportedly pushing to deliver multi-chip designs with their RDNA 3 architecture, while reports are a little spottier on the Nvidia side.

In 2017, Nvidia released a paper detailing their MCM GPU plans, and we already know that AMD is heavily invested in the latest chip packaging technologies. Multi-chip graphics cards are a clear design target for both companies, making MCM GPUs a question of when, not if.

MCM chip designs allow manufacturers to cut costs and facilitate the creation of large chips that would otherwise suffer from low yields. Smaller chips allow manufacturers to waste less die area within each silicon wafer they produce, and smaller chips often have higher manufacturing yields. These factors can make multi-chip products more cost-effective to produce, and AMD's success with Ryzen and EPYC has already proven that MCM designs can succeed. 
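The yield argument above can be sketched with the classic Poisson defect model, where the chance of a die being defect-free falls exponentially with its area. The defect density and die sizes below are illustrative assumptions, not figures from AMD, Nvidia, or any foundry:

```python
import math

DEFECT_DENSITY = 0.1  # defects per cm^2 (assumed for illustration)

def die_yield(area_cm2: float) -> float:
    """Fraction of dies expected to be defect-free (Poisson model)."""
    return math.exp(-DEFECT_DENSITY * area_cm2)

big_die = 5.2    # one large 520 mm^2 monolithic GPU (assumed size)
small_die = 2.6  # two 260 mm^2 chiplets covering the same total area

print(f"Monolithic yield: {die_yield(big_die):.1%}")
print(f"Chiplet yield:    {die_yield(small_die):.1%} per die")
# Each small die is far more likely to be good, and a defective
# chiplet wastes half as much silicon as a defective monolith.
```

Under these assumed numbers, each half-size chiplet yields noticeably better than the monolithic die, which is the economic case for designs like Ryzen and EPYC.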

A hypothetical multi-die RDNA 3 GPU would combine the strengths of AMD's latest Radeon graphics architecture with the advantages of newer process technologies and the scaling potential of a multi-chip design. Should this design work as intended (presenting itself to games as a single, larger graphics card), a multi-chip RDNA 3 graphics card could offer a greater than 2x performance boost over today's RX 6900 XT.

If AMD could bring together two RX 6900 XT-class dies, add architectural enhancements, and layer on the benefits of newer process technologies, we have no trouble believing that a greater than 2x performance boost is possible. That said, we don't see AMD doing it within the RX 6900 XT's 300W power envelope.
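A back-of-the-envelope sketch shows why ">2x performance, but not at 300W" is plausible. Every number here (multi-die scaling efficiency, architectural uplift, perf-per-watt gain) is an assumption for illustration, not a leak or a specification:

```python
BASELINE_POWER_W = 300  # RX 6900 XT reference board power
DIES = 2                # two RX 6900 XT-class dies
MCM_SCALING = 0.9       # assume 90% efficiency splitting work across dies
ARCH_UPLIFT = 1.25      # assumed per-die architectural gain

# Performance relative to a single RX 6900 XT
relative_perf = DIES * MCM_SCALING * ARCH_UPLIFT

# Assumed overall perf-per-watt gain from a newer node and architecture
PPW_GAIN = 1.5
est_power = relative_perf / PPW_GAIN * BASELINE_POWER_W

print(f"Estimated performance: {relative_perf:.2f}x an RX 6900 XT")
print(f"Estimated board power: {est_power:.0f}W")
```

Even with a generous assumed 1.5x perf-per-watt gain, a 2.25x-performance part lands around 450W in this sketch: comfortably above the RX 6900 XT's 300W envelope.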

MCM GPUs will push power requirements to new highs in the consumer GPU market.  

Nvidia plans to move to Multi-Chip GPU modules to scale past Moore's Law

Nvidia already knows how to power 450W+ graphics cards - Why Nvidia's 12-pin power connector exists

With the launch of their RTX 30 series, Nvidia revealed their new 12-pin GPU power connector to the world. This power connector is designed to deliver more than 500 watts of power, signalling to the world that higher wattage graphics cards are coming. 

Most aftermarket RTX 3080 and RTX 3090 designs use traditional 8-pin PCIe power connectors, which are each rated to handle 150W. Nvidia's 12-pin power cable can replace more than three of these connectors, freeing up precious PCB space while making cable management significantly easier.
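The connector arithmetic above is easy to sanity-check. The 150W (8-pin PCIe) and 75W (PCIe slot) limits come from the PCIe specifications; the 500W 12-pin figure is the rating discussed in this article:

```python
import math

EIGHT_PIN_W = 150   # max per 8-pin PCIe power connector
SLOT_W = 75         # max delivered through the PCIe slot itself
TWELVE_PIN_W = 500  # Nvidia's 12-pin connector, per the article

# How many 8-pin cables would a hypothetical 450W card need,
# on top of what the slot provides?
card_power = 450
cables_needed = math.ceil((card_power - SLOT_W) / EIGHT_PIN_W)
print(f"{card_power}W card: {cables_needed} x 8-pin cables, or one 12-pin")

# One 12-pin connector carries more than three 8-pin connectors' worth:
print(f"12-pin / 8-pin ratio: {TWELVE_PIN_W / EIGHT_PIN_W:.2f}")
```

A hypothetical 450W card would need three 8-pin cables alongside slot power, while a single 12-pin connector covers the whole budget on its own.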

While many consider Nvidia's 12-pin GPU power connector design a flop, it has paved the way towards graphics cards with higher TDPs. Nvidia could release a 450W graphics card and power it with a single 12-pin cable without any issues. Higher wattage GPUs will eventually force PSU makers and GPU designers to adopt Nvidia's cable design; at some point, stacking ever more 8-pin PCIe power connectors on a single graphics card simply becomes impractical.

Do you need a higher wattage power supply? 

Ultimately, we cannot tell the future, but it looks like GPU TDPs will be going up over the next few generations, not down. The era of the 250W high-end graphics card is over. We have already hit 350W with Nvidia's RTX 3090 Founders Edition, and it doesn't look like TDPs will be getting lower anytime soon. 

While AMD and Nvidia are undoubtedly targeting performance-per-watt gains with their latest GPU architectures, it is also clear that all of these efficiency benefits will be used to push clock speeds higher and to power more GPU shaders and other components. 

The good news is that more power combined with greater efficiency will deliver consumers even larger gains in raw performance. When it comes to performance, things are looking good for the future of the high-end GPU market.

You can join the discussion on next-generation GPUs and our high TDP expectations on the OC3D Forums

Most Recent Comments

04-08-2021, 10:47:20

This doesn't bother me personally. Seeing as AMD's top-end MCM card will probably cost $1200 or more, I'm already priced out of a card like that. Same goes for Nvidia. If you have $1200-1500 to spare, you can also spare money for adequate cooling.

Nvidia's and AMD's lower-end cards should have TDPs comparable to today's, which suits me fine as that's the category I fit into. I was going to get a 6700, but I'll probably wait until RDNA3 at this point and get a similar tier card.

04-08-2021, 10:52:09

As much as I hate it, this just pushes me towards consoles or just not upgrading at all.

04-08-2021, 12:51:13

MCMs should ultimately be the factor needed to start bringing large die size designs back down in cost again to counteract the fact that costs haven't really been reducing with transistor shrinks like they used to. Maybe not early MCM designs where the losses will likely make them more practical for only 2-4 "tile" configurations with fairly large dies, but as the tech matures it will start to make sense for smaller dies too.

04-08-2021, 15:41:25

I just want a solid $300-400 card that is great value and easily attainable. I don't think that's too much to ask. I don't see why people continue to splurge on things like this. It only makes it harder to stay in PC gaming if everyone only buys the top-end stuff.

05-08-2021, 02:00:19

Well, while I did nearly have a heart attack winning my 6800XT, I'll most likely skip a few gens now. Sure, the newer cards will be beasts, but my card has plenty of legs left in it for a while yet.
