Expect next-generation graphics cards to consume more power

Current-generation graphics cards already consume a lot of power. Nvidia’s RTX 3090 carries an official power consumption rating of 350W, and AMD’s reference RX 6900 XT models consume around 300W, and we expect GPU energy requirements to increase with each company’s next-generation offerings. The technology behind graphics cards is changing rapidly, and the pursuit of higher performance will push energy use and graphics card cooling to their limits. 

While power efficiency is a primary design target for AMD and Nvidia, their next-generation GPU offerings are rumoured to be more power-hungry than their predecessors. For PC builders, this prospect is both exciting and terrifying: exciting because higher efficiency combined with increased power consumption will deliver significant performance gains, and terrifying because higher energy use means greater heat output and steeper cooling requirements. 

400W is not enough? 

Nvidia’s RTX 3090 Founders Edition already consumes 350 watts of power, and overclocked models of AMD’s RX 6900 XT and Nvidia’s RTX 3080, RTX 3080 Ti and RTX 3090 can draw even more. Unless AMD and Nvidia deliver extraordinary efficiency gains with their next-generation graphics cards, maximum TDPs are set to rise again.

Competition between AMD and Nvidia is now at its tightest in more than half a decade, giving both vendors a reason to push performance as high as they can. When every benchmark result matters, there is little reason not to push TDPs as high as possible, especially if higher performance allows manufacturers to target more premium price tags. 

In a recent tweet, the leaker @kopite7kimi stated that “400 is not enough” when referring to Nvidia’s next-generation RTX 40 series. We are of the same opinion. 

MCM GPUs – Multi-Chip GPUs are coming!

Rumour has it that both AMD and Nvidia plan to create MCM (Multi-Chip-Module) graphics cards with their future GPU architecture. AMD is reportedly pushing to deliver multi-chip designs with their RDNA 3 architecture, while reports are a little spottier on the Nvidia side. 

In 2017, Nvidia released a paper detailing their MCM GPU plans, and we already know that AMD is heavily invested in the latest chip packaging technologies. Multi-chip graphics cards are a clear design target for both companies, making MCM GPU a question of when, not if. 

MCM chip designs allow manufacturers to cut costs and facilitate the creation of large chips that would otherwise suffer from low yields. Smaller chips allow manufacturers to waste less die area within each silicon wafer they produce, and smaller chips often have higher manufacturing yields. These factors can make multi-chip products more cost-effective to produce, and AMD’s success with Ryzen and EPYC has already proven that MCM designs can succeed. 
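The yield argument can be illustrated with a first-order Poisson defect-density model, a textbook approximation using made-up figures rather than any real foundry data:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """First-order Poisson yield model: probability that a die
    lands on the wafer with zero defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Illustrative figures only: compare one 600 mm^2 monolithic die
# with a 300 mm^2 chiplet at an assumed defect density of
# 0.001 defects per mm^2.
d = 0.001
print(f"600 mm^2 monolithic yield: {poisson_yield(600, d):.2f}")  # ~0.55
print(f"300 mm^2 chiplet yield:    {poisson_yield(300, d):.2f}")  # ~0.74
```

Under this model, a defect scraps only a 300 mm² chiplet rather than a whole 600 mm² die, so a wafer of chiplets delivers noticeably more sellable silicon, even before binning is taken into account.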

A hypothetical multi-die RDNA 3 GPU would combine the improvements of AMD’s latest Radeon graphics architecture, the advantages of the latest process technologies, and the scaling potential of a multi-chip design. Should this design work as intended (presenting itself to games as a single, larger graphics card), a multi-chip RDNA 3 graphics card could offer a greater than 2x performance boost over today’s RX 6900 XT. 

If AMD could bring together two dies of the RX 6900 XT’s calibre, add architectural enhancements, and layer on the benefits of newer process technologies, we have no trouble believing that a greater than 2x performance boost is possible. That said, we don’t see AMD doing it within the RX 6900 XT’s 300W power envelope. 

MCM GPUs will push power requirements to new highs in the consumer GPU market.  

Nvidia plans to move to Multi-Chip GPU modules to scale past Moore's Law

Nvidia already knows how to power 450W graphics cards – Why Nvidia’s 12-pin power connector exists

With the launch of their RTX 30 series, Nvidia revealed their new 12-pin GPU power connector to the world. This connector is designed to deliver more than 500 watts of power, signalling that higher wattage graphics cards are coming. 

Most aftermarket RTX 3080 and RTX 3090 designs use traditional 8-pin PCIe power connectors, which are rated to handle 150W of power. Nvidia’s 12-pin power cable can replace more than three of these cables, freeing up precious PCB space while making cable management significantly easier. 

While many consider Nvidia’s 12-pin GPU power connector design a flop, it has paved the way towards graphics cards with higher TDPs. Nvidia could release a 450W graphics card and power it with a single 12-pin cable without any issues. Higher wattage GPUs will eventually force PSU makers and GPU designers to adopt Nvidia’s cable design. At some point, using more and more 8-pin PCIe power cables on a single graphics card will become too much. 
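To put those connector ratings into numbers, here is a back-of-the-envelope sketch of how many 8-pin cables a card of a given TDP needs. The 150W-per-8-pin and 500W 12-pin figures are the ones discussed above, and we assume the standard 75W a PCIe x16 slot can supply; real cards budget power per rail, so treat this as a rough check only:

```python
import math

PCIE_SLOT_W = 75    # power a PCIe x16 slot can supply to the card
EIGHT_PIN_W = 150   # rated power per 8-pin PCIe power connector
TWELVE_PIN_W = 500  # Nvidia's 12-pin connector, per the rating above

def eight_pin_connectors_needed(tdp_w: float) -> int:
    """How many 8-pin connectors a card of the given TDP needs,
    after subtracting the power the slot itself provides."""
    remaining = max(0, tdp_w - PCIE_SLOT_W)
    return math.ceil(remaining / EIGHT_PIN_W)

for tdp in (350, 450, 575):
    print(f"{tdp}W card: {eight_pin_connectors_needed(tdp)}x 8-pin, "
          f"or {'1' if tdp - PCIE_SLOT_W <= TWELVE_PIN_W else '2'}x 12-pin")
```

By this arithmetic, a 450W card needs three 8-pin cables but only a single 12-pin connector, which is exactly the point of Nvidia’s design.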

Do you need a higher wattage power supply? 

Ultimately, we cannot tell the future, but it looks like GPU TDPs will be going up over the next few generations, not down. The era of the 250W high-end graphics card is over. We have already hit 350W with Nvidia’s RTX 3090 Founders Edition, and it doesn’t look like TDPs will be getting lower anytime soon. 
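For readers wondering how to size a PSU around these TDPs, a common rule of thumb is to sum component power ratings and add generous headroom for transient spikes. The 40% headroom and 75W “rest of system” figures below are our own assumptions for illustration, not an official sizing formula:

```python
import math

def recommended_psu_wattage(gpu_tdp_w: float, cpu_tdp_w: float,
                            other_w: float = 75, headroom: float = 0.4) -> int:
    """Rule-of-thumb PSU sizing: component TDPs plus headroom for
    transient spikes, rounded up to the nearest 50W. The default
    other_w and headroom values are illustrative assumptions."""
    total = gpu_tdp_w + cpu_tdp_w + other_w
    return math.ceil(total * (1 + headroom) / 50) * 50

# A 350W RTX 3090-class GPU paired with a hypothetical 125W CPU.
print(recommended_psu_wattage(350, 125))  # 800
```

If next-generation flagships land at 450W or more, the same arithmetic quickly points towards 1000W-class power supplies for high-end builds.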

While AMD and Nvidia are undoubtedly targeting performance-per-watt gains with their latest GPU architectures, it is also clear that all of these efficiency benefits will be used to push clock speeds higher and to power more GPU shaders and other components. 

The good news is that more power combined with greater efficiency will deliver consumers even larger gains in raw performance. When it comes to performance, things are looking good for the future of the high-end GPU market. 

You can join the discussion on next-generation GPUs and our high TDP expectations on the OC3D Forums. 
