Nvidia's reportedly considering two different specifications for its RTX 4070

Nvidia's RTX 4070 is expected to deliver RTX 3090-level performance (possibly better)


Nvidia hasn't finalised its RTX 4070 specifications, and the company appears to be considering two options

Nvidia's reportedly considering two different sets of hardware specifications for its RTX 4070 graphics card, with one potential design offering users 12GB of GDDR6X memory and 7680 CUDA (FP32) cores and the other offering users 7168 CUDA cores and 10GB of GDDR6X memory. Both designs appear to utilise 21Gbps GDDR6X memory, and these GPU models are expected to consume 285W and 250W of power respectively.

The higher-end of these two SKUs reportedly carries the PG141-SKU340/341 name, offering users 20% more VRAM capacity and memory bandwidth than the slower PG141-SKU336/337 design. This higher-end design could also be transformed into a future RTX 4070 Ti model, should Nvidia decide to launch its RTX 4070 using the lower-end specification.
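
For anyone who wants to sanity-check that 20% figure, here's a quick back-of-the-envelope sketch. The 192-bit and 160-bit memory bus widths below are our assumptions, inferred from the rumoured 12GB and 10GB GDDR6X capacities rather than confirmed by Nvidia.

```python
# Back-of-the-envelope bandwidth check for the two rumoured RTX 4070 memory configs.
# The 192-bit and 160-bit bus widths are assumptions inferred from the 12GB and
# 10GB GDDR6X capacities (six or five 32-bit channels with 2GB modules) and are
# not confirmed by Nvidia.

def gddr6x_bandwidth(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate x bus width, divided by 8 bits/byte."""
    return data_rate_gbps * bus_width_bits / 8

higher_end = gddr6x_bandwidth(21, 192)  # 12GB design -> 504.0 GB/s
lower_end = gddr6x_bandwidth(21, 160)   # 10GB design -> 420.0 GB/s

print(f"PG141-SKU340/341: {higher_end:.0f} GB/s")
print(f"PG141-SKU336/337: {lower_end:.0f} GB/s")
print(f"Bandwidth advantage: {higher_end / lower_end - 1:.0%}")  # -> 20%
```

Since both designs reportedly use the same 21Gbps memory, the assumed bus widths alone produce the 20% bandwidth gap, which is why it matches the 20% capacity gap exactly.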

Of the two RTX 4070 designs Nvidia is reportedly considering, the higher-end model achieves 3DMark Time Spy Extreme scores of nearly 11,000, while the lower-tier model scores close to 10,000 while consuming less power. These results give both potential RTX 4070 configurations performance that is similar to, or greater than, an RTX 3090's.
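
Taking those leaked scores and the rumoured power figures at face value, a quick points-per-watt comparison suggests the lower-tier design would be marginally more power-efficient; all four inputs are unconfirmed rumours, so treat this as a rough sketch.

```python
# Rough efficiency comparison using the leaked Time Spy Extreme scores and the
# rumoured 285W/250W power figures. All four numbers are unconfirmed rumours.

designs = {
    "PG141-SKU340/341 (12GB)": (11_000, 285),
    "PG141-SKU336/337 (10GB)": (10_000, 250),
}

for name, (score, watts) in designs.items():
    print(f"{name}: {score / watts:.1f} Time Spy Extreme points per watt")
# -> roughly 38.6 vs 40.0 points/W, so the 10GB design is slightly more efficient
```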


Nvidia's planning to start discussing its next-generation graphics cards at its GTC (GPU Technology Conference) keynote next month. That said, Nvidia is reportedly pushing back its RTX 40 series launch to give the company time to deal with its oversupply of RTX 30 series graphics cards. Some reports have claimed that Nvidia's RTX 40 series will launch in Q4 2022, starting with high-end SKUs.

You can join the discussion on Nvidia's rumoured RTX 4070 specifications and performance levels on the OC3D Forums.


Most Recent Comments

29-08-2022, 16:14:16

Warchild
Is this their way of making the cut down version at the previous full price, and then the higher end for a premium when in reality the high end is nothing more than the standard version anyway?

29-08-2022, 18:21:04

AlienALX
Quote:
Originally Posted by Warchild View Post
Is this their way of making the cut down version at the previous full price, and then the higher end for a premium when in reality the high end is nothing more than the standard version anyway?
Nah, in other words they are waiting to see what AMD will come out with.

AMD turn up? 12gb. AMD don't? ration you to 10gb.

Won't be a huge issue. Yet. Once the consoles start to reach the 8gb or whatever it is they have and the PC pushes way past that it will be a problem.

29-08-2022, 19:08:37

Dicehunter
Quote:
Originally Posted by AlienALX View Post
Nah, in other words they are waiting to see what AMD will come out with.

AMD turn up? 12gb. AMD don't? ration you to 10gb.

Won't be a huge issue. Yet. Once the consoles start to reach the 8gb or whatever it is they have and the PC pushes way past that it will be a problem.

IMO anything in the xx70 range and up should not come with anything less than 16GB of memory. If the 4070 is on the level of a 3080-3090 then it will be 4K capable, and games at 4K are starting to eat VRAM up pretty quickly.

29-08-2022, 20:06:52

AlienALX
Quote:
Originally Posted by Dicehunter View Post
IMO anything in the xx70 range and up should not come with anything less than 16GB of memory. If the 4070 is on the level of a 3080-3090 then it will be 4K capable, and games at 4K are starting to eat VRAM up pretty quickly.
Yeah, but that isn't how Nvidia work, dude. They don't want to give you a card that will last too long without getting you to buy the higher end card.

3070 was less than half of the price of a 2080Ti. And arguably as good (I argue it isn't, but there is no denying it was indeed much cheaper, if you could ever get one from Nvidia without paying out the anus for a plastic fantastic card from a board partner).

It is pretty obvious where Nvidia made the cuts.

I know plenty of people would want to argue with me over this, but ask yourself this. We went from a 1070 and 1080 with 8gb. To a 2070 and 2080 with 8gb. To a 3070 with 8gb.

Now the first ones? maybe that was a touch of overkill. At the time however they needed a way to convince you to spend what both of those cut down mid tier cards cost. Because as good as they were there was no denying *what* they were. The 1080 and 1070 were both cut down small mid tier dies, soon surpassed by the 1080Ti.

So OK, was 8gb enough on the 20 series? no. By then issues were starting to occur. If you look here at what happens to the 2080 VS the 1080Ti? you can see the issue.

https://i.imgur.com/wo4AW6X.jpg

Look at how the 1080Ti gets pasted by the 2080, right up until you get to 4k. The cause? the 2080 doesn't have enough VRAM for Ultra Nightmare settings at 4k. The thing is? if it were a stupid test I could say that yeah, why are you bothering to run 4k on that game with a card that isn't capable? The thing is both cards are clearly capable of running that game at 4k Ultra Nightmare with more than acceptable framerates, only the 2080's lead has fallen off a cliff because it is now taking textures from the DRAM and paging file on your HDD.

When I posted that the other day the guy accused me of lying and said it must have been another reason. Thing is? I watched that whole video, and Doom Eternal shows you how much VRAM the card is going to use before you even try it. The only reason they didn't lower the settings? well that would have been cheating.

So, it will come as no surprise that I am still very firmly in the "8gb is not enough" category (because the 4070 will smash 4k gaming with all current games TBH), but at the same time I know what Nvidia are like, spoon-feeding you crumbs, so I won't hold my breath.

It also seems I massively got my wires crossed on Navi 3, and it could well be AMD's Maxwell moment. Not only that, but they have found a way to reduce the die sizes massively with the same density, so when it comes to cost Navi 3 should be pretty cheap in comparison.

And will RT performance really matter by then? no, no I don't think it will.