XCOM 2 PC Performance Review - AMD VS Nvidia

Introduction  

We failed. The battle was lost, and now the aliens control the world that we spent hundreds of hours of gameplay trying to protect. This is XCOM 2, and this time we are the aggressors.

When it comes to strategy titles like XCOM, Civilization and Total War, PC performance is something reviewers often overlook, especially in turn-based games like XCOM, where the framerate does not have as large an impact on the experience as it would in a real-time game. That does not mean PC gamers should be left without clear performance data on this popular title.

With this game already receiving a large number of high review scores and a great deal of positive reception, we do not need to dwell on the gameplay at this stage. We had a great time playing the game over the weekend and very much look forward to spending many more hours with it.

As the title of this article makes clear, we are here to talk about performance and to let you know how well this game runs on today's hardware. Be warned, though: XCOM 2 will require a GPU that is out of this world to play at max settings with a high framerate.

 

 

Drivers 

For this game we will be using the newest drivers that were available when the game released, namely Nvidia's GeForce Game Ready 361.82 Hotfix driver and AMD's 16.1.1 Hotfix driver, both of which became available to the public in the past week.

Test Setup  

We will be testing this game on our dedicated GPU test rig using the current flagship GPUs from both AMD and Nvidia. Both cards are reference designs and will be run at stock settings.

 

AMD R9 Fury X & Nvidia GTX 980Ti
Intel Core i7 6700K @ 4.7GHz
ASUS Maximus VIII Hero
G.Skill Ripjaws 4x4GB DDR4 3200MHz
Corsair HX1200i
Corsair H110i GT
Windows 10 x64 
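For readers who want to reproduce numbers like ours, the headline figures in performance reviews are typically derived from a log of per-frame render times. A minimal sketch of that arithmetic (using hypothetical frame-time data, not our actual capture logs):

```python
# Illustrative sketch: deriving average FPS and "1% low" FPS from a
# list of per-frame render times in milliseconds. The sample data
# below is made up for demonstration, not measured from XCOM 2.

def fps_metrics(frame_times_ms):
    """Return (average FPS, 1% low FPS) for a frame-time log."""
    # Average FPS = total frames / total seconds elapsed.
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    # "1% low" = average FPS over the slowest 1% of frames, a common
    # way reviews quantify stutter that the average hides.
    worst_first = sorted(frame_times_ms, reverse=True)
    slice_len = max(1, len(worst_first) // 100)
    slowest = worst_first[:slice_len]
    low_fps = 1000.0 * len(slowest) / sum(slowest)

    return avg_fps, low_fps

# Example: 99 smooth 16.7 ms frames (~60 FPS) plus one 50 ms stutter.
times = [16.7] * 99 + [50.0]
avg, low = fps_metrics(times)
```

A single 50 ms hitch barely moves the average but drags the 1% low down to 20 FPS, which is why both figures are worth reporting.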


Nvidia GTX 980Ti (Left), AMD R9 Fury X (Right)

 

To represent AMD and Nvidia's mid-range GPU offerings, we have decided to use the AMD R9 380 and the Nvidia GTX 960. Both of these GPUs are ASUS Strix models.

Both of these GPUs offer very similar performance in most scenarios and come in at very similar price points, so it will be very interesting to see which one comes out on top.

 


Nvidia GTX 960 (Left), AMD R9 380 (Right)


Most Recent Comments

08-02-2016, 11:53:00

Kaapstad
Performance on a TitanX maxed @2160p is much better than a GTX 980 Ti.

Below I think is the reason why.

Check out the memory usage.

http://i.imgur.com/jfRWrTt.jpg

08-02-2016, 11:54:28

SPS
What avg FPS on TitanX Kaap?

08-02-2016, 11:57:13

Kaapstad
Quote:
Originally Posted by SPS View Post
What avg FPS on TitanX Kaap?
About 17fps maxed @2160p

08-02-2016, 12:00:16

AlienALX
These new games really are complete VRAM pigs. I kinda knew this would happen once the consoles had loads of available VRAM but nowhere near as bad as this.

With BLOPS 3 the newest update removes the extra settings from my PC (Fury x 4gb). If I hack it and enable extra settings it either black screens on load or very rarely does load up. However, within a few seconds it turns into a slide show.

And I was having the same issue with ROTTR when setting everything as high as it would go. It was fine for a couple of minutes, then it turned into a slide show and on a couple of occasions it actually stopped and ground to a halt and took about two minutes before it continued on. Now that I have lowered a few of the settings to high and left the main detail setting on very high it flies along lovely.

The problem of course is that these new consoles have up to 6gb vram available to them for 1080p . So instead of the devs optimising their textures and so on they are simply making them so that they use up that entire 6gb of texture memory at 1080p.

All of a sudden Titan X sounds far less stupid than it did when it launched.

I don't think it will be long before the 6gb the 980ti has becomes a minimum requirement for max settings, and could even be quite soon before that 6gb becomes out of date.

So much for AMD and their "It doesn't matter because it's HBM and it doesn't work like GDDR". Yeah, right.

Saying that though I do not blame AMD. As I said above fat bloatware is what causes these issues. I've got a whole ton of games that look fantastic at 4k and don't use anywhere near the same vram as these newest games.
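The texture-memory arithmetic behind the comment above is easy to sketch. A full mip chain adds roughly one third on top of a texture's base level, which is why uncompressed texture sets can fill 6GB of VRAM surprisingly fast. The numbers below are illustrative only, not measured from XCOM 2 or any other title:

```python
# Rough, illustrative estimate of VRAM cost for an uncompressed,
# mipmapped RGBA texture. Real games use compressed formats (BCn etc.),
# so actual budgets stretch further than this worst case.

def texture_vram_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    size = width * height * bytes_per_texel
    if mipmapped:
        # Each mip level is 1/4 the size of the one above it, so the
        # geometric series converges to ~4/3 of the base level.
        size = size * 4 // 3
    return size

# One uncompressed 4096x4096 RGBA texture with a full mip chain (~85 MB):
one = texture_vram_bytes(4096, 4096)

# How many such textures fit in 6 GB of VRAM, ignoring framebuffers,
# geometry and everything else that also lives there:
count = (6 * 1024**3) // one
```

Only a few dozen uncompressed 4K textures exhaust a 6GB budget, which illustrates the commenter's point about texture-heavy games outgrowing 4GB cards.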

08-02-2016, 12:01:02

SPS
Quote:
Originally Posted by Kaapstad View Post
About 17fps maxed @2160p
That mem usage is certainly interesting.