DirectX 12 Explicit Multi-GPU Performance Review with Ashes of the Singularity

One of the most exciting features of the DirectX 12 API is how it handles multiple GPUs. Explicit Multi-GPU makes multi-GPU setups much easier for games to utilize and, if implemented correctly, can even allow GPUs from different manufacturers to be used together in a single system.

Ashes of the Singularity is the first game to implement this functionality in DirectX 12, allowing any two DirectX 12 GPUs to be used together in an AFR (Alternate Frame Rendering) configuration. This means that AMD and Nvidia GPUs can work together, as can two GPUs from different product series.

In this review we will be looking at DirectX 12 Explicit Multi-GPU in Ashes of the Singularity, focusing on the performance changes when using non-matching GPUs together in DirectX 12. We have also examined the performance impact of DirectX 12 in a separate article, looking into the effect of features like asynchronous compute.

What is Ashes of the Singularity?

Ashes of the Singularity is an all-new RTS (Real-Time Strategy) game set in a post-technological-singularity universe in which humans have begun to colonize the stars.

In this game, humanity faces a new foe that threatens to annihilate them completely: the Substrate, a race of machines that seeks to control the same fertile, habitable worlds that humanity is trying to claim for itself.

Drivers 

For this game we will be using the newest drivers that were available when the game released: Nvidia's Game Ready GeForce 361.91 driver and AMD's 16.1.1 Hotfix driver, both of which became available to the public in the past month.

Test Setup  

We will be testing this game on our dedicated GPU test rig using the current flagship GPUs from both AMD and Nvidia. Both GPUs are reference designs and will be run at stock settings.

AMD R9 Fury X & Nvidia GTX 980Ti
Intel Core i7 6700K @ 4.7GHz
ASUS Maximus VIII Hero
G.Skill Ripjaws 4x4GB DDR4 3200MHz
Corsair HX1200i
Corsair H110i GT
Windows 10 x64 


Nvidia GTX 980Ti (Left), AMD R9 Fury X (Right)

 

To represent AMD and Nvidia's mid-range GPU offerings we have decided to use the AMD R9 380 and the Nvidia GTX 960. Both of these GPUs will be the ASUS Strix models.

Both of these GPUs offer very similar performance in most scenarios and come in at very similar price points, so it will be very interesting to see which GPU will come out on top.

 


Nvidia GTX 960 (Left), AMD R9 380 (Right)


Most Recent Comments

24-02-2016, 13:18:03

Tripp
Wow didn't think this would be allowed to work, I wonder how long until nvidia stop this

24-02-2016, 14:00:50

ImprovizoR
Quote:
Originally Posted by Tripp View Post
Wow didn't think this would be allowed to work, I wonder how long until nvidia stop this
I don't think they can. It's a Dx12 feature.

24-02-2016, 14:06:36

Tripp
Quote:
Originally Posted by ImprovizoR View Post
I don't think they can. It's a Dx12 feature.
never say never

24-02-2016, 14:10:07

WYP
The benchmark will go live to the public tomorrow, it would be a big deal if Nvidia removed a feature.

All Nvidia would be doing by disabling the option would be to harm consumers, though they will try to spin it in a positive way for them.

if they remove the feature myself and plenty of other tech writers will flame them for it, as removing a feature is bad for consumers.

24-02-2016, 14:12:23

Tripp
Quote:
Originally Posted by WYP View Post
The benchmark will go live to the public tomorrow, it would be a big deal if Nvidia removed a feature.

All Nvidia would be doing by disabling the option would be to harm consumers, though they will try to spin it in a positive way for them.

if they remove the feature myself and plenty of other tech writers will flame them for it, as removing a feature is bad for consumers.
they have done it before though right? and Nvidia do generally get away with a fair share of dodgy sh*t
