NVIDIA, DirectX 12, and Asynchronous Compute: Don’t Panic Yet

Ashes of the Singularity: The game to bring NVIDIA cards to their knees?

Monday was a terrifying day to browse the web as the owner of an NVIDIA graphics card. News hit early this week that the company’s latest Maxwell GPUs, the GTX 900 series, may have a design flaw that compromises performance relative to AMD graphics cards when performing asynchronous compute in DirectX 12.

In short: A few weeks ago, Oxide Games released a benchmark demo of an upcoming game called Ashes of the Singularity, the first public benchmark for DirectX 12, the latest version of Microsoft’s popular graphics API. Many Ashes benchmark reviews found that while NVIDIA graphics cards ran the game quite well under DirectX 11, AMD cards showed an enormous performance jump when moving to DX 12. NVIDIA cards, on the other hand, showed no improvement with DX 12, and in some cases actually performed slightly worse than they did under DX 11.

The Ashes benchmark resulted in a great deal of debate and speculation online over the past few weeks. An early and widely repeated claim was that NVIDIA’s current generation of GTX cards does not support asynchronous compute, while AMD’s current line of graphics cards does.

While DX 11 did not allow for asynchronous compute/shading, DX 12 does. With that capability “unlocked” under DX 12, AMD cards can enjoy significant performance boosts, while NVIDIA cards may struggle when asked to do the same thing.
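At the API level, the difference is easy to see. DX 12 lets developers create separate command queues for graphics and compute work; whether the GPU actually overlaps them is up to the hardware and driver. Here is a minimal C++ sketch of that setup (the CreateQueues helper is hypothetical, an already-created ID3D12Device is assumed, and error handling is omitted):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: create one graphics queue and one compute queue.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The "direct" queue accepts graphics, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A dedicated compute queue. Work submitted here *may* run concurrently
    // with the graphics queue, but only if the GPU and driver actually
    // schedule the two queues in parallel, which is the crux of this debate.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```

DX 11 offered no equivalent way to feed an independent stream of compute work to the GPU alongside rendering, which is why this whole question only arises with DX 12.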

Reddit user SilverforceG wrote up a nice overview of the controversy on r/pcgaming, and even included a simple “explain it like I’m 5” summary.

In theory, GPUs that support asynchronous shading (AMD) should see significant performance gains in DX 12 when dealing with dynamic lighting, shadows and global illumination in games. GPUs that do not support asynchronous shading would not.

The news has prompted many new 900-series owners to lament their purchases and, in some cases, contact retailers to ask about a refund.

But is this really a death sentence for NVIDIA cards? Should you toss out your brand new GTX 980 Ti and replace it with an R9 290 from a garage sale? Maybe not quite yet.


[Edit: This section has changed a lot since the original article, thanks to some helpful comments below pointing out my initial misunderstanding of the tool.]

A user in the Beyond3D forum created a small tool to test the latency of different cards while performing graphics and compute operations. You can see results from the tool here: http://nubleh.github.io/async/scatter.html#6

[Graph: Fury X results]

Above is a graph of Fury X results. The blue line shows the time taken for the pure graphics workload alone; the red line shows the time for the pure compute workload alone; the green line shows the total time when both workloads are submitted together. Because the green line is lower than the sum of the red and blue lines, asynchronous compute appears to be working: some of the compute work is being done at the same time as the graphics work.

[Graph: GTX 980 Ti results]

The 980 Ti graph, however, shows something different: the green line *is* the sum of the red and blue lines. For some reason, the 980 Ti isn’t doing the compute work at the same time as the graphics work, so asynchronous compute doesn’t appear to be working as advertised on NVIDIA cards. (The “steps” in the graph don’t matter for this question; what matters is whether green = red + blue.)
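To make the interpretation concrete, here is a small sketch of the check being applied to those graphs. The numbers below are made up purely for illustration (the real tool derives its timings from actual GPU measurements); the logic is simply whether the combined run beats the serial sum:

```cpp
#include <cstdio>

// Illustrative timings in milliseconds. These numbers are invented for the
// example; the real tool measures them on the GPU.
struct Timings {
    double graphicsOnly; // "blue line":  graphics workload alone
    double computeOnly;  // "red line":   compute workload alone
    double combined;     // "green line": both workloads submitted together
};

// If the combined time is meaningfully lower than the serial sum, some of the
// compute work overlapped the graphics work, i.e. asynchronous compute worked.
bool AsyncComputeAppearsToWork(const Timings& t, double tolerance = 0.05)
{
    double serialSum = t.graphicsOnly + t.computeOnly;
    return t.combined < serialSum * (1.0 - tolerance);
}

int main()
{
    Timings furyXLike    = { 10.0, 6.0, 11.5 }; // combined < 16.0: overlap
    Timings gtx980TiLike = { 10.0, 6.0, 16.0 }; // combined == sum: no overlap
    std::printf("Fury X-like:  async %s\n",
                AsyncComputeAppearsToWork(furyXLike) ? "working" : "not working");
    std::printf("980 Ti-like:  async %s\n",
                AsyncComputeAppearsToWork(gtx980TiLike) ? "working" : "not working");
    return 0;
}
```

On the Fury X-style numbers the combined time beats the serial sum, indicating overlap; on the 980 Ti-style numbers it doesn’t.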


What does this mean for current NVIDIA and AMD cards?

AMD graphics cards have an advantage in at least one DX 12 game, Ashes of the Singularity, and they may enjoy similar advantages in other DX 12 games. Completely writing off NVIDIA, however, is just silly.

NVIDIA GPUs will continue to perform well in DX 12. Ashes is one game that makes heavy use of asynchronous shading, but we’ve yet to see any other real-world DX 12 benchmarks. We don’t know how well other upcoming games will use asynchronous shading, if they use it at all.

AMD’s Mantle API, the foundation for the upcoming Vulkan, already offers some of the features coming in DX 12, and has supported asynchronous shading for some time. While a couple of benchmarks show enormous performance gains using Mantle over DX 11 (60%+ in some extreme cases), most real-world benefits on balanced gaming PCs are more in the 5-10% range. Very nice, but not earth-shattering.

This controversy reminds us of a similar sky-is-falling event experienced by AMD owners a few years ago: when NVIDIA acquired PhysX and made GPU-accelerated physics exclusive to its own cards, gamers went nuts, and AMD owners felt like they had drawn the short straw in the GPU wars. In retrospect, of course, we know that AMD owners never really suffered much by missing out on PhysX. And how often do you see PhysX hyped today?

We are not here to root for NVIDIA over AMD, or vice-versa. In fact, it would be nice to see AMD catch up on their lagging GPU sales, as we don’t want to see either company achieve a monopoly in the graphics card space. We just want to deal with the facts, not the hype.

In the long run, developers will no doubt make more use of asynchronous shading; they already do on consoles. But it will take years for that to make its way into PC games in any big way, because developers need to learn the new features and will still want to support older PC hardware.

Our PC hardware recommendations will continue to be based on what works well now, and what will likely work well in the future. That includes considerations related to real-world gaming performance, acoustics, thermals, reliability and build quality. Sacrificing that viewpoint based on speculation about what may or may not happen in the future would be irresponsible.