Ashes of the Singularity DirectX 12 vs DirectX 11 Benchmark Performance


Today we’ll be looking at one of the very first DirectX 12 game benchmarks, using Stardock’s real-time strategy game Ashes of the Singularity. Ashes of the Singularity was developed on Oxide’s Nitrous game engine and tells the story of an existential war waged on an unprecedented scale across the galaxy. The Ashes of the Singularity Benchmark was never designed as a synthetic stress test, but as a real-world test that was used internally to measure overall system performance. That internal developer tool was released to the public today as a DX12 benchmark!

Ashes of the Singularity

Here is what the developers had to say about the Ashes of the Singularity benchmark:

The benchmark is representative of real world gameplay. That means that while the strategic AI is scripted, everything else is being simulated just like it would be in game, such as tactical AI, physics, pathfinding, targeting, visibility, etc. To that extent, there will be more variability in the results than a synthetic benchmark would have. What we have noted, however, is that performance has been consistent and several runs can be used to generate an accurate representation of performance. Even though we have been working with Microsoft on DirectX 12 for a while now, DirectX 12 and Windows 10 have only recently been released to the public. As such, we expect that the DirectX 12 results will only improve from here on out. We, along with AMD, Microsoft, and NVIDIA, will continue to make improvements. With that in mind, this version should be treated as more of a performance preview to celebrate our excitement for Windows 10 and DirectX 12, as opposed to a hardcore synthetic test.

Ashes of the Singularity Benchmark Technical Features:

  1. Multi-Threaded Performance: DirectX 11 was largely limited to single-threaded rendering, with only modest multithreading support. Under DirectX 12, Ashes of the Singularity can record rendering work on many CPU cores at once, allowing a significantly better balance of workload between CPU cores and huge gains in complex workloads (see the command-list sketch after this list).
  2. (Coming Soon) Explicit Multi-GPU Support: DirectX 12 adds the ability to use multiple GPUs, giving the developer explicit control over how each GPU contributes to the frame. Without this ability, an integrated GPU would remain idle, an untapped resource that could otherwise be used to speed up rendering.
  3. Asynchronous Shaders: These allow compute work to be scheduled on the GPU alongside rendering work. Traditionally, the GPU’s single command queue would stall between tasks; DirectX 12 can fill those gaps with compute work, essentially getting more work done for free (see the compute-queue sketch after this list).
  4. Explicit Frame Management: DirectX 12 will see a reduction in latency, which will provide a more responsive game experience. It also allows tracking of specific per-frame information, such as whether performance is GPU or CPU bound.
  5. Updated Memory Management Design: A radical change in the memory management design allows developers to remove traditional performance issues such as micro-stutters that have plagued D3D11.
  6. DirectX 12 has taken great strides to prevent “screen tearing” by defaulting to having images rendered at the screen’s refresh rate (which would result in frame rate limiting).  However, for benchmark purposes, we have taken steps to bypass this to give a more accurate representation of potential performance.
  7. MSAA is implemented differently on DirectX 12 than DirectX 11. Because it is so new, it has not been optimized yet by us or by the graphics vendors. During benchmarking, we recommend disabling MSAA until we (Oxide/NVIDIA/AMD/Microsoft) have had more time to assess best use cases.
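
To make item 1 concrete, here is a minimal C++ sketch of the DirectX 12 multi-threaded recording pattern: each worker thread records into its own command allocator/command list pair, and the main thread submits everything in one call. This is illustrative only, not Oxide’s actual engine code; error handling is mostly omitted, the lists record nothing, and it assumes a D3D12-capable Windows 10 machine.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter (error handling kept minimal).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC qd = {};
    qd.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queue));

    // One allocator + command list per worker thread: recording is
    // free-threaded as long as each pair is owned by a single thread.
    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel. A real engine would issue draw calls here;
    // we just close the lists to keep the sketch short.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&, i] { lists[i]->Close(); });
    for (auto& t : workers) t.join();

    // Submit all recorded lists at once from the main thread.
    ID3D12CommandList* raw[kThreads];
    for (int i = 0; i < kThreads; ++i) raw[i] = lists[i].Get();
    queue->ExecuteCommandLists(kThreads, raw);
    return 0;
}
```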

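Item 3, asynchronous shaders, boils down to creating a second command queue of type COMPUTE so compute work can overlap the graphics queue, with a fence expressing the one true dependency. Again a hedged sketch under the same assumptions, not the game’s actual scheduling code:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The usual direct (graphics) queue.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // A separate compute queue: work submitted here can run while the
    // graphics queue is busy, filling the stalls the article describes.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // A fence synchronizes only where a real dependency exists, instead
    // of serializing the two queues entirely.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1); // compute marks its work finished
    gfxQueue->Wait(fence.Get(), 1);       // GPU-side wait, no CPU stall
    return 0;
}
```
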
Ashes of the Singularity Benchmark

When you launch the Ashes of the Singularity Benchmark you are greeted by a screen that clearly notes this is a pre-Beta build that “will probably kill your pets, wreck your computer.”

Ashes of the Singularity Game Settings

We disabled MSAA, since there are known issues with it enabled, as well as VSync. We left the Temporal AA Quality and Terrain Shading Samples on ‘Mid’ and the other image quality settings at their default value of ‘High’. We used an EVGA GeForce GTX 960 SSC 4GB video card for testing and set the screen resolution to 1920 x 1080. This card was selected because, at $229.99 shipped, it is what we would consider a popular mainstream video card right now.

Let’s check out the results on the next page!

  • Michael Holm

    Yes, AMD is now on top, and they have stable drivers now too.

  • bodaman34

    No AMD test?? This post is useless without AMD results. Some tests have seen 1600% gains for AMD CPU draw calls in 3DMark. Yes, I said CPU. This test definitively demonstrated the new API gave an AMD FX 8350 slightly better gaming performance than the infamous Intel gaming king, the i5 4690K, using a variety of GPUs from both brands. Think about that for a minute. A 2012 FX CPU beating out a 2014 i5 CPU! They said AMD needed a miracle and this might be it.

    With this low-level API driver lead, and AMD releasing HBM (High Bandwidth Memory), the GDDR5 replacement, in its new GPUs, another groundbreaking step in PC gaming is taken. Combined, DX12 and HBM clearly indicate PC gaming is about to make a hyper leap, and AMD will be the banner bearer in this next chapter of competitive evolution.

    In the end the consumer wins on every level: significantly better performance with current hardware, huge price drops, and even bigger gains with new hardware.

  • arkon

    Hold on: what the heck is this “benchmark”? An ad for the GTX 960? You talked only with NVIDIA, why not with AMD or Intel?

    Why not a GTX 980 or Radeon Fury? What we see here is not good, guys.
    Shame on you!!!

    • perfectlyreasonabletoo

      AMD makes like 20 Fury GPUs a day, maybe they aren’t one of the lucky ones to receive one.

  • Jon Thomas

    I paid $50 for this game to try DX12, only to find out it’s not even released yet, only the DX11 version … Crap

  • cris levin

    The article is completely biased. You completely forgot AMD cards, which improved performance by almost 100%; the 290X is on par with the 980 Ti.

    • spp85

      If AMD were present, we could see Radeon demolishing a crippled NVIDIA in this game.

      Legit Reviews is not legit.

    • Rickard Åman

      HAHAHHAHAHAHAHAh ok

  • agentbb007

    When are the DX12 games going to come out? I can’t see any games besides Fable Legends scheduled for a 2015 release date.

  • DoctorT

    You should have also tested AMD GPUs.
    Other sites have seen increases of up to 80% on AMD after switching to DX12.


  • Tralalak Aviatik

    How many desktop gamers do you know that are running a 2.0GHz quad-core processor with clock speeds that low?

    Life is change.

  • Jason Evangelho

    You guys didn’t use Nvidia’s 355.66 driver? That could explain a few things.

    • Nathan Kirsch

      Jason, good seeing you! We did all of our testing before finding out about those drivers on August 15th. I contacted our Tech PR contacts and they said this about 355.60 to 355.66: “No major perf difference between 355.60 and 355.66.”

      • Jason Evangelho

        Oh man I love finding out about new drivers at zero hour! Ok, good to know there’s no major performance difference. Thanks Nathan!

        • Nathan Kirsch

          Yeah, we had all of our testing done before the NVIDIA reviewer’s guide went out. Always that fun moment when you see a newer driver floating around after testing is done, but that’s nothing new for us, right?

        • Jason Evangelho

          Didn’t that happen with the Fury X launch, too? They released the new Radeon driver like 24 hours before the embargo lifted and I was like “Welp. Sorry guys.”

  • Intel999

    As predicted by many, DX12 makes the high-dollar CPU unnecessary in future gaming builds once DX12 becomes the mainstream gaming API.

    Seems demand for AMD CPUs could increase in the not-so-distant future.

    • Nathan Kirsch

      That certainly appears to be the case. It also puts the game developer more in control of overall gameplay performance from the looks of how things are working out so far.

    • vision33r

      Which means an Intel Pentium G3258 at $58 is a better buy than any AMD CPU still using that hot 100-watt+ outdated silicon.

      • Ryuhoshi

        Nah, some games need 4 cores. An Athlon X4 860K is a better idea, or even an i3.

  • Pazuzu Hanbi

    Hello,
    Is there any way to test the game for free?