Ashes of the Singularity: Escalation (DX12)

A veteran of both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games having designed the Nitrous Engine around such low-level APIs. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support, Ashes provides a good common ground between the forward-looking APIs of today. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; in offering such a tool publicly, as part-and-parcel of the game, Oxide sets an example that other developers should take note of.
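
As a rough illustration of the multi-threaded work submission that Ashes leans on, the sketch below records D3D12 command lists on several worker threads and hands them to the GPU in a single ExecuteCommandLists call. This is the generic DX12 pattern, not Oxide's actual engine code; device setup is minimal, and error handling, synchronization, and the actual draw recording are omitted.

    // Minimal DX12 multi-threaded command recording sketch (MSVC; links d3d12.lib).
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Default adapter, minimum feature level; real code checks every HRESULT.
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

        D3D12_COMMAND_QUEUE_DESC queueDesc = {};
        queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

        constexpr int kThreads = 4;
        ComPtr<ID3D12CommandAllocator> alloc[kThreads];
        ComPtr<ID3D12GraphicsCommandList> list[kThreads];
        for (int i = 0; i < kThreads; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&alloc[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      alloc[i].Get(), nullptr, IID_PPV_ARGS(&list[i]));
        }

        // Unlike DX11's single immediate context, each thread records its own
        // command list in parallel, with no driver-side global lock.
        std::vector<std::thread> workers;
        for (int i = 0; i < kThreads; ++i) {
            workers.emplace_back([&, i] {
                // ... record this thread's share of the frame's draws/batches ...
                list[i]->Close();
            });
        }
        for (auto& t : workers) t.join();

        // A single submission hands all of the recorded work to the GPU queue.
        ID3D12CommandList* raw[kThreads];
        for (int i = 0; i < kThreads; ++i) raw[i] = list[i].Get();
        queue->ExecuteCommandLists(kThreads, raw);
    }

The high batch counts Ashes is known for are essentially this pattern scaled up: many small command lists recorded concurrently across CPU cores.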

Settings and methodology remain identical to those of the 2016 GPU suite. Note that we are utilizing the original Ashes Extreme graphical preset, which corresponds to the current Extreme preset with MSAA dialed down from 4x to 2x, and with Texture Rank (MipsToRemove in settings.ini) adjusted.

[Charts: Ashes of the Singularity: Escalation - Extreme Quality at 3840x2160, 2560x1440, and 1920x1080]

Somewhat surprisingly, the RTX 2060 (6GB) performs poorly in Ashes, landing closer to the GTX 1070 than the GTX 1070 Ti. Although it stays ahead of the RX Vega 56, it's not an ideal showing, with the card's lead over the GTX 1060 6GB cut to around 40%.

[Charts: Ashes: Escalation - 99th Percentile - Extreme Quality at 3840x2160, 2560x1440, and 1920x1080]

 

Comments

  • sing_electric - Monday, January 7, 2019 - link

    It's likely that Nvidia has actually done something to restrict the 2060s to 6GB - either through its agreements with board makers or by physically disabling some of the RAM channels on the chip (or both). I agree, it'd be interesting to see how it performs, since I'd suspect it'd be at a decent price/perf point compared to the 2070, but that's also exactly why we're not likely to see it happen.
  • CiccioB - Monday, January 7, 2019 - link

    You can't add memory at will. You need to take into consideration the available bus, and as this is a 192-bit bus, you can install 3, 6, or 12 GB of memory, unless you cope with a hybrid configuration through heavily optimized drivers (as NVIDIA did with the 970).
  • nevcairiel - Monday, January 7, 2019 - link

    Even if they wanted to increase it, just adding 2GB more is hard to impossible. The chip has a certain memory interface, in this case 192-bit. That's 6x 32-bit memory controllers, for six 1GB chips (the arithmetic is sketched after the comments). You cannot just add 2 more without getting into trouble - like the 970, which had unbalanced memory speeds, and that was terrible.
  • mkaibear - Tuesday, January 8, 2019 - link

    "terrible" in this case defined as "unnoticeable to anyone not obsessed with benchmark scores"
  • Retycint - Tuesday, January 8, 2019 - link

    It was unnoticeable back then, because even the most intensive game/benchmark rarely utilized more than 3.5GB of VRAM. The issue, however, comes when newer games inevitably start to consume more and more VRAM - at which point the "terrible" last 0.5GB will become painfully apparent.
  • mkaibear - Wednesday, January 9, 2019 - link

    So, you agree with my original comment, which was that it was not terrible at the time? Four years from launch and it's not yet "painfully apparent"?

    That's not a bad lifespan for a graphics card. Or, if you disagree, can you tell me which games today have noticeable performance issues on a 970?

    FWIW my 970 has been great at 1440p for me for the last 4 years. No performance issues at all.
  • atragorn - Monday, January 7, 2019 - link

    I am more interested in that comment: "yesterday’s announcement of game bundles for RTX cards, as well as ‘G-Sync Compatibility’, where NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage." Will ALL Nvidia cards support Freesync/Freesync2, or only the RTX series?
  • A5 - Monday, January 7, 2019 - link

    Important to remember that VESA ASync and FreeSync aren't exactly the same.

    I don't *think* it will be instant compatibility with the whole FreeSync range, but it would be nice. The G-sync hardware is too expensive for its marginal benefits - this capitulation has been a loooooong time coming.
  • Devo2007 - Monday, January 7, 2019 - link

    Anandtech's article about this last night mentioned support will be limited to Pascal & Turing cards
  • Ryan Smith - Monday, January 7, 2019 - link

    https://www.anandtech.com/show/13797/nvidia-to-sup...
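
As a footnote to CiccioB's and nevcairiel's exchange above, here is a minimal sketch (our illustration, not code from the thread) of the capacity arithmetic a 192-bit bus imposes: one 32-bit channel per memory chip, so total capacity moves in steps of six chips.

    #include <cstdio>

    int main() {
        const int busWidthBits = 192;                        // RTX 2060's memory interface
        const int channelBits  = 32;                         // one GDDR chip per channel
        const int channels     = busWidthBits / channelBits; // = 6 chips

        // Per-chip densities of 0.5GB, 1GB, and 2GB give the 3/6/12 GB options
        // CiccioB lists; a "balanced" 8GB simply isn't reachable on this bus.
        const double chipSizesGB[] = {0.5, 1.0, 2.0};
        for (double chip : chipSizesGB)
            std::printf("%d channels x %.1f GB chips = %.0f GB total\n",
                        channels, chip, channels * chip);
        return 0;
    }

The GTX 970's 3.5GB + 0.5GB split is what happens when that balance is broken: part of the memory sat behind a slower path, which is the "unbalanced memory speeds" nevcairiel refers to.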
