AMD Radeon RX Vega Benchmark Review: Vega 64 and Vega 56 Tested

Tom Clancy’s Ghost Recon Wildlands

Tom Clancy’s Ghost Recon Wildlands is an open world tactical shooter video game developed by Ubisoft Paris. It is the tenth installment in the Tom Clancy’s Ghost Recon franchise and the first Ghost Recon game to feature an open world environment. The game moves away from the futuristic setting introduced in Tom Clancy’s Ghost Recon Advanced Warfighter and instead features a setting similar to the original Tom Clancy’s Ghost Recon. Ubisoft described it as one of the biggest open world games it has ever published, with the game world including a wide variety of environments such as mountains, forests, deserts and salt flats. A modified version of the AnvilNext game engine was used. The game was released on March 7, 2017 for Microsoft Windows, PlayStation 4 and Xbox One.

Ghost Recon Wildlands Image Quality Settings

Tom Clancy’s Ghost Recon Wildlands was benchmarked at high image quality settings with Temporal AA and 4x AF. V-Sync and the framerate limiter were both disabled, and we used the title’s built-in benchmark.

4K Benchmark Results: Ghost Recon Wildlands is a very demanding title when it comes to graphics, and at 4K with just the high preset we were getting around 43 FPS on the AMD Radeon RX Vega 64 and 37 FPS on the AMD Radeon RX Vega 56. The EVGA GeForce GTX 1080 FTW2 was getting around 48 FPS with 10 Gbps GDDR5X memory and 50 FPS with 11 Gbps GDDR5X memory.

  • chrisday85

    Ok, here are some things ticking me off: The Vega 64 has literally twice the GFLOPS and twice the memory bandwidth of the RX 580. The gap from the Vega 64 to the RX 580 should be huge. The GTX 1060 to GTX 1080 gap may be basically as large GFLOPS-wise, but it is minimal memory-bandwidth-wise. And what do you notice in these benchmarks? What the bloody hell is going on? Vega is a new architecture; it’s supposed to be better in all sorts of ways. Yet here we see the same architecture generation from Nvidia performing much better in line with the spec increases. Allow me to demonstrate:

    Cross-multiply the Nvidia GTX 1060 and GTX 1080 Gears 4 results at 1080p against the RX 580, and you find that if AMD scaled as well, the Vega 64 should average 157 in the frame rate, before accounting for any leap from the new architecture. What does it turn in? 128. Nvidia is scaling better in the same generation than AMD is in the new one?

    Fallout IV: It should be 125 if it scaled as well as the NVIDIA card. It is instead 112.

    Ghost Recon:

    95 instead of 106

    Battlefield:

    131 instead of 153

    Deus Ex:

    112 instead of 125.

    Grand Theft Auto is the only one it scales a tiny bit better.

    It’s scaling WORSE in DX 12 than it is in DX 11, I might add; do the math.

    I bought the Vega 64 and I have to say I’m disappointed. I’m hoping these beta drivers are the reason, because this is absurd. I might add, the card gets to 86°C unless I turn the fans to “wind turbine” as well. I also seem to be having a frame rate issue with Batman Arkham Knight, which I’m just going to chalk up to some weird sort of glitch, because it’s nowhere near where benchmarks peg it. This has got to be one of the most disappointing launches of a new architecture I have seen from AMD.
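The cross-multiplication the commenter describes can be sketched as a quick calculation. Note the input FPS values below are hypothetical placeholders for illustration, since the per-card benchmark numbers are not quoted in the comment; only the derived figures (e.g. 157 expected vs. 128 actual) appear in the text.

```python
def expected_fps(baseline_fps, reference_low, reference_high):
    """Scale a baseline card's FPS by the low-end -> high-end ratio
    observed on another vendor's card pair (simple cross-multiplication)."""
    return baseline_fps * (reference_high / reference_low)

# Hypothetical example numbers (not the article's data): if a GTX 1060
# hit 60 FPS and a GTX 1080 hit 105 FPS in the same test, and an RX 580
# hit 58 FPS, then a Vega 64 scaling "as well as Nvidia" should land near:
estimate = expected_fps(58, 60, 105)
print(round(estimate, 1))  # 101.5
```

Comparing that estimate against the Vega 64's actual result is the gap the commenter is pointing at.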

  • Stephen

    test system, 6950x ivy bridge-e? error

    • Nathan Kirsch

      haha, winner! Fixed and thank you.

  • Sean Kumar Sinha

    Was hoping for a card that’d make NVidia push Volta and create a realistic GPU war. Unfortunately, we got cards that compete with stuff that NVidia has had out for over a year and that draw too much power, in comparison. These are good cards, for sure, but they are a year too late. Vega 56 seems like a good option for some 1080P Freesync gaming, though.

  • James Yarno

    Ethereum hash rate?