Radeon RX VEGA 56 versus GeForce GTX 1070

On August 14th, AMD announced three new graphics cards based on the 'Vega' GPU architecture: the Radeon RX Vega 64 Liquid Cooled Edition, the air-cooled Radeon RX Vega 64, and the Radeon RX Vega 56. The Radeon RX Vega 56 arrived a little sooner than we expected, and frankly we weren't prepared for it when the card showed up. AMD allowed Vega 56 reviews to be published on August 14th and actually asked reviewers to focus on it, but the card won't be available to purchase until August 28th. When the AMD Radeon RX Vega 56 reaches the retail channel on August 28th it will command a $399 MSRP, and AMD expects it to be the clear leader at that price point.

All of the Radeon RX Vega graphics cards support HDMI 4K60 and DisplayPort 1.4, Radeon FreeSync displays, Ultra HD, ultrawide, and single-cable 5K resolutions. So, if you are looking to pick up a new display and want some flexibility and an affordable adaptive sync solution, the Vega lineup looks pretty solid on paper.

The Vega 56 at $399 has 56 Compute Units with 3,584 stream processors running at up to 1471 MHz for 10.5 TFLOPS of peak single-precision compute performance. Vega 56 also comes packed with 8GB of HBM2 memory running at 1.60 GHz (effective), which gives this model 410 GB/s of memory bandwidth. When it comes to board power, you are looking at 210 Watts for this model. Over 10 TFLOPS of compute and over 400 GB/s of memory bandwidth for $399 seems like a steal for gamers.

The only problem for AMD is the NVIDIA GeForce GTX 1070 8GB GDDR5 graphics card that was released in June 2016 for $449. NVIDIA has reduced the MSRP on the GeForce GTX 1070 over the past year, and the MSRP on this model is now $379. The NVIDIA GeForce GTX 1070 has 1,920 CUDA cores running at up to 1683 MHz for 6.5 TFLOPS of peak single-precision compute performance. The 8GB of GDDR5 memory on the GTX 1070 is clocked at 8,000 MHz (effective) and runs on a 256-bit bus for 256 GB/s of memory bandwidth. NVIDIA was able to get all that performance out of the GTX 1070 while keeping it at a 150W TDP rating.
Specification        RX Vega 56         GeForce GTX 1070
GPU                  Vega 10            GP104 (Pascal)
Transistor Count     12.5 Billion       7.2 Billion
GPU Cores            3584               1920
Base Clock           1156 MHz           1506 MHz
Boost Clock          1471 MHz           1683 MHz
Texture Units        224                120
ROP Units            64                 64
Memory               8 GB               8 GB
Memory Clock         1600 MHz           8000 MHz
Memory Interface     2048-bit HBM2      256-bit GDDR5
Memory Bandwidth     410 GB/s           256 GB/s
TDP                  210 Watts          150 Watts
Peak Compute         10.5 TFLOPS        6.5 TFLOPS
Process Node         14nm               16nm
MSRP (Aug 2017)      $399               $379
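The headline TFLOPS and bandwidth figures in the table above are simple spec-sheet arithmetic, so you can verify them yourself. Here is a minimal sketch (Python, using the boost clocks and memory speeds listed above) that assumes the usual convention of 2 FLOPs per core per clock, i.e. one fused multiply-add, which is how both vendors arrive at their peak compute numbers:

```python
# Peak FP32 compute and memory bandwidth from the spec-sheet numbers above.
# Assumes 2 FLOPs per core per clock (one fused multiply-add per cycle).

def peak_tflops(cores, boost_mhz):
    return 2 * cores * boost_mhz * 1e6 / 1e12

def bandwidth_gbs(bus_width_bits, effective_gbps):
    return bus_width_bits / 8 * effective_gbps

# RX Vega 56: 3,584 cores at up to 1471 MHz, 2048-bit HBM2 at 1.6 Gbps effective
print(f"Vega 56:  {peak_tflops(3584, 1471):.1f} TFLOPS, "
      f"{bandwidth_gbs(2048, 1.6):.0f} GB/s")   # 10.5 TFLOPS, 410 GB/s

# GTX 1070: 1,920 cores at up to 1683 MHz, 256-bit GDDR5 at 8 Gbps effective
print(f"GTX 1070: {peak_tflops(1920, 1683):.1f} TFLOPS, "
      f"{bandwidth_gbs(256, 8.0):.0f} GB/s")    # 6.5 TFLOPS, 256 GB/s
```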
Looking at the numbers on paper makes it look like a clear win for AMD, but is it going to be? Could the tale of the tape be wrong?

To figure out whether the Radeon RX Vega 56 or the GeForce GTX 1070 is the best GPU for under $399, we grabbed both cards and ran them through a series of benchmarks at 1080P, 1440P and 4K display resolutions. We have several GeForce GTX 1070 models here to pick from, so we randomly drew brands from a hat and the Gigabyte GeForce GTX 1070 G1 Gaming graphics card (part number GV-N1070G1 GAMING-8GD) was the sacrificial lamb from the group. This card has a base clock of 1594 MHz and a boost clock of 1784 MHz, so it carries roughly a 100 MHz factory overclock. The MSRP on this card is $429.99, but good luck finding any high-end gaming graphics card at its suggested retail price thanks to the cryptocurrency mining boom!

The AMD Radeon RX Vega 56 measures 10.5 inches in length versus the 11-inch-long Gigabyte GeForce GTX 1070 G1 Gaming. When it comes to GPU cooler designs, the Vega 56 has a blower-style fan and the GTX 1070 has the WINDFORCE 3X cooling system. We hate to compare a custom board partner card to a reference design, but let's face it, not many people are buying a GeForce GTX 1070 Founders Edition today.

To get these cards running properly in your gaming PC you'll need to hook up two 8-pin PCIe power connectors on the Vega 56, whereas just one is needed on the GTX 1070 model we are comparing it to today. Both cards have backplates and lighting on the tops of the cards. When it comes to video outputs for display connectivity, both cards have three standard-sized DisplayPort connectors and one full-sized HDMI port. The GeForce GTX 1070 does have a DVI output, though, which isn't seen on the AMD Radeon RX Vega 56 reference design. How about we spare you the chit-chat? Let's take a look at the test system and then get to the benchmarks!

Test System

Before we look at the numbers, let's take a brief look at the test system, as we switched over to a new system in April 2017 when the Windows 10 Creators Update was released. All testing was done using a fresh install of Windows 10 Pro 64-bit version 1703, and benchmarks were completed on the desktop with no other software programs running. There has been some concern about people testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago.

Intel X99 Platform

The Intel X99 platform that we used to test all of the video cards was running the ASUS X99-E-10G WS motherboard with BIOS 0603 that came out on 03/15/2017. We went with the Intel Core i7-6950X Broadwell-E processor to power this platform and overclocked it to 4.0GHz on all cores. The Corsair Vengeance LPX DDR4 memory kit we used was a 64GB kit (4x16GB), and while it is rated at 3600MHz we actually ran it at 3333MHz at 1.30V with 16-16-16-30 1T memory timings. The Samsung SSD 960 EVO 1TB M.2 PCIe NVMe SSD was run with the latest firmware available. A Corsair RM1000x power supply provides clean power to the system and is also silent, as the fan hardly ever spins up. This is critical to our testing as it lowers the ambient noise level of the room and gives us more accurate sound measurements. Here are the exact hardware components that we are using on our test system:
The Intel X99 Test Platform

Component            Brand/Model
Processor            Intel Core i7-6950X
Motherboard          ASUS X99-E-10G WS
Memory               64GB Corsair Vengeance LPX 3600MHz DDR4
Video Card           Various
Solid-State Drive    Samsung SSD 960 EVO 1TB
Cooling              Corsair Hydro H115i
Power Supply         Corsair RM1000x
Case                 HighSpeed PC Top Deck Tech Station
Operating System     Windows 10 64-bit
Monitor              ASUS PB287Q 28" 4K
Let's move on to the Battlefield 1 benchmark results!

Battlefield 1

Battlefield 1 (also known as BF1) is the fifteenth installment in the Battlefield series developed by DICE and published by EA. The game is set during World War I and was released worldwide on October 21, 2016. The singleplayer campaign of Battlefield 1 takes place across six different "War Stories" which revolve around different people in different aspects of the Great War, in campaigns such as the Italian Alps and the deserts of Arabia.

We benchmark in Through Mud and Blood, the second mission in the singleplayer campaign. Taking place late in the war, the player assumes the role of Danny Edwards, a British recruit joining the crew of a Mark V landship named Black Bess as their new tank driver. New to the war and inexperienced in driving the unreliable vehicle, Edwards is given a trial by fire with his first mission: punch through the German line at Cambrai with a broken tank and a crew that has no trust in him.

Battlefield 1 features the Frostbite 3 game engine and has very good graphics with tons of destructibles. Maps also feature dynamic weather systems that affect combat in various ways; for example, St. Quentin Scar can start as a clear sunny day, a dark foggy day, or in the middle of a rainstorm, and switch between them during the round.

We tested BF1 at 1920 x 1080 with the 'Ultra' graphics quality preset in DX12 with the GPU Memory Restriction turned off. We also disabled VSync.

Benchmark Results: Battlefield 1 is a strong win for the AMD Radeon RX Vega 56, decisively beating the Gigabyte GeForce GTX 1070 G1 Gaming graphics card. At 1080P the Vega 56 was 15% faster, and at 1440P the performance gap widened to 20%. The AMD Radeon RX Vega 56 is off to a good start!
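A quick note on how we express these gaps throughout the review: "X% faster" is just the ratio of the two cards' average frame rates. A minimal sketch with made-up FPS values (the formula, not our measured data):

```python
def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, expressed in percent."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical example: 92 FPS vs 80 FPS -> card A is 15% faster
print(f"{percent_faster(92, 80):.0f}%")   # 15%
```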

Deus Ex: Mankind Divided

Deus Ex: Mankind Divided is an action role-playing stealth video game developed by Eidos Montreal and published by Square Enix. Set in a cyberpunk-themed dystopian world in 2029, two years after the events of Human Revolution, Mankind Divided features the return of Adam Jensen from the previous game, Deus Ex: Human Revolution, with new technology and body augmentations. The game was released on August 23rd, 2016 for PC users.

Deus Ex: Mankind Divided uses a heavily modified version of the Glacier 2 engine that has been tweaked so much that it is now called the Dawn engine. We took a look at GPU performance using the 'High' image quality preset with MSAA and VSync turned off. We chose to run just the 'High' image quality settings due to how tough this title is to render, and we feel most gamers will target this setting.

Benchmark Results: This DX12 title with the High image quality preset is pretty tough on graphics cards, but we again found the Radeon RX Vega 56 winning across the board at 1080P, 1440P and even 4K resolutions. The Vega 56 was 12.7% faster at 1080P and 14.8% faster at 1440P, so we are seeing a significant lead for the new AMD Radeon RX Vega 56.

Gears of War 4

Gears of War is a video game franchise created and originally owned by Epic Games, developed and managed by The Coalition, and owned and published by Microsoft Studios. The series focuses on the conflict between humanity, the subterranean reptilian hominids known as the Locust Horde, and their mutated counterparts, the Lambent and the Swarm. Gears of War 4 was released on October 11, 2016 for the PC and is an interesting title in the sense that it must be run on Windows 10 Anniversary Edition. It uses the DirectX 12 API with features like async compute and tiled resources.

Microsoft and the developer (The Coalition) have worked hard to make the PC port not suck, placing more than 30 graphics settings in the options menu with 100-plus values you can adjust. We are using the Ultra image quality preset with VSync turned off.

Gears of War 4 comes with a built-in benchmark that appears to be pretty decent and shows the average minimum framerate (bottom 5%) in the results. This is the benchmark we ran on our graphics cards with Ultra image quality settings to see how they would perform.

Benchmark Results: Gears of War 4 showed the Gigabyte GeForce GTX 1070 pulling out the win at 1080P, 1440P and 4K screen resolutions.

Fallout 4

Fallout 4 is an open-world action role-playing game developed by Bethesda Game Studios and published by Bethesda Softworks. Fallout 4 is set in a post-apocalyptic Boston in the year 2287, 210 years after a devastating nuclear war, in which the player character emerges from an underground bunker known as a Vault. Gameplay is similar to Fallout 3. The title is the fifth major installment in the Fallout series and was released worldwide on November 10th, 2015.

Fallout 4 was benchmarked with Ultra image quality settings with TAA and 16x AF. These settings are tough for entry-level discrete graphics cards, but are more than playable on high-end gaming graphics cards. V-Sync can't be disabled in the game's options, so we edited the necessary INI files and disabled vsync in the driver software as well. We used FRAPS to benchmark Fallout 4 after you emerge from the vault and are in The Commonwealth.

Benchmark Results: For the second game title in a row the GeForce GTX 1070 comes out ahead of the AMD Radeon RX Vega 56, but only by 1-7 FPS on average.

Grand Theft Auto V

Grand Theft Auto V, currently one of the hottest PC games, was finally released for the PC on April 14, 2015. Developed by Rockstar, it is set in 2013 in the city of Los Santos. It utilizes the Rockstar Advanced Game Engine (RAGE), which Rockstar has been using since 2006, with multiple updates for technology improvements.

In Grand Theft Auto V we set the game to run with no MSAA, 16x AF and high image quality settings, as we didn't want the GPU to bottleneck the system too badly but wanted a setup your average gamer would actually play on. We used the game's built-in benchmark utility so that at least one of our tests can be compared to your setup at home. We averaged all five of the default benchmark runs and omitted both the minimum and maximum values, as those results are garbage due to major inconsistencies.

Benchmark Results: The AMD Radeon RX Vega 56 was able to tie the Gigabyte GeForce GTX 1070 G1 Gaming at 4K, but lost at 1440P and 1080P.

Tom Clancy's Ghost Recon Wildlands

Tom Clancy's Ghost Recon Wildlands is an open-world tactical shooter developed by Ubisoft Paris. It is the tenth installment in the Tom Clancy's Ghost Recon franchise and the first Ghost Recon game to feature an open-world environment. The game moves away from the futuristic setting introduced in Tom Clancy's Ghost Recon Advanced Warfighter and instead features a setting similar to the original Tom Clancy's Ghost Recon. Ubisoft described it as one of the biggest open-world games they have ever published, with a game world including a wide variety of environments such as mountains, forests, deserts and salt flats. A modified version of the AnvilNext game engine was used. The game was released on March 7, 2017 for Microsoft Windows, PlayStation 4 and Xbox One.

Tom Clancy's Ghost Recon Wildlands was benchmarked with high image quality settings with Temporal AA and 4x AF. V-Sync and the framerate limit were both disabled, and we used the title's built-in benchmark.

Benchmark Results: Ghost Recon Wildlands is yet another title where we found the AMD Radeon RX Vega 56 falling to the Gigabyte GeForce GTX 1070.

3DMark Time Spy - DX12

3DMark Time Spy recently came out and is the latest DirectX 12 benchmark test for gaming PCs running Windows 10. This DirectX 12 Feature Level 11_0 benchmark utilizes a pure DirectX 12 game engine that supports features like asynchronous compute, explicit multi-adapter, and multi-threading! The developers opted for DirectX 12 Feature Level 11_0 to ensure wide compatibility with DirectX 11 hardware through DirectX 12 drivers.

With DirectX 12 on Windows 10, game developers can significantly improve the multi-thread scaling and hardware utilization of their titles to draw more objects, textures and effects for your viewing enjoyment. 3DMark Fire Strike is a great high-end DirectX 11 benchmark, but it doesn't really show what new graphics cards can do in a DirectX 12 title that has much more going on while you play.

We ran 3DMark Time Spy with the standard settings and got the following results: in this DX12 benchmark the AMD Radeon RX Vega 56 scored 6,713 points, enough to come out ahead of the Gigabyte GeForce GTX 1070's score of 6,490 points.

VRMark

VRMark is a relatively new benchmark aimed at those who might be thinking about buying an HTC Vive or an Oculus Rift and want to know what hardware will give them the best VR gaming experience. VRMark includes two VR benchmark tests that run on your monitor, no headset required, or on a connected HMD. At the end of each test, you'll see whether your PC is VR ready, and if not, how far it falls short.

The Orange Room benchmark shows the impressive level of detail that can be achieved on a PC that meets the recommended hardware requirements for the HTC Vive and Oculus Rift. If your PC passes this test, it's ready for the two most popular VR systems available today. The Orange Room rendering resolution is 2264 x 1348 (1132 x 1348 per eye) and the target desktop frame rate is 109 FPS.
Benchmark Results: The AMD Radeon RX Vega 56 and Gigabyte GeForce GTX 1070 were basically tied in the VRMark Orange Room. Sure, the Gigabyte card came out on top, but there was just a tenth of a frame per second difference between the two cards' frame rates at over 200 FPS. The goal in this benchmark is to hit 109 FPS, as that is the number deemed 'needed' for VR gaming, and these two cards easily cover that!

The Blue Room is designed to be a more intense test, with a rendering resolution of 5120 x 2880 (5K); the goal is for a desktop PC to maintain a consistent frame rate of 109 FPS or above without dropping frames to pass this test.

Benchmark Results: The VRMark Blue Room is much tougher on GPUs, but even here the two cards perform close to the same, with the Vega 56 leading by 0.4 FPS.
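To put in perspective just how much tougher the Blue Room is, here is a quick sketch of the pixel throughput each test demands at the 109 FPS target, using the rendering resolutions listed above:

```python
# Pixels per second each VRMark test demands at its 109 FPS target frame rate.
def pixels_per_second(width, height, fps=109):
    return width * height * fps

orange = pixels_per_second(2264, 1348)   # ~0.33 gigapixels/s
blue   = pixels_per_second(5120, 2880)   # ~1.61 gigapixels/s
print(f"Blue Room pushes {blue / orange:.1f}x the pixels of Orange Room")  # ~4.8x
```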

SuperPosition Benchmark

UNIGINE introduced the Superposition benchmark in April 2017; it is based on the UNIGINE 2 engine and is pretty tough on modern graphics cards. We ran the Superposition performance test with the 1080P High preset. The AMD Radeon RX Vega 56 scored 8,149 versus 9,135 on the Gigabyte GeForce GTX 1070, making the GeForce GTX 1070 12% faster than the AMD Radeon RX Vega 56 here. The FPS results from the benchmark show the Vega 56 dipping down to 50 FPS while the GeForce GTX 1070 drops to 54 FPS. Those aren't horrible spikes, and if you are playing on a FreeSync or G-Sync gaming monitor neither of these drops should be noticeable.

Temperature & Noise Testing

The gaming performance of a graphics card is the most important factor in buying one, but you also need to be concerned about noise, temperature and power consumption. Since GPU-Z doesn't yet read the temperatures of Vega, we didn't have a way to record or log temperatures, and using AMD Radeon Settings isn't good for idle readings as it uses the GPU, so we ended up using GPU Shark instead.

AMD Radeon RX Vega 56 Temperatures: The AMD Radeon RX Vega 56 reference card ran at 39C idle and hit 75C while gaming, according to GPU Shark.

Gigabyte GeForce GTX 1070 G1 Gaming Temperatures: The Gigabyte GeForce GTX 1070 G1 Gaming has no fans running at idle and averaged 42C, and after gaming for over half an hour temperatures topped out at 64C.

Sound Testing

We test noise levels with an Extech sound level meter that has ±1.5dB accuracy that meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 36dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data.

When it comes to noise levels the Gigabyte GeForce GTX 1070 G1 Gaming with the WINDFORCE 3X cooler is the easy winner. This model has 'fan stop' for 0dB performance at idle, and at load our meter only hit 46.6 dB. The AMD Radeon RX Vega 56 has a blower-style fan that always spins, so it was about 1 dB higher at idle and 12 dB higher when gaming. Remember that the decibel scale is logarithmic rather than linear: moving from 50 dB to 60 dB is a 10x increase in sound intensity, so this is a massive difference that one really needs to hear to believe.
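For the curious, the intensity ratio between two dB readings is 10 raised to the difference over 10. A small sketch, using the ~12 dB load gap we measured (the 58.6 dB value is simply our 46.6 dB GTX 1070 reading plus that 12 dB delta, not a separately logged number):

```python
# Relative sound intensity between two dB(A) readings: 10 ** (delta / 10)
def intensity_ratio(db_high, db_low):
    return 10 ** ((db_high - db_low) / 10)

print(intensity_ratio(60, 50))        # 10.0 -> 10x the sound intensity
print(intensity_ratio(58.6, 46.6))    # ~15.8x, the ~12 dB load gap we measured
```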

Power Consumption

For testing power consumption, we took our test system and plugged it into a Kill-A-Watt power meter. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers we ran Battlefield 1 at 1920 x 1080 and recorded the peak power number while gaming on a particular level that we test on each card. Our idle numbers are high, but keep in mind that we are using a workstation motherboard that has two 10GbE ports and other high-power-draw controllers on it.

Power Consumption Results: The AMD Radeon RX Vega 56 is rated at 210W and the GeForce GTX 1070 at 150W, so the outcome here is to be expected. We hit 444 Watts at the wall on the Vega 56 and just 352 Watts on the GeForce GTX 1070 while playing BF1. The AMD Radeon RX Vega 56 used 92 Watts more power in this game title!
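What does that 92 W gap mean on your electric bill? A rough, hypothetical sketch; the weekly hours and the $0.12/kWh rate are assumptions for illustration, not measurements:

```python
# Hypothetical running-cost estimate for the 92 W at-the-wall gap.
extra_watts = 444 - 352        # measured difference while playing BF1
hours_per_week = 20            # assumption: a fairly heavy gaming habit
price_per_kwh = 0.12           # assumption: USD, roughly the US average rate

extra_kwh_per_year = extra_watts / 1000 * hours_per_week * 52
print(f"{extra_kwh_per_year:.0f} kWh/year")                      # ~96 kWh
print(f"${extra_kwh_per_year * price_per_kwh:.2f}/year extra")   # ~$11.48
```

Not a fortune, but it adds up for a card that runs 24/7, which matters for the mining crowd we talk about next.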

Ether Mining with Claymore's Dual Ethereum GPU Miner v9.8

We've never included Ethereum mining results in a gaming GPU review before, but let's face it, more miners are buying video cards right now than gamers, and many gamers have been mining to earn a little extra income or pay off their expensive gaming graphics card.

We fired up Claymore's Dual Ethereum AMD+NVIDIA GPU Miner v9.8 and did some quick testing on DAG epoch #138 (2.08GB) with the latest drivers for each card. We aren't going to get into too much detail here since the focus of this article is gaming, but the AMD Radeon RX Vega 56 gets around 32 MH/s out of the box versus around 26.5 MH/s on the GeForce GTX 1070. Overclocking the memory on both cards improves things significantly: with the memory pushed to the max on both cards we were getting 34 MH/s on the Vega 56 versus 32 MH/s on the GeForce GTX 1070. With the overclocked settings we measured 367 Watts at the wall on the Vega 56 versus 183 Watts on the GeForce GTX 1070, and that was with the power target/limit reduced as far as we could in each card's respective overclocking utility.
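Since miners care about hashrate per watt more than raw hashrate, here is a quick sketch from the overclocked numbers above. Note these are at-the-wall system power figures, not card-only power, so treat the result as relative rather than absolute:

```python
# Hashing efficiency at the wall from our overclocked, power-limited numbers.
# System (not card-only) power, so treat these as relative figures only.
vega56_eff  = 34 / 367    # ~0.093 MH/s per watt
gtx1070_eff = 32 / 183    # ~0.175 MH/s per watt
print(f"GTX 1070 is {gtx1070_eff / vega56_eff:.1f}x as efficient")  # ~1.9x
```

Let's wrap this review up!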

For The Critics Of This Review

We've had a number of people come to this review and knock it as a waste of space and pointless because we compared an aftermarket GeForce GTX 1070 to a reference Radeon RX Vega 56. We deleted some of the hate-filled comments that were putting us down, then went back, grabbed our NVIDIA GeForce GTX 1070 FE card from the pile and ran some benchmarks on it. So, let's look at the arguments!

Noise levels are going to favor a custom board's triple-fan cooler versus a reference blower card... Our testing of the GeForce GTX 1070 FE with a blower-style fan had it nearly 10dB lower than the Vega 56 reference card. The NVIDIA GeForce GTX 1070 FE easily wins when it comes to blower-versus-blower performance against the Vega 56.

Thermal performance is going to be better on the custom-cooled cards... That is 100% true, but on our open-air test system in a nicely air-conditioned room the difference wasn't massive. The GeForce GTX 1070 FE has the lowest temp at idle since the Gigabyte card's fans don't spin at idle. So, the GTX 1070 FE has higher fan noise at idle, but is 7C cooler. At load there is only a 3C difference, but the Vega 56 is almost 10dB louder.

FPS performance is going to be better on the AIB cards... That is true when they are factory overclocked like most are, but you are talking about a 2% difference here. If you want to know how a stock GeForce GTX 1070 FE performs, you can mentally take ~2% off the Gigabyte GeForce GTX 1070 G1 Gaming numbers and that will get you in the ballpark (see the quick sketch below); an 88 MHz overclock on a GeForce GTX 1070 isn't going to drastically increase performance. Something else to factor in is that power draw is lower on the reference card, since it runs at lower clock speeds and has fewer fans: the AMD Radeon RX Vega 56 reference card uses more than 100 Watts more than the NVIDIA GeForce GTX 1070 FE 'reference card'.

Thanks for the hate, but with either a GeForce GTX 1070 reference card or an aftermarket GTX 1070, the overall conclusion is pretty much the same.
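If you want to do that mental derating precisely, a trivial sketch; the 100 FPS input is a hypothetical value, and the ~2% gain is the real-world figure we observed rather than the raw clock-speed delta:

```python
# Rough stock-FE estimate: shave the ~2% real-world factory-OC gain off the
# Gigabyte G1 Gaming numbers (the 100 FPS input is hypothetical).
def estimate_stock_fps(aib_fps, oc_gain=0.02):
    return aib_fps / (1 + oc_gain)

print(f"{estimate_stock_fps(100):.1f} FPS")   # ~98.0 FPS
```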

Final Thoughts and Conclusions

The AMD Radeon RX Vega 56 is supposedly the sweet spot of the three Vega cards AMD has released thus far, and it lines up directly against the GeForce GTX 1070. AMD is banking on the Vega 56 being the high-performance card to buy at the $399 price point once it starts shipping on August 28th, 2017. Pricing is always an interesting topic, and right now it's nearly impossible to talk about as most of the cards in this price range are being purchased by alt-coin miners. That has created a supply issue that has caused prices to shoot up. You had better go buy a lottery ticket if you can find a GeForce GTX 1070 or Radeon RX Vega at MSRP anytime this month.
Specification        RX Vega 56         GeForce GTX 1070
GPU                  Vega 10            GP104 (Pascal)
Transistor Count     12.5 Billion       7.2 Billion
GPU Cores            3584               1920
Base Clock           1156 MHz           1506 MHz
Boost Clock          1471 MHz           1683 MHz
Texture Units        224                120
ROP Units            64                 64
Memory               8 GB               8 GB
Memory Clock         1600 MHz           8000 MHz
Memory Interface     2048-bit HBM2      256-bit GDDR5
Memory Bandwidth     410 GB/s           256 GB/s
TDP                  210 Watts          150 Watts
Peak Compute         10.5 TFLOPS        6.5 TFLOPS
Process Node         14nm               16nm
MSRP (Aug 2017)      $399               $379
On paper the Radeon RX Vega 56 looks like it would easily outperform the GeForce GTX 1070, but what did our performance tests show? Looking at 1440P gaming performance, the RX Vega 56 took the lead in Battlefield 1 and Deus Ex: Mankind Divided, which were tested running DirectX 12. The GeForce GTX 1070 won Gears of War 4, Fallout 4, Grand Theft Auto V and Tom Clancy's Ghost Recon Wildlands. So, the Radeon RX Vega 56 took home 2 titles and the GeForce GTX 1070 won the other 4. When it comes to synthetic benchmarks the Vega 56 won 3DMark, the GeForce GTX 1070 won Superposition and we'd call VRMark a tie, so they split the synthetics with one win each.

When it comes to power consumption, noise and temperatures, the AMD Radeon RX Vega 56 proved to be more power hungry, louder and hotter. To cap that off, the AMD Radeon RX Vega 56 at $399 costs more than the $379 GeForce GTX 1070 reference design. The Gigabyte GeForce GTX 1070 G1 Gaming that we used today has an MSRP of $429.99, so it is priced $50 above the reference card and $30 above the Vega 56. We were expecting NVIDIA to reduce GeForce GTX 1070 and GeForce GTX 1080 prices once RX Vega was released, but it appears they don't really have a reason to. The miners are buying everything the board partners put out to sell, and it doesn't appear that the Radeon RX Vega 56 is going to put too much pressure on the GeForce GTX 1070 or GeForce GTX 1080.

The AMD Radeon RX Vega 56 does appear to be the best bang for the buck among the Vega cards. If you are an AMD fan wanting to build a nice new gaming rig around a display that features FreeSync technology, the Vega 56 looks like it should do really well. This is because the FreeSync range on most 4K displays is 40-60Hz, and 48-100Hz on some of the new 1440P UltraWide gaming displays. The AMD FreeSync certified range is 30Hz to 144Hz, but most monitors don't have a range that wide. Ideally you want the minimum frame rate to be within the display's FreeSync range to get a smooth gaming experience (a quick sanity-check sketch is below). Setting up a FreeSync display for your gaming system is less costly than an NVIDIA G-Sync solution, so that is one area AMD will be heavily pushing with Vega.
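That sanity check is trivial to do yourself; a minimal sketch using the ranges cited above (the 45 FPS minimum is a hypothetical value, not one of our measurements):

```python
# Does a game's minimum frame rate fall inside the display's FreeSync window?
# Ranges are the ones cited above; the 45 FPS figure is hypothetical.
def in_freesync_range(min_fps, low_hz, high_hz):
    return low_hz <= min_fps <= high_hz

print(in_freesync_range(45, 40, 60))    # typical 4K display: True
print(in_freesync_range(45, 48, 100))   # 1440P ultrawide: False, below the 48 Hz floor
```

AMD made a move in the right direction with Vega, but what looks good on paper can be deceiving and doesn't always translate into great real-world performance. Maybe AMD can pull more performance out of the new Vega architecture with driver enhancements in the months to come, but for now it looks like the year-old GeForce GTX 1070 is standing strong.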