GeForce GTX 1070 Ti Arrives To Party

NVIDIA recently released the GeForce GTX 1070 Ti graphics card at $449. This card aims to fill the performance gap between the GeForce GTX 1070 ($349) and the GeForce GTX 1080 ($499) to better position the GeForce lineup against AMD's Radeon RX Vega series. NVIDIA slipped this card into the lineup near the end of the expected life span of Pascal and has done some interesting things on this model. For starters, all GeForce GTX 1070 Ti cards are shipping at the same core and memory clock speeds, which means clock speeds can't be used to differentiate between models. NVIDIA told us that they did this to simplify the product stack, but overclockers are free to manually overclock GeForce GTX 1070 Ti cards for extra performance, and 2+ GHz core clocks are something one should aim for.

It is hard to believe that the original NVIDIA GeForce GTX 1070 8GB GDDR5 graphics card was released in June 2016 for $449. NVIDIA has reduced the MSRP on the GeForce GTX 1070 over the past year down to $349, and the new Ti model now fills the $449 price point. The original NVIDIA GeForce GTX 1070 had 1,920 CUDA cores running at up to 1683 MHz for 6.5 TFLOPS of peak single-precision compute performance. The new GeForce GTX 1070 Ti boosts up to the same clock speed, but has 2,432 CUDA cores for 8.1 TFLOPS. Having 512 more CUDA cores is nice, but NVIDIA also increased the number of texture units: rather than the 120 found on the 1070, the new 1070 Ti has 152! The 8GB of GDDR5 memory on both cards is clocked at 8,000 MHz (effective) and runs on a 256-bit bus for 256 GB/s of memory bandwidth.
| | RX Vega 56 | GeForce GTX 1070 | GeForce GTX 1070 Ti |
|---|---|---|---|
| GPU | Vega 10 | GP104 Pascal | GP104 Pascal |
| Transistor Count | 12.5 Billion | 7.2 Billion | 7.2 Billion |
| Base Clock | 1156 MHz | 1506 MHz | 1607 MHz |
| Boost Clock | 1471 MHz | 1683 MHz | 1683 MHz |
| Memory | 8 GB | 8 GB | 8 GB |
| Memory Clock | 1600 MHz | 8000 MHz | 8000 MHz |
| Memory Interface | 2048-bit HBM2 | 256-bit GDDR5 | 256-bit GDDR5 |
| Memory Bandwidth | 410 GB/s | 256 GB/s | 256 GB/s |
| TDP | 210 Watts | 150 Watts | 180 Watts |
| Peak Compute | 10.5 TFLOPS | 6.5 TFLOPS | 8.1 TFLOPS |
| MSRP (Nov 2017) | $399 | $349 | $449 |
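The peak-compute and bandwidth figures above follow directly from the specs: peak FP32 throughput is CUDA cores times boost clock times 2 (one fused multiply-add counts as two operations per cycle), and memory bandwidth is effective memory clock times bus width divided by 8 bits per byte. A minimal sketch checking the math (the function names are ours, not NVIDIA's):

```python
# Back-of-the-envelope check of the table's peak compute and bandwidth figures.

def peak_tflops(cuda_cores: int, boost_mhz: int) -> float:
    """Peak single-precision compute: cores x clock x 2 ops/cycle (FMA)."""
    return cuda_cores * boost_mhz * 1e6 * 2 / 1e12

def bandwidth_gbs(effective_mhz: int, bus_bits: int) -> float:
    """Peak memory bandwidth: effective clock x bus width / 8 bits-per-byte."""
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

# GTX 1070: 1,920 cores boosting to 1683 MHz -> ~6.5 TFLOPS
print(round(peak_tflops(1920, 1683), 1))
# GTX 1070 Ti: 2,432 cores at the same 1683 MHz -> ~8.2, close to the quoted 8.1 TFLOPS
print(round(peak_tflops(2432, 1683), 1))
# Both cards: 8,000 MHz effective GDDR5 on a 256-bit bus -> 256 GB/s
print(round(bandwidth_gbs(8000, 256)))
```

Note these are theoretical peaks; real boost clocks on Pascal routinely run above the rated 1683 MHz, so sustained numbers in games will differ.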
Test System

Before we look at the numbers, let's take a brief look at the test system that was used, as we switched over to a new system in April 2017 when the Windows 10 Creators Update was released. All testing was done using a fresh install of Windows 10 Pro 64-bit version 1703, and benchmarks were completed on the desktop with no other software programs running. There has been some concern about testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago. Video cards & drivers used for testing:
- AMD Radeon Software Crimson Edition 17.7.2 & Vega RX Beta for Radeon Cards
- NVIDIA GeForce 388.18 WHQL for GeForce Cards
Intel X99 Platform

The Intel X99 platform that we used to test all of the video cards was running the ASUS X99-E-10G WS motherboard with BIOS 0603 that came out on 03/15/2017. We went with the Intel Core i7-6950X Broadwell-E processor to power this platform and overclocked it to 4.0GHz on all cores. The Corsair Vengeance LPX DDR4 memory kit we used was a 64GB kit (4x16GB), and while it is rated at 3600MHz we actually ran it at 3333MHz at 1.30V with 16-16-16-30 1T memory timings. The Samsung SSD 960 EVO 1TB M.2 PCIe NVMe SSD was run with the latest firmware available. A Corsair RM1000x power supply provides clean power to the system and is also silent, as the fan hardly ever spins up. This is critical to our testing as it lowers the ambient noise level of the room and gives us more accurate sound measurements. Here are the exact hardware components that we are using on our test system:
| The Intel X99 Test Platform | |
|---|---|
| Processor | Intel Core i7-6950X |
| Motherboard | ASUS X99-E-10G WS |
| Memory | 64GB Corsair Vengeance LPX 3600MHz DDR4 |
| Solid-State Drive | Samsung SSD 960 EVO 1TB |
| Cooling | Corsair Hydro H115i |
| Power Supply | Corsair RM1000x |
| Case | HighSpeed PC Top Deck Tech Station |
| Operating System | Windows 10 64-bit |
| Monitor | ASUS PB287Q 28" 4K |
Battlefield 1

Battlefield 1 (also known as BF1) is the fifteenth installment in the Battlefield series, developed by DICE and published by EA. The game is set during World War I and was released worldwide on October 21, 2016. The singleplayer campaign of Battlefield 1 takes place across six different "War Stories" which revolve around different people in different aspects of the Great War, in campaigns such as the Italian Alps and the deserts of Arabia. We benchmark in Through Mud and Blood, the second mission in the singleplayer campaign. Taking place late in the war, the player assumes the role of Danny Edwards, a British recruit joining the crew of a Mark V landship named Black Bess as their new tank driver. New to the war and inexperienced in driving the unreliable vehicle, Edwards is given a trial by fire with his first mission: punch through the German line at Cambrai with a broken tank and a crew that has no trust in him.

Battlefield 1 features the Frostbite 3 game engine and has very good graphics with tons of destructibles. Maps also feature dynamic weather systems that affect combat in various ways; for example, St. Quentin Scar can start as a clear, sunny day, a dark, foggy day, or in the middle of a rainstorm, and switch between them during the round. We tested BF1 at 1920 x 1080 with the 'Ultra' graphics quality preset in DX12 with the GPU Memory Restriction turned off. We also disabled VSync.

Benchmark Results: The AMD Radeon RX Vega 56 reference graphics card edges out the new GeForce GTX 1070 Ti graphics cards at 1080P, 1440P and 4K screen resolutions, but only by a handful of frames on average.
Deus Ex: Mankind Divided

Deus Ex: Mankind Divided is an action role-playing stealth video game developed by Eidos Montreal and published by Square Enix. Set in a cyberpunk-themed dystopian world in 2029, two years after the events of Human Revolution, Mankind Divided features the return of Adam Jensen from the previous game, Deus Ex: Human Revolution, with new technology and body augmentations. The game was released on August 23rd, 2016 for PC users. Deus Ex: Mankind Divided uses a heavily modified version of the Glacier 2 engine that has been tweaked so much that it is now called the Dawn game engine. We took a look at GPU performance using the 'High' image quality preset with MSAA and VSync turned off. We chose to run just the 'High' image quality settings due to how tough this game title is to render, and we feel that most gamers will target this setting.

Benchmark Results: This DX12 game title with the high image quality preset is pretty tough on graphics cards, but we again found the Radeon RX Vega 56 winning across the board at 1080P, 1440P and even 4K resolutions. The new GeForce GTX 1070 Ti does help close the performance gap between the GeForce GTX 1070 and the Radeon RX Vega 56, but does not overtake it!
Gears of War 4

Gears of War is a video game franchise created and originally owned by Epic Games, developed and managed by The Coalition, and owned and published by Microsoft Studios. The series focuses on the conflict between humanity, the subterranean reptilian hominids known as the Locust Horde, and their mutated counterparts, the Lambent and the Swarm. Gears of War 4 was released on October 11, 2016 for the PC and is an interesting game title in the sense that it must be run on Windows 10 Anniversary Edition. It uses the DirectX 12 API with features like async compute and tiled resources. Microsoft and the developer (The Coalition) have worked hard to make the PC port not suck and have placed more than 30 graphics settings in the options menu, with 100-plus options that you can adjust. We are using the Ultra image quality preset with VSync turned off. Gears of War 4 comes with a built-in benchmark that appears to be pretty decent and shows the average minimum framerate (bottom 5%) in the results. This is the benchmark that we ran on our graphics cards with Ultra image quality settings to see how they would perform.

1440P Benchmark Results: Gears of War 4 showed that the Gigabyte GeForce GTX 1070 G1 Gaming can beat the Radeon RX Vega 56, so it shouldn't be a surprise that the GeForce GTX 1070 Ti models bested even those numbers and performed better at 1080P, 1440P and 4K screen resolutions.
Fallout 4

Fallout 4 is an open world action role-playing video game developed by Bethesda Game Studios and published by Bethesda Softworks. Fallout 4 is set in a post-apocalyptic Boston in the year 2287, 210 years after a devastating nuclear war, in which the player character emerges from an underground bunker known as a Vault. Gameplay is similar to Fallout 3. The title is the fifth major installment in the Fallout series and was released worldwide on November 10th, 2015. Fallout 4 was benchmarked with Ultra image quality settings with TAA and 16x AF. These settings are tough for entry-level discrete graphics cards, but are more than playable on high-end gaming graphics cards. V-Sync can't be disabled in the game's options, so we edited the necessary INI files and disabled VSync in the driver software as well. We used FRAPS to benchmark Fallout 4 after emerging from the vault into The Commonwealth.

Benchmark Results: The GeForce GTX 1070 Ti easily beats the Radeon RX Vega 56 in Fallout 4 and comes really close to performing better than the AMD Radeon RX Vega 64 air-cooled model.
Tom Clancy's Ghost Recon Wildlands

Tom Clancy's Ghost Recon Wildlands is an open world tactical shooter video game developed by Ubisoft Paris. It is the tenth installment in the Tom Clancy's Ghost Recon franchise and is the first Ghost Recon game to feature an open world environment. The game moves away from the futuristic setting introduced in Tom Clancy's Ghost Recon Advanced Warfighter and instead features a setting similar to the original Tom Clancy's Ghost Recon. Ubisoft described it as one of the biggest open world games that they have ever published, with the game world including a wide variety of environments such as mountains, forests, deserts and salt flats. A modified version of the AnvilNext game engine was used. The game was released on March 7, 2017 for Microsoft Windows, PlayStation 4 and Xbox One. Tom Clancy's Ghost Recon Wildlands was benchmarked with high image quality settings with Temporal AA and 4x AF. V-Sync and the framerate limit were both disabled, and we used the game title's built-in benchmark.

1080P Benchmark Results: Ghost Recon Wildlands showed that the GeForce GTX 1070 and 1070 Ti models could easily perform better than the AMD Radeon RX Vega 56.
3DMark Time Spy - DX12

3DMark Time Spy recently came out and is the latest DirectX 12 benchmark test for gaming PCs running Windows 10. This DirectX 12 Feature Level 11_0 benchmark utilizes a pure DirectX 12 game engine that supports features like asynchronous compute, explicit multi-adapter, and multi-threading! The developers opted to use DirectX 12 Feature Level 11_0 to ensure wide compatibility with DirectX 11 hardware through DirectX 12 drivers. With DirectX 12 on Windows 10, game developers can significantly improve the multi-thread scaling and hardware utilization of their titles to draw more objects, textures and effects for your viewing enjoyment. 3DMark Fire Strike is a great high-end DirectX 11 benchmark test, but doesn't really show off what new graphics cards can do on a DirectX 12 game title that will have much more going on while you are playing. We ran 3DMark Time Spy with the standard settings and got the following results: In this DX12 benchmark we see the AMD Radeon RX Vega 56 at 6,713 points, which is enough to come out ahead of the Gigabyte GeForce GTX 1070's score of 6,490 points, but not the 7,187 score of the NVIDIA GeForce GTX 1070 Ti Founders Edition or the 7,208 score we got on the EVGA GeForce GTX 1070 Ti FTW2.
VRMark

VRMark is a relatively new benchmark aimed at those that might be thinking about buying an HTC Vive or an Oculus Rift and want to know what hardware will give them the best VR gaming experience. VRMark includes two VR benchmark tests that run on your monitor, no headset required, or on a connected HMD. At the end of each test, you'll see whether your PC is VR ready, and if not, how far it falls short.
Superposition Benchmark

UNIGINE introduced the Superposition Benchmark in April 2017, and it contains a benchmark based on the UNIGINE 2 engine that is pretty tough on modern graphics cards. We ran the Superposition Benchmark performance test with the 1080P High preset. The AMD Radeon RX Vega 56 scored 8,149 versus 9,135 on the Gigabyte GeForce GTX 1070 G1 Gaming and 10,204 on the NVIDIA GeForce GTX 1070 Ti Founders Edition. Here are the FPS results from the benchmark, which show the Vega 56 dips down to 50 FPS while the GeForce GTX 1070 drops to 54 FPS and the GeForce GTX 1070 Ti FE drops to 61 FPS.
Temperature & Noise Testing

The gaming performance of a graphics card is the most important factor in buying a card, but you also need to be concerned about the noise, temperature and power consumption numbers. Since GPU-Z doesn't yet read the temperatures of Vega, we didn't have a way to record or log our temperatures. Using AMD Radeon Settings isn't good for idle temperatures as it uses the GPU, so we ended up using GPU Shark to get some readings.

NVIDIA GeForce GTX 1070 Ti Founders Edition Temperatures: The NVIDIA GeForce GTX 1070 Ti 8GB Founders Edition graphics card has a blower-style fan that always runs, so we were seeing 33C at idle and 83C at load while gaming for an extended period of time. The fan was spinning at 27% (700RPM) at idle and got up to 54% (2200RPM) at load.

EVGA GeForce GTX 1070 Ti FTW2 Temperatures: The EVGA GeForce GTX 1070 Ti FTW2 has no fans running at idle, so it is passively cooled at that point and the idle temp is higher at 42C as a result. The load temperature of 72C with the two fans running at 37% (1300RPM) is solid, so you are warmer than the reference card at idle, but silent, then at load you are almost 10C cooler. The EVGA iCX cooler also has nine additional temperature sensors that the reference card doesn't have. It shows that the memory and power phase components on the card are running cooler than the GPU, so it looks great!
We test noise levels with an Extech sound level meter that has ±1.5dB accuracy that meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 36dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data.