Meet The Mighty GeForce GTX 1080 'Pascal' Video Card!

The highly anticipated NVIDIA GeForce GTX 1080 video card is finally here! Gamers have waited years for NVIDIA to move to the new 16nm FinFET manufacturing process, and we have finally moved away from the 28nm process that has been in use for nearly half a decade. The move also comes with the introduction of a brand-new architecture code-named Pascal, which replaces the Maxwell architecture NVIDIA has been shipping for the past two years, as well as new GDDR5X memory. All of these changes mean that the GeForce GTX 1080 is by far the most advanced GeForce video card ever, and according to NVIDIA it is ready for the latest DX12 game titles and pretty much everything Virtual Reality (VR) can throw at it.

NVIDIA GeForce GTX 1080 Block Diagram

The NVIDIA GeForce GTX 1080 uses the GP104 'Pascal' GPU, which has 20 Streaming Multiprocessor (SM) units enabled across four GPC clusters that process 32-thread warps. Each SM contains 128 CUDA cores, a 256KB register file and a 96KB shared memory unit. Multiply the number of SM units by 128 and you end up with a total of 2560 CUDA cores that handle the pixel, vertex and geometry shading workloads. Texture filtering is handled by 160 texture units, good for a peak rate of 277.3 Gigatexels/sec at boost clock, and the back end has 64 ROPs. The GP104 GPU has 2MB of shared L2 cache. This is very similar to the GM204 die design used by the GeForce GTX 980, but there are obviously more cores, and NVIDIA cranked the bloody heck out of the clock speeds. The 2560 CUDA cores on the GeForce GTX 1080 are clocked at 1607MHz base and 1733MHz boost. This is a huge increase in clock speeds, as we've been stuck in the 800-1100 MHz range for the past six years.
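As a quick sanity check (our own arithmetic, not something from NVIDIA), the core count and texel fill rate quoted above fall straight out of the SM configuration; the variable names below are ours:

```python
# Back-of-the-envelope check of the GP104 shader and texture figures.
# All input numbers come from the specs discussed above.

SM_COUNT = 20          # Streaming Multiprocessors enabled on GP104
CORES_PER_SM = 128     # CUDA cores per SM
TEX_UNITS = 160        # texture units
BOOST_CLOCK_MHZ = 1733 # boost clock

cuda_cores = SM_COUNT * CORES_PER_SM
# Peak texel rate = texture units * clock (one texel per unit per clock)
texel_rate_gtexels = TEX_UNITS * BOOST_CLOCK_MHZ / 1000

print(cuda_cores)                     # 2560
print(round(texel_rate_gtexels, 1))   # 277.3
```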
NVIDIA was able to get clock speeds this high by analyzing and improving each and every critical path to ensure that everything was optimized for performance and efficiency. These enhancements, along with the move to the 16nm node, allowed for the high clock speeds. The end result is that the GeForce GTX 1080 delivers 8,873 GFLOPS of compute performance versus just 4,981 GFLOPS from the GeForce GTX 980 that came out less than two years ago! NVIDIA has also implemented new hardware features like Asynchronous Compute and Fast Sync that it believes will be of growing importance as more DX12 game titles are released in the years to come. Lastly, you have 8GB of Micron GDDR5X memory on a 256-bit memory interface running at 5,005MHz, or 10,010MHz as the effective data rate. There are eight 32-bit memory controllers on the GP104 GPU. NVIDIA went with the narrower 256-bit bus to save power and lower the transistor count while still retaining a respectable 320 GB/s of memory bandwidth. GDDR5X runs at 1.35V instead of the 1.5V of GDDR5, so you also get some power savings by using GDDR5X.
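Those two headline numbers can be reproduced from the specs with the standard formulas; this is our own sketch of the arithmetic, using the figures quoted above:

```python
# Reproduce the GTX 1080's quoted FP32 throughput and memory bandwidth.
# Input numbers come from the specs above; formulas are the standard ones.

CUDA_CORES = 2560
BOOST_CLOCK_MHZ = 1733
BUS_WIDTH_BITS = 256
MEM_CLOCK_MHZ = 5005   # GDDR5X, doubled to get the effective data rate

# Each CUDA core retires one fused multiply-add (2 FLOPs) per clock at FP32.
fp32_gflops = CUDA_CORES * 2 * BOOST_CLOCK_MHZ / 1000

# Bandwidth = bus width in bytes * transfers per second.
bandwidth_gbs = (BUS_WIDTH_BITS / 8) * (2 * MEM_CLOCK_MHZ) / 1000

print(round(fp32_gflops))       # 8873
print(round(bandwidth_gbs, 1))  # 320.3
```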
| | GTX 1080 | GTX 980 Ti | GTX 980 | GTX 780 | GTX 680 | GTX 580 |
| --- | --- | --- | --- | --- | --- | --- |
| Memory Clock | 5,005MHz (GDDR5X) | 7,010MHz | 7,000MHz | 6,008MHz | 6,008MHz | 4,008MHz |
| Memory Bus Width | 256-bit | 384-bit | 256-bit | 384-bit | 256-bit | 384-bit |
| FP64 Rate | 1/32 FP32 | 1/32 FP32 | 1/32 FP32 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 |
| TFLOPS (FP32) | 8.9 Boost / 8.2 Base | 5.6 | 5 | 4 | 3 | 1.5 |
| Manufacturing Process | TSMC 16nm | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 40nm |
The GeForce GTX 1080 8GB Founders Edition

The NVIDIA GeForce GTX 1080 is a dual-slot form factor card that measures in at 10.5 inches in length. The reference card is being sold as the 'Founders Edition' model by Add-In-Board (AIB) partners and will be available for $100 more than the custom AIB cards. NVIDIA is doing this so as not to compete with its partners, while still letting gamers who want the reference design buy a card they know was engineered for long-term use. The top of the GeForce GTX 1080 still has the green LED backlit NVIDIA logo and also shows the single 8-pin power connector that is needed for proper operation of this 180W TDP card. The angled design is something new from NVIDIA for this generation, but it looks great in person. The GeForce GTX 1080 Founders Edition card has a full-coverage backplate with a large removable cutout for improving airflow in SLI setups. Those running multiple cards in SLI have had to deal with heat issues, and NVIDIA is trying its best to mitigate the heat caused by running two cards so close to one another; removing the cutout is one way to open up some space for the cooling fan! The video outputs on the GeForce GTX 1080 Founders Edition graphics card include three DisplayPort connectors, an HDMI 2.0b connector (supporting 4K@60Hz) and a single dual-link DVI output. This means that NVIDIA now offers a total of five video connections, but only four can be used simultaneously. The DisplayPort outputs are DP 1.2 certified and DP 1.3/1.4 ready, which means they support 4K screens at 120Hz or 5K at 60Hz from a single cable, and an 8K display at 60Hz if you are using two cables. When it comes to video encoding and decoding support, the GeForce GTX 1080 supports far more formats than the older GeForce GTX 980.
The NVIDIA GeForce GTX 1080 features the usual radial fan that has become standard on NVIDIA reference cards in recent years, but the company is now using a vapor chamber GPU cooler design to help keep the GP104 Pascal GPU nice and cool. This is the first time NVIDIA has used vapor chamber cooling on a sub-250W graphics card! NVIDIA says that the default GPU Boost 3.0 settings will allow the GTX 1080 to boost up to 1733MHz, and the card's target temperature is 83C by default. Once you pull the GPU cooler entirely off you can see the PCB of the GeForce GTX 1080 Founders Edition along with the GP104 GPU, GDDR5X memory ICs and the 5-phase dual-FET power design. NVIDIA also added extra capacitance to the filtering network and optimized the power delivery network on the PCB for low impedance. As a result, power efficiency increased by roughly 6% compared to the GTX 980, and peak-to-peak voltage noise was reduced from 209mV to 120mV for improved overclocking. Let's move along to the GeForce GTX 1080 test system and then get straight on to the benchmark results!
Test System

Before we look at the numbers, let's take a brief look at the test system that was used. All testing was done using a fresh install of Windows 10 Pro 64-bit, and benchmarks were completed on the desktop with no other software programs running. There has been some concern about people testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago. Video cards & drivers used for testing:
- AMD Radeon Software Crimson Edition 16.1.1
- NVIDIA GeForce 368.16 for the GTX 1080 and GeForce 362.00 for all others
Intel X79/LGA2011 Platform

The Intel X79 platform that we used to test all of the video cards was running the ASUS P9X79-E WS motherboard with BIOS 1704 that came out on 05/08/2015. We went with the Intel Core i7-4960X Ivy Bridge-E processor to power this platform as it is PCIe 3.0 certified, so all graphics cards are tested with PCI Express Gen 3 enabled. The Kingston HyperX 10th Anniversary 16GB 2400MHz quad-channel memory kit was set to XMP Profile #2. This profile defaults to 2133MHz with 1.65V and 11-13-13-30 2T memory timings. The OCZ Vertex 460 240GB SSD was run with the latest firmware available. A Corsair AX860i digital power supply provides clean power to the system and is also silent, as the fan hardly ever spins up. This is critical to our testing as it lowers the ambient noise level of the room and gives us more accurate sound measurements. Here are the exact hardware components that we are using on our test system:
| The Intel X79 Test Platform | |
| --- | --- |
| Processor | Intel Core i7-4960X |
| Motherboard | ASUS P9X79-E WS |
| Memory | 16GB Kingston 2133MHz |
| Solid-State Drive | OCZ Vertex 460 240GB |
| Cooling | Intel TS13X (Asetek) |
| Power Supply | Corsair AX860i |
| Operating System | Windows 10 64-bit |
| Monitor | Sharp PN-K321 32" 4K |
Battlefield 4

Battlefield 4 is a first-person shooter video game developed by EA Digital Illusions CE (DICE) and published by Electronic Arts. It is a sequel to 2011's Battlefield 3 and was released on October 29, 2013 in North America. Battlefield 4's single-player campaign takes place in 2020, six years after the events of its predecessor. Tensions between Russia and the United States have been running at a record high. On top of this, China is also on the brink of war, as Admiral Chang, the main antagonist, plans to overthrow China's current government; if he is successful, the Russians will have full support from the Chinese, bringing China into a war with the United States. This game title uses the Frostbite 3 game engine and looks great. We tested Battlefield 4 with the Ultra graphics quality preset, as most discrete desktop graphics cards can easily play with this IQ setting at 1080P and we still want to be able to push the higher-end cards down the road. We used FRAPS to benchmark with these settings on the Shanghai level. All tests were done with the DirectX 11 API.

Benchmark Results: The NVIDIA GeForce GTX 1080 FE was 69% faster than the GeForce GTX 980 reference card and 17% faster than a factory overclocked GeForce GTX 980 Ti! This card is a beast!

Benchmark Results: When you move up to a 2560x1440 display, the GeForce GTX 1080 FE was more than twice as fast as the GeForce GTX 980 reference card and nearly twice as fast as the AMD Radeon R9 Fury X!

Benchmark Results: When it comes to 4K Ultra HD gaming, the GeForce GTX 1080 FE averaged just under 60 FPS and proved to be 46.5% faster on average than the AMD Radeon R9 Fury X.
Fallout 4

Fallout 4 is an open-world action role-playing video game developed by Bethesda Game Studios and published by Bethesda Softworks. Fallout 4 is set in a post-apocalyptic Boston in the year 2287, 210 years after a devastating nuclear war, in which the player character emerges from an underground bunker known as a Vault. Gameplay is similar to Fallout 3. The title is the fifth major installment in the Fallout series and was released worldwide on November 10th, 2015. Fallout 4 was benchmarked with ultra image quality settings with TAA and 16x AF. These settings are tough for entry-level discrete graphics cards, but are more than playable on high-end gaming graphics cards. V-Sync can't be disabled in the game's options, so we edited the necessary INI files and disabled vsync in the driver software as well. We used FRAPS to benchmark Fallout 4 after you emerge from the vault and are in The Commonwealth.

Benchmark Results: In Fallout 4 at 1920x1080 the GeForce GTX 1080 was slightly faster than the Zotac GeForce GTX 980 Ti AMP! Extreme edition video card on the average frame rate.

Benchmark Results: With the display resolution cranked up to 2560x1440, the GeForce GTX 1080 was still in the lead by only a small margin, but remained far faster than the Radeon R9 Fury X.

Benchmark Results: When we increased the resolution to 3840x2160 for some 4K gaming goodness, the GeForce GTX 1080 FE was 37% faster than the Fury X and 62% faster than the GeForce GTX 980 reference card!
Grand Theft Auto V

Grand Theft Auto V, currently one of the hottest PC games, was finally released for the PC on April 14, 2015. Developed by Rockstar, it is set in 2013 in the city of Los Santos. It utilizes the Rockstar Advanced Game Engine (RAGE), which Rockstar has been using since 2006, with multiple updates for technology improvements. In Grand Theft Auto V we set the game to run with no MSAA, 16x AF and high image quality settings, as we didn't want the GPU to bottleneck the system too badly, but wanted a setup that your average gamer would actually play on. We used the game's built-in benchmark utility so there is at least one test that can be compared against your setup at home. We averaged all five of the default benchmark runs and omitted the maximum values, as those results are garbage for some reason.

1440P Benchmark Results: After running the game's built-in benchmark and averaging the runs together, we found that the GeForce GTX 1080 was nearly 16 FPS faster on average than any other card. The minimum frame rate was lower than most, but the average was impressive.

4K Ultra HD Benchmark Results: The GeForce GTX 1080 was nearly 8 FPS faster on average than any other card when we ran the benchmark at 4K settings.
Rise of the Tomb Raider

Rise of the Tomb Raider is a third-person action-adventure video game developed by Crystal Dynamics and published by Square Enix. It is the sequel to the 2013 video game Tomb Raider, which was itself a reboot of the series. It was released for Microsoft Windows in January 2016. Players control Lara Croft through various environments, battling enemies and completing puzzle platforming sections, while using improvised weapons and gadgets in order to progress through the story. Crystal Dynamics used a proprietary game engine called 'Foundation' for Rise of the Tomb Raider, and it is able to create some pretty nice looking graphics. We tested Rise of the Tomb Raider with the Very High preset, but then changed the ambient occlusion setting from HBAO+ (an NVIDIA-developed feature) to 'On' to have as fair a match-up as we could. We also disabled VSync. Once we had the graphics and display settings figured out, we used FRAPS to manually benchmark a section of the Siberian Wilderness, about 10% into the game, for a couple of minutes. Rise of the Tomb Raider does not have a built-in benchmark, so this is one of the only ways you can benchmark this particular game title.

1080P Benchmark Results: For those running Full HD 1080P displays, the good news is that all of the higher-end desktop graphics cards are able to play Rise of the Tomb Raider with 60 FPS averages. The new GeForce GTX 1080 never dropped below 100 FPS during the benchmark run!

1440P Benchmark Results: With the resolution cranked up to 2560x1440, the GeForce GTX 1080 averaged an impressive 90 FPS!

4K Ultra HD Benchmark Results: Rise of the Tomb Raider with these image quality settings was tough on all of the cards at 4K resolution, but the GeForce GTX 1080 led the pack with nearly 50 FPS on average.
Star Wars: Battlefront

Star Wars: Battlefront is a series of first- and third-person shooter video games based on the Star Wars films. Players take the role of soldiers in either of two opposing armies in different time periods of the Star Wars universe. This installment was developed by EA DICE, published by Electronic Arts on November 17, 2015, and is built on the Frostbite 3 engine. We ran Star Wars: Battlefront with the image quality settings set at Ultra and VSync disabled. We used FRAPS to benchmark with these settings on Battle on Tatooine.

1080P Benchmark Results: All of these high-end cards were able to play Star Wars: Battlefront at 1080P Full HD resolution, and it looks like we are becoming CPU limited on the higher-end cards!

1440P Benchmark Results: When we played Battlefront at 2560x1440, the GeForce GTX 1080 still averaged 134 FPS and had more than enough power to play this game title with ease.

4K Ultra HD Benchmark Results: When moving up to 3840x2160, the GeForce GTX 1080 averaged 72 FPS and the minimum never dropped below 60 FPS!
3DMark 2013

3DMark Fire Strike is the suite's test for high-performance gaming PCs. Use Fire Strike to test the performance of dedicated gaming PCs, or use the Fire Strike Extreme preset for high-end systems with multiple GPUs. Fire Strike uses a multi-threaded DirectX 11 engine to test DirectX 11 hardware.
Fire Strike Benchmark Results:
Benchmark Results: The 3DMark Fire Strike benchmark had the NVIDIA GeForce GTX 1080 scoring 17,114 points versus 13,617 points for the AMD Radeon R9 Fury X and 11,426 points for the GeForce GTX 980 reference card.

Fire Strike Extreme Benchmark Results:
Benchmark Results: In 3DMark Fire Strike Extreme we see different scores, but basically the same scaling and performance results. The NVIDIA GeForce GTX 1080 FE scored 9,387 points and is the first retail card to break the 9,000-point mark in our testing!
Memory Bandwidth Testing

Testing memory bandwidth on video cards is tricky, but we've been dabbling with it over the past year. We built a new Intel Z170 system just for memory bandwidth testing, since it is best to test on a system whose CPU has integrated graphics so that the discrete graphics card can be run in headless mode. We also disabled Windows Aero and set the system to high performance mode for both the display settings and power settings. Nai's GPU memory bandwidth test only works on NVIDIA CUDA cards, so we tested a handful of NVIDIA GeForce GTX reference cards from the Fermi, Kepler, Maxwell and now the Pascal series to see how they perform. We weren't expecting to reach the theoretical peak bandwidth figures, but all of the cards came in reasonably close. The NVIDIA GeForce GTX 1080 is rated as having 320.3 GB/s of memory bandwidth, and we averaged 298.8 GB/s with a top speed of 299.16 GB/s across the 8GB GDDR5X frame buffer. This is the highest bandwidth test result that we have seen from any NVIDIA reference card. We've seen many gripe about the 256-bit bus used with the GDDR5X memory, but the memory bandwidth scores are actually up slightly. No performance drop-off was noted in the benchmark, and we ran the VRAM bandwidth testing utility at block sizes of 16, 32, 64, 128, 256, 512 and 1024 on each card just to be sure. It should be noted that this application only tests the free space of the memory and isn't really a great benchmark, but it's just another data point for us to look at, and it also gives us a chance to look for any anomalies.
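To put the measured number in perspective, here is our own quick calculation (not part of the original test) of how much of the rated bandwidth the card actually achieved:

```python
# Compare the measured bandwidth against NVIDIA's rated figure.
RATED_GBS = 320.3      # NVIDIA's spec for the GTX 1080
MEASURED_GBS = 298.8   # our average from Nai's CUDA bandwidth test

efficiency = MEASURED_GBS / RATED_GBS * 100
print(f"{efficiency:.1f}% of theoretical peak")  # 93.3% of theoretical peak
```

Hitting roughly 93% of theoretical peak is in line with what this utility showed on the older GDDR5 cards we tested, which is why we say the 256-bit GDDR5X bus isn't holding the card back in practice.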
Temperature & Noise Testing

The gaming performance of a graphics card is the most important factor in buying a card, but you also need to be concerned about the noise, temperature and power consumption numbers.

NVIDIA GeForce GTX 1080 Idle and Load Temps: When it comes to temperatures, the GeForce GTX 1080 8GB GDDR5X Founders Edition graphics card was 34C at idle and 83C at load while gaming on our open-air test bench. This is not a 0dB fan design, so note that the fan was spinning at 1100 RPM at idle and 2200 RPM at load. Here is a chart that shows the temperatures of the GeForce GTX 1080 FE versus some other high-end desktop cards.
We test noise levels with an Extech sound level meter that has ±1.5dB accuracy and meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 36dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data. The NVIDIA GeForce GTX 1080 isn't the quietest card we've ever tested, but it isn't the loudest either. We have no complaints about the sound level or coil whine issues with our sample card.

** The AMD Radeon R9 Fury X reference card that we are using was the original model with a loud water pump that whines. AMD changed the pump design before the cards hit the retail market, but wasn't willing to replace ours. We expect retail cards to be quieter, and hopefully AMD will send us a replacement card for proper noise testing. **