The GeForce GTX Titan X is The Baddest GPU of The Land

We’ve talked so much about the NVIDIA GeForce GTX Titan X video card lately that it feels odd to write an introduction for its official reveal, so we’ll just jump right into it. The NVIDIA GeForce GTX Titan X uses the ‘big’ GM200 ‘Maxwell’ GPU, as many expected, and has 3072 CUDA cores clocked at 1000MHz base and 1075MHz boost. The incredible 12GB of GDDR5 memory used for the frame buffer runs on a 384-bit memory bus and is clocked at 7010MHz. The end result is what NVIDIA is calling the world’s fastest GPU, and with specifications like that no other single-GPU card on the market will be able to touch this behemoth. Sure, 12288MB of GDDR5 memory is overkill even in this day and age of 4K Ultra HD gaming, but for a flagship card who cares, right? The GeForce GTX Titan X was designed for gamers who want the best card NVIDIA has to offer, regardless of price, so they can game with the image quality settings cranked up no matter the resolution. NVIDIA calls these users ‘ultra-enthusiast gamers’ and knows that they are not your typical price-conscious consumers. The NVIDIA GeForce GTX Titan X is priced at $999 and is being hard launched today, so if you are reading this on launch morning, online retailers like Amazon, Newegg, TigerDirect, NCIX and others should be ready to take your money and ship you out a GeForce GTX Titan X. The NVIDIA GeForce GTX Titan X takes over as the flagship card from the GeForce GTX 980 that was introduced in September 2014. Over the past six months the GeForce GTX 980 has dominated the market by simply overpowering the AMD Radeon R9 290X while offering an impressive feature set.
NVIDIA has done a good job of promoting the new features that launched alongside Maxwell, like Voxel Global Illumination (VXGI), Multi-Frame Sampled AA (MFAA), Dynamic Super Resolution (DSR) and DirectX 12 API support with feature level 12.1. The NVIDIA GeForce GTX Titan X supports everything that the GeForce GTX 980 does, with more power to handle emerging Virtual Reality (VR) applications. Consumer VR headsets will be coming to market later this year and you’ll need a GPU that can handle whatever is coming. The GeForce GTX Titan X is powered by the full GM200 ‘Maxwell’ GPU, which has six Graphics Processing Clusters and a total of 24 Streaming Multiprocessor (SMM) units. Each SMM contains 128 CUDA cores, which is how you end up with an impressive 3072 CUDA cores to handle the pixel, vertex and geometry shading workloads. The Maxwell GM200 uses the same basic SMM design as the other Maxwell GPUs that are already out. The 3072 CUDA cores in the Titan X's GM200 GPU are clocked at 1000MHz base/1075MHz boost. Texture filtering is done by 192 texture units that deliver a texture filtering rate of 192 Gigatexels/sec. This means that the GeForce GTX Titan X can do texture filtering 33% faster than the GeForce GTX 980! NVIDIA has been increasing the L2 cache size in recent years and did so again on the GM200, which has 3MB of L2 cache. The NVIDIA GK104 had just 512KB and the GM204 used on the GeForce GTX 980 has 2MB. The display/video engines remain unchanged from the GM204 GPU used on the GeForce GTX 980 video cards. NVIDIA wanted to create a graphics card designed for 4K gaming and basically max out the performance. NVIDIA opted to go with 12GB of GDDR5 memory that runs at 7010MHz on six 64-bit memory controllers (384-bit bus) to ensure that no one will run out of frame buffer when playing the latest game titles.
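As a quick sanity check on the texture filtering math, here is a rough sketch. The GTX 980 figures (128 texture units at an 1126MHz base clock) come from NVIDIA's published GM204 specifications, not from this article:

```python
# Peak bilinear texture filtering rate = texture units x core clock.
def fill_rate_gtexels(texture_units, clock_mhz):
    """Peak texture filtering rate in Gigatexels/sec."""
    return texture_units * clock_mhz / 1000.0

titan_x = fill_rate_gtexels(192, 1000)   # 192 units at 1000MHz base
gtx_980 = fill_rate_gtexels(128, 1126)   # assumed GTX 980 reference specs

print(f"Titan X: {titan_x:.1f} GT/s")                       # 192.0 GT/s
print(f"GTX 980: {gtx_980:.1f} GT/s")                       # ~144.1 GT/s
print(f"Advantage: {(titan_x / gtx_980 - 1) * 100:.0f}%")   # ~33%
```

The 33% figure quoted above falls out of the clock difference: the Titan X has 50% more texture units, but the GTX 980 runs at a higher base clock.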
This is good for 336.5GB/s of peak memory bandwidth, which is 50% more bandwidth than a GeForce GTX 980 has. You can spend all day playing games at 4K with ultra image quality settings only to find that no title on the market today will use up 12GB of memory! Not to mention that even if you could find a game title able to use close to 12GB of memory, the frame rates wouldn’t be high enough for the game to actually be playable! NVIDIA completely went overboard with the memory, but you won’t hear many complaining.
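The bandwidth numbers above follow directly from the memory specs in the table below; a quick sketch of the arithmetic:

```python
# Peak memory bandwidth = effective memory data rate x bytes moved per transfer.
def bandwidth_gbps(effective_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_mhz * (bus_width_bits / 8) / 1000.0

titan_x = bandwidth_gbps(7010, 384)   # 7010MHz effective on a 384-bit bus
gtx_980 = bandwidth_gbps(7000, 256)   # 7000MHz effective on a 256-bit bus

print(f"Titan X: {titan_x:.1f} GB/s")                       # ~336.5 GB/s
print(f"GTX 980: {gtx_980:.1f} GB/s")                       # 224.0 GB/s
print(f"Advantage: {(titan_x / gtx_980 - 1) * 100:.0f}%")   # ~50%
```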
| | Titan X | GTX 980 | GTX 780 | GTX 680 | GTX 580 |
|---|---|---|---|---|---|
| GDDR5 Memory Clock | 7,010MHz | 7,000MHz | 6,008MHz | 6,008MHz | 4,008MHz |
| Memory Bus Width | 384-bit | 256-bit | 384-bit | 256-bit | 384-bit |
| FP64 | 1/32 FP32 | 1/32 FP32 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 40nm |
GeForce GTX Titan X 12GB Video Card

Our NVIDIA GeForce GTX Titan X video card showed up in the same black box that we first ran across when the Titan X was unveiled at GDC 2015. Unfortunately our sample was not signed with a personal message from NVIDIA President and CEO Jen-Hsun Huang! The product packaging was designed to lift off from the top and reveal the mighty GeForce GTX Titan X graphics card. The end result was basically a display case showing off the GTX Titan X in an upright position. The blower-style GPU cooler used on the GeForce GTX Titan X remains largely unchanged since first being used on the original GeForce GTX Titan in February 2013. NVIDIA isn’t getting complacent when it comes to GPU cooler designs; rather, they came up with a good cooler and the TDP of their high-end discrete desktop cards has not really gone up. If you think about it, the GeForce GTX 780, 780 Ti, and Titan were all 250W TDP cards, so why change the cooler if the temperatures and noise levels are good to go? The one thing we really like about the GeForce GTX Titan X is that NVIDIA blacked it out more than any other GPU in recent memory! The Titan X has a magnesium alloy fan housing with an aluminum frame that was trivalent chromium plated to look as good as possible. The aluminum housing has been painted black to give it an aggressive, or stealthy, look depending on what camp you are in. The GeForce logo on top of the card is still LED backlit and glows NVIDIA green. The NVIDIA GeForce GTX Titan X reference card measures 10.5 inches in length and takes up two PCI slots. At just 10.5” long it should easily fit inside your current gaming system or any new chassis that you are looking to purchase for your next gaming rig. The blower-style fan on the Titan X brings air in from the end of the card and the opening of the blower fan itself, then exhausts that air out the back of the graphics card and outside of the PC chassis.
When looking down the aluminum cooling fins you can see that the air is blown right across the fins and out of the system. Under the fins you’ll find a copper vapor chamber that sits predominately on the GM200 GPU to effectively transfer heat away from the GPU and to the cooling fins. The video outputs on the GeForce GTX Titan X reference card include three DisplayPort connectors, an HDMI 2.0 connector (supporting 4K@60Hz) and a single dual-link DVI output. This means that NVIDIA now offers a total of five video connections, but only four can be used simultaneously. This new video output arrangement means that you can run three NVIDIA G-Sync enabled displays off of one GeForce GTX Titan X video card if one desires to do so. If you want to run a multi-panel setup and don't want to sacrifice any image quality, you'll likely still need to run a 2, 3 or 4-way SLI multi-GPU setup to get the performance needed to power the resolution demanded by such a display setup. NVIDIA also changed up the way the exhaust ports are shaped on the I/O bracket to increase airflow and reduce noise. The ability to support HDMI 2.0 is a pretty big deal, and NVIDIA has the world's first GPUs able to support it. Previous generation GPUs supported HDMI 1.4 and could only officially support 4K displays at 30Hz for '444' RGB pixels and 60Hz for '420' YUV pixels. The GeForce GTX 970/980/Titan X support full-resolution '444' RGB pixels at 60Hz on 4K displays. All GM2xx Maxwell GPUs also ship with an enhanced NVENC encoder that adds support for H.265 encoding. NVIDIA claims that Maxwell's video encoder improves H.264 video encode throughput by 2.5x over Kepler and that it can encode 4K video at 60 FPS. The maximum resolution supported by Maxwell is 5120x3200, so get ready for displays that go way beyond Ultra HD in the years to come! The NVIDIA GeForce GTX Titan X has a Thermal Design Power (TDP) rating of 250 Watts and requires one 6-pin and one 8-pin PCIe power connector for proper operation.
NVIDIA recommends a 600W or larger power supply for a system running one GeForce GTX Titan X video card. There is a physical location for a third PCIe power connector (8-pin) on the PCB at the very end of the card, but it was deemed not needed for the GM200 and the solder points have gone unused. The NVIDIA GeForce GTX Titan X reference card does not come with a backplate. At first we were disappointed that NVIDIA did not include one, but they assured us that one was left off to maximize airflow for those wanting to run SLI. It would have been nice for NVIDIA to have included a fully removable backplate though, as most gamers only run one card and like the looks and protection that a simple backplate can add to a graphics card. The NVIDIA GeForce GTX Titan X uses two sets of aluminum heatsinks with three embedded heatpipes that help keep the Maxwell GM200 GPU nice and cool. NVIDIA says that the default GPU Boost 2.0 settings will allow the GTX Titan X to boost up to its highest clock frequency and remain there as long as the GPU temperature remains at or below 80C. Once you pull the GPU cooler entirely off you can see the PCB of the GeForce GTX Titan X reference card along with the GM200 GPU, GDDR5 memory ICs and the 6+2 power phase design. NVIDIA went with a 6-phase VR circuit with integrated dynamic power balancing circuitry for the Titan X’s GM200 GPU, and there are two additional power phases for the board’s whopping 12GB of GDDR5 memory. NVIDIA is also using polarized capacitors (POSCAPs) to minimize unwanted board noises as well as molded inductors for the very first time on a reference board. NVIDIA says that the 6+2 phase power supply setup has the ability to supply the GPU with 275W of power at the maximum power target setting of 110% if one would like to overclock the card. NVIDIA claims they were able to get the 3072 CUDA cores on the GM200 GPU up to 1400MHz when overclocked with the stock air cooler.
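The 275W figure is just the 250W TDP with the 110% power target applied, as this quick sketch shows:

```python
# Where the 275W maximum board power figure comes from:
# the card's 250W TDP scaled by the 110% maximum power target slider
# exposed in overclocking tools like EVGA Precision X.
tdp_watts = 250
max_power_target = 1.10

max_board_power = tdp_watts * max_power_target
print(f"Max board power: {max_board_power:.0f}W")  # 275W
```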
Test System

Before we look at the numbers, let's take a brief look at the test system that was used. All testing was done using a fresh install of Windows 8.1 Pro 64-bit and benchmarks were completed on the desktop with no other software programs running. It should be noted that we average all of our test runs. There has been some concern about people testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago. Video cards & drivers used for testing:
- NVIDIA GeForce 347.84 For All Maxwell/Kepler Cards
- AMD CATALYST 15.3.1 Beta
Intel X79/LGA2011 Platform
| The Intel X79 Test Platform | |
|---|---|
| Processor | Intel Core i7-4960X |
| Motherboard | ASUS P9X79-E WS |
| Memory | 16GB Kingston 2133MHz |
| Solid-State Drive | OCZ Vertex 460 240GB |
| Cooling | Intel TS13X (Asetek) |
| Power Supply | Corsair AX860i |
| Operating System | Windows 8.1 Pro 64-bit |
| Monitor | Sharp PN-K321 32" 4K |
NVIDIA GeForce GTX Titan X Overclocking

The NVIDIA GeForce GTX Titan X is a monster at its stock speeds, but if you are comfortable using an overclocking utility like EVGA Precision X you can take this already powerful card and get even more performance from the GM200 'Maxwell' GPU. This is because NVIDIA left plenty of overhead in this particular GPU for enthusiasts and gamers to tap into if they want to push the envelope a little bit. The NVIDIA GeForce GTX Titan X is power limited in most game titles, so it will benefit from additional power. NVIDIA said that every GeForce GTX Titan X has the ability to run with the power target increased up to 110% for better performance. We installed the latest version of the EVGA PrecisionX 16 overclocking utility to overclock the NVIDIA GeForce GTX Titan X video card! You can use whatever software utility you like for overclocking, but this is our personal favorite and the one we've used the most. The NVIDIA GeForce GTX Titan X is pretty open when it comes to overclocking. You can increase the power target to 110%, and if you leave the GPU temp target linked it will automatically increase to 91C. We ended up with a GPU clock offset of +250MHz and a memory clock offset of +500MHz before we started to encounter some stability issues due to the memory clock frequency, and the card was pretty stable in most game titles at those settings. This overclock meant that we were running at up to 1414MHz on the core, thanks to NVIDIA GPU Boost 2.0, and 2000MHz (8000MHz effective) on the 12GB of GDDR5 memory. NVIDIA GeForce GTX Titan X Stock: NVIDIA GeForce GTX Titan X Overclocked (+250/+500): By overclocking the NVIDIA GeForce GTX Titan X 12GB reference card we were able to take the score of 7549 and raise it up to 8773. This is a 1224 point increase in our overall 3DMark score, which represents a performance gain of 16.2 percent.
The overall FPS average in Graphics Test 1 went from 42.30 to 49.85, which is a 17.8% performance gain in the graphics test. The only problem is that our particular GeForce GTX Titan X didn't like running at 1252MHz base/1326MHz boost and the system would lock up randomly in some game titles. We backed the GPU core offset down from +250MHz to +200MHz and found the Titan X was rock solid and would run for hours and hours without any crashes or lockups. At +200MHz on the core offset our Titan X was running at 1364MHz in game titles. We benchmarked the GeForce GTX Titan X overclocked to +200MHz core and +500MHz memory for this review so you can see how the card performs both in stock trim and when pushed with what we'd consider a happy 24/7 overclock that you'd actually set and leave on your card.
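The percentage gains quoted above can be re-derived from the raw numbers; a quick sketch:

```python
# Overclocking gains from the 3DMark results quoted in this review.
stock_score, oc_score = 7549, 8773     # overall 3DMark Fire Strike Extreme scores
stock_fps, oc_fps = 42.30, 49.85       # Graphics Test 1 average FPS

score_gain = (oc_score - stock_score) / stock_score * 100
fps_gain = (oc_fps - stock_fps) / stock_fps * 100

print(f"3DMark gain: +{oc_score - stock_score} points ({score_gain:.1f}%)")  # +1224 (16.2%)
print(f"Graphics Test 1 gain: {fps_gain:.1f}%")                              # 17.8%
```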
Batman: Arkham Origins

Batman: Arkham Origins is an action-adventure video game developed by Warner Bros. Games Montréal. Based on the DC Comics superhero Batman, it follows the 2011 video game Batman: Arkham City and is the third main installment in the Batman: Arkham series. It was released worldwide on October 25, 2013. For testing we used DirectX 11 Enhanced, FXAA High anti-aliasing and all the bells and whistles turned on. It should be noted that V-Sync was turned off and that NVIDIA's PhysX software engine was also disabled to ensure both the AMD and NVIDIA graphics cards were rendering the same objects. We manually ran FRAPS in the single player game instead of using the built-in benchmark to be as real world as we possibly could. We ran FRAPS in the Batcave, which was one of the only locations where we could easily run FRAPS for a couple of minutes and get it somewhat repeatable. The CPU usage for Batman: Arkham Origins was surprisingly low, with just 10% of the Intel Core i7-4960X being used by this particular game title. You can see that the bulk of the work is being done by one CPU core. Benchmark Results: The NVIDIA GeForce GTX Titan X 12GB video card averaged 70.49 FPS in Batman: Arkham Origins at stock clock speeds. With the Titan X video card overclocked (+200MHz on the CUDA cores and +500MHz on the GDDR5 memory) we were able to average 81.70 FPS, which was actually faster than a dead stock AMD Radeon R9 295X2 8GB video card. The NVIDIA GeForce GTX 980 reference card came in at 52.91 FPS, so there is a pretty big gap between the GTX 980 and GTX Titan X! Benchmark Results: When you look at performance over time, the GeForce GTX Titan X 12GB graphics card never dropped below 60 FPS on our Ultra HD (3840x2160) test setup!
Battlefield 4

Battlefield 4 is a first-person shooter video game developed by EA Digital Illusions CE (DICE) and published by Electronic Arts. It is a sequel to 2011's Battlefield 3 and was released on October 29, 2013 in North America. Battlefield 4's single-player campaign takes place in 2020, six years after the events of its predecessor. Tensions between Russia and the United States have been running at a record high. On top of this, China is also on the brink of war, as Admiral Chang, the main antagonist, plans to overthrow China's current government; and, if successful, the Russians will have full support from the Chinese, bringing China into a war with the United States. This game title uses the Frostbite 3 game engine and looks great. We tested Battlefield 4 with the Ultra graphics quality preset as most discrete desktop graphics cards can easily play with this IQ setting at 1080P and we still want to be able to push the higher-end cards down the road. We used FRAPS to benchmark each card with these settings on the Shanghai level. Battlefield 4 is more CPU intensive than any other game that we benchmark with, as 25% of the CPU is used up during gameplay. You can see that six threads are being used and that the processor is running in Turbo mode at 3.96GHz more often than not. Benchmark Results: In Battlefield 4 with Ultra settings at 3840x2160 we were able to average 33.28 FPS on the GeForce GTX 980 reference card versus 44.18 FPS on the GeForce GTX Titan X. The AMD Radeon R9 295X2 averaged 54.24 FPS, so it is about 10 FPS faster on average in BF4 with these settings. By overclocking the NVIDIA GeForce GTX Titan X we were able to improve the average performance up to 52 FPS, but that is still slower than a stock Radeon R9 295X2. Benchmark Results: The GeForce GTX Titan X ran BF4 pretty smoothly with these settings and we seldom dropped below 40 FPS at 3840x2160.
The AMD Radeon R9 295X2 runs AMD CrossFire, and you can see this in the chart above when looking at performance over time, as its frame rate was a bit more sporadic.
Crysis 3

Like the others, Crysis 3 is a first-person shooter developed by Crytek using their CryEngine 3. Released in February 2013, it is well known for making even powerful systems choke. It has probably the highest graphics requirements of any game available today. Unfortunately, Crytek didn’t include a standardized benchmark with Crysis 3. While the enemies will move about on their own, we attempt to keep the same testing process for each test. Crysis 3 has a reputation for being highly resource intensive. Most graphics cards will have problems running Crysis 3 at maximum settings, so we settled on no AA with the graphics quality mostly set to Very High with 16x AF. We disabled V-Sync and left the motion blur amount on medium. Crysis 3 appeared to run for the most part on just three CPU threads and used up about 15-18% of our Intel Core i7-4960X processor with these settings. Notice that the processor speed was at 3.53GHz and we very seldom, if ever, saw the processor go into Turbo mode in Crysis 3. Benchmark Results: In Crysis 3 at 3840x2160 the NVIDIA GeForce GTX 980 reference card averaged 21.07 FPS and we were running at 27.83 FPS on the GeForce GTX Titan X. Overclocking the GeForce GTX Titan X got our performance up to 33.48 FPS, which, as great as it looks, is still a couple FPS behind the Radeon R9 295X2. Benchmark Results: It is extremely tough to get identical FRAPS runs in Crysis 3, but you can see that the GeForce GTX Titan X stays above 25 FPS for most of the benchmark run and had less variance than the Radeon R9 295X2.
Far Cry 4

Far Cry 4 is an action-adventure first-person shooter video game developed by Ubisoft Montreal and published by Ubisoft for the PlayStation 3, PlayStation 4, Xbox 360, Xbox One video game consoles, and Microsoft Windows. It is the sequel to 2012's Far Cry 3. The game was released on November 18th, 2014 in North America and Europe. Far Cry 4 follows Ajay Ghale, a young Kyrati-American who returns to his native country Kyrat to spread his deceased mother's ashes. He finds the country in a state of civil war between Kyrat's Royal Army, led by the country's eccentric and tyrannical king Pagan Min, and the Golden Path, a rebel movement fighting to free Kyrat from Min's oppressive rule. Far Cry 4 uses the heavily modified Dunia Engine 2 with Havok physics. The graphics are excellent and the game really pushes the limits of what one can expect from mainstream graphics cards. We set the game to Ultra image quality settings and did not adjust any of the advanced settings. Far Cry 4 uses about 30% of the processor and runs on multiple cores, as you can see from our screen capture above. One core has more of a load on it than the others, but all logical processors are being used to some degree when playing Far Cry 4.
Benchmark Results: In Far Cry 4 we found the new NVIDIA GeForce GTX Titan X video card put up a respectable score of 43.71 FPS, which makes the 32.36 FPS on the NVIDIA GeForce GTX 980 reference card look underwhelming. When overclocked, the GeForce GTX Titan X was able to average 51.07 FPS in Far Cry 4 and our gaming experience was excellent on this awesome looking game title. The AMD Radeon R9 295X2 had the highest average FPS at 59.20, but it was choppy at various times when we were playing the game.
Benchmark Results: The AMD Radeon R9 295X2 might have won the average FPS battle, but the gameplay wasn't smooth. When we charted the FRAPS log and looked at performance over time we found that the Radeon R9 295X2 dropped below 30 FPS once and down to nearly 30 FPS several times. The NVIDIA GeForce GTX Titan X never dropped below 37 FPS, so there was nearly a 10 FPS difference in the minimum frame rate during our benchmarking session.
Metro: Last Light

Metro: Last Light is a first-person shooter video game developed by Ukrainian studio 4A Games and published by Deep Silver. The game is set in a post-apocalyptic world and features action-oriented gameplay with a combination of survival horror elements. It uses the 4A Engine and was released in May 2013. Metro: Last Light was benchmarked with very high image quality settings with SSAA off and 4x AF. These settings are tough for entry level discrete graphics cards, but are more than playable on high-end gaming graphics cards. We benchmarked this game title on the Theater level.
We again found around 20% CPU usage in Metro: Last Light. Benchmark Results: In Metro: Last Light the GeForce GTX 980 averaged 39.41 FPS and the GeForce GTX Titan X came in at 48.34 FPS in stock form. The AMD Radeon R9 295X2 came in at 64.82 FPS, and not even the impressive 56.67 FPS average on the overclocked Titan X could touch that. Benchmark Results: The AMD Radeon R9 295X2 had a pretty large 30 FPS dip in our benchmark run, but for the most part nothing out of the ordinary was seen here! The NVIDIA GeForce GTX Titan X was able to run Metro: Last Light above 40 FPS for the entire benchmark run and we played through part of the game without any issues on our 4K setup.
Thief

Thief is a series of stealth video games in which the player takes the role of Garrett, a master thief in a fantasy/steampunk world resembling a cross between the Late Middle Ages and the Victorian era, with more advanced technologies interspersed. Thief is the fourth title in the Thief series, developed by Eidos Montreal and published by Square Enix on February 25, 2014. We ran Thief with the image quality settings set at normal with VSYNC disabled. Thief appears to run on the six physical cores of the Intel Core i7-4960X processor and averages around 17-24% CPU usage from what we were able to tell from the CPU utilization meter built into the Windows 8.1 task manager. Benchmark Results: In Thief we found the NVIDIA GeForce GTX 980 reference card had an average benchmark run of 50.64 FPS and the GeForce GTX Titan X averaged 62.79 FPS. The AMD Radeon R9 295X2 scored 72.43 FPS in stock trim, which the overclocked Titan X was able to surpass with a score of 74.18 FPS. Benchmark Results: The performance over time chart showed no anomalies! The NVIDIA GeForce GTX Titan X managed to stay above 50 FPS during the benchmark run.
3DMark 2013

3DMark Fire Strike Benchmark Results - For high performance gaming PCs. Use Fire Strike to test the performance of dedicated gaming PCs, or use the Fire Strike Extreme preset for high-end systems with multiple GPUs. Fire Strike uses a multi-threaded DirectX 11 engine to test DirectX 11 hardware.
Fire Strike Extreme Benchmark Results:
Benchmark Results: The 3DMark Fire Strike Extreme benchmark had the NVIDIA GeForce GTX 980 video card coming in with an overall score of 5,853 and the GeForce GTX Titan X easily bested that with a score of 7,549. The AMD Radeon R9 295X2 led the benchmark chart though with an impressive overall score of 8,852.

Fire Strike Ultra 4K Benchmark Results:
Benchmark Results: The 3DMark Fire Strike Ultra '4K' benchmark had the GeForce GTX Titan X coming in at 4,000 and the Radeon R9 295X2 was at 4,899.
Temperature & Noise Testing

Temperatures are important to enthusiasts and gamers, so we took a bit of time and did some temperature testing on the NVIDIA GeForce GTX Titan X video card. NVIDIA GeForce GTX Titan X Idle Temps: At idle we found the GPU core temperature was 32C with the single fan on the NVIDIA reference cooler running at 22%, or 1050 RPM. When gaming we hit 85C, which is slightly over the 83C default GPU temp target for the GeForce GTX Titan X. Our room temperature was 70F (21C), so these are respectable numbers for a flagship GPU with 3072 CUDA cores! When gaming our card boosted up to a maximum speed of 1164.4MHz on the CUDA cores.
We test noise levels with an Extech sound level meter that has ±1.5dB accuracy and meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 36dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data. The NVIDIA GeForce GTX Titan X is a fairly quiet card at both idle and when heated to full operating temperature.