The GeForce GTX Titan X is the Baddest GPU in the Land

We’ve talked so much about the NVIDIA GeForce GTX Titan X video card lately that it feels odd to write an introduction for its official reveal, so we’ll just jump right into it. The NVIDIA GeForce GTX Titan X uses the ‘big’ GM200 ‘Maxwell’ GPU like many thought, and has 3072 CUDA cores clocked at 1000MHz base and 1075MHz boost. The incredible 12GB of GDDR5 memory being used for the frame buffer runs on a 384-bit memory bus and is clocked at 7010MHz. The end result is what NVIDIA is calling the world’s fastest GPU, and with specifications like that no other single-GPU card on the market will be able to touch this behemoth. Sure, 12288MB of GDDR5 memory is overkill even in this day and age of 4K Ultra HD gaming, but for a flagship card who cares, right? The GeForce GTX Titan X was designed for gamers that desire the best card NVIDIA has to offer regardless of the price, so they can game with the image quality settings cranked up no matter the resolution. NVIDIA calls these users ‘ultra-enthusiast gamers’ and knows that they are not your typical price-conscious consumers. The NVIDIA GeForce GTX Titan X is priced at $999 and the card is being hard launched today, so if you are reading this on launch morning, online retailers like Amazon, Newegg, TigerDirect, NCIX and others should be ready to take your money and ship you a GeForce GTX Titan X.

GeForce GTX Titan and Titan X Video Cards

The NVIDIA GeForce GTX Titan X takes over as the flagship card from the GeForce GTX 980 that was introduced in September 2014. Over the past six months the GeForce GTX 980 has dominated the market by simply flat-out overpowering the AMD Radeon R9 290X while offering an impressive feature set. NVIDIA has done a good job of promoting the new features that launched alongside Maxwell, like Voxel Global Illumination (VXGI), Multi-Frame Sampled AA (MFAA), Dynamic Super Resolution (DSR) and the DirectX 12 API with feature level 12.1 support. The NVIDIA GeForce GTX Titan X supports everything that the GeForce GTX 980 does, with more power to handle emerging Virtual Reality (VR) applications. Consumer VR headsets will be coming to market later this year and you’ll need a GPU that can handle whatever is coming.

NVIDIA GeForce GTX Titan X GM200 Block Diagram

The GeForce GTX Titan X is powered by the full GM200 ‘Maxwell’ GPU, which has six Graphics Processing Clusters (GPCs) and a total of 24 Maxwell Streaming Multiprocessor (SMM) units. Each SMM contains 128 CUDA cores, and that is how you end up with an impressive 3072 CUDA cores to handle the pixel, vertex and geometry shading workloads. The GM200 uses the same basic SMM design as the other Maxwell GPUs that are already out, and its 3072 CUDA cores are clocked at 1000MHz base and 1075MHz boost. Texture filtering is handled by 192 texture units that have a texture filtering rate of 192 Gigatexels/sec. This means that the GeForce GTX Titan X has the ability to do texture filtering 33% faster than the GeForce GTX 980! NVIDIA has been increasing the L2 cache size in recent years and did so again on the GM200, which has 3MB of L2 cache. The Kepler GK104 had just 512KB and the GM204 used on the GeForce GTX 980 has 2MB. The display/video engines remain unchanged from the GM204 GPU used on the GeForce GTX 980 video cards. NVIDIA wanted to create a graphics card designed for 4K gaming and basically maxed out the performance to get there.
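For those who like to check the math, the headline shader and texture numbers all fall straight out of the SMM arithmetic above. Here is a quick back-of-the-envelope sketch in Python (our own sanity check, not anything from NVIDIA):

```python
# GM200 shader/texture math, derived from the block diagram figures above.
gpcs = 6                # Graphics Processing Clusters
smm_per_gpc = 4         # 24 SMMs total across 6 GPCs
cores_per_smm = 128     # CUDA cores per Maxwell SMM
tmus_per_smm = 8        # texture units per Maxwell SMM
base_clock_ghz = 1.000  # 1000MHz base clock

smms = gpcs * smm_per_gpc                    # 24 SMMs
cuda_cores = smms * cores_per_smm            # 3072 CUDA cores
texture_units = smms * tmus_per_smm          # 192 texture units
texel_rate = texture_units * base_clock_ghz  # 192 GTexels/sec

# GTX 980 for comparison: 128 texture units at an 1126MHz base clock.
gtx980_texel_rate = 128 * 1.126              # ~144 GTexels/sec
print(cuda_cores, texture_units, texel_rate)
print(f"Titan X texel rate advantage: {texel_rate / gtx980_texel_rate - 1:.0%}")  # ~33%
```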
NVIDIA opted to go with 12GB of GDDR5 memory running at 7010MHz on six 64-bit memory controllers (a 384-bit bus) to ensure that no one runs out of frame buffer when playing the latest game titles. This is good for 336.5GB/s of peak memory bandwidth, which is 50% more bandwidth than a GeForce GTX 980 has. You can spend all day playing games at 4K with ultra image quality settings only to find that no title on the market today will use up 12GB of memory! And even if you could find a game title able to use close to 12GB, the frame rates wouldn’t be high enough for the game to actually be playable. NVIDIA completely went overboard with the memory, but you won’t hear many complaining.
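The bandwidth figure is simply bus width times the effective data rate; a quick one-liner (again, our own arithmetic) confirms it:

```python
# Peak memory bandwidth: bytes per transfer x effective transfers per second.
bus_width_bits = 384         # six 64-bit memory controllers
effective_rate_hz = 7.010e9  # 7010MHz effective GDDR5 data rate
bandwidth_gbs = (bus_width_bits / 8) * effective_rate_hz / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 336.5 GB/s
print(f"{bandwidth_gbs / 224 - 1:.0%} more than the GTX 980's 224 GB/s")  # ~50%
```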
| | Titan X | GTX 980 | GTX 780 | GTX 680 | GTX 580 |
|---|---|---|---|---|---|
| Microarchitecture | Maxwell | Maxwell | Kepler Refresh | Kepler | Fermi |
| Stream Processors | 3072 | 2048 | 2304 | 1536 | 512 |
| Texture Units | 192 | 128 | 192 | 128 | 64 |
| ROPs | 96 | 64 | 48 | 32 | 48 |
| Core Clock | 1000MHz | 1126MHz | 863MHz | 1006MHz | 772MHz |
| Shader Clock | N/A | N/A | N/A | N/A | 1544MHz |
| Boost Clock | 1075MHz | 1216MHz | 900MHz | 1058MHz | N/A |
| GDDR5 Memory Clock (effective) | 7010MHz | 7000MHz | 6008MHz | 6008MHz | 4008MHz |
| Memory Bus Width | 384-bit | 256-bit | 384-bit | 256-bit | 384-bit |
| Frame Buffer | 12GB | 4GB | 3GB | 2GB | 1.5GB |
| FP64 Rate | 1/32 FP32 | 1/32 FP32 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 |
| Memory Bandwidth (GB/s) | 336.5 | 224 | 288 | 192.3 | 192.4 |
| Peak Compute (TFLOPS) | 7 | 5 | 4 | 3 | 1.5 |
| GFLOPS/Watt | 28 | 30 | 15 | 15 | 6 |
| TDP | 250W | 165W | 250W | 195W | 244W |
| Transistor Count | 8.0B | 5.2B | 7.1B | 3.5B | 3.0B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 40nm |
| Release Date | 03/2015 | 09/2014 | 05/2013 | 03/2012 | 11/2010 |
| Launch Price | $999 | $549 | $649 | $499 | $499 |
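A quick note on the TFLOPS row: peak single-precision throughput for these GPUs works out to two FLOPs (one fused multiply-add) per CUDA core per clock, so the quoted figure depends on which clock you plug in. A rough check, with the caveat that NVIDIA's headline 7 TFLOPS number lines up better with the clocks GPU Boost actually sustains (our sample boosted to 1164MHz while gaming later in this review) than with the rated 1075MHz boost:

```python
# Peak FP32 throughput = 2 FLOPs (FMA) per core per clock.
cores = 3072
for label, clock_ghz in [("rated boost", 1.075), ("observed boost", 1.164)]:
    fp32_tflops = 2 * cores * clock_ghz / 1000
    fp64_tflops = fp32_tflops / 32  # GM200 runs FP64 at 1/32 the FP32 rate
    print(f"{label}: FP32 {fp32_tflops:.2f} TFLOPS, FP64 {fp64_tflops:.2f} TFLOPS")
# rated boost: FP32 6.60 TFLOPS, FP64 0.21 TFLOPS
# observed boost: FP32 7.15 TFLOPS, FP64 0.22 TFLOPS
```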
So, how does the GeForce GTX Titan X stack up to the other flagship models that NVIDIA has released over the past five years? The NVIDIA GeForce GTX Titan X is no slouch, and its 7 TFLOPS of compute power and 336.5GB/s of memory bandwidth dominate the cards of yesteryear. The original Kepler-powered GeForce GTX Titan is not shown in the table above, but the Titan X is said to have 2x the performance and 2x the power efficiency of that part. It should be noted that the GeForce GTX Titan X has 7.0 TFLOPS of single-precision floating-point performance but only about 0.2 TFLOPS of double-precision performance. If you want a card that can handle double-precision tasks you'll need to look to the Titan Z, which is still available for just that. Let's move along and take a look at the GeForce GTX Titan X reference card!

GeForce GTX Titan X 12GB Video Card

NVIDIA GeForce GTX Titan X Box

Our NVIDIA GeForce GTX Titan X video card showed up in the same black box that we first ran across when the Titan X was unveiled at GDC 2015. Unfortunately, our sample was not signed with a personal message from NVIDIA President and CEO Jen-Hsun Huang! The product packaging was designed to lift off from the top and reveal the mighty GeForce GTX Titan X graphics card, the end result being basically a display case showing off the GTX Titan X in an upright position.

NVIDIA GeForce GTX Titan X Video Card

The blower-style GPU cooler used on the GeForce GTX Titan X remains largely unchanged since first being used on the original GeForce GTX Titan in February 2013. NVIDIA isn’t getting complacent when it comes to GPU cooler designs; rather, they came up with a good cooler and the TDP of their high-end discrete desktop cards has not really gone up. If you think about it, the GeForce GTX 780, 780 Ti, and Titan were all 250W TDP cards, so why change the cooler if the temperatures and noise levels are good to go? The one thing we really like about the GeForce GTX Titan X is that NVIDIA blacked it out more than any other GPU in recent memory! The Titan X has a magnesium alloy fan housing with an aluminum frame that was trivalent chromium plated to look as good as possible. The aluminum housing has been painted black to give it an aggressive look, or a stealthy one, depending on what camp you are in. The GeForce logo on top of the card is still LED backlit and glows NVIDIA green.

The NVIDIA GeForce GTX Titan X reference card measures 10.5 inches in length and takes up two PCI slots. At just 10.5” long it should easily fit inside your current gaming system or any new chassis that you are looking to purchase for your next gaming rig.

GeForce GTX Titan X GPU Cooler

The blower-style fan on the Titan X brings air in from the end of the card and the opening of the blower fan itself, then exhausts that air out the back of the graphics card and outside of the PC chassis. When looking down the aluminum cooling fins you can see that the air is blown right across the fins and out of the system. Under the fins you’ll find a copper vapor chamber that sits predominantly on the GM200 GPU to effectively transfer heat away from the GPU and to the cooling fins.

NVIDIA GeForce GTX Titan X Video Outputs

The video outputs on the GeForce GTX Titan X reference card include three DisplayPort connectors, an HDMI 2.0 connector (supporting 4K@60Hz) and a single dual-link DVI output. This means that NVIDIA now offers a total of five video connections, but only four can be used simultaneously. This video output arrangement means that you can run three NVIDIA G-Sync enabled displays off of one GeForce GTX Titan X video card if one desires to do so. If you want to run a multi-panel setup and don't want to sacrifice any image quality, you'll likely still need to run a 2, 3 or 4-way SLI multi-GPU setup to get the performance needed to drive the resolution demanded by such a display configuration. NVIDIA also changed the way the exhaust ports are shaped on the I/O bracket to increase airflow and reduce noise.

The ability to support HDMI 2.0 is a pretty big deal, and NVIDIA has the world's first GPUs able to support it. Previous-generation GPUs supported HDMI 1.4 and could only officially drive 4K displays at 30Hz with full '444' RGB pixels, or at 60Hz with subsampled '420' YUV pixels. The GeForce GTX 970/980/Titan X support full-resolution '444' RGB pixels at 60Hz on 4K displays.
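A quick TMDS pixel-clock check shows why the older spec couldn't get there. This is our own arithmetic using the nominal CTA-861 timing for 2160p (4400x2250 pixels per frame including blanking), not anything published by NVIDIA:

```python
# Why 4K60 RGB ('444') needs HDMI 2.0: the required TMDS pixel clock.
h_total, v_total = 4400, 2250  # 3840x2160 plus horizontal/vertical blanking
for refresh_hz in (30, 60):
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    print(f"2160p{refresh_hz}: {pixel_clock_mhz:.0f} MHz pixel clock")
# 2160p30: 297 MHz -> fits under HDMI 1.4's 340MHz TMDS limit
# 2160p60: 594 MHz -> needs HDMI 2.0's 600MHz limit
# 4:2:0 YUV carries half the data per pixel, which is how HDMI 1.4-class
# hardware squeezed 2160p60 through at reduced chroma resolution.
```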
All GM2xx Maxwell GPUs also ship with an enhanced NVENC encoder that adds support for H.265 encoding. NVIDIA claims that Maxwell's video encoder improves H.264 video encode throughput by 2.5x over Kepler and that it can encode 4K video at 60 FPS. The maximum resolution supported by Maxwell is 5120x3200, so get ready for displays that go way beyond Ultra HD in the years to come!

NVIDIA GeForce GTX Titan X Power Connectors

The NVIDIA GeForce GTX Titan X has a Thermal Design Power (TDP) rating of 250 Watts and requires one 6-pin and one 8-pin PCIe power connector for proper operation. NVIDIA recommends a 600W or larger power supply for a system running one GeForce GTX Titan X video card. There is a physical location for an additional 8-pin PCIe power connector on the PCB at the very end of the card, but it was deemed not needed for the GM200 and the solder points have gone unused.

The NVIDIA GeForce GTX Titan X reference card does not come with a backplate. At first we were disappointed that NVIDIA did not include one, but they assured us that one was omitted to maximize airflow for those wanting to run SLI. It would have been nice for NVIDIA to have included a removable backplate though, as most gamers only run one card and like the looks and protection that a simple backplate can add to a graphics card.

GeForce GTX Titan X GPU Cooler

The NVIDIA GeForce GTX Titan X uses two sets of aluminum heatsinks with three embedded heatpipes that help keep the Maxwell GM200 GPU nice and cool. NVIDIA says that the default GPU Boost 2.0 settings will allow the Titan X to boost up to its highest clock frequency and remain there as long as the GPU temperature stays at or below the default 83C temperature target.

NVIDIA GeForce GTX Titan X GM200 GPU

Once you pull the GPU cooler entirely off you can see the PCB of the GeForce GTX Titan X reference card along with the GM200 GPU, the GDDR5 memory ICs and the 6+2 power phase design. NVIDIA went with a 6-phase VR circuit with integrated dynamic power balancing circuitry for the Titan X’s GM200 GPU, and there are two additional power phases for the board’s whopping 12GB of GDDR5 memory. NVIDIA is also using polarized capacitors (POSCAPs) to minimize unwanted board noise, as well as molded inductors for the very first time on a reference board. NVIDIA says that the 6+2 phase power supply has the ability to feed the GPU 275W at the maximum power target setting of 110% if one would like to overclock the card. NVIDIA claims they were able to get the 3072 CUDA cores on the GM200 GPU up to 1400MHz when overclocked with the stock air cooler.

Test System

Before we look at the numbers, let's take a brief look at the test system that was used. All testing was done using a fresh install of Windows 8.1 Pro 64-bit, and benchmarks were completed on the desktop with no other software programs running. It should be noted that we average all of our test runs. There has been some concern about people testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago. Video cards & drivers used for testing:

Intel X79/LGA2011 Platform

GeForce GTX Titan X Test System

The Intel X79 platform that we used to test all of the video cards was running the ASUS P9X79-E WS motherboard with BIOS 1501 that came out on 01/15/2014. We went with the Intel Core i7-4960X Ivy Bridge-E processor to power this platform as it is PCIe 3.0 certified, so all graphics cards are tested with PCI Express Gen 3 enabled. The Kingston HyperX 10th Anniversary 16GB 2400MHz quad-channel memory kit was set to XMP Profile #2. This profile defaults to 2133MHz with 1.65v and 11-12-12-30 1T memory timings. The OCZ Vertex 460 240GB SSD was run with the latest firmware available. A Corsair AX860i digital power supply provides clean power to the system and is also silent, as the fan hardly ever spins up. This is critical to our testing as it lowers the ambient noise level of the room and gives us more accurate sound measurements than the old Corsair AX1200 power supply that we used from 2012 until this year, which had a loud fan that always ran. Here are the exact hardware components that we are using on our test system:
The Intel X79 Test Platform

| Component | Brand/Model |
|---|---|
| Processor | Intel Core i7-4960X |
| Motherboard | ASUS P9X79-E WS |
| Memory | 16GB Kingston HyperX 2133MHz |
| Video Card | Various |
| Solid-State Drive | OCZ Vertex 460 240GB |
| Cooling | Intel TS13X (Asetek) |
| Power Supply | Corsair AX860i |
| Operating System | Windows 8.1 Pro 64-bit |
| Monitor | Sharp PN-K321 32" 4K |
NVIDIA GeForce GTX Titan X 12GB Reference Video Card GPU-Z Info:

NVIDIA GeForce GTX Titan X Overclocking

The NVIDIA GeForce GTX Titan X is a monster at its stock speeds, but if you are comfortable using an overclocking utility like EVGA Precision X you can take this already powerful card and get even more performance from the GM200 'Maxwell' GPU. This is due to the fact that NVIDIA left plenty of overhead in this particular GPU for enthusiasts and gamers to tap into if they want to push the envelope a little bit. The NVIDIA GeForce GTX Titan X is power limited in most game titles, so it will benefit from additional power. NVIDIA said that every GeForce GTX Titan X has the ability to run with the power target increased to 110% for better performance. We installed the latest version of the EVGA Precision X 16 overclocking utility to overclock the NVIDIA GeForce GTX Titan X video card! You can use whatever software utility you like for overclocking, but this is our personal favorite and the one we've used the most.

GeForce GTX Titan X GPU-Z

In case you forgot, the NVIDIA GeForce GTX Titan X card is clocked at 1002MHz base with a boost clock of 1076MHz, and the 12GB of GDDR5 memory is clocked at 1753MHz (7012MHz effective). Let's see how much higher we can get a fully enabled GM200 Maxwell GPU with 3072 CUDA cores!

The NVIDIA GeForce GTX Titan X is pretty open when it comes to overclocking. You can increase the power target to 110%, and if you leave the GPU Temp Target linked to it, the temperature target will automatically increase to 91C. We pushed the GPU Clock offset to +250MHz and the Mem Clock offset to +500MHz on our card and found it was pretty stable in most game titles. We ended up at a +250MHz GPU clock offset and a +500MHz memory clock offset before we started to encounter some stability issues due to the memory clock frequency. This overclock meant that we were running at up to 1414MHz on the core at times thanks to GPU Boost 2.0, and at 2000MHz (8000MHz effective) on the 12GB of GDDR5 memory.

NVIDIA GeForce GTX Titan X Stock:

NVIDIA GeForce GTX Titan X Overclocked (+250/+500):

By overclocking the NVIDIA GeForce GTX Titan X 12GB reference card we were able to take the 3DMark score of 7549 and raise it up to 8773. This is a 1224-point increase in our overall 3DMark score, which represents a performance gain of 16.2 percent. The overall FPS average in Graphics Test 1 went from 42.30 to 49.85, which is a 17.8% performance gain in the graphics test.

The only problem is that our particular GeForce GTX Titan X didn't like running at 1252MHz base/1326MHz boost, and the system would lock up randomly in some game titles. We backed the GPU core offset down from +250MHz to +200MHz and found the Titan X was rock solid and would run for hours and hours without any crashes or lockups. At a +200MHz core offset our Titan X was running at up to 1364MHz in game titles. We benchmarked the GeForce GTX Titan X overclocked at +200MHz core and +500MHz memory for this review, so you can see how the card performs both in stock trim and when pushed with what we'd consider a happy 24/7 overclock that you'd actually set and leave on your card.
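If you want to sanity check the overclocking numbers above, the offsets map straight onto the stock clocks and the 3DMark gains are simple ratios. A small sketch (our arithmetic, with the memory offset behavior inferred from the clocks Precision X reported):

```python
# Offset overclocking math for our Titan X sample.
base_mhz, boost_mhz = 1002, 1076  # stock core clocks
core_offset = 250
print(base_mhz + core_offset, boost_mhz + core_offset)  # 1252 base / 1326 boost
# The +500MHz Precision X memory offset put the real memory clock at ~2000MHz
# from 1753MHz (8000MHz effective), i.e. the offset appears to be applied at
# the double data rate rather than the base memory clock.

# 3DMark Fire Strike Extreme gains from the +250/+500 run:
stock_score, oc_score = 7549, 8773
stock_fps, oc_fps = 42.30, 49.85
print(f"overall: +{oc_score - stock_score} pts ({oc_score / stock_score - 1:.1%})")  # 16.2%
print(f"Graphics Test 1: {oc_fps / stock_fps - 1:.1%}")                              # 17.8%
```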

Batman: Arkham Origins

Batman: Arkham Origins is an action-adventure video game developed by Warner Bros. Games Montréal. Based on the DC Comics superhero Batman, it follows the 2011 video game Batman: Arkham City and is the third main installment in the Batman: Arkham series. It was released worldwide on October 25, 2013.

For testing we used DirectX 11 Enhanced, FXAA High anti-aliasing and all the bells and whistles turned on. It should be noted that V-Sync was turned off and that NVIDIA's PhysX software engine was also disabled to ensure both the AMD and NVIDIA graphics cards were rendering the same objects. We manually ran FRAPS on the single-player game instead of using the built-in benchmark to be as real-world as we possibly could. We ran FRAPS in the Bat Cave, which was one of the only locations where we could easily run FRAPS for a couple of minutes and keep the run somewhat repeatable.

The CPU usage for Batman: Arkham Origins was surprisingly low, with just 10% of the Intel Core i7-4960X being used by this particular game title. You can see that the bulk of the work is being done by one CPU core.

Benchmark Results: The NVIDIA GeForce GTX Titan X 12GB video card averaged 70.49 FPS in Batman: Arkham Origins at stock clock speeds. With the Titan X overclocked by 200MHz on the CUDA cores and roughly 250MHz on the GDDR5 memory we were able to average 81.70 FPS, which was actually faster than a dead-stock AMD Radeon R9 295X2 8GB video card. The NVIDIA GeForce GTX 980 reference card came in at 52.91 FPS, so there is a pretty big gap between the GTX 980 and GTX Titan X!

Benchmark Results: When you look at performance over time, the GeForce GTX Titan X 12GB graphics card never dropped below 60 FPS on our Ultra HD (3840x2160) test setup!
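Since every game test in this review leans on FRAPS logs for the average and performance-over-time numbers, here is a minimal sketch of how such a log reduces to those figures. It assumes the usual two-column 'Frame, Time (ms)' frametimes CSV that FRAPS writes, and the filename is hypothetical:

```python
# Sketch: turning a FRAPS frametimes log into average and worst-case FPS.
import csv

def load_frame_times(path):
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        for row in reader:
            times.append(float(row[1]))  # cumulative timestamp in ms
    return times

def fps_stats(times):
    # Per-frame deltas in ms; each delta is one frame's render time.
    deltas = [b - a for a, b in zip(times, times[1:])]
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    min_fps = 1000.0 / max(deltas)  # slowest single frame
    return avg_fps, min_fps

avg, worst = fps_stats(load_frame_times("batman_frametimes.csv"))  # hypothetical log
print(f"Average: {avg:.2f} FPS, worst single frame: {worst:.2f} FPS")
```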

Battlefield 4

Battlefield 4 is a first-person shooter video game developed by EA Digital Illusions CE (DICE) and published by Electronic Arts. It is a sequel to 2011's Battlefield 3 and was released on October 29, 2013 in North America. Battlefield 4's single-player campaign takes place in 2020, six years after the events of its predecessor. Tensions between Russia and the United States have been running at a record high. On top of this, China is also on the brink of war, as Admiral Chang, the main antagonist, plans to overthrow China's current government; if successful, the Russians will have full support from the Chinese, bringing China into a war with the United States.

This game title uses the Frostbite 3 game engine and looks great. We tested Battlefield 4 with the Ultra graphics quality preset, as most discrete desktop graphics cards can easily play with this IQ setting at 1080P and we still want to be able to push the higher-end cards down the road. We used FRAPS to benchmark each card with these settings on the Shanghai level.

Battlefield 4 is more CPU intensive than any other game that we benchmark with, as 25% of the CPU is used up during gameplay. You can see that six threads are being used and that the processor is running in Turbo mode at 3.96GHz more often than not.

Benchmark Results: In Battlefield 4 with Ultra settings at 3840x2160 we were able to average 33.28 FPS on the GeForce GTX 980 reference card versus 44.18 FPS on the GeForce GTX Titan X. The AMD Radeon R9 295X2 averaged 54.24 FPS, so it is about 10 FPS faster on average in BF4 with these settings. By overclocking the NVIDIA GeForce GTX Titan X we were able to improve the average performance up to 52 FPS, but that is still slower than a stock Radeon R9 295X2.

Benchmark Results: The GeForce GTX Titan X ran BF4 pretty smoothly with these settings and we seldom dropped below 40 FPS at 3840x2160. The AMD Radeon R9 295X2 runs in AMD CrossFire, and you can see this when looking at performance charted over time, as its frame rate was a bit more sporadic.

Crysis 3

Like the others, Crysis 3 is a first-person shooter developed by Crytek, using their CryEngine 3. Released in February 2013, it is well known to make even powerful systems choke. It has probably the highest graphics requirements of any game available today. Unfortunately, Crytek didn't include a standardized benchmark with Crysis 3, so while the enemies will move about on their own, we attempt to keep the same testing process for each run.

Crysis 3 has a reputation for being highly resource intensive. Most graphics cards will have problems running Crysis 3 at maximum settings, so we settled on no AA with the graphics quality mostly set to Very High with 16x AF. We disabled V-Sync and left the motion blur amount on medium.

Crysis 3 appeared to run for the most part on just three CPU threads and used up about 15-18% of our Intel Core i7-4960X processor with these settings. Notice that the processor speed was at 3.53GHz and we very seldom, if ever, saw the processor go into Turbo mode in Crysis 3.

Benchmark Results: In Crysis 3 at 3840x2160 the NVIDIA GeForce GTX 980 reference card averaged 21.07 FPS and we were running at 27.83 FPS on the GeForce GTX Titan X. Overclocking the GeForce GTX Titan X got our performance up to 33.48 FPS, which, as great as it looks, is still a couple of FPS behind the Radeon R9 295X2.

Benchmark Results: It is extremely tough to get identical FRAPS runs in Crysis 3, but you can see that the GeForce GTX Titan X stays above 25 FPS for most of the benchmark run and had less variance than the Radeon R9 295X2.

Far Cry 4

Far Cry 4 is an action-adventure first-person shooter video game developed by Ubisoft Montreal and published by Ubisoft for the PlayStation 3, PlayStation 4, Xbox 360, Xbox One and Microsoft Windows. It is the sequel to 2012's Far Cry 3 and was released on November 18th, 2014 in North America and Europe. Far Cry 4 follows Ajay Ghale, a young Kyrati-American who returns to his native country Kyrat to spread his deceased mother's ashes. He finds the country in a state of civil war between Kyrat's Royal Army, led by the country's eccentric and tyrannical king Pagan Min, and the Golden Path, a rebel movement fighting to free Kyrat from Min's oppressive rule.

Far Cry 4 uses the heavily modified Dunia Engine 2 with Havok physics. The graphics are excellent and the game really pushes the limits of what one can expect from mainstream graphics cards. We set the game to Ultra image quality settings and did not adjust any of the advanced settings.

Far Cry 4 uses about 30% of the processor and runs on multiple cores, as you can see from our screen capture above. One core has more of a load on it than the others, but all logical processors are being used to some degree when playing Far Cry 4.

Benchmark Results: In Far Cry 4 we found the new NVIDIA GeForce GTX Titan X video card put up a respectable 43.71 FPS, which makes the 32.36 FPS on the NVIDIA GeForce GTX 980 reference card look underwhelming. When overclocked, the GeForce GTX Titan X was able to average 51.07 FPS in Far Cry 4, and our gaming experience was excellent in this awesome looking game title. The AMD Radeon R9 295X2 had the highest average FPS at 59.20, but it was choppy at various times when we were playing the game.


Benchmark Results: The AMD Radeon R9 295X2 might have won the average FPS battle, but the gameplay wasn't smooth. When we charted the FRAPS log and looked at performance over time we found that the Radeon R9 295X2 dropped below 30 FPS once and fell to nearly 30 FPS several other times. The NVIDIA GeForce GTX Titan X never dropped below 37 FPS, so there was nearly a 10 FPS difference in the minimum frame rate during our benchmarking session.

Metro Last Light

 

Metro: Last Light is a first-person shooter video game developed by Ukrainian studio 4A Games and published by Deep Silver. The game is set in a post-apocalyptic world and features action-oriented gameplay with a combination of survival horror elements. It uses the 4A Engine and was released in May 2013.

Metro: Last Light was benchmarked with very high image quality settings, SSAA off and 4x AF. These settings are tough for entry-level discrete graphics cards, but are more than playable on high-end gaming graphics cards. We benchmarked this game title on the Theater level.

We again found around 20% CPU usage on Metro: Last Light.

Benchmark Results: In Metro: Last Light the GeForce GTX 980 averaged 39.41 FPS and the GeForce GTX Titan X came in at 48.34 FPS in stock form. The AMD Radeon R9 295X2 came in at 64.82 FPS, and not even the impressive 56.67 FPS average of the overclocked Titan X could touch that.

Benchmark Results: The AMD Radeon R9 295X2 had a pretty large 30 FPS dip in our benchmark run, but for the most part nothing out of the ordinary was seen here! The NVIDIA GeForce GTX Titan X was able to run Metro: Last Light above 40 FPS for the entire benchmark run and we played through part of the game without any issues on our 4K setup.

Thief

Thief is a series of stealth video games in which the player takes the role of Garrett, a master thief in a fantasy/steampunk world resembling a cross between the Late Middle Ages and the Victorian era, with more advanced technologies interspersed. This Thief is the fourth title in the series, developed by Eidos Montreal and published by Square Enix on February 25, 2014.

We ran Thief with the image quality settings set at normal and VSYNC disabled.

Thief appears to run on the six physical cores of the Intel Core i7-4960X processor and averages around 17-24% CPU usage from what we were able to tell from the CPU utilization meter built into the Windows 8.1 task manager.

Benchmark Results: In Thief we found the NVIDIA GeForce GTX 980 reference card had an average benchmark run of 50.64 FPS and the GeForce GTX Titan X averaged 62.79 FPS. The AMD Radeon R9 295X2 scored 72.43 FPS in stock trim, which the overclocked Titan X was able to surpass with a score of 74.18 FPS.

Benchmark Results: The performance-over-time chart showed no anomalies! The NVIDIA GeForce GTX Titan X managed to stay above 50 FPS during the benchmark run.

3DMark 2013

Fire Strike is Futuremark's benchmark for high-performance gaming PCs. Use Fire Strike to test the performance of dedicated gaming PCs, or use the Fire Strike Extreme preset for high-end systems with multiple GPUs. Fire Strike uses a multi-threaded DirectX 11 engine to test DirectX 11 hardware.

3DMark Fire Strike

Fire Strike Extreme Benchmark Results:


Benchmark Results: The 3DMark Fire Strike Extreme benchmark had the NVIDIA GeForce GTX 980 video card coming in with an overall score of 5,853 and the GeForce GTX Titan X easily bested that with a score of 7,549. The AMD Radeon R9 295X2 led the benchmark chart though with an impressive overall score of 8,852.

Fire Strike Ultra 4K Benchmark Results:


Benchmark Results: The 3DMark Fire Strike Ultra '4K' benchmark had the GeForce GTX Titan X coming in at 4,000 and the Radeon R9 295X2 was at 4,899.

Temperature & Noise Testing

Temperatures are important to enthusiasts and gamers, so we took a bit of time and did some temperature testing on the NVIDIA GeForce GTX Titan X video card.

NVIDIA GeForce GTX Titan X Idle Temps:

At idle we found the GPU core temperature was 32C with the single fan on the NVIDIA reference cooler running at 22%, or 1050 RPM.

When gaming we hit 85C, which is slightly over the 83C default GPU temperature target for the GeForce GTX Titan X. Our room temperature was 70F (21C), so these are respectable numbers for a flagship GPU with 3072 CUDA cores! When gaming, our card boosted up to a maximum speed of 1164.4MHz on the CUDA cores.

Sound Testing

We test noise levels with an Extech sound level meter that has ±1.5dB accuracy and meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 36dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data.

The NVIDIA GeForce GTX Titan X is a fairly quiet card at both idle and when heated to full operating temperature.

Power Consumption

For testing power consumption, we took our test system and plugged it into a Kill-A-Watt power meter. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers we ran Battlefield 4 at 3840x2160 and recorded the peak gaming reading on the power meter.

Power Consumption Results: The NVIDIA GeForce GTX 780 Ti and GeForce GTX Titan X are both 250W TDP cards, and we found that both used roughly 380 Watts when running BF4. The GeForce GTX Titan X hit 376W at stock clock speeds and then 403W when overclocked with the power target increased and the clock speeds raised. The AMD Radeon R9 295X2 led many of the performance tests, but it also uses by far the most power to get those performance numbers! The AMD Radeon R9 295X2 peaked at 743W in BF4, a staggering 367 Watts more than a stock GeForce GTX Titan X. The AMD Radeon R9 295X2 all of a sudden doesn't look that good when you see how much power it is using. By the time you factor in your 4K display, surround sound speakers (if you aren't using headphones) and everything else, you'll be damn near using 1000W of power when gaming!
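To put the wall-socket numbers in perspective, here is the simple arithmetic behind the comparisons above (these are total system readings from the Kill-A-Watt, not isolated GPU power):

```python
# Peak system power draw measured at the wall in BF4 (Watts).
readings = {"GTX Titan X (stock)": 376, "GTX Titan X (OC)": 403, "R9 295X2": 743}
baseline = readings["GTX Titan X (stock)"]
for card, watts in readings.items():
    print(f"{card}: {watts}W (+{watts - baseline}W vs stock Titan X)")
# The R9 295X2's +367W gap is why the power bill and a 1000W-class load
# both come into play once you add a 4K display and speakers to the mix.
```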

Final Thoughts and Conclusions

The NVIDIA GeForce GTX Titan X was designed to be the fastest graphics card in the world and to power the highest resolution displays on the market today. Our benchmarks showed that the Titan X is indeed the fastest single-GPU card on the market. NVIDIA designed the GeForce GTX Titan X with the future in mind and soldered down 12GB of GDDR5 memory to ensure that you'll never be frame buffer limited when gaming on this card. You'll run out of shading horsepower well before you fill up all 12GB of that memory! NVIDIA is a little sensitive about frame buffers right now, but this card has a true 12GB of memory!

The AMD Radeon R9 295X2 is a water-cooled, power-chugging dual-GPU beast, but it fails to impress after running the more efficient and quieter GeForce GTX Titan X. The GeForce GTX Titan X also proved to be a great overclocker, and we were able to raise the clock speeds on the CUDA cores by 200MHz with full stability and a nice boost in overall gaming performance.

The GeForce GTX Titan X is being built by NVIDIA and sold directly to AIB partners, and there are no plans for the partners to build their own custom cards at this time. This is a hard launch with retail availability now, and the price for the Titan X graphics card is $999. The lowest price we could find for any brand of GeForce GTX 980 is $558.24 shipped. The main competition for the Titan X would without a doubt be the AMD Radeon R9 295X2, and we just found one on Newegg today for $659.99 plus shipping after a rebate. With the AMD Radeon R9 290X down to $339.99 shipped after rebate, the price on AMD's dual-GPU Hawaii beast is about right.

The AMD Radeon R9 290X 4GB video card was released in 2013 and was a very powerful card when it came out. AMD had great success with the card in part due to the cryptocurrency craze, but that ended up hurting gamers as the cards were out of stock, and those that were impatient and didn't want to wait went over to NVIDIA. Now that the run on GPUs for mining is long gone and the Radeon R9 290X has lost its big-time appeal to gamers, team red isn't looking that hot. The good news for AMD fans is that the AMD Radeon R9 390X 4GB or 8GB video card is right around the corner (launching around Computex in June 2015). There have been a number of leaks about the card just ahead of the Titan X launch, which could be some AMD folks or fans just trying to mess with the Titan X launch. The AMD Radeon R9 390X looks promising and we know for a fact that it will be using High Bandwidth Memory (HBM) technology, but we'll reserve our thoughts for when it actually makes it to market.

The NVIDIA GeForce GTX Titan X is real, it's in our hands, and you can go buy one today.

LR Recommended Award
Legit Bottom Line: The NVIDIA GeForce GTX Titan X video card is now the world's fastest GPU!