NVIDIA Unleashes A Monster - GeForce GTX 690

NVIDIA Wood Crate and Pry Bar

We all knew that NVIDIA was working on a dual-GPU video card to replace the GeForce GTX 590 that was introduced last year, but no one knew exactly when it was coming. We ourselves only found out that this card was launching this past Saturday, when NVIDIA CEO Jen-Hsun Huang told the world the news at the GeForce LAN in Shanghai, China. The company's new video card, the GeForce GTX 690, was designed from the ground up to deliver the highest performance of any graphics card in history. NVIDIA didn't stop there though, as they paid special attention to the appearance of the card to ensure that this card's design would be over the top. When a pry bar and a wood crate showed up at our door, we knew this card was something special.

NVIDIA GeForce GTX 690 Video Card

And this video card will be a game changer, as it is powered by two Kepler GK104 GPUs that have a combined 3072 CUDA cores, 7.08 billion transistors and 4GB of GDDR5 memory! NVIDIA wanted gamers to be able to crank up the gaming realism with NVIDIA Surround, NVIDIA 3D Vision Surround and PhysX, so NVIDIA needed a card that could deliver, and that is exactly what the GeForce GTX 690 was brought to market to do. If you are a gamer that cranks up the image quality and runs at a 2560x1600 screen resolution on a single graphics card, be sure that you are paying attention!

You can also clearly see that this card is a different breed, as NVIDIA made a number of changes to the overall appearance of the card by using exotic materials and design touches, as you can see below:

GeForce GTX 690 Video Card

It sounds like NVIDIA has started to put more emphasis on style and design, which is great! For years most video cards have been nothing more than black plastic boxes that go inside your PC. Most video cards feel cheap and flimsy, so when we picked up the NVIDIA GeForce GTX 690 and felt the metal construction and the overall weight of the card, we were pleasantly surprised and happy with what NVIDIA has done.

GeForce GTX 690 Video Card Specs

The GeForce GTX 690 is a dual-GPU card that uses two GK104 cores, which just happens to be the core used on the GeForce GTX 680 single-GPU video card. Since it uses two of these cores, the features of the card are for the most part doubled. That means the GeForce GTX 690 ships with 16 Streaming Multiprocessor (SMX) units that are good for a total of 3072 CUDA cores (1536 per GPU).

NVIDIA lowered the base clock on each GPU though, so the GeForce GTX 690 has a base clock of 915MHz versus 1006MHz on the GeForce GTX 680. This means that the GeForce GTX 690 will likely perform a little slower than a pair of GTX 680 cards since it has had a reduction in the core clock speed. The boost clock on the GTX 690 goes up to 1019MHz by default though, which is not too far off from the 1058MHz seen on the GTX 680. The typical boost clock is based on the average video card running a wide variety of games and applications; the actual boost clock will vary depending on actual system conditions since this new series of cards has dynamic clocks. With a boost clock that is only about 3.7% lower than the GeForce GTX 680's, we can assume that the GeForce GTX 690 will perform only slightly slower than two GeForce GTX 680 video cards running in SLI.
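For those keeping score, that deficit is easy to verify from the clocks quoted above:

\[
\frac{1058\,\mathrm{MHz} - 1019\,\mathrm{MHz}}{1058\,\mathrm{MHz}} = \frac{39}{1058} \approx 3.7\%
\]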

The memory subsystem of the GeForce GTX 690 consists of four 64-bit memory controllers (256-bit) with 2GB of GDDR5 memory per GPU (4GB total). NVIDIA left the memory speed unchanged, so both the GeForce GTX 680 and the new GeForce GTX 690 have the memory running at an impressive 6008MHz (effective).
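That "effective" rate reflects the fact that GDDR5 transfers data four times per clock cycle. Working backwards from the quoted 6008MHz, the actual memory clock and the per-GPU bandwidth on the 256-bit bus work out as follows:

\[
\frac{6008\,\mathrm{MHz}}{4} = 1502\,\mathrm{MHz},
\qquad
\frac{256\,\mathrm{bit} \times 6008\,\mathrm{MT/s}}{8\,\mathrm{bit/byte}} \approx 192.3\,\mathrm{GB/s}\ \text{per GPU}
\]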

GeForce GTX 690 Video Card Specs

NVIDIA informed us that the GeForce GTX 690 board typically draws around 263W of power, but the board has been fitted with two 8-pin power connectors. Combined with the PCI Express interface, this means that the GeForce GTX 690 can draw up to 375W of power if needed. The GeForce GTX 690 is rated at 300W TDP, so this leaves a considerable amount of headroom for overclocking.
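That 375W ceiling comes straight from the PCI Express power specifications, which rate the slot at 75W and each 8-pin connector at 150W:

\[
75\,\mathrm{W} + 2 \times 150\,\mathrm{W} = 375\,\mathrm{W},
\qquad
375\,\mathrm{W} - 300\,\mathrm{W\ (TDP)} = 75\,\mathrm{W}\ \text{of headroom}
\]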

Let's take a closer look at the GeForce GTX 690!

Closer Look At The GeForce GTX 690

NVIDIA GeForce GTX 690 Video Card

The GeForce GTX 690 graphics card on the test bench today is a dual-slot video card that uses two GK104 GPUs, so it is fairly large. This card measures in at 11 inches in length, which is one inch longer than the GeForce GTX 680! NVIDIA went with an exotic industrial design on the GeForce GTX 690 and the looks are unlike anything that we have seen before.

NVIDIA GeForce GTX 690 Video Card Top

Each GPU has its own distinct cooling unit, and NVIDIA placed clear polycarbonate windows over each of them so you can see the nickel-plated heat sinks.

NVIDIA GeForce GTX 690 Video Card Back

Turning the NVIDIA GeForce GTX 690 over, there is not much of interest, as none of the memory chips or major components are located on the backside of the PCB. NVIDIA is using a 10-layer 2oz copper PCB on the GeForce GTX 690 to provide high-efficiency power delivery with less resistance, lower power consumption and less heat generation. We used a pair of dial calipers and found that the mounting holes around the GPU are spaced 58mm apart.

NVIDIA GeForce GTX 690 Video Card DVI and HDMI

The NVIDIA GeForce GTX 690 4GB GDDR5 graphics card has three dual-link DVI outputs (two DVI-I and one DVI-D) along with a mini-DisplayPort connector.

NVIDIA GeForce GTX 690 Video Card LED Logo

The top of the card looks sharp and after using the card for the first time we quickly noticed that the GeForce GTX logo on the edge of the board is LED backlit! The lettering is laser-etched and the lighting is just right to add some flair to any chassis.

NVIDIA GeForce GTX 690 Video Card End

The back end of the GeForce GTX 690 has a small opening to allow airflow to enter the fan.

NVIDIA GeForce GTX 690 Video Card PCIe Power

The NVIDIA GeForce GTX 690 video card requires a 650 Watt or greater power supply with a minimum of 38 Amps on the +12V rail and two 8-pin PCI Express power connectors for proper connection. It should be noted that the NVIDIA minimum system power requirement is based on a PC configured with an Intel Core i7 3.2GHz CPU. These are pretty reasonable power requirements, so that is good news for everyone!

NVIDIA GeForce GTX 690 Video Card SLI Connector

The NVIDIA GeForce GTX 690 graphics card has full SLI support, with an SLI connector located along the top edge of the graphics card. The NVIDIA GTX 690 supports quad SLI configurations, so you can pair it with a second GTX 690. We asked NVIDIA if you could pair this card with a single GeForce GTX 680 for a 3-way SLI setup and they said you could not.

NVIDIA GeForce GTX 690 Vapor Chamber Cooling

With the magnesium alloy fan shroud removed we can get a better look at the two GPU coolers and how the layout is done. Each GPU has its own dedicated cooling unit; each self-contained cooler consists of a copper vapor chamber and a dual-slot heatsink. An aluminum baseplate provides additional cooling for the PCB and board components. Channeling cool air through the GPU coolers is a center-mounted axial fan. The section of the baseplate directly underneath the fan is carved with low-profile channels to encourage smooth airflow, and all components under the fan are low-profile so they won’t cause turbulence or obstruct airflow.

NVIDIA GeForce GTX 690 Vapor Chamber Cooling

NVIDIA uses a 10-phase power supply with a 10-layer 2oz copper PCB on the GeForce GTX 690. Between the two GPUs you'll notice another chip. This is the PLX bridge chip that provides independent PCI Express 3.0 x16 access to both GPUs for maximum multi-GPU throughput. The GeForce GTX 690 is ready to run in a PCIe Gen 3 slot for optimum performance.

Test Setup

Before we look at the numbers, let's take a brief look at the test system that was used. All testing was done using a fresh install of Windows 7 Ultimate 64-bit and benchmarks were completed on the desktop with no other software programs running.

Drivers used for testing:

Intel X79/LGA2011 Platform

Intel LGA2011 Test System

The Intel X79 platform that we used to test all of the video cards was running the ASUS P9X79 Deluxe motherboard with BIOS 0906 that came out on 12/22/2011. The Corsair Vengeance 16GB 1866MHz quad-channel memory kit was set to 1866MHz with 1.5v and 9-10-9-27 1T memory timings. The OCZ Vertex 3 240GB SSD was run with firmware version 2.15.

The Intel X79 Test Platform

Component           Brand/Model
Processor           Intel Core i7-3960X
Motherboard         ASUS P9X79 Deluxe
Memory              16GB Corsair 1866MHz
Video Card          Various
Solid-State Drive   OCZ Vertex 3 240GB
Cooling             Intel RTS2011LC
Power Supply        Corsair AX1200
Operating System    Windows 7 Ultimate 64-bit

Video Cards Tested:

NVIDIA GeForce GTX 690 GPU-Z Information:

NVIDIA GeForce GTX 690 GPU-Z

Batman: Arkham City

Batman: Arkham City PC Game

Batman: Arkham City is a 2011 action-adventure video game developed by Rocksteady Studios. It is the sequel to the 2009 video game Batman: Arkham Asylum, based on the DC Comics superhero Batman. The game was released by Warner Bros. Interactive Entertainment for the PlayStation 3, Xbox 360 and Microsoft Windows. The PC and OnLive versions were released on November 22, 2011.

Batman: Arkham City Game Settings

Batman: Arkham City Game Settings

Batman: Arkham City uses the Unreal Engine 3 game engine with PhysX. For benchmark testing of Batman: Arkham City we disabled PhysX to keep it fair and ran the game in DirectX 11 mode with 8x MSAA enabled and all the image quality features cranked up. You can see all of the exact settings in the screen captures above.


Batman: Arkham City Benchmark Results

Benchmark Results:
Batman: Arkham City doesn't show us a performance difference at 1280x1024 when comparing the NVIDIA GeForce GTX 690 and the NVIDIA GeForce GTX 680s in SLI; both were able to average 175 frames per second. Bumping the resolution to 1920x1080 there is a 1 frame per second difference, and at 2560x1600 we can see a 2 frame per second difference. Looking at the performance of all the single cards, the NVIDIA GeForce GTX 690 was significantly faster, which isn't unexpected. At 2560x1600 the GTX 690 was nearly 95% faster than the GTX 680 and 130% faster than the MSI R7970 Lightning 3GB graphics card!

Battlefield 3

Battlefield 3 Screenshot

Battlefield 3 (BF3) is a first-person shooter video game developed by EA Digital Illusions CE and published by Electronic Arts. The game was released in North America on October 25, 2011 and in Europe on October 28, 2011. It does not support versions of Windows prior to Windows Vista as the game only supports DirectX 10 and 11. It is a direct sequel to 2005's Battlefield 2, and the eleventh installment in the Battlefield franchise. The game sold 5 million copies in its first week of release and the PC download is exclusive to EA's Origin platform, through which PC users also authenticate when connecting to the game.

Battlefield 3 Screenshot

Battlefield 3 debuts the new Frostbite 2 engine. This updated Frostbite engine can realistically portray the destruction of buildings and scenery to a greater extent than previous versions. Unlike previous iterations, the new version can also support dense urban areas. Battlefield 3 uses a new type of character animation technology called ANT. ANT technology is used in EA Sports games, such as FIFA, but for Battlefield 3 it was adapted to create a more realistic soldier, with the ability to transition into cover and turn the head before the body.

This game looks great and we tested with the highest settings possible.  This means we used 'ultra' settings and really punished the cards being tested. We ran FRAPS for two minutes on the single player map called 'Rock and a Hard Place' for benchmarking.

Battlefield 3 Benchmark Results

Benchmark Results: In Battlefield 3 the NVIDIA GeForce GTX 690 came up a little short of the performance of a pair of NVIDIA GeForce GTX 680s in SLI. This is simply due to the fact that the base core clocks on the GTX 690 are 91MHz lower than those of the GeForce GTX 680; the boost clocks are closer together, separated by only 39MHz. The difference between these two setups isn't mind blowing, but it is worth considering. At a resolution of 1920x1080 the GTX 690 averaged 133.7 frames per second while the GTX 680s in SLI averaged 137.9 frames per second. Increasing the resolution to 2560x1600, the GTX 690 averaged 85.1 frames per second while the GTX 680s in SLI averaged 88.4 frames per second. At these two resolutions there is only a 3%-4% difference.
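Working those margins out from the averages above confirms the 3%-4% figure:

\[
\frac{137.9 - 133.7}{137.9} \approx 3.0\%\ \text{at}\ 1920 \times 1080,
\qquad
\frac{88.4 - 85.1}{88.4} \approx 3.7\%\ \text{at}\ 2560 \times 1600
\]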

Deus Ex: Human Revolution

Deus Ex Human Revolution PC Game

Deus Ex: Human Revolution is the third game in the Deus Ex first-person role-playing video game series, and a prequel to the original game. Announced on May 27, 2007, Human Revolution was developed by Eidos Montreal and published by Square Enix. It was released in August 2011. Human Revolution contains elements of first-person shooters and role-playing games, set in a near-future where corporations have extended their influence past the reach of global governments. The game follows Adam Jensen, the security chief for one of the game's most powerful corporations, Sarif Industries. After a devastating attack on Sarif's headquarters, Adam is forced to undergo radical surgeries that fuse his body with mechanical augmentations, and he is embroiled in the search for those responsible for the attack.

Deus Ex Human Revolution Game Settings

Deus Ex Human Revolution Game Settings

Deus Ex: Human Revolution uses a modified version of Crystal Dynamics' Crystal Engine, which some of you might know as the game engine from the last Tomb Raider title. The game developers made some rather hefty modifications to this engine though, as the graphics are superb in this title.

Deus Ex Human Revolution Benchmark Results

Benchmark Results: In Deus Ex we saw zero performance difference at 1280x1024 between the NVIDIA GeForce GTX 690 and the GTX 680s in SLI. Once we started increasing the resolution the differences began to show. At 1920x1080 the GTX 690 averaged 228.9 frames per second while the GTX 680s in SLI were 2.5% faster with an average of 234.6 frames per second. Stepping up to the maximum resolution of 2560x1600 there was a difference of 5.4%, with the GTX 690 averaging 129.4 frames per second and the GeForce GTX 680s in SLI averaging 136.4 frames per second.

DiRT 3

Dirt 3 PC Game

Dirt 3 (stylized DiRT 3) is a rally racing video game and the third title in the Dirt series, which grew out of the Colin McRae Rally series, developed and published by Codemasters. However, the "Colin McRae" tag has been completely removed from this iteration. The game was released in Europe and North America on 24 May 2011.

Dirt 3 Game Settings

Dirt 3 Game Settings

Dirt 3 uses the Ego 2.0 Game Technology Engine (more commonly referred to as the Ego Engine or EGO, stylized ego), which is a video game engine developed by Codemasters. Ego is a modified version of the Neon game engine that was used in Colin McRae: Dirt and was developed by Codemasters and Sony Computer Entertainment using Sony Computer Entertainment's PhyreEngine cross-platform graphics engine. The Ego engine was developed to render more detailed damage and physics as well as to render large-scale environments.

Dirt 3 PC Game Benchmark Results

Benchmark Results: DiRT 3 shows that the NVIDIA GeForce GTX 690 and the GeForce GTX 680s in SLI perform nearly identically to each other. Comparing the performance of the GeForce GTX 690 to a single GTX 680, we can see some hefty performance improvements. At our lowest testing resolution of 1280x1024 the GTX 690 was only 37.8% ahead of the single GTX 680. Once we increased the resolution to 2560x1600 the performance margin increased to 75.6%!

H.A.W.X. 2

Tom Clancy's HAWX 2

Aerial warfare has evolved. So have you. As a member of the ultra-secret H.A.W.X. 2 squadron, you are one of the chosen few, one of the truly elite. You will use finely honed reflexes, bleeding-edge technology and ultra-sophisticated aircraft - their existence denied by many governments - to dominate the skies. You will do so by mastering every nuance of the world's finest combat aircraft. You will slip into enemy territory undetected, deliver a crippling blow and escape before he can summon a response. You will use your superior technology to decimate the enemy from afar, then draw him in close for a pulse-pounding dogfight. And you will use your steel nerve to successfully execute night raids, aerial refueling and more. You will do all this with professionalism, skill and consummate lethality. Because you are a member of H.A.W.X. 2 and you are one of the finest military aviators the world has ever known. H.A.W.X. 2 was released on November 16, 2010 for PC gamers.

Tom Clancy's HAWX 2

We ran the benchmark in DX11 mode with the image quality settings cranked up as you can see above.

Tom Clancy's HAWX 2

The H.A.W.X. 2 PC game title runs on what looks like seven threads, as you can see from the Task Manager shot above that was taken on the test system running the Intel Core i7-3960X processor.

Tom Clancy's HAWX 2 Benchmark Results

Benchmark Results: HAWX 2 is very playable on the latest generation of graphics cards, and running this game title in SLI certainly isn't needed. We still found decent scaling numbers here though, as at 2560x1600 we saw an 86.3% performance increase with the second GeForce GTX 680 video card!

Just Cause 2

Just Cause 2

Just Cause 2 is a sandbox-style action video game developed by Swedish developer Avalanche Studios and published by Eidos Interactive, part of Square Enix. It is the sequel to the 2006 video game Just Cause.

Just Cause 2 Game Settings

Just Cause 2 employs a new version of the Avalanche Engine, Avalanche Engine 2.0, which is an updated version of the engine used in Just Cause. The game is set on the other side of the world compared to Just Cause, on the fictional tropical island of Panau in Southeast Asia. Rico Rodriguez returns as the protagonist, aiming to overthrow the evil dictator Pandak "Baby" Panay and confront his former boss, Tom Sheldon.

Just Cause 2 Benchmark Results

Benchmark Results: Just Cause 2 doesn't show us a record-breaking difference between a single NVIDIA GeForce GTX 680 and the dual-GPU GeForce GTX 690, at least not at lower resolutions. At 1280x1024 we see a difference of only 12.18 frames per second. As we step up the resolutions the margins begin to grow. At 1920x1080 we can see a difference of 19.81 frames per second between these two cards. This is a jump from 12.6% at 1280x1024 to 27.3% at 1920x1080. Increasing the resolution to 2560x1600 we take another leap in performance, up to a margin of 63.2%!

Metro 2033

Metro 2033

Metro 2033 is an action-oriented video game with a combination of survival horror and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in Ukraine. The game is played from the perspective of a character named Artyom. The story takes place in post-apocalyptic Moscow, mostly inside the metro station where the player's character was raised (he was born before the war, in an unharmed city), but occasionally the player has to go above ground on certain missions and scavenge for valuables.

Metro 2033 Settings

This is another extremely demanding game. Image quality settings were raised to Very High quality with 4x AA and 16x AF. We turned off PhysX, but turned on DOF (Depth of Field) for benchmarking.

Metro 2033

Benchmark Results: Metro 2033 once again shows that there is very little performance difference between the NVIDIA GeForce GTX 680s in SLI and the GeForce GTX 690 dual-GPU card.

3DMark 11

Futuremark 3DMark 11 Benchmark

3DMark 11 is the latest version of the world’s most popular benchmark for measuring the 3D graphics performance of gaming PCs. 3DMark 11 uses a native DirectX 11 engine designed to make extensive use of all the new features in DirectX 11, including tessellation, compute shaders and multi-threading.

Futuremark 3DMark 11 Benchmark Settings

Since Futuremark has recently released 3DMark 11, we decided to run the benchmark at both the performance and extreme presets to see how our hardware would run.

3DMark11 Performance Benchmark Results:

Futuremark 3DMark 11 Benchmark Results

Benchmark Results: Futuremark 3DMark 11 with the performance preset gave us a great score of P15411. This is 630 points, or 4%, behind the GTX 680s in SLI.

3DMark11 Extreme Benchmark Results:

Futuremark 3DMark 11 Benchmark Results

Benchmark Results: Running Futuremark 3DMark 11 with the extreme preset, the NVIDIA GeForce GTX 690 turned in a solid score of X5797. That's an improvement of 2560 points, or 79.1%, over a single GeForce GTX 680!

Power Consumption

For testing power consumption, we took our test system and plugged it into a Kill-A-Watt power meter. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers we measured the peak wattage used by the system while running the OpenGL benchmark FurMark 1.9.2 at 640x480 resolution. We also ran Battlefield 3, 3DMark 11 and Deus Ex, recorded the highest wattage seen on the meter in each title, and averaged those peaks for the gaming results.

GeForce GTX 680 Power Class

The NVIDIA GeForce GTX 690 is supposed to be very power efficient, so this will be interesting to check out. The AMD Radeon HD 6990 with Catalyst 12.4 WHQL drivers doesn't play nice with Furmark in full-screen mode, so we will not be including those results.

Total System Power Consumption Results

Power Consumption Results: The total system power consumption of the NVIDIA GeForce GTX 690 falls right between a single GTX 680 and a pair of GTX 680s in SLI. Running the Furmark 1.9.2 burn-in, the 680s in SLI pulled 573 Watts while the GTX 690 was 105 Watts lower at 468 Watts. A single GTX 680 pulled only 352 Watts, which is 116 Watts lower than the GTX 690. Looking at the power consumption average of the three games, it's quite a similar story. The GTX 690 averaged 459 Watts, while the SLI system averaged 83 Watts more and the single GTX 680 averaged 118 Watts less. The idle power consumption of the GeForce GTX 690 was right on par with the GTX 680s in SLI. We expected to see slightly less since we are running a single PCB rather than a pair of them as we are in SLI.

Temperature & Noise Testing

Since video card temperatures and the heat generated by next-generation cards have become an area of concern among enthusiasts and gamers, we want to take a closer look at how the graphics cards do at idle, during gaming and finally under a full load.

Video Card Idle Temperatures

Temperatures increase significantly when running SLI due to the airflow around the cards, and the inner card almost always runs hotter since it is stuck between the other card and the processor area. The inner card idled at 47C and the outer card at 40C on our open-air test bench. Notice that the fan speeds were different due to the different temperatures of the cards: the inner card was running at 1200RPM, while the outer card was running at 1000RPM. Running SLI does make for a hotter and louder system!

Video Card Load Temperatures

When running Furmark with the GeForce GTX 680s in SLI we saw the cards reach 82C and 86C, respectively. Both cards were at 98% GPU load, but notice the fan speed difference again: one is running at 2940 RPM and the other at 2400 RPM. This 540 RPM difference is due to the 4C temperature difference on the cores, as that is how the fan profiles are programmed.

Video Card Load Temperatures

After Furmark was turned off you can see that the cards' temperatures recovered quickly.

Video Card Temps

Looking at peak temperatures, the NVIDIA GeForce GTX 690 ran cooler than the single GeForce GTX 680 in Furmark 1.9.2 and while gaming. Granted, it's only by 1 degree Celsius, but considering that we have two GPUs on one PCB that's pretty impressive. The idle temperature of the GTX 690 was slightly warmer than the single GTX 680, by 2 degrees Celsius.

Video Card Noise Levels

We recently upgraded our sound meter to an Extech sound level meter with ±1.5dB accuracy that meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 38dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data.

Video Card Noise Levels

It's not terribly surprising that the GTX 680s in SLI were the loudest of the NVIDIA GPUs. The GTX 690 was 2.5 dBA lower than the SLI setup in Furmark 1.9.2 and 0.8 dBA lower while gaming. At idle though, the GTX 690 was 1.3 dBA higher than the 680s in SLI.

Overclocking The GeForce GTX 690

EVGA Precision 3.0.2 Overclocking Utility

We installed the EVGA Precision X 3.0.2 software utility to overclock the NVIDIA GeForce GTX 690 video card to see just how far we could push this card. With the new Kepler core architecture design used on the GeForce 600 series, you can now adjust the power target of the video card along with GPU and Memory clock offsets within a certain range.

EVGA Precision 3.0.2 Overclocking Utility

EVGA Precision X v3.0.2 lets you increase the power target to 135%, the GPU clock offset to 549MHz and the Memory clock offset to 1000MHz.


EVGA Precision 3.0.1 Overclocking Utility

After spending an afternoon with our GeForce GTX 690 we found that we were able to reach +160MHz on the core without any voltage increases. This is a great overclock for this dual-GPU card, and we saw it hitting over 1230MHz when boost kicked in during 3DMark 11 and game titles. We didn't mess with overclocking the memory much, as most of the gains on Kepler come from the GPU clock speeds.

NVIDIA GeForce GTX 690 at Stock Settings: 

GeForce GTX 690 3DMark 11 Score

NVIDIA GeForce GTX 690 w/ 135% Power Target & +160MHz GPU Clock Offset: 

GeForce GTX 690 3DMark 11 Score

With this overclock we were able to hit P17419 in 3DMark 11 with the performance preset. This is over a 2,000-point increase in our score, which is a very nice 13% improvement in performance. The stock score was P15411!
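The gain is simple to verify from the two scores:

\[
17419 - 15411 = 2008\ \text{points},
\qquad
\frac{2008}{15411} \approx 13.0\%
\]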

We literally got this card less than 72 hours before this article was published, so we didn't have much time to overclock it. We are sure this card has more left in it! We have been told that setting the voltage doesn't matter much these days, as the max voltage setting in the EVGA Precision X utility is 1175mV and the card automatically defaults to that under load.

Seeing the NVIDIA GeForce GTX 690 hitting over 1200MHz this easily is impressive considering this is a dual-GPU card!

Final Thoughts and Conclusions

When it comes to the performance results, the NVIDIA GeForce GTX 690 easily becomes the fastest gaming graphics card that we have ever tested. It is also one of the best-looking cards that we have ever used. You can tell that NVIDIA left no stone unturned and tried to make this product the best video card the company has ever released. The GeForce GTX 690 raises the bar for what one should expect from a flagship video card. The NVIDIA GeForce GTX 690 has a suggested retail price of $999, so the use of exotic materials actually helps reduce the sticker shock a little bit. No one wants to drop a thousand dollars and get a product that looks and feels cheap in their hands. NVIDIA made sure that customers will not experience that with the GeForce GTX 690.

Looks aside, the GeForce GTX 690 performed exceptionally well and our gaming experience was excellent! With the slightly lower core clock speeds we figured that the GeForce GTX 690 would perform slightly slower than two GeForce GTX 680 video cards running in SLI, and our testing confirmed this. It also confirmed that the card blows away the AMD Radeon HD 6990, which is the competitor's current dual-GPU graphics card. Overclocking the GeForce GTX 690 was also very easy to do, and we were able to have the card running over 1200MHz in games with relative ease. NVIDIA made it known to the world today that it has the fastest card available and that AMD had better bring something amazing with the Radeon HD 7990 if it wants to compete.

NVIDIA GeForce GTX 690 Video Card

The NVIDIA GeForce GTX 690 did amazingly well when it came to power consumption, heat and noise. The noise levels are more than acceptable, and when gaming it ran quieter and used less power than the GeForce GTX 680 SLI setup! If you want the very best performance for $999, you are best off going with two GeForce GTX 680 cards at $499 each. That said, the GTX 680 SLI setup uses more power and is louder than the GeForce GTX 690, which performs at basically the same level. The GeForce GTX 690 will also appeal to those with Small Form Factor (SFF) systems that can only fit one video card. For example, Mini-ITX motherboards only have one PCI Express x16 slot; if you want the best in one of those systems, the GeForce GTX 690 is the only call to make.

According to NVIDIA, the GeForce GTX 690 will be available in limited quantities from add-in card partners, including ASUS, EVGA, Gainward, Galaxy, Gigabyte, Inno3D, MSI, Palit and Zotac starting today, with wider availability by May 7. Here in North America the main partners will be ASUS and EVGA, so those are the two brands to look for. NVIDIA expects to be able to keep these cards in stock, but they are being produced in limited quantities, so expect them to be scarce at the beginning.

Legit Reviews Editor's Choice Award

Legit Bottom Line: NVIDIA has unleashed a beast with the GeForce GTX 690 video card and it doesn't look half bad either!