NVIDIA GeForce GTX 680 in 2-Way SLI
Back on March 21st, we brought you our review of the NVIDIA GeForce GTX 680 reference card. Earlier this week we followed that up with a review of that card focused on 2D Surround gaming, so those of you with a multi-monitor setup could get some performance numbers at the ultra-wide resolution of 5760x1080. Today, we are back at it with a GeForce GTX 680 review featuring 2-way SLI! The only thing better than one GeForce GTX 680 is running two of them together, right?
The focus of this article will be the single-monitor point of view, so we'll be bringing you performance numbers at 2560x1600, 1920x1080 and 1280x1024 to show you how this killer setup performs. We'll also be taking a look at the power consumption, noise levels and temperatures that you'll get from a GeForce GTX 680 2-way SLI setup.
We couldn't find a company willing to send us a second GeForce GTX 680 for testing, so we went to Newegg and spent $499.99 out of our own pocket for an off-the-shelf EVGA GeForce GTX 680 graphics card. We got lucky and were able to purchase one during the split second it was in stock. Most people expect that review sites get everything they wish for, and in bulk, but that isn't the situation. The NVIDIA GeForce GTX 680 is still in extremely short supply, and a quick look online this morning found that Newegg was still sold out of all of their GeForce GTX 680 cards. This is one of the reasons that most companies aren't sending out extras to reviewers: they have retail orders to fill!
In case you are wondering, you can mix boards from two different brands for SLI as long as they are running the same GPU. Since the EVGA GeForce GTX 680 is based on the NVIDIA reference design, the two cards will work perfectly together in SLI.
To run SLI you need a power supply rated for at least 650 Watts that supports four 6-pin PCIe power connectors. You'll also need an NVIDIA SLI bridge to connect the two cards together.
Once the cards are installed you need to make sure to install the drivers for both cards and then enable SLI in the NVIDIA control panel. We are running GeForce 301.10 drivers and had no issues with drivers or getting SLI to work properly.
Let's skip all the fluff and show you the test system and dive into the benchmarks!
Before we look at the numbers, let's take a brief look at the test
system that was used. All testing was done using a fresh install of Windows
7 Ultimate 64-bit and benchmarks were completed on the desktop with no
other software programs running.
The AMD Radeon graphics card was tested with Catalyst 12.3 WHQL drivers and the NVIDIA graphics cards ran GeForce 301.10 drivers.
Intel X79/LGA2011 Platform
The Intel X79 platform that we used to test all of the video cards was running the ASUS P9X79 Deluxe motherboard with BIOS 0906 that came out on 12/22/2011. The Corsair Vengeance 16GB 1866MHz quad channel memory kit was set to 1866MHz with 1.5v and 9-10-9-27 1T memory timings. The OCZ Vertex 3 240GB SSD was run with firmware version 2.15.
|The Intel X79 Test Platform|
Intel Core i7-3960X
ASUS P9X79 Deluxe
16GB Corsair 1866MHz
OCZ Vertex 3 240GB
Windows 7 Ultimate 64-bit
Video Cards Tested:
- MSI Radeon HD 7970 Lightning 3072MB - 1075MHz Core / 1400MHz Memory
- NVIDIA GeForce GTX 680 2048MB - 1006MHz Core / 1502MHz Memory
NVIDIA GeForce GTX 680 GPU-Z Information:
Batman: Arkham City
Batman: Arkham City is a 2011 action-adventure video game developed by Rocksteady Studios. It is the sequel to the 2009 video game Batman: Arkham Asylum, based on the DC Comics superhero Batman. The game was released by Warner Bros. Interactive Entertainment for the PlayStation 3, Xbox 360 and Microsoft Windows. The PC and OnLive versions were released on November 22, 2011.
Batman: Arkham City uses the Unreal Engine 3 game engine with PhysX. For benchmark testing of Batman: Arkham City we disabled PhysX to keep it fair and ran the game in DirectX 11 mode with 8x MSAA enabled and all the image quality features cranked up. You can see all of the exact settings in the screen captures above.
Benchmark Results: Running a pair of NVIDIA GeForce GTX 680 graphics cards in 2-way SLI improved gaming performance by 72% at 1920x1080 and exactly 100% at 2560x1600! Doubling performance is as good as it gets with SLI setups, so we are seeing great performance scaling so far!
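For reference, the scaling percentages we quote throughout this review are simple single-card versus SLI frame-rate ratios. A minimal sketch of that math (the FPS values below are hypothetical, for illustration only, not our measured numbers):

```python
def sli_scaling(single_fps, sli_fps):
    """Percentage frame-rate improvement from adding a second card in SLI."""
    return (sli_fps / single_fps - 1.0) * 100.0

# Hypothetical example values:
print(round(sli_scaling(50.0, 86.0)))   # 72, i.e. 72% scaling
print(round(sli_scaling(50.0, 100.0)))  # 100, i.e. perfect 2-way scaling
```

Anything approaching 100% means the second card is pulling its full weight.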
Battlefield 3 (BF3) is a first-person shooter video game developed by EA Digital Illusions CE and published by Electronic Arts. The game was released in North America on October 25, 2011 and in Europe on October 28, 2011. It does not support versions of Windows prior to Windows Vista as the game only supports DirectX 10 and 11. It is a direct sequel to 2005's Battlefield 2, and the eleventh installment in the Battlefield franchise. The game sold 5 million copies in its first week of release and the PC download is exclusive to EA's Origin platform, through which PC users also authenticate when connecting to the game.
Battlefield 3 debuts the new Frostbite 2 engine. This updated Frostbite engine can realistically portray the destruction of buildings and scenery to a greater extent than previous versions. Unlike previous iterations, the new version can also support dense urban areas. Battlefield 3 uses a new type of character animation technology called ANT. ANT technology is used in EA Sports games, such as FIFA, but for Battlefield 3 it has been adapted to create a more realistic soldier, with the ability to transition into cover and turn the head before the body.
This game looks great and we tested with the highest settings possible. This means we used 'ultra' settings and really punished the cards being tested. We ran FRAPS for two minutes on the single player map called 'Rock and a Hard Place' for benchmarking.
Benchmark Results: The NVIDIA GeForce GTX 680 SLI setup performed as expected: at 1920x1080 we saw a very nice 84% performance improvement, and on a 30-inch monitor at 2560x1600 we saw an 88.5% increase in the frame rate! Battlefield 3 performance was excellent in SLI and the game ran great with ultra image quality settings on our massive 30-inch display.
Deus Ex: Human Revolution
Deus Ex: Human Revolution is the third game in the Deus Ex first-person role-playing video game series, and a prequel to the original game. Announced on May 27, 2007, Human Revolution was developed by Eidos Montreal and published by Square Enix. It was released in August 2011. Human Revolution contains elements of first-person shooters and role-playing games, set in a near-future where corporations have extended their influence past the reach of global governments. The game follows Adam Jensen, the security chief for one of the game's most powerful corporations, Sarif Industries. After a devastating attack on Sarif's headquarters, Adam is forced to undergo radical surgeries that fuse his body with mechanical augmentations, and he is embroiled in the search for those responsible for the attack.
Deus Ex: Human Revolution uses a modified version of Crystal Dynamics' Crystal Engine, which some of you might know as the game engine from the last Tomb Raider title. The game developers made some rather hefty modifications to this engine though, as the graphics are superb in this title.
Dirt 3 (stylized DiRT 3) is a rallying video game, the third in the Dirt series of the Colin McRae Rally franchise, developed and published by Codemasters. However, the "Colin McRae" tag has been completely removed from this iteration. The game was released in Europe and North America on 24 May 2011.
Dirt3 uses Ego 2.0 Game Technology Engine (more commonly referred to as Ego Engine or EGO, stylised ego), which is a video game engine developed by Codemasters. Ego is a modified version of the Neon game engine that was used in Colin McRae: Dirt and was developed by Codemasters and Sony Computer Entertainment using Sony Computer Entertainment's PhyreEngine cross-platform graphics engine. The Ego engine was developed to render more detailed damage and physics as well as render large-scale environments.
Aerial warfare has evolved. So have you. As a member of the ultra-secret H.A.W.X. 2 squadron, you are one of the chosen few, one of the truly elite. You will use finely honed reflexes, bleeding-edge technology and ultra-sophisticated aircraft - their existence denied by many governments - to dominate the skies. You will do so by mastering every nuance of the world's finest combat aircraft. You will slip into enemy territory undetected, deliver a crippling blow and escape before he can summon a response. You will use your superior technology to decimate the enemy from afar, then draw him in close for a pulse-pounding dogfight. And you will use your steel nerve to successfully execute night raids, aerial refueling and more. You will do all this with professionalism, skill and consummate lethality. Because you are a member of H.A.W.X. 2 and you are one of the finest military aviators the world has ever known. H.A.W.X. 2 was released on November 16, 2010 for PC gamers.
We ran the benchmark in DX11 mode with the image quality settings cranked up as you can see above.
The H.A.W.X. 2 PC game title runs on what looks like seven threads as you can see from the task manager shot seen above that was taken on the test system running the Intel Core i7-3960X processor.
Benchmark Results: HAWX 2 is very playable on the latest generation graphics cards, so running this game title in SLI certainly isn't needed. We still found decent scaling here though, as at 2560x1600 we saw an 86.3% performance increase with the second GeForce GTX 680 video card!
Just Cause 2
Just Cause 2 is a sandbox-style action video game developed by Swedish developer Avalanche Studios and Eidos Interactive, and published by Square Enix. It is the sequel to the 2006 video game Just Cause.
Just Cause 2 employs a new version of the Avalanche Engine, Avalanche Engine 2.0, an updated version of the engine used in Just Cause. The game is set on the other side of the world from Just Cause, on the fictional tropical island of Panau in Southeast Asia. Rico Rodriguez returns as the protagonist, aiming to overthrow the evil dictator Pandak "Baby" Panay and confront his former boss, Tom Sheldon.
Metro 2033 is an action-oriented video game with a combination of survival horror and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in Ukraine. The game is played from the perspective of a character named Artyom. The story takes place in post-apocalyptic Moscow, mostly inside the metro station where the player's character was raised (he was born before the war, in an unharmed city), but occasionally the player has to go above ground on certain missions and scavenge for valuables.
This is another extremely demanding game. Image quality settings were raised to Very High quality with 4x AA and 16x AF. We turned off PhysX, but turned on DOF (Depth of Field) for benchmarking.
Benchmark Results: Metro 2033 is very tough on graphics cards with all the eye candy on, but we cranked it up and found that SLI is the way to go for this game title! At 1920x1080 we found a very nice 74.4% performance improvement and at 2560x1600 it was 77.6%.
3DMark 11 is the latest version of the world’s most popular benchmark for measuring the 3D graphics performance of gaming PCs. 3DMark 11 uses a native DirectX 11 engine designed to make extensive use of all the new features in DirectX 11, including tessellation, compute shaders and multi-threading.
Since Futuremark recently released 3DMark11, we decided to run the benchmark at both the performance and extreme presets to see how our hardware would run.
3DMark11 Performance Benchmark Results:
Benchmark Results: Futuremark 3DMark11 with the performance preset showed that the MSI R7970 Lightning scored P8898, the NVIDIA GeForce GTX 680 reference card scored P9767 3DMarks and the GeForce GTX 680 SLI setup reached P16041. We are very happy to see this setup breaking the 16,000 mark!
3DMark11 Extreme Benchmark Results:
Benchmark Results: After running Futuremark 3DMark11 with the extreme preset we found the MSI R7970 Lightning scored X3033, the GeForce GTX 680 scored X3237 and the GeForce GTX 680 SLI setup reached X6089. This is an 88% performance jump thanks to NVIDIA SLI alone.
Unigine Heaven 3.0
The 'Heaven' benchmark, which uses the Unigine engine, easily shows off the full potential of DirectX 11 graphics cards. It reveals the enchanting magic of floating islands with a tiny village hidden in the cloudy skies. With the interactive mode, the experience of exploring this intricate world is within reach. Through its advanced renderer, Unigine is one of the first to set precedence in showcasing art assets with tessellation, bringing compelling visual finesse and exhibiting the possibilities of enriching 3D gaming. The distinguishing feature of the benchmark is hardware tessellation, a scalable technology aimed at automatic subdivision of polygons into smaller and finer pieces, so that developers can give their games a more detailed look almost free of charge in terms of performance. Thanks to this procedure, the elaboration of the rendered image approaches the boundary of veridical visual perception: a virtual reality conjured by your hand.
For this benchmark we used Heaven DX11 Benchmark Version 3.0, which just came out on March 7th, 2012. We haven't run this benchmark in over a year, so it will be interesting to see how this new generation of video cards handles it. We wanted to see how the cards would do with mild settings, so we disabled AA and AF and set the tessellation to moderate. We ran the benchmark at 2560x1600, 1920x1080 and 1280x1024 to see how the cards would perform across a range of resolutions.
With moderate tessellation enabled and AA and AF disabled, the benchmark results were very close between the AMD and NVIDIA flagship cards. The NVIDIA GeForce GTX 680 SLI setup saw some very nice performance gains in this benchmark; for example, we saw an 81% performance increase at 2560x1600 on our 30-inch monitor.
Many feel that tessellation is one of the most important features of game titles today and in the future, so we set tessellation to extreme and maxed out the anti-aliasing at 8x and the anisotropic filtering at 16x. These are as high as you can set the tessellation and image quality settings in Heaven DX11 Benchmark v3.0, so it will really punish these cards.
With the tessellation set to extreme and the image quality set to 8xAA and 16xAF the performance numbers were down, but the benchmark was able to run smoothly without too many stutters on the GeForce GTX 680 SLI setup. At 2560x1600 we found a 90.8% increase in performance and at 1920x1080 we saw a very nice 66.3% increase.
For testing power consumption, we took our test system and plugged it into a Kill-A-Watt power meter. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers we measured the peak wattage used by the system while running the OpenGL benchmark FurMark 1.9.2 at 640x480 resolution. We also ran Battlefield 3, 3DMark11 and Deus Ex, recorded the highest wattage seen on the meter in each, and averaged those peak readings for the gaming results.
The NVIDIA GeForce GTX 680 is NVIDIA's very first 28nm desktop graphics card and it has been designed to be very power efficient. The maximum board power is 195 Watts and the heat output of the GTX 680 is 660 BTU/hr. This is a pretty big deal, as this card has a lower power rating than the NVIDIA GeForce GTX 580 and AMD Radeon HD 7970; those cards are both rated at 250 Watts! The good news is that the NVIDIA GeForce GTX 680 has reduced power supply requirements!
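That BTU figure lines up with the board power once you apply the standard conversion of roughly 3.412 BTU/hr per watt; a quick sanity-check sketch:

```python
BTU_PER_HR_PER_WATT = 3.412  # standard watts-to-BTU/hr conversion factor

def watts_to_btu_hr(watts):
    """Heat output in BTU/hr for a sustained power draw in watts."""
    return watts * BTU_PER_HR_PER_WATT

print(round(watts_to_btu_hr(195)))  # ~665 BTU/hr, right around NVIDIA's quoted figure
```

The same math puts a 250 Watt card like the GTX 580 at roughly 853 BTU/hr, which shows just how much less heat the GTX 680 dumps into your case.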
Power Consumption Results: Our testing results show that the NVIDIA GeForce GTX 680 SLI setup uses just shy of 600 Watts in our test system during Furmark testing and averaged under 550 Watts in the games. We saw power numbers all over the place in the games due to the dynamic clocks of this card, which is one of the reasons we are now averaging a number of tests rather than relying on just one. As for idle power use, a single GeForce GTX 680 uses 98 Watts at idle, while the GeForce GTX 680 SLI setup uses 116 Watts.
Temperature & Noise Testing
Since video card temperatures and the heat generated by next-generation cards have become an area of concern among enthusiasts and gamers, we want to take a closer look at how the graphics cards do at idle, during gaming and finally under a full load.
Temperatures significantly increase when running SLI due to the airflow around the cards, and the inner card usually runs hotter since it is stuck between the other card and the processor area. The inner card idled at 47C and the outer card at 40C on our open-air test bench. Notice that the fan speeds were different due to the different temperatures of the cards: the inner card's fan was running at 1200RPM, while the outer card's was running at 1000RPM. Running SLI does make for a hotter and louder system!
When running Furmark with the GeForce GTX 680's in SLI we saw the cards reach 82C and 86C, respectively. Both cards were at 98% GPU load, but notice the fan speed difference again. One is running 2940 RPM and the other is running 2400 RPM. This 540 RPM difference is due to the 4C temperature difference on the cores as that is how the fan profiles are programmed.
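That behavior is what you'd expect from a temperature-driven fan curve. The profile below is a hypothetical linear curve of our own making (not NVIDIA's actual firmware table), just to illustrate how a few degrees of core-temperature difference maps to different fan speeds:

```python
def fan_rpm(core_temp_c, min_rpm=1000, min_temp=40, max_rpm=3000, max_temp=90):
    """Hypothetical linear fan curve; illustrative only, not the real GTX 680 profile."""
    if core_temp_c <= min_temp:
        return min_rpm
    if core_temp_c >= max_temp:
        return max_rpm
    # Interpolate linearly between the idle and full-load endpoints.
    frac = (core_temp_c - min_temp) / (max_temp - min_temp)
    return min_rpm + frac * (max_rpm - min_rpm)

# The hotter inner card spins its fan faster than the cooler outer card:
print(fan_rpm(86), fan_rpm(82))  # the 4C delta yields a higher RPM on this curve
```

Real fan profiles are usually stepped or hysteresis-based rather than a straight line, but the principle is the same: hotter core, faster fan.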
After Furmark was turned off, you can see that the cards' temperatures recovered quickly.
Using the peak temperatures from the GeForce GTX 680 SLI setup, you can see that it does run hotter than a single card. We were running 12C warmer at idle, but only 5C hotter at full load in gaming and Furmark.
We recently upgraded our sound meter to an Extech sound level meter with ±1.5dB accuracy that meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 38dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data.
The NVIDIA GeForce GTX 680 SLI setup put out more heat, and as a result the fans had to run faster to keep up. We found a 2.8dBA increase in noise at idle, and during gaming it jumped to a 5.6dBA increase. As you can see, you get increased noise and temperatures with SLI, but you likely already knew that.
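Since the decibel scale is logarithmic, those deltas are bigger than they look. A short sketch converting a dB(A) increase into the equivalent sound-power ratio (standard acoustics math, not a reading from our meter):

```python
def db_delta_to_power_ratio(delta_db):
    """Sound-power ratio corresponding to a given dB increase."""
    return 10 ** (delta_db / 10.0)

print(round(db_delta_to_power_ratio(2.8), 2))  # idle: roughly 1.9x the sound power
print(round(db_delta_to_power_ratio(5.6), 2))  # gaming: roughly 3.6x the sound power
```

In other words, the gaming-load SLI setup is radiating well over three times the sound power of a single card, even though 5.6dBA sounds like a small number.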
Final Thoughts and Conclusions
When it comes to the performance results, the NVIDIA GeForce GTX 680 SLI setup performed as expected and we were impressed. Our performance numbers significantly improved and our gaming experience was better! Across the nine benchmarks that we used, we found that adding a second GeForce GTX 680 for SLI improved performance on average by 64% at a resolution of 1920x1080 and 84% at 2560x1600. All of our tests showed a larger benefit from running in SLI at 2560x1600, so if you are gaming on a 30-inch panel you will get bigger performance gains than someone on a smaller 21.5-24" panel. Performance scaling was fairly solid and we didn't run into any gaming issues on this multi-GPU setup, which is impressive seeing how this is the first public driver for the new Kepler core architecture! Then again, this is a $1000 setup, so it had better work right!
The only real negatives of the NVIDIA GeForce GTX 680 SLI setup would be the increases in power use, heat and noise. The noise levels remain acceptable though, so we don't believe the extra noise alone will discourage anyone from buying this setup. It should be noted that we did space out our cards to leave a gap between them, so if you place them directly next to each other you might see even higher temperatures and noise levels. NVIDIA has done a marvelous job with their GPU cooler and fan profiles though, so this isn't like running a pair of GeForce GTX 285's or GTX 480's in SLI, if you happened to run one of those setups at some point. Still, if you plan on running a pair of GeForce GTX 680's in SLI you will want to run them as far apart as possible, and for that you should get a flexible SLI bridge.
We hope that NVIDIA looks into a solution comparable to AMD's ZeroCore Power feature, introduced with the Radeon HD 7000 series. It would be nice to totally shut down the secondary card when you aren't gaming or otherwise in need of it, for increased power savings and reduced noise levels. That is the only gripe we had when using the GeForce GTX 680 SLI setup!
Legit Bottom Line: Running a pair of NVIDIA GeForce GTX 680's in 2-way SLI took our Ultra HD gaming on the Dell 30-inch display to the next level by running everything at acceptable frame rates!