NVIDIA Brings Titanium Back!

This morning the video card market got a little more crowded as NVIDIA announced the GeForce GTX 560 Ti and AMD released a variant of the Radeon HD 6950 with 1GB of memory on it. Over the past week we've had a chance to test both of these cards, which are aimed at gamers with around $250 to spend.

NVIDIA GeForce GTX 560 Ti Video Card

If you have been a gamer for many years you might remember the GeForce4 Ti video card series from yesteryear, and it looks like NVIDIA has resurrected the Ti moniker for the GeForce GTX 560 series. Ti stands for Titanium, and it had better mean that the GeForce GTX 560 Ti is going to destroy AMD's offerings, as the Ti branding was last used back when NVIDIA dominated ATI's (now AMD's) lineup.

NVIDIA GeForce GTX 560 Ti Video Card

The NVIDIA GeForce GTX 460 uses the GF104 core, and NVIDIA took that core back to the drawing board and improved it the best they could; the result is the GF114 core used on the GeForce GTX 560 Ti. The tweaked chip now ships with all of its cores unlocked, so the CUDA core count went up from 336 on the GTX 460 to 384 on the GTX 560 Ti. NVIDIA was also able to increase the clock speeds on the GeForce GTX 560 Ti thanks to the improved core design. The graphics clock has gone from 675MHz up to 822MHz! The CUDA cores that used to run at 1350MHz are now greater in number and running at 1645MHz. Even the 1GB of GDDR5 memory got bumped up to 1002MHz from 900MHz! All these extra CUDA cores and higher clock speeds did increase the board's maximum power draw to 170W, but that is just 10W higher than the GTX 460.
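
If you want a quick feel for what those spec bumps add up to, here's a rough back-of-the-envelope sketch. It simply assumes shader throughput scales with CUDA cores times shader clock, which is only a first-order approximation and not an official NVIDIA number.

```python
# Rough comparison of the GTX 460 1GB and GTX 560 Ti reference specs quoted above.
# Shader throughput is approximated as CUDA cores x shader clock; this ignores the
# architectural tweaks NVIDIA made in GF114, so treat it as an estimate only.

gtx_460   = {"cores": 336, "shader_mhz": 1350, "core_mhz": 675, "tdp_w": 160}
gtx_560ti = {"cores": 384, "shader_mhz": 1645, "core_mhz": 822, "tdp_w": 170}

def pct_gain(new, old):
    """Percentage increase of new over old."""
    return (new - old) / old * 100

throughput_gain = pct_gain(gtx_560ti["cores"] * gtx_560ti["shader_mhz"],
                           gtx_460["cores"] * gtx_460["shader_mhz"])
core_clock_gain = pct_gain(gtx_560ti["core_mhz"], gtx_460["core_mhz"])
tdp_gain        = pct_gain(gtx_560ti["tdp_w"], gtx_460["tdp_w"])

print(f"Shader throughput: +{throughput_gain:.0f}%")   # ~+39%
print(f"Core clock:        +{core_clock_gain:.0f}%")   # ~+22%
print(f"Board power:       +{tdp_gain:.0f}%")          # ~+6%
```

The naive estimate lands a bit above NVIDIA's own 33% performance claim, which is expected since real games never scale perfectly with raw shader throughput.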

NVIDIA GeForce GTX 560 Ti Video Card

NVIDIA claims that the GeForce GTX 560 Ti reference board design has essentially been overbuilt. Compared to the original GTX 460 board, they've added more robust 4-phase power circuitry, faster 5Gbps memory modules, and improved cooling. The new cooler features one additional copper heatpipe and a larger heatsink and cooling fan. Finally, a baseplate was added to cool the graphics memory and the GPU’s power circuitry. To protect the graphics card and system from issues caused by excessive power draw, the GTX 560 Ti reference board features the same power monitoring hardware first introduced last year with the GeForce GTX 580.

NVIDIA GeForce GTX 560 Ti Video Card

NVIDIA says that the GTX 560 Ti should be 33% faster than the GTX 460 1GB and if that is the case we won't be upset with the 10 Watt power draw increase!

NVIDIA GeForce GTX 560 Ti Video Card

NVIDIA is hoping that the GeForce GTX 560 Ti will help get gamers to update their systems if they haven't done so in a number of years.  The GeForce 8800 GT was a very popular card back in 2007, and NVIDIA thinks these are the customers that should upgrade to this new card. They are claiming that you'll see about a 3x performance increase by doing so, and both cards cost roughly $250 at launch.

AMD Radeon HD 6950 1GB Video Card

AMD countered this product launch this morning by releasing a Radeon HD 6950 1GB video card at $259.99.  This card is nothing more than a Radeon HD 6950 2GB that has had its memory reduced by half.

Let's move on to taking a look at these cards and then looking at the performance!

A Closer Look At The GeForce GTX560 Ti

NVIDIA GeForce GTX 560 Ti Video Card

The PCB on the reference NVIDIA GeForce GTX 560 Ti video card measures 9.0 inches in length and the card stands 4.376 inches tall. The card uses a dual-slot cooler for maximum cooling performance.

NVIDIA GeForce GTX 560 Ti Video Card Back

Flipping the NVIDIA GeForce GTX 560 Ti video card over, we don't find too many interesting things. Notice that along the top edge of the GeForce GTX 560 Ti a single SLI connector is placed on the card.  This means that two-way SLI and 3D Vision Surround configurations are supported, but triple-SLI will never be possible.

NVIDIA GeForce GTX 560 Ti Video Card Power Connector

The NVIDIA GeForce GTX 560 Ti video card requires a 500 Watt or greater power supply with a minimum of 30 Amps on the +12 volt rail. It also requires that the power supply have two 6-pin PCI Express power connectors for proper connection. It should be noted that the NVIDIA minimum system power requirement is based on a PC configured with an Intel Core i7 3.2GHz CPU. If you want to run SLI we are not sure of the exact specifications, but an educated guess would be that a 550 Watt or greater power supply would be needed.
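
If you want a quick way to sanity check your own power supply against NVIDIA's stated minimums, a trivial sketch like the one below will do; the helper function and the example PSU figures are just ours for illustration, not an NVIDIA tool.

```python
# Checks a PSU against NVIDIA's stated minimums for a single GTX 560 Ti:
# a 500W (or greater) unit with at least 30A on the +12V rail and two
# 6-pin PCIe connectors. The example values below are placeholders;
# substitute the label ratings from your own power supply.

def meets_single_card_requirements(psu_watts, psu_12v_amps, pcie_6pin_connectors):
    """Return True if the PSU meets NVIDIA's minimum spec for one GTX 560 Ti."""
    return (psu_watts >= 500
            and psu_12v_amps >= 30          # 30A x 12V = 360W available on +12V
            and pcie_6pin_connectors >= 2)

# Example: a hypothetical 550W unit rated for 40A on +12V with four 6-pin plugs
print(meets_single_card_requirements(550, 40, 4))  # True
```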

In this picture you can see the center mounted 75mm fan that NVIDIA is using to keep the GTX 560 Ti cool.  Notice that the cooling fan used on the GeForce GTX 560 Ti extends past the plastic housing.  We were told it was designed this way to help pull in cooler air from outside the fan shroud. This is a very interesting design and not one that we have seen on any reference card in the past decade. Under the 75mm cooling fan, NVIDIA is using a radial curved bifurcated fin heatsink design with three heat-pipes to keep the card nice and cool.

NVIDIA GeForce GTX 560 Ti Video Card Back

The NVIDIA GeForce GTX 560 Ti GDDR5 graphics card has a pair of dual-link DVI-I outputs along with a mini-HDMI output. The NVIDIA GeForce GTX 560 Ti features enhanced audio support over HDMI; this includes bitstreaming support for both Dolby TrueHD and DTS-HD Master Audio.

NVIDIA GeForce GTX 560 Ti Video Card Back
Here is a closer look at the GeForce GTX 560 Ti (top) compared to a GeForce GTX 460 1GB (bottom). If you recall, at the start of this page we told you the GeForce GTX 560 Ti has a 9" long PCB; the GeForce GTX 460 was just 8.25" long. We were told by NVIDIA that they increased the length of the PCB to allow a cleaner layout and to fit a better GPU cooler on the card, as the TDP is higher on this new card.

A Closer Look At The Radeon HD 6950 1GB

The Radeon HD 6950 2GB GDDR5 came out last month and the Radeon HD 6950 1GB is the same exact card with half the amount of memory on it. The clock speeds remain the same at 800MHz on the core and 1250MHz on the 1GB of GDDR5 memory.

AMD Radeon HD 6950 Video Card

Right away we noticed that the Radeon HD 6950 1GB is longer than the NVIDIA GeForce GTX 560 Ti.  We took the picture above to show you the length difference between these two popular gaming graphics cards. The AMD Radeon HD 6950 1GB is 10.5" in length and the NVIDIA GeForce GTX 560 Ti is just 9" in length. This shouldn't be an issue, but is certainly worth pointing out to our readers.

AMD Radeon HD 6950 1GB Video Card

As you can see, the Radeon HD 6900 series cards are dual-slot designs with a squirrel cage type cooling fan on it.

AMD Radeon HD 6950 1GB Video Card

The back of the Radeon HD 6950 1GB video card we were sent doesn't have a back plate on it, which is interesting as the 2GB version does.

AMD Radeon HD 6970 Video Card BIOS Switch
All of the other features of the card are identical to the AMD Radeon HD 6950 2GB, so we won't go into more detail on this card.  The picture above is of what AMD is calling a 'Dual BIOS Toggle Switch', and this is a pretty neat feature that we need to talk about.  This switch allows you to switch between two totally separate BIOS versions on the Radeon HD 6900 series graphics cards. AMD said that these boards carry two BIOS chips (EEPROMs), each wired up separately.  If you want to tweak or flash the card to a different BIOS you can always do so on setting number 1, which is unprotected and intended for user updates.  Setting number 2 is protected as the factory default and can't be flashed by end users.

Test Setup

All testing was done on a fresh install of Windows 7 Ultimate 64-bit with all the latest updates installed. All benchmarks were completed on the desktop with no other software programs running. The Kingston HyperX T1 DDR3 memory modules were run in triple-channel mode at 1866MHz with 8-8-8-24 1T timings. The ASUS P6X58D-E motherboard was run using BIOS 0502 with the processor running stock settings and Turbo enabled.

Drivers Used For Testing:

Intel Chipset INF Update Program V9.1.1.1025
Realtek Audio Driver V6.0.1.6037 for 64-bit Windows 7 (WHQL)
Marvell Yukon Gigabit Ethernet Driver V11.10.5.3 for 32/64-bit Windows 7 (WHQL)
Marvell 9128 SATA 6Gbps Controller Driver V1.0.0.1036 for 32/64-bit Windows 7

The Video Card Test System

Here is the Intel LGA 1366 test platform:

Intel Test Platform

Component           Brand/Model
Processor           Intel Core i7-970
Motherboard         ASUS P6X58D-E
Memory              6GB Kingston DDR3 1866MHz
Video Card          See Below
Hard Drive          Crucial C300 256GB SSD
Cooling             Titan Fenrir
Power Supply        Corsair HX850W
Chassis             None (Open Bench)
Operating System    Windows 7 Ultimate 64-Bit

Video Cards Tested:

AMD Radeon HD 6950 Video Card GPU-Z 0.5.0 Details:

AMD Radeon HD 6950 1GB Video Card GPU-Z 0.5.0 Details

NVIDIA GeForce GTX 560 Ti Video Card GPU-Z 0.5.0 Details:

NVIDIA GeForce GTX 560 Ti Video Card GPU-Z 0.5.0 Details

Aliens vs. Predator

Aliens vs Predator D3D11 Benchmark v1.03

Aliens vs Predator D3D11 Benchmark v1.03 is a standalone benchmark test based upon Rebellion's 2010 inter-species shooter Aliens vs. Predator. The test shows xenomorph-tastic scenes using heavy tessellation among other DX11 features.

Aliens vs Predator D3D11 Benchmark v1.03

We cranked up all the image quality settings in the benchmark to the highest level possible, so we were running 4x AA and 16x AF with SSAO enabled at both 1920x1200 and 1280x1024 on all the video cards.

Aliens Vs. Predator Benchmark Results

Benchmark Results:  The AMD Radeon HD 6950 1GB does better than the NVIDIA GeForce GTX 560 Ti in this benchmark by 5-7 FPS depending on the resolution you are gaming at.

Batman: Arkham Asylum GOTY

Batman: Arkham Asylum

Batman: Arkham Asylum is an action-adventure stealth video game based on DC Comics' Batman for PlayStation 3, Xbox 360 and Microsoft Windows. It was developed by Rocksteady Studios and published by Eidos Interactive in conjunction with Warner Bros.

Batman: Arkham Asylum

For our testing we set everything as high as it would go, including Multi-Sample Anti-Aliasing, which we set to 8x.

Batman: Arkham Asylum Benchmark Results

Benchmark Results: The NVIDIA GeForce GTX 560 Ti does slightly better in Batman: Arkham Asylum GOTY edition than the Radeon HD 6950 1GB.

Just Cause 2

Just Cause 2

Just Cause 2 is a sandbox style action video game developed by Swedish developer Avalanche Studios and Eidos Interactive, and published by Square Enix. It is the sequel to the 2006 video game Just Cause.

Just Cause 2 Game Settings

Just Cause 2 employs a new version of the Avalanche Engine, Avalanche Engine 2.0, an updated version of the engine used in Just Cause.  The game is set on the other side of the world from the original, on the fictional tropical island of Panau in Southeast Asia. Rico Rodriguez returns as the protagonist, aiming to overthrow the evil dictator Pandak "Baby" Panay and confront his former boss, Tom Sheldon.

Just Cause 2 Benchmark Results
Benchmark Results: AMD has always dominated in Just Cause 2 and the trend continues here today.

Metro 2033

Metro 2033

Metro 2033 is an action-oriented video game with a combination of survival horror and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in the Ukraine. The game is played from the perspective of a character named Artyom. The story takes place in post-apocalyptic Moscow, mostly inside the metro station where the player's character was raised (he was born before the war, in an unharmed city), but occasionally the player has to go above ground on certain missions and scavenge for valuables.

Metro 2033 Settings

This is another extremely demanding game. Settings were left at High quality with AA and AF at their lowest values (AAA and 4x AF, respectively) for each of the DirectX 9, 10 and 11 APIs. Advanced DirectX 11 settings were left at default. The section of Metro 2033 tested was the Prologue, with FRAPS polling from when you are climbing up the ladder until opening the door to exit the metro station. This section includes many features found throughout the game, including four creatures which attack you before you exit the building, dense particles, ammo in cabinets, a few computer controlled sections and, of course, Miller, your first companion.

Metro 2033

Benchmark Results: This benchmark favors NVIDIA branded cards, so the GeForce GTX 560 Ti beating the Radeon HD 6950 1GB card shouldn't be a shock to anyone.

StarCraft II: Wings of Liberty

StarCraft II: Wings of Liberty DX11 Performance Benchmark

StarCraft II: Wings of Liberty is a military science fiction real-time strategy video game developed by Blizzard Entertainment for Microsoft Windows and Mac OS X. A sequel to the award-winning 1998 video game StarCraft, the game was released worldwide on July 27, 2010. It is split into three installments: the base game with the subtitle Wings of Liberty, and two upcoming expansion packs, Heart of the Swarm and Legacy of the Void. StarCraft II: Wings of Liberty has had a successful launch, selling three million copies worldwide in less than a month.

StarCraft II: Wings of Liberty Settings

The game StarCraft II: Wings of Liberty has no internal benchmarking tools built into the game engine, so we recorded a scene, played it back at normal speed and measured performance with FRAPS. The screen capture above shows the system settings that we were using for StarCraft II. Notice we are running the graphics quality and textures set to Ultra and maxed out everything in the menu. We did not manually enable AA in the drivers, though, as we wanted to keep testing simple and consistent with the choices offered in the game's settings menu.
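
For readers who want to replicate this kind of manual FRAPS run, here is a minimal sketch of how an average FPS number can be pulled out of the frametimes CSV that FRAPS records. The exact column layout is from memory and may differ slightly between FRAPS versions, so treat the parsing details as an assumption.

```python
# Computes average FPS from a FRAPS "frametimes" CSV, which lists one row per
# frame with a cumulative timestamp in milliseconds since the benchmark started.

import csv

def average_fps(frametimes_csv):
    """Return average FPS computed from cumulative frame timestamps (ms)."""
    times_ms = []
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the "Frame, Time (ms)" header row
        for row in reader:
            if len(row) >= 2 and row[1].strip():
                times_ms.append(float(row[1]))
    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
    return (len(times_ms) - 1) / elapsed_s

# Example (hypothetical file name from a FRAPS benchmark run):
# print(average_fps("starcraft2 2011-01-25 frametimes.csv"))
```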

StarCraft II: Wings of Liberty Benchmark Results

Benchmark Results: StarCraft II: Wings of Liberty showed the Radeon HD 6950 1GB beating the GeForce GTX 560 Ti at the higher resolution, but losing at the lower one.

S.T.A.L.K.E.R.: Call of Pripyat

Stalker Call of Pripyat DX11 Performance Benchmark

The events of S.T.A.L.K.E.R.: Call of Pripyat unfold shortly after the end of S.T.A.L.K.E.R.: Shadow of Chernobyl following the ending in which Strelok destroys the C-Consciousness. Having discovered the open path to the Zone's center, the government decides to stage a large-scale operation to take control of the Chernobyl nuclear plant.

S.T.A.L.K.E.R.: Call of Pripyat utilizes the XRAY 1.6 Engine, allowing advanced modern graphical features through the use of DirectX 11 to be fully integrated; one outstanding feature being the inclusion of real-time GPU tessellation. Regions and maps feature photo realistic scenes of the region it is made to represent. There is also extensive support for older versions of DirectX, meaning that Call of Pripyat is also compatible with older DirectX 8, 9, 10 and 10.1 graphics cards.

Stalker Call of Pripyat DX11 Performance Benchmark

The game S.T.A.L.K.E.R.: CoP has no internal benchmarking tools built into the game engine, but they do have a standalone benchmark available that we used for our testing purposes. The screen capture above shows the main window of the benchmark with our settings. Notice we are running Enhanced Full Dynamic Lighting "DX11" as our renderer.

Stalker Call of Pripyat Advanced Image Quality Settings

Under the advanced settings we enabled tessellation and 4x MSAA. We didn't enable ambient occlusion as we wanted to use these test settings for mainstream cards down the road and these settings should be tough enough to stress any and all DX11 enabled video cards.

Stalker Call of Pripyat Benchmark Results

Benchmark Results: S.T.A.L.K.E.R.: Call of Pripyat showed that the Radeon HD 6950 1GB was faster than the GeForce GTX 560 Ti at 1920x1200, but slower at 1280x1024.

H.A.W.X. 2 Benchmark

We wanted to include a new benchmark for this review, so Tom Clancy's H.A.W.X. 2 was added in to see how it looked. This benchmark got some attention recently over claims that it isn't vendor neutral, as it doesn't use code optimized for better tessellation performance on AMD cards, but that doesn't matter too much to us as we are running both the AMD and NVIDIA cards through it with identical settings.

Tom Clancy's HAWX 2

Aerial warfare has evolved. So have you. As a member of the ultra-secret H.A.W.X. 2 squadron, you are one of the chosen few, one of the truly elite. You will use finely honed reflexes, bleeding-edge technology and ultra-sophisticated aircraft - their existence denied by many governments - to dominate the skies. You will do so by mastering every nuance of the world's finest combat aircraft. You will slip into enemy territory undetected, deliver a crippling blow and escape before he can summon a response. You will use your superior technology to decimate the enemy from afar, then draw him in close for a pulse-pounding dogfight. And you will use your steel nerve to successfully execute night raids, aerial refueling and more. You will do all this with professionalism, skill and consummate lethality. Because you are a member of H.A.W.X. 2 and you are one of the finest military aviators the world has ever known. H.A.W.X. 2 came out on November 16, 2010 for PC gamers.

Tom Clancy's HAWX 2

We ran the benchmark in DX11 mode and cranked up all the Antialiasing and Advanced image quality settings. We also enabled hardware tessellation, as without that setting turned on the cards were getting well over 120FPS at a resolution of 1920x1200 and over 160FPS at 1280x1024. We wanted to stress the cards a bit, and enabling tessellation appeared to do the trick as you'll see below.

Tom Clancy's HAWX 2 Benchmark Results

Benchmark Results: HAWX 2 shows the Radeon HD 6950 1GB was slower than the GeForce GTX 560 Ti series reference video card.

3DMark Vantage

3DMark Vantage

3DMark Vantage is the new industry standard PC gaming performance benchmark from Futuremark, newly designed for Windows Vista and DirectX 10. It includes two new graphics tests, two new CPU tests, several new feature tests, and support for the latest hardware. 3DMark Vantage is based on a completely new rendering engine, developed specifically to take full advantage of DirectX 10, the new graphics API from Microsoft.

3DMark Vantage

The Extreme preset was used for testing, which means the benchmark ran at a resolution of 1920x1200.

3DMark Vantage Benchmark Results

Benchmark Results: 3DMark Vantage showed that the AMD Radeon HD 6950 1GB video card with the 11.1a hotfix drivers does really well in 3DMark Vantage.  In fact, we were shocked to see it beating out the AMD Radeon HD 6950 2GB with 10.11 drivers so badly. It looks like we need to go back through and re-test all of the AMD video cards, but at nearly 8 hours a card that would take weeks and we just don't have time for that right now.  The NVIDIA GeForce GTX 560 Ti does very well in 3DMark Vantage and was at roughly the same performance level as the GeForce GTX 480, which was NVIDIA's flagship graphics card in the first half of 2010.

3DMark 11

Futuremark 3DMark 11 Benchmark

3DMark 11 is the latest version of the world’s most popular benchmark for measuring the 3D graphics performance of gaming PCs. 3DMark 11 uses a native DirectX 11 engine designed to make extensive use of all the new features in DirectX 11, including tessellation, compute shaders and multi-threading.

Futuremark 3DMark 11 Benchmark Settings

Since Futuremark has recently released 3DMark 11, we decided to run the benchmark at both the Performance and Extreme presets to see how our hardware would run.

3DMark 11 Extreme Benchmark Results:

Futuremark 3DMark 11 Benchmark Results

3DMark 11 Performance Benchmark Results:

Futuremark 3DMark 11 Benchmark Results

We are still running 3DMark 11 on several video cards, but for now we just wanted to show you how the AMD Radeon HD 6950 1GB and NVIDIA GeForce GTX 560 Ti do on this benchmark.  As you can see, the GeForce GTX 560 Ti trails the Radeon HD 6950 by a small margin.

Unigine 'Heaven' DX11

Unigine DirectX 11 benchmark Heaven

The 'Heaven' benchmark, which uses the Unigine engine, easily shows off the full potential of DirectX 11 graphics cards. It reveals the enchanting magic of floating islands with a tiny village hidden in the cloudy skies. With the interactive mode emerging, experience of exploring the intricate world is within reach. Through its advanced renderer, Unigine is one of the first to set precedence in showcasing the art assets with tessellation, bringing compelling visual finesse, utilizing the technology to the full extent and exhibiting the possibilities of enriching 3D gaming. The distinguishing feature of the benchmark is hardware tessellation, a scalable technology aimed at automatic subdivision of polygons into smaller and finer pieces so that developers can gain a more detailed look for their games almost free of charge in terms of performance.

DirectX 11 benchmark Unigine engine

We ran the Heaven v2.1 benchmark, which was just recently released, with VSync disabled but with 8x AA and 16x AF enabled to check out system performance. We ran the benchmark at 1920x1200 and 1280x1024 to see how it ran at two different monitor resolutions. It should be noted that we ran the new extreme tessellation mode on this benchmark.  These are the toughest settings that you can run on this benchmark, so it should really put the hurt on any graphics card.

Unigine Heaven Benchmark

Benchmark Results: The AMD Radeon HD 6950 with its 2GB of GDDR5 memory was faster than the Radeon HD 6950 1GB in this benchmark. The real winner here is the NVIDIA GeForce GTX 560 Ti, though, as it was hands down faster than the 6950 1GB.

FurMark 1.8.2

FurMark 1.8.2

FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that's why FurMark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card.

FurMark 1.8.2

The benchmark was rendered in full screen mode with no AA enabled on both video cards.

Furmark Benchmark Results

Benchmark Results: Furmark doesn't play nicely on the AMD Radeon HD 6900 series, as the benchmark hits the cards' max TDP and AMD PowerTune kicks in and tones the core clocks back to reduce the power draw/temperature on the GPU. Both AMD's Radeon HD 6900 series and NVIDIA's GeForce GTX 500 series have new power management features to help protect consumers and their board partners from failed graphics cards.  It looks like one too many users killed their video cards running Furmark or OCCT, and to help reduce the number of RMAs both companies have measures in place to keep these programs from burning up a card.

Temperature Testing

Since video card temperatures and the heat generated by next-generation cards have become an area of concern among enthusiasts and gamers, we want to take a closer look at how the graphics cards do at idle and under a full load.

AMD Radeon HD 6950 1GB Video Card Idle Temperature:

AMD Radeon HD 6950 1GB Video Card GPU-Z 0.5.0 Details

NVIDIA GeForce GTX 560 Ti Video Card Idle Temperature:

NVIDIA GeForce GTX 560 Ti Video Card GPU-Z 0.5.0 Details

As you can see from the screen shot above, the idle state of the Radeon HD 6900 series drops the GPU core clock down to 250MHz and the memory clock down to 150MHz to help conserve power and lower temperatures.  At idle on an open test bench the Radeon HD 6950 video card temperature was observed at 49C. The NVIDIA GeForce GTX 560 Ti has even lower idle clocks of 50.6MHz on the GPU core and 135MHz on the memory. These clocks combined with the NVIDIA designed cooler gave us an idle temperature of just 27C.

We fired up FurMark and ran the stability test at 640x480, which was enough to put the GPU core at 100% load in order to get the highest load temperature possible. This application also charts the temperature results so you can see how the temperature rises and levels off, which is very nice. The fans on the video cards were left on auto during temperature testing. When we hit the space bar to stop the rendering the temperature dropped.

AMD Radeon HD 6950 1GB Video Card Load Temperature:

AMD Radeon HD 6950 Video Card Load Temp

The AMD Radeon HD 6950 1GB peaked at 87C. It should be noted that AMD PowerTune kicks in and keeps these cards from getting any hotter. You can see the throttling taking place in Furmark if you leave GPU-Z open and watch the GPU core clock; it jumps all over the place while Furmark is running.
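
If you'd rather not sit and watch the GPU-Z window, its 'Log to file' option writes a sensor log you can parse after the run. The sketch below is only illustrative; the core clock column name is our assumption and may differ depending on your GPU-Z version.

```python
# Spot PowerTune throttling in a GPU-Z sensor log (a comma-separated text file).
# A wide spread between the minimum and maximum core clock under a steady
# FurMark load suggests the card is being throttled.

import csv

def core_clock_stats(logfile, clock_column="GPU Core Clock [MHz]"):
    """Return (min, max) core clock values found in the GPU-Z log."""
    clocks = []
    with open(logfile, newline="", encoding="utf-8", errors="ignore") as f:
        reader = csv.reader(f)
        header = [h.strip() for h in next(reader)]   # first row holds sensor names
        idx = header.index(clock_column)             # raises ValueError if missing
        for row in reader:
            if len(row) > idx and row[idx].strip():
                clocks.append(float(row[idx]))
    return min(clocks), max(clocks)

# Example: print(core_clock_stats("GPU-Z Sensor Log.txt"))
```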

NVIDIA GeForce GTX 560 Ti Video Card Load Temp

The NVIDIA GeForce GTX 560 Ti got up to 85C, so both cards got up to roughly the same temperature in this test. 

When it comes to noise levels both cards were quieter than the CPU cooler on our test system, so no complaints there.  At full load with the cards near 90C they were both audible, but not too bad.  If we had to pick a noise winner at full load it would have to be the GeForce GTX 560 Ti. The GeForce GTX 560 Ti was cooler at both idle and load!

Power Consumption

For testing power consumption, we took our test system and plugged it into a Kill-A-Watt power meter. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers we measured the peak wattage used by the system while running the OpenGL benchmark FurMark 1.8.2 at 1280x1024 resolution.

Total System Power Consumption Results

Power Consumption Results: The AMD Radeon HD 6950 1GB video card does pretty well here in the power test, but it came as a shock to us that it used slightly more power than the 2GB version that we tested last month. Every card has different GPU core leakage, so no two GPUs picked at random are exactly identical. The NVIDIA GeForce GTX 560 Ti has slightly lower power use at idle, but uses more at full load. It is interesting, though, that both of these cards ended up right next to each other in the power chart!

NVIDIA GeForce GTX 560 Ti GPU Overclocking

A performance analysis of the GeForce GTX 560 Ti video card wouldn't be complete without some overclocking results, so we got in touch with our friends over at EVGA and they said the latest build of their EVGA Precision software would work on the GeForce GTX 560 Ti reference card.


NVIDIA GeForce GTX 560 Ti Video Card GPU-Z 0.5.0 Details

Using the EVGA Precision software utility for the GeForce GTX 560 Ti graphics card is one of the easiest ways to overclock a video card and since we have only had the GTX 560 Ti for a few days it was perfect.

NVIDIA GeForce GTX 560 Ti Video Card GPU-Z 0.5.0 Details

The highest overclock that we could get on the GeForce GTX 560 Ti reference card was 960MHz on the core, 1920MHz on the shaders and 1175MHz on the 1024MB of GDDR5 memory. This overclock was stable in games and in synthetic benchmarks like 3DMark Vantage; we gamed at these settings for a few hours and feel safe calling the card 100% stable.

NVIDIA GeForce GTX 560 Ti Graphics Card at 823MHz/1645MHz/1002MHz:

NVIDIA GeForce GTX 560 Ti Video Card

NVIDIA GeForce GTX 560 Ti Graphics Card at 960MHz/1920MHz/1175MHz:

NVIDIA GeForce GTX 560 Ti Video Card

We saw 3DMark Vantage go up from X9538 to X11030, which is a 15.6% or 1492 3DMark jump in performance! Let's see what it does in real games.

NVIDIA GeForce GTX 560 Ti Graphics Card Overclocked

The jump from 823MHz to 960MHz on the GeForce GTX 560 Ti core clock helped boost performance by 17.2% in AvP at a resolution of 1920x1200.  We'll take a 15-17% performance increase from overclocking any day! If you throw a little extra voltage at this card you'll likely get a better overclock. We were able to run the card at 1GHz on some benchmarks, but sadly our reference card wouldn't run at those speeds with full stability and we had to back it down to 960MHz to make it rock solid.
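
As a rough sanity check on how well the performance gains track the clock bump, here is the arithmetic spelled out; the "scaling ratio" is just our own rule of thumb, not a vendor metric.

```python
# Compare the overclock's clock-speed increase against the measured gains quoted
# above. A ratio near 1.0 means performance scaled almost linearly with the core clock.

def pct(new, old):
    """Percentage increase of new over old."""
    return (new - old) / old * 100

core_gain    = pct(960, 823)        # core clock: 823MHz -> 960MHz, ~16.6%
vantage_gain = pct(11030, 9538)     # 3DMark Vantage Extreme: X9538 -> X11030, ~15.6%
avp_gain     = 17.2                 # AvP gain at 1920x1200, taken from the chart above

print(f"Core clock:     +{core_gain:.1f}%")
print(f"3DMark Vantage: +{vantage_gain:.1f}%  (scaling ratio {vantage_gain / core_gain:.2f})")
print(f"AvP 1920x1200:  +{avp_gain:.1f}%  (scaling ratio {avp_gain / core_gain:.2f})")
```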

AMD Radeon HD 6950 1GB GPU Overclocking

AMD Catalyst Overdrive PowerTune

To overclock the AMD Radeon HD 6950 1GB graphics card, we used the ATI Overdrive utility that is part of the CATALYST Control Center. When you 'unlock' the ATI Overdrive, you can manually set the clock and memory settings.

AMD Catalyst Overdrive PowerTune

The AMD Radeon HD 6950 starts off life at 800MHz on the core and 1250MHz on the memory.  We were able to max out AMD Overdrive at 840MHz core and 1325MHz on the memory and didn't see any issues in the games with PowerTune left alone. This is a 40MHz overclock on the core and a 75MHz boost on the 1GB worth of GDDR5 memory ICs. Not a huge overclock by any means, but we expect to see a difference in the benchmarks.

To test out the overclock we fired up 3DMark Vantage to see what this overclock was like. 

The AMD Radeon HD 6950 1GB Video Card Stock 800MHz/1250MHz:

AMD Radeon HD 6950 OC Video Card Overclocking

The AMD Radeon HD 6950 1GB Video Card Overclocked 840MHz/1325MHz:

ATI Radeon HD 6950 OC Video Card Overclocking

Running 3DMark Vantage with the Extreme preset we got a score of X10137 with the card at reference clock settings.  The score went up to X10663 when overclocked, which is an improvement of 526 points or 5.2% with this mild overclock. Let's see what it does in real games.

AMD Radeon HD 6950 1GB Graphics Card Overclocked
The jump from 800MHz to 840MHz on the AMD Radeon HD 6950 1GB video card's core clock helped boost performance by 4.7% in AvP at a resolution of 1920x1200.  Since the overclock is only a 5% increase on the core clock, a performance gain of this size was expected.

Final Thoughts and Conclusions

AMD Radeon HD 6950 1GB Video Card

At the end of the day what have we been able to figure out? Well, the take home message here is that when it comes to mainstream gaming graphics cards, the $249.99 to $279.99 price segment just got much more interesting. The AMD Radeon HD 6950 1GB video card is nothing new, to be fair. AMD simply removed half of the available memory on the existing Radeon HD 6950 2GB video card that the company released last month in order to drop the price on a card that can compete with the NVIDIA GeForce GTX 560 Ti. It's nice of them to release a new variant of an existing card, but it makes for a boring review as nothing is really new at all. What does taking away half the memory get you in terms of cost savings?  It looks like $10-$30 can be saved according to AMD, who was kind enough to send over their video card options for those looking to spend $200-$300 on a video card.

If you are gaming at an HD resolution then you'll want to get the card with twice as much memory for as little as $10 more. If you are gaming at a 1680x1050 or lower resolution the 1GB card might be all you need, but for a little more you can get twice as much memory and help future proof your system in case you get a new monitor in the months to come.

NVIDIA GeForce GTX 560 Ti Video Card

When it comes to the GeForce GTX 560 Ti video card things get a little more interesting. The update from the original GF104 Fermi core used on the GTX 460 series to the newly redesigned GF114 Fermi core on the GTX 560 Ti series didn't come as a surprise to us, but bringing back the Ti nomenclature was a shocker. The Ti in the product name stands for Titanium, which represents a card that is lighter, stronger and faster according to NVIDIA. We are going to assume that NVIDIA brought the Ti branding back to make a point, and luckily for them the GeForce GTX 560 Ti is a very nice graphics card that is lighter, faster and more overclocker friendly than its AMD nemesis.

The NVIDIA GeForce GTX 560 Ti with its 384 CUDA cores was able to perform ~30% faster than the GeForce GTX 460 with 336 CUDA cores. We saw a slight increase in power consumption, but with the additional cores and higher clock speeds we knew that would be part of the trade off to get such big performance gains.


NVIDIA GeForce GTX 560 Ti

If we had to pick one of these video cards today it is a tough call as the performance benchmarks were split for the most part. If you don't look at performance numbers, the NVIDIA GeForce GTX 560 Ti has the usual NVIDIA features like PhysX, CUDA, 3D Vision and great SLI scaling for those wanting to run multi-GPU setups. It also runs cooler at idle and load. It uses slightly less power at an idle state and, let's face it, our PCs sit at idle most of the time. The GTX 560 Ti is a small card at just 9-inches in length. This makes it 1.5-inches shorter than the Radeon HD 6950 1GB, meaning it will easily fit in your case and not block air flow. The NVIDIA GeForce GTX 560 Ti can also overclock really well and the price tag is lower. AMD isn't dead in the water, though, as they have Eyefinity, which is a must for those looking to run a triple-monitor setup and want to buy just one video card. The AMD Radeon HD 6950 1GB was also more energy efficient at load.

A week ago the NVIDIA GeForce GTX 560 Ti was set to dominate the sub $250 video card market, but AMD put an end to that with the AMD Radeon HD 6950 1GB video card.  Both cards are great and at the end of the day the gamers and enthusiasts are the winners as AMD and NVIDIA will have to be in another price battle in order to earn your hard earned money!

Legit Bottom Line: The NVIDIA GeForce GTX 560 Ti is a great video card that destroys AMD's Radeon HD 6800 series, but AMD came out with a Radeon HD 6950 variant with 1GB of memory to head them off! It's a GPU war folks!