NVIDIA Brings Titanium Back!
This morning the video card market got a little more crowded as NVIDIA announced the GeForce GTX 560 Ti and AMD released a variant of the Radeon HD 6950 with 1GB of memory. Over the past week we've had a chance to test both of these cards, which are aimed at gamers with around $250 to spend.
If you have been a gamer for many years you might remember the GeForce 4 Ti video card series from yesteryear, and it looks like NVIDIA has gone ahead and resurrected the Ti moniker for the GeForce GTX 560 series. Ti stands for Titanium, and it had better mean that the GeForce GTX 560 Ti is going to destroy AMD's offerings, as the name was last used back when NVIDIA dominated AMD's lineup.
The NVIDIA GeForce GTX 460 uses the GF104 core, and NVIDIA took that core back to the drawing board and improved it where it could. With the tweaked core they were able to unlock all the cores on the chip, so the CUDA core count went up from 336 on the GTX 460 to 384 on the GTX 560 Ti. NVIDIA was also able to increase the clock speeds on the GeForce GTX 560 Ti thanks to the refined design. The graphics clock has gone from 675MHz up to 823MHz! The CUDA cores that used to run at 1350MHz are now greater in number and running at 1645MHz. Even the 1GB of GDDR5 memory got bumped up to 1002MHz from 900MHz! All of these extra CUDA cores and higher clock speeds did increase the board's maximum power draw to 170W, but that is just 10W higher than the GTX 460.
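As a rough sanity check on those numbers, you can estimate the theoretical shader throughput gain by multiplying core count by shader clock; this is a simplification that ignores memory bandwidth, ROPs and driver differences, so treat it as a back-of-the-envelope sketch, not a performance prediction:

```python
# Rough proxy for theoretical shader throughput: CUDA core count x shader clock.
# Real-world performance also depends on memory bandwidth, ROPs, drivers, etc.
gtx460_cores, gtx460_shader_mhz = 336, 1350
gtx560ti_cores, gtx560ti_shader_mhz = 384, 1645

gtx460_rate = gtx460_cores * gtx460_shader_mhz        # 453,600
gtx560ti_rate = gtx560ti_cores * gtx560ti_shader_mhz  # 631,680

gain = (gtx560ti_rate / gtx460_rate - 1) * 100
print(f"Theoretical shader throughput gain: {gain:.1f}%")  # ~39%
```

That ~39% theoretical ceiling lines up reasonably well with NVIDIA's claim below of roughly 33% higher real-world performance.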
NVIDIA claims that the GeForce GTX 560 Ti reference board design has essentially been overbuilt. Compared to the original GTX 460 board, they've added more robust 4-phase power circuitry, faster 5Gbps memory modules, and improved cooling. The new cooler features one additional copper heatpipe and a larger heatsink and cooling fan. Finally, a baseplate was added to cool the graphics memory and the GPU’s power circuitry. To protect the graphics card and system from issues caused by excessive power draw, the GTX 560 Ti reference board features the same power monitoring hardware first introduced last year with the GeForce GTX 580.
NVIDIA says that the GTX 560 Ti should be 33% faster than the GTX 460 1GB and if that is the case we won't be upset with the 10 Watt power draw increase!
NVIDIA is hoping that the GeForce GTX 560 Ti will get gamers to update their systems if they haven't done so in a number of years. The GeForce 8800 GT was a very popular card back in 2007, and NVIDIA thinks those customers are the ones that should upgrade to this new card. The company claims you'll see about a 3x performance increase by doing so, and both cards cost roughly $250 at launch.
AMD countered this product launch this morning by releasing a Radeon HD 6950 1GB video card at $259.99. This card is nothing more than a Radeon HD 6950 2GB that has had its memory reduced by half.
Let's move on to taking a look at these cards and then looking at the performance!
A Closer Look At The GeForce GTX 560 Ti
The PCB on the reference NVIDIA GeForce GTX 560 Ti video card measures 9.0 inches in length, and the card stands 4.376 inches tall. The card uses a dual-slot cooling design for maximum cooling performance.
Flipping the NVIDIA GeForce GTX 560 Ti video card over, we don't find too many interesting things. Notice that along the top edge of the GeForce GTX 560 Ti a single SLI connector is placed on the card. This means that two-way SLI and 3D Vision Surround configurations are supported, but the card will not be able to run three-way or four-way SLI.
The NVIDIA GeForce GTX 560 Ti video card requires a 500 Watt or greater power supply with a minimum of 30 Amps on the +12 volt rail. It also requires that the power supply have two 6-pin PCI Express power connectors for proper connection. It should be noted that the NVIDIA minimum system power requirement is based on a PC configured with an Intel Core i7 3.2GHz CPU. If you want to run SLI we are not sure of the exact specification, but an educated guess would be that a 550 Watt or greater power supply is needed.
In this picture you can see the center-mounted 75mm fan that NVIDIA is using to keep the GTX 560 Ti cool. Notice that the cooling fan used on the GeForce GTX 560 Ti extends past the plastic housing. We were told it was designed this way to help pull in cooler air from outside the fan shroud. This is a very interesting design and not one that we have seen on any reference card in the past decade. Under the 75mm cooling fan, NVIDIA is using a radial curved bifurcated fin heatsink design with three heat-pipes to keep the card nice and cool.
The NVIDIA GeForce GTX 560 Ti GDDR5 graphics card has a pair of
dual-link DVI-I outputs along with a mini-HDMI output header. The NVIDIA
GeForce GTX 560 Ti features enhanced audio support over HDMI; this
includes bitstreaming support for both Dolby True HD and DTS-HD Master
Audio over HDMI.
A Closer Look At The Radeon HD 6950 1GB
The Radeon HD 6950 2GB GDDR5 came out last month, and the Radeon HD 6950 1GB is the exact same card with half the memory on it. The clock speeds remain the same: 800MHz on the core and 1250MHz on the 1GB of GDDR5 memory.
Right away we noticed that the Radeon HD 6950 1GB is longer than the NVIDIA GeForce GTX 560 Ti. We took the picture above to show you the length
difference between these two popular gaming graphics cards. The AMD Radeon HD 6950 1GB is
10.5" in length and the NVIDIA GeForce GTX 560 Ti is just 9" in length. This shouldn't be an issue, but is certainly worth
pointing out to our readers.
As you can see, the Radeon HD 6900 series cards are dual-slot designs with a squirrel cage type cooling fan on it.
The back of the Radeon HD 6950 1GB video card we were sent doesn't have a back plate on it, which is interesting as the 2GB version does.
All testing was done on a fresh install of Windows 7 Ultimate 64-bit with all the latest updates installed. All benchmarks were completed on the desktop with no other software programs running. The Kingston HyperX T1 DDR3 memory modules were run in triple-channel mode at 1866MHz with 8-8-8-24 1T timings. The ASUS P6X58D-E motherboard was run using BIOS 0502 with the processor running stock settings and Turbo enabled.
Drivers Used For Testing:
- The ATI Radeon HD 5000 series cards were all tested using CATALYST 10.10 drivers
- The AMD Radeon HD 6000 series cards were tested using CATALYST 10.11 drivers
- The AMD Radeon HD 6950 1GB was tested using CATALYST 11.1a Hotfix drivers
- All of the NVIDIA video cards were tested with Forceware 260.89 WHQL drivers, with the following exceptions:
- The GeForce GTX 480/GTX 570/GTX 580 were tested with 263.09 WHQL drivers
- The GeForce GTX 560 Ti was tested with Forceware 266.56 drivers
Windows 7 Drivers Used:
Intel Chipset Inf Update Program V18.104.22.1685
Realtek Audio Driver V22.214.171.12437 for 64-bit Windows 7 (WHQL)
Marvell Yukon Gigabit Ethernet Driver V126.96.36.199 for 32/64-bit Windows 7 (WHQL)
Marvell 9128 SATA 6Gbps Controller Driver V188.8.131.526 for 32/64-bit Windows 7
Here is the Intel LGA 1366 Test platform:
|Intel Test Platform|
|Processor|Intel Core i7-970|
|Memory|6GB Kingston DDR3 1866MHz|
|Storage|Crucial C300 256GB SSD|
|Case|None (Open Bench)|
|Operating System|Windows 7 Ultimate 64-Bit|
Video Cards Tested:
- ASUS GeForce GTX 580 - 782MHz Core/1564MHz Shader/1002MHz Memory
- NVIDIA GeForce GTX 580 - 772MHz Core/1544MHz Shader/1002MHz Memory
- NVIDIA GeForce GTX 570 - 732MHz Core/1464MHz Shader/950MHz Memory
- NVIDIA GeForce GTX 560 Ti - 823MHz Core/1645MHz Shader/1002MHz Memory
- Galaxy GeForce GTX 480 - 700MHz Core/1401MHz Shader/924MHz Memory
- ASUS ENGTX470 GeForce GTX 470 - 608MHz Core/1215MHz Shader/837MHz Memory
- EVGA GeForce GTX 460 1GB FTW - 850MHz Core/1700MHz Shader/1000MHz Memory
- EVGA GeForce GTX 460 768MB SuperClocked - 763MHz Core/1526MHz Shader/950MHz Memory
- NVIDIA GeForce GTS 450 1GB - 783MHz Core/1566MHz Shader/902MHz Memory
- EVGA GeForce GTS 450 1GB FTW - 920MHz Core/1840MHz Shader/1026MHz Memory
- ASUS GeForce GT 430 1GB DDR3 - 700MHz Core/1400MHz Shader/800MHz Memory
- AMD Radeon HD 6970 - 880MHz Core / 1375MHz Memory
- AMD Radeon HD 6950 2GB - 800MHz Core / 1250MHz Memory
- AMD Radeon HD 6950 1GB - 800MHz Core / 1250MHz Memory
- AMD Radeon HD 6870 - 900MHz Core / 1050MHz Memory
- Diamond Radeon HD 6870 - 940MHz Core / 1100MHz Memory
- AMD Radeon HD 6850 - 775MHz Core / 1000MHz Memory
- XFX Radeon HD 6850 - 775MHz Core / 1000MHz Memory
- ATI Radeon HD 5970 - 735MHz Core / 1010MHz Memory
- ATI Radeon HD 5870 - 850MHz Core / 1200MHz Memory
- ATI Radeon HD 5850 - 725MHz Core / 1000MHz Memory
- Sapphire Radeon HD 5770 - 850MHz Core / 1200MHz Memory
- HIS Radeon HD 5570 512MB GDDR5 - 650MHz Core / 1000MHz Memory
AMD Radeon HD 6950 Video Card GPU-Z 0.5.0 Details:
NVIDIA GeForce GTX 560 Ti Video Card GPU-Z 0.5.0 Details:
Aliens vs. Predator
Aliens vs Predator D3D11 Benchmark v1.03 is a standalone benchmark test based upon Rebellion's 2010 inter-species shooter Aliens vs. Predator. The test shows xenomorph-tastic scenes using heavy tessellation among other DX11 features.
We cranked up all the image quality settings in the benchmark to the highest level possible, so we were running 4x AA and 16x AF with SSAO enabled at both 1920x1200 and 1280x1024 on all the video cards.
Benchmark Results: The AMD Radeon HD 6950 1GB does better than the NVIDIA GeForce GTX 560 Ti in this benchmark by 5-7 FPS depending on the resolution you are gaming at.
Batman: Arkham Asylum GOTY
Batman: Arkham Asylum is an action-adventure stealth video game based on DC Comics' Batman for PlayStation 3, Xbox 360 and Microsoft Windows. It was developed by Rocksteady Studios and published by Eidos Interactive in conjunction with Warner Bros.
For our testing we set everything as high as it would go including Multi Sample Anti-Aliasing as we set that to 8x.
Benchmark Results: The NVIDIA GeForce GTX
560 Ti does slightly better in Batman: Arkham Asylum GOTY edition than the Radeon HD 6950 1GB.
Just Cause 2
Just Cause 2 is a sandbox-style action video game developed by Swedish developer Avalanche Studios and Eidos Interactive, and published by Square Enix. It is the sequel to the 2006 video game Just Cause.
Just Cause 2 employs a new version of the Avalanche Engine, Avalanche Engine 2.0, an updated version of the engine used in Just Cause. The game is set on the other side of the world compared to Just Cause, on the fictional tropical island of Panau in Southeast Asia. Rico Rodriguez returns as the protagonist, aiming to overthrow the evil dictator Pandak "Baby" Panay and confront his former boss, Tom Sheldon.
Metro 2033
Metro 2033 is an action-oriented video game with a combination of survival horror and first-person shooter elements. The game is based on the novel Metro 2033 by Russian author Dmitry Glukhovsky. It was developed by 4A Games in Ukraine. The game is played from the perspective of a character named Artyom. The story takes place in post-apocalyptic Moscow, mostly inside the metro station where the player's character was raised (he was born before the war, in an unharmed city), but occasionally the player has to go above ground on certain missions and scavenge for valuables.
This is another extremely demanding game. Settings were left at High quality with AA and AF at their lowest values (AAA and 4x AF, respectively) for each of the DirectX 9, 10 and 11 APIs. Advanced DirectX 11 settings were left at default. The section of Metro 2033 tested was the Prologue, with FRAPS polling from when you are climbing up the ladder until opening the door to exit the metro station. This section includes many features found throughout the game, including four creatures which attack you before you exit the building, dense particles, ammo in cabinets, a few computer-controlled sections and, of course, Miller, your first companion.
Benchmark Results: This benchmark favors NVIDIA branded cards, so the GeForce GTX 560 Ti beating the Radeon HD 6950 1GB card shouldn't be a shock to anyone.
StarCraft II: Wings of Liberty
StarCraft II: Wings of Liberty is a military science fiction real-time strategy video game developed by Blizzard Entertainment for Microsoft Windows and Mac OS X. A sequel to the award-winning 1998 video game StarCraft, the game was released worldwide on July 27, 2010. It is split into three installments: the base game with the subtitle Wings of Liberty, and two upcoming expansion packs, Heart of the Swarm and Legacy of the Void. StarCraft II: Wings of Liberty has had a successful launch, selling three million copies worldwide in less than a month.
The game StarCraft II: Wings of Liberty has no internal benchmarking tools built into the game engine, so we recorded a scene, played it back at normal speed and measured performance with FRAPS. The screen capture above shows the system settings that we were using for StarCraft II. Notice we are running graphics quality and textures at Ultra and maxed out everything in the menu. We did not manually enable AA in the drivers, though, as we wanted to keep testing simple and consistent with the choices offered in the game's settings menu.
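Because there is no built-in benchmark, the average frame rate comes from the per-frame timings a tool like FRAPS logs during the playback; a minimal sketch of that calculation (the frame-time values below are made up purely for illustration):

```python
# Hypothetical frame times in milliseconds, in the style of a FRAPS
# frametimes log; these values are illustrative only, not measured data.
frame_times_ms = [16.7, 18.2, 15.9, 22.4, 17.1, 19.8]

# Average FPS = number of frames rendered / total elapsed time in seconds.
total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds
print(f"Average FPS: {avg_fps:.1f}")
```

Averaging over the full recorded run this way smooths out momentary spikes, which is why the same scene has to be replayed identically on every card for the numbers to be comparable.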
Benchmark Results: StarCraft II: Wings of Liberty showed the Radeon HD 6950 1GB beating the GeForce GTX 560 Ti at the higher resolution, but losing at the lower one.
S.T.A.L.K.E.R.: Call of Pripyat
The events of S.T.A.L.K.E.R.: Call of Pripyat unfold shortly after the end of S.T.A.L.K.E.R.: Shadow of Chernobyl following the ending in which Strelok destroys the C-Consciousness. Having discovered the open path to the Zone's center, the government decides to stage a large-scale operation to take control of the Chernobyl nuclear plant.
S.T.A.L.K.E.R.: Call of Pripyat utilizes the XRAY 1.6 Engine, allowing advanced modern graphical features through the use of DirectX 11 to be fully integrated; one outstanding feature is the inclusion of real-time GPU tessellation. Regions and maps feature photo-realistic scenes of the areas they represent. There is also extensive support for older versions of DirectX, meaning that Call of Pripyat is also compatible with older DirectX 8, 9, 10 and 10.1 graphics cards.
The game S.T.A.L.K.E.R.: CoP has no internal benchmarking tools built into the game engine, but they do have a standalone benchmark available that we used for our testing purposes. The screen capture above shows the main window of the benchmark with our settings. Notice we are running Enhanced Full Dynamic Lighting "DX11" as our renderer.
Under the advanced settings we enabled tessellation and 4x MSAA. We didn't enable ambient occlusion as we wanted to use these test settings for mainstream cards down the road and these settings should be tough enough to stress any and all DX11 enabled video cards.
Benchmark Results: S.T.A.L.K.E.R.: Call of Pripyat showed that the Radeon HD 6950 1GB was faster than the GeForce GTX 560 Ti at 1920x1200, but slower at 1280x1024.
H.A.W.X. 2 Benchmark
We wanted to include a new benchmark for this review, so Tom Clancy's H.A.W.X. 2 was added to see how it looked. This benchmark got some attention recently over claims that its tessellation code is not vendor-neutral, but that doesn't matter to us here as we are simply using it to see how these two cards compare under identical settings.
Aerial warfare has evolved. So have you. As a member of the ultra-secret H.A.W.X. 2 squadron, you are one of the chosen few, one of the truly elite. You will use finely honed reflexes, bleeding-edge technology and ultra-sophisticated aircraft - their existence denied by many governments - to dominate the skies. You will do so by mastering every nuance of the world's finest combat aircraft. You will slip into enemy territory undetected, deliver a crippling blow and escape before he can summon a response. You will use your superior technology to decimate the enemy from afar, then draw him in close for a pulse-pounding dogfight. And you will use your steel nerve to successfully execute night raids, aerial refueling and more. You will do all this with professionalism, skill and consummate lethality. Because you are a member of H.A.W.X. 2 and you are one of the finest military aviators the world has ever known. H.A.W.X. 2 is due out on November 16, 2010 for PC gamers.
We ran the benchmark in DX11 mode and cranked up all the antialiasing and advanced image quality settings. We also enabled hardware tessellation, as without that setting turned on the cards were getting well over 120FPS at a resolution of 1920x1200 and over 160FPS at 1280x1024. We wanted to stress the cards a bit, and enabling tessellation appeared to do the trick, as you'll see below.
Benchmark Results: HAWX 2 shows the Radeon HD 6950 1GB was slower than the GeForce GTX 560 Ti series reference video card.
3DMark Vantage is the industry standard PC gaming performance benchmark from Futuremark, designed for Windows Vista and DirectX 10. It includes two new graphics tests, two new CPU tests, several new feature tests, and support for the latest hardware. 3DMark Vantage is based on a completely new rendering engine, developed specifically to take full advantage of DirectX 10, the graphics API from Microsoft.
The Extreme preset was used for testing, which runs at a resolution of 1920x1200.
Benchmark Results: 3DMark Vantage showed that the AMD Radeon HD 6950 1GB video card with 11.1a Hotfix drivers does really well in this benchmark. In fact, we were shocked to see it beating the AMD Radeon HD 6950 2GB with 10.11 drivers so badly. It looks like we need to go back and re-test all of the AMD video cards, but at nearly 8 hours per card that would take weeks, and we just don't have time for that right now. The NVIDIA GeForce GTX 560 Ti does very well in 3DMark Vantage and was at roughly the same performance level as the GeForce GTX 480, which was NVIDIA's flagship graphics card in the first half of 2010.
3DMark 11 is the latest version of the world’s most popular benchmark for measuring the 3D graphics performance of gaming PCs. 3DMark 11 uses a native DirectX 11 engine designed to make extensive use of all the new features in DirectX 11, including tessellation, compute shaders and multi-threading.
Since Futuremark has recently released 3DMark 11, we decided to run the benchmark at both the Performance and Extreme presets to see how our hardware runs.
3DMark 11 Extreme Benchmark Results:
3DMark 11 Performance Benchmark Results:
We are still running 3DMark 11 on several video cards, but for now we just wanted to show you how the AMD Radeon HD 6950 1GB and NVIDIA GeForce GTX 560 Ti do on this benchmark. As you can see, the GeForce GTX 560 Ti trails the Radeon HD 6950 by a small margin.
Unigine 'Heaven' DX11
The 'Heaven' benchmark, which uses the Unigine engine, easily shows off the full potential of DirectX 11 graphics cards. It reveals the enchanting magic of floating islands with a tiny village hidden in the cloudy skies. With the interactive mode emerging, experience of exploring the intricate world is within reach. Through its advanced renderer, Unigine is one of the first to set precedence in showcasing the art assets with tessellation, bringing compelling visual finesse, utilizing the technology to the full extent and exhibiting the possibilities of enriching 3D gaming. The distinguishing feature of the benchmark is hardware tessellation, a scalable technology aimed at automatic subdivision of polygons into smaller and finer pieces, so that developers can gain a more detailed look for their games almost free of charge in terms of performance. Thanks to this procedure, the elaboration of the rendered image finally approaches the boundary of veridical visual perception: virtual reality conjured by your hand.
We ran the recently released Heaven v2.1 benchmark with VSync disabled, but with 8x AA and 16x AF enabled, to check out system performance. We ran the benchmark at 1920x1200 and 1280x1024 to see how it ran at different monitor resolutions. It should be noted that we ran the new extreme tessellation mode on this benchmark. These are the toughest settings that you can run, so it should really put the hurt on any graphics card.
Benchmark Results: The AMD Radeon HD 6950 with its 2GB of GDDR5 memory was faster than the Radeon HD 6950 1GB in this benchmark. The real winner here is the NVIDIA GeForce GTX 560 Ti, though, as it was hands down faster than the 6950 1GB.
FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that's why FurMark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card.
The benchmark was rendered in full screen mode with no AA enabled on both video cards.
Benchmark Results: FurMark doesn't play nicely on the AMD Radeon HD 6900 series, as the benchmark hits the cards' max TDP and AMD PowerTune kicks in and tones the core clocks back to reduce the power draw/temperature on the GPU. Both AMD's Radeon HD 6900 series and NVIDIA's GeForce GTX 500 series have new power management features to help protect consumers and their board partners from failed graphics cards. It looks like one too many users killed their video cards running FurMark or OCCT, and to help reduce the number of RMAs both companies have fixes in place to keep these programs from burning up a card.
Since video card temperatures and the heat generated by next-generation cards have become an area of concern among enthusiasts and gamers, we want to take a closer look at how the graphics cards do at idle and under a full load.
AMD Radeon HD 6950 1GB Video Card Idle Temperature:
NVIDIA GeForce GTX 560 Ti Video Card Idle Temperature:
As you can see from the screen shot above, the idle state of the Radeon HD 6900 series drops the GPU core clock frequency down to 250MHz and the memory clock down to 150MHz to help conserve power and lower temperatures. At idle on an open test bench the Radeon HD 6950 video card temperature was observed at 49C. The NVIDIA GeForce GTX 560 Ti had even lower idle clocks: 50.6MHz on the GPU core and 135MHz on the memory. These clocks, combined with the NVIDIA-designed cooler, gave us an idle temperature of just 27C.
We fired up FurMark and ran the stability test at 640x480, which was enough to put the GPU core at 100% load in order to get the highest temperature possible. This application also charts the temperature results so you can see how the temperature rises and levels off, which is very nice. The fans on the video cards were left on auto during temperature testing. When we hit the space bar to stop the rendering, the cards began cooling back down toward their idle temperatures.
AMD Radeon HD 6950 1GB Video Card Load Temperature:
The AMD Radeon HD 6950 1GB peaked at 87C. It should be noted that AMD PowerTune kicks in and keeps these cards from getting any hotter. You can see the throttling taking place in FurMark if you leave GPU-Z open and watch the GPU core clock: it jumps all over the place while FurMark is running.
The NVIDIA GeForce GTX 560 Ti got up to 85C, so both cards got up to roughly the same temperature in this test.
When it comes to noise levels both cards were quieter than the CPU cooler on our test system, so no complaints there. At full load with the cards near 90C they were both audible, but not too bad. If we had to pick a noise winner at full load it would have to be the GeForce GTX 560 Ti. The GeForce GTX 560 Ti was cooler at both idle and load!
For testing power consumption, we took our test system and plugged it
into a Kill-A-Watt power meter. For idle numbers, we allowed the system
to idle on the desktop for 15 minutes and took the reading. For load
numbers we measured the peak wattage used by the system while running
the OpenGL benchmark FurMark 1.8.2 at 1280x1024 resolution.
Power Consumption Results: The AMD Radeon HD 6950 1GB video card does pretty well here in the power test, but it came as a shock to us that it used slightly more power than the 2GB version that we tested last month. Every card has different GPU core leakage, so no two GPUs are exactly identical when picked at random. The NVIDIA GeForce GTX 560 Ti has slightly lower power use at idle, but uses more at full load. It is interesting, though, that both of these cards ended up next to each other in the power chart!
NVIDIA GeForce GTX 560 Ti GPU Overclocking
A performance analysis of the GeForce GTX 560 Ti video card wouldn't be complete without some overclocking results, so we got in touch with our friends over at EVGA and they said the latest build of their EVGA Precision software would work on the GeForce GTX 560 Ti reference card.
Using the EVGA Precision software utility is one of the easiest ways to overclock a video card, and since we have only had the GTX 560 Ti for a few days it was the perfect tool for the job.
The highest overclock that we could get on the GeForce GTX 560 Ti reference card was 960MHz on the core, 1920MHz on the shaders and 1175MHz on the 1024MB of GDDR5 memory. We gamed on these settings for a few hours and ran synthetic benchmarks like 3DMark Vantage, and we feel safe calling the overclock 100% stable.
NVIDIA GeForce GTX 560 Ti Graphics Card at 823MHz/1645MHz/1002MHz:
NVIDIA GeForce GTX 560 Ti Graphics Card at 960MHz/1920MHz/1175MHz:
We saw 3DMark Vantage go up from X9538 to X11030, which is a 15.6% or 1492 3DMark jump in performance! Let's see what it does in real games.
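The gain quoted above falls straight out of the two Vantage scores; a quick sketch of the arithmetic:

```python
# 3DMark Vantage Extreme scores from the stock and overclocked runs above.
stock_score = 9538   # X9538 at 823MHz/1645MHz/1002MHz
oc_score = 11030     # X11030 at 960MHz/1920MHz/1175MHz

delta = oc_score - stock_score
pct = delta / stock_score * 100
print(f"+{delta} 3DMarks ({pct:.1f}%)")  # +1492 3DMarks (15.6%)
```

The same formula is used for the Radeon HD 6950 1GB overclocking results later in the article.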
The jump from 823MHz to 960MHz on the GeForce GTX 560 Ti core clock helped boost performance by 17.2% in AvP at a resolution of 1920x1200. We'll take a 15-17% performance increase from overclocking any day! If you throw a little extra voltage at this card you'll likely get a better overclock. We were able to run the card at 1GHz on some benchmarks, but sadly our reference card wouldn't run at those speeds with full stability and we had to back it down to 960MHz to make it rock solid.
AMD Radeon HD 6950 1GB GPU Overclocking
To overclock the AMD Radeon HD 6950 1GB graphics card, we used the Overdrive utility that is part of the CATALYST Control Center. When you 'unlock' ATI Overdrive, you can manually set the core and memory clock frequencies.
The AMD Radeon HD 6950 starts off life at 800MHz on the core and 1250MHz on the memory. We were able to max out AMD Overdrive at 840MHz on the core and 1325MHz on the memory and didn't see any issues in games with PowerTune left alone. This is a 40MHz overclock on the core and a 75MHz boost on the 1GB worth of GDDR5 memory ICs. Not a huge overclock by any means, but we expect to see a difference in the benchmark results. To test out the overclock, we fired up 3DMark Vantage.
The AMD Radeon HD 6950 1GB Video Card Stock 800MHz/1250MHz:
The AMD Radeon HD 6950 1GB Video Card Overclocked 840MHz/1325MHz:
Running 3DMark Vantage with the Extreme preset, we got a score of X10137 with the card at reference clock settings. The score went up to X10663 3DMarks when overclocked, which was an improvement of 526 points or 5.2% with this mild overclock. Let's see what it does in real games.
Final Thoughts and Conclusions
At the end of the day, what have we been able to figure out? Well, the take-home message here is that when it comes to mainstream gaming graphics cards, the $249.99 to $279.99 price segment just got much more interesting. The AMD Radeon HD 6950 1GB video card is nothing new, to be fair. AMD simply removed half of the available memory on the existing Radeon HD 6950 2GB video card that the company released last month in order to drop the price on a card that can compete with the NVIDIA GeForce GTX 560 Ti. It's nice of them to release a new variant of an existing card, but it makes for a boring review as nothing is really new at all. What does taking away half the memory get you in terms of cost savings? About $10-$30, according to AMD, who was kind enough to send over pricing for its video card options for those looking to spend $200-$300:
- AMD Radeon HD 6950 2GB - $289 ($269 soft)
- AMD Radeon HD 6950 1GB - $259
- AMD Radeon HD 6870 - $219
If you are gaming at an HD resolution then you'll want to get the card with twice as much memory for as little as $10 more. If you are gaming at a 1680x1050 or lower resolution the 1GB card might be all you need, but for a little more you can get twice as much memory and help future proof your system in case you get a new monitor in the months to come.
When it comes to the GeForce GTX 560 Ti video card things get a little more interesting. The update from the original GF104 Fermi core used on the GTX 460 series to the newly redesigned GF114 Fermi core on the GTX 560 Ti series didn't come as a surprise to us, but bringing back the Ti nomenclature was a shocker. The Ti in the product name stands for Titanium, which represents a card that is lighter, stronger and faster according to NVIDIA. We are going to assume that NVIDIA brought the Ti reference back to make a point and luckily for them the GeForce GTX 560 Ti is a very nice graphics card that is lighter, faster and more overclocker friendly than its AMD nemesis.
The NVIDIA GeForce GTX 560 Ti with its 384 CUDA cores was able to perform ~30% faster than the GeForce GTX 460 with 336 CUDA cores. We saw a slight increase in power consumption, but with the additional cores and higher clock speeds we knew that would be part of the trade off to get such big performance gains.
- NVIDIA GeForce GTX 580 - $499
- NVIDIA GeForce GTX 570 - $349 (online for $335)
- NVIDIA GeForce GTX 560 Ti - $249
- NVIDIA GeForce GTX 460 1GB - $199
If we had to pick one of these video cards today it would be a tough call, as the performance benchmarks were split for the most part. If you don't look at performance numbers, the NVIDIA GeForce GTX 560 Ti has the usual NVIDIA features like PhysX, CUDA, 3D Vision and great SLI scaling for those wanting to run multi-GPU setups. It also runs cooler at idle and load. It uses slightly less power at idle and, let's face it, our PCs sit at idle most of the time. The GTX 560 Ti is a small card at just 9 inches in length. This makes it 1.5 inches shorter than the Radeon HD 6950 1GB, meaning it will easily fit in your case and not block airflow. The NVIDIA GeForce GTX 560 Ti can also overclock really well, and the price tag is lower. AMD isn't dead in the water, though, as it has Eyefinity, which is a must for those looking to run a triple-monitor setup on just one video card. The AMD Radeon HD 6950 1GB was also more energy efficient at load.
A week ago the NVIDIA GeForce GTX 560 Ti was set to dominate the sub $250 video card market, but AMD put an end to that with the AMD Radeon HD 6950 1GB video card. Both cards are great and at the end of the day the gamers and enthusiasts are the winners as AMD and NVIDIA will have to be in another price battle in order to earn your hard earned money!
Legit Bottom Line: The NVIDIA GeForce GTX 560 Ti is a great video card that destroys AMD's Radeon HD 6800 series, but AMD came out with a Radeon HD 6950 variant with 1GB of memory to head them off! It's a GPU war folks!