NVIDIA GeForce GTX 560 Ti Versus AMD Radeon HD 6950 1GB
Tue, Jan 25, 2011 - 12:00 AM
Final Thoughts and Conclusions
At the end of the day, what have we been able to figure out? The take-home message is that the $249.99 to $279.99 price segment for mainstream gaming graphics cards just got much more interesting. To be fair, the AMD Radeon HD 6950 1GB video card is nothing new. AMD simply removed half of the memory from the existing Radeon HD 6950 2GB card the company released last month in order to drop the price and compete with the NVIDIA GeForce GTX 560 Ti. It’s nice of them to release a new variant of an existing card, but it makes for a boring review, as nothing is really new at all. What does giving up half the memory save you? According to AMD, who was kind enough to send over pricing for its video card options in the $200-$300 range, roughly $10-$30:
- AMD Radeon HD 6950 2GB – $289 ($269 soft)
- AMD Radeon HD 6950 1GB – $259
- AMD Radeon HD 6870 – $219
If you are gaming at 1920×1080 or higher, you’ll want the card with twice as much memory for as little as $10 more. If you are gaming at 1680×1050 or lower, the 1GB card might be all you need, but for a little more you get double the memory and help future-proof your system in case you get a new monitor in the months to come.
When it comes to the GeForce GTX 560 Ti, things get a little more interesting. The update from the original GF104 Fermi core used on the GTX 460 series to the redesigned GF114 Fermi core on the GTX 560 Ti series didn’t come as a surprise to us, but bringing back the Ti nomenclature was a shocker. The Ti in the product name stands for Titanium, which, according to NVIDIA, represents a card that is lighter, stronger and faster. We assume NVIDIA brought the Ti reference back to make a point, and luckily for them, the GeForce GTX 560 Ti is a very nice graphics card that is lighter, faster and more overclocker-friendly than its AMD nemesis.
The NVIDIA GeForce GTX 560 Ti, with its 384 CUDA cores, performed roughly 30% faster than the GeForce GTX 460 and its 336 CUDA cores. We saw a slight increase in power consumption, but with the additional cores and higher clock speeds, we knew that would be part of the trade-off for such big performance gains.
- NVIDIA GeForce GTX 580 – $499
- NVIDIA GeForce GTX 570 – $349 (online for $335)
- NVIDIA GeForce GTX 560 Ti – $249
- NVIDIA GeForce GTX 460 1GB – $199
If we had to pick one of these video cards today, it would be a tough call, as the performance benchmarks were split for the most part. Setting raw performance aside, the NVIDIA GeForce GTX 560 Ti has the usual NVIDIA features like PhysX, CUDA, 3D Vision and great SLI scaling for those wanting to run multi-GPU setups. It also runs cooler at both idle and load, and it uses slightly less power at idle; let’s face it, our PCs sit at idle most of the time. The GTX 560 Ti is a compact card at just 9 inches in length, 1.5 inches shorter than the Radeon HD 6950 1GB, so it will easily fit in your case without blocking airflow. The GeForce GTX 560 Ti also overclocks really well, and its price tag is lower. AMD isn’t dead in the water, though: it has Eyefinity, a must for those looking to run a triple-monitor setup off a single video card, and the Radeon HD 6950 1GB was more energy efficient at load.
A week ago the NVIDIA GeForce GTX 560 Ti was set to dominate the sub-$250 video card market, but AMD put an end to that with the Radeon HD 6950 1GB. Both cards are great, and at the end of the day gamers and enthusiasts are the winners, as AMD and NVIDIA will have to fight another price battle to earn your hard-earned money!
Legit Bottom Line: The NVIDIA GeForce GTX 560 Ti is a great video card that destroys AMD’s Radeon HD 6800 series, but AMD came out with a Radeon HD 6950 variant with 1GB of memory to head it off! It’s a GPU war, folks!