The Mainstream DX10 Graphics Cards Finally Arrive
For just over five months NVIDIA’s 8800 series cards have been kicking ass and taking names. That’s because for those five months they have had no competition since the launch of the first DirectX 10 video cards, which gave them a nearly 100% performance advantage over their last-generation cards as well as the competition. So while those with the cash to splurge on those cards have been enjoying the heck out of these great offerings, NVIDIA has had its nose to the grindstone mapping out even more goodness for those of us contemplating the Ramen Noodle diet for a few weeks to keep up with the fast-paced video card market.
What NVIDIA has been working on is a GPU that will enhance your Blu-ray/HD DVD movie playback and give you the ability to run the upcoming DirectX 10 titles that we are still waiting (im)patiently for. They were also hard at work keeping power consumption down, temperatures low, and performance high. What we get today is an 80nm GPU comprised of 289 million transistors, just 11 million more than last generation's high-performance 7900 GTX.
GeForce 8600 Specifications
The GeForce 8600 series of GPUs are based on the GeForce 8800 series GPUs and bring DirectX 10 graphics to the masses by being available in the popular $149-$229 price range. While right now not everyone needs the power of geometry shaders, unique instancing, massive texture arrays and high dynamic range rendering with 16x anti-aliasing, these features will be of great use in upcoming DirectX 10 game titles that are due out later this year.
The GeForce 8600 is available in two configurations. The GeForce 8600 GTS features a core clock of 675MHz and a shader clock of 1.45GHz. The GeForce 8600 GT features a core clock of 540MHz and a shader clock of 1.19GHz. Both GeForce 8600 configurations contain a GPU with 289 million transistors built on the 80nm process. The core uses a unified shader design that consists of 32 individual stream processors. Each stream processor can be dynamically allocated to vertex, pixel, geometry, or physics operations. The GeForce 8600 also has some tweaks that help improve per-clock shader performance over the GeForce 8800. In the GeForce 8800, each texture processor can calculate four texture addresses and perform eight filtering operations per clock. In the GeForce 8600, each texture processor can calculate eight texture addresses and perform eight filtering operations per clock. This basically doubles the number of unique texture locations that can be sampled!
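The texture-unit change is easier to see laid out as numbers. Here is a quick sketch of the per-clock rates quoted above; only the per-processor figures from this page are used, since texture-processor counts and clocks vary by model:

```python
# Per-clock texture rates for a single texture processor, as described above.
g80_addresses_per_clock = 4    # GeForce 8800: four texture address calculations
g80_filter_ops_per_clock = 8   # ...and eight filtering operations per clock

g84_addresses_per_clock = 8    # GeForce 8600: eight texture address calculations
g84_filter_ops_per_clock = 8   # ...and eight filtering operations per clock

# The addressing rate doubles, so twice as many unique texture locations
# can be sampled per clock, while filtering throughput stays the same.
speedup = g84_addresses_per_clock / g80_addresses_per_clock
print(speedup)  # 2.0
```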
Below is a chart that shows the differences between the GeForce 8600 GTS and GeForce 8600 GT.
The Second Generation Pure Video HD Engine
The GeForce 8600 GT and 8600 GTS both feature NVIDIA's second-generation PureVideo HD engine, making them the world's first GPUs to provide 100% offload of H.264 decoding. Current PCs require both a high-end CPU and a high-end GPU in order to reliably decode and play back all types of HD content. With the GeForce 8600's new PureVideo technology, HD playback can be done on nearly all PCs.
A new video processor (VP) block provides more advanced video post processing to improve overall video quality. A new integrated bitstream processor (BSP) allows the 8500 and 8600 GPUs to perform full HD decode. They are the world’s first video processors to offload from the CPU 100% of Blu-ray and HD DVD H.264 video decoding, enabling playback on mainstream CPUs and providing unsurpassed movie picture quality and low power consumption. PureVideo HD support for Vista is available now for GeForce 7 and 8 series GPUs. PureVideo HD support for the 8500 and 8600 GPUs under Windows XP is expected in June 2007.
The Video Processor 2 (VP2) is designed to accelerate the decoding of the most advanced video codecs, including MPEG-2, VC-1, and H.264, at high bit rates, up to 40Mbps in the case of Blu-ray Disc. The bitstream processor (BSP) is specially designed to accelerate the H.264 entropy coding schemes known as Context Adaptive Variable Length Coding (CAVLC) and Context Adaptive Binary Arithmetic Coding (CABAC). The BSP is key to the CPU offloading that takes place on the GeForce 8600 series. Also included on the 8600 series is an AES128 engine that accelerates AES128 decryption to support the video content security schemes required by players and operating systems. The AES128 engine ensures both the security and integrity of the video content. This all boils down to giving the end user what NVIDIA calls the 'best HD experience' available on the market today.
To help show how the new video engine dramatically reduces CPU utilization, NVIDIA provided Legit Reviews with a couple of slides showing CPU utilization during H.264 decoding of actual Blu-ray and HD DVD movies on an Intel Pentium 4 531 processor, paired first with an ATI Radeon X1950 Pro, then with the NVIDIA GeForce 8500 GT, and finally with nothing more than the CPU. With the ATI Radeon X1950 Pro, just as with the CPU alone, the tests showed 100% of the work being done by the CPU. With the NVIDIA GeForce 8500 GT, the CPU only handled roughly 30-45% of the work, while the GPU handled the rest.
NVIDIA also provided a slide that compares the faster GeForce 8600 GTS to the older GeForce 7600 GT on an Intel Core 2 Duo E6400 based system. It shows that without GPU assistance, decoding these HD titles took anywhere between 80-100% of the CPU's resources. The 7600 GT, with the standard PureVideo HD enabled, was able to reduce CPU utilization to about 60%. With the second-generation PureVideo HD engine enabled, the CPU is almost entirely free to do other things, as CPU utilization was found to be roughly 20% on the GeForce 8600 GTS.
It's obvious that the second-generation PureVideo engine does a great job at decoding video and nearly takes the CPU out of the equation for HD playback. Eventually all of the decoding could take place on the GPU, so it's highly possible that CPU utilization will be even lower on the next GPUs that are in the works now. Currently, PureVideo HD acceleration for the GeForce 8600 GTS and GT is ONLY available for Windows Vista. PureVideo HD acceleration for Windows XP is expected to be available in June 2007.
The XFX 8600 GT XXX Edition
The little brother to the XFX GeForce 8600 GTS XXX Edition is the XFX GeForce 8600 GT XXX Edition, which is identical to the 8600 GTS apart from its lower clock frequencies. The standard XFX GeForce 8600 GT (part number PV-T84J-UDF) has a core clock (including dispatch, texture units, and ROP units) of 540MHz, while the XFX GeForce 8600 GT XXX Edition (part number PV-T84J-UDD) has a core clock of 620MHz. In addition to the 80MHz core increase, the memory speed has been raised from 1.4GHz to 1.6GHz, a nice 200MHz boost. The standard XFX GeForce 8600 GT has 32 stream processors that operate at 1190MHz, while those 32 stream processors on the XXX version are overclocked up to 1355MHz. Basically, the core clock, memory clock, and shader clock speeds have all been increased on the XXX version. The Manufacturer Suggested Retail Price (MSRP) on the standard XFX GeForce 8600 GT is set at $149.99, while the MSRP on the overclocked XFX GeForce 8600 GT is $20 higher at $169.99. For an extra $20 you will be able to get a card that is tested to operate at higher frequencies, and this is the card that we have in for review.
The XFX GeForce 8600 GT is a single slot design that requires no additional power connector for operation. This makes the board very energy efficient, as it has a maximum power draw of just 43 watts. The XFX GeForce 7600 GT has a maximum power draw of 65 watts, while the GeForce 6600 GT draws 70 watts. For the third generation in a row NVIDIA has been able to decrease the power consumption of its mainstream video cards while adding features and improving performance, which is amazing and something for them to brag about.
The XFX GeForce 8600 GT XXX uses a black printed circuit board that looks great when compared to the XFX GeForce 7600 GT that is pictured below it. If you look really closely at the picture above, you will notice that the 8600 GT is about one eighth of an inch shorter than the 7600 GT. The card is 6.9" long and just 4.4" tall for those wondering if it will fit in a custom case or small form factor computer.
The XFX GeForce 8600GT features a pair of dual-link DVI-I outputs for connection to analog and digital PC monitors and HDTVs, a 7-pin analog video out port that supports S-Video directly, plus composite and component (YPrPb) outputs via an optional dongle. XFX did not enable HDCP and HDMI support, which is optional on the GeForce 8600GT. With dual-link DVI-I outputs this card is able to run resolutions of up to 2560x1600 on two LCD Monitors at the same time!
The back of the card doesn't feature anything special, but those with a sharp eye will note that the heat sink is held on by just two screws this time around. There are still four holes present for those that want to install an aftermarket heat sink or water block for improved cooling.
Since the heat sink is secured with just two screws, we removed it to make sure the thermal interface material was making good contact with the GPU and heat sink. As you can see in the image above, the contact was perfect and XFX applied just the right amount of thermal interface material.
The XFX GeForce 8600 GT has four Qimonda 512M-bit GDDR3 memory ICs on the PCB, making up the card's 256MB of memory. These ICs operate on a 128-bit bus and carry the part number HYB18H512321AF-14, a part that is slated to end production in the second quarter of 2007, as these 700MHz, 2.0V chips are being replaced with ICs that operate at the same frequency on just 1.8V. Since the XXX Edition runs the memory at 1.6GHz, these Qimonda ICs have passed testing at higher clock speeds and are overclocked from their rated 700MHz up to 800MHz. This is a common practice in the video card industry and nothing to be alarmed by.
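As a quick sanity check, the card's capacity and peak memory bandwidth can be worked out from the chip count, bus width, and effective data rate. This is a rough sketch using only numbers from this page; real-world throughput will be lower than the theoretical peak:

```python
# Capacity: four 512M-bit GDDR3 ICs add up to 256MB total.
chips = 4
megabits_per_chip = 512
capacity_mb = chips * megabits_per_chip / 8   # 8 bits per byte
print(capacity_mb)                            # 256.0

# Peak bandwidth: 128-bit bus at a 1.6GHz effective data rate (800MHz DDR).
bus_bytes = 128 / 8                           # 16 bytes transferred per cycle
effective_ghz = 1.6
bandwidth_gb_s = bus_bytes * effective_ghz
print(bandwidth_gb_s)                         # 25.6 GB/s theoretical peak
```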
The XFX 8600 GTS XXX Edition
If the XFX GeForce 8600 GT series doesn't get you excited, don't worry, as there is yet another graphics card series that is built on the same NVIDIA G84 core. The GeForce 8600 GTS is identical to the 8600 GT except that HDCP support is now required and the core and memory clock frequencies are higher than even the overclocked XXX Edition of the 8600 GT.
The standard specification for the GeForce 8600 GTS calls for the core clock to run at 675MHz, the memory at 2.0GHz and the shader clock on the stream processors at 1.45GHz. The XFX GeForce 8600 GTS (part number PV-T84G-UDF) operates at the suggested settings. For those that like overclocked pieces of hardware, XFX has the XFX GeForce 8600 GTS XXX Edition (part number PV-T84G-UDD)! The XXX Edition has a core clock of 730MHz, a memory speed of 2.26GHz and a stream processor frequency of 1566MHz. These higher clock rates help improve performance as both the clock frequency and memory bandwidth are increased.
Right off the bat you'll notice the XFX 8600 GTS XXX Edition doesn't use a black PCB as it's green again. The card itself doesn't look bad being green as the heat sink is much larger than the one on the GeForce 8600 GT and the card has a nice black top plate that has the XFX logo on it. The GTS version of the 8600 is again a single slot card, which is nice.
The XFX GeForce 8600 GTS is also the first graphics card to support HDCP over dual-link, allowing video enthusiasts to enjoy true high definition movies on extremely high resolution panels such as the 30" Dell 3007WFP at 2560x1600. Other than two native dual-link, HDCP-enabled DVI-I outputs, the GTS supports all of the features that the GT does: a 7-pin analog video out port that supports S-Video directly, plus composite and component (YPrPb) outputs via an optional dongle.
The GeForce 8600 GTS has a maximum power draw of 71 watts, which is enough for NVIDIA to add a 6-pin PCI-Express power header to the board to be on the safe side of power consumption. A standard x16 PCI-E slot supplies up to 75 watts of power, and an overclocked card can draw more than this. Since we are using the overclocked XXX Edition graphics card, having this 6-pin PCIe power header gives us peace of mind.
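The arithmetic behind that safety margin is simple. Illustrative numbers only: the 71W figure is NVIDIA's stated maximum board power from this review, and the two 75W figures are the PCI Express limits for the x16 slot and the 6-pin auxiliary connector:

```python
slot_limit_w = 75        # max power a standard x16 PCI-E slot can deliver
card_max_w = 71          # GeForce 8600 GTS maximum power draw at stock clocks
headroom_w = slot_limit_w - card_max_w
print(headroom_w)        # 4 -> only 4W of slack before any overclocking

# With the 6-pin PCIe header attached, the available budget roughly doubles.
connector_w = 75         # a 6-pin PCIe connector is rated for another 75W
total_budget_w = slot_limit_w + connector_w
print(total_budget_w)    # 150
```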
The fan on the 8600 GT was of a 2-pin design, while the fan on the 8600 GTS is a 4-pin design. This allows the fan speed to be monitored and controlled by software applications.
The heat spreader is held down by four screws and this time around the memory makes contact with the spreader to help dissipate the heat generated from the 2.26GHz memory IC's. The thermal interface material on the GPU was properly applied as were the thermal pads that bridged the gap between the memory and heat sink.
With the heat spreader completely removed it's obvious how large of an area it covers.
Once the thermal interface material was removed from the GPU core an updated NVIDIA logo was easily seen along with an A2 revision of the G84 core. This A2 G84 core was made in the seventh week of 2007, so it's only a couple months old!
The XFX GeForce 8600 GTS XXX Edition uses Samsung memory ICs, which are obviously different than the Qimonda-branded ICs that were on the slower clocked 8600 GT XXX Edition. The Samsung K4J52324QE-BJ1A ICs operate at 1.9V ± 0.1V and a clock frequency of 1GHz (2GHz GDDR3). These memory chips are then overclocked from their rated 2GHz up to 2.26GHz for improved performance.
The Video Card Bundles & Warranty
The bundle on the XFX GeForce 8600 GTS XXX Edition (model number PV-T84G-UDD) includes the driver CD, manual, S-Video cable, DVI-VGA adapter, 4-pin molex to 6-pin PCIe power adapter and a full retail version of the game Ghost Recon Advanced Warfighter. The bundle is pretty standard, but it would be nice to see XFX update the bundled game, as it is nearly a year old now, and since many will be upgrading from GeForce 7 series cards to get DirectX 10 support, they may end up with two copies of the game. The XFX GeForce 8600 GT XXX Edition comes with all of the above, but is missing the game.
XFX Warranty Changes:
It's also worth mentioning that XFX is changing the wording of its double lifetime warranty. XFX understands that today's enthusiast gamers are interested in maximizing the performance of their video cards to gain that extra competitive advantage, by improving cooling for example. Prior to today, returning a card with an aftermarket heat sink installed would have voided the warranty.
XFX North America has revised its warranty policy to alleviate these concerns: as long as there is no physical damage to the graphics card and no original components are missing, it will still be covered by XFX. The Double Lifetime Protection ensures that the coverage is transferable to a second owner, which adds additional value for the XFX card.
These changes will cover all 6 series and higher XFX video cards effective today. We applaud XFX's choice to warranty cards that have no physical damage and that have all the original components on it. This means that people can RMA cards that have had water blocks, newer heat sinks, or ram sinks on the memory IC's without question as long as the card is not damaged.
The Test System
|Video Card Test Platform|
Intel Core 2 Duo E6700
2GB Super Talent DDR2 1000
Western Digital Raptor 74GB
AeroCool Xfire Heatsink
Windows XP Professional SP2
All testing was done on a fresh install of Windows XP Professional build 2600 with Service Pack 2 and DirectX 9.0c. All benchmarks were completed on the desktop with no other software programs running. ATI CATALYST 7.2 drivers were used on all of the ATI video cards. Forceware 93.71 drivers were used for NVIDIA 7000 series cards. Forceware 97.94 drivers were used for 8800 series cards. Forceware 158.16 drivers were used for the 8600 GT and 8600 GTS.
Company of Heroes
Company of Heroes is set during World War II where the player follows a military unit, known as Able Company, as they fight their way through some of the greatest and bloodiest battles, including the D-Day landings at Normandy.
As one of our new performance tests, Company of Heroes isn't taking it easy on the new 8600s. They are behind the 7900 GT in all three tests but inch closer as the resolution increases. It will be interesting to see how performance shakes out once the rumored DirectX 10 patch is released for Company of Heroes.
Sierra: F.E.A.R. w/ v1.0.8 patch:
F.E.A.R. (First Encounter Assault and Recon) is a first-person close-quarters combat game for the PC. The story begins when a paramilitary force infiltrates a multi-billion dollar aerospace compound, and the government responds by sending in Special Forces. The group loses contact with the government when an eerie signal interrupts radio communications--and when that interference subsides moments later, the team has been destroyed. That's where you come in. As part of a classified strike team created to deal with threats no one else can handle, your mission is simple: eliminate the intruders at any cost, determine the origin of the signal, and contain the potential crisis before it gets out of control. High and Max settings represent the built in game control settings. Soft Shadows were not used.
Performance in FEAR is very good at 1024x768 with the test settings that we used. At 1600x1200 the 8600 GT XXX is not always playable, while the 8600 GTS XXX is right on the borderline.
Rainbow Six: Vegas
Team Rainbow is a multinational task force comprised of counterterrorism experts from around the globe. Equipped with state-of-the-art weapons, Team Rainbow is deployed during terrorist crises. When all other attempts have failed, Rainbow is brought in to save the lives of innocent people. They do not negotiate with terror. They destroy it.
Rainbow Six: Vegas is based on the Unreal 3 engine, so it's important to see what kind of performance we get here. While the 8600 GTS and 8600 GT improve on all but the 7900 GTX in the 7000 series of cards, they are outdone by the X1900XT 256MB. Notice the huge difference in performance between the 8800 GTS 320MB and the 8600s!
Tomb Raider: Legend
Lara Croft's search for a South American relic changes course dramatically when she meets a dangerous figure from her past, and she finds herself in a race to recover one of history's most famous artifacts.
With next generation content enabled shader performance is the most important factor to how the cards perform. The 8600 GTS XXX is able to outperform a pair of X1800XT in CrossFire and is right on the heels of the 7900 GTX!
Egosoft: X3 Reunion
The Sequel to the award winning X3: The Threat will introduce a new 3D engine as well as a new story, new ships and a new gameplay to greatly increase the variety in the X-universe. The economy of X3: Reunion will be more complex than anything seen in the X-universe before. Factories are being built by NPCs, wars can affect the global economy, NPCs can trade freely and pirates will behave far more realistically.
Extensive development has gone into the X3 engine, making full use of DirectX 9 technology, to create dramatic visual effects and stunningly realistic starships. Coupled with the massively enhanced A.L. (Artificial Life) system, X3: REUNION will present players with an ever-changing, evolving universe, where a player's actions really can shape the future of the universe.
We have expanded the number of tests for X3: Reunion. With the extra power of the E6700 we are no longer seeing single digit FPS during some of the scenes. The 8600 GT XXX and 8600 GTS XXX fall behind the older 7900 GT and 7900 GS in X3. We're not really sure why this test favors ATI so much but the 8800 series is able to put up a good fight!
Marvel: Ultimate Alliance
Marvel: Ultimate Alliance is an all-new Action/RPG that lets players create their ultimate team from the largest Super Hero alliance ever as they engage in an epic quest to determine the fate of the Marvel universe. For decades, Earth's Super Heroes have opposed evil in their own cities, and on their own terms. But now, Dr. Doom and a newly reformed Masters of Evil have plans for world domination, and the heroes must band together to defeat them. Players can create and control their own completely unique team, selecting from the largest roster of legendary Super Heroes ever assembled in one game.
While a blistering frame rate isn't required to play Ultimate Alliance, performance below 30fps does have an effect on your movement. Advanced lighting in Ultimate Alliance is really demanding on video cards and, as you can see, it takes a very high end card to play with it enabled even at moderate resolutions.
3D Mark 2006
3DMark 06 is the worldwide standard in advanced 3D game performance benchmarking and the latest version in the popular 3DMark series! 3DMark06 tests include all new HDR/SM3.0 graphics tests, advanced SM2.0 graphics tests, AI and physics driven single and multiple cores or processor CPU tests and a collection of comprehensive feature tests to reliably measure next generation gaming performance today.
If we were going by 3D Mark performance the 8600 GT and GTS XXX cards would be hot sellers! Performance in this benchmark puts the 8600 GTS XXX in the same league as the X1900XTX and X1950XTX with the 8600 GT XXX following closely behind.
***Warning*** The 8600 GT does not include the power header of the GTS model and although you may be able to overclock the GT to the speeds you see below, similar to the GTS, you'll be pulling more power from the motherboard than the 75 watts it can supply. This could cause damage to your video card and/or the motherboard that is not covered by any warranty.
In our overclocking adventures we thought we’d start off with NVIDIA’s nTune software but no matter what we did the card would not pass the GPU test required by the software before applying the speeds. Even running the test at stock speeds would cause a “failed” error. Determined to get more performance out of our cards we decided to give ATI Tool a shot and it worked like a charm! Knowing that the 8600 GT was just a lower clocked variant of the 8600 GTS we had high hopes for our low cost beast. With frequencies at 620MHz core and 1.6GHz memory we set ATI Tool to find the max overclock for the card.
We were pleasantly surprised to find our 8600 GT was able to run perfectly stable at 715MHz core and 1.72GHz memory! The core speed was close to that of the 8600 GTS but memory speed was a bit off so we ran a few tests to see how close in performance we were.
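For reference, the gains ATI Tool found on our sample work out as follows; the clocks are the ones from the run described above:

```python
# XFX 8600 GT XXX stock clocks vs. the stable overclock ATI Tool found.
stock_core_mhz, oc_core_mhz = 620, 715
stock_mem_mhz, oc_mem_mhz = 1600, 1720   # effective GDDR3 data rates

core_gain_pct = (oc_core_mhz - stock_core_mhz) / stock_core_mhz * 100
mem_gain_pct = (oc_mem_mhz - stock_mem_mhz) / stock_mem_mhz * 100
print(round(core_gain_pct, 1))  # 15.3 -> a ~15% core overclock
print(round(mem_gain_pct, 1))   # 7.5  -> a 7.5% memory overclock
```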
Rainbow Six: Vegas
After five minutes with ATI Tool, we've got an 8600 GT that performs just like the more expensive 8600 GTS XXX! Even overclocking just to the GPU core speed of the reference 8600 GTS will give you a noticeable performance increase, save you a bit of cash, and won't endanger your video card or motherboard! If you're asking why we don't have overclocking numbers for the 8600 GTS, it's because XFX has done a great job getting every last drop of performance from the 8600 GTS XXX; increasing the overclock on either the core or the memory by more than just a few MHz would result in artifacts.
Power Consumption :
For testing power consumption we took our test system and plugged it into a Seasonic Power Angel. For idle numbers we allowed the system to idle on the desktop for 5 minutes and took the reading. For load numbers we measured the peak wattage used by the system while running through 3DMark 2006.
The idle results showed that the 8600 GT XXX Edition had the lowest idle power consumption thanks to the 80nm die shrink. The 8600 GT XXX Edition uses 14W less than the XFX 7600 GT, which is impressive for an overclocked card.
To get the load numbers we used the peak value displayed while running Futuremark's 3DMark06 benchmark. The XFX 8600 GT comes in ahead of the XFX 7600 GT, but remember that our XFX 8600 GT is the XXX overclocked version and uses more power than a standard 8600 GT. Our XFX 8600 XXX cards lead the way with some of the lowest power consumption that we have seen from the cards we have tested thus far.
Final Thoughts and Conclusions
My thoughts on GeForce 8600 series:
With testing completed, I'm left feeling a little flat about the performance of the 8600 GTS considering its price. Performance suffers a bit in some games, while in others it is on par with last-generation cards. The 128-bit memory interface doesn't hurt AA performance as much as we thought it could. Arguably, gamers playing at lower resolutions are less likely to use AA, but there are certainly plenty that do. BUT! This is supposed to be NVIDIA's performance-segment entry into the DirectX 10 fray, the bread and butter of the video card industry, and I can't help but feel they missed the mark and left the door open for AMD/ATI to step in. Overclocking a reference-based model may give a little better value to the 8600 GTS line, but our XXX card was not having any of that! The upside to the 8600 GTS is that it offers a little better performance than the older mainstream/performance 7000 series hardware in shader-intensive applications. Also, performance in Rainbow Six: Vegas was good enough to satisfy even this extremely picky gamer by pushing the game at 1024x768 resolution without slowing during heavy action. If you want to game at higher resolutions without slowdowns in Vegas (and presumably other Unreal 3 engine games), you're going to need to spend a bit more money.
Perhaps I am too pampered by the high end, but it seems to me that other than DirectX 10 capability, the 8600 GTS cards' few redeeming qualities are the lower power requirements, the new PureVideo feature, and the reduced heat output. Sure, the 3DMark scores look great, but when it comes down to performing well in games I just don't feel that it's a good deal when you look at current pricing of the 8800 GTS 320MB. It can be found for as little as $219 after mail-in rebate and offers another world of performance over the 8600 GTS! Not only does this undercut the price of our 8600 GTS XXX, it offers a much more enjoyable gaming experience, and it is certain that you will get a longer lifespan out of the 8800 GTS 320MB.
The 8600 GT XXX I would consider a great value for the high end of the mainstream market. Our example overclocked to nearly the specs of the 8600 GTS XXX and performance was within a frame or two in the games we tested. With better cooling I have no doubt that it would have matched it in core speed. You'll just have to be careful not to break it!
Noise is a bit of a factor for both of these cards. The 8600 GT had no fan throttling at all and was the loudest thing in our test system; the fan was at full song any time the machine was powered on, which made for a long benchmarking session, although it comes in handy should you decide to overclock. The 8600 GTS fan, on the other hand, will throttle once the video card driver is installed. During intense operation the fan will spin up to a level that would be annoying if it were stuck there. Both cards were still much quieter than the 7600 GT! Heat is not a concern with these cards, as they are not consuming much power at all, so it is nothing like the 8800 series that will bake any other component near them. Power requirements are low, with the 8600 GT needing a 300W PSU and the 8600 GTS asking for 350W. In SLI the recommendations step up a bit, with a pair of 8600 GTs at 350W and a pair of 8600 GTSs at 450W.
Beyond strictly gaming performance, the improvements to power consumption and the PureVideo 2 engine show major gains over previous generations. With more and more consumers getting into high-definition video playback, they need a card that can handle 1080i resolutions for HDTV and HD gaming. The GeForce 8600 GTS and 8600 GT bring these features to consumers at a price point that many can now afford. By making High-bandwidth Digital Content Protection (HDCP) support standard on the 8600 GTS, it also helps protect digital entertainment content across both DVI and HDMI interfaces.
At $199 for the reference model 8600 GTS and $239 for our XXX edition from XFX I am urging you all to search pricing of the 8800 GTS 320MB before making your purchase decision.
The 8600 GT, on the other hand, is priced at $149 for reference models and $169 for the XXX edition card and will give you a good bang for your DirectX 10 buck, especially if you decide to overclock it. It's a solid performer at the same price point that was once held by the great 7600 GT and the 6600 GT before it.
Legit Bottom Line: NVIDIA once again beats AMD/ATI to the DirectX 10 punch, this time with a pair of mainstream cards. While the 8600 GTS is a bit off the mark, the 8600 GT is an excellent value.