The 256MB GeForce 8800 GTs Arrive!
There is no doubt about it - the GeForce 8800 GT might be the fastest-selling card that NVIDIA has ever produced. The price-versus-performance ratio is ideal for the enthusiast community and the feature set is impressive. The GeForce 8800 GT 512MB video card series fills the $199-$259 price points, but what about those who want to spend less and still have all the same features? NVIDIA didn't forget about you and has just recently released the GeForce 8800 GT 256MB video card, which is aimed at the $179-$199 price points. The GeForce 8800 GT 512MB has been in such high demand that the price has remained above $269.99, so a more budget-friendly card is a welcome addition to the GeForce 8 lineup. Basically, all NVIDIA did was reduce the amount of GDDR3 memory from 512MB to 256MB and that's it. Our friends over at XFX sent out their new XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition graphics card, and on the outside it is identical to the GeForce 8800 GT 512MB.
With the XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition and XFX GeForce 8800 GT 512MB Alpha Dog graphics card sitting side by side, they are impossible to tell apart. The clock frequencies on the reference 256MB cards are 600MHz/1500MHz, but XFX overclocked their XXX edition up to 650MHz on the core and 1600MHz on the memory for even more performance. The XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition that Legit Reviews is looking at today comes with the part number PV-T88P-UDD4. As you can see in the picture above, it also comes with the improved cooling solution that is on their most recent GeForce 8800 GT series cards. If you'd like to know more about the features of the GeForce 8800 GT, be sure to read our previous articles about it!
GeForce 8800 GT Coverage:
With the cooler removed from both XFX GeForce 8800 GT video cards, it was still impossible to tell which card was which, as every capacitor and component looked to be the same. Many figured that the GeForce 8800 GT 256MB would just have half the number of memory ICs, but that doesn't seem to be the case here as both cards have eight ICs. Click on the picture above to see a larger version and see if you can figure it out. The only visible difference between the two cards is the white sticker on the card above right.
Looks like someone forgot to remove the sticker covering the speaker after the card was washed at the factory! It still doesn't help figure out which card is which, but the one on the right is the 512MB version and the latest 256MB card is the one on the left.
The only difference between the cards is the memory ICs that are used on them. The XFX GeForce 8800 GT 512MB graphics card that we reviewed last week has Qimonda memory ICs. The Qimonda HYB18H512321BF-10 memory ICs are 512-Mbit GDDR3 graphics RAM that are designed to run at 1.0GHz according to their datasheet. Each IC therefore contains 64MB of memory, and eight of them add up to the 512MB that is found on the XFX GeForce 8800 GT 512MB graphics card.
The XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition uses eight Hynix HY5RS573225B FP-14 memory ICs. These are also GDDR3, but they are 256-Mbit ICs that are rated to operate at 700MHz according to their datasheet. Each of the eight ICs holds 32MB, which is how the XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition gets its 256MB of onboard memory.
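The Mbit-to-MB arithmetic above is simple enough to sketch in a few lines. This is just an illustration of the math, not anything from the datasheets; the function name is made up for the example:

```python
# Memory IC density is quoted in megabits (Mbit), so divide by 8
# to get megabytes per IC, then multiply by the IC count.
def board_memory_mb(ic_density_mbit: int, ic_count: int) -> int:
    """Total onboard memory in MB for a board with identical memory ICs."""
    return (ic_density_mbit // 8) * ic_count

# Hynix 256-Mbit ICs on the 256MB card: 32MB per IC x 8 ICs
print(board_memory_mb(256, 8))  # 256

# Qimonda 512-Mbit ICs on the 512MB card: 64MB per IC x 8 ICs
print(board_memory_mb(512, 8))  # 512
```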
How will cutting the onboard graphics card memory in half impact performance? It's obviously going to play a role at higher screen resolutions, but how much? Read on to find out!
We normally don't cover retail boxes in our articles, but ever since XFX pulled the hidden images stunt that landed them in court we have been checking boxes. XFX settled the 'drool' case with EVGA out of court, in case any of our readers ever wondered what happened with that situation. Since the card we are looking at today is the XXX Edition, it has the XXX sticker on the front of the box and a sticker showing that the core clock is 650MHz and not the standard 600MHz core clock frequency.
Inside the box one will find the expected bundle that comes with all brands of video cards. The bundle included with the XFX GeForce 8800 GT Alpha Dog includes two DVI-to-VGA adapters, a 4-pin molex to 6-pin PCI-Express power adapter, an HDTV dongle, an S-Video cable and installation instructions. Also included is the full DVD version of CAPCOM's game title Lost Planet. The driver disc included several NVIDIA Forceware driver versions and some other software like CoolBits for unlocking overclocking features. The bundle also includes a first for us - the 'Do Not Disturb' door hanger!
The Test System
The test system was running Windows Vista Ultimate 64-bit with all available Microsoft updates, including the hotfixes required for enthusiast video cards to run correctly. NVIDIA Forceware 169.05 beta video card drivers were used on all of the GeForce series graphics cards. ATI CATALYST 7.11 drivers were used on the Radeon HD 3870, 3850, 2900 XT and HD 2600 XT graphics cards. All results shown in the charts are averages of at least three runs from each game or application used.
The Video Cards:
- XFX GeForce 8800 Ultra XXX Edition 768MB (675MHz/2.3GHz )
- XFX GeForce 8800 GTX 768MB (575MHz/1.8GHz)
- BFG Technologies GeForce 8800 GTS OC2 640MB (580MHz/1.7GHz)
- XFX GeForce 8800 GT 512MB (625MHz/1.8GHz)
- NVIDIA GeForce 8800 GT 512MB (600MHz/1.8GHz)
- XFX GeForce 8800 GT 256MB (650MHz/1.6GHz)
- eVGA 8600 GT 256MB (540MHz/1.4GHz)
- ATI Radeon HD 3870 (777MHz/1.126GHz GDDR4)
- ATI Radeon HD 3850 (669MHz/1.658GHz GDDR3)
- ATI Radeon HD 2900 XT (743MHz/2.0GHz)
- ATI Radeon HD 2600 XT (800MHz/1.1GHz)
All of the video cards were tested on the Intel X38 Express Test platform, which is loaded with the latest and greatest hardware. The Intel Core 2 Quad QX9650 'Yorkfield' processor was used for testing as it proved to be the best desktop processor when it comes to game performance. The test system was also loaded with 4GB of memory and water cooled to ensure throttling of the processor or memory wouldn't cause any issues.
|Intel Test Platform|
Intel Core 2 Quad QX9650
Corsair PC2-8500 4GB
Western Digital SATA 250Gb
Corsair Nautilus 500
Windows Vista Ultimate
Now that we know exactly what the test system is, we can move along to performance numbers.
S.T.A.L.K.E.R.: Shadow of Chernobyl
S.T.A.L.K.E.R.: Shadow of Chernobyl uses the 'X-ray Engine' to power the graphics. It is a DirectX 8/9 Shader Model 3.0 graphics engine. Up to a million polygons can be on-screen at any one time, which makes it one of the more impressive engines on the market today. The engine features HDR rendering, parallax and normal mapping, soft shadows, widescreen support, weather effects and day/night cycles. As with other engines that utilize deferred shading (such as Unreal Engine 3 and CryENGINE2), the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a "fake" form of anti-aliasing can be enabled with the static lighting option; this format utilizes a technique to blur the image to give the false impression of anti-aliasing. The game takes place in a thirty square kilometer area, and both the outside and inside of this area is rendered to the same amount of detail.
Benchmark Results: At the default 1600x1200 game resolution with full dynamic lighting and maximum quality settings, I found that S.T.A.L.K.E.R.: Shadow of Chernobyl was more than playable on nearly all of the video cards. The XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition graphics card took a pretty big performance hit compared to the XFX GeForce 8800 GT 512MB Alpha Dog. The 512MB version of the GeForce 8800 GT was running 97FPS, which is 60FPS more than what the 256MB version of the card can do at the same settings. The GeForce 8800 GT 256MB struggles to play this DirectX 9 game title that was one of the most hyped games early on in the year.
Call of Juarez
Call of Juarez is a Western-themed first-person shooter from the Polish developer Techland. First released for Windows in 2006 as a DirectX 9 title, it was re-released on June 12, 2007 as a DirectX 10 game title. Call of Juarez was one of the first games to utilize Microsoft's DirectX 10 and it is included in our testing for this reason. We have tested Call of Juarez before, but that was with an earlier build of the benchmark. Today, the performance results are from a newer version with performance enhancements.
Benchmark Results: As you can see, this benchmark pushes DirectX 10 graphics cards to the breaking point. With high quality settings at 1280x1024 with 2xAA enabled, the XFX GeForce 8800 GT 256MB XXX edition graphics card was about 44% slower than the XFX GeForce 8800 GT 512MB graphics card. At 1600x1200 with AA disabled and the quality set to normal, the XFX GeForce 8800 GT 256MB XXX edition was found to be 56% slower than a 512MB GeForce 8800 GT. The XFX GeForce 8800 GT 256MB Alpha Dog video card looks like it has its work cut out for it!
World in Conflict
World in Conflict (also known as WiC or WIC) is a real-time tactical video game developed by Massive Entertainment and published by Sierra Entertainment for Windows and the Xbox 360. The game was released in North America on 18 September 2007 and was included in our testing as it is a recent DirectX 10 game title. It also has a threaded engine for multi-core processor support, which is ideal for this testing. The plot of World in Conflict has players defending their country, their hometown, and their families in the face of a Soviet-led World War III, delivering an epic struggle of courage and retribution. You are a field commander leading the era's most powerful military machines in the heroic effort to turn back the invasion…one city and suburb at a time. Let's get on to the benchmarking! WIC was tested using the most recent patch available, which is patch number 002.
Results: When the game graphics are set to medium quality the game runs in DirectX 9 mode, so we ran testing at 1600x1200 with these settings. The XFX GeForce 8800 GT 256MB XXX edition did better than expected and was hanging close to the other GeForce 8 series cards! Let's take a look at what happens when the game is set to high graphics quality under DirectX 10.
Results: Now that DirectX 10 has been enabled on high quality settings, the XFX GeForce 8800 GT 256MB XXX edition seems to fall flat on its face. Enabling DirectX 10 graphics at the same resolution caused the performance of the GeForce 8800 GT 512MB to drop 37.5%, but the XFX GeForce 8800 GT 256MB XXX edition decreased 78.9%. It just goes to show you what the additional memory means when it comes to DirectX 10 gaming performance. The XFX GeForce 8800 GT 256MB XXX edition drops below the 30FPS mark when DirectX 10 and high quality settings are used and wasn't playable (unless you like stutters).
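For readers who want to compute the same kind of percentage drop from their own FRAPS numbers, the calculation is just a relative decrease. The frame rates in the example below are hypothetical placeholders to show the math, not values from our charts:

```python
# Relative performance drop when settings are raised: the decrease in
# average FPS expressed as a percentage of the original frame rate.
def percent_drop(fps_before: float, fps_after: float) -> float:
    """Percentage decrease in average FPS between two test runs."""
    return (fps_before - fps_after) / fps_before * 100.0

# Example: a card falling from 80FPS to 50FPS loses 37.5% of its performance
print(round(percent_drop(80.0, 50.0), 1))  # 37.5
```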
BioShock is a game published by 2K Boston/2K Australia, and designed by Ken Levine. The game is a PC and Xbox 360 title released on August 21, 2007 in North America. BioShock is a first-person shooter with role-playing game customization elements that was developed using the Unreal Engine 3.0 and is a DirectX 10 game title that is multithreaded.
Since 2K didn't include a benchmark script or utility in BioShock, I used FRAPS version 2.9.2 to capture the frame rates for 240 seconds at scenes that I personally selected from the game. If you don't know anything about BioShock, let me set the scene for you: After your plane crashes into icy uncharted waters, you discover a rusted bathysphere and descend into Rapture, a city hidden beneath the sea. Constructed as an idealistic society for a hand-picked group of scientists, artists and industrialists, the idealism is no more. Now the city is littered with corpses, wildly powerful guardians roam the corridors as little girls loot the dead, and genetically mutated citizens ambush you at every turn and you get to kill them. Let's look at the benchmark results!
Benchmark Results: At 1024x768 with high quality settings set in the game, the XFX GeForce 8800 GT 256MB Alpha Dog video card performed close to what the other GeForce 8800 GT 512MB cards were able to do. It was even able to beat out both ATI Radeon HD 3800 series graphics cards in CrossFire!
Benchmark Results: At an in-game resolution of 1600x1200 and medium quality settings, the performance decreased from 82FPS to 48FPS. So far it seems that the XFX GeForce 8800 GT 256MB XXX edition is ideal for gaming at lower resolutions as 1600x1200 starts to stress this 256MB video card.
Crysis is a science fiction first-person shooter computer game that was developed by Crytek and published by Electronic Arts. It was released on November 15, 2007 in the United States. The game is based on the CryENGINE2 game engine, which is an extended version of CryENGINE, the game engine behind the hit game Far Cry a number of years ago.
On October 26, 2007, Crytek released a single-player demo that has the entire first level, Contact, as well as the sandbox editor included. We used FRAPS to benchmark this multi-threaded DirectX 10 demo.
Results: Crysis is obviously tough on video cards with high graphics settings, with only the XFX GeForce 8800 Ultra XXX edition having an average frame rate above 40. The XFX GeForce 8800 GT 256MB Alpha Dog XXX edition was running under 20FPS at 1280x1024 with high graphics quality settings. Crysis has the ability to run 'Very High' graphics quality settings, so even with the game backed down a bit the XFX GeForce 8800 GT 256MB XXX edition wasn't too impressive. The ATI Radeon HD 3850 also has 256MB of GDDR3 memory and was able to run it nearly 10FPS faster than the XFX GeForce 8800 GT 256MB XXX edition.
Results: At 1600x1200 with medium graphics quality the XFX GeForce 8800 GT 256MB XXX edition manages to do better and was able to play the game without many annoying stutters. The difference between Medium and High graphics settings in Crysis is like night and day and the benchmarks show it.
Call of Duty 4
Call of Duty 4: Modern Warfare is a first-person shooter developed by Infinity Ward and published by Activision for Xbox 360, PlayStation 3 and PC. It is the fourth installment in the Call of Duty video game series. It was announced on April 25, 2007 and was released on November 6, 2007 in North America. The single player game can be completed in well under seven hours, but the graphics are awesome. Click the image below to see Call of Duty 4 at 1920x1200 resolution with 4x AA enabled on the ATI Radeon HD 3870 graphics card.
Call of Duty 4: Modern Warfare runs on a proprietary graphics engine, and has features such as true world-dynamic lighting, HDR lighting effects, dynamic shadows and depth-of-field. "Bullet Penetration" is calculated by the engine, taking into account things such as surface type and entity thickness. Certain objects, such as cars, and some buildings are destructible. This makes distinguishing cover from concealment important, as meager protection such as wooden fences, thin walls and such no longer provide sufficient protection. The bullet's speed and stopping power are decreased after penetrating an object, and this decrease is calculated realistically depending on the thickness and surface of the object penetrated. The game also makes use of a physics engine, which was not implemented in previous Call of Duty titles for the PC. Death Animations are a combination of pre-set animations and ragdoll physics. Some mistook the game's graphics to be DirectX 10 based, but it is stated that the graphics use DirectX 9.
Results: This article is the second time I've included Call of Duty 4 as a benchmark and the results are from a single player level from the game. I dropped a few of the older video cards from the charts, but all the latest are still included. The XFX GeForce 8800 GT 256MB XXX edition was above 40FPS at 1280x1024 with 4x AA turned on, which is great. It was just slightly behind the GeForce 8800 GT 512MB video cards and ahead of both of the ATI Radeon HD 3800 series cards.
Results: At 1920x1200 with 4xAA enabled the XFX GeForce 8800 GT 256MB XXX edition dropped below 15FPS, but remember 4x AA is turned on. The card started to show signs of fatigue in outdoor environments and the game often stuttered. It's obvious that the XFX GeForce 8800 GT 256MB XXX edition isn't aimed at those that have 24" or larger monitors.
3DMark 06 is the worldwide standard in advanced 3D game performance benchmarking and the latest version in the popular 3DMark series! 3DMark06 tests include all new HDR/SM3.0 graphics tests, advanced SM2.0 graphics tests, AI and physics driven single and multiple cores or processor CPU tests and a collection of comprehensive feature tests to reliably measure next generation gaming performance today.
Benchmark Results: The XFX GeForce 8800 GT 256MB XXX edition scored on average 11950 points on 3DMark 2006, which is close to what the 512MB versions of the card can do. The XFX GeForce 8800 GT 256MB XXX edition does well in this benchmark, but our game testing showed different results in the real world.
LIGHTSMARK is a new benchmark/demo with real-time global illumination and penumbra shadows created by Stepan Hrbek. Lightsmark version 1.3 was used as it contains new rendering paths for ATI Radeon HD 2xxx and 38XX series graphics cards. Before version 1.3 was released, the ATI Radeon HD 38xx series video cards were unable to render objects in the benchmark.
- realtime global illumination
- realtime penumbra shadows
- realtime color bleeding
- infinite light bounces
- fully dynamic HDR lighting
- 220000 triangles in scene
It should be noted that ATI questioned our use of this benchmark as the developer is a private individual that never contacted developer relations at ATI. ATI also made it clear to us that although this benchmark uses global illumination, it is not similar to the DX10.1 demo that ATI has been showing. ATI isn't sure how the benchmark is made as they have not had time to look into it, but if it's rendering to cube maps then it's likely that performance could be increased if the app used DX10.1's indexed cube maps.
To be fair to both sides, we contacted the creator of Lightsmark 2007, Stepan Hrbek, and he had this to say:
"I developed it with 3 Radeon and 3 GeForce cards, randomly switching them, there are no special optimizations, IMHO it's fair. I bought all 6 cards, no gifts.. Small unfairness is only in quality. The same shader on Nvidia card produces smoother shadows but I don't give Nvidia any bonus points for quality, only fps is measured. It uses completely new technique where part of calculation runs on CPU, drivers were not optimized for it for years, so it's possible that it hits some unoptimized driver code. But both companies are in the same situation."
Since we ran the test, we will go ahead and include it, but what the results mean is up in the air.
Benchmark Results: The XFX GeForce 8800 GT 256MB XXX edition performed better than the GeForce 8800 GTX in this new benchmark, which is a bit shocking. This is one of the first benchmarks that uses global illumination and realtime penumbra shadows, so these results are new and interesting.
Benchmark Results: With the resolution increased to 1600x1200, the XFX GeForce 8800 GT 256MB XXX edition broke 200 frames per second but was slower than the GeForce 8800 GTX now that the resolution was increased. Even if this benchmark is not 'fair' to ATI and NVIDIA cards, it at least shows the differences among the NVIDIA cards and among the ATI cards when looking at just one brand at a time.
Power Consumption and Conclusions
For testing power consumption, we took our test system and plugged it into a Seasonic Power Angel. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers, we measured the peak wattage used by the system while running the game World in Conflict at 1600x1200 with medium graphics quality.
Power Consumption Results: When it comes to power consumption the XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition did really well during gaming as the system consumed less power than either of the Radeon HD 3800 series cards. The Radeon HD 3850 256MB card and Radeon HD 3870 were both more efficient at idle though, which is worth pointing out. The XFX GeForce 8800 GT 256MB consumes 30W less than the GeForce 8800 GT 512MB, which goes to show how much power memory ICs use.
The GeForce 8800 GT 256MB video card was never designed to compete with higher end graphics cards as it only has 256MB of memory. When NVIDIA briefed the media, they showed performance charts with resolutions of 1280x1024 and 1600x1200 when it came to the performance numbers. That right there goes to show you that this video card is aimed at those with 20" or smaller monitors running 1600x1200 and lower resolutions. Even at smaller resolutions the GeForce 8800 GT 256MB had a hard time keeping up with the Radeon 3850 and Radeon 3870, which are both priced at or under what the GeForce 8800 GT 256MB is available at. The GeForce 8800 GT 256MB faces tough competition thanks to the Radeon 3800 series, and it only clearly beat the Radeon 3800s in BioShock. Be sure to consider what games you play the most and the performance of the cards on them when you make your next purchase.
XFX has done a wonderful job with their XXX Edition video card for the GeForce 8800 GT 256MB series. Their improved cooler is a nice touch that makes their GeForce 8800 GT video cards stand out from the rest. It's tough to make a video card stand out from the crowd, but XFX has been finding ways to make it happen. The black PCB makes it by far one of the sexiest video cards we have ever seen, which is a plus for those that want their system to look as good as it performs. The bundle with the XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition was solid, and it was nice to see the game Lost Planet included. The 650MHz core clock and 1.6GHz memory clocks on the XXX edition are nice, as it's a card you can plug in and just run. Users don't have to download third party applications and flash the BIOS to overclock, which is nice and worth the money if you aren't comfortable doing this yourself.
XFX also has a great warranty program where they will repair and service your GeForce 8800 GT Series graphics card for as long as you live. Even for those of you who know how to push their cards to the limits, if anything goes wrong, XFX will service it free of charge. All you have to do is register the card with XFX online. If you ever decide to sell or give the card away to someone, XFX will still honor the protection plan for the second owner as well. They just need to register the card with XFX. You get the protection AND the added value of being able to pass along a full warranty. XFX is one of the few companies that has a no BS warranty, and that is a big thumbs up for those that like to overclock or have bad luck with computer hardware.
When it comes to pricing the XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition (PVT88PUDD4) that we reviewed today runs $229.99 plus shipping over at Newegg and is in stock at the time of writing. It also has a $10 mail-in rebate right now, so that brings the price down to $219.99 plus shipping. If you're looking for a graphics card with the latest DirectX 10 features and only play games at lower resolutions (1024x768 - 1600x1200) then this card should be on your short list of cards to consider.
Legit Bottom Line: The XFX GeForce 8800 GT 256MB Alpha Dog XXX Edition is factory overclocked and features a sexy black PCB, and at lower resolutions it performs just as well as it looks.