NVIDIA Releases The GeForce 9500 GT
NVIDIA doesn't usually send me the bottom half of their product line, but they made an exception with the recently announced GeForce 9500 GT graphics card. For the most part ATI and NVIDIA send out their high-end offerings like the upcoming Radeon HD 4870 X2 or the GeForce GTX 280. Not everyone can afford a $400-$600 graphics card, and to be honest the cards that sell the most are the entry-level cards. NVIDIA expects the final e-tail pricing to come in at about the same level as the 8600 GT, which is currently available in the low $80 USD range. This makes the NVIDIA GeForce 9500 GT an extremely affordable solution for those wanting something better than integrated graphics without breaking the bank. A 'mainstream' graphics card is defined as one that does not require an external PCIe power connector and is priced below $100 in the retail market. Here is the new NVIDIA mainstream graphics board lineup, in descending order of price and performance, as of July 29, 2008:
- GeForce 9500 GT — 256 MB / 512 MB GDDR3
- GeForce 9500 GT — 512 MB DDR2
- GeForce 8500 GT — 512 MB DDR2
- GeForce 8400 GS — 256 MB DDR2
Let's take a look at the budget friendly GeForce 9500 GT 256MB GDDR3 version and see what it has to offer.
The first thing to point out on the GeForce 9500 GT is that it is a single-slot solution with a cooling fan that is actually quiet during use. The second thing to notice is that the GeForce 9500 GT doesn't need any additional power, as the maximum thermal design power (TDP) is just 50W for the entire video card. Since the x16 PCI Express slot can deliver ~75W, the GeForce 9500 GT has ample power from the PCIe slot alone. The GeForce 9500 GT PCB is 6.875" long, with the overall length of the card including the back plate coming in at 6.9375", which means that it will fit in smaller cases. This is ideal for those building HTPCs or anything with a small chassis.
The rear of the card doesn't have too much to see as all the memory ICs are located on the front under the GPU cooler.
The reference GeForce 9500 GT that I am looking at features two dual-link, HDCP-enabled DVI-I outputs for connecting to analog and digital PC monitors and HDTVs, plus a 7-pin analog video-out port that supports S-Video directly and composite and component (YPrPb) output via an optional dongle. DisplayPort is supported by the GPU, but was not used by NVIDIA on the reference design; add-in-board (AIB) partners may still include it on their own boards.
The top of the 9500 GT features a single SLI connector, which means that two-way SLI is an option but 3-way SLI is not possible. This shouldn't be a deal breaker, as most mainstream motherboards have just a single x16 PCI Express graphics slot. The small black 2-pin connector next to the SLI bridge is an input for digital audio pass-through on an HDMI connection. The GeForce 9500 GT GPU natively supports HDMI and DisplayPort. As noted above, it will be up to the add-in card partners to release boards with HDMI, a DVI-to-HDMI dongle, or DisplayPort at their discretion.
Under The Heat Sink
By quickly removing the four screws that hold on the GPU cooler we can see what is hiding underneath. The core is clocked at 600MHz with 32 stream processors clocked at 1.4GHz. The four memory ICs on our reference sample were part number K4J52324QE-BJ1A, which are rated at 1000MHz at 1.9V and are lead-free and RoHS-compliant.
Here is the feature chart that NVIDIA provided us for the GeForce 9500 GT. The only thing we noticed that was incorrect in this chart is the memory clock frequency: it shows 800MHz, while our card was running its memory at 1000MHz according to CPU-Z (see the screen shot on the next page). This also means the memory bandwidth figure in the chart is incorrect.
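As a quick sanity check on that claim, here is the peak-bandwidth arithmetic, assuming the GeForce 9500 GT's 128-bit memory interface (a spec not quoted in the excerpt above):

```python
# Peak memory bandwidth = bus width in bytes x effective data rate.
# GDDR3 is double data rate, so the effective rate is 2x the memory clock.
# The 128-bit bus width is an assumption; the chart excerpt doesn't list it.
BUS_WIDTH_BITS = 128

def bandwidth_gbs(memory_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s for double-data-rate memory at a given clock."""
    effective_mts = 2 * memory_clock_mhz          # mega-transfers per second
    return (BUS_WIDTH_BITS / 8) * effective_mts / 1000

print(bandwidth_gbs(800))   # chart's 800MHz figure -> 25.6 GB/s
print(bandwidth_gbs(1000))  # the 1000MHz CPU-Z reported -> 32.0 GB/s
```

If the bus-width assumption holds, the chart understates bandwidth by the same 25% the clock figure is off by.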
Here is a shot of the GPU core with a US quarter sitting next to it to give you an idea of how big it actually is. All of the initial GeForce 9500 GT GPUs will be manufactured at 65nm, with a 55nm refresh following down the road. The GPU is comprised of 314M transistors, making for a small die compared to the GTX 280 package.
The Test System
The test system was running Windows Vista Ultimate 64-bit with all available Microsoft updates, including the hotfixes required for enthusiast video cards to run correctly. ATI graphics cards were tested using ATI CATALYST 8.7 drivers. NVIDIA Forceware 175.16 WHQL drivers were used on all GeForce graphics cards, with three exceptions: the GeForce GTX 280 series cards used Forceware 177.34, the GeForce 9800 GTX+ used Forceware 177.39, and the GeForce 9500 GT/9600 GT both used Forceware 177.79. All results shown in the charts are averages of at least three runs from each game or application used.
GPU-Z on the GeForce 9500 GT Graphics Card:
The Video Cards:
- NVIDIA GeForce 9500 GT (600MHz/1000MHz GDDR3)
- NVIDIA GeForce 9600 GT (GDDR3)
- ATI Radeon HD 4870 X2 (750MHz/1800MHz GDDR5)
- HIS Radeon HD 4870 (750MHz/1800MHz GDDR5)
- Sapphire Radeon HD 4850 (625MHz/1986MHz GDDR3)
- NVIDIA GeForce 9800 GTX+ (738MHz/2200MHz GDDR3)
- EVGA GeForce GTX 280 Hydro Copper 16 (670MHz/2430MHz)
- PNY GeForce GTX 280 (602MHz/2214MHz)
- XFX GeForce 9800 GX2 (600MHz/2000MHz)
- XFX GeForce 8800 GTX 768MB
- Diamond Radeon HD 3870 X2 - 1GB (825MHz/1800MHz GDDR3)
All of the video cards were tested on our Intel X38 Express test platform, which is loaded with the latest and greatest hardware. The Intel Core 2 Extreme QX9770 'Yorkfield' processor was used for testing, as it proved to be the best desktop processor when it comes to game performance. The test system was also loaded with 4GB of memory and water cooled to ensure throttling of the processor or memory wouldn't cause any issues. The Corsair PC2-9136C5 memory kit was run at 1066MHz with 5-5-5-15 2T memory timings.
|Intel Test Platform|
Intel Core 2 Extreme QX9770
4GB Corsair PC2-9136C5
Western Digital SATA RaptorX
Corsair Nautilus 500
Windows Vista Ultimate
Now that we know exactly what the test system is, we can move along to performance numbers.
Company of Heroes
Company of Heroes (CoH) is a real-time strategy (RTS) computer game developed by Relic Entertainment that was released on September 14, 2006. On May 29, 2007 Relic released a patch for Company of Heroes that supports DirectX 10 and we used the latest patch to test DirectX 10 game performance. Company of Heroes is set during World War II where the player follows a military unit, known as Able Company, as they fight their way through some of the greatest and bloodiest battles, including the D-Day landings at Normandy.
The game was updated to version 2.301 and benchmarked with 8x CSAA enabled and all other quality settings pushed to the maximum.
Benchmark Results: The NVIDIA GeForce 9500 GT did well on Company of Heroes at 1280x1024 and was able to perform better than the ATI Radeon HD 3870, which shocked me. At a resolution of 1920x1200 the GeForce 9500 GT just went to a black screen when running the benchmark and I had to ctrl+alt+delete to get out of the game. No error messages or anything popped up and after several tries I gave up trying to get the GeForce 9500 GT to run at that resolution and our game settings. This was the only game that I had any issues with!
S.T.A.L.K.E.R.: Shadow of Chernobyl
S.T.A.L.K.E.R.: Shadow of Chernobyl uses the 'X-ray Engine' to power the graphics. It is a DirectX 8/9 Shader Model 3.0 graphics engine. Up to a million polygons can be on-screen at any one time, which makes it one of the more impressive engines on the market today. The engine features HDR rendering, parallax and normal mapping, soft shadows, widescreen support, weather effects and day/night cycles. As with other engines that utilize deferred shading (such as Unreal Engine 3 and CryENGINE2), the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a "fake" form of anti-aliasing can be enabled with the static lighting option; this mode uses a technique that blurs the image to give the false impression of anti-aliasing. The game takes place in a thirty square kilometer area, and both the outside and inside of this area are rendered to the same level of detail.
The game was benchmarked with full dynamic lighting and maximum quality settings at 1920x1200 and 1280x1024 resolutions.
Benchmark Results: S.T.A.L.K.E.R.: Shadow of Chernobyl ran fine on the GeForce 9500 GT at 1280x1024, but at 1920x1200 the card started to show signs of needing more horse power and video memory.
BioShock
BioShock is a game published by 2K Boston/2K Australia and designed by Ken Levine. The game is a PC and Xbox 360 title released on August 21, 2007 in North America. BioShock is a first-person shooter with role-playing customization elements that was developed on Unreal Engine 3 and is a multithreaded DirectX 10 game title.
Since 2K didn't include a benchmark script or utility in BioShock, I used FRAPS version 2.9.2 to capture frame rates for 240 seconds in scenes that I personally selected from the game. If you don't know anything about BioShock, let me set the scene for you: after your plane crashes into icy uncharted waters, you discover a rusted bathysphere and descend into Rapture, a city hidden beneath the sea. Constructed as an idealistic society for a hand-picked group of scientists, artists and industrialists, the idealism is no more. Now the city is littered with corpses, wildly powerful guardians roam the corridors as little girls loot the dead, and genetically mutated citizens ambush you at every turn. Let's look at the benchmark results!
Benchmark Results: In BioShock with everything cranked to high, the GeForce 9500 GT was able to run the game at 1280x1024, but it was a bit choppy in certain places. By lowering the image quality to medium, BioShock is more than playable at 1280x1024, so you'll just have to tone things back a bit for this title.
Crysis
Crysis is a science fiction first-person shooter computer game that was developed by Crytek and published by Electronic Arts. It was released on November 15, 2007 in the United States. The game is built on the CryENGINE2 game engine, an extended version of CryENGINE, the engine behind the hit game Far Cry a number of years ago.
The full retail version of Crysis was used with patch 1.2 for benchmarking. FRAPS was used instead of the internal benchmark utility to help avoid driver enhancements. Legit Reviews has only NVIDIA data for this game, as we just recently updated the game to version 1.2 and picked a new scene to run FRAPS on; we didn't have time to re-test all the cards for this article, but managed to test a number of the high-end cards.
Results: The NVIDIA GeForce 9500 GT gets killed at 1280x1024 with high quality settings in this game. If you tone the settings back to low or medium you can play Crysis with respectable frame rates. The only bummer is that the game looks awesome with high image quality settings even without AA enabled, and those settings are out of the 9500 GT's reach.
Call of Duty 4
Call of Duty 4: Modern Warfare is a first-person shooter developed by Infinity Ward and published by Activision for the Xbox 360, PlayStation 3 and PC. It is the fourth installment in the Call of Duty video game series. It was announced on April 25, 2007 and released on November 6, 2007 in North America. The single-player game can be completed in well under seven hours, but the graphics are awesome.
Call of Duty 4: Modern Warfare runs on a proprietary graphics engine with features such as true world-dynamic lighting, HDR lighting effects, dynamic shadows and depth of field. "Bullet penetration" is calculated by the engine, taking into account things such as surface type and entity thickness. Certain objects, such as cars and some buildings, are destructible. This makes distinguishing cover from concealment important, as meager cover such as wooden fences and thin walls no longer provides sufficient protection. The bullet's speed and stopping power are decreased after penetrating an object, and this decrease is calculated realistically depending on the thickness and surface of the object penetrated. The game also makes use of a physics engine, which was not implemented in previous Call of Duty titles for the PC. Death animations are a combination of pre-set animations and ragdoll physics. Some mistook the game's graphics for DirectX 10, but the developer has stated that the engine uses DirectX 9.
Results: Call of Duty 4 with the graphics cranked up to 4xAA and 16xAF was too tough for the GeForce 9500 GT, but the card averaged 25 FPS at 1280x1024. By lowering the AA and AF settings the game play becomes smooth and the GeForce 9500 GT does very well for an $80 video card.
Lightsmark
Lightsmark is a new benchmark/demo with real-time global illumination and penumbra shadows created by Stepan Hrbek. Lightsmark version 1.3 was used, as it contains new rendering paths for ATI Radeon HD 2xxx and 38xx series graphics cards. Before version 1.3 was released, the ATI Radeon HD 38xx series video cards were unable to render objects in the benchmark.
- realtime global illumination
- realtime penumbra shadows
- realtime color bleeding
- infinite light bounces
- fully dynamic HDR lighting
- 220,000 triangles in scene
It should be noted that ATI questioned our use of this benchmark, as the developer is a private individual who never contacted developer relations at ATI. ATI also made it clear to us that although this benchmark uses global illumination, it is not similar to the DX10.1 demo that ATI has been showing. ATI isn't sure how the benchmark is made, as they have not had time to look into it, but if it's rendering to cube maps then it's likely that performance could be increased if the app used DX10.1's indexed cube maps.
To be fair to both sides we contacted the creator of Lightsmark 2007, and Stepan Hrbek had this to say:
"I developed it with 3 Radeon and 3 GeForce cards, randomly switching them, there are no special optimizations, IMHO it's fair. I bought all 6 cards, no gifts.. Small unfairness is only in quality. The same shader on Nvidia card produces smoother shadows but I don't give Nvidia any bonus points for quality, only fps is measured. It uses completely new technique where part of calculation runs on CPU, drivers were not optimized for it for years, so it's possible that it hits some unoptimized driver code. But both companies are in the same situation."
Since we ran the test, we will go ahead and include it, but what the results mean is up in the air.
Benchmark Results: Lightsmark 2007 v1.3 shows that the GeForce 9500 GT has solid performance numbers at 1280x1024, but at 1920x1200 it chokes as the 256MB of GDDR3 just isn't enough.
3DMark06
3DMark06 is the worldwide standard in advanced 3D game performance benchmarking and the latest version in the popular 3DMark series! 3DMark06 tests include all-new HDR/SM3.0 graphics tests, advanced SM2.0 graphics tests, AI and physics driven single and multiple core or processor CPU tests, and a collection of comprehensive feature tests to reliably measure next generation gaming performance today.
Default 3DMark06 settings were used for testing, so a resolution of 1280x1024 was used.
Benchmark Results: 3DMark06 showed that the NVIDIA GeForce 9500 GT scores right around 7,000 3DMarks in the overall test, which is solid for a mainstream graphics card.
3DMark Vantage
3DMark Vantage is the new industry standard PC gaming performance benchmark from Futuremark, newly designed for Windows Vista and DirectX 10. It includes two new graphics tests, two new CPU tests, several new feature tests, and support for the latest hardware. 3DMark Vantage is based on a completely new rendering engine, developed specifically to take full advantage of DirectX 10, the new graphics API from Microsoft.
Default 3DMark Vantage settings were used for testing, so a resolution of 1280x1024 was used.
Test Results: The GeForce 9500 GT was able to make it through 3DMark Vantage alive and turned in an overall score of 2,261 3DMarks, which is about half what the GeForce 9600 GT scored.
Test Results: The GPU score in Vantage wasn't that pretty, but this is a tough new DirectX 10 benchmark.
Folding@Home
Folding@Home, the distributed-computing protein-folding application from Stanford University, runs more than 100× faster on the GPU than on the fastest CPU! We tried out F@H with the just-released GPU2 client version 6.20 to see how the new GeForce 9500 GT does on the latest client and public video card drivers. This version just came out on August 1st, 2008 and features a number of improvements over previous versions, which have all now expired.
Folding on NVIDIA graphics cards has come a long way from when I first saw the demo working, and while it is still not 100% stable (my personal system still gets driver crashes while folding) it seems to be getting close. NVIDIA has 70 million graphics cards on the market today that can run this folding client, which means that the folding project could get a huge boost in performance if more end users would join the project. To see if your video card supports CUDA, check out this list of supporting cards. If your card supports CUDA then you can try out protein folding, and if you like it, join our folding team, number 38296.
The client comes with a viewer that displays the current Nanoseconds/Day performance rate (the number of nanoseconds simulated on a per-day basis), shown above. Why nanoseconds? Protein folding is measured in nanoseconds per day, or how many nanoseconds of the protein's life can be simulated in a day's worth of computing time. The Viewer, due in part to its current requirement of moving data off of and back to the GPU to render, reduces folding performance. Performance is best measured by closing the Viewer for a little while and reading the log file. We did that on the XFX GeForce 9600 GT and the NVIDIA GeForce 9500 GT to see what the folding performance difference is.
- XFX GeForce 9600 GT (650MHz Core/1.8GHz Memory) - 71.23 NS/Day
- NVIDIA GeForce 9500 GT (600MHz Core/2.0GHz Memory) - 37.88 NS/Day
The GeForce 9500 GT and GeForce 9600 GT were both running Forceware 177.79 video card drivers and were working on the same work unit. The GeForce 9600 GT was found to be 88% faster when it comes to folding performance, which is a significant amount, as the chart above shows if you look at the time it takes to complete a step. The reason for this difference is that the GeForce 9500 GT has only 32 stream processors running at 1,400MHz, while the GeForce 9600 GT has 64 stream processors operating at a faster 1,625MHz. The GeForce 9500 GT is still going to improve protein folding performance over CPU folding, so if you get one of these cards be sure to download the GPU2 client and try it out for yourself.
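The 88% figure falls straight out of the two NS/Day rates listed above; here is the math as a quick sketch:

```python
# Folding@Home rates measured above, in nanoseconds simulated per day.
ns_per_day = {
    "GeForce 9600 GT": 71.23,
    "GeForce 9500 GT": 37.88,
}

# Relative speedup of the 9600 GT over the 9500 GT.
speedup = ns_per_day["GeForce 9600 GT"] / ns_per_day["GeForce 9500 GT"] - 1
print(f"GeForce 9600 GT is {speedup:.0%} faster at folding")  # -> 88% faster
```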
One other thing I noticed when testing F@H was that power consumption was actually lower than when gaming! I didn't expect this, so I made a quick chart to show the power differences between the two cards.
As you can see, running F@H uses roughly 20W more power on each card than sitting at idle, but that's 20-30W less than playing a video game like Call of Duty 4. I thought these numbers were interesting, as folding doesn't require much more power on these cards than idling. Not only does the GPU beat the CPU in terms of folding performance, it also seems to be very efficient.
Power Consumption and Final Thoughts
For testing power consumption, we took our test system and plugged it into a Seasonic Power Angel. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers, we measured the peak wattage used by the system while running the game Call of Duty 4 at 1280x1024 with high graphics quality.
Power Consumption Results: Looking at total system power consumption levels, the GeForce 9500 GT was very impressive. The GeForce 9500 GT used the least amount of power at idle and load! It used 10W less power at idle than the GeForce 9600 GT and 32W less when playing Call of Duty 4.
For starters, I apologize for not including more mainstream graphics cards in the performance charts! NVIDIA never sampled me any GeForce 8500 GT or 8400 GS graphics cards, which are the other cards in NVIDIA's mainstream lineup. I added the GeForce 9600 GT to the performance charts for this article, and as you can tell the GeForce 9600 GT is nearly twice as fast as the GeForce 9500 GT in the majority of the benchmarks and in protein folding.
The street price on the GeForce 9500 GT is expected to be $80, but is sure to drop with rebates in the months to come. Major online retailers are already selling the GeForce 9500 GT, and prices start at $69.99 on Newegg. For $69.99 you can get your hands on a 512MB GDDR2 version of this card by EVGA (remember, our review used the 256MB GDDR3 version). When it comes to pricing on the GeForce 9600 GT, it can be found on Price Grabber for $104.99 over at Newegg with a $5 rebate, making it $99.99. The GeForce 9600 GT that we tested the 9500 GT against offers much more performance, as my testing showed a 62% performance gain in Call of Duty 4 and a 102% boost in BioShock at 1280x1024. For those that don't want to break the bank and game at lower resolutions, the GeForce 9500 GT is sure to play the current game titles, but you'll have to keep an eye on the settings, as you'll need to tone things back to medium in some games to reach smooth frame rates.
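To put the value question in plain numbers, here is a small sketch using only the street prices and the two gains quoted above (all figures come from this article):

```python
# Street prices quoted above (USD; 9600 GT price is after the $5 rebate).
price_9500_gt = 69.99
price_9600_gt = 99.99

# Measured gains of the 9600 GT over the 9500 GT at 1280x1024.
gains = {"Call of Duty 4": 0.62, "BioShock": 1.02}

premium = price_9600_gt / price_9500_gt - 1
print(f"9600 GT price premium: {premium:.0%}")  # -> 43%
for title, gain in gains.items():
    print(f"{title}: {gain:.0%} more performance for that premium")
```

In both games the performance gain outpaces the roughly 43% price premium, which lines up with the point that the 9600 GT offers much more performance per dollar if the budget stretches to $100.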
Performance aside, the GeForce 9500 GT offers consumers other things like compact size, a quiet cooler and low power consumption. The fact that it doesn't need any additional power connections means that it will make for an easy upgrade for many that don't have a power supply able to handle more devices. These strong areas make for great selling points, not to mention that the GeForce 9500 GT can run F@H and is CUDA-ready for parallel applications. In the future things like GPU video transcoding will be sure to take off, and the GeForce 9500 GT can already run commercial software packages that support GPU video transcoding, like the Badaboom Media Converter from Elemental Technologies. If that doesn't impress you, keep in mind that it also supports NVIDIA PhysX technology, which will be seen in more upcoming game titles. For a video card that can be found for $69.99 it has a ton of features, and I have to give the card a thumbs up.
Legit Bottom Line: The GeForce 9500 GT isn't going to impress you with high resolution gaming performance, but it can run all the latest game titles, Folding@Home, NVIDIA PhysX and GPU video transcoding for under $70.