Battle For Mainstream Market Share

The battle between ATI and NVIDIA reached a tense moment today when ATI lifted the gag order on the ATI Radeon HD 4850 graphics card more than a week early. NVIDIA was trying to spoil the launch of the ATI Radeon HD 4850 by launching an ultra-secret graphics card it has been working on for many months. NVIDIA wasn't quite ready, so the GeForce 9800 GTX+ was rushed to the media this week; we had the card for less than 24 hours before it was officially announced this morning. Why are both companies in such a rush to get these cards to market at the same time? The answer is simple: they are priced between $199 and $229! This price point is critical to market share and makes up the bulk of what is sold to gamers and enthusiasts for the back-to-school shopping season. These are the sexy cards that offer the best price versus performance and will remain some of the most talked-about cards for the rest of 2008.

ATI Radeon HD 4850 and GeForce 9800 GTX+

It seems NVIDIA quietly updated the G92 core that is found on many GeForce 8 series GPUs as well as the GeForce 9800 GTX and GeForce 9800 GX2. NVIDIA shrank the die from 65nm to 55nm, increased the frequency of the core and shaders, reduced the cost, and added driver support for PhysX. In a sense, NVIDIA has now completed the move from enterprise computing to visual computing and beyond. NVIDIA GeForce graphics cards can now run CUDA-enabled clients like Folding@Home and utilize the AGEIA PhysX SDK, which means that just by owning one of the supported NVIDIA graphics cards you can enable game physics without a separate physics card or a secondary graphics card. If you ask us, this is more revolutionary than evolutionary.

NVIDIA GeForce GTX 280 and GeForce 9800 GTX+

The just-announced NVIDIA GeForce 9800 GTX+ graphics card is, from what NVIDIA has informed us, nothing more than a die shrink of the G92-based GeForce 9800 GTX. The original GeForce 9800 GTX had a core frequency of 675MHz, and its 128 processor cores (shaders) operated at 1688MHz. Thanks to the new 55nm die, NVIDIA was able to run a core clock of 738MHz and a processor clock of 1836MHz. The memory remains the same at 1100MHz. This boost in core and shader clocks is enough to make it the fastest GPU in the NVIDIA GeForce 9 family. Pictured above are the flagship GeForce GTX 280 on top and the just-announced GeForce 9800 GTX+ on the bottom, to give an idea of the layout of each card.
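
Back-of-the-envelope, the die shrink bought a clock bump of just under ten percent on both domains. A quick sketch of that arithmetic (our own math, using the clocks quoted above):

```python
# Quick check of the 9800 GTX -> 9800 GTX+ clock bump (our own arithmetic).
def pct_gain(old_mhz: float, new_mhz: float) -> float:
    """Percentage increase from the old clock to the new clock."""
    return (new_mhz - old_mhz) / old_mhz * 100

print(f"Core:   675 -> 738 MHz   (+{pct_gain(675, 738):.1f}%)")   # +9.3%
print(f"Shader: 1688 -> 1836 MHz (+{pct_gain(1688, 1836):.1f}%)") # +8.8%
```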

The ATI Radeon HD 4850 Graphics Card

Sapphire Radeon HD 4850 Graphics Card Retail Box

The retail card that we have is the Sapphire Radeon HD 4850, which carries an MSRP of $199 and will be the jewel of the price versus performance charts for months to come. It is also the first graphics card with a teraFLOP of compute power. Not bad at all for a price tag of just $199!
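
That teraFLOP figure checks out against the card's published specifications, assuming each of the 800 stream processors issues one multiply-add (counted as two floating-point operations) per clock:

```python
# Theoretical peak throughput of the Radeon HD 4850 (our arithmetic, assuming
# one multiply-add, i.e. two FLOPs, per stream processor per clock).
stream_processors = 800
flops_per_clock   = 2       # a multiply-add counts as two operations
core_clock_hz     = 625e6   # 625MHz core clock

peak_tflops = stream_processors * flops_per_clock * core_clock_hz / 1e12
print(f"Peak throughput: {peak_tflops:.2f} TFLOPs")  # 1.00 TFLOPs
```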

Sapphire Radeon HD 4850 Graphics Cards

Here are a pair of cards lying in opposite directions so you can get a feel for the layout of the Radeon HD 4850 series. Sapphire used the reference design from ATI, so the PCB and heat sink should be good to go. Notice that the Radeon HD 4850 takes up just a single slot! This is good news, and it means that CrossFireX with four cards is possible on boards that have four PCI Express x16 slots.

Sapphire Radeon HD 4850 Graphics Card 6-pin Power

You'll notice that the Sapphire Radeon HD 4850 has just a single 6-pin PCI Express power connector on the board, and that is because it uses just 110 Watts of power at peak! The box of the Sapphire Radeon HD 4850 states that a 450W or greater power supply with a 75W 6-pin PCI Express power connector is recommended. If you want to run these cards in CrossFire, you'll need at least a 550W power supply and two 6-pin connectors. The box doesn't say what you need for 3-way and 4-way CrossFireX, and we are still under an embargo on information that isn't on the box or in the presentation deck, but you can figure out the math on that one.
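
For what it's worth, here is one way that math might sketch out. This is our own extrapolation from the box specs, not official ATI guidance, and the base system figure is an assumption:

```python
# Rough PSU sizing for CrossFireX (our own extrapolation, NOT ATI guidance).
base_system_watts = 340   # assumed: 450W single-card recommendation minus one card
card_peak_watts   = 110   # peak draw quoted for a single Radeon HD 4850

for cards in (1, 2, 3, 4):
    total = base_system_watts + cards * card_peak_watts
    print(f"{cards} card(s): ~{total}W estimated draw, {cards} 6-pin connector(s)")
# Two cards lands near the 550W CrossFire recommendation printed on the box.
```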

Sapphire Radeon HD 4850 Graphics Card DVI

The Sapphire Radeon HD 4850 comes with two dual-link DVI outputs, an HDTV output, and HDMI output support through the included DVI-to-HDMI adapter.

Sapphire Radeon HD 4850 Graphics Card CrossFireX

As previously mentioned the board supports CrossFireX for those that want more performance!

Sapphire Radeon HD 4850 Graphics Card GPU-Z

If you look at the GPU-Z screen shot above, you can see some of the basic information about the card. Our Sapphire Radeon HD 4850 was running a core clock of 625MHz, and the memory clock was at just 993MHz. The card has 800 unified shaders and 16 ROPs, which are correctly identified by GPU-Z.

The GeForce 9800 GTX+ Video Card

NVIDIA GeForce 9800 GTX+ Video Card

Just looking at the front and rear of the two GeForce 9800 GTX+ graphics cards, not much looks different from an ordinary GeForce 9800 GTX. Don't be fooled, though: NVIDIA claims the GeForce 9800 GTX+ will be the only card that offers fantastic 3D performance, high-fidelity physics, and world-class GPU computing for value-conscious gamers. That is a big claim to make, but we will see how it holds up during testing.

NVIDIA GeForce 9800 GTX+ Video Card

With the heat spreader removed the new 55nm core can be seen along with the Hynix memory ICs.

NVIDIA GeForce 9800 GTX+ Video Card

The GeForce 9800 GTX+ uses a dual-slot design that requires two 6-pin PCIe power connectors for proper usage. If power is not properly hooked up, the video card will sound an alarm and you will definitely hear it. When it comes to power supplies, a single GeForce 9800 GTX+ needs at least a 450-watt power supply unit for proper operation.

The GeForce 9800 GTX+ also provides native support for HDMI output, using a certified DVI-to-HDMI adapter in conjunction with the built-in SPDIF audio connector. This solution differs from ATI's in that it requires an audio connection from the motherboard or sound card to the video card. Both the ATI and NVIDIA solutions send the outgoing audio through the HDMI cable to the TV or receiver, but only NVIDIA needs an internal connection to make this happen. The SPDIF audio connector can be seen just above the LR watermark in the above image.

NVIDIA GeForce 9800 GTX+ Video Card

On the rear of the GeForce 9800 GTX+ one will find the usual two dual-link, HDCP-enabled DVI-I outputs for connection to analog and digital PC monitors and HDTVs, plus a 7-pin analog video-out port that supports S-Video directly, with composite and component (YPrPb) outputs via an optional dongle.

The Test System

The Main Test System

The test system was running Windows Vista Ultimate 64-bit with all available Microsoft updates, including the hotfixes required for enthusiast video cards to run correctly. ATI CATALYST 8.5 drivers were used on the Radeon HD 3870 X2 graphics card, and ATI CATALYST 8.6 drivers were used on the new Radeon HD 4850 graphics card. NVIDIA Forceware 175.16 WHQL drivers were used on all GeForce graphics cards except for the GeForce GTX 280 series cards, which used Forceware 177.34 drivers, and the GeForce 9800 GTX+, which used Forceware 177.39 drivers. All results shown in the charts are averages of at least three runs from each game or application used.
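
The averaging itself is nothing exotic. A minimal sketch of how each charted number is produced (the run values below are made up for illustration):

```python
# Each charted result is a plain average of repeated runs (sample values
# below are hypothetical, for illustration only).
def average_result(runs: list[float]) -> float:
    assert len(runs) >= 3, "every chart entry averages at least three runs"
    return sum(runs) / len(runs)

print(average_result([41.2, 40.8, 41.6]))  # -> 41.2 FPS
```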

The Video Cards:

All of the video cards were tested on our Intel X38 Express Test platform, which is loaded with the latest and greatest hardware.  The Intel Core 2 Quad QX9770 'Yorkfield' processor was used for testing as it proved to be the best desktop processor when it comes to game performance. The test system was also loaded with 4GB of memory and water cooled to ensure throttling of the processor or memory wouldn't cause any issues. The Corsair PC2-9136C5 memory kit was run at 1066MHz with 5-5-5-15 2T memory timings.

Intel Test Platform

| Component        | Brand/Model                  |
|------------------|------------------------------|
| Processor        | Intel Core 2 Quad QX9770     |
| Motherboard      | Gigabyte X38-DQ6             |
| Memory           | 4GB Corsair PC2-9136C5       |
| Video Cards      | See above                    |
| Hard Drive       | Western Digital SATA RaptorX |
| Cooling          | Corsair Nautilus 500         |
| Power Supply     | Corsair HX620W               |
| Operating System | Windows Vista Ultimate       |

Now that we know exactly what the test system is, we can move along to the performance numbers. It should be noted that since both Palit and XFX use the same PCB, BIOS, memory ICs, clock speeds and cooling solutions, we plotted the cards as a single data point. There is no performance difference between the two cards.

Tomb Raider: Anniversary

Tomb Raider: Anniversary Benchmarking

Lara Croft Tomb Raider: Anniversary is the eighth release in the Tomb Raider series. It is a remake of the original Tomb Raider game from 1996 and includes all of the original 'worlds' from Tomb Raider. Created with an advanced version of the engine used for Lara Croft Tomb Raider: Legend, the gameplay mechanics, artificial intelligence and level puzzles of Tomb Raider: Anniversary are now more refined, in-depth and complex. The PC version of this title was released in North America on June 5, 2007.

Tomb Raider: Anniversary Benchmark Settings

Tomb Raider: Anniversary has a setting for full-screen anti-aliasing that allows NVIDIA cards to run 16xQ AA and ATI cards to run at 4x AA. The highest anti-aliasing settings were used on both sets of cards at 1920x1200 to see how the GPUs do in this game title, which actually comes bundled with many of the GeForce 9 series cards.

Tomb Raider: Anniversary Benchmark Results

Results: This is a recently added benchmark here at Legit Reviews, and it continues to give us interesting results at 16xQ AA, as the water-cooled EVGA GTX 280 beats the air-cooled PNY by 6% at 1920x1200. The EVGA GTX 280 also takes a huge step forward over XFX's 9800 GX2 series card with a 26% performance increase at maximum resolution. The Sapphire Radeon HD 4850 takes the lead here, but remember that ATI cards can only go up to 4x AA in this benchmark while the NVIDIA cards were tested at 16xQ AA.

Company of Heroes

Company of Heroes

Company of Heroes (CoH) is a real-time strategy (RTS) computer game developed by Relic Entertainment that was released on September 14, 2006. On May 29, 2007 Relic released a patch for Company of Heroes that supports DirectX 10 and we used the latest patch to test DirectX 10 game performance. Company of Heroes is set during World War II where the player follows a military unit, known as Able Company, as they fight their way through some of the greatest and bloodiest battles, including the D-Day landings at Normandy.

Company of Heroes Benchmark Settings

The game was updated to version 2.301 and benchmarked with 8x CSAA enabled and all other quality settings pushed to the max.

Company of Heroes Benchmark Results

Benchmark Results: Company of Heroes is tough on graphics cards with the latest DirectX 10 patch installed. The GeForce 9800 GTX+ was able to break 40FPS at 1920x1200, which was pretty sweet. The Sapphire Radeon HD 4850 was able to get ATI past the magical 30FPS mark and finally makes the game playable on ATI-based cards with 8x CSAA enabled.

World in Conflict

World in Conflict Benchmarking

World in Conflict (also known as WiC or WIC) is a real-time tactical video game developed by Massive Entertainment and published by Sierra Entertainment for Windows and the Xbox 360. The game was released in North America on September 18, 2007 and was included in our testing as it is a recent DirectX 10 game title. It also has a threaded engine for multi-core processor support, which is ideal for this testing. The plot of World in Conflict has players defending their country, their hometown, and their families in the face of a Soviet-led World War III, delivering an epic struggle of courage and retribution. You are a field commander leading the era's most powerful military machines in the heroic effort to turn back the invasion…one city and suburb at a time. Let's get on to the benchmarking! WiC was tested using the most recent patch available, which is patch number 002.

Very High DX10 Quality Settings

World in Conflict Benchmark Results

Results: At a resolution of 1920x1200 with very high quality settings, the GeForce 9800 GTX+ didn't do badly, but it was the slowest of the bunch at this resolution. The Sapphire Radeon HD 4850 did well in World in Conflict with all the latest patches.

S.T.A.L.K.E.R.

S.T.A.L.K.E.R. Benchmark

S.T.A.L.K.E.R.: Shadow of Chernobyl

S.T.A.L.K.E.R.: Shadow of Chernobyl uses the 'X-ray Engine' to power the graphics. It is a DirectX 8/9 Shader Model 3.0 graphics engine. Up to a million polygons can be on-screen at any one time, which makes it one of the more impressive engines on the market today. The engine features HDR rendering, parallax and normal mapping, soft shadows, widescreen support, weather effects and day/night cycles. As with other engines that utilize deferred shading (such as Unreal Engine 3 and CryENGINE2), the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a "fake" form of anti-aliasing can be enabled with the static lighting option; this format utilizes a technique that blurs the image to give the false impression of anti-aliasing. The game takes place in a thirty square kilometer area, and both the outside and inside of this area are rendered to the same level of detail.

S.T.A.L.K.E.R. Benchmark Settings

The game was benchmarked with full dynamic lighting and maximum quality settings at 1920x1200 and 1280x1024 resolutions.

S.T.A.L.K.E.R. Benchmark Performance

Benchmark Results: S.T.A.L.K.E.R.: Shadow of Chernobyl was and still is a fun game to play now that the developer fixed many of the bugs found in the game through a series of patches. The $229 GeForce 9800 GTX+ made short work of the benchmark with the image quality settings maxed out.  The $199 Sapphire Radeon HD 4850 also did fairly well on this benchmark.

BioShock

BIOSHOCK on NVIDIA GeForce 9800 GX2

BioShock is a game published by 2K Boston/2K Australia and designed by Ken Levine. The game is a PC and Xbox 360 title that was released on August 21, 2007 in North America. BioShock is a first-person shooter with role-playing customization elements that was developed using Unreal Engine 3.0, and it is a multithreaded DirectX 10 game title.

BioShock Benchmark Settings

Since 2K didn't include a benchmark script or utility in BioShock, I used FRAPS version 2.9.2 to capture the frame rates for 240 seconds in scenes that I personally selected from the game. If you don't know anything about BioShock, let me set the scene for you: after your plane crashes into icy uncharted waters, you discover a rusted bathysphere and descend into Rapture, a city hidden beneath the sea. Constructed as an idealistic society for a hand-picked group of scientists, artists and industrialists, the idealism is no more. Now the city is littered with corpses, wildly powerful guardians roam the corridors as little girls loot the dead, and genetically mutated citizens ambush you at every turn, and you get to kill them. Let's look at the benchmark results!
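
Since FRAPS counts the frames rendered during the capture window, each run's average boils down to frames divided by seconds. A minimal sketch (the frame count below is hypothetical):

```python
# Average FPS from a fixed-length FRAPS capture: frames over seconds.
capture_seconds = 240      # the capture window used for each BioShock run
frames_rendered = 14_400   # hypothetical frame count reported by FRAPS

avg_fps = frames_rendered / capture_seconds
print(f"Average: {avg_fps:.1f} FPS")  # Average: 60.0 FPS
```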

BioShock Benchmark Results

Benchmark Results: In BioShock, the GeForce 9800 GTX+ and Sapphire Radeon HD 4850 were close to the same at 1920x1200, but at 1280x1024 the GeForce 9800 GTX+ had a bit more muscle.

Crysis

Crysis Benchmark Results

Crysis is a science fiction first-person shooter computer game that was developed by Crytek and published by Electronic Arts. It was released on November 15, 2007 in the United States. The game is based on the CryENGINE2 game engine, an extended version of the CryENGINE that powered the hit game Far Cry a number of years ago.

Crysis Benchmark Settings

The full retail version of Crysis was used with patch 1.2 for benchmarking. FRAPS was used instead of the internal benchmark utility to help avoid driver enhancements. Legit Reviews has limited comparison data for this game, as we just recently updated the game to version 1.2 and picked a new scene to run FRAPS on; we didn't have time to re-test all the cards for this article, but we managed to get a number of the high-end cards tested.

Crysis Benchmark Results

Results: The GeForce 9800 GX2 is still the top dog in Crysis, but the GeForce 9800 GTX+ and the Radeon HD 4850 both did great in this benchmark. We found 1920x1200 to be playable with medium quality settings and no AA on our test system.

Call of Duty 4

Call of Duty 4 Benchmarking

Call of Duty 4: Modern Warfare is a first-person shooter developed by Infinity Ward and published by Activision for the Xbox 360, PlayStation 3 and PC. It is the fourth installment in the Call of Duty video game series. It was announced on April 25, 2007 and was released on November 6, 2007 in North America. The single-player game can be completed in well under seven hours, but the graphics are awesome.

Call of Duty 4 v1.2 Benchmark Settings

Call of Duty 4: Modern Warfare runs on a proprietary graphics engine and has features such as true world-dynamic lighting, HDR lighting effects, dynamic shadows and depth of field. "Bullet penetration" is calculated by the engine, taking into account things such as surface type and entity thickness. Certain objects, such as cars and some buildings, are destructible. This makes distinguishing cover from concealment important, as meager protection such as wooden fences and thin walls no longer provides sufficient protection. The bullet's speed and stopping power are decreased after penetrating an object, and this decrease is calculated realistically depending on the thickness and surface of the object penetrated. The game also makes use of a physics engine, which was not implemented in previous Call of Duty titles for the PC. Death animations are a combination of pre-set animations and ragdoll physics. Some mistook the game's graphics to be DirectX 10 based, but it has been stated that the graphics use DirectX 9.
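
To make the penetration idea concrete, here is a toy model of the kind of calculation described above. It is entirely our own illustration; the material values and formula are hypothetical, not Infinity Ward's actual code:

```python
# Toy model of bullet penetration (our own illustration; the resistance
# values and formula are hypothetical, not Infinity Ward's actual engine).
MATERIAL_RESISTANCE = {"wood": 0.2, "drywall": 0.1, "concrete": 0.8}

def stopping_power_after(power: float, material: str, thickness_cm: float) -> float:
    """Remaining stopping power after a bullet passes through a surface."""
    loss = min(MATERIAL_RESISTANCE[material] * thickness_cm, 1.0)
    return power * (1.0 - loss)

print(stopping_power_after(100.0, "wood", 2.0))      # thin fence -> 60.0 left
print(stopping_power_after(100.0, "concrete", 2.0))  # thick wall -> 0.0, stopped
```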

Call of Duty 4 v1.2 Benchmark Results

Results: Call of Duty 4 with the graphics cranked up to 4xAA and 16xAF is tough on graphics cards. The Sapphire Radeon HD 4850 and NVIDIA GeForce 9800 GTX+ were both able to play COD4 above 30FPS at 1920x1200. The GeForce GTX 280 is still king of the hill, but the game is playable on nearly all of the cards at 1920x1200. In the end, the Sapphire Radeon HD 4850 was just a bit faster.

Lightsmark 2007

Lightsmark 2007 Benchmark

Lightsmark is a new benchmark/demo with real-time global illumination and penumbra shadows created by Stepan Hrbek. Lightsmark version 1.3 was used, as it contains new rendering paths for ATI Radeon HD 2xxx and 38xx series graphics cards. Before version 1.3 was released, the ATI Radeon HD 38xx series video cards were unable to render objects in the benchmark.

It should be noted that ATI questioned our use of this benchmark, as the developer is a private individual who never contacted developer relations at ATI. ATI also made it clear to us that although this benchmark uses global illumination, it is not similar to the DX10.1 demo that ATI has been showing. ATI isn't sure how the benchmark is made, as they have not had time to look into it, but if it is rendering to cube maps then it is likely that performance could be increased if the app used DX10.1's indexed cube maps.

To be fair to both sides, we contacted the creator of Lightsmark 2007, and Stepan Hrbek had this to say:

"I developed it with 3 Radeon and 3 GeForce cards, randomly switching them, there are no special optimizations, IMHO it's fair. I bought all 6 cards, no gifts.. Small unfairness is only in quality. The same shader on Nvidia card produces smoother shadows but I don't give Nvidia any bonus points for quality, only fps is measured. It uses completely new technique where part of calculation runs on CPU, drivers were not optimized for it for years, so it's possible that it hits some unoptimized driver code. But both companies are in the same situation."

Since we ran the test, we will go ahead and include it, but what the results mean is up in the air.

Lightsmark 1.3 Benchmarking

Benchmark Results: Lightsmark 2007 v1.3 shows that the GeForce 9800 GTX+ has some muscle, but the big shocker is the Radeon HD 4850. The Sapphire Radeon HD 4850 took a nice jump in performance over the Radeon HD 3870 X2 and rivals some of the slower NVIDIA GeForce 8800 GTX based graphics cards.

3DMark 2006

3DMark 2006

3DMark 2006

3DMark 06 is the worldwide standard in advanced 3D game performance benchmarking and the latest version in the popular 3DMark series! 3DMark06 tests include all new HDR/SM3.0 graphics tests, advanced SM2.0 graphics tests, AI and physics driven single and multiple cores or processor CPU tests and a collection of comprehensive feature tests to reliably measure next generation gaming performance today.

Default 3DMark06 settings were used for testing, so a resolution of 1280x1024 was used. 

3D Mark 2006

3D Mark 2006

Benchmark Results: 3DMark06 showed that the NVIDIA GeForce 9800 GTX+ was faster than the Sapphire Radeon HD 4850 by 15.5%. Performance of the GeForce 9800 GTX+ was near that of the PNY GeForce GTX 280, which is impressive.

3DMark Vantage

3DMark Vantage

3DMark Vantage is the new industry standard PC gaming performance benchmark from Futuremark, newly designed for Windows Vista and DirectX10. It includes two new graphics tests, two new CPU tests, several new feature tests, and support for the latest hardware. 3DMark Vantage is based on a completely new rendering engine, developed specifically to take full advantage of DirectX10, the new graphics API from Microsoft.

3DMark Vantage

Default 3DMark Vantage settings were used for testing, so the Performance preset at a resolution of 1280x1024 was used. 

3DMark Vantage PhysX Testing

It should be noted that the new GeForce 9800 GTX+ supports PhysX with an additional driver installation. With the NVIDIA PhysX driver v8.06.12 installed along with the Forceware 177.39 driver, the GeForce 9800 GTX, GTX 280 and GTX 260 video cards become physics processors. This new capability extends physics simulation beyond the limited capabilities of the CPU, enabling incredible performance scalability by leveraging the power of the graphics processor. Today there are two applications you can test to show off PhysX: 3DMark Vantage and Unreal Tournament 3. We tested just the GeForce 9800 GTX+ with the new PhysX driver to show what it means for performance.

3DMark Vantage

Test Results: Notice that the Sapphire Radeon HD 4850 graphics card actually beats the NVIDIA GeForce 9800 GTX+ without the help of PhysX. With PhysX enabled, the NVIDIA card gets a boost to the overall score thanks to the performance gained in the CPU test. The physics speedup carries a 25% weight on the final 3DMark score at the Performance preset.
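
To see why a doubled CPU score moves the overall number so much, here is a toy model of the weighting. We are assuming a weighted harmonic mean with the 25% CPU weight mentioned above; Futuremark's exact formula and scaling constants may differ, and the 7,000-point GPU sub-score is hypothetical:

```python
# Toy model of the 3DMark Vantage overall score (ASSUMED weighted harmonic
# mean with a 25% CPU weight; Futuremark's exact formula may differ).
def overall_score(gpu: float, cpu: float, cpu_weight: float = 0.25) -> float:
    gpu_weight = 1.0 - cpu_weight
    return 1.0 / (gpu_weight / gpu + cpu_weight / cpu)

# Hypothetical 7,000-point GPU sub-score; CPU scores are from our test system.
print(round(overall_score(7000, 12254)))  # ~7840 without the PhysX boost
print(round(overall_score(7000, 33504)))  # ~8726 with the doubled CPU score
```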

3DMark Vantage

Test Results: NVIDIA PhysX doesn't make a meaningful difference in the game tests, as the gap between having PhysX enabled and disabled is not significant. Notice that the Sapphire Radeon HD 4850 has a slight performance lead in the game tests.

3DMark Vantage

Test Results: We normally don't include CPU tests in our video card reviews, but it looks like we might have to start. With the NVIDIA PhysX driver installed, the CPU score on our system more than doubled, from 12,254 to 33,504.

The age of having physics enabled with just a single graphics card has arrived! What this means for gaming remains to be seen, but PhysX is the most popular physics API in the world, with over 140 shipping titles across all major platforms, including the PC, Wii, Xbox 360 and PlayStation 3. This means that PhysX features will be available without the need for a discrete physics card like in the past. The original AGEIA PhysX accelerators back in 2006 were discrete cards, so this is a huge improvement and will hopefully revolutionize gaming.

Temperatures

Since video card temperatures and the heat generated by next-generation cards have become an area of concern among enthusiasts and gamers, we want to take a closer look at how these cards generate heat under load.

Sapphire Radeon HD 4850 Temperature

In order to get proper temperature readings, we used the Precision overclocking utility along with GPU-Z on the NVIDIA cards, and the Catalyst Control Center along with GPU-Z on the ATI card. Both utilities showed the same temperature on each card, so the readings seem valid, as you can see above.

Sapphire Radeon HD 4850 Temperature

Test Results: The NVIDIA GeForce 9800 GTX+ has great temperatures, but remember that it is a dual-slot graphics card. The Sapphire Radeon HD 4850 was a bit warm for our tastes and was found to be 85C at load.

Overclocking

To overclock the ATI Radeon HD 4850 graphics card, we used ATI Overdrive, which is part of the CATALYST Control Center. When you 'unlock' ATI Overdrive, you can manually set the clock and memory settings or let the 'auto-tune' utility set the frequencies for you. Just for fun, I tried out the auto-tune feature to see if it could really find a stable clock configuration; it worked in just a few minutes and did not lock up the system.

ATI Radeon HD 4850 Video Card Overclocking

We started out at 625MHz on the core and 993MHz on the memory, but were able to reach 690MHz on the core and 1088MHz on the memory. This is a 65MHz overclock on the core and a 95MHz overclock on the memory. That adds up to about a 10% overclock, which is not bad!

ATI Radeon HD 4850 Video Card Overclocking

When it came to the GeForce 9800 GTX+, we overclocked it manually using the EVGA Precision overclocking utility. We were able to reach 840MHz with full stability and had the shaders locked in at 2084MHz. The memory was overclocked from 2200MHz to 2430MHz, which was also a nice increase.

NVIDIA GeForce 9800 GTX+ Video Card Overclocking

If you recall from our 3DMark Vantage test page, the stock clocks scored 7,888 with PhysX enabled. After the overclock, the score increased nearly 1,000 points, up to 8,862 3DMarks (a 12.3% increase in performance)! The card would run at 900MHz in some benchmarks, but wasn't stable in others like 3DMark Vantage. We stopped at 840MHz on the core and 1215MHz on the memory, but this card may have a little more in it. Notice the card was running at 50C with these settings!

UPDATE 6-20-2008: We spent another couple of hours overclocking the card and managed to get the NVIDIA GeForce 9800 GTX+ higher with full stability. We got it up to 855MHz on the core, 2200MHz on the shaders and 2550MHz on the memory ICs by unlinking the core and shader frequencies. This improved performance, as you can see below. After these last few increases the overall score jumped up to 9,211 3DMarks, which ends up being a 16.7% performance increase! That score is just 500 points away from the GeForce 9800 GX2!
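
A quick check of those gains, using the scores quoted above (our own arithmetic):

```python
# Verifying the quoted 3DMark Vantage overclocking gains (our arithmetic).
stock = 7888

for label, score in (("840MHz core", 8862), ("855MHz core", 9211)):
    gain = (score - stock) / stock * 100
    print(f"{label}: {score} 3DMarks (+{gain:.1f}%)")
# 840MHz core: 8862 3DMarks (+12.3%)
# 855MHz core: 9211 3DMarks (+16.8%)  <- the ~16.7% quoted above
```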

GeForce 9800 GTX+ Video Card Overclocking

Power Consumption and Final Thoughts

Power Consumption

For testing power consumption, we took our test system and plugged it into a Seasonic Power Angel. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers, we measured the peak wattage used by the system while running the game Call of Duty 4 at 1280x1024 with high graphics quality.

Total System Power Consumption Results

Power Consumption Results: Looking at total system power consumption levels, the GeForce 9800 GTX+ and Radeon HD 4850 use roughly the same amount of power at load. At idle, the Sapphire Radeon HD 4850 was a little more energy friendly, as it should be. Remember, the Sapphire Radeon HD 4850 consumes 110W at most.
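
One caveat when reading wall-socket numbers: the Power Angel measures AC draw, so the DC power the components actually pull is lower by the power supply's efficiency. A rough conversion sketch (the efficiency figure and wall readings here are assumptions, not measured values):

```python
# Converting wall (AC) readings to approximate DC component draw. The 82%
# efficiency for the Corsair HX620W and the wall readings are assumptions.
psu_efficiency = 0.82

for label, wall_watts in (("Idle", 180), ("COD4 load", 320)):
    dc_watts = wall_watts * psu_efficiency
    print(f"{label}: {wall_watts}W at the wall -> ~{dc_watts:.0f}W DC")
```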

ATI Radeon HD 4850 and GeForce 9800 GTX+


Final Thoughts

The past 24 hours have been nothing but a blur of testing, but we wanted to get this review out to you as soon as possible. ATI and NVIDIA have done a great job on the Radeon HD 4850 and GeForce 9800 GTX+ graphics cards, with both being hands-down better than the generation they replace. We originally had some issues installing the drivers for the Radeon HD 4850, but we got them sorted out this morning with ATI. Having less than 24 hours to benchmark and publish a video card article is pretty nuts, but we managed to pull it off on two different cards this time around and were able to include overclocking results as well. We will be bringing you CrossFire and SLI performance numbers next week, when the original launch was scheduled, so hang tight if you are waiting for multi-GPU performance numbers.

The ATI Radeon HD 4850 is sure to be a price versus performance champ, as the MSRP is $199 and will certainly go lower, just like the Radeon HD 3850's did after it launched. For those wanting more power, ATI is getting the Radeon HD 4870 ready; with GDDR5 memory and faster clock speeds, it is rumored to be 10-20% faster than the Radeon HD 4850, but it will cost you more. Many places are already selling the Radeon HD 4850, so this is a card you can actually go out and buy today if you really want one. The single-slot design is a winner with many users, but the card runs toasty at nearly 80C on the open test bench. Many add-in-board (AIB) partners will likely put dual-slot coolers on this card if they want to allow customers to overclock much beyond stock frequencies.

Just days ago we showed that the GeForce GTX 280 was the fastest single-GPU video card that we have ever tested, and a few days later NVIDIA shipped us this mainstream part that caught us off guard. The NVIDIA GeForce 9800 GTX+ was designed to be a spoiler to ATI's Radeon HD 4850/4870, and it might have done the job. The GeForce 9800 GTX+ was faster in the majority of the benchmarks we ran, and it had a tad more overclocking headroom than the ATI Radeon HD 4850. The MSRP on the GeForce 9800 GTX+ is $229, and with NVIDIA enforcing UMAP pricing it's likely that prices won't go much below that figure. The other downside to the GeForce 9800 GTX+ is that you can't buy it now; you'll have to wait until July 2008 to get one. The good news is that NVIDIA has made the move to 55nm, and it looks like they pulled it off. The temperatures were looking good, and the power consumption levels are down from the previous 65nm generation. The ace up NVIDIA's sleeve is PhysX support starting right now. ATI of course has announced Havok support, but where are the drivers? For an extra $30, consumers can go with the GeForce 9800 GTX+ and get more performance, PhysX game support, and GPU computing applications like BadaBoom and Folding@home.

Legit Bottom Line: The ATI Radeon HD 4850 and GeForce 9800 GTX+ raise the bar for graphics cards and perform better than one would expect from a $199-$229 product. Enthusiasts and gamers had better start saving some money, because you'll want one of these before summer is over.

UPDATE (6-21-2008 at 2PM CST): NVIDIA just informed us that the official launch date for the GeForce 9800 GTX+ is July 16th. They also wanted to share that overclocked 9800 GTX cards that deliver virtually identical performance to the 9800 GTX+ are available on the market today. So if any of our readers want one now, they can pick up an overclocked version from video card companies like EVGA and XFX. Pricing on the 9800 GTX has dropped over the past few days, and those cards can now be found for as little as $199.99. Based on our new overclocking results, it might be worth the wait, though.