XFX Radeon R9 Fury - Budget Fiji?
What happens when you take the AMD Radeon R9 Fury X video card with the Fiji GPU and disable 512, or 12.5%, of the stream processors? You end up with the lower-cost AMD Radeon R9 Fury video card, which should still perform well in most game titles at the most demanding resolutions. It's been over seven months since we last looked at a Radeon R9 Fury card, and a bunch of new games have come out since then, along with the new Radeon Software Crimson Edition drivers. We had the opportunity to take a look at the XFX Radeon R9 Fury graphics card that is sold under part number R9-Fury-4TF9 and jumped at the chance! The XFX Radeon R9 Fury 4GB HBM video card is available on Amazon here in the United States for $529.99 shipped, which is down slightly from the $549 launch price in the summer of 2015. This Radeon R9 Fury card by XFX features AMD's reference-designed PCB with a 1000MHz core clock and 500MHz clock speeds on the 4GB of HBM1 memory. XFX customized this card by topping it off with the XFX Triple Dissipation GPU cooler. The only other Radeon R9 Fury model offered by XFX is a water-cooled version for $629.99 shipped. The funny thing is that the XFX Radeon R9 Fury X video card is also water cooled and runs $629.99 shipped, so there is absolutely no reason that anyone in their right mind would buy a water-cooled Radeon R9 Fury card when the Radeon R9 Fury X features more cores along with water cooling for the same price!

The XFX Radeon R9 Fury is a dual-slot graphics card that is fairly long at 12 inches, with 4.5 inches of that length being the heatsink extending past the end of the card's circuit board. XFX went with a mostly black design for this card, so the PCB, fan shroud and fans are all black.
On the back of the card you'll notice that XFX went with a backplate that improves the visual appearance of the card when it's installed, but also helps add rigidity and protects the components that sit under it. In the shot above you can see the two 8-pin PCIe video card power connectors that are needed for operation. XFX includes dual 6-pin to 8-pin PCIe power adapters for those who don't have the proper cables available on their power supply. XFX recommends a 750 Watt or greater power supply to properly power this video card.

The Ghost 3.0 Triple Dissipation cooling solution features seven copper heatpipes; five of them extend to the end of the card and are visible. If you look in the image above you can see how XFX is using another passive heatsink to cool the digital power components. XFX says that its XFX Extreme VRM and Memory Cooling helps drop VRM temperatures by up to 30C and memory temperatures by up to 20C. These lower temperatures help reduce fan noise as well, since the card runs cooler. XFX is using a 6-phase fully digital power design on this card with unlocked voltages, so you have full control over the voltages if you want to use a 3rd party utility to push your card to the limit.

When it comes to video connectors, the XFX Radeon R9 Fury comes with three DisplayPort 1.2a connectors and one HDMI 1.4 connector, which is the AMD reference video output configuration. Also notice that this card is thicker than a standard dual-slot design and is considered a 2.2-slot card due to the thick Triple Dissipation GPU cooler. Here is a quick look at the XFX Radeon R9 Fury X versus the XFX Radeon R9 Fury and the XFX Radeon R9 Nano:
|XFX Radeon R9 Fury X||XFX Radeon R9 Fury||XFX Radeon R9 Nano|
|GPU Architecture||28nm Fiji||28nm Fiji||28nm Fiji|
|GPU base clock||1050MHz||1000MHz||1000MHz|
|Texture Fillrate||268.8 GTexel/s||224.0 GTexel/s||256.0 GTexel/s|
|Pixel Fillrate||67.2 GP/s||64.0 GP/s||64.0 GP/s|
|Memory||4GB HBM1||4GB HBM1||4GB HBM1|
|Memory Bandwidth||512.0 GB/s||512.0 GB/s||512.0 GB/s|
|0dB Fanless Idle||No||No||No|
|Recommended PSU||750 Watts||750 Watts||550 Watts|
|Power Connectors||2 x 8-pin||2 x 8-pin||1 x 8-pin|
|Street Price||$629.99 shipped||$529.99 shipped||$499.99 shipped|
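The fillrate figures in the table above follow directly from each GPU's unit counts and core clock: texture fillrate is texture units (TMUs) times clock, and pixel fillrate is render output units (ROPs) times clock. Here is a minimal sketch of that arithmetic, using the Fury X's publicly listed 256 TMUs and 64 ROPs as the worked example:

```python
def texture_fillrate_gts(tmus: int, clock_mhz: int) -> float:
    """GTexels/s = texture units * core clock in GHz."""
    return tmus * clock_mhz / 1000.0

def pixel_fillrate_gps(rops: int, clock_mhz: int) -> float:
    """GPixels/s = render output units * core clock in GHz."""
    return rops * clock_mhz / 1000.0

# Radeon R9 Fury X: 256 TMUs, 64 ROPs, 1050MHz core clock
print(texture_fillrate_gts(256, 1050))  # -> 268.8
print(pixel_fillrate_gps(64, 1050))     # -> 67.2
```

The same formulas reproduce the other columns once you plug in each card's enabled unit counts and clock speed.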
Test System
Before we look at the numbers, let's take a brief look at the test system that was used. All testing was done using a fresh install of Windows 10 Pro 64-bit, and benchmarks were completed on the desktop with no other software programs running. There has been some concern about testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago. Video Cards & Drivers used for testing:
- AMD Radeon Software Crimson Edition 16.1
- NVIDIA GeForce 361.43
Intel X79/LGA2011 Platform
The Intel X79 platform that we used to test all of the video cards was running the ASUS P9X79-E WS motherboard with BIOS 1704 that came out on 05/08/2015. We went with the Intel Core i7-4960X Ivy Bridge-E processor to power this platform as it is PCIe 3.0 certified, so all graphics cards are tested with PCI Express Gen 3 enabled. The Kingston HyperX 10th Anniversary 16GB 2400MHz quad-channel memory kit was set to XMP Profile #2. This profile defaults to 2133MHz with 1.65v and 11-13-13-30 2T memory timings. The OCZ Vertex 460 240GB SSD was run with the latest firmware available. A Corsair AX860i digital power supply provides clean power to the system and is also silent, as the fan hardly ever spins up. This is critical to our testing as it lowers the ambient noise level of the room and gives us more accurate sound measurements. Here are the exact hardware components that we are using on our test system:
|The Intel X79 Test Platform|
|Processor||Intel Core i7-4960X|
|Motherboard||ASUS P9X79-E WS|
|Memory||16GB Kingston 2133MHz|
|Solid-State Drive||OCZ Vertex 460 240GB|
|Cooling||Intel TS13X (Asetek)|
|Power Supply||Corsair AX860i|
|Operating System||Windows 10 64-bit|
|Monitor||Sharp PN-K321 32" 4K|
Battlefield 4
Battlefield 4 is a first-person shooter video game developed by EA Digital Illusions CE (DICE) and published by Electronic Arts. It is a sequel to 2011's Battlefield 3 and was released on October 29, 2013 in North America. Battlefield 4's single-player campaign takes place in 2020, six years after the events of its predecessor. Tensions between Russia and the United States have been running at a record high. On top of this, China is also on the brink of war, as Admiral Chang, the main antagonist, plans to overthrow China's current government; and, if successful, the Russians will have full support from the Chinese, bringing China into a war with the United States. This game title uses the Frostbite 3 game engine and looks great. We tested Battlefield 4 with the Ultra graphics quality preset as most discrete desktop graphics cards can easily play with this IQ setting at 1080P and we still want to be able to push the higher-end cards down the road. We used FRAPS to benchmark with these settings on the Shanghai level. All tests were done with the DirectX 11 API.

Benchmark Results: The AMD Radeon R9 Nano was actually slightly faster than the XFX Radeon R9 Fury at 1920x1080, but not by much, and you won't be able to tell the difference while gaming unless you use FRAPS and test it like we do!

Benchmark Results: When you move up to a 2560x1440 display we found the Radeon R9 Fury pulled ahead of the Nano card by just 1-2 FPS across the board.

Benchmark Results: When it comes to 4K Ultra HD gaming the XFX Radeon R9 Fury was again found to be just 1 FPS faster on average than the Radeon R9 Nano. It was also found to be faster than the NVIDIA GeForce GTX 980 reference card, but was slower than the GeForce GTX 980 Ti partner cards.
Fallout 4
Fallout 4 is an open world action role-playing video game developed by Bethesda Game Studios and published by Bethesda Softworks. Fallout 4 is set in a post-apocalyptic Boston in the year 2287, 210 years after a devastating nuclear war, in which the player character emerges from an underground bunker known as a Vault. Gameplay is similar to Fallout 3. The title is the fifth major installment in the Fallout series and was released worldwide on November 10th, 2015. Fallout 4 was benchmarked with ultra image quality settings with TAA and 16x AF. These settings are tough for entry-level discrete graphics cards, but are more than playable on high-end gaming graphics cards. V-Sync can't be disabled in the game's options, so we edited the necessary INI files and disabled V-Sync in the driver software as well. We used FRAPS to benchmark Fallout 4 after emerging from the Vault into The Commonwealth.

Benchmark Results: In Fallout 4 at 1920x1080 the Radeon R9 Fury and Radeon R9 Nano were basically performing the same with identical Crimson 16.1.1 drivers installed.

Benchmark Results: With the display resolution cranked up to 2560x1440 we found the XFX Radeon R9 Fury finally started to pull ahead a bit.

Benchmark Results: When we increased the resolution to 3840x2160 for some 4K gaming goodness the XFX Radeon R9 Fury was still ahead of the Radeon R9 Nano.
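For readers who want to replicate the V-Sync workaround, the setting commonly reported to control it in Bethesda's INI files is iPresentInterval. The fragment below is a sketch of that edit; the exact file and section can vary by install, so treat it as an assumption to verify rather than a confirmed recipe:

```ini
; Fallout4Prefs.ini -- iPresentInterval is widely reported as the V-Sync
; toggle in Bethesda engine games; 0 disables it. Verify on your own install.
[Display]
iPresentInterval=0
```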
Grand Theft Auto V
Grand Theft Auto V, currently one of the hottest PC games, was finally released for the PC on April 14, 2015. Developed by Rockstar, it is set in 2013 and the city of Los Santos. It utilizes the Rockstar Advanced Game Engine (RAGE), which Rockstar has been using since 2006, with multiple updates for technology improvements. In Grand Theft Auto V we set the game to run with no MSAA, 16x AF and high image quality settings, as we didn't want the GPU to bottleneck the system too badly, but wanted a setup that your average gamer would actually play on. We used the game's built-in benchmark utility so that at least one of the games we used can be compared to your setup at home. We averaged all five of the default benchmark runs and omitted the maximum values, as those results are garbage for some reason.

1440P Benchmark Results: After running the game's built-in benchmark and averaging the runs together we found that the Radeon R9 Nano was performing better than the Radeon R9 Fury in this particular game title.

4K Ultra HD Benchmark Results: At 4K the Radeon R9 Nano was still ahead on the average FPS, but had a lower minimum frame rate.
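The run-averaging described above can be sketched as follows; the five per-pass FPS numbers below are made up for illustration, not our actual results:

```python
def summarize_runs(runs):
    """Average the min/avg FPS across benchmark passes. Max values are
    discarded because the built-in benchmark's maximums are unreliable."""
    avg_fps = sum(r["avg"] for r in runs) / len(runs)
    min_fps = sum(r["min"] for r in runs) / len(runs)
    return {"avg": round(avg_fps, 1), "min": round(min_fps, 1)}

# Five hypothetical passes from GTA V's built-in benchmark
passes = [
    {"min": 38.2, "avg": 61.5, "max": 140.0},
    {"min": 40.1, "avg": 63.0, "max": 151.2},
    {"min": 37.5, "avg": 60.8, "max": 148.8},
    {"min": 39.0, "avg": 62.2, "max": 139.5},
    {"min": 38.7, "avg": 61.9, "max": 155.0},
]
print(summarize_runs(passes))
```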
Rise of the Tomb Raider
Rise of the Tomb Raider is a third-person action-adventure video game developed by Crystal Dynamics and published by Square Enix. It is the sequel to the 2013 video game Tomb Raider, which was itself the second reboot of the series. It was released for Microsoft Windows in January 2016. Players control Lara Croft through various environments, battling enemies and completing puzzle platforming sections, while using improvised weapons and gadgets in order to progress through the story. Crystal Dynamics used a proprietary game engine called 'Foundation' for Rise of the Tomb Raider, and it is able to create some pretty nice looking graphics. We tested Rise of the Tomb Raider with the Very High preset, but then changed the ambient occlusion setting from HBAO+ (an NVIDIA-developed feature) to 'ON' to have as fair of a matchup as we could. We also disabled VSync. Once we had the graphics and display settings figured out, we used FRAPS to manually benchmark a section of the Siberian Wilderness that is about 10% into the game for a couple of minutes. Rise of the Tomb Raider does not have a built-in benchmark, so this is one of the only ways you can benchmark this particular game title.

1080P Benchmark Results: For those running Full HD 1080P displays, the good news is that all higher-end desktop graphics cards are able to play Rise of the Tomb Raider with 60 FPS averages. The XFX Radeon R9 Fury dipped below 60 FPS just a couple of times in our benchmark run and averaged 74 FPS.

1440P Benchmark Results: With the resolution cranked up to 2560x1440, the Fury card was able to pull ahead of the Nano card and averaged 53 FPS.

4K Ultra HD Benchmark Results: Rise of the Tomb Raider with these image quality settings was tough on all of the cards, and the XFX Radeon R9 Fury was only averaging 29 FPS with the minimum frame rate being 24 FPS. It should be noted that we did have the XFX Radeon R9 Fury video card crash in Rise of the Tomb Raider on more than one occasion.
AMD let us know that this was due to the high image quality settings (Tessellation and/or SMAA enabled). This explains our random crash and AMD claims that the SMAA bug has been fixed in Crimson 16.2 drivers that just happened to come out today. The game crashing with Tessellation enabled is still an open known issue.
Star Wars: Battlefront
Star Wars: Battlefront is a series of first- and third-person shooter video games based on the Star Wars films. Players take the role of soldiers in either of two opposing armies in different time periods of the Star Wars universe. This latest Star Wars: Battlefront is built on the Frostbite 3 engine, developed by EA DICE and published by Electronic Arts on November 17, 2015. We ran Star Wars: Battlefront with the image quality settings set at Ultra and VSYNC disabled. We used FRAPS to benchmark with these settings on Battle on Tatooine.

1080P Benchmark Results: All of these high-end cards were able to play Star Wars: Battlefront at 1080P Full HD resolution and never dipped below 100 FPS in the map that we tested. The XFX Radeon R9 Fury averaged 143 FPS, and gaming on a 144Hz gaming monitor with this card was great at this resolution.

1440P Benchmark Results: When we played Battlefront at 2560x1440 the XFX Radeon R9 Fury card averaged 95 FPS, but we could tell that the game play wasn't smooth, and the minimum frame rate was dropping down to 66 FPS. That is a bigger drop than we saw on the Radeon R9 Fury X and Radeon R9 Nano. Very strange performance.

4K Ultra HD Benchmark Results: When moving up to 3840x2160, the average FPS was 51 on the Radeon R9 Fury, but we were getting choppiness and stutters while gaming. The frame rate was dropping down to 35 FPS in the section that we benchmark in. We couldn't explain why the XFX Radeon R9 Fury card was stuttering, so we did a little digging. We fired up the game on the XFX Radeon R9 Fury card and found that 1080P gaming was smooth, but when we changed the resolution to anything higher, the card's performance got choppy. The GPU-Z capture above shows that as soon as we switched from 1080P to 4K gaming, the frame buffer filled up right away, the card started using dynamic memory, and the core clock speeds started to drop all the time.
We used FRAPS and GPU-Z logs to record the frame rate versus the card's core clock speed and found that the core clock remained pegged at 1000MHz when gaming at 1080P. When we changed the resolution to 3840x2160 (Ultra HD 4K), we noticed that the core clock was hardly ever at 1000MHz and at times was dropping down to 500MHz. Many of the drops in performance lined up exactly with where the core clock decreased for some reason. This odd behavior is something that we cannot replicate on the Radeon R9 Nano or the Radeon R9 Fury X cards that we have. We have no other Radeon R9 Fury to test with, but AMD confirmed that they are seeing something similar internally and are looking into a fix. We aren't sure what is causing it, but something is certainly going on, and we are hopeful that AMD can come up with a solution. The AMD Radeon R9 Fury X and Radeon R9 Nano are fully enabled cores, so that might be why those cards aren't having a similar issue. We sure hope it's a driver/Windows issue and not in the hardware. Most games don't exhibit this behavior, which is why it wasn't noticed back in July 2015 when we posted the original Fury review, as Star Wars Battlefront wasn't out yet!
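The clock-versus-performance comparison we did by hand can be automated. The sketch below assumes a simplified one-sample-per-second log of core clock and FPS (the real GPU-Z and FRAPS log formats differ, and the sample values are invented) and flags the seconds where the clock fell well below the 1000MHz board clock:

```python
def flag_downclocks(samples, board_clock=1000, threshold=0.9):
    """Return timestamps where the core clock fell below 90% of the board
    clock -- the pattern we saw lining up with frame-rate drops at 4K."""
    return [t for t, clock, fps in samples if clock < board_clock * threshold]

# Hypothetical per-second log entries: (second, core MHz, FPS)
log = [
    (0, 1000, 52), (1, 1000, 51), (2, 720, 38),
    (3, 500, 31), (4, 1000, 50), (5, 640, 35),
]
print(flag_downclocks(log))  # -> [2, 3, 5]
```

Cross-referencing the flagged timestamps against the FRAPS frame-time log is what shows whether the stutters line up with the downclocks.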
3DMark 2013
3DMark Fire Strike Benchmark Results - For high performance gaming PCs. Use Fire Strike to test the performance of dedicated gaming PCs, or use the Fire Strike Extreme preset for high-end systems with multiple GPUs. Fire Strike uses a multi-threaded DirectX 11 engine to test DirectX 11 hardware.
Fire Strike Benchmark Results:
Benchmark Results: The 3DMark Fire Strike benchmark had the XFX Radeon R9 Fury scoring 12,686 points.

Fire Strike Extreme Benchmark Results:
Benchmark Results: In 3DMark Fire Strike Extreme we see different scores, but basically the same scaling and performance results. The XFX Radeon R9 Fury 4GB video card scored 6,638 points, which was just ahead of the Radeon R9 Nano graphics card.
Temperature & Noise Testing
The gaming performance of a graphics card is the most important factor in buying a card, but you also need to be concerned about the noise, temperature and power consumption numbers.

XFX Radeon R9 Fury Idle and Load Temps: When it comes to temperatures, the XFX Radeon R9 Fury idled at 26C on our open-air test bench and got up to 76C when gaming for over an hour non-stop. Here is a chart that shows the temperatures of the XFX Radeon R9 Fury versus some other high-end desktop cards.
We test noise levels with an Extech sound level meter that has ±1.5dB accuracy and meets Type 2 standards. This meter ranges from 35dB to 90dB on the low measurement range, which is perfect for us as our test room usually averages around 36dB. We measure the sound level two inches above the corner of the motherboard with 'A' frequency weighting. The microphone wind cover is used to make sure no wind is blowing across the microphone, which would seriously throw off the data.

The XFX Radeon R9 Fury is pretty quiet since the three fans don't need to ramp up until you reach over 70C on the GPU core, and when they do the noise level really isn't that bad.

** The AMD Radeon R9 Fury X reference card that we are using was the original model with a loud water pump that whines. AMD changed the pump design before the cards hit the retail market, but wasn't willing to replace ours. We expect retail cards to be quieter because of this, and hopefully AMD will send us a replacement card for proper noise testing. **
Power Consumption
For testing power consumption, we took our test system and plugged it into a Kill-A-Watt power meter. For idle numbers, we allowed the system to idle on the desktop for 15 minutes and took the reading. For load numbers, we ran Battlefield 4 at 3840x2160 and recorded the peak gaming reading on the power meter.

Power Consumption Results: The XFX Radeon R9 Fury in our test system used 105 Watts at idle and 393 Watts at peak gaming. Those numbers put it between the Nano and Fury X cards, which is right where we expected to see it.
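Keep in mind a Kill-A-Watt reads AC power at the wall, so the system's actual DC draw is somewhat lower. A rough conversion looks like the sketch below; the 0.92 PSU efficiency is an assumed figure for an 80 Plus Platinum unit like the AX860i at these loads, not something we measured:

```python
def estimated_dc_watts(wall_watts, psu_efficiency=0.92):
    """Approximate DC power delivered by the PSU from an AC wall reading.
    0.92 is an assumed 80 Plus Platinum efficiency, not a measured value."""
    return wall_watts * psu_efficiency

# Wall readings from our Kill-A-Watt: 105W idle, 393W peak gaming
idle_dc = estimated_dc_watts(105)
load_dc = estimated_dc_watts(393)
print(round(load_dc - idle_dc))  # -> 265 (estimated system DC delta under load)
```

That delta covers the video card plus the extra CPU and system load while gaming, so it's an upper bound on the card's own draw, not a direct card measurement.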
Final Thoughts and Conclusions
After using the XFX Radeon R9 Fury video card for several weeks, we ran into some performance issues that were tough to look beyond. We had Rise of the Tomb Raider crashing, and then when playing Star Wars Battlefront we had really bad stuttering issues at 2560x1440 and 3840x2160 screen resolutions. Both of these game titles were not available when we published our Radeon R9 Fury launch article, so the gaming experience we had was not the same as last summer. AMD acknowledged both issues that we ran into with today's release of the AMD Crimson 16.2 drivers. If you look at the release notes, they list what we ran into earlier this month as known issues:
- Core clocks may not maintain sustained clock speeds resulting in choppy performance and/or screen corruption
- Rise of the Tomb Raider - The game may randomly crash on launch if Tessellation is enabled