AMD Radeon R9 295X2 8GB Video Card Review at 4K Ultra HD

AMD Project Hydra Arrives – Full Dual Hawaii GPUs on One Card

The AMD Radeon R9 295X2 video card has finally arrived! This is the first dual-GPU graphics card based on AMD’s latest Hawaii architecture, and it just happens to use two fully enabled Hawaii XT GPUs. With 5,632 stream processors (2 x 2,816) running at 1018MHz and 8GB of GDDR5 memory on a dual 512-bit memory interface clocked at 1250MHz (5.0GHz effective), you should already know that this will be a beast of a card! Then again, the suggested etailer price for the AMD Radeon R9 295X2 is $1,499 USD (EUR 1,099 + VAT), and that alone should give you an idea of how powerful this card should be. We were originally expecting this card to be priced a little lower than $1,499, but the $2,999 price tag on the NVIDIA GeForce GTX Titan Z gave AMD room to price the card where it did without too many people questioning it. The AMD Radeon R9 295X2 should arrive at retailers on April 21, 2014.

Those 5,632 stream processors are capable of 11.5 TFLOPS of compute power and there are 12.4 billion transistors between the two GPUs.
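For those curious where that 11.5 TFLOPS figure comes from, here is a quick back-of-the-envelope sketch (assuming single precision, two FLOPs per stream processor per clock, and the full 1018MHz peak clock):

```python
# Rough single-precision compute estimate for the Radeon R9 295X2
# (assumes 2 FLOPs per stream processor per clock at the 1018MHz peak clock).
stream_processors = 5632        # 2 x 2,816 across two Hawaii XT GPUs
clock_ghz = 1.018               # peak engine clock in GHz
flops_per_clock = 2             # one fused multiply-add per SP per cycle

tflops = stream_processors * flops_per_clock * clock_ghz / 1000.0
print(f"~{tflops:.2f} TFLOPS")  # ~11.47 TFLOPS, in line with AMD's 11.5 TFLOPS figure
```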

extreme-performance

The AMD Radeon R9 295X2 should get you a score of around 16,000 in 3DMark Fire Strike, which means it offers around 65% better performance than the AMD Radeon R9 290X. The AMD Radeon R9 295X2 should also put up solid mining numbers, at around 1,950 kH/s (Scrypt) and 956 kH/s (Scrypt-N).

The AMD Radeon R9 295X2 has 5,632 stream processors, 352 texture units and 128 ROPs. That is good for 11.5 TFLOPS of compute performance, a 358.3 GT/s texture fill-rate and a 130.3 GP/s pixel fill-rate. The specifications chart above shows how that stacks up against the Radeon R9 290X and R9 290.
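Those fill-rate numbers fall straight out of the unit counts and the clock speed; here is a minimal sketch of the math, again assuming the 1018MHz peak clock:

```python
# Texture and pixel fill-rate estimates from the unit counts above,
# assuming the 1018MHz peak engine clock.
texture_units = 352
rops = 128
clock_ghz = 1.018

texture_fill_rate = texture_units * clock_ghz  # GTexels/s
pixel_fill_rate = rops * clock_ghz             # GPixels/s
print(f"~{texture_fill_rate:.1f} GT/s texture fill-rate")  # ~358.3 GT/s
print(f"~{pixel_fill_rate:.1f} GP/s pixel fill-rate")      # ~130.3 GP/s
```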

The AMD Radeon R9 295X2 has two 8-pin PCIe power connectors and AMD states that this card has a maximum TDP of 500 Watts. That means the 295X2 itself is capable of pulling 500W, and only 75W of that can come through the PCI Express 3.0 x16 slot. The official PCI-SIG specification for graphics cards is written for 300W (75W slot + 75W 6-pin + 150W 8-pin). AMD is actually outside the PCI-SIG spec for PCIe graphics cards: the slot plus two 8-pin connectors allow a theoretical 375W maximum, yet the card goes beyond that by pulling well over 150 Watts from each 8-pin connector. We’ve seen companies put three 8-pin PCIe power connectors on a card over the years, so this isn’t a deal breaker, but it is something worth pointing out.
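To put those numbers in perspective, here is a rough sketch of how the 500W board power splits across the connectors, assuming the card draws the full 75W allowed through the slot:

```python
# Back-of-the-envelope split of the 500W board power across the connectors,
# assuming the full 75W slot allowance is used.
board_power_w = 500
slot_w = 75                 # PCI Express x16 slot allowance
spec_8pin_w = 150           # per-connector rating in the PCI-SIG spec

from_connectors_w = board_power_w - slot_w   # 425W left for the two 8-pin plugs
per_8pin_w = from_connectors_w / 2           # ~212.5W per connector
print(f"~{per_8pin_w:.1f}W per 8-pin connector vs. the {spec_8pin_w}W spec rating")
```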

AMD suggests using a power supply that is capable of supplying 28 Amps of dedicated current on each 8-pin PCIe power connector. This means your power supply must be able to support ~50 Amps across the +12V rail or rails, depending on what PSU you have. We have a Corsair AX860i Digital Power Supply that is rated at 71.6 Amps on its single +12V rail and found that on our Intel LGA2011 test system it was fine with the card at stock speeds, but unable to provide enough power when the card was overclocked. This is the first dual-slot card we have ever used that needed more than a high-quality 860W power supply! We ended up switching over to the Corsair AX1200i Digital Power Supply that is rated at 100.4 Amps on the +12V power rail. AMD suggests the Thermaltake TP1500M, Enermax Maxrevo EMR1500, LEPA G1600, Silverstone ST1500 or Corsair AX1500i power supplies.
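AMD’s 28 Amp guidance works out roughly as follows; this is only a sketch, and real-world cable losses and transient spikes will vary:

```python
# Rough check of AMD's 28A-per-connector guidance on the +12V rail.
amps_per_8pin = 28
rail_voltage_v = 12.0

watts_per_8pin = amps_per_8pin * rail_voltage_v  # 336W available per connector
combined_amps = amps_per_8pin * 2                # 56A if both connectors max out
print(f"{watts_per_8pin:.0f}W per connector, {combined_amps}A combined on +12V")
# AMD's ~50A system recommendation sits just below that 56A worst case.
```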

One power supply company that we spoke to about this card said that they have seen a Radeon R9 295X2 pull as much as 14A on a single wire of the PCIe power cable when checked with a clamp meter. They’ve also seen the modular connector at the PSU housing reach temperatures as high as 90°C at full load. Some power supplies come with two 8-pin PCIe power connectors on a single cable, and every power supply company that we spoke with said that you will want to run two individual 8-pin lines from the power supply. We asked AMD if the AMD Radeon R9 295X2 could potentially be a fire hazard and they calmed our concerns by saying:

“Nothing will catch fire. All of these PSUs have OCP protection, so there’s no fire hazard in any case. As long as your PSU is rated to deliver 28A per rail, and have the proper wattage, you’ll be fine.” – AMD PR

The AMD Radeon R9 295X2 is the first reference card on the market that comes with liquid cooling. The 295X2 uses a hybrid cooler: a full shroud covers the front of the card, with a 120mm fan blowing over a small heatsink to help keep the board, memory and voltage regulators cool. The AMD Radeon R9 295X2 has a 5+1 power phase design per GPU, so there are a total of 12 power phases to keep cool on this card. Asetek designed the liquid cooling for the Radeon R9 295X2 and went with a 4th generation cold plate and pump over each GPU to keep it nice and cool.

Here is an exploded diagram that shows the radiator, metal backplate, PCB, water block and finally the fan shroud. AMD Catalyst Control Center no longer allows you to control the fan speed, as the memory/regulator fan automatically adjusts based on the voltage regulator current.

AMD provided us with a slide that shows the mechanical dimensions, and this will be critical for those trying to shoehorn this card into a small form factor system where there isn’t much space. For starters, the card is 305-307mm in length, which makes it over 12 inches long. Not all mini-ITX or microATX cases are able to support cards over a foot long, so be sure this will fit in your system! From there you have the 3/8″ water tubing, 380mm long, that connects the card to the 120mm radiator, which measures 120mm by 152.5mm by 38mm. AMD went with a single 25-26mm thick cooling fan on the radiator, so the overall radiator thickness is ~64mm. We have asked AMD for the model number of the fan they are using on the radiator so people can easily purchase an identical fan for a push/pull configuration.

Let’s take a closer look at the AMD Radeon R9 295X2 reference card and then move along to the game benchmarks!

  • Silviu

    Great benchmarks!
    2x GTX 780 Ti beat the R9 295X2

  • http://www.jonnyguru.com JonnyGuru

    “Nothing will catch fire. All of these PSUs have OCP protection, so there’s no fire hazard in any case. As long as your PSU is rated to deliver 28A per rail, and have the proper wattage, you’ll be fine.” – AMD PR

    Umm… Wrong. Most of these single rail, high-wattage PSUs either use OPP on the primary side or have an OCP at 125% of the total +12V output capability. While the card won’t cause/catch fire, the drippings of hot, melted plastic could potentially ignite something.

  • anothergooddecision

    Impressive! So all I need to play modern games at decent FPS on a 4K screen is a pair of 780 ti cards.

    Oh wait, this article was about that 2nd place guy with the ridiculous power requirements.

    Honestly AMD, just put 3x 8pin connectors on it. Someone WILL set their pants on fire with this card, and when they do they are gonna sue you.

    • basroil

      Tomshardware shows the 780Ti SLI consistently losing to the 295X2 at 4K, so it just goes to show how pointless these tests are for seeing who is on top. Their 295X2 setup is within the margin of error when compared to these results, but their 780Ti scores are 30% lower or more! When 6GB 780Ti cards come out though, it will be almost impossible for the 295X2 to catch up, and no amount of settings tweaks to make AMD seem better will help.

      But I agree, I have no idea what AMD was thinking with a $1500 gaming card that isn’t even undeniably the fastest thing out there. When nvidia showed off their dual Titan card they stayed clear of the gaming market, so you can kind of see why that card exists and why they charge so much for it. But the 295X2 costs 50% more than two 290Xs while performing almost on par with CrossFire 290X, and that just doesn’t make sense.

      • Steven De Bondt

        Yeah the Titan Z clearly makes more sense here. Thanks for this info. I would have no idea what I would do without your insights into technology.

        • basroil

          Clearly you don’t work with or understand the constraints of academic research using high performance computing. It’s far better to have a single machine with four Titan Z cards than two machines each with 4 Titans. Not because of hardware cost, but because of software cost and scalability considerations. And a single Titan Z is still cheaper than a K40x!

        • Steven De Bondt

          So the kepler series isn’t considered stable then? :)

        • basroil

          Now you’re just trolling with nonsensical blabber

        • Steven De Bondt

          Well, promoting a Titan Z over this card, which is less than half the price and actually has decent OpenCL support, might be considered the same. And, since you mention it, if you really want to power those CUDA apps with a professional card you’d best buy the K40x.

          That, or you just buy 2 titans. Instead of a dual Titan that has the price of three (?).

          Clearly you don’t work with or understand the constraints of common sense, because we are talking about gaming here, not GP. I understand Nvidia is confusing you with their crazy prices and weird graphs but this is still the main reason for cards like this.

          Of course buying 2 290Xs would be a bit more common sense, though this card takes less space and has a better range of output ports on it. You would also save a bit on decibels and watts.

          And if this is considered nonsensical blabber unless some weird expensive nvidia card is placed in good light then you are just beyond saving.

        • basroil

          “That, or you just buy 2 titans. Instead of a dual Titan that has the price of three (?).”

          Try four cards, using Ansys. An Ansys GPU compute license is so expensive that it’s a dozen times cheaper to buy 4x Titan Z than two computers with 4x Titan each. Most other professional programs are the same; that’s why $20k workstations aren’t uncommon, the hardware is a tiny fraction of the actual cost. The ONLY thing the $5000 K40 has going for it is 12GB of RAM, but plenty of computations do just fine with less memory per core.

          “Clearly you don’t work with or understand the constraints of common sense, because we are talking about gaming here, not GP.”

          You are, I never was. Common sense would tell you that no card above $400 is actually meant for “common sense” gamers. Sure some idiots spend more than a car’s worth for the top gear, but they are in the clear minority of buyers of those (non-limited release) cards.

        • Steven De Bondt

          I’m pretty sure that:

          The way you are promoting the 780ti which “has 30% lower scores,” claiming the tests are rigged (?)

          Calling 4k and high resolution “pointless” on very expensive cards, (??) (what?)

          And

          Saying that increasing the volume of VRAM will magically increase the inferior bus width of the 780ti; only 384 vs 512 bits (yes, high resolutions need a decent bus width, so ==> ???)

          You would still (with great effort) justify gaming on expensive cards when it has a frigging nvidia logo (did anyone tell you the 780ti is more expensive than $400? Well I’m telling you now.), but when AMD enters the high-end segment, even IF they prove their performance, you are immediately & completely convinced it’s total crap, and you start blaming all sorts of factors and conspiracies…

          Look, let’s just face what really is going on here, so we can both move on. ;)

  • Nick Lucier

    Wow, just damn incredible…. Keep pushing the boundaries! I could barely afford my R9 270X. I’m saving up for CrossFire but it may be a while. Nice card(s)!!!!!!!!!

    • Michael Mike

      Miners are dumping cards like crazy and prices are dropping cause of it, so check out the bay.. also a 270x is a 7870.

      • Nick Lucier

        yea there are plenty out there, I’m just broke. My 270X was $200 practically brand new, I’m happy with it, no need to xfire for what I do nowadays

  • http://www.indraemc.blogspot.com/ Indra Emc

    This thing is a beast! But it consumes too much power (only 50 watts less than 2 R9 290X in CrossFire). That card alone consumes more electricity than your two-door refrigerator.

    • Steven De Bondt

      Fun fact:

      A Titan Black SLI setup consumes roughly 40 watts less power than two 290Xs. Is that enough to offset a two-door refrigerator?

      I already know your answer: please reference me that two-door refrigerator.

      • http://www.indraemc.blogspot.com/ Indra Emc

        The average refrigerator consumes 280 watts of power.

        A mini fridge consumes 100 watts of power.

        • Steven De Bondt

          So, did you make the same remark on the titan black review then? :S

        • pickle farts

          nobody cares about amd gpu’s lawl

        • Matthew Justin Tremain

          Then why r u here looking at a review about an AMD GPU, Pickles? And yes, the r, the u, and the s are there as I intended. The Titans are computational beasts, but we are here for the gaming side, so we look for gaming performance, and this card is aimed at gaming, not workstations. Indra, my PSU uses 50W when I have shut down my PC, so do you think someone like me cares about power consumption? When you buy a high-power GPU, of course it is going to use high amounts of power. When you buy a large-capacity V8 engine, of course it is going to use more juice than a lower-capacity V8. If you want high performance, then be willing to pay for it, and quit your pathetic complaining.