AMD Radeon R9 295X2 8GB Video Card Review at 4K Ultra HD

Test System

Before we look at the numbers, let's take a brief look at the test system that was used. All testing was done on a fresh install of Windows 8.1 Pro 64-bit, and benchmarks were run on the desktop with no other software running. It should be noted that we average all of our test runs. There has been some concern about testing a cold card versus a hot card, but we've always done our testing 'hot' since the site started more than a decade ago.

Video Cards & Drivers used for testing:

  • NVIDIA GeForce 337.50
  • AMD CATALYST 14.4 Beta

Intel X79/LGA2011 Platform


The Intel X79 platform that we used to test all of the video cards was running the ASUS P9X79-E WS motherboard with BIOS 1501, released on 01/15/2014. We went with the Intel Core i7-4960X Ivy Bridge-E processor to power this platform as it is PCIe 3.0 certified, so all graphics cards are tested with PCI Express Gen 3 enabled. The Kingston HyperX 10th Anniversary 16GB 2400MHz quad-channel memory kit was set to XMP Profile #2, which defaults to 2133MHz at 1.65V with 11-12-12-30 1T memory timings. The OCZ Vertex 460 240GB SSD was run with the latest firmware available. A Corsair AX860i digital power supply provides clean power to the system and is also nearly silent, as its fan hardly ever spins up. This is critical to our testing because it lowers the ambient noise level of the room and gives us more accurate sound measurements than the old Corsair AX1200 power supply we used from 2012 until this year, which had a loud fan that always ran.


Here are the exact hardware components that we are using on our test system:

The Intel X79 Test Platform

  Component            Brand/Model
  Processor            Intel Core i7-4960X
  Motherboard          ASUS P9X79-E WS
  Memory               16GB Kingston 2133MHz
  Video Card           Various
  Solid-State Drive    OCZ Vertex 460 240GB
  Cooling              Intel TS13X (Asetek)
  Power Supply         Corsair AX860i
  Operating System     Windows 8.1 Pro 64-bit
  Monitor              Sharp PN-K321 32″ 4K

AMD Radeon R9 295X2 8GB Video Card GPU-Z Info:

[GPU-Z and ASIC quality screenshots of the AMD Radeon R9 295X2]

  • Silviu

    Great benchmarks!
    2x GTX 780 Ti beats the R9 295X2

  • http://www.jonnyguru.com JonnyGuru

    “Nothing will catch fire. All of these PSUs have OCP protection, so there’s no fire hazard in any case. As long as your PSU is rated to deliver 28A per rail, and have the proper wattage, you’ll be fine.” – AMD PR

    Umm… Wrong. Most of these single-rail, high-wattage PSUs either use OPP on the primary side or have OCP set at 125% of the total +12V output capability. While the card won’t cause/catch fire, the drippings of hot, melted plastic could potentially ignite something.

  • anothergooddecision

    Impressive! So all I need to play modern games at decent FPS on a 4K screen is a pair of 780 ti cards.

    Oh wait, this article was about that 2nd place guy with the ridiculous power requirements.

    Honestly AMD, just put three 8-pin connectors on it. Someone WILL set their pants on fire with this card, and when they do they are gonna sue you.

    • basroil

      Tom’s Hardware shows the 780 Ti SLI consistently losing to the 295X2 at 4K, which just goes to show how pointless these tests are for deciding who is on top. Their 295X2 setup is within the margin of error when compared to these results, but their 780 Ti scores are 30% lower or more! When 6GB 780 Ti cards come out, though, it will be almost impossible for the 295X2 to catch up, and no amount of settings tweaks to make AMD seem better will help.

      But I agree, I have no idea what AMD was thinking with a $1500 gaming card that isn’t even the undeniably fastest thing out there. When NVIDIA showed off their dual-Titan card they steered clear of the gaming market, so you can kind of see why that card exists and why they charge so much for it. But the 295X2 costs 50% more than two 290X cards while performing about on par with CrossFire 290X, and that just doesn’t make sense.

      • Steven De Bondt

        Yeah the Titan Z clearly makes more sense here. Thanks for this info. I would have no idea of what I would do without your insights in technology.

        • basroil

          Clearly you don’t work in, or understand the constraints of, academic research using high-performance computing. It’s far better to have a single machine with four Titan Z cards than two machines each with four Titans. Not because of hardware cost, but because of software cost and scalability considerations. And a single Titan Z is still cheaper than a K40x!

        • Steven De Bondt

          So the kepler series isn’t considered stable then? :)

        • basroil

          Now you’re just trolling with nonsensical blabber

        • Steven De Bondt

          Well, promoting a Titan Z over this card, which is less than half the price and actually has decent OpenCL support, might be considered the same. And, as you mention it, if you really want to power those CUDA apps with a professional card you’d best buy the K40x.

          That, or you just buy 2 titans. Instead of a dual Titan that has the price of three (?).

          Clearly you don’t work or understand constraints of common sense; because we are talking about gaming here, not GP. I understand NVIDIA is confusing you with their crazy prices and weird graphs, but this is still the main reason cards like this exist.

          Of course, buying two 290Xs would be a bit more common sense, though this card takes less space and has a better range of output ports on it. You would also save a bit on decibels and watts.

          And if this is considered nonsensical blabber unless some weird, expensive NVIDIA card is placed in a good light, then you are just beyond saving.

        • basroil

          “That, or you just buy 2 titans. Instead of a dual Titan that has the price of three (?).”

          Try four cards, using Ansys. The Ansys GPU compute license is so expensive that it’s a dozen times cheaper to buy 4x Titan Z than two computers with 4x Titans each. Most other professional programs are the same; that’s why $20k workstations aren’t uncommon, as the hardware is a tiny fraction of the actual cost. The ONLY thing the $5000 K40 has going for it is 12GB of RAM, but plenty of computations do just fine with less memory per core.

          “Clearly you don’t work or understand constraints of common sense; because we are talking about gaming here, not GP.”

          You are, I never was. Common sense would tell you that no card above $400 is actually meant for “common sense” gamers. Sure, some idiots spend more than a car’s worth on the top gear, but they are in the clear minority of buyers of those (non-limited-release) cards.

        • Steven De Bondt

          I’m pretty sure that:

          The way you are promoting the 780 Ti, which “has 30% lower scores,” claiming the tests are rigged (?)

          Calling 4K and high resolutions “pointless” on very expensive cards (??) (what?)

          And

          Saying that increasing the volume of VRAM will magically make up for the inferior bus width of the 780 Ti; only 384 vs 512 bits (yes, high resolutions need a decent bus width, so ==> ???)

          You would still (with great effort) justify gaming on expensive cards when they have a frigging NVIDIA logo (did anyone tell you the 780 Ti is more expensive than $400? Well, I’m telling you now.), but when AMD enters the high-end segment, even IF they prove their performance, you are immediately & completely convinced it’s total crap, and you start blaming all sorts of factors and conspiracies…

          Look, let’s just face what really is going on here, so we can both move on. ;)

  • Nick Lucier

    Wow, just damn incredible… Keep pushing the boundaries! I could barely afford my R9 270X. I’m saving up for CrossFire, but it may be a while. Nice card(s)!!!

    • Michael Mike

      Miners are dumping cards like crazy and prices are dropping because of it, so check out the bay… also, a 270X is a 7870.

      • Nick Lucier

        Yea, there are plenty out there. I’m just broke; my 270X was $200 practically brand new. I’m happy with it, no need to CrossFire for what I do nowadays.

  • http://www.indraemc.blogspot.com/ Indra Emc

    This thing is a beast! But it consumes too much power (only 50 watts less than two R9 290X in CrossFire). That card alone consumes more electricity than your two-door refrigerator.

    • Steven De Bondt

      Fun fact:

      A Titan Black SLI setup consumes about ~40 watts less power than two 290Xs. Is that enough to offset a two-door refrigerator?

      I already know your answer: please reference me that two-door refrigerator.

      • http://www.indraemc.blogspot.com/ Indra Emc

        The average refrigerator consumes 280 watts of power.

        A mini fridge consumes 100 watts of power.

        • Steven De Bondt

          So, did you make the same remark on the Titan Black review then? :S

        • pickle farts

          nobody cares about amd gpu’s lawl

        • Matthew Justin Tremain

          Then why r u here looking at a review about an AMD GPU, Pickles? And yes, the r, the u, and the s are there as I intended. The Titans are computational beasts, but we are here for the gaming side, so we look for gaming performance, and this card is aimed at gaming, not workstations. Indra, my PSU uses 50W when I have shut down my PC, so do you think someone like me cares about power consumption? When you buy a high-power GPU, of course it is going to use high amounts of power. When you buy a large-capacity V8 engine, of course it is going to use more juice than a lower-capacity V8. If you want high performance, then be willing to pay for it, and quit your pathetic complaining.