AMD Radeon R9 280X Video Card Review w/ ASUS, XFX and MSI

Battlefield 3

Battlefield 3 Screenshot

Battlefield 3 (BF3) is a first-person shooter video game developed by EA Digital Illusions CE and published by Electronic Arts. The game was released in North America on October 25, 2011 and in Europe on October 28, 2011. It does not support versions of Windows prior to Windows Vista as the game only supports DirectX 10 and 11. It is a direct sequel to 2005’s Battlefield 2, and the eleventh installment in the Battlefield franchise. The game sold 5 million copies in its first week of release and the PC download is exclusive to EA’s Origin platform, through which PC users also authenticate when connecting to the game.

Battlefield 3 Screenshot

Battlefield 3 debuts the new Frostbite 2 engine. This updated Frostbite engine can realistically portray the destruction of buildings and scenery to a greater extent than previous versions. Unlike previous iterations, the new version can also support dense urban areas. Battlefield 3 uses a new type of character animation technology called ANT. ANT technology is used in EA Sports games, such as FIFA, but for Battlefield 3 it has been adapted to create a more realistic soldier, with the ability to transition into cover and turn the head before the body.

Battlefield 3 Benchmark Chart

Benchmark Results: The AMD Radeon R9 280X is looking good in BF3. The ASUS Radeon R9 280X ran at 81 FPS at 1920×1080 and 52 FPS at 2560×1600, which is very competitive with the NVIDIA GeForce GTX 770 series cards that run $399 and higher. For example, the MSI Gaming N770 2GD5/OC was faster in BF3, but at $399 it costs $100 more. Right off the bat it looks like AMD has a price-versus-performance winner here, as the $299 R9 280X falls between the NVIDIA GeForce GTX 760 and GeForce GTX 770. The XFX Radeon R9 280X Double D Edition runs at AMD’s reference clock speeds and still posted very respectable scores of 80 FPS at 1920×1080 and 49 FPS at 2560×1600.
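
For readers who want to put a rough number on that price-versus-performance claim, here is a minimal Python sketch using only the 1920×1080 BF3 results quoted above. It assumes both R9 280X cards sell at the $299 MSRP (the XFX card’s exact street price isn’t listed in this paragraph), and it leaves the GTX 770 out of the ratio because its frame rate isn’t quoted here.

    # Rough FPS-per-dollar comparison from the 1920x1080 BF3 numbers above.
    # The $299 price for both cards is the MSRP assumed in this review.
    cards = {
        "ASUS Radeon R9 280X": {"price": 299, "fps": 81},
        "XFX Radeon R9 280X Double D": {"price": 299, "fps": 80},
    }

    for name, d in cards.items():
        print(f"{name}: {d['fps'] / d['price']:.3f} FPS per dollar")

    # What a $399 GeForce GTX 770 would need to hit to match the ASUS card's ratio.
    needed = cards["ASUS Radeon R9 280X"]["fps"] / 299 * 399
    print(f"A $399 card needs roughly {needed:.0f} FPS to deliver the same value")

By that measure, a $399 card would have to push roughly 108 FPS in BF3 at 1920×1080 to match the R9 280X on value, which is why the $100 price gap matters more than the raw frame-rate gap.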

AMD said that the Radeon R9 280X will be able to play Battlefield 4 at 2560×1440 with maximum quality settings. That title isn’t out yet for us to test, but we tried the beta and the claim appears to hold up.

Comments
  • nyran125tk

    There are definitely issues with these cards. Many say the memory clocks are too high.

  • Christofer Santiago

    Kind of stupid to put cards far out of its price range, and on top of that Nvidia SLI numbers? They had the cards and could have easily done Crossfire benchmarks as well, especially since those were the cards being highlighted. And btw, did they disable PhysX to be fair to AMD? À la TressFX.

  • Lianesch

    MSI Radeon R9 280X Gaming Temperature: VDDC set to min, not max

    • Nathan Kirsch

      sorry about that!

  • Yosef

    I am disappoint by your lack of 3-fan Gigabyte Windforce Master Race

  • Will-0-Wisp

    The MSI R9 280X is definitely the one I’ll get!!! Thanx for the review!

  • Mombasa69

    No Crossfire testing then?

    • Gogu

      No.

    • michael centeno

      This is a pro-Nvidia campaign here, buddy.

    • KC

      Crossfire works nicely. Moreover, this was Crossfire-tested with a 7970 GHz and it works perfectly. No issues. It’s a beautiful card for a very low price.

  • Mombasa69

    Sapphire R9 280X Toxic, best of the lot easily.

    • lolol

      If you want a burnt-through GPU, yes.
      I’ve spoken to 2 computer-shop owners who build custom PCs for their customers.
      I visited them to get reliable information about high-quality hardware. Sapphire, HIS and PowerColor have a failure rate of more than 20%.
      Asus, XFX and MSI are the only reliable ones.
      They install Gigabyte only if the customer insists on using a Gigabyte model.
      Asus or XFX models are standard.
      I was surprised when they said XFX. I had only heard bad things about it. But their numbers don’t lie.
      They have to take care of their reputation. They would be bullshitting themselves if they talked crap.

      • Kristijan Vragović

        I had nothing but trouble with XFX, and no problem with Sapphire.

        • Dan Mackinlay

          Likewise. “burnt-through GPU” lol