AMD Radeon R9 290X vs NVIDIA GeForce GTX 780 at 4K Ultra HD

Metro: Last Light

Metro: Last Light is a first-person shooter developed by Ukrainian studio 4A Games and published by Deep Silver. The game is set in a post-apocalyptic world and combines action-oriented gameplay with survival horror elements. It uses the 4A Engine and was released in May 2013.


Metro: Last Light was benchmarked with High image quality settings.

[Chart: Metro: Last Light 4K benchmark results]

Benchmark Results: In Metro: Last Light the AMD Radeon R9 290X had a lower average FPS, but held up better when frame rates bottomed out. The EVGA GeForce GTX 780 Superclocked and the AMD Radeon R9 290X are pretty evenly matched in this game title.

[Chart: Metro: Last Light FRAPS log, first 90 seconds]

Taking a closer look at the detailed results from the first 90 seconds of our FRAPS log, we can see how evenly matched things are. The log shows the AMD Radeon R9 290X doing well in the first portion of our benchmark run.

  • TacticalTimbo

    ‘Most gamers are looking beyond 1080P for their next setup…’ Are they? MOST? I doubt it!

    Running games fluidly at 2560 and beyond is an expensive business; it’s going to require at least a pair of 770s, or a 780… Most gamers cannot afford to spend that kind of cash, period!

  • Karthik

    Comparing one card with its OEM cooling shroud while the other has a 3rd-party custom shroud – will this not affect the stats? Especially considering how the 290X and 290 throttle when temps go near 94-95C, I would like to see the results with a custom 290X :|

  • 88rolling thunder

    Please include benchmarks of the new game Battlefield 4 & the graphics-intensive Crysis 3.

  • Lorenzo

    Why didn’t they compare the Zotac 780 AMP!?

  • solomonshv

    This is cool and all, but under “Uber” mode the 290X eats 100 watts more than an overclocked GTX 780 and runs 15°C hotter. Even if the chip itself can withstand that temperature, the VRMs and the memory will not. Granted, the heat issue may be remedied with a better cooler, but you are still looking at insane power draw.

    An overclocked 290X will probably draw 130 watts more than an overclocked 780, maybe more. That means that if you game an hour a day on average, you are running up about 47 kWh more per year than a GTX 780 user.

    And with power draw and heat like this, I don’t see a dual-GPU solution ever happening, unless they clock it down. A lot.

    • David Calloway

      I will agree with you on temps, and as stated that should be remedied with better heat sink solutions. Gaming an hour a day over the course of one year nets you a difference of 47 kWh of juice… I’m currently paying $0.14 per kWh of power. If I game two hours every day it’ll cost me $13.16 more in power that year… WOW!!! … now I’m worried. ; )
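As a sanity check on the numbers traded in this thread, here is a quick back-of-the-envelope sketch. The 130 W delta, the one- and two-hour daily gaming habits, and the $0.14/kWh rate are all the commenters’ assumed figures, not measured values:

```python
# Back-of-the-envelope yearly energy cost of a card that draws extra power under load.

def extra_energy_kwh(extra_watts, hours_per_day, days=365):
    """Extra energy consumed per year, in kWh."""
    return extra_watts * hours_per_day * days / 1000

def extra_cost_per_year(extra_watts, hours_per_day, price_per_kwh, days=365):
    """Extra electricity cost per year, in dollars."""
    return extra_energy_kwh(extra_watts, hours_per_day, days) * price_per_kwh

# 130 W more, one hour of gaming a day: roughly 47 kWh a year.
print(f"{extra_energy_kwh(130, 1):.1f} kWh")

# Two hours a day at $0.14/kWh: a bit over $13 a year.
print(f"${extra_cost_per_year(130, 2, 0.14):.2f}")
```

The exact two-hour figure is about $13.29; the $13.16 quoted above comes from rounding the yearly delta to 47 kWh before doubling (2 × 47 × $0.14 = $13.16). Either way, the order of magnitude is the point.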

  • John

    Nathan….you need someone to proof read your articles before posting them. I don’t want to be a grammar nazi or anything but the amount of mistakes in this article is rather large. Also there are some clear sentence structure issues, where it seems you started a thought but didn’t finish it… for Example

    “we are using to running at on 60Hz, 120Hz and 144Hz monitors here on our test bench, The 4k Ultra HD monitor that we are using runs at 3840×2160 at 30Hz and we could certainly notice the performance difference and saw screen tearing (where the monitor displays more than one frame at the same time) on both AMD and NVIDIA graphics cards.”

    The comma after testbench should be a period. The first sentence was also probably meant to read “For example we are used to running on 60hz, 120Hz and 144Ghz monitors” instead of what it current says. Just pointing them out. Cheers

    • legitreviews

      Thanks for pointing that out! We try to edit most, but sometimes those that edit aren’t available! This was one of those times.

      • David Calloway

        I get a kick out of this. First let me state something for the record. I’m a Yank living in North Carolina, yes the South. I grew up overseas and am fluent in three languages. This response makes me want to write… yeah, and here at Legit Reviews the editors are the only ones that attended grammar school, and no I wasn’t watching the screen as I typed this article… good one Nathan. I’m just havin’ some fun with this. ; )

    • David Calloway

      instead of what it current says…. wait a minute…. current-l-y reads….. this article on my PC hasn’t said a damn thing…… oh boy.

    • Fred Nice

      John, you forgot a comma in your third sentence after “Also” Or, you could have just left it at “There are some clear sentence structure issues,” Just sayin’

  • Sami

    the reviewer said “The 4k Ultra HD monitor that we are using runs at 3840×2160 at 30Hz and we could certainly notice the performance difference and saw screen tearing ”

    You guys are a joke! You call yourselves reviewers and you have no clue that your monitor can run at 60Hz. Try googling MST (multi-stream transport)… you’ve clearly been using SST mode. Kids these days making sites… GL.

    • TouroBrabo

      So true… LOL

      “MST Mode for 4K Ultra-HD Video at 60 fps
      The PN-K321 supports MST (Multi-Stream Transport) mode as specified by the DisplayPort 1.2 standard. A single DisplayPort cable can transfer 3,840 x 2,160 video signals at 60 fps, resulting in smooth rendering of high-resolution video and fluid mouse operation.”

  • Bob Austin

    AMD has won this round, but unfortunately I won’t be getting it. I have an NVIDIA 3D Vision 2 monitor which cost me a kidney, so I cannot upgrade to anything other than NVIDIA. I’ll have to wait for the Maxwell series.

  • Erenhardt

    Check the Metro FPS charts. From the graph it seems that the red line is always above the blue, with only occasional dips to the same level.

    • legitreviews

      As noted on that test page, the bar chart is just a snippet of the entire run. If we showed the entire run the chart wouldn’t easily be readable, so I cut it down to the first 90 seconds.

  • xxx

    This is not a fair competition. The GTX 780 has 3GB of RAM and the R9 290X has 4GB. Besides, NVIDIA cards have PhysX and CUDA in their firmware, which means more processing per frame.

  • mike

    Why is the R9 running at PCIe 3.0 and the 780 at 2.0?

    • Nathan Kirsch

      It’s just the way GPU-Z reads the cards. It depends on the mode they are in when the screenshot is taken. The ASUS P9X79 Deluxe motherboard is set to PCIe 3.0 in the BIOS and that setting is never changed.

      • titancom

        I am still trying to figure out what the test was trying to compare. The best AMD card against the 3rd-best NVIDIA card, or the same-priced card on both sides, though not the equivalent best of either, right?

  • hellcinder

    I personally see this as a 2nd-place fight, with the 7990 on the throne.

  • Derek Manning

    PCIe 3.0 for ATI and PCIe 2.0 for nVidia…

    • legitreviews

      That’s just the state when the GPU-Z shot was taken. Both cards were tested with PCIe 3.0 enabled in the BIOS of the motherboard.

      • Derek Manning

        Sorry – I didn’t want to imply that legit was favoring one over the other, only observing what was being reported.

        Is it possible that the nV cards are running at 2.0? I know nv lists 3.0.. just askin.

        Thanks for the reply

        • John P. Myers

          It’s very possible Nvidia was running at 2.0, actually. GPU-Z has no trouble reporting Nvidia running at 3.0 when it actually is. By default, Nvidia’s drivers do not support 3.0 on the X79 chipset, which the test board has. You have to manually enable it (no, the BIOS setting only “allows” it to happen, it does not “make” it happen) by running a program called force-enable-gen3.exe downloadable from Nvidia. If this was not done, it is, in fact, 2.0.

        • Diogo

          I’m waiting for a reply to this PCIe 2.0 problem. This seems to invalidate the benchmark altogether.

    • BDK

      LMFAO .. Fail

  • Guy Parris

    “Our card will shine at 4K,” they said. “It will beat the 780 with ease,” they said. They are like neck and neck most of the time; the 290X pulls away by 5-6 FPS in 2 games! AMD, go home, you’re drunk!

    • Strider

      Day one drivers, reference card at stock speeds, roughly $70 to $100 cheaper on average. Being put against a factory overclocked card, running much more mature and updated drivers, at a higher price point.

      Don’t care, still impressive. The 290X performance is only going to go up, and it will indeed surpass the 780. So AMD’s claims are still valid.

      Also, you have to look at it this way. Who games at 4K? No one. Why? It’s just too damn expensive. However, people do game at even higher resolutions above 4K with 3-way Eyefinity, and when put to the test the 290X had no problems going way beyond 4K (Newegg has a video up for this). It also performs near, at, or above the 780s and the Titan in the much more common and traditional HD+ resolutions, such as 1440.

      So yeah, still a win for AMD, there is no way around that. =]

      • David Calloway

        He (Guy) is just not getting why all of your post is so important in terms of what is currently needed in the GPU market. For some of us that have been doing this charade for 20+ years now, the current state of competition is very welcome. It forces NVIDIA to stop resting on its laurels and start offering its performance at more reasonable prices (at least here in the States), and puts pressure on it to R&D with definitive performance goals.

    • Henry

      LMAO Guy, you admit that the 290X is a “Titan killer”. The defining moment for these cards will be Mantle.

    • solomonshv

      The 290X is running at stock clocks. The 780 is overclocked with an aftermarket cooler. The way I see it, NVIDIA got spanked pretty hard. Not necessarily because of the performance, but more so due to the price difference.

      It’s just too bad that right now you can’t buy a 290X anywhere at any price.

      • David Calloway

        Yeah, but on a sweeter note, GTX 780s can be had all day long in the States at the $500 price point. Not sure how much you all in the U.K. have to fork over in Sterling (seems you all get thrashed on pricing overseas); nonetheless this is where GTX 780 prices should have been at launch.

        • solomonshv

          And given the performance of the 290X, $500 is right where it should be, not a penny more.

          I got a 290X for $550 from Newegg with a free copy of BF4, which I was going to buy anyway. So I’m happy.

    • john

      Guy walks in on an OC’d 780 vs. a stock 290X that shows the 290X is the winner even at 4K… and dares to say this… well, no comment… if stupidity hurt, you would be on morphine all day long.

  • Tequila_Mckngbrd

    Legit Bottom Line: The AMD Radeon R9 290X reference card is a beast and was able take on an overclocked GeForce GTX 780 at 4K and come out victorious!

    … I thought they were tested at stock?

    • Strider

      The 290X is stock, and running day one drivers. The 780 superclocked is a factory overclocked card, and it’s running MUCH more mature drivers.

      • Nathan Kirsch

        A stock GTX 780 is $649. An AIB’s custom-designed and overclocked card is $659. Most cards sold are the custom ones, so I figured I’d test with that! There are no R9 290X custom cards yet, so reference was the only choice!

        • duplissi

          Must do a review once the AIBs’ WindForce, DirectCU II, Double D, Lightning, Toxic, etc. models are out.

        • David Calloway

          I completely agree…… and while we’re at it, how about with slightly more mature drivers.

        • Nathan Kirsch

          I’ve heard a million people say this, but Hawaii isn’t really that different from the Tahiti GCN architecture. Not sure how much more they can get unless they are holding something back.

      • http://www.anushand.com/ Anusha Dharmasena

        I really doubt the drivers will improve just for the 290X. It’s the same architecture, just slightly pumped-up specs.

        If the drivers improve, it will be for all the GCN-architecture-based cards.

        However, you cannot say that NVIDIA cannot improve its drivers further. Who knows?