AMD Radeon R9 Nano Versus NVIDIA GeForce GTX 980

Jump To:

Power Consumption

Radeon R9 Nano vs. GeForce GTX 980

To test power consumption, we plugged our test system into a Kill-A-Watt power meter. For idle numbers, we let the system sit on the desktop for 15 minutes and took the reading. For load numbers, we ran Battlefield 4 at 3840×2160 and recorded the peak gaming reading on the power meter.


Power Consumption Results: The AMD Radeon R9 Nano used 4 Watts more power at idle and roughly 50W more when gaming, so NVIDIA wins on the power front.
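For those curious what the measured deltas mean on a power bill, here is a rough sketch. The 4 W idle and ~50 W gaming differences come from the readings above; the daily usage hours and the electricity rate are illustrative assumptions only:

```python
# Back-of-the-envelope running-cost difference between the two cards.
# Measured deltas (from the review): the Nano drew 4 W more at idle and
# ~50 W more while gaming. The usage pattern and $/kWh are assumptions.

IDLE_DELTA_W = 4          # idle power difference, watts
GAMING_DELTA_W = 50       # gaming power difference, watts

IDLE_HOURS_PER_DAY = 4    # assumed desktop/idle time
GAMING_HOURS_PER_DAY = 2  # assumed gaming time
RATE_PER_KWH = 0.12       # assumed electricity rate, $/kWh

extra_wh_per_day = (IDLE_DELTA_W * IDLE_HOURS_PER_DAY
                    + GAMING_DELTA_W * GAMING_HOURS_PER_DAY)
extra_kwh_per_year = extra_wh_per_day * 365 / 1000
extra_cost_per_year = extra_kwh_per_year * RATE_PER_KWH

print(f"~{extra_kwh_per_year:.0f} kWh/yr, about ${extra_cost_per_year:.2f}/yr")
```

With these assumptions the gap works out to roughly 42 kWh, or about five dollars, per year; hardly decisive on its own, though heat and noise scale with the same delta.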

Let’s wrap this review up!

  • Mulligan

    I was given a chance to try the Nano. I have a 980 Ti, but the Nano was attractive in that it saves a LOT of space, is less noisy, and doesn't get as hot. Ultimately I opted to keep the 980 Ti; it has PhysX, which is essential for Alice: Madness Returns and a few upcoming titles I'm excited for that take advantage of it as well... plus the Nano could NOT do 4K at 60Hz, as it lacks the HDMI 2.0 my TV requires. Such a pity. AMD probably won't take a financial hit on this, but a LOT of people will be disappointed...

  • John Pombrio

    Sales are up from #156 on Amazon's best-seller list of graphics cards to #73 after the price drop. Better, but sales of the Fury Nano are still not even close to those of NVIDIA's GTX 970 and kin.

    • Nathan Kirsch

      yeah, not sure how often they reset that. The only Radeon card in the top 20 is the Radeon HD 5450 at $29.99.

  • Timothy Isenhart

    Wow, that's amazing. I have both the R9 Fury X and the 980 Ti, and the R9 Fury X outperforms my 980 Ti by 3-7 FPS in the games you tested, and I get much higher frame rates than what you reported. Hmmm? Either you have corrupt drivers or these are phony benchmarks.

  • Kaitensatsuma

    LOL at the fanboys down there.
    Well... for a 6-inch card it's holding its own quite well. You could swing a much smaller box with that... Granted, cooling is always the trick, but I appreciate the idea that I could take an older office computer and turn it into a relatively decent gaming PC.

  • Casecutter

    AMD (RTG) could only push this product into the channel so fast; they only got manufacturing moving in sync several months back (Oct). They held price, got stocks built, and have moved Fury stuff to AIBs. Figure AMD made close to $150 (min) on each Nano, as AIBs get it from AMD as a pass-through SKU. Now RTG is in a place to move volume, so at this $500 price they can move every full Fiji chip they have wafer starts scheduled for. I'd say in a month or so AMD will have AIB Furys coming out, and we'll see those at $450 with some improvement in OC headroom. As Sapphire first started revealing in Nov, the remainder of the AIBs will offer newer versions.

    The upside is that when their Arctic Islands parts hit with HBM, RTG will have been well past the learning curve on interposers/HBM and can hit the ground running. Nvidia will need to pass through this phase on low production as they start delivering HPC cards for corporate contracts. After that they fill the channel with professional cards for the Tesla/Quadro markets and then Titan/GTX gaming. I'd figure AMD will be in a great place for the next year and then some.

  • Javier Saove

    good review sir.

  • MLSCrow

    OMG NVidia is so dead! Team Green will fade to Team Brown within the next two years. The Nano is half the size of the 980 Ti, and given the Fury X2 that will be out soon, it will outperform the 980 Ti. On top of that, AMD is about to release Polaris, which will double performance per watt over NVidia. NVidia is already afraid because their Pascal architecture isn't going to be able to keep up with the innovative things AMD has been doing. They were caught off guard this time. What's more, AMD is also releasing Zen, their new and great CPU design, which means there will be some kickass APUs in the near future. Meanwhile, NVidia still doesn't have an x86 license and has to bet all their chips on things like 64-bit ARM processors and Tegra, and well, we've seen how that turned out. So unless NVidia does something, and quick, it looks like they'll have a future like what AMD had during its Bulldozer days.

    Don’t get me wrong, I’m all for competition, but it will be nice to see team RED on top for once, which, by doing so, will bring prices down for all of us. Weeee.

  • shernjr

    The GTX 980 is 750-770 AUD; the R9 Nano is ~800 AUD. I'd take the Nano.

  • Josh

    I am not an Nvidia fanboy, but I'd say this is a win for the 980, as it has significantly lesser specs but is only beaten out by a small margin.

    • Medi Zerovan

      Which “lesser specs” are you talking about dude?
      Get a clue.

    • Barron

      Why should consumers care so much about specs?

      The performance and the price is their main concern, not some technical spec which most consumers do not fully understand.

      Don't get me wrong, it's good to look at specs and try to understand them. But to me, the Nano is more future-proof than the 980 considering the future is DX12, don't you think?

    • Timothy Isenhart

      I see your point, although most people don't realize this: the large GPU bus width is needed to use HBM because, unlike GDDR5, HBM is stacked on the die and not on the card. For example, a 1024-bit bus is needed to support each 1 GB of HBM. People also try to compare CUDA cores to stream processing units; each CUDA core has 2 stream processing units for full duplex per CUDA core, but the Nano and Fury X have 4096 stream processing units, which would be equivalent to 2048 CUDA cores.
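      The bandwidth side of this comment can be sanity-checked with the standard back-of-the-envelope formula, peak bandwidth = bus width × per-pin data rate. Below is a small sketch using the published specs of the two cards in this review (the spec numbers come from the cards' data sheets, not from the comment above):

```python
# Theoretical peak memory bandwidth in GB/s:
#   bandwidth = bus_width_bits * per_pin_rate_gbps / 8 bits-per-byte

def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in gigabytes per second."""
    return bus_width_bits * pin_rate_gbps / 8

# R9 Nano / Fury X: HBM1, 4096-bit aggregate bus (four 1024-bit stacks),
# 500 MHz double-data-rate = 1 Gb/s per pin.
hbm1 = peak_bandwidth_gbs(4096, 1.0)    # -> 512.0 GB/s

# GTX 980: GDDR5, 256-bit bus at 7 Gb/s effective per pin.
gddr5 = peak_bandwidth_gbs(256, 7.0)    # -> 224.0 GB/s
```

      The wide-but-slow HBM bus versus the narrow-but-fast GDDR5 bus is the whole trade-off in one line; the CUDA-core-to-stream-processor ratio claimed above is a separate (and contested) topic.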

  • E.

    AMD can't do PhysX, so it has to push it off onto the CPU. Only the cheapskates who care nothing about game fidelity still buy their GPUs.

    • duplissi

      Yes, the extra fidelity in all 3 games every year… Get over your fanboyism…

      • onstrike112

        That also destroys performance on said GPUs! Even Nvidia users don’t use PhysX. It was cool in 2004. It’s antiquated now. lol

        • Actually it's not that bad on the CPU, at least. I ran PhysX on an i7 4770 in Borderlands perfectly fine.

          The GPU was an HD 5770.

        • onstrike112

          I was referring to running it on even high-end Geforce cards. Also, if Nvidia wanted to make real money with PhysX, they’d allow AMD users to buy Geforce cards for DEDICATED PHYSX.

    • Judge_Chip

      I use PhysX whenever a game supports it; it adds so much realistic eye candy to the graphics effects without degrading performance that it's an auto-on feature. It's the crybaby AMD fanboys that whine the most; anything AMD can't do makes them cry, lie, and spew blatant BS about.

      • Ol’Gil

        Don't forget, PhysX from the CPU looks entirely different from what's generated by NVIDIA cards. There are comparisons on YouTube that show this. CPU/AMD PhysX is missing certain shadows, textures and color. I play Madness Returns quite often (and other games, but mostly Alice), so PhysX is still quite a big deal for me and many others.

        I'm curious though: could AMD users get a Dell PhysX card (the old dedicated cards found on eBay now) and use it with modern PhysX games?

        • Timothy Isenhart

          Not anymore. You used to be able to, with a driver mod and an Ageia PhysX card, or by running an Nvidia card, installing the drivers, and replacing the card with an AMD/ATI card. But the Nvidia drivers now detect whether you have an authentic Nvidia card in your system; if they detect an AMD/ATI card they will automatically disable PhysX.

        • Andi De Voil

          Actually it would still suck with an old Ageia PhysX card.

          The old cards simply aren't powerful enough and form a serious bottleneck.

      • BaronMatrix

        You might need therapy…

      • Timothy Isenhart

        I like to use PhysX, but the same effect can be achieved with SweetFX and is a lot less demanding on the GPU. AMD cards have physics but not Nvidia PhysX, unless you can find a driver mod.

      • Timothy Isenhart

        Sadly there are a lot of butthurt AMD fanboys over PhysX and HairWorks, but I saw a lot of butthurt Nvidia fans before Tomb Raider was released with TressFX, because they assumed it was only going to be on AMD cards. The fact is AMD doesn't close-source any of the enhancements they develop; everything AMD develops to enhance visual effects in games is also available for Nvidia users. I can honestly say AMD has never played unfair with Intel or Nvidia; AMD is not out to cripple the competition, they are just trying to compete.

        • Damien

          Can't tell "Judge_Chip" nothing, bro.

    • Timothy Isenhart

      AMD GPUs can do physics but not Nvidia PhysX unless you get hold of a driver mod, and I haven't seen an Ageia PhysX driver mod since 2009. The ATI cards actually performed better than the Nvidia GPUs when you ran PhysX, which is the main reason Nvidia bought Ageia.

    • john

      Well, luckily I will get a 5820k

  • Judge_Chip

    The Nano needs 2 slots just like many factory-OCed Maxwell 980s do, which beat the Nano bloody, so the overhyped Nano is a no-go for me and my many customers who want real performance and not gimmicks. NO 28nm GPU needs HBM1 bandwidth, but the best future-proofing card is the 6GB Maxwell 980 Ti, which some games need now and more will need in the future. The GTX 980 and GTX 980 Ti are a bigger bang for your GPU buck than the hyped-up Nano.

    • Johan Krüger Haglert

      I don't think it's fair to compare to a stock GTX 980, but then again this card supports FreeSync, possibly shaving $100-250 off the price of an adaptive-sync gaming monitor vs. a G-Sync one, and if one considers them together it becomes a much more interesting choice than alone.

      • Judge_Chip

        G-Sync is superior to ghosting FreeSync in every way. I don't mind paying more, but your $250 is way too much; I see G-Sync monitors selling for way less, closer to your $100.
        The MSI GeForce GTX 980 is selling for $509.99 and it's highly overclockable.

        • Johan Krüger Haglert

          In every way?
          FreeSync is superior to G-Sync in:
          * Price.
          * Standard.
          G-Sync is superior in... what, really? Consistent behavior?

          I don't know if you actually mean ghosting or something else. FreeSync or G-Sync would never introduce ghosting; ghosting would be a consequence of the panel used, not of either technology. If you've seen a review which mentioned ghosting issues with a FreeSync screen, that has nothing to do with it being FreeSync and everything to do with how something else about the screen was made.

          I said $100-250; that range includes $100. I never said no screens sell at close to a $100 difference; I'm saying they do.

          Acer XR341CK bmijpphz Black 34″ 4ms (GTG) 21:9 WQHD Curved 3440x1440 LED IPS Monitor, Adaptive-Sync (FreeSync) with Speakers, $999.

          Acer Predator X34 Curved IPS NVIDIA G-sync Gaming Monitor 21:9 WQHD Display with Built-in Overclocking 100Hz Refresh Rate Boost, $1290.

          I don’t know if those are the models I’ve been looking at, but it’s the category of monitors I’ve been looking at where the difference has been higher than $100.

          Both of those are 34″ 21:9 3440x1440 curved IPS monitors with adaptive syncing; one has FreeSync and the other has G-Sync, and the G-Sync one costs $291 more. Both are 75 Hz, BUT the latter can be overclocked to 100 Hz, so sadly they aren't completely the same in specs. So I can't argue they are the same, and I don't; they are somewhat similar, though, and the price difference is beyond my claimed $250, at almost $300.

          I know the Nvidia cards can be overclocked and are clocked higher than stock in almost all models from all manufacturers, and I too think the GTX 980 Ti is a more attractive card than the 980 if one is OK with paying the cost.

        • Judge_Chip

          Price is my (the consumer's) issue, and from what I've been looking at, a G-Sync monitor costs closer to $100 more than FreeSync, not $300.

          I don't care about the standard when Nvidia OWNS 80% of the dGPU market. FreeSync can be supported for NOTHING if and when Nvidia sees fit.

          And Yes G-sync delivers a superior visual gaming experience and is worth the extra cost.

        • Johan Krüger Haglert

          Yeah, so there's one thing FreeSync does better for you (price), and what does it actually do worse? Then again, yes, Nvidia's market share is 80%, and as such, if you have an Nvidia card, which I guess you do, the simple fact that Nvidia doesn't support Adaptive-Sync makes it a no-go for you. To claim FreeSync/Adaptive-Sync is bad is a different story.

          And if Nvidia's 80% market share continues, then Nvidia G-Sync will be the more or less de facto standard in dedicated-graphics gaming setups, regardless of what the standard accessible to everyone is. Personally I think Nvidia is an asshole for not supporting Adaptive-Sync; it made me wait with my purchase of a GTX 970 over one year ago, and that fact, the price difference, and of course my will to have working competition in the market make me lean toward / be willing to take an AMD card rather than an Nvidia card if they are equally good. Winning over someone like me would kinda be a big thing, because previously I've run the BSDs, now and then Linux, at times Solaris, and also an OS X hack, and as such I've always preferred Nvidia due to their existing and superior drivers for those OSes. AMD hasn't caught up, and it hasn't really seemed like they were able to either, but now that they're opening up more things, we'll see what happens, I guess.

          One also shouldn't ignore the fact that Intel will support Adaptive-Sync, that Intel powers lots of display drivers, and that things such as laptops and tablets may get and use Adaptive-Sync, and maybe VR equipment could do so, and so on (then again, those require a dedicated graphics card to power games at the moment, and Nvidia is #1 there...).

          Sure, Nvidia can support Adaptive-Sync whenever they want to, but they don't currently, and with equally good graphics cards that makes going with the Nvidia solution more expensive (and they are the Sony of graphics: PhysX? Closed. GameWorks? Closed. G-Sync? Closed. Drivers? Closed.)

          Hopefully it will come back to haunt them at some time.

          “And Yes G-sync delivers a superior visual gaming experience and is worth the extra cost.”, citation/source needed not just a bullshit line.

        • Judge_Chip

          PhysX? Closed, BECAUSE Nvidia invested time and money to develop it. GameWorks? Closed, BECAUSE Nvidia invested time and money to develop it. G-Sync? Closed, BECAUSE Nvidia invested time and money to develop it. Drivers? Closed, BECAUSE Nvidia invested time and money to develop them!

          It's called ROI, return on investment: taking care of business, making a profit that debt-laden AMD can only wish for. It's the reason I build custom gaming rigs for my clients, to make money; why is this so hard to understand? Debt-doomed AMD is Geforced to give it away for free. As for Intel, they can NOT support G-Sync BECAUSE Nvidia OWNS it; it's proprietary IP. Soon AMD will go the way of the Dodo, Disco and Dinos, BECAUSE they are wallowing in massive debt, forced to cut R&D, and can't make money.

          AND yes, G-Sync delivers a superior visual gaming experience. Google it and stop posting BS.

        • Johan Krüger Haglert

          It doesn't matter that they invested time and money in it. It's closed regardless, and that makes it worse.

          I assume AMD has its own alternative to most of what's in GameWorks, or something similar.

          G-Sync likely didn't cost much money to do, and if it did, then clearly they wasted their money for nothing, since there were already specs for doing it.

          The drivers are likely mostly closed because Nvidia invested time and money INTO THEIR GRAPHICS CARDS, not the fucking driver.

          Regardless, it makes it all worse for the customer and for the whole industry. That Nvidia wants to make as much money as possible is obvious, but there's no fucking reason for you or any customer to feel happy about paying for it or getting inferior performance.

          It's not hard to understand that they want to make money; all they're doing is shit for the industry and for the customer, though, and as such there's no fucking reason to pick Nvidia over the competition when the competition has ~equal products, because the Nvidia alternative screws you over so much more, just like Sony and Apple have done / do.

          I don't think AMD is doomed; if nothing else, someone would likely be up to grab the pieces for whatever price. Whether that would leave a still-remaining alternative developer of x86 chips and desktop graphics is a different story. I guess, for instance, Apple could grab AMD and then make their own x86 chips and graphics; they could continue the APU stuff or add better graphics to their ARM SoCs or whatever. Maybe that would even be the most likely buyer? No problem whatsoever for Google to take it either, if they want it. So I doubt the whole of AMD is doomed. Whether they'd have to default and restructure their debt is a different story, though in regard to being a customer or an investor that may not matter much. Anyway, I can think that someone would consider the assets of some interest at least. Also, they are kinda successful in the APU and SoC business.

          Already back in October or so I wondered whether it wasn't time to buy AMD, and since the breakout upward, those who did could have sold at up to a 50% profit within 1-2 months. That all kinda deteriorated two or so days ago, though; it's still up from the absolute bottom, but it's "riskier" to buy something which falls in price than something going up, at least if one views the price of a company as something which moves along with the fundamentals of said company. It's also safer from a traditional technical-analysis perspective, but the two are kinda connected.

          Just because AMD is failing now and has done so for a few years doesn't mean they always have to fail. I don't think a 40% IPC improvement is enough to get ahead of Intel, and as such I don't really see much need to wait for Zen if one wants the best performance, and no reason to believe Zen would have that. Maybe it's good enough, and maybe it's enough to sustain them.

          Regarding their graphics, what we started with was the R9 Nano vs. the GTX 980, and that graphics card is almost on par. After a process shrink they expect two times higher performance per watt, and their Arctic Islands chip seems to... if one for some stupid reason ignores the fact that the GTX 960 could have been quicker and capable of performing even better... possibly draw less power than at least the current Nvidia Maxwell chips (it likely does, but that's to be expected).

          Pascal will of course lead to improvements on the Nvidia side; possibly we'll see a change of focus where AMD goes lower in power consumption whereas Nvidia releases the more powerful card instead? Who knows. Intel and Nvidia indeed spend much more money on R&D and hence should make better progress. That possibly affects the future more than the present.

          Also, there are companies like Nintendo which show that you can prosper/survive/make a profit by spending "as little as possible" on R&D instead. People complain about their stuff being slow, but on the other hand, the slow, outdated stuff costs less, and that makes it easier for them to make a profit on what they actually do sell. Apple is completely outsold in the smartphone industry but still manages to grab almost all the profit in it.

          Now, for AMD not to be at least good enough in processors and graphics isn't really an option, but for instance they have already dropped their fabs while Intel still has theirs, and now Samsung and TSMC kinda seem to have caught up with Intel in fabrication? That doesn't affect Nvidia.

          Regardless, I'm not talking about buying the company (though the stock may have been oversold for the time and was possibly forming a bottom, which it did, but that's over-played now and the setup would be another one), but their products, and if their products are a good purchase then they are a good purchase, and the R9 Nano plus a FreeSync monitor isn't a bad purchase. If someone wants to build a Mini-ITX machine, then it's an awesome, fucking amazing purchase.

          Cisco hasn't done as badly as its stock price suggests:
          “Wall Street analysts expect Cisco to earn 99 cents per share for the fiscal year ending July 2000. At $108 per share, the company has a P/E of 109.”

          If the company were still priced at a P/E of 109, which included a lot of hype, then it would cost over $200 now, or 8 times more, which would actually be more than 2.5 times higher than it has ever cost!

          So Cisco makes more profit now than it did back in 2000; the stock just had a very high price.
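          The P/E arithmetic here is easy to check. The $108 price and 99-cent EPS are from the quote above; the present-day EPS below is a purely hypothetical placeholder, used only to illustrate how a constant multiple would scale the price:

```python
# P/E ratio: share price divided by earnings per share (EPS).

def pe_ratio(price: float, eps: float) -> float:
    return price / eps

pe_2000 = pe_ratio(108.0, 0.99)        # ~109, matching the quoted figure

HYPOTHETICAL_EPS_NOW = 2.00            # placeholder, not a real figure
implied_price = pe_2000 * HYPOTHETICAL_EPS_NOW  # price at the old multiple
```

          At the assumed EPS, holding the dot-com-era multiple constant would put the share price well over $200, which is the comparison the comment is making.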

          Intel can and will support Adaptive-Sync, which is what I said; I never said Intel would support G-Sync. That means AMD isn't alone in supporting it, and hence the market is larger than it would be with just AMD.

          Feel free to tell me in what regard G-Sync offers a superior visual gaming experience (BS) rather than, say, ghosting, because FreeSync definitely doesn't introduce any ghosting by itself. Ghosting would be an older image which remains on the screen, but it's not like FreeSync blends the images or something; it of course sends a complete image, and the rest is up to the monitor, and what the monitor does isn't the responsibility of AMD.

        • Damien

          I wouldn’t bother having any form of communication with this insane AMD hatred filled, venom spewing asshole.

  • Judge_Chip

    I'll take a factory-overclocked GTX 980 from one of Nvidia's many partners over a Nano each and every time, and then overclock it some more when needed, because Maxwell has massive overclocking headroom, much better drivers, and more cooling options.

  • Shamz

    Pretty amazing that AMD could make such a powerhouse in such a small package. Good stuff.

    • Sam al

      Which is the reason why it was expensive in the first place, I guess. ITX components always have their own premium.

      • druout1944 .

        You'd be hard pressed to find a mini-ITX case that can't fit larger cards for way less money than the Nano. I got a 390X into my 380T, and a 390X/980/980 Ti will even fit in the Silverstone RVZ01B Mini-ITX desktop case.

        • TacoSwagga

          Your 390X in that case is probably loud as fuck though.

        • druout1944 .

          Not really; nothing compared to my 980 Ti SLI or 780 Ti SLI in my other rigs. I personally didn't notice any irksome noise myself, but my hearing ain't the greatest either.

  • Cayder

    Why did you not overclock the cards? If you're buying a $500 card, you're doing a disservice to it by not getting free performance. Clearly the 980 would win due to its higher headroom.

    • Sean Kumar Sinha

      The 980 is already clocked quite a bit higher than the Fury Nano by default.

    • Nathan Kirsch

      Cayder, we've already done that in each card's respective launch article. You are 100% correct that the GTX 980 has higher overclocking headroom and will likely perform better than the R9 Nano when overclocked. A good number, if not the majority, of people don't overclock their gaming PCs. In fact, the audience for these cards is pretty small considering the average PC user. NVIDIA said last week that less than 1% of the PCs expected to be in use globally in 2016 will be powerful enough to run virtual reality; NVIDIA suggests a GTX 970 or higher for VR. We have a pile of stuff waiting to be reviewed and did this article to update gamers on the base reference card numbers. Overclocking could have been done, but then using the NVIDIA reference card would have been pointless, as only EVGA and Zotac are using it. Most GTX 980 cards being sold today are factory overclocked with custom PCBs and GPU coolers.

    • Sean Easterling

      By your logic (i.e., picking what is best about a single card in a test and measuring the others by it), they should have tried to stuff the Nvidia cards into tiny mini-ITX cases to compete with the Nano's form factor. Oops! Looks like they get a big fat zero on that test because they can't fit.

      Or maybe they should have run them all in a low-airflow silent-gaming case to compete with the Fury X? Oops, they're all at 95C, throttling, and louder than your speakers because the fans are spinning at 100%.

      You wouldn’t buy a Nano over a 980 unless you needed the form factor. You wouldn’t buy a Fury X over a 980Ti unless you wanted an AIO liquid cooler and silent gaming. You shouldn’t comment unless you can see your own bias.

  • Rauel Crespo

    Instead of continuing with the Nanos, AMD should just take the non-shipped R9 Nanos, slap on the HSF from the Fury, and make an air-cooled Fury X and sell it for $550.

    This would destroy the GTX980-Ti from a value standpoint.

    • Christopher Prats

      Isn't that the non-X Fury?

      • Villz

        The non-X Fury is a chopped-down version of the full Fiji,

        akin to how the 980 Ti is a chopped-down Titan X.

    • onstrike112

      That would be a good idea too: have all 4 products be a different part of the "Fury" range.

  • rockstarfish

    Nvidia just got wrecked

    • N7 Spectre Elite

      AMD’s profit margin just got REKTED!

      • rockstarfish

        HBM reduces PCB costs from not having to run traces for all the GDDR5 chips. AMD is advancing technology and making smaller, faster, cheaper-to-produce products; you don't seem to understand technology. Why would AMD and Nvidia move to HBM if it was not cost-effective? Wait, Nvidia hasn't figured out HBM yet, since they flip-flopped on HMC and are now lagging behind.

        • Jtrdfw

          Lol, someone needs to Google market share. Despite die-hard fans digging real deep for the silver lining (die-hard graphics card fanboys, think hard about this ridiculous statement), most people didn't appreciate AMD's outright lies and rebranding of 2015. On top of years of garbage CPUs, it nearly sank them in a single year. The best thing AMD ever did was cultivate the underdog image, and they had to cash in, hard.

          You better damn well hope they didn’t fuck up Polaris too or we’ll be out of competition, a situation nobody needs.

        • rockstarfish

          Market share is history mannnnnn, times are changing, AMD is winning

        • Judge_Chip

          Winning with what? Bandwidth-wasting HBM1 that NO 28nm GPU needs, rebranded lame old watt-sucking GPUs, a 2-slot Nano, IPC-crippled CPUs and APUs. No, AMD is lying on its debt-laden deathbed and only survives on the hype-pumping propaganda life-support machine that infests this board and many others. Desperate AMD cut its in-house marketing and hired viral marketers to do its dirty work on message boards all over the internet.

          Sorry, but Pascal, Apollo Lake and Kaby Lake will put AMD in the deep debt grave. AMD's graphics IP will be sold off, and ATI will make a slow comeback as AMD goes RIP.

        • DavoR

          You mean viral marketing like what you did with that comment for nVidia/Intel?

          Google “hypocrisy”.

        • Victor

          Get Nvidias dick out your ass lol

        • Casecutter

          Market share is one thing, but selling all the chips from the wafer starts you contracted to take, at a good profit, is paramount. AMD has been fraught with GloFo contract and foundry underperformance, etc., all while paying good money for the crap. I think the Rory Read era thought mining would stay hot for much longer and scheduled way too much when the mining craze was on, and then learned 20nm was in the dumps. The same management didn't consider that Maxwell would scale up like it did, thinking Nvidia was counting on 20nm for much of the gain they actually found even while stuck at 28nm.
          Those times are behind them. With Samsung in at GloFo they keep things "real," and AMD is no longer stuck with wafers/chips they either find a way to move or pay for without producing. And I think GloFo's pricing is better than TSMC's, which is keeping TSMC from a huge increase like the one they did in 2011 on 28nm.
          A company can survive as long as the production it schedules leads to product it can pass through with profits that fund R&D, and that bears fruit. Things are different, more competitive, and lean changes like this could mean more wins.

        • Jeffrey Byers

          The MILLIONS that AMD needs in profit to pay off their debt are impossible, though.

          That's not the best article, but there are lots of hard numbers available (quarterly losses, yearly debt payoff commitments, unit profits required to regain profitability...). You also have to factor in the cash paid by Intel, which they had to burn through.

          If financial experts with PhDs in economics are predicting AMD can't turn it around, I see no reason to believe random people on the internet.

        • gparmar76

          I wish I could believe this... truly, I do... but I spent 3 weeks poring over my options on the first build I'm doing in 7 years, and I could not find a reason to go AMD. I opted for the Skylake i5 6600K for $240; what AMD CPU compares at this price? I am still thinking it over on the GPU because I didn't order mine yet. I'm thinking R9 Nano, because that is a winning AMD product at that price point. I hope AMD can somehow get back on track with AM4 in the near future.

        • Medi Zerovan

          Yeah, Google the market share of the P4 Prescott vs. the AMD Athlon 64 that was better in all ways and also cheaper, and STFU.

        • Timothy Isenhart

          I have an Nvidia 980 Ti and an R9 Fury X; both are great cards. I'm still using Crimson 15.12 drivers on the AMD, but my FPS is much higher than what the writer posted. My Fury X actually slightly outperforms my 980 Ti by about 3-7 FPS; I think the writer has corrupt drivers, because my Fury X has never had FPS that low on max settings, including 4K.

        • anubis44

          You sound like the biggest nVidiot asshole I've seen on here in a long time. Speaking of lies, what people REALLY didn't appreciate was being sold a '4GB' GTX 970 graphics card that can only really use 3.5GB, because the other 0.5GB is running at 1/7th the speed. Jen-Hsun 'apologized' for the deception, all the while counting his money, but nVidia's partners are still printing '4GB' on the boxes: now THAT'S a fucking lie. The fact that nVidia's chips had their schedulers ripped out of them in order to save power, and are now crippled for DX12, won't win them any fans when all these stupid Maxwell buyers start trying to play DX12 games that need those schedulers. But of course, nVidia will just sell them a Pascal card, because they deliberately crippled Maxwell, and so the cycle of idiocy goes.

    • Steve Bike

      An overclocked 980 shat on the R9 Nano; they should have run it at 1500MHz. They wanted like $649 for the Nano too, garbage. Vega seems pretty shit too, just like the Fury X/Nano, and I wanted them to be good. Fucking 280W/350W/400W GPUs in 2017 is dumb.