CPU Bottleneck? GeForce GTX 1080 Ti Tested on AMD Ryzen versus Intel Kaby Lake


CPU Bottleneck Exposed By GeForce GTX 1080 Ti?

Last week, NVIDIA released the GeForce GTX 1080 Ti Founders Edition graphics card. It is the fastest consumer desktop graphics card in the world, and we were torn on how we wanted to test it. We are in the middle of updating our GPU test bench and fighting a few hardware gremlins, so we thought we’d try something new and a little different for a change. AMD just recently launched its Ryzen processor series, a huge architectural change for the company. It has been many years since we’ve run graphics card benchmarks on an AMD platform, since Intel has long been the platform of choice for gaming performance. Has that changed?

NVIDIA GeForce GTX 1080 Ti and GeForce GTX 1080

So, rather than comparing the NVIDIA GeForce GTX 1080 Ti Founders Edition graphics card to dozens of other video cards, we are going to compare it to just the NVIDIA GeForce GTX 1080 Founders Edition, but on two different platforms. One will be powered by a stock Intel Core i7-7700K processor ($329.99) and the other by the AMD Ryzen 7 1700 ($329.99). Both systems have Corsair Hydro series CPU water coolers and run the exact same Corsair Vengeance LPX 16GB DDR4 memory kit at 2933 MHz with CL14 timings. We tried to make the platforms as identical as possible, as we wanted to see how the GPU would scale at 1080P, 1440P and 4K resolutions when powered by the latest Intel Core i7 ‘Kaby Lake’ and AMD Ryzen 7 processors. We also gave the AMD Ryzen 7 1700 a head start by overclocking it from its stock clock speeds of 3.0GHz (3.7GHz Turbo) all the way up to 4GHz on all eight cores. The Intel Core i7-7700K ‘Kaby Lake’ processor was left at stock settings: 4.2GHz base and 4.5GHz boost.

AMD Ryzen 7 1700 With ATI Ruby

We’ve all heard that AMD Ryzen 7 series processors don’t perform all that well at 1080P, so we’ll be paying close attention to how GPU performance scales when you move from the GeForce GTX 1080 to the GeForce GTX 1080 Ti. We expect the GPU to be bottlenecked a little by the processor on both platforms. How much greatly depends on the game title, so we wanted to take a quick look at a few of them.

Depending on how much traffic this article gets, we might expand testing and look at more processors in the future, as we see CPU bottlenecks becoming an increasing concern for many. Keep in mind that CPU bottlenecks are more prevalent at lower resolutions and graphics settings. Once you increase the resolution or image quality settings, you’ll likely become more GPU bound and the CPU bottleneck will be lessened.
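That bottleneck behavior can be illustrated with a toy model (all frame rates below are made-up assumptions, not our benchmark numbers): observed fps is roughly the minimum of what the CPU and GPU can each deliver, and the GPU’s ceiling drops as resolution rises while the CPU’s stays roughly flat.

```python
# Toy model of a CPU/GPU bottleneck. All figures are illustrative
# assumptions, not measured results.

# Hypothetical GPU frame-rate ceilings at each resolution.
GPU_FPS = {"1080p": 240.0, "1440p": 150.0, "4k": 70.0}

# Hypothetical CPU ceiling; roughly resolution-independent.
CPU_FPS = 130.0

def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower component sets the frame rate."""
    return min(cpu_fps, gpu_fps)

for res, gpu in GPU_FPS.items():
    fps = observed_fps(CPU_FPS, gpu)
    limiter = "CPU-bound" if CPU_FPS < gpu else "GPU-bound"
    print(f"{res}: {fps:.0f} fps ({limiter})")
```

In this sketch the platform is CPU-bound at 1080P and 1440P but GPU-bound at 4K, which is why processor differences tend to shrink at higher resolutions.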

That said, let’s look at the test systems and then jump into the benchmarks.

  • olebrun

    The key, and yes the key, to gaming performance on Ryzen right now is RAM speed. At the moment it seems all RAM starts at 2133 on Ryzen motherboards, so you have to enter the BIOS and fix this yourself. Don’t buy any cheap RAM for Ryzen. Get a pair of 3200 sticks, but check your motherboard’s support site for what RAM is currently supported. I got a 15fps increase in Wildlands just by increasing the RAM from 2133 to 2933, at 1440p with a GTX 1080.
    Wildlands also uses all 16 threads, and if developers start supporting that, Ryzen 7 will be great for gaming.

  • Hanson Chin

    After a long week of research, and after reading this benchmark test, I have decided to buy Ryzen. Thank you very much; your information is very useful.

  • Carlos Carlos

    At this point I’ve already made my mind to go with AMD. Nvidia has been sitting on the 1080ti for months waiting to release it just so it could compete with Vega. Shareholders>consumers.
    And with Intel, they have been charging an arm and a leg for chips that are only slightly better than the 3570K I bought years ago. And how much they dropped the price on their CPUs to be in line with AMD is evidence enough that they have no problem bending you over and charging whatever they want. If you want progress, support AMD now. If you want to pay $1000 for an extra 5fps to play the latest AAA game, then buy Intel/Nvidia.

  • Wibble

    Oh wow so many AMD fanboys crying because the results aren’t what they want and that is after they have been told that this is a baseline to test against once the Ryzen issues have been sorted. The reviewers have basically said that they expect these results to change once the various issues are sorted.

  • Terry Perry

    I have 2 I-7 Monster Chips wanted a new PC on budget as a Fun Practice. Fry had a 6350 6-c 4.2 Wraith Cooler for 85$ W-10 for 85$ and a 1T 64 for 45$ I have Ram and 2 970 old case so for just 145$ I have a SO CALLED 6 ORE. It runs B-1 and Titan 2 Max out at 1080P. WHY I do like AMD in time these Newer Chips Will DROP in Price. Intel is Changing EVERTHING late Next Year ALL NEW H.B. Ram, Chips and a new faster S.S.D making the ones now Dinosaurs. AMD will FALL AGAIN

    • Wibble

      Can we have that in English?

  • nem

    AMD X370

    Support for DDR4 3600(O.C.) / 3400(O.C.) / 3200(O.C.) / 2933(O.C.) / 2667* / 2400 / 2133 MHz memory modules

    GA-Z270-Gaming K3

    Support for DDR4 3866(O.C.) / 3800(O.C.) / 3733(O.C.) / 3666(O.C.) /
    3600(O.C.) / 3466(O.C.) / 3400(O.C.) / 3333(O.C.) / 3300(O.C.) /
    3200(O.C.) / 3000(O.C.) / 2800(O.C.) / 2666(O.C.) / 2400 / 2133 MHz memory modules

    link. http://www.gigabyte.com/Motherboard/GA-Z270-Gaming-K3-rev-10#sp
    https://uploads.disquscdn.com/images/4d5475a678349a17eaf5e725bb757181c6cd9b351cb6c85003e0065b02dfeb03.png

    link. http://www.gigabyte.com/Motherboard/GA-AX370-Gaming-K7-rev-10#sp
    https://uploads.disquscdn.com/images/05db891a05370be234f51c8b30002b1fdbfbd65043a24b53a13c7ba454a5b137.png

  • Recko

    Ok, seriously, if you’re running a computer with 16GB DDR4, a Ryzen or Intel CPU and a 1080 Ti, why the hell would you run games in 1080p? AMD clearly showed that the Ryzen chips are aimed at 4K gaming, as are their new Vega chips coming out. Doing reviews at 1080p for Ryzen chips is like doing a review at 720p and saying it gets better fps than 1080p. AMD is looking to the future of computing and gaming, hence why the Ryzen chips are faster at 4K, render faster, multitask more and are prepared for full DX12 games with the Vega chips. I always run my games on max settings, and when Vega is released I’ll be running all my games at max settings in 4K.

    • Wibble

      lol there isn’t a single GPU this year that will run AAA titles at max settings at 4k / 60fps. Also you said Ryzen is faster at 4k? Which reviews are you looking at? They aren’t doing better they are about matching a STOCK 7700k (these chips run 4.8 minimum). This may change (should change) when the bugs are sorted but as of now Ryzen isn’t pulling ahead at higher res.

      • Benthrax

        I have a 1080 TI and I’m running AAA titles at 4k / 60fps

    • Nathan Kirsch

      Which is why I ran 1440P and 1080P. This article has CPU scaling and GPU scaling in it, so the data can be looked at in a number of ways.

  • ohYoumad

    OK, this is in regards to the update on the last page: couldn’t you use a more CPU-intensive game, say Watch Dogs 2, to show utilization in a truer form? Imho, not saying I’m correct, but Tom Clancy titles never really pushed this aspect.

  • Nicholas Perry

    Someone should underclock the 7700 to 4.0ghz to see what like for like performance is at the same frequency.

    • McKoy

      No one in their right mind underclocks a CPU in the real world. The answer here should be Ryzen going up to 4.5GHz.

      • ohYoumad

        I’m pretty sure ppl do it, because I’m one of them. Second, I think it was suggested only to get a true IPC comparison, that is all.

        • Nicholas Perry

          This is correct. Since Ryzen cannot get to 4.5, best to simply downclock the other.

        • ohYoumad

          I’m surprised that no reviewer has done this: Ryzen at 4.0, Intel at 4.0. I mean, the exact frequency doesn’t matter as long as they’re the same; that would be a true apples-to-apples IPC comparison. Only then can we figure out how equal they are.

        • McKoy

          You cannot do apples to apples with clockrates. Different architecture has different approach to processing.

          It is like asking an RX480 be clocked 1300mhz core then ask a GTX1060 be clocked 1300mhz core as well, then pit them together.

        • ohYoumad

          You are missing the point; the architecture doesn’t matter, because at the same clocks it would show what each can do. Granted, there are other variables (optimization) to take into account, but the fact remains: the same clocks is more of a true IPC comparison than anything else. Intel has its own fab plants, AMD doesn’t; and Intel’s process is more mature/dense, so it should be able to clock higher. Why do you think the Polaris refresh will use less power and have higher clocks? It’s AMD’s second go-around on the process… it should be better in all aspects.

        • Wibble

          If you want to look at IPC performance only, then yes, you could underclock the 7700K. Not sure what you are going to take from that, as we already know the Intel chip has slightly higher IPC. As above, a fair comparison to benchmark is both CPUs overclocked to their likely max. That puts the Ryzen at 4GHz and the 7700K at 4.8GHz. Both achievable on air.

        • ohYoumad

          That is not a fair comparison, in the scenario you are trying to present (it would be stock vs stock). The reason behind my madness would be to know roughly the exact percentage/avg% of how much faster one is over the other.

        • Wibble

          No, OC vs OC is perfectly fair. OC one and not the other isn’t.

          I think you want to know the IPC of Ryzen vs Kaby Lake. Ryzen seems to be around Haswell/Skylake. That would be 5-9% depending on what you are running.

        • ohYoumad

          Yes that is all I’m saying. Just to simply see how each architecture handles programs/software at the same frequency.

        • ohYoumad

          You can, it is not to show what the differences in the architectures are about…but to show which one excels in what, that is basically showing which one is faster in a sense. Coding for the programs plays a factor but in a general sense using similar clock speeds would identify it precisely. Afterall it’s numbers we are looking for.
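The apples-to-apples idea debated in this thread amounts to normalizing a benchmark score by clock speed. A minimal sketch, using made-up scores rather than measured results:

```python
# Dividing a benchmark score by clock speed gives a rough per-clock
# (IPC-like) figure. All numbers below are made-up placeholders.

def per_clock(score: float, clock_ghz: float) -> float:
    """Score per GHz -- a crude proxy for IPC."""
    return score / clock_ghz

# Hypothetical single-threaded scores at each chip's tested clock.
ryzen = per_clock(score=160.0, clock_ghz=4.0)   # assumed values
kaby = per_clock(score=190.0, clock_ghz=4.5)    # assumed values

print(f"Ryzen per-clock: {ryzen:.1f}")
print(f"Kaby Lake per-clock: {kaby:.1f}")
print(f"Kaby Lake per-clock advantage: {kaby / ryzen - 1:.1%}")
```

The ratio of the two per-clock figures is the IPC gap the commenters are after; with these placeholder numbers it lands in the 5-6% range.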

  • MadBlax

    Great article. Just finishing my 1700 build, and I look forward to future optimization for gaming. Didn’t really buy it strictly for gaming though, as I doubt many are.

  • ico80

    So that means that in 1 or 2 years with a “2080ti” the Ryzen will heavily bottleneck the system?

    • Nathan Kirsch

      Possibly as the CPU bottleneck will get worse at 1080P and eventually 1440P.

  • namco

    yes, lets test all games at medium with a gpu meant to play games on max settings (ultra+) what a fucking hack job.

    • Nicholas Perry

      It’s not meant to be what you would play at (which is debatable).
      It’s about removing the load from the GPU and making the CPU the limiting factor, which shows the true performance of the CPU.

      And the easiest way to do that in any given game that isn’t more CPU-bottlenecked by default is to turn the resolution down and the settings down lower.

      Even at max settings, several tests have shown that even at 1440p Ryzen falls behind by 20+ FPS in ROTTR and several others.

      • Nathan Kirsch

        yeah, the point here wasn’t to be 100% GPU bound and I picked the settings used to try to get 4K playable on the new titles.

        • ohYoumad

          Could you do one thing, if it’s possible: can you take the Wildlands benchmark for the 1700 and run it again with both CPUs at the same frequency? Just 1-3 runs, if that’s possible.

      • namco

        Sorry. But this whole “drop a gpu setting to load the cpu” is bullshit.

        First off, games are highly optimized for Intel. Even AMD has stated the need to work with game devs to optimize for the new chips, which they’re doing.

        Secondly, your CPU doesn’t magically start doing more work just because you drop graphical settings. I can literally drop graphical settings in any game and the CPU utilization stays the same. A game is only gonna use XX% of a CPU. When you drop GPU settings and gain fps, it’s all GPU bound still. You can NEVER test a CPU in a game as a means to see performance.

        You wanna do a CPU test, then simply run a CPU test. Hell, run a single-thread CPU test and then compare it to Intel chips tested single-core on true CPU-bound tests. There are so many factors in gaming that it makes these tests absolutely retarded, including any back-end issues like no optimization. Remember the benchmark tool fiasco where they used an Intel compiler to make the program, and when it saw an Intel CPU it ran faster “optimized” code, but on AMD it forced slower code. Yet someone hacked it and forced the optimized code to run on AMD and bam, AMD didn’t seem so slow anymore; still slower than Intel, but not by as much as previously shown.

        • Wibble

          Nothing retarded about benchmarking games if that is what people are going to be buying these CPU’s for, and if they are reading this review that seems likely. Doing well in a synthetic benchmark means nothing if your favorite games don’t run that well. WTF would be the point in ignoring game benchmarks if that is your intended use. Such a stupid comment.

  • realjjj

    In the update section, the Ryzen screenshot shows 16 cores / 16 threads, and that in itself is a problem. No idea what app that is; if it’s an in-game tool, then you have discovered an issue in this game, and properly identifying the CPU as 8C/16T could make a difference.

    The games tested are odd, as none seems to scale to more cores. That’s why adding the 6900K to the comparison helps, as you can at least spot which games scale.
    Deus Ex should scale but doesn’t on Ryzen; something is odd with it.

    • Nathan Kirsch

      That is the benchmark log from Ghost Recon Wildlands and it is indeed the in-game benchmark tool.

      • realjjj

        That’s big considering that it made a relevant impact in F1 2016.
        Have you tested with HT off in this game?

        • Nathan Kirsch

          Sorry, but we are out of town traveling for another NDA event.

      • realjjj

        You guys are missing an opportunity here; the discovery is news, and you are the ones to discover it, so it could generate quite a bit of traffic.
        Ofc, if you can find the file where the config is stored, edit it to 8C/16T and test the perf differences, even better.

  • Sinner85

    I must say this is the most civil, Ryzen vs 7700k debate I’ve seen on the net.

    • Nathan Kirsch

      I would hope so, it’s just showing people the data of where it is at. Once AMD optimizations come out we’ll run the updates on these titles and see what improved and what didn’t!

  • Nate

    Why did you Overclock the Ryzen and no the 7700k?

    • Nathan Kirsch

      The goal of overclocking the Ryzen 7 was to get the clock speeds closer to being the same. Yes, my 7700K can overclock past 5GHz, so there is even more headroom left in it, while the Ryzen 7 1700 has maybe 100MHz left in it.

      • Sinner85

        This is the other strange thing reviewers are doing: overclocking the 7700K to about 5GHz. Again, the CPU takes heavy strain and the life expectancy of the CPU is cut to… I don’t know, can someone test that? How long can a 7700K run overclocked to 5GHz? I’m guessing a few months?

        • Nathan Kirsch

          Exactly, my Ryzen 7 1800X can run at 4.2GHz at 1.5 to 1.55V… Doesn’t mean much to me though, as for 24/7 use I don’t want it running that high since it will degrade over time.

        • Mike Roberts

          Mine does 5.0 with 1.35v. Probably a long time.

        • Sinner85

          I suppose, with water cooling that’s fine? I’m still not convinced its the better CPU, though.

        • Nate

          Hard to be convinced when you accept a review that OC’s one CPU and not the other as some reasonable comparison.

        • Mike Roberts

          It is less than the ryzen here at 4.0. It should last as long or longer.

        • Sinner85

          Hard to believe, what temps does it go to, 100 degrees?

        • ohYoumad

          If you only game then it is; if not, AMD for the most part. My 5820K clocks to 4.7 at about 1.31 vcore; it reaches 4.5 at 1.24 vcore, guess where I leave it? I don’t think any of these CPUs would just abruptly die, but it is possible. Increasing voltage does shorten the lifespan, but I highly doubt it’s anything significant. I can’t remember off the top of my head what my Phenom II 955 BE is overclocked to, but it still runs at the same settings I’ve had since day 1 of OC’ing, with no degradation afaik.

        • McKoy

          yeah, well how long can a Ryzen run its 4Ghz (since that is the redline most people get)?

          Check around the net and see what temps people get with 4.8 – 5.0 Ghz OC with their 7700k with just normal tower heatsink air.

          In other words, this review is not even 7700k’s final form.

        • Sinner85

          I have and at times I see high temps and high utilisation, the cpu is good for gaming, but when it comes to cpu intensive games thats all you can do on it, whereas the Ryzen, has low utilisation and acceptable temperatures even when overclocked on air. At the end of the day yes the 7700 is very good for gaming, but the Ryzen(any of them) is not bad at gaming either and on top of it, it destroys any Intel at multitasking, at the end of the day enthusiast gamers are not the majority of cpu buyers, it’s actually the people that do more than just game that forms the bulk of cpu target market, so AMD actually produced a CPU that the market has been asking for, the rate of sales on the Ryzen line proves it.

      • Nate

        So, would you say OC’ing to an unsafe voltage in one is fair?

        The processor was overclocked to 4.00 GHz on all cores with 1.3875V set for the CPU Core voltage

        That seems pretty high when you can hit 5.0 GHz on an i7 with 1.35 volts, don’t you think?

        Surely whether you feel that OC’ing to this level is safe or not is irrelevant; it should be about equity in testing. We don’t care about IPC at a lower/same clock speed, and the review doesn’t pretend to be doing that; it sneakily leaves the clock of the i7-7700K at *stock* and overclocks *and* overvolts the Ryzen. It’s a skewed result that makes the Ryzen look “close” rather than “way, way far behind”.

        • Nathan Kirsch

          Damned if you do and damned if you don’t. I’ve already done stock testing, overclocked testing and now a GPU/CPU scaling article. You can see stock results in the original testing on the 1700 in the launch article. This article was supposed to focus more on GPU scaling thanks to the 1080 Ti being released. Silicon Lottery is selling Ryzen processors at 4GHz on all three options at up to 1.44V. Looks like <30% of 1700 & 1700X's can do 4.0GHz while over 70% of the 1800X's can run at 4GHz. Here is the link – https://siliconlottery.com/collections/frontpage/products/1700a40g

        • namco

          You can’t hit 5GHz on a 7700K without a delid, and 80% of mainstream gamers who buy a 7700K aren’t gonna delid the fucker to gain that overclock. So you are trying to create an unfair battle by overclocking a chip only after it’s been modified…

        • Mike Roberts

          Mine is not delidded.

        • PC Master Race

          My 7700k is not delidded and does 5 ghz all day long at 1.25v stable with 3000 mhz ram. Max temp never goes above 79C in the most intensive stress test. I’m using a Corsair H100i v2 to cool it btw so its not a custom loop or anything.

  • Sinner85

    If CPU utilization was added to these benchmarks, we would probably also see how that 4-core processor strained to get those results. Run these benchmarks with an i7-6800K for a real-world comparison. That 7700K is probably running at 99.8% utilization; open any other app and performance will crumble. A gamer often will have programs like TeamSpeak, Chrome (for BF4), etc. open. How will a CPU that’s running at 99.8% fare when it’s doing more than just playing a game?

    • Nathan Kirsch

      Not really, in Ghost Recon Wildlands the stock 7700K was below 40% CPU usage.

      • Sinner85

        Oh yes, I saw that benchmark as well, and I was surprised by it. However, there are probably going to be a few, very few, games that don’t make the i7-7700K run close to 99%.

        • Nathan Kirsch
        • Sinner85

          Thanks for these screenshots, this is the type of thing I find really interesting.

        • Sinner85

          Is this, what utilization for the CPU’s look like for most games?

        • Nathan Kirsch

          For most games, from what I have seen at 1080P… The CPU load goes down as you increase the screen resolution, as the GPU load increases.

        • Peter

          I’d like to know what games you think use 99% of a 7700K. I play BF1, Final Fantasy XIV: A Realm Reborn, Forza Horizon 3, Resident Evil 7, GTA V, Witcher 3, Planet Coaster; none of these games push my CPU to max or even close. It’s going to be a rare case that a game uses 99% of a 7700K.
          And the 7700K is just as good as my 6700K in games, if not better due to it reaching 5GHz+.

        • Nathan Kirsch

          yeah, especially with DX12 game titles… The CPU overhead has been reduced and it will be tough to put a 7700K at 99% load.

        • Sinner85

          With DX12 it doesn’t really matter if you have a 7700K, a 6800K or an R7 1700, because DX12 reduces the overhead, performance gets better, and then it comes down to what else you want to do with the CPU besides gaming.

    • Peter

      I have never seen my 6700K @ 4.6GHz hit 70% usage at 1440p; it’s usually 40% usage or less, except BF1, which is 50-60% in multiplayer. So they have plenty of room left.

      • Sinner85

        The 6700k in my opinion is a better cpu than the 7700k, but only when playing newer games. Which is what most of us play.

        • Nate

          What? How is that true in any way? The i7 7700k is better, across the board, in performance. I’ve yet to see a review state otherwise. The 7700k is a small upgrade to the 6700k, it is not worth upgrading from 6700k->7700k… But if you have neither, you should look to buy the 7700k.

        • Sinner85

          Apologies, I meant the 6800K, but this is probably because I think more cores will benefit you more in the long run. I might be wrong about that, though.

        • PC Master Race

          I also have a 5820k at 4.4 ghz and can say without a doubt the 7700k is better for games right now.. In maybe 2-3 years games will use more cores but by then I’ll just replace my 7700k with something else.

    • PC Master Race

      To answer your inquiry: I run games at 5 ghz with discord, chrome (youtube music), corsair utility engine, a50 wireless program, msi afterburner, wallpaper engine, object dock, rainmeter, AV program, shadowplay/GFE, origin, steam and a few others running in the background and the performance is ALWAYS consistent and steady. I even sometimes stream to youtube at 1440p/10 Mbps and it keeps going without a hitch.

  • Dan Brooks

    Also why slander amd when you know there has been zero support from developers for the past decade…?.. because either A. you’re retarded or b. You’re being paid by intel. Are you retarded?

  • Dan Brooks

    You’re a biased fuck for using the wrong amd processor. Y not use the 1700x?

    • Nathan Kirsch

      Because the Ryzen 7 1700 and Core i7-7700K are both priced at $329 here in the United States. Apples to Apples with regards to price, that is why!

    • Miguel Ettema

      There is also no fundamental difference between the 1700, 1700x and the 1800x other than the core speeds. Even if he used the 1800x, you’d see the same results. Ryzen is great for a lot of things, gaming included… but it’s definitely not the best for gaming in its current state, for current games. Optimization from game creators and on Windows in general will see some improvements (hopefully).

      • Dan Brooks

        The X uses more energy: 95 watts vs 65 watts.

  • Edward Kinsella

    There isn’t new material here. Using a faster-clocked, game-optimized 4-core against a slower, non-optimized Ryzen 8-core, with old games, kinda smacks of Intel bias. You would know the result even before starting.

  • MisterWU

    Really, someone still looks at Fire Strike benchmark results??
    One of the most untrustworthy benchmarks ever.

    Maybe testing with more modern and demanding games would be a good idea, to get a more complete overview of the real situation.

    • Nathan Kirsch

      I have added in Fallout 4 and Ghost Recon Wildlands results overnight, so I hope those are better game titles for you to take a look at.

  • Robert Johnson

    With only 20 percent AMD CPU market share for the past few years, game developers have optimized games at 1080p for Intel processors, which is why you see AMD lagging behind Intel at the 1080p resolution, but at 2K and 4K you see a difference. So please stop blaming AMD for sluggish 1080p game performance. AMD has sent new SDK kits to game developers and has also established new partnerships with some gaming companies in an effort to change the status quo.

    • MisterWU

      I am not an AMD fanboy, but I agree with your opinion, and even more: when the binaries of these games were compiled, the compiler knew nothing about the Zen architecture and how to optimize for it. Zero!!!

      99% of online tests, like this one, are garbage anyway. They use ridiculous benchmarks and games.
      If you test a CPU, you have to test brute-force calculations, not a huge faulty ecosystem with so many variables that even the reviewers themselves clearly don’t understand it at all.

      Don’t trust reviews; don’t trust game benchmarks.

      • Nathan Kirsch

        You guys are missing the point… These “pointless” tests are the baseline of Ryzen performance. They will be the baseline used to compare the optimizations against. We are doing these tests to hopefully show the performance gains that are coming soon. This is the state of Ryzen now; if you go out and buy one, this is what you’ll get!

        • KieraDanvers

          I think you are missing the point. Vortez.com did a review of the 1700 as well. They benched mostly applications and only 2 games. In those 2 games (1080p) they showed the 1700 beating the 7700K. My point is this: take these reviews with a grain of salt. I RARELY believe any review from a site that’s known to be biased toward one party over the other.

          https://www.vortez.net/articles_pages/amd_ryzen_1700_review,17.html

        • Nathan Kirsch

          Well, I guess you finally found a site that doesn’t care which way the numbers land. 5 games tested at three resolutions on two different video cards is usually more useful info than 2 games tested at one resolution and on one card. I didn’t spend 10+ hours setting this up and doing all the testing/writing to make a fake, useless review. You can see the Ryzen 7 1700 overclocking article here, where 18 different things were tested (mostly applications).

        • Robert Pearce

          Save your breath Nathan.

        • Ansau

          Lol that logic.

          Vortez is running a couple of games that turned out to be GPU limited from using an RX 480. And that’s a valid test.

          But the rest of the world has to be taken with a grain of salt when they are showing dozens of games in scenarios where the CPU is the bottleneck.

        • Nathan Kirsch

          You are correct… GPU limited testing is certainly valid as well.

        • KieraDanvers
        • ohYoumad

          Just not to test cpu bottleneck lol

        • KieraDanvers

          It looks like you have the logic issues. How is a GPU limited at 1080p? You test at 1080p because it’s more CPU bound than GPU bound. So how do you explain a 480 doing better on a 1700 than on a higher-IPC 7700K… YOU CAN’T!

          If anything, the RX 480 should have performed better on the higher-IPC 7700K, but it didn’t; it was worse. Why? Because of the superior multi-threading in said games. And lastly… CONFIGURATION MATTERS!

          https://community.amd.com/community/gaming/blog/2017/03/14/tips-for-building-a-better-amd-ryzen-system

        • Ansau

          It seems you haven’t got any clue about what I was saying.

          I was criticizing how people cherry-pick tests based on how they suit their own fanboyism, and blame those that don’t suit their desires (like this one).
          You came past, read “RX 480”, dropped whatever your mind had first in its propaganda portfolio, in this case whether Polaris gets higher fps on Ryzen or Intel, and raised the flag of CONFIGURATION MATTERS.

          You got about as close as the Sun and Voyager 1 in this discussion…

        • Sean Kumar Sinha

          Vortez tested 2 games and used an RX480, which will bring GPU limitations into play. Who missed the point, again?

        • namco

          baseline? are you fucking serious? YOU gonna buy a 1080 or 1080ti and run it on fucking medium? GTFO

        • Nicholas Perry

          You would if you are someone who wants as high a frame rate as possible, esp. in the latest AAA games or in online MP games, where you can reduce input lag even further, and, if you don’t have a G-Sync monitor, can use Fast Sync (which requires high framerates at a multiple of 60).

        • ohYoumad

          I’m glad you did it too, not a lot of reviews these days are as intuitive as this one.

        • marco

          WOW!
          Medium. Mmmmm. >:)
          I haven’t played anything on medium settings since way back in the year 2000 (Need for Speed 2) on a Coppermine and an ATI Rage Pro 16MB @ 1280×1024…
          …and that was only because the Rage Pro’s drivers sucked epic chunks.

          So basically this game is like Crysis 1, and makes even the mighty 1080 run like garbage on high?
          (Methinks I should wait till the next gen of GPUs comes out.)

        • Sintruder_06

          My only gripe about this review is that you fail to mention that no way on earth would a person with a 1080 Ti/1080 game at 1080p on medium settings in all the games you’ve tested.
          I expected you to mention in the conclusion that this is only for testing.

        • Coach

          I’ve already seen some improvement from BIOS updates (from Gigabyte). My temps are down, or should I say they are reading correctly now? Either way, temp reporting would likely affect auto-overclocking, or at least my fans when set to ramp with temps.

          I noticed that the 1080 outscores (a bit) the 1080 Ti in Fire Strike Physics… hmmm.

          The tests ARE for review purposes, folks. Though difficult, Nathan is trying to compare apples to apples as best he can.

    • McKoy

      So you are telling us, after years of waiting for Ryzen, we have to wait yet again for optimization? (If it ever comes…) Because you know… we waited back in the hexacore Phenom days – “just wait till games become 6-core optimized”; back in Bulldozer – “just wait till games become octa-core optimized”. Now Ryzen – “just wait till it matures with BIOS, drivers, blah blah”.

      They should rename AMD to Wait-M-D.

      • ohYoumad

        Sad, but guess what: these were meant to go up against Intel’s HEDT parts. AMD just shocked the world with its pricing, so go figure, of course they would be compared to the standard desktop. Compare these against X99/X79 and Intel loses on just about all fronts.

    • ohYoumad

      Well, technically I do believe AMD botched this launch quite a bit, i.e. only just now releasing info on the Ryzen 1700X/1800X temp readings. That should’ve been in reviewers’ hands from the get-go. I think they didn’t care about the 1080p performance. Overall, though, I think they did a decent job, and I will be making a Ryzen/Vega build in the near future; I just can’t find a case to do it in.

      • Wibble

        They made a complete arse of the release. They finally have what could be a great product at an excellent price, and then they rush the release for no apparent reason. Motherboards, memory, Windows 10, BIOS – none of these things were ready. Such a shame, but they still have time to rectify the mess, and the next lot of benchmarks should show some good improvements.

  • Persn Dewillavic

    “Finally, we have investigated reports of instances where SMT is producing reduced performance in a handful of games. Based on our characterization of game workloads, it is our expectation that gaming applications should generally see a neutral/positive benefit from SMT. We see this neutral/positive behavior in a wide range of titles, including: Arma® 3, Battlefield™ 1, Mafia™ III, Watch Dogs™ 2, Sid Meier’s Civilization® VI, For Honor™, Hitman™, Mirror’s Edge™ Catalyst and The Division™. Independent 3rd-party analyses have corroborated these findings.

    For the remaining outliers, AMD again sees multiple opportunities within the codebases of specific applications to improve how this software addresses the “Zen” architecture. We have already identified some simple changes that can improve a game’s understanding of the “Zen” core/cache topology, and we intend to provide a status update to the community when they are ready.”

    I would like to see testing on the games they mentioned in the AMD Community Update. Just curious.

  • IMMIK

    These tests are pointlessly pushing Intel CPUs until the scheduler and cache issues in Windows 10 are fixed… In the meantime, we already knew the answer.

    • Zach B.

      Look up the “AMD Ryzen Community Update” posted today. AMD themselves say there is absolutely NO ISSUE with Windows 10 or the scheduler. Everything is working as intended.

      “We have investigated reports alleging incorrect thread scheduling on the AMD Ryzen™ processor. Based on our findings, AMD believes that the Windows® 10 thread scheduler is operating properly for “Zen,” and we do not presently believe there is an issue with the scheduler adversely utilizing the logical and physical configurations of the architecture.”

  • PC Master Race

    Basically for games AMD is still way behind.

  • Stephen Jones

    Interesting choice of games. Thief? Really?

    • Zach B.

      Would’ve been nice to see a more modern and relevant DX12 game there instead. Ashes of the Singularity comes to mind, especially since RTS games tend to utilize more CPU than shooters/open world RPGs.

      • Nathan Kirsch

        No one plays Ashes of the Singularity. It’s a great title to benchmark, and I love the developers and the support they give us, but it’s just not a widely played title. I have added Fallout 4 and Ghost Recon Wildlands results, so I hope that helps!

        • McKoy

          No one plays Thief either (nor uses its outdated engine anymore).

          What would be better is to at least add games that use popular quick-dev engines like Unity, UE4 (Thief is still on the old UE3, I think), or at least Frostbite.

  • Victor Burgos Games

    This was a good review. Thanks for it. I am a game dev and have been wondering whether I want to use Ryzen as a build station or just keep my 5820K X99 station as is. I think that would be a really awesome Legit Review. Sadly, I have not once seen a written review comparing my 5820K with the new Ryzens 🙁

    • Ole-Martin Broz

      Imagine the 5820K as an equal to Ryzen in most aspects; if you get it above 4.1 GHz, think of it as ahead in gaming and behind in productivity.

      You did right with the purchase of the 5820K back then; nothing new has happened, AMD just brought more cores at a much more affordable price.

    • MisterWU

      If you are a game developer and you find this a good review, you have a huge, huge problem.

      • Nathan Kirsch

        Just because the game titles you prefer aren’t included doesn’t make this a bad review. The numbers are spot on, and it gives a look at the performance impact of two different platforms using two graphics cards.

        • ohYoumad

          I can’t believe how nonsensical people can be

    • Nathan Kirsch

      Thank you, and I’m glad you found it helpful. The reason you don’t see many Core i7-5820K processor reviews is that Intel never sampled that chip to the media. You’ll find many reviews using the 6950X, 6900K, 5960X, 6700K, 7700K, and the 7350K, as Intel has sampled each of those to all the main sites over the past 2-3 years.