AMD Mantle BF4 and StarSwarm Testing Part 2 – Overclocking the GPU and CPU

Final Thoughts and Conclusions

The first beta release we tested provided a significant increase in performance when Mantle was enabled compared to the same test using the DirectX API.  In these tests, we overclocked the CPU and the GPU to see which would provide an additional performance boost.

AMD Mantle API

Previously, when we tested Battlefield 4 at 1080P, we received a 15.8% performance boost.  This was slightly higher than the 13.28% estimate provided by AMD and EA-DICE.  While that is a good performance boost from a driver and a game update, overclocking provided an additional boost of 5.5%.  With the CPU overclocked to 4.3GHz and the GPU overclocked to 1255MHz, the performance boost over the DirectX API increased to 22.2%.
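Note that the stock Mantle gain and the overclocking gain compound rather than simply add.  A quick sketch below, using a made-up 60FPS DirectX baseline (an illustrative number, not one of our measured results), shows how a 15.8% gain followed by a 5.5% gain works out to the roughly 22.2% total over DirectX:

# Rough sketch of how the two gains compound
# (the 60FPS baseline is hypothetical, not a measured result from these benchmarks)
dx_fps = 60.0                        # DirectX baseline, illustrative only
mantle_fps = dx_fps * 1.158          # +15.8% from enabling Mantle at stock clocks
mantle_oc_fps = mantle_fps * 1.055   # +5.5% more from the CPU/GPU overclock
total_gain = (mantle_oc_fps / dx_fps - 1.0) * 100.0
print("Total gain over DirectX: %.1f%%" % total_gain)   # prints ~22.2%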

For StarSwarm, we received an additional 9.25% boost in performance across all four scenarios when looking at Mantle performance at stock speeds.  Getting close to a 10% performance boost isn’t terrible.  With the exception of the Attract scenario, however, overclocking the CPU did not provide a significant boost in performance; the SHMUP scenario actually showed a negative impact when the CPU was overclocked.

Currently, with Catalyst 14.1 Beta 1.6, we have seen some instability issues in Battlefield 4.  One session I was able to play for several hours without any lock-ups; the next time, the game locked up as soon as I opened the console to enable the new benchmarking tool.  The instability is random, but it seems to happen mostly when using the console or the game options menus.

Legit Bottom Line:  Overclocking the CPU and GPU provides a little additional boost in performance.  In Battlefield 4, we received an additional 5.5% performance boost when both were overclocked.  

Comments
  • fernandinands

    You guys should have done the test with an R9 290x. Would be nice to see overclocking gains with Mantle.

  • Jack

    Mantle is NOT meant for extremely capable gaming machines. Mantle is for people like me who experience bottlenecking, aka my GPU is at 20 percent and my CPU at 100. It takes load off of the CPU, allowing your GPU to max out its usage. If you already max out GPU usage, Mantle does nothing at all. This was no surprise to me because that’s what they said it was. Nobody claimed it was going to double performance. They bragged that a 2GHz underclocked FX-8350 showed no signs of bottlenecking when paired with an R9 290x.

    • Warmonger

      This is a plus even for fast rigs. When it takes the game less time to call the GPU driver to render a frame, the overall frametime is lower. This is good because it means less input lag before you see your input make it to the screen. Less CPU load also frees it up for other tasks, like your network and sound drivers.

      In my experience, BF4 felt a lot more responsive on my 3770K @4.5GHz under Mantle. The only reason I rolled back the driver was the crashes.

    • GettCouped

      RIGHT NOW, it is a 10-20% increase with a high-end i5 or i7 CPU. Any CPU-bound scenario would see a greater benefit.

      However, if the API catches on, it gives developers more power to use and much, MUCH better debugging tools. That would decrease development time for new engines and increase efficiency.

      Here’s to hoping Nvidia and Intel accept AMD’s inclusive offer. AMD has stated that when they get Mantle out of beta, they will offer it to other architectures.

    • Joe Black

      I would not say that. I have a 280x and an Intel 4670; it’s a fairly capable machine. I’ve only tried the Star Swarm stress test, and there the results (average FPS) were D3D: 32fps vs. Mantle: 53fps, at the highest settings throughout.

      D3D performance ranged from 8fps (PowerPoint-presentation laggy space battles of doom) to 70 or so (when just about only stars are visible), while Mantle’s performance stayed in a much narrower range around the average, not lagging once even when the screen exploded with 5000+ individual craft shooting at each other and blowing up all over the place.

      Plus, the D3D graphics actually looked quite blurry while the Mantle graphics were shiny and crisp; I don’t think it’s supposed to look different, but I’m convinced of it.

      That’s a little bit more than the roughly 30% improvement claimed for the 280x in this article.

  • 1337 e-p33n

    I think it would make sense to also include benching DX with an overclocked CPU; I’m sure that would remove some of the CPU bottleneck and be more comparable to the Mantle results.

    • cptnjarhead

      One way to make the Star Swarm demo even is to turn motion blur off. Then you see less batching in the results. The FPS under DX is about even with, and sometimes better than, Mantle; however, the reason for the blur is to demonstrate how Mantle can achieve higher batch counts with less cost to the CPU. According to Oxide, motion blur on extreme settings essentially renders the scene 8 times for each frame (this means it renders a full frame every 2 ms).

  • rv

    Why didn’t you run comparisons with a top AMD CPU and R9 290 against Intel and GTX? The real test will be a Kaveri APU without a discrete GPU.

  • Eugene

    When are you going to test it with an AMD CPU?

  • Joshua Hill

    Would have been nice to see the difference the overclocks made in DirectX too.

    • Scooter

      Really, that’s about the only way this review would have been useful. Obviously an overclock is going to increase your performance, but does Mantle take better advantage of the higher frequencies than D3D, or are the performance increases similar?