FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure graphics card performance. Fur rendering is particularly demanding and tends to push GPU temperatures to their limits, which is why FurMark also serves as a stability and stress-test tool (often called a GPU burner) for graphics cards.
The benchmark was run in full-screen mode with no AA enabled on both video cards.
Benchmark Results: FurMark doesn't play nicely with the AMD Radeon HD 6900 series, as the benchmark hits the cards' maximum TDP, at which point AMD PowerTune kicks in and dials the core clocks back to reduce the power draw and temperature of the GPU. Both AMD's Radeon HD 6900 series and NVIDIA's GeForce GTX 500 series have new power management features to help protect consumers and their board partners from failed graphics cards. It looks like one too many users killed their video cards running FurMark or OCCT, and to help reduce the number of RMAs both companies have put fixes in place to keep these programs from burning up a card. You can see in the chart above that the NVIDIA GeForce GTX 580 and GTX 570 performed poorly as the GTX 580's new power monitoring hardware kicked in and slowed the card's performance. We did find it interesting that the GeForce GTX 570 outperformed the GTX 580 in this benchmark, but it is likely that the GTX 580 has a more aggressive power profile. NVIDIA said the new GF110 core used on both of these video cards dynamically adjusts performance in certain applications (FurMark and OCCT) in order to keep power draw within specifications.
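Conceptually, both companies' limiters behave like a feedback loop: when measured power draw exceeds the card's TDP budget, the core clock is scaled back until the card is within spec again. Here is a minimal sketch of that idea in Python; the function name, wattages, and proportional scaling are illustrative assumptions, not AMD's or NVIDIA's actual algorithm, which runs in hardware with far more sophisticated monitoring.

```python
# Hypothetical sketch of a PowerTune-style power limiter.
# All names and numbers are illustrative, not vendor specifications.

def throttled_clock(power_draw_w: float, tdp_w: float, base_clock_mhz: int) -> int:
    """Scale the core clock down when power draw exceeds the TDP budget."""
    if power_draw_w <= tdp_w:
        return base_clock_mhz  # within budget: run at full clocks
    # Over budget: reduce the clock in proportion to the overshoot.
    return int(base_clock_mhz * (tdp_w / power_draw_w))

# A typical game staying under a hypothetical 250 W budget keeps full clocks,
# while a "power virus" workload like FurMark pushing past it gets throttled.
print(throttled_clock(220.0, 250.0, 880))  # full 880 MHz
print(throttled_clock(300.0, 250.0, 880))  # clocks reduced
```

This also hints at why a card with a more aggressive power profile (a lower effective budget or stricter response) can end up slower in FurMark than a nominally weaker card that throttles less.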
AMD says that its PowerTune solution impacts fewer than 1% of the games and benchmarks on the market today.