NVIDIA Says AMD Reduced Image Quality Settings w/ Radeon HD 6800 Series For Better Performance

NVIDIA has put up a blog post drawing attention to some significant image quality findings recently uncovered by several review sites. These sites report that changes introduced in the default driver settings of AMD's Catalyst 10.10 release increased performance at the cost of reduced image quality. Because of these changes, AMD's defaults no longer permit a fair apples-to-apples comparison against NVIDIA's default driver settings. NVIDIA GPUs provide higher image quality at default settings, which means comparative AMD vs. NVIDIA testing methodologies need to be adjusted to compensate for the difference.

XFX Radeon HD 6870 video card

Getting directly to the point, major German tech websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the High Catalyst AI texture filtering setting on AMD 6000 series GPUs, instead of the default Quality setting, to achieve image quality that comes close to NVIDIA's default texture filtering. 3DCenter.org has a similar story, as does TweakPC, and the behavior was verified across many game scenarios. According to ComputerBase, AMD gains up to a 10% performance advantage by lowering its default texture filtering quality. Nor were AMD's optimizations limited to the Radeon 6800 series: the review sites report that the Catalyst 10.10 drivers also lowered the default anisotropic filtering quality of the HD 5800 series, such that users must disable Catalyst AI altogether to get image quality closer to NVIDIA's defaults.
