EVGA GeForce GTX 750 1GB SC Video Card Review

Battlefield 4

[Image: Battlefield 4 screenshot]

Battlefield 4 is a first-person shooter video game developed by EA Digital Illusions CE (DICE) and published by Electronic Arts. It is the sequel to 2011's Battlefield 3 and was released on October 29, 2013 in North America. Battlefield 4's single-player campaign takes place in 2020, six years after the events of its predecessor. Tensions between Russia and the United States have been running at a record high. On top of this, China is also on the brink of war, as Admiral Chang, the main antagonist, plans to overthrow China's current government; if he succeeds, the Russians will have full support from the Chinese, bringing China into a war with the United States.

[Image: Battlefield 4 graphics quality settings]

This game title uses the Frostbite 3 game engine and looks great. We tested Battlefield 4 with the Ultra graphics quality preset, as most discrete desktop graphics cards can easily handle this image quality setting at 1080P, and we still want to be able to push the higher-end cards down the road. We used FRAPS to benchmark each card with these settings on the Shanghai level.
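
If you want to turn FRAPS logs into an average FPS figure yourself, here is a minimal Python sketch. It assumes the frametimes CSV layout that FRAPS writes (a header row, then one cumulative timestamp in milliseconds per rendered frame); the file names are hypothetical and not from this review.

```python
import csv

def average_fps(frametimes_csv):
    """Average FPS over a benchmark run from a FRAPS frametimes log.

    Assumes a header row, then one line per rendered frame with a
    cumulative timestamp in milliseconds in the second column.
    """
    timestamps = []
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header
        for row in reader:
            if len(row) >= 2:
                timestamps.append(float(row[1]))
    elapsed_s = (timestamps[-1] - timestamps[0]) / 1000.0
    return (len(timestamps) - 1) / elapsed_s

# Hypothetical log files, one run per card on the Shanghai level.
for card, log in [("EVGA GTX 750 SC", "gtx750sc_shanghai.csv"),
                  ("PowerColor R7 250X", "r7250x_shanghai.csv")]:
    print(f"{card}: {average_fps(log):.1f} FPS average")
```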

[Image: CPU utilization while benchmarking Battlefield 4]

Battlefield 4 is more CPU intensive than any other game we benchmark with, as roughly 25% of the CPU is used up during gameplay. You can see that six threads are being used and that the processor is running in Turbo mode at 3.96GHz more often than not.
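
To capture similar utilization data on your own system while a game runs, a simple script along these lines will do it. This is only a sketch: it uses the third-party psutil package, and the busy-thread threshold and sample window are arbitrary choices, not the settings behind the screenshot above.

```python
import time
import psutil  # third-party package: pip install psutil

def log_cpu_load(duration_s=30, interval_s=1.0):
    """Print overall load, busy-thread count, and current clock once per second."""
    psutil.cpu_percent(percpu=True)           # prime the counters
    end = time.time() + duration_s
    while time.time() < end:
        time.sleep(interval_s)
        per_core = psutil.cpu_percent(percpu=True)
        total = sum(per_core) / len(per_core)
        busy = sum(1 for load in per_core if load > 10)   # arbitrary 10% threshold
        freq = psutil.cpu_freq()               # current clock reflects Turbo
        mhz = freq.current if freq else 0
        print(f"total {total:5.1f}% | {busy} busy threads | {mhz:.0f} MHz")

if __name__ == "__main__":
    log_cpu_load()
```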

 

[Chart: Battlefield 4 average FPS benchmark results]

Benchmark Results: In Battlefield 4 with Ultra settings, the EVGA GeForce GTX 750 Superclocked 1GB video card once again averaged faster than the GeForce GTX 750 Ti and stayed above the 30 FPS mark. It should be noted that we did not test with the Mantle API and stuck with DirectX 11 to ensure every card was tested on the same API.

 

[Chart: Battlefield 4 FPS over time during the benchmark run]

Benchmark Results: The cards shadowed one another very closely in BF4, with the exception of the PowerColor Radeon R7 250X, which was clearly outmatched.
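
An over-time chart like this can be rebuilt from the same FRAPS frametimes logs by bucketing frame timestamps into one-second intervals. A minimal sketch, again assuming the CSV layout described earlier and hypothetical file names:

```python
import csv
from collections import Counter

def fps_per_second(frametimes_csv):
    """Count rendered frames in each one-second bucket of the run."""
    counts = Counter()
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            if len(row) >= 2:
                counts[int(float(row[1]) // 1000)] += 1
    return [counts[s] for s in range(max(counts) + 1)]

# Hypothetical log files for two of the cards in the chart.
for card, log in [("EVGA GTX 750 SC", "gtx750sc_shanghai.csv"),
                  ("GeForce GTX 750 Ti", "gtx750ti_shanghai.csv")]:
    series = fps_per_second(log)
    print(f"{card}: min {min(series)} / max {max(series)} FPS over {len(series)} s")
```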

  • Skeptical

    Gotta say that the temperature benchmarks for this article seem a bit off. 29 degrees idle and 55 degrees load seem to be the accurate numbers. I have no idea how this article got 19 degrees and 42 degrees. I mean, was the computer set up in a freezer?

    • http://www.gamingworm.com/ Muqsit

      Bro, look, it's the SC version; also, the airflow in the case matters too.

  • Brian Blair

    I'll be dxxed! I figured the 260X would outperform the regular 750, but it really is not much slower than the Ti. I still do not see justification for upgrading from my current 650 Ti. With no GPU Boost function, the regular 650 Ti can overclock like crazy, especially if you modify the 650 Ti's BIOS to unlock the rest of your voltage. However, I will run into the same problems this 750 did in the review if I go past 1GB.

  • GamingRabbit

    Yeah, it's ridiculous to think that my GTX 590 uses 149 watts at idle!!

  • ?

    stupid mining cunts

    • WhoCares

      ??

      • Brian Blair

        I agree, a stupid, worthless waste of time. It just drives up prices, making crappy cards like mine and others go way up in price. It hurts to try to upgrade now; thanks to that I will be stuck with the stupid 650 Ti for a while. They are too dumb to realize the whole thing is a scam. You really think once your bitcoins become worth anything the Federal Reserve Nazis will stand for it?

        • WhoCares

          Well, at least that’s a valid point. But why should we assume that the Federal Reserve will have any credibility left in 10, 20, 30 years? Quite a bold assumption there.

  • Jordan

    Just picked up two of these… Gotta say I'm blown away by the little cards. Mining Doge @ ~255 Kh/s each at stock clocks and only drawing 55W?!? Amazing! Don't know of any other cards that I can stuff two of in a stock Dell and pull 500 Kh/s without upgrading anything else.

  • basroil

    Looks like Nvidia really just got rid of the need to test power consumption on those cards… Can't wait to see what the 860 series does to power consumption; it should be an absolute beast for performance/power.

    • http://www.gamingworm.com/ Muqsit

      Leaked benchmarks showed the GTX 870 will have a TDP of 150 watts.